We are mindful that many organisations will be considering how best to evaluate the impact of their online activity, of whatever kind. A few weeks ago, we posted a reminder about the work we did with The Space and the BBC, looking at the most meaningful analytics across social and web platforms.
That was a primer on thinking about meaningful metrics beyond growth (of followers/fans) or aggregate reach figures. But to determine the impact of your online work you will probably need a mixed-methods approach, so we have written this short guide to help you consider which other methods can help and how you might develop a useful framework for measurement and evaluation.
The first step in the process is to establish the questions you are trying to answer: what is it you are trying to evaluate?
Particularly with online activity, it is easy to fall into the trap of focusing solely on metrics that are foregrounded by the platforms or easy to collect (such as website session duration or aggregate reach on Facebook), but you must tie your evaluation back to your project’s objectives.
A framework for a community participation programme that aims to engage a particular group in a specific geographic location, for example, will look very different from one for a series of online live streams that is seeking to reach a young audience.
Think long and hard about your project's core objectives and, from those, write a list of questions you’d like to answer. For example:
- Did our project reach and engage the target audience we were aiming for?
- Did the audience find the work engaging/entertaining/informative?
- Did the project succeed in helping the participants to feel more connected with the local area?
When working with data, one tip is to remember the acronym VUMI: Vital, Useful or Merely Interesting. There are myriad types of data you could be collecting, some of which may be interesting but won’t particularly help you to assess impact. You’re aiming to collect data that fits within the ‘Vital’ and ‘Useful’ categories.
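As a simple illustration, the VUMI test amounts to triaging your list of candidate metrics and keeping only those in the ‘Vital’ and ‘Useful’ categories. The metric names and categorisations below are invented for the sake of example; yours should come from your own project objectives:

```python
# Hypothetical triage of candidate metrics using the VUMI test:
# Vital, Useful, or Merely Interesting. Only the first two make the cut.
candidate_metrics = {
    "survey: felt more connected to local area": "Vital",
    "average watch time of live streams": "Useful",
    "aggregate Facebook reach": "Merely Interesting",
    "follower growth": "Merely Interesting",
}

# Keep only the metrics worth the effort of collecting
to_collect = [metric for metric, category in candidate_metrics.items()
              if category in ("Vital", "Useful")]

print(to_collect)
```

The point of writing the list down this way is that every metric you collect is explicitly justified, rather than collected because a platform happens to surface it.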
Digital analytics can tell you a huge amount about whether or not audiences are engaging with your project, but tools like Google Analytics and insights from social media platforms are mainly designed to tell you about what people are doing, not so much who they are or what they think. They do not tell you what their motivation is, their response to the activity or content, or anything about longer term impact. For a comprehensive evaluation of digital activity, then, you will almost certainly need to use other research techniques, both quantitative and qualitative.
A survey can be one of the simplest ways to build a more complete picture of the composition of your audience and to gather valuable feedback. Depending on the nature of your project, you can deploy it on your website, share it via social media and/or distribute it via email.
- When deploying a survey on your site, it is essential to put it prominently in front of audiences to obtain good response rates. A URL hidden at the bottom of a page will yield poor results; a popup or banner is much more effective.
- It is good practice to keep surveys on your webpages as short as possible to maximise completions.
- Social media and newsletter lists can provide great response rates, but do remember that this may only be representative of a certain type of loyal audience member.
- Surveys can also be a great way to gather a pool of people you could talk to in more depth later. Ask respondents at the end of your survey whether they would opt in to further research.
For deeper, richer insights, online discussion groups are likely to be a vital part of your evaluation process. But you will need to plan the sessions carefully: participants in an online session can sometimes be more reluctant to contribute than they would be in person, and it is harder to read non-verbal cues and bring people into the discussion.
Consider how you will gently encourage feedback from everyone and be clear in explaining the purpose and format of the session. One tip is that some online meeting software, e.g. Zoom, will allow you to save the chat. You can therefore ask people to type responses or answers to questions into the chat and then save this for use in your evaluation.
You may decide that it would be useful to record the session but be mindful of privacy and make it clear beforehand that the session will be recorded.
For particularly sensitive areas of discussion, such as willingness to pay or to donate, it may be easier to conduct sessions one-to-one by video or telephone.
There are a number of challenges in collecting quantitative data from digital platforms. Metrics differ across platforms: Twitter and Instagram report ‘Impressions’, for example, while Facebook only does so for its paid ads. There are variations in how far back you can go, and on a platform like Instagram you can’t download the insights directly unless you connect your account to a third-party social monitoring tool such as Hootsuite or Sprout Social.
With digital there are also occasions where it’s difficult to get the exact information you’d like, such as whether a specific age group is engaging with a particular ring-fenced set of content on Instagram (in this particular example, you can see age as it pertains to your audience generally but not tied to individual pieces of content).
While there is no easy fix for these issues, the key is to return to your objectives and be clear how the data you’re collecting relates back to what you’re trying to achieve.
Pulling it all together
In most evaluation activity, it is good to gather a bedrock of robust activity data and build on this with deeper qualitative insights. Digital evaluation lends itself to this, with readily available usage data such as Google Analytics that can be layered with additional primary research. Different sources will sometimes give conflicting data (age, for example, often varies between survey and Google Analytics methods), but by running them together you can build an understanding of how they relate and use one to inform the other.
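One simple way to understand how two sources relate is to lay the distributions side by side and look at the gap per bracket. The age-bracket figures below are invented purely for illustration, standing in for a survey result and a Google Analytics demographics report:

```python
# Invented example figures: share of audience in each age bracket,
# as reported by a survey versus a web analytics tool
survey_age = {"18-24": 0.15, "25-34": 0.30, "35-44": 0.25, "45+": 0.30}
analytics_age = {"18-24": 0.25, "25-34": 0.35, "35-44": 0.20, "45+": 0.20}

# The per-bracket gap shows where the two methods disagree most
gaps = {bracket: survey_age[bracket] - analytics_age[bracket]
        for bracket in survey_age}

for bracket, gap in gaps.items():
    print(f"{bracket}: survey {survey_age[bracket]:.0%}, "
          f"analytics {analytics_age[bracket]:.0%}, gap {gap:+.0%}")
```

A consistent gap (here, the analytics tool skewing younger) is itself useful information: once you know the direction and rough size of the disagreement, you can use one source to sense-check the other.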
You may want to test the usefulness of certain metrics by explaining them to colleagues and seeing if they find them meaningful.
When conducting any evaluation, consider your baselines carefully. What are you going to compare against? Is it worth running research before the activity as well as during or after it? Some social media metrics also require ongoing collection, so plan for this in advance.
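In practice, a baseline comparison can be as simple as recording the same metrics before and after the activity and reporting the change. The metric names and numbers here are invented examples:

```python
# Invented example: the same metrics measured before the activity (baseline)
# and during/after it
baseline = {"newsletter signups per week": 120, "average session duration (s)": 95}
during = {"newsletter signups per week": 180, "average session duration (s)": 110}

# Relative uplift against the pre-activity baseline
for metric, before in baseline.items():
    after = during[metric]
    uplift = (after - before) / before
    print(f"{metric}: {before} -> {after} ({uplift:+.0%})")
```

Without the baseline row, the "during" figures on their own would tell you very little about whether the activity made a difference.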
Ways we can help
With the myriad of analytics tools employed across every platform and channel, evaluating the impact of your online activity can be difficult. We are experts in digital metrics, so whether you want to develop a meaningful measurement framework to help with your monitoring and reporting, or to understand the impact of your online activity on a project basis, we can help.