Eight challenges to measuring off-site social media performance
I’m an Audience Researcher within BBC Future Media, which means I contribute to my department’s remit to measure product performance, understand audiences’ attitudes and behaviour, and provide recommendations to inform future decision-making.
One of the areas I work across is social media, an area that continues to grow in visibility and importance. As Holly Goodier blogged last year, 77% of the UK online population now actively participates on the internet using social tools.
The likes of Facebook and Twitter get a lot of attention, but social media extends beyond social networks to include functionality such as commenting or sharing that can make any website, including BBC.co.uk (and indeed this very page), social.
BBC ONE's Facebook page
However, for the purposes of this post, I will focus on off-site activity – that which occurs outside BBC.co.uk.
Off-site social media is a great way to reach and connect with our audiences, and as such we operate many accounts across our channels, brands and divisions to accomplish this. The majority of our accounts are within Facebook and Twitter, but several of our programmes and services also have a presence on sites such as Google+, Instagram, Pinterest, Tumblr and YouTube.
Watching our teams use these different environments in multiple ways is fascinating, but it also creates challenges when it comes to measuring how we are performing.
Listed below are eight challenges we face when assessing social activity, both directly related to our official presence and in terms of wider online conversations. Not all of these are limited to social media, and few if any are BBC-specific, but they give an idea of some of the considerations we face:
1. No official measurement source: TV has BARB and radio has RAJAR – two well-established bodies, with consensus on the most appropriate metrics to use. Within digital there is the relatively new UKOM; while it offers a range of measures, it does not break social media down into specific accounts (such as @BBCSport on Twitter or BBC One on Facebook). Social networks may offer useful insight tools themselves, but only top-level information is made public. It can therefore be difficult to place our performance in the context of other accounts or organisations.
2. Limited geographic breakdowns: I work within the public sector side of the BBC, and so am principally interested in UK performance rather than global. Again, insight tools can offer geographic splits, but there isn’t much publicly available UK-specific data to compare against.
3. Aggregating across multiple accounts: It can be difficult to assess overall performance when multiple accounts are being used – for instance, if we wanted to measure combined performance across @BBCBreaking and @BBCNews on Twitter. Action-orientated metrics (such as the number of ‘likes’ or views) can be added together, but others, such as total audience, cannot, since people who follow multiple accounts would be counted more than once unless the data could be de-duplicated. The challenges of measuring your own organisation are magnified when trying to measure others.
4. Totalling activity across multiple services: The ideal would be to evaluate our performance across the entirety of social media, but different services, with different functionality and different ways of measuring it, make this impractical. For instance, is a Facebook share the equivalent of a Pinterest re-pin?
5. Distinguishing active from lifetime audience: Metrics such as followers or likes are based on lifetime activity – they take no account of recency, and so could count activity from several years ago. Changes over time can be used to assess growth, but they don’t give an accurate reflection of the active audience – people who interacted more recently (e.g. in the last week or last month). Some insight tools offer this breakdown, but once again it is difficult to place performance in context.
6. Interpreting behaviour: Adding up the number of comments or mentions produces a measure of audience engagement, but it assumes all interactivity is good, when in fact audiences could be using social media to protest against something or talk about how much they hate a particular programme. Sentiment analysis can provide some context, and while tools continue to improve and innovate, ambiguities in tone and meaning mean that analysis is not yet fully accurate.
7. Identifying relevant activity: Counting the volume of mentions for a programme across social media could be limited to searching by the programme name, or it could include searches for the on-air talent, topic or notable incidents. Furthermore, that on-air talent can appear across multiple programmes or formats. Agreeing on parameters can be hard. Some tools do automate this to provide a consistent view for all users, but without an industry standard it is still possible for other organisations to announce radically different figures due to different measurement criteria.
8. Measuring impact: Metrics such as likes or retweets are not ends in themselves, but signifiers of audience engagement. Social media objectives should be broader than stimulating this type of behaviour alone, with goals such as increasing the audience figures for a TV or radio programme, or raising positive opinion towards a programme, channel or service. This is hard to measure in any medium, but the nature of social media, where ease of interaction encourages a high volume of messages, makes this type of impact harder than most to measure.
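To make challenge 3 concrete, here is a minimal sketch (the follower IDs are invented for illustration) of why action metrics can be summed across accounts while audience totals cannot:

```python
# Hypothetical follower sets for two accounts. Likes or views can simply be
# added together, but audience totals cannot: people who follow both
# accounts would be counted twice unless the data is de-duplicated.
breaking_followers = {"anna", "ben", "cara", "dev"}
news_followers = {"cara", "dev", "elle"}

naive_total = len(breaking_followers) + len(news_followers)  # 7: double-counts cara and dev
unique_audience = len(breaking_followers | news_followers)   # 5: set union de-duplicates
```

In practice the underlying user-level data needed for that union is rarely available, which is exactly why de-duplication is a challenge.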
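Challenge 5’s distinction between lifetime and active audience amounts to a simple recency filter, sketched below with invented names and dates:

```python
from datetime import date, timedelta

# Hypothetical log of each follower's most recent interaction.
last_interaction = {
    "anna": date(2013, 4, 1),
    "ben":  date(2011, 6, 15),  # followed years ago, no recent activity
    "cara": date(2013, 3, 28),
}

today = date(2013, 4, 5)
window = timedelta(days=30)

lifetime_audience = len(last_interaction)  # everyone who ever engaged
active_audience = sum(1 for d in last_interaction.values()
                      if today - d <= window)  # engaged in the last month
```

Here the lifetime audience is 3 but the active audience is only 2, since one follower has not interacted for years.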
In a forthcoming post, I will outline some notable performances within social media to date, the above challenges notwithstanding. In the meantime, feel free to participate below the line with your thoughts.
Simon Kendrick is a Research Manager, Audiences, BBC Future Media