Posted by Miranda Marcus
There are cross-team collaborations on the COVID-19 study, and a whole lot of planning going on as we shape up what the rest of 2020 looks like.
COVID-19 diary study
Bringing together UX and user research expertise from across research, product and design, the Internet & Society team have designed and set up a four-week research study working with three key groups: young people in low socio-economic groups, multi-generation households, and vulnerable isolating populations. They’re using a range of novel methods, including data postcards and cultural probes, to understand how people’s lives and use of technology and media are changing during the COVID-19 crisis.
BBC Together (a way to watch or listen together with remote friends and family) is still available on BBC Taster. So far it has received a good deal of coverage (over 330 articles) and Romesh Ranganathan made a joke about it, which can only be a good thing. We’ve seen that over 60% of the usage is by people under the age of 35. The most common watch party size by far is two, but some sessions have had a whopping 31 users.
Understanding the impact of COVID-19 on tech and media use
On top of BBC Together and the COVID-19 Diary Study, the Internet & Society team have been starting to plan their next set of projects. The theme they’ll be focusing on is making machine learning technologies more accessible. As technologies using increasingly sophisticated machine learning techniques become ever more prevalent in our daily lives, it's important for people to understand how they work and how they affect them. This will mean people can make more informed choices about how they use this tech, and how they interact with those who make it.
Similarly, the Anansi team are getting stuck into planning what the rest of 2020 looks like for them as they wind down a number of pieces of work, including the Synthetic Voice & Personality Study, the recent collaboration with BBC News Labs exploring segmented audio for News production, and the audio augmented reality prototypes (watch out for an online demo that takes some of that work and puts it online).
Meanwhile, over in recommendations corner, the Data Team are working with Queen Mary University to look into cross-media recommender models, so content from across the BBC can be suggested across different channels - TV, radio and online. In parallel, those who work with the speech-to-text system have been experimenting with automatically identifying trending words and different types of non-speech, such as applause or traffic, as well as generating phonetic transcriptions (all of which seem ridiculously useful).
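As a rough illustration of the trending-words idea (a hypothetical sketch, not the Data Team's actual method), one simple approach compares how often a word appears in a recent window of transcripts against a baseline period, and ranks words by how much their rate has risen:

```python
from collections import Counter

def trending_words(baseline_texts, recent_texts, min_count=2, smoothing=1.0):
    """Rank words whose frequency in recent transcripts has risen
    relative to a baseline period. Purely illustrative: real systems
    would also normalise, stem, and filter stop words."""
    baseline = Counter(w for t in baseline_texts for w in t.lower().split())
    recent = Counter(w for t in recent_texts for w in t.lower().split())
    baseline_total = sum(baseline.values()) or 1
    recent_total = sum(recent.values()) or 1

    scores = {}
    for word, count in recent.items():
        if count < min_count:  # ignore one-off words
            continue
        recent_rate = count / recent_total
        # Additive smoothing so unseen baseline words don't divide by zero
        baseline_rate = (baseline[word] + smoothing) / (baseline_total + smoothing)
        scores[word] = recent_rate / baseline_rate

    return sorted(scores, key=scores.get, reverse=True)

# Example: "lockdown" appears far more often in the recent window
baseline = ["the weather report today", "today the news at six"]
recent = ["lockdown rules explained", "new lockdown measures", "the lockdown today"]
print(trending_words(baseline, recent)[0])  # → lockdown
```

The same frequency-ratio framing extends naturally to per-day windows over a rolling transcript archive; the `min_count` and `smoothing` parameters here are made-up knobs for the sketch.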
W3C Web Standards
In the W3C Media & Entertainment Interest Group, Chris hosted a meeting at which Microsoft introduced a proposed client-side video editing API, generating some useful feedback. Chris also participated in the CSS Working Group, to talk about media queries for the video rendering capabilities of TV devices.
This post is part of the Internet Research and Future Services section