
Posted by Chris Newell on

360 Video storytelling, voice-controlled radio and face recognition are some of the many projects featured in Sprint notes from the IRFS team.

This week Zillah published a blog post about our work on 360 Video storytelling and filmmaking together with Peter Boyd Maclean.

Talking with Machines

This sprint saw Henry talking about TwM and presenting our work on the prototype voice-controlled radio player to the Radio & Music Multiplatform team and then again later at a BBC Academy event. It’s been good to show our work to people around the BBC - we’ve had a lot of excited feedback and made some new friends around the corporation.

Meanwhile Ant & Tom were making progress on the prototype, setting up development and testing environments, improving voice commands for podcasts and implementing logging to help test and improve the player.
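As a flavour of what that logging might look like (an illustrative sketch only, not the player's actual code - the intent names and fields are made up), each voice interaction could be written out as a line of JSON for later review:

```python
import json
import logging
import time

# Illustrative sketch only - not the player's actual code. Each recognised
# utterance and the intent it was matched to is logged so transcripts can be
# reviewed later to spot commands the player mishears or fails to handle.
logging.basicConfig(filename="voice_player.log", level=logging.INFO,
                    format="%(message)s")

def log_interaction(utterance, intent, confidence, succeeded):
    """Record one voice interaction as a line of JSON."""
    logging.info(json.dumps({
        "timestamp": time.time(),
        "utterance": utterance,    # what the speech recogniser heard
        "intent": intent,          # e.g. "play_podcast" (made-up intent name)
        "confidence": confidence,  # recogniser's confidence score
        "succeeded": succeeded,    # did the player act on it successfully?
    }))

log_interaction("play the latest Friday Night Comedy", "play_podcast", 0.87, True)
```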

We’re planning to test our radio player in two ways: first, members of the team and BBC staff in an alpha testing group will conduct independent expert user testing with their own devices. Second, we’ll provide a testing script to facilitators for ad-hoc testing.

Face recognition

Jana has been getting OpenFace, the open-source face recognition system, working within Docker. The system now runs much faster than before: we can process 60 frames of video in 10 seconds, including the overhead of booting the Docker container and loading the neural network model. We expect this time to drop significantly with better staging of our data.
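For anyone curious how that kind of end-to-end timing is taken, here's a rough sketch (the Docker image name and the container's command-line interface below are assumptions, not the real setup): wrapping a single docker run means container boot and model loading are counted alongside the recognition itself.

```python
import subprocess
import time
from pathlib import Path

FRAMES_DIR = Path("frames")   # e.g. 60 still frames extracted from the video
IMAGE = "bbcrd/openface"      # hypothetical name for an OpenFace Docker image

frames = sorted(FRAMES_DIR.glob("*.png"))

start = time.perf_counter()
# A single `docker run` over the whole batch, so the measurement includes
# container boot and neural-network model loading as well as recognition.
subprocess.run(
    ["docker", "run", "--rm",
     "-v", f"{FRAMES_DIR.resolve()}:/frames",
     IMAGE, "/frames"],        # the container's CLI here is assumed, not real
    check=True,
)
elapsed = time.perf_counter() - start
print(f"Processed {len(frames)} frames in {elapsed:.1f}s "
      f"({len(frames) / elapsed:.1f} frames/s)")
```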

Denise has been continuing her work applying face detection to MPEG-DASH streams, where she is trying to approach real-time speeds using the IP Studio project APIs. Currently an extra encoding step is required, but once this is eliminated she should be able to reduce the overhead by 50%.
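As a very rough sketch of the shape of such a pipeline - using OpenCV's Haar cascade detector as a stand-in for the project's actual detector, and a plain video URL in place of MPEG-DASH segments fetched via IP Studio - the idea is to run detection straight on decoded frames rather than re-encoding the stream first:

```python
import cv2  # OpenCV's Haar cascade as a stand-in detector, not the project's own

# Hypothetical URL: in the real pipeline the frames come from MPEG-DASH
# segments fetched via the IP Studio APIs, not a single progressive file.
STREAM_URL = "http://example.org/news-stream.mp4"

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(STREAM_URL)
frame_index = 0
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # Detecting on the decoded frames directly avoids re-encoding the stream,
    # which is the extra step the current pipeline is trying to eliminate.
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    print(frame_index, len(faces))
    frame_index += 1
capture.release()
```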

Editorial algorithms

The Discovery team welcomed Matt Spendlove back into IRFS to work on a new Stream Builder application for our Editorial Algorithms project. Stream Builder will allow users to customise their stream to find articles relevant to their interests. David finalised the UX and design for the app, enabling Matt to begin development of the UI. Underlying this, Matt has also completed the Streams API, with interactive Swagger documentation, for serving, creating and updating a generic representation of a stream.
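The notes above don't describe the actual schema, but as a hypothetical illustration of what "a generic representation of a stream" served, created and updated through an API might look like:

```python
import uuid
from dataclasses import dataclass, field
from typing import List

# Hypothetical, minimal "stream" representation - the real Streams API schema
# isn't described in these notes.
@dataclass
class Stream:
    name: str                                             # e.g. "UK politics"
    interests: List[str] = field(default_factory=list)    # topics the user has picked
    article_ids: List[str] = field(default_factory=list)  # articles matched so far
    id: str = field(default_factory=lambda: str(uuid.uuid4()))

streams = {}  # stand-in for whatever storage sits behind the API

def create_stream(name, interests):
    stream = Stream(name=name, interests=interests)
    streams[stream.id] = stream
    return stream

def update_stream(stream_id, interests):
    streams[stream_id].interests = interests
    return streams[stream_id]

def serve_stream(stream_id):
    return streams[stream_id]

politics = create_stream("UK politics", ["Brexit", "House of Commons"])
update_stream(politics.id, ["Brexit", "House of Commons", "Supreme Court"])
print(serve_stream(politics.id))
```

In the real Streams API these operations sit behind HTTP endpoints documented with Swagger; the sketch only shows the data shape and the serve/create/update operations.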

Olivier, Kate and Katie have been working together to map out our future user research efforts. Katie is continuing to engage with various parts of the BBC (including BBC Live, BBC Sport, and "Trust Me I'm a Doctor") to increase awareness of the Editorial Algorithms project's successes over its lifetime, and to shape the life of the platform and tools after the project ends.

In our collaboration with NewsLabs on ELMer, the External Linking Manager, Thomas has exposed a statistics API that provides a per-publisher breakdown by week and a list of news stories associated with external links, so that the NewsLabs team can continue to develop the frontend for presenting this information. Thomas has also replaced our Content Analysis entrypoint with an AWS Lambda function, so we have fewer deployed instances to maintain and lower running costs.
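For anyone who hasn't used it, AWS Lambda replaces a permanently running server with a handler that is invoked on demand, which is where the savings come from. A minimal, hypothetical handler in the same spirit (the real Content Analysis function's event format and analysis logic aren't described here) looks like this:

```python
import json

# Hypothetical handler standing in for the Content Analysis entrypoint; the
# real function's event format and analysis logic aren't described here.
def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    text = body.get("text", "")

    # Placeholder "analysis" - in reality this would call the semantic tagger
    # or other Content Analysis services.
    word_count = len(text.split())

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"word_count": word_count}),
    }
```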

Fionntán has got his news article clustering system working, using entities from our semantic tagger and a home-cooked salience algorithm. The aim is to identify big news stories and give journalists a high-level view of events.
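The salience algorithm itself isn't described here, but the general approach can be sketched as follows: weight each entity by how informative it is (an IDF-style weight stands in for the real salience score), measure article similarity as salience-weighted entity overlap, and group articles whose similarity crosses a threshold. Everything below - the data, the weighting and the threshold - is illustrative rather than the actual system.

```python
from collections import defaultdict
from itertools import combinations
import math

# articles: id -> entities found by a semantic tagger (toy data)
articles = {
    "a1": {"Theresa May", "Brexit", "Supreme Court"},
    "a2": {"Brexit", "Supreme Court", "Article 50"},
    "a3": {"Andy Murray", "ATP World Tour Finals"},
}

# Salience stand-in: rarer entities are treated as more informative (an
# IDF-style weight); the project's actual salience algorithm isn't described.
doc_freq = defaultdict(int)
for entities in articles.values():
    for entity in entities:
        doc_freq[entity] += 1

def salience(entity):
    return math.log(len(articles) / doc_freq[entity]) + 1.0

def similarity(a, b):
    """Salience-weighted overlap between two articles' entity sets."""
    shared = articles[a] & articles[b]
    combined = articles[a] | articles[b]
    return sum(salience(e) for e in shared) / sum(salience(e) for e in combined)

# Greedy single-link clustering: articles whose similarity exceeds a threshold
# end up in the same story cluster.
THRESHOLD = 0.3
parent = {a: a for a in articles}

def find(x):
    while parent[x] != x:
        x = parent[x]
    return x

for a, b in combinations(articles, 2):
    if similarity(a, b) >= THRESHOLD:
        parent[find(a)] = find(b)

clusters = defaultdict(list)
for a in articles:
    clusters[find(a)].append(a)
print(list(clusters.values()))  # e.g. [['a1', 'a2'], ['a3']]
```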

Atomised Media

Barbara, Joanne and Tim have been designing a study to get in-depth qualitative insights from the Newsbeat Explains pilot. Tim is building an online tool that will track events attached to each user; participants will then be prompted to complete a survey describing the context in which they read the stories. This will allow us to understand how they read the stories, the sequence of interactions and the context (e.g. while commuting, at home, etc.). In parallel, Barbara has been moving forward with the online survey in collaboration with BBC Marketing & Audiences - this is aimed at giving us more qualitative data, using more in-depth questions than those currently on BBC Taster.

Out and about

Chris N met with colleagues in R&D to discuss high dynamic range colour on the web.

Thomas and Chris N attended EBU DevCon 2016 where Chris presented a short update on the W3C TV Control and Second Screen Working Groups. The main topic of this year's conference was container and orchestration technologies such as Docker, Docker Swarm, and Kubernetes.

Thomas also gave a talk at the London School of Economics about data: a bit of history from physical to digital, connecting content, the World Service Archive, ELMer and Editorial Algorithms.

Manish attended muCon, the microservices conference hosted by Skills Matter. It was a brilliant conference covering all aspects of microservices, from security, deployment and automation to social impact.

Fionntán attended Web Summit 2017, a tech/startup expo with more than 50,000 attendees. The talks covered technology and its cultural effects on a wide variety of areas, including media, AI, robotics, sports, software, health and government. One topic relevant to IRFS was the echo chamber/filter bubble effect, where it was claimed that no one has a good solution because solving the problem isn't profitable. This could therefore become an important research topic.

