Research & Development

Posted by Scott Cawley

We've arrived and are ready for the show to open tomorrow morning - but Day 1 of IBC is all about the conference. As things get going, here are our highlights and sessions to check out.


We'll have a daily blog with updates from IBC between now and Tuesday - so check the site each day to hear the latest about what BBC R&D are doing there, read our tips on what to see - and find out about other highlights as we get out onto the show floor.

R&D's Henry Cooke made it to IBC (after some weather delays) to deliver his paper this morning, our first bit of activity in Amsterdam.  His session, "Intelligent Media Interfaces", looks at how, as deep-learning algorithms become more commonplace alongside the popularity of voice-controlled devices like Amazon's Alexa, our devices are increasingly intelligent and able to learn a lot about us.

Do also check out a number of other conference sessions today which we're interested in. If talks like "The next generation TV channel may not be broadcast", "Enhancing the audience experience through OTT" or "What does the future of TV delivery look like?" appeal to you, then you might be interested in our work on synchronising broadcast and IP video, which we wrote about only yesterday and are demonstrating at IBC...

And David Johnston from our Reality Labs will be talking in the "VR: The Top of the Crop" session later today, highlighting the best from the last 12 months and what lies ahead - David has already written about our thoughts on this.

On our own stand in the IBC Future Zone in Hall 8 (stand 8.G10) we'll be highlighting five projects that we're working on right now.  IP Studio returns to the show again this year - and we'll be showing how we've now moved the technology into the cloud, allowing scalable, flexible production to take place over the public internet and enabling users to produce live multi-camera video via a web browser.

A woman using our web-based live video production tool.

Also on show will be our work and findings from our Atomised News trials.  Newsbeat Explains is a prototype that we built to break down stories into smaller, reusable, self-contained pieces of information linked together by metadata.  The aim was to make stories easier to understand, but it also enables non-linear and contextual content experiences. The pilot was so successful that it became part of the BBC News UK General Election coverage earlier this year. We're now working on creating atomised videos of multiple lengths from "unatomised" content.  We'll be taking a closer look at this on the daily blog tomorrow.
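To illustrate the atomised approach, here's a minimal sketch of story "atoms" linked by metadata and assembled into a contextual sequence. All class names, fields and example stories here are hypothetical illustrations, not the actual Newsbeat Explains schema:

```python
from dataclasses import dataclass, field

@dataclass
class Atom:
    # A small, self-contained piece of a story (hypothetical structure).
    id: str
    text: str
    topics: set = field(default_factory=set)      # metadata linking atoms together
    requires: list = field(default_factory=list)  # atoms a reader should see first

def assemble(atoms, topic):
    """Build a non-linear, contextual sequence: pick atoms matching a topic,
    pulling in any prerequisite atoms first so the story still makes sense."""
    by_id = {a.id: a for a in atoms}
    seen, order = set(), []
    def add(a):
        if a.id in seen:
            return
        seen.add(a.id)
        for req in a.requires:   # prerequisites come before the atom itself
            add(by_id[req])
        order.append(a)
    for a in atoms:
        if topic in a.topics:
            add(a)
    return order

atoms = [
    Atom("bg", "What is inflation?", {"economy"}),
    Atom("news", "Rates rise again.", {"economy", "politics"}, requires=["bg"]),
]
print([a.id for a in assemble(atoms, "politics")])  # ['bg', 'news']
```

Because each atom is self-contained and carries its own metadata, the same pool of atoms can be reassembled for different contexts - a reader who already knows the background would simply skip the prerequisite atoms.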


IBC 2017 sees the first outing for BBC R&D's new venture - Reality Labs.  Over the last two years we've carried out lots of successful work in the world of VR and 360 video - and Reality Labs will continue that success, looking into WebVR, a new BBC Taster VR app, as well as work on a new Augmented Reality project which we'll be demonstrating at IBC 2017.

A scene from a virtual reality trial - a user's hands in virtual reality move towards a box from which a face mask emerges, followed by a trail of pink smoke.

We'll be showing some amazing work by our Speech-to-Text team - BBC R&D actually has a paper on the subject being presented as part of the conference on Monday morning, but you can meet our team at our stand throughout the show.

Speech to Text

Our transcription tools are now in widespread use across the BBC, helping producers quickly create social media content, enabling them to better and more quickly search the programme archive, and much more besides. We'll explain how we used our video and subtitle archives and open source software to develop these tools - and how you could too.
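As a rough sketch of how a subtitle archive can yield speech-to-text training data, the snippet below parses SRT-style subtitle cues into timed text segments, which could then be paired with the matching stretches of audio. The format handling and sample subtitles are generic illustrations, not BBC R&D's actual pipeline:

```python
import re

# Match SRT-style timecodes like "00:00:01,000 --> 00:00:03,500".
CUE = re.compile(
    r"(\d{2}):(\d{2}):(\d{2})[,.](\d{3})\s*-->\s*"
    r"(\d{2}):(\d{2}):(\d{2})[,.](\d{3})"
)

def to_seconds(h, m, s, ms):
    return int(h) * 3600 + int(m) * 60 + int(s) + int(ms) / 1000

def parse_srt(srt_text):
    """Turn an SRT document into (start_seconds, end_seconds, text) segments -
    the timed transcript pairs a speech-to-text system can train on."""
    segments = []
    for block in srt_text.strip().split("\n\n"):
        lines = block.splitlines()
        for i, line in enumerate(lines):
            m = CUE.search(line)
            if m:
                start = to_seconds(*m.groups()[:4])
                end = to_seconds(*m.groups()[4:])
                text = " ".join(lines[i + 1:]).strip()
                segments.append((start, end, text))
                break
    return segments

srt = """1
00:00:01,000 --> 00:00:03,500
Hello and welcome.

2
00:00:04,000 --> 00:00:06,250
Today we look at the archive."""

for start, end, text in parse_srt(srt):
    print(f"{start:.2f}-{end:.2f}: {text}")
```

Each segment gives a span of audio and the words spoken in it - the raw material for training or evaluating a transcription model, with the caveat that broadcast subtitles are often lightly edited and need cleaning and re-alignment first.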

Earlier this week we published a blog post about our object-based audio production work as part of the ORPHEUS project we are involved in. Our drama The Mermaid's Tears was the first ever live object-based radio programme, allowing the listener to move between multiple narratives and follow three characters as they move in and out of the scene.

Interface of the ORPHEUS Mermaid's Tears trial.

Visit us on the stand to try the experience, and to find out more about the control interfaces for creating object-based productions, which we have installed in our special studio in Broadcasting House, and the ADM metadata which powers the experience.
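The idea behind object-based audio is that each sound is delivered as a separate object with metadata, and the final mix is rendered for each listener rather than fixed at broadcast. Here's a toy sketch of that principle - the object names, fields and gain values are invented for illustration, and real productions describe their objects with the ADM metadata standard rather than anything like this:

```python
# Each audio object carries its samples plus metadata; here the metadata is
# just a "group" tag (illustrative only - not ADM).
objects = {
    "narrator_a": {"samples": [0.5, 0.4, 0.3], "group": "storyline_a"},
    "narrator_b": {"samples": [0.2, 0.2, 0.2], "group": "storyline_b"},
    "ambience":   {"samples": [0.1, 0.1, 0.1], "group": "bed"},
}

def render(objects, gains):
    """Sum the objects sample-by-sample, weighting each by the gain the
    listener's chosen narrative assigns to its group."""
    n = max(len(o["samples"]) for o in objects.values())
    mix = [0.0] * n
    for o in objects.values():
        g = gains.get(o["group"], 0.0)
        for i, s in enumerate(o["samples"]):
            mix[i] += g * s
    return mix

# A listener following storyline A: full gain on A, B ducked, ambience quiet.
print(render(objects, {"storyline_a": 1.0, "storyline_b": 0.2, "bed": 0.5}))
```

Switching narratives is then just a change of gains at the listener's end - the same delivered objects, a different rendered mix - which is what lets a drama like The Mermaid's Tears follow different characters from one stream.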

BBC R&D colleagues are in many other places across the show as well - we're part of the IP Showcase, we're showing our work on synchronising broadcast and IP video streams on the Opera TV stand, and in the EBU zone we'll be demonstrating HDR video conversion with screenings of Planet Earth II. Our COGNITUS team will be on the VITEC stand showing how we are working on enhancing user generated content for future interactive UHD services, and as well as on our own stand, our ORPHEUS work on object-based audio will be present in several spaces across the show.  Full details are on our page listing all of our activity at IBC this year.

Download our guide to what we're showing on our stand this year

Download a guide to things we're doing on other, non-BBC stands at IBC this year

If you're at IBC 2017 then come to our stand and say hi - and if you visit us or see any of the sessions listed here, be sure to share your pictures, posts and tweets on Facebook and Twitter with the hashtag #BBCIBC - we'll highlight the best in a post each day through the course of IBC.