BBC R&D

Posted by Scott Cawley

BBC Research & Development visits IBC 2016 to present our latest work on our own stand and to take part in conference sessions, sharing what we've learned on subjects as varied as HDR, object-based production, subtitling, multi-language translation and 360-degree video.

If you’re visiting the Amsterdam RAI Exhibition and Congress Centre, we’ll be at 8F20 - that’s Hall 8, on stand 20, exhibiting in the Future Zone - plus we’ll be popping up on other stands elsewhere at the show, and we’re presenting several talks throughout the IBC conference.  If you aren’t going (or even if you are) make sure to follow our daily updates and any last minute changes here on the blog, or via @BBCRD on Twitter or on Facebook.

Here’s our schedule of where, when and what you can expect from the BBC R&D team at IBC:

On the BBC Research & Development Stand:

Download our guide to what we're showing on our stand this year.

IP Studio
Since 2012 our IP Studio project has investigated new approaches to capturing, producing and delivering content. Visit us to see our vision of IP Studio as a model for future broadcasting, built on numerous experiments including using the system to deliver coverage of the 2014 Commonwealth Games. Since then we have been working with industry partners through the Advanced Media Workflow Association (AMWA) to test the interoperability of our techniques with systems from outside the BBC. Stand visitors will be able to find out the latest project updates and where we plan to take IP Studio next.

Nearly Live Production
Our prototype Nearly Live Production tool, previously codenamed Primer, presents the operator with a single-screen interface built as a web browser application. It is similar in nature to a clean switcher, selecting a camera to be 'live' as the action unfolds, but also features a broadcast preview. Nearly Live Production has been integrated with IP Studio, and this combination was recently tested at the T in the Park music festival in July and at last month's Edinburgh Festival; you can see findings from both of these experiments on our stand.

Squeezebox
Squeezebox enables users to adjust the duration of a video package using a simple slider control. Footage uploaded to Squeezebox is automatically analysed and segmented into individual shots. Then, rather than manually re-editing the footage to create versions of different durations, the user marks up the most relevant and important portions of each shot, indicating that the rest is a candidate for being cut. The user also marks up the priority of each shot, determining how the footage behaves as the duration is reduced - which shots will hit the cutting room floor first, and which will be preserved. Using this metadata, as the slider is moved, the purpose-built algorithm establishes new in and out points per shot, and in some cases drops shots entirely.
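The behaviour described above can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not BBC R&D's actual algorithm: the `Shot` fields and the trimming policy (trim optional material proportionally, then drop whole shots in ascending priority order) are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    duration: float   # full shot length in seconds
    essential: float  # marked-up "must keep" portion, <= duration
    priority: int     # higher = more important, cut last

def squeeze(shots, target):
    """Return new per-shot durations that fit the target duration.

    Hypothetical policy: first shrink every shot's optional tail by a
    common ratio; if even the essential portions don't fit, drop the
    lowest-priority shot entirely and try again.
    """
    kept = list(shots)
    while kept:
        total = sum(s.duration for s in kept)
        floor = sum(s.essential for s in kept)
        if floor <= target:
            # Trim the optional portion of each remaining shot.
            slack = total - floor
            keep_ratio = 1.0 if slack == 0 else min(1.0, max(0.0, (target - floor) / slack))
            return [s.essential + (s.duration - s.essential) * keep_ratio for s in kept]
        # Essentials alone exceed the target: cut the lowest priority shot.
        kept.remove(min(kept, key=lambda s: s.priority))
    return []
```

For example, squeezing three shots totalling 24 seconds down to 12 trims each shot towards its essential core; squeezing to 6 seconds drops the two lower-priority shots entirely.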

Visual Perceptive Media
Visual Perceptive Media is a film that can change its form based on the situation in which it is being viewed. The film is composed of a number of small content pieces: video clips, sounds and soundtracks, and visual filters (grading). The film as played out is a real-time composition of these pieces in response to the viewing situation: the time of day, the location and the characteristics of the audience. These could be individual preferences for a solo viewer, or group characteristics (a matinee or late-night audience). The current demonstration uses data from a phone app to identify individual preference profiles, including music genre preferences. This data then informs the composition of the film: pacing, mood, cinematic effects and storyline twists.
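A toy sketch of this kind of profile-driven composition might look like the following. Everything here is hypothetical - the profile fields, variant names and selection rules are invented for illustration and are not the Visual Perceptive Media implementation:

```python
# Hypothetical variant pools: one set of options per slot in the film.
variants = {
    "opening": {"electronic": "open_synth.mp4", "orchestral": "open_strings.mp4"},
    "grade":   {"late": "grade_moody.cube", "day": "grade_neutral.cube"},
}

def compose(profile):
    """Pick one variant per slot from a viewer profile (illustrative rules only)."""
    playlist = [
        variants["opening"].get(profile.get("music_genre"),
                                next(iter(variants["opening"].values()))),
        variants["grade"].get(profile.get("time_of_day"),
                              next(iter(variants["grade"].values()))),
    ]
    # A group audience might get broader, less personalised pacing.
    pacing = "relaxed" if profile.get("group") else "personal"
    return playlist, pacing
```

The real system composes many more dimensions (mood, cinematic effects, storyline twists), but the principle is the same: the playout is assembled per viewing from a pool of pre-made pieces.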

The BBC R&D CAKE teaser trailer

Cook-Along-Kitchen-Experience (CAKE)
The Cook-Along Kitchen Experience is a new experiment from BBC R&D that demonstrates what can be achieved through object-based media, using a cooking programme as an example of a learning experience. CAKE is a real-time, interactive cookery show that changes as you cook with it. It customises recipes based on your familiarity with ingredients and methods, your tastes or dietary preferences, and how many people you're inviting round for dinner. The experience reacts 'in the moment' to your progress, allowing you to create new dishes at your own pace. Novices can level up and experts can cut to the chase, supported by an evolving dialogue between audience and presenter.


VR and 360 with BBC Taster
At IBC 2016 we will be hosting demonstrations of some of our most recent and engaging pilots. These include two 'true VR' pilots: 'We Wait', a virtual reality experience produced with Aardman Animations that puts viewers on the frontline of a migrant's perilous journey across the sea from Turkey to Greece, and 'The Turning Forest', an enchanting immersive VR experience incorporating interactive spatial sound (binaural audio) as part of a compelling fantasy produced with award-winning filmmaker Oscar Raby. We will also be showcasing 'The Vic', which tests TV on-set environment modelling by allowing users to explore perhaps Britain's most famous pub - The Queen Vic from the ever-popular soap EastEnders. As well as trying out these pilots on the latest VR headsets, visitors to the BBC R&D stand will be able to view some of our 360-degree video pilots, including a walk with dinosaurs and Sir David Attenborough. There is also a whole host of other new content ideas currently featured on BBC Taster that you can try out on mobile, tablet or desktop.


 
BBC Research & Development Contributions on Partner Stands:

Download a guide to things we're doing on other, non-BBC stands at IBC this year.

EBU Stand (Hall 10 - 10.F20 - EBU)
The IP Studio team will contribute to the "end-to-end IP" demo on the EBU stand, providing the live capture system alongside AMWA Incubator partners and sending live and stored content from our store to downstream IP distribution and personalisation systems from other EBU partners.

Also on the EBU stand, the BBC R&D team will be providing encoded content for a demonstration of UHD-1 Phase 2 (3840x2160p100 with HLG HDR) delivered via DVB-T2 and DVB-DASH.

There will also be a demonstration of a personalised news channel produced live by IP Studio. Users can skip the live content and access a custom playlist created by the EBU Recommendation System. User authentication will be managed using Cross Platform Authentication, work that BBC R&D contributed to.

Joint Task Force on Networked Media (Hall 8 – Stand 8.D10 - IBCTV IP Studio)
The IP Studio team will also be working with the Joint Task Force on an interoperability demo in the IBCTV IP Studio area in Hall 8, demonstrating in particular the Advanced Media Workflow Association (AMWA) Networked Media Open Specifications' discovery and registration specification together with other industry partners.

COGNITUS Horizon 2020 Project (Hall 7 – Stand 7G.16 - VITEC)
COGNITUS - the Horizon 2020 project led by BBC R&D - will be showing the group's content on VITEC's display screens throughout the event, and more information on the project will be available on the stand.

BBC Research & Development Technical Papers & Presentations:


BBC R&D colleagues will be presenting technical papers based on their work throughout the week at IBC. This is a great opportunity to get details on the work our teams do day to day - you can find all these sessions in the Emerald Room.

Future-proofing Live IP: an emerging roadmap towards an industry architecture
In the Paper Session: Lessons from Experimental IP Studios
The most fundamental advance in production since video went digital is on the near horizon, and it will revolutionise both the art and the technology of media creation. IP interconnectivity and IT-centric architectures will, so the experts say, improve flexibility, creative potential and efficiency. Researchers from a major international collaboration will reveal what they have learned from the world's first fully-IP live television studio, including the efficiency of production and interoperability between equipment vendors using open standards. Exciting, too, will be a look at the AMWA media incubator, which is being used to develop new standards and to explore the integration and interoperability of inherently non-real-time processes. We shall also examine the critical role of precision timing within the IP studio.
With Peter Brightwell, Friday 9th September, 12:00 - 13:30


Automatic recovery and verification of subtitles for large collections of video clips
In the Paper Session: Novel Technologies for Assisting Sensory-Impaired Viewers
This session, preceded by a complimentary networking breakfast, aims to become a regular and informative forum for all those involved in these diverse and developing technologies. Legislation across most of the world now ensures that sensory-impaired viewers have access to an increasing amount of content through services such as: subtitling, signing, audio description, slow speech and clean speech. But the quality and production efficiency of these services are advancing all the time. Discover the most novel technologies in live subtitling, virtual human deaf signing and automatic subtitling of video clips. And come to meet the innovators themselves.
With Mike Armstrong, Saturday 10th September, 08:45 - 10:15

Directing attention in 360-degree video
In the Paper Session: Exploring New Ideas in VR and 360º Immersive Media
The affordability of fast processors and high-resolution head-mounted displays is creating a new and exciting market in 360° media - both video and 3D-captured VR environments. These systems have the potential to produce an alarming sense of reality and to present huge creative opportunities for storytellers. But many difficulties remain in creating narrative with these media, not least that conventional production grammar doesn't apply when the viewer is part of the action. In this fascinating session we shall explore such fundamentals as: how can we prompt a 360° viewer to look in the right direction, how might we create an environment for live 360° distribution, and how might we broadcast mixed reality?
With Alia Sheikh, Saturday 10th September, 10:45 - 12:15

Creating object-based experiences in the real world
In the Paper Session: Novel Ideas and Cutting Edge Technologies
Every year we receive papers describing novel ideas and innovative developments which just don't fit into any session category. Each year we bring these to you in the special and popular 'Cutting Edge Technologies' session. The 2016 collection of such papers explores: the very latest ideas in virtual production; new approaches to crowd-sourced news gathering; a feast of novel studies in object-based media; and a very innovative approach to gaze-tracking by processing views of the cornea. Our supporting paper explores voice control in an 'internet of things' environment. Widen your horizons and join us for a thought-provoking glimpse of what the future may hold.
With Michael Evans, Saturday 10th September, 13:00 - 14:45

The open-source Turing codec: towards fast, flexible and parallel HEVC encoding
In the Paper Session: New Applications of High-Efficiency Video Coding
HEVC is a vastly complex compression technology which will be essential for the transmission of UHD television services through any practical channel. Although it is an international standard, much scope remains for improvements in performance, flexibility and architectures. It is these dimensions of maturing HEVC technology that we shall feature in this enlightening session. We shall explore: multi-rate HEVC coding for adaptive streaming; an open-source software codec designed from scratch with a fast, parallel architecture; and, hot out of the lab, developments on how HEVC will handle the greater bit depth required by HDR imagery.
With Saverio Blasi, Sunday 11th September, 08:30 - 10:00

Video Translation: weaving synthetic voices into the multilingual production workflow
In the Paper Session: Advanced Ideas in Audio Production
Too often when we contemplate new developments in production such as VR or IP in the studio, we concentrate on video.  In this session we shall examine three fascinating new influences on audio production and also look at how they will impact on workflow and content management. First a holistic examination of IP audio production which will recommend a new IP-based facility control structure together with dynamic allocation of production resources. Next some novel ideas and innovative workflows in multilingual production which have already been prototyped and applied by BBC News.  Then a fascinating look, involving engineering and psychology, at how audio may be used in immersive storytelling to stimulate emotional responses.
With Susanne Weber, Monday 12th September, 08:30 - 10:00

Image Adaptation Requirements for High Dynamic Range Video under Reference and Non-Reference Viewing Conditions
By Manish Pindoria and Simon Thompson.
This is a supporting paper for the session: A Brighter Future: High Dynamic Range TV and wide colour gamut - and so won’t be presented at IBC, but will be published as part of the Technical Conference.

BBC Presenting Elsewhere at IBC:


Industry Support for Interoperability: All pulling in the same direction
A panel of personnel from the major trade bodies will explain how their organisations are contributing to open specifications and standards in the move to IP based systems.
At the Technology In Action Theatre (Hall 3 - 3.B22) with Alex Rawcliffe, Friday 9th September, 12:00
 
Ultra HD Master Class: Overview of the Hybrid Log-Gamma HDR System
BBC R&D's Andrew Cotton will explain our motivation for developing the BBC/NHK Hybrid Log-Gamma (HLG) HDR solution. Find out how it works, how it differs from PQ (SMPTE ST 2084), how it removes the need for metadata, and how to transcode between the two solutions. The presentation will also discuss how to signal HLG in distribution applications, allowing the signal to deliver a compatible picture to SDR UHD displays.
At the SES Seminar Series (Hall 1 Balcony Suite - BM10/BM11) with Andrew Cotton, Friday 9th September, 15:00 or Monday 12th September, 12:00
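As background to the HLG session above, the HLG opto-electrical transfer function (OETF) published in ITU-R BT.2100 is compact enough to transcribe directly. The sketch below uses the standard's constants but is for illustration only: it covers just the OETF, omitting the rest of the signal chain (OOTF, colour handling).

```python
import math

# HLG OETF constants from ITU-R BT.2100 / ARIB STD-B67.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map normalised scene light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like square-root toe
    return A * math.log(12 * e - B) + C  # logarithmic curve for highlights
```

The square-root lower segment is close to a conventional SDR camera curve, which is part of why an HLG signal can deliver a watchable picture on SDR displays without metadata; the logarithmic upper segment extends the range for highlights.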

 
An Update on Advanced Media Workflow Association (AMWA) Incubator
Get an overview of the work happening in AMWA on hands-on testing of interoperability of discovery and control for IP-based media systems, and the Networked Media Open Specifications that the group is creating.
At the EBU Stand (Hall 10 - 10.F20) with Peter Brightwell, Friday 9th September, 16:00
 
What do audiences really want? The truth about changing TV consumption
What's really happening to TV consumption? Is linear TV really doomed? Is YouTube the future? Are Millennials actually switching off or over? Are pay-TV subs really cutting the cord? What do the numbers really tell us? A panel of leading experts tells us the real story - and paints a credible picture of the future of the TV market.
At The Forum with David Bunker, Head of BBC Audiences, 10th September, 09:45 - 11:00

The Turing Codec at EBU’s Open Source Meet-up
A lightning talk about the Turing Codec.
At the EBU Stand (Hall 10 - 10.F20) with Saverio Blasi, Saturday 10th September, 16:30 - 18:00
 
The Future is Now
What tech breakthroughs are revolutionising the media industry? What are the latest cutting-edge projects and products still being developed in research labs? What amazing tech solutions offer new ways to make creative ideas a reality? Early adopters will give a glimpse into how the magic is done. The session will conclude with quick-fire predictions about the most astonishing discoveries and developments that will change the world in the next decade. How will they change consumer experiences? What skills will be in demand in the future? What do you need to learn today to be competitive in the robot-dominated future?
At G102/3 with Si Lumb, Saturday 10th September, 16:30 - 18:00


Networked Media Incubator Project: One Year On
The release of the first Networked Media Open Specification (NMOS) seven months after the launch of the project at IBC 2015 demonstrated how much has been achieved by a group of enthusiastic participants. This session will explain the significance and future direction of the Incubator project and NMOS, for both suppliers and end users, with an overview of our progress so far. To date 41 organisations have taken part, 23 of which have participated in one or more of three workshops.
At the AMWA stand (Hall 3 - 3.B22) with Peter Brightwell, Monday 12th September, 12:00 - 12:30

If you're at IBC 2016 then come to our stand and say hi - and if you see any of the things listed here, be sure to share your pictures, posts and tweets on Facebook and Twitter with the hashtag #BBCIBC - we'll highlight the best in a daily post throughout the course of IBC.