BBC R&D

What we're doing

Current broadcast coverage of live events provides an experience very different to actually being there.

For the audience at the event itself, we would like to be able to enrich their experience with additional live data, analogous to what a broadcast viewer might get through commentary or subtitles.

For viewers at home, what they see is determined by the camera operators and the director, rather than by where they themselves would like to look. For events where many different things are happening at the same time, we would like to give viewers the ability to look around freely. Furthermore, we would like to provide an audio mix corresponding to the part of the scene they have chosen to look at, and to give them details relating to what they are seeing.

The Venue Explorer project is looking at ways of offering this kind of service, building on our previous work on panoramic imaging in the FascinatE project, and our work on navigating around live events in the VSAR project.

Improving the experience for the audience at a live event

Venue Explorer delivering programme notes during a BBC Philharmonic concert in November 2018 at the Bridgewater Hall, Manchester

To enhance the experience of the audience at live events, we have been focusing on a specific use case in partnership with the BBC Philharmonic: providing live programme notes that audience members can view on their mobile phones. The notes are prepared in advance, then triggered at the right moments in the performance by an operator following the musical score, and delivered to a web application. The application can also show the musical score, with an indication of the current location.
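
As a rough illustration, the sketch below shows how a client for this kind of operator-triggered delivery might look. It is a minimal example only: the WebSocket endpoint, the message format, and the element ids are assumptions made for illustration, not details of the actual system.

```typescript
// Minimal sketch of a client for operator-triggered programme notes.
// Assumptions (not from the original): notes arrive as JSON over a
// WebSocket at a hypothetical endpoint, each carrying an id, some HTML
// content, and an optional bar number for positioning within the score.

interface ProgrammeNote {
  id: string;        // unique note identifier
  html: string;      // pre-authored note content
  scoreBar?: number; // bar in the score the note relates to, if any
}

const socket = new WebSocket("wss://example.org/philharmonic/notes"); // hypothetical URL

socket.onmessage = (event: MessageEvent) => {
  const note: ProgrammeNote = JSON.parse(event.data);

  // Show the newly triggered note at the top of the notes panel.
  const panel = document.getElementById("notes-panel");
  if (panel) {
    const item = document.createElement("article");
    item.id = `note-${note.id}`;
    item.innerHTML = note.html;
    panel.prepend(item);
  }

  // If the note carries a score position, move the "you are here"
  // marker on the displayed score.
  if (note.scoreBar !== undefined) {
    const marker = document.getElementById("score-marker");
    marker?.setAttribute("data-bar", String(note.scoreBar));
  }
};
```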

Improving the experience for the audience at home

For remote audiences, we have focused on allowing them to freely explore a wide-angle view of the scene, with audio being re-mixed to match their view.

An ultra-high definition video of a live scene is captured from a fixed wide-angle camera overlooking the whole event. The image is displayed in a conventional tablet or PC web browser, in a way that allows the user to pan and zoom around the scene to explore the areas of most interest to them, much as they would in a map application. We have investigated approaches whereby we only have to transmit the portion of the scene that the user is looking at, significantly reducing the bandwidth requirements. This could work as either a stand-alone or a second-screen experience; a tablet is an obvious starting point for second-screen applications.
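
The sketch below illustrates the tile-selection idea behind this bandwidth saving. It assumes a particular UHD resolution and a fixed 4×4 grid of equally sized tiles, each available as an independent stream; these are illustrative choices rather than details of the actual system. Only the tiles the viewport overlaps need to be requested.

```typescript
// A minimal sketch of viewport-driven tile selection (illustrative
// resolution and grid size assumed, not taken from the original).

interface Viewport {
  x: number; y: number;         // top-left of the view in source pixels
  width: number; height: number;
}

const FRAME_W = 3840, FRAME_H = 2160; // UHD source resolution (assumed)
const TILE_W = 960, TILE_H = 540;     // 4x4 tile grid (assumed)

// Return the (col, row) indices of every tile the viewport overlaps;
// only these tiles need to be fetched, so a zoomed-in view costs a
// fraction of the bandwidth of the full frame.
function visibleTiles(view: Viewport): Array<{ col: number; row: number }> {
  const firstCol = Math.max(0, Math.floor(view.x / TILE_W));
  const lastCol = Math.min(FRAME_W / TILE_W - 1, Math.floor((view.x + view.width - 1) / TILE_W));
  const firstRow = Math.max(0, Math.floor(view.y / TILE_H));
  const lastRow = Math.min(FRAME_H / TILE_H - 1, Math.floor((view.y + view.height - 1) / TILE_H));

  const tiles: Array<{ col: number; row: number }> = [];
  for (let row = firstRow; row <= lastRow; row++) {
    for (let col = firstCol; col <= lastCol; col++) {
      tiles.push({ col, row });
    }
  }
  return tiles;
}

// Example: a viewport zoomed into the top-right quarter of the frame
// touches only 4 of the 16 tiles.
console.log(visibleTiles({ x: 1920, y: 0, width: 1920, height: 1080 }));
```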

To provide an audio feed of the area that the user is currently looking at, we create audio feeds relating to individual areas of the scene and an overall mix suitable for a wide view, and mix between these as the view is changed. When viewing a wide shot, the audio will convey the overall ambience, similar to what would be heard by someone in the audience. As the viewer zooms in to an area, the audio is re-mixed to be appropriate for the selected region. For an application in an athletics stadium, the audio feeds for different events could be obtained from the existing outside broadcast operation, and the ambience feed from a microphone near the camera. For an application in a music or arts event, different audio mixes could be created for different areas, using feeds from many microphones. The work on audio is being carried out by R&D’s Audio team, and forms part of our work on the ICoSOLE project.
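
As a rough sketch of how such view-dependent mixing could be done in a web application, the example below crossfades between an overall ambience mix and a single regional feed using the Web Audio API. The equal-power crossfade and the two-source setup are simplifying assumptions for illustration; the actual system mixes between several feeds.

```typescript
// View-dependent audio mixing with the Web Audio API: a minimal
// sketch, assuming one ambience mix and one regional feed, each
// already attached to an <audio> element with a hypothetical id.

const ctx = new AudioContext();

function connectSource(elementId: string): GainNode {
  const element = document.getElementById(elementId) as HTMLMediaElement;
  const source = ctx.createMediaElementSource(element);
  const gain = ctx.createGain();
  source.connect(gain).connect(ctx.destination);
  return gain;
}

const ambienceGain = connectSource("ambience-audio"); // wide-shot mix
const regionGain = connectSource("region-audio");     // feed for the viewed area

// zoom = 0 -> full wide shot (ambience only);
// zoom = 1 -> fully zoomed into one region (regional feed only).
// An equal-power curve keeps perceived loudness roughly constant
// while the user zooms in and out.
function updateMix(zoom: number): void {
  const z = Math.min(1, Math.max(0, zoom));
  const now = ctx.currentTime;
  ambienceGain.gain.setTargetAtTime(Math.cos((z * Math.PI) / 2), now, 0.1);
  regionGain.gain.setTargetAtTime(Math.sin((z * Math.PI) / 2), now, 0.1);
}
```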

We also acquire data relating to the scene. For an application at an athletics meet, this data could include background information on various athletics events, and live information giving the latest results. For an arts event, it might include the names and biographies of actors. We use a version of the authoring tool we developed for the Augmented Video Player, modified to receive a live video input, to specify the location in the image associated with live data feeds, and also to create additional overlays manually. The user can choose to overlay this information on the image, aligned with the corresponding location, providing an ‘augmented reality’ display. This approach could in future be automated by using techniques such as object and face recognition, potentially allowing details of every athlete visible in a stadium to be made available as an overlay.
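
A minimal sketch of the alignment step is shown below. For illustration, each overlay is assumed to be anchored to a point in source-frame pixels; the anchor is projected through the current pan/zoom state into screen coordinates, so the overlay stays pinned to its subject as the user navigates.

```typescript
// Aligning data overlays with the image: a sketch under the
// assumption (not from the original) of point anchors in source
// pixels and a simple pan/zoom view model.

interface Overlay {
  label: string;  // e.g. an athlete's name or the latest result
  sceneX: number; // anchor position in source-frame pixels
  sceneY: number;
}

interface View {
  x: number; y: number; // top-left of viewport in source pixels
  scale: number;        // screen pixels per source pixel
  screenW: number; screenH: number;
}

// Convert an overlay's scene anchor into screen coordinates, or null
// if the anchor lies outside the current view (overlay hidden).
function project(overlay: Overlay, view: View): { x: number; y: number } | null {
  const x = (overlay.sceneX - view.x) * view.scale;
  const y = (overlay.sceneY - view.y) * view.scale;
  if (x < 0 || y < 0 || x > view.screenW || y > view.screenH) return null;
  return { x, y };
}

// Example: a results overlay anchored at the finish line stays pinned
// to it as the user pans and zooms.
const pos = project(
  { label: "100m result", sceneX: 2200, sceneY: 900 },
  { x: 1920, y: 0, scale: 0.5, screenW: 960, screenH: 540 }
);
console.log(pos); // { x: 140, y: 450 }
```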

Venue Explorer is an example of one way in which broadcasting could move towards what is known as an object-based approach: current TV systems send the same audio and video to everyone, mixed by the broadcaster. In this system, the content is divided into separate ‘objects’: the video is divided up into tiles, the audio is sent as a number of separate streams relating to particular picture areas, and overlay data is sent separately, with information about the place in the image it relates to, and what kind of data it is (results, schedule, etc). The user’s application assembles these objects according to the view selected by the user.
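
To make the idea concrete, the type definitions below sketch what a manifest for such a collection of objects might look like. The field names and structure are illustrative assumptions, not the actual format used by Venue Explorer.

```typescript
// A hypothetical object-based manifest: the client receives separate
// video, audio, and data objects and assembles them itself.

interface VideoTile {
  col: number; row: number; // position in the tile grid
  url: string;              // independent stream for this tile
}

interface AudioObject {
  url: string;
  // Picture area the feed relates to; absent for the overall
  // ambience mix suitable for a wide view.
  region?: { x: number; y: number; width: number; height: number };
}

interface DataOverlay {
  kind: "results" | "schedule" | "biography"; // what kind of data it is
  anchor: { x: number; y: number };           // place in the image it relates to
  payload: string;
}

// Everything the client needs to build its own presentation: it picks
// the visible tiles, mixes the relevant audio objects, and positions
// the overlays, all according to the view the user has selected.
interface VenueManifest {
  video: VideoTile[];
  audio: AudioObject[];
  overlays: DataOverlay[];
}
```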

Outcomes

We conducted the first live public trial of the system at the 2014 Commonwealth Games in Glasgow. We had a camera viewing the opening ceremony in Celtic Park, which we then moved to Hampden Park to cover the athletics and the closing ceremony. In our public demonstration area at the Glasgow Science Centre, we showed both live and recorded content to members of the public. Data feeds for schedules and results were taken from the sources already available to BBC Sport from the games organisers. Audio feeds were taken from a double mid-side (M-S) microphone next to the camera (for stadium ambience) as well as from the feeds for various events produced as part of the main TV coverage. All video and audio feeds on the production side were handled using BBC R&D's IP Studio system.

We also worked with TNO (one of our partners from the FascinatE project), to conduct trials on the open internet, using their tiled streaming application for the iPad.

The results of the trial were presented in a paper at IBC 2014, and shown on BBC R&D's stand in the IBC Future Zone.

Following this, we captured a performance of the Mad Hatter’s Tea Party to investigate an application in the area of music and arts, and showed the results of this on the BBC R&D stand at IBC 2015.

We then conducted various other public trials, including the BBC Philharmonic Red Brick Sessions, and Radio 2's Friday Night is Music Night in February 2017, both of which are discussed in a blog post. NB: these trial applications work only in the Chrome web browser.

The in-venue programme notes system has been in regular use at BBC Philharmonic concerts at the Bridgewater Hall in Manchester since September 2018. Look for the “Philharmonic Lab” symbol in the programme and book a seat in the designated area of the auditorium.

BBC R&D - Exploring film soundtracks with Radio 2 and BBC R&D

Radio 2 - Friday Night is Music Night Remixed

BBC Philharmonic - Introducing the Red Brick Sessions

BBC R&D - Red Brick Sessions - Venue Explorer Trial

BBC R&D - How the Commonwealth Games is defining the future of broadcasting

BBC R&D - Augmented Video Player

This project is part of the Immersive and Interactive Content section

This project is part of the Visual Computing for production work stream

This project is part of the Audio Research work stream
