BBC R&D

Posted by Chris Northwood on

Between 21st June and 1st July, teams from BBC Research & Development will be working with partners from within the BBC and Culture UK at the Great Exhibition of the North at the Sage in Gateshead to provide technology for live-streamed performances from a number of artists. At first glance, live streaming an event might not seem the most innovative thing: after all, the BBC has been covering events, including streaming them online, for years. For GET North, however, we'll be using R&D's Trial Platform to produce this content.

At BBC R&D we’ve been asking ourselves what the next-generation technical platform to support broadcasting will look like. Through our IP Studio project, we’ve been looking at how we can use Internet technologies to move broadcasting infrastructures from bespoke, SDI-based systems to ones based on streaming bits over IP networks. Going beyond this, we’ve looked at how we manage broadcasting infrastructure within a completely IT-based system, and have built our prototype IP Studio software to test the concepts that feed into the NMOS open specifications being published via AMWA’s Networked Media Incubator.

We think that moving broadcasting to software allows us to radically change the way production workflows look, allowing for greater flexibility and more efficient ways of working, and eventually enabling object-based media that delivers new experiences to our audiences. By providing a platform as a set of APIs and capabilities, we can build new tools and workflows on top of it. We’ve built many APIs, capabilities and tools, which we have previously used in trials in partnership with BBC teams as part of our living lab at the Edinburgh Festivals, but each of those was a bespoke deployment for that event. What we have been building since then is an instantiation of this platform, with a supported set of capabilities and tools for low-cost remote live production. This incorporates SOMA and capabilities of IP Studio, including IS-04 and dynamic media composition. I’ve previously written about how we intend to use DevOps techniques to build a site reliability engineering team to build and support this platform, and our trial at GET North is the first production use of this platform with this automation.

What we have built is a set of bare-metal machines running Ubuntu Linux which can be automatically provisioned and configured using the industry-standard tools PXE boot and Ansible. This sits on top of a high-performance networking layer that we can also control using our Ansible automation suite. In future, we aim to extend this with cloud computing systems, and our Cloud-Fit Production project is investigating how best to utilise the cloud in production environments.
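To give a flavour of that style of automation, here is a minimal, hypothetical sketch of the kind of Ansible play that could configure one of the bare-metal machines after it has been PXE-booted into Ubuntu. The host group, package names and file paths are illustrative assumptions, not details of the real automation suite:

```yaml
# Hypothetical sketch only -- host group, packages and paths are invented.
- hosts: trial_platform_nodes
  become: true
  tasks:
    - name: Ensure base packages are present
      apt:
        name:
          - chrony        # clock synchronisation matters for media timing
          - nginx
        state: present

    - name: Deploy node configuration from a template
      template:
        src: node-config.j2
        dest: /etc/trial-platform/node.conf
```

Because the play is declarative, re-running it against a freshly imaged machine (or a whole rack of them) converges each host to the same known state, which is what makes unattended, repeatable provisioning practical.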

Some of these machines are based in our MediaCityUK datacentre, but a number are also loaded into portable flight cases which we can take to the remote production site. SOMA is designed to work with unoperated cameras: static, locked-off UHD cameras provide wide shots, and crops can be taken out of these shots to create virtual cameras, giving the SOMA operator more choice than the locked-off cameras alone would. The output of the system is ultimately 1080p HD footage, so no loss in quality is introduced by using crops.
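The arithmetic behind that last point is worth spelling out: a 1920×1080 region cut from a 3840×2160 wide shot is already at full output resolution, so rendering it at 1080p needs no upscaling. The sketch below illustrates the idea with a hypothetical crop helper; the names are ours, not from the actual SOMA or IP Studio code:

```python
# Illustrative sketch of the virtual-camera crop idea (names are hypothetical).
UHD_W, UHD_H = 3840, 2160   # locked-off UHD wide shot
OUT_W, OUT_H = 1920, 1080   # 1080p programme output

def virtual_camera_crop(centre_x, centre_y):
    """Return the (x, y, w, h) crop rectangle for a virtual camera
    centred as close to (centre_x, centre_y) as the frame allows."""
    # Clamp so the crop never leaves the UHD frame.
    x = min(max(centre_x - OUT_W // 2, 0), UHD_W - OUT_W)
    y = min(max(centre_y - OUT_H // 2, 0), UHD_H - OUT_H)
    return (x, y, OUT_W, OUT_H)
```

Pointing the virtual camera at the centre of the frame yields the crop `(960, 540, 1920, 1080)`, and pushing it towards a corner simply clamps the rectangle to the frame edge.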

An image showing the SOMA system in use on a laptop.

For our trials at Edinburgh, we used dark fibre connecting back to R&D’s own network at 100 Gbit/s, which allowed us to experiment with moving raw video around a remote production site, but this approach is not available at a reasonable cost at every site. With the Trial Platform, we want to be able to use commercially available fibre Internet connections, where providers offering up to 1 Gbit/s are becoming increasingly affordable. As a result, the machines we take on site record the content, and only low-resolution proxies are streamed at low latency to the SOMA interface. The machines also store a higher-quality (but still compressed) version at full resolution. The high-quality output is then rendered on machines at our MediaCityUK datacentre, with the render delayed by a number of seconds. The render only needs to fetch the content from the store that is currently being used in the rendered output, and the delay of a few seconds gives the render time to warm up a buffer: by looking ahead at the edit decisions that have already been made, it can smoothly switch to the remote stream. This allows us to run events from places with relatively low Internet speeds, but still with UHD broadcast-quality content.
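The look-ahead step can be sketched very simply: because the renderer runs a few seconds behind live, every edit decision between its current position and "now" is already known, so it can start fetching the full-resolution media for those sources before they are needed. This is an illustrative model, not the real IP Studio renderer, and all names here are hypothetical:

```python
# Hypothetical sketch: a render delayed by a few seconds looks ahead at
# edit decisions and warms up a buffer before each switch.
RENDER_DELAY = 5.0  # seconds the high-quality render runs behind live

def sources_to_prefetch(edit_decisions, render_time):
    """Given (timestamp, source) edit decisions and the delayed render's
    current position, return the sources appearing between the render
    position and live time -- i.e. those to start fetching now."""
    live_time = render_time + RENDER_DELAY
    return sorted({src for t, src in edit_decisions
                   if render_time <= t <= live_time})
```

With a five-second delay, a cut to a virtual camera made "live" gives the renderer up to five seconds to pull that camera's full-resolution segments from the on-site store before the cut appears in the output.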

Once rigged, this system can be left unattended, and a remote operator can use the SOMA interface by connecting over the public Internet, meaning they can be based at a BBC site or even work from home. The bandwidth requirements into the SOMA interface are relatively low, and a home wi-fi connection is all that is needed to control the system. Thanks to the automation we have in place, our support engineers can also work remotely, limiting the number of staff who need to be on site.

You will be able to watch the opening night on the BBC Arts live page, with the following nights on BBC Taster. My R&D colleague Jasmine Cox will also be at GET North as a mentor for artists in residence, and you can find more information on the BBC’s full coverage of GET North.

R&D’s presence at the Great Exhibition of the North is part of the Culture UK initiative, which partners the BBC with the arts and cultural sector all over the UK, enabling them to take advantage of R&D’s technology on the Trial Platform, including low-cost live streaming.


BBC R&D - High Speed Networking: Open Sourcing our Kernel Bypass Work

BBC R&D - Beyond Streams and Files - Storing Frames in the Cloud

BBC R&D - IP Studio

BBC R&D - IP Studio: Lightweight Live

BBC R&D - IP Studio: 2017 in Review - 2016 in Review

BBC R&D - IP Studio Update: Partners and Video Production in the Cloud

IBC 365 - Production and post prepare for next phase of cloud-fit technology

BBC R&D - Running an IP Studio

BBC R&D - Building a Live Television Video Mixing Application for the Browser

BBC R&D - Nearly Live Production

BBC R&D - Discovery and Registration in IP Studio

BBC R&D - Media Synchronisation in the IP Studio

AMWA - Advanced Media Workflow Association

BBC R&D - Industry Workshop on Professional Networked Media

NMOS - Networked Media Open Specifications

BBC R&D - The IP Studio

BBC R&D - IP Studio at the UK Network Operators Forum

BBC R&D - Covering the Glasgow 2014 Commonwealth Games using IP Studio

BBC R&D - Investigating the IP future for BBC Northern Ireland