What we're doing

At the heart of object-based broadcasting is the idea that a piece of media lives together with its metadata and is manipulated as a single entity. BBC R&D's previous efforts in this area focussed on the user experience for our audience, with projects such as Responsive Radio and Visual Perceptive Media, but we have more recently investigated what new workflows this could enable for production teams, in projects such as Squeezebox.

These initial projects were undertaken as standalone prototypes to assist with research, and the logical next step is to investigate ways we can start delivering these tools in a more sustainable way to real production teams solving real production problems. BBC R&D are designing the Optic Framework (Object-based Production Tools In the Cloud) to allow us to build these new production tools.

To the end user, these tools will appear as web apps in a browser, with the video processing and data kept server-side in the BBC's Cosmos cloud. The Optic Framework aims to deliver re-usable data models to represent this metadata, so that different production tools can share the same underlying data models while presenting different views and interfaces based on the current needs of the end user. Allowing metadata to live with an object throughout makes new, more efficient workflows possible. For example, assigning a script to an object prior to shooting can give more accurate automated speech-to-text analysis, and when building a highlights package, re-using any edit decisions made for the original production can speed up that process.
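The idea of metadata travelling with an object can be sketched in a few lines. This is an illustrative sketch only, not the Optic data model: the `MediaObject` and `EditDecision` names, fields, and the 30-second highlights threshold are all assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class EditDecision:
    """A single cut: which object, and in/out points in seconds."""
    object_id: str
    in_point: float
    out_point: float

@dataclass
class MediaObject:
    """A piece of media bundled with the metadata that travels with it."""
    object_id: str
    media_url: str
    metadata: dict = field(default_factory=dict)

# Attach a script to an object before shooting, so later
# speech-to-text analysis can be checked against it.
clip = MediaObject("clip-001", "https://example.invalid/clip-001.mxf")
clip.metadata["script"] = "INT. STUDIO - DAY ..."

# A highlights tool can reuse the edit decisions made for the
# original production rather than starting from scratch.
original_edit = [EditDecision("clip-001", 12.0, 48.5)]
highlights = [d for d in original_edit if (d.out_point - d.in_point) > 30]
```

Because the script and the edit decisions live alongside the media itself, a downstream tool needs no separate lookup to find them.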

Building on top of the work being done in the R&D IP Studio and Content Analysis Toolkit workstreams, Optic uses the JT-NM data model at its core, and each individual component within it uses the NMOS standards to allow us to develop tools within an open and interoperable framework.
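The JT-NM reference architecture separates abstract content from its concrete encodings: a Source represents the content itself, while each Flow is one encoded representation of that Source. A minimal sketch of that distinction, with identifiers and fields chosen for illustration rather than taken from the specifications:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Source:
    """Abstract content (e.g. 'Camera 1'), independent of any encoding."""
    source_id: str
    label: str

@dataclass(frozen=True)
class Flow:
    """One concrete, encoded representation of a Source."""
    flow_id: str
    source_id: str
    media_type: str  # e.g. "video/raw", "video/h264"

# One Source can have several Flows: the same camera feed at
# different qualities, all sharing the same source_id.
cam1 = Source("src-1", "Camera 1")
flows = [
    Flow("flow-hd", cam1.source_id, "video/raw"),
    Flow("flow-proxy", cam1.source_id, "video/h264"),
]
```

Tools that only care about the content (say, a script-alignment tool) can work in terms of Sources, while rendering components pick whichever Flow suits their needs.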

Microservices and well-defined interfaces are at the core of the Optic framework. In Optic there are three layers, and communication across layers is only possible through a well-defined interface, making the underlying implementations replaceable and interoperable.

Diagram showing the layers of Optic: media defined as objects, production tools which consume that and output an EDL to a rendering layer, and the composed output goes back to the input for other tools
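The layered structure in the diagram can be expressed as a pair of interfaces: production tools consume objects and emit an edit decision list (EDL), the rendering layer composes that EDL into a new object, and the composed object feeds back in as an input for other tools. The interface shapes and class names below are assumptions for the sketch, not Optic's actual APIs.

```python
from dataclasses import dataclass
from typing import List, Protocol

@dataclass
class Cut:
    """One EDL entry: an object and its in/out points in seconds."""
    object_id: str
    in_point: float
    out_point: float

class ProductionTool(Protocol):
    """Middle layer: consumes media objects, emits an EDL."""
    def edit(self, object_ids: List[str]) -> List[Cut]: ...

class Renderer(Protocol):
    """Rendering layer: composes an EDL into a new media object."""
    def render(self, edl: List[Cut]) -> str: ...

class TrimTool:
    def edit(self, object_ids):
        # Trivial tool: keep the first ten seconds of each object.
        return [Cut(oid, 0.0, 10.0) for oid in object_ids]

class StubRenderer:
    def render(self, edl):
        # Returns the id of the newly composed object.
        return "composed-" + "-".join(c.object_id for c in edl)

tool, renderer = TrimTool(), StubRenderer()
composed_id = renderer.render(tool.edit(["clip-1", "clip-2"]))
# The composed output goes back to the object layer as a new input,
# so another tool can consume composed_id just like any source clip.
```

Because each layer only sees the interface, a different tool or renderer can be swapped in without touching the others.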

Why it matters

We believe that the future of production tooling is based around a larger number of smaller, more specialised tools deployed into a cloud, rather than monolithic software suites. Picking and choosing from a set of smaller, interoperable tools makes it easier both to introduce new production workflow innovations and to customise a workflow to the needs of a particular production or broadcaster.

Our goals

In the longer term, BBC R&D will start to develop structured data models to represent production metadata at a level that is meaningful for editors. Once standard data models are in place, deploying these tools into a cloud environment will allow us to capture metadata, both explicitly from a production team and implicitly from decisions made during the standard production process, from the very start of the process all the way to the end. We believe that this metadata, captured end-to-end in a meaningful way, will allow a whole range of new content experiences to be produced by reusing existing metadata in new and exciting contexts.

BBC R&D - Object-Based Media

BBC R&D - IP Studio

BBC R&D - Content Analysis Toolkit

NMOS - Networked Media Open Specifications

BBC R&D - Visual Perceptive Media

BBC R&D - Squeezebox

BBC R&D - Responsive Radio

This project is part of the UX work stream


People & Partners

Project Team