Advances in real-time rendering and delivery technologies pioneered in the games and visual effects industries offer broadcasters the opportunity to deliver completely personal experiences.
Why Does it Matter?
Advances in the technologies used in visual effects, games and virtual worlds give audiences experiences that are personal, responsive, interactive and immersive.
Modern audiences enjoy media on an ever-increasing range of devices. Every year these devices gain new capabilities and can now play object-based, virtual or augmented reality experiences with ease.
The Internet is also becoming the primary means for distributing the content that people experience. Audiences expect their media to be delivered wherever they are. It can be tailored to the moment they are in and the device they are using.
In broadcasting, we are used to delivering one fixed, linear experience to many people at once. Yet there is an argument that giving more freedom and flexibility to media would allow for more personal experiences. BBC Research & Development have been pursuing this as a research priority: object-based media. The hard question is how we do this at scale, for very large audiences.
Can we transform broadcasting to take advantage of new technologies, new media forms and growing audience expectations?
What is the Challenge?
Technology is transforming the craft and tools used to make media and the devices we use to consume it. The experiences people have are no longer just audio-video files delivered to a playback appliance. Delivering even “simple” linear media uses combinations of several media objects, execution logic and intelligence to stream to, and adapt to, a wide variety of devices, capabilities and user preferences. We anticipate that, before long, all media experiences will be created and delivered more like packages of interactive software than traditional file-based broadcast media.
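As a rough illustration of that shift, a media “package” might bundle several objects together with a little selection logic, so the player can adapt what it plays to the device and the viewer. Everything below – the field names, bitrates and selection rule – is an invented sketch for illustration, not an actual BBC format:

```python
# Hypothetical media "package": several objects plus simple execution
# logic, rather than one fixed audio-video file. All names are invented.
package = {
    "title": "Example Episode",
    "objects": [
        {"id": "video_hd", "type": "video", "min_bandwidth_kbps": 5000},
        {"id": "video_sd", "type": "video", "min_bandwidth_kbps": 1000},
        {"id": "audio_described", "type": "audio", "min_bandwidth_kbps": 128},
        {"id": "subtitles_en", "type": "text", "min_bandwidth_kbps": 1},
    ],
}

def select_objects(package, bandwidth_kbps, prefers_audio_description):
    """Pick the media objects a device can play, given capability and preference."""
    chosen = []
    videos = [o for o in package["objects"]
              if o["type"] == "video" and o["min_bandwidth_kbps"] <= bandwidth_kbps]
    if videos:
        # Take the highest quality the connection can sustain.
        chosen.append(max(videos, key=lambda o: o["min_bandwidth_kbps"]))
    for obj in package["objects"]:
        if obj["type"] == "audio" and prefers_audio_description:
            chosen.append(obj)
        if obj["type"] == "text":
            chosen.append(obj)
    return [o["id"] for o in chosen]

# A mid-bandwidth device with audio description enabled gets the SD video,
# the described audio track and the subtitles.
print(select_objects(package, bandwidth_kbps=2000, prefers_audio_description=True))
```

The point of the sketch is that the same package serves every device: the selection logic, not the broadcaster, decides what each person receives.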
The BBC has previously focused on audio and video distribution services over the Internet. New technologies now offer the opportunity for these services to evolve and cater to new audience behaviours:

- deliver content in even higher quality and fidelity,
- play other content types, including object-based experiences,
- support a growing range of devices,
- tailor and personalise the content appropriately to the device and the user.
All of this should be delivered from a personalised playback interface and, ideally, created as a single software codebase.
How might we achieve this? Is it possible to still broadcast to millions, and yet tailor that experience to be unique to individuals? Could we move from a one-to-many, broadcast style of media to a many-to-many, Twitch or YouTube style of media? Can we deliver many content types and formats to any device, new or old? Is this achievable in a single ecosystem that scales without increasing cost? Can we embrace the same trend for streaming interactive content as Google’s Stadia or Microsoft’s Project xCloud? Are there new approaches beyond interactive video?
We are taking these questions and more into account in our work.
How Does it Work?
We are taking inspiration from media that is already software-based - immersive and interactive entertainment such as video games - as well as modern approaches to software distribution over the Internet. The approaches used to code and deliver these experiences are constantly developing and they provide some of the solutions to the challenges of at-scale responsive and personal media.
For example, consider recent developments in remote game streaming. To reach more customers with high-quality game experiences, providers such as Google are looking to leverage high-speed internet and the video streaming backbone that delivers services like Netflix. The aim is to remove the requirement for a powerful console and deliver to people on any device with a web browser. They do this by running the games on a cloud server, outputting the result as video, which is streamed to the much less powerful device. The device in turn sends back data – e.g. from a gamepad – to the game. Similar technology could be used to deliver interactive scenes through a future version of a service like iPlayer and allow viewers on a phone or smart TV to explore BBC content in more depth.
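The round trip described above can be sketched in a few lines. This is a toy simulation, not streaming code: two queues stand in for the network, the “cloud” runs the interactive scene and emits frames, and the thin client only displays frames and sends input events back.

```python
# Toy sketch of remote game streaming: the cloud simulates and renders,
# the client merely displays frames and forwards input.
from queue import Queue

input_events = Queue()   # client -> cloud (e.g. gamepad presses)
video_frames = Queue()   # cloud -> client (stand-in for encoded video)

class CloudGame:
    def __init__(self):
        self.x = 0  # a single piece of game state

    def step(self):
        # Apply any pending input, then "render" the next frame.
        while not input_events.empty():
            event = input_events.get()
            self.x += {"left": -1, "right": 1}.get(event, 0)
        video_frames.put(f"frame(x={self.x})")

game = CloudGame()
for event in ["right", "right", "left"]:
    input_events.put(event)    # client sends input upstream
    game.step()                # cloud simulates and streams a frame
    print(video_frames.get())  # client decodes and displays it
```

Note that all the computing happens on the cloud side; the client's only requirements are a decoder and an input device, which is exactly why any device with a web browser can take part.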
Remote game streaming, in itself, is not enough. We are looking into services that adapt to the computing capability in the user’s devices and supplement this with computing power from remote servers. New techniques from edge-computing and the Internet of Things mean rendering and compositing could be targeted more sensibly: relying on high bandwidth video streaming for interactivity is not a future-fit solution.
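The hybrid idea – use the device's own capability where possible and fall back to remote computing otherwise – amounts to a per-scene decision. The sketch below is a hedged illustration of that decision; the capability fields and thresholds are invented for the example, not measurements from any real service.

```python
# Hypothetical decision logic for hybrid rendering: prefer the device's
# own hardware, fall back to streaming rendered video from a server.
def choose_renderer(device, scene):
    """Return where this scene should be rendered for this device."""
    if (device["gpu_score"] >= scene["min_gpu_score"]
            and device["free_memory_mb"] >= scene["min_memory_mb"]):
        return "local"   # the device can render the scene itself
    if device["bandwidth_kbps"] >= scene["stream_bitrate_kbps"]:
        return "remote"  # render in the cloud and stream video down
    return "degraded"    # neither fits: serve a simpler version

phone = {"gpu_score": 20, "free_memory_mb": 512, "bandwidth_kbps": 8000}
scene = {"min_gpu_score": 50, "min_memory_mb": 1024, "stream_bitrate_kbps": 5000}
print(choose_renderer(phone, scene))  # the phone is too weak, so: remote
```

In practice the decision would also weigh latency and cost, but even this crude rule shows why pure video streaming is not future-fit: capable devices should not pay the bandwidth and latency price of remote rendering when they can do the work themselves.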
On top of this, we also want to reduce the amount of software that has to be written to target all these devices and adapt to users and context responsively – so we are looking at new approaches like WebAssembly to “write once and run everywhere”.
What are we Doing?
We are already researching and developing means for the BBC to author, distribute and play back these new software experiences. We plan to use what we learn to suggest new media standards for how to make, distribute, play and archive them. We are embracing existing workflows and tools to develop a solution that fits with existing production forms.
Our team has been working on a series of technical tests and demonstrators to explore what a new broadcasting system for these experiences must support. We want the new system to be low-cost and simple to develop. It must be climate-friendly and sustainable, easy to extend, and offer scalable distribution of new content types to millions of devices and people.
We have developed a ‘Single Service Player’ that:
- switches seamlessly between video and interactive experiences written in game engines (such as virtual reality scenes) or in our very own StoryKit, used to make BBC Click’s recent interactive episode;
- switches between remotely streamed and locally rendered content according to the capability of the device;
- optimises performance for the target device using the ‘write once, run anywhere’ techniques we have adopted from video game development;
- and plays all your favourite BBC content too.
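The behaviour in the bullet points above boils down to routing each catalogue item to the right playback path. The following is a minimal sketch of that dispatch, with invented names – it is not the player's actual code:

```python
# Hypothetical dispatch inside a single-service player: every item gets
# mapped to a playback path, and interactive items fall back to remote
# streaming when the device cannot render them locally.
def route(item, device_can_render):
    """Map a content item to a playback path."""
    if item["kind"] == "video":
        return "video_pipeline"      # ordinary audio-video playback
    if item["kind"] == "interactive":
        # Game-engine or StoryKit content: render locally if the device
        # is capable, otherwise play a remotely streamed version.
        return "local_engine" if device_can_render else "remote_stream"
    raise ValueError(f"unknown content kind: {item['kind']}")

playlist = [
    {"title": "News at Six", "kind": "video"},
    {"title": "Click interactive episode", "kind": "interactive"},
]
for item in playlist:
    print(item["title"], "->", route(item, device_can_render=False))
```

Because the routing lives in one place, the same player codebase can serve a smart TV that renders locally and a budget phone that streams, which is the seamless switching the bullets describe.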
A version of the player is now in use by other BBC R&D teams to test it on a range of new content and service experiments.
It was used to evaluate the benefits of game streaming in a public trial as part of our ‘Smart Tourism’ project in 2018. The project recreated the ancient Roman Baths in a game engine for an augmented reality experience. Using a 5G mobile phone network testbed in Bath, we ran the rendering jobs in the cloud and streamed the ancient artefacts to devices. Users were able to navigate through the simulation on mobile phones, sending data back to the cloud to remotely draw a 3D historical recreation of the scene based on the phone’s location and orientation.
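The pose round trip in that trial can be pictured as a small message exchange: the phone sends its position and orientation upstream, and the cloud renders a frame for exactly that viewpoint. The message shape and coordinates below are illustrative assumptions, not the trial's actual protocol.

```python
# Sketch of the device -> cloud pose message used to drive remote
# rendering. Field names and values are invented for illustration.
import json

def make_pose_message(lat, lon, heading_deg, pitch_deg):
    """Client side: serialise the phone's current pose."""
    return json.dumps({
        "lat": lat, "lon": lon,
        "heading_deg": heading_deg, "pitch_deg": pitch_deg,
    })

def render_for_pose(message):
    """Cloud side: decode the pose and return an identifier for the frame
    rendered from that viewpoint (a stand-in for real rendering)."""
    pose = json.loads(message)
    return f"frame@({pose['lat']:.4f},{pose['lon']:.4f}) facing {pose['heading_deg']}"

msg = make_pose_message(51.3813, -2.3595, heading_deg=90, pitch_deg=0)
print(render_for_pose(msg))
```

Each new pose triggers a fresh frame, so the historical recreation tracks the phone's movement even though the 3D scene never leaves the cloud.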
We are conducting further tests to determine the limit for acceptable loss in quality in remote game streaming. By integrating remote game streaming into our multi-format player, we expect experiences to play on both high-powered and low-end devices without sacrificing much of their richness. This supports our aim that all our audiences can access our services on whichever device they have, at the best quality possible.
Our next goal is to develop example experiences that adapt to the capabilities of devices by making smart decisions on where to run the code, locally or remotely, and to ensure this approach scales.
This project is part of the Future Experience Technologies section