Posted by Ian Wagdin
Every year in April, content producers, engineers and broadcasters from around the world get together in Las Vegas to discuss the latest trends in broadcasting and technology at the National Association of Broadcasters Show - or NAB. Over 90,000 people from 160 countries attended, and BBC R&D were there to take part in the debates and speak to our partners and suppliers. In this post we'll look at trends and how our work complements that of the wider industry, and in subsequent posts we'll look at how our work has helped shape the show floor and what we may see in years to come.
I have been lucky enough to attend NAB several times, and each year there seems to be a central piece of technology that captures the headlines. Some, such as 4K UHD and drones, have become normality; others, such as 3D, have fallen by the wayside. Last time I was here, in 2016, 360 cameras and VR headsets were everywhere, and while they have a reduced presence this year, the top spot on the podium was firmly held by eSports.
Virtualcasting is Here
American TV is dominated by sports. Coverage of its many forms, from college leagues to the Super Bowl, drives a huge amount of revenue for the rights holders, and this in turn means that a huge amount of investment goes into covering every field goal or home run.
In recent years we have seen the rise of computer games, and while most people think of them as a solitary pastime, they now command a huge audience both online and on traditional TV channels, with remarkable levels of participation and interaction from those audiences. So it comes as no surprise to find booth space, large parts of the show floor and conference time dedicated to the coverage of the gladiatorial battles of athletes in virtual environments.
Computer games and broadcasting are coming together in some interesting ways, merging the real and virtual worlds from both a technical and an editorial point of view.
Technically, we are seeing dedicated workflows, media management and archive solutions being developed. Arenas are being built to house major tournaments; one of the newest and biggest is in the Luxor Hotel on the Las Vegas strip. It has been kitted out with facilities any broadcaster would be familiar with, driving the venue audio, screens and graphics as well as the 'broadcast' output - with the added complication that the audience in the venue can join in, take an active part and even win the event.
Editorially, we are learning how to cover events in a virtual environment. In traditional sports we know where the action is likely to happen, so we position cameras to 'follow the ball', and directors know the grammar of the story: when to go to a wide shot, close-up or graphic.
In eSports there are no wides or close-ups; the camera is set to a permanent 'focal length', and virtual camera operators roam the field of play looking for action to offer to the director, rather than the director calling the shots. There is also no 'line', and cameras can be anywhere in the 3D space surrounding the action.
There are new rights models, with software developers owning the lion's share, and licensing and revenue models are being developed too.
There is also a note of caution for broadcasters. They don't own the format, and it does not fit into a neat time-bounded slot of 90 minutes with 15 minutes at half time for advertising. Games can go on for minutes or days, and the audience knows the content inside out. This is test match cricket, not Twenty20.
So with the growth of technical platforms to support these events, we need to look at new ways to tell stories in virtual environments, and with render and display technology getting better year on year we can expect new formats and genres.
Broadcasters need no longer think of themselves as just radio, TV or online; there is growth and room for a fourth pillar: virtualcasters.
Interoperability - More Important Than Ever
IP migration is moving from the innovation zones to the exhibitor booths. The benefits are well understood and some form of IP production tool can be found on most booths.
On the surface there appear to be two competing systems: the ST 2110 family of standards and NDI. Each has certain advantages, and one of the big stories on the show floor this year was the acquisition of NewTek, who are behind a lot of the NDI workflows, by Vizrt, who have been a proponent of ST 2110.
The reality is that, just as we don’t have a single camera for all types of production, we are unlikely to have a single IP solution. What we need is a way for the two systems to interoperate and pass content between them.
A broadcaster may use ST 2110 in their studio environment but NDI in their live news production workflows, where more heavily compressed signals are important to deal with less-than-optimal connectivity.
So how will this work? The answer lies in the Networked Media Open Specifications, or NMOS. Until now we have seen most of this work in the ST 2110 space, but this does not mean that the various protocols cannot be applied to any transport system. In an ideal world we should be able to use both systems, with an NDI source appearing on an ST 2110 matrix in the same way an outside source would appear on an SDI router.
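To make the idea concrete: NMOS (specifically the IS-04 Query API) describes each sender as a resource with a `transport` URN, so a control system can list sources from different transports side by side. The sketch below filters a sample of sender resources by transport; the sample data and labels are invented for illustration, and the NDI transport URN is an assumption rather than something this article confirms.

```python
# Sketch: filtering sender resources, shaped like NMOS IS-04 Query API
# results, by their transport URN. The sample data is illustrative only;
# the RTP URNs follow the AMWA register, and the NDI URN here is assumed.

def senders_by_transport(senders, transport_prefix):
    """Return senders whose 'transport' field starts with the given URN prefix."""
    return [s for s in senders if s.get("transport", "").startswith(transport_prefix)]

sample_senders = [
    {"id": "a1", "label": "Studio Cam 1", "transport": "urn:x-nmos:transport:rtp.mcast"},
    {"id": "b2", "label": "Field Laptop", "transport": "urn:x-nmos:transport:ndi"},
    {"id": "c3", "label": "Studio Cam 2", "transport": "urn:x-nmos:transport:rtp.mcast"},
]

rtp_senders = senders_by_transport(sample_senders, "urn:x-nmos:transport:rtp")
ndi_senders = senders_by_transport(sample_senders, "urn:x-nmos:transport:ndi")

print([s["label"] for s in rtp_senders])  # ['Studio Cam 1', 'Studio Cam 2']
print([s["label"] for s in ndi_senders])  # ['Field Laptop']
```

A control panel built on a query like this could present ST 2110 and NDI sources in a single list, which is exactly the kind of interoperability the two camps converging on NMOS would enable.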
Convergence of the two systems is a welcome development and means that the right tool will be available to content producers, whatever their requirements or budget.
5G: Everyone is Talking About It But No-one Can Find It
5G was a major discussion point with lots of conference sessions on the subject, but while everyone recognises it will play a major part in the future of media creation and consumption it was not as prominent on the show floor as I had expected.
On distribution, the jury is out on how 5G will be deployed. Will content be delivered via existing OTT systems, or will some of the new broadcasting modes already in the standards be deployed? The answer is key to our capability to deliver content. If we treat 5G as an OTT model, then every device that connects to a stream of media needs a dedicated connection. In broadcast we are used to sending a single signal to a transmitter and having thousands of devices receive that content.
The one-to-one method works well for non-live content, when individuals consume media on demand, but the model can easily break during periods of peak demand for live content, especially at today's higher resolutions. We need the capability to send a single stream of media and for users to subscribe to that stream from their device during live events such as the World Cup, the Olympics, music festivals or even breaking news coverage.
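The "subscribe to a single stream" model described above is how IP multicast already works on managed networks: the sender emits one stream, and each receiver joins the group to start getting it. A minimal sketch of the receiver side, using Python's standard socket library; the group address and port are arbitrary examples, not a real broadcast service.

```python
import socket
import struct

# Example multicast group in the administratively scoped (239.0.0.0/8) range
# and an example port - both are placeholders, not a real service.
MCAST_GROUP = "239.1.2.3"
MCAST_PORT = 5004

def membership_request(group, interface="0.0.0.0"):
    """Build the 8-byte ip_mreq structure used with IP_ADD_MEMBERSHIP."""
    return struct.pack("4s4s", socket.inet_aton(group), socket.inet_aton(interface))

def open_receiver(group, port):
    """Open a UDP socket subscribed to a multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # Joining the group is the "subscribe" step: the sender still emits
    # only one stream, however many receivers join.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    membership_request(group))
    return sock
```

Whether 5G exposes its broadcast/multicast modes in anything like this form is exactly the open question; the sketch only illustrates why one-to-many delivery scales where per-viewer unicast does not.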
The technology to do this is starting to appear on the show floor and as the technologies develop I expect to see a growth in this area.