Archives for September 2012

RadioDNS at the IBC

George Wright | 15:37 UK time, Wednesday, 19 September 2012

RadioDNS in action


Last week, BBC R&D and others showed our recent work on RadioDNS at the International Broadcasting Convention in Amsterdam. The IBC is one of the broadcast industry's most important trade shows and conferences, and the BBC regularly demonstrates new and exciting developments there.

Read the rest of this entry

An Amazing Summer

Hannah Fraser | 18:37 UK time, Tuesday, 18 September 2012

Hi, I’m Hannah Fraser, one of BBC R&D’s Research Engineers, and for me, like many of my colleagues here in R&D and across the wider BBC, this has been an extraordinary summer. We often have the opportunity to work with partners from elsewhere in the BBC and the wider technology industry, but this has been a particularly interesting few months - allow me to explain…

Panoramic Camera


Read the rest of this entry

IRFS Weeknotes #123

Andrew Nicolaou | 16:47 UK time, Monday, 17 September 2012

Week 123 was a busy one in the team, with work wrapping up on our two summer projects (the ABC-IP World Service prototype and FI-CONTENT TV authentication) and new projects starting up. I'm also pleased to announce the return of our weekly segment about people in the team, called Chris.

Read the rest of this entry

Search Engine Optimisation: Knowledge Graphs, Schema.org, Instant Answers, Siri, Provenance and the Economy of Attribution

Michael Smethurst | 10:46 UK time, Monday, 17 September 2012

The BBC Internet blog recently ran a series of posts on search engine optimisation at the BBC:

  1. Duncan blogged about insights to be found in search logs, the benefits of well written meta descriptions and the importance of honesty and trust
  2. Oli blogged about the /food site rebuild and the importance of a redirect strategy
  3. Martin tackled copywriting and particularly headline writing for the BBC News site

Over in R&D our job is to experiment with possible futures, so this post covers some work we've been doing to help describe TV and radio in schema.org, along with some thoughts on how search (and optimisation for search) works now and how that might be about to change. So, post number 4 in a series of 3.

Read the rest of this entry

IRFS Weeknotes #122

Barbara Zambrini | 17:15 UK time, Wednesday, 12 September 2012

This week the office has seen lots of activity as our two main projects head towards the finish line, with demos at big events like IBC and NEM. In parallel, other projects are shaping up - but more about those later in this post.

Read the rest of this entry

The Challenges of Adaptive Streaming

Rosie Campbell | 11:30 UK time, Wednesday, 12 September 2012

There are few things that are universally hated (always a fun start to a blog post) but I would be willing to bet that the experience of trying to watch internet videos on a temperamental network connection is one of them. I wonder how many hours I've wasted watching that little spinning buffer wheel go round and round and round…

Well, the good news is that the Connected TV team at BBC R&D is working towards making those ‘insufficient bandwidth’ messages a thing of the past. One technology that is helping us get there is adaptive streaming.

Adaptive streaming works by encoding the same media file at a number of different bit-rates, which produces multiple ‘representations’ of the content, each at different qualities. As the quality of the internet connection varies, the streaming client can switch between the different representations to provide a smooth viewing experience. When the connection is good, a high bit-rate representation will be requested, resulting in a good quality picture. If the connection worsens, the client will request a lower bit-rate, resulting in a slight decrease in picture quality but a better overall experience since it avoids freezing and rebuffering.
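The core idea above can be sketched in a few lines of Python. This is a minimal illustration, not the BBC's actual client logic: the bit-rate ladder and the safety margin are invented for the example.

```python
# Hypothetical bit-rate "ladder" of available representations, in kbps,
# from lowest to highest quality. Real encoding profiles will differ.
REPRESENTATIONS_KBPS = [400, 800, 1500, 3000, 5000]

def choose_representation(measured_kbps, headroom=0.8):
    """Pick the highest bit-rate that fits within a safety margin of the
    measured throughput; fall back to the lowest representation otherwise.
    The 0.8 headroom factor is an illustrative choice, leaving slack for
    throughput estimation error."""
    budget = measured_kbps * headroom
    candidates = [r for r in REPRESENTATIONS_KBPS if r <= budget]
    return max(candidates) if candidates else REPRESENTATIONS_KBPS[0]
```

So a connection measured at 2000 kbps would, under these assumptions, be served the 1500 kbps representation rather than risk the 3000 kbps one.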

Adaptive bit-rate streaming. Source: Wikipedia user Daseddon

The BBC is already using adaptive streaming (e.g. for Wimbledon and the Olympics), but there is still work to be done to understand how the technology can be optimised for different network environments. This is where I come in - for my second trainee project I am investigating the characteristics of residential broadband networks, and looking into how we can produce and test different algorithms that decide which representation to request based on the network environment. It might sound trivial - surely you just check the download speed and request the maximum quality representation the connection can cope with at that time?!

Alas, if it were only that simple. As it turns out, both determining the download speed and deciding which quality to request are more complicated than you might imagine…

Networks can suffer from short-term fluctuations; in particular, Wi-Fi is notorious for brief losses of connection (blame the microwave!). Although the connection will quickly recover, if the measurement happens to take place during the fluctuation it can give an inaccurately low value for the available bandwidth and, if the algorithm doesn’t account for this, trigger an unnecessary shift to a lower bit-rate.
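One common way to discount these transient dips - an assumption on my part here, not necessarily the approach our algorithms will take - is to smooth the throughput samples with an exponentially weighted moving average, so a single low reading during a fluctuation cannot drag the estimate down on its own:

```python
class SmoothedBandwidth:
    """Exponentially weighted moving average of throughput samples.
    A small alpha means new samples only nudge the estimate, damping
    the effect of one-off Wi-Fi dropouts."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha      # weight given to the newest sample
        self.estimate = None    # smoothed estimate in kbps

    def update(self, sample_kbps):
        if self.estimate is None:
            self.estimate = float(sample_kbps)
        else:
            self.estimate = (self.alpha * sample_kbps
                             + (1 - self.alpha) * self.estimate)
        return self.estimate
```

With alpha = 0.2, a steady 2000 kbps connection that briefly measures 200 kbps would still report an estimate of 1640 kbps, likely avoiding an unnecessary downshift.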

The behaviour difference between wired and wireless networks also causes more fundamental problems. TCP, the protocol responsible for reliable data transmission over the internet, was designed with wired networks in mind. In wired networks, packet loss usually indicates congestion on the network, so TCP sensibly decreases the amount it is sending in order to reduce the network traffic. Unfortunately, TCP behaves exactly the same on wireless networks, where packet loss is far more likely to be just a random occurrence and no traffic reduction is necessary.

Bearing all this in mind, the streaming client needs to decide which representation to choose. Should it aim to maximise quality or minimise bit-rate changes? If the level of quality keeps jumping around all over the place, it might be more irritating than watching something at a slightly lower but more consistent quality.

The size of the buffer may influence the choice. It might be a good idea to download some video ahead of time and store it in a buffer. If there is a lot of data in the buffer, you can risk waiting out short drops in bandwidth and maintaining a higher quality, as the video in the buffer will see you through. However, larger buffers mean slow start-up times and slow reaction to controls like skipping, as you have to wait for the buffer to fill. You don’t want to be flicking through the channels and have to wait four seconds or so for anything to appear each time you change! Another disadvantage is that the more you buffer, the further you are behind the ‘live’ stream - and you really don’t want to be in the frustrating situation of hearing your neighbours cheer for a goal you know you’ll only see in a few seconds’ time!
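The buffer-versus-quality trade-off can be made concrete with a small sketch. The thresholds and headroom values here are invented for illustration: a nearly empty buffer forces a cautious choice so it can refill quickly, while a fuller buffer lets the client ride out short bandwidth drops at a higher quality.

```python
# Illustrative bit-rate ladder in kbps; real profiles will differ.
LADDER_KBPS = [400, 800, 1500, 3000, 5000]

def buffer_aware_choice(measured_kbps, buffer_seconds,
                        low_water=4.0, high_water=20.0):
    """Choose a representation, letting the buffer level set how
    aggressive the choice is. The water marks are hypothetical."""
    if buffer_seconds < low_water:
        # Risk of stalling: take the lowest bit-rate to refill quickly.
        return LADDER_KBPS[0]
    # With more buffered video, allow a less conservative safety margin.
    headroom = 0.7 if buffer_seconds < high_water else 0.9
    fits = [r for r in LADDER_KBPS if r <= measured_kbps * headroom]
    return max(fits) if fits else LADDER_KBPS[0]
```

Under these assumed thresholds, the same 2000 kbps measurement yields the lowest representation with two seconds of buffer, but the 1500 kbps one with twenty-five seconds in hand.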

Then of course we have the problem of competing users. If there are multiple people watching adaptive video on the same connection, current streaming clients get all confused. Either the quality oscillates wildly as the clients alternately grab bandwidth, or one client ends up hogging all the bandwidth, leaving the other with very poor quality. Ideally, we would like them to share the bandwidth fairly and consistently.

So how can we make sense of all this to try and produce a more effective algorithm? I'm working on creating models of typical residential broadband networks over which we can test different adaptive algorithms. This involves using network simulation software to create topologies based on typical residential network characteristics. We can then simulate running different algorithms over these network topologies and output a range of metrics to allow us to assess their performance. This is likely to involve both statistical data (such as the likelihood of freezing/rebuffering, number of bit-rate shifts and the average quality) and graphical data (comparing how available bandwidth and requested bit-rate vary over time, to see how closely the two match).
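To give a flavour of the statistical side, here is a toy sketch of the kind of summary such a framework might report. The trace format and metric names are assumptions for illustration, not the actual output of our simulation tooling:

```python
def summarise_run(requested_kbps, stalled):
    """Summarise one simulated run.
    requested_kbps: list of the bit-rate chosen in each interval;
    stalled: parallel list of booleans (True = rebuffering occurred)."""
    # Count how many times the client switched representation.
    shifts = sum(1 for a, b in zip(requested_kbps, requested_kbps[1:])
                 if a != b)
    return {
        "mean_kbps": sum(requested_kbps) / len(requested_kbps),
        "bitrate_shifts": shifts,
        "stall_fraction": sum(stalled) / len(stalled),
    }
```

Comparing these numbers across algorithms and topologies - alongside the graphical bandwidth-versus-request plots mentioned above - is what lets us say one algorithm performs better than another.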

This framework will allow us to assess existing algorithms and use the results to inform the development of improved algorithms which can then be refined in a similar way.

You might be wondering why any of this really matters to an organisation whose primary delivery method is through conventional broadcasting. Although currently IPTV is used mainly for accessing on-demand content, its advantages can’t be ignored, and there is an emerging trend that could see delivery over IP applied to a much wider range of services. As well as on-demand viewing, IP delivery enables viewers to watch content on the go on mobile devices, and allows access to a potentially much larger selection of content than can currently be broadcast through conventional means. With over 106 million requests for online BBC Olympic video content, it's clear that the public is embracing IP streaming.

As more and more TVs and set-top boxes come with internet connectivity, it is likely that what you watch from your sofa will be increasingly delivered over IP. And, just like an effective adaptive algorithm, we hope to make the transition as smooth as possible.

BBC R&D Labs

Tristan Ferne | 13:43 UK time, Friday, 7 September 2012

We've created a new section on our website to showcase the latest prototypes and code on the web from our teams at BBC Research & Development - BBC R&D Labs.

Screenshot of R&D Labs

For instance, you could listen to a subtly personalised radio play from Perceptive Media, get some quick programme recommendations with the Sibyl Recommender, try working out what to watch with a group of friends with NoTube's collaborative viewing app, or even set up your own radio bookmarking service using our RadioTAG reference implementation.

The prototypes in Labs are works-in-progress or experiments so they may not be there forever, but keep checking back as we'll be putting more things up there soon.

We're also working on refreshing our entire R&D site and you can help us out by answering a few questions about it.

RadioDNS live on national radio networks

George Wright | 10:20 UK time, Tuesday, 4 September 2012

BBC Radio now includes RadioDNS and RadioVIS technologies, so when listening on a hybrid radio that supports RadioDNS you will see accompanying visuals: basic station logos for most of the national networks, and an enhanced service showing images, live text and more on Radio 1 and 1Xtra.

Read the rest of this entry
