This is a recent body of work reviewing the problems of live subtitling and how they can be overcome, both for live television and for subsequent viewing on iPlayer.

What we're doing

We have been looking into the issues that affect the quality of live subtitles for our audience.

We have examined ways in which we might use language models trained on individual programme topics to improve the performance of speech-to-text engines and to detect errors in existing subtitles. We have had some early success modelling weather-forecast subtitles, which suggests there may be value in this approach, but it will require a great deal more work.
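One way a topic-specific language model can flag likely subtitle errors is by scoring each line and marking those the model finds improbable. The sketch below is purely illustrative, not our production system: it trains a small add-one-smoothed bigram model on a toy weather-forecast corpus (the corpus, function names and threshold are all assumptions for the example) and computes per-word perplexity, so that a plausible line scores lower than a likely misrecognition.

```python
import math
from collections import Counter

def train_bigram_lm(corpus_sentences):
    """Train an add-one-smoothed bigram language model on topic text."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus_sentences:
        tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams, len(unigrams)

def perplexity(sentence, model):
    """Per-word perplexity of a subtitle line under the topic model."""
    unigrams, bigrams, vocab_size = model
    tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
    log_prob = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        # add-one smoothing so unseen words/bigrams get a small probability
        p = (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab_size)
        log_prob += math.log(p)
    return math.exp(-log_prob / (len(tokens) - 1))

# Toy weather-forecast corpus, for illustration only
corpus = [
    "rain will spread east across the country",
    "a band of rain will move in from the west",
    "temperatures will reach twenty degrees in the south",
]
model = train_bigram_lm(corpus)
# a plausible line scores lower than a likely misrecognition
print(perplexity("rain will move east", model))
print(perplexity("reign will moved yeast", model))
```

Lines whose perplexity exceeds some threshold could then be flagged for review; a real system would use far larger topic corpora and a stronger model.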

We have carried out a ground-breaking study into the relative impact of subtitle delay and subtitle accuracy. This work required the development of new test methodologies based on industry standards for measuring audio quality. A user study was carried out in December 2012 with a broad sample of people who regularly use subtitles when watching television. The results were presented at IBC2013 in September.

More recently we have been exploring ways to take the live broadcast subtitles and carry out automatic post-processing to remove the original delay and improve the formatting. Early results are promising and we are in the process of talking to the iPlayer team about the potential for this work. We are also looking at how live subtitles could be realigned and reformatted during streaming.
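The core of removing the delay is matching the words of each broadcast subtitle cue against a time-stamped transcript of the programme audio (for example from a speech-to-text pass) and retiming each cue to when its words were actually spoken. The following is a minimal sketch of that idea, not the system described above: the data shapes, function name and times are assumptions, and it uses Python's standard-library sequence matcher for the word alignment.

```python
from difflib import SequenceMatcher

def realign_subtitles(subtitle_cues, asr_words):
    """
    subtitle_cues: list of (broadcast_time_secs, text) as transmitted live
    asr_words: list of (word, start_time_secs) from a speech-to-text pass
               over the programme audio
    Returns cues retimed to when their words were actually spoken.
    """
    # Flatten subtitle text into (cue_index, word) tokens
    sub_tokens = []
    for idx, (_, text) in enumerate(subtitle_cues):
        for word in text.lower().split():
            sub_tokens.append((idx, word))
    asr_tokens = [w.lower() for w, _ in asr_words]

    # Align subtitle words against recognised words (tolerates mismatches)
    matcher = SequenceMatcher(
        None, [w for _, w in sub_tokens], asr_tokens, autojunk=False
    )
    new_times = {}
    for a, b, size in matcher.get_matching_blocks():
        for k in range(size):
            cue_idx = sub_tokens[a + k][0]
            spoken = asr_words[b + k][1]
            # earliest spoken time of any matched word in the cue
            new_times[cue_idx] = min(new_times.get(cue_idx, spoken), spoken)

    return [
        (new_times.get(i, t), text)  # fall back to broadcast time if unmatched
        for i, (t, text) in enumerate(subtitle_cues)
    ]

# Illustrative data: live cues arrived several seconds after the speech
cues = [(12.0, "Good evening and welcome"), (18.5, "to the news at six")]
asr = [("good", 6.1), ("evening", 6.5), ("and", 7.0), ("welcome", 7.2),
       ("to", 7.8), ("the", 8.0), ("news", 8.2), ("at", 8.5), ("six", 8.7)]
print(realign_subtitles(cues, asr))
```

In this toy example the cues shift back from 12.0 s and 18.5 s to 6.1 s and 7.8 s, roughly when the words were spoken; a reformatting stage could then re-break the retimed text into well-formed blocks.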

Over the coming months we will also be carrying out research with the aim of developing guidelines for the display of subtitles on smaller devices such as tablets and mobile phones.


Why it matters

At least 7 million people use subtitles regularly, mostly for reasons other than hearing difficulties, so this is a large audience. While the quality of subtitling for pre-recorded programmes is very good, subtitling for live programmes faces problems of accuracy and delay.

This video from See Hear explains how live subtitles are made.

The delay in the arrival of the subtitles is a particular problem for viewers with hearing difficulties, as they watch with the sound on and use the text to supplement their understanding. They will often turn the subtitles off if the text arrives too late, because the mismatch with the audio is confusing.

For people watching without sound the delay is less of a problem, but because the subtitles are their only source of information, their accuracy matters most.

Our goals

We are aiming to contribute to improvements in subtitling quality for both broadcast and streamed content over the coming years.

How it works

We are using speech recognition and language modelling tools to look at processes that can be used to realign and reformat subtitles for later streaming. We are also carrying out user research to measure the impact of various issues on the perceived quality of subtitles.
We are in the process of publishing papers on our work and are talking to colleagues in BBC Future Media about using our approach in iPlayer and other web video.

We demonstrated some of our work at IBC2013, 12th to 17th September in Amsterdam.

An R&D White Paper based on the IBC paper is now online.
