This is a recent set of work reviewing the problems of subtitling across all our platforms and how they can be overcome.

What we're doing

We are examining the issues that affect the quality and availability of subtitles for our audience across all our platforms.

We first looked into ways of using language models tailored to individual programme topics to improve the performance of speech-to-text engines and to detect errors in existing subtitles. We have had some early success modelling weather forecast subtitles, which suggests there may be value in this approach, though other topics appear likely to be less amenable. See White Paper WHP 256: 'Candidate Techniques for Improving Live Subtitle Quality' for more details.
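To illustrate the general idea (this is a minimal sketch, not the actual system described in WHP 256), a topic-specific language model can score each word of a subtitle in context and flag words that the model considers improbable. The corpus, threshold and function names below are illustrative assumptions:

```python
from collections import Counter

def train_bigrams(corpus_lines):
    """Count unigrams and bigrams from in-domain text
    (e.g. past weather forecast scripts)."""
    unigrams, bigrams = Counter(), Counter()
    for line in corpus_lines:
        words = ["<s>"] + line.lower().split()
        unigrams.update(words)
        bigrams.update(zip(words, words[1:]))
    return unigrams, bigrams

def flag_unlikely_words(subtitle, unigrams, bigrams, threshold=0.1):
    """Flag words whose smoothed bigram probability under the
    topic model falls below the threshold."""
    vocab = len(unigrams)
    words = ["<s>"] + subtitle.lower().split()
    flagged = []
    for prev, word in zip(words, words[1:]):
        # add-one smoothing so unseen pairs never score exactly zero
        p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
        if p < threshold:
            flagged.append(word)
    return flagged
```

With a corpus of weather scripts, a plausible subtitle passes cleanly while a recognition error such as "waste" for "west" is flagged; a production system would use a far larger model, but the scoring principle is the same.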

At the request of our Technology, Distribution and Archives, Solution Design team we carried out a ground-breaking study into the relative impact of subtitle delay and subtitle accuracy. This work required the development of new test methodologies based on industry standards for measuring audio quality. A user study was carried out in December 2012 with a broad sample of people who regularly use subtitles when watching television. The results were presented at IBC2013 in September and are available as White Paper WHP 259: 'The Development of a Methodology to Evaluate the Perceived Quality of Live TV Subtitles'.

BBC Audiences have run surveys for us providing background data on how many people use subtitles, how they use them, and what issues they face. More recently we have started to examine iPlayer statistics on subtitle use, as these have the potential to give us insight into subtitle use on a programme-by-programme basis. We have also started building an automatic subtitle monitoring tool to allow us to track long-term trends in issues we can measure, such as position and reading rate, as originally outlined in White Paper WHP 255: 'Measurement of Subtitle Quality: an R&D Perspective'.
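The kind of measurement such a monitoring tool could make can be sketched as follows. The cue format (start, end, text) and the rate threshold here are illustrative assumptions, not details of our tool:

```python
def words_per_minute(cues):
    """Overall subtitle reading rate for a programme.
    `cues` is a list of (start_seconds, end_seconds, text) tuples,
    as might be parsed from a broadcast subtitle file."""
    total_words = sum(len(text.split()) for _, _, text in cues)
    duration = cues[-1][1] - cues[0][0]  # first cue start to last cue end
    return 60.0 * total_words / duration

def flag_fast_cues(cues, max_wpm=200):
    """Return the cues whose own rate exceeds a guideline threshold
    (the 200 wpm default is purely illustrative)."""
    fast = []
    for start, end, text in cues:
        wpm = 60.0 * len(text.split()) / (end - start)
        if wpm > max_wpm:
            fast.append((start, end, text))
    return fast
```

Run over every broadcast day, per-cue figures like these would accumulate into exactly the long-term trend data the monitoring tool is intended to capture.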

Over the past year we have developed a way of matching video clips on our web pages to the same piece of video in our broadcast archive, in order to locate matching subtitles for the web video. Our prototype has focused on the News web pages, where it is able to find matches for around 40% of the video clips. We have applied for a patent for our technique and have written it up as a paper for the NAB 2015 conference.
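Our patented technique is not reproduced here, but the general shape of the problem — locating a short clip's position within a longer archive item by comparing per-frame signatures such as perceptual hashes — can be sketched as follows. The signature values and threshold are illustrative assumptions:

```python
def hamming(a, b):
    """Bit-level distance between two integer frame signatures."""
    return bin(a ^ b).count("1")

def locate_clip(clip_sigs, archive_sigs, max_avg_distance=4):
    """Slide the clip's per-frame signatures along the archive's and
    return the best-matching frame offset, or None if nothing matches
    closely enough (so unrelated footage is not falsely matched)."""
    best_offset, best_score = None, float("inf")
    for offset in range(len(archive_sigs) - len(clip_sigs) + 1):
        score = sum(hamming(c, a) for c, a in
                    zip(clip_sigs, archive_sigs[offset:])) / len(clip_sigs)
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset if best_score <= max_avg_distance else None
```

Once an offset into the archive item is found, the broadcast subtitles for that time span can be retrieved and attached to the web clip.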

During December 2014 we ran a set of user research experiments in collaboration with Mike Crabb, who was on placement from The University of Dundee. This research looked at how subtitles could be presented with a video clip on a web page, at adjustment of subtitle size, and at follow-up work on dynamic subtitles. This work is being written up as a series of papers. The first, "Dynamic Subtitles: the User Experience", was presented at TVX2015 in June, and the second, "Online News Videos: The UX of Subtitle Position", has been accepted for presentation at ASSETS’15 in October, along with a short paper, "The Development of a Framework for Understanding the UX of Subtitles", which will be part of the poster session.

We presented our paper on "Responsive Design for Personalised Subtitles" at the Web for All conference in May. This paper introduces our concept of Responsive Subtitles where subtitles are rendered in the client in a way that takes account of the viewer's personal preferences and the capability of the display.
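As a rough illustration of the concept (our implementation targets web clients, and the sizing constants below are illustrative assumptions rather than values from the paper), a responsive renderer might pick a font size from the viewer's preference and re-wrap the subtitle text to fit the display:

```python
import textwrap

def render_subtitle(text, display_width_px, pref_scale=1.0,
                    base_size_px=24, avg_char_width_ratio=0.55):
    """Choose a font size from the viewer's preference scale, then
    break the text into lines that fit the display width.
    Returns (font_size_px, list_of_lines)."""
    font_size = round(base_size_px * pref_scale)
    # approximate characters per line from an assumed average glyph width
    chars_per_line = max(8, int(display_width_px /
                                (font_size * avg_char_width_ratio)))
    return font_size, textwrap.wrap(text, width=chars_per_line)
```

The same subtitle text thus renders as one long line on a wide television display but re-flows into several shorter lines on a narrow phone screen, or when the viewer asks for larger text.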

In March and April we ran a further series of user research experiments, this time looking at subtitle reading rate, using news stories specially shot at a range of reading rates together with a series of off-air clips. A paper detailing the results of this work has been accepted for publication at IBC2015 in September.

Finally, a second paper giving an overview of our subtitles research has also been accepted for publication at IBC2015 and will be presented in the session on assistive technologies. We will also have a presence on the BBC R&D stand in the IBC Future Zone, featuring our work on Responsive Subtitles.


Why it matters

At least 7 million people use subtitles regularly, mostly for reasons other than hearing difficulties. This is a large audience for subtitles. Whilst the quality of subtitling for pre-recorded programmes is very good, subtitling for live programmes faces problems of accuracy and delay.

This video from See Hear explains how live subtitles are made.

The delay in the arrival of the subtitles is a particular problem for people in our audience with hearing difficulties, as they are watching with the sound on and using the text to supplement their understanding. They will often turn off either the subtitles or the sound to avoid confusion, and many will simply change channel.

For people watching without sound the delay is less of a problem, but because the subtitles are their only source of information, their accuracy matters most.

Our goals

We are aiming to contribute to improvements in the quality and availability of subtitling for broadcast, on-demand, streamed and web content over the coming years.

How it works

We are using speech recognition and language modelling tools to investigate processes for realigning and reformatting subtitles for later streaming. We are also carrying out user research to measure the impact of various issues on the perceived quality of subtitles.
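One way such realignment could work — sketched here as an assumption, not a description of our actual tools — is to match each subtitle block against word timings produced by a speech recogniser run over the programme audio, then re-time the block to span its matched words:

```python
import difflib

def realign_subtitles(subtitle_blocks, asr_words):
    """Re-time subtitle blocks from recogniser word timings.
    `asr_words` is a list of (word, start_s, end_s), with words
    assumed lowercase; each block is re-timed to span its first
    and last word found in the recogniser output."""
    rec_text = [w for w, _, _ in asr_words]
    retimed = []
    for block in subtitle_blocks:
        words = block.lower().split()
        matcher = difflib.SequenceMatcher(None, words, rec_text)
        spans = [m for m in matcher.get_matching_blocks() if m.size > 0]
        if not spans:
            retimed.append((None, None, block))  # no confident match
            continue
        first = asr_words[spans[0].b]
        last = asr_words[spans[-1].b + spans[-1].size - 1]
        retimed.append((first[1], last[2], block))
    return retimed
```

A real pipeline would align blocks in order and handle recognition errors more robustly, but this shows how recogniser timings can remove the delay baked into live-originated subtitles before a programme is re-streamed.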

