What we're doing

Around 10% of television viewers in the UK use subtitles on a daily basis. They help many people enjoy television in situations where the sound cannot be turned on, as well as being an access service for people with hearing difficulties or difficulties with language.

For the past three years we have been looking into ways in which the quality and quantity of subtitling can be improved for our audiences. Our work benefits from being part of a public service broadcaster: we have access to the resources of the BBC, including its audience research and programme archives, and with help from production teams we can also create bespoke content for our tests.

Audience surveys have helped us understand how people use subtitles. 90% of people who watch television with subtitles do so with the sound turned on, using subtitles in combination with sound and lip reading to follow the programme. The surveys also help us design our user research to model how people use subtitles at home. We carry out research in a purpose-built lab that replicates a living-room environment. We recruit representative groups of subtitle users to take part, and using opinion scores and structured interviews we build up a detailed understanding of the experience of using subtitles.

This year's IBC conference saw the publication of two papers on subtitle quality.

The first, "The Impact of Subtitle Display Rate on Enjoyment Under Normal Television Viewing Conditions", is based on user research we carried out in March and April 2015. The tests used specially shot news stories, read at a range of different word rates, along with a series of off-air clips. The results show that subtitle users want subtitles to match the word rate of the speech, even when that rate far exceeds current subtitling guidelines. Indeed, subtitle users' ratings of the speed of the news clips closely matched the ratings given by hearing viewers watching without subtitles. This paper was amongst the top eight in the conference and has been published in the journal 'The Best of IET & IBC 2015-16', and James has written a blog post to accompany the paper, "How fast should subtitles be?".
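To make the notion of display rate concrete: it is usually expressed in words per minute, the number of words shown divided by the time they are on screen. The sketch below is purely illustrative (the cue format, figures and function name are our own, not taken from the study) and shows how such a rate can be computed for a clip.

```python
# Illustrative sketch only: compute subtitle display rate in words per minute
# from a list of (start_seconds, end_seconds, text) cues. The cue data and
# function are hypothetical, not taken from the study itself.

def display_rate_wpm(cues):
    """Overall display rate: total words shown divided by total on-screen time."""
    total_words = sum(len(text.split()) for _, _, text in cues)
    total_minutes = sum(end - start for start, end, _ in cues) / 60.0
    return total_words / total_minutes if total_minutes > 0 else 0.0

example_cues = [
    (0.0, 3.2, "Good evening and welcome to the news."),
    (3.2, 7.0, "Heavy rain is expected across much of the country tonight."),
]
print(f"{display_rate_wpm(example_cues):.0f} words per minute")
```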

The second paper, "Understanding the Diverse Needs of Subtitle Users in a Rapidly Evolving Media Landscape" gives an overview of our subtitles research covering the past two years, bringing together work which has been published at academic conferences with developments in our understanding of the viewers' experience of subtitles.

We also had a presence on the BBC R&D stand in the IBC Future Zone, featuring our work on Responsive Subtitles, which demonstrated the potential for subtitles to be formatted into blocks in response to device capabilities and user input. This work was originally presented at the Web for All conference in May 2015 in our paper "Responsive Design for Personalised Subtitles".
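The paper describes the approach in detail; as a rough illustration of the underlying idea, the sketch below reflows a subtitle's text to fit a device-dependent line length. The line-length figures and helper function are assumptions for illustration, not the published implementation.

```python
# Rough illustration of responsive subtitle reflow: break a subtitle's text into
# blocks that fit a device-dependent line length. The line-length values are
# assumptions for illustration, not those used in the published work.
import textwrap

def reflow_subtitle(text, max_chars_per_line):
    """Wrap subtitle text into lines no longer than the device's line length."""
    return textwrap.wrap(text, width=max_chars_per_line)

caption = ("Heavy rain is expected across much of the country tonight, "
           "with flooding possible in low-lying areas.")
for device, width in [("television", 37), ("tablet", 28), ("phone", 18)]:
    print(device)
    for line in reflow_subtitle(caption, width):
        print(" ", line)
```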

In December 2014 we carried out a programme of user research in collaboration with Mike Crabb, who was on placement with BBC R&D from the University of Dundee at the time. This research looked at how subtitles could be presented with a video clip on a web page, at the adjustment of subtitle size, and at follow-up work on dynamic subtitles. The work is being written up as a series of papers. The first, "Dynamic Subtitles: the User Experience", was presented at TVX2015 in June 2015, and the second, "Online News Videos: The UX of Subtitle Position", was presented at ASSETS’15 in October 2015, along with a short paper, "The Development of a Framework for Understanding the UX of Subtitles", which was part of the poster session.

Subtitle availability

Over the past year we have developed a way of matching video clips on our web pages to the same piece of video in our broadcast archive, in order to locate matching subtitles for the web video. Our prototype has focused on the News web pages, where it can find matches for around 40% of the video clips. We have applied for a patent on the technique, and it was written up as a paper for the NAB 2015 conference, "Automatic retrieval of closed captions for web clips from broadcast TV content".
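The paper and patent application describe the actual technique; the sketch below is only a hedged illustration of the general shape of the problem, matching a short clip against a longer archive item by comparing per-second content fingerprints and then reusing the archive subtitles that fall within the matched window. The fingerprint representation and function names are stand-ins, not our method.

```python
# Hedged illustration only, not the patented technique: match a short web clip
# against a longer archive recording by comparing per-second content
# fingerprints, then reuse the archive subtitles inside the matched window.
# A real system would use robust audio or video hashes as fingerprints.

def match_offset(clip_fp, archive_fp):
    """Find the archive offset (in seconds) where the clip fingerprints best match."""
    best_offset, best_score = None, -1
    for offset in range(len(archive_fp) - len(clip_fp) + 1):
        score = sum(a == b for a, b in zip(clip_fp, archive_fp[offset:offset + len(clip_fp)]))
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score / max(len(clip_fp), 1)

def subtitles_for_clip(offset, clip_length, archive_subtitles):
    """Retime archive subtitles that overlap the matched window to the clip's timeline."""
    return [(start - offset, end - offset, text)
            for start, end, text in archive_subtitles
            if start >= offset and end <= offset + clip_length]

# Toy usage: the clip matches the archive two seconds in.
clip_fp = ["c", "d", "e"]
archive_fp = ["a", "b", "c", "d", "e", "f"]
offset, confidence = match_offset(clip_fp, archive_fp)  # offset 2, confidence 1.0
```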

Previous work

When we first started this work we began by looking into ways of using language models for individual programme topics to improve the performance of speech-to-text engines and to detect errors in existing subtitles. We had some early success modelling weather forecast subtitles, which suggests there may be some value in this approach, although other topics appear less likely to succeed. See White Paper WHP 256: "Candidate Techniques for Improving Live Subtitle Quality" for more details.
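WHP 256 sets out the candidate techniques in full; the toy sketch below only illustrates the general idea of a topic-specific language model, using a simple unigram model built from weather-forecast subtitles to flag words that look unlikely for the topic.

```python
# Toy illustration of the general idea, not the techniques evaluated in WHP 256:
# build a simple unigram model from topic-specific subtitles and flag words in
# new subtitles that the topic model considers very unlikely.
from collections import Counter

def build_topic_model(training_subtitles):
    """Estimate word probabilities from a list of subtitle lines for one topic."""
    words = [w.lower() for line in training_subtitles for w in line.split()]
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def flag_unlikely_words(subtitle, model, threshold=1e-4):
    """Return words whose probability under the topic model falls below the threshold."""
    return [w for w in subtitle.lower().split() if model.get(w, 0.0) < threshold]
```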

Then, at the request of our Technology, Distribution and Archives Solution Design team, we carried out a ground-breaking study into the relative impact of subtitle delay and subtitle accuracy. This work required the development of new test methodologies based on industry standards for measuring audio quality. A user study was carried out in December 2012 with a broad sample of people who regularly use subtitles when watching television. The results were presented at IBC 2013 in September and are available as White Paper WHP 259: "The Development of a Methodology to Evaluate the Perceived Quality of Live TV Subtitles". Following on from this work, the BBC and its subtitling partners have been making significant improvements to the live subtitles on news bulletins by using the presenter’s scripts to create the subtitles. This can result in news bulletins with word-for-word subtitles presented without delay and without errors.

BBC Audiences have conducted surveys for us to provide background data on the level of subtitle use, how people use subtitles and the issues they have. More recently we have started to examine iPlayer statistics on subtitle use, as they have the potential to give us insight into the use of subtitles on a programme-by-programme basis. We have also started building an automatic subtitle monitoring tool to allow us to track long-term trends in issues we can measure, such as position and reading rate, as originally outlined in White Paper WHP 255: "Measurement of Subtitle Quality: an R&D Perspective".
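As a simplified illustration of the kind of measurable check such a monitoring tool could make, the sketch below profiles where subtitles sit on screen across a programme; the cue format and region boundaries are our own assumptions, not the tool itself.

```python
# Simplified sketch of one measurable check a monitoring tool could make:
# tracking where subtitles sit on screen over a whole programme. The cue format
# and region boundaries are assumptions for illustration.
from collections import Counter

def position_profile(cues):
    """Count cues by coarse screen region, given (start, end, text, vertical_fraction) cues."""
    profile = Counter()
    for _, _, _, vertical in cues:
        if vertical >= 0.75:
            profile["bottom"] += 1
        elif vertical <= 0.25:
            profile["top"] += 1
        else:
            profile["middle"] += 1
    return profile
```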

Why it matters

At least 7 million people use subtitles regularly, mostly for reasons other than hearing difficulties. This is a large audience for subtitles. Whilst the quality of subtitling for pre-recorded programmes is very good, subtitling for live programmes faces problems of accuracy and delay.

This video from See Hear explains how live subtitles are made.

The delay in the arrival of subtitles is a particular problem for viewers with hearing difficulties, as they watch with the sound on and use the text to supplement their understanding. They will often turn off either the subtitles or the sound to avoid confusion, and many will simply change channel.

For people watching without sound the delay is less of a problem, but because the subtitles are their only source of information, their accuracy matters most.

Our goals

We are aiming to contribute to improvements in subtitle quality and availability for broadcast, on-demand, streamed and web content over the coming years.

How it works

We are using speech recognition and language modelling tools to investigate processes for realigning and reformatting subtitles for later streaming. We are also carrying out user research to measure the impact of various issues on the perceived quality of subtitles.
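As a hedged sketch of what realignment could look like (not our production process), the example below retimes subtitle cues using word-level timestamps from a speech-to-text engine, assuming for simplicity a one-to-one correspondence between subtitle words and recognised words.

```python
# Hedged sketch only: retime subtitle cues using word-level timestamps from a
# speech-to-text engine. Assumes, purely for illustration, that recognised words
# line up one-to-one with the subtitle text.

def realign(subtitles, asr_words):
    """subtitles: list of subtitle texts in order; asr_words: list of (word, start, end)."""
    realigned, i = [], 0
    for text in subtitles:
        n = len(text.split())
        chunk = asr_words[i:i + n]
        if chunk:
            # New timing runs from the first to the last recognised word of this subtitle.
            realigned.append((chunk[0][1], chunk[-1][2], text))
        i += n
    return realigned
```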

This project is part of the UX work stream.

