Posted by Chris Baume
Today we’re launching a prototype podcast player on BBC Taster. This is an experiment in how we can use data to enhance the listening experience for our radio programmes and podcasts. One potential application of this technology is to tackle “fake news” by showing you the data sources that we use for our audio content. For this reason, we collaborated with the Radio 4 programme “More or Less” to bring you Even More or Less. Their emphasis on data and fact-checking makes a perfect match for this project.
The ‘Even More or Less’ playback interface.
Our prototype provides a rich user interface that brings you a whole host of new features. You can navigate by topic or using a transcript, view relevant charts and images, follow related links, view contributor profiles, and share clips.
We’re putting our prototype player on Taster to try to understand which information and interaction features listeners value most, and how best to present them. In this blog post, we explain how the project originated and the design process we followed to produce this groundbreaking new interface.
This project stemmed from a need to better understand what data we should generate during production. As we upgrade our production systems to use IP technology, we unlock new opportunities to collect and store much more information as part of the production process. For example, by labelling microphones with people’s identity, we can tag who is speaking, and when, on a recording.
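As a purely illustrative sketch of the mic-labelling idea above, per-channel activity logged by a mixing desk could be translated into named speaker segments like this (the names, structures, and logging format here are hypothetical, not the BBC's actual production schema):

```python
# Hypothetical sketch: deriving "who spoke when" from labelled microphones.
# Each microphone is labelled with a person's identity; the desk logs the
# intervals (in seconds) during which each mic channel was live.

mic_labels = {"mic-1": "Presenter", "mic-2": "Guest"}

mic_activity = [
    ("mic-1", 0.0, 12.5),   # (channel, start, end)
    ("mic-2", 12.5, 30.0),
    ("mic-1", 30.0, 41.0),
]

def speaker_segments(labels, activity):
    """Translate per-channel activity into named speaker segments."""
    return [(labels[channel], start, end) for channel, start, end in activity]

segments = speaker_segments(mic_labels, mic_activity)
# Each entry now records who was speaking, and when.
```

The same segment list is what a player needs to show "who is currently speaking", without any audio analysis at playback time.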
We are also working on technology that can automatically extract much more information from audio recordings. Our work on speech-to-text, for example, has already made it easier to search through archives of speech recordings. There are many potential routes for this research, but without a solid understanding of what data we want to generate and why, it is difficult to know where to focus our efforts.
This is where our prototype player comes in. By gaining a better understanding of what information and interaction features our audience value most, we can target our research to enable us to deliver those experiences on a large scale.
There have been a few previous attempts to use data to enhance podcast listening. In early 2018, The Guardian’s Mobile Innovation Lab released a special player for their podcast “Strange Bird”. Their player featured a chat-like interface that popped up with related visualisations, images and links. It also used push notifications to alert mobile phone users to new content.
At BBC R&D, our research into prototyping news formats produced a player called “FastForward”. This allowed listeners to scroll using a transcript, and a timeline that segmented the programme by speaker.
We kicked off the design of our interface by compiling a list of the information we could generate during radio production. The list included topic segments, contributor identities, transcripts, tags, links to related information, and images. We then needed to choose which of these we should offer to listeners, and how we should present them.
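The list above can be read as a data model for an enriched episode. A minimal sketch of that model follows; the class and field names are invented for illustration and are not the real BBC schema:

```python
from dataclasses import dataclass, field

# Illustrative-only data model for the production metadata listed above.

@dataclass
class Contributor:
    name: str
    role: str
    image_url: str = ""

@dataclass
class TopicSegment:
    title: str
    synopsis: str
    start: float          # seconds from the start of the programme
    end: float
    tags: list = field(default_factory=list)
    links: list = field(default_factory=list)    # related external links
    images: list = field(default_factory=list)   # charts and pictures

@dataclass
class Episode:
    programme: str
    title: str
    topics: list = field(default_factory=list)
    contributors: list = field(default_factory=list)
    transcript: list = field(default_factory=list)  # (start, speaker, text)
```

Structuring the metadata this way makes the later design choices concrete: everything the paper prototypes showed (topics, synopses, links, images, contributor profiles) hangs off one of these records.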
To design our data-rich interface, we used a technique called ‘paper prototyping’. This is where the rough outline of a user interface is constructed using paper and pen. We recruited ten podcast listeners to design their dream playback interface using the information from our list. We wanted to quickly capture a wide variety of concepts and ideas. To do this, we scheduled ten 30-minute prototyping sessions to take place in a single day. We gave our designers three rules:
- There are no right answers
- There are no rules
- Have fun
Most people listen on mobile, rather than on a PC, Mac or tablet. We took a mobile-first approach and based the designs on a template of a mobile phone. To save time, we prepared paper cut-outs for each piece of available information, which could be arranged on the template.
The large amount of information we could display made the design process challenging. There is very limited space - especially on mobile - to display information, so the designers had to be very selective about what to show, and when.
As the volunteers designed their interface, we encouraged them to describe what they were doing and why. The software engineer who would build the prototype was involved throughout the design process. This allowed them to understand the logic behind the design choices, and to clarify any details during the process.
Designs produced from our paper prototyping sessions.
Common design choices
During the sessions, there were many choices that almost everybody included in their designs. Most started their design with a play/pause button and basic programme information. This included the programme name, episode title and accompanying artwork.
Everybody listed the topics that were in the programme, and displayed the currently playing topic. Each design had an interactive timeline, segmented by topic, that would jump to the start of the topic when clicked. Short synopses for each topic were also included, which we hadn’t anticipated. Each design included a ‘share’ button, but crucially, this would share the current topic, rather than the programme as a whole.
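The timeline behaviour everyone converged on — segment the bar by topic, jump to a topic's start on click, and share the current topic rather than the whole programme — can be sketched as follows. The topic data and the share URL format are hypothetical, for illustration only:

```python
# Hypothetical sketch of the topic-timeline logic: each topic covers a time
# range; clicking a segment seeks to its start; "share" targets the topic
# currently playing rather than the whole episode.

topics = [
    ("Migration statistics", 0.0, 180.0),   # (title, start, end) in seconds
    ("NHS spending", 180.0, 420.0),
    ("Crime figures", 420.0, 600.0),
]

def current_topic(topics, playhead):
    """Return the title of the topic under the playhead."""
    for title, start, end in topics:
        if start <= playhead < end:
            return title
    return None

def seek_to_topic(topics, title):
    """Clicking a timeline segment jumps to that topic's start time."""
    for t, start, _end in topics:
        if t == title:
            return start
    raise ValueError(f"unknown topic: {title}")

def share_url(episode_slug, topics, playhead):
    """Share the current topic: a link into the episode at the topic's start."""
    title = current_topic(topics, playhead)
    return f"https://example.org/{episode_slug}?t={seek_to_topic(topics, title):.0f}"
```

Sharing a time offset rather than the bare episode URL is what makes a shared link land on the topic the listener meant, not the start of the programme.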
Related charts and images were included on all designs. Some suggested these could expand to include video, 360-degree images, and text. External links also formed part of each design. These can guide listeners to background information and enable onward journeys.
Finally, the name, role, picture and profile of each contributor were included, as well as an indication of who is currently speaking. It was hoped that this would help listeners better understand the credibility of contributors, and expose any biases.
Contentious design choices
Although there was much on which people agreed, there were many design choices where opinions differed. Transcripts were the most controversial element of the design. Opinion was equally split among including them, excluding them, and offering them only as an option. Some thought they were an unwelcome distraction from listening. Others thought they were an important navigation tool. However, one argument in favour that we hadn’t previously considered was that they could be particularly useful to non-native speakers, or the hard-of-hearing.
While everybody included a timeline with topic labels in their design, not everybody used the same timeline orientation or topic labelling technique. Most chose a horizontal timeline, but one design used a circular timeline (similar to the iPlayer Radio app), and another used a vertical timeline. Some designs displayed pop-up topic labels when the timeline was first clicked. Others expanded and contracted labels as the timeline was scrubbed. Each decision has its pros and cons, so the best design was far from obvious.
Other contentious design choices included which contributors should be displayed and when, which navigation buttons to include, and whether or not to notify the listener of additional information using sound effects or push notifications.
Our process for selecting the final design involved working alongside our software engineer. We wanted to find the best compromise between the various design options, while also being realistic about our ability to implement those designs.
We chose to display a transcript, with the option to switch it off. We found the transcript to be the most effective way to navigate the audio. By including a transcript, we resolved several other design choices. It meant we could remove all other navigation buttons, and display contributors as part of the transcript. Transcripts have the side-benefit of helping people understand what is being said. Those who find them distracting can simply switch them off.
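Transcript-driven navigation of the kind described above can be sketched simply: each transcript entry carries a timestamp and a speaker, so tapping a line seeks the player, and the currently spoken line can be highlighted for auto-scroll. The transcript content and structures below are invented for illustration:

```python
# Hypothetical sketch of transcript-driven navigation. Each entry is
# (start_time_in_seconds, speaker, text); speakers appear inline, which is
# how the final design folds contributor display into the transcript.

transcript = [
    (0.0, "Presenter", "Hello and welcome to the programme."),
    (4.2, "Guest", "The figure quoted was three hundred thousand."),
    (9.8, "Presenter", "Let's check where that number comes from."),
]

def seek_for_line(transcript, index):
    """Tapping transcript line `index` returns the time to seek to."""
    start, _speaker, _text = transcript[index]
    return start

def line_at(transcript, playhead):
    """Return the index of the line being spoken at `playhead` (for highlighting)."""
    current = 0
    for i, (start, _speaker, _text) in enumerate(transcript):
        if start <= playhead:
            current = i
    return current
```

Because every line doubles as a seek target, dedicated skip buttons become redundant, which is why the transcript let us strip the other navigation controls.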
Against popular opinion, we opted to use a vertical timeline. We felt that for a mobile phone interface, this worked better as it freed up more vertical space for navigating the content. The placement of the timeline on the right replaces the classic scrollbar, so we thought users would find it fairly intuitive. The vertical layout also gave us space to the side in which to display topic titles.
Finally, contrary to the design of the “Strange Bird” prototype, we decided not to alert users to the presence of new information using sound or notifications. The feedback we received was that the listening experience should not be interrupted. So long as there was enough information, users would quickly learn to seek it out by themselves.
Our prototype will be up on BBC Taster for the next three months. During this time, we will be collecting survey results and tracking how users are interacting with the interface. We also plan to run some formal face-to-face user studies to gain a deeper understanding of which elements of the interface our audience do and do not value. Later this year, we will compile and publish our results. These will be used to help inform the future development of BBC iPlayer and BBC Sounds, and the development of the tools we need to deliver these new experiences.
This post is part of the Immersive and Interactive Content section