Making Musical Mood Metadata (M4)

What we've done

Making Musical Mood Metadata (M4) was a TSB-funded collaborative project between BBC R&D, QMUL and I Like Music. Over eighteen months, the team developed new methods of extracting high-level metadata from music content, including information about the mood and emotional content of tracks. Access to this information makes it easier for content producers to find the music they are looking for.

Why it matters

The digital music revolution has seen an explosion in the size of music libraries. TV and radio producers now have a wider choice of tracks than ever before to use in their programmes, and finding the ideal track can often be a lengthy process.

Using the latest techniques in digital music analysis and machine learning, we can make it easier for people to find the track that is right for their situation. In broadcast, music tracks are often chosen for the emotion and mood that they convey. For this reason, the project focussed on allowing people to search for music by its mood.

How it works

Machine learning is a method of training a computer to find connections between two kinds of information - in this case, music and mood. This is done by providing the computer with thousands of music tracks labelled with various moods so that it can learn to distinguish between them.
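As an illustration, a mood classifier might be trained along these lines. The feature vectors, the four mood classes and the choice of a support vector machine are assumptions made for this sketch, not the project's actual pipeline:

```python
# Minimal sketch: training a mood classifier on labelled tracks.
# The features, mood classes and model choice are illustrative
# assumptions, not the project's actual pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one feature vector per track (e.g. tempo,
# spectral statistics) and one mood label per track.
features = rng.normal(size=(1000, 20))   # 1000 tracks, 20 features
moods = rng.integers(0, 4, size=1000)    # 4 hypothetical mood classes

X_train, X_test, y_train, y_test = train_test_split(
    features, moods, test_size=0.2, random_state=0)

# A support vector machine is one common choice for this kind of task.
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

Given enough labelled examples, the trained model can then assign a mood to tracks it has never seen before.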

By partnering with music supplier I Like Music, the project gained access to over 100,000 music tracks, each hand-labelled with detailed information about its genre, instrumentation and mood. We processed the audio and metadata using audio analysis algorithms and statistical techniques to find underlying structure and to help classify the audio content.
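One common statistical approach to finding structure in hand-applied labels is dimensionality reduction. The sketch below, with an assumed tag list and binary encoding, shows how principal component analysis can collapse correlated mood tags onto a few shared axes; it is illustrative rather than the project's actual method:

```python
# Sketch: uncovering structure in hand-applied mood tags with PCA.
# The tag list and binary encoding are assumptions for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Each row marks which mood tags apply to one track (1 = tagged).
tag_names = ["happy", "sad", "tense", "calm", "energetic", "dark"]
tag_matrix = rng.integers(0, 2, size=(500, len(tag_names)))

# Project the tag space down to two dimensions; tags that tend to
# co-occur collapse onto shared axes.
pca = PCA(n_components=2)
mood_coords = pca.fit_transform(tag_matrix)
print(mood_coords[:3])                  # 2-D mood coordinates
print(pca.explained_variance_ratio_)    # variance captured per axis
```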

Outcomes

By analysing I Like Music's large music database, we developed a model for representing the mood content of music as a series of numbers. This allows us to create systems which can interpret and compare the mood and emotion of music tracks.
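As an illustration of what such a numeric representation enables, the sketch below compares tracks as points in an assumed two-axis mood space; the axes and values are hypothetical, not the project's published model:

```python
# Sketch: comparing two tracks' moods as vectors. The two-axis
# (valence/arousal-style) representation is an assumption.
import math

def mood_distance(a, b):
    """Euclidean distance between two mood vectors."""
    return math.dist(a, b)

track_a = [0.8, 0.6]    # e.g. bright, energetic
track_b = [0.7, 0.5]    # a similar mood
track_c = [-0.6, -0.4]  # e.g. dark, subdued

print(mood_distance(track_a, track_b))  # small: similar moods
print(mood_distance(track_a, track_c))  # large: contrasting moods
```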

We used the N8 High Performance Computing Cluster to perform an in-depth musical analysis of 128,000 tracks. This data was combined with the mood model to train and test machine learning systems. Through this work, we discovered which musical features are most critical in determining the mood of music.
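The sketch below shows one standard way of ranking features by how much they contribute to a mood prediction, using a random forest's importance scores; the feature names and data are illustrative, not the project's findings:

```python
# Sketch: ranking which audio features best predict mood using a
# random forest. Feature names and data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
feature_names = ["tempo", "mode", "rms_energy", "spectral_centroid"]

X = rng.normal(size=(800, len(feature_names)))
# Synthetic rule: the mood label depends mostly on tempo and energy.
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
for name, score in sorted(zip(feature_names, forest.feature_importances_),
                          key=lambda pair: -pair[1]):
    print(f"{name}: {score:.3f}")
```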

We have already used this technology to enhance the BBC's online music library service with a recommendation function which gives producers a wider variety of music to use in their programmes. We are currently looking for ways to bring the full benefit of this technology to the wider public.
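A mood-based recommendation function can be sketched as a nearest-neighbour search over mood vectors. The catalogue, track names and vectors below are placeholder data, not the BBC service itself:

```python
# Sketch: recommending tracks with a similar mood via nearest
# neighbours. The catalogue and vectors are placeholder data.
import numpy as np
from sklearn.neighbors import NearestNeighbors

catalogue = {
    "Track A": [0.8, 0.6],
    "Track B": [0.7, 0.5],
    "Track C": [-0.6, -0.4],
    "Track D": [0.75, 0.55],
}
titles = list(catalogue)
vectors = np.array([catalogue[t] for t in titles])

index = NearestNeighbors(n_neighbors=3).fit(vectors)
_, idx = index.kneighbors([catalogue["Track A"]])
# The first hit is the query track itself; the rest are recommendations.
print([titles[i] for i in idx[0][1:]])
```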
