An Affective Interface for Mood-Based Navigation
Sometimes we need 'cheering up', 'calming down' or 'a bit of excitement'. Whether we notice it or not, we use affective (i.e. mood-based) language pervasively in life. Yet when we browse for media, we find content categorised into rigid, traditional genres, with searching restricted to factual metadata such as title or director. There is little allowance for relative scales or subjectivity.
The Multimedia Classification project in Archives Research is attempting to address this by answering the question: how can we use mood as a meaningful navigation tool? It certainly isn’t a frivolous pursuit: if the BBC archive is ever to be made available to the public, we’re going to need some help finding what we want. With hundreds of thousands of hours of programmes spanning over 75 years, simply searching for ‘comedy’ isn’t going to get you very far! In fact, in our recent study, the majority of people said they would find it useful to be able to search by mood. I understand if you’re sceptical: we’re so used to conventional searching that it’s hard to imagine a useful alternative. However, we’re not suggesting this approach will replace conventional methods, rather that it will augment and improve them. It may even turn out that searching by mood is something people didn’t realise they wanted, but once it’s there they’ll wonder how they ever managed without it!
So how does it work? The classification system automatically analyses programmes for a range of different video and audio features, such as luminosity, laughter and motion (see here for further details). The results are then used to assign each programme a rating on a set of mood scales. For example, a programme with a high level of motion but not much laughter might score 5/5 on the ‘slow-moving to fast-paced’ scale, but 1/5 on the ‘serious to humorous’ scale, meaning it is quick but not very funny (a thrilling drama, for instance). The advantage here is that it is then possible to compare different programmes based on their mood scores, allowing you to, for example, search for something ‘more exciting than Spooks’. Moreover, the combination of scores on each scale gives the programme a kind of mood fingerprint, so the system could recommend programmes with similar mood fingerprints to ones it knows you like.
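To make the fingerprint idea concrete, here is a minimal sketch (the titles and scores are made up, and the real system's similarity measure may differ): treat each programme's scores on the two example scales as a point, and rank other programmes by how close their points are.

```python
from math import dist  # Euclidean distance between two points

# Hypothetical 1-5 mood scores: (slow-moving -> fast-paced, serious -> humorous)
programmes = {
    "Thrilling Drama": (5, 1),
    "Sitcom": (3, 5),
    "Panel Show": (4, 4),
}

def similar_to(title, catalogue):
    """Rank the other programmes by distance between mood fingerprints."""
    target = catalogue[title]
    others = [(t, dist(target, v)) for t, v in catalogue.items() if t != title]
    return sorted(others, key=lambda pair: pair[1])

# The nearest fingerprint to "Sitcom" is "Panel Show" (fast-ish and funny),
# with "Thrilling Drama" (fast but serious) much further away.
print(similar_to("Sitcom", programmes))
```

With more scales the fingerprints simply become longer vectors; the comparison stays the same.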
We’re currently researching how users might interact with this mood data. One prototype interface we have developed is a two-dimensional scatter chart, with each axis representing one mood scale.
This allows users to view the relative moods of programmes as dots plotted on the chart. They can then either pick the mood they want using the sliders, or search using conventional means, using mood as a reference. It is hoped that this frontend will be integrated with BBC Redux (a BBC internal research tool) within a few months, as a preliminary trial to see how useful people find this type of affective interface.
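The slider interaction amounts to filtering the plotted programmes to a window of mood scores. A rough sketch of that filter (titles, field names and scores are invented for illustration):

```python
# Each slider constrains one mood scale to a (low, high) range on the 1-5 scale.
programmes = [
    {"title": "Spooks", "pace": 4, "humour": 1},
    {"title": "News at Ten", "pace": 2, "humour": 1},
    {"title": "Sketch Show", "pace": 4, "humour": 5},
]

def in_mood_window(items, pace=(1, 5), humour=(1, 5)):
    """Return titles whose mood scores fall inside both slider ranges."""
    return [p["title"] for p in items
            if pace[0] <= p["pace"] <= pace[1]
            and humour[0] <= p["humour"] <= humour[1]]

# Fast-paced AND humorous: only the sketch show survives both sliders.
print(in_mood_window(programmes, pace=(4, 5), humour=(4, 5)))
```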
Another area of interest is how mood varies throughout a programme. We’ve built a prototype system using the Universal Control API which allows users to view a dynamic graph of mood data on a second screen (such as a tablet or smartphone) that moves along in sync with a programme on a set-top box.
This graph can display individual features (such as luminosity shown above) but for public use will most likely show an amalgamation of the many different audio and video features of the programme as analysed by the classification system, to provide an overall value for what we are provisionally calling ‘interestingness’. Imagine a kind of ‘intelligent skipping’ service, so that by tapping on peaks and troughs shown on the graph you can skip ahead to goals and car chases, or straight to the desired performance in entertainment shows.
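The ‘intelligent skipping’ idea boils down to finding peaks in the interestingness series and offering them as seek targets. A minimal sketch (the sample values and threshold are made up; the real signal would come from the classification system):

```python
# 'Interestingness' values sampled once per second (invented data).
interest = [0.2, 0.3, 0.9, 0.4, 0.2, 0.8, 0.3, 0.1]

def peaks(series, threshold=0.5):
    """Indices (here, seconds) that are local maxima above the threshold."""
    return [i for i in range(1, len(series) - 1)
            if series[i] > threshold
            and series[i] >= series[i - 1]
            and series[i] >= series[i + 1]]

# Tapping a peak on the second-screen graph would seek the set-top box
# to that offset, e.g. the seconds returned here.
print(peaks(interest))
```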
There are a number of directions this project could take in the future. A more public-friendly user interface than a graph is desirable, and here there is scope for innovative interface design. The problem of mood subjectivity will also need to be addressed: something I find humorous may not be so funny to you (this became apparent in our user trials, where many people rated a serious documentary on 70s fashion as very humorous…). The current view is that the system would take advantage of machine learning so that it could adapt to your tastes: the user would rate the success of mood categorisations and the system would gradually learn, improve and become personalised.
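As a toy illustration of that feedback loop (a simple running-average update, standing in for whatever learning method the real system might use; the scores and ratings are invented): each time the user rates a programme, nudge their personalised mood score towards that rating.

```python
def update(personal_score, user_rating, rate=0.3):
    """Move the personalised score a fraction of the way towards the rating."""
    return personal_score + rate * (user_rating - personal_score)

score = 1.0                 # system's initial score: serious (1/5 humorous)
for rating in (4, 5, 4):    # but this user keeps finding it funny
    score = update(score, rating)

print(round(score, 2))      # the score drifts towards the user's view
```

The learning rate controls how quickly the system trusts the user over its automatic analysis.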
Ultimately, the Multimedia Classification team hope to augment browsing, searching, skipping and programme recommendation using mood-based metadata, and, with user experience in mind, bring you an effective affective interface.