The Music Bore
From Chris Lowis, one of a team of BBC developers who created a winning entry at the recent Music Hackday...
On the weekend of 11th-12th July the Guardian kindly allowed a group of hackers to take over one of their offices to hack on music-related projects. Patrick, Yves, Nick and I represented A&Mi. As well as helping others to work with the /music and /programmes APIs, we spent the weekend building an automated DJ system which trawled the web of linked data (and other webservices with open APIs, the Echo Nest and last.fm for example) to create a continuously evolving playlist of music. A robotic voice then droned on with facts derived from the data. We called the resultant mash-up The Music Bore.
You can see the Bore at work in this video:
It was, as organiser Dave Haynes pointed out, significantly better than listening to Chris Moyles. We couldn't possibly comment.
In a moment of madness, we decided to implement the whole thing as a collection of inter-communicating IRC bots. Using IRC bots allowed us to work on and test the different parts in parallel, and to each use our favourite languages. We collaborated on shared code using GitHub.
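The nice thing about this architecture is that each bot only has to understand a simple line-based command protocol sent over a shared channel. As a rough sketch of the idea, here is a parser for that kind of message; the `!play` command name and the message format are invented for illustration, not our actual protocol:

```python
# Hypothetical sketch of the bots' shared-channel protocol: commands
# are lines beginning with "!", everything else is ordinary chatter.
# The command names here are invented for illustration.
def parse_command(line):
    """Split a bot command like '!play Radiohead Airbag' into
    (command, args). Returns None for non-command messages."""
    if not line.startswith("!"):
        return None
    parts = line[1:].split()
    if not parts:
        return None
    return (parts[0], parts[1:])


# Each bot can then dispatch on the command name and ignore the rest,
# which is what let us build and test the pieces independently.
print(parse_command("!play Radiohead Airbag"))
```

Because every bot speaks the same trivial protocol, a half-finished bot can be tested by simply typing commands at it in the channel by hand.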
Yves worked on the playlist generation and fact-finding code using RDF and SPARQL queries. The exciting part of this work was that it allowed the playlists to be utterly unpredictable. Given an arbitrary starting point the Bore could follow many possible paths through the web of linked data. If you want to learn more about how this worked in practice, or how to mine linked data using these technologies, Yves recently blogged about that in detail.
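To give a flavour of the approach, the sketch below builds the sort of SPARQL query that hops from one artist to related artists in the web of linked data. The DBpedia endpoint and the `dbo:associatedMusicalArtist` property are assumptions for illustration; see Yves' post for the queries we actually ran:

```python
# Hypothetical example: construct a SPARQL query that finds artists
# related to a given artist URI. The dbo:associatedMusicalArtist
# property and DBpedia vocabulary are assumptions for illustration.
def related_artists_query(artist_uri):
    """Return a SPARQL SELECT query for artists linked to artist_uri."""
    return """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT DISTINCT ?related WHERE {
  <%s> dbo:associatedMusicalArtist ?related .
} LIMIT 10
""" % artist_uri


# The Bore would run a query like this, pick one of the results at
# random, and repeat from the new artist, giving an unpredictable walk.
print(related_artists_query("http://dbpedia.org/resource/Radiohead"))
```

Each hop starts from whichever artist the previous hop chose, which is why no two runs of the Bore produce the same playlist.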
Patrick built a suite of "fact-finders" which randomly generated facts for particular artists from a variety of sources. He also put together, in record time, the "controller bot" which drove the demo - probably the most impressive example of speed coding I've ever seen!
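The "suite of fact-finders" pattern is easy to sketch: a list of small functions, each able to produce facts about an artist from one source, with the controller picking among them at random. Everything below (the finder names, the stubbed lookups, the fact shape) is invented for illustration:

```python
import random

# Hypothetical sketch of a fact-finder suite. Each finder returns a
# list of (subject, predicate, object) facts for an artist; the real
# finders queried live data sources, these are hard-coded stubs.
def birthplace_facts(artist):
    return [(artist, "was born in", "Sheffield")]  # stub lookup

def collaborator_facts(artist):
    return [(artist, "has collaborated with", "Brian Eno")]  # stub lookup

FACT_FINDERS = [birthplace_facts, collaborator_facts]

def random_fact(artist, rng=random):
    """Pick a random finder, then a random fact from its results."""
    finder = rng.choice(FACT_FINDERS)
    facts = finder(artist)
    return rng.choice(facts) if facts else None


print(random_fact("David Bowie"))
```

Adding a new data source is then just a matter of writing one more function and appending it to the list, which is part of why the suite could grow so quickly over the weekend.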
Nick worked on the audio generation, MP3-finding and re-encoding magic that was necessary to allow music to be played. A particularly interesting aspect for me was the use of JACK to allow programmatic volume fading and mixing of the different audio streams. I did some amateur natural-language generation work to make the bot sound a little more human. This involved turning abstract facts into human-sounding sentences using some basic NLG techniques.
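The NLG step can be sketched as template-based realisation: an abstract (subject, predicate, object) fact is dropped into one of several sentence templates, chosen at random so the voice repeats itself less. The templates below are invented examples, not the ones we used:

```python
import random

# Hedged sketch of template-based NLG: render an abstract
# (subject, predicate, object) fact as an English sentence, varying
# the template so the robotic DJ sounds slightly less robotic.
# These templates are invented for illustration.
TEMPLATES = [
    "Did you know that {s} {p} {o}?",
    "Here's a fact for you: {s} {p} {o}.",
    "Interestingly, {s} {p} {o}.",
]

def realise(fact, rng=random):
    """Turn a (subject, predicate, object) fact into a sentence."""
    s, p, o = fact
    return rng.choice(TEMPLATES).format(s=s, p=p, o=o)


print(realise(("David Bowie", "was born in", "Brixton")))
```

Even this crude trick helps: hearing the same fact phrased two different ways is noticeably less grating than hearing an identical sentence frame every time.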
We were lucky enough to be awarded the last.fm category prize, and to our great surprise the "best in show" prize too. A testament to what can be achieved in 24 hours if you let 4 hackers use their favourite tools to build something they are passionate about; or a clear demonstration of the lack of judgement caused by sleep deprivation? You decide.
It's well worth taking a look at the list of other hacks produced. I especially liked the Music Zeitgeist visualisation of the lyrics of each decade of music, an iPhone app for displaying last.fm events on a map and this site for generating a "sounds of your city" radio station.