IRFS Weeknotes #124
We moved offices this week, away from clunky-but-lovely Henry Wood House to shiny and modern West London. This disrupted us for around 20 minutes, and then we got on with researching and inventing the future (with a bit of dissemination and meet-the-colleagues on the side).
I spent Monday rehearsing for a big BBC internal conference. On Tuesday I went to it, along with pretty much all of the team. Kirsty Wark chaired the day, and so Rob and I explained our work to her and to people from around Future Media and the BBC. I signed her up to the World Service Archive trial, then Yves, Tristan and others joined colleagues from R&D in the break, demoing more of our work to the attendees. I enjoyed the session with the new DG and the Director of Future Media, and then scooted off to Warsaw to present some of our ideas for collaboration at an EU Proposers' Day (a bit like blind dating, but for cross-border R&D).
James is working on Snippets, porting the core application to a new framework for future development and working with Gareth on restructuring the core app and its APIs. For Rob, this week's effort has mostly been around presentations and securing new datasets. A number of new academic partners are interested in our Framestore data (half a billion images, representing every single second of the last 5 years' worth of BBC TV broadcasts), and we'd like to partner with them.
James is also working on the egBox demo platform, looking at portable DVB modulators to help properly demonstrate the complete system from antenna to screen in isolation outside the UK.
Chris Newell is working on an update to our Sibyl recommender prototype which will replace the current metadata-based approach with a hybrid which includes collaborative filtering. MyMediaLite, the open-source recommender system library initiated in the MyMedia collaborative research project, has released version 3.03 with improved support for the kNN algorithms we use.
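For readers unfamiliar with the kNN family of collaborative filtering, here is a minimal sketch of item-based kNN over implicit feedback. This is an illustrative toy, not MyMediaLite's implementation or Sibyl's actual code; all the data and function names are made up.

```python
from collections import defaultdict
from math import sqrt

# Toy implicit-feedback data: user -> set of programmes they've listened to.
ratings = {
    "alice": {"prog_a", "prog_b", "prog_c"},
    "bob":   {"prog_a", "prog_b"},
    "carol": {"prog_b", "prog_c", "prog_d"},
}

def item_users(ratings):
    """Invert the user->items map to item->users (each item's audience)."""
    users = defaultdict(set)
    for user, items in ratings.items():
        for item in items:
            users[item].add(user)
    return users

def cosine(a, b):
    """Cosine similarity between two items, treating audiences as binary vectors."""
    inter = len(a & b)
    return inter / sqrt(len(a) * len(b)) if inter else 0.0

def recommend(user, ratings, k=2):
    """Score each unseen item by its summed similarity to the user's
    k most similar already-seen items, then rank."""
    users = item_users(ratings)
    seen = ratings[user]
    scores = defaultdict(float)
    for item, fans in users.items():
        if item in seen:
            continue
        sims = sorted((cosine(fans, users[s]) for s in seen), reverse=True)
        scores[item] = sum(sims[:k])
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("bob", ratings))  # ['prog_c', 'prog_d']
```

A hybrid approach like the one described above would combine scores such as these with metadata-based similarity, so that new items with no listening history can still be recommended.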
Pete and Andrew Nicolaou worked on a website, and Chris Lowis and Matt Paradis fine-tuned the demos for the upcoming Radiophonic Workshop in public day (to be held at the Southbank in London). We're putting together a micro-site showing Chris's and Matt's four interactive demos alongside the source code and a bit of history about the original machines.
Editing of speaker names in the World Service archive is now live (and already 250 names have been contributed by our initial set of users), and Yves is working on more tools to validate or correct speaker names when they have been inferred by our automated tools, so that we can use the edited data as a basis for evaluating our speaker recognition algorithm.
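The idea of using the edited data for evaluation can be sketched very simply: treat the human-validated names as ground truth and score the automatically inferred labels against them. The segment IDs and names below are invented for illustration, and real evaluation would need to handle name variants and unlabelled segments.

```python
# Hypothetical example: automatic speaker labels vs. human-validated ones.
inferred  = {"seg1": "Alistair Cooke", "seg2": "John Peel",   "seg3": "Sue Lawley"}
validated = {"seg1": "Alistair Cooke", "seg2": "John Arlott", "seg3": "Sue Lawley"}

# Count segments where the automated tool agreed with the validated name.
correct = sum(1 for seg, name in validated.items() if inferred.get(seg) == name)
accuracy = correct / len(validated)
print(f"{correct}/{len(validated)} segments correct ({accuracy:.0%})")  # 2/3 segments correct (67%)
```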
Libby ran a workshop on possible applications for VistaTV data (with the help of Olivier, Chris Newell, Tristan and Vicky). 32 people attended from both inside the BBC and outside, and they produced a large number of interesting ideas for us to explore in the next few weeks. Libby also attended a very useful course on data visualisation run by Andy Kirk.
On Wednesday Andrew invited Knight-Mozilla Fellow Laurian Gridinoc to give an overview of his work with BBC News Specials and new approaches to telling stories on the web. He gave a very interesting survey of the current state of the art, mainly using the HTML5 media framework Popcorn.js to mix web content and services with audio and video. Some of this was experimental (mixing video, street view and maps) and some was an evolution of established presentation types like slideshows.
Andrew N plugged Chrome Canary's new support for CSS Shaders, which enable "Cinematic Effects using CSS and HTML". See a video and examples here.