BBC R&D

Posted by Ian Forrester

Ian Forrester writes about touring our project as it visited IBC, FutureFest and Thinking Digital, where people experienced some of the 50,000+ different variations it can produce!

Recently, Visual Perceptive Media has been shown at IBC 2016 in Amsterdam, FutureFest in London and finally Thinking Digital in Manchester, as part of the object-based media work BBC R&D has been pioneering.

Our pilot was viewed by many people at these events, and we received plenty of questions ranging from the production process to the user experience and the ethical editorial decisions involved - so I thought it might be interesting to cover some of them in this blog post.

At IBC the questions mainly focused on the production process, as you can imagine. Once we fully explained objects (media plus metadata) in discussions, it became clear to people how the very structured scheduling during shooting could benefit not only the production team, editor and director but also the audience in a new broadcasting system. The reaction to an object-based production process was positive - a person from a well-known documentary-making company was amazed that they could finally make use of the many hours of perfectly usable footage without relegating it to DVD extras, which far fewer people would see.

Likewise, there was surprise at seeing it all running in a web browser, including the real-time colour grading. We also gave examples of simple applications, such as changing the music based on region and rights, or bypassing swearing or violence based on the watershed or the viewer's age.

At FutureFest and Thinking Digital, the user experience and editorial questions dominated, and the audience started to explore some of the challenges with Visual Perceptive Media. It is made to deliberately nudge you one way or another using cinematic techniques, rather than the sweeping changes seen in branching narratives. Each change is subtle, but these techniques are used in film making every day - which raises the question: how do you even start to demo something with 50,000+ variations?


This is also the challenge we are exploring for a BBC Taster prototype. Our CAKE prototype deployed a behind-the-curtains view as well, which helped make it clear what was going on - it seems a visual perceptive drama needs something similar.

On the editorial ethics questions, I had plenty to say, but I was also able to point to the stellar work we are already doing in this area.


The presentation at FutureFest was rapid (58 slides at 20 seconds a slide), leaving plenty of time for Q&A from the fully packed room. I didn't realise the talk would be so popular: people sat in the aisles, and some told me they couldn't get into the room and had to watch through the windows. The presentation prompted questions about filter bubbles, data use, shared experiences and more. It was very interesting to hear how passionate people were about the serious concerns that visual perceptive drama raises.

Besides the slides featuring our research work into data ethics, I mentioned a three-year project we are involved in called Databox. It deserves its own blog post in the future.

Finishing the presentations at FutureFest and Thinking Digital, I suggested that Perceptive Media could create a better experience than Mixed Reality (VR + AR). I positioned Perceptive Media as a type of hyper reality, using real connected objects and media to build a new type of shared experience.

Our research into new content experiences with object-based media continues onwards… Thank you to everyone who came and spoke to me across all three conferences.

Ian Forrester is at MozFest this weekend with other colleagues from BBC R&D in the Dilemmas in Connected Spaces strand. Look out for sessions on usable privacy in the connected home and a demo built on the principles of our CAKE project. BBC News Labs will be at MozFest too with the Gifenator, VR and 360 Video.
