Posted by Alia Sheikh
There are seven of us crowded into a small studio. Our actors - Sally Carman, Richard Harrington and Helen Kay - are against a background of the brightest green, calmly informing the Prime Minister that the world is about to flood. We plan to take the resulting footage and stitch it into a 360 panorama, creating a companion experience to BBC Radio 4’s latest production of John Wyndham’s The Kraken Wakes. The play, adapted by Val McDermid from John Wyndham's classic sci-fi novel, tells the story of an alien invasion that causes catastrophic flooding across the world.
This was a few weeks ago; the final part of The Kraken Wakes will be broadcast on Radio 4 on June 4th, and The Kraken Wakes 360 experience is live right now on BBC Taster.
In fact, this 360 video was created as part of our ongoing tests of the format. We are interested in the cinematography of 360 video and VR experiences in general, and in how audiences respond to the different techniques we can use to direct their attention around a 360 videoscape.
We have made three slightly different versions of our Kraken Wakes 360 video, and when someone clicks through from the BBC Taster website to view the video, one of these three versions is selected and played out at random. In each version, the spacing between the characters differs, affecting how much the user is encouraged to look around the scene. We are hoping to discover how varying the separation between elements of interest affects the viewers’ experience of the story.
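The random selection described above can be sketched roughly as follows. This is only an illustration - the variant names and the selection logic inside the actual BBC Taster player are assumptions, not the real implementation:

```python
import random

# Hypothetical labels for the three cuts of the scene, which differ
# only in how far apart the characters are placed.
VARIANTS = ["narrow_spacing", "medium_spacing", "wide_spacing"]

def assign_variant(rng=random):
    """Pick one of the three versions uniformly at random for each view."""
    return rng.choice(VARIANTS)

# Each click-through gets one version; over many views the three
# versions end up served in roughly equal proportions.
counts = {v: 0 for v in VARIANTS}
for _ in range(30000):
    counts[assign_variant()] += 1
print(counts)
```

Assigning a version per view, rather than per user, keeps the setup simple; comparing audience responses across the three groups then shows how the spacing between points of interest affects the experience.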
Our director, Justine Potter of Savvy Productions, who was also responsible for The Kraken Wakes radio drama, identified elements from the radio play which allowed us to run this test whilst at the same time making a compelling scene in 360. Each actor performed the scene as a single take against a greenscreen, so that we could stitch the performances together, digitally repositioning the actors at the various required distances from each other.
My colleague Paul Golds had the task of putting additional CGI elements into the scene. As we needed to storyboard the CG elements before the shoot itself, Paul was also able to create pre-visualisations of the significant moments in the scene and let the actors see what the final footage should look like. The biggest challenge on the day of the shoot was making sure that we had performances from each of our actors that would line up perfectly with each other - a six-minute single take, interacting with to-be-added-later CG elements, is no small request. With a lot of professionalism on the day and some tricks in post-production, the result is something that feels like a single seamless performance.
In addition to those I’ve mentioned above, I’d like to thank everyone else who helped with this experimental production, including Cat Harris (compositing), Eloise Whitmore (sound design), Greg Furber and colleagues from Rewind (shooting and stitching), and Ian Wagdin, Vanessa Pope, Angela McArthur and Simon Rankine (R&D colleagues who helped out).
This post is part of the Immersive and Interactive Content section