Posted by Nick Hanson
In part one of this series I discussed some of the ways BBC R&D has investigated interactive storytelling and shared some of the key lessons we’ve learned over the years. Here I’d like to share how we’re building on this work to help explore the storytelling of the future.
For many people the phrase ‘interactive story’ conjures up the image of ‘choose your own adventure’ choice-based narratives. This is the core concept behind most of the interactive BBC projects covered in my previous post, and is also how Netflix’s recent Black Mirror episode, Bandersnatch, plays out.
But is this what we actually mean when we talk about ‘interactive drama’? And is branching narrative really the future of storytelling?
Although the format opens up some exciting opportunities, it also presents many challenges. As a successful writer once explained to me, giving ownership of their story to the audience by writing multiple ‘paths’ risks taking autonomy away from the author and diluting the quality of the story. When watching BBC One’s hit show Bodyguard, do the audience really want to ‘lean forward’ to control the actions of the eponymous protagonist? Or do we want to ‘lean back’ and let the writers take us on a journey?
The truth is both options are valid. It depends on the audience and the type of stories and experiences that work for them. ‘Choose your own adventure’ may well continue to emerge and evolve, but the format is limited and won’t appeal to everyone. So where does this leave us?
As content is increasingly consumed over the internet, producers and storytellers know ever more about their audience through the data they collect. No two viewers are the same. We all have different anxieties, needs and motivations. Our context changes as we go about our days and move (quite literally) through our lives. To meet this challenge, BBC R&D is exploring how we can create stories and content that adapt to individuals and their context.
One solution to this challenge is object-based media (OBM), a concept that regular visitors to this site will already be familiar with. We’re currently developing our own set of tools and partnering with other organisations to experiment with their tools for creating object-based stories.
One such recent collaboration was between BBC R&D, the University of York and Symbolism Studios on What is Love? (2018), an OBM film created for the inaugural York Mediale media arts festival. Set in the near future, in a world where Artificial Intelligence is commonplace within our daily lives, the film tells the story of a young couple living apart who communicate through an AI platform. When they begin to neglect their relationship, AI detects the problem and steps in.
It’s powered by The Cutting Room, an OBM tool developed by Davy Smith at the University of York, which allows viewers to play a role in the lives of the protagonists. The technology responds to the audience in terms of length, depth of interest, location, personal preferences and other factors to create an algorithmically tailored version of the story.
Unlike the choice-based branching mechanics of the previous examples, OBM is more versatile in how the component parts of a story can be remixed according to the actions of the viewer, thus leading to greater potential for how the story plays out.
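To make the remixing idea concrete, here is a minimal, hypothetical sketch of object-based assembly: each media ‘object’ carries metadata, and a simple selector composes a version of the story to fit the viewer’s available time and interests. All of the names, fields and the selection rule below are invented for illustration; real OBM tools such as The Cutting Room and StoryFormer are far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class MediaObject:
    name: str
    duration: int    # seconds
    essential: bool  # must appear in every version of the story
    theme: str       # used to match viewer interests

def assemble(objects, time_budget, interests):
    """Pick essential objects first, then fill the remaining time
    with optional objects that match the viewer's interests."""
    playlist = [o for o in objects if o.essential]
    remaining = time_budget - sum(o.duration for o in playlist)
    for o in objects:
        if not o.essential and o.theme in interests and o.duration <= remaining:
            playlist.append(o)
            remaining -= o.duration
    return playlist

# An invented set of story objects, loosely inspired by What is Love?
story = [
    MediaObject("opening", 60, True, "setup"),
    MediaObject("ai_backstory", 90, False, "technology"),
    MediaObject("couple_montage", 45, False, "romance"),
    MediaObject("resolution", 60, True, "ending"),
]

version = assemble(story, time_budget=180, interests={"romance"})
print([o.name for o in version])  # → ['opening', 'resolution', 'couple_montage']
```

Note that no branch was authored for this combination: the version emerges from the objects and the viewer’s context, which is the key difference from a choice tree.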
Although What is Love? still relies on the viewer ‘leaning into’ the film to consciously change the outcome, OBM could potentially see the shape of content change much more implicitly, based not on the viewer’s choices but on their data.
As data is collected, either through your BBC login or the devices you use to access content, a profile can be built over time, allowing the content to be algorithmically shaped to you and even your context at the time (e.g. your location). For example, BBC R&D’s Visual Perceptive Media project explored how the ‘objects’ in a drama, such as music and colour grading, are shaped in real time to suit your personality or mood.
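A sketch of that perceptive shaping might look like the following: presentation ‘objects’ (the music track, the colour grade) are chosen from a viewer profile and context rather than from explicit choices. The profile fields and the mappings here are purely illustrative assumptions, not how Visual Perceptive Media actually works.

```python
def shape_presentation(profile, context):
    """Map an (assumed) viewer profile and context onto presentation
    objects for the same underlying story, with no branching."""
    mood = profile.get("mood", "neutral")
    # Invented mapping: mood drives the choice of music object.
    music = {"upbeat": "bright_score", "tense": "minimal_score"}.get(mood, "ambient_score")
    # Invented mapping: time of day drives the colour grade.
    grade = "warm" if context.get("hour", 12) >= 18 else "daylight"
    return {"music": music, "colour_grade": grade}

settings = shape_presentation({"mood": "tense"}, {"hour": 21})
print(settings)  # → {'music': 'minimal_score', 'colour_grade': 'warm'}
```

The underlying narrative is unchanged; only its presentation layer is tailored, which is why this approach doesn’t multiply the writer’s workload the way branching does.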
A similar principle was applied to the Living Room of the Future project, an object-based perceptive media experiment in which the Internet of Things powered a hyper-personalised and multi-sensory immersive environment.
As well as demonstrating the potential for stories to interact with your data as much as your choices, these experiments also show how narratives don’t need to branch, forcing writers to create multiple storylines. They could simply layer ‘objects’ to provide a version of the story that is personal to you.
Of course, interactive and perceptive storytelling doesn’t just apply to drama. The BBC has a long history of factual storytelling, and we are currently running pilots to find where the value of OBM lies here.
At the time of writing, we are about to commission an object-based documentary made using StoryFormer, a web-based tool for developing responsive stories. We want to better understand whether, through a combination of explicit audience choices (interactive) and implicit data (perceptive), we can serve up multiple different versions of a documentary for diverse audiences between 16 and 34 years of age.
We’re also incredibly excited about the potential of using video game engine technology to create, distribute and consume object-based media. Can we seamlessly deliver these experiences to you as easily as you get iPlayer today, without the need to download apps, switch to websites or leap over other barriers? There’s progress on this front that we’ll share in more depth later in the year.
Time will tell where the real value of object-based media lies, and whether people want to lean in to interact or whether data and algorithms will enable better, more personal storytelling. The one certainty is that for the technology to be meaningful, it must serve to deepen the audience’s engagement. Never forget the first rule: don’t let interactivity get in the way of a good story!