Posted by Emma Young
Performance group 1927 are the first external producers to launch a pilot created with Audio Orchestrator, after its public release just three weeks ago. Their creative team has produced an orchestrated version of ‘I’m Alright, Jack’, the first episode of their Decameron Nights series, a Culture in Quarantine and BBC Arts commission that will air over three nights from 10th – 12th August 2020.
Audio device orchestration uses any available device that can play audio to enable immersive and interactive playback. After several years of working on this technology and experimenting with potential use cases, including the trial production The Vostok-K Incident in 2018, we recently launched our new production tool, Audio Orchestrator. We are thrilled that the award-winning performance group 1927 quickly spotted the tool’s potential and have used it to produce this public-facing pilot.
The first episode of 1927’s Decameron Nights, ‘I’m Alright, Jack’, is a trio of curious folk tales commissioned for Radio 3’s The Essay series, with haunting music and captivating sound effects that lend themselves well to being re-imagined through orchestration. While the stereo version will air for radio audiences at 22:45 on 10th August and remain available on-demand via BBC Sounds, listeners can experience the orchestrated version on BBC Taster, our platform for trialling new and experimental technology with the public.
Paul Barritt, Illustration Artwork & Sound at 1927, writes:
Working with the R&D team at the BBC on Audio Orchestrator (AO) was great fun! It is actually quite a simple piece of software to use once you get your head around the basic principles and very effective once you get it working really well. I managed to get several devices working pretty much straight away, and it was a giddy thrill indeed.
As a first experiment, I felt we did a cool thing; however, we were kind of working backwards. If we were to develop something using the tool again, obviously we would be taking it into consideration at the beginning of the process so would create the work with this in mind. That said, there was already plenty enough sound design created by Laurence Owen to make this a really effective project. In many ways, it was Laurence that did most of the work as it is down to the sound designer to create a variety of “Stems” to be played in the software. Again Laurence felt that he would have worked differently if he’d taken this into consideration from the outset. He too is of the belief, however, that AO has immense potential, especially as this is really his field.
Its application has the possibility to be seriously widened. I could imagine listening to a play, say a three-hander, in which each of the protagonists speaks from their own device and the listener can sit right in the middle as if sitting directly onstage rather than in the audience. I could also imagine scenarios in which quite intricate audio worlds could be created around the listener, creating various atmospheres, from wide-open space to quite claustrophobic environments.
I think the secret to using it is not to think of it as a surround sound thing, but rather as a set of objects placed around a space that can make noises. In this sense, I think some very interesting outcomes could be developed. I found it nice to walk amongst the devices for example. Possibly even walk with them. Perhaps a strange AO walking tour?
With the Decameron project, we managed to incorporate some basic imagery to go along with the sound. This is a part of the software that is as yet undeveloped but which I think has great potential. Again it would benefit from conceiving of a project as an AO experience from the start and really thinking about how best to use imagery.
In general, it has amazing potential, and there is a great team behind it, all really easy to work with and keen to experiment further with what can be achieved. So many thanks to Emma and Kristian for your help with setting up Decameron Nights, and hopefully it will be a very gratifying experience for all those who use it!
The Audio Team at BBC Research & Development started experimenting with audio device orchestration because we wanted to find more ways to give our audiences great listening experiences on the audio devices they have access to at home. It is widely accepted that surround sound, where the sound of the action emanates from all around you, is a much more immersive listening experience than conventional stereo. We wanted to find a way of creating that effect in the home, knowing that the vast majority of our audiences don’t have the sort of expensive speaker configurations that support spatial audio. It is becoming increasingly common for listeners to have access to several devices that are capable of reproducing sound, including mobile phones, tablets, and laptops. Device orchestration enables us to adapt a piece to make the best use of whatever devices are available.
Through our first trial production, The Vostok-K Incident (part of the S3A project, a five-year EPSRC-funded collaboration with the Universities of Surrey, Salford, and Southampton), we learned from audiences that there is an appetite for orchestrated content. We focused our efforts on understanding the kinds of new media experiences that producers could make and on developing a flexible, easy-to-use tool to make the production of orchestrated content accessible.
The tool allows creatives to synchronise several audio devices into a connected system and take an object-based approach to production, routing parts of an audio scene to one or multiple devices to make highly immersive content. It is also possible to introduce interactive elements, where audiences can make choices during the experience, for example, to steer the narrative or to switch parts of an audio scene on or off. The tool can also allow the production of experiences that sync together speakers in multiple locations, for example, enabling people in two households to experience the same content at the same time. The underlying synchronisation technology, which is also used in our BBC Together trial, was developed as part of the 2-IMMERSE project.
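The object-based routing described above can be sketched in a few lines of code. This is a minimal illustration of the general idea, not Audio Orchestrator's actual API: the device roles, routing categories, and all of the names below are hypothetical assumptions chosen for the example.

```python
# Illustrative sketch of object-based allocation: each audio object in a scene
# carries a simple routing rule, and an allocation step decides which of the
# currently connected devices should play it. (Hypothetical names throughout;
# this is not how Audio Orchestrator is actually programmed.)
from dataclasses import dataclass

@dataclass
class AudioObject:
    name: str
    routing: str  # "main" = main device only, "aux" = auxiliary devices, "all" = everywhere

@dataclass
class Device:
    device_id: str
    is_main: bool  # the device that started the session

def allocate(objects, devices):
    """Map each audio object to the list of device ids that should play it."""
    mains = [d.device_id for d in devices if d.is_main]
    auxes = [d.device_id for d in devices if not d.is_main]
    allocation = {}
    for obj in objects:
        if obj.routing == "main":
            allocation[obj.name] = mains
        elif obj.routing == "aux":
            # Fall back to the main device if no auxiliary device has joined yet,
            # so no part of the mix is silently dropped.
            allocation[obj.name] = auxes or mains
        else:  # "all"
            allocation[obj.name] = mains + auxes
    return allocation

objects = [
    AudioObject("narration", "main"),
    AudioObject("ambience", "all"),
    AudioObject("whispers", "aux"),
]
devices = [Device("phone-1", True), Device("tablet-1", False)]
print(allocate(objects, devices))
# {'narration': ['phone-1'], 'ambience': ['phone-1', 'tablet-1'], 'whispers': ['tablet-1']}
```

The fallback for "aux" objects reflects a real constraint of orchestrated experiences: auxiliary devices may join or leave at any time, so the allocation has to be recomputed whenever the set of connected devices changes.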
Audio Orchestrator is now available for anyone to use through BBC MakerBox, which is a platform for creators to access tools that enable exploration of new technologies and a community for discussion and sharing of ideas. Audio Orchestrator comes with an example project to get new users up and running, and a set of detailed documentation, with technical support available through the MakerBox forums. If you would like to use Audio Orchestrator, you can request access through MakerBox.
This post is part of the Immersive and Interactive Content section