States of Mind: Real Worlds
It is 2030, and for the past fourteen years Charlie's mother Ella has been confined to her hospital room following a brain injury. Although Ella has remained behaviourally unresponsive, she is conscious and able to communicate using Virtual Reality (VR) and a Brain Computer Interface (BCI) which controls a speech synthesiser. Charlie has struggled growing up without her mother's support and is reluctant to communicate with her in VR. Charlie has problems of her own: a new baby, a husband who spends all his time in VR, and a young son, Kieran, who she fears is heading the same way. But will Kieran be the key to Charlie reconnecting with her mother?
Real Worlds was developed through Wellcome Experimental Stories in consultation with Anil Seth (Professor of Cognitive and Computational Neuroscience and co-director of the Sackler Centre for Consciousness Science, University of Sussex). The drama was inspired by the themes of the current States of Mind exhibition at Wellcome Collection in London, which explores the nature of consciousness and runs until 16 October 2016.
The drama imagines a time when advances in neuroscience have made interaction through VR and BCI not only possible but fluent. Even today, new research can decode some aspects of people's mental states by combining brain imaging with machine learning: 'brain reading'. The technologies of brain implantation and VR are advancing rapidly, and fourteen years provides a plausible horizon for when they could offer new opportunities for immersive interaction in VR without relying on the physical body.
Written by Jane Rogers
Directed by Nadia Molinari
Sound Design by Steve Brooke
Programme Consultant Anil Seth
At the edges of awareness
by Anil Seth
Professor of Cognitive and Computational Neuroscience and co-director of the Sackler Centre for Consciousness Science, University of Sussex
In Real Worlds, Jane Rogers takes us several years into the future. Communication with behaviourally unresponsive patients is now far advanced and is based on amazing developments in ‘virtual reality’. The clinical context for this drama is the ‘locked-in syndrome’ where a patient may have more-or-less normal conscious experiences but completely lack the ability to move. In Real Worlds, a locked-in patient transcends these limitations by controlling a virtual reality avatar directly using brain signals. These avatars inhabit virtual worlds in which the avatars of different people can interact, while the ‘real’ person behind each may remain hidden and unknown.
This drama deliberately inhabits the realm of science fiction, but there is solid science behind it too. The development of so-called ‘brain computer interfaces’ (BCI) is moving fast. These interfaces combine brain imaging methods (like EEG or fMRI, or sometimes more ‘invasive’ methods in which electrodes are inserted directly into the brain) with advanced machine learning methods to perform a kind of ‘brain-reading’. The idea is to infer, from brain activity alone, intended movements, perceptions, and perhaps even thoughts. These decoded ‘thoughts’ can then be used to control robotic devices, or virtual avatars. In some cases, a person’s own body might be controlled via direct stimulation of muscles. Progress in this area has been remarkably rapid. In a landmark but rather showy example, the Brazilian neuroscientist Miguel Nicolelis used a BCI to allow a paralysed person to ‘kick’ the first ball of the 2014 football World Cup, through brain-control of a robotic exoskeleton. More recently, brain-reading methods have allowed a paralysed man to play Guitar Hero for the first time since his injury.
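The decode-then-act pipeline described above can be sketched in miniature. The toy Python example below is purely illustrative (the simulated signal, the threshold decoder, and the command names are all invented for this sketch, not taken from any real BCI system): it simulates a noisy ‘brain signal’ for an imagined movement, decodes the intention with a simple classifier, and maps it to an avatar command.

```python
import random

def simulate_brain_signal(intention, n_samples=50):
    """Toy 'brain signal': samples around +1.0 when the user imagines moving
    left, around -1.0 for right, plus Gaussian noise. A real BCI would record
    EEG/fMRI features instead."""
    mean = 1.0 if intention == "left" else -1.0
    return [mean + random.gauss(0, 0.5) for _ in range(n_samples)]

def decode_intention(signal):
    """A minimal 'brain-reading' decoder: average the samples and threshold.
    Real systems use trained machine-learning classifiers on rich features."""
    avg = sum(signal) / len(signal)
    return "left" if avg > 0 else "right"

def avatar_command(intention):
    """Map a decoded intention to a virtual-avatar action."""
    return {"left": "step_left", "right": "step_right"}[intention]

if __name__ == "__main__":
    random.seed(0)  # deterministic noise for the demo
    signal = simulate_brain_signal("left")
    decoded = decode_intention(signal)
    print(decoded, avatar_command(decoded))
```

The design point is the separation of stages: signal acquisition, decoding, and actuation are independent, which is why the same decoded intention can drive a robotic device, a muscle stimulator, or a virtual avatar.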
The other technology highlighted in Real Worlds is virtual reality (VR), which – thanks to its enormous consumer potential – is developing even more rapidly. All the major technology and AI companies are getting in on the act, and VR headsets are finally becoming cheap enough, comfortable enough, and powerful enough to define a new technological landscape. Here at the Sackler Centre for Consciousness Science at the University of Sussex, we are exploring how VR can help shed light on our normal conscious experience. In one example, we use a method called ‘augmented reality’ (AR) to project a ‘virtual’ body into the real world as seen through a camera mounted on the front of a VR headset. This experiment revealed how our perception of what is (and what is not) our own body can be easily manipulated, indicating that our experience of ‘body ownership’, which is so easy to take for granted, is in fact continuously and actively generated by the brain. In a second example, we developed a method called ‘substitutional reality’ in which a VR headset is coupled with panoramic video and audio taken from a real environment, manipulated in various ways. The resulting experiences are much more immersive than current computer-generated virtual environments and in some cases people cannot distinguish them from actually ‘real’ environments.