
Neuroscience in XR: Driving Immersive Reality

by Johnny Kung, July 2, 2020

Extended reality (XR) technologies, including virtual reality (VR) and augmented reality (AR), leverage our sensory systems and brain-body interactions to immerse users in augmented and virtual environments. Frontier research in neuroscience and cognitive science holds immense potential for tackling the challenges that limit deeper immersion. At the same time, advances in XR are giving researchers new tools to further probe the workings of our brain and mind.

On May 25-26, 2020, CIFAR convened a virtual roundtable that brought together Fellows in CIFAR’s Azrieli Brain, Mind & Consciousness program with other international experts from academia and the XR industry. Through short presentations and facilitated discussion, the workshop explored research with direct implications for addressing some of the major bottlenecks to more immersive XR experiences, as well as how XR can lead to a better understanding of human consciousness and brain-body interactions. This was the third in a series of engagements to expand the societal impact of CIFAR’s Brain, Mind & Consciousness program, bringing together the neuro-/cognitive science and XR communities in a dialogue aimed at identifying actionable, collaborative opportunities for advances that neither academia nor industry could achieve alone.

Impacted Stakeholders

  • Researchers in neuroscience, cognitive science, philosophy, ethics and law

  • Developers of neuroimaging tools and XR technologies

  • Designers, engineers, content creators and legal counsels in XR gaming and film industries

  • Practitioners interested in the use of XR for education, psychotherapy, community building, and other applications

Key Insights

  • Compared to the much richer environment of the real world, setups in traditional psychology labs are limited, even if they are useful for learning certain things about the mind. By letting researchers add richness to the (virtual) environment, XR tools allow for more “realistic” experimental paradigms. Research that makes use of XR can thus provide fundamental insights into the workings of the brain and mind, while at the same time revealing what makes XR experiences immersive. For example, when subjects interact with virtual objects, slight differences in the timing (or contingency) of actions and their visual consequences, but not in their direction (or congruency), affect how quickly subjects become consciously aware of the objects. However, there is a trade-off between allowing more richness and flexibility (e.g., the ability of the subject to look around the virtual environment) and the variability of the observed effect (since it becomes harder to control what the subject is looking at).

  • Using virtual environments as experimental contexts, researchers have demonstrated the role of movement and action in learning and memory. For example, subjects randomly assigned to play first-person shooter games showed improvements on a number of cognitive tasks, such as attention and spatial cognition, compared to those assigned to non-action games (such as Tetris). In another study, subjects who could place objects in different locations as they navigated a virtual environment showed enhanced memory when recalling the objects, compared to those who could only observe the objects during navigation. Finally, in an experiment combining neuroimaging with navigation of a virtual city, researchers showed that both spatial and temporal distances between events shape the “event map” in a subject’s memory and its neural encoding.

  • Embodiment, in the sense of “knowing” one’s own body in terms of agency (“when I move, this moves”), location (“this is right here”) and ownership (“this is mine and no one else’s”), is a “controlled hallucination” shaped by exteroception (senses of the external environment, such as vision, hearing, smell and touch) and interoception (senses of the internal state of the body, e.g., the sense of body position (proprioception), the sense of balance and spatial orientation provided by the vestibular system, and the perception of one’s heartbeat). Expert users of tools, vehicles or sports equipment can often achieve millimetre precision with what essentially become “extensions” of their bodies, even though these objects themselves provide no direct sensory information. VR equipment, combined with other tools such as motion- and eye-tracking, can play an important role in understanding how embodiment occurs, with relevance for fields such as prosthetic limbs.

  • XR experiences that exploit the effects of embodiment are finding potential applications in fields as diverse as diplomacy and conflict resolution, restorative justice and psychotherapy, and education and environmentalism. For example, high-level diplomats and policymakers can be given an “on-the-ground” VR experience of war-torn regions, in the hope of putting a “human face” on the decisions they make about recovery and peacebuilding. “Body swapping” exercises with individuals from other communities, or with different life experiences, aim to build empathy, address biases and promote conflict resolution, while body swapping with a virtual psychologist to hold a self-dialogue may be useful for psychological counselling. Some XR experiences even try to embody the user in non-human animals or other organisms nothing like humans (such as trees or fungal mycelia), with the goal of helping users better connect with the rest of the ecosystem; scientists might similarly embody the model organisms most commonly used in neuroscience experiments (such as flies and mice).

  • VR may not need to be completely “realistic” for an experience or illusion to be immersive, because humans do not always experience reality “correctly”. Our brain does not experience sensory data “as they are”; instead, it samples certain input from the environment to make predictions and interpretations, forming a model of the environment based on expectations. For example, when blinders are added to the virtual field of view to block side vision, subjects do not necessarily notice a difference, much as drivers rarely notice the narrowness of their focused vision. More studies will be needed on what must be included in, or can be excluded from, virtual experiences, such as our nose in our field of view (normally edited out by our brain), or the use of visual “glitches” to simulate eye blinks. A VR experience may not even need to be completely “believable” to be immersive: given the right sensory cues, the desired behavioural response may be triggered regardless of whether the user believes the experience to be “real”. Indeed, it may be ethically preferable for VR experiences never to feel fully realistic, in order to maintain a “firewall” between the real world and virtual reality.

  • An emerging area of research and development is to bring together XR and artificial intelligence (AI). Training AI algorithms in virtual environments may help create AI applications that work better in the real world (e.g., using VR models created from 3D full-body scans of patients to train AI programs for robotic prosthetic limbs).

Priorities and Next Steps

  • As XR systems become ever more complex and immersive, the line between play-tests and neurocognitive experiments will be increasingly blurred. This presents an opportunity for scientists to collaborate more closely with XR developers and content creators, both in creating more engaging XR experiences that allow better storytelling, and in designing more creative and realistic scientific experiments. The large amount of data generated by industry can play an important role in advancing scientific understanding, provided there are proper safeguards for privacy and research ethics.

  • Beyond improvements in visual and auditory signals, increasingly immersive virtual experiences will require input from other senses. The sense of balance, spatial orientation and motion provided by the vestibular system is particularly important, as a disconnect between visuals of movement and a lack of vestibular feedback can cause motion sickness and a break in immersion. Improvements in hardware can increase immersion, e.g., wireless and less bulky headsets, as well as more accurate and better-integrated eye-tracking to follow where the user’s eyes are focused. For vestibular feedback, researchers and industry are exploring devices that induce vibrations or apply electrical stimulation; alternatively, it may be possible to “overwhelm” the brain with noisy vestibular input so that it ignores the signals and fills in its own vestibular inferences.

  • Some studies show that embodiment and other subjective experiences reported in experimental settings may be affected by a trait known as phenomenological control. This trait, which varies between individuals, leads one to respond to implicit suggestions within a given context by generating a genuine subjective experience (and its associated neurological responses) that meets the expectations of the context (e.g., “I am supposed to feel like I have embodied a rubber hand” in an experiment involving the rubber hand illusion). More research will be needed to determine how individuals’ varying suggestibility affects the conclusions drawn from cognitive science research. Phenomenological control also has implications for how easily different users find XR experiences immersive.

  • An area that merits further exploration is the relationship between XR and social experiences. While currently XR is often a solitary activity, there is increasing interest in potential multi-user applications, e.g., shared musical experiences, or games where multiple players need to cooperate to solve puzzles. Further interdisciplinary conversations may lead to a deeper understanding of the psychology and neuroscience of collective experiences, as well as the creation of more authentic and immersive social experiences in XR.

  • There is a need for more research on how XR use influences cognitive development in children, particularly with regard to suggestibility, self-other awareness and the sense of reality, given the increasing number of games and other XR experiences aimed at children, who are likely to begin using such devices at ever earlier ages. Recent work has shown that children more readily report a sense of agency and embodiment in in-game avatars, despite deviations and asynchrony.

  • More broadly, it is important for researchers and XR developers to work closely with social scientists and philosophers to begin anticipating and addressing potential ethical and legal issues that may arise with broader deployment of XR. A number of questions deserve urgent attention, such as: What are the physiological, psychological and social consequences of long-term use of immersive XR experiences? How do interactions with other people’s avatars in XR affect interactions in real life? What are the implications of interacting with increasingly realistic AI-driven avatars? How do we prevent the use of XR experiences, designed to be immersive and persuasive, for propaganda or other harmful purposes? And would XR usage lead to a broader acceptance that humans do not perceive the world “as is” and that consciousness is itself a “biologically grounded VR”, and if so what would be the sociocultural and political consequences?

Roundtable Participants

  • Craig Chapman, Associate Professor, University of Alberta / Azrieli Global Scholar, Azrieli Brain, Mind & Consciousness program, CIFAR

  • John Cumming, SVP of Product & Technology, Secret Location

  • Pietro Gagliano, Founder and Creative Director, Transitional Forms

  • Melvyn Goodale, Professor and Canada Research Chair in Visual Neuroscience, Western University / Ivey Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Walter Greenleaf, Visiting Scholar, Virtual Human Interaction Lab, Stanford University

  • Atsushi Iriki, Team Leader of the Laboratory for Symbolic Cognitive Development, RIKEN Center for Biosystems Dynamics Research / Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Philipp Kellmeyer, Leader of the Neuroethics and AI Ethics Lab, University Medical Center Freiburg

  • Robert Kentridge, Professor, Durham University / Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Peter Lush, Research Fellow, University of Sussex

  • Alberto Mariola, Graduate Student, University of Sussex

  • Daanish Masood, VR Artist and Researcher, BeAnotherLab / Co-lead of Innovation Team, UN Department of Political and Peacebuilding Affairs

  • Thomas Metzinger, Professor, Johannes Gutenberg University Mainz

  • Brandon Oldenburg, Chief Creative Officer, Flight School Studio

  • Adrian Owen, Professor and former Canada Excellence Research Chair in Cognitive Neuroscience and Imaging, Western University / Co-director and Koerner Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Olivier Palmieri, Director of L’Atelier XR and Game Director, Ubisoft Montreal

  • Aniruddh Patel, Professor, Tufts University / Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Brian Schwab, Former Director of Interaction Lab, Magic Leap

  • Anil Seth, Professor and Co-director of the Sackler Centre for Consciousness Science, University of Sussex / Co-director, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Mel Slater, Distinguished Investigator and Co-director of Event Lab, University of Barcelona

  • Barnaby Steel, Founder and Director, Marshmallow Laser Feast

  • Keisuke Suzuki, Research Fellow, University of Sussex

  • Laurel Trainor, Professor, McMaster University / Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

  • Nicholas Turk-Browne, Professor, Yale University / Fellow, Azrieli Brain, Mind & Consciousness program, CIFAR

Further Reading

CIFAR resources:

The Future of Neuroscience and VR (event brief)

2018 Game Developers Conference panel - The Future of VR: Neuroscience and Biosensor Driven Development (event brief)

Opening the (virtual) doors of perception (research brief)

Is today’s artificial intelligence actually conscious? Not just yet (research brief)


Other resources:

Virtual Reality: Ethical Challenges and Dangers, by Ben Kenwright

Phenomenological control: response to imaginative suggestion predicts measures of mirror touch synaesthesia, vicarious pain and the rubber hand illusion, by Peter Lush et al.

Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology, by Michael Madary and Thomas Metzinger

What can virtual reality tell us about real-world psychology? by David Matthews

The Neuroscience of Reality, by Anil Seth

Sensorimotor contingency modulates breakthrough of virtual 3D objects during a breaking continuous flash suppression paradigm, by Keisuke Suzuki et al.

The Ethics of Realism in Virtual and Augmented Reality, by Mel Slater et al.

For more information, contact
Amy Cook
Senior Director, Knowledge Mobilization
CIFAR
amy.cook@cifar.ca