EEG in the Wild

by Eva Voinigescu Jun 28 / 17

What if we could track our brain activity the same way a smartwatch tracks our heart rate?

But instead of telling us whether we’re in the fat-burning zone, the data could help researchers learn how the brain makes decisions, or help doctors diagnose and treat diseases like Parkinson’s and mental health disorders like depression.

In fact, the technology has been around for over a century. Electroencephalography (EEG) allows us to record electrical activity in the brain non-invasively by placing electrodes on the scalp. Though EEG is typically used to diagnose conditions like epilepsy and coma, it’s also a useful tool for brain research. Now, new low-cost, lightweight EEG sets may hold the key to understanding the brain on a whole new level.

CIFAR Azrieli Global Scholar, Alona Fyshe

“The popularity of the Fitbit has shown how eager people are to measure their activity and their progress over time,” says Alona Fyshe, a member of the inaugural cohort of CIFAR Azrieli Global Scholars and CIFAR’s Azrieli Program in Brain, Mind & Consciousness. “Measuring brain activation at home on a regular basis, rather than only in controlled lab settings, has the potential to improve our understanding of the brain, as well as our ability to monitor and treat mental illness.”

“To date, most brain imaging research has been done in highly controlled lab situations that allow researchers to adjust stimuli and measure results. Consequently, our understanding of the brain is based on very unnatural settings that don’t accurately reflect our everyday lives,” says Craig Chapman, a fellow member of the CIFAR Azrieli Global Scholars and the Azrieli Program in Brain, Mind & Consciousness. The two researchers are behind a new project to push EEG research out of the lab and into the real world.

Their long-term goal is to enable people to collect EEG data as they go about their daily lives, giving us insights into how our sensorimotor system works, how meaning is represented in the brain, and how neural processes like subconscious decision-making play out. For the researchers, taking well-established sensorimotor and neural recording techniques like EEG outside of the controlled lab setting is an incredibly important step toward truly understanding the human experience.

But to get to that point, they must first prove that the new technology is up to the task. That’s where CIFAR comes in. Last year, CIFAR launched a new Catalyst fund to support and accelerate the process of collaboration among fellows. The fund designates money to help facilitate high-risk, interdisciplinary research collaborations across CIFAR’s membership.

CIFAR Azrieli Global Scholar, Craig Chapman

Chapman and Fyshe will use their Catalyst funds to finance trainees who specialize in EEG collection and analysis, and to purchase some of these new low-cost, lightweight EEG systems.

The idea for the project came up at a CIFAR meeting of the Azrieli Program in Brain, Mind & Consciousness last year. Fyshe, a computer scientist with a background in machine learning and computational linguistics, had been pursuing the idea of deploying consumer-grade EEG in the home in order to collect larger data sets quickly. Currently, brain imaging data sets are very small and require significant acquisition time per subject. Having larger data sets would allow Fyshe to employ machine learning techniques to mine the EEG data for meaningful patterns and useful insights about how the brain works.

“With at-home EEG we can have multiple sessions per person, as opposed to one in the lab. This creates hours and hours of data, and all of a sudden we have the amount of data we need to apply deep learning algorithms,” says Fyshe.
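Fyshe’s point about data volume can be made concrete with some back-of-the-envelope arithmetic. Every number below (sampling window, stride, session length and count) is an assumption for illustration, not a figure from the project:

```python
# Illustrative arithmetic only: window length, stride and session
# counts are assumptions, not parameters from the actual study.
WINDOW_S = 2   # length of one training example, in seconds
STRIDE_S = 1   # sliding-window step between examples, in seconds

def n_examples(minutes_recorded: int) -> int:
    """Count overlapping training windows in one recording."""
    total_s = minutes_recorded * 60
    if total_s < WINDOW_S:
        return 0
    return (total_s - WINDOW_S) // STRIDE_S + 1

# One hypothetical 30-minute lab session vs. thirty 30-minute
# at-home sessions from the same participant.
lab = n_examples(30)
home = 30 * n_examples(30)
print(lab, home)  # 1799 53970
```

Even under these toy assumptions, repeated at-home recording turns hundreds of training examples per person into tens of thousands, which is the regime where data-hungry deep learning methods start to pay off.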

Chapman, a cognitive neuroscientist, was also already using EEG in his research, integrating it with eye-tracking and motion-tracking to understand the unconscious brain processes that facilitate movement and decision making.

Because our decisions about movement often fail to reach our consciousness, studying them can be useful in helping us understand whether consciousness emerges instantaneously or gradually.

“Movement and thought are not separate; they are continuous, with movement dynamically reflecting ongoing thinking,” says Chapman. “This makes movement recording an exciting and powerful research tool for science and diagnostic tool for medicine.”

The two researchers decided they would work together to create and validate a new EEG recording and analysis technique using new consumer-grade EEG machines. But first they will have to confirm that these simplified EEG sets can capture the same quality of data as their lab counterparts.

To do this, Chapman and Fyshe will build on an experimental framework established in Chapman’s lab at the University of Alberta. This work examines functional movement tasks, like reaching for, grasping and moving a pasta box, and uses eye-tracking and motion capture to measure how neural processing unfolds during these movements. Now they will incorporate EEG measurement into the experiment.

What they’re looking for is what happens in the brain when eye and hand movements coordinate: in particular, the moment during reaching and grasping when the eyes shift from the object being grasped to the spot where it will be placed.

“We’re looking for whether the brain shows classic decision signals when someone ‘decides’ to move their eyes or hand,” says Chapman.  “In other work, we’ve isolated a clear buildup of brain activity immediately before someone presses a button to tell us which of two objects they see as brighter. Will we see this same buildup before the much more automatic shift of eye gaze? If we do, it will represent an important demonstration of shared and conserved brain processing across a diverse array of decision tasks.”

Once the EEG data is collected, Fyshe and her team at the University of Victoria will mine the EEG, eye-tracking and motion-capture data to determine what is going on in the brain during this planning and decision-making process. Fyshe and Chapman will also bring on consultants from CIFAR’s Learning in Machines & Brains program, including Joel Zylberberg (University of Colorado Denver), a fellow CIFAR Azrieli Global Scholar. The goal is to create a new data set for the wider machine learning community that includes sensorimotor data and labels key brain data as occurring at specific moments relative to an eye movement or reach movement.
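Labeling brain data relative to movement events usually means “epoching”: cutting the continuous recording into short windows centred on each event onset. The sketch below shows one minimal way this might look with NumPy; the channel count, window lengths and eye-movement sample indices are all made up for illustration and are not the project’s actual pipeline:

```python
import numpy as np

def epochs_around_events(eeg, event_samples, pre=64, post=192):
    """Slice a (channels, samples) EEG array into epochs of
    pre + post samples around each event onset, skipping events
    that fall too close to the recording's edges."""
    n_samples = eeg.shape[1]
    out = []
    for s in event_samples:
        if s - pre >= 0 and s + post <= n_samples:
            out.append(eeg[:, s - pre:s + post])
    if not out:
        return np.empty((0, eeg.shape[0], pre + post))
    return np.stack(out)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 2560))   # 8 channels, 10 s at 256 Hz (assumed)
saccade_onsets = [300, 1200, 2550]     # hypothetical eye-movement onsets
X = epochs_around_events(eeg, saccade_onsets)
print(X.shape)  # (2, 8, 256): the last event is too close to the end
```

Each resulting epoch can then carry a label such as “saccade onset” or “reach onset,” which is the event-locked structure a machine learning model needs in order to look for decision signals that precede the movement.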

The research will be conducted with professional-grade EEG machines and the new low-cost EEG systems concurrently, in order to verify whether they provide a similar quality of data. If they do, the research could have important implications for neuroscience, computer science and medicine.

“Proving that we can send these devices home with people and that they can collect good data could change the way we do medical diagnosis,” says Fyshe. 

One such potential change is the ability to monitor the progression of Parkinson’s in individual patients and measure the efficacy of treatments at a level of detail not possible within the limitations of today’s healthcare system. It could also help psychiatrists monitor conditions like depression, where it can take months to know whether a medication is working. EEG may let us see drug-related brain changes before patients notice them.

Though the science still has a long way to go before such potential impacts come to life, Chapman and Fyshe’s collaboration is a perfect example of the big-picture ideas CIFAR is trying to encourage with its programs and Catalyst funding.

“Research is the ultimate collaborative endeavor and having the opportunity to work with some of the smartest people in the world on the topic of consciousness will enormously benefit the impact and quality of my work,” says Chapman.