The average human blinks every 5 seconds, closing the eyelid and blocking vision completely. Over a lifetime, these blinks amount to thousands of waking hours spent with our eyes closed, but thanks to our complex visual system these automatic blinks remain unregistered in our conscious experience.
“We tend to think that we see because we open our eyes and capture the world around us, but spontaneous blinks indicate to us that this is naïve thinking. We see because the visual system is busy creating an image,” says Rafi Malach (Weizmann Institute of Science), a senior fellow in CIFAR’s Brain, Mind & Consciousness program.
The ability of humans to perceive the world around us as a stable continuum, despite the constant interruptions of blinking and eye movements, is a powerful example of the difference between the information our eyes take in and what our brains perceive. Malach is one of the researchers working to understand the neural mechanisms responsible for our stable, continuous experience of the visual world.
“A major part of our conscious experience is around vision,” says Malach. “We are very visual animals and the richness and vividness of vision is very powerful in humans.”
For Malach, whose lab focuses on understanding human vision, the visual system presents a great model for understanding human consciousness. It allows researchers to easily manipulate external visual input to demonstrate the dissociation between the information eyes receive and the resulting conscious experience.
Data from two recent papers published by Malach’s lab suggest that perception is actually linked to the higher-order visual areas of the brain, rather than to the early visual cortex, where visual information is first processed.
“The perceptual, the conscious awareness state, does not appear to ‘read’ directly what’s happening in the early visual cortex,” says Malach. “Events that happen in early visual cortex do not seem to be crossing the awareness threshold of the individual.”
The first of the two papers, published in eLife, demonstrates this by examining the neural response of the visual system to spontaneous blinks, voluntary blinks and externally generated retinal interruptions (such as turning off the lights in a dark room or, for the purposes of this experiment, showing subjects a blank video frame). The research was led by Tal Golan in Malach’s group, in collaboration with the group of Prof. Ashesh Mehta from the Feinstein Institute for Medical Research, Manhasset.
In the experiment, patients already undergoing intracranial electrocorticographic evaluation for intractable epilepsy were presented with visual stimuli in the form of grayscale photographs of faces and non-face images. Every 10 seconds a gray screen was displayed to serve as a baseline measure. Blank video frames acted as a gap in visual input. Electrodes already implanted in the patients’ visual system allowed Golan to measure the high-frequency broadband (HFB) power envelope responses of neurons in those areas, while patients’ voluntary and spontaneous blinks were simultaneously recorded.
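The HFB power envelope is a standard quantity in intracranial recordings. As a rough illustration of one common way such an envelope is estimated (band-pass filtering in a high-gamma range, then taking the amplitude of the Hilbert analytic signal), here is a minimal sketch; the frequency band, filter order, sampling rate and test signal are illustrative assumptions, not the paper’s exact pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def hfb_envelope(trace, fs, band=(70.0, 150.0)):
    """Estimate a high-frequency broadband (HFB) power envelope.

    trace: 1-D voltage trace from a single electrode
    fs:    sampling rate in Hz
    band:  assumed high-gamma range; real pipelines vary
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, trace)      # zero-phase band-pass
    analytic = hilbert(filtered)          # analytic signal
    return np.abs(analytic) ** 2          # instantaneous power

# Toy usage: a 100 Hz burst in the middle of a slow background
# oscillation should raise the envelope only during the burst.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 10 * t)                          # background
trace[500:1500] += 0.5 * np.sin(2 * np.pi * 100 * t[500:1500])
env = hfb_envelope(trace, fs)
```

In this toy example the envelope is markedly higher inside the burst window than outside it, which is the kind of stimulus-driven HFB increase the electrodes pick up.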
The results showed that in the early visual cortex, the disappearance of visual stimuli caused a similar neuronal response whether it resulted from an invisible spontaneous blink or from a blank video frame. Both caused an initial drop in HFB activity upon the disappearance of the stimuli and a positive overshoot beyond baseline levels upon reappearance (when the eyes re-opened or the image reappeared on screen).
“You get the same signals in the early visual cortex for something you are perfectly aware of and something that you are completely oblivious to. The conclusion I draw from it is that these early visual activities are definitely not being read into perceptual awareness,” says Malach.
However, as the information moves up the visual hierarchy to the high-order visual cortex, the post-interruption overshoot of activity subsides for blinks but not for gaps. Malach says these results contradict existing hypotheses that our brain “fills in” the missing information when we blink. Instead, he says, the data suggest that we do not see blinks because a suppression mechanism blocks the interruption-induced positive bursts of neuronal activity that would otherwise signal that an optical interruption is taking place.
Though the origin of such a suppression signal is still unknown, one possibility is that a copy of the motor command to blink is sent in parallel to the visual cortex, telling it to suppress the representation of the blink. When the visual information is interrupted by an external cause (such as a blank video frame), no copy command is sent, the optical interruption is not suppressed, and we experience a flicker.
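The efference-copy idea can be captured in a deliberately simple toy model: the same optical interruption produces a post-interruption overshoot unless a parallel “suppress” command arrives. This hypothetical sketch is only a restatement of the logic above, not the paper’s model or any fitted quantity:

```python
def high_order_overshoot(interruption, efference_copy):
    """Toy model of the proposed suppression mechanism.

    interruption:   True if retinal input was briefly cut off
    efference_copy: True if a copy of the blink motor command
                    reached the visual cortex in parallel
    Returns the post-interruption overshoot (arbitrary units).
    """
    if not interruption:
        return 0.0
    # An efference copy suppresses the overshoot, so the blink never
    # crosses the awareness threshold; without it, the burst survives
    # and is experienced as a visible flicker.
    return 0.0 if efference_copy else 1.0
```

Under this toy logic, a spontaneous blink (`interruption=True, efference_copy=True`) yields no overshoot, while an externally generated gap (`interruption=True, efference_copy=False`) does.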
Malach’s second study, led by Ella Podvalny in collaboration with the group of Prof. Ashesh Mehta, and published in Current Biology, extended the study of the neural mechanisms of perceptual stability to real-world, ecological conditions. Studying vision under natural conditions is important because during natural viewing the visual system operates in a substantially different manner than under artificial laboratory conditions. For example, during natural vision humans continuously sample the dynamic environment with constant eye movements. The neuronal mechanism that allows us to maintain a stable perception under these conditions is unknown, but Malach’s research suggests higher-order visual areas play a role.
“The reason for the second paper was to take the research out of the lab and into the real world and look at how perception and sensory input are related to each other in real life,” says Malach. His lab is one of a few pushing to explore the human brain embedded in natural settings.
While visual system research has begun to move away from the highly artificial viewing conditions towards conditions that more closely resemble how we interact with the world on a daily basis, a direct comparison of neural responses to naturalistic stimuli versus controlled lab tests hadn’t been undertaken.
Podvalny and Malach designed a paradigm to compare recordings taken in real-life visual environments with recordings of the same patients under controlled lab conditions. In the lab, images were flashed briefly to subjects instructed to hold a central visual fixation. For natural viewing conditions, subjects were fitted with a mobile recording setup, including glasses that recorded the subjects’ eye movements and the visual scene in front of them.
“This is a very powerful way of studying natural behaviour because you are not constraining the patient… the patient is exploring the environment at will, so we can see a more naturalistic behaviour of the visual system,” says Malach.
The results showed that in early visual areas the neuronal response corresponds with the physical properties of the visual input, meaning that if you look at the target longer, you get a longer response in the brain. This reaction was observed in both lab and natural viewing conditions.
However, in higher-order visual areas that deal with more abstract vision, there was a complete dissociation between how long the person looked at an image and the neuronal response in these areas. Neurons in these areas give a burst or “ignition” of activity whose length is independent of how long you look at the image in question.
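The dissociation can be illustrated with a simple schematic: an early-area response whose duration tracks fixation duration, versus a high-order “ignition” of fixed length. The burst duration and time step here are arbitrary assumptions for illustration, not measured values:

```python
import numpy as np

def early_response(fixation_ms, dt_ms=1):
    """Early visual cortex (schematic): activity lasts as long as
    the stimulus is fixated."""
    return np.ones(int(fixation_ms / dt_ms))

def ignition_response(fixation_ms, burst_ms=300, dt_ms=1):
    """High-order areas (schematic): a fixed-length 'ignition'
    burst, then silence, regardless of further viewing."""
    n = int(fixation_ms / dt_ms)
    burst = int(min(burst_ms, fixation_ms) / dt_ms)
    resp = np.zeros(n)
    resp[:burst] = 1.0
    return resp

# Doubling fixation time doubles the early response's total
# activity but leaves the high-order burst unchanged.
short_early, long_early = early_response(300), early_response(600)
short_ign, long_ign = ignition_response(300), ignition_response(600)
```

Comparing the summed activity of the short and long fixations makes the point: the early trace scales with viewing time, the ignition trace does not.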
Malach says this indicates these areas may be responsible for recognition. Once the target has been identified and labeled, they stop processing the information.
“The brain makes a great attempt to get rid of the external parameters and to reach a consistent solution,” says Malach. “And that makes sense because you want stability.”
A member of CIFAR’s Brain, Mind & Consciousness program since January 2016, Malach finds it exciting to be part of an interdisciplinary group that shares a common passion for understanding conscious awareness.
“If you look at the history of brain research, and then science in general, the most advanced and creative solutions always came when people mixed disciplines.”