As you read this sentence, your eyes are sending signals to your brain, and your brain is interpreting those signals to decipher words.
Research in Joel Zylberberg’s lab seeks to identify the language of this signalling: for example, which patterns of nerve impulses correspond to an “a” or an “e”? By deciphering this ‘neural code,’ Zylberberg’s work is leading to computer algorithms that mimic the mammalian visual system, and to implantable devices that stimulate a blind person’s brain to restore the ability to see.
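The idea of a neural code can be sketched with a toy decoder. In this illustration (all neurons, rates, and stimuli are invented for the example, not drawn from Zylberberg’s research), each stimulus evokes a characteristic firing-rate pattern across a small population of neurons, and a decoder recovers the stimulus whose template pattern best matches a noisy observed response:

```python
import random

# Toy "neural code": each stimulus evokes a characteristic mean firing-rate
# pattern (spikes/sec) across a four-neuron population. Numbers are invented.
TEMPLATES = {
    "a": [10.0, 2.0, 8.0, 1.0],
    "e": [2.0, 9.0, 1.0, 7.0],
}

def simulate_response(stimulus, noise_sd=1.0):
    """Return a noisy population response to the given stimulus."""
    return [rate + random.gauss(0.0, noise_sd) for rate in TEMPLATES[stimulus]]

def decode(response):
    """Return the stimulus whose template is closest in squared error."""
    return min(
        TEMPLATES,
        key=lambda s: sum((r - t) ** 2 for r, t in zip(response, TEMPLATES[s])),
    )

if __name__ == "__main__":
    random.seed(0)
    for stim in ("a", "e"):
        print(stim, "->", decode(simulate_response(stim)))
```

Real decoding problems involve thousands of neurons and correlated noise, which is precisely why characterizing the code (and how noise shapes it) is a research problem rather than a lookup table.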
Parallel to his work on visual signal processing, Zylberberg has ongoing projects to decipher the olfactory neural code, develop better cochlear implants, determine how learning via synaptic plasticity depends on behavioural context, and elucidate the role of synaptic plasticity in the maintenance of short-term memory networks.
Sloan Research Fellowship, 2017
Google Faculty Research Award, 2017
Howard Hughes Medical Institute (HHMI) International Student Research Fellowship, 2011
Fulbright Science and Technology PhD Fellowship, 2008
Zylberberg, J. et al. "Direction-selective circuits shape noise to ensure a precise population code." Neuron 89 (2016): 369–383.
King, P., J. Zylberberg, and M.R. DeWeese. "Inhibitory interneurons decorrelate excitatory cells to drive sparse code formation in a spiking model of V1." Journal of Neuroscience 33 (2013): 5475–5485.
Zylberberg, J., J. Murphy, and M.R. DeWeese. "A sparse coding model with synaptically local plasticity and spiking neurons can account for the diverse shapes of V1 simple cell receptive fields." PLoS Computational Biology 7 (2011): e1002250.