Joel Zylberberg

Appointment

  • Associate Fellow
  • Learning in Machines & Brains

Institution

  • York University, Centre for Vision Research

Country

  • Canada

Education

PhD (Physics), University of California, Berkeley
BSc (Physics), Simon Fraser University

About

Visual stimuli elicit action potentials in the retina that propagate to the brain, where they elicit further action potentials.

What is the nature of this visual representation? In other words, what is the correspondence between patterns of action potentials in the brain and the stimuli that caused them? Moreover, how are these representations learned through experience? Finally, can we "upload" the brain's visual representations into computers, and if so, how might they make artificial intelligence systems better? These are the central themes of my research program.

Awards

Sloan Research Fellowship, 2017

Google Faculty Research Award, 2017

Howard Hughes Medical Institute (HHMI) International Student Research Fellowship, 2011

Fulbright Science and Technology PhD Fellowship, 2008

Relevant Publications

W. Kindel, E. Christensen, and J. Zylberberg (2019). Using deep learning to probe the neural code for images in primary visual cortex. Journal of Vision 19: 29.

J.A. Pruszynski and J. Zylberberg (2019). The language of the brain: real-world neural population codes. Current Opinion in Neurobiology 58: 30.

J. Zylberberg and B. Strowbridge (2017). Mechanisms of persistent activity in cortical circuits: possible neural substrates for working memory. Annual Review of Neuroscience 40: 603-627.

J. Zylberberg, J.T. Murphy, and M.R. DeWeese (2011). A sparse coding model with synaptically local plasticity and spiking neurons can account for the diverse shapes of V1 simple cell receptive fields. PLoS Computational Biology 7: e1002250.

Connect

Zylberberg Lab