Visual Context
Context-dependent motion processing in the retina
A critical function of sensory systems is to reliably extract ethologically relevant features from the complex natural environment. A classic model for studying feature detection is the direction-selective circuit of the mammalian retina. In this talk, I will discuss our recent work on how visual context dynamically influences the neural processing of motion signals in the direction-selective circuit of the mouse retina.
Feedback controls what we see
We hardly notice when there is a speck on our glasses; the obstructed visual information seems to be magically filled in. The visual system uses visual context to predict the content of the stimulus. What enables neurons in the visual system to respond to context when the stimulus is not available? In cortex, sensory processing is based on a combination of feedforward information arriving from sensory organs and feedback information that originates in higher-order areas. Whereas feedforward information drives activity in cortex, feedback information is thought to provide contextual signals that are merely modulatory. We have made the exciting discovery that mouse primary visual cortical neurons are strongly driven by feedback projections from higher visual areas, in particular when their feedforward sensory input from the retina is missing. This drive is so strong that it makes visual cortical neurons fire as much as if they were receiving direct sensory input.
Model-guided discovery of a retinal chromatic feature detector that signals visual context changes
COSYNE 2023