Sensory Signal Processing
Retinal responses to natural inputs
The research in my lab focuses on sensory signal processing, particularly in cases where sensory systems perform at or near the limits imposed by physics. Photon counting in the visual system is a beautiful example. At its peak sensitivity, the performance of the visual system is limited largely by the division of light into discrete photons. This observation has several implications for phototransduction and signal processing in the retina: rod photoreceptors must transduce single-photon absorptions with high fidelity; single-photon signals in photoreceptors, which are only 0.03–0.1 mV, must be reliably transmitted to second-order cells in the retina; and absorption of a single photon by a single rod must produce a noticeable change in the pattern of action potentials sent from the eye to the brain. My approach is to combine quantitative physiological experiments with theory to understand photon counting in terms of basic biophysical mechanisms.

Fortunately, there is more to visual perception than counting photons. The visual system is remarkably adept at operating over a wide range of light intensities (about 12 orders of magnitude). Over most of this range, vision is mediated by cone photoreceptors, so adaptation is paramount to cone vision. Again, one would like to understand quantitatively how the biophysical mechanisms involved in phototransduction, synaptic transmission, and neural coding contribute to adaptation.
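To make the physical limit concrete, the minimal sketch below simulates Poisson photon arrivals, the statistics that bound detection for even an ideal photon counter. This is an illustration only, not an analysis from the work described above; the mean photon count and trial number are assumed values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not from the abstract):
mean_photons = 5.0   # mean photons absorbed per dim flash
n_trials = 100_000   # simulated flash presentations

# Photon absorption is Poisson: even a nominally fixed flash
# delivers a variable number of photons from trial to trial.
counts = rng.poisson(mean_photons, size=n_trials)

# For an ideal photon counter, misses are set purely by the
# probability that zero photons are absorbed: P(0) = exp(-mean).
p_miss_theory = np.exp(-mean_photons)
p_miss_sim = np.mean(counts == 0)
print(f"P(no photons absorbed), theory:    {p_miss_theory:.4f}")
print(f"P(no photons absorbed), simulated: {p_miss_sim:.4f}")

# Poisson variance equals the mean, so the signal-to-noise ratio
# of the photon count grows only as sqrt(mean count).
snr = counts.mean() / counts.std()
print(f"SNR of photon count: {snr:.2f} "
      f"(theory: sqrt({mean_photons}) = {np.sqrt(mean_photons):.2f})")
```

Because Poisson variance equals the mean, the signal-to-noise ratio of any photon-counting detector grows only as the square root of the number of photons absorbed, which is why high-fidelity transduction and transmission of single-photon signals matter so much at the lowest light levels.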
The Dark Side of Vision: Resolving the Neural Code
All sensory information, such as what we see, hear, and smell, is encoded in spike trains by sensory neurons and sent to the brain. Because of the complexity of neural circuits and the difficulty of quantifying complex animal behavior, it has been exceedingly hard to resolve how the brain decodes these spike trains to drive behavior. We now measure the quantal signals that sparse photons elicit in the most sensitive neural circuits of the mammalian retina and correlate the retinal output spike trains with precisely quantified behavioral decisions. We combine electrophysiological measurements from the most sensitive ON and OFF retinal ganglion cell types with a novel deep-learning-based technology for tracking the head and body positions of freely moving mice. We show that visually guided behavior relies on information from the retinal ON pathway for the dimmest light increments and on information from the retinal OFF pathway for the dimmest light decrements ("quantal shadows"). Our results show that the division of labor between the ON and OFF pathways begins already at starlight, supporting distinct pathway-specific visual computations that drive visually guided behavior. These results have fundamental consequences both for understanding how the brain integrates information across parallel information streams and for understanding the limits of sensory signal processing. In my talk, I will discuss some of the most significant consequences, including the extension of this "Quantum Behavior" paradigm from mouse vision to the monkey and human visual systems.
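As a toy illustration of this division of labor (not the analysis used in the work above; the Poisson spike statistics and all parameter values are assumptions made for the example), the sketch below models ON and OFF ganglion cell outputs as spike counts and decodes increment versus decrement from whichever pathway fires more:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (assumed) parameters: baseline and stimulus-driven
# spike counts per trial for ON and OFF ganglion cells.
baseline = 2.0   # maintained spike count per counting window
delta = 1.5      # extra mean count driven by a dim stimulus
n_trials = 50_000

def trial_counts(stim):
    """Simulate one pathway pair per trial: a dim increment raises
    ON firing, a dim decrement ("quantal shadow") raises OFF firing."""
    on_mean = baseline + (delta if stim == "increment" else 0.0)
    off_mean = baseline + (delta if stim == "decrement" else 0.0)
    return rng.poisson(on_mean, n_trials), rng.poisson(off_mean, n_trials)

# A toy decoder: report "increment" when the ON pathway outfires
# the OFF pathway, "decrement" otherwise; guess at random on ties.
for stim in ("increment", "decrement"):
    on, off = trial_counts(stim)
    guess_inc = np.where(on == off, rng.random(n_trials) < 0.5, on > off)
    correct = guess_inc if stim == "increment" else ~guess_inc
    print(f"{stim}: fraction correct = {correct.mean():.3f}")
```

Even this crude decoder captures the qualitative point: when the dimmest increments drive only the ON pathway and the dimmest decrements only the OFF pathway, comparing the two pathways recovers the stimulus well above chance.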