Prey Capture
How fly neurons compute the direction of visual motion
Detecting the direction of image motion is important for visual navigation, predator avoidance and prey capture, and thus essential for the survival of all animals that have eyes. However, the direction of motion is not explicitly represented at the level of the photoreceptors; rather, it must be computed by subsequent neural circuits through a comparison of the signals from neighboring photoreceptors over time. The exact nature of this process is a classic example of neural computation and has been a longstanding question in the field. In recent years, much progress has been made in the fruit fly Drosophila melanogaster by genetically targeting individual neuron types to block, activate or record from them. The results obtained this way demonstrate that the local direction of motion is computed in two parallel ON and OFF pathways. Within each pathway, a retinotopic array of four direction-selective T4 (ON) and T5 (OFF) cells represents the four Cartesian components of local motion vectors (leftward, rightward, upward, downward). Since none of the presynaptic neurons is directionally selective, direction selectivity first emerges in T4 and T5 cells. Our present research focuses on the cellular and biophysical mechanisms by which the direction of image motion is computed in these neurons.
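The comparison of neighboring photoreceptor signals over time has a classic textbook formalization: the Hassenstein-Reichardt correlator. The sketch below illustrates that general model only, not the specific T4/T5 circuit mechanism studied in the talk; the signal shapes and delay are invented for illustration.

```python
import numpy as np

def reichardt_detector(left, right, tau=1):
    """Hassenstein-Reichardt correlator: each half-detector multiplies
    one input with a delayed copy of its neighbor; subtracting the two
    mirror-symmetric halves gives a signed direction estimate
    (positive = motion from 'left' toward 'right')."""
    # a pure delay of tau samples stands in for the low-pass temporal filter
    delayed_left = np.roll(left, tau)
    delayed_right = np.roll(right, tau)
    delayed_left[:tau] = 0.0
    delayed_right[:tau] = 0.0
    return float(np.mean(delayed_left * right - left * delayed_right))

# a brightness edge sweeping rightward: the right receptor sees it one
# sample after the left receptor does
t = np.arange(50)
left = (t > 10).astype(float)
right = np.roll(left, 1)
right[0] = 0.0
print(reichardt_detector(left, right))   # positive -> rightward
print(reichardt_detector(right, left))   # negative -> leftward
```

Note that neither input line is direction selective on its own; the selectivity arises only from the delay-and-multiply comparison, mirroring the finding that direction selectivity first emerges in T4 and T5.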
Vision for escape and pursuit
We want to understand how the visual system detects and tracks salient stimuli in the environment to initiate and guide specific behaviors (i.e., visual neuroethology). Predator avoidance and prey capture are central selection pressures of animal evolution. Mice use vision to detect aerial predators and hunt insects. I will discuss studies from my group that identify specific circuits and pathways in the early visual system (i.e., the retina and its subcortical targets) mediating predator avoidance and prey capture in mice. Our results highlight the importance of subcellular visual processing in the retina and the alignment of viewing strategies with region- and cell-type-specific retinal ganglion cell projection patterns to the brain.
Stereo vision and prey detection in the praying mantis
Praying mantises are the only insects known to have stereo vision. We used a comparative approach to determine how the mechanisms underlying stereopsis in mantises differ from those underlying primate stereo vision. By testing mantises with virtual 3D targets, we showed that mantis stereopsis enables prey capture in complex scenes yet relies on different mechanisms than primate stereopsis. My talk will further discuss how stereopsis combines with second-order motion perception to enable the detection of camouflaged prey by mantises. The talk will highlight the benefits of a comparative approach towards understanding visual cognition.
Cones with character: An in vivo circuit implementation of efficient coding
In this talk I will summarize some of our recent unpublished work on spectral coding in the larval zebrafish retina. Combining two-photon imaging, hyperspectral stimulation, computational modeling and connectomics, we take a renewed look at the spectral tuning of cone photoreceptors in the live eye. We find that already at the level of the cones, natural colour space is optimally rotated in a PCA-like fashion to disambiguate greyscale from "colour" information. We then follow this signal through the retinal layers and ultimately into the brain to explore the major spectral computations performed by the visual system at its consecutive stages. We find that, by and large, zebrafish colour vision can be broken into three major spectral zones: long-wavelength greyscale-like vision, short-wavelength prey-capture circuits, and spectrally diverse mid-wavelength circuits that possibly support the bulk of "true colour vision" in this tetrachromatic vertebrate.
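The "PCA-like rotation" of colour space can be illustrated with a toy simulation; this is a sketch of the general principle, not the authors' analysis pipeline, and all numbers are invented. When overall intensity varies more than hue, responses of different cone types are strongly correlated, and the principal axes of the response distribution are an achromatic sum and a colour-opponent difference.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "natural" stimuli: overall intensity varies much more than hue,
# so the responses of two cone types are strongly correlated
intensity = rng.normal(0.0, 1.0, 1000)   # shared greyscale signal
hue = rng.normal(0.0, 0.2, 1000)         # weaker chromatic signal
long_cone = intensity + hue
short_cone = intensity - hue
X = np.column_stack([long_cone, short_cone])

# PCA via eigendecomposition of the response covariance matrix
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # dominant axis
pc2 = eigvecs[:, 0]                      # minor axis

# PC1 weights share a sign (luminance-like sum); PC2 weights oppose
# (colour-opponent difference): the rotation that separates greyscale
# from "colour" information
print(np.sign(pc1[0]) == np.sign(pc1[1]))   # achromatic axis
print(np.sign(pc2[0]) != np.sign(pc2[1]))   # opponent axis
```

In this toy setting the rotation decorrelates the two channels, which is the efficient-coding intuition behind reading the cone transformation as PCA-like.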
Motion processing across visual field locations in zebrafish
Animals are able to perceive self-motion and navigate in their environment using optic flow information. They often perform visually guided stabilization behaviors such as the optokinetic response (OKR) or optomotor response (OMR) in order to maintain their eye and body position relative to the moving surround. But how does the animal produce the appropriate behavioral response, and how are processing tasks divided between the various non-cortical visual brain areas? Experiments have shown that the zebrafish pretectum, which is homologous to the mammalian accessory optic system, is involved in the OKR and OMR. The optic tectum (superior colliculus in mammals) is involved in the processing of small stimuli, e.g. during prey capture. We have previously shown that many pretectal neurons respond selectively to rotational or translational motion. These neurons are likely detectors for specific optic flow patterns and mediate the animal's behavioral choices based on optic flow information. We investigate the motion feature extraction of brain structures that receive input from retinal ganglion cells in order to identify the visual computations that underlie behavioral decisions during prey capture, the OKR, the OMR and other visually mediated behaviors. Our study of receptive fields shows that receptive field sizes in the pretectum (large) and tectum (small) are very different, and that pretectal responses are diverse and anatomically organized. Since calcium indicators are slow and receptive fields for motion stimuli are difficult to measure, we are also developing novel stimuli and statistical methods to infer the neuronal computations of visual brain areas.
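One standard statistical method for receptive-field inference of the kind mentioned above is reverse correlation: with a white-noise stimulus, the spike-triggered average (STA) recovers a neuron's linear receptive field. The sketch below is a generic textbook illustration, not the group's actual stimuli or analysis; the 1-D "retina", receptive-field location and threshold model are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# ground-truth receptive field on a 1-D "retina" of 20 locations
true_rf = np.zeros(20)
true_rf[8:12] = 1.0                       # small, tectum-like RF

# white-noise stimulus and a simple threshold spiking model
stim = rng.normal(size=(5000, 20))        # frames x locations
drive = stim @ true_rf                    # linear filtering
spikes = (drive > 1.5).astype(float)      # threshold nonlinearity

# spike-triggered average: mean stimulus frame at each spike
sta = spikes @ stim / spikes.sum()

# the STA peaks inside the true receptive field
print(int(np.argmax(sta)) in range(8, 12))
```

For slow indicators such as calcium sensors, the spike train would be replaced by a deconvolved or regression-based activity estimate, which is one reason dedicated stimuli and statistical methods are needed.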