translational motion
An optimal population code for global motion estimation in local direction-selective cells
Neuronal computations are matched to optimally encode the sensory information that is available and relevant for the animal. However, the physical distribution of sensory information is often shaped by the animal’s own behavior. One prominent example is the encoding of optic flow fields that are generated during self-motion of the animal and therefore depend on the type of locomotion. How evolution has matched computational resources to the behavioral constraints of an animal is not known. Here we use in vivo two-photon imaging to record from a population of >3,500 local direction-selective cells. Our data show that the local direction-selective T4/T5 neurons in Drosophila form a population code that is matched to represent optic flow fields generated during translational and rotational self-motion of the fly. This coding principle for optic flow is reminiscent of the population code of local direction-selective ganglion cells in the mouse retina, where four types of direction-selective ganglion cells encode four different axes of self-motion encountered during walking (Sabbah et al., 2017). However, in flies we find six different subtypes of T4 and T5 cells that, at the population level, represent six axes of self-motion of the fly. The four uniformly tuned T4/T5 subtypes described previously represent a local snapshot of this population code (Maisak et al., 2013). The encoding of six types of optic flow in the fly, as compared to four types in the mouse, might be matched to the higher degrees of freedom encountered during flight. Thus, a population code for optic flow appears to be a general coding principle of visual systems, resulting from convergent evolution but matching the individual ethological constraints of the animal.
Motion processing across visual field locations in zebrafish
Animals are able to perceive self-motion and navigate in their environment using optic flow information. They often perform visually guided stabilization behaviors such as the optokinetic response (OKR) or optomotor response (OMR) in order to maintain their eye and body position relative to the moving surround. But how does the animal manage to perform appropriate behavioral responses, and how are processing tasks divided between the various non-cortical visual brain areas? Experiments have shown that the zebrafish pretectum, which is homologous to the mammalian accessory optic system, is involved in the OKR and OMR. The optic tectum (superior colliculus in mammals) is involved in the processing of small stimuli, e.g. during prey capture. We have previously shown that many pretectal neurons respond selectively to rotational or translational motion. These neurons are likely detectors for specific optic flow patterns and mediate behavioral choices of the animal based on optic flow information. We investigate the motion feature extraction of brain structures that receive input from retinal ganglion cells to identify the visual computations that underlie behavioral decisions during prey capture, OKR, OMR and other visually mediated behaviors. Our study of receptive fields shows that receptive field sizes differ markedly between pretectum (large) and tectum (small), and that pretectal responses are diverse and anatomically organized. Since calcium indicators are slow and receptive fields for motion stimuli are difficult to measure, we are also developing novel stimuli and statistical methods to infer the neuronal computations of visual brain areas.