Latest

Seminar · Neuroscience · Recording

Target detection in the natural world

Karin Nordstrom
Flinders University
Nov 15, 2021

Animal sensory systems are optimally adapted to the features typically encountered in natural surrounds, allowing neurons with limited bandwidth to encode almost impossibly large input ranges. Importantly, natural scenes are not random, and peripheral visual systems have therefore evolved to reduce this predictable redundancy. The vertebrate visual cortex is also optimally tuned to the spatial statistics of natural scenes, but much less is known about how the insect brain responds to these statistics. We are redressing this deficiency using several techniques. Olga Dyakova uses precise image manipulation to give natural images unnatural image statistics, or vice versa. Marissa Holden then uses these images as stimuli in electrophysiological recordings from neurons in the fly optic lobes, to see how the brain encodes the statistics typically encountered in natural scenes, and Olga Dyakova measures the behavioral optomotor response on our trackball set-up.
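As a rough illustration of the kind of image manipulation described above, the sketch below rescales the amplitude spectrum of a grayscale image to an arbitrary power-law slope while preserving its phase, since natural images typically show a roughly 1/f amplitude falloff. This is a minimal, generic example under that assumption, not the specific pipeline used in the study; the function name, target slope, and normalization are illustrative.

import numpy as np

def set_spectral_slope(image, target_slope=-1.0):
    # image: 2-D grayscale array. Natural images typically have amplitude
    # spectra falling off roughly as 1/f (log-log slope near -1); imposing
    # a different slope makes the statistics "unnatural" while the phase,
    # and hence the recognizable scene layout, is preserved.
    img = np.asarray(image, dtype=float)
    f = np.fft.fft2(img)
    amp, phase = np.abs(f), np.angle(f)

    # Radial spatial frequency of every FFT coefficient.
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    radius = np.sqrt(fx**2 + fy**2)
    radius[0, 0] = 1.0            # placeholder; avoids dividing by zero at DC

    # Impose a pure power-law amplitude spectrum with the target slope,
    # roughly matched to the original overall amplitude, keeping the mean.
    new_amp = radius ** target_slope
    new_amp *= amp[1:, 1:].mean() / new_amp[1:, 1:].mean()
    new_amp[0, 0] = amp[0, 0]

    return np.real(np.fft.ifft2(new_amp * np.exp(1j * phase)))

For example, set_spectral_slope(img, target_slope=0.0) flattens (whitens) the spectrum, while slopes steeper than -1 shift energy toward low spatial frequencies, making the image look blurrier than a typical natural scene.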

Seminar · Neuroscience

An optimal population code for global motion estimation in local direction-selective cells

Miriam Henning
Silies lab, University of Mainz, Germany
Nov 4, 2021

Neuronal computations are matched to optimally encode the sensory information that is available and relevant for the animal. However, the physical distribution of sensory information is often shaped by the animal's own behavior. One prominent example is the encoding of optic flow fields, which are generated during self-motion and therefore depend on the type of locomotion. How evolution has matched computational resources to the behavioral constraints of an animal is not known. Here we use in vivo two-photon imaging to record from a population of >3,500 local direction-selective cells. Our data show that the local direction-selective T4/T5 neurons in Drosophila form a population code that is matched to represent optic flow fields generated during translational and rotational self-motion of the fly. This coding principle for optic flow is reminiscent of the population code of local direction-selective ganglion cells in the mouse retina, where four direction-selective ganglion cell subtypes encode four different axes of self-motion encountered during walking (Sabbah et al., 2017). However, in flies we find six different subtypes of T4 and T5 cells that, at the population level, represent six axes of self-motion of the fly. The four uniformly tuned T4/T5 subtypes described previously (Maisak et al., 2013) represent only a local snapshot. The encoding of six types of optic flow in the fly, compared to four in the mouse, might be matched to the higher degrees of freedom encountered during flight. Thus, a population code for optic flow appears to be a general coding principle of visual systems, resulting from convergent evolution but matching the individual ethological constraints of each animal.
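To make the idea of a population code for optic flow concrete, here is a toy matched-filter sketch, not the analysis from the talk: simulated local direction-selective units respond to the ideal flow field along their preferred directions, and the axis of self-motion is decoded by comparing the population response to templates for a handful of candidate axes. The unit count, the random preferred directions, and the six candidate axes are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(0)

# A toy population of local direction-selective units. Each unit looks
# along a viewing direction on the unit sphere and prefers local image
# motion along one tangential axis (its "preferred direction").
n_units = 3500
view = rng.normal(size=(n_units, 3))
view /= np.linalg.norm(view, axis=1, keepdims=True)

# Random tangential preferred directions (hypothetical tuning; the real
# T4/T5 subtypes have structured, not random, preferred directions).
pref = rng.normal(size=(n_units, 3))
pref -= (pref * view).sum(1, keepdims=True) * view
pref /= np.linalg.norm(pref, axis=1, keepdims=True)

def optic_flow(view, translation, rotation):
    # Ideal optic flow at each viewing direction for a given self-motion
    # (unit depth assumed for the translational component).
    trans_flow = -(translation - (view @ translation)[:, None] * view)
    rot_flow = -np.cross(rotation, view)
    return trans_flow + rot_flow

def population_response(translation, rotation):
    # Rectified projection of the local flow onto each unit's preferred
    # direction -- a crude stand-in for a direction-selective response.
    flow = optic_flow(view, np.asarray(translation), np.asarray(rotation))
    return np.maximum((flow * pref).sum(1), 0.0)

# Decode the translation axis by matching against template responses.
axes = np.vstack([np.eye(3), -np.eye(3)])            # six candidate axes
templates = np.array([population_response(a, np.zeros(3)) for a in axes])
templates /= np.linalg.norm(templates, axis=1, keepdims=True)

r = population_response([0, 0, 1], [0, 0, 0])        # forward translation
print("decoded translation axis:", axes[np.argmax(templates @ r)])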

Seminar · Neuroscience · Recording

Motion processing across visual field locations in zebrafish

Aristides Arrenberg
University of Tuebingen
Sep 28, 2020

Animals are able to perceive self-motion and navigate in their environment using optic flow information. They often perform visually guided stabilization behaviors, such as the optokinetic response (OKR) or optomotor response (OMR), in order to maintain their eye and body position relative to the moving surround. But how does the animal manage to perform the appropriate behavioral response, and how are processing tasks divided between the various non-cortical visual brain areas? Experiments have shown that the zebrafish pretectum, which is homologous to the mammalian accessory optic system, is involved in the OKR and OMR. The optic tectum (superior colliculus in mammals) is involved in the processing of small stimuli, e.g. during prey capture. We have previously shown that many pretectal neurons respond selectively to rotational or translational motion. These neurons are likely detectors for specific optic flow patterns and mediate behavioral choices of the animal based on optic flow information. We investigate the motion feature extraction of brain structures that receive input from retinal ganglion cells, to identify the visual computations that underlie behavioral decisions during prey capture, OKR, OMR and other visually mediated behaviors. Our study of receptive fields shows that receptive field sizes in the pretectum (large) and tectum (small) are very different, and that pretectal responses are diverse and anatomically organized. Since calcium indicators are slow and receptive fields for motion stimuli are difficult to measure, we are also developing novel stimuli and statistical methods to infer the neuronal computations of visual brain areas.
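As a sketch of one standard way to handle slow calcium indicators when mapping receptive fields, not necessarily the novel methods developed in this lab, the toy example below simulates a white-noise experiment, blurs the neural drive with an exponential indicator kernel, and recovers the spatial receptive field by ridge-regularized reverse correlation against the kernel-filtered stimulus. All parameters (kernel time constant, noise level, regularization) are made up for illustration.

import numpy as np

rng = np.random.default_rng(1)

# Simulated experiment: white-noise stimulus driving one linear receptive field.
n_frames, n_pix = 5000, 40
stim = rng.standard_normal((n_frames, n_pix))
true_rf = np.exp(-0.5 * ((np.arange(n_pix) - 25) / 3.0) ** 2)   # Gaussian RF

drive = stim @ true_rf                                # linear neural drive
calcium_kernel = np.exp(-np.arange(50) / 10.0)        # slow indicator decay
calcium = np.convolve(drive, calcium_kernel)[:n_frames]
calcium += 0.5 * rng.standard_normal(n_frames)        # measurement noise

# RF estimate that accounts for the slow indicator: filter the stimulus
# with the same calcium kernel, then solve a ridge-regularized least-squares
# problem (reverse correlation adapted to a temporally blurred measurement).
stim_filt = np.apply_along_axis(
    lambda s: np.convolve(s, calcium_kernel)[:n_frames], 0, stim)
lam = 10.0
rf_hat = np.linalg.solve(stim_filt.T @ stim_filt + lam * np.eye(n_pix),
                         stim_filt.T @ calcium)

print("correlation with true RF:",
      np.corrcoef(rf_hat, true_rf)[0, 1].round(3))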
