Visual Signals
How does seeing help listening? Audiovisual integration in auditory cortex
Multisensory responses are ubiquitous in so-called unisensory cortex. However, despite their prevalence, we have very little understanding of what, if anything, they contribute to perception. In this talk I will focus on audiovisual integration in auditory cortex. Anatomical tracing studies highlight visual cortex as one source of visual input to auditory cortex. Using cortical cooling, we test the hypothesis that these inputs support audiovisual integration in ferret auditory cortex. Behavioural studies in humans support the idea that visual stimuli can help listeners to parse an auditory scene. This effect is paralleled in single units in auditory cortex, where responses to a sound mixture can be determined by the timing of a visual stimulus, such that sounds that are temporally coherent with the visual stimulus are preferentially represented. Our recent data therefore support the idea that one role for the early integration of auditory and visual signals in auditory cortex is to support auditory scene analysis, and that visual cortex plays a key role in this process.
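To make the temporal-coherence idea concrete, the following minimal sketch (an illustration under assumed signals, not the authors' analysis) correlates a visual time course with the amplitude envelopes of two competing sound sources; the source with the higher correlation is treated as the visually bound one. All variable names and parameters are hypothetical.

    # Illustrative only: which sound in a mixture is temporally coherent with
    # a visual stimulus? Here, coherence is approximated by Pearson correlation
    # between the visual time course and each sound's amplitude envelope.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 100                        # sample rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)    # 10 s of signal

    # Two slowly varying envelopes standing in for two sound sources.
    env_a = 0.5 + 0.5 * np.sin(2 * np.pi * 0.7 * t)
    env_b = 0.5 + 0.5 * np.sin(2 * np.pi * 1.3 * t + 1.0)

    # A visual stimulus whose luminance tracks source A's envelope, plus noise.
    visual = env_a + 0.2 * rng.standard_normal(t.size)

    def coherence(x, y):
        """Pearson correlation as a simple proxy for temporal coherence."""
        return np.corrcoef(x, y)[0, 1]

    scores = {"source_a": coherence(visual, env_a),
              "source_b": coherence(visual, env_b)}
    print(scores, "-> visually bound source:", max(scores, key=scores.get))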
Physical Computation in Insect Swarms
Our world is full of living creatures that must share information to survive and reproduce. As humans, we easily forget how hard it is to communicate within natural environments. So how do organisms solve this challenge, using only natural resources? Ideas from computer science, physics and mathematics, such as energetic cost, compression, and detectability, define universal criteria that almost all communication systems must meet. We use insect swarms as a model system for identifying how organisms harness the dynamics of communication signals, perform spatiotemporal integration of these signals, and propagate those signals to neighboring organisms. In this talk I will focus on two types of communication in insect swarms: visual communication, in which fireflies communicate over long distances using light signals, and chemical communication, in which bees serve as signal amplifiers to propagate pheromone-based information about the queen’s location.
Multisensory encoding of self-motion in the retrosplenial cortex and beyond
To navigate successfully through the environment, animals must accurately estimate their own motion relative to the surrounding scene and objects. In this talk, I will present our recent work on how retrosplenial cortical (RSC) neurons combine vestibular and visual signals to reliably encode the direction and speed of head turns during passive motion and active navigation. I will discuss these data in the context of RSC long-range connectivity and present our ongoing work on building population-level models of motion representation across cortical and subcortical networks.
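As a point of reference for how two noisy self-motion cues might be merged, here is a small sketch of reliability-weighted (inverse-variance) cue combination. This is a generic textbook scheme written for illustration, not the model presented in the talk, and all numbers are made up.

    # Generic illustration: combine vestibular and visual estimates of head-turn
    # velocity by weighting each cue inversely to its variance (its noisiness).
    def combine(vest_est, vest_var, vis_est, vis_var):
        w_vest = (1.0 / vest_var) / (1.0 / vest_var + 1.0 / vis_var)
        w_vis = 1.0 - w_vest
        fused = w_vest * vest_est + w_vis * vis_est
        fused_var = 1.0 / (1.0 / vest_var + 1.0 / vis_var)
        return fused, fused_var

    # Hypothetical numbers: a reliable vestibular cue and a noisier visual cue.
    est, var = combine(vest_est=42.0, vest_var=4.0, vis_est=30.0, vis_var=16.0)
    print(f"combined head velocity: {est:.1f} deg/s (variance {var:.1f})")
    # -> 39.6 deg/s: the estimate sits closer to the more reliable vestibular cue.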
Beyond energy: an unconventional role of mitochondria in cone photoreceptors
The long-term goal of my research is to study the mammalian retina as a model for the central nervous system (CNS): to understand how it functions in physiological conditions, how it is formed, how it breaks down in pathological conditions, and how it can be repaired. I have focused on two research themes: 1) photoreceptor structure, synapses, circuits, and development, and 2) hibernation and metabolic adaptations in the retina and beyond. As the first neurons of the visual system, photoreceptors are vital for photoreception and transmission of visual signals. I am particularly interested in cone photoreceptors, as they mediate our daylight vision with high-resolution color information. Diseases affecting cone photoreceptors compromise visual function in the central macular area of the human retina and are thus most detrimental to our vision. However, because cones are much less abundant than rods in most mammals, they are less well studied. We have used the ground squirrel (GS) as a model system to study cone vision, taking advantage of its unique cone-dominant retina. In particular, we have focused on short-wavelength-sensitive cones (S-cones), which are not only essential for color vision but are also an important origin of signals for biological rhythms, mood and cognitive functions, and the growth of the eye during development. We are studying critical cone synaptic structures (synaptic ribbons), the synaptic connections of S-cones, and the development of S-cones with regard to their specific connections. This work will provide knowledge of normal retinal development and function that can also be extended to the rest of the CNS, for example the mechanisms of synaptic targeting during development. In addition, such knowledge will benefit the development of optimal therapeutic strategies for regeneration and repair in cases of retinal degenerative disease. Many neurodegenerative diseases, including retinal diseases, are rooted in metabolic stress in neurons and/or glial cells. Using the same GS model, we aim to learn from this hibernating mammal, which possesses a remarkable capacity to adapt to the extreme metabolic conditions of hibernation. By exploring the mechanisms of such adaptation, we hope to discover novel therapeutic tactics for neurodegenerative diseases.
A balancing act: goal-oriented control of stability reflexes by visual feedback
Over the course of an animal’s interaction with its environment, activity within central neural circuits is exquisitely orchestrated to structure goal-oriented movement. During walking, for example, the head, body and limbs are coordinated in distinctive ways that are guided by the task at hand, as well as by posture and balance requirements. Hence, the overall performance of goal-oriented walking depends on the interplay between task-specific motor plans and stability reflexes. Copies of motor plans, typically described by the term efference copy, modulate stability reflexes in a predictive manner. However, the highly uncertain nature of natural environments indicates that efference copy alone cannot fully account for movement control; additional mechanisms must exist to regulate stability reflexes and coordinate motor programs flexibly under unpredictable conditions. In this talk, I will discuss our recent work examining how self-generated visual signals orchestrate the interplay between task-specific motor plans and stability reflexes during a self-paced, goal-oriented walking behavior.
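The gating role of efference copy described above can be illustrated with a toy forward-model sketch (an assumption-laden cartoon, not the study's circuit model): the reflex is driven only by the part of sensory feedback that the motor command did not predict.

    # Toy illustration of efference-copy gating: stability reflexes respond to
    # prediction error (unexpected feedback), not to raw sensory feedback.
    def forward_model(motor_command, gain=1.0):
        """Predicted sensory consequence (reafference) of the motor command."""
        return gain * motor_command

    def reflex_drive(sensory_feedback, motor_command, reflex_gain=0.8):
        """Reflex output scales with the unpredicted part of the feedback."""
        prediction_error = sensory_feedback - forward_model(motor_command)
        return reflex_gain * prediction_error

    # Self-generated movement: feedback matches the prediction, reflex stays quiet.
    print(reflex_drive(sensory_feedback=10.0, motor_command=10.0))  # 0.0
    # External perturbation: a large prediction error drives a corrective reflex.
    print(reflex_drive(sensory_feedback=10.0, motor_command=2.0))   # 6.4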
Natural stimulus encoding in the retina with linear and nonlinear receptive fields
Popular notions of how the retina encodes visual stimuli typically focus on the center-surround receptive fields of retinal ganglion cells, the output neurons of the retina. In this view, the receptive field acts as a linear filter on the visual stimulus, highlighting spatial contrast and providing efficient representations of natural images. Yet, we also know that many ganglion cells respond vigorously to fine spatial gratings that should not activate the linear filter of the receptive field. Thus, ganglion cells may integrate visual signals nonlinearly across space. In this talk, I will discuss how these (and other) nonlinearities relate to the encoding of natural visual stimuli in the retina. Based on electrophysiological recordings of ganglion and bipolar cells from mouse and salamander retina, I will present methods for assessing nonlinear processing in different cell types and examine their importance and potential function under natural stimulation.
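The contrast between linear and nonlinear spatial integration can be sketched in a few lines (illustrative parameters only, not fitted to data): a fine grating cancels within a linear difference-of-Gaussians receptive field, but drives a model in which small rectified subunits are summed.

    # Illustrative comparison of linear vs. subunit (nonlinear) spatial integration.
    import numpy as np

    x = np.linspace(-1.0, 1.0, 200)                   # 1-D visual space (assumed units)
    rf = np.exp(-(x / 0.2) ** 2) - 0.5 * np.exp(-(x / 0.6) ** 2)  # center - surround

    grating = np.sign(np.sin(2 * np.pi * 5 * x))      # fine square-wave grating

    # Linear receptive field: bright and dark bars cancel inside the filter.
    linear_response = rf @ grating

    # Subunit model: split the receptive field into small subunits, rectify each
    # subunit's output, then sum; spatial cancellation no longer occurs.
    subunit_response = sum(np.maximum(r @ g, 0.0)
                           for r, g in zip(np.array_split(rf, 20),
                                           np.array_split(grating, 20)))

    print(f"linear RF response:     {linear_response:.2f}")   # near zero
    print(f"subunit model response: {subunit_response:.2f}")  # clearly positive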
Distinct organization of visual and non-visual signals in visual cortex
COSYNE 2023
Neural representation and predictive processing of dynamic visual signals
COSYNE 2023
What you don’t see is what you get: Nonvisual signals dominate vestibulo-ocular reflex adaptation when retinal motion detection is impaired
FENS Forum 2024