Computational Mechanisms of Predictive Processing in Brains and Machines
Predictive processing offers a unifying view of neural computation, proposing that brains continuously anticipate sensory input and update internal models based on prediction errors. In this talk, I will present converging evidence for the computational mechanisms underlying this framework across human neuroscience and deep neural networks. I will begin with recent work showing that large-scale distributed prediction-error encoding in the human brain directly predicts how sensory representations reorganize through predictive learning. I will then turn to PredNet, a popular predictive-coding-inspired deep network that has been widely used to model real-world biological vision systems. Using dynamic stimuli generated with our Spatiotemporal Style Transfer algorithm, we demonstrate that PredNet relies primarily on low-level spatiotemporal structure and remains insensitive to high-level content, revealing limits in its generalization capacity. Finally, I will discuss new recurrent vision models that integrate top-down feedback connections with intrinsic neural variability, uncovering a dual mechanism for robust sensory coding in which neural variability decorrelates unit responses while top-down feedback stabilizes network dynamics. Together, these results outline how prediction-error signaling and top-down feedback pathways shape adaptive sensory processing in biological and artificial systems.
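The core prediction-error loop underlying this framework can be illustrated with a minimal sketch (the weights, dimensions, and learning rate here are invented for illustration, not the speaker's model): a latent estimate is refined by gradient steps on the error between the sensory input and its top-down prediction.

```python
import numpy as np

# Minimal predictive-coding sketch: a generative weight matrix W predicts
# the input from a latent estimate r, and the prediction error e = x - W @ r
# drives updates to r until prediction and input agree.

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))         # generative weights (hypothetical)
r_true = np.array([1.0, -0.5, 2.0]) # latent cause of the sensory input
x = W @ r_true                      # noiseless sensory input

r = np.zeros(3)                     # latent estimate, updated by error
for _ in range(500):
    e = x - W @ r                   # prediction error (bottom-up signal)
    r += 0.02 * W.T @ e             # gradient step reducing the error
```

After convergence the latent estimate recovers the cause of the input, and the prediction error vanishes; in predictive-coding accounts, this residual error is the quantity proposed to be signaled between cortical areas.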
Top-down control of neocortical threat memory
Accurate perception of the environment is a constructive process that requires integration of external bottom-up sensory signals with internally generated top-down information reflecting past experiences and current aims. Decades of work have elucidated how sensory neocortex processes physical stimulus features. In contrast, examining how memory-related top-down information is encoded and integrated with bottom-up signals has long been challenging. Here, I will discuss our recent work pinpointing the outermost layer 1 of neocortex as a central hotspot for processing of experience-dependent top-down information during threat perception, one of the most fundamentally important forms of sensation.
Neural architectures: what are they good for anyway?
The brain has a highly complex structure in terms of cell types and wiring between different regions. What is it for, if anything? I'll start this talk by asking what might an answer to this question even look like given that we can't run an alternative universe where our brains are structured differently. (Preview: we can do this with models!) I'll then talk about some of our work in two areas: (1) does the modular structure of the brain contribute to specialisation of function? (2) how do different cell types and architectures contribute to multimodal sensory processing?
Rethinking Attention: Dynamic Prioritization
Decades of research on the mechanisms of attentional selection have focused on identifying the units (representations) on which attention operates in order to guide prioritized sensory processing. These attentional units fit neatly within our understanding of how attention is allocated in a top-down, bottom-up, or history-driven fashion. In this talk, I will focus on attentional phenomena that are not easily accommodated within current theories of attentional selection – the “attentional platypuses,” so named because, within biological taxonomies, the platypus fits into neither the mammal nor the bird category. Similarly, attentional phenomena that do not fit neatly within current attentional models suggest that those models need to be revised. I will list a few instances of these “attentional platypuses” and then offer a new approach, Dynamically Weighted Prioritization, which stipulates that multiple factors impinge on the attentional priority map, each with a corresponding weight. The interaction between factors and their corresponding weights determines the current state of the priority map, which subsequently constrains and guides attention allocation. I propose that this new approach be considered a supplement to existing models of attention, especially those that emphasize categorical organization.
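The weighted-priority idea can be sketched in a few lines (the factor names, maps, and weights below are hypothetical, chosen only to show the mechanics): each factor contributes a map over locations, the weights set each factor's current influence, and attention is allocated to the peak of the combined map.

```python
import numpy as np

# Sketch of a dynamically weighted priority map: several factor maps over a
# 4x4 grid of locations are combined with weights, and attention goes to the
# location with the highest combined priority.

shape = (4, 4)
rng = np.random.default_rng(1)
factors = {
    "bottom_up_salience": rng.random(shape),   # stimulus-driven factor
    "top_down_goal":      rng.random(shape),   # goal-driven factor
    "selection_history":  rng.random(shape),   # history-driven factor
}
weights = {"bottom_up_salience": 0.2, "top_down_goal": 0.7, "selection_history": 0.1}

# Weighted combination of the factor maps into a single priority map.
priority = sum(weights[name] * fmap for name, fmap in factors.items())
attended = np.unravel_index(np.argmax(priority), shape)  # peak location
```

Changing the weights from moment to moment (e.g., down-weighting the goal map under load) reweights the same factor maps into a different priority landscape, which is the sense in which prioritization is dynamic.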
Time perception in film viewing as a function of film editing
Filmmakers and editors have empirically developed techniques to ensure the spatiotemporal continuity of a film's narration. In terms of time, editing techniques (e.g., elliptical editing, overlapping editing, or cut minimization) allow for the manipulation of the perceived duration of events as they unfold on screen. More specifically, a scene can be edited to be time-compressed, time-expanded, or real-time in terms of its perceived duration. Despite the consistent application of these techniques in filmmaking, their perceptual outcomes have not been experimentally validated. Given that viewing a film is experienced as a precise simulation of the physical world, the use of cinematic material to examine aspects of time perception allows for experimentation with high ecological validity, while filmmakers gain more insight into how empirically developed techniques influence viewers' time percepts. Here, we investigated how such time-manipulation techniques affect a scene's perceived duration. Specifically, we presented videos depicting different actions (e.g., a woman talking on the phone), edited according to the techniques applied for temporal manipulation, and asked participants to make verbal estimates of the presented scenes' durations. Analysis of the data revealed that the duration of expanded scenes was significantly overestimated compared with that of compressed and real-time scenes, as was the duration of real-time scenes compared with that of compressed scenes. Therefore, our results validate the empirical techniques applied for the modulation of a scene's perceived duration. We also found that time estimates showed interactions between scene type and editing technique as a function of the characteristics and the action of the presented scene. Thus, these findings add to the discussion that the content and characteristics of a scene, along with the editing technique applied, can also modulate perceived duration.
Our findings are discussed by considering current timing frameworks, as well as attentional saliency algorithms measuring the visual saliency of the presented stimuli.
Measures and models of multisensory integration in reaction times
First, a new measure of multisensory integration (MI) for reaction times (RTs) is proposed that takes the entire RT distribution into account. Second, we present some recent developments in time-window-of-integration (TWIN) modeling, including a new proposal for the sound-induced flash illusion (SIFI).
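The abstract does not spell out the new measure, but the standard distribution-level benchmark such measures build on is the race-model (Miller) inequality, which compares the full bimodal RT distribution with the sum of the unimodal distributions at every time point. A sketch on synthetic data (all RT parameters here are invented):

```python
import numpy as np

# Race-model inequality on simulated reaction times (seconds): a violation,
# CDF_AV(t) > CDF_A(t) + CDF_V(t), indicates facilitation beyond what a race
# between independent unisensory channels can produce.

rng = np.random.default_rng(2)
rt_a  = rng.normal(0.30, 0.04, 1000)   # auditory-only RTs (synthetic)
rt_v  = rng.normal(0.32, 0.04, 1000)   # visual-only RTs (synthetic)
rt_av = rng.normal(0.24, 0.03, 1000)   # audiovisual RTs, faster than either

def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at each time in `t`."""
    return np.mean(sample[:, None] <= t[None, :], axis=0)

t = np.linspace(0.15, 0.45, 61)        # evaluate across the whole RT range
bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
violation = ecdf(rt_av, t) - bound     # positive values violate the race model
max_violation = violation.max()
```

Because the comparison is made at every quantile rather than on mean RTs alone, this style of test is one way a measure can "take the entire RT distribution into account."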
Using Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception
To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive the stimuli as a single event, even though they occurred separately. In recent years, our lab, and others, have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models demonstrate metrics that are consistent with humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in their perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model to study the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
Multisensory processing of anticipatory and consummatory food cues
Multisensory influences on vision: Sounds enhance and alter visual-perceptual processing
Visual perception is traditionally studied in isolation from other sensory systems, and while this approach has been exceptionally successful, in the real world, visual objects are often accompanied by sounds, smells, tactile information, or taste. How is visual processing influenced by these other sensory inputs? In this talk, I will review studies from our lab showing that a sound can influence the perception of a visual object in multiple ways. In the first part, I will focus on spatial interactions between sound and sight, demonstrating that co-localized sounds enhance visual perception. Then, I will show that these cross-modal interactions also occur at a higher contextual and semantic level, where naturalistic sounds facilitate the processing of real-world objects that match these sounds. Throughout my talk I will explore to what extent sounds not only improve visual processing but also alter perceptual representations of the objects we see. Most broadly, I will argue for the importance of considering multisensory influences on visual perception for a more complete understanding of our visual experience.
Trial by trial predictions of subjective time from human brain activity
Our perception of time isn’t like a clock; it varies depending on other aspects of experience, such as what we see and hear in that moment. However, in everyday life, the properties of these simple features can change frequently, presenting a challenge to understanding real-world time perception based on simple lab experiments. We developed a computational model of human time perception based on tracking changes in neural activity across brain regions involved in sensory processing, using fMRI. By measuring changes in brain activity patterns across these regions, our approach accommodates the different and changing feature combinations present in natural scenarios, such as walking on a busy street. Our model reproduces people’s duration reports for natural videos (up to almost half a minute long) and, most importantly, predicts whether a person reports a scene as relatively shorter or longer: the biases in time perception that reflect how the natural experience of time deviates from clock time.
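The general approach of change-based duration estimation can be caricatured in a few lines (the threshold, scaling, and "signals" below are invented stand-ins, not the authors' fMRI model): accumulate salient changes in a sensory response time series, then map the count of accumulated events to seconds, so that busier scenes yield longer estimates.

```python
import numpy as np

# Toy duration estimator: count salient frame-to-frame changes in a feature
# time series (a stand-in for activity in sensory brain regions) and map the
# event count to an estimated duration in seconds.

rng = np.random.default_rng(3)

def estimated_duration(signal, threshold=5.0, seconds_per_event=0.8):
    changes = np.abs(np.diff(signal, axis=0)).sum(axis=1)  # change per frame
    salient_events = np.sum(changes > threshold)           # count big changes
    return salient_events * seconds_per_event              # events -> seconds

quiet = rng.normal(0, 0.1, (100, 16))   # low-change "scene" (100 frames)
busy  = rng.normal(0, 1.0, (100, 16))   # high-change "scene" (100 frames)
```

Under this scheme, the busy signal accumulates salient events faster and is judged longer than the quiet one for the same clock time, which is the kind of content-driven bias the model aims to capture.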
Hunger state-dependent modulation of decision-making in larval Drosophila
It is critical for all animals to make appropriate, but also flexible, foraging decisions, especially when facing starvation. Sensing olfactory information is essential to evaluate food quality before ingestion. Previously, we found that Drosophila larvae switch their response to certain odors from aversion to attraction when food deprived. The neural mechanism underlying this switch in behavior involves serotonergic modulation and reconfiguration of odor processing in the early olfactory sensory system. We now investigate if a change in hunger state also influences other behavioral decisions. Since it had been shown that fly larvae can perform cannibalism, we investigate the effect of food deprivation on feeding on dead conspecifics. We find that fed fly larvae rarely use dead conspecifics as a food source. However, food deprivation largely enhances this behavior. We will now also investigate the underlying neural mechanisms that mediate this enhancement and compare it to the already described mechanism for a switch in olfactory choice behavior. Generally, this flexibility in foraging behavior enables the larva to explore a broader range of stimuli and to expand their feeding choices to overcome starvation.
Untitled Seminar
Giordano Lippi – Beyond transcription – microRNA mechanisms of brain development; Maria Isabel Carreño-Muñoz – Role of GABAergic circuits in the generation of sensory processing dysregulations in SYNGAP1 haploinsufficiency; Rhys Knowles – TBA; Nigel Kee – That other half: Derivation of posterior axial tissues from human stem cells
Hierarchical transformation of visual event timing representations in the human brain: response dynamics in early visual cortex and timing-tuned responses in association cortices
Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. For example, this allows us to follow and interact with the precise timing of speech and sports. Here we investigate how visual event timing is represented and transformed across the brain’s hierarchy: from sensory processing areas, through multisensory integration areas, to frontal action planning areas. We hypothesized that the dynamics of neural responses to sensory events in sensory processing areas allows derivation of event timing representations. This would allow higher-level processes such as multisensory integration and action planning to use sensory timing information, without the need for specialized central pacemakers or processes. Using 7T fMRI and neural model-based analyses, we found responses that monotonically increase in amplitude with visual event duration and frequency, becoming increasingly clear from primary visual cortex to lateral occipital visual field maps. Beginning in area MT/V5, we found a gradual transition from monotonic to tuned responses, with response amplitudes peaking at different event timings in different recording sites. While monotonic response components were limited to the retinotopic location of the visual stimulus, timing-tuned response components were independent of the recording sites' preferred visual field positions. These tuned responses formed a network of topographically organized timing maps in superior parietal, postcentral and frontal areas. From anterior to posterior timing maps, multiple events were increasingly integrated, response selectivity narrowed, and responses focused increasingly on the middle of the presented timing range. 
These results suggest that responses to event timing are transformed from the human brain’s sensory areas to the association cortices, with the event’s temporal properties being increasingly abstracted from the response dynamics and locations of early sensory processing. The resulting abstracted representation of event timing is then propagated through areas implicated in multisensory integration and action planning.
Feedback controls what we see
We hardly notice when there is a speck on our glasses, the obstructed visual information seems to be magically filled in. The visual system uses visual context to predict the content of the stimulus. What enables neurons in the visual system to respond to context when the stimulus is not available? In cortex, sensory processing is based on a combination of feedforward information arriving from sensory organs, and feedback information that originates in higher-order areas. Whereas feedforward information drives the activity in cortex, feedback information is thought to provide contextual signals that are merely modulatory. We have made the exciting discovery that mouse primary visual cortical neurons are strongly driven by feedback projections from higher visual areas, in particular when their feedforward sensory input from the retina is missing. This drive is so strong that it makes visual cortical neurons fire as much as if they were receiving a direct sensory input.
Connecting structure and function in early visual circuits
How does the brain interpret signals from the outside world? Walking through a park, you might take for granted the ease with which you can understand what you see. Rather than seeing a series of still snapshots, you are able to see simple, fluid movement — of dogs running, squirrels foraging, or kids playing basketball. You can track their paths and know where they are headed without much thought. “How does this process take place?” asks Rudy Behnia, PhD, a principal investigator at Columbia’s Mortimer B. Zuckerman Mind Brain Behavior Institute. “For most of us, it’s hard to imagine a world where we can’t see motion, shapes, and color; where we can’t have a representation of the physical world in our head.” And yet this representation does not happen automatically — our brain has no direct connection with the outside world. Instead, it interprets information taken in by our senses. Dr. Behnia is studying how the brain builds these representations. As a starting point, she focuses on how we see motion.
Attention to visual motion: shaping sensation into perception
Evolution has endowed primates, including humans, with a powerful visual system, seemingly providing us with a detailed perception of our surroundings. But in reality the underlying process is one of active filtering, enhancement and reshaping. For visual motion perception, the dorsal pathway in primate visual cortex, and in particular area MT/V5, is considered to be of critical importance. Combining physiological and psychophysical approaches, we have used the processing and perception of visual motion and area MT/V5 as a model for the interaction of sensory (bottom-up) signals with cognitive (top-down) modulatory influences that characterizes visual perception. Our findings document how this interaction enables visual cortex to actively generate a neural representation of the environment that combines the high-performance sensory periphery with selective modulatory influences to produce an “integrated saliency map” of the environment.
Invariant neural subspaces maintained by feedback modulation
Sensory systems reliably process incoming stimuli in spite of changes in context. Most recent models attribute this context invariance to an extraction of increasingly complex sensory features in hierarchical feedforward networks. Here, we study how context-invariant representations can be established by feedback rather than feedforward processing. We show that feedforward neural networks modulated by feedback can dynamically generate invariant sensory representations. The required feedback can be implemented as a slow and spatially diffuse gain modulation. The invariance is not present on the level of individual neurons, but emerges only on the population level. Mechanistically, the feedback modulation dynamically reorients the manifold of neural activity and thereby maintains an invariant neural subspace in spite of contextual variations. Our results highlight the importance of population-level analyses for understanding the role of feedback in flexible sensory processing.
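A deliberately simplified cartoon of the mechanism (hypothetical weights; the actual model uses learned, dynamic feedback rather than a known scalar): a contextual gain applied to the input can be cancelled by a slow, diffuse multiplicative feedback gain on the population, leaving the population response, and hence any downstream readout subspace, invariant to the context change.

```python
import numpy as np

# Cartoon of feedback-mediated invariance: a context scales the sensory
# input, and a single diffuse feedback gain on the whole population
# compensates that scaling, so the population response is unchanged.

rng = np.random.default_rng(4)
W = rng.normal(size=(10, 5))          # feedforward weights (hypothetical)
stim = rng.normal(size=5)             # a fixed sensory stimulus

def population_response(context_gain, feedback_gain):
    drive = W @ (context_gain * stim) # context multiplicatively scales input
    return feedback_gain * drive      # slow diffuse gain applied by feedback

r_ref  = population_response(1.0, 1.0)          # reference context
r_comp = population_response(3.0, 1.0 / 3.0)    # feedback cancels context
```

In the full model the compensation is not a fixed reciprocal but is generated dynamically by feedback, and the invariance holds only for population-level subspaces, not for individual units; this sketch shows only why a multiplicative, spatially diffuse signal is in principle sufficient.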
Synergy of color and motion vision for detecting approaching objects in Drosophila
I am working on color vision in Drosophila, identifying behaviors that involve color vision and understanding the neural circuits supporting them (Longden 2016). I have a long-term interest in understanding how neural computations operate reliably under changing circumstances, be they external changes in the sensory context, or internal changes of state such as hunger and locomotion. On internal state-modulation of sensory processing, I have shown how hunger alters visual motion processing in blowflies (Longden et al. 2014), and identified a role for octopamine in modulating motion vision during locomotion (Longden and Krapp 2009, 2010). On responses to external cues, I have shown how one kind of uncertainty in the motion of the visual scene is resolved by the fly (Saleem, Longden et al. 2012), and I have identified novel cells for processing translation-induced optic flow (Longden et al. 2017). I like working with colleagues who use different model systems, to get at principles of neural operation that might apply in many species (Ding et al. 2016, Dyakova et al. 2015). I like work motivated by computational principles - my background is computational neuroscience, with a PhD on models of memory formation in the hippocampus (Longden and Willshaw, 2007).
NMC4 Short Talk: Neurocomputational mechanisms of causal inference during multisensory processing in the macaque brain
Natural perception relies inherently on inferring causal structure in the environment. However, the neural mechanisms and functional circuits that are essential for representing and updating the hidden causal structure during multisensory processing are unknown. To address this, monkeys were trained to infer the probability of a potential common source from visual and proprioceptive signals on the basis of their spatial disparity in a virtual reality system. The proprioceptive drift reported by monkeys demonstrated that they combined historical information and current multisensory signals to estimate the hidden common source and subsequently updated both the causal structure and sensory representation. Single-unit recordings in premotor and parietal cortices revealed that neural activity in premotor cortex represents the core computation of causal inference, characterizing the estimation and update of the likelihood of integrating multiple sensory inputs at a trial-by-trial level. In response to signals from premotor cortex, neural activity in parietal cortex also represents the causal structure and further dynamically updates the sensory representation to maintain consistency with the causal inference structure. Thus, our results indicate how premotor cortex integrates historical information and sensory inputs to infer hidden variables and selectively updates sensory representations in parietal cortex to support behavior. This dynamic loop of frontal-parietal interactions in the causal inference framework may provide the neural mechanism to answer long-standing questions regarding how neural circuits represent hidden structures for body-awareness and agency.
Inhibitory circuits in sensory processing and behaviour
Being awake while sleeping, being asleep while awake: consequences on cognition and consciousness
Sleep is classically presented as an all-or-nothing phenomenon. Yet, there is increasing evidence showing that sleep and wakefulness can actually intermingle and that wake-like and sleep-like activity can be observed concomitantly in different brain regions. I will here explore the implications of this conception of sleep as a local phenomenon for cognition and consciousness. In the first part of my presentation, I will show how local modulations of sleep depth during sleep could support the processing of sensory information by sleepers. I will also show how, under certain circumstances, sleepers can learn while sleeping, but also how they can forget. In the second part, I will show how the reverse phenomenon, sleep intrusions during waking, can explain modulations of attention. I will focus in particular on modulations of subjective experience and how the local sleep framework can inform our understanding of everyday phenomena such as mind wandering and mind blanking. Through this presentation and the exploration of both sleep and wakefulness, I will seek to connect changes in neurophysiology with changes in behaviour and subjective experience.
Migraine: a disorder of excitatory-inhibitory balance in multiple brain networks? Insights from genetic mouse models of the disease
Migraine is much more than an episodic headache. It is a complex brain disorder, characterized by a global dysfunction in multisensory information processing and integration. In a third of patients, the headache is preceded by transient sensory disturbances (aura), whose neurophysiological correlate is cortical spreading depression (CSD). The molecular, cellular and circuit mechanisms of the primary brain dysfunctions that underlie migraine onset, susceptibility to CSD and altered sensory processing remain largely unknown and are major open issues in the neurobiology of migraine. Genetic mouse models of a rare monogenic form of migraine with aura provide a unique experimental system to tackle these key unanswered questions. I will describe the functional alterations we have uncovered in the cerebral cortex of genetic mouse models and discuss the insights into the cellular and circuit mechanisms of migraine obtained from these findings.
Seeing with technology: Exchanging the senses with sensory substitution and augmentation
What is perception? Our sensory modalities transduce information about the external world into electrochemical signals that somehow give rise to our conscious experience of our environment. Normally there is too much information to be processed in any given moment, and the mechanisms of attention focus the limited resources of the mind on some information at the expense of other information. My research has advanced from first examining visual perception and attention to now examining how multisensory processing contributes to perception and cognition. There are fundamental constraints on how much information can be processed by the different senses on their own and in combination. Here I will explore information processing from the perspective of sensory substitution and augmentation, and how "seeing" with the ears and tongue can advance fundamental and translational research.
Population level mechanisms of feedback-mediated invariant sensory processing
Expectation of self-generated sounds drives predictive processing in mouse auditory cortex
Sensory stimuli are often predictable consequences of one’s actions, and behavior exerts a correspondingly strong influence over sensory responses in the brain. Closed-loop experiments with the ability to control the sensory outcomes of specific animal behaviors have revealed that neural responses to self-generated sounds are suppressed in the auditory cortex, suggesting a role for prediction in local sensory processing. However, it is unclear whether this phenomenon derives from a precise movement-based prediction or how it affects the neural representation of incoming stimuli. We address these questions by designing a behavioral paradigm where mice learn to expect the predictable acoustic consequences of a simple forelimb movement. Neuronal recordings from auditory cortex revealed suppression of neural responses that was strongest for the expected tone and specific to the time of the sound-associated movement. Predictive suppression in the auditory cortex was layer-specific, preceded by the arrival of movement information, and unaffected by behavioral relevance or reward association. These findings illustrate that expectation, learned through motor-sensory experience, drives layer-specific predictive processing in the mouse auditory cortex.
Understanding the role of prediction in sensory encoding
At any given moment the brain receives more sensory information than it can use to guide adaptive behaviour, creating the need for mechanisms that promote efficient processing of incoming sensory signals. One way in which the brain might reduce its sensory processing load is to encode successive presentations of the same stimulus in a more efficient form, a process known as neural adaptation. Conversely, when a stimulus violates an expected pattern, it should evoke an enhanced neural response. Such a scheme for sensory encoding has been formalised in predictive coding theories, which propose that recent experience establishes expectations in the brain that generate prediction errors when violated. In this webinar, Professor Jason Mattingley will discuss whether the encoding of elementary visual features is modulated when otherwise identical stimuli are expected or unexpected based upon the history of stimulus presentation. In humans, EEG was employed to measure neural activity evoked by gratings of different orientations, and multivariate forward modelling was used to determine how orientation selectivity is affected for expected versus unexpected stimuli. In mice, two-photon calcium imaging was used to quantify orientation tuning of individual neurons in the primary visual cortex to expected and unexpected gratings. Results revealed enhanced orientation tuning to unexpected visual stimuli, both at the level of whole-brain responses and for individual visual cortex neurons. Professor Mattingley will discuss the implications of these findings for predictive coding theories of sensory encoding. Professor Jason Mattingley is a Laureate Fellow and Foundation Chair in Cognitive Neuroscience at The University of Queensland. His research is directed toward understanding the brain processes that support perception, selective attention and decision-making, in health and disease.
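The contrast between responses to expected and unexpected stimuli described above can be caricatured with a toy adaptation model (the tuning curves, gains, and adaptation rule below are invented for illustration, not the experimental analysis): repeated exposure to one grating orientation scales down the gain of units tuned near it, so an unexpected orientation evokes a larger population response than the expected one.

```python
import numpy as np

# Toy expectation-suppression sketch: a bank of orientation-tuned units
# adapts to a repeatedly presented 45-degree grating, so a surprise
# 135-degree grating drives a larger population response.

def tuning(theta, preferred, width=20.0):
    """Gaussian orientation tuning (degrees)."""
    return np.exp(-0.5 * ((theta - preferred) / width) ** 2)

preferred = np.arange(0, 180, 15)          # preferred orientations of units

def responses(stimulus, adaptation):
    # adaptation holds per-unit gains in (0, 1], reduced by repeated exposure
    return adaptation * tuning(stimulus, preferred)

adaptation = np.ones(len(preferred))
for _ in range(5):                         # five expected 45-degree gratings
    adaptation -= 0.15 * tuning(45.0, preferred)  # use-dependent gain loss
    adaptation = np.clip(adaptation, 0.1, 1.0)

expected   = responses(45.0, adaptation).sum()    # adapted, expected stimulus
unexpected = responses(135.0, adaptation).sum()   # unadapted, surprise stimulus
```

A simple gain model like this reproduces the direction of the reported effect (larger responses to unexpected gratings); the studies discussed in the talk go further, showing that orientation tuning itself sharpens for unexpected stimuli.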
Science and technology to understand developmental multisensory processing
Workshop: Spatial Brain Dynamics
Traditionally, the term dynamics means changes in a system evolving over time. However, in the brain, action potentials propagate along axons to induce postsynaptic currents with different delays at many sites simultaneously. This fundamental computational mechanism evolves spatially to engage the neuron populations involved in brain functions. To identify and understand the spatial processing in brains, this workshop will focus on the spatial principles of brain dynamics that determine how action potentials and membrane currents propagate in the networks of neurons that brains are made of. We will focus on non-artificial dynamics, which excludes in vitro dynamics, interference, and electrical and optogenetic stimulation of brains in vivo. Recent non-artificial studies of spatial brain dynamics can actually explain how sensory, motor and internal brain functions evolve. The purpose of this workshop is to discuss these recent results and identify common principles of spatial brain dynamics.
Neural codes in early sensory areas maximize fitness
It has generally been presumed that sensory information encoded by a nervous system should be as accurate as its biological limitations allow. However, perhaps counterintuitively, accurate representations of sensory signals do not necessarily maximize the organism's chances of survival. We show that neural codes that maximize reward expectation, rather than accurate sensory representations, account for retinal responses in insects, and retinotopically specific adaptive codes in humans. Thus, our results provide evidence that fitness-maximizing rules imposed by the environment are applied at the earliest stages of sensory processing.
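The distinction between accuracy-maximizing and reward-maximizing codes can be made concrete with a toy example (the stimulus prior, reward rule, and costs are all invented): for a one-bit code of a scalar stimulus, the threshold that maximizes expected reward sits far from the threshold an "accurate" information-preserving split would choose.

```python
import numpy as np

# Toy comparison of code objectives: a 1-bit encoder signals "large stimulus".
# An accuracy-oriented code splits the stimulus prior at its median; a
# reward-oriented code places the threshold where expected reward is highest.

rng = np.random.default_rng(5)
stim = rng.exponential(1.0, 10_000)        # stimulus prior (hypothetical)
reward = (stim > 2.0).astype(float)        # only large stimuli are rewarded

def expected_reward(threshold):
    detected = stim > threshold            # trials where the code fires
    hits = reward * detected               # reward collected on true detections
    false_alarm_cost = 0.1 * (1 - reward) * detected
    return np.mean(hits - false_alarm_cost)

thresholds = np.linspace(0.1, 4.0, 40)
best_for_reward = thresholds[np.argmax([expected_reward(t) for t in thresholds])]
median_split = np.median(stim)             # the information-balanced threshold
```

Here the reward-maximizing threshold lands near the behaviorally relevant boundary (around 2.0) rather than near the distribution's median (around 0.69), illustrating how an environment-imposed reward structure can pull an early code away from faithful representation.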
Cellular mechanisms that control state-dependent modulation of sensory processing and plasticity in the cortex
Brief Sensory Deprivation Triggers Cell Type-Specific Structural and Functional Plasticity in Olfactory Bulb Neurons
Can alterations in experience trigger different plastic modifications in neuronal structure and function, and if so, how do they integrate at the cellular level? To address this question, we interrogated circuitry in the mouse olfactory bulb responsible for the earliest steps in odor processing. We induced experience-dependent plasticity in mice of either sex by blocking one nostril for one day, a minimally invasive manipulation that leaves the sensory organ undamaged and is akin to the natural transient blockage suffered during common mild rhinal infections. We found that such brief sensory deprivation produced structural and functional plasticity in one highly specialized bulbar cell type: axon-bearing dopaminergic neurons in the glomerular layer. After 24 h naris occlusion, the axon initial segment (AIS) in bulbar dopaminergic neurons became significantly shorter, a structural modification that was also associated with a decrease in intrinsic excitability. These effects were specific to the AIS-positive dopaminergic subpopulation because no experience-dependent alterations in intrinsic excitability were observed in AIS-negative dopaminergic cells. Moreover, 24 h naris occlusion produced no structural changes at the AIS of bulbar excitatory neurons, mitral/tufted and external tufted cells, nor did it alter their intrinsic excitability. By targeting excitability in one specialized dopaminergic subpopulation, experience-dependent plasticity in early olfactory networks might act to fine-tune sensory processing in the face of continually fluctuating inputs. (https://www.jneurosci.org/content/41/10/2135)
Sensory Processing and Arousal in Neurodevelopmental Disorders
Learning Neurobiology with electric fish
Gymnotiform electric fish live in muddy, shallow waters near the shore, hiding in the dense filamentous roots of floating plants such as Eichhornia crassipes ("camalote"). They explore their surroundings using a series of electric pulses that serve as a self-emitted carrier of electrosensory signals. This carrier propagates at the speed of light through this spongiform habitat and is barely sensed by the lateral line of predators and prey. The emitted field polarizes the surroundings according to their difference in impedance with water, which in turn modifies the profile of transcutaneous currents considered as an electrosensory image. Using this system, pulse Gymnotiformes create an electrosensory bubble in which an object's location, impedance, size and other characteristics are discriminated and probably recognized. Although consciousness has not been demonstrated, cognitive functions such as volition, attention, and path integration have been shown. Here I will summarize different aspects of the electromotor-electrosensory loop of pulse Gymnotiformes. First, I will address how objects are polarized by a stereotyped but temporospatially complex electric field consisting of brief pulses emitted at regular intervals. This relies on complex electric organs quasi-periodically activated through an electromotor coordination system driven by a pacemaker in the medulla. Second, I will deal with the imaging mechanisms of pulse gymnotiform fish and the presence of two regions in the electrosensory field: a rostral region, where the field time course is coherent and the field vector direction is constant throughout the electric organ discharge, and a lateral region, where the field time course is site-specific and the field vector direction describes a stereotyped 3D trajectory. Third, I will describe the electrosensory mosaic and its characteristics.
Receptors and primary afferents correspond one to one, with subtypes responding optimally to the time course of the self-generated pulse with a characteristic train of spikes. While polarized objects in the rostral region project their electric images onto the perioral region, where electrosensory receptor density, subtype diversity and central projections are maximal, the images of objects on the side recruit a single type of scattered receptor. The rostral mosaic has therefore been likened to an electrosensory fovea and its receptive field referred to as the foveal field; the rest of the mosaic and field are referred to as peripheral. Finally, I will describe ongoing work on early processing structures. I will try to generate an integrated view, including anatomical and functional data obtained in vitro, in acute experiments, and from unitary recordings in freely moving fish. We have recently shown that these fish track allo-generated fields, and the virtual fields generated by nearby objects in the presence of self-generated fields, to explore the nearby environment. These data, together with the presence of a multimodal receptor mosaic at the cutaneous surface, particularly surrounding the mouth, and an important role of proprioception in early sensory processing, suggest the hypothesis that the active electrosensory system is part of a multimodal haptic sense.
Contextual modulation of cortical processing by a higher-order thalamic input
Higher-order thalamic nuclei have extensive connections with various cortical areas, yet their functional roles remain poorly understood. In our recent studies, using optogenetic and chemogenetic tools, we manipulated the activity of a higher-order thalamic nucleus, the lateral posterior nucleus (LP, analogous to the primate pulvinar nucleus), and its projections, and examined the effects on sensory discrimination and information processing in the cortex. We found an overall suppressive effect on layer 2/3 pyramidal neurons in the cortex, resulting in enhanced sensory feature selectivities. These mechanisms operate in the contextual modulation of cortical processing, as well as in cross-modality modulation of sensory processing.
Circuit mechanisms underlying the dynamic control of cortical processing by subcortical neuromodulators
Behavioral states such as arousal and attention can have profound effects on sensory processing, determining how – sometimes whether – a stimulus is processed. This state-dependence is believed to arise, at least in part, as a result of inputs to cortex from subcortical structures that release neuromodulators such as acetylcholine, noradrenaline, and serotonin, often non-synaptically. The mechanisms that underlie the interaction between these “wireless” non-synaptic signals and the “wired” cortical circuit are not well understood. Furthermore, neuromodulatory signaling is traditionally considered broad in its impact across cortex (within a species) and consistent in its form and function across species (at least in mammals). The work I will present approaches the challenge of understanding neuromodulatory action in the cortex from a number of angles: anatomy, physiology, pharmacology, and chemistry. The overarching goal of our effort is to elucidate the mechanisms behind local neuromodulation in the cortex of non-human primates, and to reveal differences in structure and function across cortical model systems.
Cholinergic regulation of learning in the olfactory system
In the olfactory system, cholinergic modulation has been associated with contrast modulation and changes in receptive fields in the olfactory bulb, as well as with the learning of odor associations in the olfactory cortex. Computational modeling and behavioral studies suggest that cholinergic modulation could improve sensory processing and learning while preventing proactive interference when task demands are high. However, how sensory inputs and/or learning regulate incoming modulation has not yet been elucidated. Here we use a computational model of the olfactory bulb, piriform cortex (PC) and horizontal limb of the diagonal band of Broca (HDB) to explore how olfactory learning could regulate cholinergic inputs to the system in a closed feedback loop. In our model, the novelty of an odor is reflected in the firing rates and sparseness of cortical neurons responding to that odor, and these firing rates can directly regulate learning in the system by modifying cholinergic inputs to it.
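The closed-loop idea can be sketched in a few lines: cortical responses to an odor set a novelty signal, the novelty signal sets the cholinergic level, and acetylcholine in turn gates plasticity. This is only a minimal illustration of the loop's logic, not the published model; the network sizes, the sparse top-k cortical code, the sigmoidal ACh readout and all parameters below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_pc = 50, 100                        # hypothetical bulb-input and piriform sizes
W = rng.normal(0.0, 0.1, (n_pc, n_in))      # bulb-to-cortex weights

def pc_response(odor, W, k=10):
    """Sparse cortical code: only the k most strongly driven units fire."""
    drive = W @ odor
    r = np.zeros_like(drive)
    top = np.argsort(drive)[-k:]
    r[top] = np.clip(drive[top], 0.0, None)
    return r

def ach_level(r, theta=20.0, gain=0.5):
    """Novelty proxy: weak, unselective cortical responses yield high ACh."""
    return 1.0 / (1.0 + np.exp(gain * (r.sum() - theta)))

odor = rng.random(n_in)
ach_trace = []
for _ in range(30):
    r = pc_response(odor, W)
    ach = ach_level(r)                      # high for novel odors, falls with learning
    ach_trace.append(ach)
    W += 0.05 * ach * np.outer(r, odor)     # ACh gates Hebbian learning
```

Repeated presentations of the same odor strengthen its cortical response, which drives the ACh signal down and shuts off further learning, so the loop self-limits once the odor is familiar.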
Circuit dysfunction and sensory processing in Fragile X Syndrome
To uncover the circuit-level alterations that underlie atypical sensory processing associated with autism, we have adopted a symptom-to-circuit approach in the Fmr1-/- mouse model of Fragile X syndrome (FXS). Using a go/no-go task and in vivo 2-photon calcium imaging, we find that impaired visual discrimination in Fmr1-/- mice correlates with marked deficits in orientation tuning of principal neurons in primary visual cortex, and with a decrease in the activity of parvalbumin (PV) interneurons. Restoring visually evoked activity in PV cells in Fmr1-/- mice with a chemogenetic (DREADD) strategy was sufficient to rescue their behavioural performance. Strikingly, human subjects with FXS exhibit impairments in visual discrimination similar to those of Fmr1-/- mice. These results suggest that manipulating inhibition may help sensory processing in FXS. More recently, we find that the ability of Fmr1-/- mice to perform the visual discrimination task is also drastically impaired in the presence of visual or auditory distractors, suggesting that sensory hypersensitivity may affect perceptual learning in autism.
Thalamic reticular nucleus dysfunction in neurodevelopmental disorders
The thalamic reticular nucleus (TRN), the major source of thalamic inhibition, is known to regulate thalamocortical interactions critical for sensory processing, attention and cognition. TRN dysfunction has been linked to sensory abnormalities, attention deficits and sleep disturbance across multiple neurodevelopmental disorders. Currently, little is known about the organizational principles underlying its divergent functions. In this talk, I will start with an example of how TRN dysfunction contributes to attention deficits and sleep disruption, using a mouse model of Ptchd1 mutation, which in humans causes a neurodevelopmental disorder with ASD. Building on these findings, we performed an integrative single-cell analysis linking molecular and electrophysiological features of the TRN to connectivity and systems-level function. We identified two subnetworks of the TRN with segregated anatomical structure, distinct electrophysiological properties, differential connections to the functionally distinct first-order and higher-order thalamic nuclei, and differential roles in regulating sleep. These studies provide a comprehensive atlas of TRN neurons at single-cell resolution and a foundation for studying the diverse functions and dysfunctions of the TRN. Finally, I will describe a newly developed, minimally invasive optogenetic tool for probing circuit function and dysfunction.
Multi-layer network learning in an electric fish
The electrosensory lobe (ELL) in mormyrid electric fish is a cerebellum-like structure that cancels the sensory effects of self-generated electric fields, allowing prey to be detected. Like the cerebellum, the ELL involves two stages of processing, analogous to the Purkinje cells and the cells of the deep cerebellar nuclei. Through the work of Curtis Bell and others, a model was previously developed to describe the output stage of the ELL, but the role of the Purkinje-cell analogs, the medium ganglion (MG) cells, in the circuit had remained mysterious. I will present a complete, multi-layer circuit description of the ELL, developed in collaboration with Nate Sawtell and Salomon Muller, that reveals a novel role for the MG cells. The resulting model provides an example of how a biological system solves well-known problems associated with learning in multi-layer networks, and it reveals that ELL circuitry is organized on the basis of learning rather than by the response properties of neurons.
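The core cancellation computation in ELL models, learning a "negative image" of the predictable, self-generated input from a corollary-discharge-driven temporal basis, can be sketched as below. This is a generic error-correcting stand-in (an LMS/delta rule) for the anti-Hebbian plasticity described in this literature, not the multi-layer model of the talk; the Gaussian delay-line basis, the sinusoidal reafference waveform, and all parameters are illustrative assumptions.

```python
import numpy as np

T, G = 100, 25                               # time bins per EOD cycle, granule-like cells
t = np.arange(T)
centers = np.linspace(0.0, T, G, endpoint=False)
d = np.abs(t[:, None] - centers[None, :])
d = np.minimum(d, T - d)                     # wrap delays around the cycle
basis = np.exp(-0.5 * (d / 4.0) ** 2)        # delay-line-like temporal basis functions

self_signal = np.sin(2 * np.pi * t / T)      # predictable reafference (illustrative)
w = np.zeros(G)                              # corollary-discharge synaptic weights

eta = 0.02
for _ in range(500):
    pred = basis @ w                         # prediction of the self-generated input
    out = self_signal - pred                 # output-stage response = residual error
    w += eta * basis.T @ out                 # learning builds a negative image
residual = np.max(np.abs(self_signal - basis @ w))
```

After learning, the prediction cancels the self-generated component, so the output stage is left to respond to unpredicted external signals, such as those produced by prey.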
Decoding of Chemical Information from Populations of Olfactory Neurons
Information is represented in the brain by the coordinated activity of populations of neurons. Recent large-scale neural recording methods, in combination with machine learning algorithms, are helping us understand how sensory processing and cognition emerge from neural population activity. This talk will explore the most popular machine learning methods used to extract meaningful low-dimensional representations from high-dimensional neural recordings. To illustrate the potential of these approaches, Pedro will present his research in which chemical information is decoded from the olfactory system of the mouse for technological applications. Pedro and co-researchers have successfully extracted odor identity and concentration from low-dimensional activity trajectories of olfactory receptor neurons. They have further developed a novel method to identify a shared latent space that allowed decoding of odor information across animals.
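The basic pipeline, reducing population activity to a few dimensions and then decoding stimulus identity there, can be sketched with PCA plus a nearest-centroid readout. This is a minimal illustration on synthetic data, not the decoding method of the talk; the numbers of odors, trials and neurons, the noise level, and the choice of decoder are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for population recordings: 3 odors x 40 trials x 200 neurons.
n_odors, n_trials, n_neurons = 3, 40, 200
templates = rng.normal(0.0, 1.0, (n_odors, n_neurons))       # per-odor response patterns
X = np.repeat(templates, n_trials, axis=0) \
    + rng.normal(0.0, 0.8, (n_odors * n_trials, n_neurons))  # trial-to-trial noise
y = np.repeat(np.arange(n_odors), n_trials)                  # odor labels

# PCA via SVD of the mean-centred data: keep a 3-D representation per trial.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

# Nearest-centroid decoding of odor identity in the low-dimensional space.
centroids = np.array([Z[y == k].mean(axis=0) for k in range(n_odors)])
pred = np.argmin(((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1), axis=1)
accuracy = (pred == y).mean()
```

Because the leading principal components capture the between-odor variance, a simple classifier in the reduced space suffices here; the same skeleton generalizes to richer decoders and to time-resolved trajectories.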
Algorithms and circuits for olfactory navigation in walking Drosophila
Olfactory navigation provides a tractable model for studying the circuit basis of sensori-motor transformations and goal-directed behaviour. Macroscopic organisms typically navigate in odor plumes that provide a noisy and uncertain signal about the location of an odor source. Work in many species has suggested that animals accomplish this task by combining temporal processing of dynamic odor information with an estimate of wind direction. Our lab has been using adult walking Drosophila to understand both the computational algorithms and the neural circuits that support navigation in a plume of attractive food odor. We developed a high-throughput paradigm to study behavioural responses to temporally controlled odor and wind stimuli. Using this paradigm we found that flies respond to a food odor (apple cider vinegar) with two behaviours: during the odor they run upwind, while after odor loss they perform a local search. A simple computational model based on these two responses is sufficient to replicate many aspects of fly behaviour in a natural turbulent plume. In ongoing work, we are seeking to identify the neural circuits and biophysical mechanisms that perform the computations delineated by our model. Using electrophysiology, we have identified mechanosensory neurons that compute wind direction from movements of the two antennae, and central mechanosensory neurons that encode wind direction and are involved in generating a stable downwind orientation. Using optogenetic activation, we have traced olfactory circuits capable of evoking upwind orientation and offset search from the periphery, through the mushroom body and lateral horn, to the central complex. Finally, we have used optogenetic activation, in combination with molecular manipulation of specific synapses, to localize temporal computations performed on the odor signal to olfactory transduction and to transmission at specific synapses.
Our work illustrates how the tools available in the fruit fly can be applied to dissect the mechanisms underlying a complex goal-directed behaviour.
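The two-response model described above, run upwind while odor is present, search locally after odor loss, can be sketched as a simple agent simulation. This is only an illustration of the model's logic under assumed parameters (the toy plume geometry, the turn-noise magnitude, and the step sizes are all hypothetical), not the lab's published model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy plume: a band of odor along the wind axis; wind blows along -x,
# so "upwind" means heading in the +x direction (all geometry hypothetical).
def in_plume(pos):
    return (abs(pos[1]) < 5.0) and (pos[0] < 60.0)

pos = np.array([0.0, 0.0])
heading = rng.uniform(-np.pi, np.pi)
xs = []
for t in range(200):
    if in_plume(pos):
        heading = 0.0                      # odor ON: upwind surge
    else:
        heading += rng.normal(0.0, 0.8)    # odor OFF: local search via random turns
    pos = pos + np.array([np.cos(heading), np.sin(heading)])
    xs.append(pos[0])
```

Even this two-rule agent reproduces the qualitative behaviour: it surges up the plume toward the source edge, loses the odor, and then reorients in place rather than dispersing, which is why the full model can replicate trajectories in turbulent plumes.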
Recurrence in temporal multisensory processing
Bernstein Conference 2024
VIP interneuron contributions to state-dependent sensory processing
COSYNE 2023
From Circuits to Behavior: Modeling Flexible Context-Driven Sensory Processing
COSYNE 2025
The geometry and role of sequential activity in sensory processing and perceptual generalization
COSYNE 2025
Cortical circuits for context dependent sensory processing
FENS Forum 2024
Developmental Cajal-Retzius cell death contributes to the maturation of cortical inhibition and somatosensory processing
FENS Forum 2024
Elucidating the mechanisms of altered cortical sensory processing in a mouse model of Huntington disease
FENS Forum 2024
Impact of inter-areal connectivity on sensory processing in a biophysically-detailed model of two interacting cortical areas
FENS Forum 2024
Layer 1 NDNF+ interneurons control bilateral sensory processing in an age- and layer-dependent manner
FENS Forum 2024
Sensory processing and membrane properties of external globus pallidus neurons in dopamine-depleted mice
FENS Forum 2024
Serotonergic neurons of the caudal raphe contribute to sensory processing during adaptive locomotion
FENS Forum 2024
Tactile sensory processing deficits in the Shank3 KO mouse model of autism spectrum disorder
FENS Forum 2024