
Sensory Integration

Topic spotlight · World Wide

sensory integration

Discover seminars, jobs, and research tagged with sensory integration across World Wide.
22 curated items: 14 seminars, 8 ePosters
Updated almost 2 years ago

22 results
Seminar · Neuroscience · Recording

The Role of Spatial and Contextual Relations of Real-World Objects in Interval Timing

Rania Tachmatzidou
Panteion University
Jan 28, 2024

In the real world, object arrangement follows a number of rules. Some of these rules pertain to the spatial relations between objects and scenes (i.e., syntactic rules), and others to the contextual relations between them (i.e., semantic rules). Research has shown that violation of semantic rules influences interval timing, with the duration of scenes containing such violations being overestimated compared to scenes with no violations. However, no study has yet investigated whether semantic and syntactic violations affect timing in the same way. Furthermore, it is unclear whether the effect of scene violations on timing reflects attentional or other cognitive accounts. Using an oddball paradigm and real-world scenes with or without semantic and syntactic violations, we conducted two experiments examining whether time dilation is obtained in the presence of either type of scene violation and what role attention plays in any such effect. Our results from Experiment 1 showed that time dilation indeed occurred in the presence of syntactic violations, while time compression was observed for semantic violations. In Experiment 2, we investigated whether these estimates were driven by attention by manipulating the contrast of the target objects. The results showed that increased contrast led to duration overestimation for both semantic and syntactic oddballs. Together, our results indicate that scene violations differentially affect timing due to differences in how the violations are processed and, moreover, that their effect on timing is sensitive to attentional manipulations such as target contrast.

Seminar · Neuroscience · Recording

Measures and models of multisensory integration in reaction times

Hans Colonius
Oldenburg University
Jan 17, 2024

First, a new measure of multisensory integration (MI) for reaction times is proposed that takes the entire RT distribution into account. Second, we present some recent developments in time-window-of-integration (TWIN) modeling, including a new proposal for the sound-induced flash illusion (SIFI).
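
As a concrete point of reference for distribution-level RT measures, the sketch below computes the classic comparison against Miller's race-model bound, which newer measures build on; it is not the specific measure proposed in the talk, and the reaction-time samples and function names are invented for illustration.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times, evaluated at times t."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t, side="right") / rts.size

def race_model_violation(rt_a, rt_v, rt_av, n_grid=200):
    """Area by which the audiovisual RT CDF exceeds Miller's race-model bound
    F_AV(t) <= F_A(t) + F_V(t); a positive area indicates facilitation beyond
    what independent unisensory channels can explain."""
    lo = min(np.min(rt_a), np.min(rt_v), np.min(rt_av))
    hi = max(np.max(rt_a), np.max(rt_v), np.max(rt_av))
    t = np.linspace(lo, hi, n_grid)
    bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
    violation = np.clip(ecdf(rt_av, t) - bound, 0.0, None)
    # trapezoidal integration of the violation area
    return float(np.sum(0.5 * (violation[:-1] + violation[1:]) * np.diff(t)))

# Toy data: audiovisual (redundant) targets are answered faster than either unimodal condition
rng = np.random.default_rng(1)
rt_a = rng.normal(320, 40, 500)   # auditory-only RTs (ms)
rt_v = rng.normal(340, 40, 500)   # visual-only RTs (ms)
rt_av = rng.normal(260, 35, 500)  # audiovisual RTs (ms)
print(race_model_violation(rt_a, rt_v, rt_av))
```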

Seminar · Neuroscience · Recording

Multisensory integration in peripersonal space (PPS) for action, perception and consciousness

Andrea Serino
University Hospital of Lausanne
Nov 1, 2023

Seminar · Neuroscience · Recording

Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception

Ashley Schormans
BrainsCAN, Western University, Canada
Sep 26, 2023

To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive the stimuli as a single event, even though they occurred separately. In recent years, our lab and others have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models demonstrate metrics that are consistent with humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in their perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model to study the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
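
For readers unfamiliar with the behavioural tasks mentioned above, the sketch below shows how a point of subjective simultaneity (PSS) and a temporal-acuity summary (JND) are typically estimated from temporal order judgments by fitting a cumulative Gaussian; the SOAs and response proportions are invented for illustration and are not data from this study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus onset asynchronies in ms (negative = sound leads, positive = flash leads)
soa = np.array([-200, -100, -50, -25, 0, 25, 50, 100, 200], dtype=float)
# Proportion of "visual first" judgments at each SOA (illustrative values only)
p_visual_first = np.array([0.03, 0.10, 0.24, 0.40, 0.55, 0.69, 0.80, 0.93, 0.98])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: pss is the 50% point, sigma indexes temporal acuity."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_visual_first, p0=(0.0, 50.0))
jnd = norm.ppf(0.75) * sigma  # 75% threshold, a common summary of temporal acuity
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```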

Seminar · Neuroscience

Inter-tissue signals modify food-seeking behavior in C. elegans

Sreekanth Chalasani
Salk Institute for Biological Studies
Oct 10, 2022

Animals modify their behavioral outputs in response to changes in external and internal environments. We use the nematode C. elegans to probe the pathways linking changes in internal states, like hunger, with behavior. We find that acute food deprivation alters the localization of two transcription factors, likely releasing an insulin-like peptide from the intestine, which in turn modifies chemosensory neurons and alters behavior. These results present a model for how inter-tissue signals generate flexible behaviors via gut-brain signaling.

Seminar · Neuroscience · Recording

Hierarchical transformation of visual event timing representations in the human brain: response dynamics in early visual cortex and timing-tuned responses in association cortices

Evi Hendrikx
Utrecht University
Sep 27, 2022

Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. For example, this allows us to follow and interact with the precise timing of speech and sports. Here we investigate how visual event timing is represented and transformed across the brain’s hierarchy: from sensory processing areas, through multisensory integration areas, to frontal action planning areas. We hypothesized that the dynamics of neural responses to sensory events in sensory processing areas allows derivation of event timing representations. This would allow higher-level processes such as multisensory integration and action planning to use sensory timing information, without the need for specialized central pacemakers or processes. Using 7T fMRI and neural model-based analyses, we found responses that monotonically increase in amplitude with visual event duration and frequency, becoming increasingly clear from primary visual cortex to lateral occipital visual field maps. Beginning in area MT/V5, we found a gradual transition from monotonic to tuned responses, with response amplitudes peaking at different event timings in different recording sites. While monotonic response components were limited to the retinotopic location of the visual stimulus, timing-tuned response components were independent of the recording sites' preferred visual field positions. These tuned responses formed a network of topographically organized timing maps in superior parietal, postcentral and frontal areas. From anterior to posterior timing maps, multiple events were increasingly integrated, response selectivity narrowed, and responses focused increasingly on the middle of the presented timing range. These results suggest that responses to event timing are transformed from the human brain’s sensory areas to the association cortices, with the event’s temporal properties being increasingly abstracted from the response dynamics and locations of early sensory processing. The resulting abstracted representation of event timing is then propagated through areas implicated in multisensory integration and action planning.
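
To make the contrast between the two response components concrete, the toy functions below sketch a monotonic profile (amplitude grows with event duration) and a timing-tuned profile (amplitude peaks at a preferred duration); the functional forms and parameters are assumptions for illustration, not the neural models fitted in the study.

```python
import numpy as np

durations = np.linspace(0.05, 1.0, 8)  # event durations in seconds (illustrative grid)

def monotonic_response(duration, slope=1.0, baseline=0.1):
    """Amplitude increases monotonically (here compressively) with event duration."""
    return baseline + slope * np.log1p(duration)

def tuned_response(duration, preferred=0.4, width=0.15, gain=1.0):
    """Amplitude peaks at a preferred duration, as in timing-tuned populations."""
    return gain * np.exp(-0.5 * ((duration - preferred) / width) ** 2)

print(np.round(monotonic_response(durations), 2))
print(np.round(tuned_response(durations), 2))
```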

Seminar · Neuroscience · Recording

Transcriptional controls over projection neuron fate diversity

Esther Klingler
Jabaudon lab, University of Geneva
Jun 28, 2022

The cerebral cortex is the most evolved structure of the brain and the site for higher cognitive functions. It consists of six layers, each composed of specific types of neurons. Interconnectivity between cortical areas is critical for sensory integration and sensorimotor transformation. Inter-areal cortical projection neurons are located in all cortical layers and form a heterogeneous population that sends its axons across cortical areas, both within and across hemispheres. How this diversity emerges during development remains largely unknown. Here, we address this question by linking the connectome and transcriptome of developing cortical projection neurons and show distinct maturation paces in neurons with distinct projections, which correlate with the sequential development of sensory and motor functions during the postnatal period.

Seminar · Neuroscience

From natural scene statistics to multisensory integration: experiments, models and applications

Cesare Parise
Oculus VR
Feb 8, 2022

To efficiently process sensory information, the brain relies on statistical regularities in the input. While generally improving the reliability of sensory estimates, this strategy also induces perceptual illusions that help reveal the underlying computational principles. Focusing on auditory and visual perception, in my talk I will describe how the brain exploits statistical regularities within and across the senses for the perception of space and time and for multisensory integration. In particular, I will show how results from a series of psychophysical experiments can be interpreted in the light of Bayesian Decision Theory, and I will demonstrate how such canonical computations can be implemented in simple, biologically plausible neural circuits. Finally, I will show how such principles of sensory information processing can be leveraged in virtual and augmented reality to overcome display limitations and expand human perception.

Seminar · Neuroscience · Recording

The vestibular system: a multimodal sense

Elisa Raffaella Ferre
Birkbeck, University of London
Jan 19, 2022

The vestibular system plays an essential role in everyday life, contributing to a surprising range of functions from reflexes to the highest levels of perception and consciousness. Three orthogonal semicircular canals detect rotational movements of the head, and the otolith organs sense translational acceleration, including the gravitational vertical. But how are vestibular signals encoded by the human brain? We have recently combined innovative methods for eliciting virtual rotation and translation sensations with fMRI to identify brain areas representing vestibular signals. We have identified a bilateral inferior parietal, ventral premotor/anterior insula and prefrontal network and confirmed that these areas reliably possess information about rotation and translation. We have also investigated how vestibular signals are integrated with other sensory cues to generate our perception of the external environment.

Seminar · Neuroscience · Recording

What happens to our ability to perceive multisensory information as we age?

Fiona Newell
Trinity College Dublin
Jan 12, 2022

Our ability to perceive the world around us can be affected by a number of factors, including the nature of the external information, prior experience of the environment, and the integrity of the underlying perceptual system. A particular challenge for the brain is to maintain a coherent perception from information encoded by the peripheral sensory organs, whose function is affected by typical developmental changes across the lifespan. Yet how the brain adapts to the maturation of the senses, as well as to experiential changes in the multisensory environment, is poorly understood. Over the past few years, we have used a range of multisensory tasks to investigate the effect of ageing on the brain's ability to merge sensory inputs. In particular, we have embedded an audio-visual task based on the sound-induced flash illusion (SIFI) into a large-scale, longitudinal study of ageing. Our findings support the idea that the temporal binding window (TBW) is modulated by age and reveal important individual differences in this TBW that may have clinical implications. However, our investigations also suggest the TBW is experience-dependent, with evidence for both long- and short-term behavioural plasticity. An overview of these findings, including recent evidence on how multisensory integration may be associated with higher-order functions, will be discussed.
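
As background on how a temporal binding window is commonly quantified in SIFI-type experiments, the sketch below fits a Gaussian temporal window to illusion rates as a function of beep-flash asynchrony; the asynchronies and illusion rates are invented for illustration and are unrelated to the longitudinal dataset described above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Beep-flash asynchrony in ms and proportion of illusory double-flash reports (illustrative)
soa = np.array([25, 50, 75, 100, 150, 200, 300], dtype=float)
p_illusion = np.array([0.85, 0.80, 0.70, 0.55, 0.35, 0.18, 0.08])

def gaussian_window(soa, amplitude, width, baseline):
    """Illusion probability decays as a Gaussian of asynchrony; 'width' indexes the TBW."""
    return baseline + amplitude * np.exp(-0.5 * (soa / width) ** 2)

(amp, width, base), _ = curve_fit(gaussian_window, soa, p_illusion, p0=(0.8, 100.0, 0.05))
print(f"Estimated temporal binding window width: {width:.0f} ms")
```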

Seminar · Neuroscience · Recording

Multisensory Integration: Development, Plasticity, and Translational Applications

Benjamin A. Rowland
Wake Forest School of Medicine
Sep 20, 2021

Seminar · Neuroscience

Understanding "why": The role of causality in cognition

Tobias Gerstenberg
Stanford University
Apr 27, 2021

Humans have a remarkable ability to figure out what happened and why. In this talk, I will shed light on this ability from multiple angles. I will present a computational framework for modeling causal explanations in terms of counterfactual simulations, and several lines of experiments testing this framework in the domain of intuitive physics. The model predicts people's causal judgments about a variety of physical scenes, including dynamic collision events, complex situations that involve multiple causes, omissions as causes, and causal responsibility for a system's stability. It also captures the cognitive processes underlying these judgments as revealed by spontaneous eye-movements. More recently, we have applied our computational framework to explain multisensory integration. I will show how people's inferences about what happened are well-accounted for by a model that integrates visual and auditory evidence through approximate physical simulations.

Seminar · Neuroscience

Multisensory Perception: Behaviour, Computations and Neural Mechanisms

Uta Noppeney
Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
Jan 17, 2021

Our senses are constantly bombarded with a myriad of diverse signals. Transforming this sensory cacophony into a coherent percept of our environment relies on solving two computational challenges: First, we need to solve the causal inference problem, deciding whether signals come from a common cause and thus should be integrated, or come from different sources and be treated independently. Second, when there is a common cause, we should integrate signals across the senses weighted in proportion to their sensory reliabilities. I discuss recent research at the behavioural, computational and neural systems level that investigates how the brain addresses these two computational challenges in multisensory perception.
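
The second computation, reliability-weighted integration, has a standard closed form: each cue is weighted in proportion to its reliability (the inverse of its variance). A minimal sketch with made-up auditory and visual location estimates:

```python
def fuse(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted (maximum-likelihood) fusion of two cues.
    Weights are proportional to each cue's reliability, 1/variance; this step
    applies only once the signals are inferred to share a common cause."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    mu_fused = w_a * mu_a + (1 - w_a) * mu_v
    var_fused = 1 / (1 / var_a + 1 / var_v)  # fused estimate is more reliable than either cue
    return mu_fused, var_fused

# Illustrative values: a noisy auditory estimate and a sharper visual estimate (degrees)
print(fuse(mu_a=10.0, var_a=16.0, mu_v=6.0, var_v=4.0))  # pulled toward the visual cue
```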

Seminar · Neuroscience · Recording

A Rare Visuospatial Disorder

Aimee Dollman
University of Cape Town
Aug 25, 2020

Cases with visuospatial abnormalities provide opportunities for understanding the underlying cognitive mechanisms. Three cases of visual mirror-reversal have been reported: AH (McCloskey, 2009), TM (McCloskey, Valtonen, & Sherman, 2006) and PR (Pflugshaupt et al., 2007). This research reports a fourth case, BS, who has focal occipital cortical dysgenesis and displays highly unusual visuospatial abnormalities. BS initially produced mirror-reversal errors similar to those of AH, who, like BS, showed a selective developmental deficit. Extensive examination of BS revealed phenomena such as: mirror-reversal errors (sometimes affecting only parts of the visual fields) in both horizontal and vertical planes; subjective representation of visual objects and words in distinct left and right visual fields; subjective duplication of objects of visual attention (not due to diplopia); uncertainty regarding the canonical upright orientation of everyday objects; mirror reversals during saccadic eye movements on oculomotor tasks; and failure to integrate visual with other sensory inputs (e.g., feeling themselves moving backwards when visual information shows they are moving forward). Fewer errors are produced under certain visual conditions. These and other findings have led the researchers to conclude that BS draws upon a subjective representation of visual space that is structured phenomenally much as it is anatomically in early visual cortex (i.e., rotated through 180 degrees, split into left and right fields, etc.). Despite this, BS functions remarkably well in everyday life, apparently due to extensive compensatory mechanisms deployed at higher (executive) processing levels beyond the visual modality.

ePoster

Neuronal Heterogeneity Enhances Sensory Integration and Processing

Arash Golmohammadi, Christian Tetzlaff

Bernstein Conference 2024

ePoster

Critical Learning Periods for Multisensory Integration in Deep Networks

Michael Kleinman, Alessandro Achille, Stefano Soatto

COSYNE 2023

ePoster

Effects of multimodal sensory integration in D. melanogaster optokinetic response

Giulio Maria Menti, Matteo Bruzzone, Patrizia Visentin, Andrea Drago, Marco Dal Maschio, Aram Megighian

FENS Forum 2024

ePoster

Examining multisensory integration in weakly electric fish through manipulating sensory salience

Emine Ceren Rutbil, Gurkan Celik, Alp Demirel, Emin Yusuf Aydin, Ismail Uyanik

FENS Forum 2024

ePoster

Modality specificity of multisensory integration and decision-making in frontal cortex and superior colliculus

Alice Despatin, Irene Lenzi, Severin Graff, Kerstin Doerenkamp, Gerion Nabbefeld, Maria Laura Pérez, Anoushka Jain, Sonja Grün, Björn Kampa, Simon Musall

FENS Forum 2024

ePoster

Neuronal circuit for multisensory integration in higher visual cortex

Mio Inoue, Yuta Tanisumi, Daisuke Kato, Nanami Kawamura, Akari Hashimoto, Ikuko Takeda, Etsuko Tarusawa, Hiroaki Wake

FENS Forum 2024

ePoster

Postural constraints affect the optimal weighting of multisensory integration during visuo-manual coordination

Célie Dézé, Clémence Daleux, Mathieu Beraneck, Joseph McIntyre, Michele Tagliabue

FENS Forum 2024

ePoster

Sensory somatotopy of flow stimuli in a sensory integration center in the zebrafish hindbrain

Elias Lunsford, Claire Wyart

FENS Forum 2024