
Multisensory Integration


Discover seminars, jobs, and research tagged with multisensory integration across World Wide.
14 curated items: 8 Seminars, 5 ePosters, 1 Position
Position · Computational Neuroscience

Dr. Jorge Mejias

University of Amsterdam
Amsterdam
Dec 5, 2025

The Cognitive and Systems Neuroscience Group is seeking a highly qualified and motivated candidate for a doctoral position in computational neuroscience, under the recently acquired NWA-ORC Consortium grant. The aim of this Consortium is to understand the fundamental principles used by our brains to integrate information in noisy environments and uncertain conditions, and then to implement those principles in next-generation algorithms for safe autonomous mobility. Within the Consortium, the main objective of the present PhD project is to develop a biologically realistic computational model of multi-area brain circuits involved in multisensory perception under uncertainty. The model will be constrained by state-of-the-art neuroanatomical data (such as realistic brain connectivity and multiple cell types), and we will identify and study the biological aspects of the model that contribute to optimal integration of sensory information (following Bayesian and other principles). Model predictions will then be compared to experimental data from collaborators. The project will be supervised by Dr. Jorge Mejias, head of the Computational Neuroscience Lab, and Prof. Dr. Cyriel Pennartz, head of the Cognitive & Systems Neuroscience group. The candidate will also collaborate closely with other computational neuroscientists, experimental neuroscientists, theoreticians and machine learning experts.

You are expected:
- to perform research on multisensory integration and perception using computational neuroscience methods;
- to review relevant literature and acquire knowledge of neurobiology, perception and computational neuroscience;
- to build biologically realistic multi-area computer models of cortical circuits for multisensory perception, and compare their predictions with experimental findings;
- to collaborate with other groups in the Consortium;
- to take part in the teaching effort of the group, including supervision of bachelor and master students;
- to write scientific manuscripts and a PhD thesis.

Our offer: a temporary contract for 38 hours per week for the duration of four years (the initial contract will be for a period of 18 months and, after a satisfactory evaluation, it will be extended to a total duration of four years). This should lead to a dissertation (PhD thesis). We will draft an educational plan that includes attendance of courses and (international) meetings. We also expect you to assist in teaching undergraduate and master students. Based on a full-time appointment (38 hours per week), the gross monthly salary will range from €2,434 in the first year to €3,111 (scale P) in the last year. This is exclusive of the 8% holiday allowance and 8.3% end-of-year bonus. A favourable tax agreement, the ‘30% ruling’, may apply to non-Dutch applicants. The Collective Labour Agreement of Dutch Universities is applicable.
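As a loose illustration of the kind of model the project description refers to (and emphatically not the Consortium's actual model), the sketch below simulates a minimal two-population firing-rate circuit that receives a visual and an auditory drive and settles into a combined response. The areas, connectivity, and time constants are hypothetical placeholders chosen only for illustration.

```python
# Minimal sketch of a rate-based "multi-area" circuit receiving visual and
# auditory drive. This is NOT the project's model; areas, weights and time
# constants are arbitrary placeholders chosen only for illustration.
import numpy as np

def simulate(visual_drive=1.0, auditory_drive=0.5, t_max=1.0, dt=1e-3):
    tau = 0.02                      # population time constant in seconds (assumed)
    W = np.array([[0.0, 0.4],       # recurrent coupling between the two areas (assumed)
                  [0.4, 0.0]])
    rates = np.zeros(2)             # [area receiving visual input, area receiving auditory input]
    history = []
    for _ in np.arange(0.0, t_max, dt):
        ext = np.array([visual_drive, auditory_drive])   # external sensory input
        drive = W @ rates + ext
        # threshold-linear (ReLU) rate dynamics integrated with Euler steps
        rates += dt / tau * (-rates + np.maximum(drive, 0.0))
        history.append(rates.copy())
    return np.array(history)

if __name__ == "__main__":
    traces = simulate()
    print("steady-state rates:", traces[-1])   # combined response of the two areas
```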

Seminar · Neuroscience · Recording

The Role of Spatial and Contextual Relations of Real-World Objects in Interval Timing

Rania Tachmatzidou
Panteion University
Jan 28, 2024

In the real world, object arrangement follows a number of rules. Some of these rules pertain to the spatial relations between objects and scenes (i.e., syntactic rules) and others to the contextual relations (i.e., semantic rules). Research has shown that violating semantic rules influences interval timing, with the duration of scenes containing such violations being overestimated compared to scenes with no violations. However, no study has yet investigated whether semantic and syntactic violations affect timing in the same way. Furthermore, it is unclear whether the effect of scene violations on timing is due to attentional or other cognitive accounts. Using an oddball paradigm and real-world scenes with or without semantic and syntactic violations, we conducted two experiments examining whether time dilation would be obtained in the presence of either type of scene violation and what role attention plays in any such effect. Our results from Experiment 1 showed that time dilation indeed occurred in the presence of syntactic violations, while time compression was observed for semantic violations. In Experiment 2, we further investigated whether these estimations were driven by attentional accounts by manipulating the contrast of the target objects. The results showed that increased contrast led to duration overestimation for both semantic and syntactic oddballs. Together, our results indicate that scene violations affect timing differently because the violations are processed differently, and, moreover, their effect on timing appears to be sensitive to attentional manipulations such as target contrast.

Seminar · Neuroscience · Recording

Measures and models of multisensory integration in reaction times

Hans Colonius
Oldenburg University
Jan 17, 2024

First, a new measure of multisensory integration (MI) for reaction times is proposed that takes the entire RT distribution into account. Second, we present some recent developments in TWIN (time-window-of-integration) modeling, including a new proposal for the sound-induced flash illusion (SIFI).
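The abstract does not spell out the new distribution-level measure itself. As background only, the sketch below computes the classic race-model (Miller) bound, the usual distribution-level benchmark against which multisensory facilitation of reaction times is assessed; the RT samples are simulated and purely illustrative, not data from the talk.

```python
# Background sketch: comparing an audiovisual RT distribution against the
# classic race-model (Miller) bound. RT samples are simulated, not real data,
# and this is not the new measure proposed in the talk.
import numpy as np

rng = np.random.default_rng(0)
rt_visual = rng.gamma(shape=8.0, scale=40.0, size=2000) + 150.0     # ms, made up
rt_auditory = rng.gamma(shape=8.0, scale=35.0, size=2000) + 140.0   # ms, made up
rt_audiovisual = np.minimum(rt_visual, rt_auditory) - 15.0          # made-up facilitation

def ecdf(samples, t):
    """Empirical cumulative distribution function evaluated at times t."""
    return np.searchsorted(np.sort(samples), t, side="right") / samples.size

t = np.linspace(150.0, 600.0, 200)                                   # time points in ms
race_bound = np.minimum(ecdf(rt_visual, t) + ecdf(rt_auditory, t), 1.0)
violation = ecdf(rt_audiovisual, t) - race_bound                     # > 0 violates the race model
print("max race-model violation:", violation.max())
```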

Seminar · Neuroscience · Recording

Multisensory integration in peripersonal space (PPS) for action, perception and consciousness

Andrea Serino
University Hospital of Lausanne
Nov 1, 2023

Note the later time in the USA!

Seminar · Neuroscience · Recording

Hierarchical transformation of visual event timing representations in the human brain: response dynamics in early visual cortex and timing-tuned responses in association cortices

Evi Hendrikx
Utrecht University
Sep 27, 2022

Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. For example, this allows us to follow and interact with the precise timing of speech and sports. Here we investigate how visual event timing is represented and transformed across the brain’s hierarchy: from sensory processing areas, through multisensory integration areas, to frontal action planning areas. We hypothesized that the dynamics of neural responses to sensory events in sensory processing areas allow event timing representations to be derived. This would allow higher-level processes such as multisensory integration and action planning to use sensory timing information without the need for specialized central pacemakers or processes. Using 7T fMRI and neural model-based analyses, we found responses that monotonically increase in amplitude with visual event duration and frequency, becoming increasingly clear from primary visual cortex to lateral occipital visual field maps. Beginning in area MT/V5, we found a gradual transition from monotonic to tuned responses, with response amplitudes peaking at different event timings in different recording sites. While monotonic response components were limited to the retinotopic location of the visual stimulus, timing-tuned response components were independent of the recording sites' preferred visual field positions. These tuned responses formed a network of topographically organized timing maps in superior parietal, postcentral and frontal areas. From anterior to posterior timing maps, multiple events were increasingly integrated, response selectivity narrowed, and responses focused increasingly on the middle of the presented timing range. These results suggest that responses to event timing are transformed from the human brain’s sensory areas to the association cortices, with the event’s temporal properties being increasingly abstracted from the response dynamics and locations of early sensory processing. The resulting abstracted representation of event timing is then propagated through areas implicated in multisensory integration and action planning.
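As a rough illustration of the two response components contrasted in this abstract, the toy functions below generate a monotonically increasing response and a timing-tuned (Gaussian) response to event duration. The functional forms and parameters are assumptions for illustration only, not the study's fitted neural response models.

```python
# Toy illustration of the two response profiles contrasted above: a response
# that grows monotonically with event duration versus one tuned to a preferred
# duration. Functional forms and parameters are illustrative assumptions only.
import numpy as np

durations = np.linspace(0.05, 1.0, 50)          # event durations in seconds

def monotonic_response(duration, gain=1.0):
    """Response amplitude increases (and saturates) with event duration."""
    return gain * (1.0 - np.exp(-duration / 0.3))

def tuned_response(duration, preferred=0.4, width=0.15, gain=1.0):
    """Response peaks at a preferred duration and falls off on either side."""
    return gain * np.exp(-0.5 * ((duration - preferred) / width) ** 2)

print("monotonic response is largest at the longest duration:",
      monotonic_response(durations).argmax() == len(durations) - 1)
print("tuned response peaks near its preferred duration (s):",
      durations[tuned_response(durations).argmax()])
```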

Seminar · Neuroscience

From natural scene statistics to multisensory integration: experiments, models and applications

Cesare Parise
Oculus VR
Feb 8, 2022

To efficiently process sensory information, the brain relies on statistical regularities in the input. While generally improving the reliability of sensory estimates, this strategy also induces perceptual illusions that help reveal the underlying computational principles. Focusing on auditory and visual perception, in my talk I will describe how the brain exploits statistical regularities within and across the senses for the perception of space and time and for multisensory integration. In particular, I will show how results from a series of psychophysical experiments can be interpreted in the light of Bayesian Decision Theory, and I will demonstrate how such canonical computations can be implemented in simple and biologically plausible neural circuits. Finally, I will show how such principles of sensory information processing can be leveraged in virtual and augmented reality to overcome display limitations and expand human perception.
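As a concrete reference point for the Bayesian framing mentioned above, the snippet below implements textbook reliability-weighted (maximum-likelihood) combination of a visual and an auditory estimate. The numbers are invented, and this is a standard formulation rather than the specific models presented in the talk.

```python
# Textbook reliability-weighted (maximum-likelihood) cue combination for two
# Gaussian cues. Illustrative of the Bayesian framing above; the numbers are
# invented and this is not the speaker's specific model.
import numpy as np

def fuse(mu_v, sigma_v, mu_a, sigma_a):
    """Combine a visual and an auditory estimate weighted by their reliabilities."""
    rel_v, rel_a = 1.0 / sigma_v**2, 1.0 / sigma_a**2   # reliability = inverse variance
    w_v = rel_v / (rel_v + rel_a)
    mu_fused = w_v * mu_v + (1.0 - w_v) * mu_a
    sigma_fused = np.sqrt(1.0 / (rel_v + rel_a))        # fused estimate is more reliable
    return mu_fused, sigma_fused

# Example: the visual estimate is more reliable, so it dominates the fused percept.
print(fuse(mu_v=10.0, sigma_v=1.0, mu_a=14.0, sigma_a=3.0))
```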

Seminar · Neuroscience · Recording

What happens to our ability to perceive multisensory information as we age?

Fiona Newell
Trinity College Dublin
Jan 12, 2022

Our ability to perceive the world around us can be affected by a number of factors, including the nature of the external information, prior experience of the environment, and the integrity of the underlying perceptual system. A particular challenge for the brain is to maintain a coherent perception from information encoded by the peripheral sensory organs, whose function is affected by typical developmental changes across the lifespan. Yet how the brain adapts to the maturation of the senses, as well as to experiential changes in the multisensory environment, is poorly understood. Over the past few years, we have used a range of multisensory tasks to investigate the effect of ageing on the brain’s ability to merge sensory inputs. In particular, we have embedded an audio-visual task based on the sound-induced flash illusion (SIFI) into a large-scale, longitudinal study of ageing. Our findings support the idea that the temporal binding window (TBW) is modulated by age and reveal important individual differences in this TBW that may have clinical implications. However, our investigations also suggest that the TBW is experience-dependent, with evidence for both long- and short-term behavioural plasticity. An overview of these findings, including recent evidence on how multisensory integration may be associated with higher-order functions, will be discussed.
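As a rough sketch of how a temporal binding window is often quantified in SIFI-style experiments (not the specific analysis used in this study), the code below fits a Gaussian window to illusion rates as a function of audio-visual asynchrony and reports its width. The data points are invented.

```python
# Rough sketch of quantifying a temporal binding window (TBW): fit a Gaussian
# to the proportion of illusion reports as a function of audio-visual delay.
# The data points are invented and this is not the study's actual analysis.
import numpy as np
from scipy.optimize import curve_fit

soa_ms = np.array([-300, -200, -100, 0, 100, 200, 300])             # audio-visual delay (ms)
p_illusion = np.array([0.10, 0.25, 0.60, 0.85, 0.65, 0.30, 0.12])   # made-up illusion rates

def gaussian_window(soa, amplitude, center, width, baseline):
    """Illusion probability as a Gaussian function of audio-visual asynchrony."""
    return baseline + amplitude * np.exp(-0.5 * ((soa - center) / width) ** 2)

params, _ = curve_fit(gaussian_window, soa_ms, p_illusion,
                      p0=[0.8, 0.0, 100.0, 0.1])
amplitude, center, width, baseline = params
# A common summary statistic: the window's full width at half maximum (FWHM).
print("TBW width (FWHM, ms):", 2.355 * width)
```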

Seminar · Neuroscience · Recording

Multisensory Integration: Development, Plasticity, and Translational Applications

Benjamin A. Rowland
Wake Forest School of Medicine
Sep 20, 2021

ePoster

Critical Learning Periods for Multisensory Integration in Deep Networks

Michael Kleinman, Alessandro Achille, Stefano Soatto

COSYNE 2023

ePoster

Examining multisensory integration in weakly electric fish through manipulating sensory salience

Emine Ceren Rutbil, Gurkan Celik, Alp Demirel, Emin Yusuf Aydin, Ismail Uyanik

FENS Forum 2024

ePoster

Modality specificity of multisensory integration and decision-making in frontal cortex and superior colliculus

Alice Despatin, Irene Lenzi, Severin Graff, Kerstin Doerenkamp, Gerion Nabbefeld, Maria Laura Pérez, Anoushka Jain, Sonja Grün, Björn Kampa, Simon Musall

FENS Forum 2024

ePoster

Neuronal circuit for multisensory integration in higher visual cortex

Mio Inoue, Yuta Tanisumi, Daisuke Kato, Nanami Kawamura, Akari Hashimoto, Ikuko Takeda, Etsuko Tarusawa, Hiroaki Wake

FENS Forum 2024

ePoster

Postural constraints affect the optimal weighting of multisensory integration during visuo-manual coordination

Célie Dézé, Clémence Daleux, Mathieu Beraneck, Joseph McIntyre, Michele Tagliabue

FENS Forum 2024