Multisensory

Topic spotlight · World Wide

Discover seminars, jobs, and research tagged with multisensory across World Wide.

68 curated items: 52 seminars, 16 ePosters · Updated 7 months ago

Seminar · Neuroscience · Recording

Restoring Sight to the Blind: Effects of Structural and Functional Plasticity

Noelle Stiles
Rutgers University
May 21, 2025

Visual restoration after decades of blindness is now becoming possible by means of retinal and cortical prostheses, as well as emerging stem cell and gene therapeutic approaches. After restoring visual perception, however, a key question remains. Are there optimal means and methods for retraining the visual cortex to process visual inputs, and for learning or relearning to “see”? Up to this point, it has been largely assumed that if the sensory loss is visual, then the rehabilitation focus should also be primarily visual. However, the other senses play a key role in visual rehabilitation due to the plastic repurposing of visual cortex during blindness by audition and somatosensation, and also to the reintegration of restored vision with the other senses. I will present multisensory neuroimaging results, cortical thickness changes, as well as behavioral outcomes for patients with Retinitis Pigmentosa (RP), which causes blindness by destroying photoreceptors in the retina. These patients have had their vision partially restored by the implantation of a retinal prosthesis, which electrically stimulates still viable retinal ganglion cells in the eye. Our multisensory and structural neuroimaging and behavioral results suggest a new, holistic concept of visual rehabilitation that leverages rather than neglects audition, somatosensation, and other sensory modalities.

Seminar · Neuroscience · Recording

Multisensory perception in the metaverse

Polly Dalton
Royal Holloway, University of London
May 7, 2025

Seminar · Neuroscience · Recording

Multisensory computations underlying flavor perception and food choice

Joost Maier
Wake Forest School of Medicine
Apr 16, 2025

Seminar · Neuroscience

Where are you Moving? Assessing Precision, Accuracy, and Temporal Dynamics in Multisensory Heading Perception Using Continuous Psychophysics

Björn Jörges
York University
Feb 5, 2025

Seminar · Neuroscience · Recording

Time perception in film viewing as a function of film editing

Lydia Liapi
Panteion University
Mar 26, 2024

Filmmakers and editors have empirically developed techniques to ensure the spatiotemporal continuity of a film's narration. In terms of time, editing techniques (e.g., elliptical, overlapping, or cut minimization) allow for the manipulation of the perceived duration of events as they unfold on screen. More specifically, a scene can be edited to be time-compressed, expanded, or real-time in terms of its perceived duration. Despite the consistent application of these techniques in filmmaking, their perceptual outcomes have not been experimentally validated. Given that viewing a film is experienced as a precise simulation of the physical world, the use of cinematic material to examine aspects of time perception allows for experimentation with high ecological validity, while filmmakers gain more insight into how empirically developed techniques influence viewers' time percept. Here, we investigated how such time-manipulation techniques affect a scene's perceived duration. Specifically, we presented videos depicting different actions (e.g., a woman talking on the phone), edited according to the techniques applied for temporal manipulation, and asked participants to make verbal estimates of the presented scenes' perceived durations. Analysis of the data revealed that the duration of expanded scenes was significantly overestimated compared to that of compressed and real-time scenes, as was the duration of real-time scenes compared to that of compressed scenes. Our results therefore validate the empirical techniques applied for the modulation of a scene's perceived duration. We also found that scene type and editing technique interacted on time estimates as a function of the characteristics and action of the scene presented. These findings thus add to the discussion that the content and characteristics of a scene, along with the editing technique applied, can also modulate perceived duration. Our findings are discussed in relation to current timing frameworks, as well as attentional saliency algorithms measuring the visual saliency of the presented stimuli.

Seminar · Neuroscience · Recording

The Role of Spatial and Contextual Relations of Real-World Objects in Interval Timing

Rania Tachmatzidou
Panteion University
Jan 28, 2024

In the real world, object arrangement follows a number of rules. Some rules pertain to the spatial relations between objects and scenes (i.e., syntactic rules) and others to the contextual relations (i.e., semantic rules). Research has shown that violation of semantic rules influences interval timing, with the duration of scenes containing such violations being overestimated compared to scenes with no violations. However, no study has yet investigated whether semantic and syntactic violations affect timing in the same way. Furthermore, it is unclear whether the effect of scene violations on timing is due to attentional or other cognitive accounts. Using an oddball paradigm and real-world scenes with or without semantic and syntactic violations, we conducted two experiments on whether time dilation would be obtained in the presence of either type of scene violation and on the role of attention in any such effect. Our results from Experiment 1 showed that time dilation indeed occurred in the presence of syntactic violations, while time compression was observed for semantic violations. In Experiment 2, we further investigated whether these estimates were driven by attention, utilizing a contrast manipulation of the target objects. The results showed that increased contrast led to duration overestimation for both semantic and syntactic oddballs. Together, our results indicate that scene violations differentially affect timing due to differences in violation processing and, moreover, their effect on timing appears sensitive to attentional manipulations such as target contrast.

Seminar · Neuroscience · Recording

Measures and models of multisensory integration in reaction times

Hans Colonius
Oldenburg University
Jan 17, 2024

First, a new measure of multisensory integration (MI) for reaction times is proposed that takes the entire RT distribution into account. Second, we present some recent developments in time-window-of-integration (TWIN) modeling, including a new proposal for the sound-induced flash illusion (SIFI).
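
For background, the best-known distribution-level measure of this kind is Miller's race-model inequality, which compares the redundant-signals RT distribution against the bound implied by the two unisensory distributions. Below is a minimal sketch of that classic test (not the new measure proposed in the talk); the reaction times are made up for illustration:

```python
import numpy as np

def ecdf(rts, t):
    """Empirical CDF of reaction times evaluated at times t."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t, side="right") / len(rts)

def race_model_violation(rt_av, rt_a, rt_v, t_grid):
    """Miller's race-model inequality: F_AV(t) <= F_A(t) + F_V(t).
    Positive values indicate violations, i.e. redundancy gains
    beyond what statistical facilitation alone can produce."""
    f_av = ecdf(rt_av, t_grid)
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return f_av - bound

# Illustration with simulated reaction times (ms)
rng = np.random.default_rng(0)
rt_a = rng.normal(300, 40, 500)   # auditory-only trials
rt_v = rng.normal(320, 40, 500)   # visual-only trials
rt_av = rng.normal(250, 30, 500)  # audiovisual trials (faster)
t = np.linspace(150, 450, 61)
violation = race_model_violation(rt_av, rt_a, rt_v, t)
print(f"max violation: {violation.max():.3f}")
```

In practice the violation is evaluated over the faster quantiles of the RT distribution; a measure that integrates over the whole curve, as the abstract describes, summarizes integration strength in a single distribution-wide statistic.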

Seminar · Neuroscience · Recording

Bayesian expectation in the perception of the timing of stimulus sequences

Max Di Luca
University of Birmingham
Dec 12, 2023

In this virtual journal club, Dr Di Luca will present findings from a series of psychophysical investigations in which he measured sensitivity and bias in the perception of the timing of stimuli. He will show how improved detection with longer sequences and biases in reporting isochrony can be accounted for by optimal statistical predictions. He also found that the timing of stimuli that occasionally deviate from a regularly paced sequence is perceptually distorted to appear more regular; this distortion depends on whether the context in which these sequences are presented is also regular. Dr Di Luca will present a Bayesian model for the combination of dynamically updated expectations, in the form of a priori probability, with incoming sensory information. These findings contribute to the understanding of how the brain processes temporal information to shape perceptual experiences.

Seminar · Neuroscience · Recording

Multisensory perception, learning, and memory

Ladan Shams
UCLA
Dec 6, 2023

Seminar · Neuroscience

Making Sense of Our Senses: Multisensory Processes across the Human Lifespan

Micah Murray
University of Lausanne
Nov 5, 2023

Seminar · Neuroscience · Recording

Multisensory integration in peripersonal space (PPS) for action, perception and consciousness

Andrea Serino
University Hospital of Lausanne
Nov 1, 2023

Seminar · Neuroscience · Recording

Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception

Ashley Schormans
BrainsCAN, Western University, Canada.
Sep 26, 2023

To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive the stimuli as a single event, even though they occurred separately. In recent years, our lab, and others, have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models demonstrate metrics that are consistent with humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in their perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model to study the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.

Seminar · Cognition

Prosody in the voice, face, and hands changes which words you hear

Hans Rutger Bosker
Donders Institute of Radboud University
May 22, 2023

Speech may be characterized as conveying both segmental information (i.e., about vowels and consonants) as well as suprasegmental information - cued through pitch, intensity, and duration - also known as the prosody of speech. In this contribution, I will argue that prosody shapes low-level speech perception, changing which speech sounds we hear. Perhaps the most notable example of how prosody guides word recognition is the phenomenon of lexical stress, whereby suprasegmental F0, intensity, and duration cues can distinguish otherwise segmentally identical words, such as "PLAto" vs. "plaTEAU" in Dutch. Work from our group showcases the vast variability in how different talkers produce stressed vs. unstressed syllables, while also unveiling the remarkable flexibility with which listeners can learn to handle this between-talker variability. It also emphasizes that lexical stress is a multimodal linguistic phenomenon, with the voice, lips, and even hands conveying stress in concert. In turn, human listeners actively weigh these multisensory cues to stress depending on the listening conditions at hand. Finally, lexical stress is presented as having a robust and lasting impact on low-level speech perception, even down to changing vowel perception. Thus, prosody - in all its multisensory forms - is a potent factor in speech perception, determining what speech sounds we hear.

Seminar · Neuroscience

How the brain uses experience to construct its multisensory capabilities

Barry E. Stein
Wake Forest School of Medicine
Apr 19, 2023

Seminar · Neuroscience · Recording

Multisensory processing of anticipatory and consummatory food cues

Janina Seubert
Karolinska Institute
Feb 1, 2023

Seminar · Neuroscience · Recording

Multisensory perception with newly learned sensory skills

Marko Nardini
Durham University
Nov 16, 2022

Seminar · Neuroscience · Recording

Using multisensory plasticity to rehabilitate vision

Benjamin A. Rowland
Wake Forest School of Medicine
Nov 2, 2022

Seminar · Neuroscience · Recording

Hierarchical transformation of visual event timing representations in the human brain: response dynamics in early visual cortex and timing-tuned responses in association cortices

Evi Hendrikx
Utrecht University
Sep 27, 2022

Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. For example, this allows us to follow and interact with the precise timing of speech and sports. Here we investigate how visual event timing is represented and transformed across the brain’s hierarchy: from sensory processing areas, through multisensory integration areas, to frontal action planning areas. We hypothesized that the dynamics of neural responses to sensory events in sensory processing areas allows derivation of event timing representations. This would allow higher-level processes such as multisensory integration and action planning to use sensory timing information, without the need for specialized central pacemakers or processes. Using 7T fMRI and neural model-based analyses, we found responses that monotonically increase in amplitude with visual event duration and frequency, becoming increasingly clear from primary visual cortex to lateral occipital visual field maps. Beginning in area MT/V5, we found a gradual transition from monotonic to tuned responses, with response amplitudes peaking at different event timings in different recording sites. While monotonic response components were limited to the retinotopic location of the visual stimulus, timing-tuned response components were independent of the recording sites' preferred visual field positions. These tuned responses formed a network of topographically organized timing maps in superior parietal, postcentral and frontal areas. From anterior to posterior timing maps, multiple events were increasingly integrated, response selectivity narrowed, and responses focused increasingly on the middle of the presented timing range. 
These results suggest that responses to event timing are transformed from the human brain’s sensory areas to the association cortices, with the event’s temporal properties being increasingly abstracted from the response dynamics and locations of early sensory processing. The resulting abstracted representation of event timing is then propagated through areas implicated in multisensory integration and action planning.

Seminar · Neuroscience · Recording

Multisensory interactions in temporal frequency processing

Jeff Yau
Baylor College of Medicine
May 4, 2022

Seminar · Neuroscience · Recording

The Multisensory Scaffold for Perception and Rehabilitation

Micah Murray
The Sense Innovation and Research Center, Lausanne and Sion, Switzerland; Lausanne University Hospital and University of Lausanne, Switzerland
Apr 6, 2022

Seminar · Neuroscience · Recording

Healing the brain via Multisensory technologies and using these technologies to better understand the brain

Amir Amedi
IDC Israel (and Sorbonne U France)
Mar 16, 2022

Seminar · Neuroscience

From natural scene statistics to multisensory integration: experiments, models and applications

Cesare Parise
Oculus VR
Feb 8, 2022

To efficiently process sensory information, the brain relies on statistical regularities in the input. While generally improving the reliability of sensory estimates, this strategy also induces perceptual illusions that help reveal the underlying computational principles. Focusing on auditory and visual perception, in my talk I will describe how the brain exploits statistical regularities within and across the senses for the perception of space and time and for multisensory integration. In particular, I will show how results from a series of psychophysical experiments can be interpreted in the light of Bayesian Decision Theory, and I will demonstrate how such canonical computations can be implemented in simple and biologically plausible neural circuits. Finally, I will show how such principles of sensory information processing can be leveraged in virtual and augmented reality to overcome display limitations and expand human perception.
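
The Bayesian Decision Theory interpretation mentioned here typically rests on reliability-weighted cue fusion: each cue contributes in proportion to its inverse variance, and the fused estimate is more precise than either cue alone. A minimal sketch with hypothetical auditory and visual localization cues (the numbers are illustrative, not from the talk):

```python
import numpy as np

def fuse_cues(means, sigmas):
    """Reliability-weighted fusion of independent Gaussian cues.
    Each cue is weighted by its inverse variance (reliability);
    the fused estimate has lower variance than any single cue."""
    means = np.asarray(means, dtype=float)
    rel = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    w = rel / rel.sum()                      # normalized weights
    fused_mean = (w * means).sum()
    fused_sigma = np.sqrt(1.0 / rel.sum())   # fused uncertainty
    return fused_mean, fused_sigma

# Hypothetical audiovisual localization: vision (0 deg, sd 2) is
# more reliable than audition (10 deg, sd 8), so the fused percept
# is drawn strongly toward the visual estimate (ventriloquism).
mu, sigma = fuse_cues(means=[10.0, 0.0], sigmas=[8.0, 2.0])
print(f"fused estimate: {mu:.2f} deg, sd {sigma:.2f} deg")
```

The same arithmetic predicts the illusions the abstract alludes to: when one cue dominates in reliability, the fused percept is captured by it even when the cues disagree.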

Seminar · Neuroscience · Recording

The vestibular system: a multimodal sense

Elisa Raffaella Ferre
Birkbeck, University of London
Jan 19, 2022

The vestibular system plays an essential role in everyday life, contributing to a surprising range of functions from reflexes to the highest levels of perception and consciousness. Three orthogonal semicircular canals detect rotational movements of the head, and the otolith organs sense translational acceleration, including the gravitational vertical. But how are vestibular signals encoded by the human brain? We have recently combined innovative methods for eliciting virtual rotation and translation sensations with fMRI to identify brain areas representing vestibular signals. We have identified a bilateral inferior parietal, ventral premotor/anterior insula and prefrontal network and confirmed that these areas reliably carry information about rotation and translation. We have also investigated how vestibular signals are integrated with other sensory cues to generate our perception of the external environment.

Seminar · Neuroscience · Recording

What happens to our ability to perceive multisensory information as we age?

Fiona Newell
Trinity College Dublin
Jan 12, 2022

Our ability to perceive the world around us can be affected by a number of factors including the nature of the external information, prior experience of the environment, and the integrity of the underlying perceptual system. A particular challenge for the brain is to maintain a coherent perception from information encoded by the peripheral sensory organs whose function is affected by typical, developmental changes across the lifespan. Yet, how the brain adapts to the maturation of the senses, as well as experiential changes in the multisensory environment, is poorly understood. Over the past few years, we have used a range of multisensory tasks to investigate the role of ageing on the brain's ability to merge sensory inputs. In particular, we have embedded an audio-visual task based on the sound-induced flash illusion (SIFI) into a large-scale, longitudinal study of ageing. Our findings support the idea that the temporal binding window (TBW) is modulated by age and reveal important individual differences in this TBW that may have clinical implications. However, our investigations also suggest the TBW is experience-dependent, with evidence for both long- and short-term behavioural plasticity. An overview of these findings, including recent evidence on how multisensory integration may be associated with higher order functions, will be discussed.

Seminar · Neuroscience · Recording

NMC4 Short Talk: Neurocomputational mechanisms of causal inference during multisensory processing in the macaque brain

Guangyao Qi
Institute of Neuroscience, Chinese Academy of Sciences
Dec 2, 2021

Natural perception relies inherently on inferring causal structure in the environment. However, the neural mechanisms and functional circuits that are essential for representing and updating the hidden causal structure during multisensory processing are unknown. To address this, monkeys were trained to infer the probability of a potential common source from visual and proprioceptive signals on the basis of their spatial disparity in a virtual reality system. The proprioceptive drift reported by monkeys demonstrated that they combined historical information and current multisensory signals to estimate the hidden common source and subsequently updated both the causal structure and sensory representation. Single-unit recordings in premotor and parietal cortices revealed that neural activity in premotor cortex represents the core computation of causal inference, characterizing the estimation and update of the likelihood of integrating multiple sensory inputs at a trial-by-trial level. In response to signals from premotor cortex, neural activity in parietal cortex also represents the causal structure and further dynamically updates the sensory representation to maintain consistency with the causal inference structure. Thus, our results indicate how premotor cortex integrates historical information and sensory inputs to infer hidden variables and selectively updates sensory representations in parietal cortex to support behavior. This dynamic loop of frontal-parietal interactions in the causal inference framework may provide the neural mechanism to answer long-standing questions regarding how neural circuits represent hidden structures for body-awareness and agency.
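
The inference described here, estimating the probability of a common source from the spatial disparity between two cues, follows the standard Bayesian causal-inference formulation (in the spirit of Körding and colleagues). The sketch below is a textbook version of that computation, not the authors' fitted model, with hypothetical noise parameters:

```python
import numpy as np

def p_common(x_v, x_p, sigma_v, sigma_p, sigma_prior, prior_c=0.5):
    """Posterior probability that visual and proprioceptive
    measurements x_v, x_p arise from one common source (C = 1),
    under Gaussian noise and a zero-mean Gaussian source prior:
    the larger the disparity relative to the noise, the lower
    the posterior probability of a common cause."""
    var_v, var_p, var_s = sigma_v**2, sigma_p**2, sigma_prior**2
    # Marginal likelihood of the cue pair under a common source
    var1 = var_v * var_p + var_s * (var_v + var_p)
    like1 = np.exp(-0.5 * ((x_v - x_p) ** 2 * var_s
                           + x_v**2 * var_p + x_p**2 * var_v) / var1) \
            / (2 * np.pi * np.sqrt(var1))
    # Marginal likelihood under two independent sources
    var_v2, var_p2 = var_v + var_s, var_p + var_s
    like2 = np.exp(-0.5 * (x_v**2 / var_v2 + x_p**2 / var_p2)) \
            / (2 * np.pi * np.sqrt(var_v2 * var_p2))
    return like1 * prior_c / (like1 * prior_c + like2 * (1 - prior_c))

# Small disparity -> likely one source; large disparity -> likely two
print(p_common(1.0, 1.5, 1.0, 1.0, 10.0))
print(p_common(1.0, 9.0, 1.0, 1.0, 10.0))
```

On this account, the trial-by-trial update the abstract attributes to premotor cortex corresponds to recomputing this posterior as disparity and prior expectations change.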

Seminar · Neuroscience · Recording

How does seeing help listening? Audiovisual integration in Auditory Cortex

Jennifer Bizley
University College London
Dec 1, 2021

Multisensory responses are ubiquitous in so-called unisensory cortex. However, despite their prevalence, we have very little understanding of what – if anything - they contribute to perception. In this talk I will focus on audio-visual integration in auditory cortex. Anatomical tracing studies highlight visual cortex as one source of visual input to auditory cortex. Using cortical cooling we test the hypothesis that these inputs support audiovisual integration in ferret auditory cortex. Behavioural studies in humans support the idea that visual stimuli can help listeners to parse an auditory scene. This effect is paralleled in single units in auditory cortex, where responses to a sound mixture can be determined by the timing of a visual stimulus such that sounds that are temporally coherent with a visual stimulus are preferentially represented. Our recent data therefore support the idea that one role for the early integration of auditory and visual signals in auditory cortex is to support auditory scene analysis, and that visual cortex plays a key role in this process.

Seminar · Neuroscience · Recording

Conflict in Multisensory Perception

Salvador Soto-Faraco
Universitat Pompeu Fabra
Nov 10, 2021

Multisensory perception is often studied through the effects of inter-sensory conflict, such as in the McGurk effect, the Ventriloquist illusion, and the Rubber Hand Illusion. Moreover, Bayesian approaches to cue fusion and causal inference overwhelmingly draw on cross-modal conflict to measure and to model multisensory perception. Given the prevalence of conflict, it is remarkable that accounts of multisensory perception have so far neglected the theory of conflict monitoring and cognitive control, established about twenty years ago. I hope to make a case for the role of conflict monitoring and resolution during multisensory perception. To this end, I will present EEG and fMRI data showing that cross-modal conflict in speech, resulting in either integration or segregation, triggers neural mechanisms of conflict detection and resolution. I will also present data supporting a role of these mechanisms during perceptual conflict in general, using Binocular Rivalry, surrealistic imagery, and cinema. Based on this preliminary evidence, I will argue that it is worth considering the potential role of conflict in multisensory perception and its incorporation in a causal inference framework. Finally, I will raise some potential problems associated with this proposal.

Seminar · Neuroscience · Recording

Migraine: a disorder of excitatory-inhibitory balance in multiple brain networks? Insights from genetic mouse models of the disease

Daniela Pietrobon
Department of Biomedical Sciences and Padova Neuroscience Center, University of Padova, Italy
Oct 27, 2021

Migraine is much more than an episodic headache. It is a complex brain disorder, characterized by a global dysfunction in multisensory information processing and integration. In a third of patients, the headache is preceded by transient sensory disturbances (aura), whose neurophysiological correlate is cortical spreading depression (CSD). The molecular, cellular and circuit mechanisms of the primary brain dysfunctions that underlie migraine onset, susceptibility to CSD and altered sensory processing remain largely unknown and are major open issues in the neurobiology of migraine. Genetic mouse models of a rare monogenic form of migraine with aura provide a unique experimental system to tackle these key unanswered questions. I will describe the functional alterations we have uncovered in the cerebral cortex of genetic mouse models and discuss the insights into the cellular and circuit mechanisms of migraine obtained from these findings.

Seminar · Neuroscience · Recording

Development of multisensory perception and attention and their role in audiovisual speech processing

David Lewkowicz
Haskins Labs & Yale Child Study Ctr.
Oct 20, 2021

Seminar · Neuroscience · Recording

Do you hear what I see: Auditory motion processing in blind individuals

Ione Fine
University of Washington
Oct 6, 2021

Perception of object motion is fundamentally multisensory, yet little is known about similarities and differences in the computations that give rise to our experience across senses. Insight can be provided by examining auditory motion processing in early blind individuals. In those who become blind early in life, the ‘visual’ motion area hMT+ responds to auditory motion. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion, suggesting competition between cortical areas for functional role. According to the metamodal hypothesis of cross-modal plasticity developed by Pascual-Leone, the recruitment of hMT+ is driven by it being a metamodal structure containing “operators that execute a given function or computation regardless of sensory input modality”. Thus, the metamodal hypothesis predicts that the computations underlying auditory motion processing in early blind individuals should be analogous to visual motion processing in sighted individuals - relying on non-separable spatiotemporal filters. Inconsistent with the metamodal hypothesis, evidence suggests that the computational algorithms underlying auditory motion processing in early blind individuals fail to undergo a qualitative shift as a result of cross-modal plasticity. Auditory motion filters, in both blind and sighted subjects, are separable in space and time, suggesting that the recruitment of hMT+ to extract motion information from auditory input includes a significant modification of its normal computational operations.

Seminar · Neuroscience · Recording

Plasticity and learning in multisensory perception for action

Marc Ernst
Ulm University
Sep 22, 2021

Seminar · Neuroscience · Recording

Multisensory Integration: Development, Plasticity, and Translational Applications

Benjamin A. Rowland
Wake Forest School of Medicine
Sep 20, 2021

Seminar · Neuroscience · Recording

Multisensory speech perception

Michael Beauchamp
University of Pennsylvania
Sep 15, 2021

Seminar · Neuroscience · Recording

Music training effects on multisensory and cross-sensory transfer processing: from cross-sectional to RCT studies

Karin Petrini
University of Bath
Sep 8, 2021

Seminar · Neuroscience · Recording

Multisensory self in spatial navigation

Olaf Blanke
Swiss Federal Institute of Technology (EPFL)
Sep 1, 2021

Seminar · Neuroscience · Recording

The emergence of a ‘V1 like’ structure for soundscapes representing vision in the adult brain in the absence of visual experience

Amir Amedi
IDC
Jul 5, 2021

Seminar · Neuroscience

Multisensory encoding of self-motion in the retrosplenial cortex and beyond

Sepiedeh Keshavarzi
Sainsbury Wellcome Centre, UCL
Jun 29, 2021

In order to successfully navigate through the environment, animals must accurately estimate the status of their motion with respect to the surrounding scene and objects. In this talk, I will present our recent work on how retrosplenial cortical (RSC) neurons combine vestibular and visual signals to reliably encode the direction and speed of head turns during passive motion and active navigation. I will discuss these data in the context of RSC long-range connectivity and further show our ongoing work on building population-level models of motion representation across cortical and subcortical networks.

Seminar · Neuroscience · Recording

Multisensory development and the role of visual experience

Brigitte Röder
University of Hamburg
Jun 16, 2021

Seminar · Neuroscience · Recording

Science and technology to understand developmental multisensory processing

Monica Gori
Italian Institute of Technology
Jun 9, 2021

Seminar · Neuroscience · Recording

Clinical, Cognitive and Neuroscience Insights into Multisensory Processes

Mark Wallace
Vanderbilt University
May 19, 2021

Seminar · Neuroscience · Recording

Brain (re)organization and sensory deprivation: Recycling the multisensory scaffolding of functional brain networks

Olivier Collignon
UCLouvain; University of Trento
May 5, 2021

Seminar · Neuroscience · Recording

How multisensory perception is shaped by causal inference and serial effects

Christoph Kayser
Bielefeld University
Apr 21, 2021

Seminar · Neuroscience · Recording

Applications of Multisensory Facilitation of Learning

Aaron Seitz
University of California, Riverside
Apr 14, 2021

In this talk I’ll discuss translation of findings of multisensory facilitation of learning to cognitive training. I’ll first review some early findings of multisensory facilitation of learning and then discuss how we have been translating these basic science approaches into gamified training interventions to improve cognitive functions. I’ll touch on approaches to training vision, hearing and working memory that we are developing at the UCR Brain Game Center for Mental Fitness and Well-being. I look forward to discussing both the basic science but also the complexities of how to translate approaches from basic science into the more complex frameworks often used in interventions.

Seminar · Neuroscience

Blood is thicker than water

Michael Brecht
Bernstein Center for Computational Neuroscience Humboldt University Berlin, Germany
Nov 18, 2020

According to Hamilton's inclusive fitness hypothesis, kinship is an organizing principle of social behavior. Behavioral evidence supporting this hypothesis includes the ability to recognize kin and the adjustment of behavior based on kin preference with respect to altruism, attachment and care for offspring in insect societies. Despite the fundamental importance of kinship behavior, the underlying neural mechanisms are poorly understood. We repeated behavioral experiments by Hepper on the behavioral preference of rats for their kin. Consistent with Hepper's work, we find a developmental time course for kinship behavior, where rats prefer sibling interactions at young ages and express non-sibling preferences at older ages. In probing the brain areas responsible for this behavior, we find that aspiration lesions of the lateral septum, but not control lesions of cingulate cortices, eliminate the behavioral preference in young animals for their siblings and in older rats for non-siblings. We then presented awake and anaesthetized rats with odors and calls of age- and status-matched kin (siblings and mothers) and non-kin (non-siblings and non-mothers) conspecifics, while performing in vivo juxta-cellular and whole-cell patch-clamp recordings in the lateral septum. We find multisensory (olfactory and auditory) neuronal responses, whereby neurons typically responded preferentially but not exclusively to individual social stimuli. Non-kin-odor responsive neurons were found dorsally, while kin-odor responsive neurons were located ventrally in the lateral septum. To our knowledge, such an ordered representation of response preferences according to kinship has not been previously observed, and we refer to this organization as nepotopy. Nepotopy could be instrumental in reading out kinship from preferential but not exclusive responses and in the generation of differential behavior according to kinship. Thus, our results are consistent with a role of the lateral septum in organizing mammalian kinship behavior.

ePoster

Non-feedforward architectures enable diverse multisensory computations

Marcus Ghosh, Dan Goodman

Bernstein Conference 2024

ePoster

Recurrence in temporal multisensory processing

Swathi Anil, Marcus Ghosh, Daniel Goodman

Bernstein Conference 2024

ePoster

Investigation of a multilevel multisensory circuit underlying female decision making in Drosophila

COSYNE 2022

ePoster

Critical Learning Periods for Multisensory Integration in Deep Networks

Michael Kleinman, Alessandro Achille, Stefano Soatto

COSYNE 2023

ePoster

Topography of multisensory convergence throughout the mouse cortex

Kinjal Pravinbhai Patel, Avery Ryoo, Stefan Mihalas, Bryan Tripp

COSYNE 2023

ePoster

An automated behavioral platform for multisensory decision-making in mice

Fatemeh Yousefi, Dennis Laufs, Natalia Babushkina, Christopher Wiesbrock, Gerion Nabbefeld, Björn Kampa, Simon Musall

FENS Forum 2024

ePoster

Behavioral regression in Syn II KO mice: From latent synaptopathy to overt dysfunctions in multisensory social processing

Lorenzo Ciano, Sebastian Sulis Sato, Alessandro Esposito, Anna Fassio, Fabio Benfenati, Caterina Michetti

FENS Forum 2024

ePoster

Examining multisensory integration in weakly electric fish through manipulating sensory salience

Emine Ceren Rutbil, Gurkan Celik, Alp Demirel, Emin Yusuf Aydin, Ismail Uyanik

FENS Forum 2024

ePoster

Modality specificity of multisensory integration and decision-making in frontal cortex and superior colliculus

Alice Despatin, Irene Lenzi, Severin Graff, Kerstin Doerenkamp, Gerion Nabbefeld, Maria Laura Pérez, Anoushka Jain, Sonja Grün, Björn Kampa, Simon Musall

FENS Forum 2024

ePoster

Multisensory stimulation improves target tracking in zebrafish during rheotaxis

Sevval Izel Solmaz, Ismail Uyanik

FENS Forum 2024

ePoster

Neuronal circuit for multisensory integration in higher visual cortex

Mio Inoue, Yuta Tanisumi, Daisuke Kato, Nanami Kawamura, Akari Hashimoto, Ikuko Takeda, Etsuko Tarusawa, Hiroaki Wake

FENS Forum 2024

ePoster

A novel multisensory stimulation setup for refuge tracking in weakly electric fish

Alp Demirel, Ismail Uyanik

FENS Forum 2024

ePoster

Postural constraints affect the optimal weighting of multisensory integration during visuo-manual coordination

Célie Dézé, Clémence Daleux, Mathieu Beraneck, Joseph McIntyre, Michele Tagliabue

FENS Forum 2024

ePoster

Touching what you see: Multisensory location coding in mouse posterior parietal cortex

Adrian Roggenbach, Fritjof Helmchen

FENS Forum 2024

ePoster

A virtual-reality task to investigate multisensory object recognition in mice

Veronique Stokkers, Guido T Meijer, Smit Zayel, Jeroen J Bos, Francesco P Battaglia

FENS Forum 2024