
Sensory Modalities

Discover seminars, jobs, and research tagged with sensory modalities across World Wide.
10 curated items · 8 seminars · 2 ePosters
Updated 7 months ago
Seminar · Neuroscience · Recording

Restoring Sight to the Blind: Effects of Structural and Functional Plasticity

Noelle Stiles
Rutgers University
May 21, 2025

Visual restoration after decades of blindness is now becoming possible by means of retinal and cortical prostheses, as well as emerging stem cell and gene therapeutic approaches. After visual perception is restored, however, a key question remains: are there optimal means and methods for retraining the visual cortex to process visual inputs, and for learning or relearning to “see”? Up to this point, it has largely been assumed that if the sensory loss is visual, then the rehabilitation focus should also be primarily visual. However, the other senses play a key role in visual rehabilitation, both because audition and somatosensation plastically repurpose visual cortex during blindness and because restored vision must be reintegrated with the other senses. I will present multisensory neuroimaging results, cortical thickness changes, and behavioral outcomes for patients with Retinitis Pigmentosa (RP), which causes blindness by destroying photoreceptors in the retina. These patients have had their vision partially restored by the implantation of a retinal prosthesis, which electrically stimulates still-viable retinal ganglion cells in the eye. Our multisensory and structural neuroimaging and behavioral results suggest a new, holistic concept of visual rehabilitation that leverages rather than neglects audition, somatosensation, and other sensory modalities.

Seminar · Neuroscience

Language Representations in the Human Brain: A naturalistic approach

Fatma Deniz
TU Berlin & Berkeley
Apr 26, 2022

Natural language is strongly context-dependent and can be perceived through different sensory modalities. For example, humans can easily comprehend the meaning of complex narratives presented through auditory speech, written text, or visual images. To understand how complex language-related information is represented in the human brain, we need to map the linguistic and non-linguistic information perceived through different modalities across the cerebral cortex. To do so, I suggest following a naturalistic approach: observing the human brain performing tasks in its naturalistic setting, designing quantitative models that transform real-world stimuli into specific hypothesis-related features, and building predictive models that relate these features to brain responses. In my talk, I will present models of brain responses collected using functional magnetic resonance imaging while human participants listened to or read natural narrative stories. Using natural text and vector representations derived from natural language processing tools, I will show how we can study language processing in the human brain across modalities, at different levels of temporal granularity, and across different languages.
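
The modeling pipeline sketched in this abstract (stimulus, then hypothesis-related features, then predicted brain responses) can be illustrated compactly. The following is a minimal, hypothetical sketch assuming word-embedding features, ridge regression, and a held-out correlation metric; these are common choices in this literature, not confirmed details of the talk.

```python
# Hypothetical minimal encoding-model sketch: stimulus features -> per-voxel
# ridge regression -> held-out prediction accuracy. Feature type, regularizer,
# and all shapes are illustrative assumptions, not details from the talk.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Illustrative shapes: 1000 fMRI volumes, 300-dim stimulus features, 500 voxels.
n_trs, n_features, n_voxels = 1000, 300, 500
X = rng.standard_normal((n_trs, n_features))   # e.g. word-embedding features per volume
Y = rng.standard_normal((n_trs, n_voxels))     # BOLD responses, one column per voxel

# Fit one regularized linear model per voxel on a training split.
split = 800
model = Ridge(alpha=10.0)
model.fit(X[:split], Y[:split])

# Evaluate by correlating predicted and measured held-out responses per voxel.
pred = model.predict(X[split:])
r = np.array([np.corrcoef(pred[:, v], Y[split:, v])[0, 1] for v in range(n_voxels)])
print(f"median held-out prediction r = {np.median(r):.3f}")
```

Real analyses of this kind typically add cross-validated regularization and delayed copies of the features to model the hemodynamic lag.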

Seminar · Neuroscience · Recording

Neural signature for accumulated evidence underlying temporal decisions

Nir Ofir
The Hebrew University of Jerusalem
Dec 15, 2021

Cognitive models of timing often include a pacemaker analogue whose ticks are accumulated to form an internal representation of time, and a threshold that determines when a target duration has elapsed. However, clear EEG manifestations of these abstract components have not yet been identified. We measured the EEG of subjects while they performed a temporal bisection task in which they were asked to categorize visual stimuli as short or long in duration. We report an ERP component whose amplitude depends monotonically on stimulus duration. The relation between ERP amplitude and stimulus duration can be captured by a simple model, adapted from a known drift-diffusion model for time perception. It includes a threshold and a noisy accumulator that starts at stimulus onset. If the threshold is reached during stimulus presentation, the stimulus is categorized as "long"; otherwise it is categorized as "short". At stimulus offset, a response proportional to the distance to the threshold is emitted. This simple model has two parameters that fit both the behavior and the ERP amplitudes recorded in the task. Two subsequent experiments replicate and extend this finding to another modality (touch) and to different time ranges (subsecond and suprasecond), establishing the described ERP component as a useful handle on the cognitive processes involved in temporal decisions.
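
The accumulator model described above is simple enough to simulate directly. Below is a minimal sketch with illustrative drift, noise, and threshold values rather than the fitted parameters from the study; the offset readout returns the distance to threshold, the quantity the abstract links to the ERP amplitude.

```python
# Hypothetical simulation of the accumulator-to-threshold model described above.
# Drift, noise, and threshold values are illustrative, not the fitted parameters.
import numpy as np

def bisection_trial(duration_s, drift=1.0, noise=0.5, threshold=1.0,
                    dt=0.001, rng=None):
    """Return ("long" | "short", distance to threshold at stimulus offset)."""
    if rng is None:
        rng = np.random.default_rng()
    x = 0.0
    for _ in range(int(duration_s / dt)):
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        if x >= threshold:
            return "long", 0.0        # threshold crossed during the stimulus
    # At offset, the readout is proportional to the remaining distance to
    # threshold, the quantity the abstract links to the ERP amplitude.
    return "short", threshold - x

rng = np.random.default_rng(1)
for d in (0.4, 0.8, 1.2, 1.6):
    p_long = np.mean([bisection_trial(d, rng=rng)[0] == "long" for _ in range(200)])
    print(f"{d:.1f} s -> P('long') = {p_long:.2f}")
```

As stimulus duration grows, more trials cross the threshold, so the probability of a "long" judgment rises while the offset readout shrinks, mirroring the reported monotonic dependence of the ERP amplitude on duration.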

Seminar · Neuroscience · Recording

Seeing with technology: Exchanging the senses with sensory substitution and augmentation

Michael Proulx
University of Bath
Sep 29, 2021

What is perception? Our sensory modalities transduce information about the external world into electrochemical signals that somehow give rise to our conscious experience of our environment. Normally there is too much information to be processed at any given moment, and the mechanisms of attention focus the limited resources of the mind on some information at the expense of the rest. My research has advanced from first examining visual perception and attention to now examining how multisensory processing contributes to perception and cognition. There are fundamental constraints on how much information can be processed by the different senses, on their own and in combination. Here I will explore information processing from the perspective of sensory substitution and augmentation, and how "seeing" with the ears and tongue can advance fundamental and translational research.
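
To make "seeing with the ears" concrete, here is a minimal, hypothetical sketch of one widely used visual-to-auditory substitution scheme (a vOICe-style left-to-right sweep); the abstract does not commit to this particular algorithm.

```python
# Hypothetical vOICe-style mapping: scan the image left to right, with image
# row -> tone pitch and pixel brightness -> tone loudness. One common scheme,
# not necessarily the one used in the speaker's work.
import numpy as np

def image_to_soundscape(image, duration_s=1.0, sr=22050,
                        f_low=200.0, f_high=8000.0):
    """image: 2D array in [0, 1], row 0 = top. Returns a mono waveform."""
    n_rows, n_cols = image.shape
    t = np.arange(int(duration_s * sr / n_cols)) / sr
    freqs = np.geomspace(f_high, f_low, n_rows)        # top row = highest pitch
    cols = []
    for c in range(n_cols):                            # left-to-right scan
        tones = np.sin(2 * np.pi * freqs[:, None] * t) # one tone per row
        cols.append((image[:, c, None] * tones).sum(axis=0))
    wave = np.concatenate(cols)
    return wave / (np.abs(wave).max() + 1e-9)          # normalize amplitude

# A bright bottom-left-to-top-right diagonal becomes a rising frequency sweep.
wave = image_to_soundscape(np.eye(32)[::-1])
```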

Seminar · Neuroscience · Recording

How our biases may influence our study of visual modalities: Two tales from the sea

Sönke Johnsen
Duke University
Mar 14, 2021

It has long been appreciated (and celebrated) that certain species have sensory capabilities that humans do not share, for example polarization, ultraviolet, and infrared vision. What is less appreciated, however, is that our position as terrestrial human scientists can significantly affect our study of animal senses and signals, even within modalities that we do share. For example, our acute vision can lead us to over-interpret the relevance of fine patterns in animals with coarser vision, and our Cartesian heritage as scientists can lead us to divide sensory modalities into orthogonal parameters (e.g. hue and brightness for color vision), even though this division may not exist within the animal itself. This talk examines two cases from marine visual ecology where a reconsideration of our biases as sharp-eyed Cartesian land mammals can help address questions in visual ecology. The first case examines the enormous variation in visual acuity among animals with image-forming eyes, and focuses on how acknowledging the typically poorer resolving power of animals can help us interpret the function of color patterns in cleaner shrimp and their client fish. The second case examines how the typical human division of polarized light stimuli into angle and degree of polarization is problematic, and how a physiologically relevant interpretation is both closer to the truth and resolves a number of issues, particularly when considering the propagation of polarized light.
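
For reference, the conventional decomposition the second case questions is computed from the linear Stokes parameters (I, Q, U). A minimal sketch of that standard parameterization, with made-up input values:

```python
# The standard angle-and-degree parameterization under discussion, computed
# from linear Stokes parameters (I, Q, U). Input values are illustrative.
import numpy as np

def degree_and_angle(I, Q, U):
    """Degree of linear polarization (0..1) and e-vector angle (degrees)."""
    dop = np.hypot(Q, U) / I            # degree of polarization
    aop = 0.5 * np.arctan2(U, Q)        # angle of polarization, radians
    return dop, np.degrees(aop)

dop, ang = degree_and_angle(I=1.0, Q=0.3, U=0.2)
print(f"degree = {dop:.2f}, angle = {ang:.1f} deg")
```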

Seminar · Neuroscience · Recording

A Cortical Circuit for Audio-Visual Predictions

Aleena Garner
Keller lab, FMI
Mar 9, 2021

Teamwork makes sensory streams work: our senses work together, learn from each other, and stand in for one another, the result of which is perception and understanding. Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex is thought to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Mishra et al., 2007; Zangenehpour and Zatorre, 2010). However, it is not well understood how processing in visual cortex is altered by an auditory stimulus that is predictive of a visual stimulus, or what mechanisms mediate such experience-dependent, audio-visual associations in sensory cortex. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to an experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be stored in long-range cortical connections and that, with learning, these cross-modal connections function to suppress the responses to predictable input.

Seminar · Neuroscience

Sensory modalities driving social behavior via the central oxytocin system

Valery Grinevich
Zentralinstitut für Seelische Gesundheit, University of Heidelberg, Germany
Nov 8, 2020

ePoster

Functional heterogeneity of astrocytes in the somatosensory cortex and its role in the processing of different sensory modalities

Andrea Misol Ortiz, Verónica Barranco Maresca, Marta Zaforas, Elena Alonso-Calviño, Elena Fernández-López, Juan Aguilar, Juliana M Rosa

FENS Forum 2024

ePoster

Multiple sensory modalities contribute to texture discrimination in head-fixed mice

Ilaria Zanchi, Alejandro Sempere, Marco Celotto, Lorenzo Tausani, Dania Vecchia, Angelo Forli, Jacopo Bonato, Stefano Panzeri, Tommaso Fellin

FENS Forum 2024