Auditory Stimulus
Vocal emotion perception at millisecond speed
The human voice is possibly the most important sound category in the social landscape. Compared to other non-verbal emotion signals, the voice is particularly effective in communicating emotions: it can carry information over large distances and independently of sight. However, the study of vocal emotion expression and perception is surprisingly much less developed than the study of emotion in faces, and as a result its neural and functional correlates remain elusive. Because the voice is a dynamically changing auditory stimulus, temporally sensitive techniques such as EEG are particularly informative. In this talk, the dynamic neurocognitive operations that take place when we listen to vocal emotions will be specified, with a focus on the effects of stimulus type, task demands, and speaker and listener characteristics (e.g., age). These studies suggest that emotional voice perception is a matter not only of how one speaks but also of who speaks and who listens. Implications of these findings for the understanding of psychiatric disorders such as schizophrenia will be discussed.
Pitch and Time Interact in Auditory Perception
Research into pitch perception and time perception has typically treated the two as independent processes. However, previous studies of music and speech perception have suggested that pitch and timing information may be processed in an integrated manner, such that the pitch of an auditory stimulus can influence a person’s perception, expectation, and memory of its duration and tempo. Typically, higher-pitched sounds are perceived as faster and longer in duration than lower-pitched sounds with identical timing. We conducted a series of experiments to better understand the limits of this pitch-time integrality. Across several experiments, we tested whether the higher-equals-faster illusion generalizes across the broader frequency range of human hearing by asking participants to compare the tempo of a repeating tone played in one of six octaves to a metronomic standard. When participants heard tones from all six octaves, we consistently found an inverted U-shaped effect of the tone’s pitch height, such that perceived tempo peaked between A4 (440 Hz) and A5 (880 Hz) and decreased at lower and higher octaves. However, we found that the decrease in perceived tempo at extremely high octaves could be abolished by exposing participants to high-pitched tones only, suggesting that pitch-induced timing biases are context sensitive. We additionally tested how the timing of an auditory stimulus influences the perception of its pitch, using a pitch discrimination task in which probe tones occurred early, late, or on the beat within a rhythmic context. Probe timing strongly biased participants to rate later tones as lower in pitch than earlier tones. Together, these results suggest that pitch and time exert a bidirectional influence on one another, providing evidence for integrated processing of pitch and timing information in auditory perception. 
Identifying the mechanisms behind this pitch-time interaction will be critical for integrating current models of pitch and tempo processing.
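The tempo-comparison stimuli described above span six octaves of a repeating tone, with perceived tempo peaking between A4 (440 Hz) and A5 (880 Hz). Since each octave doubles the frequency, the tone frequencies follow directly from the A4 reference. A minimal sketch of that relationship (the specific A1–A6 range is an illustrative assumption, not the exact range used in the experiments):

```python
# Illustrative sketch: frequencies of the note A across octaves,
# anchored at the A4 = 440 Hz reference given in the abstract.
# Each octave up doubles the frequency; each octave down halves it.

def a_frequency(octave: int) -> float:
    """Frequency in Hz of the note A in a given octave (A4 = 440 Hz)."""
    return 440.0 * 2 ** (octave - 4)

# A hypothetical six-octave stimulus set (A1 through A6):
freqs = {f"A{o}": a_frequency(o) for o in range(1, 7)}
# -> {'A1': 55.0, 'A2': 110.0, 'A3': 220.0,
#     'A4': 440.0, 'A5': 880.0, 'A6': 1760.0}
```

Under this mapping, the reported inverted U-shape would place the perceived-tempo peak in the 440–880 Hz band, with slower perceived tempo toward both the 55 Hz and 1760 Hz extremes.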
A Cortical Circuit for Audio-Visual Predictions
Teamwork makes sensory streams work: our senses work together, learn from each other, and stand in for one another, the result of which is perception and understanding. Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex is thought to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Mishra et al., 2007; Zangenehpour and Zatorre, 2010). However, it is not well understood how processing in visual cortex is altered by an auditory stimulus that is predictive of a visual stimulus, or what mechanisms mediate such experience-dependent, audio-visual associations in sensory cortex. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to an experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be stored in long-range cortical connections and that, with learning, these cross-modal connections function to suppress the responses to predictable input.
Contrasting the role of excitatory pyramidal cells and GABAergic interneurons in prefrontal cortex through a novel contextual auditory stimulus task paradigm and calcium imaging
FENS Forum 2024
Effect of open-loop auditory stimulus during NREM sleep among youth & geriatric subjects: A comparative nap study
FENS Forum 2024