perceptual processing
Decoding rapidly presented visual stimuli from prefrontal ensembles without report or post-perceptual processing
Multisensory influences on vision: Sounds enhance and alter visual-perceptual processing
Visual perception is traditionally studied in isolation from other sensory systems, and while this approach has been exceptionally successful, in the real world, visual objects are often accompanied by sounds, smells, tactile information, or taste. How is visual processing influenced by these other sensory inputs? In this talk, I will review studies from our lab showing that a sound can influence the perception of a visual object in multiple ways. In the first part, I will focus on spatial interactions between sound and sight, demonstrating that co-localized sounds enhance visual perception. Then, I will show that these cross-modal interactions also occur at a higher contextual and semantic level, where naturalistic sounds facilitate the processing of real-world objects that match these sounds. Throughout my talk, I will explore to what extent sounds not only improve visual processing but also alter perceptual representations of the objects we see. Most broadly, I will argue for the importance of considering multisensory influences on visual perception for a more complete understanding of our visual experience.
Space and its computational challenges
How our senses work both separately and together involves rich computational problems. I will discuss the spatial and representational problems faced by the visual and auditory systems, focusing on two issues. 1. How does the brain correct for discrepancies in the visual and auditory spatial reference frames? I will describe our recent discovery of a novel type of otoacoustic emission, the eye movement related eardrum oscillation, or EMREO (Gruters et al., PNAS 2018). 2. How does the brain encode more than one stimulus at a time? I will discuss evidence for neural time-division multiplexing, in which neural activity fluctuates across time to allow representations to encode more than one simultaneous stimulus (Caruso et al., Nat Comm 2018). These findings all emerged from experimentally testing computational models of spatial representations and their transformations within and across sensory pathways. Further, they speak to several general problems confronting modern neuroscience, such as the hierarchical organization of brain pathways and limits on perceptual/cognitive processing.
The consequences and constraints of functional organization on behavior
In many ways, cognitive neuroscience is the attempt to use physiological observation to clarify the mechanisms that shape behavior. Over the past 25 years, fMRI has provided a system-wide and yet somewhat spatially precise view of the response in human cortex evoked by a wide variety of stimuli and task contexts. The current talk focuses on the other direction of inference: the implications of this observed functional organization for behavior. To begin, we must interrogate the methodological and empirical frameworks underlying our derivation of this organization, partially by exploring its relationship to, and predictability from, gross neuroanatomy. Next, across a series of studies, the implications of two properties of functional organization for behavior will be explored: 1) the co-localization of visual working memory and perceptual processing, and 2) implicit learning in the context of distributed responses. In sum, these results highlight the limitations of our current approach and hint at a new general mechanism for explaining observed behavior in the context of the neural substrate.