Psychophysical Experiments
Latest
From natural scene statistics to multisensory integration: experiments, models and applications
To efficiently process sensory information, the brain relies on statistical regularities in the input. While this strategy generally improves the reliability of sensory estimates, it also induces perceptual illusions that help reveal the underlying computational principles. Focusing on auditory and visual perception, in my talk I will describe how the brain exploits statistical regularities within and across the senses for the perception of space, time and multisensory integration. In particular, I will show how results from a series of psychophysical experiments can be interpreted in the light of Bayesian Decision Theory, and I will demonstrate how such canonical computations can be implemented in simple and biologically plausible neural circuits. Finally, I will show how such principles of sensory information processing can be leveraged in virtual and augmented reality to overcome display limitations and expand human perception.
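The reliability-weighted cue combination at the core of Bayesian Decision Theory can be sketched in a few lines. The Python sketch below is a generic illustration rather than the speaker's specific model: it assumes Gaussian noise on a visual and an auditory estimate of the same location (the noise levels sigma_v and sigma_a are illustrative values) and shows that the optimal combined estimate is more reliable than either cue alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# True stimulus location (e.g., azimuth in degrees) -- illustrative value.
true_location = 10.0

# Sensory noise: vision is typically more precise than audition for spatial location.
sigma_v, sigma_a = 1.0, 4.0   # standard deviations of the visual and auditory cues

# Single-trial noisy cue estimates.
x_v = true_location + rng.normal(0.0, sigma_v)
x_a = true_location + rng.normal(0.0, sigma_a)

# Reliability (inverse-variance) weights.
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
w_a = 1.0 - w_v

# Optimal (maximum-likelihood) combined estimate and its predicted variability.
x_combined = w_v * x_v + w_a * x_a
sigma_combined = np.sqrt(1.0 / (1 / sigma_v**2 + 1 / sigma_a**2))

print(f"visual: {x_v:.2f}, auditory: {x_a:.2f}, combined: {x_combined:.2f}")
print(f"predicted combined SD: {sigma_combined:.2f} (less than {min(sigma_v, sigma_a):.2f})")
```

With this weighting, a large conflict between the two cues pulls the combined estimate toward the more reliable (here visual) cue, which is the basic form of the ventriloquism-style illusions the abstract alludes to.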
The self-consistent nature of visual perception
Vision provides us with a holistic interpretation of the world that is, with very few exceptions, coherent and consistent across multiple levels of abstraction, from scenes to objects to features. In this talk I will present results from past and ongoing work in my laboratory that investigates the role that top-down signals play in establishing such a coherent perceptual experience. Based on the results of several psychophysical experiments, I will introduce a theory of “self-consistent inference” and show how it can account for human perceptual behavior. The talk will close with a discussion of how the theory can help us understand more cognitive, higher-level processes.
The attentional requirement of unconscious processing
The tight relationship between attention and conscious perception has been extensively researched in the past decades. However, whether attentional modulation extends to unconscious processes has remained largely unknown, particularly for abstract and high-level processing. I will talk about a recent study in which we used the Stroop paradigm to show that task load gates unconscious semantic processing. In a series of psychophysical experiments, unconscious word semantics influenced conscious task performance under the low task-load condition but not under the high task-load condition. Intriguingly, with enough practice in the high task-load condition, the unconscious effect reemerged. These findings suggest a competition for attentional resources between unconscious and conscious processes, challenging the automaticity account of unconscious processing.
Computational psychophysics at the intersection of theory, data and models
Behavioural measurements are often overlooked by computational neuroscientists, who prefer to focus on electrophysiological recordings or neuroimaging data. This attitude is largely due to a perceived lack of depth and richness in behavioural datasets. I will show how contemporary psychophysics can deliver extremely rich and highly constraining datasets that naturally interface with computational modelling. More specifically, I will demonstrate how psychophysics can be used to guide, constrain and refine computational models, and how models can be exploited to design, motivate and interpret psychophysical experiments. Examples will span a wide range of topics (from feature detection to natural scene understanding) and methodologies (from cascade models to deep learning architectures).
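As a minimal illustration of how psychophysical data can constrain a model, the sketch below fits a cumulative-Gaussian psychometric function to a hypothetical detection dataset by maximum likelihood. The stimulus intensities, trial counts and response counts are made up for illustration, and the parameterization and fitting routine are generic assumptions, not the specific methods of the talk.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical detection data: stimulus intensities, trials per level,
# and number of "yes" responses at each level (illustrative numbers).
intensity = np.array([-4.0, -2.0, -1.0, 0.0, 1.0, 2.0, 4.0])
n_trials = np.full_like(intensity, 40, dtype=int)
n_yes = np.array([2, 6, 13, 21, 28, 35, 39])

def neg_log_likelihood(params):
    """Binomial negative log-likelihood of a cumulative-Gaussian psychometric function."""
    mu, log_sigma = params
    p = norm.cdf(intensity, loc=mu, scale=np.exp(log_sigma))
    p = np.clip(p, 1e-6, 1 - 1e-6)   # keep probabilities away from 0 and 1
    return -np.sum(n_yes * np.log(p) + (n_trials - n_yes) * np.log(1 - p))

# Fit the threshold (mu) and slope parameter (sigma) by maximum likelihood.
fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"estimated threshold: {mu_hat:.2f}, estimated slope (SD): {sigma_hat:.2f}")
```

The fitted threshold and slope are exactly the kind of tightly constrained behavioural quantities that can then be compared against the predictions of a cascade model or a deep network.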