ePoster

Structure in motion: visual motion perception as online hierarchical inference

Johannes Bill, Samuel J. Gershman, Jan Drugowitsch
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 18, 2022

Abstract

Identifying the structure of motion relations in the environment is critical for navigation, tracking, prediction, and pursuit. Yet, little is known about the mental and neural computations that allow the visual system to infer this structure online from a volatile stream of visual information. We propose online hierarchical Bayesian inference as a principled solution for how the brain might solve this complex perceptual task. We derive an online Expectation-Maximization algorithm that continually updates an estimate of a visual scene’s underlying structure while using this inferred structure to organize incoming noisy velocity observations into meaningful, stable percepts. We show that the algorithm explains human percepts qualitatively and quantitatively for a diverse set of stimuli, covering classical psychophysics experiments, ambiguous motion scenes, and illusory motion displays. For instance, it quantitatively explains experimental results of human motion structure classification with higher fidelity than a previous ideal observer-based model. Furthermore, we identify normative explanations for the origin of erroneous human perception in motion direction repulsion experiments and make testable predictions for new psychophysics experiments. Finally, the algorithm affords a neural network implementation which shares properties with motion-sensitive middle temporal area (MT) and dorsal medial superior temporal area (MSTd) and motivates a novel class of neuroscientific experiments to reveal the neural representations of latent structure.
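To make the abstract's description of online hierarchical inference concrete, the toy sketch below illustrates the general idea of an online EM loop for motion structure: latent motion sources explain noisy velocity observations (an approximate E-step), while "motion strengths" that define the scene's structure are nudged toward the inferred source power (an online M-step). This is a minimal illustration under assumed dynamics and parameter names (e.g., `lam_hat`, `tau`, `eta` are hypothetical), not the authors' implementation or parameterization.

```python
# Minimal sketch (not the authors' code): online EM-style inference of shared
# vs. individual motion from noisy velocity observations.
# Assumed generative model: each object's velocity is a weighted sum of a
# shared latent source and an individual latent source (OU processes); the
# weights ("motion strengths") define the scene's structure and are learned online.
import numpy as np

rng = np.random.default_rng(0)
n_obj, dt, T = 3, 0.01, 5000
tau, sigma_obs = 0.5, 0.3                   # source time constant, observation noise
lam_true = np.array([1.0, 0.0, 0.0, 0.0])   # [shared, ind1, ind2, ind3]: purely shared motion
C = np.hstack([np.ones((n_obj, 1)), np.eye(n_obj)])  # component matrix: shared + individual

n_src = C.shape[1]
s = np.zeros(n_src)                 # true latent sources (world state)
mu = np.zeros(n_src)                # filtered source estimates (E-step state)
lam_hat = np.full(n_src, 0.5)       # online estimate of motion strengths
eta = 0.05                          # learning rate of the online M-step

for t in range(T):
    # world dynamics: Ornstein-Uhlenbeck sources, observed through C with noise
    s += (-s / tau) * dt + np.sqrt(dt) * rng.standard_normal(n_src)
    v_obs = C @ (lam_true * s) + sigma_obs * rng.standard_normal(n_obj)

    # approximate E-step: gradient filter for the sources under current strengths
    pred_err = v_obs - C @ (lam_hat * mu)
    gain = lam_hat * (C.T @ pred_err) / sigma_obs**2
    mu += (-mu / tau) * dt + dt * gain

    # online M-step: move squared strengths toward the inferred source power
    lam_sq = lam_hat**2 + eta * dt * (mu**2 - lam_hat**2)
    lam_hat = np.sqrt(np.maximum(1e-6, lam_sq))

print("estimated motion strengths:", np.round(lam_hat, 2))
# With these settings the shared strength should grow while the individual
# strengths shrink, i.e., the filter attributes the motion to a common source.
```

In this toy setting, the structure estimate (the strengths) and the percept (the filtered sources) are updated jointly at every time step, which is the qualitative behavior the abstract ascribes to the proposed online EM algorithm; the actual model in the work is richer (hierarchical component trees, calibrated noise, and a neural network implementation).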

Unique ID: cosyne-22/structure-motion-visual-motion-perception-0577b935