Authors & Affiliations
Swathi Anil, Marcus Ghosh, Daniel Goodman
Abstract
Animals continuously process information from multiple sensory modalities to assess their environment and guide their behavior. For example, a predator may rely on both visual and auditory cues to track its prey. Various algorithms, including linear and nonlinear fusion, have been developed to describe multisensory processing. However, these algorithms typically treat each time step independently and fail to account for temporal dependencies in sensory signals. To address this limitation, we introduce a novel set of multisensory tasks that systematically incorporate controlled temporal dependencies across streams of sensory signals, making them more naturalistic. We show that traditional multisensory algorithms, which ignore temporal dependencies, perform sub-optimally on these tasks, but approach near-optimal performance when adapted to integrate evidence across both sensory channels and time. We further demonstrate that recurrent artificial neural networks (RNNs) outperform these algorithmic models, underscoring the importance of recurrent connections and temporal dependencies in multisensory processing. Together, these results highlight the benefits of integrating multisensory information across both channels and time, and present novel, naturalistic tasks for evaluating the significance of these processes in biological systems.
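The contrast the abstract draws — fusing channels at a single time step versus accumulating evidence across both channels and time — can be illustrated with a minimal toy simulation. This is our own sketch under simple assumptions (a latent binary direction driving two independent Gaussian channels), not the paper's actual tasks, which involve controlled temporal dependencies; all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (an assumption for illustration, not the paper's tasks):
# a latent direction s in {-1, +1} drives two noisy channels for T steps.
def simulate_trial(s, T=20, sigma=2.0):
    audio = s + sigma * rng.standard_normal(T)
    visual = s + sigma * rng.standard_normal(T)
    return audio, visual

def fuse_per_timestep(audio, visual):
    # Treat each time step independently: combine the channels at one
    # step and decide from that step alone, ignoring temporal structure.
    return 1 if (audio[-1] + visual[-1]) >= 0 else -1

def fuse_channels_and_time(audio, visual):
    # Integrate evidence across both channels *and* all time steps.
    return 1 if (audio.sum() + visual.sum()) >= 0 else -1

n_trials = 2000
correct_step = correct_time = 0
for _ in range(n_trials):
    s = int(rng.choice([-1, 1]))
    audio, visual = simulate_trial(s)
    correct_step += fuse_per_timestep(audio, visual) == s
    correct_time += fuse_channels_and_time(audio, visual) == s

acc_step = correct_step / n_trials
acc_time = correct_time / n_trials
print(f"per-timestep fusion accuracy:     {acc_step:.3f}")
print(f"channel-and-time fusion accuracy: {acc_time:.3f}")
```

Even in this simplified i.i.d. setting, the accumulator that pools evidence over time is far more accurate than a per-timestep decision; the paper's point is that once the signals carry temporal dependencies, algorithms restricted to per-timestep fusion fall short in this way.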