Authors & Affiliations
Parastoo Azizeddin, Eray Erturk, Maryam Shanechi
Abstract
Multimodal data fusion can facilitate a more complete understanding of neural dynamics. In neuroscience, modalities such as neuronal spiking activity, local field potentials (LFPs), and behavioral signals capture different aspects of brain processes. By leveraging the complementary strengths of these modalities, multimodal fusion can not only construct a rich and unified representation of brain processes, but also address the limitations of single-modality analyses, such as incomplete or noisy data. An exciting recent direction has been to build models of simultaneously recorded neural and behavioral data to disentangle sources of neural variability within the latent space. However, these methods rely on latent variable models that use a single modality of neural data. Here we develop a nonlinear dynamical model that integrates multiple neural modalities, specifically LFPs and spike counts, with behavioral signals into a unified framework. To achieve this, we design a neural network architecture that nonlinearly fuses information across neural modalities with distinct probabilistic distributions while dissociating behaviorally relevant information across the multimodal neural signals. We validate our model in simulations and on a public motor cortical dataset collected during sequential 2D reaches to random targets. We find that our framework can enhance behavior decoding and neural prediction by nonlinearly fusing information across modalities. This framework can provide a new tool for studying behaviorally relevant computations across the different spatiotemporal scales of neural activity measured with different modalities.
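
To make the kind of architecture described in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of a multimodal model that fuses spike counts (Poisson likelihood) and LFP features (Gaussian likelihood) into a shared nonlinear latent state with a behavior readout. All dimensions, layer choices, and the training objective are illustrative assumptions made for this example.

```python
# A minimal sketch (not the authors' implementation) of multimodal fusion:
# spike counts and LFP features are encoded, fused into a shared nonlinear
# latent state, and decoded with modality-specific likelihoods plus a
# behavior readout. Dimensions and loss weighting are illustrative.
import torch
import torch.nn as nn


class MultimodalFusionModel(nn.Module):
    def __init__(self, n_spike=50, n_lfp=20, n_behavior=2, n_latent=16, n_hidden=64):
        super().__init__()
        # Modality-specific encoders map each observation stream to a shared feature space.
        self.spike_enc = nn.Sequential(nn.Linear(n_spike, n_hidden), nn.ReLU())
        self.lfp_enc = nn.Sequential(nn.Linear(n_lfp, n_hidden), nn.ReLU())
        # A recurrent network models nonlinear latent dynamics over the fused encodings.
        self.dynamics = nn.GRU(2 * n_hidden, n_latent, batch_first=True)
        # Modality-specific decoders with distinct probabilistic distributions.
        self.spike_log_rate = nn.Linear(n_latent, n_spike)   # log-rate for Poisson counts
        self.lfp_mean = nn.Linear(n_latent, n_lfp)           # mean for Gaussian LFP features
        self.behavior_readout = nn.Linear(n_latent, n_behavior)

    def forward(self, spikes, lfp):
        # spikes: (batch, time, n_spike) counts; lfp: (batch, time, n_lfp) features
        fused = torch.cat([self.spike_enc(spikes), self.lfp_enc(lfp)], dim=-1)
        latents, _ = self.dynamics(fused)
        return {
            "spike_rate": torch.exp(self.spike_log_rate(latents)),
            "lfp_mean": self.lfp_mean(latents),
            "behavior": self.behavior_readout(latents),
            "latents": latents,
        }


def loss_fn(out, spikes, lfp, behavior):
    # Negative log-likelihoods for each neural modality plus a behavior decoding term.
    nll_spikes = -torch.distributions.Poisson(out["spike_rate"]).log_prob(spikes).mean()
    nll_lfp = -torch.distributions.Normal(out["lfp_mean"], 1.0).log_prob(lfp).mean()
    mse_behavior = nn.functional.mse_loss(out["behavior"], behavior)
    return nll_spikes + nll_lfp + mse_behavior
```

In practice, the latent dimensionality, the Gaussian noise scale, and the relative weighting of the neural and behavioral loss terms would be fit or tuned to the data rather than fixed as above.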