Resources
Authors & Affiliations
Alessandro Marin Vargas, Axel Bisi, Alberto Chiappa, Chris Versteeg, Lee E. Miller, Alexander Mathis
Abstract
Adaptive motor control requires the integration of proprioceptive and other sensory signals. However, the principles that govern the processing of proprioception are poorly understood. Here, we employ a task-driven modeling approach to quantitatively test hypotheses about the functional role of proprioceptive neurons in the cuneate nucleus, the brainstem projection target of ascending spindle neurons. We generated datasets of realistic proprioceptive, muscle spindle inputs for a large, diverse repertoire of movements (following Sandbrink et al., bioRxiv, 2020) and used them to train hundreds of temporal convolutional neural networks (TCNs) to perform three behavioral tasks: action recognition, hand localization, and redundancy reduction. We contrasted these hypotheses about what the ascending proprioceptive pathway does by predicting single-neuron activity recorded from the cuneate nucleus of macaques performing a center-out reaching task. We tracked limb movements with DeepLabCut and inferred the proprioceptive inputs via musculoskeletal modeling. These inputs were fed to the task-trained TCNs, whose internal representations were used to linearly regress single-neuron activity. First, we found that models trained on action recognition provide significantly better neural predictions than models trained on the other tasks (p < 0.05, post-hoc Tukey's test). Second, for three different architectural families of TCNs, models that perform better on the action recognition task also explain the neural data in the cuneate nucleus better; this relationship did not hold for the hand localization and redundancy reduction tasks. Overall, this suggests that action recognition is sufficient to develop brain-like postural representations in the cuneate nucleus. Furthermore, our work is the first to directly predict proprioceptive neuron activity from task-driven modeling, consolidating task-driven models as an optimization-based framework for understanding sensory systems beyond vision, audition, and touch.
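The neural-prediction step described above (linearly regressing single-neuron activity from task-trained network features) can be sketched as follows. This is a minimal illustration, not the authors' code: the TCN activations and the cuneate neuron's binned firing rates are simulated stand-ins, the ridge penalty `lam` is an arbitrary choice, and the score is ordinary explained variance (R²).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: T time bins, D features from one TCN layer.
# In the real analysis, `features` would be activations of a task-trained
# TCN driven by musculoskeletal-model spindle inputs, and `rates` the
# binned firing rate of one recorded cuneate neuron (simulated here).
T, D = 1000, 64
features = rng.normal(size=(T, D))                    # stand-in TCN activations
true_w = rng.normal(size=D)
rates = features @ true_w + 0.1 * rng.normal(size=T)  # stand-in neural data

# Ridge-regularized linear readout: w = (X^T X + lam I)^(-1) X^T y
lam = 1.0
w = np.linalg.solve(features.T @ features + lam * np.eye(D),
                    features.T @ rates)
pred = features @ w

# Explained variance (R^2) serves as the neural-prediction score
ss_res = np.sum((rates - pred) ** 2)
ss_tot = np.sum((rates - rates.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"explained variance: {r2:.3f}")
```

Comparing such scores across networks trained on different tasks (action recognition, hand localization, redundancy reduction) is what allows the task hypotheses to be contrasted against the same neural data.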