Authors & Affiliations
Adrian Roggenbach, Fritjof Helmchen
Abstract
Multisensory integration requires transformations between coordinate systems. When an object is touched and seen at the same time, tactile signals from the snout’s vibrissae arrive in the somatotopically organized primary whisker somatosensory cortex (wS1), whereas visual signals arrive in the retinotopically organized primary visual cortex (V1). The posterior parietal cortex (PPC), located between these areas, is a candidate cortical region for merging the two representations. However, how converging visual and tactile inputs from nearby objects are processed in these cortical areas remains unclear. To address this question, we investigate here how neurons in mouse wS1, V1, and PPC integrate visuotactile information about the location of a pole within reach of the whiskers. Using two-photon calcium imaging, we record neurons in L2/3 across the posterior cortex of head-fixed mice (n=11 mice, both sexes). A pole is presented at different rostro-caudal positions either in darkness or under illuminated conditions. We track whisker-pole interactions with a high-speed camera and record gaze direction to reconstruct the sensory signals at the periphery. We find that subsets of neurons in wS1, V1, and PPC show selectivity for specific locations in near space. In the primary sensory areas, location-selective cells are mainly activated by their respective primary modality and are organized along spatial gradients that match retinotopy and somatotopy, respectively. In PPC, location selectivity is driven by both visual and tactile signals, and cells are organized along a shared spatial gradient. These findings suggest that the posterior parietal cortex contains a visuotactile map of nearby space.