ePoster
Random compressed coding with neurons
Simone Blanco Malerba and 3 co-authors
COSYNE 2022
Mar 18, 2022
Lisbon, Portugal
Abstract
According to the efficient coding hypothesis, neural responses represent information so as to enable the most accurate readout possible, given constraints on neuronal resources and neuronal noise. To date, much of the theoretical work on efficient neural coding has focused on relatively simple models of neural activity, characterized by smooth, often unimodal tuning curves. Real neurons, however, often exhibit more complex tuning curves. In the entorhinal cortex, for instance, the periodicity of grid cell tuning curves, together with their functional organization into modules, endows the population code with an exponentially large dynamic range, defined as the ratio between the range of represented stimuli and the resolution. Recently, multiple other examples of neurons with complex but unstructured tuning curves have been identified. These findings lead us to ask whether highly efficient neural codes require fine organization, as in grid cells, or whether they can be realized with more complex and irregular tuning curves.

We approached this question with a benchmark model: a shallow neural network in which irregular tuning curves emerge from random synaptic weights. The synapses project from a large population of sensory neurons, each with a unimodal tuning curve over a one-dimensional stimulus, onto a smaller neural population. We observe a trade-off between two qualitatively different types of readout errors: 'local' errors, in which two nearby stimuli are confused, and 'global' errors, which cause a complete loss of information about the stimulus. Balancing the two error rates yields an optimal solution in which a population code with irregular tuning curves achieves an exponentially large dynamic range. Based on recordings from primate motor cortex, we argue that a compression balancing local and global errors also takes place there. Our results show that highly efficient codes do not require finely tuned response properties, and can emerge even in the presence of random synaptic connectivity.
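The benchmark model described in the abstract can be sketched in a few lines of NumPy. All parameter values below (population sizes, tuning width, noise level, the 0.1 threshold separating local from global errors) are illustrative assumptions, not values from the poster; the sketch only shows the structure of the model: unimodal sensory tuning curves, random compression, and a nearest-template readout that can make both small local errors and catastrophic global ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not taken from the poster)
N_sensory, N_out = 500, 20   # large sensory population -> smaller population
sigma_tc = 0.05              # width of the unimodal sensory tuning curves
noise_sd = 0.3               # response noise at readout

# Large sensory population: unimodal (Gaussian) tuning over a 1-D stimulus
grid = np.linspace(0.0, 1.0, 400)
centers = np.linspace(0.0, 1.0, N_sensory)
sensory = np.exp(-(grid[:, None] - centers[None, :]) ** 2 / (2 * sigma_tc**2))

# Random synaptic weights compress the code onto the smaller population,
# giving the downstream neurons irregular, multi-peaked tuning curves
W = rng.standard_normal((N_sensory, N_out)) / np.sqrt(N_sensory)
templates = sensory @ W      # noiseless population responses, one row per stimulus

# Noisy responses, decoded by matching to the nearest noiseless template
noisy = templates + noise_sd * rng.standard_normal(templates.shape)
dists = ((noisy[:, None, :] - templates[None, :, :]) ** 2).sum(axis=-1)
decoded = grid[np.argmin(dists, axis=1)]

# 'Local' errors: decoded stimulus close to the truth (nearby confusion).
# 'Global' errors: decoded value far from the truth (information lost).
err = np.abs(decoded - grid)
global_rate = np.mean(err >= 0.1)   # 0.1 is an arbitrary split for illustration
```

Increasing the noise or shrinking the downstream population shifts the balance between the two error types, which is the trade-off the poster analyzes.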