Neural Population Code
Christian Leibold
The lab of Christian Leibold invites applications from postdoctoral candidates on topics related to the geometry of neural manifolds. We will use spiking neural network simulations, analysis of massively parallel recordings, as well as techniques from differential geometry to understand the dynamics of the neural population code in the hippocampal formation in relation to complex cognitive behaviors. Our research group combines modelling of neural circuits with the development of machine learning techniques for data analysis. We strive for a diverse, interdisciplinary, and collaborative work environment.
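As a purely illustrative sketch of one kind of manifold-geometry analysis this line of work involves (not code from the lab), the snippet below simulates a population whose activity traces out a ring manifold, such as a head-direction code, and estimates its effective dimensionality with the participation ratio of the covariance spectrum. All parameters and the synthetic data are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population activity: 200 neurons whose mean rates lie on a
# 1-D ring manifold (e.g., an angular variable), embedded in the full
# N-dimensional rate space with additive noise. This stands in for real
# massively parallel recordings, which the lab analyzes.
n_neurons, n_samples = 200, 5000
theta = rng.uniform(0, 2 * np.pi, n_samples)          # latent variable
pref = rng.uniform(0, 2 * np.pi, n_neurons)           # preferred angles
activity = np.exp(np.cos(theta[:, None] - pref[None, :]) - 1)
activity += 0.05 * rng.standard_normal(activity.shape)

# Participation ratio of the covariance eigenvalues: a standard linear
# estimate of how many dimensions the neural manifold effectively spans.
cov = np.cov(activity.T)
eig = np.linalg.eigvalsh(cov)
pr = eig.sum() ** 2 / (eig ** 2).sum()
print(f"participation ratio: {pr:.1f} (low despite {n_neurons} neurons)")
```

In this toy setting the participation ratio comes out far below the neuron count, which is the linear signature of a low-dimensional manifold; the differential-geometric techniques mentioned above go beyond such linear estimates.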
Efficient Random Codes in a Shallow Neural Network
Efficient coding has served as a guiding principle in understanding the neural code. To date, however, it has been explored mainly in the context of peripheral sensory cells with simple tuning curves. By contrast, ‘deeper’ neurons such as grid cells come with more complex tuning properties, which imply a different, yet highly efficient, strategy for representing information. I will show that a highly efficient code is not specific to a population of neurons with finely tuned response properties: it emerges robustly in a shallow network with random synapses. Here, the geometry of population responses implies that optimality arises from a tradeoff between two qualitatively different types of error: ‘local’ errors (common to classical neural population codes) and ‘global’ (or ‘catastrophic’) errors. This tradeoff leads to efficient compression of information from a high-dimensional representation to a low-dimensional one. After describing the theoretical framework, I will use it to re-interpret recordings from the motor cortex of behaving monkeys. Our framework addresses the encoding of (sensory) information; if time allows, I will comment on ongoing work that focuses on decoding from the perspective of efficient coding.
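A minimal numerical sketch of the local-versus-catastrophic error distinction described in this abstract: a population with random, multi-peaked tuning encodes a circular stimulus, and a maximum-likelihood decoder occasionally jumps to a distant, spuriously likely stimulus. The random sinusoidal tuning curves, Poisson noise, and the 0.1 error threshold are illustrative assumptions, not the talk's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random, multi-peaked tuning: each of N neurons responds to a stimulus
# x in [0, 1) through a rectified sinusoid with random frequency/phase
# (an assumed stand-in for a shallow network with random synapses).
N = 50                                   # population size
T = 1.0                                  # integration time scaling counts
freqs = rng.integers(1, 6, size=N)       # random spatial frequencies
phases = rng.uniform(0, 2 * np.pi, N)    # random phases

def rates(x):
    """Mean firing rates of the population for stimulus x in [0, 1)."""
    return np.maximum(0.0, 10.0 * np.sin(2 * np.pi * freqs * x + phases))

# Maximum-likelihood decoding over a dense grid of candidate stimuli
# (Poisson log-likelihood, dropping terms constant across the grid).
grid = np.linspace(0, 1, 1000, endpoint=False)
grid_rates = np.stack([rates(x) for x in grid])          # (1000, N)

def decode(counts):
    loglik = counts @ np.log(grid_rates + 1e-9).T - T * grid_rates.sum(axis=1)
    return grid[np.argmax(loglik)]

# Split decoding errors into 'local' (small deviations around the true
# stimulus) and 'catastrophic' (jumps to a distant grid point). The 0.1
# threshold separating the two regimes is arbitrary.
errors = []
for _ in range(2000):
    x_true = rng.uniform(0, 1)
    counts = rng.poisson(T * rates(x_true))
    err = abs(decode(counts) - x_true)
    errors.append(min(err, 1 - err))     # circular distance on [0, 1)
errors = np.array(errors)
print("local RMSE:", errors[errors < 0.1].std())
print("catastrophic error rate:", (errors >= 0.1).mean())
```

In this toy setting, increasing the population size or integration time suppresses the catastrophic errors much faster than the local ones, which is one way to see the tradeoff the abstract refers to.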
Dissecting the neural processes supporting perceptual learning
The brain and its functions can be modified by various forms of learning. Learning-induced changes are seen even in basic perceptual functions. In particular, repeated training in a perceptual task can lead to a significant improvement in the trained task, a phenomenon known as perceptual learning. There has been a long-standing debate about the mechanisms underlying perceptual learning. In this talk, I will present results from our series of electrophysiological studies. These studies have consistently shown that perceptual learning is mediated by concerted changes in both perceptual and cognitive processes, resulting in improved sensory representations, enhanced top-down influences, and a refined readout process.
Mechanistic modeling of Drosophila neural population codes in natural social communication
COSYNE 2022
Homeostatic synaptic scaling optimizes learning in network models of neural population codes
COSYNE 2023