Correlations, chaos, and criticality in neural networks
The remarkable information-processing properties of biological and artificial neuronal networks alike arise from the interaction of large numbers of neurons. A central quest is thus to characterize their collective states. Moreover, the directed coupling between pairs of neurons and their continuous dissipation of energy place the dynamics of neuronal networks outside thermodynamic equilibrium. Tools from non-equilibrium statistical mechanics and field theory are thus instrumental to obtaining a quantitative understanding. We here present progress with this recent approach [1]. On the experimental side, we show how correlations between pairs of neurons are informative about the dynamics of cortical networks: they are poised near a transition to chaos [2]. Close to this transition, we find prolonged sequential memory for past signals [3]. In the chaotic regime, networks offer representations of information whose dimensionality expands with time; we show how this mechanism aids classification performance [4]. Together, these works illustrate the fruitful interplay between theoretical physics, neuronal networks, and neural information processing.
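As an illustration of the transition to chaos mentioned above, the following minimal sketch simulates a classic random rate network, dx/dt = -x + J tanh(x) with Gaussian couplings of variance g^2/N, which is known to switch from a decaying fixed point to chaotic activity as g crosses 1. This model and the parameter names are assumptions for illustration only, not the specific networks or analyses of the cited works.

```python
# Minimal sketch (illustrative, not the authors' model): a random rate network
# dx/dt = -x + J @ tanh(x), with Gaussian couplings of standard deviation g/sqrt(N).
# For g < 1 activity decays to the quiescent fixed point; for g > 1 it becomes chaotic.
import numpy as np

def simulate(g, N=200, T=50.0, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random coupling matrix
    x = rng.normal(0.0, 1.0, size=N)                  # random initial state
    traj = []
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x))            # forward Euler step
        traj.append(x.copy())
    return np.array(traj)

if __name__ == "__main__":
    for g in (0.5, 1.5):
        traj = simulate(g)
        # Late-time activity variance: close to zero below the transition, finite above it.
        print(f"g = {g}: late-time activity variance = {traj[-200:].var():.3f}")
```

Running the sketch shows the qualitative signature of the transition: for g = 0.5 the late-time variance is essentially zero, while for g = 1.5 the network sustains irregular, chaotic fluctuations.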
Can we predict the diversity of real populations? Part I: What is linked selection doing to populations?
Natural selection affects not only the selected alleles themselves but also, indirectly, all genes near selected sites on the genome. An increasing body of evidence suggests that this linked selection is an important driver of evolutionary dynamics throughout the genomes of many species, implying that we need to substantially revise our basic understanding of molecular evolution. This session brings together early-career researchers working towards a quantitative understanding of the prevalence and effects of linked selection.