recurrent connectivity
Hebbian learning, its inference, and brain oscillation
Despite the recent success of deep learning in artificial intelligence, the lack of biological plausibility and of labeled data in natural learning still poses a challenge for understanding biological learning. At the other extreme lies Hebbian learning, the simplest local and unsupervised rule, yet one considered computationally less efficient. In this talk, I will introduce a novel method to infer the form of Hebbian learning from in vivo data. Applying the method to data recorded from the monkey inferior temporal cortex during a recognition task indicates how Hebbian learning changes the dynamic properties of the circuits and may promote brain oscillation. Notably, recent electrophysiological data from rodent V1 showed that the effect of visual experience on direction selectivity was similar to that observed in the monkey data, providing strong validation of the asymmetric changes in feedforward and recurrent synaptic strengths inferred from the monkey data. This may suggest a general learning principle underlying the same computation, such as familiarity detection, across different features represented in different brain regions.
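The familiarity effect at the heart of this abstract can be illustrated with a minimal toy model. The sketch below is a generic single-neuron rate model trained with an Oja-normalized Hebbian update (dw ∝ pre × post, with a stabilizing decay term); it is an assumption for illustration, not the inference method described in the talk. Repeated presentation of one stimulus strengthens the weights along that stimulus, so the familiar input drives the unit more strongly than a novel input with the same rate distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rate model: N presynaptic inputs onto one postsynaptic unit.
N = 50
w = rng.random(N) * 0.1          # small positive initial weights
x = rng.random(N)                # rates of a repeatedly presented stimulus
eta = 0.01                       # learning rate

# Repeated presentation with a local Hebbian update, dw ∝ pre * post,
# stabilized by Oja's normalization term so weights stay bounded.
for _ in range(500):
    y = w @ x                    # postsynaptic rate
    w += eta * y * (x - y * w)   # Oja's rule

# A "novel" stimulus with the same rate distribution: a shuffled copy of x.
x_novel = rng.permutation(x)
resp_familiar = w @ x            # trained stimulus drives the unit strongly
resp_novel = w @ x_novel         # shuffled stimulus drives it less
```

Under Oja's rule the weight vector converges to the (normalized) repeated stimulus, so the familiar response approaches the stimulus norm while any reshuffled stimulus yields a strictly smaller projection (rearrangement inequality).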
A geometric framework to predict structure from function in neural networks
The structural connectivity matrix of synaptic weights between neurons is a critical determinant of overall network function. However, quantitative links between neural network structure and function are complex and subtle. For example, many networks can give rise to similar functional responses, and the same network can function differently depending on context. Whether certain patterns of synaptic connectivity are required to generate specific network-level computations is largely unknown. Here we introduce a geometric framework for identifying synaptic connections required by steady-state responses in recurrent networks of rectified-linear neurons. Assuming that the number of specified response patterns does not exceed the number of input synapses, we analytically calculate all feedforward and recurrent connectivity matrices that can generate the specified responses from the network inputs. We then use this analytical characterization to rigorously analyze the solution space geometry and derive certainty conditions guaranteeing a non-zero synapse between neurons.
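The key observation here is that, when every specified rate is positive, the fixed-point condition of a rectified-linear network is linear in the rows of the weight matrices, so connectivity consistent with the responses can be found by linear algebra. The sketch below is a simplified illustration of that idea under assumed random responses and inputs; `lstsq` returns only the minimum-norm point of the solution space whose full geometry the abstract's framework characterizes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Specified steady states r_k for K conditions must satisfy r = relu(W r + F x).
# If every specified rate is positive (all units active), the fixed-point
# condition is linear in each neuron's weights:
#   [r_k; x_k]^T [w_i; f_i] = r_ik   for every neuron i and condition k.
N, M, K = 4, 6, 5                  # neurons, inputs, conditions (K <= M assumed)
X = rng.random((M, K))             # network inputs per condition
R = rng.random((N, K)) + 0.1       # specified strictly positive responses

# Stack recurrent and input activity; solve one least-squares problem per neuron.
A = np.vstack([R, X]).T                          # K x (N + M), full row rank
WF = np.linalg.lstsq(A, R.T, rcond=None)[0].T    # N x (N + M), min-norm solution
W, F = WF[:, :N], WF[:, N:]                      # recurrent and feedforward weights

# Verify: the specified responses are fixed points of the rectified dynamics.
R_check = np.maximum(W @ R + F @ X, 0.0)
```

Because K ≤ N + M and `A` has full row rank, the linear system is underdetermined and admits an affine family of exact solutions; this is the solution space whose geometry determines which synapses are certain to be non-zero.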
Dimensions of variability in circuit models of cortex
Cortical circuits receive multiple inputs from upstream populations with non-overlapping stimulus tuning preferences. Both the feedforward and recurrent architectures of the receiving cortical layer will reflect this diverse input tuning. We study how population-wide neuronal variability propagates through a hierarchical cortical network receiving multiple, independent, tuned inputs. We present new analysis of in vivo neural data from the primate visual system showing that the number of latent variables (dimension) needed to describe population shared variability is smaller in V4 populations compared to those of its downstream visual area PFC. We successfully reproduce this dimensionality expansion in our V4 and PFC neural data using a multi-layer spiking network with structured, feedforward projections and recurrent assemblies of multiple, tuned neuron populations. We show that tuning-structured connectivity generates attractor dynamics within the recurrent PFC circuit, where attractor competition is reflected in the high-dimensional shared variability across the population. Indeed, restricting the dimensionality analysis to activity from one attractor state recovers the low-dimensional structure inherited from each of our tuned inputs. Our model thus introduces a framework where high-dimensional cortical variability is understood as "time-sharing" between distinct low-dimensional, tuning-specific circuit dynamics.
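The "time-sharing" picture can be demonstrated with a minimal synthetic example: activity confined to one low-dimensional subspace per attractor state looks low-dimensional within a state, but pooling across states inflates the measured dimensionality. The sketch below uses an assumed variance-explained criterion (90% of variance via PCA) rather than the specific dimensionality estimator used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

def n_dims(Z, var_frac=0.9):
    """Number of principal components needed to explain var_frac of variance."""
    Zc = Z - Z.mean(axis=0)
    ev = np.linalg.svd(Zc, compute_uv=False) ** 2   # PCA eigenvalue spectrum
    ev = ev / ev.sum()
    return int(np.searchsorted(np.cumsum(ev), var_frac) + 1)

# Two "attractor states": each confines N-dim activity to its own 2-d subspace,
# with an offset separating the states (mimicking tuning-specific dynamics).
N, T = 40, 500
basis_a = rng.normal(size=(2, N))
basis_b = rng.normal(size=(2, N))
state_a = rng.normal(size=(T, 2)) @ basis_a
state_b = rng.normal(size=(T, 2)) @ basis_b + 5.0

# Within one state, shared variability is low-dimensional; pooling over the
# states (as an analysis blind to attractor identity would) raises the count.
pooled = np.vstack([state_a, state_b])
d_within, d_pooled = n_dims(state_a), n_dims(pooled)
```

Restricting the analysis to one state recovers the planted two-dimensional structure, echoing the abstract's point that high-dimensional shared variability can reflect switching between low-dimensional circuit states.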
A robust neural integrator based on the interactions of three time scales
Neural integrators are circuits that are able to code analog information such as spatial location or amplitude. Storing amplitude requires the network to have a large number of attractors. In classic models based on recurrent excitation, such networks require very careful tuning to behave as integrators and are not robust to small mistuning of the recurrent weights. In this talk, I introduce a circuit with recurrent connectivity that is subjected to a slow subthreshold oscillation (such as the theta rhythm in the hippocampus). I show that such a network can robustly maintain many discrete attracting states. Furthermore, the firing rates of the neurons in these attracting states are much closer to those seen in recordings from animals. I show that the mechanism for this can be explained by the instability regions of the Mathieu equation. I then extend the model in various ways, showing, for example, that a spatially distributed network can code location and amplitude simultaneously. I show that the resulting mean-field equations are equivalent to a certain discontinuous differential equation.
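The Mathieu equation mentioned at the end is the canonical model of parametric instability: x'' + (a + 2q cos 2t) x = 0 has tongues in the (a, q) plane where a periodic modulation destabilizes an otherwise neutral oscillation. The sketch below is a plain numerical illustration of that classical fact (it is not the talk's network model): with the modulation on (q > 0, a near the principal tongue at a = 1) the amplitude grows, and with it off (q = 0) the amplitude stays bounded.

```python
import numpy as np

def mathieu_amplitude(a, q, n_periods=20, dt=1e-3):
    """Integrate x'' + (a + 2 q cos(2 t)) x = 0 from x(0)=1, x'(0)=0
    with a simple RK4 scheme and return the final amplitude sqrt(x^2 + v^2)."""
    def acc(x, t):
        return -(a + 2.0 * q * np.cos(2.0 * t)) * x

    x, v, t = 1.0, 0.0, 0.0
    T = n_periods * np.pi
    while t < T:
        k1x, k1v = v, acc(x, t)
        k2x, k2v = v + 0.5 * dt * k1v, acc(x + 0.5 * dt * k1x, t + 0.5 * dt)
        k3x, k3v = v + 0.5 * dt * k2v, acc(x + 0.5 * dt * k2x, t + 0.5 * dt)
        k4x, k4v = v + dt * k3v, acc(x + dt * k3x, t + dt)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
    return float(np.hypot(x, v))

# Inside the principal instability tongue (a = 1, q > 0) the oscillation grows
# exponentially; without modulation (q = 0) the amplitude remains near 1.
amp_unstable = mathieu_amplitude(a=1.0, q=0.3)
amp_stable = mathieu_amplitude(a=1.0, q=0.0)
```

In the talk's mechanism, the slow subthreshold oscillation plays the role of the parametric drive, and the discrete attracting states of the integrator correspond to operating points organized by these instability regions.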
How development sculpts memory circuits
In mammals, the selective transformation of transient experience into stored memory occurs in the hippocampus, which develops representations of specific events in the context in which they occur. In this talk, I will focus on the development of hippocampal circuits and the self-organized dynamics embedded in them, since these dynamics critically support the role of the hippocampus in memory. I will discuss evidence that adult hippocampal cells and circuits are remarkably sculpted by development, as early as embryonic neurogenesis. We argue that these primary developmental programs provide a scaffold onto which later experience of the external world can be grafted. Next, I will present data on the emergence of recurrent connectivity and self-organized dynamics in hippocampal circuits and outline the critical turning points and discontinuities in that developmental journey.
Investigating the role of recurrent connectivity in connectome-constrained and task-optimized models of the fruit fly’s motion pathway
Bernstein Conference 2024
Clustered recurrent connectivity promotes the development of E/I co-tuning via synaptic plasticity
COSYNE 2022
Recurrent connectivity supports motion detection in connectome-constrained models of fly vision
COSYNE 2025