Synaptic Clustering
Learning and Memory
This webinar on learning and memory features three experts—Nicolas Brunel, Ashok Litwin-Kumar, and Julijana Gjorgieva—who present theoretical and computational approaches to understanding how neural circuits acquire and store information across different scales. Brunel discusses calcium-based plasticity and how standard “Hebbian-like” plasticity rules inferred from in vitro or in vivo datasets constrain synaptic dynamics, aligning with classical observations (e.g., STDP) and explaining how synaptic connectivity shapes memory. Litwin-Kumar explores insights from the fruit fly connectome, emphasizing how the mushroom body—a key site for associative learning—implements a high-dimensional, random representation of sensory features. Convergent dopaminergic inputs gate plasticity, reflecting a high-dimensional “critic” that refines behavior. Feedback loops within the mushroom body further reveal sophisticated interactions between learning signals and action selection. Gjorgieva examines how activity-dependent plasticity rules shape circuitry from the subcellular (e.g., synaptic clustering on dendrites) to the cortical network level. She demonstrates how spontaneous activity during development, Hebbian competition, and inhibitory-excitatory balance collectively establish connectivity motifs responsible for key computations such as response normalization.
Learning binds novel inputs into functional synaptic clusters via spinogenesis
Learning is known to induce the formation of new dendritic spines, but despite decades of effort, the functional properties of new spines in vivo remain unknown. Here, combining longitudinal in vivo two-photon imaging of the glutamate reporter iGluSnFR with correlative light and electron microscopy (CLEM) of dendritic spines on the apical dendrites of L2/3 excitatory neurons in the motor cortex during motor learning, we describe a framework for the formation, survival, and resulting function of new spines. Specifically, our data indicate that the potentiation of a subset of clustered, pre-existing spines showing task-related activity in early learning sessions creates a micro-environment of plasticity within dendrites: multiple filopodia sample the nearby neuropil, form connections with pre-existing boutons connected to allodendritic spines, and are then selected for survival based on co-activity with nearby task-related spines. Thus, the formation and survival of new spines are determined by the functional micro-environment of the dendrite. After formation, new spines show preferential co-activation with nearby task-related spines. This synchronous activity is more movement-specific than the activation of the individual spines in isolation and, moreover, coincides with movements that more closely resemble the learned pattern. Thus, new spines functionally engage with their parent clusters to signal the learned movement. Finally, by reconstructing the axons associated with new spines, we found that they synapse with axons previously unrepresented in these dendritic domains, suggesting that the strong local co-activity structure exhibited by new spines is likely not due to axon sharing. Thus, learning binds new information streams into functional synaptic clusters that subserve the learned behavior.
Data-driven reduction of dendritic morphologies with preserved dendro-somatic responses
There is little consensus on the level of spatial complexity at which dendrites operate. On the one hand, emerging evidence indicates that synapses cluster at micrometer spatial scales. On the other hand, most modelling and network studies ignore dendrites altogether. This dichotomy raises an urgent question: what is the smallest relevant spatial scale for understanding dendritic computation? We have developed a method to construct compartmental models at any level of spatial complexity. Through carefully chosen parameter fits, solvable in the least-squares sense, we obtain accurate reduced compartmental models. We are thus able to systematically construct passive as well as active dendrite models at varying degrees of spatial complexity, and we evaluate which elements of the dendritic computational repertoire these models capture. We show that many canonical elements of the dendritic computational repertoire can be reproduced with few compartments. For instance, for a model to behave as a two-layer network, it suffices to fit a reduced model at the soma and at the dendritic tips. In the basal dendrites of an L2/3 pyramidal model, we reproduce the backpropagation of somatic action potentials (APs) with a single dendritic compartment at the tip. Further, we obtain the well-known Ca-spike coincidence-detection mechanism in L5 pyramidal cells with as few as eleven compartments, the requirement being that their spacing along the apical trunk supports AP backpropagation. We also investigate whether afferent spatial connectivity motifs admit simplification by ablating targeted branches and grouping the affected synapses onto the next proximal dendrite. We find that the voltage in the remaining branches is reproduced if temporal conductance fluctuations stay below a limit that depends on the average difference in input resistance between the ablated branches and the next proximal dendrite.
Consequently, when the average conductance load on distal synapses is constant, the dendritic tree can be simplified while the synaptic weights are decreased appropriately. When the conductance level fluctuates strongly, for instance through a priori unpredictable fluctuations in NMDA activation, no constant weight-rescale factor can be found and the dendrite cannot be simplified. We have created an open-source Python toolbox (NEAT - https://neatdend.readthedocs.io/en/latest/) that automates the simplification process. A NEST implementation of the reduced models, currently under construction, will enable the simulation of few-compartment models in large-scale networks, thus bridging the gap between the cellular and network levels of neuroscience.
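The least-squares fit underlying such a reduction can be illustrated in a minimal passive setting. The sketch below is not NEAT's actual API; the two-site layout (soma, dendritic tip) and the impedance values are invented for illustration. It exploits the fact that an exact reduced model satisfies G @ Z = I, which is linear in the unknown conductances, so the reduced model's leak and coupling conductances can be found with an ordinary least-squares solve.

```python
import numpy as np

# Hypothetical target: steady-state impedance matrix (MOhm) of the full
# morphology, evaluated at two sites (soma, dendritic tip). In practice
# this would be computed from the detailed compartmental model.
Z_target = np.array([[50.0, 20.0],
                     [20.0, 200.0]])

# Reduced two-compartment model: unknowns are the leak conductances
# (g0, g1) and the coupling conductance gc (all in uS). Its conductance
# matrix is
#   G = [[g0 + gc, -gc],
#        [-gc,      g1 + gc]]
# An exact reduced model satisfies G @ Z = I, so we fit theta = (g0, g1, gc)
# in the least-squares sense to minimise ||G(theta) @ Z_target - I||.

def design_matrix(Z):
    """Each entry of G @ Z is linear in theta; build the linear system."""
    rows, rhs = [], []
    for i in range(2):
        for j in range(2):
            c_g0 = Z[0, j] if i == 0 else 0.0   # coefficient of g0
            c_g1 = Z[1, j] if i == 1 else 0.0   # coefficient of g1
            c_gc = Z[i, j] - Z[1 - i, j]        # coefficient of gc
            rows.append([c_g0, c_g1, c_gc])
            rhs.append(1.0 if i == j else 0.0)  # identity target
    return np.array(rows), np.array(rhs)

A, b = design_matrix(Z_target)
theta, *_ = np.linalg.lstsq(A, b, rcond=None)
g0, g1, gc = theta

# Verify: impedance matrix of the fitted reduced model.
G = np.array([[g0 + gc, -gc], [-gc, g1 + gc]])
Z_fit = np.linalg.inv(G)
print(np.round(Z_fit, 1))  # should closely match Z_target
```

For two sites the fit is exact; with more evaluation sites than reduced compartments the same linear system becomes overdetermined and the least-squares solution yields the best-matching reduced conductances, which is the spirit of the method described above.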
Redox-dependent synaptic clustering of gephyrin
FENS Forum 2024