Balance
Astrocytes: From Metabolism to Cognition
Different brain cell types exhibit distinct metabolic signatures that link energy economy to cellular function. Astrocytes and neurons, for instance, diverge dramatically in their reliance on glycolysis versus oxidative phosphorylation, underscoring that metabolic fuel efficiency is not uniform across cell types. A key factor shaping this divergence is the structural organization of the mitochondrial respiratory chain into supercomplexes. Specifically, complexes I (CI) and III (CIII) form a CI–CIII supercomplex, but the degree of this assembly varies by cell type. In neurons, CI is predominantly integrated into supercomplexes, resulting in highly efficient mitochondrial respiration and minimal reactive oxygen species (ROS) generation. Conversely, in astrocytes, a larger fraction of CI remains unassembled, freely existing apart from CIII, leading to reduced respiratory efficiency and elevated mitochondrial ROS production. Despite this apparent inefficiency, astrocytes boast a highly adaptable metabolism capable of responding to diverse stressors. Their looser CI–CIII organization allows for flexible ROS signaling, which activates antioxidant programs via transcription factors like Nrf2. This modular architecture enables astrocytes not only to balance energy production but also to support neuronal health and influence complex organismal behaviors.
FLUXSynID: High-Resolution Synthetic Face Generation for Document and Live Capture Images
Synthetic face datasets are increasingly used to overcome the limitations of real-world biometric data, including privacy concerns, demographic imbalance, and high collection costs. However, many existing methods lack fine-grained control over identity attributes and fail to produce paired, identity-consistent images under structured capture conditions. In this talk, I will present FLUXSynID, a framework for generating high-resolution synthetic face datasets with user-defined identity attribute distributions and paired document-style and trusted live capture images. The dataset generated using FLUXSynID shows improved alignment with real-world identity distributions and greater diversity compared to prior work. I will also discuss how FLUXSynID’s dataset and generation tools can support research in face recognition and morphing attack detection (MAD), enhancing model robustness in both academic and practical applications.
Decision and Behavior
This webinar addressed computational perspectives on how animals and humans make decisions, spanning normative, descriptive, and mechanistic models. Sam Gershman (Harvard) presented a capacity-limited reinforcement learning framework in which policies are compressed under an information bottleneck constraint. This approach predicts pervasive perseveration, stimulus‐independent “default” actions, and trade-offs between complexity and reward. Such policy compression reconciles observed action stochasticity and response time patterns with an optimal balance between learning capacity and performance. Jonathan Pillow (Princeton) discussed flexible descriptive models for tracking time-varying policies in animals. He introduced dynamic Generalized Linear Models (Sidetrack) and hidden Markov models (GLM-HMMs) that capture day-to-day and trial-to-trial fluctuations in choice behavior, including abrupt switches between “engaged” and “disengaged” states. These models provide new insights into how animals’ strategies evolve under learning. Finally, Kenji Doya (OIST) highlighted the importance of unifying reinforcement learning with Bayesian inference, exploring how cortical-basal ganglia networks might implement model-based and model-free strategies. He also described Japan’s Brain/MINDS 2.0 and Digital Brain initiatives, aiming to integrate multimodal data and computational principles into cohesive “digital brains.”
Learning and Memory
This webinar on learning and memory features three experts—Nicolas Brunel, Ashok Litwin-Kumar, and Julijana Gjorgieva—who present theoretical and computational approaches to understanding how neural circuits acquire and store information across different scales. Brunel discusses calcium-based plasticity and how standard “Hebbian-like” plasticity rules inferred from in vitro or in vivo datasets constrain synaptic dynamics, aligning with classical observations (e.g., STDP) and explaining how synaptic connectivity shapes memory. Litwin-Kumar explores insights from the fruit fly connectome, emphasizing how the mushroom body—a key site for associative learning—implements a high-dimensional, random representation of sensory features. Convergent dopaminergic inputs gate plasticity, reflecting a high-dimensional “critic” that refines behavior. Feedback loops within the mushroom body further reveal sophisticated interactions between learning signals and action selection. Gjorgieva examines how activity-dependent plasticity rules shape circuitry from the subcellular (e.g., synaptic clustering on dendrites) to the cortical network level. She demonstrates how spontaneous activity during development, Hebbian competition, and inhibitory-excitatory balance collectively establish connectivity motifs responsible for key computations such as response normalization.
Current and future trends in neuroimaging
With the advent of several different fMRI analysis tools and packages outside of the established ones (i.e., SPM, AFNI, and FSL), today's researcher may wonder what the best practices are for fMRI analysis. This talk will discuss some of the recent trends in neuroimaging, including design optimization and power analysis, standardized analysis pipelines such as fMRIPrep, and an overview of current recommendations for how to present neuroimaging results. Along the way we will discuss the balance between Type I and Type II errors with different correction mechanisms (e.g., Threshold-Free Cluster Enhancement and Equitable Thresholding and Clustering), as well as considerations for working with large open-access databases.
Sex hormone regulation of neural gene expression
Gonadal steroid hormones are the principal drivers of sex-variable biology in vertebrates. In the brain, estrogen (17β-estradiol) establishes neural sex differences in many species and modulates mood, behavior, and energy balance in adulthood. To understand the diverse effects of estradiol on the brain, we profiled the genomic binding of estrogen receptor alpha (ERα), providing the first picture of the neural actions of any gonadal hormone receptor. To relate ERα target genes to brain sex differences we assessed gene expression and chromatin accessibility in the posterior bed nucleus of the stria terminalis (BNSTp), a sexually dimorphic node in limbic circuitry that underlies sex-differential social behaviors such as aggression and parenting. In adult animals we observe that levels of ERα are predictive of the extent of sex-variable gene expression, and that these sex differences are a dynamic readout of acute hormonal state. In neonates we find that transient ERα recruitment at birth leads to persistent chromatin opening and male-biased gene expression, demonstrating a true epigenetic mechanism for brain sexual differentiation. Collectively, our findings demonstrate that sex differences in gene expression in the brain are a readout of state-dependent hormone receptor actions, rather than other factors such as sex chromosomes. We anticipate that the ERα targets we have found will contribute to established sex differences in the incidence and etiology of neurological and psychiatric disorders.
The size and structure of the dendritic arbor play important roles in determining how synaptic inputs to neurons are converted to action potential output and how neurons are integrated into the surrounding neuronal network. Accordingly, neurons with aberrant morphology have been associated with neurological disorders. Dysmorphic, enlarged neurons are, for example, a hallmark of focal epileptogenic lesions such as focal cortical dysplasia (FCDIIb) and gangliogliomas (GG). However, the regulatory mechanisms governing the development of dendrites are insufficiently understood. The evolutionarily conserved Ste20/Hippo kinase pathway has been proposed to play an important role in regulating the formation and maintenance of dendritic architecture. A key element of this pathway, Ste20-like kinase (SLK), regulates cytoskeletal dynamics in non-neuronal cells and is strongly expressed throughout neuronal development. Nevertheless, its function in neurons is unknown. We found that during the development of mouse cortical neurons, SLK plays a surprisingly specific role in the proper elaboration of higher-order (≥ 3rd order) dendrites, both in cultured neurons and in living mice. Moreover, SLK is required to maintain excitation-inhibition balance. Specifically, SLK knockdown causes a selective loss of inhibitory synapses and functional inhibition after postnatal day 15, while excitatory neurotransmission is unaffected. This mechanism may be relevant for human disease, as dysmorphic neurons within human cortical malformations exhibit a significant loss of SLK expression. To uncover the signaling cascades underlying the action of SLK, we combined phosphoproteomics, protein interaction screens and single-cell RNA-seq. Overall, our data identify SLK as a key regulator both of dendritic complexity during development and of inhibitory synapse maintenance.
The balanced brain: two-photon microscopy of inhibitory synapse formation
Coordination between excitatory and inhibitory synapses (providing positive and negative signals, respectively) is required to ensure proper information processing in the brain. Many brain disorders, especially neurodevelopmental disorders, are rooted in a specific disturbance of this coordination. In my research group we use a combination of two-photon microscopy and electrophysiology to examine how inhibitory synapses are formed and how this formation is coordinated with nearby excitatory synapses.
The balance hypothesis for the avian lumbosacral organ and an exploration of its morphological variation
Obesity and the Brain – Bidirectional Influences
The regulation of body weight relies on homeostatic mechanisms that use a combination of internal signals and external cues to initiate and terminate food intake. Homeostasis depends on intricate communication between the body and the hypothalamus involving numerous neural and hormonal signals. However, there is growing evidence that higher-level cognitive function may also influence energy balance. For instance, research has shown that BMI is consistently linked to various brain, cognitive, and personality measures, implicating executive, reward, and attentional systems. Moreover, the rise in obesity rates over the past half-century is attributed to the affordability and widespread availability of highly processed foods, a phenomenon that contradicts the idea that food intake is solely regulated by homeostasis. I will suggest that prefrontal systems involved in value computation and motivation act to limit food overconsumption when food is scarce or expensive, but promote overeating when food is abundant, an optimal strategy from an economic standpoint. I will review the genetic and neuroscience literature on the CNS control of body weight. I will present recent studies supporting a role of prefrontal systems in weight control. I will also present contradictory evidence showing that frontal executive and cognitive findings in obesity may be a consequence, not a cause, of increased hunger. Finally, I will review the effects of obesity on brain anatomy and function. Chronic adiposity leads to cerebrovascular dysfunction, cortical thinning, and cognitive impairment. As the most common preventable risk factor for dementia, obesity poses a significant threat to brain health. I will conclude by reviewing evidence for treatment of obesity in adults to prevent brain disease.
From cells to systems: multiscale studies of the epileptic brain
It is increasingly recognized that epilepsy affects human brain organization across multiple scales, ranging from cellular alterations in specific regions to macroscale network imbalances. My talk will give an overview of an emerging paradigm that integrates cellular, neuroimaging, and network modelling approaches to faithfully characterize the extent of structural and functional alterations in the common epilepsies. I will also discuss how a multiscale framework can help derive clinically useful biomarkers of dysfunction, and how these methods may guide surgical planning and prognostics.
Uncovering the molecular effectors of diet and exercise
Despite the profound effects of nutrition and physical activity on human health, our understanding of the molecules mediating the salutary effects of specific foods or activities remains remarkably limited. Here, we share our ongoing studies that use unbiased and high-resolution metabolomics technologies to uncover the molecules and molecular effectors of diet and exercise. We describe how exercise stimulates the production of Lac-Phe, a blood-borne signaling metabolite that suppresses feeding and obesity. Ablation of Lac-Phe biosynthesis in mice increases food intake and obesity after exercise. We also describe the discovery of an orphan metabolite, BHB-Phe. Ketosis-inducible BHB-Phe is a congener of exercise-inducible Lac-Phe, produced in CNDP2+ cells when levels of BHB are high, and functions to lower body weight and adiposity in ketosis. Our data uncover an unexpected and underappreciated signaling role for metabolic fuel derivatives in mediating the cardiometabolic benefits of diet and exercise. These data also suggest that diet and exercise may mediate their physiologic effects on energy balance via a common family of molecules and overlapping signaling pathways.
Off the rails - how pathological patterns of whole brain activity emerge in epileptic seizures
In most brains across the animal kingdom, dynamics can enter pathological states that are recognisable as epileptic seizures. Yet brains usually operate within constraints, imposed by neuronal function and synaptic coupling, that prevent epileptic seizure dynamics from emerging. In this talk, I will bring together different approaches to identifying how networks in the broadest sense shape brain dynamics. Using illustrative examples ranging from intracranial EEG recordings, to disorders characterised by molecular disruption of a single neurotransmitter receptor type, to single-cell recordings of whole-brain activity in the larval zebrafish, I will address three key questions: (1) how does the regionally specific composition of synaptic receptors shape ongoing physiological brain activity; (2) how can disruption of this regionally specific balance result in abnormal brain dynamics; and (3) which cellular patterns underlie the transition into an epileptic seizure.
Integration of 3D human stem cell models derived from post-mortem tissue and statistical genomics to guide schizophrenia therapeutic development
Schizophrenia is a neuropsychiatric disorder characterized by positive symptoms (such as hallucinations and delusions), negative symptoms (such as avolition and withdrawal) and cognitive dysfunction. Schizophrenia is highly heritable, and genetic studies are playing a pivotal role in identifying potential biomarkers and causal disease mechanisms with the hope of informing new treatments. Genome-wide association studies (GWAS) have identified nearly 270 loci with a high statistical association with schizophrenia risk; however, each locus confers only a small increase in risk, making it difficult to translate these findings into an understanding of disease biology that can lead to treatments. Induced pluripotent stem cell (iPSC) models are a tractable system in which to translate genetic findings and interrogate mechanisms of pathogenesis. Mounting research with patient-derived iPSCs has proposed several neurodevelopmental pathways altered in SCZ, such as altered neural progenitor cell (NPC) proliferation and imbalanced differentiation of excitatory and inhibitory cortical neurons. However, it is unclear what exactly these iPSC models recapitulate, how potential perturbations of early brain development translate into illness in adults, and how iPSC models that represent fetal stages can be utilized to further drug development efforts to treat adult illness. I will present the largest transcriptome analysis of post-mortem caudate nucleus in schizophrenia, in which we discovered that decreased presynaptic DRD2 autoregulation is the causal dopamine risk factor for schizophrenia (Benjamin et al., Nature Neuroscience 2022, https://doi.org/10.1038/s41593-022-01182-7). We developed stem cell models from a subset of the postmortem cohort to better understand the molecular underpinnings of human psychiatric disorders (Sawada et al., Stem Cell Research 2020). We established a method for the differentiation of iPS cells into ventral forebrain organoids and performed single-cell RNA-seq and cellular phenotyping. To our knowledge, this is the first study to evaluate iPSC models of SCZ from the same individuals as the postmortem tissue. Our study establishes that striatal neurons in patients with SCZ carry abnormalities that originated during early brain development. Differentiation of inhibitory neurons is accelerated whereas excitatory neuronal development is delayed, implicating an excitation-inhibition (E-I) imbalance during early brain development in SCZ. We found a significant overlap of genes upregulated in the inhibitory neurons in SCZ organoids with genes upregulated in postmortem caudate tissue from patients with SCZ compared with control individuals, including the donors of our iPS cell cohort. Altogether, we demonstrate that ventral forebrain organoids derived from postmortem tissue of individuals with schizophrenia recapitulate perturbed striatal gene expression dynamics of the donors’ brains (Sawada et al., bioRxiv 2022, https://doi.org/10.1101/2022.05.26.493589).
Dynamics of cortical circuits: underlying mechanisms and computational implications
A signature feature of cortical circuits is the irregularity of neuronal firing, which manifests itself in the high temporal variability of spiking and the broad distribution of rates. Theoretical works have shown that this feature emerges dynamically in network models if coupling between cells is strong, i.e. if the mean number of synapses per neuron K is large and synaptic efficacy is of order 1/√K. However, the degree to which these models capture the mechanisms underlying neuronal firing in cortical circuits is not fully understood. Results have been derived using neuron models with current-based synapses, i.e. neglecting the dependence of synaptic current on the membrane potential, and an understanding of how irregular firing emerges in models with conductance-based synapses is still lacking. Moreover, at odds with the nonlinear responses to multiple stimuli observed in cortex, network models with strongly coupled cells respond linearly to inputs. In this talk, I will discuss the emergence of irregular firing and nonlinear response in networks of leaky integrate-and-fire neurons. First, I will show that, when synapses are conductance-based, irregular firing emerges if synaptic efficacy is of order 1/log(K) and, unlike in current-based models, persists even under the large heterogeneity of connections which has been reported experimentally. I will then describe an analysis of neural responses as a function of coupling strength and show that, while a linear input-output relation is ubiquitous at strong coupling, nonlinear responses are prominent at moderate coupling. I will conclude by discussing experimental evidence of moderate coupling and loose balance in the mouse cortex.
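For reference, the two synaptic-efficacy scalings contrasted in this abstract can be summarized as follows (a restatement of the stated results, with J0 and g0 as unspecified constants):

```latex
% Current-based synapses (classical balanced state) vs. the conductance-based
% result described in the talk; K is the mean number of synapses per neuron.
J_{\mathrm{current}} \sim \frac{J_0}{\sqrt{K}}, \qquad
g_{\mathrm{conductance}} \sim \frac{g_0}{\log K}
```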
CNStalk: Finding the network balance in Parkinson’s hallucinations
Universal function approximation in balanced spiking networks through convex-concave boundary composition
The spike-threshold nonlinearity is a fundamental, yet enigmatic, component of biological computation — despite its role in many theories, it has evaded definitive characterisation. Indeed, much classic work has attempted to limit the focus on spiking by smoothing over the spike threshold or by approximating spiking dynamics with firing-rate dynamics. Here, we take a novel perspective that captures the full potential of spike-based computation. Based on previous studies of the geometry of efficient spike-coding networks, we consider a population of neurons with low-rank connectivity, allowing us to cast each neuron’s threshold as a boundary in a space of population modes, or latent variables. Each neuron divides this latent space into subthreshold and suprathreshold areas. We then demonstrate how a network of inhibitory (I) neurons forms a convex, attracting boundary in the latent coding space, and a network of excitatory (E) neurons forms a concave, repellant boundary. Finally, we show how the combination of the two yields stable dynamics at the crossing of the E and I boundaries, and can be mapped onto a constrained optimization problem. The resultant EI networks are balanced, inhibition-stabilized, and exhibit asynchronous irregular activity, thereby closely resembling cortical networks of the brain. Moreover, we demonstrate how such networks can be tuned to either suppress or amplify noise, and how the composition of inhibitory convex and excitatory concave boundaries can result in universal function approximation. Our work puts forth a new theory of biologically-plausible computation in balanced spiking networks, and could serve as a novel framework for scalable and interpretable computation with spikes.
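One way to write the geometric picture sketched above (notation assumed here, not taken from the talk): with low-rank connectivity, each neuron reads out a latent variable through its decoding weights, and its threshold defines a hyperplane in latent space.

```latex
% Neuron i's voltage as a projection of the latent variable x, and its
% spike condition as a boundary (half-space) in latent space:
V_i = \mathbf{w}_i^{\top} \mathbf{x}, \qquad
\text{spike when } \mathbf{w}_i^{\top} \mathbf{x} \ge T_i .
% The joint subthreshold region of a population, the intersection of the
% half-spaces \{\mathbf{x} : \mathbf{w}_i^{\top}\mathbf{x} < T_i\}, is convex,
% which is the sense in which an inhibitory population can form a convex
% boundary; the complementary construction gives the concave (excitatory) case.
```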
Designing the BEARS (Both Ears) Virtual Reality Training Package to Improve Spatial Hearing in Young People with Bilateral Cochlear Implants
Results: the main areas which were modified based on participatory feedback were the variety of immersive scenarios to cover a range of ages and interests, the number of levels of complexity to ensure small improvements were measured, the feedback and reward schemes to ensure positive reinforcement, and specific provision for participants with balance issues, who had difficulties when using head-mounted displays. We also added login options for other members of the family and, based on patient feedback, improved the accompanying reward schemes. The effectiveness of the finalised BEARS suite will be evaluated in a large-scale clinical trial. Conclusions: Through participatory design we have developed a training package (BEARS) for young people with bilateral cochlear implants. The training games are appropriate for use by the study population and should ultimately lead to patients taking control of their own management, reducing the reliance upon outpatient-based rehabilitation programmes. Virtual reality training provides a more relevant and engaging approach to rehabilitation for young people.
Introducing dendritic computations to SNNs with Dendrify
Current SNN studies frequently ignore dendrites, the thin membranous extensions of biological neurons that receive and preprocess nearly all synaptic inputs in the brain. However, decades of experimental and theoretical research suggest that dendrites possess compelling computational capabilities that greatly influence neuronal and circuit functions. Notably, standard point-neuron networks cannot adequately capture most hallmark dendritic properties, while biophysically detailed neuron models are impractical for large-network simulations due to their complexity and high computational cost. For this reason, we introduce Dendrify, a new theoretical framework combined with an open-source Python package (compatible with Brian2) that facilitates the development of bioinspired SNNs. Through simple commands, Dendrify can generate reduced compartmental neuron models with simplified yet biologically relevant dendritic and synaptic integrative properties. Such models strike a good balance between flexibility, performance, and biological accuracy, allowing us to explore dendritic contributions to network-level functions while paving the way for more realistic neuromorphic systems.
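As a rough illustration of the kind of reduced compartmental model described above, here is a minimal two-compartment (soma plus dendrite) neuron written directly in Brian2. This is not Dendrify's API; coupling and all parameters are placeholders.

```python
from brian2 import *

# A minimal two-compartment (soma + dendrite) leaky neuron written in plain
# Brian2 equations; a sketch of the style of reduced model described in the
# abstract, NOT Dendrify's own interface. All parameters are placeholders.
EL = -70*mV          # resting potential
tau = 20*ms          # membrane time constant (both compartments, for simplicity)
g_c = 0.3            # dimensionless somato-dendritic coupling

eqs = '''
dv_s/dt = (-(v_s - EL) + g_c*(v_d - v_s) + I_s) / tau : volt
dv_d/dt = (-(v_d - EL) + g_c*(v_s - v_d) + I_d) / tau : volt
I_s : volt
I_d : volt
'''

neuron = NeuronGroup(1, eqs, threshold='v_s > -50*mV', reset='v_s = EL',
                     method='euler')
neuron.v_s = EL
neuron.v_d = EL
neuron.I_d = 150*mV      # depolarizing drive delivered to the dendrite only

spikes = SpikeMonitor(neuron)
run(200*ms)
print("somatic spikes driven purely by dendritic input:", spikes.count[0])
```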
Timescales of neural activity: their inference, control, and relevance
Timescales characterize how fast the observables change in time. In neuroscience, they can be estimated from the measured activity and can be used, for example, as a signature of the memory trace in the network. I will first discuss the inference of the timescales from the neuroscience data comprised of the short trials and introduce a new unbiased method. Then, I will apply the method to the data recorded from a local population of cortical neurons from the visual area V4. I will demonstrate that the ongoing spiking activity unfolds across at least two distinct timescales - fast and slow - and the slow timescale increases when monkeys attend to the location of the receptive field. Which models can give rise to such behavior? Random balanced networks are known for their fast timescales; thus, a change in the neurons or network properties is required to mimic the data. I will propose a set of models that can control effective timescales and demonstrate that only the model with strong recurrent interactions fits the neural data. Finally, I will discuss the timescales' relevance for behavior and cortical computations.
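To make the inference problem concrete, the sketch below fits an exponential to the trial-averaged autocorrelation of short Ornstein-Uhlenbeck surrogate trials. Such a direct fit is the kind of naive estimator that is biased on short trials; the unbiased method presented in the talk is not reproduced here, and all parameters are made up.

```python
import numpy as np

# Naive timescale estimate for illustration: fit an exponential decay to the
# autocorrelation averaged over short trials. This direct fit is known to be
# biased for short data, which is the problem an unbiased estimator addresses.
rng = np.random.default_rng(0)

def ou_trials(tau, dt, T, n_trials):
    """Ornstein-Uhlenbeck surrogate trials with ground-truth timescale tau."""
    n = int(T / dt)
    x = np.zeros((n_trials, n))
    for t in range(1, n):
        x[:, t] = x[:, t - 1] * (1 - dt / tau) + rng.normal(size=n_trials) * np.sqrt(dt)
    return x

def fitted_timescale(x, dt, max_lag=50):
    """Trial-averaged autocorrelation, then a log-linear fit of its decay."""
    x = x - x.mean(axis=1, keepdims=True)          # per-trial mean subtraction
    ac = np.array([np.mean(x[:, :-k] * x[:, k:]) for k in range(1, max_lag)])
    ac /= np.mean(x * x)
    lags = dt * np.arange(1, max_lag)
    good = ac > 0                                  # keep lags usable for log fit
    slope = np.polyfit(lags[good], np.log(ac[good]), 1)[0]
    return -1.0 / slope

trials = ou_trials(tau=100.0, dt=1.0, T=500.0, n_trials=50)   # short trials (ms)
print("ground truth: 100 ms, naive fit:", round(fitted_timescale(trials, 1.0), 1))
```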
The balance of excitation and inhibition and a canonical cortical computation
Excitatory and inhibitory (E & I) inputs to cortical neurons remain balanced across different conditions. The balanced network model provides a self-consistent account of this observation: population rates dynamically adjust to yield a state in which all neurons are active at biological levels, with their E & I inputs tightly balanced. But global tight E/I balance predicts population responses with linear stimulus dependence and does not account for systematic cortical response nonlinearities such as divisive normalization, a canonical brain computation. However, when the connectivity conditions necessary for global balance fail, states arise in which only a localized subset of neurons are active and have balanced inputs. We analytically show that in networks of neurons with different stimulus selectivities, the emergence of such localized balance states robustly leads to normalization, including sublinear integration and winner-take-all behavior. An alternative model that exhibits normalization is the Stabilized Supralinear Network (SSN), which predicts a regime of loose, rather than tight, E/I balance. However, an understanding of the causal relationship between E/I balance and normalization in the SSN, and of the conditions under which the SSN yields significant sublinear integration, has been lacking. For weak inputs, the SSN integrates inputs supralinearly, while for very strong inputs it approaches a regime of tight balance. We show that when this latter regime is globally balanced, the SSN cannot exhibit strong normalization for any input strength; thus, in the SSN too, significant normalization requires localized balance. In summary, we causally and quantitatively connect a fundamental feature of cortical dynamics with a canonical brain computation. Time allowing, I will also cover our work extending a normative theoretical account of normalization, which explains it as an example of efficient coding of natural stimuli. We show that when biological noise is accounted for, this theory makes the same prediction as the SSN: a transition to supralinear integration for weak stimuli.
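For reference, the SSN mentioned above is usually written with a supralinear (power-law) input-output function; a standard form (e.g., Rubin et al., 2015) is shown below, with the talk's specific analyses building on this type of model.

```latex
% SSN rate dynamics: rectified power-law transfer with exponent n > 1.
\tau \frac{d\mathbf{r}}{dt} = -\mathbf{r} + k\,\big[\,W\mathbf{r} + \mathbf{h}\,\big]_{+}^{\,n}, \qquad n > 1
```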
Brain-body interactions that modulate fear
In most animals, including humans, emotions occur together with changes in the body, such as variations in breathing or heart rate, sweaty palms, or facial expressions. It has been suggested that this interoceptive information acts as a feedback signal to the brain, enabling adaptive modulation of emotions that is essential for survival. As such, fear, one of our basic emotions, must be kept in a functional balance to minimize risk-taking while allowing for the pursuit of essential needs. However, the neural mechanisms underlying this adaptive modulation of fear remain poorly understood. In this talk, I will present and discuss data from my PhD work in which we uncover a crucial role for the interoceptive insular cortex in detecting changes in heart rate to maintain an equilibrium between the extinction and maintenance of fear memories in mice.
Taming chaos in neural circuits
Neural circuits exhibit complex activity patterns, both spontaneously and in response to external stimuli. Information encoding and learning in neural circuits depend on the ability of time-varying stimuli to control spontaneous network activity. In particular, variability arising from the sensitivity to initial conditions of recurrent cortical circuits can limit the information conveyed about the sensory input. Spiking and firing-rate network models can exhibit such sensitivity to initial conditions, which is reflected in their dynamic entropy rate and attractor dimensionality computed from the full Lyapunov spectrum. I will show how chaos in both spiking and rate networks depends on biophysical properties of neurons and the statistics of time-varying stimuli. In spiking networks, increasing the input rate or coupling strength aids in controlling the driven target circuit, which is reflected in both a reduced trial-to-trial variability and a decreased dynamic entropy rate. With sufficiently strong input, a transition towards complete network state control occurs. Surprisingly, this transition does not coincide with the transition from chaos to stability but occurs at even larger values of external input strength. Controllability of spiking activity is facilitated when neurons in the target circuit have a sharp spike onset, thus a high speed by which neurons launch into the action potential. I will also discuss chaos and controllability in firing-rate networks in the balanced state. For these, external control of recurrent dynamics strongly depends on correlations in the input. This phenomenon was studied with a non-stationary dynamic mean-field theory that determines how the activity statistics and the largest Lyapunov exponent depend on frequency and amplitude of the input, recurrent coupling strength, and network size. This shows that uncorrelated inputs facilitate learning in balanced networks. The results highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks. They are also relevant in light of recent advances in optogenetics that allow for time-dependent stimulation of a select population of neurons.
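As a toy illustration of the Lyapunov-based diagnostics mentioned above, the sketch below estimates the largest Lyapunov exponent of a classic random rate network, dx/dt = -x + g J tanh(x), using Benettin's renormalization method. It is not the speaker's full-spectrum analysis, and all parameters are arbitrary.

```python
import numpy as np

# Estimate the largest Lyapunov exponent of a random rate network by tracking
# a tiny perturbation and renormalizing it every Euler step (Benettin method).
rng = np.random.default_rng(1)
N, g, dt, steps = 200, 1.5, 0.05, 20000

J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # random coupling matrix
x = rng.normal(0.0, 1.0, N)
d0 = 1e-8
v = rng.normal(0.0, 1.0, N)
y = x + v * (d0 / np.linalg.norm(v))            # perturbed copy at distance d0

log_growth = 0.0
for _ in range(steps):
    x = x + dt * (-x + g * J @ np.tanh(x))
    y = y + dt * (-y + g * J @ np.tanh(y))
    d = np.linalg.norm(y - x)
    log_growth += np.log(d / d0)
    y = x + (y - x) * (d0 / d)                  # rescale perturbation back to d0

lyapunov = log_growth / (steps * dt)
print("largest Lyapunov exponent ~", round(lyapunov, 3), "(positive => chaos)")
```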
Keeping your Brain in Balance: the Ups and Downs of Homeostatic Plasticity (virtual)
Our brains must generate and maintain stable activity patterns over decades of life, despite the dramatic changes in circuit connectivity and function induced by learning and experience-dependent plasticity. How do our brains achieve this balance between the opposing needs for plasticity and stability? Over the past two decades, we and others have uncovered a family of “homeostatic” negative feedback mechanisms that are theorized to stabilize overall brain activity while allowing specific connections to be reconfigured by experience. Here I discuss recent work in which we demonstrate that individual neocortical neurons in freely behaving animals indeed have a homeostatic activity set-point, to which they return in the face of perturbations. Intriguingly, this firing rate homeostasis is gated by sleep/wake states in a manner that depends on the direction of homeostatic regulation: upward firing rate homeostasis occurs selectively during periods of active wake, while downward firing rate homeostasis occurs selectively during periods of sleep, suggesting that an important function of sleep is to temporally segregate bidirectional plasticity. Finally, we show that firing rate homeostasis is compromised in an animal model of autism spectrum disorder. Together our findings suggest that loss of homeostatic plasticity in some neurological disorders may render central circuits unable to compensate for the normal perturbations induced by development and learning.
Keeping the balance - A role for the insular cortex in emotion homeostasis
The GluN2A Subunit of the NMDA Receptor and Parvalbumin Interneurons: A Possible Role in Interneuron Development
N-methyl-D-aspartate receptors (NMDARs) are excitatory glutamate-gated ion channels that are expressed throughout the central nervous system. NMDARs mediate calcium entry into cells and are involved in a host of neurological functions. The GluN2A subunit, encoded by the GRIN2A gene, is expressed by both excitatory and inhibitory neurons, with well-described roles in pyramidal cells. Using Grin2a knockout mice, we show that the loss of GluN2A signaling impacts parvalbumin-positive (PV) GABAergic interneuron function in the hippocampus. Grin2a knockout mice have 33% more PV cells in CA1 compared to wild type but a similar cholecystokinin-positive cell density. Immunohistochemistry and electrophysiological recordings show that the excess PV cells do eventually incorporate into the hippocampal network and participate in phasic inhibition. Although the morphology of Grin2a knockout PV cells is unaffected, their excitability and action-potential firing properties show age-dependent alterations. Preadolescent (P20-25) PV cells have an increased input resistance, a longer membrane time constant, a longer action-potential half-width, a lower current threshold for depolarization-induced block of action-potential firing, and a decrease in peak action-potential firing rate. Each of these measures reaches wild-type levels in adulthood, suggesting a potential delay in electrophysiological maturation. The circuit and behavioral implications of this age-dependent PV interneuron malfunction are unknown. However, neonatal Grin2a knockout mice are more susceptible to lipopolysaccharide- and febrile-induced seizures, consistent with a critical role for early GluN2A signaling in the development and maintenance of excitatory-inhibitory balance. These results could provide insights into how loss-of-function human GRIN2A variants generate epileptic phenotypes.
Stress deceleration theory: chronic adolescent stress exposure results in decelerated neurobehavioral maturation
Normative development in adolescence indicates that the prefrontal cortex is still maturing and is thereby unable to exert efficient top-down inhibitory control over subcortical regions such as the basolateral amygdala (BLA) and the nucleus accumbens. This imbalance in the developmental trajectories of cortical and subcortical regions is implicated in the expression of the prototypical impulsive, compulsive, reward-seeking and risk-taking adolescent behavior. Here we demonstrate that a chronic mild unpredictable stress procedure during adolescence in male Wistar rats arrests normal behavioral maturation, such that the animals continue to express adolescent-like impulsive, hyperactive, and compulsive behaviors into late adulthood. This arrest in behavioral maturation is associated with hypoexcitability of prelimbic cortex (PLC) pyramidal neurons and reduced PLC-mediated synaptic glutamatergic control of BLA and nucleus accumbens core (NAcC) neurons that lasts into late adulthood. At the same time, stress exposure in adolescence results in hyperexcitability of BLA pyramidal neurons, which send stronger glutamatergic projections to the NAcC. Chemogenetic reversal of the PLC hypoexcitability decreased compulsivity and improved the expression of goal-directed behavior in rats exposed to stress during adolescence, suggesting a causal role for PLC hypoexcitability in this stress-induced arrest of behavioral development. (https://www.biorxiv.org/content/10.1101/2021.11.21.469381v1.abstract)
A Network for Computing Value Equilibrium in the Human Medial Prefrontal Cortex
Humans and other animals make decisions in order to satisfy their goals. However, it remains unknown how neural circuits compute which of multiple possible goals should be pursued (e.g., when balancing hunger and thirst) and how such signals are combined with estimates of available reward alternatives. Here, humans undergoing fMRI accumulated two distinct assets over a sequence of trials. Financial outcomes depended on the lower of the two accumulated assets, creating a need to maintain “value equilibrium” by redressing any imbalance between the assets. Blood-oxygen-level-dependent (BOLD) signals in the rostral anterior cingulate cortex (rACC) tracked the level of imbalance among goals, whereas the ventromedial prefrontal cortex (vmPFC) signaled the level of redress incurred by a choice rather than the overall amount received. These results suggest that a network of medial frontal brain regions computes a value signal that maintains value equilibrium among internal goals.
The organization of neural representations for control
Cognitive control allows us to think and behave flexibly based on our context and goals. Most theories of cognitive control propose a control representation that enables the same input to produce different outputs contingent on contextual factors. In this talk, I will focus on an important property of the control representation's neural code: its representational dimensionality. The dimensionality of a neural representation balances a basic separability/generalizability trade-off in neural computation, a trade-off with important implications for cognitive control. I will present initial evidence from fMRI and EEG showing that task representations in the human brain leverage both ends of this trade-off during flexible behavior.
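One common, simple proxy for the representational dimensionality discussed above is the participation ratio of an activity matrix. The sketch below uses made-up data, not the talk's analyses, to show how the measure separates low- from high-dimensional representations.

```python
import numpy as np

# Participation ratio of a (conditions x channels) activity matrix:
# PR = (sum of covariance eigenvalues)^2 / sum of squared eigenvalues.
def participation_ratio(X):
    Xc = X - X.mean(axis=0)                         # center each channel
    lam = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))
    lam = np.clip(lam, 0, None)                     # guard tiny negative values
    return lam.sum() ** 2 / (lam ** 2).sum()

rng = np.random.default_rng(0)
low_d = rng.normal(size=(64, 3)) @ rng.normal(size=(3, 100))   # ~3-dimensional
high_d = rng.normal(size=(64, 100))                            # near full rank
print("low-d PR:", round(participation_ratio(low_d), 1),
      " high-d PR:", round(participation_ratio(high_d), 1))
```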
Reconstruct cellular dynamics from single cell data
Recent advances in single-cell techniques have catalyzed quantitative studies of the dynamics of cell phenotypic transitions (CPTs), which are emerging as a new field. However, fixed-cell-based approaches are fundamentally limited in revealing temporal information, and fluorescence-based live-cell imaging approaches are technically challenging for multiplexed long-term imaging. To tackle these challenges, we developed an integrated experimental/computational platform for reconstructing single-cell phenotypic transition dynamics. Experimentally, we developed a live-cell imaging platform to record the phenotypic transition paths of an A549 VIM-RFP reporter cell line and unveil parallel paths of epithelial-to-mesenchymal transition (EMT). Computationally, we modified a finite-temperature string method to reconstruct the reaction coordinate from the paths and a corresponding quasi-potential, which reveals that the EMT process resembles a barrier-less relaxation process. Our work demonstrates the necessity of extracting dynamical information about phenotypic transitions and the existence of a unified theoretical framework describing transition and relaxation dynamics in systems with and without detailed balance.
NMC4 Short Talk: Multiscale and extended retrieval of associative memory structures in a cortical model of local-global inhibition balance
Inhibitory neurons take on many forms and functions, but how this diversity contributes to memory function is not completely understood. Previous formal studies indicate that inhibition differentiated by local and global connectivity in associative memory networks functions to rescale the level of retrieval of excitatory assemblies. However, such studies lack biological detail: they do not distinguish between neuron types (excitatory and inhibitory), and they rely on unrealistic connection schemes and non-sparse assemblies. In this study, we present a rate-based cortical model in which neurons are distinguished (as excitatory, local inhibitory, or global inhibitory), connected more realistically, and in which memory items correspond to sparse excitatory assemblies. We use this model to study how local-global inhibition balance can alter memory retrieval in associative memory structures, including naturalistic and artificial structures. Experimental studies have reported that inhibitory neurons and their sub-types respond uniquely to specific stimuli and can form sophisticated, joint excitatory-inhibitory assemblies. Our model suggests that such joint assemblies, as well as a distribution and rebalancing of overall inhibition between two inhibitory sub-populations – one connected to excitatory assemblies locally and the other connected globally – can quadruple the range of retrieval across related memories. We identify a possible functional role for local-global inhibitory balance: in the context of choice or preference of relationships, it permits and maintains a broader range of memory items when local inhibition is dominant, and conversely consolidates and strengthens a smaller range of memory items when global inhibition is dominant. This model therefore highlights a biologically plausible and behaviourally useful function of inhibitory diversity in memory.
NMC4 Event: NMC For Kids
We at Neuromatch 4.0 wish to open up science conferences to everyone, and that is why we have included a session for kids and the young at heart. NMC for Kids has three excellent speakers from around the globe to talk about the balance system, from bird butts to space: 1. “Birds balance with their butts” by Bing Wen Brunton (Associate Prof of Biology at University of Washington, Seattle); 2. “The brain in motion” by Jenifer L. Campos (Associate Prof, University of Toronto); 3. “Getting ready for Mars: what happens to the brain in space?” by Elisa R Ferre (Senior Lecturer, Birkbeck University of London).
Spontaneous activity competes with externally evoked responses in sensory cortex
The interaction between spontaneously generated and externally evoked neuronal activity is fundamental for a functional brain. Increasing evidence suggests that bursts of high-power oscillations in the 15-30 Hz beta band represent activation of resting-state networks and can mask perception of external cues. Yet a real-time demonstration of the effect of beta-power modulation on perception has been missing, and little is known about the underlying mechanism. In this talk, I will present the methods we developed to fill this gap, together with our recent results. We used a closed-loop stimulus-intensity adjustment system based on online burst-occupancy analyses in rats performing a forepaw vibrotactile detection task. We found that the masking influence of burst occupancy on perception can be counterbalanced in real time by adjusting the vibration amplitude. Offline analysis of firing rates and local field potentials across cortical layers and frequency bands confirmed that beta power in the somatosensory cortex anticorrelated with sensory evoked responses. Mechanistically, bursts in all bands were accompanied by transient synchronization of cell assemblies, but only beta bursts were followed by a reduction of firing rate. Our closed-loop approach reveals that spontaneous beta bursts reflect a dynamic state that competes with external stimuli.
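A minimal offline sketch of the burst-occupancy computation described above: band-pass the signal in the beta band, take the Hilbert envelope, and threshold it. The talk's system does this online in closed loop and adjusts stimulus amplitude; the filter settings and threshold below are arbitrary placeholders, and the data are random stand-ins.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Offline beta-burst detection: band-pass 15-30 Hz, Hilbert envelope,
# percentile threshold, then report the fraction of time spent in bursts.
fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)                  # 30 s of signal
rng = np.random.default_rng(0)
lfp = rng.normal(size=t.size)                 # stand-in for a recorded LFP

b, a = butter(4, [15 / (fs / 2), 30 / (fs / 2)], btype="band")
beta = filtfilt(b, a, lfp)                    # zero-phase band-pass (offline)
envelope = np.abs(hilbert(beta))              # instantaneous beta amplitude

threshold = np.percentile(envelope, 75)       # e.g. top quartile of beta power
in_burst = envelope > threshold
print(f"beta-burst occupancy: {in_burst.mean():.2f}")
```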
Wiring & Rewiring: Experience-Dependent Circuit Development and Plasticity in Sensory Cortices
To build an appropriate representation of the sensory world, neural circuits are wired according to both intrinsic factors and external sensory stimuli. Moreover, brain circuits have the capacity to rewire in response to an altered environment, both during early development and throughout life. In this talk, I will give an overview of my past research studying the dynamic processes underlying functional maturation and plasticity in rodent sensory cortices. I will also present data on the current and future research in my lab – that is, the synaptic and circuit mechanisms that mature brain circuits employ to regulate the balance between stability and plasticity. By applying chronic two-photon calcium imaging and closed-loop visual exposure, we studied circuit changes at single-neuron resolution to show that concurrent running with visual stimulation is required to drive neuroplasticity in the adult brain.
Design principles of adaptable neural codes
Behavior relies on the ability of sensory systems to infer changing properties of the environment from incoming sensory stimuli. However, the demands that detecting and adjusting to changes in the environment place on a sensory system often differ from the demands associated with performing a specific behavioral task. This necessitates neural coding strategies that can dynamically balance these conflicting needs. I will discuss our ongoing theoretical work to understand how this balance can best be achieved. We connect ideas from efficient coding and Bayesian inference to ask how sensory systems should dynamically allocate limited resources when the goal is to optimally infer changing latent states of the environment, rather than reconstruct incoming stimuli. We use these ideas to explore dynamic tradeoffs between the efficiency and speed of sensory adaptation schemes, and the downstream computations that these schemes might support. Finally, we derive families of codes that balance these competing objectives, and we demonstrate their close match to experimentally-observed neural dynamics during sensory adaptation. These results provide a unifying perspective on adaptive neural dynamics across a range of sensory systems, environments, and sensory tasks.
Synaptic plasticity controls the emergence of population-wide invariant representations in balanced network models
The intensity and features of sensory stimuli are encoded in the activity of neurons in the cortex. In the visual and piriform cortices, the stimulus intensity re-scales the activity of the population without changing its selectivity for the stimulus features. The cortical representation of the stimulus is therefore intensity-invariant. This emergence of network invariant representations appears robust to local changes in synaptic strength induced by synaptic plasticity, even though: i) synaptic plasticity can potentiate or depress connections between neurons in a feature-dependent manner, and ii) in networks with balanced excitation and inhibition, synaptic plasticity determines the non-linear network behavior. In this study, we investigate the consistency of invariant representations with a variety of synaptic states in balanced networks. By using mean-field models and spiking network simulations, we show how the synaptic state controls the emergence of intensity-invariant or intensity-dependent selectivity by inducing changes in the network response to intensity. In particular, we demonstrate how facilitating synaptic states can sharpen the network selectivity while depressing states broaden it. We also show how power-law-type synapses permit the emergence of invariant network selectivity and how this plasticity can be generated by a mix of different plasticity rules. Our results explain how the physiology of individual synapses is linked to the emergence of invariant representations of sensory stimuli at the network level.
Representation transfer and signal denoising through topographic modularity
To prevail in a dynamic and noisy environment, the brain must create reliable and meaningful representations from sensory inputs that are often ambiguous or corrupt. Since only information that permeates the cortical hierarchy can influence sensory perception and decision-making, it is critical that noisy external stimuli are encoded and propagated through different processing stages with minimal signal degradation. Here we hypothesize that stimulus-specific pathways akin to cortical topographic maps may provide the structural scaffold for such signal routing. We investigate whether the feature-specific pathways within such maps, characterized by the preservation of the relative organization of cells between distinct populations, can guide and route stimulus information throughout the system while retaining representational fidelity. We demonstrate that, in a large modular circuit of spiking neurons comprising multiple sub-networks, topographic projections are not only necessary for accurate propagation of stimulus representations, but can also help the system reduce sensory and intrinsic noise. Moreover, by regulating the effective connectivity and local E/I balance, modular topographic precision enables the system to gradually improve its internal representations and increase the signal-to-noise ratio as the input signal passes through the network. Such a denoising function arises beyond a critical transition point in the sharpness of the feed-forward projections, and is characterized by the emergence of inhibition-dominated regimes in which population responses along stimulated maps are amplified and others are weakened. Our results indicate that this is a generalizable and robust structural effect, largely independent of the underlying model specifics. Using mean-field approximations, we gain deeper insight into the mechanisms responsible for the qualitative changes in the system's behavior and show that these depend only on the modular topographic connectivity and stimulus intensity. The general dynamical principle revealed by the theoretical predictions suggests that such a denoising property may be a universal, system-agnostic feature of topographic maps, and may lead to a wide range of behaviorally relevant regimes observed under various experimental conditions: maintaining stable representations of multiple stimuli across cortical circuits; amplifying certain features while suppressing others (winner-take-all circuits); and endowing circuits with metastable dynamics (winnerless competition), assumed to be fundamental in a variety of tasks.
The brain control of appetite: Can an old dog teach us new tricks?
It is clear that obesity results from eating more than you burn. It is physics. What is more complex to answer is why some people eat more than others. Differences in our genetic make-up mean some of us are slightly more hungry all the time and so eat more than others. We now know that the genetics of body weight, with obesity at one end of the spectrum, is in actuality the genetics of appetite control. In contrast to the prevailing view, body weight is not a choice. People who are obese are not bad or lazy; rather, they are fighting their biology.
Deriving local synaptic learning rules for efficient representations in networks of spiking neurons
How can neural networks learn to efficiently represent complex and high-dimensional inputs via local plasticity mechanisms? Classical models of representation learning assume that input weights are learned via pairwise Hebbian-like plasticity. Here, we show that pairwise Hebbian-like plasticity only works under specific requirements on neural dynamics and input statistics. To overcome these limitations, we derive from first principles a learning scheme based on voltage-dependent synaptic plasticity rules. Here, inhibition learns to locally balance excitatory input in individual dendritic compartments, and thereby can modulate excitatory synaptic plasticity to learn efficient representations. We demonstrate in simulations that this learning scheme works robustly even for complex, high-dimensional and correlated inputs. It also works in the presence of inhibitory transmission delays, where Hebbian-like plasticity typically fails. Our results draw a direct connection between dendritic excitatory-inhibitory balance and voltage-dependent synaptic plasticity as observed in vivo, and suggest that both are crucial for representation learning.
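As a toy picture of the idea that inhibition learns to cancel local excitatory drive (an illustrative caricature, not the voltage-dependent rule derived in this work), the sketch below updates inhibitory weights onto a single dendritic compartment in proportion to presynaptic activity and local depolarization until the average depolarization is nulled. All variables and constants are made up, and the same presynaptic rates drive E and I purely for simplicity.

```python
import numpy as np

# Toy "dendritic E-I balance" learning: inhibitory weights grow whenever the
# compartment is depolarized above rest, until inhibition cancels the average
# excitatory drive. This is a caricature for intuition, not the derived rule.
rng = np.random.default_rng(0)
n_in = 20
w_exc = rng.uniform(0.5, 1.5, n_in)            # fixed excitatory weights
w_inh = np.zeros(n_in)                          # plastic inhibitory weights
eta, v_rest = 0.005, 0.0

for _ in range(5000):
    rate = rng.poisson(2.0, n_in).astype(float) # presynaptic activity
    v_dend = rate @ (w_exc - w_inh)             # local dendritic depolarization
    w_inh += eta * rate * (v_dend - v_rest)     # activity x voltage update
    w_inh = np.clip(w_inh, 0.0, None)           # inhibitory weights stay >= 0

print("mean |w_exc - w_inh| after learning:",
      round(np.abs(w_exc - w_inh).mean(), 3))
```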
Migraine: a disorder of excitatory-inhibitory balance in multiple brain networks? Insights from genetic mouse models of the disease
Migraine is much more than an episodic headache. It is a complex brain disorder, characterized by a global dysfunction in multisensory information processing and integration. In a third of patients, the headache is preceded by transient sensory disturbances (aura), whose neurophysiological correlate is cortical spreading depression (CSD). The molecular, cellular and circuit mechanisms of the primary brain dysfunctions that underlie migraine onset, susceptibility to CSD and altered sensory processing remain largely unknown and are major open issues in the neurobiology of migraine. Genetic mouse models of a rare monogenic form of migraine with aura provide a unique experimental system to tackle these key unanswered questions. I will describe the functional alterations we have uncovered in the cerebral cortex of genetic mouse models and discuss the insights into the cellular and circuit mechanisms of migraine obtained from these findings.
Targeting the brain to improve obesity and type 2 diabetes
The increasing prevalence of obesity and type 2 diabetes (T2D), and the associated morbidity and mortality, emphasizes the need for a more complete understanding of the mechanisms mediating energy homeostasis to accelerate the identification of new medications. Recent reports indicate that the obesity medication lorcaserin, a 5-hydroxytryptamine (5-HT, serotonin) 2C receptor (5-HT2CR) agonist, improves glycemic control in association with weight loss in obese patients with T2D. We examined whether lorcaserin has a direct effect on insulin sensitivity and how this effect is achieved. We show that lorcaserin dose-dependently improves glycemic control in a mouse model of T2D without altering body weight. Examining the mechanism of this effect, we reveal a necessary and sufficient neurochemical mediator of lorcaserin’s glucoregulatory effects: activation of brain pro-opiomelanocortin (POMC) peptides. We observed that lorcaserin reduces hepatic glucose production and improves insulin sensitivity. These data suggest that lorcaserin’s action within the brain represents a mechanistically novel treatment for T2D: findings of significance for a prevalent global disease.
Active sleep in flies: the dawn of consciousness
The brain is a prediction machine. Yet the world is never entirely predictable, for any animal. Unexpected events are surprising and this typically evokes prediction error signatures in animal brains. In humans such mismatched expectations are often associated with an emotional response as well. Appropriate emotional responses are understood to be important for memory consolidation, suggesting that valence cues more generally constitute an ancient mechanism designed to potently refine and generalize internal models of the world and thereby minimize prediction errors. On the other hand, abolishing error detection and surprise entirely is probably also maladaptive, as this might undermine the very mechanism that brains use to become better prediction machines. This paradoxical view of brain functions as an ongoing tug-of-war between prediction and surprise suggests a compelling new way to study and understand the evolution of consciousness in animals. I will present approaches to studying attention and prediction in the tiny brain of the fruit fly, Drosophila melanogaster. I will discuss how an ‘active’ sleep stage (termed rapid eye movement – REM – sleep in mammals) may have evolved in the first animal brains as a mechanism for optimizing prediction in motile creatures confronted with constantly changing environments. A role for REM sleep in emotional regulation could thus be better understood as an ancient sleep function that evolved alongside selective attention to maintain an adaptive balance between prediction and surprise. This view of active sleep has some interesting implications for the evolution of subjective awareness and consciousness.
Redressing imbalances in the kind of research that gets done and who gets credit for it
If we want good work to get done, we should credit those who do it. In science, researchers are credited predominantly via authorship on publications. But many contributions to modern research are not recognized with authorship, due in part to the high bar imposed by the authorship criteria of many journals. “Contributorship” is a more inclusive framework for indicating who did what in the work described by a paper, and many scientific journals have recently implemented versions of it. I will consider the motivation for and specifics of this change, describe the tenzing tool we created to facilitate it, and discuss how we might want to support and shape the shift toward contributorship.
Central representations of protein availability regulating appetite and body weight control
Dietary protein quantity and quality greatly impact metabolic health via evolutionarily conserved mechanisms that ensure avoidance of amino acid-imbalanced food sources, promote hyperphagia when dietary protein density is low, and conversely produce satiety when dietary protein density is high. Growing evidence supports the emerging concept of protein homeostasis in mammals, whereby protein intake is maintained within a tight range independently of energy intake to reach a target protein intake. The behavioural and neuroendocrine mechanisms underlying these adaptations are unclear and form the focus of our research.
A theory for Hebbian learning in recurrent E-I networks
The Stabilized Supralinear Network is a model of recurrently connected excitatory (E) and inhibitory (I) neurons with a supralinear input-output relation. It can explain cortical computations such as response normalization and inhibitory stabilization. However, the network's connectivity is designed by hand, based on experimental measurements. How the recurrent synaptic weights can be learned from the sensory input statistics in a biologically plausible way is unknown. Earlier theoretical work on plasticity focused on single neurons and the balance of excitation and inhibition but did not consider the simultaneous plasticity of recurrent synapses and the formation of receptive fields. Here we present a recurrent E-I network model in which all synaptic connections are simultaneously plastic and E neurons self-stabilize by recruiting co-tuned inhibition. Motivated by experimental results, we employ a local Hebbian plasticity rule with multiplicative normalization for E and I synapses. We develop a theoretical framework that explains how plasticity enables inhibition-balanced excitatory receptive fields that match experimental results. We show analytically that sufficiently strong inhibition allows neurons' receptive fields to decorrelate and distribute themselves across the stimulus space. For strong recurrent excitation, the network becomes stabilized by inhibition, which prevents unconstrained self-excitation. In this regime, external inputs are integrated sublinearly. As in the Stabilized Supralinear Network, this results in response normalization and winner-takes-all dynamics: when two competing stimuli are presented, the network response is dominated by the stronger stimulus while the weaker stimulus is suppressed. In summary, we present a biologically plausible theoretical framework for modeling plasticity in fully plastic recurrent E-I networks. Although the connectivity is derived from the sensory input statistics, the circuit performs meaningful computations. Our work provides a mathematical framework for plasticity in recurrent networks, which has previously only been studied numerically, and can serve as the basis for a new generation of brain-inspired unsupervised machine learning algorithms.
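As a rough illustration of the plasticity scheme described above, the sketch below applies a local Hebbian rule with multiplicative normalization to every synapse class of a small recurrent E-I rate network. It is a minimal toy, not the authors' model: the network size, learning rate, and unstructured Poisson inputs are all assumptions, and the structured sensory statistics that drive receptive-field formation in the actual work are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

n_e, n_i, n_x = 20, 5, 30                     # E neurons, I neurons, input channels
W = {                                         # every connection class is plastic
    "xe": rng.uniform(0, 0.1, (n_e, n_x)),    # input -> E
    "ee": rng.uniform(0, 0.1, (n_e, n_e)),    # E -> E
    "ie": rng.uniform(0, 0.1, (n_e, n_i)),    # I -> E (inhibitory)
    "xi": rng.uniform(0, 0.1, (n_i, n_x)),    # input -> I
    "ei": rng.uniform(0, 0.1, (n_i, n_e)),    # E -> I
    "ii": rng.uniform(0, 0.1, (n_i, n_i)),    # I -> I (inhibitory)
}
eta = 1e-3                                    # learning rate (assumed)
for k in W:                                   # start from normalized incoming weights
    W[k] *= 0.5 / W[k].sum(axis=1, keepdims=True)

def steady_rates(x, steps=100, dt=0.1):
    """Relax the rectified-linear rate dynamics toward a fixed point for input x."""
    r_e, r_i = np.zeros(n_e), np.zeros(n_i)
    for _ in range(steps):
        drive_e = W["xe"] @ x + W["ee"] @ r_e - W["ie"] @ r_i
        drive_i = W["xi"] @ x + W["ei"] @ r_e - W["ii"] @ r_i
        r_e += dt * (-r_e + np.maximum(drive_e, 0.0))
        r_i += dt * (-r_i + np.maximum(drive_i, 0.0))
    return r_e, r_i

for _ in range(2000):
    x = rng.poisson(2.0, n_x).astype(float)   # stand-in for structured sensory input
    r_e, r_i = steady_rates(x)
    pre = {"xe": x, "ee": r_e, "ie": r_i, "xi": x, "ei": r_e, "ii": r_i}
    post = {"xe": r_e, "ee": r_e, "ie": r_e, "xi": r_i, "ei": r_i, "ii": r_i}
    for k in W:
        W[k] += eta * np.outer(post[k], pre[k])          # local Hebbian update
        W[k] *= 0.5 / W[k].sum(axis=1, keepdims=True)    # multiplicative normalization
```

Because each neuron's incoming E and I weights are renormalized after every update, runaway excitation is avoided even though every connection is plastic, which is the ingredient the abstract builds on.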
Co-tuned, balanced excitation and inhibition in olfactory memory networks
Odor memories are exceptionally robust and essential for the survival of many species. In rodents, the olfactory cortex shows features of an autoassociative memory network and plays a key role in the retrieval of olfactory memories (Meissner-Bernard et al., 2019). Interestingly, the telencephalic area Dp, the zebrafish homolog of olfactory cortex, transiently enters a state of precise balance during the presentation of an odor (Rupprecht and Friedrich, 2018). This state is characterized by large synaptic conductances (relative to the resting conductance) and by co-tuning of excitation and inhibition in odor space and in time at the level of individual neurons. Our aim is to understand how this precise synaptic balance affects memory function. For this purpose, we build a simplified, yet biologically plausible spiking neural network model of Dp using experimental observations as constraints: besides precise balance, key features of Dp dynamics include low firing rates, odor-specific population activity, and a dominance of recurrent inputs from Dp neurons relative to afferent inputs from neurons in the olfactory bulb. To achieve co-tuning of excitation and inhibition, we introduce structured connectivity by increasing connection probabilities and/or strengths among ensembles of excitatory and inhibitory neurons. These ensembles are therefore structural memories of activity patterns representing specific odors. They form functional inhibition-stabilized subnetworks, as identified by the “paradoxical effect” signature (Tsodyks et al., 1997): inhibition of inhibitory “memory” neurons leads to an increase of their activity. We investigate the benefits of co-tuning for olfactory and memory processing by comparing inhibition-stabilized networks with and without co-tuning. We find that co-tuning of excitation and inhibition improves robustness to noise, pattern completion, and pattern separation. In other words, retrieval of stored information from partial or degraded sensory inputs is enhanced, which is relevant in light of the instability of the olfactory environment. Furthermore, in co-tuned networks, odor-evoked activation of stored patterns does not persist after removal of the stimulus and may therefore subserve fast pattern classification. These findings provide valuable insights into the computations performed by the olfactory cortex, and into general effects of balanced-state dynamics in associative memory networks.
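The “paradoxical effect” test mentioned above can be previewed with a two-population rate model; the sketch below uses assumed couplings (not values fitted to Dp) and simply checks that, in the inhibition-stabilized regime, adding inhibitory drive to the I population raises its steady-state rate.

```python
import numpy as np

# Two-population rate model with assumed weights, illustrating the paradoxical-effect
# test for inhibition stabilization: extra inhibitory drive to the I population
# increases its steady-state rate.
W_ee, W_ei, W_ie, W_ii = 2.0, 2.5, 2.5, 2.0   # E->E, I->E, E->I, I->I couplings
I_e = 2.0                                      # external drive to E cells

def steady_state(I_i, steps=20000, dt=0.01):
    r_e, r_i = 0.0, 0.0
    for _ in range(steps):
        r_e += dt * (-r_e + max(W_ee * r_e - W_ei * r_i + I_e, 0.0))
        r_i += dt * (-r_i + max(W_ie * r_e - W_ii * r_i + I_i, 0.0))
    return r_e, r_i

_, r_i_baseline = steady_state(I_i=1.0)
_, r_i_suppressed = steady_state(I_i=0.5)      # inhibit the inhibitory population
print(f"I rate, baseline: {r_i_baseline:.3f}; with extra inhibition of I: {r_i_suppressed:.3f}")
# Because W_ee > 1 (inhibition-stabilized regime), the suppressed case shows a higher I rate.
```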
The collective behavior of the clonal raider ant: computations, patterns, and naturalistic behavior
Colonies of ants and other eusocial insects are superorganisms that perform sophisticated cognitive-like functions at the level of the group. In my talk I will review our efforts to establish the clonal raider ant Ooceraea biroi as a lab model system for the systematic study of the principles underlying collective information processing in ant colonies. I will use results from two separate projects to demonstrate the potential of this model system. In the first, we analyze the foraging behavior of the species, known as group raiding: a swift offensive response of a colony to the detection of potential prey by a scout. Using automated behavioral tracking and detailed analysis, we show that this behavior is closely related to the army ant mass raid, an iconic collective behavior in which hundreds of thousands of ants spontaneously leave the nest to go hunting, and that the evolutionary transition between the two can be explained by a change in colony size alone. In the second project, we study the emergence of a collective sensory response threshold in a colony. The sensory threshold is a fundamental computational primitive, observed across many biological systems. By carefully controlling the sensory environment and the social structure of the colonies, we were able to show that it also appears in a collective context, and that it emerges from a balance between excitatory and inhibitory interactions between ants. Furthermore, using a mathematical model, we predict that these two interactions can be mapped onto known mechanisms of communication in ants. Finally, I will discuss the opportunities for understanding collective behavior that are opened up by the development of methods for neuroimaging and neurocontrol of our ants.
Design principles of adaptable neural codes
Behavior relies on the ability of sensory systems to infer changing properties of the environment from incoming sensory stimuli. However, the demands that detecting and adjusting to changes in the environment place on a sensory system often differ from the demands associated with performing a specific behavioral task. This necessitates neural coding strategies that can dynamically balance these conflicting needs. I will discuss our ongoing theoretical work to understand how this balance can best be achieved. We connect ideas from efficient coding and Bayesian inference to ask how sensory systems should dynamically allocate limited resources when the goal is to optimally infer changing latent states of the environment, rather than reconstruct incoming stimuli. We use these ideas to explore dynamic tradeoffs between the efficiency and speed of sensory adaptation schemes, and the downstream computations that these schemes might support. Finally, we derive families of codes that balance these competing objectives, and we demonstrate their close match to experimentally observed neural dynamics during sensory adaptation. These results provide a unifying perspective on adaptive neural dynamics across a range of sensory systems, environments, and sensory tasks.
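A minimal numerical caricature of the speed-accuracy tradeoff in adaptation discussed above (a toy setup with assumed numbers, not the speakers' models): a latent environmental state drifts slowly, and an adapting estimator with a single rate parameter trades tracking speed against noise sensitivity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: a latent state drifts as a random walk and is observed through noisy
# samples; an estimator with adaptation rate alpha trades lag against noise.
T = 5000
latent = np.cumsum(rng.normal(0.0, 0.05, T))       # slowly drifting environment
obs = latent + rng.normal(0.0, 0.5, T)             # noisy sensory samples

def tracking_error(alpha):
    est = np.zeros(T)
    for t in range(1, T):
        est[t] = est[t - 1] + alpha * (obs[t] - est[t - 1])   # leaky adaptation step
    return np.mean((est - latent) ** 2)

for alpha in (0.01, 0.1, 0.5):
    print(f"adaptation rate {alpha}: mean squared tracking error {tracking_error(alpha):.3f}")
```

Very slow adaptation lags behind the drifting environment, very fast adaptation transmits the sensory noise, and an intermediate rate does best, which is the kind of tradeoff the abstract formalizes.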
Stability-Flexibility Dilemma in Cognitive Control: A Dynamical System Perspective
Constraints on control-dependent processing have become a fundamental concept in general theories of cognition that explain human behavior in terms of rational adaptations to these constraints. However, these theories lack a rationale for why such constraints would exist in the first place. Recent work suggests that constraints on the allocation of control facilitate flexible task switching at the expense of the stability needed to support goal-directed behavior in the face of distraction. We formulate this problem in a dynamical system in which control signals are represented as attractors and in which constraints on control allocation limit the depth of these attractors. We derive formal expressions for the stability-flexibility tradeoff, showing that constraints on control allocation improve cognitive flexibility but impair cognitive stability. We provide evidence that human participants increase constraints on the allocation of control as the demand for flexibility increases, but that they deviate from the optimal constraints. In continuing work, we are investigating how the collaborative performance of a group of individuals can benefit from individual differences in the balance between cognitive stability and flexibility.
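The attractor-depth intuition can be made concrete with a toy double-well system; the functional form and parameters below are assumptions for illustration, not the authors' formal model. Here the well depth stands in for the attractor depth set by control allocation (constraining control corresponds to limiting that depth): deeper wells resist distraction (stability) but take longer to switch when the task cue changes (flexibility).

```python
import numpy as np

# Toy double-well illustration: the control state x evolves in V(x) = d * (x**2 - 1)**2,
# where the well depth d stands in for the attractor depth set by control allocation.
def simulate(depth, drive, x0=-1.0, dt=0.002, t_max=20.0):
    """Integrate dx/dt = -V'(x) + drive; return the time at which x crosses 0.9, else inf."""
    x, t = x0, 0.0
    while t < t_max:
        x += dt * (-4.0 * depth * x * (x**2 - 1.0) + drive)
        t += dt
        if x > 0.9:
            return t
    return float("inf")

for depth in (0.5, 1.0, 2.0):
    # Flexibility: time a fixed, strong task cue needs to switch between attractors.
    t_switch = simulate(depth, drive=3.5)
    # Stability: weakest constant distractor that dislodges the current task state.
    critical = next(a for a in np.arange(0.05, 4.0, 0.05)
                    if np.isfinite(simulate(depth, drive=a)))
    print(f"depth {depth}: switch time {t_switch:.2f}, critical distractor {critical:.2f}")
```

Both printed quantities grow with depth: deeper attractors need a stronger distractor to dislodge them, but they also slow down cued switching, which is the stability-flexibility tradeoff in miniature.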
Keeping the balance: a role for the insular cortex in emotion homeostasis
Interacting synapses stabilise both learning and neuronal dynamics in biological networks
Distinct synapses influence one another when they undergo changes, with unclear consequences for neuronal dynamics and function. Here we show that synapses can interact such that excitatory currents are naturally normalised and balanced by inhibitory inputs. This happens when classical spike-timing-dependent synaptic plasticity rules are extended by additional mechanisms that incorporate the influence of neighbouring synaptic currents and regulate the amplitude of efficacy changes accordingly. The resulting control of excitatory plasticity by inhibitory activation, and vice versa, gives rise to quick and long-lasting memories as seen experimentally in receptive field plasticity paradigms. In models with additional dendritic structure, we observe experimentally reported clustering of co-active synapses that depends on initial connectivity and morphology. Finally, in recurrent neural networks, rich and stable dynamics with high input sensitivity emerge, providing transient activity that resembles recordings from the motor cortex. Our model provides a general framework for codependent plasticity that frames individual synaptic modifications in the context of population-wide changes, allowing us to connect micro-level physiology with behavioural phenomena.
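As a toy of the codependence idea described above (a simplified rate-based formulation written for illustration, not the authors' spike-timing model), the sketch below lets the excitatory update be damped by the local inhibitory current while the inhibitory weights chase the excitatory current, so that excitation ends up bounded and matched by inhibition.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy codependent rule: excitatory Hebbian updates are damped by the inhibitory
# current, and inhibitory weights chase the excitatory current.
n_e, n_i = 40, 10
w_e = rng.uniform(0.20, 0.40, n_e)            # excitatory weights onto one neuron
w_i = rng.uniform(0.05, 0.10, n_i)            # inhibitory weights onto the same neuron
eta_e, eta_i = 1e-4, 1e-3                     # learning rates (assumed)
w_max_e, w_max_i = 0.5, 5.0                   # soft/hard weight bounds (assumed)

for _ in range(30000):
    x_e = rng.poisson(3.0, n_e).astype(float)          # presynaptic excitatory rates
    x_i = rng.poisson(3.0, n_i).astype(float)          # presynaptic inhibitory rates
    I_exc, I_inh = w_e @ x_e, w_i @ x_i
    post = max(I_exc - I_inh, 0.0)                     # rectified postsynaptic activity

    # excitatory plasticity is controlled by inhibitory activation (damped by I_inh)
    w_e += eta_e * post * x_e * (w_max_e - w_e) / (1.0 + I_inh)
    # inhibitory plasticity pushes the inhibitory current toward the excitatory one
    w_i = np.clip(w_i + eta_i * x_i * (I_exc - I_inh), 0.0, w_max_i)

print(f"mean E current: {w_e.sum() * 3.0:.1f}   mean I current: {w_i.sum() * 3.0:.1f}")
# After learning, the inhibitory current tracks (balances) the excitatory current.
```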
Glassy phase in dynamically balanced networks
We study the dynamics of (inhibitory) balanced networks while varying (i) the level of symmetry in the synaptic connectivity and (ii) the variance of the synaptic efficacies (synaptic gain). We find three regimes of activity. For sufficiently low synaptic gain, regardless of the level of symmetry, there exists a unique stable fixed point. Using a cavity-like approach, we develop a quantitative theory that describes the statistics of the activity at this unique fixed point and the conditions for its stability. As the synaptic gain increases, the unique fixed point destabilizes, and the network exhibits chaotic activity for zero or negative levels of symmetry (i.e., random or antisymmetric connectivity). For positive levels of symmetry, by contrast, there is multi-stability among a large number of marginally stable fixed points. In this regime, ergodicity is broken and the network exhibits non-exponential relaxational dynamics. We discuss the potential relevance of such a “glassy” phase for explaining some features of cortical activity.
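A small numerical companion to these regimes (a toy rate model with assumed parameters, not the paper's exact balanced-network setup): the diagnostics below distinguish a unique fixed point, chaos, and multistability by measuring late-time motion and the dependence of the endpoint on initial conditions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy rate network x' = -x + J*tanh(x). The symmetry level eta interpolates between
# random (0), symmetric (1), and antisymmetric (-1) coupling; g sets the synaptic gain.
N = 300

def simulate(J, x0, T=400.0, dt=0.05):
    x = x0.copy()
    for _ in range(int(T / dt)):
        x += dt * (-x + J @ np.tanh(x))
    return x

def diagnose(g, eta, label):
    A = rng.normal(0.0, 1.0, (N, N))
    J = g * (A + eta * A.T) / (np.sqrt(1.0 + eta**2) * np.sqrt(N))
    xa = simulate(J, rng.normal(0, 1, N))
    xb = simulate(J, rng.normal(0, 1, N))
    motion = np.linalg.norm(simulate(J, xa, T=5.0) - xa) / np.sqrt(N)  # late-time motion
    gap = np.linalg.norm(xa - xb) / np.sqrt(N)                         # endpoint vs. start
    print(f"{label}: residual motion {motion:.3f}, endpoint separation {gap:.3f}")

diagnose(0.5, 0.0, "low gain (unique fixed point)")
diagnose(2.0, 0.0, "high gain, asymmetric (chaos)")
diagnose(2.0, 1.0, "high gain, symmetric (multistability)")
```

At low gain both diagnostics are near zero; at high gain with asymmetric coupling the activity keeps moving; at high gain with symmetric coupling the motion dies out, but different initial conditions end in different fixed points, the signature of the multistable, glassy-like regime described above.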
Role of Oxytocin in regulating microglia functions to prevent brain damage of the developing brain
Every year, 30 million infants worldwide are delivered after intra-uterine growth restriction (IUGR) and 15 million are born preterm. These two conditions are the leading causes of ante/perinatal stress and brain injury, responsible for neurocognitive and behavioral disorders in more than 9 million children each year. Both prematurity and IUGR are associated with perinatal systemic inflammation, a key factor associated with neuroinflammation and identified as the best predictor of subsequent neurological impairments. Most pharmacological candidates have failed to demonstrate any beneficial effect in preventing perinatal brain damage. In contrast, environmental enrichment based on developmental care, skin-to-skin contact and vocal/music intervention appears to confer positive effects on brain structure and function. However, the mechanisms underlying these effects remain unknown. There is strong evidence that an adverse environment during pregnancy and the perinatal period can influence hormonal responses of the newborn, with long-lasting neurobehavioral consequences in infancy and adulthood. Excessive cortisol release in response to perinatal stress induces pro-inflammatory and brain-programming effects. These deleterious effects are known to be counterbalanced by oxytocin (OT), a neuropeptide that plays a key role during the perinatal period and parturition, in social behavior, and in regulating the central inflammatory response to injury in the adult brain. Using a rodent model of IUGR associated with perinatal brain damage, we recently reported that carbetocin, a brain-permeable, long-lasting OT receptor (OTR) agonist, was associated with a significant reduction of activated microglia, the primary immune cells of the brain. Moreover, this reduced microglial reactivity was associated with long-term neuroprotection. These findings make OT a promising candidate for neonatal neuroprotection through the regulation of neuroinflammation. However, the causal relationship between endogenous OT and the central inflammatory response to injury has not been established and will be further studied by the lab.
The Mechanics of Non-Equilibrium Soft Interfaces
At small length scales, capillary effects are significant, and the mechanics of soft material interfaces may therefore be dominated by solid surface stresses or liquid surface tensions. The balance between surface and bulk properties is described by an elasto-capillary length scale, defined under the assumption that equilibrium interfacial energies are constant. However, at small length scales in biological materials, including living cells and tissues, interfacial energies are not constant but are actively regulated and driven far from equilibrium. Thus, the balance between surface and bulk properties depends upon the distance from equilibrium. Here, we model the spreading (wetting) of living cell aggregates as ‘active droplets’, with a non-equilibrium surface tension that depends upon internal stress generated by the actomyosin cytoskeleton. Depending upon the extent of activity, droplet surface properties adapt to the mechanics of their surroundings. This adaptation challenges contemporary models of interfacial mechanics, including extensively used models of contact mechanics and wetting.
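For orientation, the elasto-capillary length mentioned above is the ratio of surface stress to elastic modulus; the back-of-the-envelope sketch below uses generic textbook-style values (not numbers from the talk) to show why capillary effects dominate only for very soft or very small systems.

```python
# Back-of-the-envelope numbers for the elasto-capillary length L = Upsilon / E
# (surface stress over elastic modulus). Values are generic illustrations.
examples = {
    "soft silicone gel (E ~ 1 kPa)": (0.03, 1e3),   # (surface stress N/m, modulus Pa)
    "stiff elastomer  (E ~ 1 MPa)": (0.03, 1e6),
    "glassy polymer   (E ~ 1 GPa)": (0.03, 1e9),
}
for name, (surface_stress, modulus) in examples.items():
    L = surface_stress / modulus                    # metres
    print(f"{name}: L ~ {L * 1e6:.2g} micrometres")
```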
Distinct synaptic plasticity mechanisms determine the diversity of cortical responses during behavior
Spike trains recorded from the cortex of behaving animals can be complex, highly variable from trial to trial, and therefore challenging to interpret. A fraction of cells exhibit trial-averaged responses with obvious task-related features, such as pure-tone frequency tuning in auditory cortex. However, a substantial number of cells (including cells in primary sensory cortex) do not appear to fire in a task-related manner and are often excluded from analysis. We recently used a novel single-trial, spike-timing-based analysis to show that both classically responsive and non-classically responsive cortical neurons contain significant information about sensory stimuli and behavioral decisions, suggesting that non-classically responsive cells may play an underappreciated role in perception and behavior. We now expand this investigation to explore the synaptic origins of these cells and their potential contribution to network function. To do so, we trained a novel spiking recurrent neural network model that incorporates spike-timing-dependent plasticity (STDP) mechanisms to perform the same task as behaving animals. By leveraging excitatory and inhibitory plasticity rules, this model reproduces neurons with response profiles that are consistent with previously published experimental data, including classically responsive and non-classically responsive neurons. We found that both classically responsive and non-classically responsive neurons encode behavioral variables in their spike times, as seen in vivo. Interestingly, plasticity in excitatory-to-excitatory synapses increased the proportion of non-classically responsive neurons and may play a significant role in determining response profiles. Finally, our model also makes predictions about the synaptic origins of classically and non-classically responsive neurons, which we can compare to in vivo whole-cell recordings from the auditory cortex of behaving animals. This approach successfully recapitulates the heterogeneous response profiles measured in behaving animals and provides a powerful lens for exploring large-scale neuronal dynamics and the plasticity rules that shape them.
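To illustrate the general idea that single-trial spike timing can carry task information even when trial-averaged rates do not, here is a generic toy (not the decoder used in the study): two conditions share the same mean firing rate but differ in inter-spike-interval statistics, and a simple timing-based statistic separates them trial by trial.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two task conditions with the same mean firing rate (mean ISI = 25 ms) but different
# ISI statistics; the ISI coefficient of variation classifies single trials.
def trial_isis(condition, n_spikes=40):
    if condition == 0:
        return rng.gamma(shape=5.0, scale=0.005, size=n_spikes)   # regular firing
    return rng.exponential(scale=0.025, size=n_spikes)            # irregular firing

def cv(isis):
    return isis.std() / isis.mean()

train = [(cv(trial_isis(c)), c) for c in (0, 1) for _ in range(100)]
threshold = np.mean([stat for stat, _ in train])                  # crude decision boundary

test = [(cv(trial_isis(c)), c) for c in (0, 1) for _ in range(200)]
accuracy = np.mean([(stat > threshold) == bool(c) for stat, c in test])
print(f"single-trial decoding accuracy from spike timing alone: {accuracy:.2f}")
```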
Targeting the synapse in Alzheimer’s Disease
Alzheimer’s Disease is characterised by the accumulation of misfolded proteins, namely amyloid and tau; however, it is synapse loss that leads to the cognitive impairments associated with the disease. Many studies have focussed on single time points to determine the effects of pathology on synapses, but this does not reveal the plasticity of the synapses, that is, how they behave in vivo as the pathology progresses. Here we used in vivo two-photon microscopy to assess the temporal dynamics of axonal boutons and dendritic spines in mouse models of tauopathy[1] (rTg4510) and amyloidopathy[2] (J20). This revealed that pre- and post-synaptic components are differentially affected in both AD models in response to pathology. In the rTg4510 model, differences in the stability and turnover of axonal boutons and dendritic spines were revealed immediately prior to neurite degeneration. Moreover, the dystrophic neurites could be partially rescued by transgene suppression. Understanding the imbalance in the response of pre- and post-synaptic components is crucial for drug discovery studies targeting the synapse in Alzheimer’s Disease. To investigate how sub-types of synapses are affected in human tissue, the Multi-‘omics Atlas Project, a UKDRI initiative to comprehensively map the pathology in human AD, will determine synaptome changes using imaging and synaptic proteomics in human post mortem AD tissue. The use of multiple brain regions and multiple stages of disease will enable a pseudotemporal profile of pathology and the associated synapse alterations to be determined. These data will be compared to data from preclinical models to determine the functional implications of the human findings, to better inform preclinical drug discovery studies, and to develop a therapeutic strategy to target synapses in Alzheimer’s Disease[3].
Phospholipid regulation in cognitive impairment and vascular dementia
The role of imbalanced lipid metabolism in neurodegeneration is still poorly understood. Phospholipids (PLs) participate in multiple ways in vascular dementias such as Alzheimer’s disease, post-stroke dementia, and CADASIL, among others. Their involvement includes hyperactivation of phospholipases, mitochondrial stress, peroxisomal dysfunction, and irregular fatty acid composition, which trigger pro-inflammation at a very early stage of cognitive impairment. Re-establishing physiological levels of cholesterol, sphingolipids, phospholipids, and other lipids is an interesting therapeutic target to reduce the progression of AD. We propose that BACE1 silencing rebalances the phospholipid profile in a desaturase-enzyme-dependent manner, reducing the inflammatory response and recovering cognitive function in animal models of Alzheimer’s disease and brain stroke. This points to a great need for new, well-designed research focused on preventing phospholipid imbalance and its consequent impairment of energy metabolism, pro-inflammation, and enzymatic over-processing, which would help prevent unhealthy aging and AD progression.
Global visual salience of competing stimuli
Current computational models of visual salience accurately predict the distribution of fixations on isolated visual stimuli. It is not known, however, whether the global salience of a stimulus, that is, its effectiveness in the competition for attention with other stimuli, is a function of its local salience or an independent measure. Further, do task and familiarity with the competing images influence eye movements? In this talk, I will present the analysis of a computational model of the global salience of natural images. We trained a machine learning algorithm to learn the direction of the first saccade of participants who freely observed pairs of images. The pairs balanced the combinations of new and already-seen images, as well as task and task-free trials. The coefficients of the model provided a reliable measure of the likelihood of each image to attract the first fixation when seen next to another image, that is, its global salience. For example, images of close-up faces and images containing humans were consistently looked at first and were assigned higher global salience. Interestingly, we found that global salience cannot be explained by the feature-driven local salience of the images and that the influence of task and familiarity was rather small; we also reproduced the previously reported left-sided bias. This computational model of global salience allows us to analyse many other aspects of human visual perception of competing stimuli. In the talk, I will also present our latest results on saccadic reaction time as a function of the global salience of the pair of images.
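A sketch of the kind of pairwise model described above, fit to synthetic data (the data-generating process, parameter values, and fitting procedure are assumptions for illustration): each image receives a global-salience coefficient and there is a shared lateral-bias term, and the probability of a leftward first saccade depends on the difference of the two images' coefficients.

```python
import numpy as np

rng = np.random.default_rng(6)

# P(first saccade to the left image of pair (i, j)) = sigmoid(s[i] - s[j] + b),
# where s[i] is image i's global salience and b a shared lateral bias.
n_images, n_trials = 30, 5000
true_s = rng.normal(0.0, 1.0, n_images)
true_b = 0.4                                            # left-sided bias

left = rng.integers(0, n_images, n_trials)              # image shown on the left
right = rng.integers(0, n_images, n_trials)             # image shown on the right
p_true = 1.0 / (1.0 + np.exp(-(true_s[left] - true_s[right] + true_b)))
y = (rng.random(n_trials) < p_true).astype(float)       # 1 = first saccade went left

# Recover s and b by gradient ascent on the logistic log-likelihood.
s, b, lr = np.zeros(n_images), 0.0, 2.0
for _ in range(3000):
    p_hat = 1.0 / (1.0 + np.exp(-(s[left] - s[right] + b)))
    err = y - p_hat
    grad_s = np.bincount(left, err, n_images) - np.bincount(right, err, n_images)
    s += lr * grad_s / n_trials
    b += lr * err.mean()

print("correlation between fitted and true global salience:", round(np.corrcoef(s, true_s)[0, 1], 3))
print("estimated lateral bias:", round(b, 3))
```

The fitted coefficients play the role of the global-salience measure described in the abstract, and the bias term captures the left-sided tendency independently of image content.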
The many faces of KCC2 in the generation and suppression of seizures
KCC2, best known as the neuron-specific chloride extruder that sets the strength and polarity of GABAergic Cl- currents, is a multifunctional molecule which interacts with other ion-regulatory proteins and (structurally) with the neuronal cytoskeleton. Its multiple roles in the generation and suppression of seizures have been widely studied. In my talk, I will address some fundamental issues which are relevant in this field of research: What are EGABA shifts about? What is the role of KCC2 in shunting inhibition? What is meant by “the balance between excitation and inhibition” and, in this context, by the “NKCC1/KCC2 ratio”? Is down-regulation of KCC2 following neuronal trauma a manifestation of adaptive or maladaptive ionic plasticity? Under what conditions does K-Cl cotransport by KCC2 promote seizures? Should we pay more attention to KCC2 as a molecule involved in dendritic spine formation in brain areas such as the hippocampus? Most of these points are of potential importance also in the design of KCC2-targeting drugs and genetic manipulations aimed at combating seizures.
The emergence of contrast invariance in cortical circuits
Neurons in the primary visual cortex (V1) encode the orientation and contrast of visual stimuli through changes in firing rate (Hubel and Wiesel, 1962). Their activity typically peaks at a preferred orientation and decays to zero at the orientations orthogonal to the preferred one. This activity pattern is re-scaled by contrast but its shape is preserved, a phenomenon known as contrast invariance. Contrast-invariant selectivity is also observed at the population level in V1 (Carandini and Sengpiel, 2004). The mechanisms supporting the emergence of contrast invariance at the population level remain unclear. How does the activity of different neurons with diverse orientation selectivity and non-linear contrast sensitivity combine to give rise to contrast-invariant population selectivity? Theoretical studies have shown that in the balanced limit, the properties of single neurons do not determine the population activity (van Vreeswijk and Sompolinsky, 1996). Instead, the synaptic dynamics (Mongillo et al., 2012) as well as the intracortical connectivity (Rosenbaum and Doiron, 2014) shape the population activity in balanced networks. We report that short-term plasticity can change the synaptic strength between neurons as a function of the presynaptic activity, which in turn modifies the population response to a stimulus. Thus, the same circuit can process a stimulus in different ways (linearly, sublinearly, or supralinearly) depending on the properties of the synapses. We found that balanced networks with excitatory-to-excitatory short-term synaptic plasticity cannot be contrast-invariant. Instead, short-term plasticity modifies the network selectivity such that the tuning curves are narrower (broader) for increasing contrast if synapses are facilitating (depressing). Based on these results, we wondered whether balanced networks with plastic synapses (other than short-term) can support the emergence of contrast-invariant selectivity. Mathematically, we found that the only synaptic transformation that supports perfect contrast invariance in balanced networks is a power-law release of neurotransmitter as a function of the presynaptic firing rate (at excitatory-to-excitatory and excitatory-to-inhibitory synapses). We validate this finding using spiking network simulations, in which we observe contrast-invariant tuning curves when synapses release neurotransmitter following a power-law function of the presynaptic firing rate. In summary, we show that synaptic plasticity controls the type of non-linear network response to stimulus contrast and that it can be a potential mechanism mediating the emergence of contrast invariance in balanced networks with orientation-dependent connectivity. Our results therefore connect the physiology of individual synapses to the network level and may help us understand the establishment of contrast-invariant selectivity.
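The power-law argument can be previewed with a single-population caricature of the balanced limit (assumed parameters, not the full spiking model): if the feedforward drive scales with contrast and must be cancelled by recurrent input passed through a synaptic transfer function f, then the normalised tuning shape is contrast-independent exactly when f is a power law.

```python
import numpy as np

# Balance condition caricature: c*g(theta) ~ w*f(rate), so rate = f_inv(c*g(theta)/w).
theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
g = np.exp(-theta**2 / (2 * 0.3**2))             # orientation-tuned feedforward drive
w = 4.0                                          # recurrent coupling strength

def normalised_tuning(contrast, f_inv):
    rate = f_inv(contrast * g / w)
    return rate / rate.max()                     # shape only

power_inv = lambda u: u ** 0.5                   # inverse of f(r) = r**2 (power-law release)
saturating_inv = lambda u: u / (1.0 - u)         # inverse of f(r) = r / (1 + r) (saturating)

for name, f_inv in [("power-law", power_inv), ("saturating", saturating_inv)]:
    low, high = normalised_tuning(0.5, f_inv), normalised_tuning(3.0, f_inv)
    print(f"{name:>10} synapses: max change in tuning shape across contrast = {np.abs(high - low).max():.3f}")
```

With a power-law transfer the contrast factor pulls out of the tuning curve as a pure scale factor, so the normalised shape does not change; with a saturating transfer the shape distorts with contrast, in line with the result stated above.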
Adolescent maturation of cortical excitation-inhibition balance based on individualized biophysical network modeling
Bernstein Conference 2024
Local E/I Balance and Spontaneous Dynamics in Neuronal Networks
Bernstein Conference 2024
Neuronal spike generation via a homoclinic orbit bifurcation increases irregularity and chaos in balanced networks
Bernstein Conference 2024
Input correlations impede suppression of chaos and learning in balanced rate networks
COSYNE 2022
Local dendritic balance enables the learning of efficient representations in networks of spiking neurons
COSYNE 2022
Localized balance of excitation and inhibition leads to normalization
COSYNE 2022
Neural network size balances representational drift and flexibility during Bayesian sampling
COSYNE 2022
Recurrent suppression in visual cortex explained by a balanced network with sparse synaptic connections
COSYNE 2022
Slow, low-dimensional dynamics in balanced networks with partially symmetric connectivity
COSYNE 2023
A spatiotemporal orchestration of balanced cholinergic effects regulates cortical activation
COSYNE 2023
Synaptic-type-specific clustering optimizes the computational capabilities of balanced recurrent networks
COSYNE 2023
Unifying mechanistic and functional models of cortical circuits with low-rank, E/I-balanced spiking networks
COSYNE 2023
Balanced two-photon holographic bidirectional optogenetics defines the mechanism for stimulus quenching of neural variability
COSYNE 2025
A direct link between Prefrontal E/I imbalance and executive dysfunction in schizophrenia
COSYNE 2025
Multimodal Ising-based connectomics reveals an excitation-inhibition imbalance in Alzheimer's Risk
COSYNE 2025
Neural sampling in a balanced spiking network with internally generated variability
COSYNE 2025
Altered excitatory/inhibitory balance in the prefrontal cortex of the IB2 KO mouse model of autism: From neuronal excitability to cerebellar modulation in vivo
FENS Forum 2024
Broken balance - Early impairment at inhibitory synapses in Alzheimer’s disease
FENS Forum 2024
The effect of moderate‐intensity balance training on the activity of antioxidant enzymes in people with Parkinson's disease: A pilot study
FENS Forum 2024
Evaluation and treatment of imbalance in patients with Alzheimer’s disease
FENS Forum 2024
Excitation-inhibition balance in a model of plastic coupled oscillators determines collective synchronization and connection fluctuations
FENS Forum 2024
Excitatory-inhibitory balance assessed by aperiodic component and its correlation with paired-pulse inhibition in the primary somatosensory cortex: An MEG study
FENS Forum 2024
Exploring the role of omega-3/omega-6 balance in long-lasting changes in microglia caused by intermittent alcohol consumption during adolescence
FENS Forum 2024
Gradual changes in TMS-induced motor excitability are associated with excitation-inhibition balance dynamics
FENS Forum 2024
Imbalanced inhibition in prefrontal cortico-thalamic circuits via thalamic reticular nucleus
FENS Forum 2024
The impact of CA2 E/I imbalance on social behaviour and network activity
FENS Forum 2024
Investigating the role of SNX27-retromer in excitatory/inhibitory balance in health and disease
FENS Forum 2024
Lateral hypothalamic neurotensin-expressing neurons shape the balance between drinking, feeding, and socializing
FENS Forum 2024
LRIG1 regulates the balance between proliferation and quiescence in glioblastoma stem cells
FENS Forum 2024
Maternal high-fat diet disrupts hippocampal excitation-inhibition balance impairing cognitive function selectively in adult female mouse offspring
FENS Forum 2024
Neuronal SMC3 regulates weight, body composition, and hormonal balance in parallel with sex-dependent effects on anxiety behavior
FENS Forum 2024
Non-invasive vagus nerve stimulation normalizes psychoemotional state shifting “sympatho-vagal balance”
FENS Forum 2024
Oxytocin and leptin crosstalk in the regulation of the energy balance
FENS Forum 2024
Regulatory network of forebrain development modeled in organoids reveals key factors associated with excitatory neurons imbalance in idiopathic autism
FENS Forum 2024
Role of HDAC4 in pre- and post-synaptic protein SUMOylation imbalance in a mouse model of Alzheimer’s disease
FENS Forum 2024