Excitation and Inhibition
More than a beast growing in a passive brain: excitation and inhibition drive epilepsy and glioma progression
Gliomas are brain tumors formed by networks of interconnected tumor cells, nested in and interacting with neuronal networks. Neuronal activity influences tumor growth and the occurrence of seizures affects glioma prognosis, while the developing tumor triggers seizures in the infiltrated cortex. Oncometabolites produced by tumor cells and neurotransmitters affect both the generation of epileptic activity by neurons and the growth of glioma cells through synapse-related mechanisms involving both GABAergic/chloride pathways and glutamatergic signaling. From a clinical standpoint, the occurrence of epilepsy is associated with a better glioma prognosis, yet growing tumors are epileptogenic, which constitutes a paradox. This lecture will review how inhibitory and excitatory signaling drives glioma growth and how epileptic and oncological processes interact, with a special focus on the human brain.
Integration of 3D human stem cell models derived from post-mortem tissue and statistical genomics to guide schizophrenia therapeutic development
Schizophrenia is a neuropsychiatric disorder characterized by positive symptoms (such as hallucinations and delusions), negative symptoms (such as avolition and withdrawal) and cognitive dysfunction. Schizophrenia is highly heritable, and genetic studies are playing a pivotal role in identifying potential biomarkers and causal disease mechanisms, with the hope of informing new treatments. Genome-wide association studies (GWAS) have identified nearly 270 loci with a high statistical association with schizophrenia risk; however, each locus confers only a small increase in risk, which makes it difficult to translate these findings into an understanding of disease biology that can lead to treatments. Induced pluripotent stem cell (iPSC) models are a tractable system for translating genetic findings and interrogating mechanisms of pathogenesis. Mounting research with patient-derived iPSCs has proposed several neurodevelopmental pathways altered in schizophrenia (SCZ), such as altered neural progenitor cell (NPC) proliferation and imbalanced differentiation of excitatory and inhibitory cortical neurons. However, it is unclear what exactly these iPSC models recapitulate, how potential perturbations of early brain development translate into illness in adults, and how iPSC models that represent fetal stages can be used to advance drug development for adult illness. I will present the largest transcriptome analysis of the postmortem caudate nucleus in schizophrenia, in which we discovered that decreased presynaptic DRD2 autoregulation is the causal dopamine risk factor for schizophrenia (Benjamin et al., Nature Neuroscience 2022, https://doi.org/10.1038/s41593-022-01182-7). We developed stem cell models from a subset of the postmortem cohort to better understand the molecular underpinnings of human psychiatric disorders (Sawada et al., Stem Cell Research 2020). We established a method for differentiating iPSCs into ventral forebrain organoids and performed single-cell RNA-seq and cellular phenotyping. To our knowledge, this is the first study to evaluate iPSC models of SCZ from the same individuals from whom postmortem tissue was obtained. Our study establishes that striatal neurons in patients with SCZ carry abnormalities that originated during early brain development. Differentiation of inhibitory neurons is accelerated whereas excitatory neuronal development is delayed, implicating an excitation-inhibition (E-I) imbalance during early brain development in SCZ. We found a significant overlap between genes upregulated in inhibitory neurons of SCZ organoids and genes upregulated in postmortem caudate tissue from patients with SCZ compared with control individuals, including the donors of our iPSC cohort. Altogether, we demonstrate that ventral forebrain organoids derived from postmortem tissue of individuals with schizophrenia recapitulate the perturbed striatal gene expression dynamics of the donors’ brains (Sawada et al., bioRxiv 2022, https://doi.org/10.1101/2022.05.26.493589).
The balance of excitation and inhibition and a canonical cortical computation
Excitatory and inhibitory (E & I) inputs to cortical neurons remain balanced across different conditions. The balanced network model provides a self-consistent account of this observation: population rates dynamically adjust to yield a state in which all neurons are active at biological levels, with their E & I inputs tightly balanced. But global tight E/I balance predicts population responses with linear stimulus dependence and does not account for systematic cortical response nonlinearities such as divisive normalization, a canonical brain computation. However, when the connectivity conditions necessary for global balance fail, states arise in which only a localized subset of neurons are active and have balanced inputs. We show analytically that in networks of neurons with different stimulus selectivities, the emergence of such localized balance states robustly leads to normalization, including sublinear integration and winner-take-all behavior. An alternative model that exhibits normalization is the Stabilized Supralinear Network (SSN), which predicts a regime of loose, rather than tight, E/I balance. However, the causal relationship between E/I balance and normalization in the SSN, and the conditions under which the SSN yields significant sublinear integration, remain unclear. For weak inputs, the SSN integrates inputs supralinearly, while for very strong inputs it approaches a regime of tight balance. We show that when this latter regime is globally balanced, the SSN cannot exhibit strong normalization for any input strength; thus, in the SSN too, significant normalization requires localized balance. In summary, we causally and quantitatively connect a fundamental feature of cortical dynamics with a canonical brain computation. Time allowing, I will also cover our work extending a normative theoretical account of normalization, which explains it as an example of efficient coding of natural stimuli. We show that when biological noise is accounted for, this theory makes the same prediction as the SSN: a transition to supralinear integration for weak stimuli.
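For readers who want to see the qualitative behavior described above, a minimal two-population SSN can be simulated in a few lines of Python. This is a generic sketch, not the speaker's model; the power-law gain (k = 0.04, n = 2), time constants and weight matrix are illustrative values commonly used in two-population SSN examples. Sweeping the external drive c shows the excitatory response growing supralinearly at weak drive and turning sublinear at strong drive.

# Minimal two-population Stabilized Supralinear Network (SSN) sketch.
# Parameters are illustrative assumptions, not the speaker's model.
import numpy as np

k, n = 0.04, 2.0                      # supralinear power-law I/O: r = k * [input]_+^n
tau = np.array([0.020, 0.010])        # E and I time constants (s)
W = np.array([[1.25, -0.65],          # recurrent weights: rows = (E, I) targets,
              [1.20, -0.50]])         # columns = (E, I) sources

def steady_state_rates(c, dt=5e-5, T=0.5):
    """Euler-integrate the SSN to steady state for equal external drive c to E and I."""
    r = np.zeros(2)
    for _ in range(int(T / dt)):
        net_input = W @ r + c                                  # recurrent + external input
        r = r + dt * (-r + k * np.maximum(net_input, 0.0) ** n) / tau
    return r

# rE/c first rises (supralinear integration) and then falls (sublinear, normalization-like).
for c in [1.0, 2.0, 5.0, 10.0, 20.0, 40.0]:
    rE, rI = steady_state_rates(c)
    print(f"c = {c:5.1f}  ->  rE = {rE:8.3f}  rI = {rI:8.3f}  rE/c = {rE/c:6.3f}")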
Mutation-targeted gene therapy approaches to alter rod degeneration and retain cones
My research uses electrophysiological techniques to evaluate normal retinal function, the dysfunction caused by blinding retinal diseases, and the restoration of function by a variety of therapeutic strategies. We can use our understanding of normal retinal function and disease-related changes to construct optimal therapeutic strategies and evaluate how they ameliorate the effects of disease. Retinitis pigmentosa (RP) is a family of blinding eye diseases caused by photoreceptor degeneration. The loss of the cells that generate this primary light signal leads to blindness. My interest in RP involves the evaluation of therapies to restore vision by replacing degenerated photoreceptors either with (1) stem or other embryonic cells manipulated to become photoreceptors, or (2) prosthetic devices that replace the photoreceptor signal with an electronic response to light. Glaucoma is caused by increased intraocular pressure and leads to ganglion cell death, which eliminates the link between the retinal output and central visual processing. We are parsing out the effects of increased intraocular pressure and aging on ganglion cells. Congenital Stationary Night Blindness (CSNB) is a family of diseases in which signaling is eliminated between rod photoreceptors and their postsynaptic targets, rod bipolar cells. This deafferents the retinal circuit that is responsible for vision under dim lighting. My interest in CSNB involves understanding the basic interplay between excitation and inhibition in the retinal circuit and its normal development. Because of the targeted nature of this disease, we are hopeful that a gene therapy approach can be developed to restore night vision. My work utilizes rodent disease models whose mutations mimic those found in human patients. While molecular manipulation of rodents is a fairly common approach, we have recently developed a mutant NIH miniature swine model of a common form of autosomal dominant RP (Pro23His rhodopsin mutation) in collaboration with the National Swine Resource and Research Center at the University of Missouri. More genetically modified mini-swine models are in the pipeline to examine other retinal diseases.
Synaptic plasticity controls the emergence of population-wide invariant representations in balanced network models
The intensity and features of sensory stimuli are encoded in the activity of neurons in the cortex. In the visual and piriform cortices, stimulus intensity re-scales the activity of the population without changing its selectivity for stimulus features. The cortical representation of the stimulus is therefore intensity-invariant. The emergence of such network-level invariant representations appears robust to local changes in synaptic strength induced by synaptic plasticity, even though: i) synaptic plasticity can potentiate or depress connections between neurons in a feature-dependent manner, and ii) in networks with balanced excitation and inhibition, synaptic plasticity determines the non-linear network behavior. In this study, we investigate the consistency of invariant representations with a variety of synaptic states in balanced networks. Using mean-field models and spiking network simulations, we show how the synaptic state controls the emergence of intensity-invariant or intensity-dependent selectivity by inducing changes in the network response to intensity. In particular, we demonstrate how facilitating synaptic states can sharpen the network selectivity while depressing states broaden it. We also show how power-law-type synapses permit the emergence of invariant network selectivity and how this plasticity can be generated by a mix of different plasticity rules. Our results explain how the physiology of individual synapses is linked to the emergence of invariant representations of sensory stimuli at the network level.
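The role of power-law transfer in intensity invariance can be illustrated with a toy calculation that is not the authors' model: if the effective mapping from stimulus drive to response follows a power law, scaling stimulus intensity rescales the whole population response while leaving its normalized tuning unchanged. The exponent and the random feature-dependent drive below are arbitrary illustrative choices.

# Toy illustration (not the authors' model): power-law transfer preserves
# normalized tuning under changes in stimulus intensity.
import numpy as np

rng = np.random.default_rng(0)
feature_drive = rng.lognormal(size=50)        # hypothetical feature-dependent drive to 50 neurons
alpha = 1.7                                   # assumed power-law exponent

def population_response(intensity):
    return (intensity * feature_drive) ** alpha

r_low, r_high = population_response(1.0), population_response(10.0)

# Overall activity re-scales with intensity ...
print("mean rate ratio:", r_high.mean() / r_low.mean())        # equals 10**alpha
# ... but the normalized (intensity-invariant) tuning is identical.
print("max tuning difference:",
      np.max(np.abs(r_low / r_low.sum() - r_high / r_high.sum())))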
A theory for Hebbian learning in recurrent E-I networks
The Stabilized Supralinear Network is a model of recurrently connected excitatory (E) and inhibitory (I) neurons with a supralinear input-output relation. It can explain cortical computations such as response normalization and inhibitory stabilization. However, the network's connectivity is designed by hand, based on experimental measurements. How the recurrent synaptic weights can be learned from the sensory input statistics in a biologically plausible way is unknown. Earlier theoretical work on plasticity focused on single neurons and the balance of excitation and inhibition but did not consider the simultaneous plasticity of recurrent synapses and the formation of receptive fields. Here we present a recurrent E-I network model in which all synaptic connections are simultaneously plastic and E neurons self-stabilize by recruiting co-tuned inhibition. Motivated by experimental results, we employ a local Hebbian plasticity rule with multiplicative normalization for E and I synapses. We develop a theoretical framework that explains how plasticity gives rise to inhibition-balanced excitatory receptive fields that match experimental results. We show analytically that sufficiently strong inhibition allows neurons' receptive fields to decorrelate and distribute themselves across the stimulus space. For strong recurrent excitation, the network becomes stabilized by inhibition, which prevents unconstrained self-excitation. In this regime, external inputs are integrated sublinearly. As in the Stabilized Supralinear Network, this results in response normalization and winner-take-all dynamics: when two competing stimuli are presented, the network response is dominated by the stronger stimulus while the weaker stimulus is suppressed. In summary, we present a biologically plausible theoretical framework for modeling plasticity in fully plastic recurrent E-I networks. Although the connectivity is derived from the sensory input statistics rather than designed by hand, the circuit performs meaningful computations. Our work provides a mathematical framework for plasticity in recurrent networks, which has previously been studied only numerically, and can serve as the basis for a new generation of brain-inspired unsupervised machine learning algorithms.
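As a minimal sketch of the kind of rule referred to above, a local Hebbian update (postsynaptic rate times presynaptic rate) combined with multiplicative normalization of each neuron's total synaptic weight can be written as follows. This is a feedforward-only toy under assumed parameters (the stimulus "prototypes", population sizes and learning rate are illustrative); the authors' model additionally makes the recurrent E and I connections plastic, which this sketch omits.

# Sketch: local Hebbian plasticity with multiplicative normalization (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_neurons = 100, 20
W = rng.uniform(0.0, 0.1, size=(n_neurons, n_inputs))    # non-negative feedforward weights
target = W.sum(axis=1, keepdims=True)                    # target total weight per neuron
eta = 1e-3
# Hypothetical stimulus patterns: structured input statistics for receptive fields to learn.
prototypes = np.maximum(rng.normal(0.0, 1.0, size=(5, n_inputs)), 0.0)

def hebbian_step(W, x):
    """One local Hebbian update (post * pre) followed by multiplicative normalization."""
    y = np.maximum(W @ x, 0.0)                           # rectified postsynaptic rates
    W = W + eta * np.outer(y, x)                         # potentiate co-active pre/post pairs
    # Multiplicative normalization: rescale each row back to its target total weight,
    # so strengthened synapses grow at the expense of the others (synaptic competition).
    return W * (target / W.sum(axis=1, keepdims=True))

for _ in range(5000):
    x = prototypes[rng.integers(5)] + 0.1 * rng.random(n_inputs)   # noisy stimulus sample
    W = hebbian_step(W, x)

# Without the recurrent, co-tuned inhibition described in the abstract, neurons tend to
# develop similar receptive fields aligned with the dominant input direction; the overlap
# matrix below makes that easy to inspect (this is the decorrelation problem the full model solves).
overlap = (W / np.linalg.norm(W, axis=1, keepdims=True)) @ \
          (prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)).T
print("preferred stimulus per neuron:", overlap.argmax(axis=1))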
Co-tuned, balanced excitation and inhibition in olfactory memory networks
Odor memories are exceptionally robust and essential for the survival of many species. In rodents, the olfactory cortex shows features of an autoassociative memory network and plays a key role in the retrieval of olfactory memories (Meissner-Bernard et al., 2019). Interestingly, the telencephalic area Dp, the zebrafish homolog of olfactory cortex, transiently enters a state of precise balance during the presentation of an odor (Rupprecht and Friedrich, 2018). This state is characterized by large synaptic conductances (relative to the resting conductance) and by co-tuning of excitation and inhibition in odor space and in time at the level of individual neurons. Our aim is to understand how this precise synaptic balance affects memory function. For this purpose, we build a simplified yet biologically plausible spiking neural network model of Dp using experimental observations as constraints: besides precise balance, key features of Dp dynamics include low firing rates, odor-specific population activity and a dominance of recurrent inputs from Dp neurons relative to afferent inputs from neurons in the olfactory bulb. To achieve co-tuning of excitation and inhibition, we introduce structured connectivity by increasing connection probabilities and/or strengths among ensembles of excitatory and inhibitory neurons. These ensembles are therefore structural memories of activity patterns representing specific odors. They form functional inhibition-stabilized subnetworks, as identified by the “paradoxical effect” signature (Tsodyks et al., 1997): inhibition of the inhibitory “memory” neurons leads to an increase in their activity. We investigate the benefits of co-tuning for olfactory and memory processing by comparing inhibition-stabilized networks with and without co-tuning. We find that co-tuned excitation and inhibition improve robustness to noise, pattern completion and pattern separation. In other words, retrieval of stored information from partial or degraded sensory inputs is enhanced, which is relevant in light of the instability of the olfactory environment. Furthermore, in co-tuned networks, odor-evoked activation of stored patterns does not persist after removal of the stimulus and may therefore subserve fast pattern classification. These findings provide valuable insights into the computations performed by the olfactory cortex, and into general effects of balanced-state dynamics in associative memory networks.
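The “paradoxical effect” test used above to identify inhibition-stabilized subnetworks can be illustrated with a generic two-population rate model; the parameters below are illustrative assumptions, not the Dp model. When recurrent excitation alone would be unstable (W_EE > 1) but the full circuit is stabilized by inhibition, injecting inhibitory (negative) input into the I population paradoxically increases its steady-state rate.

# Toy inhibition-stabilized network (ISN) illustrating the paradoxical effect
# (Tsodyks et al., 1997); all parameters are illustrative, not the Dp model.
import numpy as np

W_EE, W_EI, W_IE, W_II = 1.5, 1.0, 2.0, 0.5   # W_EE > 1: E subnetwork unstable on its own
tau_E, tau_I = 0.010, 0.010                    # time constants (s)
g_E = 1.0                                      # baseline external drive to E

def steady_state(g_I, dt=1e-4, T=1.0):
    """Euler-integrate rectified-linear E-I rate dynamics to steady state."""
    E = I = 0.0
    for _ in range(int(T / dt)):
        dE = (-E + max(W_EE * E - W_EI * I + g_E, 0.0)) / tau_E
        dI = (-I + max(W_IE * E - W_II * I + g_I, 0.0)) / tau_I
        E, I = E + dt * dE, I + dt * dI
    return E, I

E0, I0 = steady_state(g_I=1.0)        # baseline drive to I
E1, I1 = steady_state(g_I=0.8)        # extra inhibitory (negative) input to I neurons
print(f"baseline:    E = {E0:.3f}, I = {I0:.3f}")
print(f"I inhibited: E = {E1:.3f}, I = {I1:.3f}  (I rate increases: paradoxical)")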
The many faces of KCC2 in the generation and suppression of seizures
KCC2, best known as the neuron-specific chloride extruder that sets the strength and polarity of GABAergic Cl- currents, is a multifunctional molecule that interacts with other ion-regulatory proteins and, structurally, with the neuronal cytoskeleton. Its multiple roles in the generation and suppression of seizures have been widely studied. In my talk, I will address some fundamental issues that are relevant in this field of research: What are EGABA shifts about? What is the role of KCC2 in shunting inhibition? What is meant by “the balance between excitation and inhibition” and, in this context, by the “NKCC1/KCC2 ratio”? Is down-regulation of KCC2 following neuronal trauma a manifestation of adaptive or maladaptive ionic plasticity? Under what conditions does K-Cl cotransport by KCC2 promote seizures? Should we pay more attention to KCC2 as a molecule involved in dendritic spine formation in brain areas such as the hippocampus? Most of these points are also of potential importance in the design of KCC2-targeting drugs and genetic manipulations aimed at combating seizures.
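For reference, the standard textbook relation behind EGABA shifts (not specific to this talk) is the Goldman-Hodgkin-Katz expression for a channel permeable to both Cl- and HCO3-:

\[
E_{\mathrm{GABA}} \;=\; \frac{RT}{F}\,
\ln\!\left(
\frac{P_{\mathrm{Cl}}\,[\mathrm{Cl}^-]_{\mathrm{i}} + P_{\mathrm{HCO_3}}\,[\mathrm{HCO_3^-}]_{\mathrm{i}}}
     {P_{\mathrm{Cl}}\,[\mathrm{Cl}^-]_{\mathrm{o}} + P_{\mathrm{HCO_3}}\,[\mathrm{HCO_3^-}]_{\mathrm{o}}}
\right)
\]

Because KCC2 extrudes Cl- and keeps [Cl-]_i low, its down-regulation raises [Cl-]_i and shifts EGABA in the depolarizing direction, while the HCO3- permeability keeps EGABA positive to the Cl- equilibrium potential.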
The subcellular organization of excitation and inhibition underlying high-fidelity direction coding in the retina
Understanding how neural circuits in the brain compute information requires not only determining how the individual inhibitory and excitatory elements of a circuit are wired together, but also detailed knowledge of their functional interactions. Recent advances in optogenetic techniques and mouse genetics now offer ways to probe the functional properties of neural circuits with unprecedented specificity. Perhaps one of the most heavily interrogated circuits in the mouse brain is the retinal circuit involved in coding direction (reviewed by Mauss et al., 2017; Vaney et al., 2012). In this circuit, direction is encoded by specialized direction-selective (DS) ganglion cells (DSGCs), which respond robustly to objects moving in a ‘preferred’ direction but not in the opposite or ‘null’ direction (Barlow and Levick, 1965). We now know this computation relies on the coordination of three transmitter systems: glutamate, GABA and acetylcholine (ACh). In this talk, I will discuss the synaptic mechanisms that produce the spatiotemporal patterns of inhibition and excitation that are crucial for shaping directional selectivity. Special emphasis will be placed on the role of ACh, as it remains unclear whether its action is mediated by synaptic or non-synaptic mechanisms, a question that is central throughout the CNS.
Barlow, H.B., and Levick, W.R. (1965). The mechanism of directionally selective units in rabbit's retina. J Physiol 178, 477-504.
Mauss, A.S., Vlasits, A., Borst, A., and Feller, M. (2017). Visual Circuits for Direction Selectivity. Annu Rev Neurosci 40, 211-230.
Vaney, D.I., Sivyer, B., and Taylor, W.R. (2012). Direction selectivity in the retina: symmetry and asymmetry in structure and function. Nat Rev Neurosci 13, 194-208.
Localized balance of excitation and inhibition leads to normalization
COSYNE 2022
Patterns of mutual excitation and inhibition between classes of inhibitory neurons in the primary olfactory cortex
FENS Forum 2024