Energy
Alona Fyshe
The Department of Psychology, University of Alberta, invites applications for a tenure-track position at the rank of Assistant Professor in Artificial Intelligence and Biological Cognition, with a start date as early as July 1, 2024. Exceptional candidates may be considered for hiring at the rank of Associate Professor. The position is part of a cluster hire at the intersection of AI/ML and other areas of research excellence within the University of Alberta, including Health, Energy, and Indigenous Initiatives in health and the humanities, among others. The successful candidate will become an Amii Fellow, joining a highly collegial institute of world-class Artificial Intelligence and Machine Learning researchers, and will have access to Amii internal funding resources, administrative support, and a highly collaborative environment. The successful candidate will also be nominated by Amii for a Canada CIFAR Artificial Intelligence (CCAI) Chair, which includes research funding for at least five years.
Astrocytes: From Metabolism to Cognition
Different brain cell types exhibit distinct metabolic signatures that link energy economy to cellular function. Astrocytes and neurons, for instance, diverge dramatically in their reliance on glycolysis versus oxidative phosphorylation, underscoring that metabolic fuel efficiency is not uniform across cell types. A key factor shaping this divergence is the structural organization of the mitochondrial respiratory chain into supercomplexes. Specifically, complexes I (CI) and III (CIII) form a CI–CIII supercomplex, but the degree of this assembly varies by cell type. In neurons, CI is predominantly integrated into supercomplexes, resulting in highly efficient mitochondrial respiration and minimal reactive oxygen species (ROS) generation. Conversely, in astrocytes, a larger fraction of CI remains unassembled, freely existing apart from CIII, leading to reduced respiratory efficiency and elevated mitochondrial ROS production. Despite this apparent inefficiency, astrocytes boast a highly adaptable metabolism capable of responding to diverse stressors. Their looser CI–CIII organization allows for flexible ROS signaling, which activates antioxidant programs via transcription factors like Nrf2. This modular architecture enables astrocytes not only to balance energy production but also to support neuronal health and influence complex organismal behaviors.
From Spiking Predictive Coding to Learning Abstract Object Representation
In the first part of the talk, I will present Predictive Coding Light (PCL), a novel unsupervised learning architecture for spiking neural networks. In contrast to conventional predictive coding approaches, which only transmit prediction errors to higher processing stages, PCL learns inhibitory lateral and top-down connectivity to suppress the most predictable spikes and passes a compressed representation of the input to higher processing stages. We show that PCL reproduces a range of biological findings and exhibits a favorable tradeoff between energy consumption and downstream classification performance on challenging benchmarks. The second part of the talk will feature our lab's efforts to explain how infants and toddlers might learn abstract object representations without supervision. I will present deep learning models that exploit the temporal and multimodal structure of their sensory inputs to learn representations of individual objects, object categories, or abstract super-categories such as "kitchen object" in a fully unsupervised fashion. These models offer a parsimonious account of how abstract semantic knowledge may be rooted in children's embodied first-person experiences.
Neurobiological constraints on learning: bug or feature?
Understanding how brains learn requires bridging evidence across scales—from behaviour and neural circuits to cells, synapses, and molecules. In our work, we use computational modelling and data analysis to explore how the physical properties of neurons and neural circuits constrain learning. These include limits imposed by brain wiring, energy availability, molecular noise, and the 3D structure of dendritic spines. In this talk I will describe one such project, testing whether wiring motifs from fly brain connectomes can improve the performance of reservoir computers, a type of recurrent neural network. The hope is that these insights into brain learning will lead to improved learning algorithms for artificial systems.
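For orientation, the sketch below shows the skeleton of a reservoir computer (an echo state network). The random recurrent matrix is the component that could, hypothetically, be swapped for a connectome-derived wiring motif; the network size, sparsity, and input signal here are illustrative, not the talk's actual setup.

```python
import numpy as np

# Minimal echo state network sketch. W is the reservoir wiring that could,
# hypothetically, be replaced by a motif extracted from a fly connectome.
rng = np.random.default_rng(0)
N, T = 200, 1000
W = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)   # sparse reservoir
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))             # echo-state property
W_in = rng.normal(0, 0.5, (N, 1))
u = np.sin(np.linspace(0, 20 * np.pi, T))                   # toy input signal
x, states = np.zeros(N), []
for t in range(T):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])                  # reservoir update
    states.append(x.copy())
X = np.array(states)
# Only the linear readout is trained, here by ridge regression to predict
# the next input value from the current reservoir state.
y = u[1:]
W_out = np.linalg.solve(X[:-1].T @ X[:-1] + 1e-4 * np.eye(N), X[:-1].T @ y)
```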
Impact of High Fat Diet on Central Cardiac Circuits: When The Wanderer is Lost
Cardiac vagal motor drive originates in the brainstem's cardiac vagal motor neurons (CVNs). Despite their well-established cardioinhibitory functions in health, our understanding of CVNs in disease is limited. There is a clear connection between cardiovascular regulation and metabolic and energy expenditure systems. Using high fat diet as a model, this talk will explore how metabolic dysfunction impacts the regulation of cardiac tissue through robust inhibition of CVNs. Specifically, it will present an often overlooked modality of inhibition, tonic gamma-aminobutyric acid (GABA) A-type neurotransmission, using an array of techniques from single-cell patch clamp electrophysiology to transgenic in vivo whole-animal physiology. It will also highlight a unique interaction with the delta isoform of protein kinase C that facilitates GABA A-type receptor expression.
Open Raman Microscopy (ORM): A modular Raman spectroscopy setup with an open-source controller
Raman spectroscopy is a powerful technique for identifying chemical species by probing their vibrational energy levels, offering exceptional specificity with a relatively simple setup involving a laser source, spectrometer, and microscope/probe. However, the high cost and limited modularity of Raman systems often constrain exploratory research, hindering broader adoption. To address the need for an affordable, modular microscopy platform for multimodal imaging, we present a customizable confocal Raman spectroscopy setup alongside an open-source acquisition software, the ORM (Open Raman Microscopy) Controller, developed in Python. This solution bridges the gap between expensive commercial systems and complex, custom-built setups used by specialist research groups. In this presentation, we will cover the components of the setup, the design rationale, assembly methods, limitations, and its modular potential for expanding functionality. Additionally, we will demonstrate ORM's capabilities for instrument control, 2D and 3D Raman mapping, region-of-interest selection, and its adaptability to various instrument configurations. We will conclude by showcasing practical applications of this setup across different research fields.
Mitochondrial diversity in the mouse and human brain
The basis of the mind, of mental states, and of complex behaviors is the flow of energy through microscopic and macroscopic brain structures. Energy flow through brain circuits is powered by thousands of mitochondria populating the inside of every neuronal, glial, and other nucleated cell across the brain-body unit. This seminar will cover emerging approaches to study the mind-mitochondria connection and present early attempts to map the distribution and diversity of mitochondria across brain tissue. In rodents, I will present convergent multimodal evidence anchored in enzyme activities, gene expression, and animal behavior that distinct behaviorally-relevant mitochondrial phenotypes exist across large-scale mouse brain networks. Extending these findings to the human brain, I will present a developing systematic biochemical and molecular map of mitochondrial variation across cortical and subcortical brain structures, representing a foundation to understand the origin of complex energy patterns that give rise to the human mind.
Prefrontal mechanisms involved in learning distractor-resistant working memory in a dual task
Working memory (WM) is a cognitive function that allows the short-term maintenance and manipulation of information when no longer accessible to the senses. It relies on temporarily storing stimulus features in the activity of neuronal populations. To preserve these dynamics from distraction, it has been proposed that pre- and post-distraction population activity decomposes into orthogonal subspaces. If orthogonalization is necessary to avoid WM distraction, it should emerge as performance in the task improves. We sought evidence of WM orthogonalization learning and the underlying mechanisms by analyzing calcium imaging data from the prelimbic (PrL) and anterior cingulate (ACC) cortices of mice as they learned to perform an olfactory dual task. The dual task combines an outer Delayed Paired-Association task (DPA) with an inner Go-NoGo task. We examined how neuronal activity reflected the process of protecting the DPA sample information against Go/NoGo distractors. As mice learned the task, we measured the overlap of neural activity with the low-dimensional subspaces that encode sample or distractor odors. Early in training, pre-distraction activity overlapped with both sample and distractor subspaces. Later in training, pre-distraction activity was strictly confined to the sample subspace, resulting in a more robust sample code. To gain mechanistic insight into how these low-dimensional WM representations evolve with learning, we built a recurrent spiking network model of excitatory and inhibitory neurons with low-rank connections. The model links learning to (1) the orthogonalization of sample and distractor WM subspaces and (2) the orthogonalization of each subspace with irrelevant inputs. We validated (1) by measuring the angular distance between the sample and distractor subspaces through learning in the data. Prediction (2) was validated in PrL through photoinhibition of ACC to PrL inputs, which induced early-training neural dynamics in well-trained animals. In the model, learning drives the network from a double-well attractor toward a more continuous ring attractor regime. We tested signatures of this dynamical evolution in the experimental data by estimating the energy landscape of the dynamics on a one-dimensional ring. In sum, our study defines the network dynamics underlying the process of learning to shield WM representations from distracting tasks.
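As a rough illustration of the kind of subspace analysis described above (an illustrative sketch; variable names and the exact overlap measure are assumptions, not the study's pipeline), one can quantify the overlap of population activity with a coding subspace and the angle between two subspaces as follows.

```python
import numpy as np
from scipy.linalg import orth, subspace_angles

# Illustrative sketch: overlap of trial activity with a stimulus-coding
# subspace, and the angular distance between two coding subspaces.
def overlap(activity, basis):
    """activity: trials x neurons; basis: neurons x dims (e.g. sample subspace)."""
    Q = orth(basis)                                  # orthonormalize the subspace
    proj = activity @ Q                              # project activity onto it
    return np.sum(proj ** 2) / np.sum(activity ** 2) # fraction of variance captured

def max_angle_deg(basis_a, basis_b):
    """Largest principal angle between two subspaces, in degrees."""
    return np.degrees(np.max(subspace_angles(orth(basis_a), orth(basis_b))))
```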
Sex hormone regulation of neural gene expression
Gonadal steroid hormones are the principal drivers of sex-variable biology in vertebrates. In the brain, estrogen (17β-estradiol) establishes neural sex differences in many species and modulates mood, behavior, and energy balance in adulthood. To understand the diverse effects of estradiol on the brain, we profiled the genomic binding of estrogen receptor alpha (ERα), providing the first picture of the neural actions of any gonadal hormone receptor. To relate ERα target genes to brain sex differences we assessed gene expression and chromatin accessibility in the posterior bed nucleus of the stria terminalis (BNSTp), a sexually dimorphic node in limbic circuitry that underlies sex-differential social behaviors such as aggression and parenting. In adult animals we observe that levels of ERα are predictive of the extent of sex-variable gene expression, and that these sex differences are a dynamic readout of acute hormonal state. In neonates we find that transient ERα recruitment at birth leads to persistent chromatin opening and male-biased gene expression, demonstrating a true epigenetic mechanism for brain sexual differentiation. Collectively, our findings demonstrate that sex differences in gene expression in the brain are a readout of state-dependent hormone receptor actions, rather than other factors such as sex chromosomes. We anticipate that the ERα targets we have found will contribute to established sex differences in the incidence and etiology of neurological and psychiatric disorders.
Decoding mental conflict between reward and curiosity in decision-making
Humans and animals are not always rational. They not only rationally exploit rewards but also explore an environment owing to their curiosity. However, the mechanism of such curiosity-driven irrational behavior is largely unknown. Here, we developed a decision-making model for a two-choice task based on the free energy principle, a theory integrating recognition and action selection. The model describes irrational behaviors depending on the curiosity level. We also proposed a machine learning method to decode temporal curiosity from behavioral data. By applying it to rat behavioral data, we found that the rat had negative curiosity, reflecting a conservative tendency to stick to more certain options, and that the level of curiosity was upregulated by the expected future information obtainable from an uncertain environment. Our decoding approach can be a fundamental tool for identifying the neural basis of reward–curiosity conflicts. Furthermore, it could be effective in diagnosing mental disorders.
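For background, in active inference the expected free energy of a policy $\pi$ is commonly decomposed into a reward (pragmatic) term and an information-gain (epistemic) term; weighting the second term is one generic way such a model can trade reward against curiosity (the talk's exact formulation may differ):

$$ G(\pi) \;=\; -\underbrace{\mathbb{E}_{Q(o\mid\pi)}\!\left[\ln P(o)\right]}_{\text{expected reward}} \;-\; \underbrace{\mathbb{E}_{Q(o\mid\pi)}\!\left[ D_{\mathrm{KL}}\!\left( Q(s\mid o,\pi)\,\|\,Q(s\mid\pi) \right)\right]}_{\text{expected information gain (curiosity)}} $$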
Consciousness in the age of mechanical minds
We are now clearly entering a new age in our relationship with machines. The power of AI natural language processors and image generators has rapidly exceeded the expectations of even those who developed them. Serious questions are now being asked about the extent to which machines could become — or perhaps already are — sentient or conscious. Do AI machines understand the instructions they are given and the answers they provide? In this talk I will consider the prospects for conscious machines, by which I mean machines that have feelings, know about their own existence, and about ours. I will suggest that the recent focus on information processing in models of consciousness, in which the brain is treated as a kind of digital computer, has misled us about the nature of consciousness and how it is produced in biological systems. Treating the brain as an energy processing system is more likely to yield answers to these fundamental questions and help us understand how and when machines might become minds.
Obesity and Brain – Bidirectional Influences
The regulation of body weight relies on homeostatic mechanisms that use a combination of internal signals and external cues to initiate and terminate food intake. Homeostasis depends on intricate communication between the body and the hypothalamus involving numerous neural and hormonal signals. However, there is growing evidence that higher-level cognitive function may also influence energy balance. For instance, research has shown that BMI is consistently linked to various brain, cognitive, and personality measures, implicating executive, reward, and attentional systems. Moreover, the rise in obesity rates over the past half-century is attributed to the affordability and widespread availability of highly processed foods, a phenomenon that contradicts the idea that food intake is solely regulated by homeostasis. I will suggest that prefrontal systems involved in value computation and motivation act to limit food overconsumption when food is scarce or expensive, but promote over-eating when food is abundant, an optimal strategy from an economic standpoint. I will review the genetic and neuroscience literature on the CNS control of body weight. I will present recent studies supporting a role of prefrontal systems in weight control. I will also present contradictory evidence showing that frontal executive and cognitive findings in obesity may be a consequence, not a cause, of increased hunger. Finally, I will review the effects of obesity on brain anatomy and function. Chronic adiposity leads to cerebrovascular dysfunction, cortical thinning, and cognitive impairment. As the most common preventable risk factor for dementia, obesity poses a significant threat to brain health. I will conclude by reviewing evidence for the treatment of obesity in adults to prevent brain disease.
Uncovering the molecular effectors of diet and exercise
Despite the profound effects of nutrition and physical activity on human health, our understanding of the molecules mediating the salutary effects of specific foods or activities remains remarkably limited. Here, we share our ongoing studies that use unbiased and high-resolution metabolomics technologies to uncover the molecules and molecular effectors of diet and exercise. We describe how exercise stimulates the production of Lac-Phe, a blood-borne signaling metabolite that suppresses feeding and obesity. Ablation of Lac-Phe biosynthesis in mice increases food intake and obesity after exercise. We also describe the discovery of an orphan metabolite, BHB-Phe. Ketosis-inducible BHB-Phe is a congener of exercise-inducible Lac-Phe, produced in CNDP2+ cells when levels of BHB are high, and functions to lower body weight and adiposity in ketosis. Our data uncover an unexpected and underappreciated signaling role for metabolic fuel derivatives in mediating the cardiometabolic benefits of diet and exercise. These data also suggest that diet and exercise may mediate their physiologic effects on energy balance via a common family of molecules and overlapping signaling pathways.
Asymmetric signaling across the hierarchy of cytoarchitecture within the human connectome
Cortical variations in cytoarchitecture form a sensory-fugal axis that shapes regional profiles of extrinsic connectivity and is thought to guide signal propagation and integration across the cortical hierarchy. While neuroimaging work has shown that this axis constrains local properties of the human connectome, it remains unclear whether it also shapes the asymmetric signaling that arises from higher-order topology. Here, we used network control theory to examine the amount of energy required to propagate dynamics across the sensory-fugal axis. Our results revealed an asymmetry in this energy, indicating that bottom-up transitions were easier to complete compared to top-down. Supporting analyses demonstrated that asymmetries were underpinned by a connectome topology that is wired to support efficient bottom-up signaling. Lastly, we found that asymmetries correlated with differences in communicability and intrinsic neuronal time scales and lessened throughout youth. Our results show that cortical variation in cytoarchitecture may guide the formation of macroscopic connectome topology.
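As background on the control-energy quantity used here: for linear dynamics $\dot{x} = Ax + Bu$, the minimum input energy needed to steer the system from state $x_0$ to $x_f$ in time $T$ has a standard closed form (the study's exact formulation may differ):

$$ E_{\min} \;=\; \min_{u} \int_0^T \|u(t)\|^2\,dt \;=\; \left(x_f - e^{AT}x_0\right)^{\!\top} W_T^{-1} \left(x_f - e^{AT}x_0\right), \qquad W_T = \int_0^T e^{At} B B^{\top} e^{A^{\top} t}\,dt, $$

where $W_T$ is the controllability Gramian; asymmetry arises because the energy for $x_0 \to x_f$ generally differs from that for $x_f \to x_0$.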
REM sleep and the energy allocation hypothesis
Beyond Biologically Plausible Spiking Networks for Neuromorphic Computing
Biologically plausible spiking neural networks (SNNs) are an emerging architecture for deep learning tasks due to their energy efficiency when implemented on neuromorphic hardware. However, many of the biological features are at best irrelevant and at worst counterproductive when evaluated in the context of task performance and suitability for neuromorphic hardware. In this talk, I will present an alternative paradigm to design deep learning architectures with good task performance in real-world benchmarks while maintaining all the advantages of SNNs. We do this by focusing on two main features – event-based computation and activity sparsity. Starting from the performant gated recurrent unit (GRU) deep learning architecture, we modify it to make it event-based and activity-sparse. The resulting event-based GRU (EGRU) is extremely efficient for both training and inference. At the same time, it achieves performance close to conventional deep learning architectures in challenging tasks such as language modelling, gesture recognition and sequential MNIST.
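A minimal sketch of the event-based idea (illustrative, not the published EGRU): a GRU cell whose output is zeroed below a learned threshold, so downstream computation is triggered only by units that emit an "event".

```python
import torch

# Illustrative sketch of an event-based, activity-sparse recurrent unit
# (not the published EGRU): outputs below a learned per-unit threshold are
# zeroed, so most of the output vector is exactly zero at each step.
class EventGatedGRUCell(torch.nn.Module):
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.cell = torch.nn.GRUCell(n_in, n_hidden)
        self.threshold = torch.nn.Parameter(torch.ones(n_hidden))

    def forward(self, x, h):
        h = self.cell(x, h)
        events = (h > self.threshold).float()   # sparse event mask
        return h * events, h                    # sparse output, dense state

cell = EventGatedGRUCell(16, 64)
x, h = torch.randn(8, 16), torch.zeros(8, 64)
out, h = cell(x, h)                             # most entries of `out` are zero
```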
From Machine Learning to Autonomous Intelligence
How could machines learn as efficiently as humans and animals? How could machines learn to reason and plan? How could machines learn representations of percepts and action plans at multiple levels of abstraction, enabling them to reason, predict, and plan at multiple time horizons? I will propose a possible path towards autonomous intelligent agents, based on a new modular cognitive architecture and a somewhat new self-supervised training paradigm. The centerpiece of the proposed architecture is a configurable predictive world model that allows the agent to plan. Behavior and learning are driven by a set of differentiable intrinsic cost functions. The world model uses a new type of energy-based model architecture called H-JEPA (Hierarchical Joint Embedding Predictive Architecture). H-JEPA learns hierarchical abstract representations of the world that are simultaneously maximally informative and maximally predictable. The corresponding working paper is available here: https://openreview.net/forum?id=BZ5a1r-kVsf
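As a rough illustration of the joint-embedding idea (a sketch of the general principle, not the H-JEPA architecture itself): two encoders map paired observations into an embedding space, a predictor makes one embedding predictable from the other, and a variance term keeps the embeddings informative rather than collapsed.

```python
import torch
import torch.nn as nn

# Rough sketch of a joint-embedding predictive objective (the general idea
# only; H-JEPA itself is hierarchical and far richer than this toy example).
enc_x, enc_y, pred = nn.Linear(64, 32), nn.Linear(64, 32), nn.Linear(32, 32)
params = [*enc_x.parameters(), *enc_y.parameters(), *pred.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)

x, y = torch.randn(128, 64), torch.randn(128, 64)   # stand-in for paired percepts
sx, sy = enc_x(x), enc_y(y)
energy = ((pred(sx) - sy) ** 2).mean()              # low energy = compatible pair
anti_collapse = torch.relu(1.0 - sy.std(dim=0)).mean()  # keep embeddings informative
loss = energy + anti_collapse
loss.backward()
opt.step()
```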
General purpose event-based architectures for deep learning
Biologically plausible spiking neural networks (SNNs) are an emerging architecture for deep learning tasks due to their energy efficiency when implemented on neuromorphic hardware. However, many of the biological features are at best irrelevant and at worst counterproductive when evaluated in the context of task performance and suitability for neuromorphic hardware. In this talk, I will present an alternative paradigm to design deep learning architectures with good task performance in real-world benchmarks while maintaining all the advantages of SNNs. We do this by focusing on two main features – event-based computation and activity sparsity. Starting from the performant gated recurrent unit (GRU) deep learning architecture, we modify it to make it event-based and activity-sparse. The resulting event-based GRU (EGRU) is extremely efficient for both training and inference. At the same time, it achieves performance close to conventional deep learning architectures in challenging tasks such as language modelling, gesture recognition and sequential MNIST.
Redox and mitochondrial dysregulation in epilepsy
Epileptic seizures render the brain uniquely dependent on energy-producing pathways. Studies in our laboratory have focused on the role of redox processes and mitochondria in the context of abnormal neuronal excitability associated with epilepsy. We have shown that status epilepticus (SE) alters mitochondrial and cellular redox status, energetics and function and, conversely, that reactive oxygen species and the resultant dysfunction can lead to chronic epilepsy. Oxidative stress and neuroinflammatory pathways have considerable crosstalk, and targeting redox processes has recently been shown to control neuroinflammation and excitability. Understanding the role of metabolic and redox processes can enable the development of novel therapeutics to control epilepsy and/or its comorbidities.
Membrane mechanics meet minimal manifolds
Changes in the geometry and topology of self-assembled membranes underlie diverse processes across cellular biology and engineering. Similar to lipid bilayers, monolayer colloidal membranes studied by the Sharma (IISc Bangalore) and Dogic (UCSB) Labs have in-plane fluid-like dynamics and out-of-plane bending elasticity, but their open edges and micron length scale provide a tractable system to study the equilibrium energetics and dynamic pathways of membrane assembly and reconfiguration. First, we discuss how doping colloidal membranes with short miscible rods transforms disk-shaped membranes into saddle-shaped minimal surfaces with complex edge structures. Theoretical modeling demonstrates that their formation is driven by increasing positive Gaussian modulus, which in turn is controlled by the fraction of short rods. Further coalescence of saddle-shaped surfaces leads to exotic topologically distinct structures, including shapes similar to catenoids, tri-noids, four-noids, and higher order structures. We then mathematically explore the mechanics of these catenoid-like structures subject to an external axial force and elucidate their intimate connection to two problems whose solutions date back to Euler: the shape of an area-minimizing soap film and the buckling of a slender rod under compression. A perturbation theory argument directly relates the tensions of membranes to the stability properties of minimal surfaces. We also investigate the effects of including a Gaussian curvature modulus, which, for small enough membranes, causes the axial force to diverge as the ring separation approaches its maximal value.
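For context, the ingredients above map onto the standard Helfrich-type energy for a membrane with an open edge, in which the sign and magnitude of the Gaussian modulus $\bar{\kappa}$ control the preference for saddle-like (negative Gaussian curvature) shapes:

$$ E \;=\; \int dA \left[ \frac{\kappa}{2}\,(2H)^2 + \bar{\kappa}\, K \right] \;+\; \gamma \oint_{\partial} ds, $$

where $H$ and $K$ are the mean and Gaussian curvatures, $\kappa$ is the bending modulus, and $\gamma$ is the edge tension; an increasingly positive $\bar{\kappa}$ makes regions with $K < 0$ energetically favorable, consistent with the saddle formation described in the talk.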
Canonical neural networks perform active inference
The free-energy principle and active inference have received significant attention in the fields of neuroscience and machine learning. However, it remains to be established whether active inference is an apt explanation for any given neural network that actively exchanges with its environment. To address this issue, we show that a class of canonical neural networks of rate coding models implicitly performs variational Bayesian inference under a well-known form of partially observed Markov decision process model (Isomura, Shimazaki, Friston, Commun Biol, 2022). Based on the proposed theory, we demonstrate that canonical neural networks—featuring delayed modulation of Hebbian plasticity—can perform planning and adaptive behavioural control in a Bayes-optimal manner, through postdiction of their previous decisions. This scheme enables us to estimate implicit priors under which the agent's neural network operates and to identify a specific form of the generative model. The proposed equivalence is crucial for rendering brain activity explainable, to better understand basic neuropsychology and psychiatric disorders. Moreover, this notion can dramatically reduce the complexity of designing self-learning neuromorphic hardware to perform various types of tasks.
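For reference, the variational free energy minimized in such schemes is the standard quantity (written generically here; the paper's generative model is a partially observed Markov decision process):

$$ F \;=\; \mathbb{E}_{Q(s)}\!\left[\ln Q(s) - \ln P(o, s)\right] \;=\; D_{\mathrm{KL}}\!\left(Q(s)\,\|\,P(s\mid o)\right) \;-\; \ln P(o) \;\ge\; -\ln P(o), $$

so minimizing $F$ simultaneously drives the recognition density $Q(s)$ toward the true posterior and bounds the surprise of observations.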
Trading Off Performance and Energy in Spiking Networks
Many engineered and biological systems must trade off performance and energy use, and the brain is no exception. While there are theories on how activity levels are controlled in biological networks through feedback control (homeostasis), it is not clear what the effects on population coding are, and therefore how performance and energy can be traded off. In this talk we will consider this tradeoff in auto-encoding networks, in which there is a clear definition of performance (the coding loss). We first show how SNNs follow a characteristic trade-off curve between activity levels and coding loss, but that standard networks need to be retrained to achieve different tradeoff points. We next formalize this tradeoff with a joint loss function incorporating coding loss (performance) and activity loss (energy use). From this loss we derive a class of spiking networks which coordinates its spiking to minimize both the activity and coding losses, and as a result can dynamically adjust its coding precision and energy use. The network utilizes several known activity control mechanisms for this, threshold adaptation and feedback inhibition, and elucidates their potential function within neural circuits. Using geometric intuition, we demonstrate how these mechanisms regulate coding precision, and thereby performance. Lastly, we consider how these insights could be transferred to trained SNNs. Overall, this work addresses a key energy-coding trade-off which is often overlooked in network studies, expands our understanding of homeostasis in biological SNNs, and provides a clear framework for considering performance and energy use in artificial SNNs.
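A generic form of such a joint objective for an auto-encoding spiking network (a schematic; the talk's derivation will differ in detail) is

$$ \mathcal{L} \;=\; \underbrace{\|x - \hat{x}(r)\|^2}_{\text{coding loss}} \;+\; \mu \underbrace{\sum_i r_i}_{\text{activity loss}}, $$

where $r_i$ are spike counts, $\hat{x}(r)$ is the decoded estimate of the input, and $\mu$ sets the operating point along the performance-energy trade-off curve.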
Growing a world-class precision medicine industry
Monash Biomedical Imaging is part of the new $71.2 million Australian Precision Medicine Enterprise (APME) facility, which will deliver large-scale development and manufacturing of precision medicines and theranostic radiopharmaceuticals for industry and research. A key feature of the APME project is a high-energy cyclotron with multiple production clean rooms, which will be located on the Monash Biomedical Imaging (MBI) site in Clayton. This strategic co-location will facilitate radiochemistry, PET and SPECT research and clinical use of theranostic (therapeutic and diagnostic) radioisotopes produced on-site. In this webinar, MBI’s Professor Gary Egan and Dr Maggie Aulsebrook will explain how the APME will secure Australia’s supply of critical radiopharmaceuticals, build a globally competitive Australian manufacturing hub, and train scientists and engineers for the Australian workforce. They will cover the APME’s state-of-the-art 30 MeV and 18-24 MeV cyclotrons and radiochemistry facilities, as well as the services that will be accessible to students, scientists, clinical researchers, and pharmaceutical companies in Australia and around the world. The APME is a collaboration between Monash University, Global Medical Solutions Australia, and Telix Pharmaceuticals. Professor Gary Egan is Director of Monash Biomedical Imaging, Director of the ARC Centre of Excellence for Integrative Brain Function and a Distinguished Professor at the Turner Institute for Brain and Mental Health, Monash University. He is also lead investigator of the Victorian Biomedical Imaging Capability, and Deputy Director of the Australian National Imaging Facility. Dr Maggie Aulsebrook obtained her PhD in Chemistry at Monash University and specialises in the development and clinical translation of radiopharmaceuticals. She has led the development of several investigational radiopharmaceuticals for first-in-human application. Maggie leads the Radiochemistry Platform at Monash Biomedical Imaging.
Emergence of homochirality in large molecular systems
The question of the origin of homochirality of living matter, or the dominance of one handedness for all molecules of life across the entire biosphere, is a long-standing puzzle in research on the origin of life. In the 1950s, Frank proposed a mechanism to explain homochirality based on the properties of a simple autocatalytic network containing only a few chemical species. Following this work, chemists struggled to find experimental realizations of this model, possibly due to a lack of proper methods to identify autocatalysis [1]. In any case, a model based on a few chemical species seems rather limited, because the prebiotic earth is likely to have consisted of complex 'soups' of chemicals. To include this aspect of the problem, we recently proposed a mechanism based on certain features of large out-of-equilibrium chemical networks [2]. We showed that a phase transition towards a homochiral state is likely to occur as the number of chiral species in the system becomes large or as the amount of free energy injected into the system increases. Through an analysis of large chemical databases, we showed that there is no need for very large molecules for chiral species to dominate over achiral ones; it already happens when molecules contain about 10 heavy atoms. We also analyzed the various conventions used to measure chirality and discussed the relative chiral signs adopted by different groups of molecules [3]. We then proposed a generalization of Frank's model for large chemical networks, which we characterized using random matrix theory. This analysis includes sparse networks, suggesting that the emergence of homochirality is a robust and generic transition. References: [1] A. Blokhuis, D. Lacoste, and P. Nghe, PNAS 117, 25230 (2020). [2] G. Laurent, D. Lacoste, and P. Gaspard, PNAS 118, e2012741118 (2021). [3] G. Laurent, D. Lacoste, and P. Gaspard, Proc. R. Soc. A 478, 20210590 (2022).
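For readers unfamiliar with Frank's mechanism, it can be written for the two enantiomer concentrations $c_L$ and $c_D$, with autocatalytic production from an achiral substrate $A$ and mutual antagonism:

$$ \dot{c}_L = k_a\, c_A\, c_L - k_i\, c_L\, c_D, \qquad \dot{c}_D = k_a\, c_A\, c_D - k_i\, c_L\, c_D, $$

so the enantiomeric excess obeys $\frac{d}{dt}(c_L - c_D) = k_a c_A (c_L - c_D)$: any small initial imbalance is amplified and the racemic state $c_L = c_D$ is unstable.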
Better energies for low-dimensional elastic systems under combined bending and stretching
We present new kinematic bending measures and quadratic energies for isotropic elastic plates and shells, with certain desirable features not present in commonly employed models in mechanics and soft matter. These are justified both by simple physical arguments related to the through-thickness variation in strain, and through a detailed reduction from a three-dimensional energy quadratic in stretch. The measure of plate bending is a dilation-invariant surface tensor that couples stretch and curvature in a natural extension of primitive generalized bending strains for straight rods. The extension to naturally-curved rods and shells, for which the pure stretching of a curved rest configuration is not a dilation, contrasts with previous ad hoc postulated forms. Our results provide a clean basis for simple models of low-dimensional elastic systems, and should enable more accurate probing of the structure of singularities in soft sheets and membranes.
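For orientation, the commonly employed baseline being contrasted here is a Koiter-type quadratic energy in which the membrane term scales with thickness $t$ and the bending term with $t^3$ (written schematically; the talk proposes modified bending measures with a different coupling between stretch and curvature):

$$ E \;=\; \frac{t}{8} \int \mathcal{Q}(\varepsilon)\, dA \;+\; \frac{t^3}{24} \int \mathcal{Q}(b - b_0)\, dA, $$

where $\varepsilon$ is the surface strain, $b - b_0$ the change in curvature from the rest configuration, and $\mathcal{Q}$ a quadratic form built from the elastic moduli.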
GeNN
Large-scale numerical simulations of brain circuit models are important for identifying hypotheses about brain function and testing their consistency and plausibility. Similarly, spiking neural networks are gaining traction in machine learning with the promise that neuromorphic hardware will eventually make them much more energy efficient than classical ANNs. In this session, we will present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale spiking neuronal networks to address the challenge of efficient simulations. GeNN is an open-source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. GeNN was originally developed as a pure C++ and CUDA library but we have subsequently added a Python interface and an OpenCL backend. We will briefly cover the history and basic philosophy of GeNN and show some simple examples of how it is used and how it interacts with other open-source frameworks such as Brian2GeNN and PyNN.
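To give a flavor of the Python interface, here is a minimal sketch in the style of the PyGeNN tutorials (GeNN 4.x-era API; property and parameter names may differ between versions, so treat this as indicative rather than definitive):

```python
from pygenn.genn_model import GeNNModel

# Minimal PyGeNN-style sketch: build a population of LIF neurons and step
# the simulation; GeNN generates and compiles the GPU code behind the scenes.
model = GeNNModel("float", "example")
model.dT = 1.0  # simulation timestep in ms

lif_params = {"C": 1.0, "TauM": 20.0, "Vrest": -65.0, "Vreset": -65.0,
              "Vthresh": -50.0, "Ioffset": 0.0, "TauRefrac": 5.0}
lif_init = {"V": -65.0, "RefracTime": 0.0}
model.add_neuron_population("neurons", 100, "LIF", lif_params, lif_init)

model.build()   # generate and compile the simulation code
model.load()    # allocate memory and load onto the device
while model.t < 100.0:
    model.step_time()
```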
How does the metabolically-expensive mammalian brain adapt to food scarcity?
Information processing is energetically expensive. In the mammalian brain, it is unclear how information coding and energy usage are regulated during food scarcity. I addressed this in the visual cortex of awake mice using whole-cell recordings and two-photon imaging to monitor layer 2/3 neuronal activity and ATP usage. I found that food restriction reduced synaptic ATP usage by 29% through a decrease in AMPA receptor conductance. Neuronal excitability was nonetheless preserved by a compensatory increase in input resistance and a depolarized resting membrane potential. Consequently, neurons spiked at similar rates as controls, but spent less ATP on underlying excitatory currents. This energy-saving strategy had a cost since it amplified the variability of visually-evoked subthreshold responses, leading to a 32% broadening in orientation tuning and impaired fine visual discrimination. This reduction in coding precision was associated with reduced levels of the fat mass-regulated hormone leptin and was restored by exogenous leptin supplementation. These findings reveal novel mechanisms that dynamically regulate energy usage and coding precision in neocortex.
Metabolic spikes: from rogue electrons to Parkinson's
Conventionally, neurons are thought to be cellular units that process synaptic inputs into synaptic spikes. However, it is well known that neurons can also spike spontaneously and display a rich repertoire of firing properties with no apparent functional relevance e.g. in in vitro cortical slice preparations. In this talk, I will propose a hypothesis according to which intrinsic excitability in neurons may be a survival mechanism to minimize toxic byproducts of the cell’s energy metabolism. In neurons, this toxicity can arise when mitochondrial ATP production stalls due to limited ADP. Under these conditions, electrons deviate from the electron transport chain to produce reactive oxygen species, disrupting many cellular processes and challenging cell survival. To mitigate this, neurons may engage in ADP-producing metabolic spikes. I will explore the validity of this hypothesis using computational models that illustrate the implications of synaptic and metabolic spiking, especially in the context of substantia nigra pars compacta dopaminergic neurons and their degeneration in Parkinson's disease.
New Mechanisms of Extracellular Matrix Remodeling
In the adult brain, synapses are tightly enwrapped by lattices of extracellular matrix that consist of extremely long-lived molecules. These lattices are thought to stabilize synapses, restrict the reorganization of their transmission machinery, and prevent them from undergoing structural or morphological changes. At the same time, they are expected to retain some degree of flexibility to permit occasional events of synaptic plasticity. The recent understanding that structural changes to synapses are significantly more frequent than previously assumed (occurring even on a timescale of minutes) has called for a mechanism that allows continual and energy-efficient remodeling of the ECM at synapses. In this talk, I review our recent work showcasing such a process, based on the constitutive recycling of synaptic ECM molecules. I discuss the key characteristics of this mechanism, focusing on its roles in mediating synaptic transmission and plasticity, and speculate on additional potential functions in neuronal signaling.
NMC4 Short Talk: Predictive coding is a consequence of energy efficiency in recurrent neural networks
Predictive coding represents a promising framework for understanding brain function, postulating that the brain continuously inhibits predictable sensory input, ensuring preferential processing of surprising elements. A central aspect of this view of cortical computation is its hierarchical connectivity, involving recurrent message passing between excitatory bottom-up signals and inhibitory top-down feedback. Here we use computational modelling to demonstrate that such architectural hard-wiring is not necessary. Rather, predictive coding is shown to emerge as a consequence of energy efficiency, a fundamental requirement of neural processing. When training recurrent neural networks to minimise their energy consumption while operating in predictive environments, the networks self-organise into prediction and error units with appropriate inhibitory and excitatory interconnections and learn to inhibit predictable sensory input. We demonstrate that prediction units can reliably be identified through biases in their median preactivation, pointing towards a fundamental property of prediction units in the predictive coding framework. Moving beyond the view of purely top-down driven predictions, we demonstrate via virtual lesioning experiments that networks perform predictions on two timescales: fast lateral predictions among sensory units and slower prediction cycles that integrate evidence over time. Our results, which replicate across two separate data sets, suggest that predictive coding can be interpreted as a natural consequence of energy efficiency. More generally, they raise the question of which other computational principles of brain function can be understood as a result of physical constraints posed by the brain, opening up a new area of bio-inspired, machine learning-powered neuroscience research.
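A stripped-down sketch of this training setup (illustrative only; the penalty form, data, and network sizes are assumptions standing in for the paper's energy term and predictive environments):

```python
import torch
import torch.nn as nn

# Illustrative sketch: train an RNN for next-step prediction while
# penalizing its own activity, a simple stand-in for "energy consumption".
class Net(nn.Module):
    def __init__(self, n_in=10, n_hid=100):
        super().__init__()
        self.rnn = nn.RNN(n_in, n_hid, batch_first=True)
        self.out = nn.Linear(n_hid, n_in)

    def forward(self, x):
        h, _ = self.rnn(x)
        return self.out(h), h

net = Net()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(32, 50, 10)              # stand-in for predictable sensory input
pred, h = net(x)
prediction_loss = ((pred[:, :-1] - x[:, 1:]) ** 2).mean()  # predict next input
energy_loss = h.abs().mean()             # metabolic-cost proxy on unit activity
(prediction_loss + 0.1 * energy_loss).backward()
opt.step()
```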
NMC4 Short Talk: Rank similarity filters for computationally-efficient machine learning on high dimensional data
Real world datasets commonly contain nonlinearly separable classes, requiring nonlinear classifiers. However, these classifiers are less computationally efficient than their linear counterparts. This inefficiency wastes energy, resources and time. We were inspired by the efficiency of the brain to create a novel type of computationally efficient Artificial Neural Network (ANN) called Rank Similarity Filters. They can be used to both transform and classify nonlinearly separable datasets with many datapoints and dimensions. The weights of the filters are set using the rank orders of features in a datapoint, or optionally the 'confusion' adjusted ranks between features (determined from their distributions in the dataset). The activation strength of a filter determines its similarity to other points in the dataset, a measure based on cosine similarity. The activation of many Rank Similarity Filters transforms samples into a new nonlinear space suitable for linear classification (Rank Similarity Transform (RST)). We additionally used this method to create the nonlinear Rank Similarity Classifier (RSC), which is a fast and accurate multiclass classifier, and the nonlinear Rank Similarity Probabilistic Classifier (RSPC), which is an extension to the multilabel case. We evaluated the classifiers on multiple datasets and RSC is competitive with existing classifiers but with superior computational efficiency. Code for RST, RSC and RSPC is open source and was written in Python using the popular scikit-learn framework to make it easily accessible (https://github.com/KatharineShapcott/rank-similarity). In future extensions the algorithm can be applied to hardware suitable for the parallelization of an ANN (GPU) and a Spiking Neural Network (neuromorphic computing) with corresponding performance gains. This makes Rank Similarity Filters a promising biologically inspired solution to the problem of efficient analysis of nonlinearly separable data.
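A toy sketch of the core idea as described above (rank-order weights plus cosine-style similarity in rank space); the function names here are illustrative, and the real implementation lives at the linked repository:

```python
import numpy as np
from scipy.stats import rankdata

# Toy sketch: a filter's weights are the rank order of features in a stored
# exemplar; its activation is the cosine similarity between rank vectors.
def rank_unit(v):
    r = rankdata(v)
    return r / np.linalg.norm(r)

def transform(X, exemplars):
    """Rank-similarity transform: returns a samples x filters activation matrix."""
    F = np.stack([rank_unit(e) for e in exemplars])   # filters in rank space
    R = np.stack([rank_unit(x) for x in X])           # ranked, normalized samples
    return R @ F.T                                    # cosine similarities

X = np.random.rand(100, 20)
activations = transform(X, X[:10])   # use ten samples as filter exemplars
```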
NMC4 Keynote: A network perspective on cognitive effort
Cognitive effort has long been an important explanatory factor in the study of human behavior in health and disease. Yet, the biophysical nature of cognitive effort remains far from understood. In this talk, I will offer a network perspective on cognitive effort. I will begin by canvassing a recent perspective that casts cognitive effort in the framework of network control theory, developed and frequently used in systems engineering. The theory describes how much energy is required to move the brain from one activity state to another, when activity is constrained to pass along physical pathways in a connectome. I will then turn to empirical studies that link this theoretical notion of energy with cognitive effort in a behaviorally demanding task, and with a metabolic notion of energy as accessible to FDG-PET imaging. Finally, I will ask how this structurally-constrained activity flow can provide us with insights about the brain’s non-equilibrium nature. Using a general tool for quantifying entropy production in macroscopic systems, I will provide evidence to suggest that states of marked cognitive effort are also states of greater entropy production. Collectively, the work I discuss offers a complementary view of cognitive effort as a dynamical process occurring atop a complex network.
Nonequilibrium self-assembly and time-irreversibility in living systems
Far-from-equilibrium processes constantly dissipate energy while converting a free-energy source to another form of energy. Living systems, for example, rely on an orchestra of molecular motors that consume chemical fuel to produce mechanical work. In this talk, I will describe two features of life, namely, time-irreversibility, and nonequilibrium self-assembly. Time irreversibility is the hallmark of nonequilibrium dissipative processes. Detecting dissipation is essential for our basic understanding of the underlying physical mechanism, however, it remains a challenge in the absence of observable directed motion, flows, or fluxes. Additional difficulty arises in complex systems where many internal degrees of freedom are inaccessible to an external observer. I will introduce a novel approach to detect time irreversibility and estimate the entropy production from time-series measurements, even in the absence of observable currents. This method can be implemented in scenarios where only partial information is available and thus provides a new tool for studying nonequilibrium phenomena. Further, I will explore the added benefits achieved by nonequilibrium driving for self-assembly, identify distinctive collective phenomena that emerge in a nonequilibrium self-assembly setting, and demonstrate the interplay between the assembly speed, kinetic stability, and relative population of dynamical attractors.
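The starting point for such estimates is the standard identity relating entropy production to the distinguishability of forward and time-reversed trajectories:

$$ \dot{S} \;=\; \lim_{T \to \infty} \frac{k_B}{T}\, D_{\mathrm{KL}}\!\left( \mathcal{P}[x(t)] \,\big\|\, \tilde{\mathcal{P}}[\tilde{x}(t)] \right) \;\ge\; 0, $$

where $\mathcal{P}$ and $\tilde{\mathcal{P}}$ are the path probabilities of the process and of its time reversal; by the data processing inequality, evaluating the same quantity on partially observed coordinates yields a lower bound on the true entropy production.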
Edge Computing using Spiking Neural Networks
Deep learning has made tremendous progress in recent years, but its high computational and memory requirements impose challenges for using deep learning on edge devices. There has been some progress in lowering the memory requirements of deep neural networks (for instance, the use of half-precision), but there has been minimal effort in developing alternative efficient computational paradigms. Inspired by the brain, Spiking Neural Networks (SNNs) provide an energy-efficient alternative to conventional rate-based neural networks. However, SNN architectures that employ the traditional feedforward and feedback pass do not fully exploit the asynchronous event-based processing paradigm of SNNs. In the first part of my talk, I will present my work on predictive coding, which offers a fundamentally different approach to developing neural networks that are particularly suitable for event-based processing. In the second part of my talk, I will present our work on the development of approaches for SNNs that target specific problems like low response latency and continual learning. References: Dora, S., Bohte, S. M., & Pennartz, C. (2021). Deep Gated Hebbian Predictive Coding Accounts for Emergence of Complex Neural Response Properties Along the Visual Cortical Hierarchy. Frontiers in Computational Neuroscience, 65. Saranirad, V., McGinnity, T. M., Dora, S., & Coyle, D. (2021, July). DoB-SNN: A New Neuron Assembly-Inspired Spiking Neural Network for Pattern Classification. In 2021 International Joint Conference on Neural Networks (IJCNN) (pp. 1-6). IEEE. Machingal, P., Thousif, M., Dora, S., Sundaram, S., & Meng, Q. (2021). A Cross Entropy Loss for Spiking Neural Networks. Expert Systems with Applications (under review).
Efficient GPU training of SNNs using approximate RTRL
Last year’s SNUFA workshop report concluded: “Moving toward neuron numbers comparable with biology and applying these networks to real-world data-sets will require the development of novel algorithms, software libraries, and dedicated hardware accelerators that perform well with the specifics of spiking neural networks” [1]. Taking inspiration from machine learning libraries — where techniques such as parallel batch training minimise latency and maximise GPU occupancy — as well as our previous research on efficiently simulating SNNs on GPUs for computational neuroscience [2,3], we are extending our GeNN SNN simulator to pursue this vision. To explore GeNN’s potential, we use the eProp learning rule [4] — which approximates RTRL — to train SNN classifiers on the Spiking Heidelberg Digits and the Spiking Sequential MNIST datasets. We find that the performance of these classifiers is comparable to those trained using BPTT [5] and verify that the theoretical advantages of neuron models with adaptation dynamics [5] translate to improved classification performance. We then measured execution times and found that training an SNN classifier using GeNN and eProp becomes faster than SpyTorch and BPTT after less than 685 timesteps, and that much larger models can be trained on the same GPU when using GeNN. Furthermore, we demonstrate that our implementation of parallel batch training improves training performance by over 4× and enables near-perfect scaling across multiple GPUs. Finally, we show that performing inference with a recurrent SNN using GeNN uses less energy and has lower latency than a comparable LSTM simulated with TensorFlow [6].
Optimal initialization strategies for Deep Spiking Neural Networks
Recent advances in neuromorphic hardware and Surrogate Gradient (SG) learning highlight the potential of Spiking Neural Networks (SNNs) for energy-efficient signal processing and learning. Like in Artificial Neural Networks (ANNs), training performance in SNNs strongly depends on the initialization of synaptic and neuronal parameters. While there are established methods of initializing deep ANNs for high performance, effective strategies for optimal SNN initialization are lacking. Here, we address this gap and propose flexible data-dependent initialization strategies for SNNs.
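To give a flavor of what "data-dependent" can mean here (an illustrative sketch, not the authors' method): rescale each neuron's input weights so that the synaptic drive it receives on a batch of real data has a chosen magnitude.

```python
import numpy as np

# Illustrative sketch of a data-dependent initialization (not the authors'
# method): rescale each neuron's input weights so its summed synaptic drive,
# measured on a batch of real data, has a target standard deviation.
def rescale_weights(W, X, target_std=1.0, eps=1e-8):
    """W: neurons x inputs, X: samples x inputs."""
    drive_std = (X @ W.T).std(axis=0)            # per-neuron drive on real data
    return W * (target_std / (drive_std + eps))[:, None]

rng = np.random.default_rng(1)
W = rng.normal(0, 1, (128, 700))                 # e.g. hidden x input channels
X = rng.poisson(0.2, (256, 700)).astype(float)   # stand-in for spike-count data
W = rescale_weights(W, X)
```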
The brain control of appetite: Can an old dog teach us new tricks?
It is clear that obesity is a result of eating more than you burn. It is physics. What is more complex to answer is why some people eat more than others. Differences in our genetic make-up mean some of us are slightly more hungry all the time and so eat more than others. We now know that the genetics of body weight, of which obesity sits at one end of the spectrum, is in actuality the genetics of appetite control. In contrast to the prevailing view, body weight is not a choice. People who are obese are not bad or lazy; rather, they are fighting their biology.
Designing temporal networks that synchronize under resource constraints
Being fundamentally a non-equilibrium process, synchronization comes with unavoidable energy costs and has to be maintained under the constraint of limited resources. Such resource constraints are often reflected as a finite coupling budget available in a network to facilitate interaction and communication. In this talk, I will show that introducing temporal variation in the network structure can lead to efficient synchronization even when stable synchrony is impossible in any static network under the given budget. Our strategy is based on an open-loop control scheme and alludes to a fundamental advantage of temporal networks. Whether this advantage of temporality can be utilized in the brain is an interesting open question.
Sympathetic nerve remodeling in adipose tissue
Sympathetic nerve activation of adrenergic receptors on fat is the major pathway the brain uses to drive non-shivering thermogenesis in brown adipose tissue and lipolysis in white fat. There is accumulating evidence that the peripheral nerve architecture inside organs is plastic (can be remodeled), but the factors and conditions that regulate or result in remodeling are largely unknown. For fat in particular, it remains unclear whether nerves in fat can be remodeled in step with hyperplasia/hypertrophy of adipose tissue as a result of a prolonged energy surfeit. This talk will discuss our recent work identifying the sympathetic nerve architecture in adipose tissue as highly plastic in response to the adipose hormone leptin, the brain circuitry leptin acts on to regulate this, and the physiological effects that remodeling of innervation has on fat tissue function.
Neocortex saves energy by reducing coding precision during food scarcity
Information processing is energetically expensive. In the mammalian brain, it is unclear how information coding and energy usage are regulated during food scarcity. We addressed this in the visual cortex of awake mice using whole-cell patch clamp recordings and two-photon imaging to monitor layer 2/3 neuronal activity and ATP usage. We found that food restriction resulted in energy savings through a decrease in AMPA receptor conductance, reducing synaptic ATP usage by 29%. Neuronal excitability was nonetheless preserved by a compensatory increase in input resistance and a depolarized resting membrane potential. Consequently, neurons spiked at similar rates as controls, but spent less ATP on underlying excitatory currents. This energy-saving strategy had a cost since it amplified the variability of visually-evoked subthreshold responses, leading to a 32% broadening in orientation tuning and impaired fine visual discrimination. These findings reveal novel mechanisms that dynamically regulate energy usage and coding precision in neocortex.
Synaptic health in Parkinson's Disease
Parkinson's disease (PD) is the second most common neurodegenerative disorder, affecting 1% of people over 65; there is currently no effective treatment. Dopaminergic neuronal loss is a hallmark of PD, and yet despite decades of intensive research there is still no known therapeutic that will completely halt the disorder. As a result, identification of interventional therapies to reverse or prevent PD is essential. Using genetically faithful models (induced pluripotent stem cells and knock-in mice) of familial late-onset PD (LRRK2 G2019S and GBA N370S), we have contributed to the literature showing that neuronal dysfunction precedes degeneration. Specifically, using whole-cell patch clamp electrophysiology, biochemical, behavioural and molecular biological techniques, we have begun to investigate the fundamental processes that make neurons specialised, i.e., synaptic function and neurotransmission. We illustrate that alterations to spontaneous neurotransmitter release, neuronal firing, and short-term plasticity, as well as Ca2+ and energy dyshomeostasis, are among the earliest observable pathological dysfunctions and are likely precursors to late-stage degeneration. These pathologies represent targets which can be manipulated to address causation, rather than the symptoms, of PD, and represent a marker that, if measurable in patients, could form the basis of early PD detection and intervention.
Targeting the brain to improve obesity and type 2 diabetes
The increasing prevalence of obesity and type 2 diabetes (T2D) and the associated morbidity and mortality emphasize the need for a more complete understanding of the mechanisms mediating energy homeostasis to accelerate the identification of new medications. Recent reports indicate that the obesity medication lorcaserin, a 5-hydroxytryptamine (5-HT, serotonin) 2C receptor (5-HT2CR) agonist, improves glycemic control in association with weight loss in obese patients with T2D. We examined whether lorcaserin has a direct effect on insulin sensitivity and how this effect is achieved. We show that lorcaserin dose-dependently improves glycemic control in a mouse model of T2D without altering body weight. Examining the mechanism of this effect, we reveal a necessary and sufficient neurochemical mediator of lorcaserin's glucoregulatory effects: activation of brain pro-opiomelanocortin (POMC) peptides. We observed that lorcaserin reduces hepatic glucose production and improves insulin sensitivity. These data suggest that lorcaserin's action within the brain represents a mechanistically novel treatment for T2D: findings of significance to a prevalent global disease.
As soon as there was life there was danger
Organisms face challenges to survival throughout life. When we freeze or flee in danger, we often feel fear. Tracing the deep history of danger gives a different perspective. The first cells living billions of years ago had to detect and respond to danger in order to survive. Life is about not being dead, and behavior is a major way that organisms hold death off. Although behavior does not require a nervous system, complex organisms have brain circuits for detecting and responding to danger, the deep roots of which go back to the first cells. But these circuits do not make fear, and fear is not the cause of why we freeze or flee. Fear is a human invention; a construct we use to account for what happens in our minds when we become aware that we are in harm's way. This requires a brain that can personally know that it existed in the past, that it is the entity that might be harmed in the present, and that it will cease to exist in the future. If other animals have conscious experiences, they cannot have the kinds of conscious experiences we have because they do not have the kinds of brains we have. This is not meant as a denial of animal consciousness; it is simply a statement about the fact that every species has a different brain. Nor is it a declaration about the wonders of the human brain, since we have done some wonderful, but also horrific, things with our brains. In fact, we are on the way to a climate disaster that will not, as some suggest, destroy the Earth. But it will make it uninhabitable for our kind, and for other organisms with high energy demands. Bacteria have made it for billions of years and will likely be fine. The rest is up for grabs, and, in a very real sense, up to us.
Sleepless in Vienna - how to rescue folding-deficient dopamine transporters by pharmacochaperoning
Diseases that arise from misfolding of an individual protein are rare. Collectively, however, these folding diseases represent a large proportion of hereditary and acquired disorders. In fact, the term "Molecular Medicine" was coined by Linus Pauling in conjunction with the study of a folding disease, i.e., sickle cell anemia. In the past decade, we have witnessed an exponential growth in the number of mutations identified in genes encoding solute carriers (SLC). A sizable fraction - presumably the majority - of these mutations result in misfolding of the encoded protein. While studying the export of the GABA transporter (SLC6A1) and of the serotonin transporter (SLC6A4) from the endoplasmic reticulum (ER), we discovered by serendipity that some ligands can correct the folding defect imparted by point mutations. These ligands bind to the inward-facing state. The most effective compound is noribogaine, the metabolite of ibogaine (an alkaloid first isolated from the shrub Tabernanthe iboga). There are 13 mutations in the human dopamine transporter (DAT, SLC6A3) which give rise to a syndrome of infantile Parkinsonism and dystonia. We capitalized on our insights to explore whether the disease-relevant mutant proteins were amenable to pharmacological correction. Drosophila melanogaster lacking the dopamine transporter are hyperactive and sleepless (fumin in Japanese). Mutated human DAT variants can thus be introduced into fumin flies. This allows for examining the effect of pharmacochaperones on the delivery of DAT to the axonal territory and on restoring sleep. We explored the chemical space populated by variations of the ibogaine structure to identify an analogue (referred to as compound 9b) which was highly effective: compound 9b also restored folding in DAT variants that were not amenable to rescue by noribogaine. Deficiencies in the human creatine transporter-1 (CrT1, SLC6A8) give rise to a syndrome of intellectual disability and seizures and account for 5% of genetically based intellectual disabilities in boys. Point mutations occur, in part, at positions homologous to those of folding-deficient DAT variants. CrT1 lacks the rich pharmacology of the monoamine transporters. Nevertheless, our insights are also applicable to rescuing some disease-related variants of CrT1. Finally, the question arises of how one can address the folding problem. We propose a two-pronged approach: (i) analyzing the effect of mutations on the transport cycle by electrophysiological recordings, which allows for extracting information on the rates of conformational transitions (the underlying assumption posits that, even when remedied by pharmacochaperoning, folding-deficient mutants must differ in the conformational transitions associated with the transport cycle); and (ii) analyzing the effect of mutations on the two components of protein stability, i.e., thermodynamic and kinetic stability. This is expected to provide a glimpse of the energy landscape which governs the folding trajectory.
Dynamical Neuromorphic Systems
In this talk, I aim to show that the dynamical properties of emerging nanodevices can accelerate the development of smart and environmentally friendly chips that inherently learn through their physics. The goal of neuromorphic computing is to draw inspiration from the architecture of the brain to build low-power circuits for artificial intelligence. I will first give a brief overview of the state of the art of neuromorphic computing, highlighting the opportunities offered by emerging nanodevices in this field and the associated challenges. I will then show that the intrinsic dynamical properties of these nanodevices can be exploited at the device and algorithmic levels to assemble systems that infer and learn through their physics. I will illustrate these possibilities with examples from our work on spintronic neural networks that communicate and compute through their microwave oscillations, and on an algorithm called Equilibrium Propagation that minimizes both the error and the energy of a dynamical system.
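To make the Equilibrium Propagation idea concrete, here is a minimal, hedged sketch in Python/NumPy (the toy Hopfield-style energy, the network size, and all parameter values are illustrative assumptions, not the implementation used in the work described above). The network first relaxes to a free equilibrium, then to a weakly output-nudged equilibrium, and each weight is updated from the difference of purely local correlations between the two phases:

```python
import numpy as np

rng = np.random.default_rng(0)

rho = lambda u: np.clip(u, 0.0, 1.0)                 # hard-sigmoid activation
drho = lambda u: ((u > 0) & (u < 1)).astype(float)   # its derivative

n_in, n_out, n = 5, 3, 20                            # toy sizes (assumed)
W = 0.1 * rng.standard_normal((n, n))
W = 0.5 * (W + W.T)
np.fill_diagonal(W, 0.0)                             # symmetric, no self-coupling

def relax(u, x, y=None, beta=0.0, steps=200, dt=0.1):
    """Gradient descent on E(u) = 0.5*|u|^2 - 0.5*rho(u)^T W rho(u),
    plus beta * 0.5*|u_out - y|^2 when the output is nudged."""
    for _ in range(steps):
        grad = u - drho(u) * (W @ rho(u))
        if y is not None:
            grad[-n_out:] += beta * (u[-n_out:] - y)
        u = u - dt * grad
        u[:n_in] = x                                 # inputs stay clamped
    return u

def eqprop_step(x, y, beta=0.5, lr=0.05):
    global W
    u_free = relax(np.zeros(n), x)                   # free phase
    u_nudged = relax(u_free.copy(), x, y, beta)      # weakly nudged phase
    r0, r1 = rho(u_free), rho(u_nudged)
    dW = (np.outer(r1, r1) - np.outer(r0, r0)) / beta  # contrast of local correlations
    W += lr * 0.5 * (dW + dW.T)
    np.fill_diagonal(W, 0.0)

eqprop_step(rng.random(n_in), rng.random(n_out))
```

Because the update compares only local quantities at two equilibria, it is a natural fit for physical substrates that settle into energy minima through their intrinsic dynamics, which is the connection to nanodevices drawn in the talk.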
Central representations of protein availability regulating appetite and body weight control
Dietary protein quantity and quality greatly impact metabolic health via evolutionarily conserved mechanisms that ensure avoidance of amino acid-imbalanced food sources, promote hyperphagia when dietary protein density is low, and conversely produce satiety when dietary protein density is high. Growing evidence supports the emerging concept of protein homeostasis in mammals, whereby protein intake is maintained within a tight range, independently of energy intake, to reach a target protein intake. The behavioural and neuroendocrine mechanisms underlying these adaptations are unclear and form the focus of our research.
Causal coupling between neural activity, metabolism, and behavior across the Drosophila brain
Coordinated activity across networks of neurons is a hallmark of both resting and active behavioral states in many species, including worms, flies, fish, mice, and humans. These global patterns alter energy metabolism in the brain over seconds to hours, making oxygen consumption and glucose uptake widely used proxies of neural activity. However, whether changes in neural activity are causally related to changes in metabolic flux in intact circuits, on the sub-second timescales associated with behavior, is unclear. Moreover, it is unclear whether differences between rest and action are associated with spatiotemporally structured changes in neuronal energy metabolism at the subcellular level. My work combines two-photon microscopy across the fruit fly brain with sensors that allow simultaneous measurements of neural activity and metabolic flux across both resting and active behavioral states. It demonstrates that neural activity drives changes in metabolic flux, creating a tight coupling between these signals that can be measured across large-scale brain networks. Further, using local optogenetic perturbation, I show that even transient increases in neural activity result in rapid and persistent increases in cytosolic ATP, suggesting that neuronal metabolism predictively allocates resources to meet the energy demands of future neural activity. Finally, these studies reveal that the initiation of even minimal behavioral movements causes large-scale changes in the pattern of neural activity and energy metabolism, revealing unexpectedly widespread engagement of the central brain.
Brain-body interactions in the metabolic/nutritional control of puberty: Neuropeptide pathways and central energy sensors
Puberty is a brain-driven phenomenon under the control of sophisticated regulatory networks that integrate a large number of endogenous and environmental signals, including metabolic and nutritional cues. Puberty onset is tightly bound to the state of body energy reserves, and deregulation of energy/metabolic homeostasis is often associated with alterations in the timing of puberty. However, despite recent progress in the field, our knowledge of the specific molecular mechanisms and pathways whereby the brain decodes metabolic information to modulate puberty onset remains fragmentary and incomplete. Compelling evidence, gathered over the last fifteen years, supports an essential role of hypothalamic neurons producing kisspeptins, encoded by Kiss1, in the neuroendocrine control of puberty. Kiss1 neurons are major components of the hypothalamic GnRH pulse generator, whose full activation is mandatory for pubertal onset. Kiss1 neurons seemingly participate in transmitting the regulatory actions of metabolic cues on pubertal maturation. However, the modulatory influence of metabolic signals (e.g., leptin) on Kiss1 neurons might be predominantly indirect and likely also involves interactions with other transmitters and neuronal populations. In my presentation, I will review recent work of our group, using preclinical models, addressing the molecular mechanisms whereby Kiss1 neurons are modulated by metabolic signals and thereby contribute to the nutritional control of puberty. In this context, the putative roles of the energy/metabolic sensors AMP-activated protein kinase (AMPK) and SIRT1 in the metabolic control of Kiss1 neurons and puberty will be discussed. In addition, I will summarize recent findings from our team pointing to a role of central de novo ceramide signaling in mediating the impact of obesity on (earlier) puberty onset, via non-canonical, kisspeptin-related pathways. These findings are of translational interest, as perturbations of these molecular pathways could contribute to the alterations of pubertal timing linked to conditions of metabolic stress in humans, ranging from malnutrition to obesity, and might become druggable targets for better management of pubertal disorders.
Trapping active particles up to the limiting case: bacteria enclosed in a biofilm
Active matter systems are composed of constituents, each one out of equilibrium, that consume energy in order to move [1]. A characteristic feature of active matter is collective motion leading to nonequilibrium phase transitions or large-scale directed motion [2]. A number of recent works have featured active particles interacting with obstacles, either moving or fixed [3,4,5]. When an active particle encounters an asymmetric obstacle, different behaviours are detected depending on the nature of its active motion. On the one hand, rectification effects arise in a suspension of run-and-tumble particles interacting with a wall of funnel-shaped openings, caused by the particles' persistence length [6]. The same trapping mechanism could be responsible for the intake of microorganisms into the underground leaves [7] of carnivorous plants [8]. On the other hand, for aligning particles [9] interacting with a wall of funnel-shaped openings, trapping happens on the (opposite) wider opening side of the funnels [10,11]. Interestingly, when funnels are located on a circular array, trapping is more localised and depends on the nature of the Vicsek model (see the sketch after the references). Active particles can be synthetic (such as synthetic active colloids) or alive (such as living bacteria). A prototypical model organism for studying living microswimmers is P. fluorescens, a rod-shaped, biofilm-forming bacterium. Biofilms are microbial communities self-assembled onto external interfaces. Biofilms can be described within the soft matter physics framework [12] as a viscoelastic material consisting of colloids (bacterial cells) embedded in a cross-linked polymer gel (polysaccharides cross-linked via proteins/multivalent cations), whose water content varies depending on the environmental conditions. Bacteria embedded in the polymeric matrix control biofilm structure and mechanical properties by regulating its matrix composition. We have recently monitored structural features of Pseudomonas fluorescens biofilms grown with and without hydrodynamic stress [13,14]. We have demonstrated that bacteria are capable of self-adapting to hostile hydrodynamic stress by tailoring the biofilm chemical composition, thus affecting both the mesoscale structure of the matrix and its viscoelastic properties, which ultimately regulate the bacteria-polymer interactions.
REFERENCES: [1] C. Bechinger et al., Rev. Mod. Phys. 88, 045006 (2016); [2] T. Vicsek and A. Zafeiris, Phys. Rep. 517, 71 (2012); [3] C. Bechinger, R. Di Leonardo, H. Löwen, C. Reichhardt, G. Volpe, and G. Volpe, Rev. Mod. Phys. 88, 045006 (2016); [4] R. Martinez, F. Alarcon, D. R. Rodriguez, J. L. Aragones, and C. Valeriani, Eur. Phys. J. E 41, 1 (2018); [5] D. R. Rodriguez, F. Alarcon, R. Martinez, J. Ramírez, and C. Valeriani, Soft Matter 16 (5), 1162 (2020); [6] C. O. Reichhardt and C. Reichhardt, Annu. Rev. Condens. Matter Phys. 8, 51 (2017); [7] W. Barthlott, S. Porembski, E. Fischer, and B. Gemmel, Nature 392, 447 (1998); [8] C. B. Giuliano, R. Zhang, R. Martinez Fernandez, C. Valeriani, and L. Wilson (in preparation, 2021); [9] R. Martinez, F. Alarcon, J. L. Aragones, and C. Valeriani, Soft Matter 16 (20), 4739 (2020); [10] P. Galajda, J. Keymer, P. Chaikin, and R. Austin, J. Bacteriol. 189, 8704 (2007); [11] M. Wan, C. O. Reichhardt, Z. Nussinov, and C. Reichhardt, Phys. Rev. Lett. 101, 018102 (2008); [12] J. N. Wilking, T. E. Angelini, A. Seminara, M. P. Brenner, and D. A. Weitz, MRS Bull. 36, 385 (2011); [13] J. Jara, F. Alarcón, A. K. Monnappa, J. Ignacio Santos, V. Bianco, P. Nie, M. Pica Ciamarra, A. Canales, L. Dinis, I. López-Montero, C. Valeriani, and B. Orgaz, Front. Microbiol. 11, 3460 (2021); [14] P. Nie, F. Alarcon, I. López-Montero, B. Orgaz, C. Valeriani, and M. Pica Ciamarra
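As promised above, a minimal sketch of the standard Vicsek update (Python/NumPy) illustrates the aligning dynamics underlying refs. [9-11]; the box size, noise amplitude, and other parameters are illustrative assumptions, and the funnel-shaped obstacles themselves are not modelled here:

```python
import numpy as np

rng = np.random.default_rng(1)

N, L = 300, 10.0          # number of particles, periodic box size (assumed)
v0, R, eta = 0.3, 1.0, 0.4  # speed, interaction radius, angular noise amplitude

pos = rng.random((N, 2)) * L
theta = rng.uniform(-np.pi, np.pi, N)

def vicsek_step(pos, theta):
    """One update of the standard Vicsek model: align with neighbors, add noise."""
    # pairwise displacements with periodic (minimum-image) boundary conditions
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    neighbors = (d ** 2).sum(-1) < R ** 2          # boolean N x N, includes self
    # circular mean of neighbor headings, then uniform angular noise
    sin_m = (neighbors * np.sin(theta)[None, :]).sum(1)
    cos_m = (neighbors * np.cos(theta)[None, :]).sum(1)
    theta = np.arctan2(sin_m, cos_m) + rng.uniform(-eta / 2, eta / 2, N)
    pos = (pos + v0 * np.column_stack((np.cos(theta), np.sin(theta)))) % L
    return pos, theta

for _ in range(100):
    pos, theta = vicsek_step(pos, theta)

# polar order parameter: ~0 in the disordered gas, ~1 for collective motion
phi = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"polar order = {phi:.2f}")
```

The polar order parameter distinguishes the disordered gas from collective motion; in the trapping studies, it is this alignment interaction that moves accumulation to the wide side of the funnel wall, opposite to the run-and-tumble case.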
Energy landscapes, order and disorder, and protein sequence coevolution: From proteins to chromosome structure
In vivo, the human genome folds into a characteristic ensemble of 3D structures, but the mechanism driving the folding process remains unknown. I will present a theoretical model for chromatin (the minimal chromatin model) that explains the folding of interphase chromosomes and generates chromosome conformations consistent with experimental data. The energy landscape of the model was derived using the maximum entropy principle and relies on two experimentally derived inputs: a classification of loci into chromatin types and a catalog of the positions of chromatin loops. This model was then generalized by utilizing a neural network to infer these chromatin types from the epigenetic marks present at a locus, as assayed by ChIP-Seq. The ensemble of structures resulting from these simulations is fully consistent with Hi-C data and exhibits unknotted chromosomes, phase separation of chromatin types, and a tendency for open chromatin to lie at the periphery of chromosome territories. Although this theoretical methodology was trained on one cell line, human GM12878 lymphoblastoid cells, it has successfully predicted the structural ensembles of multiple human cell lines. Finally, going beyond Hi-C, our predicted structures are also consistent with microscopy measurements. Analysis of structures from both simulation and microscopy reveals that short segments of chromatin make two-state transitions between closed conformations and open dumbbell conformations. For gene-active segments, the vast majority of genes appear clustered in the linker region of the chromatin segment, allowing us to speculate about possible mechanisms by which chromatin structure and dynamics may be involved in controlling gene expression. * Supported by the NSF
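The maximum entropy logic can be summarized schematically (a generic form with assumed notation, not quoted from the talk: $T(i)$ is the chromatin type of locus $i$ and $f$ is a contact function; the specific functional forms of the minimal chromatin model are omitted). One seeks the least-biased ensemble whose average contacts match experiment:

$$P(\mathbf{r}) = \frac{1}{Z}\, e^{-\beta U(\mathbf{r})}, \qquad U(\mathbf{r}) = U_{\mathrm{polymer}}(\mathbf{r}) + \sum_{i<j} \lambda_{T(i)\,T(j)}\, f(r_{ij}) + \sum_{(i,j)\in\mathrm{loops}} \lambda_{ij}\, f(r_{ij}),$$

where the Lagrange multipliers $\lambda$ are the only fitted parameters, adjusted until the simulated contact frequencies reproduce the experimental (Hi-C) ones.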
Microorganism locomotion in viscoelastic fluids
Many microorganisms and cells function in complex (non-Newtonian) fluids, which are mixtures of different materials and exhibit both viscous and elastic stresses. For example, mammalian sperm swim through cervical mucus on their journey through the female reproductive tract, and they must penetrate the viscoelastic gel outside the ovum to fertilize it. In micro-scale swimming, the dynamics emerge from the coupled interactions between the complex rheology of the surrounding media and the passive and active body dynamics of the swimmer. We use computational models of swimmers in viscoelastic fluids to investigate and provide mechanistic explanations for emergent swimming behaviors. I will discuss how flexible filaments (such as flagella) can store energy from a viscoelastic fluid to gain stroke boosts from fluid elasticity. I will also describe 3D simulations of model organisms such as C. reinhardtii and mammalian sperm, where we use experimentally measured stroke data to separate naturally coupled stroke and fluid effects. We explore why strokes that are adapted to Newtonian fluid environments might not perform well in viscoelastic environments.
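For orientation, one constitutive law widely used in computational studies of swimming in viscoelastic fluids (named here as an assumption; the abstract does not specify the model used) is the Oldroyd-B fluid, in which the polymer stress $\boldsymbol{\tau}_p$ relaxes over a time $\lambda$:

$$\boldsymbol{\tau}_p + \lambda\,\overset{\nabla}{\boldsymbol{\tau}_p} = \eta_p\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf{T}}\right), \qquad \overset{\nabla}{\boldsymbol{\tau}_p} \equiv \partial_t\boldsymbol{\tau}_p + (\mathbf{u}\cdot\nabla)\boldsymbol{\tau}_p - (\nabla\mathbf{u})\,\boldsymbol{\tau}_p - \boldsymbol{\tau}_p\,(\nabla\mathbf{u})^{\mathsf{T}},$$

with $(\nabla\mathbf{u})_{ij} = \partial u_i/\partial x_j$ and polymer viscosity $\eta_p$. The elastic energy storage discussed for flexible flagella enters through $\lambda$: a stroke of frequency $\omega$ probes the fluid's elasticity when the Deborah number $\mathrm{De} = \lambda\omega$ is of order one or larger.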
Deciphering the kisspeptin-melanocortin pathways underlying the regulation of energy expenditure
Inertial active soft matter
Active particles, which are self-propelled by converting energy into mechanical motion, represent an expanding research realm in physics and chemistry. For micron-sized particles moving in a liquid ("microswimmers"), most of the basic features have been described using the model of overdamped active Brownian motion [1]. However, for macroscopic particles, or for microparticles moving in a gas, inertial effects become relevant, such that the dynamics is underdamped. Therefore, active particles with inertia have recently been described by extending the active Brownian motion model to active Langevin dynamics, which includes inertia [2]. In this talk, recent developments on active particles with inertia ("microflyers", "hoppers", or "runners") are summarized, including: inertial delay effects between particle velocity and self-propulsion direction [3], tuning of the long-time self-diffusion by the moment of inertia [3], the influence of inertia on motility-induced phase separation and the cluster growth exponent [4], and the formation of active micelles ("rotelles") by inertial active surfactants [5]. References: [1] C. Bechinger, R. Di Leonardo, H. Löwen, C. Reichhardt, G. Volpe, and G. Volpe, Rev. Mod. Phys. 88, 045006 (2016). [2] H. Löwen, J. Chem. Phys. 152, 040901 (2020). [3] C. Scholz, S. Jahanshahi, A. Ldov, and H. Löwen, Nat. Commun. 9, 5156 (2018). [4] S. Mandal, B. Liebchen, and H. Löwen, Phys. Rev. Lett. 123, 228001 (2019). [5] C. Scholz, A. Ldov, T. Pöschel, M. Engel, and H. Löwen, Surfactants and rotelles in active chiral fluids, to be published.
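The underdamped model referred to in [2,3] can be written schematically as follows (a standard two-dimensional form; the notation is assumed rather than quoted from the references):

$$m\,\dot{\mathbf{v}} = -\gamma\,\mathbf{v} + \gamma v_0\,\hat{\mathbf{n}}(\phi) + \gamma\sqrt{2D_t}\,\boldsymbol{\xi}(t), \qquad J\,\dot{\omega} = -\gamma_r\,\omega + \gamma_r\sqrt{2D_r}\,\eta(t), \qquad \dot{\phi} = \omega,$$

where $\hat{\mathbf{n}} = (\cos\phi, \sin\phi)$ is the self-propulsion direction, $m$ the mass, and $J$ the moment of inertia; the overdamped active Brownian model of [1] is recovered for $m, J \to 0$. The inertial delay of [3] is the lag of $\mathbf{v}$ behind $v_0\hat{\mathbf{n}}$ set by the finite relaxation time $m/\gamma$.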
Bend, slip, or break?
Rigidity is the ability of a system to resist imposed stresses before ultimately undergoing failure. However, disordered materials often contain both rigid and floppy subregions, which complicates the utility of taking system-wide averages. I will discuss three frameworks capable of connecting the internal structure of disordered materials to their rigidity and/or failure under loading, and describe how my collaborators and I have applied these frameworks to laboratory data on laser-cut lattices and idealized granular materials. These are, in order of increasing physics content: (1) centrality within an adjacency matrix describing the system's connectivity, (2) Maxwell constraint counting on the full network of frictional contact forces, and (3) the vibrational modes of a synthetic dynamical matrix (Hessian). The first two rely primarily on topology, and the second two contrast the utility of considering interparticle forces (Coulomb failure) versus the energy landscape. All three methods, while successfully elucidating the origins of rigidity and of brittle versus ductile failure, also provide interesting counterpoints regarding how much information is enough to make predictions.
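Framework (2) can be made concrete with a small hedged sketch in Python (the bookkeeping below is the textbook Maxwell count for particle packings, given as an illustration rather than taken from the talk): frictionless contacts each remove one degree of freedom, frictional contacts remove d, and the global rigid-body motions are subtracted.

```python
def maxwell_rigid(n_particles, contacts, d=2, frictional=False):
    """Global Maxwell constraint count for a particle packing.

    A frictionless contact constrains 1 degree of freedom (normal force
    only); a frictional contact constrains d (normal plus tangential).
    The d*(d+1)/2 global rigid-body motions are subtracted from the
    d*N translational degrees of freedom."""
    per_contact = d if frictional else 1
    n_constraints = per_contact * len(contacts)
    n_dof = d * n_particles - d * (d + 1) // 2
    return n_constraints, n_dof, n_constraints >= n_dof

# Toy example: a square of 4 particles with one diagonal brace is
# minimally rigid in 2D (5 constraints vs. 5 internal degrees of freedom).
nc, nf, rigid = maxwell_rigid(4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
print(f"constraints={nc}, dof={nf}, rigid by Maxwell count: {rigid}")
```

In the bulk limit this reduces to the familiar isostatic criterion on the mean coordination number, z >= 2d for frictionless packings and z >= d + 1 with friction; the interesting question raised in the talk is when such purely topological counts succeed or fail compared with force-based and Hessian-based criteria.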
Non-equilibrium molecular assembly in reshaping and cutting cells
A key challenge in modern soft matter is to identify the principles that govern organisation and functionality in non-equilibrium systems. Current research efforts largely focus on non-equilibrium processes that occur either at the single-molecule scale (e.g., protein and DNA conformations under driving forces) or at the scale of whole tissues, organisms, and active colloidal and microscopic objects. However, the range of scales in between - from molecules to large-scale molecular assemblies that consume energy and perform work - remains under-explored. This is, nevertheless, the scale that is crucial for the function of a living cell, where molecular self-assembly driven far from equilibrium produces the mechanical work needed for cell reshaping, transport, motility, division, and healing. Today I will discuss physical modelling of active elastic filaments, called ESCRT-III filaments, that dynamically assemble and disassemble on cell membranes. This dynamic assembly changes the filaments' shape and mechanical properties and leads to the remodelling and cutting of cells. I will present a range of experimental comparisons with our simulation results: from ESCRT-III-driven trafficking in eukaryotes to the division of evolutionarily simple archaeal cells.
Receptor Costs Determine Retinal Design
Our group is interested in discovering the design principles that govern the structure and function of neurons and neural circuits. We record from well-defined neurons, mainly in flies' visual systems, to measure the molecular and cellular factors that determine relevant measures of performance, such as representational capacity, dynamic range, and accuracy. We combine this empirical approach with modelling to see how the basic elements of neural systems (ion channels, second messenger systems, membranes, synapses, neurons, circuits, and codes) combine to determine performance. We are investigating four general problems. How are circuits designed to integrate information efficiently? How do sensory adaptation and synaptic plasticity contribute to efficiency? How do the sizes of neurons and networks relate to energy consumption and representational capacity? To what extent have energy costs shaped neurons, sense organs, and brain regions during evolution?
Correlations, chaos, and criticality in neural networks
The remarkable information-processing properties of biological and artificial neuronal networks alike arise from the interaction of large numbers of neurons. A central quest is thus to characterize their collective states. Moreover, the directed coupling between pairs of neurons and their continuous dissipation of energy place the dynamics of neuronal networks outside thermodynamic equilibrium. Tools from non-equilibrium statistical mechanics and field theory are thus instrumental for obtaining a quantitative understanding. We here present progress with this recent approach [1]. On the experimental side, we show how correlations between pairs of neurons are informative about the dynamics of cortical networks: they are poised near a transition to chaos [2]. Close to this transition, we find prolonged sequential memory for past signals [3]. In the chaotic regime, networks offer representations of information whose dimensionality expands with time. We show how this mechanism aids classification performance [4]. Together these works illustrate the fruitful interplay between theoretical physics, neuronal networks, and neural information processing.
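The transition to chaos has a classic illustration (given here as a textbook reference point, not necessarily the model analyzed in [2]): a random rate network

$$\dot{x}_i = -x_i + \sum_{j=1}^{N} J_{ij}\,\phi(x_j), \qquad J_{ij} \sim \mathcal{N}\!\left(0,\; \frac{g^2}{N}\right),$$

with $\phi = \tanh$ has, for large $N$, a stable fixed point for coupling gain $g < 1$ and chaotic dynamics for $g > 1$ (Sompolinsky, Crisanti & Sommers, 1988). "Poised near the transition" corresponds to $g \approx 1$, where slow modes emerge that can support the prolonged sequential memory described above.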
Blurring the boundaries between neuroscience and organismal physiology
Work in my laboratory is based on two assumptions: that we do not yet know how all physiological functions are regulated, and that mouse genetics, by allowing the identification of novel inter-organ communications, is the most efficient way to identify novel regulation of physiological functions. We test these two contentions through the study of bone, the organ my lab has studied since its inception. Based on precise cell biological and clinical reasons that will be presented during the seminar, we hypothesized that bone should be a regulator of energy metabolism and reproduction, and we identified a bone-derived hormone, termed osteocalcin, that is responsible for these regulatory events. The study of this hormone revealed that, in addition to its predicted functions, it also regulates brain size and hippocampus development, prevents anxiety and depression, and favors spatial learning and memory by signaling through a specific receptor we characterized. As will be presented, we elucidated some of the molecular events accounting for the influence of osteocalcin on the brain and showed that maternal osteocalcin is the pool of this hormone that affects brain development. Subsequently, looking at all the physiological functions regulated by osteocalcin, i.e., memory, the ability to exercise, glucose metabolism, and the regulation of testosterone biosynthesis, we realized that all are needed or regulated in the case of danger. In other words, this suggested that osteocalcin is a hormone needed to sense and overcome acute danger. Consonant with this hypothesis, we next demonstrated that bone, via osteocalcin, is needed to mount an acute stress response, through molecular and cellular mechanisms that will be presented during the seminar. Overall, an evolutionary appraisal of bone biology, this body of work, and experiments ongoing in the lab concur to suggest that 1) the appearance of bone during evolution has changed how physiological functions as diverse as memory, the acute stress response, exercise, and glucose metabolism are regulated, and 2) bone, with osteocalcin as its molecular vector, is an organ needed to sense and respond to danger.
Phospholipid regulation in cognitive impairment and vascular dementia
The imbalance of lipid metabolism in neurodegeneration is still poorly understood. Phospholipids (PLs) participate in multifactorial ways in vascular dementias such as Alzheimer's disease, post-stroke dementia, and CADASIL, among others. These roles include the hyperactivation of phospholipases, mitochondrial stress, peroxisomal dysfunction, and irregular fatty acid composition, triggering pro-inflammation at a very early stage of cognitive impairment. The reestablishment of physiological levels of cholesterol, sphingolipids, phospholipids, and other lipids is an interesting therapeutic target to reduce the progression of AD. We propose that BACE1 silencing rebalances the phospholipid profile in a desaturase enzyme-dependent manner, reducing the inflammatory response and recovering cognitive function in animal models of Alzheimer's disease and brain stroke. These findings point to a great need for new, well-designed research focused on preventing phospholipid imbalance and its consequent impairment of energy metabolism, pro-inflammation, and enzymatic over-processing, which would help to prevent unhealthy aging and AD progression.
Computational modelling of dentate granule cells reveals Pareto optimal trade-off between pattern separation and energy efficiency (economy)
Bernstein Conference 2024
Neuronal degeneracy: an information-energy trade-off?
Bernstein Conference 2024
Bayesian synaptic plasticity is energy efficient
COSYNE 2022
Energy efficient reinforcement learning as a matter of life and death
COSYNE 2022
How spiking neural networks can flexibly trade off performance and energy use
COSYNE 2022
Noradrenergic Modulation of Whole Brain Energy Landscape Mediates Perceptual Switches
COSYNE 2023
Cannabinoid CB1 receptors in oligodendrocytes: Modulation of energy metabolism and autoimmune demyelination
FENS Forum 2024
Criticality and generalization in hippocampal subregions reflect relationship predicted by the free-energy principle
FENS Forum 2024
Development of a next-generation bidirectional neurobiohybrid interface with optimized energy efficiency enabling real-time adaptive neuromodulation
FENS Forum 2024
Differential expression and enrichment analysis of brain energy metabolism genes in schizophrenia
FENS Forum 2024
Oxytocin and leptin crosstalk in the regulation of the energy balance
FENS Forum 2024
Preoptic PNOC neurons modulate energy expenditure and adipose tissue function
FENS Forum 2024
Reciprocal relationship between neural fibre span and node transmission in brain stabilizes information pathway and energy homeostasis across human life-span
FENS Forum 2024
Role of EphrinB3 in POMC neurons in the control of energy and glucose homeostasis
FENS Forum 2024
Role of hypothalamic NPGL/NPGM system in energy metabolism
FENS Forum 2024
The role of TRH neurons in energy homeostasis and regulation of brown adipose tissue
FENS Forum 2024
Role of Protein Kinase N1 in cerebral energy metabolism and stroke
FENS Forum 2024
SIK3 in different hypothalamic areas mediates whole-body energy balance
FENS Forum 2024