Numerical Simulations
Yashar Ahmadian
The postdoc will work on a collaborative project between the labs of Yashar Ahmadian at the Computational and Biological Learning Lab (CBL), and Zoe Kourtzi at the Psychology Department, both at the University of Cambridge. The project investigates the computational principles and circuit mechanisms underlying human visual perceptual learning, particularly the role of adaptive changes in the balance of cortical excitation and inhibition resulting from perceptual learning. The postdoc will be based in CBL, with free access to the Kourtzi lab in the Psychology department.
Mathieu Desroches
The aim of the project is to develop a multiscale model of Dravet syndrome, from ionic channels of interacting neurons to large neural populations. We will use various modeling frameworks, adapted to the scale, from piecewise-deterministic Markov processes to mean-field formalism. The postdoc will perform a mathematical analysis of the models, extensive numerical simulations as well as data analysis using neural recordings from our experimental partners.
Prof. Tatjana Tchumatchenko
Postdoc position: The postdoc will join a computational project addressing how neurons efficiently synthesize and distribute proteins to ensure these are readily available across all synapses, and will analyze data and model synaptic plasticity changes in order to understand health and disease states computationally. This work is centered on computational tools, includes pen-and-paper calculations, data analysis, and numerical simulations, and requires an interdisciplinary mindset. PhD position: The PhD candidate will conduct circuit-level data analysis and modeling of neural activity states, and will contribute to the development of machine learning algorithms to analyse imaging data or to distinguish different behavioral activity states. This work is centered on dynamical systems methods, data analysis, and numerical simulations, and requires an interdisciplinary mindset. Master's students interested in conducting Master's thesis research (6-12 months) related to the two projects above are welcome to apply.
Prof. Massimiliano Pontil
We are seeking a talented and motivated Postdoc to join the Computational Statistics and Machine Learning Research Units at IIT, led by Prof. Massimiliano Pontil. The successful candidate will be engaged in designing novel learning algorithms for numerical simulations of physical systems, with a focus on machine learning for dynamical systems. CSML’s core focus is on ML theory and algorithms, while significant multidisciplinary interactions with other IIT groups apply our research outputs in areas ranging from Atomistic Simulations to Neuroscience and Robotics. We have also recently started international collaboration on Climate Modelling. The group hosts applied mathematicians, computer scientists, physicists, and computer engineers, working together on theory, algorithms and applications. ML techniques, coupled with numerical simulations of physical systems have the potential to revolutionize the way in which science is conducted. Meeting this challenge requires a multi-disciplinary approach in which experts from different disciplines work together.
Geometry of concept learning
Understanding the human ability to learn novel concepts from just a few sensory experiences is a fundamental problem in cognitive neuroscience. I will describe recent work with Ben Sorscher and Surya Ganguli (PNAS, October 2022) in which we propose a simple, biologically plausible, and mathematically tractable neural mechanism for few-shot learning of naturalistic concepts. We posit that the concepts that can be learned from few examples are defined by tightly circumscribed manifolds in the neural firing-rate space of higher-order sensory areas. Discrimination between novel concepts is performed by downstream neurons implementing a ‘prototype’ decision rule, in which a test example is classified according to the nearest prototype constructed from the few training examples. We show that prototype learning achieves high few-shot accuracy on natural visual concepts using both macaque inferotemporal cortex representations and deep neural network (DNN) models of these representations. We develop a mathematical theory that links few-shot learning performance to the geometric properties of the neural concept manifolds and demonstrate its agreement with our numerical simulations across different DNNs as well as different layers. Intriguingly, we observe striking mismatches between the geometry of manifolds in intermediate stages of the primate visual pathway and in trained DNNs. Finally, we show that linguistic descriptors of visual concepts can be used to discriminate images belonging to novel concepts without any prior visual experience of those concepts (a task known as ‘zero-shot’ learning), indicating a remarkable alignment of manifold representations of concepts across visual and language modalities. I will discuss ongoing efforts to extend this work to other high-level cognitive tasks.
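The prototype rule itself is simple enough to sketch in a few lines. Below is a minimal numpy illustration (my own sketch, not the paper's code): Gaussian clouds around random centres stand in for concept manifolds, prototypes are means of a few training examples, and test points are classified by the nearest prototype. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for neural (or DNN-layer) representations: each concept is
# a Gaussian cloud around a random centre in an N-dimensional feature space.
N, n_concepts, n_train, n_test = 200, 5, 5, 50
centres = rng.normal(size=(n_concepts, N))
noise = 0.3

def sample(concept, n):
    """Draw n noisy examples of the given concept."""
    return centres[concept] + noise * rng.normal(size=(n, N))

# Prototype rule: average the few training examples of each concept...
prototypes = np.stack([sample(c, n_train).mean(axis=0)
                       for c in range(n_concepts)])

# ...and classify a test example by its nearest prototype.
def classify(x):
    return int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))

correct = sum(classify(x) == c
              for c in range(n_concepts)
              for x in sample(c, n_test))
accuracy = correct / (n_concepts * n_test)
```

In this easy toy regime the nearest-prototype rule is essentially perfect; the interesting theory in the paper concerns how accuracy degrades with manifold geometry, which this sketch does not attempt to capture.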
Associative memory of structured knowledge
A long-standing challenge in biological and artificial intelligence is to understand how new knowledge can be constructed from known building blocks in a way that is amenable to computation by neuronal circuits. Here we focus on the task of storage and recall of structured knowledge in long-term memory. Specifically, we ask how recurrent neuronal networks can store and retrieve multiple knowledge structures. We model each structure as a set of binary relations between events and attributes (attributes may represent, e.g., temporal order, spatial location, or role in a semantic structure), and map each structure to a distributed neuronal activity pattern using a vector symbolic architecture (VSA) scheme. We then use associative-memory plasticity rules to store the binarized patterns as fixed points in a recurrent network. By a combination of signal-to-noise analysis and numerical simulations, we demonstrate that our model allows for efficient storage of these knowledge structures, such that the memorized structures as well as their individual building blocks (e.g., events and attributes) can subsequently be retrieved from partial cues. We show that long-term memory of structured knowledge relies on a new principle of computation beyond the memory basins. Finally, we show that our model can be extended to store sequences of memories as single attractors.
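As a minimal illustration of the storage step, the sketch below uses a classic Hopfield-style network with a Hebbian outer-product rule to store random binary patterns (standing in for binarized VSA encodings) as fixed points, and retrieves one from a partial cue. This is a generic associative-memory sketch, not the authors' model; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store random +/-1 patterns (stand-ins for binarized VSA encodings of
# knowledge structures) as fixed points of a Hopfield-style network.
N, P = 500, 10
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian (outer-product) learning rule, with zero self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=20):
    """Iterate the sign dynamics until (approximate) convergence."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

# Partial cue: correct on a random ~60% of units, random elsewhere.
target = patterns[0]
cue = target.copy()
mask = rng.random(N) > 0.6
cue[mask] = rng.choice([-1.0, 1.0], size=mask.sum())

overlap = retrieve(cue) @ target / N   # ~1 means successful recall
```

At this low load (P/N = 0.02, well below the classic Hopfield capacity) the partial cue falls inside the pattern's basin and recall is essentially exact.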
Optimal information loading into working memory in prefrontal cortex
Working memory involves the short-term maintenance of information and is critical in many tasks. The neural circuit dynamics underlying working memory remain poorly understood, with different aspects of prefrontal cortical (PFC) responses explained by different putative mechanisms. Using mathematical analysis, numerical simulations, and recordings from monkey PFC, we investigate a critical but hitherto ignored aspect of working-memory dynamics: information loading. We find that, contrary to common assumptions, optimal information loading involves inputs that are largely orthogonal, rather than similar, to the persistent activities observed during memory maintenance. Using a novel, theoretically principled metric, we show that PFC exhibits the hallmarks of optimal information loading, and we find that such dynamics emerge naturally as a dynamical strategy in task-optimized recurrent neural networks. Our theory unifies previous, seemingly conflicting theories of memory maintenance based on attractor or purely sequential dynamics, and reveals a normative principle underlying the widely observed phenomenon of dynamic coding in PFC.
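The advantage of orthogonal loading can be seen already in a two-unit toy network (my own illustration, not the model from the talk): a slow mode stores the memory, a fast mode feeds it through a strong feedforward weight, and an input orthogonal to the persistent direction is transiently amplified so that it ends up producing more persistent activity than an equally strong input aligned with it.

```python
import numpy as np

# Two-unit linear rate network: a slow "persistent" mode along (1, 0)
# and a fast mode that feeds it through a strong feedforward weight w.
w = 3.0
A = np.array([[-0.1, w],
              [0.0, -1.0]])      # dx/dt = A x (non-normal connectivity)

def persistent_activity(x0, t=5.0, dt=1e-3):
    """Euler-integrate the network; return activity along (1, 0) at time t."""
    x = np.array(x0, dtype=float)
    for _ in range(int(t / dt)):
        x = x + dt * (A @ x)
    return x[0]

# Unit-norm inputs: along the persistent direction vs. orthogonal to it.
direct = persistent_activity([1.0, 0.0])
orthogonal = persistent_activity([0.0, 1.0])
# The orthogonal input leaves substantially more activity in the slow
# mode than the direct input of the same magnitude.
```

The effect relies on the connectivity being non-normal (the feedforward weight w); with a symmetric weight matrix, aligned loading would always win.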
Non-regular behavior during the coalescence of liquid-like cellular aggregates
The fusion of cell aggregates occurs widely in biological processes such as development, tissue regeneration, and tumor invasion. Cellular spheroids (spherical cell aggregates) are commonly used to study this phenomenon. In previous studies, under approximate assumptions and measurements, researchers found that the fusion of two spheroids of certain cell types resembles the coalescence of two liquid droplets. However, with more accurate measurements focusing on the overall shape evolution during this process, we find that even in the previously regarded liquid-like regime, the fusion of spheroids can differ markedly from regular liquid coalescence. We conduct numerical simulations using both standard particulate models and vertex models, with both molecular dynamics and Brownian dynamics. The simulation results show that the difference between spheroids and regular liquid droplets is caused by the microscopically overdamped dynamics of each cell rather than by the topological cell-cell interactions in the vertex model. Our research reveals the need for a new continuum theory of "liquids" with microscopically overdamped components, such as cellular and colloidal systems. Detailed analysis of our simulation results at different system sizes provides the basis for developing this new theory.
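For reference, the overdamped update that distinguishes Brownian dynamics from inertial molecular dynamics can be sketched as follows. This is a generic two-particle toy, not the study's models; the thermal noise is switched off (D = 0) so the deterministic relaxation is visible, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Overdamped (Brownian) dynamics: no inertia, positions follow forces,
# x' = F / gamma + thermal noise, in contrast to inertial MD (m x'' = F).
gamma, D, dt = 1.0, 0.0, 1e-3    # D = 0: zero-temperature limit
k, r0 = 10.0, 1.0                # harmonic pair attraction, rest length r0

x = np.array([[0.0, 0.0],
              [2.0, 0.0]])       # two particles, initially stretched apart

def pair_force(x):
    d = x[1] - x[0]
    r = np.linalg.norm(d)
    f = k * (r - r0) * d / r     # spring force acting on particle 0
    return np.array([f, -f])

for _ in range(5000):
    x = (x + dt * pair_force(x) / gamma
         + np.sqrt(2.0 * D * dt) * rng.normal(size=x.shape))

separation = float(np.linalg.norm(x[1] - x[0]))   # relaxes toward r0
```

Because velocities are slaved to forces at every instant, an overdamped system has no momentum to carry it through a collective rearrangement, which is the qualitative distinction the abstract draws between cellular aggregates and ordinary liquids.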
GeNN
Large-scale numerical simulations of brain circuit models are important for generating hypotheses about brain function and for testing their consistency and plausibility. Similarly, spiking neural networks are gaining traction in machine learning, with the promise that neuromorphic hardware will eventually make them much more energy-efficient than classical ANNs. In this session, we will present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale spiking neuronal networks and so address the challenge of efficient simulation. GeNN is an open-source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs through a flexible and extensible interface that does not require in-depth technical knowledge from users. GeNN was originally developed as a pure C++ and CUDA library, but we have since added a Python interface and an OpenCL backend. We will briefly cover the history and basic philosophy of GeNN and show some simple examples of how it is used and how it interacts with other open-source frameworks such as Brian2GeNN and PyNN.
Coordinated motion of active filaments on spherical surfaces
Filaments (slender, microscopic elastic bodies) are prevalent in biological and industrial settings. In the biological case, the filaments are often active, in that they are driven internally by motor proteins, the prime examples being cilia and flagella. Cilia in particular can appear in dense arrays, and their resulting motions are coupled through the surrounding fluid, as well as through the surfaces to which they are attached. In this talk, I present numerical simulations exploring the coordinated motion of active filaments and how it depends on the driving force, the filament density, and the attached surface. In particular, we find that when the surface is spherical, its topology introduces local defects in the coordinated motion which can then feed back and alter the global state. This is particularly true when the surface is not held fixed and is free to move in the surrounding fluid. These simulations take advantage of a computational framework we developed for fully 3D filament motion that combines unit quaternions, implicit geometric time integration, quasi-Newton methods, and fast, matrix-free methods for hydrodynamic interactions; this framework will also be presented.
Capacitance clamp - artificial capacitance in biological neurons via dynamic clamp
A basic time scale of neural dynamics, from single cells to the network level, is the membrane time constant - set by a neuron's input resistance and its capacitance. Interestingly, the membrane capacitance appears to be more dynamic than previously assumed, with implications for neural function and pathology. Indeed, altered membrane capacitance has been observed in reaction to physiological changes like neural swelling, but also in ageing and Alzheimer's disease. Importantly, according to theory, even small changes of the capacitance can affect neuronal signal processing, e.g. increase network synchronization or facilitate transmission of high frequencies. Experimentally, however, robust methods to modify the capacitance of a neuron have been lacking. Here, we present the capacitance clamp - an electrophysiological method for capacitance control based on an unconventional application of the dynamic clamp. In its original form, dynamic clamp mimics additional synaptic or ionic conductances by injecting their respective currents. Whereas a conductance directly governs a current, the membrane capacitance determines how fast the voltage responds to a current. Accordingly, the capacitance clamp mimics an altered capacitance by injecting a dynamic current that slows down or speeds up the voltage response (Fig 1A). For the required dynamic current, the experimenter only has to specify the cell's original capacitance and the desired target capacitance. In particular, the capacitance clamp requires no detailed model of the conductances present and can thus be applied in any excitable cell. To validate the capacitance clamp, we performed numerical simulations of the protocol and applied it to modify the capacitance of cultured neurons. First, we simulated the capacitance clamp in conductance-based neuron models and analysed impedance and firing frequency to verify the altered capacitance.
Second, in dentate gyrus granule cells from rats, we could reliably control the capacitance in a range of 75 to 200% of the original capacitance and observed pronounced changes in the shape of the action potentials: increasing the capacitance reduced after-hyperpolarization amplitudes and slowed down repolarization. To conclude, we present a novel tool for electrophysiology: the capacitance clamp provides reliable control over the capacitance of a neuron and thereby opens a new way to study the temporal dynamics of excitable cells.
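A rough sketch of the feedback loop, under my reading of the method (not the authors' implementation): at each time step the controller estimates the total membrane-plus-stimulus current from the measured voltage derivative, subtracting its own previous injection, and scales that estimate by C_c/C_t - 1. Applied to a passive membrane in simulation, this shifts the effective charging time constant from R·C_c to R·C_t. All parameter values are illustrative.

```python
import numpy as np

def simulate_cap_clamp(C_c, C_t, R=100e6, E=-70e-3, I_step=100e-12,
                       dt=1e-5, T=0.15):
    """Passive membrane with capacitance C_c under a capacitance clamp
    emulating target capacitance C_t; returns the voltage trace."""
    n = int(T / dt)
    V = np.full(n, E)
    gain = C_c / C_t - 1.0       # scaling applied to the estimated current
    I_dyn = 0.0
    for i in range(1, n):
        if i >= 2:
            # Controller: estimate the membrane (+ stimulus) current from
            # the measured dV/dt, subtracting the previous clamp injection.
            I_est = C_c * (V[i-1] - V[i-2]) / dt - I_dyn
            I_dyn = gain * I_est
        I_ion = -(V[i-1] - E) / R + I_step   # leak + step stimulus
        V[i] = V[i-1] + dt / C_c * (I_ion + I_dyn)
    return V

def charging_tau(V, E=-70e-3, dt=1e-5):
    """Time to reach 63.2% of the steady-state voltage step."""
    target = E + 0.632 * (V[-1] - E)
    return float(np.argmax(V >= target)) * dt

tau_control = charging_tau(simulate_cap_clamp(100e-12, 100e-12))  # ~ R*C_c
tau_clamped = charging_tau(simulate_cap_clamp(100e-12, 200e-12))  # ~ R*C_t
```

The one-step delay in the current estimate keeps the scheme causal, at the cost of a small lag that vanishes as the sampling step shrinks.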
The physics of cement cohesion
Cement is the main binding agent in concrete, literally gluing together rocks and sand into the most-used synthetic material on Earth. However, cement production is responsible for significant amounts of man-made greenhouse gases - in fact, if the cement industry were a country, it would be the third-largest emitter in the world. Alternatives to the current, environmentally harmful cement production process are unavailable essentially because gaps in fundamental understanding hamper the development of smarter and more sustainable solutions. The ultimate challenge is to link the chemical composition of cement grains to the nanoscale physics of the cohesive forces that emerge when cement is mixed with water. Cement's nanoscale cohesion originates from the electrostatics of ions accumulated in a water-based solution between like-charged surfaces, but it is not captured by existing theories because of the nature of the ions involved and the high surface charges. Surprisingly enough, the same is true of unexplained cohesion in a range of colloidal and biological matter. About one century after the early studies of cement hydration, we have quantitatively solved this notoriously hard problem and discovered how cement cohesion develops during hydration. I will discuss how 3D numerical simulations featuring a simple but molecular description of ions and water, together with an analytical theory that goes beyond the traditional continuum approximations, helped us demonstrate that the optimized interlocking of ion-water structures determines the net cohesive forces and their evolution. These findings open the path to scientifically grounded strategies of material design for cements, and have implications for a much wider range of materials and systems where ionic water-based solutions feature both strong Coulombic and confinement effects, ranging from biological membranes to soils.
Construction materials are central to our society and to our life as humans on this planet, but usually far removed from fundamental science. We can now start to understand how cement physical-chemistry determines performance, durability and sustainability.