
Dynamical Systems


Discover seminars, jobs, and research tagged with dynamical systems from around the world.
44 curated items · 19 Seminars · 14 ePosters · 11 Positions
Updated 1 day ago
Position

Ing. Mgr. Jaroslav Hlinka, Ph.D.

Institute of Computer Science of the Czech Academy of Sciences
Prague, Czech Republic
Dec 5, 2025

Postdoctoral / Junior Scientist position in Complex Networks and Information Theory

A Postdoc or Junior Scientist position is available in the Complex Networks and Brain Dynamics group for the project “Network modelling of complex systems: from correlation graphs to information hypergraphs”, funded by the Czech Science Foundation. The project involves developing, optimizing, and applying techniques for modelling complex dynamical systems beyond the currently available methods of complex network analysis and game theory, and is carried out in collaboration with the Artificial Intelligence Center of the Czech Technical University.

Conditions:
• The contract is of 18 months' duration (with the possibility of a follow-up tenure-track application).
• Starting date: the position is available immediately.
• Applications will be reviewed on a rolling basis, with a first cut-off on 30. 9. 2022.
• This is a full-time, fixed-term appointment; a part-time contract is negotiable.
• Monthly gross salary: 42,000–48,000 CZK, based on qualifications and experience.
• Bonuses depending on performance, and travel funding for conferences and research stays.
• Contribution to relocation costs for a successful applicant coming from abroad: 10,000 CZK, plus 10,000 CZK for family (spouse and/or children).
• No teaching duties.

Position

Ann Kennedy

Northwestern University
Chicago, United States
Dec 5, 2025

We investigate principles of computation in meso-scale biological neural networks and the role of these networks in shaping animal behavior. We work in collaboration with experimental neuroscientists recording neural activity in freely moving animals engaged in complex behaviors, to investigate how animals' environments, actions, and internal states are represented across multiple brain areas. Our work is especially inspired by the interaction between subcortical neural populations organized into heavily recurrent neural circuits, including the basal ganglia and nuclei of the hypothalamus. Projects in the lab include 1) developing novel supervised, semi-supervised, and unsupervised approaches to studying the structure of animal behavior, 2) using behavior as a common basis with which to model the interactions between multiple brain areas, and 3) studying computation and dynamics in networks of heterogeneous neurons communicating with multiple neuromodulators and neuropeptides. The lab will also soon begin collecting behavioral data from freely interacting mice in a variety of model lines and animal conditions, to better chart the space of interactions between animal state and behavior expression. Come join us!

Position · Computational Neuroscience

Tatiana Engel

Cold Spring Harbor Laboratory
Cold Spring Harbor, USA
Dec 5, 2025

The Engel lab in the Department of Neuroscience at Cold Spring Harbor Laboratory invites applications from highly motivated candidates for a postdoctoral position working on cutting-edge research in computational neuroscience. We are looking for theoretical/computational scientists to work at the exciting interface of systems neuroscience, machine learning, and statistical physics, in close collaboration with experimentalists. The postdoctoral scientist is expected to exhibit resourcefulness and independence, developing computational models of large-scale neural activity recordings with the goal of elucidating the neural circuit mechanisms underlying cognitive functions. Details: https://cshl.peopleadmin.com/postings/15840

Position · Neuroscience

Federico Stella

Donders Institute of Radboud University
Nijmegen, NL
Dec 5, 2025

The project will focus on the computational investigation of the role of neural reactivations in memory. Since their discovery, neural reactivations occurring during sleep have emerged as an exceptional tool for investigating the process of memory formation in the brain. This phenomenon has mostly been associated with the hippocampus, an area known for its role in the processing of new memories and their initial storage. Continuous advances in data acquisition techniques are giving us unprecedented access to the activity of large-scale networks during sleep, in the hippocampus and in other cortical regions. At the same time, our theoretical understanding of the computations underlying neural reactivations and, more generally, memory representations has only begun to take shape. Combining mathematical modeling of neural networks with analysis of existing datasets, we will address key aspects of this phenomenon, such as: 1) the role of different sleep phases in regulating the reactivation process and in modulating the evolution of a memory trace; 2) the relationship of hippocampal reactivations to the process of (semantic) learning and knowledge generalization; 3) the relevance of the statistical properties of reactivations for learning in cortico-hippocampal networks.

Position · Computational Neuroscience

Dr. Udo Ernst

Computational Neurophysics Lab, University of Bremen
University of Bremen, Hochschulring 18, D-28359 Bremen, Germany
Dec 5, 2025

In this project we want to study the organization and optimization of flexible information processing in neural networks, with a specific focus on the visual system. You will use network modelling, numerical simulation, and mathematical analysis to investigate fundamental aspects of flexible computation, such as task-dependent coordination of multiple brain areas for efficient information processing and the emergence of flexible circuits from learning schemes that simultaneously optimize for function and flexibility. These studies will be complemented by biophysically realistic modelling and data analysis in collaboration with experimental work done in the lab of Prof. Dr. Andreas Kreiter, also at the University of Bremen. Here we will investigate selective attention as a central aspect of flexibility in the visual system, involving task-dependent coordination of multiple visual areas.

Position

Gonzalo Uribarri

KTH
Stockholm
Dec 5, 2025

Our research group is looking for a Postdoc to work on a project involving Machine Learning and Dynamical Systems modeling applied to biomedical data. The project is part of a collaboration with Getinge, a leading MedTech company based in Stockholm, and is funded by a grant from Vinnova, the Swedish innovation agency.

Position

Prof. Massimiliano Pontil

IIT
Dec 5, 2025

We are seeking a talented and motivated Postdoc to join the Computational Statistics and Machine Learning (CSML) research unit at IIT, led by Prof. Massimiliano Pontil. The successful candidate will design novel learning algorithms for numerical simulations of physical systems, with a focus on machine learning for dynamical systems. CSML's core focus is on ML theory and algorithms, while significant multidisciplinary interactions with other IIT groups apply our research outputs in areas ranging from Atomistic Simulations to Neuroscience and Robotics. We have also recently started an international collaboration on Climate Modelling. The group hosts applied mathematicians, computer scientists, physicists, and computer engineers working together on theory, algorithms, and applications. ML techniques, coupled with numerical simulations of physical systems, have the potential to revolutionize the way science is conducted. Meeting this challenge requires a multidisciplinary approach in which experts from different disciplines work together.

Position · Computational Neuroscience

Dr Margarita Zachariou

The Cyprus Institute of Neurology and Genetics
Nicosia, Cyprus
Dec 5, 2025

We are looking for a Post-Doctoral Fellow and/or a Laboratory Scientific Officer (research assistant) to join the Bioinformatics Department of the Cyprus Institute of Neurology and Genetics. The team focuses on computational neuroscience, particularly on (1) building biophysical models of neurons and neuronal networks to study neurological diseases and (2) developing state-of-the-art analysis pipelines for neural data across scales, focusing on disease-specific patterns and integrating diverse data modalities. The successful candidate(s) will work on multiscale models of magnetoelectric and ultrasonic effects on neuronal dynamics as part of the EU Horizon-funded META-BRAIN project (https://meta-brain.eu).

Position · Computational Neuroscience

Dr. Dmitrii Todorov

The Biomedical Imaging Laboratory (INSERM / Sorbonne University / CNRS )
Paris
Dec 5, 2025

Title: Understanding Neural Mechanisms of Human Motor Learning by Using Explainable AI for Time Series and Brain-Computer Interfaces

This PhD project will focus on uncovering the mechanisms of human motor adaptation using advanced computational tools. By analyzing (and potentially collecting new) EEG, MEG, and behavioral data from multiple datasets, you will explore how the brain adapts movements to external perturbations. There will also be an opportunity to test the newly obtained understanding using a brain-computer interface (BCI) protocol. The project will be co-supervised by Dr. Dmitrii Todorov and Dr. Veronique Marchand-Pauvert, and will be carried out within an international interdisciplinary team.

Seminar · Neuroscience

Probing neural population dynamics with recurrent neural networks

Chethan Pandarinath
Emory University and Georgia Tech
Jun 11, 2024

Large-scale recordings of neural activity are providing new opportunities to study network-level dynamics in unprecedented detail. However, the sheer volume of data and its dynamical complexity are major barriers to uncovering and interpreting these dynamics. I will present latent factor analysis via dynamical systems (LFADS), a sequential autoencoding approach that enables inference of dynamics from neuronal population spiking activity on single trials and at millisecond timescales. I will also discuss recent adaptations of the method to uncover dynamics from neural activity recorded via two-photon (2P) calcium imaging. Finally, time permitting, I will mention recent efforts to improve the interpretability of deep learning-based dynamical systems models.
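
The following is a minimal numpy/scipy sketch of the inference problem LFADS addresses, not the LFADS architecture itself: spikes are generated from low-dimensional latent dynamics, and the goal is to recover those dynamics from the spiking observations alone. All sizes, rates, and the PCA-plus-least-squares recovery are illustrative assumptions.

```python
# Toy illustration of the problem LFADS solves (not LFADS itself): spikes are
# emitted from low-dimensional latent dynamics, which we try to recover.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)
T, n_latent, n_neurons, dt = 500, 2, 50, 0.01

# Ground-truth latent dynamics: a slowly decaying 2-D rotation.
theta = 0.1
A = 0.999 * np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
z = np.zeros((T, n_latent)); z[0] = [1.0, 0.0]
for t in range(1, T):
    z[t] = A @ z[t - 1] + 0.01 * rng.standard_normal(n_latent)

# Poisson spikes through a random readout (baseline ~5 Hz).
C = 0.5 * rng.standard_normal((n_latent, n_neurons))
spikes = rng.poisson(np.exp(z @ C + np.log(5 * dt)))

# Crude recovery: smooth the spikes, then take the top principal components.
smoothed = gaussian_filter1d(spikes.astype(float), sigma=5, axis=0)
smoothed -= smoothed.mean(axis=0)
_, _, Vt = np.linalg.svd(smoothed, full_matrices=False)
z_hat = smoothed @ Vt[:n_latent].T            # estimated latent factors

# Fit a linear dynamics matrix to the recovered latents by least squares.
A_hat, *_ = np.linalg.lstsq(z_hat[:-1], z_hat[1:], rcond=None)
print("eigenvalues of fitted dynamics:", np.linalg.eigvals(A_hat.T))
```

The fitted eigenvalues should approximate those of the true rotation, even though the latents are only recovered up to a linear transformation.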

Seminar · Neuroscience · Recording

Reimagining the neuron as a controller: A novel model for Neuroscience and AI

Dmitri 'Mitya' Chklovskii
Flatiron Institute, Center for Computational Neuroscience
Feb 4, 2024

We build upon and expand the efficient coding and predictive information models of neurons, presenting a novel perspective that neurons not only predict but also actively influence their future inputs through their outputs. We introduce the concept of neurons as feedback controllers of their environments, a role traditionally considered computationally demanding, particularly when the dynamical system characterizing the environment is unknown. By harnessing a novel data-driven control framework, we illustrate the feasibility of biological neurons functioning as effective feedback controllers. This innovative approach enables us to coherently explain various experimental findings that previously seemed unrelated. Our research has profound implications, potentially revolutionizing the modeling of neuronal circuits and paving the way for the creation of alternative, biologically inspired artificial neural networks.

Seminar · Neuroscience · Recording

The balance hypothesis for the avian lumbosacral organ and an exploration of its morphological variation

Bing Brunton
Brain, Behavior, and Data Science, University of Washington, Seattle
May 9, 2023

Seminar · Neuroscience

The centrality of population-level factors to network computation is demonstrated by a versatile approach for training spiking networks

Brian DePasquale
Princeton
May 2, 2023

Neural activity is often described in terms of population-level factors extracted from the responses of many neurons. Factors provide a lower-dimensional description with the aim of shedding light on network computations. Yet, mechanistically, computations are performed not by continuously valued factors but by interactions among neurons that spike discretely and variably. Models provide a means of bridging these levels of description. We developed a general method for training model networks of spiking neurons by leveraging factors extracted from either data or firing-rate-based networks. In addition to providing a useful model-building framework, this formalism illustrates how reliable and continuously valued factors can arise from seemingly stochastic spiking. Our framework establishes procedures for embedding this property in network models with different levels of realism. The relationship between spikes and factors in such networks provides a foundation for interpreting (and subtly redefining) commonly used quantities such as firing rates.

Seminar · Neuroscience

Dynamic endocrine modulation of the nervous system

Emily Jacobs
UC Santa Barbara Neuroscience
Apr 17, 2023

Sex hormones are powerful neuromodulators of learning and memory. In rodents and nonhuman primates, estrogen and progesterone influence the central nervous system across a range of spatiotemporal scales. Yet their influence on the structural and functional architecture of the human brain is largely unknown. Here, I highlight findings from a series of dense-sampling neuroimaging studies from my laboratory designed to probe the dynamic interplay between the nervous and endocrine systems. Individuals underwent brain imaging and venipuncture every 12-24 hours for 30 consecutive days. These procedures were carried out under freely cycling conditions and again under a pharmacological regimen that chronically suppresses sex hormone production. First, resting-state fMRI evidence suggests that transient increases in estrogen drive robust increases in functional connectivity across the brain. Time-lagged methods from dynamical systems analysis further reveal that these transient changes in estrogen enhance within-network integration (i.e., global efficiency) in several large-scale brain networks, particularly the Default Mode and Dorsal Attention Networks. Next, using high-resolution hippocampal subfield imaging, we found that intrinsic hormone fluctuations and exogenous hormone manipulations can rapidly and dynamically shape medial temporal lobe morphology. Together, these findings suggest that neuroendocrine factors influence the brain over short and protracted timescales.
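
As a hedged illustration of the graph measure mentioned above (not the study's actual pipeline), the sketch below builds a functional connectivity graph from simulated regional time series and computes its global efficiency with networkx; the coupling parameter standing in for hormone level, the correlation threshold, and all sizes are assumptions.

```python
# Build a functional connectivity graph from simulated signals and compute
# global efficiency (mean inverse shortest-path length) with networkx.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_regions, T = 20, 300
shared = rng.standard_normal(T)               # common drive ("hormone" proxy)

def global_efficiency(coupling):
    # Each region mixes a shared signal with private noise; stronger coupling
    # mimics the increased connectivity reported at high-estrogen timepoints.
    X = coupling * shared[None, :] + rng.standard_normal((n_regions, T))
    corr = np.corrcoef(X)
    adj = (np.abs(corr) > 0.3) & ~np.eye(n_regions, dtype=bool)
    return nx.global_efficiency(nx.from_numpy_array(adj.astype(int)))

print("weak coupling  :", round(global_efficiency(0.3), 3))
print("strong coupling:", round(global_efficiency(1.0), 3))
```

Stronger shared coupling yields a denser correlation graph and hence higher global efficiency, the direction of the effect described in the abstract.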

Seminar · Neuroscience

Extracting computational mechanisms from neural data using low-rank RNNs

Adrian Valente
Ecole Normale Supérieure
Jan 10, 2023

An influential theory in systems neuroscience suggests that brain function can be understood through low-dimensional dynamics [Vyas et al. 2020]. However, a challenge in this framework is that a single computational task may involve a range of dynamic processes. To understand which processes are at play in the brain, it is important to use data on neural activity to constrain models. In this study, we present a method for extracting low-dimensional dynamics from data using low-rank recurrent neural networks (lrRNNs), a highly expressive yet interpretable class of models [Mastrogiuseppe & Ostojic 2018, Dubreuil, Valente et al. 2022]. We first test our approach on synthetic data generated from full-rank RNNs trained on various brain tasks. We find that lrRNNs fitted to neural activity allow us to identify the collective computational processes and make new predictions for inactivations in the original RNNs. We then apply our method to data recorded from the prefrontal cortex of primates during a context-dependent decision-making task. Our approach enables us to assign computational roles to the different latent variables and provides a mechanistic model of the recorded dynamics, which can be used to perform in silico experiments, such as inactivations, and provide testable predictions.
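
A minimal sketch of the model class, following the rank-one case of Mastrogiuseppe & Ostojic (2018): the recurrent connectivity is an outer product of two vectors, so the network state collapses onto a one-dimensional latent variable. The specific parameters (network size, overlap of 2.5) are illustrative assumptions.

```python
# Rank-one RNN: x' = -x + (m n^T / N) tanh(x). Activity is confined to the
# subspace spanned by m, and a single latent kappa summarizes the network.
import numpy as np

rng = np.random.default_rng(0)
N, dt, steps = 1000, 0.1, 2000
m = rng.standard_normal(N)
n = 2.5 * m                                  # overlap > 1 gives bistability
x = 0.1 * rng.standard_normal(N)

for _ in range(steps):
    kappa = n @ np.tanh(x) / N               # single collective variable
    x += dt * (-x + kappa * m)

# The network settles into a fixed point proportional to m; its position
# along m is the latent kappa.
print("final |kappa|   :", abs(n @ np.tanh(x) / N))
print("alignment with m:", np.corrcoef(x, m)[0, 1])
```

Because the dynamics live on one latent dimension, the fitted connectivity vectors directly expose the computation, which is what makes lrRNNs interpretable.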

Seminar · Neuroscience · Recording

Nonlinear computations in spiking neural networks through multiplicative synapses

M. Nardin
IST Austria
Nov 8, 2022

The brain efficiently performs nonlinear computations through its intricate networks of spiking neurons, but how this is done remains elusive. While recurrent spiking networks implementing linear computations can be directly derived and easily understood (e.g., in the spike coding network (SCN) framework), the connectivity required for nonlinear computations can be harder to interpret, as it relies on additional nonlinearities (e.g., dendritic or synaptic) weighted through supervised training. Here we extend the SCN framework to directly implement any polynomial dynamical system. This results in networks requiring multiplicative synapses, which we term the multiplicative spike coding network (mSCN). We demonstrate how the required connectivity for several nonlinear dynamical systems can be directly derived and implemented in mSCNs, without training. We also show how to precisely implement higher-order polynomials with coupled networks that use only pairwise multiplicative synapses, and provide expected numbers of connections for each synapse type. Overall, our work provides an alternative method for implementing nonlinear computations in spiking neural networks, while keeping all the attractive features of standard SCNs, such as robustness, irregular and sparse firing, and interpretable connectivity. Finally, we discuss the biological plausibility of mSCNs, and how the high accuracy and robustness of the approach may be of interest for neuromorphic computing.
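
For orientation, here is a toy version of the standard (linear) spike coding network that the mSCN extends, assuming the common greedy one-spike-per-timestep simulation; all parameters are illustrative, and nothing here implements the multiplicative synapses themselves.

```python
# Toy 1-D spike coding network: N neurons spike so that a leaky readout of
# their spikes tracks a target signal x(t), with the standard SCN threshold.
import numpy as np

rng = np.random.default_rng(0)
N, dt, lam = 20, 1e-3, 10.0
D = 0.1 * rng.standard_normal(N)             # decoding weight of each neuron
thresh = D**2 / 2                            # standard SCN spiking threshold

t = np.arange(0, 2, dt)
x = np.sin(2 * np.pi * t)                    # target signal
xhat = 0.0
readout = np.empty_like(t)

for k, xt in enumerate(x):
    V = D * (xt - xhat)                      # voltage = projected coding error
    i = np.argmax(V - thresh)
    if V[i] > thresh[i]:                     # greedy: at most one spike/step
        xhat += D[i]                         # a spike instantly moves the readout
    xhat -= dt * lam * xhat                  # leaky decay of the readout
    readout[k] = xhat

print("RMS tracking error:", np.sqrt(np.mean((x - readout) ** 2)))
```

Each neuron fires only when doing so reduces the coding error, which is what produces the irregular, sparse, yet precise firing the abstract refers to.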

Seminar · Neuroscience

Flexible multitask computation in recurrent networks utilizes shared dynamical motifs

Laura Driscoll
Stanford University
Aug 24, 2022

Flexible computation is a hallmark of intelligent behavior. Yet, little is known about how neural networks contextually reconfigure for different computations. Humans are able to perform a new task without extensive training, presumably through the composition of elementary processes that were previously learned. Cognitive scientists have long hypothesized the possibility of a compositional neural code, where complex neural computations are made up of constituent components; however, the neural substrate underlying this structure remains elusive in biological and artificial neural networks. Here we identified an algorithmic neural substrate for compositional computation through the study of multitasking artificial recurrent neural networks. Dynamical systems analyses of networks revealed learned computational strategies that mirrored the modular subtask structure of the task-set used for training. Dynamical motifs such as attractors, decision boundaries and rotations were reused across different task computations. For example, tasks that required memory of a continuous circular variable repurposed the same ring attractor. We show that dynamical motifs are implemented by clusters of units and are reused across different contexts, allowing for flexibility and generalization of previously learned computation. Lesioning these clusters resulted in modular effects on network performance: a lesion that destroyed one dynamical motif only minimally perturbed the structure of other dynamical motifs. Finally, modular dynamical motifs could be reconfigured for fast transfer learning. After slow initial learning of dynamical motifs, a subsequent faster stage of learning reconfigured motifs to perform novel tasks. This work contributes to a more fundamental understanding of compositional computation underlying flexible general intelligence in neural systems. We present a conceptual framework that establishes dynamical motifs as a fundamental unit of computation, intermediate between the neuron and the network. As more whole brain imaging studies record neural activity from multiple specialized systems simultaneously, the framework of dynamical motifs will guide questions about specialization and generalization across brain regions.
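
As a sketch of one of the dynamical motifs named above, the toy rate network below implements a ring attractor: units arranged on a circle with local excitation and broad inhibition settle into a persistent activity bump whose angle can store a continuous circular variable. The connectivity profile and gain are illustrative assumptions, not the trained networks of the study.

```python
# Ring attractor: cosine connectivity destabilizes the uniform state, and a
# persistent bump of activity forms whose angle encodes a circular variable.
import numpy as np

N, dt, tau = 100, 0.1, 1.0
angles = np.linspace(0, 2 * np.pi, N, endpoint=False)
# Nearby units excite each other; distant units inhibit (offset of -0.5).
W = 4.0 / N * (np.cos(angles[:, None] - angles[None, :]) - 0.5)

rng = np.random.default_rng(0)
r = 0.1 * rng.random(N)                       # random initial activity
for _ in range(2000):
    # Bounded rate dynamics; the saturating nonlinearity stops the bump
    # from growing without limit.
    r += dt / tau * (-r + np.tanh(np.maximum(W @ r + 0.5, 0)))

bump_angle = np.angle(np.exp(1j * angles) @ r)  # population-vector readout
print("bump centered at", round(np.degrees(bump_angle), 1), "deg")
```

Because every rotation of the bump is also a fixed point, the same circuit can hold any value of the circular variable, which is why such a motif can be reused across tasks.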

Seminar · Cognition · Recording

Eliminativism about Neural Representation

Inês Hipólito
Humboldt-Universität zu Berlin, Berlin School of Mind and Brain
Apr 11, 2022

Seminar · Physics of Life · Recording

Exact coherent structures and transition to turbulence in a confined active nematic

Caleb Wagner
University of Nebraska-Lincoln
Feb 27, 2022

Active matter describes a class of systems that are maintained far from equilibrium by driving forces acting on the constituent particles. Here I will focus on confined active nematics, which exhibit especially rich flow behavior, ranging from structured patterns in space and time to disordered turbulent flows. To understand this behavior, I will take a deterministic dynamical systems approach, beginning with the hydrodynamic equations for the active nematic. This approach reveals that the infinite-dimensional phase space of all possible flow configurations is populated by Exact Coherent Structures (ECS), which are exact solutions of the hydrodynamic equations with distinct and regular spatiotemporal structure; examples include unstable equilibria, periodic orbits, and traveling waves. The ECS are connected by dynamical pathways called invariant manifolds. The main hypothesis in this approach is that turbulence corresponds to a trajectory meandering in the phase space, transitioning between ECS by traveling on the invariant manifolds. Similar approaches have been successful in characterizing high Reynolds number turbulence of passive fluids. Here, I will present the first systematic study of active nematic ECS and their invariant manifolds and discuss their role in characterizing the phenomenon of active turbulence.
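
The simplest kind of exact coherent structure is an unstable equilibrium. As a low-dimensional stand-in for the hydrodynamic computation (the actual ECS live in the infinite-dimensional phase space of a PDE), this sketch finds a fixed point of the chaotic Lorenz flow by Newton's method and checks its instability.

```python
# Find an unstable equilibrium of the Lorenz flow by Newton's method,
# the same root-finding idea used to compute ECS of hydrodynamic equations.
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def f(u):
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def jac(u):
    x, y, z = u
    return np.array([[-sigma, sigma, 0.0],
                     [rho - z, -1.0, -x],
                     [y, x, -beta]])

u = np.array([5.0, 5.0, 20.0])               # initial guess on the attractor
for _ in range(20):
    u -= np.linalg.solve(jac(u), f(u))       # Newton step toward f(u) = 0

print("equilibrium:", u.round(4), "| residual:", np.linalg.norm(f(u)))
print("unstable?", np.linalg.eigvals(jac(u)).real.max() > 0)
```

The iteration converges to one of the Lorenz system's unstable spiral equilibria; chaotic trajectories transiently visit such structures, which is the organizing picture the talk applies to active turbulence.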

Seminar · Neuroscience · Recording

Neural Population Dynamics for Skilled Motor Control

Britton Sauerbrei
Case Western Reserve University School of Medicine
Nov 3, 2021

The ability to reach, grasp, and manipulate objects is a remarkable expression of motor skill, and the loss of this ability in injury, stroke, or disease can be devastating. These behaviors are controlled by the coordinated activity of tens of millions of neurons distributed across many CNS regions, including the primary motor cortex. While many studies have characterized the activity of single cortical neurons during reaching, the principles governing the dynamics of large, distributed neural populations remain largely unknown. Recent work in primates has suggested that during the execution of reaching, motor cortex may autonomously generate the neural pattern controlling the movement, much like the spinal central pattern generator for locomotion. In this seminar, I will describe recent work that tests this hypothesis using large-scale neural recording, high-resolution behavioral measurements, dynamical systems approaches to data analysis, and optogenetic perturbations in mice. We find, by contrast, that motor cortex requires strong, continuous, and time-varying thalamic input to generate the neural pattern driving reaching. In a second line of work, we demonstrate that the cortico-cerebellar loop is not critical for driving the arm towards the target, but instead fine-tunes movement parameters to enable precise and accurate behavior. Finally, I will describe my future plans to apply these experimental and analytical approaches to the adaptive control of locomotion in complex environments.

Seminar · Neuroscience · Recording

Credit Assignment in Neural Networks through Deep Feedback Control

Alexander Meulemans
Institute of Neuroinformatics, University of Zürich and ETH Zürich
Sep 29, 2021

The success of deep learning sparked interest in whether the brain learns by using similar techniques for assigning credit to each synaptic weight for its contribution to the network output. However, the majority of current attempts at biologically plausible learning methods are either non-local in time, require highly specific connectivity motifs, or have no clear link to any known mathematical optimization method. Here, we introduce Deep Feedback Control (DFC), a new learning method that uses a feedback controller to drive a deep neural network to match a desired output target and whose control signal can be used for credit assignment. The resulting learning rule is fully local in space and time and approximates Gauss-Newton optimization for a wide range of feedback connectivity patterns. To further underline its biological plausibility, we relate DFC to a multi-compartment model of cortical pyramidal neurons with a local voltage-dependent synaptic plasticity rule, consistent with recent theories of dendritic processing. By combining dynamical systems theory with mathematical optimization theory, we provide a strong theoretical foundation for DFC that we corroborate with detailed results on toy experiments and standard computer-vision benchmarks.

Seminar · Neuroscience

Dynamical Neuromorphic Systems

Julie Grollier
CNRS/Thales lab, Palaiseau, France
Jun 14, 2021

In this talk, I aim to show that the dynamical properties of emerging nanodevices can accelerate the development of smart and environmentally friendly chips that inherently learn through their physics. The goal of neuromorphic computing is to draw inspiration from the architecture of the brain to build low-power circuits for artificial intelligence. I will first give a brief overview of the state of the art in neuromorphic computing, highlighting the opportunities offered by emerging nanodevices in this field and the associated challenges. I will then show that the intrinsic dynamical properties of these nanodevices can be exploited at the device and algorithmic levels to assemble systems that infer and learn through their physics. I will illustrate these possibilities with examples from our work on spintronic neural networks that communicate and compute through their microwave oscillations, and on an algorithm called Equilibrium Propagation that minimizes both the error and the energy of a dynamical system.
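
Below is a loose numpy caricature of the two-phase contrastive update at the heart of Equilibrium Propagation (Scellier & Bengio, 2017), assuming a small rate network, a damped fixed-point relaxation, and a toy classification task; it is meant to show the free-phase/nudged-phase structure, not to reproduce the algorithm's guarantees or any hardware implementation.

```python
# Equilibrium Propagation, caricatured: relax to a free equilibrium, relax
# again with the output weakly nudged toward the target, and update each
# weight from the difference of its local pre/post activity products.
import numpy as np

rng = np.random.default_rng(0)
rho = lambda u: np.clip(u, 0.0, 1.0)         # hard-sigmoid "firing rate"

n_in, n_hid, n_out, beta, lr = 4, 32, 1, 0.5, 0.05
W1 = 0.5 * rng.standard_normal((n_in, n_hid))
W2 = 0.5 * rng.standard_normal((n_hid, n_out))

def relax(x, y=None, steps=60):
    """Damped settling of hidden/output units; y != None adds the nudge."""
    h = np.zeros(n_hid); o = np.zeros(n_out)
    for _ in range(steps):
        h = 0.5 * h + 0.5 * rho(x @ W1 + o @ W2.T)
        drive = h @ W2 + (beta * (y - o) if y is not None else 0.0)
        o = 0.5 * o + 0.5 * rho(drive)
    return h, o

X = rng.random((200, n_in))
Y = (X.mean(axis=1, keepdims=True) > 0.5).astype(float)   # toy task

for epoch in range(20):
    for x, y in zip(X, Y):
        h0, o0 = relax(x)                    # free phase
        h1, o1 = relax(x, y)                 # weakly clamped (nudged) phase
        W1 += lr / beta * np.outer(x, h1 - h0)
        W2 += lr / beta * (np.outer(h1, o1) - np.outer(h0, o0))

_, preds = zip(*(relax(x) for x in X))
print("training accuracy:", np.mean((np.array(preds) > 0.5) == Y))
```

The appeal for hardware, as the talk argues, is that both phases are just the physical relaxation of the same dynamical system, with learning driven by locally measurable correlations.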

Seminar · Neuroscience · Recording

Stability-Flexibility Dilemma in Cognitive Control: A Dynamical System Perspective

Naomi Leonard
Princeton University
Mar 25, 2021

Constraints on control-dependent processing have become a fundamental concept in general theories of cognition that explain human behavior in terms of rational adaptations to these constraints. However, such theories miss a rationale for why these constraints would exist in the first place. Recent work suggests that constraints on the allocation of control facilitate flexible task switching at the expense of the stability needed to support goal-directed behavior in the face of distraction. We formulate this problem in a dynamical system, in which control signals are represented as attractors and in which constraints on control allocation limit the depth of these attractors. We derive formal expressions of the stability-flexibility tradeoff, showing that constraints on control allocation improve cognitive flexibility but impair cognitive stability. We provide evidence that human participants adopt higher constraints on the allocation of control as the demand for flexibility increases, but that they deviate from optimal constraints. In continuing work, we are investigating how the collaborative performance of a group of individuals can benefit from individual differences in the balance between cognitive stability and flexibility.
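
A one-dimensional sketch of the tradeoff, under the assumption that control signals are attractors of dx/dt = a·x − x³ + drive + noise, with a setting the attractor depth: deeper attractors resist noise-driven distraction (stability) but take longer to switch when the task changes (flexibility). All parameter values are illustrative.

```python
# Stability-flexibility tradeoff in a double-well model: attractor depth a
# trades slow task switching (flexibility cost) against noise robustness.
import numpy as np

rng = np.random.default_rng(0)
dt, noise = 1e-3, 0.4

def switch_time(a, drive=1.2):
    """Steps to move from the x<0 attractor to x>0 after the drive turns on."""
    x = -np.sqrt(a)
    for step in range(200_000):
        x += dt * (a * x - x**3 + drive) + noise * np.sqrt(dt) * rng.standard_normal()
        if x > np.sqrt(a) / 2:
            return step
    return np.inf

def spontaneous_flips(a, steps=200_000):
    """Noise-driven sign changes with no drive: a proxy for instability."""
    x, flips = -np.sqrt(a), 0
    for _ in range(steps):
        x_new = x + dt * (a * x - x**3) + noise * np.sqrt(dt) * rng.standard_normal()
        flips += (x_new > 0) != (x > 0)
        x = x_new
    return flips

for a in (0.5, 1.0, 2.0):
    print(f"depth a={a}: switches in {switch_time(a)} steps, "
          f"{spontaneous_flips(a)} spontaneous flips")
```

Shallow attractors switch quickly but flip spontaneously under noise; deep attractors are stable but sluggish, the tradeoff the talk formalizes.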

Seminar · Physics of Life · Recording

Theory, reimagined

Greg Stephens
VU Amsterdam
Dec 10, 2020

Physics offers countless examples for which theoretical predictions are astonishingly powerful. But it’s hard to imagine similar precision in complex systems, where the number of components and the interdependencies between them simply prohibit a first-principles approach; look no further than the billions of neurons and trillions of connections within our own brains. In such settings, how do we even identify the important theoretical questions? We describe a systems-scale perspective in which we integrate information theory, dynamical systems, and statistical physics to extract understanding directly from measurements. We demonstrate our approach with a reconstructed state space of the behavior of the nematode C. elegans, revealing a chaotic attractor with a symmetric Lyapunov spectrum and a novel perspective on motor control. We then outline a maximally predictive coarse-graining in which nonlinear dynamics are subsumed into a linear ensemble evolution to obtain a simple yet accurate model on multiple scales. With this coarse-graining we identify long timescales and collective states in the Langevin dynamics of a double-well potential, the Lorenz system, and worm behavior. We suggest that such an “inverse” approach offers an emergent, quantitative framework in which to seek, rather than impose, effective organizing principles of complex systems.
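
As a hedged sketch of the state-space reconstruction idea, the code below delay-embeds a single observed coordinate of the Lorenz system and estimates the largest Lyapunov exponent from the divergence of initially close trajectory segments (a Rosenstein-style estimate). Embedding dimension, delay, and window lengths are illustrative choices, not those of the talk.

```python
# Delay-embed one Lorenz coordinate and estimate the largest Lyapunov
# exponent from nearest-neighbor divergence (rough Rosenstein-style method).
import numpy as np

def lorenz_x(n, dt=0.01):
    u = np.array([1.0, 1.0, 1.0]); out = np.empty(n)
    for i in range(n):
        x, y, z = u
        u = u + dt * np.array([10 * (y - x), x * (28 - z) - y, x * y - 8 / 3 * z])
        out[i] = u[0]
    return out

dt, m, tau, horizon, theiler = 0.01, 5, 10, 150, 50
s = lorenz_x(3000)[1000:]                     # discard the initial transient
n = len(s) - (m - 1) * tau
X = np.stack([s[i * tau:i * tau + n] for i in range(m)], axis=1)

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
for i in range(n):                            # exclude temporal neighbors
    D[i, max(0, i - theiler):i + theiler] = np.inf

valid = n - horizon
nn = np.argmin(D[:valid, :valid], axis=1)     # nearest neighbor of each point
div = [np.mean(np.log(np.linalg.norm(X[k:valid + k] - X[nn + k], axis=1) + 1e-12))
       for k in range(horizon)]

# The early slope of the mean log-divergence curve estimates the largest
# Lyapunov exponent (for Lorenz, roughly 0.9 per time unit).
slope = np.polyfit(np.arange(50) * dt, div[:50], 1)[0]
print("estimated largest Lyapunov exponent:", round(slope, 2))
```

With so few samples the estimate is rough, but it illustrates how a chaotic attractor and its instability can be recovered from a single measured variable, as in the worm-behavior analysis.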

Seminar · Physics of Life · Recording

Simons-Emory Workshop on Neural Dynamics: What could neural dynamics have to say about neural computation, and do we know how to listen?

Workshop, Multiple Speakers
Emory University
Dec 3, 2020

Speakers will deliver focused 10-minute talks, with periods reserved for broader discussion on topics at the intersection of neural dynamics and computation.
Organizer & Moderator: Chethan Pandarinath (Emory University and Georgia Tech)
Speakers & Discussants: Adrienne Fairhall (U Washington), Mehrdad Jazayeri (MIT), John Krakauer (Johns Hopkins), Francesca Mastrogiuseppe (Gatsby/UCL), Abigail Person (U Colorado), Abigail Russo (Princeton), Krishna Shenoy (Stanford), Saurabh Vyas (Columbia)

Seminar · Physics of Life

Pancreatic α and β cells are globally phase-locked

Chao Tang
Peking University, Beijing, China
Jul 28, 2020

The Ca2+-modulated pulsatile secretion of glucagon and insulin by pancreatic α and β cells plays a key role in glucose metabolism and homeostasis. However, how the different cell types in an islet couple and coordinate to give rise to various Ca2+ oscillation patterns, and how these patterns are tuned by paracrine regulation, remain elusive. Here we developed a microfluidic device to facilitate long-term recording of islet Ca2+ activity at the single-cell level and found that islets show heterogeneous but intrinsic oscillation patterns. The α and β cells in an islet oscillate in antiphase and are globally phase-locked to display a variety of oscillation modes. A mathematical model of islet oscillation maps out the dependence of the oscillation modes on the paracrine interactions between α and β cells. Our study reveals the origin of the islet oscillation patterns and highlights the role of paracrine regulation in tuning them.
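
A toy version of the phase-locking claim, assuming two Kuramoto-style phase oscillators standing in for the α and β populations: with repulsive coupling, slightly detuned oscillators lock into a stable antiphase relationship. The coupling form and parameters are illustrative, not the paper's islet model.

```python
# Two phase oscillators with repulsive coupling lock in antiphase,
# a minimal analogue of the alpha/beta antiphase locking described above.
import numpy as np

dt, K = 1e-3, -2.0                            # negative K = repulsive coupling
w_alpha, w_beta = 6.0, 6.5                    # slightly detuned frequencies
th_a, th_b = 0.0, 0.3
for _ in range(50_000):
    dth_a = w_alpha + K * np.sin(th_b - th_a)
    dth_b = w_beta + K * np.sin(th_a - th_b)
    th_a += dt * dth_a
    th_b += dt * dth_b

phase_gap = (th_b - th_a) % (2 * np.pi)
print("locked phase difference:", round(np.degrees(phase_gap), 1),
      "deg (antiphase is 180)")
```

The small residual offset from 180 degrees comes from the frequency detuning, mirroring how heterogeneous cells can still lock into a common antiphase rhythm.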

Seminar · Neuroscience · Recording

Neural manifolds for the stable control of movement

Sara Solla
Northwestern University
Apr 28, 2020

Animals perform learned actions with remarkable consistency for years after acquiring a skill. What is the neural correlate of this stability? We explore this question from the perspective of neural populations. Recent work suggests that the building blocks of neural function may be the activation of population-wide activity patterns: neural modes that capture the dominant co-variation patterns of population activity and define a task specific low dimensional neural manifold. The time-dependent activation of the neural modes results in latent dynamics. We hypothesize that the latent dynamics associated with the consistent execution of a behaviour need to remain stable, and use an alignment method to establish this stability. Once identified, stable latent dynamics allow for the prediction of various behavioural features via fixed decoder models. We conclude that latent cortical dynamics within the task manifold are the fundamental and stable building blocks underlying consistent behaviour.
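
A minimal sketch of the alignment idea, assuming two sessions of population activity that share the same latent dynamics but differ in their neuron-to-latent mapping: CCA applied to the top principal components of each session recovers the shared latents, with canonical correlations near one. Sizes and noise levels are illustrative assumptions, not the study's recordings.

```python
# Two simulated "sessions" share latent dynamics through different mixings;
# CCA on their top principal components reveals the stable latent space.
import numpy as np

rng = np.random.default_rng(0)
T, d, n1, n2 = 1000, 3, 60, 80
t = np.linspace(0, 20, T)
latents = np.stack([np.sin(t), np.cos(1.3 * t), np.sin(0.7 * t + 1)], axis=1)

def session(n_neurons):
    mixing = rng.standard_normal((d, n_neurons))   # session-specific readout
    return latents @ mixing + 0.3 * rng.standard_normal((T, n_neurons))

def top_pcs(X, k=d):
    X = X - X.mean(axis=0)
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * S[:k]

def cca_corrs(A, B):
    # Canonical correlations via SVD of the whitened cross-covariance.
    Qa, _ = np.linalg.qr(A - A.mean(axis=0))
    Qb, _ = np.linalg.qr(B - B.mean(axis=0))
    return np.linalg.svd(Qa.T @ Qb, compute_uv=False)

L1, L2 = top_pcs(session(n1)), top_pcs(session(n2))
print("canonical correlations:", cca_corrs(L1, L2).round(3))
```

Correlations near one indicate that the two sessions' manifolds are alignable, the sense in which the talk's latent dynamics are "stable" across time despite changing neuron-level activity.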

ePoster

Modeling gait dynamics with switching non-linear dynamical systems

Heike Stein, Njiva Andrianarivelo, Clarisse Batifol, Jeremy Gabillet, Ali Jalil, Michael Graupner, N. Alex Cayco Gajic

Bernstein Conference 2024

ePoster

Neural manifold discovery via dynamical systems

Arthur Pellegrino, Isabel Cornacchia, Angus Chadwick

Bernstein Conference 2024

ePoster

Using Dynamical Systems Theory to Improve Temporal Credit Assignment in Spiking Neural Networks

Rainer Engelken, L.F. Abbott

Bernstein Conference 2024

ePoster

Data-driven dynamical systems model of epilepsy development simulates intervention strategies

COSYNE 2022

ePoster

Dynamical systems analysis reveals a novel hypothalamic encoding of state in nodes controlling social behavior

COSYNE 2022

ePoster

Modeling multi-region neural communication during decision making with recurrent switching dynamical systems

COSYNE 2022

ePoster

Decomposed linear dynamical systems for C. elegans functional connectivity

Eva Yezerets, Noga Mudrik, Yenho Chen, Christopher Rozell, Adam Charles

COSYNE 2023

ePoster

Parsing neural dynamics with infinite recurrent switching linear dynamical systems

Victor Geadah & Jonathan W. Pillow

COSYNE 2023

ePoster

Capturing condition dependence in neural dynamics with Gaussian process linear dynamical systems

Victor Geadah, Amin Nejatbakhsh, David Lipshutz, Jonathan Pillow, Alex Williams

COSYNE 2025

ePoster

Neural manifold discovery via dynamical systems

Isabel M. Cornacchia, Arthur Pellegrino, Angus Chadwick

COSYNE 2025

ePoster

Task Structures Shape Underlying Dynamical Systems That Implement Computation

Po-Chen Kuo, Edgar Y. Walker, Laura Driscoll

COSYNE 2025

ePoster

Understanding the effects of neural perturbations using cell-type dynamical systems

Aditi Jha, Diksha Gupta, Carlos Brody, Jonathan Pillow

COSYNE 2025

ePoster

Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics

Yenho Chen

Neuromatch 5