Latest

SeminarNeuroscience

How the presynapse forms and functions

Volker Haucke
Department of Molecular Pharmacology & Cell Biology, Leibniz Institute, Berlin, Germany
Aug 28, 2025

Nervous system function relies on the polarized architecture of neurons, established by directional transport of pre- and postsynaptic cargoes. While delivery of postsynaptic components depends on the secretory pathway, the identity of the membrane compartment(s) that supply presynaptic active zone (AZ) and synaptic vesicle (SV) proteins is largely unknown. I will discuss recent advances in our understanding of how key components of the presynaptic machinery for neurotransmitter release are transported and assembled, focusing on our studies in genome-engineered human induced pluripotent stem cell-derived neurons. Specifically, I will focus on the composition and cell biological identity of the axonal transport vesicles that shuttle key components of neurotransmission to nascent synapses, and on the machinery for axonal transport and its control by signaling lipids. Our studies identify a crucial mechanism mediating the delivery of SV and active zone proteins to developing synapses and reveal connections to neurological disorders. In the second part of my talk, I will discuss how exocytosis and endocytosis are coupled to maintain presynaptic membrane homeostasis. I will present unpublished data on the role of membrane tension in the coupling of exocytosis and endocytosis at synapses. We have identified an endocytic BAR domain protein that senses alterations in membrane tension caused by the exocytotic fusion of SVs and initiates compensatory endocytosis to restore plasma membrane area. Interference with this mechanism results in defects in the coupling of presynaptic exocytosis and SV recycling at human synapses.

SeminarNeuroscience

The Systems Vision Science Summer School & Symposium, August 11 – 22, 2025, Tuebingen, Germany

Marco Bertamini, David Brainard, Peter Dayan, Andrea van Doorn, Roland Fleming, Pascal Fries, Wilson S Geisler, Robbe Goris, Sheng He, Tadashi Isa, Tomas Knapen, Jan Koenderink, Larry Maloney, Keith May, Marcello Rosa, Jonathan Victor
Aug 22, 2025

Applications are invited for the third edition (since 2023) of our Systems Vision Science (SVS) summer school, designed for everyone interested in gaining a systems-level understanding of biological vision. We plan a coherent, graduate-level syllabus on the integration of experimental data with theory and models, featuring lectures, guided exercises, and discussion sessions. The summer school will end with a Systems Vision Science symposium on frontier topics on August 20-22, with additional invited and contributed presentations and posters. The call for contributions and participation in the symposium will be sent out in spring 2025. All summer school participants are invited to attend and welcome to submit contributions to the symposium.

SeminarNeuroscience

OpenNeuro FitLins GLM: An Accessible, Semi-Automated Pipeline for OpenNeuro Task fMRI Analysis

Michael Demidenko
Stanford University
Aug 1, 2025

In this talk, I will discuss the OpenNeuro FitLins GLM package and illustrate its analytic workflow. OpenNeuro FitLins GLM is a semi-automated pipeline that reduces barriers to analyzing task-based fMRI data from OpenNeuro's 600+ task datasets. Created for psychology, psychiatry, and cognitive neuroscience researchers without extensive computational expertise, the tool automates what is otherwise a largely manual process built on in-house scripts for data retrieval, validation, quality control, statistical modeling, and reporting that, in some cases, may require weeks of effort. The workflow abides by open-science practices, enhancing reproducibility, and incorporates community feedback for model improvement. The pipeline integrates BIDS-compliant datasets and fMRIPrep preprocessed derivatives, and dynamically creates BIDS Statistical Model specifications (with FitLins) to perform common mass univariate (GLM) analyses. To enhance and standardize reporting, it generates comprehensive reports that include design matrices, statistical maps, and COBIDAS-aligned reporting that is fully reproducible from the model specifications and derivatives. OpenNeuro FitLins GLM has been tested on over 30 datasets spanning 50+ unique fMRI tasks (e.g., working memory, social processing, emotion regulation, decision-making, motor paradigms), reducing analysis times from weeks to hours on high-performance computing systems, thereby enabling researchers to conduct robust single-study, meta-, and mega-analyses of task fMRI data with significantly improved accessibility, standardized reporting, and reproducibility.
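
The BIDS Statistical Model specifications the pipeline generates are plain JSON documents. A minimal run-level sketch is shown below; the field names follow the BIDS Stats Models schema, but the task, condition, and confound names are invented for illustration and are not from the talk:

```python
import json

# Illustrative BIDS Stats Models specification. Schema fields follow the
# BIDS Stats Models standard; task/condition/confound names are hypothetical.
model = {
    "Name": "example_nback_model",
    "BIDSModelVersion": "1.0.0",
    "Input": {"task": ["nback"]},
    "Nodes": [
        {
            "Level": "Run",
            "Name": "run_level",
            "GroupBy": ["run", "subject"],
            "Model": {
                # Design matrix columns: task regressors plus motion confounds.
                "X": ["trial_type.twoback", "trial_type.zeroback",
                      "trans_x", "trans_y", "trans_z"],
                "Type": "glm",
            },
            "Contrasts": [
                {
                    "Name": "twoback_vs_zeroback",
                    "ConditionList": ["trial_type.twoback",
                                      "trial_type.zeroback"],
                    "Weights": [1, -1],
                    "Test": "t",
                }
            ],
        }
    ],
}

spec = json.dumps(model, indent=2)
print(spec[:60])
```

FitLins consumes a specification of this shape together with a BIDS dataset and fMRIPrep derivatives to fit the run-level GLMs and compute the listed contrasts.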

SeminarNeuroscience

Understanding reward-guided learning using large-scale datasets

Kim Stachenfeld
DeepMind; Columbia University
Jul 9, 2025

Understanding the neural mechanisms of reward-guided learning is a long-standing goal of computational neuroscience. Recent methodological innovations enable us to collect ever larger neural and behavioral datasets. This presents opportunities to achieve greater understanding of learning in the brain at scale, as well as methodological challenges. In the first part of the talk, I will discuss our recent insights into the mechanisms by which zebra finch songbirds learn to sing. Dopamine has long been thought to guide reward-based trial-and-error learning by encoding reward prediction errors. However, it is unknown whether the learning of natural behaviours, such as developmental vocal learning, occurs through dopamine-based reinforcement. Longitudinal recordings of dopamine and bird songs reveal that dopamine activity is indeed consistent with encoding a reward prediction error during naturalistic learning. In the second part of the talk, I will talk about recent work we are doing at DeepMind to develop tools for automatically discovering interpretable models of behavior directly from animal choice data. Our method, dubbed CogFunSearch, uses LLMs within an evolutionary search process to "discover" novel models in the form of Python programs that excel at accurately predicting animal behavior during reward-guided learning. The discovered programs reveal novel patterns of learning and choice behavior that update our understanding of how the brain solves reinforcement learning problems.
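
The core loop of such program search can be caricatured in a few lines. The sketch below is not CogFunSearch: the LLM that proposes new programs is replaced by random perturbation of two parameters of a fixed Q-learning template, and the choice data are synthetic. But the structure is the same: score candidates by how well they predict the observed choices, keep the best, and mutate.

```python
import random
import math

random.seed(0)

# Toy stand-in for evolutionary model search on choice data. All task
# parameters (bandit probabilities, trial counts, mutation scales) are
# invented for illustration.

def simulate_choices(alpha, beta, rewards, n_trials=500):
    """Generate two-armed-bandit choices from a Q-learning agent."""
    q = [0.0, 0.0]
    choices = []
    for _ in range(n_trials):
        p1 = 1.0 / (1.0 + math.exp(-beta * (q[1] - q[0])))
        c = 1 if random.random() < p1 else 0
        r = 1.0 if random.random() < rewards[c] else 0.0
        q[c] += alpha * (r - q[c])
        choices.append((c, r))
    return choices

def log_likelihood(params, data):
    """Score a candidate (alpha, beta) by how well it predicts choices."""
    alpha, beta = params
    q = [0.0, 0.0]
    ll = 0.0
    for c, r in data:
        p1 = 1.0 / (1.0 + math.exp(-beta * (q[1] - q[0])))
        ll += math.log(p1 if c == 1 else 1.0 - p1)
        q[c] += alpha * (r - q[c])
    return ll

# Synthetic "animal" with hidden parameters the search must recover.
data = simulate_choices(alpha=0.3, beta=3.0, rewards=[0.2, 0.8])

# Evolutionary loop: keep the best candidate, propose mutated variants.
best = (0.5, 1.0)
best_ll = log_likelihood(best, data)
initial_ll = best_ll
for _ in range(200):
    cand = (min(1.0, max(0.01, best[0] + random.gauss(0, 0.1))),
            max(0.1, best[1] + random.gauss(0, 0.5)))
    ll = log_likelihood(cand, data)
    if ll > best_ll:
        best, best_ll = cand, ll

print(best, best_ll)
```

In the real method the "mutation" step is an LLM rewriting whole Python programs, so the search can change the model's structure, not just its parameters.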

SeminarNeuroscience

Neurobiological constraints on learning: bug or feature?

Cian O’Donnell
Ulster University
Jun 11, 2025

Understanding how brains learn requires bridging evidence across scales—from behaviour and neural circuits to cells, synapses, and molecules. In our work, we use computational modelling and data analysis to explore how the physical properties of neurons and neural circuits constrain learning. These include limits imposed by brain wiring, energy availability, molecular noise, and the 3D structure of dendritic spines. In this talk I will describe one such project testing whether wiring motifs from fly brain connectomes can improve the performance of reservoir computers, a type of recurrent neural network. The hope is that these insights into brain learning will lead to improved learning algorithms for artificial systems.
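
For reference, a reservoir computer can be built in a few lines. The sketch below uses a generic sparse random recurrent matrix where the project described above would substitute connectome-derived wiring motifs; the network size, sparsity, task (reproducing the input from a few steps back), and ridge penalty are all illustrative choices, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal echo state network ("reservoir computer"): a fixed random
# recurrent network whose only trained part is a linear readout.
N, T, delay = 200, 1000, 3
u = rng.uniform(-1, 1, T)                      # scalar input stream

# 10%-sparse random recurrent weights, rescaled so the spectral radius
# is below 1 (a common sufficient condition for fading memory).
W = rng.normal(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, N)

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])           # reservoir state update
    states[t] = x

# Task: at time t, output the input from `delay` steps ago.
X, y = states[delay:], u[:-delay]

# Ridge-regression readout, fit in closed form.
lam = 1e-4
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = X @ w_out
mse = np.mean((pred - y) ** 2)
baseline = np.mean((y - y.mean()) ** 2)        # predict-the-mean baseline
print(mse, baseline)
```

Swapping the random `W` for a connectome-constrained matrix while keeping everything else fixed is the kind of controlled comparison the project describes.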

SeminarNeuroscience

HealthCore: A modular data collection ecosystem to connect the dots in Neurorehab

Chris Awai
Lake Lucerne Institute, Switzerland
Jun 5, 2025

SeminarNeuroscience

Expanding mechanisms and therapeutic targets for neurodegenerative disease

Aaron D. Gitler
Department of Genetics, Stanford University
Jun 5, 2025

A hallmark pathological feature of the neurodegenerative diseases amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD) is the depletion of RNA-binding protein TDP-43 from the nucleus of neurons in the brain and spinal cord. A major function of TDP-43 is as a repressor of cryptic exon inclusion during RNA splicing. By re-analyzing RNA-sequencing datasets from human FTD/ALS brains, we discovered dozens of novel cryptic splicing events in important neuronal genes. Single nucleotide polymorphisms in UNC13A are among the strongest hits associated with FTD and ALS in human genome-wide association studies, but how those variants increase risk for disease is unknown. We discovered that TDP-43 represses a cryptic exon-splicing event in UNC13A. Loss of TDP-43 from the nucleus in human brain, neuronal cell lines and motor neurons derived from induced pluripotent stem cells resulted in the inclusion of a cryptic exon in UNC13A mRNA and reduced UNC13A protein expression. The top variants associated with FTD or ALS risk in humans are located in the intron harboring the cryptic exon, and we show that they increase UNC13A cryptic exon splicing in the face of TDP-43 dysfunction. Together, our data provide a direct functional link between one of the strongest genetic risk factors for FTD and ALS (UNC13A genetic variants) and loss of TDP-43 function. Recent analyses have revealed further changes in TDP-43 target genes, including widespread changes in alternative polyadenylation, impacting expression of disease-relevant genes (e.g., ELP1, NEFL, and TMEM106B) and providing evidence that alternative polyadenylation is a new facet of TDP-43 pathology.

SeminarNeuroscience

Neural mechanisms of optimal performance

Luca Mazzucato
University of Oregon
May 23, 2025

When we attend to a demanding task, our performance is poor at low arousal (when drowsy) or high arousal (when anxious), but we achieve optimal performance at intermediate arousal. This celebrated Yerkes-Dodson inverted-U law relating performance and arousal is colloquially referred to as being "in the zone." In this talk, I will elucidate the behavioral and neural mechanisms linking arousal and performance under the Yerkes-Dodson law in a mouse model. During decision-making tasks, mice express an array of discrete strategies, whereby the optimal strategy occurs at intermediate arousal, measured by pupil size, consistent with the inverted-U law. Population recordings from the auditory cortex (A1) further revealed that sound encoding is optimal at intermediate arousal. To explain the computational principle underlying this inverted-U law, we modeled the A1 circuit as a spiking network with excitatory/inhibitory clusters, based on the observed functional clusters in A1. Arousal induced a transition from a multi-attractor phase (low arousal) to a single-attractor phase (high arousal), and performance is optimized at the transition point. The model also predicts stimulus- and arousal-induced modulations of neural variability, which we confirmed in the data. Our theory suggests that a single unifying dynamical principle, phase transitions in metastable dynamics, underlies both the inverted-U law of optimal performance and state-dependent modulations of neural variability.

SeminarNeuroscience

Harnessing Big Data in Neuroscience: From Mapping Brain Connectivity to Predicting Traumatic Brain Injury

Franco Pestilli
University of Texas, Austin, USA
May 13, 2025

Neuroscience is experiencing unprecedented growth in dataset size both within individual brains and across populations. Large-scale, multimodal datasets are transforming our understanding of brain structure and function, creating opportunities to address previously unexplored questions. However, managing this increasing data volume requires new training and technology approaches. Modern data technologies are reshaping neuroscience by enabling researchers to tackle complex questions within a Ph.D. or postdoctoral timeframe. I will discuss cloud-based platforms, such as brainlife.io, that provide scalable, reproducible, and accessible computational infrastructure. Modern data technology can democratize neuroscience, accelerate discovery, and foster scientific transparency and collaboration. Concrete examples will illustrate how these technologies can be applied to mapping brain connectivity, studying human learning and development, and developing predictive models for traumatic brain injury (TBI). By integrating cloud computing and scalable data-sharing frameworks, neuroscience can become more impactful, inclusive, and data-driven.

SeminarNeuroscience

Cognitive maps as expectations learned across episodes – a model of the two dentate gyrus blades

Andrej Bicanski
Max Planck Institute for Human Cognitive and Brain Sciences
Mar 12, 2025

How can the hippocampal system transition from episodic one-shot learning to a multi-shot learning regime and what is the utility of the resultant neural representations? This talk will explore the role of the dentate gyrus (DG) anatomy in this context. The canonical DG model suggests it performs pattern separation. More recent experimental results challenge this standard model, suggesting DG function is more complex and also supports the precise binding of objects and events to space and the integration of information across episodes. Very recent studies attribute pattern separation and pattern integration to anatomically distinct parts of the DG (the suprapyramidal blade vs the infrapyramidal blade). We propose a computational model that investigates this distinction. In the model, the two processing streams (potentially localized in separate blades) contribute to the storage of distinct episodic memories, and the integration of information across episodes, respectively. The latter forms generalized expectations across episodes, eventually forming a cognitive map. We train the model with two datasets, MNIST and plausible entorhinal cortex inputs. The comparison between the two streams allows for the calculation of a prediction error, which can drive the storage of poorly predicted memories and the forgetting of well-predicted memories. We suggest that differential processing across the DG aids in the iterative construction of spatial cognitive maps to serve the generation of location-dependent expectations, while at the same time preserving episodic memory traces of idiosyncratic events.
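
The storage-gating idea can be illustrated with a deliberately minimal sketch: a running average stands in for the integrative stream's generalized expectation, and an episode is stored only when that expectation predicts it poorly. All dimensions, rates, and thresholds below are invented; the actual model is trained on MNIST and entorhinal-like inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy prediction-error-gated memory. One stream integrates inputs into a
# generalized "map" (running average, standing in for the integrative
# blade); the other stores separated episodic traces, but only when the
# map's prediction error is high.
dim, threshold, lr = 20, 1.0, 0.1
map_estimate = np.zeros(dim)
episodic_store = []

def present(pattern):
    """Update the map; store the episode only if poorly predicted."""
    global map_estimate
    error = np.linalg.norm(pattern - map_estimate)
    if error > threshold:
        episodic_store.append(pattern.copy())      # poorly predicted: store
    map_estimate += lr * (pattern - map_estimate)  # integrate regardless
    return error

prototype = rng.normal(0, 1, dim)
# Repeated similar episodes: errors shrink as the map learns the prototype,
# so later repetitions stop being stored.
errors = [present(prototype + rng.normal(0, 0.05, dim)) for _ in range(80)]
n_before = len(episodic_store)
# A genuinely novel episode is poorly predicted and gets stored.
present(-prototype)
print(errors[0], errors[-1], n_before, len(episodic_store))
```

Forgetting of well-predicted traces, the other half of the proposed mechanism, would correspond to pruning stored episodes whose error under the current map has fallen below the threshold.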

SeminarNeuroscienceRecording

Brain Emulation Challenge Workshop

Konrad Kording
Professor, University of Pennsylvania, Department of Neuroscience and Department of Bioengineering
Feb 21, 2025

The Brain Emulation Challenge workshop will tackle cutting-edge topics such as ground-truthing for validation, leveraging artificial datasets generated from virtual brain tissue, and the transformative potential of virtual brain platforms, as applied to the forthcoming Brain Emulation Challenge.

SeminarNeuroscienceRecording

Brain Emulation Challenge Workshop

Randal A. Koene
Co-Founder and Chief Science Officer, Carboncopies
Feb 21, 2025

The Brain Emulation Challenge workshop will tackle cutting-edge topics such as ground-truthing for validation, leveraging artificial datasets generated from virtual brain tissue, and the transformative potential of virtual brain platforms, as applied to the forthcoming Brain Emulation Challenge.

SeminarNeuroscienceRecording

Brain Emulation Challenge Workshop

Philip Shiu
Neuroscientist, EON Systems (an AI, cognitive science, and neurobiology company)
Feb 21, 2025

The Brain Emulation Challenge workshop will tackle cutting-edge topics such as ground-truthing for validation, leveraging artificial datasets generated from virtual brain tissue, and the transformative potential of virtual brain platforms, as applied to the forthcoming Brain Emulation Challenge.

SeminarNeuroscienceRecording

Brain Emulation Challenge Workshop

Razvan Marinescu
Assistant Professor, UC Santa Cruz, Department of Computer Science and Engineering
Feb 21, 2025

The Brain Emulation Challenge workshop will tackle cutting-edge topics such as ground-truthing for validation, leveraging artificial datasets generated from virtual brain tissue, and the transformative potential of virtual brain platforms, as applied to the forthcoming Brain Emulation Challenge.

SeminarNeuroscienceRecording

Brain Emulation Challenge Workshop

Janne K. Lappalainen
University of Tübingen and Max Planck Research School for Intelligent Systems
Feb 21, 2025

The Brain Emulation Challenge workshop will tackle cutting-edge topics such as ground-truthing for validation, leveraging artificial datasets generated from virtual brain tissue, and the transformative potential of virtual brain platforms, as applied to the forthcoming Brain Emulation Challenge.

SeminarNeuroscience

Structural & Functional Neuroplasticity in Children with Hemiplegia

Christos Papadelis
University of Texas at Arlington
Feb 21, 2025

About 30% of children with cerebral palsy have congenital hemiplegia, resulting from periventricular white matter injury, which impairs the use of one hand and disrupts bimanual coordination. Congenital hemiplegia has a profound effect on each child's life and is therefore of great importance to public health. Changes in brain organization (neuroplasticity) often occur following periventricular white matter injury. These changes vary widely depending on the timing, location, and extent of the injury, as well as the functional system involved. Currently, we have limited knowledge of neuroplasticity in children with congenital hemiplegia. As a result, we provide rehabilitation treatment to these children almost blindly, based exclusively on behavioral data. In this talk, I will present my team's recent research on neuroplasticity in children with congenital hemiplegia using a multimodal neuroimaging approach that combines data from structural and functional neuroimaging methods. I will further present preliminary data on functional improvements in upper-extremity motor and sensory functions resulting from rehabilitation with a robotic system that involves active participation of the child in a video-game setup. Our research is essential for the development of novel or improved neurological rehabilitation strategies for children with congenital hemiplegia.

SeminarNeuroscience

Sensory cognition

SueYeon Chung, Srini Turaga
New York University; Janelia Research Campus
Nov 29, 2024

This webinar features presentations from SueYeon Chung (New York University) and Srinivas Turaga (HHMI Janelia Research Campus) on theoretical and computational approaches to sensory cognition. Chung introduced a “neural manifold” framework to capture how high-dimensional neural activity is structured into meaningful manifolds reflecting object representations. She demonstrated that manifold geometry—shaped by radius, dimensionality, and correlations—directly governs a population’s capacity for classifying or separating stimuli under nuisance variations. Applying these ideas as a data analysis tool, she showed how measuring object-manifold geometry can explain transformations along the ventral visual stream and suggested that manifold principles also yield better self-supervised neural network models resembling mammalian visual cortex. Turaga described simulating the entire fruit fly visual pathway using its connectome, modeling 64 key cell types in the optic lobe. His team’s systematic approach—combining sparse connectivity from electron microscopy with simple dynamical parameters—recapitulated known motion-selective responses and produced novel testable predictions. Together, these studies underscore the power of combining connectomic detail, task objectives, and geometric theories to unravel neural computations bridging from stimuli to cognitive functions.

SeminarNeuroscience

Learning and Memory

Nicolas Brunel, Ashok Litwin-Kumar, Julijana Gjorgjieva
Duke University; Columbia University; Technical University Munich
Nov 29, 2024

This webinar on learning and memory features three experts—Nicolas Brunel, Ashok Litwin-Kumar, and Julijana Gjorgjieva—who present theoretical and computational approaches to understanding how neural circuits acquire and store information across different scales. Brunel discusses calcium-based plasticity and how standard “Hebbian-like” plasticity rules inferred from in vitro or in vivo datasets constrain synaptic dynamics, aligning with classical observations (e.g., STDP) and explaining how synaptic connectivity shapes memory. Litwin-Kumar explores insights from the fruit fly connectome, emphasizing how the mushroom body—a key site for associative learning—implements a high-dimensional, random representation of sensory features. Convergent dopaminergic inputs gate plasticity, reflecting a high-dimensional “critic” that refines behavior. Feedback loops within the mushroom body further reveal sophisticated interactions between learning signals and action selection. Gjorgjieva examines how activity-dependent plasticity rules shape circuitry from the subcellular (e.g., synaptic clustering on dendrites) to the cortical network level. She demonstrates how spontaneous activity during development, Hebbian competition, and inhibitory-excitatory balance collectively establish connectivity motifs responsible for key computations such as response normalization.

SeminarNeuroscience

Decision and Behavior

Sam Gershman, Jonathan Pillow, Kenji Doya
Harvard University; Princeton University; Okinawa Institute of Science and Technology
Nov 29, 2024

This webinar addressed computational perspectives on how animals and humans make decisions, spanning normative, descriptive, and mechanistic models. Sam Gershman (Harvard) presented a capacity-limited reinforcement learning framework in which policies are compressed under an information bottleneck constraint. This approach predicts pervasive perseveration, stimulus‐independent “default” actions, and trade-offs between complexity and reward. Such policy compression reconciles observed action stochasticity and response time patterns with an optimal balance between learning capacity and performance. Jonathan Pillow (Princeton) discussed flexible descriptive models for tracking time-varying policies in animals. He introduced dynamic Generalized Linear Models (Sidetrack) and hidden Markov models (GLM-HMMs) that capture day-to-day and trial-to-trial fluctuations in choice behavior, including abrupt switches between “engaged” and “disengaged” states. These models provide new insights into how animals’ strategies evolve under learning. Finally, Kenji Doya (OIST) highlighted the importance of unifying reinforcement learning with Bayesian inference, exploring how cortical-basal ganglia networks might implement model-based and model-free strategies. He also described Japan’s Brain/MINDS 2.0 and Digital Brain initiatives, aiming to integrate multimodal data and computational principles into cohesive “digital brains.”
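The policy-compression idea can be sketched as a rate-distortion-style iteration: the policy is a softmax over values with the policy's own action marginal as prior, so that low capacity pulls choices toward a state-independent default. This is an illustrative reconstruction under invented values `Q` and trade-off `beta`, with a uniform state distribution assumed; it is not Gershman's code:

```python
import numpy as np

def compressed_policy(Q, beta, n_iter=200):
    """Blahut-Arimoto-style iteration for a capacity-limited policy.

    Finds p(a|s) trading off E[Q] against I(S;A) for uniform p(s):
    p(a|s) ∝ p(a) exp(beta * Q[s, a]), with p(a) the policy's own marginal.
    Small beta yields a nearly state-independent "default" policy.
    """
    n_s, n_a = Q.shape
    marginal = np.full(n_a, 1.0 / n_a)
    for _ in range(n_iter):
        logits = np.log(marginal) + beta * Q
        policy = np.exp(logits - logits.max(axis=1, keepdims=True))
        policy /= policy.sum(axis=1, keepdims=True)
        marginal = policy.mean(axis=0)
    return policy

def mutual_info(policy):
    """I(S;A) in nats, assuming uniform p(s)."""
    marginal = policy.mean(axis=0)
    ratio = np.where(policy > 0, policy / marginal, 1.0)
    return float(np.mean(np.sum(policy * np.log(ratio), axis=1)))

Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
loose = compressed_policy(Q, beta=10.0)  # high capacity: near-greedy
tight = compressed_policy(Q, beta=0.1)   # low capacity: near-uniform default
print(mutual_info(loose), mutual_info(tight))  # high vs near-zero I(S;A)
```

The low-beta policy barely depends on the state, which is one way perseveration and stimulus-independent default actions fall out of an information-bottleneck constraint.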

SeminarNeuroscience

The role of real-world data in scientific evidence: Experiences from the Danish Multiple Sclerosis Registry

Melinda Magyari
Danish Multiple Sclerosis Center
Nov 21, 2024

SeminarNeuroscience

Brain-Wide Compositionality and Learning Dynamics in Biological Agents

Kanaka Rajan
Harvard Medical School
Nov 13, 2024

Biological agents continually reconcile the internal states of their brain circuits with incoming sensory and environmental evidence to evaluate when and how to act. The brains of biological agents, including animals and humans, exploit many evolutionary innovations, chiefly modularity—observable at the level of anatomically defined brain regions, cortical layers, and cell types, among others—that can be repurposed in a compositional manner to endow the animal with a highly flexible behavioral repertoire. Accordingly, their behaviors show their own modularity, yet such behavioral modules seldom correspond directly to traditional notions of modularity in brains. It remains unclear how to link neural and behavioral modularity in a compositional manner. We propose a comprehensive framework—compositional modes—to identify overarching compositionality spanning specialized submodules, such as brain regions. Our framework directly links the behavioral repertoire with distributed patterns of population activity, brain-wide, at multiple concurrent spatial and temporal scales. Using whole-brain recordings in zebrafish, we introduce an unsupervised pipeline based on neural network models, constrained by biological data, to reveal highly conserved compositional modes across individuals despite the naturalistic (spontaneous or task-independent) nature of their behaviors. These modes provide a scaffolding for other modes that account for the idiosyncratic behavior of each fish. We then demonstrate experimentally that compositional modes can be manipulated in a consistent manner by behavioral and pharmacological perturbations. Our results demonstrate that even natural behavior in different individuals can be decomposed and understood using a relatively small number of neurobehavioral modules—the compositional modes—and elucidate a compositional neural basis of behavior.
This approach aligns with recent progress in understanding how reasoning capabilities and internal representational structures develop over the course of learning or training, offering insights into the modularity and flexibility in artificial and biological agents.

SeminarNeuroscience

Use case determines the validity of neural systems comparisons

Erin Grant
Gatsby Computational Neuroscience Unit & Sainsbury Wellcome Centre at University College London
Oct 16, 2024

Deep learning provides new data-driven tools to relate neural activity to perception and cognition, aiding scientists in developing theories of neural computation that increasingly resemble biological systems both at the level of behavior and of neural activity. But what in a deep neural network should correspond to what in a biological system? This question is addressed implicitly in the use of comparison measures that relate specific neural or behavioral dimensions via a particular functional form. However, distinct comparison methodologies can give conflicting results in recovering even a known ground-truth model in an idealized setting, leaving open the question of what to conclude from the outcome of a systems comparison using any given methodology. Here, we develop a framework to make explicit and quantitative the effect of both hypothesis-driven aspects—such as details of the architecture of a deep neural network—and methodological choices in a systems comparison setting. We demonstrate via the learning dynamics of deep neural networks that, while the role of the comparison methodology is often de-emphasized relative to hypothesis-driven aspects, this choice can impact and even invert the conclusions to be drawn from a comparison between neural systems. We provide evidence that the right way to adjudicate a comparison depends on the use case—the scientific hypothesis under investigation—which could range from identifying single-neuron or circuit-level correspondences to capturing generalizability to new stimulus properties.
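As an example of how a comparison measure bakes in invariances, linear centered kernel alignment (CKA) scores two response matrices as identical under any rotation of the unit axes, a property that may or may not match the scientific use case. The matrices below are synthetic, and this is one standard measure rather than the framework developed in the talk:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear centered kernel alignment between response matrices.

    X: (n_stimuli, n_units_x), Y: (n_stimuli, n_units_y).
    Returns a similarity in [0, 1] that is invariant to orthogonal
    transformations and isotropic scaling of either representation.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    cross = np.linalg.norm(X.T @ Y, 'fro') ** 2
    return cross / (np.linalg.norm(X.T @ X, 'fro') *
                    np.linalg.norm(Y.T @ Y, 'fro'))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
R, _ = np.linalg.qr(rng.normal(size=(20, 20)))  # random rotation
Y_rot = X @ R                                    # same geometry, new axes
Y_rand = rng.normal(size=(100, 20))              # unrelated responses
print(linear_cka(X, Y_rot))   # ≈ 1.0: rotation is invisible to CKA
print(linear_cka(X, Y_rand))  # well below 1 for unrelated responses
```

A use case that cares about single-neuron correspondences would need a measure without this rotation invariance, which is exactly the kind of methodological choice the talk argues can invert conclusions.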

SeminarNeuroscience

Localisation of Seizure Onset Zone in Epilepsy Using Time Series Analysis of Intracranial Data

Hamid Karimi-Rouzbahani
The University of Queensland
Oct 11, 2024

There are over 30 million people with drug-resistant epilepsy worldwide. When neuroimaging and non-invasive neural recordings fail to localise seizure onset zones (SOZ), intracranial recordings become the best chance for localisation and seizure freedom in those patients. However, intracranial neural activities remain hard to discriminate visually across recording channels, which limits the success of visual investigation of intracranial data. In this presentation, I will describe methods that quantify intracranial neural time series and combine them with explainable machine learning algorithms to localise the SOZ in the epileptic brain. I will discuss the potential and limitations of our methods for SOZ localisation, providing insights for future research in this area.
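A minimal version of quantifying intracranial time series is to score each channel with a simple feature, such as line length, and rank channels by it. This is a toy stand-in for the multi-feature, explainable-ML pipeline described in the talk, and the synthetic recordings are invented:

```python
import numpy as np

def line_length(x):
    """Line length: mean absolute sample-to-sample change, a common
    feature for quantifying epileptiform activity in intracranial EEG."""
    return np.abs(np.diff(x)).mean()

def rank_channels(recordings):
    """Rank channels by line length, descending.

    recordings: (n_channels, n_samples) array; channels with large, fast
    fluctuations score highest.
    """
    scores = np.array([line_length(ch) for ch in recordings])
    return np.argsort(scores)[::-1], scores

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1000)
recordings = rng.normal(scale=0.5, size=(4, t.size))
recordings[2] += 3 * np.sin(2 * np.pi * 80 * t)  # fast high-amplitude activity
order, scores = rank_channels(recordings)
print(order[0])  # → 2: the channel with injected fast activity ranks first
```

Real pipelines combine many such features across time and frequency and use the model's explanations to justify which channels are flagged as putative SOZ.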

SeminarNeuroscience

Probing neural population dynamics with recurrent neural networks

Chethan Pandarinath
Emory University and Georgia Tech
Jun 12, 2024

Large-scale recordings of neural activity are providing new opportunities to study network-level dynamics with unprecedented detail. However, the sheer volume of data and its dynamical complexity are major barriers to uncovering and interpreting these dynamics. I will present latent factor analysis via dynamical systems, a sequential autoencoding approach that enables inference of dynamics from neuronal population spiking activity on single trials and millisecond timescales. I will also discuss recent adaptations of the method to uncover dynamics from neural activity recorded via 2P Calcium imaging. Finally, time permitting, I will mention recent efforts to improve the interpretability of deep-learning based dynamical systems models.

SeminarNeuroscienceRecording

Characterizing the causal role of large-scale network interactions in supporting complex cognition

Michal Ramot
Weizmann Inst. of Science
May 7, 2024

Neuroimaging has greatly extended our capacity to study the workings of the human brain. Despite the wealth of knowledge this tool has generated, however, there are still critical gaps in our understanding. While tremendous progress has been made in mapping areas of the brain that are specialized for particular stimuli or cognitive processes, we still know very little about how large-scale interactions between different cortical networks facilitate the integration of information and the execution of complex tasks. Yet even the simplest behavioral tasks are complex, requiring integration over multiple cognitive domains. Our knowledge falls short not only in understanding how this integration takes place, but also in what drives the profound variation in behavior that can be observed on almost every task, even within the typically developing (TD) population. The search for the neural underpinnings of individual differences is important not only philosophically, but also in the service of precision medicine. We approach these questions using a three-pronged approach. First, we create a battery of behavioral tasks from which we can calculate objective measures for different aspects of the behaviors of interest, with sufficient variance across the TD population. Second, using these individual differences in behavior, we identify the neural variance which explains the behavioral variance at the network level. Finally, using covert neurofeedback, we perturb the networks hypothesized to correspond to each of these components, thus directly testing their causal contribution. I will discuss our overall approach, as well as a few of the new directions we are currently pursuing.

SeminarNeuroscience

Generative models for video games

Katja Hofmann
Microsoft Research
May 1, 2024

Developing agents capable of modeling complex environments and human behaviors within them is a key goal of artificial intelligence research. Progress towards this goal has exciting potential for applications in video games, from new tools that empower game developers to realize new creative visions, to enabling new kinds of immersive player experiences. This talk focuses on recent advances of my team at Microsoft Research towards scalable machine learning architectures that effectively capture human gameplay data. In the first part of my talk, I will focus on diffusion models as generative models of human behavior. Diffusion models were previously shown to have impressive image generation capabilities; I present insights that unlock their application to imitation learning for sequential decision making. In the second part of my talk, I discuss a recent project taking ideas from language modeling to build a generative sequence model of an Xbox game.

SeminarNeuroscience

Roles of inhibition in stabilizing and shaping the response of cortical networks

Nicolas Brunel
Duke University
Apr 5, 2024

Inhibition has long been thought to stabilize the activity of cortical networks at low rates, and to substantially shape their responses to sensory inputs. In this talk, I will describe three recent collaborative projects that shed light on these issues. (1) I will show how optogenetic excitation of inhibitory neurons is consistent with cortex being inhibition-stabilized even in the absence of sensory inputs, and how these data can constrain the coupling strengths of E-I cortical network models. (2) Recent analysis of the effects of optogenetic excitation of pyramidal cells in V1 of mice and monkeys shows that in some cases this optogenetic input reshuffles the firing rates of neurons of the network, leaving the distribution of rates unaffected. I will show how this surprising effect can be reproduced in sufficiently strongly coupled E-I networks. (3) Another puzzle has been to understand the respective roles of different inhibitory subtypes in network stabilization. Recent data reveal a novel, state-dependent, paradoxical effect of weakening AMPAR-mediated synaptic currents onto SST cells. Mathematical analysis of a network model with multiple inhibitory cell types shows that this effect tells us in which conditions SST cells are required for network stabilization.
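The inhibition-stabilized regime in point (1) can be illustrated with a two-population rate model: when recurrent excitation is strong enough to be unstable on its own, extra drive to the inhibitory population paradoxically lowers its steady-state rate. The parameters below are illustrative, not fitted to the data discussed:

```python
import numpy as np

def ei_steady_state(ext_e, ext_i, w, n_steps=20000, dt=0.1):
    """Steady state of a 2-population rate model:
    tau * dr/dt = -r + [w @ r + ext]_+, integrated by forward Euler.
    w = [[w_ee, -w_ei], [w_ie, -w_ii]] mixes E (index 0) and I (index 1).
    """
    r = np.zeros(2)
    tau = np.array([10.0, 5.0])           # E and I time constants (ms)
    ext = np.array([ext_e, ext_i])
    for _ in range(n_steps):
        drive = w @ r + ext
        r = r + dt / tau * (-r + np.maximum(drive, 0.0))
    return r

# w_ee > 1: recurrent excitation alone is unstable, so the network is
# inhibition-stabilized.
w = np.array([[2.0, -2.5],
              [3.0, -2.0]])
base = ei_steady_state(2.0, 1.0, w)
stim = ei_steady_state(2.0, 2.0, w)  # extra drive to the I population
# Paradoxical effect: exciting I cells *lowers* their steady-state rate
# (here from about 1.11 to about 0.89).
print(base[1], stim[1])
```

This paradoxical sign flip is the signature used to infer inhibition stabilization from optogenetic experiments.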

ePosterNeuroscience

Inferring stochastic low-rank recurrent neural networks from neural data

Matthijs Pals, A Sağtekin, Felix Pei, Manuel Gloeckler, Jakob Macke

Bernstein Conference 2024

ePosterNeuroscience

Latent Diffusion for Neural Spiking Data

Auguste Schulz, Jaivardhan Kapoor, Julius Vetter, Felix Pei, Richard Gao, Jakob Macke

Bernstein Conference 2024

ePosterNeuroscience

Model Selection in Sensory Data Interpretation

Francesco Guido Rinaldi, Eugenio Piasini

Bernstein Conference 2024

ePosterNeuroscience

NeuroTask: A Benchmark Dataset for Multi-Task Neural Analysis

A. Filipe, Il Park

Bernstein Conference 2024

ePosterNeuroscience

Open-source solutions for research data management in neuroscience collaborations

Reema Gupta, Thomas Wachtler

Bernstein Conference 2024

ePosterNeuroscience

Single-cell morphological data provide refined simulations of resting-state

Penghao Qian, Linus Manubens-Gil, Hanchuan Peng

Bernstein Conference 2024

ePosterNeuroscience

Towards predicting Stroke Etiology from MRI and CT Imaging Data of Ischemic Stroke Patients

Beatrice Guastella, Steffen Tiedt, Hannah Spitzer

Bernstein Conference 2024

ePosterNeuroscience

Tracking the provenance of data generation and analysis in NEST simulations

Cristiano Köhler, Moritz Kern, Sonja Grün, Michael Denker

Bernstein Conference 2024

ePosterNeuroscience

Unified C. elegans Neural Activity and Connectivity Datasets for Building Foundation Models of a Small Nervous System

Quilee Simeon, Anshul Kashyap, Konrad Kording, Ed Boyden

Bernstein Conference 2024

ePosterNeuroscience

AutSim: Principled, data driven model development and abstraction for signaling in synaptic protein synthesis in Fragile X Syndrome (FXS) and healthy control.

Nisha Viswan, Upinder Bhalla

COSYNE 2022

ePosterNeuroscience

Data-driven dynamical systems model of epilepsy development simulates intervention strategies

Danylo Batulin, Fereshteh Lagzi, Annamaria Vezzani, Peter Jedlicka, Jochen Triesch

COSYNE 2022

ePosterNeuroscience

Emergence of time persistence in an interpretable data-driven neural network model

Sebastien Wolf, Guillaume Le Goc, Georges Debregeas, Simona Cocco, Rémi Monasson

COSYNE 2022

ePosterNeuroscience

Fast inter-subject alignment method for large datasets shows fine-grained cortical reorganisations

Alexis Thual, Huy Tran, Bertrand Thirion, Stanislas Dehaene

COSYNE 2022

ePosterNeuroscience

Gaussian Partial Information Decomposition: Quantifying Inter-areal Interactions in High-Dimensional Neural Data

Praveen Venkatesh, Gabriel Schamberg, Adrienne Fairhall, Shawn Olsen, Stefan Mihalas, Christof Koch

COSYNE 2022

ePosterNeuroscience

A high-throughput pipeline for evaluating recurrent neural networks on multiple datasets

Moufan Li, Nathan Cloos, Xun Yuan, Guangyu Robert Yang, Christopher J. Cueva

COSYNE 2022

ePosterNeuroscience

Identifying latent states in decision-making from cortical inactivation data

Zeinab Mohammadi, Zoe C. Ashwood, Lucas Pinto, David W. Tank, Carlos D. Brody, Jonathan Pillow

COSYNE 2022

ePosterNeuroscience

Identifying key structural connections from functional response data: theory & applications

Tirthabir Biswas, Tianzhi Lambus, James Fitzgerald

COSYNE 2022

ePosterNeuroscience

Inferring olfactory space from glomerular response data

Yakov Berchenko-Kogan, Min-Chun Wu, Matt Wachowiak, Vladimir Itskov

COSYNE 2022

ePosterNeuroscience

Inter-individual alignment and single-trial classification of MEG data using M-CCA

Leo Michalke, Jochem Rieger

COSYNE 2022

ePosterNeuroscience

An accessible hippocampal dataset for benchmarking models of cognitive mapping

Alexandra Keinath, Justin Quinn Lee, Mark Brandon

COSYNE 2023

ePosterNeuroscience

Augmented Gaussian process variational autoencoders for multi-modal experimental data

Rabia Gondur, Evan Schaffer, Mikio Aoi, Stephen Keeley

COSYNE 2023

ePosterNeuroscience

Automated identification of data-consistent spiking neural network models

Richard Gao, Michael Deistler, Jakob Macke

COSYNE 2023

ePosterNeuroscience

A Bayesian hierarchical latent variable model for spike train data analysis

Josefina Correa Menendez, Earl Miller, Emery Brown

COSYNE 2023

ePosterNeuroscience

Data-driven discovery of long timescale behavioral strategies during sensory evoked locomotion

Gautam Sridhar, Antonio Costa, Massimo Vergassola, Claire Wyart

COSYNE 2023

ePosterNeuroscience

Disentangling input dynamics from intrinsic neural dynamics in modeling of neural-behavioral data

Parsa Vahidi, Omid Sani, Maryam Shanechi

COSYNE 2023

ePosterNeuroscience

Fitting normative neural sampling hypothesis models to neuronal response data

Suhas Shrinivasan, Andreas Tolias, Edgar Y. Walker, Fabian Sinz

COSYNE 2023

ePosterNeuroscience

Improved estimation of latent variable models from calcium imaging data

David Zoltowski, Adam Charles, Jonathan W. Pillow, Stephen Keeley

COSYNE 2023

ePosterNeuroscience

A Large Dataset of Macaque V1 Responses to Natural Images Revealed Complexity in V1 Neural Codes

Shang Gao, Tianye Wang, Xie Jue, Daniel Wang, Tai Sing Lee, Shiming Tang

COSYNE 2023

ePosterNeuroscience

maskNMF: a denoise-sparsen-detect pipeline for demixing dense imaging data faster than real time

Amol Pasarkar, Liam Paninski, Pengcheng Zhou, Melissa Wu, Ian Kinsella, Daisong Pan, Jiang Lan Fan, Zhen Wang, Lamiae Abdeladim, Darcy Peterka, Hillel Adesnik, Na Ji

COSYNE 2023

ePosterNeuroscience

A Method for Testing Bayesian Models Using Neural Data

Gabor Lengyel, Sabyasachi Shivkumar, Ralf Haefner

COSYNE 2023

ePosterNeuroscience

Neuroformer: A Transformer Framework for Multimodal Neural Data Analysis

Antonis Antoniades, Yiyi Yu, Spencer LaVere Smith

COSYNE 2023

ePosterNeuroscience

Responses to inconsistent stimuli in pyramidal neurons: An open science dataset

Colleen J. Gillon, Jérôme A. Lecoq, Jason E. Pina, Timothy M. Henley, Yazan N. Billeh, Shiella Caldejon, Jed Perkins, Matthew T. Valley, Ali Williford, Yoshua Bengio, Timothy Lillicrap, Joel Zylberberg, Blake A. Richards

COSYNE 2023

ePosterNeuroscience

Robust multiband drift estimation in electrophysiology data

Charlie Windolf, Angelique C Paulk, Yoav Kfir, Eric Trautmann, Samuel Garcia, Domokos Meszéna, William Munoz, Irene Caprara, Mohsen Jamali, Julien Boussard, Ziv Williams, Sydney Cash, Liam Paninski, Erdem Varol

COSYNE 2023

ePosterNeuroscience

Functional inter-subject alignment of MEG data outperforms anatomical alignment

Leo Michalke, Jochem Rieger

Bernstein Conference 2024
