Topic · Neuro

reservoir computing

4 ePosters · 1 Position · 1 Seminar

Latest

Position · Neuroscience

Ann Kennedy

The Scripps Research Institute
San Diego, CA
Jan 12, 2026

The Kennedy lab is recruiting for multiple funded postdoctoral positions in theoretical and computational neuroscience, following our recent lab move to Scripps Research in San Diego, CA! Ongoing projects in the lab span reservoir computing with heterogeneous cell types; reinforcement learning and control theory analyses of complex behavior; neuromechanical whole-organism modeling; diffusion models for imitation learning and forecasting of mouse social interactions; and joint analysis and modeling of how internal states shape neural, vocalization, and behavioral data. Additional NIH and foundation funding supports characterizing the progression of behavioral phenotypes in Parkinson's disease, modeling the cellular and circuit mechanisms underlying internal state-dependent changes in neural population dynamics, and characterizing neural correlates of social relationships across species. Projects are flexible and can be tailored to applicants' research and training goals, and there are abundant opportunities for new collaborations with local experimental groups. San Diego has a fantastic research community and a very high quality of life. Our campus is located on the Pacific coast, at the northern edge of UCSD and not far from the Salk Institute. Postdoctoral stipends are well above NIH guidelines and include a relocation bonus, with research professorship positions available for qualified applicants.

Seminar · Neuroscience · Recording

Heterogeneity and non-random connectivity in reservoir computing

Abigail Morrison
Jülich Research Centre & RWTH Aachen University, Germany
Jun 1, 2022

Reservoir computing is a promising framework for studying cortical computation, as it is based on continuous, online processing, and its requirements and operating principles are compatible with cortical circuit dynamics. However, the framework has issues that limit its scope as a generic model for cortical processing. The most obvious of these is that, in traditional models, learning is restricted to the output projections and takes place in a fully supervised manner. If such an output layer is interpreted at face value as downstream computation, this is biologically questionable. If it is interpreted merely as a demonstration that the network can accurately represent the information, this immediately raises the question of what would be biologically plausible mechanisms for transmitting the information represented by a reservoir and incorporating it in downstream computations. Another major issue is that we have as yet only modest insight into how the structural and dynamical features of a network influence its computational capacity; such insight is necessary not only for understanding those features in biological brains, but also for exploiting reservoir computing in neuromorphic applications. In this talk, I will first demonstrate a method for quantifying the representational capacity of reservoirs without training them on tasks. Based on this technique, which allows systematic comparison of systems, I will then present our recent work towards understanding the roles of heterogeneity and connectivity patterns in enhancing both the computational properties of a network and its ability to reliably transmit information to downstream networks. Finally, I will give a brief taster of our current efforts to apply the reservoir computing framework to magnetic systems as an approach to neuromorphic computing.
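
For context on the "traditional models" the abstract critiques, below is a minimal echo state network sketch in Python/NumPy. It is an illustrative toy, not the speaker's setup: the reservoir size, spectral radius, input scaling, and delayed-recall task are all assumptions made for the demo. What it makes concrete is the framework's defining constraint, that the input and recurrent weights stay fixed while supervised learning touches only the linear readout.

import numpy as np

rng = np.random.default_rng(seed=0)
N, T, washout = 200, 2000, 200  # reservoir size, timesteps, discarded transient

# Fixed random weights: neither the input projection nor the recurrent
# matrix is ever trained in the traditional framework.
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

# Drive the reservoir with white-noise input; collect the state trajectory.
u = rng.uniform(-1.0, 1.0, size=T)
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Toy task: reproduce the input delayed by 5 steps (washout removes the
# initial transient, including the wrap-around of np.roll).
target = np.roll(u, 5)
X, y = states[washout:], target[washout:]

# The ONLY learning step: ridge regression on the readout weights.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
print("readout MSE:", np.mean((X @ W_out - y) ** 2))

The two biological objections raised in the abstract map directly onto this sketch: the fit of W_out is fully supervised, and nothing in the model specifies how a downstream circuit could plausibly learn from, or even access, the reservoir's state.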

ePoster · Neuroscience

One-shot learning of paired associations by a reservoir computing model with Hebbian plasticity

M Ganesh Kumar, Cheston Tan, Camilo Libedinsky, Shih-Cheng Yen, Andrew Tan

COSYNE 2022

ePoster · Neuroscience

Slow transition to chaos and robust reservoir computing in recurrent neural networks with heavy-tailed distributed synaptic weights

Yi Xie, Stefan Mihalas, Lukasz Kusmierz

COSYNE 2025

ePoster · Neuroscience

Sparse neural engagement in connectome-based reservoir computing networks

James McAllister, John Wade, Conor Houghton, Cian O'Donnell

COSYNE 2025

ePoster · Neuroscience

Reservoir computing using cultured neuronal networks with modular topology

Takuma Sumi, Hideaki Yamamoto, Yuichi Katori, Koki Ito, Hideyuki Kato, Hayato Chiba, Shigeo Sato, Ayumi Hirano-Iwata

FENS Forum 2024

reservoir computing coverage

6 items

ePoster: 4
Position: 1
Seminar: 1