Reservoir Computing

Topic spotlight

Discover seminars, jobs, and research tagged with reservoir computing across World Wide.

Position · Computational Neuroscience

Dr. Fleur Zeldenrust

Donders Institute for Brain, Cognition and Behaviour
Nijmegen, the Netherlands
Dec 5, 2025

We are looking for a postdoctoral researcher to study the effects of neuromodulators in biologically realistic networks and learning tasks in the Vidi project 'Top-down neuromodulation and bottom-up network computation, a computational study'. You will use cellular and behavioural data on dopamine, acetylcholine and serotonin in mouse barrel cortex, gathered by our department over the past five years, to bridge the gap between single-cell, network and behavioural effects. The aim of this project is to explain the effects of neuromodulation on task performance in biologically realistic spiking recurrent neural networks (SRNNs). You will use biologically realistic learning frameworks, such as FORCE learning, to study how network structure influences task performance. You will use existing open-source data to train an SRNN on a pole-detection task (in which rodents use their whiskers to detect a pole) and incorporate realistic network properties of the (barrel) cortex based on our lab's measurements. Next, you will incorporate the cellular effects of dopamine, acetylcholine and serotonin that we have measured into the network, and investigate their effects on task performance. In particular, you will investigate the effects of biologically realistic network properties (the balance between excitation and inhibition and the resulting chaotic activity, non-linear neuronal input-output relations, patterns in connectivity, Dale's law) and incorporate known neuron and network effects. You will build on the single-cell data, network models and analysis methods available in our group, and your results will feed into our group's further research to develop and validate efficient coding models of (somatosensory) perception. We are therefore looking for a team player who can collaborate well with the other group members and is willing both to learn from them and to share their knowledge.
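
Since the ad names FORCE learning as one candidate training framework, here is a minimal, hypothetical rate-based sketch of FORCE: a recursive-least-squares update of a linear readout fed back into a random recurrent network, trained on a toy target signal. It is only an illustration and does not model the spiking dynamics, cortical structure, or neuromodulation the project targets; network size, gain, and the target are arbitrary placeholder choices.

```python
# Minimal rate-based FORCE-learning sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
N = 300                 # reservoir size (arbitrary)
dt, tau = 1e-3, 10e-3   # Euler step and neuronal time constant
g, p = 1.5, 0.1         # recurrent gain and connection probability

# sparse random recurrent weights scaled so spontaneous activity is rich
J = g * rng.standard_normal((N, N)) * (rng.random((N, N)) < p) / np.sqrt(p * N)
w_fb = rng.uniform(-1, 1, N)   # feedback weights from the readout to the network
w_out = np.zeros(N)            # readout weights, trained online by RLS
P = np.eye(N)                  # inverse correlation matrix for RLS

T = 5000
f_target = np.sin(2 * np.pi * 2.0 * np.arange(T) * dt)  # toy target: 2 Hz sinusoid

x = 0.5 * rng.standard_normal(N)  # membrane state
r = np.tanh(x)                    # firing rates
z = 0.0                           # readout output

for t in range(T):
    x += dt / tau * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = float(w_out @ r)
    if t % 2 == 0:                        # RLS / FORCE update every other step
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)
        P -= np.outer(k, Pr)
        w_out = w_out + (f_target[t] - z) * k

print("final |error|:", abs(f_target[-1] - w_out @ r))
```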

Position · Computational Neuroscience

Joni Dambre

Ghent University
Ghent, Belgium
Dec 5, 2025

You will be enrolled at Ghent University for a PhD in Computer Science Engineering, but your research will be highly interdisciplinary: you will need to combine an in-depth understanding of biological learning, artificial learning, and the efficiency of its hardware implementations. As a PhD student at Ghent University, you will collaborate with enthusiastic colleagues at IDLab-AIRO (https://airo.ugent.be/research/) and with our international partners in the SmartNets project (https://www.smartnets-etn.eu/). As an Early Stage Researcher (ESR) in the SmartNets network, you will form an active training network with the other ESRs in the project, and you are required to spend part of your PhD time (roughly two three-month stays) with some of our partners. For the complete vacancy, visit: https://www.ugent.be/ea/idlab/en/news-events/news/vacancy-phd-biologically-inspired-feature-learning.htm

Position

Xavier Hinaut

Inria Bordeaux & Institute for Neurodegenerative diseases
Inria Bordeaux & Institute for Neurodegenerative diseases (Pellegrin Hospital Campus, Bordeaux)
Dec 5, 2025

This PhD thesis is part of the BrainGPT 'Inria Exploratory Action' project. The main ambition of the BrainGPT project is to combine the explainability of mechanistic models with the predictive power of Transformers to analyze brain imaging data. The thesis will mainly consist of developing new bio-inspired models that draw on the mechanisms, learning methods, and emergent behaviors of Large Language Models (LLMs) and Transformers. These models will then be evaluated on their ability to predict the brain activity measured in imaging data.
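
As a rough, hypothetical illustration of one common way such predictions are scored (a linear encoding model, which is an assumption here, not a description of the BrainGPT pipeline), the sketch below regresses synthetic "model features" onto synthetic voxel responses with ridge regression and reports held-out correlation; all array shapes and the penalty value are arbitrary.

```python
# Hypothetical encoding-model sketch: ridge regression from model features to
# measured responses, scored by held-out correlation. All data are random stand-ins.
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_features, n_voxels = 400, 64, 200

features = rng.standard_normal((n_samples, n_features))          # model activations per stimulus
responses = features @ rng.standard_normal((n_features, n_voxels)) \
            + 0.5 * rng.standard_normal((n_samples, n_voxels))   # synthetic "brain" responses

train, test = slice(0, 300), slice(300, None)
lam = 1.0                                                        # ridge penalty (untuned)

# closed-form ridge solution: B = (X'X + lam I)^-1 X'Y
X, Y = features[train], responses[train]
B = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

pred = features[test] @ B
# per-voxel correlation between predicted and held-out responses
r = [np.corrcoef(pred[:, v], responses[test][:, v])[0, 1] for v in range(n_voxels)]
print("mean held-out correlation:", float(np.mean(r)))
```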

Position

Xavier Hinaut

Mnemosyne team, Inria, Bordeaux, France
Inria Bordeaux Sud-Ouest, LABRI & Institut des Maladies Neurodégénératives (Centre Broca Aquitaine, Carreire campus), Bordeaux, France
Dec 5, 2025

This six-month Master's internship (spring 2024) is centered on the ambitious goal of developing a Large Language Model (LLM) based service tailored to assist in programming with the ReservoirPy library. The primary objective is to create an AI-powered tool that simplifies and enhances the coding experience for users working with reservoir computing. This involves not only integrating the LLM with ReservoirPy to provide real-time coding assistance and error correction, but also customizing the model to understand and respond effectively to domain-specific terminology and queries. Through this project, the intern will contribute to a pioneering effort in AI-assisted coding, bridging the gap between advanced AI language capabilities and practical, domain-specific programming needs.
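
To give a sense of the kind of code such an assistant would help users write, here is a minimal echo state network with ReservoirPy, assuming the 0.3+ API (Reservoir, Ridge, the >> composition operator, fit and run); the hyperparameters and the toy sine-wave data are arbitrary.

```python
# Minimal next-step prediction with ReservoirPy (assuming the 0.3+ API).
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge

# toy univariate time series: predict x(t+1) from x(t)
X = np.sin(np.linspace(0, 6 * np.pi, 500)).reshape(-1, 1)
X_train, y_train = X[:-1], X[1:]

reservoir = Reservoir(100, lr=0.3, sr=0.9)   # 100 units, leak rate 0.3, spectral radius 0.9
readout = Ridge(ridge=1e-6)                  # ridge-regularized linear readout
esn = reservoir >> readout                   # chain nodes into an echo state network

esn = esn.fit(X_train, y_train, warmup=10)   # train the readout, discarding 10 warm-up steps
y_pred = esn.run(X_train)                    # run the trained model over the series
print("train MSE:", float(np.mean((y_pred - y_train) ** 2)))
```

The `>>` operator chains nodes into a single model: reservoir states are generated on the fly and only the Ridge readout is fitted.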

Position

Xavier Hinaut

Mnemosyne team, Inria, Bordeaux, France
Inria Bordeaux Sud-Ouest, LABRI & Institut des Maladies Neurodégénératives (Centre Broca Aquitaine, Carreire campus), Bordeaux, France
Dec 5, 2025

The internship will explore how reservoir computing can be leveraged in the context of meta-reinforcement learning (meta-RL). The intern will investigate evolving the architecture of a reservoir so that it performs well on a wide range of RL tasks. The project will involve designing a wide range of RL tasks in an existing simulation environment, learning how to learn multiple tasks without access to the goal information (i.e. meta-RL with a reservoir), and evolving reservoir architectures for meta-RL with evolutionary methods.
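
As a very rough sketch of the "evolving reservoirs" idea, and under assumptions that differ from the internship (only two hyperparameters, spectral radius and leak rate, are evolved rather than a full architecture, and supervised delayed-recall problems stand in for the RL tasks), a simple evolutionary loop might look like this:

```python
# Hypothetical sketch: evolving reservoir hyperparameters for multiple toy tasks.
import numpy as np

rng = np.random.default_rng(1)

def run_reservoir(spectral_radius, leak, u, n=100):
    """Drive a random leaky-tanh reservoir with input u and return its states."""
    W = rng.standard_normal((n, n))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-1, 1, n)
    x = np.zeros(n)
    states = np.zeros((len(u), n))
    for t, ut in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * ut)
        states[t] = x
    return states

def fitness(params, delays=(1, 3, 5, 7)):
    """Average delayed-recall score across several toy 'tasks' (one per delay)."""
    sr, leak = params
    if not (0.0 < sr < 1.5 and 0.0 < leak <= 1.0):
        return -np.inf                               # reject invalid candidates
    u = rng.uniform(-1, 1, 600)
    X = run_reservoir(sr, leak, u)
    score = 0.0
    for d in delays:
        S, y = X[d:], u[:-d]                         # reconstruct the input delayed by d
        w, *_ = np.linalg.lstsq(S, y, rcond=None)    # one linear readout per task
        score += np.corrcoef(S @ w, y)[0, 1] ** 2
    return score / len(delays)

# simple (1 + lambda) evolutionary loop over (spectral_radius, leak_rate)
theta, best = np.array([0.5, 0.5]), -np.inf
for generation in range(20):
    children = theta + 0.1 * rng.standard_normal((8, 2))
    scores = [fitness(c) for c in children]
    i = int(np.argmax(scores))
    if scores[i] > best:
        theta, best = children[i], scores[i]
print("best (spectral radius, leak rate):", theta, "score:", best)
```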

Position

Marco Miozzo

Centre Tecnològic de Telecomunicacions de Catalunya (CTTC)
Barcelona, Spain
Dec 5, 2025

We are offering a full-time postdoctoral position to investigate pervasive intelligence and the Artificial Intelligence of Things (AIoT). The research will focus on solutions for advancing towards truly pervasive and liquid AI, enabling edge devices to carry out training and inference with the same accuracy as cloud AI, without harming the environment. To this end, highly efficient learning methods will be investigated, including model compression, reservoir and neuromorphic computing, and distributed/decentralized and collaborative data/client selection algorithms. Realistic use cases drawn from the sustainable development goals will be considered to validate the selected solutions.

Position · Neuroscience

Ann Kennedy

The Scripps Research Institute
San Diego, CA
Dec 5, 2025

The Kennedy lab is recruiting for multiple funded postdoctoral positions in theoretical and computational neuroscience, following our recent lab move to Scripps Research in San Diego, CA! Ongoing projects in the lab span: reservoir computing with heterogeneous cell types; reinforcement learning and control-theoretic analysis of complex behavior; neuromechanical whole-organism modeling; diffusion models for imitation learning and forecasting of mouse social interactions; and joint analysis and modeling of the effects of internal states on neural, vocalization and behavioral data. Additional NIH and foundation funding supports work on characterizing the progression of behavioral phenotypes in Parkinson's, modeling the cellular and circuit mechanisms underlying internal-state-dependent changes in neural population dynamics, and characterizing neural correlates of social relationships across species. Projects are flexible and can be tailored to applicants' research and training goals, and there are abundant opportunities for new collaborations with local experimental groups. San Diego has a fantastic research community and a very high quality of life. Our campus is located on the Pacific coast, at the northern edge of UCSD and not far from the Salk Institute. Postdoctoral stipends are well above NIH guidelines and include a relocation bonus, with research professorship positions available for qualified applicants.

Seminar · Neuroscience · Recording

Heterogeneity and non-random connectivity in reservoir computing

Abigail Morrison
Jülich Research Centre & RWTH Aachen University, Germany
May 31, 2022

Reservoir computing is a promising framework to study cortical computation, as it is based on continuous, online processing and the requirements and operating principles are compatible with cortical circuit dynamics. However, the framework has issues that limit its scope as a generic model for cortical processing. The most obvious of these is that, in traditional models, learning is restricted to the output projections and takes place in a fully supervised manner. If such an output layer is interpreted at face value as downstream computation, this is biologically questionable. If it is interpreted merely as a demonstration that the network can accurately represent the information, this immediately raises the question of what would be biologically plausible mechanisms for transmitting the information represented by a reservoir and incorporating it in downstream computations. Another major issue is that we have as yet only modest insight into how the structural and dynamical features of a network influence its computational capacity, which is necessary not only for gaining an understanding of those features in biological brains, but also for exploiting reservoir computing as a neuromorphic application. In this talk, I will first demonstrate a method for quantifying the representational capacity of reservoirs without training them on tasks. Based on this technique, which allows systematic comparison of systems, I then present our recent work towards understanding the roles of heterogeneity and connectivity patterns in enhancing both the computational properties of a network and its ability to reliably transmit to downstream networks. Finally, I will give a brief taster of our current efforts to apply the reservoir computing framework to magnetic systems as an approach to neuromorphic computing.
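
One widely used task-agnostic measure of a reservoir's representational capacity is linear memory capacity, which still fits linear readouts, but on generic delayed-input reconstruction rather than on a specific task; it is offered here only as a hypothetical illustration, not as the method presented in the talk. The sketch below compares a reservoir with a single shared leak rate against one with heterogeneous per-neuron leak rates; all parameter values are arbitrary.

```python
# Hypothetical illustration: linear memory capacity of homogeneous vs.
# heterogeneous (per-neuron leak rate) reservoirs.
import numpy as np

rng = np.random.default_rng(2)
N, T, washout = 100, 3000, 200

def memory_capacity(leaks, max_delay=40):
    W = rng.standard_normal((N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius 0.9
    W_in = rng.uniform(-0.5, 0.5, N)
    u = rng.uniform(-1, 1, T)
    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        x = (1 - leaks) * x + leaks * np.tanh(W @ x + W_in * u[t])
        states[t] = x
    S = states[washout:]
    mc = 0.0
    for k in range(1, max_delay + 1):
        y = u[washout - k:T - k]                         # input delayed by k steps
        w, *_ = np.linalg.lstsq(S, y, rcond=None)        # linear readout (in-sample, for brevity)
        mc += np.corrcoef(S @ w, y)[0, 1] ** 2           # delay-k capacity
    return mc

homogeneous = memory_capacity(np.full(N, 0.5))           # one shared leak rate
heterogeneous = memory_capacity(rng.uniform(0.1, 1.0, N))  # per-neuron leak rates
print("MC homogeneous:", homogeneous, "MC heterogeneous:", heterogeneous)
```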

ePoster

One-shot learning of paired associations by a reservoir computing model with Hebbian plasticity

COSYNE 2022

ePoster

Slow transition to chaos and robust reservoir computing in recurrent neural networks with heavy-tailed distributed synaptic weights

Yi Xie, Stefan Mihalas, Lukasz Kusmierz

COSYNE 2025

ePoster

Sparse neural engagement in connectome-based reservoir computing networks

James McAllister, John Wade, Conor Houghton, Cian O'Donnell

COSYNE 2025

ePoster

Reservoir computing using cultured neuronal networks with modular topology

Takuma Sumi, Hideaki Yamamoto, Yuichi Katori, Koki Ito, Hideyuki Kato, Hayato Chiba, Shigeo Sato, Ayumi Hirano-Iwata

FENS Forum 2024