
Continual Learning

Topic spotlight · World Wide

Discover seminars, jobs, and research tagged with continual learning across World Wide.
17 curated items · 10 ePosters · 4 Seminars · 3 Positions
Updated 1 day ago
17 results
Position

Constantine Dovrolis

The Cyprus Institute
Cyprus
Dec 5, 2025

The Cyprus Institute invites applications for a Post-Doctoral Fellow to pursue research in Machine Learning. The successful candidate will be actively engaged in cutting-edge research on core problems in ML and AI, such as developing efficient and interpretable deep networks, continual learning, neuro-inspired ML, and self-supervised learning. The candidate should have a deep understanding of machine learning fundamentals (e.g., linear algebra, probability theory, optimization) as well as broad knowledge of the state of the art in AI and machine learning. Additionally, the candidate should have extensive experience with ML programming frameworks (e.g., PyTorch). The candidate will work primarily with two PIs: Prof. Constantine Dovrolis and Prof. Mihalis Nicolaou. The appointment is for a period of two years, with the option of renewal subject to performance and the availability of funds.

Position · Computational Neuroscience

Friedemann Zenke

Friedrich Miescher Institute
Basel, Switzerland
Dec 5, 2025

The position involves conducting research in computational neuroscience and bio-inspired machine intelligence; writing research articles and presenting them at international conferences; publishing in neuroscience journals and in machine learning venues such as ICML, NeurIPS, and ICLR; and interacting and collaborating with experimental neuroscience groups or neuromorphic hardware developers nationally and internationally.

Seminar · Neuroscience · Recording

Edge Computing using Spiking Neural Networks

Shirin Dora
Loughborough University
Nov 4, 2021

Deep learning has made tremendous progress in recent years, but its high computational and memory requirements pose challenges for using deep learning on edge devices. There has been some progress in lowering the memory requirements of deep neural networks (for instance, the use of half-precision), but there has been minimal effort in developing alternative, more efficient computational paradigms. Inspired by the brain, Spiking Neural Networks (SNNs) provide an energy-efficient alternative to conventional rate-based neural networks. However, SNN architectures that employ the traditional feedforward and feedback pass do not fully exploit the asynchronous, event-based processing paradigm of SNNs. In the first part of my talk, I will present my work on predictive coding, which offers a fundamentally different approach to developing neural networks that are particularly suitable for event-based processing. In the second part of my talk, I will present our work on the development of approaches for SNNs that target specific problems such as low response latency and continual learning.

References:
Dora, S., Bohte, S. M., & Pennartz, C. (2021). Deep Gated Hebbian Predictive Coding Accounts for Emergence of Complex Neural Response Properties Along the Visual Cortical Hierarchy. Frontiers in Computational Neuroscience, 65.
Saranirad, V., McGinnity, T. M., Dora, S., & Coyle, D. (2021, July). DoB-SNN: A New Neuron Assembly-Inspired Spiking Neural Network for Pattern Classification. In 2021 International Joint Conference on Neural Networks (IJCNN) (pp. 1-6). IEEE.
Machingal, P., Thousif, M., Dora, S., Sundaram, S., & Meng, Q. (2021). A Cross Entropy Loss for Spiking Neural Networks. Expert Systems with Applications (under review).
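
For readers unfamiliar with predictive coding, the sketch below shows the core idea in its simplest rate-based form: latent activity is updated iteratively to reduce the mismatch between a top-down prediction and the observed input. It is a generic illustration, not the gated Hebbian model from the talk; the generative weights W, the latent dimensionality, the learning rate, and the weak decay term are all arbitrary choices made for this example.

import numpy as np

# Minimal rate-based predictive coding inference (illustrative only).
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 4)) / np.sqrt(4)      # generative weights: 4 latents -> 10 inputs
x = W @ np.array([1.0, -0.5, 0.0, 2.0])        # an observation produced by a "true" latent

h = np.zeros(4)                                 # latent activity, inferred iteratively
lr = 0.1
for _ in range(200):
    eps = x - W @ h                             # prediction error at the input layer
    h += lr * (W.T @ eps - 0.01 * h)            # descend the error (plus a weak prior on h)

print("remaining prediction error:", float(np.linalg.norm(x - W @ h)))

In a full predictive coding network the same error signal also drives local, Hebbian-style updates of the generative weights, which is part of what makes the scheme attractive for event-based processing.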

Seminar · Neuroscience · Recording

Multitask performance in humans and deep neural networks

Christopher Summerfield
University of Oxford
Nov 24, 2020

Humans and other primates exhibit rich and versatile behaviour, switching nimbly between tasks as the environmental context requires. I will discuss the neural coding patterns that make this possible in humans and deep networks. First, using deep network simulations, I will characterise two distinct solutions to task acquisition (“lazy” and “rich” learning) which trade off learning speed for robustness, and which depend on the initial weight scale and network sparsity. I will chart the predictions of these two schemes for a context-dependent decision-making task, showing that the rich solution is to project task representations onto orthogonal planes in a low-dimensional embedding space. Using behavioural testing and functional neuroimaging in humans, we observe BOLD signals in human prefrontal cortex whose dimensionality and neural geometry are consistent with the rich learning regime. Next, I will discuss the problem of continual learning, showing that, behaviourally, humans (unlike vanilla neural networks) learn more effectively when conditions are blocked than when they are interleaved. I will show how this counterintuitive pattern of behaviour can be recreated in neural networks by assuming that information is normalised and temporally clustered (via Hebbian learning) alongside supervised training. Together, this work offers a picture of how humans learn to partition knowledge in the service of structured behaviour, and a roadmap for building neural networks that adopt similar principles in the service of multitask learning. This is joint work with Andrew Saxe, Timo Flesch, David Nagy, and others.
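
The blocked-versus-interleaved contrast mentioned in the abstract is easy to reproduce in a vanilla network. The sketch below is a hedged illustration rather than the authors' code: the context-cued toy task, network size, and training schedule are invented for this example. It trains a small PyTorch classifier either task by task or with the tasks interleaved, and then checks how much of the first task survives.

import torch

torch.manual_seed(0)

def make_task(rule):
    # Context-cued toy task: classify by the sign of x0 (rule "A") or of x1 (rule "B").
    x = torch.randn(512, 2)
    ctx = torch.zeros(512, 1) if rule == "A" else torch.ones(512, 1)
    y = (x[:, 0] > 0).float() if rule == "A" else (x[:, 1] > 0).float()
    return torch.cat([x, ctx], dim=1), y.unsqueeze(1)

def fresh_net():
    return torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))

def train(net, tasks, epochs=200):
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    loss_fn = torch.nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for x, y in tasks:                       # one full-batch step per task per epoch
            opt.zero_grad()
            loss_fn(net(x), y).backward()
            opt.step()

def accuracy(net, task):
    x, y = task
    return ((net(x) > 0).float() == y).float().mean().item()

task_a, task_b = make_task("A"), make_task("B")

blocked = fresh_net()                            # blocked: finish task A, then see only task B
train(blocked, [task_a]); train(blocked, [task_b])

interleaved = fresh_net()                        # interleaved: alternate between the two tasks
train(interleaved, [task_a, task_b])

print("task A accuracy after blocked training:    ", round(accuracy(blocked, task_a), 3))
print("task A accuracy after interleaved training:", round(accuracy(interleaved, task_a), 3))

The vanilla network typically retains task A far better under interleaving; the abstract's point is that adding normalisation and Hebbian temporal clustering can instead recreate the human pattern, in which blocked training works well.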

Seminar · Neuroscience · Recording

Synthesizing Machine Intelligence in Neuromorphic Computers with Differentiable Programming

Emre Neftci
University of California Irvine
Aug 30, 2020

The potential of machine learning and deep learning to advance artificial intelligence is driving a quest to build dedicated computers, such as neuromorphic hardware, that emulate the biological processes of the brain. While the hardware technologies already exist, their application to real-world tasks is hindered by the lack of suitable programming methods. Advances at the interface of neural computation and machine learning have shown that key aspects of deep learning models and tools can be transferred to biologically plausible neural circuits. Building on these advances, I will show that differentiable programming can address many challenges of programming spiking neural networks for solving real-world tasks, and can help devise novel continual and local learning algorithms. In turn, these new algorithms pave the way towards systematically synthesizing machine intelligence in neuromorphic hardware without detailed knowledge of the hardware circuits.
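
A common way to make spiking networks compatible with differentiable programming is the surrogate gradient trick: keep the hard spike threshold in the forward pass, but substitute a smooth derivative in the backward pass so that standard autodiff can train the network end to end. The sketch below is a minimal, hedged illustration in PyTorch, not the speaker's code or toolchain; the fast-sigmoid surrogate, the run_lif helper, and the rate-matching objective are all assumptions made for this example.

import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate derivative in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()                              # hard threshold: emit a spike
    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2    # fast-sigmoid surrogate derivative

spike = SurrogateSpike.apply

def run_lif(inputs, w, beta=0.9, v_th=1.0):
    # One layer of leaky integrate-and-fire units driven by input spike trains.
    batch, steps, _ = inputs.shape
    v = torch.zeros(batch, w.shape[1])
    out = []
    for t in range(steps):
        v = beta * v + inputs[:, t] @ w      # leaky integration of the weighted input
        s = spike(v - v_th)                  # spike when the membrane crosses threshold
        v = v - s * v_th                     # soft reset by subtraction
        out.append(s)
    return torch.stack(out, dim=1)

# Usage: nudge the mean firing rates of 4 output units towards a target rate.
torch.manual_seed(0)
w = (torch.randn(16, 4) * 0.5).requires_grad_()
opt = torch.optim.Adam([w], lr=0.05)
inputs = (torch.rand(8, 100, 16) < 0.1).float()   # random input spike trains, 10% density
target_rate = torch.full((4,), 0.2)
for _ in range(100):
    rates = run_lif(inputs, w).mean(dim=(0, 1))   # mean firing rate of each output unit
    loss = ((rates - target_rate) ** 2).sum()
    opt.zero_grad(); loss.backward(); opt.step()
print("final firing rates:", [round(r, 3) for r in rates.tolist()])

Because the forward pass remains purely event-based, weights trained this way can, in principle, be deployed on spike-driven neuromorphic hardware; only the training loop needs the surrogate.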

ePoster

Continual learning using dendritic modulations on view-invariant feedforward weights

Viet Anh Khoa Tran, Emre Neftci, Willem Wybo

Bernstein Conference 2024

ePoster

Evaluating Memory Behavior in Continual Learning using the Posterior in a Binary Bayesian Network

Akshay Bedhotiya, Emre Neftci

Bernstein Conference 2024

ePoster

A Study of a biologically plausible combination of Sparsity, Weight Imprinting and Forward Inhibition in Continual Learning

Golzar Atefi, Justus Westerhof, Felix Gers, Erik Rodner

Bernstein Conference 2024

ePoster

Dissecting the Factors of Metaplasticity with Meta-Continual Learning

COSYNE 2022

ePoster

Hippocampal networks support continual learning and generalisation

COSYNE 2022

ePoster

Compositional inference in the continual learning mouse playground

Aneesh Bal, Andrea Santi, Cecelia Shuai, Samantha Soto, Joshua Vogelstein, Patricia Janak, Kishore V. Kuchibhotla

COSYNE 2025

ePoster

Metrics of Task Relations Predict Continual Learning Performance

Haozhe Shan, Qianyi Li, Haim Sompolinsky

COSYNE 2025

ePoster

A neural network model of continual learning through closed-loop interaction with the environment

Alexander Rivkind, Daniel Wolpert, Guillaume Hennequin, Mate Lengyel

COSYNE 2025

ePoster

Probing the dynamics of neural representations that support generalization under continual learning

Daniel Kimmel, Kimberly Stachenfeld, Nikolaus Kriegeskorte, Stefano Fusi, C Daniel Salzman, Daphna Shohamy

COSYNE 2025