ePoster

DelGrad: Exact gradients in spiking networks for learning transmission delays and weights

Julian Göltz, Jimmy Weber, Laura Kriener, Peter Lake, Melika Payvand, Mihai Petrovici
Bernstein Conference 2024
Goethe University, Frankfurt, Germany


Abstract

Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information. Transmission delays play an important role in shaping these temporal characteristics. Recent work [1, 2] has demonstrated the substantial advantages of learning these delays alongside synaptic weights, in terms of both accuracy and memory efficiency. However, these approaches suffer from drawbacks in precision and learning efficiency: they operate in discrete time with approximate gradients, and they require membrane potential recordings to calculate parameter updates. To alleviate these issues, building on prior work on exact gradients in SNNs [3], we propose an analytical approach for calculating exact gradients of the loss with respect to both synaptic weights and delays in an event-based fashion. The inclusion of delays emerges naturally within our proposed formalism, enriching the model's parameter search space with a temporal dimension. Our algorithm is based purely on the timing of individual spikes and does not require access to other variables such as membrane potentials. We explicitly compare the impact of different delay types (axonal, dendritic, and synaptic) on accuracy and parameter efficiency on the Yin-Yang classification task [4]. Furthermore, while previous work on learnable delays in SNNs has been mostly confined to software simulations, we demonstrate the functionality and benefits of our approach on the BrainScaleS-2 neuromorphic platform [5]: in a proof-of-concept study, we implement transmission delays in an analog fashion on the chip itself and show desirable configurability and reproducibility. Moreover, we present a hybrid hardware-software approach in which synaptic and neuronal dynamics are emulated on-chip, while axonal delays are realized digitally off-chip. DelGrad thus provides an event-based framework for gradient-based co-training of delay parameters and weights without any approximations, one that meets the typical demands and constraints of neuromorphic hardware, as demonstrated experimentally by successful training on an analog mixed-signal neuromorphic system.
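To illustrate the core idea of exact spike-time gradients with learnable delays, the sketch below uses a simpler, analytically solvable neuron than the one in the paper: a non-leaky integrate-and-fire neuron with exponential synaptic currents (after Mostafa, 2017), for which the first-spike time has a closed form. This is not the DelGrad derivation itself, only a minimal stand-in under stated assumptions; the threshold value and all numerical inputs are illustrative. The point it demonstrates is the same as in the abstract: a transmission delay merely shifts an input spike time inside the closed-form expression, so gradients with respect to weights and delays are both exact and depend only on spike times.

```python
import jax
import jax.numpy as jnp

THETA = 1.0  # firing threshold; a hypothetical model constant, not from the paper

def spike_time(weights, delays, t_in):
    # Closed-form first-spike time of a non-leaky integrate-and-fire
    # neuron with exponential synaptic currents (tau_syn = 1), following
    # the analytically solvable model of Mostafa (2017). A delay d_i
    # simply shifts the effective arrival time to t_i + d_i, so it
    # enters the same closed form as the input spike times themselves.
    # Simplifying assumptions: all inputs arrive before the output
    # spike, and the summed weight exceeds the threshold.
    z = jnp.exp(t_in + delays)                          # inputs in exp-time domain
    z_out = jnp.dot(weights, z) / (jnp.sum(weights) - THETA)
    return jnp.log(z_out)                               # output spike time

# Exact gradients with respect to weights AND delays fall out of the
# same closed form: no time discretization, no surrogate gradients,
# and no membrane-potential recordings, only spike times.
grad_w = jax.grad(spike_time, argnums=0)
grad_d = jax.grad(spike_time, argnums=1)

w = jnp.array([0.9, 0.8, 0.5])   # synaptic weights (example values)
d = jnp.array([0.1, 0.3, 0.2])   # transmission delays
t = jnp.array([0.0, 0.2, 0.4])   # presynaptic spike times
print(spike_time(w, d, t))       # ~0.99
print(grad_w(w, d, t), grad_d(w, d, t))
```

In this toy setting, the gradient with respect to a delay equals the gradient with respect to the corresponding input spike time, which is one way to see why delays "emerge naturally" within a spike-timing formalism.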

Unique ID: bernstein-24/delgrad-exact-gradients-spiking-ca1191df