ePoster

On The Role Of Temporal Hierarchy In Spiking Neural Networks

Filippo Moro, Pau Aceituno, Melika Payvand
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany



Abstract

A recent trend in computational neuroscience highlights the beneficial role of heterogeneity of time constants in spiking neural networks (SNNs) [1, 2]. These findings suggest that heterogeneity, i.e. neuronal variability, should be embraced and leveraged to improve the performance of event-based computation. At the same time, machine learning teaches us that neural networks are surprisingly capable of representing abstract projections of input data, forming a hierarchy of representations across successive layers. This is why neural networks have grown much more complex, and in particular deeper, over the last decades, giving rise to the success of deep learning. In machine learning, effort has been devoted to computational models that process temporal data by leveraging hierarchical input representations [3], which inspired a spiking version of such an architecture [4]. Building on the concept of heterogeneity and motivated by modern machine learning, we propose the concept of Temporal Hierarchy. We show that a hierarchy of neuronal time constants across the layers of an SNN helps form a complex representation of the input and improves the network's computational power. We benchmark SNNs on tasks of increasing temporal content, including a synthetic temporal dataset (Multi-Time-Scale XOR) and keyword-spotting tasks (Spiking Heidelberg Digits, SHD, and Spiking Speech Commands, SSC). Standard SNNs with homogeneous neuronal time constants are compared with similar models featuring a hierarchy of time constants across layers. We define the hierarchy degree as the difference of time constants between the last and first layer of the SNN, and show that classification performance improves by up to 4.1% on SHD when this coefficient is positive, while it degrades when the coefficient is negative.
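The hierarchy degree can be illustrated with a minimal discrete-time leaky integrate-and-fire (LIF) sketch; all names, constants, and the choice of one shared time constant per layer are illustrative assumptions, not the poster's actual implementation:

```python
import numpy as np

def lif_layer(spikes_in, weights, tau, dt=1e-3, v_th=1.0):
    """One discrete-time LIF layer with a shared membrane time constant
    `tau` (a simplification; the poster's models may use per-neuron
    constants). `spikes_in` has shape (T, n_in)."""
    decay = np.exp(-dt / tau)            # membrane decay per time step
    T = spikes_in.shape[0]
    n_out = weights.shape[1]
    v = np.zeros(n_out)
    spikes_out = np.zeros((T, n_out))
    for t in range(T):
        v = decay * v + spikes_in[t] @ weights   # leaky integration
        fired = v >= v_th
        spikes_out[t] = fired.astype(float)
        v[fired] = 0.0                           # reset after a spike
    return spikes_out

# Hierarchy degree: difference of time constants between last and first
# layer. Fast-to-slow ordering gives a positive degree (illustrative values).
taus = [5e-3, 10e-3, 20e-3]
hierarchy_degree = taus[-1] - taus[0]    # positive hierarchy
```

Stacking `lif_layer` calls with the time constants in `taus` gives early layers that react to fast input features and later layers that integrate over longer windows, which is the inductive bias the abstract describes.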
We also analyze the effect of optimizing the neuronal time constants in deep SNNs, observing that a positive hierarchy degree emerges from the optimization process. We generalize the concept of hierarchy to SNNs based on temporal causal convolutions, where progressively increasing the kernel size or dilation of the convolution across layers yields better performance, approaching state-of-the-art results on SHD (94.7%) and SSC (79.2%). We conclude that organizing temporal processing in a hierarchy, from fast to slow, is a positive inductive bias for spiking neural networks.
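The causal-convolution variant of the hierarchy can be sketched as a dilated causal 1-D convolution whose dilation grows across layers, widening the temporal receptive field; this NumPy illustration is a hypothetical sketch, not the authors' code:

```python
import numpy as np

def causal_conv1d(x, kernel, dilation=1):
    """Causal dilated 1-D convolution: the output at time t depends only
    on inputs at times <= t. Left-pads with zeros so the output has the
    same length as the input."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    out = np.zeros(len(x))
    for t in range(len(x)):
        for i in range(k):
            # taps at input times t, t - d, ..., t - (k - 1) * d
            out[t] += kernel[i] * xp[pad + t - i * dilation]
    return out

# Stacking layers with dilations 1, 2, 4 (illustrative) grows the
# receptive field geometrically: early layers see fast, local structure,
# later layers integrate over progressively longer spans.
```

With kernel size k and dilations 1, 2, 4, the receptive field after three layers spans 1 + 3*(k - 1) + 4*(k - 1) = ... time steps per the usual dilated-convolution arithmetic, mirroring the fast-to-slow layering of the time-constant hierarchy.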

Unique ID: bernstein-24/role-temporal-hierarchy-spiking-965d5acc