Schedule
Date: Tuesday, September 1, 2020
Time: 4:10 PM Europe/Berlin
Recording provided by the organiser.
Domain: neuro
Host: SNUFA
Duration: 70 minutes
Back-propagation is a powerful supervised learning algorithm for artificial neural networks, because it solves the credit assignment problem (essentially: what should the hidden layers do?). This algorithm has led to the deep learning revolution. Unfortunately, back-propagation cannot be used directly in spiking neural networks (SNNs): it requires differentiable activation functions, whereas spikes are all-or-none events that cause discontinuities. Here we present two strategies to overcome this problem. The first is to use a so-called 'surrogate gradient', that is, to approximate the derivative of the threshold function with the derivative of a sigmoid. We will present some applications of this method to time series processing (audio, internet traffic, EEG). The second concerns a specific class of SNNs, which process static inputs using latency coding with at most one spike per neuron. Using approximations, we derived a latency-based back-propagation rule for this sort of network, called S4NN, and applied it to image classification.
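To make the surrogate-gradient idea concrete, below is a minimal sketch in PyTorch (the framework choice, the names, and the steepness parameter are illustrative assumptions, not taken from the talk): the forward pass keeps the exact all-or-none spike threshold, while the backward pass substitutes the derivative of a sigmoid so that gradients can flow through the non-linearity.

import torch

class SurrogateSpike(torch.autograd.Function):
    # Forward: exact Heaviside threshold. Backward: sigmoid-derivative surrogate.
    scale = 5.0  # illustrative steepness of the surrogate sigmoid, not a value from the talk

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        # All-or-none spike: 1 if the potential crosses threshold (here 0), else 0.
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Replace the ill-defined derivative of the step function with the
        # derivative of a sigmoid: s'(x) = scale * s(x) * (1 - s(x)).
        sig = torch.sigmoid(SurrogateSpike.scale * membrane_potential)
        return grad_output * SurrogateSpike.scale * sig * (1.0 - sig)

spike_fn = SurrogateSpike.apply

# Usage: gradients now flow through the spike threshold.
v = torch.randn(4, requires_grad=True)  # membrane potential minus threshold
spikes = spike_fn(v)
spikes.sum().backward()
print(v.grad)  # non-zero thanks to the surrogate gradient

The design choice worth noting is that only the backward pass is approximated: the network still emits genuine binary spikes in the forward pass.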
Timothee Masquelier
Dr
Centre national de la recherche scientifique, CNRS | Toulouse
Contact & Resources
neuro