Schedule
Tuesday, November 2, 2021
5:15 PM Europe/Berlin
Recording provided by the organiser.
Format
Recorded Seminar
Recording
Available
Host
SNUFA
Last year’s SNUFA workshop report concluded that “Moving toward neuron numbers comparable with biology and applying these networks to real-world data-sets will require the development of novel algorithms, software libraries, and dedicated hardware accelerators that perform well with the specifics of spiking neural networks” [1]. Taking inspiration from machine learning libraries, where techniques such as parallel batch training minimise latency and maximise GPU occupancy, as well as from our previous research on efficiently simulating SNNs on GPUs for computational neuroscience [2,3], we are extending our GeNN SNN simulator to pursue this vision. To explore GeNN’s potential, we use the eProp learning rule [4], which approximates RTRL, to train SNN classifiers on the Spiking Heidelberg Digits and the Spiking Sequential MNIST datasets. We find that the performance of these classifiers is comparable to those trained using BPTT [5] and verify that the theoretical advantages of neuron models with adaptation dynamics [5] translate into improved classification performance. We then measure execution times and find that training an SNN classifier using GeNN and eProp becomes faster than SpyTorch and BPTT after fewer than 685 timesteps, and that much larger models can be trained on the same GPU when using GeNN. Furthermore, we demonstrate that our implementation of parallel batch training improves training performance by over 4× and enables near-perfect scaling across multiple GPUs. Finally, we show that performing inference with a recurrent SNN in GeNN uses less energy and has lower latency than a comparable LSTM simulated with TensorFlow [6].
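To make the learning rule mentioned in the abstract more concrete, below is a minimal, illustrative NumPy sketch of an e-prop style update for a feed-forward layer of leaky integrate-and-fire neurons, loosely following the form of the rule in reference [4]: a surrogate (“pseudo”) derivative is combined with a low-pass filter of presynaptic spikes to form an eligibility trace, which is then modulated by a learning signal broadcast through fixed random feedback weights. This is not GeNN’s implementation (GeNN generates GPU code from model descriptions); all names, layer sizes and hyper-parameters here are assumptions chosen for readability, and training of the readout weights is omitted for brevity.

# Illustrative sketch of an e-prop weight update for a layer of leaky
# integrate-and-fire (LIF) neurons, loosely following reference [4].
# Sizes, constants and variable names are assumptions for this example;
# this is not GeNN's generated code.
import numpy as np

rng = np.random.default_rng(1234)

N_IN, N_REC, N_OUT = 700, 256, 20   # assumed sizes (e.g. SHD has 700 input channels)
T = 1000                            # timesteps per example
DT = 1.0                            # simulation timestep (ms)
ALPHA = np.exp(-DT / 20.0)          # membrane decay for a 20 ms time constant
V_TH = 1.0                          # firing threshold
GAMMA = 0.3                         # pseudo-derivative scale
ETA = 1e-3                          # learning rate

w_in = rng.normal(0.0, 1.0 / np.sqrt(N_IN), (N_IN, N_REC))     # trained input weights
w_out = rng.normal(0.0, 1.0 / np.sqrt(N_REC), (N_REC, N_OUT))  # readout (training omitted)
b_fb = rng.normal(0.0, 1.0 / np.sqrt(N_REC), (N_REC, N_OUT))   # fixed random feedback

def eprop_update(in_spikes, target):
    """Accumulate the e-prop update for one example.

    in_spikes: [T, N_IN] binary spike raster, target: [T, N_OUT] desired readout.
    """
    v = np.zeros(N_REC)       # membrane potentials
    z_bar = np.zeros(N_IN)    # low-pass filtered presynaptic spikes
    y = np.zeros(N_OUT)       # leaky readout
    dw_in = np.zeros_like(w_in)

    for t in range(T):
        # LIF dynamics with reset-by-subtraction
        v = ALPHA * v + in_spikes[t] @ w_in
        z = (v >= V_TH).astype(float)
        v -= z * V_TH

        # surrogate ("pseudo") derivative of the spike w.r.t. membrane potential
        psi = (GAMMA / V_TH) * np.maximum(0.0, 1.0 - np.abs((v - V_TH) / V_TH))

        # eligibility trace: pseudo-derivative times filtered presynaptic activity
        e = np.outer(z_bar, psi)              # [N_IN, N_REC]
        z_bar = ALPHA * z_bar + in_spikes[t]

        # readout error is broadcast to the hidden layer through fixed random
        # weights, replacing the backpropagated learning signal of BPTT
        y = ALPHA * y + z @ w_out
        learning_signal = b_fb @ (y - target[t])   # [N_REC]

        # accumulate the weight change online, without storing past activity
        dw_in -= ETA * e * learning_signal[None, :]

    return dw_in

In GeNN itself, the equivalent per-timestep logic would be expressed in the simulator’s model-description interface and compiled to GPU code, with the parallel batch training mentioned in the abstract running many such examples concurrently.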
Dr James Knight
University of Sussex