© 2025 World Wide

Open knowledge for all • Started with World Wide Neuro • A 501(c)(3) Non-Profit Organization


Seminar • ✓ Recording Available • Neuroscience

Training Dynamic Spiking Neural Network via Forward Propagation Through Time

B. Yin

CWI

Schedule

Wednesday, November 9, 2022
3:15 PM (Europe/Berlin)

Watch recording
Host: SNUFA

Seminar location

Not provided. No geocoded details are available for this content yet.

Watch the seminar


Recording provided by the organiser.

Event Information

  • Format: Recorded Seminar
  • Recording: Available
  • Host: SNUFA


Abstract

With recent advances in learning algorithms, recurrent networks of spiking neurons are achieving performance competitive with standard recurrent neural networks. Still, these learning algorithms are limited to small networks of simple spiking neurons and modest-length temporal sequences, as they impose high memory requirements, have difficulty training complex neuron models, and are incompatible with online learning.

Taking inspiration from the concept of Liquid Time-Constants (LTCs), we introduce a novel class of spiking neurons, the Liquid Time-Constant Spiking Neuron (LTC-SN), whose functionality resembles the gating operation in LSTMs. We integrate these neurons into SNNs that are trained with FPTT and demonstrate that LTC-SNNs trained this way outperform various SNNs trained with BPTT on long sequences, while enabling online learning and drastically reducing memory complexity. We show this for several classical benchmarks whose sequence length can easily be varied, such as the Add Task and the DVS-Gesture benchmark.

We also show how FPTT-trained LTC-SNNs can be applied to large convolutional SNNs, where we demonstrate a new state of the art for online learning in SNNs on a number of standard benchmarks (S-MNIST, R-MNIST, DVS-GESTURE), and show that large feedforward SNNs can be trained successfully online to near (Fashion-MNIST, DVS-CIFAR10) or beyond (PS-MNIST, R-MNIST) the state-of-the-art performance obtained with offline BPTT. Finally, the training and memory efficiency of FPTT enables us to train SNNs end-to-end at network sizes and complexities that were previously infeasible: we demonstrate this by training, in an end-to-end fashion, the first deep and performant spiking neural network for object localization and recognition. Taken together, our contributions enable, for the first time, training large-scale, complex spiking neural network architectures online and on long temporal sequences.
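The abstract's key idea is a membrane time constant that depends on the input, so the leak acts like an LSTM-style gate. The NumPy sketch below shows one plausible reading of that idea: a leaky integrate-and-fire layer whose decay factor is computed from the current input at every step. The weight names, the tau parameterization, and the hard reset are illustrative assumptions, not the paper's exact model, and FPTT training itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

class LTCSpikingNeuron:
    """Sketch of a spiking neuron layer with an input-dependent time constant."""

    def __init__(self, n_in, n_out, threshold=1.0, dt=1.0):
        # Separate weights for the input current and for the time constant.
        self.w_in = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.w_tau = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.threshold = threshold
        self.dt = dt
        self.v = np.zeros(n_out)  # membrane potentials

    def step(self, x):
        current = x @ self.w_in
        # The "liquid" part: tau depends on the input, so the membrane leak
        # behaves like a learned, state-dependent forget gate.
        tau = 1.0 + np.exp(np.clip(x @ self.w_tau, -10.0, 10.0))  # tau > 1
        decay = np.exp(-self.dt / tau)                            # in (1/e, 1)
        self.v = decay * self.v + (1.0 - decay) * current
        spikes = (self.v >= self.threshold).astype(float)
        self.v = np.where(spikes > 0.0, 0.0, self.v)              # hard reset
        return spikes

# Run a short random input sequence through one layer.
neuron = LTCSpikingNeuron(n_in=4, n_out=3)
out = None
for t in range(20):
    out = neuron.step(rng.normal(size=4))
```

Because each step only needs the current state, a layer like this could in principle be trained step-by-step with a surrogate gradient on the threshold, which is what makes memory-efficient online methods such as FPTT attractive compared to BPTT's full unrolling.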

Topics

BPTT • FPTT • LTC-SN • Liquid Time-Constant • convolutional SNNs • memory complexity • neuromorphic computing • object localization • online learning • spiking neural networks

About the Speaker

B. Yin

CWI

Contact & Resources

No additional contact information available

Related Seminars

Seminar • 64% match • Relevant

Continuous guidance of human goal-directed movements

neuro

Dec 9, 2024
VU University Amsterdam
Seminar • 64% match • Relevant

Rett syndrome, MECP2 and therapeutic strategies

neuro

The development of the iPS cell technology has revolutionized our ability to study development and diseases in defined in vitro cell culture systems. The talk will focus on Rett Syndrome and discuss t

Dec 10, 2024
Whitehead Institute for Biomedical Research and Department of Biology, MIT, Cambridge, USA
Seminar • 64% match • Relevant

Genetic and epigenetic underpinnings of neurodegenerative disorders

neuro

Pluripotent cells, including embryonic stem (ES) and induced pluripotent stem (iPS) cells, are used to investigate the genetic and epigenetic underpinnings of human diseases such as Parkinson’s, Alzhe

Dec 10, 2024
MIT Department of Biology