Seminar · Recording Available · Neuroscience

General purpose event-based architectures for deep learning

Dr Anand Subramoney

Institute for Neural Computation

Schedule

Wednesday, October 5, 2022, 5:00 PM (Europe/Berlin)

Access Seminar

Host: SNUFA

Meeting password: $Em4HF (use this password when joining the live session)

A recording is available, provided by the organiser.

Event Information

Domain: Neuroscience

Host: SNUFA

Duration: 30 minutes

Abstract

Biologically plausible spiking neural networks (SNNs) are an emerging architecture for deep learning tasks due to their energy efficiency when implemented on neuromorphic hardware. However, many of the biological features are at best irrelevant and at worst counterproductive when evaluated in the context of task performance and suitability for neuromorphic hardware. In this talk, I will present an alternative paradigm for designing deep learning architectures with good task performance on real-world benchmarks while maintaining all the advantages of SNNs. We do this by focusing on two main features: event-based computation and activity sparsity. Starting from the performant gated recurrent unit (GRU) deep learning architecture, we modify it to make it event-based and activity-sparse. The resulting event-based GRU (EGRU) is extremely efficient for both training and inference. At the same time, it achieves performance close to conventional deep learning architectures in challenging tasks such as language modelling, gesture recognition and sequential MNIST.
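The core idea described in the abstract (a GRU whose units emit output only when their activity crosses a threshold) can be illustrated with a short sketch. The snippet below is not the authors' EGRU implementation; it is a minimal approximation that wraps a standard PyTorch GRUCell and zeroes out hidden units below a per-unit threshold, so only "firing" units contribute non-zero output at each step. The class name, the learnable threshold, and the default value are assumptions made for illustration, and training through the hard threshold would in practice require a surrogate gradient, which this sketch omits.

```python
import torch
import torch.nn as nn

class EventGRUCell(nn.Module):
    """Illustrative event-based GRU: only hidden units whose activity
    exceeds a per-unit threshold produce a non-zero (event) output."""

    def __init__(self, input_size: int, hidden_size: int, threshold: float = 0.5):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        # Per-unit firing threshold; learnable here, an assumption of this sketch.
        self.threshold = nn.Parameter(torch.full((hidden_size,), threshold))

    def forward(self, x: torch.Tensor, h: torch.Tensor):
        h_new = self.cell(x, h)                   # dense GRU state update
        events = h_new.abs() > self.threshold     # which units "fire" this step
        y = h_new * events                        # activity-sparse, event-like output
        return y, h_new


# Usage: per time step, only a fraction of units emit non-zero outputs.
cell = EventGRUCell(input_size=16, hidden_size=32)
h = torch.zeros(4, 32)
for _ in range(10):
    y, h = cell(torch.randn(4, 16), h)
print("active fraction:", (y != 0).float().mean().item())
```

The sparsity is where the efficiency claims in the abstract come from: downstream layers only need to process the units that emitted an event, which can be exploited both during training and at inference time on suitable hardware.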

Topics

EGRU, activity sparsity, deep learning, event-based computation, gated recurrent unit, gesture recognition, language modelling, neuromorphic hardware, spiking neural networks

About the Speaker

Dr Anand Subramoney

Institute for Neural Computation

Contact & Resources

Personal website: t.co/0DavePNQJz

Twitter/X: @anandsubramoney (twitter.com/anandsubramoney)

Related Seminars

  • Knight ADRC Seminar · Washington University in St. Louis, Neurology · Jan 20, 2025
  • TBD · King's College London · Jan 20, 2025
  • Guiding Visual Attention in Dynamic Scenes · Haifa U · Jan 20, 2025