

World Wide
Seminar · Past Event · Neuroscience

The centrality of population-level factors to network computation is demonstrated by a versatile approach for training spiking networks

Dr. Brian DePasquale

Princeton

Schedule

Wednesday, May 3, 2023
5:00 PM (Europe/Berlin)

Host: SNUFA

Access Seminar

Meeting Password: $Em4HF

Use this password when joining the live session.

Event Information

Domain: Neuroscience
Host: SNUFA
Duration: 30 minutes

Abstract

Neural activity is often described in terms of population-level factors extracted from the responses of many neurons. Factors provide a lower-dimensional description with the aim of shedding light on network computations. Yet, mechanistically, computations are performed not by continuously valued factors but by interactions among neurons that spike discretely and variably. Models provide a means of bridging these levels of description. We developed a general method for training model networks of spiking neurons by leveraging factors extracted from either data or firing-rate-based networks. In addition to providing a useful model-building framework, this formalism illustrates how reliable and continuously valued factors can arise from seemingly stochastic spiking. Our framework establishes procedures for embedding this property in network models with different levels of realism. The relationship between spikes and factors in such networks provides a foundation for interpreting (and subtly redefining) commonly used quantities such as firing rates.
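The abstract stays at a high level, but the core training idea, fitting a network's output to target population-level factors with recursive least squares, as in the FORCE learning listed among the topics, can be sketched. The code below is illustrative only and not the speaker's implementation: it uses a rate network as a stand-in for the spiking networks discussed, synthetic sinusoids in place of data-derived factors, and hypothetical names (`J`, `W`, `U`) for the weight matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network of N rate units with a K-dimensional "factor" readout.
N, K, dt, T = 200, 2, 1e-3, 2000      # units, factors, step (s), steps
tau, g = 10e-3, 1.5                   # membrane time constant, recurrent gain

J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # fixed random recurrence
W = np.zeros((K, N))                               # factor readout (learned)
U = rng.uniform(-1, 1, (N, K))                     # feedback of factors to units

# Target factors: two sinusoids standing in for data-derived factors.
t = np.arange(T) * dt
F = np.stack([np.sin(2 * np.pi * 3 * t), np.cos(2 * np.pi * 5 * t)])  # K x T

x = rng.standard_normal(N) * 0.1
P = np.eye(N)                     # RLS inverse-correlation matrix
err = []
for i in range(T):
    r = np.tanh(x)
    f_hat = W @ r                                  # estimated factors
    # Feedback uses the *target* factors during training (teacher forcing).
    x += dt / tau * (-x + J @ r + U @ F[:, i])
    # Recursive least squares (FORCE-style) update of the readout W.
    Pr = P @ r
    c = 1.0 / (1.0 + r @ Pr)
    P -= c * np.outer(Pr, Pr)
    W += np.outer(F[:, i] - f_hat, c * Pr)
    err.append(np.mean((F[:, i] - f_hat) ** 2))

print(f"mean error, first 100 steps: {np.mean(err[:100]):.4f}")
print(f"mean error, last 100 steps:  {np.mean(err[-100:]):.4f}")
```

In the full approach described in the talk, the continuously valued factors constrain the inputs to discretely spiking units rather than a rate readout, which is how reliable factors can coexist with variable spiking; this sketch only shows the factor-fitting step.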

Topics

FORCE learning, artificial neural networks, dimensionality reduction, dynamical systems, factor models, firing rates, model networks, model-building framework, motor system, network computation, network training, neural activity, neuron interactions, population dynamics, population-level factors, recurrent neural networks, spiking networks, stochastic spiking

About the Speaker

Dr. Brian DePasquale, Princeton

Contact & Resources

Personal Website: www.princeton.edu/~briandd/
Twitter/X: @briandepasquale (twitter.com/briandepasquale)

Related Seminars

  • Knight ADRC Seminar · Washington University in St. Louis, Neurology · Jan 20, 2025
  • TBD · King's College London · Jan 20, 2025
  • Guiding Visual Attention in Dynamic Scenes · Haifa U · Jan 20, 2025