Seminar · Past Event · Neuroscience

Theory of gating in recurrent neural networks

Dr. Kamesh Krishnamurthy
Princeton University

Schedule

Tuesday, September 15, 2020, 8:00 PM (America/Los_Angeles)

Host: U Oregon Neuro

Access Seminar

Meeting password: 691984 (use this password when joining the live session)

Event Information

Domain: Neuroscience
Host: U Oregon Neuro
Duration: 70 minutes

Abstract

Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) to process sequential data and in neuroscience to understand the emergent properties of networks of real neurons. Prior theoretical work on RNNs has focused on models with additive interactions. However, real neurons can exhibit gating, i.e., multiplicative interactions, and gating is also a central feature of the best-performing RNNs in machine learning. Here, we develop a dynamical mean-field theory (DMFT) to study the consequences of gating in RNNs. We use random matrix theory to show how gating robustly produces marginal stability and line attractors, important mechanisms for biologically relevant computations requiring long memory. The long-time behavior of the gated network is studied via its Lyapunov spectrum, and the DMFT is used to derive a novel analytical expression for the maximum Lyapunov exponent, demonstrating its close relation to the relaxation time of the dynamics. Gating is also shown to give rise to a novel, discontinuous transition to chaos, in which the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity), contrary to a seminal result for additive RNNs. Critical surfaces and regions of marginal stability in the parameter space are indicated in phase diagrams, providing a map for principled parameter choices by ML practitioners. Finally, we develop a field theory for the gradients that arise in training by incorporating the adjoint sensitivity framework from control theory into the DMFT. This paves the way for the use of powerful field-theoretic techniques to study training and gradients in large RNNs.
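To make the gating idea concrete, here is a minimal numerical sketch, not the speaker's actual model: a randomly coupled rate network whose update is scaled elementwise by a sigmoid gate (a multiplicative interaction, in contrast to the purely additive dynamics dh/dt = -h + J·tanh(h)), together with a standard Benettin-style estimate of the maximum Lyapunov exponent from the growth rate of a repeatedly renormalized perturbation. The gate form, the matrix Jz, and all parameter values (N, g, dt, T) are illustrative assumptions, not quantities from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200     # network size (illustrative)
g = 2.0     # coupling gain; g > 1 makes the purely additive network chaotic
dt = 0.05   # Euler integration step

# Random Gaussian couplings with variance ~ 1/N (standard random-matrix scaling)
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))     # recurrent weights
Jz = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # gate weights (assumed form)

def step(h):
    """One Euler step of a hypothetical gated rate network: the gate z in (0, 1)
    multiplies the update rate, i.e. a multiplicative interaction, instead of the
    additive dynamics dh = -h + J @ tanh(h)."""
    z = 1.0 / (1.0 + np.exp(-(Jz @ np.tanh(h))))  # sigmoid update gate
    dh = z * (-h + J @ np.tanh(h))                # gated dynamics
    return h + dt * dh

# Estimate the maximum Lyapunov exponent from the average log growth rate of a
# small perturbation, renormalizing it back to size d0 after every step.
h = rng.normal(0.0, 1.0, N)
d0 = 1e-8
hp = h + d0 * rng.normal(0.0, 1.0, N) / np.sqrt(N)  # nearby trajectory
lam = 0.0
T = 4000
for _ in range(T):
    h = step(h)
    hp = step(hp)
    d = np.linalg.norm(hp - h)
    lam += np.log(d / d0)
    hp = h + (d0 / d) * (hp - h)  # renormalize the separation
lam /= T * dt
print(f"estimated max Lyapunov exponent: {lam:.3f} (positive indicates chaos)")
```

With the gate pinned at z = 1 this reduces to the classic additive network, which becomes chaotic for g > 1; the talk's results concern how gating reshapes that picture (marginal stability, a discontinuous onset of chaos), which a numerical probe like this can check but not explain.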

Topics

chaos, critical points, dynamical mean-field theory, gating, long memory, Lyapunov spectrum, machine learning, random matrix theory, recurrent neural networks, theory, training gradients

About the Speaker

Dr. Kamesh Krishnamurthy
Princeton University

Contact & Resources

Personal website: kameshkk.github.io

Related Seminars

  • Knight ADRC Seminar · Washington University in St. Louis, Neurology · Jan 20, 2025
  • TBD · King's College London · Jan 20, 2025
  • Guiding Visual Attention in Dynamic Scenes · Haifa U · Jan 20, 2025