Schedule
Tuesday, September 15, 2020
12:00 PM America/Los_Angeles
Seminar location
No geocoded details are available for this content yet.
Format
Past Seminar
Recording
Not available
Host
U Oregon Neuro
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) for processing sequential data, and also in neuroscience, to understand the emergent properties of networks of real neurons. Prior theoretical work on the properties of RNNs has focused on models with additive interactions. However, real neurons can have gating, i.e., multiplicative interactions, and gating is also a central feature of the best-performing RNNs in machine learning. Here, we develop a dynamical mean-field theory (DMFT) to study the consequences of gating in RNNs. We use random matrix theory to show how gating robustly produces marginal stability and line attractors, which are important mechanisms for biologically relevant computations requiring long memory. The long-time behavior of the gated network is studied using its Lyapunov spectrum, and the DMFT is used to provide a novel analytical expression for the maximum Lyapunov exponent, demonstrating its close relation to the relaxation time of the dynamics. Gating is also shown to give rise to a novel, discontinuous transition to chaos, in which the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity), contrary to a seminal result for additive RNNs. Critical surfaces and regions of marginal stability in the parameter space are indicated in phase diagrams, thus providing a map for principled parameter choices for ML practitioners. Finally, we develop a field theory for the gradients that arise in training, by incorporating the adjoint sensitivity framework from control theory into the DMFT. This paves the way for the use of powerful field-theoretic techniques to study training and gradients in large RNNs.
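To make the setting concrete, the following is a minimal numerical sketch of a gated RNN of the kind the abstract describes: a random recurrent network in which a state-dependent sigmoidal gate multiplies the recurrent drive, together with a crude nearby-trajectory estimate of the maximal Lyapunov exponent. The single-gate form, coupling scales, and parameter values here are illustrative assumptions, not the exact model analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200          # network size
g = 2.0          # recurrent coupling strength (gain)
dt = 0.05        # Euler step

# Random couplings: J drives the state, Jz drives a multiplicative gate.
J = rng.normal(0.0, g / np.sqrt(N), (N, N))
Jz = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(h):
    """One Euler step of gated dynamics: dh/dt = -h + z * (J @ phi(h))."""
    z = sigmoid(Jz @ np.tanh(h))       # gate in (0, 1), state-dependent
    dh = -h + z * (J @ np.tanh(h))     # multiplicative (gated) interaction
    return h + dt * dh

# Relax onto the attractor.
h = rng.normal(0.0, 1.0, N)
for _ in range(2000):
    h = step(h)

# Crude maximal-Lyapunov-exponent estimate from two nearby trajectories,
# renormalizing their separation back to eps after every step.
eps = 1e-8
h1 = h.copy()
h2 = h + eps * rng.normal(0.0, 1.0, N) / np.sqrt(N)
lam = 0.0
steps = 2000
for _ in range(steps):
    h1, h2 = step(h1), step(h2)
    d = np.linalg.norm(h2 - h1)
    lam += np.log(d / eps)
    h2 = h1 + (eps / d) * (h2 - h1)    # rescale separation to eps
lam /= steps * dt                      # per unit time

print("estimated max Lyapunov exponent:", lam)
```

A negative estimate indicates a fixed point, a value near zero is consistent with the marginal stability the abstract highlights, and a positive value signals chaos; sweeping the gain `g` and the gate coupling scale traces out a rough version of the phase diagram described above.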
Dr. Kamesh Krishnamurthy
Princeton University