Phase Diagrams
“Models for Liquid-Liquid Phase Separation of Intrinsically Disordered Proteins”
Intrinsically disordered proteins (IDPs), which lack a well-defined folded structure, have recently been shown to be critical for the formation of membrane-less organelles via liquid-liquid phase separation (LLPS). Because of the flexible conformations of IDPs, it can be challenging to investigate them with experimental techniques alone. Computational models can therefore provide complementary views on several aspects, including the fundamental physics underlying LLPS and the sequence determinants contributing to LLPS. In this presentation, I will start with our coarse-grained computational framework, which can help generate sequence-dependent phase diagrams. The coarse-grained model further led to the development of a polymer model with empirical parameters that quickly predicts the LLPS of IDPs. Finally, I will show our preliminary efforts to address the molecular interactions within the LLPS of IDPs using all-atom explicit-solvent simulations.
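As a minimal illustration of how a polymer model yields a phase diagram (a sketch of the general idea, not the speaker's actual framework or parameters), the classic Flory-Huggins free energy gives a closed-form spinodal curve; the chain length N and interaction parameter chi below are hypothetical inputs.

```python
import numpy as np

# Flory-Huggins free energy per site for a polymer of length N in solvent:
#   f(phi) = (phi/N) ln(phi) + (1 - phi) ln(1 - phi) + chi * phi * (1 - phi)
# The spinodal is where d^2 f / d phi^2 = 0, which gives
#   chi_s(phi) = 0.5 * (1/(N*phi) + 1/(1 - phi)).

N = 50                               # hypothetical chain length
phi = np.linspace(1e-3, 0.999, 500)  # polymer volume fraction
chi_spinodal = 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

# Critical point: minimize chi_s(phi) over phi.
phi_c = 1.0 / (1.0 + np.sqrt(N))
chi_c = 0.5 * (1.0 + 1.0 / np.sqrt(N)) ** 2
print(f"critical point: phi_c = {phi_c:.3f}, chi_c = {chi_c:.3f}")
# States with chi > chi_spinodal(phi) are unstable and demix spontaneously;
# plotting chi_spinodal versus phi traces out the spinodal phase diagram.
```

In sequence-dependent coarse-grained models the single chi parameter is effectively replaced by residue-level interaction strengths, but the resulting binodal/spinodal construction follows the same logic.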
Theory of gating in recurrent neural networks
Recurrent neural networks (RNNs) are powerful dynamical models, widely used in machine learning (ML) for processing sequential data and in neuroscience for understanding the emergent properties of networks of real neurons. Prior theoretical work on the properties of RNNs has focused on models with additive interactions. However, real neurons can exhibit gating, i.e. multiplicative interactions, and gating is also a central feature of the best-performing RNNs in machine learning. Here, we develop a dynamical mean-field theory (DMFT) to study the consequences of gating in RNNs. We use random matrix theory to show how gating robustly produces marginal stability and line attractors, important mechanisms for biologically relevant computations requiring long memory. The long-time behavior of the gated network is studied using its Lyapunov spectrum, and the DMFT is used to derive a novel analytical expression for the maximum Lyapunov exponent, demonstrating its close relation to the relaxation time of the dynamics. Gating is also shown to give rise to a novel, discontinuous transition to chaos, in which the proliferation of critical points (topological complexity) is decoupled from the appearance of chaotic dynamics (dynamical complexity), contrary to a seminal result for additive RNNs. Critical surfaces and regions of marginal stability in the parameter space are indicated in phase diagrams, thus providing a map for principled parameter choices by ML practitioners. Finally, we develop a field theory for the gradients that arise in training by incorporating the adjoint sensitivity framework from control theory into the DMFT. This paves the way for the use of powerful field-theoretic techniques to study training and gradients in large RNNs.
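A minimal numerical sketch of one quantity discussed above: the maximum Lyapunov exponent of a toy gated RNN, estimated with Benettin's renormalization method. The single-gate update dx/dt = -x + sigma(Wz x) * tanh(W x), the gains g and g_z, and all other parameters here are illustrative assumptions, not the talk's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200            # network size (illustrative)
g, g_z = 2.0, 2.0  # gains of the additive and gating couplings (hypothetical)

W  = g   * rng.standard_normal((N, N)) / np.sqrt(N)  # recurrent weights
Wz = g_z * rng.standard_normal((N, N)) / np.sqrt(N)  # gating weights

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def step(x, dt=0.1):
    """One Euler step of the toy gated dynamics dx/dt = -x + sigma(Wz x) * tanh(W x)."""
    return x + dt * (-x + sigmoid(Wz @ x) * np.tanh(W @ x))

# Benettin's method: follow a small perturbation, accumulate its log growth
# rate, and renormalize it every step to avoid overflow/underflow.
eps, dt, T = 1e-8, 0.1, 5000
x  = rng.standard_normal(N)
dx = rng.standard_normal(N)
dx *= eps / np.linalg.norm(dx)

lyap = 0.0
for _ in range(T):
    x_pert = step(x + dx, dt)
    x      = step(x, dt)
    dx     = x_pert - x
    norm   = np.linalg.norm(dx)
    lyap  += np.log(norm / eps)
    dx    *= eps / norm

print("max Lyapunov exponent ~", lyap / (T * dt))
```

A positive estimate indicates chaos and a negative one a stable fixed point; sweeping the gains and locating the sign change numerically traces the same critical surfaces that the DMFT predicts analytically.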