Temporal Structure
Computational Mechanisms of Predictive Processing in Brains and Machines
Predictive processing offers a unifying view of neural computation, proposing that brains continuously anticipate sensory input and update internal models based on prediction errors. In this talk, I will present converging evidence for the computational mechanisms underlying this framework across human neuroscience and deep neural networks. I will begin with recent work showing that large-scale, distributed prediction-error encoding in the human brain directly predicts how sensory representations reorganize through predictive learning. I will then turn to PredNet, a popular predictive-coding-inspired deep network that has been widely used to model biological visual systems. Using dynamic stimuli generated with our Spatiotemporal Style Transfer algorithm, we demonstrate that PredNet relies primarily on low-level spatiotemporal structure and remains insensitive to high-level content, revealing limits in its generalization capacity. Finally, I will discuss new recurrent vision models that integrate top-down feedback connections with intrinsic neural variability, uncovering a dual mechanism for robust sensory coding in which neural variability decorrelates unit responses while top-down feedback stabilizes network dynamics. Together, these results outline how prediction-error signaling and top-down feedback pathways shape adaptive sensory processing in biological and artificial systems.
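As a rough illustration of the prediction-error computation at the heart of this framework, the sketch below implements a single-layer predictive-coding update in Python. The weights, learning rates, and dimensions are arbitrary assumptions for illustration; this is not PredNet or the speaker's models.

```python
# Minimal predictive-coding sketch (illustrative only; not PredNet).
# A latent estimate r predicts the input x through weights W; the
# prediction error drives updates to both the latent state and the weights.
import numpy as np

rng = np.random.default_rng(0)
n_input, n_latent = 16, 4
W = rng.normal(scale=0.1, size=(n_input, n_latent))  # generative weights (assumed)
x = rng.normal(size=n_input)                         # sensory input
r = np.zeros(n_latent)                               # latent estimate

lr_r, lr_W = 0.1, 0.01                               # assumed learning rates
for _ in range(100):
    prediction = W @ r
    error = x - prediction            # prediction error
    r += lr_r * (W.T @ error)         # inference: reduce error w.r.t. latent state
    W += lr_W * np.outer(error, r)    # learning: Hebbian-like weight update

print(f"residual error norm: {np.linalg.norm(x - W @ r):.3f}")
```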
Minute-scale periodic sequences in medial entorhinal cortex
The medial entorhinal cortex (MEC) hosts many of the brain’s circuit elements for spatial navigation and episodic memory, operations that require neural activity to be organized across long durations of experience. While location is known to be encoded by a plethora of spatially tuned cell types in this brain region, little is known about how the activity of entorhinal cells is tied together over time. Among the brain’s most powerful mechanisms for neural coordination are network oscillations, which dynamically synchronize neural activity across circuit elements. In MEC, theta and gamma oscillations provide temporal structure to the neural population activity at subsecond time scales. It remains an open question, however, whether similar coordination occurs in MEC at behavioural time scales, in the second-to-minute regime. In this talk I will show that MEC activity can be organized into a minute-scale oscillation that entrains nearly the entire cell population, with periods ranging from 10 to 100 seconds. Throughout this ultraslow oscillation, neural activity progresses in periodic and stereotyped sequences. The oscillation sometimes advances uninterruptedly for tens of minutes, transcending epochs of locomotion and immobility. Similar oscillatory sequences were not observed in the neighbouring parasubiculum or in visual cortex. The ultraslow periodic sequences in MEC have the potential to couple its neurons and circuits across extended time scales and to serve as a scaffold for processes that unfold at behavioural time scales.
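To give a concrete handle on the analysis problem, the sketch below shows one way minute-scale periodicity could be detected in a population firing rate via its power spectrum. The sampling rate, simulated period, and band limits are assumptions for illustration, not the authors' analysis pipeline.

```python
# Hedged sketch: finding minute-scale periodicity in a population rate
# from its power spectrum (illustrative; not the authors' analysis).
import numpy as np

fs = 1.0                          # rate sampled at 1 Hz (assumed)
t = np.arange(0, 3600, 1 / fs)    # one hour of recording
period_s = 60.0                   # simulate a ~60 s ultraslow oscillation
rng = np.random.default_rng(1)
rate = 5 + 2 * np.sin(2 * np.pi * t / period_s) + rng.normal(0, 1, t.size)

spectrum = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# Restrict to the 0.01-0.1 Hz band (10-100 s periods reported in the talk).
band = (freqs >= 0.01) & (freqs <= 0.1)
peak_freq = freqs[band][np.argmax(spectrum[band])]
print(f"dominant period in band: {1 / peak_freq:.1f} s")
```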
Exact coherent structures and transition to turbulence in a confined active nematic
Active matter describes a class of systems that are maintained far from equilibrium by driving forces acting on the constituent particles. Here I will focus on confined active nematics, which exhibit especially rich flow behavior, ranging from structured patterns in space and time to disordered turbulent flows. To understand this behavior, I will take a deterministic dynamical systems approach, beginning with the hydrodynamic equations for the active nematic. This approach reveals that the infinite-dimensional phase space of all possible flow configurations is populated by Exact Coherent Structures (ECS), which are exact solutions of the hydrodynamic equations with distinct and regular spatiotemporal structure; examples include unstable equilibria, periodic orbits, and traveling waves. The ECS are connected by dynamical pathways called invariant manifolds. The main hypothesis in this approach is that turbulence corresponds to a trajectory meandering in the phase space, transitioning between ECS by traveling on the invariant manifolds. Similar approaches have been successful in characterizing high Reynolds number turbulence of passive fluids. Here, I will present the first systematic study of active nematic ECS and their invariant manifolds and discuss their role in characterizing the phenomenon of active turbulence.
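For reference, one common formulation of active nematic hydrodynamics is the Beris-Edwards model supplemented with an active stress, sketched below; the talk's specific model, closure terms, and coefficients may differ.

```latex
% A common active nematic formulation (sketch; details may differ from the talk).
\begin{align}
  (\partial_t + \mathbf{u}\cdot\nabla)\,Q - S(\nabla\mathbf{u},\,Q) &= \Gamma H, \\
  \rho\,(\partial_t + \mathbf{u}\cdot\nabla)\,\mathbf{u}
    &= \nabla\cdot\bigl(\sigma^{\mathrm{viscous}} + \sigma^{\mathrm{elastic}}
       + \sigma^{\mathrm{active}}\bigr),
  \qquad \nabla\cdot\mathbf{u} = 0, \\
  \sigma^{\mathrm{active}} &= -\alpha\, Q .
\end{align}
```

Here Q is the nematic order-parameter tensor, S the flow-coupling (co-rotation and alignment) term, H the molecular field, Γ a rotational diffusivity, and α the activity; ECS are exact, non-chaotic solutions of equations of this kind.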
Rhythms in sounds and rhythms in brains: the temporal structure of auditory comprehension
Representation of speech temporal structure in human cortex
Understanding the role of neural heterogeneity in learning
The brain is hugely diverse and heterogeneous, yet the exact role of this heterogeneity has been relatively little explored, as most neural models are largely homogeneous. We trained spiking neural networks with varying degrees of heterogeneity on complex real-world tasks and found that heterogeneity made training more stable and robust and improved performance, especially for tasks with richer temporal structure. Moreover, the optimal distribution of parameters found by training was similar to experimental observations. These findings suggest that heterogeneity is not simply a by-product of noisy biological processes, but may play a crucial role in learning in complex, changing environments.
How polymer-loop-extruding motors shape chromosomes
Chromosomes are extremely long, active polymers that are spatially organized across multiple scales to promote cellular functions, such as gene transcription and genetic inheritance. During each cell cycle, chromosomes are dramatically compacted as cells divide and dynamically reorganized into less compact, spatiotemporally patterned structures after cell division. These activities are facilitated by DNA/chromatin-binding protein motors called SMC complexes. Each of these motors can perform a unique activity known as “loop extrusion,” in which the motor binds the DNA/chromatin polymer, reels in the polymer fiber, and extrudes it as a loop. Using simulations and theory, I show how loop-extruding motors can collectively compact and spatially organize chromosomes in different scenarios. First, I show that loop-extruding complexes can generate sufficient compaction for cell division, provided that loop-extrusion satisfies stringent physical requirements. Second, while loop-extrusion alone does not uniquely spatially pattern the genome, interactions between SMC complexes and protein “boundary elements” can generate patterns that emerge in the genome after cell division. Intriguingly, these “boundary elements” are not necessarily stationary, which can generate a variety of patterns in the neighborhood of transcriptionally active genes. These predictions, along with supporting experiments, show how SMC complexes and other molecular machinery, such as RNA polymerase, can spatially organize the genome. More generally, this work demonstrates both the versatility of the loop extrusion mechanism for chromosome functional organization and how seemingly subtle microscopic effects can emerge in the spatiotemporal structure of nonequilibrium polymers.
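To make the loop-extrusion mechanism concrete, here is a minimal one-dimensional sketch in which each motor's two legs translocate outward along a lattice of monomers and stall at boundary elements or other legs. The lattice size, motor positions, and boundary sites are arbitrary assumptions for illustration, not the speaker's simulations.

```python
# Minimal 1D loop-extrusion sketch (illustrative; not the speaker's model).
# Each motor has two legs that walk outward on a lattice of monomers,
# growing a loop; a leg stalls at a boundary element, another leg, or the edge.
L = 200                          # lattice sites (monomers), assumed
boundaries = {50, 150}           # fixed "boundary elements", assumed
motors = [[100, 101], [30, 31]]  # each motor: [left leg, right leg]

def occupied(site, motors):
    return any(site in legs for legs in motors)

for step in range(500):
    for legs in motors:
        left, right = legs[0] - 1, legs[1] + 1
        if left >= 0 and left not in boundaries and not occupied(left, motors):
            legs[0] = left       # left leg extrudes one monomer
        if right < L and right not in boundaries and not occupied(right, motors):
            legs[1] = right      # right leg extrudes one monomer

for i, (a, b) in enumerate(motors):
    print(f"motor {i}: loop spans sites {a}-{b} ({b - a + 1} monomers)")
```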
Neural heterogeneity promotes robust learning
The brain has a hugely diverse, heterogeneous structure, whereas many functional neural models are homogeneous. We compared the performance of spiking neural networks with varying degrees of heterogeneity trained to carry out difficult tasks. Introducing heterogeneity in membrane and synapse time constants substantially improved task performance and made learning more stable and robust across multiple training methods, particularly for tasks with rich temporal structure. In addition, the distribution of time constants in the trained networks closely matched those observed experimentally. We suggest that the heterogeneity observed in the brain may be more than a byproduct of noisy processes and may instead serve an active and important role in allowing animals to learn in changing environments.
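As a toy illustration of the kind of heterogeneity at issue, the sketch below simulates a layer of leaky integrate-and-fire neurons whose membrane time constants are drawn from a broad distribution rather than fixed to a single value. The distribution, inputs, and thresholds are assumptions for illustration; the study itself trains full spiking networks on difficult tasks rather than simulating them open-loop as done here.

```python
# Sketch: LIF neurons with heterogeneous membrane time constants
# (illustrative only; not the trained networks from the study).
import numpy as np

rng = np.random.default_rng(0)
n, dt, T = 100, 1e-3, 0.5                        # neurons, time step (s), duration (s)
tau = rng.gamma(shape=3.0, scale=10e-3, size=n)  # heterogeneous taus (assumed gamma prior)
# tau = np.full(n, 20e-3)                        # homogeneous alternative for comparison
v, v_th = np.zeros(n), 1.0
spikes = np.zeros((int(T / dt), n))

for i in range(int(T / dt)):
    I = rng.normal(1.2, 0.5, size=n)             # noisy input current (assumed)
    v += dt / tau * (-v + I)                     # leaky integration with per-neuron tau
    fired = v >= v_th
    spikes[i] = fired
    v[fired] = 0.0                               # reset after spike

print("mean firing rate (Hz):", spikes.mean(axis=0).mean() / dt)
```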
Crowding and the Architecture of the Visual System
Classically, vision is seen as a cascade of local, feedforward computations. This framework has been tremendously successful, inspiring a wide range of ground-breaking findings in neuroscience and computer vision. Recently, feedforward Convolutional Neural Networks (ffCNNs), inspired by this classic framework, have revolutionized computer vision and been adopted as tools in neuroscience. However, despite these successes, there is much more to vision. I will present our work using visual crowding and related psychophysical effects as probes into visual processes that go beyond the classic framework. In crowding, perception of a target deteriorates in clutter. We focus on global aspects of crowding, in which perception of a small target is strongly modulated by the global configuration of elements across the visual field. We show that models based on the classic framework, including ffCNNs, cannot explain these effects for principled reasons and identify recurrent grouping and segmentation as a key missing ingredient. Then, we show that capsule networks, a recent kind of deep learning architecture combining the power of ffCNNs with recurrent grouping and segmentation, naturally explain these effects. We provide psychophysical evidence that humans indeed use a similar recurrent grouping and segmentation strategy in global crowding effects. In crowding, visual elements interfere across space. To study how elements interfere over time, we use the Sequential Metacontrast psychophysical paradigm, in which perception of visual elements depends on elements presented hundreds of milliseconds later. We psychophysically characterize the temporal structure of this interference and propose a simple computational model. Our results support the idea that perception is a discrete process. Together, the results presented here provide stepping-stones towards a fuller understanding of the visual system by suggesting architectural changes needed for more human-like neural computations.
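For readers unfamiliar with capsule networks, the sketch below implements the routing-by-agreement step that performs grouping between two capsule layers. The capsule counts, dimensions, and random weights are assumptions for illustration, not the specific architecture used in this work.

```python
# Minimal routing-by-agreement between two capsule layers
# (illustrative; not the capsule networks used in the study).
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    norm2 = (s ** 2).sum(axis=axis, keepdims=True)
    return norm2 / (1 + norm2) * s / np.sqrt(norm2 + eps)

rng = np.random.default_rng(0)
n_in, n_out, d_in, d_out = 8, 3, 4, 6            # capsule counts and dimensions (assumed)
u = rng.normal(size=(n_in, d_in))                # lower-level capsule outputs
W = rng.normal(scale=0.1, size=(n_in, n_out, d_out, d_in))
u_hat = np.einsum('ijdk,ik->ijd', W, u)          # predictions for higher-level capsules

b = np.zeros((n_in, n_out))                      # routing logits
for _ in range(3):                               # a few routing iterations
    c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coefficients
    s = (c[..., None] * u_hat).sum(axis=0)                # weighted votes per output capsule
    v = squash(s)                                         # output capsule vectors
    b += np.einsum('ijd,jd->ij', u_hat, v)                # agreement strengthens routing

print("output capsule lengths:", np.linalg.norm(v, axis=-1).round(3))
```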
Neural control of vocal interactions in songbirds
During conversations we rapidly switch between listening and speaking, which often requires withholding or delaying our speech in order to hear others and avoid overlapping. This capacity for vocal turn-taking is also exhibited by non-linguistic species; however, the neural circuit mechanisms that enable us to regulate the precise timing of our vocalizations during interactions are unknown. We aim to identify the neural mechanisms underlying the coordination of vocal interactions. To this end, we paired zebra finches with a vocal robot (1 Hz call playback) and measured the birds’ call response times. We found that individual birds called with a stereotyped delay with respect to the robot call. Pharmacological inactivation of the premotor nucleus HVC revealed its necessity for the temporal coordination of calls. We further investigated the contributing neural activity within HVC by performing intracellular recordings from premotor neurons and inhibitory interneurons in calling zebra finches. We found that inhibition precedes excitation before and during call onset. To test whether inhibition guides call timing, we pharmacologically limited the impact of inhibition on premotor neurons. As a result, zebra finches converged on a similar delay time, i.e., birds called more rapidly after the vocal robot call, suggesting that HVC inhibitory interneurons regulate the coordination of social contact calls. In addition, we aim to investigate the vocal turn-taking capabilities of the common nightingale. Male nightingales learn over 100 different song motifs, which they use to attract mates or defend territories. Previously, it has been shown that nightingales counter-sing with each other, following a temporal structure similar to human vocal turn-taking. These animals are also able to spontaneously imitate a motif of another nightingale. The neural mechanisms underlying this behaviour are not yet understood. In my lab, we further probe the capabilities of these animals in order to assess the dynamic range of their vocal turn-taking flexibility.
Flygenvectors: The spatial and temporal structure of neural activity across the fly brain
COSYNE 2022
Multi-scale single-cycle analysis of cortex-wide wave dynamics reveals complex spatio-temporal structure
Bernstein Conference 2024