Changing environments
Design principles of adaptable neural codes
Behavior relies on the ability of sensory systems to infer changing properties of the environment from incoming sensory stimuli. However, the demands that detecting and adjusting to changes in the environment place on a sensory system often differ from the demands associated with performing a specific behavioral task. This necessitates neural coding strategies that can dynamically balance these conflicting needs. I will discuss our ongoing theoretical work to understand how this balance can best be achieved. We connect ideas from efficient coding and Bayesian inference to ask how sensory systems should dynamically allocate limited resources when the goal is to optimally infer changing latent states of the environment, rather than reconstruct incoming stimuli. We use these ideas to explore dynamic tradeoffs between the efficiency and speed of sensory adaptation schemes, and the downstream computations that these schemes might support. Finally, we derive families of codes that balance these competing objectives, and we demonstrate their close match to experimentally observed neural dynamics during sensory adaptation. These results provide a unifying perspective on adaptive neural dynamics across a range of sensory systems, environments, and sensory tasks.
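The core inference problem described in this abstract can be illustrated with a minimal sketch (this is not the authors' model; all parameter values and the random-walk assumption are illustrative): a one-dimensional Kalman filter tracking a changing latent environmental state from noisy sensory samples, where the filter gain embodies the tradeoff between the speed of adaptation and the stability of the estimate.

```python
import numpy as np

def track_latent_state(observations, process_var, obs_var, x0=0.0, p0=1.0):
    """Kalman filter for a random-walk latent state observed through noise.

    A larger process_var/obs_var ratio yields a higher gain: faster
    adaptation to environmental changes, at the cost of noisier estimates.
    """
    x, p = x0, p0
    estimates = []
    for y in observations:
        p = p + process_var           # predict: latent state diffuses
        k = p / (p + obs_var)         # gain: speed/stability tradeoff
        x = x + k * (y - x)           # update toward the new observation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
# Latent state switches abruptly halfway through the trial.
latent = np.concatenate([np.zeros(100), np.full(100, 5.0)])
obs = latent + rng.normal(0.0, 1.0, size=latent.size)
est = track_latent_state(obs, process_var=0.1, obs_var=1.0)
```

With these hypothetical settings the estimate settles near the old state before the switch and converges toward the new state after it; raising `process_var` speeds this convergence but makes the steady-state estimate noisier.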
Understanding the role of neural heterogeneity in learning
The brain is hugely diverse and heterogeneous, yet the exact role of this heterogeneity has been relatively little explored, as most neural models are largely homogeneous. We trained spiking neural networks with varying degrees of heterogeneity on complex real-world tasks and found that heterogeneity made training more stable and robust and improved performance, especially for tasks with rich temporal structure. Moreover, the optimal distribution of parameters discovered by training was similar to experimental observations. These findings suggest that heterogeneity is not simply a result of noisy biological processes, but may play a crucial role in learning in complex, changing environments.
Active sleep in flies: the dawn of consciousness
The brain is a prediction machine. Yet the world is never entirely predictable for any animal. Unexpected events are surprising, and surprise typically evokes prediction-error signatures in animal brains. In humans, such mismatched expectations are often associated with an emotional response as well. Appropriate emotional responses are understood to be important for memory consolidation, suggesting that valence cues more generally constitute an ancient mechanism that potently refines and generalizes internal models of the world and thereby minimizes prediction errors. On the other hand, abolishing error detection and surprise entirely is probably also maladaptive, as this might undermine the very mechanism that brains use to become better prediction machines. This paradoxical view of brain function as an ongoing tug-of-war between prediction and surprise suggests a compelling new way to study and understand the evolution of consciousness in animals. I will present approaches to studying attention and prediction in the tiny brain of the fruit fly, Drosophila melanogaster. I will discuss how an 'active' sleep stage (termed rapid eye movement, or REM, sleep in mammals) may have evolved in the first animal brains as a mechanism for optimizing prediction in motile creatures confronted with constantly changing environments. A role for REM sleep in emotional regulation could thus be better understood as an ancient sleep function that evolved alongside selective attention to maintain an adaptive balance between prediction and surprise. This view of active sleep has some interesting implications for the evolution of subjective awareness and consciousness.
Neural heterogeneity promotes robust learning
The brain has a hugely diverse, heterogeneous structure. By contrast, many functional neural models are homogeneous. We compared the performance of spiking neural networks with varying degrees of heterogeneity, trained to carry out difficult tasks. Introducing heterogeneity in membrane and synapse time constants substantially improved task performance and made learning more stable and robust across multiple training methods, particularly for tasks with a rich temporal structure. In addition, the distribution of time constants in the trained networks closely matched those observed experimentally. We suggest that the heterogeneity observed in the brain may be more than a byproduct of noisy processes: rather, it may serve an active and important role in allowing animals to learn in changing environments.
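The effect of heterogeneous membrane time constants can be sketched in a few lines (this is not the authors' training code; the leaky integrate-and-fire model, the gamma distribution, and all parameter values are illustrative assumptions): giving each neuron its own time constant produces a diversity of integration timescales, and hence a diversity of responses to identical input.

```python
import numpy as np

def simulate_lif_population(input_current, tau_m, dt=1e-3, v_th=1.0):
    """Leaky integrate-and-fire neurons with one membrane time constant
    per neuron (heterogeneous), reset to zero after each spike."""
    n_steps, n_neurons = input_current.shape
    assert tau_m.shape == (n_neurons,)
    alpha = np.exp(-dt / tau_m)            # per-neuron leak factor
    v = np.zeros(n_neurons)
    spikes = np.zeros((n_steps, n_neurons))
    for t in range(n_steps):
        v = alpha * v + (1.0 - alpha) * input_current[t]
        fired = v >= v_th
        spikes[t] = fired
        v = np.where(fired, 0.0, v)        # reset after a spike
    return spikes

rng = np.random.default_rng(1)
n = 50
# Heterogeneous time constants drawn from a gamma distribution
# (hypothetical parameters, loosely echoing the gamma-like fits
# reported for experimentally observed time constants).
tau_het = rng.gamma(shape=3.0, scale=0.01, size=n)   # seconds
tau_hom = np.full(n, tau_het.mean())                 # homogeneous control
drive = np.tile(np.full(n, 1.5), (500, 1))           # identical drive to all
spk_het = simulate_lif_population(drive, tau_het)
spk_hom = simulate_lif_population(drive, tau_hom)
```

Under identical constant drive the homogeneous population fires in lockstep, while the heterogeneous population spreads its firing rates across neurons, giving the network a richer set of timescales to draw on for temporally structured tasks.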
Vision in dynamically changing environments
Many visual systems can process information in dynamically changing environments. In general, visual perception scales with changes in the visual stimulus, or contrast, irrespective of background illumination. This is achieved by adaptation. However, visual perception is challenged when adaptation is not fast enough to deal with sudden changes in overall illumination, for example when gaze follows a moving object from bright sunlight into a shaded area. We have recently shown that the visual system of the fly solves this problem by propagating a corrective luminance-sensitive signal to higher processing stages. Using in vivo two-photon imaging and behavioural analyses, we showed that distinct OFF-pathway inputs encode contrast and luminance. The luminance-sensitive pathway is particularly required when processing visual motion in dim light, when pure contrast sensitivity underestimates the salience of a stimulus. Recent work in the lab has addressed the question of how two visual pathways obtain such fundamentally different sensitivities given common photoreceptor input. We are also currently working out the network-based strategies by which luminance- and contrast-sensitive signals are combined to guide appropriate visual behaviour. Together, I will discuss the molecular, cellular, and circuit mechanisms that ensure contrast computation, and therefore robust vision, in fast-changing visual scenes.
EEG correlates of Bayesian inference in auditory spatial localization in changing environments
FENS Forum 2024