Topic: adaptation

50 Seminars · 40 ePosters

Latest

Seminar · Neuroscience · Recording

Functional Plasticity in the Language Network – Evidence from Neuroimaging and Neurostimulation

Gesa Hartwigsen
University of Leipzig, Germany
May 20, 2025

Efficient cognition requires flexible interactions between distributed neural networks in the human brain. These networks adapt to challenges by flexibly recruiting different regions and connections. In this talk, I will discuss how we study functional network plasticity and reorganization with combined neurostimulation and neuroimaging across the adult life span. I will argue that short-term plasticity enables flexible adaptation to challenges, via functional reorganization. My key hypothesis is that disruption of higher-level cognitive functions such as language can be compensated for by the recruitment of domain-general networks in our brain. Examples from healthy young brains illustrate how neurostimulation can be used to temporarily interfere with efficient processing, probing short-term network plasticity at the systems level. Examples from people with dyslexia help to better understand network disorders in the language domain and outline the potential of facilitatory neurostimulation for treatment. I will also discuss examples from aging brains where plasticity helps to compensate for loss of function. Finally, examples from lesioned brains after stroke provide insight into the brain’s potential for long-term reorganization and recovery of function. Collectively, these results challenge the view of a modular organization of the human brain and argue for a flexible redistribution of function via systems plasticity.

Seminar · Neuroscience

Probing neural population dynamics with recurrent neural networks

Chethan Pandarinath
Emory University and Georgia Tech
Jun 12, 2024

Large-scale recordings of neural activity are providing new opportunities to study network-level dynamics in unprecedented detail. However, the sheer volume of data and its dynamical complexity are major barriers to uncovering and interpreting these dynamics. I will present latent factor analysis via dynamical systems (LFADS), a sequential autoencoding approach that enables inference of dynamics from neuronal population spiking activity on single trials and at millisecond timescales. I will also discuss recent adaptations of the method to uncover dynamics from neural activity recorded via two-photon calcium imaging. Finally, time permitting, I will mention recent efforts to improve the interpretability of deep-learning-based dynamical systems models.

Seminar · Neuroscience

Neurogenic versus oligodendrogenic progenitors in the postnatal brain: different adaptations, different properties

Ilias Kazanis
Cell Pathology, School of Life Sciences, University of Westminster, London, UK
Feb 21, 2024

Seminar · Neuroscience

A recurrent network model of planning predicts hippocampal replay and human behavior

Marcelo Mattar
NYU
Oct 20, 2023

When interacting with complex environments, humans can rapidly adapt their behavior to changes in task or context. To facilitate this adaptation, we often spend substantial periods of time contemplating possible futures before acting. For such planning to be rational, the benefits of planning to future behavior must at least compensate for the time spent thinking. Here we capture these features of human behavior by developing a neural network model where not only actions, but also planning, are controlled by prefrontal cortex. This model consists of a meta-reinforcement learning agent augmented with the ability to plan by sampling imagined action sequences drawn from its own policy, which we refer to as 'rollouts'. Our results demonstrate that this agent learns to plan when planning is beneficial, explaining the empirical variability in human thinking times. Additionally, the patterns of policy rollouts employed by the artificial agent closely resemble patterns of rodent hippocampal replays recently recorded in a spatial navigation task, in terms of both their spatial statistics and their relationship to subsequent behavior. Our work provides a new theory of how the brain could implement planning through prefrontal-hippocampal interactions, where hippocampal replays are triggered by -- and in turn adaptively affect -- prefrontal dynamics.

Seminar · Neuroscience

A recurrent network model of planning explains hippocampal replay and human behavior

Guillaume Hennequin
University of Cambridge, UK
May 31, 2023

When interacting with complex environments, humans can rapidly adapt their behavior to changes in task or context. To facilitate this adaptation, we often spend substantial periods of time contemplating possible futures before acting. For such planning to be rational, the benefits of planning to future behavior must at least compensate for the time spent thinking. Here we capture these features of human behavior by developing a neural network model where not only actions, but also planning, are controlled by prefrontal cortex. This model consists of a meta-reinforcement learning agent augmented with the ability to plan by sampling imagined action sequences drawn from its own policy, which we refer to as 'rollouts'. Our results demonstrate that this agent learns to plan when planning is beneficial, explaining the empirical variability in human thinking times. Additionally, the patterns of policy rollouts employed by the artificial agent closely resemble patterns of rodent hippocampal replays recently recorded in a spatial navigation task, in terms of both their spatial statistics and their relationship to subsequent behavior. Our work provides a new theory of how the brain could implement planning through prefrontal-hippocampal interactions, where hippocampal replays are triggered by - and in turn adaptively affect - prefrontal dynamics.

Seminar · Neuroscience

Relations and Predictions in Brains and Machines

Kim Stachenfeld
DeepMind
Apr 7, 2023

Humans and animals learn and plan with flexibility and efficiency well beyond that of modern Machine Learning methods. This is hypothesized to owe in part to the ability of animals to build structured representations of their environments, and modulate these representations to rapidly adapt to new settings. In the first part of this talk, I will discuss theoretical work describing how learned representations in hippocampus enable rapid adaptation to new goals by learning predictive representations, while entorhinal cortex compresses these predictive representations with spectral methods that support smooth generalization among related states. I will also cover recent work extending this account, in which we show how the predictive model can be adapted to the probabilistic setting to describe a broader array of generalization results in humans and animals, and how entorhinal representations can be modulated to support sample generation optimized for different behavioral states. In the second part of the talk, I will overview some of the ways in which we have combined many of the same mathematical concepts with state-of-the-art deep learning methods to improve efficiency and performance in machine learning applications like physical simulation, relational reasoning, and design.

Seminar · Neuroscience · Recording

Analogical inference in mathematics: from epistemology to the classroom (and back)

Dr Francesco Nappo & Dr Nicolò Cangiotti
Politecnico di Milano
Feb 23, 2023

In this presentation, we will discuss adaptations of historical examples of mathematical research to bring out some of the intuitive judgments that accompany the working practice of mathematicians when reasoning by analogy. The main epistemological claim that we will aim to illustrate is that a central part of mathematical training consists in developing a quasi-perceptual capacity to distinguish superficial from deep analogies. We think of this capacity as an instance of Hadamard’s (1954) discriminating faculty of the mathematical mind, whereby one is led to distinguish between mere “hookings” (77) and “relay-results” (80): on the one hand, suggestions or ‘hints’, useful to raise questions but not to back up conjectures; on the other, more significant discoveries, which can be used as an evidentiary source in further mathematical inquiry. In the second part of the presentation, we will present some recent applications of this epistemological framework to mathematics education projects for middle and high schools in Italy.

Seminar · Neuroscience · Recording

The multimodal number sense: spanning space, time, sensory modality, and action

David Burr
University of Florence
Oct 20, 2022

Humans and other animals can rapidly estimate the number of items in a scene, flashes or tones in a sequence, and motor actions. Adaptation techniques provide clear evidence in humans for the existence of specialized numerosity mechanisms that make up the number sense. This sense of number is truly general, encoding the numerosity of both spatial arrays and sequential sets, in vision and audition, and interacting strongly with action. The adaptation (cross-sensory and cross-format) acts on sensory mechanisms rather than decisional processes, pointing to a truly general sense.

Seminar · Neuroscience

Brain-muscle signaling coordinates exercise adaptations in Drosophila

Robert Wessells
Wayne State University
Sep 20, 2022

Chronic exercise is a powerful intervention that lowers the incidence of most age-related diseases while promoting healthy metabolism in humans. However, illness, injury or age prevent many humans from consistently exercising. Thus, identification of molecular targets that can mimic the benefits of exercise would be a valuable tool to improve health outcomes of humans with neurodegenerative or mitochondrial diseases, or those with enforced sedentary lifestyles. Using a novel exercise platform for Drosophila, we have identified octopaminergic neurons as a key subset of neurons that are critical for the exercise response, and shown that periodic daily stimulation of these neurons can induce a systemic exercise response in sedentary flies. Octopamine is released into circulation where it signals through various octopamine receptors in target tissues and induces gene expression changes similar to exercise. In particular, we have identified several key molecules that respond to octopamine in skeletal muscle, including the mTOR modulator Sestrin, the PGC-1α homolog Spargel, and the FNDC5/Irisin homolog Iditarod. We are currently testing these molecules as potential therapies for multiple diseases that reduce mobility, including the PolyQ disease SCA2 and the mitochondrial disease Barth syndrome.

Seminar · Neuroscience · Recording

Trading Off Performance and Energy in Spiking Networks

Sander Keemink
Donders Institute for Brain, Cognition and Behaviour
Jun 1, 2022

Many engineered and biological systems must trade off performance and energy use, and the brain is no exception. While there are theories on how activity levels are controlled in biological networks through feedback control (homeostasis), it is not clear what the effects on population coding are, and therefore how performance and energy can be traded off. In this talk we will consider this tradeoff in auto-encoding networks, in which there is a clear definition of performance (the coding loss). We first show how SNNs follow a characteristic trade-off curve between activity levels and coding loss, but that standard networks need to be retrained to achieve different tradeoff points. We next formalize this tradeoff with a joint loss function incorporating coding loss (performance) and activity loss (energy use). From this loss we derive a class of spiking networks which coordinates its spiking to minimize both the activity and coding losses -- and as a result can dynamically adjust its coding precision and energy use. The network utilizes several known activity control mechanisms for this --- threshold adaptation and feedback inhibition --- and elucidates their potential function within neural circuits. Using geometric intuition, we demonstrate how these mechanisms regulate coding precision, and thereby performance. Lastly, we consider how these insights could be transferred to trained SNNs. Overall, this work addresses a key energy-coding trade-off which is often overlooked in network studies, expands on our understanding of homeostasis in biological SNNs, as well as provides a clear framework for considering performance and energy use in artificial SNNs.
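The trade-off at the heart of this abstract can be caricatured in a few lines. A minimal sketch, assuming a linear decoder and an L1 activity penalty; the data, decoder weights, and penalty weight below are made up for illustration and are not from the talk:

```python
# Toy performance-energy trade-off: total loss = coding loss + mu * activity.
# Increasing mu favours cheaper (lower-activity) codes at the cost of
# reconstruction accuracy.

def joint_loss(x, D, r, mu):
    """Squared coding error of x_hat = D r, plus mu-weighted L1 activity."""
    n, m = len(x), len(r)
    x_hat = [sum(D[i][j] * r[j] for j in range(m)) for i in range(n)]
    coding = sum((x[i] - x_hat[i]) ** 2 for i in range(n))
    activity = sum(abs(v) for v in r)
    return coding + mu * activity, coding, activity

# A 1-D signal decoded by two "neurons" with weights 1.0 and 0.5.
x = [1.0]
D = [[1.0, 0.5]]
dense = [0.5, 1.0]   # exact reconstruction: 1.0*0.5 + 0.5*1.0 = 1.0
sparse = [0.9, 0.0]  # cheaper code, but misses the target by 0.1

for r in (dense, sparse):
    total, coding, activity = joint_loss(x, D, r, mu=0.1)
    print(round(total, 3), round(coding, 3), round(activity, 3))
# With mu = 0.1 the sparse code attains the lower total loss (0.1 vs 0.15),
# even though its coding loss is worse.
```

Changing `mu` moves the optimum to a different point on the activity/coding-loss curve, which is the retraining issue the talk's dynamically adjustable networks are designed to avoid.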

Seminar · Neuroscience · Recording

A transcriptomic axis predicts state modulation of cortical interneurons

Stephane Bugeon
Harris & Carandini's lab, UCL
Apr 27, 2022

Transcriptomics has revealed that cortical inhibitory neurons exhibit a great diversity of fine molecular subtypes, but it is not known whether these subtypes have correspondingly diverse activity patterns in the living brain. We show that inhibitory subtypes in primary visual cortex (V1) have diverse correlates with brain state, but that this diversity is organized by a single factor: position along their main axis of transcriptomic variation. We combined in vivo 2-photon calcium imaging of mouse V1 with a novel transcriptomic method to identify mRNAs for 72 selected genes in ex vivo slices. We classified inhibitory neurons imaged in layers 1-3 into a three-level hierarchy of 5 Subclasses, 11 Types, and 35 Subtypes using previously-defined transcriptomic clusters. Responses to visual stimuli differed significantly only across Subclasses, suppressing cells in the Sncg Subclass while driving cells in the other Subclasses. Modulation by brain state differed at all hierarchical levels but could be largely predicted from the first transcriptomic principal component, which also predicted correlations with simultaneously recorded cells. Inhibitory Subtypes that fired more in resting, oscillatory brain states have less axon in layer 1, narrower spikes, lower input resistance and weaker adaptation as determined in vitro and express more inhibitory cholinergic receptors. Subtypes firing more during arousal had the opposite properties. Thus, a simple principle may largely explain how diverse inhibitory V1 Subtypes shape state-dependent cortical processing.

Seminar · Neuroscience · Recording

Transcriptional adaptation couples past experience and future sensory responses

Tatsuya Tsukahara
Datta lab, Harvard Medical School
Apr 27, 2022

Animals traversing different environments encounter both stable background stimuli and novel cues, which are generally thought to be detected by primary sensory neurons and then distinguished by downstream brain circuits. Sensory adaptation, a neural mechanism that filters out background by minimizing responses to stable sensory stimuli, is a fundamental feature of sensory systems. Adaptation over relatively fast timescales (milliseconds to minutes) has been reported in many sensory systems. However, adaptation to persistent environmental stimuli over longer timescales (hours to days) has been largely unexplored, even though these timescales are ethologically important, since animals typically stay in one environment for hours. I showed that each of the ~1,000 olfactory sensory neuron (OSN) subtypes in the mouse harbors a distinct transcriptome whose content is precisely determined by interactions between its odorant receptor and the environment. This transcriptional variation is systematically organized to support sensory adaptation: expression levels of many genes relevant to transforming odors into spikes continuously vary across OSN subtypes, dynamically adjust to new environments over hours, and accurately predict acute OSN-specific odor responses. The sensory periphery therefore separates salient signals from predictable background via a transcriptional mechanism whose moment-to-moment state reflects the past and constrains the future; these findings suggest a general model in which structured transcriptional variation within a cell type reflects individual experience.

Seminar · Neuroscience · Recording

Four questions about brain and behaviour

Alexandra de Sousa
Bath Spa University
Apr 25, 2022

Tinbergen encouraged ethologists to address animal behaviour by answering four questions, covering physiology, adaptation, phylogeny, and development. This broad approach has implications for neuroscience and psychology, yet questions about phylogeny are rarely considered in these fields. Here I describe how phylogeny can shed light on our understanding of brain structure and function. Further, I show that we now have, or are developing, the data and analytical methods necessary to study the natural history of the human mind.

Seminar · Neuroscience · Recording

Retinal responses to natural inputs

Fred Rieke
University of Washington
Apr 18, 2022

The research in my lab focuses on sensory signal processing, particularly in cases where sensory systems perform at or near the limits imposed by physics. Photon counting in the visual system is a beautiful example. At its peak sensitivity, the performance of the visual system is limited largely by the division of light into discrete photons. This observation has several implications for phototransduction and signal processing in the retina: rod photoreceptors must transduce single photon absorptions with high fidelity, single photon signals in photoreceptors, which are only 0.03 – 0.1 mV, must be reliably transmitted to second-order cells in the retina, and absorption of a single photon by a single rod must produce a noticeable change in the pattern of action potentials sent from the eye to the brain. My approach is to combine quantitative physiological experiments and theory to understand photon counting in terms of basic biophysical mechanisms. Fortunately there is more to visual perception than counting photons. The visual system is very adept at operating over a wide range of light intensities (about 12 orders of magnitude). Over most of this range, vision is mediated by cone photoreceptors. Thus adaptation is paramount to cone vision. Again one would like to understand quantitatively how the biophysical mechanisms involved in phototransduction, synaptic transmission, and neural coding contribute to adaptation.

Seminar · Neuroscience · Recording

A Panoramic View on Vision

Maximilian Joesch
IST Austria
Mar 7, 2022

Statistics of natural scenes are not uniform: their structure varies dramatically from ground to sky. It remains unknown whether these non-uniformities are reflected in the large-scale organization of the early visual system, and what benefits such adaptations would confer. By deploying an efficient coding argument, we predict that changes in the structure of receptive fields across visual space increase the efficiency of sensory coding. To test this experimentally, we developed a simple, novel imaging system that is indispensable for studies at this scale. In agreement with our predictions, we could show that receptive fields of retinal ganglion cells change their shape along the dorsoventral axis, with a marked surround asymmetry at the visual horizon. Our work demonstrates that, according to principles of efficient coding, the panoramic structure of natural scenes is exploited by the retina across space and cell types.

Seminar · Neuroscience · Recording

NaV Long-term Inactivation Regulates Adaptation in Place Cells and Depolarization Block in Dopamine Neurons

Carmen Canavier
LSU Health Sciences Center, New Orleans
Feb 9, 2022

In behaving rodents, CA1 pyramidal neurons receive spatially tuned depolarizing synaptic input while traversing a specific location within an environment, called the cell's place field. Midbrain dopamine neurons participate in reinforcement learning, and bursts of action potentials riding a depolarizing wave of synaptic input signal rewards and reward expectation. Interestingly, slice electrophysiology in vitro shows that both types of cells exhibit a pronounced reduction in firing rate (adaptation), and even cessation of firing, during sustained depolarization. We included a five-state Markov model of NaV1.6 (for CA1 pyramidal neurons) and NaV1.2 (for dopamine neurons), respectively, in computational models of these two types of neurons. Our simulations suggest that long-term inactivation of this channel is responsible for the adaptation in CA1 pyramidal neurons in response to triangular depolarizing current ramps. We also show that the differential contribution of slow inactivation in two subpopulations of midbrain dopamine neurons can account for their different dynamic ranges, as assessed by their responses to similar depolarizing ramps. These results suggest that long-term inactivation of the sodium channel is a general mechanism for adaptation.
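How a slowly recovering inactivated state produces adaptation can be illustrated with a drastically reduced toy, not the talk's five-state NaV model: three states with made-up per-millisecond transition probabilities, iterated under sustained depolarization.

```python
# States: 0 = closed/available, 1 = open, 2 = long-term inactivated (I_L).
# Under sustained depolarization, slow entry into I_L and very slow recovery
# drain the pool of available channels, which reads out as adaptation.

def step(p, T):
    """One Markov step for occupancy vector p under row-stochastic T."""
    n = len(p)
    return [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]

T = [  # illustrative probabilities per 1-ms step; each row sums to 1
    [0.70, 0.30, 0.000],   # closed -> open readily while depolarized
    [0.60, 0.30, 0.100],   # open channels occasionally fall into I_L
    [0.002, 0.00, 0.998],  # I_L recovers very slowly
]

p = [1.0, 0.0, 0.0]        # all channels available at ramp onset
for _ in range(500):       # half a second of sustained depolarization
    p = step(p, T)

print(round(p[2], 2))      # most of the population is long-term inactivated
```

As the I_L occupancy grows, the sodium current available for spiking shrinks, so firing slows and can eventually cease, qualitatively matching the adaptation described above.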

Seminar · Neuroscience · Recording

Why is the suprachiasmatic nucleus such a brilliant circadian time-keeper?

Michael Hastings
MRC Laboratory of Molecular Biology, Cambridge
Feb 8, 2022

Circadian clocks dominate our lives. By creating and distributing an internal representation of 24-hour solar time, they prepare us, and thereby adapt us, to the daily and seasonal world. Jet-lag is an obvious indicator of what can go wrong when such adaptation is disrupted acutely. More seriously, the growing prevalence of rotational shift-work, which runs counter to our circadian life, is a significant chronic challenge to health, presenting as increased incidence of systemic conditions such as metabolic and cardiovascular disease. Added to this, circadian and sleep disturbances are a recognised feature of various neurological and psychiatric conditions, and in some cases may contribute to disease progression. The “head ganglion” of the circadian system is the suprachiasmatic nucleus (SCN) of the hypothalamus. It synchronises the literally innumerable cellular clocks across the body, to each other and to solar time. Isolated in organotypic slice culture, it can maintain precise, high-amplitude circadian cycles of neural activity effectively indefinitely, just as it does in vivo. How is this achieved: how does this clock in a dish work? This presentation will consider SCN time-keeping at the level of molecular feedback loops, neuropeptidergic networks and neuron-astrocyte interactions.

Seminar · Neuroscience

Separable pupillary signatures of perception and action during perceptual multistability

Jan Brascamp
Michigan State University
Jan 26, 2022

The pupil provides a rich, non-invasive measure of the neural bases of perception and cognition, and has been of particular value in uncovering the role of arousal-linked neuromodulation, which alters cortical processing as well as pupil size. But pupil size is subject to a multitude of influences, which complicates unique interpretation. We measured pupils of observers experiencing perceptual multistability -- an ever-changing subjective percept in the face of unchanging but inconclusive sensory input. In separate conditions the endogenously generated perceptual changes were either task-relevant or not, allowing a separation between perception-related and task-related pupil signals. Perceptual changes were marked by a complex pupil response that could be decomposed into two components: a dilation tied to task execution and plausibly indicative of an arousal-linked noradrenaline surge, and an overlapping constriction tied to the perceptual transient and plausibly a marker of altered visual cortical representation. Constriction, but not dilation, amplitude systematically depended on the time interval between perceptual changes, possibly providing an overt index of neural adaptation. These results show that the pupil provides a simultaneous reading on interacting but dissociable neural processes during perceptual multistability, and suggest that arousal-linked neuromodulation shapes action but not perception in these circumstances. This presentation covers work that was published in eLife.

Seminar · Neuroscience · Recording

NMC4 Short Talk: What can 140,000 Reaches Tell Us About Demographic Contributions to Visuomotor Adaptation?

Hrach Asmerian
University of California, Berkeley
Dec 2, 2021

Motor learning is typically assessed in the lab, affording a high degree of control over the task environment. However, this level of control often comes at the cost of smaller sample sizes and a homogeneous pool of participants (e.g. college students). To address this, we have designed a web-based motor learning experiment, making it possible to reach a larger, more diverse set of participants. As a proof of concept, we collected data from 1,581 participants completing a visuomotor rotation task, in which participants controlled a visual cursor on the screen with their mouse or trackpad. Motor learning was indexed by how quickly participants were able to compensate for a 45° rotation imposed between the cursor and their actual movement. Using a cross-validated LASSO regression, we found that motor learning varied significantly with the participant’s age and sex, and also correlated strongly with the location of the target, visual acuity, and satisfaction with the experiment. In contrast, participants' mouse and browser type were features eliminated by the model, indicating that motor performance was not influenced by variations in computer hardware and software. Together, this proof-of-concept study demonstrates how large datasets can generate important insights into the factors underlying motor learning.
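The feature elimination reported here is the signature behaviour of the L1 penalty in LASSO. A hedged sketch of that mechanism, using plain coordinate descent on invented toy data (none of the study's own code or data appears below; in practice a library such as scikit-learn's LassoCV would be used):

```python
# LASSO via coordinate descent: minimize (1/2n)*||y - Xw||^2 + lam*||w||_1.
# The soft-threshold update drives coefficients of uninformative features
# exactly to zero, i.e. those features are "eliminated by the model".

def soft_threshold(rho, lam):
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso(X, y, lam, n_iter=200):
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(n_iter):
        for j in range(d):
            # correlation of feature j with the residual excluding feature j
            rho = sum(X[i][j] * (y[i]
                                 - sum(X[i][k] * w[k] for k in range(d))
                                 + X[i][j] * w[j]) for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            w[j] = soft_threshold(rho, lam) / z
    return w

# Feature 0 drives y (roughly y = 2*x0); feature 1 is pure noise.
X = [[1.0, 0.3], [2.0, -0.2], [3.0, 0.1], [4.0, -0.4], [5.0, 0.2]]
y = [2.1, 3.9, 6.0, 8.1, 9.9]

w = lasso(X, y, lam=0.5)
print([round(v, 2) for v in w])   # the noise feature's weight is exactly 0.0
```

Cross-validation, as in the study, would additionally pick the penalty strength `lam` by held-out prediction error rather than fixing it by hand.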

Seminar · Neuroscience · Recording

NMC4 Short Talk: A theory for the population rate of adapting neurons disambiguates mean vs. variance-driven dynamics and explains log-normal response statistics

Laureline Logiaco (she/her)
Columbia University
Dec 2, 2021

Recently, the field of computational neuroscience has seen an explosion of the use of trained recurrent network models (RNNs) to model patterns of neural activity. These RNN models are typically characterized by tuned recurrent interactions between rate 'units' whose dynamics are governed by smooth, continuous differential equations. However, the response of biological single neurons is better described by all-or-none events - spikes - that are triggered in response to the processing of their synaptic input by the complex dynamics of their membrane. One line of research has attempted to resolve this discrepancy by linking the average firing probability of a population of simplified spiking neuron models to rate dynamics similar to those used for RNN units. However, challenges remain to account for complex temporal dependencies in the biological single neuron response and for the heterogeneity of synaptic input across the population. Here, we make progress by showing how to derive dynamic rate equations for a population of spiking neurons with multi-timescale adaptation properties - as this was shown to accurately model the response of biological neurons - while they receive independent time-varying inputs, leading to plausible asynchronous activity in the network. The resulting rate equations yield an insightful segregation of the population's response into dynamics that are driven by the mean signal received by the neural population, and dynamics driven by the variance of the input across neurons, with respective timescales that are in agreement with slice experiments. Further, these equations explain how input variability can shape log-normal instantaneous rate distributions across neurons, as observed in vivo. 
Our results help interpret properties of the neural population response and open the way to investigating whether the more biologically plausible and dynamically complex rate model we derive could provide useful inductive biases if used in an RNN to solve specific tasks.
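The log-normal observation has a compact textbook intuition, sketched below under assumptions of mine rather than the talk's actual derivation: Gaussian input variability across neurons pushed through an exponential transfer function yields log-normally distributed rates, whose heavy right tail pulls the mean well above the median.

```python
import math
import random

# If log(rate) is Gaussian across the population (exponential transfer of
# Gaussian inputs), the rate distribution is log-normal: all-positive and
# strongly right-skewed.

random.seed(0)
inputs = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # across-neuron variability
rates = [math.exp(u) for u in inputs]                     # exponential transfer

rates.sort()
median = rates[len(rates) // 2]
mean = sum(rates) / len(rates)
print(mean > median)   # heavy right tail: mean exceeds the median
```

This is only the static picture; the talk's contribution is deriving how such input variance shapes the population rate dynamically, separate from the mean-driven component.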

Seminar · Neuroscience

Adaptive bottleneck to pallium for sequence memory, path integration and mixed selectivity representation

André Longtin
University of Ottawa
Nov 10, 2021

Spike-driven adaptation involves intracellular mechanisms that are initiated by neural firing and lead to the subsequent reduction of spiking rate followed by a recovery back to baseline. We report on long (>0.5 second) recovery times from adaptation in a thalamic-like structure in weakly electric fish. This adaptation process is shown via modeling and experiment to encode in a spatially invariant manner the time intervals between event encounters, e.g. with landmarks as the animal learns the location of food. These cells also come in two varieties, ones that care only about the time since the last encounter, and others that care about the history of encounters. We discuss how the two populations can share in the task of representing sequences of events, supporting path integration and converting from ego-to-allocentric representations. The heterogeneity of the population parameters enables the representation and Bayesian decoding of time sequences of events which may be put to good use in path integration and hilus neuron function in hippocampus. Finally we discuss how all the cells of this gateway to the pallium exhibit mixed selectivity of social features of their environment. The data and computational modeling further reveal that, in contrast to a long-held belief, these gymnotiform fish are endowed with a corollary discharge, albeit only for social signalling.

Seminar · Neuroscience · Recording

Efficient GPU training of SNNs using approximate RTRL

James Knight
University of Sussex
Nov 3, 2021

Last year’s SNUFA workshop report concluded “Moving toward neuron numbers comparable with biology and applying these networks to real-world data-sets will require the development of novel algorithms, software libraries, and dedicated hardware accelerators that perform well with the specifics of spiking neural networks” [1]. Taking inspiration from machine learning libraries — where techniques such as parallel batch training minimise latency and maximise GPU occupancy — as well as our previous research on efficiently simulating SNNs on GPUs for computational neuroscience [2,3], we are extending our GeNN SNN simulator to pursue this vision. To explore GeNN’s potential, we use the eProp learning rule [4] — which approximates RTRL — to train SNN classifiers on the Spiking Heidelberg Digits and the Spiking Sequential MNIST datasets. We find that the performance of these classifiers is comparable to those trained using BPTT [5] and verify that the theoretical advantages of neuron models with adaptation dynamics [5] translate to improved classification performance. We then measured execution times and found that training an SNN classifier using GeNN and eProp becomes faster than SpyTorch and BPTT after less than 685 timesteps and much larger models can be trained on the same GPU when using GeNN. Furthermore, we demonstrate that our implementation of parallel batch training improves training performance by over 4× and enables near-perfect scaling across multiple GPUs. Finally, we show that performing inference using a recurrent SNN using GeNN uses less energy and has lower latency than a comparable LSTM simulated with TensorFlow [6].

Seminar · Neuroscience · Recording

Adaptation-driven sensory detection and sequence memory

André Longtin
University of Ottawa
Oct 6, 2021

Spike-driven adaptation involves intracellular mechanisms that are initiated by spiking and lead to the subsequent reduction of spiking rate. One of its consequences is the temporal patterning of spike trains, as it imparts serial correlations between interspike intervals in baseline activity. Surprisingly the hidden adaptation states that lead to these correlations themselves exhibit quasi-independence. This talk will first discuss recent findings about the role of such adaptation in suppressing noise and extending sensory detection to weak stimuli that leave the firing rate unchanged. Further, a matching of the post-synaptic responses to the pre-synaptic adaptation time scale enables a recovery of the quasi-independence property, and can explain observations of correlations between post-synaptic EPSPs and behavioural detection thresholds. We then consider the involvement of spike-driven adaptation in the representation of intervals between sensory events. We discuss the possible link of this time-stamping mechanism to the conversion of egocentric to allocentric coordinates. The heterogeneity of the population parameters enables the representation and Bayesian decoding of time sequences of events which may be put to good use in path integration and hilus neuron function in hippocampus.

SeminarNeuroscience

Understanding the Assessment of Spatial Neglect and its Treatment Using Prism Adaptation Training

Matthew Checketts
Division of Neuroscience & Experimental Psychology and Division of Psychology and Mental Health, Faculty of Biology, Medicine and Health, The University of Manchester, Manchester Academic Health Science Centre, Manchester, United Kingdom
Oct 5, 2021

Spatial neglect is a syndrome that is most frequently associated with damage to the right hemisphere, although damage to the left hemisphere can also result in signs of spatial neglect. It is characterised by absent or deficient awareness of the contralesional side of space. The screening and diagnosis of spatial neglect lacks a universal gold standard, but is usually achieved by using various modes of assessment. Spatial neglect is also difficult to treat, although prism adaptation training (PAT) has reportedly shown some promise. This seminar will include highlights from a series of studies designed to identify knowledge gaps, and will suggest ways in which these can be bridged. The first study was conducted to identify and quantify clinicians’ use of assessment tools for spatial neglect, finding that several different tools are in use, but that there is an emerging consensus and appetite for harmonisation. The second study involved PAT, and sought to uncover whether PAT can improve engagement in recommended therapy in order to improve the outcomes of stroke survivors with spatial neglect. The final study, a systematic review and meta-analysis, sought to investigate the scientific efficacy (rather than clinical effectiveness) of PAT, identifying several knowledge gaps in the existing literature and a need for a new approach to the study of PAT in the clinical setting.

SeminarNeuroscience

Understanding the role of prediction in sensory encoding

Jason Mattingley
Monash Biomedical Imaging
Jul 29, 2021

At any given moment the brain receives more sensory information than it can use to guide adaptive behaviour, creating the need for mechanisms that promote efficient processing of incoming sensory signals. One way in which the brain might reduce its sensory processing load is to encode successive presentations of the same stimulus in a more efficient form, a process known as neural adaptation. Conversely, when a stimulus violates an expected pattern, it should evoke an enhanced neural response. Such a scheme for sensory encoding has been formalised in predictive coding theories, which propose that recent experience establishes expectations in the brain that generate prediction errors when violated. In this webinar, Professor Jason Mattingley will discuss whether the encoding of elementary visual features is modulated when otherwise identical stimuli are expected or unexpected based upon the history of stimulus presentation. In humans, EEG was employed to measure neural activity evoked by gratings of different orientations, and multivariate forward modelling was used to determine how orientation selectivity is affected for expected versus unexpected stimuli. In mice, two-photon calcium imaging was used to quantify orientation tuning of individual neurons in the primary visual cortex to expected and unexpected gratings. Results revealed enhanced orientation tuning to unexpected visual stimuli, both at the level of whole-brain responses and for individual visual cortex neurons. Professor Mattingley will discuss the implications of these findings for predictive coding theories of sensory encoding. Professor Jason Mattingley is a Laureate Fellow and Foundation Chair in Cognitive Neuroscience at The University of Queensland. His research is directed toward understanding the brain processes that support perception, selective attention and decision-making, in health and disease.

SeminarNeuroscience

Digitization as a driving force for collaboration in neuroscience

Michael Denker
Forschungszentrum Jülich
Jul 1, 2021

Many of the collaborations we encounter in our scientific careers are centered on a common idea that can be associated with certain resources, such as a dataset, an algorithm, or a model. All partners in a collaboration need to develop a common understanding of these resources, and need to be able to access them in a simple and unambiguous manner in order to avoid incorrect conclusions, especially in highly cross-disciplinary contexts. While digital computers have assisted scientific workflows in experiment and simulation for many decades, the high degree of heterogeneity in the field has led to a scattered landscape of highly customized, lab-internal solutions for organizing and managing resources on a project-by-project basis. Only with the availability of modern technologies, such as the semantic web, platforms for collaborative coding, and data standards that span different disciplines, do we have tools at our disposal to make resources increasingly more accessible, understandable, and usable. However, without overarching standardization efforts and adaptation of such technologies to the workflows and needs of individual researchers, their adoption by the neuroscience community will be impeded. From the perspective of computational neuroscience, which is inherently dependent on leveraging data and methods across the field of neuroscience for inspiration and validation, I will outline my view on past and present developments towards a more rigorous use of digital resources and how they have improved collaboration, and introduce emerging initiatives to support this process in the future (e.g., EBRAINS http://ebrains.eu, NFDI-Neuro http://www.nfdi-neuro.de).

SeminarNeuroscience

Central representations of protein availability regulating appetite and body weight control

Clemence Blouet
Wellcome-MRC Institute of Metabolic Science, University of Cambridge
Jun 14, 2021

Dietary protein quantity and quality greatly impact metabolic health via evolutionarily conserved mechanisms that ensure avoidance of amino acid imbalanced food sources, promote hyperphagia when dietary protein density is low, and conversely produce satiety when dietary protein density is high. Growing evidence supports the emerging concept of protein homeostasis in mammals, whereby protein intake is maintained within a tight range, independently of energy intake, to reach a target protein intake. The behavioural and neuroendocrine mechanisms underlying these adaptations are unclear and form the focus of our research.

SeminarNeuroscience

Stress and the Individual: Neurobiological Mechanisms Underlying Differential Susceptibilities and Adaptations

Carmen Sandi
Swiss Federal Institute of Technology, Lausanne
May 1, 2021

Dr. Carmen Sandi leads the Laboratory of Behavioral Genetics at EPFL, Lausanne. Her lab investigates the impact and mechanisms whereby stress and anxiety affect brain and behavior, in an integrative program involving studies in rodents and humans. She is the founder and co-president of the Swiss Stress Network and co-director of the Swiss National Centre of Competence in Research Synapsy. She is Chair of the ALBA Network, and past President of the Cajal Advanced Neuroscience Training Program and the Federation of European Neuroscience Societies.

SeminarNeuroscienceRecording

Recurrent network dynamics lead to interference in sequential learning

Friedrich Schuessler
Barak lab, Technion, Haifa, Israel
Apr 29, 2021

Learning in real life is often sequential: A learner first learns task A, then task B. If the tasks are related, the learner may adapt the previously learned representation instead of generating a new one from scratch. Adaptation may ease learning task B but may also decrease the performance on task A. Such interference has been observed in experimental and machine learning studies. In the latter case, it is mediated by correlations between weight updates for the different tasks. In typical applications, like image classification with feed-forward networks, these correlated weight updates can be traced back to input correlations. For many neuroscience tasks, however, networks need to not only transform the input, but also generate substantial internal dynamics. Here we illuminate the role of internal dynamics for interference in recurrent neural networks (RNNs). We analyze RNNs trained sequentially on neuroscience tasks with gradient descent and observe forgetting even for orthogonal tasks. We find that the degree of interference changes systematically with task properties, especially the emphasis on input-driven over autonomously generated dynamics. To better understand our numerical observations, we thoroughly analyze a simple model of working memory: For task A, a network is presented with an input pattern and trained to generate a fixed point aligned with this pattern. For task B, the network has to memorize a second, orthogonal pattern. Adapting an existing representation corresponds to the rotation of the fixed point in phase space, as opposed to the emergence of a new one. We show that the two modes of learning – rotation vs. new formation – are directly linked to recurrent vs. input-driven dynamics. We make this notion precise in a further simplified, analytically tractable model, where learning is restricted to a 2x2 matrix.
In our analysis of trained RNNs, we also make the surprising observation that, across different tasks, larger random initial connectivity reduces interference. Analyzing the fixed point task reveals the underlying mechanism: the random connectivity strongly accelerates the learning mode of new formation and has less effect on rotation. The former thus wins the race to zero loss, and interference is reduced. Altogether, our work offers a new perspective on sequential learning in recurrent networks, and the emphasis on internally generated dynamics allows us to take the history of individual learners into account.
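
The interference mechanism mentioned above for feed-forward networks (correlated weight updates traced back to input correlations) can be demonstrated in a toy linear model. This is a hypothetical illustration, not the authors' setup: a scalar linear readout is trained by gradient descent on task A, then on task B, and the loss on task A is re-measured.

```python
import numpy as np

def forgetting(corr, n_steps=500, lr=0.1):
    """Increase in task-A loss caused by subsequently training on task B,
    where `corr` is the correlation between the two tasks' input directions."""
    uA = np.array([1.0, 0.0])
    uB = np.array([corr, np.sqrt(1.0 - corr**2)])  # unit input for task B
    w = np.zeros(2)
    for _ in range(n_steps):                 # task A: fit target +1 for uA
        w -= lr * (w @ uA - 1.0) * uA
    loss_A_before = (w @ uA - 1.0) ** 2
    for _ in range(n_steps):                 # task B: fit target -1 for uB
        w -= lr * (w @ uB + 1.0) * uB
    loss_A_after = (w @ uA - 1.0) ** 2
    return loss_A_after - loss_A_before

# Orthogonal inputs give uncorrelated weight updates and no forgetting;
# correlated inputs produce interference.
print(forgetting(0.0), forgetting(0.8))
```

In this linear case forgetting vanishes exactly at zero input correlation, which is precisely why the abstract's observation of forgetting for orthogonal tasks in recurrent networks is surprising: the internal dynamics supply an extra source of interference.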

SeminarNeuroscienceRecording

A neuronal model for learning to keep a rhythmic beat

John Rinzel
New York University
Apr 21, 2021

When listening to music, we typically lock onto and move to a beat (1-6 Hz). Behavioral studies of such synchronization abound (Repp 2005), yet the neural mechanisms remain poorly understood. Some models hypothesize an array of self-sustaining entrainable neural oscillators that resonate when forced with rhythmic stimuli (Large et al. 2010). In contrast, our formulation focuses on event time estimation and plasticity: a neuronal beat generator that adapts its intrinsic frequency and phase to match the external rhythm. The model quickly learns new rhythms, within a few cycles, as found in human behavior. When the stimulus is removed, the beat generator continues to produce the learned rhythm, in accordance with a synchronization-continuation task.
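
The adaptation scheme can be caricatured with a simple error-correction update (a hedged sketch, not the speaker's neuronal model; `alpha` and `beta` are made-up correction gains): the internal period relaxes toward the stimulus inter-onset interval, the phase is nudged by the asynchrony, and the generator free-runs once the stimulus stops.

```python
def beat_keeper(stimulus_ioi=0.5, n_paced=8, n_continued=4,
                period0=0.7, alpha=0.5, beta=0.3):
    """Adapt period and phase to a metronome, then continue unpaced."""
    period, t_beat = period0, 0.0
    beats = []
    for i in range(n_paced):
        t_stim = i * stimulus_ioi                 # metronome onset
        error = t_beat - t_stim                   # asynchrony to the onset
        period -= beta * (period - stimulus_ioi)  # period correction
        t_beat += period - alpha * error          # phase-corrected next beat
        beats.append(t_beat)
    for _ in range(n_continued):                  # stimulus removed: free-run
        t_beat += period
        beats.append(t_beat)
    return period, beats

period, beats = beat_keeper()
```

With these gains, the internal period converges to the 0.5 s stimulus interval within a handful of paced cycles, and the continuation beats are produced at the learned period.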

SeminarNeuroscienceRecording

Stability-Flexibility Dilemma in Cognitive Control: A Dynamical System Perspective

Naomi Leonard
Princeton University
Mar 26, 2021

Constraints on control-dependent processing have become a fundamental concept in general theories of cognition that explain human behavior in terms of rational adaptations to these constraints. However, these theories lack a rationale for why such constraints would exist in the first place. Recent work suggests that constraints on the allocation of control facilitate flexible task switching at the expense of the stability needed to support goal-directed behavior in the face of distraction. We formulate this problem in a dynamical system, in which control signals are represented as attractors and in which constraints on control allocation limit the depth of these attractors. We derive formal expressions of the stability-flexibility tradeoff, showing that constraints on control allocation improve cognitive flexibility but impair cognitive stability. We provide evidence that human participants adopt higher constraints on the allocation of control as the demand for flexibility increases, but that participants deviate from optimal constraints. In continuing work, we are investigating how the collaborative performance of a group of individuals can benefit from individual differences in the balance between cognitive stability and flexibility.
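
The attractor picture can be illustrated with a one-dimensional bistable system (an illustrative toy, not the authors' formal model): a `gain` parameter scales the depth of two attractors representing task configurations, and a control input `c` pushes the state toward the other attractor. Deeper attractors are more stable but take longer to switch between.

```python
def time_to_switch(gain, c=0.3, dt=0.01, t_max=200.0):
    """Time for x' = gain*(x - x**3) + c to carry the state from the
    x < 0 attractor across to x > 0 (Euler integration)."""
    x, t = -1.0, 0.0          # start settled in the "task A" attractor
    while x < 0.0 and t < t_max:
        x += dt * (gain * (x - x**3) + c)
        t += dt
    return t

# Shallower attractors (smaller gain) permit faster task switching.
t_shallow = time_to_switch(gain=0.2)
t_deep = time_to_switch(gain=0.7)
```

The same `gain` that slows switching also makes the attractor harder to dislodge by a transient distractor, which is the stability side of the tradeoff described above.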

SeminarNeuroscience

Nr4a1-mediated morphological adaptations in Ventral Pallidal projections to Mediodorsal Thalamus support cocaine intake and relapse-like behaviors

Michel Engeln
Institute of Neurodegenerative Diseases, University of Bordeaux, Bordeaux, France
Mar 19, 2021

Growing evidence suggests the ventral pallidum (VP) is critical for drug intake and seeking behaviors. Receiving dense projections from the nucleus accumbens as well as dopamine inputs from the midbrain, the VP plays a central role in the control of motivated behaviors. Repeated exposure to cocaine is known to alter VP neuronal firing and neurotransmission. Surprisingly, there is limited information on the molecular adaptations occurring in VP neurons following cocaine intake. To provide insights into cocaine-induced transcriptional alterations, we performed RNA-sequencing on the VP of mice following cocaine self-administration. Gene Ontology analysis pointed toward alterations in dendrite- and spine-related genes. Subsequent transcriptional regulator analysis identified the transcription factor Nr4a1 as a common regulator of these sets of morphology-related genes. Consistent with the central role of the VP in reward, its neurons project to several key regions associated with cocaine-mediated behaviors. We thus assessed Nr4a1 expression levels in various projection populations. Following cocaine self-administration, VP neurons projecting to the mediodorsal thalamus (MDT) showed significantly increased Nr4a1 levels. To further investigate the role of Nr4a1 in cocaine intake and relapse, we bidirectionally manipulated its expression levels selectively in VP neurons projecting to the MDT. Increasing Nr4a1 levels resulted in enhanced relapse-like behaviors accompanied by a blockade of cocaine-induced spinogenesis. However, decreasing Nr4a1 expression levels completely abolished cocaine intake and consequent relapse-like behaviors. Together, our preliminary findings suggest that drug-induced neuronal remodeling in pallido-thalamic circuits is critical for cocaine intake and relapse-like behaviors.

SeminarNeuroscienceRecording

Distinct forms of cortical plasticity underlie difficulties to reliably detect sounds in noisy environments; Acoustic context modulates natural sound discrimination in auditory cortex through frequency specific adaptation

Dr. Jennifer Resnik; Dr. Julio Hechavarria
Ben-Gurion University; Goethe University
Feb 23, 2021

SeminarNeuroscience

Plasticity of Pain and Pleasure

Robert Bonin
University of Toronto Centre for the Study of Pain
Feb 1, 2021

What happens when the nervous system fails to adapt? Our perception of the world relies on a nervous system that learns and adapts to sensory information. Based on our experience we can predict what a wooden surface will feel like, that fire is hot, and that a gentle caress from a partner can be soothing. But our sensory experience of the world is not static – warm water can feel like fire on sunburned skin and the gentle brush of our clothes can be excruciating after an injury. In pathological conditions such as chronic pain, changes in nervous system function can cause normally innocuous sensory stimuli to be perceived as aversive or painful long after the initial injury has happened. These changes can sometimes be similar to the formation of a pain ‘memory’ that can modulate and distort our perception of sensory information. Our research program seeks to understand how fundamental processes that govern the formation and maintenance of plastic changes in the nervous system can lead to pathological conditions and how we can reverse engineer these changes to treat chronic conditions.

SeminarNeuroscienceRecording

Theory and modeling of whisking rhythm generation in the brainstem

David Golomb
Ben Gurion University
Jan 30, 2021

The vIRt nucleus in the medulla, composed mainly of inhibitory neurons, is necessary for whisking rhythm generation. It innervates motoneurons in the facial nucleus (FN) that project to intrinsic vibrissa muscles. The nearby pre-Bötzinger complex (pBötC), which generates inhalation, sends inhibitory inputs to the vIRt nucleus that contribute to the synchronization of vIRt neurons. Lower-amplitude periodic whisking, however, can occur after decay of the pBötC signal. To explain how the vIRt network generates these “intervening” whisks by bursting in synchrony, and how pBötC input induces strong whisks, we construct and analyze a conductance-based (CB) model of the vIRt circuit composed of two hypothetical groups, vIRtr and vIRtp, of bursting inhibitory neurons with spike-frequency adaptation currents and constant external inputs. The CB model is reduced to a rate model to enable analytical treatment. We find, analytically and computationally, that without pBötC input, periodic bursting states occur within certain ranges of network connectivity. Whisk amplitude increases with the level of constant external input to the vIRt. With pBötC inhibition intact, the amplitude of the first whisk in a breathing cycle is larger than that of the intervening whisks for large pBötC input and small inhibitory coupling between the vIRt sub-populations. The pBötC input advances the next whisk and reduces its amplitude if it arrives at the beginning of the whisking cycle generated by the vIRt, and delays the next whisk if it arrives at the end of that cycle. Our theory provides a mechanism for whisking generation and reveals how whisking frequency and amplitude are controlled.
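
The reduced rate-model idea can be caricatured as two mutually inhibitory populations with slow spike-frequency adaptation under constant drive (an illustrative sketch with invented parameters, not the published model): adaptation terminates each population's active phase, yielding anti-phase rhythmic bursts.

```python
import numpy as np

def simulate(n_steps=20000, dt=0.001, tau_r=0.01, tau_a=0.1,
             w_inh=2.0, g_a=1.5, drive=1.0):
    """Two rectified-linear rate units with cross-inhibition and adaptation."""
    r = np.array([0.6, 0.1])       # population firing rates
    a = np.zeros(2)                # slow adaptation variables
    trace = np.zeros((n_steps, 2))
    for t in range(n_steps):
        inp = drive - w_inh * r[::-1] - g_a * a   # cross-inhibition + adaptation
        r = r + dt / tau_r * (-r + np.maximum(inp, 0.0))
        a = a + dt / tau_a * (-a + r)
        trace[t] = r
    return trace

trace = simulate()
```

With these parameters the symmetric state is unstable and no winner-take-all fixed point exists, so the two units alternate: each burst ends when the active unit's adaptation lets the suppressed unit escape, a release mechanism in the spirit of the half-center oscillators described above.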

SeminarNeuroscienceRecording

Machine Learning as a tool for positive impact : case studies from climate change

Alexandra (Sasha) Luccioni
University of Montreal and Mila (Quebec Institute for Learning Algorithms)
Dec 10, 2020

Climate change is one of our generation's greatest challenges, with increasingly severe consequences on global ecosystems and populations. Machine Learning has the potential to address many important challenges in climate change, from both the mitigation (reducing its extent) and adaptation (preparing for unavoidable consequences) perspectives. To present the extent of these opportunities, I will describe some of the projects that I am involved in, spanning generative models, computer vision, and natural language processing. There are many opportunities for fundamental innovation in this field, advancing the state-of-the-art in Machine Learning while ensuring that this fundamental progress translates into positive real-world impact.

SeminarNeuroscienceRecording

Beyond energy - an unconventional role of mitochondria in cone photoreceptors

Wei Li
NIH Bethesda
Dec 8, 2020

The long-term goal of my research is to study the mammalian retina as a model for the central nervous system (CNS) -- to understand how it functions in physiological conditions, how it is formed, how it breaks down in pathological conditions, and how it can be repaired. I have focused on two research themes: 1) Photoreceptor structure, synapse, circuits, and development, 2) Hibernation and metabolic adaptations in the retina and beyond. As the first neuron of the visual system, photoreceptors are vital for photoreception and transmission of visual signals. I am particularly interested in cone photoreceptors, as they mediate our daylight vision with high resolution color information. Diseases affecting cone photoreceptors compromise visual functions in the central macular area of the human retina and are thus most detrimental to our vision. However, because cones are much less abundant compared to rods in most mammals, they are less well studied. We have used the ground squirrel (GS) as a model system to study cone vision, taking advantage of their unique cone-dominant retina. In particular, we have focused on short-wavelength sensitive cones (S-cones), which are not only essential for color vision, but are also an important origin of signals for biological rhythm, mood and cognitive functions, and the growth of the eye during development. We are studying critical cone synaptic structures – synaptic ribbons, the synaptic connections of S-cones, and the development of S-cones with regard to their specific connections. These studies will provide knowledge of normal retinal development and function, which can also be extended to the rest of the CNS; for example, the mechanisms of synaptic targeting during development. In addition, such knowledge will benefit the development of optimal therapeutic strategies for regeneration and repair in cases of retinal degenerative disease.
Many neurodegenerative diseases, including retinal diseases, are rooted in metabolic stress in neurons and/or glial cells. Using the same GS model, we aim to learn from this hibernating mammal, which possesses an amazing capability to adapt to the extreme metabolic conditions during hibernation. By exploring the mechanisms of such adaptation, we hope to discover novel therapeutic tactics for neurodegenerative diseases.

SeminarNeuroscienceRecording

Dynamic computation in the retina by retuning of neurons and synapses

Leon Lagnado
University of Sussex
Sep 16, 2020

How does a circuit of neurons process sensory information? And how are transformations of neural signals altered by changes in synaptic strength? We investigate these questions in the context of the visual system and the lateral line of fish. A distinguishing feature of our approach is the imaging of activity across populations of synapses – the fundamental elements of signal transfer within all brain circuits. A guiding hypothesis is that the plasticity of neurotransmission plays a major part in controlling the input-output relation of sensory circuits, regulating the tuning and sensitivity of neurons to allow adaptation or sensitization to particular features of the input. Sensory systems continuously adjust their input-output relation according to the recent history of the stimulus. A common alteration is a decrease in the gain of the response to a constant feature of the input, termed adaptation. For instance, in the retina, many of the ganglion cells (RGCs) providing the output produce their strongest responses just after the temporal contrast of the stimulus increases, but the response declines if this input is maintained. The advantage of adaptation is that it prevents saturation of the response to strong stimuli and allows for continued signaling of future increases in stimulus strength. But adaptation comes at a cost: a reduced sensitivity to a future decrease in stimulus strength. The retina compensates for this loss of information through an intriguing strategy: while some RGCs adapt following a strong stimulus, a second population gradually becomes sensitized. We found that the underlying circuit mechanisms involve two opposing forms of synaptic plasticity in bipolar cells: synaptic depression causes adaptation and facilitation causes sensitization. Facilitation is in turn caused by depression in inhibitory synapses providing negative feedback. 
These opposing forms of plasticity can cause simultaneous increases and decreases in the contrast-sensitivity of different RGCs, which suggests a general framework for understanding the function of sensory circuits: plasticity of both excitatory and inhibitory synapses controls dynamic changes in tuning and gain.
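
The depression side of this push-pull arrangement can be sketched with a standard short-term plasticity resource model (a Tsodyks-Markram-style caricature with invented parameters, not the authors' retinal model): a step increase in presynaptic rate evokes a large initial response that declines as vesicle resources deplete, while remaining above the pre-step level.

```python
import numpy as np

def depressing_synapse(rates, dt=0.01, tau_rec=0.5, U=0.5):
    """Transmitted signal through a depressing synapse driven at `rates`.
    x is the fraction of available resources; U is the release fraction."""
    x = 1.0
    out = []
    for r in rates:
        release = U * x * r                          # signal per unit time
        x += dt * ((1.0 - x) / tau_rec - U * x * r)  # deplete and recover
        out.append(release)
    return np.array(out)

# Step increase in presynaptic rate, mimicking a contrast increment.
rates = np.concatenate([np.full(200, 2.0), np.full(400, 20.0)])
resp = depressing_synapse(rates)
peak, adapted = resp[200], resp[-1]   # onset response vs adapted response
```

The response peaks at the step and then settles to an intermediate adapted level, capturing how depression prevents saturation while preserving sensitivity to further increments; a facilitating synapse would show the mirror-image sensitizing profile.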

ePosterNeuroscience

Correcting cortical output: a distributed learning framework for motor adaptation

Leonardo Agueci, N Alex Cayco Gajic

Bernstein Conference 2024

ePosterNeuroscience

Identifying plasticity mechanisms underlying experience-driven adaptation in cortical circuits

Dimitra Maoutsa, Julijana Gjorgjieva

Bernstein Conference 2024

ePosterNeuroscience

Short-term adaptation reshapes retinal ganglion cell selectivity to natural scenes

Baptiste Lorenzi, Samuele Virgili, Déborah Varro, Olivier Marre

Bernstein Conference 2024

ePosterNeuroscience

Synaptic modulation facilitates adaptation in cortical networks

Ivan Bulygin, James Ferguson, Nicoleta Condruz, Tim Vogels

Bernstein Conference 2024

ePosterNeuroscience

Clear evidence in favor of adaptation and against temporally specific predictive suppression in monkey primary auditory cortex

Tobias Teichert

COSYNE 2022

ePosterNeuroscience

Cortical adaptation to sound reverberation

Aleksandar Ivanov, Andrew King, Benjamin Willmore, Kerry Walker, Nicol Harper

COSYNE 2022

ePosterNeuroscience

Gain-mediated statistical adaptation in recurrent neural networks

Lyndon Duong, Colin Bredenberg, David Heeger, Eero Simoncelli

COSYNE 2022

ePosterNeuroscience

Long-term motor learning creates structure within neural space that shapes motor adaptation

Joanna Chang, Matthew Perich, Lee E. Miller, Juan Gallego, Claudia Clopath

COSYNE 2022

ePosterNeuroscience

Neural adaptation in attractor networks implements replay trajectories in the hippocampus

Zilong Ji, Xingsi Dong, Tianhao Chu, Si Wu

COSYNE 2022

ePosterNeuroscience

Sensory feedback can drive adaptation in motor cortex and facilitate generalization

Barbara Feulner, Matthew G. Perich, Lee E. Miller, Claudia Clopath, Juan A. Gallego

COSYNE 2022

ePosterNeuroscience

Top-down optimization recovers biological coding principles of single-neuron adaptation in RNNs

Victor Geadah, Giancarlo Kerg, Stefan Horoi, Guy Wolf, Guillaume Lajoie

COSYNE 2022

ePosterNeuroscience

Context-dependent sensory adaptation in cortical area MT as a substrate of flexible decision-making

Kara McGaughey, Joshua Gold, Nathan Tardiff, Kyra Schapiro, Hannah Lefumat

COSYNE 2023

ePosterNeuroscience

Decreased interictal EEG slowing is consistent with increased multiple timescale neural adaptation

Brian Lundstrom & Thomas Richner

COSYNE 2023

ePosterNeuroscience

Efficient coding explains neural response homeostasis and stimulus-specific adaptation

Edward Young & Yashar Ahmadian

COSYNE 2023

ePosterNeuroscience

Spatiotemporal patterns of adaptation-induced slow oscillations in a whole-brain model of slow-wave sleep

Caglar Cakan, Cristiana Dimulescu, Liliia Khakimova, Daniela Obst, Agnes Flöel, Klaus Obermayer

COSYNE 2023

ePosterNeuroscience

Synaptic low-rank modulation facilitates adaptation in cortical networks

Ivan Bulygin, James Ferguson, Tim Vogels

COSYNE 2023

ePosterNeuroscience

A multi-area RNN model of adaptive motor control explains adaptation-induced reorganization of neural activity

Rui Xia, Guillaume Hennequin

COSYNE 2025

ePosterNeuroscience

Synaptic modulation outperforms somatic modulation for rapid adaptation in cortical nets

Ivan Bulygin, James Ferguson, Nicoleta Condruz, Tim Vogels

COSYNE 2025

ePosterNeuroscience

A universal power law in visual adaptation: balancing representation fidelity and metabolic cost

Matteo Mariani, Amin S. Moosavi, Dario Ringach, Mario Dipoppa

COSYNE 2025

ePosterNeuroscience

Adaptation of rats and humans to a volatile hidden Markov model for reward collection

Maria Ravera, Mathew E. Diamond

FENS Forum 2024

ePosterNeuroscience

Bayesian perceptual adaptation in auditory motion perception: A multimodal approach with EEG and pupillometry

Roman Fleischmann, Burcu Bayram, David Meijer, Roberto Barumerli, Michelle Spierings, Ulrich Pomper, Robert Baumgartner

FENS Forum 2024

ePosterNeuroscience

Cellular and circuit underpinnings of social behaviour adaptations

Myrto Panopoulou, Julia Odermatt, Delia Christ, Peter Scheiffele

FENS Forum 2024

ePosterNeuroscience

Cortico-cerebellar neuronal dynamics during adaptation to movement perturbations

Capucine Gros, Brandon Stell

FENS Forum 2024

ePosterNeuroscience

Decoding cocaine-induced proteomic adaptations in the mouse nucleus accumbens

Lucas Sosnick, Ashik Gurung, Simone Sidoli, Eric J Nestler, Philipp Mews

FENS Forum 2024

ePosterNeuroscience

Decision strategy adaptation is supported by shifts in dopaminergic RPE-mediated contingency representation

Maxime Come, Aylin Gulmez, Loussineh Keshishian, Elise Bousseyrol, Steve Didienne, Philippe Faure

FENS Forum 2024

ePosterNeuroscience

Dynamical adaptation of neuronal activity in the prefrontal cortex depending on different motivations behind a choice

Hugo Malagon-Vina, Dimitrios Mariatos Metaxas, Cristian Estarellas, Claudia Espinoza, Thomas Klausberger

FENS Forum 2024

ePosterNeuroscience

Exploring visual adaptation in vivo: The effect of luminance on receptive field properties

Divyansh Gupta, Maximilian Joesch

FENS Forum 2024

ePosterNeuroscience

Glasses to hear differently? The aftereffects of prism adaptation on auditory threshold in young and older healthy adults

Vincent Ardonceau, Bénédicte Poulin-Charronnat, Carine Michel-Colent

FENS Forum 2024

ePosterNeuroscience

Identifying overlapping spikes in neural activity with unsupervised-subspace domain adaptation

Min-Ki Kim, Jeong-Woo Sohn

FENS Forum 2024

ePosterNeuroscience

Investigating lipid droplet regulation and microglia activation as intrinsic adaptations in brains of the African naked mole rat

Liv Svenningsson Krogstad, Markus A. Teppen, Harald S. Mjønes, Samuel Geiseler, Cecilie Morland

FENS Forum 2024

ePosterNeuroscience

Locomotion modulates visual adaptation in the mouse superior colliculus

Maria Florencia Gonzalez Fleitas, Liad Jacob Baruchin, Sylvia Schröder

FENS Forum 2024

ePosterNeuroscience

A novel role for LSD1 splicing modulation in homeostatic adaptation to chronic stress

Arteda Paplekaj, Chiara Forastieri, Elena Romito, Andrea de Donato, Sara Testa, Emanuela Toffolo, Elena Battaglioli, Francesco Rusconi

FENS Forum 2024

ePosterNeuroscience

Parsing the striatal molecular adaptations in glutamate and dopamine systems in a preclinical model of depression

Marion Violain, Marie-Charlotte Allichon, Vanesa Ortiz, Paula Pousinha, Gwenola Poupon, Stephane Martin, Peter Vanhoutte, Jacques Barik

FENS Forum 2024

ePosterNeuroscience

The role of serotonin in escape responses and learned adaptation to the looming stimulus

Dafna Ljubotina, Laura Burnett, Peter Koppensteiner, Tim P Vogels, Maximilian Jösch

FENS Forum 2024

ePosterNeuroscience

Sensorimotor integration in the zebrafish inferior olive during motor adaptation

Pierce Mullen, Hesho Shaweis, Maarten Zwart

FENS Forum 2024
