Neural Integrator


Discover seminars, jobs, and research tagged with neural integrator across World Wide.
2 curated items · 2 Seminars
Updated almost 4 years ago
Seminar · Neuroscience · Recording

Integrators in short- and long-term memory

Mark Goldman
UC Davis
Mar 1, 2022

The accumulation and storage of information in memory is a fundamental computation underlying animal behavior. In many brain regions and task paradigms, ranging from motor control to navigation to decision-making, such accumulation is accomplished through neural integrator circuits that enable external inputs to move a system’s population-wide patterns of neural activity along a continuous attractor. In the first portion of the talk, I will discuss our efforts to dissect the circuit mechanisms underlying a neural integrator from a rich array of anatomical, physiological, and perturbation experiments. In the second portion of the talk, I will show how the accumulation and storage of information in long-term memory may also be described by attractor dynamics, but now within the space of synaptic weights rather than neural activity. Altogether, this work suggests a conceptual unification of seemingly distinct short- and long-term memory processes.
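The abstract's picture of inputs moving activity along a continuous attractor can be illustrated with a minimal one-dimensional linear rate model (an illustrative sketch, not the circuit from the talk; the function name and parameter values are invented for this example). With the recurrent weight set exactly to 1, the leak is cancelled, every activity level is a fixed point, and the unit accumulates its input:

```python
import numpy as np

def integrate(u, dt=0.001, tau=0.1):
    """1-D linear rate model: tau * dr/dt = -r + w*r + u(t), with w = 1.

    When w = 1 the leak term is exactly cancelled, so every value of r
    is a fixed point (a line attractor) and r integrates the input u.
    """
    r, trace = 0.0, []
    for u_t in u:
        r += dt / tau * u_t        # the (w - 1) * r term vanishes when w = 1
        trace.append(r)
    return np.array(trace)

# A brief input pulse moves the state along the attractor;
# after the input ends, the stored value persists unchanged.
u = np.zeros(2000)
u[:500] = 1.0                      # 0.5 s pulse, then 1.5 s of silence
r = integrate(u)
```

The stored value after the pulse is held indefinitely, which is the short-term-memory property the talk attributes to integrator circuits.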

Seminar · Neuroscience · Recording

A robust neural integrator based on the interactions of three time scales

Bard Ermentrout
University of Pittsburgh
Nov 10, 2020

Neural integrators are circuits that are able to code analog information such as spatial location or amplitude. Storing amplitude requires the network to have a large number of attractors. In classic models with recurrent excitation, such networks require very careful tuning to behave as integrators and are not robust to small mistuning of the recurrent weights. In this talk, I introduce a circuit with recurrent connectivity that is subjected to a slow subthreshold oscillation (such as the theta rhythm in the hippocampus). I show that such a network can robustly maintain many discrete attracting states. Furthermore, the firing rates of the neurons in these attracting states are much closer to those seen in recordings from animals. I show that this mechanism can be explained by the instability regions of the Mathieu equation. I then extend the model in various ways and, for example, show that in a spatially distributed network, it is possible to code location and amplitude simultaneously. I show that the resulting mean-field equations are equivalent to a certain discontinuous differential equation.
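The fine-tuning problem the abstract refers to can be demonstrated in a minimal linear rate model (an illustrative sketch, not the speaker's model; `simulate_rate` and its parameter values are invented here). With recurrent weight w exactly 1 the stored value persists, while a 1% mistuning gives an effective leak with time constant tau / (1 - w) = 10 s, so the memory drains away:

```python
import numpy as np

def simulate_rate(w, u, dt=0.001, tau=0.1):
    """1-D linear rate model: tau * dr/dt = -r + w*r + u(t)."""
    r, trace = 0.0, []
    for u_t in u:
        r += dt / tau * ((w - 1.0) * r + u_t)
        trace.append(r)
    return np.array(trace)

steps = 20_000                     # 20 s of simulation at dt = 1 ms
u = np.zeros(steps)
u[:500] = 1.0                      # brief input pulse

r_tuned = simulate_rate(w=1.00, u=u)      # perfect integrator
r_mistuned = simulate_rate(w=0.99, u=u)   # only 1% weight error

# The tuned network holds the stored value; the mistuned one leaks it
# away with effective time constant tau / (1 - w) = 10 s.
```

This fragility of the classically tuned integrator is the motivation for the talk's alternative: stabilizing many discrete attracting states through a slow subthreshold oscillation.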