
Information Flow

Topic spotlight · Neuro

Discover seminars, jobs, and research tagged with information flow across Neuro.
13 curated items · 13 seminars
Updated 11 months ago

Latest

13 results
Seminar · Neuroscience

Analyzing Network-Level Brain Processing and Plasticity Using Molecular Neuroimaging

Alan Jasanoff
Massachusetts Institute of Technology
Jan 28, 2025

Behavior and cognition depend on the integrated action of neural structures and populations distributed throughout the brain. We recently developed a set of molecular imaging tools that enable multiregional processing and plasticity in neural networks to be studied at a brain-wide scale in rodents and nonhuman primates. Here we will describe how a novel genetically encoded activity reporter enables information flow in virally labeled neural circuitry to be monitored by fMRI. Using the reporter to perform functional imaging of synaptically defined neural populations in the rat somatosensory system, we show how activity is transformed within brain regions to yield characteristics specific to distinct output projections. We also show how this approach enables regional activity to be modeled in terms of inputs, in a paradigm that we are extending to address circuit-level origins of functional specialization in marmoset brains. In the second part of the talk, we will discuss how another genetic tool for MRI enables systematic studies of the relationship between anatomical and functional connectivity in the mouse brain. We show that variations in physical and functional connectivity can be dissociated both across individual subjects and over experience. We also use the tool to examine brain-wide relationships between plasticity and activity during an opioid treatment. This work demonstrates the possibility of studying diverse brain-wide processing phenomena using molecular neuroimaging.

Seminar · Neuroscience

The Neural Race Reduction: Dynamics of nonlinear representation learning in deep architectures

Andrew Saxe
UCL
Apr 14, 2023

What is the relationship between task, network architecture, and population activity in nonlinear deep networks? I will describe the Gated Deep Linear Network framework, which schematizes how pathways of information flow impact learning dynamics within an architecture. Because of the gating, these networks can compute nonlinear functions of their input. We derive an exact reduction and, for certain cases, exact solutions to the dynamics of learning. The reduction takes the form of a neural race with an implicit bias towards shared representations, which then govern the model’s ability to systematically generalize, multi-task, and transfer. We show how appropriate network architectures can help factorize and abstract knowledge. Together, these results begin to shed light on the links between architecture, learning dynamics and network performance.
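The gating idea can be illustrated with a toy example (a minimal sketch, not the exact Gated Deep Linear Network framework from the talk; all weights here are hypothetical): each pathway is a product of linear maps, and input-dependent gates select which pathways are active, so the composite map is nonlinear even though every active pathway is linear.

```python
import numpy as np

W1 = np.array([[1.0], [-1.0]])   # two linear pathways from a scalar input
W2 = np.array([[1.0, 1.0]])      # linear readout summing the pathways

def gated_forward(x):
    h = W1 @ x
    g = (h > 0).astype(float)    # input-dependent gate opens active pathways
    return W2 @ (g * h)          # each open pathway is purely linear

x, y = np.array([1.0]), np.array([-1.0])
out = gated_forward(x + y)                  # -> [0.]
sup = gated_forward(x) + gated_forward(y)   # -> [2.]
# Additivity fails (out != sup), so the gated network is nonlinear overall.
```

Because the gates switch with the input, the network computes a piecewise-linear function, which is what makes an exact reduction of the learning dynamics tractable pathway by pathway.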

Seminar · Neuroscience

From spikes to factors: understanding large-scale neural computations

Mark M. Churchland
Columbia University, New York, USA
Apr 6, 2023

It is widely accepted that human cognition is the product of spiking neurons. Yet even for basic cognitive functions, such as the ability to make decisions or prepare and execute a voluntary movement, the gap between spikes and computation is vast. Only for very simple circuits and reflexes can one explain computations neuron-by-neuron and spike-by-spike. This approach becomes infeasible when neurons are numerous and the flow of information is recurrent. To understand computation, one thus requires appropriate abstractions. An increasingly common abstraction is the neural ‘factor’. Factors are central to many explanations in systems neuroscience. Factors provide a framework for describing computational mechanism and offer a bridge between data and concrete models. Yet there remains some discomfort with this abstraction, and with any attempt to provide mechanistic explanations above the level of spikes, neurons, cell types, and other comfortingly concrete entities. I will explain why, for many networks of spiking neurons, factors are not only a well-defined abstraction but are critical to understanding computation mechanistically. Indeed, factors are as real as other abstractions we now accept: pressure, temperature, conductance, and even the action potential itself. I will use recent empirical results to illustrate how factor-based descriptions have become essential to forming and testing scientific hypotheses. I will also show how embracing factor-level descriptions affords remarkable power when decoding neural activity for neural engineering purposes.

Seminar · Neuroscience

Altered dynamic information flow through the cortico-basal ganglia pathways is responsible for Parkinson’s disease symptoms

Satomi Chiken
Mar 10, 2023
Seminar · Neuroscience · Recording

NMC4 Short Talk: The complete connectome of an insect brain

Michael Winding (he/him)
University of Cambridge
Dec 2, 2021

Brains must integrate complex sensory information and compare it to past events to generate appropriate behavioral responses. The neural circuit basis of these computations is unclear and the underlying structure unknown. Here, we mapped the comprehensive synaptic wiring diagram of the fruit fly larva brain, which contains 3,013 neurons and 544K synaptic sites. It is the most complete insect connectome to date: 1) Both brain hemispheres are reconstructed, allowing investigation of neural pathways that include contralateral axons, which we found in 37% of brain neurons. 2) All sensory neurons and descending neurons are reconstructed, allowing one to follow signals in an uninterrupted chain from the sensory periphery, through the brain, to motor neurons in the nerve cord. We developed novel computational tools that allow us to cluster the brain and investigate how information flows through it. We discovered that feedforward pathways from sensory to descending neurons are multilayered and highly multimodal. Robust feedback was observed at almost all levels of the brain, including descending neurons. We investigated how the brain hemispheres communicate with each other and with the nerve cord, leading to the identification of novel circuit motifs. This work provides the complete blueprint of a brain and a strong foundation to study the structure-function relationship of neural circuits.

Seminar · Neuroscience · Recording

Tuning dumb neurons to task processing - via homeostasis

Viola Priesemann
Max Planck Institute for Dynamics and Self-organization
Oct 8, 2021

Homeostatic plasticity plays a key role in stabilizing neural network activity. But what is its role in neural information processing? We showed analytically how homeostasis changes collective dynamics and consequently information flow, depending on the input to the network. We then studied how input and homeostasis in a recurrent network of LIF neurons impact information flow and task performance. We showed how we can tune the working point of the network, and found that, contrary to previous assumptions, there is not one optimal working point for a family of tasks; rather, each task may require its own working point.
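The core homeostatic mechanism can be sketched in a few lines (a hypothetical toy model with made-up parameters, not the LIF network from the talk): an adaptive threshold rises or falls until the unit's firing rate matches a target, which is one way homeostasis sets a network's working point regardless of the mean input drive.

```python
import numpy as np

# Toy rate homeostasis: a single thresholded unit receives noisy drive,
# and its threshold adapts so the average firing rate converges to a target.
rng = np.random.default_rng(2)
target_rate = 0.2            # desired fraction of active time steps
theta = 1.0                  # adaptive firing threshold
eta = 0.01                   # homeostatic learning rate

rates = []
for step in range(5000):
    drive = rng.normal(loc=1.0, scale=0.5)   # noisy external input
    spike = float(drive > theta)             # fire if drive exceeds threshold
    theta += eta * (spike - target_rate)     # homeostatic threshold update
    rates.append(spike)

print(np.mean(rates[-1000:]))   # settles near target_rate
```

The same update drives the rate back to the target whatever the input statistics are, which is exactly why, as the abstract notes, the resulting working point depends on the input and need not be optimal for every task.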

Seminar · Neuroscience · Recording

Mechanisms of cortical communication during decision-making

Arseny Finkelstein
Svoboda lab, Janelia HHMI
Mar 3, 2021

Regulation of information flow in the brain is critical for many forms of behavior. In sensory-based decision-making, decisions about future actions are held in memory until enacted, making them potentially vulnerable to distracting sensory input. Gating of information flow from sensory to motor areas could therefore protect memory from interference during decision-making, but the underlying network mechanisms are not understood. I will present our recent experimental and modeling work describing how information flow from the sensory cortex can be gated by state-dependent frontal cortex dynamics during decision-making in mice. Our results show that communication between brain regions can be regulated via attractor dynamics, which control the degree of commitment to an action, and reveal a novel mechanism of gating of neural information.

Seminar · Neuroscience

Leveraging olfaction to understand how the brain and the body generate social behavior

Lisa Stowers
Scripps Research Institute
Nov 30, 2020

Courtship behavior is an innate model for many types of brain computations, including sensory detection, learning and memory, and internal-state modulation. Despite the robustness of the behavior, we have little understanding of the underlying neural circuits and mechanisms. The Stowers lab is leveraging the ability of specialized olfactory cues, pheromones, to specifically activate and therefore identify and study courtship circuits in the mouse. We are interested in identifying general circuit principles (specific brain nodes and information flow) that are common to all individuals, in order to additionally study how experience, gender, age, and internal state modulate and personalize behavior. We are solving two parallel sensory-to-motor courtship circuits, which promote social vocal calling and scent marking, to study information processing of behavior as a complete unit instead of restricting focus to a single brain region. We expect that comparing and contrasting the coding logic of two courtship motor behaviors will begin to shed light on general principles of how the brain senses context, weighs experience, and responds to internal state to ultimately decide appropriate action.

Seminar · Neuroscience · Recording

Using noise to probe recurrent neural network structure and prune synapses

Rishidev Chaudhuri
University of California, Davis
Sep 25, 2020

Many networks in the brain are sparsely connected, and the brain eliminates synapses during development and learning. How could the brain decide which synapses to prune? In a recurrent network, determining the importance of a synapse between two neurons is a difficult computational problem, depending on the role that both neurons play and on all possible pathways of information flow between them. Noise is ubiquitous in neural systems, and is often considered an irritant to be overcome. In the first part of this talk, I will suggest that noise could play a functional role in synaptic pruning, allowing the brain to probe network structure and determine which synapses are redundant. I will introduce a simple, local, unsupervised plasticity rule that either strengthens or prunes synapses using only the synaptic weight and the noise-driven covariance of the neighboring neurons. For a subset of linear and rectified-linear networks, this rule provably preserves the spectrum of the original matrix and hence preserves network dynamics even when the fraction of pruned synapses asymptotically approaches 1. The plasticity rule is biologically plausible and may suggest a new role for noise in neural computation. Time permitting, I will then turn to the problem of extracting structure from neural population data sets using dimensionality reduction methods. I will argue that nonlinear structures naturally arise in neural data and show how these nonlinearities cause linear methods of dimensionality reduction, such as Principal Components Analysis, to fail dramatically in identifying low-dimensional structure.
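A rough sketch of the noise-probing idea (with a hypothetical local score, not the exact rule from the talk): drive a stable linear recurrent network with white noise, estimate the activity covariance, and score each synapse by the product of its weight magnitude and the covariance of the two neurons it connects; low-scoring synapses become pruning candidates.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps, dt = 20, 20000, 0.1
W = rng.normal(scale=0.1, size=(n, n))   # random recurrent weights
np.fill_diagonal(W, 0.0)                 # no self-connections

# Simulate leaky linear dynamics driven by white noise.
x = np.zeros(n)
samples = np.empty((steps, n))
for t in range(steps):
    noise = rng.normal(scale=np.sqrt(dt), size=n)
    x = x + dt * (-x + W @ x) + noise
    samples[t] = x

# Local synapse score: |weight| times |covariance of the two endpoints|.
C = np.cov(samples.T)
score = np.abs(W) * np.abs(C)

# Prune the lowest-scoring 30% of existing synapses.
mask = score > np.quantile(score[W != 0], 0.3)
W_pruned = W * mask
```

The score uses only quantities locally available at a synapse (its weight and the noise-driven covariance of its pre- and post-synaptic neurons), which is the sense in which the rule in the abstract is local and unsupervised.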

Seminar · Neuroscience

Information and Decision-Making

Daniel Polani
University of Hertfordshire
Jul 20, 2020

In recent years it has become increasingly clear that (Shannon) information is a central resource for organisms, akin in importance to energy. Any decision that an organism or a subsystem of an organism takes involves the acquisition, selection, and processing of information, and ultimately its concentration and enaction. It is the consequences of this balance that will occupy us in this talk. This perception-action loop picture of an agent's life cycle is well established and expounded especially in the context of Fuster's sensorimotor hierarchies. Nevertheless, the information-theoretic perspective drastically expands the potential and predictive power of the perception-action loop perspective. On the one hand, information can be treated, to a significant extent, as a resource that is sought and utilized by an organism. On the other hand, unlike energy, information is not additive. The intrinsic structure and dynamics of information can be exceedingly complex and subtle; over the last two decades it has become clear that Shannon information possesses a rich and nontrivial intrinsic structure that must be taken into account when informational contributions, information flow, or causal interactions of processes are investigated, whether in the brain or in other complex processes. In addition, strong parallels between information theory and control theory have emerged. This parallelism allows one to obtain unexpected insights into the nature and properties of the perception-action loop. Through the lens of information theory, one can not only come up with novel hypotheses about necessary conditions for the organization of information processing in a brain, but also with constructive conjectures and predictions about what behaviours, brain structure and dynamics, and even evolutionary pressures one can expect to operate on biological organisms, induced purely by informational considerations.
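The non-additivity of Shannon information has a classic minimal illustration, the XOR gate: with two independent fair bits and Y = X1 XOR X2, each input alone carries zero information about the output, yet the two together determine it completely. A short self-contained sketch (illustrative only, not from the talk):

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = np.asarray([q for q in p if q > 0])
    return -np.sum(p * np.log2(p))

# Joint distribution over (x1, x2, y) with y = x1 XOR x2 and uniform inputs.
joint = {}
for x1, x2 in product([0, 1], repeat=2):
    joint[(x1, x2, x1 ^ x2)] = 0.25

def marginal(keep):
    """Marginalize the joint distribution onto the given variable indices."""
    m = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in keep)
        m[key] = m.get(key, 0.0) + p
    return m

def mi(a_idx, b_idx):
    """Mutual information I(A; B) = H(A) + H(B) - H(A, B)."""
    Ha = entropy(marginal(a_idx).values())
    Hb = entropy(marginal(b_idx).values())
    Hab = entropy(marginal(a_idx + b_idx).values())
    return Ha + Hb - Hab

print(mi((0,), (2,)))     # I(X1; Y)     -> 0.0
print(mi((1,), (2,)))     # I(X2; Y)     -> 0.0
print(mi((0, 1), (2,)))   # I(X1,X2; Y)  -> 1.0
```

The individual contributions sum to zero bits while the joint contribution is a full bit, so informational quantities cannot simply be added across sources; this synergy is one example of the nontrivial intrinsic structure the abstract refers to.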

Seminar · Neuroscience

Hippocampal disinhibitory circuits: cell types, connectivity and function

Lisa Topolnik
Université Laval
Jun 25, 2020

The concept of a dynamic excitation/inhibition ratio that can shape information flow in cortical circuits during complex behavioural tasks through circuit disinhibition has recently arisen as an important and conserved processing motif. It has also been recognized that, in cortical circuits, a subpopulation of GABAergic cells that express vasoactive intestinal polypeptide (VIP) selectively innervates inhibitory interneurons, providing for circuit disinhibition as a possible outcome, depending on the network state and behavioural context. In this talk, I will highlight the latest discoveries on the dynamic organization of hippocampal disinhibitory circuits, with a focus on VIP-expressing interneurons. I will discuss the neuron types that can be involved in disinhibition and their local-circuit and long-range synaptic connections. I will also discuss some recent findings on how hippocampal VIP circuits may coordinate spatial learning.

Seminar · Neuroscience · Recording

The active modulation of sound and vibration perception

Natasha Mhatre
University of Western Ontario
Jun 17, 2020

The dominant view of perception right now is that information travels from the environment to the sensory system, then to the nervous system, which processes it to generate a percept and behaviour. Ongoing behaviour is thought to occur largely through simple iterations of this process. However, this linear view, where information flows only in one direction and the properties of the environment and the sensory system remain static and unaffected by behaviour, is slowly fading. Many of us are beginning to appreciate that perception is largely active, i.e. that information flows back and forth between the three systems, modulating their respective properties. In other words, in the real world, the environment and sensorimotor loop is pretty much always closed. I study this loop; in particular, I study how the reverse arm of the loop affects sound and vibration perception. I will present two examples of motor modulation of perception at two very different temporal and spatial scales. First, in crickets, I will present data on how high-speed molecular motor activity enhances hearing via the well-studied phenomenon of active amplification. Second, in spiders, I will present data on how body posture, a slow macroscopic feature that can barely be called ‘active’, can nonetheless modulate vibration perception. I hope these results will motivate a conversation about whether ‘active’ perception is an optional feature observed in some sensory systems, or something that is ultimately necessitated by both evolution and physics.

Domain spotlight

Explore how information flow research is advancing inside Neuro.
