
Egocentric

Topic spotlight · World Wide


Discover seminars, jobs, and research tagged with egocentric across World Wide.
13 curated items · 7 Seminars · 6 ePosters
Updated over 2 years ago
Seminar · Neuroscience

Learning through the eyes and ears of a child

Brenden Lake
NYU
Apr 20, 2023

Young children have sophisticated representations of their visual and linguistic environment. Where do these representations come from? How much knowledge arises through generic learning mechanisms applied to sensory data, and how much requires more substantive (possibly innate) inductive biases? We examine these questions by training neural networks solely on longitudinal data collected from a single child (Sullivan et al., 2020), consisting of egocentric video and audio streams. Our principal findings are as follows: 1) Based on visual-only training, neural networks can acquire high-level visual features that are broadly useful across categorization and segmentation tasks. 2) Based on language-only training, networks can acquire meaningful clusters of words and sentence-level syntactic sensitivity. 3) Based on paired visual and language training, networks can acquire word-referent mappings from tens of noisy examples and align their multi-modal conceptual systems. Taken together, our results show how sophisticated visual and linguistic representations can arise through data-driven learning applied to one child’s first-person experience.
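
The abstract describes generic, data-driven learning applied to one child's egocentric video, without specifying the models or losses. As a rough illustration of the kind of self-supervised objective such work often uses, the sketch below trains a small image encoder with a temporal-contrastive loss, treating temporally adjacent frames as positive pairs. The architecture, loss, and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: temporal self-supervised learning on egocentric frames.
# Assumes frames close in time share content, so their embeddings should agree.
# Nothing here reproduces the authors' models; shapes and losses are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FrameEncoder(nn.Module):
    """Tiny CNN mapping 64x64 RGB frames to an embedding (stand-in for a real backbone)."""
    def __init__(self, dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, dim)

    def forward(self, x):
        h = self.conv(x).flatten(1)
        return F.normalize(self.head(h), dim=1)

def temporal_contrastive_loss(z_t, z_tp1, temperature=0.1):
    """InfoNCE-style loss: embeddings of temporally adjacent frames are positives,
    all other frames in the batch serve as negatives."""
    logits = z_t @ z_tp1.T / temperature      # (B, B) similarity matrix
    targets = torch.arange(z_t.size(0))       # positive pair sits on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage with random tensors; real input would be consecutive egocentric frames.
enc = FrameEncoder()
frames_t   = torch.randn(8, 3, 64, 64)
frames_tp1 = torch.randn(8, 3, 64, 64)
loss = temporal_contrastive_loss(enc(frames_t), enc(frames_tp1))
loss.backward()
```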

Seminar · Neuroscience · Recording

Spatial uncertainty provides a unifying account of navigation behavior and grid field deformations

Yul Kang
Lengyel lab, Cambridge University
Apr 5, 2022

To localize ourselves in an environment for spatial navigation, we rely on vision and self-motion inputs, which only provide noisy and partial information. It is unknown how the resulting uncertainty affects navigation behavior and neural representations. Here we show that spatial uncertainty underlies key effects of environmental geometry on navigation behavior and grid field deformations. We develop an ideal observer model, which continually updates probabilistic beliefs about its allocentric location by optimally combining noisy egocentric visual and self-motion inputs via Bayesian filtering. This model directly yields predictions for navigation behavior and also predicts neural responses under population coding of location uncertainty. We simulate this model numerically under manipulations of a major source of uncertainty, environmental geometry, and support our simulations by analytic derivations for its most salient qualitative features. We show that our model correctly predicts a wide range of experimentally observed effects of the environmental geometry and its change on homing response distribution and grid field deformation. Thus, our model provides a unifying, normative account for the dependence of homing behavior and grid fields on environmental geometry, and identifies the unavoidable uncertainty in navigation as a key factor underlying these diverse phenomena.
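
The model described here maintains a probabilistic belief over allocentric location by Bayesian filtering of noisy self-motion and visual inputs. A minimal one-dimensional, linear-Gaussian version of that idea (a Kalman filter) is sketched below; the actual model in the talk is richer and handles environmental geometry, and all noise parameters here are invented for illustration.

```python
# Minimal 1-D sketch of Bayesian filtering for self-localization.
# Belief over position is Gaussian (mu, var); self-motion predicts, vision corrects.
# Parameters are arbitrary illustrations, not values from the presented model.
import numpy as np

def predict(mu, var, velocity, dt, motion_noise_var):
    """Prediction step: integrate noisy self-motion (path integration)."""
    mu = mu + velocity * dt
    var = var + motion_noise_var            # uncertainty grows without vision
    return mu, var

def update(mu, var, visual_obs, visual_noise_var):
    """Correction step: fuse a noisy visual estimate of position."""
    gain = var / (var + visual_noise_var)   # Kalman gain
    mu = mu + gain * (visual_obs - mu)
    var = (1.0 - gain) * var                # uncertainty shrinks after a visual fix
    return mu, var

rng = np.random.default_rng(0)
true_pos, mu, var = 0.0, 0.0, 0.01
for step in range(50):
    v = 0.1                                 # constant velocity, arbitrary units
    true_pos += v * 1.0
    mu, var = predict(mu, var, v, dt=1.0, motion_noise_var=0.02)
    if step % 5 == 0:                       # occasional visual observation
        obs = true_pos + rng.normal(0.0, 0.3)
        mu, var = update(mu, var, obs, visual_noise_var=0.09)
print(f"true={true_pos:.2f}  estimate={mu:.2f}  sd={np.sqrt(var):.2f}")
```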

Seminar · Neuroscience

Using extra-hippocampal cognitive maps for goal-directed spatial navigation

Hiroshi Ito
Max Planck Institute for Brain Research
Jul 6, 2021

Goal-directed navigation requires precise estimates of spatial relationships between the current position and a future goal, as well as planning of an associated route or action. While neurons in the hippocampal formation can represent the animal’s position and nearby trajectories, their role in determining the animal’s destination or action has been questioned. We thus hypothesize that brain regions outside the hippocampal formation may play complementary roles in navigation, particularly for guiding goal-directed behaviours based on the brain’s internal cognitive map. In this seminar, I will first describe a subpopulation of neurons in the retrosplenial cortex (RSC) that increase their firing when the animal approaches environmental boundaries, such as walls or edges. This boundary coding is independent of direct visual or tactile sensation but instead depends on inputs from the medial entorhinal cortex (MEC), which contains spatially tuned cells such as grid cells and border cells. However, unlike MEC border cells, we found that RSC border cells encode environmental boundaries in a self-centred egocentric coordinate frame, which may allow an animal to efficiently avoid approaching walls or edges during navigation. I will then discuss whether the brain can possess a precise estimate of a remote target location during active environmental exploration. Such a spatial code has not been described in the hippocampal formation. However, we found that neurons in the rat orbitofrontal cortex (OFC) form spatial representations that persistently point to the animal’s subsequent goal destination throughout navigation. This destination coding emerges before navigation onset, without direct sensory access to a distal goal, and is maintained via destination-specific neural ensemble dynamics. Together, these findings suggest key roles for extra-hippocampal regions in spatial navigation, enabling animals to choose appropriate actions toward a desired destination while avoiding possible dangers.
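
The egocentric boundary coding described above amounts to a coordinate transform from the animal's allocentric pose to a wall's self-centred distance and bearing. As a toy illustration of that transform (not the authors' analysis), the snippet below converts an allocentric position and head direction into the egocentric vector to the nearest wall of a square arena; the arena size and angle conventions are assumptions.

```python
# Toy conversion from allocentric pose to an egocentric boundary vector.
# Assumes a square arena [0, L] x [0, L]; conventions (angles in radians,
# bearing measured from the heading, positive = left) are illustrative.
import numpy as np

def egocentric_nearest_wall(x, y, heading, arena_size=1.0):
    """Return (distance, egocentric_bearing) of the nearest arena wall."""
    # Distance and allocentric direction to each of the four walls.
    walls = [
        (x,               np.pi),        # west wall lies in allocentric direction pi
        (arena_size - x,  0.0),          # east wall
        (y,              -np.pi / 2),    # south wall
        (arena_size - y,  np.pi / 2),    # north wall
    ]
    dist, allo_dir = min(walls, key=lambda w: w[0])
    # Egocentric bearing = allocentric direction of the wall minus current heading,
    # wrapped to (-pi, pi].
    ego_bearing = (allo_dir - heading + np.pi) % (2 * np.pi) - np.pi
    return dist, ego_bearing

# Animal near the east wall, facing north: the wall appears roughly 90 deg to its right.
d, b = egocentric_nearest_wall(x=0.9, y=0.5, heading=np.pi / 2)
print(f"distance={d:.2f}, bearing={np.degrees(b):.0f} deg")
```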

Seminar · Neuroscience · Recording

Natural switches in sensory attention rapidly modulate hippocampal spatial codes

Ayelet Sarel
Ulanovsky lab, Weizmann Institute of Science
Jun 1, 2021

During natural behavior, animals dynamically switch between different behaviors, yet little is known about how the brain performs such behavioral switches. Navigation is a complex dynamic behavior that enables testing these kinds of behavioral switches: it requires the animal to know its own allocentric (world-centered) location within the environment, while also paying attention to sudden incoming events such as obstacles or other conspecifics – and therefore the animal may need to rapidly switch from representing its own allocentric position to egocentrically representing ‘things out there’. Here we used an ethological task where two bats flew together in a very large environment (130 meters) and had to switch between two behaviors: (i) navigation, and (ii) obstacle avoidance during ‘cross-over’ events with the other bat. Bats increased their echolocation click-rate before a cross-over, indicating spatial attention to the other bat. Hippocampal CA1 neurons represented the bat’s own position when flying alone (allocentric place-coding); surprisingly, when meeting the other bat, neurons switched very rapidly to jointly representing the inter-bat distance × position (egocentric × allocentric coding). This switching to a neuronal representation of the other bat was correlated on a trial-by-trial basis with the attention signal, as indexed by the bat’s echolocation calls – suggesting that sensory attention controls these major switches in neural coding. Interestingly, we found that in place cells, the different place fields of the same neuron could exhibit very different tuning to inter-bat distance – creating a non-separable coding of allocentric position × egocentric distance. Together, our results suggest that attentional switches during navigation – which in bats can be measured directly based on their echolocation signals – elicit rapid dynamics of hippocampal spatial coding. More broadly, this study demonstrates that during natural behavior, when animals often switch between different behaviors, neural circuits can rapidly and flexibly switch their core computations.
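
The non-separable allocentric position × egocentric distance coding described here can be made concrete with a toy tuning model in which each place field of a simulated neuron carries its own inter-bat distance tuning. All field positions, widths, and gains below are invented for illustration and are not fit to the reported data.

```python
# Toy model of non-separable allocentric x egocentric tuning:
# a neuron with two place fields, each modulated differently by the
# egocentric distance to the other bat. All parameters are invented.
import numpy as np

def firing_rate(position, interbat_dist,
                field_centers=(30.0, 90.0),   # place-field centers along the tunnel (m)
                field_width=5.0,
                dist_prefs=(2.0, 10.0),       # preferred inter-bat distance per field (m)
                dist_width=3.0,
                peak_rate=20.0):
    """Rate = sum over fields of (allocentric place tuning) x (field-specific distance tuning)."""
    rate = 0.0
    for center, d_pref in zip(field_centers, dist_prefs):
        place = np.exp(-0.5 * ((position - center) / field_width) ** 2)
        dist = np.exp(-0.5 * ((interbat_dist - d_pref) / dist_width) ** 2)
        rate += peak_rate * place * dist
    return rate

# The same allocentric position yields different rates depending on inter-bat distance,
# and the distance tuning differs between the two fields (a non-separable code).
print(firing_rate(position=30.0, interbat_dist=2.0))    # high: field 1 prefers short distance
print(firing_rate(position=30.0, interbat_dist=10.0))   # low at field 1
print(firing_rate(position=90.0, interbat_dist=10.0))   # high: field 2 prefers long distance
```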

Seminar · Neuroscience

State-dependent egocentric and allocentric heading representation in the monarch butterfly sun compass

Basil El Jundi
University of Wuerzburg
Mar 30, 2021

For spatial orientation, heading information can be processed in two different frames of reference: a self-centered egocentric or a world-centered allocentric frame of reference. Using the most efficient frame of reference is particularly important when an animal migrates over large distances, as is the case for the monarch butterfly (Danaus plexippus). These butterflies employ a sun compass to travel more than 4,000 kilometers to their destination in central Mexico. We developed tetrode recordings from the heading-direction network of tethered flying monarch butterflies that were allowed to orient with respect to a sun stimulus. We show that the neurons switch their frame of reference depending on the animal’s locomotion state. In quiescence, the heading-direction cells encode the sun bearing in an egocentric reference frame, while during active flight, the heading direction is encoded in an allocentric reference frame. By switching to an allocentric frame of reference during flight, monarch butterflies convert the sun into a global compass cue for long-distance navigation, an ideal strategy for maintaining a migratory heading.
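
The state-dependent switch described above amounts to referencing the sun either to the animal's own body axis (an egocentric sun bearing) or to a fixed world direction (an allocentric heading). The short sketch below shows the two read-outs and how they relate; the angle conventions (degrees, 0 = straight ahead or north, clockwise positive) are assumptions for illustration only.

```python
# Minimal sketch of the two reference frames for a sun-compass signal.
# Angles in degrees; conventions (0 = north / straight ahead, clockwise positive)
# are assumptions, not taken from the recordings described in the talk.

def egocentric_sun_bearing(sun_azimuth, heading):
    """Angle of the sun relative to the animal's body axis
    (the quantity encoded in quiescence, in this toy convention)."""
    return (sun_azimuth - heading) % 360.0

def allocentric_heading(sun_azimuth, egocentric_bearing):
    """World-referenced heading recovered from the sun's azimuth and its egocentric
    bearing (the frame used during active flight, in this toy convention)."""
    return (sun_azimuth - egocentric_bearing) % 360.0

# With the sun at azimuth 120 deg and the butterfly heading 200 deg, the sun appears
# 280 deg clockwise from the body axis, and the two read-outs are mutually consistent.
bearing = egocentric_sun_bearing(sun_azimuth=120.0, heading=200.0)
print(bearing)                                                              # 280.0
print(allocentric_heading(sun_azimuth=120.0, egocentric_bearing=bearing))   # 200.0
```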

Seminar · Neuroscience

The Spatial Memory Pipeline: a deep learning model of egocentric to allocentric understanding in mammalian brains

Benigno Uria
DeepMind
Jan 12, 2021
ePoster

Disrupted Egocentric Vector Coding of Environmental Geometry in Alzheimer’s Disease Mouse Model

Yoonsoo Yeo, Jeehyun Kwag

COSYNE 2025

ePoster

PFL1 neurons transform a vector from an allocentric reference frame to an egocentric reference frame

Benjamin Gorko, Sung Soo Kim

COSYNE 2025

ePoster

Retrosplenial Parvalbumin Interneurons Gate the Egocentric Vector Coding of Environmental Geometry

Jiyeon Yang, Jeehyun Kwag

COSYNE 2025

ePoster

Egocentric navigation network plasticity: Training extends functional connectivity of V6 to frontal areas of congenitally blind people

Elena Aggius-Vella, Daniel-Robert Chebat, Shachar Maidenbaum, Amir Amedi

FENS Forum 2024

ePoster

Emergence of different spatial cognitive maps in CA1 for rats performing an episodic memory task using egocentric and allocentric navigational strategies

Elena Faillace, Francesco Gobbo, Rufus Mitchell-Heggs, Adrian J. Duskiewicz, Patrick Spooner, Richard G.M. Morris, Simon R. Schultz

FENS Forum 2024

ePoster

Neural coding of space and goals: Dynamics of egocentric boundary tuning during bait-chasing

Pearl Saldanha, Martin Bjerke, Benjamin Adric Dunn, Jonathan Robert Whitlock

FENS Forum 2024