Spatial Attention
Synthetic and natural images unlock the power of recurrency in primary visual cortex
During perception, the visual system integrates current sensory evidence with previously acquired knowledge of the visual world. Presumably this computation relies on internal recurrent interactions. We record populations of neurons from the primary visual cortex of cats and macaque monkeys and find evidence for adaptive internal responses to structured stimulation that change on both slow and fast timescales. In the first experiment, we briefly present abstract images, a protocol known to produce strong and persistent recurrent responses in the primary visual cortex. We show that repetitive presentations of a large randomized set of images lead to enhanced stimulus encoding on a timescale of minutes to hours. The enhanced encoding preserves the representational details required for image reconstruction and can be detected in post-exposure spontaneous activity. In a second experiment, we show that the encoding of natural scenes across populations of V1 neurons is improved, over a timescale of hundreds of milliseconds, with the allocation of spatial attention. Given the hierarchical organization of the visual cortex, contextual information from higher levels of the processing hierarchy, reflecting high-level image regularities, can inform activity in V1 through feedback. We hypothesize that these fast attentional boosts in stimulus encoding rely on recurrent computations that capitalize on the presence of high-level visual features in natural scenes. We design control images dominated by low-level features and show that, in agreement with our hypothesis, the attentional benefits in stimulus encoding vanish. We conclude that, in the visual system, powerful recurrent processes optimize neuronal responses even at the earliest stages of cortical processing.
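As an illustrative aside (not the authors' analysis pipeline): "stimulus encoding" in a neural population is commonly quantified as cross-validated decoding accuracy from trial-by-trial spike counts. The sketch below uses simulated data and hypothetical names to show one way such an encoding improvement could be measured.

```python
# Illustrative sketch (not the authors' pipeline): quantify how well a V1
# population encodes stimulus identity via cross-validated decoding accuracy.
# Assumes spike-count matrices of shape (n_trials, n_neurons) with one
# stimulus label per trial; names like `counts_early` are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def encoding_score(counts, labels, n_folds=5):
    """Mean cross-validated decoding accuracy as a proxy for stimulus encoding."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, counts, labels, cv=n_folds).mean()

rng = np.random.default_rng(0)
n_trials, n_neurons, n_stimuli = 200, 50, 10
labels = rng.integers(0, n_stimuli, size=n_trials)

# Simulated data: "late" responses carry slightly stronger stimulus tuning,
# standing in for encoding enhancement after repeated exposure.
tuning = rng.normal(size=(n_stimuli, n_neurons))
counts_early = rng.poisson(5 + 0.5 * np.clip(tuning[labels], 0, None))
counts_late = rng.poisson(5 + 1.0 * np.clip(tuning[labels], 0, None))

print("early exposure:", encoding_score(counts_early, labels))
print("late exposure: ", encoding_score(counts_late, labels))
```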
The dynamics of temporal attention
Selection is the hallmark of attention: processing improves for attended items but is relatively impaired for unattended items. It is well known that visual spatial attention changes sensory signals and perception in this selective fashion. In the work I will present, we asked whether and how attentional selection happens across time. First, our experiments revealed that voluntary temporal attention (attention to specific points in time) is selective, resulting in perceptual tradeoffs across time. Second, we measured small eye movements called microsaccades and found that directing voluntary temporal attention increases the stability of the eyes in anticipation of an attended stimulus. Third, we developed a computational model of dynamic attention, which proposes specific mechanisms underlying temporal attention and its selectivity. Lastly, I will mention how we are testing predictions of the model with MEG. Altogether, this research shows how precisely timed voluntary attention helps manage inherent limits in visual processing across short time intervals, advancing our understanding of attention as a dynamic process.
Investigating the neural mechanisms of spatial attention biases during sleep onset
Deciding to stop deciding: A cortical-subcortical circuit for forming and terminating a decision
The neurobiology of decision-making is informed by neurons capable of representing information over timescales of seconds. Such neurons were initially characterized in studies of spatial working memory, motor planning (e.g., Richard Andersen lab) and spatial attention. For decision-making, such neurons emit graded spike rates that represent the accumulated evidence for or against a choice. They establish the conduit between the formation of the decision and its completion, usually in the form of a commitment to an action, even if provisional. Indeed, many decisions appear to arise through an accumulation of noisy samples of evidence to a terminating threshold, or bound. Previous studies show that single neurons in the lateral intraparietal area (LIP) represent the accumulation of evidence when monkeys make decisions about the direction of random dot motion (RDM) and express their decision with a saccade to the neuron’s preferred target. The mechanism of termination (the bound) remains elusive. LIP is interconnected with other brain regions that also display decision-related activity. Whether these areas play roles in the decision process that are similar to or fundamentally different from that of LIP is unclear. I will present new unpublished experiments that begin to resolve these issues by recording from populations of neurons simultaneously in LIP and one of its primary targets, the superior colliculus (SC), while monkeys make difficult perceptual decisions.
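The accumulation-to-bound mechanism described above is the standard bounded evidence-accumulation (drift-diffusion) framework. A minimal simulation, with arbitrary illustrative parameters, makes the termination rule concrete: noisy momentary evidence is summed until it reaches an upper or lower bound, which jointly determines the choice and the decision time.

```python
# Minimal sketch of bounded evidence accumulation (drift-diffusion style).
# Parameter values are arbitrary and purely illustrative.
import numpy as np

def accumulate_to_bound(drift, noise_sd=1.0, bound=30.0, dt=1.0, max_steps=10_000, rng=None):
    """Sum noisy evidence samples until an upper/lower bound terminates the decision.

    Returns (choice, decision_time): choice is +1 (upper bound) or -1 (lower bound);
    decision_time is the number of steps taken (None if no bound was reached).
    """
    rng = rng or np.random.default_rng()
    dv = 0.0  # decision variable: running sum of momentary evidence
    for t in range(1, max_steps + 1):
        dv += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        if dv >= bound:
            return +1, t
        if dv <= -bound:
            return -1, t
    return 0, None

rng = np.random.default_rng(1)
# Weak positive drift (e.g., low-coherence rightward motion): choices favor the
# upper bound, and decision times are long and variable.
trials = [accumulate_to_bound(drift=0.1, rng=rng) for _ in range(1000)]
choices, times = zip(*trials)
print("proportion upper-bound choices:", np.mean(np.array(choices) == 1))
print("mean decision time (steps):", np.mean([t for t in times if t is not None]))
```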
Natural switches in sensory attention rapidly modulate hippocampal spatial codes
During natural behavior, animals dynamically switch between different behaviors, yet little is known about how the brain performs such behavioral switches. Navigation is a complex dynamic behavior that enables testing these kinds of switches: it requires the animal to know its own allocentric (world-centered) location within the environment while also paying attention to sudden incoming events such as obstacles or other conspecifics – and therefore the animal may need to rapidly switch from representing its own allocentric position to egocentrically representing ‘things out there’. Here we used an ethological task in which two bats flew together in a very large environment (130 meters) and had to switch between two behaviors: (i) navigation, and (ii) obstacle avoidance during ‘cross-over’ events with the other bat. Bats increased their echolocation click rate before a cross-over, indicating spatial attention to the other bat. Hippocampal CA1 neurons represented the bat’s own position when flying alone (allocentric place coding); surprisingly, when meeting the other bat, neurons switched very rapidly to jointly representing the inter-bat distance × position (egocentric × allocentric coding). This switch to a neuronal representation of the other bat was correlated on a trial-by-trial basis with the attention signal, as indexed by the bat’s echolocation calls – suggesting that sensory attention controls these major switches in neural coding. Interestingly, we found that in place cells, the different place fields of the same neuron could exhibit very different tuning to inter-bat distance – creating a non-separable coding of allocentric position × egocentric distance. Together, our results suggest that attentional switches during navigation – which in bats can be measured directly from their echolocation signals – elicit rapid dynamics of hippocampal spatial coding. More broadly, this study demonstrates that during natural behavior, when animals often switch between different behaviors, neural circuits can rapidly and flexibly switch their core computations.
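The notion of "non-separable" coding has a precise meaning: a joint tuning map over allocentric position and inter-bat distance is separable if it factorizes into a product of a position profile and a distance profile (a rank-1 map). The purely illustrative sketch below, using simulated tuning maps rather than the study's data, quantifies separability with an SVD.

```python
# Illustrative sketch: test whether a joint tuning map over allocentric
# position x inter-bat distance is separable (rank-1) or non-separable.
# Simulated maps only; not the authors' analysis.
import numpy as np

def separability_index(tuning_map):
    """Fraction of variance captured by the best rank-1 (separable) approximation."""
    s = np.linalg.svd(tuning_map, compute_uv=False)
    return s[0] ** 2 / np.sum(s ** 2)

pos = np.linspace(0, 130, 60)      # allocentric position along the flight path (m)
dist = np.linspace(0, 40, 30)      # inter-bat distance (m)
pos_profile = np.exp(-(pos - 50) ** 2 / (2 * 10 ** 2))
dist_profile = np.exp(-(dist - 5) ** 2 / (2 * 3 ** 2))

# Separable cell: firing = f(position) * g(distance).
separable_map = np.outer(pos_profile, dist_profile)

# Non-separable cell: a second place field with different distance tuning.
field2 = np.exp(-(pos - 100) ** 2 / (2 * 10 ** 2))
dist_profile2 = np.exp(-(dist - 20) ** 2 / (2 * 3 ** 2))
nonseparable_map = np.outer(pos_profile, dist_profile) + np.outer(field2, dist_profile2)

print("separable cell index:    ", separability_index(separable_map))      # ~1.0
print("non-separable cell index:", separability_index(nonseparable_map))   # < 1.0
```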
Networks for multi-sensory attention and working memory
Converging evidence from fMRI and EEG shows that auditory spatial attention engages the same fronto-parietal network associated with visuo-spatial attention. This network is distinct from an auditory-biased processing network that includes other frontal regions; this second network can be recruited when observers extract rhythmic information from visual inputs. We recently used a dual-task paradigm to examine whether this "division of labor" between a visuo-spatial network and an auditory-rhythmic network can be observed in a working memory paradigm. We varied the sensory modality (visual vs. auditory) and information domain (spatial vs. rhythmic) that observers had to store in working memory, while they also performed an intervening task. Behavioral, pupillometry, and EEG results show a complex interaction across the working memory and intervening tasks, consistent with two cognitive control networks managing auditory and visual inputs based on the kind of information being processed.
The When, Where and What of visual memory formation
The eyes send a continuous stream of information to the brain along roughly two million nerve fibers, but only a fraction of this information is stored as visual memories. This talk will detail three neurocomputational models that attempt to explain how the visual system makes on-the-fly decisions about how to encode that information. First, the STST family of models (Bowman & Wyble 2007; Wyble, Potter, Bowman & Nieuwenstein 2011) proposes mechanisms for temporal segmentation of continuous input. The conclusion of this work is that the visual system has mechanisms for rapidly creating brief episodes of attention that highlight important moments in time, and that it also separates each episode from temporally adjacent neighbors to benefit learning. Next, the RAGNAROC model (Wyble et al. 2019) describes a decision process for determining the spatial focus (or foci) of attention in a spatiotopic field and the neural mechanisms that provide enhancement of targets and suppression of highly distracting information. This work highlights the importance of integrating behavioral and electrophysiological data to provide empirical constraints on a neurally plausible model of spatial attention. The model also highlights how a neural circuit can make decisions in a continuous space, rather than among discrete alternatives. Finally, the binding pool (Swan & Wyble 2014; Hedayati, O’Donnell, & Wyble, in prep) provides a mechanism for selectively encoding specific attributes (e.g., color, shape, category) of a visual object to be stored in a consolidated memory representation. The binding pool is akin to a holographic memory system that superimposes selected latent representations corresponding to different attributes of a given object. Moreover, it can bind features into distinct objects by linking them to token placeholders. Future work looks toward combining these models into a coherent framework for understanding the full measure of on-the-fly attentional mechanisms and how they improve learning.
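To make the binding-pool idea concrete, here is a minimal, hypothetical sketch of the general principle only (a shared pool of units with fixed random connectivity binds feature representations to token placeholders, and features are recovered by cueing a token). It is not a reimplementation of the published model, and all names and parameter values are invented.

```python
# Hypothetical binding-pool-style sketch (inspired by the general idea, not a
# reimplementation of Swan & Wyble 2014): feature vectors are superimposed in a
# shared pool, gated by the random subset of pool units assigned to each token.
import numpy as np

rng = np.random.default_rng(0)
POOL_SIZE, FEATURE_DIM, N_TOKENS = 2000, 64, 4

# Fixed random connectivity: each token gates ~20% of the pool, and one fixed
# random projection maps feature vectors into pool space.
token_gates = rng.random((N_TOKENS, POOL_SIZE)) < 0.2
projection = rng.normal(size=(FEATURE_DIM, POOL_SIZE)) / np.sqrt(POOL_SIZE)

# A small dictionary of known feature vectors (e.g., colors), one per label.
feature_dict = {name: rng.normal(size=FEATURE_DIM) for name in ["red", "green", "blue"]}

pool = np.zeros(POOL_SIZE)

def encode(token, feature_vec):
    """Superimpose a feature into the pool units gated by this object's token."""
    pool[token_gates[token]] += (feature_vec @ projection)[token_gates[token]]

def retrieve(token):
    """Cue a token, read back its gated pool units, and report the closest feature."""
    readout = np.where(token_gates[token], pool, 0.0) @ projection.T
    return max(feature_dict, key=lambda name: feature_dict[name] @ readout)

encode(0, feature_dict["red"])    # object/token 0 is bound to "red"
encode(1, feature_dict["blue"])   # object/token 1 is bound to "blue"
print(retrieve(0), retrieve(1))   # expected: red blue
```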
A common neural mechanism mediates microsaccades and covert spatial attention
COSYNE 2023
The posterior parietal cortex mediates serial dependence during visuospatial attention
COSYNE 2025
Electrical stimulation over the parietal cortex induces spatial bias by mediating the influence of visuospatial attention on the temporal dynamics of visuocortical processing
FENS Forum 2024
Can Wii modulate pseudoneglect? Improving visuospatial attention in healthy subjects by active video gaming
FENS Forum 2024