Visual Inputs

Discover seminars, jobs, and research tagged with visual inputs across World Wide.
10 curated items · 9 Seminars · 1 ePoster
Updated 7 months ago
Seminar · Neuroscience · Recording

Restoring Sight to the Blind: Effects of Structural and Functional Plasticity

Noelle Stiles
Rutgers University
May 21, 2025

Visual restoration after decades of blindness is now becoming possible through retinal and cortical prostheses, as well as emerging stem cell and gene therapy approaches. Once visual perception is restored, however, a key question remains: are there optimal means and methods for retraining the visual cortex to process visual inputs, and for learning or relearning to “see”? Up to this point, it has largely been assumed that if the sensory loss is visual, then the rehabilitation focus should also be primarily visual. However, the other senses play a key role in visual rehabilitation, both because audition and somatosensation plastically repurpose visual cortex during blindness, and because restored vision must be reintegrated with the other senses. I will present multisensory neuroimaging results, cortical thickness changes, and behavioral outcomes for patients with Retinitis Pigmentosa (RP), which causes blindness by destroying photoreceptors in the retina. These patients have had their vision partially restored by the implantation of a retinal prosthesis, which electrically stimulates still-viable retinal ganglion cells in the eye. Our multisensory and structural neuroimaging and behavioral results suggest a new, holistic concept of visual rehabilitation that leverages rather than neglects audition, somatosensation, and other sensory modalities.

Seminar · Neuroscience · Recording

Probabilistic computation in natural vision

Ruben Coen-Cagli
Albert Einstein College of Medicine
Mar 29, 2022

A central goal of vision science is to understand the principles underlying the perception and neural coding of the complex visual environment of our everyday experience. In the visual cortex, foundational work with artificial stimuli, and more recent work combining natural images and deep convolutional neural networks, have revealed much about the tuning of cortical neurons to specific image features. However, a major limitation of this existing work is its focus on single-neuron response strength to isolated images. First, during natural vision, the inputs to cortical neurons are not isolated but rather embedded in a rich spatial and temporal context. Second, the full structure of population activity—including the substantial trial-to-trial variability that is shared among neurons—determines encoded information and, ultimately, perception. In the first part of this talk, I will argue for a normative approach to study encoding of natural images in primary visual cortex (V1), which combines a detailed understanding of the sensory inputs with a theory of how those inputs should be represented. Specifically, we hypothesize that V1 response structure serves to approximate a probabilistic representation optimized to the statistics of natural visual inputs, and that contextual modulation is an integral aspect of achieving this goal. I will present a concrete computational framework that instantiates this hypothesis, and data recorded using multielectrode arrays in macaque V1 to test its predictions. In the second part, I will discuss how we are leveraging this framework to develop deep probabilistic algorithms for natural image and video segmentation.
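The abstract does not spell out the model, but a standard building block in this literature is divisive normalization, in which a neuron's driven response is rescaled by pooled activity from its spatial context; flexible versions of this computation have been linked to probabilistic inference over natural image statistics. The sketch below is a minimal toy illustration of contextual normalization, not the speaker's actual framework; the filter responses, pooling weights, and semi-saturation constant `sigma` are all placeholder assumptions.

```python
import numpy as np

def divisive_normalization(drive, pool_weights, sigma=1.0):
    """Toy contextual modulation: each unit's squared feedforward drive
    is divided by a weighted pool of surrounding activity.

    drive        : (n,) feedforward filter responses (placeholder values)
    pool_weights : (n, n) normalization-pool weights (assumed uniform below)
    sigma        : semi-saturation constant (an assumption, not from the talk)
    """
    pooled = pool_weights @ (drive ** 2)       # contextual energy pool
    return drive ** 2 / (sigma ** 2 + pooled)  # normalized responses

rng = np.random.default_rng(0)
drive = rng.rayleigh(scale=1.0, size=8)        # stand-in V1 filter outputs
weights = np.full((8, 8), 1.0 / 8)             # uniform surround pool
print(divisive_normalization(drive, weights))
```

In sampling-based accounts of probabilistic coding, the trial-to-trial variability mentioned above is itself interpreted as draws from a posterior distribution over image features, which is one way a framework of this kind can link shared population variability to encoded information.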

Seminar · Neuroscience · Recording

Keeping visual cortex in the back of your mind: From visual inputs to behavior and memory

Sharon Gilaie-Dotan
Bar Ilan University
Feb 21, 2022

Seminar · Psychology

Categories, language, and visual working memory: how verbal labels change capacity limitations

Alessandra S. Souza
University of Porto, University of Zurich
Aug 10, 2021

The limited capacity of visual working memory constrains the quantity and quality of the information we can hold in mind for ongoing processing. Research from our lab has demonstrated that verbally labeling or categorizing visual inputs increases their retention and fidelity in visual working memory. In this talk, I will outline the hypotheses that explain the interaction between visual and verbal inputs in working memory, leading to the boosts we observed. I will further show how manipulating the categorical distinctiveness of the labels, the timing of their occurrence, which items the labels are applied to, and their validity modulates the benefits one can draw from combining visual and verbal inputs to alleviate capacity limitations. Finally, I will discuss the implications of these results for our understanding of working memory and its interaction with prior knowledge.

Seminar · Psychology

Visual working memory representations are distorted by their use in perceptual comparisons

Keisuke Fukuda
University of Toronto Mississauga, University of Toronto
Jun 21, 2021

Visual working memory (VWM) allows us to maintain a small amount of task-relevant information in mind so that we can use it to guide our behavior. Although past studies have successfully characterized its capacity limit and representational quality during maintenance, the consequences of using it in task-relevant behaviors have remained largely unknown. In this talk, I will demonstrate that VWM representations become distorted when they are used for perceptual comparisons with new visual inputs, especially when those inputs are subjectively similar to the VWM representations. Furthermore, I will show that this similarity-induced memory bias (SIMB) occurs both for simple stimuli (e.g., color, shape) and for complex stimuli (e.g., real-world objects, faces) that are perceptually encoded and retrieved from long-term memory. Given the observed versatility of the SIMB, I will discuss its implications for other memory distortion phenomena (e.g., distractor-induced distortion, the misinformation effect).
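As a purely illustrative toy model (my sketch, not anything presented in the talk), similarity-induced bias can be caricatured as the remembered feature value being pulled toward a comparison probe, with the pull strongest when probe and memory are similar. The function name, the Gaussian similarity kernel, and the parameter values below are all hypothetical.

```python
import numpy as np

def simb_report(memory_deg, probe_deg, width=30.0, gain=10.0):
    """Hypothetical toy model of similarity-induced memory bias (SIMB):
    the reported feature value (e.g., a color angle) is attracted toward
    a comparison probe, and the attraction falls off with dissimilarity.
    `width` and `gain` are made-up parameters for illustration only.
    """
    # signed circular distance from memory to probe, in degrees
    diff = (probe_deg - memory_deg + 180.0) % 360.0 - 180.0
    attraction = gain * np.exp(-(diff / width) ** 2)  # similarity-weighted pull
    return memory_deg + np.sign(diff) * attraction

print(simb_report(memory_deg=100.0, probe_deg=120.0))  # similar probe: report biased toward it
print(simb_report(memory_deg=100.0, probe_deg=260.0))  # dissimilar probe: almost no bias
```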

Seminar · Neuroscience · Recording

Networks for multi-sensory attention and working memory

Barbara Shinn-Cunningham
Carnegie Mellon University
May 12, 2021

Converging evidence from fMRI and EEG shows that auditory spatial attention engages the same fronto-parietal network associated with visuo-spatial attention. This network is distinct from an auditory-biased processing network that includes other frontal regions; this second network can be recruited when observers extract rhythmic information from visual inputs. We recently used a dual-task paradigm to examine whether this "division of labor" between a visuo-spatial network and an auditory-rhythmic network can be observed in a working memory paradigm. We varied the sensory modality (visual vs. auditory) and information domain (spatial or rhythmic) that observers had to store in working memory, while they also performed an intervening task. Behavioral, pupillometry, and EEG results show a complex interaction across the working memory and intervening tasks, consistent with two cognitive control networks managing auditory and visual inputs based on the kind of information being processed.

Seminar · Neuroscience · Recording

Arousal modulates retinal output

Sylvia Schröder
University of Sussex
Feb 21, 2021

Neural responses in the visual system are usually not purely visual but depend on behavioural and internal states such as arousal. This dependence is seen both in primary visual cortex (V1) and in subcortical brain structures receiving direct retinal input. In this talk, I will show that modulation by behavioural state arises as early as the output of the retina.

To measure retinal activity in the awake, intact brain, we imaged the synaptic boutons of retinal axons in the superficial superior colliculus (sSC) of mice. The activity of about half of the boutons depended not only on vision but also on running speed and pupil size, regardless of retinal illumination. Arousal typically reduced the boutons’ visual responses to their preferred direction and their selectivity for direction and orientation.

Arousal may affect activity in retinal boutons through presynaptic neuromodulation. To test whether the effects of arousal arise already in the retina, we recorded from retinal axons in the optic tract. We found that, in darkness, more than one third of the recorded axons were significantly correlated with running speed. Arousal had similar effects postsynaptically, in sSC neurons, independent of activity in V1, the other main source of visual inputs to the colliculus. Optogenetic inactivation of V1 generally decreased activity in collicular neurons but did not diminish the effects of arousal. These results indicate that arousal modulates activity at every stage of the visual system. In the future, we will study the purpose and the underlying mechanisms of behavioural modulation in the early visual system.

ePoster

Coregistration of heading and visual inputs in retrosplenial cortex

Kevin Sit & Michael Goard

COSYNE 2023