
Visual Behaviour

Topic spotlight · World Wide

Discover seminars, jobs, and research tagged with visual behaviour across World Wide.
6 curated items · 5 seminars · 1 ePoster
Updated almost 2 years ago
Seminar · Neuroscience · Recording

Incorporating visual evidence and counter-evidence to estimate self-movement

Damon Clark
Yale University
Jan 21, 2024
Seminar · Neuroscience · Recording

Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception

Ashley Schormans
BrainsCAN, Western University, Canada.
Sep 26, 2023

To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive the stimuli as a single event, even though they occurred separately. In recent years, our lab, and others, have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models demonstrate metrics that are consistent with humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in their perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model to study the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
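
The ~100 ms binding window and the SJ/TOJ tasks mentioned above are typically summarised with a psychometric fit. Below is a minimal, illustrative sketch (not from the talk, using entirely synthetic data) of one common approach: modelling the probability of a "simultaneous" report as a Gaussian over stimulus-onset asynchrony (SOA), whose peak estimates the point of subjective simultaneity (PSS) and whose width approximates the temporal binding window.

```python
# Illustrative synchrony-judgment (SJ) fit on synthetic data; not the lab's analysis.
import numpy as np
from scipy.optimize import curve_fit

def sj_curve(soa, amplitude, pss, width):
    """Gaussian model of P('simultaneous') as a function of SOA (ms)."""
    return amplitude * np.exp(-((soa - pss) ** 2) / (2 * width ** 2))

# Hypothetical SOAs (negative = auditory first) and response proportions
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_simultaneous = np.array([0.05, 0.15, 0.55, 0.80, 0.90, 0.85, 0.60, 0.20, 0.08])

params, _ = curve_fit(sj_curve, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amplitude, pss, width = params
print(f"PSS ~ {pss:.1f} ms, binding-window width (SD) ~ {width:.1f} ms")
```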

Seminar · Psychology

Characterising the brain representations behind variations in real-world visual behaviour

Simon Faghel-Soubeyrand
Université de Montréal
Aug 4, 2021

Not all individuals are equally competent at recognizing the faces they interact with. Revealing how the brains of different individuals support variations in this ability is a crucial step towards understanding real-world human visual behaviour. In this talk, I will present findings from a large high-density EEG dataset (>100k trials of participants processing various stimulus categories) and computational approaches that aimed to characterise the brain representations behind the real-world proficiency of “super-recognizers”, individuals at the top of the face recognition ability spectrum. Using decoding analyses of time-resolved EEG patterns, we predicted with high precision the trial-by-trial activity of super-recognizer participants, and showed that evidence for variations in face recognition ability is disseminated across early, intermediate and late brain processing steps. Computational modeling of the underlying brain activity uncovered two representational signatures supporting higher face recognition ability: i) mid-level visual and ii) semantic computations. The two components were dissociable in brain processing time (the first around the N170, the second around the P600) and in level of computation (the first emerging from mid-level layers of visual convolutional neural networks, the second from a semantic model characterising sentence descriptions of images). I will conclude by presenting ongoing analyses of a well-known case of acquired prosopagnosia (PS) using similar computational modeling of high-density EEG activity.
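
As a rough illustration of the time-resolved decoding idea described above, the sketch below trains a separate linear classifier at each time point to predict the stimulus category from the multichannel EEG pattern, yielding a decoding time course. This is not the authors' pipeline; the data, shapes and labels are placeholders.

```python
# Time-resolved decoding sketch on random placeholder data of shape
# (trials, channels, time points); real analyses would use epoched EEG.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 120
eeg = rng.normal(size=(n_trials, n_channels, n_times))   # placeholder EEG
labels = rng.integers(0, 2, size=n_trials)               # e.g. face vs. non-face

accuracy = np.empty(n_times)
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    # 5-fold cross-validated accuracy from the channel pattern at time t
    accuracy[t] = cross_val_score(clf, eeg[:, :, t], labels, cv=5).mean()

print("peak decoding accuracy:", accuracy.max())
```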

Seminar · Neuroscience · Recording

Evolution of vision - The regular route and shortcuts

Dan Nilsson
University of Lund
Jun 27, 2021

Eyes abound in the animal kingdom. Some are as large as basketballs and others are just fractions of a millimetre. Eyes also come in many different types, such as the compound eyes of insects, the mirror eyes of scallops or our own camera-like eyes. Common to all animal eyes is that they serve the same fundamental role of collecting external information for guiding the animal’s behaviour. But behaviours vary tremendously across the animal kingdom, and it turns out this is the key to understanding how eyes evolved. The lecture will take a tour from the first animals that could only sense the presence of light, to those that saw the first crude image of the world, and finally to animals that use acute vision for interacting with other animals. Amazingly, all these stages of eye evolution still exist in animals living today, and this is how we can unravel the evolution of the behaviours that have been the driving force behind eye evolution.

Seminar · Neuroscience · Recording

Vision in dynamically changing environments

Marion Silies
Johannes Gutenberg-Universität Mainz, Germany
May 17, 2020

Many visual systems can process information in dynamically changing environments. In general, visual perception scales with changes in the visual stimulus, or contrast, irrespective of background illumination. This is achieved by adaptation. However, visual perception is challenged when adaptation is not fast enough to deal with sudden changes in overall illumination, for example when gaze follows a moving object from bright sunlight into a shaded area. We have recently shown that the visual system of the fly has found a solution: it propagates a corrective luminance-sensitive signal to higher processing stages. Using in vivo two-photon imaging and behavioural analyses, we showed that distinct OFF-pathway inputs encode contrast and luminance. The luminance-sensitive pathway is particularly required when processing visual motion in dim-light contexts, when pure contrast sensitivity underestimates the salience of a stimulus. Recent work in the lab has addressed the question of how two visual pathways obtain such fundamentally different sensitivities given common photoreceptor input. We are furthermore currently working out the network-based strategies by which luminance- and contrast-sensitive signals are combined to guide appropriate visual behaviour. Together, I will discuss the molecular, cellular, and circuit mechanisms that ensure contrast computation, and therefore robust vision, in fast-changing visual scenes.
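
To make the contrast-versus-luminance distinction concrete, here is a small, purely illustrative calculation (not a model of the fly circuit discussed in the talk): Weber contrast is invariant to overall illumination, so a contrast-only code responds identically to the same relative change in bright and dim light, whereas a luminance-sensitive term distinguishes the two conditions.

```python
# Illustrative arithmetic only; luminance values are arbitrary units.
import numpy as np

def weber_contrast(stimulus, background):
    """Relative change of the stimulus against its background luminance."""
    return (stimulus - background) / background

bright_background, dim_background = 100.0, 1.0
bright_stimulus = bright_background * 0.5   # 50% darkening in bright light
dim_stimulus = dim_background * 0.5         # same relative darkening in dim light

print(weber_contrast(bright_stimulus, bright_background))  # -0.5
print(weber_contrast(dim_stimulus, dim_background))        # -0.5 (identical)

# A luminance-sensitive signal (here simply log background luminance) can be
# combined with contrast to flag that the two conditions are not equivalent.
print(np.log10(bright_background), np.log10(dim_background))  # 2.0 vs 0.0
```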

ePoster

Comparing the effects of optogenetic and electrical stimulation of macaque V1 on visual behaviour

Marta Falkowska, Jennifer Greilsamer, Liza Kumari, Jaime Cadena Valencia, Beshoy Agayby, Samy Rima, Marcus Haag, Diego Ghezzi, Michael C. Schmid

FENS Forum 2024