Seminar · Neuroscience · Recording available

Global visual salience of competing stimuli

Alex Hernandez-Garcia

Université de Montréal

Schedule

Thursday, December 10, 2020
4:00 PM Europe/Zurich

Host: NeuroLeman Network

Watch the seminar


Recording provided by the organiser.

Event Information

Domain: Neuroscience
Host: NeuroLeman Network
Duration: 70 minutes

Abstract

Current computational models of visual salience accurately predict the distribution of fixations on isolated visual stimuli. It is not known, however, whether the global salience of a stimulus, that is, its effectiveness in the competition for attention with other stimuli, is a function of its local salience or an independent measure. Further, do task and familiarity with the competing images influence eye movements? In this talk, I will present the analysis of a computational model of the global salience of natural images. We trained a machine learning algorithm to learn the direction of the first saccade of participants who freely observed pairs of images. The pairs balanced the combinations of new and already-seen images, as well as task and task-free trials. The coefficients of the model provided a reliable measure of how likely each image is to attract the first fixation when shown next to another image, that is, its global salience. For example, images of close-up faces and images containing humans consistently attracted the first fixation and were assigned higher global salience. Interestingly, we found that global salience cannot be explained by the feature-driven local salience of the images, that the influence of task and familiarity was rather small, and that the previously reported left-sided bias was reproduced. This computational model of global salience makes it possible to analyse many other aspects of human visual perception of competing stimuli. In the talk, I will also present our latest results on saccadic reaction time as a function of the global salience of the pair of images.
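The abstract describes learning per-image weights from the direction of the first saccade over image pairs, with the fitted coefficients serving as global-salience scores. The sketch below is purely illustrative, not the speaker's implementation: it assumes a pairwise logistic-regression formulation with synthetic data and hypothetical variable names. Each trial contributes a row with +1 for the left image and -1 for the right image, the label marks whether the first saccade went left, and the intercept absorbs a left-sided bias.

```python
# Illustrative sketch only: a pairwise logistic-regression model of global
# salience, using synthetic data and hypothetical names (not the speaker's code).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_images, n_trials = 100, 5000          # hypothetical dataset size

# Each trial shows two distinct images side by side.
pairs = np.array([rng.choice(n_images, size=2, replace=False) for _ in range(n_trials)])
left_img, right_img = pairs[:, 0], pairs[:, 1]

# Design matrix: +1 for the image on the left, -1 for the image on the right.
X = np.zeros((n_trials, n_images))
X[np.arange(n_trials), left_img] = 1.0
X[np.arange(n_trials), right_img] = -1.0

# Synthetic "ground-truth" salience, used only to generate example labels
# (1 = first saccade toward the left image), with a built-in left-sided bias.
true_salience = rng.normal(size=n_images)
p_left = 1.0 / (1.0 + np.exp(-(X @ true_salience + 0.5)))
y = (rng.random(n_trials) < p_left).astype(int)

model = LogisticRegression().fit(X, y)
global_salience = model.coef_.ravel()   # one salience score per image
left_bias = model.intercept_[0]         # positive value indicates a left-side bias

# Images with the largest coefficients are the ones most likely to attract
# the first fixation when paired against another image.
top_images = np.argsort(global_salience)[::-1][:5]
print("Most globally salient image ids:", top_images, "left bias:", round(left_bias, 2))
```

In this formulation, comparing any two images reduces to the difference of their coefficients, which is what makes the scores interpretable as a global, pairwise-competition measure rather than a property of isolated stimuli.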

Topics

attention, competing stimuli, computational model, eye movements, first fixation, global salience, machine learning, saccades, salience, vision, visual perception, visual salience

About the Speaker

Alex Hernandez-Garcia

Université de Montréal


Related Seminars

  • Pancreatic Opioids Regulate Ingestive and Metabolic Phenotypes (Washington University in St. Louis, Jan 12, 2025)
  • Exploration and Exploitation in Human Joint Decisions (Munich, Jan 12, 2025)
  • The Role of GPCR Family Mrgprs in Itch, Pain, and Innate Immunity (Johns Hopkins University, Jan 12, 2025)