
Behavioural Relevance



Discover seminars, jobs, and research tagged with behavioural relevance across World Wide.
3 curated items · 3 seminars
Updated over 4 years ago
Seminar · Neuroscience · Recording

The neuroscience of color and what makes primates special

Bevil Conway
NIH
May 10, 2021

Among mammals, excellent color vision has evolved only in certain non-human primates. And yet, color is often assumed to be just a low-level stimulus feature with a modest role in encoding and recognizing objects. The rationale for this dogma is compelling: object recognition is excellent in grayscale images (consider black-and-white movies, where faces, places, objects, and story are readily apparent). In my talk I will discuss experiments in which we used color as a tool to uncover an organizational plan in inferior temporal (IT) cortex (parallel, multistage processing for places, faces, colors, and objects) and a visual-stimulus functional representation in prefrontal cortex (PFC). The discovery of an extensive network of color-biased domains within IT and PFC, regions implicated in high-level object vision and executive functions, compels a re-evaluation of the role of color in behavior. I will discuss behavioral studies prompted by the neurobiology that uncover a universal principle for color categorization across languages; the first systematic study of the color statistics of objects and a chromatic mechanism by which the brain may compute animacy; and a surprising paradoxical impact of memory on face color. Taken together, my talk will put forward the argument that color is not primarily for object recognition, but rather for the assessment of the likely behavioral relevance, or meaning, of the stuff we see.

Seminar · Neuroscience · Recording

A Cortical Circuit for Audio-Visual Predictions

Aleena Garner
Keller lab, FMI
Mar 9, 2021

Teamwork makes sensory streams work: our senses work together, learn from each other, and stand in for one another, the result of which is perception and understanding. Learned associations between stimuli in different sensory modalities can shape the way we perceive these stimuli (McGurk and MacDonald, 1976). During audio-visual associative learning, auditory cortex is thought to underlie multi-modal plasticity in visual cortex (McIntosh et al., 1998; Mishra et al., 2007; Zangenehpour and Zatorre, 2010). However, it is not well understood how processing in visual cortex is altered by an auditory stimulus that is predictive of a visual stimulus, or what mechanisms mediate such experience-dependent, audio-visual associations in sensory cortex. Here we describe a neural mechanism by which an auditory input can shape visual representations of behaviorally relevant stimuli through direct interactions between auditory and visual cortices. We show that the association of an auditory stimulus with a visual stimulus in a behaviorally relevant context leads to an experience-dependent suppression of visual responses in primary visual cortex (V1). Auditory cortex axons carry a mixture of auditory and retinotopically matched visual input to V1, and optogenetic stimulation of these axons selectively suppresses V1 neurons responsive to the associated visual stimulus after, but not before, learning. Our results suggest that cross-modal associations can be stored in long-range cortical connections and that with learning these cross-modal connections function to suppress the responses to predictable input.

Seminar · Neuroscience

Experience dependent changes of sensory representation in the olfactory cortex

Antonia Marin Burgin
Biomedicine Research Institute of Buenos Aires
Nov 17, 2020

Sensory representations are typically thought of as neuronal activity patterns that encode physical attributes of the outside world. However, increasing evidence shows that as animals learn the association between a sensory stimulus and its behavioral relevance, stimulus representations in sensory cortical areas can change. In this seminar I will present recent experiments from our lab showing that activity in the olfactory piriform cortex (PC) of mice encodes not only odor information, but also non-olfactory variables associated with the behavioral task. By developing an associative olfactory learning task, in which animals learn to associate a particular context with an odor and a reward, we were able to record the activity of multiple neurons as the animal runs in a virtual reality corridor. By analyzing the population activity dynamics using Principal Component Analysis, we find different population trajectories evolving through time that can discriminate aspects of different trial types. Using Generalized Linear Models, we further dissected the contribution of different sensory and non-sensory variables to the modulation of PC activity. Interestingly, the experiments show that variables related to both sensory and non-sensory aspects of the task (e.g., odor, context, reward, licking, sniffing rate and running speed) differently modulate PC activity, suggesting that the PC adapts odor processing depending on experience and behavior.
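The analysis pipeline described in this abstract — projecting population activity onto principal components to visualize trial-type trajectories, then regressing single-neuron activity onto task variables — can be illustrated with a minimal sketch. All data, variable names, and model choices below are hypothetical stand-ins (simulated activity, a plain linear regression in place of the lab's actual GLM), not the authors' code:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical recording: trials x timepoints x neurons, two trial types
# whose population activity follows different latent trajectories.
n_trials, n_time, n_neurons = 20, 50, 100
latent = np.sin(np.linspace(0, np.pi, n_time))     # shared temporal profile
labels = np.repeat([0, 1], n_trials // 2)          # trial type per trial
w = [rng.normal(size=n_neurons), rng.normal(size=n_neurons)]

X = np.zeros((n_trials, n_time, n_neurons))
for i, lab in enumerate(labels):
    X[i] = np.outer(latent, w[lab]) + 0.1 * rng.normal(size=(n_time, n_neurons))

# PCA on (trials*time) x neurons gives low-dimensional population
# trajectories; trial types trace out separable paths in PC space.
X_flat = X.reshape(-1, n_neurons)
pca = PCA(n_components=3).fit(X_flat)
traj = pca.transform(X_flat).reshape(n_trials, n_time, 3)
mean_traj_type0 = traj[labels == 0].mean(axis=0)
mean_traj_type1 = traj[labels == 1].mean(axis=0)

# GLM-style encoding model for one neuron: regress its activity on task
# variables (here only trial type and time; a real analysis would include
# odor, context, reward, licking, sniff rate, running speed, etc.).
design = np.column_stack([np.repeat(labels, n_time),
                          np.tile(np.arange(n_time), n_trials)])
glm = LinearRegression().fit(design, X_flat[:, 0])

print("variance captured by 3 PCs:", pca.explained_variance_ratio_.sum())
```

In this toy setting most variance falls in a few components because the simulated signal is low-dimensional; with real piriform recordings, the number of components retained and the set of regressors would be chosen from the data.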