Priming
Epigenomic (re)programming of the brain and behavior by ovarian hormones
Rhythmic changes in sex hormone levels across the ovarian cycle exert powerful effects on the brain and behavior, and confer female-specific risks for neuropsychiatric conditions. In this talk, Dr. Kundakovic will discuss the role of fluctuating ovarian hormones as a critical biological factor contributing to the increased depression and anxiety risk in women. Cycling ovarian hormones drive brain and behavioral plasticity in both humans and rodents, and the talk will focus on animal studies in Dr. Kundakovic’s lab that are revealing the molecular and receptor mechanisms that underlie this female-specific brain dynamic. She will highlight the lab’s discovery of sex hormone-driven epigenetic mechanisms, namely chromatin accessibility and 3D genome changes, that dynamically regulate neuronal gene expression and brain plasticity but may also prime the (epi)genome for psychopathology. She will then describe functional studies, including hormone replacement experiments and the overexpression of an estrous cycle stage-dependent transcription factor, which provide the causal link(s) between hormone-driven chromatin dynamics and sex-specific anxiety behavior. Dr. Kundakovic will also highlight an unconventional role that chromatin dynamics may have in regulating neuronal function across the ovarian cycle, including in sex hormone-driven X chromosome plasticity and hormonally-induced epigenetic priming. In summary, these studies provide a molecular framework to understand ovarian hormone-driven brain plasticity and increased female risk for anxiety and depression, opening new avenues for sex- and gender-informed treatments for brain disorders.
Beyond visual search: studying visual attention with multitarget visual foraging tasks
Visual attention refers to a set of processes allowing selection of relevant and filtering out of irrelevant information in the visual environment. A large amount of research on visual attention has involved visual search paradigms, where observers are asked to report whether a single target is present or absent. However, recent studies have revealed that these classic single-target visual search tasks only provide a snapshot of how attention is allocated in the visual environment, and that multitarget visual foraging tasks may capture the dynamics of visual attention more accurately. In visual foraging, observers are asked to select multiple instances of multiple target types, as fast as they can. A critical question in foraging research concerns the factors driving the next target selection. Most likely, this involves two steps: (1) identifying a set of candidates for the next selection, and (2) selecting the best option among these candidates. After briefly describing the advantages of visual foraging over visual search, I will review recent visual foraging studies testing the influence of several manipulations (e.g., target crypticity, number of items, selection modality) on foraging behaviour. Overall, these studies revealed that the next target selection during visual foraging is determined by competition between three factors: target value, target proximity, and priming of features. I will explain how the analysis of individual differences in foraging behaviour can provide important information, with the idea that individuals show default internal biases toward value, proximity, and priming that determine their search strategy and behaviour.
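To make the three-factor competition concrete, here is a minimal Python sketch of next-target selection as a weighted combination of target value, proximity to the last selection, and feature priming. The item attributes, scoring function, and weights are hypothetical illustrations, not the model described in the talk.

# Hypothetical sketch: next-target selection in visual foraging as a
# linear competition between value, proximity, and feature priming.
from dataclasses import dataclass
import math

@dataclass
class Item:
    x: float
    y: float
    value: float        # reward associated with this target type
    same_as_last: bool  # shares features with the previously selected target

def next_selection(items, last_pos, w_value=1.0, w_prox=1.0, w_prime=1.0):
    """Return the candidate with the highest combined score."""
    def score(item):
        dist = math.hypot(item.x - last_pos[0], item.y - last_pos[1])
        proximity = 1.0 / (1.0 + dist)          # nearer items score higher
        priming = 1.0 if item.same_as_last else 0.0
        return w_value * item.value + w_prox * proximity + w_prime * priming
    return max(items, key=score)

# Example: the nearer, primed item wins despite carrying a lower value.
items = [Item(1.0, 1.0, value=0.5, same_as_last=True),
         Item(8.0, 2.0, value=0.8, same_as_last=False)]
print(next_selection(items, last_pos=(0.0, 0.0)))

Shifting the weights toward w_value, for example, would mimic an individual whose default bias favours value over proximity and priming.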
How do we find what we are looking for? The Guided Search 6.0 model
The talk will give a tour of Guided Search 6.0 (GS6), the latest evolution of the Guided Search model of visual search. Part 1 describes The Mechanics of Search. Because we cannot recognize more than a few items at a time, selective attention is used to prioritize items for processing. Selective attention to an item allows its features to be bound together into a representation that can be matched to a target template in memory or rejected as a distractor. The binding and recognition of an attended object is modeled as a diffusion process taking > 150 msec/item. Since selection occurs more frequently than that, it follows that multiple items are undergoing recognition at the same time, though asynchronously, making GS6 a hybrid serial and parallel model. If a target is not found, search terminates when an accumulating quitting signal reaches a threshold. Part 2 elaborates on the five sources of Guidance that are combined into a spatial “priority map” to guide the deployment of attention (hence “guided search”). These are (1) top-down and (2) bottom-up feature guidance, (3) prior history (e.g. priming), (4) reward, and (5) scene syntax and semantics. In GS6, the priority map is a dynamic attentional landscape that evolves over the course of search. In part, this is because the visual field is inhomogeneous. Part 3 describes how that inhomogeneity imposes spatial constraints on search that are captured by three types of “functional visual field” (FVF): (1) a resolution FVF, (2) an FVF governing exploratory eye movements, and (3) an FVF governing covert deployments of attention. Finally, in Part 4, we will consider that the internal representation of the search target, the “search template”, is really two templates: a guiding template (probably in working memory) and a target template (in long-term memory). Put these pieces together and you have GS6.
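As a rough computational illustration of these ideas, the following Python sketch combines five guidance maps into a toy priority map over a discretized visual field and stops searching once an accumulating quitting signal crosses a threshold. The random maps, weights, and threshold are assumptions made purely for illustration; they are not GS6’s actual parameters or implementation.

# Hypothetical sketch: a GS6-style priority map as a weighted sum of five
# guidance sources, with an accumulating quitting signal ending the search.
import numpy as np

rng = np.random.default_rng(0)
shape = (32, 32)  # a coarse grid standing in for the visual field

guidance = {
    "top_down":  rng.random(shape),  # match to the guiding template's features
    "bottom_up": rng.random(shape),  # local feature contrast (salience)
    "history":   rng.random(shape),  # prior history, e.g. priming
    "reward":    rng.random(shape),  # learned value of features/locations
    "scene":     rng.random(shape),  # scene syntax and semantics
}
weights = {"top_down": 2.0, "bottom_up": 1.0, "history": 0.5,
           "reward": 0.5, "scene": 1.0}
priority = sum(weights[k] * guidance[k] for k in guidance)

quit_signal, quit_threshold = 0.0, 10.0
for selection in range(1, priority.size + 1):
    y, x = np.unravel_index(np.argmax(priority), shape)  # attend the peak
    # (Recognition of the attended item would unfold here, asynchronously.)
    priority[y, x] = -np.inf        # suppress it so attention moves on
    quit_signal += rng.random()     # quitting signal accumulates noisily
    if quit_signal >= quit_threshold:
        print(f"target absent: search quit after {selection} selections")
        break

Letting the maps and weights change over the course of search, or restricting the argmax to a functional visual field around the current fixation, would be one way to capture the dynamic, inhomogeneous landscape described above.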
Abstraction and Analogy in Natural and Artificial Intelligence
Learning by analogy is a powerful tool in children’s developmental repertoire, as well as in educational contexts such as mathematics, where the key knowledge base involves building flexible schemas. However, noticing and learning from analogies develops over time and is cognitively resource-intensive. I review studies that provide insight into the relationships among mechanisms driving children’s developing analogy skills, highlighting environmental inputs (parent talk and prior experiences priming attention to relations) and neuro-cognitive factors, namely Executive Functions (EFs) and brain injury. I then note implications for mathematics learning, reviewing experimental findings showing that analogy can improve learning, but also that both individual differences in EFs and environmental factors that reduce available EFs, such as performance pressure, can predict student learning.
Priming the senses: Hunger's influence on olfaction, behaviour, and physiological responses
FENS Forum 2024
Shedding light on object location recall: Optogenetic priming of the HIP-mPFC pathway for object-location memory
FENS Forum 2024