Authors & Affiliations
Alexis Monnet-Aimard, Camila Losada, Guilhem Ibos
Abstract
Detecting behaviorally relevant stimuli in a complex environment (such as a camouflaged predator) requires comparing sensory representations (what we are looking at) with memory/cognitive representations (what we are looking for). We test a hypothesis proposed by Freedman and Ibos (2018) that, in this context, the lateral intraparietal area (LIP) integrates and compares visual and working-memory-related information in order to transform them into decision-related signals. We posit that sensory information is extracted within the hierarchy of visual cortical areas, and that the prefrontal cortex (PFC) plays a central role in storing information in working memory. We simultaneously recorded the activity of LIP, PFC, and V4 neurons of macaque monkeys performing a delayed-conjunction matching task. In this task, monkeys matched the color, orientation, and location of a sample stimulus with those of subsequently presented test stimuli. We used both ROC analysis and decoding methods to characterize the selectivity and latencies of these signals. Preliminary results show a clear functional dissociation between a sensory-driven network, composed of V4 and LIP neurons, and a network, composed of PFC and LIP neurons, that more strongly encodes goal-directed signals. V4 and LIP neurons encode visual information before PFC neurons do, while PFC neurons encode cognitive information earlier and in a more stable manner than V4 and LIP neurons. Both the sequential encoding of sensory and mnemonic signals within the V4/PFC/LIP network and the strength of decoding suggest that LIP plays a central role in representing each type of information. These results are in line with the tested hypothesis.
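The abstract does not include analysis code; the following is a minimal, illustrative sketch (in Python, not the authors' pipeline) of how sliding-window ROC analysis could quantify a single neuron's condition selectivity and estimate its encoding latency. The variable names (`spike_counts`, `condition`), the 0.75 AUC threshold, and the synthetic data are assumptions for illustration only.

```python
# Illustrative sketch: ROC-based selectivity of one neuron, computed per time
# bin to estimate encoding latency. `spike_counts` is a (trials x time_bins)
# array of binned responses; `condition` holds binary trial labels
# (e.g., match vs. non-match). All names and thresholds are hypothetical.
import numpy as np
from sklearn.metrics import roc_auc_score

def sliding_roc_selectivity(spike_counts, condition, threshold=0.75):
    """Return per-bin selectivity (folded ROC AUC) and the first bin
    exceeding `threshold`, taken as the selectivity latency."""
    n_bins = spike_counts.shape[1]
    auc = np.array([
        roc_auc_score(condition, spike_counts[:, t]) for t in range(n_bins)
    ])
    # Fold AUC around 0.5 so selectivity is independent of effect direction.
    selectivity = np.maximum(auc, 1.0 - auc)
    above = np.flatnonzero(selectivity > threshold)
    latency_bin = int(above[0]) if above.size else None
    return selectivity, latency_bin

# Synthetic example: 200 trials, 50 time bins, with a condition effect
# appearing from bin 20 onward.
rng = np.random.default_rng(0)
condition = rng.integers(0, 2, size=200)
spike_counts = rng.poisson(5.0, size=(200, 50)).astype(float)
spike_counts[condition == 1, 20:] += 3.0
selectivity, latency_bin = sliding_roc_selectivity(spike_counts, condition)
print(latency_bin)  # approximate onset bin of condition selectivity
```

Comparing such per-area latencies (and, analogously, cross-validated decoding accuracy from population activity) is one standard way to order sensory and cognitive signal onsets across V4, LIP, and PFC.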