ePoster

Linking neural dynamics across macaque V4, IT, and PFC to trial-by-trial object recognition behavior

Kohitij Kar, Reese Green, James DiCarlo
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 19, 2022


Authors & Affiliations

Kohitij Kar, Reese Green, James DiCarlo

Abstract

Primates exhibit a remarkable ability to rapidly and accurately recognize visual objects in their central field of view. Based on accurate estimates of object identity from neural population activity and the object-specific behavioral deficits observed after cortical perturbation, previous work has identified the primate ventral visual cortex as a critical brain circuit for core object recognition. In addition, our recent work has demonstrated that the ventrolateral prefrontal cortex (vlPFC; recurrently connected to the ventral stream) is also critical for robust object recognition. In this study, we ask whether a common neuronal basis (e.g., in vlPFC or in IT cortex) and a simple (e.g., linear) transformation can account for behavioral measurements at multiple levels of granularity. We performed large-scale (multi-electrode) recordings in areas V4, IT, and vlPFC while monkeys performed a battery of binary object discrimination tasks. We tested ~10,000 neural linking models (hypotheses) that comprised various spatial (brain areas: V4, IT, vlPFC) and temporal (integration windows) pooling algorithms. We then determined how accurately those models predict macaque behavior at the level of overall performance, object-level confusions, image-level difficulty, and trial-by-trial choices. We observed that a specific subset of IT-based linking models that integrate weighted summations of neural activity exhibited significantly higher trial-by-trial choice consistencies (our finest-grained behavioral measurement) with the monkeys’ behavior than models constructed from V4 or vlPFC responses. Therefore, even though vlPFC is downstream of IT cortex, our results provide evidence against a vlPFC readout model. Together with prior work, we speculate that vlPFC might be part of a specialized circuit (including the ventral stream) that enables more robust recognition, while the behavioral read-out is primarily IT-based.
Apart from shedding light on the neural underpinnings of primate visual object recognition, our results provide algorithmic guidance for improving current computer vision models, which are typically less robust than, and outperformed by, humans.
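The core of a linking model of the kind described above is a weighted summation (linear readout) of population activity mapped to a binary choice, scored by its trial-by-trial consistency with the subject's choices. A minimal sketch of that idea, using synthetic data and scikit-learn's logistic regression as hypothetical stand-ins for the recorded responses and the actual fitting procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-in data: trials x neurons firing rates for one
# "brain area", plus a binary label per trial. Real analyses would use
# recorded V4/IT/vlPFC population responses and the monkeys' choices.
n_trials, n_neurons = 400, 50
tuning = rng.normal(size=n_neurons)            # per-neuron object preference
labels = rng.integers(0, 2, size=n_trials)     # object identity on each trial
rates = rng.normal(size=(n_trials, n_neurons)) + np.outer(2 * labels - 1, tuning)

# One linking model: a learned weighted summation (linear readout) of
# population activity, mapped to a binary choice.
readout = LogisticRegression().fit(rates[:300], labels[:300])
predicted_choice = readout.predict(rates[300:])

# Trial-by-trial consistency: fraction of held-out trials on which the
# model's choice matches the behavioral choice (here, the true label
# stands in for behavior).
consistency = np.mean(predicted_choice == labels[300:])
print(f"choice consistency: {consistency:.2f}")
```

Comparing such consistency scores across readouts built from different areas (and different temporal integration windows) is the kind of model comparison the abstract describes.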

Unique ID: cosyne-22/linking-neural-dynamics-across-macaque-5e84daf6