ePoster

Predicting V1 contextual modulation and neural tuning using a convolutional neural network

Cem Uran, Martin Vinck
Bernstein Conference 2024
Goethe University, Frankfurt, Germany

Abstract

We propose a novel method to predict contextual modulation of local field potential (LFP) gamma synchronization and multi-unit activity (MUA) within a single framework. Convolutional neural networks are state-of-the-art models of the primary visual cortex (V1). Sensory processing in V1 is shaped by contextual modulation via feedback and recurrent connections, yet the computational goal and neural implementation that shape V1 activity are not fully understood. Using transfer learning, we repurpose an object-recognition convolutional neural network (VGG-16) to predict firing rates and gamma synchronization amplitude. We trained the network on a previously published dataset of full natural images, including the surround information that shapes contextual modulation. Our model predicts gamma synchronization amplitude and MUA firing-rate modulation with high accuracy. Surprisingly, it also captures the center-surround tuning properties of LFP and MUA, and its predictions extrapolate to artificial color and grating stimuli. Using an image-synthesis approach, we find that the input image maximizing gamma oscillations consists of contours extending into the surround. These findings suggest that gamma oscillations are key to understanding center-surround properties, and that tuning to natural and artificial stimuli is likely learned implicitly from natural scene statistics. Overall, we propose a novel model that predicts contextual modulation through LFP and MUA signals, a step toward uncovering its underlying mechanisms.
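The pipeline described above has two stages: a frozen pretrained feature extractor with a readout trained to predict neural signals (transfer learning), followed by gradient ascent on the input to synthesize the image that maximizes the predicted gamma amplitude. The sketch below illustrates both stages under strong simplifying assumptions: a fixed random ReLU projection stands in for the frozen VGG-16 features, the recorded responses are synthetic, and the readout is a closed-form ridge regression. All names and parameters are hypothetical, not the authors' actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen VGG-16 features (hypothetical): a fixed random
# projection followed by a ReLU, so the sketch runs without a pretrained net.
n_pixels, n_features = 64 * 64, 256
W_frozen = rng.standard_normal((n_features, n_pixels)) / np.sqrt(n_pixels)

def features(images):
    """Frozen feature map: images (n, n_pixels) -> activations (n, n_features)."""
    return np.maximum(images @ W_frozen.T, 0.0)

def fit_readout(X, y, lam=1.0):
    """Transfer-learning readout via ridge regression: w = (X^T X + lam I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic data standing in for natural images and recorded gamma amplitudes.
imgs = rng.standard_normal((200, n_pixels))
gamma_amp = features(imgs) @ rng.standard_normal(n_features)  # toy target

w = fit_readout(features(imgs), gamma_amp)
pred = features(imgs) @ w  # predicted gamma amplitude per image

# Image synthesis: gradient ascent on the input to maximize predicted gamma.
x = rng.standard_normal(n_pixels) * 0.01
pred0 = features(x[None])[0] @ w
for _ in range(100):
    relu_mask = (W_frozen @ x > 0).astype(float)
    grad = W_frozen.T @ (relu_mask * w)      # d(pred)/dx through the ReLU
    x += 0.1 * grad / (np.linalg.norm(grad) + 1e-8)
pred1 = features(x[None])[0] @ w             # larger than pred0 after ascent
```

In the actual study the frozen features come from VGG-16 trained on object recognition, which is what lets the learned readout inherit tuning from natural scene statistics; the random projection here only mimics the mechanics of the two-stage procedure.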

Unique ID: bernstein-24/predicting-contextual-modulation-db55c35f