Authors & Affiliations
Wah Ming Wayne Soo, Máté Lengyel
Abstract
It has been proposed that the visual cortex performs sampling-based probabilistic inference, in which neural responses represent stochastic samples from a posterior distribution. However, the highly correlated nature of neural activity across time places a computational ceiling on the effective sample size attainable in any given time interval. Here, we show that auxiliary neurons, when incorporated strategically, can speed up sampling in neural circuits so that an inferred posterior distribution is expressed efficiently. We train stabilized supralinear networks using a loss function that intrinsically rewards fast sampling. A base model comprising 50 coding excitatory and 50 inhibitory neurons achieves an acceptable level of performance when trained to sample over 800 ms, but fails to produce enough effective samples within a more realistic time frame of 400 ms. We construct the full network by adding 100 auxiliary excitatory neurons. Our optimized network (trained for 400 ms) attains competitive performance compared with both the base network trained for 800 ms and a naïve duplication of the base model, in which samples are collected from two concurrent base networks trained for 400 ms, despite having less sampling time or fewer coding neurons, respectively. Two key temporal structures, both observed experimentally, emerged from all models after optimization: (1) positively skewed membrane potential distributions at low contrast; and (2) gamma oscillations whose frequency increases with contrast. Additionally, oscillations expressed by the full network uniquely contain a well-hidden but computationally crucial temporal signature that evades typical spectral analyses: reduced temporal co-kurtosis. We show analytically that the dynamics underlying these effects lead to improved sampling efficiency. Our results enhance the biological plausibility of sampling-based probabilistic inference and objectively attribute key experimental observations to its computational efficiency, while our analysis of irregular oscillations brings to light the significance of analysing higher-order temporal moments in neural activity.
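To make the bottleneck described above concrete, the sketch below estimates the effective sample size of a temporally autocorrelated trace using the standard autocorrelation-based correction. The Ornstein-Uhlenbeck surrogate, the 20 ms correlation time, the 1 ms time step, and the 400 ms window are illustrative assumptions for this example, not quantities taken from the paper.

```python
import numpy as np

def effective_sample_size(x, max_lag=200):
    """Standard ESS estimate: N / (1 + 2 * sum of positive-lag autocorrelations)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    var = x.var()
    rho_sum = 0.0
    for lag in range(1, min(max_lag, n - 1)):
        rho = np.dot(x[:-lag], x[lag:]) / ((n - lag) * var)
        if rho < 0.05:          # truncate once correlations become negligible
            break
        rho_sum += rho
    return n / (1.0 + 2.0 * rho_sum)

# Illustrative surrogate: an Ornstein-Uhlenbeck process standing in for a
# membrane-potential sample trace (assumed 20 ms correlation time, 1 ms bins,
# 400 ms window -- all hypothetical numbers).
rng = np.random.default_rng(0)
dt, tau, T = 1.0, 20.0, 400
v = np.zeros(T)
for t in range(1, T):
    v[t] = v[t - 1] - (dt / tau) * v[t - 1] + np.sqrt(2 * dt / tau) * rng.standard_normal()

print(f"{T} time bins -> ~{effective_sample_size(v):.0f} effective samples")
```

With these assumed parameters the 400 time bins collapse to only a few tens of effective samples, which is the kind of gap the auxiliary neurons are trained to close.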
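The abstract's final point concerns a fourth-order temporal signature. As a hedged illustration only: assuming "temporal co-kurtosis" refers to the standardized fourth-order joint moment between a response trace and a time-lagged copy of itself (the paper may define its estimator differently), it can be computed as follows.

```python
import numpy as np

def temporal_cokurtosis(x, lag):
    """Standardized fourth-order joint moment E[x_t^2 x_{t+lag}^2] / var(x)^2.

    This is one common co-kurtosis variant (the 'xxyy' moment); whether the
    paper uses this exact estimator is an assumption made for illustration.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    a, b = x[:-lag], x[lag:]
    return np.mean(a**2 * b**2) / (x.var() ** 2)

# For a stationary Gaussian trace this moment equals 1 + 2*rho(lag)^2, so
# values falling below that Gaussian prediction at a given lag would indicate
# a reduction in co-kurtosis that power-spectral analyses cannot detect.
rng = np.random.default_rng(1)
x = rng.standard_normal(4000)
print(temporal_cokurtosis(x, lag=5))   # ~1.0 for white Gaussian noise
```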