Latest

Seminar · Neuroscience · Recording

Cones with character: An in vivo circuit implementation of efficient coding

Tom Baden
University of Sussex
Nov 10, 2020

In this talk I will summarize some of our recent unpublished work on spectral coding in the larval zebrafish retina. Combining 2p imaging, hyperspectral stimulation, computational modelling and connectomics, we take a renewed look at the spectral tuning of cone photoreceptors in the live eye. We find that cones already rotate natural colour space optimally, in a PCA-like fashion, to disambiguate greyscale from "colour" information. We then follow this signal through the retinal layers and ultimately into the brain to explore the major spectral computations performed by the visual system at its consecutive stages. We find that, by and large, zebrafish colour vision can be broken into three major spectral zones: long-wavelength greyscale-like vision, short-wavelength prey-capture circuits, and spectrally diverse mid-wavelength circuits which possibly support the bulk of "true colour vision" in this tetrachromatic vertebrate.
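As a hedged illustration of the PCA-like rotation described above, the following minimal Python sketch runs PCA on simulated responses of four hypothetical cone types dominated by a shared intensity signal. All numbers and the data-generating model are assumptions for illustration, not the authors' data or pipeline; the point is only that the first principal component emerges as an all-positive greyscale/luminance axis, while later components form spectrally opponent "colour" axes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: responses of 4 cone types to n natural scenes,
# with a strong shared intensity component plus weaker chromatic variation.
n_scenes = 10_000
intensity = rng.lognormal(0.0, 0.5, size=n_scenes)
chromatic = 0.2 * rng.standard_normal((n_scenes, 4))
cone_responses = intensity[:, None] + chromatic   # shape (n_scenes, 4)

# PCA via eigendecomposition of the response covariance.
X = cone_responses - cone_responses.mean(axis=0)
cov = X.T @ X / (n_scenes - 1)
eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("variance explained:", eigvals / eigvals.sum())
print("PC1 (~greyscale/luminance axis):", eigvecs[:, 0])  # all one sign
print("PC2 (~opponent/'colour' axis):  ", eigvecs[:, 1])  # mixed signs
```

Under these assumed statistics, PC1 captures most of the variance and weights all cones with the same sign (a greyscale axis), so the remaining, lower-variance components are forced into opponent combinations that carry the chromatic information.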

Seminar · Neuroscience · Recording

Using noise to probe recurrent neural network structure and prune synapses

Rishidev Chaudhuri
University of California, Davis
Sep 25, 2020

Many networks in the brain are sparsely connected, and the brain eliminates synapses during development and learning. How could the brain decide which synapses to prune? In a recurrent network, determining the importance of a synapse between two neurons is a difficult computational problem, depending on the role that both neurons play and on all possible pathways of information flow between them. Noise is ubiquitous in neural systems and is often considered an irritant to be overcome. In the first part of this talk, I will suggest that noise could play a functional role in synaptic pruning, allowing the brain to probe network structure and determine which synapses are redundant. I will introduce a simple, local, unsupervised plasticity rule that either strengthens or prunes synapses using only the synaptic weight and the noise-driven covariance of the neighboring neurons. For a subset of linear and rectified-linear networks, this rule provably preserves the spectrum of the original matrix and hence preserves network dynamics even when the fraction of pruned synapses asymptotically approaches 1. The plasticity rule is biologically plausible and may suggest a new role for noise in neural computation. Time permitting, I will then turn to the problem of extracting structure from neural population datasets using dimensionality reduction methods. I will argue that nonlinear structures naturally arise in neural data and show how these nonlinearities cause linear methods of dimensionality reduction, such as Principal Component Analysis, to fail dramatically in identifying low-dimensional structure.
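The speaker's exact rule is from the talk and not reproduced here. As a hedged sketch of the general recipe the abstract describes (noise-driven covariance as a local importance signal, stochastic pruning, survivors rescaled so the weight matrix is preserved in expectation), here is a minimal Python simulation. The importance score and all parameters are illustrative assumptions in the spirit of spectral sparsification, not the published rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
W = 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)  # stable random weights
np.fill_diagonal(W, 0.0)

# Drive the linear recurrent network with noise: x_{t+1} = W x_t + noise.
T = 20_000
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = W @ x + rng.standard_normal(n)
    X[t] = x
C = np.cov(X[100:].T)  # noise-driven covariance (burn-in discarded);
                       # each synapse only needs its two neurons' entries

# Hypothetical local importance score for the synapse j -> i:
# weight magnitude scaled by the joint variability of the two neurons.
d = np.diag(C)
score = np.abs(W) * np.sqrt(np.maximum(np.add.outer(d, d) - 2.0 * C, 0.0))

# Keep each synapse with probability proportional to its score and rescale
# survivors by 1/p, so E[W_pruned] = W (spectrum preserved in expectation).
m = int(0.3 * n * n)                       # target number of survivors
p = np.minimum(1.0, m * score / score.sum())
mask = rng.random((n, n)) < p
W_pruned = np.where(mask, W / np.maximum(p, 1e-12), 0.0)

print("fraction pruned:", 1.0 - mask.mean())
print("spectral radius, original vs pruned:",
      np.abs(np.linalg.eigvals(W)).max(),
      np.abs(np.linalg.eigvals(W_pruned)).max())
```

The rescaling step is what makes the pruned matrix an unbiased estimate of the original, so the surviving synapses are strengthened exactly when their neighbors' pruning probabilities are low; this mirrors the abstract's "strengthen or prune" dichotomy, though the specific score used here is an assumption.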

ePoster · Neuroscience

Online contrastive PCA with Hebbian / anti-Hebbian plasticity

Tiberiu Tesileanu, Siavash Golkar, David Lipshutz, Dmitri Chklovskii

COSYNE 2023
