Topic · Neuroscience

separability

Content Overview

3 total items: 2 seminars, 1 ePoster

Latest

Seminar · Neuroscience · Recording

The organization of neural representations for control

David Badre
Brown University
Dec 10, 2021

Cognitive control allows us to think and behave flexibly based on our context and goals. Most theories of cognitive control propose a control representation that enables the same input to produce different outputs contingent on contextual factors. In this talk, I will focus on an important property of the control representation's neural code: its representational dimensionality. The dimensionality of a neural representation balances a basic separability/generalizability trade-off in neural computation, a trade-off with important implications for cognitive control. I will present initial evidence from fMRI and EEG showing that task representations in the human brain leverage both ends of this trade-off during flexible behavior.
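The separability side of the trade-off described above can be illustrated with a toy example (not from the talk): the XOR task is not linearly separable in its native 2-D input space, but embedding it in a higher-dimensional space makes a linear readout possible.

```python
import numpy as np

# Toy sketch (illustrative only): higher representational dimensionality
# can make previously inseparable categories linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR labels: no linear boundary exists in 2-D

# Expand the representation by appending the nonlinear feature x1*x2.
X_hi = np.hstack([X, (X[:, 0] * X[:, 1])[:, None]])

# A fixed linear readout w.x + b now solves XOR in the expanded 3-D space.
w = np.array([1.0, 1.0, -2.0])
b = -0.5
pred = (X_hi @ w + b > 0).astype(int)
print(pred)  # [0 1 1 0] -- matches y
```

The flip side of the trade-off is that such high-dimensional, task-specific codes generalize less readily to new contexts, which is the tension the talk examines.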

Seminar · Neuroscience · Recording

NMC4 Short Talk: Rank similarity filters for computationally efficient machine learning on high dimensional data

Katharine Shapcott
FIAS
Dec 2, 2021

Real-world datasets commonly contain nonlinearly separable classes, requiring nonlinear classifiers. However, these classifiers are less computationally efficient than their linear counterparts, and this inefficiency wastes energy, resources, and time. We were inspired by the efficiency of the brain to create a novel type of computationally efficient Artificial Neural Network (ANN) called Rank Similarity Filters, which can be used to both transform and classify nonlinearly separable datasets with many datapoints and dimensions.

The weights of the filters are set using the rank orders of features in a datapoint, or optionally the 'confusion' adjusted ranks between features (determined from their distributions in the dataset). The activation strength of a filter determines its similarity to other points in the dataset, a measure based on cosine similarity. The activation of many Rank Similarity Filters transforms samples into a new nonlinear space suitable for linear classification (Rank Similarity Transform (RST)). We additionally used this method to create the nonlinear Rank Similarity Classifier (RSC), a fast and accurate multiclass classifier, and the nonlinear Rank Similarity Probabilistic Classifier (RSPC), an extension to the multilabel case.

We evaluated the classifiers on multiple datasets and found RSC to be competitive with existing classifiers while offering superior computational efficiency. Code for RST, RSC and RSPC is open source and was written in Python using the popular scikit-learn framework to make it easily accessible (https://github.com/KatharineShapcott/rank-similarity). In future extensions the algorithm can be applied to hardware suitable for the parallelization of an ANN (GPU) and a Spiking Neural Network (neuromorphic computing) with corresponding performance gains. This makes Rank Similarity Filters a promising biologically inspired solution to the problem of efficient analysis of nonlinearly separable data.
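The rank-order idea in the abstract can be sketched in a few lines of numpy. This is a minimal illustration, not the released implementation: the function names are hypothetical, each filter's weights are simply the rank vector of a template datapoint, and the activation is the cosine similarity between rank vectors (it omits the 'confusion'-adjusted ranks and the scikit-learn packaging).

```python
import numpy as np

def rank_vector(x):
    # Replace each feature value by its rank order within the datapoint.
    return np.argsort(np.argsort(x)).astype(float)

def rank_similarity_transform(X, templates):
    # Filter weights = rank orders of template datapoints; activation =
    # cosine similarity between a sample's rank vector and each filter.
    W = np.array([rank_vector(t) for t in templates])
    R = np.array([rank_vector(x) for x in X])
    W = W / np.linalg.norm(W, axis=1, keepdims=True)
    R = R / np.linalg.norm(R, axis=1, keepdims=True)
    return R @ W.T  # shape: (n_samples, n_filters)

rng = np.random.default_rng(0)
templates = rng.normal(size=(4, 6))  # 4 filters over 6 features
X = rng.normal(size=(10, 6))
Z = rank_similarity_transform(X, templates)
print(Z.shape)  # (10, 4): a new nonlinear space for a linear classifier
```

A sample whose feature ranking matches a template's exactly activates that filter maximally (similarity 1), and because only rank orders enter the computation the transform is cheap and robust to monotonic rescaling of the features.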

ePoster · Neuroscience

Changes in tuning curves, not neural population covariance, improve category separability in the primate ventral visual pathway

Jenelle Feather, Long Sha, Gouki Okazawa, Nga Yu Lo, SueYeon Chung, Roozbeh Kiani

COSYNE 2025


