

Seminar · Recording Available · Neuroscience

NMC4 Short Talk: Rank similarity filters for computationally-efficient machine learning on high dimensional data

Katharine Shapcott

Postdoctoral Researcher

FIAS

Schedule

Thursday, December 2, 2021
3:15 AM America/New_York


Recording provided by the organiser.

Event Information

Domain: Neuroscience
Host: Neuromatch 4
Duration: 15 minutes

Abstract

Real-world datasets commonly contain nonlinearly separable classes, requiring nonlinear classifiers. However, these classifiers are less computationally efficient than their linear counterparts, and this inefficiency wastes energy, resources, and time. Inspired by the efficiency of the brain, we created a novel type of computationally efficient Artificial Neural Network (ANN) called Rank Similarity Filters. They can be used to both transform and classify nonlinearly separable datasets with many datapoints and dimensions. The weights of the filters are set using the rank orders of features in a datapoint, or optionally the 'confusion'-adjusted ranks between features (determined from their distributions in the dataset). The activation strength of a filter determines its similarity to other points in the dataset, a measure based on cosine similarity. The activation of many Rank Similarity Filters transforms samples into a new nonlinear space suitable for linear classification (Rank Similarity Transform, RST). We additionally used this method to create the nonlinear Rank Similarity Classifier (RSC), a fast and accurate multiclass classifier, and the nonlinear Rank Similarity Probabilistic Classifier (RSPC), its extension to the multilabel case. We evaluated the classifiers on multiple datasets; RSC is competitive with existing classifiers but has superior computational efficiency. Code for RST, RSC, and RSPC is open source and was written in Python using the popular scikit-learn framework to make it easily accessible (https://github.com/KatharineShapcott/rank-similarity). In future extensions the algorithm can be applied to hardware suitable for the parallelization of an ANN (GPU) and a Spiking Neural Network (neuromorphic computing) with corresponding performance gains. This makes Rank Similarity Filters a promising biologically inspired solution to the problem of efficient analysis of nonlinearly separable data.
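The core filter mechanics the abstract describes (weights set by rank orders of features, activation compared by cosine similarity) can be sketched roughly as follows. This is a minimal illustration only: the function names and the single-prototype filter are assumptions for exposition, not the API of the released rank-similarity package.

```python
import numpy as np

def rank_transform(X):
    """Replace each feature value with its rank order within the datapoint.

    A double argsort yields, per row, the rank of each feature
    (0 = smallest, d-1 = largest for d features).
    """
    return np.argsort(np.argsort(X, axis=1), axis=1).astype(float)

def filter_activation(x, prototype):
    """Activation of a hypothetical single filter: cosine similarity between
    the rank vector of a sample and filter weights set to the rank order of a
    prototype datapoint."""
    xr = rank_transform(x[None, :])[0]
    wr = rank_transform(prototype[None, :])[0]
    return float(xr @ wr / (np.linalg.norm(xr) * np.linalg.norm(wr)))

# A sample activates a filter built from itself maximally; a sample with
# reversed feature ranks activates it much less.
x = np.array([0.1, 0.9, 0.5])
y = np.array([0.9, 0.1, 0.5])
print(filter_activation(x, x))  # 1.0
print(filter_activation(x, y) < filter_activation(x, x))  # True
```

In the full method, the activations of many such filters form the new feature space (RST) on which a linear classifier can then separate the classes; the scikit-learn-compatible implementation linked in the abstract provides the actual estimators.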

Topics

Rank Similarity Classifier, Rank Similarity Filters, Rank Similarity Transform, artificial neural network, class separability, computational efficiency, cosine similarity, machine learning, multilabel classification, neural networks, nonlinear classifiers

About the Speaker

Katharine Shapcott

Postdoctoral Researcher

FIAS

Contact & Resources

Personal Website: www.esi-frankfurt.de/people/katharineshapcott/
Twitter/X: @ShapcottKA (twitter.com/ShapcottKA)

Related Seminars

  • Pancreatic Opioids Regulate Ingestive and Metabolic Phenotypes (Washington University in St. Louis, Jan 12, 2025)
  • Exploration and Exploitation in Human Joint Decisions (Munich, Jan 12, 2025)
  • The Role of GPCR Family Mrgprs in Itch, Pain, and Innate Immunity (Johns Hopkins University, Jan 12, 2025)