Seminar • Recording Available • Neuroscience

Can a single neuron solve MNIST? Neural computation of machine learning tasks emerges from the interaction of dendritic properties

Ilenna Jones

University of Pennsylvania

Schedule

Wednesday, December 7, 2022, 3:00 PM Europe/Berlin


Access Seminar

Meeting Password: $Em4HF

Use this password when joining the live session.

Watch the seminar

Recording provided by the organiser.

Event Information

Domain: Neuroscience
Original Event: View source
Host: SNUFA
Duration: 30 minutes

Abstract

Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how qualitative aspects of a dendritic tree, such as its branched morphology, its repetition of presynaptic inputs, voltage-gated ion channels, electrical properties and complex synapses, determine neural computation beyond this apparent nonlinearity. While it has been speculated that the dendritic tree of a neuron can be seen as a multi-layer neural network and it has been shown that such an architecture could be computationally strong, we do not know if that computational strength is preserved under these qualitative biological constraints. Here we simulate multi-layer neural network models of dendritic computation with and without these constraints. We find that dendritic model performance on interesting machine learning tasks is not hurt by most of these constraints and may synergistically benefit from all of them combined. Our results suggest that single real dendritic trees may be able to learn a surprisingly broad range of tasks through the emergent capabilities afforded by their properties.
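The abstract frames a single dendritic tree as a multi-layer, tree-structured network evaluated on machine learning tasks such as MNIST. As a rough illustration of that framing only, the sketch below builds a small binary-tree network in PyTorch: leaf "branches" read overlapping subsets of the input pixels (repeated presynaptic inputs), and each internal subunit combines exactly two children through a saturating nonlinearity before a soma readout. All names (DendriticTreeNet, PairwiseMerge), the branching factor, the tanh subunits, and the random input wiring are illustrative assumptions; the speaker's actual models include biophysical constraints (voltage-gated ion channels, electrical properties, complex synapses) that are not represented here.

```python
# Illustrative sketch only, not the speaker's model: a tree-structured
# "dendrite-like" network for MNIST-sized inputs (784 pixels, 10 classes).
import torch
import torch.nn as nn


class PairwiseMerge(nn.Module):
    """One tree level: each parent subunit takes a weighted sum of exactly
    two child activations, then applies a saturating nonlinearity."""
    def __init__(self, n_children):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_children // 2, 2) * 0.5)
        self.b = nn.Parameter(torch.zeros(n_children // 2))

    def forward(self, x):                          # x: (batch, n_children)
        pairs = x.reshape(x.shape[0], -1, 2)       # (batch, n_parents, 2)
        return torch.tanh((pairs * self.w).sum(-1) + self.b)


class DendriticTreeNet(nn.Module):
    def __init__(self, n_inputs=784, n_leaves=32, leaf_fan_in=64, n_classes=10):
        super().__init__()
        # Each leaf branch reads a random, overlapping pixel subset, so
        # individual inputs are repeated across branches (an assumption).
        self.register_buffer(
            "leaf_idx",
            torch.stack([torch.randperm(n_inputs)[:leaf_fan_in]
                         for _ in range(n_leaves)]))
        self.leaves = nn.ModuleList(
            [nn.Linear(leaf_fan_in, 1) for _ in range(n_leaves)])
        # Internal levels of the binary tree, from the leaves toward the soma.
        self.levels = nn.ModuleList()
        n = n_leaves
        while n > 1:
            self.levels.append(PairwiseMerge(n))
            n //= 2
        self.readout = nn.Linear(1, n_classes)      # soma -> class logits

    def forward(self, x):                           # x: (batch, 784)
        acts = torch.cat(
            [torch.tanh(leaf(x[:, idx]))
             for leaf, idx in zip(self.leaves, self.leaf_idx)], dim=1)
        for level in self.levels:                   # propagate toward the soma
            acts = level(acts)
        return self.readout(acts)                   # (batch, n_classes)


# Smoke test on random "images"; real use would train on MNIST with
# cross-entropy loss and an optimizer such as Adam.
model = DendriticTreeNet()
logits = model(torch.rand(8, 784))
print(logits.shape)   # torch.Size([8, 10])
```

The point of the sketch is only the tree-structured, subunit-based architecture the abstract alludes to; how performance changes when biophysical constraints are added is the subject of the talk itself.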

Topics

MNIST, biophysical, branched morphology, computational strength, constraints, deep learning, dendrites, dendritic properties, machine learning, modeling, multi-layer neural network, neural computation, nonlinearity, synaptic inputs, voltage-gated ion channels

About the Speaker

Ilenna Jones

University of Pennsylvania

Contact & Resources

Personal Website: www.ilenna.com
Twitter/X: @IlennaJ (twitter.com/IlennaJ)

Related Seminars

Pancreatic Opioids Regulate Ingestive and Metabolic Phenotypes
Neuroscience • Jan 12, 2025 • Washington University in St. Louis

Exploration and Exploitation in Human Joint Decisions
Neuroscience • Jan 12, 2025 • Munich

The Role of GPCR Family Mrgprs in Itch, Pain, and Innate Immunity
Neuroscience • Jan 12, 2025 • Johns Hopkins University