Schedule
Wednesday, December 7, 2022
4:00 PM Europe/Berlin
Recording provided by the organiser.
Domain
Neuroscience
Host
SNUFA
Duration
30 minutes
Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how qualitative aspects of a dendritic tree, such as its branched morphology, repetition of presynaptic inputs, voltage-gated ion channels, electrical properties, and complex synapses, determine neural computation beyond this apparent nonlinearity. While it has been speculated that the dendritic tree of a neuron can be seen as a multi-layer neural network, and it has been shown that such an architecture could be computationally strong, we do not know whether that computational strength is preserved under these qualitative biological constraints. Here we simulate multi-layer neural network models of dendritic computation with and without these constraints. We find that dendritic model performance on interesting machine learning tasks is not hurt by most of these constraints and may synergistically benefit from all of them combined. Our results suggest that single real dendritic trees may be able to learn a surprisingly broad range of tasks through the emergent capabilities afforded by their properties.
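The tree-as-network analogy in the abstract can be sketched in code: instead of dense all-to-all layers, each "branch point" combines only its two child branches, so connectivity mirrors a binary dendritic tree. This is a minimal illustration of that idea, not the speakers' actual model; the function names, the ReLU nonlinearity, and the complete-binary-tree shape are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def dendrite_tree_forward(x, weights):
    """One forward pass through a binary-tree 'dendritic' network.

    Each level halves the number of units: pairs of sibling branch
    activations are combined by a 2-input weighted sum followed by a
    nonlinearity, mimicking sparse, tree-structured connectivity
    rather than a dense multi-layer perceptron.
    """
    a = x
    for W in weights:                    # W has shape (n_parents, 2)
        pairs = a.reshape(-1, 2)         # group sibling branches
        a = relu(np.sum(pairs * W, axis=1))
    return a                             # single 'somatic' output

n_inputs = 8  # a power of two, so the tree is complete
# one pair of weights per parent node at each level of the tree
weights = [rng.normal(size=(n_inputs // 2 ** (level + 1), 2))
           for level in range(int(np.log2(n_inputs)))]

x = rng.normal(size=n_inputs)
y = dendrite_tree_forward(x, weights)    # shape (1,): the soma's output
```

With 8 inputs the tree has three levels (8 → 4 → 2 → 1), so the output is a single value, analogous to the somatic response of one neuron.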
Ilenna Jones
University of Pennsylvania
Contact & Resources
neuro