ePoster

Improving the Neuronal Classification Capacity with Nonlinear Parallel Synapses

Yuru Song, Marcus Benna
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany


Abstract

A cortical neuron often establishes multiple synaptic contacts with the same postsynaptic neuron. To avoid functional redundancy of these parallel synapses, it is crucial that each synapse exhibits distinct computational properties. We discuss models of a neuron in which the current to the soma contributed by each synapse is described by a sigmoidal transmission function of its presynaptic input, with learnable parameters. We evaluate the classification capacity of a neuron equipped with such nonlinear parallel synapses, and show that with a small number of parallel synapses per axon, it substantially exceeds that of the perceptron. Furthermore, the number of correctly classified data points can increase superlinearly as the number of presynaptic axons grows. When training with an unrestricted number of parallel synapses, our model neuron can effectively implement an arbitrary aggregate transmission function for each axon, constrained only by monotonicity. Nevertheless, the successfully learned model often uses only a small number of parallel synapses, in agreement with biological observations. We also apply these parallel synapses in a feedforward neural network trained to classify handwritten digits, and show that they can increase the test accuracy. This demonstrates that multiple nonlinear synapses per input axon can substantially enhance a neuron's computational power.
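To make the model described above concrete, here is a minimal PyTorch sketch of a single neuron with nonlinear parallel synapses. It assumes one plausible parameterization, in which axon i's aggregate current is I_i = sum_k a_ik * sigmoid(b_ik * (x_i - c_ik)) with learnable amplitude a, slope b, and threshold c per synapse; these parameter names and the exact functional form are illustrative assumptions, not the poster's actual formulation.

```python
# Sketch of a neuron with nonlinear parallel synapses (assumed
# parameterization; the poster's model may differ in detail).
import torch
import torch.nn as nn

class ParallelSynapseNeuron(nn.Module):
    """One output unit; each of n_axons inputs connects through
    n_syn parallel synapses, each a learnable sigmoid."""
    def __init__(self, n_axons: int, n_syn: int):
        super().__init__()
        # Per-synapse learnable parameters (hypothetical names):
        # signed amplitude, slope, and threshold of the sigmoid.
        self.amplitude = nn.Parameter(torch.randn(n_axons, n_syn) * 0.1)
        self.slope = nn.Parameter(torch.ones(n_axons, n_syn))
        self.threshold = nn.Parameter(torch.randn(n_axons, n_syn))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_axons). Each synapse applies its own sigmoid to
        # its axon's activity; all contributions are summed at the soma.
        pre = x.unsqueeze(-1)  # (batch, n_axons, 1), broadcast over synapses
        current = self.amplitude * torch.sigmoid(self.slope * (pre - self.threshold))
        return current.sum(dim=(1, 2)) + self.bias  # total somatic current

# Usage: binary classification of random patterns, as in a capacity test.
neuron = ParallelSynapseNeuron(n_axons=100, n_syn=3)
x = torch.randn(32, 100)
logits = neuron(x)                                  # (32,)
labels = (torch.rand(32) > 0.5).float()
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss.backward()
```

Note that in this parameterization the aggregate per-axon transmission function is guaranteed to be monotonic only when all amplitudes for that axon share the same sign; with mixed signs it can be non-monotonic, so a monotonicity constraint of the kind mentioned in the abstract would have to be enforced explicitly.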

Unique ID: bernstein-24/improving-neuronal-classification-38a364ef