ePoster

Training Large Neural Networks With Low-Dimensional Error Feedback

Maher Hanut and 1 co-author
COSYNE 2025 (2025)
Montreal, Canada

Presentation

Date TBA


Abstract

Training deep neural networks typically relies on backpropagating high-dimensional error signals, a computationally intensive and biologically implausible process. However, since most tasks involve low-dimensional outputs, we propose that low-dimensional error signals may suffice for effective learning. To test this hypothesis, we introduce a novel local learning rule, based on Feedback Alignment, that leverages indirect, low-dimensional error feedback to train large networks. Our method decouples the backward pass from the forward pass, enabling precise control over error-signal dimensionality while maintaining high-dimensional representations. We begin with a detailed theoretical derivation for linear networks, which forms the foundation of our learning framework, and extend the approach to nonlinear and convolutional architectures. Remarkably, we demonstrate that even minimal error dimensionality, on the order of the task dimensionality, can match the performance of traditional backpropagation. Our rule efficiently trains convolutional networks, which have previously resisted Feedback Alignment, with this minimal error dimensionality. This result not only paves the way for more biologically plausible models of learning but also challenges the conventional reliance on high-dimensional gradient signals in neural network training. Our findings suggest that low-dimensional error signals can be as effective as high-dimensional ones, prompting a reevaluation of gradient-based learning in high-dimensional systems. Ultimately, our work offers a fresh perspective on neural network optimization and contributes to understanding learning mechanisms in both artificial and biological systems.
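The abstract does not spell out the learning rule, but the core idea (fixed random feedback carrying an error signal compressed to roughly the task's dimensionality, instead of the transposed forward weights used by backpropagation) can be sketched for a two-layer linear network. Everything here is illustrative: the network sizes, the fixed projection `P`, the feedback matrix `B`, and the learning rate are assumptions, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: input, hidden, output, and error-feedback dim k.
# k is on the order of the task (output) dimensionality, far below n_hid.
n_in, n_hid, n_out, k = 20, 100, 5, 5

W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # trained forward weights
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
P = rng.normal(0.0, 1.0 / np.sqrt(n_out), (k, n_out))  # fixed error compressor (assumed)
B = rng.normal(0.0, 1.0 / np.sqrt(k), (n_hid, k))      # fixed random feedback (assumed)

T = rng.normal(0.0, 1.0, (n_out, n_in))    # linear "teacher" defining a toy task
X = rng.normal(0.0, 1.0, (n_in, 256))      # fixed training batch
Y = T @ X

def mse():
    return float(np.mean((W2 @ (W1 @ X) - Y) ** 2))

loss_before = mse()
lr = 0.01
for _ in range(2000):
    H = W1 @ X              # forward pass (linear, for clarity)
    E = W2 @ H - Y          # output error (n_out-dimensional)
    E_low = P @ E           # compressed, k-dimensional error signal
    # Local updates: the hidden layer never sees the transposed forward
    # weights, only fixed random feedback of the low-dimensional error.
    W2 -= lr * (E @ H.T) / X.shape[1]
    W1 -= lr * ((B @ E_low) @ X.T) / X.shape[1]

loss_after = mse()
```

With `k = n_out`, the composite `B @ P` plays the role of the fixed random feedback matrix in classic Feedback Alignment, so the loss on this linear task should fall well below its initial value even though no high-dimensional gradient ever reaches the hidden layer.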
