ePoster

Learning as Guessing: Rapid Learning through Hypothesis Testing with Compositional Representations

Reidar Riveland (Presenting Author) and 1 co-author

Conference
COSYNE 2025
Montreal, Canada

Authors & Affiliations

Reidar Riveland, Alex Pouget

Abstract

Fast and adaptable learning is often characterized by sudden jumps in performance rather than the slow adaptation typical of incremental synaptic weight updates. Cognitive science theories posit that such rapid insights arise from compositional symbolic representations that allow animals to quickly recombine past knowledge to adapt to new settings. Traditionally, it has been difficult to model this type of compositional learning with neural systems. Here we leverage recent work on compositional computations in multitasking Recurrent Neural Networks (RNNs) to construct a neural model of compositional learning. Past work found that representations of related tasks are often organized along a set of abstract linear axes. We use memory networks to store task representations and axis vectors that encode this abstract structure. Learning then proceeds by hypothesis testing: on each trial the system recalls a combination of these representations, which effectively recruits a set of factorized computations from the RNN. Because the relational vectors are abstract, those learned during training can also cue a novel combination of computations that solves an unseen target task. After pre-training on a subset of the task set, our models learn held-out tasks 100 times faster than gradient descent and show the characteristic sudden jump in performance typical of discrete hypothesis testing. We also find that cuing tasks during pre-training with semantic embeddings of task instructions, as opposed to non-linguistic cues, leads to stronger performance in the learning phase. Our learning algorithm is the first to make concrete predictions about the neural representations that underpin hypothesis-testing-style learning. It also has the potential to unify many recent experimental results that have found similar abstract linear relationships in neural data, and it makes explicit how a system can leverage this representational scheme to rapidly adapt to novel settings.
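To make the hypothesis-testing loop concrete, below is a minimal, self-contained Python sketch of the learning phase. It is illustrative only: `run_trial` is a stand-in for the pre-trained multitasking RNN, the stored vectors are random stand-ins for the learned task representations and relational axis vectors, and the names (`task_memory`, `axis_memory`) and the acceptance threshold are our own assumptions, not details taken from the model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins; the actual model uses representations learned
# during pre-training and a pre-trained multitasking RNN.
DIM = 64
task_memory = rng.standard_normal((8, DIM))  # stored task representations
axis_memory = rng.standard_normal((4, DIM))  # stored abstract relational axes

# The unseen target task is, by construction, one stored task
# shifted along one stored relational axis.
target = task_memory[3] + axis_memory[1]

def run_trial(task_vector):
    """Proxy for running the RNN on one trial: cosine similarity to the
    target representation stands in for the fraction of correct responses."""
    return float(np.dot(task_vector, target) /
                 (np.linalg.norm(task_vector) * np.linalg.norm(target)))

THRESHOLD = 0.95  # assumed performance needed to accept a hypothesis

for trial in range(500):
    # Hypothesis testing: recall a stored task vector and shift it along a
    # stored relational axis, cueing a novel composition of the RNN's
    # factorized computations.
    base = task_memory[rng.integers(len(task_memory))]
    axis = axis_memory[rng.integers(len(axis_memory))]
    hypothesis = base + rng.choice([-1.0, 1.0]) * axis

    if run_trial(hypothesis) >= THRESHOLD:
        # Performance jumps abruptly once the right combination is recalled,
        # mirroring the sudden learning curves of discrete hypothesis testing.
        print(f"hypothesis accepted on trial {trial}")
        break
```

The discrete accept-or-resample step is what produces the step-like learning curve described in the abstract: performance stays near chance until the right combination is recalled, then jumps, whereas a gradient-based learner adjusting weights would climb toward the threshold gradually.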

Unique ID: cosyne-25/learning-guessing-rapid-learning-75217ad8