Authors & Affiliations
Julia Steinberg, Haim Sompolinsky
Abstract
Humans can robustly store and retrieve information with complex and hierarchical structure. Some examples are temporal sequences, where each event is associated with a particular time; episodic memories, where each event is associated with a particular context; cognitive maps, which represent spatial environments through landmarks associated with relative locations; and semantic structures in language, in which meaning is conveyed both through the identity of individual words and through their role in the sentence. While working memory tasks typically process one structure at a time, a long-term associative memory network must store multiple structures in a manner that allows them to be used upon retrieval for a variety of higher cognitive tasks. In this work, we specifically ask how Hopfield-type networks can store and retrieve multiple knowledge structures. We model each structure as a set of binary relations between events and attributes (attributes may represent, e.g., temporal order, spatial location, or role in a semantic structure). We use binarized holographic reduced representations (HRR) to map structures to distributed neuronal activity patterns (Plate, 1994; Eliasmith, 2013). We then use associative memory plasticity rules to store these activity patterns as fixed points in a recurrent network. By a combination of signal-to-noise analysis and numerical simulations, we demonstrate that our model allows for efficient storage of these knowledge structures, so that memorized structures and their individual building blocks (e.g., events and attributes) can subsequently be retrieved from partial cues. We show that long-term memory of structured knowledge relies on a new principle of computation beyond the memory basins. Our model can also be extended to store sequences of memories as single attractors.
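To make the pipeline described above concrete, here is a minimal sketch (not the authors' code) of the overall idea: bind event and attribute patterns with HRR circular convolution (Plate, 1994), binarize the superposition of bound pairs into a single structure pattern, store several such patterns in a Hopfield network, and retrieve a structure from a partial cue before unbinding one of its building blocks. The dimensions, pattern counts, Hebbian outer-product storage rule, synchronous updates, and correlation-based unbinding are all illustrative assumptions, stand-ins for the specific plasticity rules and retrieval dynamics analyzed in the paper.

```python
# Illustrative sketch of binarized-HRR structures stored in a Hopfield network.
# All parameter values and modeling choices here are assumptions, not the
# paper's exact construction.
import numpy as np

rng = np.random.default_rng(0)
N = 2048        # neurons per pattern (assumed)
PAIRS = 4       # event-attribute relations per structure (assumed)
STRUCTS = 20    # number of stored structures (assumed)

def bind(a, b):
    # HRR binding: circular convolution, computed via FFT.
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def unbind(s, a):
    # HRR unbinding: circular correlation (approximate inverse of binding).
    return np.fft.ifft(np.fft.fft(s) * np.conj(np.fft.fft(a))).real

def binarize(x):
    return np.where(x >= 0, 1.0, -1.0)

# Random +/-1 codebooks: one event per (structure, relation), shared attributes.
events = binarize(rng.standard_normal((STRUCTS, PAIRS, N)))
attrs = binarize(rng.standard_normal((PAIRS, N)))

# Each structure = binarized superposition of its bound (event, attribute) pairs.
structures = np.array([
    binarize(sum(bind(events[m, k], attrs[k]) for k in range(PAIRS)))
    for m in range(STRUCTS)
])

# Store structures as fixed points: Hebbian outer-product rule, no self-coupling.
W = structures.T @ structures / N
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=20):
    # Synchronous sign updates until (typically) a fixed point is reached.
    x = cue.copy()
    for _ in range(steps):
        x = binarize(W @ x)
    return x

# Partial cue: flip 25% of one structure's bits, then let the network clean it up.
cue = structures[0].copy()
flip = rng.choice(N, size=N // 4, replace=False)
cue[flip] *= -1
fixed = retrieve(cue)
print("overlap with stored structure:", fixed @ structures[0] / N)

# Recover an individual building block: unbind attribute 0 from the retrieved
# structure, then clean up by nearest match in the event codebook.
noisy_event = binarize(unbind(fixed, attrs[0]))
scores = events[:, 0] @ noisy_event / N
print("decoded event index (expect 0):", int(np.argmax(scores)))
```

In this sketch the recurrent dynamics first pull the partial cue into the stored structure's fixed point, and only then is HRR unbinding applied to read out a constituent event, which mirrors the two-stage retrieval (whole structure, then building block) described in the abstract; the FFT implementation of circular convolution is the standard way HRR binding is computed in practice.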