Authors & Affiliations
Vidit Tripathi, Alex Wang, Hannah Choi
Abstract
A range of work has analyzed the effect of specific pruning rules on the dynamics of recurrent
neural networks (RNNs) with a given connectivity structure. However, a more comprehensive study
explaining the effects of different, biologically plausible pruning rules across a variety of structures has
yet to be done. Specifically, previous work has investigated uniform random sparsification of RNNs with
low-rank connectivity matrices [1], and has used theoretical results from graph sparsification to propose
a pruning rule shown to be optimal for preserving the dynamics of symmetric and diagonally dominant
connectivity matrices [3]. Our work seeks to unify some of these ideas, presenting a framework for
understanding the relationship between the rank of the connectivity matrix and the effects of different probabilistic pruning
rules on the network dynamics. We evaluate random pruning under two rules: 1) the strengthen-or-prune
method and 2) the regular-prune method. In the strengthen-or-prune method, connections
that survive pruning are strengthened; in the regular-prune method, unpruned connections are
left unaltered. The main results of our analysis are: i) for low-rank connectivity matrices,
strengthen-or-prune rules preserve the full low-dimensional dynamics, not just their dimensionality, and ii) for high-rank
connectivity matrices, strengthen-or-prune rules perform worse than regular pruning because they induce
stronger pruning-induced, high-dimensional dynamics. Our work thus explains the relationship between
connectivity structure and the dynamics induced by pruning in RNNs, which in turn provides insight into
the dynamics of sparse biological neuronal networks.
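
To make the two pruning rules concrete, the sketch below implements both on a rank-1 connectivity matrix and compares the resulting rate dynamics against the unpruned network. It is a minimal illustration, not the paper's code: the helper names (prune, simulate), the 1/(1 - p) reweighting used for the strengthen-or-prune rule, and the rate dynamics dx/dt = -x + W tanh(x) are assumptions made for this example.

    import numpy as np

    def prune(W, p, strengthen=False, seed=None):
        """Independently remove each connection with probability p.

        strengthen=True  -> "strengthen-or-prune": surviving weights are
                            rescaled by 1/(1 - p), so E[W_pruned] = W.
        strengthen=False -> "regular prune": surviving weights are unaltered.
        """
        rng = np.random.default_rng(seed)
        mask = rng.random(W.shape) >= p          # True where the connection survives
        W_pruned = W * mask
        if strengthen:
            W_pruned = W_pruned / (1.0 - p)      # compensate for the removed mass
        return W_pruned

    def simulate(W, x0, steps=200, dt=0.1):
        """Euler-integrate the rate dynamics dx/dt = -x + W tanh(x)."""
        x, traj = x0.copy(), []
        for _ in range(steps):
            x = x + dt * (-x + W @ np.tanh(x))
            traj.append(x.copy())
        return np.array(traj)

    # Rank-1 connectivity, so the unpruned dynamics live on a 1-D subspace.
    rng = np.random.default_rng(0)
    n, p = 500, 0.8
    u, v = rng.standard_normal((n, 1)), rng.standard_normal((n, 1))
    W = (u @ v.T) / n
    x0 = rng.standard_normal(n)

    base = simulate(W, x0)
    sp = simulate(prune(W, p, strengthen=True, seed=1), x0)   # strengthen-or-prune
    rp = simulate(prune(W, p, strengthen=False, seed=1), x0)  # regular prune

    print("strengthen-or-prune deviation:", np.linalg.norm(sp - base))
    print("regular-prune deviation:      ", np.linalg.norm(rp - base))

Under these assumptions, the strengthen-or-prune trajectory should track the low-rank baseline more closely than the regular-prune trajectory, consistent with result i) above.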