ePoster

A high-throughput pipeline for evaluating recurrent neural networks on multiple datasets

Moufan Li, Nathan Cloos, Xun Yuan, Guangyu Robert Yang, Christopher J. Cueva
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 19, 2022

Abstract

Neural networks are now widely used for modeling neural activity in the brain. They have been particularly successful in modeling the visual system, using mostly feedforward networks and leveraging community-wide efforts centered around benchmarks to both improve model architectures and evaluate model fits to data. Now that recurrent neural networks (RNNs) are also used to model a larger variety of brain functions, there is a similar need for developing appropriate metrics to compare these models. Towards this goal, we have built a high-throughput pipeline for training different RNN models on a wide range of tasks and comparing them to many experimental datasets through a variety of analysis methods. To ensure the reliability of results generated by these methods, we evaluate a set of model-data similarity measures based on several criteria: 1) robustness to noise, 2) higher similarity scores for comparisons between models of the same structure versus models of different structures, and 3) a rise in similarity scores between models and data after training. We found that the similarity of models to datasets rises after training for all the methods. Centered kernel alignment (CKA) is less sensitive to noise and better identifies models of the same structure than methods based on canonical correlation analysis (CCA). Our framework provides the flexibility to add models, datasets, and analysis methods, serving as a basis for further refinement and testing of RNN models by evaluating them against multiple datasets.
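
For concreteness, the sketch below shows one common way to compute the linear form of centered kernel alignment (CKA), the measure the abstract reports as most robust. It is an illustrative NumPy reimplementation under standard assumptions, not the pipeline's own code; the function and variable names (linear_cka, acts_a, acts_b) are hypothetical.

import numpy as np

def linear_cka(acts_a, acts_b):
    # acts_a, acts_b: (n_timepoints, n_units) activity matrices collected on the
    # same trials from two networks, or from a network and a neural dataset.
    # Center each unit's activity so the comparison ignores mean offsets.
    acts_a = acts_a - acts_a.mean(axis=0, keepdims=True)
    acts_b = acts_b - acts_b.mean(axis=0, keepdims=True)
    # Linear CKA: ||B^T A||_F^2 / (||A^T A||_F * ||B^T B||_F), a score in [0, 1].
    numerator = np.linalg.norm(acts_b.T @ acts_a, ord="fro") ** 2
    denominator = (np.linalg.norm(acts_a.T @ acts_a, ord="fro")
                   * np.linalg.norm(acts_b.T @ acts_b, ord="fro"))
    return numerator / denominator

# Toy check: linear CKA is invariant to orthogonal transformations of the units,
# so a rotated copy of the same activity should score ~1.
rng = np.random.default_rng(0)
acts_a = rng.normal(size=(500, 64))
rotation, _ = np.linalg.qr(rng.normal(size=(64, 64)))
print(linear_cka(acts_a, acts_a @ rotation))  # ~1.0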

Unique ID: cosyne-22/highthroughput-pipeline-evaluating-b19f635d