Latest

Seminar · Neuroscience

Brian2CUDA: Generating Efficient CUDA Code for Spiking Neural Networks

Denis Alevi
Berlin Institute of Technology
Nov 3, 2022

Graphics processing units (GPUs) are widely available and have been used with great success to accelerate scientific computing in the last decade. These advances, however, are often not available to researchers interested in simulating spiking neural networks who lack the technical knowledge to write the necessary low-level code. Writing low-level code is not necessary when using the popular Brian simulator, which provides a framework to generate efficient CPU code from high-level model definitions in Python. Here, we present Brian2CUDA, an open-source software package that extends the Brian simulator with a GPU backend. Our implementation generates efficient code for the numerical integration of neuronal states and for the propagation of synaptic events on GPUs, making use of their massively parallel arithmetic capabilities. We benchmark the performance improvements of our software for several model types and find that it can accelerate simulations by up to three orders of magnitude compared to Brian’s CPU backend. Currently, Brian2CUDA is the only package that supports Brian’s full feature set on GPUs, including arbitrary neuron and synapse models, plasticity rules, and heterogeneous delays. When comparing its performance with Brian2GeNN, another GPU-based backend for the Brian simulator with fewer features, we find that Brian2CUDA gives comparable speedups, while typically being slower for small networks and faster for large ones. By combining the flexibility of the Brian simulator with the simulation speed of GPUs, Brian2CUDA enables researchers to efficiently simulate spiking neural networks with minimal effort and thereby makes the advancements of GPU computing available to a larger audience of neuroscientists.
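The abstract describes generating GPU code directly from high-level Brian model definitions written in Python. A minimal sketch of what such a script might look like is given below, following Brian2CUDA's documented pattern of importing the package and selecting the "cuda_standalone" device; the network itself (neuron count, equations, connection probability, delays) is purely illustrative and not taken from the talk.

```python
from brian2 import *
import brian2cuda  # makes the CUDA backend available to Brian

# Select Brian2CUDA's standalone GPU device; the generated code is
# compiled for CUDA and run on the GPU instead of Brian's C++ CPU backend.
set_device("cuda_standalone")

# Illustrative leaky integrate-and-fire population, defined only by its
# equations -- the simulator generates the low-level integration code.
tau = 10*ms
eqs = "dv/dt = (1.0 - v) / tau : 1"
neurons = NeuronGroup(10000, eqs, threshold="v > 0.8", reset="v = 0",
                      method="exact")
neurons.v = "rand()"

# Sparse recurrent synapses with heterogeneous delays, one of the
# features highlighted in the abstract.
syn = Synapses(neurons, neurons, on_pre="v += 0.01")
syn.connect(p=0.02)
syn.delay = "rand() * 5*ms"

spikes = SpikeMonitor(neurons)
run(1*second)
print("Spikes recorded:", spikes.num_spikes)
```

Switching between the GPU backend and Brian's default CPU backend amounts to changing the `set_device` call, so the same model definition can be benchmarked on both.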

Seminar · Neuroscience

The Brain’s Constraints on Human Number Concepts

Andreas Nieder
University of Tübingen
May 26, 2021

Although animals can estimate numerical quantities, true counting and arithmetic abilities are unique to humans and are inextricably linked to symbolic competence. However, our unprecedented numerical skills are deeply rooted in our neuronal heritage as primates and vertebrates. I argue that numerical competence in humans is the result of three neural constraints. First, I propose that the neuronal mechanisms of quantity estimation are part of our evolutionary heritage and can be witnessed across primate and vertebrate phylogeny. Second, I suggest that a basic understanding of number, what numerical quantity means, is innately wired into the brain and gives rise to an intuitive number sense, or number instinct. Third and finally, I argue that symbolic counting and arithmetic in humans is rooted in an evolutionarily and ontogenetically primeval neural system for non-symbolic number representations. These three neural constraints jointly determine the basic processing of number concepts in the human mind.

ePoster · Neuroscience

Arithmetic value representation for hierarchical behavior composition

Hiroshi Makino

COSYNE 2022

Arithmetic coverage: 3 items (2 seminars, 1 ePoster)