Heidelberg University

An introduction to analog neuromorphic computing based on the BrainScaleS architecture

Johannes Schemmel, Heidelberg University

Abstract:

One important scientific goal of computational neuroscience is the advancement of brain-inspired computing. Continuous-time emulation technologies for modeling brain function play an important role in this endeavor, since they provide resource-efficient platforms for the bottom-up modeling of brain functions, including computationally expensive aspects such as plasticity and learning, or structured neurons.
BrainScaleS is a neuromorphic computing platform that realizes this approach to the furthest extent possible with current technologies by constructing a physical replica of the most commonly used reductionist view of the biological brain: a network of neurons connected via plastic synapses.
In this respect it is fundamentally different from most other modeling approaches within the computational neuroscience community. While the network model operates, no differential equation is solved, nor is any biological process represented by discrete-time changes of a multitude of bits forming some binary approximation of molecular biology; instead, the temporal evolution of physical quantities within the replica, such as currents and voltages, directly represents the neural dynamics.
BrainScaleS is based on standard CMOS technology; neurons and synapses are realized as mixed-signal full-custom circuits. One important goal of the BrainScaleS system is to support research into biological learning rules. This kind of research usually requires a large number of individual learning experiments, each lasting from minutes to hours in biological real time. It is therefore essential to have some form of acceleration available to keep the total experiment time manageable. The physical implementation of BrainScaleS compresses time by a factor of 1000 in the current generation: a learning process taking one hour in biology needs only 3.6 seconds on BrainScaleS.
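The time compression can be made concrete with a small calculation: with an acceleration factor of 1000, any interval of biological time maps to a wall-clock interval 1000 times shorter. A minimal sketch of this conversion (the helper name is illustrative and not part of any BrainScaleS API):

```python
# Convert biological model time to hardware wall-clock time for an
# accelerated physical emulator such as BrainScaleS.
ACCELERATION_FACTOR = 1000  # current BrainScaleS generation


def wall_clock_seconds(biological_seconds, acceleration=ACCELERATION_FACTOR):
    """Wall-clock duration of an emulation covering the given biological time."""
    return biological_seconds / acceleration


# One hour of biological learning runs in 3.6 seconds on hardware.
one_hour = 60 * 60  # 3600 s
print(wall_clock_seconds(one_hour))  # 3.6
```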
The biggest limitation of a fully continuous-time physical modeling system like BrainScaleS is its complete in-memory operation: every neuron and synapse of the emulated network must be physically present on the substrate, which bounds the network size. In its most recent version, BrainScaleS therefore adds a second operating mode: activation- or rate-based analog computing using the same accelerated synapse matrix it uses for event-based processing. The activation-based analog matrix-vector multiplication does not rely on any internal state and can therefore be time-multiplexed to realize large multi-layered structures, including convolutional network layers, based on external weight storage.
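The stateless, activation-based mode can be pictured as ordinary matrix-vector multiplication that is time-multiplexed over a single physical synapse array: each layer's weights are loaded from external storage in turn, the array computes the product, and the result feeds the next layer. A schematic NumPy sketch under this assumption (function names, layer sizes, and the ReLU nonlinearity are illustrative, not the actual hardware interface):

```python
import numpy as np

rng = np.random.default_rng(0)

# External weight storage: layer weight matrices are loaded one at a time,
# since the analog array only ever holds the currently active layer.
layers = [rng.standard_normal((256, 128)),
          rng.standard_normal((128, 64)),
          rng.standard_normal((64, 10))]


def analog_mvm(weights, activations):
    """Stand-in for one pass through the analog synapse matrix.
    On hardware this would be a single parallel analog operation."""
    return weights.T @ activations


def forward(x, layers):
    # Time-multiplexing: the same physical array is reused for every layer,
    # which works because the activation-based mode carries no internal
    # state between passes.
    for w in layers:
        x = np.maximum(analog_mvm(w, x), 0.0)  # ReLU as illustrative nonlinearity
    return x


out = forward(rng.standard_normal(256), layers)
print(out.shape)  # (10,)
```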

This five-day course is organized as follows: biological neurons and synapses, rate- and spike-based models, networks of neurons, machine learning vs. biology, simulation and emulation of neural networks, in-memory computing, physical models based on microelectronic circuits, introduction to the BrainScaleS neuromorphic architecture.