Article

Fast and energy-efficient neuromorphic deep learning with first-spike times

Journal

NATURE MACHINE INTELLIGENCE
Volume 3, Issue 9, Pages 823-835

Publisher

NATURE PORTFOLIO
DOI: 10.1038/s42256-021-00388-x

Keywords

-

Funding

  1. European Union's Horizon 2020 research and innovation programme through the ICEI project [800858]
  2. state of Baden-Württemberg through bwHPC
  3. German Research Foundation (DFG) [INST 39/963-1 FUGG]
  4. European Union [604102, 720270, 785907, 945539]
  5. Manfred Stark Foundation

Abstract

For a biological agent operating under environmental pressure, energy consumption and reaction times are of critical importance. Similarly, engineered systems are optimized for short time-to-solution and low energy-to-solution characteristics. At the level of neuronal implementation, this implies achieving the desired results with as few and as early spikes as possible. With time-to-first-spike coding, both of these goals are inherently emerging features of learning. Here, we describe a rigorous derivation of a learning rule for such first-spike times in networks of leaky integrate-and-fire neurons, relying solely on input and output spike times, and show how this mechanism can implement error backpropagation in hierarchical spiking networks. Furthermore, we emulate our framework on the BrainScaleS-2 neuromorphic system and demonstrate its capability of harnessing the system's speed and energy characteristics. Finally, we examine how our approach generalizes to other neuromorphic platforms by studying how its performance is affected by typical distortive effects induced by neuromorphic substrates.

Spiking neural networks promise fast and energy-efficient information processing. The 'time-to-first-spike' coding scheme, where the time elapsed before a neuron's first spike is used as the main variable, is a particularly efficient approach. Göltz and Kriener et al. demonstrate that error backpropagation, an essential ingredient for learning in neural networks, can be implemented in this scheme.
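To make the coding scheme concrete, the sketch below simulates a single current-based leaky integrate-and-fire neuron driven by weighted input spikes and reads out its time-to-first-spike: stronger (or earlier) input leads to an earlier output spike. This is a minimal illustration only; the function name first_spike_time, the forward-Euler integration and all parameter values are assumptions for this example and do not reproduce the paper's analytical treatment or the BrainScaleS-2 hardware emulation.

```python
# Illustrative sketch only (not the authors' implementation): a minimal
# current-based LIF neuron whose output is read out as a time-to-first-spike.

def first_spike_time(input_times, weights, tau_m=10.0, tau_s=5.0,
                     v_thresh=1.0, t_max=50.0, dt=0.01):
    """Integrate a LIF membrane with exponentially decaying synaptic current
    and return the time of the first threshold crossing (or None)."""
    v = 0.0       # membrane potential (leak potential set to 0)
    i_syn = 0.0   # synaptic current
    spikes = sorted(zip(input_times, weights))
    idx = 0
    t = 0.0
    while t < t_max:
        # deliver input spikes that have arrived by this time step
        while idx < len(spikes) and spikes[idx][0] <= t:
            i_syn += spikes[idx][1]
            idx += 1
        # forward-Euler updates for membrane and synaptic current
        v += dt * (-v / tau_m + i_syn)
        i_syn += dt * (-i_syn / tau_s)
        if v >= v_thresh:
            return t  # earlier spike <=> stronger/earlier input evidence
        t += dt
    return None       # neuron stayed silent within the observation window

# Time-to-first-spike coding: stronger input -> earlier output spike.
weak = first_spike_time(input_times=[1.0, 2.0], weights=[0.3, 0.3])
strong = first_spike_time(input_times=[1.0, 2.0], weights=[0.8, 0.8])
print("weak input, first spike at t  =", weak)
print("strong input, first spike at t =", strong)
```

Running this sketch shows the strongly driven neuron crossing threshold noticeably earlier than the weakly driven one, which is the quantity that the paper's learning rule differentiates and backpropagates through the network.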

Authors


Reviews

Primary Rating

4.8
Not enough ratings

Secondary Ratings

Novelty
-
Significance
-
Scientific rigor
-

Recommended

No Data Available