Biologically-inspired training of spiking recurrent neural networks with neuromorphic hardware

Thomas Bohnstingl*, Anja Surina, Maxime Fabre, Yigit Demirag, Charlotte Frenkel, Melika Payvand, Giacomo Indiveri, Angeliki Pantazi

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

Recurrent spiking neural networks (SNNs) are inspired by the working principles of biological nervous systems, which offer unique temporal dynamics and event-based processing. Recently, the error backpropagation through time (BPTT) algorithm has been successfully employed to train SNNs offline, achieving performance comparable to artificial neural networks (ANNs) on complex tasks. However, BPTT has severe limitations in online learning scenarios, where the network is required to simultaneously process and learn from incoming data. Specifically, because BPTT separates the inference and update phases, it would require storing all neuronal states to calculate the weight updates backwards in time. To address these fundamental issues, alternative credit assignment schemes are required. Within this context, neuromorphic hardware (NMHW) implementations of SNNs can greatly benefit from in-memory computing (IMC) concepts that follow the brain-inspired collocation of memory and processing, further enhancing their energy efficiency. In this work, we utilize e-prop, a biologically-inspired, local, and online training algorithm compatible with IMC that approximates BPTT, and present an approach to support both inference and training of a recurrent SNN using NMHW. To do so, we embed the SNN weights on an in-memory computing NMHW with phase-change memory (PCM) devices and integrate it into a hardware-in-the-loop training setup. We develop our approach with respect to the limited precision and imperfections of the analog devices, using a PCM-based simulation framework and an NMHW consisting of in-memory computing cores fabricated in 14nm CMOS technology with 256×256 PCM crossbar arrays. We demonstrate that our approach is robust even at 4-bit precision and achieves performance competitive with a 32-bit floating-point realization, while simultaneously equipping the SNN with online training capabilities and exploiting the acceleration benefits of NMHW.
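The local, online learning rule referenced in the abstract can be illustrated with a minimal sketch of an e-prop-style update for a recurrent network of leaky integrate-and-fire (LIF) neurons: each synapse maintains a local eligibility trace (a low-pass filter of presynaptic spikes gated by a surrogate spike derivative), and a learning signal broadcast through a random feedback matrix combines with that trace to form the weight update, so no backward pass through time is needed. All dimensions, constants, and variable names below are illustrative assumptions, not taken from the paper, and this floating-point sketch omits the PCM/4-bit quantization and hardware-in-the-loop aspects of the actual work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and constants, chosen only for illustration
n_in, n_rec, n_out, T = 20, 50, 3, 100
alpha, kappa, v_th, lr = 0.9, 0.8, 1.0, 1e-3

W_in  = rng.normal(0, 0.1, (n_rec, n_in))
W_rec = rng.normal(0, 0.1, (n_rec, n_rec))
W_out = rng.normal(0, 0.1, (n_out, n_rec))
B     = rng.normal(0, 0.1, (n_rec, n_out))   # fixed random feedback matrix

def pseudo_derivative(v, gamma=0.3):
    # Surrogate gradient of the non-differentiable spike function
    return gamma * np.maximum(0.0, 1.0 - np.abs((v - v_th) / v_th))

x = (rng.random((T, n_in)) < 0.05).astype(float)   # random input spike trains
y_target = np.zeros((T, n_out)); y_target[:, 0] = 1.0

v = np.zeros(n_rec); z = np.zeros(n_rec); y = np.zeros(n_out)
eps_rec = np.zeros((n_rec, n_rec))   # presynaptic low-pass traces per synapse
dW_rec = np.zeros_like(W_rec)

for t in range(T):
    # LIF membrane dynamics with soft reset after a spike
    v = alpha * v + W_in @ x[t] + W_rec @ z - v_th * z
    z_new = (v > v_th).astype(float)

    # Eligibility trace: filtered presynaptic activity gated by the surrogate derivative
    eps_rec = alpha * eps_rec + z[None, :]
    e = pseudo_derivative(v)[:, None] * eps_rec

    # Leaky readout and local learning signal via random feedback (no backprop in time)
    y = kappa * y + W_out @ z_new
    L = B @ (y - y_target[t])

    dW_rec += L[:, None] * e   # accumulate the online e-prop update
    z = z_new

W_rec -= lr * dW_rec           # apply the accumulated update
```

In a hardware-in-the-loop setup, the matrix-vector products above would be executed on the PCM crossbars and the accumulated updates written back to the devices; here everything runs in software for clarity.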

Original language: English
Title of host publication: Proceedings - IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2022
Publisher: Institute of Electrical and Electronics Engineers
Pages: 218-221
Number of pages: 4
ISBN (Electronic): 9781665409964
DOIs
Publication status: Published - 2022
Event: 4th IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2022 - Incheon, Korea, Republic of
Duration: 13 Jun 2022 - 15 Jun 2022

Conference

Conference: 4th IEEE International Conference on Artificial Intelligence Circuits and Systems
Abbreviated title: AICAS 2022
Country/Territory: Korea, Republic of
City: Incheon
Period: 13/06/22 - 15/06/22

Keywords

  • in-memory computing
  • neuromorphic hardware
  • online training
  • phase-change memory
  • spiking neural networks

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Human-Computer Interaction
  • Electrical and Electronic Engineering
