Memory-dependent computation and learning in spiking neural networks through Hebbian plasticity

Thomas Limbacher, Ozan Özdenizci, Robert Legenstein*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Spiking neural networks (SNNs) are the basis for many energy-efficient neuromorphic hardware systems. While there has been substantial progress in SNN research, artificial SNNs still lack many capabilities of their biological counterparts. In biological neural systems, memory is a key component that enables the retention of information over a huge range of temporal scales, ranging from hundreds of milliseconds up to years. While Hebbian plasticity is believed to play a pivotal role in biological memory, it has so far been analyzed mostly in the context of pattern completion and unsupervised learning in artificial neural networks and SNNs. Here, we propose that Hebbian plasticity is fundamental for computations in biological and artificial spiking neural systems. We introduce a novel memory-augmented SNN architecture that is enriched with Hebbian synaptic plasticity. We show that this Hebbian enrichment renders SNNs surprisingly versatile in terms of both their computational and their learning capabilities. It improves their abilities for out-of-distribution generalization, one-shot learning, cross-modal generative association, language processing, and reward-based learning. This suggests that powerful cognitive neuromorphic systems can be built based on this principle.
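The core idea of using Hebbian plasticity as a memory substrate can be illustrated with a minimal, rate-based sketch (this is an illustrative toy, not the authors' SNN architecture): an association between a key pattern and a value pattern is written into a weight matrix via a Hebbian outer-product update, and later recalled by presenting the key.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 64    # dimensionality of the key/value activity patterns (arbitrary choice)
eta = 1.0 # Hebbian learning rate (illustrative value)


def hebbian_write(W, key, value, eta=1.0):
    # Hebbian rule: weight change proportional to pre- times postsynaptic activity,
    # here written as an outer product that stores one key-value association.
    return W + eta * np.outer(value, key)


def hebbian_read(W, key):
    # Recall: drive the memory with the key pattern.
    return W @ key


# Start from an empty associative memory.
W = np.zeros((d, d))

# Store one association between random patterns; the key is normalized so that
# recall returns approximately the stored value.
key = rng.standard_normal(d)
key /= np.linalg.norm(key)
value = rng.standard_normal(d)
W = hebbian_write(W, key, value, eta)

recalled = hebbian_read(W, key)
# Recall fidelity: cosine similarity between stored and recalled value patterns.
cos = recalled @ value / (np.linalg.norm(recalled) * np.linalg.norm(value))
```

With a single stored pattern and a unit-norm key, recall is essentially exact; with many stored patterns, crosstalk between non-orthogonal keys limits capacity, which is one reason the paper's architecture combines such plastic memory with a trained SNN rather than relying on raw Hebbian storage alone.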
Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
Publication status: Published - 19 Dec 2023

Fields of Expertise

  • Information, Communication & Computing

