Fault Pruning: Robust Training of Neural Networks with Memristive Weights

Ceca Kraisnikovic, Spyros Stathopoulos, Themis Prodromakis, Robert Legenstein*

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

Neural networks with memristive memory for weights have been proposed as an energy-efficient solution for scaling up neural network implementations. However, training such memristive neural networks remains challenging due to various memristor imperfections and faulty memristive elements. These imperfections and faults become increasingly severe as the density of memristor arrays is increased to scale up weight memory. We propose fault pruning, a robust training scheme for memristive neural networks based on the idea of identifying faulty memristive behavior on the fly during training and pruning the corresponding connections. We test this algorithm in simulations of memristive neural networks with both feed-forward and convolutional architectures on standard object recognition data sets. We show its ability to mitigate the detrimental effect of memristor faults on network training.
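The abstract states the core mechanism only at a high level: detect memristive devices that misbehave during training and prune the affected connections. The sketch below is a minimal NumPy illustration of one plausible reading of that idea, assuming a simple stuck-device test (a device that ignores a requested conductance change is flagged) and a binary pruning mask applied to the realized weights. The function names, the threshold eps, and the detection criterion are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def update_fault_mask(delta_requested, g_before, g_after, mask, eps=1e-6):
        """Flag devices that ignored a non-negligible programming request.

        delta_requested: conductance changes the training step asked for.
        g_before, g_after: conductances read out before/after programming.
        mask: binary pruning mask (1 = active, 0 = pruned), updated in place.
        """
        requested = np.abs(delta_requested) > eps      # an update was attempted
        responded = np.abs(g_after - g_before) > eps   # the device actually moved
        mask[requested & ~responded] = 0.0             # prune stuck devices
        return mask

    def effective_weights(g, mask, g_ref=0.0):
        """Weights realized by the array: pruned connections contribute zero."""
        return mask * (g - g_ref)

    # Toy usage: one programming step with a single simulated stuck device.
    rng = np.random.default_rng(0)
    g = rng.uniform(0.0, 1.0, size=(4, 4))   # device conductances
    delta = rng.normal(scale=0.1, size=g.shape)
    g_new = g + delta
    g_new[0, 0] = g[0, 0]                    # device (0, 0) is stuck
    mask = update_fault_mask(delta, g, g_new, np.ones_like(g))
    w = effective_weights(g_new, mask, g_ref=0.5)

Once a connection is pruned in this way, subsequent forward passes simply skip it, so a permanently faulty device can no longer distort training.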
Original language: English
Publication status: Accepted/In press - 2023
Event: 20th International Conference on Unconventional Computation and Natural Computation - University of North Florida, Jacksonville, United States
Duration: 13 Mar 2023 – 17 Mar 2023
https://sites.google.com/view/ucnc2023/

Conference

Conference: 20th International Conference on Unconventional Computation and Natural Computation
Abbreviated title: UCNC 2023
Country/Territory: United States
City: Jacksonville
Period: 13/03/23 – 17/03/23
Internet address: https://sites.google.com/view/ucnc2023/

Keywords

  • neural networks
  • memristors
  • robust training
  • memristor faults
  • network pruning
