Abstract
Neural networks with memristive weight memory have been proposed as an energy-efficient solution for scaling up neural network implementations. However, training such memristive neural networks remains challenging due to various memristor imperfections and faulty memristive elements. These imperfections and faults become increasingly severe as the density of memristor arrays grows in order to scale up weight memory. We propose fault pruning, a robust training scheme for memristive neural networks based on the idea of identifying faulty memristive behavior on the fly during training and pruning the corresponding connections. We test this algorithm in simulations of memristive neural networks with both feed-forward and convolutional architectures on standard object recognition data sets, and show its ability to mitigate the detrimental effect of memristor faults on network training.
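The abstract does not specify implementation details of fault pruning. As a rough, purely illustrative sketch of the general idea (not the authors' actual algorithm), the following Python snippet simulates stuck memristive devices, detects them during training by comparing programmed weights with values read back from the array, and prunes the affected connections via a mask. All names, thresholds, and the fault model are hypothetical.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's method):
# "stuck" devices ignore programming and always return a fixed conductance.
rng = np.random.default_rng(0)
n_in, n_out = 8, 4

W_target = rng.normal(0, 0.1, (n_in, n_out))     # weights the optimizer wants
stuck_mask = rng.random((n_in, n_out)) < 0.05     # simulated stuck-at faults
stuck_value = rng.normal(0, 0.1, (n_in, n_out))   # value faulty devices report
prune_mask = np.ones((n_in, n_out))               # 1 = keep, 0 = pruned

def program_and_read(W):
    """Write W to the (simulated) memristor array and read it back."""
    return np.where(stuck_mask, stuck_value, W)

for step in range(100):
    # Placeholder gradient; a real run would backpropagate a task loss here.
    grad = rng.normal(0, 0.01, (n_in, n_out))
    W_target -= 0.1 * grad

    W_read = program_and_read(W_target * prune_mask)

    # Fault detection on the fly: a device whose read-back value deviates
    # strongly from the programmed value is flagged and its connection pruned.
    deviation = np.abs(W_read - W_target * prune_mask)
    prune_mask[deviation > 0.05] = 0.0

print(f"pruned {int((prune_mask == 0).sum())} of {prune_mask.size} connections")
```

In this toy version the pruned connections are simply held at zero for the rest of training; the actual scheme and fault model used in the paper may differ.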
Original language | English |
---|---|
Publication status | Accepted/In press - 2023 |
Event | 20th International Conference on Unconventional Computation and Natural Computation - University of North Florida, Jacksonville, United States. Duration: 13 Mar 2023 → 17 Mar 2023. https://sites.google.com/view/ucnc2023/ |
Conference
Conference | 20th International Conference on Unconventional Computation and Natural Computation |
---|---|
Abbreviated title | UCNC 2023 |
Country/Territory | United States |
City | Jacksonville |
Period | 13/03/23 → 17/03/23 |
Internet address | https://sites.google.com/view/ucnc2023/ |