Supervised Learning Algorithms for Spiking Neuromorphic Hardware

Research output: Thesis › Master's Thesis › Research

Abstract

Analog neuromorphic hardware enables highly accelerated and, in terms of dissipated power, efficient emulations of spiking neural networks (SNNs) compared to simulations on conventional computers. This thesis presents a supervised learning algorithm for hardware-emulated feedforward SNNs with limited resolution of their synaptic efficacies. Training is done in conjunction with an abstract software model of the network, which performs the parameter updates based on the recorded network activity of the SNN. Despite the noise inherent in analog circuitry, the algorithm was successfully tested on two classification problems, the classical XOR problem and a subset of the MNIST dataset, achieving good classification performance. As shown in a software simulation of the SNNs, the performance could be improved further if the hardware variations were smaller.
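The in-the-loop training scheme described in the abstract can be sketched as follows. This is a minimal, non-spiking stand-in: a rate-based network replaces the SNN, `hardware_forward` with additive Gaussian noise stands in for the analog emulation, and the weight grid (`LEVELS`, `W_MAX`) is an illustrative assumption about the limited synaptic resolution, not the thesis's actual implementation. The key idea is that high-precision "shadow" weights live in the software model, while the "hardware" only ever sees quantized, noisy copies, and updates are computed from the recorded (noisy) activity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hardware constraints: weights snap to a coarse grid.
LEVELS = 15   # illustrative number of discrete weight levels
W_MAX = 2.0   # illustrative weight range

def quantize(w):
    """Snap weights to the limited synaptic resolution of the hardware."""
    step = 2 * W_MAX / LEVELS
    return np.clip(np.round(w / step) * step, -W_MAX, W_MAX)

def hardware_forward(x, w1, w2, noise=0.05):
    """Stand-in for the analog emulation: quantized weights plus noise."""
    h = np.tanh(x @ quantize(w1) + noise * rng.standard_normal(4))
    y = h @ quantize(w2) + noise * rng.standard_normal()
    return h, y

# XOR task (one of the two classification problems in the thesis).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

# High-precision shadow weights, held in the software model.
w1 = rng.standard_normal((2, 4)) * 0.5
w2 = rng.standard_normal(4) * 0.5

lr = 0.1
for epoch in range(3000):
    for x, t in zip(X, T):
        # Record activity from the noisy, quantized "hardware".
        h, y = hardware_forward(x, w1, w2)
        # The software model computes updates from the recorded activity
        # and applies them to the full-precision shadow weights.
        err = y - t
        w2 -= lr * err * h
        w1 -= lr * np.outer(x, err * w2 * (1 - h ** 2))

# Evaluate with noise switched off; predictions for the four patterns.
pred = [hardware_forward(x, w1, w2, noise=0.0)[1] for x in X]
print([round(float(p)) for p in pred])
```

The split into shadow weights plus a quantized forward pass mirrors the division of labor in the abstract: the network activity is produced by the (noisy, low-resolution) hardware, while parameter updates happen in an idealized software model.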
Original language: English
Publication status: Published - 22 Feb 2017

Fingerprint

Supervised learning
Learning algorithms
Neural networks
Hardware
Feedforward neural networks

Cite this

Supervised Learning Algorithms for Spiking Neuromorphic Hardware. / Limbacher, Thomas.

2017. 75 p.

Research output: Thesis › Master's Thesis › Research

@mastersthesis{b13fb2f0da5c4b0ebdb821ae599082f3,
title = "Supervised Learning Algorithms for Spiking Neuromorphic Hardware",
abstract = "Analog neuromorphic hardware enables highly accelerated and, in terms of dissipated power, efficient emulations of spiking neural networks (SNNs) compared to simulations on conventional computers. This thesis presents a supervised learning algorithm for hardware-emulated feedforward SNNs with limited resolution of their synaptic efficacies. Training is done in conjunction with an abstract software model of the network, which performs the parameter updates based on the recorded network activity of the SNN. Despite the noise inherent in analog circuitry, the algorithm was successfully tested on two classification problems, the classical XOR problem and a subset of the MNIST dataset, achieving good classification performance. As shown in a software simulation of the SNNs, the performance could be improved further if the hardware variations were smaller.",
author = "Thomas Limbacher",
year = "2017",
month = "2",
day = "22",
language = "English",

}

TY - THES

T1 - Supervised Learning Algorithms for Spiking Neuromorphic Hardware

AU - Limbacher, Thomas

PY - 2017/2/22

Y1 - 2017/2/22

N2 - Analog neuromorphic hardware enables highly accelerated and, in terms of dissipated power, efficient emulations of spiking neural networks (SNNs) compared to simulations on conventional computers. This thesis presents a supervised learning algorithm for hardware-emulated feedforward SNNs with limited resolution of their synaptic efficacies. Training is done in conjunction with an abstract software model of the network, which performs the parameter updates based on the recorded network activity of the SNN. Despite the noise inherent in analog circuitry, the algorithm was successfully tested on two classification problems, the classical XOR problem and a subset of the MNIST dataset, achieving good classification performance. As shown in a software simulation of the SNNs, the performance could be improved further if the hardware variations were smaller.

AB - Analog neuromorphic hardware enables highly accelerated and, in terms of dissipated power, efficient emulations of spiking neural networks (SNNs) compared to simulations on conventional computers. This thesis presents a supervised learning algorithm for hardware-emulated feedforward SNNs with limited resolution of their synaptic efficacies. Training is done in conjunction with an abstract software model of the network, which performs the parameter updates based on the recorded network activity of the SNN. Despite the noise inherent in analog circuitry, the algorithm was successfully tested on two classification problems, the classical XOR problem and a subset of the MNIST dataset, achieving good classification performance. As shown in a software simulation of the SNNs, the performance could be improved further if the hardware variations were smaller.

M3 - Master's Thesis

ER -