Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring

David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

We reexamine in this article the conceptual and mathematical framework for understanding
the organization of plasticity in spiking neural networks. We propose
that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference
by sampling from a posterior distribution of synaptic parameters. This
view provides a viable alternative to existing models that propose convergence of
synaptic weights to maximum likelihood parameters. It explains how priors on
weight distributions and connection probabilities can be merged optimally with
learned experience. In simulations we show that our model for synaptic plasticity
allows spiking neural networks to compensate continuously for unforeseen disturbances.
Furthermore, it provides a normative mathematical framework to better
understand the permanent variability and rewiring observed in brain networks.
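The abstract frames plasticity as sampling synaptic parameters from a posterior distribution, rather than converging to a maximum likelihood point estimate. As a rough illustration only (a toy one-parameter model, not the paper's spiking-network formulation), the sketch below runs Langevin dynamics on a single parameter with a Gaussian prior and Gaussian likelihood: noisy gradient ascent on the log-posterior whose stationary distribution is the posterior itself. All variable names and the model setup are illustrative assumptions.

```python
import numpy as np

# Toy illustration: stochastic parameter dynamics as posterior sampling.
# A single parameter theta drifts along the gradient of the log-posterior
# while being perturbed by noise (Euler-Maruyama discretization of
# d theta = b * grad log p(theta | data) dt + sqrt(2 b) dW), so its
# stationary distribution is the posterior, not a point estimate.

rng = np.random.default_rng(0)

mu0, sigma0 = 0.0, 1.0            # Gaussian prior: mean, std
sigma = 0.5                       # Gaussian observation noise std
data = np.array([0.9, 1.1, 1.0])  # toy observations

def grad_log_posterior(theta):
    # d/dtheta [ log p(theta) + sum_i log p(x_i | theta) ]
    return (mu0 - theta) / sigma0**2 + np.sum(data - theta) / sigma**2

dt, b = 1e-3, 1.0                 # step size and mobility constant
theta = 0.0
samples = []
for step in range(200_000):
    theta += b * grad_log_posterior(theta) * dt \
             + np.sqrt(2 * b * dt) * rng.standard_normal()
    if step > 20_000:             # discard burn-in
        samples.append(theta)

# Closed-form Gaussian posterior for comparison
post_var = 1.0 / (1.0 / sigma0**2 + len(data) / sigma**2)
post_mean = post_var * (mu0 / sigma0**2 + data.sum() / sigma**2)

print(np.mean(samples), post_mean)
```

The empirical mean and variance of the sampled trajectory approach the analytic posterior mean and variance, which is the sense in which noisy plasticity can "carry out probabilistic inference" over parameters.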
Original language: English
Title of host publication: Proceedings of NIPS
Publication status: Accepted/In press - 2015

Fingerprint

  • Plasticity
  • Sampling
  • Neural networks
  • Maximum likelihood
  • Brain

Fields of Expertise

  • Information, Communication & Computing

Cite this

Kappel, D., Habenschuss, S., Legenstein, R., & Maass, W. (Accepted/In press). Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring. In Proceedings of NIPS.


@inproceedings{7e41b4a4c4714c408daa37563c4e5ec2,
title = "Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring",
abstract = "We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks. We propose that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference by sampling from a posterior distribution of synaptic parameters. This view provides a viable alternative to existing models that propose convergence of synaptic weights to maximum likelihood parameters. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience. In simulations we show that our model for synaptic plasticity allows spiking neural networks to compensate continuously for unforeseen disturbances. Furthermore, it provides a normative mathematical framework to better understand the permanent variability and rewiring observed in brain networks.",
author = "David Kappel and Stefan Habenschuss and Robert Legenstein and Wolfgang Maass",
year = "2015",
language = "English",
booktitle = "Proceedings of NIPS",

}

TY - GEN

T1 - Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring

AU - Kappel, David

AU - Habenschuss, Stefan

AU - Legenstein, Robert

AU - Maass, Wolfgang

PY - 2015

Y1 - 2015

N2 - We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks. We propose that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference by sampling from a posterior distribution of synaptic parameters. This view provides a viable alternative to existing models that propose convergence of synaptic weights to maximum likelihood parameters. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience. In simulations we show that our model for synaptic plasticity allows spiking neural networks to compensate continuously for unforeseen disturbances. Furthermore, it provides a normative mathematical framework to better understand the permanent variability and rewiring observed in brain networks.

AB - We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks. We propose that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference by sampling from a posterior distribution of synaptic parameters. This view provides a viable alternative to existing models that propose convergence of synaptic weights to maximum likelihood parameters. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience. In simulations we show that our model for synaptic plasticity allows spiking neural networks to compensate continuously for unforeseen disturbances. Furthermore, it provides a normative mathematical framework to better understand the permanent variability and rewiring observed in brain networks.

UR - https://papers.nips.cc/paper/5952-synaptic-sampling-a-bayesian-approach-to-neural-network-plasticity-and-rewiring.pdf

M3 - Conference contribution

BT - Proceedings of NIPS

ER -