Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring

David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

We reexamine in this article the conceptual and mathematical framework for understanding
the organization of plasticity in spiking neural networks. We propose
that inherent stochasticity enables synaptic plasticity to carry out probabilistic inference
by sampling from a posterior distribution of synaptic parameters. This
view provides a viable alternative to existing models that propose convergence of
synaptic weights to maximum likelihood parameters. It explains how priors on
weight distributions and connection probabilities can be merged optimally with
learned experience. In simulations we show that our model for synaptic plasticity
allows spiking neural networks to compensate continuously for unforeseen disturbances.
Furthermore, it provides a normative mathematical framework for better
understanding the permanent variability and rewiring observed in brain networks.
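The core idea of the abstract, that stochastic parameter dynamics can implement posterior sampling rather than maximum-likelihood convergence, can be illustrated with a minimal Langevin-dynamics sketch. This is an assumption-laden toy model, not the paper's spiking-network rule: a single scalar parameter `theta` with a Gaussian prior and Gaussian likelihood drifts along the gradient of the log posterior while being perturbed by noise, so its stationary distribution is the posterior itself.

```python
import numpy as np

# Hedged toy model (not the authors' spiking-network implementation):
# sample theta from p(theta | data) via Langevin dynamics, where
#   prior:      theta ~ N(0, 1)
#   likelihood: x_i   ~ N(theta, 1)
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)

def grad_log_posterior(theta):
    # d/dtheta [log prior + log likelihood] for the Gaussian model above
    return -theta + np.sum(data - theta)

eta = 1e-3      # integration step size
theta = 0.0
samples = []
for step in range(20000):
    # deterministic drift toward high-posterior regions + stochastic
    # "plasticity noise" scaled so the stationary law is the posterior
    noise = rng.normal(0.0, np.sqrt(2 * eta))
    theta += eta * grad_log_posterior(theta) + noise
    if step > 5000:                 # discard burn-in
        samples.append(theta)

# For this conjugate Gaussian model the posterior mean is sum(x) / (n + 1),
# so the empirical mean of the samples should land close to it.
posterior_mean = np.sum(data) / (len(data) + 1)
print(np.mean(samples), posterior_mean)
```

Because the noise never vanishes, `theta` keeps fluctuating around the posterior mode indefinitely; this is the sense in which permanent parameter variability is compatible with, and in fact required by, sampling-based inference.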
Original language: English
Title of host publication: Proceedings of NIPS
Publication status: Accepted/In press - 2015

Fields of Expertise

  • Information, Communication & Computing
