Distributed Bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition

Johannes Bill, Lars Holger Büsing, Stefan Habenschuss, Bernhard Nessler, Wolfgang Maass, Robert Legenstein

Research output: Contribution to journal › Article › Research › peer-review

Abstract

During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning, and learning in mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity at the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters that correspond to Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.
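The mechanism the abstract describes — local circuits in which lateral inhibition enforces competition, stochastic spiking implements sampling, and Hebbian synaptic plus homeostatic intrinsic plasticity shape the code — can be illustrated with a toy simulation. The sketch below is not the authors' implementation: the circuit size, input statistics, learning rates, and the specific exponential Hebbian rule (of the kind used in spike-based expectation-maximization work by several of the same authors) are illustrative assumptions for a single winner-take-all circuit.

    import numpy as np

    rng = np.random.default_rng(0)

    D, K = 20, 4                # input dimension; neurons per local circuit (assumed sizes)
    eta_w, eta_b = 0.05, 0.01   # synaptic and intrinsic learning rates (assumed)
    w = rng.normal(-1.0, 0.1, size=(K, D))   # feedforward weights
    b = np.zeros(K)                          # intrinsic excitabilities

    def sample_winner(x):
        # Local lateral inhibition normalizes the circuit: exactly one neuron
        # spikes, drawn from a softmax over membrane potentials (the sampling step).
        u = b + w @ x
        p = np.exp(u - u.max())
        return rng.choice(K, p=p / p.sum())

    def make_input():
        # Toy stimuli: one of two binary prototypes with 5% flip noise (assumed).
        x = np.zeros(D)
        c = rng.integers(2)
        x[c * (D // 2):(c + 1) * (D // 2)] = 1.0
        return np.abs(x - (rng.random(D) < 0.05))

    for step in range(5000):
        x = make_input()
        k = sample_winner(x)
        # Hebbian update of the spiking neuron only: each weight drifts toward
        # the log-probability of its input channel being active given cause k.
        w[k] += eta_w * (x * np.exp(-w[k]) - 1.0)
        # Homeostatic intrinsic plasticity: excitabilities are nudged so each
        # neuron fires at the uniform target rate 1/K within its circuit.
        fired = np.zeros(K)
        fired[k] = 1.0
        b += eta_b * (1.0 / K - fired)

    print(np.round(w, 1))

After training, the rows of w separate into two groups tuned to the two prototypes: because homeostasis forces all four neurons to fire equally often, pairs of neurons share a stimulus — a single-circuit analogue of the distributed assemblies of similarly tuned cells described above.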
Original language: English
Pages (from-to): e0134356
Journal: PLoS ONE
Volume: 10
Issue number: 8
DOIs: 10.1371/journal.pone.0134356
Publication status: Published - 2015

Fingerprint

  • Neurons
  • Plasticity
  • Learning
  • Probability Theory
  • Cognitive Science
  • Probabilistic Models
  • Neurophysiology
  • Pyramidal Cells
  • Artificial Intelligence
  • Statistical Models
  • Synapses
  • Automatic Data Processing
  • Computer Simulation
  • Mammals
  • Cells

Fields of Expertise

  • Information, Communication & Computing

Treatment code (detailed classification)

  • Theoretical

Cite this

Distributed Bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition. / Bill, Johannes; Büsing, Lars Holger; Habenschuss, Stefan; Nessler, Bernhard; Maass, Wolfgang; Legenstein, Robert.

In: PLoS ONE, Vol. 10, No. 8, 2015, e0134356.

Research output: Contribution to journal › Article › Research › peer-review

@article{cc29a50a93004ba185c0ecaa37974d9c,
title = "Distributed Bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition",
abstract = "During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning, and learning in mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity at the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters that correspond to Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.",
author = "Johannes Bill and B{\"u}sing, {Lars Holger} and Stefan Habenschuss and Bernhard Nessler and Wolfgang Maass and Robert Legenstein",
year = "2015",
doi = "10.1371/journal.pone.0134356",
language = "English",
volume = "10",
pages = "e0134356",
journal = "PLoS ONE",
issn = "1932-6203",
publisher = "Public Library of Science",
number = "8",

}

TY - JOUR

T1 - Distributed Bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition

AU - Bill, Johannes

AU - Büsing, Lars Holger

AU - Habenschuss, Stefan

AU - Nessler, Bernhard

AU - Maass, Wolfgang

AU - Legenstein, Robert

PY - 2015

Y1 - 2015

AB - During the last decade, Bayesian probability theory has emerged as a framework in cognitive science and neuroscience for describing perception, reasoning, and learning in mammals. However, our understanding of how probabilistic computations could be organized in the brain, and how the observed connectivity structure of cortical microcircuits supports these calculations, is rudimentary at best. In this study, we investigate statistical inference and self-organized learning in a spatially extended spiking network model that accommodates both local competitive and large-scale associative aspects of neural information processing, under a unified Bayesian account. Specifically, we show how the spiking dynamics of a recurrent network with lateral excitation and local inhibition in response to distributed spiking input can be understood as sampling from a variational posterior distribution of a well-defined implicit probabilistic model. This interpretation further permits a rigorous analytical treatment of experience-dependent plasticity at the network level. Using machine learning theory, we derive update rules for neuron and synapse parameters that correspond to Hebbian synaptic and homeostatic intrinsic plasticity rules in a neural implementation. In computer simulations, we demonstrate that the interplay of these plasticity rules leads to the emergence of probabilistic local experts that form distributed assemblies of similarly tuned cells communicating through lateral excitatory connections. The resulting sparse distributed spike code of a well-adapted network carries compressed information on salient input features combined with prior experience on correlations among them. Our theory predicts that the emergence of such efficient representations benefits from network architectures in which the range of local inhibition matches the spatial extent of pyramidal cells that share common afferent input.

UR - http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0134356

DO - 10.1371/journal.pone.0134356

M3 - Article

VL - 10

SP - e0134356

JO - PLoS ONE

JF - PLoS ONE

SN - 1932-6203

IS - 8

ER -