Implementation of Bayesian Inference in Distributed Neural Networks

Zhaofei Yu, Tiejun Huang, Jian K. Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

Numerous neuroscience experiments have suggested that the cognitive processes of the human brain are realized as probabilistic reasoning and can be further modeled as Bayesian inference. It is still unclear, however, how Bayesian inference could be implemented by neural underpinnings in the brain. Here we present a novel Bayesian inference algorithm based on importance sampling. By distributed sampling through a deep tree structure with simple and stackable basic motifs for any given neural circuit, one can perform local inference while guaranteeing the accuracy of global inference. We show that these task-independent motifs can be used in parallel for fast inference without iteration or scale limitation. Furthermore, simulations with a small-scale neural network demonstrate that our distributed sampling-based algorithm, consistent with our theoretical analysis, can approximate Bayesian inference. Taken together, we provide a proof-of-principle for using distributed neural networks to implement Bayesian inference, which gives a road map for large-scale Bayesian network implementation based on spiking neural networks with computer hardware, including neuromorphic chips.
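To make the importance-sampling idea in the abstract concrete, the sketch below runs plain importance sampling (likelihood weighting) for posterior inference on a toy tree-structured Bayesian network in Python. It is a minimal illustration under assumptions: the network, probabilities, and names (A, B, C, posterior_A) are invented here, and the code does not reproduce the paper's distributed neural-motif scheme, only the underlying sampling principle.

import numpy as np

# Minimal illustrative sketch: importance sampling (likelihood weighting)
# for posterior inference in a toy tree-structured Bayesian network.
# The network and probabilities are invented for illustration; this is not
# the authors' distributed neural-motif implementation.

rng = np.random.default_rng(0)

# Tree-structured network A -> B, A -> C, all variables binary.
P_A = np.array([0.3, 0.7])                  # prior P(A)
P_B_given_A = np.array([[0.8, 0.2],         # row a: P(B | A=a)
                        [0.1, 0.9]])
P_C_given_A = np.array([[0.6, 0.4],         # row a: P(C | A=a)
                        [0.2, 0.8]])

def posterior_A(b_obs, c_obs, n_samples=100_000):
    """Estimate P(A | B=b_obs, C=c_obs) by importance sampling.

    Proposal: draw A from its prior; weight each draw by the likelihood
    of the observed evidence under that draw (likelihood weighting).
    """
    weights = np.zeros(2)
    a_samples = rng.choice(2, size=n_samples, p=P_A)
    for a in a_samples:
        weights[a] += P_B_given_A[a, b_obs] * P_C_given_A[a, c_obs]
    return weights / weights.sum()

if __name__ == "__main__":
    estimate = posterior_A(b_obs=1, c_obs=1)
    # Exact posterior via Bayes' rule on the same toy model, for comparison.
    joint = P_A * P_B_given_A[:, 1] * P_C_given_A[:, 1]
    print("importance-sampling estimate:", estimate)
    print("exact posterior:             ", joint / joint.sum())

In the paper's setting, as described in the abstract, such local weighted-sampling computations would be carried out in parallel by stackable circuit motifs arranged in a tree, rather than by a single sequential loop.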

Original language: English
Title of host publication: Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018
Publisher: Institute of Electrical and Electronics Engineers
Pages: 666-673
Number of pages: 8
ISBN (Electronic): 9781538649756
DOIs: 10.1109/PDP2018.2018.00111
Publication status: Published - 6 Jun 2018
Event: 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018 - Cambridge, United Kingdom
Duration: 21 Mar 2018 - 23 Mar 2018

Conference

Conference: 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018
Country: United Kingdom
City: Cambridge
Period: 21/03/18 - 23/03/18

Keywords

  • Bayesian inference
  • distributed neural network
  • importance sampling
  • neural implementation

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Hardware and Architecture

Cite this

Yu, Z., Huang, T., & Liu, J. K. (2018). Implementation of Bayesian Inference in Distributed Neural Networks. In Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018 (pp. 666-673). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/PDP2018.2018.00111

Implementation of Bayesian Inference in Distributed Neural Networks. / Yu, Zhaofei; Huang, Tiejun; Liu, Jian K.

Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018. Institute of Electrical and Electronics Engineers, 2018. p. 666-673.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Yu, Z, Huang, T & Liu, JK 2018, Implementation of Bayesian Inference in Distributed Neural Networks. in Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018. Institute of Electrical and Electronics Engineers, pp. 666-673, 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018, Cambridge, United Kingdom, 21/03/18. https://doi.org/10.1109/PDP2018.2018.00111
Yu Z, Huang T, Liu JK. Implementation of Bayesian Inference in Distributed Neural Networks. In Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018. Institute of Electrical and Electronics Engineers. 2018. p. 666-673 https://doi.org/10.1109/PDP2018.2018.00111
Yu, Zhaofei ; Huang, Tiejun ; Liu, Jian K. / Implementation of Bayesian Inference in Distributed Neural Networks. Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018. Institute of Electrical and Electronics Engineers, 2018. pp. 666-673
@inproceedings{1f1be6d454f5476199fe9173d45d490c,
title = "Implementation of Bayesian Inference in Distributed Neural Networks",
abstract = "Numerous neuroscience experiments have suggested that the cognitive processes of the human brain are realized as probabilistic reasoning and can be further modeled as Bayesian inference. It is still unclear, however, how Bayesian inference could be implemented by neural underpinnings in the brain. Here we present a novel Bayesian inference algorithm based on importance sampling. By distributed sampling through a deep tree structure with simple and stackable basic motifs for any given neural circuit, one can perform local inference while guaranteeing the accuracy of global inference. We show that these task-independent motifs can be used in parallel for fast inference without iteration or scale limitation. Furthermore, simulations with a small-scale neural network demonstrate that our distributed sampling-based algorithm, consistent with our theoretical analysis, can approximate Bayesian inference. Taken together, we provide a proof-of-principle for using distributed neural networks to implement Bayesian inference, which gives a road map for large-scale Bayesian network implementation based on spiking neural networks with computer hardware, including neuromorphic chips.",
keywords = "Bayesian inference, distributed neural network, importance sampling, neural implementation",
author = "Zhaofei Yu and Tiejun Huang and Liu, {Jian K.}",
year = "2018",
month = "6",
day = "6",
doi = "10.1109/PDP2018.2018.00111",
language = "English",
pages = "666--673",
booktitle = "Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018",
publisher = "Institute of Electrical and Electronics Engineers",
address = "United States",

}

TY - GEN

T1 - Implementation of Bayesian Inference in Distributed Neural Networks

AU - Yu, Zhaofei

AU - Huang, Tiejun

AU - Liu, Jian K.

PY - 2018/6/6

Y1 - 2018/6/6

N2 - Numerous neuroscience experiments have suggested that the cognitive processes of the human brain are realized as probabilistic reasoning and can be further modeled as Bayesian inference. It is still unclear, however, how Bayesian inference could be implemented by neural underpinnings in the brain. Here we present a novel Bayesian inference algorithm based on importance sampling. By distributed sampling through a deep tree structure with simple and stackable basic motifs for any given neural circuit, one can perform local inference while guaranteeing the accuracy of global inference. We show that these task-independent motifs can be used in parallel for fast inference without iteration or scale limitation. Furthermore, simulations with a small-scale neural network demonstrate that our distributed sampling-based algorithm, consistent with our theoretical analysis, can approximate Bayesian inference. Taken together, we provide a proof-of-principle for using distributed neural networks to implement Bayesian inference, which gives a road map for large-scale Bayesian network implementation based on spiking neural networks with computer hardware, including neuromorphic chips.

AB - Numerous neuroscience experiments have suggested that the cognitive processes of the human brain are realized as probabilistic reasoning and can be further modeled as Bayesian inference. It is still unclear, however, how Bayesian inference could be implemented by neural underpinnings in the brain. Here we present a novel Bayesian inference algorithm based on importance sampling. By distributed sampling through a deep tree structure with simple and stackable basic motifs for any given neural circuit, one can perform local inference while guaranteeing the accuracy of global inference. We show that these task-independent motifs can be used in parallel for fast inference without iteration or scale limitation. Furthermore, simulations with a small-scale neural network demonstrate that our distributed sampling-based algorithm, consistent with our theoretical analysis, can approximate Bayesian inference. Taken together, we provide a proof-of-principle for using distributed neural networks to implement Bayesian inference, which gives a road map for large-scale Bayesian network implementation based on spiking neural networks with computer hardware, including neuromorphic chips.

KW - Bayesian inference

KW - distributed neural network

KW - importance sampling

KW - neural implementation

UR - http://www.scopus.com/inward/record.url?scp=85048820930&partnerID=8YFLogxK

U2 - 10.1109/PDP2018.2018.00111

DO - 10.1109/PDP2018.2018.00111

M3 - Conference contribution

SP - 666

EP - 673

BT - Proceedings - 26th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing, PDP 2018

PB - Institute of Electrical and Electronics Engineers

ER -