Abstract
A popular method for efficient probabilistic inference in graphical models is belief propagation, also called message passing. Messages that represent local beliefs are introduced into the graph and updated iteratively until convergence. When applied to graphs containing loops, the method is called Loopy Belief Propagation (LBP); only approximate inference is possible, and convergence is not guaranteed. By introducing message scheduling methods that regulate the order in which messages are updated, the convergence rate can be increased significantly while still providing good estimates of the marginals. The most efficient scheduling method is Residual Belief Propagation (RBP), where the message that changes the most is selected for update. We apply RBP to estimate the marginals of difficult grid graphs with Ising factors and analyze the nonconvergent cases. We observe the problem of message oscillation, where the same small set of messages is repeatedly selected for update. We introduce and evaluate two methods to suppress or avoid these oscillations. The first method, called RBPnoise, injects noise into the message update when oscillation is detected. The second method, called RBPnUp, takes the number of message updates into account when choosing the next message to update. We show that both methods increase the convergence rate while also giving better marginals than standard RBP: RBPnoise improves the convergence rate the most, while RBPnUp gives the best marginals.
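The residual-selection rule at the core of RBP can be sketched as follows: after each step, the directed-edge message whose candidate update differs most (in max-norm) from its current value is the one applied next. This is a minimal sum-product implementation on a pairwise binary MRF; the function and variable names are our own, and the small Ising setup in the usage example is illustrative, not the thesis's experimental configuration.

```python
import numpy as np

def residual_bp(n_nodes, edges, unary, pairwise, tol=1e-8, max_updates=10_000):
    """Loopy BP with residual scheduling: at each step, update the single
    directed-edge message whose candidate update has the largest residual."""
    nbrs = {i: [] for i in range(n_nodes)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    # One message per directed edge, initialized uniformly.
    msgs = {(i, j): np.full(2, 0.5) for e in edges for (i, j) in (e, e[::-1])}

    def candidate(i, j):
        # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) psi_ij(x_i, x_j)
        #                 * prod_{k in N(i) \ {j}} m_{k->i}(x_i)
        b = unary[i].copy()
        for k in nbrs[i]:
            if k != j:
                b = b * msgs[(k, i)]
        psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
        m = b @ psi
        return m / m.sum()

    converged = False
    for _ in range(max_updates):
        best_edge, best_m, best_r = None, None, 0.0
        for e in msgs:  # RBP: pick the message with the largest residual
            m = candidate(*e)
            r = np.abs(m - msgs[e]).max()
            if r > best_r:
                best_edge, best_m, best_r = e, m, r
        if best_r < tol:
            converged = True
            break
        msgs[best_edge] = best_m

    # Node beliefs: unary factor times all incoming messages, normalized.
    marginals = []
    for i in range(n_nodes):
        b = unary[i].copy()
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        marginals.append(b / b.sum())
    return marginals, converged

# Illustrative usage: a 3-node chain with attractive Ising couplings.
unary = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.4, 0.6])]
psi = np.exp(0.5 * np.array([[1.0, -1.0], [-1.0, 1.0]]))
pairwise = {(0, 1): psi, (1, 2): psi}
marginals, ok = residual_bp(3, [(0, 1), (1, 2)], unary, pairwise)
```

On a loop-free graph such as this chain, BP is exact at convergence; the scheduling question only becomes interesting on loopy graphs like the Ising grids studied in the thesis. RBPnoise and RBPnUp would modify the argmax selection loop above (by perturbing `best_m` on detected oscillation, or by penalizing frequently updated edges in the residual comparison, respectively).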
Original language  English

Qualification  Master of Science
Supervisor / Advisor

Date of award  30 Apr 2015
Publication status  Published  10 Mar 2015