Abstract
A popular method to efficiently perform probabilistic inference in graphical models is belief propagation, also called message passing. Messages that represent local beliefs are introduced into the graph and must be updated repeatedly until convergence. When applied to graphs containing loops, the method is called Loopy Belief Propagation (LBP); only approximate inference is possible, and convergence is not guaranteed. By introducing message-scheduling methods that regulate the order in which messages are updated, the convergence rate can be increased significantly while still providing good estimates of the marginals. The most efficient scheduling method is Residual Belief Propagation (RBP), in which the message that changes the most is selected for update. We apply RBP to estimate the marginals of difficult grid graphs with Ising factors and analyze the non-convergent cases. We observe the problem of message oscillation, in which the same small set of messages is repeatedly selected for update. We introduce and evaluate two methods to suppress these oscillations. The first, called RBPnoise, injects noise into the message update when oscillation is detected. The second, called RBPnUp, takes the number of updates each message has received into account when choosing the next message to update. We show that both methods increase the convergence rate while also giving better marginals than standard RBP: RBPnoise improves the convergence rate the most, while RBPnUp gives the best marginals.
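The core idea the abstract describes, greedy residual scheduling, can be sketched on a small Ising grid. The sketch below is illustrative and not the thesis's implementation: the couplings and fields are random, the function and variable names are made up, and residuals are recomputed by brute force at every step rather than with the priority queue a practical RBP implementation would use.

```python
import math
import random

def rbp_ising_grid(n=3, beta=0.5, max_updates=5000, tol=1e-8, seed=0):
    """Residual Belief Propagation on an n x n Ising grid (sketch).

    Returns (marginals, n_updates, converged). Couplings J and fields h
    are drawn at random; all names here are illustrative.
    """
    rng = random.Random(seed)
    nodes = [(r, c) for r in range(n) for c in range(n)]
    edges = [((r, c), (r + 1, c)) for r in range(n - 1) for c in range(n)]
    edges += [((r, c), (r, c + 1)) for r in range(n) for c in range(n - 1)]
    J = {e: rng.uniform(-beta, beta) for e in edges}   # pairwise couplings
    h = {v: rng.uniform(-0.1, 0.1) for v in nodes}     # local fields
    nbrs = {v: [] for v in nodes}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    coupling = lambda i, j: J[(i, j)] if (i, j) in J else J[(j, i)]
    # one message per directed edge: a distribution over x_j in {-1, +1}
    msgs = {(i, j): [0.5, 0.5] for a, b in edges for i, j in ((a, b), (b, a))}

    def update(i, j):
        """Recompute message i -> j from the current incoming messages."""
        out = []
        for xj in (-1, +1):
            total = 0.0
            for idx, xi in enumerate((-1, +1)):
                psi = math.exp(coupling(i, j) * xi * xj + h[i] * xi)
                prod = 1.0
                for k in nbrs[i]:
                    if k != j:
                        prod *= msgs[(k, i)][idx]
                total += psi * prod
            out.append(total)
        z = out[0] + out[1]
        return [out[0] / z, out[1] / z]

    n_updates, converged = 0, False
    while n_updates < max_updates:
        # RBP: pick the message whose pending update would change it the most
        best_key, best_res, best_new = None, 0.0, None
        for key in msgs:
            new = update(*key)
            res = max(abs(a - b) for a, b in zip(new, msgs[key]))
            if res > best_res:
                best_key, best_res, best_new = key, res, new
        if best_res < tol:
            converged = True
            break
        msgs[best_key] = best_new
        n_updates += 1

    # node marginals: local field times the product of incoming messages
    marginals = {}
    for v in nodes:
        bel = []
        for idx, x in enumerate((-1, +1)):
            b = math.exp(h[v] * x)
            for k in nbrs[v]:
                b *= msgs[(k, v)][idx]
            bel.append(b)
        z = bel[0] + bel[1]
        marginals[v] = (bel[0] / z, bel[1] / z)
    return marginals, n_updates, converged
```

The oscillation problem the thesis studies appears when the same few messages keep winning the residual comparison; RBPnoise would perturb `best_new` when such a cycle is detected, and RBPnUp would down-weight `best_res` for frequently updated messages.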
Original language  English 

Qualification  Master of Science 
Supervisors/Advisors 

Award date  30 Apr 2015 
Publication status  Published  10 Mar 2015 
Keywords
 probabilistic inference
 graphical models
 factor graphs
 belief propagation
 message passing
 Ising grids
Cite this
Message Scheduling in Loopy Belief Propagation. / Rath, Michael.
2015. 70 p. Research output: Thesis › Master's Thesis › Research
TY  - THES
T1  - Message Scheduling in Loopy Belief Propagation
AU  - Rath, Michael
PY  - 2015/3/10
Y1  - 2015/3/10
N2  - A popular method to efficiently perform probabilistic inference in graphical models is belief propagation, also called message passing. Messages that represent local beliefs are introduced into the graph and must be updated repeatedly until convergence. When applied to graphs containing loops, the method is called Loopy Belief Propagation (LBP); only approximate inference is possible, and convergence is not guaranteed. By introducing message-scheduling methods that regulate the order in which messages are updated, the convergence rate can be increased significantly while still providing good estimates of the marginals. The most efficient scheduling method is Residual Belief Propagation (RBP), in which the message that changes the most is selected for update. We apply RBP to estimate the marginals of difficult grid graphs with Ising factors and analyze the non-convergent cases. We observe the problem of message oscillation, in which the same small set of messages is repeatedly selected for update. We introduce and evaluate two methods to suppress these oscillations. The first, called RBPnoise, injects noise into the message update when oscillation is detected. The second, called RBPnUp, takes the number of updates each message has received into account when choosing the next message to update. We show that both methods increase the convergence rate while also giving better marginals than standard RBP: RBPnoise improves the convergence rate the most, while RBPnUp gives the best marginals.
AB  - A popular method to efficiently perform probabilistic inference in graphical models is belief propagation, also called message passing. Messages that represent local beliefs are introduced into the graph and must be updated repeatedly until convergence. When applied to graphs containing loops, the method is called Loopy Belief Propagation (LBP); only approximate inference is possible, and convergence is not guaranteed. By introducing message-scheduling methods that regulate the order in which messages are updated, the convergence rate can be increased significantly while still providing good estimates of the marginals. The most efficient scheduling method is Residual Belief Propagation (RBP), in which the message that changes the most is selected for update. We apply RBP to estimate the marginals of difficult grid graphs with Ising factors and analyze the non-convergent cases. We observe the problem of message oscillation, in which the same small set of messages is repeatedly selected for update. We introduce and evaluate two methods to suppress these oscillations. The first, called RBPnoise, injects noise into the message update when oscillation is detected. The second, called RBPnUp, takes the number of updates each message has received into account when choosing the next message to update. We show that both methods increase the convergence rate while also giving better marginals than standard RBP: RBPnoise improves the convergence rate the most, while RBPnUp gives the best marginals.
KW  - probabilistic inference
KW  - graphical models
KW  - factor graphs
KW  - belief propagation
KW  - message passing
KW  - Ising grids
M3  - Master's Thesis
ER  -