R110
Iterative Algorithms for Graphical Models
Robert Mateescu
Abstract
Probabilistic inference in Bayesian networks, and even reasoning within error bounds, are known to be NP-hard problems. Our research investigates approximate message-passing algorithms inspired by Pearl's belief propagation algorithm and by variable elimination. We study the advantages of bounded inference provided by anytime schemes such as Mini-Clustering (MC), and combine them with the virtues of iterative algorithms such as Iterative Belief Propagation (IBP). The resulting hybrid algorithm, Iterative Join-Graph Propagation (IJGP), is shown empirically to surpass the performance of both MC and IBP on several classes of networks. IJGP can also be viewed as a Generalized Belief Propagation algorithm, a framework that has recently allowed connections with approximate algorithms from statistical physics, showing that its convergence points are in fact stationary points of the Bethe (or the more general Kikuchi) free energy. Although there is still little understanding of why or when IBP works well, it performs remarkably well on several classes of problems, most notably coding and satisfiability problems. We investigate iterative algorithms for Bayesian networks by making connections with well-known constraint processing algorithms, which help explain when IBP correctly infers extreme beliefs. This study gives an account of why iterating helps, and identifies classes of easy and hard problems for IBP (and IJGP). Finally, we plan to investigate iterative message-passing algorithms in other graph-based frameworks such as influence diagrams and planning.
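To make the message-passing idea concrete, the following is a minimal sketch of iterative (loopy) belief propagation on a small pairwise model. The three-variable cycle, the potentials, and the fixed iteration count are illustrative assumptions, not material from the report; IJGP itself passes messages over clusters of a join-graph rather than over single variables.

```python
import numpy as np

# Illustrative sketch only: iterative (loopy) belief propagation on a tiny
# pairwise model. The graph, potentials, and iteration count are assumptions
# for demonstration, not taken from the report.

# Three binary variables arranged in a cycle: 0-1, 1-2, 2-0.
edges = [(0, 1), (1, 2), (2, 0)]
unary = {i: np.array([0.6, 0.4]) for i in range(3)}                 # phi_i(x_i)
pairwise = {e: np.array([[1.2, 0.8], [0.8, 1.2]]) for e in edges}   # psi_ij(x_i, x_j)

def neighbors(i):
    return [j for (a, b) in edges for j in ((b,) if a == i else (a,) if b == i else ())]

# messages[(i, j)] is the message from variable i to variable j, initialized uniformly.
messages = {(i, j): np.ones(2) / 2 for (a, b) in edges for (i, j) in [(a, b), (b, a)]}

for _ in range(50):  # iterate the updates, hoping to reach a fixed point
    new = {}
    for (i, j) in messages:
        psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
        # Product of the unary potential and all incoming messages to i except the one from j.
        incoming = unary[i].copy()
        for k in neighbors(i):
            if k != j:
                incoming *= messages[(k, i)]
        m = incoming @ psi            # sum over x_i of phi_i(x_i) * psi_ij(x_i, x_j) * incoming(x_i)
        new[(i, j)] = m / m.sum()     # normalize for numerical stability
    messages = new

# Approximate marginals (beliefs) after the iterations.
for i in range(3):
    b = unary[i].copy()
    for k in neighbors(i):
        b *= messages[(k, i)]
    print(f"belief({i}) ~ {b / b.sum()}")
```

On a tree-structured model these updates converge to the exact marginals; on loopy graphs such as the cycle above they may only approximate them, which is the regime the report's analysis of IBP and IJGP addresses.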