Mixture Model for Approximate Inference in Bayesian Networks
Bayesian networks are a useful tool for reasoning under uncertainty. However, exact inference in a Bayesian network is NP-hard in general. In this thesis, we propose an approximate inference algorithm that trains a hierarchical mixture model to answer marginal queries for a given Bayesian network. The mixture model can be viewed as a hierarchical clustering index structure that supports fast learning and inference. We use a large number of hidden classes for accuracy, taking advantage of the effectively unlimited training data that can be sampled from the Bayesian network itself. The method also applies to inference in hybrid Bayesian networks, where exact inference is difficult in general.
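To make the pipeline concrete, the following is a minimal sketch of the underlying idea, under simplifying assumptions: a tiny hand-coded binary network (A → B, A → C) stands in for the given Bayesian network, ancestral sampling supplies the training data, and a flat Bernoulli mixture fitted by EM stands in for the hierarchical mixture model; a marginal query is then answered from the mixture's parameters. All variable names and the network itself are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayes net: A -> B, A -> C, all variables binary (illustrative).
p_a = 0.6
p_b_given_a = np.array([0.2, 0.8])   # P(B=1 | A=0), P(B=1 | A=1)
p_c_given_a = np.array([0.3, 0.7])   # P(C=1 | A=0), P(C=1 | A=1)

def ancestral_sample(n):
    """Draw n joint samples (A, B, C) by ancestral sampling."""
    a = (rng.random(n) < p_a).astype(int)
    b = (rng.random(n) < p_b_given_a[a]).astype(int)
    c = (rng.random(n) < p_c_given_a[a]).astype(int)
    return np.column_stack([a, b, c])

def fit_mixture(x, k, iters=50):
    """Fit a k-component Bernoulli mixture to binary data x via EM."""
    n, d = x.shape
    pi = np.full(k, 1.0 / k)
    theta = rng.uniform(0.25, 0.75, size=(k, d))
    for _ in range(iters):
        # E-step: responsibilities r[n, k] in log space for stability.
        log_r = (x @ np.log(theta).T + (1 - x) @ np.log(1 - theta).T
                 + np.log(pi))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and per-class Bernoulli parameters.
        nk = r.sum(axis=0) + 1e-9
        pi = nk / n
        theta = np.clip((r.T @ x) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, theta

# Train the mixture on samples generated by the net, then answer a
# marginal query P(B=1) directly from the mixture parameters.
x = ancestral_sample(5000)
pi, theta = fit_mixture(x, k=4)
est = float(pi @ theta[:, 1])                       # sum_k pi_k * theta_k[B]
exact = p_a * p_b_given_a[1] + (1 - p_a) * p_b_given_a[0]   # = 0.56
print(f"mixture estimate P(B=1) = {est:.3f}, exact = {exact:.2f}")
```

The key property exploited here is that a marginal of a mixture is a mixture of marginals, so the query costs only a weighted sum over components rather than a sum over the full joint; a hierarchical arrangement of components, as proposed in the thesis, would additionally let queries descend the cluster tree instead of touching every class.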