Learning High-Dimensional Mixtures of Graphical Models
Abstract
We consider unsupervised estimation of mixtures of discrete graphical models, where the class variable corresponding to the mixture components is hidden and each mixture component can have a potentially different Markov graph structure and parameters over the observed variables. We propose a novel approach for estimating the mixture components, and our output is a tree-mixture model which serves as a good approximation to the underlying graphical model mixture. Our method is efficient when the union graph, which is the union of the Markov graphs of the mixture components, has sparse vertex separators between any pair of observed variables. This includes tree mixtures and mixtures of bounded degree graphs. For such models, we prove that our method correctly recovers the union graph structure and the tree structures corresponding to maximum-likelihood tree approximations of the mixture components. The sample and computational complexities of our method scale as poly(p, r), for an r-component mixture of p-variate graphical models. We further extend our results to the case when the union graph has sparse local separators between any pair of observed variables, such as mixtures of locally tree-like graphs, and the mixture components are in the regime of correlation decay.
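Since the output described above consists of maximum-likelihood tree approximations of the mixture components, the following is a minimal sketch of that final step in isolation: the classical Chow-Liu construction, which selects the maximum-weight spanning tree under pairwise mutual information. It assumes the per-component pairwise marginals have already been estimated (which is the paper's actual contribution for the hidden-class setting); the function names, the networkx dependency, and the random tables in the demo are illustrative assumptions, not part of the paper.

```python
# Sketch only: maximum-likelihood tree approximation (Chow-Liu) for one
# mixture component, given its estimated pairwise joint distributions.
import numpy as np
import networkx as nx


def mutual_information(joint):
    """Mutual information of a 2-D joint probability table."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())


def chow_liu_tree(pairwise_joints, p):
    """Edges of the ML tree approximation.

    pairwise_joints: dict mapping (i, j), i < j, to a 2-D joint table
    for variables i and j (here assumed to be already estimated).
    """
    g = nx.Graph()
    g.add_nodes_from(range(p))
    for (i, j), joint in pairwise_joints.items():
        g.add_edge(i, j, weight=mutual_information(joint))
    # Maximum-weight spanning tree under mutual information = Chow-Liu tree.
    return sorted(nx.maximum_spanning_tree(g, weight="weight").edges())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = 5
    # Illustrative random pairwise joint tables over binary variables.
    joints = {(i, j): rng.dirichlet(np.ones(4)).reshape(2, 2)
              for i in range(p) for j in range(i + 1, p)}
    print(chow_liu_tree(joints, p))
```

In the paper's setting the pairwise statistics of each hidden component are not directly observable; this sketch only illustrates why the tree-structured output keeps the overall complexity polynomial once those component-wise marginals are available.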
Similar resources
Mixtures of Tree-Structured Probabilistic Graphical Models for Density Estimation in High Dimensional Spaces
Probabilistic graphical models reduce the number of parameters necessary to encode a joint probability distribution by exploiting independence relationships between variables. However, using these models is challenging when there are thousands of variables or more. First, both learning these models from a set of observations and exploiting them are computationally problematic. Second, the number...
High-dimensional probability density estimation with randomized ensembles of tree structured Bayesian networks
In this work we explore the Perturb and Combine idea, celebrated in supervised learning, in the context of probability density estimation in high-dimensional spaces with graphical probabilistic models. We propose a new family of unsupervised learning methods of mixtures of large ensembles of randomly generated tree or poly-tree structures. The specific feature of these methods is their scalabil...
Learning Mixtures of Tree Graphical Models
We consider unsupervised estimation of mixtures of discrete graphical models, where the class variable is hidden and each mixture component can have a potentially different Markov graph structure and parameters over the observed variables. We propose a novel method for estimating the mixture components with provable guarantees. Our output is a tree-mixture model which serves as a good approxima...
Learning Tractable Graphical Models Using Mixture of Arithmetic Circuits
In recent years, there has been a growing interest in learning tractable graphical models in which exact inference is efficient. Two main approaches are to restrict the inference complexity directly, as done by low-treewidth graphical models and arithmetic circuits (ACs), or introduce latent variables, as done by mixtures of trees, latent tree models, and sum-product networks (SPNs). In this pa...
Spectral Algorithms for Graphical Models, Lecturer: Eric
Modern machine learning tasks often deal with high-dimensional data. One typically makes some assumption on structure, like sparsity, to make learning tractable over high-dimensional instances. Another common assumption on structure is that of latent variables in the generative model. In latent variable models, one attempts to perform inference not only on observed variables, but also on unobse...
Journal: CoRR
Volume: abs/1203.0697
Year of publication: 2012