Variational Inference on Deep Exponential Family by using Variational Inferences on Conjugate Models

Authors

  • Mohammad Emtiyaz Khan
  • Wu Lin
Abstract

In this paper, we propose a new variational inference method for deep exponential-family (DEF) models. Our method converts non-conjugate factors in a DEF model to easy-to-compute conjugate exponential-family messages. This enables local and modular updates similar to variational message passing, as well as stochastic natural-gradient updates similar to stochastic variational inference. Such updates make our algorithm highly suitable for large-scale learning. Our method exploits the structure of the deep network and can help reduce the variance of stochastic methods for variational inference.
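
The abstract does not state the update rule, but a conjugate-computation step of the kind it describes can be sketched as the following natural-parameter recursion (a hedged sketch based on the related conjugate-computation variational inference literature, not necessarily this paper's exact algorithm):

    λ_{t+1} = (1 − ρ_t) λ_t + ρ_t ( λ_0 + ∇_μ E_{q_t}[ log p̃(x, z) ] )

Here λ_t are the natural parameters of the variational approximation q_t, λ_0 collects the natural parameters contributed by the conjugate factors, p̃ collects the non-conjugate factors, μ denotes the mean (expectation) parameters of q, and ρ_t is a step size. The gradient term plays the role of the easy-to-compute conjugate exponential-family message mentioned in the abstract: it is what permits local, message-passing-style updates per factor and stochastic natural-gradient updates over mini-batches.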

Similar Articles

Fast Variational Inference in the Conjugate Exponential Family

We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference. Our collapsed variational inference leads to a new lower bound on the marginal likelihood. We exploit the information geometry of the bound to derive much faster optimizat...


Conjugate-Computation Variational Inference: Converting Variational Inference in Non-Conjugate Models to Inferences in Conjugate Models

Variational inference is computationally challenging in models that contain both conjugate and non-conjugate terms. Methods specifically designed for conjugate models, even though computationally efficient, find it difficult to deal with non-conjugate terms. On the other hand, stochastic-gradient methods can handle the non-conjugate terms but they usually ignore the conjugate structure of the mo...
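
To make the conversion concrete, here is a minimal numerical sketch of a conjugate-computation-style update for a toy 1-D Bayesian logistic regression with a Gaussian approximation q(w). The model, data, step size, and sample counts below are illustrative assumptions rather than details from the paper; the point is only to show a non-conjugate likelihood entering the update as a gradient-based Gaussian "message" added to the conjugate prior in natural-parameter space.

    import numpy as np
    from scipy.special import expit          # numerically stable sigmoid

    rng = np.random.default_rng(0)

    # Toy 1-D logistic-regression data (illustrative, not from the paper).
    n = 200
    x = rng.normal(size=n)
    y = (rng.random(n) < expit(1.5 * x)).astype(float)   # true weight = 1.5

    # Conjugate part: Gaussian prior w ~ N(0, 10), written in natural parameters.
    lam1_prior, lam2_prior = 0.0, -0.5 / 10.0

    # Variational posterior q(w) = N(m, v), also tracked via natural parameters.
    m, v = 0.0, 1.0
    lam1, lam2 = m / v, -0.5 / v

    rho, n_mc = 0.1, 64                       # step size and Monte-Carlo samples

    for t in range(200):
        # Gradients of E_q[log p(y | x, w)] w.r.t. (m, v) via the
        # Bonnet/Price identities: d/dm = E[f'(w)], d/dv = 0.5 E[f''(w)].
        w = m + np.sqrt(v) * rng.normal(size=n_mc)
        p = expit(np.outer(w, x))                              # n_mc x n
        df_dm = ((y - p) * x).sum(axis=1).mean()
        df_dv = 0.5 * (-(p * (1.0 - p)) * x**2).sum(axis=1).mean()

        # Chain rule to mean parameters (mu1, mu2) = (m, v + m^2).
        g_mu1 = df_dm - 2.0 * m * df_dv
        g_mu2 = df_dv

        # Conjugate-computation step: blend old natural parameters with the
        # prior plus the gradient "message" from the non-conjugate likelihood.
        lam1 = (1.0 - rho) * lam1 + rho * (lam1_prior + g_mu1)
        lam2 = (1.0 - rho) * lam2 + rho * (lam2_prior + g_mu2)

        v = -0.5 / lam2                        # recover (m, v) for the next step
        m = lam1 * v

    print(f"approximate posterior: N(mean={m:.3f}, var={v:.4f})")

Because the step is expressed entirely in natural parameters, the same update applies unchanged when the likelihood term is replaced by a mini-batch estimate, which is what gives the stochastic, SVI-like variant.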


Algorithmic improvements for variational inference

Variational methods for approximate inference in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family. For them, variational Bayesian expectation maximization (VB EM) algorithms are not easily available, and gradient-based m...


A Guide to Black Box Variational Inference for Gamma Distributions

Black box variational inference (BBVI) (Ranganath et al., 2014) is a promising approach to statistical inference. It allows practitioners to avoid long derivations of updates of traditional variational inference (Wainwright and Jordan, 2008), and it can be made stochastic (Hoffman et al., 2013) to scale to millions, if not more, observed data points. For models that are non-conjugate, there are...
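
As a concrete illustration of the black-box idea, the following is a minimal sketch of the score-function (REINFORCE-style) gradient estimator with a Gamma variational distribution, applied to a toy Poisson-Gamma model whose exact posterior is known in closed form. The toy model, AdaGrad step size, and sample counts are illustrative assumptions rather than the recipe from the paper, and no variance-reduction tricks (Rao-Blackwellization, control variates) are used, so the gradient estimates are noisy.

    import numpy as np
    from scipy.special import digamma, gammaln

    rng = np.random.default_rng(1)

    # Toy model (illustrative): x_i ~ Poisson(z), z ~ Gamma(a0, b0), shape-rate form.
    x = rng.poisson(4.0, size=50)
    a0, b0 = 1.0, 1.0

    def log_joint(z):
        # log p(x, z) up to terms that do not depend on z.
        return (x.sum() + a0 - 1.0) * np.log(z) - (x.size + b0) * z

    def log_q(z, alpha, beta):
        # log density of the Gamma(alpha, beta) variational distribution.
        return alpha * np.log(beta) - gammaln(alpha) + (alpha - 1.0) * np.log(z) - beta * z

    # Optimise phi = (log alpha, log beta) so both parameters stay positive.
    phi = np.zeros(2)
    hist = np.zeros(2)                 # AdaGrad accumulator
    lr, n_mc = 0.1, 500

    for t in range(3000):
        alpha, beta = np.exp(phi)
        z = rng.gamma(alpha, 1.0 / beta, size=n_mc)      # samples from q
        f = log_joint(z) - log_q(z, alpha, beta)         # instantaneous ELBO terms
        # Score functions d log q / d alpha and d log q / d beta.
        score_a = np.log(beta) - digamma(alpha) + np.log(z)
        score_b = alpha / beta - z
        # Black-box (score-function) gradient estimate, chain-ruled to log-space.
        grad = np.array([np.mean(score_a * f) * alpha, np.mean(score_b * f) * beta])
        hist += grad ** 2
        phi += lr * grad / np.sqrt(hist + 1e-8)

    alpha, beta = np.exp(phi)
    print(f"BBVI:  Gamma({alpha:.1f}, {beta:.1f})   "
          f"exact: Gamma({a0 + x.sum():.1f}, {b0 + x.size:.1f})")

With variance reduction and per-parameter step-size adaptation, as discussed in the BBVI literature, far fewer samples per step would be needed.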


Bayesian Models of Data Streams with Hierarchical Power Priors

Making inferences from data streams is a pervasive problem in many modern data analysis applications. But it requires addressing the problem of continuous model updating and adapting to changes or drifts in the underlying data-generating distribution. In this paper, we approach these problems from a Bayesian perspective covering general conjugate exponential models. Our proposal makes use of non-...


Journal title:

Volume   Issue

Pages  -

Publication date: 2016