Variational Inference via Upper Bound Minimization

Authors

  • Dustin Tran
  • Rajesh Ranganath
Abstract

Variational inference (VI) is widely used as an efficient alternative to Markov chain Monte Carlo. It posits a family of approximating distributions q and finds the closest member to the exact posterior p. Closeness is usually measured via a divergence D(q||p) from q to p. While successful, this approach also has problems. Notably, it typically leads to underestimation of the posterior variance. In this paper we propose CHIVI, a black-box variational inference algorithm that minimizes Dχ(p||q), the χ-divergence from p to q. CHIVI minimizes an upper bound of the model evidence, which we term the χ upper bound (CUBO). Minimizing the CUBO leads to improved posterior uncertainty, and it can also be used with the classical VI lower bound (ELBO) to provide a sandwich estimate of the model evidence. We study CHIVI on three models: probit regression, Gaussian process classification, and a Cox process model of basketball plays. When compared to expectation propagation and classical VI, CHIVI produces better error rates and more accurate estimates of posterior variance.
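The sandwich property described in the abstract can be illustrated numerically. The sketch below is my own toy example, not code from the paper: it uses Monte Carlo estimates of the ELBO and of CUBO_n with n = 2 on a conjugate Gaussian model whose exact log evidence is known in closed form, so the ordering ELBO ≤ log p(x) ≤ CUBO can be checked directly.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy conjugate model (illustrative assumption, not from the paper):
#   z ~ N(0, 1),  x | z ~ N(z, 1)  =>  exact evidence p(x) = N(x; 0, 2)
x_obs = 1.5
exact_log_evidence = norm.logpdf(x_obs, loc=0.0, scale=np.sqrt(2.0))

def log_joint(z):
    """log p(x_obs, z) = log p(z) + log p(x_obs | z)."""
    return norm.logpdf(z, 0.0, 1.0) + norm.logpdf(x_obs, z, 1.0)

# A deliberately imperfect Gaussian approximation q(z); the exact
# posterior here is N(0.75, 0.5).
q_mu, q_sigma = 0.7, 0.8
z = rng.normal(q_mu, q_sigma, size=200_000)
log_w = log_joint(z) - norm.logpdf(z, q_mu, q_sigma)  # log importance weights

# ELBO   = E_q[log w]          (lower bound, by Jensen's inequality)
# CUBO_n = (1/n) log E_q[w^n]  (upper bound for n >= 1)
elbo = log_w.mean()
n = 2
cubo = np.log(np.mean(np.exp(n * log_w))) / n

print(f"ELBO     = {elbo:.4f}")
print(f"log p(x) = {exact_log_evidence:.4f}")
print(f"CUBO_2   = {cubo:.4f}")
```

With enough samples the two Monte Carlo estimates bracket the exact log evidence, and the width of the sandwich shrinks as q approaches the true posterior.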


Similar articles

Variational Inference via χ Upper Bound Minimization

Variational inference enables Bayesian analysis for complex probabilistic models with massive data sets. It works by positing a family of distributions and finding the member in the family that is closest to the posterior. While successful, variational methods can run into pathologies; for example, they typically underestimate posterior uncertainty. We propose CHIVI, a complementary algorithm ...

Full text

Approximate Inference with the Variational Holder Bound

We introduce the Variational Hölder (VH) bound as an alternative to Variational Bayes (VB) for approximate Bayesian inference. Unlike VB which typically involves maximization of a non-convex lower bound with respect to the variational parameters, the VH bound involves minimization of a convex upper bound to the intractable integral with respect to the variational parameters. Minimization of the...

Full text

Neural Variational Inference and Learning in Undirected Graphical Models

Many problems in machine learning are naturally expressed in the language of undirected graphical models. Here, we propose black-box learning and inference algorithms for undirected models that optimize a variational approximation to the log-likelihood of the model. Central to our approach is an upper bound on the log-partition function parametrized by a function q that we express as a flexible ...

Full text

Integrated Non-Factorized Variational Inference

We present a non-factorized variational method for full posterior inference in Bayesian hierarchical models, with the goal of capturing the posterior variable dependencies via efficient and possibly parallel computation. Our approach unifies the integrated nested Laplace approximation (INLA) under the variational framework. The proposed method is applicable in more challenging scenarios than ty...

Full text

The anatomy of choice: dopamine and decision-making

This paper considers goal-directed decision-making in terms of embodied or active inference. We associate bounded rationality with approximate Bayesian inference that optimizes a free energy bound on model evidence. Several constructs such as expected utility, exploration or novelty bonuses, softmax choice rules and optimism bias emerge as natural consequences of free energy minimization. Previ...

Full text



Publication date: 2017