Hierarchical Mixture-of-Experts Model for Large-Scale Gaussian Process Regression

Authors

  • Jun Wei Ng
  • Marc Peter Deisenroth
Abstract

We propose a practical and scalable Gaussian process model for large-scale nonlinear probabilistic regression. Our mixture-of-experts model is conceptually simple and hierarchically recombines computations for an overall approximation of a full Gaussian process. Closed-form and distributed computations allow for efficient and massive parallelisation while keeping the memory consumption small. Given sufficient computing resources, our model can handle arbitrarily large data sets, without explicit sparse approximations. We provide strong experimental evidence that our model can be applied to large data sets of sizes far beyond millions. Hence, our model has the potential to lay the foundation for general large-scale Gaussian process research.
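The abstract above describes recombining closed-form computations from independent local models into an overall approximation of a full Gaussian process. As a rough illustration only (not the authors' actual hierarchical scheme), one common way to combine independent GP "experts" trained on disjoint data subsets is precision-weighted averaging of their predictive distributions, in the spirit of a product-of-experts rule; the RBF kernel, hyperparameters, and partitioning below are all illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between row-vector inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, noise=1e-2):
    # Exact closed-form GP regression on one data subset.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf_kernel(Xs, Xs)) - (v**2).sum(0) + noise
    return mean, var

def poe_predict(X, y, Xs, n_experts=4):
    # Partition the training data among experts; each expert runs the
    # closed-form prediction independently (hence trivially in parallel),
    # and predictions are fused by precision weighting.
    idx = np.array_split(np.random.permutation(len(X)), n_experts)
    means, precs = [], []
    for i in idx:
        m, v = gp_predict(X[i], y[i], Xs)
        means.append(m)
        precs.append(1.0 / v)
    prec = np.sum(precs, axis=0)
    mean = np.sum([p * m for p, m in zip(precs, means)], axis=0) / prec
    return mean, 1.0 / prec
```

Because each expert only factorises its own subset's kernel matrix, memory and compute scale with the subset size rather than the full data set, which is the basic motivation for distributing GP regression this way.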


Related articles

Hierarchical Double Dirichlet Process Mixture of Gaussian Processes

We consider an infinite mixture model of Gaussian processes that share mixture components between nonlocal clusters in data. Meeds and Osindero (2006) use a single Dirichlet process prior to specify a mixture of Gaussian processes using an infinite number of experts. In this paper, we extend this approach to allow for experts to be shared non-locally across the input domain. This is accomplishe...


Hierarchical Gaussian Processes for Large Scale Bayesian Regression

We present a deep hierarchical mixture-of-experts model for scalable Gaussian process (GP) regression, which allows for highly parallel and distributed computation on a large number of computational units, enabling the application of GP modelling to large data sets with tens of millions of data points without an explicit sparse representation. The key to the model is the subdivision of the data...


Building Large-Scale Occupancy Maps using an Infinite Mixture of Gaussian Process Experts

This paper proposes a novel method of occupancy map building for large-scale applications. Although Gaussian processes have been successfully applied to occupancy map building, they suffer from the high computational complexity of O(n³), where n is the number of training data points, limiting their use for large-scale mapping. We propose to take a divide-and-conquer approach by partitioning training data int...


Infinite Mixtures of Gaussian Process Experts

We present an extension to the Mixture of Experts (ME) model, where the individual experts are Gaussian Process (GP) regression models. Using an input-dependent adaptation of the Dirichlet Process, we implement a gating network for an infinite number of Experts. Inference in this model may be done efficiently using a Markov Chain relying on Gibbs sampling. The model allows the effective covaria...


Variational Mixture of Gaussian Process Experts

Mixture-of-Gaussian-processes models extend a single Gaussian process with the ability to model multi-modal data and with reduced training complexity. Previous inference algorithms for these models are mostly based on Gibbs sampling, which can be very slow, particularly for large-scale data sets. We present a new generative mixture-of-experts model. Each expert is still a Gaussian process but ...



Journal:
  • CoRR

Volume abs/1412.3078

Publication date: 2014