Search results for: gaussian process

Number of results: 1,372,428

2008
Hyun-Chul Kim, Zoubin Ghahramani

Gaussian process classifiers (GPCs) are fully statistical models for kernel classification. We present a form of GPC which is robust to labelling errors in the data set. This model allows label noise not only near the class boundaries but also far from them, as can result from mistakes in labelling or gross errors in measuring the input features. We derive an outlier robust a...
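
A minimal sketch of the kind of label-noise likelihood such a robust GPC can use, assuming a simple label-flip model with a fixed flip probability `eps`; the names and the value of `eps` are illustrative, not the paper's exact formulation:

```python
import numpy as np
from scipy.special import expit  # logistic sigmoid

def flip_noise_likelihood(y, f, eps=0.05):
    """Likelihood p(y | f) under a simple label-flip noise model.

    y   : labels in {-1, +1}
    f   : latent GP function values at the corresponding inputs
    eps : assumed probability that a label was flipped (illustrative value)
    """
    clean = expit(y * f)                      # standard logistic link sigma(y * f)
    return (1.0 - eps) * clean + eps * (1.0 - clean)

# Toy usage: the third point has a latent value that contradicts its label,
# so the flip term keeps its likelihood bounded away from zero.
y = np.array([+1, -1, +1])
f = np.array([2.0, -1.5, -3.0])
print(flip_noise_likelihood(y, f, eps=0.05))
```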

2011
Vinayak Rao, Yee Whye Teh

Renewal processes are generalizations of the Poisson process on the real line whose intervals are drawn i.i.d. from some distribution. Modulated renewal processes allow these interevent distributions to vary with time, allowing the introduction of nonstationarity. In this work, we take a nonparametric Bayesian approach, modelling this nonstationarity with a Gaussian process. Our approach is bas...
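
A rough sketch of what modulating a renewal process with a GP can look like, assuming a gamma interevent distribution and a sigmoid-squashed modulation; the function `g` stands in for a GP draw and is purely illustrative (the paper's own thinning construction and inference scheme are not reproduced here):

```python
import numpy as np
from scipy.stats import gamma

def gamma_hazard(tau, shape=2.0):
    """Hazard h(tau) = pdf(tau) / survival(tau) of a gamma interevent distribution."""
    return gamma.pdf(tau, a=shape) / gamma.sf(tau, a=shape)

def modulated_hazard(t, last_event, g):
    """Instantaneous hazard h(t - last_event) * sigmoid(g(t)).

    g(t) stands in for a draw from a Gaussian process; the sigmoid keeps the
    multiplicative modulation in (0, 1), so nonstationarity enters through g.
    """
    return gamma_hazard(t - last_event) / (1.0 + np.exp(-g(t)))

# Toy usage with a fixed "GP draw" g(t) = sin(t).
ts = np.linspace(0.5, 5.0, 5)
print([round(modulated_hazard(t, last_event=0.0, g=np.sin), 4) for t in ts])
```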

2007
Vikas Sindhwani, Wei Chu, S. Sathiya Keerthi

In this paper, we propose a graph-based construction of semi-supervised Gaussian process classifiers. Our method is based on recently proposed techniques for incorporating the geometric properties of unlabeled data within globally defined kernel functions. The full machinery for standard supervised Gaussian process inference is brought to bear on the problem of learning from labeled and unlabel...
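
One standard way to fold the geometry of unlabeled points into a globally defined kernel is to deform a base Gram matrix with a graph Laplacian built over labeled plus unlabeled inputs; the sketch below does this with an arbitrary neighborhood size and geometry weight, and is not necessarily the paper's exact construction:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import laplacian

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))                     # labeled + unlabeled inputs together

K = rbf_kernel(X, gamma=1.0)                     # base ("ambient") Gram matrix
A = kneighbors_graph(X, n_neighbors=5, mode="connectivity")
L = laplacian(0.5 * (A + A.T)).toarray()         # symmetrized graph Laplacian
M = 1.0 * L                                      # geometry weight (assumed)

# Deformed Gram matrix: K_deformed = K - K (I + M K)^{-1} M K
n = X.shape[0]
K_deformed = K - K @ np.linalg.solve(np.eye(n) + M @ K, M @ K)
print(K_deformed.shape)  # (60, 60); used in place of K for GP inference
```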

1999
Jörg C. Lemm

Nonparametric Bayesian approaches based on Gaussian processes have recently become popular in the empirical learning community. They encompass many classical methods of statistics, like Radial Basis Functions or various splines, and are technically convenient because Gaussian integrals can be calculated analytically. Restricting to Gaussian processes, however, forbids for example the implementi...

2015
Jing Zhao, Shiliang Sun

The recently proposed Gaussian process dynamical models (GPDMs) have been successfully applied to time series modeling. There are four learning algorithms for GPDMs: maximum a posteriori (MAP), fixing the kernel hyperparameters ᾱ (Fix.ᾱ), balanced GPDM (B-GPDM) and two-stage MAP (T.MAP), which are designed for model training with complete data. When data are incomplete, GPDMs reconstruct the m...

2017
Erik A. Daxberger, Kian Hsiang Low

This paper presents a novel distributed batch Gaussian process upper confidence bound (DB-GP-UCB) algorithm for performing batch Bayesian optimization (BO) of highly complex, costly-to-evaluate black-box objective functions. In contrast to existing batch BO algorithms, DB-GP-UCB can jointly optimize a batch of inputs (as opposed to selecting the inputs of a batch one at a time) while still prese...
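
For context, a minimal sequential GP-UCB loop on a toy 1-D objective; the batch/distributed aspect that DB-GP-UCB adds is not shown, and the kernel, exploration weight `beta`, and toy objective are all assumptions for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):
    """Toy black-box objective standing in for an expensive function."""
    return -(x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(3, 1))                 # initial design
y = objective(X).ravel()
candidates = np.linspace(0, 1, 200).reshape(-1, 1)
beta = 4.0                                         # assumed exploration weight

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-6)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(mu + np.sqrt(beta) * sigma)]  # UCB acquisition
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next)[0])

print("best observed input and value:", X[np.argmax(y)].item(), y.max())
```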

Journal: CoRR 2018
Tingran Gao, Shahar Z. Kovalsky, Doug M. Boyer, Ingrid Daubechies

As a means of improving analysis of biological shapes, we propose a greedy algorithm for sampling a Riemannian manifold based on the uncertainty of a Gaussian process. This is known to produce a near-optimal experimental design with the manifold as the domain, and appears to outperform the use of user-placed landmarks in representing the geometry of biological objects. We provide an asymptotic anal...
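
A small sketch of greedy sampling by GP posterior uncertainty, here on a random point cloud standing in for a discretised manifold; the kernel, its fixed length-scale, and the seed point are assumptions, and the paper's Riemannian machinery is not reproduced:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
cloud = rng.normal(size=(300, 3))
cloud /= np.linalg.norm(cloud, axis=1, keepdims=True)   # points on the unit sphere

selected = [0]                                          # arbitrary seed point
for _ in range(9):
    # Fixed kernel (optimizer=None): the predictive std depends only on the
    # selected input locations, so dummy zero targets are enough.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  alpha=1e-6, optimizer=None)
    gp.fit(cloud[selected], np.zeros(len(selected)))
    _, std = gp.predict(cloud, return_std=True)
    std[selected] = -np.inf                             # never re-pick chosen points
    selected.append(int(np.argmax(std)))

print("greedy design indices:", selected)
```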

2008
Ryan Prescott

The Gaussian process is a useful prior on functions for Bayesian regression and classification. Density estimation with a Gaussian process prior has been difficult, however, due to the requirements that densities be nonnegative and integrate to unity. The statistics community has explored the use of a logistic Gaussian process for density estimation, relying on various methods of approximating ...
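
The logistic Gaussian process referred to here squashes and renormalizes a GP draw so that both constraints hold; a sketch of the usual formulation (not necessarily this paper's specific construction):

```latex
p(x) \;=\; \frac{\exp\{g(x)\}}{\int_{\mathcal{X}} \exp\{g(s)\}\, ds},
\qquad g \sim \mathcal{GP}(m, k).
```

The exponential keeps the density nonnegative and the normalizing integral makes it integrate to unity; approximating that integral is exactly the difficulty the abstract alludes to.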

2014
Keith Dalbey, Laura Swiler

The objective is to calculate the probability, P_F, that a device will fail when its inputs, x, are randomly distributed with probability density, p(x), e.g., the probability that a device will fracture when subject to varying loads. Here failure is defined as some scalar function, y(x), exceeding a threshold, T. If evaluating y(x) via physical or numerical experiments is sufficiently expens...
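
In symbols, P_F = ∫ 1[y(x) > T] p(x) dx. A hedged sketch of the cheap-surrogate route (fit a GP to a few expensive runs of y, then Monte Carlo over the surrogate); the toy response, threshold, and sample sizes are assumed for illustration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def y_true(x):
    """Stand-in for the expensive physical/numerical response y(x)."""
    return x[:, 0] ** 2 + 0.5 * x[:, 1]

T = 1.5                                   # failure threshold (assumed)
rng = np.random.default_rng(0)

X_train = rng.normal(size=(40, 2))        # small design of "expensive" runs
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
gp.fit(X_train, y_true(X_train))

X_mc = rng.normal(size=(100_000, 2))      # cheap samples from the input density p(x)
p_fail = np.mean(gp.predict(X_mc) > T)    # P_F ~ fraction of surrogate exceedances
print("estimated failure probability:", p_fail)
```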

2014

Nonparametric regression for massive numbers of samples (n) and features (p) is an important problem. We propose a Bayesian approach for scaling up Gaussian process (GP) regression to big n and p settings using random compression. The proposed compressed GP is particularly motivated by the setting in which features can be projected to a low-dimensional manifold with minimal loss of information ...
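
A minimal sketch of the random-compression idea under the stated manifold assumption: the high-dimensional inputs below lie in a low-dimensional subspace, so a random projection to a few dimensions keeps what the regression needs. Dimensions, kernel, and projection scaling are illustrative choices, not the paper's:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
n, p, d, m = 500, 200, 3, 5

U = rng.normal(size=(n, d))                 # latent low-dimensional coordinates
X = U @ rng.normal(size=(d, p))             # observed high-dimensional features
y = np.sin(U[:, 0]) + 0.5 * U[:, 1] + 0.05 * rng.normal(size=n)

Phi = rng.normal(size=(p, m)) / np.sqrt(m)  # random compression matrix
Z = X @ Phi                                 # compressed inputs, n x m
Z /= Z.std(axis=0)                          # rescale for a sane kernel length-scale

gp = GaussianProcessRegressor(kernel=RBF(length_scale=np.ones(m)), alpha=1e-3)
gp.fit(Z[:400], y[:400])
print("held-out R^2 on compressed inputs:", gp.score(Z[400:], y[400:]))
```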

Chart of the number of search results per year
