Search results for: gaussian kernel

Number of results: 123,253

2016
Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, Eric P. Xing

Deep kernel learning combines the non-parametric flexibility of kernel methods with the inductive biases of deep learning architectures. We propose a novel deep kernel learning model and stochastic variational inference procedure which generalizes deep kernel learning approaches to enable classification, multi-task learning, additive covariance structures, and stochastic gradient training. Spec...

Journal: Journal of Machine Learning Research, 2013
Rob Hall, Alessandro Rinaldo, Larry A. Wasserman

Differential privacy is a framework for privately releasing summaries of a database. Previous work has focused mainly on methods for which the output is a finite dimensional vector, or an element of some discrete set. We develop methods for releasing functions while preserving differential privacy. Specifically, we show that adding an appropriate Gaussian process to the function of interest yie...
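The mechanism described above can be sketched as follows. This is an illustrative shape only: the noise scale here is a placeholder, whereas in the paper it would be calibrated from the function's sensitivity (in an RKHS sense) and the privacy parameters.

```python
import numpy as np

# Sketch of releasing a function privately by adding a Gaussian process
# sample path. `noise_scale` is a placeholder constant, NOT the paper's
# calibrated bound, which depends on sensitivity and (epsilon, delta).

rng = np.random.default_rng(3)
xs = np.linspace(0.0, 1.0, 50)

def f(x):
    # Function to release (e.g., a kernel density estimate).
    return np.exp(-(x - 0.5) ** 2 / 0.02)

def rbf_cov(xs, ell=0.1):
    # Squared-exponential GP covariance on the evaluation grid.
    d2 = (xs[:, None] - xs[None, :]) ** 2
    return np.exp(-d2 / (2 * ell ** 2))

noise_scale = 0.2  # placeholder for the DP calibration constant
K = rbf_cov(xs) + 1e-10 * np.eye(len(xs))  # jitter for numerical PSD
gp_path = rng.multivariate_normal(np.zeros(len(xs)), K)
released = f(xs) + noise_scale * gp_path
print(released.shape)
```

The released object is a (discretized) sample of `f` plus correlated Gaussian noise, so it remains a smooth function rather than pointwise-independent noise.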

2010
Cedric Archambeau, Francis Bach

We consider a Gaussian process formulation of the multiple kernel learning problem. The goal is to select the convex combination of kernel matrices that best explains the data and by doing so improve the generalisation on unseen data. Sparsity in the kernel weights is obtained by adopting a hierarchical Bayesian approach: Gaussian process priors are imposed over the latent functions and general...
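The basic object in multiple kernel learning is a convex combination of Gram matrices. A minimal sketch (the weights here are fixed by hand, not learned by the paper's hierarchical Bayesian procedure) shows that such a combination of Gaussian kernels is itself a valid positive-semidefinite kernel:

```python
import numpy as np

# Convex combination of Gaussian kernel matrices with different
# bandwidths. Sparse MKL would learn the weights; here they are fixed
# just to show the combined matrix is still a valid (PSD) kernel.

def gaussian_gram(X, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))

sigmas = [0.5, 1.0, 2.0]
weights = np.array([0.2, 0.5, 0.3])  # convex: nonnegative, sums to 1
K = sum(w * gaussian_gram(X, s) for w, s in zip(weights, sigmas))

eigvals = np.linalg.eigvalsh(K)
print(eigvals.min() >= -1e-10)  # nonnegative combination stays PSD
```

A sparsity-inducing prior over `weights` would drive most of them to zero, selecting a few bandwidths.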

2008
Aymeric Histace, Michel Ménard, Christine Cavaro-Ménard

Image data restoration by diffusion equation is now a well-established approach since the pioneering work of Perona and Malik (Perona & Malik, 1990). Originally, image diffusion consists of convolution with a Gaussian kernel, which introduces a scale dimension related to the standard deviation of the Gaussian kernel by σ² = 2t. This convolution is equivalent to solving the following linear diffusion...
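The scale-space equivalence mentioned above can be checked numerically: convolving an impulse with a Gaussian of standard deviation σ matches running the linear heat equation for time t = σ²/2. A minimal 1-D sketch (our demonstration, not code from the chapter):

```python
import numpy as np

# Verify the scale-space relation sigma^2 = 2t: Gaussian convolution
# with std sigma equals linear diffusion du/dt = d^2u/dx^2 for t.

n = 256
x = np.arange(n)
u0 = np.zeros(n)
u0[n // 2] = 1.0                 # unit impulse

sigma = 4.0
t_final = sigma ** 2 / 2.0       # t = sigma^2 / 2

# Path 1: explicit finite-difference diffusion (dt < 0.5 for stability).
dt = 0.1
u = u0.copy()
for _ in range(round(t_final / dt)):
    lap = np.roll(u, 1) + np.roll(u, -1) - 2 * u
    u = u + dt * lap

# Path 2: circular convolution with a sampled, normalized Gaussian.
g = np.exp(-(x - n // 2) ** 2 / (2 * sigma ** 2))
g /= g.sum()
u_conv = np.real(np.fft.ifft(np.fft.fft(u0) * np.fft.fft(np.fft.ifftshift(g))))

print(np.max(np.abs(u - u_conv)))  # small: the two paths nearly agree
```

The two results differ only by discretization error, illustrating why σ plays the role of a scale parameter in diffusion-based restoration.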

Journal: Automatica, 2016
Tianshi Chen, Tohid Ardeshiri, Francesca P. Carli, Alessandro Chiuso, Lennart Ljung, Gianluigi Pillonetto

The first order stable spline (SS-1) kernel is used extensively in regularized system identification. In particular, the stable spline estimator models the impulse response as a zero-mean Gaussian process whose covariance is given by the SS-1 kernel. In this paper, we discuss the maximum entropy properties of this prior. In particular, we formulate the exact maximum entropy problem solved by th...
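For concreteness, a common parameterization of the SS-1 (also called TC) kernel is K(t, s) = c·exp(-β·max(t, s)); the sketch below (our illustration, with arbitrary hyperparameter values) builds its Gram matrix and draws impulse-response samples from the corresponding GP prior:

```python
import numpy as np

# First order stable spline (SS-1 / TC) kernel sketch:
# K(t, s) = c * exp(-beta * max(t, s)), the covariance of a zero-mean
# Gaussian process prior over impulse responses. c and beta are
# hyperparameters; the values here are arbitrary.

def ss1_gram(t, c=1.0, beta=0.8):
    T, S = np.meshgrid(t, t, indexing="ij")
    return c * np.exp(-beta * np.maximum(T, S))

t = np.linspace(0.1, 5.0, 25)
K = ss1_gram(t)

# Samples from the prior decay with t, encoding stability of the
# identified impulse response.
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(np.zeros(len(t)), K, size=3)
print(K.shape, np.allclose(K, K.T))
```

Note the prior variance K(t, t) = c·exp(-βt) itself decays exponentially, which is what makes this prior natural for stable systems.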

2016
Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, Eric P. Xing

We introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the nonparametric flexibility of kernel methods. Specifically, we transform the inputs of a spectral mixture base kernel with a deep architecture, using local kernel interpolation, inducing points, and structure exploiting (Kronecker and Toeplitz) algebra for a scalable kernel represe...
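The core composition is k_deep(x, x') = k_base(g(x), g(x')), where g is a network. A structural sketch (random untrained weights, no spectral mixture base kernel, no inducing-point or Kronecker/Toeplitz machinery):

```python
import numpy as np

# Deep kernel sketch: apply a base kernel to network-transformed
# inputs. Weights are random here; the paper would train them jointly
# with the kernel hyperparameters and use structured approximations.

rng = np.random.default_rng(4)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 2))

def g(X):
    # Toy two-layer transform standing in for a deep architecture.
    return np.tanh(X @ W1) @ W2

def rbf_gram(Z, sigma=1.0):
    # Simple RBF base kernel (the paper uses a spectral mixture base).
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
    return np.exp(-np.maximum(d2, 0.0) / (2 * sigma ** 2))

X = rng.normal(size=(10, 3))
K_deep = rbf_gram(g(X))
print(K_deep.shape, np.allclose(np.diag(K_deep), 1.0))
```

Because the composition is still a valid kernel, everything downstream (GP regression, classification) is unchanged; only the input geometry is learned.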

Journal: Neurocomputing, 2005
Gavin C. Cawley, Nicola L. C. Talbot

In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on the evidence framework introduced by MacKay. The principal innovation lies in the re-parameterisation of the model such that the usual spherical Gaussian prior over the parameters in the kernel induced feature space also corresponds to a spherical Gaussian prior over t...

Journal: CoRR, 2016
Michael T. McCann, Matthew C. Fickus, Jelena Kovacevic

We present a new smooth, Gaussian-like kernel that allows the kernel density estimate for an angular distribution to be exactly represented by a finite number of its Fourier series coefficients. Distributions of angular quantities, such as gradients, are a central part of several state-of-the-art image processing algorithms, but these distributions are usually described via histograms and there...
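The key idea, that a band-limited periodic kernel makes the angular KDE an exact finite Fourier series, can be illustrated with a generic construction (ours, not the paper's specific kernel): k(x) = cos(x/2)^(2p) has Fourier coefficients C(2p, p+m)/4^p for |m| ≤ p and zero otherwise.

```python
import numpy as np
from math import comb

# Angular KDE via Fourier coefficients. With the band-limited kernel
# k(x) = cos(x/2)^(2p), the density estimate is exactly representable
# by its 2p+1 Fourier coefficients (illustrative kernel, not the
# paper's).

p = 6
thetas = np.array([0.3, 1.1, 2.9, 4.0])  # angular samples in [0, 2*pi)

def kde_direct(t):
    # Direct sum over data points (unnormalized kernel).
    return np.mean(np.cos((t - thetas) / 2.0) ** (2 * p))

# Fourier route: c_m(f) = c_m(k) * mean_j exp(-i m theta_j).
m = np.arange(-p, p + 1)
c_kernel = np.array([comb(2 * p, p + mm) for mm in m]) / 4.0 ** p
c_f = c_kernel * np.mean(np.exp(-1j * m[:, None] * thetas[None, :]), axis=1)

def kde_fourier(t):
    # Evaluate the finite Fourier series (real by conjugate symmetry).
    return np.real(np.sum(c_f * np.exp(1j * m * t)))

t0 = 2.0
print(abs(kde_direct(t0) - kde_fourier(t0)))  # ~0: exact representation
```

The point is that the 2p+1 coefficients `c_f` are a lossless summary of the whole density, unlike a binned histogram of the angles.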

1999
Matthias W. Seeger

We present a variational Bayesian method for model selection over families of kernel classifiers such as Support Vector Machines or Gaussian processes. The algorithm needs no user interaction and is able to adapt a large number of kernel parameters to given data without having to sacrifice training cases for validation. This opens the possibility to use sophisticated families of kernels in situat...

Journal: Pattern Recognition, 2009
Jie Wang, Haiping Lu, Konstantinos N. Plataniotis, Juwei Lu

This paper presents a novel algorithm to optimize the Gaussian kernel for pattern classification tasks, where it is desirable to have well-separated samples in the kernel feature space. We propose to optimize the Gaussian kernel parameters by maximizing a classical class separability criterion, and the problem is solved through a quasi-Newton algorithm by making use of a recently proposed decom...
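The idea of tuning the Gaussian kernel by class separability can be sketched with a simplified criterion (ours, not the paper's, and a grid search in place of the quasi-Newton step): score each bandwidth by mean within-class minus mean between-class kernel similarity.

```python
import numpy as np

# Pick the Gaussian kernel width sigma that best separates classes in
# the kernel-induced feature space. The criterion below (within-class
# minus between-class mean similarity) is a simplified stand-in for
# the paper's class separability measure.

def gaussian_gram(X, sigma):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2 * sigma ** 2))

def separability(X, y, sigma):
    K = gaussian_gram(X, sigma)
    same = y[:, None] == y[None, :]
    return K[same].mean() - K[~same].mean()

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

grid = [0.1, 0.5, 1.0, 2.0, 5.0, 20.0]
scores = [separability(X, y, s) for s in grid]
best = grid[int(np.argmax(scores))]
print(best)
```

Extreme bandwidths score poorly (too small: every point is only similar to itself; too large: everything is similar to everything), so an intermediate σ wins, which is the intuition the optimization formalizes.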

[Chart: number of search results per year]