Search results for: gaussian kernel

Number of results: 123,253

2017

The importance of the support vector machine and its applicability to a wide range of problems are well known. The strength of the support vector machine lies in its kernel. In our recent paper, we showed how the Laplacian kernel overcomes some of the drawbacks of the Gaussian kernel. However, this was not a total remedy for the shortcomings of the Gaussian kernel. In this paper, we design a ...
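
To make the comparison in this abstract concrete, here is a minimal sketch of the two kernels being contrasted (NumPy-based; the bandwidth σ = 1.0 and the sample points are arbitrary illustrative choices, not taken from the paper):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||_2^2 / (2 * sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def laplacian_kernel(x, y, sigma=1.0):
    # Laplacian kernel: k(x, y) = exp(-||x - y||_1 / sigma)
    return np.exp(-np.sum(np.abs(x - y)) / sigma)

x = np.array([0.0, 1.0])
y = np.array([3.0, 1.0])
print(gaussian_kernel(x, y))   # exp(-4.5) ≈ 0.0111 -- decays with the squared distance
print(laplacian_kernel(x, y))  # exp(-3.0) ≈ 0.0498 -- decays with the plain distance
```

The Laplacian kernel's slower decay (exponential in the distance rather than in the squared distance) gives it heavier tails, which is the kind of behavioral difference such kernel comparisons focus on.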

Journal: Neural Computation, 2017
Shaobo Lin, Jinshan Zeng, Xiangyu Chang

This letter aims at a refined error analysis for binary classification using a support vector machine (SVM) with the Gaussian kernel and a convex loss. Our first result shows that for some loss functions, such as the truncated quadratic loss and the quadratic loss, SVM with the Gaussian kernel can reach the almost optimal learning rate provided the regression function is smooth. Our second result shows that for ...

2017
Angel Deborah S, S. Milton Rajendram, T. T. Mirnalinee

The system developed by the SSN MLRG1 team for SemEval-2017 Task 5 on fine-grained sentiment analysis uses Multiple Kernel Gaussian Processes to identify the optimistic and pessimistic sentiments associated with companies and stocks. Since comments on the same companies and stocks may display different emotions depending on time, their properties, such as smoothness and periodicity, may vary....

2004
Yi Lin, Lawrence D. Brown

The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper, we consider the periodic version of Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of an...

2008
Jayanta K. Ghosh, H. van Zanten

We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described through a concentration function that is expressed in the reproducing kernel Hilbert space. Absolute co...

2017
Mehran Kafai, Kave Eshghi

Kernel methods have been shown to be effective for many machine learning tasks such as classification and regression. In particular, support vector machines with the Gaussian kernel have proved to be powerful classification tools. The standard way to apply kernel methods is to use the kernel trick, where the inner product of the vectors in the feature space is computed via the kernel function. ...
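
The kernel trick this abstract refers to can be demonstrated with a degree-2 polynomial kernel, whose feature map is finite and can be written out explicitly (the Gaussian kernel's feature space is infinite-dimensional, so it does not admit the same direct check); the vectors below are arbitrary illustrative choices:

```python
import numpy as np

def poly_kernel(x, y):
    # Degree-2 homogeneous polynomial kernel: k(x, y) = (x . y)^2
    return np.dot(x, y) ** 2

def feature_map(x):
    # Explicit feature map for 2D inputs:
    # phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so phi(x) . phi(y) = (x . y)^2
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])
print(poly_kernel(x, y))                       # 16.0
print(np.dot(feature_map(x), feature_map(y)))  # 16.0 -- same value, no explicit map needed
```

The point of the trick is the right-hand computation is never needed: the kernel function evaluates the feature-space inner product directly, which is what makes infinite-dimensional feature spaces (as with the Gaussian kernel) usable at all.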

2004
Yi Lin, Lawrence D. Brown
1999
L. Remaki, M. Cheriet

Scale-space representation is a formulation of multi-scale representation that has received considerable interest in the literature because of its efficiency in several practical applications and the distinct properties of the Gaussian kernel that generates the scale space. However, in practice, we note some undesirable limitations when using the Gaussian kernel: information loss caus...

Journal: Inf. Sci., 2015
Tobias Reitmaier, Bernhard Sick

Kernel functions in support vector machines (SVM) are needed to assess the similarity of input samples in order to classify these samples, for instance. Besides standard kernels such as Gaussian (i.e., radial basis function, RBF) or polynomial kernels, there are also specific kernels tailored to consider structure in the data for similarity assessment. In this article, we will capture structure...

Journal: Advances in Neural Information Processing Systems, 2017
Tri Dao, Christopher De Sa, Christopher Ré

Kernel methods have recently attracted resurgent interest, showing performance competitive with deep neural networks in tasks such as speech recognition. The random Fourier features map is a technique commonly used to scale up kernel machines, but employing the randomized feature map means that O(ε⁻²) samples are required to achieve an approximation error of at most ε. We investigate some alter...
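
A minimal sketch of the random Fourier features construction this abstract builds on (the Rahimi–Recht approximation of the Gaussian kernel; the dimensions d and D, the bandwidth σ, and the random seed below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 5, 2000, 1.0

# Random Fourier features for the Gaussian kernel:
# W ~ N(0, sigma^-2 I), b ~ Uniform[0, 2*pi),
# z(x) = sqrt(2/D) * cos(W x + b), and E[z(x) . z(y)] = exp(-||x - y||^2 / (2 sigma^2)).
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)

def z(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = rng.normal(size=d)
y = rng.normal(size=d)
exact = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))
approx = z(x) @ z(y)
print(exact, approx)  # close for large D; Monte Carlo error shrinks as O(1/sqrt(D))
```

The O(ε⁻²) sample count quoted in the abstract is the flip side of this O(1/√D) Monte Carlo rate: halving the error requires quadrupling the number of random features.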
