Search results for: kernels

Number of results: 16416

2006
José R. Herrero Juan J. Navarro

The use of highly optimized inner kernels is of paramount importance for obtaining efficient numerical algorithms. Often, such kernels are created by hand. In this paper, however, we present an alternative way to produce efficient matrix multiplication kernels based on a set of simple codes which can be parameterized at compilation time. Using the resulting kernels we have been able to produce ...
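
For orientation, here is a minimal Python sketch (not the authors' code, which targets hand-tuned, compiled kernels) of a matrix-multiplication inner kernel whose block sizes play the role of parameters fixed at compilation time:

    import numpy as np

    def blocked_matmul(A, B, MB=32, NB=32, KB=32):
        # C = A @ B computed block by block; MB, NB, KB stand in for the
        # compile-time parameters a generated kernel would be specialized for.
        m, k = A.shape
        k2, n = B.shape
        assert k == k2, "inner dimensions must match"
        C = np.zeros((m, n), dtype=A.dtype)
        for i in range(0, m, MB):
            for j in range(0, n, NB):
                for p in range(0, k, KB):
                    # small, cache-friendly block product
                    C[i:i+MB, j:j+NB] += A[i:i+MB, p:p+KB] @ B[p:p+KB, j:j+NB]
        return C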

2007
Georg Berschneider Wolfgang zu Castell

Conditionally positive definite kernels provide a powerful tool for scattered data approximation. Many nice properties of such methods follow from an underlying reproducing kernel structure. While the connection between positive definite kernels and reproducing kernel Hilbert spaces is well understood, the analog relation between conditionally positive definite kernels and reproducing kernel Po...
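
For orientation, the standard definition from the scattered-data literature: a symmetric kernel K on a set Ω is conditionally positive definite of order m if

    \sum_{j=1}^{N} \sum_{k=1}^{N} c_j c_k K(x_j, x_k) \;\ge\; 0

for all points x_1, ..., x_N in Ω and all coefficients c in R^N satisfying \sum_{j=1}^{N} c_j p(x_j) = 0 for every polynomial p of degree less than m; the case m = 0 recovers ordinary positive definiteness.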

Journal: J. Complexity, 2009
S. B. Damelin Jeremy Levesley David L. Ragozin X. Sun

The purpose of this paper is to derive quadrature estimates on compact, homogeneous manifolds embedded in Euclidean spaces, via energy functionals associated with a class of group-invariant kernels which are generalizations of zonal kernels on the spheres or radial kernels in Euclidean spaces. Our results apply, in particular, to weighted Riesz kernels defined on spheres and certain projective s...
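
For orientation, the (unweighted) Riesz s-kernel and its energy functional take the standard form

    K_s(x, y) = \|x - y\|^{-s}, \qquad E_s(\mu) = \iint \|x - y\|^{-s} \, d\mu(x)\, d\mu(y), \quad s > 0,

with weighted variants replacing \|x - y\|^{-s} by w(x)\, w(y)\, \|x - y\|^{-s}; quadrature estimates of this kind typically compare the discrete energy of a point configuration with the continuous energy of the invariant measure.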

2006
Hjalmar Rosengren Vadim Kuznetsov

We study multivariable Christoffel–Darboux kernels, which may be viewed as reproducing kernels for antisymmetric orthogonal polynomials, and also as correlation functions for products of characteristic polynomials of random Hermitian matrices. Using their interpretation as reproducing kernels, we obtain simple proofs of Pfaffian and determinant formulas, as well as Schur polynomial expansions, ...
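
For orientation, in the classical univariate case with orthonormal polynomials p_0, ..., p_n the Christoffel–Darboux kernel is the reproducing kernel for polynomials of degree at most n,

    K_n(x, y) = \sum_{k=0}^{n} p_k(x)\, p_k(y) = \frac{\kappa_n}{\kappa_{n+1}} \cdot \frac{p_{n+1}(x)\, p_n(y) - p_n(x)\, p_{n+1}(y)}{x - y},

where \kappa_n denotes the leading coefficient of p_n; the paper studies multivariable analogues built from antisymmetric orthogonal polynomials.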

2011
Marco Cuturi

We propose novel approaches to cast the widely-used family of Dynamic Time Warping (DTW) distances and similarities as positive definite kernels for time series. To this effect, we provide new theoretical insights on the family of Global Alignment kernels introduced by Cuturi et al. (2007) and propose alternative kernels which are both positive definite and faster to compute. We provide experim...
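
A minimal Python sketch of a Global Alignment-style kernel in the spirit of Cuturi et al. (2007), assuming a Gaussian local similarity between sequence elements; the paper's exact definition and normalization may differ:

    import numpy as np

    def ga_kernel(x, y, sigma=1.0):
        # Sum over all monotone alignments of the product of local similarities,
        # computed by dynamic programming over the alignment lattice.
        n, m = len(x), len(y)
        M = np.zeros((n + 1, m + 1))
        M[0, 0] = 1.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                k_ij = np.exp(-(x[i - 1] - y[j - 1]) ** 2 / (2.0 * sigma ** 2))
                M[i, j] = k_ij * (M[i - 1, j - 1] + M[i - 1, j] + M[i, j - 1])
        return M[n, m]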

2005
Alexei Vinokourov Andrei N. Soklakov Craig Saunders

There have recently been numerous applications of kernel methods in the field of bioinformatics. In particular, the problem of protein homology has served as a benchmark for the performance of many new kernels which operate directly on strings (such as amino-acid sequences). Several new kernels have been developed and successfully applied to this type of data, including spectrum, string, mismatc...
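
For illustration, a minimal Python sketch of the spectrum kernel mentioned above, which compares two strings through their shared k-mer counts (the string and mismatch kernels refine this idea):

    from collections import Counter

    def spectrum_kernel(s, t, k=3):
        # Inner product of the k-mer count vectors of two strings.
        cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
        ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
        return sum(cs[w] * ct[w] for w in cs if w in ct)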

2013
Debarghya Ghoshdastidar Ambedkar Dukkipati

The role of kernels is central to machine learning. Motivated by the importance of power-law distributions in statistical modeling, in this paper we propose the notion of power-law kernels to investigate power laws in learning problems. We propose two power-law kernels by generalizing the Gaussian and Laplacian kernels. This generalization is based on distributions arising out of maximization of a ...
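
In standard form, the two kernels being generalized are

    k_{\mathrm{G}}(x, y) = \exp\!\big(-\|x - y\|^2 / (2\sigma^2)\big), \qquad k_{\mathrm{L}}(x, y) = \exp\!\big(-\|x - y\| / \sigma\big),

and one common route to a power-law analogue (sketched here for orientation; the paper's exact construction follows from its maximum-entropy argument) replaces the exponential by a q-exponential, \exp_q(-u) = \big(1 + (q - 1)\, u\big)^{-1/(q-1)} for q > 1, so that the tails decay polynomially rather than exponentially.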

Journal: Journal of Machine Learning Research, 2007
Yiming Ying Ding-Xuan Zhou

Gaussian kernels with flexible variances provide a rich family of Mercer kernels for learning algorithms. We show that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is a uniform Glivenko-Cantelli (uGC) class. This result confirms a conjecture concerning learnability of Gaussian kernels and verifies the uniform convergence ...
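
For orientation, writing H_σ for the RKHS of a Gaussian kernel with variance parameter σ (the paper allows the variance to vary more flexibly than this isotropic case), the uniform Glivenko–Cantelli property of the class F formed by the union of the unit balls of the H_σ means

    \sup_{\mu} \Pr\!\left\{ \sup_{f \in \mathcal{F}} \left| \frac{1}{n} \sum_{i=1}^{n} f(X_i) - \mathbb{E}_{\mu} f \right| > \varepsilon \right\} \;\longrightarrow\; 0 \quad (n \to \infty) \quad \text{for every } \varepsilon > 0,

where the outer supremum runs over all probability distributions μ and X_1, ..., X_n are drawn i.i.d. from μ.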

Journal: Journal of Machine Learning Research, 2007
Marco Reisert Hans Burkhardt

This paper presents a new class of matrix-valued kernels that are ideally suited to learning vector-valued equivariant functions. Matrix-valued kernels are a natural generalization of the common notion of a kernel. We set the theoretical foundations of so-called equivariant matrix-valued kernels. We work out several properties of equivariant kernels, give an interpretation of their behavior and...
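
For orientation (one common formalization; the paper's precise definitions may differ): a matrix-valued kernel K : X × X → R^{d×d} is positive definite if

    \sum_{i=1}^{n} \sum_{j=1}^{n} c_i^{\top} K(x_i, x_j)\, c_j \;\ge\; 0 \quad \text{for all } x_1, \ldots, x_n \in X, \; c_1, \ldots, c_n \in \mathbb{R}^{d},

and equivariance with respect to a group G acting on X, with a representation ρ on R^d, asks that K(g x, g y) = ρ(g)\, K(x, y)\, ρ(g)^{\top} for all g in G, so that the learned vector-valued function transforms consistently when its input is transformed.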

2010
Corinna Cortes Mehryar Mohri Afshin Rostamizadeh

This paper presents several novel generalization bounds for the problem of learning kernels based on a combinatorial analysis of the Rademacher complexity of the corresponding hypothesis sets. Our bound for learning kernels with a convex combination of p base kernels using L1 regularization admits only a √(log p) dependency on the number of kernels, which is tight and considerably more favorable...
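
Concretely, the hypotheses are built from a kernel chosen in the convex hull of the base kernels,

    K_{\mu} = \sum_{k=1}^{p} \mu_k K_k, \qquad \mu_k \ge 0, \quad \sum_{k=1}^{p} \mu_k = 1,

with the simplex constraint corresponding to L1 regularization; the point of the √(log p) dependency is that the guarantee degrades only very mildly as more base kernels are added.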

Chart: number of search results per year
