Search results for: kernel functions

Number of results: 534716

2007
Longin Jan Latecki Aleksandar Lazarevic Dragoljub Pokrajac

Outlier detection has recently become an important problem in many industrial and financial applications. In this paper, a novel unsupervised algorithm for outlier detection with a solid statistical foundation is proposed. First, we modify a nonparametric density estimate with a variable kernel to yield a robust local density estimate. Outliers are then detected by comparing the local density ...
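
The scheme sketched in this abstract, estimating a local density with a data-dependent (variable) kernel bandwidth and flagging points whose density falls well below that of their neighbours, is easy to prototype. The snippet below is a minimal sketch of that general idea, not the authors' exact algorithm; the Gaussian kernel, the k-NN bandwidth rule, and the ratio-based score are all assumptions.

```python
import numpy as np

def variable_kde_outlier_scores(X, k=5):
    """Toy variable-kernel density outlier score (illustrative, not the
    paper's method): bandwidth of each kernel = distance to its k-th
    neighbour; score = mean neighbour density / own density."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    h = np.sort(D, axis=1)[:, k]                  # variable bandwidth per sample
    dens = np.exp(-(D / h[None, :]) ** 2 / 2).mean(axis=1)  # unnormalized KDE
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]      # k nearest neighbours (no self)
    return dens[nbrs].mean(axis=1) / dens         # large score = likely outlier

X = np.vstack([np.random.randn(50, 2), [[8.0, 8.0]]])  # one planted outlier
print(np.argmax(variable_kde_outlier_scores(X)))        # expected: 50
```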

2010
Prateek Jain Brian Kulis Inderjit S. Dhillon

In this paper we consider the problem of semi-supervised kernel function learning. We first propose a general regularized framework for learning a kernel matrix, and then demonstrate an equivalence between our proposed kernel matrix learning framework and a general linear transformation learning problem. Our result shows that the learned kernel matrices parameterize a linear transformation kern...
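
The equivalence the abstract points to can be made concrete: learning a kernel matrix in such a framework amounts to learning a positive semidefinite matrix W so that k_W(x, y) = x^T W y, i.e., an ordinary inner product after a linear map of the inputs. The names W, L, and k_W below are illustrative, and this sketch shows only the parameterization, not the paper's regularized optimization.

```python
import numpy as np

# Linear-transformation kernel: k_W(x, y) = x^T W y with W = L^T L (PSD),
# which equals the plain inner product <L x, L y> after mapping x -> L x.
rng = np.random.default_rng(0)
L = rng.standard_normal((3, 3))   # stand-in for a learned transformation
W = L.T @ L                       # positive semidefinite by construction

def k_W(x, y):
    return x @ W @ y

x, y = rng.standard_normal(3), rng.standard_normal(3)
# Kernel-matrix view and transformed-inner-product view agree:
assert np.isclose(k_W(x, y), (L @ x) @ (L @ y))
```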

2008
Fabrizio Colombo Graziano Gentili Irene Sabadini

In this paper we show how to construct a regular, noncommutative Cauchy kernel for slice regular quaternionic functions. We prove an (algebraic) representation formula for such functions, which leads to a new Cauchy formula. We find the expression of the derivatives of a regular function in terms of the powers of the Cauchy kernel, and we present several other consequent results. AMS Classific...
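
For orientation, the slice regular Cauchy kernel usually takes the following closed form in the literature; this is quoted from memory as the standard expression, not verified against the truncated abstract, so treat the sign and measure conventions as assumptions.

```latex
% Standard noncommutative Cauchy kernel for slice regular functions
% (sign and measure conventions may differ between sources):
S^{-1}(s,q) = -\left(q^{2} - 2q\,\operatorname{Re}(s) + |s|^{2}\right)^{-1}(q - \bar{s}),
\qquad
f(q) = \frac{1}{2\pi}\int_{\partial(U \cap L_{I})} S^{-1}(s,q)\, ds_{I}\, f(s).
```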

Journal: Remote Sensing, 2016
Yanbiao Sun Liang Zhao Guoqing Zhou Lei Yan

The classical absolute orientation method is capable of transforming tie points (TPs) from a local coordinate system to a global (geodetic) coordinate system. The method relies on a single set of similarity transformation parameters, estimated by minimizing the total difference between all ground control points (GCPs) and the fitted points. Nevertheless, it often yields a transformation w...
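
For reference, the classical absolute-orientation baseline this abstract starts from, a single least-squares similarity transform (scale, rotation, translation) fitted to corresponding points, has a closed-form SVD solution. The sketch below follows the standard Umeyama/Horn construction; treating it as the paper's baseline is an assumption, and it is not the paper's proposed refinement.

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares fit of dst ~ s * R @ src + t (Umeyama's method)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))       # cross-covariance
    d = np.sign(np.linalg.det(U @ Vt))                 # guard against reflection
    Dm = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ Dm @ Vt
    s = np.trace(np.diag(S) @ Dm) / A.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Noise-free check: recover a known similarity transform.
rng = np.random.default_rng(1)
P = rng.standard_normal((10, 3))
R_true = np.linalg.qr(rng.standard_normal((3, 3)))[0]
if np.linalg.det(R_true) < 0:
    R_true[:, 0] *= -1                                 # force a proper rotation
Q = 2.5 * P @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(P, Q)
assert np.allclose(Q, s * P @ R.T + t)
```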

1999
Volker Roth Volker Steinhage

Fisher's linear discriminant analysis (LDA) is a classical multivariate technique both for dimension reduction and classification. The data vectors are transformed into a low-dimensional subspace such that the class centroids are spread out as much as possible. In this subspace LDA works as a simple prototype classifier with linear decision boundaries. However, in many applications the linear bo...
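
Before any kernelization, the two-class Fisher discriminant this abstract builds on is a single linear-algebra step: project onto w = S_W^{-1}(mu_1 - mu_0). The snippet below sketches only that classical linear version; the kernel extension developed in the paper would replace raw inputs with kernel-induced features, which is omitted here.

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Two-class Fisher discriminant direction: w = S_W^{-1} (mu1 - mu0)."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S_W = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)  # within-class scatter
    return np.linalg.solve(S_W, mu1 - mu0)

rng = np.random.default_rng(2)
X0 = rng.standard_normal((100, 2))            # class 0 around the origin
X1 = rng.standard_normal((100, 2)) + 3.0      # class 1 shifted to (3, 3)
w = fisher_lda_direction(X0, X1)
print((X0 @ w).mean(), (X1 @ w).mean())       # well-separated projections
```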

2005
Annalisa Barla

Understanding image content is a long-standing problem of computer science. Despite decades of research in computer vision, an effective solution to this problem does not appear to be in sight. Recent advances in the theory of learning from examples indicate that devising systems which can be trained, instead of programmed, to solve this problem is an interesting alternative to solutions constructe...

2010
N. Arcozzi R. Rochberg E. Sawyer B. D. Wick

Suppose H is a space of functions on X. If H is a Hilbert space with a reproducing kernel, then that structure of H can be used to build distance functions on X. We describe some of these distance functions, together with their interpretations and interrelations. We also present some computational properties and examples.
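
One of the simplest such distance functions, presumably among those the abstract surveys, is the feature-map distance d(x, y)^2 = k(x, x) - 2 k(x, y) + k(y, y), i.e., the RKHS norm of k_x - k_y. A minimal sketch follows; the Gaussian kernel choice is an assumption made for illustration.

```python
import numpy as np

def rbf(x, y, sigma=1.0):
    """Gaussian (RBF) reproducing kernel, chosen here only for illustration."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def kernel_distance(x, y, k=rbf):
    """RKHS distance ||k_x - k_y||_H induced by a reproducing kernel k."""
    return np.sqrt(max(k(x, x) - 2 * k(x, y) + k(y, y), 0.0))

x, y = np.array([0.0, 0.0]), np.array([1.0, 1.0])
print(kernel_distance(x, y))   # lies in (0, sqrt(2)) for the Gaussian kernel
```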

2008
Tien-Chung Hu Jianqing Fan

Kernel density estimates are frequently used, based on a second-order kernel. Thus, the bias inherent in the estimates has order O(h^2). In this note, a method of correcting the bias in the kernel density estimates is provided, which reduces the bias to a smaller order. Effectively, this method produ...
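
A standard device that achieves this kind of bias reduction is extrapolation over two bandwidths: a second-order kernel gives bias (h^2/2) mu_2(K) f'' + O(h^4), so combining estimates at h and 2h cancels the h^2 term. The sketch below shows that generic construction; whether it matches the specific correction of this note is an assumption.

```python
import numpy as np

def kde(x_grid, data, h):
    """Gaussian KDE with a second-order kernel: bias of order O(h^2)."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-u ** 2 / 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def kde_bias_corrected(x_grid, data, h):
    """Two-bandwidth extrapolation: (4 f_h - f_{2h}) / 3 cancels the h^2
    bias term, leaving bias of order O(h^4)."""
    return (4 * kde(x_grid, data, h) - kde(x_grid, data, 2 * h)) / 3

rng = np.random.default_rng(3)
data = rng.standard_normal(2000)
grid = np.linspace(-3, 3, 7)
truth = np.exp(-grid ** 2 / 2) / np.sqrt(2 * np.pi)
print(np.abs(kde(grid, data, 0.5) - truth).max())
print(np.abs(kde_bias_corrected(grid, data, 0.5) - truth).max())  # usually smaller
```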

Journal: Evolutionary Intelligence, 2012
Tobias Glasmachers Jan Koutník Jürgen Schmidhuber

To parameterize continuous functions for evolutionary learning, we use kernel expansions in nested sequences of function spaces of growing complexity. This approach is particularly powerful when dealing with non-convex constraints and discontinuous objective functions. Kernel methods offer a number of beneficial properties for parameterizing continuous functions, such as smoothness and locality...
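
The parameterization described here amounts to representing a candidate function as a kernel expansion f(x) = sum_i alpha_i k(x, c_i) and letting the search operate on the coefficients, with nested spaces obtained by adding centers. The snippet below sketches only that representation with a Gaussian kernel; the complexity schedule and the evolutionary loop are omitted, and the class design is an assumption.

```python
import numpy as np

class KernelExpansion:
    """f(x) = sum_i alpha_i * k(x, c_i) with a Gaussian kernel; growing
    complexity = appending more (center, coefficient) pairs."""
    def __init__(self, centers, alphas, sigma=1.0):
        self.c = np.atleast_2d(np.asarray(centers, float))
        self.a = np.asarray(alphas, float)
        self.sigma = sigma

    def __call__(self, x):
        d2 = np.sum((self.c - np.asarray(x, float)) ** 2, axis=1)
        return self.a @ np.exp(-d2 / (2 * self.sigma ** 2))

    def grown(self, center, alpha):
        """Next space in the nested sequence: one additional kernel term."""
        return KernelExpansion(np.vstack([self.c, center]),
                               np.append(self.a, alpha), self.sigma)

f = KernelExpansion([[0.0, 0.0]], [1.0])
g = f.grown([1.0, 1.0], -0.5)       # richer representation, same smoothness
print(f([0.5, 0.5]), g([0.5, 0.5]))
```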
