Search results for: kernel trick

Number of results: 52726

2006
Sadaaki Miyamoto

We overview methods of fuzzy c-means clustering as representative techniques of unsupervised classification by soft computing. The basic framework, the alternating optimization algorithm originally proposed by Dunn and Bezdek, is reviewed, and two more objective functions are introduced. An additional variable controlling volume size is included as an extension. Moreover, a method of the kerne...
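The alternating optimization this abstract refers to can be sketched in a few lines: fix the memberships to update the centers, then fix the centers to update the memberships. This is an illustrative NumPy sketch of standard fuzzy c-means, not the paper's own implementation; the fuzzifier m = 2 and the toy data are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Alternating-optimization sketch of fuzzy c-means (after Dunn and Bezdek)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m                                  # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))               # standard membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
U, centers = fuzzy_c_means(X)
labels = U.argmax(axis=1)
```

On two well-separated clusters, the hard labels obtained by taking the largest membership recover the grouping.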

2008
Riadh Ksantini Djemel Ziou Bernard Colin François Dubeau

The Kernel Fisher’s Discriminant (KFD) is a non-linear classifier which has proven to be powerful and competitive with several state-of-the-art classifiers. Its main ingredient is the kernel trick, which allows the efficient computation of Fisher’s Linear Discriminant in feature space. However, it assumes an equal covariance structure for all transformed classes, which does not hold in many applica...
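The two-class KFD the abstract mentions can be sketched directly: everything is expressed through the Gram matrix, so Fisher's criterion is solved in feature space without an explicit mapping. This is a hedged textbook-style sketch, not the authors' method; the RBF kernel, gamma, and ridge term reg are assumptions.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_direction(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant: expansion coefficients alpha."""
    K = rbf(X, X, gamma)
    n = len(y)
    M, N = [], np.zeros((n, n))
    for cls in (0, 1):
        idx = np.where(y == cls)[0]
        Kc = K[:, idx]                      # kernel columns of this class
        M.append(Kc.mean(axis=1))           # kernelized class mean
        center = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ center @ Kc.T             # kernelized within-class scatter
    alpha = np.linalg.solve(N + reg * np.eye(n), M[1] - M[0])
    return alpha, K

X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
alpha, K = kfd_direction(X, y)
proj = K @ alpha                            # 1-D projections of the training points
```

On separable toy data the two classes end up on opposite sides of the learned direction.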

2006
Ranjeeth Kumar

The single-class verification framework is gaining increasing attention for problems involving authentication and retrieval. In this paper, nonlinear features are extracted using the kernel trick. The class of interest is modeled by using all the available samples rather than a single representative sample. Kernel selection is used to enhance the class specific feature set. A tunable objective ...

2006
Christian Gagné Marc Schoenauer Michèle Sebag Marco Tomassini

Support Vector Machines (SVMs) are well-established Machine Learning (ML) algorithms. They rely on the fact that i) linear learning can be formalized as a well-posed optimization problem; ii) non-linear learning can be brought into linear learning thanks to the kernel trick and the mapping of the initial search space onto a high-dimensional feature space. The kernel is designed by the ML exper...
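The kernel trick mentioned in point ii) can be verified concretely: for the homogeneous quadratic kernel in 2-D, the kernel value equals an inner product under an explicit 3-D feature map. A small sketch (the particular kernel and feature map are standard, chosen here for illustration):

```python
import numpy as np

def phi(v):
    """Explicit feature map for the polynomial kernel k(x, y) = (x . y)^2 in 2-D."""
    x1, x2 = v
    return np.array([x1 * x1, x2 * x2, np.sqrt(2.0) * x1 * x2])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

k_implicit = (x @ y) ** 2          # kernel trick: no explicit mapping needed
k_explicit = phi(x) @ phi(y)       # inner product in the 3-D feature space
```

Both quantities agree exactly, which is why linear algorithms can be "kernelized" without ever materializing the feature space.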

Journal: Journal of Machine Learning Research, 2005
Marco Cuturi Kenji Fukumizu Jean-Philippe Vert

We present a family of positive definite kernels on measures, characterized by the fact that the value of the kernel between two measures is a function of their sum. These kernels can be used to derive kernels on structured objects, such as images and texts, by representing these objects as sets of components, such as pixels or words, or more generally as measures on the space of components. Se...
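The defining property in this abstract, a kernel whose value between two measures depends only on their sum, can be illustrated on word-count histograms. The entropy-based variant below is only one example in this spirit; the scaling theta and the normalization are assumptions for illustration, and checking positive definiteness is outside this sketch.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a normalized measure."""
    p = p / p.sum()
    nz = p[p > 0]
    return -(nz * np.log(nz)).sum()

def sum_kernel(mu, nu, theta=1.0):
    """Kernel whose value depends on the two measures only through their sum."""
    return np.exp(-theta * entropy(mu + nu))

mu = np.array([3.0, 1.0, 0.0])   # e.g. word counts of one document
nu = np.array([1.0, 1.0, 2.0])   # word counts of another
k = sum_kernel(mu, nu)
```

Because the value is a function of mu + nu alone, symmetry is automatic, and structured objects (texts, images) enter simply as bags of components.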

Journal: CoRR, 2013
Rimah Amami Dorra Ben Ayed Mezghanni Noureddine Ellouze

It is known that the classification performance of the Support Vector Machine (SVM) can be considerably affected by the parameters of the kernel function and by the regularization parameter C. Thus, in this article, we propose a study to find a suitable kernel with which SVM achieves good generalization performance, as well as the parameter values to use. We need to analyze the behavio...
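The parameter search this abstract describes is typically done as a grid search over C and the kernel parameter, scored on held-out data. As a hedged sketch, the model below is a regularized kernel least-squares classifier standing in for an SVM (larger C means weaker regularization, echoing the SVM trade-off); the grid values and synthetic data are assumptions.

```python
import numpy as np

def rbf(X, Y, gamma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(Xtr, ytr, Xte, C, gamma):
    # Kernel ridge classifier used as a lightweight SVM stand-in.
    K = rbf(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + (1.0 / C) * np.eye(len(ytr)), ytr)
    return np.sign(rbf(Xte, Xtr, gamma) @ alpha)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] * X[:, 1])            # XOR-like labels: not linearly separable
Xtr, ytr, Xval, yval = X[:30], y[:30], X[30:], y[30:]

grid = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.1, 1.0, 10.0)]
scores = {cg: (fit_predict(Xtr, ytr, Xval, *cg) == yval).mean() for cg in grid}
best = max(scores, key=scores.get)        # (C, gamma) with best validation accuracy
```

In practice the same loop would use cross-validation rather than a single split, but the structure of the search is the same.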

1999
Volker Roth Volker Steinhage

Fisher's linear discriminant analysis (LDA) is a classical multivariate technique for both dimension reduction and classification. The data vectors are transformed into a low-dimensional subspace such that the class centroids are spread out as much as possible. In this subspace, LDA works as a simple prototype classifier with linear decision boundaries. However, in many applications the linear bo...
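For two classes, the subspace this abstract describes reduces to a single direction: the within-class scatter inverse applied to the difference of the class means. A minimal NumPy sketch of this classical formula (the toy data and the tiny ridge added for numerical safety are assumptions):

```python
import numpy as np

def fisher_direction(X, y):
    """Fisher's discriminant direction for two classes: w = Sw^{-1} (m1 - m0)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # within-class scatter = sum of the two class scatter matrices
    Sw = np.cov(X0.T, bias=True) * len(X0) + np.cov(X1.T, bias=True) * len(X1)
    return np.linalg.solve(Sw + 1e-9 * np.eye(X.shape[1]), m1 - m0)

X = np.array([[1., 2.], [2., 3.], [3., 3.], [6., 5.], [7., 8.], [8., 7.]])
y = np.array([0, 0, 0, 1, 1, 1])
w = fisher_direction(X, y)
proj = X @ w        # 1-D projections used by the prototype classifier
```

Projecting onto w separates the two classes, which is exactly the behavior the kernel variants above generalize to non-linear boundaries.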

2008
Minh Hoai Nguyen Fernando De la Torre

Kernel Principal Component Analysis (KPCA) is a popular generalization of linear PCA that allows non-linear feature extraction. In KPCA, data in the input space is mapped to a (usually) higher-dimensional feature space where the data can be linearly modeled. The feature space is typically induced implicitly by a kernel function, and linear PCA in the feature space is performed via the kernel tric...
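The procedure this abstract outlines, linear PCA in feature space via the kernel trick, amounts to eigendecomposing a double-centered Gram matrix. A hedged NumPy sketch of standard KPCA (the RBF kernel, gamma, and toy data are assumptions, not the authors' setup):

```python
import numpy as np

def kpca(X, gamma=1.0, n_components=2):
    """Kernel PCA sketch: eigendecompose the double-centered Gram matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)                           # RBF Gram matrix
    n = len(X)
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    Kc = J @ K @ J                                    # centering in feature space
    w, V = np.linalg.eigh(Kc)                         # ascending eigenvalues
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))  # projected coordinates

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [4.0, 4.0], [4.1, 4.0], [4.0, 4.1]])
Z = kpca(X, gamma=0.5, n_components=1)
```

On two tight, well-separated clusters the first kernel principal component cleanly splits them, with almost no spread inside each cluster.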

2002
Zhihua Zhang James T. Kwok Dit-Yan Yeung Wanqiu Wang

In this paper, we present a novel feature extraction method called the protoface. While the Eigenface is based on principal component analysis and the Fisherface on Fisher's linear discriminant analysis, the protoface only requires decomposing the covariance matrix of the prototypes, which can better describe the whole set of observations. It is thus more computationally efficient, especially when the number of the...
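The computational point in this abstract is that the covariance being decomposed is built from one prototype (class mean) per class rather than from every sample. A hedged sketch of that idea, not the paper's exact protoface construction; the toy data are assumptions:

```python
import numpy as np

def prototype_subspace(X, y, n_components=1):
    """Eigendecompose the covariance of the class prototypes (means) only."""
    protos = np.vstack([X[y == c].mean(axis=0) for c in np.unique(y)])
    centered = protos - protos.mean(axis=0)
    C = centered.T @ centered / len(protos)   # covariance over one row per class
    w, V = np.linalg.eigh(C)
    return V[:, np.argsort(w)[::-1][:n_components]]

X = np.array([[0.0, 1.0], [0.0, -1.0], [10.0, 1.0], [10.0, -1.0]])
y = np.array([0, 0, 1, 1])
W = prototype_subspace(X, y)   # leading direction spanned by the prototypes
```

Here the classes differ only along the first axis, so the leading prototype direction aligns with it; with many samples but few classes, this decomposition is much smaller than full PCA.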

[Chart: number of search results per year; click on the chart to filter the results by publication year]