Search results for: kernel trick
Number of results: 52,726
We overview methods of fuzzy c-means clustering as representative techniques of unsupervised classification by soft computing. The basic framework, the alternate optimization algorithm originally proposed by Dunn and Bezdek, is reviewed, and two more objective functions are introduced. An additional variable controlling volume size is included as an extension. Moreover, a method of the kerne...
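The alternate optimization the abstract refers to iterates between updating cluster centroids and fuzzy memberships. A minimal sketch of standard fuzzy c-means (not the paper's extended objectives; the fuzzifier m and the random initialization are illustrative choices):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Alternate optimization for fuzzy c-means (Dunn/Bezdek style).

    X: (n, d) data; c: number of clusters; m > 1: fuzzifier.
    Returns membership matrix U (n, c) and centroids V (c, d).
    """
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # rows sum to 1
    for _ in range(n_iter):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]        # centroid update
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) + 1e-12
        U = 1.0 / (d2 ** (1.0 / (m - 1)))             # membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, V
```

With m close to 1 the memberships approach hard (k-means-like) assignments; larger m makes them softer.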
Kernel Principal Component Analysis (KPCA) is a popular generalization of linear PCA that allows non-linear feature extraction. In KPCA, data in the input space is mapped to a (usually higher-dimensional) feature space where the data can be linearly modeled. The feature space is typically induced implicitly by a kernel function, and linear PCA in the feature space is performed via the kernel tric...
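Concretely, KPCA never forms the feature map: it centers the Gram matrix and eigen-decomposes it. A minimal sketch, assuming an RBF kernel (the kernel choice and gamma are illustrative):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA: linear PCA in feature space via the kernel trick."""
    n = len(X)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                           # Gram matrix k(x_i, x_j)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one        # center in feature space
    vals, vecs = np.linalg.eigh(Kc)                   # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                                # projected training data
```

The double-centering step is what implements "PCA on centered features" purely through kernel evaluations.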
The Kernel Fisher’s Discriminant (KFD) is a non-linear classifier which has proven to be powerful and competitive with several state-of-the-art classifiers. Its main ingredient is the kernel trick, which allows the efficient computation of Fisher’s Linear Discriminant in feature space. However, it assumes an equal covariance structure for all transformed classes, which is not true in many applica...
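The "kernel trick" ingredient means the Fisher criterion is written entirely in Gram-matrix terms. A two-class sketch (the RBF kernel, regularizer, and parameter values are illustrative assumptions, not this paper's extension):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def kfd_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class Kernel Fisher Discriminant: alpha = (N + reg*I)^{-1} (m1 - m0),
    where m_c are class kernel means and N is the within-class scatter in
    kernel form -- all computed from the Gram matrix only."""
    K = rbf(X, X, gamma)
    m = [K[:, y == c].mean(axis=1) for c in (0, 1)]   # class kernel means
    N = np.zeros((len(X), len(X)))
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    return np.linalg.solve(N + reg * np.eye(len(X)), m[1] - m[0])

def kfd_project(X_train, alpha, X_new, gamma=1.0):
    return rbf(X_new, X_train, gamma) @ alpha
```

The implicit equal-covariance assumption the abstract criticizes enters through the single pooled scatter matrix N.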
The single-class verification framework is gaining increasing attention for problems involving authentication and retrieval. In this paper, nonlinear features are extracted using the kernel trick. The class of interest is modeled by using all the available samples rather than a single representative sample. Kernel selection is used to enhance the class specific feature set. A tunable objective ...
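One simple way to "model the class of interest using all available samples" with kernel-extracted features is the distance to the class mean in feature space, which the kernel trick makes computable without the explicit mapping. A sketch under that assumption (the RBF kernel and gamma are illustrative; this is not the paper's tunable objective):

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def distance_to_class_mean(X_class, X_new, gamma=1.0):
    """Squared feature-space distance to the class mean, via kernels only:
    ||phi(x) - m||^2 = k(x,x) - (2/n) sum_i k(x,x_i) + (1/n^2) sum_ij k(x_i,x_j)."""
    k_xx = 1.0                                  # RBF kernel: k(x, x) = 1
    k_xc = rbf(X_new, X_class, gamma)
    k_cc = rbf(X_class, X_class, gamma)
    return k_xx - 2.0 * k_xc.mean(axis=1) + k_cc.mean()
```

Thresholding this distance yields a one-class verifier: small distance means the probe resembles the enrolled class.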
Support Vector Machines (SVMs) are well-established Machine Learning (ML) algorithms. They rely on the fact that i) linear learning can be formalized as a well-posed optimization problem; ii) non-linear learning can be brought into linear learning thanks to the kernel trick and the mapping of the initial search space onto a high-dimensional feature space. The kernel is designed by the ML exper...
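Point ii) can be seen in miniature with a kernelized perceptron: XOR is not linearly separable in the input space, but becomes linearly separable in the feature space induced by an RBF kernel, and the algorithm only ever touches Gram-matrix entries. (The perceptron stands in for the SVM's optimizer here; gamma=2 is an illustrative choice.)

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def kernel_perceptron(X, y, gamma=1.0, epochs=20):
    """Kernel perceptron: the weight vector lives in feature space and is
    never formed explicitly; predictions use kernel evaluations only."""
    K = rbf(X, X, gamma)
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i in range(len(X)):
            if np.sign(K[i] @ (alpha * y)) != y[i]:
                alpha[i] += 1.0                 # mistake-driven update
    return alpha

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])                    # XOR labels
alpha = kernel_perceptron(X, y, gamma=2.0)
pred = np.sign(rbf(X, X, 2.0) @ (alpha * y))    # all four classified correctly
```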
We present a family of positive definite kernels on measures, characterized by the fact that the value of the kernel between two measures is a function of their sum. These kernels can be used to derive kernels on structured objects, such as images and texts, by representing these objects as sets of components, such as pixels or words, or more generally as measures on the space of components. Se...
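A classical positive definite kernel that depends only on the sum of its arguments is k(x, y) = prod_i 1/(x_i + y_i) on positive vectors, PD because 1/s = ∫₀^∞ e^{-ts} dt expresses it as an inner product of the features t ↦ e^{-t x_i}. This is an illustrative instance of the "function of the sum" idea, not necessarily the family of the abstract; the numerical check below confirms the Gram matrix is PSD:

```python
import numpy as np

def sum_kernel(x, y):
    """k(x, y) = prod_i 1/(x_i + y_i): a function of x + y only."""
    return 1.0 / np.prod(x + y)

rng = np.random.default_rng(0)
H = rng.uniform(0.5, 2.0, size=(8, 3))          # positive "histograms"
G = np.array([[sum_kernel(a, b) for b in H] for a in H])
# smallest eigenvalue should be non-negative up to roundoff
min_eig = np.linalg.eigvalsh(G).min()
```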
It is known that the classification performance of the Support Vector Machine (SVM) is strongly affected by the parameters of the kernel trick and by the regularization parameter, C. Thus, in this article, we propose a study to find a suitable kernel with which the SVM may achieve good generalization performance, as well as the parameters to use. We need to analyze the behavio...
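The parameter search the abstract describes amounts to scoring (kernel parameter, regularization) pairs on held-out data. A self-contained sketch, using kernel ridge as a stand-in scorer for the SVM (the grid values, RBF kernel, and XOR-like data are illustrative assumptions):

```python
import numpy as np

def rbf(A, B, gamma):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

def holdout_accuracy(Xtr, ytr, Xva, yva, gamma, lam):
    """Fit kernel ridge alpha = (K + lam*I)^{-1} y, predict by sign."""
    K = rbf(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr.astype(float))
    pred = np.sign(rbf(Xva, Xtr, gamma) @ alpha)
    return (pred == yva).mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] * X[:, 1])                  # XOR-like: needs a non-linear kernel
Xtr, ytr, Xva, yva = X[:100], y[:100], X[100:], y[100:]

grid = [(g, l) for g in (0.1, 1.0, 10.0) for l in (0.01, 0.1, 1.0)]
best = max(grid, key=lambda p: holdout_accuracy(Xtr, ytr, Xva, yva, *p))
best_acc = holdout_accuracy(Xtr, ytr, Xva, yva, *best)
```

In practice one would cross-validate rather than use a single holdout split, but the grid-scoring structure is the same.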
Fisher's linear discriminant analysis (LDA) is a classical multivariate technique for both dimension reduction and classification. The data vectors are transformed into a low-dimensional subspace such that the class centroids are spread out as much as possible. In this subspace, LDA works as a simple prototype classifier with linear decision boundaries. However, in many applications the linear bo...
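For two classes, the subspace the abstract describes is the single Fisher direction w = Sw⁻¹(m₁ − m₀), which maximizes between-class over within-class scatter. A minimal sketch (the synthetic Gaussian data in the test is illustrative):

```python
import numpy as np

def fisher_direction(X, y):
    """Fisher's LDA direction for two classes: w = Sw^{-1} (m1 - m0)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    return np.linalg.solve(Sw, m1 - m0)
```

Classifying a point by its nearest projected class centroid then gives exactly the linear decision boundary the abstract mentions.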
In this paper, we present a novel feature extraction called the protoface. While the Eigenface is based on principal component analysis and the Fisherface on Fisher’s linear discriminant analysis, the protoface only requires decomposing the covariance matrix of the prototypes, which can better describe the whole set of observations. It is thus more computationally efficient, especially when the number of the...
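The efficiency claim follows from the eigenproblem size: with c class prototypes the covariance has rank at most c − 1, so only a c × c decomposition is needed. A hypothetical sketch of this idea (`protoface_basis`, the choice of class means as prototypes, and the Gram-matrix shortcut are my assumptions, not the paper's exact formulation):

```python
import numpy as np

def protoface_basis(X, y):
    """Eigen-decompose the covariance of class prototypes (here: class means)
    instead of all samples, via the small c x c Gram matrix as in Eigenfaces."""
    P = np.stack([X[y == c].mean(axis=0) for c in np.unique(y)])  # prototypes
    Pc = P - P.mean(axis=0)
    G = Pc @ Pc.T / len(P)                       # c x c instead of d x d
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(vals)[::-1]
    W = Pc.T @ vecs[:, order]                    # map back to input space
    return W / (np.linalg.norm(W, axis=0) + 1e-12)
```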