Search results for: kernel trick

Number of results: 52726

2009
Xiao-Ming Wu, Anthony Man-Cho So, Zhenguo Li, Shuo-Yen Robert Li

Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs). These include Maximum Variance Unfolding (MVU) (Weinberger et al., 2004) in nonlinear dimensionality reduction, and Pairwise Constraint Propagation (PCP) (Li et al., 2008) in constrained clustering. Although in theory SDPs can be...
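
For illustration only, a minimal sketch of how MVU can be posed as an SDP over the Gram (kernel) matrix, assuming cvxpy and scikit-learn are available; the toy data, neighborhood size, and variable names are our assumptions, not the authors' formulation.

```python
# Minimal Maximum Variance Unfolding (MVU) as a semidefinite program.
# The learned Gram matrix K plays the role of the kernel produced by the trick.
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))           # toy data: 20 points in 3-D
n = X.shape[0]
G = kneighbors_graph(X, n_neighbors=4, mode="connectivity").toarray()

K = cp.Variable((n, n), PSD=True)      # Gram matrix of the unfolded embedding
constraints = [cp.sum(K) == 0]         # center the embedding
for i in range(n):
    for j in range(n):
        if G[i, j]:                    # preserve local distances on the k-NN graph
            d2 = np.sum((X[i] - X[j]) ** 2)
            constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == d2)

prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)   # maximize variance
prob.solve()

# Embedding: top eigenvectors of the optimal K, scaled by sqrt of eigenvalues.
w, V = np.linalg.eigh(K.value)
embedding = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))
```

Maximizing the trace spreads the embedding as far as the local distance constraints allow, which is the "unfolding" idea behind MVU.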

2004
Zhili Wu, Chunhung Li

This paper presents a novel Kernel enabled Rival Penalised Competitive Learning (KRPCL) algorithm for clustering. Not only is it able to perform correct clustering without restriction on cluster shape, but it can also automatically find the number of clusters. This algorithm generalizes the Rival Penalised Competitive Learning (RPCL) algorithm [15] by using the state-of-the-art kernel trick [2]. Mo...
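
A hedged sketch of the basic RPCL update that such a method builds on, shown in plain input space (the learning rates and toy data are illustrative assumptions; a kernelised variant would replace the Euclidean distances below with distances in the kernel-induced feature space):

```python
# Plain Rival Penalised Competitive Learning (RPCL): the winning unit is pulled
# towards the sample while the second-best ("rival") unit is pushed away, so
# superfluous units are driven out and the cluster number emerges automatically.
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
units = rng.normal(1.5, 1.0, (5, 2))    # deliberately more units than clusters
lr_win, lr_rival = 0.05, 0.005           # de-learning rate is much smaller

for epoch in range(20):
    for x in rng.permutation(X):
        d = np.linalg.norm(units - x, axis=1)
        winner, rival = np.argsort(d)[:2]
        units[winner] += lr_win * (x - units[winner])     # attract winner
        units[rival] -= lr_rival * (x - units[rival])     # repel rival
```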

2003
Gavin C. Cawley, Nicola L.C. Talbot

Mika et al. (in: Neural Network for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a ...
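
As background for this entry, a minimal numpy sketch of the binary kernel Fisher discriminant in its dual form (the RBF width, regulariser, and toy data are our assumptions; this is not the leave-one-out procedure the paper derives):

```python
# Minimal binary kernel Fisher discriminant (KFD) with an RBF kernel,
# expressed purely through kernel evaluations (the kernel trick).
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(1, 1, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

K = rbf_kernel(X, X, gamma=0.5)
M, N = [], np.zeros((len(X), len(X)))
for c in (0, 1):
    Kc = K[:, y == c]                        # n x n_c block for class c
    nc = Kc.shape[1]
    M.append(Kc.mean(axis=1))                # dual representation of the class mean
    N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T

mu = 1e-3                                     # regulariser; N alone is singular
alpha = np.linalg.solve(N + mu * np.eye(len(X)), M[0] - M[1])

def project(x_new):
    """Projection of new points onto the discriminant, using kernels only."""
    return rbf_kernel(np.atleast_2d(x_new), X, gamma=0.5) @ alpha
```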

2008
George G. Cabral, Adriano Lorena Inácio de Oliveira

One-class classification is an important problem with applications in several different areas such as outlier detection and machine monitoring. In this paper we propose a novel method for one-class classification, referred to as kernel kNNDDSRM. This is a modification of an earlier algorithm, the kNNDDSRM, which aims to make the method able to build more flexible descriptions with the use of th...
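
For context, a hedged sketch of kernel-based one-class classification, here using scikit-learn's OneClassSVM rather than the kNNDDSRM family described above; the ν and γ values are illustrative.

```python
# One-class classification: learn a description of the "normal" class only,
# then flag points that fall outside it (outlier / novelty detection).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
normal = rng.normal(0, 1, (200, 2))              # training data: normal class only
test = np.vstack([rng.normal(0, 1, (10, 2)),
                  rng.normal(6, 1, (10, 2))])     # half inliers, half outliers

clf = OneClassSVM(kernel="rbf", nu=0.05, gamma=0.5).fit(normal)
print(clf.predict(test))                          # +1 = accepted as normal, -1 = outlier
```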

Journal: Journal of Telecommunications and Information Technology, 2023


Over the last few years, kernel adaptive filters have gained in importance as the kernel trick started to be applied to classic linear filters in order to address various regression and time-series prediction issues in nonlinear environments. In this paper, we study a recursive method for identifying finite impulse response (FIR) systems based on binary-value observation systems. We also apply the projection (RP) algorithm, yie...
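
A minimal sketch in the spirit of the kernel adaptive filters mentioned above: kernel least-mean-squares (KLMS), which applies the kernel trick to the classic linear LMS filter. The step size, kernel width, and toy nonlinear system are our assumptions, not the paper's setup.

```python
# Kernel least-mean-squares (KLMS): a classic LMS filter made nonlinear by
# replacing inner products with an RBF kernel (the kernel trick).
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((a - b) ** 2))

rng = np.random.default_rng(4)
u = rng.normal(size=500)
d = np.tanh(0.8 * u + 0.4 * np.roll(u, 1))       # toy nonlinear FIR-like system

centers, coeffs, eta = [], [], 0.2                # growing dictionary of centres
for t in range(1, len(u)):
    x = np.array([u[t], u[t - 1]])                # input regressor (order 2)
    y_hat = sum(a * rbf(c, x) for a, c in zip(coeffs, centers))
    err = d[t] - y_hat
    centers.append(x)                             # KLMS stores every input ...
    coeffs.append(eta * err)                      # ... weighted by its scaled error
```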

2012
Chuang Lin, Binghui Wang, Zheming Lu, Kuanjiu Zhou

Kernel Fisher discriminant analysis (KFD) is an effective method to extract nonlinear discriminant features of input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. In order to overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...

2015
Nikolaos Tsapanos, Anastasios Tefas, Nikolaos Nikolaidis, Ioannis Pitas

Data clustering is an unsupervised learning task that has found many applications in various scientific fields. The goal is to find subgroups of closely related data samples (clusters) in a set of unlabeled data. A classic clustering algorithm is the so-called k-Means. It is very popular; however, it cannot handle cases in which the clusters are not linearly separable. Kernel k-Means...
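
A compact sketch of the feature-space distance computation at the heart of kernel k-Means, assuming an RBF kernel and toy data of our choosing (a production implementation would add better initialisation and convergence checks):

```python
# Kernel k-Means: assign each point to the cluster whose mean in the
# kernel-induced feature space is closest, using only kernel evaluations.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-2, 0.5, (60, 2)), rng.normal(2, 0.5, (60, 2))])
K = rbf_kernel(X, X, gamma=1.0)
n, k = len(X), 2
labels = rng.integers(0, k, n)                    # random initial assignment

for _ in range(20):
    dist = np.zeros((n, k))
    for c in range(k):
        mask = labels == c
        nc = mask.sum()
        if nc == 0:                               # guard against empty clusters
            dist[:, c] = np.inf
            continue
        # ||phi(x) - m_c||^2 = K_xx - (2/nc) * sum_j K_xj + (1/nc^2) * sum_jl K_jl
        dist[:, c] = (np.diag(K)
                      - 2.0 * K[:, mask].sum(axis=1) / nc
                      + K[np.ix_(mask, mask)].sum() / nc ** 2)
    labels = dist.argmin(axis=1)
```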

2007
Jieping Ye, Shuiwang Ji, Jianhui Chen

Regularized Kernel Discriminant Analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. Its performance depends on the selection of kernels. We show that this kernel learning problem can be formulated as a semidefinite program (SDP). Based on the equivalence relationship between RKDA and least square problems in the binary-class case, we propose an effic...
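
To make the least-squares connection mentioned here concrete, a hedged sketch in which a binary kernel discriminant is obtained by regressing ±1 targets with kernel ridge regression; this illustrates the equivalence in spirit only and is not the authors' SDP formulation (kernel parameters and data are assumptions):

```python
# Binary-class link between regularised kernel discriminant analysis and
# least squares: regressing +/-1 labels with a kernel ridge model yields a
# discriminant direction comparable (up to scaling) to the RKDA solution.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
t = np.array([-1.0] * 50 + [+1.0] * 50)          # +/-1 targets encode the two classes

model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-2).fit(X, t)
pred = np.sign(model.predict(X))                 # sign of the regression = class label
print((pred == t).mean())
```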

2007
Sepp Hochreiter, Michael C. Mozer

We address the problem of identifying multiple independent speech sources from a single signal that is a mixture of the sources. Because the problem is ill-posed, standard independent component analysis (ICA) approaches, which try to invert the mixing matrix, fail. We show how the unsupervised problem can be transformed into a supervised regression task which is then solved by support vector regre...
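
For reference, a small sketch of support vector regression of the kind such a supervised reformulation would rely on, shown on a generic noisy signal rather than a speech mixture (kernel and hyper-parameters are illustrative):

```python
# Support vector regression (SVR) with an RBF kernel on a toy 1-D signal.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(7)
t = np.linspace(0, 4 * np.pi, 300).reshape(-1, 1)
signal = np.sin(t).ravel() + 0.1 * rng.normal(size=300)    # noisy target

svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=1.0).fit(t, signal)
recovered = svr.predict(t)                                 # smooth nonlinear fit
```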

Journal: Neural Networks: the official journal of the International Neural Network Society, 2009
Zenglin Xu, Kaizhu Huang, Jianke Zhu, Irwin King, Michael R. Lyu

Kernel methods have been widely used in pattern recognition. Many kernel classifiers such as Support Vector Machines (SVM) assume that data can be separated by a hyperplane in the kernel-induced feature space. These methods do not consider the data distribution and cannot readily output probabilities or confidences for classification. This paper proposes a novel Kernel-based Maximum A Pos...
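
To illustrate the point about probabilities: a standard SVM returns only a signed decision value, and probability estimates must be added afterwards, e.g. via Platt scaling as in the sketch below (the data and parameters are illustrative, and this is not the MAP method the paper proposes):

```python
# A plain SVM separates with a hyperplane in the kernel feature space but gives
# no probabilities; probability=True adds post-hoc Platt scaling in scikit-learn.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(8)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf", gamma=0.5, probability=True).fit(X, y)
print(clf.decision_function(X[:3]))     # raw signed distances to the hyperplane
print(clf.predict_proba(X[:3]))         # calibrated class probabilities
```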
