Search results for: kernel trick
Number of results: 52,726
When solving data analysis problems it is important to integrate prior knowledge and/or structural invariances. This paper contributes a novel framework for incorporating algebraic invariance structure into kernels. In particular, we show that algebraic properties such as sign symmetries in the data, phase independence, scaling, etc. can be included easily, essentially by performing the kernel tri...
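A common way to realize such an invariance (a minimal sketch of the general idea, not necessarily the paper's exact construction) is to average a base kernel over the symmetry group; for a sign symmetry x -> -x this yields a kernel that is blind to the sign of its inputs:

import numpy as np

def rbf(x, y, gamma=1.0):
    # Base RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.exp(-gamma * np.dot(d, d))

def sign_invariant_kernel(x, y, base=rbf):
    # Averaging the base kernel over the sign-flip group gives a
    # valid kernel with k(x, y) = k(-x, y) = k(x, -y).
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 0.25 * (base(x, y) + base(-x, y) + base(x, -y) + base(-x, -y))

Since a sum of kernels is again a kernel, the averaged function remains positive semidefinite.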
The Modified Quadratic Discriminant Function (MQDF) was first proposed by Kimura et al. to improve the performance of the Quadratic Discriminant Function, which can be seen as a dot-product method via eigen-decomposition of the covariance matrix of each class. It is therefore possible to extend MQDF to a high-dimensional space by the kernel trick. This paper presents a new kernel-based method for pattern recognition...
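The dot-product structure the snippet refers to can be made concrete: the Mahalanobis term of the quadratic discriminant function decomposes into squared projections onto the eigenvectors of the class covariance. A minimal sketch (the function name and the regularization floor are illustrative, not from the paper):

import numpy as np

def qdf_score(x, mu, cov, eps=1e-8):
    # Quadratic discriminant score of x for one class, written via the
    # eigen-decomposition of the class covariance: the Mahalanobis term
    # becomes a sum of squared dot products with the eigenvectors,
    # which is the form the kernel trick can replace.
    vals, vecs = np.linalg.eigh(cov)
    vals = np.clip(vals, eps, None)
    proj = vecs.T @ (np.asarray(x, float) - mu)  # dot products with eigenvectors
    return -np.sum(proj ** 2 / vals) - np.sum(np.log(vals))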
This work presents kernel functions that can be used in conjunction with the Support Vector Machine (SVM) learning algorithm to solve the automatic text classification task. Initially, the Vector Space Model for text processing is presented. According to this model, a text is seen as a set of vectors in a high-dimensional space; then extensions and alternative models are derived, and some preproc...
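As a rough illustration of the vector space model pipeline (a sketch using scikit-learn with a toy corpus; the paper's own kernels and preprocessing are not reproduced here):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy corpus and labels, purely illustrative.
docs = ["the kernel trick maps data implicitly",
        "stock markets fell sharply today",
        "support vector machines classify text",
        "quarterly earnings beat expectations"]
labels = ["ml", "finance", "ml", "finance"]

# TF-IDF turns each text into a vector in a high-dimensional space,
# on which the SVM then learns a separating hyperplane.
clf = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
clf.fit(docs, labels)
print(clf.predict(["kernel methods for classification"]))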
Most machine learning algorithms, such as classification or regression, treat the individual data point as the object of interest. Here we consider extending machine learning algorithms to operate on groups of data points. We suggest treating a group of data points as a set of i.i.d. samples from an underlying feature distribution for the group. Our approach is to generalize kernel machines fro...
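One standard way to realize this (a sketch of the mean-embedding idea; the paper's generalization may differ in detail) is to define the kernel between two groups as the inner product of their mean feature maps, i.e. the average of a base kernel over all cross pairs:

import numpy as np

def rbf_gram(X, Y, gamma=1.0):
    # Pairwise RBF kernel between the rows of X and Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def set_kernel(A, B, gamma=1.0):
    # Kernel between two groups of i.i.d. samples: the inner product
    # of the groups' mean feature embeddings, computed as the mean of
    # the base kernel over all pairs (a in A, b in B).
    return rbf_gram(np.asarray(A, float), np.asarray(B, float), gamma).mean()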
In this paper, we present a simple and straightforward primal-dual support vector machine formulation of the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer's theorem), kernel PCA is obtained as introduced by Schölkopf et al. (2002). While the least squares support vector machine ...
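The construction the snippet refers to can be sketched directly from a Gram matrix (a textbook sketch of kernel PCA, not the LS-SVM formulation the paper develops):

import numpy as np

def kernel_pca(K, n_components=2):
    # Kernel PCA from a precomputed Gram matrix K: center the implicit
    # feature map, then eigendecompose the centered Gram matrix.
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)        # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]
    vals = np.clip(vals[:n_components], 1e-12, None)
    # Coordinates of the training points along the principal axes.
    return vecs[:, :n_components] * np.sqrt(vals)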
This paper provides new insight into kernel methods by using data selection. The kernel trick is used to select from the data a relevant subset forming a basis in a feature space F. Thus the selected vectors define a subspace in F. Then, the data is projected onto this subspace, where classical algorithms are applied. We show that kernel methods like generalized discriminant analysis (Neural Co...
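The projection step can be written purely in kernel terms (an illustrative sketch; the paper's selection criterion itself is not shown). If s vectors have been selected, the projected data is recovered from two kernel blocks:

import numpy as np

def project_onto_subset(K_ns, K_ss, eps=1e-12):
    # Coordinates of all n points after projection onto the feature-space
    # subspace spanned by the s selected vectors.
    # K_ns: n x s kernel block between all points and the selected subset.
    # K_ss: s x s kernel block among the selected points.
    vals, vecs = np.linalg.eigh(K_ss)
    vals = np.clip(vals, eps, None)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T  # K_ss^(-1/2)
    # Inner products of the projected points equal K_ns K_ss^{-1} K_ns^T.
    return K_ns @ inv_sqrt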
The minimum regularized covariance determinant method (MRCD) is a robust estimator of multivariate location and scatter, which detects outliers by fitting a covariance matrix to the data. Its regularization ensures that the estimate is well-conditioned in any dimension. MRCD assumes that the non-outlying observations are roughly elliptically distributed, but many datasets are not of that form. Moreover, the computation time increases substanti...
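The regularization aspect can be illustrated in isolation (a sketch of covariance shrinkage only; this is not the full MRCD algorithm, which additionally searches for the most concentrated subset of the data):

import numpy as np

def regularized_cov(X, rho=0.1):
    # Shrink the sample covariance toward the identity so that the
    # estimate stays well-conditioned even when the dimension exceeds
    # the sample size.
    S = np.cov(np.asarray(X, float), rowvar=False)
    return rho * np.eye(S.shape[0]) + (1 - rho) * S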
We study the branch divisors on the boundary of the canonical toroidal compactification of ball quotients. We show a criterion, the low slope cusp form trick, for proving that ball quotients are of general type. Moreover, we classify when irregular cusps exist in the case of the discriminant kernel and construct concrete examples for some arithmetic subgroups. As another direction of study, the complex ball is embedded into a Hermitian symmetric do...
Regularized kernel discriminant analysis (RKDA) performs linear discriminant analysis in the feature space via the kernel trick. Its performance depends on the selection of kernels. In this paper, we consider the problem of multiple kernel learning (MKL) for RKDA, in which the optimal kernel matrix is obtained as a linear combination of pre-specified kernel matrices. We show that the kernel lea...
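How the optimal weights are learned is the subject of the paper; the combination itself is straightforward (a minimal sketch; non-negative weights keep the result a valid kernel):

import numpy as np

def combine_kernels(grams, weights):
    # Linear combination of pre-specified Gram matrices. Non-negative
    # weights preserve positive semidefiniteness, so the result is
    # again a valid kernel matrix.
    weights = np.asarray(weights, dtype=float)
    assert (weights >= 0).all(), "weights must be non-negative"
    return sum(w * K for w, K in zip(weights, grams))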
This paper corrects the proof of Theorem 2 from Gower's paper [3, page 5]. The correction is needed in order to establish the existence of the kernel function commonly used in the kernel trick, e.g. for the k-means clustering algorithm, on the basis of a distance matrix. The scope of the correction is explained in Section 2. 1 The background problem Kernel-based k-means clustering algorithm (clu...
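The construction at issue is the classical double-centering step that turns a distance matrix into an inner-product (kernel) matrix (a standard sketch; the paper's correction concerns when the resulting matrix is in fact positive semidefinite):

import numpy as np

def kernel_from_distances(D):
    # Gower double centering: K = -0.5 * J @ (D**2) @ J, with J the
    # centering matrix. For a Euclidean distance matrix D, K is the
    # Gram matrix of the centered configuration.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * J @ (np.asarray(D, float) ** 2) @ J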