Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
Authors
Abstract
Variable selection serves a dual purpose in statistical classification problems: it enables one to identify the input variables which separate the groups well, and a classification rule based on these variables frequently has a lower error rate than the rule based on all the input variables. Kernel Fisher discriminant analysis (KFDA) is a recently proposed powerful classification procedure, frequently applied in cases characterised by large numbers of input variables. The important problem of eliminating redundant input variables before implementing KFDA is addressed in this paper. A backward elimination approach is recommended, and two criteria which can be used for recursive elimination of input variables are proposed and investigated. Their performance is evaluated on several data sets and in a simulation study. © 2006 Elsevier B.V. All rights reserved.
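A minimal sketch of the backward elimination idea described above, assuming scikit-learn's SVC with an RBF kernel as a stand-in for a KFDA classifier and cross-validated accuracy as the elimination criterion; the paper's own proposed criteria are not reproduced here.

```python
# Sketch of recursive backward variable elimination for a kernelised
# discriminant classifier. An RBF-kernel SVC and cross-validated
# accuracy are illustrative stand-ins, not the paper's KFDA criteria.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def cv_score(X, y, cols):
    """Cross-validated accuracy using only the variables in `cols`."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
    return cross_val_score(clf, X[:, cols], y, cv=5).mean()


def backward_elimination(X, y, n_keep):
    """Recursively drop the variable whose removal hurts the score least."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        scores = [cv_score(X, y, [c for c in remaining if c != j])
                  for j in remaining]
        # The variable whose removal yields the highest score sacrifices
        # the least class separability, so it is eliminated first.
        remaining.pop(int(np.argmax(scores)))
    return remaining


X, y = load_breast_cancer(return_X_y=True)
selected = backward_elimination(X, y, n_keep=10)
print("selected variable indices:", selected)
```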
Similar Resources
Adaptive Quasiconformal Kernel Fisher Discriminant Analysis via Weighted Maximum Margin Criterion
Kernel Fisher discriminant analysis (KFD) is an effective method for extracting nonlinear discriminant features of input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. In order to overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...
Supervised Kernel Principal Component Analysis by Most Expressive Feature Reordering
The presented paper is concerned with feature space derivation through feature selection. The selection is performed on results of kernel Principal Component Analysis (kPCA) of input data samples. Several criteria that drive feature selection process are introduced and their performance is assessed and compared against the reference approach, which is a combination of kPCA and most expressive f...
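A minimal sketch of the general idea of supervised selection among kernel PCA features, assuming a simple two-class Fisher-score ranking of the components; the specific criteria investigated in the cited paper are not reproduced here.

```python
# Sketch: rank KernelPCA components by a two-class Fisher score and
# keep the most discriminative ones. The ranking criterion is an
# illustrative assumption, not the cited paper's method.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
Z = KernelPCA(n_components=20, kernel="rbf").fit_transform(
    StandardScaler().fit_transform(X))

# Fisher score per component: between-class over within-class scatter.
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
v0, v1 = Z[y == 0].var(axis=0), Z[y == 1].var(axis=0)
fisher = (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

top = np.argsort(fisher)[::-1][:5]  # keep the 5 most discriminative components
print("selected kPCA component indices:", top)
```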
Rapid and Brief Communication: An efficient renovation on kernel Fisher discriminant analysis and face recognition experiments
A reformative kernel Fisher discriminant analysis algorithm, which can deal with two-class problems as well as problems with more than two classes, is proposed. In the novel algorithm, it is supposed that the discriminant vector in feature space can be approximated by a linear combination of a subset of the training samples, called "significant nodes". If the "significant nodes" are found out, ...
Fault Diagnosis Based on Improved Kernel Fisher Discriminant Analysis
There are two fundamental problems with Kernel Fisher Discriminant Analysis (KFDA) for nonlinear fault diagnosis. The first is that the classification performance of KFDA between the normal data and the fault data degenerates when overlapping samples exist. The second is that the computational cost of the kernel matrix becomes large as the number of training samples increases. Aiming at the tw...
Mental Arithmetic Task Recognition Using Effective Connectivity and Hierarchical Feature Selection From EEG Signals
Introduction: Mental arithmetic analysis based on the electroencephalogram (EEG) signal, for monitoring the state of the user's brain functioning, can be helpful for understanding psychological disorders such as attention deficit hyperactivity disorder, autism spectrum disorder, or dyscalculia, which involve difficulty in learning or understanding arithmetic. Most mental arithmetic recogni...
Journal: Computational Statistics & Data Analysis
Volume: 51, Issue: -
Pages: -
Publication year: 2006