Affine Feature Extraction: A Generalization of the Fukunaga-Koontz Transformation

Authors

  • Wenbo Cao
  • Robert M. Haralick
Abstract

Dimension reduction methods are often applied in machine learning and data mining problems. Linear subspace methods are the most commonly used, such as principal component analysis (PCA), Fisher's linear discriminant analysis (FDA), and common spatial pattern (CSP). In this paper, we describe a novel feature extraction method for binary classification problems. Instead of finding linear subspaces, our method finds lower-dimensional affine subspaces satisfying a generalization of the Fukunaga–Koontz transformation (FKT). The proposed method has a closed-form solution and thus can be solved very efficiently. Under the normality assumption, our method can be seen as finding an optimal truncated spectrum of the Kullback–Leibler divergence. We also show that FDA and CSP are special cases of our proposed method under the normality assumption. Experiments on simulated data show that our method performs better than PCA and FDA on data distributed on two cylinders, even one within the other. We also show that, on several real data sets, our method provides a statistically significant improvement in test-set accuracy over FDA, CSP and FKT. The proposed method can therefore be used as another preliminary data-exploration tool to help solve machine learning and data mining problems.

© 2008 Elsevier Ltd. All rights reserved.
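The paper generalizes the Fukunaga–Koontz transformation. As background (not the paper's affine generalization), a minimal NumPy sketch of the classical FKT for two class covariance matrices might look like the following; the function name and the toy data are illustrative assumptions:

```python
import numpy as np

def fukunaga_koontz(S1, S2):
    """Classical Fukunaga-Koontz transform: a basis that simultaneously
    diagonalizes two class covariance matrices. The eigenvalues of the
    two transformed matrices sum to 1, so directions dominant for class 1
    are weakest for class 2 (useful for binary discrimination)."""
    # Whiten the summed covariance S = S1 + S2 via its inverse square root.
    S = S1 + S2
    evals, evecs = np.linalg.eigh(S)
    P = evecs @ np.diag(evals ** -0.5) @ evecs.T  # symmetric S^{-1/2}
    # Eigendecompose the whitened class-1 covariance.
    lam, V = np.linalg.eigh(P @ S1 @ P)
    W = P @ V  # FKT basis (columns); W.T @ (S1 + S2) @ W = I
    return W, lam

# Toy example: two random symmetric positive-definite matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); S1 = A @ A.T
B = rng.standard_normal((5, 5)); S2 = B @ B.T
W, lam = fukunaga_koontz(S1, S2)
D1 = W.T @ S1 @ W  # diag(lam)
D2 = W.T @ S2 @ W  # diag(1 - lam)
```

Because `D1 + D2` is the identity, ranking basis vectors by `lam` gives the directions most informative for class 1 and, simultaneously, least informative for class 2.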


Similar Articles

A Nonlinear Feature Extraction Algorithm Using Distance Transformation

Feature extraction has been recognized as a useful technique for pattern recognition. Feature extraction is accomplished by constructing a mapping from the measurement space to a feature space. Often, the mapping is chosen from an arbitrarily specified parametric family by optimizing the parameters with respect to a separability criterion. In the approach described here, a scalar distance funct...


Feature Selection for Improved Classification

We apply the feature selection technique of Fukunaga and Koontz, an extension of the Karhunen–Loève transformation, to spoken letter recognition. Feedforward networks trained for letter-pair discrimination with the new features show up to reduction in classifier error rate relative to networks trained with spectral coefficients. This performance increase is accompanied by a reduction in feature dimen...


A General Methodology for Simultaneous Representation and Discrimination of Multiple Object Classes

In this paper we address a new general method for linear and nonlinear feature extraction for simultaneous representation and classification. We call this approach the maximum representation and discrimination feature (MRDF) method. We develop a novel nonlinear eigenfeature extraction (NLEF) technique to represent data with closed-form solutions and use it to derive a nonlinear MRDF algorithm. R...


Fukunaga-Koontz Transform for Small Sample Size Problems

In this paper, we propose the Fukunaga-Koontz Transform (FKT) as applied to Small-Sample Size (SSS) problems and formulate a feature scatter matrix based equivalent of the FKT. We establish the classical Linear Discriminant Analysis (LDA) analogy of the FKT and apply it to a SSS situation. We demonstrate the significant computational savings and robustness associated with our approach using a m...


Evaluation of Hyperspectral Image Classification Using Random Forest and Fukunaga-Koontz Transform

Since hyperspectral imagery (HSI) (or remotely sensed data) provides more information (or additional bands) than traditional gray level and color images, it can be used to improve the performance of image classification applications. A hyperspectral image presents spectral features (also called spectral signature) of regions in the image as well as spatial features. Feature reduction, selection...



Journal:

Volume   Issue

Pages  -

Publication date: 2007