Comparison of algorithms that select features for pattern classifiers

Authors

  • Mineichi Kudo
  • Jack Sklansky
Abstract

A comparative study of algorithms for large-scale feature selection (where the number of features exceeds 50) is carried out. In the study, the goodness of a feature subset is measured by the leave-one-out correct-classification rate of a nearest-neighbor (1-NN) classifier, and many practical problems are used. A unified way is given to compare algorithms having dissimilar objectives. Based on the results of many experiments, we give guidelines for the use of feature-selection algorithms. In particular, it is shown that sequential floating search methods are suitable for small- and medium-scale problems and genetic algorithms are suitable for large-scale problems. © 1999 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved.
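The wrapper criterion used throughout the study is simple to state: a candidate feature subset is scored by the leave-one-out correct-classification rate of a 1-NN classifier restricted to those features. Below is a minimal sketch of that measure, with a toy forward-selection loop only to show how a search algorithm consumes it; this is not the authors' code, and the data set, helper names, and stopping rule are illustrative assumptions (the paper's SFFS and GA methods are more elaborate).

```python
# Sketch of the subset-goodness measure used in the study:
# leave-one-out correct-classification rate of a 1-NN classifier
# on the selected features. Data set and names are assumptions.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)   # stand-in data set


def loo_1nn_rate(X, y, subset):
    """Leave-one-out 1-NN correct-classification rate on a feature subset."""
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=1),
                             X[:, list(subset)], y, cv=LeaveOneOut())
    return scores.mean()


# Toy sequential forward selection driven by the same criterion.
selected, remaining = [], list(range(X.shape[1]))
while remaining:
    best_f, best_rate = max(
        ((f, loo_1nn_rate(X, y, selected + [f])) for f in remaining),
        key=lambda t: t[1])
    if selected and best_rate <= loo_1nn_rate(X, y, selected):
        break                        # no further improvement
    selected.append(best_f)
    remaining.remove(best_f)
    print(f"added feature {best_f}, LOO 1-NN rate = {best_rate:.3f}")
```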


Related articles

Improving Image Classification by Co-training with Multi-modal Features

We explore the use of co-training to improve the performance of image classification in the setting where multiple classifiers are used and several types of features are available. Features are assigned to classifiers in an optimal manner using hierarchical clustering with a distance metric based on conditional mutual information. The effect of increasing the number of classifiers is then evaluated ...


Comparison of multiwavelet, wavelet, Haralick, and shape features for microcalcification classification in mammograms

We present an evaluation and comparison of the performance of four different texture and shape feature extraction methods for classification of benign and malignant microcalcifications in mammograms. For 103 regions containing microcalcification clusters, texture and shape features were extracted using four approaches: conventional shape quantifiers; co-occurrence-based method of Haralick; wavelet t...


Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets

We present attribute bagging (AB), a technique for improving the accuracy and stability of classifier ensembles induced using random subsets of features. AB is a wrapper method that can be used with any learning algorithm. It establishes an appropriate attribute subset size and then randomly selects subsets of features, creating projections of the training set on which the ensemble classifiers ar...
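A minimal sketch of the random-feature-subset idea described in this snippet, under assumed names, with 1-NN as the base learner and a fixed subset size (the AB method works with any learning algorithm and chooses the subset size with a wrapper step, which is omitted here):

```python
# Sketch of training an ensemble on random feature subsets and combining
# predictions by majority vote. Data set, subset size, and ensemble size
# are illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)             # stand-in data set
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
m, n_members = 10, 25                                   # assumed subset size / ensemble size

# Train each ensemble member on a random projection of the feature space.
members = []
for _ in range(n_members):
    subset = rng.choice(X.shape[1], size=m, replace=False)
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr[:, subset], y_tr)
    members.append((subset, clf))

# Combine member predictions by majority vote.
votes = np.stack([clf.predict(X_te[:, s]) for s, clf in members])
y_pred = np.array([np.bincount(col).argmax() for col in votes.T])
print("ensemble accuracy:", (y_pred == y_te).mean())
```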


Comparison and Combination of Statistical and Neural Network Algorithms for Remote-sensing Image Classification

In recent years, the remote-sensing community has become very interested in applying neural networks to image classification and in comparing the performance of neural networks with that of classical statistical methods. These experimental comparisons pointed out that no single classification algorithm can be regarded as a "panacea". The superiority of one algorithm over the other strongly depends ...


Adaptive Selection of Image Classifiers

Recently, the concept of "Multiple Classifier Systems" was proposed as a new approach to the development of high-performance image classification systems. Multiple Classifier Systems can be used to improve classification accuracy by combining the outputs of classifiers making "uncorrelated" errors. Unfortunately, in real image recognition problems, it may be very difficult to design an ensemble of cla...



Journal:

Volume   Issue

Pages  -

Publication year: 1999