Non-parametric Nearest Neighbor Classification Based on Global Variance Difference

Authors

Abstract

As technology improves, extracting useful information from vast datasets is becoming more urgent. As is well known, k-nearest neighbor classifiers are conceptually simple and easy to implement. They are not without shortcomings, however: (1) they remain sensitive to the choice of k, even when representative attributes are considered in each class; (2) in some cases, the proximity between a test sample and its nearest neighbors cannot be reflected accurately by distance measurements. Here, we propose a non-parametric classification method based on global variance differences. First, the difference in variance is calculated before and after adding the sample to be tested to a class, and this difference is then divided by the variance before the sample is added; the resulting quotient serves as the objective function. In the final step, the sample to be tested is classified into the class with the smallest objective value. We also discuss theoretical aspects of this method: using the Lagrange method, it can be shown that the objective function is optimal when the class centers are averaged. Twelve real datasets from the University of California, Irvine repository are used to compare the proposed algorithm with competitors such as the local mean-based pseudo-nearest neighbor algorithm. According to a comprehensive experimental study, the average accuracy over the 12 datasets is as high as 86.27%, which is far higher than that of the other algorithms. The findings verify that the proposed method produces more dependable results than existing algorithms.
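Based on the abstract's description, the decision rule can be sketched roughly as follows. This is a minimal illustration assuming a total-variance definition (sum of per-feature variances) and the relative-change normalization the abstract describes; the paper's exact formulation may differ.

```python
import numpy as np

def gvd_classify(X_train, y_train, x_test):
    """Assign x_test to the class whose relative variance change is
    smallest when the test sample is added to it (a sketch of the
    global-variance-difference idea; details are assumptions)."""
    best_label, best_score = None, np.inf
    for label in np.unique(y_train):
        cls = X_train[y_train == label]
        var_before = np.sum(np.var(cls, axis=0))       # total variance of the class alone
        augmented = np.vstack([cls, x_test])
        var_after = np.sum(np.var(augmented, axis=0))  # variance after adding the test sample
        score = (var_after - var_before) / var_before  # quotient used as the objective
        if score < best_score:
            best_label, best_score = label, score
    return best_label
```

Intuitively, adding a sample to the class it belongs to perturbs that class's variance the least, so the smallest quotient identifies the most plausible class.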



Similar Articles

Non-parametric Nearest Neighbor with Local Adaptation

The k–Nearest Neighbor algorithm (k-NN) uses a classification criterion that depends on the parameter k. Usually, the value of this parameter must be determined by the user. In this paper we present an algorithm based on the NN technique that does not take the value of k from the user. Our approach evaluates values of k that classified the training examples correctly and takes which classified ...


Non-zero probability of nearest neighbor searching

Nearest Neighbor (NN) searching is a challenging problem in data management and has been widely studied in data mining, pattern recognition and computational geometry. The goal of NN searching is efficiently reporting the nearest data to a given object as a query. In most of the studies both the data and query are assumed to be precise, however, due to the real applications of NN searching, suc...


Parametric Local Metric Learning for Nearest Neighbor Classification

We study the problem of learning local metrics for nearest neighbor classification. Most previous works on local metric learning learn a number of local, unrelated metrics. While this "independence" approach delivers increased flexibility, its downside is the considerable risk of overfitting. We present a new parametric local metric learning method in which we learn a smooth metric matrix func...


Nearest Neighbor Classification

The nearest-neighbor method is perhaps the simplest of all algorithms for predicting the class of a test example. The training phase is trivial: simply store every training example, with its label. To make a prediction for a test example, first compute its distance to every training example. Then, keep the k closest training examples, where k ≥ 1 is a fixed integer. Look for the label that is m...
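The prediction steps just described can be sketched as a minimal k-NN classifier (a generic illustration, not code from any of the papers listed):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, k=3):
    """Predict the label of x_test by majority vote among the k
    training examples closest in Euclidean distance."""
    dists = np.linalg.norm(X_train - x_test, axis=1)  # distance to every training example
    nearest = np.argsort(dists)[:k]                   # indices of the k closest
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]                 # most frequent label among neighbors
```

Note that the training phase is indeed trivial (the data is simply stored); all the work happens at prediction time, which is why the choice of k and of the distance measure matters so much.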


Uncertain Nearest Neighbor Classification

This work deals with the problem of classifying uncertain data. With this aim the Uncertain Nearest Neighbor (UNN) rule is here introduced, which represents the generalization of the deterministic nearest neighbor rule to the case in which uncertain objects are available. The UNN rule relies on the concept of nearest neighbor class, rather than on that of nearest neighbor object. The nearest ne...



Journal

Journal title: International Journal of Computational Intelligence Systems

Year: 2023

ISSN: 1875-6883, 1875-6891

DOI: https://doi.org/10.1007/s44196-023-00200-1