Search results for: weighted knn

Number of results: 105149

2005
Yu Takigawa Seiji Hotta Senya Kiyasu Sueharu Miyahara

The recognition rate of the typical nonparametric k-nearest neighbor (kNN) rule degrades when the dimensionality of the feature vectors is large. To mitigate this difficulty, Mitani and Hamamoto have proposed a simple yet strong classifier that outputs the class of a test sample by measuring the distance between the test sample and the average patterns, which are calculated using k-nea...
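The classifier sketched in this abstract assigns a test sample to the class whose local average pattern (the mean of that class's k nearest training samples) is closest. A minimal NumPy sketch of that idea follows; the function name, the Euclidean metric, and the default k are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_mean_classify(x, X_train, y_train, k=5):
    """Assign x to the class whose local average pattern is nearest.
    X_train: (n, d) array, y_train: (n,) array of labels (names are assumptions)."""
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        X_c = X_train[y_train == label]          # training samples of this class
        d = np.linalg.norm(X_c - x, axis=1)      # distances from x to each of them
        nearest = X_c[np.argsort(d)[:k]]         # k nearest samples within the class
        mean_pattern = nearest.mean(axis=0)      # local average pattern
        dist = np.linalg.norm(x - mean_pattern)  # distance to the average pattern
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```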

2006
Sanjay Rawat V. P. Gulati Arun K. Pujari V. Rao Vemuri

This paper introduces a new similarity measure, termed the Binary Weighted Cosine (BWC) metric, for anomaly-based intrusion detection schemes that rely on sequences of system calls. The new similarity measure considers both the number of system calls shared between two processes and the frequencies of those calls. The k-nearest neighbor (kNN) classifier is used to categorize a process as e...
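The abstract states only that the BWC metric combines the number of shared system calls with their frequencies; the sketch below shows one plausible composition (a binary overlap term multiplied by a cosine term over call-frequency vectors) and is not necessarily the paper's exact formula.

```python
import numpy as np

def binary_weighted_cosine(freq_a, freq_b):
    """Illustrative similarity between two processes, each represented by a
    vector of system-call frequencies (this composition is an assumption)."""
    freq_a, freq_b = np.asarray(freq_a, float), np.asarray(freq_b, float)
    shared = np.logical_and(freq_a > 0, freq_b > 0).sum()   # calls used by both
    present = np.logical_or(freq_a > 0, freq_b > 0).sum()   # calls used by either
    binary_part = shared / present if present else 0.0      # overlap of call sets
    denom = np.linalg.norm(freq_a) * np.linalg.norm(freq_b)
    cosine_part = float(freq_a @ freq_b) / denom if denom else 0.0
    return binary_part * cosine_part
```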

2014
Nils André Treiber Oliver Kramer

A precise wind power prediction is important for the integration of wind energy into the power grid. Besides numerical weather models for short-term predictions, there is a trend towards the development of statistical data-driven models that can outperform the classical forecast models [1]. In this paper, we improve a statistical prediction model proposed by Kramer and Gieseke [5], by employing...

2005
Pradeep Kumar M. Venkateswara Rao P. Radha Krishna Raju S. Bapi

With the enormous growth of data that exhibit sequentiality, it has become important to investigate the impact of the sequential information embedded within the data; an efficient technique for classifying such sequential data is therefore needed. k-Nearest Neighbor (kNN) has been used and proved to be an efficient classification technique for two-class problems. This paper...

2012
Tsung-Hsien Chiang Hung-Yi Lo Shou-De Lin

Multi-label classification has attracted a great deal of attention in recent years. This paper presents an approach that exploits a ranking model to learn which neighbors' labels are more trustworthy candidates for a weighted KNN-based strategy, and then assigns higher weights to those candidates when making weighted-voting decisions. Our experimental results demonstrate that the proposed method outperf...
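The abstract describes weighting each neighbor's label set by a learned trust score before voting. The sketch below shows only the weighted-voting step, with the trust weights assumed to come from some external ranking model (learning them is not shown); the names and the 0.5 threshold are illustrative.

```python
import numpy as np

def weighted_multilabel_vote(neighbor_labels, neighbor_weights, threshold=0.5):
    """neighbor_labels: (k, L) binary matrix of the k neighbors' label sets.
    neighbor_weights: k trust scores (assumed to be produced elsewhere)."""
    w = np.asarray(neighbor_weights, dtype=float)
    w = w / w.sum()                                   # normalize the trust weights
    scores = w @ np.asarray(neighbor_labels, float)   # weighted score per label
    return (scores >= threshold).astype(int)          # predicted label set
```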

Journal: CoRR, 2014
Ahmad Basheer Hassanat Mohammad Ali Abbadi Ghada Awad Altarawneh Ahmad Ali Alhasanat

This paper presents a new solution for choosing the K parameter in the k-nearest neighbor (KNN) algorithm. The solution is based on the idea of ensemble learning, in which a weak KNN classifier is used each time with a different K, ranging from one to the square root of the size of the training set. The results of the weak classifiers are combined using the weighted sum rule. The proposed sol...
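The procedure in this abstract is concrete enough to sketch: run kNN for every K from 1 up to the square root of the training-set size and combine the individual votes with a weighted sum. The per-classifier weights are not given in the abstract, so the 1/K weighting below is an assumption for illustration only.

```python
import numpy as np
from collections import defaultdict

def ensemble_knn_predict(x, X_train, y_train):
    """Combine weak kNN classifiers with K = 1 .. sqrt(n) via a weighted sum of votes."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    order = np.argsort(np.linalg.norm(X_train - x, axis=1))  # neighbors sorted by distance
    k_max = max(1, int(np.sqrt(len(X_train))))
    scores = defaultdict(float)
    for k in range(1, k_max + 1):
        labels, counts = np.unique(y_train[order[:k]], return_counts=True)
        winner = labels[np.argmax(counts)]        # majority vote of this weak kNN
        scores[winner] += 1.0 / k                 # assumed weight; the paper's rule may differ
    return max(scores, key=scores.get)
```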

Journal: International Journal of Advanced Trends in Computer Science and Engineering, 2020

Journal: EURASIP J. Adv. Sig. Proc., 2010
Dongyu Zhang Wangmeng Zuo David Zhang Hongzhi Zhang Naimin Li

Advances in sensor and signal processing techniques have provided effective tools for quantitative research in traditional Chinese pulse diagnosis (TCPD). Because of the inevitable intraclass variation of pulse patterns, the automatic classification of pulse waveforms has remained a difficult problem. In this paper, by referring to the edit distance with real penalty (ERP) and the recent progre...
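The ERP distance mentioned here is a standard dynamic-programming measure for sequences; a textbook-style sketch is given below (the gap value g and the use of plain 1-D sequences are assumptions, and the paper's extensions for pulse waveforms are not shown).

```python
import numpy as np

def erp_distance(a, b, g=0.0):
    """Edit distance with real penalty between 1-D sequences a and b, gap value g."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.zeros((n + 1, m + 1))
    D[1:, 0] = np.cumsum(np.abs(a - g))           # cost of deleting a prefix of a
    D[0, 1:] = np.cumsum(np.abs(b - g))           # cost of deleting a prefix of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(D[i - 1, j - 1] + abs(a[i - 1] - b[j - 1]),  # align a_i with b_j
                          D[i - 1, j] + abs(a[i - 1] - g),             # gap against a_i
                          D[i, j - 1] + abs(b[j - 1] - g))             # gap against b_j
    return D[n, m]
```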

2008
Ulf Johansson Henrik Boström Rikard König

The standard kNN algorithm suffers from two major drawbacks: sensitivity to the parameter value k, i.e., the number of neighbors, and the use of k as a global constant that is independent of the particular region in which the example to be classified falls. Methods using weighted voting schemes only partly alleviate these problems, since they still involve choosing a fixed k. In this paper, a n...

2011
Jianhua Xu

Multi-label classification is an extension of the classical multi-class setting, where any instance can be associated with several classes simultaneously and thus the classes are no longer mutually exclusive. It was experimentally shown that the distance-weighted k-nearest neighbour (DWkNN) algorithm is superior to the original kNN rule for multi-class learning. But it has not been investigated whethe...
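The distance-weighted kNN rule referred to here lets closer neighbors cast larger votes; a minimal single-label sketch with inverse-distance weights follows (the cited work may use a different weight function, and its multi-label extension is not shown).

```python
import numpy as np
from collections import defaultdict

def dwknn_predict(x, X_train, y_train, k=5, eps=1e-12):
    """Distance-weighted kNN: each neighbor votes with weight 1 / (distance + eps)."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]                       # indices of the k nearest neighbors
    votes = defaultdict(float)
    for i in idx:
        votes[y_train[i]] += 1.0 / (d[i] + eps)   # closer neighbors count more
    return max(votes, key=votes.get)
```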

[Chart: number of search results per year]