Simulation for time series classification using feature covariance matrices with K-nearest neighbor

Authors

Abstract

Covariance matrices offer advantages for time series classification. Each time series is segmented into subsequences, and the feature points of each subsequence are adopted as a feature vector; the covariance matrix can then capture the pairwise correlations between these vectors. After feature extraction, the next step is the classification process. Our primary purpose is to evaluate covariance matrices with the k-nearest neighbor classifier (Cov-kNN) on simulated patterns and UCR data sets, using an optimal neighborhood interval. The covariance-based nearest neighbor classifier (CovNN) is also compared in this work. The results show that both algorithms achieve comparable performance, so using Cov-kNN with the series separated into several intervals can be considered.
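The pipeline described in the abstract (segment the series, build per-subsequence feature vectors, summarize them with a covariance matrix, then classify with kNN) can be sketched as below. This is a minimal illustration, not the authors' implementation: the per-window features (mean, standard deviation, slope) and the Frobenius norm as the matrix distance are assumptions, since the abstract does not specify them.

```python
import numpy as np

def covariance_features(series, window, step):
    """Slide a window over the series, build one feature vector per
    subsequence, and return the covariance matrix of those vectors
    as the descriptor of the whole series."""
    feats = []
    for start in range(0, len(series) - window + 1, step):
        seg = series[start:start + window]
        # Illustrative per-window features: mean, std, linear-trend slope.
        slope = np.polyfit(np.arange(window), seg, 1)[0]
        feats.append([seg.mean(), seg.std(), slope])
    return np.cov(np.asarray(feats), rowvar=False)

def cov_knn_predict(train_covs, train_labels, test_cov, k=3):
    """k-NN over covariance descriptors; the Frobenius norm is a simple
    stand-in for whatever matrix metric the paper actually uses."""
    dists = [np.linalg.norm(c - test_cov) for c in train_covs]
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)
```

For example, descriptors computed from a few noisy ramps and a few white-noise series are easily told apart by this scheme, since ramps produce high variance in the window means but nearly constant window slopes.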


Similar articles

An Improved K-Nearest Neighbor with Crow Search Algorithm for Feature Selection in Text Documents Classification

The Internet provides easy access to a kind of library resources. However, classification of documents from a large amount of data is still an issue and demands time and energy to find certain documents. Classification of similar documents in specific classes of data can reduce the time for searching the required data, particularly text documents. This is further facilitated by using Artificial...

Full text

K-Nearest Neighbor Classification Using Anatomized Data

This paper analyzes k nearest neighbor classification with training data anonymized using anatomy. Anatomy preserves all data values, but introduces uncertainty in the mapping between identifying and sensitive values. We first study the theoretical effect of the anatomized training data on the k nearest neighbor error rate bounds, nearest neighbor convergence rate, and Bayesian error. We then v...

Full text

Class Dependent Feature Weighting and K-Nearest Neighbor Classification

Feature weighting in supervised learning concerns the development of methods for quantifying the capability of features to discriminate instances from different classes. A popular method for this task, called RELIEF, generates a feature weight vector from a given training set, one weight for each feature. This is achieved by maximizing in a greedy way the sample margin defined on the nearest ne...

Full text
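The RELIEF method summarized in the abstract above (one weight per feature, updated from nearest-hit and nearest-miss margins) can be sketched as follows. The hit/miss update rule is standard Relief; the L1 neighbor distance and the min-max scaling are illustrative choices not dictated by the abstract.

```python
import numpy as np

def relief_weights(X, y):
    """Basic Relief: for each instance, find its nearest hit (same class)
    and nearest miss (other class), then reward features on which the miss
    differs more than the hit."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # Min-max scale each feature so per-feature diffs are comparable.
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0
    Xs = (X - X.min(axis=0)) / span
    n, d = Xs.shape
    w = np.zeros(d)
    for i in range(n):
        dists = np.abs(Xs - Xs[i]).sum(axis=1)  # L1 distance to all points
        dists[i] = np.inf                       # never pick the point itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dists, np.inf))
        miss = np.argmin(np.where(~same, dists, np.inf))
        w += np.abs(Xs[i] - Xs[miss]) - np.abs(Xs[i] - Xs[hit])
    return w / n
```

On data where one feature separates the classes and another is noise, the discriminative feature receives a clearly larger (positive) weight, which is the margin-maximizing behavior the abstract refers to.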

Anisotropic k-Nearest Neighbor Search Using Covariance Quadtree

We present a variant of the hyper-quadtree that divides a multidimensional space according to the hyperplanes associated with the principal components of the data in each hyper-quadrant. Each of the 2^λ hyper-quadrants is a data partition in a λ-dimension subspace, whose intrinsic dimensionality λ ≤ d is reduced from the root dimensionality d by the principal components analysis, which discards the ...

Full text


Journal

Journal title: Nucleation and Atmospheric Aerosols

Year: 2022

ISSN: 0094-243X, 1551-7616, 1935-0465

DOI: https://doi.org/10.1063/5.0108204