Search results for: k nearest neighbor object based classifier
Number of results: 3,455,668
The nearest neighbor classifier (NNC) is a popular non-parametric classifier. It is a simple classifier with no design phase and shows good performance. Important factors affecting the efficiency and performance of NNC are (i) the memory required to store the training set, (ii) the classification time required to search for the nearest neighbor of a given test pattern, and (iii) due to the curse of dimensi...
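The two costs named in (i) and (ii) are visible in even the simplest implementation: a brute-force 1-NN classifier must keep the whole training set in memory and scan all of it for every query. A minimal sketch (hypothetical data, Euclidean distance assumed):

```python
import math

def nn_classify(train, query):
    # Brute-force 1-NN: the full training set is stored (memory cost)
    # and every stored pattern is scanned per query (classification-time cost).
    best_label, best_dist = None, math.inf
    for x, label in train:
        d = math.dist(x, query)  # Euclidean distance
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

train = [((0.0, 0.0), "a"), ((1.0, 1.0), "a"), ((5.0, 5.0), "b")]
print(nn_classify(train, (4.5, 5.2)))  # → b
```

Prototype-reduction and indexing methods exist precisely to cut these two costs.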
A Wireless Object Sorting Robot Arm System (WOSRAS) is the combination of a Machine Vision System (MVS), a Wireless Embedded System (WES), and a Robot Arm System (RAS). The MVS is the essential part of the object sorting robot arm system; it consists of an image sensor and a personal computer running LabVIEW to classify the object from an image. NI vision acquisition express of LabVIEW acq...
In this paper, we propose novel methods to find the best relevant feature subset using fuzzy rough set-based attribute subset selection with biologically inspired search algorithms such as ant colony optimization and particle swarm optimization, together with the principles of an evolutionary process. We then propose a hybrid fuzzy rough with K-nearest neighbor (KNN)-based classifier (FRNN) to classify the patterns in ...
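The overall scheme is a wrapper: search for a feature subset, scoring each candidate by the accuracy of a KNN classifier restricted to those features. The sketch below substitutes exhaustive search for the fuzzy-rough ACO/PSO search described in the abstract (which is far more scalable); the data and scoring setup are hypothetical:

```python
import math
from itertools import combinations

def knn_accuracy(data, feats):
    # Leave-one-out accuracy of a 1-NN classifier restricted to `feats`.
    correct = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        proj = lambda v: [v[j] for j in feats]
        pred = min(rest, key=lambda p: math.dist(proj(p[0]), proj(x)))[1]
        correct += pred == y
    return correct / len(data)

def best_subset(data, n_features):
    # Exhaustive wrapper search over all non-empty feature subsets;
    # a brute-force stand-in for the fuzzy-rough ACO/PSO search.
    candidates = [c for r in range(1, n_features + 1)
                  for c in combinations(range(n_features), r)]
    return max(candidates, key=lambda f: knn_accuracy(data, f))

# Toy data: feature 0 separates the classes, feature 1 is noise.
data = [((0.0, 7.0), "a"), ((0.2, -3.0), "a"),
        ((5.0, 6.5), "b"), ((5.2, -2.0), "b")]
print(best_subset(data, 2))  # → (0,)
```

The search correctly keeps only the informative feature; the paper's fuzzy-rough dependency measure replaces this brute-force accuracy scoring at scale.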
In this paper, a new classification method is proposed that uses clustering to reduce the training set of the K-Nearest Neighbor (KNN) classifier and to enhance its performance. The proposed method is called the Nearest Cluster Classifier (NCC). Inspired by the traditional K-NN algorithm, the main idea is to classify a test sample according to the tag of its nearest neighbor. First, t...
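The idea can be sketched by replacing the training set with cluster representatives and classifying a test sample by the tag of its nearest cluster. The simplification below uses one centroid per class (the snippet does not specify NCC's exact clustering step), so this is an illustrative assumption, not the paper's method:

```python
import math
from collections import defaultdict

def class_centroids(train):
    # Cluster the training set; here, simply one centroid per class,
    # a simplified stand-in for the paper's clustering step.
    groups = defaultdict(list)
    for x, label in train:
        groups[label].append(x)
    return {label: tuple(sum(c) / len(xs) for c in zip(*xs))
            for label, xs in groups.items()}

def ncc_classify(centroids, query):
    # Classify by the tag of the nearest cluster centre: only the
    # centroids are stored, not the full training set.
    return min(centroids, key=lambda lab: math.dist(centroids[lab], query))

train = [((0, 0), "a"), ((1, 1), "a"), ((5, 5), "b"), ((6, 4), "b")]
cents = class_centroids(train)
print(ncc_classify(cents, (4, 4)))  # → b
```

Memory drops from one entry per training sample to one per cluster, which is the reduction NCC aims for.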
The K-nearest-neighbor decision rule assigns an object of unknown class to the plurality class among the K labeled "training" objects that are closest to it. Closeness is usually defined in terms of a metric distance on the Euclidean space with the input measurement variables as axes. The metric chosen to define this distance can strongly affect performance. An optimal choice depends on the proble...
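That the metric can change the decision is easy to demonstrate: the same K-NN rule with plain Euclidean distance versus a (hypothetical) rescaled metric that down-weights one axis can label the same query differently. A small sketch with made-up data:

```python
import math
from collections import Counter

def knn(train, query, k, dist):
    # K-nearest-neighbour plurality vote under a supplied metric.
    neighbours = sorted(train, key=lambda p: dist(p[0], query))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

euclid = math.dist

def scaled(a, b):
    # Hypothetical metric that down-weights the second axis by 100x,
    # illustrating how the chosen metric shapes "closeness".
    return math.hypot(a[0] - b[0], 0.01 * (a[1] - b[1]))

train = [((0.0, 0.0), "a"), ((4.0, 5.0), "a"), ((0.6, -20.0), "b")]
q = (0.5, 5.0)
print(knn(train, q, 1, euclid))  # → a
print(knn(train, q, 1, scaled))  # → b
```

Under Euclidean distance the "b" point is far away; once the second axis is down-weighted it becomes the nearest neighbor, which is why adaptive or learned metrics matter.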
With m processors available, the k-nearest neighbor classifier can be straightforwardly parallelized with a linear speedup of factor m. In this paper we introduce two methods that in principle are able to achieve this aim. The first method splits the test set into m parts, while the other distributes the training set over m sub-classifiers and merges their m nearest neighbor sets with eac...
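The first method (splitting the test set into m parts) can be sketched with a thread pool; in real use one would reach for processes, since CPython threads do not give a true m-fold speedup for pure-Python work. The data and helper names are hypothetical:

```python
import math
from concurrent.futures import ThreadPoolExecutor

def nn_label(train, query):
    # Brute-force 1-NN label for a single query.
    return min(train, key=lambda p: math.dist(p[0], query))[1]

def parallel_classify(train, tests, m):
    # Split the test set into m parts and classify each part on its
    # own worker; the training set is shared by all workers.
    chunks = [tests[i::m] for i in range(m)]
    with ThreadPoolExecutor(max_workers=m) as pool:
        results = pool.map(
            lambda chunk: [nn_label(train, q) for q in chunk], chunks)
    # Re-interleave the per-chunk labels back into the original order.
    out = [None] * len(tests)
    for i, labels in enumerate(results):
        out[i::m] = labels
    return out

train = [((0.0, 0.0), "a"), ((5.0, 5.0), "b")]
tests = [(0.1, 0.2), (4.9, 5.1), (1.0, 0.0), (6.0, 6.0)]
print(parallel_classify(train, tests, 2))  # → ['a', 'b', 'a', 'b']
```

The second method (distributing the training set and merging the m candidate neighbor sets) needs an extra reduce step, which is the part the abstract goes on to describe.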