Nearest neighbor classifier generalization through spatially constrained filters
Authors
Abstract
It is widely understood that the performance of the nearest neighbor (NN) rule depends on: (i) the way distances are computed between different examples, and (ii) the type of feature representation used. Linear filters are often used in computer vision as a pre-processing step to extract useful feature representations. In this paper we demonstrate an equivalence between (i) and (ii) for NN...
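The equivalence mentioned in the abstract can be made concrete with a small sketch: applying a bank of linear filters W and then running Euclidean 1-NN gives exactly the same neighbors as keeping the raw features and measuring distances with the induced metric M = WᵀW. The random data, the filter bank, and the 1-NN decision rule below are illustrative assumptions, not details from the paper.

# A minimal sketch (not the paper's construction) of the equivalence above:
# Euclidean 1-NN on linearly filtered features equals 1-NN on the raw
# features under the metric M = W^T W.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 16))   # raw feature vectors (e.g. image patches)
y_train = rng.integers(0, 3, size=50)
x_query = rng.normal(size=16)

W = rng.normal(size=(8, 16))          # an arbitrary bank of linear filters
M = W.T @ W                           # induced (pseudo-)metric on raw features

def nn_label(dists):
    return y_train[np.argmin(dists)]

# (ii) filter first, then Euclidean 1-NN
d_filtered = np.sum((X_train @ W.T - x_query @ W.T) ** 2, axis=1)

# (i) keep raw features, change how distances are computed
diff = X_train - x_query
d_metric = np.einsum('ij,jk,ik->i', diff, M, diff)

assert np.allclose(d_filtered, d_metric)           # identical distances
assert nn_label(d_filtered) == nn_label(d_metric)  # hence identical NN decisions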
Similar resources

Constrained Nearest Neighbor Queries
In this paper we introduce the notion of constrained nearest neighbor queries (CNN) and propose a series of methods to answer them. This class of queries can be thought of as nearest neighbor queries with range constraints. Although both nearest neighbor and range queries have been analyzed extensively in previous literature, the implications of constrained nearest neighbor queries have not bee...
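A brute-force baseline makes the notion concrete: a constrained NN query returns the nearest point to the query among only those points that satisfy a range constraint. The axis-aligned rectangular constraint, the 2-D points, and the exhaustive scan below are illustrative assumptions; the paper itself proposes a series of more efficient methods for answering such queries.

# Brute-force sketch of a constrained nearest neighbor query: the answer is
# the nearest point to the query that also satisfies a range constraint.
import numpy as np

def constrained_nn(points, query, rect_lo, rect_hi):
    """Return the point nearest to `query` inside the axis-aligned box
    [rect_lo, rect_hi]; None if no point satisfies the constraint."""
    inside = np.all((points >= rect_lo) & (points <= rect_hi), axis=1)
    candidates = points[inside]
    if candidates.size == 0:
        return None
    dists = np.linalg.norm(candidates - query, axis=1)
    return candidates[np.argmin(dists)]

pts = np.random.default_rng(1).uniform(0, 10, size=(100, 2))
print(constrained_nn(pts, query=np.array([5.0, 5.0]),
                     rect_lo=np.array([6.0, 0.0]), rect_hi=np.array([10.0, 10.0])))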
Center-based nearest neighbor classifier
In this paper, a novel center-based nearest neighbor (CNN) classifier is proposed to deal with pattern classification problems. Unlike the nearest feature line (NFL) method, CNN considers the line passing through a sample point with a known label and the center of that sample's class. This line is called the center-based line (CL). These lines seem to have more capacity of representation for sample ...
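A minimal sketch of the center-based line idea, under the reading that each training sample defines a line through itself and the mean of its class, and that a query takes the label of the nearest such line. The point-to-line distance, the use of class means as centers, and the handling of degenerate lines are assumptions here, not the paper's exact formulation.

# Sketch of classification by distance to center-based lines (CLs).
import numpy as np

def cl_distance(q, a, b):
    """Euclidean distance from query q to the (infinite) line through a and b."""
    ab = b - a
    denom = ab @ ab
    if denom == 0.0:                 # sample coincides with its class center
        return np.linalg.norm(q - a)
    t = (q - a) @ ab / denom         # projection coefficient along the line
    return np.linalg.norm(q - (a + t * ab))

def cnn_classify(q, X, y):
    centers = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    best_label, best_dist = None, np.inf
    for xi, yi in zip(X, y):
        d = cl_distance(q, xi, centers[yi])
        if d < best_dist:
            best_label, best_dist = yi, d
    return best_label

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(cnn_classify(np.array([3.0, 3.0]), X, y))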
Probabilistic characterization of nearest neighbor classifier
The k-Nearest Neighbor classification algorithm (kNN) is one of the simplest yet most effective classification algorithms in use. It finds major applications in text categorization, outlier detection, handwritten character recognition, fraud detection and other related areas. Though sound theoretical results exist regarding convergence of the Generalization Error (GE) of this algorithm to Baye...
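Since the passage discusses the generalization error of kNN, the following small sketch pairs a plain majority-vote kNN rule with a hold-out estimate of that error. The synthetic two-Gaussian data, k = 5, and the single train/test split are illustrative assumptions.

# Majority-vote kNN with a hold-out estimate of its generalization error.
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    preds = []
    for q in X_test:
        idx = np.argsort(np.linalg.norm(X_train - q, axis=1))[:k]
        votes = np.bincount(y_train[idx])
        preds.append(np.argmax(votes))     # majority vote among the k neighbors
    return np.array(preds)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
perm = rng.permutation(len(y))
tr, te = perm[:300], perm[300:]

err = np.mean(knn_predict(X[tr], y[tr], X[te]) != y[te])
print(f"hold-out error estimate: {err:.3f}")   # empirical proxy for the GE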
Iterative improvement of a nearest neighbor classifier
In practical pattern recognition applications, the nearest neighbor classifier (NNC) is often applied because it does not require a priori knowledge of the joint probability density of the input feature vectors. As the number of example vectors is increased, the error probability of the NNC approaches that of the Bayesian classifier. However, at the same time, the computational complexity of ...
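The abstract is truncated before the paper's own iterative procedure is described, so it is not reproduced here. As a generic illustration of the accuracy/complexity trade-off mentioned above, the sketch below applies Hart's condensed nearest neighbor rule (a classic editing technique, not the paper's method) to shrink the reference set of a 1-NN classifier.

# Hart's condensed nearest neighbor: keep only prototypes needed so that the
# condensed set still classifies every training example correctly with 1-NN.
import numpy as np

def nn_label(q, X, y):
    return y[np.argmin(np.linalg.norm(X - q, axis=1))]

def condense(X, y, max_passes=10):
    keep = [0]                                   # seed the condensed set
    for _ in range(max_passes):
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            # add a point only if the current condensed set misclassifies it
            if nn_label(X[i], X[keep], y[keep]) != y[i]:
                keep.append(i)
                changed = True
        if not changed:
            break
    return np.array(keep)

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
idx = condense(X, y)
print(f"reference set reduced from {len(X)} to {len(idx)} prototypes")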
Journal
Journal title: Pattern Recognition
Year: 2013
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2012.06.009