Search results for: distinction sensitive learning vector quantization
Number of results: 1,091,013
A correlation-based similarity measure is derived for generalized relevance learning vector quantization (GRLVQ). The resulting GRLVQ-C classifier makes Pearson correlation available in a classification cost framework where data prototypes and global attribute weighting terms are adapted into directions of minimum cost function values. In contrast to the Euclidean metric, the Pearson correlatio...
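The abstract above replaces the Euclidean metric with Pearson correlation for prototype comparison. A minimal sketch of that idea, assuming plain (unweighted) correlation and nearest-prototype classification by maximum similarity; the function names are illustrative, not from GRLVQ-C itself:

```python
import numpy as np

def pearson_similarity(x, w):
    # Pearson correlation between a sample x and a prototype w:
    # center both vectors, then take the cosine of the centered vectors.
    xc = x - x.mean()
    wc = w - w.mean()
    return float(xc @ wc) / (np.linalg.norm(xc) * np.linalg.norm(wc))

def classify(x, prototypes, labels):
    # assign the label of the prototype with maximum correlation
    sims = [pearson_similarity(x, w) for w in prototypes]
    return labels[int(np.argmax(sims))]
```

Unlike the Euclidean distance, this similarity is invariant to shifting and positive scaling of the inputs, which is the property the abstract contrasts against the Euclidean metric.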
This contribution extends generalized LVQ, generalized relevance LVQ, and robust soft LVQ to the graph domain. The proposed approaches are based on the basic learning graph quantization (lgq) algorithm using the orbifold framework. Experiments on three data sets show that the proposed approaches outperform lgq and lgq2.1.
PhD Thesis in Computer Science written by María Teresa Martín Valdivia under the supervision of Dr. L. Alfonso Ureña López (Univ. of Jaén) and Dr. Francisco Triguero Ruiz (Univ. of Málaga). The author was examined on May 6, 2004 by the committee formed by Dr. Manuel Palomar Sanz (Univ. of Alicante), Dr. Amparo Ruiz Sepúlveda (Univ. of Málaga), Dr. Emilio Sanchís Arnal (Univ. Politécnica of Valenc...
Winner-Takes-All (WTA) algorithms offer intuitive and powerful learning schemes such as Learning Vector Quantization (LVQ) and variations thereof, most of which are heuristically motivated. In this article we investigate in an exact mathematical way the dynamics of different vector quantization (VQ) schemes including standard LVQ in simple, though relevant settings. We consider the training fro...
This paper shows that the distributed representation found in Learning Vector Quantization (LVQ) enables reinforcement learning methods to cope with a large decision search space, defined in terms of equivalence classes of input patterns like those found in the game of Go. In particular, this paper describes S[arsa]LVQ, a novel reinforcement learning algorithm, and shows its feasibility for patt...
Kohonen neural nets are a kind of competitive net. The most commonly known variants are the Self-Organizing Maps (SOMs) and the Learning Vector Quantization (LVQ). The former model uses unsupervised learning; the latter is an efficient classifier. This paper tries to give, in simple words, a clear idea about the basis of competitive neural nets and competitive learning, emphasizing the SO...
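Several of the abstracts above rely on the basic Winner-Takes-All LVQ scheme: the closest prototype wins and is attracted to the sample if their labels match, repelled otherwise. A minimal LVQ1 sketch under those assumptions (the learning rate and initialization are illustrative, not taken from any of the listed papers):

```python
import numpy as np

def lvq1_update(w, x, same_class, lr=0.1):
    # attract the winning prototype toward x if labels match, repel otherwise
    if same_class:
        return w + lr * (x - w)
    return w - lr * (x - w)

def train_lvq1(X, y, prototypes, proto_labels, lr=0.1, epochs=10):
    W = prototypes.copy()
    for _ in range(epochs):
        for x, c in zip(X, y):
            # Winner-Takes-All: only the closest prototype is updated
            j = int(np.argmin(np.linalg.norm(W - x, axis=1)))
            W[j] = lvq1_update(W[j], x, proto_labels[j] == c, lr)
    return W
```

Variants such as GLVQ, GRLVQ, and robust soft LVQ mentioned above replace this heuristic update with a gradient descent on an explicit cost function.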
Prototype-based classification models, and particularly Learning Vector Quantization (LVQ) frameworks with adaptive metrics, are powerful supervised classification techniques with good generalization behaviour. This thesis proposes three advanced learning methodologies, in the context of LVQ, aiming at better classification performance under various classification settings. The first contributi...
In this paper we describe a method of learning hierarchical representations for describing and recognizing gestures expressed as one and two arm movements using competitive learning methods. At the low end of the hierarchy, the atomic motions (“letters”) corresponding to flow fields computed from successive color image frames are derived using Learning Vector Quantization (LVQ). At the next int...
Identifying clusters is an important aspect of data analysis. This paper proposes a novel data clustering algorithm to increase the clustering accuracy. A novel game theoretic self-organizing map (NGTSOM) and neural gas (NG) are used in combination with Competitive Hebbian Learning (CHL) to improve the quality of the map and provide a better vector quantization (VQ) for clustering data. Different ...
In this treatise, a range of Line Spectrum Frequency (LSF) Vector Quantization (VQ) schemes designed for wideband speech codecs is studied comparatively. Both predictive arrangements and memoryless schemes were investigated. Specifically, both memoryless Split Vector Quantization (SVQ) and Classified Vector Quantization (CVQ) were studied. These techniques exhibit a low complexity ...