Generalized Derivative Based Kernelized Learning Vector Quantization

Authors

  • Frank-Michael Schleif
  • Thomas Villmann
  • Barbara Hammer
  • Petra Schneider
  • Michael Biehl
Abstract

We derive a novel derivative-based version of kernelized Generalized Learning Vector Quantization (KGLVQ), called D-KGLVQ, as an effective, easy-to-interpret, prototype-based and kernelized classifier. We provide generalization error bounds and experimental results on real-world data, showing that D-KGLVQ is competitive with KGLVQ and the SVM on UCI data, and we additionally show that automatic parameter adaptation for the kernels used simplifies learning.
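The abstract does not give the update rules, so the following is only a rough sketch of how a derivative-based kernelized GLVQ step could look, assuming the standard GLVQ cost mu = (d+ - d-)/(d+ + d-), a Gaussian kernel, and prototypes stored implicitly as coefficient vectors over the training set. All names (rbf_kernel, glvq_kernel_step, gamma_plus, ...) are illustrative and are not taken from the paper.

    # Illustrative sketch only: one gradient step of a kernelized GLVQ variant.
    # Prototypes live in feature space as w = sum_i gamma_i * phi(x_i), so the
    # squared distance ||phi(x) - w||^2 can be written with kernel values alone.
    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        """Gaussian kernel matrix k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def feature_dist(K, k_x, k_xx, gamma):
        """||phi(x) - sum_i gamma_i phi(x_i)||^2 from kernel evaluations only."""
        return k_xx - 2.0 * k_x @ gamma + gamma @ K @ gamma

    def glvq_kernel_step(K, k_x, k_xx, gamma_plus, gamma_minus, lr=0.05):
        """One stochastic gradient step on mu = (d+ - d-) / (d+ + d-)."""
        d_plus = feature_dist(K, k_x, k_xx, gamma_plus)    # closest correct prototype
        d_minus = feature_dist(K, k_x, k_xx, gamma_minus)  # closest wrong prototype
        denom = (d_plus + d_minus) ** 2
        dmu_dplus = 2.0 * d_minus / denom
        dmu_dminus = -2.0 * d_plus / denom
        # Derivative of the kernelized distance w.r.t. the coefficient vectors.
        grad_plus = -2.0 * k_x + 2.0 * K @ gamma_plus
        grad_minus = -2.0 * k_x + 2.0 * K @ gamma_minus
        gamma_plus = gamma_plus - lr * dmu_dplus * grad_plus
        gamma_minus = gamma_minus - lr * dmu_dminus * grad_minus
        return gamma_plus, gamma_minus

In a full training loop one would precompute the Gram matrix K (e.g. with rbf_kernel) over the training data, select for every sample the closest prototype of the correct and of a wrong class, and apply this step; the automatic kernel parameter adaptation mentioned in the abstract would correspond to additionally taking a gradient step in the kernel parameter, which is not shown here.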


Similar Resources

Efficient Kernelized Prototype Based Classification

Prototype-based classifiers are effective algorithms for modeling classification problems and have been applied in multiple domains. While many supervised learning algorithms have been successfully extended by means of the kernel concept to improve their discrimination power, prototype-based classifiers are typically still used with Euclidean distance measures. Kernelized variants of pr...
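As a point of reference for what prototype-based classification with a Euclidean distance means in practice, here is a minimal, self-contained sketch; the array names (prototypes, proto_labels) are made up for the example.

    # Minimal nearest-prototype classifier with plain Euclidean distance,
    # i.e. the non-kernelized baseline that kernelized variants extend.
    import numpy as np

    def predict(X, prototypes, proto_labels):
        """Label every row of X with the class of its closest prototype."""
        d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
        return proto_labels[d2.argmin(axis=1)]

    prototypes = np.array([[0.0, 0.0], [3.0, 3.0]])   # one prototype per class
    proto_labels = np.array([0, 1])
    print(predict(np.array([[0.2, -0.1], [2.5, 3.1]]), prototypes, proto_labels))  # [0 1]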

A sparse kernelized matrix learning vector quantization model for human activity recognition

This contribution describes our application to the ESANN'2013 Competition on Human Activity Recognition (HAR) using Android-OS smartphone sensor signals. We applied a kernel variant of learning vector quantization with metric adaptation, using only one prototype vector per class. This sparse model obtains very good accuracies and additionally provides class correlation information. Further, the m...
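The metric adaptation mentioned above refers, in matrix LVQ variants, to replacing the Euclidean distance by an adaptive quadratic form. The snippet below only illustrates that distance, with a random projection standing in for a learned one.

    # Illustration of the matrix-adapted distance used by matrix LVQ variants:
    # d_Lambda(x, w) = (x - w)^T Omega^T Omega (x - w), with Lambda = Omega^T Omega
    # learned during training and inspectable for correlation structure.
    import numpy as np

    def matrix_distance(x, w, omega):
        diff = omega @ (x - w)      # map the difference with the adapted metric
        return float(diff @ diff)   # squared length in the transformed space

    rng = np.random.default_rng(0)
    omega = rng.normal(size=(2, 4))             # stand-in for a learned 2x4 mapping
    x, w = rng.normal(size=4), rng.normal(size=4)
    print(matrix_distance(x, w, omega))
    relevance = omega.T @ omega                 # Lambda; off-diagonals hint at correlations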

Relationships Between Support Vector Classifiers and Generalized Linear Discriminant Analysis on Support Vectors

The linear discriminant analysis based on the generalized singular value decomposition (LDA/GSVD) has recently been introduced to circumvent the nonsingularity restriction that occurs in the classical LDA, so that a dimension-reducing transformation can be effectively obtained for undersampled problems. In this paper, relationships between support vector machines (SVMs) and the generalized linea...

Kernelizing Vector Quantization Algorithms

The kernel trick is a well-known approach that implicitly casts a linear method into a nonlinear one by replacing every dot product with a kernel function. However, few vector quantization algorithms have been kernelized. Indeed, they usually involve computing linear transformations (e.g., moving prototypes), which is not easily kernelizable. This paper introduces the Kernel-based Vector Quantiza...
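To make the kernelization difficulty concrete: under the kernel trick a prototype exists only implicitly, as a weighted combination of feature-space images of training points, so "moving" it has to be expressed on those weights. The sketch below (with made-up names) shows the distance computation this requires.

    # Why moving prototypes is awkward under the kernel trick: the prototype
    # w = sum_i alpha_i * phi(x_i) is never available explicitly, so both its
    # distance to a point and any update must go through kernel evaluations.
    import numpy as np

    def dist_to_implicit_prototype(x, X_train, alpha, kernel=np.dot):
        """||phi(x) - sum_i alpha_i phi(x_i)||^2 using kernel values only."""
        k_xx = kernel(x, x)
        k_xi = np.array([kernel(x, xi) for xi in X_train])
        K = np.array([[kernel(xi, xj) for xj in X_train] for xi in X_train])
        return k_xx - 2.0 * alpha @ k_xi + alpha @ K @ alpha

    # "Moving" the prototype toward training sample x_j then amounts to shifting
    # weight onto its coefficient: alpha_new = (1 - eta) * alpha + eta * e_j.
    X_train = np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 2.0]])
    alpha = np.array([0.5, 0.5, 0.0])
    print(dist_to_implicit_prototype(np.array([1.0, 1.0]), X_train, alpha))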


Journal title:

Volume   Issue

Pages  -

Publication date: 2010