Search results for: quadratic support

Number of results: 702776

2009
Martin Aigner Laureano González-Vega Bert Jüttler Maria Lucia Sampoli

The support function of a free-form surface is closely related to the implicit equation of the dual surface, and the process of computing both the dual surface and the support function can be seen as dual implicitization. The support function can be used to parameterize a surface by its inverse Gauss map. This map makes it relatively simple to study isophotes (which are simply images of spheric...
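For orientation, the support function construction referred to here can be written out in its standard form (textbook definitions; the notation below is mine, not taken from the truncated abstract):

```latex
h(\mathbf{n}) \;=\; \langle x(\mathbf{n}), \mathbf{n} \rangle ,
\qquad
x(\mathbf{n}) \;=\; h(\mathbf{n})\,\mathbf{n} \;+\; \nabla_{S^2} h(\mathbf{n}),
\qquad \mathbf{n} \in S^2 ,
```

so the inverse Gauss map n ↦ x(n) parameterizes the surface directly by its unit normals, and an isophote for a fixed light direction d is simply the preimage of a circle ⟨n, d⟩ = const on the sphere.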

Journal: Journal of Machine Learning Research, 2006
Katya Scheinberg

We propose an active set algorithm to solve the convex quadratic programming (QP) problem which is at the core of support vector machine (SVM) training. The underlying method is not new and is based on the extensive practice of the simplex method and its variants for convex quadratic problems. However, its application to large-scale SVM problems is new. Until recently the traditional active se...
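For context, the convex QP at the core of SVM training that such an active set method operates on is the standard dual problem (textbook form, not anything specific to this paper):

```latex
\min_{\alpha \in \mathbb{R}^n} \; \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha \;-\; \mathbf{1}^{\top}\alpha
\quad \text{s.t.} \quad y^{\top}\alpha = 0, \;\; 0 \le \alpha_i \le C,
\qquad Q_{ij} = y_i\, y_j\, k(x_i, x_j).
```

An active set method guesses which box constraints α_i = 0 or α_i = C hold at the optimum, solves the equality-constrained subproblem on the remaining free variables, and revises the guess, much like the simplex variants for quadratic problems mentioned in the abstract.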

Journal: Neurocomputing, 2008
Davide Anguita Alessandro Ghio Stefano Pischiutta Sandro Ridella

We describe here a method for building a support vector machine (SVM) with integer parameters. Our method is based on a branch-and-bound procedure, derived from modern mixed integer quadratic programming solvers, and is useful for implementing the feedforward phase of the SVM in fixed-point arithmetic. This allows the implementation of the SVM algorithm on resource-limited hardware like, for exa...
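To illustrate why integer parameters help in the feedforward phase, here is a minimal sketch of fixed-point SVM evaluation; it uses naive rounding rather than the branch-and-bound MIQP procedure the abstract describes, and all function names and constants below are hypothetical:

```python
import numpy as np

def quantize(values, frac_bits=8):
    """Round real-valued SVM parameters to fixed-point integers (Q-format)."""
    return np.round(values * (1 << frac_bits)).astype(np.int64)

def svm_decision_fixed_point(x_int, w_int, b_int, frac_bits=8):
    """Evaluate sign(w.x + b) using only integer arithmetic.
    Inputs are assumed to be already quantized with the same frac_bits."""
    # The dot product carries 2*frac_bits fractional bits; rescale before adding b.
    acc = int(np.dot(w_int, x_int)) >> frac_bits
    return 1 if acc + b_int >= 0 else -1

# Hypothetical usage: quantize a linear SVM trained in floating point.
w, b = np.array([0.73, -1.20, 0.05]), 0.4
x = np.array([1.0, 0.5, -2.0])
w_q, b_q, x_q = quantize(w), quantize(np.array([b]))[0], quantize(x)
print(svm_decision_fixed_point(x_q, w_q, b_q))   # integer-only inference
print(np.sign(w @ x + b))                        # floating-point reference
```

On resource-limited hardware the integer dot product and shift replace floating-point multiply-accumulate operations.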

2004
Alain Rakotomamonjy

For many years now, there has been growing interest in the ROC curve for characterizing machine learning performance. This is particularly due to the fact that in real-world problems misclassification costs are not known, and thus the ROC curve and related metrics such as the Area Under the ROC Curve (AUC) can be more meaningful performance measures. In this paper, we propose a quadratic programming bas...
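As background for the metric being optimized, the AUC can be computed as the Wilcoxon-Mann-Whitney statistic over positive/negative score pairs; this is the standard pairwise view, not the quadratic programming formulation the abstract proposes:

```python
import numpy as np

def auc_pairwise(scores_pos, scores_neg):
    """AUC as the Wilcoxon-Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly, counting ties as half."""
    s_p = np.asarray(scores_pos)[:, None]   # shape (P, 1)
    s_n = np.asarray(scores_neg)[None, :]   # shape (1, N)
    wins = (s_p > s_n).mean()
    ties = (s_p == s_n).mean()
    return wins + 0.5 * ties

# Toy example with hypothetical classifier scores.
print(auc_pairwise([0.9, 0.8, 0.4], [0.7, 0.3]))  # 5 of 6 pairs correct -> ~0.833
```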

2006
Kin Keung Lai Lean Yu Ligang Zhou Shouyang Wang

Credit risk evaluation has been a major focus of the financial and banking industry due to recent financial crises and the regulatory concerns of Basel II. Recent studies have revealed that emerging artificial intelligence techniques are advantageous over statistical models for credit risk evaluation. In this study, we discuss the use of the least squares support vector machine (LSSVM) technique to design a c...
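For reference, the least squares SVM mentioned here replaces the inequality constraints of the standard SVM with equalities, so training amounts to solving one linear system (the usual LS-SVM classifier formulation after Suykens et al., not this study's specific credit-scoring design):

```latex
\begin{bmatrix} 0 & y^{\top} \\ y & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
\;=\;
\begin{bmatrix} 0 \\ \mathbf{1} \end{bmatrix},
\qquad \Omega_{ij} = y_i\, y_j\, k(x_i, x_j).
```

Here γ is the regularization parameter and k(·,·) the kernel; solving this system yields the multipliers α and bias b directly, without a QP solver.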

2007
Ivor W. Tsang James T. Kwok

The training of support vector machines (SVM) involves a quadratic programming problem, which is often optimized by a complicated numerical solver. In this paper, we propose a much simpler approach based on multiplicative updates. This idea was first explored in [Cristianini et al., 1999], but its convergence is sensitive to a learning rate that has to be fixed manually. Moreover, the update ru...
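A sketch of the kind of multiplicative update involved, written for a generic nonnegative QP in the spirit of Sha, Saul and Lee; the paper's own update rule and its handling of the SVM equality constraint are not reproduced here:

```python
import numpy as np

def multiplicative_update_nqp(A, b, iters=500, eps=1e-12):
    """Multiplicative updates for min 0.5*v'Av + b'v subject to v >= 0.
    Nonnegativity is preserved by construction, so no manually tuned
    learning rate or projection step is needed."""
    A_pos = np.maximum(A, 0.0)    # positive part of A
    A_neg = np.maximum(-A, 0.0)   # negative part of A
    v = np.ones(len(b))
    for _ in range(iters):
        a = A_pos @ v
        c = A_neg @ v
        v *= (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + eps)
    return v

# Tiny sanity check on a PSD problem whose unconstrained minimizer is nonnegative.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([-1.0, -1.0])
print(multiplicative_update_nqp(A, b))   # approaches the solution of A v = -b
```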

Journal: J. Inf. Sci. Eng., 2015
Qing Wu

ε-support vector regression (ε-SVR) can be converted into an unconstrained, convex, non-smooth quadratic programming problem, which cannot be solved by traditional smooth optimization algorithms. In order to solve this non-smooth problem, a class of piecewise smooth functions is introduced to approximate the ε-insensitive loss function of ε-SVR, which yields an ε-piecewise smooth support vector regression (ε-dPWS...
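To make the non-smoothness concrete, here is the ε-insensitive loss next to one possible quadratic smoothing of its two kinks; this is a generic Huber-style patch for illustration, not necessarily the piecewise smooth family the paper introduces:

```python
import numpy as np

def eps_insensitive(t, eps=0.1):
    """Vapnik's epsilon-insensitive loss: zero inside the tube, linear outside."""
    return np.maximum(0.0, np.abs(t) - eps)

def eps_insensitive_smooth(t, eps=0.1, delta=0.05):
    """C1-smooth surrogate: replace each kink at |t| = eps with a quadratic
    patch of half-width delta, leaving the loss unchanged elsewhere."""
    u = np.abs(t) - eps
    return np.where(u <= -delta, 0.0,
           np.where(u >= delta, u,
                    (u + delta) ** 2 / (4.0 * delta)))

t = np.linspace(0.0, 0.2, 5)   # residuals around the kink at |t| = eps
print(eps_insensitive(t))
print(eps_insensitive_smooth(t))
```

The quadratic patch matches the value and slope of both linear branches at u = ±δ, so the surrogate is differentiable everywhere while coinciding with the original loss away from the kinks.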

2004
Nikolas List

The decomposition method is currently one of the major methods for solving the convex quadratic optimization problems associated with support vector machines. For a special case of such problems, the convergence of the decomposition method to an optimal solution has been proven based on a working set selection via the gradient of the objective function. In this paper we show that a ge...
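A compact sketch of the gradient-based working set selection that such convergence results refer to, in the form of the "maximal violating pair" rule used by standard decomposition solvers (the paper's more general selection scheme is not reproduced; the solver state below is hypothetical):

```python
import numpy as np

def max_violating_pair(alpha, grad, y, C, tol=1e-3):
    """Select a working set (i, j) from the gradient of the dual objective.
    'up' indices may still increase alpha_i, 'down' indices may still decrease it."""
    up   = ((alpha < C - tol) & (y > 0)) | ((alpha > tol) & (y < 0))
    down = ((alpha < C - tol) & (y < 0)) | ((alpha > tol) & (y > 0))
    v = -y * grad        # at optimality: max over 'up' <= min over 'down'
    up_idx, down_idx = np.where(up)[0], np.where(down)[0]
    i = up_idx[np.argmax(v[up_idx])]
    j = down_idx[np.argmin(v[down_idx])]
    return (i, j) if v[i] - v[j] > tol else None   # None: KKT conditions satisfied

# Hypothetical small state of a dual SVM solver.
alpha = np.array([0.0, 0.3, 1.0, 0.2])
grad  = np.array([-1.0, -0.2, 0.4, -0.1])
y     = np.array([1.0, -1.0, 1.0, -1.0])
print(max_violating_pair(alpha, grad, y, C=1.0))   # -> (0, 2)
```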

2005
Ali Hisham Malik

The paper selected for implementation was Support Vector Tracking by Shai Avidan. The paper introduces a simple but novel idea: replacing the standard optical flow equations with equations that use the SVM to calculate the derivatives. The standard way of calculating optical flow is by solving the following linear system:

$$\begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} = -\begin{pmatrix} b_1 \\ b_2 \end{pmatrix} \qquad (1)$$

where (using Lucas Kanade optic...
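For orientation, a minimal least-squares version of the Lucas-Kanade step that equation (1) summarizes; this is the standard formulation only, without the SVM-based derivative replacement the paper proposes, and the patch derivatives below are made-up values:

```python
import numpy as np

def lucas_kanade_step(Ix, Iy, It):
    """One Lucas-Kanade step over an image patch.
    Ix, Iy: spatial derivatives; It: temporal derivative (flattened patches).
    Solves A [u, v]^T = -b, the normal equations summarized in equation (1)."""
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = np.array([np.sum(Ix * It), np.sum(Iy * It)])
    u, v = np.linalg.solve(A, -b)
    return u, v

# Toy patch derivatives (hypothetical values, just to exercise the solver).
Ix = np.array([0.5, 0.4, 0.6, 0.5])
Iy = np.array([0.1, 0.2, 0.1, 0.15])
It = np.array([-0.3, -0.25, -0.35, -0.3])
print(lucas_kanade_step(Ix, Iy, It))
```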

2005
Jigang Wang Predrag Neskovic Leon N. Cooper

In recent years, support vector machines (SVMs) have become a popular tool for pattern recognition and machine learning. Training an SVM involves solving a constrained quadratic programming problem, which requires a large amount of memory and training time for large-scale problems. In contrast, the SVM decision function is fully determined by a small subset of the training data, called s...
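The observation that the decision function depends only on a small subset of the training data can be stated directly with the standard SVM decision rule (not specific to this paper):

```latex
f(x) \;=\; \operatorname{sign}\!\Big( \sum_{i \in \mathrm{SV}} \alpha_i\, y_i\, k(x_i, x) + b \Big),
\qquad
\mathrm{SV} \;=\; \{\, i : \alpha_i > 0 \,\},
```

so every training point with α_i = 0 can be discarded after training without changing f, which is what makes methods that work directly with the support vectors attractive for large-scale problems.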

[Chart: number of search results per year]