Efficient Large Scale Linear Programming Support Vector Machines
Author
Abstract
This paper presents a decomposition method for efficiently constructing 1-norm Support Vector Machines (SVMs). The decomposition algorithm introduced in this paper possesses many desirable properties. For example, it is provably convergent, scales well to large datasets, is easy to implement, and can be extended to handle support vector regression and other SVM variants. We demonstrate the efficiency of our algorithm by training on (dense) synthetic datasets of sizes up to 20 million points (in R). The results show our algorithm to be several orders of magnitude faster than a previously published method for the same task. We also present experimental results on real data sets—our method is seen to be not only very fast, but also highly competitive against the leading SVM implementations.
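The 1-norm SVM the abstract refers to can be posed as a linear program: minimize ||w||_1 + C·Σξ subject to y_i(w·x_i + b) ≥ 1 − ξ_i, ξ ≥ 0, with w split into nonnegative parts u − v. A minimal sketch on toy data using `scipy.optimize.linprog` (the data, C, and variable layout are illustrative assumptions, not the paper's decomposition method):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
# Tiny 2-D toy problem: two well-separated Gaussian blobs (illustrative only).
n, d, C = 40, 2, 1.0
X = np.vstack([rng.normal(-1, 0.5, (n // 2, d)), rng.normal(1, 0.5, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Variables: [u (d), v (d), b_plus, b_minus, xi (n)], all >= 0,
# with w = u - v and b = b_plus - b_minus to allow signed values.
c = np.hstack([np.ones(2 * d), 0.0, 0.0, C * np.ones(n)])
# y_i (w.x_i + b) + xi_i >= 1  rewritten as  A_ub @ z <= -1:
Yx = y[:, None] * X
A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
b_ub = -np.ones(n)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d] - res.x[2 * d + 1]
pred = np.sign(X @ w + b)
print(res.success, (pred == y).mean())
```

The 1-norm objective tends to produce sparse `w`, which is one reason LP formulations are attractive at scale; the paper's contribution is a decomposition scheme for solving such LPs efficiently, which this monolithic solve does not reproduce.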
Similar references
A New Play-off Approach in League Championship Algorithm for Solving Large-Scale Support Vector Machine Problems
There are numerous methods for solving large-scale problems, some of which are flexible and efficient in both the linear and non-linear cases. The league championship algorithm is one such method that may be applied to these problems. In the current paper, a new play-off approach is adapted to the league championship algorithm for solving large-scale problems. The proposed algori...
Large-Scale Sonar Target Detection with l1-Norm SV Regression based on Unfeasible Interior Point Methods
Support Vector Machines (SVMs) have become one of the most popular supervised learning machines in statistical pattern recognition. They are used for classification (SVM) and regression analysis (Support Vector Regression, SVR). However, when the number of samples available to model an SVM/SVR problem exceeds the available computational resources (i.e. large-scale problems where the...
A parallel solver for large quadratic programs in training support vector machines
This work is concerned with the solution of the convex quadratic programming problem arising in training support vector machines. The problem is subject to box constraints and a single linear equality constraint; it is dense and, for many practical applications, large-scale. Thus, approaches based on explicit storage of the matrix of the quadr...
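The box-plus-single-equality structure described above is the standard SVM dual: minimize ½αᵀQα − Σα subject to 0 ≤ α ≤ C and yᵀα = 0, with dense Q_ij = y_i y_j x_i·x_j for a linear kernel. A small sketch with a general-purpose solver (SciPy's SLSQP on toy data; this only shows the problem shape, not the parallel method these works develop):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, C = 20, 10.0  # C chosen large so the box bound stays inactive on separable data
X = np.vstack([rng.normal(-1, 0.4, (n // 2, 2)), rng.normal(1, 0.4, (n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

Q = (y[:, None] * y[None, :]) * (X @ X.T)      # dense Hessian of the dual
res = minimize(lambda a: 0.5 * a @ Q @ a - a.sum(),
               np.zeros(n),
               jac=lambda a: Q @ a - 1.0,
               bounds=[(0.0, C)] * n,           # box constraints
               constraints={"type": "eq", "fun": lambda a: a @ y},  # y.a = 0
               method="SLSQP")

w = (res.x * y) @ X                             # recover primal weights
sv = res.x > 1e-6                               # support vectors (assumed free, i.e. alpha < C)
b = np.mean(y[sv] - X[sv] @ w)
print(res.success, (np.sign(X @ w + b) == y).mean())
```

For n in the tens of thousands, Q no longer fits in memory, which is exactly why decomposition methods that optimize over small working sets of α variables (and the parallel variants discussed here) are needed.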
Parallel Decomposition Approaches for Training Support Vector Machines
We consider parallel decomposition techniques for solving the large quadratic programming (QP) problems arising in training support vector machines. A recent technique is improved by introducing an efficient solver for the inner QP subproblems and a preprocessing step useful for hot-starting the decomposition strategy. The effectiveness of the proposed improvements is evaluated by solving large-sca...
Efficient Kernel Approximation for Large-Scale Support Vector Machine Classification
Training support vector machines (SVMs) with nonlinear kernel functions on large-scale data is usually very time-consuming. In contrast, faster solvers exist for training the linear SVM. We propose a technique that sufficiently approximates the infinite-dimensional implicit feature mapping of the Gaussian kernel function by a low-dimensional feature mapping. By explicitly mapping data to the...
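One concrete low-dimensional approximation of the Gaussian kernel's implicit feature map is random Fourier features (Rahimi and Recht); the abstract's own mapping may differ, so this is only an illustrative stand-in showing how an explicit map z(x) can make zᵀz ≈ k(x, y):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(X, D, gamma, rng):
    """Random Fourier features for k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), (d, D))  # frequencies ~ N(0, 2*gamma*I)
    b = rng.uniform(0.0, 2.0 * np.pi, D)               # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = rng.normal(size=(5, 3))
gamma = 0.5
Z = rff(X, D=20000, gamma=gamma, rng=rng)

K_exact = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
K_approx = Z @ Z.T
print(np.abs(K_exact - K_approx).max())  # shrinks as D grows
```

Once data are mapped to Z, any fast linear-SVM solver can be trained on it, which is the point of this line of work: the approximation error decays like O(1/sqrt(D)), so a moderate D already tracks the exact kernel matrix closely.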
Publication date: 2006