Search results for: quadratic support
Number of results: 702776
Parallel software for solving the quadratic program arising in training support vector machines for classification problems is introduced. The software implements an iterative decomposition technique and exploits both the storage and the computing resources available on multiprocessor systems, by distributing the heaviest computational tasks of each decomposition iteration. Based on a wide rang...
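For context, the quadratic program referred to above is typically the soft-margin SVM dual; this generic form is stated for reference and is not quoted from the paper:

\max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j K(x_i, x_j)
\quad \text{s.t.} \quad \sum_{i=1}^{n} \alpha_i y_i = 0, \;\; 0 \le \alpha_i \le C,

where K is the kernel, y_i ∈ {−1, +1} are the labels, and C is the regularization parameter. Decomposition methods optimize only a working subset of the α_i at each iteration, which is what makes distributing the per-iteration work across processors attractive.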
In this paper, we propose a support vector machine with automatic confidence (SVMAC) for pattern classification. The main contributions of this work to learning machines are twofold. One is that we develop an algorithm for calculating the label confidence value of each training sample. Thus, the label confidence values of all of the training samples can be considered in training support vector ...
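The SVMAC algorithm itself is not reproduced here; as a loose illustration of how per-sample confidence values can enter a standard SVM solver, scikit-learn's SVC accepts a sample_weight argument. The data and confidence values below are made-up placeholders, not the paper's label-confidence computation.

import numpy as np
from sklearn.svm import SVC

# Toy data; the "confidence" values are illustrative placeholders,
# not the output of the label-confidence algorithm from the paper.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([-1, -1, 1, 1])
confidence = np.array([1.0, 0.6, 0.9, 0.5])   # per-sample trust in the label

# A heavier weight makes misclassifying that sample more costly in training.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y, sample_weight=confidence)
print(clf.predict([[0.1, 0.0], [0.95, 1.0]]))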
In this paper the problem of blind equalization of constant modulus (CM) signals is formulated within the support vector (SV) regression framework. The quadratic inequalities derived from the CM property are transformed into linear ones, thus yielding a quadratic programming (QP) problem. Then an iterative reweighted procedure is proposed to blindly restore the CM property. The technique can be...
We propose and study a new technique for aggregating an ensemble of bootstrapped classifiers. In this method we seek a linear combination of the base-classifiers such that the weights are optimized to reduce variance. Minimum variance combinations are computed using quadratic programming. This optimization technique is borrowed from Mathematical Finance where it is called Markowitz Mean-Varianc...
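The minimum-variance combination described above reduces to a small QP: minimize w' Σ w over the simplex, where Σ is the covariance of the base-classifiers' errors or outputs. The sketch below is a generic illustration using scipy's SLSQP solver with a synthetic covariance matrix, not the paper's implementation.

import numpy as np
from scipy.optimize import minimize

def min_variance_weights(cov):
    """Markowitz-style weights: minimize w' cov w  s.t.  sum(w) = 1, w >= 0."""
    m = cov.shape[0]
    w0 = np.full(m, 1.0 / m)                       # start from the uniform combination
    objective = lambda w: w @ cov @ w
    constraints = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
    bounds = [(0.0, 1.0)] * m
    res = minimize(objective, w0, method="SLSQP",
                   bounds=bounds, constraints=constraints)
    return res.x

# Synthetic covariance of three base-classifiers' prediction errors.
cov = np.array([[0.10, 0.02, 0.01],
                [0.02, 0.08, 0.03],
                [0.01, 0.03, 0.12]])
print(min_variance_weights(cov))   # aggregation weights for the ensemble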
This paper is concerned with fuzzy support vector classification, in which both the type of the output training points and the value of the final fuzzy classification function are triangular fuzzy numbers. First, the fuzzy classification problem is formulated as a fuzzy chance-constrained program. Then, we transform this program into an equivalent quadratic program. Finally, a fu...
In this paper we propose some improvements to a recent decomposition technique for the large quadratic program arising in training Support Vector Machines. As in standard decomposition approaches, the technique we consider is based on the idea of optimizing, at each iteration, a subset of the variables through the solution of a quadratic programming subproblem. The innovative features of this approa...
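The paper's decomposition works with larger working sets and its own subproblem solver; purely for intuition, the sketch below shows the smallest possible decomposition, a simplified SMO that repeatedly optimizes two dual variables analytically while holding the rest fixed (linear kernel, randomized choice of the second variable). It is a generic illustration of the working-set idea, not the method proposed in the paper.

import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_passes=5):
    """Two-variable decomposition (simplified SMO) for the linear-kernel SVM dual."""
    n = X.shape[0]
    K = X @ X.T                          # precomputed linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]          # prediction error on sample i
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = np.random.choice([k for k in range(n) if k != i])
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box constraints for alpha[j] implied by the equality constraint.
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]      # (negative) curvature along the update
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update the bias from the samples that remain strictly inside the box.
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b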
This paper proposes a novel support function machine (SFM) model for time series prediction. Two machine learning models, namely support vector machines (SVM) and procedural neural networks (PNN), are compared in solving time series problems, and they inspire the creation of SFM. SFM aims to extend support vectors to the spatiotemporal domain, in which each component of a vector is a function with res...
Support Vector Machine is one of the most classical approaches for classification and regression. Despite being studied for decades, obtaining practical algorithms for SVM is still an active research problem in machine learning. In this paper, we propose a new perspective for SVM via saddle point optimization. We provide an algorithm which achieves (1 − ε)-approximations with running time Õ(nd +...
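One common way to cast soft-margin SVM training as a saddle-point problem (a generic reformulation stated for orientation, not necessarily the exact one used in the paper) replaces the hinge loss by a maximization over dual variables:

\min_{w} \; \max_{\alpha \in [0,1]^n} \;\; \frac{\lambda}{2} \|w\|^2 + \frac{1}{n} \sum_{i=1}^{n} \alpha_i \left( 1 - y_i \langle w, x_i \rangle \right).

Since \max_{\alpha_i \in [0,1]} \alpha_i (1 - y_i \langle w, x_i \rangle) = \max(0, 1 - y_i \langle w, x_i \rangle), the inner maximization recovers the hinge loss, and primal-dual saddle-point methods can then be applied to the min-max problem.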
Twin support vector regression (TSVR) is a new regression algorithm, which aims at finding ε-insensitive up- and down-bound functions for the training points. In order to do so, one needs to solve a pair of smaller-sized quadratic programming problems (QPPs) rather than a single large one as in classical SVR. However, the same penalties are given to the samples in TSVR. In fact, samples in the di...
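For orientation, TSVR fits two non-parallel functions, an ε1-insensitive down-bound and an ε2-insensitive up-bound (the generic linear form below is an assumption, not quoted from the abstract):

f_1(x) = w_1^\top x + b_1 \;\; \text{(down-bound)}, \qquad f_2(x) = w_2^\top x + b_2 \;\; \text{(up-bound)}, \qquad f(x) = \tfrac{1}{2}\left( f_1(x) + f_2(x) \right),

where each pair (w_k, b_k) is obtained from its own smaller QPP and the final regressor f is their average, in contrast to the single large QP solved in standard SVR.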
In this work, an implicit Lagrangian for the dual twin support vector regression is proposed. Our formulation leads to determining non-parallel ε-insensitive down- and up-bound functions for the unknown regressor by constructing two unconstrained quadratic programming problems of smaller size, instead of a single large one as in the standard support vector regression (SVR). The two related suppor...