Search results for: genetic algorithms and support vector machines
Number of results: 16,990,544
An implicit Lagrangian for the dual of a simple reformulation of the standard quadratic program of a linear support vector machine is proposed. This leads to the minimization of an unconstrained differentiable convex function in a space of dimensionality equal to the number of classified points. This problem is solvable by an extremely simple linearly convergent Lagrangian support vector machin...
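As a rough illustration of such an unconstrained dual formulation, the sketch below assumes the common linear-SVM reformulation whose dual matrix is Q = I/ν + D(AAᵀ + eeᵀ)D and replaces the nonnegativity constraint with a quadratic penalty; it is not the implicit-Lagrangian algorithm of the abstract, only a stand-in showing a smooth minimization with one variable per classified point.

```python
# Sketch (not the paper's implicit-Lagrangian algorithm): the constrained dual
# min_{u>=0} 0.5 u'Qu - e'u of a linear SVM, with Q = I/nu + D(AA' + ee')D,
# approximated by an unconstrained smooth objective via a quadratic penalty on
# the negative part of u, then solved with scipy.
import numpy as np
from scipy.optimize import minimize

def fit_linear_svm_dual(A, d, nu=1.0, alpha=10.0):
    """A: (m, n) data matrix, d: (m,) labels in {-1, +1}."""
    m = A.shape[0]
    D = np.diag(d.astype(float))
    e = np.ones(m)
    Q = np.eye(m) / nu + D @ (A @ A.T + np.outer(e, e)) @ D

    def objective(u):
        neg = np.minimum(u, 0.0)              # quadratic penalty replaces u >= 0
        return 0.5 * u @ Q @ u - e @ u + 0.5 * alpha * neg @ neg

    def grad(u):
        return Q @ u - e + alpha * np.minimum(u, 0.0)

    u = minimize(objective, np.zeros(m), jac=grad, method="L-BFGS-B").x
    w = A.T @ (d * u)                         # recover primal weights
    gamma = -e @ (d * u)                      # and threshold
    return w, gamma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(60, 2))
    d = np.sign(A[:, 0] + A[:, 1])
    d[d == 0] = 1
    print(fit_linear_svm_dual(A, d))
```

The optimization runs over one variable per training point, matching the dimensionality noted in the abstract.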
We introduce in this paper Fβ SVMs, a new parametrization of support vector machines. It allows an SVM to be optimized in terms of Fβ, a classical information retrieval criterion, instead of the usual classification rate. Experiments illustrate the advantages of this approach with respect to the traditional 2-norm soft-margin SVM when precision and recall are of unequal importance. An automatic mod...
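A practical stand-in for this idea (not the paper's Fβ parametrization of the SVM objective itself, just hyperparameter selection of a standard soft-margin SVM by a cross-validated Fβ scorer, with β = 2 chosen arbitrarily to weight recall above precision):

```python
# Select C and class_weight of a soft-margin SVM by cross-validated F-beta.
from sklearn.datasets import make_classification
from sklearn.metrics import fbeta_score, make_scorer
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, weights=[0.9, 0.1], random_state=0)
f2 = make_scorer(fbeta_score, beta=2)
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "class_weight": [None, "balanced"]},
    scoring=f2, cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```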
By setting apart the two functions of a support vector machine: separation of points by a nonlinear surface in the original space of patterns, and maximizing the distance between separating planes in a higher dimensional space, we are able to define indefinite, possibly discontinuous, kernels, not necessarily inner product ones, that generate highly nonlinear separating surfaces. Maximizing the ...
The Support Vector Machine (SVM) has shown great performance in practice as a classification methodology. Oftentimes multicategory problems have been treated as a series of binary problems in the SVM paradigm. Even though the SVM implements the optimal classification rule asymptotically in the binary case, solutions to a series of binary problems may not be optimal for the original multicategor...
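The "series of binary problems" reduction mentioned above can be sketched as follows; this is the baseline the abstract questions, not the paper's proposed multicategory SVM:

```python
# One-vs-rest binary SVMs on a 3-class problem: one binary SVM per class,
# combined through their decision values.
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
clf = OneVsRestClassifier(SVC(kernel="linear", C=1.0)).fit(X, y)
print(clf.score(X, y))
```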
In ranking problems, the goal is to learn a ranking function r(x) ∈ R from labeled pairs x, x′ of input points. In this paper, we consider the related comparison problem, where the label y ∈ {−1, 0, 1} indicates which element of the pair is better, or if there is no significant difference. We cast the learning problem as a margin maximization, and show that it can be solved by converting it to ...
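One standard reduction of such pairwise comparison data to a margin-maximization problem is sketched below; the abstract's exact conversion is truncated, so the difference-vector construction and the handling of ties are assumed for illustration only.

```python
# For labeled pairs (x, x') with y in {-1, +1}, learn a linear scoring function
# r(x) = w.x by training a linear SVM on the difference vectors x - x'; pairs
# with "no significant difference" (y = 0) are simply dropped here.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(200, 5)), rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = np.sign(X1 @ w_true - X2 @ w_true).astype(int)

keep = y != 0                      # discard tied pairs in this illustration
diffs = (X1 - X2)[keep]
svm = LinearSVC(C=1.0, max_iter=10000).fit(diffs, y[keep])
print(np.corrcoef(svm.coef_.ravel(), w_true)[0, 1])  # recovered ranking direction
```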
This paper proposes a mathematical programming framework for combining SVMs with possibly different kernels. Compared to single SVMs, the advantage of this approach is twofold: it creates SVMs with local domains of expertise leading to local enlargements of the margin, and it allows the use of simple linear kernels combined with a fixed boolean operation that is particularly well suited for buil...
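A toy illustration of combining SVMs through a fixed boolean operation follows; the mathematical programming framework itself is not reproduced, and the AND combination and kernel choices below are assumptions made purely for the example.

```python
# Two SVMs with different kernels combined by a fixed boolean AND: a point is
# labeled positive only when both component SVMs agree.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
svm_lin = SVC(kernel="linear").fit(X, y)
svm_rbf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

pred = (svm_lin.predict(X) == 1) & (svm_rbf.predict(X) == 1)
print((pred.astype(int) == y).mean())
```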
Support Vector Machines (SVMs) are state-of-the-art algorithms for classification in machine learning. However, the SVM formulation does not directly seek to find sparse solutions. In this work, we propose an alternate formulation that explicitly imposes sparsity. We show that the proposed technique is related to the standard SVM formulation and therefore shares similar theoretical guarantees. ...
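The paper's sparse formulation is not given here; as a familiar contrast to the standard 2-norm SVM, an L1-regularized linear SVM illustrates how a sparsity-inducing penalty drives most weight coordinates exactly to zero.

```python
# Compare nonzero weight counts of a standard 2-norm linear SVM and an
# L1-regularized linear SVM on a problem with few informative features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                           random_state=0)
dense = LinearSVC(penalty="l2", dual=True, C=1.0, max_iter=5000).fit(X, y)
sparse = LinearSVC(penalty="l1", dual=False, C=1.0, max_iter=5000).fit(X, y)
print(np.sum(dense.coef_ != 0), np.sum(sparse.coef_ != 0))  # nonzero weights
```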
The Vicinal Risk Minimization principle establishes a bridge between generative models and methods derived from the Structural Risk Minimization Principle such as Support Vector Machines or Statistical Regularization. We explain how VRM provides a framework which integrates a number of existing algorithms, such as Parzen windows, Support Vector Machines, Ridge Regression, Constrained Logistic C...
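A minimal sketch of the vicinal idea, assuming Gaussian vicinity functions around each training point: the empirical risk is replaced by risk over points drawn from a small Gaussian around each example, realized here as simple data augmentation before fitting a linear SVM. The bandwidth and augmentation factor are arbitrary choices, and this is only one simple instance of VRM.

```python
# Gaussian-vicinity augmentation followed by a linear SVM fit.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=100, n_features=10, random_state=0)
rng = np.random.default_rng(0)
sigma, reps = 0.3, 10
X_vic = np.concatenate([X + sigma * rng.normal(size=X.shape) for _ in range(reps)])
y_vic = np.tile(y, reps)
clf = LinearSVC(max_iter=5000).fit(X_vic, y_vic)
print(clf.score(X, y))
```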
Abstract: This paper considers the problem of scheduling n jobs on m unrelated parallel machines with sequence-dependent setup times. To better comply with industrial situations, jobs have varying due dates and ready times, and there are some precedence relations between them. Furthermore, sequence-dependent setup times and anticipatory setups are included in the proposed model. The objective is ...
[Chart: number of search results per year]