Search results for: quadratic support

Number of results: 702776

2008
Margot Rabl, Bert Jüttler, Laureano Gonzalez-Vega

We consider surfaces whose support function is obtained by restricting a quadratic polynomial to the unit sphere. If such a surface is subject to a rigid body motion, then the Gauss image of the characteristic curves is shown to be a spherical quartic curve, making them accessible to exact geometric computation. In particular we analyze the case of moving surfaces of revolution.
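
For orientation, here is a hedged sketch of the setting the abstract refers to, using standard support-function geometry rather than the paper's own notation; the symbols A, b, c are illustrative only.

```latex
% Quadratic polynomial restricted to the unit sphere S^2 as a support function
% (A a symmetric 3x3 matrix, b a vector, c a scalar -- illustrative symbols):
\[
  h(\mathbf{u}) = \mathbf{u}^{\top} A\,\mathbf{u} + \mathbf{b}^{\top}\mathbf{u} + c,
  \qquad \mathbf{u} \in S^{2}.
\]
% The supported surface is the envelope of the tangent planes
% \{ \mathbf{x} : \langle \mathbf{x}, \mathbf{u} \rangle = h(\mathbf{u}) \},
% recovered from h via
\[
  \mathbf{x}(\mathbf{u}) = h(\mathbf{u})\,\mathbf{u} + \nabla_{S^{2}} h(\mathbf{u}),
\]
% where \nabla_{S^2} denotes the tangential gradient on the sphere.
```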

2004
Filippo Portera, Alessandro Sperduti

The standard SVR formulation for real-valued function approximation on multidimensional spaces is based on the ε-insensitive loss function, where errors are assumed to be uncorrelated. As a result, local information in the feature space that could be used to improve the prediction model is disregarded. In this paper we address this problem by defining a generalized quadratic loss where the co-oc...
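
As a rough illustration of the idea, the sketch below couples the ε-insensitive residuals of nearby training points through a quadratic form ξᵀSξ. This is one reading of the abstract, not the authors' formulation; the function names and the RBF similarity matrix S are assumptions.

```python
import numpy as np

def epsilon_insensitive_residuals(y_true, y_pred, eps=0.1):
    """Per-sample epsilon-insensitive residuals: max(|y - f(x)| - eps, 0)."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)

def generalized_quadratic_loss(y_true, y_pred, S, eps=0.1):
    """Quadratic loss xi^T S xi that couples the errors of similar points.

    S is a positive semi-definite similarity matrix over the training inputs;
    S = I recovers the usual (uncoupled) sum of squared residuals.
    """
    xi = epsilon_insensitive_residuals(y_true, y_pred, eps)
    return float(xi @ S @ xi)

# Toy usage with an RBF similarity matrix over 1-D inputs (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=20)
y = np.sin(3 * X) + 0.05 * rng.normal(size=20)
y_hat = np.sin(3 * X)                      # stand-in predictions
S = np.exp(-(X[:, None] - X[None, :]) ** 2)
print(generalized_quadratic_loss(y, y_hat, S))
```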

Journal: Neural Networks: the official journal of the International Neural Network Society, 2003
Daisuke Tsujinishi, Shigeo Abe

In least squares support vector machines (LS-SVMs), the optimal separating hyperplane is obtained by solving a set of linear equations instead of solving a quadratic programming problem. But since SVMs and LS-SVMs are formulated for two-class problems, unclassifiable regions exist when they are extended to multiclass problems. In this paper, we discuss fuzzy LS-SVMs that resolve unclassifiable ...
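
For context, below is a minimal sketch of the standard binary LS-SVM training step that the abstract contrasts with QP-based SVMs. The fuzzy multiclass extension the paper actually discusses is not shown; the kernel choice and parameter names are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_ls_svm(X, y, C=10.0, gamma=1.0):
    """Binary LS-SVM: one linear system instead of a quadratic program.

    Standard LS-SVM dual system, with labels y in {-1, +1}:
        [ 0      y^T         ] [ b     ]   [ 0 ]
        [ y   Omega + I / C  ] [ alpha ] = [ 1 ]
    where Omega_ij = y_i * y_j * K(x_i, x_j).
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def predict_ls_svm(X_train, y_train, alpha, b, X_test, gamma=1.0):
    """Sign of the LS-SVM decision function."""
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage on two Gaussian blobs (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (10, 2)), rng.normal(+1, 0.3, (10, 2))])
y = np.array([-1.0] * 10 + [1.0] * 10)
alpha, b = train_ls_svm(X, y)
print(predict_ls_svm(X, y, alpha, b, X))
```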

2002
Mario Martin

This paper describes an on-line method for building ε-insensitive support vector machines for regression, as described by Vapnik (1995). The method extends the approach developed by Cauwenberghs and Poggio (2000) for building incremental support vector machines for classification. Machines obtained with this approach are equivalent to the ones obtained by applying exact methods like...

2006
Wang Zhongdong

The support vector machine has become an increasingly popular tool for machine learning tasks involving classification, regression, or novelty detection. Training a support vector machine requires the solution of a very large quadratic programming problem, and traditional optimization methods cannot be applied directly because of memory restrictions. Up to now, several approaches exist for circumventing the...
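
For reference, the standard soft-margin SVM dual (textbook background, not specific to this paper) makes the memory issue concrete: the quadratic term involves the dense n × n kernel matrix.

```latex
% Standard soft-margin SVM dual, labels y_i in {-1, +1}:
\[
  \max_{\boldsymbol{\alpha}} \;
      \sum_{i=1}^{n} \alpha_i
      - \tfrac{1}{2} \sum_{i,j=1}^{n} \alpha_i \alpha_j\, y_i y_j\, K(\mathbf{x}_i, \mathbf{x}_j)
  \quad \text{s.t.} \quad
  0 \le \alpha_i \le C, \qquad \sum_{i=1}^{n} \alpha_i y_i = 0 .
\]
% Storing the n x n kernel matrix is what makes the QP memory-bound for large
% n and motivates decomposition and working-set methods.
```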

2004
Filippo Portera, Alessandro Sperduti

The standard SVM formulation for binary classification is based on the hinge loss function, where errors are assumed to be uncorrelated. As a result, local information in the feature space that could be used to improve the prediction model is disregarded. In this paper we address this problem by defining a generalized quadratic loss where the co-occurrence of errors is weighted according to a k...

Journal: Mathematics, 2022

With the development of science and technology, more and more data are being produced. For many of these datasets, only some of the samples have labels. In order to make full use of the information in the data, it is necessary to classify them. In this paper, we propose a strong sparse quadratic kernel-free least squares semi-supervised support vector machine (SSQLSS3VM), to which an ℓ0-norm regularization term is added to induce sparsity. An NP-hard problem arises ...

Ali Akbar Mohsenipour, Serge B. Provost

Noncentral indefinite quadratic expressions in possibly non-singular normal vectors are represented in terms of the difference of two positive definite quadratic forms and an independently distributed linear combination of standard normal random variables. This result also applies to quadratic forms in singular normal vectors for which no general representation is currently available. The ...
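
Schematically, the representation described in the abstract has the following shape; the symbols below are generic and not the authors' notation.

```latex
% Indefinite quadratic expression in a normal vector X ~ N(mu, Sigma)
% (generic symbols, illustrative only):
\[
  Q(\mathbf{X}) = \mathbf{X}^{\top} A\, \mathbf{X} + \mathbf{a}^{\top}\mathbf{X}
  \;\stackrel{d}{=}\; Q_{1} - Q_{2} + L ,
\]
% where Q_1 and Q_2 are positive definite quadratic forms associated with the
% positive and negative eigenvalues of the transformed matrix A, and L is a
% linear combination of standard normal variables distributed independently
% of Q_1 and Q_2.
```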
