Search results for: support vector regression

Number of results: 1101317

2006
Sedat Ozer, Hakan A. Çirpan, Nihat Kabaoglu

This paper addresses the problem of applying a powerful statistical pattern classification algorithm based on kernel functions to target tracking in surveillance systems. Rather than directly adapting a recognizer, we develop a localizer directly using the regression form of the Support Vector Machines (SVM). The proposed approach considers using the dynamic model together with the feature vectors and mak...

2007
Ignacio Lopez-Moreno, Ismael Mateos-Garcia, Daniel Ramos-Castro, Joaquín González-Rodríguez

This paper explores Support Vector Regression (SVR) as an alternative to the widely-used Support Vector Classification (SVC) in GLDS (Generalized Linear Discriminative Sequence)-based speaker verification. SVR allows the use of an ε-insensitive loss function, which presents many advantages. First, the optimization of the ε parameter adapts the system to the variability of the features extracted fr...
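The ε-insensitive loss this abstract builds on can be sketched with scikit-learn's `SVR` (a generic illustration, not the GLDS speaker-verification system from the paper; the data and parameter values here are invented for the example):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.linspace(0, 4, 40).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.randn(40)

# Residuals smaller than epsilon incur zero loss, so tuning epsilon
# adapts the model's tolerance to noise/variability in the features.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
pred = model.predict(X)
```

Only points whose residuals exceed ε become support vectors, which is what makes the ε parameter a natural knob for feature variability.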

Journal: :Neurocomputing 2003
Jinbo Bi, Kristin P. Bennett

We develop an intuitive geometric framework for support vector regression (SVR). By examining when ε-tubes exist, we show that SVR can be regarded as a classification problem in the dual space. Hard and soft ε-tubes are constructed by separating the convex or reduced convex hulls respectively of the training data with the response variable shifted up and down by ε. A novel SVR model is proposed...

Journal: :IEEE Trans. Pattern Anal. Mach. Intell. 2000
Olvi L. Mangasarian, David R. Musicant

The robust Huber M-estimator, a differentiable cost function that is quadratic for small errors and linear otherwise, is modeled exactly, in the original primal space of the problem, by an easily solvable simple convex quadratic program for both linear and nonlinear support vector estimators. Previous models were significantly more complex or formulated in the dual space and most involved spec...
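The quadratic-then-linear Huber loss described above is available off the shelf; a minimal sketch using scikit-learn's `HuberRegressor` (which minimizes the same loss, though not via the paper's primal QP formulation; the data are synthetic):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.RandomState(1)
X = rng.randn(100, 1)
y = 3.0 * X.ravel() + 0.1 * rng.randn(100)
y[:5] += 50.0  # gross outliers

# Errors below the epsilon threshold are penalized quadratically,
# larger ones only linearly, so the outliers barely move the fit.
huber = HuberRegressor(epsilon=1.35).fit(X, y)
# huber.coef_[0] stays close to the true slope of 3 despite the outliers.
```

An ordinary least-squares fit on the same data would be pulled toward the outliers, since its quadratic loss grows without bound.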

2005
Chao Fan, Hossein Sarrafzadeh, Farhad Dadgostar, Hamid Gholamhosseini

Real-time facial expression analysis is one of the important topics in the development of the next generation of affect-sensitive user interfaces. However, current algorithms and techniques are computationally expensive and therefore are not suitable for real-time applications. In this paper we present a real-time system for analyzing the six basic facial expressions of the human user, using regressi...

2006
Yuya Kamada, Shigeo Abe

In our previous work we have shown that Mahalanobis kernels are useful for support vector classifiers in terms of both generalization ability and model selection speed. In this paper we propose using Mahalanobis kernels for function approximation. We determine the covariance matrix for the Mahalanobis kernel using all the training data. Model selection is done by line search. Namely, first the margin ...
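A Mahalanobis kernel of the kind the abstract describes can be plugged into SVR as a callable kernel; a sketch assuming the form k(x, z) = exp(-δ (x - z)ᵀ Σ⁻¹ (x - z)) with Σ estimated from all the training data (δ, the other parameters, and the data are invented for illustration):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = rng.randn(80, 2) * np.array([1.0, 5.0])  # deliberately unequal scales
y = np.sin(X[:, 0]) + 0.1 * rng.randn(80)

# Covariance estimated from all training data, as in the abstract.
Sigma_inv = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis_kernel(A, B, delta=0.5):
    # Gram matrix of squared Mahalanobis distances passed through exp(-.)
    d = A[:, None, :] - B[None, :, :]
    sq = np.einsum("ijk,kl,ijl->ij", d, Sigma_inv, d)
    return np.exp(-delta * sq)

model = SVR(kernel=mahalanobis_kernel, C=10.0).fit(X, y)
pred = model.predict(X)
```

Because Σ⁻¹ rescales each direction by the data's own spread, a single scalar δ can be tuned by line search instead of searching a per-feature width, which is the model-selection speedup the authors point to.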

Journal: :Journal of Machine Learning Research 2012
Chia-Hua Ho, Chih-Jen Lin

Support vector regression (SVR) and support vector classification (SVC) are popular learning techniques, but their use with kernels is often time consuming. Recently, linear SVC without kernels has been shown to give competitive accuracy for some applications while enjoying much faster training and testing. However, few studies have focused on linear SVR. In this paper, we extend state-of-the-art trai...
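The linear-versus-kernel contrast can be sketched with scikit-learn, whose `LinearSVR` solves the linear problem without forming a kernel matrix (a generic illustration, not the paper's solver; the data and parameter values are invented):

```python
import numpy as np
from sklearn.svm import SVR, LinearSVR

rng = np.random.RandomState(0)
X = rng.randn(500, 20)
w = rng.randn(20)
y = X @ w + 0.1 * rng.randn(500)

# Same underlying linear model; LinearSVR never builds the n-by-n
# kernel matrix, so it scales far better as n grows.
linear = LinearSVR(C=1.0, epsilon=0.0, max_iter=5000).fit(X, y)
kernel = SVR(kernel="linear", C=1.0, epsilon=0.0).fit(X, y)
```

For large n, the kernelized solver's O(n²) Gram matrix dominates cost, which is the gap the paper's linear-SVR training methods target.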

2001
Wei Chu, S. Sathiya Keerthi, Chong Jin Ong


2004
Yahya Forghani, Hadi Sadoghi Yazdi, Sohrab Effati

In this paper, we incorporate the concept of fuzzy set theory into support vector regression (SVR). In our proposed method, the target outputs of the training samples are considered to be fuzzy numbers, and then the membership function of the actual output (the objective hyperplane in a high-dimensional feature space) is obtained. Two main properties of our proposed method are: (1) membership function of actual ...

1999
Sethu Vijayakumar, S. Wu

Support Vector Machines (SVMs) map the input training data into a high-dimensional feature space and find a maximal margin hyperplane separating the data in that feature space. Extensions of this approach account for non-separable or noisy training data (soft classifiers) as well as support vector based regression. The optimal hyperplane is usually found by solving a quadratic programming problem ...
