Search results for: orthogonal regression

Number of results: 362,809

Journal: :Statistical Analysis and Data Mining 2013
Bradley C. Turnbull Subhashis Ghosal Hao Helen Zhang

High-dimensional data are nowadays encountered in various branches of science, and variable selection techniques play a key role in their analysis. Two approaches to variable selection in the high-dimensional setting are generally considered: forward selection methods and penalization methods. In the former, variables are introduced into the model one at a time depending on th...
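
To make the contrast concrete, forward selection in the linear setting can be sketched as below. This is an illustrative NumPy implementation on synthetic data, not this paper's method: at each step, the variable that most reduces the residual sum of squares of a least-squares refit is added.

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy forward selection: at each step add the column that most
    reduces the residual sum of squares of a least-squares fit."""
    n, p = X.shape
    selected = []
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in selected:
                continue
            cols = selected + [j]
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
    return selected

# Synthetic data: only columns 0 and 3 actually enter the model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)
chosen = forward_select(X, y, k=2)
```

With a strong signal, the two true variables are recovered in the first two steps.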

Journal: :Int. J. Systems Science 2015
Yuzhu Guo Lingzhong Guo Stephen A. Billings Hua-Liang Wei

A novel iterative learning algorithm is proposed to improve the classic orthogonal forward regression (OFR) algorithm in an attempt to produce an optimal solution under a purely OFR framework without using any other auxiliary algorithms. The new algorithm searches for the optimal solution on a global solution space while maintaining the advantage of simplicity and computational efficiency. Both...
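
The classic OFR step that this work builds on can be sketched as follows. This is a minimal, illustrative NumPy version (not the authors' iterative learning algorithm): each candidate regressor is orthogonalized against the already-selected ones by Gram-Schmidt and ranked by its error reduction ratio.

```python
import numpy as np

def ofr(X, y, k):
    """Classic orthogonal forward regression: greedily pick the candidate
    whose component orthogonal to the selected regressors explains the
    largest share of the output energy (error reduction ratio, ERR)."""
    n, p = X.shape
    selected, Q = [], []
    for _ in range(k):
        best_j, best_err, best_q = None, -1.0, None
        for j in range(p):
            if j in selected:
                continue
            q = X[:, j].astype(float)
            for u in Q:                       # Gram-Schmidt step
                q = q - (u @ q) / (u @ u) * u
            denom = q @ q
            if denom < 1e-12:                 # candidate is redundant
                continue
            err = (q @ y) ** 2 / (denom * (y @ y))   # ERR criterion
            if err > best_err:
                best_j, best_err, best_q = j, err, q
        selected.append(best_j)
        Q.append(best_q)
    return selected

# Synthetic data: only columns 0 and 3 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)
sel = ofr(X, y, k=2)
```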

Journal: :IEEE transactions on neural networks and learning systems 2021

Effective features can improve the performance of a model and help us understand the characteristics underlying the structure of complex data. Previously proposed feature selection methods usually cannot retain more discriminative information. To address this shortcoming, we propose a novel supervised orthogonal least squares regression with weighting for feature selection. The optimization problem objective functio...

2015
Andrej-Nikolai Spiess

Orthogonal nonlinear least squares (ONLS) regression is a rarely applied and largely overlooked regression technique that comes into question when one encounters an "error in variables" problem. While classical nonlinear least squares (NLS) aims to minimize the sum of squared vertical residuals, ONLS minimizes the sum of squared orthogonal residuals. The method is based on finding po...
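
In the linear special case, orthogonal regression has a closed form via the SVD (total least squares), which illustrates the orthogonal-residual idea this abstract describes. A minimal NumPy sketch on synthetic errors-in-variables data (all names illustrative):

```python
import numpy as np

def tls_line(x, y):
    """Orthogonal (total least squares) fit of a line y = a*x + b.
    The direction minimizing the sum of squared orthogonal distances
    is the smallest right singular vector of the centered data matrix."""
    xm, ym = x.mean(), y.mean()
    M = np.column_stack([x - xm, y - ym])
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    nx, ny = Vt[-1]          # normal vector of the best-fit line
    a = -nx / ny             # slope
    b = ym - a * xm          # intercept
    return a, b

# Errors-in-variables data: noise in both x and y.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 100)
x = t + 0.2 * rng.normal(size=100)
y = 2.0 * t + 1.0 + 0.2 * rng.normal(size=100)
a, b = tls_line(x, y)
```

Unlike ordinary least squares, this fit stays consistent when the predictor itself is measured with noise.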

1998
Michael Kohler

Let (X, Y) be a pair of random variables with supp(X) ⊆ [0, 1] and EY² < ∞. Let m be the corresponding regression function. Estimation of m from i.i.d. data is considered. The L₂ error with integration with respect to the design measure (i.e., the distribution of X) is used as an error criterion. Estimates are constructed by estimating the coefficients of an orthonormal expansion of the regression...
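
The construction can be illustrated with a cosine basis, which is orthonormal on [0, 1] with respect to Lebesgue measure. The sketch below assumes a uniform design (so the design measure coincides with Lebesgue measure) and estimates each expansion coefficient by a sample average; it illustrates the general idea, not the paper's specific estimator.

```python
import numpy as np

def cosine_basis(x, j):
    """Orthonormal cosine basis on [0, 1] w.r.t. Lebesgue measure."""
    return np.ones_like(x) if j == 0 else np.sqrt(2.0) * np.cos(j * np.pi * x)

def series_estimate(X, Y, J, xgrid):
    """Estimate m(x) = E[Y | X = x] by truncating its orthonormal
    expansion: each coefficient E[Y * phi_j(X)] is replaced by the
    sample mean (valid here because X is uniform on [0, 1])."""
    mhat = np.zeros_like(xgrid)
    for j in range(J + 1):
        c_j = np.mean(Y * cosine_basis(X, j))   # empirical coefficient
        mhat = mhat + c_j * cosine_basis(xgrid, j)
    return mhat

rng = np.random.default_rng(2)
X = rng.uniform(size=4000)
def m(x):
    return np.cos(2 * np.pi * x)                # true regression function
Y = m(X) + 0.1 * rng.normal(size=4000)
xg = np.linspace(0.0, 1.0, 50)
mhat = series_estimate(X, Y, J=6, xgrid=xg)
err = np.max(np.abs(mhat - m(xg)))
```

Here the true regression function lies in the span of the basis, so only the sampling error of the coefficients remains.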

Journal: :IEEE Trans. Systems, Man, and Cybernetics, Part A 2000
Giuseppe Carlo Calafiore

This paper deals with the problem of multivariate affine regression in the presence of outliers in the data. The method discussed is based on weighted orthogonal least squares. The weights associated with the data satisfy a suitable optimality criterion and are computed by a two-step algorithm requiring a RANSAC step and a gradient-based optimization step. Issues related to the breakdown point ...
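
The two-step idea (a RANSAC-style search for inliers followed by an orthogonal fit) can be illustrated in the simplest setting, a line in the plane. This NumPy sketch is a loose illustration of the idea, not the paper's weighted algorithm:

```python
import numpy as np

def ransac_tls_line(x, y, n_iter=200, tol=0.5, seed=0):
    """RANSAC-style robust orthogonal line fit: repeatedly fit a line
    through two random points, keep the candidate with the most inliers
    (points within `tol` orthogonal distance), then refit the inliers
    by total least squares (orthogonal regression)."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        d = np.abs(a * x - y + b) / np.sqrt(a * a + 1.0)  # orthogonal dist.
        inliers = d < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    xi, yi = x[best_inliers], y[best_inliers]
    M = np.column_stack([xi - xi.mean(), yi - yi.mean()])
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    nx, ny = Vt[-1]
    a = -nx / ny
    b = yi.mean() - a * xi.mean()
    return a, b

# Synthetic line with 20% gross outliers.
rng = np.random.default_rng(4)
t = np.linspace(0, 10, 200)
x = t + 0.05 * rng.normal(size=200)
y = 1.5 * t + 2.0 + 0.1 * rng.normal(size=200)
out = rng.choice(200, size=40, replace=False)
y[out] = rng.uniform(-20, 20, size=40)
a, b = ransac_tls_line(x, y)
```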

2011
Aurelie C. Lozano Grzegorz Swirszcz Naoki Abe

We consider a matching pursuit approach for variable selection and estimation in logistic regression models. Specifically, we propose Logistic Group Orthogonal Matching Pursuit (LogitGOMP), which extends the Group-OMP procedure originally proposed for linear regression models, to select groups of variables in logistic regression models, given a predefined grouping structure within the explanato...
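
The Group-OMP procedure that LogitGOMP extends can be sketched in its original linear-regression form: greedily pick the group whose columns best correlate with the current residual, then refit by least squares on all selected groups. An illustrative NumPy version on synthetic data:

```python
import numpy as np

def group_omp(X, y, groups, k):
    """Group orthogonal matching pursuit (linear-regression version):
    select the group whose columns correlate most with the residual,
    then refit least squares on all selected groups."""
    selected, resid = [], y.astype(float)
    for _ in range(k):
        scores = {g: np.linalg.norm(X[:, idx].T @ resid)
                  for g, idx in groups.items() if g not in selected}
        selected.append(max(scores, key=scores.get))
        cols = [j for g in selected for j in groups[g]]
        beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        resid = y - X[:, cols] @ beta
    return selected

# Three predefined groups of three columns; only group 0 carries signal.
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 9))
groups = {0: [0, 1, 2], 1: [3, 4, 5], 2: [6, 7, 8]}
y = X[:, [0, 1, 2]] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.normal(size=300)
chosen = group_omp(X, y, groups, k=1)
```

The logistic variant in the abstract replaces the least-squares refit and correlation score with their logistic-likelihood counterparts.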

Chart: number of search results per year
