Search results for: orthogonal regression

Number of results: 362809

1994
R. J. Carroll, David Ruppert

Orthogonal regression is one of the standard linear regression methods to correct for the effects of measurement error in predictors. We argue that orthogonal regression is often misused in errors-in-variables linear regression because of a failure to account for equation errors. The typical result is to overcorrect for measurement error, i.e., to overestimate the slope, because equation error is ...
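
To make the phenomenon concrete, here is a minimal sketch (not the authors' code) of orthogonal regression for a simple linear model, using the closed-form slope under an assumed error-variance ratio of one; the synthetic data, noise scales, and true slope of 1 are illustrative assumptions. When equation error is present, the orthogonal fit overshoots the true slope while ordinary least squares undershoots it:

```python
import numpy as np

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x (attenuated by error in x)."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def orthogonal_slope(x, y):
    """Orthogonal regression slope, assuming equal error variances in x and y:
    b = (s_yy - s_xx + sqrt((s_yy - s_xx)^2 + 4 s_xy^2)) / (2 s_xy)."""
    s_xx, s_yy = np.var(x), np.var(y)
    s_xy = np.cov(x, y, bias=True)[0, 1]
    return (s_yy - s_xx + np.sqrt((s_yy - s_xx) ** 2 + 4 * s_xy ** 2)) / (2 * s_xy)

# Illustrative data: true slope 1, equal measurement errors in x and y (scale 0.5),
# plus an equation error in y (scale 0.7) that the orthogonal fit ignores.
rng = np.random.default_rng(0)
n = 500
xi = rng.normal(size=n)                          # true predictor
x = xi + rng.normal(scale=0.5, size=n)           # observed predictor
y = xi + rng.normal(scale=0.5, size=n) + rng.normal(scale=0.7, size=n)

print("OLS slope:       ", ols_slope(x, y))         # biased toward zero
print("orthogonal slope:", orthogonal_slope(x, y))  # overcorrects past the true slope 1
```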

2004

In a recent review article, White and Piette provide an overview of the use of reverse regressions in discrimination-related litigation. They explain the technique, provide a model application, summarize its advantages and disadvantages, and identify litigation in which it has been used. We point out weaknesses in common uses of reverse regression, some of which might cause serious misinterpret...
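
As a concrete contrast, the following is a minimal sketch (not the White and Piette application) of direct versus reverse regression on synthetic data; the variable names (wage, score, group) and the error-prone qualification measure are assumptions made only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
group = rng.integers(0, 2, size=n).astype(float)   # 0/1 group indicator
qual = rng.normal(size=n) + 1.0 * group            # true (unobserved) qualification
wage = qual + rng.normal(scale=0.5, size=n)        # pay set on true qualification only
score = qual + rng.normal(scale=0.5, size=n)       # error-prone observed qualification

def fit(cols, target):
    """Least-squares coefficients for an intercept plus the given columns."""
    X = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(X, target, rcond=None)[0]

direct = fit([score, group], wage)    # direct: pay gap at equal measured qualification
reverse = fit([wage, group], score)   # reverse: qualification gap at equal pay

print("direct  regression, group coefficient:", round(direct[2], 3))
print("reverse regression, group coefficient:", round(reverse[2], 3))
# The two coefficients answer different questions; with an error-prone
# qualification measure they generally differ and can support opposite
# conclusions about the same data.
```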

Journal: Computers & OR, 1993
Andrzej Bargiela, Joanna Katherine Hartley

Scope and Purpose: In this paper, a new technique for solving a multivariate linear model using orthogonal least absolute values regression is proposed. The orthogonal least absolute values (ORLAV) regression minimises the sum of the absolute orthogonal distances from each data point to the resulting regression hyperplane. In a large set of equations where the variables are independent of ...
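
For orientation, a minimal sketch of the ORLAV objective, minimised here with a generic numerical optimiser rather than the procedure proposed in the paper; the data, the two-predictor model, and the Nelder-Mead starting point are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

def sum_abs_orthogonal_distances(params, X, y):
    """Sum of absolute orthogonal distances from the points (X, y) to the
    hyperplane w.x + c = y, i.e. |w.x + c - y| / sqrt(||w||^2 + 1)."""
    w, c = params[:-1], params[-1]
    return np.sum(np.abs(X @ w + c - y)) / np.sqrt(w @ w + 1.0)

# Illustrative data (assumed): two predictors, true plane y = 2*x1 - x2 + 1.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 1.0 + rng.laplace(scale=0.3, size=200)

# Generic solver standing in for the ORLAV procedure: start from the ordinary
# least-squares fit and minimise the L1 orthogonal loss.
w0 = np.linalg.lstsq(np.column_stack([X, np.ones(len(X))]), y, rcond=None)[0]
res = minimize(sum_abs_orthogonal_distances, w0, args=(X, y), method="Nelder-Mead")
print("hyperplane coefficients (w1, w2, c):", np.round(res.x, 3))
```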

2009
Ch. H. Müller, R. Wellmann

We present a comparison of different depth notions that are appropriate for classical and orthogonal regression with and without intercept. We consider the global depth and tangential depth introduced by Mizera (2002) and the simplicial depth first studied in detail for regression by Müller (2005). The global depth and the tangential depth are based on quality functions. These quality funct...
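
As a rough illustration of one of these notions, the following sketch computes a tangential-depth-style count for a one-parameter, no-intercept model with squared-loss criterion functions, for both classical and orthogonal residuals; the per-observation gradients, the data, and the candidate slopes are assumptions of this sketch, not the authors' construction (which also covers global and simplicial depth):

```python
import numpy as np

def tangential_depth_1d(grads):
    """For a one-dimensional parameter the candidate directions are just +1 and -1,
    so the tangential depth reduces to the smaller count of per-observation
    criterion gradients that are >= 0 versus <= 0."""
    return min(np.sum(grads >= 0), np.sum(grads <= 0))

def depth_classical(beta, x, y):
    """Depth of slope beta for no-intercept classical regression:
    per-observation loss (y - beta*x)^2, gradient -2*x*(y - beta*x)."""
    return tangential_depth_1d(-2.0 * x * (y - beta * x))

def depth_orthogonal(beta, x, y):
    """Depth of slope beta for no-intercept orthogonal regression:
    per-observation loss (y - beta*x)^2 / (1 + beta^2), whose gradient has the
    sign of -(y - beta*x) * (x + beta*y)."""
    r = y - beta * x
    return tangential_depth_1d(-2.0 * r * (x + beta * y) / (1.0 + beta ** 2) ** 2)

# Illustrative data (assumed): true slope 1.5, no intercept.
rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 1.5 * x + rng.normal(scale=0.5, size=200)

for beta in (0.5, 1.5, 3.0):    # depth should peak near the true slope
    print(beta, depth_classical(beta, x, y), depth_orthogonal(beta, x, y))
```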

2004
Sheng Chen, Xia Hong, Christopher J. Harris

An automatic algorithm is derived for constructing kernel density estimates based on a regression approach that directly optimizes generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. Local regularization is incorporated into the density construct...
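
To show the shared machinery, here is a minimal sketch of a generic orthogonal forward regression selection loop (Gram-Schmidt orthogonalisation plus the error reduction ratio); it omits the paper's leave-one-out score and local regularization, and the Gaussian-kernel candidates and sine target below are assumptions for illustration:

```python
import numpy as np

def ofr_select(P, y, n_terms):
    """Generic orthogonal forward regression: greedily pick columns of the
    candidate regressor matrix P, orthogonalising each candidate against the
    already selected basis (Gram-Schmidt) and choosing the candidate with the
    largest error reduction ratio (w.y)^2 / ((w.w)(y.y))."""
    selected, basis = [], []
    for _ in range(n_terms):
        best, best_err, best_w = None, -np.inf, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].copy()
            for q in basis:                      # orthogonalise against chosen basis
                w -= (q @ w) / (q @ q) * q
            if w @ w < 1e-12:                    # numerically dependent candidate
                continue
            err = (w @ y) ** 2 / ((w @ w) * (y @ y))   # error reduction ratio
            if err > best_err:
                best, best_err, best_w = j, err, w
        selected.append(best)
        basis.append(best_w)
    return selected

# Illustrative use (assumed setting): Gaussian kernels centred at the samples
# as candidate regressors, with target values y to be approximated.
rng = np.random.default_rng(4)
x = rng.uniform(-3, 3, size=100)
y = np.sin(x) + rng.normal(scale=0.1, size=100)
P = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)   # kernel candidate matrix
print("selected kernel centres:", np.sort(x[ofr_select(P, y, 5)]))
```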

Journal: Int. J. Systems Science, 2017
Xia Hong, Sheng Chen, Yi Guo, Junbin Gao

An l1-norm penalized orthogonal forward regression (l1-POFR) algorithm is proposed based on the concept of the leave-one-out mean square error (LOOMSE). Firstly, a new l1-norm penalized cost function is defined in the constructed orthogonal space, and each orthogonal basis is associated with an individually tunable regularization parameter. Secondly, due to the orthogonal computation, the LOOMSE can be anal...
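
The paper's LOOMSE is evaluated recursively in the orthogonalised model space with per-basis regularization; a minimal sketch of the underlying closed-form leave-one-out identity for a plain linear least-squares model (the data and dimensions here are assumptions) is:

```python
import numpy as np

def loo_mse(X, y):
    """Leave-one-out MSE for a linear least-squares fit, using the closed-form
    identity e_(-i) = e_i / (1 - h_ii) with h_ii the hat-matrix diagonal,
    so the model never has to be refitted n times."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    H = X @ np.linalg.solve(X.T @ X, X.T)       # hat matrix
    loo_resid = resid / (1.0 - np.diag(H))
    return np.mean(loo_resid ** 2)

# Sanity check against explicit refitting on small assumed data.
rng = np.random.default_rng(5)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.3, size=30)

explicit = []
for i in range(30):
    keep = np.arange(30) != i
    b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    explicit.append((y[i] - X[i] @ b) ** 2)

print("closed-form LOOMSE:", loo_mse(X, y))
print("explicit    LOOMSE:", np.mean(explicit))
```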
