Search results for: orthogonal regression

Number of results: 362809

Journal: Research Journal of Applied Sciences, Engineering and Technology 2013

2002
Sheng Chen

The paper proposes combining orthogonal least squares (OLS) model selection with local regularisation for efficient sparse kernel data modelling. By assigning each orthogonal weight in the regression model an individual regularisation parameter, the ability of OLS model selection to produce a very parsimonious model with excellent generalisation performance is greatly enhanced.
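
As a rough sketch of the idea described in this abstract (not the paper's exact algorithm), the snippet below orthogonalises a set of candidate regressors with a QR factorisation and estimates each orthogonal weight with its own regularisation parameter; the error-reduction-ratio selection criterion is the standard OLS convention, and all variable names are illustrative assumptions.

```python
import numpy as np

def regularised_ols_weights(Phi, y, lam):
    """Orthogonalise the regressors and estimate each orthogonal
    weight with its own regularisation parameter lam[k].

    Phi : (N, M) candidate regressor matrix
    y   : (N,)  target vector
    lam : (M,)  individual regularisation parameters
    """
    Q, R = np.linalg.qr(Phi)        # Q has orthonormal columns
    # Regularised orthogonal weights: g_k = q_k^T y / (q_k^T q_k + lam_k);
    # after QR, q_k^T q_k = 1.
    g = (Q.T @ y) / (1.0 + lam)
    # Error-reduction ratio of each orthogonal term, used to rank
    # candidate regressors for inclusion in a sparse model.
    err = g**2 / (y @ y)
    return g, err

rng = np.random.default_rng(0)
Phi = rng.normal(size=(100, 5))
w_true = np.array([2.0, 0.0, -1.0, 0.0, 0.0])   # only two active regressors
y = Phi @ w_true + 0.05 * rng.normal(size=100)
g, err = regularised_ols_weights(Phi, y, lam=np.full(5, 1e-3))
# Terms with large error-reduction ratios correspond to the true regressors
```

In a full forward-selection loop the regressor with the largest error-reduction ratio would be picked at each step and the remaining columns re-orthogonalised against it; here a single QR pass stands in for that loop.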

2014
Ibrahim Aljamaan David T. Westwick Michael Foley

In this work, a non-iterative identification approach is presented for estimating a single-input single-output Wiener model, comprising an infinite impulse response discrete transfer function followed by a static non-linearity. Global orthogonal basis functions and orthogonal Hermite polynomials are used as expansion bases for the linear subsystem and the non-linearity, respectively. A multi-inde...

Journal: Concurrency - Practice and Experience 1999
Erricos John Kontoghiorghes

Efficient algorithms for estimating the coefficient parameters of the ordinary linear model on a massively parallel SIMD computer are presented. The numerical stability of the algorithms is ensured by using orthogonal transformations in the form of Householder reflections and Givens plane rotations to compute the complete orthogonal decomposition of the coefficient matrix. Algorithms for recons...
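
The parallel SIMD algorithms of the paper are not reproduced here, but the underlying idea of estimating the linear model through an orthogonal decomposition rather than the normal equations can be sketched serially; NumPy's QR factorisation is itself built on Householder reflections. All names in the sketch are assumptions for illustration.

```python
import numpy as np

def ols_via_qr(X, y):
    """Estimate beta in y = X beta + e via an orthogonal (QR)
    decomposition of the coefficient matrix, which is numerically
    more stable than forming X^T X."""
    Q, R = np.linalg.qr(X)               # X = Q R, with Q^T Q = I
    return np.linalg.solve(R, Q.T @ y)   # back-substitute R beta = Q^T y

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta                             # noiseless, so recovery is exact
beta_hat = ols_via_qr(X, y)
```

Avoiding the cross-product matrix X^T X is the point: its condition number is the square of that of X, so the orthogonal route keeps the estimation stable for ill-conditioned coefficient matrices.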

Journal: International journal of neural systems 2004
Xia Hong Sheng Chen Paul M. Sharkey

This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out test score, also known as the PRESS (Predicted REsidual Sums of Squares) statistic, and regularised orthogonal least squares. The proposed algorithm aims to achieve maximised model robustness via two effective and complementary approaches: parameter regularisation via ridge regression and model op...
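
The paper combines the PRESS statistic with regularised orthogonal least squares model selection; the sketch below shows only the plain-OLS piece of that idea, namely that the leave-one-out residuals, and hence PRESS, have a closed form via the hat matrix, so no model needs to be refitted N times. Names are illustrative assumptions.

```python
import numpy as np

def press_statistic(X, y):
    """PRESS (Predicted REsidual Sum of Squares) for a linear least
    squares fit. The leave-one-out residual has the closed form
    e_i / (1 - h_ii), where h_ii are hat-matrix diagonals."""
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix H = X (X^T X)^-1 X^T
    e = y - H @ y                           # ordinary residuals
    return np.sum((e / (1.0 - np.diag(H)))**2)

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 2))
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=30)

# Brute-force leave-one-out for comparison: refit 30 times
loo = 0.0
for i in range(30):
    mask = np.arange(30) != i
    b = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    loo += (y[i] - X[i] @ b)**2
# loo agrees with press_statistic(X, y)
```

Because the closed form only needs one fit, PRESS is cheap enough to serve as a model-selection score inside a forward-selection loop, which is what makes it attractive in the regularised OLS setting the abstract describes.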

Journal: Pattern Recognition Letters 2012
Feiping Nie Shiming Xiang Yun Liu Chenping Hou Changshui Zhang

In this paper, a new discriminant analysis for feature extraction is derived from the perspective of least squares regression. To obtain strong discriminative power between classes, all the data points in each class are expected to be regressed to a single vector, and the basic task is to find a transformation matrix such that the squared regression error is minimized. To this end, two least squ...
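
A minimal sketch of the regression view described here (not the paper's exact algorithm, whose target vectors and solvers differ): regress every point of a class onto a single target vector, using one-hot indicators as an assumed stand-in for the class targets, and solve for the transformation W that minimises the squared regression error.

```python
import numpy as np

# Two well-separated Gaussian clusters, one per class
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 0.1, size=(20, 2)),   # class 0 cluster
               rng.normal(3.0, 0.1, size=(20, 2))])  # class 1 cluster
labels = np.array([0] * 20 + [1] * 20)

Xb = np.hstack([X, np.ones((40, 1))])        # append a bias column
Y = np.eye(2)[labels]                        # one target vector per class
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)   # minimises ||Xb W - Y||_F^2

# Each point is mapped near its class's target vector; classify by
# picking the target with the largest response
pred = (Xb @ W).argmax(axis=1)
```

With separable clusters the fitted responses sit close to the indicator targets, so the regression error directly measures how tightly each class collapses onto its vector, which is the discriminative criterion the abstract describes.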

2016
Brian J. McCartin

As a direct consequence of the Galton-Pearson-McCartin Theorem [10, Theorem 2], the concentration ellipse provides a unifying thread to the Euclidean construction of various lines of regression. These include lines of coordinate regression [7], orthogonal regression [13], λ-regression [8] and (λ, μ)-regression [9] whose geometric constructions are afforded a unified treatment in the present pap...
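
The paper's Euclidean constructions are geometric, but the connection it exploits can be sketched numerically: the orthogonal (total least squares) regression line passes through the centroid along the major axis of the data's concentration ellipse, i.e. the leading eigenvector of the covariance matrix. This is a sketch of that standard fact, not the paper's construction.

```python
import numpy as np

def orthogonal_regression_line(x, y):
    """Orthogonal (total least squares) regression line: through the
    centroid, in the direction of the leading eigenvector of the
    2x2 covariance matrix (the concentration ellipse's major axis).
    Assumes the major axis is not vertical."""
    mx, my = x.mean(), y.mean()
    C = np.cov(x, y)
    vals, vecs = np.linalg.eigh(C)   # eigenvalues in ascending order
    d = vecs[:, -1]                  # major-axis direction
    slope = d[1] / d[0]
    return slope, my - slope * mx    # slope and intercept

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 2.0, 3.0])   # points exactly on y = x
slope, intercept = orthogonal_regression_line(x, y)
```

Unlike coordinate regression, which minimises vertical distances, this line minimises perpendicular distances, which is why it falls out of the ellipse's principal axis rather than its conjugate diameters.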

2007
Joel L. Horowitz

In functional linear regression, the slope “parameter” is a function. Therefore, in a nonparametric context, it is determined by an infinite number of unknowns. Its estimation involves solving an ill-posed problem and has points of contact with a range of methodologies, including statistical smoothing and deconvolution. The standard approach to estimating the slope function is based explicitly o...

Chart: number of search results per year
