Search results for: orthogonal regression

Number of results: 362,809

2011
Qibin Zhao, Cesar F. Caiafa, Danilo P. Mandic, Liqing Zhang, Tonio Ball, Andreas Schulze-Bonhage, Andrzej Cichocki

A multilinear subspace regression model based on so-called latent variable decomposition is introduced. Unlike standard regression methods, which typically employ matrix (2D) data representations followed by vector subspace transformations, the proposed approach uses tensor subspace transformations to model common latent variables across both the independent and dependent data. The proposed appr...

2015
Martin Bilodeau, Pierre Lafaye de Micheaux, Smail Mahdi

The R package groc for generalized regression on orthogonal components contains functions for the prediction of q responses using a set of p predictors. The primary building block is the grid algorithm used to search for components (projections of the data) which are most dependent on the response. The package offers flexibility in the choice of the dependence measure which can be user-defined....

Journal: Neurocomputing, 2008
Sheng Chen, Xia Hong, Christopher J. Harris

Using the classical Parzen window (PW) estimate as the desired response, kernel density estimation is formulated as a regression problem, and the orthogonal forward regression technique is adopted to construct sparse kernel density (SKD) estimates. The proposed algorithm incrementally minimises a leave-one-out test score to select a sparse kernel model, and a local regularisation method is i...
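The classical Parzen window estimate that serves as the desired response above can be sketched as follows; this is a minimal Gaussian-kernel version, with an assumed bandwidth `h`, and it does not include the orthogonal forward regression or sparsification steps the paper describes:

```python
import numpy as np

def parzen_window(samples, grid, h):
    """Classical Parzen window (Gaussian kernel) density estimate,
    evaluated at each point of `grid` from the given samples."""
    # Pairwise scaled distances: grid points (rows) vs samples (columns)
    diffs = (grid[:, None] - samples[None, :]) / h
    # Gaussian kernel, averaged over samples and rescaled by bandwidth
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

rng = np.random.default_rng(1)
samples = rng.normal(size=500)          # standard normal data
grid = np.linspace(-3, 3, 7)
density = parzen_window(samples, grid, h=0.3)
print(density)
```

At the origin the estimate should sit near the standard normal peak (about 0.40), slightly smoothed by the bandwidth.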

2009
I. A. Al-Subaihi

This research tackles the problem of fitting two concentric spheres to data, which arises in computational metrology. Many fitting criteria could be used effectively; the most widely used in metrology, for example, is the sum of squared minimal distances. However, a simple and robust algorithm based on orthogonal distance regressio...
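The orthogonal-distance criterion this abstract refers to can be illustrated on a simpler model than concentric spheres: fitting a line by minimising perpendicular distances (total least squares). This is a minimal sketch of the criterion itself, not the paper's sphere-fitting algorithm; the synthetic slope/intercept values are assumptions for the demo:

```python
import numpy as np

def orthogonal_line_fit(x, y):
    """Fit a line by minimising orthogonal (perpendicular) distances,
    i.e. total least squares, via the SVD of the centred data."""
    xm, ym = x.mean(), y.mean()
    A = np.column_stack([x - xm, y - ym])
    # The right singular vector with the smallest singular value
    # is the normal direction of the best-fit line.
    _, _, vt = np.linalg.svd(A)
    nx, ny = vt[-1]
    slope = -nx / ny                    # line: nx*(x-xm) + ny*(y-ym) = 0
    intercept = ym - slope * xm
    return slope, intercept

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 2.0 * x + 1.0
# Noise in BOTH coordinates -- the setting where orthogonal
# regression is preferable to ordinary least squares
xn = x + rng.normal(scale=0.2, size=x.size)
yn = y + rng.normal(scale=0.2, size=x.size)
slope, intercept = orthogonal_line_fit(xn, yn)
print(round(slope, 2), round(intercept, 2))
```

Unlike ordinary least squares, which attenuates the slope when the predictor is noisy, this fit treats errors in both coordinates symmetrically.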

2018
Krzysztof Choromanski, Mark Rowland, Tamas Sarlos, Vikas Sindhwani, Richard E. Turner, Adrian Weller

We present an in-depth examination of the effectiveness of radial basis function kernel (beyond Gaussian) estimators based on orthogonal random feature maps. We show that orthogonal estimators outperform state-of-the-art mechanisms that use iid sampling under weak conditions for tails of the associated Fourier distributions. We prove that for the case of many dimensions, the superiority of the ...

2007
Sheng Chen

Combining orthogonal least squares (OLS) model selection with local regularisation or smoothing leads to efficient sparse kernel-based data modelling. By assigning each orthogonal weight in the regression model an individual regularisation parameter, the ability of OLS model selection to produce a very parsimonious model with excellent generalisation performance is greatly enhanced.
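The greedy selection step at the core of OLS model selection can be sketched as follows. This is a minimal orthogonal forward regression without the local regularisation the abstract describes; the synthetic data and term count are assumptions for the demo:

```python
import numpy as np

def orthogonal_forward_regression(X, y, n_terms):
    """Greedy orthogonal least squares: at each step, pick the candidate
    column whose component orthogonal to the already-selected basis
    explains the most residual energy (no regularisation)."""
    selected, basis = [], []
    r = y.astype(float).copy()
    for _ in range(n_terms):
        best, best_gain, best_q = None, -np.inf, None
        for j in range(X.shape[1]):
            if j in selected:
                continue
            q = X[:, j].astype(float).copy()
            for b in basis:             # Gram-Schmidt against chosen basis
                q -= (b @ q) * b
            nq = np.linalg.norm(q)
            if nq < 1e-12:              # column already spanned; skip
                continue
            q /= nq
            gain = (q @ r) ** 2         # residual energy this term explains
            if gain > best_gain:
                best, best_gain, best_q = j, gain, q
        selected.append(best)
        basis.append(best_q)
        r -= (best_q @ r) * best_q      # deflate the residual
    return selected

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
y = 3 * X[:, 2] - 2 * X[:, 7] + 0.01 * rng.normal(size=100)
sel = orthogonal_forward_regression(X, y, 2)
print(sorted(sel))
```

Because each new regressor is orthogonalised against those already chosen, the energy contributions are decoupled and the greedy choice at each step is well defined.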
