Search results for: reduced rank model

Number of results: 2,637,401

2016
Willin Álvarez Victor Griffin

This paper presents a procedure for coefficient estimation in a multivariate reduced-rank regression model in the presence of multicollinearity. The procedure, denoted PLSSVD, permits prediction of the dependent variables by taking advantage of both Partial Least Squares (PLS) and Singular Value Decomposition (SVD) methods. Global variability indices and prediction error sums are...
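The SVD ingredient of such a procedure can be illustrated with a minimal reduced-rank regression sketch in plain NumPy (this is not the authors' PLSSVD and omits the PLS/multicollinearity part): truncating the SVD of the fitted values projects the OLS coefficient matrix onto a rank-r subspace of response directions.

```python
import numpy as np

def reduced_rank_fit(X, Y, rank):
    """Rank-constrained least squares via SVD truncation.

    Only the SVD-truncation ingredient is sketched here; the paper's
    PLSSVD procedure additionally uses PLS components to cope with
    multicollinearity, which is not reproduced.
    """
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)      # full-rank OLS fit
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]       # projector onto leading response directions
    return B_ols @ P                  # rank-constrained coefficient matrix

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
B_true = rng.standard_normal((6, 1)) @ rng.standard_normal((1, 4))  # rank-1 truth
Y = X @ B_true + 0.01 * rng.standard_normal((100, 4))
B_hat = reduced_rank_fit(X, Y, rank=1)
print(np.linalg.matrix_rank(B_hat))  # → 1
```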

Journal: IEEE Trans. Signal Processing, 2001
Yingbo Hua Maziar Nikpour Petre Stoica

This paper provides a unified view of, and a further insight into, a class of optimal reduced-rank estimators and filters. An alternating power (AP) method for computing the optimal reduced-rank estimators and filters is derived and analyzed. The AP method is a generalization of the conventional power method for subspace computation, which is shown to be globally and exponentially convergent un...
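The conventional power method for subspace computation that the AP method generalizes can be sketched as orthogonal iteration (a standard textbook routine under the assumption of a symmetric matrix; this is not the AP algorithm itself):

```python
import numpy as np

def orthogonal_iteration(A, r, n_iter=200):
    """Conventional power (orthogonal) iteration for the dominant
    r-dimensional invariant subspace of a symmetric matrix A.
    The paper's alternating power (AP) method generalizes this idea
    to optimal reduced-rank estimators and filters; only the basic
    subspace iteration is sketched here."""
    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((A.shape[0], r)))
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(A @ Q)   # multiply by A, then re-orthonormalize
    return Q

A = np.diag([5.0, 3.0, 1.0, 0.5])
Q = orthogonal_iteration(A, r=2)
# Q now spans (numerically) the same subspace as the first two coordinate axes
```

Convergence is geometric at rate |λ_{r+1}/λ_r|, which is why an eigenvalue gap matters for this family of methods.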

Journal: IEEE Trans. Information Theory, 2001
Michael L. Honig Weimin Xiao

The performance of reduced-rank linear filtering is studied for the suppression of multiple-access interference. A reduced-rank filter resides in a lower dimensional space, relative to the full-rank filter, which enables faster convergence and tracking. We evaluate the large system output signal-to-interference plus noise ratio (SINR) as a function of filter rank for the multistage Wiener filte...

Journal: Journal of Multivariate Analysis, 2017
Gyuhyeong Goh Dipak K. Dey Kun Chen

Many modern statistical problems can be cast in the framework of multivariate regression, where the main task is to make statistical inference for a possibly sparse and low-rank coefficient matrix. The low-rank structure in the coefficient matrix is of intrinsic multivariate nature, which, when combined with sparsity, can further lift dimension reduction, conduct variable selection, and facilit...

Journal: CoRR, 2013
Lei Wang Rodrigo C. de Lamare

A reduced-rank framework with set-membership filtering (SMF) techniques is presented for adaptive beamforming problems encountered in radar systems. We develop and analyze stochastic gradient (SG) and recursive least squares (RLS)-type adaptive algorithms, which achieve an enhanced convergence and tracking performance with low computational cost as compared to existing techniques. Simulations s...

2014
Sébastien Duminil Hassane Sadok Daniel B. Szyld

Extrapolation methods can be a very effective technique used for accelerating the convergence of vector sequences. In this paper, these methods are used to accelerate the convergence of Schwarz iterative methods for nonlinear problems. Some convergence analysis is presented, and it is shown numerically that certain extrapolation methods can indeed be very effective in accelerating the convergen...

2002
Peter Reinhard Hansen

It is well-known that estimation by reduced rank regression is given by the solution to a generalized eigenvalue problem. This paper presents a new proof of this result and provides additional insight into the structure of the estimation problem. The proof is a direct algebraic proof that some might find more intuitive than existing proofs. JEL Classification: C3, C32
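The generalized eigenvalue formulation referred to here can be sketched in a few lines of NumPy (standard reduced-rank-regression/canonical-correlation algebra, not the paper's proof): the problem (S_yx S_xx^{-1} S_xy) v = λ S_yy v is reduced to an ordinary symmetric eigenvalue problem by whitening with a Cholesky factor of S_yy.

```python
import numpy as np

def rrr_directions(X, Y, rank):
    """Leading reduced-rank regression directions via the generalized
    eigenvalue problem  (S_yx S_xx^{-1} S_xy) v = lambda * S_yy v.
    Sketch of the standard formulation only; the eigenvalues returned
    are the squared sample canonical correlations."""
    n = X.shape[0]
    Sxx = X.T @ X / n
    Syy = Y.T @ Y / n
    Sxy = X.T @ Y / n
    M = Sxy.T @ np.linalg.solve(Sxx, Sxy)        # S_yx S_xx^{-1} S_xy
    Linv = np.linalg.inv(np.linalg.cholesky(Syy))
    vals, W = np.linalg.eigh(Linv @ M @ Linv.T)  # whitened symmetric EVP
    order = np.argsort(vals)[::-1]               # largest eigenvalues first
    return vals[order][:rank], (Linv.T @ W)[:, order][:, :rank]

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 5))
Y = np.outer(X @ rng.standard_normal(5), rng.standard_normal(3)) \
    + 0.05 * rng.standard_normal((200, 3))       # strong rank-1 signal
vals, V = rrr_directions(X, Y, rank=2)
# one squared canonical correlation near 1, the rest near 0
```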

2006
Luke Rosenberg Doug Gray

Large regions of a Synthetic Aperture Radar (SAR) image can potentially be destroyed by an airborne broadband jammer. Jammer components include both the direct-path and multipath reflections from the ground, known as hot-clutter (HC) or terrain scattered interference. Using multiple antennas on a SAR provides spatial degrees of freedom and allows for beamforming to reject the direct-path signal...

2018
Ziping Zhao Daniel P. Palomar

In this paper, the estimation problem for the sparse reduced-rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection, with applications in signal processing, econometrics, etc. The problem is formulated as minimizing the least squares loss with a sparsity-inducing penalty under an orthogonality constraint. Convex sparsity-inducing ...

2014
Mohammad Taha Bahadori Yan Liu Jinchi Lv

Reduced-rank regression, i.e., multi-task regression subject to a low-rank constraint, is an effective approach to reducing the number of observations required for estimation consistency. However, it is still possible for the estimated singular vectors to be inconsistent in high dimensions as the number of predictors goes to infinity at a faster rate than the number of available observations. Spa...
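One standard building block behind such low-rank constraints is the proximal operator of the nuclear norm, i.e. soft-thresholding of singular values; the sketch below shows only that generic operator, not this paper's estimator or its treatment of singular-vector inconsistency.

```python
import numpy as np

def sv_soft_threshold(B, tau):
    """Proximal operator of the nuclear norm: shrink every singular
    value of B by tau, zeroing those that fall below tau. A generic
    low-rank regularization step, not the paper's estimator."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

B = np.array([[3.1, 0.0],
              [0.0, 0.1]])           # nearly rank-1: singular values 3.1 and 0.1
B_low = sv_soft_threshold(B, tau=0.5)
print(np.linalg.matrix_rank(B_low))  # → 1 (the small singular value is zeroed)
```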

[Chart: number of search results per year]