On Dimension Reduction in Regressions with Multivariate Responses

Authors

  • Li-Ping Zhu
  • Li-Xing Zhu
  • Song-Qiao Wen
Abstract

This paper is concerned with dimension reduction in regressions with multivariate responses on high-dimensional predictors. A unified method that can be regarded as either an inverse regression approach or a forward regression method is proposed to recover the central dimension reduction subspace. By using Stein's Lemma, the forward regression estimates the first derivative of the conditional characteristic function of the response given the predictors; by using the Fourier method, the inverse regression estimates the subspace spanned by the conditional mean of the predictors given the responses. Both methods lead to an identical kernel matrix while preserving as much regression information as possible. An illustrative real-data example and comprehensive simulations demonstrate the application of our methods.
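The inverse-regression idea underlying the kernel matrix — eigen-decompose an estimate of Cov(E[X | Y]) to recover central-subspace directions — can be sketched in a few lines. This is a generic sliced-inverse-regression-style illustration that handles the multivariate response with a crude k-means slicing; it is not the Fourier/Stein estimator proposed in the paper, and all function names are ours:

```python
import numpy as np

def sir_directions(X, Y, n_slices=5, d=1, seed=0):
    """Illustrative inverse-regression sketch for a multivariate
    response: cluster Y into slices, then eigen-decompose the
    kernel matrix Cov(E[Z | slice])."""
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean).
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    root_inv = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ root_inv
    # Crude slicing of the multivariate response via k-means.
    rng = np.random.default_rng(seed)
    centers = Y[rng.choice(n, n_slices, replace=False)].astype(float)
    for _ in range(20):
        labels = np.argmin(((Y[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for h in range(n_slices):
            if np.any(labels == h):
                centers[h] = Y[labels == h].mean(axis=0)
    # Kernel matrix M = sum_h p_h * m_h m_h', m_h = slice mean of Z.
    M = np.zeros((p, p))
    for h in range(n_slices):
        idx = labels == h
        if idx.any():
            m_h = Z[idx].mean(axis=0)
            M += idx.mean() * np.outer(m_h, m_h)
    # Leading eigenvectors of M, mapped back to the original X scale.
    _, vecs = np.linalg.eigh(M)
    return root_inv @ vecs[:, -d:]
```

The clustering step here plays the role that the Fourier transform plays in the paper: both avoid the awkwardness of slicing a multivariate response coordinate-wise.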


Related articles

Moment Based Dimension Reduction for Multivariate Response Regression

Dimension reduction aims to reduce the complexity of a regression without requiring a pre-specified model. In the case of multivariate response regressions, covariance-based estimation methods for the k-th moment based dimension reduction subspaces circumvent slicing and nonparametric estimation so that they are readily applicable to multivariate regression settings. In this article, the covari...


A minimum discrepancy approach to multivariate dimension reduction via k-means inverse regression

We proposed a new method to estimate the intra-cluster adjusted central subspace for regressions with multivariate responses. Following Setodji and Cook (2004), we made use of the k-means algorithm to cluster the observed response vectors. Our method was designed to recover the intra-cluster information and outperformed the previous method with respect to estimation accuracy on both the central su...


Principal Component Analysis and Partial Least Squares: Two Dimension Reduction Techniques for Regression

Abstract: Dimension reduction is one of the major tasks for multivariate analysis; it is especially critical for multivariate regressions in many P&C insurance-related applications. In this paper, we'll present two methodologies, principal component analysis (PCA) and partial least squares (PLS), for dimension reducti...


On model-free conditional coordinate tests for regressions

Existing model-free tests of the conditional coordinate hypothesis in sufficient dimension reduction (Cook (1998) [3]) focused mainly on first-order estimation methods such as sliced inverse regression estimation (Li (1991) [14]). Such testing procedures based on quadratic inference functions are difficult to extend to second-order sufficient dimension reduction methods such as the...


Estimating Sufficient Reductions of the Predictors in Abundant High-dimensional Regressions, by R. Dennis Cook and Liliana Forzani

We study the asymptotic behavior of a class of methods for sufficient dimension reduction in high-dimensional regressions, as the sample size and number of predictors grow in various alignments. It is demonstrated that these methods are consistent in a variety of settings, particularly in abundant regressions where most predictors contribute some information on the response, and oracle rates are ...



Publication date: 2010