Comment: Fisher Lecture: Dimension Reduction in Regression
Abstract
I am pleased to participate in this well-deserved recognition of Dennis Cook’s remarkable career. Cook points out Fisher’s insistence that predictor variables in regression be chosen without reference to the dependent variable. Reduction by principal components clearly satisfies that dictum. One of my primary objections to partial least squares regression, when I first encountered it as an alternative to principal components, was that the predictor variables were being chosen with reference to the dependent variable. (I now have other objections to partial least squares.) Yet variable selection in regression is well accepted, and it clearly chooses variables based on their relationship to the dependent variable. Perhaps variable selection is better thought of as a form of shrinkage estimation rather than as a process for choosing predictor variables. Cook also reiterates something that I think is difficult to overemphasize: Fisher’s point that “More or less elaborate forms [for models] will be suitable according to the volume of the data.” We see this now on a regular basis as modern technology provides larger data sets to which elaborate models are regularly fitted. With regard to Cook’s work, it seems to me that the key issue in the development of Cook’s models (2), (5), (10) and (13) is whether they are broadly reasonable. The question did not seem to be extensively addressed, but Cook shows that much can be gained if we can reasonably use them. When they are appropriate, the results in the corresponding propositions are rather stunning. It has long been
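The distinction drawn above can be made concrete with a small numerical sketch (hypothetical data, NumPy only): the leading principal component direction is computed from the predictors alone, while the first partial least squares weight vector is proportional to the predictors' covariance with the response, so it is chosen precisely by reference to the dependent variable.

```python
import numpy as np

# Hypothetical data: 100 observations of 5 predictors; the response
# depends (mostly) on the third predictor column.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.standard_normal((n, p))
y = X[:, 2] + 0.1 * rng.standard_normal(n)

# Center predictors and response
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# Principal component direction: leading eigenvector of the predictor
# covariance matrix -- the response y never enters this computation.
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
pca_dir = eigvecs[:, -1]

# First PLS weight vector: proportional to X'y, i.e. chosen to maximize
# covariance of the linear combination Xw with the response.
pls_dir = Xc.T @ yc
pls_dir /= np.linalg.norm(pls_dir)

# Because y tracks the third predictor, the PLS direction concentrates
# its weight on that column; the PCA direction is indifferent to y.
```

Changing `y` alters `pls_dir` but leaves `pca_dir` untouched, which is exactly the sense in which principal components satisfy Fisher's dictum and partial least squares does not.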
Publication date: 2008