Robust Group Identification and Variable Selection in Sliced Inverse Regression Using Tukey's Biweight Criterion and Ball Covariance
Authors
Abstract
SSIR-PACS is a group identification and model-free variable selection method under the sufficient dimension reduction (SDR) setting. It combines the Pairwise Absolute Clustering and Sparsity (PACS) penalty with sliced inverse regression (SIR) to produce solutions that are sparse and capable of group identification. However, SSIR-PACS depends on classical estimates of location and dispersion, a squared loss function, and weights that are not robust to outliers. In this paper, a robust version, RSSIR-PACS, is proposed. We replace the squared loss with Tukey's biweight criterion. The non-robust weights, which depend on Pearson's correlations, are substituted with weights based on the recently developed ball correlation. Moreover, the mean and the covariance matrix are replaced with the median and a robust covariance estimate, respectively. RSSIR-PACS is robust to outliers in both the response and the covariates. According to the simulation results, RSSIR-PACS produces very good results, and when outliers are present its efficacy is considerably better than that of its competitors. In addition, a robust criterion for estimating the structural dimension d is proposed, which makes RSSIR-PACS practically feasible. Finally, we employ real data to demonstrate the utility of RSSIR-PACS.
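To make the loss replacement concrete, the following is a minimal sketch of Tukey's biweight (bisquare) loss that RSSIR-PACS uses in place of the squared loss. The tuning constant c = 4.685 (the conventional 95%-efficiency choice) and the NumPy implementation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def tukey_biweight_loss(r, c=4.685):
    """Tukey's biweight (bisquare) loss: approximately quadratic near zero,
    but constant for |r| > c, so outlying residuals contribute a bounded amount."""
    r = np.asarray(r, dtype=float)
    inside = np.abs(r) <= c
    loss = np.full_like(r, c**2 / 6.0)  # capped contribution for outliers
    loss[inside] = (c**2 / 6.0) * (1.0 - (1.0 - (r[inside] / c) ** 2) ** 3)
    return loss

# Example: a residual of 10 is capped instead of dominating the objective
print(tukey_biweight_loss(np.array([0.0, 1.0, 10.0])))
```

The bounded tail is what gives the robustness: unlike the squared loss, a single gross outlier cannot drive the fitted directions arbitrarily far.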
Similar articles
Sliced Inverse Regression with Variable Selection and Interaction Detection
Variable selection methods play important roles in modeling high dimensional data and are keys to data-driven scientific discoveries. In this paper, we consider the problem of variable selection with interaction detection under the sliced inverse index modeling framework, in which the response is influenced by predictors through an unknown function of both linear combinations of predictors and ...
Variable Selection for General Index Models via Sliced Inverse Regression
Variable selection, also known as feature selection in machine learning, plays an important role in modeling high dimensional data and is key to data-driven scientific discoveries. We consider here the problem of detecting influential variables under the general index model, in which the response is dependent on predictors through an unknown function of one or more linear combinations of them. ...
Localized Sliced Inverse Regression
We developed localized sliced inverse regression for supervised dimension reduction. It has the advantages of preventing degeneracy, increasing estimation accuracy, and automatic subclass discovery in classification problems. A semisupervised version is proposed for the use of unlabeled data. The utility is illustrated on simulated as well as real data sets.
Student Sliced Inverse Regression
Sliced Inverse Regression (SIR) has been extensively used to reduce the dimension of the predictor space before performing regression. SIR is originally a model-free method, but it has been shown to actually correspond to the maximum likelihood of an inverse regression model with Gaussian errors. This intrinsic Gaussianity of standard SIR may explain its high sensitivity to outliers as observed ...
Asymptotics of Sliced Inverse Regression
Sliced Inverse Regression is a method for reducing the dimension of the explanatory variables x in non-parametric regression problems. Li (1991) discussed a version of this method which begins with a partition of the range of y into slices so that the conditional covariance matrix of x given y can be estimated by the sample covariance matrix within each slice. After that the mean of the conditi...
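The slicing procedure described above translates directly into a few lines of code. Below is a minimal, illustrative sketch of the basic SIR estimator; the function name sir_directions, the number of slices, and the number of returned directions are assumptions for demonstration, not details from the cited work.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Basic sliced inverse regression (Li, 1991), sketched:
    standardize X, slice the range of y, average the standardized
    predictors within each slice, and take the leading eigenvectors
    of the weighted between-slice covariance."""
    n, p = X.shape
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    # Whitening transform Z = (X - mu) @ cov^{-1/2}; assumes cov is nonsingular
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Contiguous slices along the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)  # weighted between-slice covariance
    # Leading eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]
```

Within-slice means of the standardized predictors estimate the inverse regression curve E[Z | y]; its covariance across slices concentrates on the central subspace, which is why its top eigenvectors recover the reduction directions.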
Journal
Journal title: Gazi University Journal of Science
Year: 2022
ISSN: 2147-1762
DOI: https://doi.org/10.35378/gujs.735503