Regularizing Lasso: a Consistent Variable Selection Method

Authors

  • Quefeng Li
  • Jun Shao
Abstract

Table 1 provides the average computational time (in minutes) for the eight methods under the simulation settings. SIS clearly requires the least computational effort, whereas RLASSO and Scout require considerably more. Nevertheless, all methods except RLASSO(CLIME) can be computed in a reasonable amount of time for p = 5000 and n = 100. RLASSO(CLIME) takes much longer because it requires inverting a 5000 × 5000 matrix; even so, 790.8 minutes of computation may still be acceptable. In an unreported simulation with p = 2000 and all other settings unchanged, the average computational time for RLASSO(CLIME) is 46.7 minutes.
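
Inverting a p × p matrix costs on the order of p^3 operations, so going from p = 2000 to p = 5000 should multiply that cost by roughly (5000/2000)^3 ≈ 15.6, which is close to the observed ratio 790.8/46.7 ≈ 16.9. The following minimal NumPy sketch illustrates this cubic growth; it is not the authors' implementation, and the well-conditioned random matrix used here is only a stand-in for illustration.

    # Minimal sketch (not the authors' code): dense inversion of a p x p matrix
    # costs O(p^3), which is why the cost rises steeply from p = 2000 to p = 5000.
    import time
    import numpy as np

    def time_inversion(p, seed=0):
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((p, p))
        A = A @ A.T + p * np.eye(p)          # well-conditioned symmetric positive definite matrix
        start = time.perf_counter()
        np.linalg.inv(A)
        return time.perf_counter() - start

    for p in (2000, 5000):
        # the 5000-dimensional inversion should take roughly (5000/2000)^3 ≈ 15x longer
        print(f"p = {p}: {time_inversion(p):.1f} s")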

Similar articles

Variable selection for multiply-imputed data with application to dioxin exposure study.

Multiple imputation (MI) is a commonly used technique for handling missing data in large-scale medical and public health studies. However, variable selection on multiply-imputed data remains an important and longstanding statistical problem. If a variable selection method is applied to each imputed dataset separately, it may select different variables for different imputed datasets, which makes...

The Adaptive Lasso and Its Oracle Properties

The lasso is a popular technique for simultaneous estimation and variable selection. Lasso variable selection has been shown to be consistent under certain conditions. In this work we derive a necessary condition for the lasso variable selection to be consistent. Consequently, there exist certain scenarios where the lasso is inconsistent for variable selection. We then propose a new version of ...
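
The "new version" referred to here is the adaptive lasso, which reweights the lasso's L1 penalty by the inverse of a pilot estimate so that variables with larger pilot coefficients are penalized less. Below is a minimal sketch of that idea, not code from the cited paper; the ridge pilot, gamma = 1, and the penalty level lam are assumptions chosen only for illustration.

    # Minimal sketch of the adaptive-lasso idea (weights from a pilot estimate).
    # Not code from the cited paper; pilot choice and tuning values are illustrative.
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    def adaptive_lasso(X, y, lam=0.1, gamma=1.0, eps=1e-6):
        pilot = Ridge(alpha=1.0).fit(X, y).coef_      # pilot estimate of beta (ridge used for stability)
        w = 1.0 / (np.abs(pilot) ** gamma + eps)      # adaptive weights: small pilot coefficient -> heavy penalty
        X_scaled = X / w                              # column rescaling turns the weighted L1 penalty into a plain lasso
        fit = Lasso(alpha=lam).fit(X_scaled, y)
        return fit.coef_ / w                          # map coefficients back to the original scale

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    beta = np.zeros(20)
    beta[:3] = [3.0, -2.0, 1.5]                       # only the first three variables are active
    y = X @ beta + rng.standard_normal(100)
    print(np.nonzero(adaptive_lasso(X, y))[0])        # ideally recovers the active set {0, 1, 2}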

“Preconditioning” for Feature Selection and Regression in High-Dimensional Problems, by Debashis Paul et al.

We consider regression problems where the number of predictors greatly exceeds the number of observations. We propose a method for variable selection that first estimates the regression function, yielding a “preconditioned” response variable. The primary method used for this initial regression is supervised principal components. Then we apply a standard procedure such as forward stepwise select...

Model selection via standard error adjusted adaptive lasso

The adaptive lasso is a model selection method shown to be both consistent in variable selection and asymptotically normal in coefficient estimation. The actual variable selection performance of the adaptive lasso depends on the weight used. It turns out that the weight assignment using the OLS estimate (OLS-adaptive lasso) can result in very poor performance when collinearity of the model matr...

Consistent group selection in high-dimensional linear regression.

In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selec...

Journal title:

Volume  Issue

Pages  -

Publication date 2014