Search results for: partial linear model preliminary test lasso

Number of results: 3,367,252

2011
Sebastian Petry, Claudia Flexeder, Gerhard Tutz

In the last decade several estimators have been proposed that enforce the grouping property. A regularized estimate exhibits the grouping property if it selects groups of highly correlated predictors rather than selecting a single representative. The pairwise fused lasso is related to fusion methods but does not assume that predictors have to be ordered. By penalizing parameters and differences betwe...
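
As a hedged illustration of the penalty structure this snippet describes, the sketch below fits a pairwise fused lasso by generic convex optimization with CVXPY; the function name, the tuning parameters lam1/lam2, and the explicit difference matrix are assumptions of this illustration, not the authors' implementation.

```python
import numpy as np
import cvxpy as cp

def pairwise_fused_lasso(X, y, lam1, lam2):
    """Minimal sketch: pairwise fused lasso via CVXPY.

    The penalty shrinks the coefficients (L1) and all pairwise
    differences |beta_j - beta_k|, encouraging highly correlated
    predictors to share a common coefficient value.
    """
    n, p = X.shape

    # D maps beta to the vector of all pairwise differences beta_j - beta_k (j < k)
    rows = []
    for j in range(p):
        for k in range(j + 1, p):
            d = np.zeros(p)
            d[j], d[k] = 1.0, -1.0
            rows.append(d)
    D = np.vstack(rows)

    beta = cp.Variable(p)
    loss = cp.sum_squares(y - X @ beta)
    penalty = lam1 * cp.norm1(beta) + lam2 * cp.norm1(D @ beta)
    cp.Problem(cp.Minimize(loss + penalty)).solve()
    return beta.value
```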

Journal: Journal of Machine Learning Research (JMLR), 2016
Matey Neykov, Jun S. Liu, Tianxi Cai

It is known that for a certain class of single index models (SIMs) [Formula: see text], support recovery is impossible when X ∼ 𝒩(0, 𝕀_{p×p}) and a model complexity adjusted sample size is below a critical threshold. Recently, optimal algorithms based on Sliced Inverse Regression (SIR) were suggested. These algorithms work provably under the assumption that the design X comes from an i.i.d. Gaus...

2006
Jian Huang, Shuangge Ma, Cun-Hui Zhang

We study the asymptotic properties of the adaptive Lasso estimators in sparse, high-dimensional, linear regression models when the number of covariates may increase with the sample size. We consider variable selection using the adaptive Lasso, where the L1 norms in the penalty are re-weighted by data-dependent weights. We show that, if a reasonable initial estimator is available, under appropri...
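
The re-weighted L1 penalty described here can be illustrated with a short sketch; the ridge initial estimator, the weight exponent gamma, and the column-rescaling trick used to reduce the problem to an ordinary Lasso are assumptions of this illustration rather than the paper's exact procedure.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

def adaptive_lasso(X, y, lam=0.1, gamma=1.0):
    """Minimal sketch of the adaptive Lasso.

    The L1 penalty is re-weighted by data-dependent weights
    w_j = 1 / |beta_init_j|^gamma built from an initial estimator
    (ridge here); the weighted problem is solved by rescaling the
    columns of X and running an ordinary Lasso.
    """
    beta_init = Ridge(alpha=1.0).fit(X, y).coef_          # initial estimator
    w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)          # data-dependent weights

    X_scaled = X / w                                       # column j divided by w_j
    lasso = Lasso(alpha=lam).fit(X_scaled, y)
    return lasso.coef_ / w                                 # map back to the original scale
```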

Journal: Proceedings of the National Academy of Sciences of the United States of America, 1979
S. Bratosin, O. Laub, J. Tal, Y. Aloni

During an electron-microscopic survey with the aim of identifying the parvovirus MVM transcription template, we observed previously unidentified structures of MVM DNA in lysates of virus-infected cells. These included double-stranded "lasso"-like structures and relaxed circles. Both structures were of unit length MVM DNA, indicating that they were not intermediates formed during replication; th...

2013
Fabian L. Wauthier, Nebojsa Jojic, Michael I. Jordan

The Lasso is a cornerstone of modern multivariate data analysis, yet its performance suffers in the common situation in which covariates are correlated. This limitation has led to a growing number of Preconditioned Lasso algorithms that pre-multiply X and y by matrices P_X, P_y prior to running the standard Lasso. A direct comparison of these and similar Lasso-style algorithms to the original La...
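
A minimal sketch of the preconditioning step described in this snippet, assuming the matrices P_X and P_y are supplied by the caller (the various Preconditioned Lasso algorithms construct them differently, e.g. from a decomposition of X); the function name and parameter lam are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

def preconditioned_lasso(X, y, P_X, P_y, lam=0.1):
    """Minimal sketch: pre-multiply the design and response by
    preconditioning matrices, then run the standard Lasso."""
    X_tilde = P_X @ X          # pre-multiplied design
    y_tilde = P_y @ y          # pre-multiplied response
    return Lasso(alpha=lam).fit(X_tilde, y_tilde).coef_
```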

2011
Peter Radchenko, Gareth M. James

Recently, considerable interest has focused on variable selection methods in regression situations where the number of predictors, p, is large relative to the number of observations, n. Two commonly applied variable selection approaches are the Lasso, which computes highly shrunk regression coefficients, and Forward Selection, which uses no shrinkage. We propose a new approach, “Forward-Lasso A...

Journal: :Statistics and Computing 2010
Peter Bühlmann, Torsten Hothorn

We propose Twin Boosting, which has much better feature selection behavior than boosting, particularly with respect to reducing the number of false positives (falsely selected features). In addition, for cases with a few important effective features and many noise features, Twin Boosting also substantially improves the predictive accuracy of boosting. Twin Boosting is as general and generic as boosting. ...

2011
Alexandre Belloni, Victor Chernozhukov

In this paper we study post-penalized estimators which apply ordinary, unpenalized linear regression to the model selected by first-step penalized estimators, typically LASSO.
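
A minimal sketch of such a post-penalized (post-Lasso) estimator, assuming scikit-learn's Lasso as the first-step selector; the function name and the regularization parameter lam are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def post_lasso_ols(X, y, lam=0.1):
    """Minimal sketch: run the Lasso for model selection, then refit
    the selected variables by ordinary, unpenalized least squares."""
    lasso = Lasso(alpha=lam).fit(X, y)
    support = np.flatnonzero(lasso.coef_)       # variables selected by the Lasso
    if support.size == 0:
        return np.zeros(X.shape[1])

    ols = LinearRegression().fit(X[:, support], y)
    beta = np.zeros(X.shape[1])
    beta[support] = ols.coef_                   # unpenalized refit on the selected support
    return beta
```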

[Chart: number of search results per year]