Search results for: partial linear model preliminary test lasso

Number of results: 3,367,252

1997
Robert Tibshirani

We propose a new method for variable selection and estimation in Cox's proportional hazards model. Our proposal minimizes the log partial likelihood subject to the sum of the absolute values of the parameters being bounded by a constant. Because of the nature of this constraint, it tends to produce some coefficients that are exactly zero and hence gives interpretable models. The method is a variat...
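The defining behavior described above, an L1 bound driving some coefficients exactly to zero, can be illustrated with a minimal sketch using scikit-learn's `Lasso` on an ordinary linear model (not Cox's partial likelihood); the data, `alpha` value, and sparsity pattern here are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulate a sparse linear model: only the first 3 of 10 coefficients are nonzero.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.normal(size=n)

# The L1 penalty (the Lagrangian form of the bounded-sum constraint)
# shrinks the irrelevant coefficients exactly to zero.
fit = Lasso(alpha=0.1).fit(X, y)
print(fit.coef_)  # most entries beyond the first three are exactly 0.0
```

The constrained form in the abstract and the penalized form used here are equivalent up to a correspondence between the bound and `alpha`.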

2014
Jasdeep Pannu

We consider the problem of selecting functional variables using L1 regularization in a functional linear regression model with a scalar response and functional predictors in the presence of outliers. Since the LASSO is a special case of penalized least squares regression with an L1-penalty function, it suffers from heavy-tailed errors and/or outliers in the data. Recently, the LAD regressio...

Journal: :Chemical science 2017
Fumito Saito Jeffrey W Bode

The chemical synthesis of peptide-based [1]rotaxanes (lasso peptides) was achieved by [2]rotaxane formation followed by two chemoselective ligation reactions. Our approach enabled incorporation of various peptide sequences into a common rotaxane structure. The synthetic lasso peptides were characterized by NMR, chromatography, and partial degradation by proteases. A linear peptide epitope graft...

2009
Fei Ye Cun-Hui Zhang

We consider the estimation of regression coefficients in a high-dimensional linear model. A lower bound on the minimax ℓq risk is provided for regression coefficients in ℓr balls, along with a minimax lower bound for the tail of the ℓq loss. Under certain conditions on the design matrix and penalty level, we prove that these minimax convergence rates are attained by both the Lasso and Dantzig e...

Journal: :CoRR 2015
Jason D. Lee Yuekai Sun Qiang Liu Jonathan E. Taylor

We devise a one-shot approach to distributed sparse regression in the high-dimensional setting. The key idea is to average "debiased" or "desparsified" lasso estimators. We show the approach converges at the same rate as the lasso as long as the dataset is not split across too many machines. We also extend the approach to generalized linear models.

2015
Yen-Huan Li Ya-Ping Hsieh Nissim Zerbib Volkan Cevher

We study the estimation error of constrained M-estimators, and derive explicit upper bounds on the expected estimation error determined by the Gaussian width of the constraint set. Both the case where the true parameter lies on the boundary of the constraint set (matched constraint) and the case where it lies strictly inside the constraint set (mismatched constraint) are considered. For bo...

Journal: :Scandinavian Journal of Statistics 2023

Standard likelihood penalties for learning Gaussian graphical models are based on regularizing the off-diagonal entries of the precision matrix. Such methods, and their Bayesian counterparts, are not invariant to scalar multiplication of the variables, unless one standardizes the observed data to unit sample variances. We show that such standardization can have a strong effect on inference and introduce a new family of partia...

2012
Marius Kwemou

We consider the problem of estimating a function f0 in a logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for the Lasso and Group Lasso under...

2010
Pablo Sprechmann Ignacio Ramirez Guillermo Sapiro Yonina Eldar

Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is done by solving an ℓ1-regularized linear regression problem, usually called Lasso. In this work we first combine the sparsity-inducing property of the Lasso model, at the individual feature level, with the block-sparsity property of the group Lasso model, where sparse groups of ...
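The combined penalty sketched above (ℓ1 within groups plus an ℓ2 norm per group) has a well-known closed-form proximal operator: soft-threshold elementwise, then shrink the whole group. A minimal NumPy sketch, where the function names and penalty weights are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sparse_group(v, lam1, lam2):
    """Prox of lam1*||.||_1 + lam2*||.||_2 for one group of coefficients.

    First apply elementwise soft-thresholding (individual sparsity),
    then shrink the group norm (block sparsity); groups whose thresholded
    norm falls below lam2 are zeroed out entirely.
    """
    u = soft_threshold(v, lam1)
    norm = np.linalg.norm(u)
    if norm == 0.0:
        return u
    return u * max(0.0, 1.0 - lam2 / norm)

# A group with one strong entry survives, shrunk at both levels:
print(prox_sparse_group(np.array([3.0, -0.5, 0.0]), 1.0, 1.0))
```

Plugging this operator into proximal gradient descent on the squared loss gives one way to fit such a combined model.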
