Search results for: pabon lasso analysis

Number of results: 2,827,094

Journal: Computers in Biology and Medicine, 2011
Songfeng Zheng, Weixiang Liu

Selecting a subset of genes with strong discriminative power is a very important step in classification problems based on gene expression data. Lasso and Dantzig selector are known to have automatic variable selection ability in linear regression analysis. This paper applies Lasso and Dantzig selector to select the most informative genes for representing the probability of an example being posi...
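As a rough illustration of the kind of L1-based gene selection this abstract describes, the sketch below fits a cross-validated L1-penalised logistic regression on synthetic expression data with scikit-learn and reads off the genes with nonzero coefficients. The data, dimensions, and the use of `LogisticRegressionCV` are illustrative assumptions, and the Dantzig selector variant is not shown.

```python
# Minimal sketch of L1-based gene selection on synthetic data (not the authors' pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(0)
n_samples, n_genes = 120, 500                      # expression matrix: samples x genes
X = rng.standard_normal((n_samples, n_genes))
informative = [3, 17, 42]                          # a few truly discriminative genes
logit = X[:, informative] @ np.array([2.0, -1.5, 1.0])
y = (logit + 0.5 * rng.standard_normal(n_samples) > 0).astype(int)

# Cross-validation chooses the strength of the L1 penalty; the penalty zeroes
# out uninformative genes, giving automatic variable selection.
clf = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5).fit(X, y)
selected = np.flatnonzero(clf.coef_.ravel())
print("selected genes:", selected)
```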

2011
Paolo Giordani

A new method for linear regression analysis of interval-valued data is proposed. In particular, the linear relationship between an interval-valued response variable and a set of interval-valued explanatory variables is investigated by considering two regression models, one for the midpoints (the locations of the intervals) of the response and explanatory variables and the other one for the radi...
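A toy sketch of the midpoint/radius decomposition mentioned in the abstract, assuming synthetic interval data and plain least squares for both models; the paper's actual method adds further structure (e.g. constraints linking the two regressions) that is not reproduced here.

```python
# Toy interval-valued regression: one OLS fit for midpoints, another for radii.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
# Each interval is stored as (lower, upper); convert to midpoint and radius.
x_lo = rng.uniform(0, 10, n)
x_hi = x_lo + rng.uniform(0.5, 2.0, n)
x_mid, x_rad = (x_lo + x_hi) / 2, (x_hi - x_lo) / 2

y_mid = 3.0 * x_mid + 1.0 + rng.normal(0, 0.5, n)
y_rad = 0.8 * x_rad + 0.2 + np.abs(rng.normal(0, 0.1, n))

mid_model = LinearRegression().fit(x_mid.reshape(-1, 1), y_mid)
rad_model = LinearRegression().fit(x_rad.reshape(-1, 1), y_rad)

# Predicted response interval for a new interval-valued input [2, 3]:
m = mid_model.predict([[2.5]])[0]
r = max(rad_model.predict([[0.5]])[0], 0.0)   # radii must stay non-negative
print("predicted interval:", (m - r, m + r))
```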

2005
Shuangge Ma, Jian Huang

Summary. The additive risk model is a useful alternative to the proportional hazards model. It postulates that the hazard function is the sum of the baseline hazard function and the regression function of covariates. In this article, we investigate estimation in the additive risk model with right censored survival data and high dimensional covariates. A LASSO (least absolute shrinkage and sel...

Journal: CoRR, 2016
Eugène Ndiaye, Olivier Fercoq, Alexandre Gramfort, Vincent Leclère, Joseph Salmon

In high-dimensional settings, sparse structures are crucial for efficiency, both in terms of memory, computation, and performance. It is customary to use an ℓ1 penalty to enforce sparsity in such scenarios. Sparsity-enforcing methods, the Lasso being a canonical example, are popular candidates for addressing high dimensionality. For efficiency, they rely on tuning a parameter trading data fitting versus...
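The tuning problem the abstract refers to can be illustrated with scikit-learn's regularization-path utilities: compute Lasso solutions over a grid of penalties and pick one by cross-validation. This is only a sketch of the surrounding workflow; the paper's safe screening rules themselves are not implemented here.

```python
# Lasso regularization path plus cross-validated choice of the penalty.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path, LassoCV

X, y = make_regression(n_samples=100, n_features=300, n_informative=10,
                       noise=1.0, random_state=0)

# Path of solutions: sparsity increases as the penalty alpha grows.
alphas, coefs, _ = lasso_path(X, y, n_alphas=50)
print("nonzeros per alpha:", (np.abs(coefs) > 0).sum(axis=0))

# Cross-validation selects the trade-off between data fit and sparsity.
cv_model = LassoCV(cv=5, n_alphas=50).fit(X, y)
print("chosen alpha:", cv_model.alpha_)
```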

2009
Junzhou Huang, Tong Zhang

This paper develops a theory for group Lasso using a concept called strong group sparsity. Our result shows that group Lasso is superior to standard Lasso for strongly group-sparse signals. This provides a convincing theoretical justification for using group sparse regularization when the underlying group structure is consistent with the data. Moreover, the theory predicts some limitations of t...
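For readers who want to see what a group Lasso fit looks like in practice, here is a minimal proximal-gradient (ISTA) sketch on synthetic data with non-overlapping groups. It is written for clarity, not speed, and is not taken from the paper, which is a theoretical analysis rather than a solver.

```python
# Proximal gradient (ISTA) for the group Lasso:
#   minimize (1/2n)||y - Xb||^2 + lam * sum_g sqrt(|g|) * ||b_g||_2
import numpy as np

def group_lasso_ista(X, y, groups, lam, n_iter=500):
    """groups: list of index arrays, one per non-overlapping group."""
    n, p = X.shape
    beta = np.zeros(p)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)    # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = -X.T @ (y - X @ beta) / n
        z = beta - step * grad
        for g in groups:                             # block soft-thresholding
            w = lam * step * np.sqrt(len(g))
            norm_g = np.linalg.norm(z[g])
            z[g] = 0.0 if norm_g <= w else (1 - w / norm_g) * z[g]
        beta = z
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 40))
groups = [np.arange(i, i + 4) for i in range(0, 40, 4)]    # 10 groups of size 4
beta_true = np.zeros(40)
beta_true[groups[0]] = 2.0
beta_true[groups[3]] = -1.0
y = X @ beta_true + 0.1 * rng.standard_normal(100)

beta_hat = group_lasso_ista(X, y, groups, lam=0.1)
print("active groups:",
      [i for i, g in enumerate(groups) if np.linalg.norm(beta_hat[g]) > 1e-6])
```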

2016
Brian R. Gaines, Hua Zhou

We compare alternative computing strategies for solving the constrained lasso problem. As its name suggests, the constrained lasso extends the widely-used lasso to handle linear constraints, which allow the user to incorporate prior information into the model. In addition to quadratic programming, we employ the alternating direction method of multipliers (ADMM) and also derive an efficient solu...
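A hedged sketch of the constrained lasso objective, solved with the generic convex solver cvxpy rather than the paper's ADMM or path algorithms; the sum-to-zero constraint, penalty value, and data are illustrative assumptions.

```python
# Constrained lasso: lasso objective plus a linear equality constraint on beta.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, p = 80, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, -1.0]                 # true coefficients sum to zero
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta = cp.Variable(p)
lam = 0.5
objective = cp.Minimize(0.5 * cp.sum_squares(y - X @ beta) + lam * cp.norm1(beta))
constraints = [cp.sum(beta) == 0]                 # prior information as a constraint
cp.Problem(objective, constraints).solve()
print("estimated coefficients:", np.round(beta.value, 2))
```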

2010
Jian Kang, Jian Guo

In this paper, we propose a self-adaptive lasso method for variable selection in regression problems. Unlike the popular lasso method, the proposed method introduces a specific tuning parameter for each regression coefficient. We model the self-adaptive lasso in a Bayesian framework and develop an efficient Gibbs sampling algorithm to automatically select these tuning parameters and estimate t...
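The sketch below is not the authors' self-adaptive lasso; it is a standard Bayesian lasso Gibbs sampler (in the style of Park and Casella, 2008) with a fixed global penalty, shown only to illustrate how Gibbs sampling can update per-coefficient scale parameters alongside the regression coefficients.

```python
# Bayesian lasso Gibbs sampler (illustrative; fixed global penalty lam).
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta, sigma2, inv_tau2 = np.zeros(p), 1.0, np.ones(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + diag(1/tau^2)
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ Inverse-Gamma
        resid = y - X @ beta
        shape = (n - 1 + p) / 2.0
        scale = (resid @ resid + beta @ (inv_tau2 * beta)) / 2.0
        sigma2 = scale / rng.gamma(shape)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        draws[t] = beta
    return draws[n_iter // 2:]                     # discard burn-in

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
y = X @ np.array([3, -2, 0, 0, 0, 0, 0, 0, 0, 1.5]) + 0.5 * rng.standard_normal(100)
print("posterior means:", np.round(bayesian_lasso_gibbs(X, y).mean(axis=0), 2))
```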

2011
Sara van de Geer, Peter Bühlmann, Shuheng Zhou

Abstract: We revisit the adaptive Lasso as well as the thresholded Lasso with refitting, in a high-dimensional linear model, and study prediction error, ℓq-error (q ∈ {1, 2}), and number of false positive selections. Our theoretical results for the two methods are, at a rather fine scale, comparable. The differences only show up in terms of the (minimal) restricted and sparse eigenvalues, favor...
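As context for the estimator being analysed, here is a minimal two-stage adaptive Lasso sketch: weights come from an initial ridge fit and the weighted Lasso is implemented by rescaling the columns of X. The ridge pilot, the choice gamma = 1, and the synthetic data are assumptions, and the thresholded-Lasso-with-refitting variant is not shown.

```python
# Two-stage adaptive Lasso via column rescaling.
import numpy as np
from sklearn.linear_model import Ridge, LassoCV

rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[0, 5, 9]] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Stage 1: pilot estimate, used to build coefficient-specific weights.
pilot = Ridge(alpha=1.0).fit(X, y).coef_
weights = 1.0 / np.maximum(np.abs(pilot), 1e-6)    # larger pilot -> smaller penalty

# Stage 2: weighted Lasso, solved as a standard Lasso on rescaled columns.
X_scaled = X / weights
fit = LassoCV(cv=5).fit(X_scaled, y)
beta_adaptive = fit.coef_ / weights                # undo the rescaling
print("selected variables:", np.flatnonzero(beta_adaptive))
```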

2018
Alexander Jung, Nguyen Tran, Alexandru Mara

The “least absolute shrinkage and selection operator” (Lasso) method has recently been adapted for network-structured datasets. In particular, this network Lasso method allows one to learn graph signals from a small number of noisy signal samples by using the total variation of a graph signal for regularization. While efficient and scalable implementations of the network Lasso are available, only l...
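A toy version of the graph-signal recovery problem behind the network Lasso: a piecewise-constant signal on a chain graph is observed at a few noisy nodes and recovered by penalising the total variation over the edges. It is solved here with the generic solver cvxpy for clarity; the graph, sampling set, and penalty are illustrative assumptions, and the paper's own analysis and algorithms are not reproduced.

```python
# Graph-signal recovery with a total-variation (edge-difference) penalty.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 30
edges = [(i, i + 1) for i in range(n_nodes - 1)]            # simple chain graph
x_true = np.repeat([1.0, -2.0, 0.5], 10)                    # piecewise-constant signal

sampled = rng.choice(n_nodes, size=8, replace=False)        # a few noisy observations
y = x_true[sampled] + 0.1 * rng.standard_normal(len(sampled))

# S selects the sampled nodes; D is the edge-difference (incidence) matrix.
S = np.zeros((len(sampled), n_nodes))
S[np.arange(len(sampled)), sampled] = 1.0
D = np.zeros((len(edges), n_nodes))
for k, (i, j) in enumerate(edges):
    D[k, i], D[k, j] = 1.0, -1.0

x = cp.Variable(n_nodes)
lam = 1.0
cp.Problem(cp.Minimize(cp.sum_squares(S @ x - y) + lam * cp.norm1(D @ x))).solve()
print("recovered signal:", np.round(x.value, 2))
```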

2017
Daniel F. Schmidt, Enes Makalic

The lasso, introduced by Robert Tibshirani in 1996, has become one of the most popular techniques for estimating Gaussian linear regression models. An important reason for this popularity is that the lasso can simultaneously estimate all regression parameters as well as select important variables, yielding accurate regression models that are highly interpretable. This paper derives an efficient...

Chart of the number of search results per year
