Search results for: stein type shrinkage lasso
Number of results: 1,360,847
We propose the Bayesian adaptive Lasso (BaLasso) for variable selection and coefficient estimation in linear regression. The BaLasso is adaptive to the signal level by adopting different shrinkage for different coefficients. Furthermore, we provide a model selection machinery for the BaLasso by assessing the posterior conditional mode estimates, motivated by the hierarchical Bayesian interpreta...
I propose a new method for variable selection and shrinkage in Cox's proportional hazards model. My proposal minimizes the log partial likelihood subject to the sum of the absolute values of the parameters being bounded by a constant. Because of the nature of this constraint, it shrinks coefficients and produces some coefficients that are exactly zero. As a result it reduces the estimation vari...
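The key point of the abstract above is that an L1 bound on the Cox coefficients shrinks them and sets some exactly to zero. As a rough illustration (not the paper's algorithm), the sketch below minimizes a Breslow-type negative log partial likelihood plus an L1 penalty by proximal gradient descent; the function names, step size, and penalty level are all illustrative assumptions, and ties in event times are assumed away.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty; produces exact zeros."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cox(X, time, event, lam=0.3, lr=0.05, n_iter=1000):
    """Illustrative proximal-gradient sketch of L1-penalised Cox regression
    (Breslow-type partial likelihood, assuming no tied event times)."""
    order = np.argsort(time)
    Xs, d = X[order], event[order].astype(float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = Xs @ beta
        S = np.cumsum(np.exp(eta)[::-1])[::-1]         # risk-set sums
        grad_eta = -d + np.exp(eta) * np.cumsum(d / S) # Breslow gradient
        beta = soft_threshold(beta - lr * (Xs.T @ grad_eta) / len(d),
                              lr * lam)
    return beta

# Toy data: only the first two covariates affect the hazard.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))
beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
t = rng.exponential(1.0 / np.exp(X @ beta_true))
beta_hat = lasso_cox(X, t, np.ones(300))
```

With a penalty of this size, the coefficients on the three noise covariates typically come out exactly zero, matching the sparsity claim in the abstract.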
We wholeheartedly congratulate Lockhart, Taylor, Tibshirani and Tibshirani on the stimulating paper, which provides insights into statistical inference based on the lasso solution path. The authors proposed novel covariance statistics for testing the significance of predictor variables as they enter the active set, which formalizes the data-adaptive test based on the lasso path. The observation t...
In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic dis...
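The Exp(1) claim in this abstract can be checked numerically in the simplest setting. Under the global null with an orthogonal design (X = I, so the lasso knots are the sorted absolute values of y), the first covariance statistic reduces to λ₁(λ₁ − λ₂) with σ = 1; the simulation below is an illustrative sketch of that special case, not the general test.

```python
import numpy as np

# Orthogonal design, global null: y ~ N(0, I), knots are sorted |y_j|.
# The first covariance statistic is T1 = lam1 * (lam1 - lam2), which
# should be approximately Exp(1) when p is large.
rng = np.random.default_rng(1)
p, reps = 100, 3000
T1 = np.empty(reps)
for r in range(reps):
    lam = np.sort(np.abs(rng.standard_normal(p)))[::-1]
    T1[r] = lam[0] * (lam[0] - lam[1])
```

The empirical mean of `T1` should sit near 1 and the median near ln 2 ≈ 0.69, consistent with an Exp(1) limit.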
It is well known that in unidentifiable models, Bayes estimation generalizes better than maximum likelihood estimation. However, accurately approximating the posterior distribution incurs huge computational costs. In this paper, we consider an empirical Bayes approach in which part of the parameters are regarded as hyperparameters, which we call a subspace B...
Entropy is a fundamental quantity in statistics and machine learning. In this note, we present a novel procedure for statistical learning of entropy from high-dimensional small-sample data. Specifically, we introduce a simple yet very powerful small-sample estimator of the Shannon entropy based on James-Stein-type shrinkage. This results in an estimator that is highly efficient statistically ...
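The general recipe behind such an estimator is to shrink the observed cell frequencies toward a low-variance target (here, the uniform distribution) with a data-driven intensity, then plug the shrunken frequencies into the entropy formula. The sketch below follows that recipe under the assumption of a uniform target and a quadratic-risk shrinkage intensity; the exact formula used in the cited paper may differ.

```python
import numpy as np

def entropy_shrink(counts):
    """James-Stein-type shrinkage estimate of Shannon entropy (in nats).

    Cell frequencies are shrunk toward the uniform target 1/p before
    plugging into the plug-in entropy formula (illustrative sketch)."""
    counts = np.asarray(counts, dtype=float)
    n, p = counts.sum(), len(counts)
    theta_ml = counts / n              # maximum-likelihood frequencies
    target = 1.0 / p                   # uniform shrinkage target
    # estimated shrinkage intensity, clipped to [0, 1]
    num = 1.0 - np.sum(theta_ml ** 2)
    den = (n - 1.0) * np.sum((target - theta_ml) ** 2)
    lam = 1.0 if den == 0 else min(1.0, num / den)
    theta = lam * target + (1.0 - lam) * theta_ml
    theta = theta[theta > 0]
    return -np.sum(theta * np.log(theta))

H_uniform = entropy_shrink([1000, 1000, 1000, 1000])  # = log 4
H_sparse = entropy_shrink([3, 0, 0, 1])               # small sample, p = 4
```

For the tiny sample `[3, 0, 0, 1]`, shrinkage pulls the estimate up from the badly biased plug-in value (≈ 0.56 nats) toward log 4, which is the behavior that makes such estimators attractive in small-sample regimes.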
BACKGROUND Because multiple loci control complex diseases, there is great interest in testing markers simultaneously instead of one by one. In this paper, we applied two model selection algorithms: the stochastic search variable selection (SSVS) and the least absolute shrinkage and selection operator (LASSO) to two quantitative phenotypes related to rheumatoid arthritis (RA). RESULTS The Gene...
Lasso-type variable selection has found increasingly broad machine learning applications. In this paper, uncorrelated Lasso is proposed for variable selection, where variable de-correlation is considered simultaneously with variable selection, so that the selected variables are as uncorrelated as possible. An effective iterative algorithm, with a proof of convergence, is presented to solve ...
The application of the lasso is espoused in high-dimensional settings where only a small number of the regression coefficients are believed to be nonzero (i.e., the solution is sparse). Moreover, statistical properties of high-dimensional lasso estimators are often proved under the assumption that the correlation between the predictors is bounded. In this vein, coordinatewise methods, the most ...
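The coordinatewise methods mentioned above solve the lasso one coefficient at a time: each update is a univariate soft-thresholding step against the current residual. The following is a minimal cyclic coordinate-descent sketch of this standard idea (names and tuning values are illustrative, not taken from any particular paper).

```python
import numpy as np

def soft_threshold(z, t):
    """Univariate lasso solution: shrink toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/(2n))||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y.astype(float).copy()          # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]      # add back coordinate j's contribution
            beta[j] = soft_threshold(X[:, j] @ r / n, lam) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

# Toy data: middle coefficient is truly zero.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 3))
y = X @ np.array([2.0, 0.0, -2.0]) + 0.1 * rng.standard_normal(200)
beta_hat = lasso_cd(X, y, lam=0.1)
```

Because each coordinate update only touches one column of X, the method is cheap per pass, which is one reason coordinatewise solvers dominate in high-dimensional lasso computation.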
A new shrinkage method for high-dimensional regression models to remedy the multicollinearity problem
This research presents a new shrinkage method for selecting a basic subset of variables from large data sets. The shrinkage estimator is a modification of ridge and adaptive-lasso regression with a mixing parameter calculated as in the elastic net. The proposed estimator, called the Improved Mixed Shrinkage Estimator (IMSHE), handles the multicollinearity problem. In practice, it is difficult to achieve the required accuracy...