Search results for: stein type shrinkage lasso
Number of results: 1,360,847
Charles Stein [10] discovered that, under quadratic loss, the usual unbiased estimator for the mean vector of a multivariate normal distribution is inadmissible if the dimension n of the mean vector exceeds two. On the way, he constructed shrinkage estimators that dominate the usual estimator asymptotically in n. It has since been claimed that Stein’s results and the subsequent James–Stein esti...
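The shrinkage idea described in this abstract can be illustrated with a minimal NumPy sketch of the positive-part James-Stein estimator; the dimension, true mean, and random seed below are arbitrary choices for the illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                        # dimension of the mean vector (inadmissibility needs n >= 3)
theta = np.ones(n)            # true mean, chosen arbitrarily for this sketch
x = rng.normal(theta, 1.0)    # one observation X ~ N(theta, I)

# Positive-part James-Stein estimator: shrink the observation toward the origin.
shrink = max(0.0, 1.0 - (n - 2) / np.sum(x ** 2))
js = shrink * x

# Squared-error losses for this single draw (dominance is a statement about risk,
# i.e. expected loss, not about every realization).
loss_usual = np.sum((x - theta) ** 2)
loss_js = np.sum((js - theta) ** 2)
```

Averaging `loss_js` over many draws and comparing with `loss_usual` reproduces the dominance phenomenon numerically.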
The least absolute deviation (LAD) regression is a useful method for robust regression, and the least absolute shrinkage and selection operator (lasso) is a popular choice for shrinkage estimation and variable selection. In this article we combine these two classical ideas to produce the LAD-lasso. Compared with the LAD regression, LAD-lasso can do parameter estimation and variable selecti...
Yuan and Lin (2004) proposed the grouped LASSO, which achieves shrinkage and selection simultaneously, as LASSO does, but works on blocks of covariates. That is, the grouped LASSO provides a model where some blocks of regression coefficients are exactly zero. The grouped LASSO is useful when there are meaningful blocks of covariates such as polynomial regression and dummy variables from categori...
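The all-or-nothing behavior on blocks comes from the grouped penalty's proximal operator, which shrinks an entire block of coefficients toward zero and sets it exactly to zero when its Euclidean norm falls below the threshold. A minimal sketch (the function name and example values are illustrative, not from the paper):

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Proximal operator of lam * ||v||_2: shrinks the whole block v toward
    zero, and returns exactly zero when ||v||_2 <= lam."""
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

block = np.array([3.0, 4.0])        # one block of coefficients, norm 5
kept = group_soft_threshold(block, 2.5)   # shrunk but non-zero
killed = group_soft_threshold(block, 5.0) # entire block set to zero
```

Iterating this operator blockwise inside a proximal-gradient loop is one standard way to fit the grouped LASSO.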
We study group variable selection in a multivariate regression model. Group variable selection amounts to selecting the non-zero rows of the coefficient matrix: since there are multiple response variables, a predictor that is irrelevant to estimation must have its entire corresponding row equal to zero. In a high-dimensional setup, shrinkage estimation methods are applicable and guarantee smaller MSE than OLS acc...
The Stein paradox has played an influential role in the field of high-dimensional statistics. This result warns that the sample mean, classically regarded as the “usual estimator”, may be suboptimal in high dimensions. The development of the James-Stein estimator, which addresses this paradox, has by now inspired a large literature on the theme of “shrinkage”. In this direction, we develop a Stein-type estimator for the first principal component in high dimension...
The estimation problem in multivariate linear calibration with elliptical errors is considered under a loss function which can be derived from the Kullback-Leibler distance. First, we discuss the problem under normal errors and give an unbiased estimate of the risk of an alternative estimator by means of the Stein and Stein-Haff identities for the multivariate normal distribution. From the unbiased est...
Proposed by Tibshirani (1996), the LASSO (least absolute shrinkage and selection operator) estimates a vector of regression coefficients by minimising the residual sum of squares subject to a constraint on the l1-norm of the coefficient vector. The LASSO estimator typically has one or more zero elements and thus shares characteristics of both shrinkage estimation and variable selection. In this pape...
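The constrained form described here is usually fit in its equivalent penalized (Lagrangian) form. A self-contained coordinate-descent sketch shows the mix of shrinkage and exact zeros; the simulated design, true coefficients, and penalty level are illustrative assumptions, not values from the paper:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimize (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]   # partial residual excluding feature j
            rho = X[:, j] @ r / n
            # Soft-thresholding: the source of exact zeros in the solution.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
beta = np.zeros(10)
beta[:3] = [3.0, -2.0, 1.5]                 # only three truly active coefficients
y = X @ beta + 0.1 * rng.normal(size=100)
b_hat = lasso_cd(X, y, lam=0.2)
```

The irrelevant coefficients are set exactly to zero while the active ones are shrunk slightly toward zero, the two behaviors the abstract attributes to the LASSO.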
Non-Local Means (NLM) and its variants have proven to be effective and robust in many image denoising tasks. In this letter, we study approaches to selecting center pixel weights (CPW) in NLM. Our key contributions are: 1) we give a novel formulation of the CPW problem from a statistical shrinkage perspective; 2) we construct the James-Stein shrinkage estimator in the CPW context; and 3) we pro...
The least absolute shrinkage and selection operator (lasso) has been widely used in regression shrinkage and selection. In this article, we extend its application to the REGression model with AutoRegressive errors (REGAR). Two types of lasso estimators are carefully studied. The first is similar to the traditional lasso estimator with only two tuning parameters (one for regression coefficients ...