Search results for: stein estimator

Number of results: 34287

2010
Jana Jurečková Rudolf Beran

Charles Stein [10] discovered that, under quadratic loss, the usual unbiased estimator for the mean vector of a multivariate normal distribution is inadmissible if the dimension n of the mean vector exceeds two. On the way, he constructed shrinkage estimators that dominate the usual estimator asymptotically in n. It has since been claimed that Stein’s results and the subsequent James–Stein esti...
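
For reference, the estimator these abstracts refer to can be written, in the homoscedastic N_p(θ, I_p) observation model, as follows; the exact normalization and the shrinkage target vary from paper to paper.

```latex
% James--Stein estimator of \theta from a single observation X ~ N_p(\theta, I_p), p >= 3.
% It dominates X itself under quadratic loss, which is Stein's inadmissibility result.
\hat{\theta}_{\mathrm{JS}} = \left(1 - \frac{p-2}{\lVert X \rVert^{2}}\right) X,
\qquad
\mathbb{E}\,\lVert \hat{\theta}_{\mathrm{JS}} - \theta \rVert^{2}
  \;<\; \mathbb{E}\,\lVert X - \theta \rVert^{2} \;=\; p
\quad\text{for every } \theta \in \mathbb{R}^{p}.
```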

2005
Sanjay Chaudhuri

The Reverse Stein Effect is identified and illustrated: A statistician who shrinks his/her data toward a point chosen without reliable knowledge about the underlying value of the parameter to be estimated, but based instead upon the observed data, will not be protected by the minimax property of shrinkage estimators such as that of James and Stein, but instead will likely incur a greater error th...

2000
Tae-Hwan Kim Halbert White

We explore the extension of James-Stein type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator towards a fixed point, we shrink it towards a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James-Stein type estimators shrunk towards a data-dependent poi...
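
As an illustration of shrinking toward a data-dependent point, a minimal sketch in the spirit of this abstract is given below; the grand-mean target and the (p - 3) shrinkage constant follow the classical Efron–Morris/Lindley variant and are assumptions here, not necessarily the construction Kim and White analyze.

```python
import numpy as np

def js_shrink_to_grand_mean(x):
    """Positive-part James-Stein shrinkage of x ~ N_p(theta, I_p) toward the
    data-dependent target (x_bar, ..., x_bar).  Illustrative sketch only;
    the estimator analyzed by Kim and White may differ in target and constant."""
    x = np.asarray(x, dtype=float)
    p = x.size
    target = np.full(p, x.mean())            # shrinkage point chosen from the data
    resid = x - target
    # one degree of freedom is used up by estimating the target, hence p - 3
    factor = max(0.0, 1.0 - (p - 3) / (resid @ resid))
    return target + factor * resid

# Example: a 10-dimensional mean whose components cluster near 5
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=10)
print(js_shrink_to_grand_mean(x))
```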

Journal: IEEE Trans. Signal Processing 1998
Jonathan H. Manton Vikram Krishnamurthy H. Vincent Poor

In 1961, James and Stein discovered a remarkable estimator that dominates the maximum-likelihood estimate of the mean of a p-variate normal distribution, provided the dimension p is greater than two. This paper extends the James–Stein estimator and highlights benefits of applying these extensions to adaptive signal processing problems. The main contribution of this paper is the derivation of th...

2008
Rudolf Beran

The Stein estimator and the better positive-part Stein estimator both dominate the sample mean, under quadratic loss, in the N(ξ, I) model of dimension q ≥ 3. Standard large-sample theory does not explain this phenomenon well. Plausible bootstrap estimators for the risk of the Stein estimator do not converge correctly at the shrinkage point as the sample size n increases. By analyzing a submodel exactly, wi...
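
The positive-part modification mentioned here truncates the shrinkage factor at zero; writing X for the sufficient statistic in the N(ξ, I) model (the original symbols did not survive extraction, so this notation is mine):

```latex
% Plain and positive-part James--Stein estimators of \xi, dimension q >= 3:
\hat{\xi}_{S}  \;=\; \Bigl(1 - \tfrac{q-2}{\lVert X \rVert^{2}}\Bigr) X,
\qquad
\hat{\xi}_{S+} \;=\; \Bigl(1 - \tfrac{q-2}{\lVert X \rVert^{2}}\Bigr)_{\!+} X,
\qquad (a)_{+} = \max(a, 0).
```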

Journal: Japanese Journal of Statistics and Data Science 2023

Abstract: It is now 62 years since the publication of James and Stein's seminal article on estimation of a multivariate normal mean vector. The paper made a spectacular first impression on the statistical community through its demonstration of the inadmissibility of the maximum likelihood estimator. It continues to be influential, but not for the initial reasons. Empirical Bayes shrinkage estimation, a major topic, found early ju...

1999
Tae-Hwan Kim Halbert White Clive Granger James Hamilton Patrick Fitzsimmons

We explore the extension of James-Stein type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator towards a fixed point, we shrink it towards a data-dependent point, which makes it possible that the “prior” becomes more accurate as the sample size grows. We provide an analytic expression for the as...

2002
Tae-Hwan Kim Halbert White

We explore the extension of James-Stein type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator towards a fixed point, we shrink it towards a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James-Stein type estimators shrunk towards a data-dependent poi...

2010
Charles Stein

Charles Stein shocked the statistical world in 1955 with his proof that maximum likelihood estimation methods for Gaussian models, in common use for more than a century, were inadmissible beyond simple one- or two-dimensional situations. These methods are still in use, for good reasons, but Stein-type estimators have pointed the way toward a radically different empirical Bayes approach to high-dim...

2017
Jann Spiess

In a linear regression model with homoscedastic Normal noise, I consider James–Stein type shrinkage in the estimation of nuisance parameters associated with control variables. For at least three control variables and an exogenous treatment, I show that the standard least-squares estimator is dominated with respect to squared-error loss in the treatment effect even among unbiased estimators and even...
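
A rough sketch of the setting this abstract describes, i.e. James–Stein-type shrinkage of the control-variable coefficients followed by re-estimation of the treatment coefficient: the shrinkage rule, the zero target, and the sequencing below are illustrative assumptions, not necessarily the estimator analyzed in the paper.

```python
import numpy as np

def js_treatment_effect(y, d, W):
    """Estimate the coefficient on an exogenous treatment d after
    James-Stein-type shrinkage of the control-variable coefficients.
    Generic illustration of the setting in Spiess (2017); the shrinkage
    rule and sequencing here are assumptions, not the paper's estimator."""
    n, k = W.shape
    X = np.column_stack([d, W])                       # treatment first, then controls
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])         # homoscedastic error variance
    gamma = beta[1:]                                  # nuisance (control) coefficients
    if k >= 3:
        V = sigma2 * np.linalg.inv(X.T @ X)[1:, 1:]   # covariance of gamma-hat
        stat = gamma @ np.linalg.solve(V, gamma)
        gamma = max(0.0, 1.0 - (k - 2) / stat) * gamma  # positive-part shrinkage
    # re-fit the treatment coefficient with the controls' contribution held fixed
    y_adj = y - W @ gamma
    return (d @ y_adj) / (d @ d)

# toy example: three controls, binary exogenous treatment, true effect 1.5
rng = np.random.default_rng(1)
n = 200
W = rng.normal(size=(n, 3))
d = rng.integers(0, 2, size=n).astype(float)
y = 1.5 * d + W @ np.array([0.2, -0.1, 0.0]) + rng.normal(size=n)
print(js_treatment_effect(y, d, W))
```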

[Chart: number of search results per year]