Empirical Bayesian thresholding for sparse signals using mixture loss functions

Authors

  • Vikas C. Raykar
  • Linda H. Zhao
Abstract

We develop an empirical Bayesian thresholding rule for the normal mean problem that adapts well to the sparsity of the signal. A key element is the use of a mixture loss function that combines the Lp loss and the 0-1 loss. The Bayes procedures under this loss are explicitly given as thresholding rules and are easy to compute. The prior on each mean is a mixture of an atom of probability at zero and a Laplace or normal density for the nonzero part. The mixing probability and the spread of the nonzero part are hyperparameters estimated by the empirical Bayes procedure. Our simulation experiments demonstrate that the proposed method performs better than competing methods across a wide range of scenarios. We also apply the proposed method to feature selection on four data sets.
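As a rough illustration of the ingredients described above (not the authors' estimator under the mixture loss), the sketch below assumes unit noise variance, fits the mixing weight w and the Laplace scale a by marginal maximum likelihood, and keeps an observation only when its posterior probability of being nonzero exceeds 1/2; the retained coordinates are left at their observed values rather than at a Bayes estimate.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def laplace_conv(x, a):
    # Marginal density of x = mu + N(0, 1) noise when mu ~ Laplace with rate a,
    # i.e. the convolution of the standard normal with (a/2) * exp(-a * |mu|).
    return 0.5 * a * (np.exp(0.5 * a**2 - a * x) * norm.cdf(x - a)
                      + np.exp(0.5 * a**2 + a * x) * norm.cdf(-x - a))

def neg_marginal_loglik(params, x):
    # Negative marginal log-likelihood under the spike-and-Laplace prior
    # (1 - w) * delta_0 + w * Laplace(a), with unit noise variance.
    w, a = params
    m = (1.0 - w) * norm.pdf(x) + w * laplace_conv(x, a)
    return -np.sum(np.log(m))

def eb_threshold(x):
    # Estimate the hyperparameters (w, a) by marginal maximum likelihood.
    res = minimize(neg_marginal_loglik, x0=np.array([0.1, 0.5]), args=(x,),
                   bounds=[(1e-4, 1 - 1e-4), (0.05, 2.0)])
    w, a = res.x
    # Posterior probability that each mean is nonzero; zero out the rest.
    g = laplace_conv(x, a)
    post_nonzero = w * g / ((1.0 - w) * norm.pdf(x) + w * g)
    return np.where(post_nonzero > 0.5, x, 0.0), w, a

# Toy usage: a sparse mean vector observed with unit-variance noise.
rng = np.random.default_rng(0)
mu = np.zeros(500)
mu[:25] = 5.0
mu_hat, w_hat, a_hat = eb_threshold(mu + rng.standard_normal(500))
```

The keep-or-kill rule here plays the role of the 0-1 component of the loss; the paper's actual Bayes rule under the mixture loss also shrinks the retained coordinates, which this sketch omits.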


Similar references

Needles and Straw in Haystacks: Empirical Bayes Estimates of Possibly Sparse Sequences

An empirical Bayes approach to the estimation of possibly sparse sequences observed in Gaussian white noise is set out and investigated. The prior considered is a mixture of an atom of probability at zero and a heavy-tailed density γ, with the mixing weight chosen by marginal maximum likelihood, in the hope of adapting between sparse and dense sequences. If estimation is then carried out using ...


Wavelet thresholding for some classes of non-Gaussian noise

Wavelet shrinkage and thresholding methods constitute a powerful way to carry out signal denoising, especially when the underlying signal has a sparse wavelet representation. They are computationally fast, and automatically adapt to the smoothness of the signal to be estimated. Nearly minimax properties for simple threshold estimators over a large class of function spaces and for a wide range o...
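For orientation only: the cited work treats non-Gaussian noise, but the basic soft-thresholding recipe it builds on can be sketched for the Gaussian case with PyWavelets. The wavelet choice, decomposition level, and universal threshold below are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
import pywt  # PyWavelets

def soft(c, t):
    # Soft-threshold coefficients c at level t.
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wavelet_denoise(y, wavelet="db4", level=4):
    # Decompose, estimate the noise level from the finest-scale details
    # (median absolute deviation), apply the universal threshold, reconstruct.
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(len(y)))
    denoised = [coeffs[0]] + [soft(c, t) for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(y)]
```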


Block-Based Compressive Sensing Using Soft Thresholding of Adaptive Transform Coefficients

Compressive sampling (CS) is a new technique for simultaneous sampling and compression of signals in which the sampling rate can be very small under certain conditions. Due to the limited number of samples, image reconstruction based on CS samples is a challenging task. Most of the existing CS image reconstruction methods have a high computational complexity as they are applied on the entire im...
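As a generic illustration of the kind of reconstruction involved (a plain iterative soft-thresholding loop, not the block-based adaptive-transform method described above), assuming measurements y = A x + noise with a known sensing matrix A:

```python
import numpy as np

def ista(y, A, lam=0.1, n_iter=200):
    # Iterative soft thresholding for min_x 0.5 * ||y - A x||^2 + lam * ||x||_1.
    # A block-based scheme would run this per image block; a single generic
    # sensing matrix is used here for brevity.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # gradient step = 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * A.T @ (A @ x - y)     # gradient step on the data-fit term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x
```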


Empirical Bayes thresholding: adapting to sparsity when it is advantageous to do so

Suppose one is trying to estimate a high dimensional vector of parameters from a series of one observation per parameter. Often, it is possible to take advantage of sparsity in the parameters by thresholding the data in an appropriate way. A marginal maximum likelihood approach, within a suitable Bayesian structure, has excellent properties. For very sparse signals, the procedure chooses a larg...



Journal title:

Volume   Issue

Pages  -

Publication date: 2010