Efficient Sparse Recovery via Adaptive Non-Convex Regularizers with Oracle Property

Authors

  • Ming Lin
  • Rong Jin
  • Changshui Zhang
Abstract

The main shortcoming of sparse recovery with a convex regularizer is that it yields a biased estimator and therefore performs suboptimally in many cases. Recent studies have shown, both theoretically and empirically, that non-convex regularizers are able to overcome this bias problem. Although multiple algorithms have been developed for sparse recovery with non-convex regularization, they are either computationally demanding or lack the desired properties (i.e., optimal recovery error, selection consistency, and the oracle property). In this work, we develop an efficient sparse-recovery algorithm based on proximal gradient descent. Its key feature is the use of adaptive non-convex regularizers whose shrinking threshold varies over the iterations. The algorithm is compatible with most popular non-convex regularizers, achieves a geometric convergence rate for the recovery error, is selection consistent, and, most importantly, has the oracle property. Within the proposed framework, we suggest using the so-called ACCQ regularizer, which is equivalent to adaptive hard-thresholding with zero proximal projection gap. Experiments on both synthetic data sets and real images verify the efficiency and effectiveness of the proposed method compared to state-of-the-art methods for sparse recovery.
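A minimal sketch of the idea described above, under assumed parameters (step size eta derived from the spectral norm of A, initial threshold tau0, geometric decay factor rho; none of these names come from the paper): proximal gradient descent on a least-squares loss, where the proximal step is a hard threshold whose level shrinks over iterations, mirroring the adaptive shrinking threshold the abstract attributes to the ACCQ regularizer.

import numpy as np

def adaptive_hard_threshold_recovery(A, b, tau0=1.0, rho=0.9, n_iter=200):
    # Illustrative sketch, not the paper's algorithm: step size from the
    # Lipschitz constant of the gradient of 0.5 * ||Ax - b||^2.
    eta = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    tau = tau0
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)               # gradient of the least-squares loss
        z = x - eta * grad                     # plain gradient step
        x = np.where(np.abs(z) > tau, z, 0.0)  # hard-thresholding proximal step
        tau *= rho                             # shrink the threshold each iteration
    return x

For instance, with A of shape (m, d) and b = A @ x_true for a sparse x_true, such iterates typically lock onto the correct support once tau has decayed below the smallest nonzero magnitude of x_true.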


Related resources

A Study of Convex Regularizers for Sparse Recovery and Feature Selection

We study the problem of recovering a sparse vector from a set of linear measurements. This problem also relates to feature or variable selection in statistics and machine learning. A widely used method for such problems has been regularization with the L1 norm. We extend this methodology to allow for a broader class of regularizers which includes the L1 norm. This class is characterized by a co...
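For reference, the standard L1-regularized recovery problem that this snippet starts from can be written as

\[
\min_{x \in \mathbb{R}^d} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1,
\]

where A is the measurement matrix, b the observed measurements, and \lambda > 0 the regularization weight; the broader class of regularizers mentioned in the snippet generalizes the \|x\|_1 term.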


High-dimensional Inference via Lipschitz Sparsity-Yielding Regularizers

Non-convex regularizers are increasingly applied to high-dimensional inference with a sparsity prior. In general, non-convex regularizers are superior to convex ones for inference, but they suffer from the difficulties brought by local optima and heavy computation. A "good" regularizer should perform well in both inference and optimization. In this paper, we prove that some non-convex...
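The snippet does not name a concrete regularizer; as a standard example of a Lipschitz, sparsity-yielding non-convex penalty (our assumption, not necessarily the paper's choice), the minimax concave penalty (MCP) applied coordinate-wise is

\[
P_{\lambda,\gamma}(t) =
\begin{cases}
\lambda|t| - \dfrac{t^2}{2\gamma}, & |t| \le \gamma\lambda, \\[4pt]
\dfrac{\gamma\lambda^2}{2}, & |t| > \gamma\lambda,
\end{cases}
\]

which is Lipschitz with constant \lambda and flattens out for large |t|, avoiding the bias of the L1 norm.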


Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression

Non-convex regularizers usually improve the performance of sparse estimation in practice. To prove this fact, we study the conditions for sparse estimation with sharp concave regularizers, a general family of non-convex regularizers that includes many existing ones. For the global solutions of the regularized regression, our sparse-eigenvalue-based conditions are weaker than tha...
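As background, sparse eigenvalue conditions are usually stated through the restricted eigenvalues of the design matrix A over s-sparse directions (the exact conditions used in the paper may differ):

\[
\rho_{-}(s) = \min_{0 < \|u\|_0 \le s} \frac{u^{\top} A^{\top} A u}{\|u\|_2^2},
\qquad
\rho_{+}(s) = \max_{0 < \|u\|_0 \le s} \frac{u^{\top} A^{\top} A u}{\|u\|_2^2}.
\]

A weaker condition of this type tolerates a larger ratio \rho_{+}(s)/\rho_{-}(s), and thus holds for a broader class of design matrices.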


Adaptive Regularization through Entire Solution Surface

Several sparseness penalties have been suggested for delivering good predictive performance in automatic variable selection within the framework of regularization. All assume that the true model is sparse. We propose a penalty, a convex combination of the L1- and L∞-norms, that adapts to a variety of situations, including sparseness and nonsparseness, grouping and nongrouping. The proposed penalt...
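A direct formalization of "a convex combination of the L1- and L∞-norms" (with an assumed mixing parameter \alpha; the truncated snippet does not give the paper's exact parameterization) is

\[
P_{\lambda,\alpha}(\beta) = \lambda \left[ \alpha \|\beta\|_1 + (1-\alpha) \|\beta\|_\infty \right],
\qquad \alpha \in [0, 1],
\]

which interpolates between the lasso (\alpha = 1) and a pure L∞ penalty (\alpha = 0) that encourages grouping of coefficient magnitudes.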


Sparse Recovery via Partial Regularization: Models, Theory and Algorithms

In the context of sparse recovery, it is known that most existing regularizers, such as ℓ1, suffer from a bias incurred by the leading entries (in magnitude) of the associated vector. To neutralize this bias, we propose a class of models with partial regularizers for recovering a sparse solution of a linear system. We show that every local minimizer of these models is sufficiently sparse o...
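One way to write such a partial-regularization model, treating the number of unpenalized entries r and the per-coordinate regularizer \phi as illustrative placeholders, is

\[
\min_{x \in \mathbb{R}^d} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \sum_{i = r+1}^{d} \phi\big(|x|_{[i]}\big),
\]

where |x|_{[i]} is the i-th largest entry of x in magnitude: the r leading entries escape the penalty altogether, which is what neutralizes the bias described above.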




Publication date: 2014