Generalized sparse covariance-based estimation
Authors
Abstract
Similar articles
Individual-specific, sparse inverse covariance estimation in generalized estimating equations
This paper proposes a data-driven approach that derives individual-specific sparse working correlation matrices for generalized estimating equations (GEEs). The approach is motivated by the observation that, in some applications of the GEE, the covariance structure across individuals is heterogeneous and cannot be appropriately captured by a single correlation matrix. The proposed approach enjoys...
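For orientation, here is a minimal sketch of the baseline that such an approach generalizes: a standard GEE fit in which all individuals share one working correlation structure. The simulated data, the exchangeable structure, and the use of statsmodels are illustrative assumptions, not the paper's individual-specific estimator.

```python
# Minimal GEE sketch with one shared working correlation structure
# (the baseline that individual-specific approaches generalize).
# Data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_subjects, n_obs = 50, 4                      # 50 individuals, 4 repeated measures each
groups = np.repeat(np.arange(n_subjects), n_obs)
x = rng.normal(size=n_subjects * n_obs)
# A shared subject effect induces within-subject correlation.
subject_effect = np.repeat(rng.normal(scale=0.8, size=n_subjects), n_obs)
y = 1.0 + 2.0 * x + subject_effect + rng.normal(size=n_subjects * n_obs)

data = pd.DataFrame({"y": y, "x": x, "group": groups})
model = sm.GEE.from_formula(
    "y ~ x",
    groups="group",
    data=data,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),   # one working correlation for everyone
)
result = model.fit()
print(result.summary())
```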
ℓ0 Sparse Inverse Covariance Estimation
Recently, there has been a focus on penalized log-likelihood covariance estimation for sparse inverse covariance (precision) matrices. The penalty is responsible for inducing sparsity, and a very common choice is the convex ℓ1 norm. However, the best estimator performance is not always achieved with this penalty. The most natural sparsity-promoting “norm” is the non-convex ℓ0 penalty, but its lack ...
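For concreteness, the penalized objectives being contrasted can be written as follows (notation is ours: S is the sample covariance, Θ the precision matrix, λ > 0 the penalty weight):

```latex
% l1-penalized Gaussian log-likelihood for the precision matrix Theta
% (S = sample covariance, lambda > 0 = penalty weight; notation ours)
\hat{\Theta}_{\ell_1} = \arg\max_{\Theta \succ 0}
  \Big\{ \log\det\Theta - \operatorname{tr}(S\Theta) - \lambda\,\|\Theta\|_1 \Big\}

% l0 variant: the norm is replaced by the count of nonzero entries
\hat{\Theta}_{\ell_0} = \arg\max_{\Theta \succ 0}
  \Big\{ \log\det\Theta - \operatorname{tr}(S\Theta) - \lambda\,\|\Theta\|_0 \Big\}
```

The ℓ1 problem is convex and hence tractable at scale, while the ℓ0 count of nonzero entries promotes sparsity more directly at the cost of convexity.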
Sparse permutation invariant covariance estimation
The paper proposes a method for constructing a sparse estimator for the inverse covariance (concentration) matrix in high-dimensional settings. The estimator uses a penalized normal likelihood approach and forces sparsity by using a lasso-type penalty. We establish a rate of convergence in the Frobenius norm as both data dimension p and sample size n are allowed to grow, and show that the rate ...
Estimation of Functionals of Sparse Covariance Matrices.
High-dimensional statistical tests often ignore correlations to gain simplicity and stability, leading to null distributions that depend on functionals of correlation matrices, such as their Frobenius norm and other ℓr norms. Motivated by the computation of critical values of such tests, we investigate the difficulty of estimating these functionals of sparse correlation matrices. Specifically, we ...
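As a rough illustration of estimating such a functional, the sketch below computes a plug-in estimate of the off-diagonal squared Frobenius norm from a hard-thresholded sample correlation matrix. The thresholding rule and the simulated data are assumptions made for illustration, not the estimator studied in the paper.

```python
# Plug-in estimate of a correlation-matrix functional (off-diagonal
# squared Frobenius norm) from a hard-thresholded sample correlation.
# Simplified illustration; the paper's estimator may differ.
import numpy as np

rng = np.random.default_rng(1)
p, n = 40, 200

# True sparse correlation matrix: identity plus a few correlated pairs.
R = np.eye(p)
for i in range(0, 10, 2):
    R[i, i + 1] = R[i + 1, i] = 0.5

X = rng.multivariate_normal(np.zeros(p), R, size=n)
R_hat = np.corrcoef(X, rowvar=False)

# Hard-threshold small sample correlations at a level ~ sqrt(log p / n).
tau = 2.0 * np.sqrt(np.log(p) / n)
R_sparse = np.where(np.abs(R_hat) >= tau, R_hat, 0.0)
np.fill_diagonal(R_sparse, 1.0)

def offdiag_frobenius_sq(M):
    """Sum of squared off-diagonal entries."""
    return float(np.sum(M**2) - np.sum(np.diag(M)**2))

print("true functional:      ", offdiag_frobenius_sq(R))
print("plug-in estimate:     ", offdiag_frobenius_sq(R_sparse))
print("naive (unthresholded):", offdiag_frobenius_sq(R_hat))
```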
Sparse inverse covariance estimation with the lasso
We consider the problem of estimating sparse graphs by a lasso penalty applied to the inverse covariance matrix. Using a coordinate descent procedure for the lasso, we develop a simple algorithm, the Graphical Lasso, that is remarkably fast: it solves a 1000-node problem (∼500,000 parameters) in at most a minute, and is 30 to 4000 times faster than competing methods. It also provides a concep...
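A minimal usage sketch with scikit-learn's GraphicalLasso, which implements an ℓ1-penalized precision-matrix estimator of this type via coordinate descent; the simulated chain-graph data and the penalty level alpha are illustrative choices.

```python
# Sparse precision-matrix estimation with an l1 (lasso) penalty using
# scikit-learn's GraphicalLasso; data and alpha are illustrative.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
p, n = 20, 500

# Build a sparse precision matrix (a chain graph) and sample from it.
Theta = np.eye(p)
for i in range(p - 1):
    Theta[i, i + 1] = Theta[i + 1, i] = 0.4
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

model = GraphicalLasso(alpha=0.05).fit(X)

est_precision = model.precision_
nonzero = np.abs(est_precision) > 1e-4
print("estimated nonzero off-diagonal entries:",
      int((nonzero.sum() - p) // 2))
print("true nonzero off-diagonal entries:", p - 1)
```

The sparsity pattern of the estimated precision matrix can be read directly as the estimated graph: a zero entry means the two variables are conditionally independent given the rest.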
Journal
Journal title: Signal Processing
Year: 2018
ISSN: 0165-1684
DOI: 10.1016/j.sigpro.2017.09.010