Search results for: minimum covariance determinant estimator

Number of results: 267026

Journal: :Signal Processing 2017
Khalil Elkhalil Abla Kammoun Tareq Y. Al-Naffouri Mohamed-Slim Alouini

This paper analyzes the statistical properties of the signal-to-noise ratio (SNR) at the output of Capon’s minimum variance distortionless response (MVDR) beamformer when operating in impulsive noise. In particular, we consider the supervised case in which the receiver employs the regularized Tyler estimator to estimate the covariance matrix of the interference-plus-noise proces...
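As a rough illustration of the pipeline described above (not the paper's exact setup; the shrinkage constant, array size and steering vector below are arbitrary assumptions, and a real-valued sketch is used where beamforming is normally complex-valued), a regularized Tyler fixed-point iteration followed by MVDR weighting might look like:

    import numpy as np

    rng = np.random.default_rng(0)
    p, n, rho = 8, 200, 0.1          # array size, snapshots, shrinkage weight (assumed values)
    a = np.ones(p) / np.sqrt(p)      # hypothetical steering vector
    X = rng.standard_t(df=3, size=(p, n))   # heavy-tailed (impulsive-like) snapshots

    # Regularized Tyler estimator: fixed-point iteration with shrinkage toward the identity.
    R = np.eye(p)
    for _ in range(50):
        Rinv = np.linalg.inv(R)
        q = np.einsum('in,ij,jn->n', X, Rinv, X)     # x_k^T R^{-1} x_k for each snapshot
        R_new = (1 - rho) * (p / n) * (X / q) @ X.T + rho * np.eye(p)
        R_new *= p / np.trace(R_new)                 # trace normalization
        if np.linalg.norm(R_new - R, 'fro') < 1e-8:
            R = R_new
            break
        R = R_new

    # MVDR weights against the estimated interference-plus-noise covariance.
    Rinv = np.linalg.inv(R)
    w = Rinv @ a / (a @ Rinv @ a)
    print("beamformer output power:", w @ R @ w)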

2010
Peter Bell

HMM-based systems for Automatic Speech Recognition typically model the acoustic features using mixtures of multivariate Gaussians. In this thesis, we consider the problem of learning a suitable covariance matrix for each Gaussian. A variety of schemes have been proposed for controlling the number of covariance parameters per Gaussian, and studies have shown that in general, the greater the numb...
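For context only (generic scikit-learn usage on synthetic features, not the thesis' HMM system), the trade-off in the number of covariance parameters per Gaussian can be seen by switching covariance_type:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 13))        # 13-dimensional acoustic-like features (made up)

    for cov_type in ("spherical", "diag", "full"):
        gmm = GaussianMixture(n_components=4, covariance_type=cov_type, random_state=0).fit(X)
        # Per-component covariance parameters grow from 1 (spherical) to d (diag) to d(d+1)/2 (full).
        print(cov_type, "average log-likelihood per frame:", gmm.score(X))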

2011
T. Tony Cai Harrison H. Zhou

Driven by a wide range of applications in high-dimensional data analysis, there has been significant recent interest in the estimation of large covariance matrices. In this paper, we consider optimal estimation of a covariance matrix as well as its inverse over several commonly used parameter spaces under the matrix l1 norm. Both minimax lower and upper bounds are derived. The lower bounds are ...
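As a concrete, simplified example of the kind of structured estimator studied in this literature, a banding estimator zeroes out sample covariances far from the diagonal; the bandwidth k below is an arbitrary choice, not the minimax-optimal one derived in the paper:

    import numpy as np

    def banded_covariance(X, k):
        """Sample covariance with entries |i - j| > k set to zero (banding estimator)."""
        S = np.cov(X, rowvar=False)
        p = S.shape[0]
        i, j = np.indices((p, p))
        return np.where(np.abs(i - j) <= k, S, 0.0)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))
    Sigma_hat = banded_covariance(X, k=3)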

2007
Christian B. Hansen

I consider the asymptotic properties of a commonly advocated covariance matrix estimator for panel data. Under asymptotics where the cross-section dimension, n, grows large with the time dimension, T, fixed, the estimator is consistent while allowing essentially arbitrary correlation within each individual. However, many panel data sets have a non-negligible time dimension. I extend the usual a...
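A minimal numpy sketch of the commonly advocated cluster-robust ("sandwich") covariance for pooled OLS on panel data — the generic formula that allows arbitrary correlation within each individual, not the paper's large-T extension — might look like:

    import numpy as np

    def cluster_robust_cov(X, y, groups):
        """Sandwich covariance (X'X)^{-1} (sum_g X_g' u_g u_g' X_g) (X'X)^{-1} for pooled OLS."""
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        u = y - X @ beta
        bread = np.linalg.inv(X.T @ X)
        meat = np.zeros((X.shape[1], X.shape[1]))
        for g in np.unique(groups):
            sg = X[groups == g].T @ u[groups == g]
            meat += np.outer(sg, sg)
        return bread @ meat @ bread

    # Toy panel: n individuals each observed T times (sizes are arbitrary).
    rng = np.random.default_rng(0)
    n, T = 50, 5
    groups = np.repeat(np.arange(n), T)
    X = np.column_stack([np.ones(n * T), rng.normal(size=n * T)])
    y = X @ np.array([1.0, 0.5]) + rng.normal(size=n * T)
    print(np.sqrt(np.diag(cluster_robust_cov(X, y, groups))))   # clustered standard errors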

Journal: :Computational Statistics & Data Analysis 2012
Mikko Packalen Tony S. Wirjanto

Selecting an estimator for the covariance matrix of a regression’s parameter estimates is an important step in hypothesis testing. From less robust to more robust, the available choices include: Eicker/White heteroskedasticity-robust standard errors, cluster-robust standard errors, and multi-way cluster-robust standard errors. The rationale for using a less robust covariance matrix estimator is...
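In practice the first two choices listed in the abstract map onto standard options of regression libraries; for illustration only (statsmodels usage with made-up data and hypothetical cluster labels):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    groups = rng.integers(0, 20, size=n)                 # hypothetical cluster labels
    X = sm.add_constant(rng.normal(size=(n, 2)))
    y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

    ols = sm.OLS(y, X)
    hc = ols.fit(cov_type="HC1")                                      # Eicker/White heteroskedasticity-robust
    cl = ols.fit(cov_type="cluster", cov_kwds={"groups": groups})     # one-way cluster-robust
    print(hc.bse)   # standard errors under each covariance estimator
    print(cl.bse)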

2013
Olivier Ledoit Michael Wolf

This paper revisits the methodology of Stein (1975, 1986) for estimating a covariance matrix in the setting where the number of variables can be of the same magnitude as the sample size. Stein proposed to keep the eigenvectors of the sample covariance matrix but to shrink the eigenvalues. By minimizing an unbiased estimator of risk, Stein derived an ‘optimal’ shrinkage transformation. Unfortuna...
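The basic operation — keep the sample eigenvectors, replace the sample eigenvalues with shrunken ones — is easy to sketch; the simple linear pull toward the grand mean below is only a stand-in for Stein's (or Ledoit and Wolf's) actual shrinkage formulas:

    import numpy as np

    def eigenvalue_shrinkage(X, alpha=0.5):
        """Keep the sample eigenvectors, shrink the eigenvalues toward their mean by a factor alpha."""
        S = np.cov(X, rowvar=False)
        vals, vecs = np.linalg.eigh(S)
        shrunk = (1 - alpha) * vals + alpha * vals.mean()
        return vecs @ np.diag(shrunk) @ vecs.T

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 50))      # n of the same magnitude as p, where sample eigenvalues are badly spread
    Sigma_hat = eigenvalue_shrinkage(X, alpha=0.5)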

2012
Matthew Coudron Gilad Lerman

We estimate the rate of convergence and sample complexity of a recent robust estimator for a generalized version of the inverse covariance matrix. This estimator is used in a convex algorithm for robust subspace recovery (i.e., robust PCA). Our model assumes a sub-Gaussian underlying distribution and an i.i.d. sample from it. Our main result shows with high probability that the norm of the diff...
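To make the notion of a convergence rate concrete (an empirical illustration only, using the plain sample estimator rather than the robust estimator analyzed in the paper), one can watch the spectral-norm error of the inverse sample covariance shrink as the i.i.d. sample grows:

    import numpy as np

    rng = np.random.default_rng(0)
    p = 10
    Sigma_inv_true = np.eye(p)            # identity covariance, so its inverse is also the identity

    for n in (100, 1000, 10000):
        X = rng.normal(size=(n, p))       # i.i.d. sub-Gaussian (Gaussian) sample
        S_inv = np.linalg.inv(np.cov(X, rowvar=False))
        err = np.linalg.norm(S_inv - Sigma_inv_true, 2)   # spectral norm of the difference
        print(n, err)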

2015
Bohan Liu Ernest Fokoué

We introduce and develop a novel approach to outlier detection based on adaptation of random subspace learning. Our proposed method handles both high-dimension low-sample size and traditional low-dimensional high-sample size datasets. Essentially, we avoid the computational bottleneck of techniques like Minimum Covariance Determinant (MCD) by computing the needed determinants and associated mea...
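For reference, the classical MCD baseline that the proposed method sidesteps is available off the shelf; a minimal outlier-flagging sketch (the data, contamination, and cutoff below are arbitrary) is:

    import numpy as np
    from sklearn.covariance import MinCovDet
    from scipy.stats import chi2

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    X[:10] += 6.0                                    # plant a few gross outliers

    mcd = MinCovDet(random_state=0).fit(X)
    d2 = mcd.mahalanobis(X)                          # squared robust Mahalanobis distances
    outliers = d2 > chi2.ppf(0.975, df=X.shape[1])   # chi-square reference cutoff
    print(np.flatnonzero(outliers))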

2004
Xihong Lin Naisyin Wang Alan H. Welsh Raymond J. Carroll

S For independent data, it is well known that kernel methods and spline methods are essentially asymptotically equivalent (Silverman, 1984). However, recent work of Welsh et al. (2002) shows that the same is not true for clustered/longitudinal data. Splines and conventional kernels are different in localness and ability to account for the within-cluster correlation. We show that a smoothi...

Journal: :Journal of Machine Learning Research 2011
Jérémie Bigot Rolando J. Biscay Jean-Michel Loubes Lilian Muñiz-Alvarez

In this paper, we consider the Group Lasso estimator of the covariance matrix of a stochastic process corrupted by an additive noise. We propose to estimate the covariance matrix in a high-dimensional setting under the assumption that the process has a sparse representation in a large dictionary of basis functions. Using a matrix regression model, we propose a new methodology for high-dimensiona...
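The group-penalty machinery itself is generic; a short proximal-gradient sketch of a group Lasso (on an ordinary regression problem, not the paper's matrix-regression model for covariances, with an arbitrary penalty level lam) is:

    import numpy as np

    def group_lasso(X, y, groups, lam=5.0, iters=500):
        """Proximal gradient for min_b 0.5*||y - X b||^2 + lam * sum_g ||b_g||_2."""
        step = 1.0 / np.linalg.norm(X, 2) ** 2        # 1 / Lipschitz constant of the gradient
        b = np.zeros(X.shape[1])
        for _ in range(iters):
            z = b - step * (X.T @ (X @ b - y))        # gradient step on the least-squares term
            for g in np.unique(groups):
                idx = groups == g
                norm_g = np.linalg.norm(z[idx])
                # Block soft-thresholding: shrink each group's coefficients jointly.
                b[idx] = max(0.0, 1.0 - step * lam / norm_g) * z[idx] if norm_g > 0 else 0.0
        return b

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 12))
    groups = np.repeat(np.arange(4), 3)               # 4 groups of 3 coefficients
    beta_true = np.concatenate([np.ones(3), np.zeros(9)])
    y = X @ beta_true + 0.1 * rng.normal(size=100)
    print(np.round(group_lasso(X, y, groups), 2))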

Chart of the number of search results per year
