Search results for: kullback information

Number of results: 1158173

2016
Shiyong Cui, Chengfeng Luo

In this article, a method based on a non-parametric estimation of the Kullback–Leibler divergence over a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both pre- and post-event images. The distribution of these local features over a local neighbourhood is treated as a statistic...
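The non-parametric Kullback–Leibler estimation described above can be illustrated with a simple histogram plug-in estimator. This is a minimal sketch, not the authors' Gabor-feature pipeline; the function name `kl_from_histograms`, the bin count, and the smoothing constant are all assumptions:

```python
import numpy as np

def kl_from_histograms(x, y, bins=32, eps=1e-12):
    """Crude plug-in estimate of D(P || Q) from two 1-D feature samples."""
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    p = (p + eps) / (p + eps).sum()   # smooth to avoid log(0), then normalize
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

# A shift in the local feature distribution (a "change") yields a larger estimate
rng = np.random.default_rng(0)
same = kl_from_histograms(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
diff = kl_from_histograms(rng.normal(0, 1, 5000), rng.normal(1, 1, 5000))
```

Thresholding such a divergence map over local neighbourhoods is one common way to flag changed pixels.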

Journal: Physical Review E, Statistical, Nonlinear, and Soft Matter Physics, 2007
Michele Tumminello, Fabrizio Lillo, Rosario N. Mantegna

We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model and we show that the expected values are known ...
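For zero-mean multivariate Gaussians, the Kullback–Leibler distance used above has the closed form D(N(0, Σ₁) ‖ N(0, Σ₂)) = ½(tr(Σ₂⁻¹Σ₁) − n + ln det Σ₂ − ln det Σ₁). A minimal Python sketch of how a finite-sample correlation matrix drifts from its reference model (the function name and toy sample size are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gaussian_kl(sigma1, sigma2):
    """Closed-form D( N(0, sigma1) || N(0, sigma2) )."""
    n = sigma1.shape[0]
    inv2 = np.linalg.inv(sigma2)
    _, logdet1 = np.linalg.slogdet(sigma1)
    _, logdet2 = np.linalg.slogdet(sigma2)
    return 0.5 * (np.trace(inv2 @ sigma1) - n + logdet2 - logdet1)

# A sample correlation matrix estimated from finite data is statistically
# uncertain: its KL distance from the true (identity) model is positive.
rng = np.random.default_rng(1)
true_corr = np.eye(3)
x = rng.multivariate_normal(np.zeros(3), true_corr, size=200)
sample_corr = np.corrcoef(x, rowvar=False)
d = gaussian_kl(sample_corr, true_corr)
```

The expected value of `d` over repeated samples is what the paper characterizes analytically as a function of dimension and sample size.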

2005
Nader Ebrahimi

In this paper, we introduce the minimum dynamic discrimination information (MDDI) approach to probability modeling. The MDDI model relative to a given distribution G is the one with the least Kullback–Leibler information discrepancy from G among all distributions satisfying information constraints given in terms of residual moment inequalities, residual moment growth inequalities, or h...

Journal: Int. J. Computational Intelligence Systems, 2014
R. Priya, T. N. Shanmugam, R. Baskaran

Content-based video retrieval systems have shown great potential in supporting decision making in clinical activities, teaching, and biological research. In content-based video retrieval, feature combination plays a key role; as a result, content-based retrieval across different types of video data is a challenging and active problem. This paper presents an effective content-based vid...

Journal: IEEE Transactions on Information Theory, 2000

Journal: SIAM Journal on Scientific Computing, 2021

We propose to compute a sparse approximate inverse Cholesky factor $L$ of a dense covariance matrix $\Theta$ by minimizing the Kullback--Leibler divergence between the Gaussian distributions $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, L^{-\top} L^{-1})$, subject to a sparsity constraint. Surprisingly, this problem has a closed-form solution that can be computed efficiently, recovering the popular Vecchia approximation in spat...
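Written out, the objective above is D(N(0, Θ) ‖ N(0, (L L^⊤)^{-1})) = ½(tr(L^⊤ Θ L) − n − ln det(L L^⊤) − ln det Θ), using (L L^⊤)^{-1} = L^{-⊤} L^{-1}. A hedged Python sketch (names and the 2×2 example are illustrative only, not the paper's sparse solver) checks that the exact inverse Cholesky factor attains zero divergence:

```python
import numpy as np

def kl_to_inverse_cholesky(theta, L):
    """D( N(0, theta) || N(0, (L L^T)^{-1}) ) for a candidate factor L."""
    n = theta.shape[0]
    A = L @ L.T                      # candidate precision (inverse covariance)
    _, logdet_theta = np.linalg.slogdet(theta)
    _, logdet_A = np.linalg.slogdet(A)
    return 0.5 * (np.trace(A @ theta) - n - logdet_A - logdet_theta)

theta = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
# The exact Cholesky factor of the precision matrix makes the KL vanish;
# the paper's method minimizes this objective under a sparsity pattern on L.
L_exact = np.linalg.cholesky(np.linalg.inv(theta))
kl_exact = kl_to_inverse_cholesky(theta, L_exact)
```

Restricting the nonzero entries of `L` and minimizing this same objective column by column is what yields the closed-form solution mentioned in the abstract.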

2006
Young Kyung Lee, Byeong U. Park

Motivated by the bandwidth selection problem in local likelihood density estimation and by the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...

2009
Elena Ezhova, Vadim Mottl, Olga Krasotkina

The problem of estimating a time-varying regression inevitably requires choosing an appropriate level of model volatility, ranging from full stationarity of the instant regression models to their complete independence from one another. In the stationary case the number of regression coefficients to be estimated equals the number of regressors, whereas the absence of any smoothness ...
