Search results for: leibler distance

Number of results: 244184

Journal: CoRR 2018
Abraao D. C. Nascimento Alejandro C. Frery Renato J. Cintra

Images obtained from coherent illumination processes are contaminated with speckle. A prominent example of such imagery systems is polarimetric synthetic aperture radar (PolSAR). For this remote sensing tool, the speckle interference pattern appears in the form of a positive definite Hermitian matrix, which requires specialized models and makes change detection a hard task. The scaled comple...

1998
Peter Hall Brett Presnell

Contamination of a sampled distribution, for example by a heavy-tailed distribution, can degrade the performance of a statistical estimator. We suggest a general approach to alleviating this problem, using a version of the weighted bootstrap. The idea is to “tilt” away from the contaminated distribution by a given (but arbitrary) amount, in a direction that minimises a measure of the new distri...

2012
Karim T. Abou-Moustafa Frank P. Ferrie

Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central operation that appears in most of these areas is to measure the difference between two multivariate Gaussians. Unfortunately, traditional measures based on the Kullback– Leibler (KL) divergence and the Bhattacharyya distance do not satisfy all metric axioms necessary for many algorithms. In this...
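As context for the abstract above: the KL divergence between two multivariate Gaussians, which the authors contrast with proper metrics, has a well-known closed form. A minimal sketch (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL(N0 || N1) between multivariate Gaussians
    N0 = N(mu0, cov0) and N1 = N(mu1, cov1)."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    # tr(Sigma1^{-1} Sigma0) + (mu1-mu0)^T Sigma1^{-1} (mu1-mu0)
    term_trace = np.trace(cov1_inv @ cov0)
    term_quad = diff @ cov1_inv @ diff
    # log det ratio via slogdet for numerical stability
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (term_trace + term_quad - k + logdet1 - logdet0)
```

Evaluating the formula in both directions makes the abstract's point concrete: the result is generally asymmetric, so KL is not a metric.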

2011
XuanLong Nguyen

We consider Wasserstein distances for assessing the convergence of latent discrete measures, which serve as mixing distributions in hierarchical and nonparametric mixture models. We clarify the relationships between Wasserstein distances of mixing distributions and f-divergence functionals such as Hellinger and Kullback-Leibler distances on the space of mixture distributions using various iden...
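As a concrete aside on the Wasserstein distances discussed above: in one dimension, the empirical Wasserstein-1 distance between two equal-size samples reduces to the mean absolute difference of their order statistics, since the optimal transport plan matches points in sorted order. A minimal sketch (the helper name is illustrative):

```python
import numpy as np

def wasserstein1_1d(x, y):
    """Empirical Wasserstein-1 distance between two equal-size 1-D samples.

    In 1-D the optimal coupling pairs the sorted values, so W1 is just
    the mean absolute difference of the sorted samples.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    if x.shape != y.shape:
        raise ValueError("this simple sketch assumes equal-size samples")
    return float(np.mean(np.abs(x - y)))
```

Unlike KL, this distance stays finite between distributions with disjoint supports, which is one reason it is useful for comparing discrete mixing measures.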

2006
Ferdinand Österreicher

The class I_β, β ∈ (0, ∞], of f-divergences investigated in this paper is defined in terms of a class of entropies introduced by Arimoto (1971, Information and Control, 19, 181-194). It contains the squared Hellinger distance (for β = 1/2), the sum I(Q1 ‖ (Q1+Q2)/2) + I(Q2 ‖ (Q1+Q2)/2) of Kullback-Leibler divergences (for β = 1) and half of the variation distance (for β =...
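The symmetrized sum of Kullback-Leibler divergences to the midpoint distribution mentioned in the abstract above equals twice the Jensen-Shannon divergence and is bounded above by 2 ln 2. A minimal sketch for discrete distributions (function names are illustrative, not from the paper):

```python
import numpy as np

def kl(p, q):
    """Discrete KL divergence I(p || q), with the convention 0 log 0 = 0.
    Assumes q > 0 wherever p > 0 (always true for the midpoint below)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def symmetrized_kl_to_midpoint(q1, q2):
    """I(Q1 || M) + I(Q2 || M) with M = (Q1 + Q2)/2, i.e. twice the
    Jensen-Shannon divergence; it lies in [0, 2 ln 2]."""
    m = (np.asarray(q1, dtype=float) + np.asarray(q2, dtype=float)) / 2.0
    return kl(q1, m) + kl(q2, m)
```

The upper bound 2 ln 2 is attained exactly when Q1 and Q2 have disjoint supports.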

2009
Mathew D. Penrose J. E. Yukich

Nearest neighbor cells in Rd are used to define coefficients of divergence (φ-divergences) between continuous multivariate samples. For large sample sizes, such distances are shown to be asymptotically normal with a variance depending on the underlying point density. The finite-dimensional distributions of the point measures induced by the coefficients of divergence converge to those of a gener...

2017
Jiyong Park Jaehak Lee

We consider how to quantify non-Gaussianity for the correlation of a bipartite quantum state by using various measures such as relative entropy and geometric distances. We first show that an intuitive approach, i.e., subtracting the correlation of a reference Gaussian state from that of a target non-Gaussian state, fails to yield a non-negative measure with monotonicity under local Gaussian cha...

2010
Heinz H. Bauschke Xianfu Wang

In Euclidean spaces, the geometric notions of nearest-points map, farthest-points map, Chebyshev set, Klee set, and Chebyshev center are well known and well understood. Since early works going back to the 1930s, tremendous theoretical progress has been made, mostly by extending classical results from Euclidean space to Banach space settings. In all these results, the distance between points is i...

2005
Mathieu Ben Guillaume Gravier Frédéric Bimbot

In this paper, we investigate the use of a distance between Gaussian mixture models for speaker detection. The proposed distance is derived from the KL divergence and is defined as a Euclidean distance in a particular model space. This distance is simply computable directly from the model parameters thus leading to a very efficient scoring process. This new framework for scoring is compared to ...

2003
Jacob Burbea Radhakrishna Rao P. R. Krishnaiah

The paper is devoted to metrization of probability spaces through the introduction of a quadratic differential metric in the parameter space of the probability distributions. For this purpose, a d-entropy functional is defined on the probability space and its Hessian along a direction of the tangent space of the parameter space is taken as the metric. The distance between two probability distri...
