Search results for: divergence measures

Number of results: 399991

2016
Kathryn R. Ritz, Mohamed A. F. Noor

Measures of genetic divergence have long been used to identify evolutionary processes operating within and between species. However, recent reviews have described a bias in the use of relative divergence measures towards incorrectly identifying genomic regions that are seemingly immune to introgression. Here, we present a novel and opposite bias of relative divergence measures: misidentifying r...
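
A hedged aside (my toy example, not the paper's analysis): the usual contrast in this literature is between a relative divergence measure such as Fst, which is scaled by within-population diversity, and an absolute one such as dxy. A minimal Python sketch at a single biallelic locus, assuming only the two population allele frequencies:

def fst(p1, p2):
    """Relative divergence: Fst = (Ht - Hs) / Ht at one biallelic locus."""
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2  # mean within-population heterozygosity
    pbar = (p1 + p2) / 2
    ht = 2 * pbar * (1 - pbar)                        # total heterozygosity
    return (ht - hs) / ht if ht > 0 else 0.0

def dxy(p1, p2):
    """Absolute divergence: expected pairwise differences between populations."""
    return p1 * (1 - p2) + p2 * (1 - p1)

# Identical dxy, very different Fst: relative measures respond to
# within-population diversity even when absolute divergence is unchanged.
print(fst(0.5, 0.5), dxy(0.5, 0.5))   # Fst = 0.0,  dxy = 0.5
print(fst(0.0, 0.5), dxy(0.0, 0.5))   # Fst ≈ 0.33, dxy = 0.5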

Journal: J. Economic Theory, 2011
Brice Magdalou, Richard Nock

Inequality indices (i) evaluate the divergence between the income distribution and the hypothetical situation where all individuals have the mean income and (ii) are unambiguously reduced by a Pigou-Dalton progressive transfer. This paper proposes a new approach to evaluate the divergence between any two income distributions, where the second one can be a reference distribution for the first. I...
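
A hedged illustration of point (i), not the paper's formalism: the classical Theil index can be read as exactly such a divergence, between the observed incomes and the hypothetical situation where everyone holds the mean income.

import math

def theil(incomes):
    """Theil T index: mean of (x/mu) * ln(x/mu); zero iff all incomes are equal."""
    mu = sum(incomes) / len(incomes)
    return sum((x / mu) * math.log(x / mu) for x in incomes) / len(incomes)

print(theil([30, 30, 30, 30]))   # 0.0: no divergence from the equal-mean benchmark
print(theil([5, 10, 25, 80]))    # > 0; a Pigou-Dalton progressive transfer lowers it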

2005
Inder Jeet Taneja

The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as: Hellinger's discr...
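
For orientation, a minimal Python sketch of the three base divergences the abstract names, in their standard discrete forms (the paper's parametric generalizations are not reproduced here):

import math

def kl(p, q):
    """Kullback-Leibler divergence between discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def j_divergence(p, q):
    """Jeffreys' J-divergence: the symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

def js_divergence(p, q):
    """Jensen-Shannon divergence via the midpoint distribution."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def ag_divergence(p, q):
    """Arithmetic-geometric mean divergence: compares the arithmetic and
    geometric means of the two distributions pointwise."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q) if pi > 0 and qi > 0)

p, q = [0.5, 0.3, 0.2], [0.2, 0.3, 0.5]
print(j_divergence(p, q), js_divergence(p, q), ag_divergence(p, q))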

2013
Elisabeth M. Werner, Deping Ye

In this paper, we extend the concept of classical f-divergence (for a pair of measures) to mixed f-divergence (for multiple pairs of measures). Mixed f-divergence provides a way to measure the difference between multiple pairs of probability distributions. Properties for mixed f-divergence are established, such as permutation invariance and symmetry in distributions. We also provide an Alex...
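
A hedged discrete sketch of the construction as I read the abstract (the exact definition and the conditions on f are in the paper; the function below is my assumption): the mixed f-divergence aggregates the per-pair integrands f(p_i/q_i) * q_i through a geometric mean across the n pairs.

import math

def mixed_f_divergence(pairs, f):
    """pairs: list of (p, q) discrete distributions over a common support."""
    n = len(pairs)
    support = range(len(pairs[0][0]))
    total = 0.0
    for k in support:
        prod = 1.0
        for p, q in pairs:
            prod *= f(p[k] / q[k]) * q[k]     # per-pair f-divergence integrand
        total += prod ** (1.0 / n)            # geometric mean across pairs
    return total

f = lambda t: (math.sqrt(t) - 1.0) ** 2       # a nonnegative convex f (Hellinger type)
p, q = [0.5, 0.3, 0.2], [0.2, 0.3, 0.5]
r, s = [0.4, 0.4, 0.2], [0.25, 0.25, 0.5]
print(mixed_f_divergence([(p, q)], f))           # one pair: ordinary f-divergence
print(mixed_f_divergence([(p, q), (r, s)], f))   # two pairs: mixed version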

1999
Jorge Jiménez, Susana Montes, Carlo Bertoluzza

Journal: IEEE Trans. Information Theory, 1991
Jianhua Lin

A new class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error are ...
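
A minimal sketch of the property highlighted here: when absolute continuity fails, the Kullback divergence is undefined, while the Jensen-Shannon value stays finite and bounded by ln 2 (the distributions below are made up):

import math

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
# kl(p, q) would divide by zero here: q vanishes where p does not.
print(jsd(p, q))   # ≈ 0.347, finite and below ln(2) ≈ 0.693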

2000
Jan Beirlant, Luc Devroye

We discuss Chernoff-type large deviation results for the total variation, the I-divergence errors, and the χ²-divergence errors on partitions. In contrast to the total variation and the I-divergence, the χ²-divergence has an unconventional large deviation rate. Applications to Bahadur efficiencies of goodness-of-fit tests based on these divergence measures for multivariate observations are given.
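
For reference, the three statistics on a finite partition in their standard forms, as used in partition-based goodness-of-fit testing (the cell frequencies below are made up):

import math

def total_variation(p, q):
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def i_divergence(p, q):
    """I-divergence (Kullback-Leibler) between cell frequencies."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2_divergence(p, q):
    """Pearson chi-square divergence."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

p = [0.3, 0.5, 0.2]    # empirical cell frequencies
q = [0.25, 0.5, 0.25]  # hypothesized cell probabilities
print(total_variation(p, q), i_divergence(p, q), chi2_divergence(p, q))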

Journal: CoRR, 2018
Morteza Noshad, Alfred O. Hero

We propose a scalable divergence estimation method based on hashing. Consider two continuous random variables X and Y whose densities have bounded support. We consider a particular locality sensitive random hashing, and consider the ratio of samples in each hash bin having non-zero numbers of Y samples. We prove that the weighted average of these ratios over all of the hash bins converges to fd...
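
A heavily simplified one-dimensional sketch of the idea, not the paper's estimator (the bin width, the hash, and the test densities are my choices): hash both samples into the same randomized grid, then take a count-weighted average of f applied to the per-bin density ratios.

import math, random

def hash_divergence(x, y, f, eps=0.1):
    offset = random.uniform(0.0, eps)               # randomized quantizer offset
    h = lambda v: math.floor((v + offset) / eps)    # 1-D locality-sensitive hash
    nx, ny = {}, {}
    for v in x: nx[h(v)] = nx.get(h(v), 0) + 1
    for v in y: ny[h(v)] = ny.get(h(v), 0) + 1
    n, m = len(x), len(y)
    est = 0.0
    for b, m_b in ny.items():                       # bins with Y samples only
        ratio = (nx.get(b, 0) / n) / (m_b / m)      # plug-in density-ratio estimate
        est += (m_b / m) * f(ratio)
    return est

# Estimate KL(X || Y) for uniform samples on nested intervals.
f_kl = lambda t: t * math.log(t) if t > 0 else 0.0
x = [random.uniform(0.0, 1.0) for _ in range(20000)]
y = [random.uniform(0.0, 1.5) for _ in range(20000)]
print(hash_divergence(x, y, f_kl))                  # true value: ln(1.5) ≈ 0.405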

2001
Ta-Hsin Li

New divergence measures are introduced for change detection and discrimination of stochastic signals (time series) on the basis of parametric filtering, a technique that combines parametric linear filtering with correlation characterization. The sensitivity of these divergence measures is investigated using local curvatures under additive and multiplicative spectral departure models. It is fou...
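
A loose sketch of how such a scheme could look; the abstract is truncated and every detail below (the AR(1) filter bank, the lag-1 statistic, the squared-distance comparison) is my assumption rather than the paper's construction.

import random

def ar1_filter(x, a):
    """First-order recursive filter: y[t] = x[t] + a * y[t-1]."""
    y, prev = [], 0.0
    for v in x:
        prev = v + a * prev
        y.append(prev)
    return y

def lag1_autocorr(y):
    mu = sum(y) / len(y)
    var = sum((v - mu) ** 2 for v in y)
    cov = sum((y[t] - mu) * (y[t + 1] - mu) for t in range(len(y) - 1))
    return cov / var

def signature(x, params=(-0.6, -0.3, 0.0, 0.3, 0.6)):
    """Correlation characterization across a bank of parametric filters."""
    return [lag1_autocorr(ar1_filter(x, a)) for a in params]

def signature_divergence(x1, x2):
    return sum((u - v) ** 2 for u, v in zip(signature(x1), signature(x2)))

white = [random.gauss(0, 1) for _ in range(2000)]
colored = ar1_filter([random.gauss(0, 1) for _ in range(2000)], 0.8)
print(signature_divergence(white, colored))   # grows as the spectra diverge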

[Chart: number of search results per year]