Search results for: divergence measure

Number of results: 390285

2012
Shivakumar Jolad Ahmed Roman Mahesh C. Shastry

We introduce a new divergence measure, the bounded Bhattacharyya distance (BBD), for quantifying the dissimilarity between probability distributions. BBD is based on the Bhattacharyya coefficient (fidelity), and is symmetric, positive semi-definite, and bounded. Unlike the Kullback-Leibler divergence, BBD does not require probability density functions to be absolutely continuous with respect t...
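
The snippet does not give BBD's exact functional form, but its building block, the Bhattacharyya coefficient (fidelity), is standard. A minimal sketch for finite discrete distributions, with the Hellinger distance shown as one familiar bounded, symmetric divergence derived from the same coefficient:

```python
import math

def bhattacharyya_coefficient(p, q):
    """Fidelity BC(p, q) = sum_i sqrt(p_i * q_i); equals 1 iff p == q, 0 for disjoint supports."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def hellinger_distance(p, q):
    """One bounded, symmetric divergence built on BC: H = sqrt(1 - BC), in [0, 1]."""
    return math.sqrt(max(0.0, 1.0 - bhattacharyya_coefficient(p, q)))
```

The BBD of the paper is a different function of the same coefficient; this only illustrates the shared ingredient.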

Journal: CoRR, 2014
Visar Berisha Alan Wisler Alfred O. Hero Andreas Spanias

Information divergence functions play a critical role in statistics and information theory. In this paper we show that a non-parametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between tr...
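
The paper's non-parametric estimator is not reproduced in the snippet; as background, the general f-divergence on finite distributions is D_f(P||Q) = Σ_i q_i f(p_i/q_i) for a convex f with f(1) = 0. A minimal sketch, with generator functions for two standard instances:

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_i q_i * f(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

def f_kl(t):
    """Generator f(t) = t*log(t) recovers the Kullback-Leibler divergence."""
    return t * math.log(t) if t > 0 else 0.0

def f_tv(t):
    """Generator f(t) = |t - 1| / 2 recovers the total variation distance."""
    return 0.5 * abs(t - 1.0)
```

Any divergence of this family obeys D_f(P||Q) >= 0 with equality at P = Q, which is what makes such measures usable in error bounds.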

Journal: Entropy, 2012
Javier E. Contreras-Reyes Reinaldo Boris Arellano-Valle

The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariate normal distribution, showing that this is equivalent to comparing univariate versions of these...
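
The skew-normal formulas developed in the paper are not in the snippet, but the classical closed form for two univariate normal distributions, to which the abstract says the multivariate comparison reduces, is standard, as is the Jeffreys symmetrization. A minimal sketch (parameterized by mean and variance):

```python
import math

def kl_normal(mu0, var0, mu1, var1):
    """Closed-form KL( N(mu0, var0) || N(mu1, var1) ) for univariate normals."""
    return 0.5 * (math.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def jeffreys_normal(mu0, var0, mu1, var1):
    """Jeffreys divergence: the symmetrized sum of the two directed KL divergences."""
    return kl_normal(mu0, var0, mu1, var1) + kl_normal(mu1, var1, mu0, var0)
```

Unlike KL, the Jeffreys measure is symmetric in its two arguments, which is why it is the natural choice for comparing the normal and skew-normal families.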

2015
Elizabeth Roberto

Decomposition analysis is a critical tool for understanding the social and spatial dimensions of inequality, segregation, and diversity. In this paper, I propose a new measure – the Divergence Index – to address the need for a decomposable measure of segregation. Although the Information Theory Index has been used to decompose segregation within and between communities, I argue that it measures...
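
The exact definition of the Divergence Index is not given in the snippet; one common construction for a decomposable segregation measure of this kind, shown here purely as a hypothetical illustration, is the KL divergence of a local area's group proportions from the overall proportions:

```python
import math

def local_divergence(local_props, overall_props):
    """KL-style divergence of one area's group shares from the overall shares.
    Hypothetical reading of the index; the exact definition is in the paper."""
    return sum(p * math.log(p / q)
               for p, q in zip(local_props, overall_props) if p > 0)
```

An area whose composition matches the overall composition scores zero; the more its group shares diverge from the whole, the larger the value.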

2016
Michele Battisti Gianfranco di Vaio Joseph Zeira

This paper introduces a new way to measure divergence of output across countries. It measures how closely productivity, technology, or output per worker in each country follows the global frontier. We find that during the years 1970-2008 most countries followed the global frontier only partially, so they diverged from it. We use the tools of ‘development accounting’ to measure by how much countries follo...

Journal: Tamkang Journal of Mathematics, 2021

Many fuzzy information and divergence measures exist in the literature of information theory, and inequalities play an important role in finding relations among them. Here, we introduce some new inequalities and their applications to pattern recognition. We also establish relations between well-known measures with the help of the $f$-divergence measure and Jensen's inequality.

2005
Inder Jeet Taneja

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and Jeffreys' [12] J-divergence. Sibson's [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means....
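
The three classical measures named in the abstract all have simple closed forms on finite discrete distributions. A minimal sketch (the author's arithmetic-geometric mean divergences are not reproduced here):

```python
import math

def kl_discrete(p, q):
    """KL(P||Q) over finite supports; 0*log(0) treated as 0, assumes q_i > 0 when p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys(p, q):
    """Jeffreys J-divergence: the symmetrized KL."""
    return kl_discrete(p, q) + kl_discrete(q, p)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint mixture; bounded by ln 2."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl_discrete(p, m) + 0.5 * kl_discrete(q, m)
```

The midpoint mixture makes the Jensen-Shannon divergence finite even for distributions with disjoint supports, where both KL and Jeffreys blow up.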

2008
Yu Qiao Nobuaki Minematsu

Finding measures (or features) invariant to inevitable variations caused by non-linguistic factors (transformations) is a fundamental and important problem in speech recognition. Recently, Minematsu [1, 2] proved that the Bhattacharyya distance (BD) between two distributions is invariant to invertible transforms of the feature space, and developed an invariant structural representation of speech based ...
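
The claimed invariance is easy to check numerically in a toy setting. A minimal sketch using the closed-form BD between two univariate Gaussians and an affine feature transform x -> a*x + b (a special case of the invertible transforms in the paper):

```python
import math

def bd_gauss(mu1, var1, mu2, var2):
    """Closed-form Bhattacharyya distance between two univariate Gaussians."""
    avg = 0.5 * (var1 + var2)
    return (0.125 * (mu1 - mu2) ** 2 / avg
            + 0.5 * math.log(avg / math.sqrt(var1 * var2)))

def affine(mu, var, a, b):
    """Push a Gaussian through x -> a*x + b: mean and variance transform accordingly."""
    return a * mu + b, a * a * var
```

Applying the same affine map to both distributions scales the mean gap and the variances by matching factors, so both terms of the distance, and hence BD itself, are unchanged.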

2002
John Reidar Mathiassen Amund Skavhaug Ketil Bø

We propose a texture similarity measure based on the Kullback-Leibler divergence between gamma distributions (KLGamma). We conjecture that the spatially smoothed Gabor filter magnitude responses of some classes of visually homogeneous stochastic textures are gamma distributed. Classification experiments with disjoint test and training images show that the KLGamma measure performs better than o...
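
The KL divergence between two gamma distributions has a well-known closed form; in the shape/rate parameterization it is (a1-a2)ψ(a1) - ln Γ(a1) + ln Γ(a2) + a2(ln b1 - ln b2) + a1(b2-b1)/b1. A minimal stdlib-only sketch, with the digamma function ψ approximated numerically from `math.lgamma` (adequate for illustration; the paper's fitting pipeline is not reproduced):

```python
import math

def digamma(x, h=1e-5):
    """Numerical digamma: central difference of lgamma (adequate for this sketch)."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2.0 * h)

def kl_gamma(a1, b1, a2, b2):
    """Closed-form KL( Gamma(shape a1, rate b1) || Gamma(shape a2, rate b2) )."""
    return ((a1 - a2) * digamma(a1)
            - math.lgamma(a1) + math.lgamma(a2)
            + a2 * (math.log(b1) - math.log(b2))
            + a1 * (b2 - b1) / b1)
```

Note that, like KL in general, the measure is asymmetric, so a similarity measure built on it must fix a direction or symmetrize.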
