Search results for: leibler distance

Number of results: 244184

Journal: CoRR 2006
Ambedkar Dukkipati

The Kullback-Leibler relative-entropy, in cases involving distributions resulting from relative-entropy minimization, has a celebrated property reminiscent of squared Euclidean distance: it satisfies an analogue of Pythagoras' theorem. Hence this property is referred to as the Pythagoras' theorem of relative-entropy minimization, or the triangle equality, and it plays a fundamental role in geometrical ...
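As a rough illustration (not taken from the paper), assuming two discrete distributions on a common finite support, the Kullback-Leibler relative-entropy mentioned above can be computed as:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    p and q are sequences of probabilities over the same support;
    q[i] must be positive wherever p[i] is positive.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p || p) = 0, and D(p || q) > 0 whenever p != q.
p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
d = kl_divergence(p, q)
```

Note that, unlike squared Euclidean distance, the KL divergence is asymmetric in its arguments; the Pythagorean analogue discussed in the abstract holds only for projections obtained by relative-entropy minimization.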

2012
XuanLong Nguyen

We consider Wasserstein distance functionals for assessing the convergence of latent discrete measures, which serve as mixing distributions in hierarchical and nonparametric mixture models. We clarify the relationships between Wasserstein distances of mixing distributions and f-divergence functionals such as Hellinger and Kullback-Leibler distances on the space of mixture distributions using v...

2005
Uwe D. Reichel Florian Schiel

In this study, four statistical grapheme-to-phoneme (G2P) conversion methods for canonical German are compared. The G2P models differ in their use of morphological information and of phoneme-history (left-context) information. In order to evaluate our models we introduce two measures, namely mean normalized Levenshtein distance for classification accuracy and conditional relative entropy for ...
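As a sketch (not the authors' implementation), the mean normalized Levenshtein distance described above can be computed by normalizing the standard edit distance by the reference length and averaging over hypothesis/reference pairs:

```python
def levenshtein(a, b):
    """Edit distance via dynamic programming (insert/delete/substitute, cost 1 each)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def mean_normalized_levenshtein(pairs):
    """Mean of edit distance normalized by reference length, over
    (hypothesis, reference) pairs; a rough stand-in for the measure
    named in the abstract."""
    return sum(levenshtein(h, r) / max(len(r), 1) for h, r in pairs) / len(pairs)
```

For G2P evaluation, each pair would be a predicted phoneme string against the canonical transcription; identical strings contribute 0 to the mean.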

2011
XuanLong Nguyen

We consider Wasserstein distance functionals for comparing and assessing the convergence of latent discrete measures, which serve as mixing distributions in hierarchical and nonparametric mixture models. We explore the space of discrete probability measures metrized by Wasserstein distances, and clarify the relationships between Wasserstein distances of mixing distributions and f-divergenc...
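For discrete measures on the real line, the first-order Wasserstein distance reduces to the integral of the absolute difference of the two cumulative distribution functions. A minimal sketch of this special case (not the general construction studied in the paper):

```python
def wasserstein_1d(u_values, v_values, u_weights, v_weights):
    """W1 distance between two discrete measures on the real line,
    computed as the integral of |CDF_u(x) - CDF_v(x)| over x."""
    points = sorted(set(u_values) | set(v_values))

    def cdf(values, weights, x):
        return sum(w for v, w in zip(values, weights) if v <= x)

    total = 0.0
    for left, right in zip(points, points[1:]):
        # The CDF difference is constant on each interval between support points.
        diff = abs(cdf(u_values, u_weights, left) - cdf(v_values, v_weights, left))
        total += diff * (right - left)
    return total
```

For example, the distance between a unit mass at 0 and a unit mass at 1 is exactly 1, matching the cost of transporting the mass across that gap.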

2014
Cheng Wan Yiquan Wu

In this paper, we use the non-subsampled shearlet transform (NSST) and Krawtchouk Moment Invariants (KMI) to realize image retrieval based on texture and shape features. Shearlets are a recent sparse-representation tool for multidimensional functions, providing a simple and efficient mathematical framework. We decompose the images by NSST. The directional subband coefficients are modeled by Generaliz...

Journal: Computer and Information Science 2023

This study presents a measure-theoretic approach to estimate an upper bound on the total variation of the difference between the hypergeometric and binomial distributions using the Kullback-Leibler information divergence. The binomial distribution can be used to find probabilities associated with sampling experiments. But if the sample size is large relative to the population size, the experiment may not be binomial, and the binomial may not be a good choice for modeling the experiment. probab...

Journal: Journal of Statistical Theory and Practice 2021

This paper addresses the question of clustering density curves around a unit circle by approximating each such curve with a mixture of an appropriate number of von Mises distributions. This is done by first defining a distance between any two curves, either via the $$L^2$$ distance or the symmetrized Kullback–Leibler divergence. We show that both these measures yield similar results. After demonstrating by simulations that the proposed methods work succ...

Journal: SIAM Journal on Optimization 2021

Recently, a new kind of distance has been introduced for the graphs of two point-to-set operators, one of which is maximally monotone. When both operators are the subdifferential of a proper lower semicontinuous convex function, this distance specializes under modest assumptions to the classical Bregman distance. We name it the generalized Bregman distance, and we shed light on it with examples that utilize the other most natural representa...

2016
Robert Gatenby B. Roy Frieden

Enzymes are proteins that accelerate intracellular chemical reactions, often by factors of 10^5-10^12 s^-1. We propose that the structure and function of enzymes represent the thermodynamic expression of heritable information encoded in DNA, with post-translational modifications that reflect intra- and extra-cellular environmental inputs. The three-dimensional shape of the protein, determined by the genetical...

Journal: IEEE Signal Process. Lett. 2013
Vittorio Perduca Grégory Nuel

We measure the influence of individual observations on the sequence of the hidden states of the Hidden Markov Model (HMM) by means of the Kullback-Leibler distance (KLD). Namely, we consider the KLD between the conditional distribution of the hidden states’ chain given the complete sequence of observations and the conditional distribution of the hidden chain given all the observations but the o...
