Search results for: kullback

Number of results: 7189  

Journal: Journal of Machine Learning Research, 2013
Edward Challis David Barber

We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular we make the following novel contributions: sufficient conditions for which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scal...

2000
Don H. Johnson Sinan Sinanović

We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric considerations similar to those used to derive the Chernoff distance. Determining its relation to we...
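A minimal sketch of the symmetrization the abstract describes, assuming the resistor-average distance combines the two directed Kullback-Leibler distances the way parallel resistors combine, i.e. 1/R(p,q) = 1/D(p‖q) + 1/D(q‖p), which is the construction the name suggests; the exact definition is in the paper itself.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p||q) for discrete distributions, in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def resistor_average(p, q):
    """Symmetric 'parallel-resistance' combination of the two directed
    KL distances: 1/R = 1/D(p||q) + 1/D(q||p)."""
    d_pq, d_qp = kl(p, q), kl(q, p)
    return d_pq * d_qp / (d_pq + d_qp)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]
print(resistor_average(p, q))  # identical when p and q are swapped
```

Unlike the plain sum D(p‖q) + D(q‖p), this harmonic combination stays dominated by the smaller of the two directed distances.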

Journal: Entropy, 2014
Victor Bakhtin

For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines. In this setting, the role of Shannon’s entropy is played by the Kullback–Leibler divergence, and the Hausdorff dimensions are computed by means of the so-ca...

2000
Anton Arnold Peter Markowich Giuseppe Toscani Andreas Unterreiter

The classical Csiszár–Kullback inequality bounds the L1-distance of two probability densities in terms of their relative (convex) entropies. Here we generalize such inequalities to not necessarily normalized and possibly non-positive L1 functions. Also, our generalized Csiszár–Kullback inequalities are in many important cases sharper than the classical ones (in terms of the functional de...

2011
Andrey Savchenko

The problem of automatic image recognition based on the minimum information discrimination principle is formulated and solved. Color histogram comparison in the Kullback–Leibler information metric is proposed. It is combined with the method of directed enumeration of alternatives, as opposed to complete enumeration of competing hypotheses. Results of an experimental study of the Kullback-Leibler discri...
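The histogram comparison step can be sketched as follows. This is a generic KL comparison of normalized histograms, not the paper's exact pipeline; the bin counts and the epsilon smoothing for empty bins are illustrative assumptions.

```python
import numpy as np

def kl_histogram_distance(h1, h2, eps=1e-10):
    """KL information between two color histograms.
    eps guards against empty bins (a common practical choice,
    not necessarily the smoothing used in the paper)."""
    p = np.asarray(h1, float) + eps
    q = np.asarray(h2, float) + eps
    p, q = p / p.sum(), q / q.sum()  # normalize to probability distributions
    return float(np.sum(p * np.log(p / q)))

# Hypothetical 8-bin histograms of a query image and a database image
query = np.array([10, 30, 25, 15, 10, 5, 3, 2])
ref   = np.array([12, 28, 24, 16, 11, 4, 3, 2])
print(kl_histogram_distance(query, ref))  # small value -> similar content
```

In a minimum-information-discrimination recognizer, the database image minimizing this quantity would be returned as the match.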

2016
Mark Kelbert Pavel Mozgunov

The paper considers a family of probability distributions depending on a parameter. The goal is to derive the generalized versions of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix and of the Kullback inequality for the weighted Kullback distance, which are important objects themselves [9, 23, 28]. The asymptotic forms of these inequalities for a particular family ...

2009
Rudolf Kulhavý

The use of probability in system identification is shown to be equivalent to measuring Kullback-Leibler distance between the actual (empirical) and model distributions of data. When data are not known completely (being compressed, quantized, aggregated, missing etc.), the minimum distance approach can be seen as an asymptotic approximation of probabilistic inference. A class of problems is poin...

2004
Mark-Jan Nederhof Giorgio Satta

We consider the problem of computing the Kullback-Leibler distance, also called the relative entropy, between a probabilistic context-free grammar and a probabilistic finite automaton. We show that there is a closed-form (analytical) solution for one part of the Kullback-Leibler distance, viz. the cross-entropy. We discuss several applications of the result to the problem of distributional appr...

In this paper, we investigate some inferential properties of the upper record Lomax distribution. We also estimate the parameters of the Lomax distribution based on upper records using the method of moments (MME), maximum likelihood (MLE), Kullback-Leibler divergence of the survival function (DLS), and Bayesian methods. Finally, we compare these methods using Monte Carlo simulation.
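One of the listed estimators, maximum likelihood, can be sketched with SciPy's built-in Lomax distribution. This fits the ordinary (non-record) two-parameter Lomax by MLE; the record-value, DLS, and Bayesian variants studied in the paper are omitted, and the true parameter values here are illustrative.

```python
import numpy as np
from scipy.stats import lomax

# Simulate a Lomax(shape c=3, scale=2) sample, then recover the
# parameters by maximum likelihood; floc=0 fixes the location so the
# standard two-parameter Lomax is fitted.
data = lomax.rvs(c=3.0, scale=2.0, size=5000, random_state=0)
c_hat, loc_hat, scale_hat = lomax.fit(data, floc=0)
print(c_hat, scale_hat)  # estimates should be near 3 and 2
```

A Monte Carlo comparison like the paper's would repeat this fit over many simulated samples and tabulate bias and mean squared error per estimator.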
