Search results for: kullback leibler distance
Number of results: 244274. Filter results by year:
The cross-entropy and minimum cross-entropy methods are well-known Monte Carlo simulation techniques for rare-event probability estimation and optimization. In this paper, we investigate how these methods can be extended to provide a general non-parametric cross-entropy framework based on φ-divergence distance measures. We show how the χ² distance, in particular, yields a viable alternative to...
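The multilevel cross-entropy idea can be illustrated on a toy rare-event problem. The sketch below is my own minimal example, not the paper's algorithm: estimating P(X ≥ γ) for an exponential random variable by iteratively tilting the sampling mean toward the rare event (the closed-form CE update for the exponential family is the likelihood-ratio-weighted mean of the elite samples); the problem setup and all parameter choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy rare-event problem (illustrative): P(X >= gamma) for X ~ Exp(mean 1).
# The true value is exp(-gamma), tiny when gamma is large.
gamma, n, rho = 20.0, 10_000, 0.1
v = 1.0  # current mean of the exponential importance-sampling density

for _ in range(20):
    x = rng.exponential(v, size=n)
    # elite level: the (1 - rho)-quantile of the samples, capped at gamma
    level = min(gamma, np.quantile(x, 1 - rho))
    elite = x >= level
    # likelihood ratio w(x) = f(x; 1) / f(x; v) for exponential densities
    w = v * np.exp(-x * (1.0 - 1.0 / v))
    # closed-form CE update: likelihood-ratio-weighted mean of the elites
    v = np.sum(w[elite] * x[elite]) / np.sum(w[elite])
    if level >= gamma:
        break

# final importance-sampling estimate of the rare-event probability
x = rng.exponential(v, size=n)
w = v * np.exp(-x * (1.0 - 1.0 / v))
estimate = np.mean(w * (x >= gamma))
```

After a few iterations the tilted mean v sits near γ, so a substantial fraction of samples hits the event and the estimate has low relative error.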
Bregman divergences are generalizations of the well-known Kullback–Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of “squared root metrics” based on Bregman divergences. They can be regarded as a natural generalization of the Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
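The relationship stated above can be checked numerically: the Bregman divergence generated by the negative-entropy function reduces to the Kullback–Leibler divergence on probability vectors. A minimal sketch (the generator and test vectors are my own choices):

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad_phi(q), p - q>."""
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Negative entropy as the convex generator; its Bregman divergence on
# probability vectors is exactly the Kullback-Leibler divergence.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

kl_via_bregman = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
kl_direct = np.sum(p * np.log(p / q))
```

With phi(x) = ½‖x‖² the same formula instead yields the squared Euclidean distance, which is the sense in which Bregman divergences generalize it.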
In this paper we propose a Bayesian, information theoretic approach to dimensionality reduction. The approach is formulated as a variational principle on mutual information, and seamlessly addresses the notions of sufficiency, relevance, and representation. Maximally informative statistics are shown to minimize a Kullback-Leibler distance between posterior distributions. Illustrating the approa...
When learning processes depend on samples but not on the order of the information in the sample, then the Bernoulli distribution is relevant and Bernstein polynomials enter into the analysis. We derive estimates of the approximation of the entropy function x log x that are sharper than the bounds from Voronovskaja’s theorem. In this way we get the correct asymptotics for the Kullback-Leibler di...
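The objects in this abstract are easy to compute directly: the Bernstein polynomial B_n(f, x) = Σ_k f(k/n) C(n, k) x^k (1−x)^(n−k) applied to the entropy function f(x) = x log x, compared against the Voronovskaja first-order error term x(1−x) f″(x)/(2n). This is an illustrative sketch only (the point x and degree n are my own choices), not the paper's sharper bounds:

```python
import math

def bernstein(f, n, x):
    """Bernstein polynomial B_n(f, x) = sum_k f(k/n) C(n, k) x^k (1-x)^(n-k)."""
    return sum(f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# entropy-type function x log x, extended by continuity with f(0) = 0
f = lambda t: t * math.log(t) if t > 0 else 0.0

x, n = 0.3, 200
approx, exact = bernstein(f, n, x), f(x)

# Voronovskaja's theorem predicts B_n(f, x) - f(x) ~ x(1-x) f''(x) / (2n);
# since f''(x) = 1/x here, the predicted gap is (1 - x) / (2n).
predicted_gap = (1 - x) / (2 * n)
```

Since x log x is convex, B_n lies above it, so the gap is positive and shrinks like 1/n.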
In this paper we address the problem of multi-modal coregistration of medical 3D images. Several techniques for the rigid registration of multi-modal images have been developed; in one of them, the Kullback-Leibler distance is used to align 2D-3D angiographic images [1]. In this paper we investigate the performance of this technique on the registration of pairs of 3D CT/US images. We study the ...
The mixing distribution is also sometimes called the mixing weights. There are a number of statistical estimation / unsupervised learning tasks associated with mixture models. Three such tasks are as follows (assuming an iid sample from some m⋆ ∈ M is given): 1. Density estimation: Here we assume that each distribution in M has a density. The task is to pick m̂ ∈ M such that m̂ is close to m⋆ unde...
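When closeness is measured by the Kullback–Leibler distance, density estimation has a convenient form: KL(m⋆ ‖ m̂) = E[log m⋆(X)] − E[log m̂(X)], and since the first term does not depend on m̂, minimizing KL over candidates is the same as maximizing the average log-likelihood on a sample from m⋆. A small sketch of this equivalence with Gaussian mixtures (the true mixture and both candidate models are hypothetical choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def mixture_pdf(x, weights, mus, sds):
    return sum(w * normal_pdf(x, m, s) for w, m, s in zip(weights, mus, sds))

# Hypothetical true mixture m* (bimodal) and two candidate models
true_w, true_mu, true_sd = [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0]
cand_a = ([0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])  # matches m*
cand_b = ([1.0], [0.0], [2.0])                  # single broad Gaussian

# iid sample from m*: pick a component, then draw from it
comp = rng.choice(2, size=5000, p=true_w)
x = rng.normal(np.array(true_mu)[comp], np.array(true_sd)[comp])

# minimizing KL(m* || m̂) over candidates = maximizing average log-likelihood
ll_a = np.mean(np.log(mixture_pdf(x, *cand_a)))
ll_b = np.mean(np.log(mixture_pdf(x, *cand_b)))
```

The candidate matching m⋆ attains the higher average log-likelihood, i.e. the smaller KL distance.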
This paper seeks to develop an allocation of 0/1 data matrices to physical systems based on a Kullback-Leibler distance between probability distributions. The distributions are estimated from the contents of the data matrices. We discuss an ascending hierarchical classification method, present a numerical example, and mention an application to survey data concerning the level of development of the departm...
We show that under certain conditions, a language model can be trained on the basis of a second language model. The main instance of the technique trains a finite automaton on the basis of a probabilistic context-free grammar, such that the Kullback-Leibler distance between grammar and trained automaton is provably minimal. This is a substantial generalization of an existing algorithm to train ...
We show that the capacity of a generalized pulse-position modulation (PPM) channel, where the input vectors may be any set that admits a transitive group of coordinate permutations, is achieved by a uniform input distribution. We derive a simple expression in terms of the Kullback–Leibler distance for the binary case, and find the asymptote in the PPM order. We prove a sub-additivity result for the ...
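For the binary case, the Kullback–Leibler distance between two Bernoulli distributions has a simple closed form. The snippet below is only a generic illustration of that quantity, not the paper's capacity expression:

```python
import math

def binary_kl(p, q):
    """Kullback-Leibler distance between Bernoulli(p) and Bernoulli(q), in nats."""
    def term(a, b):
        # a * log(a/b), with the usual convention 0 * log 0 = 0
        return 0.0 if a == 0.0 else a * math.log(a / b)
    return term(p, q) + term(1 - p, 1 - q)

# d(p || q) vanishes iff p == q and grows as the two distributions separate
```

Note the asymmetry: binary_kl(p, q) and binary_kl(q, p) generally differ, which is why "distance" is used loosely for this divergence.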