Search results for: leibler distance

Number of results: 244184

Journal: Journal of Approximation Theory 2004
Dietrich Braess, Tomas Sauer

When learning processes depend on samples but not on the order of the information in the sample, the Bernoulli distribution is relevant and Bernstein polynomials enter into the analysis. We derive estimates of the approximation of the entropy function x log x that are sharper than the bounds from Voronovskaja’s theorem. In this way we obtain the correct asymptotics for the Kullback-Leibler di...
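
As a rough numerical companion to this result (a sketch only, not the paper's sharpened estimates), the following evaluates the Bernstein polynomial B_n(f; x) of f(x) = x log x and compares the approximation error with the leading Voronovskaja term x(1-x)f''(x)/(2n); the values of n and x are arbitrary.

```python
import numpy as np
from scipy.stats import binom

def entropy_fn(x):
    # f(x) = x * log(x), with the usual convention f(0) = 0
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x * np.log(np.maximum(x, 1e-300)), 0.0)

def bernstein(f, n, x):
    # Bernstein polynomial B_n(f; x) = sum_k f(k/n) * C(n,k) x^k (1-x)^(n-k)
    k = np.arange(n + 1)
    return float(binom.pmf(k, n, x) @ f(k / n))

n, x = 200, 0.3
err = bernstein(entropy_fn, n, x) - entropy_fn(x)
# Voronovskaja's leading term: x*(1-x)*f''(x) / (2n), with f''(x) = 1/x
leading = x * (1 - x) * (1.0 / x) / (2 * n)
print(err, leading)   # the error approaches the leading term as n grows
```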

2003
Gjenna Stippel, James Ellsmere, Simon K. Warfield, William M. Wells, Wilfried Philips

In this paper we address the problem of multi-modal coregistration of 3D medical images. Several techniques for the rigid registration of multi-modal images have been developed; in one of these, the Kullback-Leibler distance is used to align 2D-3D angiographic images [1]. Here we investigate the performance of this technique on the registration of pairs of 3D CT/US images. We study the ...
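
The truncated abstract does not show the metric's exact form, so the following is only a hypothetical sketch of the general idea behind KL-based registration: compare the joint intensity histogram observed under a candidate alignment against an expected joint distribution, preferring the alignment with the smaller divergence. The random arrays merely stand in for image volumes.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # D(p || q) = sum p log(p/q), with a small eps guard against log(0)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def joint_histogram(a, b, bins=32):
    # normalized 2-D histogram of co-occurring intensities in two volumes
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    return h / h.sum()

# Hypothetical stand-ins: "expected" learned from pre-registered training
# pairs, "observed" computed under a candidate pose of the moving image.
rng = np.random.default_rng(0)
expected = joint_histogram(rng.random((64, 64)), rng.random((64, 64)))
observed = joint_histogram(rng.random((64, 64)), rng.random((64, 64)))
print(kl_divergence(observed, expected))   # smaller = better alignment
```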

2013
Daniel Hsu

The mixing distribution is also sometimes called the mixing weights. There are a number of statistical estimation / unsupervised learning tasks associated with mixture models. Three such tasks are as follows (assuming an iid sample from some m⋆ ∈ M is given): 1. Density estimation: Here we assume that each distribution in M has a density. The task is to pick m̂ ∈ M such that m̂ is close to m⋆ unde...
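
As a concrete reading of the density-estimation task, this toy sketch Monte Carlo estimates KL(m⋆ || m̂) between a "true" two-component 1-D Gaussian mixture m⋆ and a hypothetical estimate m̂; all parameter values are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def mixture_logpdf(x, weights, means, sds):
    # log density of a 1-D Gaussian mixture
    comp = np.stack([w * norm.pdf(x, m, s)
                     for w, m, s in zip(weights, means, sds)])
    return np.log(comp.sum(axis=0))

# Monte Carlo estimate of KL(m* || m-hat): sample from the true mixture
# and average the log-density ratio.
rng = np.random.default_rng(1)
w_true, mu_true, sd_true = [0.4, 0.6], [-1.0, 2.0], [1.0, 0.5]
w_hat, mu_hat, sd_hat = [0.5, 0.5], [-0.8, 1.9], [1.1, 0.6]

comp = rng.choice(2, size=100_000, p=w_true)
x = rng.normal(np.take(mu_true, comp), np.take(sd_true, comp))
kl = np.mean(mixture_logpdf(x, w_true, mu_true, sd_true)
             - mixture_logpdf(x, w_hat, mu_hat, sd_hat))
print(kl)   # close to 0 when m-hat is close to m* in KL
```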

Journal: JAMDS 2008
Amar Rebbouh

This paper seeks to develop an allocation of 0/1 data matrices to physical systems based on a Kullback-Leibler distance between probability distributions. The distributions are estimated from the contents of the data matrices. We discuss an ascending hierarchical classification method and a numerical example, and mention an application to survey data concerning the level of development of the departm...
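
A minimal sketch of the kind of dissimilarity this suggests, under the assumption (hypothetical here) that each 0/1 matrix is summarized by a product-Bernoulli distribution estimated column-wise, with systems compared by a symmetrized Kullback-Leibler distance:

```python
import numpy as np

def bernoulli_kl(p, q, eps=1e-9):
    # KL distance between product-Bernoulli distributions with
    # parameter vectors p and q (one parameter per column)
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return float(np.sum(p * np.log(p / q)
                        + (1 - p) * np.log((1 - p) / (1 - q))))

# Each 0/1 matrix stands for one physical system; estimate the column
# probabilities from its contents and compare systems pairwise.
rng = np.random.default_rng(2)
A = rng.random((50, 8)) < 0.3   # hypothetical system A
B = rng.random((50, 8)) < 0.6   # hypothetical system B
p_hat, q_hat = A.mean(axis=0), B.mean(axis=0)
d = bernoulli_kl(p_hat, q_hat) + bernoulli_kl(q_hat, p_hat)
print(d)   # symmetrized KL, usable in hierarchical clustering
```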

Journal: Computational Linguistics 2005
Mark-Jan Nederhof

We show that under certain conditions, a language model can be trained on the basis of a second language model. The main instance of the technique trains a finite automaton on the basis of a probabilistic context-free grammar, such that the Kullback-Leibler distance between grammar and trained automaton is provably minimal. This is a substantial generalization of an existing algorithm to train ...
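
Nederhof's construction is not reproduced here; the toy sketch below only illustrates the underlying principle on a finite case: fitting a bigram (finite-state) model to a fixed distribution over strings by expected relative frequencies minimizes the Kullback-Leibler distance from that distribution within the bigram family.

```python
import numpy as np
from collections import defaultdict

# Toy distribution over strings, standing in for a probabilistic grammar.
grammar = {"ab": 0.5, "aab": 0.3, "aaab": 0.2}

# Accumulate expected bigram counts ('^' = start marker, '$' = end marker)
# and set transition probabilities to relative frequencies.
counts = defaultdict(float)
for s, p in grammar.items():
    for x, y in zip("^" + s, s + "$"):
        counts[(x, y)] += p
totals = defaultdict(float)
for (x, y), c in counts.items():
    totals[x] += c
trans = {(x, y): c / totals[x] for (x, y), c in counts.items()}

def automaton_prob(s):
    # probability the bigram automaton assigns to string s
    p = 1.0
    for x, y in zip("^" + s, s + "$"):
        p *= trans.get((x, y), 0.0)
    return p

kl = sum(p * np.log(p / automaton_prob(s)) for s, p in grammar.items())
print(kl)   # >= 0; minimal over all bigram models for this grammar
```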

2006
J. Hamkins, M. Klimesh, B. Moision

We show that the capacity of a generalized pulse-position modulation (PPM) channel, where the input vectors may be any set that admits a transitive group of coordinate permutations, is achieved by a uniform input distribution. We derive a simple expression in terms of the Kullback–Leibler distance for the binary case, and find the asymptote in the PPM order. We prove a sub-additivity result for the ...
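
The paper's expression is not visible in the truncated abstract; as a sketch of the ingredients (not necessarily the paper's formula), the following checks on a hypothetical binary symmetric PPM-like channel that the uniform-input mutual information equals a binary Kullback-Leibler distance, D(Bernoulli(1 - eps) || Bernoulli(1/2)).

```python
import numpy as np

def binary_kl(p, q):
    # D(Bernoulli(p) || Bernoulli(q)) in bits
    return p * np.log2(p / q) + (1 - p) * np.log2((1 - p) / (1 - q))

def mutual_information(P):
    # I(X;Y) for channel matrix P (rows = inputs) under the uniform
    # input distribution, which is capacity-achieving for channels
    # symmetric under coordinate permutations
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    py = pi @ P
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = P * np.log2(P / py)
    return float(pi @ np.nansum(terms, axis=1))

# Hypothetical binary channel whose two inputs are coordinate
# permutations of each other (a binary symmetric channel).
eps = 0.1
P = np.array([[1 - eps, eps],
              [eps, 1 - eps]])
print(mutual_information(P), binary_kl(1 - eps, 0.5))  # both ~0.531 bits
```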

2003
Ilan N. Goodman, Don H. Johnson

We develop two new multivariate statistical dependence measures. The first, based on the Kullback-Leibler distance, results in a single value that indicates the general level of dependence among the random variables. The second, based on an orthonormal series expansion of joint probability density functions, provides more detail about the nature of the dependence. We apply these dependence meas...
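
One natural form for the first measure (an assumption; the abstract does not spell it out) is the Kullback-Leibler distance between the joint distribution and the product of its marginals, i.e. the mutual information. A discrete sketch:

```python
import numpy as np

def kl_dependence(joint, eps=1e-12):
    # D(p_XY || p_X p_Y): the KL distance between the joint distribution
    # and the product of its marginals; zero iff X and Y are independent
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (column)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (row)
    ratio = (joint + eps) / (px @ py + eps)
    return float(np.sum(joint * np.log(ratio)))

# Toy 2x2 joint distribution with positive dependence
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(kl_dependence(joint))   # > 0; an independent joint would give 0
```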

2004
Yasumasa Matsuda

Graphical models for multivariate time series are a concept extended by Dahlhaus (2000) from random vectors to time series. We propose a test statistic for identifying a graphical model for multivariate time series, based on the Kullback-Leibler distance between two spectral density matrices characterized by graphical models. The asymptotic null distribution is derived to be normal with the mean and varia...
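
The test statistic itself is not shown in the truncated abstract; as a sketch of the central quantity, the following computes a standard Gaussian-process Kullback-Leibler-type distance between two d x d spectral density matrices on a frequency grid, D(f, g) = (1/(4*pi)) * integral over [-pi, pi) of [tr(f g^-1) - log det(f g^-1) - d], with hypothetical 2 x 2 matrices.

```python
import numpy as np

def spectral_kl(f, g):
    # KL-type distance between stacks of d x d spectral density matrices
    # sampled on a uniform frequency grid over [-pi, pi)
    d = f.shape[-1]
    ratio = f @ np.linalg.inv(g)                 # f(lambda) g(lambda)^-1
    tr = np.trace(ratio, axis1=-2, axis2=-1).real
    logdet = np.log(np.linalg.det(ratio)).real
    # grid mean approximates (1/(2*pi)) * integral; halve for 1/(4*pi)
    return float((tr - logdet - d).mean() / 2.0)

# Hypothetical positive-definite 2 x 2 spectral matrices on a grid
lam = np.linspace(-np.pi, np.pi, 256, endpoint=False)
f = np.stack([np.array([[2 + np.cos(l), 0.5],
                        [0.5, 1.5]]) for l in lam])
g = np.stack([np.eye(2) * (1 + 0.2 * np.sin(l) ** 2) for l in lam])
print(spectral_kl(f, g))   # 0 iff f = g almost everywhere
```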

2003
Soosan Beheshti, Munther A. Dahleh

We introduce a new method of model order selection: minimum description complexity (MDC). The approach is motivated by the Kullback-Leibler information distance. The method suggests choosing the model set for which the "model set relative entropy" is minimum. The proposed method is comparable with existing order estimation methods such as AIC and MDL. We elaborate on the advantages of MDC ...
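
MDC itself involves a "model set relative entropy" not reproduced here; for context on the baselines named in the abstract, this sketch computes the classical AIC and MDL scores for autoregressive model-order selection on simulated data.

```python
import numpy as np

def order_selection_scores(x, max_order):
    # Classical scores for a zero-mean AR(k) model fit by least squares:
    #   AIC = n*log(sigma2) + 2k,   MDL = n*log(sigma2) + k*log(n)
    n = len(x)
    scores = {}
    for k in range(1, max_order + 1):
        # regress x[t] on its k past values
        X = np.column_stack([x[k - j - 1:n - j - 1] for j in range(k)])
        y = x[k:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ coef) ** 2)
        m = len(y)
        scores[k] = (m * np.log(sigma2) + 2 * k,
                     m * np.log(sigma2) + k * np.log(m))
    return scores  # {order: (AIC, MDL)}

# Simulate an AR(2) process; both criteria should favor order 2
rng = np.random.default_rng(3)
e = rng.normal(size=500)
x = np.copy(e)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
for k, (aic, mdl) in order_selection_scores(x, 5).items():
    print(k, round(aic, 1), round(mdl, 1))
```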

2004
A. R. de Leon, K. C. Carrière

A distance for mixed nominal, ordinal and continuous data is developed by applying the Kullback–Leibler divergence to the general mixed-data model, an extension of the general location model that allows ordinal variables to be incorporated in the model. The distance obtained can be considered a generalization of the Mahalanobis distance to data with a mixture of nominal, ordinal and cont...
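
In the all-continuous special case with a common covariance matrix, the symmetrized Kullback-Leibler divergence between two Gaussians reduces exactly to the squared Mahalanobis distance between their means, which is the sense in which such a distance generalizes Mahalanobis; a quick numerical check:

```python
import numpy as np

def gaussian_kl(mu1, mu2, cov):
    # KL(N(mu1, cov) || N(mu2, cov)) for a shared covariance matrix:
    # half the squared Mahalanobis distance between the means
    diff = mu1 - mu2
    return 0.5 * diff @ np.linalg.solve(cov, diff)

mu1 = np.array([0.0, 1.0])
mu2 = np.array([1.5, -0.5])
cov = np.array([[1.0, 0.3],
                [0.3, 2.0]])

j_div = gaussian_kl(mu1, mu2, cov) + gaussian_kl(mu2, mu1, cov)
mahalanobis_sq = (mu1 - mu2) @ np.linalg.solve(cov, mu1 - mu2)
print(j_div, mahalanobis_sq)   # equal: symmetrized KL = Mahalanobis^2
```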
