Search results for: kullback information

Number of results: 1158173

2007
M. Tumminello, F. Lillo, R. N. Mantegna

The problem of filtering information from large correlation matrices is of great importance in many applications. We have recently proposed the use of the Kullback–Leibler distance to measure the performance of filtering algorithms in recovering the underlying correlation matrix when the variables are described by a multivariate Gaussian distribution. Here we use the Kullback–Leibler distance t...
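
The Gaussian assumption makes this performance measure explicit: for zero-mean multivariate Gaussians the Kullback-Leibler distance has a closed form in the two covariance (or correlation) matrices. A minimal sketch of that computation, assuming a toy setup (the "filter" shown is a deliberately crude stand-in, not the authors' algorithm):

```python
import numpy as np

def kl_gaussian(sigma_true, sigma_model):
    """KL divergence D(N(0, sigma_true) || N(0, sigma_model)) in nats.

    Closed form for zero-mean multivariate Gaussians:
    0.5 * (tr(S2^-1 S1) - d + ln(det S2 / det S1)).
    """
    d = sigma_true.shape[0]
    inv_model = np.linalg.inv(sigma_model)
    # slogdet is numerically safer than log(det(...)) for large matrices
    _, logdet_true = np.linalg.slogdet(sigma_true)
    _, logdet_model = np.linalg.slogdet(sigma_model)
    return 0.5 * (np.trace(inv_model @ sigma_true) - d
                  + logdet_model - logdet_true)

# Example: score a "filtered" matrix against the true one (hypothetical data).
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
sigma = A @ A.T + 5 * np.eye(5)        # true covariance (positive definite)
filtered = np.diag(np.diag(sigma))     # crude filter: keep only variances
print(kl_gaussian(sigma, filtered))    # lower = better recovery
```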

Journal: IEEE Trans. Information Theory 2014
Tim van Erven, Peter Harremoës

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
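
For reference, the standard definition the abstract alludes to, written here for the discrete case (the paper treats general measures):

```latex
% Rényi divergence of order \alpha, for \alpha > 0, \alpha \neq 1:
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_i p_i^\alpha \, q_i^{1-\alpha}
% In the limit \alpha \to 1 it recovers the Kullback-Leibler divergence:
\lim_{\alpha \to 1} D_\alpha(P \,\|\, Q)
  = \sum_i p_i \log \frac{p_i}{q_i}
  = D_{\mathrm{KL}}(P \,\|\, Q)
```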

Journal: Kybernetika 2006
Nihat Ay, Andreas Knauf

Stochastic interdependence of a probability distribution on a product space is measured by its Kullback-Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probabili...
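
In symbols, the multi-information of the abstract is the following standard quantity (the KL minimizer over product distributions is the product of marginals, which gives the entropy form on the right):

```latex
I(X_1, \dots, X_n)
  = D_{\mathrm{KL}}\!\left( p(x_1, \dots, x_n) \,\Big\|\, \prod_{i=1}^{n} p_i(x_i) \right)
  = \sum_{i=1}^{n} H(X_i) \;-\; H(X_1, \dots, X_n)
```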

2002
R. Quian Quiroga, J. Arnhold, K. Lehnertz, P. Grassberger

Reply to "Comments on Kullback-Leibler and renormalized entropies: Applications to electroencephalograms of epilepsy patients". Abstract: Kopitzki et al (preceding comment) claim that the relationship between renormalized and Kullback-Leibler entropies has already been given in their previous papers. Moreover, they argue that the first can give more useful information for e.g. localizing the...

2003
Xueli Xu

This paper demonstrates the performance of two possible CAT selection strategies for cognitive diagnosis. One is based on Shannon entropy and the other is based on Kullback-Leibler information. The performances of these two test construction methods are compared with random item selection. The cognitive diagnosis model used in this study is a simplified version of the Fusion model. Item banks a...
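
As a concrete illustration of the entropy-based criterion, here is a toy version of Shannon-entropy item selection over a posterior on latent attribute profiles. The item bank and profile space below are hypothetical, and the real Fusion-model item response functions are much richer than a single success probability per profile:

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability vector, in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_posterior_entropy(prior, p_correct):
    """Expected entropy of the profile posterior after administering an
    item whose success probability under profile k is p_correct[k]."""
    out = 0.0
    for y in (0, 1):
        lik = p_correct if y == 1 else 1.0 - p_correct
        joint = prior * lik
        p_y = joint.sum()
        if p_y > 0:
            out += p_y * entropy(joint / p_y)
    return out

def select_item(prior, bank, administered):
    """Pick the unused item minimizing expected posterior entropy."""
    scores = {j: expected_posterior_entropy(prior, bank[j])
              for j in range(len(bank)) if j not in administered}
    return min(scores, key=scores.get)

# Toy bank: 3 items x 4 latent profiles (success probabilities, made up)
bank = np.array([[0.2, 0.8, 0.3, 0.9],
                 [0.1, 0.2, 0.8, 0.9],
                 [0.5, 0.5, 0.5, 0.5]])  # last item carries no information
prior = np.full(4, 0.25)
print(select_item(prior, bank, administered=set()))
```

The Kullback-Leibler variant scores each item by the expected KL divergence between its response distributions under the current profile estimate and the competing profiles, then maximizes that index instead.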

In this paper, we introduce new tests for exponentiality based on estimators of Rényi entropy of a continuous random variable. We first consider two transformations of the observations which turn the test of exponentiality into one of uniformity and use a corresponding test based on Rényi entropy. Critical values of the test statistics are computed by Monte Carlo simulations. Then, we compare p...
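
A rough sketch of the overall recipe under stated assumptions: the normalized-partial-sums transformation (a classical way to map an iid exponential sample to uniform order statistics) and one m-spacing estimator of Rényi entropy, analogous to Vasicek's Shannon-entropy estimator. The paper's specific transformations and estimator may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

def to_uniform(x):
    """For iid exponentials, the normalized partial sums
    S_1/S_n, ..., S_{n-1}/S_n are distributed as order statistics
    of n-1 iid Uniform(0,1) variables."""
    s = np.cumsum(np.asarray(x, dtype=float))
    return s[:-1] / s[-1]

def renyi_mspacing(u, alpha=0.5, m=None):
    """m-spacing estimator of Rényi entropy for a sample on [0, 1]."""
    u = np.sort(u)
    n = len(u)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))
    idx = np.arange(n)
    upper = u[np.minimum(idx + m, n - 1)]
    lower = u[np.maximum(idx - m, 0)]
    f_hat = 2 * m / (n * (upper - lower))   # local density estimate
    return np.log(np.mean(f_hat ** (alpha - 1))) / (1 - alpha)

def test_exponentiality(x, alpha=0.5, n_mc=2000, level=0.05):
    n = len(x)
    stat = renyi_mspacing(to_uniform(x), alpha)
    # Monte Carlo critical value under the exponential null; the uniform
    # density maximizes Rényi entropy on [0, 1], so small values reject.
    null = np.array([renyi_mspacing(to_uniform(rng.exponential(size=n)), alpha)
                     for _ in range(n_mc)])
    crit = np.quantile(null, level)
    return stat, crit, stat < crit          # True -> reject exponentiality

print(test_exponentiality(rng.exponential(size=200)))  # should rarely reject
print(test_exponentiality(rng.lognormal(size=200)))    # should usually reject
```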

1995
Lei Xu

A Bayesian-Kullback learning scheme, called the Ying-Yang Machine, is proposed based on the two complementary but equivalent Bayesian representations of the joint density and their Kullback divergence. Not only does the scheme unify existing major supervised and unsupervised learning methods, including the classical maximum likelihood or least-squares learning, the maximum information preservation, the EM & em algo...
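
Schematically, and hedged as a reading of the abstract rather than the paper's exact formulation: the two representations factor the joint density over observations x and inner representations y in complementary directions, and learning matches them under the Kullback divergence:

```latex
% Two complementary Bayesian factorizations of the joint density:
p(x, y) = p(y \mid x)\, p(x), \qquad q(x, y) = q(x \mid y)\, q(y)
% Learning minimizes the Kullback divergence between them:
\mathrm{KL}(p \,\|\, q)
  = \iint p(y \mid x)\, p(x)\,
    \log \frac{p(y \mid x)\, p(x)}{q(x \mid y)\, q(y)} \, dx \, dy
```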
