Search results for: kullback information

Number of results: 1158173

2014
Steeve Zozor Jean-Marc Brossier

In this paper we propose a generalization of the usual de Bruijn identity that links the Shannon differential entropy (or the Kullback–Leibler divergence) and the Fisher information (or the Fisher divergence) of the output of a Gaussian channel. The generalization makes use of φ-entropies on the one hand, and of φ-divergences (of the Csiszár class) on the other hand, as generalizations of the S...
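For reference, the classical (non-generalized) de Bruijn identity for a Gaussian perturbation channel can be stated as follows; the notation here is standard and not taken from the paper:

\[
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
\qquad Z \sim \mathcal{N}(0,1),\ Z \text{ independent of } X,
\]

where h denotes the Shannon differential entropy and J the Fisher information of the channel output; the generalization described in the abstract replaces these Shannon/Fisher quantities with their φ-entropy and φ-divergence counterparts.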

Journal: Entropy 2014
Keisuke Yano Fumiyasu Komaki

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of data and target variables are different and have a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to a trace of the product of two matrices: the inverse of the Fisher information matrix for the data and ...

2001
MARC JOANNIDES

We study the asymptotic behavior of the Bayesian estimator for a deterministic signal in additive Gaussian white noise, in the case where the set of minima of the Kullback–Leibler information is a submanifold of the parameter space. This problem includes as a special case the study of the asymptotic behavior of the nonlinear filter, when the state equation is noise-free, and when the limiting d...

2005
David Pereira Coutinho Mário A. T. Figueiredo

Most approaches to text classification rely on some measure of (dis)similarity between sequences of symbols. Information theoretic measures have the advantage of making very few assumptions on the models which are considered to have generated the sequences, and have been the focus of recent interest. This paper addresses the use of the Ziv-Merhav method (ZMM) for the estimation of relative entr...

2013
Boris Schauerte Rainer Stiefelhagen

We propose the use of Bayesian surprise to detect arbitrary, salient acoustic events. We use Gaussian or Gamma distributions to model the spectrogram distribution and use the Kullback–Leibler divergence between the posterior and prior distributions to calculate how “unexpected”, and thus surprising, newly observed audio samples are. This way, we efficiently detect arbitrary surprising/salient acoustic ...
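As an illustration of the surprise computation described in this abstract, here is a minimal sketch assuming a one-dimensional Gaussian mean model with known observation variance and a conjugate normal prior; this is a simplification of the paper's spectrogram models, and the function names are ours:

import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    # KL( N(mu_p, var_p) || N(mu_q, var_q) ), in nats.
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def bayesian_surprise(prior_mu, prior_var, obs, obs_var):
    # Conjugate update of a Gaussian prior on the mean, given one observation with
    # known variance obs_var; surprise is taken as KL(posterior || prior).
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mu = post_var * (prior_mu / prior_var + obs / obs_var)
    return gaussian_kl(post_mu, post_var, prior_mu, prior_var), post_mu, post_var

# Toy usage: a sudden jump in one spectrogram bin yields a large surprise value.
mu, var = 0.0, 1.0
for x in [0.1, -0.2, 0.05, 5.0]:   # the last sample is the "salient" event
    s, mu, var = bayesian_surprise(mu, var, x, obs_var=0.5)
    print(f"observation {x:5.2f} -> surprise {s:.3f} nats")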

Journal: Signal, Image and Video Processing 2007
Mohand Saïd Allili Djemel Ziou

In this paper, we propose a robust model for tracking in video sequences with non-static backgrounds. The object boundaries are tracked on each frame of the sequence by minimizing an energy functional that combines region, boundary and shape information. The region information is formulated by minimizing the symmetric Kullback–Leibler (KL) distance between the local and global statistics of th...
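To make the region term concrete, here is a minimal sketch of a symmetric KL distance between two normalized histograms; this is our own illustration, and the paper's full energy functional also combines boundary and shape terms:

import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    # Symmetric Kullback-Leibler distance: average of KL(p||q) and KL(q||p).
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return 0.5 * (np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Toy usage: compare a local region histogram against a global object histogram.
rng = np.random.default_rng(0)
local_hist  = np.histogram(rng.normal(0.4, 0.1, 500),  bins=32, range=(0, 1))[0]
global_hist = np.histogram(rng.normal(0.5, 0.1, 5000), bins=32, range=(0, 1))[0]
print(symmetric_kl(local_hist, global_hist))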

2013
David Witmer

1 Recap. Recall some important facts about entropy and mutual information from the previous lecture:
• H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
• I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) = H(X) + H(Y) − H(X, Y)
• I(X; Y|Z) = H(X|Z) − H(X|Y, Z)
• I(X; Y) = 0 if X and Y are independent
• I(X; Y) ≥ 0 or, equivalently, H(X) ≥ H(X|Y)
Exercise 1.1. Prove that H(X|Y) = 0 if and only if X = g(Y) for some ...
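These identities are easy to verify numerically; the following small check on an arbitrary 2×3 joint distribution is our own illustration, not part of the lecture notes:

import numpy as np

def H(p):
    # Shannon entropy (bits) of a probability vector; zero entries contribute nothing.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# An arbitrary joint distribution P(X, Y) over a 2x3 alphabet.
pxy = np.array([[0.10, 0.25, 0.05],
                [0.30, 0.10, 0.20]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_XY = H(pxy.ravel())
H_X, H_Y = H(px), H(py)
# Conditional entropies computed directly from the conditional distributions.
H_X_given_Y = sum(py[j] * H(pxy[:, j] / py[j]) for j in range(pxy.shape[1]))
H_Y_given_X = sum(px[i] * H(pxy[i, :] / px[i]) for i in range(pxy.shape[0]))

# Chain rule and the three equivalent forms of mutual information.
assert np.isclose(H_XY, H_X + H_Y_given_X)
assert np.isclose(H_XY, H_Y + H_X_given_Y)
I_XY = H_X - H_X_given_Y
assert np.isclose(I_XY, H_Y - H_Y_given_X)
assert np.isclose(I_XY, H_X + H_Y - H_XY)
assert I_XY >= -1e-12                      # mutual information is nonnegative
print(f"H(X)={H_X:.4f}  H(Y)={H_Y:.4f}  H(X,Y)={H_XY:.4f}  I(X;Y)={I_XY:.4f}")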

Journal: CoRR 2018
Mahito Sugiyama Hiroyuki Nakahara Koji Tsuda

We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters. Thanks to the well-developed theory of information geometry, the reconstructed tensor is unique and always minimizes the KL divergence from an input tensor. We empirically show that Legendre decomposition can more accurately ...

Journal: Intelligent Information Management 2010
Kefan Xie Qian Wu Gang Chen Chao Ji

Information is a key factor in emergency management, helping decision makers to make effective decisions. In this paper, with the aim of clarifying the laws of information aggregation, and in accordance with the characteristics of emergency information, relative entropy of information is applied to information aggregation in order to establish an information aggregation model for emergency group decision-making. T...

2012
Aditya Guntuboyina

The mutual information I(θ; X) between two random variables θ and X is defined as the Kullback-Leibler divergence between their joint distribution and the product of their marginal distributions. It is interpreted as the amount of information that X contains about θ.
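In symbols (standard definition; notation is ours, not the author's):

\[
I(\theta; X) \;=\; D_{\mathrm{KL}}\!\left( P_{\theta, X} \,\middle\|\, P_{\theta} \otimes P_{X} \right)
\;=\; \iint p(\theta, x) \,\log \frac{p(\theta, x)}{p(\theta)\, p(x)} \,\mathrm{d}\theta\, \mathrm{d}x ,
\]

which is zero precisely when θ and X are independent.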

[Chart: number of search results per year]