Search results for: kullback information

Number of results: 1,158,173

2008
Tongzhu Li, Linyu Peng, Huafei Sun

In the present paper we study the geometric structure of the inverse Gamma manifold from the viewpoint of information geometry and give the Kullback divergence, the J-divergence and the geodesic equations. Also, some applications of the inverse Gamma distribution are provided.
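As a rough illustration of the first of these quantities (not the paper's closed-form result), the Kullback divergence between two inverse Gamma densities can be estimated by Monte Carlo; the shape and scale values below are arbitrary examples, and the J-divergence mentioned in the abstract is the symmetrized sum KL(p||q) + KL(q||p).

    # Monte Carlo sketch of KL(p || q) for two inverse Gamma densities.
    # Parameter values are arbitrary examples, not taken from the paper.
    import numpy as np
    from scipy.stats import invgamma

    def kl_inverse_gamma(a_p, b_p, a_q, b_q, n=200_000, seed=0):
        """Monte Carlo estimate of KL(p || q) = E_p[log p(X) - log q(X)]."""
        p = invgamma(a_p, scale=b_p)
        q = invgamma(a_q, scale=b_q)
        x = p.rvs(size=n, random_state=seed)
        return float(np.mean(p.logpdf(x) - q.logpdf(x)))

    kl_pq = kl_inverse_gamma(3.0, 2.0, 4.0, 1.5)
    kl_qp = kl_inverse_gamma(4.0, 1.5, 3.0, 2.0)
    print("KL(p||q) ~", kl_pq)
    print("J(p, q)  ~", kl_pq + kl_qp)   # J-divergence: symmetrized Kullback divergence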

2003
Andreas Knauf

We investigate the structure of the global maximizers of stochastic interdependence, which is measured by the Kullback-Leibler divergence of the underlying joint probability distribution from the exponential family of factorizable random fields (multi-information). As a consequence of our structural results, it turns out that random fields with globally maximal multi-information are contained i...
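For a discrete joint distribution, the multi-information referred to here is the Kullback-Leibler divergence of the joint from the product of its marginals (for two variables it reduces to mutual information). A minimal sketch, with an arbitrary 2x2 example table:

    # Multi-information of a discrete joint distribution: the KL divergence
    # of p(x, y) from the product of its marginals p(x) p(y).
    import numpy as np

    def multi_information(joint):
        """KL(joint || product of marginals), in nats, for a 2-D array."""
        px = joint.sum(axis=1, keepdims=True)   # marginal over rows
        py = joint.sum(axis=0, keepdims=True)   # marginal over columns
        factorized = px * py
        mask = joint > 0
        return float(np.sum(joint[mask] * np.log(joint[mask] / factorized[mask])))

    joint = np.array([[0.4, 0.1],
                      [0.1, 0.4]])              # strongly dependent example pair
    print(multi_information(joint))             # larger for stronger dependence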

Journal: CoRR 2014
Jonathon Shlens

The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two probability distributions. Although difficult to understand by examining the equation, an intuition and understanding of the KL divergence arises from its intimate relationship with likelihood theory. We discuss how KL divergence arises from likelihood theory in an attempt t...
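For discrete distributions, the quantity in question is the expectation, under p, of the log-likelihood ratio log p(x)/q(x), which is where the link to likelihood theory comes from. A minimal sketch with made-up distributions:

    # KL(p || q) for discrete distributions, written as the expected
    # log-likelihood ratio under p. Distributions are arbitrary examples.
    import numpy as np

    def kl_divergence(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0                             # treat 0 * log 0 as 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(kl_divergence(p, q))   # >= 0, and generally != kl_divergence(q, p)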

1992
Yoav Freund, H. Sebastian Seung, Eli Shamir, Naftali Tishby

We analyze the "query by committee" algorithm, a method for filtering informative queries from a random stream of inputs. We show that if the two-member committee algorithm achieves information gain with a positive lower bound, then the prediction error decreases exponentially with the number of queries. We show that, in particular, this exponential decrease holds for query learning of thresholde...
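A schematic sketch of the two-member committee filter (not the paper's exact algorithm, which samples the committee from the version space): two hypotheses are kept roughly consistent with the labels seen so far, and a streamed input is sent to the labeller only when they disagree. Perceptron-style updates stand in for Gibbs sampling here, and all parameter choices are arbitrary.

    # Toy "query by committee" filter with a two-member committee of linear
    # threshold functions. Labels are requested only on disagreement.
    import numpy as np

    rng = np.random.default_rng(0)
    target = rng.normal(size=5)                      # unknown threshold function
    w1, w2 = rng.normal(size=5), rng.normal(size=5)  # the two committee members

    queries = 0
    for _ in range(2000):
        x = rng.normal(size=5)                       # random input stream
        pred1, pred2 = np.sign(w1 @ x), np.sign(w2 @ x)
        if pred1 != pred2:                           # committee disagrees: query the label
            y = np.sign(target @ x)
            queries += 1
            for w, pred in ((w1, pred1), (w2, pred2)):
                if pred != y:
                    w += y * x                       # perceptron-style correction
    print("labels requested:", queries)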

2000
Brigitte Krenn

An empirical study is presented showing how factors such as co-occurrence frequency, linguistic constraints in the candidate data and type of collocation to be identified influence the identification accuracy achieved, on the one hand, by a mere frequency-based approach and, on the other hand, by well-known statistical association measures such as mutual information, Dice coefficient, relative entro...
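As a point of reference for two of the association measures named above, pointwise mutual information and the Dice coefficient can be computed directly from co-occurrence counts; the counts below are made-up illustrative values, not data from the study.

    # Two association measures for a candidate word pair, from raw counts.
    # All counts are made-up illustrative values.
    import math

    N = 1_000_000              # corpus size (number of pairs considered)
    f_xy = 150                 # co-occurrence frequency of the candidate pair
    f_x, f_y = 2_000, 3_000    # individual frequencies of the two words

    pmi = math.log2((f_xy / N) / ((f_x / N) * (f_y / N)))   # pointwise mutual information
    dice = 2 * f_xy / (f_x + f_y)                           # Dice coefficient

    print(f"PMI  = {pmi:.2f} bits")
    print(f"Dice = {dice:.4f}")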

1995
David Haussler

Each parameter w in an abstract parameter space W is associated with a different probability distribution on a set Y. A parameter w is chosen at random from W according to some a priori distribution on W, and n conditionally independent random variables Y^n = Y_1, ..., Y_n are observed with common distribution determined by w. Viewing W as a random variable, we obtain bounds on the mutual information between...
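A toy computation of the quantity being bounded, I(W; Y^n), for the simplest possible setting: W drawn uniformly from two coin biases and n conditionally independent flips. This only illustrates the definition of the mutual information, not the bounds derived in the paper.

    # Exact mutual information I(W; Y^n) between a random parameter W (one of
    # two coin biases, chosen uniformly) and n conditionally independent flips.
    from itertools import product
    from math import log

    biases = [0.3, 0.7]          # the two possible parameters
    prior = [0.5, 0.5]           # a priori distribution on W
    n = 4

    mi = 0.0
    for y in product([0, 1], repeat=n):
        lik = [b ** sum(y) * (1 - b) ** (n - sum(y)) for b in biases]   # P(y | w)
        p_y = sum(pw * l for pw, l in zip(prior, lik))                  # marginal P(y)
        for pw, l in zip(prior, lik):
            p_wy = pw * l                                               # joint P(w, y)
            mi += p_wy * log(p_wy / (pw * p_y))
    print("I(W; Y^n) =", mi, "nats")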

2010
S. Tahmasebi, J. Behboodian

In the present paper, Shannon's entropy for concomitants of generalized order statistics in the FGM family is obtained. Applications of this result are given for order statistics, record values, k-record values, and progressive type II censored order statistics. Also, we show that the Kullback-Leibler distance among the concomitants of generalized order statistics is distribution-free.

Journal: Entropy 2015
Nihat Ay, Shun-ichi Amari

A divergence function defines a Riemannian metric g and dually coupled affine connections ∇ and ∇∗ with respect to it in a manifold M. When M is dually flat, that is, flat with respect to ∇ and ∇∗, a canonical divergence is known, which is uniquely determined from (M, g, ∇, ∇∗). We propose a natural definition of a canonical divergence for a general, not necessarily flat, M by using the geodesic ...
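In the dually flat case mentioned here, the canonical divergence is the Bregman divergence of a convex potential. A small sketch with the negative-entropy potential on positive vectors (an illustrative choice, not the paper's construction for general M), where the Bregman divergence reproduces the generalized Kullback-Leibler divergence:

    # Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>, with the
    # negative-entropy potential F. For positive vectors it equals the
    # generalized KL divergence, illustrating the dually flat canonical divergence.
    import numpy as np

    def bregman(F, gradF, p, q):
        return F(p) - F(q) - np.dot(gradF(q), p - q)

    F = lambda p: np.sum(p * np.log(p) - p)        # negative-entropy potential
    gradF = lambda p: np.log(p)

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])

    generalized_kl = np.sum(p * np.log(p / q) - p + q)
    print(bregman(F, gradF, p, q), generalized_kl)  # the two values agree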

Journal: CoRR 2012
Jithin Vachery, Ambedkar Dukkipati

The importance of power-law distributions is attributed to the fact that most naturally occurring phenomena exhibit this distribution. While exponential distributions can be derived by minimizing KL-divergence w.r.t. some moment constraints, some power-law distributions can be derived by minimizing some generalizations of KL-divergence (more specifically some special cases of Csiszár f-...
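For discrete distributions, a Csiszár f-divergence has the form D_f(p||q) = Σ_i q_i f(p_i/q_i) with f convex and f(1) = 0; KL divergence is the special case f(t) = t log t. The sketch below also evaluates a Tsallis-type generator as one example of the generalizations alluded to; the value of alpha and the distributions are arbitrary.

    # Csiszár f-divergence for discrete distributions, with KL divergence
    # recovered by the generator f(t) = t log t. The alpha generator is a
    # Tsallis-type example; alpha and the distributions are arbitrary choices.
    import numpy as np

    def f_divergence(p, q, f):
        p, q = np.asarray(p, float), np.asarray(q, float)
        return float(np.sum(q * f(p / q)))

    kl_generator = lambda t: t * np.log(t)                        # gives KL(p || q)
    alpha = 1.5
    tsallis_generator = lambda t: (t**alpha - t) / (alpha - 1.0)  # a power-law-linked case

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(f_divergence(p, q, kl_generator))
    print(f_divergence(p, q, tsallis_generator))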

Journal: CoRR 2008
François Bavaud

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already h...

Chart: number of search results per year
