Search results for: kullback information

Number of results: 1,158,173

Journal: CoRR, 2015
Juan-Manuel Torres-Moreno

In this paper we introduce the intuitive notion of the trivergence of probability distributions (TPD). This notion allows us to calculate the similarity among triplets of objects. For this computation, we can use well-known measures of probability divergence such as Kullback-Leibler and Jensen-Shannon. Divergence measures may be used in Information Retrieval tasks such as Automatic Text Summarization, ...
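The snippet does not state the TPD formula, but its pairwise building blocks are standard. Below is a minimal Python sketch of discrete Kullback-Leibler and Jensen-Shannon divergences; the three-way quantity at the end, a sum of pairwise JS divergences, is only a hypothetical stand-in for the paper's trivergence, and all names are ours.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetrized KL via the mixture m = (p+q)/2."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Three term-frequency distributions standing in for three documents.
p1 = np.array([0.5, 0.3, 0.2])
p2 = np.array([0.4, 0.4, 0.2])
p3 = np.array([0.1, 0.2, 0.7])

# Hypothetical three-way comparison from pairwise JS divergences; the paper's
# actual TPD definition is not given in the snippet above.
print(js_divergence(p1, p2) + js_divergence(p2, p3) + js_divergence(p1, p3))
```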

2008

1 General case (1.1 Cumulative distribution function, 1.2 A counterexample, 1.3 Normally distributed and independent), 2 Bivariate case, 3 Affine transformation, 4 Geometric interpretation, 5 Correlations and independence, 6 Higher moments, 7 Conditional distributions, 8 Fisher information matrix, 9 Kullback-Leibler divergence, 10 Estimation of parameters, 11 Entropy, 12 Multivariate normality tests, 13 Drawin...
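For reference, the Kullback-Leibler divergence between two multivariate normals (item 9 of this outline) has a well-known closed form, sketched here with NumPy:

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0,S0) || N(mu1,S1) ) for k-dimensional normals:
    0.5 * [ tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0) - k + ln(det S1 / det S0) ].
    """
    k = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
print(kl_mvn(mu0, S0, mu1, S1))  # 0.0 only when both distributions coincide
```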

Journal: Journal of Machine Learning Research, 2015
Vladimir Nikulin

In this paper we formulate, in general terms, an approach to proving strong consistency of the Empirical Risk Minimisation inductive principle applied to prototype- or distance-based clustering. This approach was motivated by the Divisive Information-Theoretic Feature Clustering model in probabilistic space with Kullback-Leibler divergence, which may be regarded as a special case within the Clus...
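A minimal sketch of what prototype-based clustering with the Kullback-Leibler divergence can look like, in the spirit of divisive information-theoretic feature clustering: each object is a discrete distribution, assignments use KL(p || centroid), and centroids are cluster means, which minimize the within-cluster KL objective in this direction. The function names and toy data are ours, not the paper's.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    # Row-wise KL(p || q) for distributions along the last axis.
    return np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)

def kl_kmeans(dists, k, iters=50, seed=0):
    """Prototype-based clustering of discrete distributions (rows of `dists`)."""
    rng = np.random.default_rng(seed)
    centroids = dists[rng.choice(len(dists), size=k, replace=False)]
    for _ in range(iters):
        # Assign each distribution to the nearest centroid in KL divergence.
        labels = np.argmin([kl(dists, c) for c in centroids], axis=0)
        # Recompute each centroid as the mean of its assigned distributions.
        new = np.array([dists[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

dists = np.array([[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.1, 0.2, 0.7],
                  [0.2, 0.1, 0.7], [0.3, 0.4, 0.3], [0.65, 0.25, 0.1]])
print(kl_kmeans(dists, k=2)[0])  # cluster labels for the six toy distributions
```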

2007
Giovanni Parmigiani

Information-theoretic measures are of paramount importance in Bayesian inference. Yet their direct application in practice has been somewhat limited by the severe computational difficulties encountered in complex problems. In this paper we discuss implementation strategies for fast numerical computation of entropies and Kullback-Leibler divergences that are relevant to Bayesian inference and de...
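One common fast-computation strategy, assumed here rather than taken from the paper, is Monte Carlo estimation from posterior samples: with x_i ~ p, the entropy is approximately -mean(log p(x_i)), and KL(p || q) is approximately mean(log p(x_i) - log q(x_i)). A sketch with a Beta posterior:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Beta posterior for a coin's bias after 7 heads in 10 flips, uniform prior.
prior = stats.beta(1, 1)
posterior = stats.beta(1 + 7, 1 + 3)

x = posterior.rvs(100_000, random_state=rng)             # samples x_i ~ posterior
entropy_mc = -np.mean(posterior.logpdf(x))               # H(posterior) estimate
kl_mc = np.mean(posterior.logpdf(x) - prior.logpdf(x))   # KL(posterior || prior)

print(entropy_mc, posterior.entropy())  # Monte Carlo estimate vs. exact value
print(kl_mc)
```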

2000
David R. Wolf

In this paper we propose a Bayesian, information theoretic approach to dimensionality reduction. The approach is formulated as a variational principle on mutual information, and seamlessly addresses the notions of sufficiency, relevance, and representation. Maximally informative statistics are shown to minimize a Kullback-Leibler distance between posterior distributions. Illustrating the approa...
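For context, the mutual information at the heart of such a variational principle is itself a Kullback-Leibler divergence, between the joint distribution and the product of its marginals; a minimal discrete sketch:

```python
import numpy as np

def mutual_information(joint, eps=1e-12):
    """I(X;Y) = KL( p(x,y) || p(x) p(y) ) for a discrete joint distribution."""
    joint = np.asarray(joint, float)
    px = joint.sum(axis=1, keepdims=True)  # marginal p(x) over rows
    py = joint.sum(axis=0, keepdims=True)  # marginal p(y) over columns
    return float(np.sum(joint * np.log((joint + eps) / (px * py + eps))))

# Joint law of a binary symmetric channel: X uniform, Y = X flipped w.p. 0.1.
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
print(mutual_information(joint))  # ln 2 minus the binary entropy of 0.1, in nats
```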

2018
Linus Holm Erik Lundgren

Curiosity is thought to be an intrinsically motivated driving force for seeking information. Thus, the opportunity for an information gain (IG) should instil curiosity in humans and result in information-gathering actions. To investigate if, and how, information acts as an intrinsic reward, a search task was set in a context of blurred background images which could be revealed by iterative clic...
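One hypothetical way to make the IG quantity concrete, unrelated to the paper's actual search task: the entropy reduction of a Beta belief about an unknown Bernoulli parameter after a single observation.

```python
from scipy import stats

def information_gain(a, b, observation):
    """Entropy reduction of a Beta(a, b) belief after one Bernoulli observation
    (1 or 0); a hypothetical stand-in for the IG discussed in the abstract."""
    prior = stats.beta(a, b)
    post = stats.beta(a + observation, b + (1 - observation))
    return prior.entropy() - post.entropy()

print(information_gain(1, 1, 1))    # an early observation is highly informative
print(information_gain(10, 10, 1))  # the same observation later adds much less
```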

2012
Antonio Cabrales Olivier Gossner Roberto Serrano

An information transaction entails the purchase of information. Formally, it consists of an information structure together with a price. We develop an index of the appeal of information transactions, which is derived as a dual to the agent’s preferences for information. The index of information transactions has a simple analytic characterization in terms of the relative entropy from priors to p...
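A sketch of the "relative entropy from priors to posteriors" ingredient: the expected KL divergence of the Bayesian posterior from the prior, averaged over the signals of a finite information structure. The numbers are made up, and this is not claimed to be the paper's index.

```python
import numpy as np

def expected_relative_entropy(prior, likelihoods):
    """Expected KL(posterior || prior) for a finite information structure:
    likelihoods[s, w] = P(signal s | state w), prior[w] = P(state w).
    Assumes strictly positive posteriors (no log-of-zero handling)."""
    prior = np.asarray(prior, float)
    lik = np.asarray(likelihoods, float)
    p_signal = lik @ prior                        # P(s) = sum_w P(s|w) P(w)
    total = 0.0
    for s in range(lik.shape[0]):
        posterior = lik[s] * prior / p_signal[s]  # Bayes' rule
        total += p_signal[s] * np.sum(posterior * np.log(posterior / prior))
    return total

# Two states and a binary signal that is correct with probability 0.8.
print(expected_relative_entropy([0.5, 0.5], [[0.8, 0.2],
                                             [0.2, 0.8]]))
```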

2007
Flemming Topsøe

Information theory is becoming more and more important for many fields. This is true for engineering- and technology-based areas, but also for more theoretically oriented sciences such as probability and statistics. Aspects of this development are first discussed at the non-technical level, with emphasis on the role of information-theoretic games. The overall rationale is explained and central typ...

2005
Inder Jeet Taneja

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information, Jeffreys' [16] J-divergence, and the information radius or Jensen difference divergence measure due to Sibson [23]. Burbea and Rao [3, 4] have also found applications for it in the literature. Taneja [25] studied anoth...
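For reference, the measures named in this snippet have the standard definitions (for discrete distributions P and Q):

\[
D(P\|Q) = \sum_x p(x)\,\log\frac{p(x)}{q(x)}, \qquad
J(P,Q) = D(P\|Q) + D(Q\|P),
\]
\[
R(P,Q) = \tfrac{1}{2}\,D\!\left(P \,\middle\|\, \tfrac{P+Q}{2}\right)
       + \tfrac{1}{2}\,D\!\left(Q \,\middle\|\, \tfrac{P+Q}{2}\right),
\]

with D the Kullback-Leibler relative information, J the Jeffreys J-divergence, and R the information radius (Jensen difference).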
