Search results for: kullback information

Number of results: 1158173

2013
Roman V. Belavkin

The concept of information distance in a non-commutative setting is reconsidered. Additive information, such as the Kullback-Leibler divergence, is defined using a convex functional whose gradient is a homomorphism between multiplicative and additive subgroups. We review several geometric properties, such as the logarithmic law of cosines, the Pythagorean theorem and a lower bound given b...
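For orientation, a commutative-case sketch of the objects involved (standard facts about the classical setting, not the paper's non-commutative construction): the Kullback-Leibler divergence and the negative-entropy functional whose gradient acts through the logarithm, a homomorphism from the multiplicative group of positive reals to the additive group of reals.

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i}, \qquad F(p) = \sum_i \bigl( p_i \log p_i - p_i \bigr), \qquad (\nabla F(p))_i = \log p_i .

Since \log(xy) = \log x + \log y, the gradient map turns products into sums, and the Bregman divergence of F reduces to D_{\mathrm{KL}} on normalized distributions.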

2007
Huaiyu Zhu

Bayesian information geometry provides a general error decomposition theorem for arbitrary statistical models and a family of information deviations that include the Kullback-Leibler information as a special case. When applied to Gaussian measures, it recovers the classical Hilbert space (Sobolev space) theories for estimation (regression, filtering, approximation, smoothing) as a special case. When th...
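As a reminder of the Gaussian special case referred to above (a standard closed form, not quoted from the paper), the Kullback-Leibler divergence between two univariate Gaussian measures is

D_{\mathrm{KL}}\bigl( \mathcal{N}(\mu_1, \sigma_1^2) \,\|\, \mathcal{N}(\mu_2, \sigma_2^2) \bigr) = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} - \frac{1}{2} .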

Journal: :ACM Computing Surveys 2023

This tutorial studies the relations between differential privacy and various information-theoretic measures through a selection of articles. In particular, we present how these connections can provide new interpretations of the privacy guarantee in systems that deploy a differential-privacy framework. Accordingly, it delivers an extensive summary of the existing literature that makes use of tools such as mutual information, min-entropy...
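One standard connection of the kind such surveys cover (a well-known equivalence, stated here for illustration rather than quoted from the article): a mechanism M satisfies pure \varepsilon-differential privacy exactly when the max-divergence between its output distributions on neighbouring datasets D, D' is bounded by \varepsilon,

\Pr[M(D) \in S] \le e^{\varepsilon} \, \Pr[M(D') \in S] \ \text{for all } S \quad \Longleftrightarrow \quad D_\infty\bigl( M(D) \,\|\, M(D') \bigr) = \sup_S \log \frac{\Pr[M(D) \in S]}{\Pr[M(D') \in S]} \le \varepsilon .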

2001
Andreas de Vries

A connection between the notion of information and the concepts of risk and return in portfolio theory is deduced. This is achieved in two steps: a general moment-return relation for arbitrary assets is derived, and the total expected return is then connected to the Kullback-Leibler information. With this result, the optimization problem of maximizing the expected return of a portfolio consisting of...

Journal: :Entropy 2008
Imre Csiszár

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional rand...
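For reference, the Shannon entropy being axiomatized and one classical axiom of the type treated in direction (A) (Faddeev's recursivity), stated here as standard background rather than as the paper's own formulation:

H(p_1, \dots, p_n) = - \sum_{i=1}^{n} p_i \log p_i, \qquad H(p_1, p_2, p_3, \dots, p_n) = H(p_1 + p_2, p_3, \dots, p_n) + (p_1 + p_2)\, H\!\Bigl( \tfrac{p_1}{p_1 + p_2}, \tfrac{p_2}{p_1 + p_2} \Bigr) .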

2008
I. Kontoyiannis O. T. Johnson M. Madiman

An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson approximation. It is shown that the compound Poisson distributions satisfy a natural maximum entropy property within a natural class of distributions. Simple compound Poisson approximation bounds are...
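As a minimal numerical sketch of the simpler, non-compound analogy mentioned above, the Python snippet below evaluates the relative entropy between a binomial law and the Poisson law with the same mean, working in log space; the function name and parameter choices are illustrative assumptions, not taken from the paper.

import math

def kl_binomial_poisson(n: int, p: float) -> float:
    """Relative entropy D(Bin(n, p) || Po(np)), computed via log-pmfs to avoid overflow."""
    lam = n * p
    total = 0.0
    for k in range(n + 1):
        # log of the binomial pmf at k, via log-gamma
        log_b = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + k * math.log(p) + (n - k) * math.log1p(-p))
        # log of the Poisson(np) pmf at k
        log_q = -lam + k * math.log(lam) - math.lgamma(k + 1)
        total += math.exp(log_b) * (log_b - log_q)
    return total

# The divergence shrinks as n grows while the mean np is held fixed.
for n in (10, 100, 1000):
    print(n, kl_binomial_poisson(n, 2.0 / n))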

Journal: :IJCSA 2010
Hazra Imran Aditi Sharan

Automatic query expansion is a well-known method for improving the performance of information retrieval systems. In this paper, we consider methods for extracting candidate terms for automatic query expansion based on co-occurrence information from pseudo-relevant documents. The objective of the paper is to present to the user different ways of selecting and ranking co-occurring terms and to suggest...
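A minimal sketch of the general idea of scoring candidate expansion terms by their co-occurrence with the query inside pseudo-relevant documents; the function name, the document-level counting, and the toy data below are illustrative assumptions, not the authors' method.

from collections import Counter
from typing import Iterable, List, Set

def cooccurrence_scores(query_terms: Set[str],
                        pseudo_relevant_docs: Iterable[List[str]]) -> Counter:
    """Count, for each non-query term, in how many pseudo-relevant documents it co-occurs with a query term."""
    scores: Counter = Counter()
    for tokens in pseudo_relevant_docs:
        doc_terms = set(tokens)
        if query_terms & doc_terms:               # the document mentions the query
            for term in doc_terms - query_terms:  # every other term is a candidate
                scores[term] += 1
    return scores

# Toy usage: rank expansion candidates for the query {"information"}.
docs = [["kullback", "leibler", "information", "divergence"],
        ["information", "retrieval", "query", "expansion"],
        ["entropy", "coding"]]
print(cooccurrence_scores({"information"}, docs).most_common(3))

Real systems typically replace the raw document-level count with a weighted co-occurrence measure (for example, normalised by term and document frequencies) before re-ranking.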

Chart: number of search results per year