Search results for: kullback information

Number of results: 1,158,173

Journal: Foundations and Trends in Communications and Information Theory, 2004
Imre Csiszár, Paul C. Shields

This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics cov...
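
As a concrete companion to the finite-alphabet setting described above, here is a minimal Python sketch of the two central quantities: information divergence, and the I-projection of a distribution q onto a linear family {p : E_p[f] = a}, which is known to take the form of an exponential tilting of q. The function names, the single linear constraint, and the crude grid search over the tilting parameter are illustrative assumptions, not the tutorial's notation.

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) log(p(x)/q(x)) over a finite alphabet.
    Convention: 0 * log(0/q) = 0; D is +inf if p(x) > 0 where q(x) = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def i_projection(q, f, a, t_grid=np.linspace(-20, 20, 2001)):
    """I-projection of q onto {p : E_p[f] = a}: the minimizer of D(p || q)
    has the form p*(x) ~ q(x) * exp(t * f(x)); search t on a crude grid."""
    q, f = np.asarray(q, float), np.asarray(f, float)
    best_gap, best_p = np.inf, q
    for t in t_grid:
        p = q * np.exp(t * f)
        p /= p.sum()
        gap = abs(p @ f - a)
        if gap < best_gap:
            best_gap, best_p = gap, p
    return best_p

# Example: tilt a fair die toward mean 4.5 and measure the divergence paid.
die = np.full(6, 1.0 / 6.0)
p_star = i_projection(die, np.arange(1, 7), 4.5)
print(kl_divergence(p_star, die))
```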

2003
Ce Liu, Harry Shum

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler d...
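
To make the feature-scoring idea concrete, the sketch below (Python) evaluates a candidate linear feature w by the divergence between the histograms of the projections w·x on positive versus negative samples. The symmetric form of the divergence, the fixed binning, and the random search over candidate directions are simplifications for illustration; the paper's KLBoosting learns each KL feature by iterative optimization, which is not reproduced here.

```python
import numpy as np

def projected_kl(w, X_pos, X_neg, bins=32):
    """Symmetric KL divergence between histograms of the 1-D projections
    w.x of positive and negative samples (higher = more discriminative)."""
    z_pos, z_neg = X_pos @ w, X_neg @ w
    lo = min(z_pos.min(), z_neg.min())
    hi = max(z_pos.max(), z_neg.max())
    h_pos, _ = np.histogram(z_pos, bins=bins, range=(lo, hi))
    h_neg, _ = np.histogram(z_neg, bins=bins, range=(lo, hi))
    eps = 1e-12  # smoothing so both histograms have full support
    p = (h_pos + eps) / (h_pos + eps).sum()
    q = (h_neg + eps) / (h_neg + eps).sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def best_random_feature(X_pos, X_neg, n_candidates=500, seed=0):
    """Stand-in for one boosting round: pick the random unit direction with
    the largest projected divergence (KLBoosting optimizes w instead)."""
    rng = np.random.default_rng(seed)
    cands = rng.normal(size=(n_candidates, X_pos.shape[1]))
    cands /= np.linalg.norm(cands, axis=1, keepdims=True)
    return max(cands, key=lambda w: projected_kl(w, X_pos, X_neg))
```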

2008
R. Fischer

The design of fusion diagnostics is essential for the physics program of future fusion devices. The goal is to maximize the information gain of a future experiment with respect to various constraints. A measure of information gain is the mutual information between the posterior and the prior distribution. The Kullback-Leibler distance is used as a utility function to calculate the expected info...
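
The utility described here, the expected Kullback-Leibler distance between posterior and prior, coincides with the mutual information between parameters and data. A minimal discrete Python sketch of that expected-gain computation follows; the function name and the tabulated likelihoods are hypothetical stand-ins for a real diagnostic's forward model.

```python
import numpy as np

def expected_information_gain(prior, likelihood):
    """Expected KL divergence D(posterior || prior), averaged over data y.
    prior:      shape (n_theta,), p(theta)
    likelihood: shape (n_y, n_theta), likelihood[y, t] = p(y | theta_t)."""
    evidence = likelihood @ prior  # p(y) for each possible measurement
    gain = 0.0
    for y in range(likelihood.shape[0]):
        if evidence[y] == 0:
            continue
        post = likelihood[y] * prior / evidence[y]  # Bayes: p(theta | y)
        mask = post > 0
        gain += evidence[y] * np.sum(post[mask] * np.log(post[mask] / prior[mask]))
    return float(gain)

# Ranking two candidate diagnostics: the sharper likelihood wins.
prior = np.array([0.5, 0.5])
vague = np.array([[0.55, 0.45], [0.45, 0.55]])
sharp = np.array([[0.95, 0.05], [0.05, 0.95]])
print(expected_information_gain(prior, vague) < expected_information_gain(prior, sharp))
```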

Journal: Bulletin of the Iranian Mathematical Society, 2011
H. Talebi, N. Esmailzadeh

This paper considers the search problem introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to search for only one possible non...

Journal: Entropy, 2017
Edgar Parker

An alternative derivation of the yield curve, based on entropy or the loss of information as it is communicated through time, is introduced. Given this focus on entropy growth in communication, the Shannon entropy is utilized. Additionally, Shannon entropy's close relationship to the Kullback–Leibler divergence is used to provide a more precise understanding of this new yield curve. The deriv...
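
The "close relationship" invoked above can be made explicit with a standard identity (a textbook fact, not a result of this paper): on an n-letter alphabet, the divergence from the uniform distribution u is exactly the entropy deficit, so entropy loss through time is equivalently growth of divergence from u.

```latex
\[
  D(p \,\|\, u) \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{1/n}
  \;=\; \log n - H(p),
  \qquad H(p) = -\sum_{x} p(x)\,\log p(x).
\]
```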

Journal: International Journal of Contents, 2009

Journal: Journal of the American Statistical Association, 2004

Journal: Journal of Biometrics & Biostatistics, 2012

Journal: Mathematics, 2021

Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for the estimates of the Kullback–Leibler divergence between two probability measures in R^d, absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results is also in treating ...
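
Below is a minimal Python sketch of the kind of estimator studied here, in the style of the classical k-nearest-neighbor divergence estimator of Wang, Kulkarni and Verdú (2009); whether it matches this paper's exact variant is an assumption, and ties between sample points (which would break the logarithms) are assumed away, as they are almost surely absent for absolutely continuous distributions.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_estimate(x, y, k=1):
    """Estimate D(P || Q) from samples x ~ P of shape (n, d) and
    y ~ Q of shape (m, d):
        D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
    rho_k(i): distance from x_i to its k-th nearest neighbor among the
              other x's; nu_k(i): k-th nearest-neighbor distance in y."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]  # skip x_i itself
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, k - 1]
    return float(d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1)))

# Sanity check against the closed form for two 1-D Gaussians:
# D(N(0,1) || N(1,1)) = 0.5.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(1.0, 1.0, size=(5000, 1))
print(knn_kl_estimate(x, y, k=5))  # roughly 0.5
```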

2016
Philip S. Thomas, Bruno Castro da Silva, Christoph Dann, Emma Brunskill

We propose a new class of algorithms for minimizing or maximizing functions of parametric probabilistic models. These new algorithms are natural gradient algorithms that leverage more information than prior methods by using a new metric tensor in place of the commonly used Fisher information matrix. This new metric tensor is derived by computing directions of steepest ascent where the distance ...
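
For context, a generic natural-gradient step in Python: the update direction is G(θ)^{-1}∇J, the direction of steepest ascent when distances between models are measured by a metric tensor G. Classically G is the Fisher information matrix, as in the example below for a Gaussian parameterized by (μ, log σ); the paper's contribution is a different choice of G, which in this sketch would simply replace the `metric` argument. The function names and the damping term are illustrative assumptions.

```python
import numpy as np

def natural_gradient_step(theta, grad, metric, lr=0.1, damping=1e-6):
    """theta <- theta + lr * G(theta)^{-1} grad: steepest ascent of the
    objective under the geometry induced by the metric tensor G."""
    G = metric(theta) + damping * np.eye(len(theta))
    return theta + lr * np.linalg.solve(G, grad)

def gaussian_fisher(theta):
    """Fisher information of N(mu, sigma^2) in coordinates (mu, log sigma),
    which works out to the diagonal matrix diag(1 / sigma^2, 2)."""
    sigma2 = np.exp(2.0 * theta[1])
    return np.diag([1.0 / sigma2, 2.0])

theta = np.array([0.0, 1.0])  # mu = 0, log sigma = 1
grad = np.array([1.0, -0.5])  # hypothetical gradient of the objective J
print(natural_gradient_step(theta, grad, gaussian_fisher))
```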

[Chart: number of search results per year]