Search results for: kullback information

Number of results: 1,158,173

Journal: Entropy, 2015
Anton Golub, Gregor Chliamovitch, Alexandre Dupuis, Bastien Chopard

In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory conten...
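A minimal sketch of the quantity involved, assuming a first-order chain (the paper itself handles arbitrary order and derives the sampling distribution of this divergence); the function and variable names below are illustrative only:

```python
import numpy as np

def estimate_transition_matrix(seq, n_states):
    """Maximum-likelihood estimate of a first-order transition matrix
    from an observed state sequence with states labelled 0..n_states-1."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1.0
    rows = counts.sum(axis=1, keepdims=True)
    # Rows for states never visited fall back to a uniform distribution.
    return np.divide(counts, rows, out=np.full_like(counts, 1.0 / n_states),
                     where=rows > 0)

def markov_kl(P, Q, weights):
    """Kullback-Leibler divergence between transition matrices P and Q:
    row-wise KL weighted by the state-visit frequencies `weights`
    (e.g. the stationary or empirical distribution of P)."""
    mask = P > 0
    terms = np.zeros_like(P)
    terms[mask] = P[mask] * np.log(P[mask] / Q[mask])
    return float(weights @ terms.sum(axis=1))
```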

2001
Yasuo Matsuyama, Naoto Katsumata, Shuichiro Imahara

The convex divergence is used as a surrogate function for obtaining a class of ICA (Independent Component Analysis) algorithms called the f-ICA. The convex divergence is a superclass of the α-divergence, which in turn generalizes the Kullback-Leibler divergence and mutual information. Therefore, the f-ICA contains the α-ICA and the minimum mutual information ICA. In addition to theoretical int...
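For orientation, one common parameterization of the α-divergence family is sketched below; conventions differ between papers, and the convex-divergence formulation used in the f-ICA work may be parameterized differently:

```latex
% One common (Amari-style) parameterization of the alpha-divergence family;
% shown only to illustrate how it nests the Kullback-Leibler divergence.
D_{\alpha}(p \,\|\, q) \;=\; \frac{1}{\alpha(1-\alpha)}
  \left( 1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx \right),
\qquad
\lim_{\alpha \to 1} D_{\alpha}(p \,\|\, q) \;=\; \mathrm{KL}(p \,\|\, q).
```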

2010
Tim van Erven, Peter Harremoës

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
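A minimal sketch of this relationship for discrete distributions, assuming q is strictly positive wherever p is; the function name is illustrative, not from the paper:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha between two discrete distributions,
    D_alpha(P || Q) = log(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1).
    The limit alpha -> 1 recovers the Kullback-Leibler divergence."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

# Example: order 1/2 (related to the Bhattacharyya coefficient) vs. the KL limit.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(renyi_divergence(p, q, 0.5), renyi_divergence(p, q, 1.0))
```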

2002
Jose A. García, Joaquín Fernández-Valdivia, Rosa Rodriguez-Sánchez, Xosé R. Fernández-Vidal

This paper presents a new method for characterizing information of a compressed image relative to the original one. We show how the Kullback-Leibler information gain is based on three basic postulates which are natural for image processing and thus desirable. As an example of the proposed measure, we analyze the effects of lossy compression on the identification of breast cancer microcalcificat...
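The paper's measure is built from its own postulates rather than a plain histogram comparison, but as a rough illustration of measuring information in a lossy-compressed image relative to the original, one might compare grey-level histograms with a KL divergence (hypothetical function name, illustrative only):

```python
import numpy as np

def histogram_kl(original, compressed, bins=256, eps=1e-12):
    """KL divergence between the grey-level histograms of an original
    8-bit image and its lossy-compressed version (2-D numpy arrays).
    A small epsilon keeps empty bins from causing division by zero."""
    p, _ = np.histogram(original, bins=bins, range=(0, 256))
    q, _ = np.histogram(compressed, bins=bins, range=(0, 256))
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))
```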

2008
Chenlei Leng

Shi and Tsai (JRSSB, 2002) proposed an interesting residual information criterion (RIC) for model selection in regression. Their RIC was motivated by the principle of minimizing the Kullback-Leibler discrepancy between the residual likelihoods of the true and candidate model. We show, however, that under this principle RIC would always choose the full (saturated) model. The residual likelihood ther...

Journal: IEEE Transactions on Aerospace and Electronic Systems, 2007

Journal: IEEE Transactions on Automatic Control, 2021

This article proposes a novel information-theoretic joint probabilistic data association filter for tracking an unknown number of targets. The proposed algorithm is obtained by minimizing a weighted reverse Kullback–Leibler divergence to approximate the posterior Gaussian mixture probability density function. A theoretical analysis of the mean performance and error covariance under ideal detection is presented...
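As a one-dimensional illustration of reverse Kullback–Leibler approximation of a Gaussian mixture by a single Gaussian (not the paper's weighted, data-association-specific formulation; the names and the grid-based KL below are assumptions for the sketch):

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def mixture_pdf(x, weights, means, stds):
    """Density of a one-dimensional Gaussian mixture."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds))

def reverse_kl(params, weights, means, stds, grid):
    """Grid approximation of the reverse KL divergence KL(q || p), where q is
    a single Gaussian (mean and log-std in `params`) and p is the mixture."""
    mu, log_sigma = params
    q = norm.pdf(grid, mu, np.exp(log_sigma))
    p = mixture_pdf(grid, weights, means, stds)
    dx = grid[1] - grid[0]
    mask = q > 1e-12
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])) * dx)

# Reverse-KL fits are mode-seeking: the single Gaussian locks onto one
# mixture component instead of spreading over both.
weights, means, stds = [0.6, 0.4], [-2.0, 3.0], [1.0, 0.7]
grid = np.linspace(-10.0, 10.0, 2001)
fit = minimize(reverse_kl, x0=[-1.0, 0.0], args=(weights, means, stds, grid))
print("mean = %.2f, std = %.2f" % (fit.x[0], np.exp(fit.x[1])))
```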

Journal: Mathematical Inequalities & Applications, 2017

Chart: number of search results per year
