Search results for: kullback information
Number of results: 1,158,173
In this paper, we model discrete time series as discrete Markov processes of arbitrary order and derive the approximate distribution of the Kullback-Leibler divergence between a known transition probability matrix and its sample estimate. We introduce two new information-theoretic measurements: information memory loss and information codependence structure. The former measures the memory conten...
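The abstract above compares a known transition probability matrix with its sample estimate via the Kullback-Leibler divergence. A minimal sketch of that comparison, assuming a first-order chain and the common convention of weighting rows by the stationary distribution (the paper's exact definition and higher-order extension may differ):

```python
import numpy as np

def stationary_dist(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def empirical_transition(chain, n_states):
    """Row-normalized transition counts from an observed state sequence."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(chain[:-1], chain[1:]):
        C[a, b] += 1
    C += 1e-12  # guard against empty rows / log(0)
    return C / C.sum(axis=1, keepdims=True)

def markov_kl(P, Q):
    """KL rate between chains: sum_i pi_i sum_j P_ij log(P_ij / Q_ij)."""
    pi = stationary_dist(P)
    return float(np.sum(pi[:, None] * P * np.log(P / Q)))

# Simulate a chain from a known P, then estimate P from the sample.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
chain = [0]
for _ in range(5000):
    chain.append(rng.choice(2, p=P[chain[-1]]))
P_hat = empirical_transition(chain, 2)
print(markov_kl(P, P_hat))  # shrinks toward 0 as the chain grows
```

The divergence is zero when the estimate matches the true matrix exactly and decays as the sample grows, which is the quantity whose approximate distribution the paper derives.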
The convex divergence is used as a surrogate function for obtaining a class of ICA (Independent Component Analysis) algorithms called the f-ICA. The convex divergence is a superclass of the α-divergence, which in turn generalizes the Kullback-Leibler divergence and mutual information. Therefore, the f-ICA contains the α-ICA and the minimum mutual information ICA. In addition to theoretical int...
Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
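For the Rényi divergence discussed above, a minimal sketch for discrete distributions, using the standard definition D_α(p‖q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α), which recovers the Kullback-Leibler divergence in the limit α → 1 (distributions p and q here are illustrative, not from the paper):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha between discrete distributions.

    At alpha == 1 the formula is singular, so we return its limit,
    the Kullback-Leibler divergence sum_i p_i log(p_i / q_i).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
for a in (0.5, 1.0, 2.0):
    print(a, renyi_divergence(p, q, a))
```

One of the properties the survey reviews is monotonicity in the order: D_α(p‖q) is nondecreasing in α, which the three printed values illustrate.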
This paper presents a new method for characterizing information of a compressed image relative to the original one. We show how the Kullback-Leibler information gain is based on three basic postulates which are natural for image processing and thus desirable. As an example of the proposed measure, we analyze the effects of lossy compression on the identification of breast cancer microcalcificat...
Shi and Tsai (JRSSB, 2002) proposed an interesting residual information criterion (RIC) for model selection in regression. Their RIC was motivated by the principle of minimizing the Kullback-Leibler discrepancy between the residual likelihoods of the true and candidate model. We show, however, under this principle, RIC would always choose the full (saturated) model. The residual likelihood ther...
This article proposes a novel information-theoretic joint probabilistic data association filter for tracking an unknown number of targets. The proposed algorithm is obtained by minimizing a weighted reverse Kullback–Leibler divergence to approximate the posterior Gaussian mixture probability density function. A theoretical analysis of the mean performance and error covariance under ideal detection is presented...