Search results for: kullback information

Number of results: 1158173

Journal: Entropy 2016
Frank Nielsen Ke Sun

Information-theoretic measures such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte-Carlo stochastic integration, approximated, or bounded using vari...
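The Monte-Carlo estimation mentioned in this abstract can be sketched as follows: sample from the first mixture and average the log-density ratio. This is a minimal illustration, not the paper's method; the two univariate Gaussian mixtures and their parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixture_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture at the points x."""
    x = np.asarray(x)[..., None]  # shape (n, 1) to broadcast over components
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return comp @ weights

def sample_mixture(n, weights, means, stds):
    """Draw n samples: pick a component, then sample from it."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def kl_monte_carlo(n, p, q):
    """Estimate KL(p || q) = E_p[log p(X) - log q(X)] by sampling from p."""
    x = sample_mixture(n, *p)
    return np.mean(np.log(mixture_pdf(x, *p)) - np.log(mixture_pdf(x, *q)))

# Two hypothetical mixtures: (weights, means, standard deviations).
p = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.5, 0.5]))
q = (np.array([0.3, 0.7]), np.array([-0.5, 1.5]), np.array([0.7, 0.7]))
print(kl_monte_carlo(100_000, p, q))
```

The estimator is unbiased but has Monte-Carlo variance, which is exactly the cost the abstract alludes to; deterministic bounds or approximations trade that variance for bias.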

2003
Clifford M. Hurvich Chih-Ling Tsai

We develop a version of the Corrected Akaike Information Criterion (AICC) suitable for selection of an h-step-ahead linear predictor for a weakly stationary time series in discrete time. A motivation for this criterion is provided in terms of a generalized Kullback-Leibler information which is minimized at the optimal h-step predictor, and which is equivalent to the ordinary Kullback-Leibler in...

2003
Ritei Shibata

Estimation of Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure which, like AIC, is based on the likelihood principle. To discriminate between nested models, we have to estimate Kullback-Leibler information up to the order of a constant, while Kullback-Leibler information itself is of the order of the number of observations. A correction term empl...

2017

Dear Editor, The topic addressed by this brief communication is the quantification of diagnostic information and, in particular, a method of illustrating graphically the quantification of diagnostic information for binary tests. To begin, consider the following outline procedure for development of such a test. An appropriate indicator variable that will serve as a proxy for the actual variable ...

Journal: :EPL 2021

We define a new measure of causation from a fluctuation-response theorem for Kullback-Leibler divergences, based on the information-theoretic cost of perturbations. This information response has both the invariance properties required of an information-theoretic measure and a physical interpretation in terms of the propagation of perturbations. In linear systems, it reduces to the transfer entropy, providing a connection between Fisher and mutual information.

Journal: :TRANSACTIONS OF THE JAPAN SOCIETY OF MECHANICAL ENGINEERS Series B 1985

Journal: Signal Processing 2010
Abd-Krim Seghouane

The Akaike information criterion, AIC, and its corrected version, AICc are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback–Leibler information between the model generating the data and the approximating candidate model. In this paper, two new corrected variants of AIC are derived for the purpose of small sample linear...
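The AIC/AICc comparison described in this abstract can be illustrated with the standard least-squares formulas: AIC = n log(RSS/n) + 2p and AICc = AIC + 2p(p+1)/(n-p-1), where p counts the estimated parameters (regression coefficients plus the error variance). This is a sketch under the usual Gaussian-error assumption, not the corrected variants derived in the paper; the simulated data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def aic_aicc(y, X):
    """AIC and small-sample corrected AICc for a normal linear model y = X b + e."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    p = k + 1  # k coefficients plus the error variance
    aic = n * np.log(rss / n) + 2 * p
    aicc = aic + 2 * p * (p + 1) / (n - p - 1)
    return aic, aicc

# Small sample: the AICc correction penalises the larger model more heavily.
n = 20
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
X1 = np.column_stack([np.ones(n), x])               # true linear model
X2 = np.column_stack([np.ones(n), x, x**2, x**3])   # overparameterised model
print(aic_aicc(y, X1), aic_aicc(y, X2))
```

The correction term 2p(p+1)/(n-p-1) vanishes as n grows, which is why AIC and AICc agree in large samples but AICc guards against overfitting when n is small relative to p.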
