Search results for: kullback leibler distance

Number of results: 244274

Journal: ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal, 2014

Journal: IEEE Transactions on Automatic Control, 2015

Journal: IEEE Transactions on Information Theory, 2014

2006
Z. I. Botev, D. P. Kroese, T. Taimre

The cross-entropy and minimum cross-entropy methods are well-known Monte Carlo simulation techniques for rare-event probability estimation and optimization. In this paper we investigate how these methods can be extended to provide a general non-parametric cross-entropy framework based on φ-divergence distance measures. We show how the χ² distance in particular yields a viable alternative to Kull...
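The φ-divergences mentioned in this abstract form a family that includes both the Kullback-Leibler and the χ² distance. As a minimal illustration (not taken from the paper), both can be computed for discrete distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi2_divergence(p, q):
    """Pearson chi-squared divergence, another member of the phi-divergence family."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
```

Both divergences are zero exactly when `p == q` and positive otherwise, which is what makes either usable as the distance measure inside a cross-entropy-style scheme.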

2014
Sergey G. Bobkov, Gennadiy P. Chistyakov, Friedrich Götze

Berry–Esseen-type bounds for total variation and relative entropy distances to the normal law are established for the sums of non-i.i.d. random variables.

2004
Rui Gan, Jue Wu, Albert C. S. Chung, Simon C. H. Yu, William M. Wells

This paper extends our prior work on multi-modal image registration based on the a priori knowledge of the joint intensity distribution that we expect to obtain, and Kullback-Leibler distance. This expected joint distribution can be estimated from pre-aligned training images. Experimental results show that, as compared with the Mutual Information and Approximate Maximum Likelihood based registr...
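The registration criterion described here compares the joint intensity distribution observed during alignment against one estimated from pre-aligned training images. A rough sketch of that idea, with a toy 2-D histogram standing in for the paper's estimator (the image data and smoothing constant are illustrative assumptions):

```python
import numpy as np

def joint_histogram(a, b, bins=32):
    """2-D joint intensity histogram of two images, normalized to a probability table."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    h += 1e-8  # smoothing so the KL distance stays finite on empty bins
    return h / h.sum()

def kl_distance(p_obs, p_exp):
    """D(p_obs || p_exp): dissimilarity of the observed joint distribution
    from the expected one learned on pre-aligned training images."""
    return float(np.sum(p_obs * np.log(p_obs / p_exp)))

rng = np.random.default_rng(0)
fixed = rng.random((64, 64))
moving = fixed + 0.05 * rng.standard_normal((64, 64))

p_exp = joint_histogram(fixed, moving)                       # "training" distribution
p_aligned = joint_histogram(fixed, moving)                   # candidate pose: aligned
p_misaligned = joint_histogram(fixed, np.roll(moving, 8, axis=0))  # candidate pose: shifted
```

Minimizing `kl_distance` over candidate poses drives the observed joint distribution toward the expected one, so the aligned pose scores lower than the misaligned one.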

2016
Fouzi Harrou, Ying Sun, Muddu Madakyaru

Accurate and effective anomaly detection and diagnosis of modern engineering systems by monitoring processes ensure reliability and safety of a product while maintaining desired quality. In this paper, an innovative method based on Kullback-Leibler divergence for detecting incipient anomalies in highly correlated multivariate data is presented. We use a partial least square (PLS) method as a mo...

2006
Meral Candan Çetin, Aydin Erar

In this paper, the problem of variable selection in linear regression is considered. This problem involves choosing the most appropriate model from the candidate models. Variable selection criteria based on estimates of the Kullback-Leibler information are most common. Akaike’s AIC and bias corrected AIC belong to this group of criteria. The reduction of the bias in estimating the Kullback-Leib...
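AIC-based variable selection as described here, scoring every candidate subset and keeping the minimizer, can be sketched as follows (the synthetic data and the exhaustive search are illustrative assumptions, not the paper's setup):

```python
import numpy as np
from itertools import combinations

def aic_linear(X, y):
    """AIC of an OLS fit with Gaussian errors: n*log(RSS/n) + 2k, constants dropped."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    k = X.shape[1] + 1  # regression coefficients plus the error variance
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(2)
n = 200
X = rng.standard_normal((n, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_normal(n)  # only x0, x1 matter

# Exhaustive search over all non-empty subsets of the four candidate predictors.
best = min(
    (frozenset(s) for r in range(1, 5) for s in combinations(range(4), r)),
    key=lambda s: aic_linear(X[:, sorted(s)], y),
)
```

The `2k` penalty is what the abstract's bias-corrected variants refine: AIC estimates the Kullback-Leibler discrepancy of each candidate model, and the correction reduces the small-sample bias of that estimate.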

The purpose of this paper is to obtain the tracking interval for the difference of expected Kullback-Leibler risks of two models under a Type II hybrid censoring scheme. This interval helps us to evaluate the proposed models in comparison with each other. We derive a statistic which tracks the difference of expected Kullback-Leibler risks between maximum likelihood estimators of the distribution in two diff...

2009
Rudolf Kulhavý

The role of Kerridge inaccuracy, Shannon entropy and Kullback-Leibler distance in statistical estimation is shown for both discrete and continuous observations. The cases of data independence and regression-type dependence are considered in parallel. Pythagorean-like relations valid for probability distributions are presented and their importance for estimation under compressed data is indicated.
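The three quantities named in this abstract are tied together by the exact identity K(p, q) = H(p) + D(p || q): the inaccuracy of using q to describe data from p splits into the intrinsic entropy of p plus the KL penalty for the mismatch. A small check of that decomposition (the example distributions are arbitrary):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats."""
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log(p)))

def kl(p, q):
    """Kullback-Leibler distance D(p || q)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def kerridge_inaccuracy(p, q):
    """Kerridge inaccuracy K(p, q) = -sum_i p_i log q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(-np.sum(p * np.log(q)))

p = np.array([0.5, 0.25, 0.25])
q = np.array([0.4, 0.4, 0.2])
```

Since D(p || q) is non-negative, the identity also shows K(p, q) ≥ H(p), with equality exactly when the model matches the data distribution.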
