Search results for: kullback information

Number of results: 1,158,173

2001
Stephan Schulz

We model the learning of classifications as a combination of abstraction and class assignment. We discuss the problem of selecting the most suitable of multiple abstractions for this purpose. Weaker abstractions perform better on training sets, but typically do not generalize very well. Stronger abstractions often generalize better, but may fail to include important properties. We introduce the...

Journal: IEEE Trans. Information Theory, 1997
Qun Xie, Andrew R. Barron

Let $X^n = (X_1, \ldots, X_n)$ be a memoryless source with unknown distribution on a finite alphabet of size k. We identify the asymptotic minimax coding redundancy for this class of sources, and provide a sequence of asymptotically minimax codes. Equivalently, we determine the limiting behavior of the minimax relative entropy $\min_Q \max_P D(P_{X^n} \,\|\, Q_{X^n})$, where the maximum is over all independent and identically...
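The relative entropy $D(P \| Q)$ at the heart of this minimax redundancy result can be computed directly for discrete distributions. A minimal sketch (not from the paper; the distributions `p` and `q` are illustrative placeholders):

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(P || Q) in nats for discrete distributions.

    Assumes q > 0 wherever p > 0; terms with p == 0 contribute zero
    by the usual convention 0 * log(0/q) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Uniform vs. skewed distribution on a 4-letter alphabet
p = [0.25, 0.25, 0.25, 0.25]
q = [0.4, 0.3, 0.2, 0.1]
print(kl_divergence(p, q))
```

Note that $D(P \| Q)$ is nonnegative and zero only when the two distributions coincide, which is what makes it usable as a coding-redundancy measure.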

The purpose of this paper is to obtain the tracking interval for the difference of expected Kullback-Leibler risks of two models under the Type II hybrid censoring scheme. This interval helps us to evaluate the proposed models in comparison with each other. We derive a statistic which tracks the difference of expected Kullback-Leibler risks between maximum likelihood estimators of the distribution in two diff...

Journal: ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal, 2014

Journal: IEEE Transactions on Information Theory, 2014

2003
Soosan Beheshti, Munther A. Dahleh

We introduce a new method of model order selection: minimum description complexity (MDC). The approach is motivated by the Kullback-Leibler information distance. The method suggests choosing the model set for which the "model set relative entropy" is minimum. The proposed method is comparable with existing order estimation methods such as AIC and MDL. We elaborate on the advantages of MDC ...
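The AIC baseline mentioned here is itself an estimate of relative Kullback-Leibler risk: it penalizes each model's maximized log-likelihood by its parameter count. A minimal sketch of that comparison, using hypothetical log-likelihood values and model names (none of these numbers come from the paper):

```python
def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 * log L.

    Lower is better; AIC estimates relative expected KL risk
    between each candidate model and the true data distribution.
    """
    return 2 * k - 2 * log_likelihood

# Hypothetical fitted models: (maximized log-likelihood, parameter count)
models = {"AR(1)": (-120.3, 2), "AR(2)": (-119.8, 3)}

scores = {name: aic(ll, k) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)
print(scores, best)
```

Here the extra parameter of AR(2) buys too little likelihood to justify its penalty, so AIC prefers the smaller model; MDC, as the abstract describes, replaces this penalty logic with a "model set relative entropy" criterion.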

2011
Jessica Kasza, Patty Solomon

In this paper, we compare the performance of two methods for estimating Bayesian networks from data containing exogenous variables and random effects. The first method is fully Bayesian in which a prior distribution is placed on the exogenous variables, whereas the second method, which we call the residual approach, accounts for the effects of exogenous variables by using the notion of restrict...

2005
Pranesh Kumar, Andrew Johnson

A non-parametric symmetric measure of divergence belonging to the family of Csiszár's f-divergences is proposed. Its properties are studied and bounds in terms of some well-known divergence measures are obtained. An application to mutual information is considered. A parametric measure of information is also derived from the suggested non-parametric measure. A numerical illustration to comp...
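The abstract's proposed measure is not specified in this snippet, but the Jensen-Shannon divergence is a standard example of a symmetric, bounded member of the same Csiszár f-divergence family and illustrates the symmetry property being claimed. A sketch (the distributions are illustrative, not from the paper):

```python
import numpy as np

def kl(p, q):
    """D(P || Q) in nats; zero-probability terms of p contribute zero."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: symmetric and bounded by log 2 (in nats)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture midpoint
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.1, 0.4, 0.5]
print(jensen_shannon(p, q), jensen_shannon(q, p))  # equal, by symmetry
```

Unlike plain KL divergence, this measure is finite even when the supports differ, which is one reason symmetric f-divergences are attractive in applications such as mutual-information bounds.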

Journal: CoRR, 2016
Jacob S. Hunter, Nathan O. Hodas

Deep nonlinear models pose a challenge for fitting parameters due to the lack of knowledge of the hidden layer and the potentially non-affine relation between the initial and observed layers. In the present work we investigate the use of information-theoretic measures such as mutual information and Kullback-Leibler (KL) divergence as objective functions for fitting such models without knowledge of the h...
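Using KL divergence as a fitting objective, as this abstract proposes, can be sketched in miniature: fit the logits of a softmax model by gradient descent on $D(\text{target} \| \text{model})$, whose gradient with respect to the logits is simply `softmax(z) - target`. This toy setup (target values, learning rate, step count) is illustrative only, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

# Hypothetical target distribution the model should reproduce
target = np.array([0.7, 0.2, 0.1])

# Minimize D(target || softmax(z)) by gradient descent;
# the gradient with respect to z is softmax(z) - target.
z = rng.normal(size=3)
for _ in range(2000):
    z -= 0.5 * (softmax(z) - target)

print(np.round(softmax(z), 3))
```

Because this KL objective is convex in the logits, plain gradient descent drives the model distribution onto the target; deep models lose this convexity, which is the difficulty the paper addresses.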

2009
Jie Peng, Iadh Ounis

The application of query-independent features, such as PageRank, can boost the retrieval effectiveness of a Web Information Retrieval (IR) system. In some previous works, a query-independent feature is uniformly applied to all queries. Other works predict the most useful feature based on the query type. However, the accuracy of the current query type prediction methods is not high. In this pape...
