Search results for: kullback leibler distance

Number of results: 244274

2006
Chih-Yuan Tseng

Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as the selection criterion. Yet this raises two questions: why use this criterion, and are there other criteria? Moreover, conventional approaches require a reference prior, which is usually difficul...
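
As a hedged illustration of this idea (not the paper's own procedure), the Python sketch below ranks hypothetical candidate models by their Kullback-Leibler distance to an empirical distribution; the candidate distributions, sample data, and discretisation grid are all assumptions made for the example.

import numpy as np
from scipy.stats import norm, laplace

# Hypothetical data: pretend the observations came from a standard normal.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=5000)

# Discretise onto a common grid so the KL distance can be computed numerically.
edges = np.linspace(-6.0, 6.0, 121)
p_hat, _ = np.histogram(data, bins=edges, density=True)
p_hat = np.clip(p_hat * np.diff(edges), 1e-12, None)   # empirical cell probabilities

# Hypothetical candidate models to be ranked.
candidates = {"normal(0,1)": norm(0, 1), "laplace(0,1)": laplace(0, 1)}

def kl(p, q):
    # Discrete Kullback-Leibler distance D(p || q) in nats.
    return float(np.sum(p * np.log(p / q)))

scores = {}
for name, model in candidates.items():
    q = np.clip(model.cdf(edges[1:]) - model.cdf(edges[:-1]), 1e-12, None)
    scores[name] = kl(p_hat, q)

# Smaller KL distance = higher preference in the ranking.
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: D = {score:.4f}")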

2008
Yuefeng Wu Subhashis Ghosal

Abstract: Positivity of the prior probability of a Kullback-Leibler neighborhood around the true density, commonly known as the Kullback-Leibler property, plays a fundamental role in posterior consistency. A popular prior for Bayesian estimation is given by a Dirichlet mixture, where the kernels are chosen depending on the sample space and the class of densities to be estimated. The Kullback-Leib...
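
For readers unfamiliar with the terms, the standard definitions (not quoted from the abstract itself) are the Kullback-Leibler neighborhood of the true density $f_0$ and the Kullback-Leibler property of a prior $\Pi$:

\[
  K_\varepsilon(f_0) = \Bigl\{\, f : \int f_0 \log\frac{f_0}{f}\, d\mu < \varepsilon \,\Bigr\},
  \qquad
  \Pi\bigl(K_\varepsilon(f_0)\bigr) > 0 \quad \text{for every } \varepsilon > 0 .
\]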

2001
Charlotte M. Gruner Don H. Johnson

We have developed a method for quantifying neural response changes in terms of the Kullback-Leibler distance between the intensity functions for each stimulus condition. We use empirical histogram estimates to characterize the intensity function of the neural response. A critical factor in determining the histogram estimates is selection of binwidth. In this work we analytically derive the Kull...
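
A minimal sketch of this kind of analysis (assumptions throughout, not the authors' code): histogram estimates of the response intensity under two hypothetical stimulus conditions are compared via the Kullback-Leibler distance, and the result is seen to depend on the chosen binwidth.

import numpy as np

rng = np.random.default_rng(1)
duration = 1.0   # seconds of response analysed (hypothetical)

# Hypothetical spike times (in seconds) for two stimulus conditions.
spikes_a = rng.uniform(0.0, duration, size=800)                  # roughly flat intensity
spikes_b = np.concatenate([rng.normal(0.3, 0.05, size=400),      # peaked intensity
                           rng.uniform(0.0, duration, size=400)])
spikes_b = spikes_b[(spikes_b >= 0.0) & (spikes_b <= duration)]

def kl_from_histograms(x, y, binwidth):
    # KL distance (nats) between normalised histogram estimates of two intensities.
    edges = np.arange(0.0, duration + binwidth, binwidth)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = (p + 1e-9) / (p.sum() + 1e-9 * len(p))   # normalise, avoid log(0)
    q = (q + 1e-9) / (q.sum() + 1e-9 * len(q))
    return float(np.sum(p * np.log(p / q)))

# The estimate changes noticeably with the binwidth, which is the issue the paper studies.
for bw in (0.100, 0.050, 0.010, 0.002):
    print(f"binwidth {bw*1000:5.1f} ms -> D = {kl_from_histograms(spikes_a, spikes_b, bw):.3f}")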

Journal: Methods of Information in Medicine, 2014
Gareth Hughes

Dear Editor, The topic addressed by this brief communication is the quantification of diagnostic information and, in particular, a method of illustrating graphically the quantification of diagnostic information for binary tests. To begin, consider the following outline procedure for development of such a test. An appropriate indicator variable that will serve as a proxy for the actual variable ...
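
One common way to make this quantification concrete, sketched below in Python, is to measure the information supplied by a positive or negative result as the relative entropy between the post-test and pre-test distributions of disease status; the sensitivity, specificity, and prevalence values are hypothetical, and this is not necessarily the graphical method described in the letter.

import numpy as np

def relative_entropy(p, q):
    # D(p || q) in bits for two discrete distributions on the same support.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

sens, spec, prev = 0.90, 0.80, 0.10            # hypothetical test characteristics

pre_test = np.array([prev, 1 - prev])          # P(disease), P(no disease) before testing

p_pos = sens * prev + (1 - spec) * (1 - prev)               # P(test positive)
post_pos = np.array([sens * prev, (1 - spec) * (1 - prev)]) / p_pos
post_neg = np.array([(1 - sens) * prev, spec * (1 - prev)]) / (1 - p_pos)

info_pos = relative_entropy(post_pos, pre_test)             # information from a positive result
info_neg = relative_entropy(post_neg, pre_test)             # information from a negative result
expected_info = p_pos * info_pos + (1 - p_pos) * info_neg   # expected information (mutual information)

print(f"positive result: {info_pos:.3f} bits")
print(f"negative result: {info_neg:.3f} bits")
print(f"expected:        {expected_info:.3f} bits")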

Journal: Axioms, 2017
Dagmar Markechová

The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures. In particular, chain rules for mutual information of fuzzy partitions and for Kullback-Leibler divergence with respect to fuzzy P-measures are established. In addition, a convexity o...
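
For orientation, the classical (crisp) chain rules that these fuzzy-measure versions generalise read as follows in standard notation; they are textbook identities, not quoted from the paper:

\[
  D\bigl(p(x,y)\,\|\,q(x,y)\bigr) = D\bigl(p(x)\,\|\,q(x)\bigr) + D\bigl(p(y \mid x)\,\|\,q(y \mid x)\bigr),
  \qquad
  I(X; Y, Z) = I(X; Y) + I(X; Z \mid Y).
\]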

Journal: IEEE Access, 2022

We provide optimal lower and upper bounds for the augmented Kullback-Leibler divergence in terms of the total variation distance between two probability measures defined on Euclidean spaces of different dimensions. We call them refined Pinsker's and reverse Pinsker's inequalities, respectively.
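
For context, the classical Pinsker inequality relating the two quantities is given below; the paper's refined and reverse bounds for the augmented divergence are not reproduced here:

\[
  \|P - Q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D(P \,\|\, Q)} .
\]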
