Search results for: leibler distance

Number of results: 244184

Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2020

Journal: Pattern Recognition Letters 2005
Frans Coetzee

A frequent practice in feature selection is to maximize the Kullback-Leibler (K-L) distance between target classes. In this note we show that this common practice is often suboptimal, since it fails to account for the fact that classification occurs using a finite number of samples. In classification, the variance and higher order moments of the likelihood function should be taken into...
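To make the practice the abstract critiques concrete, here is a minimal sketch of KL-based feature scoring: each feature's class-conditional distributions are modeled as univariate Gaussians estimated from samples, and the feature with the largest K-L distance between classes is selected. The data, class means, and sample sizes are hypothetical, chosen only for illustration.

```python
import numpy as np

def kl_gaussian(mu1, var1, mu2, var2):
    """KL divergence KL(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

rng = np.random.default_rng(0)
# Two hypothetical classes with three features; feature 0 separates them best.
X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(200, 3))
X1 = rng.normal(loc=[2.0, 0.5, 0.1], scale=1.0, size=(200, 3))

# Score each feature by the KL distance between its estimated class-conditionals.
scores = [kl_gaussian(X0[:, j].mean(), X0[:, j].var(),
                      X1[:, j].mean(), X1[:, j].var()) for j in range(3)]
best = int(np.argmax(scores))
```

Note that the scores here are themselves finite-sample estimates; as the abstract argues, ranking features by this point estimate alone ignores its variance.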

2006
Young Kyung Lee, Byeong U. Park

Motivated from the bandwidth selection problem in local likelihood density estimation and from the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...
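The estimation problem this abstract studies can be illustrated with the simplest plug-in approach: estimate both densities by histograms on a shared grid and evaluate the KL formula on the bin frequencies. This is a crude sketch, not the local-likelihood estimator of the paper; the bin count and sample sizes are arbitrary choices.

```python
import numpy as np

def kl_histogram(x, y, bins=20):
    """Crude plug-in estimate of KL(P || Q) from samples x ~ P and y ~ Q,
    using a shared histogram; bins where either estimate is zero are skipped."""
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(1)
same = kl_histogram(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
shifted = kl_histogram(rng.normal(0, 1, 5000), rng.normal(1, 1, 5000))
```

The bin width here plays the same role as the bandwidth in the paper's setting: too few bins oversmooths the estimate, too many makes it noisy.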

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics 2007
Michele Tumminello, Fabrizio Lillo, Rosario N. Mantegna

We show that the Kullback-Leibler distance is a good measure of the statistical uncertainty of correlation matrices estimated by using a finite set of data. For correlation matrices of multivariate Gaussian variables we analytically determine the expected values of the Kullback-Leibler distance of a sample correlation matrix from a reference model and we show that the expected values are known ...
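For multivariate Gaussians the KL distance between a sample correlation matrix and a reference model has a closed form, which is presumably the quantity the authors analyze; a sketch under that assumption, with a hypothetical 2x2 reference model and sample size:

```python
import numpy as np

def kl_gaussian_cov(S1, S2):
    """KL divergence between zero-mean multivariate Gaussians N(0, S1) and N(0, S2):
    0.5 * (tr(S2^-1 S1) - n + log det S2 - log det S1)."""
    n = S1.shape[0]
    S2_inv = np.linalg.inv(S2)
    _, logdet1 = np.linalg.slogdet(S1)
    _, logdet2 = np.linalg.slogdet(S2)
    return float(0.5 * (np.trace(S2_inv @ S1) - n + logdet2 - logdet1))

# Sample correlation matrix estimated from a finite data set drawn from the model
rng = np.random.default_rng(2)
C_ref = np.array([[1.0, 0.5], [0.5, 1.0]])
L = np.linalg.cholesky(C_ref)
data = rng.standard_normal((500, 2)) @ L.T
C_hat = np.corrcoef(data, rowvar=False)
d = kl_gaussian_cov(C_hat, C_ref)
```

With 500 observations the distance is small but nonzero; shrinking the sample inflates it, which is the statistical-uncertainty effect the abstract describes.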

2018
Neda Lovričević, Ðilda Pečarić, Josip Pečarić

Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf-Mandelbrot law applied to various types of f-divergences and distances, such as the Kullback-Leibler divergence, Hellinger distance, Bhattacharyya distance (via its coefficient), [Formula: see text]-divergence, total variation ...
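The objects in this abstract are easy to compute directly: the Zipf-Mandelbrot law is a pmf over ranks, and each divergence is a finite sum. A minimal sketch, with arbitrary parameter choices for the two distributions:

```python
import numpy as np

def zipf_mandelbrot(N, q, s):
    """Zipf-Mandelbrot pmf on ranks 1..N: f(k) proportional to (k + q)^(-s)."""
    w = (np.arange(1, N + 1) + q) ** (-s)
    return w / w.sum()

def kl(p, q_):
    """Kullback-Leibler divergence between discrete distributions."""
    return float(np.sum(p * np.log(p / q_)))

def hellinger(p, q_):
    """Hellinger distance, in [0, 1]."""
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q_)) ** 2)))

def total_variation(p, q_):
    """Total variation distance, in [0, 1]."""
    return 0.5 * float(np.sum(np.abs(p - q_)))

p = zipf_mandelbrot(100, 2.0, 1.2)
q = zipf_mandelbrot(100, 2.0, 1.5)
```

All three are f-divergences for suitable convex f, which is what lets a single Jensen-type argument cover them at once.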

Journal: IAES International Journal of Artificial Intelligence (IJ-AI) 2020

Journal: CoRR 2017
Marc G. Bellemare, Ivo Danihelka, Will Dabney, Shakir Mohamed, Balaji Lakshminarayanan, Stephan Hoyer, Rémi Munos

The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated, among others, in ordinal regression and generative modelling. In th...

Journal: International Journal of Epidemiology 1999
W. C. Lee

BACKGROUND: To select a proper diagnostic test, it is recommended that the most specific test be used to confirm (rule in) a diagnosis, and the most sensitive test be used to establish that a disease is unlikely (rule out). These rule-in and rule-out concepts can also be characterized by the likelihood ratio (LR). However, previous papers discussed only the case of binary tests and assumed test ...
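The likelihood-ratio characterization for the binary-test case mentioned here has a standard form: LR+ = sensitivity / (1 - specificity), LR- = (1 - sensitivity) / specificity, applied to pre-test odds. A sketch with hypothetical test characteristics:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a binary diagnostic test."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Update disease probability: convert to odds, multiply by LR, convert back."""
    odds = pre_test_prob / (1.0 - pre_test_prob) * lr
    return odds / (1.0 + odds)

# Hypothetical test: sensitivity 0.90, specificity 0.95
lr_pos, lr_neg = likelihood_ratios(0.90, 0.95)
p_after_positive = post_test_probability(0.10, lr_pos)
```

A large LR+ makes the test good at ruling in disease (a positive result raises a 10% prior to about 67% here), while a small LR- makes it good at ruling out.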
