Search results for: divergence measure

Number of results: 390285

Journal: IEEE Access, 2021

In recent years, uncertain data clustering has become the subject of active research in many fields, for example, pattern recognition and machine learning. Nowadays, researchers have committed themselves to substituting traditional distance or similarity measures in existing centralized algorithms with new metrics in order to tackle uncertainty in the data. However, to perform clustering, representation plays an i...

2005
A. P. Majtey, P. W. Lamberti

Abstract: We discuss an alternative to relative entropy as a measure of distance between mixed quantum states. The proposed quantity is an extension to the realm of quantum theory of the Jensen-Shannon divergence (JSD) between probability distributions. The JSD has several interesting properties. It arises in information theory and, unlike the Kullback-Leibler divergence, it is symmetric, always ...
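For reference, the classical Jensen-Shannon divergence to which this abstract refers is usually defined, for probability distributions p and q, as (a standard textbook form, not quoted from the paper itself):

\[
\mathrm{JSD}(p,q) \;=\; \tfrac{1}{2}\,D_{\mathrm{KL}}\!\big(p \,\big\|\, m\big) + \tfrac{1}{2}\,D_{\mathrm{KL}}\!\big(q \,\big\|\, m\big),
\qquad m = \tfrac{1}{2}(p+q),
\qquad D_{\mathrm{KL}}(p\|q) = \sum_i p_i \log\frac{p_i}{q_i}.
\]

The quantum extension discussed here is commonly obtained by replacing the Shannon entropy underlying this expression with the von Neumann entropy \(S(\rho) = -\mathrm{Tr}\,\rho\log\rho\), giving, for density matrices \(\rho\) and \(\sigma\),

\[
\mathrm{QJSD}(\rho,\sigma) \;=\; S\!\left(\tfrac{\rho+\sigma}{2}\right) - \tfrac{1}{2}S(\rho) - \tfrac{1}{2}S(\sigma),
\]

up to conventions such as the base of the logarithm.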

Journal: CoRR, 2011
R. C. Venkatesan, Angel Plastino

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics subjected to the additive duality of generalized statistics (dual generalized K-Ld) is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pyth...
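As background (standard definitions, not taken from this paper): the Bregman divergence generated by a strictly convex, differentiable function F, and the q-deformed logarithm that underlies the Tsallis (generalized) Kullback-Leibler divergence, are

\[
B_F(x,y) \;=\; F(x) - F(y) - \langle \nabla F(y),\, x - y\rangle,
\qquad
\ln_q(x) \;=\; \frac{x^{1-q}-1}{1-q},
\]

with the generalized K-Ld commonly written as \(D_q(p\|r) = -\sum_i p_i \ln_q\!\big(r_i/p_i\big)\). The abstract's claim is that the dual form of this quantity (under the Tsallis additive duality \(q \to 2-q\)) coincides, up to a scaling factor, with a Bregman divergence.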

Journal: Journal of Machine Learning Research, 2003
Kari Torkkola

We present a method for learning discriminative feature transforms using as criterion the mutual information between class labels and transformed features. Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which allows us to make an efficient non-parametric implementation and requires no prior assumptions about cla...
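The quadratic divergence referred to here is, in its usual form (a general definition rather than a quotation from the paper),

\[
D_Q(p,q) \;=\; \int \big(p(x) - q(x)\big)^2 \, dx ,
\]

whose practical appeal is that, when p and q are modeled with Gaussian kernel (Parzen) density estimates, the integral can be evaluated in closed form, which is what makes an efficient non-parametric implementation possible.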

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2013
Christopher R. S. Banerji, Simone Severini, Andrew E. Teschendorff

A measure is derived to quantify directed information transfer between pairs of vertices in a weighted network, over paths of a specified maximal length. Our approach employs a general, probabilistic model of network traffic, from which the informational distance between dynamics on two weighted networks can be naturally expressed as a Jensen-Shannon divergence. Our network transfer entropy mea...
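A minimal sketch of the Jensen-Shannon comparison underlying such a measure is shown below; it assumes the traffic distributions out of a vertex have already been reduced to probability vectors, whereas the paper's construction aggregates over paths up to a specified maximal length. The function names and example matrices are illustrative, not from the paper.

```python
import numpy as np

def transition_matrix(w):
    """Row-normalize a weighted adjacency matrix into random-walk transition probabilities."""
    w = np.asarray(w, dtype=float)
    return w / w.sum(axis=1, keepdims=True)

def jsd(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions (base-2 logs, value in [0, 1])."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log2(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two hypothetical weighted networks on the same three vertices.
A = [[0, 2, 1], [2, 0, 3], [1, 3, 0]]
B = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]

# Compare the one-step "traffic" out of vertex 0 under each network's random-walk dynamics.
print(jsd(transition_matrix(A)[0], transition_matrix(B)[0]))
```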

1999
A. H. Lumpkin

We present the simultaneous measurement of beam divergence and source size based on the APS diagnostic undulator line. A 300-μm-thick Si(400) crystal monochromator is used to measure the divergence with a resolution down to 3 μrad (1 μrad with the third harmonic). X-rays transmitted through the crystal are simultaneously used by a pinhole camera to measure the beam size, at a resolution of abou...

2015
Hoang Vu Nguyen, Jilles Vreeken

Quantifying the difference between two distributions is a common problem in many machine learning and data mining tasks. What is also common in many tasks is that we only have empirical data. That is, we do not know the true distributions nor their form, and hence, before we can measure their divergence we first need to assume a distribution or perform estimation. For exploratory purposes this ...
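To make the problem concrete, below is a minimal sketch of the naive approach the abstract alludes to: assume (estimate) a distribution from the empirical data before measuring divergence. It is not the estimator proposed by the authors; the binning scheme and function name are illustrative assumptions.

```python
import numpy as np

def plugin_kl_from_samples(x, y, bins=20, eps=1e-9):
    """Naive plug-in estimate: histogram both samples on a shared grid, then
    compute the KL divergence between the two empirical distributions."""
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)   # samples from an unknown distribution P
y = rng.normal(0.5, 1.2, size=5000)   # samples from an unknown distribution Q
print(plugin_kl_from_samples(x, y))
```

The result of such a plug-in estimate depends heavily on the assumed binning, which is precisely the kind of choice the abstract flags as problematic for exploratory analysis.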

2014
Rajeev Kaushik, Rakesh K. Bajaj

In this paper, we propose an algorithm to find the optimal threshold value for denoising an image. A new cost function is designed to find the optimal threshold in every image. The cost function is based on the intuitionistic fuzzy divergence measure between the denoised image and the original image. In addition, the intuitionistic fuzzy entropy of the denoised image is added to the cost function. This is ...
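The sketch below only illustrates the overall search structure described in the abstract: denoise at each candidate threshold, score the result with a cost function, and keep the minimizer. The intuitionistic fuzzy divergence and entropy actually used in the paper are not reproduced; the cost here is a labeled placeholder, and all names and parameters are assumptions.

```python
import numpy as np

def denoise(coeffs, t):
    """Soft-threshold the coefficients at threshold t (illustrative denoising step)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def cost(original, denoised):
    """Placeholder cost: a squared-difference 'divergence' term plus a crude entropy
    penalty. The paper's cost instead combines an intuitionistic fuzzy divergence
    between the two images with the intuitionistic fuzzy entropy of the denoised
    image; neither is reproduced here."""
    divergence_term = float(np.mean((original - denoised) ** 2))
    p = np.abs(denoised) / (np.abs(denoised).sum() + 1e-12) + 1e-12
    entropy_term = float(-np.sum(p * np.log(p)))
    return divergence_term + 1e-3 * entropy_term

def optimal_threshold(coeffs, candidates):
    """Grid search: pick the threshold that minimizes the cost for this particular signal."""
    return min(candidates, key=lambda t: cost(coeffs, denoise(coeffs, t)))

rng = np.random.default_rng(1)
signal = rng.laplace(scale=1.0, size=10_000)          # sparse "image" coefficients
noisy = signal + rng.normal(scale=0.3, size=10_000)   # noisy observation
print(optimal_threshold(noisy, candidates=np.linspace(0.0, 1.0, 21)))
```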

2011
R. C. Venkatesan, A. Plastino

The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence....

Chart: number of search results per year (results can be filtered by publication year).