Search results for: leibler distance
Number of results: 244184
A class of statistical distance measures and their spectral counterparts are presented. They have strong physical foundations since they are based on the combinatorial law leading to Bose-Einstein statistics in statistical physics. It is shown that these distance measures are very closely related to the recently introduced Jensen-Shannon divergence measure. The Kullback-Leibler number is found ...
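As a hedged illustration of the relation mentioned above, the following Python sketch (not taken from the cited paper; p and q are made-up discrete probability vectors) computes the Kullback-Leibler divergence and shows how the Jensen-Shannon divergence is assembled from KL terms against the mixture distribution.

import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions, in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrized, bounded combination of KL terms."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)                 # mixture distribution
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]
print(kl_divergence(p, q), js_divergence(p, q))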
Window profiles of amino acids in protein sequences are used to describe the amino acid environment. The relative entropy or Kullback-Leibler distance derived from these profiles is used as a measure of dissimilarity for comparison of amino acids and secondary structure conformations. Distance matrices of amino acid pairs at different conformations are obtained, which display a non-negligible d...
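A minimal sketch of that use of relative entropy, under simplifying assumptions (randomly sampled 20-dimensional frequency vectors stand in for real window profiles, and scipy.stats.entropy supplies the Kullback-Leibler term); the symmetrization is one common way to turn the divergence into a dissimilarity measure, not necessarily the one used in the cited work.

import numpy as np
from scipy.stats import entropy   # entropy(p, q) returns D(p || q)

rng = np.random.default_rng(0)

# Hypothetical window profiles: relative frequencies of the 20 amino acids
profile_a = rng.dirichlet(np.ones(20))
profile_b = rng.dirichlet(np.ones(20))

# Symmetrized KL distance between the two amino-acid environments
d = 0.5 * (entropy(profile_a, profile_b) + entropy(profile_b, profile_a))
print(f"symmetrized KL distance: {d:.4f}")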
The concept of information distance in a non-commutative setting is reconsidered. Additive information, such as the Kullback-Leibler divergence, is defined using a convex functional whose gradient has the property of a homomorphism between multiplicative and additive subgroups. We review several geometric properties, such as the logarithmic law of cosines, the Pythagorean theorem and a lower bound given b...
We provide optimal lower and upper bounds for the augmented Kullback-Leibler divergence in terms of the total variation distance between two probability measures defined on Euclidean spaces having different dimensions. We call them refined Pinsker's and reverse Pinsker's inequalities, respectively.
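For orientation, the classical Pinsker inequality bounds the total variation distance by the Kullback-Leibler divergence; the refined and reverse bounds referred to in the abstract above are not reproduced here.

\[
\operatorname{TV}(P,Q) \;=\; \sup_{A}\,\lvert P(A)-Q(A)\rvert
\;\le\; \sqrt{\tfrac{1}{2}\,D_{\mathrm{KL}}(P\,\Vert\,Q)} .
\]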
Numerous divergence measures (spectral distance, cepstral distance, difference of the cepstral coefficients, Kullback-Leibler divergence, distance given by the General Likelihood Ratio, distance defined by the Recursive Bayesian Changepoint Detector and the Mahalanobis measure) are compared in this study. The measures are used for detection of abrupt spectral changes in synthetic AR signals via...
This paper takes a strongly geometrical approach to the Fisher distance, which is a measure of dissimilarity between two probability distribution functions. The Fisher distance, as well as other divergence measures, is also used in many applications to establish a proper data average. The main purpose is to widen the range of possible interpretations and relations of the Fisher distance and its a...
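A standard worked example of this geometric reading, stated here only as an illustration and not necessarily in the form used by the truncated abstract above, is the univariate Gaussian family parametrized by (\mu, \sigma): the Fisher information metric is a scaled hyperbolic half-plane metric, which yields a closed-form Fisher distance

\[
ds^2 \;=\; \frac{d\mu^2 + 2\,d\sigma^2}{\sigma^2},
\qquad
d_F\big((\mu_1,\sigma_1),(\mu_2,\sigma_2)\big)
\;=\; \sqrt{2}\,\operatorname{arccosh}\!\left(
1 + \frac{(\mu_1-\mu_2)^2/2 + (\sigma_1-\sigma_2)^2}{2\,\sigma_1\sigma_2}\right).
\]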
Clustering short texts is a difficult task in itself, and the narrow-domain characteristic poses an additional challenge for current clustering methods. We addressed this problem by using a new measure of distance between documents based on the symmetric Kullback-Leibler distance. Although this measure is commonly used to calculate a distance between two probability d...
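A minimal sketch, not the authors' implementation, of a symmetric Kullback-Leibler distance between two short texts: term distributions are built over the joint vocabulary with add-one (Laplace) smoothing so that zero frequencies do not make the divergence infinite; the tokenization and smoothing choices here are simplifying assumptions.

from collections import Counter
import math

def term_distribution(text, vocabulary):
    counts = Counter(text.lower().split())
    total = sum(counts[w] + 1 for w in vocabulary)          # add-one smoothing
    return {w: (counts[w] + 1) / total for w in vocabulary}

def symmetric_kl(doc_a, doc_b):
    vocabulary = set(doc_a.lower().split()) | set(doc_b.lower().split())
    p = term_distribution(doc_a, vocabulary)
    q = term_distribution(doc_b, vocabulary)
    kl_pq = sum(p[w] * math.log(p[w] / q[w]) for w in vocabulary)
    kl_qp = sum(q[w] * math.log(q[w] / p[w]) for w in vocabulary)
    return 0.5 * (kl_pq + kl_qp)

print(symmetric_kl("kullback leibler distance for clustering",
                   "clustering short texts with a divergence measure"))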
With more and more databases becoming available on the Internet, there is a growing opportunity to globalise knowledge discovery and learn general patterns, rather than restricting learning to specific databases from which the rules may not be generalisable. Clustering of distributed databases facilitates learning of new concepts that characterise common features of, and differences between...