Search results for: divergence measures
Number of results: 399991
We introduce a new divergence measure, the bounded Bhattacharyya distance (BBD), for quantifying the dissimilarity between probability distributions. BBD is based on the Bhattacharyya coefficient (fidelity), and is symmetric, positive semi-definite, and bounded. Unlike the Kullback-Leibler divergence, BBD does not require probability density functions to be absolutely continuous with respect t...
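The snippet does not reproduce the BBD formula itself, so the sketch below (our own illustration) only computes the Bhattacharyya coefficient the measure builds on, the classical unbounded distance derived from it, and one well-known bounded, symmetric function of the coefficient, the squared Hellinger distance; the example distributions are arbitrary.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient (fidelity): BC = sum_i sqrt(p_i * q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(np.sqrt(p * q))

def bhattacharyya_distance(p, q):
    """Classical Bhattacharyya distance -ln(BC); unbounded as BC -> 0."""
    return -np.log(bhattacharyya_coefficient(p, q))

def hellinger_sq(p, q):
    """Squared Hellinger distance 1 - BC: one bounded, symmetric function of BC.
    (The BBD of the abstract is a different bounded function of BC, not given here.)"""
    return 1.0 - bhattacharyya_coefficient(p, q)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(bhattacharyya_coefficient(p, q))            # close to 1 for similar distributions
print(bhattacharyya_distance(p, q), hellinger_sq(p, q))
```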
A divergence function measures how different two points are in a base space. Well-known examples are the Kullback-Leibler divergence and f-divergence, which are defined in a manifold of probability distributions. The Bregman divergence is used in a more general situation. The present paper characterizes the geometrical structure which a divergence function gives, and proves that the f-divergence...
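A minimal sketch of the two families named above, assuming discrete distributions with strictly positive entries: a generic f-divergence D_f(P||Q) = Σᵢ qᵢ f(pᵢ/qᵢ), the Kullback-Leibler divergence as the special case f(t) = t ln t, and the Bregman divergence D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩. With negative entropy as the generator, the Bregman divergence on the simplex reduces to KL, so the two printed values agree.

```python
import numpy as np

def f_divergence(p, q, f):
    """Generic f-divergence: D_f(P||Q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(q * f(p / q))

kl = lambda p, q: f_divergence(p, q, lambda t: t * np.log(t))  # f(t) = t ln t -> KL

def bregman(x, y, F, grad_F):
    """Bregman divergence: D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Negative entropy as generator; its Bregman divergence equals KL on the simplex.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p, q = np.array([0.5, 0.3, 0.2]), np.array([0.4, 0.4, 0.2])
print(kl(p, q), bregman(p, q, neg_entropy, grad_neg_entropy))  # identical values
```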
Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central operation that appears in most of these areas is to measure the difference between two multivariate Gaussians. Unfortunately, traditional measures based on the Kullback–Leibler (KL) divergence and the Bhattacharyya distance do not satisfy all metric axioms necessary for many algorithms. In this ...
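The standard closed form for the KL divergence between two multivariate Gaussians, KL(N₀||N₁) = ½[tr(Σ₁⁻¹Σ₀) + (μ₁−μ₀)ᵀΣ₁⁻¹(μ₁−μ₀) − k + ln(det Σ₁/det Σ₀)], makes the abstract's point concrete: the sketch below (our own example values) prints two different numbers for the two argument orders, so KL already fails the symmetry axiom of a metric.

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
# Asymmetry: swapping the arguments changes the value.
print(kl_gauss(mu0, S0, mu1, S1), kl_gauss(mu1, S1, mu0, S0))
```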
A population of bluegill sunfish, Lepomis macrochirus, was introduced into three man-made ponds in 1966. Analyses of these ponds in 1984 and 1985 found significant mtDNA divergence without nuclear gene differentiation. The difference between nuclear gene and mtDNA measures of interpopulational divergence was very large and suggests that sexual asymmetries in life histories may be important cons...
We characterise equality cases in the matrix Hölder inequality and develop a divergence formulation of optimal transport of vector measures. As an application, we reprove the representation formula for measures in the polar cone to monotone maps. We generalise the last result to a wide class of cones, including cones tangent to the unit ball of the space of differentiable functions and of Sobolev spaces.
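The snippet does not state which form of the matrix Hölder inequality is meant; one common trace/Schatten-norm statement, given here only for orientation, is:

```latex
% Trace (Schatten) form of the matrix H\"older inequality,
% with 1/p + 1/q = 1 and \sigma_i(A) the singular values of A:
\[
  \lvert \operatorname{tr}(A^{*}B) \rvert \;\le\; \|A\|_{S_p}\,\|B\|_{S_q},
  \qquad
  \|A\|_{S_p} = \Bigl(\sum_i \sigma_i(A)^{p}\Bigr)^{1/p}.
\]
```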
Many fuzzy information and divergence measures exist in the literature of information theory. Inequalities play an important role in finding relations among them. Here, we introduce some new inequalities and their applications to pattern recognition. We also establish relations between well-known measures with the help of the f-divergence measure and Jensen's inequality.
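As a one-step illustration of how Jensen's inequality yields relations for f-divergences (the specific inequalities of the paper are not given in the snippet): for a convex generator f with f(1) = 0,

```latex
% Jensen's inequality, E[f(X)] >= f(E[X]) for convex f, applied to X = p_i/q_i
% under the weights q_i, gives the basic non-negativity of any f-divergence:
\[
  D_f(P \,\|\, Q) \;=\; \sum_i q_i\, f\!\Bigl(\frac{p_i}{q_i}\Bigr)
  \;\ge\; f\!\Bigl(\sum_i q_i \cdot \frac{p_i}{q_i}\Bigr) \;=\; f(1) \;=\; 0 .
\]
```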
Numerous divergence measures (spectral distance, cepstral distance, difference of the cepstral coefficients, Kullback-Leibler divergence, distance given by the General Likelihood Ratio, distance defined by the Recursive Bayesian Changepoint Detector and the Mahalanobis measure) are compared in this study. The measures are used for detection of abrupt spectral changes in synthetic AR signals via...
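Of the measures listed, the spectral distance is the easiest to sketch. The code below (our own parameter choices, not the study's setup) evaluates the power spectral density of an AR model, S(ω) = σ²/|A(e^{jω})|², and the RMS log-spectral distance in dB between two AR(2) spectra that differ in one coefficient, mimicking an abrupt spectral change.

```python
import numpy as np

def ar_psd(a, sigma2=1.0, n_freq=512):
    """PSD of an AR model with polynomial A(z) = 1 + a_1 z^-1 + ... + a_p z^-p."""
    w = np.linspace(0.0, np.pi, n_freq)
    k = np.arange(1, len(a) + 1)
    A = 1.0 + np.exp(-1j * np.outer(w, k)) @ np.asarray(a, float)
    return sigma2 / np.abs(A) ** 2

def log_spectral_distance(s1, s2):
    """RMS log-spectral distance (in dB) between two sampled power spectra."""
    d = 10.0 * np.log10(s1 / s2)
    return np.sqrt(np.mean(d ** 2))

# Two stable AR(2) models differing in one coefficient (a simulated abrupt change).
before = ar_psd([-1.5, 0.7])
after = ar_psd([-1.3, 0.7])
print(log_spectral_distance(before, after))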
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...
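The canonical example of an inequality of this type, connecting information divergence with the total variation (L1) discrimination measure, is Pinsker's inequality; whether it is among those improved in the paper is not stated in the snippet:

```latex
% Pinsker's inequality (KL divergence in nats, discrete distributions):
\[
  D(P \,\|\, Q) \;\ge\; \frac{1}{2} \Bigl(\sum_i \lvert p_i - q_i \rvert\Bigr)^{2}.
\]
```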
One of the important components of an image retrieval system is the distance measure used to rank two objects by similarity. In this paper, several distance measures were investigated to implement a foliage plant retrieval system. Sixty kinds of foliage plants with various leaf colors and shapes were used to test the performance of 7 different kinds of distance measures: city block distance, Eucli...
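Only two of the seven measures are visible before the snippet is cut off; a minimal sketch of those two, applied to hypothetical leaf-feature vectors of our own invention:

```python
import numpy as np

def city_block(x, y):
    """City block (Manhattan, L1) distance between feature vectors."""
    return np.sum(np.abs(x - y))

def euclidean(x, y):
    """Euclidean (L2) distance between feature vectors."""
    return np.sqrt(np.sum((x - y) ** 2))

# Hypothetical leaf descriptors (e.g., color/shape features), for illustration only.
a = np.array([0.2, 0.7, 0.1, 0.4])
b = np.array([0.3, 0.5, 0.2, 0.4])
for name, fn in [("city block", city_block), ("euclidean", euclidean)]:
    print(name, fn(a, b))
```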