Search results for: divergence measure

Number of results: 390,285

Journal: IEEE Trans. Signal Processing, 2003
Yun He, A. Ben Hamza, Hamid Krim

Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this paper, we define a new generalized divergence measure, namely, the Jensen–Rényi divergence. Some properties such as convexity and its upper bound are derived. Based on the Jensen–Rényi divergence, we propose a new approach to the problem of image registration. Some appealing...
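
For concreteness, here is a minimal Python sketch of the Jensen–Rényi divergence as it is usually defined (the Rényi entropy of the weighted mixture minus the weighted sum of the individual Rényi entropies); the function names, the uniform-weight default, and the example distributions are illustrative and not taken from the paper.

import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha (alpha > 0, alpha != 1) of a discrete distribution.
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def jensen_renyi_divergence(dists, alpha, weights=None):
    # JR_alpha = R_alpha(sum_i w_i p_i) - sum_i w_i R_alpha(p_i);
    # nonnegative for alpha in (0, 1] by concavity of the Rényi entropy.
    dists = [np.asarray(p, dtype=float) for p in dists]
    n = len(dists)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    mixture = sum(wi * pi for wi, pi in zip(w, dists))
    return renyi_entropy(mixture, alpha) - sum(
        wi * renyi_entropy(pi, alpha) for wi, pi in zip(w, dists))

# Example: divergence between two discrete distributions at alpha = 0.5
p = [0.1, 0.4, 0.5]
q = [0.6, 0.3, 0.1]
print(jensen_renyi_divergence([p, q], alpha=0.5))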

2006
Martin Schweizer

Let X be a continuous adapted process for which there exists an equivalent local martingale measure (ELMM). The minimal martingale measure P̂ is the unique ELMM for X with the property that local P-martingales strongly orthogonal to the P-martingale part of X are also local P̂-martingales. We prove that if P̂ exists, it minimizes the reverse relative entropy H(P|Q) over all ELMMs Q for X. A co...
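
For reference, the relative entropy appearing here is the standard one; in LaTeX, with "reverse" referring to the real-world measure P sitting in the first argument while the minimization runs over the second:

H(P \mid Q) =
\begin{cases}
\mathbb{E}_P\!\left[ \log \dfrac{\mathrm{d}P}{\mathrm{d}Q} \right], & P \ll Q, \\
+\infty, & \text{otherwise.}
\end{cases}

This contrasts with the minimal entropy martingale measure, which instead minimizes H(Q|P) over the same set of ELMMs.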

Journal: Entropy, 2014
Takafumi Kanamori

Divergence is a discrepancy measure between two objects, such as functions, vectors, matrices, and so forth. In particular, divergences defined on probability distributions are widely employed in probabilistic forecasting. As a dissimilarity measure, a divergence should satisfy certain conditions. In this paper, we consider two conditions: The first one is the scale-invariance property and the...

2009
Dominik Schnitzer, Arthur Flexer, Gerhard Widmer

We present a filter-and-refine method to speed up acoustic audio similarity queries that use the Kullback-Leibler divergence as the similarity measure. The proposed method rescales the divergence and uses a modified FastMap [1] implementation to accelerate nearest-neighbor queries. The search for similar music pieces is accelerated by a factor of 10–30 compared to a linear scan but still offers hi...
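
Systems of this kind commonly model each track as a single Gaussian over its MFCC frames and compare tracks with a symmetrized Kullback-Leibler divergence, which has a closed form for Gaussians. The sketch below shows that closed form; the single-Gaussian modelling assumption and the function names are illustrative, not details taken from the paper.

import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    # Closed-form KL(N(mu0, cov0) || N(mu1, cov1)) for multivariate Gaussians.
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def symmetric_kl(mu0, cov0, mu1, cov1):
    # Symmetrized KL divergence, a common track-to-track dissimilarity measure.
    return kl_gaussian(mu0, cov0, mu1, cov1) + kl_gaussian(mu1, cov1, mu0, cov0)

# Example with hypothetical per-track MFCC statistics (mean vector and covariance).
mu_a, cov_a = np.zeros(3), np.eye(3)
mu_b, cov_b = np.ones(3), 2.0 * np.eye(3)
print(symmetric_kl(mu_a, cov_a, mu_b, cov_b))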

2005
Inder Jeet Taneja

The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen–Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and the arithmetic–geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as: Hellinger's discr...
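
For concreteness, the two classical measures named here have the standard forms sketched below (the J-divergence as the symmetrized Kullback-Leibler divergence, and the Jensen–Shannon divergence via the mixture distribution); this is a generic sketch, not Taneja's one-parameter generalized family.

import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence between two discrete distributions (0 log 0 treated as 0).
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def j_divergence(p, q):
    # Jeffreys' J-divergence: KL(p || q) + KL(q || p).
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    # Jensen-Shannon divergence: average KL of p and q to their midpoint m = (p + q) / 2.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.1, 0.4, 0.5]
q = [0.6, 0.3, 0.1]
print(j_divergence(p, q), jensen_shannon(p, q))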

A. Raj Mishra, G. Sisodia, K. Raj Pardasani, K. Sharma

Global challenges and the rapid growth of information technologies compel organizations to change their ways constantly. Today, organizations need IT personnel who make a difference through creative ideas and who keep pace with rapid change. Since the evaluation of IT personnel selection (ITPS) involves multiple alternatives and criteria, IT personnel selectio...

2001
Yun He, A. Ben Hamza, Hamid Krim

Entropy-based divergence measures have shown promising results in many areas of engineering and image processing. In this paper, a generalized information-theoretic measure called the Jensen–Rényi divergence is proposed. Some properties such as convexity and its upper bound are derived. Using the Jensen–Rényi divergence, we propose a new approach to the problem of ISAR (Inverse Synthetic Aperture R...

M. Khodabin, M. Khounsiavash, R. Kazemi Matin

In conventional data envelopment analysis (DEA), internal sub-processes of the production units are ignored. The current paper develops a network-DEA super-efficiency model to compare the performance of efficient network systems. A new ranking method is developed by aggregating the computed super-efficiency scores with a J-divergence measure. The proposed approach is then applied to evaluate...

Chart: number of search results per year