Search results for: divergence measures

Number of results: 399991

1997
A. Eskin S. Mozes N. Shah

Let G and H ⊂ G be connected reductive real algebraic groups defined over Q, and admitting no nontrivial Q-characters. Let Γ ⊂ G(Q) be an arithmetic lattice in G, and π : G → Γ\G be the natural quotient map. Let μH denote the H-invariant probability measure on the closed orbit π(H). Suppose that π(Z(H)) is compact, where Z(H) denotes the centralizer of H in G. We prove that the set {μH · g : g ∈...

2014
Miguel A. Ré Rajeev K. Azad

Entropy-based measures have been used frequently in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence, or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of the properties it shares with families of other divergence measures and its interpretability in different domains including statistical physics, information theo...
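
As a quick illustration (a minimal sketch, not the authors' implementation; it assumes NumPy and uses illustrative function names), the JSD of two symbol-frequency distributions is the symmetrized, smoothed KL divergence against their average:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (terms with p_i = 0 contribute 0)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jensen_shannon(p, q):
    """JSD(p, q) = D(p || m)/2 + D(q || m)/2 with m = (p + q)/2; bounded by 1 bit."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: nucleotide frequencies (A, C, G, T) in two sequence windows.
p = [0.30, 0.20, 0.25, 0.25]
q = [0.10, 0.40, 0.35, 0.15]
print(jensen_shannon(p, q))
```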

2010
Uri Heinemann Naftali Tishby

Our goal in this work is to detect anomalies in networks. As networks change over time, we would like to detect the time points at which the network behaves anomalously. For example, consider an e-mail network: the email addresses are represented by vertices, while the edges' weights are functions of the frequency of correspondence between any two addresses...
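
The abstract is truncated, but a common divergence-based variant of this idea (sketched here as an assumption, not the authors' method) scores each time point by the Jensen-Shannon distance between the edge-weight distributions of consecutive network snapshots:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon  # JS *distance* (sqrt of JSD)

def snapshot_distribution(weights, n_bins=20, w_max=1.0):
    """Histogram of edge weights in one network snapshot, normalized to sum to 1."""
    hist, _ = np.histogram(weights, bins=n_bins, range=(0.0, w_max))
    return hist / max(hist.sum(), 1)

def anomaly_scores(snapshots):
    """JS distance between consecutive snapshots; spikes suggest anomalous time points."""
    dists = [snapshot_distribution(s) for s in snapshots]
    return [jensenshannon(dists[t], dists[t + 1], base=2)
            for t in range(len(dists) - 1)]

# Synthetic example: weekly edge weights; the last week has an unusual traffic pattern.
rng = np.random.default_rng(0)
weeks = [rng.beta(2, 5, size=500) for _ in range(3)] + [rng.beta(5, 2, size=500)]
print(anomaly_scores(weeks))  # the final score should stand out
```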

2007
Joseph E. Cavanaugh

Model selection criteria frequently arise from constructing estimators of discrepancy measures used to assess the disparity between the 'true' model and a fitted approximating model. The Akaike (1973) information criterion and its variants result from utilizing Kullback's (1968) directed divergence as the targeted discrepancy. The directed divergence is an asymmetric measure of separation between ...
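
For reference, the quantities the abstract names, in standard notation (f is the true density, g_θ the fitted approximating density with k parameters):

```latex
% Kullback's directed divergence (asymmetric) from f to g_theta:
\[ I(f, g_\theta) = \int f(x) \log \frac{f(x)}{g_\theta(x)} \, dx \]
% Symmetrizing the two directed divergences gives Kullback's J-divergence:
\[ J(f, g_\theta) = I(f, g_\theta) + I(g_\theta, f) \]
% AIC estimates the expected directed divergence up to a constant:
\[ \mathrm{AIC} = -2 \log L(\hat\theta) + 2k \]
```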

Journal: Neural Networks: The Official Journal of the International Neural Network Society, 2011
Kazuho Watanabe Masato Okada Kazushi Ikeda

The local variational method is a technique to approximate an intractable posterior distribution in Bayesian learning. This article formulates a general framework for local variational approximation and shows that its objective function is decomposable into the sum of the Kullback information and the expected Bregman divergence from the approximating posterior distribution to the Bayesian poste...
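
The Bregman divergence mentioned here has the standard form (F a differentiable convex function):

```latex
% Bregman divergence generated by a differentiable convex F:
\[ B_F(p \,\|\, q) = F(p) - F(q) - \langle \nabla F(q), \, p - q \rangle \]
% With F the negative entropy, F(p) = sum_i p_i log p_i, this reduces on the
% probability simplex to the Kullback-Leibler divergence sum_i p_i log(p_i / q_i).
```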

Journal: CoRR, 2011
Inder Jeet Taneja

In 1938, Gini [3] studied a mean having two parameters. Later, many authors studied properties of this mean. It contains as particular cases such famous means as the harmonic, geometric, and arithmetic means. It also contains the power mean of order r and the Lehmer mean as particular cases. In this paper we have considered inequalities arising from the Gini mean and Heron's mean, and improved them based...
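
For orientation, the two means in question, stated in their standard two-variable forms:

```latex
% Gini's two-parameter mean of a, b > 0, for r != s:
\[ G_{r,s}(a, b) = \left( \frac{a^r + b^r}{a^s + b^s} \right)^{1/(r-s)} \]
% s = 0 gives the power mean of order r (hence the arithmetic and harmonic means);
% s = r - 1 gives the Lehmer mean.
% Heron's mean:
\[ H(a, b) = \frac{a + \sqrt{ab} + b}{3} \]
```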

2003
Christopher M. Kreucher Keith Kastella Alfred O. Hero

This paper presents a sensor management scheme based on maximizing the expected Rényi Information Divergence at each sample, applied to the problem of tracking multiple targets. The underlying tracking methodology is a multiple target tracking scheme based on recursive estimation of a Joint Multitarget Probability Density (JMPD), which is implemented using particle filtering methods. This Bayes...
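
As a toy sketch of the selection rule (a generic discrete Rényi divergence, not the paper's particle-filter JMPD machinery; the action names and distributions are hypothetical):

```python
import numpy as np

def renyi_divergence(p, q, alpha=0.5):
    """Renyi divergence D_alpha(p || q) = log(sum p^a q^(1-a)) / (a - 1), a != 1.

    As alpha -> 1 it approaches the Kullback-Leibler divergence.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = (p > 0) & (q > 0)
    return float(np.log(np.sum(p[mask] ** alpha * q[mask] ** (1 - alpha)))
                 / (alpha - 1))

# Sensor management step: choose the action whose predicted posterior over
# target cells diverges most from the current (prior) density.
prior = np.array([0.25, 0.25, 0.25, 0.25])
posteriors = {"point_left":  np.array([0.70, 0.10, 0.10, 0.10]),
              "point_right": np.array([0.10, 0.10, 0.20, 0.60])}
best = max(posteriors, key=lambda a: renyi_divergence(posteriors[a], prior))
print(best)  # -> "point_left", the more informative action here
```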

2012
Hassan Ezzaidi Jean Rouat

Recently, much research has been conducted to identify pertinent parameters and adequate models for automatic music genre classification. In this paper, two measures based upon information-theoretic concepts are investigated for mapping the feature space to the decision space. A Gaussian Mixture Model (GMM) is used as a baseline and reference system. Various strategies are proposed for training and te...
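
A minimal sketch of a per-genre GMM baseline of the kind described, assuming scikit-learn; the feature matrices here are synthetic placeholders, not the paper's audio features:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Rows are per-frame feature vectors (e.g. 13 MFCCs); data is a synthetic stand-in.
rng = np.random.default_rng(42)
train = {"classical": rng.normal(0.0, 1.0, size=(500, 13)),
         "rock":      rng.normal(1.5, 1.2, size=(500, 13))}

# One GMM per genre, the usual baseline setup.
models = {g: GaussianMixture(n_components=8, covariance_type="diag",
                             random_state=0).fit(x)
          for g, x in train.items()}

def classify(frames):
    """Assign the genre whose GMM gives the highest average log-likelihood."""
    return max(models, key=lambda g: models[g].score(frames))

test = rng.normal(1.5, 1.2, size=(200, 13))
print(classify(test))  # -> "rock": the test frames fit that genre's model best
```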

Journal: Canadian Mathematical Bulletin, 2022

Abstract We investigate the weighted $L_p$ affine surface areas which appear in the recently established Steiner formula of the Brunn–Minkowski theory. We show that they are valuations on the set of convex bodies and prove isoperimetric inequalities for them. We show that they are related to the f-divergences of the cone measures of the body and its polar, namely the Kullback–Leibler divergence and the Rényi divergence.
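
For reference, the standard definitions behind the last sentence (P, Q measures with density dP/dQ; f convex with f(1) = 0):

```latex
% f-divergence of P from Q:
\[ D_f(P \,\|\, Q) = \int f\!\left( \frac{dP}{dQ} \right) dQ \]
% f(t) = t log t recovers the Kullback-Leibler divergence; the Renyi divergence
% of order alpha (alpha != 1) is a monotone transform of the Hellinger integral:
\[ D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int \left( \frac{dP}{dQ} \right)^{\alpha} dQ \]
```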

2003
Noel Cressie Leandro Pardo

In this paper we consider inference based on very general divergence measures, under assumptions of multinomial sampling and loglinear models. We define the minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator. This estimator is then used in a φ-divergence goodness-of-fit statistic, which is the basis of two new statistics for solving the prob...
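
In the multinomial setting the abstract describes, the φ-divergence and the resulting estimator take the standard form (p̂ the observed relative frequencies, p(θ) the model probabilities, φ convex with φ(1) = 0):

```latex
% phi-divergence between empirical frequencies and model probabilities:
\[ D_\phi(\hat p, p(\theta)) = \sum_i p_i(\theta) \, \phi\!\left( \frac{\hat p_i}{p_i(\theta)} \right) \]
% Minimum phi-divergence estimator:
\[ \hat\theta_\phi = \arg\min_{\theta} D_\phi(\hat p, p(\theta)) \]
% phi(x) = x log x - x + 1 gives the Kullback-Leibler divergence, and the
% estimator then reduces to the maximum likelihood estimator.
```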
