Search results for: kullback

Number of results: 7189

2002
Jason K. Johnson

This paper considers an information-theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models that minimizes the worst-case Kullback-Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...
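A compact statement of the min-max objective may help; this is a hedged reconstruction from the snippet alone, with Θ the parameter set, p_0 the reference model, and ε the uncertainty radius (my notation, not necessarily the paper's):

```latex
\hat{\theta} \;=\; \arg\min_{\theta \in \Theta}\;
\max_{p \,:\, D(p \,\|\, p_0) \,\le\, \epsilon}\; D(p \,\|\, p_\theta),
\qquad
D(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx .
```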

Journal: Neurocomputing 1998
Lei Xu

A Bayesian Kullback Ying-Yang dependence reduction system and theory is presented. Via stochastic approximation, implementable algorithms and criteria are given for parameter learning and model selection, respectively. Three typical architectures are further studied in several special cases. The forward one is a general information-theoretic dependence reduction model that maps an observation x_i...

2008
Miroslav Kárný, Josef Andrýsek

The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, us...
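Since the argument order matters here, a minimal numerical illustration of the asymmetry (NumPy, with distributions chosen only for demonstration):

```python
import numpy as np

def kl(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]

# The two argument orders disagree in general, which is why the
# methodological choice of order discussed above matters.
print(kl(p, q))  # D(p || q) ~ 0.184
print(kl(q, p))  # D(q || p) ~ 0.192, a different value
```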

Journal: Kybernetika 2008
Jan Sindelár, Igor Vajda, Miroslav Kárný

The paper solves the problem of minimization of the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector (u_1, x_1, ..., u_T, x_T) on a sample space of 2T dimensions. One of the distributions is known, the other is known only partially. Namely, only the conditional probability distributions ...
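The decomposition is cut off above, but a standard chain-rule expansion of the Kullback divergence over such a vector has the following form (my notation, a sketch rather than the paper's exact statement; d_{t-1} abbreviates the history (u_1, x_1, ..., u_{t-1}, x_{t-1})):

```latex
D(f \,\|\, g) \;=\; \sum_{t=1}^{T} \mathbb{E}_{f}\!\left[
  \log \frac{f(u_t \mid d_{t-1})}{g(u_t \mid d_{t-1})}
  \;+\; \log \frac{f(x_t \mid u_t,\, d_{t-1})}{g(x_t \mid u_t,\, d_{t-1})}
\right].
```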

2002
Febe de Wet, Johan de Veth, Louis Boves

In this paper, the accumulated Kullback divergence (AKD) is used to analyze ASR performance deterioration due to the presence of background noise. The AKD represents a distance between the feature value distribution observed during training and the distribution of the observations in the noisy test condition for each individual feature vector component. In our experiments the AKD summed over al...
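As a sketch of how such an accumulated divergence could be computed, assuming (hypothetically; the paper's estimator is not shown in this snippet) a univariate Gaussian fit per feature component in each condition:

```python
import numpy as np

def gaussian_kl(mu0, var0, mu1, var1):
    """KL divergence D(N(mu0, var0) || N(mu1, var1)) for 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def accumulated_kl(train_feats, test_feats):
    """Sum the per-component KL between the training-condition and
    noisy-test-condition feature distributions (inputs: [n, d] arrays)."""
    akd = 0.0
    for j in range(train_feats.shape[1]):
        akd += gaussian_kl(train_feats[:, j].mean(), train_feats[:, j].var(),
                           test_feats[:, j].mean(), test_feats[:, j].var())
    return akd
```

Summing over components mirrors the "AKD summed over all..." referenced in the abstract.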

Journal: CoRR 2015
Lan Yang, Jingbin Wang, Yujin Tu, Prarthana Mahapatra, Nelson Cardoso

This paper proposes a new method for vector quantization that minimizes the Kullback-Leibler divergence between the class-label distributions over the quantization inputs (the original vectors) and the outputs (the quantization subsets of the vector set). In this way, the vector quantization output can retain as much class-label information as possible. An objective function is...
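The objective function itself is truncated above; the following hypothetical sketch only illustrates the kind of quantity involved, comparing each input's one-hot class-label distribution with the empirical class distribution of its quantization cell:

```python
import numpy as np

def label_kl_objective(labels, assignments, n_classes, eps=1e-12):
    """Average KL divergence between each input's one-hot class-label
    distribution and the empirical class distribution of its quantization
    cell (for one-hot p, KL(p || q) reduces to -log q[y])."""
    labels = np.asarray(labels, int)
    assignments = np.asarray(assignments, int)
    # Empirical class distribution inside each quantization cell.
    cell_dist = {}
    for c in np.unique(assignments):
        counts = np.bincount(labels[assignments == c], minlength=n_classes)
        cell_dist[c] = counts / counts.sum()
    return float(np.mean([-np.log(cell_dist[c][y] + eps)
                          for y, c in zip(labels, assignments)]))
```

Assignments that keep same-class vectors in the same cell drive this quantity toward zero, which matches the stated aim of preserving class-label information.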

Journal: IEEE Trans. Automat. Contr. 2008
Augusto Ferrante, Michele Pavon, Federico Ramponi

In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for th...
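For reference, the scalar Hellinger distance between spectral densities that the abstract alludes to is, up to a normalization convention that this snippet does not fix:

```latex
d_H(f, g)^2 \;=\; \frac{1}{2\pi} \int_{-\pi}^{\pi}
\left( \sqrt{f(e^{i\theta})} - \sqrt{g(e^{i\theta})} \right)^{2} d\theta .
```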

Journal: IEEE Trans. Information Theory 2003
Tryphon T. Georgiou, Anders Lindquist

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
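A plausible form of such a Kullback-Leibler type distance between spectral densities (the paper's exact definition is not shown in this snippet) is:

```latex
\mathbb{D}(\Psi \,\|\, \Phi) \;=\; \frac{1}{2\pi} \int_{-\pi}^{\pi}
\Psi(e^{i\theta}) \log \frac{\Psi(e^{i\theta})}{\Phi(e^{i\theta})}\, d\theta .
```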

Journal: Pattern Recognition 2017
Moacir Ponti, Josef Kittler, Mateus Riva, Teófilo Emídio de Campos, Cemre Zor

In decision-making systems involving multiple classifiers there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the...
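The DC-KL definition itself is truncated above; as a baseline, measuring (in)congruence with the plain KL divergence between two classifiers' posterior outputs might look like this (a hypothetical sketch, with smoothing added because raw KL diverges on zero-probability entries):

```python
import numpy as np

def incongruence_kl(post_a, post_b, eps=1e-6):
    """Plain KL divergence between two classifiers' posterior vectors,
    smoothed so zero-probability classes do not blow up the measure."""
    p = np.asarray(post_a, float) + eps
    q = np.asarray(post_b, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Two classifiers scoring the same input over three classes:
print(incongruence_kl([0.80, 0.15, 0.05], [0.10, 0.70, 0.20]))  # incongruent
print(incongruence_kl([0.80, 0.15, 0.05], [0.75, 0.20, 0.05]))  # congruent
```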

2003
Sumiyoshi Abe

A nonadditive (nonextensive) generalization of the quantum Kullback-Leibler divergence, termed the quantum q-divergence, is shown, in an elementary manner, not to increase under projective measurements.
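One common nonadditive (Tsallis-type) form of such a generalization, which recovers the ordinary quantum relative entropy as q → 1, is the following; whether it matches the paper's exact convention cannot be confirmed from this snippet:

```latex
D_q(\rho \,\|\, \sigma) \;=\; \frac{\operatorname{Tr}\!\left(\rho^{q}\sigma^{1-q}\right) - 1}{q - 1},
\qquad
\lim_{q \to 1} D_q(\rho \,\|\, \sigma) \;=\; \operatorname{Tr}\,\rho\,(\ln\rho - \ln\sigma).
```

The stated result is then that applying the same projective measurement to both states cannot increase D_q.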
