Search results for: kullback leibler distance

Number of results: 244274

Journal: CoRR 2017
Marc G. Bellemare, Ivo Danihelka, Will Dabney, Shakir Mohamed, Balaji Lakshminarayanan, Stephan Hoyer, Rémi Munos

The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated in, among other settings, ordinal regression and generative modelling. In th...
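The contrast the abstract draws can be made concrete with a minimal sketch (the four-bin distributions below are illustrative, not from the paper): swapping the same two mass values between nearby or distant bins leaves the KL divergence unchanged, while the Wasserstein distance grows with how far the mass travels.

```python
import numpy as np
from scipy.stats import entropy, wasserstein_distance

# Distributions over four ordered bins with values 0..3. q_near swaps two
# mass values between adjacent bins; q_far swaps the same two values
# between distant bins, so KL cannot tell the two moves apart.
bins = np.arange(4)
p      = np.array([0.35, 0.30, 0.05, 0.30])
q_near = np.array([0.30, 0.35, 0.05, 0.30])  # mass moved one bin
q_far  = np.array([0.30, 0.30, 0.05, 0.35])  # mass moved three bins

kl_near = entropy(p, q_near)  # KL(p || q_near), in nats
kl_far  = entropy(p, q_far)   # KL(p || q_far) -- identical value

w_near = wasserstein_distance(bins, bins, p, q_near)
w_far  = wasserstein_distance(bins, bins, p, q_far)

# KL sees only the changed probability values, not where they moved;
# Wasserstein reflects the geometry: w_far is three times w_near.
print(kl_near, kl_far)
print(w_near, w_far)
```

Because only the probability values 0.35 and 0.30 are exchanged in both cases, each KL sum contains exactly the same two nonzero terms.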

Journal: International journal of epidemiology 1999
W. C. Lee

BACKGROUND To select a proper diagnostic test, it is recommended that the most specific test be used to confirm (rule in) a diagnosis, and the most sensitive test be used to establish that a disease is unlikely (rule out). These rule-in and rule-out concepts can also be characterized by the likelihood ratio (LR). However, previous papers discussed only the case of binary tests and assumed test ...
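The binary-test baseline the abstract starts from can be sketched with Bayes' rule in odds form (the sensitivity, specificity, and pre-test probability below are illustrative numbers, not the paper's):

```python
def post_test_prob(pretest_prob, lr):
    """Update a pre-test probability of disease with a likelihood ratio."""
    pre_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

sensitivity, specificity = 0.90, 0.95

# A large LR+ makes a positive result useful for ruling in;
# a small LR- makes a negative result useful for ruling out.
lr_pos = sensitivity / (1 - specificity)  # 18.0
lr_neg = (1 - sensitivity) / specificity  # ~0.105

p = 0.20  # assumed pre-test probability
p_pos = post_test_prob(p, lr_pos)  # ~0.82 after a positive test
p_neg = post_test_prob(p, lr_neg)  # ~0.026 after a negative test
print(p_pos, p_neg)
```

A highly specific test drives LR+ up (rule in), while a highly sensitive test drives LR- toward zero (rule out), matching the recommendation quoted in the abstract.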

Journal: Int. J. Computational Intelligence Systems 2014
R. Priya, T. N. Shanmugam, R. Baskaran

Content-based video retrieval systems have shown great potential in supporting decision making in clinical activities, teaching, and biological research. In content-based video retrieval, feature combination plays a key role. As a result, content-based retrieval across different types of video data turns out to be a challenging and active problem. This paper presents an effective content-based vid...

Journal: IAES International Journal of Artificial Intelligence (IJ-AI) 2020

Journal: Speech Communication 1998
Kazuhiro Arai, Jeremy H. Wright, Giuseppe Riccardi, Allen L. Gorin

A new method for automatically acquiring grammar fragments for understanding fluently spoken language is proposed. The goal of this method is to generate a collection of grammar fragments, each representing a set of syntactically and semantically similar phrases. First, phrases observed frequently in the training set are selected as candidates. Each candidate phrase has three associated probability...

2006
Rani Nelken, Stuart M. Shieber

Kullback-Leibler divergence is a natural distance measure between two probabilistic finite-state automata. Computing this distance is difficult, since it requires a summation over a countably infinite number of strings. Nederhof and Satta (2004) recently provided a solution in the course of solving the more general problem of finding the cross-entropy between a probabilistic context-free gramma...
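The difficulty the abstract describes (a sum over countably many strings) can be illustrated with the simplest possible case, not the paper's algorithm: a one-state automaton that emits `a` with probability p and halts with probability 1 - p, so string a^n has probability p^n (1 - p). For this case the infinite sum collapses to a closed form, which a truncated direct summation confirms.

```python
import math

def kl_one_state(p1, p2):
    """Closed-form KL between two one-state probabilistic automata.
    P_i(a^n) = p_i**n * (1 - p_i) for n = 0, 1, 2, ...
    KL = E_P1[n] * log(p1/p2) + log((1-p1)/(1-p2)), with E_P1[n] = p1/(1-p1).
    """
    return (p1 / (1 - p1)) * math.log(p1 / p2) + math.log((1 - p1) / (1 - p2))

def kl_truncated(p1, p2, n_max=1000):
    """Direct summation over the strings a^0 ... a^n_max."""
    total = 0.0
    for n in range(n_max + 1):
        prob1 = p1**n * (1 - p1)
        prob2 = p2**n * (1 - p2)
        total += prob1 * math.log(prob1 / prob2)
    return total

closed = kl_one_state(0.5, 0.7)
approx = kl_truncated(0.5, 0.7)
print(closed, approx)
```

For general multi-state automata no such elementary closed form exists, which is why the cross-entropy machinery of Nederhof and Satta (2004) mentioned above is needed.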

Journal: IEEE Trans. Information Theory 2003
Tryphon T. Georgiou, Anders Lindquist

We introduce a Kullback-Leibler type distance between spectral density functions of stationary stochastic processes and solve the problem of optimal approximation of a given spectral density Ψ by one that is consistent with prescribed second-order statistics. In general, such statistics are expressed as the state covariance of a linear filter driven by a stochastic process whose spectral densit...
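A minimal numerical sketch of a KL-type divergence between spectral densities, ∫ Ψ log(Ψ/Φ) dθ, evaluated on a grid: the AR(1) example spectra, the unit-integral normalization, and the discretization are illustrative assumptions, not the paper's construction.

```python
import numpy as np

theta = np.linspace(-np.pi, np.pi, 4001)
dtheta = theta[1] - theta[0]

def ar1_spectrum(a):
    """Power spectral density of an AR(1) process x_t = a*x_{t-1} + e_t."""
    return 1.0 / np.abs(1.0 - a * np.exp(-1j * theta)) ** 2

def normalize(s):
    """Scale a density to unit integral over [-pi, pi]."""
    return s / (np.sum(s) * dtheta)

psi = normalize(ar1_spectrum(0.5))
phi = normalize(ar1_spectrum(0.8))

def kl_spectral(s1, s2):
    """Riemann-sum approximation of the KL-type integral of s1*log(s1/s2)."""
    return np.sum(s1 * np.log(s1 / s2)) * dtheta

d = kl_spectral(psi, phi)
print(d)                     # positive for distinct normalized densities
print(kl_spectral(psi, psi))  # zero when the densities coincide
```

For densities normalized to the same total power, Gibbs' inequality guarantees the integral is nonnegative and vanishes only when Ψ = Φ, which is what makes it usable as a distance-like criterion for spectral approximation.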
