Search results for: kullback information

Number of results: 1158173

Journal: The Journal of Chemical Physics, 2005
K. Ch. Chatzisavvas, Ch. C. Moustakidis, C. P. Panos

Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a complexity measu...
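The information measures named in this abstract have direct discrete analogues. As an illustrative sketch (function names are my own, not from the paper), the Shannon entropy, Kullback-Leibler relative entropy, and Jensen-Shannon divergence of discrete distributions can be computed as:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetrized, smoothed KL
    against the mixture m = (p + q) / 2; always finite."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

Unlike the KL divergence, the Jensen-Shannon divergence is symmetric and bounded by ln 2, which is why it is often preferred as a comparison measure between densities.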

2000
David R. Anderson, Kenneth P. Burnham

We provide background information to allow a heuristic understanding of two types of criteria used in selecting a model for making inferences from ringing data. The first type of criteria (e.g., AIC, QAIC, and TIC) are estimates of (relative) Kullback-Leibler information or distance and attempt to select a good approximating model for inference, based on the Principle of Parsimony. The seco...
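As background for how such criteria are computed, a minimal sketch (helper names are illustrative, not from the paper): AIC estimates relative Kullback-Leibler information loss from the maximized log-likelihood and the parameter count, and Akaike weights turn AIC differences into relative support for each candidate model:

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike Information Criterion: an estimate of relative
    Kullback-Leibler information loss for a fitted model with
    maximized log-likelihood and k estimated parameters."""
    return -2.0 * log_likelihood + 2.0 * k

def aic_weights(aic_values):
    """Akaike weights: normalized evidence for each model,
    computed from differences to the best (smallest) AIC."""
    a = np.asarray(aic_values, dtype=float)
    delta = a - a.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()
```

The model with the smallest AIC receives the largest weight; the weights sum to one and can be read, heuristically, as relative plausibilities of the candidate models.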

Journal: CoRR, 2013
Fan Wang, Jun Zhu, Lin Zhang

Relative entropy is an essential tool in quantum information theory, and many problems are related to it. In this article, the optimal values defined by max U∈U(Xd) S(UρU∗ ‖ σ) and min U∈U(Xd) S(UρU∗ ‖ σ) for two positive definite operators ρ, σ ∈ Pd(X) are obtained. Moreover, the set of values S(UρU∗ ‖ σ) over all unitary operators U is full of the interval [min U∈U...
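A minimal numerical sketch of the quantum relative entropy S(ρ ‖ σ) = Tr[ρ(log ρ − log σ)] referred to above, for positive definite density operators, via eigendecomposition (the function names are illustrative, not from the paper):

```python
import numpy as np

def _logm(a):
    """Matrix logarithm of a Hermitian positive definite matrix
    via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(np.log(w)) @ v.conj().T

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)], natural log.
    Assumes both operators are positive definite with unit trace."""
    return float(np.real(np.trace(rho @ (_logm(rho) - _logm(sigma)))))
```

For commuting (e.g. diagonal) ρ and σ this reduces to the classical Kullback-Leibler divergence between their eigenvalue distributions, which gives a quick sanity check.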

Journal: Signal Processing, 2007
Sinan Sinanovic, Don H. Johnson

Information processing theory endeavors to quantify how well signals encode information and how well systems, by acting on signals, process information. We use information-theoretic distance measures, the Kullback-Leibler distance in particular, to quantify how well signals represent information. The ratio of distances between a system’s output and input quantifies the system’s information proc...
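The distance-ratio idea can be sketched numerically for discrete distributions; the function names below are hypothetical and assume two input conditions and the corresponding two output conditions of a system:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p || q) between discrete
    distributions (natural log)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = p > 0
    return np.sum(p[m] * np.log(p[m] / q[m]))

def info_transfer_ratio(out0, out1, in0, in1):
    """Ratio of the KL distance between the system's two output
    distributions to that between its two input distributions.
    By the data processing inequality this cannot exceed 1;
    values near 1 indicate little information loss."""
    return kl(out0, out1) / kl(in0, in1)
```

A system that merely relabels its inputs achieves a ratio of exactly 1, while any lossy processing drives the ratio below 1.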

2006
Yuhua Qian, Jiye Liang

Based on the intuitionistic knowledge-content characteristic of information gain, the concepts of combination entropy CE(A) and combination granulation CG(A) in incomplete information systems are introduced, and some of their properties are given. Furthermore, the relationship between combination entropy and combination granulation is established. These concepts and properties are all special instances...

2009
V. J. Yohai

Hampel (1974) introduced a very general procedure to derive optimal robust M-estimates for one-parameter families of distributions. The optimal estimate is obtained by minimizing the asymptotic variance among M-estimates which are Fisher-consistent and have gross error sensitivity (GES) bounded by a given constant. Stahel (1981) generalized the optimal M-estimates for the case of families which ...
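For the normal location family, the solution to Hampel's variance-minimization problem under a GES bound is the familiar truncated-score (Huber-type) M-estimate. A minimal iteratively reweighted sketch, with an illustrative helper name and tuning constant:

```python
import numpy as np

def huber_m_location(x, c=1.345, tol=1e-8, max_iter=100):
    """Location M-estimate with Huber's psi (score truncated at
    +/- c), computed by iteratively reweighted averaging.
    Scale is held fixed at the normalized MAD."""
    x = np.asarray(x, dtype=float)
    s = 1.4826 * np.median(np.abs(x - np.median(x)))  # robust scale
    mu = np.median(x)  # robust starting point
    for _ in range(max_iter):
        r = (x - mu) / s
        # Huber weights: 1 inside [-c, c], downweighted outside
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu
```

The bounded psi function is what keeps the gross error sensitivity finite: a single gross outlier receives a small weight and cannot drag the estimate arbitrarily far, unlike the sample mean.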

Journal: Journal of Radio Electronics, 2022

The estimation of the feature space in the analysis of radar signals (from airplanes, ships, navigation stations, etc.) is an important element of machine learning. From the point of view of queuing theory, a mathematical model of a complex detected signal can be represented as an ordinary flow of events described by a Poisson distribution with randomly varying signal parameters. The paper demonstrates the orthogonality characteristic sour...
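An ordinary flow of events means that the number of events in a fixed interval follows a Poisson law; a minimal sketch of that probability mass function (the function name is illustrative):

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson-distributed event count with
    rate parameter lam (expected number of events)."""
    return math.exp(-lam) * lam**k / math.factorial(k)
```

The probability of observing no events in the interval is simply exp(-lam), and the probabilities over all counts sum to one.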

2002
R. Quian Quiroga, J. Arnhold, K. Lehnertz, P. Grassberger

Kopitzki et al (preceding comment) claim that the relationship between renormalized and Kullback-Leibler entropies has already been given in their previous papers. Moreover, they argue that the former can give more useful information, e.g., for localizing the seizure-generating area in epilepsy patients. In our reply we stress that if the relationship between both entropies had been known ...
