Search results for: kullback information

Number of results: 1,158,173

Journal: Information and Control 1974
C. T. Ng

The representation for measures of information which are symmetric, expansible, and have the branching property in the form of a sum is provided. This class of measures includes, in particular, Shannon's entropy, entropies of degree β, Kullback's directed divergence, and Kerridge's inaccuracy. Rényi's entropy and information gain of order α are, however, excluded from this class. The proof is...

2015
Sergey Bobkov Gennadiy Chistyakov

Optimal stability estimates in the class of regularized distributions are derived for the characterization of normal laws in Cramér's theorem with respect to relative entropy and Fisher information distance.

Journal: Physical Review E, Statistical, Nonlinear, and Soft Matter Physics 2010
Saar Rahav Shaul Mukamel

By subjecting a dynamical system to a series of short pulses and varying several time delays, we can obtain multidimensional characteristic measures of the system. Multidimensional Kullback-Leibler response functions (KLRFs), which are based on the Kullback-Leibler distance between the initial and final states, are defined. We compare the KLRFs, which are nonlinear in the probability density, with...

Journal: European Journal of Operational Research 2010
M. J. Rufo Carlos J. Perez Jacinto Martín

In this paper, a general approach is proposed to address a full Bayesian analysis for the class of quadratic natural exponential families in the presence of several expert sources of prior information. By expressing the opinion of each expert as a conjugate prior distribution, a mixture model is used by the decision maker to arrive at a consensus of the sources. A hyperprior distribution on the...

Journal: CoRR 2016
Sukanya Patil Ajit Rajwade

Reconstruction error bounds in compressed sensing under Gaussian or uniform bounded noise do not translate easily to the case of Poisson noise. Reasons for this include the signal dependent nature of Poisson noise, and also the fact that the negative log likelihood in case of a Poisson distribution (which is directly related to the generalized Kullback-Leibler divergence) is not a metric and do...
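The generalized Kullback-Leibler divergence mentioned in this abstract is defined for nonnegative (not necessarily normalized) vectors, and, as the authors note, it is not a metric. A minimal sketch illustrating the standard definition and its asymmetry (the vectors below are invented for the example, not taken from the paper):

```python
# Generalized KL (I-)divergence for nonnegative vectors x, y:
#   D(x || y) = sum_i [ x_i * log(x_i / y_i) - x_i + y_i ]
# Nonnegative, zero iff x == y, but asymmetric, hence not a metric.
import math

def generalized_kl(x, y):
    """Generalized KL divergence between nonnegative vectors x and y."""
    total = 0.0
    for xi, yi in zip(x, y):
        if xi > 0:
            total += xi * math.log(xi / yi)
        total += yi - xi
    return total

x = [3.0, 1.0, 2.0]
y = [2.0, 2.0, 2.0]
d_xy = generalized_kl(x, y)
d_yx = generalized_kl(y, x)
print(d_xy, d_yx)  # both nonnegative; d_xy != d_yx shows the asymmetry
```

The `- x_i + y_i` correction terms make the divergence well defined even when the vectors do not sum to one, which is the form that arises as the Poisson negative log-likelihood up to constants.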

2006
Chih-Yuan Tseng

Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet this raises two questions: why use this criterion, and are there other criteria? Besides, conventional approaches require a reference prior, which is usually difficul...
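The ranking idea this abstract describes can be sketched in a few lines: score each candidate distribution q by its Kullback-Leibler distance D(p||q) from a reference distribution p and prefer smaller values. The distributions below are invented for illustration and are not from the paper:

```python
# Hedged sketch: KL distance as a model-selection criterion.
# Candidates are ranked by increasing D(p || q) from a reference p.
import math

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p||q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]                      # reference distribution
candidates = {
    "uniform": [1/3, 1/3, 1/3],
    "skewed":  [0.6, 0.3, 0.1],
}
ranking = sorted(candidates, key=lambda name: kl_divergence(p, candidates[name]))
print(ranking)  # candidate names in increasing order of KL distance from p
```

Note that D(p||p) = 0 and D(p||q) ≥ 0, so the best-ranked model is the one whose distribution is closest to the reference in the relative-entropy sense.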

Journal: Entropy 2017
Ali E. Abbas Andrea H. Cadenbach Ehsan Salimi

Entropy methods enable a convenient, general approach to assigning a probability distribution on the basis of partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback–Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions, and generalizes other methods that have been proposed independently....
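The minimum cross-entropy principle named in this abstract has a well-known closed form: among distributions with a fixed mean, the minimizer of D(p||q) relative to a prior q is an exponential tilt, p_i ∝ q_i · exp(λ·x_i), with λ chosen to satisfy the constraint. A minimal sketch under an invented support, prior, and target mean (none of these values are from the paper):

```python
# Hedged sketch of the minimum cross-entropy principle with a mean constraint:
# the optimal p is an exponential tilt of the prior q; lam is found by bisection.
import math

def tilt(q, x, lam):
    """Exponentially tilted distribution p_i ∝ q_i * exp(lam * x_i)."""
    w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
    z = sum(w)
    return [wi / z for wi in w]

def min_cross_entropy(q, x, target_mean, lo=-50.0, hi=50.0):
    """Bisect on lam until the tilted distribution matches the target mean."""
    for _ in range(200):
        mid = (lo + hi) / 2
        p = tilt(q, x, mid)
        if sum(pi * xi for pi, xi in zip(p, x)) < target_mean:
            lo = mid
        else:
            hi = mid
    return tilt(q, x, (lo + hi) / 2)

x = [1, 2, 3]
q = [1/3, 1/3, 1/3]                  # uniform prior
p = min_cross_entropy(q, x, target_mean=2.5)
mean = sum(pi * xi for pi, xi in zip(p, x))
print(p, mean)  # tilted distribution whose mean is (approximately) 2.5
```

Bisection works here because the tilted mean is monotone increasing in λ; with a uniform prior the result reduces to the maximum-entropy distribution under the same constraint.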

Journal: CoRR 2015
Di Li Soummya Kar Fuad E. Alsaadi Shuguang Cui

We propose a distributed Bayesian quickest change detection algorithm for sensor networks, based on a random gossip inter-sensor communication structure. Without a control or fusion center, each sensor executes its local change detection procedure in a parallel and distributed fashion, interacting with its neighbor sensors via random inter-sensor communications to propagate information. By mode...

Journal: Journal of Investigative Dermatology 2022

Psoriasis is an immune-mediated inflammatory and hyperproliferative skin condition affecting ∼2% of the US population, with a total annual cost of around 3 billion dollars. Despite successes in drug development, there can be significant variation in treatment response, which correlates with patients' genetic variations and baseline genomic profiles. However, no study has integrated multiomic information to enh...

2008
I. Goychuk

In biological systems, information is frequently transferred via Poisson-like spike processes (shot noise) modulated in time by information-carrying signals. How, then, to quantify information transfer by such processes for nonstationary input signals of finite duration? Is there some minimal length of the input signal duration versus its strength? Can such signals be better detected when immers...
