Search results for: kullback information

Number of results: 1158173

Journal: CoRR 2008
Ambedkar Dukkipati

We show that various formulations (e.g., dual and Kullback-Csiszar iterations) of estimation of maximum entropy (ME) models can be transformed into solving systems of polynomial equations in several variables, for which one can use the celebrated Gröbner bases methods. Posing ME estimation as solving polynomial equations is possible in cases where the feature functions (sufficient statistics) that ...

Journal: CoRR 2006
Cheng-Yuan Liou Bruce R. Musicus

We apply two variations of the principle of Minimum Cross Entropy (the Kullback information measure) to fit parameterized probability density models to observed data densities. For an array beamforming problem with P incident narrowband point sources, N > P sensors, and colored noise, both approaches yield eigenvector fitting methods similar to that of the MUSIC algorithm [1]. Furthermore, the c...
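As a minimal, hypothetical illustration of the Kullback information (cross-entropy) measure this abstract refers to, the discrete KL divergence can be computed directly; the distributions below are made up and unrelated to the paper's beamforming setup:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in nats.

    Assumes p and q are probability vectors over the same support,
    with q[i] > 0 wherever p[i] > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ≈ 0.0253 nats; always >= 0, and 0 iff p == q
```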

2013
Guido Montúfar Johannes Rauh Nihat Ay

We review recent results about the maximal values of the Kullback-Leibler information divergence from statistical models defined by neural networks, including naïve Bayes models, restricted Boltzmann machines, deep belief networks, and various classes of exponential families. We illustrate approaches to compute the maximal divergence from a given model starting from simple sub- or super-models. W...
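In the simplest exponential-family case, the divergence from the independence model of two binary variables is the mutual information, and it is maximized (at the value log 2) by a perfectly correlated joint distribution. A small sketch, with hypothetical joint distributions:

```python
import math

def divergence_from_independence(joint):
    """KL divergence D(p || p_X * p_Y) of a 2-D joint distribution from
    the product of its marginals (its closest independent distribution),
    i.e. the mutual information, in nats."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    d = 0.0
    for i, row in enumerate(joint):
        for j, pij in enumerate(row):
            if pij > 0:
                d += pij * math.log(pij / (px[i] * py[j]))
    return d

# A perfectly correlated pair of binary variables attains the maximal
# divergence log 2 from the independence model.
print(divergence_from_independence([[0.5, 0.0], [0.0, 0.5]]))  # ≈ 0.6931
```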

Journal: Journal of Approximation Theory 2004
Dietrich Braess Tomas Sauer

When learning processes depend on samples but not on the order of the information in the sample, then the Bernoulli distribution is relevant and Bernstein polynomials enter into the analysis. We derive estimates of the approximation of the entropy function x log x that are sharper than the bounds from Voronovskaja’s theorem. In this way we get the correct asymptotics for the Kullback-Leibler di...
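A quick numerical check of the setting above: since x log x is convex, its Bernstein polynomial B_n lies above it, and with f''(x) = 1/x the Voronovskaja asymptotics predict an error of roughly (1 - x)/(2n). A minimal sketch, with an arbitrarily chosen evaluation point:

```python
import math

def bernstein(f, n, x):
    """Bernstein polynomial approximation B_n(f)(x) of f on [0, 1]."""
    return sum(f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def xlogx(t):
    return t * math.log(t) if t > 0 else 0.0  # continuous extension at 0

x = 0.3
for n in (10, 100, 1000):
    err = bernstein(xlogx, n, x) - xlogx(x)
    print(n, err)  # positive (convexity) and roughly (1 - x) / (2 * n)
```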

2007
Priscilla E. Greenwood Wolfgang Wefelmeyer

Suppose we have specified a parametric model for the transition distribution of a Markov chain, but that the true transition distribution does not belong to the model. Then the maximum likelihood estimator estimates the parameter which minimizes the Kullback-Leibler information between the true transition distribution and the model. We prove that the maximum likelihood estimator is asymptotical...

2000
Ali MANSOUR Allan Kardec BARROS Noboru OHNISHI

The blind separation of sources is a recent and important problem in signal processing. Since 1984 [1], it has been studied by many authors, and many algorithms have been proposed. In this paper, the description of the problem, its assumptions, its current applications, and some algorithms and ideas are discussed. key words: independent component analysis (ICA), contrast function, Kullback-L...

Journal: Neural computation 2000
Toshiyuki Tanaka

I present a general theory of mean-field approximation based on information geometry and applicable not only to Boltzmann machines but also to wider classes of statistical models. Using perturbation expansion of the Kullback divergence (or Plefka expansion in statistical physics), a formulation of mean-field approximation of general orders is derived. It includes in a natural way the "naive" me...

1999
Baibing Li Bart De Moor

For ATM network traffic, a new approach based on the Kullback-Leibler information measure is proposed for stochastic system identification of packet traffic. This approach, equivalent to the maximum marginal likelihood estimate, can overcome the over-modeling problem in [1] such that a much more parsimonious model order N can be obtained, which can then lead to a significant reduction in the latter queu...

2001
Peter Spreij Esko Valkeila

In this paper we give explicit representations for Kullback-Leibler information numbers between a priori and a posteriori distributions, when the observations come from a semimartingale. We assume that the distribution of the observed semimartingale is described in terms of the so-called triplet of predictable characteristics. We end by considering the corresponding notions in a model with a fr...

2009
Baibai Fu

In this paper, we first develop the concept of the maximum entropy function for equilibrium programming by employing prior distribution information and Kullback entropy. It plays a key role in the entropy function method for solving equilibrium programming. After analyzing and discussing the relation with the equivalent model transformation applied in mixed traffic assignment problems, we finally gi...

[Chart: number of search results per year; click the chart to filter results by publication year]