Search results for: kullback
Number of results: 7189
Regularized (stabilized) versions of exponential and linear forgetting in parameter tracking are shown to be dual to each other. Both are derived by solving essentially the same Bayesian decision-making problem where Kullback-Leibler divergence is used to measure (quasi)distance between posterior probability distributions of estimated parameters. The type of forgetting depends solely on the ord...
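One common way to make the duality concrete (the decision problem and notation below are an assumed illustration, not quoted from the paper) is to write the forgetting step as a KL-optimal compromise between the current posterior p_t and an alternative, stabilizing distribution p̄; swapping the order of the arguments of the divergence switches between the exponential (geometric) and linear (arithmetic) forms:

\[
q^{\mathrm{exp}} = \arg\min_{q}\Big\{ \lambda\, D_{\mathrm{KL}}(q \,\|\, p_t) + (1-\lambda)\, D_{\mathrm{KL}}(q \,\|\, \bar p) \Big\},
\qquad
q^{\mathrm{exp}}(\theta) \propto p_t(\theta)^{\lambda}\,\bar p(\theta)^{1-\lambda},
\]
\[
q^{\mathrm{lin}} = \arg\min_{q}\Big\{ \lambda\, D_{\mathrm{KL}}(p_t \,\|\, q) + (1-\lambda)\, D_{\mathrm{KL}}(\bar p \,\|\, q) \Big\},
\qquad
q^{\mathrm{lin}}(\theta) = \lambda\, p_t(\theta) + (1-\lambda)\,\bar p(\theta),
\]

with forgetting factor λ ∈ (0, 1].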
Exact inference in large and complex graphical models (e.g. Bayesian networks) is computationally intractable. Approximate schemes are therefore of great importance for real world computation. In this paper we consider a general scheme in which the original intractable graphical model is approximated by a model with a tractable structure. The approximating model is optimised by an iterative pro...
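A concrete instance of such a scheme, given here only as an assumed illustration (a mean-field variational approximation; the paper's particular scheme may differ), restricts the approximation to a fully factorized family and minimizes the KL divergence to the intractable model:

\[
q^{\ast} = \arg\min_{q \in \mathcal{Q}} D_{\mathrm{KL}}\big(q(\mathbf{x}) \,\|\, p(\mathbf{x})\big),
\qquad
\mathcal{Q} = \Big\{ q : q(\mathbf{x}) = \prod_i q_i(x_i) \Big\},
\]

which is optimized by cycling the coordinate update

\[
q_i(x_i) \;\propto\; \exp\!\Big( \mathbb{E}_{q_{\setminus i}}\big[ \ln p(\mathbf{x}) \big] \Big)
\]

over the factors until convergence.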
This paper offers a gentle introduction to probability for linguists, assuming little or no background beyond what one learns in high school. The most important points that we emphasize are: the conceptual difference between probability and frequency, the use of maximizing the probability of an observation by considering different models, and Kullback-Leibler divergence. We offer an introducti...
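A minimal Python sketch makes the divergence concrete; the letter frequencies and the two candidate models below are invented purely for illustration:

import math

def kl_divergence(p, q):
    # D(p || q) = sum over x of p(x) * log(p(x) / q(x)), in nats.
    # Assumes q(x) > 0 wherever p(x) > 0.
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Invented relative frequencies and two candidate models.
observed = {"a": 0.5, "b": 0.3, "c": 0.2}
model_1 = {"a": 0.4, "b": 0.4, "c": 0.2}
model_2 = {"a": 0.1, "b": 0.1, "c": 0.8}

# The model with the smaller divergence from the observed frequencies is the
# one that assigns the observed data the higher average log-probability.
print(kl_divergence(observed, model_1))  # about 0.025 nats
print(kl_divergence(observed, model_2))  # about 0.86 nats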
We present a sequential noise compensation method based on the sequential Kullback proximal algorithm, which uses the Kullback-Leibler divergence as a regularization function for maximum likelihood estimation. The method is implemented as filters. In contrast to the sequential noise compensation method based on the sequential EM algorithm, the convergence rate of the method and estimation error...
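For orientation, the Kullback proximal iteration is commonly written in the following generic (batch) form; the notation is assumed here, and the sequential, filter-style implementation described in the abstract is not reproduced:

\[
\theta^{(k+1)} = \arg\max_{\theta}\;\Big\{ \ell(\theta) \;-\; \beta_k\, D_{\mathrm{KL}}\!\big( p_{\theta^{(k)}}(z \mid y) \,\big\|\, p_{\theta}(z \mid y) \big) \Big\},
\]

where ℓ is the observed-data log-likelihood, z denotes the hidden data, and β_k > 0 is a relaxation sequence; the choice β_k = 1 recovers the standard EM update.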
Recent Automatic Speech Recognition (ASR) studies have shown that Kullback-Leibler divergence based hidden Markov models (KL-HMMs) are very powerful when only small amounts of training data are available. However, since KL-HMMs use a cost function that is based on the Kullback-Leibler divergence (instead of maximum likelihood), standard ASR algorithms such as the commonly used decision tree cl...
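As a hedged sketch of the kind of KL-based local cost such models use (the function name, the direction of the divergence, and the toy three-phone posteriors are assumptions, not taken from the paper), each state can be modelled as a categorical distribution over phone classes and scored against a frame's posterior vector:

import numpy as np

def kl_local_score(state_dist, posterior_vec, eps=1e-10):
    # KL(state || frame posterior): local cost of matching the observed
    # phone-posterior vector against a state modelled as a categorical
    # distribution. Smaller is better; it acts as a cost, not a likelihood.
    p = np.clip(np.asarray(state_dist, dtype=float), eps, None)
    q = np.clip(np.asarray(posterior_vec, dtype=float), eps, None)
    return float(np.sum(p * np.log(p / q)))

# Toy three-phone posterior space with invented values.
state_model = np.array([0.7, 0.2, 0.1])
frame_posterior = np.array([0.6, 0.3, 0.1])
print(kl_local_score(state_model, frame_posterior))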
A frequent practice in feature selection is to maximize the Kullback-Leibler (K-L) distance between target classes. In this note we show that this common custom is frequently suboptimal, since it fails to take into account the fact that classification occurs using a finite number of samples. In classification, the variance and higher order moments of the likelihood function should be taken into...
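The finite-sample effect the note warns about is easy to see in a small simulation; the Gaussian class-conditional densities, sample sizes, and repetition count below are purely illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)

def gaussian_kl(m1, v1, m2, v2):
    # KL divergence between univariate Gaussians N(m1, v1) and N(m2, v2).
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

# True K-L distance between the two class-conditional feature densities.
true_kl = gaussian_kl(0.0, 1.0, 0.5, 1.0)

# Plug-in estimates from small samples scatter widely around the true value;
# this spread is exactly what a ranking based on K-L distance alone ignores.
estimates = []
for _ in range(200):
    a = rng.normal(0.0, 1.0, size=30)
    b = rng.normal(0.5, 1.0, size=30)
    estimates.append(gaussian_kl(a.mean(), a.var(ddof=1), b.mean(), b.var(ddof=1)))

print(f"true KL = {true_kl:.3f}, mean estimate = {np.mean(estimates):.3f}, std = {np.std(estimates):.3f}")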
We analyze a contrasting dynamical behavior of Gibbs-Shannon and conditional Kullback-Leibler entropies, induced by the time-evolution of continuous probability distributions. The question of a predominantly purpose-dependent entropy definition for non-equilibrium model systems is addressed. The conditional Kullback-Leibler entropy is often believed to properly capture physical features of an asymptoti...
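For reference, the two functionals being contrasted are usually written as follows (the notation, and in particular the sign convention for the conditional entropy, is assumed here and varies in the literature):

\[
S[p_t] = -\int p(x,t)\,\ln p(x,t)\,dx,
\qquad
K[p_t \,|\, q] = \int p(x,t)\,\ln\frac{p(x,t)}{q(x)}\,dx,
\]

where q is a fixed reference density, for example the asymptotic or equilibrium distribution of the dynamics.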
In this paper, we obtain the main term of the average stochastic complexity for certain complete bipartite graph-type spin models in Bayesian estimation. We first study the Kullback function of the spin model using a new method of eigenvalue analysis, and then use a recursive blowing-up process to obtain the maximum pole of the zeta function, which is defined in terms of the Kullback function. The...
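The role of the zeta function follows the standard singular-learning-theory setup, which is assumed here rather than quoted from the paper: with K(w) the Kullback function of the model and φ(w) the prior,

\[
\zeta(z) = \int K(w)^{z}\,\varphi(w)\,dw,
\qquad
F(n) \approx \lambda \ln n \;-\; (m-1)\,\ln\ln n \;+\; O(1),
\]

where F(n) is the average stochastic complexity, (−λ) is the maximum pole of ζ(z), and m is its multiplicity.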