Search results for: divergence measure
Number of results: 390285
This paper contributes a tutorial level discussion of some interesting properties of the recent Cauchy–Schwarz (CS) divergence measure between probability density functions. This measure brings together elements from several different machine learning fields, namely information theory, graph theory and Mercer kernel and spectral theory. These connections are revealed when estimating the CS dive...
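The Cauchy–Schwarz divergence mentioned above is defined as D_CS(p, q) = -log( (∫pq)² / (∫p² ∫q²) ). A minimal sketch of estimating it from samples, assuming Gaussian Parzen-window density estimates (the kernel bandwidth `sigma` and the helper names are illustrative, not from the paper):

```python
import numpy as np

def gauss_kernel(d, sigma):
    # Gaussian kernel evaluated at pairwise differences d; convolving two
    # width-sigma kernels gives an effective width of sigma * sqrt(2)
    return np.exp(-d**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)

def cs_divergence(x, y, sigma=1.0):
    """Cauchy-Schwarz divergence between Parzen estimates built from
    1-D samples x (for p) and y (for q):
        D_CS = -log( V(x,y)^2 / (V(x,x) * V(y,y)) )
    where V is the (cross-)information potential."""
    def ip(a, b):
        d = a[:, None] - b[None, :]          # all pairwise differences
        return gauss_kernel(d, sigma).mean() # average kernel value
    v_xy, v_xx, v_yy = ip(x, y), ip(x, x), ip(y, y)
    return -np.log(v_xy**2 / (v_xx * v_yy))
```

By the Cauchy–Schwarz inequality the estimate is non-negative and vanishes when the two sample sets coincide, which is the property that links this estimator to the kernel and spectral machinery the abstract alludes to.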
Information theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence which is defined between any arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
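For discrete distributions, the Jensen–Rényi divergence of order α compares the Rényi entropy of a weighted mixture with the weighted Rényi entropies of its components. A minimal sketch (the function names and the choice α = 0.5 are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Renyi entropy of order alpha: H_a(p) = log(sum p^alpha) / (1 - alpha)
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

def jensen_renyi(dists, weights, alpha=0.5):
    """Jensen-Renyi divergence between any number of discrete distributions:
        JR_a = H_a(sum_i w_i p_i) - sum_i w_i H_a(p_i)."""
    dists = np.asarray(dists, dtype=float)
    w = np.asarray(weights, dtype=float)
    mix = w @ dists  # the weighted mixture distribution
    return renyi_entropy(mix, alpha) - sum(
        wi * renyi_entropy(d, alpha) for wi, d in zip(w, dists)
    )
```

For α in (0, 1) the Rényi entropy is concave, so the divergence is non-negative and zero exactly when all component distributions agree; the majorization analysis in the abstract concerns how large this quantity can get.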
In this paper, we compare two discrete statistical models for the evaluation of a road safety measure. We give a much simpler proof of the expression of the maximum likelihood estimator for the more complex model and demonstrate theoretical results on the divergence measure between the models. The results obtained on real data suggest that both models are very competitive
In this paper, we focus on parameter estimation for probabilistic models on discrete spaces. A naive calculation of the normalization constant of a probabilistic model on a discrete space is often infeasible, so statistical inference based on such models is difficult. We propose a novel estimator for probabilistic models on discrete spaces, which is derived from an ...
We explore a fairness-related challenge that arises in generative models. The challenge is that biased training data with imbalanced demographics may yield high asymmetry in the size of generated samples across distinct groups. We focus on practically relevant scenarios wherein demographic labels are not available and therefore the design of a fair model is non-straightforward. In this paper, we propose an optimization framewo...
Henze-Penrose divergence is a non-parametric divergence measure that can be used to estimate a bound on the Bayes error in a binary classification problem. In this paper, we show that a crossmatch statistic based on optimal weighted matching can be used to directly estimate Henze-Penrose divergence. Unlike an earlier approach based on the Friedman-Rafsky minimal spanning tree statistic, the pro...
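The earlier Friedman–Rafsky approach that the abstract contrasts with can be sketched as follows: pool the two samples, build a minimal spanning tree over the pooled points, and count edges that join points from different samples. A minimal sketch assuming SciPy (the paper's own crossmatch estimator replaces the MST with an optimal weighted matching, which is not shown here):

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def hp_divergence_fr(x, y):
    """Friedman-Rafsky MST estimate of the Henze-Penrose divergence
    between samples x and y (rows are points):
        D_hat = 1 - R * (m + n) / (2 * m * n)
    where R counts MST edges joining a point of x to a point of y."""
    m, n = len(x), len(y)
    z = np.vstack([x, y])
    labels = np.r_[np.zeros(m), np.ones(n)]
    mst = minimum_spanning_tree(cdist(z, z)).tocoo()
    cross = int(np.sum(labels[mst.row] != labels[mst.col]))
    return 1.0 - cross * (m + n) / (2.0 * m * n)
```

When the two distributions are well separated almost no MST edge crosses between the samples and the estimate approaches 1; when they coincide, roughly 2mn/(m+n) edges cross and the estimate is near 0, which is what makes the statistic usable in Bayes-error bounds.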
Divergence paralysis is characterized by acquired horizontal homonymous diplopia when viewing distant objects but without limitation of ocular movements (Bielschowsky, 1935). Although raised intracranial pressure is a significant aetiological factor in the production of divergence paralysis (Bender and Savitsky, 1940; Chamlin and Davidoff, 1950, 1951), there has been much diversity of opinion a...
Motivated from the bandwidth selection problem in local likelihood density estimation and from the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...
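The Kullback–Leibler divergence that the abstract estimates, D(p‖q) = E_p[log p(X) − log q(X)], can be approximated by averaging the log-density ratio over samples from p. A minimal sketch for the univariate Gaussian case, where a closed form is available for checking (the function names and parameters are illustrative, not the local likelihood estimator of the paper):

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    # closed-form KL( N(mu0, s0^2) || N(mu1, s1^2) )
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

def kl_monte_carlo(mu0, s0, mu1, s1, n=100_000, seed=0):
    # plug-in estimate: average of log p(X) - log q(X) over draws X ~ p;
    # the shared constant -0.5*log(2*pi) cancels in the difference
    rng = np.random.default_rng(seed)
    x = rng.normal(mu0, s0, n)
    logp = -0.5 * ((x - mu0) / s0)**2 - np.log(s0)
    logq = -0.5 * ((x - mu1) / s1)**2 - np.log(s1)
    return np.mean(logp - logq)
```

In the paper's setting, q is replaced by a local likelihood density estimate, and the behaviour of this divergence as a function of the bandwidth is exactly what drives the bandwidth selection rule.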
New divergence measures are introduced for change detection and discrimination of stochastic signals (time series) on the basis of parametric filtering — a technique that combines parametric linear filtering with correlation characterization. The sensitivity of these divergence measures is investigated using local curvatures under additive and multiplicative spectral departure models. It is fou...