Search results for: divergence time estimation

Number of results: 2136009

2007
Yuhong Yang

General results on adaptive density estimation are obtained with respect to any countable collection of estimation strategies under Kullback-Leibler and squared L2 losses. It is shown that without knowing which strategy works best for the underlying density, a single strategy can be constructed by mixing the proposed ones to be adaptive in terms of statistical risks. A consequence is that under...
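The mixing idea in this abstract can be sketched with a simple exponential-weighting scheme: weight each candidate density estimate by its likelihood on held-out data and return the convex mixture. This is an illustrative toy, not the paper's construction; the candidate densities and data below are hypothetical.

```python
import math

def mix_densities(estimates, data):
    """Combine candidate density estimates into a single estimator by weighting
    each with its (normalized) likelihood on held-out data -- a minimal sketch
    of adaptation by mixing strategies."""
    logliks = [sum(math.log(f(x)) for x in data) for f in estimates]
    m = max(logliks)                        # subtract max for numerical stability
    w = [math.exp(l - m) for l in logliks]
    z = sum(w)
    w = [wi / z for wi in w]
    return lambda x: sum(wi * f(x) for wi, f in zip(w, estimates))

# Two hypothetical candidate densities on [0, 1]:
f_uniform = lambda x: 1.0       # uniform density
f_tri = lambda x: 2.0 * x       # triangular density favoring large x

data = [0.7, 0.8, 0.85, 0.9]    # held-out points near 1 favor f_tri
mixed = mix_densities([f_uniform, f_tri], data)
```

Because the held-out points lie near 1, the mixture concentrates most of its weight on the triangular candidate, so the combined estimate tracks whichever strategy fits the data best without that choice being made in advance.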

Journal: :Molecular biology and evolution 2006
Ziheng Yang Bruce Rannala

We implement a Bayesian Markov chain Monte Carlo algorithm for estimating species divergence times that uses heterogeneous data from multiple gene loci and accommodates multiple fossil calibration nodes. A birth-death process with species sampling is used to specify a prior for divergence times, which allows easy assessment of the effects of that prior on posterior time estimates. We propose a ...

Journal: :Information geometry 2022

Abstract The logarithmic divergence is an extension of the Bregman divergence motivated by optimal transport and a generalized convex duality, and satisfies many remarkable properties. Using the geometry induced by this divergence, we introduce a generalization of continuous-time mirror descent that we term conformal mirror descent. We derive its dynamics under a generalized mirror map, and show that it is a time change of a corresponding Hessian gradient flow. We also prove converg...

2008
Le Song Mark D. Reid Robert C. Williamson Alex J. Smola

We propose an approach for estimating f-divergences that exploits a new representation of an f-divergence as a weighted integral of cost-weighted Bayes risks. We are therefore able to reduce f-divergence estimation to a problem of posterior conditional probability estimation. We provide both batch and online implementations of our approach and analyze their convergence. Empirically, we show ...
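As background for this abstract, an f-divergence between two discrete distributions can be computed directly from its definition, D_f(P‖Q) = Σᵢ qᵢ f(pᵢ/qᵢ); choosing f(t) = t log t recovers the Kullback-Leibler divergence. This is a minimal sketch of the quantity being estimated, not of the paper's Bayes-risk estimator.

```python
import math

def f_divergence(p, q, f):
    """Compute D_f(P||Q) = sum_i q_i * f(p_i / q_i) for discrete distributions."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# f(t) = t*log(t) recovers the Kullback-Leibler divergence
kl_generator = lambda t: t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = f_divergence(p, q, kl_generator)   # KL(P || Q), nonnegative, zero iff P == Q
```

Since f is convex with f(1) = 0, Jensen's inequality guarantees the result is nonnegative and vanishes exactly when the two distributions coincide.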

2011
Jascha Sohl-Dickstein Peter Battaglino Michael R. DeWeese

Fitting probabilistic models to data is often difficult, due to the general intractability of the partition function and its derivatives. Here we propose a new parameter estimation technique that does not require computing an intractable normalization factor or sampling from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed da...

Journal: :IEEE Trans. Information Theory 1992
Andrew R. Barron László Györfi Edward C. van der Meulen

The problem of the nonparametric estimation of a probability distribution is considered from three viewpoints: the consistency in total variation, the consistency in information divergence, and consistency in reversed order information divergence. These types of consistencies are relatively strong criteria of convergence, and a probability distribution cannot be consistently estimated in either...

2016
G T Lloyd D W Bapst M Friedman K E Davis

Branch lengths, measured in character changes, are an essential requirement of clock-based divergence estimation, regardless of whether the fossil calibrations used represent nodes or tips. However, a separate set of divergence time approaches is typically used to date palaeontological trees, which may lack such branch lengths. Among these methods, sophisticated probabilistic approaches have rec...

2016
Rim Chriki-Adeeb Ali Chriki

Accurate estimation of divergence times of soil bacteria that form nitrogen-fixing associations with most leguminous plants is challenging because of a limited fossil record and complexities associated with molecular clocks and phylogenetic diversity of root nodule bacteria, collectively called rhizobia. To overcome the lack of fossil record in bacteria, divergence times of host legumes were us...

2014
Abdulhakim Ali Qahtan Suojin Wang Raymond J. Carroll Xiangliang Zhang

Streaming data are dynamic in nature with frequent changes. To detect such changes, most methods measure the difference between the data distributions in a current time window and a reference window. Divergence metrics and density estimation are required to measure the difference between the data distributions. Our study shows that the Kullback-Leibler (KL) divergence, the most popular metric f...
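The window-comparison scheme this abstract describes can be sketched by binning a reference window and a current window on a common grid and computing the KL divergence between the two histograms; a distribution shift then shows up as a jump in the divergence. The bin count and smoothing constant below are illustrative choices, not the paper's.

```python
import numpy as np

def kl_from_histograms(ref, cur, bins=20, eps=1e-9):
    """Estimate KL(ref || cur) by binning both windows on a shared grid,
    with additive smoothing so empty bins do not produce infinities."""
    lo = min(ref.min(), cur.min())
    hi = max(ref.max(), cur.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(ref, bins=edges)
    q, _ = np.histogram(cur, bins=edges)
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
same = kl_from_histograms(rng.normal(0, 1, 2000), rng.normal(0, 1, 2000))
shifted = kl_from_histograms(rng.normal(0, 1, 2000), rng.normal(2, 1, 2000))
```

A change detector would flag the stream when the estimated divergence between the current and reference windows exceeds a threshold; `shifted` comes out far larger than `same` here.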

2001
Alfred O. Hero Bing Ma Olivier Michel John Gorman

Motivated by Chernoff’s bound on asymptotic probability of error we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback-Leibler divergence and the Hellinger affinity between the probability densi...
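For discrete distributions the Rényi (alpha) divergence mentioned in this abstract has a closed form, D_α(P‖Q) = (α − 1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α), and it approaches the KL divergence as α → 1. A minimal sketch, with illustrative distributions:

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1) between
    discrete distributions P and Q."""
    s = sum(pi**alpha * qi**(1 - alpha)
            for pi, qi in zip(p, q) if pi > 0 and qi > 0)
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]

# As alpha -> 1 the Rényi divergence approaches the KL divergence
near_kl = renyi_divergence(p, q, 1.0001)
```

Setting α = 1/2 relates the divergence to the Hellinger affinity Σᵢ √(pᵢqᵢ), which is the connection the abstract alludes to.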
