Irreversible Monte Carlo Algorithms for Efficient Sampling
Authors
Abstract
Equilibrium systems evolve according to Detailed Balance (DB). This principle has guided the development of Monte Carlo sampling techniques, of which the Metropolis-Hastings (MH) algorithm is the most famous representative. It is also known, however, that DB is sufficient but not necessary for convergence to a prescribed stationary distribution. We construct an irreversible deformation of a given reversible algorithm that can dramatically improve sampling from a known distribution. Our transformation modifies the transition rates while keeping the structure of transitions intact. To illustrate the general scheme we design an irreversible version of Metropolis-Hastings (IMH) and test it on the example of a spin cluster. For this model the standard MH algorithm suffers from critical slowdown, whereas IMH is free of it.
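The abstract only sketches the construction. As a hedged illustration of the lifting idea behind irreversible MCMC, the minimal sketch below runs a direction-augmented ("lifted") Metropolis sampler on a one-dimensional chain of states: the walker carries an auxiliary direction, a rejected directed move flips the direction instead of leaving the walker in place, which breaks detailed balance while preserving the target marginal. Whether this matches the paper's exact IMH rates is an assumption; the function name `lifted_metropolis_1d`, the toy Gaussian-shaped target, and all parameter values are hypothetical.

```python
import numpy as np

def lifted_metropolis_1d(log_pi, n_states, n_steps, rng=None, x0=0):
    """Irreversible (lifted) Metropolis sampler on states {0, ..., n_states-1}.

    The chain carries an auxiliary direction sigma in {+1, -1}.  In the (+)
    replica only moves x -> x+1 are proposed, in the (-) replica only x -> x-1;
    a rejected move flips the replica instead of staying put.  The marginal
    stationary distribution over x is proportional to exp(log_pi(x)), but the
    joint dynamics on (x, sigma) violates detailed balance -- it is irreversible.
    """
    rng = np.random.default_rng() if rng is None else rng
    x, sigma = x0, 1
    samples = np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        y = x + sigma
        if 0 <= y < n_states and np.log(rng.random()) < log_pi(y) - log_pi(x):
            x = y                      # accept the directed move
        else:
            sigma = -sigma             # rejection flips the direction
        samples[t] = x
    return samples

if __name__ == "__main__":
    # Hypothetical toy target on a one-dimensional "magnetization" axis.
    N = 40
    log_pi = lambda m: -0.5 * (m - N / 2) ** 2 / 20.0
    s = lifted_metropolis_1d(log_pi, N, 100_000)
    print("empirical mean:", s.mean())
```

The directed proposals make the walker sweep ballistically through regions of comparable probability rather than diffuse, which is the kind of behaviour that can remove diffusive slowdown; the exact rates used in the paper's IMH may differ from this sketch.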
Similar Resources
Efficient Calculation of Risk Measures by Importance Sampling – the Heavy Tailed Case
Computation of extreme quantiles and tail-based risk measures using standard Monte Carlo simulation can be inefficient. A method to speed up computations is provided by importance sampling. We show that importance sampling algorithms, designed for efficient tail probability estimation, can significantly improve Monte Carlo estimators of tail-based risk measures. In the heavy-tailed setting, whe...
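As a rough, generic illustration of the importance-sampling idea referred to above (not the estimators from this paper), the sketch below compares plain Monte Carlo with importance sampling from a heavier-tailed proposal when estimating a Pareto tail probability. The Pareto target, the choice of proposal, and all parameter values are assumptions made only for the example.

```python
import numpy as np

def pareto_tail_prob_is(alpha=2.0, u=50.0, n=100_000, seed=0):
    """Estimate P(X > u) for X ~ Pareto(alpha) on [1, inf) two ways:
    plain Monte Carlo, and importance sampling from a heavier-tailed
    Pareto(1) proposal, which places far more mass beyond the threshold u.
    """
    rng = np.random.default_rng(seed)

    # Plain Monte Carlo: X = V**(-1/alpha) has the Pareto(alpha) law.
    x_mc = rng.random(n) ** (-1.0 / alpha)
    p_mc = np.mean(x_mc > u)

    # Importance sampling: draw from Pareto(1) with density g(x) = x**-2,
    # and reweight by f(x)/g(x) = alpha * x**(-(alpha+1)) / x**(-2).
    x_is = 1.0 / rng.random(n)
    w = alpha * x_is ** (-(alpha + 1.0)) / x_is ** (-2.0)
    p_is = np.mean((x_is > u) * w)

    return p_mc, p_is, u ** (-alpha)   # exact tail probability for reference

if __name__ == "__main__":
    print(pareto_tail_prob_is())
```

For thresholds far in the tail, the plain estimator sees almost no exceedances while the importance-sampling estimator still does, which is the variance-reduction effect the snippet above alludes to.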
Lookahead Strategies for Sequential Monte Carlo
Based on the principles of importance sampling and resampling, sequential Monte Carlo (SMC) encompasses a large set of powerful techniques dealing with complex stochastic dynamic systems. Many of these systems possess strong memory, with which future information can help sharpen the inference about the current state. By providing theoretical justification of several existing algorithms and intr...
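For context, the sketch below shows a minimal bootstrap particle filter, i.e. the plain importance-sampling-plus-resampling recursion the snippet builds on, without any lookahead. The linear-Gaussian state-space model and all parameter values are assumptions chosen only to make the example runnable.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=500, phi=0.9, q=1.0, r=1.0, seed=0):
    """Minimal bootstrap SMC filter for x_t = phi*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r).  Each step: propagate particles through the
    dynamics (importance sampling from the prior), weight them by the
    observation likelihood, then resample.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(q / (1 - phi ** 2)), n_particles)  # stationary init
    means = []
    for yt in y:
        x = phi * x + rng.normal(0.0, np.sqrt(q), n_particles)     # propagate
        logw = -0.5 * (yt - x) ** 2 / r                             # Gaussian log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                                 # filtered mean E[x_t | y_1..t]
        x = x[rng.choice(n_particles, n_particles, p=w)]            # multinomial resampling
    return np.array(means)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xs = [0.0]
    for _ in range(100):
        xs.append(0.9 * xs[-1] + rng.normal())
    y = np.array(xs[1:]) + rng.normal(size=100)
    print(bootstrap_particle_filter(y)[:5])
```

A lookahead strategy, as studied in the cited paper, would additionally use future observations when proposing or weighting particles; that part is not implemented here.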
Niching in Monte Carlo Filtering Algorithms
Nonlinear multimodal filtering problems are usually addressed via Monte Carlo algorithms. These algorithms involve sampling procedures that are similar to proportional selection in genetic algorithms, and that are prone to failure due to genetic drift. This work investigates the feasibility and the relevance of niching strategies in this context. Sharing methods are evaluated experimentally, an...
Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate Functions
The relatively high computational cost of Bayesian methods often limits their application to big data analysis. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo (MCMC) method, namely Hamiltonian Monte Carlo (HMC). The key idea...
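A minimal sketch of plain HMC is shown below for reference; it uses the exact gradient of the log-target rather than the neural-network surrogate proposed in the cited paper (a surrogate would stand in for `grad_log_p` when the exact gradient is expensive). The leapfrog settings and the toy Gaussian target are assumptions.

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, step=0.1, n_leap=20, rng=None):
    """One Hamiltonian Monte Carlo step: sample a Gaussian momentum,
    simulate Hamiltonian dynamics with the leapfrog integrator, then
    accept or reject with a Metropolis test on the total energy."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.normal(size=x.shape)
    xq, p = x.copy(), p0.copy()
    p += 0.5 * step * grad_log_p(xq)           # initial half momentum step
    for _ in range(n_leap - 1):
        xq += step * p                         # full position step
        p += step * grad_log_p(xq)             # full momentum step
    xq += step * p
    p += 0.5 * step * grad_log_p(xq)           # final half momentum step
    dH = (log_p(xq) - 0.5 * p @ p) - (log_p(x) - 0.5 * p0 @ p0)
    return (xq, True) if np.log(rng.random()) < dH else (x, False)

if __name__ == "__main__":
    # Hypothetical toy target: standard 2-D Gaussian.
    log_p = lambda x: -0.5 * x @ x
    grad = lambda x: -x
    x, rng, chain = np.zeros(2), np.random.default_rng(0), []
    for _ in range(2000):
        x, _ = hmc_step(x, log_p, grad, rng=rng)
        chain.append(x.copy())
    print(np.mean(chain, axis=0))
```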
Local Quasi-Monte Carlo Exploration
In physically-based image synthesis, the path space of light transport paths is usually explored by stochastic sampling. The two main families of algorithms are Monte Carlo/quasi-Monte Carlo sampling and Markov chain Monte Carlo. While the former is known for good uniform discovery of important regions, the latter facilitates efficient exploration of local effects. We introduce a hybrid samplin...
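To ground the quasi-Monte Carlo side of that comparison (this is not the hybrid sampler the cited paper introduces), the sketch below builds a standard 2-D Halton point set and compares it with pseudorandom points on a simple unit-square integral. The integrand and sample size are arbitrary choices for illustration.

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the van der Corput low-discrepancy sequence in the given base."""
    pts = np.empty(n)
    for i in range(n):
        x, f, k = 0.0, 1.0 / base, i + 1
        while k > 0:
            x += f * (k % base)     # reverse the base-b digits after the radix point
            k //= base
            f /= base
        pts[i] = x
    return pts

def halton_2d(n):
    """2-D Halton points (bases 2 and 3), a standard quasi-Monte Carlo point set."""
    return np.column_stack([van_der_corput(n, 2), van_der_corput(n, 3)])

if __name__ == "__main__":
    # Integrate f(u, v) = u * v over the unit square (exact value 0.25)
    # with QMC points versus ordinary pseudorandom Monte Carlo points.
    n = 4096
    q = halton_2d(n)
    r = np.random.default_rng(0).random((n, 2))
    print("QMC:", np.mean(q[:, 0] * q[:, 1]))
    print("MC :", np.mean(r[:, 0] * r[:, 1]))
```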
Journal: CoRR
Volume: abs/0809.0916
Pages: -
Publication date: 2008