Depth-Scaled Biased Stochastic Sampling
Authors
Abstract
Several techniques have recently been developed that are based on discrepancy analysis in heuristic-based search trees. We offer a comparison of two of these proposed techniques: depth-bounded discrepancy search (DDS) and heuristic-biased stochastic sampling (HBSS), and one new technique, depth-scaled biased stochastic sampling (DSBSS). The new technique uses the framework of HBSS, but takes into account that mistakes are less likely to occur deep in a search tree. Through this comparison, we show that DSBSS performs systematically better than HBSS, and quickly approaches DDS when finding a goal in a limited amount of search.
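The abstract describes DSBSS as HBSS with a bias that accounts for heuristic mistakes being rarer deep in the tree. The sketch below illustrates that idea in the standard HBSS probe framework: children are ranked by the heuristic and one is chosen with probability proportional to a bias function of its rank. The specific bias functions (`hbss_bias`, `dsbss_bias`) and the depth-scaling parameter `alpha` are illustrative assumptions for this sketch, not the paper's actual definitions.

```python
import math
import random

def hbss_probe(root, children_of, heuristic, bias, max_depth):
    """One stochastic probe from the root toward a leaf.

    At each node, children are ranked by the heuristic (rank 1 = best)
    and one is selected with probability proportional to bias(rank, depth).
    """
    node, depth = root, 0
    path = [root]
    while depth < max_depth:
        kids = children_of(node)
        if not kids:
            break  # reached a leaf
        ranked = sorted(kids, key=heuristic)  # rank 1 = heuristically best
        weights = [bias(rank, depth) for rank in range(1, len(ranked) + 1)]
        node = random.choices(ranked, weights=weights)[0]
        path.append(node)
        depth += 1
    return path

# Plain HBSS: a depth-independent bias, here exponential in rank.
def hbss_bias(rank, depth):
    return math.exp(-rank)

# DSBSS-style bias (hypothetical form): the bias toward the heuristic's
# top-ranked child sharpens with depth, encoding the assumption that
# heuristic mistakes are less likely deep in the search tree.
def dsbss_bias(rank, depth, alpha=0.5):
    return math.exp(-rank * (1.0 + alpha * depth))
```

Under a depth-scaled bias, probes diverge from the heuristic's preferred path mostly near the root, where mistakes are most likely, which is also the intuition behind depth-bounded discrepancy search.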
Similar papers
Enhancing Stochastic Search Performance by Value-Biased Randomization of Heuristics
Stochastic search algorithms are often robust, scalable problem solvers. In this paper, we concern ourselves with the class of stochastic search algorithms called stochastic sampling. Randomization in such a search framework can be an effective means of expanding search around a stochastic neighborhood of a strong domain heuristic. Specifically, we show that a value-biased approach can be more ...
Stochastic Gradient Estimate Variance in Contrastive Divergence and Persistent Contrastive Divergence
Contrastive Divergence (CD) and Persistent Contrastive Divergence (PCD) are popular methods for training Restricted Boltzmann Machines. However, both methods use an approximate method for sampling from the model distribution. As a side effect, these approximations yield significantly different biases and variances for stochastic gradient estimates of individual data points. It is well known tha...
Biased Importance Sampling for Deep Neural Network Training
Importance sampling has been successfully used to accelerate stochastic optimization in many convex problems. However, the lack of an efficient way to calculate the importance still hinders its application to Deep Learning. In this paper, we show that the loss value can be used as an alternative importance metric, and propose a way to efficiently approximate it for a deep model, using a small m...
Liu Estimates and Influence Analysis in Regression Models with Stochastic Linear Restrictions and AR(1) Errors
In linear regression models with an AR(1) error structure, when collinearity exists, stochastic linear restrictions or modifications of biased estimators (including Liu estimators) can be used to reduce the estimated variance of the regression coefficient estimates. In this paper, the combination of the biased Liu estimator and the stochastic linear restrictions estimator is considered to overcom...
Using Prior Knowledge with Adaptive Probing
When searching a tree to find the best leaf, complete search methods such as depth-first search and depth-bounded discrepancy search use a fixed deterministic order that may or may not be appropriate for the tree at hand. Adaptive probing is a recently proposed stochastic method that attempts to adjust its sampling on-line to focus on areas of the tree that seem to contain good solutions. While ...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2005