Search results for: smoothed minima
Number of results: 40,824
We investigate contraction of the Wasserstein distances on \(\mathbb {R}^{d}\) under Gaussian smoothing. It is well known that the heat semigroup is exponentially contractive with respect to these distances on manifolds of positive curvature; on flat Euclidean space, however, where the semigroup corresponds to smoothing measures by convolution, the situation is more subtle. We prove precise asymptotics for the 2-Wasserstein distance under the action of the semigroup, and...
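The effect described in this abstract is easy to see concretely in one dimension, where the 2-Wasserstein distance between two Gaussians has a closed form and Gaussian smoothing simply inflates each variance. A minimal sketch (my own toy illustration of the convolution-smoothing setup, not the paper's construction):

```python
import math

def w2_gauss(m1, s1, m2, s2):
    # Closed-form 2-Wasserstein distance between 1-D Gaussians:
    # W2^2 = (m1 - m2)^2 + (s1 - s2)^2
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def smooth(s, t):
    # Convolving N(m, s^2) with N(0, t) yields N(m, s^2 + t)
    return math.sqrt(s ** 2 + t)

before = w2_gauss(0.0, 1.0, 0.0, 2.0)
after = w2_gauss(0.0, smooth(1.0, 4.0), 0.0, smooth(2.0, 4.0))
print(before, after)  # the smoothed pair is strictly closer
```

The gap between the standard deviations shrinks under convolution (sqrt(s^2 + t) flattens differences in s), which is the 1-D shadow of the contraction the paper quantifies.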
We propose a derivative-free algorithm for finding high-quality local minima of functions that require significant computational resources to evaluate. Our algorithm efficiently utilizes the computational resources allocated to it and also has strong theoretical guarantees: almost surely, it starts only a finite number of local optimization runs and identifies all local minima. We propose metrics for m...
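The multistart idea underlying such algorithms, repeatedly launching local optimization runs from random starting points and collecting the distinct local minima found, can be sketched as follows. This is a toy version with a crude finite-difference descent; the paper's actual algorithm, budget management, and stopping rule are more sophisticated:

```python
import random

def f(x):
    # Toy objective with two local minima, near x = -1 and x = 2
    return (x + 1) ** 2 * (x - 2) ** 2 + 0.1 * x

def local_min(x, lr=1e-3, steps=5000):
    # Crude local optimizer: gradient descent with a
    # central finite-difference gradient estimate
    h = 1e-6
    for _ in range(steps):
        g = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * g
    return x

random.seed(0)
minima = set()
for _ in range(20):  # multistart: repeated local runs from random points
    x = local_min(random.uniform(-4, 5))
    minima.add(round(x, 2))
print(sorted(minima))
```

With enough random starts, every basin of attraction is sampled and both minima appear; the paper's contribution is deciding adaptively how many runs to start so that, almost surely, only finitely many are needed.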
In deep learning, depth, as well as nonlinearity, creates non-convex loss surfaces. Does depth alone, then, create bad local minima? In this paper, we prove that without nonlinearity, depth alone does not create bad local minima, although it does induce a non-convex loss surface. Using this insight, we greatly simplify a recently proposed proof showing that all of the local minima of feedforward deep l...
In many applications of terrain analysis, pits or local minima are considered artifacts that must be removed before the terrain can be used. Most of the existing methods for local minima removal work only for raster terrains. In this paper we consider algorithms to remove local minima from polyhedral terrains, by modifying the heights of the vertices. To limit the changes introduced to the terr...
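Before removal, the pits must be located. For the simpler raster case the abstract mentions, a local minimum is a cell strictly lower than all of its neighbors; a minimal detector (my own sketch, not the paper's polyhedral-terrain algorithm) looks like this:

```python
def find_pits(grid):
    # A pit (local minimum) is a cell strictly lower than all its
    # 8-connected neighbors; boundary cells just have fewer neighbors.
    rows, cols = len(grid), len(grid[0])
    pits = []
    for r in range(rows):
        for c in range(cols):
            nbrs = [grid[rr][cc]
                    for rr in range(max(r - 1, 0), min(r + 2, rows))
                    for cc in range(max(c - 1, 0), min(c + 2, cols))
                    if (rr, cc) != (r, c)]
            if all(grid[r][c] < v for v in nbrs):
                pits.append((r, c))
    return pits

terrain = [
    [5, 4, 5, 6],
    [4, 1, 5, 2],
    [5, 5, 5, 6],
]
print(find_pits(terrain))  # [(1, 1), (1, 3)]
```

Raster removal methods then typically fill or breach each detected pit; the paper's point is that on polyhedral terrains the analogous operation must instead adjust vertex heights.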
Convolutional Neural Networks (CNNs) and locally connected layers are limited in capturing the importance of, and relations among, different local receptive fields, which are often crucial for tasks such as face verification, visual question answering, and word sequence prediction. To tackle this issue, we propose a novel locally smoothed neural network (LSNN) in this paper. The main idea is to repres...
State-action value functions (i.e., Q-values) are ubiquitous in reinforcement learning (RL), giving rise to popular algorithms such as SARSA and Q-learning. We propose a new notion of action value defined by a Gaussian smoothed version of the expected Q-value. We show that such smoothed Q-values still satisfy a Bellman equation, making them learnable from experience sampled from an environment....
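For continuous actions, the Gaussian-smoothed value this abstract describes is the expectation of Q over actions perturbed by Gaussian noise, and it can be estimated by simple Monte Carlo. A toy sketch with a made-up Q-function (the paper learns these values from sampled experience rather than averaging a known Q):

```python
import random

def q(state, action):
    # Toy continuous-action Q-function with a sharp peak at action = 1
    return -abs(action - 1.0) + state

def smoothed_q(state, action, sigma=0.5, n=10_000, seed=0):
    # Monte Carlo estimate of E[Q(s, a + sigma * eps)], eps ~ N(0, 1)
    rng = random.Random(seed)
    return sum(q(state, action + sigma * rng.gauss(0, 1))
               for _ in range(n)) / n

print(q(0.0, 1.0), smoothed_q(0.0, 1.0))
```

Smoothing pulls the sharp peak down (here toward -sigma * sqrt(2/pi), about -0.399), which illustrates why the smoothed values form a well-behaved surrogate while still satisfying a Bellman equation.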
In this review, the theory and application of smoothed particle hydrodynamics (SPH) since its inception in 1977 are discussed. Emphasis is placed on its strengths and weaknesses, the analogy with particle dynamics, and the numerous areas where SPH has been successfully applied. © 2005 IOP Publishing Ltd.
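The core SPH approximation, estimating the density at a particle as a kernel-weighted sum over its neighbors, rho_i = sum_j m_j W(x_i - x_j, h), can be sketched in one dimension with the standard cubic spline kernel (a minimal illustration, not the review's formulation):

```python
def w_cubic(r, h):
    # Standard 1-D cubic spline SPH kernel with support 2h;
    # normalization factor 2 / (3h) makes it integrate to 1.
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1:
        return sigma * (1 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2:
        return sigma * 0.25 * (2 - q) ** 3
    return 0.0

def sph_density(positions, masses, h):
    # SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)
    return [sum(m * w_cubic(x - xj, h) for xj, m in zip(positions, masses))
            for x in positions]

xs = [i * 0.1 for i in range(20)]     # uniform particle spacing dx = 0.1
rho = sph_density(xs, [0.1] * 20, h=0.2)
print(rho[10])  # interior density ~ mass / spacing = 1.0
```

The same kernel-sum structure carries over to momentum and energy equations, which is the "analogy with particle dynamics" the review emphasizes.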
We study inequalities that simultaneously relate the number of lattice points, the volume, and the successive minima of a convex body to one another. One main ingredient in establishing these relations is Blaschke's shaking procedure, by which the problem can be reduced from arbitrary convex bodies to anti-blocking bodies. As a consequence of our results, we obtain an upper bound on the lattice point enumerator in terms of the successive minima, equivale...
We demonstrate the efficiency of the multidomain sampler (MDS) in finding multiple distinct global minima and low-energy local minima in the hydrophobic-polar (HP) lattice protein model. Extending the idea of partitioning energy space in the Wang-Landau algorithm, our approach introduces an additional partitioning scheme to divide the protein conformation space into local basins of attraction. ...