Search results for: global gradient algorithm
Number of results: 1260152
In this paper, a truncated projected Newton-type algorithm is presented for solving large-scale semi-infinite programming problems. It is a hybrid method that combines a truncated projected Newton direction with a modified projected gradient direction. The truncated projected Newton method is used to solve the constrained nonlinear system. In order to guarantee global convergence, a robust loss function ...
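As a rough illustration of the general idea of mixing a truncated Newton direction with a projected gradient fallback, the sketch below applies such a hybrid to a simple box-constrained quadratic. This is not the paper's semi-infinite programming method; the test problem, step rules, and safeguards are assumptions chosen for brevity, and the Hessian is assumed positive definite so conjugate gradients can truncate the Newton solve.

```python
import numpy as np
from scipy.sparse.linalg import cg

def project(x, lo, hi):
    """Projection onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def hybrid_step(f, grad, hess, x, lo, hi):
    """One iteration: try a truncated (CG-based) projected Newton step,
    fall back to a backtracking projected gradient step if it fails."""
    g = grad(x)
    # Truncated Newton direction: a few CG iterations on H d = -g
    # (assumes the Hessian is positive definite near the solution).
    d, _ = cg(hess(x), -g, maxiter=20)
    trial = project(x + d, lo, hi)
    if f(trial) < f(x):
        return trial
    t = 1.0
    while f(project(x - t * g, lo, hi)) >= f(x) and t > 1e-12:
        t *= 0.5                      # backtracking projected gradient
    return project(x - t * g, lo, hi)

# Toy usage: a convex quadratic with box constraints.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([-4.0, 1.0])
f = lambda x: 0.5 * x @ A @ x + b @ x
grad = lambda x: A @ x + b
hess = lambda x: A
x, lo, hi = np.zeros(2), np.array([0.0, -1.0]), np.array([2.0, 1.0])
for _ in range(25):
    x = hybrid_step(f, grad, hess, x, lo, hi)
print(x)   # approaches the constrained minimizer near (1.67, -1.0)
```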
In this paper we discuss a global optimization problem arising in the calculation of aircraft flight paths. Since gradient information for this problem may not be readily available, a direct-search algorithm (DIRECT), proposed by Jones et al. [11], appears to be a promising solution technique. We describe some numerical experience in which DIRECT is used in several different ways to solve a samp...
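For readers who want to try the DIRECT method itself, SciPy (version 1.8 or later) ships an implementation. The snippet below runs it on a standard test function rather than the paper's flight-path model, which is not reproduced here.

```python
# Illustrative use of the DIRECT algorithm (Jones et al.) via SciPy;
# the Styblinski-Tang test function stands in for the flight-path objective.
import numpy as np
from scipy.optimize import direct, Bounds

def styblinski_tang(x):
    x = np.asarray(x)
    return 0.5 * np.sum(x**4 - 16 * x**2 + 5 * x)

bounds = Bounds([-5.0, -5.0], [5.0, 5.0])
result = direct(styblinski_tang, bounds, maxfun=2000)
print(result.x, result.fun)   # global minimizer is near (-2.9035, -2.9035)
```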
We consider parallel global optimization of derivative-free, expensive-to-evaluate functions, and propose an efficient method based on stochastic approximation for implementing a conceptual Bayesian optimization algorithm proposed by [10]. To accomplish this, we use infinitesimal perturbation analysis (IPA) to construct a stochastic gradient estimator and show that this estimator is unbiased.
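The core trick behind an IPA (pathwise) estimator is to differentiate inside the expectation. A toy sketch follows for a single-point, expected-improvement-style acquisition; the posterior mean mu(x) and standard deviation sigma(x) below are made-up analytic stand-ins, not a fitted Gaussian process as in the paper.

```python
# Toy IPA / pathwise stochastic gradient estimator for
# d/dx E_Z[max(0, mu(x) + sigma(x) * Z - f_best)], Z ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
f_best = 0.0

mu = lambda x: np.sin(x)                # hypothetical posterior mean
dmu = lambda x: np.cos(x)
sigma = lambda x: 0.5 + 0.1 * x**2      # hypothetical posterior std
dsigma = lambda x: 0.2 * x

def ipa_gradient(x, n_samples=10000):
    """Unbiased Monte Carlo estimate of the acquisition gradient."""
    z = rng.standard_normal(n_samples)
    improvement = mu(x) + sigma(x) * z - f_best
    # Pathwise derivative: the kink at zero has measure zero, so the
    # indicator form below remains unbiased.
    return np.mean((improvement > 0) * (dmu(x) + dsigma(x) * z))

print(ipa_gradient(1.3))
```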
The CFD analysis and optimization of multi-element airfoils are presented. The Window-Embedment technique is used to automatically generate the grids as the geometry evolves. The CFD code NSAWET is employed for performance and flow-field analysis. Genetic algorithms are used along with gradient methods to obtain a good compromise between global optimization capability and efficiency. ...
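The general "global evolutionary search plus gradient refinement" pattern can be sketched in a few lines. Differential evolution stands in here for the paper's genetic algorithm, and the Rastrigin test function stands in for the CFD objective evaluated by NSAWET; both substitutions are assumptions.

```python
# Hybrid sketch: population-based global search, then gradient-based polish.
import numpy as np
from scipy.optimize import differential_evolution, minimize

def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 4
# polish=False so the explicit gradient refinement below does the local work.
coarse = differential_evolution(rastrigin, bounds, maxiter=200, seed=1,
                                polish=False)
refined = minimize(rastrigin, coarse.x, method="L-BFGS-B", bounds=bounds)
print(coarse.fun, "->", refined.fun)
```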
In this paper, we propose a new automatic hyperparameter selection approach for determining the optimal network configuration (network structure and hyperparameters) for deep neural networks using particle swarm optimization (PSO) in combination with a steepest gradient descent algorithm. In the proposed approach, network configurations were coded as a set of real-number m-dimensional vectors a...
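A minimal particle swarm optimization loop is sketched below. It is a generic PSO, not the paper's exact real-vector encoding; `validation_loss` is a hypothetical stand-in for training a network with gradient descent on the decoded hyperparameters and returning its validation error, and the bounds and PSO coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def validation_loss(v):
    # Stand-in for "train a network with these hyperparameters, return loss".
    return np.sum((v - np.array([0.01, 128.0, 3.0])) ** 2)

lo = np.array([1e-4, 16.0, 1.0])        # e.g. learning rate, width, depth
hi = np.array([1e-1, 512.0, 8.0])
n_particles, n_iter = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration weights

x = rng.uniform(lo, hi, size=(n_particles, lo.size))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([validation_loss(p) for p in x])
gbest = pbest[np.argmin(pbest_f)]

for _ in range(n_iter):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([validation_loss(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print(gbest)   # best hyperparameter vector found by the swarm
```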
We present an algorithm for the minimization of f : R^n → R, assumed to be locally Lipschitz and continuously differentiable in an open dense subset D of R^n. The objective f may be nonsmooth and/or nonconvex. The method is based on the gradient sampling algorithm (GS) of Burke, Lewis, and Overton [SIAM J. Optim., 15 (2005), pp. 751-779]. It differs, however, from previously proposed versions of ...
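The heart of gradient sampling is simple to sketch: sample gradients at nearby points and take the minimum-norm element of their convex hull as a search direction. The version below is only in the spirit of the GS method of Burke, Lewis, and Overton; the quadratic program is solved with SLSQP for brevity, and the sampling radius, sample count, and test function are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def gs_direction(grad, x, eps=1e-3, m=10, rng=np.random.default_rng(0)):
    """Approximate nonsmooth steepest-descent direction at x."""
    pts = x + eps * rng.standard_normal((m, x.size))
    G = np.vstack([grad(x)] + [grad(p) for p in pts])     # sampled gradients
    # min ||G^T w||^2  subject to  w >= 0, sum(w) = 1
    obj = lambda w: np.sum((G.T @ w) ** 2)
    cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    w0 = np.full(G.shape[0], 1.0 / G.shape[0])
    res = minimize(obj, w0, bounds=[(0, 1)] * G.shape[0],
                   constraints=cons, method="SLSQP")
    return -(G.T @ res.x)      # negative of the min-norm convex combination

# Example near the kink of f(x) = |x1| + 2|x2|: the sampled gradients
# disagree in the first coordinate, so the GS direction is roughly (0, 2).
grad = lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])])
print(gs_direction(grad, np.array([1e-4, -0.2])))
```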
We extend the concept of the correlated knowledge-gradient policy for ranking and selection of a finite set of alternatives to the case of continuous decision variables. We propose an approximate knowledge gradient for problems with continuous decision variables in the context of a Gaussian process regression model in a Bayesian setting, along with an algorithm to maximize the approximate knowl...
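On a discretized set of alternatives, a knowledge-gradient value can be estimated by Monte Carlo, which gives a feel for the quantity being maximized. In the sketch below, `mu` is the current posterior mean and `sigma_tilde` the vector of predictive-update coefficients for one candidate measurement; both are made-up inputs rather than quantities from a fitted Gaussian process, and the continuous-variable maximization from the paper is not attempted.

```python
import numpy as np

def knowledge_gradient(mu, sigma_tilde, n_samples=100000,
                       rng=np.random.default_rng(0)):
    """Monte Carlo estimate of E[max_i(mu_i + sigma_tilde_i * Z)] - max_i mu_i."""
    z = rng.standard_normal(n_samples)                       # scalar update noise
    posterior_max = np.max(mu[None, :] + z[:, None] * sigma_tilde[None, :], axis=1)
    return posterior_max.mean() - np.max(mu)

mu = np.array([0.0, 0.2, 0.1, -0.3])
sigma_tilde = np.array([0.05, 0.4, 0.1, 0.02])
print(knowledge_gradient(mu, sigma_tilde))
```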
In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step (backward Euler) is solved via a stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. The average of the fixed-point trajectory is then carried over to the next gradient step. We draw connections between the proposed...
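A small sketch of this recipe is given below: each outer step approximately solves the backward Euler equation C = C_k - eta * grad f(C) by iterating with mini-batch gradients and carrying forward the trajectory average. Step size, batch size, and iteration counts are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def kmeans_grad(C, X_batch):
    """Mini-batch gradient of the k-means objective w.r.t. the centers C."""
    d2 = ((X_batch[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    assign = d2.argmin(axis=1)
    G = np.zeros_like(C)
    for j in range(C.shape[0]):
        pts = X_batch[assign == j]
        if len(pts):
            G[j] = 2.0 * (len(pts) * C[j] - pts.sum(axis=0)) / len(X_batch)
    return G

def implicit_step(C_k, X, eta=0.5, batch=64, inner_iters=30,
                  rng=np.random.default_rng(0)):
    """Approximately solve C = C_k - eta * grad f(C) by stochastic
    fixed-point iteration; return the trajectory average."""
    C, running_sum = C_k.copy(), np.zeros_like(C_k)
    for _ in range(inner_iters):
        idx = rng.choice(len(X), size=batch, replace=False)
        C = C_k - eta * kmeans_grad(C, X[idx])        # fixed-point update
        running_sum += C
    return running_sum / inner_iters

# Toy usage: two Gaussian blobs, two centers.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3, 1, (200, 2)), rng.normal(3, 1, (200, 2))])
C = np.vstack([X[0], X[200]])            # one initial center per blob
for _ in range(20):
    C = implicit_step(C, X, rng=rng)
print(C)   # centers approach the two blob means
```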
An algorithm for the computation of global discrete conformal parametrizations with prescribed global holonomy signatures for triangle meshes was recently described in [Campen and Zorin 2017]. In this paper we provide a detailed analysis of convergence and correctness of this algorithm. We generalize and extend ideas of [Springborn et al. 2008] to show a connection of the algorithm to Newton’s ...