Search results for: global gradient algorithm

Number of results: 1,260,152

Journal: SIAM Journal on Optimization, 2006
Qin Ni, Chen Ling, Liqun Qi, Kok Lay Teo

In this paper, a truncated projected Newton-type algorithm is presented for solving large-scale semi-infinite programming problems. The method is a hybrid of a truncated projected Newton direction and a modified projected gradient direction. The truncated projected Newton method is used to solve the constrained nonlinear system. In order to guarantee global convergence, a robust loss function ...

Journal: Comp. Opt. and Appl., 2002
Mike C. Bartholomew-Biggs, Steven C. Parkhurst, Simon P. Wilson

In this paper we discuss a global optimization problem arising in the calculation of aircraft flight paths. Since gradient information for this problem may not be readily available, a direct-search algorithm (DIRECT), proposed by Jones et al. [11], appears to be a promising solution technique. We describe some numerical experience in which DIRECT is used in several different ways to solve a samp...
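
As a rough illustration of how DIRECT is applied when gradients are unavailable, the sketch below calls SciPy's implementation (scipy.optimize.direct, available in SciPy 1.9+) on a toy multimodal function. The flight-path objective from the paper is not reproduced; the objective, bounds, and evaluation budget are arbitrary stand-ins.

import numpy as np
from scipy.optimize import direct, Bounds   # DIRECT solver, SciPy >= 1.9


def objective(x):
    # Toy multimodal 2-D function standing in for the flight-path cost.
    return np.sin(3.0 * x[0]) * np.cos(2.0 * x[1]) + 0.1 * (x[0] ** 2 + x[1] ** 2)


# DIRECT needs only box bounds and function evaluations -- no gradient information.
bounds = Bounds([-3.0, -3.0], [3.0, 3.0])
result = direct(objective, bounds, maxfun=2000)
print("best point:", result.x, "best value:", result.fun)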

2015
Jialei Wang, Scott C. Clark, Eric Liu, Peter I. Frazier

We consider parallel global optimization of derivative-free expensive-to-evaluate functions, and proposes an efficient method based on stochastic approximation for implementing a conceptual Bayesian optimization algorithm proposed by [10]. To accomplish this, we use infinitessimal perturbation analysis (IPA) to construct a stochastic gradient estimator and show that this estimator is unbiased.
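
The estimator in the abstract differentiates a multi-point Bayesian acquisition function through a Gaussian-process model, which is beyond a short excerpt. The sketch below only illustrates the underlying IPA (pathwise) idea on a hypothetical, much simpler expectation F(theta) = E[g(theta + Z)] with Z standard normal: differentiating the sample path gives an unbiased gradient estimate that drives stochastic gradient ascent.

import numpy as np

rng = np.random.default_rng(2)


def g(x):
    # Toy smooth payoff; F(theta) = E[g(theta + Z)] is maximized at theta = 0.
    return -np.sum(x ** 2)


def grad_g(x):
    return -2.0 * x


def ipa_gradient(theta, n_samples=32):
    # Unbiased pathwise (IPA) estimate of d/d(theta) E[g(theta + Z)], Z ~ N(0, I):
    # differentiate the sample path g(theta + z) and average over samples.
    z = rng.standard_normal((n_samples, theta.size))
    return grad_g(theta + z).mean(axis=0)


theta = np.array([2.0, -1.5])
for _ in range(200):
    theta = theta + 0.05 * ipa_gradient(theta)   # stochastic gradient ascent
print("theta after ascent:", theta)              # should approach the maximizer 0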

2012
H. X. Chen, Y. F. Zhang, W. S. Zhang, S. Fu, Y. C. Chen, Y. L. Li, T. Zhou

The CFD analysis and optimization of multi-element airfoils are presented. A window-embedment technique is used to automatically generate grids as the geometry evolves. The CFD code NSAWET is employed for performance and flow-field analysis. Genetic algorithms are used along with gradient methods to obtain a good compromise between global optimization capability and efficiency. ...
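
The NSAWET flow solver and the airfoil geometry pipeline are not available here, so the sketch below illustrates only the generic "global evolutionary search plus local gradient refinement" compromise on a standard multimodal test function, with SciPy's differential evolution standing in for the genetic algorithm and L-BFGS-B for the gradient method; the objective, bounds, and settings are placeholders.

import numpy as np
from scipy.optimize import differential_evolution, minimize


def rastrigin(x):
    # Standard multimodal test objective, standing in for the airfoil performance metric.
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))


bounds = [(-5.12, 5.12)] * 4

# Stage 1: evolutionary global search (polishing disabled to keep the stages separate).
ga_result = differential_evolution(rastrigin, bounds, maxiter=200, polish=False, seed=0)

# Stage 2: gradient-based local refinement for efficiency near the best candidate.
local_result = minimize(rastrigin, ga_result.x, method="L-BFGS-B", bounds=bounds)

print("evolutionary stage:", ga_result.fun)
print("after gradient refinement:", local_result.fun)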

2017
Fei Ye

In this paper, we propose a new automatic hyperparameter selection approach for determining the optimal network configuration (network structure and hyperparameters) for deep neural networks, using particle swarm optimization (PSO) in combination with a steepest gradient descent algorithm. In the proposed approach, network configurations are encoded as a set of real-valued m-dimensional vectors a...
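
A minimal sketch of the PSO-plus-steepest-descent idea on real-valued configuration vectors, assuming a cheap, smooth surrogate objective in place of the actual validation loss of a trained deep network; the swarm constants and the finite-difference polishing step are illustrative choices, not the paper's settings.

import numpy as np

rng = np.random.default_rng(0)


def surrogate_loss(x):
    # Hypothetical smooth stand-in for the validation loss of one configuration.
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.sin(5.0 * x) ** 2)


def pso(f, dim=5, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest


def numerical_gradient(f, x, h=1e-5):
    # Central-difference gradient for the steepest-descent polish.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g


best = pso(surrogate_loss)
for _ in range(50):                                  # gradient-descent refinement
    best = best - 0.05 * numerical_gradient(surrogate_loss, best)
print("refined configuration:", best, "loss:", surrogate_loss(best))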

Journal: Optimization Methods and Software, 2013
Frank E. Curtis, Xiaocun Que

We present an algorithm for the minimization of f : R^n → R, assumed to be locally Lipschitz and continuously differentiable in an open dense subset D of R^n. The objective f may be nonsmooth and/or nonconvex. The method is based on the gradient sampling algorithm (GS) of Burke, Lewis, and Overton [SIAM J. Optim., 15 (2005), pp. 751-779]. It differs, however, from previously proposed versions of ...
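
For orientation, the sketch below implements a bare-bones version of the basic gradient sampling iteration (sample gradients near the iterate, take the minimum-norm element of their convex hull as the negative search direction, backtrack). It omits the differentiability checks and the sampling-radius reduction of the full GS method and of the variant in this paper, and the nonsmooth test function is a made-up example.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)


def f(x):
    # Nonsmooth test objective: pointwise maximum of two quadratics.
    return max(np.sum((x - 1.0) ** 2), np.sum((x + 1.0) ** 2))


def grad(x):
    # Gradient of f where it is differentiable (ties broken arbitrarily).
    if np.sum((x - 1.0) ** 2) >= np.sum((x + 1.0) ** 2):
        return 2.0 * (x - 1.0)
    return 2.0 * (x + 1.0)


def min_norm_in_hull(G):
    # Minimum-norm element of the convex hull of the rows of G (small QP via SLSQP).
    m = G.shape[0]
    res = minimize(lambda lam: np.sum((lam @ G) ** 2),
                   np.full(m, 1.0 / m),
                   bounds=[(0.0, 1.0)] * m,
                   constraints=({"type": "eq", "fun": lambda lam: np.sum(lam) - 1.0},))
    return res.x @ G


def gradient_sampling(x, eps=0.1, m=10, iters=50):
    for _ in range(iters):
        # Gradients at x and at m points sampled in an eps-box around x.
        pts = x + eps * rng.uniform(-1.0, 1.0, (m, x.size))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_in_hull(G)
        if np.linalg.norm(g) < 1e-8:
            break
        t = 1.0
        for _ in range(30):                      # backtracking Armijo line search
            if f(x - t * g) <= f(x) - 1e-4 * t * np.dot(g, g):
                break
            t *= 0.5
        x = x - t * g
    return x


x_star = gradient_sampling(np.array([3.0, -2.0]))
print("approximate minimizer:", x_star, "f:", f(x_star))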

Journal: SIAM Journal on Optimization, 2011
Warren R. Scott, Peter I. Frazier, Warren B. Powell

We extend the concept of the correlated knowledge-gradient policy for ranking and selection of a finite set of alternatives to the case of continuous decision variables. We propose an approximate knowledge gradient for problems with continuous decision variables in the context of a Gaussian process regression model in a Bayesian setting, along with an algorithm to maximize the approximate knowl...

Journal: CoRR, 2017
Penghang Yin, Minh Pham, Adam M. Oberman, Stanley Osher

In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step (backward Euler) is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed...
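
A rough sketch of how such an implicit (backward-Euler) step might look for the k-means objective, with the implicit equation approximated by a stochastic fixed-point iteration over mini-batch gradients and the trajectory average carried to the next iterate. The batch size, step size, and iteration counts are invented for illustration and are not the paper's settings.

import numpy as np

rng = np.random.default_rng(3)


def minibatch_grad(centroids, batch):
    # Mini-batch gradient of the k-means objective with respect to the centroids.
    d2 = ((batch[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    assign = d2.argmin(axis=1)                   # nearest centroid per batch point
    grad = np.zeros_like(centroids)
    for j in range(centroids.shape[0]):
        pts = batch[assign == j]
        if len(pts):
            grad[j] = 2.0 * (centroids[j] - pts.mean(axis=0))
    return grad


def implicit_kmeans(data, k=3, eta=0.5, outer=50, inner=20, batch_size=64):
    centroids = data[rng.choice(len(data), k, replace=False)].copy()
    for _ in range(outer):
        y, trajectory = centroids.copy(), []
        for _ in range(inner):
            batch = data[rng.choice(len(data), batch_size, replace=False)]
            # Stochastic fixed-point map for the backward-Euler (implicit) step:
            # y <- C_k - eta * (mini-batch gradient evaluated at y).
            y = centroids - eta * minibatch_grad(y, batch)
            trajectory.append(y)
        # The trajectory average is carried over as the next gradient iterate.
        centroids = np.mean(trajectory, axis=0)
    return centroids


# Synthetic data: three Gaussian blobs.
data = np.vstack([rng.normal(c, 0.3, (200, 2)) for c in ([0, 0], [3, 3], [0, 3])])
print(implicit_kmeans(data))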

Journal: CoRR, 2017
Marcel Campen, Denis Zorin

An algorithm for the computation of global discrete conformal parametrizations with prescribed global holonomy signatures for triangle meshes was recently described in [Campen and Zorin 2017]. In this paper we provide a detailed analysis of convergence and correctness of this algorithm. We generalize and extend ideas of [Springborn et al. 2008] to show a connection of the algorithm to Newton’s ...

[Chart: number of search results per year]