Search results for: conjugate gradient descent

Number of results: 174,860

2015
Xiaoli Sun

In this paper, a new spectral PRP conjugate gradient method is proposed, which always generates a sufficient descent direction without any line search. Under standard conditions, global convergence of the proposed method is established when the standard Armijo or weak Wolfe line search is used. Moreover, we extend these results to the HS method. Numerical comparisons are reported with som...

2014
Mohd Asrul Hery Ibrahim Mustafa Mamat

The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as the most efficient method for solving unconstrained optimization problems. Therefore, in this paper, a new hybrid method between the conjugate gradient method and the quasi-Newton method for solving optimization problems is suggested....

2001
A. V. Panov

A computer model of a feed-forward neural network with one hidden layer is developed to reconstruct a physical field investigated by a fiber-optic measuring system. Gaussian distributions of a physical quantity are selected as learning patterns. The network is trained by error backpropagation using conjugate gradient and coordinate descent minimization of the deviation. The trained neur...

2009
LI ZHANG

Based on the secant condition often satisfied by quasi-Newton methods, two new versions of the Hestenes–Stiefel (HS) nonlinear conjugate gradient method are proposed, which are descent methods even with inexact line searches. The search directions of the proposed methods have the form d_k = −θ_k g_k + β_k^{HS} d_{k−1}, or d_k = −g_k + β_k^{HS} d_{k−1} + θ_k y_{k−1}. When exact line searches are used, the propos...
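Both displayed directions reduce to the classical Hestenes–Stiefel update d_k = −g_k + β_k^{HS} d_{k−1} when the θ_k term drops out. A minimal Python sketch of that base HS iteration, assuming an Armijo backtracking line search and an illustrative test function (the θ_k scalings of the paper's descent variants are omitted):

```python
import numpy as np

def hs_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Nonlinear CG with the classical Hestenes-Stiefel beta.

    Sketch only: Armijo backtracking line search, with a steepest-descent
    restart whenever the HS direction fails to be a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:           # safeguard: restart if not a descent direction
            d = -g
        t, c = 1.0, 1e-4         # Armijo backtracking from a unit step
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                          # y_{k-1} = g_k - g_{k-1}
        denom = d @ y
        beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                  # d_k = -g_k + beta_k^{HS} d_{k-1}
        x, g = x_new, g_new
    return x

# Illustrative convex test problem (not from the paper)
f = lambda x: (x[0] - 3.0) ** 2 + 2.0 * (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 4.0 * (x[1] + 1.0)])
x_star = hs_cg(f, grad, np.array([0.0, 0.0]))
```

On this strictly convex quadratic the iteration drives the gradient norm below the tolerance and returns the minimizer (3, −1).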

2015
Gonglin Yuan Xiabin Duan Wenjie Liu Xiaoliang Wang Zengru Cui Zhou Sheng Yongtang Shi

Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second for solving nonlinear equations. The first method uses two kinds of information: function values and gradient values. Both methods possess some good prop...

2017
Jialei Wang Jason D. Lee Mehrdad Mahdavi Mladen Kolar Nathan Srebro

We provide a unified optimization view of iterative Hessian sketch (IHS) and iterative dual random projection (IDRP). We establish a primal-dual connection between the Hessian sketch and dual random projection, and show that their iterative extensions are optimization processes with preconditioning. We develop accelerated versions of IHS and IDRP based on this insight together with conjugate gr...

2005
Norbert Röhrl

Abstract. We present a variational algorithm for solving the classical inverse Sturm–Liouville problem in one dimension when two spectra are given. All critical points of the least-squares functional are global minima, which justifies minimization by a (conjugate) gradient descent algorithm. Numerical examples show that the resulting algorithm works quite reliably without tuning for particul...

Journal: Journal of Biomedical Optics, 2010
Zijian Guo Changhui Li Liang Song Lihong V Wang

The data acquisition speed in photoacoustic computed tomography (PACT) is limited by the laser repetition rate and the number of parallel ultrasound detecting channels. Reconstructing an image with fewer measurements can effectively accelerate the data acquisition and reduce the system cost. We adapt compressed sensing (CS) for the reconstruction in PACT. CS-based PACT is implemented as a nonli...

Journal: Mathematics and Statistics, 2021

The Conjugate Gradient (CG) method is a prominent iterative technique that is useful for the optimization of both linear and non-linear systems due to its simplicity, low memory requirement, low computational cost, and global convergence properties. However, some classical CG methods have drawbacks, including weak convergence and poor numerical performance in terms of the number of iterations and CPU...
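For reference, the classical linear CG iteration that the nonlinear variants above build on can be sketched as follows; the example matrix, right-hand side, and tolerance are illustrative assumptions, not from any of the listed papers:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Classical linear CG for A x = b with A symmetric positive definite."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    r = b - A @ x            # residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # new A-conjugate direction
        rs_old = rs_new
    return x

# Small SPD example system (illustrative)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n steps (here two), which is why the default iteration cap is the problem dimension.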

Journal: Math. Program., 1974
Claude Lemaréchal

1. This note summarizes a paper [4] to appear in full elsewhere. It presents an algorithm for the minimization of a general (not necessarily differentiable) convex function. Its central idea is the construction of descent directions as projections of the origin onto the convex hull of previously calculated subgradients as long as satisfactory progress can be made. Using projection to obtain a d...
