Search results for: conjugate gradient descent

Number of results: 174,860

Journal: Optimization Methods and Software, 2012
Neculai Andrei

In this paper we suggest a new conjugate gradient algorithm for which, for all k ≥ 0, both the descent and the conjugacy conditions are guaranteed. The search direction is selected as a linear combination of −g_{k+1} and s_k, where g_{k+1} = ∇f(x_{k+1}), and the coefficients in this linear combination are selected in such a way that both the descent and the conjugacy condition are satisfied at ...
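The abstract above concerns choosing the CG direction so that descent is preserved. A generic nonlinear conjugate gradient iteration (not the paper's specific coefficient choice) can be sketched as follows; the Polak-Ribière+ rule and the restart safeguard are illustrative assumptions, not the method described in the abstract:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimize f by nonlinear CG with a backtracking (Armijo) line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search along d (d is a descent direction).
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient, clipped at zero so the direction
        # falls back to steepest descent when beta would be negative.
        beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))
        d = -g_new + beta * d
        # Safeguard: restart if d is no longer a descent direction.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = Σ(x_i − 3)² from the origin converges to the vector of threes in a couple of iterations.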

Global Krylov subspace methods are the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. This method has inner and outer iterations; it employs the generalized conjugate gradient method as the inner iteration to approximate each outer iterate, while each...

2009
Mohammad Subhi Al-Batah Nor Ashidi Mat Isa Kamal Zuhairi Zamli Zamani Md Sani Khairun Azizi Azizli

Occupying more than 70% of the concrete's volume, aggregates play a vital role as the raw feed for construction materials, particularly in the production of concrete and concrete products. Often, characteristics such as the shape, size and surface texture of aggregates significantly affect the quality of the construction materials produced. This article discusses a novel method for automatic cl...

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...

2007
Yao Xie

In this report we solved a regularized maximum likelihood (ML) image reconstruction problem (with Poisson noise). We considered the gradient descent method, exact Newton's method, the pre-conditioned conjugate gradient (CG) method, and an approximate Newton's method, with which we can solve a million-variable problem. However, the current version of the approximate Newton's method converges slowly. We co...
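A pre-conditioned CG iteration of the kind mentioned in the abstract can be sketched for a generic symmetric positive definite system; the Jacobi (diagonal) preconditioner below is an illustrative choice, not necessarily the one used in the report:

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b (A symmetric positive definite) by Jacobi-preconditioned CG."""
    M_inv = 1.0 / np.diag(A)        # Jacobi preconditioner: M = diag(A)
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                   # residual
    z = M_inv * r                   # preconditioned residual
    p = z.copy()                    # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = rz / (p @ Ap)       # exact step along p
        x += alpha * p
        r -= alpha * Ap
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p   # update conjugate direction
        rz = rz_new
    return x
```

The preconditioner matters mainly for conditioning: for a well-scaled 2×2 system the result matches a direct solve, but for large ill-conditioned problems it can cut the iteration count substantially.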

Journal: Journal of AI and Data Mining, 2015
F. Alibakhshi M. Teshnehlab M. Alibakhshi M. Mansouri

The stability of the learning rate in neural network identifiers and controllers is one of the challenging issues which attracts great interest from researchers of neural networks. This paper suggests an adaptive gradient descent algorithm with stable learning laws for the modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for parameters of ...

2015
Ivie Stein

The purpose of this paper is to illustrate the application of numerical optimization methods for nonquadratic functionals defined on non-Hilbert Sobolev spaces. These methods use a gradient defined on a norm-reflexive and hence strictly convex normed linear space. This gradient is defined by Michael Golomb and Richard A. Tapia in [M. Golomb, R.A. Tapia, The metric gradient in normed linear spaces...

Journal: :IEEE Trans. Signal Processing 1999
D. O. Walsh Michael W. Marcellin

A new stopping rule is proposed for linear, iterative signal restoration using the gradient descent and conjugate gradient algorithms. The stopping rule attempts to minimize MSE under the assumption that the signal arises from a white noise process. This assumption is appropriate for many coherent imaging applications. The stopping rule is trivial to compute, and for fixed relaxation parameters, ...

2005
WILLIAM W. HAGER HONGCHAO ZHANG

This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.
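Nonlinear CG variants of the kind surveyed in such reviews differ chiefly in the choice of the update coefficient β_k. A few classical textbook formulas (Fletcher-Reeves, Polak-Ribière, Hestenes-Stiefel) can be sketched as follows; the function names are illustrative:

```python
import numpy as np

def beta_fr(g_new, g_old, d_old):
    """Fletcher-Reeves: ratio of squared gradient norms."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_pr(g_new, g_old, d_old):
    """Polak-Ribiere: uses the gradient change y = g_new - g_old."""
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel: normalizes by the previous direction against y."""
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)
```

When the successive gradients are orthogonal (as in exact line searches on a quadratic), these formulas coincide; their global convergence behavior on general nonlinear problems differs, which is a central theme of the review.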

2010
László Gál László T. Kóczy Rita Lovassy

The Three Step Bacterial Memetic Algorithm is proposed. This new version of the Bacterial Memetic Algorithm with Modified Operator Execution Order (BMAM) is applied to a practical problem: it is proposed as the training algorithm for Fuzzy Neural Networks (FNN). This paper strives to improve the function approximation capability of FNNs by applying a combination of evolutionary...

[Chart: number of search results per year]
