Search results for: conjugate gradient

Number of results: 163423

2011
Jonas Koko Taoufik Sassi

The Stokes problem plays an important role in computational fluid dynamics since it is encountered in the time discretization of (incompressible) Navier-Stokes equations by operator-splitting methods [2, 3]. Space discretization of the Stokes problem leads to large scale ill-conditioned systems. The Uzawa (preconditioned) conjugate gradient method is an efficient method for solving the Stokes p...
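The Uzawa approach reduces the discretized Stokes problem to a symmetric positive-definite system that is then solved by (preconditioned) conjugate gradients. As a simpler illustration of the core iteration, here is a minimal sketch of the plain conjugate gradient method for an SPD system `A x = b` (a generic textbook version, not the Uzawa algorithm of the paper):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # conjugate direction update
        rs_old = rs_new
    return x

# usage on a small SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n iterations for an n-by-n system; for the large ill-conditioned systems mentioned above, a preconditioner is what keeps the iteration count practical.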

2007
Yu-Hong Dai

Conjugate gradient methods are a class of important methods for unconstrained optimization, especially when the dimension is large. This paper proposes a new conjugacy condition, which considers an inexact line search scheme but reduces to the old one if the line search is exact. Based on the new conjugacy condition, two nonlinear conjugate gradient methods are constructed. Convergence analysis...
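For context on what a nonlinear conjugate gradient method with an inexact line search looks like, here is a generic sketch using backtracking (Armijo) line search and the standard PR+ update with a descent safeguard; this illustrates the general family only, not the specific conjugacy condition or the two methods proposed in the paper:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic nonlinear CG: Armijo backtracking (an inexact line
    search) plus the PR+ beta formula, with a steepest-descent
    restart whenever the new direction fails to be a descent one."""
    x = x0.astype(float)
    g = grad(x)
    if np.linalg.norm(g) < tol:
        return x
    d = -g
    for _ in range(max_iter):
        # backtracking line search satisfying the Armijo condition
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ formula
        d = -g_new + beta * d
        if g_new @ d >= 0:        # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# minimize a convex quadratic: f(x) = x0^2 + 2*x1^2
f = lambda x: x[0] ** 2 + 2 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x_star = nonlinear_cg(f, grad, np.array([3.0, -2.0]))
```

With an exact line search the standard conjugacy condition holds automatically; the inexact search above is precisely the situation that motivates modified conjugacy conditions like the one proposed here.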

2016
Elena Akimova Dmitry Belousov

For solving systems of linear algebraic equations with block five-diagonal matrices arising in geoelectrics and diffusion problems, the parallel matrix square root method, the conjugate gradient method with a preconditioner, the conjugate gradient method with regularization, and a parallel matrix sweep algorithm are proposed, and some of them are implemented numerically on a multi-core Intel CPU. Investigation ...

2013
Kohei Arai

A sensitivity analysis of Sea Surface Temperature (SST) estimation with Thermal Infrared Radiometer (TIR) data is conducted through simulations. A Conjugate Gradient Method (CGM) based SST estimation method is also proposed. The SST estimation error of the proposed CGM-based method is compared to that of the conventional Split Window Method (SWM) under a variety of conditions, including atmospheric models. The resul...

Journal: Math. Comput. 2003
Yu-Hong Dai

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems. Recently, they have been much studied. This paper proposes a three-parameter family of hybrid conjugate gradient methods. Two important features of the family are that (i) it can avoid the propensity of small steps, namely, if a small step is generated away from the ...

2008
Huibo Ji Jonathan H. Manton John B. Moore

Self-concordant functions are a special class of convex functions in Euclidean space introduced by Nesterov. They are used in interior point methods, based on Newton iterations, where they play an important role in solving efficiently certain constrained optimization problems. The concept of self-concordant functions has been defined on Riemannian manifolds by Jiang et al. and a damped Newton m...

Journal: SIAM Journal on Optimization 1991
Kashmira M. Irani Manohar P. Kamat Calvin J. Ribbens Homer F. Walker Layne T. Watson

There are algorithms for finding zeros or fixed points of nonlinear systems of equations that are globally convergent for almost all starting points, i.e., with probability one. The essence of all such algorithms is the construction of an appropriate homotopy map and then tracking some smooth curve in the zero set of this homotopy map. HOMPACK is a mathematical software package implementing glo...

Journal: SIAM J. Numerical Analysis 2005
Andreas Rieder

In our papers [Inverse Problems, 15, 309-327, 1999] and [Numer. Math., 88, 347-365, 2001] we proposed the algorithm REGINN, an inexact Newton iteration for the stable solution of nonlinear ill-posed problems. REGINN consists of two components: the outer iteration, which is a Newton iteration stopped by the discrepancy principle, and an inner iteration, which computes the Newton correction by so...

2007
Mohammad Nayeem Teli

A multi-layer neural network with multiple hidden layers was trained as an autoencoder using steepest descent, scaled conjugate gradient and alopex algorithms. These algorithms were used in different combinations with steepest descent and alopex used as pretraining algorithms followed by training using scaled conjugate gradient. All the algorithms were also used to train the autoencoders withou...

2017
Mohammad Emtiyaz Khan Wu Lin

Variational inference is computationally challenging in models that contain both conjugate and non-conjugate terms. Methods specifically designed for conjugate models, even though computationally efficient, find it difficult to deal with non-conjugate terms. On the other hand, stochastic-gradient methods can handle the non-conjugate terms but they usually ignore the conjugate structure of the mo...
