Search results for: newton method

Number of results: 1641305

Journal: CoRR, 2017
Albert S. Berahas, Raghu Bollapragada, Jorge Nocedal

The concepts of sketching and subsampling have recently received much attention from the optimization and statistics communities. In this paper, we study NewtonSketch and Subsampled Newton (SSN) methods for the finite-sum optimization problem. We consider practical versions of the two methods in which the Newton equations are solved approximately using the conjugate gradient (CG) method or a stoc...
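
A minimal sketch of one Subsampled Newton (SSN) step of the kind described above, assuming a toy finite-sum least-squares objective and an illustrative subsample size; the function names and sampling scheme are placeholders, not the paper's implementation:

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    # Toy finite-sum objective: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
    rng = np.random.default_rng(0)
    n, d = 1000, 20
    A = rng.standard_normal((n, d))
    b = rng.standard_normal(n)

    def grad_full(x):
        # Exact gradient, using all n terms of the sum.
        return A.T @ (A @ x - b) / n

    def hess_vec_subsampled(x, v, sample_size=100):
        # Hessian-vector product estimated from a random subsample of terms.
        idx = rng.choice(n, size=sample_size, replace=False)
        As = A[idx]
        return As.T @ (As @ v) / sample_size

    def ssn_step(x, cg_maxiter=20):
        # One SSN step: solve the subsampled Newton system H_S d = -g
        # inexactly with the conjugate gradient method.
        g = grad_full(x)
        H = LinearOperator((d, d), matvec=lambda v: hess_vec_subsampled(x, v))
        step, _ = cg(H, -g, maxiter=cg_maxiter)
        return x + step

    x = np.zeros(d)
    for _ in range(10):
        x = ssn_step(x)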

2004
A. K. Alekseev, I. M. Navon

We compare the performance of several robust large-scale minimization algorithms applied to the minimization of the cost functional in the solution of ill-posed inverse problems related to parameter estimation for the parabolized Navier-Stokes equations. The methods compared consist of the conjugate gradient method (CG), Quasi-Newton (BFGS), the limited memory Quasi-Newton (L-BFGS) [1],...
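
As a sketch of how such a solver comparison can be set up, the snippet below runs CG, BFGS, and L-BFGS on a standard toy cost functional via SciPy; the Rosenbrock function merely stands in for the data-assimilation functional, which is not reproduced in the abstract:

    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.zeros(10)  # common starting guess for the toy problem
    for method in ("CG", "BFGS", "L-BFGS-B"):
        # Each solver minimizes the same cost functional from the same start;
        # comparing final values and iteration counts mirrors the study above.
        res = minimize(rosen, x0, jac=rosen_der, method=method)
        print(f"{method:10s} f* = {res.fun:.3e}  iterations = {res.nit}")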

Journal: Neural Computation, 2009
Pierre-Antoine Absil, Mariya Ishteva, Lieven De Lathauwer, Sabine Van Huffel

Newton's method for solving the matrix equation F(X) ≡ AX − XX^T AX = 0 runs up against the fact that its zeros are not isolated. This is due to a symmetry of F by the action of the orthogonal group. We show how differential-geometric techniques can be exploited to remove this symmetry and obtain a "geometric" Newton algorithm that finds the zeros of F. The geometric Newton method doe...
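
The symmetry mentioned in the abstract can be made explicit in one line: for any orthogonal matrix Q,

\[
F(XQ) = A X Q - X Q\,(XQ)^{T} A\,(XQ) = A X Q - X\,(Q Q^{T})\,X^{T} A X\, Q = F(X)\, Q ,
\]

so every zero X of F comes with a whole family of zeros {XQ : Q orthogonal}, which is why the zeros are not isolated.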

Journal: SIAM Journal on Optimization, 1997
Defeng Sun, Jiye Han

The paper presents concrete realizations of quasi-Newton methods for solving several standard problems including complementarity problems, special variational inequality problems, and the Karush–Kuhn–Tucker (KKT) system of nonlinear programming. A new approximation idea is introduced in this paper. The Q-superlinear convergence of the Newton method and the quasi-Newton method is established und...
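
For background (this is the classical scheme, not the new approximation the abstract refers to), a quasi-Newton iteration for a nonlinear system F(x) = 0 with Broyden's rank-one update reads

\[
x_{k+1} = x_k - B_k^{-1} F(x_k), \qquad
B_{k+1} = B_k + \frac{(y_k - B_k s_k)\, s_k^{T}}{s_k^{T} s_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = F(x_{k+1}) - F(x_k),
\]

where B_k approximates the Jacobian of F without ever computing it exactly.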

2005
S. Gratton, N. K. Nichols

The Gauss-Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well-suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an...
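
Written out, the sequence of linear least squares approximations referred to here is the standard Gauss-Newton iteration for min_x (1/2)||r(x)||^2:

\[
s_k = \arg\min_{s}\ \tfrac{1}{2}\,\lVert J(x_k)\, s + r(x_k) \rVert_2^2,
\qquad x_{k+1} = x_k + s_k,
\]

where r is the residual vector and J its Jacobian; each iterate solves a linear least squares problem obtained by linearizing r about the current point.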

Journal: SIAM Journal on Optimization, 1994
Stanley C. Eisenstat, Homer F. Walker

Inexact Newton methods for finding a zero of F : R^n → R^n are variations of Newton's method in which each step only approximately satisfies the linear Newton equation but still reduces the norm of the local linear model of F. Here, inexact Newton methods are formulated that incorporate features designed to improve convergence from arbitrary starting points. For each method, a basic global convergence ...
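
The defining requirement of an inexact Newton step, which the globalized methods described here build on, is that the step s_k satisfies the Newton equation only up to a relative tolerance:

\[
\lVert F(x_k) + F'(x_k)\, s_k \rVert \le \eta_k\, \lVert F(x_k) \rVert,
\qquad \eta_k \in [0, 1),
\]

where the forcing term \eta_k controls how accurately the linear model is reduced at each iteration.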

Journal: J. Comput. Physics, 2009
Khosro Shahbazi, Dimitri J. Mavriplis, Nicholas K. Burgess

Multigrid algorithms are developed for systems arising from high-order discontinuous Galerkin discretizations of the compressible Navier-Stokes equations on unstructured meshes. The algorithms are based on coupling both p- and h-multigrid (ph-multigrid) methods which are used in non-linear or linear forms, and either directly as solvers or as preconditioners to a Newton-Krylov method. The perform...

2008
Nenad Ujević, Nena Jović, Lucija Mijić

A Newton-like method for convex functions is derived. It is shown that this method can be better than the Newton method. Especially good results can be obtained if we combine these two methods. Illustrative numerical examples are given. Mathematics Subject Classification: 65H05

2007
Chih-Jen Lin, Ruby C. Weng, Alexander Smola

Large-scale logistic regression arises in many applications such as document classification and natural language processing. In this paper, we apply a trust region Newton method to maximize the log-likelihood of the logistic regression model. The proposed method uses only approximate Newton steps in the beginning, but achieves fast convergence in the end. Experiments show that it is faster than...
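
A minimal sketch of the pieces such a Newton-type solver needs for regularized logistic regression: the objective, its gradient, and a Hessian-vector product that avoids forming the Hessian. This is an illustration under standard assumptions (labels in {-1, +1}, L2 regularization), not the paper's implementation:

    import numpy as np

    def logreg_parts(w, X, y, C=1.0):
        # f(w) = 0.5 * w^T w + C * sum_i log(1 + exp(-y_i * x_i^T w)),
        # with labels y_i in {-1, +1}. Not numerically hardened for large |z|.
        z = y * (X @ w)
        sigma = 1.0 / (1.0 + np.exp(-z))
        f = 0.5 * (w @ w) + C * np.sum(np.log1p(np.exp(-z)))
        grad = w + C * X.T @ ((sigma - 1.0) * y)
        D = sigma * (1.0 - sigma)  # per-sample Hessian weights

        def hess_vec(v):
            # Hessian-vector product H v = v + C * X^T D X v, without forming H.
            return v + C * X.T @ (D * (X @ v))

        return f, grad, hess_vec

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    y = np.where(rng.standard_normal(200) > 0, 1.0, -1.0)
    f, grad, hess_vec = logreg_parts(np.zeros(5), X, y)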

2008
Nuchun Hu, Weiping Shen, Chong Li

The famous Newton–Kantorovich hypothesis has been used for a long time as a sufficient condition for the convergence of Newton's method to a solution of an equation. Here we present a "Kantorovich type" convergence analysis for the Gauss–Newton method which improves the result in [W.M. Häußler, A Kantorovich-type convergence analysis for the Gauss–Newton-method, Numer. Math. 48 (1986) 119–125...
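
For reference, the classical Newton–Kantorovich hypothesis (stated here in its affine-invariant form for Newton's method itself; the paper analyzes a refinement of this type of condition for the Gauss–Newton method) requires

\[
\lVert F'(x_0)^{-1} F(x_0) \rVert \le \eta, \qquad
\lVert F'(x_0)^{-1}\bigl(F'(x) - F'(y)\bigr) \rVert \le L\, \lVert x - y \rVert, \qquad
h = L\,\eta \le \tfrac{1}{2},
\]

for all x, y in a suitable ball around the starting point x_0; under these conditions Newton's method converges to a solution of F(x) = 0.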

[Chart: number of search results per year, omitted]