Search results for: newton method

Number of results: 1641305

Journal: SIAM Journal on Optimization, 2002
Asen L. Dontchev, Houduo Qi, Liqun Qi, Hongxia Yin

Abstract. In 1986, Irvine, Marin, and Smith proposed a Newton-type method for shape-preserving interpolation and, based on numerical experience, conjectured its quadratic convergence. In this paper, we prove local quadratic convergence of their method by viewing it as a semismooth Newton method. We also present a modification of the method which has global quadratic convergence. Numerical exampl...

1997
H. Xu, X. W. Chang

We develop general approximate Newton methods for solving Lipschitz continuous equations by replacing the iteration matrix with a consistently approximated Jacobian, thereby reducing the computation in the generalized Newton method. Locally superlinear convergence results are presented under moderate assumptions. To construct a consistently approximated Jacobian, we introduce two main methods: ...
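One simple way to obtain an approximated Jacobian of the kind this abstract describes is forward finite differencing of F, column by column. The sketch below is a generic illustration of that idea, not the authors' specific construction; the function names and the test equation are my own.

```python
import numpy as np

def fd_jacobian(F, x, eps=1e-7):
    """Approximate the Jacobian of F at x column-by-column
    with forward differences: J[:, j] ~ (F(x + eps*e_j) - F(x)) / eps."""
    n = x.size
    fx = F(x)
    J = np.empty((fx.size, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (F(x + e) - fx) / eps
    return J

def approx_newton(F, x0, tol=1e-10, max_iter=50):
    """Newton iteration with the exact Jacobian replaced by a
    finite-difference approximation (one flavor of approximate Newton)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = np.linalg.solve(fd_jacobian(F, x), -F(x))
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# illustrative example: solve x^2 - 2 = 0 starting from x = 1.5
root = approx_newton(lambda x: np.array([x[0]**2 - 2.0]), [1.5])
```

As long as the differencing step shrinks appropriately (the "consistency" the abstract refers to), such iterations retain superlinear convergence near a root.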

2005
LI Chong, WANG Jinhua

The Newton method and its variations are the most efficient methods known for solving systems of nonlinear equations when they are continuously differentiable. Besides its practical applications, the Newton method is also a powerful theoretical tool. One of the most famous results on the Newton method is Kantorovich’s theorem, which has the advantage that the Newton sequence converges...
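The basic iteration these abstracts build on can be stated in a few lines: at each step solve the linearized system J(x_k) d = -F(x_k) and update x_{k+1} = x_k + d. The sketch below is a minimal, generic implementation; the example system (intersecting the unit circle with the line x = y) is my own illustration.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Plain Newton iteration for a nonlinear system F(x) = 0.
    Quadratically convergent near a root where J is nonsingular."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d = np.linalg.solve(J(x), -F(x))   # Newton step: J d = -F
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# example: unit circle intersected with the line x0 = x1
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
root = newton(F, J, [1.0, 0.5])
```

Kantorovich-type theorems give conditions on F, J, and the starting point under which this sequence is guaranteed to converge.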

Journal: SIAM J. Numerical Analysis, 2008
Roger P. Pawlowski, Joseph P. Simonis, Homer F. Walker, John N. Shadid

The dogleg method is a classical trust-region technique for globalizing Newton’s method. While it is widely used in optimization, including large-scale optimization via truncated-Newton approaches, its implementation in general inexact Newton methods for systems of nonlinear equations can be problematic. In this paper, we first outline a very general dogleg method suitable for the general inexac...
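For readers unfamiliar with the technique, the classical dogleg step can be sketched in a few lines: take the Newton step if it fits in the trust region, otherwise walk along the piecewise-linear path from the Cauchy point toward the Newton step until the trust-region boundary is hit. This is a generic textbook sketch (assuming a symmetric positive definite model Hessian B), not the generalized method of this paper.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Dogleg step for the trust-region subproblem
       min_p  g.p + 0.5 p.B.p   subject to  ||p|| <= delta,
    with B symmetric positive definite."""
    pN = np.linalg.solve(B, -g)                 # full Newton step
    if np.linalg.norm(pN) <= delta:
        return pN
    pU = -(g @ g) / (g @ B @ g) * g             # unconstrained Cauchy point
    if np.linalg.norm(pU) >= delta:
        return -delta * g / np.linalg.norm(g)   # scaled steepest descent
    # walk along pU + t (pN - pU) until ||p|| = delta (quadratic in t)
    d = pN - pU
    a, b, c = d @ d, 2 * pU @ d, pU @ pU - delta**2
    t = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return pU + t * d

# example: the constrained step lands exactly on the trust-region boundary
step = dogleg_step(np.array([1.0, 1.0]), np.diag([1.0, 10.0]), 0.5)
```

In an inexact Newton setting the exact Newton step pN is replaced by an inexact one, which is precisely where the complications discussed in the abstract arise.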

1997
Q. Ni, Y. Yuan

In this paper we propose a subspace limited memory quasi-Newton method for solving large-scale optimization with simple bounds on the variables. The limited memory quasi-Newton method is used to update the variables with indices outside of the active set, while the projected gradient method is used to update the active variables. The search direction consists of three parts: a subspace quasi-Ne...
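The projected-gradient update mentioned in the abstract has a very compact form for simple bounds: step along the negative gradient, then clip back into the box. The sketch below is a minimal generic illustration of that single ingredient (not the authors' full subspace method); the quadratic objective and box are my own example.

```python
import numpy as np

def projected_gradient_step(x, grad, lo, hi, alpha):
    """One projected-gradient step for min f(x) s.t. lo <= x <= hi:
    move along -grad with step length alpha, then project onto the box."""
    return np.clip(x - alpha * grad, lo, hi)

# example: minimize f(x) = 0.5 ||x - c||^2 over the box [0, 1]^2
c = np.array([2.0, -1.0])          # unconstrained minimizer, outside the box
x = np.array([0.5, 0.5])
for _ in range(100):
    x = projected_gradient_step(x, x - c, 0.0, 1.0, 0.5)
# x converges to the projection of c onto the box
```

In the paper's scheme this cheap update handles the active (bound-hitting) variables, while the limited-memory quasi-Newton direction handles the free ones.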

Journal: Applied Mathematics and Computation, 2015
Suzhen Liu, Yongzhong Song, Xiaojian Zhou

Recently, a new treatment based on Taylor’s expansion to give the estimate of the convergence radius of an iterative method for multiple roots has been presented. It has been successfully applied to enlarge the convergence radius of the modified Newton’s method and Osada’s method for multiple roots. This paper re-investigates the convergence radius of Halley’s method under the condition that the de...
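For reference, the classical (simple-root) form of Halley's method iterates x_{k+1} = x_k - 2 f f' / (2 f'^2 - f f''), which is cubically convergent near a simple root. A minimal sketch, with an example equation of my own choosing (the multiple-root variants studied in the paper modify this formula):

```python
def halley(f, df, d2f, x0, tol=1e-12, max_iter=20):
    """Halley's method: x <- x - 2 f f' / (2 f'^2 - f f'').
    Cubically convergent near a simple root of f."""
    x = x0
    for _ in range(max_iter):
        fx, dfx, d2fx = f(x), df(x), d2f(x)
        step = 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
        x -= step
        if abs(step) < tol:
            break
    return x

# example: solve x^2 - 2 = 0 from x0 = 1.5
root = halley(lambda x: x * x - 2.0, lambda x: 2.0 * x, lambda x: 2.0, 1.5)
```

Convergence-radius results of the kind the abstract describes quantify how far x0 may be from the root while still guaranteeing convergence.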

Journal: SIAM J. Scientific Computing, 2001
Michael Pernice, Michael D. Tocci

Globalized inexact Newton methods are well suited for solving large-scale systems of nonlinear equations. When combined with a Krylov iterative method, an explicit Jacobian is never needed, and the resulting matrix-free Newton–Krylov method greatly simplifies application of the method to complex problems. Despite asymptotically superlinear rates of convergence, the overall efficiency of a Newto...
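The reason no explicit Jacobian is needed is that Krylov solvers such as GMRES only ever require Jacobian-vector products J(x)v, and these can be approximated by a single extra evaluation of F via a forward difference. The sketch below shows just that core trick, checked against an analytic Jacobian on a small example of my own; it is not the full Newton-Krylov solver of the paper.

```python
import numpy as np

def fd_jacvec(F, x, v, eps=1e-7):
    """Matrix-free Jacobian-vector product:
    J(x) v ~ (F(x + eps*v) - F(x)) / eps, costing one extra F evaluation."""
    return (F(x + eps * v) - F(x)) / eps

# sanity check against the analytic Jacobian of a small system
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
x = np.array([0.3, 0.8])
v = np.array([1.0, -2.0])
approx = fd_jacvec(F, x, v)
exact = J(x) @ v
```

Feeding this product into a Krylov solver in place of an assembled matrix yields the matrix-free Newton-Krylov iteration the abstract describes.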

Journal: East Asian Journal on Applied Mathematics, 2018

Journal: IEEE Transactions on Automatic Control, 2019

[Chart: number of search results per year]
