Search results for: newton method

Number of results: 1641305

2010
Mark Schmidt, Dongmin Kim

We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while conceptually pleasing, has a high computation cost per iteration. Thus, we discuss two variants that are m...
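For orientation, the canonical projected (quasi-)Newton iteration described here can be sketched in a few lines; the full Hessian solve below is exactly the per-iteration cost the cheaper variants try to avoid, and the backtracking rule is a generic placeholder rather than the authors' scheme (box constraints and NumPy are assumed).

```python
import numpy as np

def projected_newton(f, grad, hess, x0, lower, upper, max_iter=100, tol=1e-8):
    """Canonical projected Newton sketch for min f(x) s.t. lower <= x <= upper.

    Builds and factors the full Hessian every iteration -- the expensive step
    that cheaper projected Newton-type variants replace.
    """
    project = lambda z: np.clip(z, lower, upper)
    x = project(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        g = grad(x)
        # Stop when a projected gradient step no longer moves x.
        if np.linalg.norm(project(x - g) - x) < tol:
            break
        H = hess(x) + 1e-10 * np.eye(x.size)   # mild regularization
        d = np.linalg.solve(H, -g)             # Newton direction
        # Simple backtracking along the projection arc (placeholder line search).
        t, fx, x_new = 1.0, f(x), project(x + d)
        while f(x_new) >= fx and t > 1e-12:
            t *= 0.5
            x_new = project(x + t * d)
        x = x_new
    return x

# Example: a 2-variable quadratic over the box [0, 1]^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(projected_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b,
                       lambda x: A,
                       np.zeros(2), 0.0, 1.0))
```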

Journal: J. Computational Applied Mathematics, 2015
Rob Haelterman, Ben Lauwens, Filip Van Utterbeeck, Helena Bruyninckx, Jan A. Vierendeels

We show how the quasi-Newton least squares method (QN-LS) relates to Krylov subspace methods in general and to GMRes in particular.

2014
Nitin Jain, Kushal D. Murthy

1 Student, Department of Electronics & Communication Engineering; 2 Professor, Department of Mathematics; 1,2 R. V. College of Engineering, Mysore Road, Bangalore, Karnataka-560 059, INDIA.
Abstract: New iterative algorithms for finding the nth root of a positive number m, to any degree of accuracy, are ...
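The baseline such algorithms refine is the classical Newton iteration for the nth root: to solve x^n = m one iterates x_{k+1} = ((n-1)x_k + m/x_k^(n-1))/n. A minimal sketch of that baseline (not the paper's new algorithms):

```python
def nth_root(m, n, tol=1e-12, max_iter=100):
    """Newton's method for the n-th root of a positive number m.

    Applies x_{k+1} = ((n - 1) * x_k + m / x_k**(n - 1)) / n, i.e. Newton's
    iteration on f(x) = x**n - m.
    """
    if m <= 0 or n < 1:
        raise ValueError("m must be positive and n a positive integer")
    x = max(m, 1.0)                       # crude but safe starting guess
    for _ in range(max_iter):
        x_next = ((n - 1) * x + m / x ** (n - 1)) / n
        if abs(x_next - x) <= tol * max(1.0, abs(x_next)):
            return x_next
        x = x_next
    return x

print(nth_root(2.0, 2))    # ~1.4142135623730951
print(nth_root(27.0, 3))   # ~3.0
```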

2009
A. Potschka

We investigate an iterative method for the solution of time-periodic parabolic PDE constrained optimization problems. It is an inexact Sequential Quadratic Programming (iSQP) method based on the Newton-Picard approach. We present and analyze a linear quadratic model problem and prove optimal mesh-independent convergence rates. Additionally, we propose a two-grid variant of the Newton-Picard met...

Journal: Math. Comput., 1997
Q. Ni, Ya-Xiang Yuan

In this paper we propose a subspace limited memory quasi-Newton method for solving large-scale optimization problems with simple bounds on the variables. The limited memory quasi-Newton method is used to update the variables with indices outside of the active set, while the projected gradient method is used to update the active variables. The search direction consists of three parts: a subspace quasi-Ne...
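A bare skeleton of the split described here, assuming box constraints: active (bound) variables take a projected-gradient direction while free variables take a direction from a caller-supplied quasi-Newton model (a stand-in for the limited-memory update); the subspace machinery and line search of the actual method are omitted.

```python
import numpy as np

def active_set_step(x, g, qn_free_solve, lower, upper, tol=1e-10):
    """One sketch iteration mixing projected gradient (active variables)
    with a quasi-Newton direction (free variables).

    qn_free_solve(free_mask, g_free) is an assumed callback returning an
    approximate solution of B d = -g restricted to the free variables,
    e.g. from a limited-memory quasi-Newton model of the Hessian.
    """
    # Variables sitting on a bound with the gradient pushing outward stay active.
    at_lower = (x - lower <= tol) & (g > 0)
    at_upper = (upper - x <= tol) & (g < 0)
    active = at_lower | at_upper
    free = ~active

    d = np.zeros_like(x)
    d[active] = -g[active]                       # projected-gradient part
    if free.any():
        d[free] = qn_free_solve(free, g[free])   # quasi-Newton part on the free subspace

    step = 1.0                                   # a real method would line-search here
    return np.clip(x + step * d, lower, upper)
```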

Journal: Neural Computation, 2015
Chien-Chih Wang, Chun-Heng Huang, Chih-Jen Lin

Newton methods can be applied in many supervised learning approaches. However, for large-scale data, the use of the whole Hessian matrix can be time-consuming. Recently, subsampled Newton methods have been proposed to reduce the computational time by using only a subset of data for calculating an approximation of the Hessian matrix. Unfortunately, we find that in some situations, the running sp...
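The basic subsampled Newton idea referred to here — full gradient, Hessian estimated from a random subset of the data — fits in a short sketch; the sampling rule, inner solver, and safeguards of practical methods are left out, and the per-example callbacks grad_i and hess_i are assumptions for illustration.

```python
import numpy as np

def subsampled_newton_step(x, n, grad_i, hess_i, sample_size, rng, damping=1e-6):
    """One subsampled Newton step for min_x (1/n) * sum_i f_i(x).

    The full gradient is used, but the Hessian is averaged only over a
    random subset S of the n training examples.
    """
    g = np.mean([grad_i(x, i) for i in range(n)], axis=0)        # full gradient
    S = rng.choice(n, size=min(sample_size, n), replace=False)   # Hessian subsample
    H = np.mean([hess_i(x, i) for i in S], axis=0)
    H = H + damping * np.eye(x.size)                             # keep the solve well-posed
    return x + np.linalg.solve(H, -g)   # practical methods add a line search or CG solve

# Usage sketch (grad_i and hess_i supplied by the model):
# rng = np.random.default_rng(0)
# x = subsampled_newton_step(x, n, grad_i, hess_i, sample_size=256, rng=rng)
```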

2005
Zhao Li, Richard Shi

In this paper, we present SILCA-Newton-Krylov, a new method for accurate, efficient and robust time-domain VLSI circuit simulation. Like SPICE, SILCA-Newton-Krylov uses time-difference and Newton-Raphson methods to solve the nonlinear differential equations arising from circuit simulation. Unlike SPICE, however, SILCA-Newton-Krylov explores a preconditioned flexible generalized minimal residual (FGMRES)...
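Setting the circuit-specific preconditioning aside, the underlying Newton-Krylov pattern — solve each Newton system with a Krylov method using only Jacobian-vector products — can be illustrated generically with SciPy's GMRES; this is a sketch of that pattern, not the SILCA-Newton-Krylov algorithm itself.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def newton_krylov(F, x0, tol=1e-8, max_iter=50, fd_eps=1e-7):
    """Matrix-free Newton-Krylov: each system J(x) d = -F(x) is solved by
    GMRES, with J(x) v approximated by a directional finite difference."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        def jv(v, x=x, Fx=Fx):
            h = fd_eps * max(1.0, np.linalg.norm(x)) / max(np.linalg.norm(v), 1e-30)
            return (F(x + h * v) - Fx) / h        # J(x) v ~ (F(x + h v) - F(x)) / h
        J = LinearOperator((x.size, x.size), matvec=jv, dtype=float)
        d, _ = gmres(J, -Fx)                      # a preconditioner would go here
        x = x + d
    return x

# Example: x0^2 + x1^2 = 4 and x0 = x1, root near (sqrt(2), sqrt(2)).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])
print(newton_krylov(F, np.array([1.0, 2.0])))
```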

2014
H. A. Aisha, W. L. Fatima, M. Y. Waziri, J. E. Dennis

It is well known that when the Jacobian of a nonlinear system is nonsingular in a neighborhood of the solution, the convergence of Newton's method is guaranteed and the rate is quadratic. When this condition is violated, i.e. when the Jacobian is singular, convergence may be unsatisfactory and may even be lost. In this paper we present a modification of Newton's method via extra updating for non...
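For context, the plain Newton iteration for a nonlinear system is sketched below; its quadratic convergence relies on the nonsingular-Jacobian assumption discussed above, and the paper's extra-updating modification is not reproduced here.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Plain Newton's method for F(x) = 0 with an analytic Jacobian J.

    Converges quadratically when J is nonsingular near the solution; with a
    singular Jacobian the linear solve J(x) d = -F(x) becomes ill-posed and
    convergence can degrade or fail, which is the case the paper targets.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k
        d = np.linalg.solve(J(x), -Fx)
        x = x + d
    return x, max_iter

# Example with a Jacobian that is nonsingular at the root (1, 1):
F = lambda x: np.array([x[0]**2 - x[1], x[0] + x[1] - 2.0])
J = lambda x: np.array([[2.0 * x[0], -1.0], [1.0, 1.0]])
print(newton_system(F, J, np.array([2.0, 0.5])))
```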

1996
Peter Deuflhard, Martin Weiser

The finite element setting for nonlinear elliptic PDEs directly leads to the minimization of convex functionals. Uniform ellipticity of the underlying PDE shows up as strict convexity of the arising nonlinear functional. The paper analyzes computational variants of Newton’s method for convex optimization in an affine conjugate setting, which reflects the appropriate affine transformation behavi...
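For comparison, an ordinary damped Newton iteration for a strictly convex function is shown below; the affine conjugate variants analyzed in the paper choose the damping from problem-adapted norms rather than the generic Armijo backtracking used in this sketch.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-10, max_iter=100):
    """Damped Newton's method for minimizing a strictly convex function:
    Newton direction plus simple Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)         # Newton direction
        t, fx, slope = 1.0, f(x), g.dot(d)
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                             # damp until sufficient decrease
        x = x + t * d
    return x

# Example: a smooth, strictly convex function of two variables.
f = lambda x: np.log(np.exp(x[0]) + np.exp(-x[0])) + 0.5 * x[1]**2
grad = lambda x: np.array([np.tanh(x[0]), x[1]])
hess = lambda x: np.diag([1.0 / np.cosh(x[0])**2, 1.0])
print(damped_newton(f, grad, hess, np.array([3.0, -2.0])))   # -> approximately (0, 0)
```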

[Chart: number of search results per year]