Search results for: full newton step
Number of results: 566996
Abstract. The focus of this paper is interior-point methods for bound-constrained nonlinear optimization, where the systems of equations that arise are solved with Newton's method. There is a trade-off between solving Newton systems directly, which gives high-quality solutions, and solving many approximate systems, which is computationally less expensive but gives lower-quality solutions. We propose partial and full solutions to the systems. Speci...
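The trade-off described in this abstract can be illustrated with a toy linear-algebra sketch (not the paper's algorithm): a Newton system J·dx = −F is solved once directly and once only approximately with a few Krylov iterations. The matrix J, the residual F, and the iteration limit below are illustrative stand-ins.

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
J = A @ A.T + n * np.eye(n)        # symmetric positive-definite stand-in for a Newton matrix
F = rng.standard_normal(n)         # stand-in residual vector

# "Full" solution of the Newton system: expensive but high quality.
dx_direct = np.linalg.solve(J, -F)

# Approximate solution: cheaper but lower quality (only a few CG iterations).
dx_approx, _ = cg(J, -F, maxiter=5)

rel_gap = np.linalg.norm(dx_direct - dx_approx) / np.linalg.norm(dx_direct)
print("relative gap between direct and approximate steps:", rel_gap)
```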
Aerodynamic shape optimization (ASO) involves finding an optimal surface subject to a set of nonlinear partial differential equation (PDE) constraints. The conventional approaches use quasi-Newton methods operating in the reduced space, where the PDE constraints are eliminated at each design step by decoupling the flow solver from the optimizer. Conversely, the full-space Lagrange-Newton-Krylov-Schur (LNKS) appro...
Based on extensive computational evidence (hundreds of thousands of randomly generated problems), the second author conjectured that κ̄(ζ) = 1 (Conjecture 5.1 in [1]), which is a factor of √(2n) better than what has been proved in [1], and which would yield an O(√n)-iteration full-Newton-step infeasible interior-point algorithm. In this paper we present an example showing that κ̄(ζ) is of the order of...
The convex feasibility problem is, in general, the problem of finding a point in a convex set that contains a full-dimensional ball and is contained in a compact convex set. We assume that the outer set is described by second-order cone inequalities and propose an analytic center cutting plane technique to solve this problem. We discuss the primal and dual settings simultaneously. Two complexity result...
Abstract. We provide a semilocal convergence analysis for a cubically convergent two-step Newton method (2) recently introduced by H. Homeier [8], [9], and also studied by A. Özban [13]. In contrast to the above works, we examine the semilocal convergence of the method in a Banach space setting, instead of the local convergence in the real or complex number case. A comparison is given with a two-step Newton...
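For readers unfamiliar with this class of methods, the following is a hedged sketch of a common scalar form of Homeier's cubically convergent two-step Newton iteration (a half Newton step followed by a Newton-like correction); whether it coincides exactly with scheme (2) analysed in the paper is an assumption, and the Banach-space setting is not reflected here.

```python
# Two-step Newton-type iteration of the Homeier family (scalar sketch).
def two_step_newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        y = x - fx / (2.0 * df(x))      # half Newton step (predictor)
        x = x - fx / df(y)              # Newton-like step with derivative evaluated at y (corrector)
    return x

# Example: root of f(x) = x**3 - 2*x - 5 (the classical Newton example)
root = two_step_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=2.0)
print(root)  # approximately 2.0945514815423265
```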
Most of the current algorithms for solving distributed online optimization problems are based on first-order methods, which are simple to compute but converge slowly. Newton's method, with its fast convergence speed, needs to compute the Hessian matrix and its inverse, which is computationally complex. A Newton-step-based algorithm is proposed in this paper, which constructs a positive-definite matrix by using information from the objective functi...
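The abstract is truncated, so the following is only a generic illustration of the underlying idea, not the paper's distributed algorithm: a Newton-type step in which the possibly indefinite Hessian is replaced by a positive-definite surrogate, here a simple Levenberg-style shift H + λI; the function names and the choice of shift are assumptions.

```python
import numpy as np

def regularized_newton_step(grad, hess, lam=1e-3):
    """Return dx solving (H + lam*I) dx = -g, with H possibly indefinite."""
    n = grad.shape[0]
    H_pd = hess + lam * np.eye(n)      # positive-definite surrogate (assumes lam is large enough)
    return np.linalg.solve(H_pd, -grad)

# Toy example with an indefinite Hessian.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
dx = regularized_newton_step(g, H, lam=2.0)
print(dx, g @ dx)   # g @ dx < 0: the step is a descent direction
```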
Newton-Krylov methods, primarily using the Jacobian-Free Newton-Krylov (JFNK) approximation, are examined as an alternative to the traditional power iteration method for the calculation of the fundamental eigenmode in reactor analysis applications based on diffusion theory. One JFNK approach can be considered an acceleration technique for the standard power iteration as it is “wrapped around” t...
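A minimal sketch of the JFNK idea referred to above: the Jacobian is never formed, its action on a vector is approximated by a finite difference of the residual, and each Newton system is solved with GMRES. The toy residual, step sizes, and iteration counts are illustrative; no eigenvalue or reactor-specific machinery is shown.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u):
    # Toy nonlinear residual F(u) = 0; a stand-in for the problem's residual.
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**2 - 5.0])

def jfnk_solve(F, u0, newton_iters=20, eps=1e-7):
    u = u0.astype(float)
    for _ in range(newton_iters):
        Fu = F(u)
        if np.linalg.norm(Fu) < 1e-10:
            break
        # Matrix-free Jacobian-vector product: J(u) v ≈ (F(u + eps*v) - F(u)) / eps
        matvec = lambda v: (F(u + eps * v) - Fu) / eps
        J = LinearOperator((u.size, u.size), matvec=matvec)
        du, _ = gmres(J, -Fu)            # Krylov solve of the Newton system
        u = u + du                       # full Newton step (no globalization here)
    return u

print(jfnk_solve(residual, np.array([1.0, 1.0])))
```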
We propose a fast batch learning method for linear-chain Conditional Random Fields (CRFs) based on Newton-CG methods. Newton-CG methods are a variant of Newton's method for high-dimensional problems. They only require Hessian-vector products instead of the full Hessian matrix. To speed up Newton-CG methods for CRF learning, we derive a novel dynamic programming procedure for the Hessian-...
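The Newton-CG building block described here can be sketched generically: the Newton direction comes from conjugate gradients driven only by Hessian-vector products, so the full Hessian is never stored. The quadratic objective below is a stand-in; the paper's CRF-specific dynamic-programming Hessian-vector product is not reproduced.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(1)
d = 1000
A = rng.standard_normal((d, d)) / np.sqrt(d)

def gradient(w):
    return A.T @ (A @ w) + 0.1 * w     # gradient of 0.5*||A w||^2 + 0.05*||w||^2

def hvp(w, v):
    return A.T @ (A @ v) + 0.1 * v     # Hessian-vector product; the Hessian is never formed

w = rng.standard_normal(d)
H = LinearOperator((d, d), matvec=lambda v: hvp(w, v))
direction, _ = cg(H, -gradient(w), maxiter=50)   # truncated-CG Newton direction
w = w + direction                                # one full Newton-CG step
print(np.linalg.norm(gradient(w)))               # much smaller than the initial gradient norm
```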