Search results for: full newton step

Number of results: 566996

Journal: Mathematical Programming 2021

In this paper, a novel stochastic extra-step quasi-Newton method is developed to solve a class of nonsmooth nonconvex composite optimization problems. We assume that the gradient of the smooth part of the objective function can only be approximated by oracles. The proposed method combines general higher-order steps derived from an underlying proximal-type fixed-point equation with additional steps that guarantee convergence. Ba...
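
As a reference for the proximal-type fixed-point equation mentioned above, here is a minimal Python sketch, not the paper's algorithm: it assumes a composite objective f(x) + lam*||x||_1, uses a noisy gradient as a stand-in for the inexact oracle, and applies plain proximal-gradient fixed-point steps; the quasi-Newton and extra-step machinery of the paper is omitted.

    import numpy as np

    def soft_threshold(v, t):
        # proximal operator of t * ||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def prox_gradient_residual(x, grad_f, lam, step):
        # Fixed-point residual F(x) = x - prox_{step*g}(x - step*grad f(x));
        # zeros of F are stationary points of f(x) + lam*||x||_1.
        return x - soft_threshold(x - step * grad_f(x), step * lam)

    # Toy smooth part: f(x) = 0.5*||A x - b||^2, with a noisy gradient oracle.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    grad_f = lambda x: A.T @ (A @ x - b) + 1e-3 * rng.standard_normal(5)

    x, step, lam = np.zeros(5), 1.0 / np.linalg.norm(A, 2) ** 2, 0.1
    for _ in range(200):
        # Plain fixed-point (proximal gradient) step; the paper accelerates this
        # with quasi-Newton directions plus an additional safeguarding step.
        x = soft_threshold(x - step * grad_f(x), step * lam)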

2013
L.Métivier R.Brossier J.Virieux S.Operto

Full Waveform Inversion (FWI) methods generally use gradient-based methods, such as the nonlinear conjugate gradient method or, more recently, the l-BFGS quasi-Newton method. Several authors have already investigated the possibility of accounting more accurately for the inverse Hessian operator in the minimization scheme through Gauss-Newton or exact Newton algorithms. We propose a general framewo...
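
For readers unfamiliar with the Gauss-Newton step referred to above, the following Python sketch illustrates it on a generic nonlinear least-squares problem; the residual, Jacobian, and toy data are placeholders and do not involve any actual wave-equation modeling as used in FWI.

    import numpy as np

    def gauss_newton(r, jac, m0, iters=20):
        # Minimize 0.5*||r(m)||^2 by solving the normal equations
        # (J^T J) dm = -J^T r at each iterate: J^T J approximates the Hessian.
        m = m0.copy()
        for _ in range(iters):
            J, res = jac(m), r(m)
            dm = np.linalg.solve(J.T @ J, -J.T @ res)
            m = m + dm   # full Gauss-Newton step; FWI codes usually add a line search
        return m

    # Toy residual: fit y = exp(-k*t) to noiseless synthetic data.
    t = np.linspace(0.0, 2.0, 30)
    y = np.exp(-1.3 * t)
    r = lambda m: np.exp(-m[0] * t) - y
    jac = lambda m: (-t * np.exp(-m[0] * t)).reshape(-1, 1)
    print(gauss_newton(r, jac, np.array([0.5])))   # converges to k = 1.3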

Journal: Journal of Computational and Applied Mathematics 2009

Journal: Neural Computation 2007
Liefeng Bo Ling Wang Licheng Jiao

Several algorithms for training support vector machines in the primal have recently been proposed. This letter follows those studies and develops a recursive finite Newton algorithm (IHLF-SVR-RFN) for training nonlinear support vector regression. The insensitive Huber loss function and the computation of the Newton step are discussed in detail. Comparisons with LIBSVM 2.82 show that the proposed a...
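
The following Python sketch shows the general shape of a primal Newton step for support vector regression; it is not the IHLF-SVR-RFN algorithm. It is linear rather than kernelized and uses a squared epsilon-insensitive loss instead of the insensitive Huber loss, purely to keep the generalized Hessian short.

    import numpy as np

    def newton_step_svr(X, y, w, C=1.0, eps=0.1):
        # One Newton step on the primal objective
        #   0.5*||w||^2 + C * sum_i max(|y_i - X_i.w| - eps, 0)^2
        # (a squared eps-insensitive loss, a simplification of the paper's
        #  insensitive Huber loss).
        r = y - X @ w
        active = np.abs(r) > eps                      # points outside the eps-tube
        s = np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)
        grad = w - 2.0 * C * X.T @ s                  # gradient of the objective
        H = np.eye(X.shape[1]) + 2.0 * C * X[active].T @ X[active]  # generalized Hessian
        return w - np.linalg.solve(H, grad)           # full Newton step

    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.standard_normal(100)
    w = np.zeros(3)
    for _ in range(10):      # recursively re-split the active set and step again
        w = newton_step_svr(X, y, w)
    print(w)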

2006
Xian Zhang Jianfeng Cai Yimin Wei

In this paper, we apply interval methods to the iteration for computing the Moore-Penrose inverse of a full row (or column) rank matrix. By modifying the classical Newton iteration with interval arithmetic, we obtain better numerical results. The convergence of the interval iteration is proven. We also give some numerical examples comparing the interval iteration with the classical Newton iteration.
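
For context, this is the classical point-valued Newton iteration for the Moore-Penrose inverse of a full column rank matrix that the interval method modifies; the interval enclosure itself is not reproduced here.

    import numpy as np

    def newton_pinv(A, iters=60):
        # Classical Newton iteration X_{k+1} = X_k (2I - A X_k) for the
        # Moore-Penrose inverse of a full column rank matrix.
        X = A.T / np.linalg.norm(A, 2) ** 2   # X_0 = A^T / ||A||_2^2 guarantees convergence
        I = np.eye(A.shape[0])
        for _ in range(iters):
            X = X @ (2.0 * I - A @ X)
        return X

    rng = np.random.default_rng(2)
    A = rng.standard_normal((6, 4))           # full column rank with probability 1
    print(np.max(np.abs(newton_pinv(A) - np.linalg.pinv(A))))  # ~machine precision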

Journal: Statistics, Optimization and Information Computing 2022

Obtaining a perfectly centered initial point for feasible path-following interior-point algorithms is a hard practical task. It is therefore worth analyzing other cases in which the starting point is not necessarily centered. In this paper, we propose a short-step weighted-path-following interior-point algorithm (IPA) for solving convex quadratic optimization (CQO). The latter is based on a modified search direction which is obtained by tec...
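
As background for the search direction mentioned above, the sketch below assembles one textbook Newton step on the (weighted) central-path conditions of a convex quadratic optimization problem; the paper's modified direction and short-step analysis are not reproduced, and the small problem data are arbitrary.

    import numpy as np

    def newton_direction(Q, A, b, c, x, y, s, target):
        # One Newton step on the (weighted) central-path conditions of a CQO
        #   A x = b,   Q x + c - A^T y - s = 0,   x_i s_i = target_i,
        # solved as a single linear system in (dx, dy, ds).
        n, m = len(x), len(b)
        K = np.zeros((2 * n + m, 2 * n + m))
        K[:m, :n] = A
        K[m:m + n, :n] = Q
        K[m:m + n, n:n + m] = -A.T
        K[m:m + n, n + m:] = -np.eye(n)
        K[m + n:, :n] = np.diag(s)
        K[m + n:, n + m:] = np.diag(x)
        rhs = np.concatenate([b - A @ x,
                              -(Q @ x + c - A.T @ y - s),
                              target - x * s])
        d = np.linalg.solve(K, rhs)
        return d[:n], d[n:n + m], d[n + m:]

    Q = np.eye(2); c = np.array([-1.0, -1.0])
    A = np.array([[1.0, 1.0]]); b = np.array([1.0])
    x = np.array([0.5, 0.5]); y = np.zeros(1); s = np.array([0.5, 0.5])
    mu = 0.1
    dx, dy, ds = newton_direction(Q, A, b, c, x, y, s, mu * np.ones(2))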

Journal: IEEJ Transactions on Electronics, Information and Systems 1999

2008
Giovanni Fasano Stefano Lucidi

We propose a new truncated Newton method for large-scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton's equation. At each iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative curvature direction. A test based...
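
A minimal Newton-CG sketch in the same spirit: conjugate gradients applied to Newton's equation, stopping when a direction of negative curvature is detected and returning it together with the current approximation of the Newton step. This is a generic illustration, not the authors' method or test.

    import numpy as np

    def truncated_newton_direction(hess_vec, grad, tol=1e-6, max_iter=50):
        # Approximately solve H d = -g with conjugate gradients (Newton-CG).
        # If CG meets a direction p of negative curvature (p^T H p <= 0),
        # return it alongside the current Newton-step approximation.
        d = np.zeros_like(grad)
        r = -grad.copy()      # residual of H d = -g, starting from d = 0
        p = r.copy()
        for _ in range(max_iter):
            Hp = hess_vec(p)
            curv = p @ Hp
            if curv <= 0.0:
                return d, p   # negative-curvature direction encountered
            alpha = (r @ r) / curv
            d = d + alpha * p
            r_new = r - alpha * Hp
            if np.linalg.norm(r_new) < tol:
                break
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        return d, None

    # Toy quadratic with an indefinite Hessian.
    H = np.diag([2.0, 1.0, -0.5])
    g = np.array([1.0, 1.0, 1.0])
    print(truncated_newton_direction(lambda v: H @ v, g))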

Chart of the number of search results per year
