Search results for: limited memory bfgs

Number of results: 672103

Journal: SIAM Journal on Optimization, 1998
Tim Oliver

A quasi-Newton algorithm using the BFGS update is one of the most widely used unconstrained numerical optimisation algorithms. We describe three parallel algorithms to perform the BFGS update on a local memory MIMD architecture. These algorithms are distinguished by the way in which Hessian information is stored. Cost models are developed for the algorithms and used to compare their pe...
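The BFGS update the abstract parallelizes can be written, for the inverse-Hessian approximation, as a rank-two correction. A minimal serial sketch (function name and test values are our own illustration, not the paper's parallel code):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    Standard form: H+ = V^T H V + rho s s^T with V = I - rho y s^T.
    """
    rho = 1.0 / np.dot(y, s)  # requires the curvature condition y^T s > 0
    V = np.eye(len(s)) - rho * np.outer(y, s)
    return V.T @ H @ V + rho * np.outer(s, s)
```

The updated matrix satisfies the secant condition H_new @ y == s by construction, which is a convenient sanity check for any (serial or parallel) implementation.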

Journal: Computational & Applied Mathematics, 2021

The alternating direction method of multipliers (ADMM) is an effective method for solving convex problems from a wide range of fields. At each iteration, the classical ADMM solves two subproblems exactly. However, in many applications, it is expensive or impossible to obtain exact solutions of these subproblems. To overcome this difficulty, some proximal terms are added. This class of methods typically solves the original subproblem ap...
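A minimal sketch of the classical two-block ADMM the abstract describes, instantiated for the lasso problem (the problem choice, names, and parameters are illustrative assumptions; here both subproblems are solved exactly, without the proximal modifications the paper studies):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=500):
    """Two-block ADMM for min 0.5||Ax-b||^2 + lam||z||_1  s.t.  x = z."""
    m, n = A.shape
    x = z = u = np.zeros(n)
    Atb = A.T @ b
    # Cache the inverse used by the exact x-subproblem (a ridge solve).
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))        # exact x-subproblem
        z = soft_threshold(x + u, lam / rho)  # exact z-subproblem
        u = u + x - z                         # scaled dual update
    return z
```

For large problems the exact x-subproblem (a full linear solve each iteration) is precisely the step that becomes expensive, which is the motivation for the proximal/approximate variants the abstract refers to.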

2015
Ritesh Kolte, Murat Erdogdu, Ayfer Özgür

We consider the problem of minimizing an objective function that is a sum of convex functions. For large sums, batch methods suffer from a prohibitive per-iteration complexity, and are outperformed by incremental methods such as the recent variance-reduced stochastic gradient methods (e.g. SVRG). In this paper, we propose to improve the performance of SVRG by incorporating approximate curvature ...
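A minimal sketch of the baseline SVRG update the snippet builds on (names, step size, and the least-squares instance are illustrative assumptions, not the paper's curvature-augmented method):

```python
import numpy as np

def svrg(grads, x0, lr=0.02, epochs=100, inner=None):
    """SVRG for min (1/n) sum_i f_i(x), given per-component gradients g_i.

    Each epoch: take a snapshot w, compute the full gradient mu at w,
    then run variance-reduced steps  x -= lr * (g_i(x) - g_i(w) + mu).
    """
    rng = np.random.default_rng(0)
    n = len(grads)
    inner = inner or 2 * n
    x = x0.copy()
    for _ in range(epochs):
        w = x.copy()
        mu = sum(g(w) for g in grads) / n  # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(n)
            x = x - lr * (grads[i](x) - grads[i](w) + mu)
    return x
```

The correction term g_i(w) - mu is what shrinks the gradient-estimate variance to zero near the optimum, giving linear convergence on strongly convex sums where plain SGD needs a decaying step size.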

2003
Omer Tsimhoni, Yili Liu

The Queueing Network-Model Human Processor (QN-MHP) is a computational architecture that combines the mathematical theories and simulation methods of queueing networks (QN) with the symbolic and procedural methods of GOMS analysis and the Model Human Processor (MHP). QN-MHP has been successfully used to model reaction time tasks and visual search tasks (Feyen and Liu, 2001a,b). This paper descri...

2016
Suvrit Sra, Reshad Hosseini

Machine learning models often rely on sparsity, low-rank, orthogonality, correlation, or graphical structure. The structure of interest in this chapter is geometric, specifically the manifold of positive definite (PD) matrices. Though these matrices recur throughout the applied sciences, our focus is on more recent developments in machine learning and optimization. In particular, we study (i) m...

Journal: J. Computational Applied Mathematics, 2013
Jan Vlcek, Ladislav Luksan

Two families of limited-memory variable metric or quasi-Newton methods for unconstrained minimization based on the quasi-product form of update are derived. As for the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one ...

Journal: CoRR, 2015
Xuezhe Ma, Hai Zhao

This paper presents generalized probabilistic models for high-order projective dependency parsing and an algorithmic framework for learning these statistical models involving dependency trees. Partition functions and marginals for high-order dependency trees can be computed efficiently, by adapting our algorithms which extend the inside-outside algorithm to higher-order cases. To show the effec...

2015

where U = V_{k−m} V_{k−m+1} · · · V_{k−1}. For L-BFGS, we need not explicitly store the approximated inverse Hessian matrix. Instead, we only require matrix-vector multiplications at each iteration, which can be implemented by a two-loop recursion with a time complexity of O(mn) (Nocedal & Wright, 1999). Thus, we only store 2m vectors of length n: s_{k−1}, s_{k−2}, · · · , s_{k−m} and y_{k−1}, y_{k−2}, · · · , y_{k−m} w...
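The two-loop recursion described above can be sketched as follows, storing only the 2m vectors s_i and y_i (a minimal version; the initial scaling gamma = s^T y / y^T y is one common choice, and the function name is ours):

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Compute the search direction -H_k @ grad by the two-loop recursion.

    s_list, y_list hold the m most recent pairs, oldest first.
    Cost is O(mn): m dot products per loop, no n-by-n matrix is formed.
    """
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    q = grad.copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian approximation: gamma * I with gamma = s^T y / y^T y.
    s, y = s_list[-1], y_list[-1]
    r = (np.dot(s, y) / np.dot(y, y)) * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r  # descent direction -H_k grad
```

As long as every stored pair satisfies the curvature condition y^T s > 0, the implicit H_k is positive definite, so the returned direction is a descent direction.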
