Search results for: limited memory bfgs

Number of results: 672103

2012
R. Jaafar, M. Mamat

The BFGS method is a method for solving unconstrained optimization problems. Many modifications have been proposed for solving such problems. In this paper, we present and analyze new scaled hybrid modified BFGS algorithms. The scaled hybrid modified BFGS can reduce the number of iterations. Results obtained by the hybrid modified BFGS algorithms are com...
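For reference, the classical BFGS update that such hybrid variants build on maintains an inverse-Hessian approximation H. A minimal Python sketch of the standard method (illustrative only, not the authors' scaled hybrid variant; the test function is my own choice):

```python
import numpy as np

def bfgs(f, grad, x0, iters=100, tol=1e-8):
    """Plain BFGS with Armijo backtracking (illustrative sketch)."""
    n = len(x0)
    H = np.eye(n)                      # inverse-Hessian approximation
    x = x0.astype(float)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5                   # backtracking line search
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:              # curvature condition keeps H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# usage: the Rosenbrock function, minimized at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, np.array([-1.2, 1.0])))   # approx. [1., 1.]
```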

Journal: Proceedings of the ... SIAM International Conference on Data Mining 2014
Mahdi Pakdaman Naeini, Iyad Batal, Zitao Liu, Charmgil Hong, Milos Hauskrecht

This paper studies the multi-label classification problem, in which data instances are associated with multiple, possibly high-dimensional, label vectors. This problem is especially challenging when labels are dependent and one cannot decompose the problem into a set of independent classification problems. To address the problem and properly represent label dependencies, we propose and study a pairwi...

2015
Fabrizio Riguzzi, Elena Bellodi, Riccardo Zese, Giuseppe Cota, Evelina Lamma

Probabilistic logic models are used ever more often to deal with the uncertain relations that are typical of the real world. However, these models usually require expensive inference and learning procedures. Very recently the problem of identifying tractable languages has come to the fore. In this paper we consider the models used by the Inductive Constraint Logic (ICL) system, namely sets of i...

Journal: CoRR 2012
Stephen Purpura, Dustin Hillard, Mark Hubenthal, Jim Walsh, Scott Golder, Scott Smith

We present a system that enables rapid model experimentation for tera-scale machine learning with trillions of non-zero features, billions of training examples, and millions of parameters. Our contribution to the literature is a new method (SA L-BFGS) for changing batch L-BFGS to perform in near real-time by using statistical tools to balance the contributions of previous weights, old training ...
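The batch L-BFGS baseline that SA L-BFGS modifies avoids storing an explicit inverse Hessian by reconstructing search directions from the last m curvature pairs. A sketch of the standard two-loop recursion (the generic algorithm, not the paper's SA variant):

```python
import numpy as np

def lbfgs_direction(g, s_hist, y_hist):
    """Standard L-BFGS two-loop recursion: returns -H_k g using the stored
    (s, y) pairs instead of an explicit n x n inverse Hessian."""
    q = g.copy()
    stack = []
    for s, y in reversed(list(zip(s_hist, y_hist))):   # newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        q -= a * y
        stack.append((a, rho, s, y))
    if s_hist:                                         # initial scaling H_0 = gamma * I
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    for a, rho, s, y in reversed(stack):               # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

In a full optimizer this plugs into a line-search loop that appends each new pair (s, y) = (x_{k+1} - x_k, g_{k+1} - g_k) and discards all but the most recent m, which is what keeps the memory footprint linear in the number of parameters.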

Journal: Journal of Computational Chemistry 2021

Numerical optimization is a common technique in various areas of computational chemistry, molecular modeling, and drug design. It is a key element of 3D techniques, for example, for optimizing protein–ligand poses and small-molecule conformers. Here, often the BFGS algorithm or variants thereof are used. However, BFGS tends to make unreasonably large changes to the optimized system under certain circumstances. This behavior ...
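The abstract is truncated before the paper's actual remedy, so purely as an assumption about the kind of fix involved: one common safeguard in geometry optimization is to cap the per-coordinate displacement of each quasi-Newton step before applying it (the cap value below is hypothetical):

```python
import numpy as np

MAX_DISP = 0.3   # hypothetical cap on per-coordinate displacement (e.g. in angstroms)

def capped_step(x, p):
    """Rescale a quasi-Newton step p so no coordinate moves more than MAX_DISP."""
    worst = np.max(np.abs(p))
    if worst > MAX_DISP:
        p = p * (MAX_DISP / worst)
    return x + p
```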

Journal: SIAM Journal on Optimization 2016
Frank Schöpfer

Linear convergence rates of descent methods for unconstrained minimization are usually proven under the assumption that the objective function is strongly convex. Recently it was shown that the weaker assumption of restricted strong convexity suffices for linear convergence of the ordinary gradient descent method. A decisive difference from strong convexity is that the set of minimizers of a rest...
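A small numerical illustration of the setting (my own construction, not from the paper): a least-squares objective with a rank-deficient matrix is not strongly convex, since its minimizers form an affine subspace rather than a point, yet ordinary gradient descent still closes the optimality gap geometrically:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3)) @ rng.standard_normal((3, 6))  # rank 3, 6 columns
b = rng.standard_normal(10)

f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)   # convex, but NOT strongly convex
grad = lambda x: A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
fstar = f(np.linalg.lstsq(A, b, rcond=None)[0])

x = np.zeros(6)
for k in range(201):
    if k % 50 == 0:
        print(k, f(x) - fstar)                 # the gap shrinks geometrically
    x -= grad(x) / L                           # ordinary gradient descent
```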

Journal: Multimedia Tools and Applications 2022

Part-of-Speech tagging is widely used in natural language processing. There are many statistical approaches in this area; the most popular one is the Hidden Markov Model. In this paper, an alternative approach, linear-chain Conditional Random Fields, is introduced. Conditional Random Fields are a factor-graph approach that can naturally incorporate arbitrary, non-independent features of the input without conditional independence...
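To make the linear-chain factor-graph view concrete, here is a minimal Viterbi (MAP) decoder over per-position emission scores and pairwise transition scores. This is a generic sketch rather than the paper's tagger, and the score matrices in the usage lines are made up:

```python
import numpy as np

def viterbi(emissions, transitions):
    """MAP decoding for a linear-chain model.
    emissions: (T, K) per-position scores; transitions: (K, K) pairwise scores."""
    T, K = emissions.shape
    score = emissions[0].copy()                # best score ending in each tag
    back = np.zeros((T, K), dtype=int)         # argmax back-pointers
    for t in range(1, T):
        cand = score[:, None] + transitions + emissions[t]   # (prev, curr)
        back[t] = np.argmax(cand, axis=0)
        score = np.max(cand, axis=0)
    tags = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):              # follow back-pointers
        tags.append(int(back[t][tags[-1]]))
    return tags[::-1]

# usage with hypothetical scores for 3 positions and 2 tags
emis = np.log(np.array([[0.7, 0.3], [0.2, 0.8], [0.6, 0.4]]))
trans = np.log(np.array([[0.9, 0.1], [0.5, 0.5]]))
print(viterbi(emis, trans))
```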

Journal: Physics in Medicine and Biology 2003
Michael Lahanas, Eduard Schreibmann, Dimos Baltas

We consider the behaviour of the limited-memory BFGS (L-BFGS) algorithm as a representative constraint-free gradient-based algorithm used for multiobjective (MO) dose optimization for intensity modulated radiotherapy (IMRT). Using a parameter transformation, the positivity constraint problem of negative beam fluences is entirely eliminated: a feature which to date has not been fully understoo...
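The parameter transformation alluded to is typically a squaring substitution: optimize an unconstrained variable t and set each fluence to x = t², so negativity is impossible by construction. A toy sketch of the idea with a hypothetical quadratic objective (not the paper's dose model):

```python
import numpy as np

# hypothetical convex quadratic "dose" objective over nonnegative fluences x
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-1.0, -2.0])
grad_x = lambda x: Q @ x + c              # gradient with respect to x

t = np.ones(2)                            # unconstrained variable, x = t**2
for _ in range(2000):
    x = t ** 2                            # nonnegative by construction
    t -= 0.05 * (2 * t * grad_x(x))       # chain rule: df/dt = 2 t * df/dx
print(t ** 2)                             # approaches [0, 2]; no constraint ever enforced
```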

Journal: Math. Program. 2012
Roger Fletcher

The possibilities inherent in steepest descent methods have been considerably amplified by the introduction of the Barzilai-Borwein choice of step-size, and other related ideas. These methods have proved to be competitive with conjugate gradient methods for large-dimension unconstrained minimization problems. This paper suggests a method which is able to take advantage of th...
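For reference, the Barzilai-Borwein idea replaces the line search in steepest descent with a step size computed from the last two iterates. A bare sketch (unsafeguarded, so only reliable on well-behaved problems such as strongly convex quadratics; the test problem is my own):

```python
import numpy as np

def bb_descent(grad, x0, iters=100):
    """Steepest descent with the Barzilai-Borwein step
    alpha = (s.s) / (s.y), where s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    x_prev = x0.astype(float)
    g_prev = grad(x_prev)
    x = x_prev - 1e-4 * g_prev            # tiny first step to seed s and y
    for _ in range(iters):
        g = grad(x)
        s, y = x - x_prev, g - g_prev
        if np.linalg.norm(s) < 1e-14:     # already converged
            break
        alpha = (s @ s) / (s @ y)         # BB1 step size
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

# usage: solve the SPD quadratic min 0.5 x'Ax - b'x, i.e. A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(bb_descent(lambda x: A @ x - b, np.zeros(2)))  # approx. [0.2, 0.4]
```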

Journal: J. Computational Applied Mathematics 2010
Neculai Andrei

New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's method for unconstrained optimization, are proposed. Using an exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For an inexact line search, the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...
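The Dai-Yuan scheme being modified computes the conjugate-gradient parameter as beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). A plain sketch with Armijo backtracking (the authors' accelerated variant and its sufficient-descent machinery are not reproduced here; Dai-Yuan's descent guarantee formally requires a Wolfe line search, so a restart safeguard is added):

```python
import numpy as np

def dai_yuan_cg(f, grad, x0, iters=200, tol=1e-8):
    """Nonlinear CG with the Dai-Yuan beta and Armijo backtracking (sketch)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                    # safeguard: restart with steepest descent
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5                      # backtracking line search
        x_new = x + t * d
        g_new = grad(x_new)
        denom = d @ (g_new - g)
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d             # Dai-Yuan update
        x, g = x_new, g_new
    return x

# usage: f(x) = ||x||^2, minimized at the origin
print(dai_yuan_cg(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -4.0])))
```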

Chart of the number of search results per year
