Search results for: limited memory bfgs

Number of results: 672103

2013
Mingjie Qian ChengXiang Zhai

A new unsupervised feature selection method, Robust Unsupervised Feature Selection (RUFS), is proposed. Unlike traditional unsupervised feature selection methods, pseudo cluster labels are learned via local-learning-regularized robust nonnegative matrix factorization. During the label learning process, feature selection is performed simultaneously by robust joint ℓ2,1-norm minimization. ...
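The ℓ2,1 norm that drives the joint feature selection in the abstract above has a simple closed form: the sum of the Euclidean norms of the rows of the weight matrix, so minimizing it pushes entire rows (features) to zero. A minimal NumPy sketch of just the norm itself; the matrix `W` and its values are illustrative, not taken from the paper:

```python
import numpy as np

def l21_norm(W):
    """l2,1 norm: sum of the Euclidean norms of the rows of W.

    Penalizing this norm encourages whole rows of W to shrink to
    zero, which is what yields row-sparse (feature-selecting)
    solutions in methods such as RUFS.
    """
    return np.sum(np.linalg.norm(W, axis=1))

W = np.array([[3.0, 4.0],   # row norm 5
              [0.0, 0.0],   # row norm 0 (a "pruned" feature)
              [1.0, 0.0]])  # row norm 1
print(l21_norm(W))  # 6.0
```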

Journal: :Applied Mathematics and Computation 2010
M. S. Apostolopoulou D. G. Sotiropoulos C. A. Botsaris

We present a new matrix-free method for the computation of negative curvature directions based on the eigenstructure of minimal-memory BFGS matrices. We determine via simple formulas the eigenvalues of these matrices and we compute the desirable eigenvectors by explicit forms. Consequently, a negative curvature direction is computed in such a way that avoids the storage and the factorization of ...

Journal: :Hydrology and Earth System Sciences 2021

Abstract. Timely and accurate estimation of reference evapotranspiration (ET0) is indispensable for efficient agricultural water management. This study aims to estimate ET0 with machine learning approaches using minimum meteorological parameters in the Corum region, which has an arid/semi-arid climate and is regarded as an important centre of Turkey. In this context, monthly averages of variables...

1994
Aiping Liao

In this paper, we propose a modified BFGS method and study the global and superlinear convergence properties of this method. We show that under certain circumstances this modified BFGS method corrects the eigenvalues better than the BFGS does. Our numerical results support this claim and also indicate that the modified BFGS method may be competitive with the BFGS method in general. This modified me...

2017
Arnaud Nguembang Fadja Fabrizio Riguzzi

Probabilistic logic programming (PLP) provides a powerful tool for reasoning with uncertain relational models. However, learning probabilistic logic programs is expensive due to the high cost of inference. Among the proposals to overcome this problem, one of the most promising is lifted inference. In this paper we consider PLP models that are amenable to lifted inference and present an algorith...

2007
Lin Zhang Adrian Sandu

In this paper we discuss variational data assimilation using the STEM atmospheric Chemical Transport Model. STEM is a multiscale model and can perform air quality simulations and predictions over spatial and temporal scales of different orders of magnitude. To improve the accuracy of model predictions we construct a dynamic data driven application system (DDDAS) by integrating data assimilation...

Journal: :CoRR 2015
Nicolas Ray Dmitry Sokolov

L-BFGS is a hill climbing method that is guaranteed to converge only for convex problems. In computer graphics, it is often used as a black box solver for a more general class of nonlinear problems, including problems having many local minima. Some works obtain very nice results by solving such difficult problems with L-BFGS. Surprisingly, the method is able to escape local minima: our interpr...
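The black-box usage described above can be reproduced with SciPy's L-BFGS-B implementation on a simple nonconvex function. This is only an illustrative sketch: the one-dimensional function `f` below is a made-up stand-in with many local minima, not one of the computer-graphics problems the paper considers, and whether the solver escapes a given local minimum depends on the starting point and line-search steps:

```python
import numpy as np
from scipy.optimize import minimize

# A 1-D nonconvex function: a parabola modulated by an
# oscillating term, giving many local minima around the
# global minimum at x = 0.
def f(x):
    return float(x[0] ** 2 + 10.0 * np.sin(x[0]) ** 2)

x0 = np.array([3.0])
res = minimize(f, x0, method="L-BFGS-B")
print(res.x, res.fun)  # a (possibly local) minimizer and its value
```

Running from different starting points shows the behaviour the abstract alludes to: sometimes the line search steps over a basin boundary and the iterate lands in a deeper minimum than the nearest one.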

Journal: :Optimization Methods and Software 2008
M. S. Apostolopoulou D. G. Sotiropoulos Panayiotis E. Pintelas

We present a new matrix-free method for the large-scale trust-region subproblem, assuming that the approximate Hessian is updated by the L-BFGS formula with m = 1 or 2. We determine via simple formulas the eigenvalues of these matrices and, at each iteration, we construct a positive definite matrix whose inverse can be expressed analytically, without using factorization. Consequently, a directi...
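The eigenstructure exploited above can be checked numerically for the memory-1 case. With the initial matrix B0 = γI, the BFGS update of B0 is a rank-two correction built from the pair (s, y), so at most two eigenvalues differ from γ and the rest stay equal to γ. A small NumPy sketch; the vectors `s` and `y` are synthetic, and this verifies only the multiplicity claim, not the paper's closed-form eigenvalue formulas:

```python
import numpy as np

n, gamma = 5, 2.0
rng = np.random.default_rng(0)
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)  # keeps s @ y > 0 (curvature condition)

B0 = gamma * np.eye(n)
# Memory-1 BFGS update of B0: a symmetric rank-two correction.
B = (B0
     - np.outer(B0 @ s, B0 @ s) / (s @ B0 @ s)
     + np.outer(y, y) / (y @ s))

eig = np.linalg.eigvalsh(B)
print(eig)  # n-2 eigenvalues equal gamma; only two differ from it
```

Because the update acts only on the two-dimensional span of {s, B0 s is parallel to s here, y}, every direction orthogonal to that span keeps eigenvalue γ, which is what makes the closed-form eigenvalue formulas (and hence a factorization-free method) possible.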

Journal: :Communications in Statistics - Simulation and Computation 2011
Steven P. Ellis

A descent algorithm, “Quasi-Quadratic Minimization with Memory” (QQMM), is proposed for unconstrained minimization of the sum, F , of a non-negative convex function, V , and a quadratic form. Such problems come up in regularized estimation in machine learning and statistics. In addition to values of F , QQMM requires the (sub)gradient of V . Two features of QQMM help keep low the number of eval...

2011
Yong Ma

A Hessian matrix in full waveform inversion (FWI) is difficult to compute directly because of high computational cost and an especially large memory requirement. Therefore, Newton-like methods are rarely feasible in realistic large-size FWI problems. We modify the quasi-Newton BFGS method to use a projected Hessian matrix that reduces both the computational cost and memory required, thereby mak...
