Search results for: double parameter scaled quasi newton formula

Number of results: 648605

Journal: :Math. Program. 1983
Mukund N. Thapa

Newton-type methods and quasi-Newton methods have proven to be very successful in solving dense unconstrained optimization problems. Recently there has been considerable interest in extending these methods to solving large problems when the Hessian matrix has a known a priori sparsity pattern. This paper treats sparse quasi-Newton methods in a uniform fashion and shows the effect of loss of pos...
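The abstracts in these results do not include the update formulas themselves. As a generic illustration only (not the sparse method of this paper), a dense BFGS quasi-Newton step with Armijo backtracking might be sketched as follows; the quadratic test problem and all names are invented for the example:

```python
import numpy as np

def bfgs_update(H, s, y, eps=1e-10):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k).
    The update is skipped when the curvature condition s^T y > 0
    fails, which preserves positive definiteness of H.
    """
    sy = s @ y
    if sy <= eps:
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def backtracking(f, x, p, g, c=1e-4, beta=0.5):
    """Armijo backtracking line search along the descent direction p."""
    alpha = 1.0
    while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
        alpha *= beta
    return alpha

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x, H = np.zeros(2), np.eye(2)
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                        # quasi-Newton direction
    alpha = backtracking(f, x, p, g)
    x_new = x + alpha * p
    H = bfgs_update(H, x_new - x, grad(x_new) - g)
    x = x_new

print(np.round(x, 6))  # minimizer is A^{-1} b = [0.2, 0.4]
```

On this strongly convex quadratic the curvature condition always holds (y = A s, so s^T y > 0), which is why the safeguard in `bfgs_update` never triggers here.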

Journal: :Computational Optimization and Applications 2022

Although the performance of popular optimization algorithms such as Douglas–Rachford splitting (DRS) and ADMM is satisfactory in convex, well-scaled problems, ill conditioning and nonconvexity pose a severe obstacle to their reliable employment. Expanding on recent convergence results for DRS applied to nonconvex problems, we propose two linesearch algorithms to enhance and robustify these methods by means of quasi-Newton directions...

Journal: :IEEE Trans. Signal Processing 2000
Zhengjiu Kang Chanchal Chatterjee Vwani P. Roychowdhury

In this paper, we derive and discuss a new adaptive quasi-Newton eigen-estimation algorithm and compare it with the RLS-type adaptive algorithms and the quasi-Newton algorithm proposed by Mathew et al. through experiments with stationary and nonstationary data.

2015
Melvin Leok

... the sequence {x_k}. A similar result also holds for quasi-Newton methods with a trust region (see [16]). Another type of special quasi-Newton method requires the quasi-Newton matrices to be sparse. Large-scale problems quite often have a separable structure, which leads to special structure in the Hessian matrices. In such cases we can require the quasi-Newton matrices to have similar structures.
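As a small illustration of the separable-structure point (the example is made up, not taken from any paper listed here), the Hessian of a partially separable objective such as f(x) = Σᵢ ½(xᵢ − xᵢ₊₁)² is tridiagonal, so a structured quasi-Newton approximation need only store the band:

```python
import numpy as np

def chain_hessian(n):
    """Hessian of f(x) = sum_i 0.5*(x_i - x_{i+1})**2.

    Each element function couples only x_i and x_{i+1}, so the
    assembled Hessian is tridiagonal: O(n) nonzeros rather than
    O(n^2). A sparse quasi-Newton method can restrict its
    approximation to this same pattern.
    """
    H = np.zeros((n, n))
    for i in range(n - 1):
        H[i, i] += 1.0
        H[i + 1, i + 1] += 1.0
        H[i, i + 1] -= 1.0
        H[i + 1, i] -= 1.0
    return H

H = chain_hessian(5)
print(int(np.count_nonzero(H)))  # 13 nonzeros in the 5x5 tridiagonal matrix
```

Storing only the band is what makes structured quasi-Newton updates attractive at large scale, as the surrounding passage notes.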

2010
Hong-Wei Liu

In this paper, a non-monotone line search procedure is studied and combined with the non-quasi-Newton family. Under a uniform convexity assumption on the objective function, the global and superlinear convergence of the non-quasi-Newton family with the proposed non-monotone line search is proved under suitable conditions.
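The abstract does not give the exact procedure; a common non-monotone Armijo rule in the style of Grippo, Lampariello, and Lucidi can be sketched as below, here with steepest-descent directions on an illustrative quadratic (all specifics are assumptions, not this paper's method):

```python
import numpy as np
from collections import deque

def nonmonotone_backtracking(f, x, p, g, f_history, c=1e-4, beta=0.5):
    """Non-monotone Armijo backtracking (Grippo-Lampariello-Lucidi style).

    A step is accepted when f(x + alpha*p) lies below the MAX of the
    last few function values plus the usual Armijo decrease term,
    instead of below f(x) alone, so individual steps may increase f.
    """
    f_ref = max(f_history)   # reference value over a sliding window
    slope = g @ p            # negative for a descent direction
    alpha = 1.0
    while f(x + alpha * p) > f_ref + c * alpha * slope:
        alpha *= beta
    return alpha

# Illustrative run on f(x) = x_0^2 + 10 x_1^2 with steepest descent.
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])

x = np.array([3.0, -2.0])
f_history = deque([f(x)], maxlen=5)  # window of recent f-values
for _ in range(1000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    alpha = nonmonotone_backtracking(f, x, -g, g, f_history)
    x = x - alpha * g
    f_history.append(f(x))

print(np.round(x, 6))  # converges toward the minimizer [0, 0]
```

The window length (here 5) controls how much non-monotonicity is tolerated; a window of 1 recovers the classical monotone Armijo rule.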

Journal: :SIAM Journal on Optimization 2013
Xiaojun Chen Lingfeng Niu Ya-Xiang Yuan

Abstract. Regularized minimization problems with nonconvex, nonsmooth, perhaps nonLipschitz penalty functions have attracted considerable attention in recent years, owing to their wide applications in image restoration, signal reconstruction and variable selection. In this paper, we derive affine-scaled second order necessary and sufficient conditions for local minimizers of such minimization p...

Journal: :SIAM Journal on Matrix Analysis and Applications 2023

We consider the problem of computing the square root of a perturbation of the scaled identity matrix, A = αIₙ + UV*, where U and V are n×k matrices with k ≤ n. This problem arises in various applications, including computer vision and optimization methods for machine learning. We derive a new formula for the pth root that involves a weighted sum of powers of a matrix and is particularly attractive for the square root, since the sum has just one term when p = 2. We also derive a class of Newton iterations that exploit the low-rank ...
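The paper's own formula and structured iterations are not reproduced in the snippet above. As general background only, a classical Newton-type iteration for the principal matrix square root is the Denman–Beavers iteration, sketched here on a scaled identity plus a rank-1 perturbation (the example matrix is made up):

```python
import numpy as np

def sqrtm_denman_beavers(A, max_iter=50, tol=1e-12):
    """Denman-Beavers iteration for the principal square root of A.

    A Newton-type coupled iteration:
        Y_{k+1} = (Y_k + inv(Z_k)) / 2,   Y_0 = A
        Z_{k+1} = (Z_k + inv(Y_k)) / 2,   Z_0 = I
    Y_k -> A^{1/2} and Z_k -> A^{-1/2} when A has no eigenvalues on
    the closed negative real axis (e.g. A symmetric positive
    definite, as below).
    """
    Y = np.array(A, dtype=float)
    Z = np.eye(A.shape[0])
    for _ in range(max_iter):
        Y_next = 0.5 * (Y + np.linalg.inv(Z))
        Z_next = 0.5 * (Z + np.linalg.inv(Y))
        delta = np.linalg.norm(Y_next - Y, ord="fro")
        Y, Z = Y_next, Z_next
        if delta < tol:
            break
    return Y

# alpha*I plus a rank-1 perturbation, matching the abstract's setting.
rng = np.random.default_rng(0)
u = rng.standard_normal((4, 1))
A = 2.0 * np.eye(4) + u @ u.T        # symmetric positive definite
X = sqrtm_denman_beavers(A)
print(np.allclose(X @ X, A, atol=1e-8))  # True
```

This dense iteration costs O(n³) per step; the attraction of methods like the one in the abstract is precisely that they avoid this by working with the k×k low-rank factors instead.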

1992
Chang-Pu Sun

The exotic quantum double and its universal R-matrix for the quantum Yang–Baxter equation are constructed in terms of Drinfeld's quantum double theory. As a new quasi-triangular Hopf algebra, it is quite different from the standard quantum doubles that are the q-deformations of Lie algebras or Lie superalgebras. By studying its representation theory, many-parameter representations of the exotic qua...

2005
Ladislav Lukšan Jan Vlček

In this report, we propose a new partitioned variable metric method for minimizing nonsmooth partially separable functions. After a short introduction, the complete algorithm is introduced and some implementation details are given. We prove that this algorithm is globally convergent under standard mild assumptions. The computational experiments given confirm the efficiency and robustness of the new met...

Journal: :Inf. Sci. 2011
Xudong Ma Ping Luo Fuzhen Zhuang Qing He Zhongzhi Shi Zhiyong Shen

Ensemble learning with output from multiple supervised and unsupervised models aims to improve the classification accuracy of a supervised model ensemble by jointly considering the grouping results from the unsupervised models. In this paper we cast this ensemble task as an unconstrained probabilistic embedding problem. Specifically, we assume both objects and classes/clusters have latent coordinates...
