Search results for: eigenvalue gradient method

Number of results: 1735319

2015
Zhaojun Bai, Ren-Cang Li, Wen-Wei Lin

The locally optimal block preconditioned 4-d conjugate gradient method (LOBP4dCG) for the linear response eigenvalue problem was proposed in [SIAM J. Matrix Anal. Appl., 34(2):392–416, 2013] and later was extended to the generalized linear response eigenvalue problem in [BIT Numer. Math., 54(1):31–54, 2014]. In this paper, we put forward two improvements to the method: a shifting deflation tech...

2004
May S. Yu

The family of iterative methods for static and natural vibration analysis, based on the preconditioned conjugate gradient (PCG) method with aggregation multilevel preconditioning, is considered. Both the element-by-element procedure for assembling the stiffness matrix and the sparse direct solver for its factoring, with fast forward-backward substitutions, ensure high stability of the methods against ill...

Journal: Math. Comput., 2004
Marcelo Gomes de Queiroz, Joaquim Júdice, Carlos Humes

In this paper the Eigenvalue Complementarity Problem (EiCP) with real symmetric matrices is addressed. It is shown that the symmetric (EiCP) is equivalent to finding an equilibrium solution of a differentiable optimization problem in a compact set. A necessary and sufficient condition for solvability is obtained which, when verified, gives a convenient starting point for any gradient-ascent loc...

2005
Volkan Akçelik, George Biros, Omar Ghattas, David Keyes, Kwok Ko, Lie-Quan Lee, Esmond G. Ng

We formulate the problem of designing the low-loss cavity for the International Linear Collider (ILC) as an electromagnetic shape optimization problem involving a Maxwell eigenvalue problem. The objective is to maximize the stored energy of a trapped mode in the end cell while maintaining a specified frequency corresponding to the accelerating mode. A continuous adjoint method is presented for ...

2008
Andrew J. Wathen, Tyrone Rees, Victor Pereyra

It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace; that is, it will implicitly compute the optimal polynomial. ...
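The Krylov-optimality property this abstract refers to can be illustrated with a minimal sketch, not taken from the cited paper; the matrix, right-hand side, and tolerance below are illustrative choices:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive definite A.
    After k steps the iterate minimizes the A-norm of the error over
    the k-th Krylov subspace, i.e. CG implicitly selects the optimal
    polynomial in A applied to the initial residual."""
    n = len(b)
    max_iter = max_iter if max_iter is not None else 10 * n
    x = np.zeros(n)
    r = b.copy()                    # residual b - A x
    p = r.copy()                    # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # keeps directions A-conjugate
        rs = rs_new
    return x

# Illustrative SPD system: tridiagonal 1-D Laplacian.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))   # small residual
```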

2013
Francis Bach

In this paper, we consider supervised learning problems such as logistic regression and study the stochastic gradient method with averaging, in the usual stochastic approximation setting where observations are used only once. We show that after N iterations, with a constant step size proportional to 1/(R²√N), where N is the number of observations and R is the maximum norm of the observations, the...
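The single-pass averaged-SGD setting described here can be sketched as follows; the synthetic data, seed, and the exact step-size constant are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic logistic regression data; each observation is used exactly
# once, matching the single-pass stochastic approximation setting.
N, d = 5000, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(N, d))
y = (rng.random(N) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

R = np.max(np.linalg.norm(X, axis=1))    # max norm of the observations
step = 1.0 / (R**2 * np.sqrt(N))         # constant step size, order 1/(R^2 sqrt(N))

w = np.zeros(d)
w_avg = np.zeros(d)
for i in range(N):
    xi, yi = X[i], y[i]
    grad = (1 / (1 + np.exp(-xi @ w)) - yi) * xi   # logistic loss gradient
    w -= step * grad
    w_avg += (w - w_avg) / (i + 1)                 # running (Polyak) average

def logloss(w):
    p = 1 / (1 + np.exp(-X @ w))
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

print(logloss(np.zeros(d)), logloss(w_avg))        # averaged iterate improves
```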

2007
Stephen Whalen

CG approximates the largest eigenvalue of a sparse, symmetric, positive definite matrix, using inverse iteration [3]. The matrix is generated by summing outer products of sparse vectors, with a fixed number of nonzero elements in each generating vector. The matrix sizes and total number of nonzero elements (“computed nonzeros,” following [3]) are listed in Table 1. The benchmark computes a give...
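The inner/outer structure described here, an eigenvalue estimate driven by repeated CG solves, can be sketched generically; this is shifted inverse iteration on a small dense SPD matrix, not the benchmark code itself, and the shift of 0 (which targets the smallest eigenvalue) is an illustrative choice:

```python
import numpy as np

def cg(A, b, tol=1e-12, iters=1000):
    """Plain conjugate gradient inner solver for SPD A."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        a = rs / (p @ Ap)
        x += a * p
        r -= a * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(1)
n = 40
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T   # SPD, eigenvalues 1..10

# Shifted inverse iteration: iterating z <- (A - s I)^{-1} z converges to
# the eigenvector nearest the shift s; each solve uses CG, which is valid
# while A - s I remains positive definite.
s = 0.0                         # shift 0 targets the smallest eigenvalue
z = rng.normal(size=n)
for _ in range(30):
    z = cg(A - s * np.eye(n), z)
    z /= np.linalg.norm(z)
lam = z @ (A @ z)               # Rayleigh quotient eigenvalue estimate
print(lam)                      # close to 1.0 by construction of A
```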

2012
Shu-Tian Liu, Xin-Long Luo

Article history: Received 29 August 2008; Accepted 19 November 2009; Available online 12 January 2010. Submitted by V. Mehrmann. AMS classification: 65F15, 65K05, 65K10, 65L15.

2007
Martin Stoll, Andy Wathen

The Bramble-Pasciak Conjugate Gradient method is a well-known tool for solving linear systems in saddle point form. A drawback of this method is that, to ensure the applicability of Conjugate Gradients, the preconditioner must be scaled, which typically involves the solution of an eigenvalue problem. Here, we introduce a modified preconditioner and inner product which, without scaling, enable the...

[Chart: number of search results per publication year]