Search results for: matrix Krylov subspace
Number of results: 378189
1. When is it true? (5 points for each correct answer, -3 points for each wrong answer, 0 points for each blank) Fill in each blank with “always,” “sometimes,” or “never.” For example, • A nonsingular matrix is always invertible. • A square matrix is sometimes full-rank. • A strictly tall matrix is never onto. Here the matrix dimensions are such that each expression makes sense, but they are othe...
In this paper, we present new variants of global bi-conjugate gradient (Gl-BiCG) and global bi-conjugate residual (Gl-BiCR) methods for solving nonsymmetric linear systems with multiple right-hand sides. These methods are based on global oblique projections of the initial residual onto a matrix Krylov subspace. It is shown that these new algorithms converge faster and more smoothly than the Gl-...
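The Gl-BiCG/Gl-BiCR variants described above are not reproduced here, but the following minimal sketch shows the core idea they build on: iterating in a matrix Krylov subspace with the trace (Frobenius) inner product so that all right-hand sides share one subspace. It is a global CG sketch for the simpler symmetric positive definite case, written in NumPy with illustrative random data.

```python
import numpy as np

def global_cg(A, B, tol=1e-10, max_iter=500):
    """Global CG for A X = B with multiple right-hand sides (SPD A).

    Iterates live in the matrix Krylov subspace span{R0, A R0, A^2 R0, ...},
    and all inner products are Frobenius (trace) inner products.
    """
    X = np.zeros(B.shape)
    R = B - A @ X
    P = R.copy()
    rr = np.sum(R * R)                # <R, R>_F = trace(R^T R)
    norm_b = np.linalg.norm(B)
    for _ in range(max_iter):
        AP = A @ P
        alpha = rr / np.sum(AP * P)   # <R, R>_F / <A P, P>_F
        X += alpha * P
        R -= alpha * AP
        rr_new = np.sum(R * R)
        if np.sqrt(rr_new) <= tol * norm_b:
            break
        P = R + (rr_new / rr) * P
        rr = rr_new
    return X

# Example: SPD system with 3 right-hand sides (illustrative data).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)         # symmetric positive definite
B = rng.standard_normal((50, 3))
X = global_cg(A, B)
print(np.linalg.norm(A @ X - B))      # residual should be near zero
```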
Global Krylov subspace methods are the most efficient and robust methods to solve the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. This method has inner and outer iterations; it employs the generalized conjugate gradient method as an inner iteration to approximate each outer iterate, while each...
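For orientation only, the sketch below is not the nested splitting conjugate gradient method from the abstract; it merely illustrates the structure of a small, standard (uncoupled) Sylvester equation AX + XB = C via the Kronecker-product vectorization, with made-up random matrices. Krylov subspace solvers matter precisely because this dense Kronecker formulation does not scale.

```python
import numpy as np

# Small Sylvester equation A X + X B = C, solved by vectorization:
# vec(A X + X B) = (I_n ⊗ A + B^T ⊗ I_m) vec(X)  (column-major vec).
# Only viable for small m, n; Krylov methods avoid forming the mn-by-mn operator.
rng = np.random.default_rng(1)
m, n = 6, 4
A = rng.standard_normal((m, m))
B = rng.standard_normal((n, n))
C = rng.standard_normal((m, n))

K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(m))
x = np.linalg.solve(K, C.flatten(order="F"))
X = x.reshape((m, n), order="F")

print(np.linalg.norm(A @ X + X @ B - C))   # ~ machine precision
```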
A recently developed Schur-type matrix approximation technique is applied to subspace estimation. The method is applicable if an upper bound on the noise level is approximately known. The main feature of the algorithm is that updating and downdating are straightforward and efficient, and that the subspace dimension is elegantly tracked as well.
We show that a certain subspace of the space of elliptic cusp forms is isomorphic as a Hecke module to a certain subspace of the space of Jacobi cusp forms of degree one with matrix index, by constructing an explicit lifting. This is a partial generalization of the work of Skoruppa and Zagier. This lifting is also related to the Ikeda lifting.
This paper presents a new subspace modeling and selection approach for noisy speech recognition. In subspace modeling, we develop factor analysis (FA) for representing noisy speech. FA is a data generation model in which observations are expressed through a factor loading matrix acting on common factors, plus specific factors. We bridge the connection of FA to the signal subspace (SS) approach. Interestingly, FA partitions noi...
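As a rough illustration of the FA generative model only (not the paper's speech-recognition pipeline), the sketch below draws data from x = Lf + e with a loading matrix L, common factors f, and specific factors e, fits scikit-learn's FactorAnalysis, and compares the estimated signal subspace with the true one. All dimensions and data are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
d, k, n = 20, 3, 2000
L = rng.standard_normal((d, k))            # factor loading matrix
F = rng.standard_normal((n, k))            # common factors
E = 0.1 * rng.standard_normal((n, d))      # specific factors (noise)
X = F @ L.T + E                            # observations from the FA model

fa = FactorAnalysis(n_components=k).fit(X)
est = np.linalg.qr(fa.components_.T)[0]    # estimated signal subspace basis
true = np.linalg.qr(L)[0]                  # true signal subspace basis
# Singular values below are cosines of the principal angles; near 1 means
# the estimated and true subspaces nearly coincide.
print(np.linalg.svd(true.T @ est, compute_uv=False))
```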
In this paper, we present an inexact inverse subspace iteration method for computing a few eigenpairs of the generalized eigenvalue problem Ax = λBx [Q. Ye and P. Zhang, Inexact inverse subspace iteration for generalized eigenvalue problems, Linear Algebra and its Applications, 434 (2011) 1697-1715]. In particular, the linear convergence property of the inverse subspace iteration is preserved.
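A minimal sketch of plain (exact-solve) inverse subspace iteration for Ax = λBx is given below; the inexact variants referenced in the abstract would replace the direct inner solve by an approximate iterative one. The function name, random test matrices, and parameter choices are illustrative, not taken from the cited paper.

```python
import numpy as np

def inverse_subspace_iteration(A, B, p, iters=50):
    """Inverse subspace iteration for A x = lambda B x.

    Each step solves A Y = B X directly and re-orthonormalizes; targets the
    p eigenvalues of smallest magnitude. An inexact variant would solve the
    inner system A Y = B X only approximately.
    """
    n = A.shape[0]
    rng = np.random.default_rng(0)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(iters):
        Y = np.linalg.solve(A, B @ X)
        X, _ = np.linalg.qr(Y)
    # Rayleigh-Ritz step on the computed subspace
    Ap, Bp = X.T @ A @ X, X.T @ B @ X
    evals = np.linalg.eigvals(np.linalg.solve(Bp, Ap))
    return evals[np.argsort(np.abs(evals))], X

# Example with symmetric positive definite A and B (illustrative data).
rng = np.random.default_rng(2)
M = rng.standard_normal((40, 40)); A = M @ M.T + 40 * np.eye(40)
N = rng.standard_normal((40, 40)); B = N @ N.T + 40 * np.eye(40)
lam, X = inverse_subspace_iteration(A, B, p=3)
print(lam)   # approximations to the 3 smallest-magnitude eigenvalues
```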
Given the set of matrix pairs M ⊂ Mm,n(C) × Mn(C) keeping a subspace S ⊂ Cⁿ invariant, we obtain a miniversal deformation of a pair belonging to an open dense subset of M. It generalizes the known results when S is a supplementary subspace of the unobservable one. Keywords: Conditioned invariant subspaces, Miniversal deformation, Stratified manifold, Vertical pairs of matrices.
The classical Rayleigh quotient iteration (RQI) allows one to compute a one-dimensional invariant subspace of a symmetric matrix A. Here we propose a generalization of the RQI which computes a p-dimensional invariant subspace of A. Cubic convergence is preserved and the cost per iteration is low compared to other methods proposed in the literature.
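For reference, here is a sketch of the classical p = 1 RQI for a symmetric matrix; the block generalization described in the abstract would replace the vector x by an orthonormal basis of a p-dimensional subspace and the scalar shift by a projected matrix. Function name and test data are illustrative.

```python
import numpy as np

def rayleigh_quotient_iteration(A, x0, iters=10):
    """Classical (p = 1) RQI for a symmetric matrix A.

    Each step shifts by the Rayleigh quotient rho = x^T A x and solves
    (A - rho I) y = x, then normalizes. Convergence is locally cubic.
    """
    x = x0 / np.linalg.norm(x0)
    n = A.shape[0]
    for _ in range(iters):
        rho = x @ A @ x
        try:
            y = np.linalg.solve(A - rho * np.eye(n), x)
        except np.linalg.LinAlgError:
            break                      # shift hit an eigenvalue exactly
        x = y / np.linalg.norm(y)
    return x @ A @ x, x

# Example run on a random symmetric matrix (illustrative data).
rng = np.random.default_rng(3)
S = rng.standard_normal((30, 30)); A = (S + S.T) / 2
rho, x = rayleigh_quotient_iteration(A, rng.standard_normal(30))
print(np.linalg.norm(A @ x - rho * x))   # eigenpair residual, near zero
```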