Search results for: backward IJK version of Gaussian elimination

Number of results: 21179831

In this paper, we use a complete pivoting strategy to compute the IUL preconditioner obtained as the by-product of the Backward Factored APproximate INVerse process. This pivoting is based on the complete pivoting strategy of the Backward IJK version of the Gaussian Elimination process. A parameter $\alpha$ controls the complete pivoting process. We have studied the effect of dif...
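The snippet leaves the role of $\alpha$ truncated, but the mechanics are concrete enough to sketch. Below is a minimal, hypothetical Python illustration of a backward (bottom-right to top-left, UL-producing) Gaussian elimination in which a complete-pivoting swap fires only when the natural pivot is smaller than $\alpha$ times the largest entry of the active block. The function name backward_ge_ul and this particular reading of $\alpha$ are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def backward_ge_ul(A, alpha=0.5):
    """Backward Gaussian elimination: pivots are processed from the
    bottom-right corner up, giving P A Q = U L with U unit upper
    triangular and L lower triangular.

    alpha in [0, 1] is assumed to act as a threshold: a complete-pivoting
    swap happens only when the natural pivot is smaller than alpha times
    the largest entry of the active block (hypothetical reading)."""
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    U = np.eye(n)
    P, Q = np.eye(n), np.eye(n)
    for k in range(n - 1, 0, -1):
        blk = np.abs(A[: k + 1, : k + 1])          # active leading block
        i, j = np.unravel_index(np.argmax(blk), blk.shape)
        if abs(A[k, k]) < alpha * blk[i, j]:       # pivot too small: swap
            A[[k, i], :] = A[[i, k], :]
            P[[k, i], :] = P[[i, k], :]
            U[[k, i], k + 1:] = U[[i, k], k + 1:]  # keep stored multipliers aligned
            A[:, [k, j]] = A[:, [j, k]]
            Q[:, [k, j]] = Q[:, [j, k]]
        for r in range(k):                          # eliminate entries above the pivot
            m = A[r, k] / A[k, k]
            U[r, k] = m
            A[r, : k + 1] -= m * A[k, : k + 1]
    return P, Q, U, np.tril(A)

# Quick check: P @ A @ Q should equal U @ L.
A = np.array([[1.0, 2, 3], [4, 5, 6], [7, 8, 10]])
P, Q, U, L = backward_ge_ul(A, alpha=1.0)
print(np.allclose(P @ A @ Q, U @ L))   # True
```

Under this reading, $\alpha = 0$ disables pivoting entirely, while $\alpha = 1$ pivots whenever the diagonal entry is not already the largest in the active block.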

2015
A. Rafiei M. Bollhöfer

Consider the linear system of equations of the form Ax = b where the coefficient matrix A ∈ Rn×n is nonsingular, large, sparse and nonsymmetric, and x, b ∈ Rn. We refer to this system as the original system. An explicit preconditioner M for this system is an approximation of the matrix A−1. In [1], Lou presented the Backward Factored INVerse or BFINV algorithm which computes the inverse factoriz...
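Because an explicit preconditioner approximates A−1 itself, applying it inside a Krylov solver is just a matrix-vector product, with no triangular solves. A minimal SciPy sketch; the diagonal-inverse M below is only a stand-in for a genuine factored approximate inverse such as the one BFINV produces.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# A sparse, nonsymmetric test system Ax = b.
n = 200
A = sp.diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Crude explicit preconditioner M ~ A^{-1}: just the inverse of diag(A),
# standing in for a factored approximate inverse.
d = A.diagonal()
M = spla.LinearOperator((n, n), matvec=lambda r: r / d)

# Since M approximates A^{-1} directly, the solver applies it
# with a plain matvec.
x, info = spla.gmres(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))   # info == 0 on convergence
```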

Journal: Computers & Mathematics with Applications, 2014
Amin Rafiei Behnaz Tolue Matthias Bollhöfer

In this paper, we have used a complete pivoting strategy to compute the left-looking version of the RIF preconditioner. This pivoting is based on the complete pivoting strategy of the IJK version of the Gaussian Elimination process. A parameter α controls the pivoting process. To study the effect of α on the quality of the left-looking version of the RIF preconditioner with complete pivoting str...

1989
R. Bruce Mattingly Carl D. Meyer James M. Ortega

This paper concerns the implementation of the QR factorization by Givens and Householder transformations on vector computers. Following the analysis of Dongarra, et al. [1984] for Gaussian elimination, various ijk forms for both Givens and Householder transformations are investigated. Conclusions concerning which of these forms have desirable or undesirable properties for vector computers are ...
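As a point of reference for the loop orderings being compared, here is one concrete arrangement of Givens QR in Python: the outer loop runs over columns j, the inner loop over rows i bottom-up, and the remaining k loop (updating the trailing columns) is vectorized; permuting these loops yields the other ijk forms. This is a sketch, not the paper's vector-computer implementation.

```python
import numpy as np

def givens_qr(A):
    """QR by Givens rotations in one particular loop ordering:
    j over columns, i bottom-up within each column, and the k loop
    (over trailing columns) expressed as a vectorized product."""
    R = np.asarray(A, dtype=float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for j in range(n):
        for i in range(m - 1, j, -1):
            a, b = R[i - 1, j], R[i, j]
            r = np.hypot(a, b)
            if r == 0.0:
                continue                      # entry already zero
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])   # rotation zeroing R[i, j]
            R[i - 1:i + 1, j:] = G @ R[i - 1:i + 1, j:]
            Q[:, i - 1:i + 1] = Q[:, i - 1:i + 1] @ G.T
    return Q, R

A = np.random.default_rng(0).standard_normal((5, 3))
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(5)))  # True True
```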

2006
Liefeng Bo Ling Wang Licheng Jiao

Gaussian Processes (GPs) have state-of-the-art performance in regression. In GPs, all the basis functions are required for prediction; hence their test speed is slower than that of other learning algorithms such as support vector machines (SVMs), the relevance vector machine (RVM), adaptive sparseness (AS), etc. To overcome this limitation, we present a backward elimination algorithm, called GPs-BE, that recu...

1995
D. B. Heras M. J. Martin M. Amor F. Arguello F. F. Rivera O. Plata

In this work we present a study on the vectorization of code segments that are typical for solving linear equation systems. We have selected Gaussian Elimination as representative of this type of problem. The sequential algorithm that performs this computation has a main loop with three nesting levels (indices i, j and k) that can be arranged according to six different organizations (called ij...
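To make the six organizations concrete, here are two of them written with explicit scalar loops (and no pivoting) so the memory-access pattern is visible: the kij form sweeps rows in its inner updates, while the jki form is column-oriented. Both produce the same packed L\U factors.

```python
import numpy as np

def lu_kij(A):
    """kij form: at step k, update each remaining row i across columns j
    (row-oriented inner loop). Pivoting omitted for clarity."""
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            A[i, k] /= A[k, k]               # multiplier l_ik
            for j in range(k + 1, n):
                A[i, j] -= A[i, k] * A[k, j]
    return A                                  # packed L\U (unit diagonal of L implied)

def lu_jki(A):
    """jki form: build one column j at a time by applying all earlier
    columns k to it (column-oriented inner loop). No pivoting."""
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    for j in range(n):
        for k in range(j):
            for i in range(k + 1, n):
                A[i, j] -= A[i, k] * A[k, j]
        for i in range(j + 1, n):
            A[i, j] /= A[j, j]                # finish multipliers in column j
    return A

A = np.array([[4.0, 2, 1], [3, 1, 5], [2, 6, 3]])
print(np.allclose(lu_kij(A), lu_jki(A)))      # True: same factors, different loop order
```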

In this paper, we present a block version of incomplete LU preconditioner which is computed as the by-product of a block A-biconjugation process. The pivot entries of this block preconditioner are 1-by-1 or 2-by-2 blocks. The L and U factors of this block preconditioner are computed separately. The block pivot selection of this preconditioner is inherited from one of the block versions of...

2006
Nicholas J. Higham Desmond J. Higham

The growth factor plays an important role in the error analysis of Gaussian elimination. It is well known that when partial pivoting or complete pivoting is used the growth factor is usually small, but it can be large. The examples of large growth usually quoted involve contrived matrices that are unlikely to occur in practice. We present real and complex n × n matrices arising from practical app...
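The best-known contrived example is Wilkinson's matrix, for which partial pivoting yields growth exactly 2^(n-1). A short check; note that max|U|/max|A| is used as the growth measure here, which for this particular matrix agrees with the classical definition over all intermediate entries.

```python
import numpy as np
from scipy.linalg import lu

def growth_factor(A):
    """Growth of LU with partial pivoting, measured as max|U| / max|A|."""
    _, _, U = lu(A)
    return np.abs(U).max() / np.abs(A).max()

# Wilkinson's example: 1 on the diagonal and in the last column,
# -1 strictly below the diagonal. Each elimination step doubles the
# last column, so the growth is 2^(n-1) under partial pivoting.
n = 20
A = np.tril(-np.ones((n, n)), -1) + np.eye(n)
A[:, -1] = 1.0
print(growth_factor(A))   # 524288.0 == 2**19
```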

Journal: J. Economic Theory, 2002
Elena Katok Martin Sefton Abdullah Yavas

We report experimental results on the relative performance of simultaneous and sequential versions of the Abreu–Matsushima mechanism. Under the simultaneous version, subjects typically use undominated strategies, but apply only a limited number of iterations of dominance. Consequently, the unique strategy surviving iterative elimination of strictly dominated strategies is rarely observed. Under ...

Journal: Neurocomputing, 2000
Chun-Shin Lin Chien-Kuo Li

This paper presents a sum-of-product neural network (SOPNN) structure. The SOPNN can learn to implement static mappings that multilayer neural networks and radial basis function networks normally perform. The output of the neural network has the sum-of-product form $\sum_{i=1}^{N_p} \prod_{j=1}^{N_v} f_{ij}(x_j)$, where the $x_j$ are inputs, $N_v$ is the number of inputs, and $f_{ij}(\cdot)$ is a function generated through network ...
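Reconstructed, the output is the double sum-product shown above, which is a one-liner to evaluate. A toy sketch; the polynomial and trigonometric f_ij below are arbitrary placeholders for the network-generated functions.

```python
import numpy as np

def sopnn_output(f, x):
    """Evaluate sum_{i=1..N_p} prod_{j=1..N_v} f_ij(x_j):
    f[i][j] is the univariate function f_ij, x holds the N_v inputs."""
    return sum(np.prod([f_ij(x_j) for f_ij, x_j in zip(row, x)]) for row in f)

# Toy instance: N_p = 2 product terms over N_v = 2 inputs.
f = [[lambda t: t + 1.0, lambda t: 2.0 * t],
     [lambda t: t ** 2,  lambda t: np.sin(t)]]
x = [0.5, 1.0]
print(sopnn_output(f, x))   # (1.5 * 2.0) + (0.25 * sin(1.0)) ~ 3.21
```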
