Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms

Authors: Robert M. Gower, Peter Richtárik
Abstract

We develop and analyze a broad family of stochastic/randomized algorithms for inverting a matrix. We also develop specialized variants maintaining symmetry or positive definiteness of the iterates. All methods in the family converge globally and linearly (i.e., the error decays exponentially), with explicit rates. In special cases, we obtain stochastic block variants of several quasi-Newton updates ...
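
The abstract does not state the iteration itself; the following is a minimal sketch of the sketch-and-project idea behind such methods, applied to the inverse equation A X = I with Gaussian sketch matrices and an identity weight. The function name, sketch distribution, and parameter choices are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def randomized_inverse(A, sketch_size, iters, seed=0):
        """Illustrative sketch-and-project iteration for A X = I (identity weight).

        Each step draws a random sketch S and projects the iterate X, in the
        Frobenius norm, onto the matrices satisfying the sketched equation
        S^T A X = S^T; the projection is a rank-`sketch_size` correction.
        """
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        X = np.zeros((n, n))
        I = np.eye(n)
        for _ in range(iters):
            S = rng.standard_normal((n, sketch_size))
            Z = A.T @ S                                   # correction directions
            G = Z.T @ Z                                   # equals S^T A A^T S
            X = X + Z @ np.linalg.solve(G, S.T @ (I - A @ X))
        return X

    rng = np.random.default_rng(1)
    A = rng.standard_normal((30, 30)) + 30 * np.eye(30)   # well-conditioned test matrix
    X = randomized_inverse(A, sketch_size=5, iters=300)
    print(np.linalg.norm(A @ X - np.eye(30)))             # residual shrinks toward 0

Each step only enforces the sketched equation, yet the error contracts in expectation at every iteration, which is the source of the linear convergence the abstract refers to.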

Similar articles

Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.
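
As background for the secant conditions discussed here: with several step/gradient-difference pairs collected as columns of S and Y, a block secant condition asks the updated approximation to satisfy B_new S = Y. The snippet below shows only the simplest unweighted least-change correction with this property; it is not the weighted variational formula (or the resulting Sylvester equation) derived in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 8, 3                         # dimension and number of secant pairs

    B = np.eye(n)                       # current Hessian approximation
    S = rng.standard_normal((n, m))     # steps s_1, ..., s_m as columns
    Y = rng.standard_normal((n, m))     # gradient differences y_1, ..., y_m

    # Unweighted least-change correction: the smallest Frobenius-norm E with
    # (B + E) S = Y is E = (Y - B S) (S^T S)^{-1} S^T  (block Broyden-style).
    E = (Y - B @ S) @ np.linalg.solve(S.T @ S, S.T)
    B_new = B + E

    print(np.allclose(B_new @ S, Y))    # all m secant equations hold: True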

Incorporating Function Values into Quasi-Newton Updates

The traditional quasi-Newton method for updating the approximate Hessian is based on the change in the gradient of the objective function. This paper describes a new update method that also incorporates the change in the value of the function. The method effectively uses a cubic approximation of the objective function to better approximate its directional second derivative. The cubic approximation ...
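
To make the cubic idea concrete (a generic sketch of the technique, not necessarily the paper's exact formula): fitting a cubic to f along the step, using the two function values and the two directional derivatives, yields an estimate of the directional second derivative at the new point that the gradient-only secant estimate cannot match.

    # Along a step s from x0 to x1 = x0 + s, let phi(t) = f(x0 + t*s).
    # A cubic fit to phi uses four data: phi(0) = f0, phi(1) = f1,
    # phi'(0) = g0.s and phi'(1) = g1.s.  Its second derivative at t = 1
    # estimates the directional curvature at the new point; the classical
    # secant estimate (g1 - g0).s uses the gradients only.

    def curvature_estimates(f0, f1, g0s, g1s):
        secant = g1s - g0s                                # gradient-only estimate
        cubic = 6.0 * (f0 - f1) + 2.0 * g0s + 4.0 * g1s   # phi''(1) of the cubic fit
        return secant, cubic

    # 1-D example: f(x) = x**4, step from x0 = 1 to x1 = 2 (so s = 1).
    f = lambda x: x ** 4
    g = lambda x: 4 * x ** 3
    secant, cubic = curvature_estimates(f(1.0), f(2.0), g(1.0), g(2.0))
    print(secant, cubic)   # 28.0 vs 46.0; the true curvature f''(2) is 48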

Globally Convergent Newton Algorithms for Blind Decorrelation

This paper presents novel Newton algorithms for the blind adaptive decorrelation of real and complex processes. They are globally convergent and exhibit an interesting relationship with the natural gradient algorithm for blind decorrelation and the Goodall learning rule. Indeed, we show that these latter two algorithms can be obtained from their Newton decorrelation versions when an exact matrix ...
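
For reference, a standard form of the natural (relative) gradient decorrelation rule mentioned above is W <- W + mu * (I - E[y y^T]) W with y = W x; the sketch below runs this baseline on synthetic data. It is not the Newton algorithm proposed in the paper, and the step size and iteration count are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n, T = 4, 5000

    # Correlated observations x = M u, with u spatially white.
    M = rng.standard_normal((n, n))
    X = M @ rng.standard_normal((n, T))

    # Natural/relative-gradient decorrelation (baseline rule, not the paper's
    # Newton algorithm): W <- W + mu * (I - E[y y^T]) W, where y = W x.
    W = np.eye(n)
    mu = 0.02
    for _ in range(1000):
        Y = W @ X
        C = (Y @ Y.T) / T                  # sample correlation of the outputs
        W = W + mu * (np.eye(n) - C) @ W

    # The decorrelated outputs have a sample correlation close to the identity.
    Y = W @ X
    print(np.round(Y @ Y.T / T, 2))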

Least-change quasi-Newton updates for equality-constrained optimization

This paper investigates quasi-Newton updates for equality-constrained optimization. Using a least-change argument we derive a class of rank-3 updates to approximations of the one-sided projection of the Hessian of the Lagrangian which keeps the appropriate part symmetric (and possibly positive definite). By imposing the usual assumptions we are able to prove 1-step superlinear ...
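
The paper's rank-3 update for the projected Hessian of the Lagrangian is specific to the constrained setting; as background, the least-change principle in the unconstrained case gives the classical Powell-symmetric-Broyden (PSB) update, the Frobenius-norm least-change symmetric update subject to a single secant equation. The sketch below verifies that classical case only.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6

    # Symmetric current approximation B, step s and gradient difference y.
    A = rng.standard_normal((n, n))
    B = (A + A.T) / 2
    s = rng.standard_normal(n)
    y = rng.standard_normal(n)

    # Powell-symmetric-Broyden (PSB): the Frobenius-norm least-change symmetric
    # update satisfying the secant equation B_new @ s = y.
    r = y - B @ s
    ss = s @ s
    B_new = B + (np.outer(r, s) + np.outer(s, r)) / ss \
              - (r @ s) * np.outer(s, s) / ss ** 2

    print(np.allclose(B_new, B_new.T))   # symmetry preserved: True
    print(np.allclose(B_new @ s, y))     # secant equation holds: True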


Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2017

ISSN: 0895-4798, 1095-7162

DOI: 10.1137/16m1062053