Consensus acceleration in multiagent systems with the Chebyshev semi-iterative method

Authors

  • Renato L. G. Cavalcante
  • Alex Rogers
  • Nicholas R. Jennings
Abstract

We consider the fundamental problem of reaching consensus in multiagent systems. To date, the consensus problem has typically been solved with decentralized algorithms based on graph Laplacians. However, the convergence of these algorithms is often too slow for many important multiagent applications, and thus they are increasingly being combined with acceleration methods. Unfortunately, state-of-the-art acceleration techniques require parameters that can be optimally selected only if complete information about the network topology is available, which is rarely the case in practice. We address this limitation by deriving two novel acceleration methods that can deliver good performance even if little information about the network is available. The first is based on the Chebyshev semi-iterative method and maximizes the worst-case convergence speed given that only rough bounds on the extremal eigenvalues of the network matrix are available. It can be applied to systems where agents use unreliable communication links, and its computational complexity is similar to that of simple Laplacian-based methods. Because this algorithm requires synchronization among agents, we also propose an asynchronous version that approximates the output of the synchronous algorithm. Mathematical analysis and numerical simulations show that the convergence speed of the proposed acceleration methods decreases gracefully in scenarios where the sole use of Laplacian-based methods is known to be impractical.

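As a rough illustration of the idea, the sketch below applies the classical Chebyshev semi-iterative recurrence (in the Golub-Varga / Hageman-Young parameterization) to the basic consensus iteration x[k+1] = W x[k]. It is a minimal sketch under stated assumptions, not the authors' exact algorithm: W is assumed to be a symmetric, doubly stochastic weight matrix, lam_min and lam_max are the (possibly rough) bounds on the eigenvalues of W other than the consensus eigenvalue 1, and the function name and variables are illustrative.

import numpy as np

def chebyshev_consensus(W, x0, lam_min, lam_max, num_iters):
    # Accelerates x[k+1] = W x[k] by applying scaled Chebyshev polynomials p_k
    # of W to x0.  The scaling satisfies p_k(1) = 1, so the average of x0 is
    # preserved at every iteration.  Only the two scalars lam_min <= lam_max < 1
    # (bounds on the non-unit eigenvalues of W) are needed, not the topology.
    # Assumes num_iters >= 1.
    gamma = 2.0 / (2.0 - lam_max - lam_min)            # maps [lam_min, lam_max] onto [-sigma, sigma]
    sigma = (lam_max - lam_min) / (2.0 - lam_max - lam_min)

    x_prev = np.asarray(x0, dtype=float).copy()
    x = gamma * (W @ x_prev) + (1.0 - gamma) * x_prev  # first step, rho_1 = 1
    rho = 2.0 / (2.0 - sigma ** 2)                     # rho_2
    for _ in range(2, num_iters + 1):
        x_next = rho * (gamma * (W @ x) + (1.0 - gamma) * x) + (1.0 - rho) * x_prev
        x_prev, x = x, x_next
        rho = 1.0 / (1.0 - 0.25 * sigma ** 2 * rho)    # rho_{k+1}
    return x

Each accelerated step costs one local averaging operation (the product W x, computable with single-hop communication) plus a few scalar updates, which is consistent with the abstract's claim that the complexity is comparable to that of plain Laplacian-based iterations.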

Similar resources

CHEBYSHEV ACCELERATION TECHNIQUE FOR SOLVING FUZZY LINEAR SYSTEM

In this paper, the Chebyshev acceleration technique is used to solve the fuzzy linear system (FLS). The method is discussed in detail, followed by a summary of some other acceleration techniques. Moreover, we show that in some situations where methods such as Jacobi, Gauss-Seidel, SOR, and conjugate gradient are divergent, our proposed method is applicable and the acquired results are illustrate...


An Acceleration Method for Stationary Iterative Solution to Linear System of Equations

An acceleration scheme based on stationary iterative methods is presented for solving linear systems of equations. Unlike the Chebyshev semi-iterative method, which requires accurate estimation of the bounds on the eigenvalues of the iteration matrix, we use a wide range of Chebyshev-like polynomials for the accelerating process without estimating those bounds. A detailed error analysis is...


Parallel Quasi-chebyshev Acceleration to Nonoverlapping Multisplitting Iterative Methods Based on Optimization

In this paper, we present a parallel quasi-Chebyshev acceleration applied to the nonoverlapping multisplitting iterative method for linear systems whose coefficient matrix is either an H-matrix or a symmetric positive definite matrix. First, m parallel iterations are implemented on m different processors. Second, based on the l1-norm or l2-norm, the m optimization models are parallelly treat...


ITPACK 2C: A FORTRAN Package for Solving Large Sparse Linear Systems by Adaptive Accelerated Iterative Methods

ITPACK 2C is a collection of seven FORTRAN subroutines for solving large sparse linear systems by adaptive accelerated iterative algorithms. Basic iterative procedures, such as the Jacobi method, the Successive Overrelaxation method, the Symmetric Successive Overrelaxation method, and the RS method for the reduced system, are combined, where possible, with acceleration procedures such as Chebysh...


Chebyshev semi-iteration in Preconditioning

It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace; that is, it will implicitly compute the optimal polynomial. ...




Publication date: 2011