On the Convergence of Multi-Block Alternating Direction Method of Multipliers and Block Coordinate Descent Method

Authors

  • Caihua Chen
  • Min Li
  • Xin Liu
  • Yinyu Ye
Abstract

The paper answers several open questions about the alternating direction method of multipliers (ADMM) and the block coordinate descent (BCD) method, which are now widely used to solve large-scale convex optimization problems in many fields. For ADMM, there is still a lack of theoretical understanding of the algorithm when the objective function is not separable across the variables. In this paper, we analyze the convergence of the 2-block ADMM for solving linearly constrained convex optimization with a coupled quadratic objective, and show that the classical ADMM converges point-wise to a primal-dual solution pair of this problem. Moreover, we propose to use the randomly permuted ADMM (RPADMM) to solve nonseparable multi-block convex optimization, and prove its expected convergence when applied to a class of quadratic programming problems. When the linear constraint vanishes, the 2-block ADMM and the randomly permuted ADMM reduce to the 2-block cyclic BCD method (also known as the alternating minimization method) and to EPOCHS, respectively. Interestingly, our study provides the first iterate convergence result for the 2-block cyclic BCD method without assuming boundedness of the iterates. Under the same setting, a sublinear convergence rate of the function values can also be verified. Moreover, we theoretically establish the expected iterate convergence of multi-block EPOCHS for convex quadratic optimization, which can be regarded as one of the first iterate convergence analyses of EPOCHS. Last but not least, although random permutation indeed makes multi-block ADMM and BCD more robust, we theoretically demonstrate that EPOCHS has a worse convergence rate than the cyclic BCD method in solving 2-block convex quadratic minimization problems. Therefore, EPOCHS should be applied with caution when solving general optimization problems.
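To make the 2-block setting concrete, the following is a minimal numerical sketch of the classical 2-block ADMM applied to a linearly constrained quadratic program whose objective couples the two blocks: minimize 0.5*[x;y]'H[x;y] + c'[x;y] subject to Ax + By = b, with a nonzero cross block Hxy (the nonseparable case discussed in the abstract). The function name admm_coupled_qp, the penalty parameter rho, and the multiplier sign convention are illustrative assumptions and do not come from the paper; this is a sketch of the standard ADMM iteration (x-update, y-update, dual update) using NumPy, not the authors' implementation.

import numpy as np

def admm_coupled_qp(Hxx, Hxy, Hyy, cx, cy, A, B, b,
                    rho=1.0, iters=500, tol=1e-8):
    """Sketch of 2-block ADMM for
        min  0.5*[x;y]'H[x;y] + [cx;cy]'[x;y]
        s.t. A x + B y = b,
    where H = [[Hxx, Hxy], [Hxy', Hyy]] couples x and y.
    Assumes Hxx + rho*A'A and Hyy + rho*B'B are nonsingular."""
    n, m = Hxx.shape[0], Hyy.shape[0]
    x, y = np.zeros(n), np.zeros(m)
    lam = np.zeros(b.shape[0])          # multiplier for A x + B y = b

    # Coefficient matrices of the two subproblems are fixed across iterations.
    Mx = Hxx + rho * A.T @ A
    My = Hyy + rho * B.T @ B

    for _ in range(iters):
        # x-update: minimize the augmented Lagrangian over x (y, lam fixed)
        rx = -(Hxy @ y + cx) + A.T @ lam - rho * A.T @ (B @ y - b)
        x = np.linalg.solve(Mx, rx)

        # y-update: minimize over y using the new x (lam fixed)
        ry = -(Hxy.T @ x + cy) + B.T @ lam - rho * B.T @ (A @ x - b)
        y = np.linalg.solve(My, ry)

        # dual update on the residual of the linear constraint
        r = A @ x + B @ y - b
        lam = lam - rho * r

        if np.linalg.norm(r) < tol:
            break
    return x, y, lam

Setting Hxy = 0 recovers the separable two-block case covered by classical ADMM theory; it is the coupled case (Hxy nonzero) whose point-wise convergence the paper establishes. When A and B are absent (the constraint vanishes), the same two alternating minimizations reduce to the 2-block cyclic BCD / alternating minimization method mentioned above.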


Similar papers

Parallel Direction Method of Multipliers

We consider the problem of minimizing block-separable convex functions subject to linear constraints. While the Alternating Direction Method of Multipliers (ADMM) for two-block linear constraints has been intensively studied both theoretically and empirically, effective generalizations of ADMM to multiple blocks, despite some preliminary work, are still unclear. In this paper, we propose a pa...

Full text

Block-Simultaneous Direction Method of Multipliers: A proximal primal-dual splitting algorithm for nonconvex problems with multiple constraints

We introduce a generalization of the linearized Alternating Direction Method of Multipliers to optimize a real-valued function f of multiple arguments with potentially multiple constraints g◦ on each of them. The function f may be nonconvex as long as it is convex in every argument, while the constraints g◦ need to be convex but not smooth. If f is smooth, the proposed Block-Simultaneous Direct...

Full text

Stochastic Dual Coordinate Ascent with Alternating Direction Method of Multipliers

We propose a new stochastic dual coordinate ascent technique that can be applied to a wide range of regularized learning problems. Our method is based on the Alternating Direction Method of Multipliers (ADMM) to deal with complex regularization functions such as structured regularizations. Our method naturally affords mini-batch updates and speeds up convergence. We show that, under mi...

Full text

Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers

Abstract. In this paper, we consider solving multiple-block separable convex minimization problems using alternating direction method of multipliers (ADMM). Motivated by the fact that the existing convergence theory for ADMM is mostly limited to the two-block case, we analyze in this paper, both theoretically and numerically, a new strategy that first transforms a multiblock problem into an equ...

Full text

Continuous Relaxation of MAP Inference: A Nonconvex Perspective

In this paper, we study a nonconvex continuous relaxation of MAP inference in discrete Markov random fields (MRFs). We show that for arbitrary MRFs, this relaxation is tight, and a discrete stationary point of it can be easily reached by a simple block coordinate descent algorithm. In addition, we study the resolution of this relaxation using popular gradient methods, and further propose a more...

Full text



Publication date: 2015