A stochastic alternating direction method of multipliers for non-smooth and non-convex optimization

Authors

Abstract

The alternating direction method of multipliers (ADMM) is a popular first-order method owing to its simplicity and efficiency. However, similar to other proximal splitting methods, the performance of ADMM degrades significantly when the scale of the optimization problem to be solved becomes large. In this paper, we consider combining ADMM with a class of stochastic gradient variance-reduction methods for solving large-scale non-convex and non-smooth problems. Global convergence of the generated sequence is established under the additional assumption that the objective function satisfies the Kurdyka-Lojasiewicz (KL) property. Numerical experiments on graph-guided fused Lasso and computed tomography are presented to demonstrate the performance of the proposed methods.
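To make the combination concrete, the following is a minimal sketch of a variance-reduced (SVRG-style) stochastic ADMM iteration for an l1-regularized least-squares problem with the split z = Ax, of the kind used in graph-guided fused Lasso. The function names, step sizes, and the particular variance-reduction scheme are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def svrg_admm_lasso(D, b, A, lam=0.1, rho=1.0, eta=0.1, epochs=30, seed=0):
    """Sketch of an SVRG-style stochastic ADMM for
       min_x (1/2n)||Dx - b||^2 + lam * ||Ax||_1, with the split z = Ax."""
    rng = np.random.default_rng(seed)
    n, p = D.shape
    m = A.shape[0]
    x, z, y = np.zeros(p), np.zeros(m), np.zeros(m)
    # Precompute the (fixed) matrix of the linearized x-subproblem.
    M = np.eye(p) / eta + rho * A.T @ A
    for _ in range(epochs):
        x_tilde = x.copy()
        full_grad = D.T @ (D @ x_tilde - b) / n   # snapshot full gradient
        for _ in range(n):
            i = rng.integers(n)
            d = D[i]
            # SVRG variance-reduced estimate of the gradient of f at x.
            v = d * (d @ x - b[i]) - d * (d @ x_tilde - b[i]) + full_grad
            # Linearized x-update: a quadratic in x, solved exactly.
            rhs = x / eta - v - A.T @ y + rho * A.T @ z
            x = np.linalg.solve(M, rhs)
            # Exact z-update via soft-thresholding, then dual ascent.
            z = soft_threshold(A @ x + y / rho, lam / rho)
            y = y + rho * (A @ x - z)
    return x
```

Each inner step touches only one data row, so the per-iteration cost is independent of the sample size n; the snapshot full gradient keeps the stochastic estimate's variance from stalling convergence.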


Similar articles

Adaptive Stochastic Alternating Direction Method of Multipliers

The Alternating Direction Method of Multipliers (ADMM) has been studied for years. Traditional ADMM algorithms need to compute, at each iteration, an (empirical) expected loss function on all training examples, resulting in a computational complexity proportional to the number of training examples. To reduce the complexity, stochastic ADMM algorithms were proposed to replace the expected loss f...


Fast Stochastic Alternating Direction Method of Multipliers

In this paper, we propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a low per-iteration complexity as existing stochastic ADMM algorithms, the proposed algorithm improves the convergence rate on convex problems from O(1/√T) to O(1/T), where T is the ...
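The "incremental full-gradient approximation" described above resembles SAG-style averaging: one stored gradient per sample, with one entry refreshed per step. The sketch below shows only that running-average mechanism on a least-squares loss, without the surrounding ADMM machinery; the step size and function name are illustrative assumptions.

```python
import numpy as np

def sag_incremental_gradient(D, b, steps=200, eta=0.1, seed=0):
    """Sketch of SAG-style incremental full-gradient approximation:
       refresh one stored per-sample gradient each step and keep the
       average of all stored gradients as a cheap full-gradient proxy."""
    rng = np.random.default_rng(seed)
    n, p = D.shape
    grads = np.zeros((n, p))   # stored per-sample gradients
    avg = np.zeros(p)          # running average = approx. full gradient
    x = np.zeros(p)
    for _ in range(steps):
        i = rng.integers(n)
        g_new = D[i] * (D[i] @ x - b[i])   # fresh gradient of sample i
        avg += (g_new - grads[i]) / n      # O(p) update of the average
        grads[i] = g_new
        x -= eta * avg                     # plain gradient step for the demo
    return x, avg
```

The key point is that updating the average costs O(p) per step, while the average itself aggregates information from all n samples seen so far.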


Stochastic Alternating Direction Method of Multipliers

The Alternating Direction Method of Multipliers (ADMM) has received lots of attention recently due to the tremendous demand from large-scale and data-distributed machine learning applications. In this paper, we present a stochastic setting for optimization problems with non-smooth composite objective functions. To solve this problem, we propose a stochastic ADMM algorithm. Our algorithm applies...


Scalable Stochastic Alternating Direction Method of Multipliers

Alternating direction method of multipliers (ADMM) has been widely used in many applications due to its promising performance to solve complex regularization problems and large-scale distributed optimization problems. Stochastic ADMM, which visits only one sample or a mini-batch of samples each time, has recently been proved to achieve better performance than batch ADMM. However, most stochasti...


Inexact Alternating Direction Methods of Multipliers for Separable Convex Optimization

Abstract. Inexact alternating direction multiplier methods (ADMMs) are developed for solving general separable convex optimization problems with a linear constraint and with an objective that is the sum of smooth and nonsmooth terms. The approach involves linearized subproblems, a back substitution step, and either gradient or accelerated gradient techniques. Global convergence is established. ...



Journal

Journal title: Inverse Problems

Year: 2021

ISSN: 0266-5611, 1361-6420

DOI: https://doi.org/10.1088/1361-6420/ac0966