Stochastic Variance-Reduced ADMM

Authors

  • Shuai Zheng
  • James T. Kwok
Abstract

The alternating direction method of multipliers (ADMM) is a powerful optimization solver in machine learning. Recently, stochastic ADMM has been integrated with variance reduction methods for stochastic gradient, leading to SAG-ADMM and SDCA-ADMM, which have fast convergence rates and low iteration complexities. However, their space requirements can still be high. In this paper, we propose an integration of ADMM with the method of stochastic variance reduced gradient (SVRG). Unlike another recent integration attempt called SCAS-ADMM, the proposed algorithm retains the fast convergence benefits of SAG-ADMM and SDCA-ADMM, but is more advantageous in that its storage requirement is very low, even independent of the sample size n. We also extend the proposed method to nonconvex problems and obtain a convergence rate of O(1/T). Experimental results demonstrate that it is as fast as SAG-ADMM and SDCA-ADMM, much faster than SCAS-ADMM, and can be used on much bigger data sets.
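To make the idea concrete, the sketch below shows how an SVRG-style variance-reduced gradient estimate can be combined with a linearized ADMM update on a small generalized-lasso instance. It is a minimal sketch only: the problem instance, the step sizes, and the gradient-step x-update are our own illustrative choices, not the exact algorithm of the paper.

# A minimal sketch of the SVRG-ADMM idea on a generalized-lasso problem
#   min_x (1/2n) * sum_i (a_i^T x - b_i)^2 + lam * ||y||_1   s.t.  A x = y.
# The instance, step sizes, and linearized x-update are illustrative
# assumptions, not the authors' exact algorithm.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def svrg_admm(Adata, b, A, lam=0.1, rho=1.0, eta=0.01,
              n_epochs=20, inner_iters=None, seed=0):
    rng = np.random.default_rng(seed)
    n, d = Adata.shape
    m = A.shape[0]
    inner_iters = inner_iters or n
    x = np.zeros(d)                 # primal variable
    y = np.zeros(m)                 # splitting variable (y = A x)
    u = np.zeros(m)                 # scaled dual variable
    for _ in range(n_epochs):
        x_snap = x.copy()
        # Full gradient at the snapshot -- the only O(n) pass per epoch;
        # no per-sample gradient table is stored (unlike SAG-ADMM).
        mu = Adata.T @ (Adata @ x_snap - b) / n
        for _ in range(inner_iters):
            i = rng.integers(n)
            a_i = Adata[i]
            # Variance-reduced estimate:
            #   grad f_i(x) - grad f_i(x_snap) + full gradient at snapshot.
            v = a_i * (a_i @ x - b[i]) - a_i * (a_i @ x_snap - b[i]) + mu
            # Linearized (gradient-step) x-update of the augmented Lagrangian.
            x = x - eta * (v + rho * A.T @ (A @ x - y + u))
            # Exact y-update: prox of the l1 term.
            y = soft_threshold(A @ x + u, lam / rho)
            # Dual ascent on the constraint A x = y.
            u = u + A @ x - y
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d = 200, 20
    Adata = rng.standard_normal((n, d))
    x_true = np.zeros(d); x_true[:3] = [2.0, -1.0, 0.5]
    b = Adata @ x_true + 0.01 * rng.standard_normal(n)
    A = np.eye(d)                   # plain lasso: y = x
    x_hat = svrg_admm(Adata, b, A)
    print("recovered leading coefficients:", np.round(x_hat[:5], 2))

The point to notice is that the only O(n) work is the per-epoch full-gradient pass at the snapshot; no table of per-sample gradients is kept, which is the storage advantage over SAG-ADMM and SDCA-ADMM that the abstract describes.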


Similar Resources

Accelerated Variance Reduced Stochastic ADMM

Recently, many variance reduced stochastic alternating direction method of multipliers (ADMM) methods (e.g. SAG-ADMM, SDCA-ADMM and SVRG-ADMM) have made exciting progress such as linear convergence rates for strongly convex problems. However, the best known convergence rate for general convex problems is O(1/T), as opposed to the O(1/T²) of accelerated batch algorithms, where T is the number of iter...


Fast-and-Light Stochastic ADMM

The alternating direction method of multipliers (ADMM) is a powerful optimization solver in machine learning. Recently, stochastic ADMM has been integrated with variance reduction methods for stochastic gradient, leading to SAG-ADMM and SDCA-ADMM that have fast convergence rates and low iteration complexities. However, their space requirements can still be high. In this paper, we propose an int...


Accelerated Stochastic ADMM with Variance Reduction

The Alternating Direction Method of Multipliers (ADMM) is a popular method for solving machine learning problems. Stochastic ADMM was first proposed to reduce the per-iteration computational complexity, making it more suitable for big-data problems. Recently, variance reduction techniques have been integrated with stochastic ADMM in order to obtain a fast convergence rate, such as SAG-ADMM an...


Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

We consider the stochastic composition optimization problem proposed in [17], which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives, and has a convergence rate of O(log S / S), which improves upon the O(S^{-4/9}) r...
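For orientation, the stochastic composition problem referred to here is a nested expectation; a schematic formulation in our own notation (an assumption, since the entry above is truncated):

% Schematic only; the notation is ours, not necessarily that of [17].
\min_{x \in \mathbb{R}^d} \; F(x) = f\big(g(x)\big) = \mathbb{E}_i\, f_i\!\big(\mathbb{E}_j\, g_j(x)\big)

Because of the nesting, a single sample (i, j) does not yield an unbiased estimate of \nabla F(x), which is what makes variance reduction nontrivial in this setting.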


Stochastic Strictly Contractive Peaceman-Rachford Splitting Method

In this paper, we propose two new stochastic strictly contractive Peaceman-Rachford splitting methods (SCPRSM), called Stochastic SCPRSM (SS-PRSM) and Stochastic Conjugate Gradient SCPRSM (SCG-PRSM), for large-scale optimization problems. The two stochastic PRSM algorithms incorporate the stochastic variance reduced gradient (SVRG) and the conjugate gradient method, respectively. Stochasti...
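As background, the deterministic strictly contractive PRSM template that these stochastic variants build on performs two dual updates per round, damped by a contraction factor \alpha \in (0, 1). A sketch in our notation, for \min_{x,y} f(x) + g(y) subject to Ax + By = c with augmented Lagrangian \mathcal{L}_\beta:

% Deterministic SC-PRSM template (our notation); per the abstract, the
% stochastic variants replace the x-subproblem's gradient with an
% SVRG-style variance-reduced estimate.
\begin{aligned}
x^{k+1} &= \arg\min_x \; \mathcal{L}_\beta(x, y^k, \lambda^k) \\
\lambda^{k+1/2} &= \lambda^k - \alpha\beta\,(A x^{k+1} + B y^k - c) \\
y^{k+1} &= \arg\min_y \; \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+1/2}) \\
\lambda^{k+1} &= \lambda^{k+1/2} - \alpha\beta\,(A x^{k+1} + B y^{k+1} - c)
\end{aligned}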




Publication date: 2016