Search results for: dichotomous coordinate descent dcd

Number of results: 77,679

Journal: SIAM Journal on Optimization, 2013
Ankan Saha, Ambuj Tewari

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in Signal Processing, Statistics and Machine Learning. Reasons for this renewed interest include the simplicity, speed, and stability of the method as well as its competitive performance on ℓ1-regularized smooth optimization problems. Surprisingly, very little is known about its non-asymptotic...
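
To make the technique concrete, here is a minimal Python sketch of cyclic coordinate descent for the ℓ1-regularized least-squares (lasso) problem. It is an illustration under assumed inputs (a design matrix A, a target b, a penalty lam, a sweep count), not code from the cited paper.

import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cyclic_cd_lasso(A, b, lam, n_sweeps=100):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by exact per-coordinate
    # updates visited in a fixed cyclic order.
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                     # residual, maintained incrementally
    col_sq = (A ** 2).sum(axis=0)     # per-coordinate curvatures ||A_j||^2
    for _ in range(n_sweeps):
        for j in range(n):            # the fixed cyclic order
            if col_sq[j] == 0.0:
                continue
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)   # keep r = b - Ax exact
            x[j] = x_new
    return x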

2017
Ahmet Alacaoglu, Quoc Tran-Dinh, Olivier Fercoq, Volkan Cevher

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method features the first convergence rate guarantees among the coordinate descent ...
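
Of the four ingredients named above, the easiest to show in isolation is coordinate descent with non-uniform sampling. The toy Python sketch below picks coordinate j with probability proportional to its coordinate-wise Lipschitz constant, on a plain smooth least-squares objective; the smoothing, acceleration, and homotopy components of the paper are omitted, and all names and parameters are illustrative.

import numpy as np

def rcd_nonuniform(A, b, n_iters=5000, seed=0):
    # Randomized coordinate descent on 0.5*||Ax - b||^2 with coordinate j
    # sampled with probability proportional to L_j = ||A_j||^2.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)           # coordinate-wise Lipschitz constants
    p = L / L.sum()                    # non-uniform sampling distribution
    x = np.zeros(n)
    r = A @ x - b
    for _ in range(n_iters):
        j = rng.choice(n, p=p)
        step = (A[:, j] @ r) / L[j]    # exact minimization along coordinate j
        x[j] -= step
        r -= A[:, j] * step            # keep residual r = Ax - b current
    return x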

Journal: Image Vision Comput., 2003
Stephen M. Pizer, P. Thomas Fletcher, Andrew Thall, Martin Styner, Guido Gerig, Sarang C. Joshi

Object descriptions used for 3D segmentation by deformable models and for statistical characterization of 3D object classes benefit from having intrinsic correspondences over deformation of the objects or multiple instances in the same object class. These correspondences apply over a variety of spatial scale levels and consequently lead to efficient segmentation and probability distributions of...

Journal: J. Optimization Theory and Applications, 2016
Rachael Tappenden, Peter Richtárik, Jacek Gondzio

In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method. One of the key steps at each iteration of the algorithm is determining the update to a block of variables. Existing algorithms assume that in order to compute the update, a particular subproblem is solved exactly. In this work we relax this requirement, and allow for the subp...
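
The relaxation the abstract describes can be illustrated as follows: instead of solving the block subproblem exactly, run only a few inner gradient steps on it. A toy Python sketch on least squares, with the block size, step rule, and iteration counts all assumed:

import numpy as np

def inexact_block_cd(A, b, block_size=5, n_iters=500, inner_steps=3, seed=0):
    # Randomized block coordinate descent where the block subproblem is
    # solved only approximately, by a few gradient steps, not exactly.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        S = rng.choice(n, size=min(block_size, n), replace=False)
        A_S = A[:, S]
        L_S = np.linalg.norm(A_S, 2) ** 2    # Lipschitz constant of the block
        for _ in range(inner_steps):         # inexact subproblem solve
            g_S = A_S.T @ (A @ x - b)        # block partial gradient
            x[S] -= g_S / L_S
    return x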

Journal: Optimization Letters, 2016
Peter Richtárik, Martin Takáč

We propose and analyze a new parallel coordinate descent method, 'NSync, in which at each iteration a random subset of coordinates is updated, in parallel, allowing for the subsets to be chosen non-uniformly. We derive convergence rates under a strong convexity assumption, and comment on how to assign probabilities to the sets to optimize the bound. The complexity and practical performance of th...
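
A rough Python sketch of that iteration pattern: a random subset of coordinates is drawn with non-uniform, independent inclusion probabilities p (an assumed input vector), and all selected coordinates are updated simultaneously. The step constants v below are a crude, conservative stand-in for the carefully chosen ones the abstract refers to.

import numpy as np

def nsync_sketch(A, b, p, n_iters=2000, seed=0):
    # p: array of per-coordinate inclusion probabilities in (0, 1]
    # (assumed input; the paper optimizes these against the bound).
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)            # coordinate Lipschitz constants
    v = L * n                           # crude, conservative step constants
    x = np.zeros(n)
    for _ in range(n_iters):
        S = rng.random(n) < p           # random subset, non-uniform inclusion
        if not S.any():
            continue
        g = A[:, S].T @ (A @ x - b)     # partial gradients on the subset
        x[S] -= g / v[S]                # all selected coordinates in parallel
    return x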

2014
Bogdan Dumitrescu

In this note we compare the randomized extended Kaczmarz (EK) algorithm and randomized coordinate descent (CD) for solving the full-rank overdetermined linear least-squares problem, and prove that CD needs fewer operations to satisfy the same residual-related termination criteria. For general least-squares problems, we show that first running CD to compute the residual and then standard K...
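
For reference, toy Python versions of the two methods being compared, on an overdetermined least-squares problem min ||Ax - b||^2: randomized Kaczmarz projects onto a random row's hyperplane, while randomized CD minimizes exactly along a random column. Uniform sampling is used here for simplicity; the paper's norm-based sampling and operation counts are not reproduced.

import numpy as np

def randomized_kaczmarz(A, b, n_iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.integers(m)
        a_i = A[i]
        x += (b[i] - a_i @ x) / (a_i @ a_i) * a_i   # project onto row i
    return x

def randomized_cd_ls(A, b, n_iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                        # residual, kept incrementally
    for _ in range(n_iters):
        j = rng.integers(n)
        a_j = A[:, j]
        step = (a_j @ r) / (a_j @ a_j)   # exact minimization along column j
        x[j] += step
        r -= a_j * step
    return x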

Journal: CoRR, 2014
Huahua Wang, Arindam Banerjee

Two types of low cost-per-iteration gradient descent methods have been extensively studied in parallel. One is online or stochastic gradient descent (OGD/SGD), and the other is randomized block coordinate descent (RBCD). In this paper, we combine the two types of methods and propose online randomized block coordinate descent (ORBCD). At each iteration, ORBCD only computes the partial gradie...
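
The combination can be sketched as follows: each iteration draws one data sample and one coordinate block, and updates only that block using the partial stochastic gradient. A toy Python version for squared loss with a fixed step size, all parameters assumed:

import numpy as np

def orbcd_sketch(A, b, block_size=5, n_iters=20000, step=0.01, seed=0):
    # Each step touches one random sample i and one random block S, so the
    # per-iteration cost is low in both the sample and coordinate dimensions.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        i = rng.integers(m)                                        # one sample
        S = rng.choice(n, size=min(block_size, n), replace=False)  # one block
        g_S = (A[i] @ x - b[i]) * A[i, S]   # partial stochastic gradient
        x[S] -= step * g_S
    return x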

2013
Shai Shalev-Shwartz, Tong Zhang

Stochastic dual coordinate ascent (SDCA) is an effective technique for solving regularized loss minimization problems in machine learning. This paper considers an extension of SDCA under the minibatch setting that is often used in practice. Our main contribution is to introduce an accelerated minibatch version of SDCA and prove a fast convergence rate for this method. We discuss an implementati...
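
A hedged sketch of plain (non-accelerated) minibatch SDCA for ridge regression, to show the shape of the method: a batch of dual coordinates is updated in closed form from a shared primal iterate, with a batch-size-scaled denominator as a conservative safeguard for the joint step. The accelerated variant that is the paper's contribution adds momentum on top and is not reproduced here; all names and parameters are assumptions.

import numpy as np

def minibatch_sdca_ridge(X, y, lam, batch_size=8, n_epochs=50, seed=0):
    # Solve min_w (1/n)*sum_i 0.5*(w @ x_i - y_i)^2 + (lam/2)*||w||^2
    # via its dual, keeping w = X.T @ alpha / (lam*n) in sync throughout.
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)                       # dual variables
    w = X.T @ alpha / (lam * n)               # primal iterate
    sq = (X ** 2).sum(axis=1)                 # ||x_i||^2
    for _ in range(n_epochs * max(1, n // batch_size)):
        B = rng.choice(n, size=min(batch_size, n), replace=False)
        # Closed-form dual update for squared loss, computed at the shared w;
        # batch_size in the denominator is a conservative safeguard.
        delta = (y[B] - X[B] @ w - alpha[B]) / (1.0 + batch_size * sq[B] / (lam * n))
        alpha[B] += delta
        w += X[B].T @ delta / (lam * n)       # keep the primal-dual link exact
    return w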
