Search results for: dichotomous coordinate descent (DCD)

Number of results: 77,679

Journal: Journal of Artificial Intelligence Research 2014

Journal: Lecture Notes in Computer Science 2021

Novel coordinate descent (CD) methods are proposed for minimizing nonconvex functions consisting of three terms: (i) a continuously differentiable term, (ii) a simple convex term, and (iii) a continuous concave term. First, by extending randomized CD to nonsmooth settings, we develop a subgradient method that randomly updates block-coordinate variables using a block composite mapping. This converges asymptot...
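To make the basic idea in this snippet concrete, here is a minimal sketch of randomized coordinate descent on a smooth function. This is a generic illustration, not the paper's nonsmooth block-composite method: the function `randomized_cd`, the learning rate, and the quadratic test objective are all assumptions for the example.

```python
import numpy as np

def randomized_cd(grad, x0, lr=0.1, iters=1000, seed=0):
    """Illustrative randomized coordinate descent: at each step,
    update one randomly chosen coordinate along its partial
    derivative. The paper's setting would replace this with a
    block subgradient/proximal step."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        i = rng.integers(x.size)      # pick a random coordinate
        x[i] -= lr * grad(x)[i]       # descend along that coordinate only
    return x

# Usage: minimize f(x) = ||x - c||^2, whose gradient is 2(x - c)
c = np.array([1.0, -2.0, 3.0])
x_star = randomized_cd(lambda x: 2 * (x - c), np.zeros(3))
```

Because only one coordinate changes per iteration, each step costs O(1) gradient coordinates in favorable problem structures, which is the appeal of CD methods on high-dimensional problems.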

2016
Daniel Dadush, László A. Végh, Giacomo Zambelli

We propose two simple polynomial-time algorithms to find a positive solution to Ax = 0. Both algorithms iterate between coordinate descent steps similar to von Neumann’s algorithm, and rescaling steps. In both cases, either the updating step leads to a substantial decrease in the norm, or we can infer that the condition measure is small and rescale in order to improve the geometry. We also show...

2014
Ji Liu, Stephen J. Wright, Christopher Ré, Victor Bittorf, Srikrishna Sridhar

We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate (1/K) on general convex functions. Near-linear speedup on a multicore system can be expected if the number of proces...
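The access pattern described in this snippet can be sketched in a Hogwild!-style toy: several threads update random coordinates of a shared vector without locks. This is an assumption-laden illustration, not the paper's algorithm; in CPython the GIL serializes the updates, so the sketch models only the lock-free access pattern, not actual multicore speedup.

```python
import threading
import numpy as np

def async_cd(grad, x, lr=0.05, iters_per_worker=2000, workers=4):
    """Illustrative lock-free parallel coordinate descent sketch:
    each thread repeatedly picks a random coordinate of the shared
    vector x and applies a gradient step to it, with no locking."""
    def worker(seed):
        rng = np.random.default_rng(seed)
        for _ in range(iters_per_worker):
            i = rng.integers(x.size)   # random coordinate
            x[i] -= lr * grad(x)[i]    # lock-free single-coordinate step

    threads = [threading.Thread(target=worker, args=(s,))
               for s in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

# Usage: minimize ||x - c||^2 with four workers sharing one vector
c = np.array([0.5, -1.0, 2.0, 4.0])
sol = async_cd(lambda v: 2 * (v - c), np.zeros(4))
```

The key design point the abstract alludes to is that stale reads caused by concurrent updates do not destroy convergence when coordinate couplings are weak enough, which is what bounds the number of processors for near-linear speedup.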

Journal: :CoRR 2016
Zebang Shen, Hui Qian, Chao Zhang, Tengfei Zhou

Algorithms with fast convergence, a small number of data accesses, and low per-iteration complexity are particularly favorable in the big data era, due to the demand for obtaining highly accurate solutions to problems with a large number of samples in ultra-high-dimensional space. Existing algorithms lack at least one of these qualities, and are thus inefficient at handling such big-data challenges. ...

2010
André F. T. Martins, Kevin Gimpel, Noah A. Smith, Eric P. Xing, Pedro M. Q. Aguiar, Mário A. T. Figueiredo

We present a unified framework for online learning of structured classifiers. This framework handles a wide family of convex loss functions that includes as particular cases CRFs, structured SVMs, and the structured perceptron. We introduce a new aggressive online algorithm that optimizes any loss in this family; for the structured hinge loss, this algorithm reduces to 1-best MIRA; in general, ...
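As a concrete reference point for the family of online structured learners this snippet mentions, here is a single structured-perceptron update in its multiclass special case. The function name and the tiny two-class setup are hypothetical; the paper's aggressive (MIRA-style) variant would instead scale the update by a loss-dependent step size.

```python
import numpy as np

def perceptron_step(W, x, y_true):
    """One multiclass perceptron update (a special case of the
    structured perceptron): predict the highest-scoring label,
    then move weights toward the gold label's features and away
    from the predicted label's."""
    y_hat = int(np.argmax(W @ x))   # argmax inference over labels
    if y_hat != y_true:
        W[y_true] += x              # promote the gold label
        W[y_hat] -= x               # demote the mistaken prediction
    return W

# Usage: a tiny 2-class, 3-feature example starting from zero weights
W = np.zeros((2, 3))
x = np.array([1.0, 0.0, 2.0])
W = perceptron_step(W, x, y_true=1)
```

In the structured case, the label set is exponentially large and the `argmax` becomes a decoding problem (e.g. Viterbi), but the update rule has the same promote/demote shape.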
