Search results for: lagrangian method
Number of results: 1,646,303. Filter results by year:
We present three new approximate versions of alternating direction method of multipliers (ADMM), all of which require only knowledge of subgradients of the subproblem objectives, rather than bounds on the distance to the exact subproblem solution. One version, which applies only to certain common special cases, is based on combining the operator-splitting analysis of the ADMM with a relative-er...
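The approximate variants described above build on the standard ADMM iteration (x-update, z-update, dual update). A minimal scalar sketch of that baseline, on an assumed toy problem minimizing 0.5·(x − 3)² + |z| subject to x = z:

```python
# Standard ADMM on a scalar toy problem (the baseline that the
# approximate, subgradient-based variants modify).  The objective
# 0.5*(x - 3)**2 + |z| and penalty rho = 1 are illustrative assumptions.

def soft_threshold(v, t):
    """Proximal operator of t*|.|: shrinks v toward zero by t."""
    return max(v - t, 0.0) - max(-v - t, 0.0)

def admm_scalar(iters=100, rho=1.0):
    x = z = u = 0.0          # u is the scaled dual variable
    for _ in range(iters):
        # x-update: argmin 0.5*(x - 3)^2 + (rho/2)*(x - z + u)^2
        x = (3.0 + rho * (z - u)) / (1.0 + rho)
        # z-update: proximal step on |z| evaluated at x + u
        z = soft_threshold(x + u, 1.0 / rho)
        u += x - z           # dual ascent on the consensus residual
    return x

print(admm_scalar())  # ≈ 2.0, the minimizer of 0.5*(x - 3)^2 + |x|
```

The approximate versions replace the exact x- and z-minimizations with inexact steps certified only through subgradient information, which is what removes the need for bounds on the distance to the exact subproblem solution.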
We propose an augmented Lagrangian algorithm for solving large-scale constrained optimization problems. The novel feature of the algorithm is an adaptive update for the penalty parameter motivated by recently proposed techniques for exact penalty methods. This adaptive updating scheme greatly improves the overall performance of the algorithm without sacrificing the strengths of the core augment...
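The basic structure can be sketched on a toy equality-constrained problem; the penalty-doubling rule below is a crude stand-in for the adaptive update described in the abstract, and all constants are assumptions:

```python
import numpy as np

# Augmented Lagrangian sketch on: minimize ||x||^2  s.t.  x1 + x2 = 1.
# The mu-doubling rule is a crude stand-in for the adaptive penalty
# updates discussed above; step sizes and tolerances are assumptions.

def aug_lagrangian(outer=30, mu=1.0, lam=0.0):
    x = np.zeros(2)
    c = lambda x: x[0] + x[1] - 1.0          # equality constraint
    for _ in range(outer):
        # Inner loop: gradient descent on
        # L(x) = ||x||^2 + lam*c(x) + (mu/2)*c(x)^2
        step = 0.5 / (1.0 + mu)              # stable for this quadratic
        for _ in range(500):
            g = 2.0 * x + (lam + mu * c(x)) * np.ones(2)
            x -= step * g
        lam += mu * c(x)                     # first-order multiplier update
        if abs(c(x)) > 1e-4:                 # grow penalty while infeasible
            mu = min(2.0 * mu, 100.0)
    return x, lam
```

On this problem the iterates approach the KKT point x = (0.5, 0.5) with multiplier lam = -1; an adaptive rule like the one proposed would modulate mu less bluntly than simple doubling.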
Regularized logistic regression is a very successful classification method, but for large-scale data, its distributed training has not been investigated much. In this work, we propose a distributed Newton method for training logistic regression. Many interesting techniques are discussed for reducing the communication cost. Experiments show that the proposed method is faster than state of the ar...
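The core pattern can be sketched with simulated workers: each shard computes only its local gradient and Hessian (the quantities that would be communicated), and the driver aggregates them for a Newton step. Data, shard count, and regularization strength are illustrative assumptions:

```python
import numpy as np

# Distributed Newton sketch for L2-regularized logistic regression:
# each simulated "worker" holds one data shard and contributes only
# its local gradient and Hessian; the driver sums these (the "reduce"
# that dominates communication cost) and takes a Newton step.
# The synthetic data and lam = 1.0 are illustrative assumptions.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
shards = np.array_split(np.arange(200), 4)      # 4 simulated workers
lam = 1.0                                       # L2 regularization

def local_stats(w, idx):
    Xi, yi = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xi @ w))           # local predictions
    g = Xi.T @ (p - yi)                         # local gradient
    H = Xi.T @ ((p * (1.0 - p))[:, None] * Xi)  # local Hessian
    return g, H

w = np.zeros(3)
for _ in range(10):                             # Newton iterations
    stats = [local_stats(w, idx) for idx in shards]
    g = sum(s[0] for s in stats) + lam * w      # aggregate gradients
    H = sum(s[1] for s in stats) + lam * np.eye(3)
    w -= np.linalg.solve(H, g)
```

Communication-reduction techniques of the kind the abstract alludes to typically avoid shipping the full d×d Hessian, e.g. by using Hessian-vector products inside an inner conjugate-gradient solve instead of forming H explicitly.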
Sometimes, the feasible set of an optimization problem that one aims to solve using a Nonlinear Programming algorithm is empty. In this case, two characteristics of the algorithm are desirable. On the one hand, the algorithm should converge to a minimizer of some infeasibility measure. On the other hand, one may wish to find a point with minimal infeasibility for which some optimality condition...
We provide a simplified form of the Primal Augmented Lagrange Multiplier algorithm. We fill gaps in the steps of the algorithm's mathematical derivation so as to give insight into how it works. The experiments focus on demonstrating signal reconstruction with this algorithm. Keywords: compressive sensing; l1-minimization; sparsity; coherence. I. INTRODUCTION — Compressive...
Studies have shown that the surrogate subgradient method, to optimize non-smooth dual functions within the Lagrangian relaxation framework, can lead to significant computational improvements as compared to the subgradient method. The key idea is to obtain surrogate subgradient directions that form acute angles toward the optimal multipliers without fully minimizing the relaxed problem. The majo...
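The surrogate idea can be sketched on an assumed toy inequality-constrained problem: instead of fully minimizing the relaxed problem at each multiplier value, take a single gradient step on it, and update the multiplier along the (surrogate) subgradient evaluated at that approximate minimizer. All step sizes here are illustrative assumptions:

```python
import numpy as np

# Surrogate-subgradient sketch on: minimize ||x||^2  s.t.  x1 + x2 >= 4,
# relaxed with multiplier lam >= 0 into
#   L(x, lam) = ||x||^2 + lam * (4 - x1 - x2).
# Each iteration takes ONE gradient step on L (not a full minimization);
# the constraint residual at that approximate minimizer serves as the
# surrogate subgradient for the multiplier update.

x, lam = np.zeros(2), 0.0
for _ in range(2000):
    grad_L = 2.0 * x - lam * np.ones(2)     # d/dx of the relaxed problem
    x -= 0.2 * grad_L                       # single step: surrogate minimizer
    g = 4.0 - x.sum()                       # surrogate subgradient at x
    lam = max(0.0, lam + 0.05 * g)          # projected multiplier ascent
```

Here the iterates approach the optimum x = (2, 2) with multiplier lam = 4; the surrogate direction forms an acute angle with the direction toward the optimal multiplier, which is what makes convergence possible without solving the relaxed problem exactly.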
[Chart: number of search results per publication year]