Search results for: augmented lagrangian method

Number of results: 1,688,574

Journal: Comp. Opt. and Appl., 2002
Nadir Arada, Jean-Pierre Raymond, Fredi Tröltzsch

2013
Leon Wenliang Zhong, James T. Kwok

In classification problems, isotonic regression has been commonly used to map prediction scores to posterior class probabilities. However, isotonic regression may suffer from overfitting, and the learned mapping is often discontinuous. Moreover, current efforts mainly focus on the calibration of a single classifier. As different classifiers have different strengths, a combination of them can...
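
A minimal calibration sketch with scikit-learn's IsotonicRegression, for orientation only: the multi-classifier combination this abstract alludes to is not reproduced, and the synthetic data, model, and split below are illustrative assumptions, not the paper's setup.

```python
# Illustrative only: calibrating one classifier's scores with isotonic regression.
from sklearn.datasets import make_classification
from sklearn.isotonic import IsotonicRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, random_state=0)
X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

clf = LinearSVC(dual=False).fit(X_fit, y_fit)         # uncalibrated margin scores
iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
iso.fit(clf.decision_function(X_cal), y_cal)          # monotone map: score -> probability

probs = iso.predict(clf.decision_function(X_cal))     # calibrated posterior estimates
print(probs[:5])
```

The piecewise-constant map isotonic regression produces here is exactly the kind of discontinuous, overfitting-prone calibration the abstract criticizes.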

2015
Zhou Zhao, Ruihua Song, Xing Xie, Xiaofei He, Yueting Zhuang

With the prevalence of mobile search nowadays, the benefits of mobile query recommendation are well recognized: it provides well-formulated queries that match users’ search intent. In this paper, we introduce the problem of query recommendation on mobile devices and model the user-location-query relations with a tensor representation. Unlike previous studies based on tensor decomposition, we stud...

Journal: SIAM Journal on Optimization, 2011
Min Tao, Xiaoming Yuan

Many applications arising in a variety of fields can be well illustrated by the task of recovering the low-rank and sparse components of a given matrix. Recently, it has been discovered that this NP-hard task can be accomplished well, both theoretically and numerically, by heuristically solving a convex relaxation problem in which the widely acknowledged nuclear norm and l1 norm are utilized to induce ...
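
As a hedged restatement of the convex relaxation described above (the precise model and algorithm are developed in the paper), low-rank-plus-sparse recovery is usually written as:

```latex
% Convex relaxation for recovering low-rank L and sparse S from an observed matrix M;
% lambda balances the nuclear norm against the l1 norm.
\min_{L,\,S}\; \|L\|_{*} + \lambda \|S\|_{1}
\quad \text{s.t.} \quad L + S = M
```

Augmented-Lagrangian-type solvers attach a multiplier Y and a quadratic penalty (beta/2)||L + S - M||_F^2 to the constraint and alternate minimization over L and S; alternating direction schemes of this kind are the sort of approach the abstract refers to.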

Journal: RAIRO - Operations Research, 2010
Alfredo N. Iusem, Mostafa Nasri

We introduce augmented Lagrangian methods for solving finite dimensional variational inequality problems whose feasible sets are defined by convex inequalities, generalizing the proximal augmented Lagrangian method for constrained optimization. At each iteration, primal variables are updated by solving an unconstrained variational inequality problem, and then dual variables are updated through ...
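
For orientation (the precise scheme for variational inequalities is given in the paper), the classical augmented Lagrangian iteration for inequality-constrained optimization that it generalizes looks like:

```latex
% Classical augmented Lagrangian for min f(x) s.t. g_i(x) <= 0 (Rockafellar form);
% the cited method replaces the primal minimization by an unconstrained
% variational inequality while keeping a multiplier update of this type.
L_r(x,\lambda) \;=\; f(x) + \frac{1}{2r}\sum_{i}\Big(\max\{0,\ \lambda_i + r\,g_i(x)\}^2 - \lambda_i^2\Big),
\qquad
\lambda_i^{k+1} \;=\; \max\{0,\ \lambda_i^{k} + r\,g_i(x^{k+1})\}.
```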

2010
Alexandre Caboussat, Roland Glowinski, Allison Leonard

A numerical method for the computation of the best constant in a Sobolev inequality involving the spaces H²(Ω) and C⁰(Ω) is presented. Green’s functions corresponding to the solution of Poisson problems are used to express the solution. This (kind of) non-smooth eigenvalue problem is then formulated as a constrained optimization problem and solved with two different strategies: an augmented Lagr...
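
A hedged sketch of how such a best-constant problem turns into constrained optimization (the boundary conditions and the exact norms are as specified in the paper):

```latex
% Best constant C in an embedding inequality ||v||_{C^0(Omega)} <= C ||v||_{H^2(Omega)}:
C \;=\; \sup_{v \neq 0} \frac{\|v\|_{C^0(\Omega)}}{\|v\|_{H^2(\Omega)}}
\;=\; \sup\Big\{\, \|v\|_{C^0(\Omega)} \;:\; \|v\|_{H^2(\Omega)} = 1 \,\Big\}
```

The norm constraint is what the augmented Lagrangian strategy mentioned in the abstract can be used to enforce.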

Journal: Math. Oper. Res., 2003
Xuexiang Huang, Xiaoqi Yang

In this paper, the existence of an optimal path and its convergence to the optimal set of a primal problem of minimizing an extended real-valued function are established via a generalized augmented Lagrangian and corresponding generalized augmented Lagrangian problems, in which no convexity is imposed on the augmenting function. These results further imply a zero duality gap property between th...
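
For context (hedged; the paper's construction differs in its details), a standard generalized augmented Lagrangian built from a dualizing parameterization f(x,u), with f(x,0) the original objective, and an augmenting function sigma >= 0 with sigma(0) = 0, together with its dual function, reads:

```latex
% Generalized augmented Lagrangian and dual function; a zero duality gap means
% inf_x f(x,0) = sup_{\lambda,\,r>0} \theta_r(\lambda) under suitable assumptions,
% here without requiring convexity of the augmenting function \sigma.
\ell(x,\lambda,r) \;=\; \inf_{u}\Big\{\, f(x,u) - \langle \lambda, u\rangle + r\,\sigma(u) \,\Big\},
\qquad
\theta_r(\lambda) \;=\; \inf_{x}\, \ell(x,\lambda,r)
```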

Journal: Comp. Opt. and Appl., 2003
Gianni Di Pillo, Giampaolo Liuzzi, Stefano Lucidi, Laura Palagi

This paper is aimed toward the definition of a new exact augmented Lagrangian function for two-sided inequality constrained problems. The distinguishing feature of this augmented Lagrangian function is that it employs only one multiplier for each two-sided constraint. We prove that stationary points, local minimizers and global minimizers of the exact augmented Lagrangian function correspond ex...
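
The problem class in question (the single-multiplier augmented Lagrangian itself is constructed in the paper) is:

```latex
% Two-sided inequality constrained problem: a textbook treatment would split each
% constraint into l_i - g_i(x) <= 0 and g_i(x) - u_i <= 0 with one multiplier per
% side; the function proposed in the paper uses a single multiplier per pair.
\min_{x}\; f(x)
\quad \text{s.t.} \quad l_i \;\le\; g_i(x) \;\le\; u_i, \qquad i = 1,\dots,m
```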

2011
André F. T. Martins, Mário A. T. Figueiredo, Pedro M. Q. Aguiar, Noah A. Smith, Eric P. Xing

We propose a new algorithm for approximate MAP inference on factor graphs, which combines augmented Lagrangian optimization with the dual decomposition method. Each slave subproblem is given a quadratic penalty, which pushes toward faster consensus than in previous subgradient approaches. Our algorithm is provably convergent, parallelizable, and suitable for fine decompositions of the graph. We...
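
A toy consensus sketch in the spirit of the description above, not the paper's factor-graph algorithm: each "slave" minimizes its local objective plus a quadratic penalty toward a shared variable, and dual variables enforce agreement. The scalar problem and all constants are illustrative assumptions.

```python
# Toy consensus ADMM: minimize sum_i 0.5*(x - a_i)^2 by giving each slave its own
# copy x_i with agreement constraint x_i = z. The quadratic penalty on each slave
# is what pushes toward consensus faster than pure subgradient dual decomposition.
import numpy as np

a = np.array([1.0, 4.0, 7.0])   # one local target per slave (illustrative)
rho = 1.0                        # quadratic penalty weight
x = np.zeros_like(a)             # local copies
z = 0.0                          # shared (consensus) variable
u = np.zeros_like(a)             # scaled dual variables

for _ in range(50):
    # Slave subproblems: argmin_xi 0.5*(xi - a_i)^2 + (rho/2)*(xi - z + u_i)^2
    x = (a + rho * (z - u)) / (1.0 + rho)
    # Consensus update: average of the penalized local copies
    z = float(np.mean(x + u))
    # Dual update on the agreement constraints x_i = z
    u = u + x - z

print(z, float(np.mean(a)))      # z converges to the average of the a_i
```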

Journal: Adv. Comput. Math., 2013
Yunhai Xiao, Soon-Yi Wu, Dong-Hui Li

Given a set of corrupted data drawn from a union of multiple subspaces, the subspace recovery problem is to segment the data into their respective subspaces and to correct the possible noise simultaneously. Recently, it has been discovered that this task can be characterized, both theoretically and numerically, by solving a convex minimization problem involving the matrix nuclear norm and the ℓ2,1 mixed norm....
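
A hedged restatement of the kind of convex program described above (the exact model is in the paper): with data matrix X, a self-representation coefficient matrix Z, and a column-wise error term E, subspace recovery is commonly posed as:

```latex
% Nuclear-norm plus l_{2,1}-norm program for subspace recovery; the l_{2,1} norm
% sums the Euclidean norms of the columns of E, so it favors corrupting only a
% few samples; lambda trades off the two terms.
\min_{Z,\,E}\; \|Z\|_{*} + \lambda \|E\|_{2,1}
\quad \text{s.t.} \quad X = XZ + E
```

An augmented Lagrangian or alternating direction scheme is the usual way to handle the linear constraint.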

Chart of the number of search results per year
