An Incremental Gradient Method for Optimization Problems with Variational Inequality Constraints
Authors
Abstract
We consider minimizing a sum of agent-specific, nondifferentiable, merely convex functions over the solution set of a variational inequality (VI) problem in which each agent is associated with a local monotone mapping. This problem finds an application in the computation of the best equilibrium of nonlinear complementarity problems arising in transportation networks. We develop an iteratively regularized incremental gradient method where, at each iteration, agents communicate over a directed cycle graph to update their iterates using local information about their objectives and mappings. The proposed method is single-timescale in the sense that it does not involve any excessive hard-to-project computation per iteration. We derive non-asymptotic agent-wise convergence rates for the suboptimality of the global objective function and the infeasibility of the VI constraints, the latter measured by a suitably defined dual gap function. The proposed method appears to be the first fully iterative scheme, equipped with iteration complexity guarantees, that can address distributed optimization problems with VI constraints over cycle graphs.
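To make the setup concrete, the problem described above can be written as minimizing f(x) = f_1(x) + ... + f_m(x) over x in SOL(X, F_1 + ... + F_m), the solution set of the VI defined by the aggregate of the agents' monotone mappings on a closed convex set X. The following is a minimal Python sketch of one possible iteratively regularized incremental gradient pass consistent with this description; the exact update rule, the projection oracle, and the diminishing step-size and regularization schedules used here are illustrative assumptions, not the paper's algorithm or its guarantees.

import numpy as np

def ir_incremental_gradient(project, F_list, subgrad_list, x0,
                            num_iters=1000, gamma0=1.0, eta0=1.0):
    # Illustrative sketch (assumed form): each agent i on the directed cycle
    # takes one projected step using its local monotone mapping F_i plus a
    # vanishing-weight subgradient of its local convex cost f_i.
    x = np.asarray(x0, dtype=float)
    m = len(F_list)
    for k in range(num_iters):
        gamma_k = gamma0 / (k + 1) ** 0.75   # step size (assumed schedule)
        eta_k = eta0 / (k + 1) ** 0.25       # regularization weight (assumed schedule)
        for i in range(m):                   # one pass around the cycle
            direction = F_list[i](x) + eta_k * subgrad_list[i](x)
            x = project(x - gamma_k * direction)
    return x

# Hypothetical toy usage: X = [0, 1]^2, F_i(x) = A_i @ x with A_i positive
# semidefinite, f_i(x) = ||x - c_i||_1 with subgradient np.sign(x - c_i),
# and project = lambda z: np.clip(z, 0.0, 1.0).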
Similar resources
Necessary Optimality Conditions for Optimization Problems with Variational Inequality Constraints
Multiobjective optimization problem with variational inequality constraints
We study a general multiobjective optimization problem with variational inequality, equality, inequality and abstract constraints. Fritz John type necessary optimality conditions involving Mordukhovich coderivatives are derived. They lead to Kuhn-Tucker type necessary optimality conditions under additional constraint qualifications including the calmness condition, the error bound constraint qu...
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
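For context on this related entry, below is a minimal sketch of a generic nonlinear conjugate gradient iteration with an Armijo backtracking line search; the Fletcher-Reeves parameter is used only as a stand-in, since the paper's actual parameter (obtained from an optimization problem and a variant of the modified secant condition) is not reproduced here.

import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-8):
    # Generic nonlinear CG sketch; the Fletcher-Reeves beta below is a
    # stand-in, not the parameter proposed in the cited paper.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                    # restart if not a descent direction
            d = -g
        t = 1.0                              # Armijo backtracking line search
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x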
Constraint Qualifications and Necessary Optimality Conditions for Optimization Problems with Variational Inequality Constraints
A very general optimization problem with a variational inequality constraint, inequality constraints, and an abstract constraint is studied. Fritz John type and Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are derived. Several constraint qualifications for the Kuhn–Tucker type necessary optimality conditions involving Mordukhovich coderivatives are intr...
Journal
Journal title: IEEE Transactions on Automatic Control
Year: 2023
ISSN: 0018-9286, 1558-2523, 2334-3303
DOI: https://doi.org/10.1109/tac.2023.3251851