A proximal algorithm with backtracked extrapolation for a class of structured fractional programming

Authors

Abstract

In this paper, we consider a class of structured fractional minimization problems in which the numerator of the objective is the sum of a convex function and a Lipschitz differentiable (possibly nonconvex) function, while the denominator is a convex function. By exploiting the structure of the problem, we propose a first-order algorithm, namely, the proximal-gradient-subgradient algorithm with backtracked extrapolation (PGSA_BE), for solving this type of optimization problem. It is worth pointing out that there are a few differences between our extrapolation and other popular extrapolations used in optimization. One of them is as follows: if the new iterate obtained from the extrapolated iteration satisfies the backtracking condition, then it will be replaced by the one generated by the non-extrapolated iteration. We show that any accumulation point of the sequence generated by PGSA_BE is a critical point of the problem regarded. In addition, by assuming that some auxiliary functions satisfy the Kurdyka-Łojasiewicz property, we are able to establish the global convergence of the entire sequence, in the case where the denominator is locally differentiable, or its conjugate satisfies the calmness condition. Finally, we present preliminary numerical results to illustrate the efficiency of PGSA_BE.
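The backtracking safeguard described above can be illustrated with a minimal sketch. The quadratic smooth term, the l1 convex term, the step size, and the sufficient-decrease test below are all illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

# Sketch of a proximal gradient step with a backtracked extrapolation
# safeguard, in the spirit of PGSA_BE. Objective: 0.5*||x-b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
b = rng.standard_normal(5)
lam, step, beta = 0.1, 0.5, 0.3

def grad_f(x):                      # gradient of f(x) = 0.5*||x - b||^2
    return x - b

def prox_g(x, t):                   # prox of t*||x||_1 (soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def obj(x):                         # full objective f(x) + lam*||x||_1
    return 0.5 * np.sum((x - b) ** 2) + lam * np.sum(np.abs(x))

x_prev = x = np.zeros(5)
for k in range(200):
    y = x + beta * (x - x_prev)     # extrapolated point
    x_new = prox_g(y - step * grad_f(y), lam * step)
    # Backtracking safeguard: if the extrapolated iterate fails to
    # decrease the objective, redo the step without extrapolation.
    if obj(x_new) > obj(x) - 1e-12:
        x_new = prox_g(x - step * grad_f(x), lam * step)
    x_prev, x = x, x_new
```

For this toy separable objective the minimizer has the closed form sign(b)·max(|b| − lam, 0), which makes the sketch easy to check.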


Similar articles

A proximal-like algorithm for a class of nonconvex programming

In this paper, we study a proximal-like algorithm for minimizing a closed proper function f(x) subject to x ≥ 0, based on the iterative scheme xᵏ ∈ argmin{f(x) + μₖ d(x, xᵏ⁻¹)}, where d(·, ·) is an entropy-like distance function. The algorithm is well-defined under the assumption that the problem has a nonempty and bounded solution set. If, in addition, f is a differentiable quasi-convex function...
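One concrete instance of such an entropy-like proximal step admits a closed form. For a linear objective f(x) = c·x on x ≥ 0 and the KL-type distance d(x, y) = Σᵢ(xᵢ log(xᵢ/yᵢ) − xᵢ + yᵢ), the subproblem has the multiplicative update below; the linear f and constant μₖ are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Entropy-like proximal step x^k = argmin{ c.x + mu*d(x, x^{k-1}) }
# with d(x, y) = sum(x*log(x/y) - x + y). Stationarity gives
#   c + mu*log(x / x_prev) = 0  =>  x = x_prev * exp(-c/mu).
c = np.array([1.0, 2.0, 0.5])   # positive costs, so the minimum is at x = 0
mu = 1.0                        # assumed constant regularization weight
x = np.ones(3)                  # strictly positive starting point
for k in range(200):
    x = x * np.exp(-c / mu)     # closed-form solution of the subproblem
```

Each iterate stays strictly positive (the entropy term acts as an interior barrier) while the sequence decays toward the solution x = 0.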


A generalized implicit enumeration algorithm for a class of integer nonlinear programming problems

Presented here is a generalization of the implicit enumeration algorithm that can be applied when the objective function is being maximized and can be rewritten as the difference of two non-decreasing functions. Also developed is a computational algorithm, named linear speedup, to use whatever explicit linear constraints are present to speed up the search for a solution. The method is easy to u...


Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems

In this paper, we study the proximal gradient algorithm with extrapolation for minimizing the sum of a Lipschitz differentiable function and a proper closed convex function. Under the error bound condition used in [19] for analyzing the convergence of the proximal gradient algorithm, we show that there exists a threshold such that if the extrapolation coefficients are chosen below this threshol...


A numerical algorithm for solving a class of matrix equations

In this paper, we present a numerical algorithm for solving matrix equations $(A \otimes B)X = F$ by extending the well-known Gaussian elimination for $Ax = b$. The proposed algorithm has a high computational efficiency. Two numerical examples are provided to show the effectiveness of the proposed algorithm.
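For context, the Kronecker structure of such systems can be exploited directly: using the standard identity (A ⊗ B) vec(X) = vec(B X Aᵀ), the large mn × mn system reduces to two small dense solves. This is the textbook identity, not necessarily the paper's exact elimination scheme:

```python
import numpy as np

# Solve (A ⊗ B) vec(X) = vec(F) via the identity
# (A ⊗ B) vec(X) = vec(B X A^T), i.e. B X A^T = F.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 3.0 * np.eye(3)   # assumed nonsingular
B = rng.standard_normal((4, 4)) + 4.0 * np.eye(4)   # assumed nonsingular
F = rng.standard_normal((4, 3))

Y = np.linalg.solve(B, F)        # first small solve:  B Y = F
X = np.linalg.solve(A, Y.T).T    # second small solve: X A^T = Y
```

This costs O(m³ + n³) instead of the O(m³n³) of eliminating on the full Kronecker matrix.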


A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming

In this paper we propose a randomized block coordinate non-monotone gradient (RBCNMG) method for minimizing the sum of a smooth (possibly nonconvex) function and a block-separable (possibly nonconvex nonsmooth) function. At each iteration, this method randomly picks a block according to any prescribed probability distribution and typically solves several associated proximal subproblems that usu...



Journal

Journal title: Applied and Computational Harmonic Analysis

Year: 2022

ISSN: 1096-603X, 1063-5203

DOI: https://doi.org/10.1016/j.acha.2021.08.004