An accelerated directional derivative method for smooth stochastic convex optimization

Authors

Abstract

We consider smooth stochastic convex optimization problems in the context of algorithms which are based on directional derivatives of the objective function. This setting can be considered as an intermediate one between derivative-free and gradient-based optimization. We assume that, at any given point and for any given direction, an approximation of the directional derivative of the objective function in this direction is available with some additive noise. The noise is assumed to be of an unknown nature, but bounded in absolute value. We underline that we allow arbitrary directions, as opposed to coordinate descent methods, which use only coordinate directions. For this setting, we propose a non-accelerated and an accelerated method and provide their complexity bounds. Our non-accelerated algorithm has a complexity bound similar to that of the gradient-based algorithm, that is, without any dimension-dependent factor. The bound of our accelerated algorithm coincides with that of the accelerated gradient-based algorithm up to a factor of the square root of the problem dimension. We extend these results to strongly convex problems.
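To make the oracle model concrete, the following is a minimal Python sketch of the setting described in the abstract: a directional derivative oracle corrupted by stochastic noise plus an additive term bounded in absolute value, used inside a simple non-accelerated random-direction method. The function names, the noise levels, and the quadratic test objective are illustrative assumptions; this is not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_directional_derivative(grad_f, x, e, sigma=1e-3, delta=1e-4):
    """Illustrative oracle for the setting above: the directional derivative
    <grad f(x), e> plus stochastic noise (Gaussian, std sigma) plus a term
    of unknown nature bounded in absolute value by delta."""
    xi = sigma * rng.standard_normal()        # stochastic part
    eta = delta * (2.0 * rng.random() - 1.0)  # bounded part of unknown nature
    return grad_f(x) @ e + xi + eta

def random_direction_method(grad_f, x0, step, n_iters=2000):
    """Simplified non-accelerated method: since E[n <g, e> e] = g when e is
    uniform on the unit sphere, n * oracle(e) * e is (up to the noise) an
    unbiased gradient estimate."""
    x = x0.copy()
    n = x.size
    for _ in range(n_iters):
        e = rng.standard_normal(n)
        e /= np.linalg.norm(e)                # uniform direction on the sphere
        x -= step * n * noisy_directional_derivative(grad_f, x, e) * e
    return x

# Example: f(x) = 0.5 ||x||^2, grad f(x) = x; the minimizer is the origin.
x_hat = random_direction_method(lambda x: x, x0=np.ones(10), step=0.01)
```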


Similar articles

An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

We consider an unconstrained problem of minimization of a smooth convex function which is only available through noisy observations of its values, the noise consisting of two parts. Similar to stochastic optimization problems, the first part is of a stochastic nature. In contrast, the second part is an additive noise of an unknown nature, but bounded in absolute value. In the two-point ...
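For intuition, the two-point estimator at the heart of such derivative-free methods can be sketched as below; the oracle noisy_f, the smoothing parameter t, and the quadratic example are illustrative assumptions, not the cited paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def two_point_gradient_estimate(f, x, t=1e-4):
    """Two-point zeroth-order estimate: sample a random unit direction e and
    return n * (f(x + t*e) - f(x - t*e)) / (2*t) * e, which approximates the
    gradient in expectation. Smaller t reduces the smoothing bias but
    amplifies the effect of noise in the function values."""
    n = x.size
    e = rng.standard_normal(n)
    e /= np.linalg.norm(e)
    return n * (f(x + t * e) - f(x - t * e)) / (2.0 * t) * e

# Example: noisy observations of f(x) = 0.5 ||x||^2, whose gradient is x.
def noisy_f(x, sigma=1e-8):
    return 0.5 * x @ x + sigma * rng.standard_normal()

g_hat = two_point_gradient_estimate(noisy_f, np.ones(5))
```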


An adaptive accelerated first-order method for convex optimization

In this paper, we present a new accelerated variant of Nesterov's method for solving a class of convex optimization problems, in which certain acceleration parameters are adaptively (and aggressively) chosen so as to (i) preserve the theoretical iteration-complexity of the original method and (ii) substantially improve its practical performance in comparison to other existing variants. Computatio...
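As a reference point, here is a minimal sketch of the classical (non-adaptive) Nesterov method that such variants build on; the fixed momentum schedule and the quadratic example are assumptions for illustration, not the adaptive parameter choices of the cited paper.

```python
import numpy as np

def nesterov_agd(grad_f, x0, L, n_iters=100):
    """Nesterov's accelerated gradient method for an L-smooth convex f,
    with the standard fixed momentum schedule; adaptive variants tune
    these acceleration parameters on the fly while keeping the O(1/k^2)
    iteration-complexity."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_next = y - grad_f(y) / L                        # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Example: f(x) = 0.5 x^T A x with A = diag(1, ..., 5), so L = 5.
A = np.diag(np.arange(1.0, 6.0))
x_hat = nesterov_agd(lambda x: A @ x, x0=np.ones(5), L=5.0)
```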


Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method)

In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial derivatives, or directional derivatives of the objective function. We introduce a unifying framework which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on a...
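One of the randomized oracles covered by such a framework is the partial-derivative (coordinate) oracle; the sketch below shows plain random coordinate descent as a concrete, non-accelerated instance. The names partial_f and L_coord and the separable example are illustrative assumptions; the framework's accelerated variants are more involved.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_coordinate_descent(partial_f, x0, L_coord, n_iters=2000):
    """Random coordinate descent: at each step sample a coordinate i
    uniformly and take a gradient step in that coordinate only, with step
    1/L_coord[i], the coordinate-wise Lipschitz constant of partial_i f."""
    x = x0.copy()
    n = x.size
    for _ in range(n_iters):
        i = rng.integers(n)
        x[i] -= partial_f(x, i) / L_coord[i]
    return x

# Example: f(x) = 0.5 * sum_i a_i x_i^2, with partial_i f(x) = a_i x_i.
a = np.arange(1.0, 6.0)
x_hat = random_coordinate_descent(lambda x, i: a[i] * x[i], np.ones(5), a)
```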


Accelerated Regularized Newton Method for Unconstrained Convex Optimization

We consider a global complexity bound of regularized Newton methods for unconstrained convex optimization. The global complexity bound is an upper bound on the number of iterations required to get an approximate solution x such that f(x) − inf_y f(y) ≤ ε, where ε is a given positive constant. Recently, Ueda and Yamashita proposed a regularized Newton method whose global complexity bound is O...
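The basic step behind such methods is a Newton step with a regularized Hessian, sketched below. Tying the regularization mu to the gradient norm follows the spirit of the cited scheme, but the exact rule and the quartic example are illustrative assumptions.

```python
import numpy as np

def regularized_newton_step(grad_f, hess_f, x, mu):
    """One regularized Newton step: x+ = x - (H + mu*I)^{-1} g. The mu*I
    term keeps the linear system well conditioned even where the Hessian
    H is singular or nearly so."""
    g, H = grad_f(x), hess_f(x)
    return x - np.linalg.solve(H + mu * np.eye(x.size), g)

# Example: f(x) = 0.25 ||x||^4, whose Hessian is singular at the optimum.
grad_f = lambda x: (x @ x) * x
hess_f = lambda x: (x @ x) * np.eye(x.size) + 2.0 * np.outer(x, x)
x = np.ones(3)
for _ in range(30):
    x = regularized_newton_step(grad_f, hess_f, x, mu=np.linalg.norm(grad_f(x)))
```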


Inexact Restoration Method for Derivative-Free Optimization with Smooth Constraints

A new method is introduced for solving constrained optimization problems in which the derivatives of the constraints are available but the derivatives of the objective function are not. The method is based on the Inexact Restoration framework, by means of which each iteration is divided into two phases. In the first phase, one considers only the constraints, in order to improve feasibility. In the...
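A toy two-phase iteration in this spirit is sketched below: phase one uses the constraint derivatives to improve feasibility, and phase two improves the objective without its derivatives by comparing function values along a direction tangent to the constraint. The 2-D example, the Gauss-Newton restoration, and the three-point search are illustrative assumptions, not the cited algorithm.

```python
import numpy as np

def inexact_restoration_step(f, h, h_jac, x, t=1e-2):
    """One illustrative two-phase step for a single constraint h(x) = 0 in
    2-D: restoration via a Gauss-Newton step on h, then a derivative-free
    comparison of f at three points along the constraint tangent."""
    # Phase 1 (restoration): improve feasibility using constraint derivatives.
    J = h_jac(x)
    y = x - J * h(x) / (J @ J)
    # Phase 2 (minimization): derivative-free search tangent to the constraint.
    d = np.array([-J[1], J[0]])
    d /= np.linalg.norm(d)
    steps = t * np.array([-1.0, 0.0, 1.0])
    vals = [f(y + s * d) for s in steps]
    return y + steps[int(np.argmin(vals))] * d

# Example: minimize f(x) = (x0 - 2)^2 + x1^2 subject to x0^2 + x1^2 = 1.
f = lambda x: (x[0] - 2.0) ** 2 + x[1] ** 2
h = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0
h_jac = lambda x: 2.0 * x
x = np.array([0.0, 1.0])
for _ in range(250):
    x = inexact_restoration_step(f, h, h_jac, x)  # approaches (1, 0)
```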



Journal

Journal title: European Journal of Operational Research

Year: 2021

ISSN: 1872-6860, 0377-2217

DOI: https://doi.org/10.1016/j.ejor.2020.08.027