Smooth minimization of non-smooth functions

Author

  • Yurii Nesterov
Abstract

In this paper we propose a new approach for constructing efficient schemes for non-smooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. From the viewpoint of efficiency estimates, we manage to improve the traditional bounds on the number of iterations of the gradient schemes from O(1/ε²) to O(1/ε), keeping basically the complexity of each iteration unchanged.
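
For orientation, here is the smoothing construction in brief; this is a sketch with my own notation for the max-structure and the prox-function, while the paper itself supplies the precise assumptions and the optimal gradient method:

```latex
% f has explicit max-structure over a compact set Q_2:
%     f(x) = max_{u in Q_2} { <Ax, u> - phi(u) }.
% Pick a prox-function d_2 that is sigma_2-strongly convex on Q_2 and set
f_\mu(x) = \max_{u \in Q_2} \bigl\{ \langle Ax, u \rangle - \phi(u) - \mu\, d_2(u) \bigr\}
% Then f_mu has a Lipschitz-continuous gradient and approximates f uniformly:
L_\mu = \frac{\|A\|^2}{\mu\, \sigma_2},
\qquad
f_\mu(x) \le f(x) \le f_\mu(x) + \mu D_2,
\quad D_2 = \max_{u \in Q_2} d_2(u)
% With mu = Theta(epsilon), a fast gradient method on f_mu reaches accuracy
% epsilon on the original f after
N = O\bigl(\sqrt{L_\mu / \varepsilon}\bigr) = O(1/\varepsilon)
% iterations, versus O(1/epsilon^2) for black-box subgradient schemes.
```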


Similar resources

Quasi-Gap and Gap Functions for Non-Smooth Multi-Objective Semi-Infinite Optimization Problems

In this paper, we introduce and study some new single-valued gap functions for non-differentiable semi-infinite multiobjective optimization problems with locally Lipschitz data. Since one of the fundamental properties of a gap function for optimization problems is its ability to characterize the solutions of the problem in question, the essential properties of the newly introduced ...


Bundle method for non-convex minimization with inexact subgradients and function values

We discuss a bundle method to minimize non-smooth and non-convex locally Lipschitz functions. We analyze situations where only inexact subgradients or function values are available. For suitable classes of non-smooth functions we prove convergence of our algorithm to approximate critical points.

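For a concrete picture of the basic machinery, here is a minimal proximal bundle loop in Python, written for the easy case of a convex max-of-affine function with an exact oracle; the paper's contribution is precisely the extension of this pattern to non-convex functions and inexact subgradients/function values, which this toy sketch does not attempt:

```python
# Minimal proximal bundle sketch (convex case, exact oracle only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))          # rows a_i of the max-of-affine test function
b = rng.standard_normal(20)

def oracle(x):
    """f(x) = max_i (a_i.x + b_i) and one exact subgradient: a_argmax."""
    vals = A @ x + b
    i = int(np.argmax(vals))
    return vals[i], A[i]

t = 1.0                                    # proximal parameter
m = 0.1                                    # descent test parameter in (0, 1)
center = np.zeros(3)                       # current stability center
f_center, g = oracle(center)
bundle = [(center.copy(), f_center, g)]    # cutting planes (x_j, f_j, g_j)

for it in range(50):
    # Subproblem: min_{x,r}  r + ||x - center||^2 / (2t)
    #             s.t.  r >= f_j + g_j.(x - x_j) for every plane in the bundle.
    def obj(z):
        x, r = z[:-1], z[-1]
        return r + np.dot(x - center, x - center) / (2 * t)
    cons = [{'type': 'ineq',
             'fun': lambda z, xj=xj, fj=fj, gj=gj:
                    z[-1] - (fj + gj @ (z[:-1] - xj))}
            for (xj, fj, gj) in bundle]
    sol = minimize(obj, np.append(center, f_center),
                   constraints=cons, method='SLSQP')
    x_new, model_val = sol.x[:-1], sol.x[-1]

    f_new, g_new = oracle(x_new)
    predicted = f_center - model_val       # decrease promised by the model
    if predicted < 1e-8:
        break                              # model matches f near the center: stop
    if f_new <= f_center - m * predicted:  # serious step: move the center
        center, f_center = x_new, f_new
    bundle.append((x_new, f_new, g_new))   # null or serious: enrich the model

print('approx. minimal value:', f_center)
```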

Differentially Private Empirical Risk Minimization Revisited: Faster and More General

In this paper we study the differentially private Empirical Risk Minimization (ERM) problem in different settings. For smooth (strongly) convex loss functions with or without (non-)smooth regularization, we give algorithms that achieve either optimal or near-optimal utility bounds with lower gradient complexity than previous work. For ERM with smooth convex loss functions in high-dimensio...

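To make the setting concrete, here is a hedged baseline sketch of gradient perturbation for DP-ERM (not the paper's algorithms): full-batch gradient descent on a smooth convex logistic loss, with Gaussian noise added to each gradient. The noise scale `sigma` is purely illustrative; in a real analysis it is calibrated to (epsilon, delta), the iteration count, and the sample size:

```python
# Generic gradient-perturbation baseline for DP-ERM (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 5
X = rng.standard_normal((n, d))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))  # clip rows: ||x_i|| <= 1
y = np.sign(X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n))

def grad(w):
    """Gradient of the average logistic loss; per-example norm <= 1 here."""
    z = -y * (X @ w)
    s = 1.0 / (1.0 + np.exp(-z))           # sigmoid(z)
    return (X * (-y * s)[:, None]).mean(axis=0)

T, eta = 200, 1.0
sigma = 0.05    # illustrative; a real analysis sets it from (eps, delta) and T
w = np.zeros(d)
for _ in range(T):
    # Per-step L2 sensitivity of the average gradient is O(1/n), so the
    # calibrated Gaussian noise also scales with 1/n.
    w -= eta * (grad(w) + (sigma / n) * rng.standard_normal(d))

print('training accuracy:', np.mean(np.sign(X @ w) == y))
```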

Relaxed Majorization-Minimization for Non-Smooth and Non-Convex Optimization

We propose a new majorization-minimization (MM) method for non-smooth and non-convex programs, which is general enough to include the existing MM methods. Besides the local majorization condition, we only require that the difference between the directional derivatives of the objective function and its surrogate function vanishes when the number of iterations approaches infinity, which is a very...

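As a reminder of what a concrete MM scheme looks like (a classical instance, not the relaxed method proposed in the paper): iteratively reweighted least squares for an l1 regression objective, where each iteration minimizes a quadratic majorizer that touches the objective at the current iterate:

```python
# Classical MM instance: minimize F(w) = ||X w - y||_1 by majorizing each
# |r_i| at the current residual r_i^k with r_i^2/(2|r_i^k|) + |r_i^k|/2,
# whose minimization is a weighted least-squares problem (IRLS).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(100)
y[::10] += 5.0                               # large outliers: the l1 fit resists them

w = np.linalg.lstsq(X, y, rcond=None)[0]     # start from the least-squares fit
for _ in range(50):
    r = X @ w - y
    wt = 1.0 / np.maximum(np.abs(r), 1e-8)   # majorizer weights 1/|r_i^k| (guarded)
    # Minimize the majorizer: weighted least squares  X^T W X w = X^T W y.
    XtW = X.T * wt
    w = np.linalg.solve(XtW @ X, XtW @ y)

print('l1 estimate:', np.round(w, 3))
```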

Minimization of cost-functions with a non-smooth data-fidelity term. A new approach to the processing of impulsive noise

We consider signal and image restoration using convex cost-functions composed of a non-smooth data-fidelity term and a smooth regularization term. First, we provide a convergent method to minimize such cost-functions. Then we propose an efficient method to remove impulsive noise by minimizing cost-functions composed of an l1 data-fidelity term and an edge-preserving regularization term. Their m...

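Here is a minimal sketch of this kind of objective (my own variant, not the authors' method): restoring a 1-D signal from impulsive noise by proximal gradient on an l1 data-fidelity term plus a smooth edge-preserving (Huber) regularizer on the differences:

```python
# Minimize  F(x) = ||x - y||_1 + lam * sum_i huber(x[i+1] - x[i], delta)
# by proximal gradient: a gradient step on the smooth edge-preserving term,
# then the exact prox of the l1 data-fidelity term (soft-threshold toward y).
import numpy as np

rng = np.random.default_rng(0)
n = 200
clean = np.where(np.arange(n) < n // 2, 0.0, 1.0)     # piecewise-constant signal
y = clean.copy()
idx = rng.choice(n, size=20, replace=False)
y[idx] = rng.choice([-2.0, 3.0], size=20)             # impulsive outliers

lam, delta = 2.0, 0.01
step = delta / (4.0 * lam)        # 1/L for the smooth term (||D||^2 <= 4)

def smooth_grad(x):
    """Gradient of lam * sum huber(diff(x), delta)."""
    p = lam * np.clip(np.diff(x) / delta, -1.0, 1.0)  # huber'(d) = clip(d/delta,-1,1)
    g = np.zeros_like(x)
    g[1:] += p                                        # adjoint of the difference map
    g[:-1] -= p
    return g

x = y.copy()
for _ in range(2000):
    v = x - step * smooth_grad(x)
    # prox of t*||. - y||_1: soft-threshold v toward y at level t = step
    x = y + np.sign(v - y) * np.maximum(np.abs(v - y) - step, 0.0)

print('max error vs clean signal:', np.abs(x - clean).max())
```

Because the prox leaves a point unmoved whenever the regularizer's pull is below the threshold, inliers stay exactly at their observed values while outliers are dragged toward their neighbors, which is the appeal of the l1 fidelity for impulsive noise.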


Journal title:
  • Math. Program.

Volume 103, Issue -

Pages -

Publication date: 2005