Search results for: strongly convex function

Number of results: 1,435,527

Journal: Comp. Opt. and Appl., 2017
Reza Eghbali, Maryam Fazel

We study the convergence rate of the proximal-gradient homotopy algorithm applied to norm-regularized linear least squares problems, for a general class of norms. The homotopy algorithm reduces the regularization parameter in a series of steps, and uses a proximal-gradient algorithm to solve the problem at each step. The proximal-gradient algorithm has a linear rate of convergence given that the obj...
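The truncated abstract describes the overall scheme (shrink the regularization parameter in stages, warm-starting each stage with the previous solution and solving it by proximal gradient). Below is a minimal Python sketch of that generic homotopy idea for the familiar l1 (lasso) special case; it is not the authors' algorithm for general norms, and the function names, the shrink factor eta, and the stopping rule are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (the l1 instance of the norm regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad(A, b, lam, x0, step, tol=1e-6, max_iter=1000):
    # Plain proximal-gradient (ISTA) for 0.5*||Ax - b||^2 + lam*||x||_1.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def prox_grad_homotopy(A, b, lam_target, eta=0.7):
    # Homotopy: start with a large regularization parameter (where x = 0 is
    # optimal) and shrink it geometrically, warm-starting each stage with the
    # previous approximate solution.
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the smooth part
    lam = np.linalg.norm(A.T @ b, np.inf)    # x = 0 is optimal at this level
    x = np.zeros(n)
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = prox_grad(A, b, lam, x, step)
    return x

# Example usage on a small random instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = prox_grad_homotopy(A, b, lam_target=0.1)
```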

1992
Zhenyu Li, Victor Milenkovic

One useful generalization of the convex hull of a set S of n points is the ε-strongly convex δ-hull. It is defined to be a convex polygon P with vertices taken from S such that no point in S lies farther than δ outside P and such that, even if the vertices of P are perturbed by as much as ε, P remains convex. It was an open question [1] as to whether an ε-strongly convex O(ε)-hull existed for all positive ε. W...
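One possible formalisation of the definition sketched in this abstract, written in LaTeX; the symbols δ (distance tolerance) and ε (perturbation radius) appear to have been lost in extraction and are an assumption here.

```latex
% A possible formalisation of the definition sketched in the abstract
% (delta = how far points of S may lie outside P, epsilon = how much the
% vertices may be perturbed); the Greek letters are assumed.
Let $S \subset \mathbb{R}^{2}$ be a set of $n$ points. A convex polygon $P$
with vertices $p_{1},\dots,p_{k}$ taken from $S$ is an
$\varepsilon$-strongly convex $\delta$-hull of $S$ if
(i) every $s \in S$ satisfies $\operatorname{dist}(s, P) \le \delta$, and
(ii) every polygon $p'_{1}\dots p'_{k}$ with
     $\lVert p'_{i} - p_{i}\rVert \le \varepsilon$ for all $i$ is still convex.
```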

2013
Lijun Zhang, Tianbao Yang, Rong Jin, Xiaofei He

A. Proof of Lemma 1 We need the following lemma that characterizes the property of the extra-gradient descent. Lemma 8 (Lemma 3.1 in (Nemirovski, 2005)). Let Z be a convex compact set in Euclidean space E with inner product 〈·, ·〉, let ‖ · ‖ be a norm on E and ‖ · ‖∗ be its dual norm, and let ω(z) : Z → R be an α-strongly convex function with respect to ‖ · ‖. The Bregman distance associated wi...
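For the reader's convenience, the standard definitions behind the truncated lemma (α-strong convexity with respect to ‖·‖ and the associated Bregman distance) can be written as follows; this is textbook material, not a quotation of the lemma's missing remainder.

```latex
% omega is alpha-strongly convex on Z with respect to ||.|| if,
% for all z, z' in Z,
\omega(z') \;\ge\; \omega(z) + \langle \nabla\omega(z),\, z' - z\rangle
                 + \tfrac{\alpha}{2}\,\lVert z' - z\rVert^{2},
% and the Bregman distance associated with omega satisfies
V_{z}(z') \;:=\; \omega(z') - \omega(z) - \langle \nabla\omega(z),\, z' - z\rangle
          \;\ge\; \tfrac{\alpha}{2}\,\lVert z' - z\rVert^{2}.
```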

2016
Hamid Reza Moradi

We give a Jensen-type operator inequality for strongly convex functions. As a corollary, we improve the Hölder-McCarthy inequality under suitable conditions. More precisely, we show that if Sp(A) ⊂ I ⊆ (1, ∞), then ⟨Ax, x⟩^r ≤ ⟨A^r x, x⟩ − ((r^2 − r)/2)(...
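For context, the classical Hölder-McCarthy inequality that the abstract says is being improved reads as follows; the strongly convex correction term itself is cut off in the snippet above.

```latex
% Hölder-McCarthy: for a positive operator A on a Hilbert space
% and a unit vector x,
\langle Ax, x\rangle^{r} \;\le\; \langle A^{r}x, x\rangle \quad (r \ge 1),
\qquad
\langle A^{r}x, x\rangle \;\le\; \langle Ax, x\rangle^{r} \quad (0 < r \le 1).
```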

2012
Daniel Azagra

Let U ⊆ R^d be open and convex. We prove that every (not necessarily Lipschitz or strongly) convex function f : U → R can be approximated by real analytic convex functions, uniformly on all of U. We also show that C-fine approximation of convex functions by smooth (or real analytic) convex functions on R^d is possible in general if and only if d = 1. Nevertheless, for d ≥ 2 we give a characterizat...
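A textbook one-dimensional illustration of the kind of statement in this abstract (not taken from the paper): the nonsmooth convex function |x| is approximated uniformly on all of R by real analytic convex functions.

```latex
% For every epsilon > 0, the function below is real analytic and convex,
% and it approximates |x| uniformly on all of R:
f_{\varepsilon}(x) \;=\; \sqrt{x^{2} + \varepsilon^{2}},
\qquad
|x| \;\le\; f_{\varepsilon}(x) \;\le\; |x| + \varepsilon
\quad \text{for every } x \in \mathbb{R}.
```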

2014
Jakub Konečný, Peter Richtárik

We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting, using only function evaluations. In particular, we propose and study a simplified variant of the direct search method (of directional type), which we call simplified direct search (SDS). Unlike standard direct search methods, which depend on a large number of parameters that need to be tuned, SDS depends on a single sc...
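Since the abstract is cut off before the method is fully specified, the following Python sketch only illustrates the general shape of a direct search of directional type (poll a fixed set of directions, accept on sufficient decrease, halve the step size otherwise); the sufficient-decrease constant and polling set are assumptions, not the authors' exact SDS.

```python
import numpy as np

def direct_search_sketch(f, x0, alpha0=1.0, tol=1e-8, max_iter=10_000):
    """Generic direct search of directional type (illustrative, not SDS):
    poll the positive/negative coordinate directions, accept a trial point
    that gives sufficient decrease, and halve the step size otherwise."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    alpha = alpha0
    n = x.size
    directions = np.vstack([np.eye(n), -np.eye(n)])   # maximal positive basis
    for _ in range(max_iter):
        if alpha < tol:
            break
        for d in directions:
            trial = x + alpha * d
            ft = f(trial)
            if ft < fx - 0.5 * alpha ** 2:             # sufficient-decrease test
                x, fx = trial, ft
                break
        else:
            alpha *= 0.5                               # unsuccessful poll: shrink step
    return x, fx

# Example: minimise a smooth strongly convex quadratic without derivatives.
quad = lambda z: float(z @ z + z[0])
x_star, f_star = direct_search_sketch(quad, np.array([2.0, -3.0]))
```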

Journal: Computational Optimization and Applications, 2022

Abstract: In this work we aim to solve a convex-concave saddle point problem, where the coupling function is smooth in one variable and nonsmooth in the other, and is not assumed to be linear in either. The problem is augmented by a regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step coupled with a proximal step of the regulariser, which is alternated with a proximal gradient descent step in the nonsmooth compone...
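As a rough illustration of the two ingredients named in the abstract (an optimistic, i.e. extrapolated, gradient ascent step combined with a proximal step of the regulariser), here is a Python sketch on a toy regularised bilinear saddle problem; the problem, step sizes, and update order are assumptions for illustration, and this is not the OGAProx method itself.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def optimistic_ascent_prox_sketch(K, a, lam, tau=0.1, sigma=0.1, iters=500):
    """Toy saddle problem:  min_x max_y  0.5*||x - a||^2 + <K x, y> - lam*||y||_1.
    The y-update uses the 'optimistic' extrapolated gradient 2*g_k - g_{k-1}
    followed by the prox of lam*||.||_1; the x-update is a gradient step."""
    n, m = K.shape[1], K.shape[0]
    x, y = np.zeros(n), np.zeros(m)
    g_prev = K @ x                       # previous ascent gradient of <Kx, y>
    for _ in range(iters):
        g = K @ x
        y = soft_threshold(y + sigma * (2 * g - g_prev), sigma * lam)
        x = x - tau * ((x - a) + K.T @ y)
        g_prev = g
    return x, y

# Example usage on random data.
rng = np.random.default_rng(1)
K = rng.standard_normal((5, 8))
a = rng.standard_normal(8)
x_hat, y_hat = optimistic_ascent_prox_sketch(K, a, lam=0.5)
```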

Journal: International Electronic Journal of Geometry, 2022

Considering a projectively invariant metric $\tau$ defined by the kernel function on a strongly convex bounded domain $\Omega\subset\mathbb{R}^n$, we study the asymptotic expansion of the scalar curvature with respect to the distance function, and use the Fubini-Pick invariant to describe the second term in the expansion. This implies that if $n\geq 3$ and $(\Omega,\tau)$ has constant curvature, then it is equivalent to a ball.

Chart of the number of search results per year
