Search results for: locally nonconvex lipschitz function

Number of results: 1,291,344

2012
Martin Fuchs Michael Bildhauer

We establish interior gradient bounds for functions u ∈ W^{1,1}_{loc}(Ω) which locally minimize the variational integral J[u, Ω] = ∫_Ω h(|∇u|) dx under the side condition u ≥ Ψ a.e. on Ω, with the obstacle Ψ being locally Lipschitz. Here h denotes a rather general N-function allowing (p, q)-ellipticity with arbitrary exponents 1 < p ≤ q < ∞. Our arguments are based on ideas developed in [BFM] combined...

1990
Martin Costabel

Let u be a vector field on a bounded Lipschitz domain in R^3, and let u together with its divergence and curl be square integrable. If either the normal or the tangential component of u is square integrable over the boundary, then u belongs to the Sobolev space H^{1/2} on the domain. This result gives a simple explanation for known results on the compact embedding of the space of solution...

Journal: :Proceedings of the American Mathematical Society 2015

Journal: :Siam Journal on Optimization 2022

We introduce two algorithms for nonconvex regularized finite sum minimization, where typical Lipschitz differentiability assumptions are relaxed to the notion of relative smoothness. The first one is a Bregman extension of Finito/MISO, studied for fully nonconvex problems when the sampling is random, or under convexity of the nonsmooth term when it is essentially cyclic. The second algorithm is a low-memory variant, in the spirit of SVRG and SARAH,...

Journal: :Rocky Mountain Journal of Mathematics 1996

2010
François Bolley José A. Cañizo José A. Carrillo

We consider general stochastic systems of interacting particles with noise which are relevant as models for the collective behavior of animals, and rigorously prove that in the mean-field limit the system is close to the solution of a kinetic PDE. Our aim is to include models widely studied in the literature, such as the Cucker-Smale model, adding noise to the behavior of individuals. The difficu...

Journal: :Computational Optimization and Applications 2021

In this paper, we describe and establish the iteration-complexity of two accelerated composite gradient (ACG) variants to solve a smooth nonconvex optimization problem whose objective function is the sum of a differentiable function f with Lipschitz continuous gradient and a simple nonsmooth closed convex function h. When f is convex, the first ACG variant reduces to the well-known FISTA for a specific choice of input, and hence can be viewed as a natural ex...
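The FISTA scheme this abstract refers to can be illustrated with a minimal sketch (not the paper's ACG variants): accelerated proximal gradient for min f(x) + λ|x| on a toy scalar problem. The function names and the specific objective below are illustrative assumptions, not the authors' setup.

```python
def soft_threshold(v, t):
    # Proximal operator of t * |.|
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def fista(grad_f, L, lam, x0, iters=200):
    # FISTA: proximal gradient step on the momentum point y,
    # followed by Nesterov's momentum update of y.
    x, y, t = x0, x0, 1.0
    for _ in range(iters):
        x_new = soft_threshold(y - grad_f(y) / L, lam / L)
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy problem: f(x) = 0.5*(x - 3)^2 (grad = x - 3, L = 1), h(x) = |x|.
# The minimizer of 0.5*(x - 3)^2 + |x| is x = 2.
x_star = fista(lambda x: x - 3.0, L=1.0, lam=1.0, x0=0.0)
```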

2016
Yan Kaganovsky Ikenna Odinaka David E. Carlson Lawrence Carin

We propose an optimization framework for nonconvex problems based on majorization-minimization that is particularly well-suited for parallel computing. It reduces the optimization of a high-dimensional nonconvex objective function to successive optimizations of locally tight and convex upper bounds which are additively separable into low-dimensional objectives. The original problem is then brok...
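The majorization-minimization idea described here can be sketched in its simplest form (not the authors' parallel, separable framework): for an L-smooth nonconvex g, the quadratic q_k(x) = g(x_k) + g'(x_k)(x - x_k) + (L/2)(x - x_k)^2 is a locally tight convex upper bound, and minimizing it in closed form gives the next iterate. The objective below is an illustrative assumption.

```python
import math

def mm_minimize(grad_g, L, x0, iters=500):
    # Majorization-minimization with the standard quadratic majorizer:
    # minimizing q_k(x) in closed form yields x_{k+1} = x_k - g'(x_k)/L,
    # which monotonically decreases g and converges to a stationary point.
    x = x0
    for _ in range(iters):
        x = x - grad_g(x) / L
    return x

g = lambda x: x * x + 3.0 * math.sin(x)       # nonconvex toy objective
dg = lambda x: 2.0 * x + 3.0 * math.cos(x)    # its gradient
# g''(x) = 2 - 3*sin(x) is bounded by 5, so L = 5 is a valid smoothness bound.
x_star = mm_minimize(dg, L=5.0, x0=0.0)
```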

Journal: :Journal of Industrial and Management Optimization 2022

In this paper, one minimizes a fractional function over a compact set. Using an exact separation theorem, we give necessary optimality conditions for strict optimal solutions in terms of Fréchet subdifferentials. All data are assumed to be locally Lipschitz.

2017
Runze Zhang Siyu Zhu Tian Fang Long Quan

We provide a proof of the following statement for nonconvex functions in this section; the convergence statement of the ADMM algorithm for the bundle adjustment objective function in Section 3.1 of the paper body can then be obtained by applying it. Theorem: With the objective function in Eqn. 7 of the paper body, in which the gradients of each function f_i are locally Lipschitz continuo...
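The ADMM iteration whose convergence this abstract discusses can be illustrated on a toy splitting problem (a generic sketch, not the bundle-adjustment objective of the paper); the scalar problem and function names below are illustrative assumptions.

```python
def soft_threshold(v, t):
    # Proximal operator of t * |.|
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def admm_scalar(b, lam, rho=1.0, iters=200):
    # ADMM for:  min_x 0.5*(x - b)^2 + lam*|z|   s.t.  x = z
    # with scaled dual variable u.
    x = z = u = 0.0
    for _ in range(iters):
        x = (b + rho * (z - u)) / (1.0 + rho)  # x-update: quadratic subproblem
        z = soft_threshold(x + u, lam / rho)   # z-update: proximal step
        u = u + x - z                          # scaled dual ascent
    return z

# For b = 3, lam = 1 the analytic minimizer is soft_threshold(3, 1) = 2.
sol = admm_scalar(3.0, 1.0)
```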

Chart of the number of search results per year

Click on the chart to filter the results by publication year