Search results for: strongly convex function
Number of results: 1,435,527
We propose a new convex optimization formulation for the Fisher market problem with linear utilities. Like the Eisenberg-Gale formulation, the set of feasible points is a polyhedral convex set and the cost function is non-linear; unlike it, however, the optimum is always attained at a vertex of this polytope. The convex cost function depends only on the initial endowments of the buyers. Thi...
In this paper, we generalize the proximal point algorithm to complete CAT(0) spaces and show that the sequence generated by the proximal point algorithm $w$-converges to a zero of the maximal monotone operator. Also, we prove that if $f: X \rightarrow\, ]-\infty, +\infty]$ is a proper, convex and lower semicontinuous function on the complete CAT(0) space $X$, then the proximal...
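The abstract above works in CAT(0) spaces; as a minimal illustration of the underlying iteration, here is the familiar Euclidean special case, where the resolvent (prox map) of $f(x) = |x|$ has the closed form of soft-thresholding. The function names and parameter values are our own illustrative choices, not the paper's.

```python
import numpy as np

def soft_threshold(x, lam):
    # prox_{lam*|.|}(x) = argmin_y |y| + (1/(2*lam)) * (y - x)**2
    return np.sign(x) * max(abs(x) - lam, 0.0)

def proximal_point(x0, lam=0.5, iters=50):
    # Euclidean proximal point iteration: x_{k+1} = prox_{lam f}(x_k).
    # For f(x) = |x| every iterate moves lam closer to the minimizer 0.
    x = x0
    for _ in range(iters):
        x = soft_threshold(x, lam)
    return x
```

Starting from any real number, the iterates reach the unique zero of the subdifferential of $|x|$, i.e. the minimizer $x = 0$, in finitely many steps.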
We introduce the regularized Newton method (rnm) for unconstrained convex optimization. For any convex function with a bounded optimal set, the rnm generates a sequence that converges to the optimal set from any starting point. Moreover, the rnm requires neither strong convexity nor smoothness properties in the entire space. If the function is strongly convex and smooth enough in the neighborho...
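As a hedged sketch of the idea (not the paper's exact rnm rule, which the abstract does not fully specify), a regularized Newton step adds a multiple of the identity to the Hessian so the linear system stays solvable even where the Hessian is singular. The fixed regularization `mu` and the quadratic test problem below are our own assumptions.

```python
import numpy as np

def regularized_newton(grad, hess, x0, mu=1e-3, iters=50):
    # x_{k+1} = x_k - (H(x_k) + mu*I)^{-1} grad(x_k)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        H = hess(x) + mu * np.eye(len(x))
        x = x - np.linalg.solve(H, grad(x))
    return x

# Quadratic test: f(x) = 0.5*x@A@x - b@x, whose minimizer is A^{-1} b = [1, 1]
A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([2.0, 4.0])
x_star = regularized_newton(lambda x: A @ x - b, lambda x: A, [0.0, 0.0])
```

On this strongly convex quadratic the error contracts by a factor of roughly `mu / (lambda_min + mu)` per step, so a handful of iterations already suffice.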
We study multivariate entire functions and polynomials with non-negative coefficients. A class of Strongly Log-Concave entire functions, generalizing Minkowski volume polynomials, is introduced: an entire function $f$ in $m$ variables is called Strongly Log-Concave if the function $(\partial_{x_1})^{c_1} \cdots (\partial_{x_m})^{c_m} f$ is either zero or $\log\big((\partial_{x_1})^{c_1} \cdots (\partial_{x_m})^{c_m} f\big)$ is concave on $\mathbb{R}^m_+$. We start with yet another point of ...
In this work we introduce the concept of an Underestimate Sequence (UES), which is a natural extension of Nesterov’s estimate sequence [16]. Our definition of a UES utilizes three sequences, one of which is a lower bound (or under-estimator) of the objective function. The question of how to construct an appropriate sequence of lower bounds is also addressed, and we present lower bounds for stro...
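The building block of such a lower-bound sequence for a $\mu$-strongly convex $f$ is the quadratic under-estimator $\ell(y) = f(x) + f'(x)(y - x) + \tfrac{\mu}{2}(y - x)^2 \le f(y)$. The following numerical check of that inequality is our own illustration; the names and the test function are not from the paper.

```python
import numpy as np

MU = 2.0                       # f(x) = x**2 is MU-strongly convex with MU = 2

def f(x):
    return x ** 2

def quadratic_lower_bound(y, x):
    # l(y) = f(x) + f'(x)*(y - x) + (MU/2)*(y - x)**2  <=  f(y) for all y
    return f(x) + 2.0 * x * (y - x) + 0.5 * MU * (y - x) ** 2

ys = np.linspace(-3.0, 3.0, 61)
assert all(quadratic_lower_bound(y, 1.0) <= f(y) + 1e-12 for y in ys)
```

For this particular $f$ the bound is tight everywhere (it equals $y^2$); for a general $\mu$-strongly convex function it is a strict global under-estimator away from $x$.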
We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we study the direct search method (of directional type). Despite relevant research activity spanning several decades, until recently no complexity guarantees (bounds on the number of function evaluations needed to find a satisfactory point) for methods of this type were establis...
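A generic directional direct-search step polls a set of directions, accepts a trial point on sufficient decrease, and halves the step size on an unsuccessful poll. The sketch below uses the coordinate directions and a simple forcing function $c\,\alpha^2$; it is an illustration of the method class, not the specific variant analyzed in the paper.

```python
import numpy as np

def direct_search(f, x0, alpha=1.0, c=1e-4, tol=1e-6, max_iters=1000):
    # Poll the +/- coordinate directions; accept a move on sufficient
    # decrease f(x + alpha*d) < f(x) - c*alpha**2, else halve the step.
    x = np.asarray(x0, dtype=float)
    dirs = [s * e for e in np.eye(len(x)) for s in (1.0, -1.0)]
    for _ in range(max_iters):
        if alpha < tol:
            break                  # step size resolved: stop
        fx = f(x)
        for d in dirs:
            if f(x + alpha * d) < fx - c * alpha ** 2:
                x = x + alpha * d  # successful poll: keep the step size
                break
        else:
            alpha *= 0.5           # unsuccessful poll: shrink the step
    return x

x_star = direct_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [0.0, 0.0])
```

Note that no derivatives are evaluated anywhere: only function values drive both acceptance and step-size control, which is exactly what the complexity bounds in this line of work count.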
We study distributed composite optimization over networks: agents minimize the sum of smooth (strongly) convex functions (the agents' sum-utility) plus a nonsmooth (extended-valued) one. We propose a general unified algorithmic framework for such a class of problems and provide a convergence analysis leveraging the theory of operator splitting. Distinguishing features of our scheme are: (i) When each agent's functions...
We consider a multi-agent framework for distributed optimization where each agent has access to a local smooth strongly convex function, and the collective goal is to achieve consensus on the parameters that minimize the sum of the agents' functions. We propose an algorithm wherein each agent operates asynchronously and independently of the other agents. When the functions are strongly convex with Lipschitz-continuous gradients, we show it...
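As a much simpler synchronous relative of the asynchronous scheme in the abstract, here is decentralized gradient descent with a constant step on a 4-agent ring: each agent mixes its iterate with its neighbors' via a doubly stochastic matrix, then takes a local gradient step on its own strongly convex cost. The topology, mixing weights, and quadratic local costs are our own illustrative assumptions.

```python
import numpy as np

m, lr = 4, 0.1
targets = np.array([0.0, 1.0, 2.0, 3.0])   # agent i holds f_i(x) = (x - t_i)^2 / 2
W = np.array([[0.50, 0.25, 0.00, 0.25],    # doubly stochastic mixing matrix
              [0.25, 0.50, 0.25, 0.00],    # for a 4-node ring
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.zeros(m)
for _ in range(500):
    x = W @ x - lr * (x - targets)         # mix, then local gradient step
```

With a constant step size the network average converges exactly to the minimizer of the sum, `targets.mean() = 1.5`, while the individual iterates settle in a neighborhood of it; driving them to exact consensus is what motivates the more refined (and here, asynchronous) schemes.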
We develop primal-dual algorithms for distributed training of linear models in the Spark framework. We present the ProxCoCoA+ method which represents a generalization of the CoCoA+ algorithm and extends it to the case of general strongly convex regularizers. A primal-dual convergence rate analysis is provided along with an experimental evaluation of the algorithm on the problem of elastic net r...
In this paper, we study the optimal convergence rate for distributed convex optimization problems in networks. We model the communication restrictions imposed by the network as a set of affine constraints and provide optimal complexity bounds for four different setups, namely: the function $F(\mathbf{x}) \triangleq \sum_{i=1}^{m} f_i(\mathbf{x})$ is strongly convex and smooth, either strongly convex or smooth...