Search results for: nonlinear programming generalized lambda

Number of results: 707,978

2015
M. Jaiswal S. K. Mishra Vasile Preda

This paper deals with a nonlinear multiobjective semi-infinite programming problem involving generalized (C, α, ρ, d)-convex functions. We obtain sufficient optimality conditions and formulate the Mond-Weir-type dual model for the nonlinear multiobjective semi-infinite programming problem. We also establish weak, strong, and strict converse duality theorems relating the problem and its dual.

2000
Sangbum Lee Ignacio E. Grossmann

Generalized Disjunctive Programming (GDP) has been introduced recently as an alternative model to MINLP for representing discrete/continuous optimization problems. The basic idea of GDP consists of representing discrete decisions in the continuous space with disjunctions, and constraints in the discrete space with logic propositions. In this paper, we describe a new convex nonlinear relaxation ...
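As an illustrative sketch (not taken from the paper), the standard big-M reformulation that GDP relaxations aim to improve upon encodes a two-term disjunction such as "if y = 1 then x ≤ 2, else x ≥ 5" with binary-activated inequalities; the constant M is an assumed bound:

```python
M = 100.0  # big-M constant; assumes |x| is bounded well below M

def bigm_feasible(x, y):
    """Check the big-M encoding of the disjunction
    [y = 1 => x <= 2]  OR  [y = 0 => x >= 5]  for binary y in {0, 1}."""
    # When y = 1 the first constraint is active and the second is relaxed;
    # when y = 0 the roles are reversed.
    c1 = x - 2 <= M * (1 - y)
    c2 = 5 - x <= M * y
    return c1 and c2
```

The weakness motivating GDP-specific relaxations is that for large M the continuous relaxation of this encoding is very loose.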

2015
Saeed Ghadimi Guanghui Lan Hongchao Zhang

In this paper, we present a generic framework to extend existing uniformly optimal convex programming algorithms to solve more general nonlinear, possibly nonconvex, optimization problems. The basic idea is to incorporate a local search step (gradient descent or Quasi-Newton iteration) into these uniformly optimal convex programming methods, and then enforce a monotone decreasing property of th...
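A minimal sketch of the idea of pairing an accelerated method with a local search step that enforces monotone decrease (this is an illustration of the general principle, not the authors' exact scheme; the function names and the scalar setting are assumptions):

```python
def accelerated_with_monotone_guard(f, grad, x0, L, steps=200):
    """Nesterov-style accelerated gradient with a plain gradient-descent
    safeguard: at each iteration keep whichever candidate has smaller f,
    so f(x_k) is non-increasing even if acceleration overshoots."""
    x = y = x0
    t = 1.0
    for _ in range(steps):
        x_acc = y - grad(y) / L   # accelerated (momentum) candidate
        x_gd = x - grad(x) / L    # local-search candidate; guarantees descent
        x_new = x_acc if f(x_acc) <= f(x_gd) else x_gd
        t_new = 0.5 * (1 + (1 + 4 * t * t) ** 0.5)
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

Because the gradient-descent candidate with step 1/L never increases f, the chosen iterate inherits the monotone decreasing property.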

2011
V. Jeyakumar G. Li G. M. Lee

In this paper we present a robust duality theory for generalized convex programming problems in the face of data uncertainty within the framework of robust optimization. We establish robust strong duality for an uncertain nonlinear programming primal problem and its uncertain Lagrangian dual by showing strong duality between the deterministic counterparts: robust counterpart of the primal model...

Journal: Journal of Mathematical Modeling
El Amir Djeffal, Department of Mathematics, University of Batna 2, Batna, Algeria; Lakhdar Djeffal, Department of Mathematics, University of Batna 2, Batna, Algeria

In this paper, we aim to obtain some new complexity results for solving the semidefinite optimization (SDO) problem by interior-point methods (IPMs). We define a new proximity function for SDO via a new kernel function. Furthermore, we formulate an algorithm for a primal-dual interior-point method (IPM) for SDO using this proximity function, give its complexity analysis, and then we sho...

Journal: Journal of Industrial Engineering, International 2008
M. S. Sabbagh M. Roshanjooy

Presented here is a generalization of the implicit enumeration algorithm that can be applied when the objective function is being maximized and can be rewritten as the difference of two non-decreasing functions. Also developed is a computational algorithm, named linear speedup, that uses whatever explicit linear constraints are present to speed up the search for a solution. The method is easy to u...

Journal: Journal of Industrial Engineering, International 2011
M. B. Aryanezhad H. Malekly M. Karimi-Nasab

In this paper, the portfolio selection problem is considered, where fuzziness and randomness appear simultaneously in the optimization process. Since return and dividend play an important role in such problems, a new model is developed in a mixed environment by incorporating fuzzy random variables into a multi-objective nonlinear model. Then a novel interactive approach is proposed to determine the pref...

Journal: SIAM Review 2003
Aslihan Altay-Salih Mustafa Ç. Pinar Sven Leyffer

This paper proposes a constrained nonlinear programming view of generalized autoregressive conditional heteroskedasticity (GARCH) volatility estimation models in financial econometrics. These models are usually presented to the reader as unconstrained optimization models with recursive terms in the literature, whereas they actually fall into the domain of nonconvex nonlinear programming. Our re...
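To illustrate why GARCH estimation is naturally a constrained problem (a generic GARCH(1,1) sketch, not the models or code of the paper): the Gaussian likelihood is only well defined when the recursion parameters satisfy positivity and covariance-stationarity constraints, which unconstrained formulations leave implicit.

```python
import math

def garch11_nll(params, returns):
    """Negative Gaussian log-likelihood of a GARCH(1,1) model with
    recursion sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}."""
    omega, alpha, beta = params
    # The constraints that place this in constrained NLP territory:
    # positivity of the variance recursion and covariance stationarity.
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return float("inf")
    sigma2 = omega / (1 - alpha - beta)  # start from unconditional variance
    nll = 0.0
    for r in returns:
        nll += 0.5 * (math.log(2 * math.pi) + math.log(sigma2) + r * r / sigma2)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return nll
```

Minimizing this with the inequalities handled explicitly (rather than by returning infinity) is the constrained-programming view the abstract describes.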

Journal: Math. Oper. Res. 1994
Daniel Ralph

A natural damping of Newton's method for nonsmooth equations is presented. This damping, via the path search instead of the traditional line search, enlarges the domain of convergence of Newton's method and therefore is said to be globally convergent. Convergence behavior is like that of line search damped Newton's method for smooth equations, including Q-quadratic convergence rates under appro...
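For intuition, here is the smooth analogue the abstract contrasts against: a line-search damped Newton method for a scalar equation f(x) = 0 (an illustrative sketch; Ralph's method replaces the backtracking line search below with a search along a piecewise-linear path suited to nonsmooth equations):

```python
def damped_newton(f, df, x0, tol=1e-10, max_iter=100):
    """Line-search damped Newton for a scalar smooth equation f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = -fx / df(x)  # Newton direction
        # Damping: backtrack until the residual satisfies a
        # sufficient-decrease condition |f(x + t d)| <= (1 - t/2) |f(x)|.
        t = 1.0
        while abs(f(x + t * d)) > (1 - 0.5 * t) * abs(fx):
            t *= 0.5
            if t < 1e-12:
                break
        x += t * d
    return x
```

Near a root the full step t = 1 is accepted and the familiar Q-quadratic local convergence of Newton's method is recovered, which is the behavior the abstract says carries over to the path-search variant.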

Journal: CoRR 2016
Michael A. Bukatin Steve Matthews Andrey Radul

Dataflow matrix machines are self-referential generalized recurrent neural nets. The self-referential mechanism is provided via a stream of matrices defining the connectivity and weights of the network in question. A natural question is: what should play the role of untyped lambda-calculus for this programming architecture? The proposed answer is a discipline of programming with only one kind o...
