Search results for: globally convergence

Number of results: 160982

Journal: Neural Networks: the official journal of the International Neural Network Society 2002
Eric de Bodt Marie Cottrell Michel Verleysen

Results of neural network learning are always subject to some variability, due to the sensitivity to initial conditions, to convergence to local minima, and, sometimes more dramatically, to sampling variability. This paper presents a set of tools designed to assess the reliability of the results of self-organizing maps (SOM), i.e. to test on a statistical basis the confidence we can have on the...

2017
Philipp Hungerländer Franz Rendl

The minimization of a convex quadratic function under bound constraints is a fundamental building block for solving more complicated optimization problems. The active-set method introduced by Bergounioux et al. [1, 2] has turned out to be a powerful, fast and competitive approach for this problem. Hintermüller et al. [15] provide a theoretical explanation of its efficiency by interpreting it as...

Journal: SIAM Journal on Optimization 2000
Nobuo Yamashita Masao Fukushima

In this paper, we consider a proximal point algorithm (PPA) for solving monotone nonlinear complementarity problems (NCPs). PPA generates a sequence by solving subproblems that are regularizations of the original problem. It is known that PPA has global and superlinear convergence properties under appropriate criteria for approximate solutions of subproblems. However, it is not always easy to sol...
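The regularization idea described in this abstract can be sketched on a toy problem. The proximal point iteration replaces the original problem with a sequence of strongly convex subproblems; for a simple quadratic the subproblem has a closed-form solution, which makes the contraction visible. This is an illustrative sketch only (function and parameter names are my own), not the method or criteria from the cited paper:

```python
def prox_point_quadratic(x0, lam=1.0, iters=50):
    """Proximal point iteration for the toy objective f(x) = (x - 3)**2.

    Each subproblem  min_x f(x) + (1 / (2 * lam)) * (x - x_k)**2
    is strongly convex; setting its derivative to zero gives the
    closed-form update below. The iterates contract toward x* = 3.
    """
    x = x0
    for _ in range(iters):
        # 2*(x - 3) + (x - x_k)/lam = 0  =>  x = (6*lam + x_k) / (2*lam + 1)
        x = (6 * lam + x) / (2 * lam + 1)
    return x
```

With `lam = 1` the update is `x -> 2 + x/3`, a contraction with factor 1/3, so convergence to the minimizer 3 is geometric regardless of the starting point.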

2016
Huikang Liu Weijie Wu Anthony Man-Cho So

A fundamental class of matrix optimization problems that arise in many areas of science and engineering is that of quadratic optimization with orthogonality constraints. Such problems can be solved using line-search methods on the Stiefel manifold, which are known to converge globally under mild conditions. To determine the convergence rate of these methods, we give an explicit estimate of the ...

Journal: Math. Program. 1993
Jorge Nocedal Ya-Xiang Yuan

We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in additi...
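The self-scaling idea can be illustrated on a small strictly convex quadratic: before each inverse-BFGS update, the current inverse-Hessian approximation is multiplied by the Oren-Luenberger factor gamma = s'y / (y'Hy). The sketch below is my own minimal rendering of that scheme with exact line search (valid for quadratics), not the analysis or implementation from Nocedal and Yuan:

```python
import numpy as np

def self_scaling_bfgs(A, b, x0, iters=20):
    """Self-scaling BFGS sketch for f(x) = 0.5 x'Ax - b'x, A SPD.

    Uses the exact line-search step (available in closed form for a
    quadratic) and scales H by gamma = s'y / (y'Hy) before the
    standard inverse-BFGS update.
    """
    n = len(x0)
    H = np.eye(n)                       # inverse-Hessian approximation
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                       # gradient of the quadratic
    for _ in range(iters):
        d = -H @ g
        alpha = -(g @ d) / (d @ A @ d)  # exact line search on a quadratic
        s = alpha * d
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        g = g_new
        if np.linalg.norm(g) < 1e-12:   # converged; skip degenerate update
            break
        gamma = (s @ y) / (y @ H @ y)   # Oren-Luenberger scaling factor
        rho = 1.0 / (y @ s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = gamma * (V @ H @ V.T) + rho * np.outer(s, s)
    return x
```

On an n-dimensional strictly convex quadratic with exact line searches, BFGS-type updates terminate in at most n steps, so the loop above reaches the minimizer A⁻¹b almost immediately.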

Journal: Systems & Control Letters 2012
Paolo Frasca

This note studies a network of agents having continuous-time dynamics with quantized interactions and time-varying directed topology. Due to the discontinuity of the dynamics, solutions of the resulting ODE system are intended in the sense of Krasovskii. A limit connectivity graph is defined, which encodes persistent interactions between nodes: if such graph has a globally reachable node, Kraso...

2014
A. Chiche

This paper analyses the behavior of the augmented Lagrangian algorithm when it deals with an infeasible convex quadratic optimization problem. It is shown that the algorithm finds a point that, on the one hand, satisfies the constraints shifted by the smallest possible shift that makes them feasible and, on the other hand, minimizes the objective on the corresponding shifted constrained set. Th...

Journal: JSW 2013
Haodong Yu

We propose a semismooth active-set Newton algorithm for solving nonlinear complementarity problems with degenerate solutions. The method uses an active-set technique to identify the degenerate set. At each iteration, the search direction is obtained from two reduced linear systems. Instead of employing gradient steps as adjustments to guarantee sufficient reduction of the merit fu...

2000
Milan Keser Kuntal Joardar

The Levenberg-Marquardt (LM) minimization algorithm commonly employed in MOSFET model parameter extraction has several known deficiencies, such as poor convergence characteristics without a good initial guess, low likelihood of convergence to the globally optimal solution, and difficulty with simultaneous multiobjective optimizations. Furthermore, conventional tools require an expert user with ...
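The sensitivity to the initial guess mentioned in this abstract comes from LM's damped Gauss-Newton step, which interpolates between Newton-like and gradient-like behavior via a damping parameter. Below is a minimal textbook-style sketch with the classic accept/reject damping update, applied to a hypothetical exponential fit (the data, the multiplicative 0.5/2.0 damping schedule, and all names are my own illustrative choices, not the extraction tool from the abstract):

```python
import numpy as np

def levenberg_marquardt(residual, jac, x0, iters=100, lam=1e-2):
    """Basic Levenberg-Marquardt sketch with multiplicative damping.

    Each step solves the damped normal equations
        (J'J + lam*I) dx = -J'r
    and accepts the step only if the sum of squared residuals decreases,
    shrinking lam on success (toward Gauss-Newton) and growing it on
    failure (toward small gradient steps).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jac(x)
        A = J.T @ J + lam * np.eye(len(x))
        dx = np.linalg.solve(A, -J.T @ r)
        if np.sum(residual(x + dx) ** 2) < np.sum(r ** 2):
            x = x + dx
            lam *= 0.5      # good step: behave more like Gauss-Newton
        else:
            lam *= 2.0      # bad step: damp harder, shorten the step
    return x

# Hypothetical demo: fit y = a * exp(b * t) to noiseless data with
# true parameters (a, b) = (2, 0.5), starting from (1, 0).
t = np.linspace(0.0, 1.0, 10)
y = 2.0 * np.exp(0.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p_hat = levenberg_marquardt(res, jac, [1.0, 0.0])
```

Because this is a zero-residual problem, accepted steps eventually become pure Gauss-Newton steps and the fit converges rapidly; a poor starting point or an unlucky damping schedule is exactly where the convergence deficiencies noted in the abstract show up.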

Journal: Comp. Opt. and Appl. 2011
Dexuan Xie Mazen G. Zarrouk

This paper gives a general convergence analysis of the truncated incomplete Hessian Newton method (T-IHN). It shows that T-IHN is globally convergent even with an indefinite incomplete Hessian matrix or an indefinite preconditioner, which may happen in practice. It also proves that when the T-IHN iterates are close enough to a minimum point, T-IHN has a Q-linear rate of convergence, and an admi...
