Search results for: hybrid steepest descent method

Number of results: 1,803,458

2008
Lu-Chuan Ceng Chinsan Lee Jen-Chih Yao

Assume that F is a nonlinear operator on a real Hilbert space H which is strongly monotone and Lipschitzian with constants η > 0 and κ > 0, respectively, on a nonempty closed convex subset C of H. Assume also that C is the intersection of the fixed point sets of a finite number of nonexpansive mappings on H. We develop an implicit hybrid steepest-descent method which generates an iterative seq...
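The setting above can be illustrated with a minimal numeric sketch. The code below runs the classical *explicit* hybrid steepest-descent iteration x_{n+1} = T(x_n) − λ_n μ F(T(x_n)) (a well-known relative of the implicit scheme this abstract develops, not the paper's own method); the operator F(x) = x − b, the box projection T, and all parameter values are my own toy choices.

```python
import numpy as np

# Toy instance (my assumptions, not the paper's): F(x) = x - b is 1-Lipschitzian
# and 1-strongly monotone (eta = kappa = 1); T = metric projection onto the box
# [0,1]^2 is nonexpansive, and Fix(T) is the box itself.

def T(x):
    return np.clip(x, 0.0, 1.0)

b = np.array([2.0, 0.5])

def F(x):
    return x - b

mu = 1.0                        # must satisfy 0 < mu < 2*eta/kappa**2 = 2
x = np.zeros(2)
for n in range(2000):
    lam = 1.0 / (n + 1)         # lam_n -> 0 with sum lam_n = infinity
    y = T(x)
    x = y - lam * mu * F(y)     # explicit hybrid steepest-descent step

# The iterates approach the unique solution of the variational inequality
# <F(x*), v - x*> >= 0 for all v in Fix(T); for this F that is the
# projection of b onto the box.
print(T(x))
```

Here the step sizes λ_n = 1/(n+1) satisfy the standard conditions (vanishing, non-summable) under which this iteration is known to converge strongly to the solution of the variational inequality over Fix(T).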

2008
U. Ascher

The integration to steady state of many initial value ODEs and PDEs using the forward Euler method can alternatively be considered as gradient descent for an associated minimization problem. Greedy algorithms such as steepest descent for determining the step size are as slow to reach steady state as is forward Euler integration with the best uniform step size. But other, much faster methods usi...

Journal: Computers & Mathematics with Applications, 1990
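The abstract's two opening claims can be checked on a small quadratic model problem (my own assumed setup, not the paper's experiments): forward Euler on the gradient flow coincides step-for-step with gradient descent, and the greedy exact line-search step of steepest descent decreases the objective at least as much as any fixed step.

```python
import numpy as np

# Assumed model problem: f(x) = 0.5 * x @ A @ x with A symmetric positive
# definite, whose gradient flow x' = -grad f(x) = -A x has the minimizer
# x* = 0 as its steady state.

A = np.diag([1.0, 10.0, 100.0])     # eigenvalues 1..100, condition number 100
x = np.array([1.0, 1.0, 1.0])

h = 2.0 / (1.0 + 100.0)             # best uniform step: minimizes max |1 - h*lam|
euler = x - h * (A @ x)             # one forward Euler step on x' = -A x
gd    = x - h * (A @ x)             # one gradient descent step on f
assert np.allclose(euler, gd)       # the two updates are identical

# Greedy steepest descent instead takes the exact line-search step
#   h_k = (g @ g) / (g @ A @ g),   g = grad f(x_k) = A x_k,
# which by optimality of the line search does at least as well as any fixed h:
f = lambda z: 0.5 * z @ A @ z
g = A @ x
h_greedy = (g @ g) / (g @ A @ g)
assert f(x - h_greedy * g) <= f(x - h * g) + 1e-12
```

The abstract's point is that this greedy choice does not help asymptotically: its worst-case contraction factor matches that of the best uniform step, (κ−1)/(κ+1) for condition number κ.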

1998
David Helmbold

AdaBoost is a popular and effective leveraging procedure for improving the hypotheses generated by weak learning algorithms. AdaBoost and many other leveraging algorithms can be viewed as performing a constrained gradient descent over a potential function. At each iteration, the distribution over the sample given to the weak learner is the direction of steepest descent. We introduce a new leverag...
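The gradient-descent view mentioned above can be made concrete: with potential Φ = Σᵢ exp(−mᵢ), where mᵢ = yᵢF(xᵢ) are the margins of the combined hypothesis, the distribution AdaBoost hands the weak learner equals the normalized negative gradient of Φ with respect to the margins. The sketch below checks this identity on a toy dataset with decision stumps; the data and the stump learner are my own illustrative choices, not the paper's.

```python
import numpy as np

X = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([1., 1., 1., -1., -1., 1.])

def stump_predictions():
    """Predictions of all stumps h(x) = s * sign(x - theta) on this grid."""
    for theta in np.arange(-0.5, 6.0, 1.0):
        for s in (1.0, -1.0):
            yield s * np.where(X > theta, 1.0, -1.0)

D = np.full(len(y), 1.0 / len(y))   # distribution over examples
m = np.zeros(len(y))                # margins m_i = y_i * F_t(x_i)
losses = [np.sum(np.exp(-m))]
for t in range(3):
    # weak learner: stump with smallest weighted error under D
    pred = min(stump_predictions(), key=lambda p: np.sum(D * (p != y)))
    eps = np.sum(D * (pred != y))
    alpha = 0.5 * np.log((1.0 - eps) / eps)
    m = m + alpha * y * pred                  # margin update
    D = D * np.exp(-alpha * y * pred)         # multiplicative reweighting
    D = D / np.sum(D)
    losses.append(np.sum(np.exp(-m)))
    # AdaBoost's distribution == normalized steepest descent direction on Phi
    assert np.allclose(D, np.exp(-m) / np.sum(np.exp(-m)))

print(losses)   # the exponential potential shrinks every round
```

The assertion inside the loop is exactly the "steepest descent direction" claim: the multiplicative weight update telescopes into D ∝ exp(−m), the (normalized) negative gradient of the exponential potential.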

2012
Thanyarat Jitpeera Nopparat Wairojjana Poom Kumam

An explicit hierarchical fixed point algorithm is introduced to solve the monotone variational inequality over the fixed point set of a nonexpansive mapping. This paper discusses a monotone variational inequality with variational constraint and convex optimization problems over the fixed point set of a nonexpansive mapping. The strong convergence for the proposed algorithm to the solution is gu...

2000
Hiroshi HASEGAWA Isao YAMADA Kohichi SAKANIWA

In this paper, we propose a projection-based design of near-perfect-reconstruction QMF banks. An advantage of this method is that additional design specifications are easily implemented by defining new convex sets. In applying the convex projection technique, the main difficulty is approximating the design specifications by closed convex sets. In this paper, introducing a notion of Magnitude...
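The convex-projection (POCS) machinery behind such designs can be sketched on a toy feasibility problem of my own (not the paper's QMF specifications): alternating metric projections onto two closed convex sets converge to a point in their intersection when it is nonempty.

```python
import numpy as np

def project_ball(x, c, r):
    """Metric projection onto the closed ball ||x - c|| <= r."""
    d = np.linalg.norm(x - c)
    return x if d <= r else c + r * (x - c) / d

def project_halfspace(x, a, b):
    """Metric projection onto the half-space a @ x <= b."""
    v = a @ x - b
    return x if v <= 0 else x - v * a / (a @ a)

# Two assumed "design specifications" encoded as convex sets:
c, r = np.array([0.0, 0.0]), 1.0        # unit ball
a, b = np.array([1.0, 1.0]), -0.5       # half-space x1 + x2 <= -0.5

x = np.array([3.0, -2.0])
for _ in range(500):
    x = project_halfspace(project_ball(x, c, r), a, b)

# x now (approximately) satisfies both constraints at once
assert np.linalg.norm(x - c) <= r + 1e-6
assert a @ x <= b + 1e-6
```

In the paper's setting the sets encode filter-bank design specifications; the abstract's point is that each new specification just adds one more closed convex set (and hence one more projection) to this scheme.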

2012
Tomasz Piotrowski Isao Yamada

In this paper we consider the problem of efficient computation of the stochastic MV-PURE estimator which is a reduced-rank estimator designed for robust linear estimation in ill-conditioned inverse problems. Our motivation for this result stems from the fact that the reduced-rank estimation by the stochastic MV-PURE estimator, while avoiding the problem of regularization parameter selection app...

2007
Yanrong Yu Rudong Chen Yeol Je Cho

Let H be a real Hilbert space and let C be a nonempty closed convex subset of H. Let F : H → H be an operator such that, for some constants k, η > 0, F is k-Lipschitzian and η-strongly monotone on C; that is, F satisfies the following inequalities: ‖Fx − Fy‖ ≤ k‖x − y‖ and 〈Fx − Fy, x − y〉 ≥ η‖x − y‖² for all x, y ∈ C, respectively. Recall that T is nonexpansive if ‖Tx − Ty‖ ≤ ‖x − y‖ for all x, y ∈ H. We ...

Journal: Bulletin of the London Mathematical Society, 2006
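The two defining inequalities in the abstract above are easy to check numerically for a concrete operator. The linear example below is my own illustration (not one from the paper): for F(x) = Ax with A symmetric positive definite, F is k-Lipschitzian with k = λ_max(A) and η-strongly monotone with η = λ_min(A).

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # symmetric positive definite
eigs = np.linalg.eigvalsh(A)        # ascending eigenvalues
eta, k = eigs[0], eigs[-1]

F = lambda x: A @ x
for _ in range(1000):
    x, y = rng.standard_normal(2), rng.standard_normal(2)
    d = x - y
    # ||Fx - Fy|| <= k ||x - y||          (k-Lipschitzian)
    assert np.linalg.norm(F(x) - F(y)) <= k * np.linalg.norm(d) + 1e-9
    # <Fx - Fy, x - y> >= eta ||x - y||^2 (eta-strongly monotone)
    assert (F(x) - F(y)) @ d >= eta * (d @ d) - 1e-9
```

Both bounds follow from the spectral theorem for symmetric A, so the random sampling here is just a sanity check of the definitions.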
