Search results for: stepsize
Number of results: 879
In this paper, we work out a detailed mathematical analysis for a new learning algorithm termed Cascade Error Projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters. Furthermore, the CEP learning algorithm operates on only one layer, whereas the other set of weights can be calc...
Unconditional Convergence of Some Crank-Nicolson LOD Methods for Initial-Boundary Value Problems
In this paper convergence properties are discussed for some locally one-dimensional (LOD) splitting methods applied to linear parabolic initial-boundary value problems. We shall consider unconditional convergence, where both the stepsize in time and the meshwidth in space tend to zero, independently of each other.
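The abstract does not reproduce the scheme itself; the minimal sketch below shows one common LOD splitting with Crank-Nicolson half-steps for the two-dimensional heat equation u_t = u_xx + u_yy, so that the time stepsize tau and the spatial meshwidth h enter independently. All function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def lod_crank_nicolson_step(U, tau, h):
    """One LOD (locally one-dimensional) step for u_t = u_xx + u_yy.

    Each half-step is Crank-Nicolson in a single space direction:
        (I - tau/2 A) U*     = (I + tau/2 A) U      (x-direction)
        (I - tau/2 A) U_next = (I + tau/2 A) U*     (y-direction)
    U holds interior grid values with homogeneous Dirichlet boundaries.
    """
    n = U.shape[0]
    # 1D second-difference matrix with Dirichlet boundaries, scaled by 1/h^2
    A = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) / h ** 2
    I = np.eye(n)
    L, R = I - 0.5 * tau * A, I + 0.5 * tau * A
    U = np.linalg.solve(L, R @ U)        # implicit sweep along x
    U = np.linalg.solve(L, R @ U.T).T    # implicit sweep along y
    return U
```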
A new approach to the construction of finite-difference methods is presented. It is shown how the multi-point differentiators can generate regularizing algorithms, with the stepsize h acting as a regularization parameter. The explicitly computable estimation constants are given. Also an iteratively regularized scheme for solving the numerical differentiation problem in the form of a Volterra integral eq...
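As a self-contained illustration of the stepsize playing the role of a regularization parameter (a standard observation, not the multi-point differentiators of the paper), a central difference applied to noisy data has total error of roughly h**2/6 * |f'''| + delta/h for noise level delta, so shrinking h too far makes the estimate worse rather than better:

```python
import numpy as np

def central_difference(f, x, h):
    """Central-difference derivative; truncation error is O(h**2)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Toy experiment with noise level delta: the error first drops with h and then
# grows again as the noise term delta/h dominates, so h acts as a
# regularization parameter chosen from the noise level (illustrative values).
rng = np.random.default_rng(0)
delta = 1e-8
noisy_sin = lambda x: np.sin(x) + delta * rng.standard_normal()
for h in (1e-1, 1e-2, delta ** (1 / 3), 1e-6):
    err = abs(central_difference(noisy_sin, 1.0, h) - np.cos(1.0))
    print(f"h = {h:.1e}   error = {err:.1e}")
```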
We propose a diagonal metric selection for variable metric proximal gradient method (VMPG). The proposed metric better captures the local geometry of the problem and provides improved convergence compared to the standard proximal gradient (PG) methods with Barzilai-Borwein (BB) stepsize selection. Further, we provide convergence guarantees for the proposed method and illustrate its advantages o...
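For reference, below is a minimal sketch of the baseline mentioned above, proximal gradient with a Barzilai-Borwein stepsize, applied to the lasso problem min 0.5*||Ax - b||**2 + lam*||x||_1. This is not the diagonal-metric VMPG of the abstract, and all names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pg_bb_lasso(A, b, lam, iters=200):
    """Proximal gradient for 0.5||Ax-b||^2 + lam||x||_1 with a BB stepsize."""
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2       # safe initial stepsize
    for _ in range(iters):
        x_new = soft_threshold(x - alpha * grad, alpha * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        if s @ y > 1e-12:
            alpha = (s @ s) / (s @ y)             # BB1 stepsize s's / s'y
        x, grad = x_new, grad_new
    return x
```

The BB rule summarizes the curvature seen along the last step in a single scalar stepsize; the diagonal metric of the abstract presumably refines this by allowing a different scaling per coordinate.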
An inversion-free iterative algorithm is presented for solving a nonlinear matrix equation with a stepsize parameter t. The existence of the maximal solution is discussed in detail, and a method for finding it is proposed. Finally, two numerical examples are reported that show the efficiency of the method. Keywords—Inversion-free method, Hermitian positive definite
We show how the Hamiltonian Monte Carlo algorithm can sometimes be speeded up by “splitting” the Hamiltonian in a way that allows much of the movement around the state space to be done at low computational cost. One context where this is possible is when the log density of the distribution of interest (the potential energy function) can be written as the log of a Gaussian density, which is a qu...
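A hedged one-dimensional sketch of this splitting idea: the Gaussian part of the potential is integrated exactly (a rotation in phase space), so the leapfrog stepsize eps only has to resolve the residual term U1. The names U1, U1_grad, sigma and eps are illustrative; this is not the authors' code.

```python
import numpy as np

def split_hmc_step(q, U1, U1_grad, sigma, eps, n_steps, rng):
    """One HMC update for U(q) = q**2 / (2*sigma**2) + U1(q), 1D sketch."""
    p = rng.standard_normal()
    H = lambda q_, p_: q_ ** 2 / (2 * sigma ** 2) + U1(q_) + 0.5 * p_ ** 2
    q_new, p_new = q, p
    for _ in range(n_steps):
        p_new -= 0.5 * eps * U1_grad(q_new)                # half kick from U1
        c, s = np.cos(eps / sigma), np.sin(eps / sigma)    # exact Gaussian flow
        q_new, p_new = (q_new * c + sigma * p_new * s,
                        -q_new / sigma * s + p_new * c)
        p_new -= 0.5 * eps * U1_grad(q_new)                # half kick from U1
    # usual Metropolis correction on the full Hamiltonian
    return q_new if np.log(rng.random()) < H(q, p) - H(q_new, p_new) else q
```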
We consider initial-value problems governed by a system of autonomous ordinary differential equations (ODEs). When the required integration stepsize of the ODE system is very small in comparison to the time domain of interest, the initial-value problem is said to be stiff.
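A self-contained illustration of this definition (the test problem and values below are illustrative, not from the paper): for y' = -1000 (y - cos t), explicit Euler is stable only for stepsizes below roughly 2/1000, far smaller than accuracy alone would require over the interval of interest.

```python
import numpy as np

def explicit_euler(f, y0, t0, t1, h):
    """Fixed-stepsize explicit Euler integration."""
    t, y = t0, y0
    while t < t1:
        y += h * f(t, y)
        t += h
    return y

f = lambda t, y: -1000.0 * (y - np.cos(t))   # stiff scalar test problem
for h in (1e-2, 1.9e-3, 1e-3):
    print(f"h = {h:.1e}   y(0.1) ~ {explicit_euler(f, 1.0, 0.0, 0.1, h):.4g}")
# h = 1e-2 violates the stability bound h < 2/1000 and the solution blows up,
# while the smaller stepsizes stay close to cos(t): the usable stepsize is
# dictated by stability, not accuracy, which is the stiffness described above.
```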
A new theorem for the development and convergence analysis of supervised training algorithms with an adaptive learning rate for each weight is presented. Based on this theoretical result, a strategy is proposed to automatically adapt the search direction, as well as the stepsize length along the resultant search direction. This strategy is applied to some well known local learning algorithms to...
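The abstract does not spell out the adaptation rule; the sketch below shows one classical per-weight scheme of this general kind, a sign-based increase/decrease of each weight's learning rate in the spirit of delta-bar-delta or Rprop. It only illustrates individual adaptive stepsizes and is not necessarily the paper's strategy.

```python
import numpy as np

def train_per_weight_lr(grad_fn, w, lr0=0.01, up=1.2, down=0.5,
                        lr_min=1e-6, lr_max=1.0, iters=500):
    """Gradient training with an individual, adaptive learning rate per weight.

    If a weight's gradient keeps its sign between iterations its local rate is
    increased; if the sign flips (overshoot) the rate is decreased.
    """
    lr = np.full_like(w, lr0)
    prev_grad = np.zeros_like(w)
    for _ in range(iters):
        g = grad_fn(w)
        same_sign = g * prev_grad > 0
        lr = np.clip(np.where(same_sign, lr * up, lr * down), lr_min, lr_max)
        w = w - lr * np.sign(g)       # sign-based update, magnitude set by lr
        prev_grad = g
    return w

# Usage on a toy quadratic with badly scaled coordinates:
grad = lambda w: np.array([2.0 * w[0], 200.0 * w[1]])
print(train_per_weight_lr(grad, np.array([5.0, 5.0])))
```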
We consider the bilinear optimal control of an advection-reaction-diffusion system, where the control arises as the velocity field in the advection term. Such a problem is generally challenging from both theoretical-analysis and algorithmic-design perspectives, mainly because the state variable depends nonlinearly on the control and an additional divergence-free constraint is coupled with the state equation. Mathematically, the proof of exist...
The DI methods for directly solving a system of general higher-order ODEs are discussed. The convergence of the constant stepsize and constant order formulation of the DI methods is proven first, before the convergence of the variable order and stepsize case. 1. INTRODUCTION Many problems in engineering and science can be formulated in terms of such a system. The general system of higher orde...