Search results for: t convergence
Number of results: 811,002
We study a special class of solutions to the 3D Navier-Stokes equations ∂_t u^ν + (u^ν · ∇)u^ν + ∇p = ν∆u^ν, with no-slip boundary condition, on a domain of the form Ω = {(x, y, z) : 0 ≤ z ≤ 1}, dealing with velocity fields of the form u(t, x, y, z) = (v(t, z), w(t, x, z), 0), describing plane-parallel channel flows. We establish results on the convergence u^ν → u as ν → 0, where u solves the associated Euler equat...
In this paper we consider online mirror descent (OMD) algorithms, a class of scalable online learning algorithms that exploit the geometric structure of the data through mirror maps. Necessary and sufficient conditions are presented in terms of the step size sequence {ηt}t for the convergence of an OMD algorithm with respect to the expected Bregman distance induced by the mirror map. The condition is limt→...
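For the simplest mirror map, half the squared Euclidean norm, OMD reduces to online gradient descent, and the role of the step size sequence {ηt}t is easy to see. A minimal sketch (the Euclidean specialization, not the paper's general setting):

```python
import numpy as np

def omd_euclidean(grads, etas):
    """Online mirror descent with the Euclidean mirror map (i.e. online
    gradient descent). `grads` is the per-round gradient sequence and
    `etas` the step size sequence {eta_t}; returns the iterates."""
    x = np.zeros_like(grads[0])
    iterates = []
    for g, eta in zip(grads, etas):
        x = x - eta * g  # the mirror step is the identity for the Euclidean map
        iterates.append(x.copy())
    return iterates
```

With a constant gradient, the iterate moves by the partial sums of the step sizes, so whether {ηt}t is summable directly controls where (and whether) the iterates settle.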
Regular convergence, together with other types of convergence, has been studied since the 1970s for discrete approximations of linear operators. In this paper, we consider the eigenvalue approximation of a compact operator $T$ that can be written as an eigenvalue problem of the holomorphic Fredholm function $F(\eta) = T-\frac{1}{\eta} I$. Focusing on finite element methods (conforming, discontinuous Galerkin, non-conforming, etc.), s...
Let t be a triangle in R^2. We find the Longest Edge (LE) of t, insert n−1 equally-spaced points in the LE, and connect them to the opposite vertex. This yields the generation of n new sub-triangles whose parent is t. Now, continue this process iteratively. Proficient algorithms for mesh refinement using this method are known when n = 2, but less is known when n = 3 and completely unknown when n > 4. ...
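One step of the scheme just described can be sketched directly: find the longest edge, place n−1 equally spaced points on it, and join them to the opposite vertex. A toy implementation of a single refinement step (the names `le_refine` and the coordinate conventions are illustrative, not from the paper):

```python
import math

def le_refine(tri, n):
    """Split triangle `tri` (three 2D points) into n sub-triangles by
    inserting n-1 equally spaced points on its Longest Edge (LE) and
    connecting them to the opposite vertex."""
    a, b, c = tri
    # pick the longest edge (p, q) and the vertex v opposite to it
    edges = [((a, b), c), ((b, c), a), ((c, a), b)]
    (p, q), v = max(edges, key=lambda e: math.dist(*e[0]))
    # n+1 equally spaced points along the LE, endpoints included
    pts = [tuple(p[i] + (q[i] - p[i]) * k / n for i in range(2))
           for k in range(n + 1)]
    return [(pts[k], pts[k + 1], v) for k in range(n)]
```

Iterating `le_refine` over the children reproduces the refinement process; the open questions in the abstract concern the quality and similarity classes of the triangles this iteration generates for larger n.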
Stochastic (on-line) learning can be faster than batch learning. However, at late times, the learning rate must be annealed to remove the noise present in the stochastic weight updates. In this annealing phase, the convergence rate (in mean square) is at best proportional to 1/T, where T is the number of input presentations. An alternative is to increase the batch size to remove the noise. In th...
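The 1/T mean-square rate under annealing is easy to reproduce on a toy problem: run SGD on a one-dimensional quadratic with noisy gradients and a 1/t step size, and the squared error of the weight decays like 1/T. A minimal illustration (not the paper's setup):

```python
import numpy as np

def sgd_annealed(T, eta0=1.0, seed=0):
    """SGD on the quadratic loss E[(w - 1)^2]/2 with unit-variance
    gradient noise and an annealed step size eta_t = eta0 / t.
    Returns the weight after T presentations; its mean-square error
    about the optimum w* = 1 scales like 1/T."""
    rng = np.random.default_rng(seed)
    w = 0.0
    for t in range(1, T + 1):
        g = (w - 1.0) + rng.standard_normal()  # noisy gradient
        w -= (eta0 / t) * g
    return w
```

Averaging the squared error over independent runs at T = 100 and T = 1000 shows roughly the tenfold improvement the 1/T rate predicts.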
The alternating direction method of multipliers (ADMM) is widely used in solving structured convex optimization problems. Despite its success in practice, the convergence properties of the standard ADMM for minimizing the sum of N (N ≥ 3) convex functions with N block variables linked by linear constraints have remained unclear for a very long time. In this paper, we present convergence and...
Fractional partial differential equations with distributed-order fractional derivatives describe some important physical phenomena. In this paper, we propose a local discontinuous Galerkin (LDG) method for the distributed-order time and Riesz space fractional convection-diffusion and Schrödinger type equations. We prove stability and an optimal order of convergence O(h + (∆t)^{θ/2} + θ) for the distr...
In this paper, we design and analyze a new zeroth-order online algorithm, namely, the zeroth-order online alternating direction method of multipliers (ZOO-ADMM), which enjoys the dual advantages of gradient-free operation and of employing the ADMM to accommodate complex structured regularizers. Compared to first-order gradient-based online algorithms, we show that ZOO-ADMM requires √m times ...
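The gradient-free ingredient in algorithms of this kind is a zeroth-order gradient estimator: probe the loss along a random direction and scale the finite difference by that direction. A sketch of the standard two-point Gaussian estimator (the smoothing parameter `mu` and the Gaussian direction are illustrative choices, not necessarily the paper's):

```python
import numpy as np

def zo_grad(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order estimate of grad f(x): evaluate f at x
    and at x + mu*u for a random Gaussian direction u, and scale the
    finite difference by u. Its expectation approaches grad f(x) as
    mu -> 0 for smooth f, at the cost of extra variance."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.standard_normal(np.shape(x))
    return (f(x + mu * u) - f(x)) / mu * u
```

Plugging such an estimator into the ADMM primal update gives a gradient-free scheme; the extra variance of the estimator is the source of the dimension-dependent slowdown factors discussed in abstracts like this one.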
In this paper, we propose two new stochastic variants of the Strictly Contractive Peaceman-Rachford Splitting Method (SCPRSM), called Stochastic SCPRSM (SS-PRSM) and Stochastic Conjugate Gradient SCPRSM (SCG-PRSM), for large-scale optimization problems. The two stochastic PRSM algorithms incorporate the stochastic variance reduced gradient (SVRG) and the conjugate gradient method, respectively. Stochasti...
A fine-grain pipelined architecture for least mean-square (LMS) filtering is developed by employing a stochastic form of look-ahead. The new architecture offers a trade-off between a variable output latency and the adaptation accuracy. Analytical expressions describing the convergence properties are provided. A comparison with previous work indicates that the new architecture has the lea...