Search results for: stochastic gradient descent

Number of results: 258150

Journal: :Journal of Computational Physics 2021

Randomness is ubiquitous in modern engineering. The uncertainty is often modeled as random coefficients in the differential equations that describe the underlying physics. In this work, we propose a two-step framework for numerically solving semilinear elliptic partial differential equations with random coefficients: 1) reformulate the problem as a functional minimization based on the direct method of the calculus of variations; 2) solve it using stochastic gradien...
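
The second step of the framework above can be illustrated on a toy scalar analogue: minimizing a stochastic energy E[a(ω)·x²/2 − x] by SGD, sampling the random coefficient a(ω) at each step. This is a minimal sketch of the general technique, not the paper's PDE solver; the functional, the coefficient distribution, and the step-size schedule are all illustrative assumptions.

```python
import random

def sgd_energy(steps=20000, seed=0):
    # Toy stochastic energy J(x) = E[a * x^2 / 2 - x], with a ~ Uniform(1, 3).
    # The minimizer of the expected energy is x* = 1 / E[a] = 0.5.
    rng = random.Random(seed)
    x = 0.0
    for t in range(1, steps + 1):
        a = rng.uniform(1.0, 3.0)     # sample the random coefficient
        grad = a * x - 1.0            # unbiased stochastic gradient of J
        x -= grad / (t + 10)          # Robbins-Monro diminishing step size
    return x

x = sgd_energy()
```

Each stochastic gradient is unbiased for the gradient of the expected energy, so the iterate converges to the minimizer of E[J] despite never evaluating the expectation exactly.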

Journal: :Machine Learning 2022

Abstract Stochastic gradient descent (SGD) is a widely adopted iterative method for optimizing differentiable objective functions. In this paper, we propose and discuss a novel approach to scale up SGD in applications involving non-convex functions and large datasets. We address the bottleneck problem arising when using both shared and distributed memory. Typically, the former is bounded by limited computation ...
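
The excerpt does not specify the paper's shared/distributed-memory scheme, but the general data-parallel idea it builds on can be sketched as local SGD with periodic parameter averaging: workers run independent gradient steps on their own data shards and synchronize by averaging. The quadratic per-worker objectives and sequential simulation of workers below are illustrative assumptions.

```python
import random

def local_sgd(n_workers=4, rounds=50, local_steps=10, lr=0.1, seed=0):
    # Each "worker" runs gradient steps on its own shard, here a quadratic
    # f_i(x) = 0.5 * (x - c_i)^2; periodic averaging synchronizes them.
    rng = random.Random(seed)
    centers = [rng.uniform(-1, 1) for _ in range(n_workers)]
    target = sum(centers) / n_workers   # minimizer of the average loss
    x = 0.0
    for _ in range(rounds):
        local_iterates = []
        for c in centers:               # workers simulated sequentially
            xi = x
            for _ in range(local_steps):
                xi -= lr * (xi - c)     # local gradient step on shard i
            local_iterates.append(xi)
        x = sum(local_iterates) / n_workers   # averaging (communication) round
    return x, target

x, target = local_sgd()
```

Averaging only every `local_steps` iterations trades a little per-round progress for far less communication, which is the usual motivation for this family of methods.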

Journal: :Photonics 2021

For a high-power slab solid-state laser, high output power and high beam quality are the most important indicators. Adaptive optics systems can significantly improve beam quality by compensating for phase distortions of laser beams. In this paper, we developed an improved algorithm called Adaptive Gradient Estimation Stochastic Parallel Gradient Descent (AGESPGD) for laser beam cleanup. A second-order gradient search p...
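
The baseline that AGESPGD refines, stochastic parallel gradient descent (SPGD), perturbs all actuators of the corrector at once and estimates the gradient of a scalar quality metric from the resulting metric change. Below is a minimal sketch of plain SPGD, not the paper's second-order variant; the quadratic metric standing in for a beam-quality measurement, and all parameter values, are illustrative assumptions.

```python
import random

def spgd(n_act=8, iters=3000, sigma=0.05, gain=0.5, seed=0):
    # SPGD: apply a random parallel perturbation delta to all actuator
    # voltages u, measure the metric at u + delta and u - delta, and step
    # along delta scaled by the observed metric difference.
    rng = random.Random(seed)
    target = [rng.uniform(-1, 1) for _ in range(n_act)]
    u = [0.0] * n_act
    # Toy quality metric: negative squared distance to the ideal correction.
    metric = lambda v: -sum((vi - ti) ** 2 for vi, ti in zip(v, target))
    for _ in range(iters):
        delta = [sigma * (1 if rng.random() < 0.5 else -1)
                 for _ in range(n_act)]
        dj = metric([ui + di for ui, di in zip(u, delta)]) \
           - metric([ui - di for ui, di in zip(u, delta)])
        u = [ui + gain * dj * di for ui, di in zip(u, delta)]
    return metric(u)

final_metric = spgd()
```

Because only a single scalar metric is measured, SPGD needs no wavefront sensor, which is why it is popular for laser cleanup loops.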

Journal: :CoRR 2017
Penghang Yin Minh Pham Adam M. Oberman Stanley Osher

In this paper, we propose an implicit gradient descent algorithm for the classic k-means problem. The implicit gradient step or backward Euler is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed...
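
The mechanics described above, a backward-Euler (implicit) gradient step solved by a stochastic fixed-point iteration whose trajectory average is carried forward, can be sketched on a 1-D quadratic loss instead of k-means. The loss, batch size, and step counts are illustrative assumptions, not the paper's settings.

```python
import random

def implicit_sgd_step(x, data, h=0.5, inner=200, batch=8, rng=None):
    # One backward-Euler step x_next = x - h * grad f(x_next), for
    # f(y) = mean of 0.5 * (y - d_i)^2, solved by a stochastic fixed-point
    # iteration: each inner update uses a mini-batch gradient, and the
    # running average of the trajectory is carried to the next outer step.
    rng = rng or random.Random(0)
    y, avg = x, 0.0
    for t in range(1, inner + 1):
        pts = rng.sample(data, batch)
        g = sum(y - d for d in pts) / batch   # mini-batch gradient at y
        y = x - h * g                         # fixed-point update
        avg += (y - avg) / t                  # running trajectory average
    return avg

rng = random.Random(1)
data = [rng.gauss(3.0, 0.5) for _ in range(100)]
x = 0.0
for _ in range(30):
    x = implicit_sgd_step(x, data, rng=rng)
```

The fixed-point map contracts whenever h times the curvature is below one, and averaging the trajectory damps the mini-batch noise, so the outer iterates settle near the data mean.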

Journal: :CoRR 2017
Fanhua Shang

In this paper, we propose a simple variant of the original stochastic variance reduction gradient (SVRG) [1], which hereafter we refer to as variance reduced stochastic gradient descent (VR-SGD). Different from the choices of the snapshot point and starting point in SVRG and its proximal variant, Prox-SVRG [2], the two vectors of each epoch in VR-SGD are set to the average and last iterate o...
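
For orientation, here is a minimal sketch of the baseline SVRG scheme that VR-SGD modifies, on a toy 1-D least-squares loss; the loss, epoch length, and step size are illustrative assumptions, and the snapshot rule shown is SVRG's, not VR-SGD's.

```python
import random

def svrg(data, epochs=15, m=50, lr=0.1, seed=0):
    # Baseline SVRG on f(w) = mean of 0.5 * (w - d_i)^2.  VR-SGD, the
    # paper's variant, would instead set the snapshot and the starting
    # point of each epoch to the average and the last iterate of the
    # previous epoch, respectively.
    rng = random.Random(seed)
    n = len(data)
    w_snap = 0.0
    for _ in range(epochs):
        mu = sum(w_snap - d for d in data) / n   # full gradient at snapshot
        w = w_snap
        for _ in range(m):
            i = rng.randrange(n)
            # Variance-reduced stochastic gradient:
            g = (w - data[i]) - (w_snap - data[i]) + mu
            w -= lr * g
        w_snap = w                               # SVRG snapshot update
    return w_snap

w = svrg([1.0, 2.0, 3.0, 4.0])
```

The control variate (the snapshot's component gradient) cancels the sampling noise as the iterate approaches the snapshot, which is what permits a constant step size.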

Journal: :Mathematics 2023

In the age of artificial intelligence, finding the best approach to handling huge amounts of data is a tremendously motivating and hard problem. Among machine learning models, stochastic gradient descent (SGD) is not only simple but also very effective. This study provides a detailed analysis of contemporary state-of-the-art deep learning applications, such as natural language processing (NLP), visual processing, and voice and audio ...

Journal: :Lecture Notes in Computer Science 2023

Abstract We study a relaxed version of the column-sampling problem for the Nyström approximation of kernel matrices, where approximations are defined from multisets of landmark points in the ambient space; such multisets are referred to as samples. We consider an unweighted variation of the radial squared-kernel discrepancy (SKD) criterion as a surrogate for the classical criteria used to assess the approximation accuracy; in this setting, we discuss how samples can b...

Journal: :Numerical Mathematics: Theory, Methods and Applications 2019
