Search results for: gradient descent

Number of results: 137892

Journal: Statistics and Computing, 2021

Abstract: Stochastic gradient descent is an optimisation method that combines classical gradient descent with random subsampling within the target functional. In this work, we introduce the stochastic gradient process as a continuous-time representation of stochastic gradient descent: a dynamical system coupled with a Markov process living on a finite state space. The dynamical system (a gradient flow) represents the gradient descent part, while the finite state space represents the random subsampling. Processes of this type are, for instance, u...
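
A minimal sketch of this construction under simplifying assumptions (quadratic sub-functionals, a constant switching rate, and forward-Euler integration of the flow; none of these specifics come from the paper): the iterate follows the gradient flow of one randomly chosen sub-functional, while a Markov jump process re-draws that index at exponentially distributed times.

    # Continuous-time analogue of SGD: follow the gradient flow of one randomly
    # chosen sub-functional f_i, re-drawing i at exponential waiting times.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 2))           # each row defines a sub-functional
    b = rng.standard_normal(10)                #   f_i(x) = 0.5 * (a_i @ x - b_i)^2
    grads = [lambda x, a=a, y=y: (a @ x - y) * a for a, y in zip(A, b)]

    x = np.zeros(2)
    t, T, dt, rate = 0.0, 50.0, 1e-3, 1.0      # horizon, Euler step, switching rate
    i = rng.integers(len(grads))               # state of the finite-space Markov chain
    next_switch = rng.exponential(1.0 / rate)
    while t < T:
        x -= dt * grads[i](x)                  # gradient-flow step for the active index
        t += dt
        if t >= next_switch:                   # jump of the Markov process: resample i
            i = rng.integers(len(grads))
            next_switch = t + rng.exponential(1.0 / rate)
    print("approximate minimiser:", x)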

2017
Shuxia Lu Zhao Jin

In order to improve the efficiency and classification ability of support vector machines (SVM) trained with stochastic gradient descent, three improved stochastic gradient descent (SGD) algorithms are used to solve the SVM problem: Momentum, Nesterov accelerated gradient (NAG), and RMSprop. The experimental results show that the algorithm based on RMSprop for solving the l...
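
For reference, a sketch of the three named SGD variants applied to a linear SVM with hinge loss; the learning rate, momentum and decay constants, and the synthetic data are illustrative assumptions, not the paper's settings.

    # The three SGD variants above, each minimising a regularised hinge loss.
    import numpy as np

    def hinge_grad(w, x, y, lam=0.01):
        # subgradient of lam/2 * ||w||^2 + max(0, 1 - y * <w, x>) at one sample
        g = lam * w
        if y * (w @ x) < 1:
            g = g - y * x
        return g

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    Y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(200))
    w_m, v = np.zeros(5), np.zeros(5)          # Momentum state
    w_n, v_n = np.zeros(5), np.zeros(5)        # NAG state
    w_r, s = np.zeros(5), np.zeros(5)          # RMSprop state
    eta, mu, rho, eps = 0.05, 0.9, 0.9, 1e-8
    for _ in range(5):
        for x, y in zip(X, Y):
            v = mu * v - eta * hinge_grad(w_m, x, y)            # Momentum
            w_m += v
            v_n = mu * v_n - eta * hinge_grad(w_n + mu * v_n, x, y)  # NAG: look-ahead gradient
            w_n += v_n
            g = hinge_grad(w_r, x, y)                           # RMSprop: per-coordinate scaling
            s = rho * s + (1 - rho) * g * g
            w_r -= eta * g / np.sqrt(s + eps)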

2017
Immanuel Bayer Xiangnan He Bhargav Kanagal Steffen Rendle

In recent years, interest in recommender research has shifted from explicit feedback towards implicit feedback data. A diversity of complex models has been proposed for a wide variety of applications. Despite this, learning from implicit feedback is still computationally challenging. So far, most work relies on stochastic gradient descent (SGD) solvers which are easy to derive, but in practice ...
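
As a sketch of the kind of SGD solver the abstract refers to (assumed here: pointwise matrix factorisation on implicit feedback with one sampled negative item per observed pair; dimensions, rates, and the sampling scheme are illustrative, not the paper's method):

    # One pass of SGD over observed (user, item) pairs: each pair is a positive,
    # and one unobserved item is sampled as a negative per update.
    import numpy as np

    rng = np.random.default_rng(2)
    n_users, n_items, k = 50, 100, 8
    P = 0.1 * rng.standard_normal((n_users, k))    # user factors
    Q = 0.1 * rng.standard_normal((n_items, k))    # item factors
    observed = [(rng.integers(n_users), rng.integers(n_items)) for _ in range(500)]
    eta, lam = 0.05, 0.01
    for u, i in observed:
        j = rng.integers(n_items)                  # sampled "negative" item
        for item, label in ((i, 1.0), (j, 0.0)):   # pointwise squared loss on {1, 0}
            err = P[u] @ Q[item] - label
            pu = P[u].copy()                       # cache before the coupled update
            P[u] -= eta * (err * Q[item] + lam * P[u])
            Q[item] -= eta * (err * pu + lam * Q[item])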

2018
Yoonho Lee Seungjin Choi

Gradient-based meta-learning has been shown to be expressive enough to approximate any learning algorithm. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn on each layer’s activation space a subspace that the task-specific learner pe...
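
For context on "simple gradient descent during meta-testing": a sketch of that baseline adaptation step (a MAML-style inner loop on a new task), not of the MT-net itself; the task, loss, and step size are illustrative assumptions.

    # Plain gradient-descent adaptation of meta-learned initial parameters
    # on a new task at meta-test time.
    import numpy as np

    def task_loss_grad(theta, X, y):
        # gradient of mean squared error of a linear model w.r.t. theta
        return 2 * X.T @ (X @ theta - y) / len(y)

    rng = np.random.default_rng(3)
    theta_meta = rng.standard_normal(4)            # meta-learned initialisation
    X, y = rng.standard_normal((20, 4)), rng.standard_normal(20)
    theta = theta_meta.copy()
    for _ in range(5):                             # a few inner steps of plain GD
        theta -= 0.1 * task_loss_grad(theta, X, y)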

Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007) with two approaches, which use an extended new conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...
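
For context (the abstract is truncated): the standard Dai-Liao scheme that such parameters plug into chooses the search direction

    d_{k+1} = -g_{k+1} + \beta_k^{DL} d_k, \qquad
    \beta_k^{DL} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \qquad t > 0,

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, which enforces the Dai-Liao conjugacy condition d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k; the two new parameters of the abstract modify this \beta_k via a modified secant equation.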

Journal: CoRR, 2014
Huahua Wang Arindam Banerjee

Two types of low cost-per-iteration gradient descent methods have been extensively studied in parallel. One is online or stochastic gradient descent (OGD/SGD), and the other is randomized block coordinate descent (RBCD). In this paper, we combine the two types of methods and propose online randomized block coordinate descent (ORBCD). At each iteration, ORBCD only computes the partial gradie...
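
A sketch of the ORBCD iteration as described: sample one data point (the online/stochastic part) and one coordinate block (the randomized block part), then update only that block with the partial gradient of the sampled loss term. The least-squares problem, block size, and step size are illustrative assumptions.

    # Each iteration touches one data point and one coordinate block.
    import numpy as np

    rng = np.random.default_rng(4)
    A, b = rng.standard_normal((100, 12)), rng.standard_normal(100)
    blocks = [np.arange(j, j + 4) for j in range(0, 12, 4)]   # 3 blocks of 4 coords
    x, eta = np.zeros(12), 0.05
    for it in range(2000):
        i = rng.integers(len(b))                  # sample one data point
        B = blocks[rng.integers(len(blocks))]     # sample one coordinate block
        r = A[i] @ x - b[i]                       # residual of the sampled term
        x[B] -= eta * r * A[i, B]                 # partial gradient step on that block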

Journal: Journal of Statistical Mechanics: Theory and Experiment, 2019

[Chart: number of search results per year]