Search results for: gradient descent

Number of results: 137892

Journal: Bulletin of the Iranian Mathematical Society 2014
Saman Babaie-Kafaki

Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.

Journal: Journal of AI and Data Mining 2015
F. Alibakhshi, M. Teshnehlab, M. Alibakhshi, M. Mansouri

The stability of the learning rate in neural network identifiers and controllers is one of the challenging issues that attracts great interest from neural network researchers. This paper suggests an adaptive gradient descent algorithm with stable learning laws for a modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for the parameters of ...
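The adaptive-step idea in this abstract can be sketched as plain gradient descent whose learning rate is shrunk when the gradient is large, so each update stays bounded. The shrink rule below is a hypothetical stand-in for illustration, not the stable learning laws derived in the paper:

```python
import numpy as np

def adaptive_gd(grad_fn, w, eta0=0.1, steps=100):
    """Gradient descent with a simple stability-motivated adaptive step.

    The effective step eta*||g|| is bounded by eta0, so a single update
    can never move the iterate arbitrarily far (hypothetical rule, not
    the paper's exact learning law).
    """
    for _ in range(steps):
        g = grad_fn(w)
        eta = eta0 / (1.0 + np.linalg.norm(g))  # keeps ||eta * g|| < eta0
        w = w - eta * g
    return w

# Usage: minimize f(w) = ||w||^2 / 2, whose gradient is simply w.
w_star = adaptive_gd(lambda w: w, np.array([3.0, -4.0]))
```

Any bounded, decreasing step rule of this shape serves the same purpose; the paper's contribution is proving stability for a specific such law.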

Journal: Geographical Studies of Arid Regions 0
Javad Sadidi (Kharazmi University, Karaj, Faculty of Geographical Sciences, Department of Remote Sensing and GIS), Mohammad Kamangar (Hormozgan University), Hani Rezaian (Kharazmi University, Karaj, Faculty of Geographical Sciences, Department of Remote Sensing and GIS), Alireza Hamidian (Hakim Sabzevari University), Mohammad Baaghideh (Hakim Sabzevari University), Heidar Aryanejad (Hormozgan University)

A major part of the country's area lies geographically in the arid and semi-arid belt with low rainfall. In the central and southern plateau regions, urban and rural communities have formed in reliance on groundwater resources, and these resources are the main suppliers of water needs in those areas. The ever-increasing growth of population and the limitation of water resources demand accurate prediction of the amount of these resources, given its importance for optimal planning and management. Predic...

With respect to the importance of conjugate gradient methods for large-scale optimization, in this study a descent three-term conjugate gradient method is proposed based on an extended modified secant condition. In the proposed method, objective function values are used in addition to the gradient information. Also, it is established that the method is globally convergent without convexity assu...
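A generic three-term conjugate gradient iteration of the kind this abstract describes can be sketched as follows. The sketch uses a PRP-style beta and a simple theta (the classic Zhang-Zhou-Li choice, which makes every direction a descent direction); the paper's extended-secant-condition formulas are not reproduced:

```python
import numpy as np

def three_term_cg(f, grad, x, iters=100):
    """Three-term conjugate gradient descent (illustrative sketch).

    Direction: d = -g + beta*d_prev - theta*y with y = g_new - g_prev.
    With this beta/theta pair, g.dot(d) = -||g||^2 exactly, so d is
    always a descent direction and the Armijo backtracking terminates.
    """
    g = grad(x)
    d = -g
    for _ in range(iters):
        # Armijo backtracking line search
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        gg = max(g.dot(g), 1e-12)              # guard against division by 0
        beta = g_new.dot(y) / gg               # Polak-Ribiere-Polyak beta
        theta = g_new.dot(d) / gg
        d = -g_new + beta * d - theta * y      # three-term direction
        x, g = x_new, g_new
    return x

# Usage: ill-conditioned quadratic f(x) = x^T A x / 2 with A = diag(1, 10).
A = np.diag([1.0, 10.0])
x_min = three_term_cg(lambda x: 0.5 * x.dot(A @ x), lambda x: A @ x,
                      np.array([4.0, -2.0]))
```

The third term is what distinguishes this family from classical two-term conjugate gradient methods: it restores the sufficient descent property without requiring an exact line search.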

Journal: Journal of Computational and Graphical Statistics 2007

Journal: IEEE Open Journal of Control Systems 2023

Several recent empirical studies demonstrate that important machine learning tasks, such as training deep neural networks, exhibit a low-rank structure, where most of the variation in the loss function occurs in only a few directions of the input space. In this paper, we leverage this structure to reduce the high computational cost of canonical gradient-based methods such as gradient descent (GD). Our proposed Low-Rank Gradient De...
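The core idea of restricting gradient descent to a low-dimensional subspace can be sketched as below. The orthonormal basis is assumed to be given; the paper also addresses how such a subspace is found, which this sketch omits:

```python
import numpy as np

def low_rank_gd(grad, x, basis, eta=0.1, steps=200):
    """Gradient descent restricted to a low-rank subspace (sketch).

    `basis` is an (n, k) matrix with orthonormal columns spanning the
    few directions in which the loss actually varies; every gradient is
    projected onto that subspace before the update, so the iterate only
    moves within span(basis).
    """
    P = basis @ basis.T                # orthogonal projector onto subspace
    for _ in range(steps):
        x = x - eta * (P @ grad(x))
    return x

# Usage: f(x) = ||x||^2 / 2 in 5 dimensions, but updates confined to the
# first two coordinate directions. The remaining coordinates never move.
basis = np.eye(5)[:, :2]
x_out = low_rank_gd(lambda x: x, np.ones(5), basis)
```

The payoff is that only k directional derivatives per step are needed when the projector is applied to the parameter perturbations rather than to a full gradient, which is where the computational savings come from.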

Journal: Michigan Mathematical Journal 2009

Journal: IEEE Transactions on Neural Networks and Learning Systems 2018

Journal: Research in the Mathematical Sciences 2022

We propose a class of very simple modifications of gradient descent and stochastic gradient descent leveraging Laplacian smoothing. We show that, when applied to a large variety of machine learning problems, ranging from logistic regression to deep neural nets, the proposed surrogates can dramatically reduce the variance, allow one to take a larger step size, and improve generalization accuracy. The methods only involve multiplying the usual (sto...
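The smoothing surrogate this abstract describes amounts to multiplying the usual gradient by the inverse of (I - sigma * Laplacian), which is cheap to apply with an FFT because the periodic Laplacian is diagonalized by it. A minimal 1-D sketch, with the step-size choice and the stochastic variant omitted:

```python
import numpy as np

def laplacian_smoothed_grad(g, sigma=1.0):
    """Solve (I - sigma * L) v = g for the 1-D periodic discrete
    Laplacian L (stencil [1, -2, 1]) via the FFT.

    Under the DFT, L has eigenvalues -2*(1 - cos(2*pi*k/n)), so the
    solve is a per-frequency division. The zero frequency is untouched
    (denominator 1), so the smoothed gradient keeps the same mean while
    high-frequency noise is damped.
    """
    n = len(g)
    freqs = np.fft.fftfreq(n)
    denom = 1.0 + 2.0 * sigma * (1.0 - np.cos(2.0 * np.pi * freqs))
    return np.real(np.fft.ifft(np.fft.fft(g) / denom))

# Usage: a pure high-frequency "noisy" gradient is shrunk by a factor
# of 1 + 4*sigma = 5 at the Nyquist frequency.
g = np.array([1.0, -1.0, 1.0, -1.0])
v = laplacian_smoothed_grad(g)   # -> [0.2, -0.2, 0.2, -0.2]
```

Replacing the gradient with this smoothed version in a standard GD or SGD loop is the whole modification; its variance-reduction effect is what the abstract refers to.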

2016
Marcin Andrychowicz, Misha Denil, Sergio Gomez Colmenarejo, Matthew W. Hoffman, David Pfau, Tom Schaul, Nando de Freitas

The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this, optimization algorithms are still designed by hand. In this paper we show how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way. Our learned algorit...

[Chart: number of search results per year]
