Search results for: Gradient descent
Number of results: 137892. Filter results by year:
Based on an eigenvalue analysis, a new proof of the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
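To make the PRP update concrete, here is a minimal sketch of the standard Polak-Ribière-Polyak conjugate gradient iteration on a quadratic with exact line search (this illustrates the classical method, not Yu et al.'s modified variant; the PRP+ nonnegativity safeguard is a common textbook addition):

```python
import numpy as np

def prp_cg_quadratic(A, b, x0, iters=50, tol=1e-10):
    """Minimize 0.5*x^T A x - b^T x with PRP conjugate gradient.

    The gradient is g = A x - b; exact line search is cheap for a quadratic.
    """
    x = x0.astype(float)
    g = A @ x - b
    d = -g                                   # initial steepest-descent direction
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)       # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        # PRP beta, clipped at zero (the "PRP+" restart safeguard)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        g = g_new
    return x
```

On a strictly convex quadratic this coincides with linear CG and converges in at most n steps.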
The stability of the learning rate in neural-network identifiers and controllers is one of the challenging issues attracting great interest from neural-network researchers. This paper suggests an adaptive gradient descent algorithm with stable learning laws for a modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, a stable learning algorithm for parameters of ...
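The stability issue the abstract refers to can be illustrated with the classical textbook bound (this is not the paper's MDNN learning law): for a quadratic with curvature L, plain gradient descent is stable iff the learning rate eta satisfies eta < 2/L.

```python
# For f(x) = 0.5 * L * x^2, gradient descent updates x <- (1 - eta*L) * x,
# so the iterate contracts iff |1 - eta*L| < 1, i.e. eta < 2/L.
L = 10.0

def run_gd(eta, steps=100, x=1.0):
    for _ in range(steps):
        x = x - eta * L * x
    return abs(x)
```

With L = 10 the threshold is 2/L = 0.2: `run_gd(0.15)` contracts toward zero, while `run_gd(0.25)` diverges, which is exactly the instability that adaptive/stable learning-rate laws are designed to avoid.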
Geographically, most of the country's area lies in an arid and semi-arid belt with low rainfall. In the central and southern plateau regions, urban and rural communities have formed around groundwater resources, which are the main supplier of water needs in these areas. Ever-increasing population growth and limited water resources call for accurate prediction of these resources, given their importance for planning and optimal management. ...
With respect to the importance of conjugate gradient methods for large-scale optimization, in this study a descent three-term conjugate gradient method is proposed based on an extended modified secant condition. In the proposed method, objective function values are used in addition to the gradient information. Also, it is established that the method is globally convergent without convexity assu...
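The general shape of a three-term direction can be sketched as follows. The coefficients below are the well-known Hestenes-Stiefel-style three-term choices (an illustrative stand-in, not the rule derived from the abstract's extended modified secant condition); this particular combination satisfies the sufficient descent identity d^T g = -||g||^2 exactly:

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    """Three-term CG direction d = -g + beta*d_old - theta*y, y = g_new - g_old.

    Hypothetical illustration: beta and theta are the HS-style choices,
    chosen so that d @ g_new == -||g_new||^2 by construction.
    """
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < 1e-12:
        return -g_new                      # restart with steepest descent
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

The third term is what lets the direction satisfy sufficient descent independently of the line search, which is the property such methods are built around.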
Several recent empirical studies demonstrate that important machine learning tasks, such as training deep neural networks, exhibit a low-rank structure, where most of the variation in the loss function occurs in only a few directions of the input space. In this paper, we leverage this structure to reduce the high computational cost of canonical gradient-based methods such as gradient descent (GD). Our proposed Low-Rank Gradient De...
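A toy sketch of the low-rank idea (an assumption on my part, not the paper's exact LRGD algorithm): estimate the dominant gradient subspace from recent gradients via an SVD and take descent steps projected into it.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2
# A quadratic loss whose curvature lives almost entirely in an r-dim subspace.
U = np.linalg.qr(rng.normal(size=(n, r)))[0]            # true low-rank directions
A = U @ np.diag([10.0, 5.0]) @ U.T + 1e-4 * np.eye(n)   # loss Hessian

def grad(x):
    return A @ x

x = rng.normal(size=n)
history = []
for t in range(200):
    g = grad(x)
    history.append(g)
    if len(history) >= 5:
        # Top-r left singular vectors of the recent gradient history
        # approximate the low-rank subspace; project the step onto it.
        Q = np.linalg.svd(np.array(history[-20:]).T,
                          full_matrices=False)[0][:, :r]
        g = Q @ (Q.T @ g)
    x = x - 0.05 * g

loss = 0.5 * x @ A @ x
```

Each projected step only needs r inner products after the subspace estimate, which is where the computational savings come from when r is much smaller than n.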
We propose a class of very simple modifications of gradient descent and stochastic gradient descent leveraging Laplacian smoothing. We show that, when applied to a large variety of machine learning problems, ranging from logistic regression to deep neural nets, the proposed surrogates can dramatically reduce the variance, allow taking a larger step size, and improve generalization accuracy. The methods only involve multiplying the usual (sto...
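A sketch of the smoothing operator suggested by the abstract (the exact operator in the paper is an assumption here): replace the stochastic gradient g by (I + sigma*L)^{-1} g, where L is the 1-D periodic discrete Laplacian. Since L is diagonal in Fourier space, the multiplication costs one FFT pair.

```python
import numpy as np

def laplacian_smooth(g, sigma=1.0):
    """Apply (I + sigma*L)^{-1} to g, L = 1-D periodic discrete Laplacian."""
    n = len(g)
    # Eigenvalues of the [-1, 2, -1] Laplacian stencil under periodic BCs.
    eig = 2.0 - 2.0 * np.cos(2.0 * np.pi * np.arange(n) / n)
    return np.real(np.fft.ifft(np.fft.fft(g) / (1.0 + sigma * eig)))

rng = np.random.default_rng(0)
g = rng.normal(size=256)               # a noisy stochastic gradient
gs = laplacian_smooth(g, sigma=2.0)
```

The k = 0 Fourier mode is left untouched (its eigenvalue is 0), so the mean of the gradient is preserved, while every higher-frequency mode is damped, which is the variance-reduction effect the abstract describes.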
The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this, optimization algorithms are still designed by hand. In this paper we show how the design of an optimization algorithm can be cast as a learning problem, allowing the algorithm to learn to exploit structure in the problems of interest in an automatic way. Our learned algorit...
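A drastically simplified "learning to learn" sketch (an assumption, far simpler than the learned LSTM optimizer of the paper): meta-learn a single step size theta for gradient descent on random 1-D quadratics by descending the loss reached after T unrolled inner steps, using a finite-difference meta-gradient.

```python
import numpy as np

rng = np.random.default_rng(1)

def unrolled_loss(theta, a, x0=1.0, T=5):
    """Loss of f(x) = 0.5*a*x^2 after T inner GD steps with rate theta."""
    x = x0
    for _ in range(T):
        x = x - theta * a * x          # inner GD step with the learned rate
    return 0.5 * a * x * x

theta = 0.01                            # deliberately too-small initial rate
for _ in range(300):
    a = rng.uniform(0.5, 1.5)           # sample a task from the distribution
    eps = 1e-4                          # finite-difference meta-gradient
    meta_g = (unrolled_loss(theta + eps, a)
              - unrolled_loss(theta - eps, a)) / (2 * eps)
    theta -= 0.01 * meta_g              # meta-update on the step size
```

The meta-objective "loss after unrolling the optimizer" is the same training signal the paper uses; here the "optimizer" has one learnable parameter instead of a recurrent network.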
[Chart: number of search results per year]