Search results for: gradient descent

Number of results: 137,892

Journal: Journal of Scientific Computing, 2021

Stochastic gradient descent (SGD) for strongly convex functions converges at the rate $\mathcal{O}(1/k)$. However, achieving good results in practice requires tuning the parameters of the algorithm (for example, the learning rate). In this paper we propose a generalization of the Polyak step size, used in subgradient methods, to stochastic gradient descent. We prove non-asymptotic convergence with a constant which can be...
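
For context, the classical Polyak rule sets the step size to $(f(x_k) - f^*)/\lVert g_k \rVert^2$. A per-sample variant of that rule on a toy least-squares problem might look like the following sketch; the step cap and the interpolation assumption $f_i^* = 0$ are illustrative choices, not details taken from the paper.

```python
import numpy as np

# Hedged sketch: SGD with a Polyak-style per-sample step size on a
# consistent least-squares problem. Illustrates the general idea only;
# the paper's exact rule, constants, and assumptions may differ.

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true                      # consistent system, so each f_i* = 0

x = np.zeros(d)
for k in range(2000):
    i = rng.integers(n)             # sample one data point
    r = A[i] @ x - b[i]             # residual of sample i
    f_i = 0.5 * r**2                # per-sample loss f_i(x)
    g = r * A[i]                    # stochastic gradient of f_i
    # Polyak step: (f_i(x) - f_i*) / ||g||^2 with f_i* = 0 here;
    # the cap at 1.0 is an assumed safeguard, not from the paper.
    step = min(f_i / (g @ g + 1e-12), 1.0)
    x -= step * g

print("distance to solution:", np.linalg.norm(x - x_true))
```

No learning-rate schedule is tuned here: the step size adapts from the per-sample loss itself, which is the practical appeal the abstract points to.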

Journal: IEEE Control Systems Letters, 2022

We systematically develop a learning-based treatment of stochastic optimal control (SOC), relying on direct optimization parametric policies. propose derivation adjoint sensitivity results for differential equations through application variational calculus. Then, given an objective function predetermined task specifying the desiderata controller, we optimize their parameters via iterative gradi...
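
The pipeline the abstract describes, differentiating a rollout with respect to policy parameters via an adjoint pass and then taking gradient steps, can be illustrated on a deterministic toy problem. The sketch below uses a scalar integrator with a linear feedback gain and a hand-derived discrete adjoint; the system, cost, and step size are all assumptions for illustration, not the paper's SDE setting.

```python
import numpy as np

# Hedged sketch, not the paper's method: a deterministic scalar
# integrator dx/dt = u with linear feedback u = -theta*x stands in for
# the paper's stochastic setting, and dJ/dtheta comes from a discrete
# adjoint (reverse) pass over an Euler rollout.

dt, T, x0 = 0.01, 500, 1.0

def rollout_and_grad(theta):
    # forward pass: simulate the closed loop and store the trajectory
    xs = np.empty(T + 1)
    xs[0] = x0
    J = 0.0
    for t in range(T):
        u = -theta * xs[t]
        J += dt * (xs[t] ** 2 + u ** 2)   # quadratic running cost
        xs[t + 1] = xs[t] + dt * u        # Euler step of dx/dt = u
    # backward pass: adjoint recursion accumulates dJ/dtheta
    lam, dJ = 0.0, 0.0
    for t in reversed(range(T)):
        dJ += 2 * dt * theta * xs[t] ** 2 + lam * (-dt * xs[t])
        lam = 2 * dt * xs[t] * (1 + theta ** 2) + lam * (1 - dt * theta)
    return J, dJ

theta = 0.0
for _ in range(300):
    J, dJ = rollout_and_grad(theta)
    theta -= 0.05 * dJ                    # plain gradient step on the gain
print(f"J = {J:.4f}, theta = {theta:.4f}  (analytic optimum is ~1)")
```

The backward recursion costs the same as one extra rollout regardless of how many parameters the policy has, which is why adjoint sensitivities scale to large parametric policies.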

Journal: Image Analysis & Stereology, 2023

The batch clustering algorithm for classification applications requires initial parameters and also exhibits a drifting phenomenon in its stochastic process. The initial parameters are critical to converging to a partial optimum, and the original algorithm still has room to be improved so as to speed up convergence based on those parameters. This paper proposes an unsupervised method that addresses these two issues. Firstly, an estimation has been given preliminarily with...
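
To make the initialization issue concrete: plain batch k-means is a standard example of a batch clustering algorithm whose result and convergence speed depend on its initial centers. The sketch below pairs it with a k-means++-style seeding as a generic remedy; this is a stand-in, not the method proposed in the paper.

```python
import numpy as np

# Hedged sketch: batch k-means with k-means++-style seeding, illustrating
# how a better initial-parameter estimate speeds up convergence.

rng = np.random.default_rng(0)
# three well-separated Gaussian blobs as toy data
X = np.vstack([rng.normal(c, 0.3, size=(100, 2))
               for c in ((0, 0), (3, 0), (0, 3))])

def kmeanspp_init(X, k):
    # pick centers one by one, biased toward points far from chosen centers
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min(((X[:, None] - np.array(centers)) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def batch_kmeans(X, centers, iters=50):
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # recompute each center; keep the old one if its cluster is empty
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(len(centers))])
        if np.allclose(new, centers):   # converged: centers stable
            break
        centers = new
    return centers, labels

centers, labels = batch_kmeans(X, kmeanspp_init(X, 3))
print("final centers:\n", centers)
```

With seeding biased toward spread-out points, the batch updates typically stabilize in a handful of iterations, whereas uniformly random initial centers can start two centers in one blob and converge to a worse partial optimum.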

[Chart: number of search results per year]