Search results for: dichotomous coordinate descent dcd

Number of results: 77679

2015
Julie Nutini Mark W. Schmidt Issam H. Laradji Michael P. Friedlander Hoyt A. Koepke

There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22(2), 2012], who showed that a random-coordinate selection rule achieves the same convergence rate as the Gauss-Southwell selection rule. This result suggests that we should never use the Gauss-Southwell rule, because it is typi...
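As an illustration of the two selection rules compared in this abstract, here is a minimal sketch (mine, not the authors' code) of exact coordinate minimization on a quadratic, switching between uniform-random and Gauss-Southwell selection:

```python
import numpy as np

def coordinate_descent(A, b, rule="random", iters=200, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x by exact coordinate minimization.

    rule="random": pick a coordinate uniformly at random.
    rule="gs":     Gauss-Southwell, pick the coordinate with the
                   largest absolute gradient entry.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(len(b))
    for _ in range(iters):
        grad = A @ x - b
        i = int(np.argmax(np.abs(grad))) if rule == "gs" else int(rng.integers(len(b)))
        x[i] -= grad[i] / A[i, i]  # exact minimization along coordinate i
    return x

# illustrative SPD test problem (values are arbitrary, not from the paper)
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)
for rule in ("random", "gs"):
    print(rule, np.linalg.norm(coordinate_descent(A, b, rule) - x_star))
```

On a small well-conditioned problem both rules converge; the abstract's point concerns how their worst-case rates compare under Nesterov's analysis, which the full paper revisits.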

2008
Michael Frech Frank Holzäpfel

b = vortex spacing; l = lateral displacement; N = Brunt–Väisälä frequency; q = root-mean-square turbulence velocity; t = time; u = axial velocity; v = lateral velocity, vertical displacement; V = aircraft ground speed; w = descent speed, vertical wind velocity; x = axial coordinate; y = spanwise coordinate; z = vertical coordinate; Γ = circulation; ε = eddy dissipation rate; Θ = potential temperature; σ = standard de...

2017
Immanuel Bayer Xiangnan He Bhargav Kanagal Steffen Rendle

In recent years, interest in recommender research has shifted from explicit feedback towards implicit feedback data. A diversity of complex models has been proposed for a wide variety of applications. Despite this, learning from implicit feedback is still computationally challenging. So far, most work relies on stochastic gradient descent (SGD) solvers which are easy to derive, but in practice ...

Journal: CoRR 2017
Jason D. Lee Ioannis Panageas Georgios Piliouras Max Simchowitz Michael I. Jordan Benjamin Recht

We establish that first-order methods avoid saddle points for almost all initializations. Our results apply to a wide variety of first-order methods, including gradient descent, block coordinate descent, mirror descent and variants thereof. The connecting thread is that such algorithms can be studied from a dynamical systems perspective in which appropriate instantiations of the Stable Manifold...
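The dynamical-systems view sketched in this abstract can be illustrated on a toy strict saddle (the function, step size, and initialization below are my own choices, not the paper's):

```python
import numpy as np

# Gradient descent viewed as the dynamical system p_{k+1} = p_k - a * grad f(p_k),
# on the toy strict saddle f(x, y) = x^2 - y^2. The stable manifold of the
# saddle at the origin is the x-axis, a measure-zero set, so almost every
# initialization escapes the saddle along the y direction.
def grad_f(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def gradient_descent(p0, step=0.1, iters=50):
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad_f(p)
    return p

p = gradient_descent([1.0, 1e-6])  # generic start: tiny perturbation off the x-axis
print(p)  # the x-component contracts toward 0, the y-component grows away
```

With step 0.1 the map is (x, y) → (0.8 x, 1.2 y): the saddle attracts only along the x-axis, matching the Stable Manifold argument the abstract invokes.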

Journal: Proceedings of the AAAI Conference on Artificial Intelligence 2019

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence 2023

Difference-of-Convex (DC) minimization, referring to the problem of minimizing the difference of two convex functions, has found rich applications in statistical learning and has been studied extensively for decades. However, existing methods are primarily based on multi-stage relaxation, only leading to weak optimality of critical points. This paper proposes a coordinate descent method for a class of DC functions seque...
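For illustration, here is a minimal sketch of the classic multi-stage relaxation (DCA) scheme the abstract contrasts with, on a toy DC function of my own choosing (not the paper's algorithm):

```python
import math

# DCA on f(x) = x^4 - x^2, written as g(x) = x^4 minus h(x) = x^2 (both convex).
# Each stage replaces h by its linearization at x_k and minimizes
# g(x) - h'(x_k) * x, i.e. solves 4 x^3 = 2 x_k in closed form.
def dca(x0, iters=100):
    x = x0
    for _ in range(iters):
        x = math.copysign(abs(x / 2) ** (1 / 3), x)  # signed cube root of x/2
    return x

print(dca(1.0))  # approaches the critical point 1 / sqrt(2)
```

From x0 = 1 the iterates approach the critical point 1/√2; in general DCA guarantees only criticality, which is the "weak optimality of critical points" the abstract refers to.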

Journal: IEEE Signal Processing Letters 2021

In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix. Our approach does not explicitly create the covariance matrix of the data and can be viewed as an extension of the Kogbetliantz approximate singular value decomposition to sparse principal components. We show the performance of the proposed approach to recover sparse principal components on various datasets from the literature and perform dimensionality reduction and classification appl...
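The snippet is too truncated to recover the authors' algorithm, but one standard sparse-PCA heuristic sharing the stated property of never forming the covariance matrix explicitly is truncated power iteration (a generic sketch, not the paper's method):

```python
import numpy as np

def truncated_power_sparse_pc(X, k, iters=100, seed=0):
    """One sparse principal component of data matrix X (n samples x d features)
    via truncated power iteration: after each covariance-vector product, keep
    only the k largest-magnitude entries. The product X.T @ (X @ v) avoids
    ever forming the d x d covariance matrix explicitly."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(X.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = X.T @ (X @ v)                 # covariance-vector product
        w[np.argsort(np.abs(w))[:-k]] = 0 # zero all but the k largest entries
        v = w / np.linalg.norm(w)
    return v
```

On data with a planted sparse direction, the recovered component's support matches the planted one; this only illustrates the general idea of sparse components without an explicit covariance matrix.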

Chart: number of search results per year
