Steepest descent method on a Riemannian manifold: the convex case

Author

  • Julien Munier
Abstract

In this paper we are interested in the asymptotic behavior of the trajectories of the famous steepest descent evolution equation on Riemannian manifolds. It reads ẋ(t) + grad φ(x(t)) = 0. It is shown how the convexity of the objective function φ helps in establishing the convergence, as time goes to infinity, of the trajectories towards points that minimize φ. Some numerical illustrations are given for the Rosenbrock function.

M.S.C. 2000: 34C40, 37N40, 37M05, 65J15.
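As a concrete illustration of the dynamics in the abstract, the sketch below discretizes the gradient flow ẋ(t) + grad φ(x(t)) = 0 with an explicit Euler scheme on flat R², taking φ to be the Rosenbrock function mentioned above. The step size and iteration count are illustrative choices, not values from the paper; note also that the Rosenbrock function is non-convex, so it lies outside the convexity assumption under which the convergence result is stated.

import numpy as np

# phi: the (non-convex) Rosenbrock function on R^2
def rosenbrock(x):
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def grad_rosenbrock(x):
    # hand-computed gradient of phi
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

# explicit Euler discretization of x'(t) = -grad phi(x(t)):
# x_{k+1} = x_k - h * grad phi(x_k)
x = np.array([-1.2, 1.0])   # a classical starting point for Rosenbrock
h = 1e-3                    # time step (illustrative)
for _ in range(100_000):
    x = x - h * grad_rosenbrock(x)
print(x, rosenbrock(x))     # x creeps along the valley y = x^2 towards the minimizer (1, 1)

The slow progress along the curved valley is the kind of behavior that makes the Rosenbrock function a standard stress test for descent trajectories.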

Similar articles

ON THE LIFTS OF SEMI-RIEMANNIAN METRICS

In this paper, we extend the Sasaki metric for the tangent bundle of a Riemannian manifold and the Sasaki-Mok metric for the frame bundle of a Riemannian manifold [1] to the case of a semi-Riemannian vector bundle over a semi-Riemannian manifold. In fact, if E is a semi-Riemannian vector bundle over a semi-Riemannian manifold M, then by using an arbitrary (linear) connection on E, we can make E, as a...
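For context, the classical Sasaki metric that this abstract generalizes can be written as follows; this is the standard formulation on the tangent bundle TM of a Riemannian manifold (M, g), not the paper's semi-Riemannian extension. Here π : TM → M is the natural projection and X^h, X^v denote the horizontal and vertical lifts of a vector field X with respect to the Levi-Civita connection:

\[
g_S(X^h, Y^h) = g(X, Y) \circ \pi, \qquad
g_S(X^h, Y^v) = 0, \qquad
g_S(X^v, Y^v) = g(X, Y) \circ \pi.
\]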


Least-Squares on the Real Symplectic Group

The present paper discusses the problem of least-squares over the real symplectic group of matrices Sp(2n,R). The least-squares problem may be extended from flat spaces to curved spaces by the notion of geodesic distance. The resulting non-linear minimization problem on the manifold may be tackled by means of a gradient-descent algorithm tailored to the geometry of the space at hand. In turn, gradi...
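In symbols, the least-squares problem described in this abstract can be stated as below; the geodesic distance d and the sample matrices A_1, …, A_N are generic notation, not taken from the paper:

\[
\min_{X \in \mathrm{Sp}(2n,\mathbb{R})} \sum_{k=1}^{N} d^{2}(X, A_k),
\qquad
\mathrm{Sp}(2n,\mathbb{R}) = \{\, X \in \mathbb{R}^{2n \times 2n} : X^{\top} J X = J \,\},
\qquad
J = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix}.
\]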


Averaging Stochastic Gradient Descent on Riemannian Manifolds

We consider the minimization of a function defined on a Riemannian manifold M accessible only through unbiased estimates of its gradients. We develop a geometric framework to transform a sequence of slowly converging iterates generated from stochastic gradient descent (SGD) on M to an averaged iterate sequence with a robust and fast O(1/n) convergence rate. We then present an application of our...
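The sketch below illustrates the two ingredients this abstract combines, Riemannian SGD and iterate averaging, on the unit sphere. The test problem (streaming dominant-eigenvector estimation) and the averaging rule (a tangent-space running mean followed by a retraction) are illustrative stand-ins, not the paper's construction.

import numpy as np

rng = np.random.default_rng(0)
d = 5
C = rng.standard_normal((d, d))
A = C @ C.T                              # covariance whose top eigenvector we estimate

def retract(x, v):
    # metric-projection retraction onto the unit sphere
    y = x + v
    return y / np.linalg.norm(y)

x = retract(rng.standard_normal(d), np.zeros(d))
xbar = x.copy()                          # averaged iterate
for n in range(1, 20001):
    z = rng.multivariate_normal(np.zeros(d), A)  # unbiased sample, E[z z^T] = A
    egrad = -2.0 * (z @ x) * z                   # Euclidean gradient estimate of -x^T A x
    rgrad = egrad - (egrad @ x) * x              # project onto the tangent space at x
    x = retract(x, -(n ** -0.6) * rgrad)         # Riemannian SGD step with decaying rate
    delta = x - (x @ xbar) * xbar                # direction towards x, tangent at xbar
    xbar = retract(xbar, delta / n)              # running mean, mapped back to the sphere

top = np.linalg.eigh(A)[1][:, -1]
print(abs(xbar @ top))                   # close to 1 when xbar aligns with the top eigenvector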


A Modified Algorithm of Steepest Descent Method for Solving Unconstrained Nonlinear Optimization Problems

The steepest descent method (SDM), which can be traced back to Cauchy (1847), is the simplest gradient method for unconstrained optimization problems. The SDM is effective for well-posed and low-dimensional nonlinear optimization problems without constraints; however, for a large-dimensional system, it converges very slowly. Therefore, a modified steepest descent method (MSDM) is developed to dea...
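The slow convergence that motivates such modifications shows up already on an ill-conditioned quadratic. The sketch below runs classical steepest descent with exact line search; the matrix Q and the starting point are arbitrary illustrative choices.

import numpy as np

Q = np.diag([1.0, 100.0])            # condition number kappa = 100
x = np.array([100.0, 1.0])           # arbitrary starting point
for k in range(50):
    g = Q @ x                        # gradient of phi(x) = 0.5 x^T Q x
    alpha = (g @ g) / (g @ (Q @ g))  # exact minimizer of phi along the ray x - t g
    x = x - alpha * g
# the error contracts only like ((kappa-1)/(kappa+1))^k, about 0.37 after 50 steps here
print(np.linalg.norm(x))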


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
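The secant relation mentioned here, B_{k+1} s_k = y_k, can be checked numerically. The sketch below does so for the classical BFGS update, which also satisfies the relation; the paper's double-parameter scaled formula itself is not reproduced in this excerpt.

import numpy as np

def bfgs_update(B, s, y):
    # B_{k+1} = B - (B s s^T B) / (s^T B s) + (y y^T) / (y^T s)
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

rng = np.random.default_rng(1)
B = np.eye(3)                          # current positive definite approximation
s = rng.standard_normal(3)             # step s_k = x_{k+1} - x_k
y = s + 0.1 * rng.standard_normal(3)   # gradient difference with y^T s > 0 (curvature condition)
B1 = bfgs_update(B, s, y)
print(np.allclose(B1 @ s, y))               # True: the secant relation holds
print(np.all(np.linalg.eigvalsh(B1) > 0))   # positive definite when y^T s > 0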



Publication date: 2007