Fixed-rank matrix factorizations and Riemannian low-rank optimization

Authors

  • Bamdev Mishra
  • Gilles Meyer
  • Silvere Bonnabel
  • Rodolphe Sepulchre
Abstract

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.

B. Mishra · G. Meyer · R. Sepulchre
Department of Electrical Engineering and Computer Science, University of Liège, 4000 Liège, Belgium
e-mail: [email protected]

G. Meyer
e-mail: [email protected]

R. Sepulchre
Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK
e-mail: [email protected]

S. Bonnabel
Robotics Center, Mines ParisTech, Boulevard Saint-Michel, 60, 75272 Paris, France
e-mail: [email protected]
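As a rough illustration of the factored search space discussed in the abstract, the sketch below runs plain Euclidean gradient descent on a G H^T parameterization for a toy matrix-completion cost. The paper's algorithms replace these Euclidean updates with Riemannian gradient and trust-region steps on the quotient manifold; the function name, step size, and initialization here are illustrative assumptions, not the authors' implementation.

import numpy as np

def complete_fixed_rank(M_obs, mask, rank, steps=2000, lr=1e-3, seed=0):
    # Toy Euclidean gradient descent on the factors of X = G H^T for the
    # completion cost f(G, H) = 0.5 * ||mask * (G H^T - M_obs)||_F^2.
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    G = rng.standard_normal((m, rank)) / np.sqrt(rank)
    H = rng.standard_normal((n, rank)) / np.sqrt(rank)
    for _ in range(steps):
        R = mask * (G @ H.T - M_obs)                   # residual on observed entries only
        G, H = G - lr * (R @ H), H - lr * (R.T @ G)    # simultaneous factor updates
    return G @ H.T

The factorization is invariant under X = (GM)(HM^{-T})^T for any invertible M; accounting for that invariance is precisely what the Riemannian quotient geometry in the paper addresses, whereas this Euclidean sketch ignores it.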


Similar resources

A Riemannian rank-adaptive method for low-rank optimization

This paper presents an algorithm that solves optimization problems on a matrix manifold M ⊆ R^{m×n} with an additional rank inequality constraint. The algorithm resorts to well-known Riemannian optimization schemes on fixed-rank manifolds, combined with new mechanisms to increase or decrease the rank. The convergence of the algorithm is analyzed and a weighted low-rank approximation problem is use...
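A hedged sketch of the rank-adaptive idea described above, applied to weighted low-rank approximation (this is not the cited algorithm; the inner fixed-rank solver is plain gradient descent on factors, and the function names, tolerances, and step sizes are assumptions):

import numpy as np

def weighted_residual(G, H, D, W):
    return W * (G @ H.T - D)

def rank_adaptive_wlra(D, W, max_rank=5, tol=1e-8, inner_steps=500, lr=1e-2, seed=0):
    # Minimize 0.5 * ||W * (G H^T - D)||_F^2, increasing the rank only when needed.
    rng = np.random.default_rng(seed)
    m, n = D.shape
    G = 1e-3 * rng.standard_normal((m, 1))
    H = 1e-3 * rng.standard_normal((n, 1))
    while True:
        for _ in range(inner_steps):                       # fixed-rank phase
            R = weighted_residual(G, H, D, W)
            G, H = G - lr * (R @ H), H - lr * (R.T @ G)
        f = 0.5 * np.sum(weighted_residual(G, H, D, W) ** 2)
        if f < tol or G.shape[1] >= max_rank:
            return G, H, f
        # rank-increase phase: append a small rank-one term aligned with the
        # dominant singular pair of the current residual (a heuristic stand-in
        # for the descent directions used by rank-adaptive methods).
        U, s, Vt = np.linalg.svd(weighted_residual(G, H, D, W))
        G = np.hstack([G, -1e-3 * U[:, :1]])
        H = np.hstack([H, 1e-3 * Vt[:1, :].T])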


Low-rank optimization with trace norm penalty

The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of duality gap numerically tractable. The search space is nonlinear but is equ...
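For context, differentiable fixed-rank surrogates of the trace norm rest on its standard variational characterization; the identity below is background material, not a reconstruction of the cited paper's exact factorization:

\|X\|_{*} \;=\; \min_{\substack{G \in \mathbb{R}^{m \times r},\, H \in \mathbb{R}^{n \times r} \\ X = G H^{\top}}} \tfrac{1}{2}\bigl(\|G\|_F^{2} + \|H\|_F^{2}\bigr), \qquad r \ge \operatorname{rank}(X),

with the minimum attained at G = U \Sigma^{1/2}, H = V \Sigma^{1/2} for the singular value decomposition X = U \Sigma V^{\top}.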


A Riemannian Optimization Approach for Computing Low-Rank Solutions of Lyapunov Equations

We propose a new framework based on optimization on manifolds to approximate the solution of a Lyapunov matrix equation by a low-rank matrix. The method minimizes the error on the Riemannian manifold of symmetric positive semi-definite matrices of fixed rank. We detail how objects from differential geometry, like the Riemannian gradient and Hessian, can be efficiently computed for this manifold...
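A minimal sketch of the general idea (searching over symmetric positive semidefinite matrices of fixed rank via the factorization X = Y Y^T), assuming a plain Frobenius-norm residual and Euclidean gradient steps rather than the cited paper's Riemannian machinery and cost function; the function name and step size are illustrative:

import numpy as np

def lyapunov_lowrank(A, C, rank, steps=500, lr=1e-4, seed=0):
    # Minimize f(Y) = ||A Y Y^T + Y Y^T A^T + C||_F^2 so that X = Y Y^T
    # approximately satisfies the Lyapunov equation A X + X A^T + C = 0.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Y = 0.1 * rng.standard_normal((n, rank))
    for _ in range(steps):
        X = Y @ Y.T
        R = A @ X + X @ A.T + C                          # symmetric residual
        Y = Y - lr * 4.0 * (A.T @ R @ Y + R @ A @ Y)     # Euclidean gradient of f
    return Y                                             # X ≈ Y @ Y.T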


Guarantees of Riemannian Optimization for Low Rank Matrix Completion

We study the Riemannian optimization methods on the embedded manifold of low rank matrices for the problem of matrix completion, which is about recovering a low rank matrix from its partial entries. Assume m entries of an n × n rank-r matrix are sampled independently and uniformly with replacement. We first prove that with high probability the Riemannian gradient descent and conjugate gradient d...
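A much-simplified sketch in the spirit of the embedded-manifold approach described above: a Euclidean gradient step on the completion cost followed by a truncated-SVD retraction back to the rank-r set (essentially singular value projection, not the cited Riemannian gradient or conjugate gradient iterations); the function name and unit step size are assumptions:

import numpy as np

def svd_projected_completion(M_obs, mask, rank, steps=100):
    # Iterate: gradient step on 0.5 * ||mask * (X - M_obs)||_F^2,
    # then retract to the rank-`rank` matrices by truncated SVD.
    X = np.zeros_like(M_obs, dtype=float)
    for _ in range(steps):
        X = X - mask * (X - M_obs)                        # unit-step gradient update
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]       # best rank-r approximation
    return X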


Survey on Probabilistic Models of Low-Rank Matrix Factorizations

Low-rank matrix factorizations such as Principal Component Analysis (PCA), Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NMF) are a large class of methods for pursuing the low-rank approximation of a given data matrix. The conventional factorization models are based on the assumption that the data matrices are contaminated stochastically by some type of noise. Thus t...
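As background for the noise assumption mentioned above: under i.i.d. Gaussian noise, the best rank-r fit of a data matrix in the least-squares (maximum-likelihood) sense is its truncated SVD, by the Eckart–Young theorem. A small self-contained check, with sizes and noise level chosen arbitrarily:

import numpy as np

rng = np.random.default_rng(0)
r, m, n = 3, 60, 40
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # rank-r signal
X_noisy = X_true + 0.1 * rng.standard_normal((m, n))                 # additive Gaussian noise

# Truncated SVD = best rank-r approximation of X_noisy in Frobenius norm.
U, s, Vt = np.linalg.svd(X_noisy, full_matrices=False)
X_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))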



Journal title:
  • CoRR

Volume: abs/1209.0430  Issue: –

Pages: –

Publication date: 2012