Beyond Procrustes: Balancing-Free Gradient Descent for Asymmetric Low-Rank Matrix Sensing
Authors
Abstract
Low-rank matrix estimation plays a central role in various applications across science and engineering. Recently, nonconvex formulations based on matrix factorization have been shown to be provably solved by simple gradient descent algorithms with strong computational and statistical guarantees. However, when the low-rank matrices are asymmetric, existing approaches rely on adding a regularization term to balance the scale of the two factors, which in practice can be removed safely without hurting performance when the algorithm is initialized via the spectral method. In this paper, we provide a theoretical justification for this phenomenon in the matrix sensing problem, which aims to recover a low-rank matrix from a small number of linear measurements. As long as the measurement ensemble satisfies the restricted isometry property, gradient descent, in conjunction with spectral initialization, converges linearly without the need of explicitly promoting balancedness of the factors; in fact, the factors stay balanced automatically throughout the execution of the algorithm. Our analysis is based on analyzing the evolution of a new distance metric that directly accounts for the ambiguity due to invertible transforms, which might be of independent interest.
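For intuition, the following is a minimal numerical sketch (not the authors' implementation) of the balancing-free procedure described above: factor the unknown matrix as X = L R^T, which is only identifiable up to the invertible transforms (L, R) -> (L Q, R Q^{-T}), initialize the factors by a rank-r SVD of the back-projected measurements, and run plain gradient descent on the unregularized least-squares loss with no term penalizing imbalance between the factors. The Gaussian measurement model, problem sizes, step size, and iteration count below are illustrative assumptions.

import numpy as np

# Minimal sketch of balancing-free gradient descent for asymmetric low-rank matrix
# sensing; dimensions, step size, and iteration count are assumptions for illustration.
rng = np.random.default_rng(0)
n1, n2, r, m = 40, 30, 3, 1200
X_star = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))  # ground truth, rank r

# Gaussian measurement ensemble: y_k = <A_k, X_star> with A_k entries N(0, 1/m),
# which satisfies the restricted isometry property with high probability.
A = rng.standard_normal((m, n1, n2)) / np.sqrt(m)
y = np.einsum('kij,ij->k', A, X_star)

# Spectral initialization: rank-r SVD of the back-projection A^*(y) = sum_k y_k * A_k.
U, s, Vt = np.linalg.svd(np.einsum('k,kij->ij', y, A))
L = U[:, :r] * np.sqrt(s[:r])        # left factor,  n1 x r
R = Vt[:r, :].T * np.sqrt(s[:r])     # right factor, n2 x r

eta = 0.2 / s[0]                     # step size scaled by the leading singular value
for _ in range(500):
    residual = np.einsum('kij,ij->k', A, L @ R.T) - y
    G = np.einsum('k,kij->ij', residual, A)   # gradient of 0.5*||A(L R^T) - y||^2 in the product matrix
    # Plain gradient descent on both factors; no balancing regularizer such as
    # ||L^T L - R^T R||_F^2 is added.
    L, R = L - eta * G @ R, R - eta * G.T @ L

print('relative error:', np.linalg.norm(L @ R.T - X_star) / np.linalg.norm(X_star))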
Similar resources
Projected Wirtinger Gradient Descent for Low-Rank Hankel Matrix Completion in Spectral Compressed Sensing
This paper considers reconstructing a spectrally sparse signal from a small number of randomly observed time-domain samples. The signal of interest is a linear combination of complex sinusoids at R distinct frequencies. The frequencies can assume any continuous values in the normalized frequency domain [0, 1). After converting the spectrally sparse signal recovery into a low rank structured mat...
Low-rank Solutions of Linear Matrix Equations via Procrustes Flow
In this paper we study the problem of recovering a low-rank positive semidefinite matrix from linear measurements. Our algorithm, which we call Procrustes Flow, starts from an initial estimate obtained by a thresholding scheme followed by gradient descent on a non-convex objective. We show that as long as the measurements obey a standard restricted isometry property, our algorithm converges to...
Stochastic Variance-reduced Gradient Descent for Low-Rank Matrix Recovery from Linear Measurements
We study the problem of estimating low-rank matrices from linear measurements (a.k.a., matrix sensing) through nonconvex optimization. We propose an efficient stochastic variance reduced gradient descent algorithm to solve a nonconvex optimization problem of matrix sensing. Our algorithm is applicable to both noisy and noiseless settings. In the case with noisy observations, we prove that our a...
Nonconvex Low-Rank Matrix Recovery with Arbitrary Outliers via Median-Truncated Gradient Descent
Recent work has demonstrated the effectiveness of gradient descent for directly recovering the factors of low-rank matrices from random linear measurements in a globally convergent manner when initialized properly. However, the performance of existing algorithms is highly sensitive in the presence of outliers that may take arbitrary values. In this paper, we propose a truncated gradient descent...
Non-Convex Projected Gradient Descent for Generalized Low-Rank Tensor Regression
In this paper, we consider the problem of learning high-dimensional tensor regression problems with low-rank structure. One of the core challenges associated with learning high-dimensional models is computation since the underlying optimization problems are often non-convex. While convex relaxations could lead to polynomial-time algorithms, they are often slow in practice. On the other hand, limi...
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2021
ISSN: 1053-587X, 1941-0476
DOI: https://doi.org/10.1109/tsp.2021.3051425