Subspace Expanders and Matrix Rank Minimization
- Creators
- Oymak, Samet
- Khajehnejad, Amin
- Hassibi, Babak
Abstract
Matrix rank minimization (RM) problems have recently gained extensive attention due to numerous applications in machine learning, system identification, and graphical models. In the RM problem, one aims to find the matrix with the lowest rank that satisfies a set of linear constraints. Existing algorithms include nuclear norm minimization (NNM) and singular value thresholding. Thus far, most of the attention has been on i.i.d. Gaussian or Bernoulli measurement operators. In this work, we introduce a new class of measurement operators and a novel recovery algorithm that is notably faster than NNM. The proposed operators are based on what we refer to as subspace expanders, inspired by the well-known expander-graph-based measurement matrices in compressed sensing. We show that an n×n positive semidefinite (PSD) matrix of rank r can be uniquely recovered from a minimal sampling of O(nr) measurements using the proposed structures, and that the recovery algorithm can be cast as matrix inversion after a few initial processing steps.
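For context, the nuclear norm minimization (NNM) baseline referenced in the abstract relaxes the rank objective to the sum of singular values subject to the linear measurement constraints. The sketch below is only an illustration of that standard NNM formulation, not the paper's subspace-expander operators or its faster recovery algorithm; it assumes a generic Gaussian measurement operator and uses the cvxpy modeling library for the convex program.

```python
# Minimal sketch of the NNM baseline mentioned in the abstract:
# recover a low-rank PSD matrix X from linear measurements b_i = <A_i, X>.
# The random Gaussian A_i below are an illustrative assumption; the paper's
# subspace-expander operators and its matrix-inversion-based algorithm differ.
import numpy as np
import cvxpy as cp

n, r, m = 20, 2, 200  # matrix size, true rank, number of measurements

# Ground-truth rank-r PSD matrix
G = np.random.randn(n, r)
X_true = G @ G.T

# Linear measurement operator: b_i = trace(A_i^T X)
A = [np.random.randn(n, n) for _ in range(m)]
b = np.array([np.trace(Ai.T @ X_true) for Ai in A])

# Nuclear norm minimization over PSD matrices satisfying the measurements
X = cp.Variable((n, n), PSD=True)
constraints = [cp.trace(Ai.T @ X) == bi for Ai, bi in zip(A, b)]
prob = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
prob.solve()

print("relative recovery error:",
      np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))
```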
Additional Information
© 2011 IEEE. Date of Current Version: 03 October 2011.
Attached Files
Submitted - Subspace Expanders and Matrix Rank Minimization.pdf
Files
Name | Size |
---|---|
Subspace Expanders and Matrix Rank Minimization.pdf (md5:750342ca15e2c766638c5604a7bcea6a) | 365.5 kB |
Additional details
- Eprint ID
- 30014
- Resolver ID
- CaltechAUTHORS:20120406-111241824
- Created
- 2012-04-06 (from EPrint's datestamp field)
- Updated
- 2021-11-09 (from EPrint's last_modified field)