General performance metrics for the LASSO
Abstract
A recent line of work has established accurate predictions of the mean squared error (MSE) performance of non-smooth convex optimization methods when used to recover structured signals (e.g. sparse, low-rank) from noisy linear (and possibly compressed) observations. Specifically, in a recent paper [15] we precisely characterized the MSE performance of a general class of regularized M-estimators using a framework based on Gaussian process methods. Here, we extend the framework to the analysis of a general class of Lipschitz performance metrics, which, in addition to the standard MSE, includes the ℓ1-reconstruction error, the probability of successfully identifying whether an element belongs to the support of a sparse signal, the empirical distribution of the error, etc. For concreteness, we primarily focus on the problem of sparse recovery under ℓ1-regularized least squares (a.k.a. the LASSO). We illustrate the validity of the theoretical predictions through numerical simulations and discuss the importance of their precise nature in optimally tuning the involved parameters of the reconstruction method.
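For a sense of the setup, the following minimal sketch (not from the paper) simulates sparse recovery from noisy compressed Gaussian measurements, solves the LASSO with scikit-learn, and evaluates three of the performance metrics mentioned in the abstract: the MSE, the ℓ1-reconstruction error, and the fraction of the true support that is identified. The problem dimensions, the regularization level alpha, and the support-detection threshold are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper):
# n measurements, ambient dimension p, sparsity k, noise level sigma
n, p, k = 200, 400, 20
sigma = 0.1

# Sparse ground-truth signal and Gaussian measurement matrix
x0 = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
x0[support] = rng.standard_normal(k)
A = rng.standard_normal((n, p)) / np.sqrt(n)
y = A @ x0 + sigma * rng.standard_normal(n)

# LASSO: l1-regularized least squares; alpha is the tuning parameter
lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
x_hat = lasso.fit(A, y).coef_

err = x_hat - x0
mse = np.mean(err ** 2)                 # standard MSE metric
l1_error = np.sum(np.abs(err))          # l1-reconstruction error
est_support = np.flatnonzero(np.abs(x_hat) > 1e-3)   # hypothetical threshold
support_hit_rate = np.mean(np.isin(support, est_support))

print(f"MSE: {mse:.4e}")
print(f"l1 error: {l1_error:.4e}")
print(f"fraction of true support recovered: {support_hit_rate:.2f}")
```

Averaging such metrics over many random draws of A and the noise is one way to compare empirical performance against asymptotic predictions of the kind the paper derives, and to sweep the regularization parameter when tuning.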
Additional Information
© 2016 IEEE.
Additional details
- Eprint ID: 71674
- DOI: 10.1109/ITW.2016.7606820
- Resolver ID: CaltechAUTHORS:20161102-074552760
- Created: 2016-11-02 (from EPrints datestamp field)
- Updated: 2021-11-11 (from EPrints last_modified field)