A Universal Analysis of Large-Scale Regularized Least Squares Solutions
Abstract
A problem that has been of recent interest in statistical inference, machine learning and signal processing is that of understanding the asymptotic behavior of regularized least squares solutions under random measurement matrices (or dictionaries). The Least Absolute Shrinkage and Selection Operator (LASSO or least-squares with ℓ_1 regularization) is perhaps one of the most interesting examples. Precise expressions for the asymptotic performance of LASSO have been obtained for a number of different cases, in particular when the elements of the dictionary matrix are sampled independently from a Gaussian distribution. It has also been empirically observed that the resulting expressions remain valid when the entries of the dictionary matrix are independently sampled from certain non-Gaussian distributions. In this paper, we confirm these observations theoretically when the distribution is sub-Gaussian. We further generalize the previous expressions for a broader family of regularization functions and under milder conditions on the underlying random, possibly non-Gaussian, dictionary matrix. In particular, we establish the universality of the asymptotic statistics (e.g., the average quadratic risk) of LASSO with non-Gaussian dictionaries.
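The LASSO estimator analyzed in the abstract solves min_x ½‖y − Ax‖² + λ‖x‖₁ for a dictionary matrix A. As a concrete illustration (not the paper's analysis technique), the following is a minimal sketch that solves this objective with the iterative soft-thresholding algorithm (ISTA) on a randomly drawn Gaussian dictionary; all variable names, the step-size choice, and the problem dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, y, lam, n_iters=1000):
    """Solve min_x 0.5*||y - A x||^2 + lam*||x||_1 via ISTA."""
    # 1/L step size, where L = ||A||_2^2 bounds the gradient's Lipschitz constant
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)          # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Example: sparse recovery with an i.i.d. Gaussian dictionary
rng = np.random.default_rng(0)
n, p, k = 100, 200, 5                     # measurements, dimension, sparsity
A = rng.standard_normal((n, p)) / np.sqrt(n)
x0 = np.zeros(p)
x0[:k] = 1.0                              # ground-truth sparse signal
y = A @ x0 + 0.01 * rng.standard_normal(n)
x_hat = lasso_ista(A, y, lam=0.01)
```

The universality result described in the abstract says that asymptotic statistics such as the quadratic risk ‖x_hat − x0‖²/p are unchanged if the Gaussian entries of `A` above are replaced by, e.g., scaled Rademacher (±1) entries, which is easy to check empirically with this sketch.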
Additional Information
© 2017 Neural Information Processing Systems Foundation, Inc.
Files
Published - 6930-a-universal-analysis-of-large-scale-regularized-least-squares-solutions.pdf (413.5 kB)
Additional details
- Eprint ID
- 92113
- Resolver ID
- CaltechAUTHORS:20190107-105602343
- Created
- 2019-01-07 (from EPrint's datestamp field)
- Updated
- 2019-10-03 (from EPrint's last_modified field)
- Series Name
- Advances in Neural Information Processing Systems
- Series Volume or Issue Number
- 30