CaltechAUTHORS
  A Caltech Library Service

The Impact of Regularization on High-dimensional Logistic Regression

Salehi, Fariborz and Abbasi, Ehsan and Hassibi, Babak (2019) The Impact of Regularization on High-dimensional Logistic Regression. In: 33rd Conference on Neural Information Processing Systems. Neural Information Processing Systems Foundation, Inc., Art. No. 9369. https://resolver.caltech.edu/CaltechAUTHORS:20190628-084529981

Files:
- PDF, Published Version, 886kB (see Usage Policy)
- PDF, Submitted Version, 1MB (see Usage Policy)
- Archive (ZIP), Supplemental Material, 751kB (see Usage Policy)

Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20190628-084529981

Abstract

Logistic regression is commonly used for modeling dichotomous outcomes. In the classical setting, where the number of observations is much larger than the number of parameters, properties of the maximum likelihood estimator in logistic regression are well understood. Recently, Sur and Candes have studied logistic regression in the high-dimensional regime, where the numbers of observations and parameters are comparable, and have shown, among other things, that the maximum likelihood estimator is biased. In the high-dimensional regime the underlying parameter vector is often structured (sparse, block-sparse, finite-alphabet, etc.) and so in this paper we study regularized logistic regression (RLR), where a convex regularizer that encourages the desired structure is added to the negative of the log-likelihood function. An advantage of RLR is that it allows parameter recovery even for instances where the (unconstrained) maximum likelihood estimate does not exist. We provide a precise analysis of the performance of RLR via the solution of a system of six nonlinear equations, through which any performance metric of interest (mean, mean-squared error, probability of support recovery, etc.) can be explicitly computed. Our results generalize those of Sur and Candes, and we provide a detailed study for the cases of ℓ₂²-RLR and sparse (ℓ₁-regularized) logistic regression. In both cases, we obtain explicit expressions for various performance metrics and can find the values of the regularizer parameter that optimize the desired performance. The theory is validated by extensive numerical simulations across a range of parameter values and problem instances.
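The sparse (ℓ₁-regularized) setting the abstract analyzes can be illustrated with a minimal scikit-learn sketch: a sparse ground-truth parameter vector, a number of observations comparable to the number of parameters, and an ℓ₁ penalty added to the negative log-likelihood. The dimensions, signal strength, and penalty level below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# High-dimensional regime: n observations comparable to p parameters
# (values chosen for illustration only).
n, p, k = 400, 200, 20

# Sparse ground-truth parameter vector with k nonzero entries.
beta = np.zeros(p)
beta[:k] = 4.0

# Logistic model: y ~ Bernoulli(sigmoid(X @ beta)).
X = rng.standard_normal((n, p)) / np.sqrt(p)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta))).astype(int)

# l1-regularized logistic regression (sparse RLR). In scikit-learn,
# C is the *inverse* of the regularization strength, so smaller C
# means a stronger sparsity-inducing penalty.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X, y)

support = np.flatnonzero(clf.coef_.ravel())
print(f"recovered {support.size} nonzero coefficients out of {p}")
```

Sweeping `C` and recomputing a metric such as support recovery or mean-squared error of `clf.coef_` against `beta` mimics, empirically, the tuning of the regularization parameter that the paper characterizes analytically.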


Item Type: Book Section
Related URLs:
- https://papers.nips.cc/paper/9369-the-impact-of-regularization-on-high-dimensional-logistic-regression (Publisher, Article)
- https://arxiv.org/abs/1906.03761 (arXiv, Discussion Paper)
Additional Information: © 2019 Neural Information Processing Systems Foundation, Inc. This work was supported in part by the National Science Foundation under grants CNS-0932428, CCF-1018927, CCF-1423663 and CCF-1409204, by a grant from Qualcomm Inc., by a grant from Futurewei Inc., by NASA's Jet Propulsion Laboratory through the President and Director's Fund, and by King Abdullah University of Science and Technology.
Funders:
- NSF: CNS-0932428
- NSF: CCF-1018927
- NSF: CCF-1423663
- NSF: CCF-1409204
- Qualcomm Inc. (grant number unspecified)
- Futurewei Inc. (grant number unspecified)
- JPL President and Director's Fund (grant number unspecified)
- King Abdullah University of Science and Technology (KAUST) (grant number unspecified)
Record Number: CaltechAUTHORS:20190628-084529981
Persistent URL: https://resolver.caltech.edu/CaltechAUTHORS:20190628-084529981
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 96810
Collection: CaltechAUTHORS
Deposited By: Tony Diaz
Deposited On: 28 Jun 2019 17:08
Last Modified: 09 Jul 2020 21:15
