CaltechAUTHORS
  A Caltech Library Service

Beyond Online Balanced Descent: An Optimal Algorithm for Smoothed Online Optimization

Goel, Gautam and Lin, Yiheng and Sun, Haoyuan and Wierman, Adam (2019) Beyond Online Balanced Descent: An Optimal Algorithm for Smoothed Online Optimization. In: 33rd Conference on Neural Information Processing Systems. Neural Information Processing Systems Foundation, Inc. , Art. No. 8463. https://resolver.caltech.edu/CaltechAUTHORS:20190626-100003384

Files:
- PDF (Published Version) — See Usage Policy. 249kB
- PDF (Submitted Version) — See Usage Policy. 546kB
- Archive (ZIP, Supplemental Material) — See Usage Policy. 354kB

Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20190626-100003384

Abstract

We study online convex optimization in a setting where the learner seeks to minimize the sum of a per-round hitting cost and a movement cost which is incurred when changing decisions between rounds. We prove a new lower bound on the competitive ratio of any online algorithm in the setting where the hitting costs are m-strongly convex and the movement costs are the squared ℓ₂ norm. This lower bound shows that no algorithm can achieve a competitive ratio that is o(m^(−1/2)) as m tends to zero. No existing algorithms have competitive ratios matching this bound, and we show that the state-of-the-art algorithm, Online Balanced Descent (OBD), has a competitive ratio that is Ω(m^(−2/3)). We additionally propose two new algorithms, Greedy OBD (G-OBD) and Regularized OBD (R-OBD), and prove that both algorithms have an O(m^(−1/2)) competitive ratio. The result for G-OBD holds when the hitting costs are quasiconvex and the movement costs are the squared ℓ₂ norm, while the result for R-OBD holds when the hitting costs are m-strongly convex and the movement costs are Bregman divergences. Further, we show that R-OBD simultaneously achieves a constant, dimension-free competitive ratio and sublinear regret when hitting costs are strongly convex.
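To make the setting in the abstract concrete, here is a minimal sketch (not the paper's exact algorithm) of a regularized-OBD-style update for the special case of quadratic hitting costs f_t(x) = (m/2)‖x − v_t‖² and squared-ℓ₂ movement costs. In this case the regularized minimization has a simple closed form. The function names, the choice of regularization weight `lam`, and the toy instance are all illustrative assumptions.

```python
import numpy as np

def robd_step(x_prev, v_t, m, lam):
    """One step of a simplified R-OBD-style update: choose
    x_t = argmin_x (m/2)||x - v_t||^2 + (lam/2)||x - x_prev||^2.
    Setting the gradient to zero gives the closed form below."""
    return (m * v_t + lam * x_prev) / (m + lam)

def total_cost(xs, vs, m):
    """Total hitting cost plus squared-l2 movement cost of a trajectory."""
    hit = sum(0.5 * m * np.sum((x - v) ** 2) for x, v in zip(xs, vs))
    move = sum(0.5 * np.sum((xs[i] - xs[i - 1]) ** 2)
               for i in range(1, len(xs)))
    return hit + move

# Toy 2-D instance: minimizers v_t arrive online, one per round.
rng = np.random.default_rng(0)
vs = [rng.standard_normal(2) for _ in range(10)]
m, lam = 0.5, 0.7  # strong-convexity parameter and (assumed) regularization weight
x = np.zeros(2)
xs = []
for v in vs:
    x = robd_step(x, v, m, lam)
    xs.append(x)
print(total_cost(xs, vs, m))
```

The trade-off the abstract describes is visible here: a larger `lam` keeps x_t close to x_{t−1} (small movement cost, large hitting cost), while a smaller `lam` tracks each round's minimizer v_t more aggressively.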


Item Type: Book Section
Related URLs:
- https://papers.nips.cc/paper/8463-beyond-online-balanced-descent-an-optimal-algorithm-for-smoothed-online-optimization (Publisher: Article)
- https://arxiv.org/abs/1905.12776 (arXiv: Discussion Paper)
ORCID:
- Goel, Gautam: 0000-0002-7054-7218
- Lin, Yiheng: 0000-0001-6524-2877
- Sun, Haoyuan: 0000-0002-6203-0198
- Wierman, Adam: 0000-0002-5923-0199
Additional Information: © 2020 Neural Information Processing Systems Foundation, Inc. Gautam Goel, Yiheng Lin, and Haoyuan Sun contributed equally to this work. This work was supported by NSF grants AitF-1637598 and CNS-1518941, with additional support for Gautam Goel provided by an Amazon AWS AI Fellowship.
Funders:
- NSF: CCF-1637598
- NSF: CNS-1518941
- Amazon Web Services: UNSPECIFIED
DOI: 10.48550/arXiv.1905.12776
Record Number: CaltechAUTHORS:20190626-100003384
Persistent URL: https://resolver.caltech.edu/CaltechAUTHORS:20190626-100003384
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 96721
Collection: CaltechAUTHORS
Deposited By: Tony Diaz
Deposited On: 26 Jun 2019 17:07
Last Modified: 02 Jun 2023 00:57
