Published August 2, 2022
Version: public
Discussion Paper
A Note on Zeroth-Order Optimization on the Simplex
Abstract
We construct a zeroth-order gradient estimator for a smooth function defined on the probability simplex. The proposed estimator queries points on the simplex only. We prove that projected gradient descent and the exponential weights algorithm, when run with this estimator instead of exact gradients, converge at an O(T^{-1/4}) rate.
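The abstract compresses the whole scheme into two sentences, so a sketch may help make it concrete. The following is a minimal illustration, not the paper's construction (which is given in the linked arXiv note): it uses a generic two-point finite-difference estimate along a random sum-zero direction, so every function query lies on the simplex, plugged into the exponential weights update. All names and parameters here (`zo_gradient_estimate`, `eta`, `delta_max`, `T`) are illustrative assumptions.

```python
import numpy as np

def zo_gradient_estimate(f, x, delta_max=1e-3, rng=None):
    """Two-point finite-difference gradient estimate along a random
    sum-zero direction, so both query points stay on the simplex.
    Hypothetical stand-in for the paper's estimator."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    u = rng.standard_normal(d)
    u -= u.mean()                    # project onto {u : sum(u) = 0}
    u /= np.linalg.norm(u)
    # Largest step keeping both x + delta*u and x - delta*u nonnegative.
    nz = np.abs(u) > 1e-12
    feasible = np.min(x[nz] / np.abs(u[nz]))
    delta = min(delta_max, 0.5 * feasible)
    fd = (f(x + delta * u) - f(x - delta * u)) / (2 * delta)
    return (d - 1) * fd * u          # scale by tangent-space dimension

def exponential_weights(f, d, T=2000, eta=0.05, seed=0):
    """Exponential weights on the simplex, driven by the
    zeroth-order estimate above instead of exact gradients."""
    rng = np.random.default_rng(seed)
    x = np.full(d, 1.0 / d)          # uniform start: strictly interior
    for _ in range(T):
        g = zo_gradient_estimate(f, x, rng=rng)
        w = x * np.exp(-eta * g)
        x = w / w.sum()              # multiplicative update stays on the simplex
    return x

# Example: minimize a smooth quadratic over the 3-simplex.
if __name__ == "__main__":
    target = np.array([0.6, 0.3, 0.1])
    print(exponential_weights(lambda x: 0.5 * np.sum((x - target) ** 2), d=3))
```

Because exponential weights multiplies coordinates by positive factors, iterates started at the uniform distribution stay strictly interior, which is what keeps the finite-difference queries feasible; the O(T^{-1/4}) rate quoted in the abstract accounts for the bias and variance such gradient estimates introduce.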
Additional details
Identifiers
- Eprint ID: 118519
- Resolver ID: CaltechAUTHORS:20221220-221907545
Related works
- Describes: http://arxiv.org/abs/2208.01185
Dates
- Created: 2022-12-21 (from EPrint's datestamp field)
- Updated: 2023-06-02 (from EPrint's last_modified field)