Published August 2017 | Journal Article (Open Access) | Published + Supplemental Material

A simple multi-class boosting framework with theoretical guarantees and empirical proficiency


There is a need for simple yet accurate white-box learning systems that train quickly and with little data. To this end, we showcase REBEL, a multi-class boosting method, and present a novel family of weak learners called localized similarities. Our framework provably minimizes the training error of any dataset at an exponential rate. We carry out experiments on a variety of synthetic and real datasets, demonstrating a consistent tendency to avoid overfitting. We evaluate our method on MNIST and standard UCI datasets against other state-of-the-art methods, showing the empirical proficiency of our method.
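To illustrate the general mechanism the abstract describes (boosting a sequence of weak learners so that training error decays exponentially), here is a minimal sketch of a generic AdaBoost-style loop with threshold stumps as weak learners. This is not the authors' REBEL algorithm or their localized-similarity learners; it is an assumed, simplified stand-in showing how reweighting training examples drives the ensemble's training error down.

```python
# Illustrative sketch of boosting with simple weak learners.
# NOT the REBEL implementation from the paper; a generic AdaBoost-style
# loop showing how weighted weak learners reduce training error.

import numpy as np

def stump_predict(X, feat, thresh, polarity):
    """Weak learner: threshold a single feature, output in {-1, +1}."""
    return polarity * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    """Pick the stump minimizing weighted error (brute-force search)."""
    best, best_err = (0, 0.0, 1.0), np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for polarity in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, polarity)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (feat, thresh, polarity)
    return best, best_err

def adaboost(X, y, rounds=10):
    """Binary AdaBoost: training error decays exponentially whenever
    each weak learner beats chance on the current weighting."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform example weights
    ensemble = []
    for _ in range(rounds):
        (feat, thresh, pol), err = fit_stump(X, y, w)
        err = max(err, 1e-12)        # avoid log(0) for a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, feat, thresh, pol)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
    return np.sign(score)

# Toy data: two clusters separable by a single feature threshold
X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
              [1.0, 1.0], [0.9, 0.8], [1.1, 0.9]])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
ens = adaboost(X, y, rounds=5)
train_err = np.mean(predict(ens, X) != y)   # 0.0 on this toy set
```

The paper's contribution differs in two ways this sketch omits: REBEL handles multiple classes directly rather than binary labels, and its localized-similarity weak learners are more expressive than axis-aligned stumps.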

Additional Information

© 2017 The author(s). The authors would like to thank anonymous reviewers for their feedback and Google Inc. and the Office of Naval Research MURI N00014-10-1-0933 for funding this work.

Attached Files

Supplemental Material - appel17a-supp.pdf

Published - appel17a.pdf


Files (2.4 MB)
appel17a-supp.pdf (36.6 kB)
appel17a.pdf (2.4 MB)

Additional details

August 19, 2023