
A simple multi-class boosting framework with theoretical guarantees and empirical proficiency

Appel, Ron and Perona, Pietro (2017) A simple multi-class boosting framework with theoretical guarantees and empirical proficiency. Proceedings of Machine Learning Research, 70 . pp. 186-194. ISSN 1938-7228.

PDF (Published Version): see Usage Policy.

PDF (Supplemental Material): see Usage Policy.


There is a need for simple yet accurate white-box learning systems that train quickly and with little data. To this end, we showcase REBEL, a multi-class boosting method, and present a novel family of weak learners called localized similarities. Our framework provably minimizes the training error of any dataset at an exponential rate. We carry out experiments on a variety of synthetic and real datasets, demonstrating a consistent tendency to avoid overfitting. We evaluate our method on MNIST and standard UCI datasets against other state-of-the-art methods, showing the empirical proficiency of our method.
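
REBEL itself is not reproduced in this record, so as an illustrative point of reference only, the sketch below implements classical binary AdaBoost with decision-stump weak learners: the standard boosting loop whose exponential training-error decay the abstract's multi-class guarantee generalizes. The toy dataset and all names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    """Decision stump: +polarity where X[:, feat] <= thresh, -polarity elsewhere."""
    return polarity * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def fit_adaboost(X, y, n_rounds=20):
    """Classical binary AdaBoost with decision stumps; labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # example weights, kept normalized
    ensemble = []                                # list of (alpha, feat, thresh, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        for feat in range(X.shape[1]):           # exhaustive search over all stumps
            for thresh in np.unique(X[:, feat]):
                for polarity in (1.0, -1.0):
                    err = w[stump_predict(X, feat, thresh, polarity) != y].sum()
                    if err < best_err:
                        best_err, best = err, (feat, thresh, polarity)
        if best_err >= 0.5:
            break                                # no weak learner beats chance; stop
        eps = max(best_err, 1e-12)               # guard against log(0)
        alpha = 0.5 * np.log((1.0 - eps) / eps)  # weak learner's vote weight
        w = w * np.exp(-alpha * y * stump_predict(X, *best))
        w /= w.sum()                             # upweight misclassified examples
        ensemble.append((alpha,) + best)
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all weak learners."""
    score = np.zeros(len(X))
    for alpha, feat, thresh, polarity in ensemble:
        score += alpha * stump_predict(X, feat, thresh, polarity)
    return np.sign(score)

# Toy 1-D "interval" dataset: no single stump separates it, but a boosted
# combination of three stumps does.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1.0, 1.0, 1.0, -1.0])
ensemble = fit_adaboost(X, y, n_rounds=3)
```

On this toy dataset the ensemble reaches zero training error after three rounds, illustrating (in the binary case) the kind of exponential training-error decay the abstract claims for REBEL.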

Item Type: Article
ORCID: Perona, Pietro: 0000-0002-7583-5809
Additional Information: © 2017 The author(s). The authors would like to thank anonymous reviewers for their feedback and Google Inc. and the Office of Naval Research MURI N00014-10-1-0933 for funding this work.
Funding Agency: Office of Naval Research (ONR), grant N00014-10-1-0933
Record Number: CaltechAUTHORS:20180627-133511664
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 87405
Deposited By: Caroline Murphy
Deposited On: 27 Jun 2018 20:50
Last Modified: 03 Oct 2019 19:55
