Li, Ling and Abu-Mostafa, Yaser S. and Pratap, Amrit (2003) CGBoost: Conjugate Gradient in Function Space. California Institute of Technology. (Unpublished) http://resolver.caltech.edu/CaltechCSTR:2003.007
The superior out-of-sample performance of AdaBoost has been attributed to the fact that it minimizes a cost function based on the margin, and to the fact that it can be viewed as a special case of AnyBoost, an abstract gradient descent algorithm. In this paper, we provide a more sophisticated abstract boosting algorithm, CGBoost, based on conjugate gradient in function space. When the AdaBoost exponential cost function is optimized, CGBoost generally yields much lower cost and training error but higher test error, which implies that the exponential cost is vulnerable to overfitting. With the optimization power of CGBoost, we can adopt more "regularized" cost functions that have better out-of-sample performance but are difficult to optimize. Our experiments demonstrate that CGBoost generally outperforms AnyBoost in cost reduction, and that with suitable cost functions it can also achieve better out-of-sample performance.
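To make the idea concrete, here is a minimal toy sketch of conjugate gradient in function space, assuming decision stumps as base learners and the AdaBoost exponential cost C(F) = Σᵢ exp(−yᵢ F(xᵢ)). The stump fitter, the Polak-Ribière-style β, and the grid line search are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def stump_fit(X, target):
    """Fit a signed threshold stump whose predictions best align with `target`."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (-1.0, 1.0):
                pred = s * np.where(X[:, j] <= t, -1.0, 1.0)
                score = pred @ target  # alignment with the target direction
                if best is None or score > best[0]:
                    best = (score, j, t, s)
    _, j, t, s = best
    return lambda Z, j=j, t=t, s=s: s * np.where(Z[:, j] <= t, -1.0, 1.0)

def cgboost(X, y, rounds=10):
    """Toy CGBoost: conjugate-gradient descent on the exponential cost."""
    n = len(y)
    F = np.zeros(n)        # ensemble scores F(x_i) on the training sample
    d_prev = np.zeros(n)   # previous search direction (sample values)
    g_prev = None
    ensemble = []
    for _ in range(rounds):
        g = -y * np.exp(-y * F)        # functional gradient of the cost
        h = stump_fit(X, -g)           # base learner approximating -g
        f_vals = h(X)
        if g_prev is None:
            beta = 0.0
        else:                          # Polak-Ribiere-style beta (assumption)
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))
        d = f_vals + beta * d_prev     # conjugate direction in function space
        # crude grid line search for the step size along d (alpha=0 allowed,
        # so the cost never increases)
        steps = np.linspace(0.0, 1.0, 51)
        costs = [np.exp(-y * (F + a * d)).sum() for a in steps]
        alpha = steps[int(np.argmin(costs))]
        F = F + alpha * d
        d_prev, g_prev = d, g
        ensemble.append((alpha, beta, h))
    return F, ensemble
```

With β fixed at 0 this degenerates to plain AnyBoost-style functional gradient descent; the β·d_prev term is what carries the previous direction into the new one, which is the conjugate-gradient modification the abstract describes.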
|Item Type:||Report or Paper (Technical Report)|
|Group:||Computer Science Technical Reports|
|Subject Keywords:||conjugate-gradient, cost function, boosting, generalization, AdaBoost, AnyBoost, CGBoost|
|Usage Policy:||You are granted permission for individual, educational, research and non-commercial reproduction, distribution, display and performance of this work in any format.|
|Deposited By:||Imported from CaltechCSTR|
|Deposited On:||29 Aug 2003|
|Last Modified:||26 Dec 2012 14:14|