
CGBoost: Conjugate Gradient in Function Space

Li, Ling and Abu-Mostafa, Yaser S. and Pratap, Amrit (2003) CGBoost: Conjugate Gradient in Function Space. California Institute of Technology. (Unpublished)



The superior out-of-sample performance of AdaBoost has been attributed to the fact that it minimizes a cost function based on margin, in the sense that it can be viewed as a special case of AnyBoost, an abstract gradient descent algorithm. In this paper, we provide a more sophisticated abstract boosting algorithm, CGBoost, based on conjugate gradient in function space. When the AdaBoost exponential cost function is optimized, CGBoost generally yields much lower cost and training error but higher test error, which implies that the exponential cost is vulnerable to overfitting. With the optimization power of CGBoost, we can adopt more "regularized" cost functions that have better out-of-sample performance but are difficult to optimize. Our experiments demonstrate that CGBoost generally outperforms AnyBoost in cost reduction. With suitable cost functions, CGBoost can achieve better out-of-sample performance.
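The idea described in the abstract can be illustrated with a minimal sketch: boosting as gradient descent in function space on the AdaBoost exponential cost C(F) = Σᵢ exp(-yᵢ F(xᵢ)), where each round fits a weak learner to the negative functional gradient and, in the conjugate-gradient variant, combines it with the previous search direction. This is an illustrative toy implementation with decision stumps and a Polak-Ribière-style conjugation coefficient, not the authors' reference code; all names and parameter choices here are assumptions.

```python
# Sketch of conjugate gradient in function space for boosting, assuming
# decision stumps as weak learners and the AdaBoost exponential cost.
# Illustrative only; not the paper's reference implementation.
import numpy as np

rng = np.random.default_rng(0)

def exp_cost(F, y):
    """AdaBoost exponential cost C(F) = sum_i exp(-y_i F(x_i))."""
    return np.exp(-y * F).sum()

def neg_gradient(F, y):
    """Negative functional gradient of the exponential cost at each point."""
    return y * np.exp(-y * F)

def fit_stump(X, target):
    """Fit a decision stump f(x) in {-1,+1} maximizing correlation with `target`."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1.0, -1.0):
                pred = sign * np.where(X[:, j] <= thr, 1.0, -1.0)
                score = pred @ target
                if best is None or score > best[0]:
                    best = (score, pred)
    return best[1]

def line_search(F, d, y, steps=np.linspace(0.0, 2.0, 201)):
    """Crude line search for the step size along direction d."""
    costs = [exp_cost(F + a * d, y) for a in steps]
    return steps[int(np.argmin(costs))]

def cgboost(X, y, rounds=15):
    F = np.zeros(len(y))      # current ensemble output on the training set
    d_prev = np.zeros(len(y)) # previous search direction
    g_prev = None             # previous negative gradient
    for _ in range(rounds):
        g = neg_gradient(F, y)
        f = fit_stump(X, g)   # weak hypothesis approximating the direction -grad
        if g_prev is None:
            d = f             # first round reduces to plain gradient descent
        else:
            # Polak-Ribiere-style conjugation, evaluated on the training points
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))
            d = f + beta * d_prev
        F = F + line_search(F, d, y) * d
        d_prev, g_prev = d, g
    return F

# Toy problem: labels from the sign of a linear rule in the first two features.
X = rng.normal(size=(200, 3))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
F = cgboost(X, y)
train_err = np.mean(np.sign(F) != y)
```

Setting `beta = 0` in every round recovers plain AnyBoost-style gradient descent, which makes the two abstract algorithms easy to compare on the same cost function.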

Item Type: Report or Paper (Technical Report)
Group: Computer Science Technical Reports
Subject Keywords: conjugate-gradient, cost function, boosting, generalization, AdaBoost, AnyBoost, CGBoost
Record Number: CaltechCSTR:2003.007
Persistent URL:
Usage Policy: You are granted permission for individual, educational, research and non-commercial reproduction, distribution, display and performance of this work in any format.
ID Code: 27069
Deposited By: Imported from CaltechCSTR
Deposited On: 29 Aug 2003
Last Modified: 03 Oct 2019 03:20
