CaltechAUTHORS
  A Caltech Library Service

Model Selection for Support Vector Machine Classification

Gold, Carl and Sollich, Peter (2003) Model Selection for Support Vector Machine Classification. Neurocomputing, 55 (1-2). pp. 221-249. ISSN 0925-2312. http://resolver.caltech.edu/CaltechAUTHORS:20130816-103255120

Available files:
- PDF (Uncorrected proof) - Accepted Version, 406 KB. See Usage Policy.
- PDF (ArXiv deposit) - Draft Version, 323 KB. See Usage Policy.


Abstract

We address the problem of model selection for Support Vector Machine (SVM) classification. For a fixed functional form of the kernel, model selection amounts to tuning the kernel parameters and the slack penalty coefficient C. We begin by reviewing a recently developed probabilistic framework for SVM classification. An extension to the case of SVMs with quadratic slack penalties is given, and a simple approximation for the evidence is derived which can be used as a criterion for model selection. We also derive the exact gradients of the evidence in terms of posterior averages and describe how they can be estimated numerically using Hybrid Monte Carlo techniques. Though computationally demanding, the resulting gradient ascent algorithm is a useful baseline tool for probabilistic SVM model selection, since it can locate maxima of the exact (unapproximated) evidence. We then perform extensive experiments on several benchmark data sets. The aim of these experiments is to compare the performance of probabilistic model selection criteria with alternatives based on estimates of the test error, namely the so-called “span estimate” and Wahba's Generalized Approximate Cross-Validation (GACV) error. We find that all the “simple” model selection criteria (the Laplace evidence approximations and the span and GACV error estimates) exhibit multiple local optima with respect to the hyperparameters. While some of these give performance that is competitive with results from other approaches in the literature, a significant fraction leads to rather higher test errors. The results for the evidence gradient ascent method show that the exact evidence also exhibits local optima, but these give test errors that are much less variable and consistently lower than those for the simpler model selection criteria.
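
For a concrete picture of what "tuning kernel parameters and the slack penalty coefficient C" involves, the sketch below runs a generic grid search over C and an RBF kernel width gamma, scored by cross-validated accuracy. This is a common baseline only, not the paper's evidence-based method or the span/GACV estimates; the synthetic data set, grid ranges, and use of scikit-learn are illustrative assumptions.

```python
# A minimal sketch of SVM model selection by grid search with
# cross-validated error as the selection criterion. This is a generic
# baseline, not the paper's probabilistic (evidence-based) method;
# the data, grid ranges, and library choice are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic binary classification data standing in for a benchmark set.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

best_score, best_params = -np.inf, None
# The hyperparameters at issue in the paper: slack penalty C and a
# kernel parameter (here the RBF width gamma), searched on a
# log-spaced grid as is conventional.
for C in np.logspace(-2, 3, 6):
    for gamma in np.logspace(-3, 1, 5):
        scores = cross_val_score(SVC(C=C, kernel="rbf", gamma=gamma), X, y, cv=5)
        if scores.mean() > best_score:
            best_score, best_params = scores.mean(), {"C": C, "gamma": gamma}

print(f"best CV accuracy {best_score:.3f} at {best_params}")
```

Cross-validation here plays the role that the evidence, span estimate, or GACV error plays in the paper: a scalar criterion over the hyperparameters whose optima determine the final classifier. The multiple local optima reported in the abstract arise on exactly such hyperparameter surfaces.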


Item Type: Article
Related URLs:
- DOI: http://dx.doi.org/10.1016/S0925-2312(03)00375-8 (Article)
- Publisher: http://www.sciencedirect.com/science/article/pii/S0925231203003758 (Article)
- arXiv: http://arxiv.org/abs/cond-mat/0203334 (Article)
Additional Information: Access to the Hewlett-Packard V2500 was provided by the Caltech Center for Advanced Computing Research (http://www.cacr.caltech.edu) through the National Partnership for Advanced Computational Infrastructure–A Distributed Laboratory for Computational Science and Engineering, supported by the NSF cooperative agreement ACI-9619020. We also thank Andrew Buchan for assistance with the early stages of the numerical experiments for quadratic penalty SVMs.
Group: Koch Laboratory, KLAB
Funders:
- NSF Cooperative Agreement: ACI-9619020
Subject Keywords: Support Vector Machines, model selection, probabilistic methods, Bayesian evidence
Record Number: CaltechAUTHORS:20130816-103255120
Persistent URL: http://resolver.caltech.edu/CaltechAUTHORS:20130816-103255120
Official Citation: Carl Gold, Peter Sollich, Model selection for support vector machine classification, Neurocomputing, Volume 55, Issues 1–2, September 2003, Pages 221-249, ISSN 0925-2312, http://dx.doi.org/10.1016/S0925-2312(03)00375-8. (http://www.sciencedirect.com/science/article/pii/S0925231203003758)
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 40588
Collection: CaltechAUTHORS
Deposited By: KLAB Import
Deposited On: 16 Jan 2008 01:53
Last Modified: 23 Sep 2013 22:17
