
The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning

Abu-Mostafa, Yaser S. (1989) The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning. Neural Computation, 1 (3). pp. 312-317. ISSN 0899-7667. doi:10.1162/neco.1989.1.3.312.



When feasible, learning is a very attractive alternative to explicit programming. This is particularly true in areas where the problems do not lend themselves to systematic programming, such as pattern recognition in natural environments. The feasibility of learning an unknown function from examples depends on two questions: 1. Do the examples convey enough information to determine the function? 2. Is there a speedy way of constructing the function from the examples? These questions contrast the roles of information and complexity in learning. While the two roles share some ground, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the information issue. In this review, we develop the main ideas of this framework and discuss how complexity fits in.
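The information question turns on shattering: a hypothesis class shatters a set of points if it can realize every possible labeling of them, and the VC dimension is the size of the largest shattered set. The following is a minimal illustrative sketch (not from the paper; the function names and the choice of 1-D threshold classifiers, whose VC dimension is 1, are assumptions made for the example):

```python
def shatters(points, hypotheses):
    """True if the hypothesis class realizes all 2^n labelings of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def thresholds(points):
    """Threshold classifiers h_t(x) = 1 iff x >= t, with enough distinct
    thresholds to produce every dichotomy such classifiers can realize."""
    cuts = sorted(points)
    ts = [cuts[0] - 1.0] + [c + 0.5 for c in cuts]
    return [lambda x, t=t: int(x >= t) for t in ts]

pts1 = [0.0]
pts2 = [0.0, 1.0]
print(shatters(pts1, thresholds(pts1)))  # True: any single point is shattered
print(shatters(pts2, thresholds(pts2)))  # False: the labeling (1, 0) is unrealizable
```

Because threshold functions are monotone in x, no two points can receive the labels (1, 0), so the class shatters one point but never two, confirming a VC dimension of 1.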

Item Type: Article
Additional Information: © 1989 The MIT Press. Received 6 July 1989; accepted 25 July 1989. The support of the Air Force Office of Scientific Research under Grant AFOSR-88-0213 is gratefully acknowledged.
Funding Agency: Air Force Office of Scientific Research
Grant Number: AFOSR-88-0213
Issue or Number: 3
Record Number: CaltechAUTHORS:ABUnc89
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 11996
Deposited By: Archive Administrator
Deposited On: 16 Oct 2008 22:24
Last Modified: 08 Nov 2021 22:23
