Abu-Mostafa, Yaser S. (1989) The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning. Neural Computation, 1 (3). pp. 312-317. ISSN 0899-7667. http://resolver.caltech.edu/CaltechAUTHORS:ABUnc89
When feasible, learning is a very attractive alternative to explicit programming. This is particularly true in areas where the problems do not lend themselves to systematic programming, such as pattern recognition in natural environments. The feasibility of learning an unknown function from examples depends on two questions:

1. Do the examples convey enough information to determine the function?
2. Is there a speedy way of constructing the function from the examples?

These questions contrast the roles of information and complexity in learning. While the two roles share some ground, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the information issue. In this review, we develop the main ideas of this framework and discuss how complexity fits in.
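The central tool from the Vapnik-Chervonenkis framework is the notion of shattering: a hypothesis class shatters a set of points if it can realize every possible labeling of them, and the VC dimension is the size of the largest shatterable set. As an illustrative sketch (not code from the paper), the check below tests shattering for a toy class of one-dimensional threshold classifiers; the `shattered` helper and the particular threshold set are assumptions chosen for demonstration:

```python
def shattered(points, hypotheses):
    """Return True if `hypotheses` realizes all 2^n labelings of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

# Toy class: 1-D thresholds h_t(x) = 1 if x >= t else 0, for a few values of t.
# (t=t in the lambda pins the loop variable, avoiding Python late binding.)
thresholds = [lambda x, t=t: int(x >= t) for t in [-2.0, -0.5, 0.5, 2.0]]

# A single point is shattered: both labels 0 and 1 are achievable.
print(shattered([0.0], thresholds))        # True
# Two points x1 < x2 are not: the labeling (1, 0) is impossible for thresholds,
# so the VC dimension of this class is 1.
print(shattered([0.0, 1.0], thresholds))   # False
```

The gap between the $2^n$ possible labelings and the number a class can actually realize is what drives the Vapnik-Chervonenkis generalization bounds: a small VC dimension means the examples carry enough information to pin down the function.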
Additional Information: © 1989 The MIT Press. Received 6 July 1989; accepted 25 July 1989. The support of the Air Force Office of Scientific Research under Grant AFOSR-88-0213 is gratefully acknowledged.