
Accession Number : ADA297408
Title : The Mathematics of Measuring Capabilities of Artificial Neural Networks.
Descriptive Note : Doctoral thesis,
Corporate Author : AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH
Personal Author(s) : Carter, Martha A.
PDF Url : ADA297408
Report Date : JUN 1995
Pagination or Media Count : 121
Abstract : Researchers rely on the mathematics of Vapnik and Chervonenkis to capture quantitatively the capabilities of specific artificial neural network (ANN) architectures. The quantifier is known as the VC dimension and is defined on sets of functions or sets. Its value is the largest cardinality l such that there exists at least one set of vectors in R^d of cardinality l whose every dichotomy into two subsets can be implemented by the function or set. Stated another way, the VC dimension of a set of functions is the largest cardinality of a set such that there exists one set of that cardinality which can be shattered by the set of functions; a set of functions is said to shatter a set if each dichotomy of that set can be implemented by some function in the set. There is an abundance of research on determining the VC dimension of ANNs. In this document, research on the VC dimension is refined and extended, yielding formulas for evaluating the VC dimension of the set of functions representable by a feedforward, single hidden-layer perceptron artificial neural network. The fundamental thesis of this research is that the VC dimension is not an appropriate quantifier of ANN capabilities. (KAR)
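The definitions of shattering and VC dimension in the abstract can be checked by brute force for small finite function families. The sketch below (not from the thesis; the function family and point sets are illustrative assumptions) enumerates the dichotomies a family of 1-D threshold classifiers realizes on a point set: a single point is shattered, but two points are not, consistent with the threshold family having VC dimension 1.

```python
def dichotomies(points, classifiers):
    """Collect every labeling (dichotomy) of `points` realized by some classifier."""
    return {tuple(f(p) for p in points) for f in classifiers}

def shatters(points, classifiers):
    """True iff all 2^n dichotomies of the n points are realized."""
    return len(dichotomies(points, classifiers)) == 2 ** len(points)

# Illustrative family: 1-D threshold functions f_t(x) = [x >= t],
# with thresholds t sampled on a grid (t=t binds each threshold at definition time).
thresholds = [t / 10 for t in range(-20, 21)]
family = [lambda x, t=t: int(x >= t) for t in thresholds]

print(shatters([0.5], family))       # one point: both labelings are realizable
print(shatters([0.3, 0.7], family))  # two points: (1, 0) is never realized
```

Running this prints `True` then `False`: no threshold can label the left point 1 and the right point 0, so no 2-point set is shattered and the VC dimension of the threshold family is 1. The same brute-force check applies to any finite sample of a richer family, such as the single hidden-layer perceptrons studied in the thesis.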
Descriptors : *MEASUREMENT, *NEURAL NETS, *MATHEMATICAL ANALYSIS, FUNCTIONS, THESES, INVARIANCE.
Subject Categories : Numerical Mathematics
Computer Systems
Distribution Statement : APPROVED FOR PUBLIC RELEASE