Accession Number : ADA213369

Title :   Proceedings of the 1988 Workshop on Computational Learning Theory

Corporate Author : CALIFORNIA UNIV SANTA CRUZ

Personal Author(s) : Spatz, Bruce

Report Date : 1988

Pagination or Media Count : 454

Abstract : Contents: Learning in Neural Networks; Training a 3-Node Neural Network is NP-Complete; Equivalence of Models for Polynomial Learnability; Results on Learnability and the Vapnik-Chervonenkis Dimension; Learning Complicated Concepts Reliably and Usefully; Types of Noise in Data for Concept Learning; Learning in Parallel; Some Remarks About Space-Complexity of Learning, and Circuit Complexity of Recognizing; A General Lower Bound on the Number of Examples Needed for Learning; The Power of Vacillation; Prudence in Language Learning; Transformation of Probabilistic Learning Strategies into Deterministic Learning Strategies; Inductive Inference: An Abstract Approach; Learning Theories in a Subset of a Polyadic Logic; Predicting (0,1)-Functions on Randomly Drawn Points; Learning Probabilistic Prediction Functions; Learning Context-Free Grammars from Structural Data in Polynomial Time; Learning Pattern Languages from a Single Initial Example and from Queries; and Learning Regular Languages from Counterexamples. Keywords: Symposia. (JHD)

Descriptors :   *LEARNING, *NEURAL NETS, CIRCUITS, COMPUTATIONS, DETERMINANTS(MATHEMATICS), FUNCTIONS, INTERROGATION, LANGUAGE, MODELS, PATTERNS, POLYNOMIALS, PREDICTIONS, PROBABILITY, STRATEGY, STRUCTURAL PROPERTIES, SYMPOSIA, THEORY, TIME, NONLINEAR PROGRAMMING, BACKGROUND NOISE, CONTEXT FREE GRAMMARS.

Subject Categories : Cybernetics

Distribution Statement : APPROVED FOR PUBLIC RELEASE