Accession Number : ADA164453

Title :   Learning Internal Representations by Error Propagation

Descriptive Note : Technical rept. Mar-Sep 1985

Corporate Author : CALIFORNIA UNIV SAN DIEGO LA JOLLA INST FOR COGNITIVE SCIENCE

Personal Author(s) : Rumelhart, David E. ; Hinton, Geoffrey E. ; Williams, Ronald J.

Report Date : SEP 1985

Pagination or Media Count : 49

Abstract : This paper presents a generalization of the perceptron learning procedure for learning the correct sets of connections for arbitrary networks. The rule, called the generalized delta rule, is a simple scheme for implementing a gradient descent method for finding weights that minimize the sum-squared error of the system's performance. The major theoretical contribution of the work is the procedure called error propagation, whereby the gradient can be determined by individual units of the network based only on locally available information. The major empirical contribution of the work is to show that the problem of local minima is not serious in this application of gradient descent. Keywords: Learning; Networks; Perceptrons; Adaptive systems; Learning machines; Back propagation.
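
For illustration, the following is a minimal sketch of the generalized delta rule as summarized in the abstract: gradient descent on sum-squared error, with each layer's deltas computed from locally available quantities and propagated backward. The XOR task, network size, learning rate, and all variable names are assumptions chosen for this example, not taken from the report itself.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # XOR: the classic problem a single-layer perceptron cannot solve.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    Xb = np.hstack([X, np.ones((4, 1))])            # append a bias input

    # One hidden layer of two sigmoid units; the last row of each weight
    # matrix holds the bias weights.
    W1 = rng.normal(scale=0.5, size=(3, 2))         # input (+bias) -> hidden
    W2 = rng.normal(scale=0.5, size=(3, 1))         # hidden (+bias) -> output

    eta = 0.5                                       # learning rate
    for epoch in range(20000):
        # Forward pass.
        H = sigmoid(Xb @ W1)                        # hidden activations
        Hb = np.hstack([H, np.ones((4, 1))])
        O = sigmoid(Hb @ W2)                        # output activations

        # Error propagation: each unit's delta is computed from locally
        # available quantities (its own activation and the deltas of the
        # units it feeds).
        dO = (T - O) * O * (1 - O)                  # output-layer deltas
        dH = (dO @ W2[:2].T) * H * (1 - H)          # hidden-layer deltas

        # Gradient descent on the sum-squared error E = 0.5 * sum((T - O)**2).
        W2 += eta * Hb.T @ dO
        W1 += eta * Xb.T @ dH

    print(np.round(O, 2))   # typically approaches [[0], [1], [1], [0]]

Note that the hidden-layer deltas use only the forward weights W2 and the output deltas, which is the sense in which the gradient computation is local to each unit.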

Descriptors :   *ADAPTIVE SYSTEMS, *NETWORKS, *LEARNING, ERRORS, DESCENT, PERCEPTION, GRADIENTS, INTERNAL, LEARNING MACHINES, PROPAGATION, WEIGHT

Subject Categories : Psychology
      Electrical and Electronic Equipment

Distribution Statement : APPROVED FOR PUBLIC RELEASE