Accession Number : ADA184484
Title : General Potential Surfaces and Neural Networks.
Descriptive Note : Technical rept.
Corporate Author : BROWN UNIV PROVIDENCE RI CENTER FOR NEURAL SCIENCE
Personal Author(s) : Dembo,Amir ; Zeitouni,Ofer
PDF Url : ADA184484
Report Date : 24 Jun 1987
Pagination or Media Count : 32
Abstract : Investigation of Hopfield's model of associative memory implementation by a neural network led to a generalized potential system with much superior performance as an associative memory. In particular, there are no spurious memories, and any set of desired points can be stored, with unlimited capacity (in the continuous-time, real-space version of the model). The system has no limit cycles, and by proper choice of the design parameters the size of every basin of attraction can reach up to half the distance between the stored points. A discrete-time version whose state space is the unit hypercube is also derived, and it exhibits superior properties compared to the corresponding Hopfield network. In particular, the capacity of any system of N neurons, for a fixed desired size of the basins of attraction, grows exponentially with N and is asymptotically optimal in the information-theoretic sense. The computational complexity of this model is slightly larger than that of the Hopfield memory, but of the same order. The results are derived under an axiomatic approach which specifies the desired properties and shows that the above-mentioned model is the only one that achieves them.
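The continuous-time recall dynamics described in the abstract can be illustrated with a minimal sketch. The particular potential below, V(x) = -sum_mu ||x - xi_mu||^(-L), is an assumed inverse-power form chosen for illustration (the exponent L and the step-capping rule are hypothetical choices, not taken from the report); it has the key property the abstract claims: every stored point xi_mu is an attractor, and there are no other minima to act as spurious memories.

```python
import numpy as np

def recall(x0, memories, steps=200, tol=1e-3, L=4):
    """Gradient descent on the (assumed) potential
    V(x) = -sum_mu ||x - xi_mu||^(-L).
    Each stored point xi_mu is a singular minimum of V, so the
    descent flows to the nearest memory -- no spurious minima."""
    x = np.asarray(x0, dtype=float).copy()
    memories = np.asarray(memories, dtype=float)
    for _ in range(steps):
        diffs = x - memories                      # shape (M, d)
        dists = np.linalg.norm(diffs, axis=1)
        if dists.min() < tol:                     # converged to a memory
            break
        # grad V = sum_mu L * ||x - xi_mu||^(-(L+2)) * (x - xi_mu)
        g = ((L * dists ** (-(L + 2)))[:, None] * diffs).sum(axis=0)
        # step against the gradient; cap the step at half the distance
        # to the nearest memory so the pole is never overshot
        x -= min(0.5 * dists.min(), 1.0) * g / np.linalg.norm(g)
    return x

# Recall from a corrupted probe: the state flows to the nearest
# stored point, here (1, 0).
stored = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]
result = recall([0.8, 0.1], stored)
```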
Descriptors : *NEURAL NETS, ASSOCIATIVE PROCESSING, MODELS, SURFACES, INFORMATION THEORY
Subject Categories : Cybernetics
Distribution Statement : APPROVED FOR PUBLIC RELEASE