Accession Number : ADA181408
Title : Knowledge Based Synthesis of Efficient Structures for Concurrent Computation Using Fat-Trees and Pipelining.
Descriptive Note : Annual technical rept.,
Corporate Author : KESTREL INST PALO ALTO CA
Personal Author(s) : King, Richard M. ; Brown, Tom ; Green, Cordell
PDF Url : ADA181408
Report Date : 31 Dec 1986
Pagination or Media Count : 70
Abstract : In previous work the authors developed techniques to synthesize lattice and tree parallel structures from first-order logic specifications. They have now developed new techniques that synthesize new structures. First, the new techniques enable the synthesis of trees in which the width of the interconnections and the power of the nodes increase with distance from the leaves. This type of tree has been given the name fat-tree. Fat-trees are universal in that the performance of any network can be equaled by a fat-tree, up to a constant and factors logarithmic in the size of the structure being simulated. The constant is immense, so fat-trees are not at present a general method for simulating other structures. The idea of such a varying-width tree can, however, be used in specific cases as a synthesis target. The authors describe techniques that use extensions of their previous work to build specialized fat-trees satisfying certain first-order logic specifications. These fat-trees are efficient because they are specialized. The second extension is a proof that an appropriately defined parallel structure can be modified to produce a structure capable of pipelining, that is, processing different parts of several problem instances simultaneously in a manner similar to an assembly line. The proof is constructive; a synthesis method based on the proof is feasible.
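Note : The varying-width tree described in the abstract can be illustrated with a minimal sketch (illustrative only, not taken from the report): in a binary fat-tree whose per-link width doubles at each level toward the root, the aggregate bandwidth crossing every level of the tree stays constant, which is one common way such trees are dimensioned.

```python
def fat_tree_link_widths(levels, leaf_width=1):
    """Per-link width at each level of a binary fat-tree, leaves first.

    Assumes (hypothetically) that link width doubles at each level
    toward the root, starting from `leaf_width` at the leaves.
    """
    return [leaf_width * (2 ** level) for level in range(levels)]


def aggregate_bandwidth(levels, leaf_width=1):
    """Total bandwidth crossing each level of the same binary fat-tree.

    A binary tree with `levels` levels has 2 ** (levels - 1 - level)
    links at a given level (leaves at level 0), so doubling the link
    width per level keeps the aggregate bandwidth constant.
    """
    widths = fat_tree_link_widths(levels, leaf_width)
    return [widths[level] * 2 ** (levels - 1 - level)
            for level in range(levels)]
```

For example, with four levels the per-link widths are 1, 2, 4, 8 while the aggregate bandwidth at every level is 8, matching the intuition that interconnection capacity grows with distance from the leaves.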
Descriptors : *PARALLEL PROCESSORS, *COMPUTER LOGIC, ASSEMBLY, EFFICIENCY, STRUCTURES, NODES, SYNTHESIS, SPECIFICATIONS, TARGETS, DISCRETE FOURIER TRANSFORMS, TOPOLOGY
Subject Categories : Computer Hardware
Distribution Statement : APPROVED FOR PUBLIC RELEASE