Accession Number : ADA117100

Title :   Data Compression for Transient Measurements.

Descriptive Note : Conference paper

Personal Author(s) : Harley, Samuel F

PDF Url : ADA117100

Report Date : 18 Jun 1982

Pagination or Media Count : 15

Abstract : Data compression can be defined as the elimination of redundant data samples. Techniques such as the extraction of the mean and standard deviation, the root-mean-square (RMS), or other statistical measures such as histograms are certainly compressive in nature, but for the purposes of this presentation only those processes which allow the reconstruction of the time history of the original signal will be considered. Experimenters can be very protective of data collected during their experiments and often resist any effort to eliminate redundant data samples. The inclusion of non-intelligence-bearing data, however, can impose unacceptable burdens on data acquisition and processing systems, and can thus slow the testing process and inhibit productivity. The compression algorithm described here provides a modest compression ratio while maintaining the fidelity of the reconstructed signal. The use of these techniques decreases the requirements for dedicated off-board data memory and processor memory, for test site mass storage, and for archival storage.
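The abstract does not specify the paper's algorithm, but one common family of time-history-preserving compressors for transient measurements is dead-band (zero-order-hold) compression: a sample is retained only when it deviates from the last retained sample by more than a tolerance, so the signal can be reconstructed within that tolerance. The sketch below is purely illustrative, not the method described in this report; the function names and tolerance value are assumptions.

```python
# Hypothetical sketch of dead-band (zero-order-hold) compression; this is
# NOT necessarily the algorithm described in the report, only an example
# of a technique that permits reconstruction of the original time history.

def compress(samples, tolerance):
    """Return (index, value) pairs of the retained samples."""
    if not samples:
        return []
    kept = [(0, samples[0])]
    for i, value in enumerate(samples[1:], start=1):
        # Keep a sample only when it leaves the dead band around
        # the most recently retained value.
        if abs(value - kept[-1][1]) > tolerance:
            kept.append((i, value))
    return kept

def reconstruct(kept, length):
    """Rebuild the full-length signal by holding each kept value."""
    out = []
    k = 0
    for i in range(length):
        # Advance to the next retained sample once its index is reached.
        if k + 1 < len(kept) and kept[k + 1][0] <= i:
            k += 1
        out.append(kept[k][1])
    return out

# Example: a step-like transient compresses to a few retained samples,
# and the reconstruction stays within the chosen tolerance.
signal = [0.0, 0.01, 0.02, 5.0, 5.01, 5.02, 0.1, 0.1]
kept = compress(signal, tolerance=0.5)
restored = reconstruct(kept, len(signal))
```

Here only the samples at the transitions are retained, which is the sense in which such schemes trade a bounded reconstruction error for reduced storage.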

Descriptors :   *Data compression, *Data reduction, *Transients, *Pressure measurement, Blast, Overpressure, Experimental data, Data acquisition, Requirements, Algorithms

Subject Categories : Test Facilities, Equipment and Methods

Distribution Statement : APPROVED FOR PUBLIC RELEASE