Accession Number : ADA295490
Title : Informed Prefetching and Caching
Corporate Author : CARNEGIE-MELLON UNIV PITTSBURGH PA SCHOOL OF COMPUTER SCIENCE
Personal Author(s) : Patterson, R. H. ; Gibson, Garth A. ; Ginting, Eka ; Stodolsky, Daniel ; Zelenka, Jim
PDF Url : ADA295490
Report Date : 11 MAY 1995
Pagination or Media Count : 26
Abstract : The underutilization of disk parallelism and file cache buffers by traditional file systems induces I/O stall time that degrades the performance of modern microprocessor-based systems. In this paper, we present aggressive mechanisms that tailor file system resource management to the needs of I/O-intensive applications. In particular, we show how to use application-disclosed access patterns (hints) to expose and exploit I/O parallelism and to dynamically allocate file buffers among three competing demands: prefetching hinted blocks, caching hinted blocks for reuse, and caching recently used data for unhinted accesses. Our approach estimates the impact of alternative buffer allocations on application execution time and applies a cost-benefit analysis to allocate buffers where they will have the greatest impact. We implemented informed prefetching and caching in DEC's OSF/1 operating system and measured its performance on a 150 MHz Alpha equipped with 15 disks running a range of applications including text search, 3D scientific visualization, relational database queries, speech recognition, and computational chemistry. Informed prefetching reduces the execution time of the first four of these applications by 20% to 87%. Informed caching reduces the execution time of the fifth application by up to 30%.
Descriptors : *INPUT OUTPUT PROCESSING, *RESOURCE MANAGEMENT, *COMPUTER FILES, *MAGNETIC DISKS, DATA BASES, BUFFERS, COMPUTATIONS, MICROPROCESSORS, COST ANALYSIS, ESTIMATES, TIME, CHEMISTRY, STALLING, ALLOCATIONS, SPEECH RECOGNITION, INTERROGATION, BENEFITS.
Subject Categories : Computer Programming and Software
Distribution Statement : APPROVED FOR PUBLIC RELEASE
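The abstract's central idea, allocating buffers by comparing the estimated execution-time impact of each competing use, can be illustrated with a loose sketch. This is not the paper's actual estimators or implementation; the consumer names, the greedy heap-based allocator, and the diminishing-returns value functions below are all illustrative assumptions.

```python
import heapq

def allocate_buffers(total_buffers, marginal_value):
    """Greedily give each buffer to the consumer whose *next* buffer
    has the highest estimated marginal value (e.g. seconds of stall
    avoided). marginal_value maps a consumer name to a function
    mv(n) -> value of that consumer's n-th buffer.
    """
    alloc = {name: 0 for name in marginal_value}
    # Max-heap (negated values) of each consumer's next-buffer value.
    heap = [(-mv(1), name) for name, mv in marginal_value.items()]
    heapq.heapify(heap)
    for _ in range(total_buffers):
        neg_val, name = heapq.heappop(heap)
        if -neg_val <= 0:          # no consumer benefits any further
            heapq.heappush(heap, (neg_val, name))
            break
        alloc[name] += 1
        # Re-insert this consumer with the value of its next buffer.
        nxt = marginal_value[name](alloc[name] + 1)
        heapq.heappush(heap, (-nxt, name))
    return alloc

# Hypothetical diminishing-returns curves for two competing demands:
# prefetching hinted blocks vs. caching recently used (unhinted) data.
demands = {
    "prefetch_hinted": lambda n: max(0.0, 10.0 - 2.0 * n),
    "lru_cache": lambda n: 5.0 / n,
}
print(allocate_buffers(6, demands))  # → {'prefetch_hinted': 4, 'lru_cache': 2}
```

The paper's cost-benefit framework is richer than this greedy loop (it estimates both the benefit of acquiring a buffer and the cost of ejecting one), but the sketch captures the core comparison: buffers flow to whichever demand promises the largest reduction in execution time per buffer.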