Accession Number : AD0656470

Title :   A CLASS OF MODELS FOR ADAPTIVE EXPERIMENTATION AND CONTROL.

Descriptive Note : Technical rept.

Corporate Author : MASSACHUSETTS INST OF TECH CAMBRIDGE OPERATIONS RESEARCH CENTER

Personal Author(s) : Hurst, Ernest Gerald, Jr.

Report Date : JUL 1967

Pagination or Media Count : 136

Abstract : Solutions are presented for several control problems with discretely dynamic, stochastic, partially observable states in which the amount of experimentation at each stage constitutes an important control decision. Bayesian autoregressive time series models are given, both in general and under the assumption of Normal density functions for the change process, the data generator, and the statistical description of the state. A general dynamic programming formulation for control of the known first-order process is obtained; it is specialized to the case with quadratic cost of error and proportional cost of experimentation. The optimal experimental policy at every stage is found, and the form of, and numerical values for, the steady-state policy are presented. A set-up cost of experimentation is introduced, and a two-level experimental policy analogous to the (s,S) policy in inventory theory is obtained. The assumption of completely known process parameters is relaxed by allowing uncertainty in the precision of change. A three-variable dynamic programming formulation is solved for its optimal experimental policy, which is described by two critical numbers. A simple approximately optimal policy in terms of the earlier numerical results is proposed. (Author)
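To make the abstract's setting concrete, the following is a minimal illustrative sketch, not the report's actual formulation: a scalar state drifts as a Gaussian random walk, the belief about it stays Normal, and at each stage the controller may buy observation precision. Per-stage cost is posterior variance (standing in for quadratic error) plus a proportional charge per unit of precision and a set-up charge whenever any experimentation occurs. Under the set-up cost, a two-level rule analogous to the (s,S) inventory policy applies: do nothing while predictive variance is below an upper trigger `s_var`, and once it is exceeded, buy enough precision to bring the variance down to a lower level `S_var`. All names, parameter values, and the specific policy form here are assumptions of this sketch.

```python
# Illustrative sketch only; names (step_variance, two_level_policy, s_var,
# S_var, etc.) and numbers are hypothetical, not taken from the report.

def step_variance(v, q, precision):
    """One stage of belief variance: predict (variance grows by the change
    variance q), then update with any observation precision bought."""
    v_pred = v + q                                  # Gaussian random-walk drift
    if precision > 0.0:
        v_pred = 1.0 / (1.0 / v_pred + precision)   # Normal-Normal update
    return v_pred

def two_level_policy(v, q, s_var, S_var):
    """(s,S)-style rule: if predictive variance exceeds s_var, buy exactly
    enough precision to bring it down to S_var; otherwise buy nothing."""
    v_pred = v + q
    if v_pred <= s_var:
        return 0.0
    return 1.0 / S_var - 1.0 / v_pred

def simulate(T=50, q=0.5, c=0.1, K=1.0, s_var=2.0, S_var=1.0, v0=1.0):
    """Accumulate per-stage cost: posterior variance (quadratic-error proxy)
    + c per unit precision + set-up cost K whenever experimentation occurs."""
    v, total_cost = v0, 0.0
    for _ in range(T):
        p = two_level_policy(v, q, s_var, S_var)
        v = step_variance(v, q, p)
        total_cost += v + c * p + (K if p > 0.0 else 0.0)
    return v, total_cost
```

With these parameters the variance cycles: it drifts up by q each idle stage and is reset to S_var whenever the predictive variance crosses s_var, so the belief uncertainty stays bounded at a cost that trades off error against experimentation.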

Descriptors :   (*ADAPTIVE SYSTEMS, CONTROL), (*STOCHASTIC PROCESSES, DECISION THEORY), (*OPERATIONS RESEARCH, MATHEMATICAL MODELS), THESES, TIME SERIES ANALYSIS, SAMPLING, MULTIVARIATE ANALYSIS, PROBABILITY DENSITY FUNCTIONS, DYNAMIC PROGRAMMING, INVENTORY CONTROL, MANAGEMENT PLANNING AND CONTROL, COSTS

Subject Categories : Operations Research

Distribution Statement : APPROVED FOR PUBLIC RELEASE