Accession Number : AD0656696

Title :   OPTIMAL CONTROL OF A DISCRETE-TIME STOCHASTIC SYSTEM LINEAR IN THE STATE

Corporate Author : RAND CORP SANTA MONICA CALIF

Personal Author(s) : Midler, Joseph L.

Report Date : AUG 1967

Pagination or Media Count : 10

Abstract : Considered is a discrete-time stochastic control problem whose dynamic equations and loss function are linear in the state vector, with random coefficients, but may depend in a nonlinear, random manner on the control variables. The controls are constrained to lie in a given set. For this system it is shown that the optimal control, or policy, is independent of the value of the state. The result follows from a simple dynamic programming argument. Under suitable restrictions on the functions, the dynamic programming approach leads to efficient computational methods for obtaining the controls via a sequence of mathematical programming problems, each in fewer variables than the total number of controls in the process. The result provides another instance of certainty equivalence for a sequential stochastic decision problem: the expectations of the random variables play the role of certainty equivalents, in the sense that the optimal control can be found by solving a deterministic problem in which expectations replace the random quantities.
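
A worked sketch of the dynamic-programming argument summarized in the abstract may help; the model form, the notation x_t, u_t, omega_t, A_t, c_t, g_t, h_t, the assumption that the random coefficients are independent across stages, and in particular the assumption that the coefficients multiplying the state do not depend on the controls, are illustrative choices made here, not details taken from the report itself.

\[
x_{t+1} = A_t(\omega_t)\,x_t + g_t(u_t,\omega_t), \qquad
J = E\!\left[\sum_{t=0}^{N-1}\big(c_t(\omega_t)^{\mathsf T}x_t + h_t(u_t,\omega_t)\big)\right], \qquad u_t \in U_t .
\]

With terminal value \(V_N(x)\equiv 0\) and the inductive hypothesis \(V_{t+1}(x)=\alpha_{t+1}^{\mathsf T}x+\beta_{t+1}\),

\[
V_t(x)=\min_{u\in U_t} E\big[c_t(\omega_t)^{\mathsf T}x + h_t(u,\omega_t) + V_{t+1}\big(A_t(\omega_t)\,x + g_t(u,\omega_t)\big)\big]
\]
\[
=\big(E[c_t]+E[A_t]^{\mathsf T}\alpha_{t+1}\big)^{\mathsf T}x
 + \min_{u\in U_t} E\big[h_t(u,\omega_t)+\alpha_{t+1}^{\mathsf T}g_t(u,\omega_t)\big] + \beta_{t+1},
\]

so \(V_t\) is again linear in the state, with \(\alpha_t = E[c_t]+E[A_t]^{\mathsf T}\alpha_{t+1}\). The state enters only the term outside the minimization, so the minimizing control at each stage does not depend on the state, and each stage reduces to a single mathematical programming problem over u_t alone; this is the sense in which the policy is state-independent and the computation splits into problems in fewer variables than the full control sequence. If, in addition, h_t and g_t are linear in the random coefficients, the expectations above equal the same functions evaluated at the expected coefficients, which is the certainty-equivalence statement in the abstract.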

Descriptors :   (*DECISION THEORY, STOCHASTIC PROCESSES), (*CONTROL, OPTIMIZATION), (*STOCHASTIC PROCESSES, CONTROL SYSTEMS), RANDOM VARIABLES, MATHEMATICAL PROGRAMMING, DYNAMICS, MATHEMATICAL MODELS, MAPPING(TRANSFORMATIONS), DYNAMIC PROGRAMMING, ALGORITHMS

Subject Categories : Statistics and Probability

Distribution Statement : APPROVED FOR PUBLIC RELEASE