Accession Number : AD0684541

Title :   STOCHASTIC OPTIMAL CONTROL WITH IMPERFECTLY KNOWN DISTURBANCES.

Descriptive Note : Doctoral thesis,

Corporate Author : WASHINGTON UNIV ST LOUIS MO DEPT OF SYSTEMS MECHANICAL AND AEROSPACE ENGINEERING

Personal Author(s) : Tarn, Tzyh-John

Report Date : JUN 1968

Pagination or Media Count : 111

Abstract : The optimal adaptive control of linear, discrete stochastic systems is studied. A method is presented for relaxing the usual assumption that the distributions of the disturbances are known. The additive white Gaussian disturbances are assumed to have fixed but unknown parameters. The basic idea is to treat the unknown parameters as random variables whose a priori probability densities are given. By applying Bayesian filtering theory, the solution is obtained as recursion equations for sequentially computing the a posteriori probability densities of these random variables from the measurements. From these a posteriori probability densities, estimates can be formed. To determine the control, the expected value of a quadratic cost functional is used as the criterion function. By applying Bellman's dynamic programming approach, one obtains an exact analytical solution for the feedback control law. With the exact analytical solution in hand, the dual aspect of the optimal control is easy to study. (Author)
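A minimal sketch of the recursion idea described in the abstract (not the thesis's actual equations): if the additive white Gaussian disturbance has a fixed but unknown mean with a Gaussian a priori density, the a posteriori density stays Gaussian, so the sequential Bayesian update reduces to two scalar recursions. All names, priors, and parameter values below are illustrative assumptions.

```python
import numpy as np

def posterior_update(mu, v, w, sigma2):
    """One Bayesian recursion step: fold a new disturbance sample w into
    the Gaussian a posteriori density N(mu, v) of the unknown mean theta.
    sigma2 is the (assumed known) disturbance variance."""
    v_new = 1.0 / (1.0 / v + 1.0 / sigma2)   # posterior variance shrinks
    mu_new = v_new * (mu / v + w / sigma2)   # precision-weighted mean
    return mu_new, v_new

rng = np.random.default_rng(0)
theta_true, sigma2 = 2.0, 1.0   # fixed but unknown mean; known variance
mu, v = 0.0, 10.0               # diffuse a priori density for theta

for _ in range(200):
    w = theta_true + rng.normal(0.0, np.sqrt(sigma2))  # observed disturbance
    mu, v = posterior_update(mu, v, w, sigma2)

print(mu, v)  # posterior mean approaches theta_true; variance shrinks toward 0
```

In the thesis this estimate feeds a feedback control law derived by dynamic programming from a quadratic cost; that step is not sketched here.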

Descriptors :   (*ADAPTIVE CONTROL SYSTEMS, STOCHASTIC PROCESSES), INFORMATION THEORY, DYNAMIC PROGRAMMING, ANALYSIS OF VARIANCE, OPTIMIZATION, FEEDBACK, THESES

Subject Categories : Statistics and Probability

Distribution Statement : APPROVED FOR PUBLIC RELEASE