Accession Number : AD0431055

Title :   APPLICATION OF DYNAMIC PROGRAMMING TO STOCHASTIC TIME OPTIMAL CONTROL

Corporate Author : SYSTEM DEVELOPMENT CORP SANTA MONICA CALIF

Personal Author(s) : Ash, M.

Report Date : 31 JAN 1964

Pagination or Media Count : 13

Abstract : A non-linear control process is discussed in which the control is bounded in absolute value. A random element (noise) is assumed to appear additively as part of the control variable. The performance criterion is to drive the system from its present perturbed state back to equilibrium in minimum ''expected'' time, the expectation being taken because of the presence of the random noise. The principle of optimality of dynamic programming is applied to derive a novel partial differential equation for the minimum expected time. Solutions of this equation yield the optimal control policy, which is bang-bang. Specifically, far from the origin of the corresponding phase space, the control is set, once only, to drive the system into the linear region near the origin. In the linear region, the control switching sequence corresponding to Bushaw's theorem ensues. (Author)
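The scheme described in the abstract can be illustrated with a toy discrete analogue (a sketch only: the random-walk dynamics, noise distribution, and grid size below are illustrative assumptions, not the report's model). Value iteration on the minimum expected hitting time plays the role of the principle-of-optimality equation, and the minimizing control comes out bang-bang, u = -sgn(x):

```python
import numpy as np

# Illustrative sketch, not the report's model: discrete-state analogue of
# stochastic minimum-expected-time control. The state evolves as
#   x_{t+1} = x_t + u + w,
# with bounded control u in {-1, +1} and additive noise
# w = +1 or -1 (prob 1/4 each) or 0 (prob 1/2).
# Value iteration on the expected time-to-origin T(x) is the discrete
# counterpart of the partial differential equation in the abstract.

N = 10                                   # grid: states -N .. N, target x = 0
NOISE = [(+1, 0.25), (-1, 0.25), (0, 0.5)]

def nxt(x, u, w):
    return max(-N, min(N, x + u + w))    # clamp next state to the grid

def backup(x, u, T):
    """One-step Bellman cost: 1 + E_w[ T(next state) ]."""
    return 1 + sum(p * T[nxt(x, u, w) + N] for w, p in NOISE)

T = np.zeros(2 * N + 1)                  # T[x + N] = expected time from x
for _ in range(500):                     # value iteration to convergence
    T = np.array([0.0 if x == 0 else min(backup(x, -1, T), backup(x, +1, T))
                  for x in range(-N, N + 1)])

# The minimizing control at each nonzero state -- it is bang-bang: u = -sgn(x)
policy = {x: min((-1, +1), key=lambda u: backup(x, u, T))
          for x in range(-N, N + 1) if x != 0}
```

In this toy problem the optimal control always pushes at full magnitude toward the origin, mirroring the report's conclusion that bounded-control minimum-expected-time policies are of the bang-bang type.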

Descriptors :   (*STOCHASTIC PROCESSES, ADAPTIVE CONTROL SYSTEMS), (*MATHEMATICAL LOGIC, THEORY), NONLINEAR SYSTEMS, OPTIMIZATION, TIME, EQUATIONS OF STATE, PROBABILITY, DISTRIBUTION, TRAJECTORIES

Distribution Statement : APPROVED FOR PUBLIC RELEASE