Publisher's Synopsis
This paper presents a method for finding optimal controls of nonlinear systems subject to random excitations. The method is capable of generating global control solutions when state and control constraints are present. The solution is global in the sense that controls for all initial conditions in a region of the state space are obtained. The approach is based on Bellman's Principle of Optimality, the Gaussian closure, and the short-time Gaussian approximation. Examples include a system with a state-dependent diffusion term, a system in which the infinite hierarchy of moment equations cannot be analytically closed, and an impact system with an elastic boundary. The uncontrolled and controlled dynamics are studied by creating a Markov chain with a control-dependent transition probability matrix via the Generalized Cell Mapping method. In this fashion, both the transient and stationary controlled responses are evaluated. The results show excellent control performance.

Crespo, Luis G. and Sun, Jian Q.
Langley Research Center

Keywords: OPTIMAL CONTROL; STOCHASTIC PROCESSES; NONLINEAR SYSTEMS; HAMILTON-JACOBI EQUATION; BELLMAN THEORY; NUMERICAL ANALYSIS; MARKOV CHAINS; TRANSITION PROBABILITIES; BOUNDARY CONDITIONS; NORMAL DENSITY FUNCTIONS; DIFFERENTIAL EQUATIONS...
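The core numerical device described above, building a Markov chain whose transition probability matrix is estimated over a cell partition of the state space, can be illustrated with a minimal sketch. The one-dimensional dynamics, cell counts, sample sizes, and function names below are illustrative assumptions, not the paper's actual system or implementation; the paper additionally makes the transition matrix control-dependent, which this sketch omits.

```python
import numpy as np

# Illustrative sketch of the Generalized Cell Mapping (GCM) idea:
# partition an interval of the state space into cells, estimate a
# transition probability matrix by short-time Monte Carlo simulation
# of a stochastic system, then iterate the chain toward its
# stationary distribution. The Ornstein-Uhlenbeck-like dynamics
# below are a stand-in, not the paper's example systems.
rng = np.random.default_rng(0)

x_min, x_max, n_cells = -3.0, 3.0, 60
edges = np.linspace(x_min, x_max, n_cells + 1)
centers = 0.5 * (edges[:-1] + edges[1:])

dt, sigma, samples = 0.05, 0.5, 400

def step(x):
    """One Euler-Maruyama step of dx = -x dt + sigma dW (illustrative)."""
    return x - x * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)

# Build the transition probability matrix P[i, j] = Pr(cell i -> cell j)
# by launching many sample points from each cell center and counting
# which cells they land in after one short time step.
P = np.zeros((n_cells, n_cells))
for i, c in enumerate(centers):
    x0 = np.full(samples, c)
    x1 = np.clip(step(x0), x_min, x_max - 1e-9)
    j = np.digitize(x1, edges) - 1
    P[i] = np.bincount(j, minlength=n_cells) / samples

# Evolve an initial cell probability vector through the chain; the
# iterates give the transient response and converge to the stationary one.
p = np.ones(n_cells) / n_cells
for _ in range(2000):
    p = p @ P
```

For the controlled problem, each admissible control value would yield its own transition matrix, so the chain's evolution, and hence both transient and stationary responses, can be evaluated under any control law defined over the cells.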