Stochastic Optimal Control Via Bellman's Principle

Paperback (17 Sep 2018)


Publisher's Synopsis

This paper presents a method for finding optimal controls of nonlinear systems subject to random excitations. The method can generate global control solutions when state and control constraints are present; the solution is global in the sense that controls are obtained for all initial conditions in a region of the state space. The approach is based on Bellman's principle of optimality, Gaussian closure, and the short-time Gaussian approximation. Examples include a system with a state-dependent diffusion term, a system in which the infinite hierarchy of moment equations cannot be closed analytically, and an impact system with an elastic boundary. The uncontrolled and controlled dynamics are studied by creating a Markov chain with a control-dependent transition probability matrix via the Generalized Cell Mapping method. In this fashion, both the transient and stationary controlled responses are evaluated. The results show excellent control performance.

Crespo, Luis G. and Sun, Jian Q.
Langley Research Center

Keywords: OPTIMAL CONTROL; STOCHASTIC PROCESSES; NONLINEAR SYSTEMS; HAMILTON-JACOBI EQUATION; BELLMAN THEORY; NUMERICAL ANALYSIS; MARKOV CHAINS; TRANSITION PROBABILITIES; BOUNDARY CONDITIONS; NORMAL DENSITY FUNCTIONS; DIFFERENTIAL EQUATIONS...
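To make the cell-mapping idea in the synopsis concrete, here is a minimal sketch (not the authors' code) of a control-dependent transition probability matrix built with the short-time Gaussian approximation for a hypothetical 1D controlled diffusion dx = (f(x) + u(x)) dt + sigma dW. The state interval, cell count, drift f, feedback u, and noise level are all illustrative assumptions; the paper's examples are more elaborate.

```python
import numpy as np
from math import erf, sqrt

def gaussian_cdf(x, mu, s):
    # Standard normal CDF shifted/scaled to mean mu, std s.
    return 0.5 * (1.0 + erf((x - mu) / (s * sqrt(2.0))))

def transition_matrix(centers, width, drift, control, sigma, dt):
    """Control-dependent transition probability matrix:
    P[i, j] = Pr(state in cell j at t+dt | state in cell i at t).
    Each row is the short-time Gaussian centred at the one-step drift,
    integrated over the cells (Generalized Cell Mapping flavour)."""
    n = len(centers)
    P = np.zeros((n, n))
    s = sigma * sqrt(dt)
    for i, x in enumerate(centers):
        mu = x + (drift(x) + control(x)) * dt   # short-time mean
        for j, y in enumerate(centers):
            P[i, j] = (gaussian_cdf(y + width / 2, mu, s)
                       - gaussian_cdf(y - width / 2, mu, s))
        P[i] /= P[i].sum()                      # renormalise on the truncated domain
    return P

# Hypothetical example: linear drift f(x) = -x with stabilising feedback
# u(x) = -0.5 x, i.e. a controlled Ornstein-Uhlenbeck process.
N, L = 41, 3.0
centers = np.linspace(-L, L, N)
width = centers[1] - centers[0]
P = transition_matrix(centers, width,
                      drift=lambda x: -x,
                      control=lambda x: -0.5 * x,
                      sigma=0.5, dt=0.05)

# Transient response: propagate an initial cell distribution through P.
# Stationary response: keep iterating until the distribution settles.
p = np.zeros(N)
p[0] = 1.0                                      # start in the leftmost cell
for _ in range(2000):
    p = p @ P
print("stationary mean:", float(p @ centers))
```

Iterating the row-stochastic matrix plays the role the synopsis describes: the same Markov chain yields both the transient evolution of the cell probabilities and, in the limit, the stationary controlled response. For symmetric dynamics like this sketch, the stationary mean converges toward zero.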

Book information

ISBN: 9781723759253
Publisher: Independently Published
Imprint: Independently Published
Pub date: 17 Sep 2018
Language: English
Number of pages: 30
Weight: 95g
Height: 280mm
Width: 216mm
Spine width: 2mm