- Title
- Stochastic Optimal Control for Multivariable Dynamical Systems Using Expectation Maximization
- Creator
- Mallick, Prakash; Chen, Zhiyong
- Relation
- IEEE Transactions on Neural Networks and Learning Systems Vol. -, Issue -, 21 July 2022, p. 1-15
- Publisher Link
- http://dx.doi.org/10.1109/tnnls.2022.3190246
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Resource Type
- journal article
- Date
- 2022
- Description
- Trajectory optimization is a fundamental stochastic optimal control (SOC) problem. This article deals with a trajectory optimization approach for dynamical systems subject to measurement noise that can be fitted into linear time-varying stochastic models. Exact or complete solutions to these kinds of control problems have been deemed analytically intractable in the literature because they fall under the category of partially observable Markov decision processes (MDPs); therefore, effective solutions with reasonable approximations are widely sought. We propose a reformulation of stochastic control in a reinforcement learning setting. This formulation combines the benefits of conventional optimal control procedures with the advantages of maximum likelihood approaches. Finally, an iterative trajectory optimization paradigm called SOC-expectation maximization (SOC-EM) is put forth. This trajectory optimization procedure achieves a greater reduction in cumulative cost-to-go, which is proven both theoretically and empirically. Furthermore, we provide novel theoretical results on the uniqueness of control parameter estimates. An analysis of the control covariance matrix is presented, which handles stochasticity by efficiently balancing exploration and exploitation.
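The abstract above describes casting stochastic optimal control as a maximum likelihood problem solved by expectation maximization. The sketch below is not the paper's SOC-EM algorithm; it is a generic illustration of the same EM-style idea (reward-weighted regression) on a hypothetical one-dimensional linear time-varying system with quadratic cost: the E-step weights sampled trajectories by exponentiated negative cost, and the M-step refits time-varying linear feedback gains by weighted least squares. All system matrices, noise levels, and the temperature are made-up values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D linear time-varying system: x_{t+1} = a[t]*x_t + b[t]*u_t + w_t
T = 20
a = np.linspace(1.0, 0.8, T)   # time-varying dynamics coefficients (assumed)
b = np.ones(T)
noise_std = 0.1                # process noise standard deviation (assumed)

def rollout(K, n=200):
    """Simulate n trajectories under the Gaussian policy u_t ~ N(K[t]*x_t, 0.3^2)."""
    x = np.full(n, 5.0)                    # common initial state
    cost = np.zeros(n)
    states, actions = [], []
    for t in range(T):
        u = K[t] * x + 0.3 * rng.standard_normal(n)   # exploratory linear policy
        cost += x**2 + 0.1 * u**2                     # quadratic stage cost
        states.append(x.copy())
        actions.append(u.copy())
        x = a[t] * x + b[t] * u + noise_std * rng.standard_normal(n)
    return np.array(states), np.array(actions), cost

# EM-style iteration: the E-step computes soft "posterior" trajectory weights
# exp(-beta * cost); the M-step refits each gain K[t] by weighted maximum
# likelihood (here, weighted least squares of u_t on x_t).
K = np.zeros(T)
for _ in range(15):
    S, U, c = rollout(K)
    w = np.exp(-0.05 * (c - c.min()))     # E-step: weight low-cost trajectories
    w /= w.sum()
    for t in range(T):                    # M-step: weighted ML fit of K[t]
        K[t] = np.sum(w * S[t] * U[t]) / (np.sum(w * S[t] ** 2) + 1e-8)

_, _, final_cost = rollout(K)
print(round(float(final_cost.mean()), 2))
```

Because each M-step only re-weights samples drawn from the current policy, the gains drift gradually toward the cost-minimizing feedback; the paper's contribution concerns, among other things, guarantees (monotone cost reduction, uniqueness of the parameter estimates) that a naive sampling scheme like this one does not provide.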
- Subject
- expectation maximization (EM); maximum likelihood; optimal control; reinforcement learning; stochastic systems; trajectory optimization
- Identifier
- http://hdl.handle.net/1959.13/1449269
- Identifier
- uon:43629
- Identifier
- ISSN:2162-237X
- Language
- eng