Linear Stochastic Control Systems presents a thorough description of the mathematical theory and fundamental principles of linear stochastic control systems. Both continuous-time and discrete-time systems are covered in depth.
Reviews of modern probability theory, random processes, and Itô stochastic differential equations are provided. Discrete-time stochastic systems theory, optimal estimation and Kalman filtering, and optimal stochastic control theory are studied in detail. A modern treatment of these same topics for continuous-time stochastic control systems is included. The text is written in an easy-to-understand style, and the reader needs only a background in elementary real analysis and linear deterministic systems theory to comprehend the subject matter.
Linear Stochastic Control Systems is self-contained and provides a step-by-step development of the theory, with many illustrative examples, exercises, and engineering applications. This graduate textbook is also suitable for self-study, professional training, and as a handy research reference.
Series edited by: Richard Durrett, Mark Pinsky