This text provides a detailed and self-contained introduction to the core topics of optimal control for finite-dimensional deterministic dynamical systems. Skillfully designed to guide the student through the development of the subject, the book provides a rich collection of examples, exercises, illustrations, and applications to support comprehension of the material. Solutions to odd-numbered exercises are included, while a complete set of solutions is available to instructors who adopt the text for their class. The book is suitable as coursework for final-year undergraduates in (applied) mathematics or beginning graduate students in engineering. The required mathematical background includes calculus, linear algebra, a basic knowledge of differential equations, and a rudimentary acquaintance with control systems.
The book has developed out of lecture notes that were tested, adapted, and expanded over many years of teaching. Chapters 1-4 constitute the material for a basic course on optimal control, covering successively the calculus of variations, the minimum principle, dynamic programming, and linear quadratic control. The additional Chapter 5 provides brief introductions to a number of selected topics related to optimal control, which are meant to pique the reader's interest. Some mathematical background is summarized in Appendix A for easy review. Appendix B recalls some of the basics of differential equations and also provides a detailed treatment of Lyapunov stability theory, including LaSalle's invariance principle, as occasionally used in Chapters 3 and 4.