Systems that evolve with time occur frequently in nature, and modelling the behaviour of such systems provides an important application of mathematics. These systems can be completely deterministic, but it may also be possible to influence their behaviour through intervention with 'controls'. The theory of optimal control is concerned with determining such controls which, at minimum cost, either direct the system along a given trajectory or enable it to reach a given point in its state space.
This textbook is a straightforward introduction to the theory of optimal control, with an emphasis on presenting many different applications. Professor Hocking has taken pains to develop the theory so as to display the main themes of the argument without resorting to sophisticated mathematical tools.
Problems in this setting can arise across a wide range of subjects, and there are illustrative examples of systems drawn from fields as diverse as dynamics, economics, population control, and medicine. Throughout there are many worked examples, and numerous exercises (with solutions) are provided.