The main purpose of the book is to give a rigorous introduction to the most important and useful solution methods for various types of stochastic control problems for jump diffusions, and to their applications. Both the dynamic programming method and the stochastic maximum principle method are discussed, as well as the relation between them. Corresponding verification theorems involving the Hamilton–Jacobi–Bellman equation and/or (quasi-)variational inequalities are formulated. The text emphasises applications, mostly to finance. All the main results are illustrated by examples, and exercises with complete solutions appear at the end of each chapter, helping the reader understand the theory and see how to apply it. The book assumes some basic knowledge of stochastic analysis, measure theory and partial differential equations.
The 3rd edition is an expanded and updated version of the 2nd edition, containing recent developments within stochastic control and its applications. Specifically, there is a new chapter devoted to a comprehensive presentation of financial markets modelled by jump diffusions, and one on backward stochastic differential equations and convex risk measures. Moreover, the authors have expanded the optimal stopping and the stochastic control chapters to include optimal control of mean-field systems and stochastic differential games.