This monograph covers recent advances in a range of acceleration techniques frequently used in convex optimization. Using quadratic optimization problems, the authors introduce two key families of methods, namely momentum and nested optimization schemes. These methods are covered in detail and include Chebyshev acceleration, nonlinear acceleration, Nesterov acceleration, proximal acceleration and Catalyst, and restart schemes.
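To give a flavor of the momentum family, here is a minimal illustrative sketch (not taken from the book) comparing plain gradient descent with Polyak's heavy-ball momentum on a quadratic objective f(x) = ½xᵀAx − bᵀx whose Hessian eigenvalues lie in [μ, L]; the step and momentum parameters are the classical choices for quadratics, and the test problem itself is an arbitrary assumption:

```python
import numpy as np

# Random quadratic with eigenvalues spread in [mu, L] = [1, 100]
# (an assumed illustrative problem, not one from the monograph).
rng = np.random.default_rng(0)
n = 20
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(1.0, 100.0, n)
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)          # exact minimizer of f
mu, L = eigs[0], eigs[-1]

def grad(x):
    return A @ x - b

# Plain gradient descent with step size 1/L.
x = np.zeros(n)
for _ in range(100):
    x = x - grad(x) / L
err_gd = np.linalg.norm(x - x_star)

# Heavy-ball momentum with the classical parameters for quadratics:
# alpha = 4/(sqrt(L)+sqrt(mu))^2, beta = ((sqrt(L)-sqrt(mu))/(sqrt(L)+sqrt(mu)))^2.
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
x, x_prev = np.zeros(n), np.zeros(n)
for _ in range(100):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
err_hb = np.linalg.norm(x - x_star)

print(err_gd, err_hb)  # the momentum iterate is far closer to x_star
```

On ill-conditioned quadratics the momentum method's error contracts roughly at rate (√κ − 1)/(√κ + 1) per iteration versus 1 − 1/κ for gradient descent, where κ = L/μ, which is exactly the gap that acceleration techniques exploit.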
This book provides the reader with an in-depth description of developments in acceleration methods since the early 2000s, while pointing back to the earlier work that underpins them. The topic matters given the widespread modern application of convex optimization across many fields.
As an introduction to the topic, it enables readers to quickly grasp the key principles and apply the techniques to their own research.