By establishing an alternative foundation of control theory, this thesis represents a significant advance in the theory of control systems, of interest to a broad range of scientists and engineers. While common control strategies for dynamical systems center on the system state as the object to be controlled, the approach developed here focuses on the state trajectory. The concept of precisely realizable trajectories identifies those trajectories that can be accurately achieved by applying appropriate control signals. The resulting simple expressions for the control signal lend themselves to immediate application in science and technology. The approach permits the generalization of many well-known results from the control theory of linear systems, e.g., the Kalman rank condition, to nonlinear systems. The relationships between controllability, optimal control, and trajectory tracking are clarified. Furthermore, the existence of linear structures underlying nonlinear optimal control is revealed, enabling the derivation of exact analytical solutions to an entire class of nonlinear optimal trajectory tracking problems. The clear and self-contained presentation focuses on a general and mathematically rigorous analysis of controlled dynamical systems. The concepts developed are visualized with the help of particular dynamical systems motivated by physics and chemistry.
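
As background, it may help to recall the classical linear result that the thesis generalizes; the formulation below is the standard textbook statement for linear time-invariant systems, not the nonlinear extension developed here. For a system

\[
\dot{x}(t) = A\,x(t) + B\,u(t), \qquad x(t) \in \mathbb{R}^{n}, \quad u(t) \in \mathbb{R}^{m},
\]

the Kalman rank condition states that the system is controllable if and only if the controllability matrix has full rank,

\[
\operatorname{rank}\begin{pmatrix} B & AB & A^{2}B & \cdots & A^{n-1}B \end{pmatrix} = n.
\]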