The aim of this little book is to convey three principal developments in the evolution of modern information theory: first, Shannon's initiation of a revolution in 1948 through his interpretation of Boltzmann entropy as a measure of the information yielded by an elementary statistical experiment, and through his basic coding theorems on storing messages and transmitting them through noisy communication channels in an optimal manner; second, the influence of ergodic theory in enlarging the scope of Shannon's theorems through the works of McMillan, Feinstein, Wolfowitz, Breiman, and others, and its impact on the appearance of the Kolmogorov-Sinai invariant for elementary dynamical systems; and finally, the more recent work of Schumacher, Holevo, Winter, and others on the role of von Neumann entropy in the quantum avatar of the basic coding theorems, in which messages are encoded as quantum states, transmitted through noisy quantum channels, and retrieved by generalized measurements.