Publisher's Synopsis
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the "driving force" of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.

It has been 140 years since Clausius coined the term "entropy", and almost 50 years since Shannon developed the mathematical theory of "information", subsequently renamed "entropy". In this book, the author advocates replacing "entropy" with "information", a term that has become widely used in many branches of science.

The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but as the fundamental concept of thermodynamics, a role held until now by the term "entropy".

The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur-Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose "driving force" is analyzed in terms of information.
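For readers who want the formulas behind these topic headings, a minimal sketch in standard textbook notation may help; the symbols below (p_i, N, V, U, m, h, k_B) are conventional and are not taken from the synopsis itself. The two quantities in question are Shannon's measure of information and the Sackur-Tetrode entropy of an ideal monatomic gas, which the book re-derives from informational arguments.

% Shannon's measure of information (missing information) for a distribution p_1, ..., p_n;
% the base of the logarithm fixes the unit (bits for base 2).
H(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log p_i

% Sackur-Tetrode entropy of an ideal monatomic gas: N particles of mass m,
% total energy U, volume V; k_B is Boltzmann's constant, h is Planck's constant.
\frac{S}{N k_B} = \ln\!\left[\frac{V}{N}\left(\frac{4\pi m U}{3 N h^{2}}\right)^{3/2}\right] + \frac{5}{2}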