# Entropy definitions

Unlike most physical quantities, entropy has several definitions of the same quantity:

1. Clausius entropy is the heat (energy change) divided by the temperature of its source. Clausius derived this entropy from Carnot’s efficiency. The Clausius entropy, or Clausius inequality, expresses the physical meaning of the second law of thermodynamics and gives the physical explanation of why entropy tends to grow. Since the Boltzmann entropy, Gibbs entropy, and Shannon entropy are identical to the Clausius entropy, they also tend to grow; nevertheless, no intuitive explanation of why they tend to grow is known to us.
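In the standard notation (the symbols below are ours, not given explicitly in the text), the Clausius definition and the Clausius inequality for a cyclic process can be written as:

```latex
dS = \frac{\delta Q}{T}, \qquad \oint \frac{\delta Q}{T} \le 0,
```

where $\delta Q$ is the heat exchanged and $T$ is the temperature of the source; equality holds for a reversible cycle.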

2. Boltzmann entropy is the logarithm of the number of configurations possible for a statistical system, multiplied by the Boltzmann constant. In his papers on the “H-theorem”, Boltzmann tried to derive the Clausius inequality from this expression for entropy. Although the mathematician Henri Poincaré proved that Boltzmann’s derivation was erroneous, many scientists still use it, and many misconceptions about entropy originate from this theorem.
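The Boltzmann definition above can be sketched in a few lines of Python. The two-state example at the end is our own illustration, not taken from the text:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (CODATA 2018 value)

def boltzmann_entropy(num_configurations: int) -> float:
    """S = k_B * ln(Omega), where Omega is the number of
    configurations (microstates) available to the system."""
    return K_B * math.log(num_configurations)

# Illustration: a system of N two-state particles has Omega = 2**N
# configurations, so its Boltzmann entropy is k_B * N * ln(2).
N = 100
print(boltzmann_entropy(2 ** N))  # equals K_B * 100 * math.log(2)
```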

3. Gibbs entropy is the sum over microstates of the probability of each microstate multiplied by its logarithm, multiplied by minus the Boltzmann constant. Gibbs entropy is basically Boltzmann entropy written as a sum over probabilities. It is worth noting that some scholars sum the probabilities in Gibbs entropy over the states instead of over the microstates. This error gives the wrong impression that the canonical distribution is the only distribution possible at maximum entropy, and a whole erroneous field called “nonextensive thermodynamics” was formed in order to explain long-tail distributions.
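A minimal sketch of the Gibbs sum, which also shows how it reduces to the Boltzmann expression when all microstates are equally probable (the numerical example is ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probabilities) -> float:
    """S = -k_B * sum_i p_i * ln(p_i), summed over microstates."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# When all Omega microstates are equally probable (p_i = 1/Omega),
# the Gibbs entropy reduces to the Boltzmann entropy k_B * ln(Omega).
omega = 8
uniform = [1 / omega] * omega
print(gibbs_entropy(uniform))  # equals K_B * math.log(8)
```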

4. Shannon entropy, which is the Gibbs entropy without the Boltzmann constant. Some scholars claim that the identity between Shannon entropy and Gibbs entropy is a coincidence. In this book we show that they are the same quantity: when information is conveyed by very hot pulses, the Gibbs entropy is identical to the Shannon entropy and does not depend on the energy of the pulses. A hot electromagnetic pulse is a classical pendulum, and its entropy is one Boltzmann constant.
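The relation between the two quantities can be sketched as follows: the Shannon entropy (here in bits, using base-2 logarithms) and the Gibbs entropy are the same sum, differing only by the factor k_B (times ln 2 when converting between log bases). The fair-coin example is our own illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_entropy(probabilities) -> float:
    """H = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def gibbs_entropy(probabilities) -> float:
    """S = -k_B * sum_i p_i * ln(p_i): the same sum, rescaled by k_B."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# A fair coin carries one bit of Shannon entropy; dividing the Gibbs
# entropy by k_B * ln(2) recovers the same number of bits.
fair_coin = [0.5, 0.5]
print(shannon_entropy(fair_coin))                      # 1.0
print(gibbs_entropy(fair_coin) / (K_B * math.log(2)))  # 1.0
```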