It is unbelievable how simple and straightforward each result appears once it has been found, and how difficult, as long as the way which leads to it is unknown.
- Ludwig Boltzmann, 1894
- Josiah Gibbs, 1894
- Max Planck, 1894
- Claude Shannon
I = –[p ln p + (1 – p) ln (1 – p)],
That is, the bit carries information that is just 0.32/ln 2 ≈ 0.46 of the maximum value.
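As a numerical sketch (the value p = 0.1 is an illustrative choice, not taken from the text, that reproduces roughly 0.32 nat), the binary entropy formula above can be evaluated directly:

```python
import math

def binary_entropy_nats(p):
    """I = -[p ln p + (1 - p) ln (1 - p)], in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

# A fair bit (p = 0.5) carries the maximum: ln 2 nats = 1 bit.
print(binary_entropy_nats(0.5) / math.log(2))  # 1.0

# A biased bit, e.g. p = 0.1 (illustrative), carries about
# 0.325 nat, i.e. roughly 0.46-0.47 of the maximum.
print(binary_entropy_nats(0.1) / math.log(2))
```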
Figure 6: Benford’s law – the relative frequency of a digit in a file of random numbers is not uniform. The frequency of the digit “1” is 6.5 times greater than that of the digit “9”.
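The frequencies in Benford’s law follow log10(1 + 1/d) for leading digit d; a short check (a sketch, not part of the original figure) confirms the ratio quoted in the caption:

```python
import math

# Benford's law: P(d) = log10(1 + 1/d) for leading digit d = 1..9.
freq = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

print(round(freq[1], 3))            # 0.301
print(round(freq[9], 3))            # 0.046
print(round(freq[1] / freq[9], 2))  # 6.58
```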
Figure 7: The logarithm of the relative frequency of nodes having n links (horizontal axis) plotted against the logarithm of the number of links n (vertical axis) produces a straight line with a slope of -1 for any integer n. This distribution is called a power-law distribution. To the right of the vertical axis, the number of links is smaller than the number of nodes (n < 1), leading to exponential decay and a curved line.
That is, the product of the relative frequency of a node having n links and n is constant – exactly what Zipf found empirically for the relative frequency of words in texts. (Zipf’s law is analogous to the classical limit of Planck’s equation.)
The distribution of word frequencies in texts is an example of a poll in which the authors vote for words according to their popularity. Poll distributions in general are discussed later in this chapter.
Figure 9: A full logarithmic graph of the distribution of wealth gives the typical straight line of a power-law function.
The energy that is stored in a body is called internal energy. Heat is the energy removed from a body or added to it. In this respect heat is similar to work: there is no work inside a body; work can be done on a body or by a body.
Why is this error significant to the understanding of entropy? Entropy is defined as heat divided by temperature. Since temperature is a property of a body, entropy is of the same nature as heat and work.
This is very significant as we will see later.
Entropy is the logarithm of the number of microstates of a thermally isolated system.
The number of microstates is the number of distinguishable ways in which a statistical system can be found.
Therefore, entropy is a measure of the complexity and uncertainty of a system, and NOT of disorder.
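As an illustrative sketch (the coin numbers are chosen here, not taken from the text), counting microstates for N coins with k heads shows entropy as the logarithm of the number of distinguishable arrangements:

```python
import math

def entropy_from_microstates(N, k):
    """S = ln(Omega), where Omega = C(N, k) is the number of
    distinguishable ways to place k heads among N coins."""
    return math.log(math.comb(N, k))

# 4 coins, 2 heads: Omega = C(4, 2) = 6 microstates, S = ln 6.
print(math.comb(4, 2))                           # 6
print(round(entropy_from_microstates(4, 2), 3))  # 1.792

# More coins -> more microstates -> higher entropy (more uncertainty).
print(entropy_from_microstates(100, 50) > entropy_from_microstates(10, 5))  # True
```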
This mistake comes from our intuition that entropy is generated in an irreversible operation (which is true). Nevertheless, entropy is defined in equilibrium, where Q/T has its maximal value. Namely,
S = (Q/T)reversible
and the second law states that:
S ≥ (Q/T)
which means that heat divided by temperature is greatest in a reversible operation, and not
(Q/T)irreversible > (Q/T)reversible !
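A numeric sketch of the point above (the temperatures and heat are chosen for illustration): when heat Q flows irreversibly from a hot body at Th to a cold body at Tc, the total entropy change Q/Tc - Q/Th is positive, while a reversible transfer would generate none:

```python
Q = 100.0   # joules of heat transferred (illustrative)
Th = 400.0  # hot body temperature, kelvin (illustrative)
Tc = 300.0  # cold body temperature, kelvin (illustrative)

# Entropy lost by the hot body and gained by the cold body:
dS_hot = -Q / Th
dS_cold = Q / Tc
dS_total = dS_hot + dS_cold

print(round(dS_total, 4))  # 0.0833 (J/K) > 0: entropy is generated
# In the reversible limit Th -> Tc, dS_total -> 0: equality in S >= Q/T.
```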
This error comes from our intuition that it is much easier to break things than to fix them. If we put a cube of sugar in a glass of tea, the sugar will dissolve. It requires much work to recover a sugar cube from a glass of sweetened tea. However, there are opposite examples: an emulsion of oil and water will spontaneously separate into two neat layers of oil and water.
Many believe that disorder increases spontaneously because it is a common belief. However, if we look around us we see that “order” increases all the time. Lord Kelvin, a famous 19th century scientist, claimed that objects heavier than air (namely, objects whose specific density is higher than that of air) cannot fly. He made this colossal mistake not because he did not know about Bernoulli’s law (that is forgivable…) but because he did not look at the birds in the sky! Like most people he saw birds flying, but he never related their flight to physics.
Order is generated all around us, and spontaneous generation of order should be explained by physics.
Everything in our world is energy. Therefore, when a file is transferred from a transmitter to a receiver, it is bound by the laws of thermodynamics.
If we use energy pulses as the bits of the transmitted file (an electromagnetic pulse is a classical oscillator), the transfer of energy from the hot transmitter to the cold receiver is a thermodynamic process in which the Shannon entropy is the amount of the increase of the Gibbs entropy of the process.
Shannon information IS entropy. The reason for this common error is the confusion between information à la Shannon, which is defined as the logarithm of the number of possible different transmitted files, and our intuition that information is ONE specific file. In our book a specific transmitted file (which is a microstate) is called content. Therefore, Shannon information is the logarithm of the number of all possible contents.
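A minimal sketch of the distinction drawn above (the file length is chosen for illustration): Shannon information counts all possible contents, not one specific file:

```python
import math

N = 8  # file length in bits (illustrative)

# One specific file is a single "content" (a microstate):
content = "10110010"

# Shannon information is the logarithm of the number of ALL
# possible contents: 2**N files of N bits -> log2(2**N) = N bits.
num_contents = 2 ** N
information_bits = math.log2(num_contents)
print(num_contents)      # 256
print(information_bits)  # 8.0
```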