
Entropy Summary

We have traveled a long way – from Carnot to Shannon – toward an understanding of entropy and the second law of thermodynamics, which states that entropy can never decrease and tends only to increase. Our journey began with Carnot, who derived the expression for the maximum efficiency of a theoretical machine that produces mechanical work from the spontaneous flow of energy from a hotter object to a colder one. Next, Clausius identified entropy in Carnot’s efficiency equation and gave it its definition: heat divided by temperature. Clausius also formulated the second law of thermodynamics, which says that all systems tend to reach equilibrium, the point where entropy is at a maximum.
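In symbols (standard textbook notation, not quoted from the original chapters), the two results read:

```latex
% Carnot's maximum efficiency for an engine working between a hot
% reservoir at absolute temperature T_h and a cold one at T_c:
\eta_{\max} = 1 - \frac{T_c}{T_h}

% Clausius's entropy change: heat absorbed reversibly, divided by the
% absolute temperature at which it is absorbed:
\Delta S = \frac{Q_{\mathrm{rev}}}{T}
```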

A huge breakthrough in the understanding of entropy was made independently by Boltzmann and Gibbs, who showed that the entropy of a statistical system is proportional to the logarithm of the number of distinguishable arrangements in which it may be found. Each possible arrangement of a system is called a microstate, and since every system, at any given moment, can only be in one microstate, entropy thus acquires the meaning of uncertainty: if a system is in one particular microstate that is unknown to us, then the more microstates are possible, the greater our uncertainty. The second law of thermodynamics thereby obtained a new meaning: every system tends to maximize the number of its microstates, and with it, its uncertainty.
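In modern notation (again the standard textbook form, not the book’s own typography), the Boltzmann and Gibbs results are:

```latex
% Boltzmann's entropy: k_B is Boltzmann's constant and W is the number
% of distinguishable microstates (arrangements) available to the system:
S = k_B \ln W

% Gibbs's equivalent form when the microstates have unequal
% probabilities p_i:
S = -k_B \sum_i p_i \ln p_i
```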

How does nature maximize the number of microstates? If we consider an ideal gas as an example, its energetic particles are distributed among the possible locations (states) within a system in a way that maximizes the number of microstates. The distribution in which the number of microstates is at a maximum is called the equilibrium distribution.
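As a small illustration of this counting argument – a brute-force sketch with toy parameters of our choosing, not a calculation from the book:

```python
# Distribute N particles over energy levels 0..E_MAX with a fixed total
# energy, compute the multiplicity W = N! / (n_0! n_1! ...) of each
# occupation pattern, and find the pattern that maximizes W.
from itertools import product
from math import factorial

N = 6        # number of particles (toy value)
E_MAX = 3    # energy levels 0, 1, 2, 3
E_TOTAL = 6  # fixed total energy (toy value)

def multiplicity(occupation):
    """Number of distinct microstates for occupation (n_0, n_1, ...)."""
    w = factorial(sum(occupation))
    for n in occupation:
        w //= factorial(n)
    return w

best = max(
    (occ for occ in product(range(N + 1), repeat=E_MAX + 1)
     if sum(occ) == N and sum(i * n for i, n in enumerate(occ)) == E_TOTAL),
    key=multiplicity,
)
print(best, multiplicity(best))  # (3, 1, 1, 1) with W = 120
# The winning occupation falls off with energy, approximating the
# Boltzmann (equilibrium) distribution.
```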

We examined two important distributions that are common in nature. One kind, the exponential, occurs in systems where the number of particles is smaller than the number of states – for example, the distribution of the energies of gas particles, or the distribution of heights among people. These distributions (graphically depicted by bell-like curves) favor the average. The other kind comprises power-law distributions (graphically, long-tailed curves), which occur in systems where the number of particles is greater than the number of states. These distributions allow much greater divergence from the average and are observed, for example, in the classical limit of the energy distribution of blackbody radiation and in the distribution of wealth.
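A short sketch of the qualitative difference between the two families (sample size and parameters are assumptions made for illustration):

```python
# Exponential samples cluster near the mean; power-law (Pareto) samples
# produce rare but enormous outliers - the "long tail".
import random

random.seed(0)
n = 100_000
exp_samples = [random.expovariate(1.0) for _ in range(n)]    # mean 1
pow_samples = [random.paretovariate(1.5) for _ in range(n)]  # Pareto, alpha = 1.5

for name, xs in [("exponential", exp_samples), ("power law", pow_samples)]:
    mean = sum(xs) / n
    print(f"{name:12s} mean = {mean:6.2f}  max = {max(xs):12.1f}  "
          f"max/mean = {max(xs) / mean:10.1f}")
# The exponential maximum stays within roughly a dozen means; the
# power-law maximum can sit hundreds or thousands of means away.
```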

Up to this point, we have reviewed the contributions to the understanding of entropy made by Carnot, Clausius, Boltzmann, Gibbs, Maxwell and Planck. Their work was carried out between the early 19th century and the early 20th century, the golden age of the physical sciences.

Later on, it was shown that the uncertainty associated with data transmission, such as receiving a computer file, is in fact the file’s entropy. This entropy is named after Shannon, who was the first to calculate it. How is Shannon’s entropy different from the entropy of, say, a container full of gas? The difference stems from the fact that a digital file is transmitted by electric or electromagnetic harmonic oscillators. The oscillators used to transfer files are much hotter than their surroundings – that is, they have a lot of energy. As we saw, when an oscillator is very hot, its entropy is independent of its energy and has a value of one Boltzmann constant. In such cases, as in digital transmissions, where the entropy of a file is not a function of its energy, we call it logical entropy. By way of analogy, one plus one is always two, regardless of the amount of energy each digit may carry. This contrasts with oscillators (such as the molecular oscillators in a gas) whose temperature is close to the environment’s, so that a change in the oscillator’s energy leads to a change in its entropy. It follows that if oscillators whose temperature is close to the environment’s were used to transfer files, any change in the energy of the file during transmission would affect its entropy. Since it is impossible to transfer files without losing energy during transmission, content transfer using oscillators with a temperature similar to that of their surroundings is inefficient.
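As a concrete illustration – a standard estimate rather than a method taken from the book – Shannon’s entropy of a file can be computed from its byte frequencies (the file name below is hypothetical):

```python
# Estimate Shannon entropy H = -sum(p_i * log2(p_i)) of a file, modeling
# it as a stream of independent bytes.
from collections import Counter
from math import log2

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

data = open("example.bin", "rb").read()  # hypothetical file name
print(f"{shannon_entropy_bits_per_byte(data):.3f} bits/byte")
# Values near 8 bits/byte indicate random-looking content (compressed or
# encrypted files); plain text usually lands well below that.
```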

If the laws of thermodynamics also apply to logic, they should be observed in logical distributions. And indeed, based on this assumption, we showed that the non-uniform distribution of digits in random numerical files – balance sheets, logarithmic tables and so forth (Benford’s law) – obeys the second law of thermodynamics. We call this distribution the Planck-Benford distribution, after Planck, who first calculated it, and Benford, who discovered it in numerous databases.
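For reference, Benford’s law assigns the leading digit d the probability log10(1 + 1/d). A minimal check against powers of 2 – a classic Benford-obeying sequence we chose for illustration; the book’s own datasets differ:

```python
# Compare Benford's predicted first-digit frequencies with the observed
# first digits of 2^1 .. 2^1000.
from math import log10

benford = {d: log10(1 + 1 / d) for d in range(1, 10)}
counts = {d: 0 for d in range(1, 10)}
for k in range(1, 1001):
    counts[int(str(2 ** k)[0])] += 1

for d in range(1, 10):
    print(f"digit {d}: Benford {benford[d]:.3f}   powers of 2 {counts[d] / 1000:.3f}")
# Digit 1 leads about 30.1% of the time, digit 9 only about 4.6%.
```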

We next examined the effects of the Planck-Benford distribution in social systems, and we found it in many networks, such as social, transportation and communication networks. All these networks are generated by a spontaneous linking of nodes. The invisible hand that distributes the links among nodes is entropy, namely, the tendency of spontaneous networks to increase the number of microstates. Calling the relative number of links a rank, we saw that texts obey Zipf’s law, which states that the product of the rank and the frequency of any word in a long enough text is constant. In other words, nodes with many links appear less frequently, and nodes with few links appear more frequently.
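The “constant product” form of Zipf’s law is easy to test on any long text; a minimal sketch (the file name is hypothetical):

```python
# Count word frequencies, sort by rank, and check that rank * frequency
# stays roughly constant for the top-ranked words.
import re
from collections import Counter

text = open("long_text.txt", encoding="utf-8").read().lower()
freqs = Counter(re.findall(r"[a-z']+", text))

for rank, (word, freq) in enumerate(freqs.most_common(10), start=1):
    print(f"rank {rank:2d}  {word:12s}  freq {freq:6d}  rank*freq {rank * freq:7d}")
# For natural-language corpora the rank*freq column varies slowly, which
# is Zipf's constant-product law.
```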

We explored the distribution of wealth and saw a surprising result: when particles (coins) are distributed among states (people) such that every microstate has an equal probability, a highly non-uniform distribution is obtained (Pareto’s 80:20 rule), which is reflected in the distribution of wealth in free economies. In other words, there is a high probability that a small number of randomly chosen individuals will have huge amounts of money, whereas most individuals will have to make do with much less – a counterintuitive result in view of our equal-probability starting point, yet all too familiar in real life. No less surprising, we saw that surveys and polls tend to obey the same Planck-Benford statistics.
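A toy simulation in the spirit of this argument – a standard random-exchange (“coins among people”) model with parameters of our choosing, not the book’s own calculation:

```python
# Start everyone with equal wealth, then repeatedly move one coin from a
# random person (who has at least one) to another random person. Sampling
# microstates with equal probability drives the wealth distribution toward
# a highly skewed, roughly exponential shape.
import random

random.seed(1)
people, coins_each, steps = 1_000, 100, 5_000_000
wealth = [coins_each] * people

for _ in range(steps):
    giver = random.randrange(people)
    if wealth[giver] > 0:
        wealth[giver] -= 1
        wealth[random.randrange(people)] += 1

wealth.sort(reverse=True)
top_20_share = sum(wealth[: people // 5]) / sum(wealth)
print(f"richest 20% hold {top_20_share:.0%} of all coins")
# The share lands well above 20% (roughly half at equilibrium), echoing
# Pareto's skew even though every exchange was equally probable.
```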

Is our thinking also influenced by the second law of thermodynamics? Let us rephrase the question: do we humans aspire to increase the number of possibilities (microstates) at our disposal? Business people will certainly agree without hesitation, and may even go so far as to add that anyone who does not act this way is a sucker. To reinforce this point, let us retell an anecdote that author and journalist Amnon Dankner told on Israeli radio.

Dankner was a good friend of another author, Dan Ben-Amotz. Ben-Amotz’s personal philosophy was based on his firm belief that nobody should give anybody something for nothing. This philosophical school is also known as “penny-pinching.” One day, while Ben-Amotz was on an extended stay in New York, Dankner paid him a visit, and the two went for breakfast at an inexpensive eatery. When Dankner saw the paltry bill, he offered to pick up the tab. To his surprise, Ben-Amotz frowned. Now, had Ben-Amotz been your typical cheapskate, his reluctance when offered a free meal would have been peculiar indeed. But, as mentioned, Ben-Amotz was something of a philosopher, and his explanation for rejecting the offer was this: the next time they dined together, the place would probably be classier, hence the bill would be larger, and since it would then be his turn to pick up the tab, this morning’s “free meal” would eventually cost him more.

Unconvinced, Dankner did some back-of-the-envelope calculations and showed Ben-Amotz that even if he (Ben-Amotz) always ended up paying for the more expensive meals, it would not cost him more than fifty dollars over a whole year. So he asked him: “Why does this bother you so much?” (Ben-Amotz was a man of means.) Ben-Amotz answered: “I hate being played for a sucker.” And then Dankner asked him a really profound question: “Why is it so important to you not to be played for a sucker?”

The rest of the story has nothing to do with the topic at hand – namely, why we so hate to feel that we are suckers – but we shall continue with it anyway, so as not to leave it unfinished.

Ben-Amotz’s answer, which came after a long reflection, was: “Let me think about it.” Dankner himself, so he said, totally forgot about it as soon as the meal was over. But four years later, Ben-Amotz, on his deathbed at home, called for Dankner. When Dankner arrived, Ben-Amotz, his remaining strength almost spent, whispered: “Do you remember that you asked me why I hate being a sucker so much?” Dankner nodded. And Ben-Amotz told him: “I don’t know.”

Any one of us sometimes does things that are plainly irrational because of this aversion to being a sucker. Kahneman and Tversky [74] discussed the illogical economic decisions that people make just because of loss aversion. But before turning to loss aversion, we must deal with a more fundamental question: why do we want to get richer and richer? This is an interesting question, because we always seem to want more money than we have, no matter how much we actually have. It seems that no amount of money can satisfy our hunger for even more money – in contrast to food, of which everybody understands there is no need for an infinite amount.

If we interpret a microstate as an option, we can understand that money increases the number of options open to us: the greater the budget, the greater the selection of products from which we can choose as best suits us. Therefore, our desire to increase the number of options open to us is an outcome of entropy’s tendency to increase.

Let us return now to Ben-Amotz and his grumpiness at Dankner’s offer to pay for his meal. According to Ben-Amotz’s reasoning, what Dankner did was reduce the number of options open to Ben-Amotz in the future: by declining the offer, it would remain up to him whether or not to invite Dankner for a meal later on; by accepting it, only one of these options would be left. In general, receiving an expensive gift may make us feel distressed because we have become indebted to the giver. Suppose you are celebrating your only son’s Bar-Mitzvah. You invite your good friend, who happens to have two sons, ages eleven and twelve, and he gives the boy a present worth one thousand dollars. Chances are you will feel somewhat like Ben-Amotz, because your allegedly good friend has now reduced the number of options open to you when the invitations to his own sons’ Bar-Mitzvahs arrive. His present actually cost you one thousand dollars! What a dreadful man.

Reducing the number of options available to us thus means a net loss. Every system in nature “wants” to increase its entropy, that is, the number of options available to it. But reducing the entropy of a system is possible only by applying work (investing energy) from the outside.

Freedom, then, is the ability to choose at will from the greatest number of options available to us. In other words, entropy is freedom; and the equal opportunity (rather than equality per se) that maximizes the number of options available is the second law of thermodynamics. When the number of options available to us is infinite, choice becomes random, and the microstate in which we exist is our fate, determined by God’s game of dice.