“A remarkably complex yet fascinating scientific exploration that illuminates a particularly thorny area of physics for laypersons and professionals alike… An earnest examination that walks the tightrope between the scientific community and casual readers…”

“…This book is a Masterpiece… trying to explain in a plain and understandable language a hard-to-explain subject…”

efficiency

The Carnot efficiency is the maximum amount of work that can be produced from a given amount of heat transferred between two temperatures.
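As a quick numeric sketch of this definition (the temperatures below are arbitrary illustrative values, not from the book), the Carnot efficiency between a hot and a cold reservoir is 1 − T_cold/T_hot:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work between two
    reservoirs at t_hot_k and t_cold_k (both in kelvin)."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("need t_hot_k > t_cold_k > 0")
    return 1.0 - t_cold_k / t_hot_k

# An engine operating between 600 K and 300 K can convert
# at most half of the transferred heat into work.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```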

entropy

The Clausius entropy is the heat (energy added or removed) divided by the temperature of its source.
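A standard textbook illustration of this ratio (a sketch; the values are the usual handbook numbers, not taken from the book) is melting ice, where heat is absorbed at a fixed temperature:

```python
def clausius_entropy_change(q_joules: float, t_kelvin: float) -> float:
    """Entropy change when heat q_joules is added at a constant
    temperature t_kelvin; result is in J/K."""
    return q_joules / t_kelvin

# Melting 1 kg of ice: latent heat of fusion ~334,000 J,
# absorbed at the melting point, 273.15 K.
delta_s = clausius_entropy_change(334_000.0, 273.15)
print(round(delta_s, 1))  # about 1222.8 J/K
```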

entropy

The Boltzmann entropy is the logarithm of the number of possible distinguishable arrangements (microstates) of a system, multiplied by the Boltzmann constant.
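A minimal sketch of this definition (the 100-spin system is an illustrative example, not from the book):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(microstates: int) -> float:
    """S = k_B * ln(W), where W is the number of
    distinguishable microstates of the system."""
    return K_B * math.log(microstates)

# 100 two-state particles (e.g. spins) have W = 2**100
# microstates, so S = k_B * 100 * ln 2.
print(boltzmann_entropy(2 ** 100))
```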

entropy

The Gibbs entropy is the sum over all microstates of the probability of the microstate times its logarithm, multiplied by minus the Boltzmann constant.
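A sketch of this definition, also showing the standard result that when all microstates are equally probable the Gibbs entropy reduces to the Boltzmann entropy k_B ln W (the 8-state system is an illustrative assumption):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities) -> float:
    """S = -k_B * sum(p * ln p) over all microstates;
    terms with p == 0 contribute nothing."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# For W equally probable microstates (p = 1/W) the Gibbs
# entropy equals the Boltzmann entropy k_B * ln(W).
w = 8
uniform = [1.0 / w] * w
print(math.isclose(gibbs_entropy(uniform), K_B * math.log(w)))  # True
```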

**NOT TRUE**

The energy that is stored in a body is called internal energy. Heat is the energy removed from a body or added to it. In this respect heat is similar to work: there is no work inside a body; work can be done on a body or by a body.

Why is this error significant to the understanding of entropy? Entropy is defined as heat divided by temperature. Since temperature is a property of a body, entropy is of the same nature as heat and work.

This is very *significant* as we will see later.

**NOT TRUE**

Entropy is the logarithm of the number of microstates of a thermally isolated system.

The number of microstates is the number of distinguishable ways in which a statistical system can be found.

Therefore, entropy is a measure of the complexity and uncertainty of a system, and NOT of disorder.

**NOT TRUE**

This mistake comes from our intuition that entropy is generated in an irreversible operation (which is true). Nevertheless, entropy is defined in equilibrium, where Q/T has its maximal value. Namely,

S ≡ (Q/T)_{reversible}

and the second law states that:

S ≥ (Q/T)

This means that heat divided by temperature is largest in a reversible operation; it is NOT the case that

(Q/T)_{irreversible} > (Q/T)_{reversible}!
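A standard textbook illustration of this inequality (a sketch with assumed values, not an example from the book) compares two paths between the same states of an ideal gas: a reversible isothermal expansion, where Q/T equals the entropy change, and a free expansion into vacuum, where Q = 0 yet the entropy change is the same:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def delta_s_isothermal(n_mol: float, v_ratio: float) -> float:
    """Entropy change of an ideal gas expanding isothermally
    by a volume factor v_ratio: n * R * ln(V2/V1)."""
    return n_mol * R * math.log(v_ratio)

n, t, ratio = 1.0, 300.0, 2.0          # 1 mol, 300 K, volume doubles
ds = delta_s_isothermal(n, ratio)      # state function: same for both paths

q_over_t_reversible = ds               # reversible isothermal path: Q = T*dS
q_over_t_free_expansion = 0.0 / t      # free expansion into vacuum: Q = 0

# Q/T is maximal on the reversible path, never exceeded by the irreversible one.
print(q_over_t_reversible > q_over_t_free_expansion)  # True
```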

**NOT TRUE**

This error comes from our intuition that it is much easier to break things than to fix them. If we put a cube of sugar in a glass of tea, the sugar will dissolve. It requires much work to recover the sugar cube from a glass of sweetened tea. However, there are opposite examples: an emulsion of oil and water will spontaneously separate into two neat layers of oil and water.

Many believe that disorder increases *spontaneously* simply because it is a common belief. However, if we look around us we see that “order” increases all the time. Lord Kelvin, a famous 19^{th}-century scientist, claimed that objects heavier than air (namely, objects whose specific density is higher than that of air) cannot fly. He made this colossal mistake not because he did not know about Bernoulli's law (that is forgivable…) but because he did not look at the birds in the sky! Like most people he saw birds flying, but he never related their flight to physics.

Order is generated all around us, and spontaneous generation of order should be explained by physics.

**NOT TRUE**

Everything in our world is energy. Therefore, when a file is transferred from a transmitter to a receiver, it is bound by the laws of thermodynamics.

If we use pulses as the energetic bits for the file transmission (an EM pulse is a classic oscillator), the transfer of energy from the hot transmitter to the cold receiver is a thermodynamic process in which the Shannon entropy is the amount of the increase of the Gibbs entropy of the process.

**NOT TRUE**

Shannon information IS entropy. The reason for this common error is the confusion between information *a la* Shannon, which is defined as the logarithm of the number of possible different transmitted files, and our intuition that information is ONE specific file. In our book a specific transmitted file (which is a microstate) is called content. Therefore Shannon information is the logarithm of the number of all possible contents.
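A minimal sketch of this count-of-contents view (assuming, for simplicity, that all contents are equally likely; the 3-bit message is an illustrative example, not from the book):

```python
import math

def shannon_information_bits(num_possible_contents: int) -> float:
    """Shannon information, in bits, when every content is
    equally likely: the base-2 logarithm of their count."""
    return math.log2(num_possible_contents)

# A 3-bit message has 2**3 = 8 possible contents; receiving one
# specific content (a microstate) resolves log2(8) = 3 bits.
print(shannon_information_bits(8))  # 3.0
```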

28 Sep 16: Economic Inequality

28 Jul 14: The Physical Meaning of Money

28 Jan 14: Kolmogorov Complexity and Entropy