
Unified evolution

The theory of evolution was proposed as an answer to one of the most intriguing questions: how did the variety of biological species on Earth arise? Contemporary evolutionary theory is based on biological and chemical changes, and many believe that life started from some primordial chemical soup.

Does evolution have a deeper root than chemistry? Is there a physical law that is responsible for evolution? Our book advocates a positive answer.

There is a semantic distinction between “natural things” and “artificial things”: natural things are created by nature, and artificial things are made by us. Is there a justification for this egocentric view of the world? Here it is argued that both “we” and the “things” we make are all part of nature and are subject to the same laws. From a holistic point of view, the computers, telephones, cars, roads, etc. are all created spontaneously in nature by the same invisible hand that created us. There are no special laws of nature for human actions. A unified evolution theory should explain the origin of our passion to make tools, to develop technology, to create social networks, to trade and to seek “justice”.

Here we claim that entropy, in its propensity to increase, is the invisible hand of both our life and our actions.

Understanding Uncertainty

Probability is a well-established and highly important branch of mathematics. In mathematics, probabilities are calculated consistently with a set of axioms. Statisticians sometimes define uncertainty according to the rules of probability.

For example: Suppose Bob plans to dine with Alice in the evening, and there is a 1/10 chance that he will not be available. Since the total probability is 1 (Kolmogorov's second axiom), there is a 9/10 chance that they will dine together and a 1/10 chance that they will not. If there is a chance of 1/2 that Bob will not be available, the total probability is still 1, but now it comprises a probability of 1/2 for a joint dinner. In general, if the probability that Bob will not be available is p, the probability of the joint dinner is 1-p.

In this example some statisticians may say that the uncertainty of having a joint dinner in the first case is 10%, and 50% in the second. This is not correct.

Uncertainty is defined by the Shannon entropy, and its expression for the joint dinner is

S(p) = -p ln p - (1-p) ln(1-p).

Engineers usually use the logarithm in base 2, and then the uncertainty is expressed in bits. If p = 1/2, the uncertainty is 1 bit (one or zero). If p = 1/10, the uncertainty is about 0.47 bit, namely a little less than half a bit. Entropy is a physical quantity that is a function of a mathematical quantity p, but unlike mathematical quantities, which exist in a formal mathematical space defined by its axioms, entropy is bounded by a physical law, the second law of thermodynamics: entropy tends to increase toward its maximum.
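
As a quick check of the numbers above, here is a minimal Python sketch (the helper name binary_entropy_bits is ours, not from the book) that evaluates the expression in bits and confirms that it peaks at p = 1/2:

    import numpy as np

    def binary_entropy_bits(p):
        # Shannon entropy -p*log2(p) - (1-p)*log2(1-p) of a two-outcome event, in bits.
        if p in (0.0, 1.0):
            return 0.0            # the limit p*log2(p) -> 0 as p -> 0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    print(binary_entropy_bits(0.5))   # 1.0 bit
    print(binary_entropy_bits(0.1))   # ~0.469 bit, a little less than half a bit

    # The uncertainty is maximal at p = 1/2: scan a grid of p values and find the peak.
    grid = np.linspace(0.01, 0.99, 99)
    print(grid[np.argmax([binary_entropy_bits(p) for p in grid])])   # 0.5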

The maximum value of S, in our example, is ln 2, reached when p = 1/2. Does this mean that nature prefers the chance of Bob not being available for dinner with Alice to be 1/2, where the entropy is at its maximum? The answer, surprisingly for a mathematician, is yes! If we examine many events of this kind, we will see a (bell-like) distribution that has a peak at the value p = 1/2.

Similarly, the average of many polls in which one picks, at random, 1 out of 3 choices will be a distribution of 50%:29%:21%, and not 33%:33%:33% as expected from a simple probability calculation. The laws of nature (the second law) can tell us something about the probabilities of probabilities. The function that describes the most probable distribution of the various events is called the distribution function.

The distribution functions in nature that result from the tendency of entropy to reach its maximum include, among others:

  • Bell-like distributions for humans: mortality, heights, IQ, etc.
  • Long-tail distributions for humans: Zipf's law in networks and texts, Benford's law in numbers, Pareto's law for wealth, etc.

Benford's law

Benford's law is about the uneven distribution of digits in random decimal files. It was discovered by Simon Newcomb at the end of the 19th century, who noticed a consistent difference in the wear and tear of books of logarithm tables. The phenomenon was rediscovered by Frank Benford in 1938.

Newcomb found and stated the law in its most general form by declaring that the mantissa is uniformly distributed. Benford set out to check the law empirically and also successfully guessed its equation for the first digits: ρ(n) = log10[(n+1)/n]. Namely, the probability of digit n (n = 1, 2, 3, …, 9), ρ(n), decreases monotonically, such that the digit 9 is found about 6.5 times less often than the digit 1. The law is also called “the first-digit law”. Benford showed that it holds for many naturally generated decimal files.
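
The first-digit probabilities are easy to tabulate directly from the equation above; the short Python check below reproduces the roughly 6.5:1 ratio between the digits 1 and 9:

    import math

    # Benford's first-digit law as quoted above: rho(n) = log10((n+1)/n), n = 1..9.
    rho = {n: math.log10((n + 1) / n) for n in range(1, 10)}

    for n, p in rho.items():
        print(n, round(p, 4))             # 1: 0.301, 2: 0.1761, ..., 9: 0.0458

    print(round(sum(rho.values()), 10))   # 1.0 -- the nine probabilities sum to one
    print(round(rho[1] / rho[9], 1))      # 6.6 -- digit 1 is ~6.5 times more frequent than 9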

Misconception: Benford's law applies only to the first digits of numbers.

NOT TRUE. Benford's law holds for the first, second, third, or any other digit order of decimal data. The law was originally stated mostly in the first-digit sense, which does not include the digit 0. Second and higher digit orders naturally incorporate the digit 0 as a distinct possibility.
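
For the second digit, the textbook generalization (not written out in the text) sums the decimal law over all possible first digits; a short sketch:

    import math

    # Second-digit Benford probabilities: P(d) = sum over first digits k of
    # log10(1 + 1/(10*k + d)), for d = 0..9 (standard generalization, stated here as an assumption).
    def second_digit_prob(d):
        return sum(math.log10(1 + 1 / (10 * k + d)) for k in range(1, 10))

    for d in range(10):
        print(d, round(second_digit_prob(d), 4))
    # 0: 0.1197, 1: 0.1139, 2: 0.1088, ..., 9: 0.085 -- flatter than the first-digit law,
    # and the digit 0 now appears as a distinct (in fact the most probable) possibility.

The distribution flattens further for the third and higher digit orders, approaching a uniform 10% per digit.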

Benford's law applies to any decimal file that is compressed to the Shannon limit. In a binary file at the Shannon limit, all the bits excluding the 0's are 1's. In a 0, 1, 2 counting system the ratio between the digits 1 and 2 is 63:37, and in a 0, 1, 2, 3 counting system the ratios between the digits 1, 2 and 3 are 50:29:21. In the same way, a compressed decimal file has Benford's law distribution.
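
The quoted ratios follow from the base-b analogue of the first-digit equation, ρ_b(d) = log_b[(d+1)/d]; this generalization is inferred here rather than stated in the text, but it reproduces the numbers above:

    import math

    # Base-b analogue of the first-digit law: rho_b(d) = log_b((d+1)/d), d = 1..b-1
    # (our generalization of the decimal formula quoted earlier).
    def first_digit_probs(base):
        return [math.log((d + 1) / d, base) for d in range(1, base)]

    print([round(100 * p) for p in first_digit_probs(3)])    # [63, 37]
    print([round(100 * p) for p in first_digit_probs(4)])    # [50, 29, 21]
    print([round(100 * p) for p in first_digit_probs(10)])   # [30, 18, 12, 10, 8, 7, 6, 5, 5]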

Why does calculating the Shannon limit not give us information about the “0”s? Strictly speaking, zero has no entropy and therefore does not count. Put more formally, entropy is logarithmic, and this is also the reason why the changes in the frequencies of the digits are logarithmic (exactly like the distances on a slide rule).

Why is entropy logarithmic? Because that IS the way God plays dice!