May 09, 2013

The Puzzle of Entropy in Physics and Intelligence

Entropy, the measure of disorder in a physical system, is one of the most profound puzzles in physics.  The Second Law of Thermodynamics, formulated in the 19th century, states that entropy always increases as physical systems naturally progress from order to disorder.  However, modern physics has never been able to explain why the universe has this directionality.  While we intuitively understand the “arrow of time,” it is absent from the formulations of classical and quantum physics.  In recent decades, the concept of entropy and its related mathematics have also found applications in information theory.  A recent study has even linked entropy with the emergence of intelligence.  Why is there such an unusual connection?

Entropy is a measure of statistical uniformity – of particles in space, of waveform probabilities, or of the information contained in large data sets.  To the extent that different sub-parts of the system cannot be readily distinguished from each other, the system has high entropy.  Essentially, this means that the stuff being studied is well mixed, and exhibits very little structure or order.  On the other hand, if the system shows a high degree of structure or order, the different subsets or sub-volumes will be quite different, in which case the system has low entropy.  A simple example is a bucket into which two different colors of paint are poured.  Initially, you can see the two colors as they are poured in, before they get mixed up.  This shows a degree of order consistent with lower entropy.  Once the paint is mixed, you can no longer distinguish the two colors – every part of the bucket looks the same – this is a state of higher entropy.
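
To make the paint-bucket example concrete, here is a minimal Python sketch (the grid layout, colors and window size are invented for illustration; the entropy measure is the standard Shannon formula, which this article returns to below).  It scores how distinguishable the sub-volumes of the bucket are:

```python
import math
import random
from collections import Counter

def shannon_entropy(cells):
    """Shannon entropy H = -sum(p * log2(p)) of the color counts in `cells`."""
    counts = Counter(cells)
    n = len(cells)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mean_local_entropy(bucket, window=10):
    """Average entropy of local windows: near 0 when each region is a
    single color (ordered), near 1 bit when every region is an even mix."""
    chunks = [bucket[i:i + window] for i in range(0, len(bucket), window)]
    return sum(shannon_entropy(c) for c in chunks) / len(chunks)

unmixed = ["red"] * 50 + ["blue"] * 50   # two distinct layers, just poured
mixed = unmixed[:]
random.shuffle(mixed)                     # thoroughly stirred

print(mean_local_entropy(unmixed))  # ~0.0: sub-volumes are distinguishable
print(mean_local_entropy(mixed))    # ~1.0: every sub-volume looks the same
```

The unmixed bucket scores near zero because each region is uniform in itself but different from its neighbors; once stirred, every region looks like every other, and the score rises.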

The Second Law of Thermodynamics points out a remarkable universal feature of the world – that entropy increases over time.  This is something we observe constantly and understand easily in our day-to-day life – broken glass does not suddenly become whole, spilled milk does not go back into the jar, things get old and break.  This seems so simple, and yet it is a concept that theoretical physics has never been able to explain.  As Brian Greene noted in 2004: “Even though experience reveals over and over again that there is an arrow of how events unfold in time, this arrow seems not to be found in the fundamental laws of physics.” (The Fabric of the Cosmos, p. 145.)  Renowned physicist and mathematician Roger Penrose put it more simply in 2010: “It is just observational fact that the Second Law holds good.” (Cycles of Time, p. 51.)

Brian Greene traces the source of this puzzle back to the very beginning of the universe.  For reasons unknown to the theoretical physicist, our particular universe started in a highly ordered, low-entropy and extremely low-probability state – analogous to a rubber band wound up tight.  At the Big Bang, this highly ordered state began unwinding into higher-probability, higher-entropy states, giving us the universe as we know it.  “How is it that the universe began in such a highly ordered configuration, setting things up so that for billions of years to follow everything could slowly evolve through steadily less ordered configurations toward higher and higher entropy?” (Ibid., p. 175.)

This is not the end of the story, however.  Ever since the Second Law was formulated, people have noticed that some systems seem to violate it, becoming more structured rather than less.  For example, patterns can sometimes emerge in physical systems such as crystals or whirlpools.  More significantly, life and its biological systems seem to run counter to the Second Law:  individual plants and animals grow from simple to more complex structures, and life itself seemingly arose from a highly mixed chemical soup and has continued to evolve into more complex and sophisticated forms.

However, these examples turn out not to be violations of the Second Law, but something else – systems that “extract” order from the environment.  When looked at as a whole, the biological system and its environment (including, for example, the sun and its radiation) obey the Second Law, but there is increased order in one subsystem (the life form) that is paid for with decreased order elsewhere (e.g. nuclear reactions in the sun).  A similar rationale applies to the evolution of stars and galaxies – the apparent higher order is consistent with the overall decreased order of the universe as a whole.  The fact that these increases in structure and order are not in violation of the Second Law does not, however, explain why such structures appear.

Additional hints about the strange mystery of entropy have recently been identified in a very different arena – computer science.  The extension of entropy to information theory goes back some fifty years, to when physicist E.T. Jaynes showed that a formula describing certain properties of information is identical to the formula for entropy in statistical mechanics.  Essentially, entropy applies to any dynamic system with numbers of objects (particles, fields or units of information) large enough for statistical concepts to apply.  Given these large numbers, any transitions through time will result in a loss of information or structure and an increase in entropy.
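
For reference, the two formulas in question can be written side by side (standard textbook notation, not drawn from the article itself):

```latex
S = -k_B \sum_i p_i \ln p_i      % Gibbs entropy (statistical mechanics)
H = -\sum_i p_i \log_2 p_i       % Shannon entropy (information theory)
```

Up to the constant k_B and the choice of logarithm base, the two expressions are identical – the same mathematics of probabilities describes both heat and information.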

In a paper titled “Causal Entropic Forces,” published this past April in Physical Review Letters, Wissner-Gross and Freer present an analysis of what happens when they model the behavior of certain systems in light of the universal propensity to maximize entropy.  In short, they conclude that the principle of seeking to maximize entropy over an extended time horizon results in the emergence of intelligent behaviors.  As the authors note:  “In conclusion, we have explicitly proposed a novel physical connection between adaptive behavior and entropy maximization…”
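
As a rough illustration of the idea – and emphatically not the authors’ actual method, which maximizes the entropy of entire future paths – here is a toy Python sketch in which a particle in a walled corridor scores each candidate move by the entropy of the end states its sampled random futures can reach (the corridor size, horizon and sample counts are all invented for this example):

```python
import math
import random
from collections import Counter

LO, HI = 0, 20          # a one-dimensional "corridor" with walls at 0 and 20

def future_entropy(pos, horizon=8, samples=300):
    """Estimate the Shannon entropy of the end states reachable from `pos`
    by random walks of length `horizon` - a crude proxy for the paper's
    path-entropy measure."""
    ends = Counter()
    for _ in range(samples):
        p = pos
        for _ in range(horizon):
            p = max(LO, min(HI, p + random.choice((-1, 0, 1))))
        ends[p] += 1
    return -sum((c / samples) * math.log(c / samples) for c in ends.values())

pos = 1                  # start next to the left wall
for step in range(30):
    # take whichever move keeps the most distinct futures open
    moves = [max(LO, min(HI, pos + d)) for d in (-1, 0, 1)]
    pos = max(moves, key=future_entropy)

print("final position:", pos)   # tends to settle near the middle (~10)
```

Even this crude proxy shows the signature behavior: the particle drifts away from the walls and hovers near the middle of the corridor, where the greatest number of distinct futures remains open to it.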

The coverage in Forbes online stated it this way: “They see ‘intelligence as a fundamentally thermodynamic process,’ where any given system engages in a ‘physical process of trying to capture as many future histories as possible.’ ”  Dr. Raphael Bousso, in an interview with the BBC, described it as follows:  “The paper argues that intelligent behavior, which is hard to quantify, can be reduced to maximizing one’s options, which is relatively easy to quantify. But it cannot explain intelligent behavior from first principles.”

Conclusion and Speculation:

Entropy is a concept that is key to understanding the behaviors of large, complex dynamic systems.  Such systems inevitably tend to “wind down” – resulting in an overall increase in entropy as systems proceed from more ordered and structured states to ones that appear less so.  However, at the same time there appears to be another consequence of the directionality of the entropic force – maximization of entropy through time results in the emergence of pockets of increased structure and complexity.  This emergence could be described as a response within a system to the “pull” of entropy.  Whirlpools arise from the pressure of the water as it passes through a constriction.   Galaxies arise from the pull of gravity as matter and energy expand.  Intelligence arises from the pressure of information seeking to be understood.

Furthermore, while science has been able to tease out the workings of this mechanism, it has no explanation of why it is so.  To state this metaphorically, science may help explain what “is” (being) but is not so helpful in explaining what “is to be” (becoming).

Entropy is as much a metaphysical puzzle as it is a feature of matter, energy and information.   It is fundamental to our notion of causality and time, and appears to supply a mechanism but not an explanation for the stunning and beautiful emergence of structure, order and intelligence from apparent chaos.

One simple answer may be that it is the ultimate ordering principle of God’s creation.  I intend to explore this concept and its implications in greater detail in a future article, based on the work of physicist Ian Thompson in his book Starting Science From God: Rational Scientific Theories from Theism, 2011.

Sources:

Brian Greene, The Fabric of the Cosmos, 2004

Roger Penrose, Cycles of Time, 2010

Forbes online, April 21, 2013, “From Atoms To Bits, Physics Shows Entropy As The Root Of Intelligence”: http://www.forbes.com/sites/anthonykosner/2013/04/21/from-atoms-to-bits-physics-shows-entropy-as-the-root-of-intelligence/

BBC online, April 23, 2013, “Entropy law linked to intelligence, say researchers”: http://www.bbc.co.uk/news/science-environment-22261742

Wissner-Gross and Freer, “Causal Entropic Forces,” Physical Review Letters, April 19, 2013.

Responses to “The Puzzle of Entropy in Physics and Intelligence”

  3. Stephen H. Smith, M.D. says:

    This is a very thoughtful and intriguing article; it makes me want to explore the ramifications of the Second Law in much greater detail. One wonders, for example, how it may relate to our old friend the number “phi” (as in the Golden Section or Divine Proportion). Does the Second Law have any bearing on “phi” in the design of life forms, and even on our intuitive appreciation of beauty – both visual, in the 1.6… ratio of the Parthenon and in carpets woven by peasants the world over, and auditory, in musical scales?

    Then consider the occasional association of genius and madness: remarkable insights emerging from a mind in chaos, or even the well-known relationship between depression and creativity.

    The article describes a strict economy in creation…there is a price to be paid for every new thing.

    • George Gantz says:

      There is a price to be paid, indeed. Speaking of “phi”, I wrote a paper years ago for the SSA musing on the Spiritual Significance of the Number Phi which you might find interesting –
      http://swedenborg-philosophy.org/journal/article.php?issue=108b&page=1044.

      • Stephen H. Smith, M.D. says:

        I very well remember your article on “phi” in the New Philosophy and was amazed to read in it that one of the many ways to generate the number was the sum of the square root of one taken to infinity. The abundance of phi in nature certainly seems to be telling us something about creation (it was the topic of a lecture I gave to the students at the British Summer Academy in Atherstone, England with Charlie Cole back in the ’80s).
