Tuesday, December 13, 2011

The 2nd Law: Is Increased Entropy Stochastic (incidental) or Causal (intrinsic)?


Recent science news is dominated by the multi-billion-dollar experimental search for the Higgs boson. A definitive observation of the theorized but elusive Higgs would finally complete the verification of the Standard Model – the most respected mathematical model of the evolution of our universe, explaining the emergence of each of the known forces and all of the matter we can observe. In the Standard Model, the Higgs is responsible for mass – surrounding the more pedestrian particles and lending them the property we call "mass", the property upon which gravity acts. If the Higgs exists, it is important as the causal bridge between the quantum world of the small and the relativistic world of the large. How could a particle so central to mass and gravity be so hard to find? Because it couples so feebly to our detectors – it is, as a result, known as "weakly interacting". It is only when a great many Higgs gather around other particles that mass is detected, and then only in the surrounded particles. The Higgs binds so tightly to other particles that it takes an extraordinary amount of energy to break it free so that its presence can be detected. This is what the Large Hadron Collider does – it smashes protons and heavy atomic nuclei (stripped of their electrons) together at energies approaching those of the first moments after the Big Bang, when all of the matter and energy in the entire universe still occupied a volume smaller than a single star.

But there is a far more fundamental question. Gravity is a property. It is domain-dependent: it is specific to, and belongs to, a class of objects of a particular makeup and composition. The existence or nonexistence of the Higgs has no effect upon other properties of the universe, like electromagnetism.

But there is a candidate for a domain-independent attribute of any and all causal systems. This attribute has been labeled the "Causal Entropic Principle" – it is generally discussed within the context of the transfer of heat (at astronomical scales), within the study of thermodynamics. It is the logical extension of the concept of increased entropy, first postulated, then measured, and later described as the 2nd Law of Thermodynamics. But now, a hundred and fifty years after the formalization of the laws of thermodynamics (of the phenomena and parameters of the transfer of heat, of the ratio of potential energy to work), correlative investigations in the fields of information, communication, computation, language, energy/mass, logic, and structure have uncovered parallel principles and constraints. It is reasonable now to understand the 2nd Law as a description of a fundamental constraint on any change, in any system, no matter what forces and materials are at play. We now understand the 2nd Law to describe the reduction in the quality (density) of the energy and/or structure of the universe (or any part thereof) that results from any change at all. We have come to understand the 2nd Law as a constraint on the outcome of change in structure – which is to say "information" – on its construction, maintenance, and/or transfer. This insight has revealed an equivalence between energy and structure in much the same way that Einsteinian relativity exposed the equivalence between energy and mass.
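One concrete, well-established expression of this energy/information equivalence is Landauer's bound: erasing a single bit of information must dissipate at least k·T·ln(2) of energy as heat. A minimal Python sketch of the arithmetic (the numbers are standard physical constants, not from the post):

```python
import math

# Landauer's bound: erasing one bit of information dissipates at least
# k_B * T * ln(2) joules of heat - a concrete link between energy and
# information of the kind described above.
K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K (illustrative choice)

min_energy_per_bit = K_B * T * math.log(2)
print(min_energy_per_bit)  # ~2.87e-21 joules per erased bit
```

Tiny as that number is, it is not zero: structure cannot be destroyed (or, by time-reversal of the argument, created and maintained) for free.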

There is, however, a daemon lurking within our understanding of the 2nd Law – a daemon that threatens to undermine our understanding of causality itself, and one that, once defined, may provide the basis for an understanding of any self-consistent causal system, including but not limited to our own universe with its particular set of properties and behaviors.

The daemon of the 2nd Law is the daemon of the stochastic: is the dissipation (entropy) the 2nd Law dictates genuinely statistical, or is statistics simply a tool we use in the absence of microscopic knowledge? Asked another way: is the reduction in the quality of energy or information that the 2nd Law demands of every action a property of the universe, or a property of the measurement or observation of the universe? Is action equivalent to measurement? Is there a class of action – measurement, or the stochastic itself – free of the entropy increase demanded by the 2nd Law?

This question is of far greater consequence to the universe, and to our understanding of it, than the mechanics of mass, for it would describe and thus parameterize ALL action and ALL configuration, and the precipitation or evolution of every possible action and configuration. Where the existence of the Higgs boson may explain the source of mass and gravity in this universe, an understanding of the causal attributes underlying the behavior described by the 2nd Law of Thermodynamics might just provide a foundation from which any and all causal systems must precipitate.

The implications and issues orbiting this problem are many and deep. At stake is a demonstrative understanding of change itself. We tend to think of change as the exception. But can a thing exist without change? If not, what is the difference between data and computation, between a thing and the abstraction of that thing – and, profoundly, an answer to the question: can data exist without computation? Can a thing exist outside of the abstraction of that thing?

In thermodynamics and information theory, an effort is made to distinguish process from stochastic process. Heat is defined as an aggregate property describing the average or holistic state of a system composed of too many interacting parts to keep track of individually. Heat is a calculus of sorts, a system of shortcuts that allows mathematics to be employed successfully to determine the gross state of a huge collection of similar parts. There is a tendency, then, to assume that the laws that describe heat are laws that apply only to aggregate systems where knowledge is incomplete.
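The "aggregate property" character of entropy can be made concrete with a short Python sketch (the function name and the toy "hot"/"cold" systems are illustrative, not from the post): Shannon entropy depends only on the frequencies of states, never on which particular microstate the system happens to occupy.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Shannon entropy (in bits) of an empirical distribution - a purely
    statistical aggregate: it is computed from state frequencies alone,
    discarding all microscopic detail about individual parts."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A "hot" system: 800 parts spread evenly over 8 states.
hot = [i % 8 for i in range(800)]
# A "cold" system: all 800 parts concentrated in one state.
cold = [0] * 800

print(shannon_entropy(hot))   # 3.0 bits - maximal spread over 8 states
print(shannon_entropy(cold))  # 0.0 bits - our knowledge is complete
```

Note that the zero-entropy case is exactly the case of complete knowledge – which is the crux of the question above: is the nonzero entropy of the "hot" system a fact about the system, or a fact about our bookkeeping?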

Are there non-stochastic systems? Are there discrete systems or dynamic changes within systems for which the laws of thermodynamics don't apply? Does the Causal Entropic Principle apply if you know and can observe every attribute of, and calculate the exact and complete state of a dynamic system?

Such questions are more involved than they may seem on first reading. Answering them will expose the very nature of change, independent of domain, illuminating the causal chain that has produced the full evolutionary lineage of the universe.

Randall Lee Reetz

Note: The Causal Entropic Principle isn't a complex concept. It is the simple application of the 2nd Law's demand for increased universal entropy as a result of every change in any system. It says that every action in every system must be the action that causes the largest reduction in the quality of information or energy (the greatest dissipation). It says that a universe has only one possible end state – heat death – and that processes that maximize the rate toward this end state will be evolutionarily favored (selected), simply because entropy-maximizing processes and structures demand a higher throughput of energy and thus end up dominating their respective localities. Such entropy-maximizing schemes are therefore more likely to determine the structure and behavior of the event cone stretching off into the future. An obvious extension of this principle is that complexity – or, more precisely, the family of complexity that can find, record, and process abstractions representing the salient aspects (physics) of the (an) universe – will help that complexity better predict the shape and behavior it must assume to maximize its competitive influence upon the future of entropy maximization. The "Causal Entropic Principle" thus represents a logically self-consistent (scientific) replacement for the awkwardly self-centered and causally impossible "anthropic principle" (which lacks a physical or causal explanation and leans heavily on painfully erroneous macroscopic stretchings of quantum electrodynamics). Stretching circular logic to its most obvious and illogical end, the anthropic principle borrows awkwardly, erroneously, and ironically from the Heisenberg Uncertainty Principle by asserting the necessity of "observers" as a precursor to the emergence of complexity.
The Causal Entropic Principle explains the production of localized complexity without the need for prior knowledge, and does so within the bounds of – indeed, as a result of – the 2nd Law of Thermodynamics, by showing that localized complexity can come into existence as a result of the constant increase in universal entropy and, more specifically, that localized complexity has an evolutionary advantage and will thus out-compete less complex structures. In a Causal Entropic Principle universe, intelligence is the expected evolutionary result of competition to reach heat death faster. Falling down is enhanced by a particular class of complexity that can come into existence as a natural result of things falling down. Should one form of such complexity "understand" the universe better than another, it will have an advantage and will be more likely to influence the shape of complexity in the future. The better a system gets at abstracting the dynamics of its environment, the more likely it will be to eat other systems than be eaten by them. Where the anthropic principle requires an a priori "observer", the Causal Entropic Principle requires only the 2nd Law's demand for increased entropy – for things falling down.
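The selection argument above can be sketched numerically. The toy model below (entirely illustrative – the function name, rates, and units are hypothetical, not from any published formulation) gives each competing structure a share of an incoming energy flux proportional to how fast it dissipates, and lets growth follow energy intake:

```python
def compete(rates, energy_flux=100.0, steps=50):
    """Toy model of entropic selection: structures that dissipate energy
    faster capture a larger share of the incoming flux, and growth
    follows intake. `rates` are per-structure dissipation rates
    (hypothetical units); all populations start equal at 1.0."""
    pops = [1.0] * len(rates)
    for _ in range(steps):
        # Total entropy production this step, summed over structures.
        total = sum(p * r for p, r in zip(pops, rates))
        # Each structure's share of the flux is proportional to the
        # dissipation it performs; a small fraction becomes growth.
        pops = [p + energy_flux * (p * r) / total * 0.01
                for p, r in zip(pops, rates)]
    return pops

fast, slow = compete([2.0, 1.0])
print(fast > slow)  # the faster dissipator ends up dominating
```

Both structures grow – everything that dissipates gets some of the flux – but the faster dissipator compounds its share step after step, which is the "dominating its locality" claim in miniature.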
