Just Where is the Computer that Computes the Universe? (Stephen Wolfram's invisible rhetoric)

Do black holes warp the universe such that it is self-computable? Kurt Gödel famously proved that no formal system can completely account for itself; the computational gloss is that a computer has to be larger than the problem being computed. This places seemingly fatal constraints on the size of the universe as a computation of itself. Saying, as it has become popular to do, that the universe is just one of an infinite set of parallel universes doesn't solve the problem. Even infinities cannot be said to be larger than themselves.
TED talk by Stephen Wolfram on the computable universe.
Is it possible that black holes work as Klein bottles for the whole universe, stretching space-time back around onto itself? If so, it may be possible to circumvent Gödel's causal constraints on the computability of the self, as well as the entropic leaking demanded by the second law of thermodynamics. I admit that these questions are not comfortable. They certainly don't result in the kind of ideas I like to entertain. They spawn ideas that seem to be built of need and not logic. They are jokes written to support a punch line.
But something has to give. Either Gödel and Turing are wrong, or there is a part of our universe in which they don't apply. There is no other option. If there is a part of the universe not restricted by incompleteness, then black holes are obvious candidates, if for no other reason than that we don't know much about them. I am at once embarrassed by the premise of this thought and excited to talk openly about what is probably the core hiccup in our scientific understanding of the universe. Any other suggestions? At the very least, this problem seems to point to five options:
1. A deeper understanding of causality will derive Gödel and Turing from a deeper causal layer that also has room for super-computable problems.
2. Gödel and Turing are dead wrong.
3. The universe is not at all what it seems to be, rendering all of physics moot.
4. The universe is always, in some real way, larger than itself.
5. Evolution IS the computation of the universe; it happens at the only pace allowable by causality, is an intractable program, and cannot be altered or reduced (event cones being the only barriers between parallel simultaneous execution).
I am challenged by the first option, find the second empirically problematic, am rhetorically repulsed by the third, and simply do not know what to do with the fourth. The fifth is where I place my bets, though I don't fully understand its implications or parameters. Personal affinities aside, we had better face the fact that our understanding of the universe is at odds with the universe itself. That we have a set of basic laws that contradict the existence of the universe as a whole is problematic at best. Disturbing.
One of the unknowns that haunt our effort to understand the universe as a system is the ongoing confusion between what we think of as "primary" reality on the one hand and "descriptive" reality on the other. Real or just apparent, it is a distinction that has motivated the clumsy explorations of the "Post-Modern" theoretical movement; it deserves better. I am not so romantic as to believe that this dichotomy represents a real qualitative difference between the material and the abstract (made up as it is of the same "real" materials), but this confusion may indeed hint towards a sixth option that, once explained and understood, will obliterate the causal contradictions that have so confused our understanding of the largest of all questions. When a chunk of reality is used as an abstraction signifying another part of reality, or a part of the same reality of which the abstraction is built, does that shift in vantage demand a new physics, a new set of evaluation semantics? What modifications does one have to perform to E = mc^2 when one is computing the physical nature of the equation itself? What new term is to be added to our most basic physical laws such that the causal and the representative can be brought into harmony?
My own view is that the universe, like any system, is always in the only configuration it can be in at that time. Wow, that sounds Taoist, and I absolutely hate it when attempts at rationality result in assessments so easily resonant with emotionally satisfying sentimentality ("What the Bleep" and such). But the Second Law clearly points to a maxed-out rate as the only possible reading of process at all scales. Computation of anything, including the whole of the universe, is always limping along at the maximum rate dictated by each current configuration. The rate of the process, of the computation, accelerates through time as complexities stack up into self-optimized hierarchies of grammar, but the rate is, at each moment, absolutely maxed out.
Are these daft notions chasing silly abstraction-bounded issues or do they point to a real "new [and necessary] kind of science"?
OK, as usual, Mr. Wolfram has expansive dreams: awesomely audacious and attractively resonant notions. But from my own perspective, one I will claim is more sober and less rhetorical, there are some huge problems that beg to be exposed.
Wolfram declares: the universe is, at base, computation. Wow, talk about putting the cart before the horse. That the universe and everything in it is "computing" is hard to dispute. Everywhere there is a difference, there will be computation. So long as there is more than one thing, there is a difference. But computation demands stuff. What we call computation is always, at base, a causal cascade attempting to level an energetic or configurational topology. If you want to call that cascade "computation", I won't disagree. But no computation can happen unless the running of it diminishes, to some extent, an energy gradient. Computation is slave to the larger, more causal activity that is the dissipation of difference. That a universe will result in computation is an entirely different assertion.
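A quantitative touchstone for that claim (my gloss, not anything from Wolfram's talk) is Landauer's principle: erasing one bit of information dissipates at least kT ln 2 of energy into the environment. A few lines of back-of-envelope Python make the point; the workload figure is an illustrative assumption:

    # Landauer's bound: erasing one bit costs at least k*T*ln(2) joules.
    import math

    k = 1.380649e-23        # Boltzmann constant, J/K
    T = 300.0               # room temperature, kelvin
    joules_per_bit = k * T * math.log(2)

    bits_erased = 1e21      # assumed workload, for illustration only
    print(f"minimum cost per bit: {joules_per_bit:.3e} J")
    print(f"minimum cost for {bits_erased:.0e} bits: {joules_per_bit * bits_erased:.3e} J")

However small the number, it is never zero: any computation that discards information must level some energy gradient, which is exactly the dependence on "stuff" described above.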
When Wolfram says that computation exists below the standard-model causality that is matter and force, time and space, I am suspicious that he is seeking transcendence, a loophole, access by any means out of the confines of the strictures imposed by physical law. That he is smart and talented and prodigiously effective at accomplishing complex and practical projects does not in itself mean that his musings are not fantastic or monstrous.
Let's try a thought experiment. Start from the assumption that Wolfram is correct, that the universe is at base pure computation. His book and this talk hint towards the idea that pure computation, running through computational abstraction space, will eventually produce the causality of this universe… and many others. Testing the validity of this assertion is logically impossible. But what we can test is the logical validity of the notion that one could, from the confines of this finite universe, use computation to reach back down to the level of pure computation from which a universe can be made or described. At this level, Turing and Gödel both present lock-tight logic showing that Wolfram's assertions are impossible.
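For readers who want that logic spelled out, here is a minimal sketch of Turing's diagonal argument in Python (my illustration; the names are mine). Suppose some function halts(program) could correctly decide whether any program halts:

    # Sketch of Turing's diagonal argument. Assume `halts(program)`
    # correctly reports whether calling `program()` eventually halts.

    def make_spoiler(halts):
        # Build a program that does the opposite of whatever `halts` predicts.
        def spoiler():
            if halts(spoiler):
                while True:      # predicted to halt, so loop forever
                    pass
            return               # predicted to loop, so halt immediately
        return spoiler

    def optimistic_halts(program):
        return True              # a stand-in guess, wrong by construction

    spoiler = make_spoiler(optimistic_halts)
    print("prediction:", optimistic_halts(spoiler))
    print("but by construction, spoiler loops forever when the prediction is True")

Any concrete halts you supply is wrong about its own spoiler, so no total, correct halts can exist. A universe-sized version of the same trap awaits any program that claims to decide everything about the system that runs it.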
In his own examples, Wolfram uses a mountain of human computational space, built on billions of years of "computation" (evolution) and technological configuration, to make his "simple" programs run. There is NOTHING simple about a program that took a mind like Wolfram's to build, stacked as it is on top of an almost bottomless mountain of causal filtering reaching back to the big bang (or before).
To cover for these logical breaches, Wolfram recites his "computational equivalence" mantra. This is a restating of Alan Turing's notion that a computable problem is computable on any so-called "Turing complete" computer. But the Turing machine concept does not contend with the causally important constraint that run time places on a program. Of course there are non-computable problems. But even within the set of problems that a computer can run, and run to completion, there are problems so large that they require billions of times longer to run than the full life cycle of the universe. Problems like these aren't computable in any practical sense, causality being highly time and location sensitive (isn't that what "causality" means?).
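The arithmetic behind that claim is easy to check. A minimal sketch, with assumed numbers (a 10 GHz serial machine and a brute-force search over 2^128 states, neither figure from the talk):

    # How long does an exhaustive search over 2**128 states take
    # at ten billion states per second?
    OPS_PER_SECOND = 1e10           # assumed machine speed
    STATES = 2 ** 128               # assumed search space
    SECONDS_PER_YEAR = 3.156e7
    UNIVERSE_AGE_YEARS = 1.38e10    # roughly 13.8 billion years

    years = STATES / OPS_PER_SECOND / SECONDS_PER_YEAR
    print(f"run time: {years:.2e} years")
    print(f"which is {years / UNIVERSE_AGE_YEARS:.2e} universe lifetimes")

The answer is on the order of 10^21 years, tens of billions of universe lifetimes, for a problem that is trivially "computable" in Turing's sense.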
And then there is the parallel-processing issue, its potentials and its pitfalls. One might (a universe might), in the course of designing a system that will compute huge programs, decide to break them apart and run sections of the problem on separate machines. Isn't that what nature has done? But there are constraints here as well. Some problems cannot be broken apart at all. Some that can break apart do so into an unwieldy network constrained by time-sensitive links dependent upon fast, wide, and accurate communication channels. What if program A needs the result of program B before it can initiate program C, but A is only relevant for one year and B takes two years to run?
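Here is that bind in a few lines of Python; the task names, durations, and deadline are all illustrative assumptions:

    # Toy dependency schedule: A needs B's result, but A's result is
    # only useful within a deadline that B's run time already blows.
    tasks = {
        "B": {"duration_years": 2.0, "needs": []},
        "A": {"duration_years": 0.5, "needs": ["B"], "deadline_years": 1.0},
    }

    def earliest_finish(name):
        t = tasks[name]
        start = max((earliest_finish(dep) for dep in t["needs"]), default=0.0)
        return start + t["duration_years"]

    for name, t in tasks.items():
        finish = earliest_finish(name)
        deadline = t.get("deadline_years")
        if deadline is not None and finish > deadline:
            print(f"{name}: finishes at year {finish}, deadline {deadline}: too late")

No amount of extra hardware fixes this; the dependency itself, not the processor count, sets the floor on run time.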
A large percentage of the set of all potential programs, though theoretically runnable on Turing machines, are not practically runnable given finite timescales and finite computational material resources. If there is a layer of causality below this universe, and that layer is made of much smaller and much more abundant stuff, then it is conceivable that Gödel's strictures on the size of a computer won't conflict with the notion that this universe could be an example of a Turing complete computer capable of running the universe as a program.
But Wolfram doesn't stop there. In addition to asserting that a universe is the result of a computation, he says that we humans (and/or our technology) will be able to write a small program that perfectly computes the universe, and that it will be so simple (both as a program and presumably to write) that we will be able to run it on almost any minimal computer. He cites as an example "rule 30", the elementary cellular automaton that seems to produce endless variety along an algorithmic theme, as evidence that this universe-describing meta-program will be just as easy to discover. One has to ask: would the running of such a program bud off another universe, or is Wolfram's assertion intentionally restrained to abstraction space? Given the boldness of his declaration that the universe is a computation, it is reasonable to assume that his statements regarding the discovery of a program that computes a universe are meant in the literal sense. Surely he can speak to the issue of abstraction space vs. causal space, the advantages and constraints of each, and how programs use this difference to compute different types of problems. If he does, he doesn't reveal this understanding to his audience. The distinction between abstraction and causality is slippery and central to the concept of computation.
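For concreteness, rule 30 really is this small; the sketch below (my illustration) updates each cell on a wrapped row to left XOR (center OR right):

    # Rule 30: next cell = left XOR (center OR right).
    WIDTH, STEPS = 64, 24
    row = [0] * WIDTH
    row[WIDTH // 2] = 1          # single live cell in the middle

    for _ in range(STEPS):
        print("".join("#" if c else "." for c in row))
        row = [row[i - 1] ^ (row[i] | row[(i + 1) % WIDTH])
               for i in range(WIDTH)]

Which is precisely the point made above: the rule is simple only because the interpreter, the language, the hardware, and the mind that wrote them are not.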
I am convinced that Stephen Wolfram is so lost in the emotional motivations that push him towards his "computable universe" rhetoric that none of his considerable powers of intellect can save him from the fact that he didn't get the evolution memo. Evolution IS the computation. If it could happen any faster, it would have. If he is simply saying that our new understanding of computation will increase the rate and reach of evolution, then I agree. But if he is saying that our first awkward steps into computation reveal enough of the unknown to expose the God program, the program that will complete all other programs (in a decade), I can only say that he is nuts.
Stephen is a smart guy. The fact that a mind so capable can overlook, even actively avoid, the simple logic that shows terminal flaws in his thesis is yet another reminder of the danger that is hubris. That he never speaks to his own motivations, or to the potential fallacies upon which his theory depends, should be worrisome to anyone listening. I suspect that, like religion's, his rhetoric so closely parallels the rhetoric humans are primed to accept that it will be a rare person who can look behind the curtains and find these logical inconsistencies (no matter how obvious).
I applaud Mr. Wolfram's work. The world is richer as a result. But none of his programming should be taken as a guarantee that his theory of a computational universe is sound.
Randall Reetz
Change increases entropy. The only variable is how fast the universe falls towards chaos. What determines this rate is the complexity being carried. Complexity exists only to increase disorder. Evolution is the refinement of a fitness metric: the process of refining a criterion for measuring a system's capacity to maximize its future potential to hold complexity. This metric becomes ever more sophisticated and can never be predetermined. Evolution is the computation.