
Book, Chunks, Parts, Chapters, Sections, Subsections?

Each hierarchical section is separated by a string of invisible characters (ASCII 0) of a specified length…
User •••••••••••••(13)
Library ••••••••••••(12)
Collection •••••••••••(11)
Set ••••••••••(10)
Volume •••••••••(9)
Book Chunks ••••••••(8)
Cover Matter ••••••••(8)
Front Cover •••••••(7)
Title Page •••••••(7)
Front Matter ••••••••(8)
Title Page •••••••(7)
Publication Page •••••••(7)
Acknowledgements •••••••(7)
Dedication •••••••(7)
Table of Contents •••••••(7)
Foreword •••••••(7)
Introduction •••••••(7)
Graph/Photo/Illustration/Map/Timeline •••••••(7)

Body Matter ••••••••(8)
Parts: 1-5 •••••••(7)
Chapters: 5-10 per Part (all about the same length, start with Chapter Title Case) ••••••(6)
Sections: 3 to 5 per chapter (A-Head Title Case) •••••(5)
Subsections: (B-Head Title Case) ••••(4)
Pages
Page Foot Notes
Paragraphs: •••(3)
Sentences: ••(2)
Phrases: •(1)
Words:
Syllables:
Characters:
Chapter End Notes •••••(5)

Back Matter ••••••••(8)
Book End Notes •••••••(7)
Index •••••••(7)
Glossary •••••••(7)
Bibliography •••••••(7)
List of Illustrations •••••••(7)
Also By This Author •••••••(7)
Recommended By This Author •••••••(7)
About This Author •••••••(7)

Expect more, demand more: A letter to all humans

Next time you are seduced into the idea that computers have evolved at an extraordinary exponential pace, think about what COULD be, what SHOULD be. This morning, I took to the internet to find an application that would automatically classify my photos, that would go through them all and tease out their contents (cars, people, friends, family, objects, places, situations, time of day, time of year, etc.), you know, the who, what, where, how, and why of my images. But nope. No such software exists at the consumer level. Nope.

My computer (your computer) doesn't have a f-ing clue what you do, what you create, who you are, what you want, where you've been, who you interact with, or what characterizes any of the interactions you have with any of it, and it certainly doesn't care or know or have the capacity to do anything about any of what you are or want or could be or create. Nope, your computer doesn't do anything but sequence your typed instructions; your computer sits there dead-dog until you click on a button or type a key.

All this talk of Artificial Intelligence? None of it is anything but desperate attempts to manipulate you as a consumer. You have been pimped out to the highest bidder, mostly to tap into the most basic of your desires, to get you to buy pizza, or to watch porn, or distracting cat videos that make you say "Ahhhhhhh, so cute!", or to pay to protect yourself from spying eyes, from the very manipulation it is all set up to accomplish.

Do you understand the "cloud"? Here, I will explain. If your data is stored off site, not on your hard drive but on banks of hard drives owned by someone else, then all of that data about you and your behavior and the things you've made is wholly owned by someone other than you, and it can be sold to anyone out to get you to send your dollars toward the hands of their clients.

At least relatively, your computer is getting dumber while the cloud gets "smarter" (gets better at manipulating you).

Who is at fault? You won't like the answer, but it is you. At fault is your inability to imagine something more powerful from your computer.

Yes, it is your fault. But there is blame to go around. Where you might be too busy doing what you do to have the time to understand how the computer industry is messing with you, the computer industry itself isn't. Silicon Valley tech workers, and especially tech entrepreneurs, know; each of them is well aware of the situation, and even of how to fix it. It is their inaction, their active ignore-ance of the problem, that is directly to blame.

And why do tech companies and tech workers continue to choose consumer manipulation over human potential? Because it's dead cheaper to get you to buy another pizza than it is to provide the tools that will amplify your efforts.

But am I asking for too much? Is what I am suggesting really computers that know what you are working on and why, that know how to put it all together and offer you collaborative, intelligent predictions, and that build out solutions for you and your most potent future? Actually, yes. And it isn't particularly hard to do.

I am an amateur programmer at best. Yet I have been able to hack together bits and pieces of the foundation of a system that would allow a computer to collaborate powerfully with its user. And I have done this with less than a couple of thousand lines of poorly written code.

My code isn't magic or profound in any real way. Pretty simple stuff actually. The trick is thinking the problem through. The trick is digging down past all of the particulars and looking instead at the general, the shared, the commonalities. The trick isn't so much providing the most profound answer, but asking a general enough question.

What is intelligence? What would be required of a computer that would act intelligently in collaboration with you? Well, if you look deeply, intelligence is simply the capacity to predict, or more accurately, the fidelity of your predictions. So the problem facing anyone wanting to build a computer that is intelligent is to design a system that gets better and better at prediction.

Getting better and better at prediction presents yet more questions. What are the resources, the components, of a system that evolves, that gets better and better at anything at all? Turns out the answer is fairly simple, though it requires a bit of thinking, thinking at a level most of us are not comfortable with, thinking that goes against the grain of how we like to believe thinking works. Turns out thinking as a process can be broken down into components that are themselves simple as hell, components that can be combined in simple ways into a system that produces a full spectrum of intelligence, a spectrum upon which any particular instance of intelligence can be located not by some special qualitative magic, but simply by how many of the component parts have been associated with each other.
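
As a toy illustration of that claim (my own sketch, not the author's code, and the class name `Associator` is made up): a predictor built from nothing but simple pairwise associations, whose "intelligence", measured as prediction fidelity, grows as more component associations accumulate.

```python
from collections import defaultdict

class Associator:
    """A predictor made only of counted pairwise associations."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, sequence):
        # Record how often each symbol follows each other symbol.
        for a, b in zip(sequence, sequence[1:]):
            self.counts[a][b] += 1

    def predict(self, symbol):
        # Predict the most frequently associated successor, if any.
        followers = self.counts.get(symbol)
        if not followers:
            return None
        return max(followers, key=followers.get)

    def fidelity(self, sequence):
        # Fraction of next-symbol predictions that turn out correct.
        pairs = list(zip(sequence, sequence[1:]))
        hits = sum(self.predict(a) == b for a, b in pairs)
        return hits / len(pairs)

m = Associator()
m.observe("abcabcabc")
print(m.predict("a"))        # 'b'
print(m.fidelity("abcabc"))  # 1.0 on a perfectly regular sequence
```

Nothing in the table is qualitatively "intelligent"; fidelity comes entirely from how many associations have been accumulated, which is the point of the paragraph above.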

At any rate, the reason we don't have machines that collaborate with us, the reason we don't have machines that get better and better at collaboration, has nothing to do with how difficult it may be to produce them, and everything to do with the fact that it's cheaper to sell us pizza and porn.

I am begging all of you to expect more, to demand more of the tech industry.

Randall Lee Reetz

Projection or Prediction

Most formal scientific cosmological investigation favors the determination of the earliest events in our universe's history. As we tease away the noise, we illuminate conditions and dynamics ever closer to the moment of origin. The notion, reasonable enough, is that the more robust our knowledge of the earliest conditions, the greater our ability to predict the scope (limits) of all future events.

Prediction, remember, is the purpose of all intelligence, all science, all knowledge, all structure, and all evolution. So, yes, if one is interested in advancing our power and accuracy of prediction, it is methodologically reasonable to understand the earliest conditions of our nascent universe.

But somewhere along the scientific way, investigators revealed an interesting attribute of systems, of all systems, the attribute now called "The Second Law" of thermodynamics. What makes the second law interesting and unique is that it predicts the same future for all possible systems. That future is maximal dissipation. The second law says that all systems at all times are moving en masse towards disorder… are falling apart.

The 2nd Law was discovered by people interested in the flow of heat. Specifically, their interest was the maximal efficiency of steam-powered equipment, factories, and transportation. They wanted to get the most production bang for their coal-fired steam buck. And what they found was rather frustrating to a factory or train owner. What they found was that all systems, no matter how well designed, leak a lot of heat, a lot of potential power, power that would ideally be used by the factory to make carpets, or by the locomotive to pull freight from point A to point B. To make matters worse, the nature of this energy-leaking problem was such that it was irreversible. Once energy leaked out to the surroundings, any effort to recover it would cost more energy than was lost, lots more.

Now you might expect that there is something special about factories and locomotives that makes all this leaking energy so big a problem. You might be inclined to hope that leaking thermodynamic energy is specific to man-made or artificial systems. You'd be wrong. Second-law-demanded energy dissipation is as true of natural systems as it is of man-made systems. But the real kicker is that 2nd Law dissipation has nothing specifically to do with heat or steam or coal, man-made or otherwise; it applies to all systems and all forms of energy applied in any way and under any conditions. This universality of dissipation became crystal clear in the 1940s, when Claude Shannon of Bell Labs in New Jersey, USA independently discovered the same dissipation dynamics in information and communication while trying to do for the telephone industry what the original thermodynamics investigators had attempted for the steam power industry a century and a half before. Shannon found that trying to shove signals down a wire or through the air inevitably resulted in noise that degraded the original signal, and that ensuring accuracy or distance in communication was a costly affair in which more and more energy must be pumped into the system, with less and less of that energy actually moving the signal from sender to receiver. The final kicker came a few decades later, when it became clear that computation suffers the same dissipative pitfalls as had earlier been discovered in communication and in the conversion of power to work.
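
Shannon's starting point, the quantity his noise and loss results are measured against, is easy to state. This sketch is illustrative only (not from the post): it computes the entropy H = sum of -p·log2(p) over a message's symbol frequencies, the minimum average number of bits per symbol any encoding of that message can achieve.

```python
import math
from collections import Counter

def shannon_entropy(message):
    """H = sum of -p * log2(p) over symbol frequencies: the minimum
    average number of bits per symbol needed to encode the message."""
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in Counter(message).values())

print(shannon_entropy("aaaa"))  # 0.0  (perfectly predictable, no information)
print(shannon_entropy("abab"))  # 1.0  bit per symbol
print(shannon_entropy("abcd"))  # 2.0  bits per symbol
```

The more predictable the message, the lower its entropy, which is one face of the prediction-is-intelligence theme running through these posts.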

OK, so what does any of this 2nd Law stuff have to do with the question I posed at the top of this post, essentially: which is a more effective path towards predictive understanding of a universe, knowing when and how it started, or knowing how it will all end?

Until the 2nd Law, all scientific effort resulted in understandings that started with initial conditions and worked forwards in time. Newton's laws of acceleration are a great example. If you know where an object is and what forces are brought to bear on that object, you can use Newton's math to accurately predict that object's position at any time in the future. Einstein's work simply reinforced Newton's laws and provided a more robust contextual understanding of why they worked and when they could be expected not to work. But the 2nd Law is a strange bird indeed. The 2nd Law simply doesn't care how a system starts, what it is made of, or what forces pertain. The 2nd Law focuses our attention on the way systems move into the future, and mostly, on what systems become in the end. That end, on the grandest universal scale, is something called "heat death". Heat death isn't really an end; time doesn't stop. It's just that things fall down and fall down, they dissipate and dissipate, until less and less becomes possible. The slide into maximal dissipation is what we call an asymptote, an end never actually met. An end that in effect keeps ending. The universe is scheduled to become yet more dissipated forever. But the lion's share of that forever will look almost the same from eon to endless eon. The 2nd Law end is an end that never quite ends.
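
The forward-in-time style of prediction that Newton exemplifies fits in a few lines. A minimal sketch with made-up numbers, assuming a constant force (the function name `predict_position` is mine):

```python
def predict_position(x0, v0, force, mass, t):
    """Newtonian forward prediction: from an initial state and a
    constant force, compute position at any future time t."""
    a = force / mass                      # Newton's second law, a = F/m
    return x0 + v0 * t + 0.5 * a * t * t  # constant-acceleration kinematics

# A 2 kg object starting at rest, pushed with a constant 4 N for 3 seconds:
print(predict_position(x0=0.0, v0=0.0, force=4.0, mass=2.0, t=3.0))  # 9.0 (meters)
```

Initial conditions in, future state out: exactly the past-to-future direction of prediction that the 2nd Law, as described below, reverses.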

From a scientific perspective, at least from the perspective of most of the short history of science, the 2nd Law predicted end is absolute and perfectly knowable and absolutely independent of the initial state of our universe, or for that matter, of any possible universe. Previous scientific knowledge had settled on the idea that the future is only predictable to the extent that the past is known and the laws of the dynamics of the universe are known, and even then, the predicted future becomes fuzzier and fuzzier the deeper one looks. Yes, the 2nd Law is strange indeed. Flipping prediction end to end, it says that the end state of any universe is absolutely known, and that the intractable part is instead the path towards that end. Got it? No, it's not an obvious idea to grasp.

So now let's revisit the original question I asked. Is there any point in the full arc of a universe's life when what is known of its past is less important than knowledge of its end?

To answer that question, one might want to look not to the beginning or to the end, but to the vast middle. In both predictive models, the classical causal model that predicts the future by knowing the past, and the entropic model which says that the past is always just a ramp towards a perfectly knowable end state, it is the middle ground that is the most intractable. A thermodynamically determined universe is one in which falling down is the determining factor. A thermodynamic universe is one in which the end is absolutely known but the path getting there is not. In a thermodynamically determined universe, each new moment presents a new set of conditions that must be computed upon in order to make the best possible prediction of the shortest path from that indeterminate here and now to the perfectly determined eventual then and there.

A universe locked into the 2nd Law dance would seem to be a universe in which the dynamics of change is a dynamics that becomes better and better at understanding its own dynamics. We have come to label this cumulative understanding "evolution". Evolution, it would seem, is the process by which a universe becomes better and better at playing the only game a universe can play, and that is the game of getting to the end state as soon as possible. So we must reframe our original question and ask which knowledge is most evolutionarily potent, knowledge of the past or knowledge of the future? Or, in the spirit of my original question, is the situation more complex, more dynamic; does the answer vary depending on the particular epoch in which one asks it? Is the past more determinate in an early universe, and the future more determinate in an older universe? The reverse hardly seems reasonable.

If not, if predictions are more dependent on knowledge of the end than they are on knowledge of the beginning, and if this is true no matter when in the arc of the lifespan of a universe one asks, what can be said of the value of traditional cosmological formalisms? If evolution is a process by which a universe finds the shortest path from any beginning to its entropic end, how important is knowledge of causal classical dynamics in the solving of any of the moment-to-moment shortest-path computations that must be eternally computed?


… to be continued …

Macro-evolution (not micro-evolution)

1. Any change in any system is THE change, of all possible changes, that results in the greatest dissipation.
2. A universe only becomes more dissipated with time. There are no exceptions to this process… it is a one way trip.
3. This is the only domain independent behavior in any system and in any possible universe.
4. As things fall down, they sometimes cause other smaller things to fall up, or to land in locally complex arrangements (so long as the energies released are greater than the energies needed to assemble).
5. Some of these assemblies are structurally stable (resist dissipation). Even more rarely, they are both stable and catalyze faster local dissipation.
6. As a result, these stable catalyzers result in greater local energy throughput. Think of them as the deepest and steepest canyon that will gather the greatest flow of river water.
7. This increased flow will have the greatest effect on the future topology of the dissipative landscape.
8. Complexities that can survive nearest this flow will by necessity need to be even more structurally stable (surviving across time despite an extremely corrosive environment), and to do so, they will have to draw more and more energy through their own systems to repair and maintain stability… increasing the rate and density of dissipation at that locality.
9. Each of these processes is a change catalyzer. They increase flux at the local level. They take the universe closer to heat death not by doing anything qualitatively different, but by doing what a universe does, only faster.
10. We call this process evolution.
11. Evolution is not teleological. No knowledge of the patterns and structure and behavior of a universe is required. Evolution works in any universe with any set of forces and structures and initial conditions.
12. Evolution has nothing in particular to do with biology or with any particular system or domain. It is agnostic to domain. The way in which hydrogen, helium, and lithium atoms come into existence and precipitate into proto-star clouds that accrete into stars is no different from the way in which biology evolves. The methods used are the result of the materials and forces and environments at hand, and are independent of the overarching reason that dissipative change results in, or selects for, systems that dissipate more quickly and comprehensively… that get the universe to its "heat death" end state at the highest possible rate.
13. Darwin described well the "how" of evolution within the domain of biology. But he couldn't put together the more general "how" of evolution such that the process could be seen stripped of its dependency on one particular domain.
14. Darwin was correct. But he described evolution at the local or micro scale. For instance, the Galapagos finch populations he observed felt survival pressure that selected towards beak shapes that made them more and more specialized towards the exploitation of particular seeds and nuts in their environment. Such solutions do indeed favor optimization in what topologists call "local minima". This is indeed what happens, for the most part, in any evolving system. It is the path of least resistance. But it doesn't explain (or not directly) the way that evolution finds solutions that involve looking outward to non-local opportunities of resource acquisition. Getting good at the local game can very much leave you unprepared for the larger game just over the ridge in the next, still larger valley. A great beak and excellent nut-detection skills are of no use should the bush you have learned to exploit go extinct, or should the greater environment (especially that environment which is the future) not have much at all to do with that particular kind of nut, or for that matter, with nuts of any kind. Specialization causes local advantage, but ultimately, specialization (local optimization) always makes a species less well adapted as a generalist, and thus less flexible and robust into the future. Optimization always results in extinction.
15. Any change in any system will be judged by the environment based on the degree to which that change adds to or subtracts from that system's capacity to advantageously predict its dissipative future. Evolution filters for prediction because prediction allows an entity to extract greater advantage while using less energy. If the energy in your environment is in nuts, a beak predicts energy access. If you have a way to store the experiences you've had today, you won't have to try every door to find the bathroom tomorrow. Prediction can involve a brain, but it can just as easily, and more commonly, involve a shape or the presence of an appendage or sensor. A shark's tail predicts the need to move swiftly through water to catch and consume other fish.
16. Prediction is equivalent to intelligence. And as with beak shape specialization, locally optimized intelligence is ultimately less important evolutionarily than general purpose wide-scope intelligence.
17. The most general of all predictions/intelligences is the prediction that thermodynamics provides… the eventual and always closer asymptote of total dissipation (heat death). The capacity to make such a prediction (and to pay attention to it) gives humans great potential advantage. But only should we be able to crawl outside of our own evolutionarily acquired set of attention enhancers, such that we can push our interests and motives in the direction of universal dynamics.
18. Just having the capacity to detect the universal direction of evolution does not necessarily ensure evolutionary advantage. You have to have the means to make it an actionable goal and motive. If we humans can't take action and create advantage from the prediction of heat death, we will eventually be as food for some other entity or system that can.
19. The Zeroth Law of causality: the universe is, at all times, changing at the maximum rate possible. There is no holding back, no waiting around; all systems are decaying and dissipating to the fullest possible extent and speed, given their present context and configuration.
20. The 2nd Law says that total information or energy concentration is reduced as a result of any change in any system (energy applied to a system). It doesn't say anything about why the change that happened was the one, of all possible changes, that could have happened. And it doesn't say that all systems are changing at their maximal change rate at all times.

Post Script:

Imagine a body of atmosphere separated by temperature. The top layer is cold, the bottom layer is hot. This is an unstable situation, as the hot air is less dense and wants to rise above the more dense cold air.
I've just described a typical storm cell.

Now imagine that some smaller areas of air are moving laterally, and some others are moving up or down. Somewhere in the cell, units of air are moving in almost every direction.

All units are moving in the direction they are moving because that is the least energy thing to do. They, like all systems, are simply falling down.

These units compete with each other in the falling-down olympics that is the brewing storm (imbalance). The ones that can reduce the cline between hot and cold air the fastest are the ones that fall the fastest, and thus dominate the falling down in their immediate region. As the less dominant (less efficient) motions of other units are absorbed, that part of the storm becomes a larger and larger competitor, and its original or combined behavioral dynamics become dominant on a larger and larger scale.

Eventually, spiral dynamics dominate as they are better at reducing the temperature cline than are other shapes and gas meta-dynamics.

At each level of scale, each granularity in the system, both structural and temporal, falling down always proceeds at its maximal rate for that configuration at that particular moment in time. As changes accrue, the new configuration allows a new maximal rate of dissipation.

This should be obvious. Change is always motivated by difference. A universe doesn't like difference. Change always follows the fall line, the fastest path to greatest dissipation.

As a description of dynamics, maximal dissipation rate is only interesting in contrast to what the brain likes to believe about systems, and especially likes to believe about systems that are responsive to human interaction.

As an example, I cite the often-repeated "we only use 5% of our brain". This statement is physically and causally false. No system can ever operate any faster than it is currently operating. If it could be running faster, it would be. Full stop. Now, it is possible that a brain sent to Cambridge University will, after 8 years of graduate studies, be capable of operating at a higher rate (whatever that means), but that brain would not be the same brain that existed prior to those 8 years of studies.

It is instructive to periodically remind ourselves of this very important aspect of the causal physical world of which we are a part.