If you are living within a modern developed economy, you are most likely the unwitting master of over 40 computer chips. Of these chips, only one or two are the big, expensive, general-purpose CPUs that sit at the center of PCs and laptops, the sort for which Intel or AMD or Motorola charge several hundred dollars, which they aggressively advertise, and which, improbably, get both more powerful and less expensive at the wild rate dictated by Moore's Law (power roughly doubling, and price halving, every 18 months).
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specialized, and more limited in their abilities. Uncomplainingly, and in many cases without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Embedded chips now control your car's fuel/air mixture and time the ignition sparks in each cylinder. These chips are one of the reasons modern cars are so much more powerful and manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water or from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags that keep you alive by sensing an impact and reacting to it in the few ten-thousandths of a second between "Crash!" and the moment your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity of the air flowing within your car's interior. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system and in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or if you are scheduled for a tuneup. There are chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wrist watch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD player. On your bedside table your alarm clock and home phone have them, as does each of the little black boxes that charge your wireless devices. Most washing machines have them, more and more refrigerators come with them, some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys have them, as do robots, learning systems, and trivia games. Then there are home weather stations, TV set-top boxes for cable and satellite, handheld language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard and that dutifully measure your soil's acidity, temperature, nitrogen content, moisture level, and the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart, plug its USB connector into your computer, and a website takes your zip code and spits back a list of plants suited to the conditions in that part of your garden.
Some toilets are controlled by embedded chips.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors, incorporating new processes that etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, so hundreds of chips can be exposed at once, and once you have built the fabrication plant it costs no more per chip to print chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years prior, can be cheaply retooled to spit out lesser chips at unit costs that drop to nearly pennies per chip.
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, which can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes second world and the second world becomes first world. Technology is no longer the sole domain of the top one tenth of the global population. A double explosion results: (1) processing power increases ridiculously, as dictated by Moore's Law, and (2) a growing percentage of global citizens gain sufficient economic power to affect the demand curve for products and services enabled by embedded and shared processors.
It is hard to overestimate the ramifications brought about by this diaspora of processor-infused devices. Born of computers but now running amok into the firmament of the day-to-day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self-contained processing device, will come to seem quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart-dust network of things, a computer is what happens for a few seconds and then goes away as other ethereal combinations of devices snap into and out of existence, layered such that any one node that is part of one such computer is likely part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of those people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex information and control exchange.
By the most conservative estimates, the network of things will grow at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run?
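For concreteness, here is a rough back-of-the-envelope sketch of that projection. The starting figures and doubling periods are assumptions drawn loosely from the paragraph above (one billion owners of 40 chips each, per-owner chip counts doubling every 5 years, and an assumed doubling of the chip-owning population every decade as the world gets richer); tweak them and the twenty-year total moves between a few trillion and the tens of trillions.

```python
# A back-of-the-envelope projection of thing-net growth. All starting
# figures and doubling periods below are assumptions, not data.

def projected_chip_count(years,
                         owners=1_000_000_000,      # people who own chips today (assumed)
                         chips_per_owner=40,        # chips per owner today (assumed)
                         chip_doubling_years=5,     # per-owner chip count doubles this often
                         owner_doubling_years=10):  # assumed rate at which the owning population doubles
    chips = chips_per_owner * 2 ** (years / chip_doubling_years)
    people = owners * 2 ** (years / owner_doubling_years)
    return chips * people

for horizon in (10, 20, 30):
    print(f"{horizon} years out: ~{projected_chip_count(horizon):.1e} chips")
# 20 years out these assumptions give roughly 2.6e12 chips; slightly faster
# doubling rates push the total into the ten-to-hundred-trillion range.
```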
In our everyday lives, we are used to complex aggregates like culture, society, and government evolving from similar loose and unstructured associations and connections. Culture as network. But that, we say, happens because humans are the nodes and humans are themselves complex. Indeed, as network nodes, we humans are very complex. We participate in the societal network that is culture as self-contained arbiters or processors of wildly complex patterns like situation and context, need and availability, propriety and collaboration, initiative and momentum, concept and diffusion, pattern and chaos. What minimal subset of these skills can we distill from the human node, then generalize and subsume into the silicon and software of our tiny smart-dust kin-to-be? This is the big project that looms in front of computer science. THE PROJECT. But before you throw up your arms in despair, remember that our own minds are consistent with the smart-dust model. Each of the 100 billion or so neurons in our head is relatively simple (at least in its role as an information and logic processor). "Mind", the remarkable confluence that makes us us, isn't the neurons, the things, the nodes, so much as it is a super-product of the n-dimensional connection map, the network that flits in and out of existence, the fluctuating and overlapping web connecting neurons to neurons.
As the design of sensor/controller/communicator modules grows in sophistication, as we figure out what absolutely has to exist in a node and what is best left to the emergent properties of the network itself, as the same efficiency of scale factors that gave us Moore's law come to play in their design and production, as these All-Net motes shrink, and become more and more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing and communications capabilities and where these capabilities are diffuse, cheap, expendable, and by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning: these little radios can compute, and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere contains tens or hundreds of intelligent processing motes?
Clearly, today's operating systems would collapse under such a load and without the centrality and determinism afforded to today's computing environments. Computing, as it stands today, is a reflection of simpler times when the computer stood all by itself. But what, you ask, of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyperlink and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices, as a shifting sphere of influence, a probability cloud, a sphere of influence shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping spheres. If this is a radically new idea, it is an idea that is predated by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "swarm computing", but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity that will define true network computing.
So what will it take? What new logical structures will we have to build to get from where we are today, from the isolated computing systems we use today, to the self optimized, agent generating, pattern building, semantically aware, swarm computing of tomorrow? What kinds of architectural logic will be required of a computing system that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared and momentary and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be pre-determined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure are hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single computer past and the swarm computer future. We know how to architect single computer solutions, but we are struggling to bring multi-computer, self optimizing, swarm computing solutions from idea or dream to full working detail.
Of course, having a hundred trillion nodes doesn't necessarily mean you have anything approaching swarm computing… it might mean nothing more than the fact that there are a hundred trillion nodes. Nodes that aren't designed to work together, to self aggregate… will never result in anything more capable than any one of them is as an individual unit. A flock maybe, but never a swarm. A flock is composed of a lot of self-similar individuals acting in their own self interest; any patterns that appear to emerge are crystalline, pretty, but never complex, emergent, or evolving. A swarm, on the other hand, is defined by structures and behaviors of the aggregate whole that are not themselves possible in the individual parts of which it is composed. A swarm doesn't just "look" like a system… it is one. Where a flock can emerge from the behavior of individuals even when they have no means of sensing the goals of others around them, a swarm can only result when each part has the capacity to be aware of the parts around it and can grasp and process the construct of a shared goal. For a bunch of nodes ever to swarm, each node must possess the capacity to see itself in the context of an environment that is at least partially composed of other nodes.
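To make the flock-versus-swarm distinction concrete, here is a deliberately tiny, purely illustrative sketch (the rules and names are my own, not the essay's). The "flock" agents follow a local rule with no model of anyone else's intent; the "swarm" agents can each read a shared goal and divide the work between them, which is behavior no individual agent exhibits on its own.

```python
# Toy contrast between flocking (local rule, no shared goal) and swarming
# (each agent reads a shared goal and coordinates). Illustrative only.

def flock_step(positions):
    """Each agent drifts toward the local average position: a pretty pattern,
    but nothing the group 'knows' that an individual doesn't."""
    center = sum(positions) / len(positions)
    return [p + 0.1 * (center - p) for p in positions]

def swarm_step(positions, shared_goal):
    """Each agent is aware of the shared goal and of how many peers exist,
    so the group partitions the goal region among its members."""
    n = len(positions)
    return [shared_goal * (i + 1) / n for i in range(n)]

positions = [1.0, 4.0, 9.0, 16.0]
print(flock_step(positions))          # agents converge on each other
print(swarm_step(positions, 100.0))   # agents spread out to cover the goal: [25.0, 50.0, 75.0, 100.0]
```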
Current estimates put the number of computers hooked to the global web we call the internet at between one and two billion. But there is nothing swarm-like about the internet. Obviously, today's computers do not have the requisite attributes and abilities to swarm. Exchanging email documents and web pages isn't exactly the stuff of deeply collaborative computing. For one thing, your computer has absolutely no means of understanding the contents of your emails or the web pages it dutifully displays. Adding more of these semantically challenged computers to the net won't get us any closer to swarm computing.
Likewise, most of the yardsticks we use to measure computer performance and progress, like clock speed, data bandwidth, bus speed, chip memory, and disk capacity, are useless as indicators of the capacity of a node to participate in a computational swarm. Today's computers are powered by chips composed of over a billion transistors. The amount of storage available to each computer on the net today is enough to hold tens of thousands of books, hundreds of hours of video, and thousands of hours of recorded music. Though it is obvious that they are not nearly as mentally endowed as the average laptop, individual ants seem to swarm just fine. The crucial swarming criterion cannot be quantity or power so much as some measure of quality.
So, what is the missing secret sauce? What crucial "it" results in swarm behavior? Let's make a list. All nodes must speak a common language; that common language must be semantically reflexive (each node must be able to self-interpret the meaning of the work of other nodes); the nodes must be hooked to, listen through, and talk through a shared network; each node must understand and be able to readily process such complex concepts as ownership, membership, resources, value, goals, and direction; they must know how to enlist the help of other nodes and node groups; they must be able to know when one project is done and how to determine what might be formed into a new project; and they must be able to know their own strengths or unique value, how to build on these strengths, and how to form coalitions where other nodes are better equipped. Each of these capabilities is complex in its own right, but none of them alone will facilitate swarm computing. Swarm computing will happen only as a holistic solution. Any reasonable holistic solution will most likely be built from a set of shareable low-level logical machines… machines not unlike the Boolean gates that make up binary computation, but at a higher level of abstraction.
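As a thought experiment, the capability list above could be sketched as a node interface. The sketch below is my own minimal rendering under those assumptions (the class and method names are hypothetical, not drawn from any existing system): a shared message format standing in for the common language, and methods for interpreting peers' goals, judging relevance, enlisting help, and dissolving a coalition when a project is done.

```python
# A minimal, hypothetical sketch of the swarm-node capability list above.
# Names and structure are illustrative assumptions, not an existing API.

from dataclasses import dataclass, field

@dataclass
class Goal:
    description: str       # expressed in the shared, semantically reflexive language
    done: bool = False

@dataclass(eq=False)
class SwarmNode:
    node_id: str
    strengths: set = field(default_factory=set)    # what this node is uniquely good at
    peers: list = field(default_factory=list)      # other nodes it can currently hear on the shared network

    def interpret(self, message: dict) -> Goal:
        """Self-interpret a peer's message into a goal this node can reason about."""
        return Goal(description=message.get("goal", ""))

    def can_help(self, goal: Goal) -> bool:
        """Judge whether this node's own strengths are relevant to a shared goal."""
        return any(s in goal.description for s in self.strengths)

    def enlist(self, goal: Goal) -> list:
        """Recruit the peers better equipped for the goal into a coalition."""
        return [p for p in self.peers if p.can_help(goal)]

    def release(self, goal: Goal) -> None:
        """Mark a project done so the coalition can dissolve and re-form around new work."""
        goal.done = True
```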
Though it will not be easy to specify, design, or build, we desperately need an architectural-level solution to the swarm computing problem. We need a solution that allows each node to act as a cell, both autonomous and fully capable of coming together to build out collaborative systems equivalent to the organs, bodies, species, and cultures that are the emergent aggregates of biological cells. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities. The ideal smart-dust swarm computing architecture will consist of only two things, the nodes themselves and the medium or network that connects them. Specialization might play a role; each node might need to be able to physically adapt to specific environmental situations. Ideally, a node would contain everything it needed to conform and self optimize to changing demands and opportunities. But most of the plasticity in the system will come from the connections between the nodes: how they self aggregate into opportunistic groups to solve problems, refine and store information, and communicate, edit, and propagate pattern; how they learn; and how other groups learn to use this accumulated knowledge as decision and control structure.
I am compelled to take a short detour and talk to the "multi-core processor" issue. Every major chip manufacturer is building them. A new arms race has shifted bragging rights from clock speed to core count. Commercial chips with 16 cores are about to hit the market, and 32-core chips will follow. Intel has bragged of a working 80-core prototype. The dirty truth is that chip makers hit a performance wall about 6 years ago, forcing a retreat to multi-core layouts. As they attempted to pack more and more transistors onto a CPU, they came up against the hard reality of physics. Smaller and smaller wires meant greater and greater heat and worse and worse efficiency (as a ratio of their mass, chips were producing more heat than the interior of the sun). Then electrons started to jump gates (quantum tunneling), making transistors unreliable. The sheer size of the chips was rising to the point where the distance signals had to travel (at roughly half the speed of light) was slowing things down. The surface-area-to-edge ratio had shifted to the point that there wasn't enough length at the chip's perimeter through which to supply data to keep it running at full capacity. Real-world uses tended not to match the workloads for which these big chips were optimized. But mostly, the industry was rapidly reaching the physical limits of the project-and-etch manufacturing process that had so reliably doubled processor density every couple of years.
The industry responded by going condo… by dividing the chips into smaller and smaller sub-units. Multi-core chips come with their own unique set of problems. The main problem is that nobody has figured out how to write a compiler that converts source code into a format that takes full advantage of multiple cores. Compilers customized for two-core chips are worse than useless for four- or eight-core chips. The other problem is that tasks cut up to be processed on multiple cores produce results out of sync with the cores next to them. So data that one part of a process needs may have to wait around for another process to finish. The results produced by each of these core-assigned threads must be put into some sort of shared memory and must be labeled and tracked so that the processes being run on other cores can avoid using unprocessed data or data that is now owned by another process thread. The logical topology of multi-core processing models has proved difficult to generalize. Automating the processing sequence and the locking of transitory in-process memory, regardless of the number of cores available to the execution of an application thread, is beyond the practical scope of most programmers. The end result is that most code runs slower on multi-core chips.
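The shared-memory bookkeeping described above is easy to demonstrate in miniature. The toy below (my own illustration, not from the essay) has four threads updating one counter: without a lock, increments can be lost; with the lock, the answer is right but every thread spends time waiting on the others, which is exactly the overhead that can make naively parallelized code slower than its single-core version.

```python
# Toy demonstration of the shared-memory synchronization problem on
# multi-core (or multi-threaded) hardware. Illustrative only.

import threading

counter = 0
lock = threading.Lock()

def work(iterations: int, use_lock: bool) -> None:
    global counter
    for _ in range(iterations):
        if use_lock:
            with lock:            # correct, but threads serialize and wait on each other
                counter += 1
        else:
            counter += 1          # unsynchronized: read-modify-write updates can be lost

def run(use_lock: bool) -> int:
    global counter
    counter = 0
    threads = [threading.Thread(target=work, args=(100_000, use_lock)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("without lock:", run(False))   # often less than the expected 400000
print("with lock:   ", run(True))    # exactly 400000, at the cost of contention
```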
Until the industry responds with a way to reliably and automatically break executable code into self-optimizing threads managed for multi-core chips, you'd better think twice when told that a thirty-two-core chip is better than a two-core chip.
But there may be a silver lining to all of this multi-core craziness. When you think about it, multiple cores on one chip are much the same as having several chips… and isn't that what swarm computing is all about? Solving the problems and meeting the challenges of designing a true multi-core processing architecture is exactly equivalent to solving the problems and meeting the challenges of designing a processing architecture for swarm computing. The only real difference between the multi-core and multi-node computing challenge is security. Cores, existing as they do on one chip, are mostly owned by one entity. Ownership is implicit. Nodes, on the other hand, can be far flung and owned by any number of separate entities. Multi-node processing is processing that must adhere to strict ownership and data security handshakes. Where processing involves multiple entities, data must be served up from shared memory in accordance with strict asynchronous locking protocols.
We have taken the old-style solitary and discrete processing and programming model (computing in a box) about as far as human creativity and cognitive capacity will permit. As a result, and because each of our other research pursuits is so intimately dependent on the steady advance of computing's power and scope, progress in almost every human endeavor has been stalled by the complexity wall that stands between computing's solitary and linear past and its collaborative and n-dimensional future. Designing our way forward, towards an architectural solution to the swarm computing problem, must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, progress in all human endeavors will remain stalled.
As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, the other side of computing's complexity wall heralds manyfold increases in global productivity, dwarfing all previous epochs.
Change increases entropy. The only variable is how fast the Universe falls towards chaos. What determines this rate is the complexity being carried. Complexity exists only to increase disorder. Evolution is the refinement of a fitness metric. It is the process of refining a criterion for measuring the capacity of a system to maximize its future potential to hold complexity. This metric becomes ever more sophisticated, and can never be predetermined. Evolution is the computation.
When advances get in the way of innovation…
From the perspective of true innovation – not run-of-the-mill, better, faster, smaller, cheaper innovation, but deep, fundamental, changes-everything innovation – the computer industry has stagnated for more than fifteen years. This matters. This matters to everyone and everything. This might matter more than any other thing that matters at this point in the history of human culture. Like it or not, computers are THE primary tool driving progress in ALL fields of human endeavor and ALL sectors of the market. The result? Hell, we LIVE the results of this stagnation all day every day. The future might be forced to live it even more viscerally. Larger economic metrics like national productivity expose this standstill for what it is and what it affects.
This innovation standstill, and by extension every wild financial boom-and-crash cycle the world has been through since then, is the direct result of the fact that productivity (which of course is directly linked to innovation cycles) has not risen since 1992. As we all know, productivity rises in direct relation to the pace of innovation in tools and infrastructure. You can say what you will about the sheer scale and rapid global adoption of the internet, of virtual shopping, banking, and stock trading, of email, and cell phones... neither new channels for communication nor new venues for consumption (despite the obvious revenue they have created) have significantly increased our global ability to do more work in less time... to increase productivity. Please remember that businesses are in the business of grabbing as much as they can of the current GDP, the current economic pie. That is market competition... temporal and immediate. Productivity, on the other hand, which is what I am talking about and is something totally different, is about growing the total economic pie. Productivity is an economics issue. Though it is often affected by or set into motion by the actions of businesses, productivity is a long term function... it increases as new tools, techniques, social behavior, or infrastructure allow the same production as before for fewer hours of labor, or less input of energy. It is the plow and irrigation channels that allow one family to feed nineteen families so that all twenty families together can build a more complex and still more productive infrastructure.
In the absence of the kinds of substantial technological innovations that are needed for paradigm increases in productivity, markets react in maladapted but perfectly predictable ways. Left starving for real growth in wealth and wealth generation, markets adjust emphasis and focus towards the shifting around of assets, creative accounting, slice and sell, merge and downsize, layered asset repackaging; in short, they do what they have to do to keep equity partners and stock owners happy by any and every means possible. In the absence of innovation-driven rises in productivity, "any and every means possible" almost always leads to destructive long term results... to the opposite of growth.
At the same time, and somewhat ironically, second and third world economies have seen spectacular growth. Though daunting in other ways, "developing" regions owe their achievements in growth to the acquisition and application of innovation, pre-built. They simply adopt decades- and even centuries-old western knowledge and innovation... building out modern infrastructures at pennies on the dollar. All this while first world economies stagnate. How could this be? Surely the first world benefits from global modernization, from the rest of the world catching up with western levels of wealth and productivity? Yes. Of course it does. The non-western world just happens to be composed of four fifths of the planet's population. From a purely business, abjectly greedy point of view, you can make a lot more money selling four times as many washing machines as you can by treating these people as a source of cheap labor and the regions they represent as cheap sources of raw materials.
Because of the sheer size of the undeveloped world, small rises in wealth have huge investment implications. As a result, western stock, commodities, and money markets, always the most attractive, have surged as much of this new money has entered the investment fray. But this surge of third and second world investment in western securities (sounds funny, doesn't it) has not been matched by the kinds of true infrastructure advantages that would justify the resulting staggering growth in valuation. The west has found itself in the enviable position of having access to more money than its own innovation mechanisms can support.
By the way, consumption does not equal productivity. Never did and never will. Consumption can act as a rough indicator of wealth, the inevitable outcome of wealth, but when driven by credit and debt, consumption can increase independent of, and often at odds with, wealth. When people have more money, they can and do spend more of it on the things they need, and they spend in new markets (iPods and restaurant-quality kitchen remodels, bigger houses, cars with all-leather interiors, navigation systems, and flat panel media centers), and they spend what's left over on savings and market speculation. But consumption is sometimes the result of credit. With personal credit, people spend money that isn't yet theirs. People buy money in order to buy things and experiences. This activity is euphemistically labeled "consumer confidence" because it suggests that people think that though they don't have adequate money to make purchases today, they will somehow acquire more money in some reasonable tomorrow. But in the presence of credit, people don't behave reasonably. In the presence of personal credit, consumption outpaces productivity-generated real wealth. The more people owe, the less money they have to both pay back their credit debt and make more purchases. The credit industry responds by extending more debt. The real value of real money becomes, as a result, very hard to track. What does consumption-driven revenue actually mean when that consumption is being financed by debt? Disturbingly, in wholesale markets, debt is conceived of, marketed, and packaged as "product".
Consumer credit is an extreme bastardization of the concept of "capital". Capital, as traditionally defined, as loans made to producers, can legitimately be shown to increase an economy's ability to be productive. Personal credit is rarely used to increase capital, rarely thought of as a tool to increase a person's ability to make wealth. It is somewhat ironic that we may cringe at the idea of maxing out a credit card to finance a business or to initiate a creative project, but we have no such aversion to using the same credit to go on a weekend trip or to acquire new shoes. We recognize that businesses, in pursuit of the creation of wealth, need access to loans, to lines of credit, to venture capital, to the sale of equity as stock. But we are generally cognizant of the fact that the same access to capital, when offered to individuals as credit, does not often drive anything more profound than debt and the kinds of consumption that are non-productive, that don't add to an economy's ability to produce new wealth, to do more with less labor.
Consumption isn't restricted to the individual shopper at the local mall. Consumption happens at huge corporate and international scales as well. Consumption, defined as the act of purchasing things and services that do not lead to greater productivity, is becoming an alarmingly common business practice involving tens of trillions of dollars exchanged daily in global markets. To the extent the money being spent is not owned by the consuming party, to the extent the things being purchased do not lead to increases in productivity, and to the extent that the intention of the participating parties is not motivated by an understanding of productivity or a desire to act on its behalf, exchange throughput must be viewed for what it really is, an unreliable indicator of true growth. To the extent a market is propped up by the infusion of cash, to the extent that a market grows for reasons that are not tied to its ability to increase productivity, eventually the whole system has to crash. As it has many, many times, and as it has crashed recently.
The same thing happened with, and caused, the dot-com boom. New money, money that had not previously had access to the markets, found its way to Wall Street through internet trading. Ma and Pa Simpleton could hook up a modem and siphon their savings and retirement accounts right into APL and IBM stocks. This new money made old stocks look more valuable. "Rising demand" and all. Only nobody got that there is a big difference between the new "more demand" and our old notions of "higher demand".
Likewise, the unprecedented surge of non-western investment in western stock markets coincided with an ongoing flattening of our own innovation-starved growth trajectories; good money was and continues to be funneled into very bad monetary mechanisms and "creative" securities products when it should have gone towards innovation-induced capital and productivity. We really don't have an economic theory to fit this crazy bottom-up investment model. We really don't know how to track and predict the differences between, and interactions at the intersection of, investment markets and the real product markets they (sometimes) represent.
Meanwhile, second and third world markets have indeed grown as a direct result of the build-out of more efficient infrastructures. The last 15 years have seen the second and third world adopting modern industrial farming, modern highway systems, modern water and energy production and distribution systems, modern banking and credit, pluralist governance and education, modern health care, and the industrial machinery that can only be produced and maintained by a well educated work force. Along the way, attitudes and cultures have adjusted to the individual freedoms that come hand in hand with wealth, capital, and stability. But all of this growth has been a result of the rest of the world adopting ideas, knowledge, tools, and infrastructure that have long existed in the west. Productivity has risen as expected at the rate of adoption.
In traditional western economies, the economies that originated the knowledge, tools, and infrastructures now being adopted in the rest of the world, increases in productivity must come from innovation. We don't have the luxury of adoption... there are no societies more advanced from which to borrow innovation.
We must innovate! In a modern economy, innovation is mandatory. More to the point, the kinds of innovations we must produce, that the west must build and implement, must be the particular kinds of innovation that result in true increases in global productivity. Forget about little innovations, or surface innovation, or innovations that extrapolate on older innovations. Don't bother with innovations that exploit markets created by earlier innovations... for apex economies, innovation is only innovation if it catalyzes whole new markets, if it fundamentally reshapes the future of innovation.
At every large organization, there are people whose primary responsibility is the happiness of investors. Investors expect the value of their equity share to increase steadily. Included in this group are CEOs, CFOs, boards of directors, and almost everyone working at the top levels of management. Their careers are directly linked to the value performance of the organization they represent. When productivity doesn't rise apace, institutional professionals are compelled to find any and every artificial means of inflating the value of their product. A bubble is born. Unsupported by real value, these epochs of artificially inflated values must inevitably come crashing down.
Regulation does help. By restricting market managers from engaging in the most egregious and obvious inflationary tactics, some market shenanigans can be avoided. But ultimately, in unhappy times, market managers will find a way to do their job... to keep shareholders (temporarily) happy. And then, of course, there is always fiscal policy. Government- and banking-driven manipulation of fiscal and monetary policy (controlling the base price and availability of capital as loans) has important, but limited and short term, effects on markets. Fiscal policy is a surface fix. The Fed board (and its equivalent in other nations and economies) is a fine-tuning instrument... and has absolutely disastrous implications when used beyond this narrow band of effect (currency devaluation, runaway inflation, decreased foreign investment, etc.).
Again, we come back to the basics... to the most fundamental metric in any economy... to productivity. If you can find a way to feed your entire society with just two percent of your population working the soil (rather than ninety), you can get more done with your total labor pool. Productivity rises. Industrialized farming, automated factories, the delivery of clean water to homes and work places, an accepted currency and banking system that makes capital available to the masses, an equitable and respected system of governance and justice, the removal of garbage and human waste, an efficient transportation system for people, goods, and industrial materials, a reliable communications network, public education, career and health care, and an efficient source of energy that can be routed where it is needed... these are the foundational infrastructures that drive an economy. Each presaged an increase in productivity and is linked to true growth.
It has been a long time since an infrastructure-scale innovation has rocked the national and global productivity metrics. The cultivation of plants and domestication of animals. Metallurgy. Fire. Shelter and clothing. Division of labor. Governance. Spoken and written language. The printing press. Transportation infrastructure. Production and distribution of energy. Communication systems. These epochs are at the scale I am addressing. Electricity. Oil and gas production. The telegraph, radio, and TV. The internal combustion engine. Numbers, measurement, and mathematics. Public education. Currency. The periodic table. Germ theory. Genetics. Evolution theory. Information theory. Machines that compute. These are the innovations that produce epochs of productivity.
Marked in staccato evolutionary steps, computing has proceeded apace through its rather short history, aping the dumb media we used to conduct culture... before computers. Ledger sheets, paper and pencil, typewriters, notebooks, folders and filing cabinets, printing presses, desks, drawers, chalk boards, telephones, mail, even cameras, sound and video recorders, televisions and radios. Very little of this analog-to-digital conversion process has added to the science of computing. Very little of the computerization of media has advanced the infrastructure of logic, of systems, of knowledge automation. Mostly our efforts, and the markets that have resulted (rich as they have been), have sidetracked true evolution in computing by aligning attention to artifacts instead of meaning. The existence of computerized spreadsheets might make working with ledger data easier, but it doesn't help us understand economics any better, and it doesn't advance the science of computation or logic. Progress in computing over the last 30 years has done little more than add efficiency to old methods and processes; very little of the effort expended has resulted in better computing.
This is a dead end process. Using all of this logical power to make a better piece of paper... what a joke. We should instead aim to evolve technology that can generalize the larger problems that end up expressing themselves as ledger sheets and notebooks... technology that gets to the base of the issues that manifest the need for spreadsheets and word processors... technology that understands goals, that tracks resources, that builds collaborative solutions, that seeks patterns that build patterns... technology that can process the hierarchies of influence that effect the transition from any now to any inevitable future.
Can we look to today's computers, as amazing as they are, and truly say of them that mimicking paper added as much value to computing as it did to writing and typewriters? Efficiencies have been gained, sure, but at what cost? A computer can and should be much, much more than a writing device; it should be an evolution machine. Is it? They have the power. What's missing is vision… not their vision… ours. Humans need to expect more of this most plastic of all machines.
Randall Reetz
This innovation standstill, and by extension, every wild financial boom-and-crash cycle the world has been through since, is the direct result of the fact that productivity (which of course is directly linked to innovation cycles) has not risen since 1992. As we all know, productivity rises in direct relation to the pace of innovation in tools and infrastructure. You can say what you will about the shear scale and rapid global adoption of the internet, of virtual shopping, banking, and stock trading, of email, and cell phones... neither new channels for communication, nor new venues for consumption, (despite the obvious revenue they have created), have significantly increased our global ability to do more work in less time... to increase productivity. Please remember that businesses are in the business of grabbing as much of the current GDP, the current economic pie. That is market competition... temporal and immediate. Productivity on the other hand, what I am talking about, totally different, is about growing the total economic pie. Productivity is an economics issue. Though it is often effected by or set into motion by the actions of businesses, productivity is a long term function... it increases as new tools, techniques, social behavior, or infrastructure allows the same production as before for less hours of labor, or less input of energy. It is the plow and irrigation channels that allow one family to feed nineteen families so that all twenty families together can build a more complex and still more productive infrastructure.
In the absence of the kinds of substantial technological innovations that are needed for paradigm increases in productivity, markets react in maladapted, but perfectly predictable and expectable ways. Left starving for real growth in wealth and wealth generation, markets adjust emphasis and focus towards the shifting around of assets, creative accounting, slice and sell, merge and downsize, layered asset repackaging, in short, they do what they have to do to keep equity partners and stock owners happy by any and every means possible. In the absence of innovation-driven rises in productivity, "any and every means possible", almost always leads to destructive long term results... to the opposite of growth.
At the same time, and somewhat ironically, second and third world economies have seen spectacular growth. Though daunting in other ways, "developing" regions owe there achievements in growth to the acquisition and application of innovation, pre-built. They simply adopt decades and even centuries old western knowledge and innovation... building out modern infrastructures at pennies on the dollar. All this while first world economies stagnate. How could this be? Surely, the first world benefits from global modernization, from the rest of the world catching up with western levels of wealth and productivity? Yes. Of course it does. The non-western world just happens to be composed of four fifths of the planet's population. From a purely business, abjectly greedy, point of view, you can make a lot more money selling four times as many washing machines as you can by treating these people as a source of cheep labor and the regions they represent as cheep sources of raw materials.
Because of the shear size of the undeveloped world, small rises in wealth have huge investment implications. As a result, western stock, commodities, and money markets, always the most attractive, have surged as much of this new money has entered the investment fray. But this surge of third and second world investment in western securities (sounds funny doesn't it) has not been matched by the kinds of true infrastructure advantages that would justify the resulting staggering growth in valuation. The west has found itself in the enviable position of having access to more money than it's own innovation mechanisms can support.
By the way, consumption does not equal productivity. Never did and never will. Consumption can act as a rough indicator of wealth, the inevitable outcome of wealth, but when driven by credit and debt, consumption can increase independent of, and often at odds with wealth. When people have more money, they can and do spend more of it on the things they need, and they spend in new markets (ipods and restaurant quality kitchen remodels, bigger houses, cars with all leather interiors, navigation systems, and flat panel media centers), and they spend what's left over on savings and market speculation. But consumption is sometimes the result of credit. With personal credit, people spend money that isn't yet theirs. People buy money in order to buy things and experiences. This activity is euphemistically labeled "consumer confidence" because it suggests that people think that though they don't have adequate money to make purchases today, they will somehow acquire more money in some reasonable tomorrow. But, in the presence of credit, people don't behave reasonably. In the presence of personal credit, consumption outpaces productivity generated real wealth. The more people owe, the less money they have to both pay back their credit debt and make more purchases. The credit industry responds by extending more debt. Real value of real money becomes, as a result, very hard to track. What does consumption driven revenue actually mean when that consumption is being financed by debt? Disturbingly, in wholesale markets, debt is conceived of, marketed, and packaged as "product".
Consumer credit is an extreme bastardization of the concept of "capital". Capital, as traditionally defined, as loans made to producers, can legitimately be show to increase an economy's ability be productive. Personal credit is rarely used to increase capital, rarely thought of as a tool to increase a person's ability to make wealth. It is somewhat ironic that we may cringe at the idea of max-ing out a credit card to finance a business or to initiate a creative project, but we have no such aversion to using the same credit to go on a weekend trip or to acquire new shoes. We recognize that business, in pursuit the creation of wealth, need access to loans, to lines of credit, to venture capital, to the sale of equity as stock. But we are generally cognizant of the fact that the same access to capital when offered to individuals as credit, does not often drive anything more profound than debt and the kinds of consumption that are non-productive, that don't add to an economy's ability to produce new wealth, to do more with less labor.
Consumption isn't restricted to the individual shopper at the local mall. Consumption happens at huge corporate and international scales as well. Consumption as defined as the act of purchasing things and services that do not lead to greater productivity is becoming an alarmingly common business practice involving tens of trillions of dollars exchanged daily in global markets. To the extent the money being spent is not owned by the consuming party, to the extent the things being purchased do lead to increases in productivity, and to the extent that the intention of the participating parties is not motivated by an understanding of productivity or a desire to act in its behalf, exchange throughput must be viewed for what it really is, an unreliable indicator of true growth. To the extent a market is propped up by the infusion of cash, to the extent that a market grows for reasons that are not tied to its ability to increase productivity, eventually the whole system has to crash. As it has many many times, and as it has crashed recently.
The same thing happened and caused the dot com boom. New money, money that had not previously had access to the markets, found its way to wall street through internet trading. Ma and Pa Simpleton could hook up a modem and siphon their savings and retirement accounts right into APL and IBM stocks. This new money made old stocks look more valuable. "Rising demand" and all. Only nobody got that there is a big difference between the new "more demand" and our old notions of "higher demand".
Likewise, the unprecedented surge of non-western investment in western stock markets coincided with an ongoing flattening of our own innovation-starved growth trajectories, good money was and continues to be funneled into very bad monitory mechanisms and "creative" securities products, when it should have gone towards innovation induced capital and productivity. We really don't have an economic theory to fit this crazy bottom up investment model. We really don't know how to track and predict the differences between and interactions in the intersection of investment markets and the real product markets they (sometimes) represent.
Meanwhile, second and third world markets have indeed grown as a direct result of the build out of more efficient infrastructures. The last 15 years has seen the second and third world adopting modern industrial farming, modern highway systems, modern water and energy production and distribution systems, modern banking and credit, pluralist governance and education, modern health care, and the industrial machinery that can only be produced and maintained by a well educated work force. Along the way, attitudes and cultures have adjusted to the individual freedoms that come hand in hand with wealth, capital, and stability. But all of this growth has been a result of the rest of the world adopting ideas, knowledge, tools and infrastructure that has long existed in the west. Productivity has risen as expected at the rate of adoption.
In traditional western economies, economies that originate(ed) the knowledge, tools and infrastructures now being adopted in the rest of the world, increases in productivity must come from innovation. We don't have the luxury of adoption... there are no societies more advanced from which to borrow innovation.
We must innovate! In a modern economy, innovation is mandatory. More to the point, the kinds of innovations we must produce, that the west must build and implement, must be the the particular kinds of innovation that result in true increases in global productivity. Forget about little innovations, or surface innovation, or innovations that extrapolate on older innovations. Don't bother with innovations that exploit markets created by earlier innovations... for apex economies, innovation is only innovation if it catalyzes whole new markets, if it fundamentally reshapes the future of innovation.
At every large organization, there are people who's primary responsibility is the happiness of investors. Investors expect the value of their equity share to increase steadily. Included in this group are CEO's, CFO's, Boards of Directors, and almost everyone working at top levels of management. Their careers are directly linked to the value performance of the organization they represent. When productivity doesn't rise apace, institutional professionals are compelled to find any and every artificial means of inflating the value of their product. A bubble is born. Unsupported by real value, these epochs of artificially inflated values must inevitably come crashing down.
Regulation does help. By restricting market managers from engaging in the most egregious and obvious inflationary tactics, some market shenanigans can be avoided. But ultimately, unhappy times, market mangers will find a way to do their job... to keep share holders (temporarily) happy. And then, of course, there is always, fiscal policy. Government and banking driven manipulation of fiscal and monitory policy (controlling the base price and availability of capital as loans) has important, but will only have limited and short term effects on markets. Fiscal policy is a surface fix. The fed board (and its equivalent in other nations/economies) is a fine-tuning instrument... has absolutely disastrous implications when used beyond this narrow band of effect (currency devaluation, runaway inflation, decreased foreign investment, etc.).
Again, we come back to the basics... to the the most fundamental metric in any economy... to productivity. If you can find a way to feed your entire society with just two percent of your population working the soil (rather than ninety), you can get more done with your total labor pool. Productivity rises. Industrialized farming, automated factories, the delivery of clean water to homes and work places, an accepted currency and banking system that makes capital available to the masses, and equitable and respected system of governance, and justice, the removal of garbage and human waste, an efficient transportation system for people, goods, and industrial materials, a reliable communications network, public education and career and health care, and an efficient source of energy that can be routed where it is needed... these are the foundational infrastructures that drive an economy. Each presaged an increase in productivity and is linked to true growth.
It has been a long time since an infrastructure-scale innovation has rocked the national and global productivity metrics. The cultivation of plants and domestication of animals. Metallurgy. Fire. Shelter and clothing. Devision of labor. Governance. Spoken and written language. The printing press. Transportation infrastructure. Production and distribution of energy. Communication systems. These epochs are at the scale I am addressing. Electricity. Oil and gas production. The telegraph/radio/TV. The internal combustion engine. Numbers, measurement, and mathematics. Public education. Currency. The periodic table. Germ theory. Genetics. Evolution theory. Information theory. Machines that compute. These are the innovations that produce epochs of productivity.
Marked in staccato evolutionary steps, computing has proceeded apace through its rather short history, aping the dumb media we used to conduct culture... before computers. Ledger sheets, paper and pencil, typewriters, notebooks, folders and filing cabinets, printing presses, desks, drawers, chalk boards, telephones, mail, even cameras, sound and video recorders, televisions and radios. Very little of this analog to digital conversion process has added to the science of computing. Very little of the computerization of media has advanced the infrastructure of logic, of systems, of knowledge automation. Mostly, our efforts and the markets that have resulted (rich as they have been) have sidetracked true evolution in computing by aligning attention to artifacts instead of meaning. The existence of computerized spreadsheets might make working with ledger data easier, but it doesn't help us understand economics any better, and it doesn't advance the science of computation or logic. Progress in computing over the last 30 years has done little more than add efficiency to old methods and processes; very little of the effort expended has resulted in better computing.
This is a dead end process. Using all of this logical power to make a better piece of paper... what a joke. We should instead aim to evolve technology that can generalize the larger problems that end up expressing themselves as ledger sheets and notebooks... technology that gets to the base of the issues that manifest the need for spreadsheets and word processors... technology that understands goals, that tracks resources, that builds collaborative solutions, that seeks patterns that build patterns... technology that can process the hierarchies of influence that effect the transition from any now to any inevitable future.
Can we look to today's computers, as amazing as they are, and truly say of them that mimicking paper added as much value to computing as it did to writing and typewriters? Efficiencies have been gained, sure, but at what cost? A computer can and should be much, much more than a writing device; it should be an evolution machine. Is it? They have the power. What's missing is vision… not their vision… ours. Humans need to expect more of this most plastic of all machines.
Randall Reetz
Problems with Search Engines, Who's Your Daddy?
If you are living within a modern developed economy you are most likely the unwitting master of over 40 computer chips. Of these chips only one or two are the big expensive general purpose CPU's that sit at the center of PC's and laptops of the sort Intel or AMD or Motorola charge several hundred dollars, aggressively advertise, and which, improbably, get both more powerful and less expensive at the wild rate dictated by Moore's Law (respectively doubling and halving every 18 months).
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPU's, embedded chips are usually smaller, more specific, and limited in their abilities. Uncomplainingly, and in many cases, without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Much to the chagrin of front yard hotrodders, embedded chips now control your car's gas/air mixture, and time the ignition sparks in each cylinder. This is one of the reasons modern cars are so much more powerful than earlier cars and manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water, from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and from accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags and keep you alive by sensing an impact and reacting to it in the few ten thousandths of a second between impact and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity and air flow within your car. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system, in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or if you are scheduled for a tuneup. There are chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wrist watch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD player. On your bed side table your alarm clock and home phone have them, even each of the little black boxes that charge your wireless devices. Even some toilets are controlled by embedded chips. Most washing machines have them, more and more refrigerators come with them, some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys have them, robots, learning systems, trivia games, and then there are home weather stations, TV set top boxes for cable and satellite, hand held language translators, calculators, walky-talkies, baby monitors, pace makers, insulin pumps, fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, which dutifully measure your soil's acidity, temperature, nitrogen content, moisture level, and the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart, plug its USB plug into your computer, and a web site takes your zip code and spits back a list of plants suited to that part of your garden's conditions.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is that they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors, incorporating new processes that etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, so hundreds of chips can be etched at once; once you have built the fabrication plant, it costs no more per chip to print out chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years ago, can be cheaply retooled to spit out lesser chips at almost no cost at all.
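To make that economics concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is an illustrative assumption of mine, not a figure from industry or from this essay; the point is only the shape of the arithmetic: a huge fixed fab cost amortized over enormous volumes, plus a marginal wafer cost that barely depends on how many transistors each chip carries.

```python
# Toy amortization model of per-chip cost at a chip fab.
# All numbers are illustrative assumptions, not real industry figures.

def cost_per_chip(fab_cost, chips_per_year, years_of_service, wafer_cost_per_chip):
    """Fixed fab cost spread over lifetime output, plus marginal wafer cost."""
    amortized_capital = fab_cost / (chips_per_year * years_of_service)
    return amortized_capital + wafer_cost_per_chip

# A brand new leading-edge fab: billions in capital, but the projection
# (photolithography) step prints a whole wafer of chips at once, so the
# marginal cost per chip barely depends on transistor count.
new_fab = cost_per_chip(fab_cost=3e9, chips_per_year=100e6,
                        years_of_service=5, wafer_cost_per_chip=4.00)

# The same plant years later: capital fully paid off (treated as zero),
# retooled to churn out "lesser" embedded chips.
old_fab = cost_per_chip(fab_cost=0, chips_per_year=100e6,
                        years_of_service=5, wafer_cost_per_chip=0.30)

print(f"leading-edge CPU:              ${new_fab:.2f} per chip")  # ~$10.00
print(f"depreciated-fab embedded chip: ${old_fab:.2f} per chip")  # ~$0.30
```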
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, who can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes the second world and the second world becomes the first world. Technology is no longer the sole domain of the top one tenth of the global population. There is a double explosion: of processing power, and of a growing demographic with growing demands for information and control.
It is hard to over-estimate the ramifications brought about by this diaspora of processor infused devices. Born of computers but now running amok into the firmament of the day to day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self contained processing device... wow, will that come to seem quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no-one. A computer? In the new smart dust network of things, a computer is what happens for a few seconds and what goes away as other ethereal combinations of devices snap into and out of existence, layered such that any one node, if it is part of one such computer, is part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of those people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex information and control exchange.
By the most conservative estimates, the network of things will grow out at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, and if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run? We are used to culture and society and government evolving by such loose and unstructured associations and connections… but that is because we humans are the nodes in social networks.
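As a sanity check on that projection, here is a small sketch of the compound growth arithmetic in Python. The first scenario uses the essay's stated figures (a billion owners, 40 chips each, doubling every five years); the second is my own assumption, added to show that landing in the ten-to-hundred-trillion range requires the owner base to expand toward the whole population and the doubling period to shrink below five years.

```python
# Back-of-envelope projection of the "thing-net" size.
# Parameters are either the essay's stated figures or clearly marked guesses.

def projected_chips(owners, chips_per_owner, doubling_years, horizon_years):
    """Total chips after compounding chips-per-owner doublings."""
    doublings = horizon_years / doubling_years
    return owners * chips_per_owner * 2 ** doublings

# Essay's stated starting point: 1 billion owners, 40 chips each,
# doubling every 5 years, over a 20-year horizon.
base = projected_chips(1e9, 40, doubling_years=5, horizon_years=20)
print(f"{base:.2e} chips")        # 6.40e+11 -- hundreds of billions

# My assumed aggressive scenario: all six billion people own chips and the
# per-person count doubles every 3 years instead of 5.
aggressive = projected_chips(6e9, 40, doubling_years=3, horizon_years=20)
print(f"{aggressive:.2e} chips")  # ~2.4e+13 -- tens of trillions
```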
As the design of sensor/controller/communicator modules grows in sophistication, as the same factors that gave us Moore's Law come into play in their design and production, as these All-Net motes shrink and become more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing, and communications capabilities, and where these capabilities are diffuse, cheap, expendable, and by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning: these little radios can compute, and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere has tens or hundreds of intelligent processing motes built into it?
Clearly, today's operating systems would collapse under such a load, stripped of the centrality and determinism afforded to today's computing environments. Computing, as it stands today, is a reflection of simpler times when the computer stood all by itself. But what of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyper-link and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices, as a shifting sphere of influence, a probability cloud shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping spheres. If this seems a radically new idea, it is an idea already predated by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing", but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity that true network computing will be.
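To make the idea a little less abstract, here is a toy sketch in Python of "a computer" as a momentary coalition of nodes: motes advertise capabilities, a coalition forms around a task, does its work, and dissolves. Every name, field, and matching rule below is my own invention for illustration, not an architecture this essay or any existing system prescribes.

```python
import random

# Toy model: a "computer" is an ephemeral coalition of motes that forms
# around a task and is forgotten when the task is done. All identifiers
# and the greedy matching rule are illustrative inventions.

class Mote:
    def __init__(self, mote_id, capabilities):
        self.mote_id = mote_id
        self.capabilities = set(capabilities)  # e.g. {"sense.temp", "cpu"}

def form_coalition(motes, needed):
    """Greedily gather motes until every needed capability is covered."""
    coalition, still_needed = [], set(needed)
    for mote in motes:
        offered = still_needed & mote.capabilities
        if offered:
            coalition.append(mote)
            still_needed -= offered
        if not still_needed:
            return coalition      # a "computer" exists, for now
    return None                   # not enough capability within reach

# A scattering of motes, each with a random pair of capabilities.
catalog = ["sense.temp", "sense.accel", "cpu", "storage", "radio"]
swarm = [Mote(i, random.sample(catalog, 2)) for i in range(50)]

# A task appears (log temperature and relay it): a coalition snaps into
# existence, computes, and then simply dissolves. The same mote can sit
# in many overlapping coalitions at once, which is the essay's point.
computer = form_coalition(swarm, {"sense.temp", "cpu", "radio"})
if computer:
    print("coalition members:", [m.mote_id for m in computer])
```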
So what will it take? What new logical structures will we have to build to get from the isolated computing systems we use today to true swarm computing? What kinds of architectural logic will be required of a computing that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared, momentary, and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be pre-determined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure is hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single computer past and the swarm computer future. We know how to architect single computer solutions, but we are struggling to bring multi-computer, self optimizing, swarm computing solutions from idea or dream to full working detail.
We desperately need an architectural solution to the swarm computing problem. We need a solution that allows each node to act as a cell: autonomous, but fully capable of coming together to build out collaborative systems, as organs, bodies, species, and cultures. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities.
Progress in every human endeavor has been stalled by the complexity wall that stands between computing past and its future. Designing our way towards an architectural solution to the swarm computing problem must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, evolution in all human endeavors will remain stalled. As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, on the other side of the complexity wall waits an increase in global productivity that will dwarf all previous epochs.
Computing hasn't even begun.
The Reetz Test!
Alan Turing proposed a simple test for assessing the parity of artificial and human intelligence. A computer would pass the test if a person interacting with it through a keyboard couldn't tell whether he was communicating with a human or a computer. But this test is not sufficient for me. To achieve true parity with humans, the computer should be better than a good cheat!
I have never seen human potential defined as the ability to take (or create) a test!
My own best definition of the highest of human capacities, that which separates us from everything that came before us, is that we have the ability to see ourselves in the context of, and as active participants in, the evolution of the universe.
So, I await a computer who will scoff at Turing's little test and write something similar to this post.
A Sobering Look at Our Role in Evolution
We have a tendency to lump time and progress together, intuiting a direct link between the two, maybe even considering them one and the same. With time, there is a past, a present, and a future. Though they flow smoothly from one to the other, they are linearly separated by a line called now and remain experientially distinct. But what of causality, of progress? Compare the complexity of humans (or of any biological entity) with the chemistry from which life is composed, and you are forced to concede that progress towards complexity is as linear as time itself. In the past there was simple, followed by a now that is more complex, and we take all of this as proof that the future will be yet more complex.
But how does one go about getting to the future? The passive answer: wait patiently. But that implies that progress, like time, just plain old happens... independent of intent. We don't yet know from what stuff or situation time propagates. Einstein's little relativity equation allows us to compute the state or value of time in relation to energy, matter, and distance, but that really isn't the same as understanding the causal chain from which time itself is born. We experience time only after the fact, after time exists. "So what?" I hear you say, "time goes about its merry business without need of understanding anyway!" And that is my point. We accept that which never changes. Omnipotent things and processes like time are experienced as background, firmament, unquestionable. But constancy doesn't justify ignorance. In fact, if the unimaginable success of the uniquely human and very recent activity called SCIENCE says anything, it says that sticking our fingers in the cosmic cracks and peeling back the firmament's many layers is the single most efficient means of success. Is that what progress is? It is at the very least a type of progress. But the evolutionary record shows that progress happened for quite a while before the appearance of science or humans or animals or, for that matter, before life itself. Which makes me wonder... if algae could think, would algae be as confused by photosynthesis as we are by intent?
Specifically, I am thinking here about the future of computing. But that doesn't really matter; when I think about any future, I use a trick I have learned over the years: I look for the ultimate future, the pinnacle, the end state, the future that will, if given enough time, happen. To this end I have spent the last thirty years pondering evolution itself (the rules and patterns of change), and by simple expansion of logic, self-evolving computing. I assume that something like computing, something of computing, will some day proceed beyond the yoke of human intent.
As an aid to understanding the events around us, we humans developed counting and measurement. From there we developed arithmetic rules and methods for manipulating and comparing our sums and measurements. After that, it was only inevitable that we would build computing tricks and machines to slog through all of that arithmetic heavy lifting. Computers and maths, like shovels and spears, are nothing but external tools that extend control of our environment beyond our own biological abilities. Anyone who has spent time exploring the notion of tools has come up against the foreground/background problem that makes it impossible, or at the very least arbitrary, to come up with criteria that perfectly separate any tool from any tool user. The arm or hand, is it a part of you or a tool? Any answer you come up with depends heavily on how you define "you", the boundaries of "you", and how you define "tool". If a tool is something a thing develops so that it can do something it couldn't previously do, then how is a hand, developed by evolution, not a tool? When any discussion of progress, of the evolution of tools, and of the boundaries that define a system reaches the level of sentience, of intent, of purpose, we are faced by mutually linked concepts that can quickly cascade into feedback loops of infinite recursion.
Given that, is there a way to measure progress? If the thing trying to measure progress is also a product of, and participant in that progress, is there a way to know the differences between actual universal limits and limits induced by our own structure and knowledge? These are big big issues, and they are not the stuff of philosophy anymore. Progress in almost every human endeavor is linked directly or indirectly to our ability to build systems that handle greater and greater complexities.
Eventually of course, it won't be enough to build better and better extensions to human capacity. Eventually, our complexity handling tools will reach a level of self-generating complexity handling such that interaction with humans will actually slow them down, will hamper their own evolution!
That eventuality, that inevitability, is a long way from where we sit today. But if we are going to go there, and we know it, why not make it an intentional and directed effort? That way, we will get there sooner, and we will waste less time and energy exploring dead-end evolutionary what ifs.
Along the way, I imagine a computing that seeks its own complexity maximizing goals, an epoch stretching out forward of the birth of the paradigm that comes after biology, a period of evolution that begins with what is now popularly known as "the singularity".
By any reasonable argument, it would be difficult to imagine a future that does not eventually reach such a point. It happened with chemistry... that is how life got here! Life, I argue, is just chemistry following a higher order of self organization. So it would seem arrogant to suppose that a similar jump in self-organization wouldn't happen on top of the current scheme: biology.
I don't have a religious obsession with the future. Meaning, I am not attracted to the future as a nether-world that will come and save us. I don't see the future as a panacea, or a back door, or a way out. I see it as the direct result of a process that resulted in us and in which we are necessary participants. In the exact same sense that we are the direct result of actions taken by things before us, the future, to me, is a thing to be built... by us! The beauty of evolution is that it just plain happens. Obviously. In this neck of the cosmic woods, we are the first result of evolution that has the ability to see itself in the context of this grand process... as an agent or ingredient in the process! But, because we are the first, we have to accept that consciousness is not requisite to the process. In fact, one has to wonder if consciousness, if sentience, doesn't at times very much get in the way of evolution.
Regardless, here we are, self aware, and like everything else, part and parcel of a system that changes over time... that gets more complex in pockets that are already more complex. So it is natural for us to ask the biggest questions of fate, purpose, and intention. If, as I do, we accept that intent is simply a mechanism of organization (not qualitatively different from the Krebs cycle or photosynthesis or RNA transcription), we have a responsibility to do evolution proud, to honor all of the hard work that has resulted in this level of complexity that is us, and to run our little leg of the relay in a way that respects the race that brought us into existence.
But what exactly is our role in this race that gave us, us? We are beginning to understand the rules of the race, of evolution itself. How does knowing about the race change our participation? Is this the ultimate Faustian hubris? Are we flirting with the reification of Pandora's Box? The Ouroboros (the snake eating its own tail)? It seems the very expression of human reason to explore questions of purpose. But the old ways of seeking resulted in abstractions, philosophy, fantasy... this is different. This is the blueprint of change. This is a recipe for process itself. Not some process... EVERY process.
I am reluctant to imagine that humans are truly equipped to desire an actual understanding of reality. We seek "enlightenment", not "reality". Reality requires a closeness that is uncomfortable, or at least unfamiliar. But as we acquire an accurate and causal understanding of purpose, what will happen to purpose, and to purpose's transition into action? Thermodynamics and information science have given us the structure of a theory of change. Will we accept it as reality? Can we understand it as reality? And if so, then what? In the theater of the mind, does knowing how change happens play itself out differently from seeking enlightenment or any of the more spiritual practices that have driven individual morality, shaped cultures and their ontologies, and ultimately resulted in motivation?
I was asked recently how I define the difference between humans and other life... I remembered Gregory Bateson's challenge to come up with a reliable set of criteria that would allow anyone to determine the difference between a thing that had once been alive and a thing that had never been alive... and I answered:
A human can see itself as an active participant in the process that resulted in humans.
If we can, so informed, imagine ourselves, accept our purpose, as that which, like all before us, is here to maximize the potential of complexity, will we, equipped with self knowledge of the actual workings of the system, be able to do evolution better than the systems that did it, and did it so well, without knowledge? Knowledge should make us better. But knowledge will surely bring its own unique challenges to the process. Roughly: there were particles, and particles accumulated into super particles, which accumulated into atoms, which accumulated into stars, which ran through their fuel, exploded, and accumulated into new stars and planets; the atoms on planets accumulated into more and more complex molecules, and these molecules accumulated structures that allowed them to reproduce, which led to even more complex molecules that worked as the molecule factories we call life; and these living systems eventually built abstraction systems, or minds, and these minds eventually evolved sentience, and sentience gives us this sentence:
What matters is what matters, knowing what matters and how to know it matters the most.
That sentence is the first sentence of the book I am writing explaining evolution from the perspective of why. The sentence is not special, it could have been written by anyone, it probably has been written before. But what it means, and the ability to mean it is special. It represents what and who we are as a species.
Darwin did a really great job explaining the how of evolution, at least with regards to biology. What I work towards is a domain independent (any system) understanding of the rare but influential emergence and self-stability of greater and greater complexity. I don't see biology as special. I don't see humans as special. I don't even see human sentience as special. I see each of these systems as ever increasing, often layered, accumulations of complexity that are quantitatively but not qualitatively different from each other. There is a huge difference between sentience and chemistry of course, but both levels of complexity are derived by the same dynamics, the same process. Nothing new or bold or other-worldly has to be added to the general evolutionary scheme in order to move from the evolution of atoms to the evolution of sentience. I labor this point only as a means of grounding the other theses I write here. Grounding seems especially important when extrapolating any ideas to the future.
Of course my insistence on fusing human intention to this largest of problems, to evolution itself, may seem superfluous or self aggrandizing given the concurrent argument that complexity just plain happens. Where oxygen can't help but bind to iron, human behavior is such that intention, though every bit as mechanical, requires the effort of thought against the noise of other competing mental processes. A structure must be built in the brain, a real, mechanical structure, in order for intent to be realized. The building of these structures demands energy. Thermodynamically, we know that any structure is always (ALWAYS!) the result of the least energy path causally accumulated... that the next easiest thing to happen is in fact what always happens next. If you ate french fries today but you want instead to run five miles tomorrow, then you have to go about building a new structure in your brain so that it takes less energy tomorrow to make the decision to run than it took to eat french fries today. Worse than that, the process of reengineering these mental energy topologies must itself take less energy than every other competing process. Given the hard taskmaster that is thermodynamics, it is a wonder that any complexity ever happens, let alone sticks around for long. But then again, our brains must be pretty good at facilitating this seemingly impossible or improbable act... at turning the building of a thing into the mechanism that requires less energy.
The way that thermodynamics shapes and restricts causality is best understood if you think about dropping sand from your hand. Most of the time (almost all of the time) the sand will land in configurations considerably less organized than the already low organization it had in your fist. But once in a while, a couple of grains, shaped just right, with just the right internal properties, will land in just the right orientation and proximity to each other so that their new arrangement accomplishes two entirely improbable things at once: the structure is self stable (resists disruption), and it facilitates an increase in the whole system's ability to do what it is already doing faster and more completely. Random interactions between grains of sand will at times create complex patterns, even patterns that might accelerate the processes at hand, but most of these random aggregates will not be stable, will not pass their structure into the future. The appearance of new complexities is profoundly improbable. Even rarer are new complexities whose structure can be maintained over time. In order for this to happen at all, a new complexity must cause the total system in which it resides to become less stable and less complex. Nature falls apart easier than it falls together. The long future of any system is away from complexity and towards chaos. Systems can only become complex to the extent that they accelerate or help to maximize this grand movement towards disorder.
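As a toy illustration of just how lopsided those odds are, here is a small Monte Carlo sketch in Python. The two probabilities are arbitrary numbers I picked to make the point, not measurements of anything: random "drops" almost never produce an arrangement that is both self stable and catalytic, yet those rare survivors are the only structures that persist from one round to the next.

```python
import random

# Toy Monte Carlo: a random arrangement is almost never both stable and
# catalytic, but only arrangements that are both survive into the future.
# The probabilities below are arbitrary illustrative choices.

P_STABLE    = 0.001   # chance a random arrangement resists disruption
P_CATALYTIC = 0.01    # chance it also speeds up the surrounding process

def drop(rng):
    """One handful of sand: True if the result is stable AND catalytic."""
    return rng.random() < P_STABLE and rng.random() < P_CATALYTIC

rng = random.Random(42)
trials = 1_000_000
survivors = sum(drop(rng) for _ in range(trials))

# Expect roughly P_STABLE * P_CATALYTIC * trials = ~10 survivors:
# complexity is profoundly improbable, yet those few survivors are
# exactly what accumulates over time.
print(f"{survivors} persistent structures out of {trials:,} drops")
```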
And that, my friends, is probably the most counterintuitive truth any complex system will ever be asked to understand.
None of us know exactly how intent is manifested, how it plays itself out in the brain, but we do know that it must be as compliant with the hard constraints of thermodynamics as every other system in the universe. I look at intent through the lens of thermodynamics only to show that we find ourselves at a strangely confusing threshold: previously we lived in blissful ignorance, and didn't know enough to question the difference between the way it feels to think and how thinking makes thinking seem this way. In front of us is the era we must now live in, the era of the singular strangeness of the enlightenment that comes with being able to see the mind as a mechanical system even as we think these very un-mechanical thoughts about intent. Wow.
Now what? Given access to this great big brain we all possess, on which class of tasks can we put it to work that will do the greatest justice to its evolutionarily rare and recent potential, and to the cosmic scale of the sacrifices made to produce it in the first place? An even more perplexing question is the extent to which self-knowledge at this snake-eating-its-own-tail level affects mental productivity. If a Chevy engine could sacrifice some of its piston strokes to calculate the correct gas/air mix to maximize its power output, should it? What of the power lost to those re-calibration strokes? Do we know enough about the workings of the mind, about learning, about the effects of intention when inwardly directed, to risk messing with the system? On the other hand, which of our mental activities are not, in point of fact, exactly this kind of self-tinkering?
We have a tendency to lump time and progress together, intuiting a direct link between the two, maybe even considering them one and the same. With time, there is a past, a present, and a future. Though they flow smoothly from one to the other, they are linearly separated by a line called now and remain experientially distinct. But what of causality, of progress? Compare the complexity of humans (or of any biological entity), with the chemistry from which life is composed, and you are forced to concede that progress towards complexity is as linear as time itself. In the past there was simple, followed by a now that is more complex, and we talk all of this as proof that the future will be yet more complex.
But how does one go about getting to the future? The passive answer: wait patiently. But that implies that progress, like time, just plain old happens... independent of intent. We don't yet know from what stuff or situation time propagates. Einstein's little relativity equation allows us to compute the state or value of time in relation to energy, matter and distance, but that really isn't the same as understanding the causal chain from which time itself is born. We experience time only after the fact, after time exists. "So what?" I hear you say, "time goes about its merry business without need of understanding anyway!" And that is my point. We accept that which never changes. Omnipotent things and processes like time are experienced as background, firmament, unquestionable. But constancy doesn't justify ignorance. In fact, if the unimaginable success of the uniquely human and very recent activity called SCIENCE says anything, it says that sticking our fingers in the cosmic cracks and pealing back the firmament's many layers, is the single most efficient means of success. Is that what progress is? It is at the very least, a type of progress. But the evolutionary record shows that progress happened for quite a while before the appearance of science or humans or animals or for that matter, before life itself. Which makes me wonder... if algae could think, would algae be as confused by photosynthesis as we are by intent?
Specifically, I am thinking here about the future of computing. But that doesn't really matter, when I think about any future, I use a trick I have learned over the years, I look for the ultimate future, the pinnacle, the end state, the future that will, if given enough time, happen. To this end I have spent the last thirty years pondered evolution it self (the rules and patterns of change), and by simple expansion of logic, self-evolving computing. I assume that something like computing, something of computing, will some day proceed beyond the yoke of human intent.
As an aid to understanding the events around us, we humans developed counting and measurement. From there we developed arithmetic rules and methods for manipulating and comparing our sums and measurements. After that, it was only inevitable that we would build computing ticks and machines to slog through all of that arithmetic heavy lifting. Computers and maths, like shovels and spears, are nothing but external tools that extend control of our environment beyond our own biological abilities. Anyone who has spent time exploring the notion of tools has come up against the foreground background problem that makes it impossible, or at the very least, arbitrary, to come up with criteria that perfectly separates any tool, from any tool user. The arm or hand, is it a part of you or a tool? Any answer you come up with depends heavily on how you define "you", the boundaries of "you", and how you define "tool". If a tool is something a thing develops so that it can do something it couldn't previously do, then how is a hand, developed by evolution, not a tool? When any discussion of progress, of the evolution of tools, and of the boundaries that define a system reaches the level of sentience, of intent, of purpose, we are faced by mutually linked concepts that can quickly cascade into feedback loops of infinite recursion.
Given that, is there a way to measure progress? If the thing trying to measure progress is also a product of, and participant in that progress, is there a way to know the differences between actual universal limits and limits induced by our own structure and knowledge? These are big big issues, and they are not the stuff of philosophy anymore. Progress in almost every human endeavor is linked directly or indirectly to our ability to build systems that handle greater and greater complexities.
Eventually of course, it won't be enough to build better and better extensions to human capacity. Eventually, our complexity handling tools will reach a level of self-genorating complexity handling such that interaction with humans will actually slow them down, will hamper their own evolution!
That eventuality, that inevitability, is a long way from where we sit today. But, if we are going to go there, and we know it, why not make it an intentional and directed effort. That way, we will get there sooner, we will waste less time and energy exploring dead-end evolutionary what ifs.
Along the way, I imagine a computing that seeks its own complexity maximizing goals, an epoch stretching out forward of the of birth of the paradigm that comes after biology, a period of evolution that begins with what is now popularly known as "the singularity".
By any reasonable argument, it would be difficult to imagine a future that does not eventually reach such a point. It happened with chemistry... that is how life got here! Life, I argue, is just chemistry following a higher order of self organization. So it would seem arrogant to suppose that a similar jump in self-organization wouldn't happen on top of the current scheme; biology.
I don't have a religious obsession with the future. Meaning, I am not attracted to the future as a nether-world that will come and save us. I don't see the future as a panacea, or a back door, as a way out. I see it as the direct result of a process that resulted in us and that we are necessary participants. In the exact same sense that we are the direct result of actions taken by things before us, the future, to me, is a thing to be built... by us! The beauty of evolution is that it just plain happens. Obviously. In this neck of the cosmic woods, we are the first result of evolution that has the ability to see itself in the context of this grand process... as an agent or ingredient in the process! But, because we are the first, we have to accept that consciousness is not requisite to the process. In fact, one has to wonder if consciousness, if sentience, doesn't at times very much get in the way of evolution.
Regardless, here we are, self-aware, and like everything else, part and parcel of a system that changes over time... that gets more complex in pockets that are already more complex. So, it is natural for us to ask the biggest questions of fate, purpose, and intention. If, as I do, we accept that intent is simply a mechanism of organization (not qualitatively different than the Krebs cycle or photosynthesis or RNA transcription), we have a responsibility to do evolution proud, to honor all of the hard work that has resulted in this level of complexity that is us, and to run our little leg of the grand relay in a way that respects the race that brought us into existence.
But what exactly is our role in this race that gave us, us? We are beginning to understand the rules of the race, of evolution itself. How does knowing about the race change our participation? Is this the ultimate Faustian hubris? Are we flirting with the reification of Pandora's box? The Ouroboros (the snake eating its own tail)? It seems the very expression of human reason to explore questions of purpose. But the old ways of seeking resulted in abstractions, philosophy, fantasy... this is different. This is the blueprint of change. This is a recipe for process itself. Not some process... EVERY process.
I am reluctant to imagine that humans are equipped to desire a realistic understanding of reality. More often, we seek "enlightenment" instead of "reality". Reality requires a closeness that is uncomfortable, or at least unfamiliar. But as we acquire an accurate and causal understanding of purpose, what will happen to purpose, or to purpose's transition into action? Thermodynamics and information science have given us the structure of a theory of change. Will we accept it as reality? Can we understand it as reality? And if so, then what? In the theater of the mind, does knowing how change happens play itself out differently from seeking enlightenment or any of the more spiritual practices that have driven individual morality, shaped cultures and their ontologies, and ultimately resulted in motivation?
I was asked recently how I define the difference between humans and other life... I remembered Gregory Bateson's challenge to come up with a reliable set of criteria that would allow anyone to determine the difference between a thing that had once been alive and a thing that had never been alive... and I answered:
A human can see itself as an active participant in the process that resulted in humans.
If we can, so informed, imagine ourselves, accept our purpose, as that which, like all before us, is here to maximize the potential of complexity, will we, equipped with self-knowledge of the actual workings of the system, be able to do evolution better than the systems that did it, and did it so well, without knowledge? Knowledge should make us better. But knowledge will surely bring its own unique challenges to the process. Roughly, there were particles, and particles accumulated into super-particles, which accumulated into atoms, which accumulated into stars, which ran through their fuel, exploded, and accumulated into new stars and planets; the atoms on planets accumulated into more and more complex molecules, and these molecules accumulated structures that allowed them to reproduce, which led to even more complex molecules that worked as the molecule factories we call life, and these living systems eventually built abstraction systems, or minds, and these minds eventually evolved sentience, and sentience gives us this sentence:
What matters is what matters, knowing what matters and how to know it matters the most.
That sentence is the first sentence of the book I am writing explaining evolution from the perspective of why. The sentence is not special; it could have been written by anyone, and it probably has been written before. But what it means, and the ability to mean it, is special. It represents what and who we are as a species.
Darwin did a really great job explaining the how of evolution, at least with regard to biology. What I work towards is a domain-independent (any system) understanding of the rare but influential emergence and self-stability of greater and greater complexity. I don't see biology as special. I don't see humans as special. I don't even see human sentience as special. I see each of these systems as ever-increasing, often layered, accumulations of complexity that are quantitatively but not qualitatively different from each other. There is a huge difference between sentience and chemistry, of course, but both levels of complexity are derived by the same dynamics, the same process. Nothing new or bold or other-worldly has to be added to the general evolutionary scheme in order to move from the evolution of atoms to the evolution of sentience. I labor this point only as a means of grounding the other theses I write here. Grounding seems especially important when extrapolating any ideas to the future.
Of course my insistence on fusing human intention to this largest of problems, to evolution itself, may seem superfluous or self-aggrandizing given the concurrent argument that complexity just plain happens. Where oxygen can't help but bind to iron, human behavior is such that intention, though every bit as mechanical, requires the effort of thought against the noise of other competing mental processes. A structure must be built in the brain, a real, mechanical structure, in order for intent to be realized. The building of these structures demands energy. Thermodynamically, we know that any structure is always (ALWAYS!) the result of the least-energy path causally accumulated... that the next easiest thing to happen is in fact what always happens next. If you ate french fries today but you want instead to run five miles tomorrow, then you have to go about building a new structure in your brain so that it takes less energy tomorrow to make the decision to run than it took to eat french fries today. Worse than that, the process of reengineering these mental energy topologies must itself take less energy than every other competing process. Given the hard taskmaster that is thermodynamics, it is a wonder that any complexity ever happens, let alone sticks around for long. But then again, our brains must be pretty good at facilitating this seemingly impossible or improbable act... at turning the building of a thing into the mechanism that requires less energy.
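Here is a toy sketch of what I mean, in Python, with numbers I made up purely for illustration (this is not a model of any real neural process): every option carries an energy cost, the cheapest option is the one that happens, and changing behavior means paying now to lower the cost of the option you want until it undercuts its competitors.

# Toy model of "least energy path" decision making (illustrative numbers only).
# Each option has an energy cost; the cheapest option is the one that happens.
# "Practice" is modeled as shaving a bit off an option's cost each time it is
# rehearsed, until the desired behavior becomes the path of least resistance.

costs = {"eat french fries": 3.0, "run five miles": 7.0}

def whatever_happens_next(costs):
    """The next easiest thing to happen is what happens next."""
    return min(costs, key=costs.get)

print("today:", whatever_happens_next(costs))      # eat french fries

# Rehearsing the run (laying out shoes, planning a route, imagining the habit)
# is itself work, but each rehearsal lowers tomorrow's decision cost a little.
for _ in range(10):
    costs["run five miles"] *= 0.8

print("tomorrow:", whatever_happens_next(costs))   # run five miles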
The way that thermodynamics shapes and restricts causality is best understood if you think about dropping sand from your hand. Most of the time (almost all of the time) the sand will land in configurations considerably less organized than the already low organization it had in your fist. But once in a while, a couple of grains, shaped just right, with just the right internal properties, will land in just the right orientation and proximity to each other so that their new arrangement accomplishes two entirely improbable things at once: the structure is self-stable (resists disruption) and it facilitates an increase in the whole system's ability to do what it is already doing faster and more completely. Random interactions between grains of sand will at times create complex patterns, even patterns that might accelerate the processes at hand, but most of these random aggregates will not be stable, will not pass their structure into the future. The appearance of new complexities is profoundly improbable. Even rarer are new complexities whose structure can be maintained over time. In order for this to happen at all, a new complexity must cause the total system in which it resides to become less stable and less complex. Nature falls apart easier than it falls together. The long future of any system is away from complexity and towards chaos. Systems can only become complex to the extent that they accelerate or help to maximize this grand movement towards disorder.
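Just how improbable? A minimal back-of-the-envelope simulation, again with probabilities invented only for illustration, shows what happens when you multiply three already small chances together: ordered at all, stable enough to persist, and helpful to the larger slide towards disorder.

# Toy Monte Carlo of the "dropped sand" argument (made-up probabilities).
# A drop persists into the future only if it is ordered AND stable AND
# accelerates the whole system's dissipation; almost nothing qualifies.
import random

random.seed(42)

P_ORDERED = 0.01      # chance a drop lands in an ordered arrangement (assumed)
P_STABLE = 0.05       # chance an ordered arrangement resists disruption (assumed)
P_ACCELERATES = 0.10  # chance a stable arrangement speeds overall dissipation (assumed)

drops = 1_000_000
persistent = 0
for _ in range(drops):
    if (random.random() < P_ORDERED
            and random.random() < P_STABLE
            and random.random() < P_ACCELERATES):
        persistent += 1

print(f"{persistent} of {drops:,} drops left complexity that persists "
      f"(~{persistent / drops:.5%})")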
And that, my friends, is probably the most anti-intuitive truth any complex system will ever be asked to understand.
None of us knows exactly how intent is manifested, how it plays itself out in the brain, but we do know that it must be as compliant to the hard restraints of thermodynamics as is every other system in the universe. I look at intent through the lens of thermodynamics only to show that we find ourselves at a strangely confusing threshold: previously we lived in blissful ignorance, and didn't know enough to question the difference between the way it feels to think and how thinking makes thinking seem this way. In front of us is the era we must now live in, the era of the singular strangeness of the enlightenment that comes with being able to see the mind as a mechanical system even as we think these very un-mechanical thoughts about intent. Wow.
Now what? Given access to this great big brain we all possess, on which class of tasks can we put it to work that will do the greatest justice to its evolutionarily rare and recent potential, and to the cosmic scale of the sacrifices made to produce it in the first place? An even more perplexing question is the extent to which self-knowledge at this snake-eating-its-own-tail level affects mental productivity. If a Chevy engine could sacrifice some of its piston strokes to calculate the correct gas/air mix to maximize its power output, should it? What of the power lost to those re-calibration strokes? Do we know enough about the workings of the mind, about learning, about the effects of intention when inwardly directed, to risk messing with the system? On the other hand, how many of our mental activities are not, in point of fact, exactly this kind of self-tinkering?
There is even more cause for concern when we look to what we are learning about the way our brains evolved. We think with a brain, a machine that "accumulated" more than it "changed"... a machine built as layers added over time, each one more complex, culminating in this eighth-inch-thin topmost layer we call the neocortex, each one adding to but never replacing the functionality handled by deeper layers. Because of this, there is a necessary one-way communication challenge. The outermost, more recent and more complex layers speak a language the inner layers are simply too simple to understand. Yet the inner layers, having developed when organisms were themselves less complex, are far more likely to be directly wired to action and control. So, in order for our wonderfully complex and capable outer and more recent layers to effect change, they must send messages down to older, dumber, more directly wired layers, layers that are by definition incapable of understanding the reasoning that supports a directive. It's as if each of us is a ship full of geniuses all trying to tell a really stupid captain when to turn and why. The stupid captain may be able to understand the directives, but is at a loss to understand the reasoning backing them up. This crazy situation is the reality of the current state of complexity... it is the sand currently being dropped by the cosmic hand of evolution.
It is my argument that we succeed only to the extent that we can build a more and more accurate understanding of both the general parameters of evolution and the current situation evolutionary processes must work with.
NBC/MSNBC, The Worst Olympics Coverage EVER!
I can't tell if this is the worst ever coverage of an Olympics, or if it just seems so bad in contrast to what could have been in this internet broadband world.
Even if we completely forgot that we live in a world where almost everyone has some sort of access to the internet... NBC is doing nothing to take advantage of the breadth and depth of the spectacle that is the Olympics. Think of the hundreds of events taking place daily. The dozens of venues. The tens of thousands of athletes. The millions of potential stories. Even if the games had to be told through the linearity of the tube yet again, there have been far better examples. In fact, the only thing about NBC's coverage that gives any hint of the size of a modern summer Olympics is the number of regular shows they have canceled. Are the graphics people on strike? Are the sports writers on strike? Did all of the "up close and personal" gonzo reporters retire? Is the instant replay button on the fritz? Can't show a map of the venues between segments? Can't interview a foreign athlete? Can't show a grand schedule of the day's events? Can't show Olympic, world, or national records next to the current times or scores?
One of the funniest things about NBC's coverage is the hourly cut-away to that big 70's living room with all the empty chairs and couches and one guy or gal sitting alone at the end of a ridiculously long dolly shot through the vast emptiness of the set. What is that about? The old preacher-looking guy they have sitting there, deadpan, delivering the day's events perfectly completes the "700 Club" feel of the place. Would it be possible to design a less "sports" looking set or find a less sports-sounding guy to sit there (and do nothing!)? All this excitement and sports-fever is just about killing me! Is the guy even alive? Surreal.
Oh yeah, I almost forgot, we live in the internet age. An age where we are used to getting exactly what we want when we want it. So, for instance, if there are a thousand or so Olympic spectacles each day, why can't we just go online, punch in a search phrase, and watch exactly what we want, exactly when we want to watch it? And as many times as we want to watch it? YouTube has written software that automates the entire video publish-and-delivery system. Even if NBC can't figure out how to do the internet thing themselves, they could just upload the events as segments the way the rest of us do... to YouTube.
And don't give me that "there is no money in it" crap. If you can insert a 30-second ad into a TV broadcast, you can insert one into a digital video stream. I heard that Google has made a little money on the web through ad sales. What the hell is going on? Too bad NBC can't partner with a mega-large technology company (who could show them the digital ropes). What's that? They have? With Microsoft? What kind of crazy backwards-world is this? Nothing makes sense.
If I were an NBC or Microsoft shareholder I would be demanding heads right now. Think of the potential revenues lost! If I were a citizen of the 21st century, I would be wondering why all of this technology we own is making things worse.
I don't read them, but my local Borders bookstore has a huge section with thousands of the latest and greatest books on business and marketing. I will bet any one of them would outline at least ten big ways in which NBC and Microsoft have totally blown this exclusive coverage opportunity.
I have an idea: how about we award three coverage contracts instead of just one? If NBC had to compete for our viewership, I would bet things would be substantially different. We used to stand up against monopolies and monopolistic policy. A monopoly derived through open bidding is no less a monopoly. A market is open only to the extent there is choice in the market. Each consumer must have the power to choose at any moment between any of several suppliers. The very notion of a "contract" runs counter to consumer choice.
Come on people, speak up! Demand more! Please don't give up. A 60-inch high-definition plasma isn't going to make bad television anything but more obvious. Let's stop purchasing high-resolution screens and start demanding high-relevance content.