I have an idea: plot the ratio of the total number of dollars in the stock market vs. the total number of dollars in the US economy sans the market (both over the same 100-year history).
My guess is that there have been spikes in this stock-vs.-M ratio that correspond with new money entering the market, either shifted from other domestic segments (real estate, retirement accounts, etc.) or arriving through an international influx of investments (the rapidly rising wealth of Asia and the rest of the world, or the sudden collapse of a large industry, commodity, or governmental or regional stability). Very few standard economic metrics measure true macro or global interaction between geo-scale segments.
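As a sketch of how the proposed ratio could be computed, here is a minimal Python fragment. All yearly figures below are invented placeholders; real inputs would be a total market capitalization series and a money-supply series over the same century:

```python
# Hypothetical, invented figures (USD) purely for illustration.
market_cap  = {1990: 3.0e12, 2000: 15.0e12}   # total dollars in the stock market
total_money = {1990: 9.0e12, 2000: 20.0e12}   # total dollars in the economy, market included

def market_vs_rest_ratio(cap, total):
    """Market dollars divided by non-market dollars (the economy sans the market)."""
    return {y: cap[y] / (total[y] - cap[y]) for y in cap if y in total}

ratio = market_vs_rest_ratio(market_cap, total_money)
# A sharp jump in this ratio between years would flag new money entering
# the market faster than the rest of the economy is growing.
```

Plotting `ratio` over a real 100-year span is then a one-liner in any charting library; the interesting part is locating the spikes.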
Comparing the Dow Jones against itself over time, or, more rarely, against other metrics like the GDP, is different from reifying this and other comparisons as named metrics in and of themselves. But what is completely missing are geo-scale metrics that track the shifting of value across regions and segments, in effect treating economic entities as markets competing for the maximum percentage of the total global value, or geo-M.
What I am getting to is some way to accurately read the total global M and total labor L and total energy use E and to track the motion in real time of the density of these values geographically, or geopolitically, or by production or consumption segment. Once a true global economic sandbox tracker/simulator has been built, we will have the ability to read the economy as it is, in real time, and find the factors that sit at the base of the actual influence hierarchy that drives and causes economic flux.
My suspicion is that actual economic growth is always and only the result of increases in the means and use of tools and infrastructure that can build more with less labor... productivity... and that, in the absence of true growth in productivity, a market responds through acts of trickery which are ultimately not supportable and always culminate in crashes or "adjustments" that tend to pull market values into closer alignment with actual M and productivity values.
When new money comes into a market, standard supply and demand metrics are no longer accurate predictors or models; they will tend to say that the value of a product has risen. In a closed system this evaluation would most often be accurate. In an open system, where money can stream into a market not to chase a product for consumption or industry, but just because that market seems a more attractive place in which to speculate than others, it throws the whole system into instability. Producers under such conditions are wont to make more shoes, even though people are not growing more feet or walking more holes into their soles. End-consumers begin, at inflationary times like these, to speculate with their purchases, buying products and commodities not because they need them, but because it seems foolish not to. Products and commodities take on a currency-like property and, through over-valuation, supersaturate the market... leading to an inevitable value crash. If a large enough percentage of an economy's value has been suckered into such a bubble, the crash will bleed over into the economy as a whole... hitting the financial and banking markets first and hardest.
Over the past two decades, three fundamental factors have made large markets in the west especially sensitive and vulnerable to these new-money boom/crash cycles.
First, the undeveloped economies of the world have experienced exponential growth as they adopt tools and infrastructures borrowed from the first world. Importantly, because the third and second worlds represent the vast majority of the world's population and geography, this explosion in wealth (though still, on average, only bringing the third world slightly out of poverty) began to represent (by sheer size) a larger and larger segment of the global economy. Remember that this "rest of world" economy represents roughly six times the population of the first world. Even smallish changes to a segment of this relative multiple have a huge effect on the total global economy. And the actual changes have not been small. Second and third world economies have absolutely exploded! Much of this new money has of course been reinvested into the local economies from which it sprang. But, increasingly, larger and larger chunks of this new money have gone searching for boutique markets like the New York Stock Exchange and its equivalents in Japan, Germany, England, France, and the EU.
The second factor has to do with the paucity of true growth in productivity experienced in western and first world economies during this same twenty or thirty year period. Post-industrial economies are regions that have secure and constant access to reliable transportation of goods and services (shipping, highway, rail, and air transport); ready and secure capital (through investment banking and business and consumer credit); private property ownership (as a means to secure capitalization); education (to steadily feed highly skilled workers into labor markets); governments that protect and promote the well-being and success of their citizens en masse; a dependable infrastructure to create, extract, and distribute energy; and the means to grow and process foods cheaply on industrial scales. These rare segments of the global marketplace have had these capabilities for some forty years. Excepting, of course, incremental gains made in the efficiency of the above systems as a result of new knowledge and tools that flow from a better understanding of nature through advances in the sciences, productivity has largely leveled off and remained level for the better part of a quarter century. What of the computer? you say. Surely the computer and the World Wide Web have had a huge positive impact on first world economies. But interestingly, the net economic effect of computation and the digital networks it creates has been surprisingly neutral. We do pump a larger and larger percentage of first world money into computation and the infrastructure that consumes and uses computers.
Real productivity gains have yet to precipitate outward from the large success of computing as a market into the larger first world economy. Ironically, the computer industry's success in the west may have impacted second and third world economies the most. It may be that money made in the computing industry flowed more deeply and directly into the rest-of-world economies where much of the computer industry does its manufacturing, assembly, and customer support. That computing has not resulted in measurable increases in first world productivity has computer industry insiders scratching their heads. During the Dot Com boom, pro-industry analysts creatively sidestepped this uncomfortable truth by inventing the idea of a "new economy" or "cyber economy", famously proclaiming, "The old rules and metrics don't apply". They were wrong, in the short term, but maybe, just maybe, in the long term they will be correct.
I suspect that the true economic benefits, or potential benefits, of the computer and computing upon global productivity have yet to be realized. The computer industry has spent the last 30 years largely learning how to get computers to do what we did (though slower and more awkwardly) before we had computers (writing, printing, telephony, accounting, payroll, data processing, advertising, point of purchase, audio and video broadcast, mathematics, graphing, market tracking and trading, banking, news and reporting, information sharing, post and mail, libraries, process control, etc.). This conversion has been expensive and time consuming. Much of the time, we have proceeded as an industry (and a society) without a clear goal. Let me restate: neither the computing industry nor the consuming public has had a clear idea where computing has been or ought to be going. Much of the time, both industry and market have been happy just to see what new (old) thing the computer can be taught to do... blindly building and consuming our way into the future just because it is "cool" or "fun" or "neat" or adds some strange and intoxicating "immediacy" to our daily lives (even when that immediacy does not equate to effectiveness or lead us to the deeper and more efficient infrastructures necessary to cause the kinds of profound increases in productivity we expect from new technology paradigms).
I am a big believer in the future of computing, or the future that computing could build towards, but this belief is contingent upon society getting to a deep clarity of understanding about what computing is and why it matters. We have got to work hard at determining the difference between that which is cool and that which is profound; between that which we want and that which will change the world. Until then, we are simply designing and producing towards consumption, which will make segments of the industry rich and will bring money from the consuming west into emerging economies, but it will not ultimately support real growth, the kind of growth that is supported by knowledge, tools, and infrastructure that have the capacity to catapult productivity to the next level (the way the tractor-pulled plow, germ theory, general education, the steam engine, and electricity have done in the past).
[to be continued...]
Change increases entropy. The only variable is how fast the Universe falls towards chaos. What determines this rate is the complexity being carried. Complexity exists only to increase disorder. Evolution is the refinement of a fitness metric. It is the process of refining a criterion for measuring the capacity of a system to maximize its future potential to hold complexity. This metric becomes ever more sophisticated, and can never be predetermined. Evolution is the computation.
Artificial General Intelligence X-Prize Challenge
Proposal:
Entrants submit a storage device containing general purpose AI code. On the day of the challenge, each entrant's code is dumped into a computer aboard a device that is not announced until after each entrant has submitted their code. The device is situated in an environment with an obvious goal. The goal is implied by the capabilities of the device in combination with the attributes of the environment and situation. The entrant's code is expected to explore the capabilities of the device into which it is implanted by reading data streams from embedded sensors and sending control signals to actuators (arms, wheels, eye focusing and pointing, etc.).
The challenge device may be a submarine, a car, a microscope, a motorcycle, a missile, a fish, an airplane, dirigible, or helicopter, a walking or crawling robot, a kitchen robot, a bank ATM, a database, etc. There may actually be two or three devices that each entrant's code must inhabit and learn to control in a situation with an implied goal and specific dangers.
The idea is that AI challenges have, to date, been too specific and narrow, leading to brittle, customized "AI" that is in point of fact only tricky logic and programming, not intelligent at all (it can't learn, adapt, or improvise).
The entrants can assume that the challenge device(s) will deliver several sensor data streams and output ports for the control of actuators. More detailed input/output protocols and conventions will be published well before the challenge date.
The winner of each challenge will be selected through a combination of subjective judgement by a panel of experts and objective metrics (e.g., time-to-completion vs. MIPS of processing used).
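To make the proposed I/O convention concrete, here is a hypothetical sketch in Python. Every name here (`ChallengeDevice`, `Entrant`, the sensor and actuator shapes) is invented for illustration; the real protocol would be published before the challenge date:

```python
class ChallengeDevice:
    """Stand-in for the unannounced device: a few sensor streams and actuator ports."""
    def __init__(self, n_sensors=4, n_actuators=2):
        self.n_sensors, self.n_actuators = n_sensors, n_actuators
        self._state = [0.0] * n_sensors

    def read_sensors(self):
        """Deliver the current frame of sensor readings."""
        return list(self._state)

    def actuate(self, port, value):
        """Drive an actuator; how actuation maps to sensation is unknown to entrants."""
        self._state[port % self.n_sensors] += value


class Entrant:
    """A deliberately naive entrant: probe each actuator and record what changes."""
    def explore(self, device, steps=10):
        log = []
        for t in range(steps):
            before = device.read_sensors()
            port = t % device.n_actuators
            device.actuate(port, 1.0)          # probe one actuator
            after = device.read_sensors()
            log.append((port, [a - b for a, b in zip(after, before)]))
        return log
```

A real entry would, of course, replace the probe loop with learning code; the point is only that entrants see nothing but opaque streams and ports.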
Swarm Computing: Are We Ready?
If you are living within a modern developed economy you are most likely the unwitting master of over 40 computer chips. Of these chips, only one or two are the big, expensive, general purpose CPUs that sit at the center of PCs and laptops, the sort for which Intel or AMD or Motorola charge several hundred dollars, which they aggressively advertise, and which, improbably, get both more powerful and less expensive at the wild rate dictated by Moore's Law (respectively doubling and halving every 18 months).
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specific, and limited in their abilities. Uncomplainingly, and in many cases, without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Embedded chips now control your car's fuel/air mixture, and time the ignition sparks in each cylinder. These chips are one of the reasons modern cars are so much more powerful and manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water, from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and from accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags that keep you alive by sensing an impact and reacting to it in the few ten-thousandths of a second between "Crash!" and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity of the air flowing within your car's interior. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system, in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or if you are scheduled for a tuneup. There are chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wristwatch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD player. On your bedside table your alarm clock and home phone have them, even each of the little black boxes that charge your wireless devices.
Most washing machines have them; more and more refrigerators come with them; some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys have them, as do robots, learning systems, trivia games, home weather stations, TV set top boxes for cable and satellite, hand held language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, which dutifully measure your soil's acidity, temperature, nitrogen content, moisture level, and the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart and plug its USB plug into your computer, which goes to a site that takes your zip code and spits back a list of plants suited to that part of your garden's conditions.
Some toilets are controlled by embedded chips.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors, incorporating new processes that can etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, which means hundreds of chips can be etched at once, which means that once you have built the fabrication plant, it costs no more per chip to print out chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years prior, can be cheaply retooled to spit out lesser chips at unit costs that drop to nearly pennies per.
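The amortization logic above can be made concrete with invented numbers: a fab's cost is fixed, so per-chip cost is simply that cost spread over production volume:

```python
# All dollar figures below are invented for illustration only.
def cost_per_chip(fixed_cost, chips_produced):
    """Per-unit cost when the plant itself is the dominant fixed expense."""
    return fixed_cost / chips_produced

new_fab = cost_per_chip(3e9, 1e8)   # $3B state-of-the-art fab over 100M chips: $30 each
old_fab = cost_per_chip(5e7, 1e9)   # $50M retooling of a paid-off fab over 1B chips: $0.05 each
```

The same arithmetic explains why doubling the transistor count on a chip adds almost nothing to its unit cost: the projection step is paid for once, not per transistor.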
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, who can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes second world and the second world becomes first world. Technology is no longer the sole domain of the top one-tenth of the global population. A double explosion results: 1. processing power increases ridiculously as dictated by Moore's Law, and, 2. a growing percentage of global citizens gain sufficient economic power to affect the demand curve for products and services that are enabled by embedded and shared processors.
It is hard to overestimate the ramifications brought about by this diaspora of processor-infused devices. Born of computers but now running amok into the firmament of the day-to-day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self-contained processing device, wow will that come to seem quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart dust network of things, a computer is what happens for a few seconds and then goes away as other ethereal combinations of devices snap into and out of existence, layered such that any one node is part of many such computers if it is part of one.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of those people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex information and control exchange.
By the most conservative estimates, the network of things will grow out at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, and if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run?
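A back-of-envelope check on that projection, using the stated inputs plus some guesses of mine: 40 chips per owner doubling every 5 years among 1 billion owners yields well under a trillion chips in twenty years, so the ten-to-hundred-trillion range implicitly assumes both broader ownership and a faster doubling.

```python
# Compound growth of the chip count; all scenario inputs are assumptions.
def projected_chips(owners, chips_each, doubling_years, horizon_years):
    return owners * chips_each * 2 ** (horizon_years / doubling_years)

# 1 billion owners, 40 chips each, doubling every 5 years, over 20 years:
conservative = projected_chips(1e9, 40, 5, 20)   # 640 billion chips
# Broader ownership (6 billion) plus a faster, 3-year doubling over 21 years:
aggressive = projected_chips(6e9, 40, 3, 21)     # ~30 trillion chips
```

The gap between the two scenarios is the whole argument: the richer-world and cheaper-chips assumptions are what carry the estimate from hundreds of billions into the tens of trillions.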
In our everyday lives, we are used to complex aggregates like culture, society, and government evolving from similar loose and unstructured associations and connections. Culture as network. But that, we say, happens because humans are the nodes and humans are themselves complex. Indeed, as network nodes, we humans are very complex. We participate in the societal network that is culture as self-contained arbitrators or processors of wildly complex patterns like situation and context, need and availability, of propriety and collaboration, of initiative and momentum, of concept and diffusion, of pattern and chaos. What minimal subset of these skills can we distill from the human node, then generalize and subsume into the silicon and software of our tiny smart-dust kin to be? This is the big project that looms in front of computer science. THE PROJECT. But before you throw up your arms in despair, remember that our own minds are consistent with the smart-dust model. Each of the 100 billion or so neurons in our head is relatively simple (at least in its role as an information and logic processor). "Mind", the remarkable confluence that makes us us, isn't the neurons, the things, the nodes, so much as it is a super-product of the n-dimensional connection map, the network that flits in and out of existence, the fluctuating and overlapping web connecting neurons to neurons.
As the design of sensor/controller/communicator modules grows in sophistication, as we figure out what absolutely has to exist in a node and what is best left to the emergent properties of the network itself, as the same efficiency of scale factors that gave us Moore's law come to play in their design and production, as these All-Net motes shrink, and become more and more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing and communications capabilities and where these capabilities are diffuse, cheap, expendable, and by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning, these little radios can compute and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere contains tens or hundreds of intelligent processing motes built into it?
Clearly, today's operating systems would collapse under such a load, and without the centrality and determinism afforded to today's computing environments. Computing as it stands today is a reflection of simpler times, when the computer stood all by itself. But what, you ask, of the network, the internet, the world wide web? Surely, the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyper-link and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices, as a probability cloud, a shifting sphere of influence shared by, indeed made up of, hundreds or thousands of computing devices, each seeing themselves as the center of their own simultaneously overlapping spheres. If this is a radically new idea, it is an idea that is predated by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing", but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity that will be true network computing.
So what will it take? What new logical structures will we have to build to get from where we are today, from the isolated computing systems we use today, to the self optimized, agent generating, pattern building, semantically aware, swarm computing of tomorrow? What kinds of architectural logic will be required of a computing system that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared and momentary and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be pre-determined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure are hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single computer past and the swarm computer future. We know how to architect single computer solutions, but we are struggling to bring multi-computer self optimizing swarm computing solutions from idea or dream to full working detail.
Of course having a hundred trillion nodes doesn't necessarily mean you have anything approaching swarm computing… it might mean nothing more than the fact that there are a hundred trillion nodes. Nodes that aren't designed to work together, to self-aggregate… will never result in anything more capable than any one of them is as an individual unit. A flock maybe, but never a swarm. A flock is composed of a lot of self-similar individuals acting in their own self-interest; any patterns that appear to emerge are crystalline, pretty, but never complex, emergent, or evolving. A swarm, on the other hand, is defined by structures and behaviors of the aggregate whole that are not themselves possible in the individual parts of which it is composed. A swarm doesn't just "look" like a system… it is one. Where a flock can emerge from the behavior of individuals even when they have no means of sensing the goals of others around them, a swarm can only result when each part has the capacity to be aware of the parts around it and can grasp and process the construct of a shared goal. For a bunch of nodes ever to swarm, each node must possess the capacity to see itself in the context of an environment that is at least partially composed of other nodes.
Current estimates put the number of computers hooked to the global web we call the internet at between one and two billion. But there is nothing swarm-like about the internet. Obviously, today's computers do not have the requisite attributes and abilities to swarm. Exchanging email documents and web pages isn't exactly the stuff of deeply collaborative computing. For one thing, your computer has absolutely no means of understanding the contents of your emails or the web pages it dutifully displays. Adding more of these semantically challenged computers to the net won't get us any closer to swarm computing.
Likewise, most of the yardsticks we use to measure computer performance and progress, like clock speed, data bandwidth, bus speed, chip memory, and disk capacity, are useless as indicators of the capacity of a node to participate in a computational swarm. Today's computers are powered by chips composed of over a billion transistors. The amount of storage available to each computer on the net today is enough to hold tens of thousands of books, hundreds of hours of video, and thousands of hours of recorded music. Though it is obvious that they are not nearly as mentally endowed as the average laptop, individual ants seem to swarm just fine. The crucial swarming criterion cannot be quantity or power so much as some measure of quality.
So, what is the missing secret sauce? What crucial "it" results in swarm behavior? Let's make a list. All nodes must speak a common language, that common language must be semantically reflexive (each node must be able to self-interpret the meaning of the work of other nodes), the nodes must be hooked to, listen, and talk through a shared network, each node must understand and be able to readily process such complex concepts as ownership, membership, resources, value, goals, and direction, they must know how to enlist the help of other nodes and node groups, and they must be able to know when one project is done and how to determine what might be formed into a new project, they must be able to know their own strengths or unique value and how to build on these strengths and form coalitions where other nodes are better equipped. Each of these capabilities is complex in its own right, but none of them alone will facilitate swarm computing. Swarm computing will happen only as a holistic solution. Any reasonable holistic solution will most likely be built from a set of shareable low-level logical machines… machines not unlike the boolean gates that make up binary computation, but at a higher level of abstraction.
Though it will not be easy to specify, design, or build, we desperately need an architectural-level solution to the swarm computing problem. We need a solution that allows each node to act as a cell, both autonomous and fully capable of coming together to build out collaborative systems equivalent to the organs, bodies, species, and culture that are the emergent aggregates of biological cells. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node all of the logic necessary to autonomously swarm into larger and larger computing entities. The ideal smart-dust swarm computing architecture will consist of only two things, the nodes themselves and the medium or network that connects them. Specialization might play a role, each nodes might need to be able to physically adapt to specific environmental situations. Ideally, a node would contain everything it needed to conform and self optimize to changing demands and opportunities. But most of the plasticity in the system will come from the connections between the nodes, how they self aggregate into opportunistic groups to solve problems, refine and store information, and communicate, edit, and propagate pattern, how they learn and how other groups learn to use this accumulated knowledge as decision and control structure.
I am compelled to take a short detour and talk to the "multi-core processor" issue. Every major chip manufacturer is building them. A new arms race has shifted bragging rights from clock speed to core count. Commercial chips with 16 cores are about to hit the market, 32 core chips will follow. Intel has bragged of a working 80-core prototype. The dirty truth is that chip makers hit a performance wall about 6 years ago, forcing a retreat to multi-core layouts. As they attempted to pack more and more transistors onto a CPU, they came up against the hard reality of physics. Smaller and smaller wires meant greater and greater heat and worse and worse efficiency (as a ratio of their mass, chips were producing more heat than the interior of the sun). Then electrons started to jump gates (tunneling beneath space-time!) making transistors unreliable. The shear size of the chips was rising to the point where the distance electrons had to travel (at 96 thousand miles per second) was slowing things down. The surface area to edge ratio had shifted so to the point that there wasn't enough length at the chip's perimeter through which to supply data to keep it running at full capacity. Real world uses tended not to match the type for which these big chips were optimized. But mostly, the industry was rapidly reaching the physical limits of the project-and-etch manufacturing process that had so reliably yielded biannual doubling of processor density.
The industry responded by going condo… by dividing the chips into smaller and smaller sub-units. Multi-core chips come with their own unique set of problems. The main problem is that nobody has figured out how to write a compiler that converts source code into a format that takes full advantage of multiple cores. Compilers that customized for two core chips worse than useless for four or eight core chips. The other problem is that problems that are cut up to be processed on multiple cores produce results out of synch with the cores next to them. So data that one part of a process needs may have to wait around for another process to finnish. The results produced by each of these core assigned threads must be put into some sort of shared memory and must be labeled and tracked so that the processes being run on other cores can avoid using unprocessed data or data that is now owned by another process thread. The logical topology of multi-core processing models has proved difficult to generalize. Automating the processing sequence and the locking of transitory in-process memory regardless of the number of cores available to the execution of an application thread is beyond the practical scope of most programmers. The end result is that most code runs slower on multi-core chips.
Until the industry responds with a way to reliably and automatically break executable code into self optimized threads managed for multi-core chips, you'd better think twice when told that a thirty two core chip is better than a two core chip.
But there may be a silver lining to all of this multi-core craziness. When you think about it, multiple core chips are the same as having several chips… and isn't that what swarm computing is all about? Solving the problems and meeting the challenges of designing a true multi-core processing architecture is exactly equivalent to solving the problems and meeting the challenges of designing an processing architecture for swarm computing. The only real difference between the multi-core and multi-node computing challenge is security. Cores, existing as they do on one chip are mostly owned by one entity. Ownership is implicit. Nodes, on the other hand, can be far flung and owned by any number of separate entities. Multi-node processing is processing that must adhere to strict ownership and data security handshakes. Where processing involves multiple entities, data must be served up from shared memory in accordance with strict asynchronous locking protocol.
We have taken old-style solitary and discrete processing and programming model (computing in a box), about as far as human creativity and cognitive capacity will permit. As a result, and because each of our other research pursuits is so intimately dependent on the steady advance of computing's power and scope, progress in almost every human endeavor has been stalled by the complexity wall that stands between computing's solitary and linear past and its collaborative and n-dimensional future. Designing our way forward, towards an architectural solution to the swarm computing problem must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, progress in all human endeavors will remain stalled.
As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, the other side of computing's complexity wall heralds fold increases in global productivity, dwarfing all previous epochs.
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specialized, and limited in their abilities. Uncomplainingly, and in many cases without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Embedded chips now control your car's fuel/air mixture and time the ignition sparks in each cylinder. These chips are one of the reasons modern cars are so much more powerful and yet manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard. Others track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water, or from digging into mud or snow at the side of the road. And still others compile information from your speedometer, engine, and accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags that keep you alive by sensing an impact and reacting to it in the few hundredths of a second between "Crash!" and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity of the air flowing within your car's interior. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system and in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or when you are due for a tuneup, and chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wristwatch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that tracks your distance and speed and the amount of pressure you apply to your pedals, tracing your motion through geography and altitude, even following your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, and in your CD and DVD players. On your bedside table, your alarm clock and home phone have them; so does each of the little black boxes that charge your wireless devices.
Most washing machines have them. More and more refrigerators come with them, as do some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys: robots, learning systems, trivia games. Then there are home weather stations, TV set-top boxes for cable and satellite, handheld language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, where they dutifully measure your soil's acidity, temperature, nitrogen content, and moisture level, and the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart and plug its USB connector into your computer, which goes to a site that takes your zip code and spits back a list of plants suited for that part of your garden's conditions.
Some toilets are controlled by embedded chips.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors. Each new plant incorporates processes that can etch smaller and smaller transistors, which means more and more transistors on the same sized chip. Because the etching is a projection process, hundreds of chips can be etched at once, and once you have built the fabrication plant, it costs no more per chip to print out chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years prior, can be cheaply retooled to spit out lesser chips at unit costs that drop to nearly pennies per chip.
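The economics here are mostly amortization; a toy calculation (all figures are round, invented numbers, not industry data) shows why a paid-off plant can sell chips for almost nothing:

```python
# Unit cost of a chip: the plant is a fixed cost spread over every chip
# shipped, plus a small marginal cost per chip (materials, testing).
def unit_cost(plant_cost, chips_shipped, marginal_cost):
    return plant_cost / chips_shipped + marginal_cost

# A new $3B fab amortized over its first 100 million chips...
new_fab = unit_cost(3e9, 100e6, 5.0)
# ...versus an old plant already paid for: marginal cost is all that's left.
old_fab = unit_cost(0, 100e6, 0.25)

print(new_fab, old_fab)  # 35.0 0.25
```

The point is structural, not the particular numbers: once the fixed cost is paid down, the unit price collapses toward the marginal cost, which is why yesterday's processors end up embedded in blenders.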
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, who can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes second world and the second world becomes first world. Technology is no longer the sole domain of the top one tenth of the global population. A double explosion results: 1. processing power increases ridiculously, as dictated by Moore's Law, and 2. a growing percentage of global citizens gain sufficient economic power to affect the demand curve for products and services that are enabled by embedded and shared processors.
It is hard to overestimate the ramifications brought about by this diaspora of processor infused devices. Born of computers but now running amok into the firmament of the day to day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self contained processing device, will come to seem hopelessly quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart-dust network of things, a computer is what happens for a few seconds and what goes away as other ethereal combinations of devices snap into and out of existence, layered such that any node that is part of one such computer is simultaneously part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of the people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex information and control exchange.
By even conservative estimates, the network of things will grow at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, and if that count doubles every two years or so while the world keeps getting richer and chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run?
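The compounding behind that claim is easy to check. A quick sketch (the 40-chips-per-owner starting point and the doubling periods are, of course, assumptions):

```python
def projected_chips(start_count, doubling_period_years, horizon_years):
    """Chips in service after `horizon_years`, assuming clean exponential doubling."""
    return start_count * 2 ** (horizon_years / doubling_period_years)

start = 1e9 * 40  # one billion owners with 40 chips each: 40 billion chips today

# A two-year doubling lands squarely in the tens-of-trillions range...
print(f"{projected_chips(start, 2, 20):.3g}")  # 4.1e+13
# ...while a five-year doubling falls two orders of magnitude short of it.
print(f"{projected_chips(start, 5, 20):.3g}")  # 6.4e+11
```

Only the doubling period really matters here; the starting count could be off by a factor of two in either direction without changing the order-of-magnitude conclusion.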
In our everyday lives, we are used to complex aggregates like culture, society, and government evolving from similar loose and unstructured associations and connections. Culture as network. But that, we say, happens because humans are the nodes and humans are themselves complex. Indeed, as network nodes, we humans are very complex. We participate in the societal network that is culture as self contained arbiters and processors of wildly complex patterns: of situation and context, need and availability, propriety and collaboration, initiative and momentum, concept and diffusion, pattern and chaos. What minimal subset of these skills can we distill from the human node, then generalize and subsume into the silicon and software of our tiny smart-dust kin to be? This is the big project that looms in front of computer science. THE PROJECT. But before you throw up your arms in despair, remember that our own minds are consistent with the smart-dust model. Each of the 100 billion or so neurons in our head is relatively simple (at least in its role as an information and logic processor). "Mind", the remarkable confluence that makes us us, isn't the neurons, the things, the nodes, so much as it is a super-product of the n-dimensional connection map: the network that flits in and out of existence, the fluctuating and overlapping web connecting neurons to neurons.
As the design of sensor/controller/communicator modules grows in sophistication, as we figure out what absolutely has to exist in a node and what is best left to the emergent properties of the network itself, as the same efficiency of scale factors that gave us Moore's law come to play in their design and production, as these All-Net motes shrink, and become more and more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing and communications capabilities and where these capabilities are diffuse, cheap, expendable, and by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning, these little radios can compute and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere contains tens or hundreds of intelligent processing motes built into it?
Clearly, today's operating systems would collapse under such a load, stripped of the centrality and determinism afforded to today's computing environments. Computing as it stands today is a reflection of simpler times, when the computer stood all by itself. But what, you ask, of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyper-link and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices: a shifting sphere of influence, a probability cloud, a sphere shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping spheres. If this is a radically new idea, it is an idea already being overtaken by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing", but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity that will be true network computing.
So what will it take? What new logical structures will we have to build to get from where we are today, from the isolated computing systems we use today, to the self optimized, agent generating, pattern building, semantically aware, swarm computing of tomorrow? What kinds of architectural logic will be required of a computing system that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared and momentary and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be pre-determined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure is hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single computer past and the swarm computer future. We know how to architect single computer solutions, but we are struggling to bring multi-computer, self optimizing swarm computing solutions from idea or dream to full working detail.
Of course having a hundred trillion nodes doesn't necessarily mean you have anything approaching swarm computing… it might mean nothing more than the fact that there are a hundred trillion nodes. Nodes that aren't designed to work together, to self aggregate… will never result in anything more capable than any one of them is as an individual unit. A flock maybe, but never a swarm. A flock is composed of a lot of self-similar individuals acting in their own self interest; any patterns that appear to emerge are crystalline, pretty, but never complex, emergent, or evolving. A swarm, on the other hand, is defined by structures and behaviors of the aggregate whole that are not themselves possible in the individual parts of which it is composed. A swarm doesn't just "look" like a system… it is one. Where a flock can emerge from the behavior of individuals even when they have no means of sensing the goals of others around them, a swarm can only result when each part has the capacity to be aware of the parts around it and can grasp and process the construct of a shared goal. For a bunch of nodes ever to swarm, each node must possess the capacity to see itself in the context of an environment that is at least partially composed of other nodes.
Current estimates put the number of computers hooked to the global web we call the internet at between one and two billion. But there is nothing swarm-like about the internet. Obviously, today's computers do not have the requisite attributes and abilities to swarm. Exchanging email documents and web pages isn't exactly the stuff of deeply collaborative computing. For one thing, your computer has absolutely no means of understanding the contents of your emails or the web pages it dutifully displays. Adding more of these semantically challenged computers to the net won't get us any closer to swarm computing.
Likewise, most of the yardsticks we use to measure computer performance and progress, like clock speed, data bandwidth, bus speed, chip memory, and disk capacity, are useless as indicators of the capacity of a node to participate in a computational swarm. Today's computers are powered by chips composed of over a billion transistors. The amount of storage available to each computer on the net today is enough to hold tens of thousands of books, hundreds of hours of video, and thousands of hours of recorded music. Yet though individual ants are obviously nowhere near as mentally endowed as the average laptop, they seem to swarm just fine. The crucial swarming criterion cannot be quantity or power so much as some measure of quality.
So, what is the missing secret sauce? What crucial "it" results in swarm behavior? Let's make a list. All nodes must speak a common language, and that common language must be semantically reflexive (each node must be able to self-interpret the meaning of the work of other nodes). The nodes must be hooked to, listen through, and talk through a shared network. Each node must understand and be able to readily process such complex concepts as ownership, membership, resources, value, goals, and direction. They must know how to enlist the help of other nodes and node groups. They must be able to know when one project is done and how to determine what might be formed into a new project. And they must know their own strengths or unique value, how to build on those strengths, and how to form coalitions where other nodes are better equipped. Each of these capabilities is complex in its own right, but none of them alone will facilitate swarm computing. Swarm computing will happen only as a holistic solution. Any reasonable holistic solution will most likely be built from a set of shareable low-level logical machines… machines not unlike the Boolean gates that make up binary computation, but at a higher level of abstraction.
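One way to hold that list in your head is as a minimal node interface. The sketch below is purely illustrative (every class and method name is invented for this sketch, not an existing API), but it shows the shape of the capability set each node would have to carry:

```python
from abc import ABC, abstractmethod

class SwarmNode(ABC):
    """Illustrative minimum capability set for a swarm-capable node."""

    @abstractmethod
    def interpret(self, message):
        """Semantic reflexivity: make sense of another node's work."""

    @abstractmethod
    def evaluate_goal(self, goal):
        """Process shared concepts: ownership, value, goals, direction."""

    @abstractmethod
    def enlist(self, peers, goal):
        """Recruit other nodes or node groups toward a shared goal."""

    @abstractmethod
    def strengths(self):
        """Report this node's unique value to prospective coalitions."""

class EchoNode(SwarmNode):
    # A trivially concrete node, just to show the interface is usable.
    def interpret(self, message):
        return message.lower()
    def evaluate_goal(self, goal):
        return bool(goal)
    def enlist(self, peers, goal):
        return [p for p in peers if p.evaluate_goal(goal)]
    def strengths(self):
        return {"echo"}

node = EchoNode()
print(node.interpret("HELLO"))  # hello
```

The design point is that none of these methods is optional: drop any one of them and the holistic behavior the list describes cannot emerge.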
Though it will not be easy to specify, design, or build, we desperately need an architectural-level solution to the swarm computing problem. We need a solution that allows each node to act as a cell, both autonomous and fully capable of coming together to build out collaborative systems equivalent to the organs, bodies, species, and cultures that are the emergent aggregates of biological cells. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities. The ideal smart-dust swarm computing architecture will consist of only two things: the nodes themselves and the medium or network that connects them. Specialization might play a role; each node might need to be able to physically adapt to specific environmental situations. Ideally, a node would contain everything it needed to conform and self optimize to changing demands and opportunities. But most of the plasticity in the system will come from the connections between the nodes: how they self aggregate into opportunistic groups to solve problems, refine and store information, and communicate, edit, and propagate pattern; how they learn; and how other groups learn to use this accumulated knowledge as decision and control structure.
I am compelled to take a short detour and speak to the "multi-core processor" issue. Every major chip manufacturer is building them. A new arms race has shifted bragging rights from clock speed to core count. Commercial chips with 16 cores are about to hit the market, and 32 core chips will follow. Intel has bragged of a working 80-core prototype. The dirty truth is that chip makers hit a performance wall about 6 years ago, forcing a retreat to multi-core layouts. As they attempted to pack more and more transistors onto a CPU, they came up against the hard reality of physics. Smaller and smaller wires meant greater and greater heat and worse and worse efficiency (per unit of mass, chips were producing more heat than the interior of the sun). Then electrons started to jump gates (quantum tunneling!), making transistors unreliable. The sheer size of the chips was rising to the point where the distance electrons had to travel (at roughly 96 thousand miles per second) was slowing things down. The surface area to edge ratio had shifted to the point that there wasn't enough length at the chip's perimeter through which to supply data to keep it running at full capacity. Real world workloads tended not to match those for which these big chips were optimized. But mostly, the industry was rapidly reaching the physical limits of the project-and-etch manufacturing process that had so reliably yielded a biennial doubling of processor density.
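The distance problem is simple arithmetic. The die size and clock rate below are round, illustrative numbers, not a specific product:

```python
# On-chip signals propagate well below light speed in vacuum;
# 96,000 miles per second (~155,000 km/s, roughly half of c) is a fair round figure.
signal_speed_m_per_s = 155_000e3
die_width_m = 0.02    # a 2 cm die, illustrative
clock_hz = 3e9        # a 3 GHz clock, illustrative

crossing_time_s = die_width_m / signal_speed_m_per_s  # ~129 picoseconds
clock_period_s = 1 / clock_hz                         # ~333 picoseconds

# One corner-to-corner trip eats a large fraction of a clock cycle:
print(f"{crossing_time_s / clock_period_s:.2f}")  # 0.39
```

At these numbers a signal cannot even make a round trip across the die within a single cycle, which is why raw size itself became a speed limit.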
The industry responded by going condo… by dividing the chips into smaller and smaller sub-units. Multi-core chips come with their own unique set of problems. The main problem is that nobody has figured out how to write a compiler that converts source code into a format that takes full advantage of multiple cores. Compilers customized for two-core chips are worse than useless for four- or eight-core chips. The other problem is that computations cut up to be processed on multiple cores produce results out of sync with the cores next to them, so data that one part of a process needs may have to wait around for another process to finish. The results produced by each of these core-assigned threads must be put into some sort of shared memory and must be labeled and tracked so that the processes being run on other cores can avoid using unprocessed data or data that is now owned by another process thread. The logical topology of multi-core processing models has proved difficult to generalize. Automating the processing sequence and the locking of transitory in-process memory, regardless of the number of cores available to the execution of an application thread, is beyond the practical scope of most programmers. The end result is that most code runs no faster on multi-core chips, and some actually runs slower.
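The waiting problem can be shown with two ordinary threads (Python threads standing in here for cores; the labels and timings are invented for the sketch):

```python
import threading
import queue
import time

results = queue.Queue()  # the "shared, labeled memory" between core-assigned threads

def producer():
    time.sleep(0.05)              # pretend this core's slice of the work takes longer
    results.put(("chunk-1", 42))  # labeled, so consumers know what they are getting

def consumer(out):
    label, value = results.get()  # stalls here until the producer finishes
    out.append(value * 2)

out = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(out,))
t2.start()
t1.start()
t1.join()
t2.join()
print(out)  # [84] -- correct, but the consumer spent most of its life waiting
```

The answer comes out right, but the second "core" was idle almost the entire time; multiply that stall by thousands of interdependent chunks and the promised parallel speedup evaporates.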
Until the industry responds with a way to reliably and automatically break executable code into self optimizing threads tailored to multi-core chips, you'd better think twice when told that a thirty-two-core chip is better than a two-core chip.
But there may be a silver lining to all of this multi-core craziness. When you think about it, multiple cores on one chip are much the same as having several chips… and isn't that what swarm computing is all about? Solving the problems and meeting the challenges of designing a true multi-core processing architecture is exactly equivalent to solving the problems and meeting the challenges of designing a processing architecture for swarm computing. The only real difference between the multi-core and multi-node computing challenge is security. Cores, existing as they do on one chip, are mostly owned by one entity. Ownership is implicit. Nodes, on the other hand, can be far flung and owned by any number of separate entities. Multi-node processing is processing that must adhere to strict ownership and data security handshakes. Where processing involves multiple entities, data must be served up from shared memory in accordance with a strict asynchronous locking protocol.
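What such an ownership handshake might look like can be sketched in a few lines. The acquire/write/release protocol below is invented for illustration, not any existing standard:

```python
import threading

class OwnedSlot:
    """Sketch of a shared-memory slot gated by explicit ownership.

    A reader belonging to one entity may not touch data that is
    mid-update by another entity; writers must hold ownership first.
    """

    def __init__(self):
        self._lock = threading.Lock()
        self._owner = None
        self._value = None

    def acquire(self, entity):
        with self._lock:
            if self._owner is not None:
                return False       # some other entity owns the slot
            self._owner = entity
            return True

    def write(self, entity, value):
        with self._lock:
            if self._owner != entity:
                raise PermissionError("write without ownership")
            self._value = value

    def release(self, entity):
        with self._lock:
            if self._owner == entity:
                self._owner = None

    def read(self, entity):
        with self._lock:
            if self._owner is not None and self._owner != entity:
                raise PermissionError("slot is mid-update by another entity")
            return self._value

slot = OwnedSlot()
assert slot.acquire("node-A")
slot.write("node-A", 7)
assert not slot.acquire("node-B")  # node-B must wait its turn
slot.release("node-A")
print(slot.read("node-B"))         # 7
```

On a single chip this bookkeeping is implicit in the hardware; spread the "cores" across entities and networks, and every one of these checks becomes an explicit, enforced protocol.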
We have taken the old-style solitary and discrete processing and programming model (computing in a box) about as far as human creativity and cognitive capacity will permit. As a result, and because each of our other research pursuits is so intimately dependent on the steady advance of computing's power and scope, progress in almost every human endeavor has been stalled by the complexity wall that stands between computing's solitary and linear past and its collaborative and n-dimensional future. Designing our way forward, towards an architectural solution to the swarm computing problem, must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, progress in all human endeavors will remain stalled.
As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, the other side of computing's complexity wall heralds many-fold increases in global productivity, dwarfing all previous epochs.
When advances get in the way of innovation…
From the perspective of true innovation – not run-of-the-mill better-faster-smaller-cheaper innovation, but deep, fundamental, changes-everything innovation – the computer industry has stagnated for more than fifteen years. This matters. This matters to everyone and everything. This might matter more than any other thing that matters at this point in the history of human culture. Like it or not, computers are THE primary tool driving progress in ALL fields of human endeavor and ALL sectors of the market. The result? Hell, we LIVE the results of this stagnation all day, every day. The future might be forced to live it even more viscerally. Larger economic metrics like national productivity expose this standstill for what it is and what it affects.
This innovation standstill, and by extension every wild financial boom-and-crash cycle the world has been through since, is the direct result of the fact that productivity (which of course is directly linked to innovation cycles) has not risen since 1992. As we all know, productivity rises in direct relation to the pace of innovation in tools and infrastructure. Say what you will about the sheer scale and rapid global adoption of the internet, of virtual shopping, banking, and stock trading, of email and cell phones... neither new channels for communication nor new venues for consumption (despite the obvious revenue they have created) have significantly increased our global ability to do more work in less time... to increase productivity. Please remember that businesses are in the business of grabbing as much as they can of the current GDP, the current economic pie. That is market competition... temporal and immediate. Productivity, on the other hand, is what I am talking about, and it is totally different: it is about growing the total economic pie. Productivity is an economics issue. Though it is often set into motion by the actions of businesses, productivity is a long-term function... it increases as new tools, techniques, social behaviors, or infrastructure allow the same production as before for fewer hours of labor, or less input of energy. It is the plow and irrigation channels that allow one family to feed nineteen families, so that all twenty families together can build a more complex and still more productive infrastructure.
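The plow-and-irrigation example can be put in toy numbers (every figure below is invented purely for illustration): measure productivity as family-years of food produced per hour of farm labor, before and after the new tools.

```python
def productivity_ratio(families=20, hours_per_family=2000):
    """Toy plow-and-irrigation arithmetic; all numbers are illustrative."""
    food_needed = families                       # one family-year of food each
    # Before the plow: every family farms full-time just to feed itself.
    before = food_needed / (families * hours_per_family)
    # After: one family's labor feeds all twenty.
    after = food_needed / (1 * hours_per_family)
    return after / before

print(round(productivity_ratio(), 6))  # 20.0, a twentyfold rise in output per hour
```

The other nineteen families' hours are what get reinvested in the "more complex and still more productive infrastructure."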
In the absence of the kinds of substantial technological innovations that are needed for paradigm-scale increases in productivity, markets react in maladapted but perfectly predictable ways. Left starving for real growth in wealth and wealth generation, markets shift their emphasis towards the shuffling around of assets, creative accounting, slice-and-sell, merge-and-downsize, layered asset repackaging... in short, they do what they have to do to keep equity partners and stock owners happy by any and every means possible. In the absence of innovation-driven rises in productivity, "any and every means possible" almost always leads to destructive long-term results... to the opposite of growth.
At the same time, and somewhat ironically, second and third world economies have seen spectacular growth. Though daunting in other ways, "developing" regions owe their achievements in growth to the acquisition and application of innovation, pre-built. They simply adopt decades- and even centuries-old western knowledge and innovation... building out modern infrastructures at pennies on the dollar. All this while first world economies stagnate. How could this be? Surely, the first world benefits from global modernization, from the rest of the world catching up with western levels of wealth and productivity? Yes. Of course it does. The non-western world just happens to be composed of four fifths of the planet's population. From a purely business, abjectly greedy, point of view, you can make a lot more money selling four times as many washing machines as you can by treating these people as a source of cheap labor and the regions they represent as cheap sources of raw materials.
Because of the sheer size of the undeveloped world, small rises in wealth have huge investment implications. As a result, western stock, commodities, and money markets, always the most attractive, have surged as much of this new money has entered the investment fray. But this surge of third and second world investment in western securities (sounds funny, doesn't it) has not been matched by the kinds of true infrastructure advantages that would justify the resulting staggering growth in valuation. The west has found itself in the enviable position of having access to more money than its own innovation mechanisms can support.
By the way, consumption does not equal productivity. Never did and never will. Consumption can act as a rough indicator of wealth, the inevitable outcome of wealth, but when driven by credit and debt, consumption can increase independent of, and often at odds with, wealth. When people have more money, they can and do spend more of it on the things they need, and they spend in new markets (iPods and restaurant-quality kitchen remodels, bigger houses, cars with all-leather interiors, navigation systems, and flat-panel media centers), and they spend what's left over on savings and market speculation. But consumption is sometimes the result of credit. With personal credit, people spend money that isn't yet theirs. People buy money in order to buy things and experiences. This activity is euphemistically labeled "consumer confidence" because it suggests that people think that though they don't have adequate money to make purchases today, they will somehow acquire more money in some reasonable tomorrow. But, in the presence of credit, people don't behave reasonably. In the presence of personal credit, consumption outpaces productivity-generated real wealth. The more people owe, the less money they have to both pay back their credit debt and make more purchases. The credit industry responds by extending more debt. The real value of real money becomes, as a result, very hard to track. What does consumption-driven revenue actually mean when that consumption is being financed by debt? Disturbingly, in wholesale markets, debt is conceived of, marketed, and packaged as "product".
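The credit dynamic described here (flat real income, consumption pushed up by borrowing, interest compounding on the shortfall) can be sketched as a toy loop. Every number, income, interest rate, and growth target alike, is invented for illustration; this is not a model of any real credit market.

```python
def credit_spiral(income=100.0, rate=0.15, growth=1.05, years=10):
    """Debt-financed consumption: returns (final_spend, final_debt)."""
    debt = 0.0
    spend = income
    for year in range(1, years + 1):
        interest = debt * rate               # carrying cost of existing debt
        spend = income * growth ** year      # consumption target set on credit
        debt += spend + interest - income    # the shortfall becomes new debt
    return spend, debt

spend, debt = credit_spiral()
print(round(spend, 2), round(debt, 2))
```

After a decade in this toy run, consumption is up roughly sixty percent while the debt needed to sustain it has grown to about five years' income, which is the sense in which credit-financed consumption outpaces productivity-generated wealth.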
Consumer credit is an extreme bastardization of the concept of "capital". Capital as traditionally defined, that is, loans made to producers, can legitimately be shown to increase an economy's ability to be productive. Personal credit is rarely used to increase capital, rarely thought of as a tool to increase a person's ability to make wealth. It is somewhat ironic that we may cringe at the idea of maxing out a credit card to finance a business or to initiate a creative project, but we have no such aversion to using the same credit to go on a weekend trip or to acquire new shoes. We recognize that businesses, in pursuit of the creation of wealth, need access to loans, to lines of credit, to venture capital, to the sale of equity as stock. But we are generally cognizant of the fact that the same access to capital, when offered to individuals as credit, does not often drive anything more profound than debt and the kinds of consumption that are non-productive, that don't add to an economy's ability to produce new wealth, to do more with less labor.
Consumption isn't restricted to the individual shopper at the local mall. Consumption happens at huge corporate and international scales as well. Consumption, defined as the act of purchasing things and services that do not lead to greater productivity, is becoming an alarmingly common business practice involving tens of trillions of dollars exchanged daily in global markets. To the extent the money being spent is not owned by the consuming party, to the extent the things being purchased do not lead to increases in productivity, and to the extent that the participating parties are not motivated by an understanding of productivity or a desire to act on its behalf, exchange throughput must be viewed for what it really is: an unreliable indicator of true growth. To the extent a market is propped up by the infusion of cash, to the extent that a market grows for reasons that are not tied to its ability to increase productivity, eventually the whole system has to crash. As it has many, many times, and as it has crashed recently.
The same thing happened and caused the dot-com boom. New money, money that had not previously had access to the markets, found its way to Wall Street through internet trading. Ma and Pa Simpleton could hook up a modem and siphon their savings and retirement accounts right into AAPL and IBM stocks. This new money made old stocks look more valuable. "Rising demand" and all. Only nobody got that there is a big difference between the new "more demand" and our old notions of "higher demand".
Likewise, as the unprecedented surge of non-western investment in western stock markets coincided with an ongoing flattening of our own innovation-starved growth trajectories, good money was, and continues to be, funneled into very bad monetary mechanisms and "creative" securities products, when it should have gone towards innovation-induced capital and productivity. We really don't have an economic theory to fit this crazy bottom-up investment model. We really don't know how to track and predict the interactions between investment markets and the real product markets they (sometimes) represent.
Meanwhile, second and third world markets have indeed grown as a direct result of the build-out of more efficient infrastructures. The last 15 years have seen the second and third world adopting modern industrial farming, modern highway systems, modern water and energy production and distribution systems, modern banking and credit, pluralist governance and education, modern health care, and the industrial machinery that can only be produced and maintained by a well-educated work force. Along the way, attitudes and cultures have adjusted to the individual freedoms that come hand in hand with wealth, capital, and stability. But all of this growth has been a result of the rest of the world adopting ideas, knowledge, tools, and infrastructure that have long existed in the west. Productivity has risen, as expected, at the rate of adoption.
In traditional western economies, the economies that originated the knowledge, tools, and infrastructures now being adopted by the rest of the world, increases in productivity must come from innovation. We don't have the luxury of adoption... there are no societies more advanced from which to borrow innovation.
We must innovate! In a modern economy, innovation is mandatory. More to the point, the kinds of innovations we must produce, that the west must build and implement, must be the particular kinds of innovation that result in true increases in global productivity. Forget about little innovations, or surface innovation, or innovations that extrapolate on older innovations. Don't bother with innovations that exploit markets created by earlier innovations... for apex economies, innovation is only innovation if it catalyzes whole new markets, if it fundamentally reshapes the future of innovation.
At every large organization, there are people whose primary responsibility is the happiness of investors. Investors expect the value of their equity share to increase steadily. Included in this group are CEOs, CFOs, boards of directors, and almost everyone working at the top levels of management. Their careers are directly linked to the value performance of the organization they represent. When productivity doesn't rise apace, institutional professionals are compelled to find any and every artificial means of inflating the value of their product. A bubble is born. Unsupported by real value, these epochs of artificially inflated values must inevitably come crashing down.
Regulation does help. By restricting market managers from engaging in the most egregious and obvious inflationary tactics, some market shenanigans can be avoided. But ultimately, come unhappy times, market managers will find a way to do their job... to keep shareholders (temporarily) happy. And then, of course, there is always fiscal policy. Government- and banking-driven manipulation of fiscal and monetary policy (controlling the base price and availability of capital as loans) has important, but only limited and short-term, effects on markets. Fiscal policy is a surface fix. The Fed board (and its equivalents in other nations/economies) is a fine-tuning instrument... it has absolutely disastrous implications when used beyond this narrow band of effect (currency devaluation, runaway inflation, decreased foreign investment, etc.).
Again, we come back to the basics... to the most fundamental metric in any economy... to productivity. If you can find a way to feed your entire society with just two percent of your population working the soil (rather than ninety), you can get more done with your total labor pool. Productivity rises. Industrialized farming, automated factories, the delivery of clean water to homes and workplaces, an accepted currency and banking system that makes capital available to the masses, an equitable and respected system of governance and justice, the removal of garbage and human waste, an efficient transportation system for people, goods, and industrial materials, a reliable communications network, public education, career training, and health care, and an efficient source of energy that can be routed where it is needed... these are the foundational infrastructures that drive an economy. Each presaged an increase in productivity and is linked to true growth.
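The farm-labor arithmetic above is worth making explicit: if farming shrinks from ninety percent of the workforce to two percent, the non-farm labor pool grows nearly tenfold. A one-line sketch, using the percentages from the paragraph:

```python
def nonfarm_multiplier(farm_share_before=0.90, farm_share_after=0.02):
    """How much the non-farm labor pool grows as farming needs fewer hands."""
    return (1 - farm_share_after) / (1 - farm_share_before)

print(round(nonfarm_multiplier(), 1))  # 9.8, nearly ten times the labor for everything else
```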
It has been a long time since an infrastructure-scale innovation has rocked the national and global productivity metrics. The cultivation of plants and domestication of animals. Metallurgy. Fire. Shelter and clothing. Division of labor. Governance. Spoken and written language. The printing press. Transportation infrastructure. Production and distribution of energy. Communication systems. These epochs are at the scale I am addressing. Electricity. Oil and gas production. The telegraph/radio/TV. The internal combustion engine. Numbers, measurement, and mathematics. Public education. Currency. The periodic table. Germ theory. Genetics. Evolution theory. Information theory. Machines that compute. These are the innovations that produce epochs of productivity.
Marked in staccato evolutionary steps, computing has proceeded apace through its rather short history, aping the dumb media we used to conduct culture... before computers. Ledger sheets, paper and pencil, typewriters, notebooks, folders and filing cabinets, printing presses, desks, drawers, chalkboards, telephones, mail, even cameras, sound and video recorders, televisions and radios. Very little of this analog-to-digital conversion process has added to the science of computing. Very little of the computerization of media has advanced the infrastructure of logic, of systems, of knowledge automation. Mostly, our efforts and the markets that have resulted (rich as they have been) have sidetracked true evolution in computing by aligning attention to artifacts instead of meaning. The existence of computerized spreadsheets might make working with ledger data easier, but it doesn't help us understand economics any better, and it doesn't advance the science of computation or logic. Progress in computing over the last 30 years has done little more than add efficiency to old methods and processes; very little of the effort expended has resulted in better computing.
This is a dead-end process. Using all of this logical power to make a better piece of paper... what a joke. We should instead aim to evolve technology that can generalize the larger problems that end up expressing themselves as ledger sheets and notebooks... technology that gets to the base of the issues that manifest the need for spreadsheets and word processors... technology that understands goals, that tracks resources, that builds collaborative solutions, that seeks patterns that build patterns... technology that can process the hierarchies of influence that effect the transition from any now to any inevitable future.
Can we look to today's computers, as amazing as they are, and truly say of them that mimicking paper added as much value to computing as it did to writing and typewriters? Efficiencies have been gained, sure, but at what cost? A computer can and should be much, much more than a writing device; it should be an evolution machine. Is it? They have the power. What's missing is vision… not their vision… ours. Humans need to expect more of this most plastic of all machines.
Randall Reetz
This innovation standstill, and by extension, every wild financial boom-and-crash cycle the world has been through since, is the direct result of the fact that productivity (which of course is directly linked to innovation cycles) has not risen since 1992. As we all know, productivity rises in direct relation to the pace of innovation in tools and infrastructure. You can say what you will about the shear scale and rapid global adoption of the internet, of virtual shopping, banking, and stock trading, of email, and cell phones... neither new channels for communication, nor new venues for consumption, (despite the obvious revenue they have created), have significantly increased our global ability to do more work in less time... to increase productivity. Please remember that businesses are in the business of grabbing as much of the current GDP, the current economic pie. That is market competition... temporal and immediate. Productivity on the other hand, what I am talking about, totally different, is about growing the total economic pie. Productivity is an economics issue. Though it is often effected by or set into motion by the actions of businesses, productivity is a long term function... it increases as new tools, techniques, social behavior, or infrastructure allows the same production as before for less hours of labor, or less input of energy. It is the plow and irrigation channels that allow one family to feed nineteen families so that all twenty families together can build a more complex and still more productive infrastructure.
In the absence of the kinds of substantial technological innovations that are needed for paradigm increases in productivity, markets react in maladapted, but perfectly predictable and expectable ways. Left starving for real growth in wealth and wealth generation, markets adjust emphasis and focus towards the shifting around of assets, creative accounting, slice and sell, merge and downsize, layered asset repackaging, in short, they do what they have to do to keep equity partners and stock owners happy by any and every means possible. In the absence of innovation-driven rises in productivity, "any and every means possible", almost always leads to destructive long term results... to the opposite of growth.
At the same time, and somewhat ironically, second and third world economies have seen spectacular growth. Though daunting in other ways, "developing" regions owe there achievements in growth to the acquisition and application of innovation, pre-built. They simply adopt decades and even centuries old western knowledge and innovation... building out modern infrastructures at pennies on the dollar. All this while first world economies stagnate. How could this be? Surely, the first world benefits from global modernization, from the rest of the world catching up with western levels of wealth and productivity? Yes. Of course it does. The non-western world just happens to be composed of four fifths of the planet's population. From a purely business, abjectly greedy, point of view, you can make a lot more money selling four times as many washing machines as you can by treating these people as a source of cheep labor and the regions they represent as cheep sources of raw materials.
Because of the shear size of the undeveloped world, small rises in wealth have huge investment implications. As a result, western stock, commodities, and money markets, always the most attractive, have surged as much of this new money has entered the investment fray. But this surge of third and second world investment in western securities (sounds funny doesn't it) has not been matched by the kinds of true infrastructure advantages that would justify the resulting staggering growth in valuation. The west has found itself in the enviable position of having access to more money than it's own innovation mechanisms can support.
By the way, consumption does not equal productivity. Never did and never will. Consumption can act as a rough indicator of wealth, the inevitable outcome of wealth, but when driven by credit and debt, consumption can increase independent of, and often at odds with wealth. When people have more money, they can and do spend more of it on the things they need, and they spend in new markets (ipods and restaurant quality kitchen remodels, bigger houses, cars with all leather interiors, navigation systems, and flat panel media centers), and they spend what's left over on savings and market speculation. But consumption is sometimes the result of credit. With personal credit, people spend money that isn't yet theirs. People buy money in order to buy things and experiences. This activity is euphemistically labeled "consumer confidence" because it suggests that people think that though they don't have adequate money to make purchases today, they will somehow acquire more money in some reasonable tomorrow. But, in the presence of credit, people don't behave reasonably. In the presence of personal credit, consumption outpaces productivity generated real wealth. The more people owe, the less money they have to both pay back their credit debt and make more purchases. The credit industry responds by extending more debt. Real value of real money becomes, as a result, very hard to track. What does consumption driven revenue actually mean when that consumption is being financed by debt? Disturbingly, in wholesale markets, debt is conceived of, marketed, and packaged as "product".
Consumer credit is an extreme bastardization of the concept of "capital". Capital, as traditionally defined, as loans made to producers, can legitimately be show to increase an economy's ability be productive. Personal credit is rarely used to increase capital, rarely thought of as a tool to increase a person's ability to make wealth. It is somewhat ironic that we may cringe at the idea of max-ing out a credit card to finance a business or to initiate a creative project, but we have no such aversion to using the same credit to go on a weekend trip or to acquire new shoes. We recognize that business, in pursuit the creation of wealth, need access to loans, to lines of credit, to venture capital, to the sale of equity as stock. But we are generally cognizant of the fact that the same access to capital when offered to individuals as credit, does not often drive anything more profound than debt and the kinds of consumption that are non-productive, that don't add to an economy's ability to produce new wealth, to do more with less labor.
Consumption isn't restricted to the individual shopper at the local mall. Consumption happens at huge corporate and international scales as well. Consumption as defined as the act of purchasing things and services that do not lead to greater productivity is becoming an alarmingly common business practice involving tens of trillions of dollars exchanged daily in global markets. To the extent the money being spent is not owned by the consuming party, to the extent the things being purchased do lead to increases in productivity, and to the extent that the intention of the participating parties is not motivated by an understanding of productivity or a desire to act in its behalf, exchange throughput must be viewed for what it really is, an unreliable indicator of true growth. To the extent a market is propped up by the infusion of cash, to the extent that a market grows for reasons that are not tied to its ability to increase productivity, eventually the whole system has to crash. As it has many many times, and as it has crashed recently.
The same thing happened and caused the dot com boom. New money, money that had not previously had access to the markets, found its way to wall street through internet trading. Ma and Pa Simpleton could hook up a modem and siphon their savings and retirement accounts right into APL and IBM stocks. This new money made old stocks look more valuable. "Rising demand" and all. Only nobody got that there is a big difference between the new "more demand" and our old notions of "higher demand".
Likewise, the unprecedented surge of non-western investment in western stock markets coincided with an ongoing flattening of our own innovation-starved growth trajectories, good money was and continues to be funneled into very bad monitory mechanisms and "creative" securities products, when it should have gone towards innovation induced capital and productivity. We really don't have an economic theory to fit this crazy bottom up investment model. We really don't know how to track and predict the differences between and interactions in the intersection of investment markets and the real product markets they (sometimes) represent.
Meanwhile, second and third world markets have indeed grown as a direct result of the build out of more efficient infrastructures. The last 15 years has seen the second and third world adopting modern industrial farming, modern highway systems, modern water and energy production and distribution systems, modern banking and credit, pluralist governance and education, modern health care, and the industrial machinery that can only be produced and maintained by a well educated work force. Along the way, attitudes and cultures have adjusted to the individual freedoms that come hand in hand with wealth, capital, and stability. But all of this growth has been a result of the rest of the world adopting ideas, knowledge, tools and infrastructure that has long existed in the west. Productivity has risen as expected at the rate of adoption.
In traditional western economies, economies that originate(ed) the knowledge, tools and infrastructures now being adopted in the rest of the world, increases in productivity must come from innovation. We don't have the luxury of adoption... there are no societies more advanced from which to borrow innovation.
We must innovate! In a modern economy, innovation is mandatory. More to the point, the kinds of innovations we must produce, that the west must build and implement, must be the the particular kinds of innovation that result in true increases in global productivity. Forget about little innovations, or surface innovation, or innovations that extrapolate on older innovations. Don't bother with innovations that exploit markets created by earlier innovations... for apex economies, innovation is only innovation if it catalyzes whole new markets, if it fundamentally reshapes the future of innovation.
At every large organization, there are people who's primary responsibility is the happiness of investors. Investors expect the value of their equity share to increase steadily. Included in this group are CEO's, CFO's, Boards of Directors, and almost everyone working at top levels of management. Their careers are directly linked to the value performance of the organization they represent. When productivity doesn't rise apace, institutional professionals are compelled to find any and every artificial means of inflating the value of their product. A bubble is born. Unsupported by real value, these epochs of artificially inflated values must inevitably come crashing down.
Regulation does help. By restricting market managers from engaging in the most egregious and obvious inflationary tactics, some market shenanigans can be avoided. But ultimately, unhappy times, market mangers will find a way to do their job... to keep share holders (temporarily) happy. And then, of course, there is always, fiscal policy. Government and banking driven manipulation of fiscal and monitory policy (controlling the base price and availability of capital as loans) has important, but will only have limited and short term effects on markets. Fiscal policy is a surface fix. The fed board (and its equivalent in other nations/economies) is a fine-tuning instrument... has absolutely disastrous implications when used beyond this narrow band of effect (currency devaluation, runaway inflation, decreased foreign investment, etc.).
Again, we come back to the basics... to the the most fundamental metric in any economy... to productivity. If you can find a way to feed your entire society with just two percent of your population working the soil (rather than ninety), you can get more done with your total labor pool. Productivity rises. Industrialized farming, automated factories, the delivery of clean water to homes and work places, an accepted currency and banking system that makes capital available to the masses, and equitable and respected system of governance, and justice, the removal of garbage and human waste, an efficient transportation system for people, goods, and industrial materials, a reliable communications network, public education and career and health care, and an efficient source of energy that can be routed where it is needed... these are the foundational infrastructures that drive an economy. Each presaged an increase in productivity and is linked to true growth.
It has been a long time since an infrastructure-scale innovation has rocked the national and global productivity metrics. The cultivation of plants and domestication of animals. Metallurgy. Fire. Shelter and clothing. Devision of labor. Governance. Spoken and written language. The printing press. Transportation infrastructure. Production and distribution of energy. Communication systems. These epochs are at the scale I am addressing. Electricity. Oil and gas production. The telegraph/radio/TV. The internal combustion engine. Numbers, measurement, and mathematics. Public education. Currency. The periodic table. Germ theory. Genetics. Evolution theory. Information theory. Machines that compute. These are the innovations that produce epochs of productivity.
Marked in staccato evolutionary steps, computing has proceeded apace through its rather short history, aping the dumb media we used to conduct culture... before computers. Ledger sheets, paper and pencil, typewriters, notebooks, folders and filing cabinets, printing presses, desks, drawers, chalk boards, telephones, mail, even cameras, sound and video recorders, televisions and radios. Very little of this analog to digital conversion process has involved added to the science of computing. Very little of the computerization of media has advanced the infrastructure of logic, of systems, of knowledge automation. Mostly our efforts and the markets that have resulted (rich as they have been) have sidetracked true evolution in computing by aligning attention to artifacts instead of meaning. A the existence of computerized spreadsheets might make working with ledger data easier, but it doesn't help us understand economics any better, and it doesn't advance the science of computation or logic. Progress in computing over the last 30 years has done little more than adding efficiency to old methods and processes, very little of the effort expended has resulted in better computing.
This is a dead end process. Using all of this logical power to make a better piece of paper... what a joke. We should instead aim to evolve technology that can generalize the larger problems that end up expressing themselves as ledger sheets and notebooks... technology that gets to the base of the issues that manifest the need for spreadsheets and word processors... technology that understands goals, that tracks resources, that builds collaborative solutions, that seeks patterns that build patterns... technology that can process the hierarchies of influence that effect the transition from any now to any inevitable future.
Can we look to today's computers, as amazing as they are, and truly say of them that mimicking paper added as much value to computing as it did to writing and typewriters? Efficiencies have been gained, sure, but at what cost? A computer can and should be much, much more than a writing device; it should be an evolution machine. Is it? Computers have the power. What's missing is vision… not their vision… ours. Humans need to expect more of this most plastic of all machines.
Randall Reetz
Problems with Search Engines, Who's Your Daddy?
If you are living within a modern developed economy you are most likely the unwitting master of over 40 computer chips. Of these chips only one or two are the big expensive general purpose CPUs that sit at the center of the PCs and laptops of the sort Intel or AMD or Motorola sell for several hundred dollars, aggressively advertise, and which, improbably, get both more powerful and less expensive at the wild rate dictated by Moore's Law (respectively doubling and halving every 18 months).
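As a back-of-envelope illustration of that Moore's Law rate (a sketch using round, hypothetical starting figures, not measured data), the compounding works out like this:

```python
# Moore's Law as stated above: transistor counts double (and cost per
# transistor roughly halves) every 18 months. The 10-million-transistor
# starting point is a notional round number for illustration only.

def moores_law(initial_transistors, years, doubling_months=18):
    """Project transistor count after `years` of one doubling per 18 months."""
    doublings = (years * 12) / doubling_months
    return initial_transistors * 2 ** doublings

# Starting from a notional 10-million-transistor CPU:
for years in (3, 6, 9):
    print(f"{years} years: {moores_law(10_000_000, years):,.0f} transistors")
# 3 years:  40,000,000
# 6 years: 160,000,000
# 9 years: 640,000,000
```

A 64-fold gain in nine years, from nothing more than a fixed doubling period, is why the rate feels "improbable" and why it has held the industry's attention for decades.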
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specific, and limited in their abilities. Uncomplainingly, and in many cases without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Much to the chagrin of front yard hotrodders, embedded chips now control your car's gas/air mixture and time the ignition sparks in each cylinder, one of the reasons modern cars are so much more powerful than earlier cars and manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water, or from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags and keep you alive by sensing an impact and reacting to it in the few ten-thousandths of a second between then and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity and air flow within your car. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system, in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or if you are scheduled for a tuneup. There are chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wrist watch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD player. On your bedside table your alarm clock and home phone have them, even each of the little black boxes that charge your wireless devices. Even some toilets are controlled by embedded chips.
Most washing machines have them. More and more refrigerators come with them, as do some stoves, most microwave ovens, bread makers, coffee machines, home security systems, and multi-room media systems. Lots of toys have them: robots, learning systems, trivia games. Then there are home weather stations, TV set-top boxes for cable and satellite, handheld language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, which dutifully measure your soil's acidity, temperature, nitrogen content, moisture level, and the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart and plug its USB plug into your computer, which goes to a site that takes your zip code and spits back a list of plants suited to the conditions in that part of your garden.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors, incorporating new processes that can etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, which means hundreds of chips can be etched at once, and once you have built the fabrication plant, it costs no more per chip to print out chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years ago, can be cheaply retooled to spit out lesser chips at almost no cost at all.
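The economics sketched above can be put in rough numbers. Everything below (the plant cost, wafer yield, and volumes) is a hypothetical round figure chosen purely for illustration; the point is only that once the fab is paid for, cost per chip is driven by volume, not by how many transistors each chip carries.

```python
# Illustrative amortization sketch: the fab is the dominant fixed cost,
# spread across every chip it ever prints. All numbers are hypothetical.

def cost_per_chip(fab_cost, chips_per_wafer, wafers_per_year, years):
    """Amortized fab cost per chip over the plant's productive lifetime."""
    total_chips = chips_per_wafer * wafers_per_year * years
    return fab_cost / total_chips

# A $3 billion plant, 500 chips per wafer, a million wafers a year, 5 years:
print(f"${cost_per_chip(3e9, 500, 1_000_000, 5):.2f} per chip")  # $1.20 per chip
```

Double the transistors per chip and the figure doesn't move, which is why yesterday's fully amortized plants can flood the world with embedded chips at near-zero marginal cost.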
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, who can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes the second world and the second world becomes the first world. Technology is no longer the sole domain of the top one tenth of the global population. There is a double explosion: of processing power, and of a growing demographic with growing demands for information and control.
It is hard to overestimate the ramifications brought about by this diaspora of processor-infused devices. Born of computers but now running amok into the firmament of the day-to-day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self-contained processing device… how quaint that will come to seem as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart-dust network of things, a computer is what happens for a few seconds and what goes away as other ethereal combinations of devices snap into and out of existence, layered such that any node that is part of one such computer is part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of those people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex information and control exchange.
By the most conservative estimates, the network of things will grow out at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, and if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run? We are used to culture and society and government evolving by such loose and unstructured associations and connections… but that is because we humans are the nodes in social networks.
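It is worth putting that projection's arithmetic on paper. The sketch below uses the essay's own stated assumptions (a billion owners, 40 chips each); note that a strict 5-year doubling alone lands well under a trillion after 20 years, so the ten-to-hundred-trillion range implicitly assumes adoption accelerating toward something like a 2-year doubling, consistent with the "world keeps getting richer while chips get cheaper" proviso.

```python
# Back-of-envelope growth of the thing-net under stated assumptions.
# All parameters are the essay's own figures (or variations on them),
# not measurements.

def thing_net_size(base_chips, years, doubling_years):
    """Chip count after `years` of doubling every `doubling_years`."""
    return base_chips * 2 ** (years / doubling_years)

base = 1_000_000_000 * 40          # one billion owners, 40 chips each

# At the stated 5-year doubling, 20 years gives:
print(f"{thing_net_size(base, 20, 5):.2e}")   # 6.40e+11 (0.64 trillion)

# A 2-year doubling over the same 20 years lands inside the
# ten-to-hundred-trillion range:
print(f"{thing_net_size(base, 20, 2):.2e}")   # 4.10e+13 (~41 trillion)
```

The gap between those two outcomes is the whole story of exponential forecasting: a modest change in the doubling period swings the 20-year figure by nearly two orders of magnitude.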
As the design of sensor/controller/communicator modules grows in sophistication, as the same factors that gave us Moore's Law come into play in their design and production, as these All-Net motes shrink and become more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing, and communications capabilities, and where these capabilities are diffuse, cheap, expendable, and, by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
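One plausible, and purely illustrative, answer to the filtering question is interest-based subscription: each node declares the topics it cares about and silently drops everything else, so no node ever has to process the whole cacophony. The `Node` class and topic names below are hypothetical, a sketch of the idea rather than any real protocol:

```python
# Minimal sketch of interest-based filtering in a chatty swarm: every
# message reaches every node, but each node keeps only the topics it
# declared an interest in. All names here are illustrative.

class Node:
    def __init__(self, name, interests):
        self.name = name
        self.interests = set(interests)
        self.inbox = []

    def hear(self, topic, payload):
        # Drop messages outside this node's declared interests.
        if topic in self.interests:
            self.inbox.append((topic, payload))

def broadcast(nodes, topic, payload):
    for node in nodes:
        node.hear(topic, payload)

thermostat = Node("thermostat", ["temperature", "occupancy"])
sprinkler  = Node("sprinkler",  ["soil_moisture", "weather"])

swarm = [thermostat, sprinkler]
broadcast(swarm, "temperature", 21.5)   # only the thermostat keeps this
broadcast(swarm, "weather", "rain")     # only the sprinkler keeps this

print(len(thermostat.inbox), len(sprinkler.inbox))  # 1 1
```

Real swarms would of course need the harder pieces this sketch waves away: a shared vocabulary of topics, radio-level routing so that not every message physically reaches every node, and some way for interests themselves to be learned rather than hand-declared.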
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning: these little radios can compute, and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere has tens or hundreds of intelligent processing motes built into it?
Clearly, today's operating systems would collapse under such a load, deprived of the centrality and determinism afforded to today's computing environments. Computing as it stands today is a reflection of simpler times, when the computer stood all by itself. But what of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyperlink and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as a thing, and work towards a new understanding of a computer as a collection of computing devices, as a shifting sphere of influence, a probability cloud shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping sphere. If this is a radically new idea, it is an idea predated by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing", but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity of true network computing.
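The "computer as momentary coalition" idea can be sketched in a few lines. Everything below (the `Mote` class, the greedy recruitment strategy) is an illustrative toy, not a proposed architecture: a task recruits whichever nodes can contribute a needed capability, the coalition computes, and then it dissolves.

```python
# Toy sketch of an ephemeral "computer": a coalition of motes recruited
# to cover one task's capability needs, existing only as long as the
# task does. All names and the greedy strategy are illustrative.

class Mote:
    def __init__(self, ident, capabilities):
        self.ident = ident
        self.capabilities = set(capabilities)

def form_coalition(motes, needed):
    """Greedily recruit motes until the needed capabilities are covered."""
    coalition, covered = [], set()
    for mote in motes:
        gain = mote.capabilities & (needed - covered)
        if gain:
            coalition.append(mote)
            covered |= gain
        if covered == needed:
            return coalition       # a momentary "computer"
    return None                    # the swarm cannot cover this task

motes = [Mote(1, {"sense"}), Mote(2, {"compute"}), Mote(3, {"radio", "store"})]
task = {"sense", "compute", "radio"}
group = form_coalition(motes, task)
print([m.ident for m in group])    # [1, 2, 3]; the group dissolves when done
```

In this framing the "computer" is the return value of `form_coalition`, not any single box: the same motes can simultaneously belong to other coalitions serving other tasks, which is exactly the overlapping-spheres picture described above.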
So what will it take? What new logical structures will we have to build to get from where we are today, from the isolated computing systems we use today, to that swarm future? What kinds of architectural logic will be required of a computing that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channels) shift and dance in a shimmer of dynamically shared, momentary, and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be predetermined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure is hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single-computer past and the swarm-computer future. We know how to architect single-computer solutions, but we are struggling to bring multi-computer, self-optimizing swarm computing solutions from idea or dream to full working detail.
We desperately need an architectural solution to the swarm computing problem. We need a solution that allows each node to act as a cell: autonomous, but fully capable of coming together to build out collaborative systems as organs, bodies, species, and cultures. As is true of biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities.
Progress in every human endeavor has been stalled by the complexity wall that stands between computing's past and its future. Designing our way towards an architectural solution to the swarm computing problem must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, evolution in all human endeavors will remain stalled. As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, on the other side of the complexity wall waits an increase in global productivity that will dwarf all previous epochs.
Computing hasn't even begun.