If you are living within a modern developed economy, you are most likely the unwitting master of over 40 computer chips. Of these chips, only one or two are the big, expensive, general-purpose CPUs that sit at the center of PCs and laptops, the sort for which Intel or AMD or Motorola charge several hundred dollars, which they aggressively advertise, and which, improbably, get both more powerful and less expensive at the wild rate dictated by Moore's Law (respectively doubling and halving every 18 months).
The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specific, and limited in their abilities. Uncomplainingly, and in many cases without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Embedded chips now control your car's fuel/air mixture and time the ignition sparks in each cylinder; these chips are one of the reasons modern cars are so much more powerful and manage better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, chips that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water or from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags that keep you alive by sensing an impact and reacting to it in the few ten-thousandths of a second between "Crash!" and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature and humidity of the air flowing within your car's interior. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system and in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or when you are scheduled for a tuneup, and chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wrist watch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD player. On your bedside table, your alarm clock and home phone have them; so does each of the little black boxes that charge your wireless devices. Most washing machines have them, more and more refrigerators come with them, and so do some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys: robots, learning systems, trivia games. Then there are home weather stations, TV set-top boxes for cable and satellite, handheld language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, which dutifully measure your soil's acidity, temperature, nitrogen content, and moisture level, along with the direction, hours, and intensity of sunlight; after 24 hours you pull the stem apart, plug its USB connector into your computer, and a website takes your zip code and spits back a list of plants suited to the conditions in that part of your garden.
Some toilets are controlled by embedded chips.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out more and more powerful processors, incorporating new processes that etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, so hundreds of chips can be etched at once; once you have built the fabrication plant, it costs no more per chip to print out chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years prior, can be cheaply retooled to spit out lesser chips at unit costs that drop to nearly pennies per chip.
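To make the economics concrete, here is a back-of-envelope sketch in Python. Every number in it is a hypothetical assumption of mine, not a figure from any manufacturer; only the shape of the arithmetic matters.

```python
# Illustrative arithmetic only -- every number below is a hypothetical
# assumption, not a real figure. Only the shape of the economics matters:
# amortize the fab on premium CPUs, then run it nearly for free.
fab_cost = 3_000_000_000            # hypothetical new fabrication plant
wafers_per_year = 1_000_000         # hypothetical throughput
cpu_dies_per_wafer = 400            # big CPUs: hundreds etched per wafer at once

# Plant amortized over five years of premium CPU production:
overhead = fab_cost / (5 * wafers_per_year * cpu_dies_per_wafer)
print(f"${overhead:.2f} overhead per CPU die")   # $1.50 -- trivial against
                                                 # a several-hundred-dollar price

# Once paid off and retooled for tiny embedded dies, only running costs remain:
wafer_run_cost = 1_000              # hypothetical materials and operations
embedded_dies_per_wafer = 10_000    # small embedded chips: thousands per wafer
print(f"${wafer_run_cost / embedded_dies_per_wafer:.2f} per embedded chip")  # $0.10
```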
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, which can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity", where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes second world and the second world becomes first world. Technology is no longer the sole domain of the top one tenth of the global population. A double explosion results: first, processing power increases ridiculously, as dictated by Moore's Law; and second, a growing percentage of global citizens gain sufficient economic power to affect the demand curve for products and services enabled by embedded and shared processors.
It is hard to overestimate the ramifications brought about by this diaspora of processor-infused devices. Born of computers but now running amok into the firmament of the day-to-day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self-contained processing device… wow, will that come to seem quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart-dust network of things, a computer is what happens for a few seconds and then goes away as other ethereal combinations of devices snap into and out of existence, layered such that any node that is part of one such computer is part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of the people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex exchange of information and control.
By the most conservative estimates, the network of things will grow out at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run?
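For what it's worth, the projection can be checked in a few lines of Python. The starting figures come from the paragraph above; the owner growth rate is an illustrative assumption of mine. Even so, the sketch lands at a few trillion chips; reaching the low end of the ten-trillion range requires per-capita chip counts to double a bit faster than every four years once everyone on Earth owns chips.

```python
# Back-of-envelope projection. Starting figures come from the essay;
# the owner growth rate is an illustrative assumption.
def thing_net_size(years, owners=1e9, chips_each=40,
                   chip_doubling_years=5, owner_growth_per_year=1.12):
    """Total embedded chips after `years`, assuming per-capita chip counts
    double every `chip_doubling_years` while the pool of chip owners grows
    as the world gets richer (capped at six billion people)."""
    chips_each *= 2 ** (years / chip_doubling_years)
    owners = min(owners * owner_growth_per_year ** years, 6e9)
    return owners * chips_each

print(f"{thing_net_size(20):.2e}")   # ~3.84e12 -- a few trillion chips
```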
In our everyday lives, we are used to complex aggregates like culture, society, and government evolving from similar loose and unstructured associations and connections. Culture as network. But that, we say, happens because humans are the nodes, and humans are themselves complex. Indeed, as network nodes, we humans are very complex. We participate in the societal network that is culture as self-contained arbitrators or processors of wildly complex patterns: of situation and context, need and availability, propriety and collaboration, initiative and momentum, concept and diffusion, pattern and chaos. What minimal subset of these skills can we distill from the human node, then generalize and subsume into the silicon and software of our tiny smart-dust kin-to-be? This is the big project that looms in front of computer science. THE PROJECT. But before you throw up your arms in despair, remember that our own minds are consistent with the smart-dust model. Each of the 100 billion or so neurons in our heads is relatively simple (at least in its role as an information and logic processor). "Mind", the remarkable confluence that makes us us, isn't the neurons, the things, the nodes, so much as it is a super-product of the n-dimensional connection map, the network that flits in and out of existence, the fluctuating and overlapping web connecting neurons to neurons.
As the design of sensor/controller/communicator modules grows in sophistication, as we figure out what absolutely has to exist in a node and what is best left to the emergent properties of the network itself, and as the same economies of scale that gave us Moore's Law come into play in their design and production, these All-Net motes will shrink and become more and more powerful, and we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing, and communications capabilities, and where these capabilities are diffuse, cheap, expendable, and, by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning: these little radios can compute, and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic are made necessary by the communications craziness produced when everything everywhere contains tens or hundreds of intelligent processing motes?
Clearly, today's operating systems would collapse under such a load, and without the centrality and determinism afforded to today's computing environments. Computing as it stands today is a reflection of simpler times, when the computer stood all by itself. But what, you ask, of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyperlink and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
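The contrast is easy to sketch. Below is a toy Python model of my own; none of these names describe a real protocol. The first function is today's whole story, while the little bus underneath hints at nodes thinking on each other's thoughts:

```python
# Today's web in one line: one request, one response, conversation over.
def transactional(server, url):
    return server.get(url)          # "I request, you send."

# A hint of the thing-net: a node publishes what it "thinks"; any other
# node may act on that thought, refine it, and republish it.
class Bus:
    def __init__(self):
        self.handlers = []
    def subscribe(self, handler):
        self.handlers.append(handler)
    def publish(self, topic, thought):
        for handler in list(self.handlers):
            handler(topic, thought)

bus = Bus()
# A relay node: hears raw readings, republishes a derived thought.
bus.subscribe(lambda t, x: bus.publish("comfort", x > 24) if t == "temp" else None)
# An actuator node: acts on the derived thought, not the raw reading.
bus.subscribe(lambda t, x: print("open vents:", x) if t == "comfort" else None)
bus.publish("temp", 27.0)           # -> open vents: True
```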
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes… as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that, as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer". We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices, as a shifting sphere of influence, a probability cloud shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping sphere. If this is a radically new idea, it is one already predated by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing", but even that term fails to evoke the complex dimensionality, simultaneity, and swift plasticity that will characterize true network computing.
So what will it take? What new logical structures will we have to build to get from where we are today, from the isolated computing systems we use today, to the self-optimizing, agent-generating, pattern-building, semantically aware swarm computing of tomorrow? What kinds of architectural logic will be required of a computing system that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared, momentary, and ever-changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be predetermined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but for which our current knowledge and computing infrastructure are hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single-computer past and the swarm-computer future. We know how to architect single-computer solutions, but we are struggling to bring multi-computer, self-optimizing swarm computing solutions from idea or dream to full working detail.
Of course, having a hundred trillion nodes doesn't necessarily mean you have anything approaching swarm computing… it might mean nothing more than the fact that there are a hundred trillion nodes. Nodes that aren't designed to work together, to self-aggregate, will never result in anything more capable than any one of them is as an individual unit. A flock maybe, but never a swarm. A flock is composed of a lot of self-similar individuals acting in their own self-interest; any patterns that appear to emerge are crystalline, pretty, but never complex, emergent, or evolving. A swarm, on the other hand, is defined by structures and behaviors of the aggregate whole that are not themselves possible in the individual parts of which it is composed. A swarm doesn't just "look" like a system… it is one. Where a flock can emerge from the behavior of individuals even when they have no means of sensing the goals of others around them, a swarm can only result when each part has the capacity to be aware of the parts around it and can grasp and process the construct of a shared goal. For a bunch of nodes ever to swarm, each node must possess the capacity to see itself in the context of an environment that is at least partially composed of other nodes.
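The distinction fits in a few lines of Python. This is only an illustrative sketch of my own, not a simulation of anything real: the flock member follows purely local, self-interested rules, while the swarm member carries the two extra ingredients named above, awareness of its neighbors and a construct of a shared goal.

```python
from dataclasses import dataclass

@dataclass
class FlockBird:
    position: float = 0.0
    def step(self, neighbors):
        # Pure self-interest: drift toward the local average and nothing more.
        if neighbors:
            avg = sum(n.position for n in neighbors) / len(neighbors)
            self.position += 0.1 * (avg - self.position)

@dataclass
class SwarmNode(FlockBird):
    goal: float = 10.0                  # the construct of a *shared* goal
    def step(self, neighbors):
        super().step(neighbors)         # still aware of the parts around it...
        self.position += 0.1 * (self.goal - self.position)  # ...but goal-directed
```

Run either one for a thousand steps and the flock settles into a pretty, crystalline pattern of mutual averaging; only the swarm ever arrives anywhere.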
Current estimates put the number of computers hooked to the global web we call the internet at between one and two billion. But there is nothing swarm-like about the internet. Obviously, today's computers do not have the requisite attributes and abilities to swarm. Exchanging email documents and web pages isn't exactly the stuff of deeply collaborative computing. For one thing, your computer has absolutely no means of understanding the contents of your emails or the web pages it dutifully displays. Adding more of these semantically challenged computers to the net won't get us any closer to swarm computing.
Likewise, most of the yardsticks we use to measure computer performance and progress, like clock speed, data bandwidth, bus speed, chip memory, and disk capacity, are useless as indicators of the capacity of a node to participate in a computational swarm. Today's computers are powered by chips composed of over a billion transistors. The amount of storage available to each computer on the net today is enough to hold tens of thousands of books, hundreds of hours of video, and thousands of hours of recorded music. And yet, though it is obvious that they are not nearly as mentally endowed as the average laptop, individual ants seem to swarm just fine. The crucial swarming criterion cannot be quantity or power so much as some measure of quality.
So, what is the missing secret sauce? What crucial "it" results in swarm behavior? Let's make a list. For nodes to swarm:

- All nodes must speak a common language.
- That common language must be semantically reflexive: each node must be able to interpret, on its own, the meaning of the work of other nodes.
- The nodes must be hooked to, listen through, and talk through a shared network.
- Each node must understand and be able to readily process such complex concepts as ownership, membership, resources, value, goals, and direction.
- They must know how to enlist the help of other nodes and node groups.
- They must be able to know when one project is done and how to determine what might be formed into a new project.
- They must know their own strengths or unique value, how to build on those strengths, and how to form coalitions where other nodes are better equipped.

Each of these capabilities is complex in its own right, but none of them alone will facilitate swarm computing. Swarm computing will happen only as a holistic solution. Any reasonable holistic solution will most likely be built from a set of shareable low-level logical machines… machines not unlike the Boolean gates that make up binary computation, but at a higher level of abstraction.
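To make the list concrete, here is a hypothetical sketch in Python of what that node contract might look like. Every method name is my own assumption about a minimal interface, not a real or proposed standard.

```python
from abc import ABC, abstractmethod

class SwarmNode(ABC):
    """Hypothetical minimal contract a node must satisfy before it can swarm."""

    @abstractmethod
    def interpret(self, message: bytes) -> dict:
        """Semantic reflexivity: recover the meaning of another node's work."""

    @abstractmethod
    def advertise(self) -> dict:
        """Announce this node's strengths and unique value on the shared network."""

    @abstractmethod
    def enlist(self, goal: dict) -> list["SwarmNode"]:
        """Recruit other nodes or node groups into a coalition for a goal."""

    @abstractmethod
    def conclude(self, project: dict) -> bool:
        """Judge whether a project is done, freeing the node to re-aggregate."""
```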
Though it will not be easy to specify, design, or build, we desperately need an architectural-level solution to the swarm computing problem. We need a solution that allows each node to act as a cell, both autonomous and fully capable of coming together to build out collaborative systems equivalent to the organs, bodies, species, and cultures that are the emergent aggregates of biological cells. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities. The ideal smart-dust swarm computing architecture will consist of only two things: the nodes themselves and the medium or network that connects them. Specialization might play a role; each node might need to be able to physically adapt to specific environmental situations. Ideally, a node would contain everything it needs to conform and self-optimize to changing demands and opportunities. But most of the plasticity in the system will come from the connections between the nodes: how they self-aggregate into opportunistic groups to solve problems, refine and store information, and communicate, edit, and propagate pattern; how they learn; and how other groups learn to use this accumulated knowledge as decision and control structure.
I am compelled to take a short detour and speak to the "multi-core processor" issue. Every major chip manufacturer is building them. A new arms race has shifted bragging rights from clock speed to core count. Commercial chips with 16 cores are about to hit the market, and 32-core chips will follow. Intel has bragged of a working 80-core prototype. The dirty truth is that chip makers hit a performance wall about 6 years ago, forcing a retreat to multi-core layouts. As they attempted to pack more and more transistors onto a CPU, they came up against the hard reality of physics. Smaller and smaller wires meant greater and greater heat and worse and worse efficiency (per unit area, chips were on track to rival the heat flux at the surface of the sun). Then electrons started to jump gates (quantum tunneling!), making transistors unreliable. The sheer size of the chips was rising to the point where the distance electrons had to travel (at roughly 96 thousand miles per second) was slowing things down. The surface-area-to-edge ratio had shifted to the point where there wasn't enough length at the chip's perimeter through which to supply data to keep it running at full capacity. Real-world uses tended not to match the workloads for which these big chips were optimized. But mostly, the industry was rapidly reaching the physical limits of the project-and-etch manufacturing process that had so reliably yielded a doubling of processor density every 18 months.
The industry responded by going condo… by dividing the chips into smaller and smaller sub-units. But multi-core chips come with their own unique set of problems. The main problem is that nobody has figured out how to write a compiler that converts source code into a format that takes full advantage of multiple cores. Compilers customized for two-core chips are worse than useless for four- or eight-core chips. The other problem is that computations cut up to be processed on multiple cores produce results out of sync with the cores next to them, so data that one part of a process needs may have to wait around for another process to finish. The results produced by each of these core-assigned threads must be put into some sort of shared memory and must be labeled and tracked so that the processes being run on other cores can avoid using unprocessed data or data that is now owned by another process thread. The logical topology of multi-core processing models has proved difficult to generalize. Automating the processing sequence and the locking of transitory in-process memory, regardless of the number of cores available to the execution of an application thread, is beyond the practical scope of most programmers. The end result is that most code runs no faster on multi-core chips, and some runs slower.
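The arithmetic behind that disappointment has a name, Amdahl's Law (my framing, not the essay's): any fraction of a program that serializes on locks and shared memory caps the total speedup, no matter how many cores you add. A quick Python illustration:

```python
# A worked illustration (Amdahl's Law) of why more cores so often
# disappoint: any serial fraction -- locking, waiting on another
# thread's results -- caps the speedup regardless of core count.
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Maximum speedup when only `parallel_fraction` of the work can run
    concurrently; the rest serializes on shared state."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 32):
    # With 25% of the runtime serialized behind locks and shared memory...
    print(cores, round(amdahl_speedup(cores, 0.75), 2))
# 2 cores -> 1.6x, 4 -> 2.29x, 32 -> 3.66x: sixteen times the cores,
# nowhere near sixteen times the speed.
```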
Until the industry responds with a way to reliably and automatically break executable code into self-optimizing threads managed across multi-core chips, you'd better think twice when told that a thirty-two-core chip is better than a two-core chip.
But there may be a silver lining to all of this multi-core craziness. When you think about it, a multiple-core chip is the same as having several chips… and isn't that what swarm computing is all about? Solving the problems and meeting the challenges of designing a true multi-core processing architecture is exactly equivalent to solving the problems and meeting the challenges of designing a processing architecture for swarm computing. The only real difference between the multi-core and multi-node computing challenge is security. Cores, existing as they do on one chip, are mostly owned by one entity; ownership is implicit. Nodes, on the other hand, can be far-flung and owned by any number of separate entities. Multi-node processing is processing that must adhere to strict ownership and data security handshakes. Where processing involves multiple entities, data must be served up from shared memory in accordance with a strict asynchronous locking protocol.
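In code, the difference amounts to one extra gate in front of every read. The sketch below is entirely hypothetical, a stand-in of my own for whatever real ownership and locking protocol a thing-net would use:

```python
import threading

class SharedRegion:
    """Shared memory with explicit ownership (hypothetical protocol sketch)."""
    def __init__(self, owner: str, data: dict):
        self._owner, self._data = owner, dict(data)
        self._lock = threading.Lock()   # the multi-core part: serialize access

    def read(self, requester: str, key: str):
        if requester != self._owner:    # the multi-node part: ownership is explicit
            raise PermissionError(f"{requester} does not own this region")
        with self._lock:                # then the usual locking handshake
            return self._data[key]

region = SharedRegion(owner="node-17", data={"temp": 21.5})
print(region.read("node-17", "temp"))   # 21.5; any other requester is refused
```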
We have taken the old-style solitary and discrete processing and programming model (computing in a box) about as far as human creativity and cognitive capacity will permit. As a result, and because each of our other research pursuits is so intimately dependent on the steady advance of computing's power and scope, progress in almost every human endeavor has been stalled by the complexity wall that stands between computing's solitary and linear past and its collaborative and n-dimensional future. Designing our way forward, towards an architectural solution to the swarm computing problem, must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, progress in all human endeavors will remain stalled.
As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, the other side of computing's complexity wall heralds manifold increases in global productivity, dwarfing all previous epochs.