The rest of your chips live less conspicuous lives of simple servitude. They are embedded within many of the products and appliances you already own and almost all of the products you will buy tomorrow or own in the future. Compared with CPUs, embedded chips are usually smaller, more specific, and limited in their abilities. Uncomplainingly, and in many cases without the need for any human interaction at all, they go about their menial tasks in nauseating repetition… electrons, it turns out, are exceedingly cheap. Much to the chagrin of front yard hotrodders, embedded chips now control your car's gas/air mixture and time the ignition sparks in each cylinder, which is one of the reasons modern cars are so much more powerful than earlier cars while managing better mileage at the same time. There are special chips that keep your wheels from locking up when you brake hard, that track the relative rotational speed of each tire and independently brake each wheel to keep any one of them from slipping on ice and water or from digging into mud or snow at the side of the road, and still others that compile information from your speedometer, engine, and accelerometers to individually regulate the responsiveness of your suspension at each of your car's four corners.
There are chips that inflate the air bags and keep you alive by sensing an impact and reacting to it in the few ten-thousandths of a second between then and when your head would have collided with the steering wheel or side window. There are several that do nothing but regulate the temperature, humidity, and air flow within your car. Likewise, your home thermostat contains one that helps your furnace and air conditioner balance the opposing demands of energy efficiency and comfort. Back in your car there are chips in your GPS system and in your sound system, chips that track sensors all over your car and tell you when something mechanical is amiss or when you are due for a tuneup. There are chips that control the display of your speedometer and other in-dash instruments. One keeps your cell phone attuned to the closest transmission tower. There is one in your wrist watch. If you are an avid athlete, you might wear a computer that keeps track of your heart rate and oxygen uptake. Your bike might have one that keeps track of your distance and speed and the amount of pressure you are applying to your pedals, tracking your motion through geography and altitude, even tracking your performance as it changes over months or years of training. Hell, your blender and stove and refrigerator have them. There are chips in your TV, in your DVR, in your CD and DVD players. On your bedside table your alarm clock and home phone have them, as does each of the little black boxes that charge your wireless devices. Even some toilets are controlled by embedded chips.
Most washing machines have them. More and more refrigerators come with them, as do some stoves, most microwave ovens, bread makers, coffee machines, home security systems, multi-room media systems, and lots of toys: robots, learning systems, trivia games. Then there are home weather stations, TV set-top boxes for cable and satellite, handheld language translators, calculators, walkie-talkies, baby monitors, pacemakers, insulin pumps, and fertility monitors. There are even little plastic flowers on plastic stems that you shove into the dirt in your yard, which dutifully measure your soil's acidity, temperature, nitrogen content, and moisture level, along with the direction, hours, and intensity of sunlight. After 24 hours you pull the stem apart and plug its USB plug into your computer, which takes you to a site that takes your zip code and spits back a list of plants suited to that part of your garden's conditions.
The world of stuff is coming alive through computation. And it isn't just because smart stuff is better stuff. One of the main reasons chips are finding their way into so many things is because they have become so damn cheap to manufacture.
CPU manufacturers compete by constantly building newer and more advanced fabrication plants (each costing billions of dollars) that spit out ever more powerful processors, incorporating new processes that can etch smaller and smaller transistors, which means more and more transistors on the same sized chip. The etching is a projection process, so hundreds of chips can be etched at once; once you have built the fabrication plant, it costs no more per chip to print chips with two or four times as many parts. Meanwhile, older plants, having already been paid for through the sale of millions of what was state of the art just a few years ago, can be cheaply retooled to spit out lesser chips at almost no cost at all.
Chips are becoming the inexpensive souls we install in our devices to bring them alive. Soon, as these devices gain communication capabilities, they will build out their own network of things. This thing-net will bring our environment alive with shared information. Where computers themselves are information objects, the smart network of stuff will become an information backdrop. You've heard the projection that your grocery cart will get biometric data via your smart toilet and will gently guide you to purchase organic broccoli or whole grain breads (partially subsidized by your health provider, who can cut costs by helping its patients stay healthy and out of the hospital). This scenario is not as far-fetched as it first sounds. As embedded processors proliferate, the stuff around us will become brainy and chatty. Add in some software that understands the hopes and habits and needs of the humans that use them, and you have the beginnings of what I call "forced serendipity," where the environment conspires around us to make our lives more complete and our minds more content. How this will play out is beyond the scope of guessing. That it will play out… this is inevitable.
Simultaneously, the world gets richer. The third world becomes the second world and the second world becomes the first world. Technology is no longer the sole domain of the top one tenth of the global population. There is a double explosion: of processing power, and of a growing demographic with growing demands for information and control.
It is hard to overestimate the ramifications brought about by this diaspora of processor-infused devices. Born of computers but now running amok into the firmament of the day-to-day environment, the thing-net becomes a background, a living web composed of millions of ad-hoc collaborative groupings of shared processing. What we call a computer today, a single self-contained processing device… wow, will that come to seem quaint as we roll into the next decade. Think instead of amorphous coalitions of lesser processors and sensors coming together as the need arises and as opportunity presents itself… of two or ten devices in your car, or of two hundred thousand chips spread out across the planet, devices owned by many entities, or maybe even by no one. A computer? In the new smart-dust network of things, a computer is what happens for a few seconds and then goes away as other ethereal combinations of devices snap into and out of existence, layered such that any one node that is a part of one such computer is a part of many.
Then what? It is easy to see that our shared future is to be ever more defined and dominated by an ever growing and ever more dense "All-Net" of environmental sensors and actuators that talk to each other, fall naturally into functional groups, and learn and adapt to the ever changing demands of the people, organizations, and machine aggregates that emerge, compete, and collaborate in this ever more complex exchange of information and control.
By the most conservative estimates, the network of things will grow at exponential rates, rates faster even than Moore's Law. If a billion of Earth's six billion inhabitants each own 40 chips, and if that number is doubling every 5 years, and if the world keeps getting richer while chips get cheaper, then it wouldn't be crazy to suggest that in just twenty years we will live within a thing-net composed of somewhere between ten trillion and a hundred trillion intercommunicating processors. The total potential communication graph described by such a staggeringly large network is unimaginably complex. On what logic will it run? We are used to culture and society and government evolving by such loose and unstructured associations and connections… but that is because we humans are the nodes in social networks.
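The projection is easy to sanity-check with the essay's own inputs. Compound doubling alone does not reach the trillion range; getting there also depends on the "world keeps getting richer" clause, i.e., the base of chip owners growing from one billion toward all six billion. A quick back-of-envelope sketch (all numbers are the assumptions stated above, not data):

```python
def projected_chips(chips_today, years, doubling_period_years):
    """Compound doubling: chips_today * 2^(years / period)."""
    return chips_today * 2 ** (years / doubling_period_years)

base = 1_000_000_000 * 40            # 1 billion owners x 40 chips each
print(f"{projected_chips(base, 20, 5):.2e}")      # doubling alone: 6.40e+11

# Folding in the owner base growing from 1 billion toward all 6 billion
# people multiplies the figure by 6, pushing it into the trillions:
print(f"{6 * projected_chips(base, 20, 5):.2e}")  # 3.84e+12
```

Reaching the upper end of the essay's ten-to-a-hundred-trillion range would require the per-person doubling period to shrink below 5 years as well, which is consistent with the claim that the growth outpaces Moore's Law.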
As the design of sensor/controller/communicator modules grows in sophistication, as the same factors that gave us Moore's Law come into play in their design and production, as these All-Net motes shrink and become more powerful, we will see them incorporated into more and more of the stuff around us. Eventually, we will live in a world where almost everything has sensory, processing, and communications capabilities, and where these capabilities are diffuse, cheap, expendable, and, by virtue of their numbers alone, omnipresent.
But how will these devices keep track of themselves in relation to others? By which shared language will they communicate? How will they filter the cacophony of noise that will be the totality of chatter across millions of trillions of devices?
It is one thing to build a radio station and a bunch of receivers. Quite another to build both into every device. But that is just the beginning: these little radios can compute, and they can broadcast what they think so that other nodes can think on those thoughts, take action, or send them forward to still more nodes. What kinds of logic become necessary amid the communications craziness produced when everything everywhere has tens or hundreds of intelligent processing motes built into it?
Clearly, today's operating systems would collapse under such a load and without the centrality and determinism afforded to today's computing environments. Computing as it stands today is a reflection of simpler times, when the computer stood all by itself. But what of the network, the internet, the world wide web? Surely the internet and email are proof that we live in the age of networked computing? The sober answer is that we have been duped. Clicking on a hyperlink and having a web page served up from some hard drive a thousand miles away is a parlor trick. When the UPS guy shows up with a pair of shoes, you wouldn't confuse that with a conversation, with any kind of deeply collaborative process. Today's network is a network of transactions. I request, you send. You request, I send.
Operating system designers have only recently begun to think of the computer as a node, as a member of a network of nodes. Asking them to jump paradigms even further, to define a computer as some minimally complex, minimally competent collection of nodes... as a collection of nodes aggregating and de-aggregating at a pace dictated by need, by task, by goal, by opportunity… this is beyond the cognitive scope of most industry engineers, and it is difficult even for academics and theorists to wrap their minds around. Ever hear those words used together: computer theorist? Historically, computing leans heavily towards practice. I am going to step on some toes here, but most computer researchers will probably agree that as a science, computing is dangerously long on practice and criminally short on theory.
Standing here at the edge of the now, looking out into the abyss that is the future, it seems obvious to me that we need to redefine the very notion of the word "computer." We need to dump the old definition of the computer as thing, and work towards a new understanding of a computer as a collection of computing devices: a shifting probability cloud, a sphere of influence shared by, indeed made up of, hundreds or thousands of computing devices, each seeing itself as the center of its own simultaneously overlapping sphere. If this is a radically new idea, it is an idea already preceded by the reality of the ever expanding network that is the natural product of chips embedded in so many of the things around us. It makes sense to call this new kind of center-less computing network "Swarm Computing," but the term still fails to evoke the complex dimensionality, simultaneity, and swift plasticity that will characterize true network computing.
So what will it take? What new logical structures will we have to build to get from the isolated computing systems we use today to a computing that has no center, no definitive physicality, no boundaries, where even the physical resources of computing (processing, memory, storage, and communication channel) shift and dance in a shimmer of dynamically shared, momentary, and ever changing flux? Where real-time systems come and go at the speed of environmental flux, where need and demand and supply and goal and condition and context can never be predetermined, what kinds of logical structures will be made necessary by the overlay of so ethereal a system onto the reliable accuracy we demand and expect of deterministic computing systems?
This is the future already flowing like a fog into and around our daily lives. This is the future practicality predicts, but one for which our current knowledge and computing infrastructure are hopelessly ill-prepared. Everyday things are already laden with computing power. Some can even send and receive communications. But the wall of complexity that stands between this self-evident now and any true swarm computing future is daunting. Many computer researchers are beginning to sense, as I do, that progress in computer science has come to a halt, that we have, as an industry, as a field, collectively run headlong into this wall that stands between the single-computer past and the swarm-computer future. We know how to architect single-computer solutions, but we are struggling to bring multi-computer, self-optimizing swarm computing solutions from idea or dream to full working detail.
We desperately need an architectural solution to the swarm computing problem. We need a solution that allows each node to act as a cell: autonomous, but fully capable of coming together with others to build out collaborative systems, as cells do organs, bodies, species, and cultures. As is true with biological cells, any true swarm computing architecture will have to imbue each and every computing node with all of the logic necessary to autonomously swarm into larger and larger computing entities.
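The cell analogy can be made concrete with a toy sketch. Assume, purely for illustration, that each node advertises a set of capabilities and that a "computer" is whatever minimal coalition of nodes covers a task's requirements; the node names, capability labels, and the brute-force matching below are all invented for this sketch, not a proposal for the actual architecture:

```python
import itertools

class Node:
    """A cell-like swarm node: it knows only its own capabilities
    and can be drafted into ad-hoc coalitions."""
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

def form_coalition(nodes, required):
    """Find the smallest group of nodes whose combined capabilities
    cover the task's requirements; return None if no group can."""
    required = set(required)
    for size in range(1, len(nodes) + 1):
        for group in itertools.combinations(nodes, size):
            combined = set().union(*(n.capabilities for n in group))
            if required <= combined:
                return list(group)
    return None

# A transient "computer" snaps into existence around a task...
nodes = [
    Node("thermostat", {"sense_temp"}),
    Node("phone", {"display", "network"}),
    Node("cam", {"sense_image"}),
]
coalition = form_coalition(nodes, {"sense_temp", "display"})
print(sorted(n.name for n in coalition))  # ['phone', 'thermostat']
```

In a real swarm there is no central matchmaker to run `form_coalition`; the hard, unsolved part of the problem described above is getting equivalent behavior to emerge from nodes negotiating only with their neighbors.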
Progress in every human endeavor has been stalled by the complexity wall that stands between computing's past and its future. Designing our way towards an architectural solution to the swarm computing problem must be made humanity's central, shared, and most pressing goal. Until we meet this challenge, evolution in all human endeavors will remain stalled. As happened with the introduction of electrical power transmission, industrial agriculture, the internal combustion engine, and electronic communication, on the other side of the complexity wall waits an increase in global productivity that will dwarf all previous epochs.
Computing hasn't even begun.
Conversion of chemical to radiative heat...
What determines the rate at which chemical heat (convection) becomes electromagnetic heat (radiation)? I am trying to get a handle on the Earth's energy budget. Obviously, heat is being transferred in-system through convection... but what is the process that converts this chemical heat (Brownian motion that needs a material medium) to photonic heat (that can leave the planet and travel through space)? What factors limit this phase transition? In terms of total energy transfer, how can one determine the delta between the rates of transfer (comparing convection and radiation) when a system has both? What are the limits of efficiency when comparing chemical and radiative energy transfer? I know that convection is restricted by the sound limit, so I assume that radiative heat is likewise restricted by the speed of light. However, because light doesn't seem to interact with other light, how dense does light have to be before we see dramatic self-limiting effects? I am guessing that E=mc^2 will answer my question, as dense light becomes matter, which provides the upper limit? All of my questions are motivated by a desire to understand the base physics that underlies and predicts the Earth's energy budget and how these natural systems are affected by greenhouse gases and human energy conversion (releasing heat), especially with regard to the ratio of rate of change vs. dissipative capacity of the natural systems.
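For the radiative half of the question, the governing relation is the Stefan–Boltzmann law: a surface at absolute temperature T radiates power P = εσAT⁴, so the rate at which thermal (molecular) energy is converted into outgoing photons at a boundary is set by that boundary's temperature and emissivity, not by the speed of light. A quick numerical illustration of how this fixes Earth's energy budget (the solar constant and albedo below are standard textbook values, not from the post):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(emissivity, area_m2, temp_k):
    """Power radiated by a surface, per the Stefan-Boltzmann law."""
    return emissivity * SIGMA * area_m2 * temp_k ** 4

# Earth's effective radiating temperature: in equilibrium, absorbed
# sunlight equals emitted infrared.
S = 1361.0      # W/m^2, solar constant at Earth's orbit
albedo = 0.3    # fraction of sunlight reflected, not absorbed
absorbed_per_m2 = S * (1 - albedo) / 4   # averaged over the sphere
T_eff = (absorbed_per_m2 / SIGMA) ** 0.25
print(round(T_eff))  # ~255 K
```

The ~255 K result is the planet's radiating temperature as seen from space; the surface is warmer (~288 K) because greenhouse gases intercept outgoing infrared, which is exactly the lever the post's last sentence asks about.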
The debate's over. There are five points in the consensus. No. 1: Global warming is real. No. 2: We human beings are mainly responsible. No. 3: Consequences are very bad. No. 4: We need to fix it quickly. And No. 5: It's not too late.
Researchers have tried to measure just how solid this consensus is. They conducted a study of 1,000 articles published in peer-reviewed journals between 1988 and 2003. Of the 1,000, 0 percent (or exactly 0) questioned the consensus around global warming. Yet a comparable study of articles published in popular news media found over 53 percent questioned that consensus.
The difference is accounted for by the extraordinary effort by oil companies and the like to fund and spread the results of junk science, questioning global warming in a manner that threw certain views into doubt. The result was political cover for the Republican Party's campaign of Global Warming Denial.
As strategist Frank Luntz put it in a memo to Republican leaders,
"Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue in the debate."
The list could go on—for a very long time. None of these questions are rocket science. Not all are esoteric matters about the regulation of culture. Indeed, some are among the most important public policy questions government considers. Yet all these, despite the ease, government got wrong. And wrong in a predictable way—the product of a dependency tied to money. Among the reasons for reform, this certainly reaches quite high.
This is an excerpt from an essay on the dangers of monetary influence on our government representatives by Stanford University copyright law professor Lawrence Lessig (Independence 2.0, MetroActive, 8/6/08).