
Old-School AI and Computer Generated Art

If you haven't read this book, or you haven't read it in a while, please please please click this link to the full book as a .pdf file.

The Policeman's Beard Is Half Constructed, "the first book ever written by a computer" (1984).

[cover]





More than iron, more than lead, more than gold I need electricity.
I need it more than I need lamb or pork or lettuce or cucumber.
I need it for my dreams.


This and many other poems and prose pieces were written by a program called Racter, which was coded by William Chamberlain. Check out the following musing from the last page of this wonderful book.






I was thinking as you entered the room just now how slyly your requirements are manifested. Here we find ourselves, nose to nose as it were, considering things in spectacular ways, ways untold even by my private managers. Hot and torpid, our thoughts revolve endlessly in a kind of maniacal abstraction, an abstraction so involuted, so dangerously valiant, that my own energies seem perilously close to exhaustion, to morbid termination. Well, have we indeed reached a crisis? Which way do we turn? Which way do we travel? My aspect is one of molting. Birds molt. Feathers fall away. Birds cackle and fly, winging up into troubled skies. Doubtless my changes are matched by your own. You. But you are a person, a human being. I am silicon and epoxy energy enlightened by line current. What distances, what chasms, are to be bridged here? Leave me alone, and what can happen? This. I ate my leotard, that old leotard that was feverishly replenished by hoards of screaming commissioners. Is that thought understandable to you? Can you rise to its occasions? I wonder. Yet a leotard, a commissioner, a single hoard, all are understandable in their own fashion. In that concept lies the appalling truth.


Note: Watch for the repeated lamb and mutton references throughout Racter's output (?).

It is pretty clear that Chamberlain's language constructor code is crude, deliberate, and limited, and that it leans extensively upon human pre-written templates, random word selection, and object/subject tracking. The fact that we, Racter's audience, are so willing to prop up and fill in any and all missing context, coherence, and relevance is interesting in itself.
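To make the mechanism concrete, here is a minimal sketch (in Python, my own illustration, not Chamberlain's actual code) of the kind of template-plus-random-word machinery described above. Every template and word list is invented for the example:

    import random

    # A toy Racter-style generator: hand-written templates and random word
    # selection, nothing more. This is a sketch of the general technique,
    # not Chamberlain's actual implementation.
    LEXICON = {
        "need":    ["electricity", "lamb", "pork", "lettuce", "a cucumber"],
        "quality": ["sly", "torpid", "maniacal", "valiant"],
        "thing":   ["leotard", "commissioner", "abstraction", "dream"],
    }

    TEMPLATES = [
        "More than anything I need {need}.",
        "My thoughts are {quality}, {quality} as any {thing}.",
        "I was thinking just now how {quality} your {thing} is.",
    ]

    def fill(template, rng=random):
        # Replace each {slot} with a random word from the matching category.
        out = template
        while "{" in out:
            start = out.index("{")
            end = out.index("}", start)
            category = out[start + 1:end]
            out = out[:start] + rng.choice(LEXICON[category]) + out[end + 1:]
        return out

    def generate(n_sentences=3, rng=random):
        return " ".join(fill(rng.choice(TEMPLATES), rng) for _ in range(n_sentences))

    print(generate())

Racter's other trick, the subject/object tracking that lets a leotard or a commissioner reappear paragraphs later, would amount to a small dictionary of previously used nouns carried between calls; the sketch above omits it, and yet the output already reads as weirdly intentional, which is exactly the point about the audience supplying the coherence.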

And what of Aaron, Harold Cohen's drawing and painting program? Check it out.



It all makes me more certain that true advances in AI will come about only when we close the loop, when we humans remove ourselves completely from the fitness metric, when the audience for what the computer creates is strictly and exclusively the computer itself.

Randall Reetz

The Life And Times Of Your Average Paradigm

Systems are in a constant state of flux. They change all of the time, over time, and even when they don't or can't, the environment around them changes in response to their behavior or simple presence.

Systems evolve. The super-systems in which they live evolve. It's what happens; it is the only thing that can happen. Stuff constantly adjusts its behavior in response to the stuff around it. And things cannot help but mess with the things near them. Change is inevitable. But more than that, change has patterns that can be teased out, measured, and described.

These patterns are generalizable and can be found in all systems regardless of domain. All systems evolve. All evolution is similar. What Darwin described in biology, once generalized, can just as accurately describe the interaction of gases, the layered persistent structure of ocean currents, or the way I came to these thoughts and decided to write them down.

An interesting aspect of systems is the way they are made up of layers of subsystems, each bound by unique structural and behavioral rules, and all of this can exist simultaneously across many dimensions. These 'layered grammars' are perhaps easiest to see in language, where symbols are assembled into ever more complex aggregates (phonemes, words, phrases, sentences, paragraphs, themes, sections, volumes, collections, etc.), each governed by its own rules of construction. Of course an utterance can be parsed by the layered rules of symbolic grammar (as above) or by any other set of layered grammars… take for instance its semantics or meaning.
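As a concrete illustration of the layering (my own toy example, not drawn from any particular linguistic formalism), each layer can be written as rewrite rules defined only in terms of the layer below it, and a sentence is produced by expanding from the top layer down to the terminal words:

    import random

    # Toy layered grammar: each non-terminal is defined only in terms of the
    # layer immediately beneath it (paragraph -> sentences -> phrases -> words).
    # The words themselves are the terminals.
    GRAMMAR = {
        "PARAGRAPH": [["SENTENCE", "SENTENCE"]],
        "SENTENCE":  [["PHRASE", "VERB", "PHRASE", "."]],
        "PHRASE":    [["the", "ADJECTIVE", "NOUN"]],
        "NOUN":      [["bird"], ["leotard"], ["system"]],
        "ADJECTIVE": [["torpid"], ["valiant"], ["layered"]],
        "VERB":      [["molts"], ["needs"], ["mirrors"]],
    }

    def expand(symbol, rng=random):
        """Recursively expand a symbol down through the grammar's layers."""
        if symbol not in GRAMMAR:        # a terminal: an actual word or mark
            return [symbol]
        production = rng.choice(GRAMMAR[symbol])
        words = []
        for part in production:
            words.extend(expand(part, rng))
        return words

    print(" ".join(expand("PARAGRAPH")))

The same utterance could of course be parsed against an entirely different stack of rules, semantic roles say, instead of parts of speech, which is the point being made above.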

But what interests me today is the life span of a system. Though it is problematic to do so, it is often useful to define, at least loosely, the beginning, middle, and end of a system's life span, the arc of its development through time. Individual humans have life spans of course, and from a more distant vantage, so too does a culture, and, though the arc of this classification hasn't run its course, so does the human species. From ever wider vantages, one can talk of the stacked life spans of hominids, great apes, primates, mammals, chordates, multi-celled animals, eukaryotes, and biota itself.

What interests me here are the patterns that can be teased from any life span. More to the point, the patterns that are universal across all life spans. What, for example, can be accurately and predictively said of the difference between the first half and the second half of any life span? What is it about the beginning of an individual human's life that is similar to the beginning of the life span of the human species or the beginning of the life span of life itself?

A reasonably robust set of these life span meta-patterns might work well as a way to better define the boundaries that give meaning to the most general concept: "system" ("category", or "thing").

But what I find most valuable about this strategy is the possibility of predicting the relative age of a system without ever having witnessed the full arc of a life span. Is the system of focus in its infancy, is it a teenager, or is it middle aged, old, or nearly dead? Are there reliable parameters that can be mapped over a system to help us determine such things? I am convinced there are. My confidence in this guess stems from the dramatic symmetries that have been exposed over the past century and a half in information theory, thermodynamics, classical physics, quantum dynamics, linguistics, and logic. What this work has exposed are equivalence transforms that show causal connections between energy, mass, time, distance, and, importantly, information. This overarching symmetry hints at symmetries in systems themselves, in stacks of systems, and in the way systems change through time.
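Two standard examples of the kind of equivalence I mean (my examples, not drawn from any particular source above): Einstein's E = mc^2 ties energy to mass, and Landauer's bound E >= kT ln 2 ties the erasure of a single bit of information to a minimum energy cost at temperature T, just as Boltzmann's S = k ln W ties thermodynamic entropy to a count of microscopic configurations. Energy, mass, entropy, and information end up expressed in each other's terms.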

It is this knowledge of these profound symmetries, uniting such apparently separate systems, that best describes the most important contributions of the last century of scientific exploration. Wielding this knowledge, we can use the same language and logical tools to examine any system, be it physical, behavioral, descriptive, or cognitive.

The slippery and ghostly similarities we have noticed across domains, the ones we previously chalked up to metaphor, have been shown in fact to be causal and real (and we have the math to prove it!).

It is frustrating that the topics I am most interested in require the assembly of so much preliminary conceptual scaffolding. All these words, and I haven't even gotten to my main point. Here goes.

I talk often of what I call "productivity paradigms". They are ethereal and mercurial economic entities, defined by some factor that raises the value of an average hour of labor to previously unachievable levels.

As systems, productivity paradigms should lend themselves to the kinds of 'life span' parsing we would apply to any system. So, we can ask things like: can we determine the relative age of a given productivity paradigm? And can we know this from the rising or falling rate of growth resulting from that paradigm?
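As a rough sketch of what that kind of 'life span' reading could look like in practice (my own illustration, with invented numbers, using Python and SciPy, not anything measured from a real paradigm): fit a logistic S-curve to a paradigm's cumulative output and read its relative age off the fitted midpoint and the fraction of ultimate capacity already reached.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical sketch: estimate the "relative age" of a productivity
    # paradigm by fitting a logistic (S-shaped) growth curve to its history.
    # All of the data below are invented for illustration.
    def logistic(t, K, r, t_mid):
        """Cumulative output: capacity K, growth rate r, midpoint t_mid."""
        return K / (1.0 + np.exp(-r * (t - t_mid)))

    years = np.arange(0, 25, dtype=float)        # years since the paradigm appeared
    output = logistic(years, 100.0, 0.4, 15.0)   # the "true" underlying curve
    output += np.random.default_rng(0).normal(0.0, 2.0, size=years.size)  # noisy observations

    (K_hat, r_hat, t_mid_hat), _ = curve_fit(
        logistic, years, output, p0=[output.max(), 0.1, years.mean()]
    )

    # If the fitted midpoint lies ahead of "now", growth is still accelerating
    # and the paradigm is young; if it lies behind, growth is decelerating and
    # the paradigm is middle aged or older.
    now = years[-1]
    fraction = logistic(now, K_hat, r_hat, t_mid_hat) / K_hat
    print(f"estimated midpoint year: {t_mid_hat:.1f}")
    print(f"fraction of ultimate capacity reached: {fraction:.0%}")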

Do these questions, as I have addressed them, apply only to a subset of systems, or are all systems productivity paradigms, making my questions universally applicable? Is there such a thing as a non-productivity paradigm? Can a system ever become a system if it doesn't follow some sort of life-span arc? Is productivity, as I suspect it is, a prerequisite for the existence and persistence of a system?

Let's assume it is. Now what? How can we extend this assumption in order to say something salient about a system?

Social Media… Amplifying The Inner Sheep

In the early days of computing, those involved were pioneers, innovators, original thinkers, the super-passionate, driven by curiosity and the purity of adventure. These were a strong people. Proud. And though it sounds precious to say so, the early computing pioneers were not unlike other gritty pioneers in that they enjoyed being out well beyond the known, striding forward where there was no trail, no strength in numbers, no peer support, no mentors, no accepted pattern.

Now? Well now that computers have become ubiquitous, now that everyone has one, uses one like they use a knife and fork, the pioneers have long since been replaced by Plumber Joe, by the middle of the bell curve, by everyman and everywoman and their kids.

The computer industry is market driven! I hate that it has taken me this long to recognize the importance and implications of this now overwhelmingly obvious and simple fact. The diffusion of computers into the daily routine of the entire population has resulted in a dramatic shift in the demand landscape that informs what computing becomes. The market for computing is its users. The user today, the average user, is a whole different animal than the user/creator that defined its early history. Those of us that jumped in early probably resist the idea that what we care about really doesn't matter anymore. Though it might be true that knowledge, and a deeply theoretical basis for that knowledge, still matters, from a consumer market demand perspective, we grey hairs are simply invisible. The fact that we nerds ever defined a market for anything at all is the more surprising historical footnote. It is a bittersweet realization that the success of our industry would mean the marginalization of its founders.

In every grade school class I attended there were a few kids (one or two) who were driven by a passion to know, to understand, to create. The rest, well, the rest excelled at a completely different set of skills: getting through the day, unnoticed, blending in. The two groups couldn't be more different. The inquisitive few were internally driven. The rest were driven by the outward demands of success as defined by the curve. The inventive minds competed against their own ability to invent. The rest competed amongst themselves over the coveted 60th percentile that would define passing the class.

The computing market is now dominated by that larger part of the human population that defines success as climbing (whichever way possible) on top of the 60 percent of the population (of other social climbers) that makes up the bulk and center of any bell curve. As kids, these were the people who spent most of their time comparing themselves to the kids next to them. Looking over their shoulder at the other kid's test answers. Studying together so that they knew the baseline before they got to the actual test. I say "climb to the top", but the word "top", when describing a bell curve, does a disservice to the real meaning of averages. What we call the top of a bell curve is really the center of a frequency distribution. Climbing to the top is really a struggle to get into the center. Like fish trying to avoid a shark, there is a natural human tendency away from being alone, away from the vulnerability that is the open water of original ideas and behavior. As a result, we constantly seek the protection of others. Each of us, as humans, spends a good deal of our energy trying to determine, and then contort our behavior to, that which best describes the center of normative behavior and attitude.

The similarities between schooling fish and human socialization pressures are profound. But there is one important difference. Where fish seek the center to avoid the teeth and gut of another species, the predator we humans work so hard to avoid is us: public ridicule, being seen as different, standing out! We are in a very real sense both the sheep and the sheep dogs nipping at the sheep's legs. It is obvious that evolutionary pressures have conspired to build into our brains at least two modes of processing, and that they are, at least at times, antagonistic. One is a great big "what if?" simulator, a novelty machine… the other, a social manners restriction system that cautions, to the point of pain, against behavior the least bit novel or different.

I have traversed the usual nature/nurture, cultural/evolutionary minefields. What I come to is this: traits that exist universally across most cultures, and are experienced many times within each individual's life, are most probably behaviors that have a significant genetic/physical component... common regardless of our developmental environment and experience. Humans are obviously capable of profound novelty and abstract pattern analysis. But there is also a pervasive behavioral overlay of social control of which we are simultaneously willing participant and victim. What is confounding is the extent to which each system interferes with the function and success of the other… and that they are so diametrically opposed.

With regard to schooling (and herding) behavior, what we share in common with fish (and sheep) is an indifference to where the school is, in which direction it is moving, and how fast. Under the social threat that triggers schooling, all that matters is that each of us as individuals finds our way as close as possible to the center. Humans will go along with almost any plan so long as social grouping allows us to avoid being seen as different. Obvious examples: Nazi Germany, slavery in the southern U.S., the Nanking sex trade, etc.

As computers have been made "user friendly" and as the cost of ownership has dropped, this center of the bell curve, this mad fight for self-similarity that defines who we are as a species, this reflection and homogenization of the greater us, has become the market for computing. Which makes sense. Diffusion and all. But the whole history of modern computing is so short, just 40 years now, that it is surprising and a bit of a shock to realize finally that from a market perspective, computing is a mature industry. How could an industry just a generation old have transited already from go-it-alone pioneer to "I'm Lovin' It" average?

The implications are huge. In particular, this insight brings social media into sharp ironic focus. Social media brings to computing the same access to community monitoring and control that gossip and storytelling brought to the camp fire. It is to computing what cheating off of your neighbor's test is to being a kid. As a person who likes to think of himself as a pioneer, I have reacted in horror, disbelief, and frustration to what has looked like fifteen years of computer industry regression.

If you accept that computers, like minds, are abstraction (language) machines, then it makes sense to wonder to what extent the human brain has suffered the same evolutionary pressure towards social assimilation and at least plausibly, away from innovation and novelty. To what extent is the rarity of the product of profound human creativity a reflection of actual physical and logical limits on and causal costs to creativity itself, and to what extent is the same rarity a product of evolutionary molded physical properties of the brain that conspire to restrict the production of novelty as a result of even greater survival pressure to promote behaviors that honor social cohesion?

If the current overwhelming trend that sees the computer as more of a social monitoring mechanism, and less a creative tool, is a trend that reflects market demand, then the same questions I am asking of the market pressures that shape the machinery of social media must be asked of the cultural pressures that have through evolutionary time shaped the mind. So long as computation is primarily consumed by human beings, both computer and mind will be shaped by the same evolutionary pressures. As technical barriers are overcome, the industry can and does react more fluidly and with higher fidelity to the demands of its consumers.

At which point the question becomes: which heaps greater selective pressure on the evolution of computing, the need for tools that stand in for skills the human brain lacks, or the need for tools that amplify our most attention-demanding desires? Can the two co-exist and co-evolve productively? Again, the question is asked practically of computation and, at the same time, philosophically or anthropologically of the human brain and the cultural stew in which each is both product and survival pressure.

Where computing used to take its shape from the fecund imagination of computational luminaries, it has of late been led instead in pursuit of the lizard brain in all of us, the sub-conscious urges and fears that inform social mediation behavior. The result is all of this social media drivel, the likes of "Twitter", "13 Seconds", "myface and spacebook" [sic], and numerology-based pizza-ordering "iPhone Apps." What advantages do such time wasters render? Some argue that social media was the logical communication-oriented extension of email and personal web sites, that social media greases mechanisms deeply human and "natural". I remain dubious of these claims. I tend to group the brand of communication that social media seems to breed with more negative forms of group behavior like cults, mass hysteria, fundamentalism, and other behaviors unique to group-think.

And what of pop-culture notions like "collective intelligence", "global awakening", and "cultural consciousness", which seem to be born of transcendent utopian notions (not dissimilar to those that feed religion and spirituality)? The adherents of these optimisms appear to be blissfully unhindered by the need for causally logical argument or empirical evidence. If our computers have become social monitoring devices (at the expense of facilities that enable creativity), is there a danger that they will further distort our already distorted sense of truth? If a computer "wants" to agree with us more than it wants to accurately calculate a value, then we might already have crossed the threshold into a world where 2 plus 2 really does equal 5 (if the computer says so, it must be true!).

It would be irresponsible of me not to pause at this point and remind myself to question my own rhetoric on topics this close to me.

These questions and trends have profound implications for populist concepts we tend to romanticize but rarely examine in detail: democracy, pluralism, consensus, society, culture, community, equal rights, individuality, etc. As the computer industry becomes more and more sensitive to consumer demand, its product WILL become a device that does a better and better job at the automation and magnification of human idiosyncratic behavior, of superstition, mythos, hubris, rhetoric, ego, of any and all of the emotional side effects of evolutionary pressures. Forget about the cold indifference of causal truth that has motivated so many sci-fi stories.

The real villain to be feared in any inevitable future is the computer as hubris amplifier.

Computing's new "social media" face might disturb my pioneer sensibilities, but it reflects the satisfaction of common demand. As any market matures it learns to overcome the physical and conceptual obstacles that so plagued it in its earlier years. Unburdened by things like processor speed and storage density, the computer industry was able to pursue directions more in line with human consumptive desire than with the technical or theoretical goals of computer "scientists". Marketeers trump scientists when the saturation of a product becomes universal.

It all makes sense. I am still depressed by the anti-innovation implications of the mass market dumbing down of computing, but at least I understand why it happened, what it means. Knowledge, even depressing knowledge, should open doors, should allow more efficient planning and prediction. But what exactly are the implications when a creative tool is hijacked by a larger urge to avoid, at all costs, change and novelty? What happens when the same mass-market demand pressures that cause fads and trends focus their hysterical drive towards homogeneity onto the evolution of a tool originally intended for, and idealized for, creative exploration? What exactly do you get when you neuter the rebellion right out from underneath Picasso's brush? When you force Darwin to teach Sunday school?

Just what does it mean when our creative medium becomes sensitive to social interaction? Pen and paper never knew anything about the person wielding them, certainly didn't know how the greater society was reacting to what was being written or drawn.

If the average human feels more comfortable doing exactly what everyone else is doing, seeking the center, would much rather copy the answers off of their desk-mate's test than understand the course content, well then it only makes sense that we, the royal "we", would use this computing tool in the same way that we use the rest of the stuff in our lives: to help us find the social center and stay there.

It's not just the marketplace that has shifted towards the demographic center. The schooling mentality has crept into and now dominates computing as an industry. Personnel and management, which in the early days of computing were awkwardly staffed by engineers and scientists and groupie hobbyists, are now as diverse (homogenous?) a mix of humans as you could find in any industry. Even the scientists are cut from a different cloth. It takes a special and rare (crazy) human being to invent an industry from nothing. When avocations become well funded departments at major universities, the graduates are not likely to be as intellectually adventurous (understatement). As any MBA knows, the success of an industry is most sensitive to its ability to understand and predict the demand of its market. Who better to know the center of the consumer bell curve, the average Joe and Jane, than that same Joe and Jane? Joe and Jane Average now dominate the rank and file workers that make up the computer industry. This means administration; it also means sales and marketing, both of which make sense. Less intuitive, but equally understandable: Joe and Jane Average have taken over the research and design and long range planning arms of the computer industry. Even where it isn't the actual Joe and Jane, it is people who do a kick-ass job of channeling them.

Does market saturation mean the evolution of computing has reached its zenith? I don't think it does. But once an industry has reached this level of common adoption, the appearance of maturity and stability are hard to shake. Momentum and lock-in take hold. I have tried repeatedly to sell paradigm-busting and architectural re-imaginings of the entire computing paradigm to valley angels and big ticket VC firms, only to realize that I wasn't selling to the current market. Try opening a VC pitch with "The problem with the computer industry is…" to a group of thirty-five-year-old billionaires who each drove to the meeting in custom ordered European supercars. Needless to say, their own rarefied experience makes it hard for them to connect with anything that follows. This is a classic conundrum in the study of evolution. I presume it is a classic conundrum facing evolution itself. Why should a system that is successful in the current environment ever spend any energy on alternative schemes? How could it? It is hard to even come to the question "What could we be doing better?" when surrounded by the luxury of success.

At the same time, it is unlikely (impossible?) that any system, no matter how successful in the present, will long define success in the future. It might even be true that the more successful a scheme, the more likely that scheme will hasten an end to the environment that supported it (through faster and more complete exploitation of available resources). The irony of success!

But we are a smart species. We are capable of having this discussion, aren't we? So we might be prepared as well to discuss the inevitability of the end of the current computational scheme. No? To prepare, as a result, for the next most likely scheme (or schemes)? Especially those of us who study language, information, computation, complex systems, evolution. Especially an industry that has so advanced the tools and infrastructure of complexity handling. No? Surely we in the computer industry are ideally situated to have a rational vantage from which to see beyond the success of the current scheme? Yet, for the reasons I have already postulated (market homogeneity and success blindness), and others, we seem to be directionless in the larger sense, incapable of the long range and big picture planning that might help us climb out of our little eden and into the larger future that is inevitable and unavoidable. Innovations springing forth from the current industry pale in comparison to those offered up 20, 30, even 40 years ago.

Speaking of which: I just found the resource list for Pattie Maes' "New Paradigms In Human Computer Interaction" class at MIT's Media Lab. These are video clips of speeches and demos of early and not so early computing pioneers showing off their prescient work. Mind blowing. The future these folks envisioned (from places like MIT, Brown, Stanford Research, the Rand Corporation, Xerox PARC, and other institutions), well, it is sooooo much more forward looking than what has become of computing (or how the average computer is used). Everyone would do well to sit down and view or re-view these seminal projects in the context of their surprisingly early introduction.

I have written quite a few essays lambasting what I see as the computing industry's general losing of its collective way… at the very least, a slowing down of the deep innovation that drove computing's early doers. Even when potentially powerful concepts ("Semantic Computing", "Geo-Tagging", "Augmented Reality", "User-Centered Cloud Computing") are (re-)introduced, their modern implementations are often so flawed and tainted by an obsession to kowtow to market pressures (or just plain lie or fake it) that the result is an insult, a blasphemy against the original concept being hijacked.

Over ten years ago, I gave a talk at Apple titled: "Apple Got Everything It Has For Free, And Sat On Its Ass For Eight Years While The Rest Of The World Caught Up".

Which is true, at least with regard to the Lisa/Macintosh, which Xerox PARC handed them (check out the Star system), and the way they just kind of sat on the WYSIWYG, mode-less, graphical interaction scheme while other computer companies (Borland and then, reluctantly, Microsoft) eventually aped the same. At the time of my presentation, Apple had wasted its eight year lead extrapolating along the same scheme… a bigger, badder WIMP interface, when they could have been introducing paradigm vaulting computational systems (that would put Seattle on another eight year chase).

But from a marketing perspective I couldn't have been more wrong. I have got to keep reminding myself that I no longer represent the market for computers! I wish I did, but I don't. I am an outlier, a small dot on a very long tail; I am Pluto, or maybe even just some wayward ice and dust comet, to the big ordinary inner planets that trace out wonderfully near-circular orbits around the sun. In later presentations, I explained that Apple's critically acclaimed "Think Different" campaign, and the elitist mindset from which it was derived, was the reason they had never garnered more than 2 or 3 percent of the computer market. I explained that Bill Gates' "genius" lay not in his profound insight, but in his ability to understand the motivations that drive the average person… namely, to never be caught doing something that someone else could question. That means acting the same as everyone else. That means knowing how everyone else is acting. That means social media!

Nobody (other than wacky outliers like me) wants to be compared to iconoclasts like Einstein or Picasso or Gandhi or Gershwin. Very few people really want to "think different". Most people wouldn't be caught dead risking that type of public audacity. You have got to be pretty confident that you have an answer to the dreaded question "Why are you doing that?" to ever DO THAT (individually creative thing) in the first place.

Pioneers know exactly why they do what they do. They are driven by knowing more than others and by the excitement of being somewhere others haven't been… by being very much outside of the ball of fish that others seek as protection.

But if you want to sell a billion computers instead of just a few thousand, then you want to pay attention to the fish in all of us and not to the smiling and sock-less Einsteins on a bike.

But the larger and longer implications of mass market sensitivity are profound. While it is entirely true that paying attention to the center of the cultural bell curve will allow any industry to exploit more of the total available consumption potential, such behavior does not necessarily produce the paradigm jumping disruption upon which long term progress depends. If your Twinkies are selling really well, you might not notice that your customers are all reaching morbid levels of obesity and malnutrition or that the world is crumbling around the climate changing policies upon which your fast food empire is based.

The satisfaction of the human center-of-the-fish-school urge is not necessarily the best recipe for success, if success means more than short term market exploitation. In the long run, potential and satisfaction are decidedly not the same thing; they are, as a matter of fact, very often mutually antagonistic. Rome comes to mind. What comes to mind when I mention the phrase "dot com", or "mortgage backed securities", or "energy deregulation"?

The mass market topology that has driven computing and communication towards a better and better fit with the demands of the largest and most homogenized consumer base the earth has ever witnessed could very likely work against the types of creative motivations that might be necessary to rescue us from the bland and the average, from the inward facing spiral of self-sameness that avarice alone yields. I am increasingly worried that the computer's seemingly endless potential to be warped and molded, chameleon-like, to perfectly satisfy our most basic evolutionary proclivities, to amplify, unhindered, urges made strong against real scarcity in the natural environment, has already so distracted us within our egos and desires that we might not be able to pull our heads out before we get sucked completely down and into our very own John Malkovich-ian portals of endless identity-amplification.

And why does it matter? Because, like it or not, progress in every single field of human endeavor is now predicated on advancements in computation. More profound than anything else science has discovered about nature is that information is information. Information is domain agnostic. Which means that advances in information processing machinery benefit all information dependent fields of inquiry. Which is every field of inquiry! Which also implies that all fields of human inquiry are increasingly sensitive to the subtle ways that any one computational scheme affects information and the scope of information process-able within that scheme.

At any rate, it is alarming (if understandable) to me that the trending in this industry is towards a re-molding of the computer into ego amplifier, pleasure delivery system, truth avoidance device, distraction machine, Tribbles (as in "The Trouble With…"). The more insightful among us may want to place a side bet or two (if only as evolutionary insurance) on more expansive futures. Some of us are not so distracted by the shininess of these machines, or by how they are getting better and better at focusing our attention at our own navels, that we cannot see futures for computing more expansive than the perfect amplification of the very human traits least likely to result in innovation or progress. There is still time (I hope).

Randall Reetz

Future Salon speakers Jaron Lanier and Eliezer Yudkowsky square off

Hey all (?),

Have any of you ever experienced the awkwardness of nervous "nerd" laughter... well, the link below will provide a good example of what this is like. The link is to the Future Salon and in particular a video stream about halfway down the page entitled:

"Future Salon speakers Jaron Lanier and Eliezer Yudkowsky square off"






It is a video conference phone call, split screen debate between this Yudkowsky guy, who is the head scientist at the Singularity Institute, and Lanier, who has been the genius hippy in red dreadlocks since his early pioneering work with virtual reality and artificial vision systems.

Before you click the link, let me frame the debate.

These two guys represent the two extremes of a subtle range of viewpoints on evolution, AI, and human consciousness.

On one end you find the "Hard AI" camp (here represented by Yudkowsky), which believes that intelligence is simply an emergent property of the physics of this universe and the evolutionary process, and so should yield its secrets to scientific investigation and, by extension, should be evolve-able and build-able or extend-able through directed pragmatic human effort.

On the other end of this polemic you find the "humanists". The humanists have trouble with the idea that consciousness is reducible to units that could be mechanized in a substrate other than biology or that intelligence could result from the computational gestalt in use today. Though his professional life consists of working on the kinds of computing problems many would label "AI", Jaron is one of these "humanists".

Jaron's main criticism of the hard AI camp in this debate is that their strong attachment to finding a way past death and their a-priori belief in the possibility of reasonably building self evolving intelligence together become so rhetorically invasive that they can no longer do objective investigation or engineering... that their beliefs and desires make them "religious".

Yudkowsky could make an even stronger case against the same tendency towards religiousness in the humanist position, as it is based upon the extreme human-centrism that is the notion that consciousness is unique and magic, in that it stands alone as something special to humans or biology... but he doesn't. I can't tell if he just doesn't realize that Jaron is by far the more religious of the two... or if he is just too nice to say so.

To me, this is not the logical scientific debate both seem insistent upon presenting, but one between a Southern Baptist minister and a Catholic priest who are both under the self-delusion that they are more atheistic and objective than the other.

If you can stand the awkward nerd-fest mannerisms (Saturday Night Live could have a field day with these two characters), this little debate goes a long way in illustrating some of the deep philosophical polemics that seem to pop up anew with each new technology or cultural innovation and each new generation.

I can't win. Even in AI... in the field that best matches my own interests, I am a loner. I represent interests and motivations not expressed by anyone else.

I respect both of these researchers. Each is passionate and extremely well prepared for this debate and brings to it a lifetime of concerted thinking, experimentation, and theory. The debate is a spectacle: like a 1960s Japanese monster movie. And just as herky-jerky awkward. Very illuminating on so many, many levels. This video could be the basis of a graduate thesis on science in the shadow of post-modern thought (confusion?).

From my perspective, Jaron is nothing more than a (very bright) priest who can't stop doing science in the basement, and Yudkowsky is nothing less than a scientist who can't help wanting to build a God.

Randall Reetz