What is really going on in cognition, thinking, intelligence, processing?
At base, cognition is two things:
1. Physical storage of an abstraction
2. Processing across that abstraction
Key to an understanding of cognition of any kind is persistence. An abstraction must be physical and it must be stable. In this case, stability means, at minimum, the structural resistance necessary to allow processing without that processing unduly changing the data's original order or structural layout.
The causal constraints and limits of both systems, abstraction and processing, must work such that neither prohibits nor destroys the other.
Riding on top of this abstraction storage/processing dance is the necessity for a cognition system to be energy agnostic with regard to syntactic mapping. This means that it shouldn't take more energy to store and process the string "I ate my lunch" than it takes to store and process the string "I ate my house".
Syntactic mapping (abstraction storage) and walking those maps (abstraction processing) must be energy agnostic. The abstraction space must be topologically flat with respect to the energy necessary to both store and process.
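To make the flatness claim concrete, here is a minimal sketch in Python. It treats byte count as a crude stand-in for storage energy and a single pass over the characters as a stand-in for processing effort; both are illustrative assumptions, not a model of neural storage:

```python
# Toy illustration of an "energy flat" abstraction store:
# a digital medium charges the same to hold an absurd sentence
# as it does to hold a sensible one of the same length.

def storage_cost(s: str) -> int:
    """Bytes needed to hold the string (stand-in for storage energy)."""
    return len(s.encode("utf-8"))

def processing_cost(s: str) -> int:
    """Steps taken in one pass over the string (stand-in for processing energy)."""
    return sum(1 for _ in s)

sensible = "I ate my lunch"
absurd = "I ate my house"

# Validity makes no difference to the cost of storing or walking the map.
assert storage_cost(sensible) == storage_cost(absurd)
assert processing_cost(sensible) == processing_cost(absurd)
```

The same holds for any medium whose cost depends only on the size and traversal of a representation, never on its truth; validity has to be recovered by some other mechanism.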
Thermodynamically, such a system allows maximum variability and novelty at minimum cost.
Playing out "what ifs" at a safe distance… simulations, virtualizations of events and situations which would, in actuality, result in huge and direct consequences… this is the great advantage of any abstraction system. A powerful cognition system is one that can propagate endless variations on a theme, and do so at low energy cost.
And yet. And yet… syntactical topological flatness carries its own obvious disadvantages. If it takes no more energy to write and read "I ate my house" than it does to write or process the statement "I ate my lunch", how does one go about measuring validity in an abstraction? How does one store and process the very necessary topological inequality that leads to semantic landscapes… to causal distinction?
The flexibility necessary in an optimal syntactic system, topological flatness, works against the validity mapping that makes semantics topologically rugged, that gives an abstraction semantic fidelity.
This problem is solved by biology, by mind, through learning. Learning is a physical process. As such it is sensitive to the direction of time. Learning is growth. Growth is directional. Growth is additive. Learning takes aggregate structures from any present and builds super-aggregate structures that can be further aggregated in the next moment.
I will go so far as to suggest that definitions of both evolution and complexity hinge on some metric of a system's capacity to physically abstract salient aspects of the environment in which it is situated. This abstraction might be as complex as experience stored as memory in mind, and it might be as simple as a shape that maximizes (or minimizes) surface area.
A growth system is a system that cannot help but be organized ontologically. A system that is laid up through time is a system that reflects the hierarchy of influence by which its environment is organized. Think of it this way: the strongest forces affecting an environment will overwhelm and wipe out structures based on less energetic forces. Cosmological evolution provides an easy-to-understand example. The heat and pressure right after the big bang allowed only aggregates based on the most powerful forces. Quarks formed first; this lowered the temperature and pressure enough for subatomic particles, then atoms. Once the heat and pressure were low enough, once the environmental energy was less than the relatively weak electrical bonds of chemistry, molecules could precipitate from the atomic soup. The point is that evolved systems (all systems) are morphological ontologies that accurately abstract the energy histories of the environments from which they evolved. The layered grammars that define the shape and structure (and behavior) of any molecule reflect the energy epochs from which they were formed. This is learning. It is exactly the same phenomenon that produces any abstraction and processing system. Mind and molecule, at least with regard to structure (data) and processing (environment), are the result of identical processes, and as a result, will (statistically) represent the energy ontology that is the environment from which they were formed.
It is for this reason that the ontological structure of any growth system is always and necessarily organized semantically. Regardless of domain, if a system grew into existence, an observer can assume overwhelming semantic relevance that differentiates those things that appeared earlier (causally more energetic) from those things that appeared later (causally less energetic).
This is true of all systems. All systems exhibit semantic contingency as a result of growth. Cognition systems included (but not special). The mind (a mind, any mind) is an evolving system. Intelligence evolves over the life span of an individual in the same way that the proclivity towards intelligence evolves over the life span of the species (or deeper). Evolving systems cannot be expressed as equations. If they could, evolution wouldn't be necessary, wouldn't happen. Math-obsessed people have a tendency to confuse the feeling of the concept of pure abstraction with the causal reality of processing (that allows them to experience this confusion).
Just as important, data is only intelligible (process-able, representative, model, abstraction) if it is made of parts in a specific and stable arrangement relative to one another. The zeroth law of computation is that information or data or abstraction must be made of physical parts. The crazies who advocate a "pure math" form of mind or information simply sidestep this most important aspect of information. This is why quantum computing is in reality something completely different than the information-as-ether inclination of the dualists and metaphysics nuts. While it may indeed be true that the universe (any universe) has to, by principle, be describable, abstract-able by a self-consistent system of logic, that is not the same whatsoever as the claim that the universe IS (purely and only) math.
Logic is an abstraction. As such it needs a physical realm in which to hold its concepts as parts in steady and constant and particular relation to each other.
My guess is that we confuse the FEELING of math as ethereal and non-corporeal pure concept with the reality, which of course necessitates both a physical REPRESENTATION (in neural memory or on paper or chip or disc) and a set of physical PROCESSING MACHINERY to crawl it and perform transforms on it.
What feels like "pure math" only FEELS like anything because of the physicality that is our brains as corporeal machinery as they represent and process a very physical entity that IS logic.
We make this mistake all day long. When the only access to reality we have is through our abstraction mechanism, we begin to confuse the theater that is processing with that which is being processed, and ultimately with that which the thing being processed represents.
Some of the things the mind (any mind) processes are abstractions, stand-ins for other external objects and processes. Other things the mind processes only and ever exist in the mind. But that doesn't make them any less physical. Alfred Korzybski is famous for declaring truthfully, "The map is not the territory!" But this statement is not logically similar to the false declaration, "The map is not territory!". Abstractions are always and only physical things. The physics of a map, an abstraction system, a language, a grammar, is rarely the same as the physics of the things that map is meant to represent, but the map always obeys and is consistent with some set of physical causal forces and structures built of them.
What one can say is that abstraction systems are either lossy or they aren't useful as abstraction systems. The point of an abstraction is flexibility and processing efficiency. A map of a mountain range could be built out of rocks and made larger than the original it represents. But that would very much defeat the purpose. On the other hand, one is advised to understand that the tradeoff of the flexibility of an effective map is that a great deal of detail has been excluded.
Yet, again and again, we ourselves, as abstraction machines, confuse the all too important difference between representation and what is represented.
Until we get clear on this, any and all attempts at merely squaring up against the problem of machine intelligence will fail.
[note: this post is a work in progress]
You've met the private school freaks. They can't believe anyone would put their kids in a public school. What with all of the riffraff, the minority students, the dropout rates, the low performance on standardized tests, the poor state of school grounds and facilities, the struggle for funding for special programs like music, art, and sports, the lack of emphasis on college preparation, etc. Man, look at that list! Those are some strong and obvious arguments against public school education. Or are they?
What people who advocate private or limited solutions fail to calculate is what I call the "boutique effect". If, for instance, there isn't enough money in a culture to give everyone a rich and safe childhood and education, it is absolutely to be expected that those who are given those resources will excel (at least when compared to the scores of the less fortunate masses).
However, there is a real societal cost to exceptionality, and this goes to my "solutions that don't scale" thesis. With regard to exclusive schools for the rich, that cost has to do with the fact that the parents of rich kids just happen also to be the people with the most political and economic influence. When their kids are not part of the public education system, neither is their protective passion, experience, knowledge… or influence. They don't care about public education. Why should they? Their kids are not dependent upon it.
From another angle, when money goes to elite institutions, it is not being put where it is needed the most. Rich kids already live in informationally rich environments. Rich kids already live in safe and calm environments where learning works. Rich kids are much more likely to have direct access to positive role models… their environment is chock full of success stories. Rich kids are not as likely to live in families broken apart by drugs and prison terms and gang deaths and violence and people struggling with a second language or a culture that is not in step with the larger population. Rich kids have educated parents around them who can help them with their homework! Rich kids can afford to think about learning and succeeding; their world is devoid of the concentration-destroying stresses that poor kids deal with all day long.
And the whole reason people can get richer in this country than they can anywhere else is that our population is more competitive (or used to be) than every other country's. Why? Because we did this obscenely radical thing 150 years ago: we decided that everyone had a right to a publicly funded education! Educated people build factories and high tech energy delivery systems, they build transportation systems and are more likely to participate in advancements and the types of change that increase productivity. Productivity is the key. If you can get more value, more product, out of the average hour worked by your population, you can produce more wealth. Education is the radical difference that gave America its high per-hour, per-person productivity; it is why the rich are rich and why they can afford to borrow from the equity that is American productive wealth, even though that borrowing actually destroys wealth on a national scale in the process.
But these are not factors that have anything to do with a valid comparison of private education vs. public education. These are strictly socio-economic factors. The problem comes when desperate parents look at the performance divide that exists between public and private schools and conclude that public schools should do exactly what private schools do. Private schools could be significantly worse than public schools and still produce higher test scores, more college acceptances, fewer dropouts, and lower crime rates. All of the intangibles work in their favor. In fact, it is often true that private schools employ teachers with less education and training than their public school counterparts.
Private school programs tend to spend less time on basic subjects (math, reading, etc.) and get better results! That is why they can spend a higher percentage of their students' time on extra-curricular activities (music, art, sports, theater, community projects, etc.). Obviously, a public school would fail if it decided that it should therefore shorten the instructional time devoted to math and reading. Private exclusive schooling is not a solution that scales. It is obvious that it is a solution that only works for a very small percentage of the population of a society, and that it works only at the expense of the country as a whole. There are lots of examples out there of countries with very good exclusive education systems and very, very poor economies embedded within extremely unstable social chaos dominated by poverty. Most central African nations play this game, Myanmar, the United Arab Emirates, South Africa (before apartheid was upended), Iran, North Korea, Brunei, etc., etc. Exclusivity is the norm in the poorest nations on Earth. If exclusivity worked, these would be the most productive nations.
Let's look at China. Until recently, from a strictly financial perspective, China has done all the wrong things: it has restricted entrepreneurship, it has restricted credit, it has promoted an exclusively top-down decision and influence structure… everything that works against the kind of fluid business environment that is attributable to a growing and healthy post-industrial economy. And yet… and yet, despite all of these huge shortcomings and mistakes, China succeeds like no other country. Why? Because it has spent an inordinate percentage of its wealth on its public education system. China's people are well educated by any global standard. Not some of its people. Not the exclusive elite. Everyone. And given China's huge population, that is a lot of educated people to compete en masse with the rest of the world. When they finally turn the last capitalist stone, when they finally create a legal structure to support personal liberty, property ownership, and unfettered personal expression, watch out! Even without these capitalist standards, China reaps such benefits from its non-exclusive infrastructural investments. Meanwhile, those of us in the west have all but forgotten which factors really matter, and which, in comparison, are just fluff.
But, if you happen to live in a nation that has paid attention to productivity, has produced wealth as a result of a fair and solid infrastructure (transportation, energy, safety, medicine, credit, agriculture, justice, and education… for everyone!), you can play the exclusive game in limited numbers, even though it destroys wealth on a national scale.
Now let's switch gears. Inoculation. How did inoculation get to be such a hotbed of superstitious thought? Inoculations have been blamed for hyperactivity, for cancer, for Asperger's, for MS, for Attention Deficit Disorder, even for AIDS, etc. What inoculations are never blamed for is the one thing they are most definitely and unequivocally guilty of… preventing pandemic spread of disease! But inoculation programs only work when a certain minimum threshold of the population participates. Each vaccine (/disease combination) has its own special number… correlating to a specific minimum percentage of the population that must be inoculated. If a smaller slice of the population is given the vaccine, an outbreak is certain. When a parent makes the decision not to get their kid vaccinated, what they are really doing is passing the responsibility and risk (if there is any) to the kids who do get vaccinated. They reap the rewards but pay none of the price!
This is a great example of a solution that doesn't scale. While the vast majority of parents accept the shared responsibility, a few parents can get by without inoculating their kids. It is a solution. But it isn't a solution that scales. Obviously, it doesn't work if everyone chooses this solution. It doesn't even work if more than a few choose not to inoculate. This type of solution only works because others are not choosing to shirk their share of the responsibility (and risk). In fact, if more and more people choose the non-solution option, there will come a point where everyone will suffer, even those who did take the vaccine. Most of these vaccines only work when the exposure to the pathogen is very low level. If a pandemic took hold, and if many people got really sick, high concentrations of the pathogen could overwhelm the immune resistance afforded through inoculations.
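The "special number" each vaccine/disease pair carries is what epidemiologists call the herd immunity threshold. In the simplest textbook model it follows directly from the basic reproduction number R0: spread stalls once more than 1 - 1/R0 of the population is immune. A minimal sketch, using rough illustrative R0 values rather than authoritative figures:

```python
# Herd immunity threshold under the simplest epidemiological model:
# if one sick person infects R0 others in a fully susceptible population,
# sustained spread stops once more than (1 - 1/R0) of people are immune.

def herd_immunity_threshold(r0: float) -> float:
    """Minimum immune fraction needed to prevent sustained spread."""
    if r0 <= 1.0:
        return 0.0  # the disease dies out on its own
    return 1.0 - 1.0 / r0

# Rough, illustrative R0 values (real figures vary by strain and setting):
for disease, r0 in [("measles", 15.0), ("polio", 6.0), ("flu", 1.5)]:
    print(f"{disease}: at least {herd_immunity_threshold(r0):.0%} must be immune")
```

Note how nonlinear the free-rider math is: for a highly contagious disease like measles, only a few percent of the population can opt out before the threshold is breached for everyone.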
Any of this remind you of the "libertarian" platform? It should. The libertarian program is the very definition of a solution that doesn't scale.
[more to come…]
In the early days of computing, those involved were pioneers, innovators, original thinkers, the super-passionate, driven by curiosity and the purity of adventure. These were a strong people. Proud. And though it sounds precious to say so, the early computing pioneers were not unlike other gritty pioneers in that they enjoyed being out well beyond the known, striding forward where there was no trail, no strength in numbers, no peer support, no mentors, no accepted pattern.
Now? Well now that computers have become ubiquitous, now that everyone has one, uses one like they use a knife and fork, the pioneers have long since been replaced by Plumber Joe, by the middle of the bell curve, by everyman and everywoman and their kids.
The computer industry is market driven! I hate that it has taken me this long to recognize the importance and implications of this now overwhelmingly obvious and simple fact. The diffusion of computers into the daily routine of the entire population has resulted in a dramatic shift in the demand landscape that informs what computing becomes. The market for computing is its users. The user today, the average user, is a whole different animal than the user/creator that defined its early history. Those of us that jumped in early probably resist the idea that what we care about really doesn't matter anymore. Though it might be true that knowledge and a deeply theoretical basis for that knowledge still matter, from a consumer market demand perspective, we grey hairs are simply invisible. The fact that we nerds ever defined a market for anything at all is the more surprising historical footnote. It is a bittersweet realization that the success of our industry would mean the marginalization of its founders.
In every grade school class I attended there were a few kids (one or two) who were driven by a passion to know, to understand, to create. The rest, well, the rest excelled at a completely different set of skills: getting through the day, unnoticed, blending in. The two groups couldn't be more different. The inquisitive few were internally driven. The rest were driven by the outward demands of success as defined by the curve. The inventive minds competed against their own ability to invent. The rest competed amongst themselves over the coveted 60th percentile that would define passing the class.
The computing market is now dominated by that larger part of the human population that defines success as climbing (whichever way possible) on top of the 60 percent of the population (of other social climbers) that makes up the bulk and center of any bell curve. As kids, these were the people who spent most of their time comparing themselves to the kids next to them. Looking over their shoulder at the other kid's test answers. Studying together so that they knew the baseline before they got to the actual test. I say "climb to the top" but the word "top" when describing a bell curve does a disservice to the real meaning of averages. What we call the top of a bell curve is really the center of a frequency distribution. Climbing to the top is really a struggle to get into the center. Like fish trying to avoid a shark, there is a natural human tendency away from being alone, away from the vulnerability that is the open water that is original ideas and behavior. As a result, we constantly seek the protection of others. Each of us, as humans, spends a good deal of our energy trying to determine, and then contort our behavior to match, whatever best describes the center of normative behavior and attitude.
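The claim that the "top" of a bell curve is really its center can be checked directly: a normal density is maximal at the mean and falls off symmetrically on either side. A quick standard-library sketch (the unit normal is just a convenient illustration):

```python
import math

def normal_pdf(x: float, mean: float = 0.0, sd: float = 1.0) -> float:
    """Density of a normal (bell) curve at x."""
    z = (x - mean) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))

# The "top" of the bell is the center of the frequency distribution:
# density peaks at the mean and is symmetric around it.
assert normal_pdf(0.0) > normal_pdf(1.0) > normal_pdf(2.0)
assert abs(normal_pdf(-1.0) - normal_pdf(1.0)) < 1e-12
```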
The similarities between schooling fish and human socialization pressures are profound. But there is one important difference. Where fish seek the center to avoid the teeth and gut of another species, the predator we humans work so hard to avoid is us: public ridicule, being seen as different, standing out! We are in a very real sense both the sheep and the sheep dogs nipping at the sheep's legs. It is obvious that evolutionary pressures have conspired to build into our brains at least two modes of processing and that they are, at least at times, antagonistic. One is a great big "what if?" simulator, a novelty machine… the other, a social manners restriction system that cautions against, at the point of pain, behavior the least bit novel or different.
I have traversed the usual nature/nurture, cultural/evolutionary minefields. What I come to is this: traits that exist universally across most cultures and are experienced many times within each individual's life are most probably behaviors that have a significant genetic/physical component... are common regardless of our developmental environment and experience. Humans are obviously capable of profound novelty and abstract pattern analysis. But there is also a pervasive behavioral overlay of social control of which we are simultaneously willing participant and victim. What is confounding is the extent to which each system interferes with the function and success of the other… and that they are so diametrically opposed.
With regard to schooling (and herding) behavior, that which we share in common with fish (and sheep), is an indifference to where the school is, in which direction it is moving, and how fast. Under the social threat that triggers schooling, all that matters is that each of us as individuals finds our way as close as possible to the center. Humans will go along with almost any plan so long as social grouping allows us to avoid being seen as different. Obvious examples: Nazi Germany, Slavery in the southern U.S., the Nanking Sex Trade, etc.
As computers have been made "user friendly" and as the cost of ownership has dropped, this center of the bell curve, this mad fight for self-similarity that defines who we are as a species, this reflection and homogenization of the greater us, has become the market for computing. Which makes sense. Diffusion and all. But the whole history of modern computing is so short, just 40 years now, that it is surprising and a bit of a shock to realize finally that from a market perspective, computing is a mature industry. How could an industry just a generation old have transited already from go-it-alone pioneer to "I'm Lovin' It" average?
The implications are huge. In particular, this insight brings social media into sharp ironic focus. Social media brings to computing the same access to community monitoring and control that gossip and storytelling brought to the camp fire. It is to computing what cheating off of your neighbor's test is to being a kid. As a person who likes to think of himself as a pioneer, I have reacted in horror, disbelief, and frustration to what has looked like fifteen years of computer industry regression.
If you accept that computers, like minds, are abstraction (language) machines, then it makes sense to wonder to what extent the human brain has suffered the same evolutionary pressure towards social assimilation and at least plausibly, away from innovation and novelty. To what extent is the rarity of the product of profound human creativity a reflection of actual physical and logical limits on and causal costs to creativity itself, and to what extent is the same rarity a product of evolutionary molded physical properties of the brain that conspire to restrict the production of novelty as a result of even greater survival pressure to promote behaviors that honor social cohesion?
If the current overwhelming trend that sees the computer as more of a social monitoring mechanism, and less a creative tool, is a trend that reflects market demand, then the same questions I am asking of the market pressures that shape the machinery of social media must be asked of the cultural pressures that have through evolutionary time shaped the mind. So long as computation is primarily consumed by human beings, both computer and mind will be shaped by the same evolutionary pressures. As technical barriers are overcome, the industry can and does react more fluidly and with higher fidelity to the demands of its consumers.
At which point, the question becomes: which heaps greater selective pressure on the evolution of computing, the need for tools that stand in for skills the human brain lacks, or the need for tools that amplify our most attention-demanding desires? Can the two co-exist and co-evolve productively? Again, the question is asked practically of computation and at the same time, philosophically or anthropologically, of the human brain and the cultural stew in which each is both product and survival pressure.
Where computing used to take its shape from the fecund imagination of computational luminaries, it has of late been led instead in the pursuit of the lizard brain in all of us, the sub-conscious urges and fears that inform social mediation behavior. The result is all of this social media drivel, the likes of "Twitter", "13 Seconds", "myface and spacebook" [sic], and numerology based pizza ordering "iPhone Apps." What advantages do such time wasters render? Some argue that social media was the logical communication-oriented extension of email and personal web sites, that social media greases mechanisms deeply human and "natural". I remain dubious of these claims. I tend to group the brand of communication that social media seems to breed with more negative forms of group behavior like cults, mass hysteria, fundamentalism, and other behaviors unique to group-think.
And what of pop-culture notions like "collective intelligence", "global awakening", and "cultural consciousness", which seem to be born of transcendent utopian notions (not dissimilar to those that feed religion and spirituality)? The adherents of these optimisms appear to be blissfully unhindered by the need for causally logical argument or empirical evidence. If our computers have become social monitoring devices (at the expense of facilities that enable creativity), is there a danger that they will further distort our already distorted sense of truth? If a computer "wants" to agree with us more than it wants to accurately calculate a value, then we might already have crossed the threshold into a world where 2 plus 2 really does equal 5 (if the computer says so, it must be true!).
It would be irresponsible for me not to pause at this point and remind myself to question my own rhetoric on topics this close to me.
These questions and trends have profound implications for populist concepts we tend to romanticize but rarely examine in detail: democracy, pluralism, consensus, society, culture, community, equal rights, individuality, etc. As the computer industry becomes more and more sensitive to consumer demand, its product WILL become a device that does a better and better job at the automation and magnification of human idiosyncratic behavior, of superstition, mythos, hubris, rhetoric, ego, at any and all of the emotional side effects of evolutionary pressures. Forget about the cold indifference of causal truth that has motivated so many sci-fi stories.
The real villain to be feared in any inevitable future is the computer as hubris amplifier.
Computing's new "social media" face might disturb my pioneer sensibilities, but it reflects the satisfaction of common demand. As any market matures it learns to overcome the physical and conceptual obstacles that so plagued it in its earlier years. Unburdened by things like processor speed and storage density, the computer industry was able to pursue directions more in line with human consumptive desire than with the technical or theoretical goals of computer "scientists". Marketeers trump scientists when the saturation of a product becomes universal.
It all makes sense. I am still depressed by the anti-innovation implications of the mass market dumbing down of computing, but at least I understand why it happened, what it means. Knowledge, even depressing knowledge, should open doors, should allow more efficient planning and prediction. But what exactly are the implications when a creative tool is hijacked by a larger urge to avoid, at all costs, change and novelty? What happens when the same mass-market demand pressures that cause fads and trends focus their hysterical drive towards homogeneity onto the evolution of a tool originally intended for and idealized for creative exploration? What exactly do you get when you neuter the rebellion right out from underneath Picasso's brush? When you force Darwin to teach Sunday school?
Just what does it mean when our creative medium becomes sensitive to social interaction? Pen and paper never knew anything about the person wielding them, certainly didn't know how the greater society was reacting to what was being written or drawn.
If the average human feels more comfortable doing exactly what everyone else is doing, seeking the center, would much rather copy the answers off of their desk-mate's test than understand the course content, well then it only makes sense that we, the royal "we", would use this computing tool in the same way that we use the rest of the stuff in our lives: to help us find the social center and stay there.
It's not just the marketplace that has shifted towards the demographic center. The schooling mentality has crept into and now dominates computing as an industry. Personnel and management, which in the early days of computing were awkwardly staffed by engineers and scientists and groupie hobbyists, are now as diverse (homogenous?) a mix of humans as you could find in any industry. Even the scientists are cut from a different cloth. It takes a special and rare (crazy) human being to invent an industry from nothing. When avocations become well funded departments at major universities, the graduates are not likely to be as intellectually adventurous (understatement). As any MBA knows, the success of an industry is most sensitive to its ability to understand and predict the demand of its market. Who better to know the center of the consumer bell curve, the average Joe and Jane, than that same Joe and Jane? Joe and Jane Average now dominate the rank and file workers that make up the computer industry. This means administration; it also means sales and marketing, both of which make sense. Less intuitive, but equally understandable: Joe and Jane Average have taken over the research and design and long range planning arms of the computer industry. Even where it isn't the actual Joe and Jane, it is people who do a kick-ass job of channeling them.
Does market saturation mean the evolution of computing has reached its zenith? I don't think it does. But, once an industry has reached this level of common adoption, the appearance of maturity and stability is hard to shake. Momentum and lock-in take hold. I have tried repeatedly to sell paradigm-busting and architectural re-imaginings of the entire computing paradigm to valley angels and big-ticket VC firms only to realize that I wasn't selling to the current market. Try opening a VC pitch with "The problem with the computer industry is…" to a group of thirty-five-year-old billionaires who each drove to the meeting in custom ordered European super cars. Needless to say, their own rarefied experience makes it hard for them to connect with anything that follows. This is a classic conundrum in the study of evolution. I presume it is a classic conundrum facing evolution itself. Why should a system that is successful in the current environment ever spend any energy on alternative schemes? How could it? It is hard to even come to the question "What could we be doing better?" when surrounded by the luxury of success.
At the same time, it is unlikely (impossible?) that any system, no matter how successful in the present, will long define success in the future. It might even be true that the more successful a scheme, the more likely that scheme will hasten an end to the environment that supported it (through faster and more complete exploitation of available resources). The irony of success!
But we are a smart species. We are capable of having this discussion, aren't we? So we might be prepared as well to discuss the inevitability of the end of the current computational scheme. No? To prepare, as a result, for the next most likely scheme (or schemes)? Especially those of us who study language, information, computation, complex systems, evolution. Especially an industry that has so advanced the tools and infrastructure of complexity handling. No? Surely we in the computer industry are ideally situated to have a rational vantage from which to see beyond the success of the current scheme? Yet, for the reasons I have already postulated (market homogeneity and success blindness), and others, we seem to be directionless in the larger sense, incapable of the long-range and big-picture planning that might help us climb out of our little eden and into the larger future that is inevitable and unavoidable. Innovations springing forth from the current industry pale in comparison to those offered up 20, 30, even 40 years ago.
Speaking of which: I just found the resource list for Pattie Maes' "New Paradigms In Human Computer Interaction" class at MIT's Media Lab. These are video clips of speeches and demos of early and not-so-early computing pioneers showing off their prescient work. Mind blowing. The future these folks envisioned (folks from places like MIT, Brown, Stanford Research, the Rand Corporation, Xerox PARC, and other institutions), well, it is sooooo much more forward-looking than what has become of computing (or how the average computer is used). Everyone would do well to sit down and view or re-view these seminal projects in the context of their surprisingly early introduction.
I have written quite a few essays lambasting what I see as the computing industry's general losing of its collective way… at the very least, a slowing down of the deep innovation that drove computing's early days. Even when potentially powerful concepts ("Semantic Computing", "Geo-Tagging", "Augmented Reality", "User-Centered Cloud Computing") are (re-)introduced, their modern implementations are often so flawed and tainted by an obsession with kowtowing to market pressures (or just plain lying or faking it) that the result is an insult, a blasphemy of the original concept being hijacked.
Over ten years ago, I gave a talk at Apple titled: "Apple Got Everything It Has For Free, And Sat On Its Ass For Eight Years While The Rest Of The World Caught Up".
Which is true, at least with regard to the Lisa/Macintosh interface that Xerox PARC handed them (check out the Star system) and the way they just kind of sat on the WYSIWYG, mode-less graphical interaction scheme while other computer companies (Borland and then, reluctantly, Microsoft) eventually aped the same. At the time of my presentation, Apple had wasted its eight-year lead extrapolating along the same scheme… a bigger, badder WIMP interface, when they could have been introducing paradigm-vaulting computational systems (that would have put Seattle on another eight-year chase).
But from a marketing perspective I couldn't have been more wrong. I have got to keep reminding myself that I no longer represent the market for computers! I wish I did, but I don't. I am an outlier, a small dot on a very long tail. I am Pluto, or maybe even just some wayward ice-and-dust comet next to the big ordinary inner planets that trace out wonderfully near-circular orbits around the sun. In later presentations, I explained that Apple's critically acclaimed "Think Different" campaign, and the elitist mindset from which it was derived, was the reason they had never garnered more than 2 or 3 percent of the computer market. I explained that Bill Gates' "genius" lay not in his profound insight, but in his ability to understand the motivations that drive the average person… namely, to never be caught doing something that someone else could question. That means acting the same as everyone else. That means knowing how everyone else is acting. That means social media!
Nobody (other than wacky outliers like me) wants to be compared to iconoclasts like Einstein or Picasso or Gandhi or Gershwin. Very few people really want to "think different". Most people wouldn't be caught dead risking that type of public audacity. You have got to be pretty confident that you have an answer to the dreaded question "Why are you doing that?" to ever DO THAT (individually creative thing) in the first place.
Pioneers know exactly why they do what they do. They are driven by knowing more than others and by the excitement of being somewhere others haven't been… by being very much outside of the ball of fish that others seek as protection.
But if you want to sell a billion computers instead of just a few thousand, then you want to pay attention to the fish in all of us and not to the smiling, sock-less Einsteins on their bikes.
But the larger and longer implications of mass-market sensitivity are profound. While it is entirely true that paying attention to the center of the cultural bell curve will allow any industry to exploit more of the total available consumption potential, such behavior does not necessarily produce the paradigm-jumping disruption upon which long-term progress depends. If your Twinkies are selling really well, you might not notice that your customers are all reaching morbid levels of obesity and malnutrition, or that the world is crumbling under the climate-changing policies upon which your fast-food empire is based.
The satisfaction of the human center-of-the-fish-school urge is not necessarily the best recipe for success, if success means more than short-term market exploitation. In the long run, potential and satisfaction are decidedly not the same thing; they are, as a matter of fact, very often mutually antagonistic. Rome comes to mind. What comes to mind when I mention the phrases "dot com" or "mortgage-backed securities" or "energy deregulation"?
The mass-market topology that has driven computing and communication towards a better and better fit with the demands of the largest and most homogenized consumer base the earth has ever witnessed could very likely work against the types of creative motivations that might be necessary to rescue us from the bland and the average, from the inward-facing spiral of self-sameness that avarice alone yields. I am increasingly worried that the computer's seemingly endless potential to be warped and molded, chameleon-like, to perfectly satisfy our most basic evolutionary proclivities, to amplify, unhindered, urges made strong against real scarcity in the natural environment, has already so distracted us within our egos and desires that we might not be able to pull our heads out before we get sucked completely down and into our very own John Malkovich-ian portals of endless identity-amplification.
And why does it matter? Because, like it or not, progress in every single field of human endeavor is now predicated on advancements in computation. More profound than anything else science has discovered about nature is that information is information. Information is domain agnostic. Which means that advances in information processing machinery benefit all information-dependent fields of inquiry. Which is every field of inquiry! Which also implies that all fields of human inquiry are increasingly sensitive to the subtle ways that any one computational scheme affects information, and the scope of information process-able within that scheme.
At any rate, it is alarming (if understandable) to me that the trending in this industry is towards a re-molding of the computer into ego amplifier, pleasure delivery system, truth avoidance device, distraction machine, Tribbles (as in "The Trouble With…"). The more insightful among us may want to place a side bet or two (if only as evolutionary insurance) on more expansive futures. Some of us are not so distracted by the shininess of these machines, or by how they are getting better and better at focusing our attention on our own navels, that we cannot see futures for computing more expansive than the perfect amplification of the very human traits least likely to result in innovation or progress. There is still time (I hope).