
How Close Are We To The Practical Limits Of Human Intelligence?

Of course there are rare individuals of prodigious intellectual potential. However, we might legitimately ask: is culture more the result of breakthroughs by the rare genius, or of the ability of the masses to absorb the knowledge thus revealed?

The following list illustrates a range of possible outcomes.

a. Breakthroughs in fundamental knowledge will continue at accelerating pace.
b. Complexity of remaining unknowns is approaching human limits of understanding.
c. Fewer and fewer members of human species have the capacity to understand at the frontiers of knowledge.
d. Technological tools that assist human understanding of complexity will allow humans to overcome cognitive limits.
e. Technology embedded in our daily infrastructure will continue to effectively protect us from our lack of knowledge as average members of human society.
f. Innate limits in human complexity handling mean we are in for a heap of trouble.
g. The future depends on systems that are more cognitively robust than humans.

Arguments are made that any such theoretical limits are mitigated, at least in part, by the products of human intellectual achievement... by technology. Technology, it is argued, is a set of tools that acts as a lever between our own limits and that which is possible. By this reasoning my original question is rendered moot as human cognitive capacity becomes mercurial... remains in lockstep with the theoretical geometric acceleration of discovery... there are no limits! But this argument depends upon some rather large assumptions. The central assumption is a sort of Keynesian extension of Alan Turing's notion that, given enough time, a universal problem-solving machine can solve any computable problem regardless of size or complexity. It is understood that Turing was correct in this regard. What makes the idea Keynesian is the tendency to ignore the "given enough time" caveat. In the context of human affairs (and human productive life-spans), it is obvious that time is not a resource to be trivialized.

In addition, one must consider the difference between using technology and understanding the technology we use. To what extent does the use of technology actually distance us from the reality of the natural laws that are thus harnessed? I believe I am seeing a disturbing trend: a growing percentage of the population in the most technologically advanced regions of global society seems to be moving towards a world view that is consciously ignorant of physical law.
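The "given enough time" caveat can be made concrete with a back-of-the-envelope sketch. The numbers below (a billion operations per second, brute-force search over all bit patterns) are illustrative assumptions, not measurements, but they show how quickly "solvable in principle" parts ways with "solvable within a human lifetime":

```python
# Toy illustration of the "given enough time" caveat: a universal
# machine can in principle brute-force any finite search problem,
# but the required time quickly dwarfs a human working life.
# All constants are illustrative assumptions.

OPS_PER_SECOND = 1e9       # assume one billion operations per second
SECONDS_PER_YEAR = 3.15e7  # roughly 365 * 24 * 3600

def brute_force_years(n_bits):
    """Years needed to enumerate all 2**n_bits candidate solutions."""
    return (2 ** n_bits) / OPS_PER_SECOND / SECONDS_PER_YEAR

for n in (40, 80, 120):
    print(f"{n}-bit search space: {brute_force_years(n):.3g} years")
```

A 40-bit search finishes in minutes; an 80-bit search already takes tens of millions of years on the same hardware. "Given enough time" is doing a great deal of quiet work in the argument.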
I wonder if the absolute stability and reliability of modern infrastructure (heat, plumbing, every manner of electrical convenience and entertainment, almost guaranteed access to 85 years of life, ready and easy availability of almost unlimited quantities of food, and the cars, trains, boats, jet aircraft, and rockets that both flatten and shrink the geographic world to our every whim) has itself worked against any personal or societal need to respect and honor (and provide continued resources towards) the hard-won knowledge that sits at the base of each of these luxuries.

There are a million and one ways in which our future hinges upon a deep and close examination of these issues, critical as they are to the continued evolution of complexity.

Randall

What Happened To Computing's Future?

Much has happened in computing in the last 5 years (or has it?).

1. Social Networking ad nauseum (do you have more friends now?).
2. Web 2.0, 3.0 et al (and still not up to pre-web interaction standards).
3. YouTube doing for video what cork did for laundromat bulletin boards (isn't it just a little ironic that market penetration of hi-def TVs peaks at exactly the same time that the most popular video content is shot with $300 camcorders, dropped to 320x240 pixels, and compressed all to hell and back?).
4. The iPhone is a telephone! (aren't you glad you stood in line for 2 days?).
5. City-wide free WiFi (OK, so maybe Google ran out of money).
6. Multi-core CPUs (nobody really knows if they are faster because nobody knows how to write software for them... really, I need to know how to write a compiler in order to write an app?).
7. Roomba and Scooba (our robot future looks for all the world like the little toy cars I had as a kid that knew how to back up and start off in a different direction when they hit a wall... did they have advanced digital sensors and multi-core CPUs?).
8. Spam 2.0 (I will wager that spam has a much deeper detrimental effect on global GDP than does terrorism or the flu).
9. Global Climate Catastrophe (no problem here... just seven times the extinction rate that occurred when the 10-mile-wide meteor tore into the Yucatán 65 million years ago and killed off the dinosaurs).
10. The war (anyone know how many more people we need to kill in order to reverse the economic law of supply and demand?).

If ever there was a time when change was more needed I don't know about it. If ever there was a technology more intrinsically capable of enabling change I haven't heard of it.

I have this crazy notion that computing hasn't even come close to starting yet, that we haven't even begun to scratch the surface of what computing can and could do. Getting there probably means stepping back far enough from the day to day, from the quarterly demands, to look at computing as an agent of change, as an enabler of complexity handling, as evolution's most potent crucible.

Is there anyone out there who can sober up long enough from this decade-long tech binge to get a clear-eyed vantage on the causal topology beneath the hype-and-gadget cacophony that has so deafened this industry to the true music that could be computation's self-evolving symphony?

