I’M ABOUT A THIRD OF THE WAY into Joel Garreau’s book, Radical Evolution, and there’s much talk about the Singularity (including a very cool interview with Vernor Vinge). (Decent short description of the Singularity concept here).

But while I think that the Singularity is something to take seriously, I also think that the focus — shown in the interview and the passages surrounding it — is a bit myopic. The fear is that we’ll wind up creating superhuman intelligence, and that it will quickly take over the world. Personally, I suspect that superhuman intelligence will be harder to create, and less superhuman, than many expect. But that’s not the main point. The main point is that the dangers, in my estimation, don’t come from the creation of a godlike (or demonlike) superhuman entity. Or at least, if such an entity comes to exist, the threat won’t stem from its intelligence. As I wrote a while back:

It is not obvious, however, that intelligence has much to do with world domination. Certainly, those currently ruling the world did not attain their positions by virtue of their intelligence, and it may be that, like James Branch Cabell’s eponymous protagonist Jurgen, superintelligent machines would find that “cleverness was not at the top of things, and never had been.” While scientists and computer experts, whose chief pride (as with Jurgen) lies in their intelligence, would tend to regard superior intellect as the sine qua non of power, this view can be quickly dispelled by a glance at the headlines.

The bigger danger won’t be the creation of a godlike artificial intelligence. It will be the creation of many millions (and eventually billions) of individuals with powers that would have been until recently regarded as godlike, in the rather small space that humanity currently inhabits. That problem will be reduced, however, if we expand beyond the earth beforehand. I certainly agree with Stephen Hawking that the alternative is extinction. But I think that we’ll do it in time.

Overall, I’m less afraid of the Singularity than some. And one characteristic of entering a singularity is that you don’t generally realize it as it’s happening — like crossing the event horizon of a black hole, it’s not apparent while it’s underway. We may be entering the Singularity already. As my alter ego suggests, cloning seems frightening now. One day it will seem . . . quaint.

UPDATE: More thoughts here.