Tuesday, October 16, 2007

Death special: The plan for eternal life





To many, these ideas sound seriously scary, and transhumanists have been attacked for jeopardising the future of humanity. What if they ended up creating a race of elite superhumans bent on enslaving the unmodified masses, or unwittingly programmed an army of self-replicating nanobots that would turn us all into grey goo? In 2004, political scientist Francis Fukuyama singled out transhumanism as the world's "most dangerous idea".


It may well be "dangerous" -- but it's also our single best strategy for escaping our planetary cradle.

5 comments:

Anonymous said...

The traditionalists should be scared. There's a lot of us out here who would gladly shed the limitations and paradigms of humanity in general to reach for something greater. But rather than being "against God" or unnatural in some way, I feel that this is the exact evolutionary path that most higher order creatures take (sorry for stating the obvious there).

Probably won't happen in my lifetime, but if it does in my son's, that would be really fantastic. Thinking of him living to an age that he ultimately decides on is a wonderful thing.

Anonymous said...

I prefer the term "extreme longevity" to "immortality." Immortality implies indestructibility as well as an unlimited lifespan. Would someone who lived to be 1000 be considered "immortal"? Metaphorically, maybe, but not actually. Even someone who lived to be 1 MILLION years old would not be genuinely "immortal." One reason I raise this point is that, in discussing issues like this, it strikes me as highly important to get our terminology straight. For one thing, I think "extreme longevity" would probably be much more politically acceptable than "immortality" (which does, somehow, sound "inhuman" and godlike).

--W.M. Bear

a tired pessimist said...

Fukuyama? Isn't that the guy who wrote a book about the "end of history"? He was wrong about that, too.

I agree with Mac that some form of transhuman modification to our intellect and problem-solving capabilities may be one of the few, if not the only, means to escape our self-created species death spiral.

But there probably isn't time to do it right, which would be by excruciatingly careful processes.

We don't have several decades left to fine-tune biogenetic mods to the brain, in an ethical manner.

I suspect we are fucking doomed.

Anonymous said...

It's also our single best strategy for escaping our species-level stupidity.

Mac said...

Fukuyama's "Our Posthuman Future" is worth reading if you want a take on transhumanism from someone who's sincerely afraid of its potential ... which, IMO, should be everyone willing to embrace it.