Thursday, February 12, 2009

Transhumanist misgivings

Asocial Singularitarianism - Breeding an Incomplete View of Convergent Accelerating Change

The TEB (transhumanist expectation bias) is a tendency among transhumanists to force their objective vision of the future to fit with their subjective expectation of the future. Many of the futurists and outright transhumanists that I have come to know and respect over the years suffer from this. (I too came down with it for a spell when I first encountered the awesome power of Moore's Law and other hard-tech diffusion curves.) It's as if they 1) expect the future to create a magical utopia into which they project their unchanged present-day personalities, 2) can't or don't want to credit the dumb masses (their detractors) with the ability to perform amazing operations (social computation) critical to acceleration, and/or 3) are so focused on the post-human age / life-extending digitization that they fail to adequately consider what it will take to get there.

I suppose it's entirely possible that I'm not a transhumanist. If so, I'm not particularly worried. The label has always been a matter of convenience; there's no denying my general philosophical affinity for ideas such as negligible senescence, nano-scale manufacturing and artificial intelligence.

Hardcore "Singularitarianism" (the word itself seems the stuff of deliberate self-parody) has always struck me as a brittle proposition: while Ray Kurzweil's technology-saturated vision of the near-future remains valid in theory, I've never quite managed the trick of incorporating it into my own worldview except on an abstract, intellectual level.

To be sure, many of the advances proclaimed by transhumanists will probably be realized, some even sooner than anticipated. But the overarching vision of a human species suddenly free to shed its physical form at will or upload itself into virtual environments -- as resolutely fascinating as these prospects are -- somehow lacks the texture of genuine prophecy (which typically attempts to prevent at least as much of the future as it tries to actualize).

My doubts regarding the future scenario espoused by Kurzweil are based, in part, on the amount of time necessary to construct the technologies in question -- and our seeming inability to make the deadline. The deeper and harder I look into the future -- or at least my mind's necessarily incomplete approximation of it -- the more I'm convinced that the latter half of the century will be spent scrambling for resources. We may well find ourselves squeezed into an evolutionary bottleneck, with relatively few of us securing safe havens against a disintegrating climate and collapsing biosphere.

While the extravagant baubles promised by many Singularitarians just might materialize in a world of plenty, I simply can't shake the feeling that the Western world's casual reliance on material abundance is nearing a jarring end, precluding the visions of ardent transhumanists.

And while I'm as impressed with the steady ascension of computing power as anyone, it often seems as if an alarming number of futurists have forgotten that Moore's Law is a human construct, not a physical law in the sense that, say, gravity or thermodynamics dictate our existence. If it seems stalwart and unyielding, we must take into account its fairly recent arrival; knowing nothing else, we naturally assume that electronic systems will inexorably increase in complexity and ubiquity, perhaps to rival the human brain within the next few decades. (The issue of mind-uploading remains a powerful and uniquely daunting question, far more laden with philosophical baggage than contemporary debates over the ethics of germ-line engineering or the conscientious application of brain implants.)
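
For what it's worth, the extrapolation embedded in that "next few decades" claim is simple doubling-curve arithmetic. Here is a minimal sketch in Python; the starting figure, the doubling cadence, and the brain-equivalent target are all assumptions chosen for illustration, not established numbers:

    # Naive Moore's-Law-style extrapolation. Every figure here is an
    # assumption for illustration -- which is rather the point.
    DOUBLING_PERIOD_YEARS = 2.0   # assumed doubling cadence
    year = 2009                   # this post's vantage point
    ops_per_sec = 1e12            # assumed: a circa-2009 high-end machine
    BRAIN_OPS_PER_SEC = 1e16      # one commonly cited brain-compute estimate

    # Keep doubling until the curve crosses the assumed brain-scale target.
    while ops_per_sec < BRAIN_OPS_PER_SEC:
        ops_per_sec *= 2
        year += DOUBLING_PERIOD_YEARS

    print(f"Naive extrapolation reaches brain-scale compute around {year:.0f}")
    # Prints roughly 2037 -- precisely the sort of projection that holds
    # only as long as the human construct behind the curve keeps holding.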

Sadly, I must conclude (for now) that the Kurzweilian future, for all of its marvel and promise, is unlikely to occur. How our species will react to the perils already manifesting on the planetary stage remains an open question; it's entirely possible, for example, that our present mode of consciousness must undergo some form of mutation before we can deal with future threats sensibly. Futurists such as Daniel Pinchbeck have speculated that we've driven ourselves to the threshold of extinction (the "cusp of terminal dissolution" found in this blog's masthead) as a necessary precursor to becoming something fundamentally different.

Humanity appears to thrive on shocks -- our history (and prehistory) reflects an ability to assimilate the lessons of failed incarnations. Like the line plotted by Moore's Law, our tenure as a sentient presence on Earth indicates the capacity for exponential growth. Could it be that we're skating toward two Singularity events simultaneously without consciously realizing it? Multiple Singularities might constitute a topological headache, but could hardly be deemed more perplexing than the slippery, recursive nature of consciousness itself.

4 comments:

Anonymous said...

Good to see some original thought and writing about some of the essential assumptions of HTAE-oriented transhumanism, and about the failure, in turn, of some on the hard-sciences side to recognize how crucial the time, social, and resource constraints on any such prospective future will be. We have a software problem, more than one of hardware, in a way.

I'd very much like to see further explication and discussion of these critical issues on your blog, as they seem important to a serious debate about how we may (or may not) be able to survive as a viable, sustainable species and develop and fulfill our potential as sentient beings.

I agree that Kurzweil's vision of the singularity is not only unlikely but perhaps restrictive, and could unintentionally damage our future potential. It seems not to consider deeply how humanity as a whole would be affected by unintended consequences, by punctuated equilibrium, and by a divergent evolutionary dependence on technologies of a mechanistic, almost elitist nature -- a dependence that would be culturally and sociologically disruptive in the extreme.

Kurzweil seems to be reaching for solutions grounded more in technology than in biological or social reality, or in the reality we now face: that we can only subsist within the ecology of an environment our burgeoning existence now threatens. He's trying to cross "a bridge too far," in a sense, and too quickly, with his emphasis on accelerated change, and I fear a divergence between the proto-transhumanist haves and the plebeian or philosophically opposed have-nots, which could very well trigger a backlash or culture war if the practical requirements for a sustainable world are not addressed first and foremost.

Where you say, "I'm convinced that the latter half of the century will be spent scrambling for resources. We may well find ourselves squeezed into an evolutionary bottleneck, with relatively few of us securing safe havens against a disintegrating climate and collapsing biosphere," I would suggest that the "scramble" for resources has already been occurring for several decades now, and that if we cannot solve the foundational problems of overpopulation, consumerism, resource depletion, ecological damage, and complexity management over the next few decades, there won't be a second half of this century that involves civilized, technologically based societies scrambling for anything -- humanity will have already destroyed itself before then. That is the real "evolutionary bottleneck" that confronts us, and if we do not solve it, there will be no secure or safe havens anywhere, for anyone, in the long term.

Some kind of paradigm shift in consciousness, about how we think and live, will be required before any more advanced possibilities can occur, which is what I mean by efforts to deal with the "software" problem having initial primacy over attempts to resolve the issues presented by transhumanism and survival via "hardware" means.

Bruce Duensing said...

Jules Verne, H.G. Wells and Arthur C. Clarke wrestled with the same issues as you within the cultural context of their own time and mirrored the human condition in the technological metaphors of their day.

It's a noble calling, this profession of yours and of those who came before you, and one that is often misunderstood as simply entertainment for popular culture, while at the same time this form of probing has set a platform for discussing the vital issues of the day.

Anonymous said...

So, I recognize my own brain in this. I may be bright in some areas, but I'm often unable to really *relate* to normal people properly or understand completely the constraints of life as an "average" person.

I don't see that with a lot of Singularity folks. I feel like the Singularity is the Virtual Reality of the 2000s. Except that VR had more practical applications...

There is probably a genre of novel to be written that does to Peak Oil what Cyberpunk did to computing...

Paul said...

Hi Mac,

Very thoughtful and heartfelt post. I've been thinking about these very constraints for at least 10 years now, and knew that some kind of 'crash' or biospheric-economic constraint was inevitable.

I can't say my thoughts are fully formed on this subject, as the last three months of news have given me much food for thought, but I will say this:

1) A huge percentage of resources gets turned into disposable obnoxico (stupid, non-durable fluff that fills the shelves of Walmart). Up to this point it has been cheaper to manufacture new goods in this way, but now it is becoming more expensive, not to mention that people will have less money for all these non-essentials.

So,

2) Products will become more durable and made increasingly from recyclable materials, and a regenerative "steady state" economy will become the new norm.

However,

3) Economic growth will still occur, but informationally rather than materially, as more of our entertainment comes from the information economy, through lighter, cheaper, more durable and recyclable information products. In other words, our civilization could, should, and hopefully will become more intelligent and efficient at creating more and more value with less and less. Bucky Fuller called this ephemeralization, and it's a trend that has been occurring throughout history anyway, except now it's become a survival imperative.

Thanks again for the thoughtful posts.

Paul Hughes
http://astranaut.org/blog/