The TEB is a tendency among transhumanists to force their objective vision of the future to fit with their subjective expectation of the future. Many of the futurists and outright transhumanists that I have come to know and respect over the years suffer from this. (I too came down with it for a spell when I first encountered the awesome power of Moore's Law and other hard-tech diffusion curves.) It's as if they 1) expect the future to create a magical utopia into which they project their unchanged present-day personalities, 2) can't or don't want to credit the dumb masses (their detractors) with the ability to perform amazing operations (social computation) critical to acceleration, and/or 3) are so focused on the post-human age / life-extending digitization that they fail to adequately consider what it will take to get there.
I suppose it's entirely possible that I'm not a transhumanist. If so, I'm not particularly worried. The label has always been a matter of convenience; there's no denying my general philosophical affinity for ideas such as negligible senescence, nano-scale manufacturing and artificial intelligence.
Hardcore "Singularitarianism" (the word itself seems the stuff of deliberate self-parody) has always struck me as a brittle proposition: while Ray Kurzweil's technology-saturated vision of the near-future remains valid in theory, I've never quite managed the trick of incorporating it into my own worldview except on an abstract, intellectual level.
To be sure, many of the advances proclaimed by transhumanists will probably be realized, some even sooner than anticipated. But the overarching vision of a human species suddenly free to shed its physical form at will or upload itself into virtual environments -- as resolutely fascinating as these prospects are -- somehow lacks the texture of genuine prophecy (which typically attempts to prevent at least as much of the future as it tries to actualize).
My doubts regarding the future scenario espoused by Kurzweil are based, in part, on the amount of time necessary to construct the technologies in question -- and our seeming inability to make the deadline. The deeper and harder I look into the future -- or at least my mind's necessarily incomplete approximation of it -- the more I'm convinced that the latter half of the century will be spent scrambling for resources. We may well find ourselves squeezed into an evolutionary bottleneck, with relatively few of us securing safe havens against a disintegrating climate and collapsing biosphere.
While the extravagant baubles promised by many Singularitarians just might materialize in a world of plenty, I simply can't shake the feeling that the Western world's casual reliance on material abundance is nearing a jarring end, precluding the visions of ardent transhumanists.
And while I'm as impressed with the steady ascension of computing power as anyone, it often seems as if an alarming number of futurists have forgotten that Moore's Law is a human construct, not a physical law in the sense that, say, gravity or thermodynamics dictate our existence. If it seems stalwart and unyielding, we must take into account its fairly recent arrival; knowing nothing else, we naturally assume that electronic systems will inexorably increase in complexity and ubiquity, perhaps to rival the human brain within the next few decades. (The issue of mind-uploading remains a powerful and uniquely daunting question, far more laden with philosophical baggage than contemporary debates over the ethics of germ-line engineering or the conscientious application of brain implants.)
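The "next few decades" intuition is easy to reproduce on the back of an envelope. As a rough sketch (the transistor and synapse counts below are illustrative assumptions, not figures from this essay), extrapolating a two-year doubling cadence from chip-scale to brain-scale complexity gives:

```python
import math

# Illustrative, order-of-magnitude assumptions -- not claims from the essay:
transistors_now = 1e10   # assumed: transistors on a modern high-end chip
brain_synapses = 1e14    # assumed: rough estimate of human synapse count
doubling_years = 2       # classic Moore's Law doubling cadence

# Number of doublings needed to close the gap, then the years that implies
doublings = math.log2(brain_synapses / transistors_now)
years = doublings * doubling_years

print(f"{doublings:.1f} doublings, about {years:.0f} years")
```

Under these assumptions the gap closes in roughly a quarter-century, which is exactly why the extrapolation feels so compelling; the essay's point is that nothing physical guarantees the doubling cadence holds that long.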
Sadly, I must conclude (for now) that the Kurzweilian future, for all of its marvel and promise, is unlikely to occur. How our species will react to the perils already manifesting on the planetary stage remains an open question; it's entirely possible, for example, that our present mode of consciousness must undergo some form of mutation before we can deal with future threats sensibly. Futurists such as Daniel Pinchbeck have speculated that we've driven ourselves to the threshold of extinction (the "cusp of terminal dissolution" found in this blog's masthead) as a necessary precursor to becoming something fundamentally different.
Humanity appears to thrive on shocks -- our history (and prehistory) reflects an ability to assimilate the lessons of failed incarnations. Like the line plotted by Moore's Law, our tenure as a sentient presence on Earth indicates the capacity for exponential growth. Could it be that we're skating toward two Singularity events simultaneously without consciously realizing it? Multiple Singularities might constitute a topological headache, but could hardly be deemed more perplexing than the slippery, recursive nature of consciousness itself.