Transhumanists have a label — “the argument from incredulity” — for one kind of criticism of their visions and predictions: The instinctual but largely un-evidenced assertion that transhumanist claims are simply too fantastical and difficult to fathom and so must be false. While there’s plenty of reason, empirical and otherwise, to doubt transhumanist predictions, they’re certainly right to point out and criticize the prevalence of the argument from incredulity.
But there’s a transhumanist counterpart to the argument from incredulity: the argument from inevitability. This argument is prone to be just as un-evidenced, and at least as morally suspect. So I’d like to begin a new (hopefully regular) series on Futurisms: the Transhumanist Inevitability Watch.
Our first entry comes from Michael Anissimov:

It’s 2010, and transhumanism has already won. Billions of people around the world would love to upgrade their bodies, extend their youth, and amplify their powers of perception, thought, and action with the assistance of safe and tested technologies. The urge to be something more, to go beyond, is the norm rather than the exception…. Mainstream culture around the world has already embraced transhumanism and transhumanist ideals.
Well, then! Empirical evidence, maybe?
All we have to do is survive our embryonic stage, stay in control of our own destiny, and expand outwards in every direction at the speed of light. Ray Kurzweil makes this point in The Singularity is Near, a book that was #1 in the Science & Technology section on Amazon and [also appeared] on the NYT bestsellers list for a reason.
Ah. Well, if we’re going to use the bestseller lists as tea leaves, right now Sean Hannity’s Conservative Victory is at the top of the Times list, and Chelsea Handler’s Are You There, Vodka? It’s Me, Chelsea is #2. Does this mean conservatism and alcoholism have also already won?

Similarly, his other major piece of evidence is that it would be “hard for the world to give transhumanism a firmer endorsement” than making Avatar, a “movie about using a brain-computer interface to become what is essentially a transhuman being,” the highest-grossing film of all time. Okay, then surely the fact that the Pirates of the Caribbean and Harry Potter movies occupy five of the other top ten spots means even firmer endorsements of pirates and wizards, no? And actually, Avatar ranks only 14th in inflation-adjusted dollars in the U.S. market, far behind the highest-grossing film, which, of course, is Gone with the Wind — unassailable evidence that sexy blue aliens aren’t nearly as “in” as corsets and the Confederacy, right?
Mr. Anissimov’s post at least contains his usual sobriety and caution about the potentially disastrous effects of transhumanism on safety and security. But he and other transhumanists would do well to heed the words of artificial intelligence pioneer Joseph Weizenbaum in his 1976 book Computer Power and Human Reason:
The myth of technological and political and social inevitability is a powerful tranquilizer of the conscience. Its service is to remove responsibility from the shoulders of everyone who truly believes in it.
Keep Weizenbaum’s words in mind as we continue the Inevitability Watch. Humanity’s future is always a matter of human choice and responsibility.
Transhumanist issues are obscenely mainstream nowadays, who even cares. We’re not even edgy anymore. The excitement is over. It’s time to start racing towards a safe intelligence explosion so we can end the Human-only Era once and for all. Let’s just get it over with.
Of course nothing is inevitable unless people work to make it happen. Creating an open, dynamic future requires hard work, self-discipline, and perseverance. It requires intelligence and future time orientation (sometimes called executive function). Of course the future belongs to those who work the hardest and have the greatest enthusiasm for it. Why would you have it any different?
I'll bite, Ari. First of all, I grant that the inevitability argument is of course often cavalierly invoked to get around serious moral objections to one or another form of technological development. That said, it comes in various forms and rests on various sets of premises, some more compelling than others. (Françoise Baylis has a decent essay on this considering the case of genetic engineering.) The development of nuclear weapons is, of course, the stock example of something perilously close to totally inevitable, given a world of competitive sovereign or semi-sovereign states and a certain series of scientific developments. It may be that no aspect of the transhumanist project approaches this level of inevitability. But I still wonder whether there isn't something to their claim. Given a divided world, institutionalized and technologically oriented science, and the ascendancy of permissive democratic liberalism in the West, I do think there are good grounds for supposing the critics have much the tougher row to hoe. I wonder how completely the thesis can be rejected. I look forward to the series.
tlcraig: First, thanks for another excellent comment.
I quite agree with you that many technological and social changes are inevitable, for the reasons you've listed among others. But the purpose of this series is not so much to dispute that idea as to combat its invocation, as you mentioned, to preempt serious discussion of moral issues. Perhaps even more troublingly, the rhetoric of inevitability, as Weizenbaum says, serves as "a tranquilizer of the conscience," in that it allows the very people who are creating that change to think it is ultimately due to a sort of extra-human force for which they bear no responsibility.
At a more practical level, though some sort of technological change is probably inevitable, the specific form of that change is quite indeterminate, and very much subject to the choices we make at an individual and societal level. There are any number of new technologies that may have seemed inevitable not long ago but hardly seem so today — cloning, for instance, seems to have mostly faded from the public consciousness, and, this blog's recent debate notwithstanding, seems to be something of a fringe issue even among transhumanists. Technologies like this, and, say, embryonic stem cell research and steroids, are ones where you can clearly see that the path we took was influenced a great deal by public discussion.
We face similar situations ahead about other types of human enhancement, and the path we take is likewise not yet determined. Invoking inevitability in those discussions seems like a way of declaring victory before the game's even been played.
Nothing is inevitable unless someone exerts effort to make it happen.
I've noticed from watching the Deepwater Horizon mess that people assume that technological solutions to this problem must exist, and they therefore demand that someone find one of these solutions and implement it to put a plug or valve on BP's well in the Gulf of Mexico.
Nobody I know of has said that we should instead accept an increasingly contaminated Gulf as an existential reality, construct anxiety buffers to mitigate our terror when we encounter Deepwater Horizon salience, and deal with the situation the best we can through religion, philosophy and humor. In fact, we would probably laugh at anyone who proposed something that absurd.
In other words, we insist on technological "inevitability" when something sufficiently dangerous backs us up against a wall. If we could mobilize this sort of effort toward the transhumanists' goal of conquering aging and death through technological means, I would bet that it would make transhumanism seem considerably less flaky in short order.