[Continuing coverage of the 2010 H+ Summit at Harvard.]
As Gubrud described, Hopkins looks at the language used to describe mind uploading. What are the metaphors we use when speaking about it? The first is location: the mind is “in” or “within” a brain, and can be put “onto” a computer. The second is motion: the mind can be “moved,” “transferred,” “put” into a computer. And the third is substance: the mind is a thing that can be moved from one “receptacle” to another. But, Hopkins asks, do these metaphors really work? Is the mind truly an object that is housed “inside” a brain and can be “moved” to another “receptacle”? According to naturalist theories of mind, no. The positions that do think this, Hopkins says, are basically religious, relying on notions of souls, spirits, and ghosts.
Hopkins tries to absolve uploading advocates from blame; he says that they have just inherited this language from religion. I think it’s far more likely that they’re inheriting language and concepts from the discipline that gives rise to the notion of “uploading” in the first place: computer science. Computers are heavily dualistic systems, and transhumanists think the mind/brain is a computer, so they treat it as dualistic too.
Hopkins anticipates the rebuttal that this language is just metaphorical. But, he says, central to the idea of uploading is that personal identity is preserved. So the question is, does copying preserve identity? Is copying the same thing as transferring, as literally moving a mind? He says no: copying creates something that is exactly structurally and behaviorally similar to the original, but that is not the same as identity. The copied mind has a different history, and is made of different matter; we can metaphysically tell the difference (as usual, see SMBC). If you want to believe that the mind is a pattern, he says, then it’s important to know that a pattern is not an object that can be plucked out and moved; it’s a way of organizing matter.
He describes a familiar scenario from the philosophy of mind: You’re sitting in a room and someone holds a gun to your head and says he’s about to shoot you, but before he does that he’s going to copy your mind into the other room. You’d still be unsettled, but maybe you’d be okay because you’d think that you would just go to sleep in one room and wake up in another. But what if the gunman then said, “Just kidding, I’m not going to shoot you, but I still made the copy”? It couldn’t be you in the other room, then, could it? Well, your relationship to the mind in the other room is no different than it was a moment earlier; the only difference is that the gun is no longer at your temple. Mind uploading, Hopkins concludes, will not work as we like to think it will. (He doesn’t say it explicitly, but basically what he’s demonstrated is that psychological continuity is not all that is required for personal identity.)
Patrick Hopkins provides what is easily the best talk of the conference so far — he manages to convey sophisticated ideas effectively and concisely in a ten-minute slot that few other speakers have been able to own. And his message is convincing. Again, I wish the conference had put far more emphasis on talks of this level of thoughtfulness and speakers who were this effective.
I do have a few quibbles, though. First, Hopkins either misrepresents or misunderstands the significance of the argument he presents. To say that “uploading won’t work” makes it sound like he’s presenting a philosophical case for why we couldn’t have machines that are conscious, and whose consciousness very closely resembles that of existing persons. But his argument is based on the premise that we could. His conclusion is just that the results wouldn’t be as clean and transparent as everyone assumes.
So Hopkins’s claim is that a mind cannot be separated from a body and continued. But that is not quite the same as claiming that a mind cannot be copied. What if it could — what if a duplication were possible? Hopkins gives no consideration to the huge moral dilemmas that would arise if such beings were somehow created. If it were somehow technically possible, such duplicate beings might well consider themselves to have a continuous personal identity, complete with memories, thoughts, and feelings — only their memories, thoughts, and feelings about their own history of self would be false. The identity of the original being would be thrown into chaos just by the fact of its duplicate’s existence. How then would we treat such beings? Could we hold the copy responsible for crimes that it remembers having committed, but did not commit? Could we deny it credit for accomplishments it thinks it achieved but did not? These questions would become impossible to answer, and we would find many of the bases for our legal and social order similarly thrown into chaos, and impossible to resolve.
Maintaining continuous personal identity (and other really fundamental aspects of consciousness and mind) is not simply a philosophical matter of recognizing the necessary components, but a practical matter of maintaining them, socially and as lived lives. The conclusion Hopkins should arrive at by the end of his talk is not “this is why uploading won’t work” but “this is why we shouldn’t do it.”