Alice and Bob Talk Transporters
A dialogue on personal identity, psychological continuity, and Chihuahuas
The transporter problem goes as follows.
On Earth, there is a machine that a person, let's call her Jane, can step into. This machine scans her perfectly, atom by atom, and then disassembles her. The scanned information is then sent to Mars, where a machine 3D-prints a perfect, atom-for-atom replica of her body, who then goes on to live on Mars.
The philosophical question is: "Is that copy Jane, or is Jane dead and the person who steps out on Mars someone else?"
***********************************************
Alice, having been given a job on Mars, happens to run into her philosopher friend Bob on her way to the teleporter.
Alice: Oh great and wise philosopher... uh... Bob. Will I cease to be when I am teleported away to Mars?
Bob: Cease to be what, exactly?
Alice: Cease to be Alice! Will my experience end?
Bob: To put it in the third person, your question is "Will Alice continue to experience things", correct?
Alice: I'm not sure if one can put it in the third person. Surely, there is still a being that will continue to experience things on Mars. The question is, "Is that being me, or is it someone else?"
Bob: Ok, well then the question is "What defines something, or someone, as 'you'?" Is it your memories, thoughts, personality?
Alice: Yes, all of those things, but not just those things. Look, suppose the transporter didn't disassemble me, and instead a perfect, down-to-the-atom reconstruction of me appeared right beside me. That Alice would be closer to me than an identical twin, surely. It would have all the characteristics you pointed out, but it wouldn't be me.
Bob: Well, it sounds like you have your answer then! If the perfect replica wouldn't be you, then the perfect replica wouldn't be you!
Alice: And yet, that still isn't a satisfying answer. Suppose physicists tell the general public that in our universe, every, I don't know, Planck second, or whatever, our bodies cease to exist for a "flash", only to be perfectly reformed in the next moment, with our atoms switching out for other, identical atoms because of mumble quantum mumble something-or-other. If that were the case, I would shrug my shoulders and move on with my life; no existential dread about being a new person, no second thoughts about how I live my life. It wouldn't influence my decisions in the least. It would have as much impact as the Boltzmann brain problem. A "Yeah, sure, maybe I'm a Boltzmann brain, so anyway, what should I get for lunch" sort of thing.
Bob: I see. And what if this "flash" of non-existence did not happen every moment, but could instead be induced by stepping into a machine?
Alice: I would be worried the machine would mess up.
Bob: But this is fighting the thought experiment on practicalities! Surely your objection to the transporter isn't based on practicality, is it? It isn't "I don't trust the transporter to faithfully recreate me, atom-for-atom". It's, "Even assuming it does, I still have philosophical issues with it". So, let's assume here that the machine is totally equivalent to the "flash" you spoke of; it works perfectly.
Alice: Hmm... I suppose... it shouldn't be any different then? I should be indifferent between going into the machine or not? If I wasn't afraid before, then why should it bother me merely because it is happening inside a machine? I am still slightly uncomfortable with the idea, but I can't say why. There's still some weirdness with this "flash machine", though.
Bob: "Weirdness" is just a heuristic, is it not? We call things weird because they are new, untested, or untrustworthy. Suppose the transporter had been invented only last week, rather than decades ago. If you had not grown up, learning about the transporter in school, seeing celebrities use the transporter and come out on the other side, no worse for wear, and seeing your friends do the same, would it not be just as weird to you as the hypothetical machine is now?
Alice: I guess in that case, you're right, the "flash" machine wouldn’t feel weird to me, and I would have no real objection to using it.
Bob: So is there a difference between this machine and the teleporter?
Alice: There are certainly differences. For example, with the teleporter, "I" would reappear in another place, after some non-negligible amount of time.
Bob: So it sounds like you are saying the issue is that both space and time elapse before reassembly? But every moment we walk from one end of the room to the other, both space and time are elapsing, no?
Alice: That's different! For one thing, the atoms that make up my body are largely the same, say 99.999% of them. They'll get switched out eventually, like the Ship of Theseus, I admit, but... it still seems like I'm the same me even after they are all replaced.
Bob: Surely not exactly the same you? The you of right now has thoughts that the you of yesterday didn't! Even if the atoms were preserved, they'd be in a slightly different arrangement compared to yesterday. As for the Ship of Theseus, it seems like you're saying that it is indeed the same ship at the beginning and at the end, when everything has been replaced, no?
Alice: Correct. Do you think differently?
Bob: Yes! I say we keep calling it the same ship because it’s practical, not because there’s some deep metaphysical identity preserved. The copy is still a ship of Theseus, but for convenience we say it’s the ship of Theseus. Similarly, the "you" of today and the "you" of yesterday are both an Alice, but we just say Alice for practicality.
Alice: Ok, I do accept the point about practicality, but there's a difference here. If Mars Alice is poked with a needle one day from now, I will not feel it, but she will.
Bob: And if I prick you with a needle 5 seconds from now, who will feel it?
Alice: Me, of course!
Bob: Yes, the you of 5 seconds from now would feel it, once the needle enters the skin and the pain receptors fire up the chain all the way to the brain. But the you of right now will not feel that pain. It's the same scenario here. You claim that you, the Alice standing before me, will not feel the needle on Mars, and that is surely true in that our past selves never feel what happens to them in the future!
Alice: Of course, but this misses the point. Say I had been told that I was to be hurt tomorrow: I would wince at the prospect of future pain, knowing that I would be the one feeling it, in a way that I wouldn't if it were happening to someone else, or, for that matter, to Mars Alice. I may feel empathy for someone else's future pain, but the visceral wincing wouldn't happen.
Bob: But this is only evidence of how you view Mars Alice, not necessarily of any real property of Mars Alice. Here, we are talking about your feelings, not philosophy, not reality! You wince at the pain now so that you may take actions to avoid it, and thereby have greater reproductive success; but evolution had no reason to give you good intuitions about a teleporter; no reason, I mean, to make you think of a teleported being as yourself.
Alice: Hmm...that does make evolutionary sense, but explaining why I feel that way is just bulverism. It doesn't address the actual argument.
Bob: Not so! You imply that Mars Alice cannot be you because you do not feel the same way about her as you do about a future Earth Alice. I am merely giving an explanation for why we should expect to feel that way even if the underlying reality is that Mars Alice is just as much "you" as a hypothetical future Earth Alice.
Alice: Fine, it isn't bulverism, but there is still a difference between the me of the future and the "me", if indeed it is a "me", on Mars. There will be an unbroken chain of consciousness here on Earth, but not on Mars.
Bob: I admit this is true. And I'm sure you've heard the thought experiments about sleep, with the general counterarguments that the brain is still active during sleep, but let us sidestep that and go further. As it stands, we have general anesthesia that fully stops one's consciousness, leaving only subconscious processing.
Alice: Well, we don't know for sure if that's true. It could be that there is still conscious thought, but we simply don't form memories, and so can't report on it afterward.
Bob: Technically possible, but if you underwent some procedure under this anesthesia, only to wake up fully alert, would you be at all worried that you were a new Alice?
Alice: Maybe I would be? I admit that once I woke up, which is to say, whoever woke up, that person wouldn't feel like a "new" person. They would just feel like Alice, just like the Mars situation. But perhaps we can define "me" as the being that has had no break in consciousness regardless. Perhaps we really are new people once we get general anesthesia?
Bob: When a stance is a matter of grouping individuals into categories, one can always define the categories in ways that include everything we want to include while excluding everything we want to exclude. Take this too far from common usage and you end up with a No True Scotsman, but I don't think this qualifies. To put it another way, you are free to define "Alice" in a way that excludes Mars Alice while not excluding Earth Alice, and you can do so without internal contradiction; defining it via continuity of consciousness is one such way.
Alice: But surely you're arguing that a definition that includes Mars Alice is better than mine?
Bob: Of course, but I can't say anything that will logically force you to accept that definition. I can point out that continuity of consciousness doesn't handle sleep or anesthesia very well, and you can counter (as you did) that maybe consciousness persists in some way during those states. You're hanging your hat on a supposition there, though, in a way that I'm not. If philosophers and scientists solve the Hard Problem one day and determine that anesthesia does, in fact, fully stop consciousness, what then? Whether they prove that or its converse, it makes no difference to my philosophy, but what about yours? Do you avoid anesthesia from then on, or do you shrug your shoulders and say "I wonder what I'll get for lunch after my surgery?"
Alice: Surely there's an objective answer. It is either the case that they would be a new person, or they would be the same person, after anesthesia. It doesn’t seem like it’s just a definitional issue.
Bob: Hmm… consider this: Suppose we had your atom-for-atom data, and we had my atom-for-atom data. Then suppose one single atom was removed from your body and replaced with an atom so that your body more closely matches mine. You are then, say, 99.999% Alice and 0.001% Bob. And then…
Alice: And then this procedure happens iteratively, until I am 100% Bob, correct?
Bob: Yes, so even though each step takes you just as far away from “Alice” as walking down the street does, at the end you are no longer Alice. How do we resolve this? Just as the paradox of the heap¹ is resolved when you more closely define how many grains of sand are in a heap, this paradox resolves when you more closely define Alice. To put it succinctly, the definition of “Alice” is a definitional issue.
Alice: But how can we define Alice in any way other than through continuity of consciousness? When I was a baby, I was not like myself today. Perhaps I hated peas as much as I do today, or some silly thing, but, other than things like that, what binds us together?
Bob: DNA, perhaps?
Alice: Now you're just teasing me. Identical twins aren't the same person, and their DNA is... well... identical.
Bob: Quite right, but I thought I'd mention it nonetheless. In any case, there is a way to bind you and your infant self together without bringing in continuity of consciousness. And that is with identity being a moving target.
Alice: A moving target? That can't be rigorous; it's almost the definition of unrigorous!
Bob: Is that so? Well, consider the humble banana.
Alice: If you start going all Ray Comfort on me, I'm outta here.
Bob: Fine, fine. Consider the humble Chihuahua. If I took a Chihuahua and stood it next to its wolf ancestor, there is no sense in which you would call the two the same species. And if you wish to argue with me on that, then let me pull it back further, back to a fish ancestor if required.
Alice: Point taken, a wolf will be just fine. I accept that modern dogs are descended from, but are not the same species as, their wolf ancestors.
Bob: Excellent. Consider (1) that there is an unbroken lineage from a Chihuahua back to a wolf ancestor, and (2) that each child is undeniably the same species as its parent. If we had a complete record of the lineage, we could try to draw a border between the "last wolf" and the "first dog" in that chain, but then again, that violates our second condition, doesn't it?
Alice: Yes, wherever we draw the border, it splits a parent from its child: the last wolf gives birth to the first dog, and therefore they are not of the same species.
Bob: Precisely! And here is the way to solve that. We recognize that "wolf" and "dog" are just names we've given retrospectively based on their characteristics. Each time a new dog is born, it ever so slightly shifts what it means to be a "dog". This is nearly imperceptible at human time scales, but as new breeds emerge, and new traits emerge in dogs (perhaps dogs being bred to have longer snouts, or something), the definition shifts ever so slightly. What it means to be a dog is, indeed, a moving target. If we had to define "dog" 10,000 years ago, it might not include the characteristics of a Chihuahua. Which is to say, if a Chihuahua had appeared ex nihilo, the first people to encounter it would perhaps put it in the “fox” category rather than the “dog” category. If they had genetics, they might note that it is closer to what they call a dog than a fox, but still balk at including it in the dog category, and opt instead to give it its own category, since a Chihuahua is quite far from what they considered a “dog” 10,000 years ago.
Alice: I get it, so you're saying that "Alice" is much the same as "dog". From when I was a baby, each act I chose slightly changed the definition of being Alice, so even though I have little in common with myself as a baby, I am still considered the same person.
Bob: Yes, that is certainly one way to see it, but I have one further point. Suppose I went to the lab, took the genome of an ancient wolf (don't ask where I got it; I have my ways), and systematically changed the genes so that it came out genetically identical to a modern day Chihuahua. Would it be a wolf?
Alice: Umm... no, I guess we would still call it a Chihuahua, a modern dog.
Bob: And what if I went further, and edited it starting from, say, a buffalo genome, or even from scratch?
Alice: That sounds like a lot of work, but I suppose it would still be a Chihuahua?
Bob: And yet, it has no unbroken chain of descent from a modern dog, or even from a wolf. You see, you've defined the Chihuahua by its genetics. In having those necessary features of a Chihuahua, it falls within the moving target of what we call a modern dog, even without the unbroken chain.
Alice: But a Chihuahua is a group; that's why I can define them by their genetics. I am an individual, and just like with identical twins, I cannot be defined solely by my genetics.
Bob: Quite right that you are not defined solely by your genetics! But I argue that you are a group! Baby Alice and current-day Alice (and all the days in between) make up a group, even if they are all separated by time. So similarly, we can say that Mars Alice, having the necessary features of an Alice (not just genetics, but also personality, thoughts, feelings, memories, etc.), can also be considered a dog, ummm... excuse me, an Alice.
Alice: Hmmm... And what if I changed my mind, and said, "No, it wouldn't be a Chihuahua, because it did not descend from a Chihuahua"? After all, I can still accept your point about moving targets while saying that it is, I don't know, a pseudo-Chihuahua; something that superficially has the traits of a Chihuahua, but doesn't have its defining characteristic, which is being born of two parents whose DNA could combine in such a way as to form a Chihuahua genome.
Bob: I would then say "That's totally fine."
Alice: Huh?
Bob: Yes, indeed, that is a totally valid philosophical stance. Remember what I said before, Alice. You can define identity in a way that excludes Mars Alice and includes Earth Alice. Similarly, you can define Chihuahua in a way that excludes a lab-grown Chihuahua from being considered one. It seems a bit like you made your definition solely to avoid having this creature be considered a Chihuahua, though, and I don’t think that’s particularly good practice.
Alice: But isn’t that essentially what you’re doing here? Just defining Alice in such a way that it includes Mars Alice?
Bob: Not really, no. I’m defining Alice by the things that make you, “you”. Your thoughts, memories, personality, relationships, obligations, emotional reactions, fears, hopes, and even minor things like the shape of your body, or the color of your hair. Ultimately, those are the things that matter here. If you woke up to find that all of those things had changed, even without a break in continuity of consciousness, you would, for all intents and purposes, be a new person. It would mean little to you that you “used to be” Alice if all those things changed. Mars Alice has all those things, so Mars Alice qualifies, in my book.
***********************************************
Alice: That is all well and good, but I am still unconvinced that your view is, in fact, without contradiction. Let us try this: Suppose the disassembly process was delayed, perhaps to make sure that Mars Alice is, in fact, safe. Suppose then I could look into her eyes, feeling my own consciousness, and she could return my gaze, feeling her own consciousness. Suppose then that I am disassembled as scheduled. Would my consciousness be transferred into her body? That seems absurd!
Bob: It is indeed! Why should your consciousness be the sort of thing that travels to the most suitable host? Is your consciousness a parasite? Is it drawn by some pseudo-magnetic force that brings it to the new body? Your consciousness doesn't "get" to her. It is a part of her, a pattern in her brain and subject to the structure of it. Your consciousness was copied when your brain was copied, atom by atom.
Alice: But then there would be two Alices, two versions of myself!
Bob: There are already two Alices, or rather, millions of Alices. Each second you are a new Alice. The only difference is that here, there would be two Alices at the same time. Would that be impossible?
Alice: Yes! My feelings would be mine, and hers would be hers! Imagine we go on to live our own lives for decades afterwards, rather than me being disassembled. Wouldn't we have separate consciousnesses?
Bob: Yes, I would say you would indeed, just as identical twins who split in the womb have separate consciousnesses, just as branches of a single species can diverge and become distinct. But in the case of the teleporter, you do not diverge to any real degree, and thus there is no problem that I can see. Why must the model of identity that served us before transporters existed continue to serve us now?
Alice: And what if I'm not interested, not just in your model, but in any model at all?
Bob: My friend, you certainly are interested in a model! You were able to note the absurdity of your consciousness flying through the skies precisely because you have a model of consciousness.
Alice: True enough, but still, having a model and accepting your model are different things.
Bob: I cannot force you to accept the model. I can only show that, given that the definition of your identity is indeterminate, the model works without contradiction. But by all means, if the price of accepting this model of identity is too high, do not pay it²; if you cannot accept the model where there can be two simultaneous, equally valid versions of Alice, then do not act according to it! To me, holding the model that there can be two valid instances of what was once a single person is a small price to pay. To you, perhaps, it is a large one. But at a certain point, you must ask yourself how the model is serving you in your life. Speaking only for myself, having seen no contradictions in the model, I see no reason to reject it, and if I therefore had a desire to go to Mars, I would gladly step into the teleporter. However, if it is the case that baked into your being is the belief that there can only be one "Alice" at a time, then so be it!
***********************************************
Alice: Even if I accept that the Mars Alice would be me, in some sense, my real concern remains... why does Earth Alice have to die so that Mars Alice can live? Suppose I accept all that you've said: that Alice will be a "me", along with the "me" on Earth. Why, then, must I be disassembled here on Earth? Even if Alice is happy living on Mars, I want to live; I want an Alice living here on Earth too...
Bob: That, I'm afraid, is not a question for philosophy; it is one for our scientists working at the lab. Perhaps, contrary to our thought experiment where you are kept safe throughout the whole process, the scanning process is necessarily destructive.
Alice: But Bob, you aren't hearing what I'm saying. I don't want any Alice to die, not here, and not on Mars.
Bob: I do hear you, my friend. There is no denying that your disassembly will be a death, albeit a painless one. But let us not fall for the Noncentral Fallacy³. Which is to say, I caution you, dear friend: do not view a death where an Alice still lives as the same as a death where there is no longer an Alice at all.
Alice: Death is permanent; that's what it means to be dead. When I am disassembled, I will be dead, forever.
Bob: An Alice will have died, no doubt this is true, and there will have been a death, but you, Alice, will not be dead. To be dead means to no longer exist, forever, as you say. Yet here, an Alice would still live. Before these machines, all who died were dead. Now, it is not always so. Death is bad because it causes great sadness, and further, it robs the universe of all the good things you would do and all the good you would experience. That is not so true here. The Alice on Mars will still see your family and friends, or, equivalently, her family and friends; you, on Mars, will make new friends too. You'll have good experiences there that you would not have here.
Alice: But I... if I were still alive here on Earth, I could have good experiences here too. Things would be complicated with my family and friends; there being two of us would be hard to understand, especially as we diverge over time; but it would still be good. I could bring more good into the world by existing here on Earth than by not.
Bob: That I cannot deny. It would be better to have two of you than only one. But, as it stands, that is not the choice before us. You, and perhaps some day I, are choosing whether to exist on Mars or on Earth. That is currently a limit of our technology. It would be better if we could change it, but as of now, we cannot.
Alice: Somehow, that isn't satisfying to me. I don't deny what you say, but it doesn't fit with how things should be.
Bob: Here, philosophy can only describe what is; it can't promise to change reality into what ought to be. We are constrained in our outcomes, and here, we do not have the choice of two of you existing at the same time. I wish I could offer something more comforting that would also be true. I wish I could tell you "One Alice is just as good as two", but I don't believe that. I do believe that one Alice is as good as another Alice, but in the end, the choice is yours. No one will force you onto the teleporter. If you decide to believe as I believe, you can walk onto it without fear. If not, then refuse the offer. That is all I can counsel you on the matter.
² “For how much is lettuce sold? Fifty cents, for instance. If another, then, paying fifty cents, takes the lettuce, and you, not paying it, go without them, don’t imagine that he has gained any advantage over you. For as he has the lettuce, so you have the fifty cents which you did not give. So, in the present case, you have not been invited to such a person’s entertainment, because you have not paid him the price for which a supper is sold. It is sold for praise; it is sold for attendance. Give him then the value, if it is for your advantage. But if you would, at the same time, not pay the one and yet receive the other, you are insatiable, and a blockhead. Have you nothing, then, instead of the supper? Yes, indeed, you have: the not praising him, whom you don’t like to praise; the not bearing with his behavior at coming in.” (Epictetus, Enchiridion)