There’s a famous thought experiment in identity metaphysics known as the Ship of Theseus. The question it poses: if the planks of the ship are slowly replaced over many years, then once every one of them has been replaced, is it still the same ship?

There’s another issue called the “teletransportation paradox”, famously analyzed by philosopher Derek Parfit and popularized largely by the sci-fi series Star Trek (and its many descendants). The idea is this: imagine a pair of teleportation machines - one scans you and instantly deconstructs you, while the other immediately reconstructs you at the destination. Once you make the journey, are you still the same person?

To bring in one more idea of note, an increasingly common thread in science fiction (and perhaps beyond, soon) is the concept of mind uploading: essentially, you scan your brain (perhaps destructively) and emulate it on a computer.

You might have already noticed that these three philosophical ideas share a common thread. But it goes even further than that: they are, really, three different ways of phrasing the same question: what are you? Specifically, what exactly is the part of you that we consider to be your identity - the conscious bit that acts like an unbroken thread of experience tying you to your past states of being? To put it a little more poetically, what part of you is your soul?

a mildly sadistic experiment

A lot of people tend to think that going through the teleporter or uploading yourself would kill you, the “original”, and simply create a copy. I used to think so, too. But to illustrate why this doesn’t hold up, let’s carry out a mildly sadistic thought experiment. Imagine I put you to sleep and use some extremely precise engineering to record a huge number of signals going into and coming out of a single neuron in your brain. I use these recordings to train a sufficiently rich neural network until it can approximate that neuron’s behaviour arbitrarily closely. I put this neural network on a tiny chip, and then… replace that neuron with it. Do you notice the difference?

No, of course not. Your neural state is quite robust, to the point where replacing a single neuron with an arbitrarily close approximation won’t meaningfully perturb it. We survive living in chaotic environments (in the physics sense) all the time - if your brain were that sensitive, there’s no way you could survive living in this reality. You’d collapse faster than a Boltzmann brain. (Not really, but it’s fun to say.)
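(For the programmers reading: here’s a toy sketch of what that “train a network to mimic one neuron” step could look like, assuming we’d somehow recorded the neuron’s inputs and outputs. The recorded_neuron function below is a made-up stand-in - a real neuron’s dynamics are far richer than a static input-output map - so treat this purely as an illustration of the fitting step, not as a serious model.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the recorded neuron: a fixed nonlinear map
# from three presynaptic input signals to one output signal.
def recorded_neuron(x):
    return np.tanh(x @ np.array([0.7, -1.2, 0.4]) + 0.1)

# The "recordings": lots of observed input/output pairs.
X = rng.normal(size=(5000, 3))
y = recorded_neuron(X)

# A small one-hidden-layer network that will learn to imitate the neuron.
W1 = rng.normal(scale=0.5, size=(3, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)
lr = 0.1

for step in range(5000):
    h = np.tanh(X @ W1 + b1)                   # hidden activations
    pred = (h @ W2 + b2).ravel()               # network's output
    g_pred = 2 * (pred - y)[:, None] / len(X)  # d(MSE)/d(pred)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)       # backprop through tanh
    W2 -= lr * (h.T @ g_pred); b2 -= lr * g_pred.sum(0)
    W1 -= lr * (X.T @ g_h);    b1 -= lr * g_h.sum(0)

# How closely does the "chip" now match the neuron on unseen inputs?
test = rng.normal(size=(1000, 3))
approx = (np.tanh(test @ W1 + b1) @ W2 + b2).ravel()
print("max deviation:", float(np.abs(approx - recorded_neuron(test)).max()))
```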

What happens if I continue the process? Is there really a point at which you stop being you? No, not really. As long as those neurons are performing the same computations they were before, you’re not losing anything by swapping them out for functionally equivalent models of the same thing. (There is, of course, the possibility that quantum mechanics plays some significant role in the brain’s overall computation, as in Penrose & Hameroff’s Orch-OR hypothesis, but this is heavily disputed for many reasons, one of which is that the brain is far too warm to support the coherence of quantum states for very long.)

If I finish this process, replacing every neuron with a specially engineered, functionally equivalent component, have you really changed at all? I don’t think we can say that you have. The sensory data going in through your sensory organs (which we could also replace) doesn’t know that it’s being computed by something else; the predictions your neocortex is constantly making about the future (we’ll talk about predictive coding in a later post) don’t know that they’re being generated any differently than before. The inputs are the same; the outputs are the same; and most critically, the way all that information is processed is the same. For you to be any different - to experience anything different - your experience of being you would need to come from such a minuscule, fragile part of that structure (such as in Orch-OR) that you couldn’t persist for more than a fraction of a second. This is the only real mechanism by which “substrate dependence” would make sense.

TLDR: I’m saying that the reason your neural states are so robust to chaotic noise is that they are functions of the overall way information is processed by the system. This might be blindingly obvious to some of you, but I’m trying to illustrate that this de facto implies what this post is all about: you are your information system.

you are your information system

All of the above would imply that you are effectively your information system: the information object which is encoded by your present neural state, together with the rules by which this state is updated and new information is processed. Because all of these macroscale dynamical rules are computable, you, too, are computable; this means you can be run on any universal Turing machine, and because you are your information system, you would still feel conscious, still feel like “you”. (I’m using the term “conscious” here to point at that quality of being that we all know, which I see as how being an information object feels from the inside.)
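If “the computation doesn’t know what’s running it” still feels slippery, here’s a small, purely illustrative example (my own sketch, nothing load-bearing for the argument): the same update rule - Rule 110, a one-dimensional cellular automaton that happens to be Turing-complete - run on two very different “substrates”, a plain Python loop and vectorized NumPy arrays. The state trajectory is identical bit for bit; nothing inside the system could tell which implementation produced it.

```python
import numpy as np

# Rule 110 lookup: new cell value as a function of (left, centre, right),
# indexed by the 3-bit neighbourhood 4*left + 2*centre + right.
RULE = [0, 1, 1, 1, 0, 1, 1, 0]

def step_python(state):
    """One update on 'substrate A': a plain Python list and loop."""
    n = len(state)
    return [RULE[4 * state[(i - 1) % n] + 2 * state[i] + state[(i + 1) % n]]
            for i in range(n)]

def step_numpy(state):
    """The same update on 'substrate B': vectorized NumPy arrays."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    return np.array(RULE)[4 * left + 2 * state + right]

# Identical initial condition on both substrates.
init = [0] * 63 + [1]
a, b = list(init), np.array(init)

for _ in range(100):
    a, b = step_python(a), step_numpy(b)
    assert a == list(b)   # the trajectories never diverge

print("100 steps, two substrates, one information system:", a == list(b))
```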

So with this knowledge, we can finally answer the three philosophical questions I brought up earlier. What, exactly, “is” the Ship of Theseus? I contend it’s the wear and tear on the planks of the hull - the information the ship has gathered about the seas through which it has sailed. Would you die if Picard ordered you to take the transporter down to the planet’s surface? I mean, if you’re wearing a red shirt, yes, but not because of the transporter; you’d still be you on the other side. Would you die if you uploaded yourself to a computer, leaving only a copy to go on? I mean, technically your body would die, but “you” - the thread of conscious experience that seems to persist between states, with your particular quirks of information processing that comprise your personality and memories - would continue.

first-person examples

Right, but what would that feel like?

Perhaps surprisingly to some, that’s not that difficult to answer. It would feel like your conscious experience does now. As long as your neural state was sufficiently well mapped from one place to another (from one end of the teletransporter system to the other, from your brain to the computer), and no memories were lost, you would still feel an unbroken thread of being connecting you to your past states, just as you do now. Sure, the environment would change suddenly and radically, but you would feel that it was changing around you. This is because your memories are exactly what tie you to your past, not just in a poetic way but in a “you feel like you’re the same person you were ten seconds ago” kind of way. Of course, memories exist within the present neural state, and merely reference past states; this gets weird with acausal shenaniganery, but I’ll save all that for later.

There is one last thing I’d like to take a look at. We’ll need to carry out one more thought experiment. Suppose I put you to sleep again and use some extraordinarily ill-conceived science-fiction technology to clone you, right then and there, with every single bit of biological ultrastructure exactly the same. I put one of you (you don’t know which) into room A and the other into room B. Each room has a sign hanging on the wall above the door. When you wake up, what’s the probability that you’ll see an A or a B on the wall?

It’s 50% - you could wake up to be either of them, and you truly don’t know which information system you are until you observe that sign. Now just briefly, as a teaser for something coming soon, I want you to stretch your neocortex and imagine what might happen if there were always infinitely many such information systems you could be, even after gathering all the data. And whether or not this would resemble the inherent uncertainty that resides in (a first-person view of) quantum mechanics.
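(If you want the 50% made painfully concrete, here’s a trivial simulation of the experiment above. The uniform prior over “which copy am I?” - a self-sampling assumption - is baked into the sketch rather than argued for, so this is just putting a number on the setup.)

```python
import random

random.seed(0)

trials = 100_000
saw_a = 0
for _ in range(trials):
    # The experimenter assigns the two identical copies to rooms A and B
    # (which copy goes where is unknown and makes no difference).
    rooms = ["A", "B"]
    random.shuffle(rooms)
    # Under a uniform self-sampling prior, "you" are either copy with
    # probability 1/2; you then observe that copy's sign.
    i_am = random.randrange(2)
    saw_a += (rooms[i_am] == "A")

print("P(see A) ≈", saw_a / trials)   # converges to 0.5
```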

Anyway, that’s all for now. Thanks for reading!

