
Frumpagumpus t1_ja2muar wrote

probably the easiest way is to die.

freeze your brain, slice it into thin sections, map out the connectome with a frozen-brain-slice scanner, and make a mostly accurate clone of yourself that can experience the matrix for you, one your family and acquaintances won't be able to distinguish from you (well, except for the fact that your clone is in the matrix).

i expect software intelligences will do this all the time. among other things it will probably be a useful learning technique: clone yourself n times; study, debate, or act (e.g. alphazero generating its own training data); then perform a merge operation that "kills" all the clones by folding them back into one thing (or just wind down the number of instances once you no longer need them to generate data).
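
something like this shape, in toy python (to be clear: `study` and `merge` here are made-up placeholders for the sake of the sketch, not any real mechanism):

```python
# toy sketch of the clone -> study -> merge pattern (illustrative only:
# "study" and "merge" are invented stand-ins, not a real uploading API)
from multiprocessing import Pool
import random


def study(seed: int) -> dict:
    """one clone goes off on its own and generates some 'training data'."""
    rng = random.Random(seed)
    return {"seed": seed, "lessons": [rng.random() for _ in range(3)]}


def merge(results: list) -> dict:
    """fold every clone's experience back into a single state; the clones
    themselves simply stop existing once this returns."""
    return {"lessons": [x for r in results for x in r["lessons"]]}


if __name__ == "__main__":
    n = 4
    with Pool(n) as pool:                      # "clone yourself n times"
        results = pool.map(study, range(n))    # each clone studies independently
    merged_self = merge(results)               # the merge "kills" the clones
    print(len(merged_self["lessons"]), "lessons folded back into one self")
```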

death will not be the same after the singularity (even though there is probably a way to connect yourself to the matrix without dying, i'm not sure software intelligences would see the point of bothering, given that death is probably a human preoccupation)

−1

DNMbeastly t1_ja2n3bh wrote

Sadly that would only be a clone, meaning you would still die all the same, with someone else taking your place who merely pretends to be you.

9

Twinkies100 t1_ja3woor wrote

I thought about this a few months ago. I think that to make sure our consciousness transfers to the non-biological brain (which will be built to have the same conscious experience as we do), we'll have to modify our biological brain so that it can communicate with the artificial one. That isn't cloning, since the brain isn't separate in this case.

1

okaterina t1_ja2ypke wrote

It doesn't matter, as long as the duplicate has the exact same memories, thought schematics, quirks, and mannerisms. There is no difference between *being* someone and perfectly pretending to be someone. Are you sure you are the same person you were ten years ago, when not a single atom in your body remains from that time?

0

DNMbeastly t1_ja35vwi wrote

No, I'm not the same person, nobody is... but my continued conscious experience is still alive. Don't conflate the two. When I say pretend, I mean that the copied brain or whatever apparatus thinks it's me, but it will never truly be me.

5

okaterina t1_ja3ud13 wrote

Never "truly" ? Define truly then. Definitely not the sum of atoms (as you agreed).

What if the copy takes place in a few milliseconds - or during your sleep. The duplicates will both have a "continued conscious experience".

BTW, that's the physicalism philosofical theory - the substrate of consciousness does not matter.

0

DNMbeastly t1_ja485sj wrote

I'll retract my previous statement and just say that what you're trying to argue has ZERO meaning. Why? Because theories are theories, consciousness is not proven to be physical, and even if it were, it doesn't matter in this context.

There are key components of anyone's experience as a human that don't rely purely on brain matter. You being present in time right at this moment, with all your sensations, making decisions, is enough to make your experience valid. I'm honestly too fucking tired to do an in-depth rundown, but imagine this: if you made an exact copy of someone and put them in the exact same environment, would their thoughts follow the same exact order sequentially? Or would they be highly variable? That deviation in itself would prove that a mere copy of atoms does not equate to you. You see, the thing is, everything you do, every decision you make, is all a part of what makes you, you. What I'm getting at is that your mental state is directly tied to your physical state as time passes through you.
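
A toy way to see what's at stake in that question, with a chaotic map standing in for "a brain in an environment" (which it obviously isn't): two copies run under the same deterministic rule from literally identical states stay in lockstep forever, but the smallest difference, or any noise at all, blows up fast.

```python
# two "copies" iterated under the same deterministic rule (a chaotic logistic map):
# a bit-identical copy never deviates, while a 1e-12 difference blows up quickly.
def step(x: float) -> float:
    return 3.9 * x * (1 - x)        # logistic map in its chaotic regime

original = exact_copy = 0.4
perturbed_copy = 0.4 + 1e-12
max_gap = 0.0
for _ in range(100):
    original = step(original)
    exact_copy = step(exact_copy)
    perturbed_copy = step(perturbed_copy)
    max_gap = max(max_gap, abs(original - perturbed_copy))

print(original == exact_copy)   # True: identical state + identical rule = identical trajectory
print(max_gap)                  # large: the 1e-12 difference has grown to order one
```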

5

okaterina t1_ja4hn82 wrote

"Consciousness is not proven to be physical". What else ? If you start speaking about the immortal soul, I am out of this discussion (sorry, pure agnostic atheist here).

Now you speak of the mental state as tied to the physical state as time passes. But as Descartes said, "I think, therefore I am", i.e. input from the senses cannot be trusted. Why would you trust your eyes? Do they really show you the truth? What about visual illusions?

What I am saying is that I do not need to be meat, I need to have the *exact* same processes (and that is the point I am willing to discuss: would that ever be possible?). Is the brain a super-processor? What part does randomness play? Can a neuron and its axons be modeled with enough precision to reproduce a thought process? And finally, is it even necessary to simulate neurons and axons to reproduce a thought process? Could another substrate, other mechanisms, give the same results? Is it possible to feed it replica inputs, so that it thinks it sees with its eyes, hears with its ears (and can touch its ears, since touch is just another sense to duplicate/copy over)?
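
For a sense of what "simulate a neuron" even means at the crudest level, here is roughly the simplest model people use, a leaky integrate-and-fire unit (whether anything this crude could ever capture a thought process is exactly the open question):

```python
# minimal leaky integrate-and-fire neuron: the membrane voltage leaks back toward
# rest, an input current pushes it up, and crossing a threshold emits a "spike".
def simulate_lif(current=1.6, dt=0.1, steps=1000,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0, tau=10.0):
    v, spike_times = v_rest, []
    for t in range(steps):
        v += dt * (-(v - v_rest) + current) / tau   # leak toward rest + input drive
        if v >= v_thresh:                           # threshold crossed
            spike_times.append(t * dt)
            v = v_reset                             # reset after the spike
    return spike_times

print(len(simulate_lif()), "spikes from a constant input current")
```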

I do not have the answer to the questions above.

BUT

If I have a *perfect* copy, and therefore one that is not discernible from the original, it will be the same. Just as in 1=1: the '1' on the left is not the '1' on the right, but they have the exact same properties and behavior in mathematics. You can use either of them.

Ask yourself: what is the difference between an original and a *perfect* copy? If there is any difference, then the copy is not perfect.
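
Programmers already have this distinction: equality versus identity. A quick Python sketch (any language with a notion of object identity would do):

```python
original = [1, 2, 3]
perfect_copy = list(original)      # same contents in every observable way

print(original == perfect_copy)    # True: no test of their contents can tell them apart
print(original is perfect_copy)    # False: they are nonetheless two distinct objects
```

My claim is that for a *perfect* copy, only the first of those two tests carries any meaning.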

0

Frumpagumpus t1_ja2n6uw wrote

yep, that's what I said

but my contention is that it doesn't really matter

it's the star trek teleporter problem

humans dislike death because of the pain associated with it and the loss of family and friend-group cohesion and institutional knowledge. in practice we even shut our awareness down for stretches when we sleep. software forks and kills processes and services permanently all the time.

−3

helpskinissues t1_ja2ofr8 wrote

Yes. We will all die to become enhanced versions of ourselves. Suicide cult soon.

1

Frumpagumpus t1_ja2ucop wrote

https://youtu.be/WYsDy41QDpA?t=241

but yeah, i'm not gonna volunteer to be the first one to have my brain sliced up. but if you're going to die anyway, why not die in a way that makes sense?

as far as we know, entropy comes even for superintelligences

1

visarga t1_ja2vym8 wrote

You don't need to do all that. Train a model on your data without destroying your body, using just what can be logged from outside. It will be enough. ChatGPT can adopt a persona with just a handful of hints. I think the AI of the future will be able to replicate any personality without fine-tuning.
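
Roughly this, with any current chat model (just a sketch: it assumes the openai v1 Python SDK with an API key in the environment, the model name is a placeholder, and the hints are invented examples of what could be logged from outside):

```python
# sketch: getting a persona from a handful of externally observable hints,
# with no fine-tuning, just a prompt.
from openai import OpenAI

hints = [
    "signs off messages with 'cheers'",
    "explains everything with cooking metaphors",
    "mildly obsessed with trail running",
]

persona_prompt = (
    "Adopt the following persona and answer in their voice:\n- "
    + "\n- ".join(hints)
)

client = OpenAI()                        # reads OPENAI_API_KEY from the environment
reply = client.chat.completions.create(
    model="gpt-4o-mini",                 # placeholder: any chat model will do
    messages=[
        {"role": "system", "content": persona_prompt},
        {"role": "user", "content": "How was your weekend?"},
    ],
)
print(reply.choices[0].message.content)
```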

1

Frumpagumpus t1_ja37luu wrote

you might be right on that count lol, but still that clone is even less me than the brain slice clone! And I'm still stuck here in that situation!

2

FirstEbb2 t1_ja85z99 wrote

I'd put that choice after cryonics and whatever other amazing techniques I don't know about yet. At least this kid would definitely be easier for me to love than the offspring who will inevitably do things I hate in the future, and I would regard him as my "son plus".

1