
marvinthedog t1_isfhtdt wrote

My theory is this: if an agent is super intelligent, rational, and conscious, it should rationally realise that its current self is technically separated from its former and future selves just as much as it is separated from other conscious agents. Therefore it should rationally value all conscious entities as much as its own consciousness.


I know that the vast majority disagrees with me about the premise this theory rests on (that your current self is separated from your former and future selves just as much as your current self is separated from other conscious people). This gets hotly debated in discussions of the teleportation dilemma and mind uploading.


AdditionalPizza OP t1_isfqvb1 wrote

Hmm, personally, from that reasoning I would say your future self is less separated than other people are, since your actions in the present directly affect what you will feel. But I suppose in this theory the present doesn't even exist, so I don't know.

I can't remember what someone else thinks, though, the way I can remember my own past thoughts, so I don't know how much I agree.

But anyway, what were you getting at?
