
TotalMegaCool t1_j6ardjl wrote

I'm ready! I still need the home though.


Jenkinswarlock t1_j6axo3a wrote

I’m so ready but I’m so scared to die before then


Steven81 t1_j6chw1k wrote

There is a downside to being infused with such ideas from early on. I'm prolly older than a lot of this sub (hopefully not by a lot), but I came in contact with The Age of Spiritual Machines and similar ideas in the mid-to-late 90s...

Ever since then, no matter how small the danger (to my life), I get spooked. For example, I had a minor hospitalization lately; my IV-hooked veins often develop phlebitis soon after. A mostly benign condition that almost never develops into something worse, like DVT.

Yet I'm losing sleep over it, misjudging even slight muscle pain in the upper arm as the start of some nasty DVT. That's not even my first time with phlebitis. I get it almost every time I get hooked up to IV lines (for often silly reasons), so my subconscious should have been trained by now.

I was not like that at all as a kid. I think my hope that longevity escape velocity happens in my generation has made me paranoid in some subconscious manner, and I'm acutely aware of my possible mortality, more than I would otherwise be.

I hope that younger generations who learn/read of such stuff do not fall into this pitfall. Whether you have a lot to gain (or less) by staying alive for as long as possible does not make your death at this moment more imminent/probable. Yet that's the subconscious feeling (suddenly each danger is acute) that often arises if you let it.

Be aware, live your life! Obviously avoid stupid dangers, but often that's enough; your body (and some medical checkups as you grow older) takes care of the rest for the vast, vast majority of cases (which very probably includes you).

Me stressing over it, even subconsciously, has actually made my health worse than it would otherwise be (stress in general will do that). It's ironic, but it is there. Be aware, our minds can be stupid like that.


Frumpagumpus t1_j6g4uu2 wrote

then there is me, where at first i was like, naw, i won't destructively upload my mind into the computer because i want causal continuity. but then I thought about it some more, and I think causal continuity may be an old person value soon lol. screw it, i would rather be in two places at once; i will precommit myself to it XD

(i won't be the first person in the star trek teleporter, but heck yeah i would use it)


Steven81 t1_j6hw8fd wrote

Well, you are a biological being, i.e. you are your neurons (which is why there is no turnover in them). I don't think that full uploading will ever be a thing, because mass suicide has never been a thing. We are material, not immaterial; hardly anything is immaterial. That's a category error that many in futurism make (just because we have a name for something doesn't mean it is an actual physical thing; for example, every instance of a certain piece of software installed on a different computer is actually a different program each time, i.e. differentiated matter gives rise to very similar behavior, a bit like how monozygotic twins are actually two different people no matter how alike they seem).

I doubt that materialism will ever be proven wrong, but I guess that's a question for a different thread.


Frumpagumpus t1_j6i0i4n wrote

> you are your neurons

why does that matter? you go to sleep every night and the cessation of consciousness doesn't bug you.

People have died for stupider reasons than "I want to create a clone of me that has my values and can clone themselves and possibly shut down their clones if needed such that they can perform tasks in parallel, oh and they also get a massive speedup and don't require nearly as much space or resources and could thus go into space much more easily and can save quite a bit of time on maintenance etc."

it's for the cause (though to be clear, i am not actively betting that destructive brain uploading will be a thing; more like, even if you had non-destructive brain uploading or some ship of theseus stuff or whatever, once you were actually in the computer you would find it VERY VERY convenient to clone yourself. software processes fork all the time, and their children are killed and garbage collected with reckless abandon)

if you were trying to preserve yourself biologically probably the easiest way would be to stick your brain in a jar lol. which i bet a lot of people would also find morally objectionable XD
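The fork-and-reap pattern the comment above gestures at can be sketched in a few lines of Python (a minimal illustration using POSIX `fork()`; the "child as disposable clone" framing is just the comment's analogy, not a claim about uploading):

```python
import os

# The parent process "clones" itself; the child shares the parent's
# state at the instant of the fork.
pid = os.fork()
if pid == 0:
    # Child: a short-lived copy of the parent. Do some work, then exit.
    os._exit(0)
else:
    # Parent: wait for the child and reclaim ("garbage collect") it.
    _, status = os.waitpid(pid, 0)
    print(f"child {pid} reaped, exit status {os.WEXITSTATUS(status)}")
```

Operating systems do this constantly; whether a mind running as software would regard its own forks with the same indifference is exactly what the thread is debating.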


Steven81 t1_j6i1yei wrote

> and the cessation of consciousness doesn't bug you.

It only matters in a platonic universe. If we do live in a materialistic world, then it doesn't matter, because we are a thing which can switch on or off, but we are that thing regardless; if it switches off and then doesn't switch on, only then is it an issue. In fact, materialism sidesteps a lot of the platonic/neoplatonic issues (immortality of the soul and such ideas).

The only issue with creating copies of yourself (in a materialistic world) while ceasing the function of the original is that you lose context, i.e. you do not see the world go on; instead you see it stop at the moment of the original's cessation. You can say that the world goes on in some higher level/sense, but that's not at all what you're going to "see". You are going to see/experience the end of all. Which is -btw- why mass suicide has never caught on in most/any society; most people can make said connection on some subconscious level ("if I die, the world doesn't actually go on, at least not in any way that matters to me; at best it becomes a parallel universe to mine")...


Frumpagumpus t1_j6inapz wrote

idk seems more like a solipsistic point of view than a platonic one

(i actually consider myself a bit of a platonist, in particular i think the distinction between the space of ideas/math our brains/gpt navigate and the physical space we move through might be a bit more subtle than it seems on the surface, but i don't think that really makes any difference to the present discussion (well, not in the way you seem to be arguing it; actually i think it could almost go in the opposite direction... the abstract world might be a bit more material than first suspected))


Steven81 t1_j6io97d wrote

Platonism argues that we live in a world of ideas. That things like math or information are extant entities instead of shortcuts we use to describe more involved phenomena.

Materialism holds that we live in a world of matter. That matter is primary and everything else is shorthand for how matter behaves through time.

Materialism does not believe even in the possibility of things like souls, essence... software.

It does matter whether we live in a materialistic or a platonic universe. In one case uploading yourself is killing yourself; in the other it is living forever without the need for pesky mediums.

It is a rather core question which will show up eventually.

And yeah, materialism can be quite solipsistic (but not exclusively; for example, it does not deny the existence of other experiences as valid, merely as not very relevant once said piece of matter ceases)...


Frumpagumpus t1_j6iung4 wrote

i think we disagree. I think your version of "platonism" is solipsistic lol, since it places so much emphasis on your point of view.

my version of platonism (which is not pure by any means, but possibly just as aligned w/ original platonism's theory of forms, if not more so, than yours) is more: abstract and physical world can both be described with coordinate systems, e.g. numbers. So just like it turned out space and time were actually spacetime, there might be something similar going on.

and yes, i don't believe in souls. (in particular there is no a priori reason to believe in them, and even if it were a real concept it wouldn't change much, since the soul would also live in a reality similar to our own, e.g. one of space describable by a coordinate system, probably with some timelike dimension as well in order to map to our own reality, in my estimation)

> In one case Uploading yourself is killing yourself, in another it is living forever without the need of pesky mediums.

uh, it can be killing yourself in both of them, because causal continuity is a "material" property...

the question is more how much difference it makes. people die all the time; is it so bad to die, etc. You think maintaining your personal narrative is of paramount importance because it's tied to some transdimensional soul or something. I see myself as more about fighting for my ideals, and making sacrifices when necessary or important.

Though i'm not sure it will really be much of a sacrifice, actually. it seems like one to us, but as exclusively embodied agents we see things differently than future intelligences will.


Steven81 t1_j6j0968 wrote

> You think maintaining your personal narrative is of paramount importance

If we live in a materialistic universe, I don't think that concepts like "importance" can even enter the conversation.

Things either are or they are not in such a universe. In a materialistic universe your end is akin to the end of the world, because there is a lack of observation in the particular timeline you always occupied. Yes, the world will go on in some abstract way, but not in a manner that can -even in principle- matter to you. Say the many-worlds interpretation of quantum mechanics ends up being true (basically, time is multidimensional); in such a world, how can it matter what happens in a parallel reality that is not ours? One's death in a materialistic universe is neither important nor unimportant, but it does have a definite effect on the individual (he gets stuck in a dead-end version of the universe).

That's why I find a materialistic universe (if we indeed live in one) a partially solipsistic one.

I don't know how Platonism can be solipsistic though. Plato certainly did believe that we live in a universe made of ideals and that we embody an image of them. The concept of a soul was paramount to his belief, and especially to that of the Neoplatonists. That's where Christians got it from (early Christians believed in bodily resurrection; there was no concept of an immortal soul until the Neoplatonists had their influence on Christianity around the 4th century CE, but I digress)...


Frumpagumpus t1_j6j0vr4 wrote

> If we live in a materialistic universe, I don't think that concepts like "importance" can even enter the conversation.

what? why does a soul or whatever have anything to do with importance? (my suspicion here would be that you are trying to do something impossible with an axiomatic system)

> Yes, the world will go on in some abstract way, but not in a manner that can -even in principle- matter to you

we just went over how "abstract" and "material" (the world) aren't necessarily so different... they are both spaces in a geometric sense mapped by coordinate systems


Steven81 t1_j6jvvdd wrote

> why does a soul or whatever have anything to do with importance?

It doesn't, I was reacting to something else entirely (namely a phrase of yours that I quoted).

> we just went over how "abstract" and "material" (the world) aren't necessarily so different...

A description of a thing (an "abstraction") is not the same as the thing in a materialistic universe. I can see how they could be neighbours in a platonic, or more generally an idealistic, universe.

Which is why it is crucial for us to know what type of universe we find ourselves in.


Virgence t1_j6b9px6 wrote

Even if that were to happen, you could be resurrected in virtual reality.


Jenkinswarlock t1_j6b9ulc wrote

I'm not sure I would count that as "myself" unless I were to upload in some way prior.


RichardKingg t1_j6bowb9 wrote

And even then, how can you even know if that version of "you" would really be "you"?


WashiBurr t1_j6bq30m wrote

How can you even know that the current version of you is "you"? For all we know, the consciousness you're experiencing at this very moment is just a state of being your brain accepts as itself, regardless of whether it actually is or not.


4444444vr t1_j6bu249 wrote

Well this is definitely me, maybe not the other guy that they thought they booted up, but me being me is the only thing I can know.



Kaining t1_j6cden6 wrote

You're putting the cart before the horse here.

Consciousness first, body second. Consciousness is consciousness regardless of whatever body it is in. It's molded by the body, but it is what it is. New brain, new memories, new nervous system, new feelings, etc... That's kind of how the reincarnation thingy explains why you don't get to keep memories from previous lives in Buddhism, btw, and it kind of makes sense. And it is a bit fallacious and dodgy too, as it kind of nullifies the appeal of reincarnation when you first learn of the concept. It ain't a restart button at all. More of a "things stay the same in a constantly changing world" impermanence trick.

So you could rez a completely different consciousness into a VR game and it would still act the same as the being you resurrected, as long as you "built" it right. The problem here is not knowing if it's "you" but whether there is a "you" inside that VR avatar. That's an aspect of the "brain in a vat" thingy. How can you be sure that others are real when all of reality is merely a projection of your brain? How can you be sure you are even here is another nasty issue.

Ego death is a thing, after all.


Artanthos t1_j6dos6w wrote

It’s self perception.

It’s just a slightly different version of I think, therefore I am.


winkerback t1_j6bud2b wrote

This sort of gets into weird territory. If you discovered that every night, while you are deep asleep, your entire body is vaporized in a split second and an exact copy is rebuilt into the same state it was in before, would you consider the person who wakes up to be you?


phoenixmusicman t1_j6bvg65 wrote

I would not, which is the problem I have with Star Trek teleporters. They're technically killing you every time.


OutOfBananaException t1_j6caq3v wrote

Your uploaded self will get terrible anxiety when moving between networks. I wonder if there will be uploads that refuse to move from the substrate they were uploaded to...


malcolmrey t1_j6cwr5w wrote

i also believe they are being killed and an exact copy is made

it is overall an interesting concept

I wonder if people who died for a short time and then came back could be treated similarly? :)

I know they have the same body, but they were technically rebooted. They went off for a moment.


Nanaki_TV t1_j6cyhvw wrote

You're not being killed. If you have your arm cut off and reattached, it's still "your" arm, right? Those teleportation devices are taking you apart molecule by molecule and then putting you back together. The Ship of Theseus comes to mind too. It's why Riker had a clone of himself. The device made a copy of him. I don't remember which one was considered the copy anymore.


phoenixmusicman t1_j6e9izm wrote

The arm isn't just being cut off, though. The arm is being removed, burned to ashes, then the ashes are sorted through and rebuilt.

The clone thing supports my argument.


Jenkinswarlock t1_j6bx50p wrote

The physical form would be the same, but it would be intrinsically different.


MacacoNu t1_j6btc6n wrote

Why only in virtual reality? If an ASI really likes humans, and/or we humans follow the development of an ASI closely, I believe that many humans would REALLY try to bring people back, in ways we can't even predict.


LightVelox t1_j6dapl4 wrote

That's if someone has a copy of his mind to resurrect, has the money to do it, and is willing to.


Spoffort t1_j6cxaa8 wrote

How old are you?


Jenkinswarlock t1_j6cxs71 wrote



bluemagoo2 t1_j6dnviw wrote

One way of thinking that might help is knowing that it is a very real possibility that death is an unavoidable part of this universe.

Heat death might just be a slowly approaching doom for all free energy including any singularity. Would that make your time here any less meaningful? Same goes for dying before any life prolonging tech gets created.


Ivanthedog2013 t1_j6eojfg wrote

No one here is afraid of death by itself; they are afraid of what they will miss out on between now and the heat death of the universe.

i would gladly accept death if it means i get to spend thousands of years unveiling all the mysteries of reality.

but dying now, knowing that i could miss out on all the epic sci-fi-like experiences, is a fate much worse than just being anxious about death alone


bluemagoo2 t1_j6es6yz wrote

I guess I view it a little like this: my experience will end no matter what. After that (can't know for sure though) I will cease to exist.

It's a little pointless to be anxious and not accepting of it, because at the end of the day it's not your decision when your number gets called or whether this tech gets developed before your death.

Whether it comes today or in a million years, you have equal control in both circumstances. I think it's okay to want to be around for it, but I also think you should accept that there's a good chance we won't be. Not accepting that means you're unduly troubling your real, current life.


Ivanthedog2013 t1_j6etogg wrote

i understand your logic but it's a little flawed.

you're assuming that in the next 1,000,000 years humanity won't ever develop a smart enough intelligence to figure out a way to avoid the heat death of the universe.

considering how we are already seeing significant improvements in our ability to literally manipulate matter on a quantum level, who's to say that once we figure out how black holes truly work, and how they relate to dark energy or dark matter, we won't be able to take advantage of those systems to manipulate the entirety of the universe and eventually avoid its heat death?

i'm not saying it's guaranteed to happen, but knowing that it could potentially happen directly contradicts your claim that our inability to control our fate now is equal to what our ability to do so would be 1,000,000 years from now. and that logical inconsistency validates people's anxiety at the thought of missing out on those intellectually transcendent/enlightening experiences.


Agarikas t1_j6ci5s9 wrote

Don't worry, your atoms ain't going anywhere.


C4PTNK0R34 t1_j6czkmm wrote

This is the part where things get rather difficult to theorize. If the transfer of mind and memory to an artificial construct were possible, would there then be two separate versions of yourself with the same consciousness? Or would you be conscious of your original body only while the duplicate believes itself to also be the original?

Assuming that at the time of the Singularity we've already created a Matryoshka Brain, a kind of Deus Est Machina >! (a literal God AI; Deus Est Machina roughly translates to "God is the Machine". A Matryoshka Brain is a supercomputer-powered AI inside a Dyson sphere, powered by the consumption of an entire star, that would theoretically be able to hold a conversation with the entire world without using 0.1% of its processing power.) !< and assuming that our consciousness is tied to our own minds and memories, then death becomes a minor inconvenience: a new artificial you is created after whatever mishap you endured. Human evolution is then flung forward as time suddenly becomes endless.


Bluemoo25 t1_j6dlk8t wrote

There was this beautiful thing that I learned from Jack Kornfield, who learned it from Ajahn Chah, a Buddhist monk. I'm going to paraphrase since I can't find the original. He essentially said: imagine humanity as a vast forest. What happens when one tree falls? More sprout and take its place. The tree falling is a natural part of life. This is what you have to accept when it comes to death. The image of a tree falling in a forest is what clicked for me, and I lost the majority of my fear of death.


InternationalCook346 t1_j6ebsvz wrote

yeah, think that's why Buddhist monks always go wayyyy up in the mountains... like, "if a tree falls in the forest and no one is around to hear it, did it really die?" 😅


InternationalCook346 t1_j6ed7bc wrote

Schrodinger's Monk: A monk leaves his monastery, vowing to meditate atop a mountain in solitude until the end of days, as is the path of their particular practice.

None will know whether he has left his body, until the next brother follows this same path. For a certain amount of time, Schrodinger's monk can be thought of as being both alive & dead, up until the next monk observes his body, thereby collapsing the wave function 🤓


Dindonmasker t1_j6b45cd wrote

In this simulation you will experience 54,000,000 years in a single weekend.
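Taking the meme's number at face value, the implied subjective speedup is easy to back-of-envelope (purely illustrative arithmetic; the 54-million-year figure is just the joke's premise):

```python
# How fast would simulated time have to run for one real weekend
# to contain 54 million subjective years?
years = 54_000_000
subjective_days = years * 365.25  # subjective days experienced
weekend_days = 2                  # real days elapsed
speedup = subjective_days / weekend_days
print(f"{speedup:.3g}x")          # roughly a 9.86-billion-fold speedup
```

So the simulation would need to run its clock about ten billion times faster than ours.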


surviveingitallagain t1_j6bo87e wrote

Are you sure you want to try the eldritch abomination simulation first, sir?


ginger_gcups t1_j6bzmxr wrote

Yes, load "Trump Presidency 2025-?" please


Smellz_Of_Elderberry t1_j6c29ll wrote

You're already in the, "billionaire with golden toilets is president of the United States" timeline.

Seems like pretty good evidence that we are already in the simulation... It surprises me how much this actually seems like the truth...


StarChild413 t1_j6co4n4 wrote

so are you saying it's because of the absurdity (if so, would it be theoretically possible to have a normal enough president (perhaps the closest candidate you'd support) to shift us back/retroactively affect our nature) or that a weird thought experiment joke based on something absurd from our world about that being part of a simulation proves we're in that simulation being joked about


Nanaki_TV t1_j6cyqhm wrote

Class, today we are going to discuss the topic of run-on sentences. Will someone take notes for StarChild since he’s out today?


CrunchyAl t1_j6bjy8r wrote

I miss being in my 11,000,000's


Rebuta t1_j6auq8x wrote

I'm ready.


Ortus14 t1_j6btowi wrote

Even if it takes me a billion years to get a girlfriend I will never give up. Haha


No_Fun_2020 t1_j6bp1sw wrote

Ready for the sea of LCL or whatever happens after singularity

All hail the Omnissiah


vernes1978 t1_j6bc915 wrote

if this wasn't posted here I'd think you were telling us you were homeless and on acid.


Sandbar101 t1_j6b2att wrote

I’m not ready yet, but I’d like to be when I get there


_dekappatated t1_j6bag6k wrote

Still having a human hand at 54 million years old


JohnLemonBot t1_j6c5n3t wrote

We got some interesting ones on this sub lol


ElvinRath t1_j6cb022 wrote

So, still no house and single, right?


Lartnestpasdemain t1_j6b8re9 wrote

Well you'll still have to wait to be this old 🤔


JavaMochaNeuroCam t1_j6csr04 wrote

You won't be you in even the first 100 years after AGI. You will evolve and transcend so fast that your future self will be less like you than you are like your parents. Of course, there may be a choice:

A: Stay human with a human brain and live forever
B: Have a human form but 2x intelligence
C: Become a God


Foolhardyrunner t1_j6dqa1f wrote

B, with a better back, hips, etc., and immortality, seems like the best option.


AUkion1000 t1_j6bgp0l wrote

Yeahhh ... ok then



yottawa t1_j6eq7my wrote

I am ready!


splita73 t1_j6fdxaq wrote

I'm getting the word "Resuscitate" tattooed on my chest.


Ok_Sea_6214 t1_j6bu0mw wrote

The problem is natural selection: you can't introduce this level of technology and expect we'll just all get to enjoy it without any issues. Industrialization led to two World Wars, and nuclear technology led to nuclear bombs, which could still destroy us all before we get to the Singularity.

It's why I believe 90% of people will not survive long enough to see this happen, because it would be too easy.


pandoras_sphere t1_j6dbt9d wrote

Uploading a person to AWS for all eternity will be cheaper than their remaining expected medical expenses.


Ok_Sea_6214 t1_j6dmerx wrote

There's only so much need for individual consciousnesses. Every upload needs to warrant the cost of storage and operation, no matter how insignificant.

And I think a shared biological/digital consciousness is the way to go to transfer legacy humans to digital ones, probably with cyborg upgrades. You can use an old drive as a backup until it fails, and just upgrade to a new one when it does.


SoylentRox t1_j6bxjkj wrote

I hope 10% survive. The skies are dark for a reason, and at our current level of knowledge it looks suspiciously easy for a lot of humans to become immortal and grab most of the universe. The remaining problems all look solvable in a reasonable (years to decades) amount of time if you have a superintelligence to handle the details.


saladmunch2 t1_j6c0ei3 wrote

Reminds me of them long nights on 3-MeO-PCP.


Agarikas t1_j6ci3ny wrote

Someone might be looking at this meme from the future, and I don't really know what to tell them besides "It was what it was."


Redditing-Dutchman t1_j6cp51y wrote

When you can't find your own dimension because the dimensions market is still fucked up.


JaSamGovedo t1_j6cta0f wrote

Buy house? How? With what money? Why not "build house"? Are you American?


FirstEbb2 t1_j6hbi40 wrote

There is a saying in ancient China, "Even if you are a gentleman, your descendants will inevitably destroy your family's reputation within five generations."

I'm a bit skeptical that I can live forever, but I believe that if my descendants make an artificial intelligence based on what I did and the evidence I left, it will definitely be more faithful to my ideals than my descendants themselves. I'm a selfish person, and I believe this idea is more exciting than that "becoming a stream or gamma rays, becoming one with nature" stuff that sounds very altruistic.


No_Ninja3309_NoNoYes t1_j6c383s wrote

In a few decades, organ printing and the first cyberpunk implants. In a few centuries, healing nanobots. In a thousand years, a hive mind in a growing Dyson swarm. In a million years, no more need for bodies. Nervous tissue, hardware, and software will become one.


[deleted] t1_j6b06t1 wrote



Phoenix5869 t1_j6b0lzr wrote

So you would rather die at 80?


ajahiljaasillalla t1_j6b2336 wrote

I don't particularly enjoy living, but I am just a link in a 3-billion-year chain, so a will to survive has been encoded into my DNA. Nietzsche had an idea of eternal return, where one would live their life over and over again, and I think it would be the worst nightmare.

Why am I being downvoted?


Smellz_Of_Elderberry t1_j6c2hr0 wrote

Because this is the singularity reddit page and most people here want to extend the human lifespan.


cy13erpunk t1_j6b3zk3 wrote

this is pretty edgelord-y