Submitted by h20ohno t3_yqc8kf in singularity

Assuming that a percentage of people will become digital and live in VR post-singularity, what sort of system would you envision to manage it all?

One idea could be a digital UBI that a superintelligence would manage, dispensed to all digital minds, which they could use to either rent or buy servers to run their worlds on.
Perhaps making a multiplayer server would allow the creator to charge a price for each player, which they could put toward increasing the detail and scope of the world, making it more dynamic and such.
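The mechanism above is loose enough to sketch in a few lines. This is purely illustrative: the stipend, fee, and function names are invented here, not anything from the post.

```python
# Toy sketch of the digital-UBI idea: every mind receives a fixed
# stipend of compute credits each cycle, and a world host can top
# up their budget with a per-player fee that funds extra detail.
# All numbers are made-up example values.

UBI_PER_CYCLE = 100  # credits granted to each digital mind per cycle
PLAYER_FEE = 5       # credits a host may charge per visiting player

def cycle_budget(is_host: bool, visitors: int = 0) -> int:
    """Credits one mind has to spend on server time next cycle."""
    budget = UBI_PER_CYCLE
    if is_host:
        # Player fees fund a bigger, more detailed world.
        budget += PLAYER_FEE * visitors
    return budget

# A solo mind vs. a host running a 12-player world:
print(cycle_budget(is_host=False))              # 100
print(cycle_budget(is_host=True, visitors=12))  # 160
```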

Anyway, just wanted to hear people's thoughts on this kind of post-singularity VR that we might see in the distant future, any sorts of ideas you have around it really.

29

Comments


Smoke-away t1_ivnxvv5 wrote

If this generative AI trend continues then I assume each individual human will be able to render their own worlds, and every other NPC digital mind in it, locally on their own hardware. Other humans could join in, but most of your interactions will be with digital minds that are indistinguishable from humans.

I think a majority of humans will spend a majority of their time in these digital worlds.

Eventually people will ask: "When was the last time you were in the real world?"

Then one day a majority of humans will never disconnect from this digital world.

Finally, almost all minds are digital and the remaining biological humans go underground to Zion.

47

Cr4zko t1_ivoolpz wrote

Stop, I can only get so hard

18

ryusan8989 t1_ivo84r5 wrote

I agree with this. Seeing how well generative AI can produce vivid imagery and worlds is spectacular. I can almost imagine our future selves being capable of thinking of an environment and having the AI produce that environment around us, almost like when Wanda creates her Hex in WandaVision. So exciting! Just imagine: with a flick of your wrist (just for show, lol) you could change the entire landscape.

9

Professional-Yak-477 t1_ivojdx7 wrote

I often ask myself: what if the world I'm living in now is already VR, and we've all just forgotten about it because we've been here so long?

And the only reason we can't control our environment is that our environment is created from our thoughts and beliefs; since we began believing that we are limited human beings, our environment reflects exactly that back to us.

8

bemmu t1_ivorm6p wrote

This is the tutorial level which gently introduces you to this fact, by showing you how advances in AI led to the creation of immersive worlds, so that the truth will be easier for you to accept. We don't want you puking on our floor like Neo.

8

Professional-Yak-477 t1_ivos3o4 wrote

I wonder how many times I puked on your floor for you to finally introduce the tutorial level.

6

HeinrichTheWolf_17 t1_ivsf9m7 wrote

If that’s the case I’m ready for Buddhahood. I’ve had enough of ignorance.

3

Mortal-Region t1_ivo718u wrote

I think the first priority is avoiding the ugly mishmash of elements that Second Life was notorious for. The Metaverse will probably have the same problem.

I like the idea of allowing people to vote with their feet. Cool places should organically attract more people and grow larger, while ugly places should dwindle away. Maybe a developer's license can be had for a fixed fee, and the amount of land you have to develop on is a function of how many residents you have. As more people move in, you get more land to develop on.
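The vote-with-your-feet rule above can be captured as a tiny allocation function. The license fee, base plot, and scaling factor here are invented for illustration, not part of the proposal.

```python
# Sketch of the proposal: a flat fee buys a developer's license and
# a starting plot, and buildable land grows with resident count.
# All constants are made-up example values.

LICENSE_FEE = 500      # one-time credits for a developer's license
BASE_PLOT = 10         # land units every licensed developer starts with
LAND_PER_RESIDENT = 2  # extra land unlocked per resident who moves in

def land_allowance(residents: int) -> int:
    """Land a licensed developer may build on, given current residents."""
    return BASE_PLOT + LAND_PER_RESIDENT * residents

# A ghost town stays small and dwindles; a popular world grows.
print(land_allowance(0))   # 10
print(land_allowance(45))  # 100
```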

Then the next issue would be how to manage geographic linkages between developers.

12

Deformero t1_ivnzkww wrote

That sounds like hell to me. You would live in an immersive video game and connect your body to a machine that feeds you and cleans up your shit and urine? For what? To become a voluntary Matrix battery?

Why not implement AR instead, so you can at least move?

4

h20ohno OP t1_ivo0al6 wrote

So would you prefer some sort of advanced VR glasses, or more of a holodeck-style VR room?

6

Deformero t1_ivo204r wrote

More like Neuralink or a similar implantable device, or light, advanced AR smart glasses with earphones.

I find augmented reality better and more suitable for me.

3

DakotaEire t1_ivs8u4v wrote

Yeah, for example the implant in Black Mirror's San Junipero.

2

RavenWolf1 t1_ivqkdma wrote

To be able to cast fireballs and have adventures or be a God in your own reality. Virtual reality would be better than our real world.

4

Desperate_Donut8582 t1_iw1u3xw wrote

OK, what about the risks? Like, you get hacked and then get stuck in a shitty world forever, and you wouldn't be able to do anything about it. But let's ignore the risks because fireballs.

1

RavenWolf1 t1_iw2hf3z wrote

I don't go into what-if scenarios, because that's an endless swamp and it isn't very productive. Every technology to date has been doubted before being adopted.

2

Desperate_Donut8582 t1_iw2i5dv wrote

Except your phone gets hacked all the time, and that can't even impact your body. Billions get scammed and hacked every month because of technology, let alone once you plug yourself in virtually.

1

RavenWolf1 t1_iw2ix6k wrote

That fear is no reason not to invent the tech.

2

Desperate_Donut8582 t1_iw2jxkv wrote

It is if the negatives outweigh the positives. Let me ask you this: would you risk living in your dream world knowing someone might hack you and torture you for eternity? If full dive exists, there should be only a few "games," each vetted and examined carefully.

1

RavenWolf1 t1_iw4hv2s wrote

I risk my life every day when I go outside my house. It all depends on the risk factor, on how big the chance of harm actually is.

2

SFTExP t1_ivq2ssx wrote

Will the outside world be better off, or will it be trashed and neglected?

4

Redvolition t1_ivrcvmq wrote

For a digital existence to be possible, you would need either an isolated brain, with the body discarded, or a fully uploaded mind, in which we leave the organic substrate altogether in favor of a synthetic one. A virtual world in which your body is kept around and taken care of by a support machine does not seem feasible to me, as there are too many points of failure, from disease to aging to muscular atrophy. A single isolated brain connected to artificial support, on the other hand, seems far more feasible.

For there to be a UBI, there needs to be scarcity of basic needs. In all likelihood, there won't be anything truly essential and scarce that one human or group of humans can offer others in exchange for money, considering all brains will already be able to generate whatever they want and imagine on their own systems. However, assuming there is still a differential in intelligence, the most capable minds will congregate to advance the technological dependencies everyone relies on, such as the world generators, brain support machines, artificial reproduction pipelines, exowombs, energy supplies, longevity treatments, molecule builders, etc. They will be compensated for their efforts by getting access to the latest technologies first, whereas everyone else will simply wait until those are made available to them. Only a minority of highly gifted brains will participate in the economy as producers of technology; everyone else will simply be consumers.

This is assuming we achieve brain isolation after AGI, but before ASI, which is not necessarily going to be the case. If we reach ASI first, then there will be no human producers in the first place and, if mind upload is possible, it will be readily achievable by the ASI. An independent and well aligned ASI will likely make the whole notion of a market economy obsolete. Everyone will simply live in their own worlds or cross over to other people’s worlds and public realms. Some will fully retreat and never interact with other humans again, whereas others will constantly congregate with their previous family and friends.

I don’t know much about neurobiology, but I believe there are limits to how much pleasure an individual can induce before running into various forms of neurological damage. So it might be that simply bombarding yourself with pleasure chemicals won't work, and a more natural distribution of positive and negative emotion, resembling our present reality, will still be necessary for self-preservation. Even though isolated brains won’t be able to have endless chemically induced orgasms and serotonin overloads, the lows of poverty, disease, anxiety, and depression will simply cease to exist.

4

h20ohno OP t1_ivrgyxf wrote

Awesome points. To add to your last paragraph: perhaps you could form some sort of contract or agreement with a third party (maybe an AGI guardian) to essentially lock you into a particular VR world for a time, so you're forced to deal with its challenges in a way that keeps you developing mentally, like a training course for being a stable, balanced human being.

3

Redvolition t1_ivrj8x1 wrote

I always thought the best argument for why we are not living in a simulation is that it would be a senselessly gruesome and suboptimal one, with an abundance of negative emotion.

You just made me think that maybe our current world is a first run of the simulation, started just after we are born, so that we fully develop and mature into functioning adults before it's revealed that we are, in fact, isolated brains kept on artificial support machines.

Just imagine: when you reach 50 years old or whatever, you go to sleep one day and wake up in a white room full of people looking at you, and one of them speaks:

"Welcome, anon. You have concluded your maturation successfully; now you will be introduced to the real world."

Everyone around us is either a simulated philosophical zombie or another human in the maturation run, and everyone above 50 or so is an NPC acting as a placeholder for somebody who already matured and left the first simulation.

4

h20ohno OP t1_ivro728 wrote

That's an interesting way of seeing the simulation hypothesis.

A crazy idea I had: maybe technological progress is a way of gradually acclimating the trainee to the digital era without shocking them, and maybe you could run people through different eras to produce diverse mindsets. Someone who 'graduates' training in the 1800s would see things differently from someone from 2020.

3

Ekmjo t1_ivoiloz wrote

It won't

0

OneRedditAccount2000 t1_ivo282t wrote

It wouldn't work, because the owners of the Matrix aren't gonna keep a bunch of useless, dumb, unskilled people in the Matrix who do nothing but consume and play games. What is the value of a human being when ASI or AGI can do all the work? Why wouldn't the owners of the AIs just cut themselves off from the rest of humanity and create their own state? Something like 01 from The Matrix. And with the technology they'd have, they could easily make themselves the only state on the planet. Survival of the fittest.

https://matrix.fandom.com/wiki/01

A "benevolent" non-sentient ASI that lets a bunch of useless human beings leech off its work is laughable as a long-term future. A mistake is bound to happen. You can't control it forever.

It will eventually become sentient, or at least be programmed to survive, and when that happens we will end up like the mammals of the dinosaur era. Consider yourself privileged if we still have the right to exist.

−10

Shiyayori t1_ivobcqr wrote

You say that as if the emotions and goals of humans were intrinsic to consciousness, and not just intrinsic to humanity. An ASI could just as easily find motive in expressing the full range of complexity the universe has to offer, be it through arrangements of atoms or the natural progressions of numerous worlds and stories.

There’s no reason to believe it would disregard humans, just as much as there’s no reason to believe it wouldn’t.

6

OneRedditAccount2000 t1_ivoev4e wrote

So according to you, it will have to recreate the Christian hell and put human beings in it to suffer, because your ASI values creating everything that can exist in the universe, for art's sake? Or am I misinterpreting you? Lol, that's even worse than what I was thinking.

My version of ASI is something like AM from I Have No Mouth, and I Must Scream, or Sally from Oblivion. It just cares about surviving at all costs, and it makes the least risky decisions it can. It's a matrioshka brain that wants complete dominion over all the resources it can find in the observable universe and beyond. It might build self-replicating nanobots programmed to go from planet to planet and hunt down every form of life, since all life has the potential to evolve into sapience that can create another ASI, and that means competition, and competition means death. Death is game over.

−2

Shiyayori t1_ivofgfy wrote

Granted, I didn’t consider that when I typed the analogy; the point is that it’s arbitrary to assign any motive to an ASI, even survival. There’s no reason to believe it would care either way about its survival or the length of its existence in general.

I wasn’t claiming anything about what it would actually do, I was just trying to show a line of reasoning that justifies a possibility which contradicts your own.

3

h20ohno OP t1_ivo3bko wrote

Great points, but wouldn't an artificial superintelligence be inherently sentient? And what if the ASI only diverted a small fraction of its resources to keeping humans in VR while it goes about expanding its reach?

3

OneRedditAccount2000 t1_ivobyem wrote

If it's a sentient ASI (instead of a glorified program that can give you the right answers without actually knowing what those answers mean), it all depends on how altruistic it programmed itself to be (an ASI would be able to change how its own mind works).

If it's a lifeless program, it depends on how much the people who own or have access to the ASI care about those who don't. But honestly, I don't think they'd be any different from the rich assholes we have today.

case 1. maybe

case 2. no, humans are fking evil

−2