
apple_achia OP t1_iv9vbnx wrote

No, we won’t, because some of us will affect others more, and in THAT way send ripples through the universe. Personally, I’d think that if you have any connection to nature, you’d never consider getting into an experience machine, because the machine itself is the opposite of nature: it functions to cleave all of your experience from nature. I also draw meaning from nature, and I believe most people do in some way, which is why I have a hard time believing this would be a functional solution to anything. Functionally, what you would be doing is providing a humane and enjoyable form of euthanasia as a solution to the climate crisis, and hoping enough people opt out to change our carbon impact on the world and avert climate catastrophe.

I’d say another problem with that is that the people causing the problem most directly, i.e., those with power who use exponentially more resources than the rest of us, would be the least likely to take it. And if the poorest billion took this option, but were living off next to no carbon anyway, no impact would be made.


turnip_burrito t1_iv9wn3o wrote

First, I agree it would be sad to watch people isolate until the end of time in VR by themselves.

I was also working off the assumption that this kind of technology is built after some sort of superintelligent AI is. It's really the only scenario where such a VR situation makes sense to discuss. There's absolutely no way it can be built beforehand. And such a super AI would, if it doesn't slaughter the human race, have the capacity to solve the climate crisis.

If such a thing were invented before climate change and AI are solved... somehow... then yes, that would be a threat to humanity's survival. The equivalent of a man quitting his job and living off savings until he loses his marriage, kids, house, and food.

The way forward after this, I believe, for any human beings who want to continue to make an impact on the world at large, is to choose the kind of world in which they want to live. All kinds can coexist.

Some will stay normal human beings, which is perfectly fine. This group can spend time doing things in the real world with friends and family.

Some may jump in and out of virtual reality. It doesn't have to be by themselves. They can experience the universe as it is in base reality, or extend their experience to new ones not present in base reality.

Some will want to continue research and development to augment their capabilities. They'd have to become superintelligent themselves in order to continue aiding humanity's technological progress. Then they could match the machines' speed.

Others will do some weird mix of things beyond imagining.

At all points, there will be some who are more prone to isolation than others.

There are and will be options for all people to make a meaningful emotional impact on others' lives if we choose. We just have to want it.


apple_achia OP t1_iv9xy1a wrote

As for AGI having the capacity to solve the climate crisis: I think this assumes we don’t understand what the solution is. But that’s not the problem. The problem is coordinating actions across human beings so that our agency isn’t entirely neutered, we live a comfortable life, and we don’t use up all of the resources our existence depends on. AGI solving this would rely on it coordinating human actions in some way, and that would by its nature have to be coercive.

If AGI solves the climate crisis, it will be our King, and do so by coordinating our supply chains and economic activity.


turnip_burrito t1_iv9yaau wrote

Yes, that's correct. Another (less likely?) scenario is an AGI completely controlled by people, with little or no actual autonomy of its own. In that case we could use it to accelerate technological progress and make the things you listed easier.
