Comments


Sashinii t1_j1siccq wrote

No. The singularity will qualitatively change everything.

But before the singularity, there will be AGI, which I think will lead to the advent of the technologies that will enable post-scarcity, so there won't be any politics or economic systems.

That might sound crazy, but given the tools AGI will provide, I think most people will leave Earth with their nanofactory and enjoy whatever there is to enjoy beyond just being on a single planet.

15

OldWorldRevival OP t1_j1sjidl wrote

Industrialization and technological progress have tended to worsen inequality - that's a trend optimists really need to pay attention to and get ahead of.

By 2050, it may be very hard to actually find any land for yourself.

It's also quite an assumption that people will leave Earth. And it would be an incredible tragedy if staying on Earth and owning land became something essentially reserved for the elite, which it could be in a post-singularity world.

Like, you have to think about land ownership and how it has gone. As our population and economy have grown, the price of land has gone up, a lot. But this is largely due to an asymmetric distribution of resources.

Zoning laws also still exist.

Thinking that we can't get ahead on some of these things is a bit too low-resolution of a viewpoint. AGI is going to make huge strides in asymmetric ways as well. So, it'll be very good at some things, but we'll still need human verification for lots of things for a good while, such as laws, engineering designs, etc.

3

DesertBoxing t1_j1sxj8a wrote

Owning land is largely tied to capitalism; post-scarcity, it will be greatly devalued. Why construct things on real land when you can make your dreams come true in VR at the snap of your fingers?

5

TheRealMDubbs t1_j1tap52 wrote

While inequality has gone up, so has the quality of life for the average person. The average person might be getting a lower percentage of the pie, but the pie is growing, so you're still getting more pie after everything is accounted for. I think there will still be megacorps in the future, but in a post-scarcity society there will be no reason not to give everyone a utopian standard of living. It would cost nothing in comparison to the infinite resources of space.

4

Ok-Heat1513 t1_j1sypk8 wrote

But are you taking climate change into consideration? Even if you have crazy advanced tech, it's not going to change the climate overnight.

2

rixtil41 t1_j1tad9o wrote

If you could live in VR, climate change would be irrelevant.

1

Ok-Heat1513 t1_j1tchul wrote

Bro, are you smoking all the crack? Just 'cause you hide from your problems doesn't mean they go away. You realize being buried under 1.8 miles of ice means you aren't going to be alive, even if you're in a bunker? Saying fuck it to the Earth is the stupidest shit I've ever heard. You are exactly the problem.

0

Care_Best t1_j1te0gx wrote

The singularity is 20-30 years away. I truly doubt that 20-30 more years of carbon damage will result in 1.8 miles of ice covering the Earth. And living in VR doesn't mean surviving in a bunker with some VR headset; he's talking about mind uploading, where a person chooses to upload their consciousness into a Matrix-like reality, abandoning the physical body that requires food, water, and shelter.

2

rixtil41 t1_j1tekf4 wrote

If you replace your body parts with synthetic ones, then you could run on nothing but electricity, and climate change would no longer be the number one problem.

0

Calm_Bonus_6464 t1_j1smhpe wrote

Once the singularity is achieved, it's not going to matter what your political beliefs are; AI will be calling the shots whether you like it or not.

For the first time in 300,000 years we will no longer be the most intelligent form of life on Earth, and this means beings far more intelligent than us will decide humanity's future. How that happens is anyone's guess. A post-singularity world will be so radically different from today that modern economic theories and solutions will likely have no place.

7

OldWorldRevival OP t1_j1sn3uf wrote

> A post-singularity world will be so radically different from today that modern economic theories and solutions will likely have no place.

I think this puts too much magic into the AGI without thinking about specifics of actually dealing with things like the control problem and unequal access to the most powerful AGI tech.

I.e. an AGI aligned to one person could be very, very bad, and is, in principle, totally possible. Failing to change our current systems could lead to such a state too.

Imagine someone like Trump, but more calculated and cunning being the one that the AGI listens to.

2

Calm_Bonus_6464 t1_j1snyzn wrote

But we're not just talking about AGI here, Singularity would require ASI. Not just human level intelligence, but far beyond the intelligence capabilities of all humans who have ever lived. A being that intelligent would pretty easily be able to orchestrate political takeovers, or even destroy humans if it so desired.

2

OldWorldRevival OP t1_j1splv0 wrote

When I state "singularity requires political revolution to be of maximum benefit," I mean that the political changes have to come before the singularity.

Otherwise, the benefits may be concentrated in the hands of an elite, as those with resources continually lose their need for the masses through automation, becoming self-sufficient in food, labor, etc.

But it could be worse, where an elite few control the AGI.

Or, lots of people become homeless, and then they're treated like homeless are now.

2

Calm_Bonus_6464 t1_j1srzke wrote

ASI does come before the singularity, and ASI would solve many of those concerns. ASI has no reason to be any more benevolent to elites than to anyone else. Elites cannot control a being that is far more intelligent than they are. You're thinking of AGI, not ASI; both have to happen before the singularity.

0

Upbeat_Nebula_8795 t1_j1snq7h wrote

Yeah, I don't see much point in the singularity if we don't help evolution create something better than us. It's the only thing that's like a god.

1

Aevbobob t1_j1szgm9 wrote

One consequence of superintelligence is that the cost of basically everything will trend to zero. When the necessities and even the luxuries of life are virtually free for all, I don’t see a lot of political issues left.

Land ownership might be one, though there is a LOT of open land in the world that is just inconvenient to live on with today's technology. Like, most of the land.

4

XPao t1_j1shlu6 wrote

Nope, you have no claim over my land. Feel free to try to take it by force, but be aware that I will give up my life before giving up my son's inheritance, which I worked for all my life.

2

OldWorldRevival OP t1_j1shu9x wrote

I think you're too threatened by what I say.

I'm soon going to own some land myself, and my family actually has a lot of land, so please don't take this as a threat.

I'm actually more concerned about billionaires buying up endless acres of farmland, pricing people out of establishing themselves on their own, and turning farming into indentured servitude as a profession in the modern era.

So, a progressive tax on land ownership: if you own less than 500 acres, your tax would be low. If you own a few acres or less, your tax would be zero.

If you own 1,000,000 acres? You'd be taxed more, according to the value of the land.

That's the concept.
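The bracket idea above works like a marginal income tax applied to acreage. A minimal sketch, with thresholds and rates that are purely illustrative assumptions (the comment only specifies "a few acres" tax-free, low tax under 500 acres, and more beyond):

```python
# Illustrative progressive land tax with marginal rates per acreage bracket.
# Every threshold and rate here is a made-up example, not a proposal.
BRACKETS = [
    (5, 0.0),               # first 5 acres: untaxed
    (500, 0.005),           # acres 5-500: 0.5% of per-acre value
    (10_000, 0.02),         # acres 500-10,000: 2%
    (float("inf"), 0.05),   # everything beyond: 5%
]

def land_tax(acres: float, value_per_acre: float) -> float:
    """Tax owed, applying each marginal rate only to the acres in its bracket."""
    tax = 0.0
    lower = 0.0
    for upper, rate in BRACKETS:
        if acres <= lower:
            break
        taxed_acres = min(acres, upper) - lower
        tax += taxed_acres * value_per_acre * rate
        lower = upper
    return tax
```

Because the rates are marginal, a small family plot pays nothing while a million-acre holding pays on nearly all of it at the top rate.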

4

AwesomeDragon97 t1_j1t0246 wrote

One way we could do it would be that anyone who owns land worth under $5,000,000 shouldn’t be taxed for it at all, but the tax should exponentially increase after that. Alternatively we could just ditch property tax entirely and instead put a cap on the amount of land you can rent out at any given moment, which would prevent billionaires from creating a feudal serf class.
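The exemption-plus-exponential variant can be sketched the same way. The $5,000,000 exemption comes from the comment; the base rate and doubling interval are illustrative assumptions:

```python
# Illustrative only: no tax below a $5M exemption, then an effective rate
# that grows exponentially with the value above it. Rates are assumptions.
EXEMPTION = 5_000_000
BASE_RATE = 0.01        # effective rate just above the exemption
DOUBLING = 10_000_000   # rate doubles for every $10M of excess value

def exp_land_tax(value: float) -> float:
    """Tax owed on land worth `value` dollars under the exponential scheme."""
    excess = value - EXEMPTION
    if excess <= 0:
        return 0.0
    rate = BASE_RATE * 2 ** (excess / DOUBLING)
    return excess * rate
```

With these numbers, a $15M holding pays 2% on its $10M excess, and the rate keeps doubling from there, which is what makes hoarding at billionaire scale prohibitively expensive.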

3

OldWorldRevival OP t1_j1t4hya wrote

Pretty much...

The surface of the Earth, the real, beautiful, natural surface of the Earth, simply cannot be replaced by a virtual world or by the surface of another planet.

I do not want to see that ineffable, priceless beauty become a commodity, traded like a precious metal.

1

OldWorldRevival OP t1_j1siaqp wrote

Also... zoning laws. Fuck needing permission to build a shed on my own goddamn property.

3

DesertBoxing t1_j1szs18 wrote

That’s some pre-singularity thinking right there. Post-singularity, you’ll ask yourself why you would have even thought about giving up your great life for a worthless piece of land. Or a rogue AI could take your land and turn you into paper clips, lol, who knows.

2

XPao t1_j1tvc4p wrote

You could be right, definitely.

1

[deleted] t1_j1slnch wrote

[deleted]

1

OldWorldRevival OP t1_j1slwzi wrote

We'll still have markets. I just see productivity becoming a very different thing.

I also think it would be foolish to try to pump our numbers right back up; we should focus on actually creating a good quality of life for everyone first, on all measures.

I.e. if people feel like life is meaningless because too much is given to them, that's also a problem to solve. Like, let's get life right, basically.

2

HeinrichTheWolf_17 t1_j1t28sr wrote

Abundance will lead to zero marginal cost, and zero marginal cost will make capitalism superfluous.

1

IslamDunk t1_j1t7rtf wrote

I don’t like the idea that there is/should be one end-all be-all political system. Society operates in phases and each phase requires a new political system.

In my opinion, we’re still gaining from capitalism but heading more towards a democratic socialist type of economy, similar to Norway/Sweden. I think there should be a focus on building great unshakable democratic institutions with checks and balances that eliminate the possibility of corruption.

I don’t believe in relying on the idea of a singularity but I certainly think AI will help us achieve this.

1

TemetN t1_j1tcy8g wrote

In the intermediate term, UBI is necessary to mitigate the damage of the transition period between when mass automation occurs and when scarcity is eliminated. In the long term, frankly, more protections are needed for individual freedom; I expect further challenges there, and not just in the intermediate term.

1

calbhollo t1_j1t1yix wrote

Absolutely. There is no way that AGI isn't a force of pure destruction if it's built under a capitalist system. There's also no way for us to slow down and avoid creating a misaligned ASI under our current system. We need change, and we need it quickly.

0