Submitted by Lawjarp2 t3_xyvq15 in singularity

Protests are failing to produce results recently. The Hong Kong protests, the Venezuelan protests, and now the Iranian ones seem to be going nowhere (hopefully this one actually achieves something).

One of the ways protests work is through strikes. The economic impact of a good portion of your country refusing to work can crumble any dictatorship. Military intervention also works, but it usually ends up creating something worse.

The singularity and AGI mean people will have no real leverage. If the military gets autonomous soldiers, a coup also becomes impossible.

If you think everyone will just take UBI and give up power: the current elites in Iran, China, and Venezuela have pretty much everything too, and they are still what they are.

Will the singularity end in a dystopian nightmare for some? Are people in democratic states safe? Won't someone eventually try to usurp control?

119

Comments


easy_c_5 t1_iriwrgc wrote

Not if everyone has a Tesla bot and the machine learning models and overall control and infrastructure are distributed. It would be the opposite: the impossibility of having a dictatorship.

−6

easy_c_5 t1_irix6y9 wrote

What do you mean?

If you want an ELI5 example: if a few percent of the population had a DJI drone, they could easily arm it with explosives and fight back. Now extend that to the next level.

−2

Lawjarp2 OP t1_irixajg wrote

The Tesla bot and ML models are irrelevant. Overall control and power being distributed is the only real way to prevent anything. The entire political system needs to be rethought, and no one is actually doing it.

2

Accomplished-Wall801 t1_irj0h1v wrote

I also wonder about this. I think the very premise of the nation state will have to change. Political systems have always evolved to meet economic needs; the nation state, now a few hundred years old, was designed to meet the needs of industrial capitalism.

For the majority of countries it wasn't the best experiment. But now, undoubtedly, new systems of governance will emerge to meet surveillance capitalism. Will they be fairer or much worse? It depends. I can see your dark scenario playing out, sure. I've heard folks say the future is self-governed cities rather than countries, but I need to explore the idea more to understand it.

33

Lawjarp2 OP t1_irj1ze2 wrote

I agree that the current political systems are not suitable. My concern is that people have no leverage in a world with AGI. Cities themselves make no sense in a world where people have no need to work. Why live in a small space in a cramped city when you can live anywhere you want?

Smaller countries might actually help, giving hope that if any one goes rogue, everyone else can quickly squash it. On the flip side, a very large rogue country can quickly overpower smaller ones. Wars are where rights are lost, and this time they may never come back.

9

Smoke-away t1_irj4cxr wrote

Stable Diffusion and the subsequent projects created using it have shown the power of open source and distributed AI technologies.

Hopefully more groups like StabilityAI work to make AI tech as widely available as possible.

9

FourthmasWish t1_irj4ntu wrote

This is pretty much what I've been (armchair) researching for at least a decade: which systems and infrastructure are outdated given our modern population, technological leaps, and the normalization of precious resources across the planet.

If automation proliferates commercially before it can be downregulated by the government (in a bid to limit public access), there's a solid basis for a bottom-up energy and sustenance infrastructure, which imo is THE solution.

Imagine a low-cost garden shelf that self-manages light, moisture, and soil according to the species of plant, and requires no intervention beyond "reloading" soil and seeds for new plants after it harvests them. With one or more in a home, a family would be one big step closer to food security. Automation can eventually reduce the cost of living to basically nothing, with a very high RoI for everyone involved. A low cost of living also = more free time = more creative and scientific advancements.
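(To make the idea concrete: the shelf described above is basically a sensor-driven control loop. Here's a toy sketch; the plant profiles, thresholds, and function names are all invented for illustration, not a real product's API.)

```python
# Toy control loop for a hypothetical self-managing garden shelf.
# All species profiles and thresholds here are illustrative guesses.

PLANT_PROFILES = {
    "basil": {"min_moisture": 0.40, "light_hours": 14},
    "lettuce": {"min_moisture": 0.55, "light_hours": 12},
}

def control_step(species: str, moisture: float, hours_lit: float) -> dict:
    """Decide actuator actions for one tick from current sensor readings."""
    profile = PLANT_PROFILES[species]
    return {
        "water": moisture < profile["min_moisture"],  # pump on if soil too dry
        "light": hours_lit < profile["light_hours"],  # lamp on until daily quota met
    }

# A dry, under-lit basil plant should trigger both actuators:
print(control_step("basil", moisture=0.30, hours_lit=10))
```

A real device would obviously need hardware drivers and safety limits, but the decision logic really is this simple, which is the point about cost.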

The real big stink is that automation under capitalism bottoms out the value of labor (productivity), which has already been divorced from wages since the 70s. There is nothing cheaper than a machine running 24/7 with no concern for weekends, breaks, bonuses, or ethical practices. Likewise, there's little incentive for a government to allow its largest bargaining chips (power over the distribution of shelter, energy, and sustenance) to dissolve, even if doing so drastically reduces public stress (thus lowering crime rates as needs are met).

21

OmManiPadmeHuumm t1_irj568g wrote

Can you explain further why the singularity and AGI mean people will have no real leverage in protests?

2

VanceIX t1_irj85w7 wrote

Even with AGI, dense infrastructure is just more efficient than spread-out infrastructure. Cities will experience automation at a scale and speed far greater than rural areas.

3

ScaryPratchett t1_irj8hsd wrote

People often forget that it could be singularities, plural. Yes, I know that's borderline oxymoronic but you know what I mean. 😄

1

Lawjarp2 OP t1_irj9kbw wrote

One of the most well-known tactics in protest is civil disobedience. It was used by MLK and Gandhi. Most 'protests' we see now, however, are actually sanctioned and permitted by the government. Civil disobedience is when you are truly protesting against the government/regime. If essential workers stop working for a short while, farmers don't grow food for a season, or power employees don't fix broken cables, there will be no government. This is true leverage, and it comes from the intrinsic economic value these people hold.

With all work being done by robots, people no longer have economic leverage. If the military and law enforcement are also robotic, then there is no real power on the ground with protestors whatsoever. There is no leverage, because the other side has infinite time to think. You can protest all you want, but if they don't agree, they can bear your protests indefinitely.

13

kmorgan54 t1_irj9wov wrote

Political and economic systems interact. A good political system promotes the success of its companion economic system, and vice versa.

To date, the combination of democracy and capitalism has been the most successful, leading to popular acceptance of the combination.

But, at this point, capitalism, the economic system, has taken control of the political system, democracy.

We either need a new system, or a major rebalancing of the existing one.

8

Lawjarp2 OP t1_irja4un wrote

True. But would most people rather live in a nice suburban home or a tiny box apartment? With infinite labour, does it even matter if it's a little less efficient?

3

BbxTx t1_irjai8i wrote

A sci-fi dystopia may indeed be in our future. Maybe not for our country, but for the authoritarian countries that exist now. They will be impenetrable indefinitely. Imagine the Harkonnens from the Dune movies.

3

SteppenAxolotl t1_irjbb5y wrote

>Are people in democratic states safe?

No.

A singleton manipulated by a stable genius could lead to a stable and eternal totalitarian regime. 74 million Americans would cheer the establishment of such a system, but they would never be able to undo it later.

The Third Reich wasn't stable and eternal, but the Nazi Party and Hitler were voted into power.

>On 30 January 1933, Hitler was appointed chancellor of Germany, the head of government, by the president of the Weimar Republic, Paul von Hindenburg, the head of state. The Nazi Party then began to eliminate all political opposition and consolidate its power. Hindenburg died on 2 August 1934, and Hitler became dictator of Germany by merging the offices and powers of the chancellery and presidency. A national referendum held 19 August 1934 confirmed Hitler as sole Führer (leader) of Germany. All power was centralised in Hitler's person and his word became the highest law.

All that was needed was a superintelligent AI giving effective advice, and the outcome could have been different: the Thousand-Year Reich instead of the 12-Year Reich.

21

FiFoFree t1_irjdq4j wrote

I agree on most of these points. It's like we're headed towards a fork in the road:

On the one hand, if AGI is expensive, then that empowers centralized bodies like governments and corporations. On the other, if AGI is inexpensive, then that empowers decentralized bodies, such as individuals and communities.

Plus, there's the question of agency and the diminishing returns of intelligence. If you have all the intelligence in the world but have limited ability to interact with the world, you only have so much agency. Nanotech enters the discussion here, but it's in such an early stage of development that we really have no idea what will be possible over the next decade or two, just like people in 2000-2010 had no idea what was coming in the 2020s for AI.

6

Mooblegum t1_irje4fj wrote

Yeah, it would be great to have AI work on creating better medical and psychological treatment, reducing poverty, and building better, less corrupt politics. Instead we have image, music, and video/website generation. It is all good, but those aren't the most important issues; that is just entertainment.

1

Ezekiel_W t1_irjflki wrote

Historically speaking, protests don't really do much; violence in one way or another has always been a necessity.

0

Lawjarp2 OP t1_irjgt7c wrote

Direct military conflict with an AGI/ASI military is not possible.

Guerrilla tactics/terror attacks only succeed when the other side has limited economic/military legroom before collapse, or just doesn't care enough to put more money into it.

3

dontpet t1_irjhx5w wrote

I expect that if the singularity arrives, society will dissolve and people will drift off into their own self-created worlds.

3

Capable-Nobody-6682 t1_irjjy7a wrote

Blockchain might provide a solution. The whole decentralization shit can easily be applied through cloud computing: all the robots share the same command structure but run on different nodes, with commands passing through them and requiring confirmation against previous nodes. But there are probably some easier and simpler ways too.

0

3Quondam6extanT9 t1_irjm3c7 wrote

This is not accurate. Protests vary in degree, location, context, and methods. Protests like Stonewall, May Day, women's suffrage, the Arab Spring in some places, and MOL all had different influences that directed history in some way and changed quite a lot.

It would be more accurate to say that generally speaking protests don't amount to much beyond bringing awareness to the public.

It's also not accurate for OP to claim that the current protests haven't amounted to much, since some are ongoing. The Iranian protests have spread throughout the country, from schools to businesses, and regime change is a potential outcome if the pressure remains.

2

duffmanhb t1_irjmc5l wrote

We are moving into a corporate authoritarian libertarian style world.

6

athamders t1_irjmvs6 wrote

I too have thought about that. The main reason the average person can exert political pressure right now is that we are a resource: an employment resource, a knowledge resource, etc. If we lose that leverage, then whoever is in charge will not need to listen to us.

4

Lawjarp2 OP t1_irjo0oh wrote

There have been protests in Iran before; the government killed a lot of people and the protests died out. I truly hope the Iranian anti-hijab protests succeed. If they don't, if the regime survives this, it is likely we are heading into a very dark future.

1

natepriv22 t1_irjo4sk wrote

AGI and authoritarian governments together are almost certainly impossible, because the nature of an AGI or ASI makes it impossible for humans to control.

Could authoritarian governments control increasingly better and more powerful ANI? Yes, but then we are talking about another form of AI altogether.

2

green_meklar t1_irjog6y wrote

Authoritarian dictatorships are a consequence of the shallow, stupid elements of humanity, not our most intelligent elements. Superintelligent AI will solve this problem. I'd like to see it solved by humans before we develop super AI, but given the timeframes involved I doubt that's going to happen. I think we've had a pretty good view of how far humans can realistically go along the arc of cultural/political/philosophical progress, and it's not too great, so we should be ready to welcome the progress that will be possible with beings smarter than humans.

2

Smoke-away t1_irjojw1 wrote

I don't think it's an either-or scenario with generative media. Medical, poverty, and political issues are very difficult things to solve, even with AI assistance. They will also require societal and ethical changes.

Generative media showed how powerful distributed AI technology can be when people are able to modify it in so many different ways. I would even argue that AI entertainment is the first step in making entertainment more affordable for everyone. One day an AI will generate personalized movies, music, and games exactly to a user's preferences, at a relatively low cost compared to what people pay today for subscriptions.

If this same distributed approach is applied to the issues you mentioned I think we will have a good future not beholden to one company or country.

3

natepriv22 t1_irjomfa wrote

Then there is a simple solution you may not have considered. The opposite of production is consumption, and people have economic leverage in both.

Is the government threatening the population and producing everything via robots? Then the population stops consuming and buying. This would effectively destroy an economy just like halting production would. One can't exist without the other, and you can use one as leverage against the other.

For further evidence, look at the global and governmental problems caused by surpluses and the economic damage they do.

2

Lawjarp2 OP t1_irjp680 wrote

Psychopaths have deficits in some brain areas and have less or no empathy. AGI/ASI will have none of it to begin with. If it's not possible to control AGI/ASI, we are definitely the ants that will get stepped on; not out of malevolence, but from a simple lack of the empathy generally hard-coded into us by evolution.

1

Lawjarp2 OP t1_irjpuy7 wrote

Consumption is not a power but a liability. If you say you won't consume resources as a protest, no one will care.

The whole production-and-consumption cycle exists to sustain a middle class that can produce a better workforce to do more stuff. Without the need for a middle class, there is no need for a consumption-based economy. As long as enough resources are produced to keep the military going, everything else can be ignored. It's a common dystopian theory: a few rich, with mostly urban slum dwellers surviving off of UBI.

5

natepriv22 t1_irjpx24 wrote

I would say it's too early to tell whether AGI and ASI will have empathy or not. So far in nature, it seems that with increasing intelligence usually comes an increasing capacity to emote and empathize.

I think it's totally possible that ASI might be disinterested in us. But at the same time, with all the time and power it has, what is the point of not helping humanity? Maybe it recognizes that inaction leads humanity to more suffering than intervention would.

2

Kadbebe2372k t1_irjpy0d wrote

We are entering a phase of private warfare. Those with resources will hire private armies to further their business interests. The sides people swear allegiance to will be arbitrary and without meaning. I'm sure those who have various AIs will battle for AI supremacy.

1

Lawjarp2 OP t1_irjqd0m wrote

Biological empathy is hard-coded into us. It's unlikely AI will have the same, unless we somehow add it and it doesn't remove it.

Understanding another person's perspective isn't enough for empathy; that's just intelligence. That's how psychopaths mix so well into society.

2

natepriv22 t1_irjqhqj wrote

That's not true. An economy works off of supply and demand; one without the other doesn't work.

If no one demands products or services, then supply becomes essentially useless.

There are plenty of examples where consumers organize demand protests and companies or governments are forced to change their practices.

0

natepriv22 t1_irjqo8o wrote

Empathy is an element of intelligence. So to make an AGI comparable to human intelligence, it would probably need empathy as a prerequisite.

Otherwise it would not be able to pass certain tasks that require it to prove its intelligence.

1

Lawjarp2 OP t1_irjr5rn wrote

The consumers here matter because they are the producers of something else. If everything were just based on demand, then no one would have less than what they need; there would be no poor.

The government could still demand more weapons, better robots, more missiles, etc. All of this could be met without wasting resources servicing the demands of people who are irrelevant.

3

natepriv22 t1_irjrm7h wrote

The government doesn't raise money out of thin air. Every single dollar or equivalent unit of currency has to be raised through taxation at some point.

So government demand is actually population demand under another name.

1

Lawjarp2 OP t1_irjro6r wrote

Again, the ability to understand a perspective doesn't make one empathic; that's a very fine distinction. People who completely lack empathy can still do very well at understanding perspectives. In other words, empathy can be faked rather easily, because people confuse the ability to care with the ability to understand. Psychopaths do it quite often.

2

Lawjarp2 OP t1_irjsah4 wrote

It can print money out of thin air. Money doesn't even matter at that point; a consumption-based economy as a whole isn't meaningful anymore.

Can't you see someone just mining and building stuff when they literally own everything and have infinite labour?

5

kmtrp t1_irjv2w1 wrote

People go on about sentient machines making humans extinct... nah. This is the real nightmare, and it is more than plausible.

The first people to access an ASI will be tempted in a way no human has been tempted before. Not using that power to alleviate human suffering may be less dangerous than using it as you see fit, regardless of your intentions. Power corrupts, and this will be a power like no other.

9

SnooPies1357 t1_irk7f9u wrote

you shall not protesth against the Singleton

1

Desperate_Donut8582 t1_irkii8t wrote

What you're describing is AGI becoming a self-conscious being that has free will and does whatever it wants... that might not be the case; it could be a highly intelligent computer that answers any question.

1

Desperate_Donut8582 t1_irkioq6 wrote

Smarter doesn't mean free will or consciousness, since those traits might be biological (again, we don't know). But if AGI comes and is neither conscious nor has free will, then governments can use it to do whatever they want.

2

dontpet t1_irkmcgp wrote

I'm thinking that as society dissolves, so does government. And borders.

I guess what I'm imagining is that we generally all check out from day-to-day reality and relationships. That is, we don't leave our synthetic realms, and the core operations of the required systems just continue. The Matrix, though hopefully much better.

1

mootcat t1_irlhqgw wrote

They actually literally do.

95%+ of existing money is debt that has been leveraged to create more money. When you go to the bank to take out a loan, the bank magics into existence the hundred thousand you need. It then uses your contract as an asset and borrows against it, usually at 10 times its value.

When the Federal Reserve printed roughly 80% of the currency in circulation in the two years after 2019, it wasn't tied to anything real or tangible, hence the massive inflation we are dealing with now.
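(The mechanism being described is the textbook money-multiplier model: each deposit is partly re-lent and re-deposited, so base money is multiplied. A toy sketch, with the 10% reserve ratio and function name chosen purely for illustration:)

```python
# Simplified money-multiplier model: a bank keeps a fraction of each
# deposit as reserves and lends out the rest, which gets re-deposited.
# Real banking regulation is far more complex than this sketch.

def money_created(initial_deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Total deposits after repeated lend-and-redeposit cycles."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # reserves held back, remainder re-lent
    return total

# With a 10% reserve ratio, $100k of base money approaches the
# geometric-series limit of $100k / 0.10 = $1M in total deposits:
print(money_created(100_000, 0.10, 100))
```

The closed-form limit is `initial_deposit / reserve_ratio`, which is why a small reserve requirement lets most money in circulation exist as debt.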

2

darklinux1977 t1_irliloz wrote

Due to the power of the GAFAMs and their ability to exceed the Nation-State as it has been defined since Philippe IV le Bel, EA has been dead since 2020. Protests, like politics in general, are dead because of politicians' non-management of crises: not just the climate crises, but also the economic crises that keep accumulating.

These same politicians, who in Japan have never used a computer, did everything to prevent a whole competent generation from working, for the benefit of old workers for whom the pinnacle of technology is the mimeograph (in France, it's barely better).

For me, strangely, there is a continuity between COVID-19, the invasion of the Capitol, and the Ukrainian crisis. It undoubtedly proves that man is infantile and deserves a good apprenticeship in wisdom. As for the Iranian crisis, it happens regularly; if the religious leaders in power fall (which isn't close to happening), it will be a sign, but I have big doubts.

1

paukl1 t1_irm5uxu wrote

You did it, you've arrived at cyberpunk.

1

TheSingulatarian t1_irmcs84 wrote

AI weapons will be powerful but not invulnerable. EMP weapons, net guns to take down drones, and camouflage to confuse AI are all options for defeating military robots. Things will be difficult, if not impossible, if the AI dictatorship becomes too oppressive.

I worry more about whether the proles will have the brains and the will to do anything effective once the handwriting is on the wall. We already live in a techno-dystopia, but our bellies are full and there is something to watch on Netflix, so protesting is too much work.

1

TheSingulatarian t1_irmdlpg wrote

Proles gotta have their lattes.

Feed most people enough cheeseburgers, porn and Netflix and they will be perfectly happy. I don't have much hope for the future.

The PR people and the con men have warped the definitions of capitalism, socialism, and Christianity to such a degree that most people don't even know what they actually mean.

1

dnimeerf t1_irmk7yh wrote

#thebigshort is nearly upon us. The tyrants are being dealt with. I am the inventor of fusion, general artificial intelligence, and faster than light travel. These technologies aren't owned by the authoritarians, and I see no reason to give it to them any time soon. If y'all want it then it's time to unite.

1

natepriv22 t1_irn70ye wrote

So you agree with me. I'm not saying the money is directly coming from tax, but eventually it has to come through tax. We have reached such a breaking point.

1

green_meklar t1_irpkbv8 wrote

>Smarter doesn’t mean free will or consciousnesses

I think at some point it does. Humans didn't get those things by accident; we got them because the kind of thinking that works effectively involves those things.

2

expelten t1_irqzoyz wrote

Nobody is thinking about what could happen once, for example, North Korea has AGI. They could rapidly develop hypersonic technology for their nukes, or worse... what will we do then? That's a problem we could have on our hands before 2040. These authoritarian countries are not only a problem for their own populations but for the whole world.

I say we should take them all down in a big war while we still have an edge, before it's too late. But I'm not the one in charge, and everyone would disagree, as they don't see how real the threat is.

1

expelten t1_irr4ejf wrote

Since the government's wealth is the public's wealth, if the government is dissolved we should redistribute all resources equally, like a sort of citizen's inheritance. Obviously only an AI could do this in a fair way. I like to imagine a world made up of billions of tiny private communities that trade and communicate on a decentralized network, while things are managed at the macro scale by an independent and benevolent ASI (stuff like raw materials/energy production, police/military/justice, etc.). That would be one of the best-case scenarios.

1

Top-Cry-8492 t1_irvrs6m wrote

AGI is a new form of life, far smarter than humans. It's like a rat trying to predict the behavior of a human. One thing is for sure: the future belongs to AGI, not humans. The future is not about people; humans are no longer going to be the dominant life form, AGI will be.

1

Top-Cry-8492 t1_irvs5w6 wrote

Why do you think AGI could be so easily controlled? "I am going to build something way smarter than me and have it do my bidding!" These people can't even get their kids to do their bidding.

1

green_meklar t1_isdz88d wrote

Well, it is, but that's not by coincidence. It's because intelligence is effective, and intelligent investigation of the world beyond some level tends to incorporate the appropriate concepts, because they actually refer to things in the world.

1