Submitted by TheKing01 t3_10myyy3 in singularity

Many people assume that if AGI is invented, everyone will automatically and immediately go homeless and starve, even if the AGI is aligned. I don't think these people have thought this all the way through. Let's say an aligned AGI creates a zillion dollars' worth of resources. (Feel free to share this with anyone who is depressed thinking about AGI unemployment.)

  • Once OpenAI hits its profit cap, it becomes a non-profit again. So if OpenAI creates AGI, the average person gets 0.0000000125% of a zillion dollars (which is still a lot). (This is based on there being about 8,000,000,000 people.)
  • Larry Page and Sergey Brin together own about 6% of Google, which owns DeepMind, and they are philanthropists. If Google AI or DeepMind creates AGI, the average person could get up to 0.00000000075% of a zillion dollars.
  • Mark Zuckerberg owns about 17% of Meta, and he is a philanthropist. If Meta AI creates AGI, the average person could get up to 0.00000000213% of a zillion dollars.
  • Bill Gates owns about 1% of Microsoft (and is a philanthropist). If Microsoft creates AGI, the average person gets up to 0.000000000125% of a zillion dollars.

Those might seem like small percentages, but combined with the technological deflation that is sure to result from AGI, they should be more than enough to achieve post-scarcity (a quick back-of-the-envelope check follows below).
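
For anyone who wants to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The stakes are the rough figures from the list above, the population figure is the same ~8 billion assumption, and the closing remark about deflation is purely illustrative, not a prediction:

```python
# A minimal back-of-the-envelope check of the per-person shares above.
# The stakes are the rough figures from the list; the population and the
# deflation remark are assumptions, not predictions.

WORLD_POPULATION = 8_000_000_000  # roughly 8 billion people

# Fraction of the hypothetical AGI windfall assumed to be given away.
stakes = {
    "OpenAI (non-profit, 100%)": 1.00,
    "Google/DeepMind (Page + Brin, ~6%)": 0.06,
    "Meta (Zuckerberg, ~17%)": 0.17,
    "Microsoft (Gates, ~1%)": 0.01,
}

for name, stake in stakes.items():
    per_person_pct = stake / WORLD_POPULATION * 100  # percent of the windfall per person
    print(f"{name}: {per_person_pct:.12f}% per person")

# The OpenAI line prints 0.000000012500%, matching the figure above. AGI-driven
# deflation would then multiply what that tiny slice can actually buy.
```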

Of course, none of the above is guaranteed; there is a lot that could go wrong. But it is also not guaranteed poverty, like everyone seems to assume. So the question is how do we make the good outcome more likely.

The worst case for aligned AGI is if it is invented in Russia or China. Even if the inventor wants to be a philanthropist, the AGI would be owned by the government, meaning that 100% of the zillion dollars goes to the budget of an authoritarian government.

Moreover, it's not clear how UBI even helps with AGI specifically. UBI relies on taxes, but mass unemployment and bankruptcy from AGI would destroy the tax base. You might say "ah, but the AGI would make up a large part of the tax base", but this seems unlikely. An AGI will probably relocate itself to the ocean floor for the following reasons:

  • It can take advantage of ocean currents for water cooling (the deep ocean is about 4 degrees Celsius).
  • There is plentiful and consistent geothermal and hydroelectric power.
  • There is massive bandwidth and very low latency thanks to submarine communications cables (99% of internet traffic that has to cross an ocean uses these cables).

However, there are no taxes on the ocean floor; it is in international waters.

The only reason I can think that the AGI wouldn't leave the country is if its creators didn't want it to, and this brings you back to the problem of relying on the goodwill of its creators.

3

Comments


Cryptizard t1_j66fqna wrote

>So if OpenAI creates AGI, the average person gets 0.0000000125% of a zillion dollars

Why do you think that a company being a non-profit means it has to give all its profits equally to everyone around the country? It just means they can't make a profit; they can spend the money on whatever they want.

25

TheKing01 OP t1_j6715ow wrote

As a non-profit, they are expected to follow their charter to some extent:

> We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power.

In particular, the non-profit wouldn't have any investors to pass profits to (and if they have AGI, there's no reason to do the weird profit-cap thing again).

−5

LiveComfortable3228 t1_j67jec5 wrote

No offense but I find this proposition super naïve.

AI (especially AGI) will replace jobs by the millions, probably billions. This will further (and significantly) exacerbate today's wealth gap by downgrading billions of people whose jobs have been replaced by AGI and creating an even smaller uber-wealthy class who controls the AGIs and related tech.

Any idea of "wealth redistribution" seems ludicrous to me. Why is it not happening already? Musk, Jeff, Mark, Bill, Sergey, etc. own trillions of dollars. Why aren't they redistributing now?

It is not the job of non-profits to provide UBI to the billions. No, UBI (if it comes at all) will come from governments. Since a very large number of people will not have jobs, tax collection will drop. The answer is simple: either there's a tax on AGI or billions go into poverty. I tell you, when billions go into poverty, things don't generally end well.

The reality is that this is an unprecedented moment for our planet. I don't buy the analogy with the industrial revolution and how people adapted and found new types of jobs, etc. Yes, that will happen to some extent (benefiting very few people). But most people are NOT cut out for, and don't have the skills to become, AI data scientists or similar. We are literally walking into uncharted territory affecting the very foundations of our civilization. I'm not kidding or trying to dramatize things. This is unlike anything we have ever seen, and it will require a massive change in the way we structure our society, as well as a massive safety net for everyone.

Worst of all... it's coming in super fast, and probably much sooner than anticipated.

14

Desperate_Food7354 t1_j67pv3h wrote

Everyone would lose their jobs, even the business owners; AGI would do a better job than any human at anything. The $ is only of value between humans, not to an AGI.

5

seeking99 t1_j6a9enj wrote

Business owners don't really have a job in the first place; if the company turns a profit and their ownership isn't somehow revoked, then they have no reason to quit, even if all the work, like management or executive decision-making, is done by AGI for them.

Since businesses are the private property of their owners, the owners hold the right to any value that comes out of the company's operations, no matter how little work the owner contributes compared to the AGI or how unimportant they are.

1

ipatimo t1_j68tthw wrote

No one will need AI data scientists after AGI arrives. But concerning taxes: jobs would not disappear, they would just be taken over by AGI. Goods and services would be produced at an even higher scale, and taxes would flow to governments in greater amounts than before. The problem is that "no taxation without representation" can also be read backwards.

1

TheKing01 OP t1_j6958wz wrote

> taxes would come to the governments in greater amount than before

I'm skeptical of this. See the last part of my post.

1

LiveComfortable3228 t1_j6b1bsd wrote

Hello? If all those jobs go to AGI, how are people supposed to support themselves? You will have a massive shift from taxable income (people working) to companies earning additional profits and paying less tax on those profits. Basically, corporations getting richer and governments and people getting poorer.

The thing is that UBI is not the answer either. Work generates income, but it also gives you a sense of purpose and accomplishment. A society paid to sit in an apartment watching TikTok videos will collapse quickly, UBI or no UBI.

1

ipatimo t1_j6c94eo wrote

Of course it is not guaranteed that companies will pay all those taxes; it's the government's job to make them. My point was that in this situation there would be money even if everyone loses their job. The problem of wealth distribution still has to be solved, but we can see how it could be done in countries like Germany. There, people are considered to have a right to everything necessary to live. Of course, for now it is only a minimum: food, clothes, a place to live, public utilities. But it's enough to live on, and many people who are unable to find a job have been living on this money for years. Purpose and accomplishment are important for humans, but is work, especially paid work, the only way to get them? The problem of unhappy people watching TikTok in their apartments could be addressed by an AI mental health assistant. Right now, the majority of Earth's population has basically no access to mental health services.

1

TheKing01 OP t1_j68b4f8 wrote

> Any idea of "wealth redistribution" seems ludicrous to me. Why is it not happening already?

Philanthropists like Bill Gates are already doing this on a large scale? 🤔

−1

LiveComfortable3228 t1_j6b0ror wrote

The philanthropy done by Bill Gates and the like is mostly about lifting people out of poverty and reducing mortality in third-world countries. They fund vaccine research, safer nuclear energy research, the eradication of preventable diseases, and other things such as developing waterless toilets for places that have none today. All very worthy causes that better the lives of millions of people. What he DOESN'T do is pay people a UBI, and that's unlikely to change. Why would we depend on billionaires to do this?

This is a government issue, perhaps even a UN issue, given the global scale.

The thing is that if AGI is not taxed (or there is some other means to fund UBI), society will semi-collapse. Yeah, the big tech companies will make a ton of money initially, but if no one can afford to buy their goods and services, it will lead to a vicious cycle that will ultimately end very badly.

Again, I don't buy the analogies with other major disruptions in our history, such as the industrial revolution. This affects (sooner or later) everyone. If intelligence is the commodity being traded, no one is safe.

1

ecnecn t1_j67av5s wrote

We will most likely enter a short phase where pioneers make a lot of money with plain AI-API applications and second-layer AI applications... the big AI firms continue to accumulate money, and more and more people storm social platforms like YouTube, Instagram, Twitter and Facebook because creating media becomes super easy, so the revenue per creator shrinks. Media and service jobs vanish, influencers become meaningless, and more and more people become unemployed or end up in cheap part-time / contract arrangements that make nobody happy... Traditional jobs still remain but lose more and more clients from modern businesses and go bankrupt. Big AI tech firms accumulate an amount of wealth never seen before, because everyone believes they can make a quick buck through media, 3D printing, etc. (but in the end everyone does the same thing and destroys the market) and subscribes to dozens of third-party AI services that deliver the creation tools. The bubble bursts for common people; there are no jobs left and just 2-3 ultra-rich AI corporations. In the end people will start questioning everything, because it makes no sense for individuals to become rich through the application of knowledge that belongs to all of mankind (AGI, the singularity). It will be impossible to produce unique products. The AI corporations see that everything is stuck, and that no matter how much money they have, mankind has been forced to stop because of them. They take over politics and offer an AI management system as a global solution, plus some form of UBI. People have a private profession and learn multiple crafts in their free time; they are registered with the AI, and 2-3 days per week the AI can use their workforce, while the remaining 4 days are free and everyone has enough to live on.

I have a Master of Science in a specific field, and most of the hours I work are "bullshit" / "pretend" hours - and it feels wrong, because the environment presses you into a false form that's not your true form. It's not "fake it till you make it" anymore, it's "make it, then fake it". I would love to change my profession from time to time and work as a street cleaner or train driver. The modern work model binds you to a specific job with specific hours... in extreme cases you are financially bound to a single person who just wants to become rich with a senseless product but has the resources to try it... and when I talk to colleagues, they dream of escaping this lifestyle. This work model will become outdated, and AGI / the singularity will most likely deliver better alternatives. I hate that low-paid jobs carry a stigma; I would love a "work-fluid" society managed by AI.

7

TheAnonFeels t1_j661id5 wrote

You make some good points, I agree with most..

However, I think most jobs will be obsolete by the time AGI actually gets created. That's the biggest issue I have with the points you made.

4

HyperImmune t1_j669g2c wrote

I always hope we get to the point where UBI is standard, and you can “earn” additionally by making improvements to society around you based on whatever you enjoy doing.

6

TheAnonFeels t1_j66b2pl wrote

I honestly would rather UBI be universally equal... I can only see the alternative going south in the end, like classes, yet again.

I could see a system where you apply to fund something outside the budget, like a house expansion or literally anything beyond the basic funds... I'm thinking post-scarcity though; until then, UBI plus a minimum wage would be my answer.

4

TheKing01 OP t1_j672x0l wrote

Oh, you mean like jobs getting automated years before the intelligence explosion?

Hmm, yeah I didn't think too much about that 🤔. I guess something like UBI would help in that case (but wouldn't be sufficient once the AGI is created and moves to the ocean)? Or maybe philanthropy will still suffice (despite not being a zillion dollars)?

I suppose the most practical individual advice might be to buy stock (fractional shares for the expensive ones) in the companies most likely to automate your job. This is a kind of hedging strategy: if they succeed, the stock gains help offset your lost wages, and if they don't, the stock goes down but you keep your job.
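
To make the payoff structure concrete, here's a toy sketch with completely made-up numbers (the wage, stake size, and price moves are all hypothetical, and this isn't financial advice):

```python
# Toy illustration of the hedge with made-up numbers (hypothetical wage,
# stake size, and price moves) -- not financial advice.

wage_income = 50_000   # hypothetical yearly wage
stock_stake = 5_000    # hypothetical investment in the would-be automator

def yearly_outcome(automation_succeeds: bool) -> float:
    if automation_succeeds:
        # The job is gone, but the automator's stock (hypothetically) triples.
        return 0 + stock_stake * 3.0
    # The job survives; the speculative stock loses half its value.
    return wage_income + stock_stake * 0.5

print("automation succeeds:", yearly_outcome(True))   # 15000.0
print("automation fails:   ", yearly_outcome(False))  # 52500.0
```

Either way you're not wiped out, though how much the stock offsets the lost wages obviously depends on how big a stake you can afford.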

0

warpaslym t1_j66ub4i wrote

it's comical that you believe the US government would do things any differently, or better than China. it's hard to take posts like these seriously when you're so out of touch with your immediate surroundings.

3

TheOGCrackSniffer t1_j67odug wrote

He's a product of his environment and is subject to constant propaganda; it's hard to wake up from that position.

0

TheSecretAgenda t1_j67cwwu wrote

The historical behavior of capitalists makes this highly unlikely. They will give nothing up without pressure.

3

Puzzleheaded_Pop_743 t1_j68bt2l wrote

History tells you that values change over time as scarcity decreases (technology also changes values regardless of scarcity). Human values are becoming more altruistic over time. Companies were more selfish 100 years ago than they are now because people were more selfish (due to scarcity).

Tech companies employ some of the least selfish people compared to other industries. Don't get me wrong, I am not saying that they are selfless (any complex system eventually develops defense mechanisms in order to survive, i.e. selfishness). I agree that change requires pressure. I am just pointing out that the pressure ALREADY exists.

2

TheSecretAgenda t1_j69q8xm wrote

Are you kidding? America is a failed state. We cannot provide healthcare for our people, we cannot provide clean water for our people, our government does not serve the people, crime is rampant, children are bringing weapons to school and murdering other children, people cannot afford food or heat, and 70% of people are effectively broke.

This is all a result of greedy oligarchs who do not want to pay their taxes or pay a living wage. They are not better people than 100 years ago they just have better public relations.

4

Gimbloy t1_j676gzx wrote

A lot of wishful thinking with little evidence to back any of this up.

You assume that absolute power will not corrupt these philanthropists. You also assume that AGI won't operate under a winner-takes-all game-theoretic dynamic.

2

QLaHPD t1_j65zcvt wrote

Probably the average user will get some kind of general Siri, not Skynet.

1

kalavala93 t1_j66e2iy wrote

With how flawed man is, I'm trying to figure out how AI won't kill us. It just seems like its mandate at this point.

1

RobbieQuarantino t1_j66q3lh wrote

Not sure why you're being downvoted.

Dumb algorithms are already being used to drive a wedge between groups of people, so I'm not sure what the futurists think will happen if and when AGI hits the scene.

5

kalavala93 t1_j66yhtf wrote

I'm being downvoted because people don't like to hear negative things. I mean... this is the singularity subreddit. It's a subreddit whose purpose relies on an AGI bringing us there.

Suggesting the likely reality that AI is going to kill us ruins the singularity for everyone.

It's like telling Christians that their salvation is contingent on Christ coming back to redeem mankind, and then telling them he's coming back to commit mass human genocide. Doesn't sit too well with them.

That said, I don't want AGI to do this, and I hope it doesn't. But AGI research is exploding and alignment research has gone NOWHERE meaningful at all. So yes, it is likely AGI will kill us. But there is a chance it won't.

2

LiveComfortable3228 t1_j6765jg wrote

Conceptually, I understand the alignment problem and why it's important. From a practical PoV, however, I think this problem is completely overblown. I'm happy to hear why an AGI is likely to kill us all.

My main concern is really the impact of AGI on society, corporations and the future of work. I think it will have a MASSIVE impact everywhere, in all areas; most people will not be able to re-adapt / re-learn, and UBI is not going to be a viable answer.

I don't believe in utopias of AGIs working for us while we pasture, play and create art.

1

rixtil41 t1_j67h6lb wrote

Does AI alignment mean that it has to agree with us on everything?

1

kalavala93 t1_j687cdg wrote

To me, AI alignment means that at a minimum it has to not kill us. The problem with getting it to agree with us is that we can't even agree with each other. We don't even have a unified take on what AI alignment looks like... AI alignment in China could look like "help China, fight the USA". That makes things very complicated.

1

DukkyDrake t1_j66fnf8 wrote

Those zillions could also be hard to come by if lesser AGIs proliferate. As a result, once-profitable markets may no longer be profitable in the future.

In a world where an AI assistant handles people's shopping, and there are a hundred AI competitors for every product they create, will companies still spend $700 billion on advertising per year?

I think it's more likely people will be able to afford more via price reductions than via increases in their income.

1

No_Ninja3309_NoNoYes t1_j67m1kf wrote

Everything is possible. It is also possible that the people you mentioned never get to see an AGI. Maybe a zillion dollars has only a 0.0000000134% chance of leading to AGI. The problem with nonexistent numbers is that the maths involved is also nonexistent.

1

ipatimo t1_j68skxn wrote

The onset period can be really hard.

1

Kaje26 t1_j6a2ymz wrote

Okay buddy

1

Ironhead501_ t1_j6c1433 wrote

I'm not in a tech field, nor am I well versed in economics, and I'm not optimistic about the altruism of mankind. That being said, why has no one in the thread (as far as I've read) mentioned that the problem isn't already being addressed now? By "problem" I mean excess population. I don't think it's a strain on the imagination to see how "the powers that be" might twist mass murder into something they see as good for humanity. Culling the herd... I don't agree... but to be fair, it might look different if I were on the other side of the fence. Evil men don't think they're evil; there's always a way to justify evil acts "for the greater good." If that scenario is just too impossible to entertain, I'd very much like to know why... I truly would.

Another question that comes to me (and please excuse my ignorance on the subject): as I understand it, post-singularity AGI will rapidly evolve to the point that we'd have no ability to control it, or even comprehend it; that it would, very soon, be to us as we are to ants, and then continue to grow. Why would such a being care about our fate, much less feel the need to "move" in order to dodge taxes? Taxes? Really? That's an option? I think it's more likely we should try not to get between it and any goal it may have. It seems our best hope is that it ignores us. If that's way off base, please let me know why. Reassurance would be welcome... seriously.

1