
AdditionalPizza t1_iw2hsol wrote

I think people would be happy if UBI comes to fruition, and the best chance of that happening with the least amount of turmoil and suffering is if automation sweeps quickly. If it's too slow of a rollout, there will be a lot of people stuck in limbo, feeling useless to society and losing their life savings.

I don't think any mentally stable person wishes for that slow scenario at all. Yet there are so many people who defend working full time for a living and insist automation won't take their job. Well, it probably will. I don't know when, but it most likely will. The best case scenario is sooner rather than later. They can't fathom that; they think the longer they can work, the better. The reality is that the shorter the timeframe for everyone, the better.

UBI implementation is so unpredictable though, because we just can't accurately guess how humans, specifically politicians/unions/luddites, will react. If AI art generation has been a sign at all, it's that artists are going apeshit, and in the broad scope of things nobody cares about them; everyone just wants that sweet, sweet automation.

So either it's a case of dominoes, where industries get automated one by one as people fight tooth and nail to slow progress, or an entirely generalist AI capable of doing most tasks across most industries drops all at once, and we're all left with enough time on our hands to enact change.

I will keep insisting the best thing people can do is simply be prepared: don't be caught off guard, and keep an eye on how AI is advancing so you can stay mentally okay if your career path is disrupted.

As for the scenario of billionaires leaving us to die, I don't really see how that is likely. Okay sure, I won't deny someone might be in control of it. But outside of total enslavement of humankind (doubt), they would either (A) need people who can afford to buy the products they make, or (B) leave us behind and live in their own paradise, in which case we would continue on doing our own thing.

It's hard to imagine one single person being in charge of an ASI that has utter control over the entire world and just wants everyone to starve to death.
