nowrebooting t1_jef0bua wrote

I think you made the right decision in choosing construction, however…

> Maybe one day.

In the past, I always thought that creative jobs would be the last to be automated away. One year ago, I would have said “maybe one day”.

While you are right that currently the most likely outcome is that manual labor will remain the purview of humans for a long time, recent history has shown that we’re often only one innovation away from a complete shift in thinking.

At this point, every time someone says “but my job is safe” I feel like that job will probably be fully automated within the next five years.


nowrebooting t1_jedyfcu wrote

That’s understandable; even as someone who is a tech cheerleader, I feel a hint of sadness when I think about the value of my programming skillset going down - but then again, that may well be my very human ego feeling annoyed about no longer being that special compared to others. It’s also telling that none of us generally cared at the prospect of truck drivers being replaced by self-driving trucks, but now that it’s us being threatened, it becomes a big philosophical debate. It’s a bit of an echo of every paradigm shift since the industrial revolution - it’s arguable that we lost some of our humanity when we switched to the assembly line or when we all started spending most of our days behind a computer, but those shifts also gained us a lot of freedom to explore our humanity that we didn’t have before.


nowrebooting t1_jedwwoe wrote

I think the people who have the most to fear from AI right now are actually the people at the top - you are right that AI advancement will inevitably lead to societal upheaval, uncertainty and a paradigm shift, but the person with the most to lose isn’t Average Joe whose office job is automated; it’s the elite, whose claim to power might come crashing down when AI levels the playing field across the board. At the moment, almost all capitalist power structures are based on the idea that while I might resent the wealthy elite, I’m dependent on them for my livelihood. They control my income, which means they control me. Once that dependence breaks, their only choice is to either keep Average Joe happy or face their own French Revolution.

Beyond that, it’s my hope that in a world where AI is smart enough to reliably replace a majority of all jobs, it’s also going to be smart enough to quickly come up with policies to keep the world from plunging into anarchy. Any AI that can outthink a human will realize that oppression, starvation and violence can always be avoided. A worst case might be a Brave New World type scenario, where we are “domesticated” by an AI that understands our psychology better than we do and keeps us happy while needlessly keeping its elite masters in power.

It’s an interesting prospect; at this point we’re looking at a future that is pretty much impossible to predict. While I have my own ideas of what might happen, anything is possible.


nowrebooting t1_je05eyi wrote

It’s like clockwork: “well, but we still need humans for X”, only for an AI to do X a few months later. At this point the only relevant discussion left is not IF an AI is going to surpass human intelligence, but HOW soon. Whether people feel this is a good or bad thing is up for discussion, but it doesn’t matter much in the end; anyone not already preparing for the coming AI revolution is going to experience a rude awakening.