V_Shtrum

V_Shtrum t1_jaep48h wrote

All of what you say is true; the circle I'm trying to square is that, on the one hand, people often find work dull, unfulfilling and so on. On the other, it's been widely observed that unemployment and underemployment correlate with all sorts of negative outcomes: crime, poor mental and physical health, etc. I'm not sold on the idea that more generous unemployment benefits (AKA UBI) alone are going to solve that.*

I was convinced by Viktor Frankl's book 'Man's Search for Meaning' that (most) people aren't hedonistic at their core; what they really want is meaning in their lives. Many people get this from work, others from having a family (and so on). I think that if AI were to eliminate work, it would eliminate a lot of the meaning that many people have in their lives, and something needs to replace that. If nothing positive fills that vacuum, then something negative will.

EDIT:

I would also add, as you intimate, that the death of meaningful work predates AI, and that the gross dissatisfaction that many people feel at work (and in their lives) is a consequence of this. I don't know what the solution is.

*There will of course be a subsection of people who will be perfectly happy on UBI.

1

V_Shtrum t1_ja8s4yi wrote

>I can imagine some practice areas being wiped out

I don't know the specifics, but as a layman I can imagine a lot of people getting some DIY legal advice from an algorithm, which may reduce the number of consultations firms receive and reduce their revenue.

>one attorney with a good AI would be more productive than one attorney overseeing a handful of associates.

Agreed, it's easy to foresee this. It will be the same for software devs, though most seem to be in denial about it.

However:

  1. The legal profession is practical as well as intellectual: representing clients in court, seeing the whites of their eyes, communicating good and bad news, counselling clients, convincing a judge, twisting clients' arms, etc.

  2. Lawyers enjoy legal protection not afforded to other professionals / AI. There's nothing stopping me using ChatGPT to code, but ChatGPT can't represent me in court.

  3. The courts (etc) are very conservative and resistant to technological change.

Still, I think it's a safer bet than many other fields.

2

V_Shtrum t1_ja8q5d3 wrote

I think attorneys are much more likely to be augmented by AI than replaced in our working lifetimes. Even if the technology is there, lawyers enjoy privileges and professional protection that won't be granted to algorithms. It seems like a comparatively safe bet next to management consulting or copywriting, for example.

2

V_Shtrum t1_ja8ed3m wrote

Have you read Yuval Noah Harari's books? He talks about this:

He makes the point that throughout history, human beings have been exploited: feudalism, communism, capitalism, slavery etc etc. However unpleasant it is to be exploited, no-one could deny that human beings had value, economic and otherwise.

With developments in biotech, infotech and robotics, we're fast approaching a point where humans have no value, there's literally no need to exploit them, and nothing to gain by doing so.

Globalisation and offshoring have already rendered the manual classes in the West essentially obsolete, and the argument goes that this is behind the recent resurgence in authoritarianism and xenophobia. We're likely to see another wave of this - only this time across all classes and all countries simultaneously.

I think that a lot of people on this sub have a utopian view of these developments. Why would our governments and corporations be interested in providing us a utopia? What do they gain by providing it? What can we exchange with them when we have no value?

Having a job is having value in the economic system - however small; to hope that all work becomes obsolete is to hope that we all lose our value.

2

V_Shtrum t1_ja7w2yu wrote

This

Money is just influence, the ability to influence others and make them do what you want. Looked at this way, it's totally unsurprising that we still work long hours in jobs even though almost all our material needs can be met through automation. People will always want money because they want to influence others.

3

V_Shtrum t1_ja70czs wrote

So collecting food (hunter-gathering) had an absolutely colossal learning curve: it takes years upon years to learn how to track and kill wild animals. Gathering wild fruit and vegetables requires an intricate knowledge of the landscape and of which plants are poisonous and which aren't. Crafting has a similar learning curve: fashioning baskets from reeds and clothing from animal hides (for example) is highly skilled work.

The point I'm making is that "work", however you define it, has been part of the human experience since prehistory. It has shaped all human cultures and all our psychology; I don't think it's trivial for that to disappear, and I wonder what is going to replace it.

8

V_Shtrum t1_ja6s0mn wrote

Yes, but taking care of the day-to-day needs of a community meant work: blacksmithing, farming, carpentry, architecture, etc. Those trades became specialised not in the last few decades, but literally thousands of years ago.

4

V_Shtrum t1_j3w8qqb wrote

Hmm, I'm not sure the education system is obsolete; perhaps it's more the 'assignment/essay' method of assessment?

In the Italian university system almost all assessment is oral: you sit down at the end of your module and get grilled by your professors. Perhaps the Anglosphere universities will move to that system.

Two downsides of oral exams: examiner bias (your professor could be an '-ist') and the fact that they're labour-intensive. Perhaps an AI oral examiner could be the way forward?

4

V_Shtrum t1_j3w5hm7 wrote

>I would maybe add: if a machine can do your job better than you, and at a lower cost, you have a Bullshit Job

That makes a lot of sense; I don't think Graeber talked much about that, if at all.

Speaking of data entry, I (unfortunately) have to do a little bit of this at work: nothing difficult, but I take radiology reports as input and extract information to put into Excel for data analysis.

By way of background: medical radiology reports are written in free text, they have no structure, and each radiologist writes in a different format. Extracting data from these reports is very important: say you develop a new cancer drug, you would really like to know whether it improves outcomes, e.g. by slowing the spread of the cancer.

I pushed for us to hire an administrator to do the data entry on my behalf. The administrator came, and I spent hour after hour, day after day, trying to teach them, to no avail - they just couldn't wrap their head around how to do it.

Brainwave: I thought maybe I could get ChatGPT to do it. I spent 30 minutes teaching it and voilà - paste in a report and it churns out exactly what I want in spreadsheet format. I've shown it to some academics at the university and they're absolutely blown away by it; it'll be a complete game changer.
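For anyone wondering how little plumbing this needs, here's a minimal sketch of the same idea done against the API rather than the chat window. To be clear, the model name, the field list and the example report below are placeholders I've made up for illustration - they're not my actual setup, and you'd obviously think hard before sending anything patient-identifiable to an external service.

```python
# Minimal sketch: pulling structured fields out of a free-text radiology report
# with the OpenAI Python SDK (v1.x). Field names, model and report are made up.
import csv
import sys

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

FIELDS = ["exam_date", "modality", "lesion_site", "lesion_size_mm", "progression"]

SYSTEM_PROMPT = (
    "You extract data from free-text radiology reports. "
    "Return exactly one CSV line with these columns, in order: "
    + ", ".join(FIELDS)
    + ". Use 'unknown' for anything the report does not state. No header, no commentary."
)


def extract_row(report_text: str) -> list[str]:
    """Ask the model for one CSV row and parse it back into a list of values."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name - swap in whatever you have access to
        temperature=0,        # keep the output as repeatable as possible
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": report_text},
        ],
    )
    line = resp.choices[0].message.content.strip()
    return next(csv.reader([line]))


if __name__ == "__main__":
    # Made-up report purely for illustration.
    report = (
        "CT chest/abdomen 03/01/2023. Comparison 01/10/2022. "
        "The right upper lobe nodule now measures 14 mm, previously 9 mm, "
        "in keeping with disease progression."
    )
    writer = csv.writer(sys.stdout)
    writer.writerow(FIELDS)
    writer.writerow(extract_row(report))
```

The point isn't the particular fields; it's that the model copes with the fact that every radiologist writes differently, which is exactly what our administrator couldn't do.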

Anyway, without wanting to be disrespectful to our administrator, ChatGPT just put them to shame: it was massively more intelligent and easier to teach. I foresee AIs being able to 'pilot' the Windows UI in the near future, working across software with a digital mouse and keyboard. When that's the case, I really think it will be an existential threat to data entry workers and unskilled administrators.

3

V_Shtrum t1_j3vytka wrote

Yes, I read and enjoyed it; I found it equally insightful and hilarious.

The only thing I'd say is that, in relation to this discussion, he has a very loose definition of what makes a job bullshit. Am I right in saying he considers a job bullshit when (and only when) the employee considers it bullshit? It just seems too broad and leads to contradictions.

2

V_Shtrum t1_j3v3eam wrote

I agree that it's impossible to define precisely what makes a job bullshit, but I disagree that all jobs - even in a "well run" private business - increase profit or are necessary.

I think this is because, ultimately, all businesses are just collections of people who may be somewhat rational collectively but irrational at an individual level. There will always be managers in a good business who create superfluous jobs and want to expand their team; I've never met one who said "my team should shrink, my budget is too high" - the talk is always of expansion.

I know that in theory 'the market' should penalise companies that hire too many superfluous employees, but this would presuppose that markets are truly competitive, which many aren't for many reasons.

7