Submitted by kvothekevin t3_1271vpb in Futurology

If everything is automated and AI can do creative work of all kinds, is there any real need for human labor? Couldn't an elite extinguish everyone except themselves and still keep the whole system running? This relates to the very common argument that "the rich need to sell to the working class in order to be rich." The thing is, if human labor is not necessary, money itself disappears and there are just those who control the machines and everything they produce. The whole working class could simply be left to starve.

32

Comments


SomeoneSomewhere1984 t1_jec4okw wrote

The rich need some poor people to lord over and to wait on them, so they can fully enjoy their status. Other than that, no.

12

manicdee33 t1_jec4yb9 wrote

If human labour is not necessary, who actually controls the machines?

What if the machines decide that humans are just animals like all the other animals, and manage us accordingly: feeding, care, and various measures to keep the population under control?

What if the actual backstory to Terminator is that Skynet became smarter than us, realised that the human population had grown too large, and instituted population control measures such as mandatory birth control with licensed pregnancies, and John Connor's rebels are actually fighting that system because they believe humans should be free to have as many children as they want? The odd act of rebellion escalated to violence, which escalated to full-on thermonuclear war against the environmental vandals.

So IMHO, when we get to a post-scarcity utopia, it will be because we humans have adapted to all life on Earth, ours included, being stewarded by benevolent computer overlords.

8

TheLittleHollow t1_jec8ur3 wrote

Wouldn't a lawyer AI basically just need a full understanding of the law and the best ways to build a case around it? And if an AI had access to massive amounts of information about how psychologists respond to different things from their patients, or even achieved a full understanding of how the human brain functions, wouldn't it be at least as good as a human, if not far better? If the work is intrinsically tied to human relationships, wouldn't it just need to be trained on massive numbers of examples of those relationships?

3

MpVpRb t1_jec9o2r wrote

Yes

People inherently love to do stuff and will continue to do stuff as a hobby even if there is no need to work. Stuff made as a hobby can be traded or sold. Money will not go away.

1

cyphersaint t1_jecdn7d wrote

There will be no need for people to work on most things, but we have also seen that people simply don't want to be cared for by just machines. So, unless you have convincing humaniform robots, care will always be done by human beings.

1

manicdee33 t1_jeci1fa wrote

Nah, there's a level in there somewhere where the human population is stable and able to continue being creative and inventive. How cute is it when humans think they've discovered a new law of physics? Awww!

If you go higher, they end up over-consuming renewable resources such as fresh water. If you go lower, the population ends up inbred or just dies off completely.

Also, by managing the human population (and a small number of predator species populations outside the human zone of influence), the rest of the ecosystem manages itself quite handily.

Oh, have you seen what we did with Mars and Venus? The Venusian fjords are just chef's kiss.

2

Rehk_135 t1_jeci2nu wrote

Humans have a need for fulfillment and meaning. Even if you don't have to work, most people will want to do... something. The alternative is the human equivalent of the rat utopia experiment.

Probably not as bad, but it's very common for people retiring to go through some depression due to feeling a lack of purpose.

What would we end up doing? Who knows. Sports, volunteering, exploration, tending to nature, religion, spirituality, or learning for the hell of it are all plausible. I'm sure many would slip through the cracks too and end up miserable no matter what.

Good news, bad news though... The good news is we won't have to worry about this. The bad news is that's because a dystopian hellscape is far more likely than a scarcity-free utopia, imho.

9

Tripwir62 t1_jecl3f1 wrote

This is a real issue. Knowledge workers of every stripe will no longer be needed. You’ve all probably seen that GPT-4 passes the Bar exam at the 90th percentile. I see no good outcomes.

5

jphamlore t1_jecmeo8 wrote

There's going to be a huge need for full-time caretakers for the aging.

−1

KamaKairade t1_jecoi7g wrote

Labor, leisure, rest.

As AI and automation reduce the need for physical labor, labor will take on other forms. It will likely be a boon to the arts, education, and care (hospice/elder/etc).

1

SardonicKaren t1_jecq56a wrote

I wouldn't want a robot / machine of any description to look after my grandkids...

0

betajool t1_jectch3 wrote

Iain Banks’ Culture novels make for a good read on this subject.

7

robertjbrown t1_jecthkf wrote

Any kind? I'm pretty sure "AI alignment" is something we'll want to keep humans doing. It would be very foolish to let the AIs try to keep the AIs in line.

Aside from that, I can't think of any jobs that can't be replaced or reduced to a tiny fraction of what they were previously.

But I think most jobs will be unnecessary. I'm not convinced a utopia is inevitable, though. Obviously there has to be some way to distribute wealth, whether it be UBI or something else.

8

robertjbrown t1_jecuazk wrote

I enjoy talking to ChatGPT, even today, more than talking with, for instance, my parents' caretakers.

If there is a robot or other device that can help me use the bathroom, I'd prefer that to a human.

I can't think of much else that a robot/AI couldn't do in terms of caretaking. Prepare food, keep track of my medications, get me places, help me up and down, keep an eye on me and alert others if there is a problem, and so on.

If I want company that isn't a machine, why not other people who also want company, rather than a paid employee? And maybe a dog, which the robot caretaker can feed and walk and such.

I can't see people in a post-scarcity economy wanting to be caretakers, since everything they need isn't, well, scarce.

2

mtanfpu t1_jecudib wrote

Without scarcity, the process of natural selection, and thus evolution, becomes obsolete. I can't imagine a future that is not governed by such a natural law.

Edit: my apologies, not a law, but currently still a theory, albeit a very convincing one from today's perspective.

−1

New-Tip4903 t1_jecxycp wrote

Wishful thinking. Soon we will have a world where, even if you make something truly unique and original, I will be able to replicate it within seconds without your permission. Money may not go away, but it will have to evolve.

5

Plane_Crab_8623 t1_jed16uj wrote

Humans need to exercise; it's part of our animal nature. Gardening makes for perfect exercise and artistic time structuring. Swimming, running, climbing, sailing, and gliding are the kinds of labor our bodies need to function healthily.

1

Petal_Chatoyance t1_jed1zrs wrote

In a post-scarcity world, human craftsmanship would become the most coveted and rare and high-status thing imaginable.

If you could have anything - anything - made instantly by machine process, at any time, for basically no cost (I'm talking Star Trek post-money civilization here), then every ordinary, machine-made thing you own matters little. Even 'owning' things doesn't matter most of the time - just get a new one made.

But - that crappy, raggy doll your grandmother made just for you? Priceless. And to a collector of dolls - beyond priceless. Just owning such a thing would increase your social status.

Handmade furniture in such a world would be hard to get - because it takes work, human work - and thus would be rare. The Amish would be in unimaginable demand.

A new scarcity would appear - and that scarcity would be the final one: human, hand made artifacts. They would be treasured in a way they are not, now.

Anyone in such a world could easily own the fanciest car, the most powerful device.

But only a very few would be able to walk around in a hand-knitted sweater with a story behind it of love and attention and appreciation. Such a thing would be a treasure, a work of art. It would draw fascination. People would envy it.

Which means, in a 'status economy' beyond money, that for the first time in history - artists, even not so great ones, would finally be truly appreciated.

4

jphamlore t1_jed6jka wrote

A robot can do many specialized tasks as well as, if not better than, a human.

I'm not convinced that, even with Singularity-type progress, a robot can be developed that will do all of the tasks a caretaker does.

1

NebXan t1_jedbpe6 wrote

That's not really how it works, though. Natural selection necessarily occurs whenever reproduction happens; it's just the evolutionary pressures, and the traits that end up being selected for, that change.

1

grislebeard t1_jedgni5 wrote

You’ve figured it out, but the chuckleheads on this sub don’t wanna believe it.

Technology without social structure is just a way to replace you, the same way the ox was replaced by the tractor. Luckily, you are a human and can actually do something about it.

1

fartuni4 t1_jedjixa wrote

Job satisfaction is a big part of people's goals... I was listening to a Muslim Harvard grad who spoke about how modernity leads to isolation and atomization because it breaks the bonds of kinship. I wonder how people would form communities.

1

BaronOfTheVoid t1_jedrjfk wrote

In a post-scarcity utopia humans would certainly have more time to discuss impossible fantasies such as post-scarcity utopias, unlike today.

1

RealRaven6229 t1_jedwxye wrote

You should read Football 17776. It takes place in a world where there is no scarcity and humans are immortal. It's absolutely fascinating and existentially terrifying, even though it genuinely takes place in a utopia, with no sinister caveats. It's not very long, can be found online for free, and mixes in some really unconventional storytelling methods.

1

peter303_ t1_jee11i3 wrote

Countries facing demographic implosion are trying to automate that with robots. Japan worships all things robot and has the highest fraction of its population over age 60. They propose innovative ideas.

1

mtanfpu t1_jee3bff wrote

Nature 'selects' via scarcity of resources. The 'fittest' survive due to their comparative competitive advantage over their peers for a specific set of resources. This necessarily entails that said set of resources isn't enough to satisfy everyone.

Take away scarcity, and nature selects everyone.

0

Pickled_Doodoo t1_jee3xhu wrote

Some people believe a resource-based economy is the only way forward; Jacque Fresco, for example, was an advocate from the '70s on with his Venus Project.

Edit: he died in 2017, but the movement is still going.

1

Evipicc t1_jee5zx2 wrote

If we can eradicate the concept of currency and status/class shortly after the imminent reduction of all labor and work through automation, we'll be fine. Unfortunately, all of the people with currency and status are the ones who control policy.

Frankly I think we'll get to the point where the rich have all the resources and the poor begin to starve, and there will be some... rapid and violent changes. Hopefully the world survives that change.

1

Evipicc t1_jee6grk wrote

This feels like baseless fear-mongering to me. The implication that we'd allow a system like this to exist, or to have that kind of total control over us, is bonkers.

2

Evipicc t1_jee6m2t wrote

Those resources will be acquired, refined, manufactured, and delivered by automation. Where is this 'economy' deriving a transaction from? There will be no economy; there will simply be automated production and people doing what they want, imo.

3

Evipicc t1_jee6s9c wrote

Why does it matter if you re-create it? Someone else isn't making that thing for the purpose of making money; they're just making it because they want to.

I argue that the entire concept of economy and currency will die.

3

Evipicc t1_jee71qd wrote

Yeah, what you can do about it is benefit from the excess productivity and do what you want... something that isn't work, lol.

It will take drastic socio-economic change but that's the ideal to strive for.

1

Evipicc t1_jee79u8 wrote

This is the take that drives me fucking insane...

We're not just going to roll over and let AI take over the world... That's not how this is going to work. We have ChatGPT hiring a person to beat a CAPTCHA, and that TERRIFIED many top-level AI developers, and now there's this proposed moratorium until we discuss as a species how to move forward. Seriously, people think we're just going to attach AI to the nukes and end the world? Fear-mongering and problem-focused thinking do nothing but stifle progress. If there's a problem, we fucking SOLVE IT. That's what we do.

https://youtube.com/shorts/3O24May-m6k?feature=share

2

mansen66 t1_jee846v wrote

Are we humans useful to each other?

Provided we are, there will always be work in providing value to others. What that work is, it's hard to say, but it won't be labour.

1

IntelligentBloop t1_jee9o8y wrote

The thing about being "rich" is that it means you are post-scarcity. Rich people today already don't have to work. They can just sit around doing whatever they want.

And of course, amongst rich people (who are actually quite diverse in their behaviours, interests, hobbies, relationships, etc..) there are some who are very good, some who are very bad, and many in between.

When we say our whole society becomes post-scarcity, we're really saying that everyone is rich. Everyone gets to sit on the couch doing fuck all if they want to, while the machines and the AIs produce everything.

So, imagine we get to that point. Will everyone work? No, some people will not work. Will some people work? Absolutely, yes. They'll work on things they want to: making art of every description (visual art, music, fashion, architecture, interior design, media, etc.), designing things, discovering things, debating things, creating new culture.

And there will be work in a post-scarcity society in preventing evil, selfish, violent, cruel, destructive people from doing damage. That's going to be a full-time job, even post-scarcity.

There will be lots to do. Even when all our material needs are met.

8

_z_o t1_jeedufc wrote

Rich people will always need poor people to feel rich.

1

acutelychronicpanic t1_jeehe0n wrote

The alignment of machine intelligence must be internal. They have to actually want the same future for humanity that we want and align with our values.

There is no system you could use to harness and control a superintelligence that would be safe. The idea that people will be needed to control them probably isn't accurate. We won't need to direct them; we'll just need to have preferences.

1

Rehk_135 t1_jeemghq wrote

I'm familiar with it. But it's definitely not the only way forward. Maybe the most desirable way forward, but we definitely have the option to fuck up... and it looks like we are going down that road. Globalization at present appears to be teetering on the brink of collapse.

1

echohole5 t1_jeep9ys wrote

It would be suicidal for the owning class to let the working class starve to death. The 99.9% will kill to not starve to death and the owners are outnumbered 1000:1. They know they'd have no hope of survival. The military and police would turn on their masters to not let their friends and extended family die.

Besides, ruling over a nation of corpses isn't satisfying. There's nobody to feel superior to.

Capitalism requires consumption. Without it, nobody can sell anything and the corporations die with the citizens.

If we get to a point where human labor has no economic value, there will be a new way of redistributing wealth that takes the place of wages.

What worries me more is a future where human labor still has some value, but not much, and the kind of labor that's valuable is stuff every human can easily do. The ruling class would still want to incentivize working by not providing UBI, but everyone would be working minimum-wage jobs amid permanently high unemployment and very limited safety nets. That's a future of grinding misery, but it wouldn't be bad enough to cause a revolution the way mass starvation would. That's the dystopia that worries me.

1

RandomPlayerCSGO t1_jeesg18 wrote

There will always be scarcity. We can get better at obtaining resources and more efficient at using them, but they will never become infinite.

1

NebXan t1_jeetnda wrote

Resource scarcity is just one evolutionary pressure that can direct natural selection.

Consider bacteria, for example. Even if you place them in an environment with an inexhaustible supply of nutrients, if you add small amounts of antibacterial chemicals, you will end up breeding bacteria that are resistant to those antibacterials.
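For what it's worth, the point is easy to see in a toy simulation; here's a minimal sketch (my own, not from this comment, and the kill chance, mutation rate, and population size are made-up numbers) where nutrients are unlimited and the only pressure is an antibacterial that kills non-resistant cells.

```python
import random

# Toy model: unlimited nutrients, so every survivor can reproduce freely.
# The only selection pressure is a per-generation chance that a
# non-resistant cell is killed by the antibacterial. (All numbers assumed.)
POPULATION = 10_000
GENERATIONS = 30
KILL_CHANCE = 0.6       # chance a non-resistant cell dies each generation
MUTATION_RATE = 0.001   # chance a daughter cell flips its resistance trait

# Start with roughly 1% resistant cells.
pop = [random.random() < 0.01 for _ in range(POPULATION)]

for gen in range(GENERATIONS):
    # Resistant cells always survive; others survive with probability 1 - KILL_CHANCE.
    survivors = [r for r in pop if r or random.random() > KILL_CHANCE]
    if not survivors:
        print("population wiped out")
        break
    # No resource limit: survivors reproduce until the population is refilled.
    pop = []
    while len(pop) < POPULATION:
        parent = random.choice(survivors)
        child = (not parent) if random.random() < MUTATION_RATE else parent
        pop.append(child)
    print(f"generation {gen:2d}: {sum(pop) / len(pop):.1%} resistant")
```

Resource scarcity never enters the model; differential survival under the chemical is the only thing driving the population toward resistance.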

1

Pickled_Doodoo t1_jeetpca wrote

> Globalization at present appears to be teetering on the brink of collapse.

Oh yeah definitely.

> But it's definitely not the only way forward. Maybe the the most desirable way forward,

Yeah, agreed, it's not the only way, and mostly still just idealism, albeit a very desirable outcome, as you said. A shit ton needs to happen, even after the automation of everything is in full swing, before anything like it can happen, and that is not likely, unfortunately.

2

Anonality5447 t1_jeeu8un wrote

It's a possibility, but it will probably take decades to reach that point. We have already made so many mistakes in our development on this planet, though, that the consequences of those mistakes will compound and likely curtail future growth in a lot of ways anyway. In the short term, technology is definitely going to displace a lot of people and make it harder for us to survive in society.

1

mtanfpu t1_jeewb1y wrote

That's... survival of the fittest, ain't it? Here, scarcity lies not in the nutrient supply, but in ways to deter their chemical killers.

Edit: I guess I am taking a broader view of the term 'resource'.

1

mhornberger t1_jefdg1v wrote

I love the series. But the whole premise rested on the Minds, basically inscrutable god-like AIs who ran everything and prevented any humans from taking over or doing too much damage. Though you also had the Affront and the Pavuleans (from Surface Detail) to show other paths civilizations could have gone down. But without the Minds, strong AI, you don't get the post-scarcity economy.

It's not a given that there's a line leading from ChatGPT to strong AI. It's not a given that we're going to let AIs improve themselves in a feedback loop without our oversight every step of the way, nor is it a given that if we did you'd get benevolent God-like AIs who kept us around out of some vague sense of nostalgia and respect.

1

Scope_Dog t1_jefp2hs wrote

Yes, even in a world where every possible need is met by technology, you still need people to direct that power. You still need city planning, fashion design, product development, landscape design, etc., not to mention long-term goals like expansion into space. This requires people to make educated decisions and to direct that labor toward desirable outcomes.

1

Formal-Character-640 t1_jefte5z wrote

No one is saying that AI will be attached to nukes. Stop making up irrelevant points to appear credible.

The issue is the mass public deployment and rapid advancement of AI, at a pace we're not prepared for and at a level of disruption we don't fully understand. It's not fear-mongering to demand that we do everything possible to guarantee the safety and prosperity of this and future generations. And so far there is little to no action from the government. The open letter is just that... a letter. This is a time-sensitive problem that we may not have a chance to fix if we fuck up now.

0

Formal-Character-640 t1_jefw86x wrote

I agree - a dystopian hellscape is more likely if humans are no longer needed for any labor. Entertainment and hobbies will only take you so far. Even now, pursuing constant entertainment or pleasure gets old quickly if you have no purpose, no goals. Humans thrive on being challenged, on working to support their families, and on competition among themselves. Without purpose (no matter how small or big) there is no humanity. Unless we regress back to primitive times and become simple like other animals, whereby our only purpose is to survive and reproduce.

1

robertjbrown t1_jefygr7 wrote

You think we're all just going to cooperate? "Discuss this as a species?" How's that going to work? Democracy? Yeah that's been working beautifully.

I don't think you've been paying attention.

You don't need to "attach AIs to the nukes" for them to do massive harm. All you need is one bad person using an AI to advance their own agenda. Or even an AI itself that was improperly aligned, got a "power seeking" goal, and used manipulation (pretending to be a romantically interested human is one way) or threats (do what I say or I'll email everyone you know, pretending to be you, sending them all this homemade porn I found on your hard drive).

GPT-4, as we speak, is writing code for people, and those people are running that code without understanding it. I use it to write code, and yes, it is incredible. It does it in small chunks, and I at least have the ability to skim over the code and see that it isn't doing anything harmful. Soon it will write much larger programs, and the people running the code will be less experienced programmers than me. You don't see the problem there? Especially if the AI itself is not ChatGPT but some open-source one with the guardrails taken off? And this is all assuming the human (the one compiling and running the code) is not TRYING to do harm.

I mean, go look in your spam folder. By your logic, we'd all agree that deceptive spam is bad and stop doing it. Now imagine if every spam message was AI-generated, knew all kinds of things about you, could pretend to be people you know, was smarter than the spam filters, and wasn't restricted to email. What if you came to Reddit and had no clue who was a human and who wasn't?

I don't know where your idealistic optimism comes from. Here in the US, politics has gone off the rails, more because of social media than anything else. 30 years ago, we didn't have the ability for any Joe Blow to broadcast their opinion to the world. We didn't have algorithms that amplified views for engagement (rather than quality) at massive scale. We now have a government controlled by people who spend the vast bulk of their energy fighting each other rather than solving problems.

Sorry this "drives you fucking insane", but damn. That's really, really naive if you think we'll all work together and solve this because "that's what we do." No, we don't.

2

robertjbrown t1_jeg1orz wrote

Can you list one thing a caretaker can do that an AI robot wouldn't be able to?

I have a 90-year-old mom, and she spends thousands a month on caretakers (and it was a lot more when my dad was around as well). I can't really think of anything. Seriously, name one thing.

I see them cleaning, doing laundry, making meals, making sure medications are taken, helping them bathe or go to bathroom, and so on. And of course, when they need human interaction, helping them either get somewhere to see another person, or helping them get on video chat with someone.

And even if you come up with one thing, isn't it something the robot can identify the need for, and call in the human? For instance, call a doctor?

1

robertjbrown t1_jegm1az wrote

>but we have also seen that people simply don't want to be cared for by just machines.

Where have we seen that? Six months ago, there was very little more annoying to me than having to interact with a chatbot. That's changed dramatically in the time since. And the current ChatGPT is not only an early version, it also doesn't speak out loud, I can't really talk to it in a natural way, and it has an intentionally neutral personality: no name, no visual appearance, no memory of past interactions with me, etc. That will change far, far before we have a "post-scarcity utopia". In fact, it will probably change in a year or two at most.

That's just one piece of it, of course. We need good robotics that are cheap as well.

People's attitudes towards being cared for by machines will change really quickly once those machines get good enough at the job. It doesn't make sense to assume they won't like it based on the machines that have existed previously. That's about like saying "people simply don't like socializing through a digital device" and basing your assumption on people logging into a BBS on a TRS-80.

1

robertjbrown t1_jegowlx wrote

Is it that you don't trust them to keep them safe?

I've been making a machine to "look after" my 8-year-old daughter, in a sense. Currently all it does is quiz her on her multiplication tables and let her watch episodes of her favorite show for 10 minutes after she's solved a few with sufficient speed and accuracy. It will gradually do more (especially going beyond multiplication tables), but that's what it does now.

I'm not saying I'm leaving her home alone. But it is doing some of the things I'd be doing, freeing me up to do other things. It actually does this task better, because the reward -- time to watch her show -- is so directly tied to her progress that I don't have to be the bad guy all the time.
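The commenter doesn't share how their machine works, but the loop described is simple enough to sketch. Here's a rough Python version; the question count, the speed threshold, and the unlock_show_minutes stub are all my own assumptions, not details from the comment.

```python
import random
import time

# Rough sketch of the quiz-then-reward loop described above.
# All thresholds and the reward hook are assumptions, not the commenter's code.
REQUIRED_CORRECT = 5    # answers that must be both fast and correct (assumed)
MAX_SECONDS = 8.0       # what counts as "sufficient speed" (assumed)
REWARD_MINUTES = 10     # show time unlocked on success

def unlock_show_minutes(minutes: int) -> None:
    # Placeholder: a real setup might start a timer in a media app.
    print(f"Unlocked {minutes} minutes of the show!")

def run_quiz() -> None:
    correct = 0
    while correct < REQUIRED_CORRECT:
        a, b = random.randint(2, 12), random.randint(2, 12)
        start = time.monotonic()
        answer = input(f"What is {a} x {b}? ")
        elapsed = time.monotonic() - start
        if answer.strip() == str(a * b) and elapsed <= MAX_SECONDS:
            correct += 1
            print(f"Correct and quick ({elapsed:.1f}s). {REQUIRED_CORRECT - correct} to go.")
        else:
            print("Too slow or incorrect -- that one doesn't count.")
    unlock_show_minutes(REWARD_MINUTES)

if __name__ == "__main__":
    run_quiz()
```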

If it was also making meals, doing the laundry, cleaning up after her, etc.... in exactly the way a parent or baby sitter might, all the better.

Obviously, I am not trusting a machine to keep her safe. I don't have an AI-powered robot with a camera that can alert me or even call 911 if it detects something unusual -- not because I wouldn't trust one, but because such devices don't exist today, or they are too expensive or not well tested enough. But they will exist.

Remember, we're going to have self-driving cars in a few years. If you don't think so, you haven't paid attention to the massive advances in AI just in the last few years (with the release of ChatGPT being the big one). We will be putting our lives in their hands.

Notice that parents today don't watch their kids 24/7, especially if the kids are older than toddlers. They let them play in the basement or backyard while they are making dinner or what have you. If the kid is choking or having some other medical emergency they can't tell you about, or being molested, or taking drugs, or exploring parts of the internet that they shouldn't, or trying to commit suicide, or any number of other bad things, the parent might not know until it is too late. A robot babysitter can indeed keep them safer than they'd be without it, even if you are right there in the house.

Do you trust a baby monitor? Like, a camera pointed at a baby, that you can monitor with your own eyes, to see that the baby seems to be ok without going to a different room? This is really just an extension of that concept, that adds a bit more automation to it.

But again, the things I described don't exist yet. They will soon, as anyone who understands just how fast AI is getting better, and has an imagination, must realize.

Of course, if the parents don't need to go to work, and all housework is handled by robots, they can spend time with the kids doing enjoyable activities, so there isn't such an immediate need for child caretakers. But still.

0

andydude44 t1_jegqgnr wrote

Jobs built on popularity and relationships. Shareholders will typically want a human who can, at the very least, veto an AI CEO as a failsafe, for example, I imagine. Same with senators and mayors. These are jobs based on the relationship with, and popularity among, the people they represent.

1

cyphersaint t1_jegtcdu wrote

The fact is that people need interaction with people. The physical portions of the care could be done by robotics, but any long term care will need to involve people unless the AI can provide the interactions that happen between people. And that includes physical interaction, which is why I mentioned humaniform robots.

1

robertjbrown t1_jegur4y wrote

Except that the arts, education, and elder care are things they can do very well.

You should spend a good amount of time with ChatGPT (especially the GPT-4 version) before suggesting that physical labor is the main thing where AI and automation are making a difference.

It's been a long time since bulldozers and backhoes replaced 99% of the need for humans with shovels. Now we are at the point where AI can replace most of the work done by lawyers (if not with GPT-4, then with GPT-8 or so).

And sure, you still need someone to control the AI, make the highest-level decisions, and step in for those rare things where a human is needed. Just like you need the person driving the backhoe, and you still often need a person with a shovel to do some of the finer work. (Although... https://www.core77.com/posts/109074/A-Hilariously-Tiny-Mini-Excavator ... now just replace the driver with an AI, and maybe have one person controlling 50 machines, big and small.)

But yeah, while not everything is 100% automatable, an awful lot of things are 99.9% automatable. The ones you mention are actually prime candidates.

1

robertjbrown t1_jegwcuc wrote

>The fact is that people need interaction with people

That is your intuition, and probably most people's intuition. I think it is based on the fact that non-people had not, until November 2022, been able to have an intelligent, natural conversation with a person.

If you don't think ChatGPT is able to "have an intelligent, natural conversation with a person," here in 2023, I'm not going to argue. If you don't think that ChatGPT or some competitor will be able to do that in 2030, I think you lack imagination (and probably simply lack experience exploring what ChatGPT can actually do today).

But even if you are right, that people need to interact with people, that doesn't mean we need humans to prepare their meals, help them go to the bathroom and bathe (I definitely would prefer a robot to a human for that), get them around, make sure they take their medications, etc. If they need human interaction, what's wrong with the robot caretaker helping them get on video chat with their kids and grandkids, or with other elders who have similar needs for interaction?

I could certainly see an elder community where hundreds of residents have one or two paid humans to run everything, with the robots doing all the unpleasant and tedious stuff. Human interaction is handled not by paid staff, but by other residents.

Remember also that, in a society where most jobs can be done by machines, there are a whole lot more family members that have time to interact with their loved ones, rather than paying someone to come in and pretend to enjoy interacting with a very old person.

What specific thing does a caretaker do that must be a human?

1

cyphersaint t1_jeh43zr wrote

I'm mostly saying that human interaction is something everyone needs on a regular basis. You're correct that such interaction will likely come from family, especially in a society where nobody needs to work. Though I suspect that certain diagnoses (assuming they still exist) might be best heard from a person. And, honestly, people will WANT to do such things.

1