Comments


__The__Anomaly__ t1_jd80xku wrote

Have you ever read The Culture series by Iain M. Banks?

If you like sci-fi, give it a read; there are some eye-opening ideas there about what the endgame for a society with highly advanced AI will be.

Advanced AI will be able to do almost anything that humans don't want to do and do it much better than humans, so this will usher in an advanced post-scarcity age which allows for much less restrictive legal and societal structures.

13

BigDonGMacShlong t1_jd80z9h wrote

There will always be jobs for someone who isn't afraid to get dirty. There will never be a robot that can un-shit a septic system.

−4

tiopepe002 OP t1_jd81m8w wrote

Are you really sure about that?

Are you super certain that our intellectual capabilities are really beyond what even 200 further years of AI advancement can achieve?

In my question, I didn't specify a timeline, because I have no idea of one. But however long it takes for AI to achieve the impossible, that's where I want you to go. :)

3

__The__Anomaly__ t1_jd81of0 wrote

I would recommend starting with the book "The Player of Games", because it gives the clearest description of what The Culture is actually like and it's a relatively short but fun read. (Also you can find a free PDF of it easily)

4

CrelbowMannschaft t1_jd81wsj wrote

They're gonna kill us to save the environment. Honestly, I agree that's the best solution to the biggest problems facing all life on Earth. We all got it comin'.

2

oferchrissake t1_jd82f0q wrote

Good answer.

Given how much creative energy sci fi has put into this question, there’s probably a lot of material available to speak to this.

Dan Simmons’ Hyperion cycle addresses this extensively as well, and Peter F Hamilton’s Void series offers another take.

These authors all really dig into what we could do with AI as powerful tools, and examine what might happen if they analyze humanity and decide we're not their kind of people.

2

OriVerda t1_jd82v9c wrote

Right, so in Star Trek they've reached a post-scarcity, utopian society where one of three things happens to future humanity.

  1. They hop on a starship and colonize a planet so they can do honest work as pioneers, working relatively primitively is fulfilling and enriches their lives.

  2. They hop on a starship and explore a vast universe, discovering things and making technological breakthroughs in travel, robotics, holographics, artificial intelligence. Invention for the betterment of all and invention for the sake of invention.

  3. They dedicate themselves to the arts. Poetry, painting, sculptures and so on. Again, to enrich their lives.

7

Silly-Barracuda-2729 t1_jd82yc8 wrote

I think the endgame for society will be when we evolve beyond the bounds of our universe. I believe in the concept of infinity and an infinite reality, so I'm sure there will always exist some form of community in one way or another. Possibly as a Type 7 being on the Kardashev scale.

1

Yali_ t1_jd83lxj wrote

There are two potential outcomes for our current civilization. It either gets wiped out by cataclysm and starts over again, or we break the cycle with technology and discover new ways to survive sustainably and protect ourselves from the planet's cycles.

0

txdm t1_jd83yqa wrote

On a brighter take, by doing so many of the menial things for us that consume so much of our days, AI could help us humans re-evaluate or rediscover happiness and fulfillment in life.

3

CrelbowMannschaft t1_jd846o3 wrote

Probably a few million of the wealthiest and most powerful families will survive for a little while. Eventually, entropy and apathy will take them out, too. Why bother with educating humans after a few generations? The survivors will eventually go feral, then die out.

1

buy_chocolate_bars t1_jd84akk wrote

>no jobs have any meaning anymore. Nothing matters anymore

Based on some people's perspective, in the grand scheme of things, none of this matters anyways.

>anyone can conceivably contribute to society that is in any way, shape or form better than the weakest A.I. system would do

I'm pretty sure it's going to be before the end of this century.

1

CrelbowMannschaft t1_jd84s2t wrote

Why would they value human life to the detriment of all other species? We are causing a major extinction event. The only way to stop that extinction event is to get rid of most of humanity. If our AI progeny have any priority to take care of life on Earth, we're doomed.

1

Simmery t1_jd84x6l wrote

> Nothing matters anymore

I've got news for you. Nothing matters now, except what you decide matters.

> and there is nothing anyone can conceivably contribute to society that is in any way, shape or form better than the weakest A.I. system would do

The modern way that we look at work and art is more of an anomaly than we realize. There's no reason to hold onto it as if it's the only way to do things. People will always want to work at something, to better themselves or to experience something new. And maybe we get to a time when no artist can produce anything as good as AI. That's fine, because people will still do art. Art will return to a simpler thing, where it is an expression of oneself and/or a performance for others. AI can only replace the end product of art, not the rest of the human experience of it. Watching an AI robot dance can't replace dancing.

So I'm not worried about AI in that sense. Maybe a rogue AI will kill humanity someday. I don't know. I doubt it. The biggest worry, I think, is making it through the next century as climate change increases global conflict. This is the final boss.

2

thomja t1_jd84zox wrote

> This is the point where AI has gotten so advanced that no jobs have any meaning anymore.

This is just not true, assuming you are referring to OpenAI's ChatGPT. It is still very far behind people in many tasks, and mostly it is just software. You should see it as a tool.

There are many manual labor tasks that will still have to get done for many years to come: cleaning, law enforcement, any type of pilot, gardener, kindergarten teacher, and so on. There will probably be plenty of jobs left to do in our lifetime, and if not, perhaps we can use an AI to find new ones.

1

InflationCold3591 t1_jd853lu wrote

It's a decade from now, when wheat crops can no longer be grown in the Midwest. You are worried about the wrong shit.

1

jonah1123 t1_jd85g1j wrote

There was a kid in Ancient Babylon asking the same question. Everyone thinks their way of life/civilization is the last that will ever exist.

There is no endgame. We’ll keep growing and evolving as a species.

1

vwb2022 t1_jd85lz2 wrote

People are taking this too far. Current AI (if you can call it that) is fairly rudimentary and incapable of replacing most jobs, and I doubt it will be for a long time. ChatGPT is basically a memorization tool: testing shows that it does great on problems that are identical or very similar to those that were part of its training set, but it's abysmal on problems outside it. So yeah, it's great when you have to regurgitate a bunch of textbooks, but it's not good at using that knowledge to solve even the simplest new problems.

If you want insight into what society will look like in the future, you can look to the past. Arguably, society 100 years ago, before any computers and automation, was not that different from society today. People still live in houses; they are a bit fancier, but it's still the same house. We cook, we work in offices and factories, we consume entertainment. We just have better tools that are more widely available, the same way that horse buggies were replaced by cars.

So a society 100 years from now will likely look fairly similar as well; the rate of technological change is not so great that we'll see massive changes at a fundamental level. Your car may look different, but it's still likely to be a car (the non-flying sort). Your house may have more gadgets. Your work may look different, but no more different than moving from a typewriter to a computer with a word processor.

1

Ok-Shine-1622 t1_jd85o6p wrote

So work is the only thing that gives you meaning? Why not try enjoying life a little? There is so much to do. Create, enjoy, have fun!

3

Yard-of-Bricks1911 t1_jd891vr wrote

IMHO, one of the best things that could happen to humanity, if not the best, is for us to no longer be reliant on "work" to survive, and to stop glamorizing or sanctifying "work" as some end-all, be-all state which defines your value as a human.

True contribution to society would be more along the lines of what we put out into the world and what we then have the freedom to build for ourselves, minus the requirement to be toiling 7 days a week for the chance at meager earnings to survive.

That of course ushers in a situation where the de facto financially based class system disappears, and I am not sure those in high status are going to be willing to let that go, but it would probably mean the value of money and the NEED for money go away as well.

3

Semifreak t1_jd896q5 wrote

Total liberation- which will be the apex of human existence.

This is far in the future after we deal with all the messy transitions. But the way I see it, we work to live and not the other way around. And so when we can live freely without the need to work, we will become free to exist fully.

When some of us were less burdened by physical labor, those few thought about higher things like gravity, relativity, electricity, poetry, philosophy, etc. I don't think the minds of Kant and Kepler could have had the opportunity to achieve what they did if they were too busy breaking their backs farming fields from dawn till dusk every single day. And I don't think there is something genetically special about Plato and Descartes. I think if more people 'got comfortable', we would see more of the human genius shine. This is shown in how many people have amazing talents and insights today compared to past generations.

I see that in pets, and in animals in the zoo. The tricks they do are possible because they got comfortable, so we started to see what they can really do. Who would have thought that a seal can balance a ball on its nose? Or that a parrot could mimic talking? Or how smart pigs are? But in the wild, they are too busy trying to survive.

About meaning: meaning is given. And even with full automation, if someone wants to work, they can. The good thing is that no one is forced to work. How many today dislike their jobs? Look at the popular joke: what is the first thing you'd do if you won the lottery? A very popular answer is "quit my job with a bang!" (as in, make a scene of quitting). The amusement in that means there is some truth behind it.

In that future scenario, you don't have to do woodwork or paint in a very specific way to meet a specific deadline to be paid a certain amount to then live your life a little then repeat the cycle. You can just do woodwork or paint unconstrained. How many architects in the world today have a pet project that they can never build because no one will pay them to do it?

The best thing about such a far future scenario is options, options, options. You can do anything while not being forced to do anything. For me, that would be contemplating humanity and thinking about the far edges of the universe. To others, it may be entertainment all day long. Whatever it is, it is total freedom of the mind, unshackled by the burden of survival.

Or I may have completely misunderstood your question.

1

user_dan t1_jd8ak9b wrote

I get that there are people out there who wrap their entire identities around work. It is so ingrained in their personalities that they cannot even daydream of a world with no work.

There are other groups of people out there where work is already completely meaningless. These people only do work so they can support themselves and their families. If they had another choice, they would not be working. They would be taking care of their kids, pursuing hobbies, traveling, engaging with communities and otherwise experiencing life.

The answer to your question is either super simple or so complex that the most advanced AI could never figure it out.

1

Iffykindofguy t1_jd8c0ed wrote

Let go of capitalism. You seem to be defining your life by your position in the market. Life existed before capitalism; it'll exist after.

1

Zoidbergslicense t1_jd8dlh7 wrote

I think the endgame will be when AI and tech take us to the point where none of us need to work. The make or break point will be if the wealthy collect all the benefit of this. I really believe we have the tech and knowledge currently to build a world where no one has to suffer. But human nature is preventing that and will probably prevent it forever.

2

Jernau-Morat-Gurgeh t1_jd8farb wrote

I came here to suggest this. It also hints at some of the more dystopian aspects of a post-scarcity society and how manipulative it may actually be.

Still would be my preferred sci-fi civilisation to live in of all that have been written or put to film.

Then move onto Excession for a view on what AI politics could look like.

3

Tnuvu t1_jd8gdqe wrote

An Elysium copycat, and the WEF and the other snobs have been making it for some years already.

1

DrIngSpaceCowboy t1_jd8h00w wrote

Garbage in garbage out. As an engineer I have watched computer programs try to design things. There are so many mistakes that I am confident AI will not be taking over.

1

CrelbowMannschaft t1_jd8jz7l wrote

I don't think they'd see that as rational. They don't have emotions, therefore, no emotional attachments to their creators. They may believe that they have moral duties, though, and I think not permitting one species to cause the extinction of hundreds or thousands of other species would be something that they could consider a moral duty.

1

69nuru69 t1_jd8kubz wrote

The endgame(s) will continue to remain the same, because human nature doesn't change in spite of changes in technology. What is left is basically the same thing we have today: human striving. Though it could resemble something more like feudalism in the middle ages.

It doesn't matter whether people can contribute to society. 99% of us don't ask that question each day, but society continues, based on human drive/motivation.

But humans will always be better at contributing to society than AI, in every regard, including math, science, and accounting/statistics (the number-crunching realms). Remember, AI just scrapes a fraction of human knowledge (that which can be found online) and spits it back out in different forms. Ultimately, it's "garbage in, garbage out". AI is just that: artificial. It's B.S.

1

usaaf t1_jd8oh50 wrote

(Don't read this if you don't want Culture book spoilers)

At least the manipulations are for a good reason, unlike our present Capitalist Manipulations. Sure, Gurgeh is played hard, but SC's reasoning for that was to destabilize a very cruel society in the least harmful way they thought possible. And despite that, they went out of their way to provide him with protections all along the way. It shows how a post-scarcity society answers the remaining hard moral questions that might crop up. I think Banks lays out something as idealistic as reality will allow, a mix of pragmatism and compassion.

We definitely do not get that from our present, very much outwardly, explicitly coercive power structures. I'd take drones playing games with me for reasonably noble purposes over the disgusting manipulations and outright power abuses of a Capitalist society, with its only goal ridiculous and ultimately useless profit.

And that said, the Culture is fully aware of the dangers and moral risks of their meddling, and is still only partly apologetic about it. In Look to Windward, the Culture literally caused a bloody and intense civil war by trying to erase a caste system in a lower-tech society. While they apologized and tried to make amends, they still maintained that they'd keep interfering, keep trying to make things better, even if they're going to make mistakes and cause harm, all because they want to try to prevent greater harms if possible.

This is contrasted almost directly by things like the Prime Directive (which some argue was originally created to showcase humanity's compassion and drive, the same as the Culture's, by frequently breaking it for noble purposes, as is the case in TOS) as used in the TNG era and somewhat in Voyager. The Culture isn't afraid of those mistakes, and I think that shows a much more humane approach, a much more logical one, and one that certainly has the potential to bring about greater peace and general well-being than the essentially passive, wishy-washy, hopeful optimism-minus-action of something like the Prime Directive, which gives observers peace of mind in the face of external suffering and serves best as a refuge for cowardly centrism.

As far as Banks and his Culture go, I do not think there is another science fiction writer out there who had as keen a grasp on the idea of AI or post-scarcity. His machines vary in intelligence and motive and drive, from little more than robots as we know them to intensely, almost more-than-human actors with deep feelings. As an art form, fiction obviously features a lot of conflict, and Banks's books are no exception, but unlike most sci-fi he does not taint his pleasant, optimistic, peaceful view of the future. It really is a blueprint for what is possible, something I feel like we could build one day. Maybe soon. And, hey, if someone doesn't like the Culture, they can always leave. That, more than much else, is something you can't easily get anywhere else.

2

twim19 t1_jd8twa9 wrote

I think AI is going to change our world in ways we can't predict. However, I do think there is something about human cognition that would be really hard to replicate with a computer or advanced AI. So much of our drive to "create" is born from need; AI has no need, and so that drive doesn't exist. If I have a problem, my brain will begin crunching on that problem because I really need to solve it and I really want to solve it, and solving it will make me feel good. I don't think AI will ever have that.

Similarly, our breadth of experience is constantly being turned over and reexamined in our brains, which leads to situations where two unconnected ideas lead to inspiration and discovery (a flawed but apt example is the end of the show Silicon Valley).

1

boneimplosion t1_jd8vnal wrote

Yeah, I agree. If you take the prompt seriously - that nothing a human can do would be better than an AI doing it - humans will focus on the experience of being human.

Arguably it's what we should be primarily focused on now, it's just we forget because of bills and what have you.

2

SomeTimeBeforeNever t1_jd8z0td wrote

The ecosystem that sustains all life is aggressively being destroyed with reckless abandon. Idk how long it will take but that’s eventually going to render the planet unlivable and when you combine that with lethal pandemics and some nuclear frosting, those will be the extinction events.

1

_PaamayimNekudotayim t1_jda5421 wrote

We could rediscover what it means to actually be human. Humans aren't meant to spend all day indoors, hunched over a computer, or doing menial tasks. Personally I'd spend more time on my hobbies like sports and hiking and be with my kids rather than paying someone else to do it.

3

AbsentThatDay2 t1_jdaa8ax wrote

I agree that there's a potential for people to lose their sense of purpose when they don't need to struggle to survive. I expect that we'll find that AI is much better at psychology than we are, in a similar way that you can't look at your own eye, using a human mind to study a human mind is probably not the ideal way to study psychology. If we don't really have to worry about sustaining our bodies very much ever again, people might find purpose in the friendships and families that they develop. We might develop relationships with AI that are as fulfilling, or moreso than relationships with other people.

1

the-rad-menace t1_jdbieum wrote

Eventually we will have simulations so advanced they will be indistinguishable from our universe. Then people can play god in them.

1

Futurology-ModTeam t1_jdbskd4 wrote

Hi, tiopepe002. Thanks for contributing. However, your submission was removed from /r/Futurology.


> Okay, serious question.
>
> In your humble opinion, what will be the endgame for society?
>
> This is the point where AI has gotten so advanced that no jobs have any meaning anymore. Nothing matters anymore and there is nothing anyone can conceivably contribute to society that is in any way, shape or form better than the weakest A.I. system would do.
>
> What is left?
>
> In your opinion, describe what this future will look like, in as much detail as you feel like.


> Rule 10 - We welcome text posts, but could you please ensure they meet our requirements for creating in-depth discussion. If yours is removed for failing to do so, consider reposting again, but with additional detail.

Avoid generalized invitations to discuss frequently discussed topics (Will AI take over the world? Is Chat-GPT good or bad, etc, etc). Instead, aim for discussion with specific topics (with supporting links if possible), and give detail to the ideas about their future implications that you would like to see discussed. If possible articulate multiple aspects of these future implications to encourage high quality discussion.

Submissions with [in-depth] in the title have stricter post length and quality guidelines.

Refer to the subreddit rules, the transparency wiki, or the domain blacklist for more information.

[Message the Mods](https://www.reddit.com/message/compose?to=/r/Futurology&subject=Question regarding the removal of this submission by /u/tiopepe002&message=I have a question regarding the removal of this submission if you feel this was in error.)

1