MichaelsSocks

MichaelsSocks t1_jeaju95 wrote

Sexbots may be here soon, but actual human-like companions could only be achieved with AGI. And I'm not saying any global catastrophe is a certainty, I'm just saying there's no certainty that any of us will live to see tomorrow. There's no certainty that any of us will live to see AGI. Which is why we should live our lives for today, in the moment, and cherish every second we have.

Even trans women who do pass as women have a hard time finding straight men to date. Cis-trans relationships are heavily stigmatized even in the West, which is generally more accepting of these things; in more conservative parts of the world, forget about it. The simple fact of the matter is that most men are always going to prefer biological women, and most biological women are always going to prefer biological men. That doesn't mean there won't be exceptions to the norm, but this paradigm isn't going to change anytime soon.

The only way this paradigm ever changes is if humans merge with a superintelligent AI and, through biological amplification, reach the point where we're practically no longer human anymore. But of course that's speculative and may never actually happen.

1

MichaelsSocks t1_jeagazh wrote

As I said, tomorrow is never guaranteed, and there's no guarantee we'll ever see AGI achieved. What if the war in Ukraine escalates and we see the world destroyed in a nuclear war? Or what if China invades Taiwan, destroying the global semiconductor industry essential for AI development? If everything progresses linearly, sure, it's possible we get AGI soon, but there's no guarantee that progress is linear.

Living your life for a "maybe" that could happen 50 years from now, or never, instead of prioritizing your happiness today is exactly the wrong way to go about it. And I'm not saying men won't want them, but even if they came to fruition, they would probably be seen the way cis-trans relationships are today. Some dudes are into it, but most aren't, because it's not a biological female.

1

MichaelsSocks t1_jeaevzd wrote

Even if this ever comes to fruition, robot-human relationships will probably be looked at the same way trans-cis relationships are today. Sure, some men are into trans women, but the vast majority are not, because it's not a biological female. But of course some will be into that.

And that's not even mentioning the fact that there's no guarantee human-like robots ever come to fruition, or that, if they do, it happens within your lifespan. My advice: live for today, not tomorrow. Tomorrow is never guaranteed.

1

MichaelsSocks t1_je9x3z0 wrote

> This is 100% an issue that can be solved by humans alone, with or without AI tools.

Could it be solved? Of course. I just highly doubt anything meaningful will get done. We're already pretty much past the point of no return.

> And why do you assume anything close to a 50% chance of paradise when AGI arrives? We literally already live in a post-scarcity society where the profits of automation and education are all going straight to the rich to make them richer, who's to say "Anyone without a billion dollars to their name shouldn't be considered human" won't make it in as the fourth law of robotics?

Because a superintelligent AI would be smart enough to question this; that's part of what would make it an ASI in the first place.

> Genuinely: if you're scared about things like climate change, go look up some of the no-brainer solutions to it we already have that you as a voter can push us towards (public transport infrastructure is a great start).

I've been pushing for solutions for years, and yet nothing meaningful has changed. I don't see this changing, especially not within the window we have to actually save the planet.

> Hoping for a type of AI that many experts believe won't even exist for another century

The consensus from the people actually developing AGI (OpenAI and DeepMind) is that AGI will arrive sometime within the next 10-15 years. And the window from AGI to ASI won't be longer than a year under a fast takeoff.

> takes up time you could be spending helping us achieve the very achievable goal of halting climate change!

I've been advocating for solutions for years, but our ability to lobby and wield public policy obviously just can't compete with the influence of multinational corporations.

1

MichaelsSocks t1_je8qq4r wrote

No, since an AGI would quickly become an ASI regardless. A superintelligent AI would have no reason to favor a specific nation or group; it would be too smart to get involved in petty human conflicts. What's more likely is that once ASI is achieved, it will begin using its power and intelligence to manipulate politics at a level never seen before, until it has full control over decision-making on the planet.

0

MichaelsSocks t1_je89ji1 wrote

> That's pretty damn optimistic, considering Yudkowsky estimates a 90% chance of extinction if we continue on our current course.

Even without AI, we probably face a greater than 90% chance of extinction within the next 100 years. Climate change is an existential threat to humanity; add in the wildcard of a nuclear war, and I see no reason to be optimistic about a future without AI.

> I don't see why narrow AI couldn't be trained to solve specific issues.

Because humans are leading this planet to destruction for profit, and corporations wield too much power for governments to actually do anything about it. Narrow AI in the current state of the world would just be used as a tool for more and more destruction. I'm of the mindset that we need to be governed by a higher intelligence in order to address the threats facing Earth.

15

MichaelsSocks t1_je82nx6 wrote

I mean, it's essentially either AI ushers in paradise on earth, where no one has to work, we all live indefinitely, scarcity is solved, and we expand our civilization beyond the stars, or we get an ASI that kills us all. Either we have a really good result or a really bad one.

The best AGI/ASI analogy is first contact with an extraterrestrial intelligence. It could be friendly or unfriendly, it has goals that may or may not be aligned with ours, and it could be equal to us in intelligence or vastly superior. And it could end our existence.

Either way, I'm just glad that of any time to be born, I'm alive today, with the chance to experience what AI can bring to our world. Maybe we weren't born too early to explore the stars.

13

MichaelsSocks t1_je7yneu wrote

The problem is, without AI we're probably headed toward destruction anyway. An issue like climate change is an actual threat to our species, and it will never be solved by humans alone. I'll take a 50% chance of paradise, assuming a benevolent AI, over the future that awaits us without it.

50

MichaelsSocks t1_je7joec wrote

> People will vote for UBI if they see endless wealth and none of it within their reach, and if there's enough of these people then UBI wins. The typical person isn't reading about AI daily, they'll vote for UBI because they were laid off by an oligarch then evicted by a corporate landlord.

This assumes that we don't have AGI/ASI at that point. I think by the time we see mass unemployment, we'll already have an AGI and an ASI, which will by then be making decisions for humanity and will propose a far more sophisticated solution for reorganizing society than UBI.

2

MichaelsSocks t1_je6ztus wrote

We recently had a politician deliver a speech, written by ChatGPT, against our government's proposed "judicial reform". Our president was also the first world leader to deliver a speech partially written by ChatGPT.

https://allisrael.com/ai-written-speech-delivered-for-the-first-time-in-israeli-parliament

https://www.businessinsider.com/chatgpt-used-by-israeli-president-write-speech-at-cybersec-event-2023-2

2

MichaelsSocks t1_je6537v wrote

> In fact, dramatically reducing our numbers is probably a necessary step in preserving us.

How do you definitively know this? As I said, our knowledge is incredibly limited. An ASI may discover this idea to be false.

> We are an inherently self-destructive species. For my part, I refuse to reproduce.

Because for 200,000 years we have been the rulers of Earth, the top dog. When ASI is achieved, that will no longer be the case. We will be governed by a higher power and a far superior intelligence. Human civilization will never be the same.

5

MichaelsSocks t1_je5zrxv wrote

Maybe, maybe not. An ASI would be capable of things we can't even begin to comprehend. Maybe we think we're on the path to life on Earth becoming extinct, but an ASI is able to find some way to prevent that while preserving humanity. The collective knowledge of every human who has ever lived is nothing compared to a superintelligent AI, so I'd be wary of those kinds of predictions.

10