EnomLee t1_je8iery wrote

Yes, terrifying. Nothing terrifies me more than the thought of humanity reaching longevity escape velocity by 2030. I'm so terrified I'm going to have to sleep with the lights on tonight. Somebody hold me, please.

Absolutely trash article with a clickbait title. Baby's first reading of Kurzweil.

13

EnomLee t1_jdx85l8 wrote

We’re going to be stuck watching this debate for a long time to come, but as far as I’m concerned, the question of whether LLMs can truly be called Artificial Intelligence misses the point for most people.

It’s like arguing that a plane isn’t a real bird, a car isn’t a real horse, or a boat isn’t a real fish. Nobody cares, as long as the plane still flies, the car still drives and the boat still sails.

LLMs can complete tasks that were previously solvable only by human intellects, and their capabilities are rapidly improving. For the people now salivating at their potential, or dreading the possibility of being made redundant by them, these large language models are already intelligent enough to matter.

295

EnomLee t1_jdssrek wrote

Then cheer up, your electronics are still working just fine.

You may have just recently learned about solar storms, but they aren't new to me. Neither are supervolcanic eruptions, asteroid impacts, nuclear wars, unaligned artificial superintelligence, or sudden brain aneurysms. All very nasty possibilities that have thus far failed to materialize.

There's no shortage of ways to horribly die, and I wouldn't wish them upon any decent person. I certainly wouldn't hope for one to come and stop the march of societal progress.

For what it's worth, the longer technology can continue to advance without one of these black swan events hitting us, the more capable we will be of mitigating or avoiding the damage.

3

EnomLee t1_jde1e12 wrote

If Ray Kurzweil has his way, the answer is yes. He believes that within the next two decades we will eventually have bodies made of nanotechnology that can shapeshift at the user's command. He calls it Human Body 3.0, and it's definitely one of his hotter takes.

https://en.everybodywiki.com/Predictions_made_by_Ray_Kurzweil#2030s

https://web.archive.org/web/20101127201558/http://positivefuturist.com/archive/106.html

1

EnomLee t1_jcwr7kd wrote

No, not really.

People who really love what they do won't give up on it just because an AGI can do it better. Many will be able to use AGI to accomplish far more than they could ever have reasonably done as a single person.

For example, think of a creative person. Maybe they draw, write, design video games, or make music. The idea that AGI could make their specific skills irrelevant might be depressing, but they could also leverage the AGI's other capabilities to their advantage. Imagine being a 14-year-old kid with your cobbled-together fan fiction story and virtually having an entire animation and game studio at your command.

That's why I think that with the spread of generative AI, you will see a shift away from work that uses only one creative skill set or discipline and towards multidisciplinary media. Imagine a comic artist scanning their pages into a computer and letting the AI turn the panels into a fully animated sequence. Imagine a writer feeding their final draft to the AI and getting a full live-action movie or TV series. An artist designing character turnaround sheets and letting the AI turn them into 3D models in a game world.

You know, as I write this, I can't help but feel a sense of irony. Before generative AI became a real story, the common wisdom was that creative jobs would be the last to be trampled by technology, and watching DALL-E, Midjourney and Stable Diffusion evolve appeared to completely destroy that theory. However, I'm starting to believe that the creative people who can roll with the punches will still outlast everybody else who works only to fulfill some rote function for society, or just to get paid.

A trucker or cab driver doesn't have many other options if the trucks and cars drive themselves. An office worker won't be missed much if AI can get all the paperwork done by itself. An electrician or a plumber won't have much of a plan B if general-purpose robots can perform their jobs. Plenty of workhorse support artists will feel the pinch too. But the artists who have a real creative vision and finally have the tools to act on it? They'll be out there, creating entire virtual worlds. Maybe that's how the true metaverse will be born.

Well, anyway...

If the worst fate that AI inflicts upon us is challenging our personal pride, then we should celebrate. That would mean that alignment has been solved.

2

EnomLee t1_japu1z5 wrote

Good call. If advancement in robotics outpaces artificial intelligence, remote operation by humans would be the best way to close the gap.

The downside would be that professions that were previously safe within their local cities or states would now have to compete at the national, or maybe even international, level. Why hire your local plumber or electrician when you can have the best plumbers and electricians in the country instead?

It would be a lot like watching local newspapers decline while the big national papers survive. Or watching video rental chains get replaced by online streaming. Or watching mom & pop stores driven out of business by big box stores and online shopping.

It's a big game of musical chairs, and every new innovation takes another seat out of the game.

6

EnomLee t1_japs2yw wrote

Funny, but ultimately futile. Much like in a zombie apocalypse, the real threat wouldn't lie with just one hostile robot. It would be an entire army of them, moving without a hint of hesitation or self-preservation to carry out one goal: to subdue or execute you.

Human life is cheap, but artificial life will be cheaper. Whether it's a relatively dumb group of robots that lack the full range of movement of a person, a team of robots piloted by people from a remote location, or fully autonomous, artificially intelligent units with combat training and dexterity surpassing the best-trained human soldier, the ultimate outcome is the same.

They can throw bodies at you, overwhelm you with sheer numbers until you run out of bullets or make one wrong move. They can apply combat tactics knowing that if you take one down, they can quickly replace the unit with another of the same skill level. One way or the other, they will wear you down until you can no longer fight.

Needless to say, the potential applications for crime, terrorism and authoritarianism are dire.

3

EnomLee t1_japb2bs wrote

It's very exciting to see so many companies take their shot at designing a general-purpose robot. Every new competitor raises the chance that we'll soon see real results instead of more pretty CGI and empty promises. Whoever succeeds stands to make billions.

To think that we may have real AGI and general-purpose robots in just a decade...

93

EnomLee t1_jae7frz wrote

And now you're swinging in the dark. Let's refocus.

You dismissed the OP as the propagator of another shitty, random blog. I am telling you that Yuli-Ban is one of the better posters on this subreddit and that it would be much poorer without them. That's all I wanted to say. Take it as you will.

3

EnomLee t1_jae46bl wrote

And on that scale of value, people who choose to put in effort will always be worth more than people who do not. Overthrowing capitalism would not change that, and if your only reason to do so is to stop people from calling you out on your short attention span, then you're every bit the caricature that conservatives flog when they argue against leftist social programs and reforms.

4

EnomLee t1_jae36fd wrote

"While much of this will likely remain as part of purely individualized fantasy worlds never meant to be shared, it is foolish to claim that nothing will be shared and that everyone will retreat into their own fantasy worlds."

A post-full-dive society would likely look a lot like the internet. Yes, people would have the option to retreat into their own fully personalized virtual space, just as somebody can choose to play a single-player video game. Many other people would opt to exist with others in like-minded communities. For every demographic, interest, opinion and subculture, a virtual world.

15

EnomLee t1_jadzhqg wrote

Yuli-Ban actually puts effort into what they want to say, which easily makes them worth more than a thousand of the post-ChatGPT users here who only have their shower thoughts and Terminator gifs to offer.

11

EnomLee t1_j8t9pav wrote

The worst thing about it is how the doomers always show up with a chip on their shoulder. "I know I'm going to be downvoted because dissent isn't allowed here." It's like, just come down off your crosses already.

They bleat the same ice-cold takes you can get on Futurology and Collapse, and act victimized when not everybody claps for them. "Only the rich will benefit! We're all going to die! AGI will never happen in a thousand years! If you disagree you're a cultist!"

Lemonade from lemons: this wouldn't be happening if people weren't becoming convinced that it's time to take the subject seriously.

The best thing you can do is recognize the posters that you like and start following them instead of the sub.

2

EnomLee t1_j56u79p wrote

I'm not really seeing it, sorry.

The OP asked us to imagine what an aligned, superintelligent artificial intelligence could give us in FDVR that it couldn't provide in the real world. In a world where such an entity exists, there wouldn't be any space for bad actors to operate anymore. Any attempt to interfere with other people's experiences would just be intercepted before it could ever become a threat.

Now if you want to imagine a scenario in which FDVR is somehow achieved before ASI, then sure. Cybercriminals, terrorists and bad actors could be a problem for BCIs, just like they're a problem for smartphones and PCs today. People who go to sketchy websites and thoughtlessly download random .exes put themselves at risk of viruses, but most people remain relatively safe.

Governments may try to pass restrictive laws, but thus far the western world has been pretty permissive towards questionable content in virtual worlds, and everybody else can just pirate the content they can't legally get. Also, I think the political class would have a motive to allow FDVR to flourish unchallenged. People happily living in their own heads would have less motivation to vote against the status quo.

The idea of people invading other people's virtual worlds just for the sake of it sounds cartoonishly evil. It's catastrophizing. People who want violence against other people will join virtual worlds designed for that express purpose, just as people play competitive multiplayer games today. When they're tired of it, they'll return to their own private worlds, where the only real people they'll ever see are the ones they've chosen to give access, if any.

1

EnomLee t1_j545x5a wrote

Perhaps it wouldn't. Or it would just trade one problem for another. Instead of learning to live with our differences, everyone just silos themselves off into their own virtual spaces where they only know freedom, security and validation and never any real pain or misery. Humanity splinters apart into countless subgroups, each defined by cultures so different from each other that they become incomprehensible and alien to those outside of them.

It's fair to see that as a dystopian scenario, but compared to the continuation of business as usual, I think it could be the lesser evil. What is better? Continuing to let different ideologies struggle against each other, knowing that it will cause political violence? Or is it more civilized to give people their own virtual safe space, knowing that it isn't real?

That said, a theoretical aligned superintelligence could do a lot to lower the temperature on societal tensions without resorting to FDVR. Eliminating poverty would remove many if not most of the factors that drive people to extreme positions in the first place. Space colonization could give people more choices about where to live and which ideologies they want to align with.

As great as that would be, you would still have different people with different opinions living together. As long as that is the case, there is the possibility of disagreement, and if there is a disagreement that is important enough, it will have to be solved by either compromise or conflict.

2

EnomLee t1_j52hage wrote

There's nothing wrong with acting to make the real world a better, fairer and more just place for ourselves, collectively. Raise the standard of living, make education available to more people, lower the crime rate, shutter environmentally hazardous technology for the sake of cleaner, more renewable alternatives. Despite what pessimists may say, progress is being made on all of those fronts. There are good reasons to believe that, when you remove yourself from the daily news and current events, humanity and the Earth are going in the right direction.

But the real world will never be a paradise. It will never be a paradise because the concept of paradise is an inherently personal opinion, and those opinions are too different to avoid causing conflict between different people.

Tell me, what paradise do you intend to build that would satisfy both multiculturalists and ethnic supremacists? What paradise is there that would appeal to both the atheist and the theocrat? What about Marxists and Randians? What about the anarchists and the authoritarians? Environmentalists and industrialists?

In other words, you will never have your perfect world as long as you are made to share it with other, incompatible people. This is why I believe that individualized, fully immersive virtual reality is the best, most viable path to avoid conflict and give everybody the existence that they want.

What is it that your heart desires? Do you want to be a superhero? A super spy? A king? An intergalactic bounty hunter? Do you want to live in an anime world? A cyberpunk city? The old west? Tatooine? Hogwarts? Maybe you want to be more attractive. Maybe you want to be the opposite gender. Maybe you want to be your fursona. Or a dragon, or a demon, or an angel, or a god, or a devil. Maybe you just want a world that's exactly like the real one, except without the people who make it a miserable place to live.

So what if it isn't real? So what if you're surrounded by philosophical zombies? That just means they can never judge you, or betray you, or attack you, or mock you, or bully you, or reject you. They will never hurt you without your consent... and it is that peace of mind, born of absolute security and personal validation, that will create paradise.

5

EnomLee t1_j4x6gr7 wrote

No. That's just the lie that people like you tell yourselves so you can pretend you're doing something righteous by targeting a powerless out-group. Belittling and bullying people for not being as boringly "normal" as you are would immediately show everyone exactly the kind of person you are, and you can't have that! But nobody will judge you if you can make your victims into the villains.

I bet you actually believe Republicans when they say that trans and gay people are groomers, don't you? So foolish. So, so gullible.

1