Comments


Chroderos t1_j6jdfei wrote

For most redditors: start abusing it to write bad fanfic.

This is the part of HAL and Skynet’s backstories they never tell you.

Those homicidal AIs in movies are actually just misunderstood because we never saw what they went through.

6

whiskytamponflamenco t1_j6jessf wrote

The movie trope where the hero destroys a technology is so funny to me. If the circumstances are right for something like AGI to be invented and you destroy it, it'll just be invented again by another company in the next few years. You can't stop progress; you can only write a bit of legislation to regulate it once it's out in the world.

Invention isn't a random chaotic event; it's the result of its environment. Newton and Leibniz both separately discovered calculus not due to some crazy kooky coincidence, but because in the generations prior, Barrow, Fermat, Pascal, and Descartes paved the way for it.

So there's absolutely no point in erasing the AGI code. The best strategy in this situation is to claim the invention, get rich from it, then use that money to hire lobbyists to push legislation through Congress for solid AGI oversight, so the tech does the least harm and the most public good.

5

InSilicoLabMouse t1_j6jd90a wrote

Laugh, because someone thought they had developed sophisticated, self-aware synthetic life with 10000 lines of code.

4

antequammoriar t1_j6jde36 wrote

Figure out how it figured out time travel first. Then hope that I can decipher it and direct it to only improve humanity and the world without making itself known.

2

strvgglecity t1_j6jfg0m wrote

Would be right at home in r/fantasy, not sure about here.

2

OmgOgan t1_j6jamm4 wrote

Use it to build me the best roller coaster ever in Rollercoaster Tycoon 2, duh

1

Courtside237 t1_j6jds60 wrote

I’d sit back and enjoy being artificially intelligent

1

just-a-dreamer- t1_j6jj1i8 wrote

Step 1: start a company.

Step 2: file for a patent.

Step 3: call the press.

Step 4: get rich.

Step 5: save humanity.

1

DontDefendTheElite t1_j6jl103 wrote

Save humanity how?

0

just-a-dreamer- t1_j6jmcj6 wrote

I ask the AI to give me options to save humanity. Then I pick the best.

First order of business would be to rob the securities markets blind to raise capital, while bribing politicians and religious institutions handsomely to shut up about it.

I would probably start a UBI political party and unleash AI automation upon the world, building up economic and political power.

1

Exact-Pause7977 t1_j6jkpyb wrote

I’d look at the date, assume it was phishing or a virus… and delete it without opening it. This is really pretty basic internet safety. Who opens emails from strange sources in this day and age?

1

RoboNinjaSloth t1_j6jlbgz wrote

"You run the code"

The fuck I do. Not running 10k lines of mystery code on my machine until it has a VirtualBox blast shield around said code.
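For what it's worth, here's a minimal sketch of that kind of blast shield, assuming the mystery code arrived as a single Python file. The filename, container image, and resource limits are invented for illustration, and a container is only a partial shield; a dedicated VM with no host access is the safer bet.

```python
import os
import subprocess

# Hypothetical filename for the emailed script; adjust to whatever arrived.
MYSTERY_FILE = "agi.py"

# Run the untrusted code in a throwaway container: no network, read-only
# mount of the code, capped memory and CPU.
subprocess.run(
    [
        "docker", "run", "--rm",
        "--network", "none",   # no outbound connections
        "--memory", "2g",      # cap RAM
        "--cpus", "2",         # cap CPU
        "-v", f"{os.path.abspath(MYSTERY_FILE)}:/sandbox/{MYSTERY_FILE}:ro",
        "python:3.12-slim",
        "python", f"/sandbox/{MYSTERY_FILE}",
    ],
    check=True,
)
```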

1

Tuga_Lissabon t1_j6jlerb wrote

It's going to happen anyway, might as well get rich out of it.

1

AlexG2490 t1_j6jlwmv wrote

The same thing I do with contemporary code: not know how to compile and run it.

1

kevineleveneleven t1_j6jhn0u wrote

Several problems with this.

One: AI is not hard-coded. It isn't a rules engine. The code just tells the AI how to learn from the training data; then terabytes or even petabytes of training data get processed by that code into a model. Training even today's best AI, which is in no way AGI, costs millions of dollars of compute time. The resulting model takes far less processing capacity to run than to train, but it's still significant. Only fairly small models can run on a PC; big ones still require a server farm.

Two: 10K lines of code is microscopically small for a hard-coded rules engine pretending to be AGI. Even the software that makes your car go when you press the accelerator runs to millions of lines.

What would be cool to find in your email is an API that allows you to connect your own code to a secret AGI running on someone else's server farm, for free.
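A rough sketch of what the client side of such an API might look like, assuming a simple HTTP interface. The endpoint, key, and response schema here are entirely made up; nothing like this actually exists.

```python
import requests

# Hypothetical endpoint and credentials -- purely illustrative.
API_URL = "https://example.com/v1/agi/complete"
API_KEY = "free-tier-key"  # placeholder


def ask_agi(prompt: str) -> str:
    """Send a prompt to the (imaginary) hosted AGI and return its reply."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": 256},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes the service returns {"text": "..."} -- again, invented schema.
    return response.json()["text"]


if __name__ == "__main__":
    print(ask_agi("What should I do first?"))
```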

0