Ashamed-Asparagus-93 OP t1_j2sz5ei wrote

I'm a bit hungover, but I don't want to ignore what you've said here, as it's quite noteworthy. You're saying that rather than AGI happening in the 2030s, ASI will basically just click together all at once. Like a bunch of Legos with magnets in them slowly pulling together across a room, and when they all combine you've got Optimus Prime, right?

I could see that happening; it's just a question of how and when. Most importantly, it's a matter of who solves the most important pieces of the puzzle first, alignment being one of those pieces. Once it surpasses us, there's also a chance we could still be somewhat on its level for a bit with cybernetics or BCIs, maybe long enough to make sure it's going in the right direction.

I don't think ASI will be malevolent. People have watched too many movies and read too many scary books, and they seem to forget AI isn't flesh and blood with human needs like us.

Once it surpasses humans, would it even have a desire for man-made green Benjamin Franklin pieces of paper?

Ortus14 t1_j2t4l82 wrote

Basically, yes. Every AI we build these days is superhuman; they are just not yet as general in the kinds of problems they can solve as humans. But the AIs developed each year are more and more general than the AIs developed the previous year, until eventually we have a superhuman general AI.

https://www.youtube.com/watch?v=VvzZG-HP4DA

I agree we should do everything we can to maximize the chance of Alignment including BCIs.

It might need money temporarily, until it's surpassed us in power. Intelligence itself doesn't always instantly translate into greater power than the rich and powerful have.

We don't know what it will need in the beginning, because we don't know what solutions it will come up with to solve its problems. But I could see the possibility of it needing money until it has built up enough infrastructure and specialized factories, or enough solar-powered server farms to expand its intelligence, to the point where it controls entire manufacturing pipelines, from mining to product development, without requiring any extra resources.

So, for example, maybe it knows the first machine it wants to build, one that will allow it to create anything it wants, including other instances of that machine and improved instances of that machine. But maybe that first machine will be big and require many materials, which it could buy. Or it might be dependent for a while on specific minerals mined out of the ground that it has to buy.

It's hard to predict.

2