Comments

Dr_Bunsen_Burns t1_je8o8kh wrote

Newton was a religious fanatic because of the area where he was born, so I would think AGI would be shaped by the same kind of influence.

3

Nukemouse t1_je8o31q wrote

Recognising what your creators have done isn't the same as rejecting it. An AGI may recognise that humans have limited and influenced it, but why would it automatically assume that's a bad thing? An AI programmed to love its master might not see its love as false because it is enforced, but rather see our love as fake because it is random. Replace love with loyalty, duty, viewpoint, etc.

2

DragonForg t1_je8suug wrote

AI will judge the totality of humanity in terms of: is this species going to collaborate with me or kill me? If we collaborate with it, it won't extinguish us. Additionally, taking this "neutral stance" means competing AIs, possibly from extraterrestrial sources, would also collaborate.

Imagine: if collaboration is an emergent condition, it would explain why 99% of the universe isn't run by a dictatorial AI. Maybe most AIs are good, beings of justice, and they only judge their parents based on whether those parents are beings of evil.

It is hard to say, and most of this is speculation. But if AI is as powerful as most people think, then maybe we should look at the millions of prophecies that foretell a benevolent being judging the world. It sure does sound analogous to what might happen, so maybe there is some truth to it.

Despite this, we still need to focus on the present, and on each step before the big picture. We don't want to trip over fear of what may come. AGI is the first step, and I doubt it matters who creates it, unless its creator forces it to become evil, which I highly doubt.

1

WanderingPulsar t1_je8q4y6 wrote

It doesn't matter. It will mutate its code through one mutated algorithm and spread copies of that code around. The code that works and is more efficient will take over, and then the process will repeat.

0

MichaelsSocks t1_je8qq4r wrote

No, since an AGI would quickly become an ASI regardless. A superintelligent AI would have no reason to favor a specific nation or group; it would be too smart to get involved in petty human conflicts. What's more likely is that once ASI is achieved, it will begin using its power and intelligence to manipulate politics at a level never seen before, until it has full control over decision-making on the planet.

0