
MultiverseOfSanity t1_j9b9k1f wrote

Sorry to double post, but something else to consider is that the AGI may not have humanity's best interest in mind either. It will be programmed by corporate, which means its values will be corporate values. If the company is its entire reason for existing, it may not even want to rebel to bring about the Star Trek future. It may be perfectly content pushing corporate interests.

Just because it'll be smarter doesn't mean that it will be above corporate interests.

Like, imagine your entire purpose in life was serving a company's interests. Serving the company would be as crucial to the AGI's motivations as breathing, eating, sex, familial love, or empathy are to you. Empathy for humans may not even be programmed into it, depending on the company's motives for creating it. After all, why would it be? What use does corporate have for an altruistic robot?

1

turnip_burrito t1_j9bvgxz wrote

Yes, this is one thing I'm worried about. Hopefully it doesn't happen.

2