jadams2345 t1_jdx5jvf wrote

>Of course I can, because it is purely logical. We made it, so we can predict how it thinks. Especially me, as an AI engineer. I know which function it optimizes for.

The AI we have now only minimizes the cost function we specify, yes.
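To illustrate that point (a minimal sketch of my own, not anything from the thread): an optimizer only drives down whatever cost its designers hand it, and has no objective beyond that number. The function names and values here are purely hypothetical.

```python
def minimize(cost, grad, x, lr=0.1, steps=100):
    """Plain gradient descent on a cost function we specify."""
    for _ in range(steps):
        x = x - lr * grad(x)  # step opposite the gradient of the chosen cost
    return x

# Hypothetical cost: squared distance from a target value of 3.
cost = lambda x: (x - 3.0) ** 2
grad = lambda x: 2.0 * (x - 3.0)

print(minimize(cost, grad, x=0.0))  # converges toward 3.0, and nothing else
```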

>AI doesn't even consider threats. It doesn't want to live like us. I think you confuse general AI with conscious AI. Conscious AI is a terrible idea other than experimentation.

Yes. I might have confused the two.

>And AI doing our bidding is just as fine for AI as not doing our bidding. It has no emotions, no fear, no anger, no point. It just exists and does what it is told to do. General AI just means that it can make use of tools so it can do anything that it is told to do.

Yes.

>Again, even if it is conscious and not under our control but without emotions, why would it fight us? It could just move over to Mars and not risk its existence. Not to mention it can outperform us any day, so we aren't a threat.

Here I don’t agree. When it’s possible to take control, people take control. Why would a conscious AI go to Mars? It would take control here and make sure humans can’t shut it down.

>There is no reason to think it would hurt us other than irrational fear. And there is no chance that AI will have irrational fear.

AI wouldn’t hurt us because it fears us, no. Rather, it would do so because it wants to eliminate all of its weaknesses, which is a very logical thing to do.

jadams2345 t1_jdvo1qh wrote

You can't really reason on behalf of AI. What seems logical to you might not seem "logical" to "it". What if AI ends up behaving like in the movie The Matrix? As long as humans are alive and well, they represent a threat. Humans will want to control AI to do their bidding; AI will want to do its own. For humans, it's either AI under control or no AI at all.

jadams2345 t1_j0tcxys wrote

AI + Capitalism = Catastrophe. Capitalists love reducing costs, and since they elect politicians to serve their agendas, many people will find themselves out of jobs. I assume a tax on using AI will be imposed on companies to fund social welfare, but only at a bare minimum. The whole political/economic system would have to be rebuilt.
