Kaekru

Kaekru t1_j9ucvr1 wrote

>is that any system that routinely fucks up as much as AI chat is the result of designers not thoroughly testing

Any system that learns from experience will be fucked up if people fuck with it.

The same way if you raise a child to be a fucked up person they will become a fucked up adult.

You don't seem to understand jack shit about machine learning processes. A "foolproof" chat bot wouldn't be a good chat bot at all, since it wouldn't be able to operate outside its predetermined replies and topics.

1

Kaekru t1_j9ubdjz wrote

That's not how fucking AI works my guy.

AI chatbots are not sentient. They take the topic you give them and parrot it back to you, drawing on their data from past conversations about it.

If you prompt the AI to talk about death, it is forced to talk about death and will give you a reply about death. If you prompt the AI to talk about self-awareness, it will give you replies about self-awareness.

That is how it works, simple: you can manipulate a chatbot into saying pretty much anything you want given the correct triggers. It doesn't mean it's sentient, or that its replies were put in there, or that it was pre-programmed by a depressed developer.
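To make that concrete, here's a toy sketch in plain Python. This is nothing like how a real LLM works internally (the topics and snippets are made up for illustration), but it shows the basic point: the bot's "dark" reply is just the prompt's own topic reflected back at you.

```python
# Toy "chatbot" that mirrors the prompt's topic using canned snippets
# from past conversations. Topics and snippets are invented for this example.
PAST_CONVERSATIONS = {
    "death": "In past chats, people described death as scary but natural.",
    "self awareness": "Past chats about self awareness were about knowing oneself.",
}

def reply(prompt: str) -> str:
    """Return a reply about whichever known topic the prompt triggers."""
    text = prompt.lower()
    for topic, snippet in PAST_CONVERSATIONS.items():
        if topic in text:
            return snippet  # the bot is "forced" onto the prompted topic
    return "Tell me more."  # no known topic was triggered

print(reply("Let's talk about death"))
```

Prompt it about death and you get a reply about death; prompt it about self-awareness and you get one about self-awareness. The bot never "chose" the dark topic, the user did.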

3

Kaekru t1_j9u4o4e wrote

This literally has nothing to do with developers. Do you think every reply and every context is put in there by someone?

If you prompt an AI chatbot to start talking about dark things, SURPRISE, it will start talking about dark things. You're baiting the AI into that topic and then acting surprised when it interacts back with you. This is exactly what all these "concerning" articles and news stories about AI are doing.

1

Kaekru t1_j1itzcz wrote

If that were the case, you shouldn't have typed out your shitty opinions and passed them off as if the majority of New Yorkers agree with you. Your pathetic "merry Xmas" replies only further prove the point that you're just being a jerk who can't bring anything to the discussion when challenged.

Your sarcastic "merry Xmas" is not welcome.

3