
malcolmrey t1_j7qx1b8 wrote

> Humanity as a whole is -unfortunately - not morally, ethically, or intellectually mature enough to handle an oracle that can answer almost every question

what do you mean by that?

are you worried that someone might ask something, get a wrong response, and get hurt because they blindly apply the wrong solution?

2

OllaniusPius t1_j7rxrhs wrote

It's possible, especially if companies start marketing it as a replacement for search engines. We've all seen how these systems can get things factually wrong. Hell, Google's first demo contained a factual error. So if they are presented as a place to get factual information, and people start asking medical questions and getting wrong answers, that could cause real harm.

1

Unfocusedbrain t1_j7qys9x wrote

That's true enough. Considering people have died following GPS directions, of all things, yeah, it's a non-negligible issue.

The more concerning issue is bad-faith actors and malicious agents. There are already examples of people using other AI software maliciously. Too many to list.

For ChatGPT, there is the example of cybersecurity researchers using it to make malware even with its filters in place. They were acting in good faith too, but that also means people with less academic motives could use it for similar, malicious ends.

−1

[deleted] t1_j7sdjau wrote

[deleted]

1

Unfocusedbrain t1_j7ssxdk wrote

True enough that malware is possible without ChatGPT, my snarky commenter. I'm more concerned with script kiddies being able to mass-produce polymorphic malware, which makes mitigation cumbersome while requiring very little effort or investment from the creator.

Hackers have the advantage of anonymity, so it becomes incredibly difficult to stop them proactively. This just makes it worse.

But that wasn't my point, my bad-faith chum, and you know that very well. I mean, your posting history makes it really clear you have a vested interest in ChatGPT being as unfettered as possible. So I don't think you and I can have a neutral discussion about this in the first place. Nor would you want one.

1