
currentscurrents t1_j8zz4n3 wrote

Look at things like replika.ai that give you a "friend" to chat with. Now imagine someone evil using that to run a romance scam.

Sure, the success rate is low, but it can go after millions of potential victims at once. The cost of operation is almost zero compared to human-run scams.

On the other hand, it also gives us better tools to protect against it. We can use LLMs to examine incoming messages and flag likely scams (rough sketch below).

And people who are lonely enough to fall for a romance scam may instead satisfy that loneliness by chatting with friendly or sexy chatbots, making them less likely to latch onto a scammer in the first place.
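Here's a rough sketch of what I mean by the defensive side, using Hugging Face's zero-shot classification pipeline. The model choice, labels, and threshold are just my own assumptions for illustration, not a production scam filter:

```python
# Minimal sketch: score a message for "romance scam" intent with a
# zero-shot classifier. Labels and the 0.7 cutoff are illustrative.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

message = (
    "I know we only met online last week, but I feel so close to you. "
    "My wallet was stolen abroad -- could you wire me $500 for a flight home?"
)

labels = ["romance scam", "ordinary personal message"]
result = classifier(message, candidate_labels=labels)

# result["labels"] is sorted by score, highest first
top_label, top_score = result["labels"][0], result["scores"][0]
if top_label == "romance scam" and top_score > 0.7:
    print(f"Flag for review ({top_score:.2f} confidence)")
else:
    print("Looks ordinary")
```

In practice you'd want something tuned on real scam transcripts, but even an off-the-shelf classifier like this catches the obvious "wire me money" patterns.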

6

ilovethrills t1_j90noyx wrote

But that can be said on paper about thousands of things, and I'm not sure it actually translates into real life. There might be some push to label such content as AI-generated, though, similar to how "Ad" and "Promoted" labels are shown in results.

−1