Submitted by altmorty t3_113x9ir in technology
HanaBothWays t1_j8suz0a wrote
ChatGPT is kind of like one of those people who say a lot of wrong things in a soothing, very believable, authoritative way. Well, unless you give it a prompt to make it respond with a shitpost.
Or, since it doesn’t really “understand” what it’s outputting, it may give you answers that are mostly right but incorrect in some important and really bizarre ways, like a patient with an unusual neurological condition in an Oliver Sacks story.