Gabelschlecker t1_jdcmlkf wrote

Yes, because they were never developed to give factual information. Even a glance at how these models actually work makes it obvious that they have no internal knowledge base. They have no clue whatsoever what is factually correct and what is not.

Their job is producing realistic language. That's what their architecture is supposed to achieve, and they do it quite well when trained on large datasets. That they sometimes produce real facts is a mere side effect.
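To make that concrete, here's a minimal sketch in Python (using the small GPT-2 model via Hugging Face's transformers as a stand-in, with an arbitrary prompt for illustration). All the model does is score candidate next tokens by how plausible they are as a continuation of the text; nothing in the computation consults a fact store:

```python
# Minimal sketch: a causal language model only ranks possible next
# tokens by plausibility -- there is no fact lookup anywhere.
# (GPT-2 is used here as a small, freely available stand-in.)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
ids = tok(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits[0, -1]   # scores for the next token
probs = torch.softmax(logits, dim=-1)   # scores -> probabilities

# Print the five most plausible continuations. Whether the top token
# happens to be the correct answer is incidental to the objective.
top = torch.topk(probs, 5)
for p, i in zip(top.values, top.indices):
    print(f"{tok.decode(int(i))!r}: {float(p):.3f}")
```

Generation is just sampling from that distribution over and over; "true" versus "false" never enters the loop.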

The problem is that people ignore this, because they project human-like intelligence onto anything that can produce human-like language.

ChatGPT is a great tool because it can help you produce new texts (e.g., by editing your own writing) or give you ideas and suggestions. But it cannot replace a search engine, and it cannot cite sources.

2