
tenebras_lux t1_jctdu20 wrote

No, the thing with ChatGPT is that it will generate responses regardless of whether they are true, as long as it has partial information, instead of just saying "I don't know."

I asked it about a book series, and it had a partial understanding of the events, characters, and setting of the story. But when I asked anything specific about the series, it would often invent events, characters, and so on, instead of simply admitting it lacked in-depth knowledge.

So if you ask it about French restaurants in your city, it might have information about your city and about French restaurant names in general, and so it draws on that to make up restaurants.

11

WEEDB0T t1_jcu0cjq wrote

Hi. I ran into the same problems when I was researching. It was making up scientific studies, in detail, using real scientists' names. That freaked me out.

The creators call the phenomenon "hallucinations," and GPT-4 is reported to have fewer of them.


5

Jasrek t1_jcuizeo wrote

Asking it for factual information is extremely hit or miss.

My primary use for it has been as a brainstorming assistant. Bouncing ideas off it, asking for variations or alternatives, that sort of thing.

2