fingin t1_j031gnr wrote

Even GPT-4 will make silly mistakes. That's what happens when a model is trained to predict probable word sequences instead of actually having knowledge of language the way people do.

1
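(To make "probable word sequences" concrete, here's a minimal sketch of what a causal language model actually computes: a probability for every possible next token. This assumes the Hugging Face transformers library and the small, public GPT-2 model, since GPT-4 isn't available this way.)

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Probabilities over the whole vocabulary for the *next* token only.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id):>10s}  p={prob:.3f}")
```

The model never "knows" the answer; it just ranks continuations by probability, which is why confident-sounding mistakes slip through.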

Relative_Rich8699 t1_j033bjo wrote

Yes. But I was speaking to "the company's" bot on purpose, and my point is just that it should be trained on company data for those questions. When I ask about ducks, it can draw on the world's written word.

1
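(One hedged sketch of the idea above: route company-specific questions to the company's own documents and fall back to the model's general knowledge otherwise. The retriever, the similarity threshold, and the `llm` callable are all hypothetical placeholders, not any real product's API.)

```python
from dataclasses import dataclass

@dataclass
class Doc:
    text: str
    score: float  # similarity to the question (hypothetical retriever output)

def retrieve_company_docs(question: str) -> list[Doc]:
    """Placeholder for a search over the company's own documents."""
    ...

def answer(question: str, llm, threshold: float = 0.5) -> str:
    docs = retrieve_company_docs(question)
    relevant = [d for d in docs if d.score >= threshold]
    if relevant:
        # Company question: ground the answer in company data only.
        context = "\n".join(d.text for d in relevant)
        return llm(f"Answer using only this context:\n{context}\n\nQ: {question}")
    # General question (e.g. about ducks): use the model's broad training.
    return llm(question)
```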