Submitted by Cool_Abbreviations_9 t3_123b66w in MachineLearning
master3243 t1_jdue84p wrote
Reply to comment by BullockHouse in [D] GPT-4 might be able to tell you if it hallucinated by Cool_Abbreviations_9
> The problem with your solution is that it probably biases the model towards making up some papers just to fit the prompt and have a mix.
That's a very important point. Adding an extra condition (if 'p' then 'q') to the prompt biases the model towards doing 'p' and then doing 'q' to fulfil the prompt, even though the condition would still be satisfied if the model simply avoided doing 'p' in the first place.
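To make the logical point concrete: "if 'p' then 'q'" is a material implication, which is vacuously true whenever 'p' does not happen. A minimal sketch:

```python
def implies(p: bool, q: bool) -> bool:
    """Material implication: 'if p then q' is equivalent to (not p) or q."""
    return (not p) or q

# The condition holds in every case where p is False,
# so the model could satisfy the prompt by never doing 'p' at all.
assert implies(False, False) is True   # vacuously true
assert implies(False, True) is True    # vacuously true
assert implies(True, True) is True     # p happened, q followed
assert implies(True, False) is False   # the only violating case
```

In other words, the prompt never *requires* the model to do 'p', yet empirically the added condition nudges it towards doing so.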
For a more concrete example, here's me asking ChatGPT to write two essays:
1- Write a paragraph about zoos. [Figure: screenshot of the response] (Notice how no elephants are mentioned.)
2- Write a paragraph about zoos with an (if 'p' then 'q') condition. [Figure: screenshot of the response] (Notice how only this answer mentions elephants.)