
master3243 t1_jdue84p wrote

> The problem with your solution is that it probably biases the model towards making up some papers just to fit the prompt and have a mix.

That's a very important point: adding an extra condition (if 'p' then 'q') to the prompt biases the model towards doing 'p' and then doing 'q' to fulfil the prompt, even though the condition would still be satisfied if it simply never did 'p' at all.

For a more concrete example, here's me asking ChatGPT to write the same paragraph twice:

1- Write a paragraph about zoos. [screenshot of the response] (Notice how no elephants are mentioned)

2- Write a paragraph about zoos, with an (if 'p' then 'q') condition added. [screenshot of the response] (Notice how only this answer mentions elephants)
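If anyone wants to reproduce the comparison themselves, here's a rough sketch using the OpenAI Python client. The model name and the exact wording of the "if elephants, then ..." condition are just my stand-ins, not the precise prompts from the screenshots above:

    # Minimal sketch: compare a plain prompt with one carrying an
    # (if 'p' then 'q') condition. Model name and condition wording
    # are illustrative assumptions, not the original prompts.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    prompts = {
        "baseline": "Write a paragraph about zoos.",
        "conditioned": (
            "Write a paragraph about zoos. "
            "If you mention elephants, also mention how much they eat."
        ),
    }

    for label, prompt in prompts.items():
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
        )
        print(f"--- {label} ---")
        print(response.choices[0].message.content)

In my experience the "conditioned" version brings up elephants far more often, which is exactly the bias described above: the model treats the condition as a hint that 'p' should happen.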
