Submitted by neuralbeans t3_10puvih in deeplearning
like_a_tensor t1_j6mcv1v wrote
I'm not sure how to enforce a minimum probability directly, but you could try softmax with a high temperature.
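A minimal sketch of what this suggestion looks like, assuming NumPy (the `softmax` helper and the example logits here are illustrative, not from the thread). Dividing the logits by a temperature greater than 1 flattens the distribution, which raises the smallest probability:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Divide logits by the temperature before normalizing;
    # larger temperatures flatten the resulting distribution.
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = [4.0, 1.0, 0.5]
p_low = softmax(logits, temperature=1.0)
p_high = softmax(logits, temperature=5.0)
print(p_low.min(), p_high.min())  # the minimum probability rises with temperature
```

Note this only flattens the output at inference time; it does not guarantee any particular minimum value.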
neuralbeans OP t1_j6md46u wrote
That will just make the model learn larger logits to undo the effect of the temperature.
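The OP's objection can be sketched numerically, again assuming NumPy and an illustrative `softmax` helper: if training lets the model scale its logits up by the same factor as the temperature, the division cancels exactly and the distribution is unchanged.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([4.0, 1.0, 0.5])
T = 5.0
# If the model learns logits scaled by T, the temperature cancels:
# softmax(T * z / T) == softmax(z)
compensated = softmax(logits * T, temperature=T)
original = softmax(logits, temperature=1.0)
print(np.allclose(compensated, original))  # True
```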
_vb__ t1_j6ocec9 wrote
No, it would bring the logits closer to one another and make the model somewhat less confident in its output probabilities.