Submitted by neuralbeans in r/deeplearning
neuralbeans (OP) wrote:
Reply to comment by like_a_tensor in "Best practice for capping a softmax" by neuralbeans
That will just make the model learn larger logits to undo the effect of the temperature.
_vb__ wrote:
No, it would pull the scaled logits closer together, making the output distribution closer to uniform and the model a bit less confident in its probabilities.
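Both points are easy to check. Here is a minimal sketch in PyTorch; the logit values are made up for illustration:

```python
import torch
import torch.nn.functional as F

# Made-up logits for illustration.
logits = torch.tensor([2.0, 1.0, 0.5])
T = 2.0  # temperature > 1

# For fixed logits, dividing by T flattens the distribution
# (_vb__'s point): probabilities move toward uniform.
probs_plain = F.softmax(logits, dim=-1)     # ~[0.63, 0.23, 0.14]
probs_temp = F.softmax(logits / T, dim=-1)  # ~[0.48, 0.29, 0.23]

# But if the logits are learnable, scaling them up by T exactly
# cancels the temperature (the OP's point):
probs_undone = F.softmax((logits * T) / T, dim=-1)
print(torch.allclose(probs_plain, probs_undone))  # True
```

So whether temperature actually caps confidence depends on where it is applied: at inference time the logits are fixed and the distribution genuinely flattens, but if the temperature is baked into the training loss, gradient descent is free to rescale the logits and undo it.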