
MonsieurBlunt t1_j9glzsp wrote

Accommodating as much space for information as you can is not really a good idea: it is prone to overfitting and also harder to learn. You can think of the smaller space as a form of regularisation. You are forcing the model to keep the useful information and drop the rest, and you leave less room where it could just encode the training data and overfit.

3
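
A minimal sketch of the idea, assuming the discussion is about the size of a learned representation (e.g. an autoencoder bottleneck); the framework (PyTorch), the class name `AutoEncoder`, and the dimensions are illustrative assumptions, not anything from the thread:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AutoEncoder(nn.Module):
    """Toy autoencoder where `latent_dim` controls how much 'space'
    the model gets to store information about each input."""

    def __init__(self, input_dim: int = 784, latent_dim: int = 16):
        super().__init__()
        # Encoder compresses the input down to `latent_dim` numbers.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128),
            nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder tries to reconstruct the input from that code.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128),
            nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Same architecture, only the capacity of the representation differs:
# the narrow bottleneck acts as the implicit regulariser described above,
# the wide one leaves room to memorise (overfit) the training data.
narrow = AutoEncoder(latent_dim=16)
wide = AutoEncoder(latent_dim=512)

x = torch.randn(8, 784)  # fake batch, just to show the shapes work
print(F.mse_loss(narrow(x), x).item())
print(F.mse_loss(wide(x), x).item())
```

Under this (assumed) framing, shrinking `latent_dim` is the "leave less space" move the comment describes: the model can no longer store every detail of each training example, so it has to keep whatever generalises.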