
manuLearning t1_irie761 wrote

I've always had good experiences with dropout. Try putting a dropout layer of around 0.75 after your first layer and one dropout layer before your last layer. You can also put a light 0.15 dropout layer before your first layer.
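A minimal Keras sketch of that placement, assuming a simple feed-forward classifier (the framework, input dimension, hidden sizes, class count, and the 0.75 rate on the pre-output dropout are assumptions, not from the thread):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical feed-forward classifier showing the suggested dropout placement;
# the input dimension (40), hidden sizes, and class count (10) are placeholders.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(40,)),
    layers.Dropout(0.15),                    # light dropout before the first layer
    layers.Dense(256, activation="relu"),    # first layer
    layers.Dropout(0.75),                    # heavy dropout after the first layer
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.75),                    # dropout before the last layer (rate assumed)
    layers.Dense(10, activation="softmax"),  # last layer
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```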

How similar are the test and val sets?

2

perfopt OP t1_iriens4 wrote

To create the test and val sets I used train_test_split from sklearn.

I'll manually examine it.

But in general shouldn't the distribution be OK?

from sklearn.model_selection import train_test_split
inputs_train, inputs_test, targets_train, targets_test = train_test_split(inputs, targets, test_size=0.1)
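One thing worth checking: train_test_split shuffles at random, so on a small dataset the class balance can drift between the splits. A sketch using sklearn's stratify option to keep the class proportions matched (assuming targets holds class labels):

```python
from sklearn.model_selection import train_test_split

# stratify=targets forces the train/test splits to preserve
# the class proportions found in `targets`
inputs_train, inputs_test, targets_train, targets_test = train_test_split(
    inputs, targets, test_size=0.1, stratify=targets, random_state=42
)
```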
1

manuLearning t1_irij2hl wrote

A rule of thumb is to hold out around 30% as the val set.
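With the split call above, that is just a change of the test_size fraction (a sketch reusing the variable names from the earlier comment, which are assumptions carried over):

```python
from sklearn.model_selection import train_test_split

# hold out 30% for validation instead of 10%
inputs_train, inputs_val, targets_train, targets_val = train_test_split(
    inputs, targets, test_size=0.3, stratify=targets
)
```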

1

perfopt OP t1_irij7vh wrote

I tried that as well and got similar results when adding L2 + dropout.

0