
Visual-Arm-7375 OP t1_iz5inba wrote

Thanks for the answer! I don't understand the separation you're making between training and validation, though. Didn't we have a train/test split and then apply CV to the train set? The validation set would be one fold at each CV iteration. What am I not understanding here?
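For reference, a minimal sketch of the setup described in this question, with a placeholder dataset and model (not from the thread): the test set is held out first, and cross-validation runs on the training set, so each CV iteration uses one fold as the validation set.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split, cross_val_score

# Toy data standing in for the real dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hold the test set out first; it is not touched during model selection.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# 5-fold CV on the training set: each iteration fits on 4 folds and
# scores on the remaining fold, which plays the role of the validation set.
scores = cross_val_score(LogisticRegression(max_iter=1000), X_train, y_train, cv=5)
print(scores.mean())
```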


rahuldave t1_iz5lmbz wrote

You don't always cross-validate! Yes, sometimes after you do the train/test split you will use something like GridSearchCV in sklearn to cross-validate. But think of having to do 5-fold cross-validation for a large NN model that takes 10 days to train: you've now spent 50 days! So in that case you take the training set that remains after the test set was left out (if you left a test set out) and split it into a smaller training set and a single validation set, as in the sketch below.
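A minimal sketch of that alternative, with the same kind of placeholder data and arbitrary split ratios: one test split, then one train/validation split, so the expensive model is fit only once per configuration instead of k times.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy data as a placeholder for a large dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# First split: hold out a test set for the final evaluation only.
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Second split: carve one validation set out of the remaining training data,
# instead of rotating through k folds.
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=0
)
# Result: 60% train, 20% validation, 20% test.
```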
