josejo9423 t1_jcpu2pe wrote

I would go with 1, but I would not tune early stopping itself, just the number of estimators. XGBoost has the option of stopping iterations (early stopping) when there is no improvement in the evaluation metric. If you plot the metric over the boosting rounds and see that the model could have been stopped earlier, set the number of estimators to the value you consider just before it starts overfitting. A rough sketch of the idea is below.
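
A minimal sketch of that workflow, assuming the scikit-learn XGBoost wrapper and a made-up dataset; variable names and hyperparameters here are placeholders, and note that where `early_stopping_rounds` goes (constructor vs. `fit()`) depends on your XGBoost version:

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy data standing in for the real problem.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Set n_estimators high and let early stopping find the useful number of rounds.
model = xgb.XGBClassifier(
    n_estimators=2000,
    learning_rate=0.05,
    eval_metric="auc",
    early_stopping_rounds=50,  # recent xgboost takes this in the constructor;
                               # older versions expect it as a fit() argument
)
model.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)

# Round where validation AUC peaked; reuse it (or slightly less) as the
# fixed n_estimators when refitting on all of the training data.
print(model.best_iteration)
```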

1

josejo9423 t1_jchq421 wrote

I am not quite familiar with deep learning, but don't you have a loss function or metric where you can optimize for recall, precision, or AUC? I believe accuracy would not apply in this case since you have an imbalanced dataset. Also, with oversampling, the way it is done for random forests, you are making up new images, and I don't know how good that is. Why don't you try undersampling instead, or class weight adjustments (see the weighting sketch below)?
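
A minimal sketch of the class-weight idea, assuming PyTorch and a binary task; the class counts and tensors below are placeholders, not taken from the original problem:

```python
import torch
import torch.nn as nn

# Suppose the training set has many more negatives than positives.
n_neg, n_pos = 9000, 1000  # made-up class counts

# pos_weight scales the loss on the minority (positive) class so that
# positive examples count roughly as much as the majority class overall.
criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([n_neg / n_pos]))

# Usage inside a training loop (logits and labels are placeholders):
logits = torch.randn(8, 1)                      # raw model outputs for a batch of 8
labels = torch.randint(0, 2, (8, 1)).float()    # ground-truth 0/1 labels
loss = criterion(logits, labels)
print(loss.item())
```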

1