
Equivalent-Way3 t1_j4wjuxe wrote

XGBoost can do this, and you can set its hyperparameters so that it behaves like a random forest.


monkeysingmonkeynew OP t1_j4z1fum wrote

Thanks! Do you have any more info on how to do it with XGBoost?


Equivalent-Way3 t1_j50y33r wrote

Yep, very simple. Say you have a model1 that you've already trained; then you just pass it via the xgb_model argument in your next training call.

In R (Python should be the same or close to it)

new_model <- xgb.train(data = new_data, xgb_model = model1, blah blah blah)