Submitted by sidney_lumet t3_105syyz in MachineLearning
harpooooooon t1_j3ck3ri wrote
An alternative would be a very large set of 'if' statements
sidney_lumet OP t1_j3cl51k wrote
😂😅
tdgros t1_j3coj8r wrote
that's what random forests are...
currentscurrents t1_j3fop2j wrote
You can represent any neural network as a decision tree, and I believe you can represent any decision tree as a series of if statements...
But the interesting bit about neural networks is the training process, automatically creating that decision tree.
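The claim above can be made concrete with a minimal pure-Python sketch (the weights are made-up illustration values, not from any trained model): a tiny two-hidden-unit ReLU network, and the same function unrolled into explicit `if` statements. Each ReLU gate becomes one branch, and every branch of the resulting tree is plain linear arithmetic.

```python
# Tiny 1-input, 2-hidden-unit ReLU network (weights are made-up illustration values).
W1 = [1.0, -2.0]   # input -> hidden weights
B1 = [-0.5, 1.0]   # hidden biases
W2 = [3.0, 0.5]    # hidden -> output weights
B2 = 0.25          # output bias

def net(x):
    """Standard forward pass: output = W2 . relu(W1*x + B1) + B2."""
    h = [max(w * x + b, 0.0) for w, b in zip(W1, B1)]
    return sum(w2 * a for w2, a in zip(W2, h)) + B2

def net_as_ifs(x):
    """The same network unrolled: each ReLU is one 'if', each branch is linear."""
    out = B2
    if W1[0] * x + B1[0] > 0:   # is hidden unit 0 active?
        out += W2[0] * (W1[0] * x + B1[0])
    if W1[1] * x + B1[1] > 0:   # is hidden unit 1 active?
        out += W2[1] * (W1[1] * x + B1[1])
    return out

# Both forms agree everywhere: the ReLU activation pattern just selects
# which linear piece applies.
for x in (-2.0, -0.3, 0.0, 0.7, 3.0):
    assert abs(net(x) - net_as_ifs(x)) < 1e-12
```

With n hidden units this unrolling gives up to 2^n branches, which is why the tree form is useful for intuition (and sometimes fast inference) but not for training.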
Immarhinocerous t1_j3g6gf5 wrote
That's really interesting, thanks for the share. Though I wonder whether most decision trees, even if they're capable of representing the same solutions as a neural network, simply don't converge on them during training. If trees don't converge on the same solutions and NNs outperform trees, that would mean NNs are still needed for training; the models could then be optimized for run-time by distilling the NN into a tree.
currentscurrents t1_j3jst6n wrote
I know there's a whole field of decision tree learning, but I'm not super up to date on it.
I assume neural networks are better or else we'd be using trees instead.
Immarhinocerous t1_j3km9hy wrote
My go-to model (after a linear model) is usually XGBoost. It's nice to see that the theoretical potential of a tree-based model is as high as a neural network's. Though that's not necessarily true of XGBoost's boosted trees specifically, they do perform well, and I really like how fast they train.
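For readers unfamiliar with what XGBoost is doing under the hood, here is a minimal pure-Python sketch of the core gradient-boosting idea (fit each new tree to the current residuals of the ensemble), using depth-1 trees (stumps) and toy data. This is an illustration of the general technique only, not XGBoost's actual regularized objective or split-finding algorithm.

```python
# Gradient boosting for squared error, sketched with decision stumps:
# each round fits a stump to the residuals and adds it, shrunk by a learning rate.

def fit_stump(xs, residuals):
    """Pick the split on x minimizing squared error; return a predictor."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x < t]
        right = [r for x, r in zip(xs, residuals) if x >= t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x, t=t, lm=lm, rm=rm: lm if x < t else rm

def boost(xs, ys, n_trees=60, lr=0.5):
    """Repeatedly fit stumps to residuals; the model is their shrunken sum."""
    stumps, preds = [], [0.0] * len(xs)
    for _ in range(n_trees):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Toy regression target: y = x^2 on six points.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.0, 1.0, 4.0, 9.0, 16.0, 25.0]
model = boost(xs, ys)
# Training error shrinks geometrically as rounds accumulate.
```

Each round is a cheap greedy split search over one feature, which is part of why boosted-tree training is so fast in practice; real XGBoost adds regularization, histogram-based splits, and parallelism on top of this loop.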
Immarhinocerous t1_j3g62fu wrote
Haha, that would look somewhat like a tree!