Stephanie Glen takes us through quick explanations of decision trees, random forests, and gradient boosting:
The three methods are similar, with a significant amount of overlap. In a nutshell:
– A decision tree is a simple decision-making diagram.
– Random forests are a large number of trees, combined (using averages or “majority rules”) at the end of the process.
– Gradient boosting machines also combine decision trees, but start the combining process at the beginning, instead of at the end.
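The contrast above can be sketched in a few lines with scikit-learn (my choice for illustration; the original post doesn't specify a library), using a synthetic dataset — note how the forest and the boosted ensemble are drop-in replacements for the single tree:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Toy dataset for illustration only
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # One tree: a single decision-making diagram
    "Decision tree": DecisionTreeClassifier(random_state=0),
    # Many independent trees; their votes are combined at the end
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    # Trees built sequentially, each one correcting the ensemble so far
    "Gradient boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.3f}")
```

The key structural difference shows up in the constructors: the forest and boosting machine both take `n_estimators` (how many trees to combine), but the forest grows its trees independently while the boosting machine grows them one after another.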
Read on for more details. All three are useful algorithms serving similar but slightly different purposes.