8.1 Introduction: Tree-based methods
Involve stratifying or segmenting the predictor space into a number of simple regions
Are simple and useful for interpretation
However, basic decision trees are typically not competitive with the best supervised learning approaches in terms of prediction accuracy
Hence we also discuss bagging, random forests, and boosting; these tree-based ensemble methods grow multiple trees, which are then combined to yield a single consensus prediction
These can result in dramatic improvements in prediction accuracy (but some loss of interpretability)
Can be applied to both regression and classification
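As a concrete illustration (not part of the original slides), the sketch below contrasts a single regression tree with a random forest. It assumes scikit-learn and its built-in diabetes dataset; the hyperparameters (max_depth=3, n_estimators=500) are arbitrary choices for demonstration, not values recommended by the text.

# Minimal sketch: single regression tree vs. random forest (assumed setup).
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single decision tree: stratifies the predictor space into simple regions
# and is easy to interpret, but often less accurate.
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_train, y_train)

# A random forest: many trees grown on bootstrap samples, averaged into a
# single consensus prediction; usually more accurate, less interpretable.
forest = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)

print("Single tree test MSE:  ", mean_squared_error(y_test, tree.predict(X_test)))
print("Random forest test MSE:", mean_squared_error(y_test, forest.predict(X_test)))

Swapping the two regressors for DecisionTreeClassifier and RandomForestClassifier gives the analogous comparison for a classification problem.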