6.4 Boosting
Boosting is like making an ensemble, but each additional model is specifically aimed at overcoming the deficiencies of the models that came before it.
I found the description of boosting in the book to be a bit lacking, and there is a typo in one of the crucial equations (bottom of p. 271 in 2e). I found a helpful reference (with diagrams) here: https://towardsdatascience.com/boosting-algorithms-explained-d38f56ef3f30
The basic idea (for AdaBoost, at least): all training examples are weighted, so that their relative importance can be adjusted. When an example is misclassified by one of the models in the ensemble, its weight is increased for the next model, so the next model focuses on the examples the ensemble is still getting wrong. Each model also receives a vote weight based on its weighted accuracy, and the final prediction is a weighted vote over all the models.
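As a sanity check on that description, here is a minimal sketch of AdaBoost using 1-D decision stumps as the weak learners. This is my own toy implementation, not code from the book; the function names and the choice of stumps are mine.

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=10):
    """Toy AdaBoost with 1-D decision stumps. Labels y must be in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)            # start with uniform example weights
    stumps = []                        # list of (threshold, polarity, alpha)
    for _ in range(n_rounds):
        # greedily pick the stump (threshold, polarity) with lowest weighted error
        best = None
        for thr in np.unique(X):
            for pol in (1, -1):
                pred = np.where(X >= thr, pol, -pol)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, thr, pol, pred)
        err, thr, pol, pred = best
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # this model's vote weight
        stumps.append((thr, pol, alpha))
        # raise weights of misclassified examples, lower the rest, renormalize
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return stumps

def predict(stumps, X):
    """Weighted vote of all stumps."""
    score = sum(alpha * np.where(X >= thr, pol, -pol)
                for thr, pol, alpha in stumps)
    return np.sign(score)
```

On a toy 1-D dataset like `y = [1, 1, -1, -1, 1, 1]`, no single stump can classify everything, but the weighted vote of a few boosted stumps can, which is the whole point: the reweighting forces later stumps to attend to the examples earlier stumps got wrong.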