Extending linear models

  • Polynomial regression: It extends the linear model by adding extra predictors, obtained by raising each of the original predictors to a power. Generally speaking, it is unusual to use a degree d greater than 3 or 4, because as d grows the fitted function becomes overly flexible and can take on odd shapes, and the higher powers of the predictor become strongly correlated, increasing multicollinearity. A minimal fitting sketch follows the formula below.

$$y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \beta_3 x_i^3 + \cdots + \beta_d x_i^d + \epsilon_i$$
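
A minimal sketch of fitting a degree-3 polynomial regression, assuming NumPy and scikit-learn are available; the simulated data and chosen degree are illustrative, not taken from the text.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(200, 1))                       # a single predictor
y = 1 + 2 * x[:, 0] - 0.5 * x[:, 0] ** 3 + rng.normal(scale=1.0, size=200)

degree = 3                                                  # d = 3; rarely useful beyond 3 or 4
X_poly = PolynomialFeatures(degree=degree, include_bias=False).fit_transform(x)

model = LinearRegression().fit(X_poly, y)                   # least squares on x, x^2, x^3
print(model.intercept_, model.coef_)                        # estimates of beta_0 and beta_1..beta_d
```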

  • Piecewise constant regression: It’s a step function that breaks the range of X into bins and fits a simple constant (e.g., the mean response) in each bin.

If we define the cutpoints $c_1, c_2, \ldots, c_K$ in the range of X, we can create dummy variables to represent each interval. For example, $C_1(x_i) = 1$ if $c_1 \le x_i < c_2$ and $0$ otherwise, and we repeat that process for each observation and each interval. As a result we can fit a linear regression on the new variables (a sketch follows the formula below).

$$y_i = \beta_0 + \beta_1 C_1(x_i) + \beta_2 C_2(x_i) + \cdots + \beta_K C_K(x_i) + \epsilon_i$$
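
A minimal sketch of piecewise constant regression with hand-picked cutpoints, again assuming NumPy and scikit-learn; the cutpoints and data are illustrative. Fitting without an intercept makes each coefficient exactly the fitted constant (the mean response) for its bin.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = np.where(x < 5, 1.0, 4.0) + rng.normal(scale=0.5, size=200)

cutpoints = np.array([2.5, 5.0, 7.5])                       # c_1, c_2, c_3 (illustrative choices)
bin_index = np.digitize(x, cutpoints)                       # bin 0..K for each observation
C = np.eye(len(cutpoints) + 1)[bin_index]                   # dummy variables, one column per bin

model = LinearRegression(fit_intercept=False).fit(C, y)     # one constant per bin
print(model.coef_)                                          # each coefficient = mean response in its bin
```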