Least Squares

  • More generally, linear least-squares problems fit a model of the form:

\[ f(t) = c_1 f_1(t) + \cdots + c_n f_n(t) \]

where the functions \(f_1, \ldots, f_n\) are all known; for example, a quadratic polynomial fit uses \(f_1(t) = 1\), \(f_2(t) = t\), and \(f_3(t) = t^2\).

  • Given data points \((t_i, y_i)\), \(i = 1, \ldots, m\), the fit will only be approximate, with residuals \(r_i = y_i - f(t_i)\).

  • The least squares approach minimizes:

\[ R(c_1,\ldots,c_n) = \sum_{i=1}^m\, [ y_i - f(t_i) ]^2 \]

  • This can be recast as a matrix problem:

\[ \begin{aligned} \mathbf{r} &= \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_{m-1} \\ y_m \end{bmatrix} - \begin{bmatrix} f_1(t_1) & f_2(t_1) & \cdots & f_n(t_1) \\[1mm] f_1(t_2) & f_2(t_2) & \cdots & f_n(t_2) \\[1mm] \vdots & \vdots & & \vdots \\ f_1(t_{m-1}) & f_2(t_{m-1}) & \cdots & f_n(t_{m-1}) \\[1mm] f_1(t_m) & f_2(t_m) & \cdots & f_n(t_m) \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}\\ &= \mathbf{b} - \mathbf{A}\mathbf{x} \end{aligned} \]
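As a concrete sketch of this construction, assuming the quadratic basis from above and made-up data \((t_i, y_i)\), the design matrix \(\mathbf{A}\) and the residual \(\mathbf{r} = \mathbf{b} - \mathbf{A}\mathbf{x}\) can be formed with NumPy:

```python
import numpy as np

# Hypothetical sample data (t_i, y_i): m = 5 points, made up for illustration.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([1.1, 1.9, 3.2, 4.8, 7.1])

# Quadratic basis: f_1(t) = 1, f_2(t) = t, f_3(t) = t^2, so n = 3.
basis = [lambda s: np.ones_like(s), lambda s: s, lambda s: s**2]

# Design matrix A with entries A[i, j] = f_j(t_i); shape (m, n).
A = np.column_stack([f(t) for f in basis])
b = y

# For any trial coefficient vector x = (c_1, ..., c_n),
# the residual is r = b - A x and R = r^T r.
x_trial = np.array([1.0, 1.0, 1.0])
r = b - A @ x_trial
R = r @ r
print("R(1, 1, 1) =", R)
```

Each column of \(\mathbf{A}\) is one basis function evaluated at every \(t_i\), so \(\mathbf{A}\) has shape \(m \times n\).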

  • The linear least squares problem is then to minimize \(R = \mathbf{r}^T\mathbf{r} = \|\mathbf{r}\|_2^2\). Stated formally:

Definition 3.1.3:

Given \(\mathbf{A} \in \mathscr{R}^{m \times n}\) and \(\mathbf{b} \in \mathscr{R}^m\), with \(m > n\), find:

\[ \underset{\mathbf{x} \in \mathscr{R}^n }{\text{argmin}}\, \bigl\| \mathbf{b}-\mathbf{A} \mathbf{x} \bigr\|_2^2 \]
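A minimal, self-contained sketch of computing this argmin numerically, reusing the hypothetical quadratic-fit data from the earlier example, uses `numpy.linalg.lstsq`:

```python
import numpy as np

# Hypothetical overdetermined system: m = 5 data points, n = 3 basis functions.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
b = np.array([1.1, 1.9, 3.2, 4.8, 7.1])
A = np.column_stack([np.ones_like(t), t, t**2])  # columns f_1 = 1, f_2 = t, f_3 = t^2

# lstsq returns the x minimizing ||b - A x||_2^2.
x, residual_ss, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)
print("argmin x:", x)
print("sum of squared residuals:", residual_ss)

# For full-rank A the same minimizer solves the normal equations
# A^T A x = A^T b (shown for illustration only).
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
print("normal-equations x:", x_normal)
```

For full-rank \(\mathbf{A}\) the minimizer also satisfies the normal equations \(\mathbf{A}^T\mathbf{A}\mathbf{x} = \mathbf{A}^T\mathbf{b}\), though the SVD-based `lstsq` is generally better conditioned.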