5.12 A simple bootstrap example

  • We want to invest a fixed sum of money in two financial assets that yield returns of X and Y, respectively
  • We will invest a fraction of our money \({\alpha}\) in X
  • And invest the rest \((1-{\alpha})\) in Y
  • Since there is variability associated with the returns on the two assets, we want to choose \({\alpha}\) to minimize the total risk (variance) of our investment
  • Thus, we want to minimize: \(Var({\alpha}X + (1-{\alpha})Y)\)
  • The value of \({\alpha}\) that minimizes the risk is given by:

\[{\alpha} = \frac{{\sigma}_{Y}^{2}-{\sigma}_{XY}}{{\sigma}_{X}^{2}+{\sigma}_{Y}^{2}-2{\sigma}_{XY}}\] where \({\sigma}_{X}^{2} = Var(X)\), \({\sigma}_{Y}^{2} = Var(Y)\), and \({\sigma}_{XY} = Cov(X, Y)\)
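
One way to see this: expand the variance, differentiate with respect to \({\alpha}\), and set the derivative to zero:

\[\begin{aligned}
Var({\alpha}X + (1-{\alpha})Y) &= {\alpha}^{2}{\sigma}_{X}^{2} + (1-{\alpha})^{2}{\sigma}_{Y}^{2} + 2{\alpha}(1-{\alpha}){\sigma}_{XY} \\
\frac{d}{d{\alpha}}\,Var({\alpha}X + (1-{\alpha})Y) &= 2{\alpha}{\sigma}_{X}^{2} - 2(1-{\alpha}){\sigma}_{Y}^{2} + (2-4{\alpha}){\sigma}_{XY} = 0
\end{aligned}\]

Solving the second equation for \({\alpha}\) gives the expression above.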

  • The quantities \(Var(X)\), \(Var(Y)\), and \(Cov(X, Y)\) are unknown, but we can estimate them from a dataset that contains measurements of X and Y.
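
A minimal sketch of this plug-in estimate, assuming the measurements are held in NumPy arrays (the helper name alpha_hat is illustrative, not from the original notes):

```python
import numpy as np

def alpha_hat(x, y):
    """Plug-in estimate of alpha from paired samples of X and Y returns."""
    c = np.cov(x, y)                 # 2x2 sample covariance matrix
    var_x, var_y = c[0, 0], c[1, 1]  # sample variances of X and Y
    cov_xy = c[0, 1]                 # sample covariance of X and Y
    return (var_y - cov_xy) / (var_x + var_y - 2 * cov_xy)
```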

  • Simulating 100 pairs of data points (X, Y) four times gave four estimates \(\hat{\alpha}\) ranging from 0.532 to 0.657.
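
A minimal sketch of one such simulation, assuming bivariate normal returns with \({\sigma}_{X}^{2}=1\), \({\sigma}_{Y}^{2}=1.25\), and \({\sigma}_{XY}=0.5\) (an illustrative choice that makes the true \({\alpha}\) equal 0.6, consistent with the value quoted below):

```python
import numpy as np

rng = np.random.default_rng(1)   # illustrative seed

# Assumed population parameters: var(X) = 1, var(Y) = 1.25, cov(X, Y) = 0.5,
# so the true alpha = (1.25 - 0.5) / (1 + 1.25 - 2 * 0.5) = 0.6
mean = [0.0, 0.0]
cov_true = [[1.0, 0.5],
            [0.5, 1.25]]

def alpha_hat(x, y):
    c = np.cov(x, y)             # 2x2 sample covariance matrix
    return (c[1, 1] - c[0, 1]) / (c[0, 0] + c[1, 1] - 2 * c[0, 1])

# Four independent datasets of 100 (X, Y) pairs, each giving one estimate of alpha
for _ in range(4):
    x, y = rng.multivariate_normal(mean, cov_true, size=100).T
    print(round(alpha_hat(x, y), 3))
```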

  • Now, how accurate is this as an estimate of \({\alpha}\)?

    • Get the standard deviation of \(\hat{\alpha}\)
    • Same simulation process as above but done 1,000 times to get 1,000 values of \(\hat{\alpha}\) (i.e., 1,000 estimates for \({\alpha}\))
    • The true value of \({\alpha}\) is known to be 0.6, since we chose the simulation parameters ourselves
    • The mean over all 1,000 estimates is \(\bar{\alpha} = 0.5996\)
    • Std dev of the estimate is:

\[\sqrt{\frac{1}{1000-1}\sum_{r=1}^{1000}(\hat{\alpha}_{r}-\bar{\alpha})^{2}} = 0.083\]

  • This gives a fairly good sense of the accuracy of \(\hat{\alpha}\): roughly speaking, we expect \(\hat{\alpha}\) to differ from \({\alpha}\) by about 0.08
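
A sketch of this repeated-simulation estimate of the standard error, under the same assumed bivariate normal parameters as in the simulation sketch above (the sample standard deviation of the 1,000 estimates is exactly the formula just shown):

```python
import numpy as np

rng = np.random.default_rng(1)
mean = [0.0, 0.0]
cov_true = [[1.0, 0.5],          # assumed parameters; true alpha = 0.6
            [0.5, 1.25]]

def alpha_hat(x, y):
    c = np.cov(x, y)
    return (c[1, 1] - c[0, 1]) / (c[0, 0] + c[1, 1] - 2 * c[0, 1])

# 1,000 simulated datasets of 100 pairs each -> 1,000 estimates of alpha
estimates = np.array([
    alpha_hat(*rng.multivariate_normal(mean, cov_true, size=100).T)
    for _ in range(1000)
])

print(estimates.mean())        # should land near the true alpha of 0.6
print(estimates.std(ddof=1))   # divisor is 1000 - 1, matching the formula; roughly 0.08
```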