5.12 A simple bootstrap example
- We want to invest a fixed sum of money in 2 financial assets that yield returns of X and Y respectively
- We will invest a fraction of our money α in X
- And invest the rest (1−α) in Y
- Since there is variability associated with the returns on the 2 assets, we want to choose α to minimize the total risk (variance) of our investment
- Thus, we want to minimize: Var(αX+(1−α)Y)
- The value of α that minimizes the risk is given by:

  $$\alpha = \frac{\sigma_Y^2 - \sigma_{XY}}{\sigma_X^2 + \sigma_Y^2 - 2\sigma_{XY}}$$

  where $\sigma_X^2 = \mathrm{Var}(X)$, $\sigma_Y^2 = \mathrm{Var}(Y)$, and $\sigma_{XY} = \mathrm{Cov}(X, Y)$
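This closed form follows from a one-line calculus step: expand the variance, differentiate with respect to α, and set the derivative to zero (a sketch using the same symbols as above):

```latex
\begin{aligned}
\mathrm{Var}\bigl(\alpha X + (1-\alpha)Y\bigr)
  &= \alpha^2 \sigma_X^2 + (1-\alpha)^2 \sigma_Y^2 + 2\alpha(1-\alpha)\sigma_{XY} \\
\frac{d}{d\alpha}
  &= 2\alpha \sigma_X^2 - 2(1-\alpha)\sigma_Y^2 + 2(1-2\alpha)\sigma_{XY} = 0 \\
\Rightarrow\quad
\alpha \left(\sigma_X^2 + \sigma_Y^2 - 2\sigma_{XY}\right)
  &= \sigma_Y^2 - \sigma_{XY}
\end{aligned}
```

Dividing both sides of the last line by $\sigma_X^2 + \sigma_Y^2 - 2\sigma_{XY}$ gives the formula for α.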
The quantities Var(X), Var(Y), and Cov(X,Y) are unknown, but we can estimate them from a dataset that contains measurements for X and Y.
Simulating 100 pairs of data points $(X, Y)$ four times gave four values of $\hat{\alpha}$, ranging from 0.532 to 0.657.
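One round of this simulation can be sketched as follows. The population covariance matrix below is an assumption, chosen so that the true minimizing value is $\alpha = (1.25 - 0.5)/(1 + 1.25 - 2 \cdot 0.5) = 0.6$; the plug-in estimator simply substitutes sample variances and the sample covariance into the formula above:

```python
import numpy as np

def alpha_hat(x, y):
    """Plug-in estimate: sample variances/covariance into the formula for alpha."""
    s = np.cov(x, y)  # 2x2 sample covariance matrix (unbiased, ddof=1)
    return (s[1, 1] - s[0, 1]) / (s[0, 0] + s[1, 1] - 2 * s[0, 1])

rng = np.random.default_rng(0)
# Assumed population parameters: Var(X)=1, Var(Y)=1.25, Cov(X,Y)=0.5 -> alpha=0.6
sigma = np.array([[1.0, 0.5],
                  [0.5, 1.25]])

# Four independent simulations, each of 100 (X, Y) pairs
estimates = [
    alpha_hat(*rng.multivariate_normal([0.0, 0.0], sigma, size=100).T)
    for _ in range(4)
]
print([round(a, 3) for a in estimates])
```

Each run of 100 pairs yields one $\hat{\alpha}$; the spread across the four values hints at the sampling variability quantified next.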
Now, how accurate is this as an estimate of α?
- Get the standard deviation of $\hat{\alpha}$
- Same simulation process as above, but done 1,000 times to get 1,000 values of $\hat{\alpha}$ (i.e., 1,000 estimates of α)
- True known value of α is 0.6
- The mean over all 1,000 estimates is $\bar{\alpha} = 0.5996$
- The standard deviation of the estimates is:

  $$\sqrt{\frac{1}{1000-1}\sum_{r=1}^{1000}\left(\hat{\alpha}_r - \bar{\alpha}\right)^2} = 0.083$$

- This gives a fairly good idea of the accuracy of $\hat{\alpha}$: we expect $\hat{\alpha}$ to differ from α by roughly 0.08, on average
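The 1,000-replicate experiment and the standard-deviation formula above can be sketched the same way. The covariance matrix is again an assumed population with true α = 0.6; the exact numbers will vary with the random seed:

```python
import numpy as np

def alpha_hat(x, y):
    """Plug-in estimate of alpha from sample variances/covariance."""
    s = np.cov(x, y)
    return (s[1, 1] - s[0, 1]) / (s[0, 0] + s[1, 1] - 2 * s[0, 1])

rng = np.random.default_rng(1)
sigma = np.array([[1.0, 0.5],
                  [0.5, 1.25]])  # assumed population: true alpha = 0.6

# 1,000 independent simulations, each of 100 (X, Y) pairs
estimates = np.array([
    alpha_hat(*rng.multivariate_normal([0.0, 0.0], sigma, size=100).T)
    for _ in range(1000)
])

mean_alpha = estimates.mean()
sd_alpha = estimates.std(ddof=1)  # the sqrt(1/(R-1) * sum(...)) formula
print(f"mean = {mean_alpha:.4f}, sd = {sd_alpha:.3f}")
```

Note that this only works because we can draw fresh samples from the known population at will; the bootstrap's contribution, developed next, is to mimic this resampling when all we have is a single observed dataset.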