General principle

  • We can generate random numbers from a target distribution by solving an inverse problem: inverting its CDF.

  • First, it turns out that it is not difficult for a modern computer to generate random numbers uniformly distributed between \(0\) and \(1\).

  • Now, suppose we want to generate random numbers from another random variable \(X\), with cumulative distribution function \(F_X\).

  • Recall that the values of \(F_X\) lie between \(0\) and \(1\).

  • Also, not only is \(F_X\) non-decreasing; in most common cases it is actually strictly increasing, which allows us to correctly define its inverse function \(F^{-1}_X\).

  • Then, we can generate random numbers whose CDF is \(F_X\) as follows (a code sketch follows these steps):

    • Generate a random number \(u\) uniformly between \(0\) and \(1\).
    • Calculate the unique value \(F^{-1}_X(u)\).
    • Theorem: Let \(X, Y, U\) be random variables such that \(U \sim \mathrm{Uniform}(0, 1)\) and \(Y = F^{-1}_X(U)\). Then the CDF of \(Y\) is \(F_X\).
    • Proof sketch: \(P(Y \le y) = P(F^{-1}_X(U) \le y) = P(U \le F_X(y)) = F_X(y)\), using that \(F_X\) is strictly increasing and \(U\) is uniform on \((0, 1)\).
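
A minimal Python sketch of these two steps. The function name `inverse_transform_sample` and its arguments are illustrative, not part of the notes; it simply takes an inverse CDF as a callable.

```python
import random

def inverse_transform_sample(f_inv, n, seed=None):
    """Draw n samples from the distribution whose inverse CDF is f_inv."""
    rng = random.Random(seed)
    # Step 1: generate u ~ Uniform(0, 1); Step 2: return F_X^{-1}(u).
    return [f_inv(rng.random()) for _ in range(n)]
```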
  • Examples (a code sketch covering both appears after this list):

    • Generate Gaussian random numbers with mean \(\mu\) and variance \(\sigma^2\).
      • The CDF of the ideal distribution is \(F_X(x) = \Phi\!\left(\dfrac{x-\mu}{\sigma}\right)\), where \(\Phi\) is the standard normal CDF.
      • The transformation becomes \(F^{-1}_X(U) = \sigma\Phi^{-1}(U) + \mu\).
    • Generate exponential random numbers with parameter \(\lambda\).
      • The CDF of the ideal distribution is \(F_X(x) = 1 - e^{-\lambda x}\).
      • The transformation becomes \(F^{-1}_X(U) = -\dfrac{1}{\lambda}\log(1-U)\).
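
A sketch of both examples using only the Python standard library, where `statistics.NormalDist().inv_cdf` plays the role of \(\Phi^{-1}\); the seed, sample size, and parameter values are arbitrary choices for illustration.

```python
import math
import random
from statistics import NormalDist, fmean, pvariance

rng = random.Random(0)
n = 100_000
# rng.random() lies in [0, 1); hitting exactly 0 is negligibly unlikely here.
u = [rng.random() for _ in range(n)]

# Gaussian with mean mu and variance sigma^2: x = sigma * Phi^{-1}(u) + mu
mu, sigma = 2.0, 3.0
gauss = [sigma * NormalDist().inv_cdf(ui) + mu for ui in u]

# Exponential with rate lam: x = -(1/lam) * log(1 - u)
lam = 0.5
expo = [-math.log(1.0 - ui) / lam for ui in u]

print(fmean(gauss), pvariance(gauss))  # roughly mu and sigma^2
print(fmean(expo), pvariance(expo))    # roughly 1/lam and 1/lam^2
```

One could equally use `scipy.stats.norm.ppf` for \(\Phi^{-1}\); the standard library is used here only to keep the sketch dependency-free.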