Random vectors
- Now we return to more than two variables and look at random vectors:
$$f_{\mathbf{X}}(\mathbf{x}) = f_{X_1,\dots,X_N}(x_1,\dots,x_N)$$
where
$$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_N \end{bmatrix}$$
- Expectation is straightforward, and the mean vector $\boldsymbol{\mu}$ is defined as:
$$\boldsymbol{\mu} = E[\mathbf{X}] = \begin{bmatrix} E[X_1] \\ \vdots \\ E[X_N] \end{bmatrix}$$
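As a quick numerical sketch (my own illustration, not part of the notes), the snippet below estimates the mean vector from samples of a hypothetical 3-dimensional random vector; the particular distribution and parameter values are assumptions chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 10,000 samples of a 3-dimensional random vector X,
# drawn from a multivariate normal with chosen parameters (assumed for illustration).
true_mu = np.array([1.0, -2.0, 0.5])
true_Sigma = np.array([[2.0, 0.3, 0.0],
                       [0.3, 1.0, 0.2],
                       [0.0, 0.2, 0.5]])
samples = rng.multivariate_normal(true_mu, true_Sigma, size=10_000)  # shape (10000, 3)

# Sample estimate of mu = E[X]: average each component over the samples.
mu_hat = samples.mean(axis=0)
print(mu_hat)  # close to [1.0, -2.0, 0.5]
```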
- For covariance, each variable has a variance, and there is a covariance between every possible pair. These are organized into a convenient matrix $\Sigma$:
$$\Sigma = \operatorname{Cov}(\mathbf{X}) = \begin{bmatrix}
\operatorname{Var}[X_1] & \operatorname{Cov}(X_1,X_2) & \dots & \operatorname{Cov}(X_1,X_N) \\
\operatorname{Cov}(X_2,X_1) & \operatorname{Var}[X_2] & \dots & \operatorname{Cov}(X_2,X_N) \\
\vdots & \vdots & \ddots & \vdots \\
\operatorname{Cov}(X_N,X_1) & \operatorname{Cov}(X_N,X_2) & \dots & \operatorname{Var}[X_N]
\end{bmatrix}$$
which is more compactly written:
$$\Sigma = E\left[(\mathbf{X}-\boldsymbol{\mu})(\mathbf{X}-\boldsymbol{\mu})^T\right]$$
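To make the compact formula concrete, here is a minimal sketch (again my own, under the same assumed 3-dimensional example) that estimates $\Sigma$ by averaging the outer products $(\mathbf{x}-\hat{\boldsymbol{\mu}})(\mathbf{x}-\hat{\boldsymbol{\mu}})^T$ over samples, and cross-checks the result against NumPy's `np.cov`.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same hypothetical 3-dimensional example (parameters assumed for illustration).
true_mu = np.array([1.0, -2.0, 0.5])
true_Sigma = np.array([[2.0, 0.3, 0.0],
                       [0.3, 1.0, 0.2],
                       [0.0, 0.2, 0.5]])
X = rng.multivariate_normal(true_mu, true_Sigma, size=100_000)  # shape (100000, 3)

# Sigma = E[(X - mu)(X - mu)^T], estimated by averaging outer products over the samples.
mu_hat = X.mean(axis=0)
centered = X - mu_hat                       # shape (100000, 3)
Sigma_hat = centered.T @ centered / len(X)  # (3, 3): variances on the diagonal, covariances off it

# Cross-check with NumPy's covariance routine (bias=True uses the same 1/N normalization).
print(np.allclose(Sigma_hat, np.cov(X, rowvar=False, bias=True)))  # True
print(Sigma_hat.round(2))  # close to true_Sigma
```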
- If the variables are all independent, the covariances are all zero and we have a diagonal covariance matrix:
$$\Sigma = \operatorname{Cov}(\mathbf{X}) = \begin{bmatrix}
\operatorname{Var}[X_1] & 0 & \dots & 0 \\
0 & \operatorname{Var}[X_2] & \dots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \dots & \operatorname{Var}[X_N]
\end{bmatrix}$$
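As an illustration of the independent case (a sketch of my own, with arbitrarily chosen variances), drawing each component independently gives a sample covariance matrix whose off-diagonal entries are close to zero:

```python
import numpy as np

rng = np.random.default_rng(2)

# Three independent components with different variances (hypothetical choices).
x1 = rng.normal(0.0, np.sqrt(2.0), size=100_000)   # Var[X1] = 2.0
x2 = rng.normal(5.0, np.sqrt(1.0), size=100_000)   # Var[X2] = 1.0
x3 = rng.normal(-1.0, np.sqrt(0.5), size=100_000)  # Var[X3] = 0.5
X = np.column_stack([x1, x2, x3])

# Independence => off-diagonal covariances are (approximately) zero,
# so the sample covariance matrix is close to diag(2.0, 1.0, 0.5).
Sigma_hat = np.cov(X, rowvar=False)
print(Sigma_hat.round(2))
```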