Here’s a proof of the first equation in the first condition:

    Cov(aX, Y) = E((aX − E(aX))(Y − E(Y)))
               = E(a(X − E(X))(Y − E(Y)))
               = a E((X − E(X))(Y − E(Y)))
               = a Cov(X, Y)

The proof of the second condition is also straightforward.
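As a quick sanity check on that first condition, here is a minimal numerical sketch, assuming NumPy; the simulated data, the seed, and the factor a = 3 are arbitrary choices for illustration, not part of the original notes. It estimates both sides of Cov(aX, Y) = a Cov(X, Y) from samples:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100_000)
    Y = 0.5 * X + rng.normal(size=100_000)   # make Y correlated with X
    a = 3.0

    def cov(U, V):
        # Covariance E((U - E(U))(V - E(V))), estimated from samples.
        return np.mean((U - U.mean()) * (V - V.mean()))

    print(cov(a * X, Y))   # estimate of Cov(aX, Y)
    print(a * cov(X, Y))   # a * Cov(X, Y); agrees up to floating-point error

Since the identity holds exactly for the sample averages themselves, the two printed values differ only by floating-point noise, not by sampling error.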
Correlation. The correlation ρXY of two joint variables X and Y is a normalized version of their covariance. It’s defined by the equation

    ρXY = Cov(X, Y) / (σX σY).

Note that independent variables have 0 correlation as well as 0 covariance.

By dividing by the product σX σY of the standard deviations, the correlation becomes bounded between plus and minus 1:

    −1 ≤ ρXY ≤ 1.
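To make the definition concrete, here is a small sketch, again with NumPy and arbitrary simulated data, that computes ρXY directly from the formula above, compares it with NumPy’s built-in np.corrcoef, and checks that an independent variable has correlation near 0:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=100_000)
    Y = 0.5 * X + rng.normal(size=100_000)   # dependent on X
    Z = rng.normal(size=100_000)             # independent of X

    cov_XY = np.mean((X - X.mean()) * (Y - Y.mean()))
    rho = cov_XY / (X.std() * Y.std())       # Cov(X, Y) / (sigma_X sigma_Y)

    print(rho)                       # the definition, computed by hand
    print(np.corrcoef(X, Y)[0, 1])   # NumPy's version; the two agree
    print(np.corrcoef(X, Z)[0, 1])   # near 0: independence gives 0 correlation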
There are various ways you can prove that inequality. Here’s one. We’ll start by proving

    0 ≤ Var(X/σX ± Y/σY) = 2(1 ± ρXY).

There are actually two equations there, and we can prove them at the same time.

First note that the “0 ≤” parts follow from the fact that variance is nonnegative. Next use the property proved above about the variance of a sum:

    Var(X/σX ± Y/σY)
      = Var(X/σX) + Var(±Y/σY) + 2 Cov(X/σX, ±Y/σY).

Now use the fact that Var(cX) = c² Var(X) to rewrite that as

    (1/σX²) Var(X) + (1/σY²) Var(±Y) + 2 Cov(X/σX, ±Y/σY).

But Var(X) = σX² and Var(−Y) = Var(Y) = σY², so that equals

    2 + 2 Cov(X/σX, ±Y/σY).

By the bilinearity of covariance, that equals

    2 ± (2/(σX σY)) Cov(X, Y) = 2 ± 2ρXY,

and we’ve shown that

    0 ≤ 2(1 ± ρXY).

Next, divide by 2 and move one term to the other side of the inequality to get

    ∓ρXY ≤ 1,

so

    −1 ≤ ρXY ≤ 1.
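The two equations just proved can also be checked numerically. The sketch below (NumPy again; the coefficient −0.8 is an arbitrary choice that makes the correlation negative) standardizes X and Y and compares Var(X/σX ± Y/σY) against 2(1 ± ρXY):

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=100_000)
    Y = -0.8 * X + rng.normal(size=100_000)

    Xs, Ys = X / X.std(), Y / Y.std()        # standardized: variance 1
    rho = np.corrcoef(X, Y)[0, 1]

    print(np.var(Xs + Ys), 2 * (1 + rho))    # the "+" equation
    print(np.var(Xs - Ys), 2 * (1 - rho))    # the "-" equation
    # Both variances are nonnegative, which forces -1 <= rho <= 1.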
This exercise should remind you of the same kind of thing that goes on in linear algebra. In fact, it is the same thing exactly. Take a set of real-valued random variables, not necessarily independent. Their linear combinations form a vector space. Their covariance is the inner product (also called the dot product or scalar product) of two vectors in that space:

    X · Y = Cov(X, Y)

The norm ‖X‖ of X is the square root of ‖X‖², defined by

    ‖X‖² = X · X = Cov(X, X) = Var(X) = σX²,

and so the angle θ between X and Y is defined by

    cos θ = (X · Y) / (‖X‖ ‖Y‖) = Cov(X, Y) / (σX σY) = ρXY,

that is, θ is the arccosine of the correlation ρXY.
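Here is one last sketch of that geometry, with the same arbitrary NumPy setup as before: center the samples, apply the dot-product formula for the cosine of the angle, and confirm it reproduces the correlation.

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=100_000)
    Y = 0.5 * X + rng.normal(size=100_000)

    x, y = X - X.mean(), Y - Y.mean()        # centered sample vectors
    cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))

    print(cos_theta)                         # inner-product formula for cos(theta)
    print(np.corrcoef(X, Y)[0, 1])           # equals the correlation rho_XY
    print(np.degrees(np.arccos(cos_theta)))  # the angle theta, in degrees

Uncorrelated variables come out at right angles (θ = 90°), while perfectly correlated ones are parallel (θ = 0°).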
Math 217 Home Page at http://math.clarku.edu/~djoyce/ma217/