
STM2PM

Lecture 29

Covariance

29.1
Covariance

Definition: The covariance of two random variables X and Y , with respective means µX and µY , is

Cov(X, Y ) := E((X − µX )(Y − µY ))


or alternatively

Cov(X, Y ) = E(XY ) − E(X)E(Y ).
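As a quick sketch, the two formulas can be checked against each other on a small joint pmf. The distribution below is invented purely for illustration; exact fractions keep the identity exact.

```python
from fractions import Fraction as F

# A small hypothetical joint pmf over (x, y) pairs, chosen only for illustration.
pmf = {(0, 0): F(1, 4), (0, 1): F(1, 4), (1, 0): F(1, 8), (1, 1): F(3, 8)}

def E(g):
    """Expected value of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in pmf.items())

mu_x = E(lambda x, y: x)          # E(X)
mu_y = E(lambda x, y: y)          # E(Y)

# Definition: E((X - mu_X)(Y - mu_Y))
cov_def = E(lambda x, y: (x - mu_x) * (y - mu_y))
# Alternative form: E(XY) - E(X)E(Y)
cov_alt = E(lambda x, y: x * y) - mu_x * mu_y

print(cov_def, cov_alt)  # both print 1/16
```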


(In the last lecture we showed that
E((X − µX )(Y − µY )) = E(XY ) − E(X)E(Y ).)

29.2
Properties of Covariance

Property: If the variables X, Y are independent, then

Cov(X, Y ) = E(XY ) − E(X)E(Y ) = 0.


(This property was proved in Lecture 28.)

Remark: The converse statement is not true. In other words, there are random variables which have zero covariance, but which are not independent.
Example: Let Z ∼ N (0, 1) and define W = Z², which is clearly dependent on Z. Nevertheless, by the symmetry of the standard normal, E(Z) = 0 and E(Z³) = 0, so
Cov(Z, W) = E(Z · Z²) − E(Z)E(Z²) = E(Z³) − 0 = 0.
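A Monte Carlo sketch of this example (sample size and seed chosen arbitrarily): the sample covariance of Z and W = Z² is close to zero, even though W is completely determined by Z.

```python
import random

random.seed(0)
n = 100_000
z = [random.gauss(0.0, 1.0) for _ in range(n)]   # Z ~ N(0, 1)
w = [zi * zi for zi in z]                        # W = Z^2, a function of Z

mean_z = sum(z) / n
mean_w = sum(w) / n
# Sample covariance: average of (Z - mean Z)(W - mean W)
cov_zw = sum((zi - mean_z) * (wi - mean_w) for zi, wi in zip(z, w)) / n

print(cov_zw)  # close to 0, despite the perfect dependence
```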

29.3
Properties of Covariance, Discussion

Covariance is a measure of the association or joint variation of two variables. It generalises variance, because

Cov(X, X) = E((X − µX )²) = Var(X).

The covariance will be positive if, when X is above its mean, Y tends to be above its mean, and when X is below its mean, Y tends to be below its mean.

On the other hand, if when X is above its mean, Y tends to be below its mean, and when X is below its mean, Y tends to be above its mean, the covariance will be negative.

For independent variables, a given deviation X − µX tells us nothing about Y : there is no systematic tendency for Y to sit above or below its mean, so the products (X − µX )(Y − µY ) average out to exactly zero.

29.4
Example, Discrete case

Exercise: A fair coin is tossed three times. Let X be the number of heads on the first toss, and let Y be the total number of heads (in the three tosses). The joint probability mass function is below, with µX = 1/2 and µY = 3/2.

                  y
           0     1     2     3
   x  0   1/8   2/8   1/8    0     1/2
      1    0    1/8   2/8   1/8    1/2

          1/8   3/8   3/8   1/8

Calculate the covariance.
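As a sketch of the calculation, the covariance can be read off the table with exact fractions (Python is used here purely as a calculator):

```python
from fractions import Fraction as F

# Joint pmf from the table: x = heads on first toss, y = total heads.
pmf = {
    (0, 0): F(1, 8), (0, 1): F(2, 8), (0, 2): F(1, 8), (0, 3): F(0),
    (1, 0): F(0),    (1, 1): F(1, 8), (1, 2): F(2, 8), (1, 3): F(1, 8),
}

# E(XY): only cells with x = 1 and y > 0 contribute.
E_xy = sum(p * x * y for (x, y), p in pmf.items())
mu_x, mu_y = F(1, 2), F(3, 2)     # means given in the exercise

cov = E_xy - mu_x * mu_y          # Cov(X, Y) = E(XY) - E(X)E(Y)
print(E_xy, cov)  # E(XY) = 1, Cov(X, Y) = 1/4
```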

29.5
Example, Continuous case

Example: For the joint probability density function

f (x, y) = 2 − 2x − 2y + 4xy,   x ∈ [0, 1], y ∈ [0, 1]

• Find the marginal probabilities.

• Find the mean of X and Y .

• Find the variance of X and Y .

• Determine whether X and Y are independent.

• Find the covariance.
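The integrals are left for the working space; as a cross-check only, a midpoint-rule numerical integration (an illustration, not the analytic method expected in the exercise) approximates the mean, variance and covariance:

```python
# Midpoint-rule approximation of double integrals over the unit square.
n = 200
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]   # midpoints of a uniform grid

def f(x, y):
    return 2 - 2 * x - 2 * y + 4 * x * y  # the joint density on [0,1]^2

def E(g):
    """E(g(X, Y)) as a double Riemann (midpoint) sum of g * f."""
    return sum(g(x, y) * f(x, y) for x in pts for y in pts) * h * h

mean_x = E(lambda x, y: x)
mean_y = E(lambda x, y: y)
var_x = E(lambda x, y: (x - mean_x) ** 2)
cov = E(lambda x, y: x * y) - mean_x * mean_y

print(mean_x, var_x, cov)  # ≈ 0.5, ≈ 1/12, ≈ 1/36
```

Since the covariance is nonzero, X and Y cannot be independent, even though both marginals turn out to be uniform on [0, 1].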

29.6
Working space for example

Reminder: The expected value E is linear (see Lecture 22), i.e.
• E(kX) = kE(X)
• E(X1 + X2) = E(X1) + E(X2)
• E(c) = c
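A tiny sketch of these rules on an invented one-variable pmf (the distribution and constants are arbitrary); the second rule is checked in the special case where one summand is a constant:

```python
from fractions import Fraction as F

# A small hypothetical pmf for X, chosen only for illustration.
pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

def E(g):
    return sum(p * g(x) for x, p in pmf.items())

k, c = 3, F(7)
assert E(lambda x: k * x) == k * E(lambda x: x)   # E(kX) = kE(X)
assert E(lambda x: x + c) == E(lambda x: x) + c   # additivity, with X2 = c
assert E(lambda x: c) == c                        # E(c) = c
print(E(lambda x: x))  # 1
```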

29.7
Working space for example

29.8
More Properties of Covariance

• Cov(X, X) = Var(X).

• Cov(Y, X) = Cov(X, Y ).

• Cov(a, X) = 0 for any constant a.

• Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z).

• Cov(aX, bY ) = abCov(X, Y ).
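Before the proofs, a numerical sketch: each property can be verified exactly on a small joint pmf (invented for illustration), with random variables represented as functions of the outcome.

```python
from fractions import Fraction as F

# Hypothetical joint pmf for (X, Y), chosen only for illustration.
pmf = {(0, 0): F(1, 4), (0, 1): F(1, 4), (1, 0): F(1, 8), (1, 1): F(3, 8)}

def E(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

def cov(g1, g2):
    """Cov of two random variables given as functions of the outcome."""
    return E(lambda x, y: g1(x, y) * g2(x, y)) - E(g1) * E(g2)

X = lambda x, y: x
Y = lambda x, y: y
var_x = E(lambda x, y: (x - E(X)) ** 2)

assert cov(X, X) == var_x                                  # Cov(X, X) = Var(X)
assert cov(X, Y) == cov(Y, X)                              # symmetry
assert cov(lambda x, y: 5, X) == 0                         # Cov(a, X) = 0
assert cov(lambda x, y: x + y, X) == cov(X, X) + cov(Y, X) # additivity
assert cov(lambda x, y: 2 * x, lambda x, y: 3 * y) == 6 * cov(X, Y)
print(cov(X, Y))  # 1/16
```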

Proofs:

29.9
Working space

29.10
More properties of Covariance

Exercise: Using the results above, what is Cov(aW + bX, cY + dZ)?
Cov(aW + bX, cY + dZ) =

29.11
Relationship between Variance of (X+Y)
and Covariance.

In the last lecture, for independent variables we found that

Var(X + Y ) = Var(X) + Var(Y ).

Now we extend this to the general case, without assuming independence.

Result: For any two random variables X and Y ,

Var(X + Y ) = Var(X) + Var(Y ) + 2 Cov(X, Y )
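As a sketch, the identity can be verified exactly on the coin example of 29.4 (X = heads on the first toss, Y = total heads), where Var(X) = 1/4, Var(Y) = 3/4 and Cov(X, Y) = 1/4:

```python
from fractions import Fraction as F

# Joint pmf of the coin example (zero-probability cells omitted).
pmf = {
    (0, 0): F(1, 8), (0, 1): F(2, 8), (0, 2): F(1, 8),
    (1, 1): F(1, 8), (1, 2): F(2, 8), (1, 3): F(1, 8),
}

def E(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

def var(g):
    return E(lambda x, y: g(x, y) ** 2) - E(g) ** 2

X = lambda x, y: x
Y = lambda x, y: y
S = lambda x, y: x + y          # S = X + Y
cov = E(lambda x, y: x * y) - E(X) * E(Y)

print(var(S), var(X) + var(Y) + 2 * cov)  # both equal 3/2
```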

29.12
