Mathematical Expectation or Expected Value
Consider a r.v. X with p.d.f. (p.m.f.) 𝑓(𝑥) and distribution function 𝐹(𝑥).
If g(·) is a function such that g(X) is a r.v. and E[g(X)] exists, then:

1. If g(X) = X^r, then
   E[X^r] = Σ_x x^r p(x),
   which is defined as μ'_r, the rth moment (about the origin) of the
   probability distribution. Thus μ'_r (about origin) = E[X^r].
   In particular,
   μ_2 = μ'_2 − (μ'_1)^2 = E(X^2) − [E(X)]^2
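As a quick numerical check, the relation μ_2 = E(X^2) − [E(X)]^2 can be verified on a small illustrative p.m.f. (the distribution below is an arbitrary example chosen for the sketch, not one from the notes):

```python
# Check mu_2 = E[X^2] - (E[X])^2 on an arbitrary illustrative p.m.f.
pmf = {1: 0.3, 2: 0.5, 4: 0.2}   # probabilities must sum to 1

mean = sum(x * p for x, p in pmf.items())               # mu_1' = E[X]
second_moment = sum(x**2 * p for x, p in pmf.items())   # mu_2' = E[X^2]
variance = sum((x - mean)**2 * p for x, p in pmf.items())  # mu_2, central moment

print(variance, second_moment - mean**2)  # the two expressions agree
```

Computing the central moment directly and via the two raw moments gives the same number, which is exactly the identity above.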
2. If g(X) = [X − E(X)]^r, then
   E[X − E(X)]^r = ∫_{−∞}^{∞} [x − E(X)]^r f(x) dx = ∫_{−∞}^{∞} [x − x̄]^r f(x) dx
3. If g(X) = constant = c, then
   E(c) = ∫_{−∞}^{∞} c f(x) dx = c ∫_{−∞}^{∞} f(x) dx = c
For a two-dimensional r.v. (X, Y),
   E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy   (for continuous r.v.)
Properties of Expectation
1. Addition theorem of expectation: E(X + Y) = E(X) + E(Y)

Proof:
   E(X + Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x + y) f(x, y) dx dy
            = ∫∫ x f(x, y) dx dy + ∫∫ y f(x, y) dx dy
   E(X + Y) = E(X) + E(Y)
2. Multiplication theorem of expectation: if X and Y are independent, then
   E(XY) = E(X) E(Y)

Proof:
   E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
         = ∫∫ xy f(x) f(y) dx dy          [since X and Y are independent]
         = [∫_{−∞}^{∞} x f(x) dx] [∫_{−∞}^{∞} y f(y) dy]
   E(XY) = E(X) E(Y)

More generally, for independent X_1, X_2, …, X_n,
   E(X_1 · X_2 · ⋯ · X_n) = E(X_1) · E(X_2) · ⋯ · E(X_n),
or E(∏_{i=1}^{n} X_i) = ∏_{i=1}^{n} E(X_i)
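The multiplication theorem can be checked numerically for two independent fair dice (an illustrative choice of independent variables):

```python
from itertools import product

# For two independent fair dice, E[XY] should equal E[X] * E[Y].
faces = range(1, 7)
E_X = sum(faces) / 6                                     # E[X] = 3.5
# Independence means the joint p.m.f. factors: P(x, y) = 1/36 per pair.
E_XY = sum(x * y for x, y in product(faces, faces)) / 36
print(E_XY, E_X * E_X)  # both 12.25
```

Summing x·y over the full joint distribution gives the same value as the product of the marginal means, as the theorem asserts.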
3. If X is a random variable and a and b are constants, then
   E(aX + b) = a E(X) + b

Proof:
   E(aX + b) = ∫_{−∞}^{∞} (ax + b) f(x) dx
             = a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx
   E(aX + b) = a E(X) + b
If b = 0, then we get E(aX) = a E(X).
However, note that in general:
   E(1/X) ≠ 1/E(X)
   E(X^{1/2}) ≠ [E(X)]^{1/2}
   E(X^2) ≠ [E(X)]^2
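A fair die is enough to see the first inequality concretely (an illustrative example, since expectation is linear but not multiplicative over nonlinear functions):

```python
# E[1/X] versus 1/E[X] for a fair die: the two are not equal.
faces = range(1, 7)
E_X = sum(faces) / 6                        # E[X] = 7/2
E_recip = sum(1 / x for x in faces) / 6     # E[1/X] = (1 + 1/2 + ... + 1/6)/6
print(E_recip, 1 / E_X)  # about 0.408 versus about 0.286
```

E[1/X] = 49/120 ≈ 0.408 while 1/E[X] = 2/7 ≈ 0.286, so applying a function inside or outside the expectation genuinely matters.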
Variance
Variance = Var(X) = E[X − E(X)]^2 = E[X^2 + (E(X))^2 − 2X E(X)] = E(X^2) − [E(X)]^2
4. If Y = aX + b, where a and b are constants, then V(Y) = a^2 V(X).

Proof: Let Y = aX + b. Then
   E(Y) = E(aX + b) = a E(X) + b
   Y − E(Y) = (aX + b) − (a E(X) + b) = a [X − E(X)]
   E[Y − E(Y)]^2 = a^2 E[X − E(X)]^2
   V(Y) = a^2 V(X)
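A small sketch confirms V(aX + b) = a^2 V(X) on an arbitrary three-point distribution (the p.m.f. and constants below are illustrative only):

```python
# Check V(aX + b) = a^2 V(X) on a hypothetical three-point distribution.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}
a, b = 4, 7

def var(pmf):
    """Variance of a discrete distribution given as {value: probability}."""
    m = sum(x * p for x, p in pmf.items())
    return sum((x - m)**2 * p for x, p in pmf.items())

# Y = aX + b has the same probabilities attached to the shifted/scaled values.
pmf_y = {a * x + b: p for x, p in pmf.items()}
print(var(pmf_y), a**2 * var(pmf))  # equal: the shift b drops out
```

The additive constant b does not appear in the result at all, matching the proof above where it cancels in Y − E(Y).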
Example 1: A random variable X has the probability distribution

   X        :  -3    6    9
   P(X = x) : 1/6  1/2  1/3

Find V(X) and E[(2X + 1)^2].

Solution:
   E(X) = Σ x p(x) = (−3 × 1/6) + (6 × 1/2) + (9 × 1/3) = 11/2
   E(X^2) = Σ x^2 p(x) = ((−3)^2 × 1/6) + (6^2 × 1/2) + (9^2 × 1/3) = 93/2
   V(X) = E(X^2) − [E(X)]^2 = 93/2 − (11/2)^2 = 65/4
   E[(2X + 1)^2] = E(4X^2 + 4X + 1) = 4 E(X^2) + 4 E(X) + 1
                 = 4 × 93/2 + 4 × 11/2 + 1 = 209
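The arithmetic of this example is easy to verify directly:

```python
# Verify the worked example: X in {-3, 6, 9} with P = 1/6, 1/2, 1/3.
pmf = {-3: 1/6, 6: 1/2, 9: 1/3}

E_X  = sum(x * p for x, p in pmf.items())              # 11/2
E_X2 = sum(x * x * p for x, p in pmf.items())          # 93/2
V_X  = E_X2 - E_X**2                                   # 65/4
E_sq = sum((2*x + 1)**2 * p for x, p in pmf.items())   # E[(2X+1)^2]
print(E_X, E_X2, V_X, E_sq)  # 5.5  46.5  16.25  209.0
```

Computing E[(2X + 1)^2] term by term gives the same 209 as the linearity expansion 4 E(X^2) + 4 E(X) + 1.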
Example 2:
Find the expectation of the number on a die when thrown.

Solution:
Let X be the random variable representing the number on a die when thrown.
Then X can take any one of the values 1, 2, 3, 4, 5, 6, each with equal
probability 1/6.

Hence, E(X) = (1/6 × 1) + (1/6 × 2) + (1/6 × 3) + ⋯ + (1/6 × 6)
            = 1/6 (1 + 2 + 3 + ⋯ + 6) = 1/6 × (6 × 7)/2 = 7/2
Example 3:
Two unbiased dice are thrown. Find the expected value of the sum of the
points on them.
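A brute-force check over the joint distribution confirms what the addition theorem predicts, namely E(X + Y) = E(X) + E(Y) = 7/2 + 7/2 = 7:

```python
from itertools import product

# Expected sum of two fair dice, summed over all 36 equally likely outcomes.
faces = range(1, 7)
E_sum = sum(x + y for x, y in product(faces, faces)) / 36
print(E_sum)  # 7.0
```

Enumerating all 36 outcomes is feasible here, but the addition theorem gives the answer without touching the joint distribution at all.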
Co-Variance
If X and Y are two random variables, then the covariance between them is
defined as:
   Cov(X, Y) = E[(X − E(X)) (Y − E(Y))]
             = E[XY − X E(Y) − Y E(X) + E(X) E(Y)]
             = E(XY) − E(X) E(Y) − E(Y) E(X) + E(X) E(Y)
   Cov(X, Y) = E(XY) − E(X) E(Y)
2. Cov(X + a, Y + b) = Cov(X, Y)
3. Cov((X − X̄)/σ_X , (Y − Ȳ)/σ_Y) = (1/(σ_X σ_Y)) Cov(X, Y)
5. V(X_1 ± X_2) = V(X_1) + V(X_2) ± 2 Cov(X_1, X_2)
6. V(Σ_{i=1}^{n} a_i X_i) = Σ_{i=1}^{n} a_i^2 V(X_i) + 2 Σ_{i<j} a_i a_j Cov(X_i, X_j)
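Property 5 can be sketched numerically on a hypothetical 2×2 joint p.m.f. with nonzero covariance (the probabilities below are chosen only for illustration):

```python
# Check V(X1 + X2) = V(X1) + V(X2) + 2 Cov(X1, X2) on a hypothetical joint p.m.f.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    """Expectation of g(x, y) under the joint distribution."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VX  = E(lambda x, y: x * x) - EX**2
VY  = E(lambda x, y: y * y) - EY**2
cov = E(lambda x, y: x * y) - EX * EY
V_sum = E(lambda x, y: (x + y)**2) - E(lambda x, y: x + y)**2
print(V_sum, VX + VY + 2 * cov)  # equal
```

Because the covariance here is positive, V(X_1 + X_2) exceeds V(X_1) + V(X_2); for independent variables the covariance term vanishes.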
Correlation Coefficient
If X and Y are two random variables, the covariance between them is
   Cov(X, Y) = E(XY) − E(X) E(Y),
and the correlation coefficient is this covariance scaled by the standard
deviations:
   ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y)
   Y \ X    -1     0     1
    -1       0    0.1   0.1
     0      0.2   0.2   0.2
     1      0     0.1   0.1
(ii) E(XY) = Σ_i Σ_j x_i y_j p_ij
           = (−1)(−1)(0) + (0)(−1)(0.1) + (1)(−1)(0.1)
           + (−1)(0)(0.2) + (0)(0)(0.2) + (1)(0)(0.2)
           + (−1)(1)(0) + (0)(1)(0.1) + (1)(1)(0.1)
   E(XY) = −0.1 + 0.1 = 0
   ∴ Cov(X, Y) = E(XY) − E(X) E(Y) = 0
   ⟹ X and Y are uncorrelated.

(iii) E(X^2) = Σ x^2 P(X = x_i) = (−1)^2 (0.2) + 0^2 (0.4) + 1^2 (0.4) = 0.6
   V(X) = E(X^2) − [E(X)]^2 = 0.56
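Encoding the 3×3 table above as a dictionary makes the whole example checkable in a few lines:

```python
# Joint p.m.f. from the table, keyed as (x, y); rows of the table are y values.
joint = {(-1, -1): 0,   (0, -1): 0.1, (1, -1): 0.1,
         (-1,  0): 0.2, (0,  0): 0.2, (1,  0): 0.2,
         (-1,  1): 0,   (0,  1): 0.1, (1,  1): 0.1}

E_XY = sum(x * y * p for (x, y), p in joint.items())   # 0
E_X  = sum(x * p for (x, y), p in joint.items())       # marginal mean of X
E_Y  = sum(y * p for (x, y), p in joint.items())       # marginal mean of Y
E_X2 = sum(x * x * p for (x, y), p in joint.items())   # 0.6
cov  = E_XY - E_X * E_Y
print(cov, E_X2 - E_X**2)  # Cov = 0, V(X) = 0.56
```

Summing over the flattened table reproduces Cov(X, Y) = 0 and V(X) = 0.56 exactly as in the hand computation.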
Consider the joint p.d.f. f(x, y) = 2 − x − y, 0 < x < 1, 0 < y < 1.

(i) The marginal density of X is
   f(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (2 − x − y) dy = 3/2 − x
   ∴ f(x) = 3/2 − x,  0 < x < 1
   Similarly, f(y) = 3/2 − y,  0 < y < 1
(ii) E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_0^1 x (3/2 − x) dx = 5/12
   E(Y) = ∫_{−∞}^{∞} y f(y) dy = ∫_0^1 y (3/2 − y) dy = 5/12
   E(X^2) = ∫_{−∞}^{∞} x^2 f(x) dx = ∫_0^1 x^2 (3/2 − x) dx = 1/4
   E(Y^2) = ∫_{−∞}^{∞} y^2 f(y) dy = ∫_0^1 y^2 (3/2 − y) dy = 1/4
   V(X) = E(X^2) − [E(X)]^2 = 1/4 − (5/12)^2 = 11/144 = V(Y)
   E(XY) = ∫_0^1 ∫_0^1 xy (2 − x − y) dx dy = 1/6
   ∴ Cov(X, Y) = E(XY) − E(X) E(Y) = 1/6 − (5/12)^2 = −1/144
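These integrals can be sanity-checked numerically with a midpoint Riemann sum over the unit square (a pure-Python sketch; grid size n is an arbitrary choice):

```python
# Numerical check of the continuous example with f(x, y) = 2 - x - y on (0,1)^2.
n = 400                                    # grid resolution per axis
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]    # midpoints of the n cells per axis

def f(x, y):
    return 2 - x - y                       # joint p.d.f. on the unit square

E_X  = sum(x * f(x, y) for x in pts for y in pts) * h * h      # -> 5/12
E_XY = sum(x * y * f(x, y) for x in pts for y in pts) * h * h  # -> 1/6
cov  = E_XY - E_X * E_X     # E(Y) = E(X) by symmetry; -> -1/144
print(E_X, E_XY, cov)
```

The sums converge to 5/12, 1/6 and −1/144 as n grows, matching the exact integrals, and the symmetry of f in x and y is why E(Y) = E(X) without a second integral.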
Conditional Expectation
If (X, Y) is a two-dimensional discrete random variable with joint probability
mass function P(X = x_i, Y = y_j), then the conditional expectations of g(X, Y)
are defined as:
   E[g(X, Y) | Y = y_j] = Σ_i g(x_i, y_j) × P(X = x_i | Y = y_j)
                        = Σ_i g(x_i, y_j) × P(X = x_i, Y = y_j) / P(Y = y_j)
and
   E[g(X, Y) | X = x_i] = Σ_j g(x_i, y_j) × P(X = x_i, Y = y_j) / P(X = x_i)

If (X, Y) is a two-dimensional continuous random variable with joint
probability density function f(x, y), then
   E[g(X, Y) | Y = y] = ∫_{−∞}^{∞} g(x, y) f(x | y) dx
                      = ∫_{−∞}^{∞} g(x, y) × f(x, y) / f(y) dx
and
   E[g(X, Y) | X = x] = ∫_{−∞}^{∞} g(x, y) × f(x, y) / f(x) dy
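The discrete definition can be sketched on the 3×3 joint p.m.f. used in the earlier covariance example, computing E[X | Y = y] by dividing the row sums into the joint probabilities:

```python
# Conditional expectation E[X | Y = y] for a discrete joint p.m.f.,
# using the 3x3 table from the earlier example (rows are y values).
joint = {(-1, -1): 0,   (0, -1): 0.1, (1, -1): 0.1,
         (-1,  0): 0.2, (0,  0): 0.2, (1,  0): 0.2,
         (-1,  1): 0,   (0,  1): 0.1, (1,  1): 0.1}

def cond_mean_X(y):
    """E[X | Y = y] = sum_i x_i * P(X = x_i, Y = y) / P(Y = y)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)   # marginal P(Y=y)
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y

print(cond_mean_X(-1), cond_mean_X(0), cond_mean_X(1))  # 0.5  0.0  0.5
```

Note that the conditional mean changes with y even though Cov(X, Y) = 0 for this table: uncorrelated does not imply independent.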
Conditional Mean and Variances

Conditional Mean:
   μ_{X|Y} = E(X | Y) = ∫_{−∞}^{∞} x f(x | y) dx
   μ_{Y|X} = E(Y | X) = ∫_{−∞}^{∞} y f(y | x) dy

Conditional Variances:
   σ²_{X|Y} = E[(X − μ_{X|Y})² | Y] = ∫_{−∞}^{∞} (x − μ_{X|Y})² f(x | y) dx
   σ²_{Y|X} = E[(Y − μ_{Y|X})² | X] = ∫_{−∞}^{∞} (y − μ_{Y|X})² f(y | x) dy
Example 6: Let X and Y be two random variables, each taking the three values
−1, 0 and 1, having the joint probability distribution

   Y \ X    -1     0     1
    -1       0    0.1   0.1
     0      0.2   0.2   0.2
     1      0     0.1   0.1