NPTEL: Course on
STRUCTURAL RELIABILITY
Module # 02
Lecture 4
Instructor:
Dr. Arunasis Chakraborty
Department of Civil Engineering
Indian Institute of Technology Guwahati
4. Lecture 04: Algebra of Variance
Algebra of Expectation
In statistics, one usually needs to evaluate the first two moments of a random variable in order to represent it. These moments are the expected value (or mean) and the variance. The expected value of a random variable is the average of the data from an infinite random process. In other words, the expected value is the weighted average of the random variable, where the weight is the probability of occurrence. Hence, one can define the expected value $E[X]$ as
$$E[X] = \sum_{\text{all } x_i} x_i \, p_X(x_i) \tag{2.4.1}$$

$$E[X] = \int_{-\infty}^{\infty} x \, f_X(x) \, dx \tag{2.4.2}$$
where $f_X(x)$ is the probability density function of the continuous random variable $X$, and Eq. 2.4.1 applies to a discrete random variable with probability mass function $p_X(x_i)$. A few important properties of expectation, useful for this course, are listed below.
$$E[aX] = a\,E[X] \tag{2.4.3}$$

$$E[X + a] = E[X] + a \tag{2.4.4}$$

$$E[X + Y] = E[X] + E[Y] \tag{2.4.5}$$

$$E[XY] = E[X]\,E[Y] \tag{2.4.6}$$
These expressions are applicable to any random variables ($X$ and $Y$) and constant ($a$), except that the property shown in Eq. 2.4.6 is valid only when the random variables $X$ and $Y$ are independent of each other. Similar to Eqs. 2.4.1 and 2.4.2, the expected value of a function of a random variable, say $g(X)$, is defined as
$$E[g(X)] = \sum_{\text{all } x_i} g(x_i)\, p_X(x_i) \tag{2.4.7}$$

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx \tag{2.4.8}$$
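As a quick numerical illustration of Eqs. 2.4.1 and 2.4.7, the short Python sketch below (with an assumed probability mass function, not one from the lecture) evaluates $E[X]$ and $E[g(X)]$ for $g(x) = x^2$:

```python
# Sketch: discrete expectation per Eqs. 2.4.1 and 2.4.7.
# The PMF below is an assumed example, not taken from the lecture.
x = [1, 2, 3, 4]
p = [0.1, 0.2, 0.3, 0.4]  # probabilities, must sum to 1

E_X = sum(xi * pi for xi, pi in zip(x, p))       # Eq. 2.4.1
E_gX = sum(xi**2 * pi for xi, pi in zip(x, p))   # Eq. 2.4.7 with g(x) = x^2

print(E_X)   # 3.0
print(E_gX)  # 10.0
```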
By the linearity of expectation, the expected value of a sum of functions of the random variable $X$, say $g_1(X)$ and $g_2(X)$, is given by Eq. 2.4.9.

$$E[g_1(X) + g_2(X)] = E[g_1(X)] + E[g_2(X)] \tag{2.4.9}$$

All the definitions and properties discussed above are relevant for independent random variables and unconditional expectation.
Further, the conditional expectation of a random variable $X_1$ with respect to a random variable $X_2$ can be defined as shown in Eq. 2.4.10 (for discrete random variables) and Eq. 2.4.11 (for continuous random variables).

$$E[X_1 \mid X_2 = x_2] = \sum_{\text{all } x_{1i}} x_{1i}\, p_{X_1 \mid X_2}(x_{1i} \mid x_2) \tag{2.4.10}$$

$$E[X_1 \mid X_2 = x_2] = \int_{-\infty}^{\infty} x_1\, f_{X_1 \mid X_2}(x_1 \mid x_2)\, dx_1 \tag{2.4.11}$$
If the two random variables are independent, then the conditional expectation reduces to the unconditional expectation (see Eq. 2.4.12).

$$E[X_1 \mid X_2 = x_2] = E[X_1] \tag{2.4.12}$$
The unconditional expectation can be recovered by summing the product of the conditional expectation and the probability of occurrence (in the case of a discrete random variable, Eq. 2.4.13) or by integrating against the probability density function (in the case of a continuous random variable, Eq. 2.4.14). The expectation evaluated in this way is also known as the expectation of the marginal distribution of $X_1$.

$$E[X_1] = \sum_{\text{all } x_{2i}} E[X_1 \mid X_2 = x_{2i}]\, p_{X_2}(x_{2i}) \tag{2.4.13}$$

$$E[X_1] = \int_{-\infty}^{\infty} E[X_1 \mid X_2 = x_2]\, f_{X_2}(x_2)\, dx_2 \tag{2.4.14}$$

In compact form, this is the law of total expectation:

$$E[X_1] = E\big[E[X_1 \mid X_2 = x_2]\big] \tag{2.4.15}$$
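The law of total expectation in Eqs. 2.4.13 and 2.4.15 can be verified numerically. The sketch below uses an assumed joint probability mass function (not from the lecture) for two discrete random variables:

```python
# Sketch: law of total expectation, E[X1] = E[E[X1 | X2]] (Eqs. 2.4.13, 2.4.15).
# The joint PMF below is an assumed example.
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}  # p(x1, x2)

# direct (marginal) expectation of X1
E_X1 = sum(x1 * p for (x1, x2), p in joint.items())

# via conditioning on X2, Eq. 2.4.13
E_total = 0.0
for v2 in {x2 for (x1, x2) in joint}:
    p_x2 = sum(p for (x1, x2), p in joint.items() if x2 == v2)
    E_cond = sum(x1 * p for (x1, x2), p in joint.items() if x2 == v2) / p_x2
    E_total += E_cond * p_x2

print(E_X1, E_total)  # both 0.7
```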
Now, let us discuss the second moment of a probability distribution. It is usually expressed as a central moment (i.e., a moment with respect to the mean) and is known as the variance. It is defined as the second central moment of the probability distribution, as expressed in Eq. 2.4.16 for a discrete random variable and Eq. 2.4.17 for a continuous random variable. It measures the spread or dispersion of the data around the mean value (or expected value).
$$\mathrm{Var}[X] = \sum_{\text{all } x_i} (x_i - \mu)^2\, p_X(x_i) \tag{2.4.16}$$

$$\mathrm{Var}[X] = \int_{-\infty}^{\infty} (x - \mu)^2\, f_X(x)\, dx \tag{2.4.17}$$
where $\mu$ is the mean or expected value of the random variable $X$, i.e. $E[X]$. Statistically, the variance can also be expressed as the expected value of the squared deviation from the mean, as shown in Eq. 2.4.18.
$$\mathrm{Var}[X] = E\big[(X - \mu)^2\big] \tag{2.4.18}$$
This expression can be rearranged to evaluate the variance in terms of the mean and the expected value of the squared random variable, as shown below.

$$\mathrm{Var}[X] = E[X^2] - \mu^2 \tag{2.4.19}$$

$$E[X^2] = \sigma^2 + \mu^2 \tag{2.4.20}$$
where $\sigma$ denotes the standard deviation, which is the square root of the variance. Eq. 2.4.20 gives the expectation of the squared random variable in terms of the mean and variance. Again, a few important properties of the variance are listed below.
$$\mathrm{Var}[a] = 0, \qquad \mathrm{Var}[aX] = a^2\, \mathrm{Var}[X], \qquad \mathrm{Var}[aX + b] = a^2\, \mathrm{Var}[X] \tag{2.4.21}$$

where $a$ and $b$ are constants. The conditional variance of a random variable $X_1$ with respect to another random variable $X_2$ is defined as
$$\mathrm{Var}[X_1 \mid X_2 = x_2] = E\big[(X_1 - \mu_{X_1 \mid X_2})^2 \,\big|\, X_2 = x_2\big]$$

$$\mathrm{Var}[X_1 \mid X_2 = x_2] = \sum_{\text{all } x_{1i}} (x_{1i} - \mu_{X_1 \mid X_2})^2\, p_{X_1 \mid X_2}(x_{1i} \mid x_2) \tag{2.4.22}$$

$$\mathrm{Var}[X_1 \mid X_2 = x_2] = \int_{-\infty}^{\infty} (x_1 - \mu_{X_1 \mid X_2})^2\, f_{X_1 \mid X_2}(x_1 \mid x_2)\, dx_1 \tag{2.4.23}$$
where Eq. 2.4.22 is for a discrete random variable and Eq. 2.4.23 is for a continuous random variable.
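Before moving to jointly distributed variables, the identities in Eqs. 2.4.18-2.4.21 can be checked numerically. The sketch below uses assumed normal samples (mean 5, standard deviation 2), which are not part of the lecture:

```python
# Sketch: checks Var[X] = E[(X - mu)^2] = E[X^2] - mu^2 (Eqs. 2.4.18, 2.4.19)
# and Var[aX] = a^2 Var[X] (Eq. 2.4.21) on assumed samples.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # assumed mu = 5, sigma = 2

var_def = np.mean((x - x.mean())**2)          # Eq. 2.4.18
var_alt = np.mean(x**2) - x.mean()**2         # Eq. 2.4.19

a = 3.0
print(var_def, var_alt)                       # both close to sigma^2 = 4
print(np.var(a * x), a**2 * np.var(x))        # Eq. 2.4.21: (nearly) equal
```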
Expectation and variance also apply to jointly distributed random variables $X_1$ and $X_2$. For this case, let us consider a function $g(X_1, X_2)$ of the two random variables, denoted by $Z$. The expectation of the function $g(X_1, X_2)$ is given by
$$E[Z] = E[g(X_1, X_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_1, x_2)\, f_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2 \tag{2.4.24}$$
In Eq. 2.4.24, note that $f_{X_1 X_2}(x_1, x_2)$ is the joint probability density function of the two continuous random variables $X_1$ and $X_2$. Moreover, higher-order moments can be written in a similar manner. Eq. 2.4.25 shows an elementary formulation of the joint moment of order $m + n$.
$$E[X_1^m X_2^n] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1^m x_2^n\, f_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2 \tag{2.4.25}$$

The corresponding joint central moment of order $m + n$ is

$$c_{mn} = E\big[(X_1 - \mu_1)^m (X_2 - \mu_2)^n\big] \tag{2.4.26}$$
Similarly, one can recover $\mu_2 = E[X_2]$ by substituting $m = 0$ and $n = 1$ in Eq. 2.4.25. The variance of jointly distributed random variables, better known as the covariance, is discussed next.
By definition, the covariance is the second central moment of jointly distributed random variables. It is obtained by substituting $m = 1$ and $n = 1$ in Eq. 2.4.26:
$$\mathrm{Cov}[X_1, X_2] = \sigma_{X_1 X_2} = E\big[(X_1 - \mu_1)(X_2 - \mu_2)\big] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x_1 - \mu_1)(x_2 - \mu_2)\, f_{X_1 X_2}(x_1, x_2)\, dx_1\, dx_2 \tag{2.4.28}$$
Expanding Eq. 2.4.28, the covariance can also be formulated in terms of the individual and combined expectations of the random variables:

$$\mathrm{Cov}[X_1, X_2] = E[X_1 X_2] - E[X_1]\,E[X_2] \tag{2.4.29}$$

For representing covariance, in this course and elsewhere, one often requires the coefficient of correlation ($\rho$). It is defined as the covariance normalized by the individual standard deviations (square roots of the variances) of the corresponding random variables. This is expressed as
$$\rho_{X_1 X_2} = \frac{\mathrm{Cov}[X_1, X_2]}{\sigma_{X_1} \sigma_{X_2}} \tag{2.4.30}$$
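A numerical illustration of Eqs. 2.4.28-2.4.30 is sketched below. The two samples are constructed (as an assumption for the example) so that their true correlation coefficient is 0.6:

```python
# Sketch: covariance (Eqs. 2.4.28, 2.4.29) and correlation (Eq. 2.4.30)
# from assumed correlated samples with true rho = 0.6.
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=100_000)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=100_000)  # Var[x2] = 0.36 + 0.64 = 1

cov = np.mean((x1 - x1.mean()) * (x2 - x2.mean()))   # Eq. 2.4.28
cov_alt = np.mean(x1 * x2) - x1.mean() * x2.mean()   # Eq. 2.4.29
rho = cov / (x1.std() * x2.std())                    # Eq. 2.4.30

print(cov, cov_alt, rho)  # all close to 0.6
```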
In the context of this course, the algebra of variance discussed above is chiefly used for evaluating the first two moments of an arbitrary function of random variables. This section discusses how to evaluate the expectation and variance of such functions. Let us assume a function $g(X_1, X_2, \ldots, X_n)$, where $X = \{X_1, X_2, \ldots, X_n\}$ is a set of random variables, and denote it by $Z$. Now, the expectation of the function can be evaluated either by averaging infinitely many responses generated from an infinite sample of $X$ (which is practically infeasible) or, approximately, by substituting the mean values of the random variables into the function itself. The latter gives the expected value of the function $Z$, as expressed below:
$$E[Z] = \mu_Z \simeq g(\mu_1, \mu_2, \ldots, \mu_n) \tag{2.4.31}$$
The variance of the function is estimated by evaluating the derivatives of the function with respect to the random variables, i.e. $\partial g / \partial X_i$, at the mean values of the random variables, as expressed in Eq. 2.4.32 and Eq. 2.4.33 for uncorrelated and correlated random variables, respectively.
$$\mathrm{Var}[Z] = \sigma_Z^2 \simeq \sum_{i=1}^{n} \left( \frac{\partial g}{\partial X_i} \bigg|_{\mu} \right)^2 \mathrm{Var}[X_i] \tag{2.4.32}$$

$$\mathrm{Var}[Z] = \sigma_Z^2 \simeq \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{\partial g}{\partial X_i} \bigg|_{\mu} \frac{\partial g}{\partial X_j} \bigg|_{\mu} \mathrm{Cov}[X_i, X_j] \tag{2.4.33}$$
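The first-order estimates in Eqs. 2.4.31 and 2.4.32 can be compared against Monte Carlo simulation. The sketch below uses an assumed nonlinear function $g(X_1, X_2) = X_1 X_2^2$ with assumed means and standard deviations; none of these values come from the lecture:

```python
# Sketch: first-order mean and variance (Eqs. 2.4.31, 2.4.32) of an assumed
# function g(X1, X2) = X1 * X2^2 of independent normals, vs Monte Carlo.
import numpy as np

mu = np.array([4.0, 2.0])    # assumed means
sig = np.array([0.4, 0.1])   # assumed standard deviations

g = lambda x1, x2: x1 * x2**2

E_Z = g(mu[0], mu[1])                          # Eq. 2.4.31: g at the means
dg = np.array([mu[1]**2, 2 * mu[0] * mu[1]])   # [dg/dX1, dg/dX2] at the means
Var_Z = np.sum(dg**2 * sig**2)                 # Eq. 2.4.32

rng = np.random.default_rng(2)
z = g(rng.normal(mu[0], sig[0], 1_000_000),
      rng.normal(mu[1], sig[1], 1_000_000))
print(E_Z, z.mean())   # 16.0 vs simulated mean (~16.04)
print(Var_Z, z.var())  # 5.12 vs simulated variance
```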
For example, consider a linear function of the random variables

$$Z = g(X) = \sum_{i=1}^{n} b_i X_i \tag{2.4.34}$$

where the $b_i$ are constants. Its expected value is

$$E[Z] = \sum_{i=1}^{n} b_i\, E[X_i] \tag{2.4.35}$$

and its variance is

$$\mathrm{Var}[Z] = \sum_{i=1}^{n} \sum_{j=1}^{n} b_i b_j\, \mathrm{Cov}[X_i, X_j] \tag{2.4.36}$$

In the case of independent random variables, one substitutes $0$ in place of $\mathrm{Cov}[X_i, X_j]$ for $i \neq j$. This comes out to be

$$\mathrm{Var}[Z] = \sum_{i=1}^{n} b_i^2\, \mathrm{Var}[X_i] \tag{2.4.37}$$
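A numerical check of Eqs. 2.4.35 and 2.4.37 is sketched below for an assumed linear combination $Z = 2X_1 + 3X_2$ of independent normals (all values assumed for illustration):

```python
# Sketch: mean and variance of an assumed linear combination Z = 2*X1 + 3*X2
# of independent normals (Eqs. 2.4.35 and 2.4.37).
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.normal(1.0, 0.5, 1_000_000)  # assumed mu1 = 1, sigma1 = 0.5
x2 = rng.normal(2.0, 1.0, 1_000_000)  # assumed mu2 = 2, sigma2 = 1
z = 2 * x1 + 3 * x2

print(z.mean(), 2 * 1.0 + 3 * 2.0)              # Eq. 2.4.35: ~8
print(z.var(), 2**2 * 0.5**2 + 3**2 * 1.0**2)   # Eq. 2.4.37: ~10
```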
Another common case is the product of two random variables,

$$Z = g(X) = X_1 X_2 \tag{2.4.38}$$

where $X$ contains only two random variables $X_1$ and $X_2$. For independent $X_1$ and $X_2$ (see Eq. 2.4.6), the expected value is

$$E[Z] = E[X_1]\, E[X_2] \tag{2.4.39}$$
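Eq. 2.4.39 can likewise be checked on assumed independent samples (an exponential and a uniform variable, chosen only for illustration):

```python
# Sketch: E[X1*X2] = E[X1]*E[X2] for independent variables (Eq. 2.4.39),
# on assumed samples with E[X1] = 2 and E[X2] = 3.
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.exponential(2.0, 1_000_000)    # assumed E[X1] = 2
x2 = rng.uniform(0.0, 6.0, 1_000_000)   # assumed E[X2] = 3

print(np.mean(x1 * x2), x1.mean() * x2.mean())  # both close to 6
```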
Examples
Ex # 01. A discrete random variable $X$ and its probability mass function are given in the table below. Find the expectation $E(X)$.
X     p_X(x)
1     0.10
2     0.30
3     0.30
4     0.15
5     0.15
      Σ = 1.00
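The solution is not worked out in the original notes; applying Eq. 2.4.1 directly gives

$$E(X) = 1(0.10) + 2(0.30) + 3(0.30) + 4(0.15) + 5(0.15) = 2.95$$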
Ex # 02. Simplify the following expressions, where $X$ and $Y$ are random variables.
1. $E[X^2 + Y^2 - 2(X + Y)^2]$
2. $E[X - 10Y] + E[Y - 15] - E[X - 5Y + 3]$
Solu.
1.
$$\begin{aligned}
E[X^2 + Y^2 - 2(X+Y)^2] &= E[X^2 + Y^2 - 2X^2 - 2Y^2 - 4XY] \\
&= E[-X^2 - Y^2 - 4XY] \\
&= -\big(E[X^2] + E[Y^2] + 4\,E[XY]\big) \\
&= -\big(E[X^2] + E[Y^2] + 4\,(E[X]\,E[Y] + \rho_{XY}\,\sigma_X \sigma_Y)\big) \\
&= -\big(E[X^2] + E[Y^2] + 4\,E[X]\,E[Y]\big) \quad (\text{if } X, Y \text{ are independent, } \rho_{XY} = 0)
\end{aligned}$$
2.
$$\begin{aligned}
E[X - 10Y] + E[Y - 15] - E[X - 5Y + 3] &= E[X] - 10E[Y] + E[Y] - 15 - E[X] + 5E[Y] - 3 \\
&= -4E[Y] - 18
\end{aligned}$$
Ex # 03. For a linear function $f(X) = 0.33X_1 + 0.55X_2 - 0.66X_3$ of independent random variables with standard deviations $\sigma_1 = 3$, $\sigma_2 = 2$ and $\sigma_3 = 1.5$, Eq. 2.4.37 gives

$$\mathrm{Var}[f(X)] = (0.33)^2 (3)^2 + (0.55)^2 (2)^2 + (-0.66)^2 (1.5)^2 = 3.1702$$
Ex # 04. Given a set of observations $x_i$, the variance is estimated from the expression shown below. Examine whether this estimator is unbiased.

$$\Gamma_n = \frac{1}{n} \sum_{i=1}^{n} (x_i - m)^2$$

where $m = \frac{1}{n}\sum_{k=1}^{n} x_k$ is the sample mean. Taking the expectation,

$$\begin{aligned}
E[\Gamma_n] &= E\left[\frac{1}{n} \sum_{i=1}^{n} (x_i - m)^2\right] \\
&= E\left[\frac{1}{n} \sum_{i=1}^{n} \left(x_i - \frac{1}{n}\sum_{k=1}^{n} x_k\right)^2\right] \\
&= \frac{1}{n}\, E\left[\sum_{i=1}^{n} \left(x_i^2 - \frac{2}{n}\, x_i \sum_{k=1}^{n} x_k + \frac{1}{n^2} \sum_{k=1}^{n} \sum_{l=1}^{n} x_k x_l\right)\right] \\
&= \frac{1}{n} \sum_{i=1}^{n} E[x_i^2] - \frac{2}{n^2} \sum_{i=1}^{n} \sum_{k=1}^{n} E[x_i x_k] + \frac{1}{n^3} \sum_{i=1}^{n} \sum_{k=1}^{n} \sum_{l=1}^{n} E[x_k x_l] \\
&= \frac{1}{n}\left(n\sigma_x^2 + n\mu_x^2\right) - \frac{2}{n^2}\left(n\sigma_x^2 + n^2\mu_x^2\right) + \frac{1}{n^3}\left(n^2\sigma_x^2 + n^3\mu_x^2\right) \\
&= \frac{n-1}{n}\, \sigma_x^2
\end{aligned}$$

Hence,

$$E[\Gamma_n] = \frac{n-1}{n}\, \sigma_x^2$$

so the estimator is biased by the factor $(n-1)/n$. From this result it can be concluded that the unbiased estimator of the variance is

$$\Gamma_n = \frac{1}{n-1} \sum_{i=1}^{n} (x_i - m)^2$$
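This bias correction (division by $n-1$ instead of $n$) can be confirmed numerically. The sketch below averages both estimators over many assumed normal samples of size $n = 5$ with true variance $\sigma_x^2 = 4$:

```python
# Sketch: the 1/n estimator is biased by (n-1)/n, the 1/(n-1) estimator is
# unbiased; averaged over many assumed normal samples with sigma^2 = 4.
import numpy as np

rng = np.random.default_rng(5)
n, trials, sigma2 = 5, 200_000, 4.0

x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
m = x.mean(axis=1, keepdims=True)
ss = np.sum((x - m)**2, axis=1)

print(np.mean(ss / n), (n - 1) / n * sigma2)  # ~3.2: biased by (n-1)/n
print(np.mean(ss / (n - 1)), sigma2)          # ~4.0: unbiased
```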