
3.0 Discrete Random Variables

EEE251 Probability Methods in Engineering


Lecture 7

Instructor: Bakhtiar Ali

Department of Electrical Engineering.


COMSATS Institute of Information Technology, Islamabad



3.3 EXPECTED VALUE AND MOMENTS

3.3 Expectation

In order to completely describe the behavior of a discrete random variable, an entire function, namely the pmf $p_X(x)$, must be given.

In some situations we are interested in a few parameters that summarize the information provided by the pmf. For example, consider the results of many repetitions of an experiment that produces two random variables: the random variable Y varies about the value 0, whereas the random variable X varies around the value 5. It is also clear that X is more spread out than Y. In this section we introduce parameters that quantify these properties.

The expected value or mean of a discrete random variable X is defined by
$$m_X = E[X] = \sum_{x \in S_X} x\,p_X(x) = \sum_k x_k\,p_X(x_k)$$
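As a quick illustration (not part of the lecture), this definition can be evaluated numerically when the pmf is stored as a dictionary of value-probability pairs; the pmf below is a hypothetical example.

```python
# Minimal sketch: E[X] computed directly from a pmf stored as {value: probability}.

def expected_value(pmf):
    """Return E[X] = sum over x of x * p_X(x)."""
    return sum(x * p for x, p in pmf.items())

# Hypothetical pmf chosen only for illustration.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}
print(expected_value(pmf))  # approximately 2.1
```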

Example
Find the expected value of the Bernoulli random variable $I_A$.
Solution
$$E[I_A] = 0\,p_I(0) + 1\,p_I(1) = 0(1-p) + 1(p) = p,$$
where p is the probability of success in the Bernoulli trial.
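As a hedged side check (not from the slides), the result E[I_A] = p can also be seen by simulation; the value p = 0.3 below is an arbitrary choice.

```python
import random

# Estimate E[I_A] for a Bernoulli indicator by averaging simulated outcomes.
p = 0.3          # arbitrary success probability for illustration
n = 100_000      # number of simulated trials
samples = [1 if random.random() < p else 0 for _ in range(n)]
print(sum(samples) / n)  # close to p = 0.3
```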


Example
Let X be the number of heads in three tosses of a fair coin. Find E[X].

Sol.
$$E[X] = \sum_{k=0}^{3} k\,p_X(k) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5$$
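A short numerical check of this example (illustrative sketch, using Python's math.comb for the binomial counts):

```python
from math import comb

# pmf of the number of heads in three fair coin tosses: p_X(k) = C(3, k) / 8.
pmf = {k: comb(3, k) / 8 for k in range(4)}   # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
print(sum(k * p for k, p in pmf.items()))     # 1.5
```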

Example
Let X be a uniform random variable from 0 to M − 1. Find E[X].

Sol.
We have $p_X(j) = 1/M$ for $j = 0, 1, \ldots, M-1$, so
$$E[X] = \sum_{k=0}^{M-1} k\,\frac{1}{M} = \frac{1}{M}(0 + 1 + \cdots + (M-1)) = \frac{(M-1)M}{2M} = \frac{M-1}{2},$$
where we have used the fact that $1 + 2 + \cdots + M = \frac{M(M+1)}{2}$.
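A quick check of the closed form (illustrative only; M = 10 is an arbitrary choice):

```python
# For X uniform on {0, 1, ..., M-1}, the direct sum should equal (M - 1) / 2.
M = 10
direct = sum(k / M for k in range(M))
print(direct, (M - 1) / 2)  # both 4.5
```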


Example
A player at a fair pays $1.50 to toss a coin three times. The player receives $1
if the number of heads is 2, $8 if the number is 3, but nothing otherwise. Find
the expected value of the reward Y .
What is the expected value of the gain?

Sol.
The expected reward is
$$E[Y] = 0\,p_Y(0) + 1\,p_Y(1) + 8\,p_Y(8) = 0(4/8) + 1(3/8) + 8(1/8) = 11/8.$$
The expected value of the gain is
$$E[Y - 1.5] = \frac{11}{8} - \frac{12}{8} = -\frac{1}{8}.$$
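The same numbers can be checked with a short sketch (illustrative only):

```python
# Reward pmf from the example and the resulting expected reward and gain.
reward_pmf = {0: 4/8, 1: 3/8, 8: 1/8}
expected_reward = sum(y * p for y, p in reward_pmf.items())
print(expected_reward)        # 1.375 = 11/8
print(expected_reward - 1.5)  # -0.125 = -1/8 (expected gain)
```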


3.3.1 Expected Value of Functions of a Random Variable

Let X be a discrete random variable, and let Z = g(X). Since X is discrete, Z = g(X) assumes a countable set of values of the form $g(x_k)$, where $x_k \in S_X$. The direct way to find the expected value of Z requires that we first find the pmf of Z. Another way is to use
$$E[Z] = \sum_k g(x_k)\,p_X(x_k).$$
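A minimal sketch of this formula (not from the lecture; the function name and the pmf used below are hypothetical):

```python
# E[g(X)] computed directly from the pmf of X, without deriving the pmf of Z = g(X).

def expected_of_function(g, pmf):
    """Return E[g(X)] = sum over k of g(x_k) * p_X(x_k)."""
    return sum(g(x) * p for x, p in pmf.items())

# Hypothetical usage: g(x) = 2x + 1 on an arbitrary two-point pmf.
print(expected_of_function(lambda x: 2 * x + 1, {0: 0.5, 1: 0.5}))  # 2.0
```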

Example (Square Law Device)


Let X be a noise voltage that is uniformly distributed in $S_X = \{-3, -1, +1, +3\}$ with $p_X(k) = 1/4$ for $k \in S_X$. Find E[Z] where $Z = X^2$.

Sol.
$$E[Z] = E[X^2] = \sum_k k^2\,p_X(k) = \frac{1}{4}\{(-3)^2 + (-1)^2 + (1)^2 + (3)^2\} = \frac{20}{4} = 5.$$
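A one-line numerical check of this result (illustrative only):

```python
# E[X^2] for X uniform on {-3, -1, +1, +3} with p_X(k) = 1/4.
noise_pmf = {-3: 1/4, -1: 1/4, 1: 1/4, 3: 1/4}
print(sum(x**2 * p for x, p in noise_pmf.items()))  # 5.0
```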


Example (Square Law Device)


The noise voltage X in the previous example is amplified and shifted to obtain $Y = 2X + 10$, and then squared to produce $Z = Y^2 = (2X + 10)^2$. Find E[Z].
Sol.
$$E[Z] = E[(2X + 10)^2] = E[4X^2 + 40X + 100] = 4E[X^2] + 40E[X] + 100 = 4(5) + 40(0) + 100 = 120.$$
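As a check (illustrative sketch), the direct computation and the expansion agree:

```python
# E[(2X + 10)^2] computed directly and via 4*E[X^2] + 40*E[X] + 100.
noise_pmf = {-3: 1/4, -1: 1/4, 1: 1/4, 3: 1/4}
direct = sum((2 * x + 10)**2 * p for x, p in noise_pmf.items())
ex2 = sum(x**2 * p for x, p in noise_pmf.items())   # E[X^2] = 5
ex = sum(x * p for x, p in noise_pmf.items())       # E[X]   = 0
print(direct, 4 * ex2 + 40 * ex + 100)              # both 120.0
```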


3.3.2 Variance of a Random Variable

The expected value E[X], by itself, provides only limited information about X. For example, if we know that E[X] = 0, it could be that X is zero all the time; however, it is also possible that X takes on extremely large positive and negative values. We are therefore interested not only in the mean of a random variable, but also in the extent of the random variable's variation about its mean.

The variance of the random variable X is defined as the expected value of $(X - E[X])^2$:
$$\sigma_X^2 = \mathrm{VAR}[X] = E[(X - E[X])^2]$$
The standard deviation of the random variable X is
$$\sigma_X = \mathrm{STD}[X] = \mathrm{VAR}[X]^{1/2}$$
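A minimal sketch of these definitions (not from the lecture; the pmf below is hypothetical):

```python
from math import sqrt

# Variance and standard deviation computed directly from a pmf dictionary.
def var_std(pmf):
    mean = sum(x * p for x, p in pmf.items())
    var = sum((x - mean)**2 * p for x, p in pmf.items())
    return var, sqrt(var)

print(var_std({0: 0.5, 10: 0.5}))  # (25.0, 5.0): spread about the mean of 5
```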

An alternative expression for the variance can be obtained as follows:
$$\mathrm{VAR}[X] = E[(X - m_X)^2] = E[X^2 - 2m_X X + m_X^2] = E[X^2] - 2m_X E[X] + m_X^2 = E[X^2] - m_X^2$$
$E[X^2]$ is called the second moment of X. The $n$th moment of X is defined as $E[X^n]$.

Adding a constant to a random variable does not affect the variance:
$$\mathrm{VAR}[X + c] = E[(X + c - (E[X] + c))^2] = E[(X - E[X])^2] = \mathrm{VAR}[X]$$
Scaling a random variable by c scales the variance by $c^2$ and the standard deviation by |c|:
$$\mathrm{VAR}[cX] = E[(cX - cE[X])^2] = E[c^2(X - E[X])^2] = c^2\,\mathrm{VAR}[X]$$
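Both properties can be illustrated numerically (sketch only; the pmf and the constant c below are arbitrary choices):

```python
# Shift and scale properties of the variance.

def var(pmf):
    m = sum(x * p for x, p in pmf.items())
    return sum((x - m)**2 * p for x, p in pmf.items())

pmf = {0: 0.5, 1: 0.3, 4: 0.2}                 # arbitrary pmf for illustration
c = 3
shifted = {x + c: p for x, p in pmf.items()}   # pmf of X + c
scaled = {c * x: p for x, p in pmf.items()}    # pmf of cX
print(var(pmf), var(shifted))                  # equal: VAR[X + c] = VAR[X]
print(var(scaled), c**2 * var(pmf))            # equal: VAR[cX] = c^2 * VAR[X]
```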

Example
Let X be the number of heads in three tosses of a fair coin. Find VAR[X].
Sol.
$$E[X^2] = 0^2\left(\frac{1}{8}\right) + 1^2\left(\frac{3}{8}\right) + 2^2\left(\frac{3}{8}\right) + 3^2\left(\frac{1}{8}\right) = 3$$
$$\mathrm{VAR}[X] = E[X^2] - m_X^2 = 3 - 1.5^2 = 0.75$$
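A short check of this example, computing the variance both ways (illustrative only):

```python
from math import comb

# VAR[X] for the number of heads in three fair tosses, via E[X^2] - m_X^2
# and via the defining expectation E[(X - m_X)^2].
pmf = {k: comb(3, k) / 8 for k in range(4)}
m = sum(x * p for x, p in pmf.items())                 # 1.5
second_moment = sum(x**2 * p for x, p in pmf.items())  # 3.0
print(second_moment - m**2)                            # 0.75
print(sum((x - m)**2 * p for x, p in pmf.items()))     # 0.75
```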

Example (Variance of Bernoulli Random Variable)


Find the variance of the Bernoulli random variable $I_A$.
Sol.
$$E[I_A^2] = 0^2\,p_I(0) + 1^2\,p_I(1) = p$$
$$\mathrm{VAR}[I_A] = p - p^2 = p(1 - p) = pq$$
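A trivial numerical check (p = 0.3 is an arbitrary choice, not from the slides):

```python
# VAR[I_A] = E[I_A^2] - p^2 = p - p^2 = p * q.
p = 0.3
q = 1 - p
second_moment = 0**2 * q + 1**2 * p
print(second_moment - p**2, p * q)  # both approximately 0.21
```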
