
Handout 3

MA 202 - Probability and Statistics

IIT Ropar, Rupnagar

January 27, 2020

1 Discrete Random Variables

1. Definition of rvs, Probability Mass Functions (PMF)

2. Expectation, Variance

3. Functions of rvs

Definition 1.1 (Random Variable). Given an experiment and the corresponding set of possible outcomes (the sample space), a random variable associates a particular number with each outcome. Mathematically, a random variable, denoted by X, is a real-valued function from Ω to R, i.e., X : Ω → R.

Example 1.1 (Random Variables). 1. In an experiment involving a sequence of 10 tosses of a coin, the number of heads in the sequence is a rv.

2. In a roll of a 6-faced die, the outcome is a rv.

A random variable is called discrete if its range is either finite or countably infinite.

Probability Mass Function (PMF).

Notation: X denotes a random variable and x a realization of the random variable.

[Figure: a random variable as a mapping. The input is an element ω ∈ Ω of the sample space and the output is the realization x = X(ω).]

Example 1.2. Consider two independent tosses of a fair coin and let X be the number of heads obtained. Then Ω = {HH, HT, TH, TT} and x ∈ {0, 1, 2}, i.e., the possible realizations of the rv X are 0, 1 and 2.

Definition 1.2 (PMF). The probability mass of x, denoted by pX(x), is the probability of the event {ω : X(ω) = x}; that is, pX(x) = P({X = x}).

Example 1.3. In Example 1.2,
$$
p_X(x) =
\begin{cases}
1/4, & \text{for } x = 0, 2\\
1/2, & \text{for } x = 1\\
0, & \text{otherwise.}
\end{cases}
$$

Properties of PMF:

(a) $p_X(x) \geq 0$, for all $x$.

(b) $\sum_x p_X(x) = 1$.
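As an illustrative aside (not part of the original handout), here is a minimal Python sketch that builds the PMF of Example 1.2 by enumerating the sample space and checks both properties numerically:

```python
from itertools import product
from collections import Counter

# Sample space for two independent tosses of a fair coin (Example 1.2).
omega = list(product("HT", repeat=2))        # [('H','H'), ('H','T'), ...]

def X(w):
    return w.count("H")                      # rv: number of heads

# PMF: p_X(x) = P({w : X(w) = x}); each of the 4 outcomes has probability 1/4.
counts = Counter(X(w) for w in omega)
pmf = {x: c / len(omega) for x, c in counts.items()}

print(pmf)                                   # {2: 0.25, 1: 0.5, 0: 0.25}
assert all(p >= 0 for p in pmf.values())     # property (a)
assert abs(sum(pmf.values()) - 1) < 1e-12    # property (b)
```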

Definition 1.3 (Geometric PMF). Suppose that a coin is tossed repeatedly and independently
with probability of a head equal to p. The geometric rv is the number X of tosses required to
get the first head. Its PMF is given by

$$
p_X(k) = (1-p)^{k-1} p, \qquad k = 1, 2, \ldots
$$

and
$$
\sum_{k=1}^{\infty} p_X(k) = \sum_{k=1}^{\infty} (1-p)^{k-1} p = p \sum_{k=1}^{\infty} (1-p)^{k-1} = p \cdot \frac{1}{1 - (1-p)} = 1.
$$
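As a quick numerical sanity check (an addition, not from the notes), the following Python sketch truncates the infinite sum and confirms that it is essentially 1:

```python
# Geometric PMF: p_X(k) = (1 - p)**(k - 1) * p for k = 1, 2, ...
def geometric_pmf(k, p):
    return (1 - p) ** (k - 1) * p

p = 0.3
# Truncate the infinite series; the tail beyond k = 200 is negligible for p = 0.3.
total = sum(geometric_pmf(k, p) for k in range(1, 201))
print(total)  # ~1.0
```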

Definition 1.4 (Binomial PMF). X - number of heads in n independent coin tosses. Let
P(H) = p.

For n = 4

$$
\begin{aligned}
p_X(2) = P(\{X = 2\}) &= P(\text{two heads in 4 independent coin tosses})\\
&= P(\{HHTT,\, HTHT,\, THTH,\, HTTH,\, THHT,\, TTHH\})\\
&= 6 p^2 (1-p)^2 = \binom{4}{2} p^2 (1-p)^2.
\end{aligned}
$$

In general,
$$
p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \ldots, n.
$$
Definition 1.5 (Poisson PMF). A discrete random variable X is said to follow a Poisson PMF with intensity parameter λ > 0 if
$$
p_X(k) =
\begin{cases}
\dfrac{e^{-\lambda} \lambda^k}{k!}, & \text{if } k = 0, 1, \ldots\\
0, & \text{otherwise.}
\end{cases}
$$

Poisson approximation of Binomial PMF. Suppose n → ∞ and p → 0 with np → λ. Writing p = λ/n (so that np = λ), we have
$$
p_X(k) = \binom{n}{k} p^k (1-p)^{n-k}
= \frac{n!}{(n-k)!\,k!} \cdot \frac{\lambda^k}{n^k} \left(1 - \frac{\lambda}{n}\right)^{n-k}
= \frac{n(n-1)\cdots(n-k+1)}{n^k} \cdot \frac{\lambda^k}{k!} \left(1 - \frac{\lambda}{n}\right)^{n} \left(1 - \frac{\lambda}{n}\right)^{-k}.
$$

As n → ∞, we have
$$
\frac{n(n-1)\cdots(n-k+1)}{n^k} \to 1, \qquad \left(1 - \frac{\lambda}{n}\right)^{n} \to e^{-\lambda}, \qquad \left(1 - \frac{\lambda}{n}\right)^{-k} \to 1.
$$

Therefore,
$$
p_X(k) \to \frac{e^{-\lambda} \lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots
$$

Thus as n → ∞ and p → 0 with np → λ, the Binomial PMF converges to the Poisson PMF.
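A small numerical illustration of this convergence (added here; it is not in the original handout): fix λ, set p = λ/n, and compare the Binomial and Poisson PMFs as n grows.

```python
from math import comb, exp, factorial

lam = 2.0
for n in (10, 100, 1000):
    p = lam / n                                   # so that np = lam
    for k in (0, 1, 2, 5):
        binom = comb(n, k) * p**k * (1 - p) ** (n - k)
        poisson = exp(-lam) * lam**k / factorial(k)
        print(f"n={n:5d} k={k}: binomial={binom:.6f}  poisson={poisson:.6f}")
```

The two columns agree to more and more digits as n increases.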

Example 1.4. The probability of winning at a certain game is 0.10. If you play the game 10 times, what is the probability that you win at most once?

Solution. Let X be the number of wins; we are interested in P(X ≤ 1). Since X is a Binomial rv with n = 10 and p = 0.10,
$$
P(X \le 1) = P(X = 0) + P(X = 1) = \binom{10}{0} (0.10)^0 (0.90)^{10} + \binom{10}{1} (0.10)^1 (0.90)^9 \approx 0.736099.
$$
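The arithmetic can be checked with a couple of lines of Python (added for convenience):

```python
from math import comb

# P(X <= 1) for X ~ Binomial(n=10, p=0.10), as in Example 1.4.
n, p = 10, 0.10
prob = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in (0, 1))
print(prob)  # 0.7360989...
```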

Function of random variables. Given a rv X, one may generate other random variables by applying various transformations to X.

If X is a discrete rv with PMF pX, then for any real-valued function g the rv defined by Y = g(X) is also discrete, and its PMF pY can be calculated using the PMF of X. The PMF of Y is given by
$$
p_Y(y) = \sum_{\{x : g(x) = y\}} p_X(x).
$$
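This formula translates directly into code. The sketch below is an illustrative addition (the helper name `pushforward` is ours, not the handout's): it accumulates pX(x) over all x with g(x) = y, and it reproduces Examples 1.5 and 1.6 below.

```python
from collections import defaultdict

def pushforward(pmf_x, g):
    """PMF of Y = g(X): p_Y(y) = sum of p_X(x) over {x : g(x) = y}."""
    pmf_y = defaultdict(float)
    for x, px in pmf_x.items():
        pmf_y[g(x)] += px
    return dict(pmf_y)

# Example 1.5: Y = |X| with X uniform on the integers -4, ..., 4.
pmf_x = {x: 1 / 9 for x in range(-4, 5)}
print(pushforward(pmf_x, abs))
# p_Y(y) = 2/9 for y = 1, 2, 3, 4 and 1/9 for y = 0

# Example 1.6: Y = X(X - 1)(X - 2).
pmf_x = {0: 0.2, 1: 0.2, 2: 0.3, 3: 0.3}
print(pushforward(pmf_x, lambda x: x * (x - 1) * (x - 2)))
# ~{0: 0.7, 6: 0.3} (up to floating-point rounding)
```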

Example 1.5. Let Y = |X| and let
$$
p_X(x) =
\begin{cases}
1/9, & \text{if } x \text{ is an integer in the range } [-4, 4]\\
0, & \text{otherwise.}
\end{cases}
$$
Calculate pY(y).

Solution. The possible values of Y with positive probabilities are y = 0, 1, 2, 3, 4.

$p_Y(0) = p_X(0) = 1/9.$

There are two values of X corresponding to each y = 1, 2, 3, 4. We have
$$
p_Y(1) = p_X(-1) + p_X(1) = \frac{2}{9}, \quad \ldots, \quad p_Y(4) = p_X(-4) + p_X(4) = \frac{2}{9}.
$$
Thus the PMF of Y is
$$
p_Y(y) =
\begin{cases}
2/9, & \text{if } y = 1, 2, 3, 4\\
1/9, & \text{if } y = 0\\
0, & \text{otherwise.}
\end{cases}
$$

Example 1.6. Let X be a discrete random variable with PMF




$$
p_X(x) =
\begin{cases}
0.2, & \text{if } x = 0\\
0.2, & \text{if } x = 1\\
0.3, & \text{if } x = 2\\
0.3, & \text{if } x = 3\\
0, & \text{otherwise.}
\end{cases}
$$

Define Y = X(X − 1)(X − 2). Find the PMF of Y.

Answer:
$$
p_Y(y) =
\begin{cases}
0.7, & \text{for } y = 0\\
0.3, & \text{for } y = 6\\
0, & \text{otherwise.}
\end{cases}
$$

Definition 1.6 (Expectation). The expected value (also called the expectation or mean) of a rv X is defined by
$$
EX = \sum_x x\, p_X(x).
$$

Expectation is a weighted average of the possible values of X, where the weights are the respective probabilities.

Example 1.7. Suppose that the range of the rv X is {1, 2, 3} with pX(1) = 1/6, pX(2) = 1/3 and pX(3) = 1/2. Then
$$
EX = 1 \times \frac{1}{6} + 2 \times \frac{1}{3} + 3 \times \frac{1}{2} = \frac{7}{3}.
$$
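The definition is a one-liner in Python; this small sketch (not from the handout) reproduces EX = 7/3:

```python
def expectation(pmf):
    # EX = sum over x of x * p_X(x)
    return sum(x * px for x, px in pmf.items())

print(expectation({1: 1/6, 2: 1/3, 3: 1/2}))  # 2.3333... = 7/3
```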
Remark 1.1. 1. Note that EX need not belong to the range of the rv X.

2. Interpretation of expectation in terms of relative frequency: Suppose the experiment in Example 1.7 is repeated 120 times. Then we ‘expect’ to obtain the number 1 twenty times, the number 2 forty times and the number 3 sixty times, in which case we see that EX is precisely the average (or mean) value of the expected results. That is,
$$
EX = \frac{20 \times 1 + 40 \times 2 + 60 \times 3}{120} = \frac{7}{3}.
$$
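The relative-frequency interpretation can also be seen by simulation; the following sketch (an illustrative addition) draws many independent realizations from the PMF of Example 1.7 and shows the sample mean approaching EX = 7/3:

```python
import random

random.seed(0)  # for reproducibility
values, probs = [1, 2, 3], [1/6, 1/3, 1/2]
draws = random.choices(values, weights=probs, k=120_000)
print(sum(draws) / len(draws))  # ~2.333, close to EX = 7/3
```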
Example 1.8. Find the mean value of the random variable X whose values are the sums of
the scores on two fair dice.

Solution. First find pX(x). Note that x ∈ {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.

pX(2) = P({X = 2}) = P({(1, 1)}) = 1/36.
pX(3) = P({X = 3}) = P({(1, 2), (2, 1)}) = 2/36.
pX(4) = P({X = 4}) = P({(1, 3), (2, 2), (3, 1)}) = 3/36.
pX(5) = 4/36, pX(6) = 5/36, pX(7) = 6/36,
pX(8) = 5/36, pX(9) = 4/36, pX(10) = 3/36,
pX(11) = 2/36, pX(12) = 1/36.

This implies
$$
EX = 2 \times \frac{1}{36} + 3 \times \frac{2}{36} + 4 \times \frac{3}{36} + 5 \times \frac{4}{36} + 6 \times \frac{5}{36} + 7 \times \frac{6}{36} + 8 \times \frac{5}{36} + 9 \times \frac{4}{36} + 10 \times \frac{3}{36} + 11 \times \frac{2}{36} + 12 \times \frac{1}{36} = 7.
$$

Remark 1.2. By symmetry, using a center-of-gravity argument, we can directly say that EX = 7.
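The computation is easy to confirm by enumerating all 36 equally likely outcomes; here is a short (added) Python check using exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

# Mean of the sum of two fair dice, each outcome having probability 1/36.
outcomes = product(range(1, 7), repeat=2)
mean = sum(Fraction(a + b, 36) for a, b in outcomes)
print(mean)  # 7
```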

Expected value for functions of random variables: Let X be a rv with PMF pX and let g(X) be a function of the rv X. Then the expected value of the rv g(X) is given by
$$
E[g(X)] = \sum_x g(x)\, p_X(x).
$$
Verification. Let Y = g(X) (note that Y is another rv). Then
$$
EY = \sum_y y\, p_Y(y), \qquad p_Y(y) = \sum_{\{x : g(x) = y\}} p_X(x).
$$
This implies
$$
EY = \sum_y y \sum_{\{x : g(x) = y\}} p_X(x)
= \sum_y \sum_{\{x : g(x) = y\}} y\, p_X(x)
= \sum_y \sum_{\{x : g(x) = y\}} g(x)\, p_X(x)
= \sum_x g(x)\, p_X(x).
$$
Thus
$$
E[g(X)] = \sum_x g(x)\, p_X(x).
$$

Remark 1.3. We can verify the expected-value formula for a function of a random variable by considering a simple example. Consider the figure: a rv X takes three values x1, x2, x3, and g maps x1 to y1 = g(x1), while both x2 and x3 are mapped to y2 = g(x2) = g(x3). Then
$$
\begin{aligned}
EY = \sum_y y\, p_Y(y) &= y_1 p_Y(y_1) + y_2 p_Y(y_2) = y_1 p_X(x_1) + y_2 \left[ p_X(x_2) + p_X(x_3) \right]\\
&= g(x_1) p_X(x_1) + y_2 p_X(x_2) + y_2 p_X(x_3)\\
&= g(x_1) p_X(x_1) + g(x_2) p_X(x_2) + g(x_3) p_X(x_3)\\
&= \sum_x g(x)\, p_X(x).
\end{aligned}
$$

Example 1.9. Let X be an rv with range RX = {0, π/4, π/2, 3π/4, π} such that pX(0) = pX(π/4) = pX(π/2) = pX(3π/4) = pX(π) = 1/5. Compute E[sin(X)].

Answer: $\dfrac{\sqrt{2} + 1}{5}$.
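A two-line check of this answer (added for convenience):

```python
from math import sin, pi, sqrt

# E[sin(X)] with X uniform on {0, pi/4, pi/2, 3*pi/4, pi}.
support = [0, pi / 4, pi / 2, 3 * pi / 4, pi]
print(sum(sin(x) / 5 for x in support))  # ~0.482842
print((sqrt(2) + 1) / 5)                 # same value
```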

Result 1.1. Let X be a discrete random variable with RX ⊂ {0, 1, 2, . . .}. Prove that
$$
EX = \sum_{k=0}^{\infty} P(X > k).
$$

Proof. We have,
P(X > 0) = pX (1) + pX (2) + pX (3) + pX (4) + · · · ,

P(X > 1) = pX (2) + pX (3) + pX (4) + · · · ,

P(X > 2) = pX (3) + pX (4) + · · · ,

P(X > 3) = pX (4) + pX (5) + · · · .

Hence,
$$
\sum_{k=0}^{\infty} P(X > k) = P(X > 0) + P(X > 1) + P(X > 2) + \cdots = p_X(1) + 2\, p_X(2) + 3\, p_X(3) + 4\, p_X(4) + \cdots = EX.
$$
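Result 1.1 is easy to verify numerically for any rv with finite support; the sketch below (not part of the original notes) checks it for a Binomial(6, 0.4) rv, whose mean is np = 2.4:

```python
from math import comb

n, p = 6, 0.4
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

direct = sum(k * pk for k, pk in pmf.items())         # EX = sum of x * p_X(x)
tails = sum(sum(pmf[j] for j in range(k + 1, n + 1))  # P(X > k)
            for k in range(n + 1))
print(direct, tails)  # both ~2.4 (up to floating-point rounding)
```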

Definition 1.7 (Variance). The variance provides a measure of dispersion of X around its mean and is defined by Var(X) = E[(X − EX)²]. Thus Var(X) is the expectation of the random variable (X − EX)².

Definition 1.8 (Standard Deviation). The standard deviation is defined by $\sigma_X = \sqrt{\mathrm{Var}(X)}$. The standard deviation has the same unit as the rv X.

Properties of mean and variance.

(a) E(aX + b) = aE(X) + b.

(b) E(g(X)) ≠ g(E(X)) in general.

(c) Var(aX + b) = a²Var(X).

(d) Var(X) = E(X²) − (EX)².
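These properties can be spot-checked numerically; the sketch below (an added illustration) verifies (c) and (d) for the PMF of Example 1.7 with a = 3 and b = −2:

```python
pmf = {1: 1/6, 2: 1/3, 3: 1/2}

def E(f):
    return sum(f(x) * px for x, px in pmf.items())

ex, ex2 = E(lambda x: x), E(lambda x: x * x)
var_x = ex2 - ex**2                           # property (d): 5/9

a, b = 3.0, -2.0
ey = E(lambda x: a * x + b)                   # property (a): a*EX + b
var_y = E(lambda x: (a * x + b - ey) ** 2)    # variance from the definition
print(var_x, var_y, a**2 * var_x)             # var_y equals a^2 * var_x = 5.0
```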

Remark 1.4. If the range of a random variable Y is more dispersed than the range of a random variable X, then Var(Y) will typically be larger than Var(X). However, the variance also depends on the probabilities assigned to values far away from the mean (see the figures); hence it depends both on which values lie far from the mean and on their corresponding probabilities.

Mean and variance of a Bernoulli rv: Consider the experiment of tossing a coin, where a head appears with probability p and a tail with probability (1 − p). Then
$$
p_X(k) =
\begin{cases}
p, & \text{if } k = 1\\
1 - p, & \text{if } k = 0,
\end{cases}
$$
$$
EX = 1 \cdot p + 0 \cdot (1 - p) = p, \qquad EX^2 = 1^2 \cdot p + 0^2 \cdot (1 - p) = p,
$$
$$
\mathrm{Var}(X) = EX^2 - (EX)^2 = p(1 - p).
$$

Mean and Variance of a discrete uniform random variable:
$$
p_X(k) =
\begin{cases}
1/n, & \text{if } k = 1, 2, \ldots, n\\
0, & \text{otherwise,}
\end{cases}
$$
$$
EX = \frac{1}{n}(1 + 2 + \cdots + n) = \frac{n(n+1)}{2n} = \frac{n+1}{2},
$$
$$
EX^2 = \frac{1}{n}\left(1^2 + 2^2 + \cdots + n^2\right) = \frac{1}{n} \cdot \frac{n(n+1)(2n+1)}{6} = \frac{(n+1)(2n+1)}{6},
$$
$$
\mathrm{Var}(X) = EX^2 - (EX)^2 = \frac{(n+1)(2n+1)}{6} - \frac{(n+1)^2}{4} = \frac{n^2 - 1}{12}.
$$
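A short (added) check of these closed forms against direct computation, using exact fractions:

```python
from fractions import Fraction

n = 10
pmf = {k: Fraction(1, n) for k in range(1, n + 1)}
ex = sum(k * pk for k, pk in pmf.items())
var = sum(k * k * pk for k, pk in pmf.items()) - ex**2
print(ex, Fraction(n + 1, 2))        # 11/2  11/2
print(var, Fraction(n**2 - 1, 12))   # 33/4  33/4
```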
PMF, Mean and Variance of Some Well-Known Discrete Distributions

| Random Variable | Parameter(s) | PMF | Mean | Variance |
| --- | --- | --- | --- | --- |
| Bernoulli | $p$ | $p_X(k) = p^k (1-p)^{1-k},\ k = 0, 1$ | $p$ | $p(1-p)$ |
| Binomial | $n, p$ | $p_X(k) = \binom{n}{k} p^k (1-p)^{n-k},\ k = 0, 1, \ldots, n$ | $np$ | $np(1-p)$ |
| Poisson | $\lambda$ | $p_X(k) = \frac{e^{-\lambda} \lambda^k}{k!},\ k = 0, 1, \ldots$ | $\lambda$ | $\lambda$ |
| Geometric | $p$ | $p_X(k) = (1-p)^{k-1} p,\ k = 1, 2, \ldots$ | $1/p$ | $(1-p)/p^2$ |
| Uniform | $n$ | $p_X(k) = 1/n,\ k = 1, 2, \ldots, n$ | $(n+1)/2$ | $(n^2-1)/12$ |

