1. Discrete Random Variables
2. Expectation, Variance
3. Functions of Random Variables
Definition 1.1 (Random Variable). Given an experiment and the corresponding set of possible outcomes (the sample space), a random variable associates a particular number with each outcome. Mathematically, a random variable, denoted by X, is a real-valued function from Ω to R, i.e., X : Ω → R.
A random variable is called discrete if its range is either finite or countably infinite.
In the corresponding figure, the input is an element of the sample space, ω ∈ Ω, and the output is the realization x = X(ω).
Example 1.2. Consider two independent tosses of a fair coin and let X be the number of heads obtained. Then Ω = {HH, HT, TH, TT} and x ∈ {0, 1, 2}, i.e., the possible realizations of the rv X are 0, 1 and 2.
Definition 1.2 (PMF). The probability mass function of X, denoted by pX(x), gives the probability of the event {ω : X(ω) = x}, so that pX(x) = P{X = x}.
Properties of PMF: the probabilities are nonnegative and sum to one, i.e., pX(x) ≥ 0 for every x, and Σ_x pX(x) = 1.
Definition 1.3 (Geometric PMF). Suppose that a coin is tossed repeatedly and independently
with probability of a head equal to p. The geometric rv is the number X of tosses required to
get the first head. Its PMF is given by
pX(k) = (1 − p)^(k−1) p,  k = 1, 2, . . .
and
Σ_{k=1}^∞ pX(k) = Σ_{k=1}^∞ (1 − p)^(k−1) p = p Σ_{k=1}^∞ (1 − p)^(k−1) = p · 1/(1 − (1 − p)) = 1.
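As a quick numerical sketch (with an illustrative value p = 0.3, not from the notes), the partial sums of the geometric PMF approach 1, and simulation agrees with pX(1) = p:

```python
import random

def geometric_pmf(k, p):
    """P(first head on toss k) for a coin with head probability p."""
    return (1 - p) ** (k - 1) * p

p = 0.3
# Partial sum of the PMF: the geometric series above shows it tends to 1.
partial = sum(geometric_pmf(k, p) for k in range(1, 200))

def tosses_until_head(p):
    """Simulate tosses until the first head; return the number of tosses."""
    k = 1
    while random.random() >= p:  # head occurs with probability p
        k += 1
    return k

random.seed(0)
samples = [tosses_until_head(p) for _ in range(100_000)]
frac_k1 = sum(1 for s in samples if s == 1) / len(samples)
print(round(partial, 6))  # very close to 1
print(round(frac_k1, 2))  # close to p = 0.3
```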
Definition 1.4 (Binomial PMF). Let X be the number of heads in n independent coin tosses, with P(H) = p.
For n = 4,
pX(k) = C(4, k) p^k (1 − p)^(4−k),  k = 0, 1, 2, 3, 4.
In general,
pX(k) = C(n, k) p^k (1 − p)^(n−k),  k = 0, 1, . . . , n,
where C(n, k) = n!/(k!(n − k)!) is the binomial coefficient.
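As a sanity check (with illustrative values n = 4, p = 0.5, chosen here for concreteness), the Binomial probabilities can be computed with math.comb and verified to sum to 1:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(k heads in n independent tosses with head probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 4, 0.5
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(pmf)       # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(sum(pmf))  # 1.0
```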
Definition 1.5 (Poisson PMF). A discrete random variable X is said to follow Poisson PMF
with intensity parameter λ if
pX(k) = e^(−λ) λ^k / k!  for k = 0, 1, 2, . . . , and pX(k) = 0 otherwise.
Write p = λ/n in the Binomial PMF, so that
pX(k) = [n(n − 1) · · · (n − k + 1) / n^k] · (λ^k / k!) · (1 − λ/n)^n · (1 − λ/n)^(−k).
As n → ∞, we have
n(n − 1) · · · (n − k + 1) / n^k → 1,  (1 − λ/n)^n → e^(−λ),  and  (1 − λ/n)^(−k) → 1.
Therefore,
pX(k) → e^(−λ) λ^k / k!,  k = 0, 1, 2, . . .
Thus, as n → ∞ and p → 0 with np → λ, the Binomial PMF converges to the Poisson PMF.
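This convergence can be observed numerically (with an illustrative value λ = 2, not from the notes): holding np = λ fixed, the largest pointwise gap between the two PMFs shrinks as n grows.

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 2.0
diffs = []
for n in (10, 100, 1000):
    p = lam / n  # keep np = lam fixed while n grows
    diffs.append(max(abs(binomial_pmf(k, n, p) - poisson_pmf(k, lam))
                     for k in range(10)))
print([round(d, 5) for d in diffs])  # gaps shrink as n grows
```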
Example 1.4. The probability of winning at a certain game is 0.10. If you play the game 10 times, what is the probability that you win at most once?
Solution. Let X be the number of wins; we are interested in P(X ≤ 1). Since X is a Binomial rv with n = 10 and p = 0.1,
P(X ≤ 1) = P(X = 0) + P(X = 1) = (0.9)^10 + 10 (0.1)(0.9)^9 ≈ 0.736.
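The arithmetic for Example 1.4 can be checked directly:

```python
from math import comb

n, p = 10, 0.10
# P(X <= 1) = P(X = 0) + P(X = 1) for a Binomial(n, p) rv
prob = comb(n, 0) * (1 - p)**n + comb(n, 1) * p * (1 - p)**(n - 1)
print(round(prob, 4))  # 0.7361
```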
Functions of random variables. Given a rv X, one may generate other random variables by applying various transformations to X.
If X is a discrete rv with PMF pX, then for any real-valued function g the rv defined by Y = g(X) is also discrete, and its PMF pY can be calculated using the PMF of X. The PMF of Y is given by
pY(y) = Σ_{x : g(x) = y} pX(x).
Example 1.5. Let Y = |X| and let
pX(x) = 1/9 if x is an integer in the range [−4, 4], and pX(x) = 0 otherwise.
Calculate pY(y).
pY(1) = pX(−1) + pX(1) = 2/9,
...
pY(4) = pX(−4) + pX(4) = 2/9.
Thus the PMF of Y is
pY(y) = 2/9 if y = 1, 2, 3, 4;  1/9 if y = 0;  0 otherwise.
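The rule pY(y) = Σ_{x : g(x) = y} pX(x) is mechanical enough to run in code; here it reproduces Example 1.5 with g(x) = |x|, using exact fractions:

```python
from collections import defaultdict
from fractions import Fraction

# PMF of X: uniform over the nine integers -4, ..., 4 (Example 1.5).
p_X = {x: Fraction(1, 9) for x in range(-4, 5)}

# pY(y) = sum of pX(x) over {x : g(x) = y}, here with g(x) = |x|.
p_Y = defaultdict(Fraction)  # Fraction() is 0, so += accumulates correctly
for x, px in p_X.items():
    p_Y[abs(x)] += px

print(sorted(p_Y.items()))
```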
Answer:
pY(y) = 0.7 for y = 0;  0.3 for y = 6;  0 otherwise.
Definition 1.6 (Expectation). The expected value (also called the expectation or mean) of a rv X is defined by
EX = Σ_x x pX(x).
Expectation is a weighted average of the possible values of X, where the weights are the respective probabilities.
Example 1.7. Suppose that the range of the rv X is {1, 2, 3} with pX(1) = 1/6, pX(2) = 1/3 and pX(3) = 1/2. Then
EX = 1 × 1/6 + 2 × 1/3 + 3 × 1/2 = 7/3.
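Example 1.7 can be verified exactly with fractions:

```python
from fractions import Fraction

# PMF from Example 1.7
p_X = {1: Fraction(1, 6), 2: Fraction(1, 3), 3: Fraction(1, 2)}

# EX as the probability-weighted sum of the values
EX = sum(x * px for x, px in p_X.items())
print(EX)  # 7/3
```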
Remark 1.1. 1. Note that EX need not belong to the range of the rv X.
2. Interpreting the probabilities as relative frequencies (say 120 trials, of which 20 yield the value 1, 40 yield 2 and 60 yield 3), the sample average agrees with the expectation:
EX = (20 × 1 + 40 × 2 + 60 × 3)/120 = 7/3.
Example 1.8. Find the mean value of the random variable X whose values are the sums of
the scores on two fair dice.
Solution. First find out pX (x). Note that x ∈ {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.
This implies
EX = 2 × 1/36 + 3 × 2/36 + 4 × 3/36 + 5 × 4/36 + 6 × 5/36 + 7 × 6/36 + 8 × 5/36 + 9 × 4/36 + 10 × 3/36 + 11 × 2/36 + 12 × 1/36 = 7.
Remark 1.2. By symmetry, using a center-of-gravity argument, we can directly say that EX = 7.
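Example 1.8 can be reproduced by building the PMF of the sum from the 36 equally likely outcomes:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Count how many of the 36 equally likely (die1, die2) pairs give each sum.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
p_X = {s: Fraction(c, 36) for s, c in counts.items()}

EX = sum(s * px for s, px in p_X.items())
print(EX)  # 7
```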
Expected value for functions of random variables: Let X be a rv with PMF pX and let g(X) be a function of the rv X. Then the expected value of the rv g(X) is given by
E[g(X)] = Σ_x g(x) pX(x).
Verification. Let Y = g(X) (note that Y is another rv). Then
EY = Σ_y y pY(y),  where  pY(y) = Σ_{x : g(x) = y} pX(x).
This implies
EY = Σ_y y Σ_{x : g(x) = y} pX(x) = Σ_y Σ_{x : g(x) = y} y pX(x)
   = Σ_y Σ_{x : g(x) = y} g(x) pX(x) = Σ_x g(x) pX(x).
Thus
E[g(X)] = Σ_x g(x) pX(x).
Remark 1.3. We can verify the expected value for a function of a random variable by considering a simple example (see the figure), computing directly
EY = Σ_y y pY(y).
Example 1.9. Let X be a rv with range RX = {0, π/4, π/2, 3π/4, π}, such that pX(0) = pX(π/4) = pX(π/2) = pX(3π/4) = pX(π) = 1/5. Compute E[sin(X)].
Answer: (√2 + 1)/5.
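Example 1.9 is a direct application of E[g(X)] = Σ_x g(x) pX(x), and the closed form (√2 + 1)/5 can be checked numerically:

```python
from math import pi, sin, sqrt

# Uniform PMF on the five support points of Example 1.9
support = [0, pi / 4, pi / 2, 3 * pi / 4, pi]
E = sum(sin(x) * (1 / 5) for x in support)

print(round(E, 6))                   # numerical E[sin(X)]
print(round((sqrt(2) + 1) / 5, 6))   # closed-form answer
```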
Result 1.1. Let X be a discrete random variable with RX ⊂ {0, 1, 2, . . .}. Prove that
EX = Σ_{k=0}^∞ P(X > k).
Proof. We have
P(X > 0) = pX(1) + pX(2) + pX(3) + pX(4) + · · · ,
P(X > 1) = pX(2) + pX(3) + pX(4) + · · · ,
P(X > 2) = pX(3) + pX(4) + · · · , and so on.
Hence,
Σ_{k=0}^∞ P(X > k) = P(X > 0) + P(X > 1) + P(X > 2) + · · · = Σ_{k=1}^∞ k pX(k) = EX,
since each pX(k) appears in exactly k of the tail probabilities.
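The tail-sum identity of Result 1.1 is easy to check on a small PMF; the one below is hypothetical, chosen only for illustration:

```python
from fractions import Fraction

# A hypothetical PMF on {0, 1, 2, 3} (illustrative, not from the notes)
p_X = {0: Fraction(1, 2), 1: Fraction(1, 4), 2: Fraction(1, 8), 3: Fraction(1, 8)}

# EX computed directly from the definition
EX = sum(k * pk for k, pk in p_X.items())

# Sum of tail probabilities P(X > k); terms with k >= max value are 0
tail_sum = sum(
    sum(pk for j, pk in p_X.items() if j > k)  # P(X > k)
    for k in range(max(p_X))
)
print(EX, tail_sum)  # both equal 7/8
```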
Definition 1.7 (Variance). The variance provides a measure of dispersion of X around its mean and is defined by Var(X) = E[(X − EX)^2]. Thus Var(X) is the expectation of the random variable (X − EX)^2.
Definition 1.8 (Standard Deviation). The standard deviation is defined by σX = √Var(X). The standard deviation has the same units as the rv X.
Remark 1.4. If the range of a random variable Y is more dispersed than the range of a random variable X, then Var(Y) will usually be higher than Var(X). However, variance also depends on the probabilities assigned to values far away from the mean (see the figures); it is determined both by how far the values lie from the mean and by their corresponding probabilities.
Mean and variance of a Bernoulli rv: Consider the experiment of tossing a coin, where a head appears with probability p and a tail with probability 1 − p. Let X = 1 for a head and X = 0 for a tail, so that
pX(k) = p if k = 1;  1 − p if k = 0.
Then
EX = 1 × p + 0 × (1 − p) = p,
Var(X) = E[(X − EX)^2] = (1 − p)^2 × p + (0 − p)^2 × (1 − p) = p(1 − p).
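The Bernoulli mean and variance formulas can be confirmed symbolically for a concrete p (the value 3/10 below is illustrative), again working in exact fractions:

```python
from fractions import Fraction

p = Fraction(3, 10)          # illustrative head probability
p_X = {1: p, 0: 1 - p}       # Bernoulli PMF

# EX and Var(X) directly from the definitions
EX = sum(k * pk for k, pk in p_X.items())
VarX = sum((k - EX) ** 2 * pk for k, pk in p_X.items())
print(EX, VarX)  # p and p*(1 - p)
```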