
EE 5110: Probability Foundations

Tutorial 9

July-November 2015

1. (a) Prove that convergence in probability implies convergence in distribution, and give a
counter-example to show that the converse need not hold.
(b) Show that convergence in distribution to a constant random variable implies convergence
in probability to that constant.
2. A town starts a mosquito control program and the random variable Zn is the number of
mosquitoes at the end of the nth year (n = 0, 1, ...). Let Xn be the growth rate of mosquitoes
in year n, i.e., Zn = Xn Zn−1 , n ≥ 1. Assume that {Xn , n ≥ 1} is a sequence of i.i.d.
random variables with the PMF P(X = 2) = 1/2, P(X = 1/2) = 1/4 and P(X = 1/4) = 1/4. Suppose
Z0 , the initial number of mosquitoes, is a known constant and assume, for simplicity and
consistency, that Zn can take non-integer values.
(a) Find E[Zn ] and limn→∞ E[Zn ].
(b) Based on your answer to part (a), can you conclude whether or not the mosquito control
program is successful? What would your conclusion be?
(c) Let Wn = log2 Xn . Find E[Wn ] and E[log2 (Zn /Z0 )].
(d) Show that there exists a constant α such that limn→∞ (1/n) log2 (Zn /Z0 ) = α almost surely.
(e) Show that there is a constant β such that limn→∞ Zn = β almost surely.
(f) Based on your answer to part (e), can you conclude whether or not the mosquito control
program is successful? What would your conclusion be?
(g) How do you reconcile your answers to parts (b) and (f)?
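Parts (a)–(g) can be sanity-checked by simulation before proving anything. A minimal sketch, with the seed and trajectory length chosen arbitrarily:

```python
import math
import random

random.seed(1)  # arbitrary seed, for reproducibility

def growth():
    """One draw of X with P(X=2)=1/2, P(X=1/2)=1/4, P(X=1/4)=1/4."""
    u = random.random()
    if u < 0.5:
        return 2.0
    if u < 0.75:
        return 0.5
    return 0.25

n = 10_000
# (1/n) log2(Zn/Z0) is the running average of the i.i.d. terms log2(Xk)
avg_log = sum(math.log2(growth()) for _ in range(n)) / n
print(avg_log)  # close to E[log2 X] = 1/2 - 1/4 - 2/4 = -1/4
```

Note the tension the problem is driving at: E[X] = 19/16 > 1, yet the average logarithmic growth rate observed along a trajectory is negative.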
3. Consider the sequence of random variables with densities

fXn (x) = 1 - cos(2πnx), x ∈ (0, 1).

Do the Xn 's converge in distribution? Does the sequence of densities converge?
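Since the CDF is available in closed form, FXn (x) = x − sin(2πnx)/(2πn), the distributional behaviour is easy to probe numerically; a short sketch:

```python
import math

def F(n, x):
    # CDF of X_n: integral of 1 - cos(2*pi*n*t) over (0, x)
    return x - math.sin(2 * math.pi * n * x) / (2 * math.pi * n)

# Largest gap between F_{X_n} and the Uniform(0,1) CDF on a grid
for n in (1, 10, 100):
    gap = max(abs(F(n, k / 1000) - k / 1000) for k in range(1001))
    print(n, gap)  # gap is about 1/(2*pi*n), shrinking with n
```

The CDFs flatten onto the Uniform(0,1) CDF even though the integrands 1 − cos(2πnx) oscillate ever faster.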


4. Let X1 , X2 , ..., Xn be i.i.d. random variables with PDF fX . Then the set of random variables
X1 , X2 , ..., Xn is called a random sample of size n of X . The sample mean is defined as

X̄n = (1/n)(X1 + X2 + ... + Xn ).

Let X1 , X2 , ..., Xn be a random sample of X with mean µ and variance σ 2 . How many samples
of X are required for the probability that the sample mean will not deviate from the true
mean µ by more than σ/10 to be at least 0.95?
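One distribution-free route is Chebyshev's inequality; the sketch below records that bound and spot-checks it by simulation. The choice of standard normal samples (µ = 0, σ = 1) and the seed are illustrative assumptions, not part of the problem.

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

# Chebyshev: P(|X̄n - µ| >= σ/10) <= σ² / (n (σ/10)²) = 100/n,
# so requiring 100/n <= 0.05 forces n >= 2000, whatever the distribution.
n = 2000
trials = 500
hits = 0
for _ in range(trials):
    xbar = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
    if abs(xbar) > 0.1:  # deviation larger than σ/10 (here σ = 1)
        hits += 1
print(hits / trials)  # far below 0.05: Chebyshev is conservative
```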

5. (a) Let Y be a Binomial random variable with parameters (n, p). Using the Central Limit
Theorem, derive the approximation formula

P(Y ≤ y) ≈ Φ( (y − np) / √(np(1 − p)) ),

where Φ is the CDF of a standard Normal random variable.
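The quality of this approximation can be eyeballed numerically; the sketch below compares the exact Binomial CDF with the normal approximation for an arbitrary illustrative choice n = 100, p = 0.3 (a continuity correction, replacing y by y + 1/2, would tighten it further).

```python
import math

def binom_cdf(n, p, y):
    # Exact Binomial CDF by summing the PMF
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(int(y) + 1))

def Phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p = 100, 0.3  # arbitrary illustrative parameters
for y in (20, 30, 40):
    approx = Phi((y - n * p) / math.sqrt(n * p * (1 - p)))
    print(y, round(binom_cdf(n, p, y), 4), round(approx, 4))
```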


(b) Let Y be a Poisson random variable with parameter λ. Using the Central Limit Theorem,
derive the approximation formula

P(Y ≤ y) ≈ Φ( (y − λ) / √λ ),

where Φ is the CDF of a standard Normal random variable.
6. A sequence {Xn , n ≥ 1} of random variables is said to be completely convergent to X if

Σn P(|Xn − X| > ε) < ∞,  ∀ε > 0.

Show that, for sequences of independent random variables, complete convergence is equivalent
to almost sure convergence. Find a sequence of (dependent) random variables that converge
almost surely but not completely.
7. Construct an example of a sequence of characteristic functions φn (t) such that the limit
φ(t) = limn→∞ φn (t) exists for all t, but φ(t) is not a valid characteristic function.
8. Imagine a world in which the value of π is unknown. It is known that the area of a circle is
proportional to the square of its radius, but the constant of proportionality is unknown.
Given a uniform random variable generator from which you can generate as many i.i.d.
samples as you need, devise a method to estimate the value of the proportionality constant
without actually measuring the area or circumference of any circle.
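One standard scheme that fits the constraints is rejection sampling into the unit square: only uniform draws are used, never a measured area. A sketch, with the seed and sample count chosen arbitrarily:

```python
import random

random.seed(42)  # arbitrary seed, for reproducibility

# Drop points uniformly in the unit square. The fraction landing inside the
# quarter disc x^2 + y^2 <= 1 estimates (c * 1^2) / 4, where c is the unknown
# proportionality constant (area = c * r^2). Hence c is about 4 * fraction.
N = 100_000
inside = sum(1 for _ in range(N)
             if random.random() ** 2 + random.random() ** 2 <= 1.0)
est = 4 * inside / N
print(est)  # an estimate of the constant (i.e. of pi)
```

By the strong law of large numbers the fraction converges almost surely to the quarter-disc area, so the estimator is consistent.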
