# BASIC PROBABILITY: HOMEWORK 2

Exercise 1: where does the Poisson distribution come from? (corrected) Fix λ > 0 and let Xn have the Binomial distribution with parameters n and λ/n. Prove that for any k ∈ N we have

P[Xn = k] → P[Yλ = k] as n → ∞,

where Yλ has a Poisson distribution with parameter λ.
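As a numerical illustration of the statement (not part of the assignment), the following sketch compares the Binomial(n, λ/n) pmf at a fixed k with the Poisson limit for growing n; the value λ = 3 and k = 2 are arbitrary choices for the demonstration.

```python
from math import comb, exp, factorial

def binom_pmf(n, k, p):
    """P[Xn = k] for Xn ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    """P[Ylam = k] for Ylam ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

lam, k = 3.0, 2

# With p = lam/n, the binomial pmf should approach the Poisson pmf as n grows.
for n in (10, 100, 10_000):
    print(n, binom_pmf(n, k, lam / n))
print("Poisson limit:", poisson_pmf(lam, k))
```

The printed binomial values approach the Poisson limit, mirroring the limit computed in the solution to Exercise 1 below.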

Exercise 2 (corrected) Prove Theorem 2.2.

Exercise 3 (corrected) Let X and Y be independent Poisson variables with parameters λ and µ. Show that (1) X + Y is Poisson with parameter λ + µ, and (2) the conditional distribution of X, given that X + Y = n, is binomial, and find the parameters.

Exercise 4 (homework) Suppose we have n independent random variables X1, ..., Xn with common distribution function F. Find the distribution function of max(X1, ..., Xn).
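For independent variables, the maximum is ≤ x exactly when every Xi is ≤ x, which gives F(x)^n. A Monte Carlo sketch checking this with Uniform(0, 1) variables (a choice made here purely for illustration, where F(x) = x on [0, 1]):

```python
import random

random.seed(0)
n, x, trials = 5, 0.7, 200_000

# For i.i.d. Uniform(0,1) variables, F(x) = x on [0,1], so the distribution
# function of the maximum should be F(x)^n = x**n at the point x.
hits = sum(max(random.random() for _ in range(n)) <= x for _ in range(trials))
estimate = hits / trials
print(estimate, x**n)  # empirical frequency vs. F(x)^n
```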

Exercise 5 (homework) Show that if X takes only non-negative integer values, we have

E[X] = Σ_{n=0}^{∞} P[X > n].

An urn contains b blue and r red balls. Balls are removed at random until the first blue ball is drawn. Show that the expected number drawn is (b + r + 1)/(b + 1).
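The identity from the first part of the exercise gives an exact way to check the urn claim numerically: the number of balls drawn N satisfies N > n exactly when the first n draws are all red. A sketch using exact rational arithmetic:

```python
from fractions import Fraction

def expected_draws(b, r):
    """E[N] via E[N] = sum over n >= 0 of P[N > n], where N is the number of
    balls drawn and {N > n} means the first n draws are all red."""
    total = Fraction(0)
    for n in range(r + 1):          # P[N > n] = 0 once n > r
        p = Fraction(1)
        for i in range(n):
            p *= Fraction(r - i, b + r - i)
        total += p
    return total

# Compare against the claimed closed form (b + r + 1)/(b + 1).
for b, r in [(1, 1), (3, 5), (2, 7)]:
    print(b, r, expected_draws(b, r), Fraction(b + r + 1, b + 1))
```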

Hint for Exercise 5: use the identity Σ_{n=0}^{r} C(n + b, b) = C(r + b + 1, b + 1).

Exercise 6 (homework) Let X have the distribution function

F(x) = 0 if x < 0,  x/2 if 0 ≤ x ≤ 2,  1 if x > 2,

and let Y = X². Find (1) P(1/2 ≤ X ≤ 3/2), (2) P(1 ≤ X < 2), (3) P(Y ≤ X), (4) P(X ≤ 2Y), (5) P(X + Y ≤ 3/4), (6) the distribution function of Z = √X.

Exercise 7: Chebyshev's inequality (homework) Let X be a random variable taking only non-negative values. Show that for any a > 0 we have

P[X ≥ a] ≤ E[X]/a.

Deduce that

P[|X − E[X]| ≥ a] ≤ var(X)/a²,

and explain why √var(X) is often referred to as the standard deviation.
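The first inequality of Exercise 7 (often called Markov's inequality) is easy to probe numerically. The sketch below uses X = U² with U uniform on (0, 1) as a stand-in non-negative variable; this distribution is an illustrative assumption, not part of the exercise.

```python
import random

random.seed(1)
# X = U^2 with U ~ Uniform(0,1) is non-negative, with E[X] = 1/3.
samples = [random.random() ** 2 for _ in range(100_000)]
mean = sum(samples) / len(samples)

# The empirical tail P[X >= a] should never exceed the bound E[X]/a.
for a in (0.2, 0.5, 0.8):
    tail = sum(x >= a for x in samples) / len(samples)
    print(f"a={a}: P[X >= a] ~ {tail:.3f} <= E[X]/a ~ {mean / a:.3f}")
```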

Exercise 1 (solution) We have

P[Xn = k] = C(n, k) (λ/n)^k (1 − λ/n)^{n−k}
          = [n(n − 1) · · · (n − k + 1) / n^k] · (λ^k / k!) · (1 − λ/n)^{n−k}
          → e^{−λ} λ^k / k!,

since for any fixed k ≥ 0 we have (1 − λ/n)^{n−k} → e^{−λ} and n(n − 1) · · · (n − k + 1)/n^k → 1.

Exercise 2 (solution) The condition P(X = x and Y = y) = P(X = x)P(Y = y) for all x, y can be rewritten as f_{X,Y}(x, y) = f_X(x) f_Y(y) for all x, y. Suppose that f_{X,Y}(x, y) = g(x)h(y) for all x, y. Then

f_X(x) = Σ_y f_{X,Y}(x, y) = g(x) Σ_y h(y),    f_Y(y) = h(y) Σ_x g(x).

Now

1 = Σ_x f_X(x) = (Σ_x g(x))(Σ_y h(y)),

so that

f_X(x) f_Y(y) = g(x)h(y) (Σ_x g(x))(Σ_y h(y)) = g(x)h(y) = f_{X,Y}(x, y) for all x, y,

which is the very definition of X and Y being independent.

Exercise 3 (solution) Let us notice that for X and Y non-negative integer-valued independent random variables,

P[X + Y = n] = Σ_{k=0}^{n} P[X = n − k] P[Y = k],

so in our specific case of Poisson random variables

P[X + Y = n] = Σ_{k=0}^{n} (e^{−λ} λ^{n−k} / (n − k)!) (e^{−µ} µ^k / k!)
             = (e^{−λ−µ} / n!) Σ_{k=0}^{n} C(n, k) λ^{n−k} µ^k
             = e^{−λ−µ} (λ + µ)^n / n!.

For the second part of the question,

P[X = k | X + Y = n] = P[X = k, X + Y = n] / P[X + Y = n]
                     = P[X = k] P[Y = n − k] / P[X + Y = n]
                     = C(n, k) λ^k µ^{n−k} / (λ + µ)^n.

Hence the conditional distribution is binomial with parameters n and λ/(λ + µ).
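Both parts of the Exercise 3 solution can be verified numerically. The sketch below checks the convolution identity and the binomial conditional distribution for one illustrative choice of λ, µ, and n (these particular values are not from the exercise).

```python
from math import comb, exp, factorial

def pois(lam, k):
    """Poisson(lam) pmf at k."""
    return exp(-lam) * lam**k / factorial(k)

lam, mu, n = 2.0, 3.0, 4

# Part (1): the convolution sum from the solution should equal the
# Poisson(lam + mu) pmf at n.
conv = sum(pois(lam, n - k) * pois(mu, k) for k in range(n + 1))
print(conv, pois(lam + mu, n))

# Part (2): the conditional pmf should match Binomial(n, lam/(lam + mu)).
p = lam / (lam + mu)
for k in range(n + 1):
    cond = pois(lam, k) * pois(mu, n - k) / pois(lam + mu, n)
    print(k, cond, comb(n, k) * p**k * (1 - p)**(n - k))
```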