
EE 5110: Probability Foundations

Tutorial 8

July-November 2015

1. [Independence and Expectation]


(a) Let X and Y be independent random variables (not necessarily discrete or continuous) and let Z = XY. Let
X+(ω) = max(X(ω), 0) be the positive part of X and X−(ω) = min(X(ω), 0) be the negative part of X.
(i) Prove that X+ and Y+ are independent, and similarly for each of the pairs (X+, Y−), (X−, Y+) and (X−, Y−).
(ii) Express Z+ and Z− in terms of X+, X−, Y+ and Y−.
(iii) Prove that E[XY] = E[X]E[Y].
(b) Let X1, X2, ..., Xn be independent random variables with non-zero finite expectations. Show that

\[
\frac{\operatorname{var}\left(\prod_{i=1}^{n} X_i\right)}{\prod_{i=1}^{n} E(X_i)^2}
= \prod_{i=1}^{n} \left( \frac{\operatorname{var}(X_i)}{E(X_i)^2} + 1 \right) - 1 .
\]
2. [Expectation of Continuous RV’s]
(a) A workstation consists of three machines, M1, M2 and M3, each of which will fail after an amount of time Ti
which is exponentially distributed with parameter 1. Assume that the times to failure of the different machines
are independent. The workstation fails as soon as both of the following have happened:
(i) Machine M1 has failed.
(ii) At least one of the machines M2 or M3 has failed.
Find the expected value of the time to failure of the workstation.
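Answers to parts like this can be sanity-checked numerically. Below is a minimal Monte Carlo sketch, not part of the problem itself; the seed and sample size are arbitrary choices.

```python
import random

def workstation_failure_time(rng):
    # Independent Exp(1) failure times for M1, M2 and M3.
    t1 = rng.expovariate(1.0)
    t2 = rng.expovariate(1.0)
    t3 = rng.expovariate(1.0)
    # The workstation is down once M1 has failed AND at least
    # one of M2, M3 has failed.
    return max(t1, min(t2, t3))

rng = random.Random(0)
trials = 200_000
estimate = sum(workstation_failure_time(rng) for _ in range(trials)) / trials
```

Comparing `estimate` against the closed-form answer is a quick way to catch algebra slips.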
(b) Let Z be an exponential random variable with parameter λ = 1 and Zint = ⌊Z⌋. Compute E(Zint ).
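Part (b) admits the same kind of numerical check: ⌊Z⌋ takes the value k exactly when Z ∈ [k, k + 1). A minimal simulation sketch (seed and sample size are arbitrary):

```python
import math
import random

rng = random.Random(1)
trials = 200_000
# Z ~ Exp(1); Z_int = floor(Z) counts the whole unit intervals below Z.
estimate = sum(math.floor(rng.expovariate(1.0)) for _ in range(trials)) / trials
```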
(c) Suppose Sk and Sn are the prices of a financial instrument on days k and n, respectively. For k < n, the gross
return Gk,n between days k and n is defined as Gk,n = Sn/Sk and is equal to the amount of money you would
have on day n if you invested $1 on day k. Let Gk,k+1 be a lognormal random variable with parameters µ and σ²,
∀k ≥ 1, and let the random variables Gj,j+1 and Gk,k+1 be independent and identically distributed ∀j ≠ k. Find
the expected total gross return from day 1 to day n.
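Since G1,n is a product of i.i.d. lognormal daily returns, the expectation can be checked by simulation for concrete parameter values; µ, σ and the horizon below are arbitrary choices for illustration:

```python
import random

rng = random.Random(2)
mu, sigma, n_days = 0.01, 0.1, 10
trials = 100_000
total = 0.0
for _ in range(trials):
    # Gross return from day 1 to day n is the product of the n - 1
    # independent daily gross returns, each lognormal(mu, sigma^2).
    g = 1.0
    for _ in range(n_days - 1):
        g *= rng.lognormvariate(mu, sigma)
    total += g
estimate = total / trials
```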
3. [MGF and PGF]
(a) Let X be a random variable with its expected value, variance and moment generating function, denoted as E(X),
var(X) and MX (s) respectively. Each part of this problem introduces a new random variable. Determine the
expected values and variances of these random variables in terms of E(X) and var(X).
(i) MZ1(s) = [MX(s)]^5.
(ii) MZ2(s) = e^{6s} MX(s).
(iii) MZ3(s) = MX(as), where a is some finite positive real constant.
(b) Suppose there are X0 individuals in the initial generation of a population. In the nth generation, the Xn individuals
independently give rise to numbers of offspring $Y_1^{(n)}, Y_2^{(n)}, \ldots, Y_{X_n}^{(n)}$, which are i.i.d. random
variables. The total number of individuals produced at the (n + 1)st generation will then be
$X_{n+1} = Y_1^{(n)} + Y_2^{(n)} + \ldots + Y_{X_n}^{(n)}$. Then, {Xn} is called a branching process. Let Xn be the size of the nth generation of
a branching process with family-size probability generating function G(z), and let X0 = 1. Show that the
probability generating function Gn(z) of Xn satisfies Gn+1(z) = G(Gn(z)) for n ≥ 0. Also, prove that
E(Xn) = E(Xn−1)G′(1).
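Iterating E(Xn) = E(Xn−1)G′(1) with X0 = 1 gives E(Xn) = G′(1)^n, which is easy to check by simulation. The sketch below assumes a Poisson family-size distribution purely for illustration; the sampler, rate and trial count are arbitrary choices, not part of the problem:

```python
import math
import random

def poisson_sample(rng, lam):
    # Knuth's inversion method; adequate for small lam.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def nth_generation_size(rng, mean_offspring, n):
    # Branching process with X0 = 1: each individual in a generation
    # independently produces a Poisson(mean_offspring) family.
    x = 1
    for _ in range(n):
        x = sum(poisson_sample(rng, mean_offspring) for _ in range(x))
    return x

rng = random.Random(3)
m, n, trials = 1.2, 3, 100_000
estimate = sum(nth_generation_size(rng, m, n) for _ in range(trials)) / trials
# For Poisson(m) offspring, G'(1) = m, so E(X_n) should be m**n.
```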

4. [DCT] Suppose that X is a non-negative random variable and that MX (s) < ∞ ∀s ∈ (−∞, a] where a is a positive
number.
(a) Show that E[X k ] < ∞, for every k.
(b) Show that E[X k esX ] < ∞, for every s < a.
(c) Show that $\frac{e^{hX} - 1}{h} \leq X e^{hX}$ for h > 0.
(d) Argue that $E[X] = E\left[\lim_{h \downarrow 0} \frac{e^{hX} - 1}{h}\right] = \lim_{h \downarrow 0} \frac{E[e^{hX}] - 1}{h}$.
5. [Conditional Expectation]
(i) The joint PMF of random variables X and Y is given in the table below.

          X = 0    X = 1
  Y = 0    1/5      2/5
  Y = 1    2/5       0

Let Z = E(X|Y) and V = Var(X|Y). Find the PMFs of Z and V, and compute E(Z) and E(V).
(ii) Consider a sequence of i.i.d. random variables {Zi} where P(Zi = 0) = P(Zi = 1) = 1/2. Using this sequence,
define a new sequence of random variables {Xn} as follows:
X0 = 0,
X1 = 2Z1 − 1, and
Xn = Xn−1 + (1 + Z1 + ... + Zn−1)(2Zn − 1) for n ≥ 2.
Show that E[Xn+1 |X0 , X1 , ..., Xn ] = Xn a.s. for all n.
6. [Correlation Coefficient]
(a) Take 0 ≤ r ≤ 1. Let X and Y be independent random variables taking values ±1, each with probability 1/2. Set
Z = X with probability r and Z = Y with probability 1 − r. Find ρX,Z.
(b) Let Z be a continuous random variable uniformly distributed on [0, 2π]. Define random variables X = cos(Z)
and Y = sin(Z). Evaluate the correlation coefficient ρX,Y.
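A sample-correlation estimate over many draws of Z gives a quick numerical check of part (b); the seed and sample size below are arbitrary:

```python
import math
import random

rng = random.Random(5)
n = 100_000
xs, ys = [], []
for _ in range(n):
    z = rng.uniform(0.0, 2.0 * math.pi)  # Z ~ Uniform[0, 2*pi]
    xs.append(math.cos(z))
    ys.append(math.sin(z))

# Sample Pearson correlation coefficient of X = cos(Z), Y = sin(Z).
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
rho = cov / (sx * sy)
```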
7. [Iterated Expectation]
(a) The number of people that enter a pizzeria in a period of 15 minutes is a (nonnegative integer) random variable K
with known moment generating function MK (s). Each person who comes in buys a pizza. There are n types of
pizzas, and each person is equally likely to choose any type of pizza, independently of what anyone else chooses.
Give a formula, in terms of MK(·), for the expected number of different types of pizzas ordered.
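A formula of this kind can be checked by simulating with a concrete distribution for K. The Poisson choice, its rate, and the number of types below are arbitrary assumptions for illustration; compare `estimate` against your formula evaluated at the same parameters:

```python
import math
import random

def poisson_sample(rng, lam):
    # Knuth's inversion method (illustrative helper, fine for moderate lam).
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(6)
n_types, lam, trials = 4, 10.0, 100_000
total_distinct = 0
for _ in range(trials):
    k = poisson_sample(rng, lam)  # number of customers in the period
    # Each customer independently picks one of n_types pizzas uniformly.
    chosen = {rng.randrange(n_types) for _ in range(k)}
    total_distinct += len(chosen)
estimate = total_distinct / trials
```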
(b) John takes a taxi home every day after work. Every evening, he waits by the road for a taxi, but each taxi that
comes by is occupied with probability 0.8, independently of the others. He counts the number of taxis he missed
until he gets an unoccupied taxi. Once he gets inside the taxi, he throws a fair six-faced die a number of times
equal to the number of taxis he missed, and gives the driver a tip equal to the total of the die throws. Find the
expected tip that John gives every day.
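The two-stage structure here (a random number of missed taxis, then that many die throws) is easy to simulate as a check on an iterated-expectation answer; the seed and trial count are arbitrary:

```python
import random

def daily_tip(rng):
    # Count taxis that pass occupied (each with probability 0.8) before
    # the first unoccupied one arrives.
    missed = 0
    while rng.random() < 0.8:
        missed += 1
    # One fair six-faced die throw per missed taxi; the tip is the total.
    return sum(rng.randint(1, 6) for _ in range(missed))

rng = random.Random(4)
trials = 200_000
estimate = sum(daily_tip(rng) for _ in range(trials)) / trials
```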
