1. (i) Given that X and Y are independent random variables, i.e. σ(X) and σ(Y) are independent σ-algebras.
(a) Since x ↦ max(x, 0) and x ↦ −min(x, 0) are Borel-measurable functions from R to R, σ(X^+) ⊆ σ(X), σ(X^-) ⊆ σ(X), σ(Y^+) ⊆ σ(Y) and σ(Y^-) ⊆ σ(Y).
Hence, X^+ and Y^+, X^+ and Y^-, X^- and Y^+, and X^- and Y^- are independent random variables.
(b) Since Z is the product of X and Y, the positive and negative parts of Z can be written as
Z^+ = X^+ Y^+ + X^- Y^-
Z^- = X^- Y^+ + X^+ Y^-
(c)
E[XY] = E[Z]
= E[Z^+] − E[Z^-]
= E[X^+ Y^+] + E[X^- Y^-] − E[X^- Y^+] − E[X^+ Y^-]
Each expectation on the RHS involves a pair of independent, non-negative random variables. Hence, approximating each by simple random variables and applying the MCT, each expectation factors, e.g. E[X^+ Y^+] = E[X^+] E[Y^+]. Collecting terms,
E[XY] = (E[X^+] − E[X^-])(E[Y^+] − E[Y^-]) = E[X] E[Y].
Hence proved.
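As a quick sanity check of the factorization E[XY] = E[X]E[Y] for independent X and Y, here is a small Monte Carlo sketch; the particular distributions (uniform and exponential) are arbitrary illustrative choices, not taken from the problem.

```python
import random

random.seed(0)
N = 200_000
# Independent draws: X ~ Uniform(-1, 2), Y ~ Exponential(1) (illustrative choices).
xs = [random.uniform(-1.0, 2.0) for _ in range(N)]
ys = [random.expovariate(1.0) for _ in range(N)]

mean_xy = sum(x * y for x, y in zip(xs, ys)) / N
mean_x = sum(xs) / N
mean_y = sum(ys) / N

# E[XY] and E[X]E[Y] should both be close to 0.5 * 1 = 0.5.
print(mean_xy, mean_x * mean_y)
```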
(ii) Consider the following:
var(∏_{i=1}^n X_i) = E[(∏_{i=1}^n X_i)^2] − (E[∏_{i=1}^n X_i])^2
= ∏_{i=1}^n E(X_i^2) − ∏_{i=1}^n (E(X_i))^2 (by independence)
= ∏_{i=1}^n (var(X_i) + (E(X_i))^2) − ∏_{i=1}^n (E(X_i))^2.
The desired result follows by dividing both sides by ∏_{i=1}^n (E(X_i))^2.
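The product-variance identity above can be spot-checked numerically. The three uniform factor distributions below are arbitrary illustrative choices with nonzero means.

```python
import random
from math import prod

random.seed(1)
N = 500_000
# Three independent factors (illustrative choices): U(1,2), U(2,3), U(1,3).
params = [(1.0, 2.0), (2.0, 3.0), (1.0, 3.0)]
means = [(a + b) / 2 for a, b in params]
variances = [(b - a) ** 2 / 12 for a, b in params]

samples = [prod(random.uniform(a, b) for a, b in params) for _ in range(N)]
m = sum(samples) / N
emp_var = sum((s - m) ** 2 for s in samples) / N

# var(prod X_i) = prod(var_i + mean_i^2) - prod(mean_i^2)
theo = prod(v + mu * mu for v, mu in zip(variances, means)) - prod(mu * mu for mu in means)
print(emp_var, theo)
```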
2. (a) The failure time of the workstation is given by max(T1, min(T2, T3)). Given that the Ti's are i.i.d. exponential with parameter 1, the pdf of each Ti is f_{Ti}(x) = e^{-x}, x ≥ 0. If X1, X2, ..., Xn are i.i.d. exponential with parameter µ, then min(X1, ..., Xn) is also exponential with parameter nµ. Hence, the pdf of Z = min(T2, T3) is f_Z(z) = 2e^{-2z}, z ≥ 0. Let Y = max(T1, Z). Then,
P(Y ≤ y) = P(T1 ≤ y, Z ≤ y) = P(T1 ≤ y)P(Z ≤ y), because T1 and Z are independent. Hence, the cdf
of Y is given by (1 − e−2y )(1 − e−y ), y ≥ 0 ⇒ the pdf of Y is fY (y) = e−y + 2e−2y − 3e−3y , y ≥ 0. Now,
E(Y) = ∫_0^∞ y f_Y(y) dy = 7/6.
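A Monte Carlo sketch to corroborate E(Y) = 7/6 for Y = max(T1, min(T2, T3)) with i.i.d. Exp(1) lifetimes:

```python
import random

random.seed(2)
N = 400_000
total = 0.0
for _ in range(N):
    t1, t2, t3 = (random.expovariate(1.0) for _ in range(3))
    total += max(t1, min(t2, t3))  # workstation failure time
est = total / N
print(est)  # should be close to 7/6 ≈ 1.1667
```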
(b) The pdf of Z is f_Z(z) = e^{-z}, z ≥ 0. Now, P(Z_int = k) = ∫_k^{k+1} f_Z(z) dz = e^{-k} − e^{-(k+1)}, k ∈ Z^+ ∪ {0}.
Therefore, E(Z_int) = Σ_{k=0}^∞ k P(Z_int = k) = e^{-1} + e^{-2} + e^{-3} + ... = 1/(e − 1).
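Similarly, the value E(Z_int) = 1/(e − 1) ≈ 0.582 can be checked by simulating the integer part of an Exp(1) variable:

```python
import math
import random

random.seed(3)
N = 400_000
# Z ~ Exp(1); Z_int = floor(Z)
est = sum(math.floor(random.expovariate(1.0)) for _ in range(N)) / N
print(est, 1 / (math.e - 1))
```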
(c) Let X ~ N(µ, σ²) and Y = e^X. Then we have the following,
E(Y) = E(e^X) = ∫_{-∞}^∞ (1/√(2πσ²)) e^{(−x² − µ² + 2x(µ+σ²))/(2σ²)} dx
= e^{µ + σ²/2} ∫_{-∞}^∞ (1/√(2πσ²)) e^{−(x − (µ+σ²))²/(2σ²)} dx
= e^{µ + σ²/2}.
Now, the total gross return from day 1 to n is G_{1,n} = G_{1,2} G_{2,3} ... G_{n−1,n}, and the G_{k,k+1}'s are given to be i.i.d. lognormal with parameters µ and σ², ∀k ≥ 1. Thus, E(G_{1,n}) = E(G_{1,2}) E(G_{2,3}) ... E(G_{n−1,n}) = E(G_{1,2})^{n−1} = e^{(n−1)(µ + σ²/2)}.
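The lognormal mean e^{µ+σ²/2} derived above is easy to confirm numerically; the values of µ and σ below are arbitrary illustrative choices.

```python
import math
import random

random.seed(4)
mu, sigma = 0.1, 0.2  # illustrative parameter values
N = 300_000
# Y = e^X with X ~ N(mu, sigma^2)
est = sum(math.exp(random.gauss(mu, sigma)) for _ in range(N)) / N
print(est, math.exp(mu + sigma ** 2 / 2))
```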
3. (a) (i) Given that M_{Z1}(s) = (M_X(s))^5. Differentiating M_{Z1}(s) w.r.t. s and evaluating at s = 0 (where M_X(0) = 1), we get
E(Z1) = 5 (M_X(s))^4 dM_X(s)/ds |_{s=0} = 5E(X).
Now,
E(Z1^2) = [20 (M_X(s))^3 (dM_X(s)/ds)^2 + 5 (M_X(s))^4 d²M_X(s)/ds²] |_{s=0} = 20(E(X))^2 + 5E(X^2).
So,
Var(Z1) = E(Z1^2) − (E(Z1))^2 = 20(E(X))^2 + 5E(X^2) − 25(E(X))^2 = 5Var(X).
(ii) Given that M_{Z2}(s) = e^{6s} M_X(s). From the properties of MGFs, we can write Z2 = X + 6 (in distribution). So,
E(Z2) = E(X) + 6, and Var(Z2) = Var(X).
(iii) Given that M_{Z3}(s) = M_X(as), i.e. Z3 = aX (in distribution). Differentiating M_{Z3}(s) w.r.t. s at s = 0, we get
E(Z3) = aE(X), and Var(Z3) = a^2 Var(X).
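The three MGF identities can be checked by simulating the corresponding transformations directly; X ~ Exp(1) (so E(X) = Var(X) = 1) and a = 3 are illustrative choices.

```python
import random
import statistics

random.seed(5)
N = 200_000
a = 3.0  # illustrative scale

def x():
    return random.expovariate(1.0)  # X ~ Exp(1): E(X) = Var(X) = 1

z1 = [sum(x() for _ in range(5)) for _ in range(N)]  # (M_X(s))^5 : sum of 5 i.i.d. copies
z2 = [x() + 6.0 for _ in range(N)]                   # e^{6s} M_X(s) : X + 6
z3 = [a * x() for _ in range(N)]                     # M_X(as) : aX

print(statistics.fmean(z1), statistics.pvariance(z1))  # ~ 5E(X) = 5,     5Var(X) = 5
print(statistics.fmean(z2), statistics.pvariance(z2))  # ~ E(X) + 6 = 7,  Var(X) = 1
print(statistics.fmean(z3), statistics.pvariance(z3))  # ~ aE(X) = 3,     a^2 Var(X) = 9
```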
(b) Let G(z) be the PGF of the Yi's:
G(z) = Σ_{k=0}^∞ P(Yi = k) z^k. (1)
Now, G_0(z) = z, since P(X_0 = 1) = 1 and P(X_0 = x) = 0 for x ≠ 1, and G_1(z) = G(z). Also,
X_{n+1} = Y_1^{(n)} + Y_2^{(n)} + · · · + Y_{X_n}^{(n)},
where Y_i^{(n)} is the size of the family produced by the i-th member of the n-th generation. So, since X_{n+1} is the sum of a random number of independent and identically distributed RVs, we have
G_{n+1}(z) = G_n(G(z)).
Iterating this result, we get G_n(z) = G(G(· · · G(z) · · · )), the n-fold composition of G with itself.
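The composition identity gives a practical way to compute extinction probabilities, since P(X_n = 0) = G_n(0). The offspring pmf below (P(Y = 0) = 1/4, P(Y = 1) = 1/4, P(Y = 2) = 1/2) is an illustrative assumption, not part of the problem; the iterated-PGF value is compared against a direct simulation of the branching process.

```python
import random

random.seed(6)

def G(z):
    # PGF of the illustrative offspring pmf: 1/4, 1/4, 1/2 on {0, 1, 2}.
    return 0.25 + 0.25 * z + 0.5 * z * z

def G_n(z, n):
    # n-fold composition of G; G_n(0) = P(X_n = 0).
    for _ in range(n):
        z = G(z)
    return z

def population_after(n):
    pop = 1  # X_0 = 1
    for _ in range(n):
        pop = sum(random.choices([0, 1, 2], weights=[1, 1, 2])[0] for _ in range(pop))
    return pop

N, n = 100_000, 3
frac_extinct = sum(population_after(n) == 0 for _ in range(N)) / N
print(frac_extinct, G_n(0.0, n))  # both ≈ 0.395
```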
4. Given that X is a non-negative random variable with a Moment Generating Function such that MX (s) < ∞
∀s ∈ (−∞, a] for some positive a.
(a) Given M_X(s) < ∞ ∀s ∈ (−∞, a]. Fix s < a and choose ε > 0 such that s + ε ≤ a. Since X ≥ 0, clearly
ε^k X^k / k! ≤ e^{εX}
⇒ X^k ≤ e^{εX} k! / ε^k
⇒ e^{sX} X^k ≤ e^{(s+ε)X} k! / ε^k
⇒ E[e^{sX} X^k] ≤ E[e^{(s+ε)X}] k! / ε^k < ∞.
Hence, E[e^{sX} X^k] < ∞ for every s < a.
(c) To prove that (e^{hX} − 1)/h ≤ X e^{hX} for h > 0.
Let Y = hX. Rearranging the terms, we need to prove that e^Y − Y e^Y ≤ 1, or equivalently, it is enough to prove that
g(Y) = e^Y (Y − 1) ≥ −1.
Since g'(Y) = Y e^Y, g has a minimum at Y = 0, and the minimum value is g(0) = −1.
⇒ g(Y) ≥ −1
⇒ e^Y (Y − 1) ≥ −1.
Hence proved.
(d) Define X_h = (e^{hX} − 1)/h. Then lim_{h↓0} X_h = X, i.e. X_h → X pointwise. By part (a) with s = h and k = 1, E[X e^{hX}] < ∞ for 0 < h < a. Fix h_0 ∈ (0, a); by part (c), 0 ≤ X_h ≤ X e^{hX} ≤ X e^{h_0 X} for all 0 < h ≤ h_0, and the dominating random variable X e^{h_0 X} is integrable. Applying the DCT, we get
E[X] = E[lim_{h↓0} X_h] = lim_{h↓0} E[X_h] = lim_{h↓0} (E[e^{hX}] − 1)/h.
Hence proved.
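Parts (c) and (d) can be illustrated numerically: the pointwise inequality of (c) holds sample by sample, and for small h the quantity (E[e^{hX}] − 1)/h approaches E[X] from above. X ~ Exp(2) (so E[X] = 1/2 and the MGF is finite for s < 2) is an illustrative choice.

```python
import math
import random

random.seed(7)
xs = [random.expovariate(2.0) for _ in range(300_000)]
mean_x = sum(xs) / len(xs)

def xh_mean(h):
    # Sample estimate of E[X_h] = (E[e^{hX}] - 1) / h
    return (sum(math.exp(h * x) for x in xs) / len(xs) - 1) / h

# Pointwise inequality from part (c): (e^{hx} - 1)/h <= x e^{hx}
assert all((math.exp(0.5 * x) - 1) / 0.5 <= x * math.exp(0.5 * x) + 1e-9 for x in xs[:1000])

for h in [0.5, 0.1, 0.01]:
    print(h, xh_mean(h))  # decreases toward E[X] = 0.5 as h -> 0
print(mean_x)
```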
5. (i) We can easily obtain the following results from the joint PMF.
pY(y) = 3/5 if y = 0; 2/5 if y = 1; 0 otherwise.
pX|Y(0|1) = 1, pX|Y(1|1) = 0, pX|Y(1|0) = 2/3, pX|Y(0|0) = 1/3.
Z is a random variable and is a function of Y as given by,
Z = E[X|Y] = E[X|Y = 0] if Y = 0; E[X|Y = 1] if Y = 1.
⇒ Z = 2/3 with probability 3/5; 0 with probability 2/5.
⇒ pZ(z) = 3/5 if z = 2/3; 2/5 if z = 0; 0 otherwise.
⇒ E[Z] = (2/3)(3/5) = 2/5.
V is also a function of Y as given by,
V = Var[X|Y] = Var[X|Y = 0] if Y = 0; Var[X|Y = 1] if Y = 1.
⇒ V = 2/9 with probability 3/5; 0 with probability 2/5.
⇒ pV(v) = 3/5 if v = 2/9; 2/5 if v = 0; 0 otherwise.
⇒ E[V] = (2/9)(3/5) = 2/15.
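The numbers above can be cross-checked with exact arithmetic, together with the laws of iterated expectation (E[Z] = E[X]) and total variance (Var(X) = E[V] + Var(Z)); the joint pmf is reconstructed from the marginal and conditionals above.

```python
from fractions import Fraction as F

# Joint pmf of (X, Y) implied by p_Y and p_{X|Y} above.
pmf = {(0, 0): F(1, 5), (1, 0): F(2, 5), (0, 1): F(2, 5), (1, 1): F(0, 1)}
assert sum(pmf.values()) == 1

EX = sum(x * p for (x, _), p in pmf.items())
VarX = sum(x * x * p for (x, _), p in pmf.items()) - EX ** 2

# Z = E[X|Y] takes the value 2/3 w.p. 3/5 and 0 w.p. 2/5.
EZ = F(2, 3) * F(3, 5)
VarZ = F(2, 3) ** 2 * F(3, 5) - EZ ** 2
# V = Var[X|Y] takes the value 2/9 w.p. 3/5 and 0 w.p. 2/5.
EV = F(2, 9) * F(3, 5)

print(EZ, EX)            # both 2/5 (iterated expectations)
print(EV)                # 2/15
print(VarX, EV + VarZ)   # both 6/25 (total variance)
```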
6. (a) Here, E(X) = (1)(1/2) + (−1)(1/2) = 0 = E(Y). Also, E(X^2) = E(Y^2) = (1/2)(1^2 + (−1)^2) = 1. Now, E(Z) = r E(Z|Z = X) + (1 − r) E(Z|Z = Y) = 0. Note that since X and Y are independent, we have E(XY) = E(X)E(Y) = 0. Hence, E(XZ) = r E(X^2) + (1 − r) E(XY) = r. And, E(Z^2) = r E(Z^2|Z = X) + (1 − r) E(Z^2|Z = Y) = r + 1 − r = 1. Thus, Cov(X, Z) = E(XZ) − E(X)E(Z) = r − 0 = r ⇒ ρ_{X,Z} = r / (√var(X) √var(Z)) = r.
(b) E(X) = E(cos(Z)) = ∫_0^{2π} cos(z)/(2π) dz = 0. Similarly, one can verify that E(Y) = 0. Now, E(XY) = E(sin(2Z))/2 = 0 (check!). Hence, ρ_{X,Y} = 0.
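Both parts lend themselves to simulation: the correlation ρ_{X,Z} should come out near r, and cos(Z), sin(Z) should be (nearly) uncorrelated despite being dependent. The value r = 0.3 is an illustrative choice.

```python
import math
import random

random.seed(8)
N = 400_000
r = 0.3  # illustrative mixing probability

def corr(us, vs):
    n = len(us)
    mu, mv = sum(us) / n, sum(vs) / n
    cov = sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / n
    su = math.sqrt(sum((u - mu) ** 2 for u in us) / n)
    sv = math.sqrt(sum((v - mv) ** 2 for v in vs) / n)
    return cov / (su * sv)

xs, zs = [], []
for _ in range(N):
    x = random.choice([1, -1])
    y = random.choice([1, -1])
    xs.append(x)
    zs.append(x if random.random() < r else y)  # Z = X w.p. r, else Z = Y
print(corr(xs, zs))  # ≈ r

thetas = [random.uniform(0.0, 2.0 * math.pi) for _ in range(N)]
print(corr([math.cos(t) for t in thetas], [math.sin(t) for t in thetas]))  # ≈ 0
```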
7. (a) Let X_i be the indicator RV corresponding to the event of the i-th pizza type being ordered, i.e. if at least one person orders pizza type i, then X_i = 1, otherwise X_i = 0. Let D be the RV indicating the number of different types of pizza that have been ordered. Thus D = X_1 + X_2 + ... + X_n. We need to find E[D].
E[D] = E[E[D|K]]
= E[E[X_1 + X_2 + · · · + X_n | K]]
= n E[E[X_i | K]], since the X_i are identically distributed
= n E[1 − ((n−1)/n)^K]
= n − n E[((n−1)/n)^K]
= n − n E[e^{sK}], where s = log((n−1)/n)
= n − n M_K(log((n−1)/n)).
(b) E[N] = E[E[N|M]]
= E[E[X_1 + X_2 + · · · + X_M | M]]
= E[M E[X_i | M]], since the X_i are i.i.d.
= E[M E[X_i]], since the X_i are independent of M
= E[M] E[X_i], since E[X_i] is a constant.
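To exercise the formula E[D] = n − n·M_K(log((n−1)/n)) from part (a), one needs a distribution for K; below, K ~ Poisson(λ) is an assumed illustrative choice (then M_K(s) = e^{λ(e^s − 1)}, so E[D] = n(1 − e^{−λ/n})), and each customer picks one of the n types uniformly at random.

```python
import math
import random

random.seed(9)
n, lam = 10, 8.0  # n pizza types; K ~ Poisson(lam) is an assumption for illustration

def poisson(l):
    # Knuth's multiplication method (fine for small l)
    limit, k, p = math.exp(-l), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

N = 100_000
total = 0
for _ in range(N):
    k = poisson(lam)
    total += len({random.randrange(n) for _ in range(k)})  # distinct types ordered
est = total / N

theory = n - n * math.exp(-lam / n)  # n - n*M_K(log((n-1)/n)) for Poisson K
print(est, theory)
```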