
Statistics 116 - Fall 2004

Theory of Probability
Practice Final # 2 Solutions
Instructions: Answer Q. 1-6. All questions have equal weight.
Q. 1) Two dice are rolled. Let X and Y denote, respectively, the largest and
smallest values obtained.
(a) Compute the conditional mass function of Y given X = i for i =
1, . . . , 6.
(b) Are X and Y independent? Why?
A:
(a) Given that X is the largest of the two rolls and X = i, if we want to
compute P(Y = j | X = i) there are two possibilities: either j < i or
j = i. In the first case, if j < i and R_1 is the first roll and R_2 is
the second roll, we have

$$P(Y = j \mid X = i) = \frac{P(Y = j, X = i)}{P(X = i)}
= \frac{P(\{R_1 = j, R_2 = i\} \cup \{R_1 = i, R_2 = j\})}{P(X = i)}
= \frac{2}{36\,P(X = i)}.$$

In the second case, i.e. i = j, we have

$$P(Y = i \mid X = i) = \frac{P(Y = i, X = i)}{P(X = i)} = \frac{1}{36\,P(X = i)}.$$
Therefore, the only thing remaining to compute is P(X = i), which
is given by

$$P(X = i)
= P\left(\bigcup_{k=1}^{i-1}\big(\{R_1 = k, R_2 = i\} \cup \{R_1 = i, R_2 = k\}\big) \cup \{R_1 = i, R_2 = i\}\right)
= \sum_{k=1}^{i-1}\left(\frac{1}{36} + \frac{1}{36}\right) + \frac{1}{36}
= \frac{2i - 1}{36},$$

and

$$P(Y = j \mid X = i) =
\begin{cases}
\dfrac{2}{2i - 1} & j < i \\[6pt]
\dfrac{1}{2i - 1} & j = i.
\end{cases}$$
(b) The two random variables can't be independent, simply because
Y ≤ X always. Another way to see this: if X and Y were independent,
then for all i and j

$$P(Y = j \mid X = i) = P(Y = j).$$

However, because P(Y = j | X = i) depends on i, it can't be equal to
a function of j alone.
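As a numerical sanity check, here is a minimal Monte Carlo sketch in Python (NumPy assumed available; the choice i = 4 and the variable names are ours) that estimates the conditional mass function and compares it with 2/(2i − 1) and 1/(2i − 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
r1 = rng.integers(1, 7, size=n)   # first roll
r2 = rng.integers(1, 7, size=n)   # second roll
X = np.maximum(r1, r2)            # largest value
Y = np.minimum(r1, r2)            # smallest value

i = 4
y_given_i = Y[X == i]
for j in range(1, i + 1):
    est = np.mean(y_given_i == j)
    exact = 2 / (2 * i - 1) if j < i else 1 / (2 * i - 1)
    print(f"P(Y={j}|X={i}): simulated {est:.4f}, exact {exact:.4f}")
```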
Q. 2) Let X and Y be independent, non-negative, continuous random vari-
ables with respective hazard rate functions λ_X(t) and λ_Y(t), and set
W = min(X, Y).

(a) Determine the distribution function of W in terms of those of X and Y.

(b) Show that λ_W(t), the hazard rate function of W, is given by

$$\lambda_W(t) = \lambda_X(t) + \lambda_Y(t).$$
A:
(a) Let W = min(X, Y). We must compute

$$\begin{aligned}
F_W(w) = P(W \le w) &= P(\min(X, Y) \le w) \\
&= P(\{X \le w\} \cup \{Y \le w\}) \\
&= P(X \le w) + P(Y \le w) - P(X \le w, Y \le w) \\
&= P(X \le w) + P(Y \le w) - P(X \le w)\,P(Y \le w) \\
&= F_X(w) + F_Y(w) - F_X(w)\,F_Y(w).
\end{aligned}$$
(b) For every pair (s, t) such that s < t we have

$$\begin{aligned}
\exp\left(-\int_s^t \lambda_W(u)\,du\right)
&= P(W > t \mid W > s) \\
&= P(\min(X, Y) > t \mid \min(X, Y) > s) \\
&= P(X > t, Y > t \mid X > s, Y > s) \\
&= \frac{P(X > t, Y > t, X > s, Y > s)}{P(X > s, Y > s)} \\
&= \frac{P(X > t, Y > t)}{P(X > s, Y > s)} \\
&= \frac{P(X > t)\,P(Y > t)}{P(X > s)\,P(Y > s)} \\
&= \frac{P(X > t)}{P(X > s)} \cdot \frac{P(Y > t)}{P(Y > s)} \\
&= \exp\left(-\int_s^t \lambda_X(u)\,du\right)\exp\left(-\int_s^t \lambda_Y(u)\,du\right) \\
&= \exp\left(-\int_s^t \big(\lambda_X(u) + \lambda_Y(u)\big)\,du\right).
\end{aligned}$$

This is enough to imply that

$$\lambda_W(t) = \lambda_X(t) + \lambda_Y(t).$$
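Both parts can be checked numerically when X and Y are exponential, since exponentials have constant hazard rates. A minimal sketch in Python (NumPy assumed; the rates 0.5 and 1.5 and the test point w = 0.8 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
lam_x, lam_y = 0.5, 1.5               # constant hazard rates of exponentials
X = rng.exponential(1 / lam_x, size=1_000_000)
Y = rng.exponential(1 / lam_y, size=1_000_000)
W = np.minimum(X, Y)

w = 0.8
# Part (a): F_W(w) = F_X(w) + F_Y(w) - F_X(w) F_Y(w)
Fx, Fy = np.mean(X <= w), np.mean(Y <= w)
print(np.mean(W <= w), Fx + Fy - Fx * Fy)
# Part (b): lambda_W = lam_x + lam_y, so P(W > w) = exp(-(lam_x + lam_y) w)
print(np.mean(W > w), np.exp(-(lam_x + lam_y) * w))
```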
Q. 3) Suppose X_1, X_2, X_3 and X_4 are (pairwise) uncorrelated random
variables, each having mean 0 and variance 1.

(a) Compute the correlation Cor(X_1 + X_2, X_2 + X_3);

(b) Compute the correlation Cor(X_1 + X_2, X_3 + X_4).
A:
(a) Since the X_i are uncorrelated with variance 1, we have
Var(X_1 + X_2) = Var(X_2 + X_3) = 2, so

$$\begin{aligned}
\mathrm{Cor}(X_1 + X_2, X_2 + X_3)
&= \frac{\mathrm{Cov}(X_1 + X_2, X_2 + X_3)}{\sqrt{\mathrm{Var}(X_1 + X_2)\,\mathrm{Var}(X_2 + X_3)}} \\
&= \frac{E\big((X_1 + X_2)(X_2 + X_3)\big) - E(X_1 + X_2)\,E(X_2 + X_3)}{\sqrt{2}\,\sqrt{2}} \\
&= \frac{E\big((X_1 + X_2)(X_2 + X_3)\big)}{2} \\
&= \frac{E\big(X_1 X_2 + X_1 X_3 + X_2^2 + X_2 X_3\big)}{2} \\
&= \frac{1}{2}.
\end{aligned}$$

Here E(X_1 X_2) = E(X_1 X_3) = E(X_2 X_3) = 0, since the variables
are uncorrelated with mean 0, while E(X_2^2) = Var(X_2) = 1.
(b) Similarly,

$$\mathrm{Cor}(X_1 + X_2, X_3 + X_4)
= \frac{E\big((X_1 + X_2)(X_3 + X_4)\big)}{\sqrt{2}\,\sqrt{2}}
= \frac{E\big(X_1 X_3 + X_1 X_4 + X_2 X_3 + X_2 X_4\big)}{2}
= \frac{0}{2} = 0.$$
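A quick empirical check, sketched in Python (NumPy assumed); i.i.d. standard normals serve here only as one convenient example of pairwise uncorrelated variables with mean 0 and variance 1:

```python
import numpy as np

rng = np.random.default_rng(0)
# i.i.d. standard normals: pairwise uncorrelated, mean 0, variance 1
x1, x2, x3, x4 = rng.standard_normal((4, 1_000_000))

print(np.corrcoef(x1 + x2, x2 + x3)[0, 1])  # should be near 0.5
print(np.corrcoef(x1 + x2, x3 + x4)[0, 1])  # should be near 0.0
```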
Q. 4) Let X be an integer-valued (discrete) random variable and Y be a contin-
uous random variable such that

$$P(X = j,\; y \le Y \le y + dy)
= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} y^{a-1+j} (1 - y)^{n-j+b-1}\,dy,
\qquad 0 \le j \le n,\; 0 \le y \le 1.$$

You may use the fact that, for e, d > 0,

$$\int_0^1 t^{d-1} (1 - t)^{e-1}\,dt = \frac{\Gamma(e)\Gamma(d)}{\Gamma(e + d)}.$$

(a) Compute p_{X|Y}(x|y), the conditional p.m.f. of X given Y = y.

(b) Compute f_{Y|X}(y|x), the conditional p.d.f. of Y given X = x.
A:
(a) To compute p_{X|Y}(x|y) we must first find the marginal p.d.f. of Y:

$$\begin{aligned}
f_Y(y) &= \sum_{j=0}^{n} \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} y^{j+a-1} (1 - y)^{n-j+b-1} \\
&= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)}\, y^{a-1} (1 - y)^{b-1} \sum_{j=0}^{n} \binom{n}{j} y^{j} (1 - y)^{n-j} \\
&= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)}\, y^{a-1} (1 - y)^{b-1} \cdot 1 \\
&= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)}\, y^{a-1} (1 - y)^{b-1}.
\end{aligned}$$

Therefore

$$p_{X|Y}(j|y)
= \frac{\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} y^{a-1+j} (1-y)^{n-j+b-1}}
       {\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\, y^{a-1} (1-y)^{b-1}}
= \binom{n}{j} y^{j} (1 - y)^{n-j},$$

i.e., given Y = y, X is Binomial(n, y).
(b) To compute f_{Y|X}(y|x) we must first find the marginal p.m.f. of X:

$$\begin{aligned}
p_X(j) &= \int_0^1 \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} y^{j+a-1} (1 - y)^{n-j+b-1}\,dy \\
&= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} \int_0^1 y^{j+a-1} (1 - y)^{n-j+b-1}\,dy \\
&= \frac{\Gamma(a + b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} \frac{\Gamma(j + a)\Gamma(n - j + b)}{\Gamma(n + a + b)}.
\end{aligned}$$

Therefore,

$$f_{Y|X}(y|j)
= \frac{\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} y^{j+a-1} (1-y)^{n-j+b-1}}
       {\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \binom{n}{j} \frac{\Gamma(j+a)\Gamma(n-j+b)}{\Gamma(n+a+b)}}
= \frac{\Gamma(n + a + b)}{\Gamma(j + a)\Gamma(n - j + b)}\, y^{j+a-1} (1 - y)^{n-j+b-1},$$

i.e., given X = j, Y is Beta(j + a, n − j + b).
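Both conditional distributions can be checked by simulating from the joint structure identified above: Y ∼ Beta(a, b) and then X | Y = y ∼ Binomial(n, y). A minimal sketch in Python (NumPy assumed; the values a = 2, b = 3, n = 10, j = 4 are arbitrary test choices):

```python
import numpy as np
from math import comb, lgamma

rng = np.random.default_rng(0)
a, b, n = 2.0, 3.0, 10

# The joint density factors as f_Y(y) * p_{X|Y}(j|y):
# Y ~ Beta(a, b), then X | Y = y ~ Binomial(n, y)
y = rng.beta(a, b, size=1_000_000)
x = rng.binomial(n, y)

j = 4
# Check E[Y | X = j] against the Beta(j+a, n-j+b) mean, (j+a)/(n+a+b)
print(y[x == j].mean(), (j + a) / (n + a + b))

# Check the marginal p.m.f. of X against the formula for p_X(j)
def log_beta(p, q):
    return lgamma(p) + lgamma(q) - lgamma(p + q)

exact = comb(n, j) * np.exp(log_beta(j + a, n - j + b) - log_beta(a, b))
print(np.mean(x == j), exact)
```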
Q. 5) The random variables X and Y have joint density function

$$f(x, y) = \frac{1}{x^2 y^2}, \qquad x > 1,\; y > 1.$$

Compute the distribution function and p.d.f. of the random variable U = XY.
A:
We first compute the distribution function: for u ≥ 1,

$$\begin{aligned}
F_U(u) = P(U \le u)
&= \iint_{\{(x,y)\,:\,xy \le u,\; x \ge 1,\; y \ge 1\}} \frac{1}{x^2 y^2}\,dx\,dy \\
&= \int_1^u \int_1^{u/x} \frac{1}{x^2 y^2}\,dy\,dx \\
&= \int_1^u \frac{1}{x^2} \left[-\frac{1}{y}\right]_1^{u/x} dx \\
&= \int_1^u \frac{1}{x^2} \left(1 - \frac{x}{u}\right) dx \\
&= \int_1^u \left(\frac{1}{x^2} - \frac{1}{xu}\right) dx \\
&= \left[-\frac{1}{x}\right]_1^u - \frac{1}{u}\Big[\log x\Big]_1^u \\
&= 1 - \frac{1}{u} - \frac{\log u}{u} \\
&= 1 - \frac{\log u + 1}{u}.
\end{aligned}$$

To compute the density, we take the derivative: for u > 1,

$$f_U(u) = \frac{d}{du} F_U(u) = -\frac{1 - (\log u + 1)}{u^2} = \frac{\log u}{u^2}.$$
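Since the marginal density of X (and likewise of Y) is 1/x² on (1, ∞), with c.d.f. 1 − 1/x, both variables are easy to sample by inverse transform, which gives a quick check of F_U. A minimal sketch in Python (NumPy assumed; the test points are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
# Inverse transform: F(x) = 1 - 1/x on (1, inf), so X = 1/(1 - V)
# with V ~ Uniform[0, 1) has marginal density 1/x^2
X = 1 / (1 - rng.random(N))
Y = 1 / (1 - rng.random(N))
U = X * Y

for u in (2.0, 5.0, 20.0):
    # empirical c.d.f. vs. 1 - (log u + 1)/u
    print(np.mean(U <= u), 1 - (np.log(u) + 1) / u)
```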
Q. 6) Let X be a standard normal random variable and set Y = X^2.

(a) Compute M_Y(t).

(b) Recalling that the m.g.f. of a Gamma(α, λ) random variable is given by

$$\left(\frac{\lambda}{\lambda - t}\right)^{\alpha},$$

what is the distribution of Y?

(c) If (X_1, . . . , X_n) are independent standard normal random variables,
what is the distribution of Z = X_1^2 + · · · + X_n^2?
A:
(a) We have, for t < 1/2,

$$\begin{aligned}
M_Y(t) = E[e^{tY}]
&= E[e^{tX^2}] \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx^2} e^{-x^2/2}\,dx \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(1-2t)x^2/2}\,dx \\
&= \frac{1}{\sqrt{1 - 2t}} \cdot \frac{1}{\sqrt{2\pi/(1 - 2t)}} \int_{-\infty}^{\infty} e^{-(1-2t)x^2/2}\,dx \\
&= \frac{1}{\sqrt{1 - 2t}} \cdot 1 \\
&= \frac{1}{\sqrt{1 - 2t}}
= \left(\frac{1}{1 - 2t}\right)^{1/2}
= \left(\frac{1/2}{1/2 - t}\right)^{1/2},
\end{aligned}$$

where the remaining integral equals 1 because the integrand is the
N(0, 1/(1 − 2t)) density.
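As a rough check of the formula, one can compare a Monte Carlo estimate of E[e^{tX²}] with (1 − 2t)^{−1/2}. A minimal sketch in Python (NumPy assumed; the values of t are arbitrary small choices below 1/2):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

# t must stay below 1/2 for the m.g.f. to exist; keeping t small also
# keeps the variance of the Monte Carlo estimator manageable
for t in (0.05, 0.1, 0.2):
    mc = np.mean(np.exp(t * x**2))   # estimate of E[e^{t X^2}]
    exact = (1 - 2 * t) ** (-0.5)    # (1 - 2t)^{-1/2}
    print(t, mc, exact)
```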
(b) From the form of the m.g.f. we see that Y ∼ Gamma(1/2, 1/2).
(c) If Z = X_1^2 + · · · + X_n^2 and the X_i's are independent, then

$$M_Z(t) = M_{\sum_{i=1}^n X_i^2}(t)
= \prod_{i=1}^{n} M_{X_i^2}(t)
= \prod_{i=1}^{n} \left(\frac{1}{1 - 2t}\right)^{1/2}
= \left(\frac{1}{1 - 2t}\right)^{n/2}.$$

Therefore, Z has distribution Gamma(n/2, 1/2), the chi-squared distribution
with n degrees of freedom.
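A minimal sketch in Python (NumPy assumed; n = 5 is arbitrary) comparing moments of the simulated sum of squares with draws from Gamma(n/2, 1/2). Note that NumPy's gamma sampler is parameterized by shape and scale, so rate 1/2 corresponds to scale 2:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
z = np.sum(rng.standard_normal((1_000_000, n)) ** 2, axis=1)

# Gamma(n/2, 1/2) with rate 1/2 means shape n/2 and scale 1/(1/2) = 2
g = rng.gamma(shape=n / 2, scale=2.0, size=1_000_000)

print(z.mean(), g.mean())   # both should be near n
print(z.var(), g.var())     # both should be near 2n
```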