Selected Solutions to
Probability
(Jim Pitman)
http://www2.imm.dtu.dk/courses/02405/
17 December 2006
02405 Probability
2004-2-2
BFN/bfn
IMM - DTU
Question b) 67%.
Question c) 0.667
Question a.2) 4/7
4/11, 4/11, 8/11
Exactly three: A ∩ B ∩ C
Using inclusion-exclusion for two events we get the formula stated on p. 32. Since the inclusion-exclusion formula is assumed valid for n events, we can use it for the first term. To get through, we realize that the last term

P(∪_{i=1}^{n} (A_i ∩ A_{n+1}))

is of the form

P(∪_{i=1}^{n} B_i)

with B_i = A_i ∩ A_{n+1}, implying that we can use the inclusion-exclusion formula for this term too. The proof is completed by writing down the expansion explicitly.
true
3
1
0.92 + 0.88 = 0.91
4
4
P(C|A) = P(A|C)P(C) / (P(A|C)P(C) + P(A|C^c)P(C^c)) = 1·0.8 / (1·0.8 + 0.1·0.2) = 40/41

P(C^c|S) = P(S|C^c)P(C^c) / (P(S|C^c)P(C^c) + P(S|C)P(C)) = 0.1·0.2 / (0.02 + 1·0.8) = 1/41
P(D|H) = 0.05, P(D|H^c) = 0.8

P(H^c|D) = P(D|H^c)P(H^c) / (P(D|H^c)P(H^c) + P(D|H)P(H)) = 0.8·0.01 / (0.8·0.01 + 0.05·0.99) = 0.139
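The arithmetic above can be checked numerically; this is a minimal sketch, where the prior split P(H^c) = 0.01, P(H) = 0.99 is read off the numbers in the computation (it is not stated elsewhere in this fragment):

```python
# Two-hypothesis Bayes rule; the 0.01/0.99 prior is an assumption read
# off the arithmetic in the solution above.
def bayes_posterior(lik_alt, prior_alt, lik_null, prior_null):
    """Posterior probability of the alternative hypothesis given the evidence."""
    num = lik_alt * prior_alt
    return num / (num + lik_null * prior_null)

posterior = bayes_posterior(0.8, 0.01, 0.05, 0.99)
print(round(posterior, 3))  # 0.139
```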
P(S) = P(S|T1 F)P(T1 F) + P(S|T2 F)P(T2 F) + P(S|T3 F)P(T3 F) + P(S|T1 F^c)P(T1 F^c) + ...

The last three terms are zero. We apply the multiplication rule to the probabilities P(Ti F), leading to

P(S) = P(S|T1 F)P(F|T1)P(T1) + P(S|T2 F)P(F|T2)P(T2) + P(S|T3 F)P(F|T3)P(T3),

a special case of the multiplication rule for n events (page 56). Inserting numbers,

P(S) = (1/2)(1/3)(1/3) + (1/2)(1/2)(1/3) + (1/2)(2/3)(1/3) = 1/4

Question b) The probability in question is P(T1|S). Applying Bayes' rule (page 49),

P(T1|S) = P(S|T1)P(T1) / P(S) = (1/6)(1/3) / (1/4) = 2/9
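The two results can be re-derived exactly with rational arithmetic; the conditional probabilities below are the ones read off the fractions in the solution above:

```python
from fractions import Fraction as F

# Assumed inputs (read off the solution): P(Ti) = 1/3 for each of the
# three cases, P(F|Ti) = 1/3, 1/2, 2/3, and P(S|Ti F) = 1/2 throughout.
P_T = [F(1, 3)] * 3
P_F_given_T = [F(1, 3), F(1, 2), F(2, 3)]
P_S_given_TF = F(1, 2)

# Law of total probability, then Bayes' rule for the first case.
P_S = sum(P_S_given_TF * pf * pt for pf, pt in zip(P_F_given_T, P_T))
P_T1_given_S = (P_S_given_TF * P_F_given_T[0] * P_T[0]) / P_S
print(P_S, P_T1_given_S)  # 1/4 2/9
```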
∏_{i=1}^{n} (1 − (i − 1)/12), n ≤ 13
(364/365)^{n−1} ≤ 1/2 ⟺ n ≥ ln(2) / (ln(365) − ln(365)... − ln(364)) + 1 = 253.7

Question c) In the birthday problem we only ask for two arbitrary birthdays to be the same, while the question in this exercise is that at least one out of n − 1 people has a certain birthday.
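The bound n ≥ ln(2)/(ln 365 − ln 364) + 1 = 253.7 can be verified directly, together with the fact that the next integer is the smallest n making (364/365)^{n−1} drop below 1/2:

```python
import math

# The bound derived above for the "same birthday as me" problem.
n_bound = math.log(2) / (math.log(365) - math.log(364)) + 1
print(round(n_bound, 1))  # 253.7

# n = 254 is the smallest group size that works.
n = math.ceil(n_bound)
assert (364 / 365) ** (n - 1) <= 0.5 < (364 / 365) ** (n - 2)
```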
The event B_{ij} can be expressed as

B_{ij} = ∪_{k=1}^{365} (A_{i,k} ∩ A_{j,k}),

such that P(B_{ij}) = 1/365 by the independence of A_{i,k} and A_{j,k}. The event B12 ∩ B23 satisfies

P(B12 B23) = 1/365^2 = P(B12)P(B23).
5/8
P(B|A) = P(B ∩ A)/P(A) = P(A|B)P(B)/P(A)

Now the probability P(A) is given by the binomial distribution (page 81), as are P(B) and P(A|B) (the latter is the probability of getting 1 six in 3 rolls). Finally

P(B|A) = P(2 sixes in 5 rolls) P(1 six in 3 rolls) / P(3 sixes in 8 rolls)
       = [C(5,2)(1/6)^2(5/6)^3 · C(3,1)(1/6)(5/6)^2] / [C(8,3)(1/6)^3(5/6)^5]
       = C(5,2)C(3,1)/C(8,3),

a hypergeometric probability. The result generalizes: if we have x successes in n trials, then the probability of having y ≤ x successes in the first m ≤ n trials is

C(m, y) C(n − m, x − y) / C(n, x).

The probabilities do not depend on p.
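The cancellation of p can be checked numerically: the conditional probability computed from three binomial probabilities equals the hypergeometric expression for any p.

```python
from math import comb

def cond_successes(y, m, x, n, p):
    """P(y successes in first m trials | x successes in n trials), iid Bernoulli(p)."""
    p_a = comb(n, x) * p**x * (1 - p)**(n - x)              # P(A)
    p_b = comb(m, y) * p**y * (1 - p)**(m - y)              # P(B)
    p_rest = comb(n - m, x - y) * p**(x - y) * (1 - p)**((n - m) - (x - y))
    return p_b * p_rest / p_a

hyper = comb(5, 2) * comb(3, 1) / comb(8, 3)
for p in (0.1, 0.5, 0.9):                                   # independent of p
    assert abs(cond_successes(2, 5, 3, 8, p) - hyper) < 1e-12
print(round(hyper, 4))  # 0.5357
```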
(8·7·6·5)/(4·3·2·1) · 0.7^4 · 0.3^4 = 0.1361

Question b)
P(B4 | ∪_{i=2}^{8} Bi) = P(B4 ∩ (∪_{i=2}^{8} Bi)) / P(∪_{i=2}^{8} Bi) = P(B4) / (1 − P(B0) − P(B1)) = 0.1363

Question c) C(6,2)
Φ((210.5 − 200)/10) − Φ((189.5 − 200)/10)

Φ((220.5 − 200)/10) − Φ((209.5 − 200)/10)

Φ((200.5 − 200)/10) − Φ((199.5 − 200)/10)

Φ((210.5 − 200)/10) − Φ((209.5 − 200)/10)
(120.5 − 100)/8.165
Φ((k + 1/2 − 400·0.95)/√(400·0.95·0.05)) ≥ 0.95 requires

(k + 1/2 − 400·0.95)/√(400·0.95·0.05) ≥ −1.645,

and we find k = 373.
Question c)
Φ((2 + 1/2 − 2.5)/√(25·0.09)) − Φ((1 + 1/2 − 2.5)/√(25·0.09))

Question d)
(2.5^2/2!) e^{−2.5} = 0.2566

Question e) The normal mean m is now 250:

Φ((250 + 1/2 − 250)/√(2500·0.09)) − Φ((250 − 1/2 − 250)/√(2500·0.09)) = Φ(1/30) − Φ(−1/30) = 0.0266

Question f) Poisson, as above: 0.2566.
P(k+1)/P(k) = (μ^{k+1} e^{−μ}/(k+1)!) / (μ^k e^{−μ}/k!) = μ/(k+1)

The ratio is strictly decreasing in k. For μ < 1 the maximum is P(0); otherwise the probabilities increase for all k with k < μ and decrease whenever k > μ. For non-integer μ the maximum of P(k) (the mode of the distribution) is obtained for the largest k < μ. For integer μ we have P(μ − 1) = P(μ).
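The mode argument can be checked numerically, both for a non-integer mean (mode at the largest k < μ) and for an integer mean (a tie between μ − 1 and μ):

```python
from math import exp, factorial

def pois(k, mu):
    """Poisson(mu) probability P(k)."""
    return mu**k * exp(-mu) / factorial(k)

mu = 2.5
probs = [pois(k, mu) for k in range(20)]
print(probs.index(max(probs)))  # 2, the largest integer below mu

# Integer mean: P(mu - 1) = P(mu), here mu = 3.
assert abs(pois(2, 3.0) - pois(3, 3.0)) < 1e-12
```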
P(0) = e^{−n/N} = e^{−2/3} = 0.5134 for n = (2/3)N.

Similarly, for n = (5/3)N,

P(0) + P(1) = e^{−5/3}(1 + 5/3) = 0.5037
C(45,5)/C(50,5) + C(45,10)/C(50,10)
∑_{g=4}^{7} C(g − 1, 3) p^4 q^{g−4}

Question c) The easiest way is first answering question d), then using 1 − binocdf(3, 7, 2/3) in MATLAB: 0.8267.

Question d) Imagine that all games are played, etc. From the binomial formula,

p^7 + 7p^6 q + 21p^5 q^2 + 35p^4 q^3 = p^7 + p^6 q + 6p^6 q + 6p^5 q^2 + 15p^5 q^2 + 35p^4 q^3
= p^6 + 6p^5 q + 15p^4 q^2 + 20p^4 q^3 = p^6 + p^5 q + 5p^5 q + 15p^4 q^2 + 20p^4 q^3

etc.

Question e)
P(G = 4) = p^4 + q^4
P(G = 5) = 4pq(p^3 + q^3)
P(G = 6) = 10p^2 q^2 (p^2 + q^2)
P(G = 7) = 20p^3 q^3 (p + q)

Independence for p = q = 1/2.
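The sum in question c) can be evaluated directly in Python (in place of the MATLAB binocdf call), and the game-count distribution in question e) can be checked to sum to one:

```python
from math import comb

p, q = 2 / 3, 1 / 3  # p = 2/3, as in the binocdf call above

# P(the p-team wins a best-of-seven series), summed over series length g.
win = sum(comb(g - 1, 3) * p**4 * q**(g - 4) for g in range(4, 8))
print(round(win, 4))  # 0.8267

# Question e): the distribution of the number of games G sums to one.
dist = {4: p**4 + q**4,
        5: 4 * p * q * (p**3 + q**3),
        6: 10 * p**2 * q**2 * (p**2 + q**2),
        7: 20 * p**3 * q**3}
assert abs(sum(dist.values()) - 1) < 1e-12
```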
∑_{i=0}^{n} P(X = i)P(X + Y = n|X = i) = ∑_{i=0}^{n} P(X = i)P(Y = n − i)

P(X = 2) = 1/36, P(X = 3) = 1/18, P(X = 4) = 1/12, P(X = 5) = 1/9, P(X = 6) = 5/36, P(X = 7) = 1/6

We get

P(X + Y = 8) = 2(1/36 · 5/36 + 1/18 · 1/9) + 1/12 · 1/12 = 35/1296
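The value 35/1296 can be confirmed by brute force, treating X + Y = 8 as the total of four fair dice:

```python
from itertools import product

# Count outcomes of four fair dice whose total is 8.
count = sum(1 for dice in product(range(1, 7), repeat=4) if sum(dice) == 8)
print(count, 6**4)  # 35 1296
assert count / 6**4 == 35 / 1296
```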
1/2 for p = 1/2.
or, realizing that X ~ binomial(3, 1/6) (Example 7, page 169), we have E(X) = 3 · 1/6 = 1/2.

Question b) Let Y denote the number of odd numbers in three rolls; then Y ~ binomial(3, 1/2), thus E(Y) = 3 · 1/2 = 3/2.
∑_{i=0}^{n} p_i
P(D = 9) = P(D ≤ 9) − P(D ≤ 8) = 0.2284

P(D = i) = P(D ≤ i) − P(D ≤ i − 1) = 3(i − 1)(i − 2)/(13 · 12 · 11)

E(D) = ∑_{i=3}^{13} i · 3(i − 1)(i − 2)/(13 · 12 · 11) = (3/(13 · 12 · 11)) · 6006 = 10.5
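The pmf and the mean above can be verified exactly with rational arithmetic:

```python
from fractions import Fraction as F

# P(D = i) = 3(i-1)(i-2)/(13*12*11) for i = 3..13, as derived above.
pmf = {i: F(3 * (i - 1) * (i - 2), 13 * 12 * 11) for i in range(3, 14)}
assert sum(pmf.values()) == 1          # it is a probability distribution
ED = sum(i * p for i, p in pmf.items())
print(ED)  # 21/2
```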
P(W > 5000) = 1 − Φ(1.66) = 0.0485
P(S26 ≥ 10^4) ≈ 0.5
d = 2, 3, . . .

Question b)
E(D) = E(X + 1) = E(X) + 1 = 1/p + 1 = 3

SD(D) = √((1 − p)/p^2) = √2, since (1 − p)/p^2 = 2 for p = 1/2.
μ = (6.023·10^23 / (22.4·10^3)) x^3 = 2.688·10^19 x^3

√μ = 5.1854·10^9 x√x, and the requirement √μ/μ ≤ 0.01 leads to x ≥ 7.1914·10^−6.
we are almost certain that we will get at most one cookie without goodies.
∑_{i=0}^{n} p^i = (1 − p^{n+1})/(1 − p)

Question b) As n → ∞ we get 1/(1 − p).
∫_0^1 x^2(1 − x)^2 dx = ∑_{i=0}^{2} C(2, i)(−1)^i ∫_0^1 x^{i+2} dx = ∑_{i=0}^{2} C(2, i)(−1)^i [x^{i+3}/(i + 3)]_{x=0}^{x=1} = 1/30,

such that

f(x) = 30 x^2 (1 − x)^2, 0 < x < 1.

E(X) = ∫_0^1 x f(x) dx = 30 ∑_{i=0}^{2} C(2, i)(−1)^i [x^{i+4}/(i + 4)]_{x=0}^{x=1} = 1/2,

which we could have stated directly due to the symmetry of f(x) around 1/2, or from page 478.

Question c) We apply the computational formula for variances as restated on page 261:

E(X^2) = ∫_0^1 x^2 · 30 x^2 (1 − x)^2 dx = 30 · (1/105) = 2/7,

such that

Var(X) = 2/7 − (1/2)^2 = 1/28,

in agreement with the formula (3·3)/((3 + 3)^2 (3 + 3 + 1)) = 1/28.
∫_a^b f(x) dx

We get

P(−1 ≤ X ≤ 2) = ∫_{−1}^{0} 1/(2(1 − x)^2) dx + ∫_0^2 1/(2(1 + x)^2) dx
= [1/(2(1 − x))]_{x=−1}^{x=0} + [−1/(2(1 + x))]_{x=0}^{x=2}
= (1/2 − 1/4) + (1/2 − 1/6) = 7/12

Question c) The distribution is symmetric, so P(|X| > 1) = 2P(X > 1) = 2[−1/(2(1 + x))]_{x=1}^{x=∞} = 1/2.

(E(X) does not exist, since ∫_0^∞ x · 1/(2(1 + x)^2) dx does not converge.)
P(S4 ≥ 3) = 1 − Φ((3 − 2)/√(4/12)) = 1 − Φ(1.73) = 1 − 0.9582 = 0.0418
We can find c the standard way, using ∫_{0.9}^{1.1} f(x) dx = 1. However, we can derive the area of the triangle directly as (1/2) · 0.2 · c = 1, such that the peak is c = 10 and f(x) = 100(x − 0.9) for 0.9 < x < 1. Due to the symmetry of f(x) we have P(X < 0.925) = P(1.075 < X), and

P(X < 0.925) = ∫_{0.9}^{0.925} 100(x − 0.9) dx = [50(x − 0.9)^2]_{x=0.9}^{x=0.925} = 0.03125,

such that P(X < 0.925) + P(X > 1.075) = 0.0625.

Question b) We define the random variable Y as the length of an item which has passed the quality inspection. The probability is P(0.95 < Y < 1.05).

The number of acceptable items A out of c is binomially distributed. We determine c such that

P(A ≥ 100) ≥ 0.95.

We now use the normal approximation to get

1 − Φ((100 − 0.5 − 0.8c)/(0.4√c)) ≥ 0.95, i.e. (100 − 0.5 − 0.8c)/(0.4√c) ≤ −1.645,

and we find c ≈ 134.
t_{50%} = ln 2 / λ = 6.93, with 1/λ = 10.

T̄ = (1/100) ∑_{i=1}^{100} T_i

We know from page 286 that the sum is Gamma distributed. However, it is more convenient to apply the CLT (Central Limit Theorem, p. 268) to get

P(T̄ > 11) = 1 − P(T̄ ≤ 11) ≈ 1 − Φ((11 − 10)/(10/√100)) = 1 − Φ(1) = 0.1587

Question e) The sum of the lifetimes of two components is Gamma distributed. From p. 286 (right tail probability) we get

P(T1 + T2 > 22) = e^{−0.1·22}(1 + 2.2) = 0.3546
Question a)
P(W1 ≤ 2) = 1 − e^{−2} = 0.8647

Question b) The distribution of the time to the arrival of the fourth call is a Gamma(4, λ) distribution. We find the probability using result (2) on page 286:

P(T4 ≤ 5) = 1 − e^{−5}(1 + 5 + 25/2 + 125/6) = 1 − (118/3) e^{−5} = 0.735

Question c)
E(T4) = 4/λ = 4, with λ = 1, using (3) page 286.
(λT) such that

P(K = 0) = 1 − P(T > 1) = 1 − e^{−λ},

using the survival function for an exponential random variable. Correspondingly,

P(K = k) = P(T > k) − P(T > k + 1) = e^{−λk} − e^{−λ(k+1)} = e^{−λk}(1 − e^{−λ}),

a geometric distribution with parameter p = 1 − e^{−λ}.

Question b)
P(T_m = k/m) = P(T > k/m) − P(T > (k + 1)/m) = e^{−λk/m}(1 − e^{−λ/m}),

a geometric distribution with p_m = 1 − e^{−λ/m}.

Question c) The mean of the geometric distribution of T_m is

E(T_m) = (1 − p_m)/p_m.

The mean is measured in 1/m time units, so we have to multiply by this fraction to get an approximate value for E(T):

(1/m) E(T_m) = (1/m)(1 − p_m)/p_m = e^{−λ/m} / (m(1 − e^{−λ/m})) = (1 − λ/m + o(1/m)) / (m(λ/m + o(1/m))) → 1/λ for m → ∞.
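The limit in question c) is easy to see numerically; with λ = 2 (an arbitrary choice for illustration) the rescaled geometric mean approaches 1/λ = 0.5:

```python
from math import exp

lam = 2.0  # assumed rate, for illustration only

def scaled_mean(m):
    """(1/m) * E(T_m) = (1 - p_m)/(m * p_m) with p_m = 1 - exp(-lam/m)."""
    p_m = 1 - exp(-lam / m)
    return (1 - p_m) / (m * p_m)

for m in (10, 1_000, 1_000_000):
    print(scaled_mean(m))        # approaches 1/lam = 0.5 from below
assert abs(scaled_mean(1_000_000) - 1 / lam) < 1e-5
```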
G(t) = e^{−∫_0^t α u^{α−1} du} = e^{−[u^α]_{u=0}^{u=t}} = e^{−t^α}

−dG(t)/dt = e^{−t^α} · α t^{α−1} = α t^{α−1} e^{−t^α}

λ(t) = α t^{α−1} e^{−t^α} / e^{−t^α} = α t^{α−1}
y = g(x) = x^2, x = √y, dy/dx = 2x = 2√y

f_Y(y) = 1/(2√y), 0 < y < 1

F_{U^2}(y) = P(U^2 ≤ y) = P(U ≤ √y) = √y

The last equality follows from the cumulative distribution function (CDF) of a uniformly distributed random variable (page 487). The density is derived from the CDF by differentiation (page 313):

f_{U^2}(y) = dF_{U^2}(y)/dy = 1/(2√y), 0 < y < 1
f_Y(y) = (1/π) · 1/(1 + y^2), −∞ < y < ∞

The integral ∫ y f_Y(y) dy has to converge absolutely for E(Y) to exist, i.e. E(Y) exists if and only if E(|Y|) exists (e.g. page 263, bottom).
For a > 0,

P(Y ≤ y) = P(X ≤ (y − b)/a) = F((y − b)/a),

while for a < 0,

P(Y ≤ y) = P(X ≥ (y − b)/a) = 1 − P(X ≤ (y − b)/a) = 1 − F((y − b)/a).
Y = √T, T = Y^2, dy/dt = 1/(2√t)

f_Y(y) = 2λ y e^{−λy^2},

a Weibull distribution; see e.g. exercise 4.3.4 page 301 and exercise 4.4.9 page 310.

Question b)
E(Y) = ∫_0^∞ y · 2λ y e^{−λy^2} dy = ∫_0^∞ 2λ y^2 e^{−λy^2} dy.

We note the similarity with the variance of a zero-mean normal variable: with σ^2 = 1/(2λ),

∫_0^∞ 2λ y^2 e^{−λy^2} dy = λ √(2πσ^2) ∫_{−∞}^{∞} y^2 (1/√(2πσ^2)) e^{−y^2/(2σ^2)} dy;

the integral is the expected value of Z^2, where Z is normal(0, 1/(2λ)) distributed. Thus the value of the integral is 1/(2λ), and finally

E(Y) = (1/2)√(π/λ) = 0.51 with λ = 3.

Question c) We apply the inverse distribution function method suggested on pages 320-323. Thus

U = 1 − e^{−λX} ⟺ X = −(1/λ) ln(1 − U).

Now 1 − U and U are identically distributed, such that we can generate an exponential X with X = −(1/λ) ln(U). To generate a Weibull (α = 2) distributed Y we take the square root of X, thus Y = √(−(1/λ) ln(1 − U)).
Question b) In this case we have S = min(X1, X2), and we apply the result for the minimum of random variables page 317. The special case of two exponentials is treated in Example 3, page 317:

P(S ≤ t) = 1 − e^{−(λ1 + λ2)t}

Question c) From the system design we deduce S = max(min(X1, X2), min(X3, X4)), such that

P(S ≤ t) = (1 − e^{−(λ1 + λ2)t})(1 − e^{−(λ3 + λ4)t})
C(n, k) x^k (1 − y)^{n−k}

Question f) k observations below x, n − k − 1 observations above y, and one in between.
Question b) From the boxed result at the bottom of page 327 we have that X_(k) has a beta(k, n − k + 1) distribution. Substituting r = k and s = n − k + 1 we get

P(X_(k) ≤ x) = ∑_{i=r}^{r+s−1} C(r + s − 1, i) x^i (1 − x)^{r+s−1−i}

Now

f(x) = (1/B(r, s)) x^{r−1} (1 − x)^{s−1} = (1/B(r, s)) x^{r−1} ∑_{i=0}^{s−1} C(s − 1, i) (−x)^i,

such that

P(X_(k) ≤ x) = ∫_0^x f(u) du = (1/B(r, s)) ∑_{i=0}^{s−1} C(s − 1, i) ∫_0^x u^{r−1} (−u)^i du = (1/B(r, s)) ∑_{i=0}^{s−1} C(s − 1, i) (−1)^i x^{r+i}/(r + i),

as was to be proved.
Question a)
1 − (3/4)^2 = 7/16

Question b) We see that the probability can be rewritten as in Example 2 page 343, with different values. We get 9/40.

Question c)
P(Y ≥ X | Y > 0.25) = P(Y ≥ X, Y > 0.25)/P(Y > 0.25) = (P(Y ≥ X) − P(Y ≥ X, Y ≤ 0.25))/P(Y > 0.25) = (1/2 − (1/2)(1/4)^2)/(3/4) = 5/8
F_R(r) = r^2, f_R(r) = dF_R(r)/dr = 2r

With R1 and R2 independent we have the joint density from (2) page 350,

f(r1, r2) = 4 r1 r2.

We now integrate over the set r2 < r1/2:

P(R1/R2 > 2) = ∫_0^1 ∫_0^{r1/2} 4 r1 r2 dr2 dr1 = ∫_0^1 (r1^3/2) dr1 = 1/8
dg(y)/dy = 12y^2, Y = (Z/4)^{1/3}

f_Z(z) = (y^3 e^{−y}/6) · 1/(12y^2) = (1/72) (z/4)^{1/3} e^{−(z/4)^{1/3}}

Question c) We have |X| ≤ |Y| = Y. Thus E(|X|) ≤ E(Y) = 4.
Question d)
E(X^2 e^{2Y}) = E(X^2) E(e^{2Y})

We recall the general formula for E(g(Y)) from page 263 or 332,

E(g(Y)) = ∫ g(y) f(y) dy;

here E(e^{2Y}) diverges, thus E(X^2 e^{2Y}) is undefined (∞).
∫∫ f(u, v) du dv

Question c)
f(x, y) = ∂^2 F(x, y)/∂x ∂y
1/3

Question b)
1 − (1 − Φ(1))^2

Question c) A drawing is helpful; it suggests that the following should be true:

Φ(1) − Φ(−1)

Question d)
P(1 > max(X, Y) − min(X, Y)) = P(1 > |X − Y|) = 2Φ(1/√2) − 1
y = g(z) = z^2, z = √y, dy/dz = 2z = 2√y

Inserting in the boxed formula page 304 and using the many-to-one extension,

f_Y(y) = (1/√(2πy)) e^{−y/2}, 0 < y < ∞.

We recognize the gamma density with scale parameter λ = 1/2 and shape parameter r = 1/2 from the distribution summary page 481. By a slight reformulation we have

f_Y(y) = ((1/2)^{1/2}/Γ(1/2)) y^{1/2 − 1} e^{−y/2}, with Γ(1/2) = √π.

Question b) The formula is valid for n = 1. Assuming the formula valid for odd n, we get for n + 2 the value

Γ((n + 2)/2) = Γ(n/2 + 1).

The recursive formula for the gamma function page 191 tells us that Γ(r + 1) = rΓ(r), and we derive

Γ((n + 2)/2) = (n/2) Γ(n/2) = (n/2) · √π (n − 1)!/(2^{n−1} ((n − 1)/2)!),

equivalently Γ((n + 2)/2) = (n/2) √π ∏_{i=1}^{(n−1)/2} (i − 1/2).

Question f) The mean of a gamma(r, λ) distribution is r/λ, thus the χ² distribution with n degrees of freedom has mean (n/2)/(1/2) = n. The variance of a gamma(r, λ) distribution is r/λ², giving variance (n/2)/(1/2)² = 2n. The skewness is obtained similarly.
f(t) = ∫_0^t λ e^{−λu} μ e^{−μ(t − u)} du = λμ e^{−μt} ∫_0^t e^{−u(λ − μ)} du = (λμ/(λ − μ))(e^{−μt} − e^{−λt})

Question b)
E(Z) = E(X1) + E(X2) = 1/λ + 1/μ.

See e.g. page 480 for the means E(Xi) of the exponential variables.

Question c) Using the independence of X1 and X2 we have

Var(Z) = Var(X1) + Var(X2) = 1/λ^2 + 1/μ^2, SD(Z) = √(1/λ^2 + 1/μ^2).
Question c)
1 − e^{−2t_{0.9}}(1 + 2t_{0.9}) = 0.9, i.e. e^{−2t_{0.9}}(1 + 2t_{0.9}) = 0.1.
X_i = ∑_{j=1}^{r_i} W_{ij},

a sum of ∑_{i=1}^{n} r_i exponential(λ) random variables. The sum is gamma(∑_{i=1}^{n} r_i, λ) distributed.
P(X1 = x1 | X1 + X2 = n) = P(X1 = x1, X1 + X2 = n)/P(X1 + X2 = n) = P(X1 = x1, X2 = n − x1)/P(X1 + X2 = n) = P(X1 = x1)P(X2 = n − x1)/P(X1 + X2 = n),

where we have used the independence of X1 and X2 in the last equality. Now using the Poisson probability expression and the boxed result page 226,

P(X1 = x1 | X1 + X2 = n) = [(λ1^{x1}/x1!) e^{−λ1} (λ2^{n−x1}/(n − x1)!) e^{−λ2}] / [((λ1 + λ2)^n/n!) e^{−(λ1 + λ2)}] = (n!/(x1!(n − x1)!)) λ1^{x1} λ2^{n−x1}/(λ1 + λ2)^n = C(n, x1) p^{x1} (1 − p)^{n−x1}

with p = λ1/(λ1 + λ2).

Question b) Let Xi denote the number of eggs laid by insect i. The probability in question is P(X1 ≥ 90) = P(X2 ≤ 60). Now Xi ~ binomial(150, 1/2). With the normal approximation to the binomial distribution (page 99) we get

P(X2 ≤ 60) ≈ Φ((60 + 1/2 − 150 · 1/2)/√(150 · 1/2 · 1/2)) = Φ(−29/√150) = Φ(−2.37) = 0.0089
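The binomial identity for the conditional distribution can be verified numerically for sample values of the two Poisson means:

```python
from math import comb, exp, factorial

def pois(k, mu):
    """Poisson(mu) probability P(k)."""
    return mu**k * exp(-mu) / factorial(k)

# Sample means chosen for illustration (any positive values work).
l1, l2, n = 2.0, 3.0, 6
p = l1 / (l1 + l2)
for x1 in range(n + 1):
    lhs = pois(x1, l1) * pois(n - x1, l2) / pois(n, l1 + l2)
    rhs = comb(n, x1) * p**x1 * (1 - p)**(n - x1)
    assert abs(lhs - rhs) < 1e-12
print("identity holds for n =", n)
```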
By the conditional probability formula P(A|B) = P(AB)/P(B),

P(N1 = n1, N2 = n2, . . . , Nm = nm | ∑_{i=1}^{m} Ni = n) = P(N1 = n1, N2 = n2, . . . , Nm = nm, ∑_{i=1}^{m} Ni = n) / P(∑_{i=1}^{m} Ni = n).

Now realising that P(N1 = n1, . . . , Nm = nm, ∑_{i=1}^{m} Ni = n) = P(N1 = n1, . . . , Nm = nm) when n = ∑_{i=1}^{m} ni, and using the fact that N = ∑_{i=1}^{m} Ni has a Poisson distribution with parameter λ = ∑_{i=1}^{m} λi, we get

P(N1 = n1, . . . , Nm = nm | ∑_{i=1}^{m} Ni = n) = [∏_{i=1}^{m} (λi^{ni}/ni!) e^{−λi}] / [(λ^n/n!) e^{−λ}] = (n!/(n1! n2! · · · nm!)) ∏_{i=1}^{m} (λi/λ)^{ni}.

Question b) Using

P(N1 = n1, . . . , Nm = nm) = P(N = n) P(N1 = n1, . . . , Nm = nm | ∑_{i=1}^{m} Ni = n)

we see that the Ni's are independent Poisson variables.
We now apply the crucial idea of adding 0 in the form of E(Y|x) − E(Y|x) inside the brackets:

Var(Y) = ∑_x ∑_y (y − E(Y|x) + E(Y|x) − E(Y))^2 f(x, y)

By definition f_Y(y|x) = f(x, y)/f(x), thus

Var(Y) = ∑_x f(x) [∑_y (y − E(Y|x))^2 f_Y(y|x) + ∑_y (E(Y|x) − E(Y))^2 f_Y(y|x)],

since the cross term vanishes. The inner part of the first term is Var(Y|X = x), while the inner part of the second term is constant. Thus

Var(Y) = ∑_x Var(Y|X = x) f(x) + ∑_x (E(Y|x) − E(Y))^2 f(x)
F(y|x) = y/(1 − |x|), 0 < y < 1 − |x|

Question a) We have P(Y ≥ 1/2 | X = x) = 1 − F(1/2 | x).

Question b) We have P(Y ≤ 1/2 | X = x) = F(1/2 | x).

Question c) Similarly,

Var(Y|X = x) = (1 − |x|)^2/12.
∏_{i=1}^{n} p^{Xi} (1 − p)^{1 − Xi} = p^{∑_{i=1}^{n} Xi} (1 − p)^{n − ∑_{i=1}^{n} Xi}

f(p | X1 = x1, X2 = x2, . . . , Xn = xn) = f(p; X1 = x1, . . . , Xn = xn)/f(X1 = x1, . . . , Xn = xn) = p^{∑ xi} (1 − p)^{n − ∑ xi} f(p) / ∫_0^1 p^{∑ xi} (1 − p)^{n − ∑ xi} f(p) dp

With Sn = ∑_{i=1}^{n} Xi, we note that if the prior density f(p) of p is a beta(r, s) distribution, then the posterior distribution is a beta(r + Sn, s + n − Sn) distribution.
E(XY) = E(X^3) = ∫_{−1}^{1} x^3 (1/2) dx = 0

P(Y > 1/4 | |X| > 1/2) = 1 ≠ P(Y > 1/4)
X3:           0    1    0    1
X2 + X3:      0    1    1    2
X2 − X3:      0   −1    1    0
Probability: 1/3  1/6  1/3  1/6

so that, e.g., P(X2 + X3 = 2) = P(X2 − X3 = −1) = 1/6.
P(X + 2Y ≤ 3) = P((X + 2Y)/√5 ≤ 3/√5) = Φ(1.34) = 0.9099

Question b) We have from the boxed result page 451

Y = (1/2) X + √(1 − 1/4) Z,

where X and Z are independent standard normal variables. Thus

X + 2Y = 2X + √3 Z.

This is the sum of two independent normal variables, which itself is normal(0, 2^2 + (√3)^2) distributed. Thus

P(X + 2Y ≤ 3) = Φ(3/√7) = Φ(1.13) = 0.8708
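Both probabilities can be reproduced with the error function (here X, Y are standard normal: independent in question a, correlation ρ = 1/2 in question b, as in the solution above; small differences from the table values 0.9099 and 0.8708 are rounding):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Question a): Var(X + 2Y) = 1 + 4 = 5 for independent X, Y.
print(round(Phi(3 / sqrt(5)), 4))

# Question b): with rho = 1/2, X + 2Y = 2X + sqrt(3) Z has variance 7.
print(round(Phi(3 / sqrt(7)), 4))
```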
3Y normal(v, 3)
where X and Z are standardized independent normal variables. Thus any linear combination of V and W will be a linear combination of X and Z. We know from chapter 5 that such a combination is a normal variable. After some tedious calculations we find the actual linear combinations to be

aV + bW = aμ_V + bμ_W + (aσ_V + bρσ_W) X + bσ_W √(1 − ρ^2) Z

and

cV + dW = cμ_V + dμ_W + (cσ_V + dρσ_W) X + dσ_W √(1 − ρ^2) Z,

such that aV + bW ~ normal(aμ_V + bμ_W, a^2 σ_V^2 + b^2 σ_W^2 + 2abρσ_V σ_W) and cV + dW ~ normal(cμ_V + dμ_W, c^2 σ_V^2 + d^2 σ_W^2 + 2cdρσ_V σ_W).

Writing W1 = cV + dW = μ2 + c1 X + c2 Z (for suitable constants c1, c2), X1 and Y1 are standard normal variables. We see that with some effort we would be able to write

Y1 = ρ1 X1 + √(1 − ρ1^2) Z1,

and we conclude from page 454 that V1 and W1 are bivariate normal variables.

Question c) We find the parameters using standard results for means and variances:

μ1 = E(aV + bW) = aμ_V + bμ_W
σ1^2 = a^2 σ_V^2 + b^2 σ_W^2 + 2abρσ_V σ_W
μ2 = E(cV + dW) = cμ_V + dμ_W
σ2^2 = c^2 σ_V^2 + d^2 σ_W^2 + 2cdρσ_V σ_W

etc.
P(B2|I) = P(I|B2)P(B2) / (P(I|B0)P(B0) + P(I|B1)P(B1) + P(I|B2)P(B2)) = 1 · 0.03 / (0 · 0.92 + 0.5 · 0.05 + 1 · 0.03) = 6/11
Question a)
P(E2) = P(A2) + P(B2) + P(C2) + P(D2) = pa^2 + pb^2 + pc^2 + pd^2 = 0.3816

Question b) We have p(k) = P(Ek). By combinatorial considerations we can show

P(A_{i1} B_{i2} C_{i3} D_{i4}) = ((i1 + i2 + i3 + i4)!/(i1! i2! i3! i4!)) pa^{i1} pb^{i2} pc^{i3} pd^{i4}

The numerical values are p(2) = 0.5973, p(3) = 0.3163, p(4) = 0.0177. We immediately have

p(1) = P(E1) = P(A4) + P(B4) + P(C4) + P(D4) = pa^4 + pb^4 + pc^4 + pd^4 = 0.0687

p(4) = P(E4) = P(A1 B1 C1 D1) = 24 pa pb pc pd = 0.0177

To calculate p(3) = P(E3) we use the rule of averaged conditional probabilities,

p(3) = P(E3) = ∑_{i=0}^{4} P(E3|Ai)P(Ai).

We immediately have

P(E3|A4) = P(E3|A3) = 0.

To establish P(E3|A2) we argue

P(E3|A2) = P(B1 C1|A2) + P(B1 D1|A2) + P(C1 D1|A2) = (pb pc + pb pd + pc pd)/(1 − pa)^2,

further

P(E3|A0) = P(B2 C1 D1|A0) + P(B1 C2 D1|A0) + P(B1 C1 D2|A0) = 4 pb pc pd (pb + pc + pd)/(1 − pa)^4,

and

P(E3|A1) = ∑_i P(E3|A1 Bi) P(Bi|A1),

with

P(E3|A1 B0) = 3 pc pd (pc + pd)/(1 − pa − pb)^3
P(E3|A1 B1) = (pc^2 + pd^2)/(1 − pa − pb)^2
P(E3|A1 B2) = (pc + pd)/(1 − pa − pb)
P(E3|A1 B3) = 0,

and we get

P(E3|A1) = [3 pc pd (pc + pd)/(1 − pa − pb)^3] ((1 − pa − pb)/(1 − pa))^3 + [(pc^2 + pd^2)/(1 − pa − pb)^2] · . . .
P(B1|G) = P(G|B1)P(B1) / (P(G|B1)P(B1) + P(G|B2)P(B2) + P(G|B3)P(B3)) = 2/3
P(X ≥ 3) = 1 − 0.99^{50}(1 + 50 · (0.01/0.99) + (50 · 49/2)(0.01/0.99)^2) = 0.0138

Pitman claims this probability to be 0.0144. We evaluate the second probability using the normal approximation to the binomial distribution. Let X denote the number of packets the manufacturer has to replace. The random variable X follows a binomial distribution with n = 4000 and p = 0.0138. We can evaluate the probability using the normal approximation:

P(X > 40) = 1 − P(X ≤ 40) ≈ 1 − Φ((40 + 1/2 − 4000 · 0.0138)/√(4000 · 0.0138 · 0.9862)) = 1 − Φ(−14.77/7.38) = 1 − Φ(−2.00) = 0.9772
P(A3|A) = P(A3 ∩ A)/P(A) = 1/(1 + 3q + 6q^2) = 3/8

Question e) Pitman suggests no, which is reasonable. However, the way to assess whether we can assume independence or not would be to analyze the distribution of the number of sets played in a large number of matches.
∑_{i=20}^{35} C(1000, i) (1/38)^i (37/38)^{1000−i}

Question b) The standard deviation is √(1000 · (1/38)(37/38)) = 5.1. With the normal approximation,

Φ((35 + 1/2 − 1000/38)/√(1000 · (1/38)(37/38))) − Φ((20 − 1/2 − 1000/38)/√(1000 · (1/38)(37/38))) = Φ(1.814) − Φ(−1.346) = 0.8764
∑_{x=0}^{∞} P(X = x)P(Y ≥ X|X = x) = ∑_{x=0}^{∞} P(X = x)P(Y ≥ x) = ∑_{x=0}^{∞} (λ^x/x!) e^{−λ} (1 − p)^x = e^{−λ} ∑_{x=0}^{∞} (λ(1 − p))^x/x! = e^{−λ} e^{λ(1−p)} = e^{−λp}

e^{−λp} = e^{−1/2} = 0.6065
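The identity can be checked by truncating the sum; the split λ = 1, p = 1/2 below is an assumption for illustration (only the product λp = 1/2 is fixed by the solution):

```python
from math import exp, factorial

lam, p = 1.0, 0.5  # assumed split; lam * p = 1/2 as in the solution
total = sum(lam**x * exp(-lam) / factorial(x) * (1 - p)**x for x in range(50))
print(round(total, 4))  # 0.6065
assert abs(total - exp(-lam * p)) < 1e-12
```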
           Y2 = 0   Y2 = 1   Y2 = 2
Y1 = 0      9/36     6/36     3/36
Y1 = 1      6/36     4/36     2/36
Y1 = 2      3/36     2/36     1/36

As a check we verify that the sum of all entries in the table is 1. We derive the distribution of Y1 + Y2:

Y1 + Y2 = i        0      1      2      3      4
P(Y1 + Y2 = i)   9/36  12/36  10/36   4/36   1/36

Question b)
E(3Y1 + 2Y2) = E(3Y1) + E(2Y2) = 3E(Y1) + 2E(Y2) = 5E(Y1) = 5(0 · 1/2 + 1 · 1/3 + 2 · 1/6) = 10/3

The first equality is true due to the addition rule for expectations (page 181); the second is true due to the result for linear functions of random variables (page 175 b); the third is true since Y1 and Y2 have the same distribution; and the fourth is obtained from the definition of the mean (see page 181).

Question c)
f(x) = 0 for x ≤ 3, 1 for 4 ≤ x ≤ 5, 2 for x = 6, or something similar.
Question c)
C(n, k) [∏_{j=0}^{k−1}(b + jd) ∏_{j=0}^{n−k−1}(w + jd)] / ∏_{j=0}^{n−1}(b + w + jd),

which for b = w = d reduces to

C(n, k) k!(n − k)!/(n + 1)! = 1/(n + 1).

P(Xn+1 = 1) = P(Xn+1 = 1|X1 = 1)P(X1 = 1) + P(Xn+1 = 1|X1 = 0)P(X1 = 0), with P(X1 = 1) = b/(b + w).

To proceed we note that the probability P(Xn+1 = 1|X1 = 1) is the probability P(Yn = 1) in an urn scheme starting with b + d blacks and w whites, thus P(Xn+1 = 1|X1 = 1) = P(Yn = 1) = (b + d)/(b + w + d). Correspondingly, P(Xn+1 = 1|X1 = 0) = b/(b + w + d). Finally

P(Xn+1 = 1) = ((b + d)/(b + w + d))(b/(b + w)) + (b/(b + w + d))(w/(b + w)) = b/(b + w)

Question f)
P(X5 = 1|X10 = 1) = (b + d)/(b + w + d),

using Bayes' rule, or from the exchangeability. From the exchangeability we also have

P(X10 = 1|X5 = 1) = P(X2 = 1|X1 = 1) = (b + d)/(b + w + d)
G_X(z) = ∑_{x=0}^{∞} z^x P(X = x)

However, this is a power series in z that is absolutely convergent for |z| ≤ 1, and it thus defines a C^∞ function of z for |z| < 1.

Question b) The more elegant and maybe more abstract proof is

G_{X+Y}(z) = E(z^{X+Y}) = E(z^X z^Y) = E(z^X) E(z^Y),

by independence. Alternatively,

G_{X+Y}(z) = ∑_{k=0}^{∞} z^k P(X + Y = k) = ∑_{k=0}^{∞} z^k ∑_{i=0}^{k} P(X = i, Y = k − i) = ∑_{k=0}^{∞} z^k ∑_{i=0}^{k} P(X = i)P(Y = k − i) = ∑_{i=0}^{∞} ∑_{k=i}^{∞} z^k P(X = i)P(Y = k − i)

The interchange of the sums is justified since all terms are positive. The rearrangement is a commonly used tool in analytic derivations in probability; it is quite instructive to draw a small diagram to verify the limits of the sums. We now make further rearrangements:

G_{X+Y}(z) = ∑_{i=0}^{∞} z^i P(X = i) ∑_{k=i}^{∞} z^{k−i} P(Y = k − i) = ∑_{i=0}^{∞} z^i P(X = i) ∑_{m=0}^{∞} z^m P(Y = m) = G_X(z) G_Y(z)

Question c) By rearranging Sn = (X1 + · · · + Xn−1) + Xn we deduce

G_{Sn}(z) = ∏_{i=1}^{n} G_{Xi}(z)

We first find the generating function of a Bernoulli distributed random variable (binomial with n = 1):

E(z^X) = ∑_{x=0}^{1} z^x P(X = x) = z^0 (1 − p) + z^1 p = 1 − p(1 − z)

Now using the general result for Xi with binomial distribution b(ni, p) we get

E(z^{Xi}) = (E(z^X))^{ni} = (1 − p(1 − z))^{ni}

Generalizing this result we find

E(z^{Sn}) = (1 − p(1 − z))^{∑_{i=1}^{n} ni},

i.e. the sum of independent binomially distributed random variables is itself binomially distributed, provided equality of the pi's.

Question d) The generating function of the Poisson distribution is given in exercise 3.5.19, such that

G_{Sn}(z) = ∏_{i=1}^{n} e^{−λi(1−z)} = e^{−∑_{i=1}^{n} λi (1−z)}

The result proves that the sum of independent Poisson random variables is itself Poisson.

Question e)
G_X(z) = zp/(1 − z(1 − p))

Question f)
G_{Sn}(z) = (zp/(1 − z(1 − p)))^n, and more generally (zp/(1 − z(1 − p)))^{∑_{i=1}^{n} ri}.
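The closure result of question c) can be illustrated without generating functions by convolving two binomial pmfs directly; the parameters below are arbitrary sample values:

```python
from math import comb

p = 0.3  # common success probability, as the result requires

def binom_pmf(n):
    """Full pmf of a binomial(n, p) variable as a list indexed by k."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Convolution of binomial(2, p) and binomial(3, p) ...
a, b = binom_pmf(2), binom_pmf(3)
conv = [sum(a[i] * b[k - i] for i in range(len(a)) if 0 <= k - i < len(b))
        for k in range(len(a) + len(b) - 1)]

# ... equals binomial(5, p), as the generating-function argument predicts.
assert all(abs(x - y) < 1e-12 for x, y in zip(conv, binom_pmf(5)))
print([round(x, 4) for x in conv])
```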
∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{∞} (1/2) e^{−|x|} dx = 1. We have σ^2 = E(X^2) = 2.

Question c)
P(|X| > y) = 2P(X > y) = 2 ∫_y^{∞} (1/2) e^{−t} dt = ∫_y^{∞} e^{−t} dt = e^{−y}
(λloc^5/5!) e^{−λloc} · (λdis^3/3!) e^{−λdis} = (λloc^5 λdis^3/(5! 3!)) e^{−(λloc + λdis)}

Question b) The sum of two independent Poisson random variables is Poisson distributed (boxed result page 226), leading to

P(Nloc(3) + Ndis(3) = 50) = ((3(λloc + λdis))^{50}/50!) e^{−3(λloc + λdis)}

Question c) We now introduce the random variables Si^loc and Si^dis as the times of the ith local and long distance calls respectively. These random variables are Gamma distributed, according to the box at the top of page 286 or to 4. page 289. The waiting time to the first long distance call, counted in calls, is geometrically distributed, so the probability in question is

P(X > 10) = (1 − pdis)^{10} = (λloc/(λloc + λdis))^{10}
F_Y(y) = 1 − e^{−(1/2)y^2} e^{−(1/2)y^2} = 1 − e^{−y^2}

We find the density using (5) page 297, or directly using E4.3.4 (i):

f_Y(y) = 2y e^{−y^2}

Question b) This is a special case of E4.4.9 a). We can re-derive this result using the change of variable formula page 304. With Z = g(Y) = Y^2 we get dg(y)/dy = 2y. Inserting, we get

f_Z(z) = 2y e^{−y^2} · 1/(2y) = e^{−z},

an exponential(1) distribution.

Question c) We have E(Z) = 1 (see e.g. the mean of an exponential variable page 279, or the distribution summary pages 477 and 480).
where we have used standard rules for means and variances (see e.g. page 249) and the result page 279 for the variance of the exponential distribution.

Question b) The density f_M(m) of the random variable M is

f_M(m) = (1/2) e^{−(1/2)(m − 3)}, m > 3,

from the stated assumptions. We can apply the box page 304 to get

f_X(x) = f_M(log(x)) · 1/x = (1/2) e^{−(1/2)(log(x) − 3)}/x = e^{3/2}/(2 x √x), x > e^3.

Alternatively,

F_X(x) = P(M ≤ log(x)) = 1 − e^{−(1/2)(log(x) − 3)} = 1 − e^{3/2}/√x, x > e^3,

and

f_X(x) = dF_X(x)/dx = e^{3/2}/(2 x √x), x > e^3.
f(y) = 2 for 0 < y < 1/2, 0 elsewhere.

Question b) The standard uniform density: f(y) = 1 for 0 < y < 1, 0 elsewhere.

Question c)
E(Y) = 1/2 and Var(Y) = 1/12 for the standard uniform; E(Y) = 1/4 and Var(Y) = (1/2)^2/12 = 1/48 for the uniform on (0, 1/2).
E(Wt) = E(X e^{tY}) = E(X) E(e^{tY})

by the independence of X and Y. We find E(e^{tY}) from the definition of the mean:

E(e^{tY}) = ∫_1^3 e^{ty} (1/2) dy = (e^t/(2t))(e^{2t} − 1)

E(Wt) = 2 · (e^t/(2t))(e^{2t} − 1) = (e^t/t)(e^{2t} − 1)

Alternatively we could derive the joint density of X and Y,

f(x, y) = (1/2) · (2(2x)^3/3!) e^{−2x}, 1 < y < 3,

where we have used that X has a Gamma(4, 2) density, and apply the formula for E(g(X, Y)) page 349.

Question b) Since X and Y are independent we find E(Wt^2) as

E(Wt^2) = E(X^2) E((e^{tY})^2),

where E(X^2) = Var(X) + (E(X))^2 = 5, see e.g. page 481. Next we derive

E((e^{tY})^2) = E(e^{2tY}) = (e^{2t}/(4t))(e^{4t} − 1)

and apply the computational formula for the variance page 261:

SD(Wt) = √(5 (e^{2t}/(4t))(e^{4t} − 1) − ((e^t/t)(e^{2t} − 1))^2)
1
11 1
1 1
=
2 2 32 2
3 2
1
Y |Y X 2
2
2
3
2
3
3 2
=1
2
4
the cumulative distribution function of an exponentially distributed random variable with parameter λ1 + λ2.

P(T1 ≤ t, T1 < T2) = ∫_0^t λ1 e^{−λ1 t1} e^{−λ2 t1} dt1 = (λ1/(λ1 + λ2))(1 − e^{−(λ1 + λ2)t})