5.1. By monotonicity ($g$ is nondecreasing, so $|X| \ge a$ implies $g(|X|) \ge g(a)$) together with Markov's inequality,
$$P\{|X| \ge a\} \le P\{g(|X|) \ge g(a)\} \le \frac{E\,g(|X|)}{g(a)}.$$
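As a numerical illustration (not part of the original solution; the sample distribution, the threshold $a$, and the choice $g(x) = x^2$ below are hypothetical), note that the bound holds exactly for the empirical distribution of any sample, since $g(|x|) \ge g(a)\,\mathbf{1}\{|x| \ge a\}$ pointwise when $g$ is nonnegative and nondecreasing:

```python
# Sanity check of P{|X| >= a} <= E g(|X|) / g(a) on an empirical sample.
# Hypothetical choices: Exp(1) samples, a = 2, g(x) = x^2.
import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(10_000)]  # nonnegative samples |X|
a = 2.0
g = lambda x: x * x  # nonnegative and nondecreasing on [0, infinity)

lhs = sum(1 for x in xs if x >= a) / len(xs)      # empirical P{|X| >= a}
rhs = sum(g(x) for x in xs) / len(xs) / g(a)      # empirical E g(|X|) / g(a)
assert lhs <= rhs
```

The assertion is deterministic, not merely high-probability: the inequality is an instance of Markov's inequality applied to the empirical measure itself.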
For the geometric distribution $P\{X = k\} = (1-p)^k p$, $k = 0, 1, 2, \dots$,
$$P\{X > j\} = \sum_{k=j+1}^{\infty} (1-p)^k p = (1-p)^{j+1} \sum_{k=0}^{\infty} (1-p)^k p = (1-p)^{j+1}.$$
Replacing $j$ by $i + j$,
$$P\{X > i+j\} = (1-p)^{i+j+1}.$$
Replacing $j$ by $i - 1$,
$$P\{X \ge i\} = P\{X > i-1\} = (1-p)^i.$$
Thus
$$P\{X > i+j \mid X \ge i\} = \frac{P\{X > i+j,\, X \ge i\}}{P\{X \ge i\}} = \frac{P\{X > i+j\}}{P\{X \ge i\}} = (1-p)^{j+1} = P\{X > j\}.$$
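The memorylessness identity above can be checked by direct summation of the pmf; a minimal sketch, assuming the geometric pmf $P\{X=k\} = (1-p)^k p$ and hypothetical values of $p$, $i$, $j$ (the truncation cutoff `N` is an artifact of finite summation):

```python
# Check P{X > i+j | X >= i} = P{X > j} for the geometric pmf, by direct sums.
from math import isclose

p, N = 0.3, 2000  # N truncates the infinite sums; the tail (1-p)^N is negligible

def surv(m):
    """P{X > m} computed by summing the pmf (1-p)^k p over k > m."""
    return sum((1 - p) ** k * p for k in range(m + 1, N))

i, j = 4, 7
cond = surv(i + j) / surv(i - 1)   # P{X > i+j} / P{X >= i}, since {X >= i} = {X > i-1}
assert isclose(cond, surv(j), rel_tol=1e-9)
assert isclose(surv(j), (1 - p) ** (j + 1), rel_tol=1e-9)
```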
5.19. Intuitively, this follows from linearity of the expectation. On the other hand, justification is needed for the infinite summation. It is not hard if the monotone convergence theorem (to be taught later) is allowed. Without the monotone convergence theorem, we need a delicate algebraic manipulation.
Notice that the random variable
$$X = \sum_{n=1}^{\infty} \mathbf{1}_{A_n}.$$
For each finite index set $I \subset \{1, 2, \dots\}$, let $C_I$ denote the event that $A_n$ occurs exactly for the indices $n \in I$. The events $C_I$ are disjoint, $\{X = k\} = \bigcup_{I:\,\#(I)=k} C_I$, and, if $X < \infty$ almost surely,
$$\bigcup_{\text{finite } I:\, n \in I} C_I = A_n$$
up to a null set. Hence
$$E(X) = \sum_{k=0}^{\infty} k\,P\{X = k\} = \sum_{k=1}^{\infty} k\,P\{X = k\} = \sum_{k=1}^{\infty} \sum_{I:\,\#(I)=k} k\,P(C_I).$$
Notice that for $\#(I) = k$,
$$k\,P(C_I) = \sum_{n:\, n \in I} P(C_I),$$
and since all terms are nonnegative, the order of summation may be rearranged freely:
$$\sum_{k=1}^{\infty} \sum_{I:\,\#(I)=k} \sum_{n:\, n \in I} P(C_I) = \sum_{n=1}^{\infty} \sum_{\text{finite } I:\, n \in I} P(C_I) = \sum_{n=1}^{\infty} P(A_n).$$
Hence
$$E\Big(\sum_{n=1}^{\infty} \mathbf{1}_{A_n}\Big) = \sum_{n=1}^{\infty} P(A_n) = \sum_{n=1}^{\infty} E(\mathbf{1}_{A_n}).$$
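The identity $E\big(\sum_n \mathbf{1}_{A_n}\big) = \sum_n P(A_n)$ can be illustrated on a finite probability space (the sample space and events below are hypothetical, and exact rational arithmetic makes the equality exact rather than approximate):

```python
# Finite illustration: on a small uniform space, E(X) computed pointwise
# equals sum_n P(A_n), where X(w) counts the events containing w.
from fractions import Fraction

omega = range(8)                              # uniform, probability 1/8 each
events = [{0, 1, 2}, {1, 3, 5, 7}, {2, 3}]    # hypothetical events A_1, A_2, A_3

# E(X) computed pointwise over the sample space
EX = sum(Fraction(sum(1 for A in events if w in A), 8) for w in omega)
# sum of the individual probabilities P(A_n)
sumP = sum(Fraction(len(A), 8) for A in events)
assert EX == sumP
```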
For a nonnegative integer-valued $X$ we have
$$X = \sum_{n=0}^{\infty} \mathbf{1}_{\{X > n\}},$$
and directly,
$$E(X) = \sum_{k=1}^{\infty} k\,P\{X = k\} = \sum_{k=1}^{\infty} k\big(P\{X > k-1\} - P\{X > k\}\big).$$
By the fact
$$\sum_{k=1}^{n} k\big(P\{X > k-1\} - P\{X > k\}\big) = \sum_{k=0}^{n-1} (k+1)\,P\{X > k\} - \sum_{k=1}^{n} k\,P\{X > k\} = -n\,P\{X > n\} + \sum_{k=0}^{n-1} P\{X > k\}.$$
When $n \to \infty$, both sides converge or diverge simultaneously. When they converge, the left-hand side gives $E(X) < \infty$. By (the spirit of) Markov's inequality,
$$n\,P\{X > n\} \le \sum_{k=n+1}^{\infty} k\,P\{X = k\} \to 0 \qquad (n \to \infty).$$
So we have
$$E(X) = \sum_{k=0}^{\infty} P\{X > k\}.$$
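The tail-sum formula just derived can be checked numerically; a sketch for the geometric pmf $P\{X=k\} = (1-p)^k p$ (the value of $p$ and the truncation cutoff `N` are hypothetical choices, and for this pmf $E(X) = (1-p)/p$):

```python
# Check E(X) = sum_{k>=0} P{X > k} for a truncated geometric pmf.
from math import isclose

p, N = 0.25, 2000
pmf = [(1 - p) ** k * p for k in range(N)]     # P{X = k}, truncated at N

EX = sum(k * q for k, q in enumerate(pmf))     # E(X) from the definition
tail_sum = sum(sum(pmf[k + 1:]) for k in range(N))  # sum over k of P{X > k}
assert isclose(EX, tail_sum, rel_tol=1e-9)
assert isclose(EX, (1 - p) / p, rel_tol=1e-6)  # closed-form mean (1-p)/p
```

For the truncated pmf the two sums are in fact equal term by term after rearrangement, which is exactly the content of the algebraic proof above.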