Independence
It turns out that if (1.4) holds for all Bi of the form Bi = (−∞, xi ], xi ∈ R,
then it holds for all Bi ∈ B(R), i = 1, 2, . . . , k. This follows from the
proposition below.
Proposition 7.1.1: Let (Ω, F, P ) be a probability space. Let A be a
nonempty set. Let Gα ⊂ F be a π-system for each α in A. Let {Gα : α ∈ A}
be independent w.r.t. P . Then the family of σ-algebras {σGα : α ∈ A} is
also independent w.r.t. P .
Proof: Fix 2 ≤ k < ∞, {α1 , α2 , . . . , αk } ⊂ A, Bi ∈ Gαi , i = 1, 2, . . . , k − 1.
Let
L ≡ { B : B ∈ σGαk , P(B1 ∩ · · · ∩ Bk−1 ∩ B) = ∏_{i=1}^{k−1} P(Bi) · P(B) }.    (1.5)
7.1 Independent events and random variables
Proof: For the 'if' part, let Gα ≡ {Xα^{−1}((−∞, x]) : x ∈ R}, α ∈ A. Now
apply Proposition 7.1.1. The 'only if' part is easy. □
Remark 7.1.1: If the probability distribution of (Xα1 , Xα2 , . . . , Xαk ) is
absolutely continuous w.r.t. the Lebesgue measure mk on Rk , then (1.6)
and hence the independence of {Xα1 , Xα2 , . . . , Xαk } is equivalent to the
condition that
fα1,α2,...,αk (x1 , x2 , . . . , xk ) = ∏_{i=1}^{k} fαi (xi ),    (1.7)
a.e. (mk ), where f(α1 ,α2 ,...,αk ) is the joint density of (Xα1 , Xα2 , . . . , Xαk ),
and fαi is the marginal density of Xαi , i = 1, 2, . . . , k. See Problem 7.18.
Proposition 7.1.3: Let (Ω, F, P ) be a probability space and let
{X1 , X2 , . . . , Xk }, 2 ≤ k < ∞ be a collection of random variables on
(Ω, F, P ).
(i) Then {X1 , X2 , . . . , Xk } is independent iff for every collection of
bounded Borel measurable functions hi : R → R, i = 1, 2, . . . , k,

E ∏_{i=1}^{k} hi (Xi ) = ∏_{i=1}^{k} E hi (Xi ).    (1.8)
Proof:
(i) If (1.8) holds, then taking hi = IBi with Bi ∈ B(R), i =
1, 2, . . . , k yields the independence of {X1 , X2 , . . . , Xk }. Conversely,
where PX1 ,X2 is the joint distribution of (X1 , X2 ) and PXi is the
marginal distribution of Xi , i = 1, 2. Also, by the independence of
X1 and X2 , PX1 ,X2 is equal to the product measure PX1 ×PX2 . Hence,
by Tonelli’s theorem,
E|X1 X2 | = ∫_{R²} |x1 x2 | dPX1,X2 (x1 , x2 )
= ∫_{R²} |x1 x2 | dPX1 (x1 ) dPX2 (x2 )
= ∫_R |x1 | dPX1 (x1 ) · ∫_R |x2 | dPX2 (x2 ).
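As a sanity check on this factorization, one can compare a Monte Carlo estimate of E|X1 X2| with the product E|X1| E|X2| for independent variables. The sketch below uses independent standard normals, a choice made purely for illustration (nothing in the text singles out this distribution):

```python
import random

def mc_product_expectation(n=100_000, seed=0):
    """Monte Carlo comparison of E|X1 X2| with E|X1| * E|X2| for
    independent X1, X2 (here: standard normals, an illustrative choice).
    Returns (estimate of E|X1 X2|, product of the marginal estimates)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0, 1) for _ in range(n)]
    ys = [rng.gauss(0, 1) for _ in range(n)]
    e_xy = sum(abs(x * y) for x, y in zip(xs, ys)) / n
    e_x = sum(abs(x) for x in xs) / n
    e_y = sum(abs(y) for y in ys) / n
    return e_xy, e_x * e_y
```

With 10⁵ samples the two quantities agree to two or three decimal places, consistent with E|X1 X2| = E|X1| E|X2| under independence.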
Remark 7.1.2: Note that the converse to (ii) above need not hold. That is,
if X1 and X2 are two random variables such that E|X1 | < ∞, E|X2 | < ∞,
E|X1 X2 | < ∞, and EX1 X2 = EX1 EX2 , then X1 and X2 need not be
independent.
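A standard counterexample (not worked out in the text, but easy to verify exactly) takes X uniform on {−1, 0, 1}, X1 = X and X2 = X². Then EX1X2 = EX³ = 0 = EX1 · EX2, yet X1 and X2 are dependent. The enumeration below checks both facts with rational arithmetic:

```python
from fractions import Fraction

def uncorrelated_but_dependent():
    """X uniform on {-1, 0, 1}; X1 = X, X2 = X**2.

    Returns (E[X1*X2] - E[X1]*E[X2],
             P(X1 = 0, X2 = 0) - P(X1 = 0) * P(X2 = 0)).
    The first gap is 0 (uncorrelated); the second is not (dependent)."""
    omega = [-1, 0, 1]
    p = Fraction(1, 3)                      # uniform probability of each point
    E = lambda f: sum(p * f(x) for x in omega)
    cov_gap = E(lambda x: x * x**2) - E(lambda x: x) * E(lambda x: x**2)
    joint = sum(p for x in omega if x == 0 and x**2 == 0)
    marg = sum(p for x in omega if x == 0) * sum(p for x in omega if x**2 == 0)
    return cov_gap, joint - marg
```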
lim sup_{n→∞} An ≡ lim An ≡ ⋂_{k=1}^{∞} ⋃_{n≥k} An .    (2.1)
7.2 Borel-Cantelli lemmas, tail σ-algebras, and Kolmogorov's zero-one law
Proof: Since {An }n≥1 ⊂ F and F is a σ-algebra, Bk ≡ ⋃_{n≥k} An ∈ F for
each k ∈ N and hence lim sup An ≡ ⋂_{k=1}^{∞} Bk ∈ F. Next,

ω ∈ lim sup An
⇐⇒ ω ∈ Bk for all k = 1, 2, . . .
⇐⇒ for each k, there exists nk ≥ k such that ω ∈ An_k
⇐⇒ ω ∈ An for infinitely many n.

The proof for lim inf An is similar. □
In probability theory, lim sup An is referred to as the event that "An happens
infinitely often (i.o.)" and lim inf An as the event that "all but finitely many
An 's happen."
Example 7.2.1: Let Ω = R, F = B(R), and let
" 1#
0, for n odd
An = " n1 #
1 − n , 1 for n even.
Remark 7.2.1: This result is also called a zero-one law as it asserts that
for pairwise independent events {An }n≥1 , P(lim sup An ) = 0 or 1 according
as ∑_{n=1}^{∞} P(An ) < ∞ or = ∞.
Proof:
(a) Let Zn ≡ ∑_{j=1}^{n} I_{Aj} . Then Zn ↑ Z ≡ ∑_{j=1}^{∞} I_{Aj} and by the MCT,
EZn = ∑_{j=1}^{n} P(Aj ) ↑ EZ. Thus, ∑_{j=1}^{∞} P(Aj ) < ∞ ⇒ EZ < ∞ ⇒
Z < ∞ w.p. 1 ⇒ P(Z = ∞) = 0. But the event lim sup An = {Z = ∞}
and so (a) follows.
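Part (a) can be illustrated by simulation. With the illustrative choice P(An) = 1/n² (my choice; any summable series works), the first Borel-Cantelli lemma says only finitely many An occur along almost every sample path, so the total count of occurrences stays small no matter how far out one simulates:

```python
import random

def occurrence_count(seed, N=100_000):
    """Simulate independent events A_n = {U_n < 1/n**2}, U_n iid Uniform(0,1),
    and count how many occur among n = 1, ..., N.  Since sum 1/n**2 < infinity,
    the first Borel-Cantelli lemma gives P(A_n i.o.) = 0: the count is small
    (its expectation is at most pi**2 / 6) however large N is."""
    rng = random.Random(seed)
    return sum(rng.random() < 1 / n**2 for n in range(1, N + 1))
```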
(b) Without loss of generality, assume P(Aj ) > 0 for some j. Let Jn =
Zn /EZn for n ≥ j, where Zn is as above. Then EJn = 1 and by the
Proof:
(a) Fix ε > 0. Let An = {|Xn | > ε}, n ≥ 1. Then ∑_{n=1}^{∞} P(An ) < ∞ ⇒
P(lim sup An ) = 0, by the first Borel-Cantelli lemma (Theorem 7.2.2 (a)).
But

(lim sup An )^c = {ω : there exists n(ω) < ∞ such that ω ∉ An for all n ≥ n(ω)}
= {ω : there exists n(ω) < ∞ such that |Xn (ω)| ≤ ε for all n ≥ n(ω)}
= Bε , say.
Thus, ∑_{n=1}^{∞} P(An ) < ∞ ⇒ P(Bε ) = 1. Let B = ⋂_{r=1}^{∞} B_{1/r} . Now note
that

{ω : lim_{n→∞} |Xn (ω)| = 0} = ⋂_{r=1}^{∞} B_{1/r} .

Since P(B^c) ≤ ∑_{r=1}^{∞} P((B_{1/r})^c) = 0, P(B) = 1.
(b) Let {Xn }n≥1 be pairwise independent and ∑_{n=1}^{∞} P(|Xn | > ε0 ) = ∞
for some ε0 > 0. Let An = {|Xn | > ε0 }. Since {Xn }n≥1 are pairwise
independent, so are {An }n≥1 . By the second Borel-Cantelli lemma,
P(lim sup An ) = 1.
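The divergent half of the dichotomy can likewise be illustrated: with independent events of probability 1/n (again my choice), the series ∑ P(An) diverges and the second Borel-Cantelli lemma predicts that occurrences keep accumulating, roughly like log N:

```python
import random

def occurrences(seed, N=100_000):
    """Count how many of the independent events A_n = {U_n < 1/n}, n <= N,
    occur, where U_n are iid Uniform(0,1).  Here P(A_n) = 1/n, the series
    diverges, and the second Borel-Cantelli lemma gives P(A_n i.o.) = 1;
    the count grows without bound (on the order of log N)."""
    rng = random.Random(seed)
    return sum(rng.random() < 1 / n for n in range(1, N + 1))
```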
lim sup_{n→∞} Sn /√n = +∞ a.s.    (2.3)

To show this, let S = lim sup_{n→∞} Sn /√n. First it will be shown that
S is ⟨T , B(R̄)⟩-measurable. For any m ≥ 1, define the variables Tm,n =
(Xm+1 + · · · + Xn )/√n and Sm,n = (X1 + · · · + Xm )/√n, n > m. Note that
for any fixed m ≥ 1, Tm,n is σ⟨Xm+1 , Xm+2 , . . .⟩-measurable and Sm,n (ω) → 0 as
n → ∞.
= P( ⋂_{n=1}^{∞} ⋃_{m=n}^{∞} Am )
= P( Sn /√n > x i.o. )
≤ P(S ≥ c + 1) = 0.
7.3 Problems
7.1 Give an example of three events A1 , A2 , A3 on some probability space
such that they are pairwise independent but not independent.
(Hint: Consider iid random variables X1 , X2 , X3 with P(X1 = 1) =
1/2 = P(X1 = 0) and the events A1 = {X1 = X2 }, A2 = {X2 = X3 },
A3 = {X3 = X1 }.)
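The hint can be verified by brute-force enumeration of the eight equally likely outcomes; the sketch below checks pairwise independence and the failure of joint independence exactly:

```python
from fractions import Fraction
from itertools import product

def pairwise_not_jointly_independent():
    """Enumerate iid fair coins X1, X2, X3 in {0, 1} and the events
    A1 = {X1 = X2}, A2 = {X2 = X3}, A3 = {X3 = X1}.
    Returns (pairwise independent?, jointly independent?)."""
    p = Fraction(1, 8)                       # each outcome has probability 1/8
    outcomes = list(product([0, 1], repeat=3))
    A = [
        [w for w in outcomes if w[0] == w[1]],   # A1
        [w for w in outcomes if w[1] == w[2]],   # A2
        [w for w in outcomes if w[2] == w[0]],   # A3
    ]
    P = lambda ev: p * len(ev)
    inter = lambda u, v: [w for w in u if w in v]
    pairwise_ok = all(
        P(inter(A[i], A[j])) == P(A[i]) * P(A[j])
        for i in range(3) for j in range(i + 1, 3)
    )
    triple = inter(inter(A[0], A[1]), A[2])
    jointly_ok = P(triple) == P(A[0]) * P(A[1]) * P(A[2])
    return pairwise_ok, jointly_ok
```

Each P(Ai) = 1/2 and each pairwise intersection has probability 1/4, but any two of the equalities force the third, so the triple intersection has probability 1/4 ≠ 1/8.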
7.2 Let {Xα : α ∈ A} be a collection of independent random variables on
some probability space (Ω, F, P ). For any subset B ⊂ A, let XB ≡
{Xα : α ∈ B}.
(a) Let B be a nonempty proper subset of A. Show that the collec-
tions XB and XB c are independent, i.e., the σ-algebras σXB
and σXB c are independent w.r.t. P .
(b) Let {Bγ : γ ∈ Γ} be a partition of A by nonempty proper subsets
Bγ . Show that the family of σ-algebras {σXBγ : γ ∈ Γ} are
independent w.r.t. P .
7.3 Let X1 , X2 be iid standard exponential random variables, i.e.,
P(X1 ∈ A) = ∫_{A∩(0,∞)} e^{−x} dx,  A ∈ B(R).
(d) For any cdf F , show that the random variable X(ω) ≡ F^{−1}(ω)
has cdf F , where F^{−1}(u) ≡ inf{x : F (x) ≥ u}, u ∈ (0, 1).
R = 1 / lim sup_{n→∞} |Xn |^{1/n} .)
P(lim sup_{n→∞} An ) = 0.
(a) If E|X1 |^α < ∞ for some α ∈ (0, ∞), then show that

Mn /n^{1/α} → 0 w.p. 1.    (3.7)

(Hint: Fix ε > 0. Let An = {|Xn | > εn^{1/α} }. Apply the first
Borel-Cantelli lemma.)
(b) Show that if {Xi }i≥1 are iid satisfying (3.7) for some α > 0,
then E|X1 |α < ∞.
(Hint: Apply the second Borel-Cantelli lemma.)
7.15 (AR(1) series). Let {Xn }n≥0 be a sequence of random variables such
that for some ρ ∈ R,
(a) Show that if |ρ| < 1 and E(log |ε1 |)^+ < ∞, then

X̂n ≡ ∑_{j=0}^{n} ρ^j ε_{j+1} converges w.p. 1.
(b) Show that under the hypothesis of (a), for any bounded contin-
uous function h : R → R and for any distribution of X0
Eh(Xn ) → Eh(X̂∞ ).
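Part (a) can be illustrated numerically. Taking standard normal innovations εj (one choice satisfying E(log |ε1|)⁺ < ∞) and |ρ| < 1, the partial sums of ∑ ρ^j ε_{j+1} settle down quickly, since the tail is bounded by a geometric series:

```python
import random

def partial_sums(rho=0.5, N=2000, seed=1):
    """Partial sums S_n = sum_{j=0}^{n} rho**j * eps_{j+1} with iid standard
    normal innovations (an illustrative choice).  For |rho| < 1 the series
    converges a.s., so late increments are geometrically small."""
    rng = random.Random(seed)
    eps = [rng.gauss(0, 1) for _ in range(N + 1)]  # eps[j] plays eps_{j+1}
    s, out = 0.0, []
    for j in range(N + 1):
        s += rho**j * eps[j]
        out.append(s)
    return out
```

With ρ = 0.5 the terms beyond j ≈ 60 are already below double-precision resolution, so the sequence of partial sums is numerically constant from there on.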
7.16 Establish the following generalization of Example 7.2.2. Let {Xn }n≥1
be a sequence of independent random variables on some probability
space (Ω, F, P ). Suppose there exist sequences {an }n≥1 , {xn }n≥1
such that an ↑ ∞, xn ↑ ∞ and for each k < ∞, lim_{n→∞} P(Sn ≤ an xk ) ≡
F (xk ) exists and is < 1. Show that lim sup_{n→∞} Sn /an = +∞ a.s.
(c) Show that there exist a probability space (Ω, F, P ) with
|Ω| = 2^k and k independent events A1 , A2 , . . . , Ak in F such
that 0 < P(Ai ) < 1, i = 1, 2, . . . , k.
7.20 (a) Let Ω ≡ {(x1 , x2 ) : x1 , x2 ∈ R, x21 + x22 ≤ 1} be the unit disc in
R2 . Let F ≡ B(Ω), the Borel σ-algebra in Ω, and let P be the
normalized Lebesgue measure, i.e., P(A) ≡ m(A)/π, A ∈ F. For ω = (x1 , x2 )
let
X1 (ω) = x1 , X2 (ω) = x2 ,
and R(ω), θ(ω) be the polar representation of ω. Show that
the random variables R and θ are independent but X1 and X2
are not.
(b) Formulate and establish an extension of the above to the unit
sphere in R3 .
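A quick way to see that X1 and X2 in part (a) of Problem 7.20 are not independent, without computing the joint law, is geometric: {|X1| > 0.8} and {|X2| > 0.8} each have positive probability, yet they cannot occur together inside the unit disc since 0.8² + 0.8² > 1. The sketch below checks this by rejection sampling; the threshold 0.8 is my choice:

```python
import random

def disc_counts(n=20_000, seed=3):
    """Sample uniformly from the unit disc by rejection sampling and count
    the events {|X1| > 0.8}, {|X2| > 0.8}, and their intersection.
    The intersection is empty inside the disc (0.8**2 + 0.8**2 > 1),
    while each marginal event has positive probability -- so X1 and X2
    cannot be independent."""
    rng = random.Random(seed)
    c1 = c2 = c12 = 0
    kept = 0
    while kept < n:
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if x * x + y * y > 1:          # reject points outside the disc
            continue
        kept += 1
        a, b = abs(x) > 0.8, abs(y) > 0.8
        c1 += a
        c2 += b
        c12 += a and b
    return c1, c2, c12
```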
7.21 Let X1 , X2 , X3 be iid random variables such that P (X1 = x) = 0 for
all x ∈ R.
(a) Show that for any permutation σ of (1, 2, 3),

P( Xσ(1) > Xσ(2) > Xσ(3) ) = 1/3!.
(b) Show that for any i = 1, 2, 3,

P( Xi = max_{1≤j≤3} Xj ) = 1/3.
(c) State and prove a generalization of (a) and (b) to random vari-
ables {Xi : 1 ≤ i ≤ n} such that the joint distribution of
(X1 , X2 , . . . , Xn ) is the same as that of (Xσ(1) , Xσ(2) , . . . , Xσ(n) )
for any permutation σ of {1, 2, . . . , n} and P (X1 = x) = 0 for
all x ∈ R.
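Parts (a) and (b) reduce, conditionally on the (a.s. distinct) values of the sample, to counting permutations: by exchangeability, each of the n! assignments of the observed values to the indices is equally likely. A sketch of that counting argument:

```python
from fractions import Fraction
from itertools import permutations

def ordering_probabilities(n=3):
    """Conditionally on n distinct values, exchangeability makes their
    assignment to X_1, ..., X_n a uniformly random permutation.  Enumerate
    all n! assignments of the values 1..n and compute
    P(X_1 > X_2 > ... > X_n) and P(X_1 = max)."""
    assigns = list(permutations(range(1, n + 1)))  # x[i] = value of X_{i+1}
    p = Fraction(1, len(assigns))                  # each assignment: 1/n!
    p_strict = sum(p for x in assigns
                   if all(x[i] > x[i + 1] for i in range(n - 1)))
    p_max = sum(p for x in assigns if x[0] == max(x))
    return p_strict, p_max
```

For n = 3 this gives 1/3! for the strict ordering and 1/3 for X1 being the maximum, matching (a) and (b).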
7.22 Let f , g : R → R be monotone nondecreasing. Show that for any
random variable X,

Ef (X)g(X) ≥ Ef (X) Eg(X).
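For a discrete X the inequality can be checked exactly; the sketch below computes the gap E f(X)g(X) − E f(X) E g(X) with rational arithmetic (the particular X, f, g used in any check are illustrative choices, not part of the problem):

```python
from fractions import Fraction

def correlation_inequality_gap(values, probs, f, g):
    """Compute E[f(X)g(X)] - E[f(X)] * E[g(X)] for a discrete X taking
    the given values with the given probabilities.  For nondecreasing
    f and g this gap is >= 0, as Problem 7.22 asserts."""
    E = lambda h: sum(p * h(x) for x, p in zip(values, probs))
    return E(lambda x: f(x) * g(x)) - E(f) * E(g)
```

For example, with X uniform on {0, 1, 2, 3}, f(x) = x, and g(x) = x², the gap is E X³ − E X · E X² = 9 − 3/2 · 7/2 = 15/4 > 0.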