
Short recap: σ-algebras generated by maps

1) Let (X, A) and (Y, B) be measurable spaces and f : X → Y be a map. Then we define
the pull-back σ-algebra as

   f^{-1}(B) := {f^{-1}(B) : B ∈ B}.

This is the smallest σ-algebra on X such that f is measurable. In particular, f is (A, B)-measurable if and only if f^{-1}(B) ⊆ A.

2) If (f_i)_{i∈I} is a family of maps f_i : X → Y_i, where (Y_i, B_i) are measurable spaces, then

   σ(f_i, i ∈ I) := σ( ∪_{i∈I} f_i^{-1}(B_i) )

is the σ-algebra generated by (f_i)_{i∈I}. Please note that ∪_{i∈I} f_i^{-1}(B_i) is not a σ-algebra in
general, and as in 1) we can easily prove that σ(f_i, i ∈ I) is the smallest σ-algebra on X
such that all f_i are measurable.
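
To make 1) concrete, here is a minimal Python sketch (a hypothetical finite toy example, not taken from the sheet) that enumerates the pull-back σ-algebra by collecting the preimages f^{-1}(B) for all B in the power set of Y:

from itertools import chain, combinations

# Toy example: X = {0,1,2,3}, Y = {0,1}, B = power set of Y, f(x) = x mod 2.
X = [0, 1, 2, 3]
Y = [0, 1]
f = lambda x: x % 2

def powerset(s):
    return [set(c) for c in chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

pullback = {frozenset(x for x in X if f(x) in B) for B in powerset(Y)}
print([sorted(A) for A in pullback])
# four sets: {}, {0,2}, {1,3} and X itself -- the smallest sigma-algebra making f measurable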

Exercise 4 (σ(X)-measurable random variables).


Let (Ω, A) be a measurable space.

a) Let a_1, ..., a_n ∈ R be pairwise distinct and A_1, ..., A_n ∈ A be pairwise disjoint with
Ω = ∪_{j=1}^n A_j. Let X := Σ_{j=1}^n a_j 1_{A_j} and σ(X) be the σ-algebra generated by X.

i) Determine σ(X).
ii) Let Y be a σ(X)-measurable random variable. Show that Y is constant on each
set Aj , j = 1, . . . , n.
iii) Show that Y can be written as a function of X.

b) Let X : Ω → R be an arbitrary random variable and let Y : Ω → R be σ(X)-measurable.
Show that there exists a measurable function Φ : R → R such that

   Y = Φ(X).

Hint: algebraic induction.

Solution: a) i) In this particular case we obtain

   σ(X) = X^{-1}(B(R)) = σ({A_1, ..., A_n}) = { ∪_{j∈J} A_j : J ⊆ {1, ..., n} }.

a) ii) We have Y^{-1}(B) ∈ σ(X) for all B ∈ B(R), in particular Y^{-1}({z}) ∈ σ(X) for
all z ∈ R. Now assume that Y^{-1}({z}) ≠ ∅ for some z ∈ R. Then there exists a set
A ∈ σ(X) \ {∅} such that Y^{-1}({z}) = A = ∪_{j∈J} A_j for some index set J ⊆ {1, ..., n}, i.e.
Y(ω) = z for all ω ∈ A. Since each nonempty A_j is contained in exactly one of the disjoint
preimages Y^{-1}({z}), Y is constant on each set A_j, j = 1, ..., n.

a) iii) By part ii) we can find z_1, ..., z_n ∈ R such that

   Y = Σ_{i=1}^n z_i 1_{A_i} = Σ_{i=1}^n z_i 1_{X = a_i}.

By defining the map g : R → R by g(x) := Σ_{i=1}^n z_i 1_{{a_i}}(x) (where 1_{{a_i}} is the indicator
of the singleton {a_i}) we finally obtain

   g(X(ω)) = Σ_{i=1}^n z_i 1_{X = a_i}(ω) = Y(ω).
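
As a small illustration (a hypothetical toy example, not part of the sheet, assuming numpy is available), one can check the identity Y = g(X) numerically on a finite Ω:

import numpy as np

# Omega = {0,...,5}, partition A_1, A_2, A_3 encoded by 'labels',
# X = sum_j a_j 1_{A_j}, Y constant on each A_j; then Y = g(X) pointwise.
labels = np.array([0, 0, 1, 1, 2, 2])   # block A_j containing each omega
a = np.array([-1.0, 0.5, 3.0])          # pairwise distinct values of X
z = np.array([10.0, 20.0, 30.0])        # values of Y on A_1, A_2, A_3

X = a[labels]                            # X(omega)
Y = z[labels]                            # Y(omega)

def g(x):
    # g(x) = sum_i z_i 1_{a_i}(x)
    return sum(z_i * (x == a_i) for a_i, z_i in zip(a, z))

assert np.allclose(g(X), Y)              # Y = g(X) on all of Omega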

b) 1) Let Y be a simple function, i.e. Y = Σ_{i=1}^n z_i 1_{A_i} for some (z_i) ⊆ R and (A_i) ⊆ σ(X).
Then there exist sets B_i ∈ B(R) such that A_i = X^{-1}(B_i), i = 1, ..., n. Now define

   ϕ(x) := Σ_{i=1}^n z_i 1_{B_i}(x),   x ∈ R.

Then, using 1_{B_i}(X(ω)) = 1_{X^{-1}(B_i)}(ω) = 1_{A_i}(ω), we obtain

   ϕ(X(ω)) = Σ_{i=1}^n z_i 1_{B_i}(X(ω)) = Σ_{i=1}^n z_i 1_{A_i}(ω) = Y(ω).

2) Let Y : Ω → [0, ∞) be a positive σ(X)-measurable function. Then there exists a sequence of
σ(X)-measurable simple functions Y_n such that 0 ≤ Y_n ↗ Y pointwise, in particular Y = sup_n Y_n
(a concrete construction is sketched after step 3 below). By part 1) we can find a sequence of
measurable functions ϕ_n such that ϕ_n(X) = Y_n, n ∈ N. Now take ϕ := sup_{n∈N} ϕ_n. Then we get

   ϕ(X) = sup_n ϕ_n(X) = sup_n Y_n = Y.

3) Let Y be a general σ(X)-measurable map. Then Y = Y_+ − Y_− with σ(X)-measurable Y_+, Y_− ≥ 0.
By part 2) we can construct measurable functions ϕ_+ and ϕ_− such that Y_+ = ϕ_+(X) and Y_− = ϕ_−(X).
Defining ϕ := ϕ_+ − ϕ_− we finally get

   ϕ(X) = ϕ_+(X) − ϕ_−(X) = Y_+ − Y_− = Y.
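
The approximating sequence used in step 2) can be made explicit; a common choice is the dyadic construction Y_n = min(n, ⌊2^n Y⌋ / 2^n). Here is a minimal numerical sketch (assuming numpy is available; the sample values are arbitrary):

import numpy as np

# Dyadic approximation of a non-negative function by simple functions:
# Y_n = min(n, floor(2^n * Y) / 2^n) takes finitely many values and increases to Y.
def dyadic(y, n):
    return np.minimum(n, np.floor((2.0 ** n) * y) / 2.0 ** n)

y = np.array([0.3, 1.7, 4.25, 10.0])     # sample values of a non-negative Y
for n in (1, 2, 5, 10):
    print(n, dyadic(y, n))               # converges to y from below as n grows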

Moreover, we obtain the following result:

Lemma T2. Let f_i : (Ω, A) → (Ω_i, B_i), i ∈ I, and ϕ : (Ω_0, A_0) → (Ω, A) be maps. Then
we have

   ϕ is (A_0, σ(f_i, i ∈ I))-measurable  ⟺  f_i ∘ ϕ is (A_0, B_i)-measurable for all i ∈ I.

Proof: First assume the left-hand side. Since ϕ is (A_0, σ(f_i, i ∈ I))-measurable and f_i is
(σ(f_i, i ∈ I), B_i)-measurable for any fixed i ∈ I, the composition f_i ∘ ϕ is (A_0, B_i)-measurable
for all i ∈ I.
Now assume the right-hand side. Let M := ∪_{i∈I} f_i^{-1}(B_i) be the generating system of σ(f_i, i ∈ I)
and A ∈ M be any set. Then there exist an i ∈ I and a set B ∈ B_i with A = f_i^{-1}(B), and hence

   ϕ^{-1}(A) = ϕ^{-1}(f_i^{-1}(B)) = (f_i ∘ ϕ)^{-1}(B) ∈ A_0.

Since A ∈ M was arbitrary and it suffices to check measurability on a generating system, ϕ is
(A_0, σ(f_i, i ∈ I))-measurable.

Exercise 5 (Product probability spaces).
Let (Ω_j, A_j, P_j), j ∈ N, be probability spaces,

   Ω := ×_{j=1}^∞ Ω_j,   and   A := ⊗_{j=1}^∞ A_j := σ(π_j, j ∈ N)

the product σ-algebra generated by the projections π_j : Ω → Ω_j (π_j(ω) = ω_j, j ∈ N). Show
that there exists a uniquely determined probability measure P on A satisfying

   P( A_1 × ... × A_n × ×_{j=n+1}^∞ Ω_j ) = ∏_{j=1}^n P_j(A_j)

for any n ∈ N and any sets A_j ∈ A_j, j = 1, ..., n.


Notation: P = ⊗_{j=1}^∞ P_j and ⊗_{j=1}^∞ (Ω_j, A_j, P_j) := ( ×_{j=1}^∞ Ω_j, ⊗_{j=1}^∞ A_j, ⊗_{j=1}^∞ P_j ).

Hints for the proof:

1) For n ∈ N define

   F_n := { B_n × ×_{j=n+1}^∞ Ω_j : B_n ∈ A_1 ⊗ ... ⊗ A_n }

and show that (F_n)_{n∈N} is an ascending sequence of σ-algebras.

2) For F := ∪_{n=1}^∞ F_n show that F ⊂ A and σ(F) = A.

3) On F_n define the probability measure

   µ_n( B_n × ×_{j=n+1}^∞ Ω_j ) := (P_1 ⊗ ... ⊗ P_n)(B_n)

and on F the map P by

   P(A) := µ_n(A) if A ∈ F_n.

4) Show that P(Ω) = 1 and P is σ-additive.

5) Finally, use Caratheodory’s extension theorem as well as the measure uniqueness theorem
to conclude the proof.

Before we start with the proof of this result we briefly recall Caratheodory’s extension
theorem and the measure uniqueness theorem:

Caratheodory’s extension theorem. Let R be a semi-ring and µ : R → [0, ∞] be a set
function satisfying

1) µ(∅) = 0

2) µ is σ-additive

Then there exists a measure µ̃ on σ(R) with µ(A) = µ̃(A) for all A ∈ R.

Measure uniqueness theorem. Let M ⊆ P(Ω) be an intersection-stable generator of
the σ-algebra A and let µ_1, µ_2 be measures on A such that

   µ_1(A) = µ_2(A) for all A ∈ M.

If there exists an exhausting sequence of sets (A_n)_{n≥1} ⊆ M with Ω = ∪_{n≥1} A_n and
µ_1(A_n) = µ_2(A_n) < ∞ for all n ∈ N, then µ_1 = µ_2.

Solution: 1) Let F_n := { B_n × ×_{j=n+1}^∞ Ω_j : B_n ∈ A_1 ⊗ ... ⊗ A_n }, n ∈ N. Using that
⊗_{j=1}^n A_j is a σ-algebra, it can easily be verified that F_n is a σ-algebra as well. Moreover,
since B_n × Ω_{n+1} ∈ ⊗_{j=1}^{n+1} A_j for any set B_n ∈ ⊗_{j=1}^n A_j, we trivially obtain F_n ⊆ F_{n+1}.
2) Now define F := ∪_{n∈N} F_n (which is not a σ-algebra in general). Then F is an algebra:

i) Since Ω ∈ Fn for all n ∈ N, we obtain Ω ∈ F.

ii) Let A ∈ F. Then there is an n ∈ N such that A ∈ F_n. Using that F_n is a σ-algebra
we get A^c ∈ F_n, i.e. A^c ∈ F.

iii) Let A, B ∈ F. Then there are m, n ∈ N such that A ∈ F_n and B ∈ F_m. Without loss
of generality we may assume that n ≤ m. Then A, B ∈ F_m by part 1). Using again
that F_m is a σ-algebra, we get A ∪ B ∈ F_m, and this implies A ∪ B ∈ F by definition.

In particular, F is an intersection-stable semi-ring. Consider now the maps

   p_n : Ω → Ω_1 × ... × Ω_n,   p_n(ω) = (ω_1, ..., ω_n).

Then p_n is (A, ⊗_{i=1}^n A_i)-measurable. This can be seen by considering the maps

   π_j^n : ×_{i=1}^n Ω_i → Ω_j,   π_j^n((ω_1, ..., ω_n)) = ω_j,   j = 1, ..., n,

which have the property π_j^n ∘ p_n = π_j, and π_j is (A, A_j)-measurable for each j = 1, ..., n.
By Lemma T2, p_n is (A, σ(π_j^n, j = 1, ..., n))-measurable, and σ(π_j^n, j = 1, ..., n) = ⊗_{j=1}^n A_j.

For any set A ∈ F we can find an n ∈ N such that A = B_n × ×_{j=n+1}^∞ Ω_j for some
B_n ∈ ⊗_{j=1}^n A_j. Using the measurability of p_n above we obtain

   A = p_n^{-1}(B_n) ∈ A.

This implies F ⊆ A.
Define the sets R_n := { A_1 × ... × A_n × ×_{j=n+1}^∞ Ω_j : A_j ∈ A_j }, n ∈ N. Then we have

   ∪_{j=1}^∞ π_j^{-1}(A_j) ⊆ ∪_{j=1}^∞ R_j ⊆ F ⊆ A,

and taking generated σ-algebras yields A = σ(π_j, j ∈ N) ⊆ σ(F) ⊆ A, i.e. σ(F) = A.

3) On F_n we define the probability measure µ_n by

   µ_n( B_n × ×_{j=n+1}^∞ Ω_j ) := (⊗_{j=1}^n P_j)(B_n)

and the set function P on F by

   P(A) := µ_n(A) if A ∈ F_n.

4) P is well-defined: if A ∈ F_n ⊆ F_m for n ≤ m, say A = B_n × ×_{j=n+1}^∞ Ω_j, then also
A = (B_n × Ω_{n+1} × ... × Ω_m) × ×_{j=m+1}^∞ Ω_j and

   µ_m(A) = (⊗_{j=1}^m P_j)(B_n × Ω_{n+1} × ... × Ω_m) = (⊗_{j=1}^n P_j)(B_n) = µ_n(A),

since P_j(Ω_j) = 1. Moreover, it holds that

   P(Ω) = µ_1(Ω) = P_1(Ω_1) = 1.
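
The well-definedness argument above can also be checked numerically; here is a minimal Python sketch for two and three finite coordinate spaces (a hypothetical toy example with Ω_j = {0, 1} and arbitrarily chosen weights):

from itertools import product
from math import prod

P = [{0: 0.3, 1: 0.7}, {0: 0.5, 1: 0.5}, {0: 0.9, 1: 0.1}]   # weights P_1, P_2, P_3 on {0, 1}

def mu(B, n):
    # (P_1 ⊗ ... ⊗ P_n)(B) for a set B of n-tuples over {0, 1}
    return sum(prod(P[j][w[j]] for j in range(n))
               for w in product((0, 1), repeat=n) if w in B)

B2 = {(0, 0), (0, 1), (1, 1)}                  # some B_2 in A_1 ⊗ A_2
B3 = {b + (x,) for b in B2 for x in (0, 1)}    # B_2 × Ω_3
print(mu(B2, 2), mu(B3, 3))                    # both 0.65: P is unambiguous on cylinder sets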

Now let A_1, ..., A_n ∈ F be disjoint sets. Then there is an m ∈ N such that A_1, ..., A_n ∈
F_m. This implies

   P( ∪_{j=1}^n A_j ) = µ_m( ∪_{j=1}^n A_j ) = Σ_{j=1}^n µ_m(A_j) = Σ_{j=1}^n P(A_j),

which means that P is finitely additive. We next show that P is continuous at ∅ (and
hence σ-additive). To prove this, let (C_n)_n ⊆ F be a decreasing sequence of sets such that
lim_{n→∞} P(C_n) > 0. In the remainder of this part we want to show that ∩_{n=1}^∞ C_n ≠ ∅
(which then proves the continuity of P).
We may assume that C_n = B_n × ×_{j=n+1}^∞ Ω_j with B_n ∈ A_1 ⊗ ... ⊗ A_n. Then

   lim_{n→∞} µ_n(C_n) = lim_{n→∞} P(C_n) > 0.

Now define

   f_n^1(ω_1) := ∫_{Ω_2 × ... × Ω_n} 1_{B_n}(ω_1, ω_2, ..., ω_n) d(P_2 ⊗ ... ⊗ P_n)(ω_2, ..., ω_n)
              ≥ ∫_{Ω_2 × ... × Ω_{n+1}} 1_{B_{n+1}}(ω_1, ω_2, ..., ω_{n+1}) d(P_2 ⊗ ... ⊗ P_{n+1})(ω_2, ..., ω_{n+1})
              = f_{n+1}^1(ω_1),

where the inequality comes from the fact that C_{n+1} ⊆ C_n and hence B_{n+1} ⊆ B_n × Ω_{n+1}. This
implies that (f_n^1)_{n≥2} is a decreasing sequence. By monotone convergence (the sequence is
decreasing and bounded by 1) this implies

   ∫_{Ω_1} inf_{n≥2} f_n^1(ω_1) dP_1(ω_1) = inf_{n≥2} ∫_{Ω_1} f_n^1(ω_1) dP_1(ω_1) = inf_{n≥2} µ_n(C_n) > 0,

where the last equality holds by the definition of µ_n and Tonelli’s theorem.

Hence, there exists an ω̄_1 ∈ Ω_1 such that inf_{n≥2} f_n^1(ω̄_1) > 0. Inductively, we can define for
any k ∈ N and n ≥ k + 1 the function

   f_n^k(ω̄_1, ..., ω̄_{k−1}, ω_k) := ∫_{Ω_{k+1} × ... × Ω_n} 1_{B_n}(ω̄_1, ..., ω̄_{k−1}, ω_k, ..., ω_n) d(P_{k+1} ⊗ ... ⊗ P_n)(ω_{k+1}, ..., ω_n)

and with the same argument as above we get an ω̄_k ∈ Ω_k satisfying

   inf_{n≥k+1} f_n^k(ω̄_1, ..., ω̄_k) > 0.

In particular, f_{k+1}^k(ω̄_1, ..., ω̄_k) = ∫_{Ω_{k+1}} 1_{B_{k+1}}(ω̄_1, ..., ω̄_k, ω_{k+1}) dP_{k+1}(ω_{k+1}) > 0. This
implies that the set

   {ω_{k+1} ∈ Ω_{k+1} : (ω̄_1, ..., ω̄_k, ω_{k+1}) ∈ B_{k+1}}

is not empty for all k ∈ N. Now using that B_{k+1} ⊆ B_k × Ω_{k+1} we obtain

   (ω̄_1, ..., ω̄_k) ∈ B_k for all k ∈ N.

Finally, since (ω̄_1, ..., ω̄_n) ∈ B_n for every n ∈ N, the point ω̄ := (ω̄_j)_{j≥1} lies in
C_n = B_n × ×_{j=n+1}^∞ Ω_j for every n, i.e. ω̄ ∈ ∩_{n=1}^∞ C_n ≠ ∅.

5) By part 4) the set function P is finitely additive and continuous at ∅, hence σ-additive on the
intersection-stable semi-ring F. By Caratheodory’s extension theorem we obtain a probability
measure P̃ on σ(F) = A such that

   P̃(A) = P(A) for all A ∈ F.

In particular, P̃ satisfies the required product formula on the rectangles ∪_{n∈N} R_n. Since these
rectangles form an intersection-stable generator of A containing Ω, the measure uniqueness theorem
shows that P̃ is the unique probability measure with this property.

Having this result at hand we can prove the following Corollary:

Corollary T3. Let X_i : Ω → Ω_i, i ∈ N, be random variables on a probability space (Ω, A, P),
and set X := (X_i)_{i∈N}. Then we have

   (X_i)_{i∈N} is independent  ⟺  P^X = ⊗_{i=1}^∞ P^{X_i}.

Proof: Let (X_i)_{i∈N} be independent. Then, for any n ∈ N, we have P^{(X_1,...,X_n)} = ⊗_{i=1}^n P^{X_i}.
Using this, we get

   P^X(B_1 × ... × B_n × ×_{j=n+1}^∞ Ω_j) = P(X_1 ∈ B_1, ..., X_n ∈ B_n, X_j ∈ Ω_j, j ≥ n + 1)
                                          = P(X_1 ∈ B_1, ..., X_n ∈ B_n) = ∏_{j=1}^n P^{X_j}(B_j).

By Exercise 5, the only measure satisfying this is ⊗_{i=1}^∞ P^{X_i}, i.e. P^X = ⊗_{i=1}^∞ P^{X_i}.

Now assume the converse and let J ⊆ N be a finite subset. Then there is an N ∈ N such
that J ⊆ {1, ..., N} and, setting B_j := Ω_j for j ∉ J, we have

   P(X_j ∈ B_j, j ∈ J) = P^X(B_1 × ... × B_N × ×_{j=N+1}^∞ Ω_j) = ∏_{j=1}^N P^{X_j}(B_j)
                        = ∏_{j∈J} P^{X_j}(B_j) = ∏_{j∈J} P(X_j ∈ B_j),

which implies that (X_i)_{i∈N} is independent.

Exercise 6 (Infinite and independent coin toss).
Let (Ω, A, P) = ⊗_{j=1}^∞ ({0, 1}, P({0, 1}), P_j) with P_j({0}) = p and P_j({1}) = 1 − p, 0 < p < 1,
j ∈ N, be the stochastic model of an infinite sequence of independent tosses of a not necessarily
fair coin. We repeatedly flip the coin and stop as soon as it shows heads (heads = 1). Calculate
the probabilities of the following events:

a) A: we flip the coin at least k times (k ∈ N);

b) B: the number of coin flips is even.

Solution: Let X : Ω → N ∪ {∞}, X(ω) := inf{k ∈ N : ω_k = 1}, be the random variable counting
the number of flips in this experiment (note that P(X = ∞) = lim_{k→∞} p^k = 0). By independence
we get

   P(X = 1) = P_1({1}) = 1 − p,
   P(X = 2) = P_1({0}) P_2({1}) = p(1 − p),
   P(X = 3) = P_1({0}) P_2({0}) P_3({1}) = p^2(1 − p),
   ...
   P(X = k) = P_1({0}) ··· P_{k−1}({0}) P_k({1}) = p^{k−1}(1 − p),   k ∈ N.

a) Using this observation we obtain

   P(A) = P(X ≥ k) = 1 − P(X ≤ k − 1) = 1 − Σ_{j=1}^{k−1} p^{j−1}(1 − p)
        = 1 − (1 − p) Σ_{j=0}^{k−2} p^j = 1 − (1 − p) · (1 − p^{k−1}) / (1 − p)
        = p^{k−1}.

b) Similarly we get

   P(B) = Σ_{k=1}^∞ P(X = 2k) = Σ_{k=1}^∞ p^{2k−1}(1 − p)
        = p(1 − p) Σ_{j=0}^∞ p^{2j} = p(1 − p) · 1 / (1 − p^2)
        = p / (1 + p).

If we have a fair coin (p = 1/2) we get P(B) = 1/3, which means that the probability of
X being odd is twice the probability of X being even.
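
Both formulas are easy to sanity-check by simulation; here is a minimal Monte Carlo sketch in Python (not part of the original sheet; the parameter values are chosen arbitrarily):

import random

# Monte Carlo check of P(X >= k) = p^(k-1) and P(X even) = p/(1+p).
# Convention as in the exercise: a flip shows 0 ("tails") with probability p
# and 1 ("heads") with probability 1 - p; X counts flips until the first 1.
def first_head(p, rng):
    k = 1
    while rng.random() < p:   # flip shows 0 with probability p, so keep flipping
        k += 1
    return k

p, k, trials = 0.4, 3, 200_000
rng = random.Random(0)
samples = [first_head(p, rng) for _ in range(trials)]

print(sum(x >= k for x in samples) / trials, p ** (k - 1))      # both ≈ 0.16
print(sum(x % 2 == 0 for x in samples) / trials, p / (1 + p))   # both ≈ 0.2857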
