SLLN
Stat 709, Lecture 15
Jun Shao
Department of Statistics University of Wisconsin Madison, WI 53706, USA
October 9, 2009
Theorem 1.13
Let $X_1, X_2, \ldots$ be i.i.d. random variables.
(i) (The WLLN). A necessary and sufficient condition for the existence of a sequence of real numbers $\{a_n\}$ for which
$$\frac{1}{n}\sum_{i=1}^n X_i - a_n \to_p 0$$
is that $nP(|X_1| > n) \to 0$, in which case we may take $a_n = E(X_1 I_{\{|X_1|\le n\}})$.
(ii) (The SLLN). A necessary and sufficient condition for the existence of a constant $c$ for which
$$\frac{1}{n}\sum_{i=1}^n X_i \to_{a.s.} c$$
is that $E|X_1| < \infty$, in which case $c = EX_1$ and
$$\frac{1}{n}\sum_{i=1}^n c_i (X_i - EX_1) \to_{a.s.} 0$$
for any bounded sequence of real numbers $\{c_i\}$.
Jun Shao (UW-Madison) Stat 709 Lecture 15 October 9, 2009 3 / 11
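As a quick numerical companion to the SLLN in part (ii) (this simulation is not part of the original notes, and the Exponential(1) choice is mine), the sample mean of i.i.d. Exponential(1) draws should settle near $EX_1 = 1$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# I.i.d. Exponential(1) draws: E|X_1| < infinity, so by Theorem 1.13(ii)
# the sample mean converges almost surely to E X_1 = 1.
n = 100_000
total = 0.0
for _ in range(n):
    total += random.expovariate(1.0)

mean_n = total / n
print(mean_n)  # should be close to 1
```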
Let $Y_{nj} = X_j I_{\{|X_j|\le n\}}$, $Z_n = \sum_{j=1}^n Y_{nj}$, and $T_n = \sum_{j=1}^n X_j$. For any $\epsilon > 0$, it follows from Chebyshev's inequality that
$$P\left(|Z_n - EZ_n| > \epsilon n\right) \le \frac{\mathrm{Var}(Z_n)}{\epsilon^2 n^2} = \frac{\mathrm{Var}(Y_{n1})}{\epsilon^2 n} \le \frac{1}{\epsilon^2 n}\int_0^n x^2\, dF_{|X_1|}(x),$$
where the last equality follows from the fact that $Y_{nj}$, $j = 1, \ldots, n$, are i.i.d. The right-hand side converges to 0 since $nP(|X_1| > n) \to 0$ (why?). This proves that $(Z_n - EZ_n)/n \to_p 0$, which together with $P(T_n \ne Z_n) \to 0$ and the fact that $EY_{nj} = E(X_1 I_{\{|X_1|\le n\}})$ implies the result.
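The condition $nP(|X_1| > n) \to 0$ can also be probed numerically for a concrete heavy-tailed case (my example, not the notes'): for a standard Cauchy variable, $P(|X_1| > n) = (2/\pi)\arctan(1/n)$, so $nP(|X_1| > n) \to 2/\pi \ne 0$ and no centering sequence $\{a_n\}$ can make the WLLN hold.

```python
import math

# Theorem 1.13(i) requires n * P(|X_1| > n) -> 0.
# For a standard Cauchy X_1, P(|X_1| > n) = (2/pi) * arctan(1/n).
def n_times_tail(n):
    return n * (2.0 / math.pi) * math.atan(1.0 / n)

for n in (10, 1_000, 100_000):
    print(n, n_times_tail(n))
# The values approach 2/pi ~ 0.6366 rather than 0, so the WLLN fails.
```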
Remarks
If E|X1 | < , then an = E(X1 I{|X1 |n} ) EX1 and result the WLLN is actually established in Example 1.28 in a much simpler way. On the other hand, if E|X1 | < , then a stronger result, the SLLN, can be obtained. Some results for the case of E|X1 | = can be found in Exercise 148 and Theorem 5.4.3 in Chung (1974). The next result is for sequences of independent but not necessarily identically distributed random variables.
Theorem 1.14
Let $X_1, X_2, \ldots$ be independent random variables with finite expectations.
(i) (The SLLN). If there is a constant $p \in [1,2]$ such that
$$\sum_{i=1}^{\infty} \frac{E|X_i|^p}{i^p} < \infty, \qquad (1)$$
then
$$\frac{1}{n}\sum_{i=1}^n (X_i - EX_i) \to_{a.s.} 0.$$
(ii) (The WLLN). If there is a constant $p \in [1,2]$ such that
$$\lim_{n\to\infty} \frac{1}{n^p}\sum_{i=1}^n E|X_i|^p = 0, \qquad (2)$$
then
$$\frac{1}{n}\sum_{i=1}^n (X_i - EX_i) \to_p 0.$$
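To see condition (1) in action (a sketch I added; the growing-variance normals are my choice, not the notes'), take independent $X_i \sim N(0, i^{1/2})$. With $p = 2$, condition (1) reads $\sum_i E X_i^2/i^2 = \sum_i i^{-3/2} < \infty$, so Theorem 1.14(i) applies even though the variances are unbounded:

```python
import random

random.seed(1)  # reproducible run

# Independent X_i ~ Normal(0, sqrt(i)): variance i^{1/2}, so sd = i^{1/4}.
# With p = 2, condition (1) is sum_i i^{1/2} / i^2 = sum_i i^{-3/2} < infinity,
# hence (1/n) * sum_i (X_i - E X_i) -> 0 almost surely by Theorem 1.14(i).
n = 100_000
s = 0.0
for i in range(1, n + 1):
    s += random.gauss(0.0, i ** 0.25)

print(s / n)  # should be near 0
```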
Remarks
Note that (1) implies (2) (Lemma 1.6). The result in Theorem 1.14(i) is called Kolmogorov's SLLN when $p = 2$ and is due to Marcinkiewicz and Zygmund when $1 \le p < 2$. An obvious sufficient condition for (1) with $p \in (1,2]$ is $\sup_n E|X_n|^p < \infty$. The WLLN and SLLN have many applications in probability and statistics.
Example 1.32
Let $f$ and $g$ be continuous functions on $[0,1]$ satisfying $0 \le f(x) \le Cg(x)$ for all $x$, where $C > 0$ is a constant. We now show that
$$\lim_{n\to\infty} \int_0^1 \cdots \int_0^1 \frac{\sum_{i=1}^n f(x_i)}{\sum_{i=1}^n g(x_i)}\, dx_1 \cdots dx_n = \frac{\int_0^1 f(x)\,dx}{\int_0^1 g(x)\,dx} \qquad (3)$$
(assuming that $\int_0^1 g(x)\,dx \neq 0$).
Let $X_1, X_2, \ldots$ be i.i.d. random variables having the uniform distribution on $[0,1]$. Then
$$E[f(X_1)] = \int_0^1 f(x)\,dx < \infty, \qquad E[g(X_1)] = \int_0^1 g(x)\,dx < \infty.$$
By the SLLN (Theorem 1.13(ii)),
$$\frac{1}{n}\sum_{i=1}^n f(X_i) \to_{a.s.} E[f(X_1)], \qquad \frac{1}{n}\sum_{i=1}^n g(X_i) \to_{a.s.} E[g(X_1)].$$
By Theorem 1.10(i),
$$\frac{\sum_{i=1}^n f(X_i)}{\sum_{i=1}^n g(X_i)} \to_{a.s.} \frac{E[f(X_1)]}{E[g(X_1)]}. \qquad (4)$$
Since the random variable on the left-hand side of (4) is bounded by $C$, result (3) follows from the dominated convergence theorem and the fact that the left-hand side of (3) is the expectation of the random variable on the left-hand side of (4).
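A small simulation of (4) (added by me; the choices $f(x) = x^2$, $g(x) = 1 + x$, $C = 1$ are illustrative and satisfy $0 \le f \le Cg$ on $[0,1]$): the ratio $\sum f(X_i)/\sum g(X_i)$ for i.i.d. uniform $X_i$ should approach $\int_0^1 f / \int_0^1 g = (1/3)/(3/2) = 2/9$.

```python
import random

random.seed(2)  # reproducible run

# Illustrative choices: f(x) = x^2, g(x) = 1 + x, so 0 <= f(x) <= g(x) on [0, 1]
# (C = 1), with int f = 1/3 and int g = 3/2, giving the limit 2/9 in (3)-(4).
n = 100_000
sum_f = 0.0
sum_g = 0.0
for _ in range(n):
    x = random.random()
    sum_f += x * x
    sum_g += 1.0 + x

ratio = sum_f / sum_g
print(ratio)  # should be near 2/9 ~ 0.2222
```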
Example
Let $T_n = \sum_{i=1}^n X_i$, where the $X_n$'s are independent random variables satisfying $P(X_n = \pm n^{\theta}) = 0.5$ and $\theta > 0$ is a constant. We want to show that $T_n/n \to_{a.s.} 0$ when $\theta < 0.5$. For $\theta < 0.5$,
$$\sum_{n=1}^{\infty} \frac{EX_n^2}{n^2} = \sum_{n=1}^{\infty} \frac{n^{2\theta}}{n^2} = \sum_{n=1}^{\infty} n^{2\theta - 2} < \infty.$$
By the Kolmogorov strong law of large numbers, $T_n/n \to_{a.s.} 0$.
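A quick check of this example (my simulation; $\theta = 0.25$ is an arbitrary value below $0.5$), generating $X_i = \pm i^{\theta}$ with equal probability and tracking $T_n/n$:

```python
import random

random.seed(3)  # reproducible run

# X_i = +i^theta or -i^theta with probability 1/2 each; theta = 0.25 < 0.5.
# Then sum_n E X_n^2 / n^2 = sum_n n^{2*theta - 2} = sum_n n^{-3/2} < infinity,
# so Kolmogorov's SLLN gives T_n / n -> 0 almost surely.
theta = 0.25
n = 100_000
t = 0.0
for i in range(1, n + 1):
    t += (i ** theta) * random.choice((-1.0, 1.0))

print(t / n)  # should be near 0
```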
iff $n^{\theta}/n \to 0$.