
The most important limit theorems in probability are the law of large numbers and the central limit theorem. The law of large numbers describes the asymptotic behavior of the averages $\frac{1}{n}(X_1 + \cdots + X_n)$, where $X_1, X_2, \dots$ is a sequence of random variables, whereas the central limit theorem describes the asymptotic behavior of appropriately scaled sums of random variables. To describe the asymptotic behavior, for example in the law of large numbers, one should define the meaning of

$$\lim_{n \to \infty} \frac{X_1 + \cdots + X_n}{n},$$

i.e. one needs to talk about convergence of random variables. There are multiple ways to define convergence of a sequence of random variables.

Definition 7.1. Let $X, X_1, X_2, \dots$ be random variables defined on a probability space $(\Omega, \mathcal{F}, P)$. Then $X_n$ is said to converge almost surely to $X$ if

$$P\left(\left\{\omega \in \Omega : \lim_{n \to \infty} X_n(\omega) = X(\omega)\right\}\right) = 1.$$

If $X_n$ converges to $X$ almost surely, we write $X_n \to X$ a.s.
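Almost sure convergence is a statement about individual sample paths $\omega$. A minimal numerical sketch (Python, standard library only; the variable names are illustrative): along one simulated path of fair coin flips, the running averages settle near $1/2$, as the strong law of large numbers stated later in these notes predicts.

```python
import random

random.seed(42)

# One simulated sample path of i.i.d. fair coin flips.  Almost sure
# convergence concerns such individual paths: the running average
# along this single path approaches 1/2.
total = 0
n_max = 100_000
for n in range(1, n_max + 1):
    total += random.random() < 0.5
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(n, total / n)
```

A single run only illustrates one path, of course; the definition requires the set of such converging paths to have probability one.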

Definition 7.2. Let $X, X_1, X_2, \dots$ be random variables defined on a probability space. Then $X_n$ is said to converge to $X$ in probability if for each $\varepsilon > 0$,

$$\lim_{n \to \infty} P(|X_n - X| > \varepsilon) = 0.$$
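Convergence in probability can also be checked by simulation. A sketch (Python, standard library only; `tail_prob` is an illustrative name): estimate $P(|\bar{X}_n - 1/2| > \varepsilon)$ for the mean $\bar{X}_n$ of $n$ Uniform$(0,1)$ draws, and observe that the estimate shrinks toward $0$ as $n$ grows.

```python
import random

random.seed(0)

def tail_prob(n, eps=0.1, trials=2_000):
    """Estimate P(|mean of n Uniform(0,1) draws - 1/2| > eps)
    by repeated simulation."""
    exceed = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        exceed += abs(xbar - 0.5) > eps
    return exceed / trials

# Convergence in probability: these tail probabilities shrink to 0.
for n in (10, 100, 1_000):
    print(n, tail_prob(n))
```

Unlike the almost-sure sketch above, this one averages over many independent repetitions: it estimates a probability for each fixed $n$, which is exactly the quantity appearing in Definition 7.2.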

Now we prove some very useful inequalities.

**Theorem 8.0.39 (Markov inequality).** Let $X$ be a non-negative random variable with finite $n$th moment. Then we have for each $\varepsilon > 0$,

$$P(X \geq \varepsilon) \leq \frac{E[X^n]}{\varepsilon^n}.$$

Proof. Set $Y = \varepsilon^n 1_{\{X \geq \varepsilon\}}$. Then $Y$ is a non-negative simple random variable such that $Y \leq X^n$. Therefore $E[Y] = \varepsilon^n P(X \geq \varepsilon)$. Hence from Theorem 6.0.29,

$$\varepsilon^n P(X \geq \varepsilon) = E[Y] \leq E[X^n].$$

This completes the proof. As a corollary we have Chebyshev's inequality.
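The Markov bound can be compared against empirical tail probabilities. A sketch (Python, standard library only), using Exponential$(1)$ draws, which are non-negative with $E[X] = 1$, so the first-moment Markov inequality gives $P(X \geq a) \leq 1/a$:

```python
import random

random.seed(1)

# Exponential(1) is non-negative with E[X] = 1, so Markov's inequality
# with the first moment gives P(X >= a) <= E[X] / a.
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

for a in (1.0, 2.0, 4.0):
    empirical = sum(x >= a for x in samples) / len(samples)
    print(f"a={a}: P(X>=a) ~ {empirical:.4f} <= bound {mean / a:.4f}")
```

The true tail here is $e^{-a}$, so the bound holds with plenty of slack; Markov's inequality trades sharpness for generality.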

**Chebyshev's inequality.** Let $X$ be a random variable with finite mean $\mu$ and finite variance $\sigma^2$. Then for each $\varepsilon > 0$,

$$P(|X - \mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2}.$$

The proof of Chebyshev's inequality follows by replacing $X$ by $|X - \mu|$ in the Markov inequality.

There are two types of law of large numbers, i.e. the weak law of large numbers and the strong law of large numbers. The weak law of large numbers describes the asymptotic behavior of averages of random variables using convergence in probability, and the strong law of large numbers describes the limit of averages of random variables using almost sure convergence.

**Theorem 8.0.40 (Weak law of large numbers).** Let $X_1, X_2, \dots$ be a sequence of independent and identically distributed random variables, each having finite mean $\mu$ and finite variance $\sigma^2$. Then $\frac{S_n}{n}$ converges in probability to $\mu$, where $S_n = X_1 + \cdots + X_n$.

Proof: Since the $X_i$ are identically distributed, we get

$$E\left[\frac{S_n}{n}\right] = \frac{1}{n} \sum_{i=1}^{n} E[X_i] = \mu. \qquad (8.0.1)$$

Hence, using the fact that $X_i$ and $X_j$ are independent for distinct $i, j$, we get

$$\mathrm{Var}\left(\frac{S_n}{n}\right) = \frac{1}{n^2} \sum_{i=1}^{n} \mathrm{Var}(X_i) = \frac{\sigma^2}{n}. \qquad (8.0.2)$$

Therefore, applying Chebyshev's inequality to $\frac{S_n}{n}$, we get for each $\varepsilon > 0$

$$P\left(\left|\frac{S_n}{n} - \mu\right| \geq \varepsilon\right) \leq \frac{\sigma^2}{n \varepsilon^2}. \qquad (8.0.3)$$

Hence

$$\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \mu\right| \geq \varepsilon\right) = 0. \qquad (8.0.4)$$

Therefore $\frac{S_n}{n}$ converges in probability to $\mu$. This completes the proof.

**Theorem 8.0.41 (Strong law of large numbers).** Let $X_1, X_2, \dots$ be a sequence of independent and identically distributed random variables, each having finite mean $\mu$ and finite second moment. Then

$$\frac{S_n}{n} \to \mu \quad \text{a.s.,}$$

where $S_n = X_1 + \cdots + X_n$.

**Theorem 8.0.42 (Central limit theorem).** Let $X_1, X_2, \dots$ be a sequence of independent and identically distributed random variables, each having finite mean $\mu$ and finite non-zero variance $\sigma^2$. Then for each $x$,

$$\lim_{n \to \infty} P\left(\frac{S_n - n\mu}{\sigma \sqrt{n}} \leq x\right) = \Phi(x),$$

where $S_n = X_1 + \cdots + X_n$ and $\Phi$ is the standard normal distribution function. Proof: Set $Y_n = \frac{S_n - n\mu}{\sigma \sqrt{n}}$. One shows that the distribution functions of $Y_n$ converge pointwise to $\Phi$; the required limit computations follow from l'Hospital's rule in view of Theorem 7.0.35, and the proof is completed using Theorem 7.0.38.
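The central limit theorem can be illustrated numerically. A sketch (Python, standard library only; the names `ys` and `phi` are illustrative): standardize sums of i.i.d. Uniform$(0,1)$ draws, which have $\mu = 1/2$ and $\sigma^2 = 1/12$, and compare the empirical distribution function with $\Phi$ at a few points.

```python
import math
import random

random.seed(7)

n, trials = 200, 5_000
mu, sigma = 0.5, math.sqrt(1 / 12)  # mean and sd of Uniform(0, 1)

# Standardized sums Y_n = (S_n - n*mu) / (sigma * sqrt(n)),
# as in the central limit theorem.
ys = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    ys.append((s - n * mu) / (sigma * math.sqrt(n)))

def phi(x):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

for x in (-1.0, 0.0, 1.0):
    empirical = sum(y <= x for y in ys) / trials
    print(x, empirical, round(phi(x), 4))
```

The empirical fractions should sit close to $\Phi(-1) \approx 0.159$, $\Phi(0) = 0.5$, and $\Phi(1) \approx 0.841$, even though the summands are uniform rather than normal.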

As a corollary to the central limit theorem, we get the DeMoivre-Laplace limit theorem.

**Theorem 8.0.43 (DeMoivre-Laplace limit theorem).** Let $X_1, X_2, \dots$ be a sequence of independent and identically distributed Bernoulli($p$) random variables and set $S_n = X_1 + \cdots + X_n$. Then for each $x$,

$$\lim_{n \to \infty} P\left(\frac{S_n - np}{\sqrt{np(1-p)}} \leq x\right) = \Phi(x).$$

Proof: Observe that if $X_1, X_2, \dots$ are independent and identically distributed Bernoulli($p$) random variables, then $S_n$ is a Binomial($n, p$) random variable with mean $np$ and variance $np(1-p)$. Now apply the central limit theorem.

As an application of the strong law of large numbers, we show that any continuous function can be approximated by Bernstein polynomials.

Example 8.0.45. Let $f : [0, 1] \to \mathbb{R}$ be a continuous function. Consider the Bernstein polynomials

$$B_n(f)(p) = \sum_{k=0}^{n} f\left(\frac{k}{n}\right) \binom{n}{k} p^k (1-p)^{n-k}, \quad p \in [0, 1].$$

Fix $p \in [0, 1]$ and let $X_1, X_2, \dots$ be independent and identically distributed Bernoulli($p$) random variables. Set $S_n = X_1 + \cdots + X_n$. Then $S_n$ is a Binomial($n, p$) random variable, and hence $B_n(f)(p) = E\left[f\left(\frac{S_n}{n}\right)\right]$. Using the strong law of large numbers, we get $\frac{S_n}{n} \to p$ a.s., and since $f$ is continuous, $f\left(\frac{S_n}{n}\right) \to f(p)$ a.s. Here we use the fact that every continuous function defined on $[0, 1]$ is bounded. Now apply the dominated convergence theorem (Theorem 6.0.31) to get

$$\lim_{n \to \infty} B_n(f)(p) = \lim_{n \to \infty} E\left[f\left(\frac{S_n}{n}\right)\right] = f(p).$$

One can use the central limit theorem to get the normal approximation formula. Under the hypothesis of the central limit theorem, for sufficiently large $n$ we have, for each $x$,

$$P(S_n \leq x) \approx \Phi\left(\frac{x - n\mu}{\sigma \sqrt{n}}\right). \qquad (8.0.10)$$

Here $\approx$ means "approximately equal to".

Example 8.0.46. Let $S_n$ be a sum of $n$ independent and identically distributed Bernoulli($p$) random variables and let $k$ be a non-negative integer. Then using (8.0.10), we get

$$P(S_n \leq k) \approx \Phi\left(\frac{k - np}{\sqrt{np(1-p)}}\right).$$
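The normal approximation (8.0.10) can be checked directly against the exact binomial distribution function. A sketch (Python, standard library only; `phi` and `binom_cdf` are illustrative names), comparing $P(S_n \leq k)$ with $\Phi\bigl((k - np)/\sqrt{np(1-p)}\bigr)$ for $S_n \sim$ Binomial$(100, 0.4)$:

```python
import math

def phi(x):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def binom_cdf(n, p, k):
    """Exact P(S_n <= k) for S_n ~ Binomial(n, p)."""
    return sum(math.comb(n, j) * p**j * (1 - p) ** (n - j)
               for j in range(k + 1))

n, p = 100, 0.4
for k in (30, 40, 50):
    approx = phi((k - n * p) / math.sqrt(n * p * (1 - p)))
    print(k, round(binom_cdf(n, p, k), 4), round(approx, 4))
```

The two columns agree to within a few percent; the gap near the mean reflects that $S_n$ is discrete (a continuity correction, replacing $k$ by $k + 1/2$, would tighten the match further).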


limit theorems, central limit theorem and other important results
