
EXPECTATION

1.1 Definition
Let $X$ be a random variable on a probability space $(\Omega, \mathcal{F}, P)$; that is, $X : \Omega \to \mathbb{R}$ is a measurable function.
Definition 1.1.1 The Lebesgue integral of $X$ over $\Omega$ with respect to the probability measure $P$ is called the expectation of $X$, denoted by $E[X]$. In other words,
$$E[X] = \int_\Omega X \, dP.$$

Remark 1.1.2 For a discrete random variable $X$ taking values $x_1, x_2, \dots$,
$$E[X] = \sum_i x_i \, P(X = x_i).$$

If $X$ is a continuous random variable with density function $f$, then
$$E[X] = \int_{-\infty}^{\infty} x f(x) \, dx.$$
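As a minimal numerical sketch of these two formulas, the following Python snippet computes the expectation of a fair six-sided die by the discrete sum and approximates the expectation of a Uniform$(0,1)$ random variable by numerically integrating $x f(x)$; the two distributions are chosen purely for illustration.

```python
import numpy as np

# Discrete case: fair six-sided die, E[X] = sum_i x_i * P(X = x_i)
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)
discrete_expectation = np.sum(values * probs)     # 3.5

# Continuous case: Uniform(0, 1), E[X] = integral of x * f(x) dx, with f(x) = 1 on [0, 1]
x = np.linspace(0.0, 1.0, 10_001)
f = np.ones_like(x)                               # density of Uniform(0, 1)
continuous_expectation = np.trapz(x * f, x)       # approximately 0.5

print(discrete_expectation, continuous_expectation)
```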

We will use the formula for discrete random variables to derive the expected value of a continuous random variable. The idea is to write our continuous random variable as the limit of a sequence of discrete random variables.
Let $X$ be a continuous random variable. We will assume that it is bounded, so there is a constant $M$ such that the range of $X$ lies in $[-M, M]$. Fix a positive integer $n$ and divide the range into subintervals of width $1/n$. In each of these subintervals we round the value of $X$ down to the left endpoint of the interval and call the resulting random variable $X_n$; it is defined by
$$X_n(\omega) = \frac{k}{n} \quad \text{if } \frac{k}{n} \le X(\omega) < \frac{k+1}{n}.$$

Note that for all outcomes $\omega$, $X_n(\omega)$ converges to $X(\omega)$ pointwise on the sample space $\Omega$. In fact it converges uniformly on $\Omega$, since $|X_n - X| \le 1/n$ everywhere. The expected value of $X$ should be the limit of $E[X_n]$.
The random variable $X_n$ is discrete.
Since we have not yet developed the tools to prove this convergence rigorously, we just give a non-rigorous derivation. Let $X_n$ be the sequence of discrete random variables approximating $X$ defined above. They approximate $X$; in fact, since $|X_n - X| \le 1/n$, the $X_n$ converge uniformly to $X$. So $E[X_n]$ should converge to $E[X]$.
Now $X_n$ is a discrete random variable; its values are $k/n$ with $k$ running from $-Mn$ to $Mn$, or possibly a smaller set. So
$$E[X_n] = \sum_k \frac{k}{n} \, P\!\left(X_n = \frac{k}{n}\right).$$

Now
$$P\!\left(X_n = \frac{k}{n}\right) = P\!\left(\frac{k}{n} \le X < \frac{k+1}{n}\right) = \int_{k/n}^{(k+1)/n} f(x) \, dx.$$
So
$$E[X_n] = \sum_k \frac{k}{n} \int_{k/n}^{(k+1)/n} f(x) \, dx.$$

When $n$ is large, the integrals in the sum are over a very small interval. In this interval, $x$ is very close to $k/n$; in fact, they differ by at most $1/n$. So the limit as $n \to \infty$ of the above should be
$$E[X] = \int_{-\infty}^{\infty} x f(x) \, dx.$$
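The following Python sketch illustrates this approximation numerically under an illustrative choice of distribution (density $f(x) = 2x$ on $[0,1]$, so $E[X] = 2/3$): it rounds sampled values of $X$ down to the left endpoints $k/n$ and compares $E[X_n]$ with $\int x f(x)\,dx$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A bounded continuous random variable, illustrated with density f(x) = 2x on [0, 1]
# (inverse-transform sampling: X = sqrt(U) for U uniform, and E[X] = 2/3).
U = rng.uniform(0.0, 1.0, size=500_000)
X = np.sqrt(U)

for n in (5, 50, 500):
    # X_n rounds X down to the left endpoint k/n of its subinterval of width 1/n.
    X_n = np.floor(X * n) / n
    print(n, X_n.mean())                 # E[X_n] approaches E[X] as n grows

# The limit should match the integral of x * f(x) dx = 2/3.
x = np.linspace(0.0, 1.0, 10_001)
print(np.trapz(x * 2 * x, x))            # approximately 0.6667
```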

Remark 1.1.3 If $X$ is any random variable, then
$$E[X] = E[X^+] - E[X^-],$$
where
$$X^+ = \max(X, 0), \qquad X^- = \max(-X, 0),$$
provided at least one of the two expectations on the right is finite.
Theorem 1.1.4 Distribution determines expectation. That is, if $X$ and $Y$ have the same distribution, then $E[X] = E[Y]$ (whenever the expectations exist).
Since $X$ and $Y$ are identically distributed, $P(X \in B) = P(Y \in B)$, where $B$ is any Borel set.
For indicator random variables this gives
$$E[\mathbf{1}_{\{X \in B\}}] = P(X \in B) = P(Y \in B) = E[\mathbf{1}_{\{Y \in B\}}].$$
This implies the claim for indicators. Similarly, the claim extends to simple random variables, non-negative random variables, and arbitrary random variables.
Alternatively, we use the change of variables theorem. Suppose $g$ is a Borel function such that $g(X)$ and $g(Y)$ are integrable. Then
$$E[g(X)] = \int_{\mathbb{R}} g(x) \, \mu_X(dx) = \int_{\mathbb{R}} g(x) \, \mu_Y(dx) = E[g(Y)],$$
since $\mu_X = \mu_Y$ when $X$ and $Y$ are identically distributed.
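As a small illustrative check of Theorem 1.1.4: if $U$ is uniform on $[0,1]$ then $1-U$ has the same distribution, so the two sample means below should agree up to Monte Carlo error (the choice of distribution is just an example).

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.uniform(0.0, 1.0, size=1_000_000)

# U and 1 - U are identically distributed (both Uniform(0, 1)),
# so their expectations must coincide (both equal 1/2).
print(U.mean(), (1.0 - U).mean())
```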

1.2 Expectation of simple functions:


Let $\mathcal{E}$ be the set of all simple functions on $(\Omega, \mathcal{F}, P)$. Suppose $X$ is a simple random variable of the form
$$X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i},$$
where $a_i \in \mathbb{R}$ and $A_1, \dots, A_n \in \mathcal{F}$. Define, for $X \in \mathcal{E}$, the expectation as
$$E[X] = \sum_{i=1}^{n} a_i P(A_i).$$

We now discuss the properties arising from this definition.


1. We have that $E[\mathbf{1}_A] = P(A)$ for every $A \in \mathcal{F}$, and $E[c] = c$ for every constant $c$.
This follows since $\mathbf{1}_A$ is a simple function taking the single value $1$ on $A$, so $E[\mathbf{1}_A] = 1 \cdot P(A) = P(A)$, and $c = c\,\mathbf{1}_\Omega$, so $E[c] = c\,P(\Omega) = c$.

2. If $X \in \mathcal{E}$ and $X \ge 0$, then $E[X] \ge 0$.
To verify this, note that if $X = \sum_i a_i \mathbf{1}_{A_i} \ge 0$, then every value $a_i$ taken with positive probability satisfies $a_i \ge 0$, and $P(A_i) \ge 0$, and therefore $E[X] = \sum_i a_i P(A_i) \ge 0$.
3. The expectation operator $E$ is linear on $\mathcal{E}$ in the sense that if $X, Y \in \mathcal{E}$ and $\alpha, \beta \in \mathbb{R}$, then
$$E[\alpha X + \beta Y] = \alpha E[X] + \beta E[Y].$$
For the proof, refine the two representations to a common partition of $\Omega$ and compute both sides over it. (A numerical illustration of properties 3 and 4 is given in the sketch after this list.)
4. The expectation operator $E$ is monotone on $\mathcal{E}$ in the sense that if $X, Y \in \mathcal{E}$ and $X \le Y$, then $E[X] \le E[Y]$.
To prove this, we observe that $Y - X \ge 0$ and $Y - X \in \mathcal{E}$. So from property 2, $E[Y - X] \ge 0$, and thus by linearity
$$E[Y] = E[X] + E[Y - X] \ge E[X],$$
since $E[Y - X] \ge 0$.
5. If and either or , then

or
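The following sketch illustrates properties 3 and 4 numerically for simple random variables, modelled here (purely for illustration) as value vectors over a finite sample space with given outcome probabilities.

```python
import numpy as np

# A finite sample space with outcome probabilities; simple random variables
# are represented as vectors of values, one entry per outcome.
p = np.array([0.2, 0.5, 0.3])
X = np.array([1.0, 0.0, 2.0])
Y = np.array([3.0, 1.0, 2.0])

def E(Z):
    """Expectation of a simple random variable: sum of values times probabilities."""
    return float(np.dot(Z, p))

alpha, beta = 2.0, -1.0
# Property 3 (linearity): E[aX + bY] = a E[X] + b E[Y]
print(E(alpha * X + beta * Y), alpha * E(X) + beta * E(Y))
# Property 4 (monotonicity): X <= Y pointwise implies E[X] <= E[Y]
print(np.all(X <= Y), E(X) <= E(Y))
```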

1.3 Examples
Example 1.3.1 Fire three independent bullets consecutively at a target. Let $X$ be the number of bullets hitting the target among the three, where each bullet hits with probability $1/2$. Recall
$$X = \begin{cases}
3 & \text{with probability } 1/8,\\
2 & \text{with probability } 3/8,\\
1 & \text{with probability } 3/8,\\
0 & \text{with probability } 1/8.
\end{cases}$$

We computed this distribution earlier. The expectation is the sum of these values weighted by their probabilities:
$$E[X] = 3 \cdot \frac{1}{8} + 2 \cdot \frac{3}{8} + 1 \cdot \frac{3}{8} + 0 \cdot \frac{1}{8} = \frac{12}{8} = \frac{3}{2}.$$
This is what you should expect: firing three bullets, each hitting with probability $1/2$, you hit the target $1.5$ times on average.
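As a quick illustrative check, the same distribution and expectation can be recomputed from the Binomial$(3, 1/2)$ probabilities:

```python
from math import comb

n, p = 3, 0.5
# P(X = k) for a Binomial(n, p) random variable
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
expectation = sum(k * pk for k, pk in pmf.items())
print(pmf)          # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
print(expectation)  # 1.5, which also equals n * p
```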
Example 1.3.2 From a lot of 6 items containing 2 defective items, a sample of 4 items is drawn at random. Let the random variable $X$ denote the number of defective items in the sample. Recall
$$X = \begin{cases}
2 & \text{with probability } 6/15,\\
1 & \text{with probability } 8/15,\\
0 & \text{with probability } 1/15.
\end{cases}$$

We computed this distribution earlier. The expectation is the sum of these values weighted by their probabilities:
$$E[X] = 2 \cdot \frac{6}{15} + 1 \cdot \frac{8}{15} + 0 \cdot \frac{1}{15} = \frac{20}{15} = \frac{4}{3}.$$
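Again as an illustrative check, the hypergeometric probabilities and the expectation can be recomputed exactly:

```python
from fractions import Fraction
from math import comb

N, defective, sample = 6, 2, 4
good = N - defective

# Hypergeometric pmf: P(X = k) = C(2, k) * C(4, 4 - k) / C(6, 4)
pmf = {k: Fraction(comb(defective, k) * comb(good, sample - k), comb(N, sample))
       for k in range(0, defective + 1)}
expectation = sum(k * pk for k, pk in pmf.items())
print(pmf)          # probabilities 1/15, 8/15, 6/15 (the last printed reduced as 2/5)
print(expectation)  # 4/3, matching sample * defective / N
```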

Example 1.3.3 Let Then


Indeed,
Since
Therefore
Example 1.3.4 Toss 2 fair coins and let $X$ be the number of heads appearing. Calculate $E[X]$.
Solution: we have the sample space $\Omega = \{HH, HT, TH, TT\}$, each outcome having probability $1/4$, and $X$ is defined by
$$X(HH) = 2, \quad X(HT) = X(TH) = 1, \quad X(TT) = 0.$$
Therefore
$$E[X] = 2 \cdot \frac{1}{4} + 1 \cdot \frac{1}{4} + 1 \cdot \frac{1}{4} + 0 \cdot \frac{1}{4} = 1.$$
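This computation can also be read as summing $X(\omega)P(\{\omega\})$ over the sample space, which the following sketch mirrors (purely illustrative):

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair coin tosses, each outcome with probability 1/4.
omega = list(product("HT", repeat=2))
P = Fraction(1, 4)

def X(outcome):
    """Number of heads in the outcome."""
    return outcome.count("H")

# E[X] as the sum of X(omega) * P({omega}) over the sample space.
expectation = sum(X(w) * P for w in omega)
print(expectation)  # 1
```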
Example 1.3.5 Let . . Find
Solution: Put

Therefore ,
then
We have and
So
On the other hand thus

Therefore
Example 1.3.6 The lifetime of a person is an exponentially distributed random variable $X$ with density function
$$f(x) = \lambda e^{-\lambda x}, \quad x \ge 0 \quad (\lambda > 0).$$
Calculate the expected lifetime; that is, find $E[X] = \int_0^\infty x \lambda e^{-\lambda x} \, dx$.
Put $u = x$ and $dv = \lambda e^{-\lambda x}\,dx$, so $du = dx$ and $v = -e^{-\lambda x}$. Integrating by parts,
$$E[X] = \left[-x e^{-\lambda x}\right]_0^\infty + \int_0^\infty e^{-\lambda x} \, dx = 0 + \frac{1}{\lambda}.$$
Therefore the expected lifetime is $1/\lambda$.
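A quick numerical check of this integral, with an arbitrary illustrative rate $\lambda = 0.5$ (so the expected lifetime should be $2$):

```python
import numpy as np

lam = 0.5                                 # illustrative rate; E[X] should be 1/lam = 2
x = np.linspace(0.0, 100.0, 200_001)      # truncate the infinite integral far into the tail
f = lam * np.exp(-lam * x)                # exponential density
print(np.trapz(x * f, x))                 # approximately 2.0
```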

1.4 Change of variables


We want to compute expectations by integrating over $\mathbb{R}$ and not over $\Omega$. To do this, we use the induced probability measure on $\mathbb{R}$. The measure $\mu_X$ is defined as follows:
$$\mu_X(A) = P(X \in A) = P\!\left(X^{-1}(A)\right),$$
defined on every Borel set $A$.


Theorem 1.4.1
Let $X$ be a random variable with probability distribution $\mu_X$, and $g : \mathbb{R} \to \mathbb{R}$ a Borel function such that $g(X)$ is integrable. Then
$$E[g(X)] = \int_\Omega g(X) \, dP = \int_{\mathbb{R}} g(x) \, \mu_X(dx).$$
In particular, if $X$ is integrable, then
$$E[X] = \int_{\mathbb{R}} x \, \mu_X(dx).$$


Proposition 1.4.2
$$E[X] = \int_{\mathbb{R}} x \, dF(x),$$
where $F$ is the distribution function of $X$. And more generally,
$$E[g(X)] = \int_{\mathbb{R}} g(x) \, dF(x).$$

Corollary 1.4.3 Let $X$ be a discrete random variable taking values $x_1, x_2, \dots$. Then
$$E[X] = \sum_i x_i \, P(X = x_i),$$
and more generally, you can do the same for a function of a random variable:
$$E[g(X)] = \sum_i g(x_i) \, P(X = x_i).$$

Theorem 1.4.4
Let $X$ be a continuous random variable with probability density $f$. Then for every Borel function $g$ such that $g(X)$ is integrable we have
$$E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) \, dx.$$
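The following sketch compares the two sides of Theorem 1.4.4 numerically for the illustrative choice $X \sim \mathrm{Exponential}(1)$ and $g(x) = x^2$, for which both sides equal $2$:

```python
import numpy as np

rng = np.random.default_rng(2)
g = lambda x: x**2

# Left side: Monte Carlo estimate of E[g(X)] for X ~ Exponential(1).
samples = rng.exponential(scale=1.0, size=1_000_000)
mc_estimate = g(samples).mean()

# Right side: integral of g(x) * f(x) dx with f the Exponential(1) density.
x = np.linspace(0.0, 50.0, 500_001)
integral = np.trapz(g(x) * np.exp(-x), x)

print(mc_estimate, integral)   # both close to 2 = E[X^2] for Exponential(1)
```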

1.5 Inequalities
1.5.1 Markov's inequality
Let $X$ be a random variable. Then for every $a > 0$ we have
$$P(|X| \ge a) \le \frac{E|X|}{a}.$$
Suppose $Y$ is a non-negative random variable. Then, for all $a > 0$,
$$a \, \mathbf{1}_{\{Y \ge a\}} \le Y.$$
So, taking expectations, $a \, P(Y \ge a) \le E[Y]$; applying this to $Y = |X|$ gives the inequality.
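An empirical illustration of the bound (sketch only), using an Exponential$(1)$ sample for which $E|X| = 1$:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.exponential(scale=1.0, size=1_000_000)   # non-negative, E|X| = 1

for a in (1.0, 2.0, 5.0):
    empirical = np.mean(np.abs(X) >= a)
    markov_bound = np.abs(X).mean() / a
    print(a, empirical, markov_bound)   # the empirical tail never exceeds the bound
```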

1.5.2 Chebyshev's inequality
Let $X$ be a random variable with mean value $\mu$ and variance $\sigma^2$. Then, for every $\varepsilon > 0$,
$$P(|X - \mu| \ge \varepsilon) \le \frac{\sigma^2}{\varepsilon^2}.$$
Since $(X - \mu)^2$ is a non-negative random variable, we may apply the Markov inequality:
$$P(|X - \mu| \ge \varepsilon) = P\!\left((X - \mu)^2 \ge \varepsilon^2\right) \le \frac{E\left[(X - \mu)^2\right]}{\varepsilon^2} = \frac{\sigma^2}{\varepsilon^2}.$$
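A matching empirical check (again only an illustration), with a standard normal sample where $\mu = 0$ and $\sigma^2 = 1$:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal(1_000_000)   # mean 0, variance 1

mu, var = X.mean(), X.var()
for eps in (1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(X - mu) >= eps)
    chebyshev_bound = var / eps**2
    print(eps, empirical, chebyshev_bound)   # the bound always dominates
```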

1.5.3 Jensen's inequality
Let $X$ be a random variable with finite expectation $E[X]$, and let $\varphi$ be a convex function on the real line. Then
$$\varphi(E[X]) \le E[\varphi(X)].$$
For any point $x_0$, convexity gives a supporting line: there is a constant $c$ such that $\varphi(x) \ge \varphi(x_0) + c(x - x_0)$ for all $x$. Taking $x_0 = E[X]$ and $x = X$, then
$$\varphi(X) \ge \varphi(E[X]) + c\,(X - E[X]).$$
Since $E[X - E[X]] = 0$, we have, on taking expectations,
$$E[\varphi(X)] \ge \varphi(E[X]).$$
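A small numerical illustration with the convex function $\varphi(x) = x^2$, for which Jensen's inequality reads $(E[X])^2 \le E[X^2]$ (the exponential sample is just an example):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.exponential(scale=1.0, size=1_000_000)

phi = lambda x: x**2                     # a convex function
lhs = phi(X.mean())                      # phi(E[X]), approximately 1
rhs = phi(X).mean()                      # E[phi(X)], approximately 2
print(lhs, rhs, lhs <= rhs)              # Jensen: phi(E[X]) <= E[phi(X)]
```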
