
EG_EE 580

Dr. S.Ghazanshahi

Random Experiment:
A random experiment is an experiment whose outcome varies in an
unpredictable fashion when the experiment is repeated under the same conditions.

Sample Space:
The sample space S of a random experiment is defined as the set of all possible
outcomes of the random experiment.
Event (Set):
An event is a subset of the sample space S which contains specific outcomes.

Special Events:
a) S: the universal set (or certain event), which consists of all possible outcomes of
an experiment.
b) ∅: the impossible or null event, which contains no outcomes and hence never
occurs.
Example:
Select a ball from an urn containing balls numbered 1 to 20:
S = {1, 2, 3, ..., 20}   (a discrete sample space)
A = {even number} = {2, 4, 6, ..., 20}

Example of Continuous Sample Space:
Pick two numbers at random between zero and one:
S = {(x, y) : 0 ≤ x ≤ 1, 0 ≤ y ≤ 1}


Set Operations and Set Relations (pages 28-30):

1) Union (Sum, "Or"):
The union of two events A and B, denoted A ∪ B (or A + B), is defined as the set of
all outcomes that belong to A or B.

2) Intersection (Product, "And"):
The intersection of two events A and B is denoted by A ∩ B (or AB) and is defined
as the set of outcomes that are in both A and B.

3) Complement of an event A:
The complement of an event A is denoted by A′ and is defined as the set of all
outcomes in S but not in A.

Set Algebra:
1) Commutative properties:
A ∪ B = B ∪ A
AB = BA
2) Associative properties:
(A ∪ B) ∪ C = A ∪ (B ∪ C) = A ∪ B ∪ C
(AB)C = A(BC) = ABC


3) Distributive properties:
A(B ∪ C) = AB ∪ AC
A ∪ (BC) = (A ∪ B)(A ∪ C)
4) De Morgan's rules:
(A ∪ B)′ = A′B′
(AB)′ = A′ ∪ B′
In general:
(A₁ ∪ A₂ ∪ ... ∪ Aₙ)′ = A₁′A₂′...Aₙ′ , i.e., (∪_{i=1}^{n} Aᵢ)′ = ∩_{i=1}^{n} Aᵢ′
(A₁A₂...Aₙ)′ = A₁′ ∪ A₂′ ∪ ... ∪ Aₙ′ , i.e., (∩_{i=1}^{n} Aᵢ)′ = ∪_{i=1}^{n} Aᵢ′
Mutually Exclusive (Disjoint) Sets:
A and B are mutually exclusive (disjoint) iff AB = ∅.
Sets A₁, A₂, ..., Aₙ are said to be mutually exclusive iff:
AᵢAⱼ = ∅ for all i ≠ j

Collectively Exhaustive:
N sets A₁, A₂, ..., Aₙ are collectively exhaustive iff:
∪_{i=1}^{n} Aᵢ = S
If in addition AᵢAⱼ = ∅ for i ≠ j, then [A₁, ..., Aₙ] is called a partition of S.

Example:
Show that the following equality is correct:
A − AB = AB′
A − AB = A(AB)′ = A(A′ ∪ B′) = AA′ ∪ AB′ = ∅ ∪ AB′ = AB′


Probability Definitions and Axioms (pages 31-35):

Probability is a rule that assigns to each event A of the sample space a number P(A),
called the probability of A, which indicates how likely it is that the event will occur
and satisfies the following three axioms:
1) P(A) ≥ 0
2) P(S) = 1
3) For n disjoint events A₁, A₂, ..., Aₙ such that AᵢAⱼ = ∅ for i ≠ j:
P(∪_{i=1}^{n} Aᵢ) = ∑_{i=1}^{n} P(Aᵢ)

1. Equally Likely (Classical) Definition of Probability:

Consider an experiment with N different outcomes, all disjoint and equally likely to
happen, and let N_A of these N outcomes correspond to event A. Then we define:
P(A) = N_A / N
N = number of different outcomes of the chance experiment.
N_A = number of outcomes in which event A happens.
Example:
Toss a six-sided die; then:
Sample space: S = {1, 2, 3, 4, 5, 6}
Event A: {even} = {2, 4, 6}      P(A) = 3/6
Event B: {2, 3}                  P(B) = 2/6
Event C: {4, 5}                  P(C) = 2/6
Since B and C are disjoint:
P(B ∪ C) = P(B) + P(C) = 2/6 + 2/6 = 4/6


2) Relative Frequency Definition:

Let us repeat an experiment n times under the same conditions. If event A happens
n_A times, then P(A) is defined as:
P(A) = lim_{n→∞} n_A/n
where n_A/n is called the relative frequency of occurrence of event A, which is a
good measurement of the probability of event A. So P(A) ≈ n_A/n for large n.
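The relative-frequency definition can be illustrated with a short simulation (a minimal Python sketch; the die experiment, sample size, and seed are illustrative choices):

```python
import random

# Estimate P(even) for a fair die by repeating the experiment n times
# and counting how often the event occurs (relative frequency n_A / n).
random.seed(1)
n = 100_000
n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)
print(n_A / n)  # close to the true probability 0.5
```

As n grows, the printed relative frequency settles near 1/2, which is the classical value 3/6 from the die example above.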

Corollaries of the axioms:
1) P(∅) = 0
2) P(A′) = 1 − P(A)
3) P(A ∪ B) = P(A) + P(B) − P(AB)

Note: If AB = ∅, then P(AB) = 0 in the above equation.

Example:
Draw a card from a deck of 52 playing cards and find:
1) P(ace) = 4/52
2) P(heart) = 13/52
3) P(heart ∪ spade) = P(heart) + P(spade) = 13/52 + 13/52 = 1/2   (disjoint events)
4) P(heart ∪ ace) = P(heart) + P(ace) − P(heart ∩ ace)
   = 13/52 + 4/52 − 1/52 = 16/52


Conditional Probability (page 48):

The conditional probability of an event A, given that event B has occurred, is
denoted by P(A|B) and is defined by:
P(A|B) = P(AB)/P(B)   for P(B) ≠ 0
In computing P(A|B), we can therefore view the experiment as now having the
reduced sample space B. The event A occurs in the reduced sample space. From the
above equation, the joint probability P(AB) follows as:
P(AB) = P(A|B)P(B)
Similarly:
P(B|A) = P(AB)/P(A)
P(AB) = P(B|A)P(A)

Example:
A box contains two black and three white balls. Two balls are selected at random
from the box without replacement and the sequence of colors is noted. Find the
probability that both balls are black (B = black ball, W = white ball):
P(B₁B₂) = P(B₂|B₁)P(B₁) = (1/4)(2/5) = 1/10
P(B₂) = P(B₂B₁ ∪ B₂W₁) = P(B₂B₁) + P(B₂W₁)
      = P(B₂|B₁)P(B₁) + P(B₂|W₁)P(W₁) = (1/4)(2/5) + (2/4)(3/5) = 2/5
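As a sanity check, these probabilities can be verified by enumerating all equally likely ordered draws (a small Python sketch; the urn list is just a labeling of the five balls):

```python
from fractions import Fraction
from itertools import permutations

# Enumerate all ordered draws of two balls without replacement from an urn
# with 2 black ('B') and 3 white ('W') balls: 5 * 4 = 20 equally likely pairs.
urn = ['B', 'B', 'W', 'W', 'W']
draws = list(permutations(urn, 2))

p_both_black = Fraction(sum(1 for d in draws if d == ('B', 'B')), len(draws))
p_second_black = Fraction(sum(1 for d in draws if d[1] == 'B'), len(draws))

print(p_both_black)    # 1/10
print(p_second_black)  # 2/5
```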


Total or Marginal Probability (page 52):

The probability of any event A defined on a sample space S can be expressed in
terms of joint or conditional probabilities. Consider n mutually exclusive events
B₁, ..., Bₙ whose union is equal to S:
BᵢBⱼ = ∅   for i ≠ j,  i, j = 1, 2, ..., n
∪_{i=1}^{n} Bᵢ = S
Then:
P(A) = ∑_{i=1}^{n} P(ABᵢ) = ∑_{i=1}^{n} P(A|Bᵢ)P(Bᵢ)
P(A) is called the total probability of A, the probability of A without regard to
which Bᵢ occurs.

Proof:
A = AS = A(B₁ ∪ B₂ ∪ ... ∪ Bₙ)
  = AB₁ ∪ AB₂ ∪ ... ∪ ABₙ   (mutually exclusive events)
P(A) = ∑_{i=1}^{n} P(ABᵢ) = ∑_{i=1}^{n} P(A|Bᵢ)P(Bᵢ)
Example:
An assembler of radios uses motors from three different factories. Factory one (B₁)
supplies 50% of the motors, factory two (B₂) supplies 20% of the motors and factory
three (B₃) supplies 30%. It is known that 5% of the motors supplied by B₁, 3% of the
motors supplied by B₂ and 2% of the motors supplied by B₃ are defective.
a) What is the probability that a selected motor is defective?
b) What is the probability that this defective motor was supplied by B₂?
a) P(D) = P(DB₁ ∪ DB₂ ∪ DB₃) = P(D|B₁)P(B₁) + P(D|B₂)P(B₂) + P(D|B₃)P(B₃)
   = (0.05)(0.5) + (0.03)(0.2) + (0.02)(0.3) = 0.037
b) P(B₂|D) = P(B₂D)/P(D) = P(D|B₂)P(B₂)/P(D) = 0.006/0.037 ≈ 0.162
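The same computation as a minimal Python sketch (the dictionary names are illustrative):

```python
# Total probability and Bayes' rule for the motor example
# (supply shares and defect rates taken from the text).
supply = {'B1': 0.5, 'B2': 0.2, 'B3': 0.3}     # P(Bi)
defect = {'B1': 0.05, 'B2': 0.03, 'B3': 0.02}  # P(D|Bi)

p_d = sum(defect[b] * supply[b] for b in supply)  # total probability P(D)
p_b2_given_d = defect['B2'] * supply['B2'] / p_d  # Bayes' rule P(B2|D)

print(round(p_d, 3))           # 0.037
print(round(p_b2_given_d, 3))  # 0.162
```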


Bayes' Rule (page 52):

P(Bᵢ|A) = P(ABᵢ)/P(A) = P(A|Bᵢ)P(Bᵢ) / ∑_{j=1}^{n} P(A|Bⱼ)P(Bⱼ)

Statistically Independent Events:

Two events A and B are called statistically independent if the probability of the
occurrence of one event is not affected by the occurrence or non-occurrence of the
other one. Mathematically:
P(A|B) = P(A)
Or:
P(AB) = P(A)P(B)
Three events A₁, A₂, A₃ are called independent if and only if:
1) P(AᵢAⱼ) = P(Aᵢ)P(Aⱼ)   for all i ≠ j
2) P(A₁A₂A₃) = P(A₁)P(A₂)P(A₃)
Example: Toss a die.
Outcomes: {1, 2, 3, 4, 5, 6}
Events:
A = {odd} = {1, 3, 5}
B = {even} = {2, 4, 6}
C = {1, 2}
a) Are A and B statistically independent events?
b) Are A and C statistically independent events?

Answer: a) Check whether P(AB) = P(A)P(B):
P(A) = P(B) = 1/2


P(AB) = P(∅) = 0
P(AB) ≠ P(A)P(B)
So A and B are not independent.

b) Check whether P(AC) = P(A)P(C), or equivalently whether P(C|A) = P(C):
P(A) = 1/2,  P(C) = 1/3
P(AC) = P({1}) = 1/6
P(AC) = P(A)P(C) = (1/2)(1/3) = 1/6
Or: P(C|A) = 1/3 = P(C)
So A and C are statistically independent.

Random Variable
Definition:
A real random variable X is a single-valued function that maps each outcome of the
sample space into a real number. The domain of this function is the set of
experimental outcomes (i.e., S) and its range is a set of real numbers (i.e., R). Hence
a random variable is a function that assigns a real number to each outcome of the
sample space.

Example:
Toss a coin three times and define the random variable X as the number of heads in
each outcome of S:
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
X(s) = 3, 2, 2, 2, 1, 1, 1, 0
X is a random variable taking the values 0, 1, 2, 3 (a many-to-one mapping).


Note:
P(X = 0) = 1/8
P(X = 1) = 3/8
P(X = 2) = 3/8
P(X = 3) = 1/8

Example:
Toss a die and define the random variable X as X(sᵢ) = 10·i.
S = {s₁, s₂, ..., s₆} → X ∈ {10, 20, 30, ..., 60}
This is a one-to-one mapping:
P(X = 35) = P(∅) = 0
P(X ≤ 35) = 3/6
P(X = 40) = 1/6

The Probability Distribution Function (page 141):

The distribution function of the random variable X is defined as the probability of
the event {X ≤ x}:
F(x) = P(X ≤ x)
Example: In the example of the die as defined above, F(x) is a staircase function
with:
F(0) = 0
F(10) = 1/6
F(60) = 1


Properties of the Distribution Function (P.D.F.) (page 144):

1) F(−∞) = P(X < −∞) = P(∅) = 0
2) F(∞) = P(X < ∞) = P(S) = 1
3) F(x₁) ≤ F(x₂) if x₁ ≤ x₂; F(x) is a non-decreasing function of x with
   non-negative slope.
4) F(x) = F(x⁺) at a point of discontinuity (right continuity), where x⁺ = x + ε,
   ε → 0⁺.
5) P(x₁ < X ≤ x₂) = F(x₂) − F(x₁)   for x₁ < x₂
6) P(X = x) = F(x⁺) − F(x⁻)
7) For a continuous random variable, P(X = x) = 0.

In the previous example of the die:
P(20 < X ≤ 40) = F(40) − F(20) = 4/6 − 2/6 = 2/6
P(20 ≤ X ≤ 40) = F(40) − F(20⁻) = 4/6 − 1/6 = 3/6

Probability Density Function (pdf) (page 148):

The probability density function is denoted by f(x) and is defined as the derivative
of the distribution function:
f(x) = dF(x)/dx
f(x) represents the density of probability at the point x. For small Δx:
P(x < X ≤ x + Δx) = F(x + Δx) − F(x) ≈ [dF(x)/dx]·Δx = f(x)·Δx
f(x) exists where the derivative of F(x) exists.


For a discrete random variable with a stair-step distribution function, the unit
impulse function δ(x) is introduced to define f(x) at the stair-step points as:
f(x) = ∑_{i=1}^{n} P(X = xᵢ) δ(x − xᵢ)

Properties of the probability density function:

1) f(x) ≥ 0, since F(x) has non-negative slope.
2) F(x) = ∫_{−∞}^{x} f(u) du
3) ∫_{−∞}^{∞} f(x) dx = F(x)|_{−∞}^{∞} = F(∞) − F(−∞) = 1
4) P(x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} f(x) dx = F(x₂) − F(x₁)


Example:
Let X be a continuous random variable with probability density function defined as:
f(x) = { kx²,  0 ≤ x ≤ 1
       { 0,    otherwise
a) Find P(0.5 ≤ X ≤ 0.75)
b) Find F(x)
Answer:
First find k:  ∫_{0}^{1} f(x) dx = ∫_{0}^{1} kx² dx = k/3 = 1  →  k = 3
a) P(0.5 ≤ X ≤ 0.75) = ∫_{0.5}^{0.75} f(x) dx = ∫_{0.5}^{0.75} 3x² dx
   = x³|_{0.5}^{0.75} = (0.75)³ − (0.5)³ ≈ 0.297
b) F(x) = ∫_{0}^{x} 3u² du = x³   for 0 ≤ x ≤ 1
   (F(x) = 0 for x < 0 and F(x) = 1 for x > 1)
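The normalization and part (a) can be checked numerically, e.g. with a simple midpoint Riemann sum (a Python sketch; the step count is an arbitrary choice):

```python
# Numeric check of the f(x) = 3x^2 example using a midpoint Riemann sum.
def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 3 * x**2

total = integrate(f, 0.0, 1.0)   # should be 1 (a valid density)
prob = integrate(f, 0.5, 0.75)   # P(0.5 <= X <= 0.75)

print(round(total, 6))  # 1.0
print(round(prob, 6))   # 0.296875
```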

Note:
The unit step function and the delta function are defined as:
u(x) = { 0,  x < 0
       { 1,  x ≥ 0
δ(x) = du(x)/dx,   ∫_{−∞}^{x} δ(u) du = u(x)
F(x) for a discrete RV can be written as a weighted sum of unit step functions:
F(x) = ∑_{i} P(X = xᵢ) u(x − xᵢ)


where P(xᵢ) = P(X = xᵢ) is called the probability mass function. Therefore:
f(x) = dF(x)/dx = ∑_{i} P(X = xᵢ) δ(x − xᵢ)

Example:
Recall the example of the die, where the RV X is discrete:
P(X = 10·i) = 1/6,   i = 1, 2, 3, ..., 6
F(x) = (1/6)[u(x − 10) + u(x − 20) + ... + u(x − 60)]
f(x) = (1/6)[δ(x − 10) + δ(x − 20) + ... + δ(x − 60)]


Samples of Density Functions (page 149):

1) Uniform density function:
f(x) = { 1/(b − a),  a ≤ x ≤ b
       { 0,          otherwise
F(x) = ∫_{−∞}^{x} f(u) du = (x − a)/(b − a)   for a ≤ x ≤ b

Example:
A resistor R is an RV uniformly distributed between [900, 1100] Ω. Find the
probability that R is between 950 and 1050 Ω.
f(r) = 1/(1100 − 900) = 1/200
P(950 < R < 1050) = ∫_{950}^{1050} (1/200) dr = 100/200 = 0.5

2) Normal (Gaussian) (page 167):

f(x) = [1/(σ√(2π))] exp{−(x − m)²/(2σ²)},   −∞ < x < ∞

f(x) is bell-shaped and symmetrical about x = m.
m = E[X] = mean or average value of X
σ² = variance (VAR), the spread of X around the mean value
σ = standard deviation (SDV) = +√σ²
The distribution function of a Gaussian random variable is:
F(x) = ∫_{−∞}^{x} f(u) du = ∫_{−∞}^{x} [1/(σ√(2π))] exp{−(u − m)²/(2σ²)} du
P(x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} [1/(σ√(2π))] exp{−(x − m)²/(2σ²)} dx

There is no closed-form analytical solution for the above integrals; they must be
calculated numerically. A table exists for the distribution function of the standard
Gaussian N(0, 1), with m = 0 and σ = 1:
Φ(z) = ∫_{−∞}^{z} [1/√(2π)] exp{−u²/2} du   (standard Gaussian distribution
function, the standard normal p.d.f. integrated)
For negative values of z we can use Φ(−z) = 1 − Φ(z).
We can calculate a general Gaussian distribution function from the standard normal
distribution function and its table, using the variable transformation:
z = (x − m)/σ
Therefore, with z₁ = (x₁ − m)/σ and z₂ = (x₂ − m)/σ:
F(x) = Φ((x − m)/σ)
P(x₁ < X < x₂) = Φ(z₂) − Φ(z₁) = Φ((x₂ − m)/σ) − Φ((x₁ − m)/σ)

Example:
A random variable X is normally distributed with mean 1000 and standard deviation
equal to 50 (i.e., N(1000, 50)). Find:


P(900 < X < 1050) = Φ((1050 − 1000)/50) − Φ((900 − 1000)/50) = Φ(1) − Φ(−2)
= Φ(1) − [1 − Φ(2)] = Φ(1) + Φ(2) − 1 = 0.841 + 0.977 − 1 = 0.818
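The table lookups can be reproduced with the error function from the Python standard library (a sketch; phi here plays the role of the standard normal table):

```python
import math

# Standard normal distribution function via the error function.
def phi(z):
    """Phi(z) = P(Z <= z) for Z ~ N(0, 1)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

m, sigma = 1000.0, 50.0
p = phi((1050 - m) / sigma) - phi((900 - m) / sigma)
print(round(p, 4))  # 0.8186 (the 3-digit table values above give 0.818)
```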

Binomial (Bernoulli Trials) (pages 60, 103):

A binomial experiment is composed of n identical independent trials (Bernoulli
trials), where each trial has just two outcomes, success or failure, with
P(success) = p and P(failure) = q = 1 − p. The binomial arises in applications where
there are two outcomes, such as head/tail, good/defective, correct/incorrect,
pass/fail.
Let the RV X be the number of successes in n trials. Then X has a binomial
distribution of order n with:
P(X = k) = C(n, k) pᵏ qⁿ⁻ᵏ,   k = 0, 1, 2, ..., n,   p + q = 1
where C(n, k) = n!/[k!(n − k)!], with the following density and distribution
functions:
f(x) = ∑_{k=0}^{n} C(n, k) pᵏ qⁿ⁻ᵏ δ(x − k)
F(x) = ∑_{k=0}^{n} C(n, k) pᵏ qⁿ⁻ᵏ u(x − k)
For large n, the binomial density function can be approximated with a Gaussian
density function with m = np and σ² = npq, and we can use the standard normal
table to calculate the binomial distribution function (applying a continuity
correction of 1/2 at the step points).

Poisson Density Function (page 120):

In many applications we are interested in counting the number of occurrences of an
event in a certain time period, or in a certain region of space. The Poisson RV arises
in counts of emissions from radioactive substances, in counts of demands for
telephone connections, in counts of defects in a semiconductor chip, and in counts
of the number of customers coming to a teller.
The probability function of the Poisson RV is given by:
P(X = k) = aᵏ e⁻ᵃ / k!,   k = 0, 1, 2, 3, ...
where a is the average number of event occurrences in the given time interval or
region of space: a = λT = average number of events in interval T, and
λ = average rate at which events occur.
The Poisson density and distribution functions are:
f(x) = ∑_{k=0}^{∞} (aᵏ e⁻ᵃ / k!) δ(x − k)
F(x) = ∑_{k=0}^{∞} (aᵏ e⁻ᵃ / k!) u(x − k)

Example:
Let X be the number of accidents at one intersection during one week, where the
average number of accidents in one week is reported as 2 (a = 2).
a) Write the density and distribution functions of X.
b) Find F(2).


Answer:
a) f(x) = ∑_{k=0}^{∞} (2ᵏ e⁻² / k!) δ(x − k)
   F(x) = ∑_{k=0}^{∞} (2ᵏ e⁻² / k!) u(x − k)
b) F(2) = P(X ≤ 2) = ∑_{k=0}^{2} 2ᵏ e⁻² / k!
   = e⁻²·2⁰/0! + e⁻²·2¹/1! + e⁻²·2²/2!
   = 0.135 + 0.271 + 0.271 ≈ 0.677
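Part (b) as a short Python sketch (the function name is an illustrative choice):

```python
import math

# Poisson distribution function F(x) = P(X <= x), applied to the accident
# example with a = 2.
def poisson_cdf(x, a):
    return sum(a**k * math.exp(-a) / math.factorial(k) for k in range(int(x) + 1))

print(round(poisson_cdf(2, 2), 3))  # 0.677
```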

Remark:
Density and distribution functions for a Poisson random variable, when plotted,
appear similar to those of the binomial. In fact, the binomial probability function
converges to the Poisson probability function for n → ∞ and p → 0 in such a way
that np = a:
P(X = k) = C(n, k) pᵏ qⁿ⁻ᵏ → aᵏ e⁻ᵃ / k!
Example:
The probability of a bit error in a communication channel is 10⁻³. Find the
probability that a block of 1000 bits has five or more errors.
Each bit transmission corresponds to a Bernoulli trial, with success equal to a bit
error:
n = 1000,  p = 10⁻³,  a = np = 1000 × 10⁻³ = 1
P(X ≥ 5) = 1 − P(X < 5) = 1 − ∑_{k=0}^{4} aᵏ e⁻ᵃ / k!
= 1 − e⁻¹ [1 + 1/1! + 1/2! + 1/3! + 1/4!] ≈ 0.00366
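The Poisson approximation can be compared against the exact binomial tail (a Python sketch):

```python
import math

# Compare the exact binomial tail P(X >= 5) with its Poisson approximation
# for n = 1000 bits and bit-error probability p = 1e-3 (so a = np = 1).
n, p = 1000, 1e-3
a = n * p

binom_lt5 = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))
poisson_lt5 = sum(a**k * math.exp(-a) / math.factorial(k) for k in range(5))

print(round(1 - binom_lt5, 5))    # exact binomial tail
print(round(1 - poisson_lt5, 5))  # 0.00366 (Poisson approximation)
```

The two tails agree to about four decimal places, which is the point of the remark above.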

Function of a RV (page 125):

Let Y = g(X). Find f_Y(y) from the density function f_X(x) of X.
Solve the equation y = g(x) and denote the real roots by xᵢ:
y = g(x₁) = g(x₂) = ... = g(xₙ)
Then, using the transformation theorem, we can calculate the density function of Y
as:
f_Y(y) = ∑_{i=1}^{n} f_X(xᵢ) / |g′(xᵢ)|
Note that g′(xᵢ) = dg(x)/dx evaluated at x = xᵢ.

Example:
Given Y = g(X) = X² and f_X(x) = (1/√(2π)) e^{−x²/2} for −∞ < x < ∞, find the
density function of Y.
Find the roots of y = x²:  x₁ = √y,  x₂ = −√y   (for y > 0)
g′(x) = 2x,  so |g′(x₁)| = |g′(x₂)| = 2√y
f_Y(y) = f_X(x₁)/|g′(x₁)| + f_X(x₂)/|g′(x₂)| = [f_X(√y) + f_X(−√y)] / (2√y)
= (1/√(2πy)) e^{−y/2},   y > 0

Joint Statistics

Joint probability distribution function (page 243):

F(x, y) = P(X ≤ x, Y ≤ y)
F(−∞, −∞) = P(∅) = 0
F(∞, ∞) = P(S) = 1
F(−∞, y) = P(X < −∞) = P(∅) = 0
F(∞, y) = P(X < ∞, Y ≤ y) = P(Y ≤ y) = F_Y(y), the marginal distribution
function of Y
F(x, ∞) = F_X(x), the marginal distribution function of X
F(x, y) = F(x⁺, y⁺): right continuous
F(x, y) is a non-decreasing function of both arguments x and y.
Joint density function:
f(x, y) = ∂²F(x, y)/∂x∂y ≥ 0

Note:
1) F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv
2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = P(S) = 1
3) P(x₁ < X ≤ x₂, y₁ < Y ≤ y₂) = ∫_{y₁}^{y₂} ∫_{x₁}^{x₂} f(x, y) dx dy

Remark:
If X and Y are two discrete random variables, the distribution function F(x, y) has
step discontinuities, with undefined derivatives at these discontinuity points.


However, by using impulse functions we can define the joint density function
f(x, y) at these points as:
f(x, y) = ∑_{i} ∑_{j} P(xᵢ, yⱼ) δ(x − xᵢ) δ(y − yⱼ)
And the joint distribution function is:
F(x, y) = ∑_{i} ∑_{j} P(xᵢ, yⱼ) u(x − xᵢ) u(y − yⱼ)

Example:
Consider two continuous random variables X and Y with the joint density function:
f(x, y) = { kxy,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
          { 0,    otherwise
a) Find F(x, y)
b) Find F(1/2, 3/4)
c) Find P(X ≤ Y)
Answer:
k = ?  ∫_{0}^{1} ∫_{0}^{1} kxy dx dy = k/4 = P(S) = 1  →  k = 4
a) F(x, y) = ∫_{0}^{y} ∫_{0}^{x} 4uv du dv = ∫_{0}^{y} 2x²v dv = x²y²
   for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
b) F(1/2, 3/4) = (1/2)²(3/4)² = (1/4)(9/16) = 9/64
c) P(X ≤ Y) = ∫_{0}^{1} ∫_{0}^{y} 4xy dx dy = ∫_{0}^{1} 2y³ dy = y⁴/2 |_{0}^{1} = 1/2
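A numeric sanity check of k = 4 and part (c), using a midpoint Riemann sum over the unit square (a Python sketch; the grid size is an arbitrary choice):

```python
# Numeric check of the joint density f(x, y) = 4xy on the unit square.
def double_integral(g, n=500):
    """Midpoint Riemann sum of g over [0, 1] x [0, 1]."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            total += g(x, y) * h * h
    return total

f = lambda x, y: 4 * x * y

total = double_integral(f)                                # should be 1
p_x_le_y = double_integral(lambda x, y: f(x, y) * (x <= y))  # P(X <= Y)

print(round(total, 4))     # 1.0
print(round(p_x_le_y, 2))  # 0.5
```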


Marginal Distribution and Density Functions (page 249):

F(x, ∞) = F_X(x)
F(∞, y) = F_Y(y)
Therefore:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
This is the marginal density function of X obtained from the joint density function.
Similarly:
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
For discrete random variables we have:
P(X = xᵢ) = ∑_{j} P(X = xᵢ, Y = yⱼ)
Recall total probability and compare: P(A) = ∑_{i} P(ABᵢ).

Independent Random Variables (page 254):

Two RVs X and Y are called statistically independent if the events {X ≤ x} and
{Y ≤ y} are independent, which means:
P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y)
F(x, y) = F_X(x) F_Y(y)
Or:
f(x, y) = f_X(x) f_Y(y)
f(x|y) = f_X(x)


Conditional Distribution Function:

F(x|B) is the conditional distribution function of the RV X assuming event B, and is
defined as the conditional probability of the event {X ≤ x}:
F(x|B) = P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B)
F(x|B) has all the properties of a distribution function, as discussed before.

Conditional Density Function:
f(x|B) = dF(x|B)/dx
∫_{−∞}^{∞} f(x|B) dx = 1
F(x|B) = ∫_{−∞}^{x} f(u|B) du
P(x₁ < X ≤ x₂ | B) = ∫_{x₁}^{x₂} f(x|B) dx

1) Interval Conditioning:
Let B = {x₁ < X ≤ x₂}. Then:
F(x|B) = P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B)
       = { 0,                                  x ≤ x₁
         { [F(x) − F(x₁)] / [F(x₂) − F(x₁)],  x₁ < x ≤ x₂
         { 1,                                  x > x₂
By differentiating:
f(x|B) = dF(x|B)/dx = { f(x) / [F(x₂) − F(x₁)],  x₁ < x ≤ x₂
                      { 0,                        otherwise
where F(x₂) − F(x₁) = P(B) = P(x₁ < X ≤ x₂).


2) Point Conditioning, where B = {Y = y}. Then:

f(x|y) = f(x|Y = y) = f(x, y) / f_Y(y)
This is the familiar form of the conditional density.

Example:
Consider two RVs X and Y with the joint density function:
f(x, y) = (1/3)(x + y),   0 ≤ x ≤ 1,  0 ≤ y ≤ 2
Find:
1) F(x, y)
2) F_X(x) and F_Y(y)
3) f_X(x) and f_Y(y)
4) f(x|y)
5) f(x|B) where B = {0 ≤ X ≤ 0.5}
6) P(0.2 ≤ X ≤ 0.4, 1 ≤ Y ≤ 1.5)

Answer:

1) F(x, y) = ∫_{0}^{y} ∫_{0}^{x} (1/3)(u + v) du dv = (1/6)(x²y + xy²)
2) F_X(x) = F(x, 2) = (1/6)(2x² + 4x)
   F_Y(y) = F(1, y) = (1/6)(y + y²)
3) f_X(x) = dF_X(x)/dx = (1/6)(4x + 4) = (2/3)(x + 1),   0 ≤ x ≤ 1
   Or: f_X(x) = ∫_{0}^{2} (1/3)(x + y) dy = (1/3)(2x + 2) = (2/3)(x + 1)
   f_Y(y) = ∫_{0}^{1} (1/3)(x + y) dx = (1/3)(1/2 + y) = (1/6)(2y + 1),   0 ≤ y ≤ 2


4) f(x|y) = f(x, y)/f_Y(y) = [(1/3)(x + y)] / [(1/6)(2y + 1)] = 2(x + y)/(2y + 1),
   0 ≤ x ≤ 1
5) With B = {0 ≤ X ≤ 0.5}:
   P(B) = ∫_{0}^{0.5} f_X(x) dx = ∫_{0}^{0.5} (2/3)(x + 1) dx
        = (2/3)(x²/2 + x)|_{0}^{0.5} = 5/12
   f(x|B) = f_X(x)/P(B) = [(2/3)(x + 1)] / (5/12) = (8/5)(x + 1),   0 ≤ x ≤ 0.5
   (and 0 otherwise)
6) P(0.2 ≤ X ≤ 0.4, 1 ≤ Y ≤ 1.5) = ∫_{1}^{1.5} ∫_{0.2}^{0.4} (1/3)(x + y) dx dy
   = (1/3) ∫_{1}^{1.5} (0.06 + 0.2y) dy = (1/3)(0.03 + 0.125) ≈ 0.052
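Part 6 can be checked numerically with a midpoint Riemann sum over the small rectangle (a Python sketch; the grid size is an arbitrary choice):

```python
# Numeric check of part 6 for f(x, y) = (x + y)/3 on 0<=x<=1, 0<=y<=2.
def box_probability(x1, x2, y1, y2, n=400):
    hx, hy = (x2 - x1) / n, (y2 - y1) / n
    total = 0.0
    for i in range(n):
        x = x1 + (i + 0.5) * hx
        for j in range(n):
            y = y1 + (j + 0.5) * hy
            total += (x + y) / 3.0 * hx * hy
    return total

val = box_probability(0.2, 0.4, 1.0, 1.5)
print(round(val, 4))  # 0.0517
```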

Expected Values (page 155):

Definition:
The process of averaging when a random variable is involved is called expectation.
For a random variable X, the notation E[X] = m denotes the expected value of X,
also called the statistical average of X, or the mean value of X.
In general, consider a random variable X and a function g(X) of it; the expected
value of g(X) is defined as:
E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx     for a continuous RV
E[g(X)] = ∑_{i} g(xᵢ) P(xᵢ)           for a discrete RV

Mean:
The mean of an RV is defined by:
E[X] = m = ∫_{−∞}^{∞} x f(x) dx       for a continuous RV
E[X] = m = ∑_{i=1}^{n} xᵢ P(xᵢ)       for a discrete RV

Variance (VAR):
VAR[X] = σ² = E[(X − m)²] ≥ 0

For a discrete R.V.:
σ² = ∑_{i=1}^{n} (xᵢ − m)² P(xᵢ)
For a continuous R.V.:
σ² = E[(X − m)²] = ∫_{−∞}^{∞} (x − m)² f(x) dx
1) Variance is a measure of the spread about the mean value.
2) Variance is always a non-negative value.
3) If σ² = 0, then P(X = m) = 1, i.e., X is a constant.
The positive constant σ is called the standard deviation of X.

Another form of σ²:
σ² = E[(X − m)²] = E[X² + m² − 2mX]
   = E[X²] + m² − 2mE[X]
   = E[X²] − m²
If m = 0, then σ² = E[X²].
Example:
Throw a dart towards a target:
P(X = 1) = 0.64
P(X = 2) = 0.32
P(X = 3) = 0.04
E[X] = ∑ xᵢ P(xᵢ) = (1)(0.64) + (2)(0.32) + (3)(0.04) = 1.4
σ² = E[X²] − m²
E[X²] = ∑ xᵢ² P(xᵢ) = (1)²(0.64) + (2)²(0.32) + (3)²(0.04) = 2.28
σ² = 2.28 − (1.4)² = 0.32
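The dart example as a minimal Python sketch, computing mean and variance from the pmf:

```python
# Mean and variance of the discrete dart example, computed from its pmf.
pmf = {1: 0.64, 2: 0.32, 3: 0.04}

mean = sum(x * p for x, p in pmf.items())
second_moment = sum(x**2 * p for x, p in pmf.items())
variance = second_moment - mean**2

print(round(mean, 2))      # 1.4
print(round(variance, 2))  # 0.32
```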


Moments about the Origin:

Denote the k-th moment by mₖ:
mₖ = E[Xᵏ] = ∫_{−∞}^{∞} xᵏ f(x) dx
mₖ = E[Xᵏ] = ∑_{i} xᵢᵏ P(xᵢ)          for a discrete RV

Central Moments:
Moments around the mean value of the random variable X are called central
moments and are given by:
μₖ = E[(X − m)ᵏ] = ∫_{−∞}^{∞} (x − m)ᵏ f(x) dx
or, for a discrete RV: μₖ = ∑_{i} (xᵢ − m)ᵏ P(xᵢ)
Example:
m₁ = E[X] = m
μ₁ = E[X − m] = 0
μ₂ = E[(X − m)²] = σ² = m₂ − m²
Properties of expected values:
1) E[c] = c, where c is a constant:
E[c] = ∫_{−∞}^{∞} c f(x) dx = c ∫_{−∞}^{∞} f(x) dx = c

Note: Joint expectation. Let Z = g(X, Y); then:
E[Z] = E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy
     = ∑_{i} ∑_{j} g(xᵢ, yⱼ) P(xᵢ, yⱼ)      for discrete RVs
2) E[X₁ + X₂] = E[X₁] + E[X₂]


Since:
E[X₁ + X₂] = ∫∫ (x₁ + x₂) f(x₁, x₂) dx₁ dx₂
= ∫∫ x₁ f(x₁, x₂) dx₁ dx₂ + ∫∫ x₂ f(x₁, x₂) dx₁ dx₂
= ∫ x₁ f(x₁) dx₁ + ∫ x₂ f(x₂) dx₂
= E[X₁] + E[X₂]
In general:

3) E[∑_{i=1}^{n} aᵢXᵢ] = ∑_{i=1}^{n} E[aᵢXᵢ] = ∑_{i=1}^{n} aᵢE[Xᵢ], where the aᵢ
are constants.

4) For two independent random variables X₁ and X₂:
E[X₁X₂] = E[X₁]E[X₂]
Since:
E[X₁X₂] = ∫∫ x₁x₂ f(x₁, x₂) dx₁ dx₂ = ∫∫ x₁x₂ f(x₁) f(x₂) dx₁ dx₂
= ∫ x₁ f(x₁) dx₁ · ∫ x₂ f(x₂) dx₂ = E[X₁]E[X₂]
In general, for independent RVs:
E[X₁X₂...Xₙ] = E[∏_{i=1}^{n} Xᵢ] = ∏_{i=1}^{n} E[Xᵢ]

Example:
Find the mean and variance of a uniform random variable:
f(x) = 1/(x₂ − x₁),   x₁ ≤ x ≤ x₂
m = E[X] = ∫_{x₁}^{x₂} x f(x) dx = ∫_{x₁}^{x₂} x/(x₂ − x₁) dx = (x₁ + x₂)/2

σ² = E[(X − m)²] = ∫_{x₁}^{x₂} (x − (x₁ + x₂)/2)² f(x) dx
Let u = x − (x₁ + x₂)/2, du = dx:
σ² = [1/(x₂ − x₁)] ∫_{−(x₂−x₁)/2}^{(x₂−x₁)/2} u² du = (x₂ − x₁)²/12
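Both results can be checked numerically for a concrete interval (a Python sketch reusing the [900, 1100] resistor interval from the earlier uniform example):

```python
# Check m = (x1+x2)/2 and var = (x2-x1)^2/12 for a uniform density,
# using a midpoint Riemann sum over the interval.
def uniform_moments(x1, x2, n=100_000):
    h = (x2 - x1) / n
    xs = [x1 + (i + 0.5) * h for i in range(n)]
    mean = sum(xs) * h / (x2 - x1)
    var = sum((x - mean) ** 2 for x in xs) * h / (x2 - x1)
    return mean, var

m, v = uniform_moments(900.0, 1100.0)
print(round(m, 3))  # 1000.0
print(round(v, 1))  # 3333.3  (= 200**2 / 12)
```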

Example:
Consider two independent RVs X₁ and X₂ with:
E[X₁] = 1,  E[X₂] = 2,  σ₁² = 2,  σ₂² = 1
Find the mean and variance of Y, where Y = 2X₁ + 3X₂.

E[Y] = E[2X₁ + 3X₂] = 2E[X₁] + 3E[X₂] = 2(1) + 3(2) = 8
σ_Y² = E[Y²] − 8² = E[4X₁² + 9X₂² + 12X₁X₂] − 8²
= 4E[X₁²] + 9E[X₂²] + 12E[X₁X₂] − 64
= 4(σ₁² + m₁²) + 9(σ₂² + m₂²) + 12m₁m₂ − 64
= 4(2 + 1) + 9(1 + 4) + 12(1)(2) − 64 = 12 + 45 + 24 − 64 = 17

Joint Expectation:
Z = g(X, Y)
E[Z] = E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy
     = ∑_{i} ∑_{j} g(xᵢ, yⱼ) P(xᵢ, yⱼ)
Covariance:
Consider two random variables X and Y:
Cov(X, Y) = C_XY = E[(X − m_X)(Y − m_Y)]
= E[XY − m_Y X − m_X Y + m_X m_Y]
= E[XY] − m_X m_Y

where E[XY] is called the correlation between X and Y and is denoted by R_XY.
Let X and Y be independent RVs; then:
R_XY = E[XY] = m_X m_Y
C_XY = R_XY − m_X m_Y = 0   (uncorrelated RVs)

Uncorrelatedness:
C_XY = 0

Orthogonality:
R_XY = E[XY] = 0

Note:
Independence implies uncorrelatedness.
Uncorrelatedness does not imply independence (not in general).

Correlation Coefficient:
ρ = C_XY / (σ_X σ_Y),   |ρ| ≤ 1

Example:
Let X and Y be statistically independent RVs with m_X = 0.75, m_Y = 1,
E[X²] = 4 and E[Y²] = 5. Consider W = X − 2Y + 1.
Find the following:
1) R_XY = E[XY] = E[X]E[Y] = (0.75)(1) = 0.75
2) R_XW = E[XW] = E[X² − 2XY + X] = E[X²] − 2E[X]E[Y] + E[X]
   = 4 − 2(0.75) + 0.75 = 3.25
3) C_XY = R_XY − m_X m_Y = 0.75 − 0.75 = 0
4) ρ_XY = C_XY / (σ_X σ_Y) = 0
5) C_XW = R_XW − m_X m_W = 3.25 − 0.75(0.75 − 2 + 1) = 3.4375
6) ρ_XW = C_XW / (σ_X σ_W), where σ_X² = E[X²] − m_X² = 3.4375,
   σ_Y² = E[Y²] − m_Y² = 4, and σ_W² = σ_X² + 4σ_Y² = 19.4375:
   ρ_XW = 3.4375 / √(3.4375 × 19.4375) ≈ 0.42

Characteristic Function (C.F.):

The C.F. of an RV X is defined as:
Φ_X(ω) = E[e^{jωX}] = ∫_{−∞}^{∞} f(x) e^{jωx} dx
i.e., Φ_X(ω) is the Fourier transform of f(x) (with the sign of ω reversed).

Note:
1) Φ_X(0) = ∫_{−∞}^{∞} f(x) dx = 1
2) |Φ_X(ω)| = |∫ f(x) e^{jωx} dx| ≤ ∫ |f(x)||e^{jωx}| dx = ∫ f(x) dx = 1
   so |Φ_X(ω)| ≤ 1 = Φ_X(0)

Applications of the characteristic function:

1) Moments:  E[Xⁿ] = (1/jⁿ) dⁿΦ_X(ω)/dωⁿ |_{ω=0}

2) Consider two independent RVs X and Y. Let W = X + Y; then:
Φ_W(ω) = E[e^{jω(X+Y)}]
= E[e^{jωX} e^{jωY}]
= E[e^{jωX}] E[e^{jωY}]
= Φ_X(ω) Φ_Y(ω)
Recall: F{f₁ * f₂} = F₁(ω)F₂(ω), where * stands for the convolution integral.
Hence Φ_W(ω) = Φ_X(ω)Φ_Y(ω) implies:
f_W(w) = f_X(w) * f_Y(w) = ∫_{−∞}^{∞} f_X(x) f_Y(w − x) dx
       = ∫_{−∞}^{∞} f_Y(y) f_X(w − y) dy
By induction, if Y = ∑_{i=1}^{n} Xᵢ with all Xᵢ independent RVs, then:
Φ_Y(ω) = Φ₁(ω)Φ₂(ω)...Φₙ(ω) = ∏_{i=1}^{n} Φᵢ(ω)
E[Y] = ∑_{i=1}^{n} E[Xᵢ]
σ_Y² = ∑_{i=1}^{n} σᵢ²
2 = =1 2
Example:
W = X + Y, where X and Y are independent RVs with:
f_X(x) = αe^{−αx},  x ≥ 0        f_Y(y) = αe^{−αy},  y ≥ 0
f_W(w) = ∫ f_X(x) f_Y(w − x) dx = ∫_{0}^{w} αe^{−αx} αe^{−α(w−x)} dx
= { α²w e^{−αw},  w ≥ 0
  { 0,            w < 0

Example:
Find the characteristic function of an exponentially distributed RV X with
parameter α, given by:
f(x) = αe^{−αx},   x ≥ 0
Answer:
1) Φ_X(ω) = ∫_{0}^{∞} αe^{−αx} e^{jωx} dx = ∫_{0}^{∞} αe^{−(α−jω)x} dx
   = α/(α − jω)

2) Find the mean of the random variable from the C.F.:
E[X] = (1/j) dΦ_X(ω)/dω |_{ω=0} = (1/j) [jα/(α − jω)²] |_{ω=0} = 1/α
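The result E[X] = 1/α can be checked by numerically integrating x·f(x) (a Python sketch; α = 2 and the truncation point of the integral are illustrative choices):

```python
import math

# Numeric check that E[X] = 1/alpha for the exponential density
# f(x) = alpha * exp(-alpha * x), truncating where the tail is negligible.
def exp_mean(alpha, upper=50.0, n=200_000):
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x * alpha * math.exp(-alpha * x) * h
    return total

print(round(exp_mean(2.0), 4))  # 0.5 = 1/alpha
```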

Random Vector:
A random vector is a vector whose components are random variables:
X = [X₁ X₂ ... Xₙ]ᵀ
F_X(x) = F_{X₁,...,Xₙ}(x₁, ..., xₙ) = P(X₁ ≤ x₁, ..., Xₙ ≤ xₙ)
f_X(x) = f_{X₁,...,Xₙ}(x₁, ..., xₙ) = ∂ⁿF(x₁, ..., xₙ)/∂x₁...∂xₙ

General Transformation:
Consider n functions of the random vector X = [X₁ ... Xₙ]ᵀ:
Y₁ = g₁(X₁, ..., Xₙ)
Y₂ = g₂(X₁, ..., Xₙ)
...
Yₙ = gₙ(X₁, ..., Xₙ)
Solve this system of equations for x₁, x₂, ..., xₙ, assuming there is a single
solution x = g⁻¹(y). Then:
f_Y(y₁, y₂, ..., yₙ) = f_X(x₁, x₂, ..., xₙ) / |J(x₁, x₂, ..., xₙ)|
where J is the Jacobian of the transformation:
J(x₁, x₂, ..., xₙ) = det[∂gᵢ/∂xⱼ]   (the n × n matrix of partial derivatives)

Example:
Given:
Y₁ = g₁(X₁, X₂) = X₁ + X₂
Y₂ = g₂(X₁, X₂) = X₁/(X₁ + X₂)
and
f_X(x₁, x₂) = e^{−(x₁+x₂)},   x₁ ≥ 0, x₂ ≥ 0
Find f_Y(y₁, y₂).
f_Y(y₁, y₂) = f_X(x₁, x₂)/|J(x₁, x₂)|
Solve for x₁ and x₂:
x₁ = g₁⁻¹(y₁, y₂) = y₁y₂
x₂ = g₂⁻¹(y₁, y₂) = y₁ − y₁y₂ = y₁(1 − y₂)

J(x₁, x₂) = det [ ∂g₁/∂x₁   ∂g₁/∂x₂ ] = det [ 1              1             ]
                [ ∂g₂/∂x₁   ∂g₂/∂x₂ ]       [ x₂/(x₁+x₂)²  −x₁/(x₁+x₂)² ]
= −x₁/(x₁+x₂)² − x₂/(x₁+x₂)² = −1/(x₁+x₂) = −1/y₁
|J(x₁, x₂)| = 1/y₁

f_Y(y₁, y₂) = y₁ f_X(y₁y₂, y₁ − y₁y₂) = y₁ e^{−(y₁y₂ + y₁ − y₁y₂)} = y₁ e^{−y₁},
y₁ ≥ 0,  0 ≤ y₂ ≤ 1

Limits:
x₁ = y₁y₂ ≥ 0 and x₂ = y₁(1 − y₂) ≥ 0  imply  y₁ ≥ 0 and 0 ≤ y₂ ≤ 1
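A quick Monte Carlo check of this result (a Python sketch; the sample size and seed are arbitrary): since f_Y(y₁, y₂) = y₁e^{−y₁} on y₁ ≥ 0, 0 ≤ y₂ ≤ 1, Y₁ should have mean 2 and Y₂ should be uniform with mean 1/2.

```python
import random

# Simulate X1, X2 i.i.d. exponential(1) and check the moments of
# Y1 = X1 + X2 and Y2 = X1/(X1 + X2).
random.seed(0)
N = 200_000
y1_sum = y2_sum = 0.0
for _ in range(N):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    y1_sum += x1 + x2
    y2_sum += x1 / (x1 + x2)

print(round(y1_sum / N, 1))  # 2.0
print(round(y2_sum / N, 1))  # 0.5
```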

Correlation and Covariance Matrices:

The correlation and covariance matrices for a random vector X = [X₁ X₂ ... Xₙ]ᵀ
are defined as:
R_X = E[XXᵀ] = [Rᵢⱼ],   Rᵢⱼ = E[XᵢXⱼ],   Rᵢᵢ = E[Xᵢ²] ≥ 0

        [ R₁₁  R₁₂  ...  R₁ₙ ]
R_X  =  [ R₂₁  R₂₂  ...  R₂ₙ ]
        [  .    .          .  ]
        [ Rₙ₁  Rₙ₂  ...  Rₙₙ ]

C_X = E[(X − m_X)(X − m_X)ᵀ] = R_X − m_X m_Xᵀ = [Cᵢⱼ]
where Cᵢⱼ = Cov(Xᵢ, Xⱼ) = Rᵢⱼ − mᵢmⱼ and Cᵢᵢ = σᵢ² = E[(Xᵢ − mᵢ)²].

Note:
Rᵢⱼ = Rⱼᵢ and Cᵢⱼ = Cⱼᵢ, so R_X and C_X are symmetric matrices
(Aᵀ denotes the transpose of A).

Jointly Normal (Gaussian) RVs:

RVs X₁, X₂, ..., Xₙ have a jointly Gaussian distribution N(m, C), where m is the
n-vector of means and C is the n × n covariance matrix, with the following joint
density function:
f(x₁, x₂, ..., xₙ) = [1/((2π)^{n/2} |C|^{1/2})] exp{−(1/2)(x − m)ᵀ C⁻¹ (x − m)}

For n = 2:
C = [ σ₁²     ρσ₁σ₂ ]    C⁻¹ = [1/(σ₁²σ₂²(1 − ρ²))] [ σ₂²     −ρσ₁σ₂ ]
    [ ρσ₁σ₂  σ₂²    ]                                 [ −ρσ₁σ₂  σ₁²    ]

f(x₁, x₂) = [1/(2πσ₁σ₂√(1 − ρ²))] ×
exp{ −[1/(2(1 − ρ²))] [ (x₁ − m₁)²/σ₁² − 2ρ(x₁ − m₁)(x₂ − m₂)/(σ₁σ₂)
                        + (x₂ − m₂)²/σ₂² ] }

Note:
If ρ = 0, then X₁ and X₂ are independent random variables:
f(x₁, x₂) = [1/(2πσ₁σ₂)] exp{ −(x₁ − m₁)²/(2σ₁²) − (x₂ − m₂)²/(2σ₂²) }
          = f(x₁) f(x₂)

Linear Transformation of Gaussian RVs:

Y₁ = a₁₁X₁ + ... + a₁ₙXₙ
Y₂ = a₂₁X₁ + ... + a₂ₙXₙ
...
Yₙ = aₙ₁X₁ + ... + aₙₙXₙ
In matrix form Y = AX, X = A⁻¹Y, where:
A = [ a₁₁ ... a₁ₙ ]
    [  .        .  ]
    [ aₙ₁ ... aₙₙ ]
Then:
m_Y = A m_X   and   C_Y = A C_X Aᵀ

Example:
Y₁ = 5X₁ + 2X₂ + 2
Y₂ = X₁ − X₃ + 1
Y₃ = 2X₁ − X₂ + 2X₃

      [ 4  2  1 ]        [ 5   2   0 ]
C_X = [ 2  4  2 ]    A = [ 1   0  −1 ]
      [ 1  2  4 ]        [ 2  −1   2 ]

               [ 156  15  48 ]
C_Y = AC_XAᵀ = [  15   6   0 ]
               [  48   0  28 ]

Note:
σ²_Y₁ = 156,   σ²_Y₂ = 6,   σ²_Y₃ = 28,   C_Y₁Y₂ = 15
ρ₁₂ = C_Y₁Y₂ / (σ_Y₁ σ_Y₂) = 15/√(156 × 6) ≈ 0.49
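The matrix result C_Y = A C_X Aᵀ can be verified directly (a plain-Python sketch, so no external packages are needed):

```python
# Verify C_Y = A * C_X * A^T for the linear-transformation example.
def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def transpose(P):
    return [list(col) for col in zip(*P)]

A  = [[5, 2, 0], [1, 0, -1], [2, -1, 2]]
Cx = [[4, 2, 1], [2, 4, 2], [1, 2, 4]]

Cy = matmul(matmul(A, Cx), transpose(A))
print(Cy)  # [[156, 15, 48], [15, 6, 0], [48, 0, 28]]
```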

Central Limit Theorem (CLT):

The CLT states that the distribution function of the sum of n independent RVs,
Y = X₁ + X₂ + ... + Xₙ, converges to the Gaussian distribution with:
m_Y = ∑_{i=1}^{n} mᵢ   and   σ_Y² = ∑_{i=1}^{n} σᵢ²
So Y → N(m_Y, σ_Y²).
A binomial RV can also be approximated with the Gaussian for large n (npq ≫ 1):
B(n, p) ≈ N(np, npq)
This is called the DeMoivre-Laplace theorem.

Example:
Y = ∑_{i=1}^{100} Xᵢ, where each Xᵢ is uniformly distributed in the interval [0, 1].
So:
E[Xᵢ] = (1 + 0)/2 = 1/2
σᵢ² = (1 − 0)²/12 = 1/12
m_Y = ∑_{i=1}^{100} mᵢ = 100 × (1/2) = 50
σ_Y² = ∑_{i=1}^{100} σᵢ² = 100/12 = 25/3
Y ≈ N(50, 25/3)

Example:
Consider a fair coin tossed 1014 times. Find the probability that heads appears
between 4900 and 5000 times.

39

EG_EE 580
Dr. S.Ghazanshahi

(4900 5000) (, )
= [

104
2

50005000

= (
1

50

= (
2

, 104 ( )( )] = (5000,50)

100

) (
1

49005000
50

)
1

) = 2 [1 (2)] = 2 + 0.977 = 0.477


50

40
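The normal-approximation arithmetic as a Python sketch (phi is the standard normal distribution function built from math.erf):

```python
import math

# Normal-approximation check for the coin example: X ~ Binomial(10^4, 1/2),
# approximated by a Gaussian with m = 5000 and sigma = 50.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = phi((5000 - 5000) / 50) - phi((4900 - 5000) / 50)
print(round(p, 3))  # 0.477
```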