gazanshahin notes


Dr. S.Ghazanshahi

Random Experiment:

A random experiment is an experiment whose outcome varies in an unpredictable fashion when the experiment is repeated under the same conditions.

Sample Space:

The sample space S of a random experiment is defined as the set of all possible outcomes of the random experiment.

Event (Set):

An event is a subset of the sample space S that contains specific outcomes.

Special Events:

a) S — the universal set (or certain event), which consists of all possible outcomes of the experiment.

b) ∅ — the impossible or null event, which contains no outcomes and hence never occurs.

Example:

Select a ball from an urn containing balls numbered 1 to 20:

S = {1, 2, 3, …, 20} — a discrete sample space.

A = {even number} = {2, 4, 6, …, 20}

Pick two numbers at random between zero and one — a continuous sample space.

EG_EE 580


1) Union (Sum, "Or"):

A ∪ B (or A + B) is defined as the set of all outcomes that belong to A or B.

2) Intersection (Product, "And"):

The intersection of two events A and B is denoted by A ∩ B (or AB) and is defined as the set of outcomes that are in both A and B.

3) Complement of an event A:

The complement of an event A is denoted by Aᶜ and is defined as the set of all outcomes in S but not in A.

Set Algebra:

1) Commutative properties:

A ∪ B = B ∪ A

A ∩ B = B ∩ A

2) Associative properties:

(A ∪ B) ∪ C = A ∪ (B ∪ C) = A ∪ B ∪ C

(A ∩ B) ∩ C = A ∩ (B ∩ C) = ABC

3) Distributive properties:

A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)

4) De Morgan's Rules:

(A ∪ B)ᶜ = Aᶜ ∩ Bᶜ

(A ∩ B)ᶜ = Aᶜ ∪ Bᶜ

In general:

(⋃_{i=1}^{n} Ai)ᶜ = ⋂_{i=1}^{n} Aiᶜ

(⋂_{i=1}^{n} Ai)ᶜ = ⋃_{i=1}^{n} Aiᶜ

Mutually Exclusive (Disjoint) Sets:

A and B are mutually exclusive (disjoint) iff A ∩ B = ∅.

Sets A1, A2, … are said to be mutually exclusive iff:

Ai ∩ Aj = ∅    for all i ≠ j

Collectively Exhaustive:

N sets A1, A2, …, AN are collectively exhaustive iff:

⋃_{i=1}^{N} Ai = S

If in addition Ai ∩ Aj = ∅ for i ≠ j, then [A1, …, AN] is called a partition of S.

Example:

Show that the following equality is correct:

(A − AB) = A ∩ Bᶜ

A − AB = A ∩ (AB)ᶜ = A ∩ (Aᶜ ∪ Bᶜ) = (A ∩ Aᶜ) ∪ (A ∩ Bᶜ) = ∅ ∪ (A ∩ Bᶜ) = A ∩ Bᶜ


Probability:

Probability is a rule that assigns to each event A of the sample space a number P(A), called the probability of A, which indicates how likely it is that the event will occur, and satisfies the following three axioms:

1) P(A) ≥ 0

2) P(S) = 1

3) For n disjoint events A1, A2, …, An such that Ai ∩ Aj = ∅ for i ≠ j:

P(⋃_{i=1}^{n} Ai) = Σ_{i=1}^{n} P(Ai)

Classical Definition:

Consider an experiment with N different outcomes, all disjoint and equally likely to happen, and let N_A of these N outcomes correspond to event A. Then we define:

P(A) = N_A / N

N_A = number of outcomes in which event A happens.

Example:

Toss a six-sided die; then:

Sample space: S = {1, 2, 3, 4, 5, 6}

Event A: {even} = {2, 4, 6}    Event B: {2, 3}    Event C: {4, 5}

P(A) = 3/6    P(B) = 2/6    P(C) = 2/6

P(B ∪ C) = P(B) + P(C) = 2/6 + 2/6 = 4/6


Relative Frequency Definition:

Let us repeat an experiment n times under the same conditions. If event A happens n_A times, then P(A) is defined as:

P(A) = lim_{n→∞} (n_A / n)

where n_A is the number of occurrences of A in n trials.

Corollaries of the axioms:

1) P(∅) = 0

2) P(Aᶜ) = 1 − P(A)

3) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
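The relative-frequency definition can be illustrated with a short simulation (a sketch: the die experiment and the event A = {even} are carried over from the earlier example; the variable names are my own):

```python
import random

random.seed(0)

n = 100_000
# Event A = {2, 4, 6} on a fair six-sided die; true P(A) = 3/6.
n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)

p_hat = n_A / n  # relative frequency estimate of P(A)
print(abs(p_hat - 0.5) < 0.01)  # the estimate converges toward 3/6
```

As n grows, the relative frequency n_A/n settles near the axiomatic value.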

Example:

Draw a card from a deck of 52 playing cards and find:

1) P(ace) = 4/52

2) P(heart) = 13/52

3) P(heart ∪ spade) = P(heart) + P(spade) = 13/52 + 13/52 = 1/2

4) P(heart ∪ ace) = P(heart) + P(ace) − P(heart ∩ ace) = 13/52 + 4/52 − 1/52 = 16/52


Conditional Probability:

The conditional probability of an event A given that event B has occurred, denoted by P(A|B), is defined by:

P(A|B) = P(AB) / P(B)    for P(B) ≠ 0

In computing P(A|B), we can therefore view the experiment as now having the reduced sample space B. The event A occurs in the reduced sample space. From the above equation, the joint probability P(AB) follows as:

P(AB) = P(A|B) P(B)

Similarly:

P(B|A) = P(AB) / P(A)

P(AB) = P(B|A) P(A)

Example:

A box contains two black and three white balls. Two balls are selected at random from the box without replacement and the sequence of colors is noted. Find the probability that both balls are black. (B = black ball, W = white ball; Bi means ball i is black.)

P(B1 B2) = P(B2|B1) P(B1) = (1/4)(2/5) = 1/10

P(B2) = P(B2 B1 ∪ B2 W1) = P(B2 B1) + P(B2 W1)

      = P(B2|B1) P(B1) + P(B2|W1) P(W1) = (1/4)(2/5) + (2/4)(3/5) = 2/5
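The sequential conditioning in this example can be checked by simulation (a sketch; the urn contents match the example, the helper name is mine):

```python
import random

random.seed(1)

def draw_two():
    """Draw two balls without replacement from {B, B, W, W, W}."""
    urn = ["B", "B", "W", "W", "W"]
    random.shuffle(urn)
    return urn[0], urn[1]

n = 200_000
both_black = sum(1 for _ in range(n) if draw_two() == ("B", "B"))
second_black = sum(1 for _ in range(n) if draw_two()[1] == "B")

print(round(both_black / n, 2))   # close to 1/10
print(round(second_black / n, 2)) # close to 2/5
```

The empirical frequencies agree with P(B1 B2) = 1/10 and P(B2) = 2/5.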


Total Probability:

It is often useful to express the probability of an event A in terms of joint or conditional probabilities. Consider n mutually exclusive events whose union is equal to S:

Ai ∩ Aj = ∅    i ≠ j,   i, j = 1, 2, …, n

⋃_{i=1}^{n} Ai = S

Then:

P(A) = Σ_{i=1}^{n} P(A Ai) = Σ_{i=1}^{n} P(A|Ai) P(Ai)

Proof:

A = AS = A ∩ (A1 ∪ A2 ∪ … ∪ An)

  = AA1 ∪ AA2 ∪ … ∪ AAn    (mutually exclusive events)

P(A) = Σ_{i=1}^{n} P(A Ai) = Σ_{i=1}^{n} P(A|Ai) P(Ai)

Example:

An assembler of radios uses motors from three different factories. Factory one (F1) supplies 50% of the motors, factory two (F2) supplies 20% of the motors, and factory three (F3) supplies 30%. It is known that 5% of the motors supplied by F1, 3% of the motors supplied by F2, and 2% of the motors supplied by F3 are defective.

a) What is the probability that the selected motor is defective?

b) What is the probability that this defective motor was supplied by F2?

a) P(D) = P(D F1 ∪ D F2 ∪ D F3) = P(D|F1)P(F1) + P(D|F2)P(F2) + P(D|F3)P(F3)

   = (0.05)(0.5) + (0.03)(0.2) + (0.02)(0.3) = 0.037

b) P(F2|D) = P(D F2)/P(D) = P(D|F2)P(F2)/P(D) = (0.03)(0.2)/0.037 ≈ 0.162
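The two-step calculation in this example can be written directly in code (a sketch; the dictionary names are my own, the numbers are from the example):

```python
# Total probability and Bayes' rule for the motor example.
priors = {"F1": 0.50, "F2": 0.20, "F3": 0.30}        # P(Fi)
defect_rates = {"F1": 0.05, "F2": 0.03, "F3": 0.02}  # P(D|Fi)

p_defective = sum(defect_rates[f] * priors[f] for f in priors)  # total probability
posterior_F2 = defect_rates["F2"] * priors["F2"] / p_defective  # Bayes' rule

print(round(p_defective, 3))   # 0.037
print(round(posterior_F2, 3))  # 0.162
```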


Bayes' Rule:

P(Aj|A) = P(A Aj)/P(A) = P(A|Aj) P(Aj) / Σ_{i=1}^{n} P(A|Ai) P(Ai)

Statistical Independence:

Two events A and B are called statistically independent if the probability of the occurrence of one event is not affected by the occurrence or non-occurrence of the other one. Mathematically:

P(A|B) = P(A)

or:

P(AB) = P(A) P(B)

Three events are called independent if and only if:

1) P(Ai Aj) = P(Ai) P(Aj)    for every pair i ≠ j

2) P(A1 A2 A3) = P(A1) P(A2) P(A3)

Example: Toss a die.

Outcomes: S = {1, 2, 3, 4, 5, 6}

Events:

A = {odd} = {1, 3, 5}

B = {even} = {2, 4, 6}

C = {1, 2}

a) Are A and B statistically independent events?

b) Are A and C statistically independent events?

a) Check: P(AB) =? P(A) P(B)

P(A) = P(B) = 1/2

P(AB) = P(∅) = 0

P(AB) ≠ P(A) P(B), so A and B are not independent.

b) Check: P(AC) =? P(A) P(C)

P(A) = 1/2,  P(C) = 1/3

P(AC) = P({1}) = 1/6 = P(A) P(C), so A and C are independent.

Or check: P(C|A) = 1/3 = P(C).

Random Variable

Definition:

A real random variable X is a single-valued function that maps each outcome of the sample space into a real number. The domain of this function is the set of experimental outcomes (i.e., S) and its range is a set of real numbers (i.e., R). Hence a random variable is a function that assigns a real number X(s) to each outcome s of the sample space.

Example:

Toss a coin three times and define the random variable X as the number of heads in each outcome s in S:

S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}

X(s) = 3, 2, 2, 2, 1, 1, 1, 0

Note:

P(X = 0) = 1/8

P(X = 1) = 3/8

P(X = 2) = 3/8

P(X = 3) = 1/8

Example:

Toss a die and define the random variable X as X(si) = 10 i.

S = {s1, s2, …, s6}  →  X ∈ {10, 20, 30, …, 60}

This is a one-to-one mapping:

P(X = 35) = P(∅) = 0

P(X ≤ 35) = 3/6

P(X = 40) = 1/6

Distribution Function:

The distribution function of the random variable X is defined as the probability of the event {X ≤ x}:

F(x) = P(X ≤ x)

Example: In the example of the die as defined above, F(x) is a staircase function with steps of height 1/6 at x = 10, 20, …, 60:

F(0) = 0,   F(10) = 1/6,   F(60) = 1


Properties of F(x):

1) F(−∞) = P(X < −∞) = P(∅) = 0

2) F(+∞) = P(X < +∞) = P(S) = 1

3) F(x1) ≤ F(x2) if x1 ≤ x2; F(x) is a nondecreasing function of x.

4) F(x) = F(x⁺); F(x) is continuous from the right, where x⁺ = x + ε, ε → 0⁺.

5) P(x1 < X ≤ x2) = F(x2) − F(x1)    for x1 < x2

6) P(X = x) = F(x⁺) − F(x⁻)

7) For a continuous random variable, P(X = x) = 0.

Probability Density Function:

The probability density function is denoted by f(x) and is defined as the derivative of the distribution function:

f(x) = dF(x)/dx

P(x < X < x + Δx) = F(x + Δx) − F(x) = [{F(x + Δx) − F(x)}/Δx] Δx ≈ f(x) Δx


For a discrete random variable with a stair-step distribution function, the unit impulse function δ(x) is introduced to define f(x) at the stair-step points as:

f(x) = Σ_{i=1}^{n} P(xi) δ(x − xi)

Properties of f(x):

1) f(x) ≥ 0

2) F(x) = ∫_{−∞}^{x} f(u) du

3) ∫_{−∞}^{∞} f(x) dx = F(x)|_{−∞}^{∞} = F(∞) − F(−∞) = 1

4) P(x1 < X ≤ x2) = ∫_{x1}^{x2} f(x) dx = F(x2) − F(x1)


Example:

Let X be a continuous random variable with probability density function defined as:

f(x) = { c x²,   0 ≤ x ≤ 1
       { 0,      otherwise

a) Find c.

b) Find F(x).

Answer:

a) ∫_{0}^{1} f(x) dx = ∫_{0}^{1} c x² dx = c/3 = 1   →   c = 3

b) F(x) = ∫_{0}^{x} 3u² du = x³    for 0 ≤ x ≤ 1   (F(x) = 0 for x < 0, F(x) = 1 for x > 1)
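The normalization and the resulting F(x) can be verified numerically (a sketch using a simple midpoint rule; the helper name is mine):

```python
# Numerical check that f(x) = 3x^2 on [0, 1] integrates to 1,
# and that F(0.5) = 0.5**3 = 0.125.
def integrate(f, a, b, steps=100_000):
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

f = lambda x: 3 * x**2

total = integrate(f, 0.0, 1.0)   # should be 1
F_half = integrate(f, 0.0, 0.5)  # should be 0.125

print(round(total, 6))   # 1.0
print(round(F_half, 6))  # 0.125
```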

Note:

The unit step function and the delta function are defined as:

u(x) = { 0,   x < 0
       { 1,   x ≥ 0

δ(x) = du(x)/dx,   u(x) = ∫_{−∞}^{x} δ(v) dv

F(x) for discrete RVs can be written as a weighted sum of unit step functions:

F(x) = Σ_i P(xi) u(x − xi)


And:

f(x) = dF(x)/dx = Σ_i P(xi) δ(x − xi)

Example:

Recall the example of a die where the RV X is discrete:

P(xi) = 1/6,   xi = 10 i,   i = 1, 2, 3, …, 6

F(x) = (1/6)[u(x − 10) + u(x − 20) + … + u(x − 60)]


Uniform Random Variable:

f(x) = { 1/(b − a),   a ≤ x ≤ b
       { 0,           otherwise

F(x) = ∫_{−∞}^{x} f(u) du

Example:

A resistor R is an RV uniformly distributed between [900, 1100] Ω. Find the probability that R is between 950 and 1050 Ω.

f(r) = 1/(1100 − 900) = 1/200

P(950 ≤ R ≤ 1050) = ∫_{950}^{1050} (1/200) dr = 100/200 = 0.5

Gaussian (Normal) Random Variable:

f(x) = (1/(√(2π) σ)) e^{−(x−m)²/(2σ²)},   −∞ < x < ∞

m = E[X] = mean or average value of X

σ² = variance (VAR), the spread of X around the mean value

σ = standard deviation (SDV) = +√(σ²)

The distribution function of a Gaussian random variable is:

F(x) = ∫_{−∞}^{x} (1/(√(2π) σ)) e^{−(u−m)²/(2σ²)} du


P(x1 < X ≤ x2) = F(x2) − F(x1) = ∫_{x1}^{x2} (1/(√(2π) σ)) e^{−(x−m)²/(2σ²)} dx

There is no analytical solution for the above integral, and we must calculate it numerically. There exists a table for the distribution function of the standard Gaussian with m = 0 and σ = 1:

Φ(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−u²/2} du

For negative values of z we can use Φ(−z) = 1 − Φ(z).

Any Gaussian distribution function can be obtained from the standard normal distribution function and its table, using the variable transformation:

z = (x − m)/σ

Therefore:

P(x1 < X ≤ x2) = Φ(z2) − Φ(z1) = Φ((x2 − m)/σ) − Φ((x1 − m)/σ)

where z1 = (x1 − m)/σ and z2 = (x2 − m)/σ, Z ~ N(0, 1), and the standard normal p.d.f. is:

φ(z) = (1/√(2π)) e^{−z²/2}

Example:

A random variable X is normally distributed with mean 1000 and standard deviation 50 (i.e., X ~ N(1000, 50)). Find P(900 ≤ X ≤ 1050):

P(900 ≤ X ≤ 1050) = Φ((1050 − 1000)/50) − Φ((900 − 1000)/50) = Φ(1) − Φ(−2) ≈ 0.8413 − 0.0228 = 0.8186
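Rather than a table, Φ can be evaluated with the error function via the standard identity Φ(z) = ½(1 + erf(z/√2)) (a sketch; the function name `phi` is my own):

```python
import math

# Standard normal CDF via the error function.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

m, sigma = 1000.0, 50.0
p = phi((1050 - m) / sigma) - phi((900 - m) / sigma)  # Phi(1) - Phi(-2)

print(round(p, 4))  # 0.8186
```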

Binomial Random Variable:

A binomial experiment is composed of n identical independent trials (Bernoulli trials), where each trial has just two outcomes, success or failure, with P(success) = p and P(failure) = q = 1 − p. The binomial arises in applications where there are two outcomes, such as head/tail, good/defective, correct/incorrect, pass/fail.

If the RV X is the number of successes in n trials, then X has a binomial distribution of order n with:

P(X = k) = C(n, k) pᵏ qⁿ⁻ᵏ,   k = 0, 1, 2, …, n,   p + q = 1

with the following density and distribution functions:

f(x) = Σ_{k=0}^{n} C(n, k) pᵏ qⁿ⁻ᵏ δ(x − k)

F(x) = Σ_{k=0}^{n} C(n, k) pᵏ qⁿ⁻ᵏ u(x − k)

For large n, the binomial density function can be approximated by the Gaussian density N(m, σ²) with m = np and σ² = npq, and we can use the standard normal table to calculate binomial probabilities.


Poisson Random Variable:

A Poisson random variable counts the number of occurrences of an event in a certain time period, or in a certain region of space. Poisson RVs arise in counts of emissions from radioactive substances, counts of demands for telephone service, counts of defects in a semiconductor chip, and counts of customers coming to a teller.

The probability function of the Poisson RV is given by:

P(X = k) = e⁻ᵃ aᵏ / k!,   k = 0, 1, 2, 3, …

where a is the average number of events occurring in the given time interval or region of space:

a = λT = average number of events in interval T

λ = average rate at which events occur.

The Poisson density and distribution functions are:

f(x) = Σ_{k=0}^{∞} P(X = k) δ(x − k) = Σ_{k=0}^{∞} (e⁻ᵃ aᵏ / k!) δ(x − k)

F(x) = Σ_{k=0}^{∞} (e⁻ᵃ aᵏ / k!) u(x − k)

Example:

Let X be the number of accidents at one intersection during one week, where the average number of accidents in one week is reported as 2 (a = 2).

a) Write the density and distribution functions of X.

b) Find F(2).

Answer:

a) f(x) = Σ_{k=0}^{∞} (e⁻² 2ᵏ / k!) δ(x − k)

   F(x) = Σ_{k=0}^{∞} (e⁻² 2ᵏ / k!) u(x − k)

b) F(2) = Σ_{k=0}^{2} e⁻² 2ᵏ / k! = e⁻² 2⁰/0! + e⁻² 2¹/1! + e⁻² 2²/2!

        ≈ 0.135 + 0.271 + 0.271 = 0.677

Remark:

The density and distribution functions of a Poisson random variable, when plotted, appear similar to those of the binomial. In fact, the binomial probability function converges to the Poisson probability function for n → ∞ and p → 0 in such a way that np = a:

P(X = k) → e⁻ᵃ aᵏ / k!

Example:

The probability of a bit error in a communication channel is 10⁻³. Find the probability that a block of 1000 bits has five or more errors.

Each bit transmission corresponds to a Bernoulli trial, with a success equal to a bit error:

n = 1000,   p = 10⁻³,   a = np = 1000 × 10⁻³ = 1

P(X ≥ 5) = 1 − P(X < 5) = 1 − Σ_{k=0}^{4} e⁻ᵃ aᵏ / k!

         = 1 − e⁻¹ [1 + 1/1! + 1/2! + 1/3! + 1/4!] = 0.00366
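The Poisson approximation in this example can be checked against the exact binomial tail using only the standard library (a sketch; variable names are mine):

```python
import math

n, p = 1000, 1e-3
a = n * p  # Poisson parameter for the approximation

# Poisson approximation: P(X >= 5) = 1 - sum_{k=0}^{4} e^{-a} a^k / k!
poisson_tail = 1.0 - sum(math.exp(-a) * a**k / math.factorial(k) for k in range(5))

# Exact binomial tail for comparison
binom_tail = 1.0 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))

print(round(poisson_tail, 5))  # 0.00366
print(round(binom_tail, 5))
```

The two tails agree to within about 10⁻⁵, consistent with the remark above.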

Functions of a Random Variable:

Let Y = g(X). Find f_Y(y) from the density function f_X(x).

Solve the equation y = g(x) and denote the real roots by xi:

y = g(x1) = g(x2) = … = g(xn)

Then, using the transformation theorem, we can calculate the density function of Y as:

f_Y(y) = Σ_{i=1}^{n} f_X(xi) / |g′(xi)|

where g′(xi) = dg(x)/dx |_{x = xi}.

Example:

Given Y = g(X) = X², where X is a standard Gaussian RV with:

f_X(x) = (1/√(2π)) e^{−x²/2}

Find the density function of Y.

Find the roots of y = x²:   x1 = +√y,   x2 = −√y    (for y > 0)

g′(x) = 2x,   |g′(x1)| = |g′(x2)| = 2√y

f_Y(y) = f_X(x1)/|g′(x1)| + f_X(x2)/|g′(x2)| = [f_X(√y) + f_X(−√y)] / (2√y)

       = (1/√(2πy)) e^{−y/2},   y > 0
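The transformed density can be spot-checked by simulation (a sketch; the event Y ≤ 1 is my own choice, and its analytic value P(−1 ≤ X ≤ 1) = 2Φ(1) − 1 follows from the standard normal CDF):

```python
import math
import random

random.seed(2)

# If X ~ N(0, 1) and Y = X^2, then
# P(Y <= 1) = P(-1 <= X <= 1) = 2*Phi(1) - 1 = erf(1/sqrt(2)) ≈ 0.6827.
n = 200_000
hits = sum(1 for _ in range(n) if random.gauss(0.0, 1.0) ** 2 <= 1.0)

analytic = math.erf(1.0 / math.sqrt(2.0))
print(round(analytic, 4))        # 0.6827
print(abs(hits / n - analytic) < 0.01)
```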


Joint Statistics

The joint distribution function of two RVs X and Y is:

F(x1, y1) = P(X ≤ x1, Y ≤ y1)

Properties:

F(−∞, −∞) = P(∅) = 0

F(∞, ∞) = P(S) = 1

F(−∞, y) = F(x, −∞) = P(∅) = 0

F(∞, y) = P(X < ∞, Y ≤ y) = F_Y(y),   the marginal distribution function of Y

F(x, ∞) = F_X(x),   the marginal distribution function of X

F(x, y) = F(x⁺, y⁺):   continuous from the right.

F(x, y) is a nondecreasing function of both arguments x and y.

Joint density function:

f(x, y) = ∂²F(x, y)/∂x∂y ≥ 0

Note:

1) F(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du

2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = F(∞, ∞) = 1

3) P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫_{x1}^{x2} ∫_{y1}^{y2} f(x, y) dy dx

Remark:

If X and Y are two discrete random variables, the distribution function F(x, y) has step discontinuities, with undefined derivatives at these discontinuity points.


The delta function is used to define the joint density at these points as:

f(x, y) = Σ_{i=1}^{∞} Σ_{j=1}^{∞} P(xi, yj) δ(x − xi) δ(y − yj)

and the joint distribution function is:

F(x, y) = Σ_{i} Σ_{j} P(xi, yj) u(x − xi) u(y − yj)

Example:

Consider two continuous random variables X and Y with the following joint density function:

f(x, y) = { k x y,   0 ≤ x ≤ 1,  0 ≤ y ≤ 1
          { 0,       otherwise

a) Find F(x, y).

b) Find F(1/2, 3/4).

c) Find P(X ≤ Y).

Answer:

First find k:

∫_{0}^{1} ∫_{0}^{1} k x y dx dy = k/4 = 1   →   k = 4

a) F(x, y) = ∫_{0}^{x} ∫_{0}^{y} 4uv dv du = 4 (x²/2)(y²/2) = x² y²,   0 ≤ x ≤ 1,  0 ≤ y ≤ 1

b) F(1/2, 3/4) = (1/2)² (3/4)² = (1/4)(9/16) = 9/64

c) P(X ≤ Y) = ∫_{y=0}^{1} ∫_{x=0}^{y} 4xy dx dy = ∫_{0}^{1} 2y³ dy = (y⁴/2)|₀¹ = 1/2
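These three results can be verified with a simple midpoint-rule double integral (a sketch; the helper name is mine):

```python
# Numerical checks for the joint density f(x, y) = 4xy on the unit square.
def double_integral(f, x_hi, y_hi, steps=400):
    hx, hy = x_hi / steps, y_hi / steps
    return sum(
        f((i + 0.5) * hx, (j + 0.5) * hy)
        for i in range(steps)
        for j in range(steps)
    ) * hx * hy

f = lambda x, y: 4 * x * y
g = lambda x, y: 4 * x * y if x <= y else 0.0  # restrict to the region X <= Y

print(round(double_integral(f, 1.0, 1.0), 4))   # total mass: 1.0
print(round(double_integral(f, 0.5, 0.75), 4))  # F(1/2, 3/4) = 9/64 = 0.1406
print(round(double_integral(g, 1.0, 1.0), 2))   # P(X <= Y), close to 1/2
```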


Marginal Statistics:

F(x, ∞) = F_X(x)

F(∞, y) = F_Y(y)

Therefore:

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

This is the marginal density function of X obtained from the joint density function. Similarly:

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

For discrete RVs:

P(X = xi) = Σ_j P(X = xi, Y = yj)

(Recall total probability and compare.)

Independent Random Variables:

Two RVs X and Y are called statistically independent if the events {X ≤ x} and {Y ≤ y} are independent, which means:

P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y)

F(x, y) = F_X(x) F_Y(y)

or:

f(x, y) = f_X(x) f_Y(y),   f(x|y) = f_X(x)


Conditional Distributions:

F(x|B) is the conditional distribution function of the RV X given event B, and is defined as the conditional probability of the event {X ≤ x}:

F(x|B) = P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B)

f(x|B) = dF(x|B)/dx

F(∞|B) = P(S|B) = 1

F(x|B) = ∫_{−∞}^{x} f(u|B) du

P(x1 < X ≤ x2 | B) = ∫_{x1}^{x2} f(x|B) dx

1) Interval Conditioning:

Let B = {x1 < X ≤ x2}.

F(x|B) = P(X ≤ x | B) = P({X ≤ x} ∩ B) / P(B)

       = { 0,                        x ≤ x1
         { [F(x) − F(x1)] / P(B),    x1 < x ≤ x2
         { 1,                        x > x2

By differentiating:

f(x|B) = dF(x|B)/dx = { f(x)/P(B) = f(x) / ∫_{x1}^{x2} f(u) du,   x1 < x ≤ x2
                      { 0,                                        otherwise

where B = {x1 < X ≤ x2} or, for example, B = {X ≤ x2}.


2) Point Conditioning:

f(x|y) = f(x | Y = y) = f(x, y) / f_Y(y)

Example:

Consider two RVs X and Y with the joint density function:

f(x, y) = (1/3)(x + y),   0 ≤ x ≤ 1,  0 ≤ y ≤ 2

Find:

1) F(x, y)

2) F_X(x) and F_Y(y)

3) f_X(x) and f_Y(y)

4) f(x|y)

5) F(x|B) where B = {0 ≤ Y ≤ 0.5}

6) P(0.2 ≤ X ≤ 0.4, 1 ≤ Y ≤ 1.5)

Answer:

1) F(x, y) = ∫_{0}^{x} ∫_{0}^{y} (1/3)(u + v) dv du = (1/6)(x²y + xy²)

2) F_X(x) = F(x, 2) = (1/6)(2x² + 4x)

   F_Y(y) = F(1, y) = (1/6)(y + y²)

3) f_X(x) = dF_X(x)/dx = (1/6)(4x + 4) = (2/3)(x + 1)

   Or: f_X(x) = ∫_{0}^{2} (1/3)(x + y) dy = (2/3)(x + 1)

   f_Y(y) = ∫_{0}^{1} (1/3)(x + y) dx = (1/6)(2y + 1)

4) f(x|y) = f(x, y)/f_Y(y) = [(1/3)(x + y)] / [(1/6)(2y + 1)] = 2(x + y)/(2y + 1)

5) P(B) = F_Y(0.5) = (1/6)(0.5 + 0.25) = 1/8

   F(x|B) = P(X ≤ x, 0 ≤ Y ≤ 0.5)/P(B) = F(x, 0.5)/F_Y(0.5)

          = [(1/6)(0.5x² + 0.25x)] / (1/8) = (1/3)(2x² + x)

6) P(0.2 ≤ X ≤ 0.4, 1 ≤ Y ≤ 1.5) = ∫_{0.2}^{0.4} ∫_{1}^{1.5} (1/3)(x + y) dy dx

   = (1/3) ∫_{0.2}^{0.4} (0.5x + 0.625) dx = (1/3)(0.155) ≈ 0.052
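The normalization and the region probability in part 6 can be verified numerically (a sketch with a midpoint-rule double integral; the helper name is mine):

```python
# Numerical checks for the joint density f(x, y) = (x + y)/3
# on 0 <= x <= 1, 0 <= y <= 2.
def double_integral(f, x0, x1, y0, y1, steps=500):
    hx, hy = (x1 - x0) / steps, (y1 - y0) / steps
    return sum(
        f(x0 + (i + 0.5) * hx, y0 + (j + 0.5) * hy)
        for i in range(steps)
        for j in range(steps)
    ) * hx * hy

f = lambda x, y: (x + y) / 3.0

print(round(double_integral(f, 0, 1, 0, 2), 4))          # total mass: 1.0
print(round(double_integral(f, 0.2, 0.4, 1.0, 1.5), 4))  # part 6: 0.0517
```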

Expectation

Definition:

The process of averaging when a random variable is involved is called expectation. For a random variable X, the notation E[X] = m is called the expected value of X, the statistical average of X, or the mean value of X.

In general, consider a random variable X and a function g(X) of it; then the expected value of g(X) is defined as:

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx    for a continuous RV

E[g(X)] = Σ_i g(xi) P(xi)            for a discrete RV

Mean:

The mean of the RV X is defined by:

E[X] = m = ∫_{−∞}^{∞} x f(x) dx      for a continuous RV

E[X] = m = Σ_{i=1}^{n} xi P(xi)      for a discrete RV

Variance (VAR):

VAR[X] = σ² = E[(X − m)²] ≥ 0


That is:

σ² = Σ_{i=1}^{n} (xi − m)² P(xi)                  for a discrete RV

σ² = E[(X − m)²] = ∫_{−∞}^{∞} (x − m)² f(x) dx    for a continuous RV

1) Variance is a measure of the spread about the mean value.

2) Variance is always a nonnegative value.

3) If σ² = 0, then P(X = m) = 1: X is a constant.

Another form of σ²:

σ² = E[(X − m)²] = E[X² + m² − 2mX]

   = E[X²] + m² − 2mE[X]

   = E[X²] − m² = E[X²] − E²[X]

If m = 0, then σ² = E[X²].

Example:

Throw a dart towards a target:

P(X = 1) = 0.64,   P(X = 2) = 0.32,   P(X = 3) = 0.04

E[X] = Σ xi P(xi) = (1)(0.64) + (2)(0.32) + (3)(0.04) = 1.4

E[X²] = Σ xi² P(xi) = (1)²(0.64) + (2)²(0.32) + (3)²(0.04) = 2.28

σ² = E[X²] − E²[X] = 2.28 − (1.4)² = 0.32
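The discrete mean and variance formulas used here translate directly to code (a sketch; the dictionary name is mine):

```python
# Mean and variance of the dart RV above, computed from its pmf.
pmf = {1: 0.64, 2: 0.32, 3: 0.04}

mean = sum(x * p for x, p in pmf.items())
second_moment = sum(x**2 * p for x, p in pmf.items())
var = second_moment - mean**2

print(round(mean, 2))  # 1.4
print(round(var, 2))   # 0.32
```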


Moments:

Denote the kth moment by mk:

mk = E[Xᵏ] = ∫_{−∞}^{∞} xᵏ f(x) dx

mk = E[Xᵏ] = Σ_i xiᵏ P(xi)    for a discrete RV

Central Moments:

Moments around the mean value of the random variable X are called central moments and are given by:

μk = E[(X − m)ᵏ] = ∫_{−∞}^{∞} (x − m)ᵏ f(x) dx

or, for a discrete RV:   μk = Σ_i (xi − m)ᵏ P(xi)

Example:

μ1 = E[X − m] = m − m = 0

μ2 = E[(X − m)²] = σ² = m2 − m²

Properties of expected values:

1) E[c] = c, where c is a constant.

   E[cX] = ∫ c x f(x) dx = c E[X]

For Z = g(X, Y):

E[Z] = E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy

     = Σ_i Σ_j g(xi, yj) P(xi, yj)    for discrete RVs

2) E[X1 + X2] = E[X1] + E[X2]

Since:

E[X1 + X2] = ∫∫ (x1 + x2) f(x1, x2) dx1 dx2

           = ∫∫ x1 f(x1, x2) dx1 dx2 + ∫∫ x2 f(x1, x2) dx1 dx2

           = ∫ x1 f(x1) dx1 + ∫ x2 f(x2) dx2

           = E[X1] + E[X2]

In general:

3) E[Σ_{i=1}^{n} ai Xi] = Σ_{i=1}^{n} E[ai Xi] = Σ_{i=1}^{n} ai E[Xi],   where the ai are constants.

4) For independent RVs:   E[X1 X2] = E[X1] E[X2]

Since:

E[X1 X2] = ∫∫ x1 x2 f(x1, x2) dx1 dx2 = ∫∫ x1 x2 f(x1) f(x2) dx1 dx2

         = ∫ x1 f(x1) dx1 ∫ x2 f(x2) dx2 = E[X1] E[X2]

In general, for independent RVs:

E[X1 X2 … Xn] = E[∏_{i=1}^{n} Xi] = ∏_{i=1}^{n} E[Xi]

Example:

Find the mean and variance of the uniform random variable:

f(x) = 1/(x2 − x1),   x1 ≤ x ≤ x2

m = E[X] = ∫_{x1}^{x2} x f(x) dx = (1/(x2 − x1)) ∫_{x1}^{x2} x dx = (x1 + x2)/2

σ² = E[(X − m)²] = ∫_{x1}^{x2} (x − (x1 + x2)/2)² f(x) dx

Let y = x − (x1 + x2)/2; then:

σ² = (1/(x2 − x1)) ∫_{−(x2 − x1)/2}^{(x2 − x1)/2} y² dy = (x2 − x1)²/12

Example:

Consider two independent RVs X1 and X2 with:

E[X1] = 1,   E[X2] = 2,   σ1² = 2,   σ2² = 1

and let Y = 2X1 + 3X2.

E[Y] = E[2X1 + 3X2] = 2E[X1] + 3E[X2] = 2(1) + 3(2) = 8

σ_Y² = E[Y²] − 8² = E[4X1² + 9X2² + 12X1X2] − 8²

     = 4E[X1²] + 9E[X2²] + 12E[X1X2] − 64

     = 4(σ1² + m1²) + 9(σ2² + m2²) + 12 m1 m2 − 64

     = 12 + 45 + 24 − 64 = 17

Joint Expectation

For Z = g(X, Y):

E[Z] = E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy

     = Σ_i Σ_j g(xi, yj) P(xi, yj)    for discrete RVs

Covariance

Consider two random variables X and Y:

COV(X, Y) = C_XY = E[(X − m_X)(Y − m_Y)]

          = E[XY − m_X Y − m_Y X + m_X m_Y]

          = E[XY] − m_X m_Y

where E[XY] is called the correlation between X and Y and is denoted by R_XY. Let X and Y be independent RVs; then:

R_XY = E[XY] = E[X]E[Y] = m_X m_Y

C_XY = R_XY − m_X m_Y = 0   →   uncorrelated RVs

Uncorrelatedness:

C_XY = 0,   i.e., E[XY] = E[X]E[Y]

Orthogonality:

R_XY = E[XY] = 0

Note:

Independence ⇒ uncorrelatedness.

Uncorrelatedness ⇏ independence (not in general).

Correlation Coefficient:

ρ = C_XY / (σ_X σ_Y),   |ρ| ≤ 1

Example:

Let X and Y be statistically independent RVs with m_X = 0.75, m_Y = 1, E[X²] = 4 and E[Y²] = 5.

Consider Z = X − 2Y + 1.

Find the following:

1) R_XY = E[XY] = E[X]E[Y] = (0.75)(1) = 0.75

2) R_XZ = E[XZ] = E[X² − 2XY + X] = E[X²] − 2E[X]E[Y] + E[X]

        = 4 − 2(0.75) + 0.75 = 3.25

3) C_XY = R_XY − m_X m_Y = 0.75 − 0.75 = 0

4) ρ_XY = C_XY / (σ_X σ_Y) = 0

5) m_Z = m_X − 2m_Y + 1 = 0.75 − 2 + 1 = −0.25

6) C_XZ = R_XZ − m_X m_Z = 3.25 − (0.75)(−0.25) = 3.4375

   Or: C_XZ = COV(X, X − 2Y + 1) = σ_X² − 2C_XY = σ_X² = 4 − (0.75)² = 3.4375

7) ρ_XZ = C_XZ / (σ_X σ_Z), with σ_Z² = σ_X² + 4σ_Y² = 3.4375 + 4(5 − 1) = 19.4375,

   so ρ_XZ = 3.4375 / √(3.4375 × 19.4375) ≈ 0.42

Characteristic Function:

The C.F. of an RV X is defined as:

Φ(ω) = E[e^{jωX}] = ∫_{−∞}^{∞} f(x) e^{jωx} dx

f(x) = (1/2π) ∫_{−∞}^{∞} Φ(ω) e^{−jωx} dω

Note:

1) Φ(0) = ∫_{−∞}^{∞} f(x) dx = 1

2) |Φ(ω)| = |∫ f(x) e^{jωx} dx| ≤ ∫ f(x) |e^{jωx}| dx = ∫ f(x) dx = 1


Moment Theorem:

E[Xⁿ] = (1/jⁿ) dⁿΦ(ω)/dωⁿ |_{ω=0}

Sum of Independent RVs:

Let Z = X + Y with X and Y independent. Then:

Φ_Z(ω) = E[e^{jω(X+Y)}] = E[e^{jωX} e^{jωY}] = E[e^{jωX}] E[e^{jωY}] = Φ_X(ω) Φ_Y(ω)

Recall: F{f1 ∗ f2} = F1(ω) F2(ω), where ∗ stands for the convolution integral. Hence Φ_Z(ω) = Φ_X(ω) Φ_Y(ω) gives:

f_Z(z) = f_X(z) ∗ f_Y(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx = ∫_{−∞}^{∞} f_Y(y) f_X(z − y) dy

By induction, if:

Z = Σ_{i=1}^{n} Xi    (independent RVs)

f_Z = f_{X1} ∗ f_{X2} ∗ … ∗ f_{Xn}

Φ_Z(ω) = ∏_{i=1}^{n} Φ_{Xi}(ω)

E[Z] = Σ_{i=1}^{n} E[Xi]

σ_Z² = Σ_{i=1}^{n} σ_{Xi}²

Example:

Z = X + Y, where X and Y are independent and uniformly distributed:

f_X(x) = 1/a,  0 < x < a;    f_Y(y) = 1/a,  0 < y < a

f_Z(z) = f_X(z) ∗ f_Y(z) = ∫_{0}^{a} f_X(x) f_Y(z − x) dx

       = { z/a²,          0 ≤ z ≤ a
         { (2a − z)/a²,   a ≤ z ≤ 2a
         { 0,             otherwise

a triangular density with peak 1/a at z = a.

Example:

Find the characteristic function of an exponentially distributed RV X with parameter λ, given by:

f(x) = λ e^{−λx},   x ≥ 0

Answer:

Φ(ω) = ∫_{0}^{∞} λ e^{−λx} e^{jωx} dx = λ ∫_{0}^{∞} e^{−(λ−jω)x} dx = λ/(λ − jω)

E[X] = (1/j) dΦ(ω)/dω |_{ω=0} = (1/j) [jλ/(λ − jω)²] |_{ω=0} = 1/λ

Random Vector:

A random vector is a vector whose components are random variables:

X = [X1  X2  …  Xn]ᵀ

F(x) = F_{X1,X2,…,Xn}(x1, x2, …, xn) = P(X1 ≤ x1, …, Xn ≤ xn)

f(x) = f_{X1,X2,…,Xn}(x1, x2, …, xn) = ∂ⁿF(x1, x2, …, xn) / ∂x1 ∂x2 … ∂xn


General Transformation:

Consider n functions of the random vector X = [X1 … Xn]ᵀ, forming a random vector Y with components Yi = gi(X1, …, Xn):

Y = [Y1  Y2  …  Yn]ᵀ = [g1(X)  g2(X)  …  gn(X)]ᵀ

Solve this system of equations for x1, …, xn; for a single solution x = g⁻¹(y):

f_Y(y1, y2, …, yn) = f_X(x1, x2, …, xn) / |J(x1, x2, …, xn)|

where J is the Jacobian determinant:

J(x1, x2, …, xn) = det [ ∂g1/∂x1 … ∂g1/∂xn
                           ⋮           ⋮
                         ∂gn/∂x1 … ∂gn/∂xn ]

Example:

Given:

Y1 = g1(X1, X2) = X1 + X2

Y2 = g2(X1, X2) = X1/(X1 + X2)

and

f_X(x1, x2) = e^{−(x1+x2)},   x1 ≥ 0, x2 ≥ 0.

Find f_Y(y1, y2).

f_Y(y1, y2) = f_X(x1, x2) / |J(x1, x2)|

Inverse transformation:

x1 = g1⁻¹(y1, y2) = y1 y2

x2 = g2⁻¹(y1, y2) = y1(1 − y2) = y1 − y1 y2

Jacobian:

J(x1, x2) = det [ ∂y1/∂x1   ∂y1/∂x2 ] = det [ 1                1              ]
                [ ∂y2/∂x1   ∂y2/∂x2 ]       [ x2/(x1+x2)²   −x1/(x1+x2)² ]

|J(x1, x2)| = |−x1/(x1+x2)² − x2/(x1+x2)²| = 1/(x1 + x2) = 1/y1

f_Y(y1, y2) = f_X(y1 y2, y1 − y1 y2) · y1 = y1 e^{−y1},   y1 ≥ 0,  0 ≤ y2 ≤ 1

Limits:

x1 = y1 y2 ≥ 0  and  x2 = y1(1 − y2) ≥ 0   ⇒   y1 ≥ 0 and 0 ≤ y2 ≤ 1.
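A consequence of the result f_Y(y1, y2) = y1 e^{−y1} is that it does not depend on y2, so Y2 = X1/(X1 + X2) is uniform on (0, 1). This can be checked by simulation (a sketch; the exponential inputs match the example's f_X):

```python
import random

random.seed(3)

# If X1, X2 are independent Exp(1), then Y2 = X1/(X1 + X2)
# should be uniform on (0, 1).
n = 200_000
y2 = []
for _ in range(n):
    x1, x2 = random.expovariate(1.0), random.expovariate(1.0)
    y2.append(x1 / (x1 + x2))

mean_y2 = sum(y2) / n
below_half = sum(1 for v in y2 if v <= 0.5) / n

print(abs(mean_y2 - 0.5) < 0.01)     # uniform mean is 1/2
print(abs(below_half - 0.5) < 0.01)  # uniform CDF at 1/2
```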

Correlation and Covariance Matrices:

The correlation and covariance matrices for a random vector X = [X1 X2 … Xn]ᵀ are defined as:

R_X = R = E[X Xᵀ]

    = E [ X1X1  X1X2 … X1Xn
          X2X1  X2X2 … X2Xn
            ⋮
          XnX1  XnX2 … XnXn ]

    = [ R11  R12 … R1n
        R21  R22 … R2n
          ⋮
        Rn1  Rn2 … Rnn ]

where Rij = E[Xi Xj] and Rii = E[Xi²] ≥ 0.

C_X = C = E[(X − m)(X − m)ᵀ] = R − m mᵀ


C = [ C11  C12 … C1n
        ⋮
      Cn1  …    Cnn ]

where Cij = E[(Xi − mi)(Xj − mj)] = Rij − mi mj and Cii = σi² = E[(Xi − mi)²].

Note:

R and C are symmetric matrices.

Jointly Gaussian RVs:

RVs X1, X2, …, Xn have a jointly Gaussian distribution N(m, C), where m is the n×1 mean vector and C is the n×n covariance matrix, with the following joint density function:

f(x1, x2, …, xn) = (1 / ((2π)^{n/2} |C|^{1/2})) exp{−(1/2)(x − m)ᵀ C⁻¹ (x − m)}

For n = 2:

C = [ σ1²        ρ σ1 σ2
      ρ σ1 σ2    σ2²     ]

C⁻¹ = (1/(σ1² σ2² (1 − ρ²))) [ σ2²        −ρ σ1 σ2
                               −ρ σ1 σ2    σ1²     ]


f(x1, x2) = (1/(2π σ1 σ2 √(1 − ρ²))) exp{ −(1/(2(1 − ρ²))) [ (x1 − m1)²/σ1² − 2ρ(x1 − m1)(x2 − m2)/(σ1 σ2) + (x2 − m2)²/σ2² ] }

Note:

If ρ = 0, then X1 and X2 are independent random variables:

f(x1, x2) = (1/(2π σ1 σ2)) exp{ −(1/2)[(x1 − m1)²/σ1² + (x2 − m2)²/σ2²] } = f(x1) f(x2)

Linear Transformation of a Random Vector:

Y1 = a11 X1 + … + a1n Xn

Y2 = a21 X1 + … + a2n Xn

⋮

Yn = an1 X1 + … + ann Xn

or Y = A X, with A = [aij].

Then:

m_Y = A m_X    and    C_Y = A C_X Aᵀ

Example:

Y1 = 5X1 + 2X2 + 2

Y2 = X1 − X3 + 1

Y3 = 2X1 − X2 + 2X3

(The added constants shift the mean only; they do not affect the covariance.)

C_X = [ 4  2  1
        2  4  2
        1  2  4 ]

A = [ 5   2   0
      1   0  −1
      2  −1   2 ]

C_Y = A C_X Aᵀ = [ 156  15  48
                    15   6   0
                    48   0  28 ]
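The matrix result C_Y = A C_X Aᵀ can be checked with plain-Python matrix products (a sketch; the helper names are mine):

```python
# Verify C_Y = A C_X A^T for the example above.
def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))]
            for i in range(len(P))]

def transpose(P):
    return [list(row) for row in zip(*P)]

A = [[5, 2, 0], [1, 0, -1], [2, -1, 2]]
C_X = [[4, 2, 1], [2, 4, 2], [1, 2, 4]]

C_Y = matmul(matmul(A, C_X), transpose(A))
print(C_Y)  # [[156, 15, 48], [15, 6, 0], [48, 0, 28]]
```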


Note:

σ_{Y1}² = 156,   σ_{Y2}² = 6,   σ_{Y3}² = 28,   C_{Y1Y2} = 15

ρ12 = C_{Y1Y2} / (σ_{Y1} σ_{Y2}) = 15 / √(156 × 6) ≈ 0.49

Central Limit Theorem (CLT):

The CLT states that the distribution function of the sum of n independent RVs,

Z = X1 + X2 + … + Xn,

converges to the Gaussian distribution with:

m_Z = Σ_{i=1}^{n} mi    and    σ_Z² = Σ_{i=1}^{n} σi²

So Z ≈ N(m_Z, σ_Z).

A binomial RV can also be approximated with the Gaussian for large n:

B(n, p) → N(np, √(npq))

Example:

Z = Σ_{i=1}^{100} Xi, where each Xi is uniformly distributed in the interval [0, 1]. So:

E[Xi] = (1 + 0)/2 = 1/2

σ_{Xi}² = (1 − 0)²/12 = 1/12

m_Z = Σ_{i=1}^{100} E[Xi] = 100 × 1/2 = 50

σ_Z² = Σ_{i=1}^{100} σ_{Xi}² = 100/12 = 25/3

Z ≈ N(50, √(25/3))
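The mean and variance claimed for Z can be checked by simulation (a sketch; the trial count is my own choice):

```python
import random

random.seed(4)

# Z = sum of 100 Uniform(0, 1) RVs: mean 50, variance 100/12 = 25/3.
trials = 20_000
samples = [sum(random.random() for _ in range(100)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((z - mean) ** 2 for z in samples) / trials

print(abs(mean - 50) < 0.1)      # m_Z = 50
print(abs(var - 25 / 3) < 0.5)   # sigma_Z^2 = 25/3 ≈ 8.33
```

A histogram of these samples would show the bell shape the CLT predicts.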

Example:

Consider a fair coin tossed 10⁴ times. Find the probability that heads appears between 4900 and 5000 times.

X ≈ N(np, √(npq)) = N(10⁴ × 1/2, √(10⁴ × (1/2)(1/2))) = N(5000, 50)

P(4900 ≤ X ≤ 5000) ≈ Φ((5000 − 5000)/50) − Φ((4900 − 5000)/50)

                   = Φ(0) − Φ(−2) ≈ 0.5 − 0.0228 = 0.477
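The normal approximation for this coin example evaluates directly with the error function (a sketch; the function name `phi` is my own):

```python
import math

# X ~ Binomial(10**4, 0.5), approximated by a Gaussian
# with m = 5000 and sigma = 50.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

m, sigma = 5000.0, 50.0
p = phi((5000 - m) / sigma) - phi((4900 - m) / sigma)  # Phi(0) - Phi(-2)

print(round(p, 4))  # 0.4772
```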
