
Title : Probability Theory and Random Processes

Course Code : 07B41MA106 (3-1-0), 4 Credits

Pre-requisite : Nil

Objectives :

To study
● Probability: its applications in studying the outcomes of random experiments
● Random variables: types, characteristics, modeling random data
● Stochastic systems: their reliability
● Random processes: types, properties and characteristics, with special reference to signal processing and trunking theory

Outcomes: on completing the course, students should be able to
(i) model real-life random processes using appropriate statistical distributions;
(ii) compute the reliability of different stochastic systems;
(iii) apply the knowledge of random processes in signal processing and trunking theory.

Evaluation Scheme (weightage in percent)

Teacher Assessment (based on assignments, quizzes, attendance, etc.): 25
T1 (1 hour)

Reference Material :

1. T. Veerarajan. Probability, Statistics and Random Processes. Tata McGraw-Hill.
2. J. I. Aunon & V. Chandrasekhar. Introduction to Probability and Random Processes. McGraw-Hill International Ed.
3. A. Papoulis & S. U. Pillai. Probability, Random Variables and Stochastic Processes. Tata McGraw-Hill.
4. Stark, H. and Woods, J. M. Probability and Random Processes with Applications to Signal Processing.

Origins of Probability

Probability theory originally came from gambling!

Why are Probabilities Important?

• They help you to make good decisions, e.g., decision theory.
• They help you to minimize risk, e.g., insurance.
• They are used in average-case time complexity analyses of computer algorithms.
• They are used to model processes in engineering.

Random Experiments

• An experiment whose outcome or result can be predicted with certainty is called a deterministic experiment.
• Although all possible outcomes of an experiment may be known in advance, the outcome of a particular performance of the experiment cannot be predicted owing to a number of unknown causes. Such an experiment is called a random experiment.
• A random experiment is an experiment that can be repeated over and over, giving possibly different results.
• E.g., the throw of a fair 6-faced cubic die, or the number of telephone calls received at a switchboard in a 5-min. interval.

Probability theory is a study of random or unpredictable experiments and is helpful in investigating the important features of these random experiments.

Probability Definitions

• For discrete math, we focus on the discrete version of probabilities.
• For each random experiment, there is assumed to be a finite set of discrete possible results, called outcomes. Each time the experiment is run, one outcome occurs. The set of all possible outcomes is called the sample space.

Example. If we flip two coins, then the sample space is: S = {HH, HT, TH, TT}

Example. If we roll two dice, then the sample space is:
S = {(i, j) | i, j = 1, 2, 3, 4, 5, 6}

More Probability Definitions

• A subset (say E) of the sample space is called an event. In other words, events are sets of outcomes. (If the outcome of the experiment is contained in E, then we say E has occurred.)
• For each event, we assign a number between 0 and 1, which is the probability that the event occurs.

Example. If we flip two coins, and E is the event that a head appears on the first coin, then E is: E = {HH, HT}

Example. If we roll two dice, and E is the event that the sum of the two dice equals 7, then E is:
E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}

Union: E ∪ F. (E and F are mutually exclusive if E ∩ F = ∅.)

NOTE: It may or may not be true that all outcomes are equally likely. If they are, then we assume their probabilities are the same.

Definition of Probability in Text

The probability of an event E is the sum of the probabilities of the outcomes in E:
p(E) = n(E)/n(S) = (number of outcomes in E) / (exhaustive number of cases in S)

Example

• Roll a fair die. The outcomes are: 1, 2, 3, 4, 5, and 6, presumably each having an equal probability of occurrence (1/6).
• One event is "odd numbers", which consists of outcomes 1, 3, and 5. The probability of this event is 1/6 + 1/6 + 1/6 = 3/6 = 0.5.
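The classical computation can be checked mechanically by enumerating the sample space. A minimal sketch (the outcomes and event are restated from the example above; nothing else is assumed):

```python
from fractions import Fraction

sample_space = [1, 2, 3, 4, 5, 6]            # equally likely outcomes
event = [x for x in sample_space if x % 2]   # the event "odd numbers"

# classical definition: p(E) = n(E) / n(S)
print(Fraction(len(event), len(sample_space)))  # 1/2
```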

Example

• An urn contains 4 green balls and 6 red balls. What is the probability that a ball chosen from the urn will be green?
• There are 10 possible outcomes, each assumed to be equally likely. Of these, 4 yield a green ball. So the probability is 4/10 = 0.4.

Example

• What is the probability that a person wins the lottery by picking the correct 6 lucky numbers out of 40? It is assumed that every number has the same probability of being picked (equally likely).
• The number of ways we can choose 6 numbers out of 40 is:
C(40,6) = 40! / (34! 6!) = 3,838,380.
Therefore, the probability is 1/3,838,380.
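For counting problems like this, Python's standard library already provides the binomial coefficient; a quick check of the figure above (nothing assumed beyond C(40, 6)):

```python
import math

n_ways = math.comb(40, 6)   # C(40, 6)
print(n_ways)               # 3838380
print(1 / n_ways)           # winning probability, about 2.6e-07
```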

Examples

• Consider an experiment in which a coin is tossed twice.
• Sample space: {HH, HT, TH, TT}
• Let E be the event that at least one head shows up on the two tosses. Then E = {HH, HT, TH}.
• Let F be the event that heads occurs on the first toss. Then F = {HH, HT}.
• A natural assumption is that all four possible outcomes in the sample space are equally likely, i.e., each has probability ¼.
Then P(E) = ¾ and P(F) = ½.

Probability as a Frequency

Let a random experiment be repeated n times and let an event A occur nA times out of the n trials. The ratio nA/n is called the relative frequency of the event A. As n increases, nA/n shows a tendency to stabilise and to approach a constant value. This value, denoted by P(A), is called the probability of the event A, i.e.,
P(A) = lim_{n→∞} nA/n.

Frequency Definition of Probability

• Consider probability as a measure of the frequency of occurrence.
– For example, the probability of "heads" in a coin flip is essentially equal to the number of heads observed in T trials, divided by T, as T approaches infinity:
Pr(heads) ≈ lim_{T→∞} (number of heads)/T
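The limit can be illustrated (not proved) by simulation; this sketch assumes a fair coin modeled as a uniform random draw:

```python
import random

random.seed(0)
T = 100_000
heads = sum(random.random() < 0.5 for _ in range(T))
print(heads / T)  # relative frequency; approaches 0.5 as T grows
```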

Probability as a Frequency

• Consider a random experiment with possible outcomes w1, w2, …, wn. For example, we roll a die and the possible outcomes are 1, 2, 3, 4, 5, 6 corresponding to the side that turns up. Or we toss a coin with possible outcomes H (heads) or T (tails).
• We assign a probability p(wj) to each possible outcome wj in such a way that:
p(w1) + p(w2) + … + p(wn) = 1
• For the die, each outcome has probability 1/6. For the coin, each outcome has probability ½.

Example

To find the probability that a spare part produced by a machine is defective, it is assumed that the probability of a defective item is 0.05.


Example

• Rolling a die is a random experiment.
• The outcomes are: 1, 2, 3, 4, 5, and 6. Suppose the die is "loaded" so that 3 appears twice as often as every other number. All other numbers are equally likely. Then to figure out the probabilities, we need to solve:
p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 1, p(3) = 2·p(1), and p(1) = p(2) = p(4) = p(5) = p(6).
Solving, we get p(1) = p(2) = p(4) = p(5) = p(6) = 1/7 and p(3) = 2/7.
• One event is "odd numbers", which consists of outcomes 1, 3, and 5. The probability of this event is:
p(odd) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
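The loaded-die assignment can be encoded directly; exact fractions make the normalisation check trivial (only the weights from the example are assumed):

```python
from fractions import Fraction

p = {face: Fraction(1, 7) for face in (1, 2, 4, 5, 6)}
p[3] = Fraction(2, 7)                  # 3 is twice as likely

assert sum(p.values()) == 1            # probabilities sum to 1
print(sum(p[f] for f in (1, 3, 5)))    # p(odd) = 4/7
```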

Theorem 1

The probability of the impossible event is zero, i.e., if Φ is the subset (event) containing no sample point, P(Φ) = 0.

Proof
The certain event S and the impossible event Φ are mutually exclusive.
Hence P(S ∪ Φ) = P(S) + P(Φ).
But S ∪ Φ = S.
∴ P(S) = P(S) + P(Φ)
∴ P(Φ) = 0

Theorem

If A and B are any 2 events,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B)

Proof :
A is the union of the mutually exclusive events AB′ and AB, and B is the union of the mutually exclusive events AB and A′B (A′, B′ denoting complements).
∴ P(A) = P(AB′) + P(AB) and P(B) = P(AB) + P(A′B)
∴ P(A) + P(B) = [P(AB′) + P(AB) + P(A′B)] + P(AB)
= P(A ∪ B) + P(A ∩ B)
Hence proved.

[Venn diagram: S with overlapping events A and B, split into the disjoint regions AB′, AB, A′B]

Conditional Probability

Given that an event F has occurred, what is the probability that event E also occurs? This probability is called conditional probability and denoted as p(E|F).

If p(F) > 0, then
p(E | F) = p(E ∩ F) / p(F)

Example. An urn contains 8 red balls and 4 white balls. We draw 2 balls from the urn without replacement. What is the probability that both balls are red?

Solution:
Let E be the event that both balls drawn are red. Then
p(E) = C(8, 2)/C(12, 2)
Using the conditional probability approach, let E1 and E2 be the events that the first and second balls drawn are red. Then
p(E1E2) = p(E1) p(E2 | E1) = (8/12)(7/11)
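Both routes to the answer can be compared numerically; a sketch assuming only the counts in the example:

```python
import math
from fractions import Fraction

p_count = Fraction(math.comb(8, 2), math.comb(12, 2))  # counting approach
p_chain = Fraction(8, 12) * Fraction(7, 11)            # multiplication rule

assert p_count == p_chain
print(p_count)  # 14/33
```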

Multiplication Rule

p(E1 E2 E3 … En) = p(E1) p(E2 | E1) p(E3 | E1E2) … p(En | E1E2…En−1)

Example. A deck of 52 cards is randomly divided into 4 piles of 13 cards each. Compute the probability that each pile has exactly one ace.

Solution:
Define events Ei = "pile i contains exactly one ace", i = 1, 2, 3, 4. Then
p(E1E2E3E4) = p(E1) p(E2 | E1) p(E3 | E1E2) p(E4 | E1E2E3)
p(E1) = C(4,1) C(48,12) / C(52,13)
p(E2 | E1) = C(3,1) C(36,12) / C(39,13)
p(E3 | E1E2) = C(2,1) C(24,12) / C(26,13)
p(E4 | E1E2E3) = 1
p(E1E2E3E4) ≈ 0.1055
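The product of the three conditional factors (the fourth is 1) gives the stated value; a direct check using only the combinatorial terms above:

```python
import math

p1 = math.comb(4, 1) * math.comb(48, 12) / math.comb(52, 13)
p2 = math.comb(3, 1) * math.comb(36, 12) / math.comb(39, 13)
p3 = math.comb(2, 1) * math.comb(24, 12) / math.comb(26, 13)
print(round(p1 * p2 * p3, 4))  # ≈ 0.1055
```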

Let E and F be events. We can express E as
E = EF ∪ EF^c
Therefore, we have
p(E) = p(EF) + p(EF^c) = p(E|F) p(F) + p(E|F^c) p(F^c)

Independent Events

Events E and F are independent if p(EF) = p(E)p(F); otherwise they are dependent.
If p(F) > 0, then p(EF) = p(E)p(F) if and only if p(E|F) = p(E).
If E and F are independent, so are E and F^c.

Theorem

If the events A and B are independent, then the events A′ and B are also independent.

Proof
The events A ∩ B and A′ ∩ B are mutually exclusive, with union B, such that
P(A ∩ B) + P(A′ ∩ B) = P(B)   (by the addition theorem)
∴ P(A′ ∩ B) = P(B) − P(A ∩ B)
= P(B) − P(A)P(B)   (by the product theorem)
= P(B)[1 − P(A)] = P(A′)P(B)

Three events E, F, and G are independent if

p(EFG) = p(E)p(F)p(G)

p(EF) = p(E)p(F)

p(EG) = p(E)p(G)

p(FG) = p(F)p(G)

Example. Roll two fair dice. Let E denote the event that the sum of the dice is 7. Let F denote the event that the first die is 4 and let G be the event that the second die is 3. Is E independent of F? Is E independent of G? Is E independent of FG?

p(E) = 6/36 = 1/6
p(F) = 1/6
p(G) = 1/6
p(EF) = 1/36 = p(E)p(F), so E and F are independent
p(EG) = 1/36 = p(E)p(G), so E and G are independent
p(FG) = 1/36 = p(F)p(G)
p(EFG) = 1/36 ≠ p(E)p(F)p(G) = 1/216, so E is not independent of FG
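Pairwise versus mutual independence can be verified by brute force over the 36 outcomes; nothing is assumed beyond the event definitions above:

```python
from itertools import product
from fractions import Fraction

space = list(product(range(1, 7), repeat=2))
P = lambda ev: Fraction(sum(map(ev, space)), len(space))

E = lambda s: s[0] + s[1] == 7
F = lambda s: s[0] == 4
G = lambda s: s[1] == 3

print(P(lambda s: E(s) and F(s)) == P(E) * P(F))  # True: pairwise independent
print(P(lambda s: E(s) and F(s) and G(s))
      == P(E) * P(F) * P(G))                      # False: not mutually independent
```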

Bayes' Theorem

Let B1, B2, …, Bk be a partition of the sample space S, and let P(A/Bi) and P(Bi/A) be the conditional probabilities for i = 1 to k. Then
P(Bi/A) = P(A/Bi) P(Bi) / Σ_{j=1}^{k} P(A/Bj) P(Bj),   i = 1, 2, …, k

Proof
Given B1, B2, …, Bk is a partition of the sample space S, i.e.,
(i) Bi ∩ Bj = φ for all i ≠ j,
(ii) ∪_{i=1}^{k} Bi = S,
(iii) P(Bi) > 0 for all i.

[Figure: the sample space S partitioned into B1, B2, B3, B4, with the event A cutting across the partition]

Let A be an event associated with S. Then
(A ∩ B1) ∪ (A ∩ B2) ∪ … ∪ (A ∩ Bk)
= A ∩ (B1 ∪ B2 ∪ … ∪ Bk)   (by the distributive law)
= A ∩ S = A

Also all the events (A ∩ B1), (A ∩ B2), …, (A ∩ Bk) are pairwise mutually exclusive. For,
(A ∩ Bi) ∩ (A ∩ Bj) = A ∩ (Bi ∩ Bj), i ≠ j
= A ∩ φ = φ

Then P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk).
However each term P(A ∩ Bj) may be expressed as P(A/Bj)P(Bj), and hence we obtain the theorem on total probability:
P(A) = P(A/B1)P(B1) + P(A/B2)P(B2) + … + P(A/Bk)P(Bk) = Σ_{j=1}^{k} P(A/Bj)P(Bj)

Now P(Bi ∩ A) = P(Bi) × P(A/Bi) = P(A) × P(Bi/A)
∴ P(Bi/A) = P(Bi) × P(A/Bi) / P(A)
= P(Bi) × P(A/Bi) / Σ_{j=1}^{k} P(Bj) × P(A/Bj)

Example. In answering a question on a multiple-choice test, a student either knows the answer or guesses. Let p be the probability that the student knows the answer and 1−p the probability that the student guesses. Assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of multiple-choice alternatives. What is the (conditional) probability that a student knew the answer to a question, given that his answer is correct?

Solution:
K = the student actually knows the answer
C = the student answers correctly

p(K | C) = p(C | K) p(K) / [p(C | K) p(K) + p(C | ~K) p(~K)]
= p / [p + (1 − p)(1/m)]
= mp / [1 + (m − 1)p]
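The closed form is easy to sanity-check in code; a sketch where p and m are free parameters (the values below are illustrative, not from the slide):

```python
def p_knew_given_correct(p, m):
    # Bayes: P(K|C) = P(C|K)P(K) / [P(C|K)P(K) + P(C|~K)P(~K)]
    return p / (p + (1 - p) / m)

print(p_knew_given_correct(0.5, 4))  # 0.8, matching mp / (1 + (m-1)p)
```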

Example. When coin A is flipped it comes up heads with probability ¼, whereas when coin B is flipped it comes up heads with probability ¾. Suppose that one coin is randomly chosen and is flipped twice. If both flips land heads, what is the probability that coin B was the one chosen?

Solution:
C = coin B is chosen
H = both flips show heads

p(C | H) = p(H | C) p(C) / [p(H | C) p(C) + p(H | ~C) p(~C)]
= (¾ × ¾ × ½) / (¾ × ¾ × ½ + ¼ × ¼ × ½)
= 9/10 = 0.9
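Exact arithmetic reproduces the 9/10 (only the numbers from the example are used):

```python
from fractions import Fraction

half, pA, pB = Fraction(1, 2), Fraction(1, 4), Fraction(3, 4)
num = pB**2 * half          # P(HH | B) P(B)
den = num + pA**2 * half    # total probability of HH
print(num / den)            # 9/10
```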

Example. A laboratory test is 95 percent correct in detecting a certain disease when the disease is actually present. However, the test also yields a "false positive" result for 1 percent of the healthy people tested. If 0.5 percent of the population has the disease, what is the probability a person has the disease given that his test result is positive?


Solution:
D = the person has the disease
E = the test result is positive

p(D | E) = p(E | D) p(D) / [p(E | D) p(D) + p(E | ~D) p(~D)]
= (0.95)(0.005) / [(0.95)(0.005) + (0.01)(0.995)]
= 95/294 ≈ 0.323
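The same computation, packaged as a reusable posterior function (a sketch; the parameter names are mine, the numbers are from the example):

```python
def posterior(prior, sensitivity, false_positive):
    # P(disease | positive test) by Bayes' theorem
    num = sensitivity * prior
    return num / (num + false_positive * (1 - prior))

print(posterior(0.005, 0.95, 0.01))  # ≈ 0.323
```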

Example. A suspect is believed 60 percent guilty. Suppose now a new piece of evidence shows the criminal is left-handed. If 20 percent of the population is left-handed, and it turns out that the suspect is also left-handed, then does this change the guilty probability of the suspect? By how much?

Solution:
G = the suspect is guilty
LH = the suspect is left-handed

p(G | LH) = p(LH | G) p(G) / [p(LH | G) p(G) + p(LH | ~G) p(~G)]
= (1.0)(0.6) / [(1.0)(0.6) + (0.2)(0.4)]
= 60/68 ≈ 0.88

The new evidence raises the guilty probability from 0.60 to about 0.88.

Random Variable

Definition:
A random variable (RV) is a function X : S → R that assigns a real number X(s) to every element s ∈ S, where S is the sample space corresponding to a random experiment E.

If X is a random variable (RV) which can take a finite number or countably infinite number of values, X is called a discrete RV.

E.g. 1. The number shown when a die is thrown
2. The number of alpha particles emitted by a radioactive source
are discrete RVs.

Example

Suppose that we toss two coins and consider the sample space associated with this experiment. Then S = {HH, HT, TH, TT}.
Define the random variable X as follows: X is the number of heads obtained in the two tosses. Hence X(HH) = 2, X(HT) = 1 = X(TH) and X(TT) = 0.
Note that each s ∈ S corresponds to exactly one value X(s), while different values of s may lead to the same value of X.

Probability Function

If X is a discrete RV which can take the values x1, x2, x3, … such that P(X = xi) = pi, then pi is called the probability function or probability mass function or point probability function of X, provided pi (i = 1, 2, 3, …) satisfy the following conditions:
(i) pi ≥ 0, ∀i, and
(ii) Σi pi = 1

Example of a Discrete PDF

• Suppose that 10% of all households have no children, 30% have one child, 40% have two children, and 20% have three children.
• Select a household at random and let X = number of children.
• What is the pmf of X?

• We may list each value:
– P(X = 0) = 0.10
– P(X = 1) = 0.30
– P(X = 2) = 0.40
– P(X = 3) = 0.20

• Or we may present it as a chart:

x        : 0    1    2    3
P(X = x) : 0.10 0.30 0.40 0.20

• Or we may present it as a stick graph, or as a histogram, of P(X = x) against x.

[Figures: stick graph and histogram with heights 0.10, 0.30, 0.40, 0.20 at x = 0, 1, 2, 3]
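A pmf stored as a dictionary makes the two defining conditions and the mean one-liners (only the four probabilities above are assumed):

```python
pmf = {0: 0.10, 1: 0.30, 2: 0.40, 3: 0.20}

assert all(p >= 0 for p in pmf.values())     # condition (i)
assert abs(sum(pmf.values()) - 1.0) < 1e-12  # condition (ii)
print(sum(x * p for x, p in pmf.items()))    # E(X) = 1.7 children
```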

Example

The pmf of a random variable X is given by p(i) = c λ^i / i!, i = 0, 1, 2, …, where λ is some positive value. Find (i) P(X = 0), (ii) P(X > 2).

Solution.
Since Σ_{i=0}^{∞} p(i) = 1, we have c Σ_{i=0}^{∞} λ^i/i! = 1.
As e^λ = Σ_{i=0}^{∞} λ^i/i!, we have c e^λ = 1, i.e., c = e^{−λ}.
Hence P(X = 0) = e^{−λ} λ^0 / 0! = e^{−λ}
P(X > 2) = 1 − P(X ≤ 2)
= 1 − [e^{−λ} + λ e^{−λ} + (λ^2/2) e^{−λ}]

In general, P(X = x) = e^{−λ} λ^x / x!
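Numerically, for any chosen λ (λ = 2 here is purely illustrative):

```python
import math

lam = 2.0
p = lambda i: math.exp(-lam) * lam**i / math.factorial(i)

print(p(0))                      # P(X = 0) = e^{-λ}
print(1 - (p(0) + p(1) + p(2)))  # P(X > 2)
```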

If X represents the total number of heads obtained when a fair coin is tossed 5 times, find the probability distribution of X.

X : 0     1     2     3     4     5
P : 1/32  5/32  10/32 10/32 5/32  1/32

Continuous Random Variable

If X is an RV which can take all values (i.e., an infinite number of values) in an interval, then X is called a continuous RV. For example,
X(x) = 0 for x < 0; x for 0 ≤ x < 1; 1 for x ≥ 1.

Probability Density Function

If X is a continuous RV, then f is said to be the probability density function (pdf) of X if it satisfies the following conditions:
(i) f(x) ≥ 0, ∀x ∈ Rx, and
(ii) ∫_{−∞}^{∞} f(x) dx = 1

P(a ≤ X ≤ b) = ∫_a^b f(x) dx

When X is a continuous RV,
P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0,
i.e., the probability is zero that a continuous RV assumes a specific value.

Probability Density Function

[Figure: the exponential pdf fX(x) = (1/α) e^{−x/α} for x ≥ 0, and 0 for x < 0]

Example

Check whether the function f(x) = 4x^3, 0 ≤ x ≤ 1, is a pdf. Clearly f(x) ≥ 0; also ∫_0^1 f(x) dx = ∫_0^1 4x^3 dx = 1, so it is a valid pdf.

A random variable X has the density function
f(x) = 1/4, −2 < x < 2; 0, elsewhere.
Obtain (i) P(X < 1), (ii) P(|X| > 1), (iii) P(2X + 3 > 5).
(Ans. 3/4, 1/2, 1/4)

Find the formula for the probability distribution of the number of heads when a fair coin is tossed 4 times.
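For the f(x) = 1/4 density, each probability is just an interval length divided by 4; a small helper confirms the three answers above:

```python
def P(a, b):
    # P(a < X < b) for the uniform density 1/4 on (-2, 2)
    lo, hi = max(a, -2.0), min(b, 2.0)
    return max(0.0, hi - lo) / 4.0

print(P(-2, 1))             # (i)   P(X < 1)      = 0.75
print(P(-2, -1) + P(1, 2))  # (ii)  P(|X| > 1)    = 0.5
print(P(1, 2))              # (iii) P(2X + 3 > 5) = 0.25
```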

Cumulative Distribution Function (cdf)

If X is an RV, discrete or continuous, then P(X ≤ x) is called the cumulative distribution function of X (or distribution function of X) and denoted as F(x).

If X is discrete, F(x) = Σ_{j: xj ≤ x} pj
If X is continuous, F(x) = P(−∞ < X ≤ x) = ∫_{−∞}^{x} f(t) dt

[Figure: the exponential pdf fX(x) = (1/α) e^{−x/α}, x ≥ 0 (0 for x < 0), together with its cumulative distribution function FX(x) = 1 − e^{−x/α}, x ≥ 0 (0 for x < 0)]

[Figure: the uniform pdf fX(x) = 1/u for 0 ≤ x ≤ u (0 otherwise), and its cdf FX(x) = 0 for x < 0, x/u for 0 ≤ x ≤ u, 1 for u < x]

If the probability density of a random variable is given by
f(x) = K(1 − x^2), 0 < x < 1; 0, otherwise,
find (i) K, (ii) the cumulative distribution function of the random variable.
(Ans. K = 3/2; F(x) = (3/2)[x − x^3/3] for 0 < x < 1, with F(x) = 0 for x ≤ 0 and 1 for x ≥ 1)

If f(x) = K · 1/(1 + x^2), −∞ < x < ∞, find K and F(x).
(Ans. K = 1/π; F(x) = 1/2 + (1/π) tan^{−1} x)

Properties of the cdf F(x)

1. F(x) is a non-decreasing function of x, i.e., if x1 < x2, then F(x1) ≤ F(x2).
2. F(−∞) = 0 and F(∞) = 1.
3. If X is a discrete RV taking values x1 < x2 < x3 < … < xi−1 < xi < …, then P(X = xi) = F(xi) − F(xi−1).
4. If X is a continuous RV, then (d/dx) F(x) = f(x) at all points where F(x) is differentiable.

Definitions

If X is a discrete RV, then the expected value or the mean value of g(X) is defined as
E{g(X)} = Σi g(xi) pi, where pi = P(X = xi)

If X is a continuous RV with pdf f(x), then
E{g(X)} = ∫_{Rx} g(x) f(x) dx

Two expected values which are most commonly used for characterising an RV X are its mean µx and variance σx^2.

µx = E(X) = Σi xi pi, if X is discrete
= ∫_{Rx} x f(x) dx, if X is continuous

Var(X) = σx^2 = E{(X − µx)^2}
= Σi (xi − µx)^2 pi, if X is discrete
= ∫_{Rx} (x − µx)^2 f(x) dx, if X is continuous

The square root of the variance is called the standard deviation.

Example

Find the expected value of the number on a die when thrown. (Ans. 7/2)

Var(X) = E{(X − µx)^2} = E{X^2 − 2µx X + µx^2}
= E(X^2) − 2µx E(X) + µx^2   (since µx is a constant)
= E(X^2) − µx^2   (since µx = E(X))
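Checking 7/2 and the variance identity on the die (exact fractions, no assumptions beyond a fair die):

```python
from fractions import Fraction

p = Fraction(1, 6)
mean = sum(x * p for x in range(1, 7))
second = sum(x * x * p for x in range(1, 7))
print(mean, second - mean**2)  # 7/2 and 35/12
```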

Example

A random variable X has the following probability distribution:
x    : −2   −1  0    1   2    3
p(x) : 0.1  K   0.2  2K  0.3  3K
(a) Find K, (b) evaluate P(X < 2) and P(−2 < X < 2), (c) find the cdf of X, and (d) evaluate the mean of X.

If the probability distribution of X is given by P(X = j) = 1/2^j (j = 1, 2, 3, …), find the mean and variance of the distribution. Find also P(X is even), P(X ≥ 5) and P(X is divisible by 3).

If p(x) = x e^{−x^2/2} for x ≥ 0, and 0 for x < 0,
(a) show that p(x) is a pdf (of a continuous RV X),
(b) find its distribution function F(x).
(Ans. F(x) = 1 − e^{−x^2/2}, x ≥ 0)

Example 2
A continuous random variable has the probability density function defined by
f(x) = k x e^{−λx}, x ≥ 0, λ > 0; 0, otherwise.
Determine the constant k and find the mean and variance.
(Ans. k = λ^2; mean = 2/λ; E(X^2) = 6/λ^2, so variance = 2/λ^2)

Moments

If X is a discrete or continuous RV, E(X^r) is called the r-th order raw moment of X about the origin and denoted by µ′r.

µ′r = E(X^r) = Σx x^r f(x), if X is discrete
= ∫_{−∞}^{∞} x^r f(x) dx, if X is continuous

Since the first and second moments about the origin are given by µ′1 = E(X) and µ′2 = E(X^2),
mean = first moment about the origin, and
Var(X) = µ′2 − (E(X))^2 = µ′2 − (µ′1)^2.

E{(X − µx)^n} is called the n-th order central moment of X and denoted by µn.
E{|X|^n} and E{|X − µx|^n} are called absolute moments of X.
E{(X − a)^n} and E{|X − a|^n} are called generalised moments of X.
µ2 = E{(X − E(X))^2} = Var(X)

Two-Dimensional Random Variables

Definitions: Let S be the sample space associated with a random experiment E. Let X = X(s) and Y = Y(s) be two functions each assigning a real number to each outcome s ∈ S. Then (X, Y) is called a two-dimensional random variable.

Probability Function of (X, Y)

If (X, Y) is a two-dimensional discrete RV such that P(X = xi, Y = yj) = pij, then pij is called the probability mass function of (X, Y), provided
(i) pij ≥ 0, ∀i and j, and
(ii) Σj Σi pij = 1.
The set of values {(xi, yj), pij} is called the joint probability distribution of (X, Y).

• If (X, Y) is a two-dimensional continuous RV, the joint probability density function (pdf) f is a function satisfying the following conditions:
f(x, y) ≥ 0, −∞ < x < ∞, −∞ < y < ∞
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
P(x1 ≤ X ≤ x2, y1 ≤ Y ≤ y2) = ∫_{x1}^{x2} ∫_{y1}^{y2} f(x, y) dy dx

Cumulative Distribution Function

F(x, y) = P{X ≤ x and Y ≤ y} is called the cdf of (X, Y).
F(x, y) = Σ_{j: yj ≤ y} Σ_{i: xi ≤ x} pij   (discrete case)
= ∫_{−∞}^{x} ∫_{−∞}^{y} f(u, v) dv du   (continuous case)

• Properties of the joint cdf
0 ≤ F(x, y) ≤ 1, −∞ < x < ∞, −∞ < y < ∞
F(−∞, y) = F(x, −∞) = F(−∞, −∞) = 0
F(∞, ∞) = 1
F(x, y) is a nondecreasing function as either x or y, or both, increase.
F(∞, y) = FY(y), F(x, ∞) = FX(x)
F(x, y) = Pr(X ≤ x, Y ≤ y)
f(x, y) = ∂^2 F(x, y) / ∂x∂y

Examples

Toss two coins. Let
X = random variable associated with the first coin,
Y = random variable associated with the second coin,
with sample space {(0,0), (0,1), (1,0), (1,1)} (0 for tail, 1 for head).

F(0,0) = 1/4
F(0,1) = 2/4
F(1,0) = 2/4
F(1,1) = 1

[Figure: the staircase surface of F(x, y), with the labelled levels 1/4 and 1/2]

Example

Three balls are drawn at random without replacement from a box containing 2 white, 3 red and 4 black balls. If X denotes the number of white balls drawn and Y denotes the number of red balls drawn, find the joint probability distribution of (X, Y).

Solution

As there are only 2 white balls in the box, X can take the values 0, 1, 2 and Y can take the values 0, 1, 2, 3.
P(X = 0, Y = 0) = P(drawing 3 balls, none of which is white or red) = P(all the 3 balls drawn are black)
= 4C3 / 9C3 = 1/21
P(X = 0, Y = 1) = 3/14, P(X = 0, Y = 2) = 1/7, …

The row of the table for X = 2:
Y     : 0     1     2  3
X = 2 : 1/21  1/28  0  0
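The full joint table is a hypergeometric-style count; this sketch enumerates every (x, y) cell using only the ball counts in the problem:

```python
from math import comb
from fractions import Fraction

total = comb(9, 3)              # 3 balls from 2 white + 3 red + 4 black
for x in range(3):              # white balls drawn
    for y in range(4 - x):      # red balls drawn (x + y <= 3)
        b = 3 - x - y           # black balls drawn
        p = Fraction(comb(2, x) * comb(3, y) * comb(4, b), total)
        print(f"P(X={x}, Y={y}) = {p}")
```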

For the bivariate probability distribution of (X, Y) given below, find P(X ≤ 1), P(X ≤ 1, Y ≤ 3), P(X ≤ 1 / Y ≤ 3), P(Y ≤ 3 / X ≤ 1) and P(X + Y ≤ 4).

Y     : 1  2  3     4     5     6
X = 0 : 0  0  1/32  2/32  2/32  3/32

(Ans. 7/8, 9/32, 18/32, 9/28, 13/32)

Example

The joint density function of (X, Y) is given by
f(x, y) = 2 e^{−x} e^{−2y}, 0 < x < ∞, 0 < y < ∞; 0, otherwise.
Compute (i) P(X > 1, Y < 1), (ii) P(X < Y), (iii) P(X < a).
(Ans. e^{−1}(1 − e^{−2}), 1/3, 1 − e^{−a})

Marginal probability density function

For every fixed j,
p(xj, y1) + p(xj, y2) + p(xj, y3) + … = P{X = xj} = f(xj),
and for every fixed k,
p(x1, yk) + p(x2, yk) + p(x3, yk) + … = P{Y = yk} = f(yk).
These are called marginal probability density functions.

In the continuous case,
fX(x) = ∫_{−∞}^{∞} f(x, y) dy,   fY(y) = ∫_{−∞}^{∞} f(x, y) dx

As an illustrative example, consider a joint pdf of the form
f(x, y) = (6/5)(1 − x^2 y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 elsewhere.
Integrating wrt y alone and wrt x alone gives the two marginal pdfs
fX(x) = (6/5)(1 − x^2/2), 0 ≤ x ≤ 1
fY(y) = (6/5)(1 − y/3), 0 ≤ y ≤ 1

Independent Random Variables

Let (X, Y) have joint probability density function f(x, y) and marginal probability functions fX(x) and fY(y). X and Y are independent if
F(x, y) = FX(x) FY(y),
or equivalently f(x, y) = fX(x) fY(y).

Example

A machine is used for a particular job in the forenoon and for a different job in the afternoon. The joint probability distribution of (X, Y), where X and Y represent the number of times the machine breaks down in the forenoon and in the afternoon respectively, is given in the following table. Examine if X and Y are independent RVs.

Y     : 0    1     2
X = 0 : 0.1  0.04  0.06
X = 1 : 0.2  0.08  0.12
X = 2 : 0.2  0.08  0.12

P0* = f(0) = 0.1 + 0.04 + 0.06 = 0.2; P1* = 0.4; P2* = 0.4 (row marginals of Y)
P*0 = 0.5, P*1 = 0.2, P*2 = 0.3 (column marginals of X)
P*0 × P0* = 0.5 × 0.2 = 0.1 = P00
P*0 × P1* = 0.2 × 0.2 = 0.04 = P01, and likewise for every cell.
Hence the RVs X and Y are independent.
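The cell-by-cell marginal check can be automated; the dictionary below is just the table restated:

```python
table = {
    (0, 0): 0.10, (0, 1): 0.04, (0, 2): 0.06,
    (1, 0): 0.20, (1, 1): 0.08, (1, 2): 0.12,
    (2, 0): 0.20, (2, 1): 0.08, (2, 2): 0.12,
}
row = {x: sum(p for (i, _), p in table.items() if i == x) for x in range(3)}
col = {y: sum(p for (_, j), p in table.items() if j == y) for y in range(3)}

print(all(abs(table[x, y] - row[x] * col[y]) < 1e-12
          for x in range(3) for y in range(3)))  # True: independent
```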

The cumulative distribution function of the continuous random variable (X, Y) is given by
F(x, y) = 1 − e^{−x} − e^{−y} + e^{−(x+y)}, x, y > 0; 0, otherwise.
Prove that X and Y are independent.

f(x, y) = ∂^2 F(x, y)/∂x∂y = e^{−(x+y)}, x, y ≥ 0; 0, otherwise
f1(x) = e^{−x}, x ≥ 0; 0 otherwise.   f2(y) = e^{−y}, y ≥ 0; 0 otherwise.
F1(x) = 1 − e^{−x}, x ≥ 0.   F2(y) = 1 − e^{−y}, y ≥ 0.
F1(x) F2(y) = (1 − e^{−x})(1 − e^{−y}) = F(x, y), so X and Y are independent.

Example

Let X and Y be the lifetimes of two electronic devices. Suppose that their joint pdf is given by f(x, y) = e^{−(x+y)}, x ≥ 0, y ≥ 0. Then X and Y are independent.

Example

Suppose that f(x, y) = 8xy, 0 ≤ x ≤ y ≤ 1. Check the independence of X and Y.

Expectation of a Product of Random Variables

If X and Y are independent random variables, then the expectation of their product exists and is
E(XY) = E(X) E(Y)

Example

The lifetimes X and Y of the two parts of a lightbulb are modeled as continuous random variables. Let the joint pdf be given by f(x, y) = λ1 λ2 e^{−(λ1 x + λ2 y)}, 0 < x < ∞, 0 < y < ∞.

If a segment of length a is divided so that the first part is of length X, find E(X), Var(X) and E{X(a − X)}.

Expectation of a Sum of Random Variables

If X and Y are random variables, then the expectation of their sum exists and is
E(X) + E(Y) = Σ_{j,k} xj p(xj, yk) + Σ_{j,k} yk p(xj, yk)
= Σ_{j,k} (xj + yk) p(xj, yk)
= E(X + Y)

Example

What is the mathematical expectation of the sum of points on n dice?
Ans. (7/2)n

A box contains 2^n tickets among which nCr tickets bear the number r (r = 0, 1, 2, …, n). A group of m tickets is drawn. Let S denote the sum of their numbers. Find E(S) and Var(S).
Ans. E(S) = (n/2)m

For independent X and Y:
E(XY) = Σ_{j,k} xj yk p(xj, yk)
= Σ_{j,k} xj yk f(xj) g(yk)
= Σ_j xj f(xj) · Σ_k yk g(yk)
= E(X) E(Y)

Example

If the joint pdf of (X, Y) is given by f(x, y) = 24y(1 − x), 0 ≤ y ≤ x ≤ 1, find E(XY).

E(XY) = ∫_0^1 ∫_y^1 xy f(x, y) dx dy

[Figure: the triangular region 0 ≤ y ≤ x ≤ 1, bounded by the line x = y]

Ans. 4/15

Binomial Distribution (re-visit)

Suppose n independent Bernoulli trials, each of which results in a success with probability p and in a failure with probability 1−p, are performed. If Sn represents the number of successes that occur in the n Bernoulli trials, then Sn is said to be a binomial random variable with parameters n and p.

Let Xk be the number of successes scored at the k-th trial. Since Xk assumes only the values 0 and 1 with corresponding probabilities q and p, we have
E(Xk) = 0 · q + 1 · p = p
Since
Sn = X1 + X2 + … + Xn
we have
E(Sn) = E(X1) + E(X2) + … + E(Xn) = np

Conditional Probability

F(x | M) = Pr[X ≤ x | M] = Pr{X ≤ x, M} / Pr(M),   Pr(M) > 0

Often the conditioning event M depends on some other random variable Y.

Conditional Probability Density Function

f(y | x) = f(x, y) / fX(x)   or   f(x | y) = f(x, y) / fY(y)

- the continuous version of Bayes' theorem:
f(y | x) = f(x | y) fY(y) / fX(x)

- another expression of the marginal pdf:
fX(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_{−∞}^{∞} f(x | y) fY(y) dy
fY(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_{−∞}^{∞} f(y | x) fX(x) dx


Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(0,0) = 0.4, p(0,1) = 0.2, p(1,0) = 0.1, p(1,1) = 0.3. Calculate the conditional probability mass function of X given that Y = 1.
Ans. 2/5, 3/5

Example

Suppose that 15 percent of the families in a certain community have no children, 20% have 1, 35% have 2, and 30% have 3 children; suppose further that each child is equally likely (and independently) to be a boy or a girl. If a family is chosen at random from this community, then B, the number of boys, and G, the number of girls, in this family will have the joint probability mass function shown below.

j (girls) : 0      1      2      3
i = 0     : .15    .10    .0875  .0375
i = 1     : .10    .175   .1125  0
i = 2     : .0875  .1125  0      0
i = 3     : .0375  0      0      0

For instance,
P(B = 1, G = 1) = P(BG or GB) = P(BG) + P(GB) = .35 × (1/2) × (1/2) × 2 = .175
P(B = 2, G = 1) = P(BBG or BGB or GBB) = P(BBG) + P(BGB) + P(GBB)
= .30 × (1/2) × (1/2) × (1/2) × 3 = .9/8 = .1125

If the family chosen has one girl, compute the conditional probability mass function of the number of boys in the family.
Ans. 8/31, 14/31, 9/31, 0
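The conditional pmf asked for is each G = 1 column entry divided by P(G = 1); a check against 8/31, 14/31, 9/31, 0 using only the table values:

```python
column_G1 = {0: .10, 1: .175, 2: .1125, 3: 0.0}  # joint P(B = b, G = 1)
pG1 = sum(column_G1.values())                    # P(G = 1) = 0.3875 = 31/80

for b in range(4):
    print(b, column_G1[b] / pG1)                 # 8/31, 14/31, 9/31, 0
```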

Example

The joint density of X and Y is given by
f(x, y) = (12/5) x(2 − x − y), 0 < x < 1, 0 < y < 1; 0, otherwise.
Compute the conditional density of X, given that Y = y, where 0 < y < 1.

f(x | y) = f(x, y) / fY(y)
Ans. f(x | y) = 6x(2 − x − y) / (4 − 3y)

Example

The joint pdf of a two-dimensional RV is given by
f(x, y) = xy^2 + x^2/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
Compute P(X > 1), P(Y < 1/2), P(X > 1 / Y < 1/2), P(Y < 1/2 / X > 1), P(X < Y) and P(X + Y ≤ 1).
(Ans. 19/24, 1/4, 5/6, 5/19, 53/480, 13/480)

Variance of a Sum of Random Variables

If X and Y are random variables, then the variance of their sum is
Var(X + Y) = E({(X + Y) − (µX + µY)}^2)
= E((X − µX)^2) + E((Y − µY)^2) + 2 E((X − µX)(Y − µY))
= Var(X) + Var(Y) + 2 E((X − µX)(Y − µY))
= Var(X) + Var(Y) + 2 (E(XY) − µX µY)

Cov(X, Y) = E((X − µX)(Y − µY)) = E(XY) − µX µY

• If X and Y are mutually independent, then Cov(X, Y) = 0.
• If X1, …, Xn are mutually independent, and Sn = X1 + … + Xn, then Var(Sn) = Var(X1) + … + Var(Xn).

Q: Let Sn be a binomial random variable with parameters n and p. Show that
Var(Sn) = np(1 − p)

Example

Compute Var(X) when X represents the outcome when we roll a fair die.

Solution

Since P(X = i) = 1/6, i = 1, 2, 3, 4, 5, 6, we obtain
E(X^2) = Σ_{i=1}^{6} i^2 P[X = i]
= 1^2 (1/6) + 2^2 (1/6) + 3^2 (1/6) + 4^2 (1/6) + 5^2 (1/6) + 6^2 (1/6)
= 91/6
Var(X) = E(X^2) − E(X)^2 = 91/6 − (7/2)^2 = 35/12

Find the variance of the sum obtained when 10 independent rolls of a fair die are made.
Ans. 175/6

Find the variance of the number of heads obtained from 10 independent tosses of a fair coin.

The coefficient of correlation between X and Y, denoted by ρxy, is defined as
ρxy = Cxy / (σx σy)

The correlation coefficient is a measure of dependence between the RVs X and Y.
If E(XY) = 0, X and Y are said to be orthogonal RVs.
|ρxy| ≤ 1, or |Cxy| ≤ σx σy.

Example

Calculate the correlation coefficient for the following heights (in inches) of fathers (X) and their sons (Y):

X : 65 66 67 67 68 69 70 72
Y : 67 68 65 68 72 72 69 71

(Ans. 0.603)

ρxy = Cxy / (σx σy)
= [Σxy/n − (Σx/n)(Σy/n)] / [√(Σx^2/n − (Σx/n)^2) · √(Σy^2/n − (Σy/n)^2)]
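The computational formula translates line by line (the data are restated from the table):

```python
from math import sqrt

X = [65, 66, 67, 67, 68, 69, 70, 72]
Y = [67, 68, 65, 68, 72, 72, 69, 71]
n = len(X)

mx, my = sum(X) / n, sum(Y) / n
cov = sum(x * y for x, y in zip(X, Y)) / n - mx * my
sx = sqrt(sum(x * x for x in X) / n - mx * mx)
sy = sqrt(sum(y * y for y in Y) / n - my * my)
print(round(cov / (sx * sy), 3))  # ≈ 0.603
```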

Example

Let the random variables X and Y have the joint pdf
f(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0, elsewhere.
Find the correlation coefficient between X and Y.
(Ans. −1/11)

Example

If X, Y and Z are uncorrelated RVs with zero means and standard deviations 5, 12 and 9 respectively, and if U = X + Y and V = Y + Z, find the correlation coefficient between U and V.
(Ans. 48/65)

Conditional Expected Values

For a discrete two-dimensional RV (X, Y) with pmf pij and a function g(X, Y):
E{g(X, Y) / Y = yj} = Σi g(xi, yj) P(X = xi / Y = yj)
= Σi g(xi, yj) P{X = xi, Y = yj} / P{Y = yj} = Σi g(xi, yj) pij / p*j

For the continuous case:
E{g(X, Y) / Y} = ∫_{−∞}^{∞} g(x, y) f(x / y) dx
E{g(X, Y) / X} = ∫_{−∞}^{∞} g(x, y) f(y / x) dy

Conditional means

µ_{y/x} = E(Y / X) = ∫_{−∞}^{∞} y f(y / x) dy
σ^2_{y/x} = E{(Y − µ_{y/x})^2} = ∫_{−∞}^{∞} (y − µ_{y/x})^2 f(y / x) dy

Example

The joint pdf of (X, Y) is given by f(x, y) = 24xy, x > 0, y > 0, x + y ≤ 1, and 0 elsewhere. Find the conditional mean and variance of Y, given X.

f(y / x) = 2y/(1 − x)^2, E(Y / X = x) = (2/3)(1 − x), Var(Y / X = x) = (1/18)(1 − x)^2

If (X, Y) is distributed over the region bounded by y = 1 − x and y = 0, find E(X/Y) and E(Y/X).

Properties

(1) If X and Y are independent RVs, then E(Y/X) = E(Y) and E(X/Y) = E(X).
(2) E[E{g(X, Y)/X}] = E{g(X, Y)}; in particular, E{E(X/Y)} = E(X).
(3) E(XY) = E[X · E(Y/X)]; similarly E(X^2 Y^2) = E[X^2 E(Y^2/X)].

Example

Three coins are tossed. Let X denote the number of heads on the first two coins, Y denote the number of tails on the last two, and Z denote the number of heads on the last two. Find
(a) the joint distribution of (i) X and Y, (ii) X and Z;
(b) the conditional distribution of Y given X = 1;
(c) the covariance of X, Y and of X, Z;
(d) E(Z / X = 1);
(e) a joint distribution that is not the joint distribution of X and Z in (a), but has the same marginals as in (a).

If the RVs (X1, X2, …, Xn) are independent:
f(x1, x2, …, xn) = f(x1) × f(x2) × … × f(xn)

Conditional densities:
f(x1, x2 / x3) = f(x1, x2, x3) / f(x3)
f(x1 / x2, x3) = f(x1, x2, x3) / f(x2, x3)

Definition

Let X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete). Then
M(t) = the moment generating function of X = E(e^{tX})
= ∫_{−∞}^{∞} e^{tx} f(x) dx, if X is continuous
= Σx e^{tx} p(x), if X is discrete

Example

If f(x, y) = x e^{−x(y+1)}, x > 0, y > 0; 0, otherwise, find the moment-generating function of XY.

Solution:
M_XY(t) = ∫_0^∞ ∫_0^∞ e^{xyt} x e^{−x(y+1)} dy dx
= ∫_0^∞ x e^{−x} { ∫_0^∞ e^{xyt} e^{−xy} dy } dx
= ∫_0^∞ x e^{−x} { ∫_0^∞ e^{−xy(1−t)} dy } dx = 1/(1 − t)

Properties

MX(t) = Σ e^{tx} f(x) = Σ (1 + tx + (tx)^2/2! + …) f(x)

MX(0) = 1

M′X(t) = Σ (x + 2tx^2/2! + …) f(x)
M′X(t) |_{t=0} = Σ x f(x) = E(X) = µ1

M″X(t) = Σ (x^2 + 6tx^3/3! + …) f(x)
M″X(t) |_{t=0} = Σ x^2 f(x) = µ2

In general, MX^{(k)}(0) = k-th derivative of MX(t) at t = 0 = µk.

MX(t) = 1 + µ1 t + (µ2/2!) t^2 + (µ3/3!) t^3 + … + (µk/k!) t^k + …

µk = E(X^k) = ∫ x^k f(x) dx, X continuous
= Σ x^k p(x), X discrete

Let X be a random variable with moment generating function MX(t), and let Y = bX + a. Then
MY(t) = M_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} MX(bt)

Let X and Y be independent random variables with moment generating functions MX(t) and MY(t). Then
M_{X+Y}(t) = MX(t) MY(t)

6. Let X and Y be two random variables with moment generating functions MX(t) and MY(t) and distribution functions FX(x) and FY(y) respectively. If MX(t) = MY(t), then FX(x) = FY(x): a random variable can be identified by its moment generating function.

Example

If X represents the outcome when a fair die is tossed, find the MGF of X and hence find E(X) and Var(X).
(Ans. 7/2, 35/12)

If an RV X has the MGF M(t) = 3/(3 − t), obtain the standard deviation of X. (Ans. 1/3)
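M′(0) = E(X) and M″(0) = E(X^2) can be checked numerically without doing the algebra; a sketch using central differences on the die's MGF:

```python
from math import exp

def M(t):
    # MGF of a fair die: (1/6) * sum of e^{tx}, x = 1..6
    return sum(exp(t * x) for x in range(1, 7)) / 6

h = 1e-4
mean = (M(h) - M(-h)) / (2 * h)            # approximates M'(0)
second = (M(h) - 2 * M(0) + M(-h)) / h**2  # approximates M''(0)
print(mean, second - mean**2)              # ≈ 3.5 and ≈ 2.9167 (35/12)
```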

Example

Given M(t) = (1/10)e^t + (2/10)e^{2t} + (3/10)e^{3t} + (4/10)e^{4t}, find the p.d.f.

Since M(t) = Σx e^{tx} f(x), matching terms in
(1/10)e^t + (2/10)e^{2t} + (3/10)e^{3t} + (4/10)e^{4t} = f(a)e^{at} + f(b)e^{bt} + …
gives
f(x) = x/10, x = 1, 2, 3, 4; 0, otherwise.

If X has the p.d.f. f(x) = 1/x^2, 1 < x < ∞; 0, otherwise, find the mean of X.

The characteristic function of a random variable X is defined by
φX(w) = E(e^{iwX}) = ∫_{−∞}^{∞} e^{iwx} f(x) dx, if X is continuous.

Since e^{iwx} = cos wx + i sin wx,
|e^{iwx}| = (cos^2 wx + sin^2 wx)^{1/2} = 1.

Hence
|φX(w)| = |∫_{−∞}^{∞} e^{iwx} f(x) dx| ≤ ∫_{−∞}^{∞} |e^{iwx}| f(x) dx = ∫_{−∞}^{∞} f(x) dx = 1,
so the characteristic function always exists, even when the moment-generating function may not exist.

Properties of the Characteristic Function

1. µ′n = E(X^n) = the coefficient of (iω)^n/n! in the expansion of φ(ω) in a series of ascending powers of iω:
φX(w) = E(e^{iwX}) = Σx e^{iwx} f(x)
= Σx [1 + iwx + (iwx)^2/2! + (iwx)^3/3! + …] f(x)
= Σx f(x) + iw Σx x f(x) + ((iw)^2/2!) Σx x^2 f(x) + …

2. µ′n = (1/i^n) d^n/dω^n φ(ω) |_{ω=0}

3. If the characteristic function of an RV X is φx(ω) and if Y = aX + b, then φy(ω) = e^{ibω} φx(aω).

4. If X and Y are independent RVs, then φ_{x+y}(ω) = φx(ω) × φy(ω).

5. If the characteristic function of a continuous RV X with density function f(x) is φ(ω), then
f(x) = (1/2π) ∫_{−∞}^{∞} φ(ω) e^{−ixω} dω.

6. If the density function of X is known, the density function of Y = g(X) can be found from the CF of Y, provided Y = g(X) is one-one.

The characteristic function of a random variable X is given by
φx(w) = 1 − |w|, |w| ≤ 1; 0, |w| > 1.
Find the pdf of X.

The pdf of X is
f(x) = (1/2π) ∫_{−∞}^{∞} φx(w) e^{−iwx} dw
= (1/2π) ∫_{−1}^{1} (1 − |w|) e^{−iwx} dw
= (1/2π) [∫_{−1}^{0} (1 + w) e^{−iwx} dw + ∫_{0}^{1} (1 − w) e^{−iwx} dw]
= (1/(2πx^2)) (2 − e^{ix} − e^{−ix}) = (1/(πx^2)) (1 − cos x)
= (1/2π) [sin(x/2) / (x/2)]^2, −∞ < x < ∞

Show that the distribution for which the characteristic function is e^{−|ω|} has the density function
f(x) = (1/π) · 1/(1 + x^2), −∞ < x < ∞,
using f(x) = (1/2π) ∫_{−∞}^{∞} φ(ω) e^{−iωx} dω.

If X1 and X2 are two independent RVs that follow Poisson distributions with parameters λ1 and λ2, prove that (X1 + X2) also follows a Poisson distribution with parameter (λ1 + λ2). (Reproductive property of the Poisson distribution.)

φ_{x1}(ω) = e^{λ1 (e^{iω} − 1)}
φ_{x2}(ω) = e^{λ2 (e^{iω} − 1)}
Since X1 and X2 are independent RVs,
φ_{x1+x2}(ω) = e^{(λ1 + λ2)(e^{iω} − 1)}
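The reproductive property can be illustrated by simulation; the standard library has no Poisson sampler, so a small Knuth-style sampler is sketched here:

```python
import math, random

random.seed(0)

def poisson(lam):
    # Knuth's multiplicative method for one Poisson variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

s = [poisson(2.0) + poisson(3.0) for _ in range(100_000)]
m = sum(s) / len(s)
v = sum((x - m) ** 2 for x in s) / len(s)
print(m, v)  # both ≈ 5, as expected for Poisson(λ1 + λ2)
```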

Joint Characteristic Function

If (X, Y) is a two-dimensional RV, then E(e^{iω1 X + iω2 Y}) is called the joint characteristic function of (X, Y) and denoted by φxy(ω1, ω2).

φxy(ω1, ω2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{iω1 x + iω2 y} f(x, y) dx dy
= Σi Σj e^{iω1 xi + iω2 yj} p(xi, yj)

(i) φxy(0, 0) = 1
(ii) E{X^m Y^n} = (1/i^{m+n}) ∂^{m+n}/∂ω1^m ∂ω2^n φxy(ω1, ω2) |_{ω1=0, ω2=0}
(iii) φx(ω) = φxy(ω, 0) and φy(ω) = φxy(0, ω)
(iv) If X and Y are independent, φxy(ω1, ω2) = φx(ω1) × φy(ω2), and conversely.

Compute the characteristic function of the discrete RVs X and Y if the joint PMF is
PXY = 1/3, x = y = 0; 1/6, x = ±1, y = 0; 1/6, x = y = ±1; 0, else.

φXY(w1, w2) = Σ_{k=−1}^{1} Σ_{l=−1}^{1} e^{i(w1 k + w2 l)} PXY(k, l)
= 1/3 + (1/6) e^{iw1} + (1/6) e^{−iw1} + (1/6) e^{i(w1+w2)} + (1/6) e^{−i(w1+w2)}
= 1/3 + (1/6)(cos w1 + i sin w1) + (1/6)(cos w1 − i sin w1)
+ (1/6)(cos(w1 + w2) + i sin(w1 + w2)) + (1/6)(cos(w1 + w2) − i sin(w1 + w2))
= 1/3 + (1/3) cos w1 + (1/3) cos(w1 + w2)

Example

Two RVs X and Y have the joint characteristic function φxy(ω1, ω2) = e^{−2ω1^2 − 8ω2^2}. Show that they are uncorrelated.

By the property
E{X^m Y^n} = (1/i^{m+n}) ∂^{m+n}/∂ω1^m ∂ω2^n φxy(ω1, ω2) |_{ω1=0, ω2=0}:

E(X) = (1/i) ∂/∂ω1 e^{−2ω1^2 − 8ω2^2} |_{0,0} = (1/i) [(−4ω1) e^{−2ω1^2 − 8ω2^2}] |_{ω1=0, ω2=0} = 0
E(Y) = (1/i) [(−16ω2) e^{−2ω1^2 − 8ω2^2}] |_{ω1=0, ω2=0} = 0
E(XY) = (1/i^2) ∂^2/∂ω1∂ω2 e^{−2ω1^2 − 8ω2^2} |_{0,0}
= −[64 ω1 ω2 e^{−2ω1^2 − 8ω2^2}] |_{ω1=0, ω2=0} = 0

Cxy = E(XY) − E(X) × E(Y) = 0, so X and Y are uncorrelated.

Compute the joint characteristic function of X and Y if
fxy = (1/2π) exp{−(x^2 + y^2)/2}

φxy(w1, w2) = (1/2π) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−(x^2 + y^2)/2} e^{iw1 x + iw2 y} dx dy = e^{−(w1^2 + w2^2)/2}

Binomial Distribution

The Bernoulli probability mass function is the density function of a discrete variable X having 0 and 1 as the only possible values:
P(1) = P{X = 1} = p, P(0) = P{X = 0} = 1 − p,
where p, 0 ≤ p ≤ 1, is the probability that the trial is a success. X is a Bernoulli RV if its pmf satisfies the above equation for some p ∈ (0, 1).

An experiment consists of performing a sequence of subexperiments. If each subexperiment is identical, then the subexperiments are called trials.

Bernoulli Trials

• Each trial of an experiment that has only two possible outcomes (success or failure) is called a "Bernoulli trial."
• If p is the probability of success, then (1 − p) is the probability of failure.
• The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is given by a formula called the binomial distribution:
C(n, k) p^k q^{n−k}

Example of Bernoulli Trials

• Suppose I roll a die and I consider a 3 to be a success and any other number to be a failure.
• What is the probability of getting exactly 5 successes if I roll the die 20 times?
• Solution: C(20, 5) (1/6)^5 (5/6)^15
• What is the probability of getting … successes?
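The binomial formula in one helper (math.comb is exact):

```python
import math

def binom_pmf(n, k, p):
    # P(exactly k successes in n independent Bernoulli(p) trials)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(20, 5, 1/6))  # ≈ 0.1294 for the die example above
```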

Theorem

If the probability of occurrence of an event (probability of success) in a single trial of a Bernoulli experiment is p, then the probability that the event occurs exactly r times out of n independent trials is equal to nCr q^{n−r} p^r, where q = 1 − p is the probability of failure of the event.

Proof

Getting exactly r successes means getting r successes and (n − r) failures simultaneously.
∴ P(getting r successes and n − r failures) = p^r q^{n−r}
The trials from which the successes are obtained are not specified. There are nCr ways of choosing r trials for successes. Once the r trials are chosen for successes, the remaining (n − r) trials should result in failures. These nCr ways are mutually exclusive. In each of these nCr ways, P(getting exactly r successes) = p^r q^{n−r}.
∴ P(the event occurs exactly r times) = nCr p^r q^{n−r}.

Example

If war breaks out on the average once in 25 years, find the probability that in 50 years at a stretch, there will be no war.

p = 1/25, q = 24/25, n = 50;
P = C(50, 0) (1/25)^0 (24/25)^50 = (24/25)^50

Example

Each of two persons A and B tosses 3 fair coins. What is the probability that they obtain the same number of heads?

P = P(they get no head each, or 1 head each, or 2 heads each, or 3 heads each)
= P(A gets 0 heads) P(B gets 0 heads) + …
= [C(3,0)(1/2)^3]^2 + [C(3,1)(1/2)^3]^2 + [C(3,2)(1/2)^3]^2 + [C(3,3)(1/2)^3]^2
= (1 + 9 + 9 + 1)/64 = 5/16

Example. A game consists of matches between two players A and B, where A wins each match with probability 2/3. Any player who first wins 3 matches will be the winner of the game. Suppose matches are independent. What will be the probability for player A to win the game?

Solution:
A wins the game exactly when A wins at least 3 of 5 (possibly hypothetical) matches:
P = C(5,3) (2/3)^3 (1/3)^2 + C(5,4) (2/3)^4 (1/3) + C(5,5) (2/3)^5
= 64/81

Example. Independent trials are performed. Each trial results in a success with probability p and a failure with probability 1−p. What is the probability that
(a) at least one success occurs in the first n trials;
(b) exactly k successes occur in the first n trials?

Solution:
(a) The probability of no success in the first n trials is (1−p)^n. Thus, the answer is 1 − (1−p)^n.
(b) C(n, k) p^k (1 − p)^{n−k}

Assuming that p = P(A) remains the same for all repetitions, if we consider n independent repetitions (or trials) of E and if the random variable X denotes the number of times the event A has occurred, then X is called a binomial random variable with parameters n and p. The pmf of the binomial RV with parameters (n, p) is given by
P(X = x) = nCx p^x (1 − p)^{n−x}, x = 0, 1, 2, …, n.

Example

It is known that a car produced by an automobile company will be defective with probability 0.01, independently of the others. The company sells the cars in packages of 10 and offers a money-back guarantee that at most 1 of the 10 cars is defective. What proportion of packages sold must the company replace?

P(replace) = 1 − {10C0 (0.01)^0 (0.99)^10 + 10C1 (0.01)(0.99)^9} ≈ 0.004

Mean of the binomial distribution:
mean = E(X) = Σ_{x=0}^{n} x nCx p^x (1 − p)^{n−x} = Σ_{x=1}^{n} x nCx p^x (1 − p)^{n−x}
= Σ_{x=1}^{n} x [n! / (x!(n − x)!)] p^x (1 − p)^{n−x}
= Σ_{x=1}^{n} [n (n − 1)! / ((x − 1)! (n − x)!)] p · p^{x−1} (1 − p)^{n−x}
= np Σ_{x=1}^{n} [(n − 1)! / ((x − 1)! ((n − 1) − (x − 1))!)] p^{x−1} (1 − p)^{n−x}
= np Σ_{x=1}^{n} (n−1)C(x−1) p^{x−1} (1 − p)^{(n−1)−(x−1)}
= np (p + (1 − p))^{n−1} = np,
since (p + (1 − p))^n = Σ_{x=0}^{n} nCx p^x (1 − p)^{n−x}.

Second moment: µ′2 = E(X^2) = Σ_{x=0}^{n} x^2 nCx p^x (1 − p)^{n−x}
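Both moments can be checked against np and np(1−p) by summing the exact pmf; n and p below are arbitrary illustrative values:

```python
import math

n, p = 12, 0.3
pmf = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * q for x, q in enumerate(pmf))
second = sum(x * x * q for x, q in enumerate(pmf))
print(mean, n * p)                        # both 3.6
print(second - mean**2, n * p * (1 - p))  # both 2.52
```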

Example

For a binomial distribution with mean 6 and standard deviation √2, find the first two terms of the distribution.
(Ans. (1/3)^9, 2/2187)

The mean and variance of a binomial distribution are 4 and 3 respectively. Find P(X ≥ 1).
(Ans. 1 − (3/4)^16 = 0.9899)

Poisson Distribution

The probability density function of the Poisson variate can be obtained as a limiting case of the binomial probability density function under the following assumptions:
(i) the number of trials is increased indefinitely (n → ∞),
(ii) the probability of success in a single trial is very small (p → 0),
(iii) np is a finite constant, say np = λ.

Consider the probability density function of a binomial random variable X:
P(X = x) = nCx p^x (1 − p)^{n−x} = [n! / (x!(n − x)!)] p^x (1 − p)^{n−x}
= [n(n − 1)…(n − x + 1) / x!] (λ/n)^x (1 − λ/n)^{n−x}
= [n(n − 1)…(n − x + 1) / n^x] (λ^x / x!) (1 − λ/n)^n / (1 − λ/n)^x
= [(n/n)((n − 1)/n)…((n − x + 1)/n)] (λ^x / x!) (1 − λ/n)^n / (1 − λ/n)^x

For given x, as n → ∞, the terms 1 − 1/n, 1 − 2/n, …, 1 − (x − 1)/n and (1 − λ/n)^x tend to 1.
Also, lim_{n→∞} (1 − λ/n)^n = e^{−λ}.
Hence, lim_{n→∞} P(X = x) = (λ^x / x!) e^{−λ}.

Poisson random variable

A random variable X taking on one of the values 0, 1, 2, … is said to be a Poisson random variable with parameter λ if for some λ > 0,
P(x) = P(X = x) = e^{−λ} λ^x / x!, x = 0, 1, 2, …

Σ_{x=0}^{∞} P(x) = Σ_{x=0}^{∞} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^{∞} λ^x / x! = e^{−λ} e^{λ} = 1

Mean = E(X) = Σ_{x=0}^{∞} x P(X = x) = Σ_{x=0}^{∞} x e^{−λ} λ^x / x! = Σ_{x=1}^{∞} e^{−λ} λ^x / (x − 1)!
= λ e^{−λ} Σ_{x=1}^{∞} λ^{x−1} / (x − 1)! = λ e^{−λ} [1 + λ/1! + λ^2/2! + …]
= λ e^{−λ} e^{λ} = λ

Similarly, Var(X) = λ.

Example

The average number of radioactive particles passing through a counter during 1 millisecond in a laboratory experiment is 4. What is the probability that 6 particles enter the counter in a given millisecond?

If the probability that a fuse unit is defective is 2%, in a box of 200 fuses find the probability that exactly 4 fuses are defective.

Ans. e^{−4} 4^6 / 6!,   4^4 e^{−4} / 4!
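Both answers follow from one pmf helper:

```python
import math

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

print(poisson_pmf(4, 6))  # particles example, ≈ 0.1042
print(poisson_pmf(4, 4))  # fuses example (λ = 200 × 0.02 = 4), ≈ 0.1954
```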

Example

At a busy traffic intersection the probability p of an individual car having an accident is very small, say p = 0.0001. However, during certain peak hours of the day, say between 4 p.m. and 6 p.m., a large number of cars (say 1000) pass through the intersection. Under these conditions, what is the probability of two or more accidents occurring during that period?

In a component manufacturing industry, there is a small probability of 1/500 for any component to be defective. The components are supplied in packets of 10. Use the Poisson distribution to calculate the approximate number of packets containing (i) no defective, (ii) one defective component in a consignment of 10,000 packets.
(Ans. 0.9802 × 10000, 0.0196 × 10000)

Binomial Distribution (fitting)

P(X = r) = nCr p^r q^{n−r}; r = 0, 1, 2, …, n

If we assume that n trials constitute a set and if we consider N sets, the frequency function of the binomial distribution is given by
f(r) = N p(r) = N nCr p^r q^{n−r}

Example

Fit a binomial distribution for the following data and hence find the theoretical frequencies:
x: 0  1  2  3  4
f: 5  29 36 25 5
Ans. 7, 26, 37, 24, 6

The following data give the number of seeds germinating out of 10 on damp filter paper for 80 sets of seeds. Fit a binomial distribution to these data:
x: 0  1  2  3  4  5  6  7  8  9  10
y: 6  20 28 12 8  6  0  0  0  0  0
Ans. 6.89, 19.14, 23.94, 17.74, 8.63, 2.88, 0.67, 0.1, 0.01, 0, 0

Fit a Poisson distribution for the following distribution:
x: 0   1   2   3   4  5
f: 142 156 69  27  5  1

The following table gives the number of yeast cells per square for 400 squares:
No. of cells per square (x): 0 1 2 3 4 5 6 7 8 9 10
Ans. 107, 141, 93, 41, 4, 0, 0, 0, 0, 0

GEOMETRIC DISTRIBUTION

Suppose independent trials, each having probability p of success, are performed until a success occurs. If X denotes the number of trials required, then
P(X = n) = (1 − p)^{n−1} p, n = 1, 2, …,
since in order that X = n, it is necessary and sufficient that the first (n − 1) trials are failures and the n-th trial is a success.

Σ_{n=1}^{∞} P(X = n) = Σ_{n=1}^{∞} (1 − p)^{n−1} p = p / [1 − (1 − p)] = 1.

Geometric Random Variable

In a chemical engineering process industry it is known that, on the average, 1 in every 100 items is defective. What is the probability that the fifth item inspected is the first defective item found?
Ans. (0.01)(0.99)^4 = 0.0096

At busy times, a telephone exchange works at full capacity, so people cannot get a line immediately. It may be of interest to know the number of attempts necessary in order to get a connection. Suppose that p = 0.05; find the probability that 5 attempts are necessary for a successful call connection.
Ans. ≈ 0.041
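One helper covers both exercises:

```python
def geometric_pmf(p, n):
    # P(first success occurs on trial n) = (1-p)^(n-1) p
    return (1 - p) ** (n - 1) * p

print(geometric_pmf(0.01, 5))  # ≈ 0.0096
print(geometric_pmf(0.05, 5))  # ≈ 0.0407
```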

Mean

E(X) = Σ_{n=1}^{∞} n (1 − p)^{n−1} p = p Σ_{n=1}^{∞} n t^{n−1}, where t = 1 − p
= p [1 + 2t + 3t^2 + 4t^3 + …]   (since t < 1)
= p [1 − t]^{−2} = p [1 − (1 − p)]^{−2} = p/p^2 = 1/p

E(X^2) = Σ_{n=1}^{∞} n^2 (1 − p)^{n−1} p = Σ_{n=1}^{∞} n^2 t^{n−1} p
= Σ_{n=1}^{∞} [n(n − 1) + n] t^{n−1} p
= Σ_{n=1}^{∞} n(n − 1) t^{n−1} p + Σ_{n=1}^{∞} n t^{n−1} p
= pt Σ_{n=2}^{∞} n(n − 1) t^{n−2} + 1/p
= pt [2 + 3·2t + 4·3t^2 + 5·4t^3 + …] + 1/p
= 2pt [1 + 3t + 6t^2 + …] + 1/p = 2pt [1 − t]^{−3} + 1/p
= 2p(1 − p)/p^3 + 1/p = (2 − p)/p^2

Var(X) = E(X^2) − [E(X)]^2 = (2 − p)/p^2 − 1/p^2 = (1 − p)/p^2

Negative Binomial Distribution

Here trials are repeated until a fixed number of successes occur, rather than fixing the number of trials n as in the binomial case. We want the probability that the r-th success occurs on the x-th trial.

Negative binomial experiment

The count observed in a negative binomial experiment is called a negative binomial random variable and its probability distribution is called the negative binomial distribution.

The negative binomial distribution is used when the number of successes is fixed and we're interested in the number of failures before reaching the fixed number of successes. An experiment which follows a negative binomial distribution will satisfy the following requirements:
1. The experiment consists of a sequence of independent trials.
2. Each trial has two possible outcomes, S or F.
3. The probability of success is constant from one trial to another.
4. The experiment continues until a total of r successes are observed, where r is fixed in advance.

Suppose we repeatedly throw a die, and consider a "1" to be a "success". The probability of success on each trial is 1/6. The number of trials needed to get three successes belongs to the infinite set {3, 4, 5, 6, …}. That number of trials is a (displaced) negative-binomially distributed random variable. The number of failures before the third success belongs to the infinite set {0, 1, 2, 3, …}. That number of failures is also a negative-binomially distributed random variable.

A Bernoulli process is a discrete-time process, and so the numbers of trials, failures, and successes are integers. For the special case where r is an integer, the negative binomial distribution is known as the Pascal distribution.

In the special case r = 1, we get the probability distribution of failures before the first success (i.e., the probability of success on the (k+1)-th trial), which is a geometric distribution.

Let the random variable Y denote the number of failures before the occurrence of the r-th success. Then Y + r denotes the number of trials necessary to produce exactly r successes and y failures, with the r-th success occurring at the (y + r)-th trial.

pmf of Y: g(y) = [(y+r−1)C(r−1) p^{r−1} q^y] p, q = 1 − p
= (y+r−1)C(r−1) p^r q^y

Example

Pat is required to sell candy bars to raise money for the 6th grade field trip. There are thirty houses in the neighborhood, and Pat is not supposed to return home until five candy bars have been sold. So the child goes door to door, selling candy bars. At each house, there is a 0.4 probability of selling one candy bar and a 0.6 probability of selling nothing. What is the probability that the fifth candy bar is sold at the n-th house?

A fair die is cast on successive independent trials until the second six is observed. The probability of observing exactly ten non-sixes before the second six is cast is
11C1 (1/6)^2 (5/6)^10.

Three fair coins are tossed repeatedly; the probability of getting either all heads or all tails for the second time on the fifth toss is
4C1 (1/4)^2 (3/4)^3.
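The pmf g(y) above in code, checked against both die-and-coin answers:

```python
import math

def neg_binom_pmf(r, p, y):
    # P(exactly y failures before the r-th success)
    return math.comb(y + r - 1, r - 1) * p**r * (1 - p)**y

print(neg_binom_pmf(2, 1/6, 10))  # ten non-sixes before the 2nd six, ≈ 0.0493
print(neg_binom_pmf(2, 1/4, 3))   # all-heads/all-tails example, ≈ 0.1055
```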

Relationship between the binomial and negative binomial

Let X have a binomial distribution with parameters n, p. Let Y have a negative binomial distribution with parameters r and p (that is, Y = number of trials required to obtain r successes with probability of success p). Then
(i) P(Y ≤ n) = P(X ≥ r)
(ii) P(Y > n) = P(X < r)

The probability that an experiment will succeed is 0.8. If the experiment is repeated until four successful outcomes have occurred, what is the expected number of repetitions required?
Ans. 1 (the expected number of failures: rq/p = 4 × 0.2 / 0.8 = 1)

Check that the pmf sums to 1:
Σ_{y=0}^{∞} g(y) = Σ_{y=0}^{∞} (y+r−1)C(r−1) p^r q^y
= Σ_{y=0}^{∞} [(y + r − 1)! / ((r − 1)! y!)] p^r q^y
= p^r Σ_{y=0}^{∞} (y+r−1)Cy q^y
= p^r [(r − 1)!/(r − 1)! + (r!/(r − 1)!) q + ((r + 1)!/((r − 1)! 2!)) q^2 + …]
= p^r [1 + rq + (r(r + 1)/2!) q^2 + …]
= p^r [1 − q]^{−r} = p^r p^{−r} = 1.

Mean = E(Y) = rq/p,   Variance = rq/p^2

PROBABILITY

DISTRIBUTIONS

Discrete Distributions


Uniform Distribution

[Figure: the uniform pdf fX(x) = 1/u for 0 ≤ x ≤ u (0 otherwise), and its cdf FX(x) = 0 for x < 0, x/u for 0 ≤ x ≤ u, 1 for u < x]

If X is uniformly distributed over the interval [0, 10], compute the probability that (a) 2 < X < 9, (b) 1 < X < 4, (c) X < 5, (d) X > 6.
Ans. 7/10, 3/10, 5/10, 4/10

mean = E(X) = ∫_a^b x f(x) dx = ∫_a^b x · 1/(b − a) dx = (b + a)/2

If X has a uniform distribution in (−3, 3), find P(|X − 2| < 2).
Ans. 1/2

If X has a uniform distribution in (−a, a), a > 0, find a such that P(|X| < 1) = P(|X| > 1).
Ans. a = 2

Buses arrive at a specified stop at 15-min intervals starting at 7 A.M., that is, they arrive at 7, 7:15, 7:30 and so on. If a passenger arrives at the stop at a random time that is uniformly distributed between 7 and 7:30 A.M., find the probability that he waits (a) less than 5 min, (b) at least 12 min for a bus.
Ans. 1/3, 1/5
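The bus-wait answers can be reproduced by simulating the uniform arrival time (seeded for repeatability):

```python
import random

random.seed(2)
N = 100_000
# arrival time in minutes after 7:00; wait = time to the next 15-min bus
waits = [(15 - random.uniform(0, 30) % 15) % 15 for _ in range(N)]

print(sum(w < 5 for w in waits) / N)    # ≈ 1/3
print(sum(w >= 12 for w in waits) / N)  # ≈ 1/5
```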

Example:

The time to process an application is uniformly distributed between 3 and 7 days. What is the probability that the application will be processed in less than 4 days?

[Figure: rectangular density of height 1/(b − a) on (a, b) = (3, 7); total area = (b − a) · 1/(b − a) = 1]

P(3 ≤ x ≤ 4) = (4 − 3) · 1/(7 − 3) = 0.25

What is the probability that it will take more than 6.5 days?
P(6.5 ≤ x ≤ 7) = (7 − 6.5) · 1/(7 − 3) = 0.125

variance = E(X^2) − [E(X)]^2

E(X) = (a + b)/2
E(X^2) = ∫_a^b x^2 f(x) dx = [1/(3(b − a))] [x^3]_a^b = (b − a)(b^2 + ab + a^2) / (3(b − a)) = (a^2 + ab + b^2)/3

var = (a^2 + ab + b^2)/3 − (a^2 + 2ab + b^2)/4 = (a^2 + b^2 − 2ab)/12 = (b − a)^2/12

Moments: E(X^n) = (b^{n+1} − a^{n+1}) / [(n + 1)(b − a)]

Central moments: µr = E{(X − E(X))^r} = [1/(b − a)] ∫_a^b (x − (a + b)/2)^r dx
= [((b − a)/2)^{r+1} − ((a − b)/2)^{r+1}] / [(b − a)(r + 1)]
= 0, if r is odd
= [1/(r + 1)] ((b − a)/2)^r, if r is even

Hence µ_{2n−1} = 0 and µ_{2n} = [1/(2n + 1)] ((b − a)/2)^{2n} for n = 1, 2, 3, …

The gamma function, denoted by Γ, is defined as
Γ(x) = ∫_0^∞ e^{−t} t^{x−1} dt, x > 0

Properties
1. Γ(x) = [−e^{−t} t^{x−1}]_0^∞ + ∫_0^∞ (x − 1) t^{x−2} e^{−t} dt
= (x − 1) ∫_0^∞ e^{−t} t^{(x−1)−1} dt = (x − 1) Γ(x − 1)
2. Γ(1) = ∫_0^∞ e^{−t} dt = 1
3. Putting x = n:
Γ(n) = (n − 1) Γ(n − 1) = (n − 1)(n − 2) Γ(n − 2) = …
= (n − 1)(n − 2)…2·1 = (n − 1)!
Γ(1) = 0! = 1

Exponential Distribution

If the occurrences of events over nonoverlapping intervals are independent, such as arrival times of telephone calls or bus arrival times at a bus stop, then the waiting-time distribution of these events can be shown to be exponential; for example, the time between people arriving at a line to check out in a department store. (People, machines, or telephone calls may wait in a queue.)

f(x) = λ e^{−λx}, x ≥ 0; 0, otherwise, with parameter λ > 0.

moments: µ'_r = E(X^r) = ∫_0^∞ x^r λe^(−λx) dx
= (1/λ^r) ∫_0^∞ y^r e^(−y) dy = Γ(r + 1)/λ^r = r!/λ^r

Example
Let X have an exponential distribution with mean 100. Find the probability that X < 90.
Ans. 1 − e^(−0.9) ≈ 0.593

Customers arrive in a certain shop according to an approximate Poisson process at a mean rate of 20 per hour. What is the probability that the shopkeeper will have to wait for more than 5 minutes for his first customer to arrive?
Ans. With λ = 20/60 = 1/3 per minute, P(X > 5) = e^(−5/3) ≈ 0.189

Memoryless Property of the Exponential Distribution
P(X > s + t | X > s) = P(X > t), for any s, t > 0
P(X > k) = ∫_k^∞ λe^(−λx) dx = (−e^(−λx)) |_k^∞ = e^(−λk)
Now P(X > s + t | X > s) = P{X > s + t & X > s} / P{X > s}
= P{X > s + t} / P{X > s} = e^(−λ(s+t)) / e^(−λs) = e^(−λt) = P(X > t).
If X represents the lifetime of an equipment, then the above property states that if the equipment has been working for time s, then the probability that it will survive an additional time t depends only on t (not on s) and is identical to the probability of survival for time t of a new piece of equipment. The equipment does not "remember" that it has already been in use for time s.
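A simulation sketch of the memoryless property (standard library only; the parameter values are ours): among exponential lifetimes that survive past s, the residual life behaves like a fresh exponential.

```python
import random
from math import exp

lam, s, t = 0.5, 2.0, 3.0
samples = [random.expovariate(lam) for _ in range(1_000_000)]
survivors = [x for x in samples if x > s]
cond = sum(x > s + t for x in survivors) / len(survivors)  # P(X > s+t | X > s)
uncond = sum(x > t for x in samples) / len(samples)        # P(X > t)
print(cond, uncond, exp(-lam * t))   # all three close to 0.2231
```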

A crew of workers has 3 interchangeable machines, of which 2 must be working for the crew to do its job. When in use, each machine functions for an exponentially distributed time with parameter λ before breaking down. The workers decide to initially use machines A and B and keep machine C in reserve, to replace whichever of A or B breaks down first. They will then be able to continue working until one of the two remaining machines breaks down. When the crew is forced to stop working because only one of the machines has not yet broken down, what is the probability that the still operable machine is machine C?
Ans. 1/2: by the memoryless property, once C replaces the first failed machine, C and the surviving machine are probabilistically identical, so each is equally likely to outlast the other.

Suppose the life length of an appliance has an exponential distribution with mean 10 years. A used appliance is bought by someone. What is the probability that it will not fail in the next 10 years?
Ans. By the memoryless property, the used appliance behaves like a new one, so P(X > 10) = e^(−1) ≈ 0.368

Suppose that the amount of waiting time a customer spends at a restaurant has an exponential distribution with a mean value of 5 minutes. Find the probability that a customer will spend more than 10 minutes in the restaurant.
Ans. e^(−2) ≈ 0.1353

Example
Suppose that the length of a phone call in minutes is an exponential random variable with parameter λ = 1/10. If A arrives immediately ahead of B at a public telephone booth, find the probability that B will have to wait (i) more than 10 minutes, and (ii) between 10 and 20 minutes.
(Ans. 0.368, 0.233)

Erlang distribution or General Gamma distribution
A continuous RV X is said to follow an Erlang distribution or General Gamma distribution with parameters λ > 0, k > 0, if its pdf is given by
f(x) = λ^k x^(k−1) e^(−λx) / Γ(k), for x ≥ 0
     = 0, otherwise
∫_0^∞ f(x) dx = 1
When λ = 1, the pdf becomes
f(x) = x^(k−1) e^(−x) / Γ(k), x ≥ 0; k > 0,
called the Gamma distribution or simple gamma distribution with parameter k.
When k = 1, the Erlang distribution reduces to the exponential distribution with parameter λ > 0.

Example
The random variable X has the gamma distribution with density function given by
f(x) = 0, for x ≤ 0
     = 2e^(−2x), for x > 0
Find the probability that X is not smaller than 3.
P(X ≥ 3) = 2 ∫_3^∞ e^(−2x) dx = 2 · (e^(−2x)/(−2)) |_3^∞ = e^(−6)

Suppose that an average of 30 customers per hour arrive at a shop in accordance with a Poisson process; that is, if a minute is our unit, then λ = 1/2. What is the probability that the shopkeeper waits more than 5 minutes before both of the first two customers arrive?
Solution.
If X denotes the waiting time in minutes until the second customer arrives, then X has an Erlang (Gamma) distribution with k = 2, λ = 1/2.
P(X > 5) = ∫_5^∞ λ(λx)^(k−1) e^(−λx) / Γ(k) dx
= 0.287
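The same tail probability can be obtained from scipy (our choice of tool): scipy's gamma distribution takes the shape a = k and scale = 1/λ.

```python
from scipy.stats import gamma

# Erlang waiting time for the 2nd arrival: k = 2, lam = 1/2
print(gamma(a=2, scale=2).sf(5))   # survival function P(X > 5) ~ 0.2873
```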

Mean and Variance of Erlang Distribution
moments: µ'_r = E(X^r)
= ∫_0^∞ x^r · (λ^k / Γ(k)) x^(k−1) e^(−λx) dx
= (λ^k / Γ(k)) · (1/λ^(k+r)) ∫_0^∞ t^(k+r−1) e^(−t) dt = Γ(k + r) / (λ^r Γ(k))
mean = E(X) = k/λ, var(X) = k/λ²

m.g.f. = M_X(t) = E(e^(tX))
= ∫_0^∞ (λ^k / Γ(k)) x^(k−1) e^(−λx) e^(tx) dx = (λ^k / Γ(k)) ∫_0^∞ x^(k−1) e^(−(λ−t)x) dx
= (λ^k / Γ(k)) · (1/(λ − t)^k) ∫_0^∞ y^(k−1) e^(−y) dy = λ^k / (λ − t)^k
= (1 − t/λ)^(−k)

Reproductive Property
If X₁, X₂, ..., Xₙ are independent Erlang variables with parameters (λ, k₁), (λ, k₂), ..., (λ, kₙ), then
M_(X₁+X₂+...+Xₙ)(t) = M_X₁(t) M_X₂(t) ..... M_Xₙ(t)
= (1 − t/λ)^(−k₁) (1 − t/λ)^(−k₂) ..... (1 − t/λ)^(−kₙ)
= (1 − t/λ)^(−(k₁+k₂+...+kₙ))
which is the m.g.f. of an Erlang variable with parameter (λ, k₁ + k₂ + ..... + kₙ).
Hence the sum of a finite number of independent Erlang variables (with common λ) is also an Erlang variable: X₁ + X₂ + .... + Xₙ is Erlang with parameter (λ, k₁ + k₂ + ..... + kₙ).

If a company employs n salespersons, its gross sales in thousands of rupees may be regarded as a RV having an Erlang distribution with λ = 1/2 and k = 80√n. If the sales cost is Rs. 8000 per person, how many salespersons should the company employ to maximise the expected profit?
Ans. Expected profit = E(X) − 8n = 160√n − 8n (thousand rupees), which is maximised at n = 100 salespersons.

Weibull Distribution
f(x) = αβx^(β−1) e^(−αx^β), x > 0
parameters α, β > 0
When β = 1, the Weibull distribution reduces to the exponential distribution with parameter α.
µ'_r = E(X^r) = αβ ∫_0^∞ x^(r+β−1) e^(−αx^β) dx
Substituting y = αx^β:
= ∫_0^∞ (y/α)^(r/β) e^(−y) dy = α^(−r/β) Γ(r/β + 1)
Mean = E(X) = µ'₁ = α^(−1/β) Γ(1/β + 1)
Variance = α^(−2/β) [Γ(2/β + 1) − (Γ(1/β + 1))²]

Each of the 6 tubes of a radio set has a life length (in years) which may be considered as a RV that follows a Weibull distribution with parameters α = 25 and β = 2. If these tubes function independently of one another, what is the probability that no tube will have to be replaced during the first 2 months of service?

Its density function is f(x) = αβx^(β−1) e^(−αx^β), x > 0, so here
f(x) = 50x e^(−25x²), x > 0
P(a tube is not to be replaced during the first 2 months)
= P(X > 1/6) = ∫_(1/6)^∞ 50x e^(−25x²) dx = (−e^(−25x²)) |_(1/6)^∞
= e^(−25/36)
P(none of the 6 tubes is to be replaced during the first 2 months)
= (e^(−25/36))⁶ = 0.0155
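The closed-form Weibull survival probability P(X > t) = e^(−αt^β) makes this a one-liner (time in years, 2 months = 1/6 year):

```python
from math import exp

alpha, beta, t = 25, 2, 1 / 6
r_one = exp(-alpha * t**beta)   # P(one tube survives 2 months) = e^(-25/36)
print(r_one**6)                 # all six survive: ~ 0.0155
```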

Properties of a

Normal Distribution

• Continuous Random Variable

• Symmetrical in shape (Bell shaped)

• The probability of any given range of

numbers is represented by the area under

the curve for that range.

• Probabilities for all normal distributions are

determined using the Standard Normal

Distribution.

Probability for a

Continuous Random Variable

Probability Density Function for
Normal Distribution

f(x) = (1/(σ√(2π))) e^(−(1/2)((x − µ)/σ)²)        N(µ, σ)

For example, for N(−7, 4):
f(x) = (1/√(32π)) e^(−(1/2)((x + 7)/4)²)

∫_(−∞)^∞ f(x) dx = 1

Standard Normal Distribution
N(0,1)
φ(z) = (1/√(2π)) e^(−z²/2), −∞ < z < ∞,
obtained by putting µ = 0, σ = 1 and by changing x and f respectively into z and φ.
If X has distribution N(µ, σ) and if Z = (X − µ)/σ, then Z has distribution N(0,1).
Values of φ(z) and ∫_0^z φ(z) dz are tabulated.

X ~ N(µ, σ):
E(X) = ∫_(−∞)^∞ x f(x) dx = (1/(σ√(2π))) ∫_(−∞)^∞ x e^(−(x−µ)²/2σ²) dx
Putting t = (x − µ)/(σ√2):
= (1/√π) ∫_(−∞)^∞ (µ + √2 σt) e^(−t²) dt
= (µ/√π) ∫_(−∞)^∞ e^(−t²) dt + (√2 σ/√π) ∫_(−∞)^∞ t e^(−t²) dt
= (µ/√π) · √π + 0 = µ


Determining the Probability for a

Standard Normal Random Variable

• P(-∞≤ Z ≤ 1.62) = .5 + .4474 = .9474

• P(Z > 1.62) = 1 - P(-∞≤ Z ≤ 1.62) =

1 - .9474 = .0526

z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09

0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359

0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753

0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141

0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517

0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879

0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224

0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549

0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852

0.8 0.2881 0.2910 0.2939 0.2967 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133

0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389

1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621

1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830

1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015

1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177

1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319

1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441

1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545

1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633

1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706

1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767

2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817

2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857

2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890

2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916

2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936

2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952

2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964

2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974

2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981

2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986

3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990

3.1 0.4990 0.4991 0.4991 0.4991 0.4992 0.4992 0.4992 0.4992 0.4993 0.4993

3.2 0.4993 0.4993 0.4994 0.4994 0.4994 0.4994 0.4994 0.4995 0.4995 0.4995

3.3 0.4995 0.4995 0.4995 0.4996 0.4996 0.4996 0.4996 0.4996 0.4996 0.4997

3.4 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4998

Determining the probability of any Normal Random

Variable

Interpreting Z

• In the figure, Z = -0.8 means that the value 360 is .8 standard deviations below the mean.

• A positive value of Z designates how many

standard deviations (σ ) X is to the right of

the mean (µ ).

• A negative value of Z designates how many standard deviations (σ) X is to the left of the mean (µ).

Example: A group of achievement scores are normally distributed with a mean of 76 and a standard deviation of 4. If one score is randomly selected, what is the probability that it is at least 80?

Z = (x − µ)/σ = (80 − 76)/4 = 1
P(x ≥ 80) = P(z ≥ 1) = .5 − P(0 ≤ z ≤ 1) = .5 − .3413 = .1587

[Figure: normal curve with µ = 76, σ = 4; area .3413 between z = 0 and z = 1, tail area .1587 beyond z = 1]

Continuing, what is the probability that it is less than 70?

Z = (x − µ)/σ = (70 − 76)/4 = −1.5
P(x ≤ 70) = P(z ≤ −1.5) = .5 − P(−1.5 ≤ z ≤ 0) = .5 − .4332 = .0668

[Figure: area .4332 between z = −1.5 and 0, tail area .0668 below z = −1.5]

What proportion of the scores occur within 70 and 85?

Z = (70 − 76)/4 = −1.5, Z = (85 − 76)/4 = 2.25
P(70 ≤ x ≤ 85) = P(−1.5 ≤ z ≤ 2.25)
= P(−1.5 ≤ z ≤ 0) + P(0 ≤ z ≤ 2.25)
= .4332 + .4878 = .9210

[Figure: areas .4332 and .4878 on either side of z = 0]

Time required to finish an exam is known to be normally

distributed with a mean of 60 Min. and a Std Dev. of 12

minutes. How much time should be allowed in order for

90% of the students to finish?

[Figure: normal curve with µ = 60, σ = 12; area .9 to the left of x]

z = (x − µ)/σ
zσ = x − µ
x = zσ + µ = 1.28(12) + 60 = 75.36 minutes
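The quantile can be read off directly with scipy's inverse normal cdf (our choice of tool; the slide uses z = 1.28 from the table):

```python
from scipy.stats import norm

mu, sigma = 60, 12
print(norm.ppf(0.90, loc=mu, scale=sigma))   # ~ 75.38 minutes
```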

An automated machine that fills sugar sacks has an adjusting device to change the mean fill per sack. It is now being operated at a setting that results in a mean fill of 81.5 oz. If only 1% of the sacks filled at this setting contain less than 80.0 oz, what is the value of the variance for this population of fill weights? (Assume normality.)

[Figure: lower 1% tail cut off at z = −2.33]

µ = 81.5, x = 80.0, P(X < 80.0) = .01 ⇒ z = −2.33
Z = (x − µ)/σ: −2.33 = (80.0 − 81.5)/σ
σ = (80.0 − 81.5)/(−2.33) = .6437
σ² = .4144

Moment generating function of N(0,1)
M_Z(t) = E(e^(tZ)) = ∫_(−∞)^∞ e^(tz) φ(z) dz
= (1/√(2π)) ∫_(−∞)^∞ e^(tz) e^(−z²/2) dz = (1/√(2π)) ∫_(−∞)^∞ e^(−(z² − 2tz)/2) dz
= (1/√(2π)) ∫_(−∞)^∞ e^(−(z−t)²/2 + t²/2) dz
= e^(t²/2) · (1/√(2π)) ∫_(−∞)^∞ e^(−(z−t)²/2) dz = e^(t²/2)
(the last integral equals 1, being the total probability of N(t, 1)).

The moment generating function of N(µ, σ)
M_X(t) = M_(σZ+µ)(t)
= E(e^(t(σZ+µ))) = e^(µt) E(e^(tσZ))
= e^(µt) M_Z(σt)
= e^(µt) e^(σ²t²/2)
= e^(t(µ + σ²t/2))
= 1 + (t/1!)(µ + σ²t/2) + (t²/2!)(µ + σ²t/2)² + ..... + ∞

If X has the distribution N(µ, σ), then Y = aX + b has the distribution N(aµ + b, |a|σ):
M_X(t) = e^(t(µ + σ²t/2))
M_Y(t) = M_(aX+b)(t)
= e^(bt) M_X(at)
= e^(bt) e^(t(aµ + a²σ²t/2))
= e^(t(aµ + b) + a²σ²t²/2)

In particular, if X has distribution N(µ, σ), then Z = (X − µ)/σ = (1/σ)X − µ/σ
has the distribution N(µ/σ − µ/σ, (1/σ)σ) = N(0,1).

Additive property of normal distribution
If X_i (i = 1, 2, ...., n) are n independent normal RVs with mean µ_i and variance σ_i², then Σ_(i=1)^n a_iX_i is also a normal RV with mean Σ_(i=1)^n a_iµ_i and variance Σ_(i=1)^n a_i²σ_i².
Proof: M_(Σa_iX_i)(t) = e^(a₁µ₁t + a₁²σ₁²t²/2) · e^(a₂µ₂t + a₂²σ₂²t²/2) ........
= e^((Σa_iµ_i)t + (Σa_i²σ_i²)t²/2)

When n is very large and neither p nor q is very small, X ~ B(n, p) can be approximated by a normal distribution. The standard binomial variable Z is given by
Z = (X − np)/√(npq)
As X varies from 0 to n with step size 1, Z varies from −np/√(npq) to nq/√(npq) with step size 1/√(npq). In the limit Z is standard normal:
P(Z ≤ z) → F(z) = ∫_(−∞)^z (1/√(2π)) e^(−t²/2) dt, −∞ < z < ∞

Let X be the number of times that a fair coin, flipped 40 times, lands heads. Find P(X = 20). Use the normal approximation and compare it to the exact solution.
With np = 20 and npq = 10, using the continuity correction:
P(X = 20) = P(19.5 < X < 20.5)
= P((19.5 − 20)/√10 < (X − 20)/√10 < (20.5 − 20)/√10)
= Φ(.16) − Φ(−.16) = .1272
(The exact value is C(40, 20)/2⁴⁰ ≈ .1254.)
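A short comparison of the exact binomial probability and the continuity-corrected normal approximation (scipy used for the normal cdf, an assumption of ours):

```python
from math import comb, sqrt
from scipy.stats import norm

n, p = 40, 0.5
exact = comb(n, 20) * p**20 * (1 - p)**20          # ~ 0.1254
mu, sigma = n * p, sqrt(n * p * (1 - p))
approx = norm.cdf((20.5 - mu) / sigma) - norm.cdf((19.5 - mu) / sigma)
print(exact, approx)                               # ~ 0.1254 vs ~ 0.1256
```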

If 20% of the memory chips made in a certain plant are defective, what are the probabilities that in a lot of 100 randomly chosen for inspection
(a) at most 15 will be defective?
(b) exactly 15 will be defective?
µ = 100(.20) = 20, σ = √(100 · .20 · .80) = 4
(a) F((15.5 − 20)/4) = 0.1292
(b) F((15.5 − 20)/4) − F((14.5 − 20)/4) = 0.0454

Fit a normal distribution to the following distribution

and hence find the theoretical frequencies:

Class Freq

60-65 3

65-70 21

70-75 150

75-80 335

80-85 336

85-90 135

90-95 26

95-100 4

-------------

1000

Rayleigh Distribution
The Weibull density is f(x) = αβx^(β−1) e^(−αx^β), x > 0. The special case of Weibull with α = 1/(2σ²) and β = 2 is the Rayleigh distribution; its failure rate is linear in x.
If the pdf of a continuous RV X is f(x) = (x/σ²) e^(−x²/(2σ²)), 0 < x < ∞,
then X follows a Rayleigh distribution with parameter σ.
The envelope of a randomly received signal can usually be modeled as a Rayleigh distribution.

Let X have a gamma distribution with λ = 1/2 and k = r/2, where r is a positive integer.
The pdf of X is
f(x) = x^(r/2 − 1) e^(−x/2) / (Γ(r/2) 2^(r/2)), 0 ≤ x < ∞
     = 0, x < 0
Then X is said to follow a chi-square (χ²) distribution with r degrees of freedom.
E(X) = k/λ = (r/2)/(1/2) = r
Var(X) = k/λ² = (r/2)/(1/4) = 2r
The mean equals the number of degrees of freedom and the variance equals twice the number of degrees of freedom.
M_X(t) = (1 − 2t)^(−r/2), t < 1/2

df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995

1 .00004 .00016 .00098 .0039 .0158 2.71 3.84 5.02 6.63 7.88

2 .0100 .0201 .0506 .1026 .2107 4.61 5.99 7.38 9.21 10.60

3 .0717 .115 .216 .352 .584 6.25 7.81 9.35 11.34 12.84

4 .207 .297 .484 .711 1.064 7.78 9.49 11.14 13.28 14.86

5 .412 .554 .831 1.15 1.61 9.24 11.07 12.83 15.09 16.75

6 .676 .872 1.24 1.64 2.20 10.64 12.59 14.45 16.81 18.55

7 .989 1.24 1.69 2.17 2.83 12.02 14.07 16.01 18.48 20.28

8 1.34 1.65 2.18 2.73 3.49 13.36 15.51 17.53 20.09 21.96

9 1.73 2.09 2.70 3.33 4.17 14.68 16.92 19.02 21.67 23.59

10 2.16 2.56 3.25 3.94 4.87 15.99 18.31 20.48 23.21 25.19

11 2.60 3.05 3.82 4.57 5.58 17.28 19.68 21.92 24.73 26.76

12 3.07 3.57 4.40 5.23 6.30 18.55 21.03 23.34 26.22 28.30

df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995

14 4.07 4.66 5.63 6.57 7.79 21.06 23.68 26.12 29.14 31.32

15 4.6 5.23 6.26 7.26 8.55 22.31 25 27.49 30.58 32.80

16 5.14 5.81 6.91 7.96 9.31 23.54 26.30 28.85 32.00 34.27

18 6.26 7.01 8.23 9.39 10.86 25.99 28.87 31.53 34.81 37.16

20 7.43 8.26 9.59 10.85 12.44 28.41 31.41 34.17 37.57 40.00

24 9.89 10.86 12.40 13.85 15.66 33.20 36.42 39.36 42.98 45.56

30 13.79 14.95 16.79 18.49 20.60 40.26 43.77 46.98 50.89 53.67

40 20.71 22.16 24.43 26.51 29.05 51.81 55.76 59.34 63.69 66.77

60 35.53 37.48 40.48 43.19 46.46 74.40 79.08 83.30 88.38 91.95

120 83.85 86.92 91.58 95.70 100.62 140.23 146.57 152.21 158.95 163.64

Let X be χ²(10). Find P(3.25 ≤ X ≤ 20.5).
Ans. 0.95
If X is χ²(5), determine the constants c and d so that P(c < X < d) = 0.95 and P(X < c) = 0.025.
Ans. c = 0.831, d = 12.83
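Both chi-square exercises can be checked with scipy's cdf and inverse cdf (our choice of tool):

```python
from scipy.stats import chi2

# P(3.25 <= X <= 20.5) for X ~ chi-square with 10 degrees of freedom
print(chi2(10).cdf(20.5) - chi2(10).cdf(3.25))   # ~ 0.95

# c and d for chi-square(5): P(X < c) = 0.025, P(X < d) = 0.975
print(chi2(5).ppf(0.025), chi2(5).ppf(0.975))    # ~ 0.831, 12.83
```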

Beta Distribution
The random variable X is said to have a beta distribution with positive parameters α & β if
f(x) = (1/B(α, β)) x^(α−1) (1 − x)^(β−1), 0 < x < 1
     = 0, otherwise
where B(α, β) = ∫_0^1 x^(α−1) (1 − x)^(β−1) dx
= 2 ∫_0^(π/2) (sin θ)^(2α−1) (cos θ)^(2β−1) dθ = Γ(α)Γ(β)/Γ(α + β)
When α = β = 1, the beta distribution is the uniform distribution on (0,1). For other values of α and β, the distribution takes a variety of shapes.
µ = α/(α + β), σ² = αβ / ((α + β)²(α + β + 1))

In a certain country, the proportion of highway sections requiring repairs in any given year is a random variable having the beta distribution with α = 3 and β = 2. Find
(a) on the average, what percentage of the highway sections require repairs in any given year;
(b) the probability that at most half of the highway sections will require repairs in any given year.
(a) µ = 3/(3 + 2) = 0.60, i.e. 60% of the highway sections require repairs in any given year.
(b) P(X ≤ 1/2) = ∫_0^(1/2) 12x²(1 − x) dx = 5/16
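Both answers follow directly from scipy's beta distribution (our choice of tool):

```python
from scipy.stats import beta

B = beta(3, 2)
print(B.mean())     # 0.6, i.e. 60% on average
print(B.cdf(0.5))   # 0.3125 = 5/16
```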

The lognormal distribution
If X = log T follows a normal distribution N(µ, σ), then T follows a lognormal distribution whose pdf is given by
f(t) = (1/(st√(2π))) exp[−(1/(2s²)) (log(t/t_m))²], t ≥ 0
The median time to failure t_m is the location parameter, given by log t_m = µ; s (= σ) is the shape parameter.
MTTF = E(T) = t_m exp(s²/2)
var(T) = σ_T² = t_m² exp(s²) [exp(s²) − 1]
F(t) = P(T ≤ t) = P{log T ≤ log t}
= P((log T − µ)/σ ≤ (log t − µ)/σ)
= P((log T − log t_m)/σ ≤ (1/s) log(t/t_m))
= P(Z ≤ (1/s) log(t/t_m))
From F(t) we can compute the reliability R(t) = 1 − F(t) and the failure rate λ(t).

Fatigue wearout of a component has a lognormal distribution with t_m = 5000 hours and s = 0.20.
(a) Compute the MTTF and SD.
(b) Find the reliability of the component for 3000 hours.
(c) Find the design life of the component for a reliability of 0.95.
(a) MTTF = 5000 e^(0.02) = 5101 hours; SD = 1030 hours
(b) R(3000) = ∫_3000^∞ f(t) dt = P(Z > (1/0.2) log(3000/5000))
= ∫_(−2.55)^∞ φ(z) dz = 0.9946
(c) t_R = t_m exp(s · z_0.05) = 5000 e^(0.2 × (−1.645)) = 3598.2 hours
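A sketch reproducing all three answers from the lognormal formulas above (scipy used only for the standard normal cdf/quantile; variable names are ours):

```python
from math import exp, log, sqrt
from scipy.stats import norm

tm, s = 5000, 0.20                               # median life, shape parameter
mttf = tm * exp(s**2 / 2)                        # ~ 5101 hours
sd = tm * exp(s**2 / 2) * sqrt(exp(s**2) - 1)    # ~ 1030 hours
r3000 = norm.sf(log(3000 / tm) / s)              # R(3000) = P(Z > -2.55) ~ 0.9946
design = tm * exp(s * norm.ppf(1 - 0.95))        # design life for R = 0.95 ~ 3598 h
print(mttf, sd, r3000, design)
```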

