
# Probability & Statistics

Rubayet Karim
Assistant Professor

Dept. of Industrial & Production Engineering
Jessore University of Science & Technology

Probability is a branch of mathematics that deals with
calculating the likelihood of a given event's occurrence,
which is expressed as a number between 0 and 1.

Classical definition:

P(E) = n(E) / N

Where:
P(E): probability of occurrence of event E
n(E): number of outcomes in E
N: total number of outcomes

Relative-frequency definition:

P(E) = n(E) / N

Where:
P(E): probability of occurrence of event E
n(E): number of trials in which E occurs
N: total number of trials
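The two definitions above can be sketched in code: the classical definition counts outcomes, while the relative-frequency definition estimates the same probability from repeated trials. This is a minimal illustration (the function names and the die example are assumptions, not from the slides):

```python
import random

# Classical definition: P(E) = n(E) / N for equally likely outcomes.
def classical_probability(event, sample_space):
    return len(event) / len(sample_space)

# Relative-frequency definition: estimate P(even) by repeated die rolls.
def relative_frequency(n_trials, seed=0):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials) if rng.randint(1, 6) % 2 == 0)
    return hits / n_trials

sample_space = {1, 2, 3, 4, 5, 6}
even = {2, 4, 6}
print(classical_probability(even, sample_space))  # → 0.5
print(relative_frequency(10_000))                 # close to 0.5
```

As the number of trials grows, the relative frequency converges to the classical value.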


Experiment and trial. Tossing 4 coins is an experiment; we can consider the act of tossing each coin as a trial, and thus say that there are 4 trials in the experiment of tossing 4 coins.

Single outcome (elementary event). In probability theory an elementary event (also called an atomic event or simple event) is an event which contains only a single outcome in the sample space.

Example: die rolling. The possible outcomes of this experiment are 1, 2, 3, 4, 5 and 6, and each of them on its own is an elementary event. When the objective is to get an even number from this experiment, the possible outcomes are 2, 4 and 6; this event contains more than a single outcome, so it is not an elementary event.
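The distinction between elementary and compound events reduces to counting the outcomes an event contains. A small sketch (the variable names are illustrative assumptions):

```python
sample_space = {1, 2, 3, 4, 5, 6}   # outcomes of rolling a die

roll_a_five = {5}        # a single outcome: an elementary event
roll_even = {2, 4, 6}    # several outcomes: not an elementary event

def is_elementary(event):
    # An elementary (atomic, simple) event contains exactly one outcome.
    return len(event) == 1

print(is_elementary(roll_a_five))  # → True
print(is_elementary(roll_even))    # → False
```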


Since the empty set {} contains no outcomes, P(∅) = 0.

Types of Probability. There are four types:

- Marginal probability P(X): the probability of X occurring.
- Union probability P(X ∪ Y): the probability of X or Y occurring.
- Joint probability P(X ∩ Y): the probability of X and Y occurring.
- Conditional probability P(X | Y): the probability of X occurring given that Y has occurred.

If X and Y are independent, then P(X | Y) = P(X) and P(Y | X) = P(Y). For any event A and its complement, P(A) + P(A′) = 1.

General law of addition: P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y).
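The four types of probability and the general law of addition can be computed from one small joint distribution table. The numbers below are hypothetical illustration values, not from the slides:

```python
# Hypothetical joint distribution over X in {x0, x1} and Y in {y0, y1}.
joint = {
    ("x0", "y0"): 0.20, ("x0", "y1"): 0.30,
    ("x1", "y0"): 0.25, ("x1", "y1"): 0.25,
}

# Marginal probabilities: sum the joint table over the other variable.
p_x0 = sum(p for (x, y), p in joint.items() if x == "x0")
p_y0 = sum(p for (x, y), p in joint.items() if y == "y0")

p_x0_and_y0 = joint[("x0", "y0")]                  # joint probability
p_x0_or_y0 = p_x0 + p_y0 - p_x0_and_y0             # general law of addition
p_x0_given_y0 = p_x0_and_y0 / p_y0                 # conditional probability

print(round(p_x0, 2), round(p_y0, 2))              # marginals
print(round(p_x0_or_y0, 2), round(p_x0_given_y0, 4))
```

Note that subtracting the joint term in the union avoids double-counting the outcomes where both X and Y occur.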


Special law of addition (for mutually exclusive events): P(X ∪ Y) = P(X) + P(Y).

- Example: for mutually exclusive events T and C, P(T ∪ C) = P(T) + P(C).

General law of multiplication: P(X ∩ Y) = P(X) P(Y | X) = P(Y) P(X | Y).

Special law of multiplication (for independent events): P(X | Y) = P(X) and P(Y | X) = P(Y), and therefore P(X ∩ Y) = P(X) P(Y).

- Example: for independent events S and M, P(S ∩ M) = P(S) P(M).

Law of conditional probability: P(X | Y) = P(X ∩ Y) / P(Y) = P(Y | X) P(X) / P(Y).
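The multiplication and conditional-probability laws can be checked numerically. The probabilities below are hypothetical illustration values, not from the slides:

```python
# Hypothetical values illustrating the laws of multiplication.
p_x = 0.5            # P(X)
p_y_given_x = 0.4    # P(Y | X)
p_x_and_y = p_x * p_y_given_x      # general law: P(X ∩ Y) = P(X) P(Y|X)

p_y = 0.45           # P(Y)
# X and Y are independent only if P(X ∩ Y) = P(X) P(Y).
independent = abs(p_x_and_y - p_x * p_y) < 1e-9

# Conditional probability recovered from the joint: P(X|Y) = P(X ∩ Y) / P(Y).
p_x_given_y = p_x_and_y / p_y

print(round(p_x_and_y, 2), independent, round(p_x_given_y, 4))
```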


Bayes' theorem: P(Xi | Y) = P(Y | Xi) P(Xi) / Σj P(Y | Xj) P(Xj), where the numerator is the joint probability P(Xi ∩ Y) and the denominator is the total probability P(Y).

Bayes' rule
- The events are mutually exclusive (i.e., they conflict with each other).
- Together they must form a sample space.
- Most of the time we use a reverse-time-order probability (i.e., P(cause | effect)).
- When a conditional probability declares reverse time order, it is called a posterior probability.
- P(Y | X): time order (effect given cause).
- P(X | Y): reverse time order (cause given effect).

For three mutually exclusive and exhaustive events X1, X2, X3:

P(X1 | Y) = P(Y | X1) P(X1) / [ P(Y | X1) P(X1) + P(Y | X2) P(X2) + P(Y | X3) P(X3) ]
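The formula above is easy to implement directly. A minimal sketch, with hypothetical priors and likelihoods chosen only for illustration:

```python
def bayes(priors, likelihoods, i):
    """P(X_i | Y) for mutually exclusive, exhaustive events X_1..X_n.

    priors[j]      = P(X_j)
    likelihoods[j] = P(Y | X_j)
    """
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(Y), total probability
    return priors[i] * likelihoods[i] / total

priors = [0.5, 0.3, 0.2]        # hypothetical P(X1), P(X2), P(X3); must sum to 1
likelihoods = [0.1, 0.6, 0.3]   # hypothetical P(Y | Xi)
print(round(bayes(priors, likelihoods, 1), 4))  # → 0.6207
```

The denominator is the same for every hypothesis, so the posteriors over all X_i always sum to 1.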


Example: Marie is getting married tomorrow, at an outdoor ceremony in the desert. In recent years, it has rained only 5 days each year. Unfortunately, the weatherman has predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time. What is the probability that it will rain on the day of Marie's wedding?

Solution: The sample space is defined by two mutually exclusive events: it rains or it does not rain. Additionally, a third event occurs when the weatherman predicts rain. Notation for these events appears below.

- Event A1: It rains on Marie's wedding.
- Event A2: It does not rain on Marie's wedding.
- Event B: The weatherman predicts rain.

In terms of probabilities, we know the following:

- P(A1) = 5/365 = 0.0136985 [It rains 5 days out of the year.]
- P(A2) = 360/365 = 0.9863014 [It does not rain 360 days out of the year.]
- P(B | A1) = 0.9 [When it rains, the weatherman predicts rain 90% of the time.]
- P(B | A2) = 0.1 [When it does not rain, the weatherman predicts rain 10% of the time.]

We want to know P(A1 | B), the probability it will rain on the day of Marie's wedding, given a forecast for rain by the weatherman. The answer can be determined from Bayes' theorem:

P(A1 | B) = P(B | A1) P(A1) / [ P(B | A1) P(A1) + P(B | A2) P(A2) ]
          = (0.9 × 0.0136985) / (0.9 × 0.0136985 + 0.1 × 0.9863014)
          ≈ 0.111
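The wedding-day calculation can be reproduced in a few lines, using exactly the numbers given above:

```python
p_a1 = 5 / 365          # P(A1): it rains
p_a2 = 360 / 365        # P(A2): it does not rain
p_b_given_a1 = 0.9      # P(B | A1): forecast rain when it rains
p_b_given_a2 = 0.1      # P(B | A2): forecast rain when it doesn't

# Bayes' theorem: P(A1 | B) = P(B|A1) P(A1) / [P(B|A1) P(A1) + P(B|A2) P(A2)]
p_a1_given_b = (p_b_given_a1 * p_a1) / (p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2)
print(round(p_a1_given_b, 3))  # → 0.111
```

The small prior probability of rain (5/365) dominates the fairly accurate forecast, which is why the posterior stays near 11%.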

Note the somewhat unintuitive result: even when the weatherman predicts rain, it actually rains only about 11% of the time. Despite the weatherman's gloomy prediction, there is a good chance that Marie will not get rained on at her wedding.




P(X < 12.5) = 1 − P(X ≥ 12.5) = 1 − 0.1666 = 0.8333
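The complement rule used above can be checked with exact fractions. The slide's 0.1666 is assumed here to be 1/6 (the distribution parameters themselves are not shown on this slide):

```python
from fractions import Fraction

p_ge = Fraction(1, 6)   # assumed exact value behind the slide's P(X >= 12.5) = 0.1666
p_lt = 1 - p_ge         # complement rule: P(X < 12.5) = 1 - P(X >= 12.5)
print(p_lt, float(p_lt))  # 5/6 ≈ 0.8333
```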


Application
- The exponential distribution occurs naturally when describing the lengths of the inter-arrival times in a homogeneous Poisson process.
- Queuing theory: the service times of agents in a system (e.g., how long it takes for a bank teller to serve a customer) are often modeled as exponentially distributed variables.
- Reliability theory: because of the memoryless property of this distribution, it is well suited to model the constant hazard rate portion of the bathtub curve.
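The memoryless property mentioned above, P(X > s+t | X > s) = P(X > t), can be verified by simulation. The service rate below is an assumed illustration value (mean service time of 5 minutes), not from the slides:

```python
import random

# Memoryless check for X ~ Exponential(rate):
# P(X > s+t | X > s) should equal P(X > t).
rng = random.Random(42)
rate = 1 / 5                      # assumed: mean service time of 5 minutes
samples = [rng.expovariate(rate) for _ in range(200_000)]

s, t = 3.0, 4.0
survived_s = [x for x in samples if x > s]
lhs = sum(1 for x in survived_s if x > s + t) / len(survived_s)  # P(X > s+t | X > s)
rhs = sum(1 for x in samples if x > t) / len(samples)            # P(X > t)
print(round(lhs, 2), round(rhs, 2))  # the two estimates should nearly agree
```

This is exactly why a bank teller's remaining service time, under this model, does not depend on how long the customer has already been served.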


Central moments in terms of the raw moments μ′r(x) = E[X^r]:

- 1st moment of X: Mean. E[X] = μ′1(x)
- 1st & 2nd moments of X: Variance. E[X²] − (E[X])² = μ′2(x) − [μ′1(x)]²
- 1st, 2nd & 3rd moments of X: Skewness (3rd central moment). E[X³] − 3E[X]E[X²] + 2(E[X])³ = μ′3(x) − 3μ′1(x)μ′2(x) + 2[μ′1(x)]³
- 1st, 2nd, 3rd & 4th moments of X: Kurtosis (4th central moment). E[X⁴] − 4E[X]E[X³] + 6(E[X])²E[X²] − 3(E[X])⁴ = μ′4(x) − 4μ′1(x)μ′3(x) + 6[μ′1(x)]²μ′2(x) − 3[μ′1(x)]⁴
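These raw-to-central-moment identities can be checked on a concrete distribution. A fair die is used here as an assumed example (not from the slides); its symmetry makes the 3rd central moment exactly zero:

```python
# Raw moments mu'_r = E[X^r] for a fair die, then the central-moment
# combinations listed above.
faces = [1, 2, 3, 4, 5, 6]
mu = {r: sum(x**r for x in faces) / 6 for r in (1, 2, 3, 4)}

mean = mu[1]                                                     # E[X]
var = mu[2] - mu[1]**2                                           # 2nd central moment
third = mu[3] - 3*mu[1]*mu[2] + 2*mu[1]**3                       # 3rd central moment
fourth = mu[4] - 4*mu[1]*mu[3] + 6*mu[1]**2*mu[2] - 3*mu[1]**4   # 4th central moment

print(mean, round(var, 4), round(third, 4), round(fourth, 4))
# mean = 3.5, var = 35/12 ≈ 2.9167, third = 0 (symmetric), fourth = 707/48 ≈ 14.7292
```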

Example: The value of a piece of factory equipment after three years of use is 100(0.5)^X, where X is a random variable having moment generating function M_X(t) for t below a stated bound. Calculate the expected value of this piece of equipment after three years of use.

Solution: Let the value be Y = 100(0.5)^X. Then the expected value is

E[Y] = E[100(0.5)^X] = 100 E[(0.5)^X] = 100 E[e^{X ln 0.5}] = 100 M_X(ln 0.5) = 41.9060
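The key step is the identity E[a^X] = E[e^{X ln a}] = M_X(ln a). The slide does not show M_X itself (its value at ln 0.5 is what yields 41.9060), so the sketch below assumes X ~ Poisson(λ), whose MGF is M_X(t) = exp(λ(e^t − 1)), purely to illustrate the technique, and cross-checks the MGF value by simulation:

```python
import math
import random

lam = 2.0  # assumed Poisson parameter, for illustration only

# M_X(ln 0.5) = exp(lam * (e^{ln 0.5} - 1)) = exp(lam * (0.5 - 1))
mgf_at_ln_half = math.exp(lam * (0.5 - 1))
expected_value = 100 * mgf_at_ln_half   # E[100 * 0.5**X] = 100 * M_X(ln 0.5)

def poisson(lam, rng):
    # Knuth's algorithm for sampling a Poisson(lam) variate.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

# Monte-Carlo estimate of E[100 * 0.5**X] for comparison.
rng = random.Random(7)
est = sum(100 * 0.5 ** poisson(lam, rng) for _ in range(100_000)) / 100_000
print(round(expected_value, 2))  # → 36.79 (for this assumed lambda)
```

With the slide's own (unshown) MGF, the same identity produces the stated 41.9060.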

THE END .