INTRODUCTION:
• If an experiment is repeated under essentially homogeneous and similar conditions, we generally come across two types of situations: (i) deterministic or predictable phenomena, where the result is unique and certain, and (ii) unpredictable or probabilistic phenomena.
For example:
If the potential difference E between the two ends of a conductor and the resistance R are known, the current I flowing in the conductor is uniquely determined by Ohm's Law, viz., I = E/R.
There are phenomena [as covered by (ii) above] which do not lend themselves to a deterministic approach and are known as 'unpredictable' or 'probabilistic' phenomena.
For example:
I. In tossing a coin, one is not sure if a head or a tail will be obtained.
II. A salesman cannot predict with certainty the sales for the next year.
Basic Terminology
• Random experiment:
Random experiments are those experiments whose results depend on chance (the word 'random' may be taken as equivalent to 'depending on chance').
• Sample Space:
The set of all possible outcomes of a random experiment is
called the sample space (or universal set), and it is
denoted by S. An element in sample space S is called a
sample point.
Each outcome of a random experiment corresponds to a
sample point.
Trial and Event:
Consider an experiment which, though repeated under essentially identical conditions, does not give unique results (outcomes); the result may be any one of the possible outcomes.
The experiment is known as a trial and the outcomes are known as events or cases.
For example:
Tossing of a coin is a trial and getting head (H) or tail (T) is an event.
Similarly in tossing a coin the events head and tail are mutually
exclusive.
• Equally likely events:
Outcomes of a trial are said to be equally likely
if taking into consideration all the relevant
evidence, there is no reason to expect one in
preference to the others.
For example:
In tossing an unbiased or uniform coin, head
or tail are equally likely events.
In throwing an unbiased die, all the six faces
are equally likely to come.
• Independent events:
Several events are said to be independent if the happening (or non-happening) of an event is not affected by the supplementary knowledge concerning the occurrence of any number of the remaining events.
For example:
In tossing an unbiased coin the event of getting a head in the first
toss is independent of getting a head, in the second, third and
subsequent throws.
Addition theorem: For any two events A and B in a sample space S,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Proof: [Venn diagram: overlapping events A and B inside the sample space S]
A ∪ B = (A ∩ B̄) ∪ (A ∩ B) ∪ (Ā ∩ B), where the three events on the right are mutually exclusive, so
P(A ∪ B) = P(A ∩ B̄) + P(A ∩ B) + P(Ā ∩ B)    ....(by the axiom of additivity)
= [P(A) − P(A ∩ B)] + P(A ∩ B) + [P(B) − P(A ∩ B)]
= P(A) + P(B) − P(A ∩ B)
• Cor. 1. If the events A and B are mutually exclusive, then A ∩ B = ∅, so P(A ∩ B) = P(∅) = 0, and
P(A ∪ B) = P(A) + P(B)
Note also that P(B/B) = 1.
Multiplication theorem for independent events:
P(A1 ∩ A2 ∩ A3 ∩ …… ∩ An) = P(A1)·P(A2)·P(A3)·……·P(An)
Mutually exclusive (disjoint) events and
independent events:
• Two mutually disjoint events with positive
probabilities are always dependent (not
independent) events.
• Two independent events (both of which are
possible events), can not be mutually disjoint.
• If A and B are independent events, then
(i) A and B̄, (ii) Ā and B, (iii) Ā and B̄
are also independent.
• If A1, A2, A3, ……, An are independent events with respective probabilities of occurrence P(A1), P(A2), ……, P(An), find the probability of occurrence of at least one of them.
Proof:
By De Morgan's law, the complement of (A1 ∪ A2 ∪ …… ∪ An) is (Ā1 ∩ Ā2 ∩ …… ∩ Ān).
Hence the probability of happening of at least one of the events is given by
P(A1 ∪ A2 ∪ …… ∪ An) = 1 − P(Ā1 ∩ Ā2 ∩ …… ∩ Ān)
= 1 − P(Ā1)·P(Ā2)·……·P(Ān)    (since Ā1, Ā2, ……, Ān are also independent)
• P[happening of at least one of the events A1, A2, ……, An]
= 1 − P(none of the events A1, A2, ……, An happens)
or equivalently,
• P {none of the given events happens}
= 1 - P {at least one of them happens}.
Ex. A problem is given to three students A, B and C, whose respective chances of solving it are 1/2, 3/4 and 1/4. Find the probability that the problem is solved.
Given
P(A) = 1/2, P(Ā) = 1/2
P(B) = 3/4, P(B̄) = 1/4
P(C) = 1/4, P(C̄) = 3/4
P(problem is solved)
= P(at least one of them solves the problem)
= 1 − P(none of A, B and C solves the problem)
= 1 − P(Ā ∩ B̄ ∩ C̄)
= 1 − P(Ā)·P(B̄)·P(C̄)    (since A, B, C are independent)
= 1 − (1/2)·(1/4)·(3/4)
= 29/32
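The 29/32 above can be double-checked by brute-force enumeration. The following Python sketch (a check added here, not part of the original notes; the names are illustrative) multiplies out every solve/fail combination of the three independent students:

```python
from fractions import Fraction
from itertools import product

# Chances of A, B, C solving the problem (from the example above)
p = [Fraction(1, 2), Fraction(3, 4), Fraction(1, 4)]

# Enumerate every solve/fail combination; by independence, the
# probability of a combination is the product of the individual factors.
prob_solved = Fraction(0)
for outcome in product([True, False], repeat=3):
    weight = Fraction(1)
    for solves, pi in zip(outcome, p):
        weight *= pi if solves else 1 - pi
    if any(outcome):                  # at least one of them solves it
        prob_solved += weight

print(prob_solved)   # 29/32
```

The enumeration agrees with the complement shortcut 1 − P(Ā)P(B̄)P(C̄) used in the slide.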
For the same data,
P(at least two of them solve the problem)
= P(ABC̄ or AB̄C or ĀBC or ABC)
= P(ABC̄) + P(AB̄C) + P(ĀBC) + P(ABC)
= P(A)P(B)P(C̄) + P(A)P(B̄)P(C) + P(Ā)P(B)P(C) + P(A)P(B)P(C)
= 9/32 + 1/32 + 3/32 + 3/32
= 1/2
Ex. A person has undertaken a construction job. The probabilities are 0.65 that there will be a strike, 0.80 that the construction job will be completed on time if there is no strike, and 0.32 that the construction job will be completed on time if there is a strike. Determine the probability that the construction job will be completed on time.
Solution:
Let A be the event that the construction job is completed on time, and B the event that there is a strike. Then P(B) = 0.65 and P(B̄) = 0.35.
Conditional probability that the job is completed on time if there is a strike: P(A/B) = 0.32.
Conditional probability that the job is completed on time if there is no strike: P(A/B̄) = 0.80.
Since A = (A and B) or (A and B̄) = (A ∩ B) ∪ (A ∩ B̄), and these two events are mutually exclusive,
P(A) = P[(A ∩ B) ∪ (A ∩ B̄)]
= P(A ∩ B) + P(A ∩ B̄)
= P(B)·P(A/B) + P(B̄)·P(A/B̄)
= 0.65 × 0.32 + 0.35 × 0.80
= 0.208 + 0.280 = 0.488 ≈ 0.49
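The same total-probability computation, as a small Python sketch (variable names are illustrative, not part of the original notes):

```python
# Total probability: P(A) = P(B)P(A|B) + P(B')P(A|B')
p_strike = 0.65
p_no_strike = 1 - p_strike
p_on_time_given_strike = 0.32
p_on_time_given_no_strike = 0.80

p_on_time = (p_strike * p_on_time_given_strike
             + p_no_strike * p_on_time_given_no_strike)
print(round(p_on_time, 2))   # 0.49
```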
Bayes Theorem
The probability revision process is shown in the following diagram:
Prior probabilities → New information → Application of Bayes' theorem → Posterior probabilities
Theorem:
If B1, B2, B3, ……, Bn is a set of exhaustive and mutually exclusive events associated with a random experiment, and an event A can occur only together with B1 or B2 or B3 or …… or Bn, then the conditional probability P(Bi/A) of a specified event Bi, given that A has actually occurred, is
P(Bi/A) = P(Bi)·P(A/Bi) / Σ P(Bi)·P(A/Bi), summed over i = 1 to n
[Venn diagram: the sample space is partitioned into B1, B2, ……, Br, ……, Bn; the event A is split into the pieces AB1, AB2, ……, ABr, ……, ABn.]
Proof:
The outer circle represents the sample space, which consists of the n mutually exclusive and exhaustive cases Bi, i = 1, 2, ……, n; the inner circle represents the event A, which can occur with either B1 or B2 or B3 or …… or Bn. Therefore
P(A) = P(B1A or B2A or B3A or …… or BnA)
= P(B1A) + P(B2A) + P(B3A) + …… + P(BnA)
[B1A, B2A, ……, BnA are also mutually exclusive events.]
= P(B1)P(A/B1) + P(B2)P(A/B2) + …… + P(Bn)P(A/Bn)
= Σ P(Bi)P(A/Bi), summed over i = 1 to n
Again
P(ABi) = P(A)·P(Bi/A) and P(BiA) = P(Bi)·P(A/Bi), so
P(Bi/A) = P(Bi)·P(A/Bi) / Σ P(Bi)·P(A/Bi)
• Remarks:
a) The probabilities P(B1), P(B2), ……, P(Bn) are termed the 'a priori probabilities' because they exist before we gain any information from the experiment itself.
Ex. Let B1 be the event that a '1' is transmitted and B2 the event that a '0' is transmitted, with P(B1) = 0.4 and P(B2) = 0.6, and let A be the event that a '1' is received, with
P(A/B1) = 0.95, so P(Ā/B1) = 0.05, and
P(A/B2) = 0.10, so P(Ā/B2) = 0.90.
Therefore the probability that a '1' is received is
P(A) = P(B1A) + P(B2A)
= P(B1)P(A/B1) + P(B2)P(A/B2)
= 0.4 × 0.95 + 0.6 × 0.10
= 0.44,
so P(Ā) = 0.56.
• And the probability that a '0' was transmitted given that a '0' is received (by Bayes' theorem) is
P(B2/Ā) = P(B2)·P(Ā/B2) / P(Ā) = (0.6 × 0.90)/0.56 = 27/28
The computation can be arranged in a table:
List of M.E. and exhaustive events | Prior probabilities | Conditional probabilities of the new information (Ā) given each event | Joint probabilities | Posterior probabilities, using the basic relation of conditional probability
B1 | P(B1) = 0.4 | P(Ā/B1) = 0.05 | P(B1Ā) = P(B1)·P(Ā/B1) = 0.4 × 0.05 = 0.02 | 0.02/0.56 = 1/28
B2 | P(B2) = 0.6 | P(Ā/B2) = 0.90 | P(B2Ā) = P(B2)·P(Ā/B2) = 0.6 × 0.90 = 0.54 | 0.54/0.56 = 27/28
Total: P(Ā) = 0.02 + 0.54 = 0.56
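The table's prior → joint → posterior steps can be sketched in Python (an added check using exact fractions; the dictionary keys are illustrative):

```python
from fractions import Fraction

# Priors: '1' transmitted with prob 0.4, '0' with 0.6 (from the table)
prior = {"B1": Fraction(4, 10), "B2": Fraction(6, 10)}
# Conditional probabilities of the observed '0' given each event
likelihood = {"B1": Fraction(5, 100), "B2": Fraction(90, 100)}

# Joint probabilities and their total (the denominator of Bayes' theorem)
joint = {b: prior[b] * likelihood[b] for b in prior}
total = sum(joint.values())          # P(A-bar) = 0.56

posterior = {b: joint[b] / total for b in joint}
print(posterior["B2"])               # 27/28
```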
For an event A with equally likely cases,
P(A) = (No. of favourable cases for A)/(Total no. of cases) = n(A)/n(S)
P(Ā) = 1 − P(A)
0 ≤ P(A) ≤ 1
Addition theorem
(i) M.E. events: if A and B are two M.E. events,
P(A or B) = P(A ∪ B) = P(A) + P(B)
(ii) In general, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
For example, in drawing one card from a pack of 52, let A = ace and H = heart. Then
A = {ace of H, C, D, S}, n(A) = 4
H = {A, 2, 3, 4, 5, 6, 7, 8, 9, 10, J, Q, K of hearts}, n(H) = 13
n(A ∩ H) = 1
P(A ∪ H) = P(A) + P(H) − P(A ∩ H)
= 4/52 + 13/52 − 1/52 = 16/52
Multiplication theorem
For independent events, P(A ∩ B) = P(AB) = P(A)·P(B)
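The card example can be verified by enumerating the deck directly (a sketch added here; the suit/rank labels are illustrative):

```python
from fractions import Fraction
from itertools import product

# Build a 52-card deck as (rank, suit) pairs
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["H", "C", "D", "S"]
deck = [(r, s) for r, s in product(ranks, suits)]

aces = {c for c in deck if c[0] == "A"}       # n(A) = 4
hearts = {c for c in deck if c[1] == "H"}     # n(H) = 13

# P(A ∪ H) = |A ∪ H| / |S|
p_union = Fraction(len(aces | hearts), len(deck))
print(p_union)   # 4/13 (= 16/52)
```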
P(a ≤ X ≤ b) = ∫_a^b f(x) dx ..............(*)
Important Remark.
• In the case of a discrete random variable, the probability at a point, i.e., P(X = c), is not zero for some fixed c.
If X is continuous,
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b),
and P(X ≤ x) = ∫_{−∞}^x f(t) dt.
Properties of cdf F(x)
• F(x) is a non-decreasing function of x.
• F(-∞) = 0 and F(∞)=1
• If X is a discrete RV taking the values x1, x2,….,
where x1< x2 <……< xn-1 < xn <…., then
P(xn) = F(xn) - F(xn-1)
• If X is continuous RV, then
d/dx F(x)=f(x),
at all point where F(x) is differentiable.
Ex. Find the constant k such that the function
f(x) = kx², 0 < x < 3
     = 0,   otherwise
is a probability density function, and compute (i) P(1 < x < 2), (ii) P(X ≤ 2), and (iii) P(X > 2).
Sol.
If f(x) is a pdf, then
(i) f(x) ≥ 0 for 0 < x < 3
(ii) ∫_0^3 f(x) dx = 1
k ∫_0^3 x² dx = 1
k [x³/3]_0^3 = 1
9k = 1
k = 1/9
(a) P(1 < x < 2) = ∫_1^2 (x²/9) dx = (1/9)[x³/3]_1^2 = 7/27
(b) P(X ≤ 2) = ∫_0^2 (x²/9) dx = (1/9)[x³/3]_0^2 = 8/27
(c) P(X > 2) = ∫_2^3 (x²/9) dx = (1/9)[x³/3]_2^3 = 19/27
or P(X > 2) = 1 − P(X ≤ 2) = 1 − 8/27 = 19/27
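These three probabilities can be re-derived exactly from the antiderivative k·x³/3 (a Python sketch added as a check; the helper name `prob` is illustrative):

```python
from fractions import Fraction

def prob(lo, hi, k=Fraction(1, 9)):
    """P(lo < X < hi) for f(x) = k*x^2 on (0, 3); antiderivative is k*x^3/3."""
    F = lambda x: k * Fraction(x) ** 3 / 3
    return F(hi) - F(lo)

assert prob(0, 3) == 1    # total probability is 1, confirming k = 1/9
print(prob(1, 2))         # 7/27
print(prob(0, 2))         # 8/27
print(1 - prob(0, 2))     # 19/27
```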
Ex. Let the continuous random variable X have the probability density function
f(x) = 2/x³, 1 < x < ∞
     = 0,    otherwise
Find F(x).
For 1 < x < ∞,
F(x) = P(X ≤ x) = ∫_1^x (2/t³) dt = [−1/t²]_1^x = 1 − 1/x²
Hence
F(x) = 1 − 1/x², 1 < x < ∞
     = 0,        otherwise
• Ex. Find the value of k so that f(x) = kx(2 − x) may be a p.d.f. of a RV X for 0 ≤ x ≤ 2.
• Ex. Verify that the function
F(x) = 1 − e^(−x/4), x ≥ 0
     = 0,            x < 0
is a distribution function. Also, find the probabilities P(X ≤ 4), P(X ≥ 8) and P(4 ≤ X ≤ 8).
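For the second exercise, the requested probabilities follow directly from evaluating the given F(x) = 1 − e^(−x/4); a numeric sketch (added here, not a worked solution from the notes):

```python
import math

# The distribution function given in the exercise
def F(x):
    return 1 - math.exp(-x / 4) if x >= 0 else 0.0

p_le_4 = F(4)          # P(X <= 4) = 1 - e^(-1)
p_ge_8 = 1 - F(8)      # P(X >= 8) = e^(-2)
p_4_8 = F(8) - F(4)    # P(4 <= X <= 8) = e^(-1) - e^(-2)
print(round(p_le_4, 4), round(p_ge_8, 4), round(p_4_8, 4))
```

Numerically these come out to about 0.6321, 0.1353 and 0.2325 respectively.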
Ex. Since Σ p(x) = 1, summed over x ≥ 0,
0 + K + 2K + 2K + 3K + K² + 2K² + (7K² + K) = 1
10K² + 9K − 1 = 0
10K² + 10K − K − 1 = 0
(10K − 1)(K + 1) = 0, so K = 1/10 or −1
Since a probability cannot be negative, K = 1/10.
Ex. Given
f(x) = x·e^(−x²/2), x > 0
     = 0,           x ≤ 0
(a) show that f(x) is a pdf (of a continuous RV X);
(b) find its distribution function F(x).
(a) f(x) is a pdf:
(i) f(x) ≥ 0
(ii) ∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^0 0 dx + ∫_0^∞ x·e^(−x²/2) dx
Let t = x²/2, so dt = x dx; limits: x = 0 → t = 0, x = ∞ → t = ∞.
= ∫_0^∞ e^(−t) dt = [−e^(−t)]_0^∞ = 0 − (−1) = 1
(b) For x > 0,
F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt = ∫_0^{x²/2} e^(−t) dt = [−e^(−t)]_0^{x²/2} = 1 − e^(−x²/2)
Hence F(x) = 1 − e^(−x²/2) for x > 0, and F(x) = 0 for x ≤ 0.
Ex. A continuous RV X that can assume any value between x = 2 and x = 5 has a density function given by f(x) = k(1 + x). Find P(X < 4).
f(x) = k(1 + x), 2 ≤ x ≤ 5
     = 0,        elsewhere
∫_2^5 k(1 + x) dx = 1
k [x + x²/2]_2^5 = 1
k(35/2 − 4) = 1, so k = 2/27
P(X < 4) = (2/27) ∫_2^4 (1 + x) dx = (2/27)[x + x²/2]_2^4 = (2/27)(12 − 4) = 16/27
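Both the normalization and the final answer can be checked exactly from the antiderivative k·(x + x²/2) (a sketch added as a check):

```python
from fractions import Fraction

k = Fraction(2, 27)
# Antiderivative of k*(1 + x) is k*(x + x^2/2)
F = lambda x: k * (Fraction(x) + Fraction(x) ** 2 / 2)

assert F(5) - F(2) == 1   # total probability is 1, confirming k = 2/27
print(F(4) - F(2))        # P(X < 4) = 16/27
```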
Expectation (mean) and variance of RV X
INTRODUCTION:
• A discrete/continuous RV is no doubt completely described by its pmf/pdf or by its cumulative distribution function.
• For many purposes, however, this description is often considered to contain too many details.
• It is rather simpler and more convenient to describe a RV, or to characterize its distribution, by a few parameters, known as statistical measures, that are representative of the distribution. Of these, the expectation (mean) is the most important.
• Mathematical Expectation
If X is a discrete R. V. with p.m.f. P(X = xi) = pi, then the mathematical expectation of X, or arithmetic mean of X, denoted E(X), is defined by
E(X) = Σ pi·xi
If X is a continuous R. V. with p.d.f. f(x), then the mathematical expectation of X is defined by
E(X) = ∫_{−∞}^{∞} x·f(x) dx
• Properties of expected values:
• E(X + Y) = E(X) + E(Y), provided all the expectations exist.
• If X and Y are independent random variables
then E(XY) = E(X). E(Y).
• In general, E(XY) ≠ E(X)·E(Y).
• Ex. Find the expectation of the number shown on a die when thrown.
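This expectation is a direct application of E(X) = Σ pi·xi; a Python sketch added as a check:

```python
from fractions import Fraction

# E(X) = sum(x * p(x)) for a fair six-sided die, each face with p = 1/6
p = Fraction(1, 6)
expectation = sum(x * p for x in range(1, 7))
print(expectation)   # 7/2
```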
Subject: Probability Theory
Subject
code: MA-451
Discrete probability distribution
• Binomial distribution:
P(X = x) = C(n, x)·p^x·q^(n−x), x = 0, 1, 2, ……, n, where q = 1 − p.
• The two independent constants n and p in the distribution are known as the parameters of the distribution.
Mean = E(X) = np
Var(X) = npq
Example 1
The incidence of an occupational disease in an industry is such that the workers have a 20%
chance of suffering from it. What is the probability that out of 6 workers chosen at random,
four or more will suffer from the disease?
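One way to sketch this computation, summing the binomial pmf over x = 4, 5, 6 with n = 6 and p = 0.2 (an added check, not a solution from the notes):

```python
from fractions import Fraction
from math import comb

n, p = 6, Fraction(1, 5)   # 20% chance per worker
q = 1 - p

def binom_pmf(x):
    # P(X = x) = C(n, x) * p^x * q^(n - x)
    return comb(n, x) * p ** x * q ** (n - x)

p_at_least_4 = sum(binom_pmf(x) for x in range(4, n + 1))
print(p_at_least_4)        # 53/3125
```

As a decimal, 53/3125 = 0.01696.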
Poisson Distribution
Poisson distribution is a distribution related to probabilities of events which are
extremely rare, but which have a large number of independent opportunities
for occurrence.
• Mean = E(X) = λ.
• Var X = λ
A car-hire firm has two cars, which it hires out day by day. The number
of demands for a car on each day is distributed as a Poisson distribution
with a mean of 1.5. Calculate the proportion of days on which (i) neither
car is used, and (ii) the proportion of days on which some demand is
refused.
X: the number of demands for a car on each day
X ~ P(m), mean = m = 1.5
Probability of a day on which there are x demands for a car:
P(X = x) = e^(−m)·m^x / x! = e^(−1.5)·(1.5)^x / x! .......(*)
(i) Neither car is used (x = 0):
P(X = 0) = e^(−1.5) = 0.22313
(ii) The proportion of days on which some demand is refused (x > 2):
P(X > 2) = 1 − P(X ≤ 2)
= 1 − [P(X = 0) + P(X = 1) + P(X = 2)]
= 1 − e^(−1.5)·[1 + 1.5 + (1.5)²/2]
= 1 − e^(−1.5)·(3.625)
= 0.1912
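The two answers can be reproduced directly from the Poisson pmf (a Python sketch added as a check):

```python
import math

m = 1.5
# Poisson pmf: P(X = x) = e^(-m) * m^x / x!
pmf = lambda x: math.exp(-m) * m ** x / math.factorial(x)

p_idle = pmf(0)                              # neither car is used
p_refused = 1 - (pmf(0) + pmf(1) + pmf(2))   # demand exceeds the 2 cars
print(round(p_idle, 5), round(p_refused, 4)) # 0.22313 0.1912
```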
Six coins are tossed 6400 times. Using the Poisson distribution, what is
the approximate probability of getting six heads 10 times?
X: the number of times six heads are obtained
X ~ P(m)
n = 6400
p = probability of getting six heads in one toss of six coins = (1/2)^6 = 1/64
m = np = 6400/64 = 100
Probability of getting six heads x times:
P(X = x) = e^(−m)·m^x / x! = e^(−100)·100^x / x!
The approximate probability of getting six heads 10 times:
P(X = 10) = e^(−100)·100^10 / 10!
• Geometric distribution
•Suppose we have a series of independent trials or
repetitions and on each repetition or trial the
probability of success 'p' remains the same. Then the
probability that there are x failures preceding the first
success is given by qxp.
• Definition.
A random variable X is said to have a geometric
distribution if it assumes only non-negative values
and its probability mass function is given by
• P(X = x) = q^x·p, x = 0, 1, 2, ……; 0 < p ≤ 1, q = 1 − p.
• Remarks:
1. Since the various probabilities for x = 0, 1, 2, ...... are the successive terms of a geometric progression, hence the name geometric distribution.
Continuous probability distribution
f(x) = k, a ≤ x ≤ b
     = 0, otherwise
Since total probability is always unity, we have
∫_a^b f(x) dx = 1, i.e. k ∫_a^b dx = 1
i.e. k = 1/(b − a)
f(x) = 1/(b − a), a ≤ x ≤ b
     = 0,         otherwise
Remarks.
• a and b, (a < b) are the two parameters of the
uniform distribution on (a, b).
• The distribution is also known as rectangular
distribution, since the curve y = f(x) describes a
rectangle over the x-axis and between the ordinates
at x=a and x=b.
• The distribution function F(x) is given by
F(x) = 0,               if x < a
     = (x − a)/(b − a), a ≤ x < b
     = 1,               b ≤ x
• For a rectangular or uniform variate X in (−a, a), the p.d.f. is given by
f(x) = 1/(2a), −a < x < a
     = 0,      otherwise
Moments of Rectangular Distribution.
μ'_r = (1/(b − a)) · (b^(r+1) − a^(r+1))/(r + 1)
• In particular,
Mean = E(X) = μ'_1 = (a + b)/2
Variance = (b − a)²/12
• Ex. If X is uniformly distributed with mean 1 and variance 4/3, find P(X < 0).
For the exponential distribution with parameter λ,
F(x) = 1 − e^(−λx), x ≥ 0
     = 0,           otherwise
• Variance = 1/λ²
Ex. Show that the exponential distribution "lacks memory", i.e. if X has an exponential distribution, then for every constant a > 0, one has P(Y ≤ x / X > a) = P(X ≤ x) for all x, where Y = X − a.
Normal (Gaussian) distribution
[Normal curve: areas within ±1, ±2 and ±3 standard deviations of the mean]
Area between z = ±1 is 68.27%
Area between z = ±2 is 95.45%
Area between z = ±3 is 99.73%
Uses of normal distribution
(i) The normal distribution can be used to approximate the binomial and Poisson distributions.
(ii) It is used extensively in sampling theory. It helps to estimate parameters from statistics and to find confidence limits for the parameters.
(iii) It is widely used in testing statistical hypotheses and tests of significance, in which it is always assumed that the population from which the samples have been drawn has a normal distribution.
(iv) It serves as a guiding instrument in the analysis and interpretation of statistical data.
(v) It can be used for smoothing and graduating a distribution which is not normal, simply by fitting a normal curve to it.
• The average seasonal rainfall in a place is 16 inches with
an SD of 4 inches. What is the probability that the
rainfall in that place will be between 20 and 24 inches in
a year?
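Standardizing with z = (x − μ)/σ gives z = 1 and z = 2 for 20 and 24 inches; the answer is the area between them under the standard normal curve. A Python sketch added as a check (the standard normal CDF is expressed via the error function):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 16, 4
z1 = (20 - mu) / sigma   # = 1
z2 = (24 - mu) / sigma   # = 2
p = phi(z2) - phi(z1)
print(round(p, 4))       # 0.1359
```

This matches the tabulated areas: Φ(2) − Φ(1) ≈ 0.9772 − 0.8413 = 0.1359.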
Chi square distribution
If X1, X2, ……, Xn are independent standard normal random variables, then it is known that (X1² + X2² + …… + Xn²) follows a probability distribution called the chi-square distribution with n degrees of freedom.
Ex. The following data give the number of
aircraft accidents that occurred during the
various days of a week.
Day:              Mon  Tue  Wed  Thu  Fri  Sat
No. of accidents:  15   19   13   12   16   15
• Test whether the accidents are uniformly distributed over the week.
• (Given χ²_0.05(5) = 11.07)
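Under the hypothesis of uniformity each day expects 90/6 = 15 accidents, and the test statistic is χ² = Σ (O − E)²/E with 5 degrees of freedom. A Python sketch of the computation (added as a check, not part of the original notes):

```python
observed = [15, 19, 13, 12, 16, 15]        # Mon..Sat accident counts
expected = sum(observed) / len(observed)   # 15 under uniformity

# chi-square statistic: sum of (O - E)^2 / E over the six days
chi_sq = sum((o - expected) ** 2 / expected for o in observed)
critical = 11.07                           # given chi-square(0.05, 5 d.f.)

print(round(chi_sq, 4))                    # 2.0
print("accept uniformity" if chi_sq < critical else "reject uniformity")
```

Since 2.0 < 11.07, the hypothesis that the accidents are uniformly distributed over the week is not rejected.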
Marginal Probability Distribution
P(X = xi) = P{(X = xi and Y = y1) or (X = xi and Y = y2) or ……}
Similarly,
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx is called the marginal density of Y.