Anakha A Menon
Sneha Saji
Sets
• Example:
(i) Out of 80 students who attended a test in English and Mathematics, 50 passed Mathematics, 48 passed English, 30 passed both and 12 passed neither.
[Venn diagram: total 80; Mathematics only = 50 − 30 = 20, both = 30, English only = 48 − 30 = 18, neither = 12.]
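The counts in the diagram follow from inclusion-exclusion; a minimal Python sketch of the same arithmetic:

```python
# Inclusion-exclusion for the student example: |M ∪ E| = |M| + |E| − |M ∩ E|.
total, math_pass, eng_pass, both = 80, 50, 48, 30

passed_at_least_one = math_pass + eng_pass - both  # 68
neither = total - passed_at_least_one              # 12
math_only = math_pass - both                       # 20
eng_only = eng_pass - both                         # 18

print(math_only, eng_only, both, neither)  # 20 18 30 12
```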
Subset
• A set A is a subset of another set B if every element of A is also an element of B.
• i.e., if x ∈ A then x ∈ B.
• In other words, the set A is contained inside the set B.
• The subset relationship is denoted as A ⊂ B.
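The subset test can be checked directly in Python, using a small illustrative pair of sets:

```python
A = {1, 2}
B = {1, 2, 3}

# A is a subset of B iff every element of A is also an element of B.
print(A.issubset(B))           # True
print(A <= B)                  # operator form of the same check
print(all(x in B for x in A))  # the definition spelled out
```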
E.JansiDevi
Probability
Probability measures the likelihood of the outcomes of a random event.
• Axiom 1:
For any event A, P(A) ≥ 0
• Axiom 2:
The probability of the sample space S is P(S) = 1
• Axiom 3:
For mutually exclusive events A and B, P(A∪B) = P(A) + P(B)
THE LAW OF TOTAL
PROBABILITY
SHITAL PATIL
What is the law of total probability?
• The rule states that if the probability of an event is unknown, it can
be calculated using the known probabilities of several distinct
events.
• The probability of A can be written as a sum over an event B and its
complement. The total probability rule is: P(A) = P(A∩B) + P(A∩Bᶜ).
ASSUMPTIONS
1) Let there be n mutually exclusive and exhaustive events E₁, E₂, …, Eₙ:
P(Eᵢ ∩ Eⱼ) = 0 for i ≠ j; i, j = 1, 2, …, n
P(E₁ ∪ E₂ ∪ … ∪ Eₙ) = 1
As we know
• P(A|B) = P(A∩B) / P(B), which gives
• P(A∩B) = P(B) P(A|B)
Hence P(A) = Σᵢ P(Eᵢ) P(A|Eᵢ).
Law of total probability by Venn diagram
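As a sketch of the rule, with an assumed three-event partition and made-up probabilities:

```python
# Law of total probability over a hypothetical partition E1, E2, E3:
# P(A) = sum_i P(E_i) * P(A | E_i)
p_E = [0.5, 0.3, 0.2]          # P(E_i): mutually exclusive and exhaustive
p_A_given_E = [0.1, 0.4, 0.7]  # P(A | E_i), assumed for illustration

p_A = sum(pe * pa for pe, pa in zip(p_E, p_A_given_E))
print(round(p_A, 2))  # 0.05 + 0.12 + 0.14 = 0.31
```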
CONDITIONAL
PROBABILITY
BY
PRARTHANA KARMAKAR
M.SC. ACTUARIAL SCIENCE
UNIVERSITY OF MADRAS
DEFINITION
The probability that one event happens given that another event is already known
to have happened is called a conditional probability.
Suppose we know that event A has happened. Then the probability that event B
happens given that event A has happened is denoted by P(B|A) .
P(B|A) = P(A∩B) / P(A), if P(A) > 0
• EXAMPLE:
Let us consider a random experiment of drawing a card from a pack of cards.
Then the probability of happening of the event A: "The card drawn is a king", is
given by:
P(A) = 4/52 = 1/13.
Now suppose that a card is drawn and we are informed that the drawn card is
red. How does this information affect the likelihood of the event A?
Obviously, if the event B: 'The card drawn is red', has happened, the event
'Black card' is not possible. Hence the probability of the event A must be
computed relative to the new sample space 'B' which consists of 26 sample
points (red cards only), i.e., n(B) = 26. Among these 26 red cards, there are two
(red) kings so that n(A∩B) = 2. Hence, the required probability is given by:
P(A|B) = n(A∩B)/n(B) = 2/26 = 1/13.
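The same answer can be obtained by counting over an explicit deck in Python (the rank and suit labels are just illustrative):

```python
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]  # hearts/diamonds are red
deck = list(product(ranks, suits))                 # 52 cards

kings = [c for c in deck if c[0] == "K"]
reds = [c for c in deck if c[1] in ("hearts", "diamonds")]
red_kings = [c for c in kings if c in reds]

p_A = len(kings) / len(deck)              # P(king) = 4/52
p_A_given_B = len(red_kings) / len(reds)  # P(king | red) = 2/26
print(p_A, p_A_given_B)  # both equal 1/13
```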
PROPERTIES:
• For two events A and B,
P(A∩B)= P(A). P(B|A), P(A)>0
= P(B). P(A|B), P(B)>0
where P(B|A) represents conditional probability of occurrences of B when the event
A has already happened.
• If A and B are two independent events then,
P(A∩B)= P(A). P(B)
• If an event A is independent of B, then
P(A|B) = P(A),
and P(B|A) = P(B) when B is independent of A.
BAYES THEOREM.
- Venmani. J
P(A|B) = P(B|A) P(A) / P(B).
* Here, we update the a priori (prior) probabilities based on the given information
by calculating revised (posterior) probabilities.
P(Eᵢ|E) = P(Eᵢ) P(E|Eᵢ) / Σⱼ P(Eⱼ) P(E|Eⱼ), i = 1, 2, 3, …, n
[Venn diagrams: disjoint and arbitrary events Eᵢ partitioning the sample space]
Bayes Theorem
Proof
Since E ⊂ ∪ᵢ Eᵢ, we have
E = E ∩ (∪ᵢ Eᵢ) = ∪ᵢ (E ∩ Eᵢ)
P(E) = P(∪ᵢ (E ∩ Eᵢ))
= Σᵢ P(E ∩ Eᵢ)
P(E) = Σᵢ P(Eᵢ) P(E│Eᵢ)
Now, by definition of conditional probability,
P(Eᵢ|E) = P(Eᵢ ∩ E) / P(E)
= P(Eᵢ) P(E|Eᵢ) / P(E)
P(Eᵢ|E) = P(Eᵢ) P(E|Eᵢ) / Σⱼ P(Eⱼ) P(E|Eⱼ)
• The probabilities P(E₁), P(E₂), …… are termed "prior probabilities"
because they exist before we gain any information.
• The probabilities P(E|Eᵢ) are called "likelihoods" because they indicate how
likely the event E is to occur given each prior event Eᵢ.
• The probabilities P(Eᵢ|E) are called "posterior probabilities" because they are
determined after the results of the experiment are known.
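A small numeric sketch of the theorem, using a hypothetical two-machine factory (the priors and defect rates are invented for illustration):

```python
# Bayes' theorem: posterior ∝ prior × likelihood, normalised by P(E),
# where P(E) comes from the law of total probability.
priors = {"machine1": 0.6, "machine2": 0.4}       # P(E_i), assumed
likelihoods = {"machine1": 0.01, "machine2": 0.03}  # P(defect | E_i), assumed

p_defect = sum(priors[m] * likelihoods[m] for m in priors)  # 0.018

posteriors = {m: priors[m] * likelihoods[m] / p_defect for m in priors}
print(posteriors)  # machine1 ≈ 1/3, machine2 ≈ 2/3
```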
Joint Probability
- K. RENUGA
When a coin is tossed twice, what is the joint probability of getting a head on the
first toss and a tail on the second toss?
Let the event A be getting a head on the first toss,
i.e., P(A) = ½ = 0.5
And let the event B be getting a tail on the second toss,
i.e., P(B) = ½ = 0.5
Since the two tosses are independent, P(A∩B) = P(A) · P(B) = 0.5 × 0.5 = 0.25.
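The same result follows from enumerating the four equally likely outcomes:

```python
from itertools import product

# Sample space of two independent coin tosses: HH, HT, TH, TT.
sample_space = list(product("HT", repeat=2))

# Event: head on the first toss AND tail on the second toss.
favourable = [o for o in sample_space if o == ("H", "T")]
p_joint = len(favourable) / len(sample_space)
print(p_joint)  # 0.25 = P(A) * P(B)
```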
Definition
• A random variable is a function that assigns a number (a real value) to each element of
a sample space S of an experiment.
• It is a function with domain S (the sample space) and range within (−∞, +∞).
• Random variables are commonly labeled with letters like 'X', 'Y', etc.
Example:
Suppose we have a random experiment of flipping a coin.
Sample space = {H, T}
We assign a real number to each element of the sample space,
say head → 0, tail → 1.
The random variable X can take either of two values based
on the outcome of the experiment,
i.e., X ∈ {0, 1}.
Types of random variables
Random variables
Discrete Continuous
Example:
• Throwing a die: it can take only a finite number of outcomes {1, 2, 3, 4, 5, 6},
• number of children in a family,
• number of defective bulbs in a box of 10 bulbs, etc.
Example:
• height, weight of a person
• Amount of sugar in an orange
• The time required to run a mile.
DEFINITION :
The expectation of a random variable X, denoted by
E(X) or µₓ, serves as a measure of central tendency of the
probability distribution of X. If X assumes the values x₁, x₂, …,
with respective probabilities p₁, p₂, …, where Σᵢ pᵢ = 1, then
E(X) = Σᵢ xᵢ pᵢ, provided the sum is finite.
• The mathematical expression for computing the expected value of a discrete
random variable X with probability mass function (p.m.f.) f(x) is:
E(X) = Σₓ x f(x), (for a discrete r.v.)
Example: A die is thrown twice. Find E(X + Y) and E(XY).
Solution: Let X and Y denote respectively the number of points obtained in the first
and second throws. Then both of them take the values 1, 2, 3, 4, 5 and 6, each with
probability 1/6.
E(X) = (1 + 2 + 3 + 4 + 5 + 6) × 1/6 = 7/2.
Similarly, E(Y) = 7/2.
We require,
E(X + Y) = E(X) + E(Y) = 7,
and, since the throws are independent, E(XY) = E(X) E(Y) = 7/2 × 7/2 = 49/4 = 12¼.
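The two expectations can be verified exactly by enumerating all 36 outcomes:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
p = Fraction(1, 36)

e_sum = sum(Fraction(x + y) * p for x, y in outcomes)
e_prod = sum(Fraction(x * y) * p for x, y in outcomes)
print(e_sum, e_prod)  # 7 and 49/4
```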
CONDITIONAL EXPECTATION: SHARMILA K
Die roll | 1 | 2 | 3 | 4 | 5 | 6
A        | 0 | 1 | 0 | 1 | 0 | 1
B        | 0 | 1 | 1 | 0 | 1 | 0
The unconditional expectation of A is
E[A] = (0 + 1 + 0 + 1 + 0 + 1) / 6 = 1/2,
but the expectation of A conditional on B = 1 (i.e., conditional on the die roll being
2, 3, or 5) is
E[A | B = 1] = (1 + 0 + 0) / 3 = 1/3,
and the expectation of A conditional on B = 0 (i.e., conditional on the die roll being
1, 4, or 6) is
E[A | B = 0] = (0 + 1 + 1) / 3 = 2/3 .
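These three expectations can be computed directly from the table:

```python
# A and B as functions of the die roll, as in the table above.
A = {1: 0, 2: 1, 3: 0, 4: 1, 5: 0, 6: 1}
B = {1: 0, 2: 1, 3: 1, 4: 0, 5: 1, 6: 0}

rolls = list(A)
e_A = sum(A[r] for r in rolls) / 6  # unconditional expectation

cond1 = [r for r in rolls if B[r] == 1]  # rolls 2, 3, 5
cond0 = [r for r in rolls if B[r] == 0]  # rolls 1, 4, 6
e_A_given_B1 = sum(A[r] for r in cond1) / len(cond1)
e_A_given_B0 = sum(A[r] for r in cond0) / len(cond0)
print(e_A, e_A_given_B1, e_A_given_B0)  # 0.5, 1/3, 2/3
```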
BINOMIAL DISTRIBUTION:
If p and q are the probabilities of success and failure of any
event, then the probability of getting success x times in n trials is
given by,
P(X = x) = nCx p^x q^(n−x), x = 0, 1, 2, …, n
where n = number of trials,
p = probability of success in one trial,
q = 1 − p = probability of failure in one trial.
GRAPHICAL REPRESENTATION OF BINOMIAL
DISTRIBUTION
[Figure: probability mass functions of B(64, 0.487) and B(64, 0.532); x-axis 0–70, y-axis 0–0.14]
CONDITIONS FOR BINOMIAL DISTRIBUTION
BY
SUBASH.M
• The number of trials n is fixed.
• Each trial has only two possible outcomes: success or failure.
• The probability of success p is the same in every trial.
• The trials are independent of one another.
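A minimal sketch of the binomial p.m.f., using Python's `math.comb` for nCx:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p): nCx * p^x * (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Sanity check: the probabilities over x = 0..n sum to 1, e.g. n=10, p=0.3.
total = sum(binom_pmf(x, 10, 0.3) for x in range(11))
print(round(total, 10))  # 1.0
```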
POISSON DISTRIBUTION
DEFINITION:
A random variable X is said to follow a Poisson distribution if it assumes only non-
negative values and its probability mass function is given by
P(X = x) = e^(−λ) λ^x / x!, x = 0, 1, 2, …; λ > 0
= 0, otherwise.
Here λ is known as the parameter of the distribution.
We shall use the notation X ~ P(λ) to denote that X is a Poisson variate with parameter λ.
MOMENTS OF POISSON DISTRIBUTION:
Mean: µ₁′ = E(X)
= Σₓ x · e^(−λ) λ^x / x!
= λ e^(−λ) Σₓ λ^(x−1) / (x − 1)!
= λ e^(−λ) (1 + λ + λ²/2! + …)
= λ e^(−λ) e^λ
= λ
Hence the mean of the Poisson distribution is λ.
µ₂′ = E(X²) = E[X(X − 1)] + E(X)
= Σₓ x(x − 1) e^(−λ) λ^x / x! + λ
= λ² e^(−λ) Σₓ λ^(x−2) / (x − 2)! + λ
= λ² e^(−λ) e^λ + λ
= λ² + λ
Variance = µ₂′ − (µ₁′)²
= λ² + λ − λ²
= λ
Thus the mean and variance of the Poisson distribution are each equal to λ.
EXAMPLE:
3% of the electronic units manufactured by a company are defective. Find the probability that
in a sample of 200 units, fewer than 2 units are defective.
Solution:
The probability of a defective unit is p = 3/100 = 0.03.
Given n = 200,
Mean = λ = np = 200 × 0.03 = 6
P(X = x) = e^(−6) 6^x / x!
P(X < 2) = P(X = 0) + P(X = 1)
= e^(−6) + e^(−6) × 6
= 0.00248 + 0.01487
P(X < 2) ≈ 0.01735
The probability that fewer than 2 units are defective is approximately 0.0174.
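The worked example can be checked numerically:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam): e^(-lam) * lam^x / x!."""
    return exp(-lam) * lam**x / factorial(x)

lam = 200 * 0.03  # lambda = np = 6
p_less_than_2 = poisson_pmf(0, lam) + poisson_pmf(1, lam)
print(round(p_less_than_2, 5))  # 0.01735
```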
PROPERTIES:
The events are independent; no two events can occur at the same time.
Only the average number of successes in the given period of time is known.
The Poisson distribution arises as the limit of the binomial distribution when the number of trials n is indefinitely
large, p is small, and np = λ remains finite.
Mean = Variance = λ.
The standard deviation is always equal to the square root of the mean, √λ.
If the mean is large, then the Poisson distribution is approximately a normal
distribution.
DISTRIBUTION IN REAL LIFE:
Number of deaths from a disease (not in the form of an epidemic) such as heart attack
or cancer, or due to snake bite.
Number of suicides reported in a particular city.
Number of faulty blades in a packet of 100.
Number of defective items in a packet manufactured by a good concern.
Number of printing mistakes on each page of a book.
Number of cars passing a crossing per minute during the busy hours of a day.
The emission of radioactive (alpha) particles.
Number of telephone calls received at a particular telephone exchange in some unit of
time, or connections to wrong numbers in a telephone exchange.
Normal distribution
K.Hamsavarshini
Definition :
A random variable X is said to have a normal distribution with parameters µ
(called the "mean") and σ² (called the "variance") if its density function is given by the
probability law:
f(x; µ, σ) = (1 / (σ√(2π))) exp{−(x − µ)² / (2σ²)}, −∞ < x < ∞.
MGF OF NORMAL DISTRIBUTION : the m.g.f. is given by
Mₓ(t) = E(e^(tX)) = ∫ e^(tx) f(x) dx
Substituting z = (x − µ)/σ and completing the square in the exponent,
Mₓ(t) = exp{µt + σ²t²/2}.
Mean for normal distribution :
E(X) = ∫ x f(x) dx; substituting z = (x − µ)/σ, the term in z is an odd function and
integrates to zero, hence
mean = E(X) = µ.
Mean deviation about the mean :
M.D. = E|X − µ|
= (1/(σ√(2π))) ∫ |x − µ| exp{−(x − µ)²/(2σ²)} dx
= (σ/√(2π)) · 2 ∫₀^∞ z e^(−z²/2) dz   (since the integrand is an even function of z)
= σ √(2/π) ≈ (4/5)σ.
Example sum :
X is normally distributed with mean 30 and standard deviation 5. Find
1) P(26 ≤ X ≤ 40)
2) P(X ≥ 45)
1 SOLN : Here µ = 30, σ = 5.
When x = 26, z = (26 − 30)/5 = −0.8
And when x = 40, z = (40 − 30)/5 = 2
P(26 ≤ X ≤ 40)
= P(−0.8 ≤ z ≤ 2)
= P(0 ≤ z ≤ 0.8) + P(0 ≤ z ≤ 2)   (from symmetry & tables)
= 0.2881 + 0.4772 = 0.7653
2) P(X ≥ 45) :
z = (45 − 30)/5 = 3
P(X ≥ 45) = P(z ≥ 3)
= 0.5 − 0.49865 = 0.00135
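Both probabilities can be checked without tables via the error function (`norm_cdf` here is a small helper, not a library call):

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 30, 5  # X ~ N(30, 25), as in the example
p1 = norm_cdf(40, mu, sigma) - norm_cdf(26, mu, sigma)  # P(26 <= X <= 40)
p2 = 1 - norm_cdf(45, mu, sigma)                        # P(X >= 45)
print(round(p1, 4), round(p2, 5))  # ≈ 0.7654 and 0.00135 (four-figure tables give 0.7653)
```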
Applications and interpretation of normal distribution
There are statistical methods to empirically test the assumption of normality; see
normality tests.
• In biology, the logarithm of various variables tends to have a normal distribution, that
is, the variables tend to have a log-normal distribution (after
separation into male/female subpopulations), with examples including:
• Measures of size of living tissue (length, height, skin area, weight).
• Certain physiological measurements, such as the blood pressure of adult humans.
• In finance, in particular the Black-Scholes model, changes in the logarithm of
exchange rates, price indices, and stock market indices are assumed normal (these
variables behave like compound interest, not like simple interest).
• Measurement errors in physical experiments are often modeled by a normal
distribution. This use of a normal distribution does not imply that one is assuming the
measurement errors are normally distributed.
EXPONENTIAL
DISTRIBUTION
by
Manimekalai.I
DEFINITION
• The exponential distribution is a continuous distribution that is
commonly used to measure the expected time for an event to occur
• The continuous random variable, say X is said to have an
exponential distribution, if it has the following probability density
function
• f(x) = λe^(−λx) for x ≥ 0
= 0 for x < 0
• Mean of exponential distribution:
• The mean of the exponential distribution is calculated
using integration by parts:
E(X) = ∫₀^∞ x λe^(−λx) dx = 1/λ.
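As a numeric sketch of this result, a Riemann-sum approximation of the integral (λ = 2 is an arbitrary choice):

```python
from math import exp

# Approximate E(X) = ∫ x * lam * e^(-lam*x) dx over [0, 20], which
# integration by parts gives exactly as 1/lam.
lam = 2.0
dx = 1e-4
mean = sum(x * lam * exp(-lam * x) * dx
           for x in (i * dx for i in range(1, 200_000)))
print(round(mean, 3))  # ≈ 0.5 = 1/lam
```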