
Chapter 2 (Probability)

1. Define Sample Space & Events.


The set of all possible outcomes of an experiment is known as the sample space of the
experiment and is denoted by S. For example, if the experiment consists of flipping two
coins, then the sample space consists of the following four points:
S = {(H,H), (H,T), (T,H), (T,T)}

Any subset of the sample space S is known as an event and is denoted by E. In the last
example, if E = {(H,H), (H,T)},

Then E is the event that a head appears on the first coin.

2. What are Independent & Dependent Events?


The definition of independence of two events is an important consequence of the
multiplication rule. Two events E and F are said to be independent if,
P(EF) = P(E)P(F)
From the equation of conditional probability we have,
P(E|F) = P(EF)/P(F)
so that for independent events,
P(E|F) = P(E)

This also implies that P(F|E) = P(F). That is, E and F are independent if knowing that
F has occurred does not affect the probability that E occurs. Two events E and F that
are not independent are said to be dependent.
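The independence condition P(EF) = P(E)P(F) can be checked directly on a small sample space. Below is a quick Python sketch (not part of the original notes; the events E = "head on the first coin" and F = "head on the second coin" are chosen for illustration):

```python
from fractions import Fraction

# Sample space for flipping two fair coins; each outcome has probability 1/4.
S = [(a, b) for a in "HT" for b in "HT"]
p = Fraction(1, len(S))

E = [s for s in S if s[0] == "H"]   # head on the first coin
F = [s for s in S if s[1] == "H"]   # head on the second coin
EF = [s for s in S if s in E and s in F]

P_E, P_F, P_EF = len(E) * p, len(F) * p, len(EF) * p
print(P_EF == P_E * P_F)  # True: E and F are independent
```

Exact fractions avoid any floating-point rounding in the comparison.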

3. What is a Random Variable?


Suppose that to each point of a sample space we assign a number. Then we have a
function defined on the sample space. This function is called a random variable (discrete
or continuous). In other words, a random variable is a function that assigns a numeric
value to each outcome of a random process.
Suppose that a coin is tossed twice so that the sample space is S = {HH, HT, TH, TT}. Let
X represent the number of heads that can come up. With each sample point we can
associate a number for X as shown;

Sample point   HH  HT  TH  TT
X               2   1   1   0

Discrete random variable: A random variable that takes on either a finite or a countable
number of possible values is called discrete. Examples of discrete random variables are
dead/alive, dice outcomes, counts, etc. A discrete random variable is described by its
probability mass (distribution) function.

Continuous random variable: There also exist random variables that take on a continuum
of possible values. These are known as continuous random variables. Examples of
continuous random variables are blood pressure, weight, the speed of a car, and the real
numbers from 1 to 6. A continuous random variable is described by its probability
density function.

4. Describe the Probability functions.


A probability function maps the possible values of x to their respective probabilities of
occurrence; p(x) is a number from 0 to 1, and the total probability (area) is always 1.

For a discrete random variable X, we define the Probability mass function p(a) of X by
p(a) = P{X=a}

The probability mass function p(a) is positive for at most a countable number of values
of a. A discrete example, rolling a fair die, has the probability mass function (PMF)
given below;

x=1  p(x=1) = 1/6
x=2  p(x=2) = 1/6
x=3  p(x=3) = 1/6
x=4  p(x=4) = 1/6
x=5  p(x=5) = 1/6
x=6  p(x=6) = 1/6
Another way to describe a probability is the cumulative distribution function (or more
simply the distribution function). The function F(·) of the random variable X is defined
for any real number b, where −∞ < b < ∞, by
F(b) = P{X ≤ b}

F(b) denotes the probability that the random variable X takes on a value that is less
than or equal to b. Some properties of the cumulative distribution function (cdf) F are
given below;
1. F(b) is a non-decreasing function of b
2. lim(b→∞) F(b) = F(∞) = 1
3. lim(b→−∞) F(b) = F(−∞) = 0

x=1  p(x≤1) = 1/6
x=2  p(x≤2) = 2/6
x=3  p(x≤3) = 3/6
x=4  p(x≤4) = 4/6
x=5  p(x≤5) = 5/6
x=6  p(x≤6) = 6/6
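The die PMF and its CDF can be tabulated in a short Python sketch (illustrative only, not from the original notes):

```python
from fractions import Fraction

# PMF of a fair die: p(x) = 1/6 for x = 1..6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# CDF: F(b) = P{X <= b}, a non-decreasing step function.
def cdf(b):
    return sum(p for x, p in pmf.items() if x <= b)

print(cdf(3))        # 1/2, i.e. p(x<=3) = 3/6
print(cdf(6) == 1)   # True: F(b) reaches 1
print(cdf(0) == 0)   # True: F(b) is 0 below the support
```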

Another probability function is the joint distribution function. If the first feature has the
value x and the second feature has the value y, then the joint distribution function of
the joint random variable (x, y) is the probability P(x, y) that both x and y occur.

5. Define Expected Value and Variance of random variable.


Expected value: The expected value is just the average or mean (μ) of the random
variable x. It is also called the first moment of the distribution.

Formally, the expected value for the discrete case is;

E(x) = μ = Σ(all x) x p(x)

And for the continuous case;

E(x) = μ = ∫(all x) x p(x) dx
Variance: It is the second moment of the distribution about the mean. Let X be a random
variable with mean μ and variance σ²; σ is called the standard deviation.
Var(x) = σ² = E[(x − μ)²] = E[x²] − (E[x])² = E[x²] − μ²

Formally, the variance for the discrete case is;

Var(x) = Σ(all x) (x − μ)² p(x)

And for the continuous case;

Var(x) = ∫(all x) (x − μ)² p(x) dx

6. Proof: Show that Var(x) = E[x²] − μ²

Proof: Let E[X] = μ. Expanding the square and using the linearity of expectation,
Var(X) = E[(X − μ)²]
= E[X² − 2μX + μ²]
= E[X²] − 2μE[X] + μ²
= E[X²] − 2μ² + μ²
= E[X²] − μ²
[Ex-1] Calculate the conditional probability that rain occurred given that high barometric
pressure occurred, where high barometric pressure occurred on 160 of 200 days and rain
occurred on 20 of those high-pressure days.
Solution: Here,
P(high barometric pressure occurred, H) = 160/200 = 0.80
P(rain occurred, R) = 20/200 = 0.10 [so P(R and H) = 0.10]
P(R|H) = P(RH)/P(H) = 0.10/0.80 = 0.125

[Collected] Suppose we toss two fair dice. Let E1 denote the event that the sum of the
dice is six and F denote the event that the first die equals four. Is E1 independent of F?
What will happen if E2 be the event that the sum of the dice equals seven?
Solution: Here, for the first question, E1 is not independent of F. Because,
P(E1F) = P({(4,2)}) = 1/36
And, P(E1)P(F) = 5/36 × 6/36 = 5/216
P(E1F) ≠ P(E1)P(F)

Again, for the second question the answer is yes: E2 is independent of F. Because,
P(E2F) = P({(4,3)}) = 1/36
And, P(E2)P(F) = 6/36 × 6/36 = 1/36
P(E2F) = P(E2)P(F)
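Both answers can be verified by enumerating the 36 equally likely outcomes. A Python sketch (not part of the original notes):

```python
from fractions import Fraction

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, 36)

def prob(event):
    # Probability of an event, given as a predicate on outcomes.
    return sum(p for o in outcomes if event(o))

F = lambda o: o[0] == 4          # first die equals four
E1 = lambda o: sum(o) == 6       # sum of the dice is six
E2 = lambda o: sum(o) == 7       # sum of the dice is seven

print(prob(lambda o: E1(o) and F(o)) == prob(E1) * prob(F))  # False: dependent
print(prob(lambda o: E2(o) and F(o)) == prob(E2) * prob(F))  # True: independent
```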

[Ex-2: addition and multiplication rule] Suppose we toss two fair dice. What is the
probability of the event that the sum of the dice is three?
Solution: There are two ways;
P(first die 1 and second die 2) or P(first die 2 and second die 1)
= P(1) P(2) + P(2) P(1)
= (1/6 × 1/6) + (1/6 × 1/6)
= 1/36 + 1/36
= 1/18
[Ex-3] Suppose a company wants to run two new techniques/events for the company's
improvement. One is E, the explosion to fracture the strata within 100 meters, and the
other is R, the brine to recover the oil at a rate of 50 barrels per day. Assume that the
probability of the explosion fracturing the strata successfully is 0.8. If the explosion
occurs, then the probability of R occurring is 0.9, and if the explosion does not occur
successfully, then the probability of R occurring is 0.3.
a. What is the probability that explosion and brine are both successful or
both fail?
b. What is the probability of E, given the constraint that only one of two
test was successful?
c. What is the probability of E given that R was successful?
d. If one or more of the test was successful, what is the probability that the
other one was successful?
Solution: Given that,
P(E) = 0.8
P(R|E) = 0.9
P(R|e) = 0.3
The joint events are;
P(ER) = P(E)P(R|E) = 0.8 x 0.9 = 0.72
P(Er) = P(E)P(r|E) = 0.8 x 0.1 = 0.08
P(eR) = P(e)P(R|e) = 0.2 x 0.3 = 0.06
P(er) = P(e)P(r|e) = 0.2 x 0.7 = 0.14
Now, P(R) = P(ER) + P(eR) = 0.72 + 0.06 = 0.78
P(r) = 0.22

a. P(ER) or P(er) = 0.72 + 0.14 = 0.86


b. P(E|only one suc.) = P(E and only one suc.) / P(only one suc.)
= P(Er)/[P(Er) + P(eR)] = 0.08/0.14
= 0.571
c. P(E|R) = P(ER)/P(R) = 0.72 / 0.78 = 0.923
d. P(both suc. | one or more suc.) = P(ER)/P(one or more suc.)
= P(ER)/[P(ER) + P(Er) + P(eR)] = 0.72/0.86 ≈ 0.84
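The arithmetic for all four parts follows mechanically from the four joint probabilities. A Python sketch of the computation (illustrative, not from the original notes):

```python
# Given quantities from Ex-3.
P_E = 0.8            # explosion succeeds
P_R_given_E = 0.9    # brine succeeds given explosion succeeded
P_R_given_e = 0.3    # brine succeeds given explosion failed

# Joint probabilities via the multiplication rule.
P_ER = P_E * P_R_given_E              # both succeed
P_Er = P_E * (1 - P_R_given_E)        # explosion only
P_eR = (1 - P_E) * P_R_given_e        # brine only
P_er = (1 - P_E) * (1 - P_R_given_e)  # both fail

print(round(P_ER + P_er, 2))                  # a. 0.86
print(round(P_Er / (P_Er + P_eR), 3))         # b. 0.571
print(round(P_ER / (P_ER + P_eR), 3))         # c. 0.923
print(round(P_ER / (P_ER + P_Er + P_eR), 3))  # d. 0.837
```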
[Ex-4] Suppose that our experiment consists of tossing two fair coins. Letting Y denote
the number of heads appearing, what is the probability mass function/distribution
function?
Solution: Here, Y is a random variable taking the values 0, 1, 2
P{Y=0} = P{(T,T)} = ¼
P{Y=1} = P{(T,H),(H,T)} = 2/4 = ½
P{Y=2} = P{(H,H)} = ¼
Of course, P{Y=0} + P{Y=1} + P{Y=2} = 1.

[Collected] Letting X denote the random variable that is defined as the sum of two fair
dice; what is the probability mass function?
Solution: Here, X = 2, 3, 4, …,11, 12
P{X=2} = P{(1,1)} = 1/36
P{X=3} = P{(1,2), (2,1)} = 2/36
……………………………………………………………
P{X=11} = P{(5,6), (6,5)} = 2/36
P{X=12} = P{(6,6)} = 1/36
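The whole PMF of the sum can be generated by enumeration rather than written out case by case. A Python sketch (not part of the original notes):

```python
from collections import Counter
from fractions import Fraction

# PMF of X = sum of two fair dice, from all 36 equally likely outcomes.
counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
pmf = {s: Fraction(c, 36) for s, c in counts.items()}

print(pmf[2])                  # 1/36
print(pmf[7])                  # 1/6, the most likely sum
print(sum(pmf.values()) == 1)  # True: probabilities sum to one
```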

[Collected] The number of patients seen in the hospital in any given hour is a random
variable represented by X. The probability distribution for x is:

x 10 11 12 13 14
P(x) .4 .2 .2 .1 .1

Find the probability that in a given hour:


a. Exactly 14 patients arrive [p(x = 14) = .1]
b. At least 12 patients arrive [p(x ≥ 12) = (.2 + .1 + .1) = .4]
c. At most 11 patients arrive [p(x ≤ 11) = (.2 + .4) = .6]
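The three queries are simple sums over the tabulated distribution. A Python sketch (illustrative only, not from the original notes):

```python
# Patient-count distribution from the table above.
pmf = {10: 0.4, 11: 0.2, 12: 0.2, 13: 0.1, 14: 0.1}

p_exactly_14 = pmf[14]
p_at_least_12 = sum(p for x, p in pmf.items() if x >= 12)
p_at_most_11 = sum(p for x, p in pmf.items() if x <= 11)

print(p_exactly_14)             # 0.1
print(round(p_at_least_12, 1))  # 0.4
print(round(p_at_most_11, 1))   # 0.6
```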

[Collected] Suppose that our experiment consists of tossing 2 fair coins. Show the
probability mass function and also show the cumulative distribution function for the
number of heads appearing.
Solution: Letting Y denote the number of heads appearing, then Y is a random variable
taking on one of the values 0, 1, 2 with respective probabilities.
P{Y=0} = P{(T,T)} = ¼
P{Y=1} = P{(T,H),(H,T)} = 2/4 = ½
P{Y=2} = P{(H,H)} = ¼
Of course, P{Y=0} + P{Y=1} + P{Y=2} = 1.

The probability mass function is;


p(x=0) = ¼
p(x=1) = ½
p(x=2) = ¼

The cumulative distribution function is;

F(x) = 0,     x < 0
F(x) = 1/4,   0 ≤ x < 1
F(x) = 3/4,   1 ≤ x < 2
F(x) = 1,     x ≥ 2

[Collected] Suppose the distribution function of X is given by

What is the probability mass function of X?


Solution: The probability mass function is,
P(0) = ½
P(1) = 1/10
P(2) = 1/5
P(3) = 1/10
P(3.5) = 1/10
[Ex-5] An anthropologist can identify a human skull as female with probability 0.6.
What is the probability that exactly four will be detected successfully within six
experimental events?
Solution: We know the probability mass function of a binomial random variable having
parameters (n, p) is given by
p(i) = nCi p^i (1 − p)^(n−i),   i = 0, 1, ..., n

Here, n = 6 and p = P(success) = 0.6

P(4) = 6C4 × (0.6)^4 × (0.4)^2
P(4) = 0.311

[Collected] In a taste test of Pepsi vs Coke, suppose 25% of tasters can correctly identify
which cola they are drinking. If 12 tasters participate in a test by drinking from 2 cups
in which 1 cup contains Coke and the other cup contains Pepsi, what is the probability
that exactly 5 tasters will correctly identify the colas?
Solution: Here x = 5 is a binomial random variable with parameters (n = 12, p = 0.25).
P{x = 5} = 12C5 (0.25)^5 (1 − 0.25)^(12−5)
P(x = 5) = 0.10324

[Collected] It is known that any item produced by a certain machine will be defective
with probability 0.1, independently of any other item. What is the probability that in a
sample of three items, at most one will be defective?
Solution: If X is the number of defective items in the sample, then X is a binomial
random variable with parameters (3, 0.1). Hence, the desired probability is given by
P{x ≤ 1} = P{x = 0} + P{x = 1}
= 3C0 (0.1)^0 (1 − 0.1)^3 + 3C1 (0.1)^1 (1 − 0.1)^2
= (1)(1)(0.729) + (3)(0.1)(0.81) = 0.972
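Both binomial examples reduce to the same PMF formula, which is easy to code with math.comb. A Python sketch (not from the original notes):

```python
from math import comb

def binom_pmf(i, n, p):
    # P{X = i} = nCi * p^i * (1 - p)^(n - i) for a binomial(n, p) random variable.
    return comb(n, i) * p**i * (1 - p)**(n - i)

# Skull example: n = 6, p = 0.6, exactly 4 successes.
print(round(binom_pmf(4, 6, 0.6), 3))  # 0.311
# Defective items: n = 3, p = 0.1, at most one defective.
print(round(binom_pmf(0, 3, 0.1) + binom_pmf(1, 3, 0.1), 3))  # 0.972
```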

[Collected] If you toss a coin 5 times, what’s the probability to get exactly 3 heads?
Solution: Here x = 3 is a binomial random variable with parameters (n = 5, p = 0.5).
P{x = 3} = 5C3 (0.5)^3 (1 − 0.5)^(5−3)
P(x = 3) = 10/32 = 5/16
[Collected] A coin is tossed 20 times, what’s the probability of getting 2 or fewer heads?
Solution: Let x be a binomial random variable with parameters (n = 20, p = 0.5).
P{x ≤ 2} = P{x = 0} + P{x = 1} + P{x = 2}
= 20C0 (0.5)^0 (1 − 0.5)^20 + 20C1 (0.5)^1 (1 − 0.5)^19 + 20C2 (0.5)^2 (1 − 0.5)^18
= (1 + 20 + 190)/2^20
≈ 2.0 × 10^−4

[Collected] If the number of accidents occurring on a highway each day is a Poisson


random variable with parameter λ = 3, what is the probability that no accidents occur
today?
Solution: Since we want the probability that no accidents occur today, x = 0:
p(0) = P{X = 0} = e^(−λ) λ^0/0!
p(0) = e^(−3) × 3^0/0! = 0.04978 ≈ 0.05

[Collected] Suppose that the number of typographical errors on a single page of this
book has a Poisson distribution with parameter λ = 1. Calculate the probability that
there is at least one error on this page.
Solution: First we have to determine the probability that there is no error on this page.
That is;
p(x = 0) = e^(−1) × 1^0/0! = 0.3678

Now the probability that there is at least one error on this page;
p(x ≥ 1) = 1 – p(x = 0)
p(x ≥ 1) = 1 - 0.3678 = 0.632

[Collected] Consider an experiment that consists of counting the number of α-particles


given off in a one-second interval by one gram of radioactive material. If on the average,
3.2 α-particles are given off, what is a good approximation to the probability that no
more than two α-particles will appear?

Solution: The number of α-particles given off will be a Poisson random variable with
parameter λ = 3.2. Hence the desired probability is;
p(x ≤ 2) = e^(−3.2) 3.2^0/0! + e^(−3.2) 3.2^1/1! + e^(−3.2) 3.2^2/2!
p(x ≤ 2) = 0.04076 + 0.1304 + 0.2087
p(x ≤ 2) = 0.37988 ≈ 0.38
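All three Poisson examples use the same PMF, sketched below in Python (illustrative, not from the original notes):

```python
from math import exp, factorial

def poisson_pmf(i, lam):
    # P{X = i} = e^(-lambda) * lambda^i / i!
    return exp(-lam) * lam**i / factorial(i)

print(round(poisson_pmf(0, 3), 2))      # 0.05: no accidents, lambda = 3
print(round(1 - poisson_pmf(0, 1), 3))  # 0.632: at least one typo, lambda = 1
print(round(sum(poisson_pmf(i, 3.2) for i in range(3)), 2))  # 0.38: at most 2 particles
```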
[Collected] Find the PDF and CDF for uniform random variable.
Solution: We know that a random variable is said to be uniformly distributed over the
interval (0, 1) if its probability density function is given by;

f(x) = 1,   0 < x < 1
f(x) = 0,   otherwise

In general, we say that X is a uniform random variable on the interval (α, β) if its
probability density function is given by;

f(x) = 1/(β − α),   α < x < β
f(x) = 0,           otherwise

The cumulative distribution function is easy to compute either by integration of the pdf
or by finding the area of rectangles.

F(x) = 0,                 x ≤ α
F(x) = (x − α)/(β − α),   α < x < β
F(x) = 1,                 x ≥ β

[Ex-6] If X is uniformly distributed over (0, 10), calculate the probability that
a. X < 3,
b. X > 7,
c. 0.2 < X < 0.7.
Solution:
a. P(x < 3) = ∫[0,3] dx/(10 − 0) = (3 − 0)/(10 − 0) = 3/10
b. P(x > 7) = ∫[7,10] dx/(10 − 0) = (10 − 7)/(10 − 0) = 3/10
c. P(0.2 < x < 0.7) = ∫[0.2,0.7] dx/(10 − 0) = (0.7 − 0.2)/(10 − 0) = 0.5/10 = 0.05
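For a uniform density the probability of an interval is just its length divided by β − α. A Python sketch (not part of the original notes; the helper name is made up):

```python
def uniform_prob(lo, hi, a=0.0, b=10.0):
    # P{lo <= X <= hi} for X uniform on (a, b): overlap length over (b - a).
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0.0) / (b - a)

print(uniform_prob(0, 3))                # 0.3
print(uniform_prob(7, 10))               # 0.3
print(round(uniform_prob(0.2, 0.7), 2))  # 0.05
```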

[Ex-7] Suppose that the lifetime of a particular type of radioactive atom in years has an
exponential density with B (beta) = 0.01. What is the probability that the atom will decay
within 50 years after 0 years.
Solution: Here,
P(0 ≤ x ≤ 50) = ∫[0,50] B e^(−Bx) dx
= [−e^(−Bx)] evaluated from 0 to 50
= −(e^(−0.5) − e^0) = 1 − e^(−0.5) = 0.39


[Collected] Survival times after lung transplant may roughly follow an exponential
function. Find the probability that a patient will die in the second year after surgery.
Solution: The patient dying in the second year means between years 1 and 2, and this
is an exponential density (with λ = 1). So,
P(1 ≤ x ≤ 2) = ∫[1,2] e^(−x) dx
= [−e^(−x)] evaluated from 1 to 2
= −(e^(−2) − e^(−1))
= 0.23
The probability that a patient will die in the second year after surgery (between years
1 and 2) is 23%.
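Both exponential examples are differences of the CDF P{X ≤ x} = 1 − e^(−λx). A Python sketch (illustrative only, not from the original notes):

```python
from math import exp

def expo_cdf(x, lam):
    # CDF of an exponential random variable: 1 - e^(-lambda * x) for x > 0.
    return 1 - exp(-lam * x) if x > 0 else 0.0

# Radioactive atom, lambda (B) = 0.01: decay within 50 years.
print(round(expo_cdf(50, 0.01), 2))               # 0.39
# Transplant survival, lambda = 1: death between years 1 and 2.
print(round(expo_cdf(2, 1) - expo_cdf(1, 1), 2))  # 0.23
```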

[Ex-8] Consider an electrical circuit in which the voltage is normally distributed with
mean 120 and standard deviation 3. What is the probability that the next reading will
be between 119 and 121.
Solution: We know from the standardizing transformation of the normal density,
z = (x − μ)/σ
So, writing C(·) for the standard normal cdf,
P(119 ≤ x ≤ 121) = C[(121 − 120)/3] − C[(119 − 120)/3]
= C(0.333) − C(−0.333)
= 0.631 − (1 − 0.631)
= 0.262
The value 0.631 was obtained by linear interpolation between C(0.33) and C(0.34).
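Instead of interpolating in a table, the standard normal cdf can be evaluated with the error function. A Python sketch (not from the original notes); the exact answer 0.261 agrees with the table-interpolated 0.262 to two decimals:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    # Standard result: Phi(z) = (1 + erf(z / sqrt(2))) / 2 with z = (x - mu) / sigma.
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Voltage reading between 119 and 121 with mu = 120, sigma = 3.
p = normal_cdf(121, 120, 3) - normal_cdf(119, 120, 3)
print(round(p, 3))  # 0.261
```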

[Ex-9] Suppose that we flip coin A, and if the outcome is a head (probability 0.6) we
flip coin B; but if the result of the first coin is a tail, we flip coin A again. What
are the joint probabilities if the probability of coin B coming up heads is 0.3?
Solution: Here,
P(H) = 0.6 and P(T) = 0.4 for coin A.
P(h) = 0.3 and P(t) = 0.7 for coin B.
The joint probabilities:
P(H,h) = 0.6 × 0.3 = 0.18
P(H,t) = 0.6 × 0.7 = 0.42
P(T,H) = 0.4 × 0.6 = 0.24
P(T,T) = 0.4 × 0.4 = 0.16
[Ex-10] Find the joint probability of the event J, 0.2 ≤ x ≤ 0.3 and 0.4 ≤ y ≤ 0.5, given
the probability density function;
P(x, y) = 6(1 − x − y) if 0 ≤ x, 0 ≤ y and x + y ≤ 1
P(x, y) = 0 otherwise

Solution: Here,
P(J) = ∫[0.2,0.3] [ ∫[0.4,0.5] 6(1 − x − y) dy ] dx
P(J) = 0.018

[Ex-11] Suppose that our experiment consists of tossing two fair coins. Letting Y denote
the number of heads appearing, what is the expected value?
Solution: Here, Y is a random variable taking the values 0, 1, 2
P{Y=0} = ¼, P{Y=1} = ½, P{Y=2} = ¼
E(Y) = Σ(all y) y p(y)
E(Y) = (0 × ¼) + (1 × ½) + (2 × ¼) = 1

[Collected] Find expected value E[X] where X is the outcome when we roll a fair die.
Solution: Here;
P(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6
E(x) = (1 + 2 + 3 + 4 + 5 + 6)1/6
E(x) = 7/2

[Collected] Consider the following probability distribution and find the expected value
for discrete case.

x 10 11 12 13 14
P(x) .4 .2 .2 .1 .1

Solution: For the discrete case we have;

E(x) = Σ(all x) x p(x)
E(x) = (10 × 0.4) + (11 × 0.2) + (12 × 0.2) + (13 × 0.1) + (14 × 0.1)
E(x) = 11.3
[Ex-12] Calculate E(x) of a random variable uniformly distributed over (α,β).
Solution: From the definition of expected value and a uniform random variable we have;
E(x) = ∫[α,β] x/(β − α) dx = (β² − α²)/(2(β − α)) = (α + β)/2
[Ex-13] Calculate E(x) of a random variable exponentially distributed with parameter λ.

Solution: From the definition of expected value and an exponential random variable we have;
E(x) = ∫[0,∞) x λe^(−λx) dx
Integrating by parts with u = x and dv = λe^(−λx) dx,
E(x) = [−x e^(−λx)] from 0 to ∞ + ∫[0,∞) e^(−λx) dx = 0 + 1/λ = 1/λ

[Ex-15] Suppose that our experiment consists of tossing two fair coins. Letting Y denote
the number of heads appearing, find the variance.
Solution: Here, Y is a random variable taking the values 0, 1, 2
P{Y=0} = ¼, P{Y=1} = ½, P{Y=2} = ¼
E(x) = Σ(all x) x p(x)
E(x) = (0 × ¼) + (1 × ½) + (2 × ¼) = 1

E(x²) = (0 × ¼) + (1 × ½) + (4 × ¼) = 1.5
Var(x) = E(x²) − [E(x)]² = 1.5 − 1 = ½

Another way, using the mean μ = E(x) = 1 directly:
Var(x) = Σ(all x) (x − μ)² p(x) = (0 − 1)²(¼) + (1 − 1)²(½) + (2 − 1)²(¼) = ½
[Collected] Suppose X has the following probability mass function:
p(0) = 0.2, p(1) = 0.5, p(2) = 0.3
Calculate E[x], E[x²] and (E[x])².
Solution: Letting Y = X², we have that Y is a random variable that can take on one of
the values 0², 1², 2² with respective probabilities;
p(0) = P{Y=0²} = 0.2,
p(1) = P{Y=1²} = 0.5,
p(4) = P{Y=2²} = 0.3

Firstly, E[X] = 0(0.2) + 1(0.5) + 2(0.3) = 1.1

Hence, E[X²] = E[Y] = 0(0.2) + 1(0.5) + 4(0.3) = 1.7
Then, (E[x])² = (1.1)² = 1.21

[Collected] Let X be uniformly distributed over (0,1). Calculate E[X3].


Solution: For a uniform (0, 1) random variable the expected value is;

E(x³) = ∫[0,1] x³ × 1/(1 − 0) dx
E(x³) = ∫[0,1] x³ dx
E(x³) = [x⁴/4] from 0 to 1
E(x³) = ¼

[Collected] Calculate Var(X) when X represents the outcome when a fair die is rolled.
Solution: Here;
P(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6
E(x) = Σ(all x) x p(x)
E(x) = (1 + 2 + 3 + 4 + 5 + 6) × 1/6
E(x) = 7/2

E[x²] = (1 + 4 + 9 + 16 + 25 + 36) × 1/6
E[x²] = 91/6
Now; Var[x] = E[X²] − (E[X])²   [valid for any random variable]
Var[x] = 91/6 − (7/2)²
Var[x] = 35/12

Another way, here the values of x are 1, 2, 3, 4, 5, 6 and the mean μ = 7/2 = 3.5
Var(x) = Σ(all x) (x − μ)² p(x)
= [ (1−3.5)² + (2−3.5)² + (3−3.5)² + (4−3.5)² + (5−3.5)² + (6−3.5)² ] × 1/6
= 35/12
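The agreement of the two computations can be checked exactly with rational arithmetic. A Python sketch (illustrative only, not from the original notes):

```python
from fractions import Fraction

# Fair-die PMF; compute Var(X) both ways and confirm they match.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())                    # 7/2
ex2 = sum(x**2 * p for x, p in pmf.items())                  # 91/6
var_shortcut = ex2 - mean**2                                 # E[X^2] - (E[X])^2
var_direct = sum((x - mean)**2 * p for x, p in pmf.items())  # definition

print(var_shortcut)                # 35/12
print(var_shortcut == var_direct)  # True
```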

[Ex-16] Calculate mean of a Poisson random variable.


Solution: A random variable X, taking on one of the values 0, 1, 2, ..., is said to be a
Poisson random variable with parameter λ if, for some λ > 0,
p(i) = P{X = i} = e^(−λ) λ^i/i!,   i = 0, 1, 2, ...

From the definition of expected value and the Poisson random variable we have;
E[X] = Σ(i ≥ 0) i e^(−λ) λ^i/i! = λ e^(−λ) Σ(i ≥ 1) λ^(i−1)/(i − 1)! = λ e^(−λ) e^λ = λ

where we used the identity Σ(k ≥ 0) λ^k/k! = e^λ.
[Ex-14] Find the variance of a uniform random variable with range (a, b).
Solution: From Ex-12, E(x) = (a + b)/2, and
E(x²) = ∫[a,b] x²/(b − a) dx = (b³ − a³)/(3(b − a)) = (a² + ab + b²)/3
Var(x) = E(x²) − [E(x)]² = (a² + ab + b²)/3 − (a + b)²/4 = (b − a)²/12

[Ex-17] Find the mean and variance for the standard normal distribution.
[Ex-18] Find the mean and variance for the normal distribution.
[Ex-19] Estimate the range of the uniform density for the sample values x = 2, 3, 5, 6, 8, 9, 11, 12.

[Ex-22/23] Find the maximum likelihood estimates for μ and σ² in the normal distribution.
End
