
Prelude

All of us have heard or used the words luck, risk, doubt, and randomness in a casual way. Probability is the backbone concept behind all of them, and despite its ubiquity we rarely notice that it is at work.

Join us in the ride to explore more on our next destination Probability.

Recall the Terms


Random Experiment

An Experiment is a process that can be repeated indefinitely, producing a set of possible outcomes that cannot be predicted in advance.
Each individual repetition of the experiment is termed a Trial.
Sample Space

The Sample Space is the set of all possible outcomes of the experiment.
It is denoted by S.
Event

An event is a set of outcomes of an experiment.

Probability is assigned to an event.
An event is a subset of the Sample Space.

Event \subseteq Sample\ Space

Probability
Definition

Probability is the likelihood or chance of an event occurring.

Formula

Let E denote an event and S denote the Sample Space.

Probability\ (P) = \dfrac{\text{No. of favorable outcomes}}{\text{Total no. of outcomes}} = \dfrac{n(E)}{n(S)}

Consider the classic example of tossing coins. Assume that a coin is tossed 3
times. Find the probability of getting at least one head.

Solution

Sample Space

Let S denote the Sample Space.

S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}

n(S)=8
Event

Let E denote the event of getting at least one head.


E = {HHH, HHT, HTH, THH, HTT, THT, TTH}

n(E)=7
Probability

P(\text{getting at least one head}) = \dfrac{n(E)}{n(S)} = \dfrac{7}{8} = 0.875
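The same result can be checked by brute force. The short Python sketch below (an illustrative assumption, not part of the original material) enumerates all 2^3 = 8 outcomes with itertools and counts those containing at least one head.

from itertools import product

# Sample space: every sequence of 3 tosses, each toss being 'H' or 'T'
sample_space = list(product('HT', repeat=3))            # n(S) = 8

# Event: outcomes containing at least one head
event = [seq for seq in sample_space if 'H' in seq]     # n(E) = 7

print(len(event) / len(sample_space))                   # 0.875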

Awestruck by the use of Statistics and Probability? Probability and Statistics play
a vital role in our daily lives, often without our knowledge.

Example

Have you ever checked the weather forecast in the morning to decide whether to
carry an umbrella?
Have you ever felt nervous about the toss at a cricket match?
Yeah! Probability plays a crucial role in all of these examples.

Properties of Probability
Property 1

The probability value lies between 0 and 1.

0 \leq Probability \leq 1

Property 2

An impossible event has a probability value of zero.

P(\emptyset) = 0

Property 3

The probability of the sure event is 1.

P(S) = 1

Property 4

The sum of the probabilities of an event and its complement is 1.

P(A) + P(\overline{A}) = 1
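For instance, in the three-coin example above, the complement of "at least one head" is "no heads at all" (the single outcome TTT), so

P(\text{at least one head}) = 1 - P(TTT) = 1 - \dfrac{1}{8} = \dfrac{7}{8}

which matches the value computed directly.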

Central Limit Theorem


What is the Central Limit Theorem?

The Central Limit Theorem states that as the sample size gets larger, the sampling
distribution of the sample means approaches a normal distribution.
Consider a population with

Mean \mu
Standard Deviation \sigma

Take a large number of random samples from the population by sampling with
replacement and visualize the distribution of the sample means; it turns out to
be approximately normal.

This holds regardless of whether the population itself is skewed or normal,
provided the sample size is sufficiently large (usually n > 30).
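As a quick illustration, the NumPy sketch below (an assumed example, not part of the original material) draws many samples of size 50 from a deliberately skewed population (an exponential distribution) and inspects the resulting sample means.

import numpy as np

rng = np.random.default_rng(0)
sample_size = 50            # comfortably above the usual n > 30 rule of thumb
num_samples = 10_000

# Each row is one random sample; take the mean of every sample
samples = rng.exponential(scale=2.0, size=(num_samples, sample_size))
sample_means = samples.mean(axis=1)

# The sample means cluster around the population mean (2.0) and their
# histogram looks approximately normal even though the population is skewed
print(sample_means.mean(), sample_means.std())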

Prelude
The world is all about numbers, so it is time to move on to those magic numbers. Let
us map the outcomes of the experiment to numbers.

Our next destination in the journey is Random Variables. Explore more to adore the
beauty of this place.

Random Variables
A random variable represents the outcomes of a statistical experiment as numerical
values.

It is represented by X.

Example:
Consider the experiment in which a single coin is tossed, giving either Heads or
Tails.

Let us assign the values

Heads = 0
Tails = 1

Let X be the random variable denoting the outcome of the coin flip.

X = \big\{0, 1\big\}
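As a tiny illustration (an assumed sketch, not part of the original material), the mapping above can be written directly in Python:

import random

# Map the physical outcomes of the coin flip to numerical values
outcome_to_value = {'Heads': 0, 'Tails': 1}

flip = random.choice(['Heads', 'Tails'])    # one trial of the experiment
x = outcome_to_value[flip]                  # the value taken by X, in {0, 1}
print(flip, x)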

Probability Distribution
Having explored the different forms of outcomes of an experiment, it is time to
look at how likely each outcome is to occur.

Probability Distribution

Definition

A mathematical function that describes the probability of occurrence of each value
that a random variable can assume in an experiment.

The Classes
Two main classes of Probability Distribution Functions are:

Probability Mass Function


Probability Density Function

PMF
Definition

A function that gives the probability that a discrete random variable is exactly
equal to some value.
Representation

Let X be a discrete random variable with range R_X = \big\{x_1, x_2, x_3, \dots\big\} (finite or countably infinite).

The function

P_X(x) = \begin{cases} P(X = x) & x \in R_X \\ 0 & \text{otherwise} \end{cases}
is the Probability Mass Function (PMF) of X.

Properties of PMF
Let X be a discrete random variable; its Probability Mass Function (PMF) P_X(x)
has the following properties.

0 \leq P_X(x) \leq 1 \;\; \forall\, x \in R_X (always non-negative)
\sum_{x \in R_X} P_X(x) = 1
P(X \in A) = \sum_{x \in A} P_X(x)
Fact
The Probability Mass Function attains its maximum value when the discrete random
variable takes its mode value.
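To make this concrete, the following Python sketch (an assumed example, not from the original material) builds the PMF of X = number of heads in three fair coin tosses by enumeration and checks the properties listed above.

from itertools import product

outcomes = list(product('HT', repeat=3))     # the 8 equally likely outcomes

pmf = {}
for seq in outcomes:
    k = seq.count('H')                       # value taken by X for this outcome
    pmf[k] = pmf.get(k, 0) + 1 / len(outcomes)

print(pmf)                                   # {3: 0.125, 2: 0.375, 1: 0.375, 0: 0.125}
print(sum(pmf.values()))                     # property: the probabilities sum to 1
print(max(pmf, key=pmf.get))                 # a mode of X (1 and 2 tie here), where the PMF peaks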

Probability Density Function

Definition

A statistical expression for a continuous random variable whose integral over an
interval gives the probability that the variable's value lies within that interval.

Representation

Let X be a continuous random variable; its probability density function (PDF)
f_X(x) satisfies

P(a \le X \le b) = \int_a^b f_X(x)\,dx

Properties of PDF
Let X be a continuous random variable; its Probability Density Function (PDF)
f_X(x) has the following properties.

f_X(x) \geq 0 \;\; \forall\, x \in R_X (always non-negative)
P\,[a \leq X \leq b] = \int_{a}^{b} f_X(x)\,dx
\int_{-\infty}^{\infty} f_X(x)\,dx = 1
F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt
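As a quick numerical check (an assumed example, not from the original material), take the valid density f_X(x) = 2x on [0, 1] and approximate the integrals above with a simple midpoint rule.

import numpy as np

def f(x):
    return 2 * x                             # f_X(x) = 2x on [0, 1], zero elsewhere

def integrate(func, a, b, n=100_000):
    # midpoint Riemann sum over n sub-intervals of [a, b]
    edges = np.linspace(a, b, n + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    return float(np.sum(func(mids)) * (b - a) / n)

print(integrate(f, 0.0, 1.0))                # total area ~ 1 (the PDF integrates to 1)
print(integrate(f, 0.25, 0.75))              # P(0.25 <= X <= 0.75) ~ 0.5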

Prelude
Let us try applying some statistical measures to the probability distribution
functions.

Expected Value - Discrete


The expected value is estimated by multiplying each possible outcome in the Sample
Space by its probability and summing all of those products.
It is also called the Mean or Average.
Discrete

Definition
Let X be a discrete random variable with range R_X = \{x_1, x_2, x_3, \dots\} (finite). The expected value of a discrete random variable X can be obtained as

E[X] = \sum_{x_k \in R_X} x_k \, P(X = x_k)

where k = 1, 2, 3, \dots

Notations
EX = E[X] = E(X) = \mu_X

Variance - Discrete
The Variance of any Discrete random variable X can be obtained by

Var(X) = \sum x^2\, P(X = x) - \big(\sum x\, P(X = x)\big)^2

Variance is a measure of spread for the distribution of a random variable; it
determines the degree to which the values of the random variable differ from the
expected value.
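Both formulas can be applied directly to a simple PMF. The sketch below (an assumed example, not from the original material) computes the expected value and variance of a fair six-sided die.

# PMF of a fair six-sided die
pmf = {x: 1 / 6 for x in range(1, 7)}

expected = sum(x * p for x, p in pmf.items())              # E[X] = 3.5
second_moment = sum(x ** 2 * p for x, p in pmf.items())    # E[X^2] ~ 15.1667
variance = second_moment - expected ** 2                   # Var(X) ~ 2.9167

print(expected, variance)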

Expected Value - Continuous


Definition
Let X be a continuous random variable with range (-\infty, \infty). The expected
value of a continuous random variable X can be obtained as

E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx

Variance - Continuous
The variance of any continuous random variable X can be obtained with the formula

Var(X) = \int_{-\infty}^{\infty} x^2\, f_X(x)\, dx - \Big(\int_{-\infty}^{\infty} x\, f_X(x)\, dx\Big)^2
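Continuing the earlier numeric sketch with the density f_X(x) = 2x on [0, 1] (an assumed example, not part of the original material), the two integrals can be approximated the same way:

import numpy as np

def f(x):
    return 2 * x

def integrate(func, a, b, n=100_000):
    edges = np.linspace(a, b, n + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    return float(np.sum(func(mids)) * (b - a) / n)

mean = integrate(lambda x: x * f(x), 0.0, 1.0)                # E[X] = 2/3
second_moment = integrate(lambda x: x ** 2 * f(x), 0.0, 1.0)  # E[X^2] = 1/2
variance = second_moment - mean ** 2                          # Var(X) = 1/18 ~ 0.0556

print(mean, variance)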
