
Ryan Absar Ilmi - EE

1806187581

Concept of Probability
A random variable is a variable whose value is unknown, or a function that assigns values to each of an experiment's outcomes. Random variables are most common in probability and statistics, where they are used to quantify the results of random occurrences. A random variable has a set of values, and any of those values could be the resulting outcome. For example, consider an experiment where a die is tossed three times: if X represents the number of times the number 2 comes up, then X is a discrete random variable. Continuous random variables can represent any value within a specified range or interval and can take on an infinite number of possible values. An example of a continuous random variable would be an experiment that involves measuring the amount of rainfall in a city over a year, or the average height of a random group of 25 kindergarten students.

Furthermore, some distributions of discrete random variables are important as models of scientific phenomena. The distributions that will be discussed in this review are:

- Bernoulli Trials
In the 1600s, Jakob Bernoulli spent much of his life studying probability. A Bernoulli trial is an experiment that must meet a few criteria: there are only two possible outcomes, each outcome has a fixed probability of occurring, and each trial is entirely independent of all the others. Three distributions built on Bernoulli trials are discussed below: the binomial, geometric, and negative binomial distributions.
- Binomial Distribution
A binomial distribution describes an experiment or trial that is repeated multiple times. Each trial can have only two possible outcomes; hence the "bi" in binomial means two. We can represent this explanation with the formula

b(x; n, P) = nCx · P^x · (1 − P)^(n − x)

Where:
b = binomial probability
x = total number of "successes" (pass or fail, heads or tails, etc.)
P = probability of success on an individual trial
n = number of trials
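
As a quick illustration of the formula above, here is a minimal Python sketch (the function name binomial_pmf and the example values are illustrative choices, not from the text):

    from math import comb

    def binomial_pmf(x, n, p):
        # b(x; n, P) = nCx * P^x * (1 - P)^(n - x)
        return comb(n, x) * p ** x * (1 - p) ** (n - x)

    # e.g. probability of rolling exactly two 2s in three tosses of a fair die
    print(binomial_pmf(2, 3, 1 / 6))  # roughly 0.069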

- Geometric Distribution
A geometric distribution experiment continues until either a success or a failure occurs, rather than for a set number of trials. A few characteristics determine a geometric distribution: the trials keep going until the first success occurs, so in theory the experiment could go on forever, and the probability of success p and of failure q is the same for each trial, with p + q = 1 and q = 1 − p. The probability of seeing exactly k failures before that first success is

P(X = k) = p · (1 − p)^k, k = 0, 1, 2, …

Where:
k = total number of failures
p = probability of success
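
A minimal Python sketch of this probability, following the convention above that k counts the failures before the first success (the function name geometric_pmf and the example values are illustrative):

    def geometric_pmf(k, p):
        # P(X = k) = p * (1 - p)^k: the chance of k failures before the first success
        return p * (1 - p) ** k

    # e.g. chance of exactly three failed rolls before the first 6 on a fair die
    print(geometric_pmf(3, 1 / 6))  # roughly 0.096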

- Negative Binomial Distribution


A negative binomial distribution consists of x repeated trials. Each trial can result in just two possible outcomes, a success and a failure. The experiment continues until a fixed number of successes is observed, where the number of successes, r, is specified in advance.

The negative binomial distribution is at times characterized in terms of the random variable Y = number of failures before the rth success. This definition is mathematically equivalent to the one given above in terms of X = the trial at which the rth success occurs, since Y = X − r.
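
The following Python sketch uses the X = trial of the rth success convention described above (the function name and the example values are assumptions for illustration):

    from math import comb

    def negative_binomial_pmf(x, r, p):
        # P(X = x) = C(x - 1, r - 1) * p^r * (1 - p)^(x - r):
        # the r-th success lands on trial x, so the first x - 1 trials hold r - 1 successes
        return comb(x - 1, r - 1) * p ** r * (1 - p) ** (x - r)

    # e.g. probability that the 2nd head appears on the 5th toss of a fair coin
    print(negative_binomial_pmf(5, 2, 0.5))  # 0.125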

- Multinomial Distribution
A Bernoulli experiment, like a binomial experiment, allows only two possible outcomes per trial. In a multinomial experiment, by contrast, each trial can have two or more possible outcomes. It can be represented as an equation as follows:

P = n! / (n1! · n2! · … · nk!) · p1^n1 · p2^n2 · … · pk^nk

where:
P is the probability of a particular combination of outcomes,
n is the total number of events,
n1, …, nk are the number of times Outcomes 1 through k occur, and
p1, …, pk are the probabilities of Outcomes 1 through k.
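
A minimal Python sketch of this equation, written with the standard library only (the function name multinomial_pmf and the dice example are illustrative assumptions):

    from math import factorial

    def multinomial_pmf(counts, probs):
        # counts: how many times each outcome occurred (n1, ..., nk)
        # probs:  probability of each outcome (p1, ..., pk), which should sum to 1
        n = sum(counts)
        coefficient = factorial(n)
        for c in counts:
            coefficient //= factorial(c)
        value = float(coefficient)
        for c, p in zip(counts, probs):
            value *= p ** c
        return value

    # e.g. probability of one 1, one 2, and one 3 in three rolls of a fair die
    print(multinomial_pmf([1, 1, 1, 0, 0, 0], [1 / 6] * 6))  # roughly 0.028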

- Poisson Distribution
A Poisson process is a model for a series of discrete events where the average time between events is known, but the exact timing of events is random. The Poisson distribution uses this model to find the probability of a number of events in a time period. Its probability mass function gives the probability of observing k events in a period given the average number of events per period, λ:

P(k events in the period) = (λ^k · e^(−λ)) / k!
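
A minimal Python sketch of this probability mass function (the rate value below is a made-up example, not from the text):

    from math import exp, factorial

    def poisson_pmf(k, lam):
        # P(k events in the period) = lam^k * e^(-lam) / k!
        return lam ** k * exp(-lam) / factorial(k)

    # e.g. probability of observing exactly 3 events when 5 are expected per period
    print(poisson_pmf(3, 5.0))  # roughly 0.14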

As explained previously, random variables can be discrete or continuous. A continuous distribution is a probability distribution in which the random variable X can take on any value (is continuous). Because there are infinitely many values that X could assume, the probability of X taking on any one specific value is zero. The continuous distributions that will be discussed in this review are:

- Uniform Distribution
A continuous distribution that places equal probability over a given range is called a uniform distribution. One of its most important applications is generating random numbers: many random number generators use the uniform distribution to produce random numbers on the (0, 1) interval. The general form can be written as a formula:

f(x) = 1 / (B − A) for A ≤ x ≤ B

where A is the location parameter and (B − A) is the scale parameter. The case where A = 0 and B = 1 is called the standard uniform distribution. The equation for the standard uniform distribution is

f(x) = 1 for 0 ≤ x ≤ 1
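
A minimal Python sketch of the uniform density and of drawing a standard-uniform random number (uniform_pdf is an illustrative helper name; random.random is the standard-library generator on [0, 1)):

    import random

    def uniform_pdf(x, a=0.0, b=1.0):
        # f(x) = 1 / (B - A) for A <= x <= B, and 0 outside the interval
        return 1.0 / (b - a) if a <= x <= b else 0.0

    # random.random() samples from the standard uniform distribution on [0, 1)
    sample = random.random()
    print(sample, uniform_pdf(sample))  # density is 1 everywhere on [0, 1]
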
- Gaussian/Normal Distribution
The Gaussian distribution is considered the most important probability distribution in statistics because it fits many natural phenomena. The Gaussian, or normal, distribution describes how the values of a variable are distributed. It is a symmetric distribution in which most of the observations cluster around the central peak and the probabilities for values further away from the mean taper off equally in both directions. All forms of the distribution share a few characteristics: they are all symmetric, so the normal distribution cannot model skewed data, and the mean, median, and mode are all equal. A few theorems and topics correspond to the normal distribution:
- Central Limit Theorem
- Probability Tabulation
- Multivariate Normal Distribution
- Sums of Normal Random Variables
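
To make the symmetry around the mean concrete, here is a minimal Python sketch of the normal density (the defaults of mean 0 and standard deviation 1 give the standard normal, chosen for illustration):

    from math import exp, pi, sqrt

    def normal_pdf(x, mu=0.0, sigma=1.0):
        # f(x) = exp(-(x - mu)^2 / (2 * sigma^2)) / (sigma * sqrt(2 * pi))
        return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

    # the density is symmetric: equal values at equal distances from the mean
    print(normal_pdf(-1.0), normal_pdf(1.0))  # both roughly 0.242
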
- Lognormal Distribution
A log-normal distribution is a statistical distribution of logarithmic values from
a related normal distribution. A log-normal distribution can be translated to a normal
distribution and vice versa using associated logarithmic calculations.
- Probability Tabulation
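
A minimal Python sketch of the translation described above, using the standard-library random.gauss for the normal draw (the mean 0 and standard deviation 1 are assumed parameters for illustration):

    from math import exp, log
    import random

    # if X is log-normal, then ln(X) is normal, so the two are linked by exp and log
    normal_draw = random.gauss(0.0, 1.0)    # draw from a normal distribution
    lognormal_draw = exp(normal_draw)       # exponentiating gives a log-normal value
    recovered_normal = log(lognormal_draw)  # taking the log recovers the normal value
    print(normal_draw, lognormal_draw, recovered_normal)
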
- Gamma and Related Distribution
A gamma distribution is a general type of statistical distribution that is related
to the beta distribution and arises naturally in processes for which the waiting times
between Poisson distributed events are relevant. Gamma distributions have two free
parameters, labeled alpha and theta.

- Exponential Distribution
- Chi-Squared Distribution
- Beta and Related Distribution
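
To make the two gamma parameters concrete, here is a minimal Python sketch of the gamma density with shape alpha and scale theta (parameter names follow the text; setting alpha = 1 gives the exponential distribution listed above as a special case):

    from math import exp, gamma

    def gamma_pdf(x, alpha, theta):
        # f(x) = x^(alpha - 1) * exp(-x / theta) / (gamma(alpha) * theta^alpha), for x > 0
        return x ** (alpha - 1) * exp(-x / theta) / (gamma(alpha) * theta ** alpha)

    # with alpha = 1 this reduces to the exponential density exp(-x / theta) / theta
    print(gamma_pdf(2.0, 1.0, 1.0))  # roughly 0.135, i.e. exp(-2)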

Reference:
- https://www.statisticshowto.com/probability-and-statistics/binomial-theorem/binomial-distribution-formula/
- http://www.math.wichita.edu/history/Topics/probability.html
- https://mathworld.wolfram.com/GeometricDistribution.html
- https://opentextbc.ca/introbusinessstatopenstax/chapter/geometric-distribution/
- http://www.math.ntu.edu.tw/~hchen/teaching/StatInference/notes/lecture16.pdf
- https://mathworld.wolfram.com/UniformDistribution.html
- https://www.itl.nist.gov/div898/handbook/eda/section3/eda3662.htm
