
ASYNCHRONOUS LECTURE ACTIVITY

MIDTERM AY 2022-2023
BIOSTATISTICS AND EPIDEMIOLOGY

REVIEW QUESTIONS. ANSWER THE FOLLOWING:

Instructions: Type your answers in Google Documents, save and submit in PDF form. Attach your
work in the link provided. Deadline: November 19, 2022.
ANSWERS:
1. Define the following:
a) PROBABILITY – A numerical measure, between 0 and 1, of the likelihood that a given
event will occur; an event that cannot occur has probability 0, and an event that is certain to
occur has probability 1.

b) OBJECTIVE PROBABILITY – In contrast to intuition or guessing, objective probability
refers to the possibilities or probabilities that an event will occur based on the examination of
certain measurements. Each measure is based on a factual observation, a recorded
observation, or a long history of data collection.

c) SUBJECTIVE PROBABILITY – This view holds that probability measures the confidence
that a particular individual has in the truth of a particular proposition. This concept does not
rely on the repeatability of any process. In fact, by applying this concept of probability, one
may evaluate the probability of an event that can only happen once, for example, the
probability that a cure for cancer will be discovered within the next 10 years. Although the
subjective view of probability has enjoyed increased attention over the years, it has not been
fully accepted by statisticians who have traditional orientations.

d) CLASSICAL PROBABILITY – The classical treatment of probability dates back to the
17th century and the work of two mathematicians, Pascal and Fermat. Much of this theory
developed out of attempts to solve problems related to games of chance, such as those
involving the rolling of dice. Examples from games of chance illustrate very well the
principle involved in classical probability. For example, if a fair six-sided die is rolled, the
probability that a 1 will be observed is equal to 1/6 and is the same for the other five faces.
If a card is picked at random from a well-shuffled deck of ordinary playing cards, the
probability of picking a heart is 13/52. Probabilities such as these are calculated by the
processes of abstract reasoning. It is not necessary to roll a die or draw a card to compute
these probabilities. In the rolling of the die, we say that each of the six sides is equally likely
to be observed if there is no reason to favor any one of the six sides. Similarly, if there is no
reason to favor the drawing of a particular card from a deck of cards, we say that each of the
52 cards is equally likely to be drawn.
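As a quick illustration, the two classical probabilities above can be computed by abstract reasoning alone, dividing favorable outcomes by equally likely outcomes (a minimal Python sketch):

```python
from fractions import Fraction

# Classical probability: favorable outcomes over equally likely total outcomes.
def classical_probability(favorable, total):
    return Fraction(favorable, total)

# A fair six-sided die: P(rolling a 1) = 1/6.
p_one = classical_probability(1, 6)

# A standard 52-card deck: P(drawing a heart) = 13/52, which reduces to 1/4.
p_heart = classical_probability(13, 52)

print(p_one)    # 1/6
print(p_heart)  # 1/4
```

Using exact fractions avoids floating-point rounding and mirrors how the probabilities are reasoned out on paper.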
e) THE RELATIVE FREQUENCY CONCEPT OF PROBABILITY – The relative
frequency approach to probability depends on the repeatability of some process and the
ability to count the number of repetitions, as well as the number of times that some event of
interest occurs. In this context we may define the probability of observing some
characteristic, E, of an event as follows: If some process is repeated a large number of times,
n, and if some resulting event with the characteristic E occurs m times, the relative frequency
of occurrence of E, m/n, will be approximately equal to the probability of E.
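The m/n idea can be demonstrated with a simulation; the sketch below (a fair coin toss is an assumed example, not from the text) shows the relative frequency approaching the true probability 0.5 as n grows:

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Repeat a process n times and count the occurrences m of event E:
# here, E = "heads" in a simulated fair coin toss.
n = 100_000
m = sum(random.random() < 0.5 for _ in range(n))

relative_frequency = m / n  # approximates P(E) = 0.5 for large n
print(relative_frequency)
```

For 100,000 repetitions the relative frequency lands very close to 0.5, exactly as the definition predicts.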

f) MUTUALLY EXCLUSIVE EVENTS – Two events are mutually exclusive in statistics
and probability theory if they cannot happen simultaneously. A coin toss is the most basic
illustration of two events that cannot coexist. The result of tossing a coin can be either
heads or tails, but the two outcomes cannot happen at the same time.

g) INDEPENDENCE – Independence is a fundamental notion in probability theory, as in
statistics and the theory of stochastic processes. Two events are independent, statistically
independent, or stochastically independent if, informally speaking, the occurrence of one
does not affect the probability of occurrence of the other or, equivalently, does not affect the
odds. Similarly, two random variables are independent if the realization of one does not
affect the probability distribution of the other.

h) MARGINAL PROBABILITY – Given some variable that can be broken down into m
categories designated by A1, A2, …, Ai, …, Am and another jointly occurring variable that is
broken down into n categories designated by B1, B2, …, Bj, …, Bn, the marginal probability of
Ai, P(Ai), is equal to the sum of the joint probabilities of Ai with all the categories of B. That
is, P(Ai) = ∑ P(Ai ∩ Bj), summed over all values of j.
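The summation can be made concrete with a small sketch; the joint probabilities below are assumed numbers for a hypothetical 2 × 3 cross-classification, not data from the course:

```python
# Hypothetical joint probabilities P(Ai ∩ Bj) for a 2x3 cross-classification.
joint = {
    ("A1", "B1"): 0.10, ("A1", "B2"): 0.15, ("A1", "B3"): 0.05,
    ("A2", "B1"): 0.20, ("A2", "B2"): 0.30, ("A2", "B3"): 0.20,
}

# Marginal probability: P(Ai) = sum over j of P(Ai ∩ Bj).
def marginal(a):
    return sum(p for (ai, bj), p in joint.items() if ai == a)

print(marginal("A1"))  # 0.10 + 0.15 + 0.05 = 0.30
print(marginal("A2"))  # 0.20 + 0.30 + 0.20 = 0.70
```

Note that the two marginals together sum to 1, since A1 and A2 exhaust the first variable's categories.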

i) JOINT PROBABILITY – Sometimes we want to find the probability that a subject picked
at random from a group of subjects possesses two characteristics at the same time. Such a
probability is referred to as a joint probability.
j) CONDITIONAL PROBABILITY – On occasion, the set of “all possible outcomes” may
constitute a subset of the total group. In other words, the size of the group of interest may be
reduced by conditions not applicable to the total group. When probabilities are calculated
with a subset of the total group as the denominator, the result is a conditional probability.

k) THE ADDITION RULE – The third property of probability given previously states that the
probability of the occurrence of either one or the other of two mutually exclusive events is
equal to the sum of their individual probabilities.

l) THE MULTIPLICATION RULE – A probability may be computed from other
probabilities. For example, a joint probability may be computed as the product of an
appropriate marginal probability and an appropriate conditional probability. This relationship
is known as the multiplication rule of probability.
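The rule P(A ∩ B) = P(A) · P(B | A) can be checked with a small sketch; the counts are hypothetical numbers chosen for illustration:

```python
# Multiplication rule: P(A ∩ B) = P(A) * P(B | A).
# Hypothetical numbers: of 100 subjects, 40 smoke (event A);
# of those 40 smokers, 10 have the disease (event B).
p_a = 40 / 100          # marginal probability P(A)
p_b_given_a = 10 / 40   # conditional probability P(B | A)

p_joint = p_a * p_b_given_a  # P(A ∩ B) = 10/100 = 0.1
print(p_joint)
```

The product recovers exactly the 10 subjects out of 100 who are both smokers and diseased, which is what the joint probability counts directly.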

m) COMPLEMENTARY EVENTS – We computed the probability that a person picked at
random from the 318 subjects will be an Early age of onset subject as P(E) = 141/318
= .4434. We found the probability of a Later age at onset to be P(L) = 177/318 = .5566. The
sum of these two probabilities is equal to 1. This is true because the events of being early
age at onset and being later age at onset are complementary events. In general, the
probability of an event A is equal to 1 minus the probability of its complementary event Ā:
P(Ā) = 1 − P(A).
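The figures quoted above can be verified directly (a minimal sketch using the source's 318-subject counts):

```python
from fractions import Fraction

# The source's figures: 141 of 318 subjects are Early age at onset (E),
# 177 of 318 are Later age at onset (L); E and L are complementary.
p_e = Fraction(141, 318)
p_l = Fraction(177, 318)

assert p_e + p_l == 1   # complementary probabilities sum to 1
assert 1 - p_e == p_l   # P(L) = 1 - P(E)

print(float(p_e), float(p_l))  # approximately 0.4434 and 0.5566
```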

n) FALSE POSITIVE – A false positive results when a test indicates a positive status when the
true status is negative.

o) FALSE NEGATIVE – A false negative results when a test indicates a negative status when
the true status is positive.

p) SENSITIVITY – The sensitivity of a test (or symptom) is the probability of a positive test
result (or presence of the symptom) given the presence of the disease.
q) SPECIFICITY – The specificity of a test (or symptom) is the probability of a negative test
result (or absence of the symptom) given the absence of the disease.

r) PREDICTIVE VALUE POSITIVE – The predictive value positive of a screening test (or
symptom) is the probability that a subject has the disease given that the subject has a positive
screening test result (or has the symptom).

s) PREDICTIVE VALUE NEGATIVE – The predictive value negative of a screening test (or
symptom) is the probability that a subject does not have the disease, given that the subject
has a negative screening test result (or does not have the symptom).
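Definitions n) through s) can all be read off a single 2 × 2 screening-test table; the counts below are assumed purely for illustration:

```python
# Hypothetical 2x2 screening-test table (all counts are assumed):
#                 Disease present   Disease absent
# Test positive         90                 40
# Test negative         10                860
true_pos, false_neg = 90, 10
false_pos, true_neg = 40, 860

sensitivity = true_pos / (true_pos + false_neg)           # P(T+ | D)
specificity = true_neg / (true_neg + false_pos)           # P(T- | no D)
false_negative_rate = false_neg / (true_pos + false_neg)  # complements sensitivity
false_positive_rate = false_pos / (false_pos + true_neg)  # complements specificity

print(sensitivity, specificity)
```

With these assumed counts the sensitivity works out to 90/100 = 0.90 and the specificity to 860/900; the false negative and false positive rates are their respective complements.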

t) BAYES' THEOREM – Bayes' theorem, named after 18th-century British mathematician
Thomas Bayes, is a mathematical formula for determining conditional probability.
Conditional probability is the likelihood of an outcome occurring, based on a previous
outcome having occurred in similar circumstances. Bayes' theorem provides a way to revise
existing predictions or theories (update probabilities) given new or additional evidence.
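A standard application ties Bayes' theorem to the screening-test definitions above: the predictive value positive can be computed from sensitivity, specificity, and disease prevalence. The three input values below are assumed for illustration:

```python
# Bayes' theorem applied to a screening test:
# P(D | T+) = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
sensitivity = 0.95   # P(T+ | D)     -- assumed value
specificity = 0.90   # P(T- | no D)  -- assumed value
prevalence = 0.01    # P(D)          -- assumed value

numerator = sensitivity * prevalence
denominator = numerator + (1 - specificity) * (1 - prevalence)
ppv = numerator / denominator  # predictive value positive, P(D | T+)

print(round(ppv, 4))
```

Even with a quite accurate test, the low assumed prevalence drags the predictive value positive below 9%, which is exactly the kind of revision of a prior belief that Bayes' theorem formalizes.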

2. Name and explain the three properties of probability

1) Given some process or experiment with n mutually exclusive outcomes called events, E1,
E2, …, En, the probability of any event Ei is assigned a non-negative number. That is,

P(Ei) ≥ 0
- That is to say, all events must have a probability greater than or equal to zero, which is a
reasonable demand given that it is impossible to imagine a situation in which there
is a negative probability. The idea of mutually exclusive outcomes is a crucial one in
the formulation of this property. If two events cannot happen at once, they are said to
be mutually exclusive.

2) The sum of the probabilities of the mutually exclusive outcomes is equal to 1.

P(E1) + P(E2) + … + P(En) = 1

- This property, known as exhaustiveness, states that the observer of a probabilistic
process must take into account every scenario that might occur; when all scenarios
are combined, their combined probability is 1. To satisfy this condition, the events
E1, E2, …, En must be mutually exclusive, so that they do not overlap and cannot
occur at the same time.

3) Consider any two mutually exclusive events, Ei and Ej. The probability of the occurrence
of either Ei or Ej is equal to the sum of their individual probabilities.

P(Ei ∪ Ej) = P(Ei) + P(Ej)

- If, by contrast, the two events were not mutually exclusive and could occur
simultaneously, the problem of overlap would arise when attempting to calculate the
chance of either Ei or Ej occurring, and the computation would become considerably
more difficult.
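All three properties can be checked at once on a simple example; the sketch below uses the fair six-sided die from earlier, treating its faces as the mutually exclusive events E1 through E6:

```python
from fractions import Fraction

# The six faces of a fair die as mutually exclusive events E1..E6.
probs = {face: Fraction(1, 6) for face in range(1, 7)}

# Property 1: every P(Ei) >= 0.
assert all(p >= 0 for p in probs.values())

# Property 2 (exhaustiveness): the probabilities sum to 1.
assert sum(probs.values()) == 1

# Property 3 (addition rule for mutually exclusive events):
# P(E1 or E2) = P(E1) + P(E2).
p_1_or_2 = probs[1] + probs[2]
print(p_1_or_2)  # 1/3
```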

1) What is a discrete random variable? Give three examples that are of interest to the health
professional.
- A discrete random variable is a variable whose values are obtained by counting.
Examples: the number of patients admitted to a hospital, the number of doctors in a hospital,
and the number of emergency patients admitted in one day.
2) What is a continuous random variable? Give three examples that are of interest to the health
professional.
- A continuous random variable is a random variable whose probability values are obtained by
measuring the intervals in which the random variable falls.
Examples: the heights of patients in a hospital, the weights of obese patients, and the
cholesterol levels of patients in a certain hospital.
3) Define the probability distribution of a discrete random variable.
- The probability distribution of a discrete random variable is a table, graph, or formula that
lists each possible value together with the probability that it occurs. The probabilities
across the distribution must total 1, and each individual probability must be
between 0 and 1.
4) Define the probability distribution of a continuous random variable.
- The probability distribution of a continuous random variable is described by a density curve
with total area 1 under it; the probability that X lies between an interval of numbers is the
area under the density curve between the interval's endpoints. Each probability must be
between 0 and 1.
5) What is a cumulative probability distribution?
- A cumulative probability distribution gives, for each value x, the probability that the
random variable takes a value less than or equal to x.
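The discrete and cumulative distributions from questions 3 and 5 can be tied together in one sketch; the pmf values below are assumed for illustration:

```python
from itertools import accumulate

# Hypothetical pmf for a discrete random variable X (values 0..4),
# e.g. X = number of emergency patients admitted in a day.
values = [0, 1, 2, 3, 4]
pmf = [0.10, 0.25, 0.30, 0.20, 0.15]

# Validity checks: each probability lies in [0, 1] and they total 1.
assert all(0 <= p <= 1 for p in pmf)
assert abs(sum(pmf) - 1.0) < 1e-9

# Cumulative distribution: F(x) = P(X <= x), a running sum of the pmf.
cdf = list(accumulate(pmf))
print(list(zip(values, cdf)))
```

The last cumulative value is always 1, since the final entry accumulates the entire distribution.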
6) What is a Bernoulli trial?
- A Bernoulli trial is one trial of a Bernoulli process – a sequence of trials in which each trial
has two mutually exclusive outcomes, success (1) and failure (0); the probability of success
is denoted by p and the probability of failure, denoted by q, is 1 − p; and the trials are
independent, in other words the outcome of one trial does not depend on the other trials.
7) Describe the binomial distribution.
- A binomial distribution can be thought of as simply the probability of a success or failure
outcome in an experiment or survey that is repeated multiple times. It is derived from the
Bernoulli trial. It has two parameters, n and p, which are sufficient to specify a binomial
distribution.
8) Give an example of a random variable that you think follows a binomial distribution.
- Example: 20 college students selected randomly are interviewed and the number of students
having Netflix subscription follows a binomial distribution.
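The Netflix example can be put in formulas: with n = 20 students and an assumed subscription probability p (0.4 here, purely for illustration), the binomial pmf gives the chance of any particular count:

```python
from math import comb

# Binomial pmf: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k).
def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 20, 0.4  # n from the example; p = 0.4 is an assumed value

# Probability that exactly 8 of the 20 students have a subscription.
print(round(binomial_pmf(8, n, p), 4))

# Sanity check: the pmf sums to 1 over k = 0..n.
assert abs(sum(binomial_pmf(k, n, p) for k in range(n + 1)) - 1) < 1e-9
```

The two parameters n and p really are all the function needs, which is what "sufficient to specify a binomial distribution" means.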
9) Describe the Poisson distribution.
- The Poisson distribution describes the occurrence of events: it gives the probability of a
certain number of events in a given interval of space or time, where the number of
occurrences of the event may be unbounded.
It rests on 3 assumptions:
1. Rare events. In a sufficiently small interval, only 0 or 1 events occur:
Pr(2 or more events in a small interval or area) = 0.
2. Independence. The occurrence of an event does not affect the occurrence of events in a
non-overlapping interval.
3. Stationarity. The intensity (number of events per unit interval) remains constant.
In the Poisson distribution the mean and variance are equal, both denoted by λ.
10) Give an example of a random variable that you think is distributed according to the
Poisson law.
- Example: the number of car accidents on a specific section of a highway in one day.
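The car-accident example fits the Poisson pmf directly; the rate λ = 2 accidents per day below is an assumed value for illustration:

```python
from math import exp, factorial

# Poisson pmf: P(X = k) = e^(-lam) * lam^k / k!, with mean = variance = lam.
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Hypothetical rate: lam = 2 accidents per day on this stretch of highway.
lam = 2.0

print(round(poisson_pmf(0, lam), 4))  # chance of an accident-free day
print(round(poisson_pmf(3, lam), 4))  # chance of exactly 3 accidents
```

Note that k has no upper limit, matching the point that the number of occurrences may be unbounded.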
