EXPECTED VALUES

Trial: observing an event and recording the outcome.
Experiment: a collection of one or multiple trials.
Experimental probability: the probability of an event based on the experiment.
Expected value: the specific outcome we expect to occur when we run an experiment.
FREQUENCY

Probability frequency distribution: the collection of the probabilities for each possible outcome of an event.
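A probability frequency distribution and an expected value can be built directly from recorded trials. A minimal sketch in Python, assuming the experiment is rolling a fair six-sided die (an illustrative choice, not from the notes):

```python
import random
from collections import Counter

random.seed(42)

# Run an experiment: 10,000 trials of rolling a fair six-sided die.
trials = [random.randint(1, 6) for _ in range(10_000)]

# Probability frequency distribution: a probability for each possible outcome.
counts = Counter(trials)
distribution = {outcome: counts[outcome] / len(trials) for outcome in sorted(counts)}

# Expected value: the outcome we expect on average when we run the experiment.
expected_value = sum(outcome * p for outcome, p in distribution.items())

print(distribution)
print(expected_value)  # close to the theoretical 3.5
```

Note that 3.5 is itself an "unattainable" expected value: no single roll can ever produce it.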
COMPLEMENTS

Everything an event is not.
A' = not A
P(A') = 1 - P(A)
A + A' = sample space
julia.page
COMBINATORICS

The likelihood of an event occurring:

P(X) = preferred outcomes / sample space

For mutually exclusive events: P(A ∪ B) = P(A) + P(B)
PERMUTATIONS (arrange)

Without repetition:
P_n = n!
Example: how many ways are there to arrange 3 letters a, b, c? 3! = 6

With repetition (n₁, …, n_k copies of each distinct element, n = n₁ + … + n_k):
P̄_n = n! / (n₁! · … · n_k!)
Example: how many ways are there to arrange 2 letters a and 2 letters b? 4! / (2! · 2!) = 6
VARIATIONS (pick and arrange)

Without repetition:
V_p^n = n! / (n - p)!
Example: how many 2-letter words with different letters can you make with 4 letters a, b, c, d? 4! / 2! = 12

With repetition:
V̄_p^n = n^p
Example: how many 2-letter words (letters may repeat) can you make with 4 letters a, b, c, d? 4² = 16
COMBINATIONS (pick)

Without repetition:
C_p^n = n! / (p! · (n - p)!)
Example: how many ways are there to pick 2 different letters out of 4 letters a, b, c, d? C(4, 2) = 6

With repetition:
C̄_p^n = C_p^(n+p-1)
Example: how many ways are there to pick 2 letters out of 4 letters a, b, c, d, repetition allowed? C(5, 2) = 10
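The counting formulas above map directly onto Python's `math` module (`factorial`, `perm`, `comb`); a quick sketch reproducing each worked example:

```python
import math

# Permutations without repetition: arrange 3 letters a, b, c -> 3! = 6
assert math.factorial(3) == 6

# Permutations with repetition: arrange a, a, b, b -> 4! / (2! * 2!) = 6
assert math.factorial(4) // (math.factorial(2) * math.factorial(2)) == 6

# Variations without repetition: 2-letter words, distinct letters, from a, b, c, d
# V = n! / (n - p)! = 4! / 2! = 12
assert math.perm(4, 2) == 12

# Variations with repetition: n^p = 4^2 = 16
assert 4 ** 2 == 16

# Combinations without repetition: pick 2 of 4 letters -> C(4, 2) = 6
assert math.comb(4, 2) == 6

# Combinations with repetition: C(n + p - 1, p) = C(5, 2) = 10
assert math.comb(4 + 2 - 1, 2) == 10
```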
FACTORIALS

0! = 1
If n < 0, n! doesn't exist.
If n > 0 and n > k:
(n + k)! = n! · (n + 1) · … · (n + k)
(n - k)! = n! / ((n - k + 1) · … · n)
n! / k! = (k + 1) · … · n
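The three factorial identities can be checked numerically; a small sketch with the arbitrary choice n = 6, k = 3:

```python
import math

n, k = 6, 3

# (n + k)! = n! * (n+1) * ... * (n+k)
prod_up = math.prod(range(n + 1, n + k + 1))
assert math.factorial(n + k) == math.factorial(n) * prod_up

# (n - k)! = n! / ((n-k+1) * ... * n)
prod_down = math.prod(range(n - k + 1, n + 1))
assert math.factorial(n - k) == math.factorial(n) // prod_down

# n! / k! = (k+1) * ... * n
assert math.factorial(n) // math.factorial(k) == math.prod(range(k + 1, n + 1))
```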
BAYESIAN INFERENCE

x ∈ A: element x is part of set A
x ∉ A: element x is NOT in set A
A ∋ x: set A contains element x
∀x: for all/any x
A ⊆ B: A is a subset of B
Mutually exclusive sets: A ∩ B = ∅ (no overlap).
Completely overlapping sets: every element of one set is in the other.
All complements are mutually exclusive, but not all mutually exclusive sets are complements.
Dependent Events

P(A|B) ≠ P(A): probabilities of dependent events vary as conditions change.
Conditional Probability

P(A|B) = P(A ∩ B) / P(B)
• B has occurred
• Only elements of the intersection can satisfy A
• P(A|B) does not have the same meaning as P(B|A)
Law of Total Probability (for A = B1 + … + Bn):

P(A) = P(A|B1) · P(B1) + … + P(A|Bn) · P(Bn)

Bayes' Law:

P(A|B) = P(B|A) · P(A) / P(B)
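The two laws combine naturally: total probability supplies the denominator P(B) that Bayes' Law needs. A minimal sketch, with made-up illustrative numbers (a hypothetical medical test, not from the notes):

```python
# A = "has condition", B = "test is positive" (illustrative values only).
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # P(B|A), the test's sensitivity
p_b_given_not_a = 0.10  # P(B|A'), the false-positive rate

# Law of Total Probability: P(B) = P(B|A)·P(A) + P(B|A')·P(A')
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' Law: P(A|B) = P(B|A) · P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b

print(round(p_a_given_b, 4))  # 0.0876
```

Even with a sensitive test, a rare condition keeps the posterior P(A|B) low, which is exactly what Bayes' Law captures.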
DISTRIBUTIONS

Show the possible values a random variable can take and how frequently they occur.

• Y: the actual outcome
• y: one of the possible outcomes
• P(Y = y) = p(y)
• Probability function: a function that assigns a probability to each distinct outcome in the sample space

                    Population   Sample
Mean                μ            x̄
Variance            σ²           s²
Standard Deviation  σ            s
DISCRETE

• Finite number of outcomes
• Can add up individual values to determine the probability of an interval
• Expressed with a table, graph, or piecewise function
• Expected values might be unattainable (e.g. 3.5 on a six-sided die)
Uniform
• Y ~ U(a, b)
• Y ~ U(a) for categorical
• Outcomes are equally likely
• No predictive power
Bernoulli

• Y ~ Bern(p)
• 1 trial, 2 possible outcomes
• E(Y) = p
• σ²(Y) = p · (1-p)
Binomial

• Y ~ B(n, p)
• Measures the probability of y successes over n independent trials
• P(Y = y) = p(y) = C(n, y) · p^y · (1-p)^(n-y)
• E(Y) = n · p
• σ²(Y) = n · p · (1-p)
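The binomial pmf and its moments can be verified with a few lines of Python; a sketch with the arbitrary choice n = 10, p = 0.3:

```python
import math

def binomial_pmf(y, n, p):
    """P(Y = y) = C(n, y) * p^y * (1-p)^(n-y)"""
    return math.comb(n, y) * p**y * (1 - p) ** (n - y)

n, p = 10, 0.3
pmf = [binomial_pmf(y, n, p) for y in range(n + 1)]

# Probabilities over all outcomes sum to 1
assert abs(sum(pmf) - 1) < 1e-12

# E(Y) = n*p and sigma^2(Y) = n*p*(1-p), recovered from the pmf itself
mean = sum(y * pmf[y] for y in range(n + 1))
var = sum((y - mean) ** 2 * pmf[y] for y in range(n + 1))
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * (1 - p)) < 1e-9
```

Because the distribution is discrete, summing `pmf` entries over a range of y values gives the probability of that interval directly.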
Poisson

• Y ~ Po(λ)
• Measures the frequency over an interval of time or distance (λ ≥ 0)
• P(Y = y) = p(y) = λ^y · e^(-λ) / y!
• E(Y) = λ
• σ²(Y) = λ
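A short sketch confirming that both the mean and the variance of the Poisson pmf equal λ (truncating the infinite sum at y = 99, where the remaining tail is negligible for the arbitrary choice λ = 4):

```python
import math

def poisson_pmf(y, lam):
    """P(Y = y) = lam^y * e^(-lam) / y!"""
    return lam**y * math.exp(-lam) / math.factorial(y)

lam = 4.0
pmf = [poisson_pmf(y, lam) for y in range(100)]

mean = sum(y * p for y, p in enumerate(pmf))
var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))

# E(Y) = sigma^2(Y) = lambda is the Poisson's signature property
assert abs(mean - lam) < 1e-6
assert abs(var - lam) < 1e-6
```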
• PDF: Probability Density Function
• CDF: Cumulative Distribution Function
CONTINUOUS

• Infinitely many consecutive possible values
• Cannot add up individual values to determine the probability of an interval (integrate the PDF instead)
• Expressed with a graph or continuous function
• P(Y = y) = p(y) = 0 for any individual value y, so P(Y < y) = P(Y ≤ y)
Normal

• Y ~ N(μ, σ²)
• E(Y) = μ
• σ²(Y) = σ²
• 68% of all its values fall in the interval (μ - σ, μ + σ)
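The "68%" figure can be recomputed from the normal CDF, which the standard library exposes via the error function; a minimal sketch for the standard normal N(0, 1):

```python
import math

def normal_cdf(x, mu, sigma):
    """P(Y <= x) for Y ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu, sigma = 0.0, 1.0
p_within_one_sigma = normal_cdf(mu + sigma, mu, sigma) - normal_cdf(mu - sigma, mu, sigma)
print(round(p_within_one_sigma, 4))  # 0.6827, the "68%" rule
```

The same subtraction of CDF values gives the probability of any interval, which is how continuous distributions replace the "add up individual values" approach of discrete ones.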
Student's t

• Y ~ t(k)
• Small-sample approximation of a Normal (accounts for extreme values better)
• If k > 1: E(Y) = μ
• If k > 2: σ²(Y) = s² · k / (k - 2)
Chi-Squared

• Y ~ χ²(k)
• Sum of the squares of k independent standard normal variables
• E(Y) = k
• σ²(Y) = 2k
Exponential

• Y ~ Exp(λ)
• E(Y) = 1/λ
• σ²(Y) = 1/λ²
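The exponential's moments are easy to check by simulation, since the standard library can sample it directly; a sketch with the arbitrary choice λ = 2:

```python
import random

random.seed(0)

lam = 2.0
# random.expovariate(lam) draws samples of Y ~ Exp(lam)
samples = [random.expovariate(lam) for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)

# E(Y) = 1/lam = 0.5 and sigma^2(Y) = 1/lam^2 = 0.25
print(mean, var)
```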
Logistic

• Y ~ Logistic(μ, s)
• Continuous variable inputs and binary outcome
• CDF ↑ when values near the mean
• The smaller s, the quicker the CDF reaches values close to 1
• E(Y) = μ
• σ²(Y) = s² · π² / 3
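The logistic CDF has a closed form, which makes the role of the scale s easy to see; a minimal sketch:

```python
import math

def logistic_cdf(x, mu, s):
    """P(Y <= x) for Y ~ Logistic(mu, s)."""
    return 1 / (1 + math.exp(-(x - mu) / s))

mu = 0.0

# At the mean the CDF is exactly 0.5
assert abs(logistic_cdf(mu, mu, s=1.0) - 0.5) < 1e-12

# The smaller s, the quicker the CDF climbs toward 1 past the mean
assert logistic_cdf(2.0, mu, s=0.5) > logistic_cdf(2.0, mu, s=2.0)

# Variance is s^2 * pi^2 / 3
s = 1.0
var = s**2 * math.pi**2 / 3
print(round(var, 4))  # 3.2899
```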