
Chapters 1 and 2: Describing data

“N” is the population size and “n” is the sample size.


∑"
!#$ #! ∑"
!#$ %! #!
1) 𝜇! = $
or 𝜇! = ∑ %!

∑%
!#$ #! ∑%
!#$ %! #!
2) 𝑥̅ = $
or 𝑥̅ = ∑ %!

∑"
!#$ &! #! ∑%
!#$ &! #!
3) 𝜇! = ∑ &!
or 𝑥̅ = ∑ &!
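As a quick numeric sanity check of the mean formulas, here is a short pure-Python sketch (the data and weights are made-up illustrative values):

```python
# Simple and weighted means, matching the formulas above.
data = [2.0, 4.0, 6.0, 8.0]        # hypothetical observations x_i
weights = [1.0, 2.0, 3.0, 4.0]     # hypothetical weights w_i

# mu (or x-bar): sum of x_i divided by the number of observations
mean = sum(data) / len(data)

# Weighted mean: sum of w_i * x_i divided by sum of w_i
weighted_mean = sum(w * x for w, x in zip(weights, data)) / sum(weights)

print(mean)           # 5.0
print(weighted_mean)  # 6.0
```

Note that when every weight is equal, the weighted mean reduces to the simple mean.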

4) Median = value at the $\frac{(n+1)}{2}$th position in the ordered data. For an even number of values, the median is the average of the two middle
numbers.
5) Mode = value that occurs most often.
6) $P$th percentile = value at the $\left(\frac{P}{100}\right) \times (n+1)$th ordered position. This can be used for quartiles, the five-number
summary, and the interquartile range.
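The positional rules in 4)–6) can be sketched in Python; `percentile` below is a hypothetical helper implementing the $(P/100) \times (n+1)$ rule with linear interpolation between neighbouring ordered values, and the data are made up:

```python
from statistics import mode

data = sorted([7, 1, 3, 3, 9, 5])   # -> [1, 3, 3, 5, 7, 9]; n = 6 (even)

# Median: average the two middle values when n is even.
n = len(data)
mid = n // 2
median = data[mid - 1] if n % 2 else (data[mid - 1] + data[mid]) / 2

# P-th percentile at the (P/100) * (n+1)-th ordered position,
# interpolating when that position falls between two data points.
def percentile(values, p):
    pos = (p / 100) * (len(values) + 1)   # 1-based ordered position
    lo = int(pos)
    frac = pos - lo
    if lo < 1:
        return values[0]
    if lo >= len(values):
        return values[-1]
    return values[lo - 1] + frac * (values[lo] - values[lo - 1])

print(median)                # 4.0
print(mode(data))            # 3  (value occurring most often)
print(percentile(data, 25))  # 2.5 (first quartile under this rule)
```

As a consistency check, the 50th percentile under this rule equals the median.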

∑"
!#$(#! 01)
& 3
7) 𝜎 * = $
and Population Coefficient of variation = >&1' × 100@ %

8) $s^2 = \frac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n-1}$ and sample coefficient of variation $= \left(\frac{s}{\bar{x}}\right) \times 100\%$
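Formulas 7) and 8) differ only in the divisor ($N$ versus $n-1$); a small sketch with made-up data:

```python
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # hypothetical values

n = len(data)
xbar = sum(data) / n                        # 40 / 8 = 5.0
ss = sum((x - xbar) ** 2 for x in data)     # sum of squared deviations = 32

pop_var = ss / n          # sigma^2: divide by N (data treated as a population)
samp_var = ss / (n - 1)   # s^2: divide by n - 1 (data treated as a sample)

pop_cv = (pop_var ** 0.5 / xbar) * 100     # (sigma / mu) * 100 %
samp_cv = (samp_var ** 0.5 / xbar) * 100   # (s / x-bar) * 100 %
```

The sample variance is always slightly larger than the population variance on the same numbers, because of the smaller divisor.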

∑"
!#$(#! 01' )(7! 01( ) 89:(!,6)
9) 𝐶𝑜𝑣(𝑋, 𝑌) = 𝜎!6 = and corr(𝜌) =
$ 3) 3*

10) $\mathrm{Cov}(X, Y) = s_{XY} = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n-1}$ and $\mathrm{corr}(r) = \frac{\mathrm{Cov}(X, Y)}{s_X s_Y}$
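A sketch of the sample covariance and correlation in 10), using a made-up, perfectly linear pair of series so that $r$ should come out as 1:

```python
xs = [1.0, 2.0, 3.0, 4.0]   # hypothetical paired data
ys = [2.0, 4.0, 6.0, 8.0]   # exactly 2 * xs, so correlation should be 1

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# Sample covariance: sum of cross-deviations over n - 1.
s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)

# Sample standard deviations, also with the n - 1 divisor.
s_x = (sum((x - xbar) ** 2 for x in xs) / (n - 1)) ** 0.5
s_y = (sum((y - ybar) ** 2 for y in ys) / (n - 1)) ** 0.5

r = s_xy / (s_x * s_y)   # sample correlation
```

Scaling either series by a constant changes the covariance but leaves the correlation unchanged, which is why $r$ is the preferred unit-free measure.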

Chapter 3: Probability
1) 𝑃(𝐴) + 𝑃(𝐴̅) = 1
2) 𝑃(𝐴 ∪ 𝐵) = 𝑃(𝐴) + 𝑃(𝐵) − 𝑃(𝐴 ∩ 𝐵)
3) $P(A|B) = \frac{P(A \cap B)}{P(B)}$ or $P(B|A) = \frac{P(A \cap B)}{P(A)}$

4) Independence: 𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴) × 𝑃(𝐵) (this can be extended to k number of events)


5) Odds $= \frac{P(A)}{1 - P(A)}$

6) Bayes' rule and total probability: if $B_1, B_2, \ldots, B_i, \ldots, B_K$ (for $i = 1, \ldots, K$) are mutually exclusive events of
the sample space $S$ and $A$ is any event with $P(A) \neq 0$, then

$$P(B_i \mid A) = \frac{P(A \mid B_i)\, P(B_i)}{\sum_{j=1}^{K} P(A \mid B_j)\, P(B_j)}$$
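A minimal sketch of total probability plus Bayes' rule for two mutually exclusive, collectively exhaustive events (all probabilities below are hypothetical):

```python
# Priors P(B1), P(B2) and likelihoods P(A|B1), P(A|B2) -- made-up numbers.
p_b = [0.3, 0.7]          # must sum to 1 (mutually exclusive, exhaustive)
p_a_given_b = [0.9, 0.2]

# Total probability: P(A) = sum over j of P(A|Bj) * P(Bj)
p_a = sum(pa * pb for pa, pb in zip(p_a_given_b, p_b))

# Bayes' rule: P(B1|A) = P(A|B1) * P(B1) / P(A)
p_b1_given_a = p_a_given_b[0] * p_b[0] / p_a

print(round(p_a, 2))           # 0.41
print(round(p_b1_given_a, 4))  # 0.6585
```

The denominator of Bayes' rule is exactly the total-probability expression, which is why the two results are usually presented together.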

7) Permutation and combination:

$$P_x^n = \frac{n!}{(n-x)!} \quad \text{and} \quad C_x^n = \frac{n!}{x!\,(n-x)!}$$
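Python's standard library exposes both counts directly (`math.perm` and `math.comb`, available since Python 3.8), which can be cross-checked against the factorial definitions:

```python
from math import comb, factorial, perm  # perm/comb require Python 3.8+

n, x = 5, 2

n_P_x = perm(n, x)   # n! / (n - x)!        -> ordered arrangements
n_C_x = comb(n, x)   # n! / (x! * (n - x)!) -> unordered selections

# Cross-check against the factorial definitions above.
print(n_P_x, factorial(n) // factorial(n - x))                   # 20 20
print(n_C_x, factorial(n) // (factorial(x) * factorial(n - x)))  # 10 10
```

Every combination of $x$ items corresponds to $x!$ permutations, so $P_x^n = C_x^n \cdot x!$.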
Chapter 4: Discrete Random Variables

1) CDF of a random variable $X$: $F(x_0) = P(X \le x_0) = \sum_{x \le x_0} P(x)$

2) $E(X) = \mu_X = \sum_x x\, P(x)$

3) $\mathrm{Var}(X) = \sigma_X^2 = E[(X - \mu_X)^2] = \sum_x (x - \mu_X)^2 P(x)$ OR $E(X^2) - \mu_X^2$

4) If $g(X)$ is a function of $X$, then $E[g(X)] = \sum_x g(x)\, P(x)$
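Definitions 1)–4) can be verified directly on a small hypothetical pmf:

```python
# A hypothetical discrete distribution: value -> probability.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

mean = sum(x * p for x, p in pmf.items())                # E(X)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
var_alt = sum(x * x * p for x, p in pmf.items()) - mean ** 2  # E(X^2) - mu^2

cdf_at_1 = sum(p for x, p in pmf.items() if x <= 1)      # F(1) = P(X <= 1)
e_g = sum((2 * x + 1) * p for x, p in pmf.items())       # E[g(X)], g(x) = 2x+1

print(round(mean, 6), round(var, 6))  # 1.1 0.49
```

Both variance formulas give the same number, as the identity $E(X^2) - \mu_X^2 = E[(X - \mu_X)^2]$ requires.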

5) Bernoulli distribution:

$$E(X) = \mu_X = \sum_x x\, P(x) = P \quad \text{and} \quad \mathrm{Var}(X) = \sigma_X^2 = P(1 - P)$$


6) Binomial distribution:

$$P(X = x) = \frac{n!}{x!\,(n - x)!}\, P^x (1 - P)^{n - x}$$

$E(X) = nP$ and $\sigma_X^2 = nP(1 - P)$
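A sketch of the binomial pmf and its moments (`binom_pmf` is an illustrative helper name, and the parameters are made up):

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for a Binomial(n, p) random variable."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.5            # hypothetical parameters
mean = n * p              # E(X) = nP
var = n * p * (1 - p)     # sigma^2 = nP(1 - P)

# The pmf should sum to 1 over x = 0..n, and sum x*P(x) should equal nP.
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
mean_check = sum(x * binom_pmf(x, n, p) for x in range(n + 1))
```

With $P = 0.5$ the distribution is symmetric about $nP$, which is a quick visual check when tabulating the pmf.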
7) Poisson distribution:

$$P(X = x) = \frac{e^{-\lambda} \lambda^x}{x!}$$

$E(X) = \lambda$ and $\sigma_X^2 = \lambda$
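A similar sketch for the Poisson pmf (`poisson_pmf` is an illustrative helper name); the infinite sum for the mean is truncated where the tail becomes negligible:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson(lambda) random variable."""
    return exp(-lam) * lam ** x / factorial(x)

lam = 3.0   # hypothetical rate

# E(X) and Var(X) both equal lambda; check the mean numerically by
# truncating the infinite sum at x = 60 (the tail there is negligible).
mean_check = sum(x * poisson_pmf(x, lam) for x in range(60))
print(round(mean_check, 6))  # 3.0
```

The Poisson is the usual limit of the binomial when $n$ is large and $P$ is small with $nP = \lambda$ held fixed.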
8) Jointly distributed discrete random variables:
If $P(x, y) = P(X = x \cap Y = y)$, then

--Marginal probabilities: $P(x) = \sum_y P(x, y)$ and $P(y) = \sum_x P(x, y)$

--Joint CDF: $F(x, y) = P(X \le x, Y \le y)$

--Conditional probabilities: $P(y|x) = \frac{P(x, y)}{P(x)}$ and $P(x|y) = \frac{P(x, y)}{P(y)}$

--Independence: $P(x, y) = P(x) \times P(y)$ (this can be extended to $k$ random variables)


--Conditional expectation (shown for $Y|X$; the same can be expressed for $X|Y$):

$$E(Y|X) = \mu_{Y|X} = \sum_y y\, P(y|x)$$

$$\mathrm{Var}(Y|X) = \sigma_{Y|X}^2 = E\big[(Y - \mu_{Y|X})^2\big] = \sum_y (y - \mu_{Y|X})^2 P(y|x) \;\; \text{OR} \;\; E(Y^2|X) - \mu_{Y|X}^2 = \sum_y y^2\, P(y|x) - \mu_{Y|X}^2$$

--Covariance and correlation:

$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - \mu_X \mu_Y = \sum_x \sum_y x\, y\, P(x, y) - \mu_X \mu_Y$$

$$\mathrm{corr}(\rho) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$$
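The joint-distribution definitions in 8) can be worked through on a tiny hypothetical joint pmf:

```python
# A hypothetical joint pmf P(x, y) over two binary variables.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginals: P(x) = sum over y of P(x, y), and symmetrically for P(y).
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Conditional expectation E(Y | X = x) = sum of y * P(y|x),
# where P(y|x) = P(x, y) / P(x).
def e_y_given_x(x):
    return sum(y * joint[(x, y)] / p_x[x] for y in ys)

mu_x = sum(x * p for x, p in p_x.items())
mu_y = sum(y * p for y, p in p_y.items())

# Cov(X, Y) = E(XY) - mu_x * mu_y
cov = sum(x * y * p for (x, y), p in joint.items()) - mu_x * mu_y
```

Here the covariance comes out slightly negative, so $X$ and $Y$ are not independent: $P(x, y) \neq P(x) P(y)$ for at least one cell.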
Chapter 5: Continuous Random Variables
1) CDF of a random variable $X$: $F(x_0) = \int_{x_m}^{x_0} f(x)\, dx$ (here $x_m$ is the minimum value of $x$)

2) $E(X) = \mu_X = \int_{-\infty}^{\infty} x\, f(x)\, dx$

3) $\mathrm{Var}(X) = \sigma_X^2 = \int_{-\infty}^{\infty} (x - \mu_X)^2 f(x)\, dx = E[(X - \mu_X)^2] = E(X^2) - \mu_X^2$

4) Uniform distribution on $[a, b]$:

$$E(X) = \mu_X = \frac{a + b}{2} \quad \text{and} \quad \mathrm{Var}(X) = \sigma_X^2 = \frac{(b - a)^2}{12}$$
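A sketch checking the uniform mean and variance against a seeded Monte Carlo draw ($a$ and $b$ are arbitrary illustrative endpoints):

```python
import random

a, b = 2.0, 10.0
mean = (a + b) / 2         # E(X) = (a + b) / 2 = 6.0
var = (b - a) ** 2 / 12    # Var(X) = (b - a)^2 / 12 = 16/3

# Monte Carlo sanity check, seeded so the run is reproducible.
random.seed(0)
draws = [random.uniform(a, b) for _ in range(100_000)]
mc_mean = sum(draws) / len(draws)
```

With 100,000 draws the simulated mean should sit well within 0.1 of the theoretical value of 6.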
5) If $X \sim N(\mu_X, \sigma_X^2)$, then $Z = \frac{X - \mu_X}{\sigma_X} \sim N(0, 1)$
6) Exponential distribution:

$$F(x) = 1 - e^{-\lambda x} \quad \text{and} \quad f(x) = \lambda e^{-\lambda x}$$

$$E(X) = \mu_X = \frac{1}{\lambda} \quad \text{and} \quad \mathrm{Var}(X) = \sigma_X^2 = \frac{1}{\lambda^2}$$
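A short sketch of the exponential CDF, mean, and variance for a hypothetical rate $\lambda = 0.5$:

```python
from math import exp

lam = 0.5   # hypothetical rate parameter

def cdf(x):
    """F(x) = 1 - e^(-lambda * x), valid for x >= 0."""
    return 1 - exp(-lam * x)

mean = 1 / lam        # E(X) = 1/lambda = 2.0
var = 1 / lam ** 2    # Var(X) = 1/lambda^2 = 4.0

# The probability of falling below the mean is 1 - e^(-1), about 0.632,
# regardless of lambda.
print(round(cdf(mean), 3))  # 0.632
```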

Integration rules
1) $\int x^n\, dx = \frac{x^{n+1}}{n+1} + c$ (except when $n = -1$)
2) $\int \frac{1}{x}\, dx = \ln|x| + c$
3) $\int e^x\, dx = e^x + c$
4) $\int a^x\, dx = \frac{a^x}{\ln a} + c$
5) $\int [f(x) \pm g(x)]\, dx = \int f(x)\, dx \pm \int g(x)\, dx$
6) $\int k\, f(x)\, dx = k \int f(x)\, dx$
Note: the constant of integration $c$ is not required in definite (with limits) integrals.
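These antiderivative rules can be sanity-checked numerically; `midpoint_integral` below is a hypothetical helper that approximates a definite integral with a midpoint Riemann sum:

```python
def midpoint_integral(f, a, b, steps=100_000):
    """Midpoint Riemann-sum approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

# Rule 1 with n = 2: the integral of x^2 from 0 to 1 is 1/3.
approx = midpoint_integral(lambda x: x ** 2, 0.0, 1.0)
print(round(approx, 6))  # 0.333333
```

The same helper applied to $\lambda e^{-\lambda x}$ over a long interval recovers the exponential mean $1/\lambda$, tying these rules back to Chapter 5.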
Rules for Expectation and Variance operators (Linear functions of Random Variables):
Let X and Y be random variables, and let k be a constant. Then:

$E(k) = k$ and $\mathrm{Var}(k) = 0$
$E(kX) = k\, E(X)$ and $\mathrm{Var}(kX) = k^2\, \mathrm{Var}(X)$
$E(X + Y) = E(X) + E(Y)$
$\mathrm{Var}(X \pm Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) \pm 2\, \mathrm{Cov}(X, Y)$

**End of the document**
