
Key Formulas for Statistical Methods

Chapter 3 Descriptive Statistics


Mean $\bar{y} = \frac{\sum y_i}{n}$
Standard deviation $s = \sqrt{\frac{\sum (y_i - \bar{y})^2}{n - 1}}$
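
As an illustration (not part of the original sheet), a minimal Python sketch of these two formulas; the data values are invented.

```python
# Sample mean and standard deviation, computed directly from the Chapter 3
# formulas. The data below are made up for demonstration.
import math

y = [3.0, 5.0, 7.0, 4.0, 6.0]   # hypothetical sample
n = len(y)

ybar = sum(y) / n                                            # mean = (sum of y_i) / n
s = math.sqrt(sum((yi - ybar) ** 2 for yi in y) / (n - 1))   # n - 1 in the denominator
print(ybar, s)   # 5.0, ~1.58
```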

Chapter 4 Probability Distributions


z-score $z = \frac{y - \mu}{\sigma}$
Standard error $\sigma_{\bar{y}} = \frac{\sigma}{\sqrt{n}}$
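
A small Python sketch of the z-score and the standard error of the mean; the values of mu, sigma, y, and n are assumptions chosen for the example.

```python
# z-score and standard error of the sample mean (Chapter 4 formulas).
import math

mu, sigma = 100.0, 15.0   # assumed population mean and standard deviation
y = 130.0                 # a single observation
n = 25                    # sample size

z = (y - mu) / sigma              # z = (y - mu) / sigma
se_ybar = sigma / math.sqrt(n)    # sigma_ybar = sigma / sqrt(n)
print(z, se_ybar)                 # 2.0, 3.0
```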

Chapter 5 Statistical Inference


Confidence interval for mean: $\bar{y} \pm t(se)$ with $se = \frac{s}{\sqrt{n}}$
Confidence interval for proportion: $\hat{\pi} \pm z(se)$ with $se = \sqrt{\frac{\hat{\pi}(1 - \hat{\pi})}{n}}$
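
A Python sketch of both 95% confidence intervals, assuming SciPy is available for the t and z quantiles; the summary statistics are invented.

```python
# 95% confidence intervals for a mean and for a proportion (Chapter 5 formulas).
import math
from scipy import stats

# Mean: ybar +/- t * se, with se = s / sqrt(n)
ybar, s, n = 5.0, 1.58, 25
se = s / math.sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)
print(ybar - t_crit * se, ybar + t_crit * se)

# Proportion: pihat +/- z * se, with se = sqrt(pihat * (1 - pihat) / n)
pihat, m = 0.40, 200
se_p = math.sqrt(pihat * (1 - pihat) / m)
z_crit = stats.norm.ppf(0.975)
print(pihat - z_crit * se_p, pihat + z_crit * se_p)
```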

Chapter 6 Statistical Inference: Significance Tests


$H_0$: $\mu = \mu_0$, test statistic $t = \frac{\bar{y} - \mu_0}{se}$ with $se = \frac{s}{\sqrt{n}}$, $df = n - 1$
$H_0$: $\pi = \pi_0$, test statistic $z = \frac{\hat{\pi} - \pi_0}{se_0}$ with $se_0 = \sqrt{\frac{\pi_0(1 - \pi_0)}{n}}$
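
A Python sketch of both tests written straight from these formulas; the summary numbers are invented, and SciPy is assumed only for the two-sided p-values.

```python
# One-sample t test for a mean and z test for a proportion (Chapter 6 formulas).
import math
from scipy import stats

# H0: mu = mu0
ybar, s, n, mu0 = 5.4, 1.2, 30, 5.0
se = s / math.sqrt(n)
t = (ybar - mu0) / se                     # df = n - 1
p_t = 2 * stats.t.sf(abs(t), df=n - 1)

# H0: pi = pi0 -- the null standard error uses pi0, not pihat
pihat, m, pi0 = 0.46, 400, 0.50
se0 = math.sqrt(pi0 * (1 - pi0) / m)
z = (pihat - pi0) / se0
p_z = 2 * stats.norm.sf(abs(z))
print(t, p_t, z, p_z)
```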

Chapter 7 Comparison of Two Groups


Compare means: $(\bar{y}_2 - \bar{y}_1) \pm t(se)$ with $se = \sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}$, or with pooled $s$: $se = s\sqrt{\frac{1}{n_1} + \frac{1}{n_2}}$
Test $H_0$: $\mu_1 = \mu_2$ using $t = \frac{\bar{y}_2 - \bar{y}_1}{se}$
Compare proportions: $(\hat{\pi}_2 - \hat{\pi}_1) \pm z(se)$ with $se = \sqrt{\frac{\hat{\pi}_1(1 - \hat{\pi}_1)}{n_1} + \frac{\hat{\pi}_2(1 - \hat{\pi}_2)}{n_2}}$
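
An illustrative Python sketch of the unpooled comparison of means and the comparison of proportions; all summary statistics are made up.

```python
# Two-group comparisons (Chapter 7 formulas), using invented summary statistics.
import math

# Means: (ybar2 - ybar1) +/- t * se, with se = sqrt(s1^2/n1 + s2^2/n2)
ybar1, s1, n1 = 5.0, 1.2, 40
ybar2, s2, n2 = 5.8, 1.5, 50
se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
t = (ybar2 - ybar1) / se          # test of H0: mu1 = mu2

# Proportions: (pihat2 - pihat1) +/- z * se
p1, m1 = 0.30, 200
p2, m2 = 0.42, 180
se_p = math.sqrt(p1 * (1 - p1) / m1 + p2 * (1 - p2) / m2)
z95 = 1.96                        # approximate 95% z value
print(t, (p2 - p1) - z95 * se_p, (p2 - p1) + z95 * se_p)
```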

Chapter 8 Analyzing Association Between Categorical Variables


Chi-squared test of $H_0$: independence, $\chi^2 = \sum \frac{(f_0 - f_e)^2}{f_e}$, $df = (r - 1)(c - 1)$,
se of $f_0 - f_e$: $\sqrt{f_e(1 - \text{row proportion})(1 - \text{column proportion})}$

Ordinal measure $\hat{\gamma} = \frac{C - D}{C + D}$, $-1 \le \hat{\gamma} \le 1$; test statistic $z = \frac{\hat{\gamma}}{\hat{\sigma}_{\hat{\gamma}}}$, confidence interval $\hat{\gamma} \pm z\hat{\sigma}_{\hat{\gamma}}$
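
A Python sketch of the chi-squared statistic, computing the expected frequencies and summing $(f_0 - f_e)^2 / f_e$ over a hypothetical 2 x 3 table (the counts are invented).

```python
# Chi-squared test of independence (Chapter 8 formula) on a made-up table.
observed = [[20, 30, 50],
            [30, 40, 30]]
row_tot = [sum(row) for row in observed]
col_tot = [sum(col) for col in zip(*observed)]
total = sum(row_tot)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, f0 in enumerate(row):
        fe = row_tot[i] * col_tot[j] / total   # expected frequency under independence
        chi2 += (f0 - fe) ** 2 / fe

df = (len(observed) - 1) * (len(observed[0]) - 1)   # (r - 1)(c - 1)
print(chi2, df)
```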

Chapter 9 Linear Regression and Correlation

Linear regression model $E(y) = \alpha + \beta x$, prediction equation $\hat{y} = a + bx$
Pearson correlation $r = b\left(\frac{s_x}{s_y}\right)$, $-1 \le r \le 1$
$r^2 = \frac{TSS - SSE}{TSS}$, where $TSS = \sum (y - \bar{y})^2$, $SSE = \sum (y - \hat{y})^2$, $0 \le r^2 \le 1$
Test of independence $H_0$: $\beta = 0$, $t = \frac{b}{se}$, $df = n - 2$
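
A Python sketch that fits the least-squares line on invented data and reproduces $r$, $r^2$, and the $t$ statistic for $H_0$: $\beta = 0$. The standard error used for $b$, $\sqrt{SSE/(n-2)}\,/\sqrt{\sum (x - \bar{x})^2}$, is the usual least-squares expression and is stated here as an assumption since the sheet only writes $t = b/se$.

```python
# Simple linear regression quantities from the Chapter 9 formulas (invented data).
import math

x = [1, 2, 3, 4, 5, 6]
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
b = sxy / sxx              # slope
a = ybar - b * xbar        # intercept, so yhat = a + b x

tss = sum((yi - ybar) ** 2 for yi in y)
sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
r2 = (tss - sse) / tss
r = math.copysign(math.sqrt(r2), b)                # same value as b * (sx / sy)

se_b = math.sqrt(sse / (n - 2)) / math.sqrt(sxx)   # assumed se formula for b
t = b / se_b                                       # df = n - 2
print(a, b, r, r2, t)
```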

Chapter 11 Multiple Regression and Correlation

Multiple regression model $E(y) = \alpha + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k$


Global test $H_0$: $\beta_1 = \cdots = \beta_k = 0$, test statistic
$F = \frac{\text{Model mean square}}{\text{Error mean square}} = \frac{R^2/k}{(1 - R^2)/[n - (k + 1)]}$, $df_1 = k$, $df_2 = n - (k + 1)$

Partial test $H_0$: $\beta_i = 0$, test statistic $t = \frac{b_i}{se}$, $df = n - (k + 1)$

Model comparison (complete versus reduced models): test statistic
$F = \frac{(SSE_r - SSE_c)/df_1}{SSE_c/df_2}$, $df_1 = df_r - df_c$, $df_2 = df_c$

Squared partial correlation $r^2_{yx_2 \cdot x_1} = \frac{R^2 - r^2_{yx_1}}{1 - r^2_{yx_1}}$
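
A Python sketch of the global F test: a two-predictor model is fit with NumPy least squares on simulated data, and F is formed from $R^2$ exactly as above. All numbers are invented.

```python
# Global F test for multiple regression (Chapter 11 formulas), k = 2 predictors.
import numpy as np

rng = np.random.default_rng(0)
n, k = 40, 2
X = rng.normal(size=(n, k))
y = 1.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

A = np.column_stack([np.ones(n), X])          # intercept plus predictors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
yhat = A @ coef

sse = np.sum((y - yhat) ** 2)
tss = np.sum((y - y.mean()) ** 2)
R2 = (tss - sse) / tss

F = (R2 / k) / ((1 - R2) / (n - (k + 1)))     # df1 = k, df2 = n - (k + 1)
print(R2, F)
```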

Chapter 12 Comparing Groups: Analysis of Variance Methods

$H_0$: $\mu_1 = \cdots = \mu_g$, one-way ANOVA test statistic
$F = \frac{\text{Between-group sum of squares}/(g - 1)}{\text{Within-group sum of squares}/(N - g)}$, $df_1 = g - 1$, $df_2 = N - g$
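
A Python sketch of the F statistic computed from the between- and within-group sums of squares for three invented groups.

```python
# One-way ANOVA F statistic (Chapter 12 formula) on made-up groups.
groups = [[4.1, 5.0, 5.5, 4.7],
          [6.2, 5.9, 6.8, 6.4],
          [5.1, 5.3, 4.9, 5.6]]
g = len(groups)
N = sum(len(grp) for grp in groups)
grand = sum(sum(grp) for grp in groups) / N

ss_between = sum(len(grp) * (sum(grp) / len(grp) - grand) ** 2 for grp in groups)
ss_within = sum(sum((v - sum(grp) / len(grp)) ** 2 for v in grp) for grp in groups)

F = (ss_between / (g - 1)) / (ss_within / (N - g))   # df1 = g - 1, df2 = N - g
print(F)
```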

Chapter 13 Combining Regression and ANOVA: Analysis of Covariance

$E(y) = \alpha + \beta x + \beta_1 z_1 + \cdots + \beta_{g-1} z_{g-1}$, where $z_i = 1$ or $0$ is the dummy variable for group $i$

Partial test: test statistic $F = \frac{\text{Effect mean square}}{\text{Error mean square}}$
For the covariate, $df_1 = 1$; for a factor or interaction, $df_1 = g - 1$
Model with interaction: $df_2 = n - 2g$; model without interaction: $df_2 = n - g - 1$
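
A Python sketch showing one way to code the $g - 1$ dummy variables into a design matrix for the no-interaction model; the data, the group labels, and the choice of baseline group are assumptions made for the example.

```python
# Design matrix for analysis of covariance, E(y) = a + b*x + b1*z1 + ... (no interaction).
import numpy as np

group = np.array([0, 0, 1, 1, 1, 2, 2, 2, 0, 1])                    # g = 3 groups
x = np.array([1.2, 2.0, 1.5, 2.2, 3.0, 0.8, 1.9, 2.5, 2.7, 1.1])    # covariate
y = np.array([3.1, 4.0, 5.2, 5.9, 6.8, 2.5, 3.9, 4.8, 4.9, 4.6])
g = 3

# z_i = 1 if the observation is in group i, else 0; the last group is the baseline.
Z = np.column_stack([(group == i).astype(float) for i in range(g - 1)])
A = np.column_stack([np.ones(len(y)), x, Z])

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)   # alpha, beta, beta_1, ..., beta_{g-1}
```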

Chapter 14 Model Building and Multiple Regression

Quadratic regression $E(y) = \alpha + \beta_1 x + \beta_2 x^2$


Exponential regression $E(y) = \alpha\beta^x$ (the log of the mean is linear in $x$)
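
A Python sketch of both model forms on invented data: the quadratic is fit as a degree-2 polynomial, and the exponential is fit by regressing log y on x, which is exactly the "log of the mean is linear in x" remark above.

```python
# Quadratic and exponential regression (Chapter 14 forms) on made-up data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])

# Quadratic: E(y) = a + b1*x + b2*x^2 (polyfit returns highest degree first)
y_quad = 2 + 0.5 * x - 0.1 * x ** 2 + np.array([0.05, -0.02, 0.03, 0.0, -0.04, 0.02])
b2, b1, a = np.polyfit(x, y_quad, 2)

# Exponential: E(y) = alpha * beta^x, so log E(y) = log(alpha) + x * log(beta)
y_exp = 3.0 * 1.4 ** x
slope, intercept = np.polyfit(x, np.log(y_exp), 1)
alpha, beta = np.exp(intercept), np.exp(slope)
print((a, b1, b2), (alpha, beta))   # roughly (2, 0.5, -0.1) and (3.0, 1.4)
```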

Chapter 15 Logistic Regression: Modeling Categorical Responses


 
Logistic regression: $\text{logit} = \log(\text{odds}) = \log\left[\frac{P(y = 1)}{1 - P(y = 1)}\right] = \alpha + \beta x$
$P(y = 1) = \frac{e^{\alpha + \beta x}}{1 + e^{\alpha + \beta x}} = \frac{\text{odds}}{1 + \text{odds}}$
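
A Python sketch that turns assumed logistic coefficients alpha and beta into the odds and $P(y = 1)$ at a chosen $x$; the coefficient values are made up, not estimated from data.

```python
# Logistic regression formulas (Chapter 15): logit, odds, and probability.
import math

alpha, beta = -3.0, 0.5   # assumed coefficients for illustration
x = 8.0

logit = alpha + beta * x          # log(odds)
odds = math.exp(logit)
p = odds / (1 + odds)             # equals e^(alpha + beta*x) / (1 + e^(alpha + beta*x))
print(logit, odds, p)
```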
