Formulas
Numerical Descriptive Techniques
Population mean
$\mu = \dfrac{\sum_{i=1}^{N} x_i}{N}$
Sample mean
$\bar{x} = \dfrac{\sum_{i=1}^{n} x_i}{n}$
Range
Largest observation - Smallest observation
Population variance
$\sigma^2 = \dfrac{\sum_{i=1}^{N}(x_i - \mu)^2}{N}$
Sample variance
$s^2 = \dfrac{\sum_{i=1}^{n}(x_i - \bar{x})^2}{n-1}$
Population standard deviation
$\sigma = \sqrt{\sigma^2}$
Sample covariance
$s_{xy} = \dfrac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n-1}$
Population coefficient of correlation
$\rho = \dfrac{\sigma_{xy}}{\sigma_x \sigma_y}$
Sample coefficient of correlation
$r = \dfrac{s_{xy}}{s_x s_y}$
Slope coefficient
$b_1 = \dfrac{\mathrm{cov}(x,y)}{s_x^2}$
y-intercept
$b_0 = \bar{y} - b_1\bar{x}$
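These descriptive formulas can be cross-checked numerically. Below is a minimal Python sketch using only the standard library; the data values are hypothetical and chosen only for illustration:

```python
import statistics

# hypothetical paired observations
data_x = [2.0, 4.0, 6.0, 8.0]
data_y = [1.0, 3.0, 5.0, 9.0]
n = len(data_x)

x_bar = statistics.mean(data_x)      # sample mean
y_bar = statistics.mean(data_y)
s2_x = statistics.variance(data_x)   # sample variance (n-1 denominator)

# sample covariance: sum of cross-deviations divided by n-1
s_xy = sum((x - x_bar) * (y - y_bar)
           for x, y in zip(data_x, data_y)) / (n - 1)

# sample coefficient of correlation, slope coefficient, y-intercept
r = s_xy / (statistics.stdev(data_x) * statistics.stdev(data_y))
b1 = s_xy / s2_x
b0 = y_bar - b1 * x_bar
```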
Probability
Conditional probability
P(A|B) = P(A and B)/P(B)
Complement rule
$P(A^C) = 1 - P(A)$
Multiplication rule
P(A and B) = P(A|B)P(B)
Addition rule
P(A or B) = P(A) + P(B) - P(A and B)
Bayes' Law
$P(A_i \mid B) = \dfrac{P(A_i)\,P(B \mid A_i)}{P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2) + \cdots + P(A_k)\,P(B \mid A_k)}$
Random Variables and Discrete Probability Distributions
Expected value
$E(X) = \sum_{\text{all } x} x\,P(x)$
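Bayes' Law is prior-times-likelihood renormalized by the total probability of the evidence. A short sketch with made-up priors and likelihoods:

```python
# hypothetical priors P(A_i) and likelihoods P(B | A_i)
priors = [0.5, 0.3, 0.2]
likelihoods = [0.02, 0.05, 0.10]

# denominator of Bayes' Law: P(B) by the total probability rule
evidence = sum(p * l for p, l in zip(priors, likelihoods))

# posterior P(A_i | B) for each i
posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
```

By construction the posteriors sum to 1.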
Variance
$V(X) = \sigma^2 = \sum_{\text{all } x} (x - \mu)^2 P(x)$
Standard deviation
$\sigma = \sqrt{\sigma^2}$
Covariance
$\mathrm{COV}(X, Y) = \sum_{\text{all } x}\sum_{\text{all } y} (x - \mu_x)(y - \mu_y)\,P(x, y)$
Coefficient of Correlation
$\rho = \dfrac{\mathrm{COV}(X, Y)}{\sigma_x \sigma_y}$
Laws of expected value
1. E(c) = c
2. E(X + c) = E(X) + c
3. E(cX) = cE(X)
Laws of variance
1. V(c) = 0
2. V(X + c) = V(X)
3. $V(cX) = c^2 V(X)$
Laws of expected value and variance of the sum of two variables
1. E(X + Y) = E(X) + E(Y)
2. V(X + Y) = V(X) + V(Y) + 2COV(X, Y)
Laws of expected value and variance for the sum of more than two variables
1. $E\!\left(\sum_{i=1}^{k} X_i\right) = \sum_{i=1}^{k} E(X_i)$
2. $V\!\left(\sum_{i=1}^{k} X_i\right) = \sum_{i=1}^{k} V(X_i)$ if the variables are independent
Mean and variance of a portfolio of two stocks
E(Rp) = w1E(R1) + w2E(R2)
$V(R_p) = w_1^2 V(R_1) + w_2^2 V(R_2) + 2 w_1 w_2\,\mathrm{COV}(R_1, R_2)$
$= w_1^2 \sigma_1^2 + w_2^2 \sigma_2^2 + 2 w_1 w_2 \rho \sigma_1 \sigma_2$
Mean and variance of a portfolio of k stocks
$E(R_p) = \sum_{i=1}^{k} w_i E(R_i)$
$V(R_p) = \sum_{i=1}^{k} w_i^2 \sigma_i^2 + 2 \sum_{i=1}^{k} \sum_{j=i+1}^{k} w_i w_j\,\mathrm{COV}(R_i, R_j)$
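The two-stock portfolio formulas translate directly into code. A minimal sketch; the weights, returns, variances, and covariance below are hypothetical:

```python
# hypothetical two-stock portfolio: weights, expected returns,
# return variances, and covariance of returns
w1, w2 = 0.6, 0.4
er1, er2 = 0.08, 0.12
v1, v2 = 0.04, 0.09
cov12 = 0.01

# portfolio expected return: w1*E(R1) + w2*E(R2)
e_rp = w1 * er1 + w2 * er2

# portfolio variance: w1^2*V(R1) + w2^2*V(R2) + 2*w1*w2*COV(R1,R2)
v_rp = w1**2 * v1 + w2**2 * v2 + 2 * w1 * w2 * cov12
```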
Binomial probability
$P(X = x) = \dfrac{n!}{x!(n-x)!}\, p^x (1-p)^{n-x}$
$\mu = np$
$\sigma^2 = np(1-p)$
$\sigma = \sqrt{np(1-p)}$
Poisson probability
$P(X = x) = \dfrac{e^{-\mu}\mu^x}{x!}$
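The binomial probability and its moments can be sketched with the standard library's `math.comb`; the parameter values in the check below are arbitrary:

```python
from math import comb, sqrt

def binomial_pmf(n, p, x):
    # P(X = x) = n! / (x! (n-x)!) * p^x * (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binomial_moments(n, p):
    # mean np, variance np(1-p), standard deviation sqrt(np(1-p))
    mu = n * p
    var = n * p * (1 - p)
    return mu, var, sqrt(var)
```

Summing the pmf over x = 0..n gives 1, which is a quick sanity check.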
Continuous Probability Distributions
Standard normal random variable
$Z = \dfrac{X - \mu}{\sigma}$
Exponential distribution
$\mu = \sigma = 1/\lambda$
$P(X > x) = e^{-\lambda x}$
$P(X < x) = 1 - e^{-\lambda x}$
$P(x_1 < X < x_2) = P(X < x_2) - P(X < x_1) = e^{-\lambda x_1} - e^{-\lambda x_2}$
F distribution
$F_{1-A,\nu_1,\nu_2} = \dfrac{1}{F_{A,\nu_2,\nu_1}}$
Sampling Distributions
Expected value of the sample mean
$E(\bar{X}) = \mu_{\bar{x}} = \mu$
Variance of the sample mean
$V(\bar{X}) = \sigma_{\bar{x}}^2 = \dfrac{\sigma^2}{n}$
Standard error of the sample mean
$\sigma_{\bar{x}} = \dfrac{\sigma}{\sqrt{n}}$
Standardizing the sample mean
$Z = \dfrac{\bar{X} - \mu}{\sigma/\sqrt{n}}$
Expected value of the sample proportion
$E(\hat{P}) = p$
Variance of the sample proportion
$V(\hat{P}) = \sigma_{\hat{p}}^2 = \dfrac{p(1-p)}{n}$
Standard error of the sample proportion
$\sigma_{\hat{p}} = \sqrt{\dfrac{p(1-p)}{n}}$
Standardizing the sample proportion
$Z = \dfrac{\hat{P} - p}{\sqrt{p(1-p)/n}}$
Standardizing the difference between two sample means
$Z = \dfrac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{\sqrt{\dfrac{\sigma_1^2}{n_1} + \dfrac{\sigma_2^2}{n_2}}}$
Introduction to Estimation
Confidence interval estimator of $\mu$ ($\sigma$ known)
$\bar{x} \pm z_{\alpha/2}\dfrac{\sigma}{\sqrt{n}}$
Sample size to estimate $\mu$
$n = \left(\dfrac{z_{\alpha/2}\,\sigma}{W}\right)^2$
Test statistic for $\mu$ ($\sigma$ known)
$z = \dfrac{\bar{x} - \mu}{\sigma/\sqrt{n}}$
Test statistic for $\mu$ ($\sigma$ unknown)
$t = \dfrac{\bar{x} - \mu}{s/\sqrt{n}}$, $\nu = n - 1$
Confidence interval estimator of $\mu$ ($\sigma$ unknown)
$\bar{x} \pm t_{\alpha/2}\dfrac{s}{\sqrt{n}}$, $\nu = n - 1$
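The z-based confidence interval for μ can be sketched with `statistics.NormalDist`; the sample values and the assumed known σ below are hypothetical:

```python
from statistics import NormalDist, mean
from math import sqrt

# hypothetical sample; sigma assumed known from prior experience
sample = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]
sigma = 0.2
alpha = 0.05
n = len(sample)

x_bar = mean(sample)
z = NormalDist().inv_cdf(1 - alpha / 2)   # z_{alpha/2}, about 1.96
half_width = z * sigma / sqrt(n)          # z_{alpha/2} * sigma / sqrt(n)
lcl, ucl = x_bar - half_width, x_bar + half_width
```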
Test statistic for $\sigma^2$
$\chi^2 = \dfrac{(n-1)s^2}{\sigma^2}$, $\nu = n - 1$
Confidence interval estimator of $\sigma^2$
LCL = $\dfrac{(n-1)s^2}{\chi^2_{\alpha/2}}$
UCL = $\dfrac{(n-1)s^2}{\chi^2_{1-\alpha/2}}$
Confidence interval estimator of p
$\hat{p} \pm z_{\alpha/2}\sqrt{\hat{p}(1-\hat{p})/n}$
Sample size to estimate p
$n = \left(\dfrac{z_{\alpha/2}\sqrt{\hat{p}(1-\hat{p})}}{W}\right)^2$
Confidence interval estimator of the total of a large finite population
$N\left[\bar{x} \pm t_{\alpha/2}\dfrac{s}{\sqrt{n}}\right]$
Confidence interval estimator of the total number of successes in a large finite population
$N\left[\hat{p} \pm z_{\alpha/2}\sqrt{\dfrac{\hat{p}(1-\hat{p})}{n}}\right]$
Confidence interval estimator of $\mu$ when the population is small
$\bar{x} \pm t_{\alpha/2}\dfrac{s}{\sqrt{n}}\sqrt{\dfrac{N-n}{N-1}}$
Confidence interval estimator of the total in a small population
$N\left[\bar{x} \pm t_{\alpha/2}\dfrac{s}{\sqrt{n}}\sqrt{\dfrac{N-n}{N-1}}\right]$
Confidence interval estimator of p when the population is small
$\hat{p} \pm z_{\alpha/2}\sqrt{\dfrac{\hat{p}(1-\hat{p})}{n}}\sqrt{\dfrac{N-n}{N-1}}$
Confidence interval estimator of the total number of successes in a small population
$N\left[\hat{p} \pm z_{\alpha/2}\sqrt{\dfrac{\hat{p}(1-\hat{p})}{n}}\sqrt{\dfrac{N-n}{N-1}}\right]$
Inference About Two Populations
Equal-variances t-test of $\mu_1 - \mu_2$
$t = \dfrac{(\bar{x}_1 - \bar{x}_2) - (\mu_1 - \mu_2)}{\sqrt{s_p^2\left(\dfrac{1}{n_1} + \dfrac{1}{n_2}\right)}}$, $\nu = n_1 + n_2 - 2$, where $s_p^2 = \dfrac{(n_1-1)s_1^2 + (n_2-1)s_2^2}{n_1 + n_2 - 2}$ is the pooled variance
Equal-variances interval estimator of $\mu_1 - \mu_2$
$(\bar{x}_1 - \bar{x}_2) \pm t_{\alpha/2}\sqrt{s_p^2\left(\dfrac{1}{n_1} + \dfrac{1}{n_2}\right)}$, $\nu = n_1 + n_2 - 2$
Unequal-variances t-test of $\mu_1 - \mu_2$
$t = \dfrac{(\bar{x}_1 - \bar{x}_2) - (\mu_1 - \mu_2)}{\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}}$, $\nu = \dfrac{(s_1^2/n_1 + s_2^2/n_2)^2}{\dfrac{(s_1^2/n_1)^2}{n_1 - 1} + \dfrac{(s_2^2/n_2)^2}{n_2 - 1}}$
Unequal-variances interval estimator of $\mu_1 - \mu_2$
$(\bar{x}_1 - \bar{x}_2) \pm t_{\alpha/2}\sqrt{\dfrac{s_1^2}{n_1} + \dfrac{s_2^2}{n_2}}$, $\nu = \dfrac{(s_1^2/n_1 + s_2^2/n_2)^2}{\dfrac{(s_1^2/n_1)^2}{n_1 - 1} + \dfrac{(s_2^2/n_2)^2}{n_2 - 1}}$
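The equal-variances t-test above can be sketched with the standard library; the two samples below are hypothetical:

```python
from statistics import mean, variance
from math import sqrt

# hypothetical independent samples
x1 = [5.2, 4.8, 6.1, 5.5, 5.0]
x2 = [4.1, 4.6, 3.9, 4.4]
n1, n2 = len(x1), len(x2)

# pooled variance estimator s_p^2
sp2 = ((n1 - 1) * variance(x1) + (n2 - 1) * variance(x2)) / (n1 + n2 - 2)

# test statistic under H0: mu1 - mu2 = 0
t = (mean(x1) - mean(x2)) / sqrt(sp2 * (1 / n1 + 1 / n2))
nu = n1 + n2 - 2   # degrees of freedom
```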
t-Test of $\mu_D$
$t = \dfrac{\bar{x}_D - \mu_D}{s_D/\sqrt{n_D}}$, $\nu = n_D - 1$
t-Estimator of $\mu_D$
$\bar{x}_D \pm t_{\alpha/2}\dfrac{s_D}{\sqrt{n_D}}$, $\nu = n_D - 1$
F-test of $\sigma_1^2/\sigma_2^2$
$F = \dfrac{s_1^2}{s_2^2}$, $\nu_1 = n_1 - 1$ and $\nu_2 = n_2 - 1$
F-Estimator of $\sigma_1^2/\sigma_2^2$
LCL = $\left(\dfrac{s_1^2}{s_2^2}\right)\dfrac{1}{F_{\alpha/2,\nu_1,\nu_2}}$
UCL = $\left(\dfrac{s_1^2}{s_2^2}\right) F_{\alpha/2,\nu_2,\nu_1}$
z-test of $p_1 - p_2$
Case 1 ($H_0\!: p_1 - p_2 = 0$):
$z = \dfrac{\hat{p}_1 - \hat{p}_2}{\sqrt{\hat{p}(1-\hat{p})\left(\dfrac{1}{n_1} + \dfrac{1}{n_2}\right)}}$
Case 2 ($H_0\!: p_1 - p_2 = D$, $D \neq 0$):
$z = \dfrac{(\hat{p}_1 - \hat{p}_2) - (p_1 - p_2)}{\sqrt{\dfrac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \dfrac{\hat{p}_2(1-\hat{p}_2)}{n_2}}}$
z-estimator of $p_1 - p_2$
$(\hat{p}_1 - \hat{p}_2) \pm z_{\alpha/2}\sqrt{\dfrac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \dfrac{\hat{p}_2(1-\hat{p}_2)}{n_2}}$
Analysis of Variance
One-way analysis of variance
SST = $\sum_{j=1}^{k} n_j(\bar{x}_j - \bar{\bar{x}})^2$
SSE = $\sum_{j=1}^{k}\sum_{i=1}^{n_j} (x_{ij} - \bar{x}_j)^2$
MST = $\dfrac{\text{SST}}{k-1}$
MSE = $\dfrac{\text{SSE}}{n-k}$
F = $\dfrac{\text{MST}}{\text{MSE}}$
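The one-way ANOVA decomposition can be checked numerically: SST plus SSE should equal the total sum of squares. A sketch with hypothetical treatment data:

```python
from statistics import mean

# hypothetical observations for k = 3 treatments
groups = [[23, 25, 22], [30, 28, 29], [20, 21, 19]]
k = len(groups)
n = sum(len(g) for g in groups)
grand = mean(x for g in groups for x in g)   # grand mean

# between-treatments and within-treatments sums of squares
sst = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
sse = sum((x - mean(g)) ** 2 for g in groups for x in g)
ss_total = sum((x - grand) ** 2 for g in groups for x in g)

mst = sst / (k - 1)
mse = sse / (n - k)
f = mst / mse
```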
Two-way analysis of variance (randomized block design of experiment)
SS(Total) = $\sum_{i=1}^{b}\sum_{j=1}^{k} (x_{ij} - \bar{\bar{x}})^2$
SST = $b\sum_{j=1}^{k} (\bar{x}[T]_j - \bar{\bar{x}})^2$
SSB = $k\sum_{i=1}^{b} (\bar{x}[B]_i - \bar{\bar{x}})^2$
SSE = $\sum_{i=1}^{b}\sum_{j=1}^{k} (x_{ij} - \bar{x}[T]_j - \bar{x}[B]_i + \bar{\bar{x}})^2$
MST = $\dfrac{\text{SST}}{k-1}$
MSB = $\dfrac{\text{SSB}}{b-1}$
MSE = $\dfrac{\text{SSE}}{n-k-b+1}$
F = $\dfrac{\text{MST}}{\text{MSE}}$ (treatments)
F = $\dfrac{\text{MSB}}{\text{MSE}}$ (blocks)
Two-factor experiment
SS(Total) = $\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{k=1}^{r} (x_{ijk} - \bar{\bar{x}})^2$
SS(A) = $rb\sum_{i=1}^{a} (\bar{x}[A]_i - \bar{\bar{x}})^2$
SS(B) = $ra\sum_{j=1}^{b} (\bar{x}[B]_j - \bar{\bar{x}})^2$
SS(AB) = $r\sum_{i=1}^{a}\sum_{j=1}^{b} (\bar{x}[AB]_{ij} - \bar{x}[A]_i - \bar{x}[B]_j + \bar{\bar{x}})^2$
SSE = $\sum_{i=1}^{a}\sum_{j=1}^{b}\sum_{k=1}^{r} (x_{ijk} - \bar{x}[AB]_{ij})^2$
F = $\dfrac{\text{MS(A)}}{\text{MSE}}$
F = $\dfrac{\text{MS(B)}}{\text{MSE}}$
F = $\dfrac{\text{MS(AB)}}{\text{MSE}}$
Least Significant Difference Comparison Method
LSD = $t_{\alpha/2}\sqrt{\text{MSE}\left(\dfrac{1}{n_i} + \dfrac{1}{n_j}\right)}$
Tukey's multiple comparison method
$\omega = q_\alpha(k, \nu)\sqrt{\dfrac{\text{MSE}}{n_g}}$
Chi-Squared Tests
Test statistic for all procedures
$\chi^2 = \sum_{i=1}^{k}\dfrac{(f_i - e_i)^2}{e_i}$
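The chi-squared statistic is a single sum over cells. A minimal sketch with hypothetical observed and expected frequencies:

```python
# hypothetical observed frequencies and expected frequencies under H0
observed = [18, 22, 28, 32]
expected = [25, 25, 25, 25]

# chi-squared statistic: sum of (f_i - e_i)^2 / e_i over all cells
chi2 = sum((f - e) ** 2 / e for f, e in zip(observed, expected))
```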
Simple Linear Regression
Sample slope
$b_1 = \dfrac{s_{xy}}{s_x^2}$
Sample y-intercept
$b_0 = \bar{y} - b_1\bar{x}$
Sum of squares for error
SSE = $\sum_{i=1}^{n} (y_i - \hat{y}_i)^2$
Standard error of estimate
$s_\varepsilon = \sqrt{\dfrac{\text{SSE}}{n-2}}$
Standard error of $b_1$
$s_{b_1} = \dfrac{s_\varepsilon}{\sqrt{(n-1)s_x^2}}$
Coefficient of determination
$R^2 = \dfrac{s_{xy}^2}{s_x^2 s_y^2} = 1 - \dfrac{\text{SSE}}{\sum (y_i - \bar{y})^2}$
Prediction interval
$\hat{y} \pm t_{\alpha/2,\,n-2}\, s_\varepsilon \sqrt{1 + \dfrac{1}{n} + \dfrac{(x_g - \bar{x})^2}{(n-1)s_x^2}}$
Confidence interval estimator of the expected value of y
$\hat{y} \pm t_{\alpha/2,\,n-2}\, s_\varepsilon \sqrt{\dfrac{1}{n} + \dfrac{(x_g - \bar{x})^2}{(n-1)s_x^2}}$
Sample coefficient of correlation
$r = \dfrac{s_{xy}}{s_x s_y}$
Test statistic for testing $\rho = 0$
$t = r\sqrt{\dfrac{n-2}{1-r^2}}$
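The least-squares coefficients, SSE, standard error of estimate, and R² above chain together directly. A standard-library sketch on hypothetical (x, y) pairs:

```python
from statistics import mean, variance
from math import sqrt

# hypothetical (x, y) pairs
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
x_bar, y_bar = mean(x), mean(y)

# sample covariance, slope b1 = s_xy / s_x^2, intercept b0
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / (n - 1)
b1 = s_xy / variance(x)
b0 = y_bar - b1 * x_bar

# SSE, standard error of estimate, and coefficient of determination
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s_eps = sqrt(sse / (n - 2))
r2 = 1 - sse / sum((yi - y_bar) ** 2 for yi in y)
```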
Multiple Regression
Standard Error of Estimate
$s_\varepsilon = \sqrt{\dfrac{\text{SSE}}{n-k-1}}$
Test statistic for $\beta_i$
$t = \dfrac{b_i - \beta_i}{s_{b_i}}$
Coefficient of Determination
$R^2 = 1 - \dfrac{\text{SSE}}{\sum (y_i - \bar{y})^2}$
Adjusted Coefficient of Determination
Adjusted $R^2 = 1 - \dfrac{\text{SSE}/(n-k-1)}{\sum (y_i - \bar{y})^2/(n-1)}$
Mean Square for Error
MSE = SSE/(n-k-1)
Mean Square for Regression
MSR = SSR/k
F-statistic
F = MSR/MSE
Durbin-Watson statistic
$d = \dfrac{\sum_{i=2}^{n} (e_i - e_{i-1})^2}{\sum_{i=1}^{n} e_i^2}$
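The Durbin-Watson statistic compares successive residual differences to the residuals' total squared magnitude. A sketch on hypothetical residuals:

```python
# hypothetical time-ordered regression residuals
residuals = [0.5, 0.3, -0.2, -0.4, 0.1, 0.6, -0.3]

# numerator: squared differences of consecutive residuals (i = 2..n)
num = sum((residuals[i] - residuals[i - 1]) ** 2
          for i in range(1, len(residuals)))
# denominator: sum of squared residuals (i = 1..n)
den = sum(e ** 2 for e in residuals)

d = num / den   # values near 2 suggest no first-order autocorrelation
```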
Time Series Analysis and Forecasting
Exponential smoothing
$S_t = w y_t + (1 - w) S_{t-1}$
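The smoothing recursion is a one-line update per period. A minimal sketch, initializing the series with $S_1 = y_1$ (a common convention; the formula itself does not fix the starting value):

```python
def exponential_smoothing(series, w):
    """S_t = w*y_t + (1 - w)*S_{t-1}, initialized with S_1 = y_1."""
    s = [series[0]]
    for y in series[1:]:
        s.append(w * y + (1 - w) * s[-1])
    return s
```

For example, `exponential_smoothing([10, 20, 30], 0.5)` yields `[10, 15.0, 22.5]`.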
Nonparametric Statistical Techniques
Wilcoxon rank sum test statistic
$T = T_1$
$E(T) = \dfrac{n_1(n_1 + n_2 + 1)}{2}$
$\sigma_T = \sqrt{\dfrac{n_1 n_2 (n_1 + n_2 + 1)}{12}}$
$z = \dfrac{T - E(T)}{\sigma_T}$
Kruskal-Wallis Test
$H = \left[\dfrac{12}{n(n+1)} \sum_{j=1}^{k} \dfrac{T_j^2}{n_j}\right] - 3(n+1)$
Friedman Test
$F_r = \left[\dfrac{12}{b\,k(k+1)} \sum_{j=1}^{k} T_j^2\right] - 3b(k+1)$
Statistical Process Control
Control limits for the p chart
Upper control limit = $\bar{p} + 3\sqrt{\dfrac{\bar{p}(1-\bar{p})}{n}}$
Lower control limit = $\bar{p} - 3\sqrt{\dfrac{\bar{p}(1-\bar{p})}{n}}$
Decision Analysis
Expected Value of Perfect Information
EVPI = EPPI - EMV*
Expected Value of Sample Information
EVSI = EMV' - EMV*