
Sampling Distr of $\bar{x}$

For Finite Population:   $\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}\sqrt{\frac{N-n}{N-1}}$
For Infinite Population (i.e. $n/N < .05$):   $\sigma_{\bar{x}} = \frac{\sigma}{\sqrt{n}}$

Sampling Distr of $\bar{p}$

For Finite Population:   $\sigma_{\bar{p}} = \sqrt{\frac{p(1-p)}{n}}\sqrt{\frac{N-n}{N-1}}$
For Infinite Population:   $\sigma_{\bar{p}} = \sqrt{\frac{p(1-p)}{n}}$

Interval Estimation

$\sigma$ known:   $\bar{x} \pm z_{\alpha/2}\frac{\sigma}{\sqrt{n}}$
$\sigma$ unknown, $n \ge 30$:   $\bar{x} \pm z_{\alpha/2}\frac{s}{\sqrt{n}}$
$\sigma$ unknown, $n < 30$ (pop. normally distr.):   $\bar{x} \pm t_{\alpha/2}\frac{s}{\sqrt{n}}$
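As a quick illustration of the mean-interval formulas above, here is a minimal Python sketch; the sample values, the assumed known sigma, and the confidence level are made up for the example, and it assumes scipy is available:

import math
from scipy import stats

x = [12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9]   # hypothetical sample
n = len(x)
xbar = sum(x) / n
s = math.sqrt(sum((v - xbar) ** 2 for v in x) / (n - 1))  # sample standard deviation
alpha = 0.05

# sigma unknown, n < 30, population assumed normal: t-interval with n-1 df
t_half = stats.t.ppf(1 - alpha / 2, df=n - 1)
print("t-interval:", (xbar - t_half * s / math.sqrt(n), xbar + t_half * s / math.sqrt(n)))

# if sigma were known (say sigma = 0.3), the z-interval would be used instead
sigma = 0.3
z_half = stats.norm.ppf(1 - alpha / 2)
print("z-interval:", (xbar - z_half * sigma / math.sqrt(n), xbar + z_half * sigma / math.sqrt(n)))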

Margin of error:   $E = z_{\alpha/2}\frac{\sigma}{\sqrt{n}}$

Necessary sample size for a given margin of error:   $n = \frac{(z_{\alpha/2})^2\,\sigma^2}{E^2}$
Interval estimation of a population proportion:   $\bar{p} \pm z_{\alpha/2}\sqrt{\frac{\bar{p}(1-\bar{p})}{n}}$

Margin of error:   $E = z_{\alpha/2}\sqrt{\frac{\bar{p}(1-\bar{p})}{n}}$

Necessary sample size:   $n = \frac{(z_{\alpha/2})^2\,\bar{p}(1-\bar{p})}{E^2}$
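A minimal sketch of the two sample-size formulas above; the planning values sigma, p_bar, and E are assumptions chosen only for illustration:

import math
from scipy import stats

alpha = 0.05
z_half = stats.norm.ppf(1 - alpha / 2)

# sample size to estimate a mean to within E, given a planning value for sigma
sigma, E = 4.0, 1.0
n_mean = (z_half ** 2 * sigma ** 2) / E ** 2
print("n for mean:", math.ceil(n_mean))          # round up to the next whole unit

# sample size to estimate a proportion; p_bar = 0.5 is the most conservative planning value
p_bar, E = 0.5, 0.03
n_prop = (z_half ** 2 * p_bar * (1 - p_bar)) / E ** 2
print("n for proportion:", math.ceil(n_prop))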
Sample Size for a Hypothesis Test about a Population Mean

$n = \frac{(z_{\alpha} + z_{\beta})^2\,\sigma^2}{(\mu_0 - \mu_a)^2}$
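A short sketch of that sample-size formula; alpha, beta, sigma, and the two means below are assumed values for illustration:

import math
from scipy import stats

alpha, beta = 0.05, 0.10           # desired error probabilities (assumed)
sigma = 5.0                        # planning value for sigma (assumed)
mu0, mua = 100.0, 102.0            # null and alternative means (assumed)

z_a = stats.norm.ppf(1 - alpha)    # one-tailed test; use alpha/2 for a two-tailed test
z_b = stats.norm.ppf(1 - beta)
n = (z_a + z_b) ** 2 * sigma ** 2 / (mu0 - mua) ** 2
print("required n:", math.ceil(n))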
Properties of the Sampling Distribution of $\bar{x}_1 - \bar{x}_2$

Expected Value:   $E(\bar{x}_1 - \bar{x}_2) = \mu_1 - \mu_2$
Standard Deviation:   $\sigma_{\bar{x}_1 - \bar{x}_2} = \sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}$

Large sample case ($n_1 \ge 30$ and $n_2 \ge 30$):

Interval Estimate with $\sigma_1$ and $\sigma_2$ Known:   $\bar{x}_1 - \bar{x}_2 \pm z_{\alpha/2}\,\sigma_{\bar{x}_1 - \bar{x}_2}$
Interval Estimate with $\sigma_1$ and $\sigma_2$ Unknown:   $\bar{x}_1 - \bar{x}_2 \pm z_{\alpha/2}\,s_{\bar{x}_1 - \bar{x}_2}$, where $s_{\bar{x}_1 - \bar{x}_2} = \sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}$

Small sample case ($n_1 < 30$ and/or $n_2 < 30$):

Interval Estimate with $\sigma^2$ Known:   $\bar{x}_1 - \bar{x}_2 \pm z_{\alpha/2}\,\sigma_{\bar{x}_1 - \bar{x}_2}$, where $\sigma_{\bar{x}_1 - \bar{x}_2} = \sqrt{\sigma^2\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}$
Interval Estimate with $\sigma^2$ Unknown:   $\bar{x}_1 - \bar{x}_2 \pm t_{\alpha/2}\,s_{\bar{x}_1 - \bar{x}_2}$,
where $s_{\bar{x}_1 - \bar{x}_2} = \sqrt{s^2\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}$ and $s^2 = \frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}$
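A minimal sketch of the small-sample, sigma-unknown (pooled-variance) interval; the two samples below are made up:

import math
from scipy import stats

x1 = [10.2, 9.8, 10.5, 10.1, 9.9]            # hypothetical sample 1
x2 = [9.1, 9.4, 8.8, 9.0, 9.3, 9.2]          # hypothetical sample 2
n1, n2 = len(x1), len(x2)
m1, m2 = sum(x1) / n1, sum(x2) / n2
s1sq = sum((v - m1) ** 2 for v in x1) / (n1 - 1)
s2sq = sum((v - m2) ** 2 for v in x2) / (n2 - 1)

# pooled estimate of the common variance and the standard error of the difference
s2 = ((n1 - 1) * s1sq + (n2 - 1) * s2sq) / (n1 + n2 - 2)
se = math.sqrt(s2 * (1 / n1 + 1 / n2))

alpha = 0.05
t_half = stats.t.ppf(1 - alpha / 2, df=n1 + n2 - 2)
diff = m1 - m2
print("95% interval for mu1 - mu2:", (diff - t_half * se, diff + t_half * se))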
Properties of Sampling Distribution of $\bar{p}_1 - \bar{p}_2$

Expected Value:   $E(\bar{p}_1 - \bar{p}_2) = p_1 - p_2$
Standard Deviation:   $\sigma_{\bar{p}_1 - \bar{p}_2} = \sqrt{\frac{p_1(1-p_1)}{n_1} + \frac{p_2(1-p_2)}{n_2}}$

Interval Estimate:   $\bar{p}_1 - \bar{p}_2 \pm z_{\alpha/2}\,s_{\bar{p}_1 - \bar{p}_2}$
Point Estimator of $\sigma_{\bar{p}_1 - \bar{p}_2}$:   $s_{\bar{p}_1 - \bar{p}_2} = \sqrt{\frac{\bar{p}_1(1-\bar{p}_1)}{n_1} + \frac{\bar{p}_2(1-\bar{p}_2)}{n_2}}$

Test statistic:   $z = \frac{(\bar{p}_1 - \bar{p}_2) - (p_1 - p_2)}{s_{\bar{p}_1 - \bar{p}_2}}$
Point Estimator of $\sigma_{\bar{p}_1 - \bar{p}_2}$ when $p_1 = p_2$:   $s_{\bar{p}_1 - \bar{p}_2} = \sqrt{\bar{p}(1-\bar{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}$, where $\bar{p} = \frac{n_1\bar{p}_1 + n_2\bar{p}_2}{n_1 + n_2}$
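A minimal sketch of the pooled two-proportion z test under H0: p1 = p2; the success counts and sample sizes are made up:

import math
from scipy import stats

x1, n1 = 56, 200        # hypothetical successes / trials in sample 1
x2, n2 = 38, 180        # hypothetical successes / trials in sample 2
p1, p2 = x1 / n1, x2 / n2

# pooled estimate of the common proportion and the resulting standard error
p_pool = (n1 * p1 + n2 * p2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))

z = (p1 - p2) / se                       # (p1 - p2) - 0 under H0
p_value = 2 * stats.norm.sf(abs(z))      # two-tailed p-value
print("z =", z, "p-value =", p_value)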

Test Statistic for goodness of fit:   $\chi^2 = \sum_{i=1}^{k}\frac{(f_i - e_i)^2}{e_i}$, with degrees of freedom $= k - 1$ and all $e_i \ge 5$

Expected freq. for contingency tables (assuming independence):   $e_{ij} = \frac{(\text{row } i \text{ total})(\text{column } j \text{ total})}{\text{sample size}}$

Test statistic for independence:   $\chi^2 = \sum_i\sum_j\frac{(f_{ij} - e_{ij})^2}{e_{ij}}$, with degrees of freedom $= (n-1)(m-1)$ for an $n \times m$ table and all $e_{ij} \ge 5$
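A minimal sketch of the independence test on a hypothetical 2 x 3 table of observed frequencies (the counts are made up):

from scipy import stats

observed = [[20, 30, 50],     # hypothetical observed frequencies f_ij
            [30, 40, 30]]
row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# expected frequencies under independence: e_ij = (row total)(column total)/n
expected = [[r * c / n for c in col_totals] for r in row_totals]

chi2 = sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
           for i in range(len(observed)) for j in range(len(observed[0])))
df = (len(observed) - 1) * (len(observed[0]) - 1)
p_value = stats.chi2.sf(chi2, df)
print("chi-square =", chi2, "df =", df, "p-value =", p_value)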

Between samples estimate of popln variance - Mean Square Between:

$MSB = \frac{\sum_{j=1}^{k} n_j(\bar{x}_j - \bar{\bar{x}})^2}{k - 1} = \frac{SSB}{k - 1}$   ($\bar{\bar{x}}$ = overall sample mean)

MSB = Sum of Squares Between (SSB) divided by its degrees of freedom; degrees of freedom of SSB = $k - 1$

Within samples estimate of popln variance - Mean Square Within:

$MSW = \frac{\sum_{j=1}^{k} (n_j - 1)s_j^2}{n_T - k} = \frac{SSE}{n_T - k}$

MSW = Sum of Squares due to Error (SSE) divided by its degrees of freedom; degrees of freedom of SSE = $n_T - k$

Test statistic for equality of k population means: F = MSB / MSE   (MSE is the same quantity as MSW)
ANOVA TABLE

Source of Variation   Sum of Squares   Degrees of Freedom   Mean Squares   F
Treatment             SSB              k - 1                MSB            MSB/MSE
Error                 SSE              nT - k               MSE
Total                 SST              nT - 1

where $SST = \sum_{j}\sum_{i}(x_{ij} - \bar{\bar{x}})^2 = SSB + SSE$, and $SST/(n_T - 1)$ is the overall sample variance of the $n_T$ observations.

Fisher's LSD Procedure Test Statistic:   $t = \frac{\bar{x}_i - \bar{x}_j}{\sqrt{MSW\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}}$   ($n_T - k$ degrees of freedom)

Fisher's LSD procedure based on the test statistic $\bar{x}_i - \bar{x}_j$ rejects $H_0{:}\ \mu_i = \mu_j$ when $|\bar{x}_i - \bar{x}_j| \ge t_{\alpha/2}\sqrt{MSW\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}$
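A minimal one-way ANOVA sketch on three made-up treatment groups, computing MSB, MSW (MSE), and the F statistic:

from scipy import stats

groups = [[18, 21, 20, 22], [24, 26, 25, 27, 23], [19, 18, 20, 21]]  # hypothetical data
k = len(groups)
nT = sum(len(g) for g in groups)
grand_mean = sum(sum(g) for g in groups) / nT

# between-samples estimate: MSB = SSB / (k - 1)
ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
msb = ssb / (k - 1)

# within-samples estimate: MSW = SSE / (nT - k)
sse = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
msw = sse / (nT - k)

F = msb / msw
p_value = stats.f.sf(F, k - 1, nT - k)
print("F =", F, "p-value =", p_value)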
Simple Linear Regression Model:   $y = \beta_0 + \beta_1 x + \epsilon$
Simple Linear Regression Equation:   $E(y) = \beta_0 + \beta_1 x$
Estimated Simple Linear Regression Equation:   $\hat{y} = b_0 + b_1 x$

Least Squares Criterion:   $\min \sum (y_i - b_0 - b_1 x_i)^2$

Slope for the Estimated Regression Equation:   $b_1 = \frac{\sum x_i y_i - (\sum x_i)(\sum y_i)/n}{\sum x_i^2 - (\sum x_i)^2/n}$
Intercept:   $b_0 = \bar{y} - b_1\bar{x}$

SST = SSR + SSE:   $\sum (y_i - \bar{y})^2 = \sum (\hat{y}_i - \bar{y})^2 + \sum (y_i - \hat{y}_i)^2$

Coefficient of Determination:   $r^2 = SSR/SST$

Where: SST = total sum of squares, SSR = sum of squares due to regression, SSE = sum of squares due to error

Sample Correlation Coefficient:   $r_{xy} = (\text{sign of } b_1)\sqrt{r^2}$

An Estimate of $\sigma^2$:   $s^2 = MSE = SSE/(n - 2)$, where $SSE = \sum (y_i - \hat{y}_i)^2 = \sum (y_i - b_0 - b_1 x_i)^2$
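A minimal sketch applying the least-squares formulas above to a small made-up data set, reporting b1, b0, r^2, and s:

import math

x = [1, 2, 3, 4, 5]                 # hypothetical predictor values
y = [2.1, 2.9, 3.8, 5.2, 5.9]       # hypothetical responses
n = len(x)

sx, sy = sum(x), sum(y)
sxy = sum(a * b for a, b in zip(x, y))
sxx = sum(a * a for a in x)

b1 = (sxy - sx * sy / n) / (sxx - sx ** 2 / n)   # slope
b0 = sy / n - b1 * sx / n                        # intercept

y_hat = [b0 + b1 * a for a in x]
sst = sum((b - sy / n) ** 2 for b in y)          # total sum of squares
sse = sum((b - yh) ** 2 for b, yh in zip(y, y_hat))
ssr = sst - sse

r2 = ssr / sst                      # coefficient of determination
s = math.sqrt(sse / (n - 2))        # estimate of sigma
print("b1 =", b1, "b0 =", b0, "r^2 =", r2, "s =", s)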
Hypotheses:   $H_0{:}\ \beta_1 = 0$,   $H_a{:}\ \beta_1 \ne 0$

Test Statistic:   $t = \frac{b_1}{s_{b_1}}$, where $s_{b_1} = \frac{s}{\sqrt{\sum (x_i - \bar{x})^2}}$

Rejection Rule: Reject $H_0$ if $t < -t_{\alpha/2}$ or $t > t_{\alpha/2}$   ($n - 2$ degrees of freedom)

Confidence Interval for $\beta_1$:   $b_1 \pm t_{\alpha/2}\,s_{b_1}$
Confidence Interval Estimate of $E(y_p)$:   $\hat{y}_p \pm t_{\alpha/2}\,s_{\hat{y}_p}$, where $s_{\hat{y}_p}^2 = s^2\left[\frac{1}{n} + \frac{(x_p - \bar{x})^2}{\sum (x_i - \bar{x})^2}\right]$

Prediction Interval Estimate of $y_p$:   $\hat{y}_p \pm t_{\alpha/2}\,s_{\text{ind}}$   ($n - 2$ degrees of freedom), where $s_{\text{ind}}^2 = s^2\left[1 + \frac{1}{n} + \frac{(x_p - \bar{x})^2}{\sum (x_i - \bar{x})^2}\right]$
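A minimal sketch of the slope t test and the two interval estimates above; it reuses the same hypothetical data as the earlier fit sketch, and x_p = 4 is an assumed point of interest:

import math
from scipy import stats

x = [1, 2, 3, 4, 5]                          # hypothetical data (as above)
y = [2.1, 2.9, 3.8, 5.2, 5.9]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((a - xbar) ** 2 for a in x)
b1 = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / sxx   # equivalent slope formula
b0 = ybar - b1 * xbar
sse = sum((b - (b0 + b1 * a)) ** 2 for a, b in zip(x, y))
s = math.sqrt(sse / (n - 2))

# t test for H0: beta_1 = 0
s_b1 = s / math.sqrt(sxx)
t_stat = b1 / s_b1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

# confidence interval for E(y_p) and prediction interval for y_p at x_p
x_p = 4
y_p = b0 + b1 * x_p
t_half = stats.t.ppf(0.975, df=n - 2)
s_yp = s * math.sqrt(1 / n + (x_p - xbar) ** 2 / sxx)
s_ind = s * math.sqrt(1 + 1 / n + (x_p - xbar) ** 2 / sxx)
print("t =", t_stat, "p =", p_value)
print("CI for E(y_p):", (y_p - t_half * s_yp, y_p + t_half * s_yp))
print("PI for y_p:   ", (y_p - t_half * s_ind, y_p + t_half * s_ind))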
Residual for Observation $i$:   $y_i - \hat{y}_i$

Standardized Residual for Observation $i$:   $\frac{y_i - \hat{y}_i}{s_{y_i - \hat{y}_i}}$, where $s_{y_i - \hat{y}_i} = s\sqrt{1 - h_i}$ and $h_i = \frac{1}{n} + \frac{(x_i - \bar{x})^2}{\sum (x_i - \bar{x})^2}$
n ( xi x) 2