
Homework: 2

BUS173
Prepared By:
Name: Mohammed Aftab
101 0017 030
Section: 8

Submitted To:
M. Siddique Hossain (SQH)
Lecturer
School of Business
North South University, Dhaka
Date of Submission: 21 November 2011



Chi-squared distribution

For a large sample size the sampling distribution of chi-square can be closely approximated by a continuous curve known as the chi-squared distribution. If we can assume that a population distribution is normal, then it can be shown that the sample variance and the population variance are related through a probability distribution known as the chi-squared distribution. Multiplying s² by (n - 1) and dividing it by σ² converts it into a chi-squared random variable:

χ² = (n - 1)s²/σ²

It is possible to find the probabilities of this variable, for example
α = P[(n - 1)s²/σ² ≥ χ²(n-1,α)]
when the number of degrees of freedom is n - 1.
Important properties of the chi-squared distribution:
1. The χ² distribution is a continuous probability distribution which has the value zero at its lower limit and extends to infinity in the positive direction.
2. The exact shape of the distribution depends upon the number of degrees of freedom. When this value is small, the curve is skewed to the right; the distribution becomes more and more symmetrical as the value increases and can then be approximated by the normal distribution.
3. The mean of the chi-squared distribution is given by E(χ²) = v and the variance is given by Var(χ²) = 2v. The chi-square distribution with (n - 1) degrees of freedom is the distribution of the sum of squares of (n - 1) independent standard normal variables.
4. As v gets larger, the χ² distribution approaches the normal distribution with mean v and standard deviation √(2v).
5. The sum of independent χ² variables is also a χ² variable.
Chi-square distribution of sample and population variances:
Given a random sample of n observations from a normally distributed population whose population variance is σ² and whose resulting sample variance is s², it can be shown that
χ² = (n - 1)s²/σ² = Σ(xi - x̄)²/σ²
has a distribution known as the chi-square distribution with n - 1 degrees of freedom.
We can use the properties of the chi-square distribution to find the variance of the sampling distribution of the sample variance when the parent population is normal.



Confidence interval for variance

The confidence interval for the variance σ² is based on the sampling distribution of (n - 1)s²/σ², which follows a χ² distribution with (n - 1) degrees of freedom. A 100(1 - α)% confidence interval for σ² is constructed by first obtaining an interval about (n - 1)s²/σ².

A 100(1 - α)% confidence interval for (n - 1)s²/σ² is given by
P(χ²(n-1,1-α/2) ≤ (n - 1)s²/σ² ≤ χ²(n-1,α/2)) = 1 - α
which can be rearranged as
P((n - 1)s²/χ²(n-1,α/2) ≤ σ² ≤ (n - 1)s²/χ²(n-1,1-α/2)) = 1 - α
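These two limits can be checked numerically. A minimal sketch (not part of the original assignment) using scipy's χ² quantile function, run on the n = 18, s = 10.4, α = 0.10 figures of the variance-interval problem later in this document:

```python
from scipy.stats import chi2

def variance_ci(n, s, alpha):
    """100(1-alpha)% CI for a normal population variance sigma^2,
    based on (n-1)s^2/sigma^2 following a chi-square with n-1 df."""
    df = n - 1
    lower = df * s**2 / chi2.ppf(1 - alpha / 2, df)  # divide by the upper quantile
    upper = df * s**2 / chi2.ppf(alpha / 2, df)      # divide by the lower quantile
    return lower, upper

lo, hi = variance_ci(n=18, s=10.4, alpha=0.10)
print(lo, hi)  # approximately 66.65 and 212.04
```

The hand computation below uses rounded table values (27.59 and 8.67), so its limits agree with these to two decimals.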
Confidence interval for the difference between two normal population means:
Dependent samples: for example, in the pharmaceutical industry, say a drug is already in the market and working well, but a new drug is about to be launched. What we do is test the effectiveness of the new drug under similar conditions as the old drug and see whether there is any difference in the results. We consider samples to be dependent if the values in one sample are related to the values in the other sample. In this scheme the sample members are chosen in pairs, one from each population; thus the procedure is often known as the method of matched pairs.
Independent samples: in this scheme samples are drawn independently from two normally distributed populations with known population variances, so that the membership of one sample is not influenced by the membership of the other sample.


Test of hypothesis

As business decision makers we may wish to decide, on the basis of sample data:
1. Whether a new medicine is really effective in curing a disease.
2. Whether one training procedure is better than another.
3. Whether a light bulb produced under system A is better than one produced under system B.
Such decisions are called statistical decisions. Hypothesis tests are widely used in business and industry for making decisions.
It is here that probability and sampling theory play an ever increasing role in constructing the criteria on which business decisions are made.
The statistical testing of hypotheses is the most important technique in statistical inference; a hypothesis is an assumption to be tested.
Procedure for hypothesis testing:
1. Set up a hypothesis: the approach to hypothesis testing is not to construct a single hypothesis about the population parameter but rather to set up two hypotheses. They are:
the null hypothesis, denoted by H0, and
the alternative hypothesis, denoted by H1.
2. Set up a suitable significance level: the level of significance is generally specified before any samples are drawn, so that the results obtained will not influence our choice.
3. Determination of a suitable test statistic: the third step is to determine a suitable test statistic and its distribution. The general form of the test statistic is
Test statistic = (sample statistic - hypothesized parameter) / (std. dev. of the statistic)
4. Determine the critical region: it is important to specify, before the sample is taken, which values of the test statistic will lead to a rejection of H0 and which will lead to acceptance of H0.
5. Doing computations: we perform the various computations, from a random sample of size n, necessary for the test statistic. Then we see whether the sample result falls in the critical region or in the acceptance region.
6. Making decisions: draw statistical conclusions, and the decision maker may take the decision.
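The six steps can be sketched end to end in code. This is a minimal illustration, not from the text; the numbers (H0: μ ≤ 100 against H1: μ > 100 with σ = 15, n = 25, x̄ = 106, α = 0.025) are chosen to mirror problem 9.14 later in this document:

```python
from math import sqrt

# Step 1: hypotheses
mu0 = 100                 # H0: mu <= 100, H1: mu > 100
# Step 2: significance level, specified before sampling
alpha = 0.025
# Step 3: test statistic  z = (x_bar - mu0) / (sigma / sqrt(n))
sigma, n, x_bar = 15, 25, 106
z = (x_bar - mu0) / (sigma / sqrt(n))
# Step 4: critical region (upper-tail z test; 1.96 is the alpha = 0.025 cutoff)
z_crit = 1.96
# Steps 5-6: compute and decide
reject = z > z_crit
print(z, reject)  # z = 2.0, so H0 is rejected
```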



There are four possible results in a statistical hypothesis test:
1. the hypothesis is true but our test rejects it
2. the hypothesis is false but our test accepts it
3. the hypothesis is true and our test accepts it
4. the hypothesis is false and our test rejects it
The first two possibilities lead to error. If we reject a hypothesis when it should be accepted, we say that a type I error has been made.
If we accept a hypothesis when it should be rejected, we say that a type II error has been made.


Type I and type II errors

DECISION      H0: TRUE                   H0: FALSE
Accept H0     Correct decision (1 - α)   Type II error (β)
Reject H0     Type I error (α)           Correct decision (1 - β)



The probability of committing a type I error is designated as α and is called the level of significance.



Tests of hypothesis concerning large samples

Though it is difficult to draw a clear-cut line of demarcation between large and small samples, it is generally agreed that if the size of a sample exceeds 30 it should be regarded as a large sample.
Tests of hypothesis involving large samples are based on the following assumptions:
1. The sampling distribution of the test statistic is approximately normal.
2. Values given by the samples are sufficiently close to the population value and can be used in its place for the standard error of the estimate.
Interpretation of the probability values or p-values: the p-value provides more precise information about the strength of the rejection of the null hypothesis that results from the observed sample mean.
The p-value is the smallest significance level at which H0 can be rejected. Consider a random sample of n observations from a population that has a normal distribution with mean μ and standard deviation σ, and the resulting computed sample mean x̄.



The Features of the Power Function

1. The farther the true mean is from the hypothesized mean, the greater the power of the test, everything else being equal.
2. The smaller the significance level of the test, the smaller the power, everything else being equal. Thus reducing the probability of a type I error increases the probability of a type II error, but reducing α by 0.01 does not generally increase β by 0.01; the change is not linear.
3. The larger the population variance, the lower the power of the test, everything else being equal.
4. The larger the sample size, the greater the power of the test, everything else being equal.
5. The power of the test at the critical value equals 0.5, because the probability that a sample mean is above x̄c is, of course, 0.50.
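These features can be verified numerically. A hedged sketch for an upper-tail z test H0: μ ≤ 50 (the numbers σ = 10, n = 25, α = 0.05 are illustrative, not from the text):

```python
from math import sqrt
from statistics import NormalDist

def power(mu_true, mu0=50.0, sigma=10.0, n=25, alpha=0.05):
    """Power of the upper-tail z test: P(x_bar > critical value | mu_true)."""
    se = sigma / sqrt(n)
    x_crit = mu0 + NormalDist().inv_cdf(1 - alpha) * se  # critical sample mean
    return 1 - NormalDist(mu_true, se).cdf(x_crit)

x_crit = 50 + NormalDist().inv_cdf(0.95) * 2   # about 53.29
print(power(x_crit))                # property 5: power at the critical value is 0.5
print(power(55) > power(53))        # property 1: power grows as mu moves past mu0
```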


SUMS FROM THE BOOK:
8.2
Pairs   Before   After   Diff. (di)
1       6        8       -2
2       12       14      -2
3       8        9       -1
4       10       13      -3
5       6        7       -1
Mean    8.4      10.2    d̄ = -1.8

Std deviation sd = √{Σ(di - d̄)²/(n - 1)}
= √(2.8/4)
= 0.8367
ME = t(n-1,α/2) × sd/√n
= t(4,0.05) × 0.8367/√5
= 2.132 × 0.8367/√5
= 0.798
LCL = d̄ - ME = -2.598,  UCL = d̄ + ME = -1.002

ME = t(4,0.025) × sd/√n
= 2.776 × 0.8367/√5
= 1.04
Width = 2 × ME = 2 × 1.04 = 2.08
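The arithmetic in 8.2 can be reproduced directly with Python's statistics module; this sketch takes the t value 2.132 from the t table, as above:

```python
from math import sqrt
from statistics import mean, stdev

before = [6, 12, 8, 10, 6]
after = [8, 14, 9, 13, 7]
d = [b - a for b, a in zip(before, after)]   # paired differences

d_bar = mean(d)            # -1.8
s_d = stdev(d)             # 0.8367
t_crit = 2.132             # t(4, 0.05) from the table
me = t_crit * s_d / sqrt(len(d))
print(d_bar - me, d_bar + me)  # about -2.598 and -1.002
```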
8.4
Pairs   Without passive solar   With passive solar   Diff. (di)
1       485                     452                  33
2       423                     386                  37
3       515                     502                  13
4       425                     376                  49
5       653                     605                  48
6       386                     380                  6
7       426                     395                  31
8       473                     411                  62
9       454                     415                  39
10      496                     441                  55
Mean    473.6                   436.3                37.3

d̄ = Σdi/n = 37.3
sd = √{Σ(di - d̄)²/(n - 1)} = 17.658
t(n-1,α/2) = t(9,0.05) = 1.833
90% confidence interval:
d̄ - t(9,0.05) sd/√n ≤ μ1 - μ2 ≤ d̄ + t(9,0.05) sd/√n
37.3 - (1.833 × 17.658)/√10 ≤ μ1 - μ2 ≤ 37.3 + (1.833 × 17.658)/√10
27.06 ≤ μ1 - μ2 ≤ 47.54
8.
a. Degrees of freedom = n1 + n2 - 2 = 12 + 14 - 2 = 24
b. Degrees of freedom = n1 + n2 - 2 = 6 + 7 - 2 = 11
c. Degrees of freedom = n1 + n2 - 2 = 9 + 12 - 2 = 19

8.10
Approximate degrees of freedom: v = (sx²/nx + sy²/ny)² / [(sx²/nx)²/(nx - 1) + (sy²/ny)²/(ny - 1)]
a. v = (6/12 + 10/14)² / [(6/12)²/11 + (10/14)²/13]
= (289/196) / (1737/28028)
= 23.79 ≈ 24
b. v = (30/6 + 36/10)² / [(30/6)²/5 + (36/10)²/9]
= 73.96/6.44 = 11.48 ≈ 11
c. v = (16/9 + 25/12)² / [(16/9)²/8 + (25/12)²/11]
= 18.88 ≈ 19
d. v = (30/6 + 36/7)² / [(30/6)²/5 + (36/7)²/6]
= 10.93 ≈ 11
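The degrees-of-freedom formula applied in 8.10 is the Welch-Satterthwaite approximation; a small helper (not part of the assignment) turns each part into a one-liner:

```python
def welch_df(s2x, nx, s2y, ny):
    """Welch-Satterthwaite approximate degrees of freedom for two
    independent samples with unequal variances (s2x, s2y are variances)."""
    a, b = s2x / nx, s2y / ny
    return (a + b) ** 2 / (a**2 / (nx - 1) + b**2 / (ny - 1))

print(round(welch_df(6, 12, 10, 14), 2))   # 23.79, as in part (a)
print(round(welch_df(30, 6, 36, 10), 2))   # 11.48, as in part (b)
```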


8.18
ME = z(α/2) √[p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2]
a. ME = z(α/2) √(0.75 × 0.25/260 + 0.68 × 0.32/200) = 0.0425 z(α/2)
b. ME = z(α/2) √(0.6 × 0.4/400 + 0.68 × 0.32/500) = 0.0322 z(α/2)
c. ME = z(α/2) √(0.2 × 0.8/500 + 0.25 × 0.75/375) = 0.0286 z(α/2)


8.20
n1 = 120, p̂1 = 0.708
n2 = 163, p̂2 = 0.479
α = 0.02
α/2 = 0.01
ME = z(α/2) √[p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2]
= 2.33 √(0.708 × 0.292/120 + 0.479 × 0.521/163)
= 2.33 × 0.057 = 0.133
Confidence interval: (p̂1 - p̂2) - 0.133 ≤ p1 - p2 ≤ (p̂1 - p̂2) + 0.133
(0.708 - 0.479) - 0.133 ≤ p1 - p2 ≤ (0.708 - 0.479) + 0.133
0.096 ≤ p1 - p2 ≤ 0.362
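A sketch reproducing 8.20 numerically; it uses the exact z(0.01) ≈ 2.326 instead of the rounded table value 2.33, so the limits agree with the hand computation to about three decimals:

```python
from math import sqrt
from statistics import NormalDist

def two_prop_ci(p1, n1, p2, n2, alpha):
    """Large-sample CI for p1 - p2 from two independent samples."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - z * se, diff + z * se

lo, hi = two_prop_ci(0.708, 120, 0.479, 163, alpha=0.02)
print(lo, hi)  # about 0.096 and 0.362
```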



8.24
ME = z(α/2) σ/√n, so n = (z(α/2) σ/ME)²
a. n = (1.96/0.03)² × (0.5)² = 4268.44 × 0.25 = 1067.11 ≈ 1068
b. n = (1.96/0.05)² × (0.5)² = 1536.64 × 0.25 = 384.16 ≈ 385
c. As the margin of error increases, the required sample size decreases.

8.28
Using n = z(α/2)² p(1 - p)/ME² with the conservative value p(1 - p) = 0.25:
a. ME = 0.04, α = 0.1: n = (1.645)²(0.25)/(0.04)² = 422.82 ≈ 423
b. ME = 0.04, α/2 = 0.025: n = (1.96)²(0.25)/(0.04)² = 600.25 ≈ 601
c. ME = 0.04, α = 0.02: n = (2.33)²(0.25)/(0.04)² = 848.27 ≈ 849

8.30
α = 0.1, ME = 0.03
n = z(α/2)² p(1 - p)/ME² = (1.645)²(0.25)/(0.03)² = 751.67 ≈ 752
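The sample-size calculation in 8.30 as a helper function (the function name is mine; z = 1.645 is the table value used above, and n is rounded up so the margin of error is actually met):

```python
from math import ceil

def sample_size_proportion(z, me, pq=0.25):
    """Smallest n giving margin of error <= me for a proportion CI;
    pq = p(1-p), with 0.25 the conservative worst case."""
    return ceil(z**2 * pq / me**2)

print(sample_size_proportion(z=1.645, me=0.03))  # 752, matching 8.30
```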

8.40
a. α = 0.1
α/2 = 0.05
x̄ - ȳ = 480 - 520 = -40
d.f. = nx + ny - 2 = 10 + 12 - 2 = 20
sp² = [(nx - 1)sx² + (ny - 1)sy²]/(nx + ny - 2)
= (9 × 900 + 11 × 625)/20 = 748.75

Confidence interval:
(x̄ - ȳ) - t(20,0.05) √(sp²/nx + sp²/ny) ≤ μ1 - μ2 ≤ (x̄ - ȳ) + t(20,0.05) √(sp²/nx + sp²/ny)
-40 - 1.725 × 11.716 ≤ μ1 - μ2 ≤ -40 + 1.725 × 11.716
-60.21 ≤ μ1 - μ2 ≤ -19.79

b. α = 0.1
α/2 = 0.05
Degrees of freedom: v = (sx²/nx + sy²/ny)² / [(sx²/nx)²/(nx - 1) + (sy²/ny)²/(ny - 1)]
= (900/10 + 625/12)² / [(900/10)²/9 + (625/12)²/11]
= 17.6 ≈ 17
Confidence interval: (x̄ - ȳ) - t(v,α/2) √(sx²/nx + sy²/ny) ≤ μ1 - μ2 ≤ (x̄ - ȳ) + t(v,α/2) √(sx²/nx + sy²/ny)
-40 - 1.74 × 11.92 ≤ μ1 - μ2 ≤ -40 + 1.74 × 11.92
-60.74 ≤ μ1 - μ2 ≤ -19.26
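Part (a) of 8.40 can be checked with a pooled-variance helper (the function is a sketch of mine; the t value 1.725 is the t(20, 0.05) table entry used above):

```python
from math import sqrt

def pooled_ci(xbar, sx, nx, ybar, sy, ny, t_crit):
    """Equal-variance CI for mu_x - mu_y using the pooled sample variance."""
    sp2 = ((nx - 1) * sx**2 + (ny - 1) * sy**2) / (nx + ny - 2)
    me = t_crit * sqrt(sp2 / nx + sp2 / ny)
    diff = xbar - ybar
    return diff - me, diff + me

lo, hi = pooled_ci(480, 30, 10, 520, 25, 12, t_crit=1.725)
print(lo, hi)  # about -60.21 and -19.79
```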

8.42
nx = 21, x̄ = 72.1, sx = 11.3
ny = 18, ȳ = 73.8, sy = 10.6
d.f. = nx + ny - 2 = 21 + 18 - 2 = 37
x̄ - ȳ = 72.1 - 73.8 = -1.7
α = 0.2, α/2 = 0.1
sp² = [(nx - 1)sx² + (ny - 1)sy²]/(nx + ny - 2)
= (20 × 11.3² + 17 × 10.6²)/37
= 120.65
ME = t(37,0.1) √(120.65/21 + 120.65/18)
= 1.303 × 3.53
= 4.597
Confidence interval: (x̄ - ȳ) - ME ≤ μ1 - μ2 ≤ (x̄ - ȳ) + ME
-1.7 - 4.597 ≤ μ1 - μ2 ≤ -1.7 + 4.597
-6.297 ≤ μ1 - μ2 ≤ 2.897
8.50
n1 = 400, p̂1 = 0.75
n2 = 500, p̂2 = 0.45
α = 0.05, α/2 = 0.025
p̂1 - p̂2 = 0.75 - 0.45 = 0.30
√[p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2] = √(0.75 × 0.25/400 + 0.45 × 0.55/500) = 0.031
Confidence interval:
(p̂1 - p̂2) - z(α/2) × 0.031 ≤ p1 - p2 ≤ (p̂1 - p̂2) + z(α/2) × 0.031
0.30 - 1.96 × 0.031 ≤ p1 - p2 ≤ 0.30 + 1.96 × 0.031
0.239 ≤ p1 - p2 ≤ 0.361

7.42
a. P(χ²(n-1,1-α/2) ≤ χ²(n-1) ≤ χ²(n-1,α/2)) = 1 - α
P[(n - 1)s²/χ²(n-1,α/2) ≤ σ² ≤ (n - 1)s²/χ²(n-1,1-α/2)] = 1 - α
Lower limit = (n - 1)s²/χ²(n-1,α/2)
= 20 × 16/χ²(20,0.01) = 20 × 16/37.57 = 8.52
b. 15 × 64/χ²(15,0.025) = 15 × 64/27.49 = 34.92
c. 27 × 225/χ²(27,0.005) = 27 × 225/49.64 = 122.38

7.48
n = 18, s = 10.4, α = 0.1
P[(n - 1)s²/χ²(n-1,α/2) ≤ σ² ≤ (n - 1)s²/χ²(n-1,1-α/2)] = 1 - α
P[17 × 10.4²/χ²(17,0.05) ≤ σ² ≤ 17 × 10.4²/χ²(17,0.95)] = 0.90
Confidence interval: 17 × 108.16/27.59 ≤ σ² ≤ 17 × 108.16/8.67
66.64 ≤ σ² ≤ 212.08

9.4
a. H0: the food is unhealthy
H1: the food is healthy
b. H0: the food is healthy
H1: the food is unhealthy
9.8
As the sample size increases, the critical value x̄c decreases.
As the population variance σ² increases, the critical value increases.

9.10
z-values for the p-value, with z = (x̄ - μ0)/(σ/√n):
a. z = -10/(15/√25) = -3.33
b. z = -10/√(900/25) = -1.67
c. z = -10/√(400/25) = -2.5
d. z = -10/√(600/25) = -2.04

9.12
n = 9, α = 0.1
H0: μ ≥ 50
H1: μ < 50
z = (x̄ - μ0)/(σ/√n) = (48.52 - 50)/(3/√9) = -1.48
P(Z ≤ -1.48) = 0.069 < 0.1; as -1.48 < -1.28 = -z(0.1), we reject the null hypothesis.

9.14
Reject H0 if z = (x̄ - 100)/(σ/√n) > 1.96.
a. (106 - 100)/(15/√25) = 2 > 1.96, so reject H0.
b. (104 - 100)/(10/√25) = 2 > 1.96, so reject H0.
c. (95 - 100)/(10/√25) = -2.5 < 1.96, so accept H0.
d. (92 - 100)/(18/√25) = -2.22 < 1.96, so accept H0.


9.20
n = 170, x̄ = -2.91, s = 11.33
H0: μ ≤ 0
H1: μ > 0
z = (x̄ - 0)/(s/√n) = (-2.91 - 0)/(11.33/√170) = -3.35
Reject H0 if z exceeds the critical value; since -3.35 does not, we fail to reject H0.


9.24
H0: μ = 20
H1: μ ≠ 20
α = 0.05, n = 9
x̄ = 20.36

Reject H0 if t = (x̄ - μ0)/(s/√n) < -t(n-1,α/2) or t = (x̄ - μ0)/(s/√n) > t(n-1,α/2).

s² = [1/(n - 1)] Σ(xi - x̄)²
= (1/8)(3.002) = 0.3753
So, s = 0.6126
t = (20.36 - 20)/(0.6126/√9) = 1.76
t(8,0.025) = 2.306
Since -2.306 < 1.76 < 2.306, we fail to reject H0.
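The two-tailed t test in 9.24 in code form, working from the same summary statistics (scipy supplies the t quantile):

```python
from math import sqrt
from scipy.stats import t

n, x_bar, mu0 = 9, 20.36, 20
s = sqrt(3.002 / (n - 1))            # s^2 = 0.3753, s = 0.6126
t_stat = (x_bar - mu0) / (s / sqrt(n))
t_crit = t.ppf(1 - 0.05 / 2, n - 1)  # 2.306
reject = abs(t_stat) > t_crit
print(round(t_stat, 2), reject)      # 1.76, False -> fail to reject H0
```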



9.28
Reject H0 if (p̂ - p0)/√[p0(1 - p0)/n] > zα = 2.17, i.e. if p̂ > 0.25 + 2.17 √(0.25 × 0.75/n).
a. p̂ > 0.25 + 0.047 = 0.297

b. p̂ > 0.25 + 2.17 √(0.25 × 0.75/225)
= 0.25 + 0.0626 = 0.3126

c. p̂ > 0.25 + 2.17 √(0.25 × 0.75/625)
= 0.25 + 0.0376 = 0.2876

d. p̂ > 0.25 + 2.17 √(0.25 × 0.75/900)
= 0.25 + 0.0313 = 0.2813


9.34
p̂ = 28/50 = 0.56
n = 50
H0: p = 0.5
H1: p > 0.5
α = 0.05, zα = 1.645
Reject H0 if (p̂ - p0)/√[p0(1 - p0)/n] > zα.
z = (0.56 - 0.50)/√(0.5 × 0.5/50)
= 0.849
As 0.849 < 1.645 = zα, we fail to reject H0.
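The proportion test in 9.34 in code, with the one-tailed critical value z(0.05) = 1.645 taken from the normal table:

```python
from math import sqrt

p_hat, p0, n = 0.56, 0.50, 50
z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)  # test statistic under H0: p = 0.5
z_crit = 1.645                              # upper-tail cutoff for alpha = 0.05
print(round(z, 3), z > z_crit)              # 0.849, False -> fail to reject H0
```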


9.38
H0: p = 0.5 is rejected if p̂ > 0.54 or p̂ < 0.46 (n = 600).
a. True p = 0.52:
P(p̂ < 0.46) = P(Z < (0.46 - 0.52)/√(0.52 × 0.48/600)) = P(Z < -2.94) = 0.0016
P(p̂ > 0.54) = P(Z > (0.54 - 0.52)/√(0.52 × 0.48/600)) = P(Z > 0.98) = 0.1635
P(reject H0) = 0.0016 + 0.1635 = 0.1651

b. True p = 0.58:
P(0.46 < p̂ < 0.54) = P((0.46 - 0.58)/√(0.42 × 0.58/600) < Z < (0.54 - 0.58)/√(0.42 × 0.58/600))
= P(-5.96 < Z < -1.99)
= 0.4999 - 0.4767 = 0.0232
P(do not reject H0) = 0.0232

c. True p = 0.53:
P(p̂ < 0.46) = P(Z < (0.46 - 0.53)/√(0.53 × 0.47/600)) = P(Z < -3.44) = 0.0003
P(p̂ > 0.54) = P(Z > (0.54 - 0.53)/√(0.53 × 0.47/600)) = P(Z > 0.49) = 0.3121
P(reject H0) = 0.0003 + 0.3121 = 0.3124

d. True p = 0.48:
P(p̂ > 0.54) = P(Z > (0.54 - 0.48)/√(0.52 × 0.48/600)) = P(Z > 2.94) = 0.0016
P(p̂ < 0.46) = P(Z < (0.46 - 0.48)/√(0.52 × 0.48/600)) = P(Z < -0.98) = 0.1635
P(reject H0) = 0.0016 + 0.1635 = 0.1651

e. True p = 0.43:
P(0.46 < p̂ < 0.54)
= P((0.46 - 0.43)/√(0.57 × 0.43/600) < Z < (0.54 - 0.43)/√(0.57 × 0.43/600))
= P(1.48 < Z < 5.44)
= 0.5 - 0.4306 = 0.0694
P(do not reject H0) = 0.0694
9.40
x̄ = 3.07, μ0 = 3, n = 64, σ = 0.4, zα = 1.96
a. H0: μ ≤ 3
H1: μ > 3
Reject H0 if z = (x̄ - μ0)/(σ/√n) > zα.
z = (3.07 - 3)/(0.4/√64) = 1.4
As 1.4 < 1.96, we fail to reject H0.

b. Reject H0 if x̄ > μ0 + zα(σ/√n) = 3 + 1.96 × (0.4/√64) = 3.098.
If the true mean is 3.1, the probability that H0 is rejected is
P(X̄ > 3.098) = P(Z > (3.098 - 3.1)/(0.4/√64)) = P(Z > -0.04) = 0.5 + 0.016 = 0.516


9.48
n = 8
s² = [1/(n - 1)] Σ(xi - x̄)²
= (1/7)(6463.5)
so s² = 923.4 and s = 30.39
H0: σ² ≤ 500
H1: σ² > 500
Reject H0 if (n - 1)s²/σ0² > χ²(n-1,α).
(7 × 923.4)/500 = 12.93
χ²(7,0.1) = 12.02
As 12.02 < 12.93, we reject H0.
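The variance test in 9.48 as a short sketch (scipy supplies the χ² quantile that the table gives as 12.02):

```python
from scipy.stats import chi2

n, s2, sigma2_0, alpha = 8, 923.4, 500, 0.10
stat = (n - 1) * s2 / sigma2_0        # chi-square test statistic
crit = chi2.ppf(1 - alpha, n - 1)     # about 12.02
print(round(stat, 2), stat > crit)    # 12.93, True -> reject H0
```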
