Category            Cost ($)    % of COQ
Internal failure     410,000       47.0
External failure      90,000       10.3
Appraisal            340,000       38.9
Prevention            33,000        3.8
Total                873,000      100.0
Although scrap is part of the process, Smith Company needs to minimize the
amount of scrap produced, just as if it were making a product. The total cost of
quality is $873,000. Although the categories are not completely clear, the
assumed categories are listed in the revised table above. Internal failure (scrap and repair) totals 47% of quality costs, and external failure in the form of customer returns adds another 10.3%. Only 3.8% of total costs are being applied to prevention. Apparently, based on the high internal failure and appraisal costs, this organization is attempting to screen out bad product and then scrap or repair it.
82286_16_Solution.qxd 12/12/06 4:54 PM Page S-2
Also, on this batch, they did not accomplish their goal of 60 percent of book value, because (1,700,000 − 873,000)/1,700,000 = 48.65%.
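For instructors who want to verify the arithmetic, the category percentages and the book-value comparison can be recomputed with a short sketch in Python (the category names and dollar amounts come from the table above, and the $1,700,000 book value from the solution):

```python
# Cost-of-quality breakdown for Smith Company, from the table above.
costs = {
    "Internal failure": 410_000,
    "External failure": 90_000,
    "Appraisal": 340_000,
    "Prevention": 33_000,
}
total = sum(costs.values())  # 873,000

# Each category as a percent of total cost of quality.
for name, cost in costs.items():
    print(f"{name:17s} {cost:>9,} {100 * cost / total:5.1f}%")

# Check against the 60%-of-book-value goal: (1,700,000 - 873,000)/1,700,000.
book_value = 1_700_000
print(f"Reduction achieved: {(book_value - total) / book_value:.2%}")  # ~48.65%
```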
8. Miami Valley Aircraft Service Company’s data show rapidly decreasing total
quality costs (except for a slight rise in the 4th quarter), possibly due to a con-
certed quality effort. The decrease in both internal and external quality costs, as
a percentage of total quality and labor costs, while the prevention costs per-
centage is rising, is good. The only recommendation would be to increase prevention costs even more rapidly, while holding the line on appraisal costs. However, the caution is that doing so may increase total quality costs in the short run, as may have happened in the 2nd quarter.
10. Spreadsheet data and the Pareto chart for Repack Solutions, Inc. show that the
company is spending too much on appraisal and internal failure cost and too
little on prevention. Checking boxes, machine downtime, and packaging waste
need immediate improvement to have the greatest impact on quality costs
because they constitute almost 82% of quality costs. However, it should be done
with caution because “checking boxes” represents appraisal costs designed to
screen out poor quality and prevent it from reaching the customer.
Repack Solutions, Inc. Quality Costs and Percentages

Category              Cost ($)    Percent   Cumulative %   Quality Cost Category
Checking boxes         710,000     48.80        48.80      Appraisal
Mach. downtime         405,000     27.84        76.63      Int. Fail.
Pkg. waste              75,000      5.15        81.79      Int. Fail.
Incoming insp.          60,000      4.12        85.91      Appraisal
Other waste             55,000      3.78        89.69      Int. Fail.
Cust. complaints        40,000      2.75        92.44      Ext. Fail.
Error corrn.            40,000      2.75        95.19      Int. Fail.
Qual. train. assoc.     30,000      2.06        97.25      Prevent.
Improv. proj.           20,000      1.37        98.63      Prevent.
Typo corrn.             10,000      0.69        99.31      Int. Fail.
Quality planning        10,000      0.69       100.00      Prevent.
Total                1,455,000    100.00
Note that costs could also be classified by aggregating them into the four cate-
gories of internal and external failure, prevention, and appraisal costs, instead
of the categories listed in the table.
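The Percent and Cumulative % columns in the table follow directly from sorting the costs in descending order and accumulating, which is all a Pareto analysis requires. A minimal sketch in Python, using the cost figures from the table:

```python
# Pareto percentages for Repack Solutions, Inc., from the cost data above.
items = [
    ("Checking boxes", 710_000), ("Mach. downtime", 405_000),
    ("Pkg. waste", 75_000), ("Incoming insp.", 60_000),
    ("Other waste", 55_000), ("Cust. complaints", 40_000),
    ("Error corrn.", 40_000), ("Qual. train. assoc.", 30_000),
    ("Improv. proj.", 20_000), ("Typo corrn.", 10_000),
    ("Quality planning", 10_000),
]
total = sum(cost for _, cost in items)  # 1,455,000

# Sort descending by cost and accumulate the percentages.
items.sort(key=lambda kv: kv[1], reverse=True)
cumulative = 0.0
for name, cost in items:
    pct = 100 * cost / total
    cumulative += pct
    print(f"{name:20s} {pct:6.2f} {cumulative:7.2f}")
```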
12. For HiTeck Tool Company, the largest costs are internal failure (56.6%) and
appraisal (27.1%). More must be done in quality training, a component of pre-
vention (currently 7.8%), if failure, appraisal, and overall quality costs are to be
controlled. External failure costs are 8.6% of quality costs, so screening methods
are working fairly well. Note that the proportions are fractions of the total quality
costs of $247,450.
14. The data for Beechcom Software Corporation show that the three categories of
rejected disks (loaded), returns, and system downtime account for 74.64 percent
of the defects. These appear to be completely under the control of the firm, so
steps should be taken to analyze root causes for these problem areas in order to
correct them as quickly as possible.
16. This is a very challenging problem, even for advanced students. Although you
may fit a linear regression equation to the set of data given in the problem, fit-
ting a curvilinear model would provide a higher R-squared value. This requires
a more complex solution process. There is no “cut and dried” answer to what
level of additional quality improvement effort would be best, of course.
A number of “what if” questions and scenarios could be raised. Some of
these might include the following:
1. What if the sample of hotel guests was not representative of the general popu-
lation of guests?
2. What if the site manager was simply interested in reducing, rather than
“eliminating,” dissatisfied customers?
3. What if her objective was to eliminate the competition, then go back to the
previous level of quality?
4. What are the disadvantages of fitting a linear model to the data? (Note: In using
Excel 4.0 when this solution was developed, it appears that there is a “bug” in
the module that calculates the equation for the graph. Therefore, the “add-in”
Excel model was used to get the equation and the R-squared value, as follows.)
CHAPTER 10
2. The defect rate is 65/1000 = 0.065. This is the same as: 0.065 × 1,000,000 = 65,000
dpmo. From Table 10.1, we see that this is slightly better than 3 sigma with off
centering of 1.5 sigma.
4. We use 3/1054 to get the number of defects per unit (DPU). However, there are 2 opportunities per injection (wrong drug, wrong dosage) to make an error. Both must be considered to calculate dpmo.
dpmo = (3/1054) × 1,000,000/2 = 1423.1, which is slightly less than 4.5 sigma
with off centering of 1.5 sigma.
6. To calculate the overall dpmo and sigma level, we have:
dpmo = (6/5000) × 1,000,000/5 = 240, which is approximately 5 sigma with off-
centering of 1.5 sigma.
But for the one characteristic, we have:
dpmo = (2/5000) × 1,000,000 = 400, which is still good, but somewhat less than
5 sigma with off centering of 1.5 sigma.
A Six Sigma project should be launched to determine root causes for the defects
from this one characteristic.
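The dpmo calculations in problems 2–6 all follow the same pattern, so they can be checked with one small helper (a sketch in Python; the function name dpmo is mine, and the sigma-level interpretations still come from Table 10.1 in the text):

```python
# Defects per million opportunities, as used in problems 2, 4, and 6 above.
def dpmo(defects, units, opportunities_per_unit=1):
    return defects / (units * opportunities_per_unit) * 1_000_000

print(dpmo(65, 1000))     # problem 2: ~65,000 dpmo
print(dpmo(3, 1054, 2))   # problem 4: ~1,423.1 dpmo
print(dpmo(6, 5000, 5))   # problem 6, overall: ~240 dpmo
print(dpmo(2, 5000))      # problem 6, one characteristic: ~400 dpmo
```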
CHAPTER 11
2. The following results were obtained from the Staunton Steam Laundry Data
[Frequency histogram: frequencies descend steadily across bins bounded at 15, 30, 45, 60, 75, 90, 105, and More.]

Descriptive Statistics
Mean 34.280
Standard Error 3.241
Median 25.500
Mode 19.000
Standard Deviation 32.412
Sample Variance 1050.507
Kurtosis 3.847
Skewness 1.756
Range 169.000
Minimum 1.000
Maximum 170.000
Sum 3428.000
Count 100.000
Largest(1) 170.000
Smallest(1) 1.000
Confidence Level(95.0%) 6.431
The conclusion that can be reached from the summary statistics and the histogram is that these data are approximately exponentially distributed, with steadily descending frequencies. Data of this kind are best represented by a histogram.
4. Descriptive statistics for the Harrison Metalwork foundry are shown in the
following chart:
Descriptive Statistics
Mean 38.6320
Standard Error 0.0444
Median 38.6000
Mode 38.4000
Standard Deviation 0.4436
Sample Variance 0.1967
Range 2.6000
Minimum 37.3000
Maximum 39.9000
Sum 3863.2000
Count 100.0000
[Frequency distribution histogram: roughly bell-shaped, with cell boundaries from 37.5 to 39.9 in steps of 0.3.]
The conclusion that can be reached from looking at the summary statistics and
the histogram is that these data are fairly normally distributed, with some slight
skewing to the right.
6. For Georgia Tea’s bottling process, the values for the 1% cutoff and the standard
deviation are:
x = 1990 ml; σ = 15 ml
For a total probability of 1% of overfilling:

P(x > upper fill limit) = 0.5000 − P(0 < z < (x − µ)/σ) = 0.5000 − 0.4900 = 0.01

Using the Normal Table, Appendix A, z = 2.33

z = (x − µ)/σ = (1990 − µ)/15 = 2.33

µ = 1990 − 2.33(15) = 1955.05 ml

∴ The process mean should be set at 1955.05 ml, so that there is only a 1% probability of overfilling.
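The same answer can be reached without the printed table by inverting the standard normal CDF; a sketch in Python using the standard library's statistics.NormalDist (which gives z ≈ 2.326 rather than the table's rounded 2.33):

```python
from statistics import NormalDist

# Georgia Tea bottling: find the mean so that P(x > 1990) = 0.01.
upper_limit = 1990.0  # ml, the overfill cutoff from the problem
sigma = 15.0          # ml

z = NormalDist().inv_cdf(0.99)  # z with 1% in the upper tail, ~2.326
mu = upper_limit - z * sigma
print(round(mu, 2))             # ~1955.1 ml (table value gives 1955.05)
```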
8. The mean for the Kiwi Blend product is µ = 927.5; the standard deviation is σ = 15; and x = 950.

z = (x − µ)/σ = (950 − 927.5)/15 = 1.50

P(x > 950) = 0.5000 − P(0 < z < 1.5) = 0.5000 − 0.4332 = 0.0668
(Results are based on the Standard Normal Table, Appendix A.)
10. Given that the process mean filling weight is µ = 16.8 oz for the Martin salt
containers,
By looking up 0.5000 − 0.0250 = 0.4750 in the body of the table, we find z = 1.96.

z = −1.96 = (16 − 16.8)/σ

∴ σ = 0.8/1.96 = 0.4082 oz.
(Results are based on the Standard Normal Distribution Table, Appendix A.)
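Both problems 8 and 10 can be checked with statistics.NormalDist in place of the Appendix A table lookups; a sketch in Python:

```python
from statistics import NormalDist

# Problem 8: P(x > 950) with mu = 927.5 and sigma = 15.
p_over = 1 - NormalDist(mu=927.5, sigma=15).cdf(950)
print(round(p_over, 4))   # ~0.0668

# Problem 10: solve for sigma so that P(x < 16) = 0.025 when mu = 16.8.
z = NormalDist().inv_cdf(0.025)  # ~-1.960
sigma = (16 - 16.8) / z
print(round(sigma, 4))    # ~0.4082 oz
```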
a. x̄ = Σfx/n = 3858.90/100 = 38.589 (vs. 38.670 from the data in problem 11-3)

[Percentile plot: sample percentiles 0–100 on the horizontal axis, values near 37–38 on the vertical axis.]
14. Specification for answer time for the Tessler utility is:

H0: Mean answer time: µ1 ≤ 0.10
H1: Mean answer time: µ1 > 0.10

x̄1 = 0.1023, s1 = 0.0183

and the t-test is:

t1 = (x̄1 − 0.10)/(s1/√n) = (0.1023 − 0.10)/(0.0183/√30) = 0.0023/0.0033 = 0.697; t29,.05 = 1.699

Specification for service time is:

H0: Mean service time: µ2 ≤ 0.50
H1: Mean service time: µ2 > 0.50

x̄2 = 0.5290, s2 = 0.0902

and the t-test is:

t2 = (x̄2 − 0.50)/(s2/√n) = (0.529 − 0.50)/(0.0902/√30) = 0.029/0.0165 = 1.761; t29,.05 = 1.699
Because t29,.05 = 1.699, we cannot reject the null hypothesis for t1, but we can reject it for t2. Therefore, there is no statistical evidence that the mean answer time exceeds 0.10, but there is statistical evidence that the mean service time exceeds 0.50.
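The two t statistics can be computed directly; a sketch in Python (the helper name t_stat is mine, and the critical value t(29, 0.05) = 1.699 is taken from the t-table as in the text; note the text rounds the denominator to 0.0033, giving 0.697, while exact arithmetic gives about 0.688, with the same conclusion):

```python
import math

# One-sample t statistic: t = (x-bar - mu0) / (s / sqrt(n)).
def t_stat(xbar, mu0, s, n):
    return (xbar - mu0) / (s / math.sqrt(n))

t1 = t_stat(0.1023, 0.10, 0.0183, 30)  # answer time
t2 = t_stat(0.5290, 0.50, 0.0902, 30)  # service time
t_crit = 1.699                          # t(29, 0.05) from the table

print(round(t1, 3), t1 > t_crit)  # ~0.688, False -> cannot reject H0
print(round(t2, 3), t2 > t_crit)  # ~1.761, True  -> reject H0
```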
Note: Problems 15–19 address sample size determination and refer to theory
covered in the Bonus Material for this chapter as contained on the student
CD-ROM.
16. The size of the population is irrelevant to this customer satisfaction survey, although
it is good to know that it is sizable. Therefore, make the following calculations:
n = (zα/2)² p(1 − p)/E² = (1.96)² (0.04)(0.96)/(0.02)² = 368.79; use 369
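The sample-size formula n = z²p(1 − p)/E² rounds up to the next whole observation; a sketch in Python (the helper name sample_size is mine):

```python
import math

# Sample size for estimating a proportion p to within margin E at z-level z.
def sample_size(z, p, e):
    return math.ceil(z**2 * p * (1 - p) / e**2)

print(sample_size(1.96, 0.04, 0.02))  # 369, as above
```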
18. Using the formula n = (zα/2)² p(1 − p)/E², the engineer at the Country Squire Hospital can solve for zα/2 as follows:

800 = (zα/2)² (0.10)(0.90)/(0.02)²
800 = (zα/2)² (225)
(zα/2)² = 800/225 = 3.556
zα/2 = √3.556 = 1.886; use 1.87
From the Standard Normal Distribution table, Appendix A, we find a proba-
bility of 0.4693 for z = 1.87. Because it is only one tail of the distribution, we
multiply the area by 2 to get the confidence level of 0.9386. Thus, the management engineer can be only about 94% confident of her results, based on this sample size.
20. The process engineer at Sival Electronics can calculate the main effects as follows:
Signal
High (18 + 12 + 16 + 10)/4 = 14
Low (8 + 11 + 7 + 14)/4 = 10
High – Low = 4
Material
Gold (18 + 12 + 8 + 11)/4 = 12.25
Silicon (16 + 10 + 7 + 14)/4 = 11.75
Gold – Silicon = 12.25 – 11.75 = 0.5
Temperature
Low (18 + 16 + 8 + 7)/4 = 12.25
High (12 + 10 + 11 + 14)/4 = 11.75
Low – High = 12.25 – 11.75 = 0.5
The main effect of signal (4.0) far outweighs the effects of material and temperature (0.5 each), indicating that those two factors are relatively insignificant. Their interaction effects are therefore likely to be negligible as well.
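Each main effect is just the difference between the mean response at the factor's two levels; a sketch in Python using the eight responses from the text (the helper name effect is mine):

```python
# Main effect = mean response at one level minus mean response at the other.
def effect(level_a, level_b):
    return sum(level_a) / len(level_a) - sum(level_b) / len(level_b)

# Responses grouped by factor level, from the Sival Electronics experiment above.
signal = effect([18, 12, 16, 10], [8, 11, 7, 14])        # high - low  = 14 - 10 = 4.0
material = effect([18, 12, 8, 11], [16, 10, 7, 14])      # gold - silicon = 0.5
temperature = effect([18, 16, 8, 7], [12, 10, 11, 14])   # low - high = 0.5

print(signal, material, temperature)
```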
CHAPTER 12
2. With the new data given for Fingerspring’s potential customers, a partial House
of Quality for the design of the PDA can be built. Note that there are strong rela-
tionships between customer requirements and associated technical require-
ments of the PDA design.
The inter-relationships of the roof may be sketched in. For example, they
would show a strong inter-relationship between size and weight.
The analysis suggests that Fingerspring should try to position itself between
Springbok and Greenspring in price and features. It should build on the strength
of the customer’s reliability concern, keeping battery life near 35 hours, and use a proven operating system, such as PalmOS. Enough features (10) should be
offered to be competitive. If Fingerspring can design a high-value PDA and sell it
at an attractive price (say, $350 or less), it should be a very profitable undertaking.
4. With the new data given for Bertha’s customers, a partial House of Quality for
the design of the burritos can be built. Note that the relationships between cus-
tomer requirements (flavor, health, value) and associated technical require-
ments (% fat, calories, sodium, price) of the burrito design are strong.
The inter-relationships of the roof may be sketched in. For example, they
would show a strong inter-relationship between fat and calories.
Bertha’s Big Burritos technical requirements must be placed on a more equal
basis, which would best be shown as units/ounce, except for the percent fat
value. These are shown in the following:
Although Bertha’s is low in price per ounce, calories, and percent fat, this analy-
sis suggests that Bertha’s should try to increase its size and visual appeal, while
continuing to reduce the cost per ounce. At the same time, it should build on the
strength of the nutrition trend by keeping the sodium and percent fat low, as did
Grabby’s, and slightly reducing the number of calories per ounce to be even more
competitive. If Bertha’s can design a flavorful, healthy, 7-oz burrito and sell it at
an attractive price (say, $1.85 or less), it should be a very profitable undertaking.
6. The following table can be used to sketch the reliability function.
Therefore, P(x < 650) = 0.5 − 0.4772 = 0.0228, so 2.28% should survive less than 650 days.

c. [Sketch: an approximately normal distribution with x̄ = 750 and σ = 50; x = 875 is marked.]

λ = 3/[(3 × 600) + 100 + 175 + 350] = 3/2425 = 0.001237 failures/hour
R&R = √[(EV)² + (AV)²] = 0.3843
Tolerance analysis:

Average range                              0.117
Repeatability (EV)                         0.3579    89.47%
x̄ range                                   0.058
Reproducibility (AV)                       0.1423    35.58%
Repeatability and Reproducibility (R&R)    0.3851    96.28%
Control limit for individual ranges        0.3020

Note: Any ranges beyond this limit may be the result of assignable causes. Identify and correct; discard those values and recompute the statistics.
σ = 0.00104
Conclusion: The process is centered on the mean, but it does not have adequate
capability at this time.
Cpk = min(Cpl, Cpu) = 0.903

b. x̄ = 23; σ = 1.2

Cp = (UTL − LTL)/6σ = (28.25 − 21.75)/[6(1.2)] = 0.903. This result has not changed.

Conclusion: The process is skewed and still does not have adequate capability at this time.
c. σ²new = 0.4(1.44) = 0.576; ∴ σnew = 0.759

Cpu = (28.25 − 25.0)/[3(0.759)] = 1.427
Cpl = (25.0 − 21.75)/[3(0.759)] = 1.427

Cpk = min(Cpl, Cpu) = 1.427
Reducing the variance brings the Cpl and Cpu to the point of adequacy, provided
the process can remain centered.
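The capability indexes from parts b and c can be checked with two small functions (a sketch in Python; the helper names cp and cpk are mine):

```python
# Process capability: Cp uses the spec spread, Cpk the nearer spec limit.
def cp(utl, ltl, sigma):
    return (utl - ltl) / (6 * sigma)

def cpk(utl, ltl, mean, sigma):
    cpu = (utl - mean) / (3 * sigma)
    cpl = (mean - ltl) / (3 * sigma)
    return min(cpu, cpl)

print(round(cp(28.25, 21.75, 1.2), 3))           # part b: ~0.903
print(round(cpk(28.25, 21.75, 25.0, 0.759), 3))  # part c: ~1.427
```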
CHAPTER 13
2. The important quality characteristics for this drive-through window are the
machinery, materials, methods, and people (manpower). The machinery must
work well (e.g., most important is the speaker system by which the order is
transmitted and received), the bell and its operating system must work well, the
menu sign must be readable and conveniently placed, the order computer/cash
register must be working properly to give the total bill, and all the necessary
equipment in the food preparation area must also be working properly. The
“materials” used in order taking are few. However, the sign must be kept up-to-
date with the latest prices and selection of menu items. The method currently
being used is shown on the flowchart (Figure 13.23), and possible improve-
ments are discussed in the next paragraph. The people who take the order must
be trained to be courteous, friendly, accurate, and knowledgeable, or the
system’s quality will suffer.
Possible improvements to the system might include installation of a second
window, so that the order is taken at the first window, money is collected there,
and the pickup is made at the second window. A radio transmit/receive unit
linking the customer at the sign to the employee wearing a headset could
increase the ability of the employee to hear the order and to move around to
assemble the order while the customer is driving through. Automatic order
entry of standard selections might be built into the menu board with push but-
tons (similar to an automated teller machine in a drive-through banking opera-
tion). This would probably need to be coupled with personal assistance from
employees for special orders via a speaker system.
4. a. The C-E diagram for this process analysis shows that possible major causes
relating to client dissatisfaction (the effect) may be classified into three cate-
gories: employees, processing method, and client procedures.
b. The supervisor might use flowcharts, check sheets, and Pareto analysis to
classify the types of defects and their frequencies. Then, training, cross-
checking for errors, and work redesign might be done in order to remove
those error causes. Once the process is under control, control charts might
be used to “hold the gains.”
6. The scatter diagram shows that the employees’ accuracy improves for approxi-
mately the first 25 weeks. After that, it basically levels off. The differences don’t
appear to be significant after about 30 weeks.
8. The scatter diagram shows the packing time for a standard size package is lowest
for the first group of 20 packers, who average 13.85 minutes, although Packers #20
and 21 are considerably higher than the “lower” time group members. The pack-
ing time for a standard size package is higher for the second group of 20 packers,
who average 19.25 minutes, which is considerably longer. This suggests that some
workers are able to perform the task much faster than the norm (mean of 16.55). If
the output quality is the same for the faster group, as well as the slower one, then
the production coordinator should attempt to find the root cause, by observing the
methods of both groups, as well as testing to see if there are any significant differ-
ences in abilities between the group members. If the methods used by the first
group can be taught to the slower group members, this could increase productiv-
ity, reduce cost, and perhaps even improve quality, simultaneously.
10. It is obvious from the table and Pareto chart that may be constructed that the
first two categories, accounting for 68% of the errors, need improvement.
Ace Printing Company
Quality Errors and Percentages

Category            Frequency   Percent   Cumulative %
Setup delays             245     37.40        37.40
No press time            200     30.53        67.94
No paper                  80     12.21        80.15
Design delays             60      9.16        89.31
Order info error          29      4.43        93.74
Lost order                21      3.21        96.95
Cust. chg. delays         20      3.05       100.00
Total                    655    100.00
12. The medication administration process offers numerous possibilities for error at
every step. The physician may not write legibly (probably the most frequent
source of physician error), or even specify the wrong drug or dosage. The sec-
retary may not transcribe the order correctly. The reviewing nurse may approve
an order that is not correct. The pharmacist may not read or interpret the pre-
scription correctly, or may mix up orders. And the attending nurse may give the
wrong medication, or the wrong amount, to the patient.
A Medication Error Committee at one hospital identified the highest ranked
problems that were deemed to be the most critical in causing severe errors as follows:
• Having lethal drugs available on floor stocks.
• Mistakes in math when calculating doses.
• Doses or flow rates calculated incorrectly.
• Not checking armbands (patient identity) before drug administration.
• Excessive drugs in nursing floor stock.
To reduce possible critical errors at the point of medication, these poka-yokes
could be applied:
• Remove lethal and excessive drugs from floor stock.
• Standardize infusion rates and develop an infusion handbook.
• Educate nurses to double-check rates, protocols, and doses.
14. From the Pareto diagram that can be constructed, we can conclude that 55% of the
problems are with long delays and another 25.2% are due to shipping errors, for a
total in the top two categories of 80.2%. These categories should be improved first.
16. The data on the syringes that may be graphed show a suspicious pattern that
indicates that the process may be unstable. Ten values, from samples 20 to 29,
are alternating above and below the average, indicating that some instability
may be found in the system, if it is carefully investigated.
CHAPTER 14
2. Results from 50 samples of 5 for Mount Blanc Hospital’s customer service project
show that the R chart is obviously out of control. On the x̄ chart, the means for samples 6 and 7 are on, or almost on, their control limits. Assignable causes should
be determined and eliminated, and control limits should be recalculated.
For the center lines: CLx̄ = x̿ = 22.62; CLR: R̄ = 1.94

Control limits for the x̄ chart are: x̿ ± A2R̄

UCLx̄ = x̿ + A2R̄ = 22.62 + (0.577)(1.94) = 23.74
LCLx̄ = x̿ − A2R̄ = 22.62 − (0.577)(1.94) = 21.50

For the R chart: UCLR = D4R̄ = (2.114)(1.94) = 4.10
LCLR = D3R̄ = 0
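These limits (and those in the problems that follow) all come from the same formulas; a sketch in Python for subgroups of n = 5, using the standard table constants A2 = 0.577, D4 = 2.114, D3 = 0 (the helper name xbar_r_limits is mine):

```python
# x-bar and R chart constants for subgroups of size n = 5, from the standard table.
A2, D4, D3 = 0.577, 2.114, 0.0

def xbar_r_limits(xbarbar, rbar):
    return (xbarbar - A2 * rbar, xbarbar + A2 * rbar,  # x-bar chart LCL, UCL
            D3 * rbar, D4 * rbar)                      # R chart LCL, UCL

# Problem 2 (Mount Blanc Hospital): grand mean 22.62, average range 1.94.
lcl_x, ucl_x, lcl_r, ucl_r = xbar_r_limits(22.62, 1.94)
print(round(ucl_x, 2), round(lcl_x, 2))  # ~23.74, ~21.50
print(round(ucl_r, 2), round(lcl_r, 2))  # ~4.10, 0.0
```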
4. a. Descriptive statistics for Babbage Chips, Inc., based on all 50 samples, are
shown in the following. The histogram, when drawn, shows the “classic”
bell-shaped curve.
b. Results from the first 30 samples of 5 for Babbage show that both the x̄ and R
charts are apparently in control.
For the center lines: CLx̄ = x̿ = 9.170; CLR: R̄ = 2.543

Control limits for the x̄ chart are: x̿ ± A2R̄

UCLx̄ = x̿ + A2R̄ = 9.170 + (0.577)(2.543) = 10.64
LCLx̄ = x̿ − A2R̄ = 9.170 − (0.577)(2.543) = 7.70

For the R chart:

UCLR = D4R̄ = (2.114)(2.543) = 5.38
LCLR = D3R̄ = 0
c. Using these control limits to monitor the last 20 samples, there is one unusual
occurrence, with seven out of the last eight samples below the centerline,
indicating a probable out-of-control condition. Note to instructors: The templates for the x̄ and R charts had to be modified to show the control limits based only on the first 30 samples; the data for the additional 20 samples were then added to the table, as shown in the following.
6. For the Quality Service Company, the center lines are CLx̄ = x̿ = 8.0; CLR: R̄ = 2.0

Control limits for the x̄ chart are:

x̿ ± A2R̄ = 8.0 ± (0.483)(2.0) = 7.03 to 8.97

For the R chart: UCLR = D4R̄ = 2.004(2.0) = 4.01
LCLR = D3R̄ = 0

Estimated σ = R̄/d2 = 2.0/2.534 = 0.79
8. We can see from the initial control charts [labeled as x-bar chart (A) and R-chart
(A)], for the Hertz Company that there are two out-of-control points, one on the
x̄ chart and one on the R chart. We must throw out outliers #16 and #23, and revise the charts to yield the results shown in part b.
For the center lines: CLx̄ = x̿ = 402.92; CLR: R̄ = 33.20

Control limits for the x̄ chart are:

x̿ ± A2R̄ = 402.92 ± 1.023(33.20) = 368.96 to 436.88

For the R chart: UCLR = D4R̄ = 2.574(33.20) = 85.46
LCLR = D3R̄ = 0

For the revised x̄ chart:

x̿ ± A2R̄ = 400.29 ± 1.023(30.96) = 368.62 to 431.96

For the revised R chart: UCLR = D4R̄ = 2.574(30.96) = 79.69
LCLR = D3R̄ = 0
10. For 50 samples of 5 given for Beta Sales Corp., we obtain the following control
limits. We can conclude from the x̄ and R charts that the process is probably in
control because the points seem to be randomly distributed in both charts.
For the center lines: CLx̄ = x̿ = 0.011; CLR: R̄ = 1.372

Control limits for the x̄ chart are:

x̿ ± A2R̄ = 0.011 ± 0.577(1.372) = −0.78 to 0.80

For the R chart: UCLR = D4R̄ = 2.114(1.372) = 2.90
LCLR = D3R̄ = 0
b. We can see from the x̄ chart that points 19 and 21 are out of control and the
R-chart shows point 18 is out of control on the range. We obtain the fol-
lowing control limits and related charts after dropping these 3 points:
New center lines: CLx̄ = x̿ = 5.037; CLR: R̄ = 1.057
c. The calculations of process capability using the estimated σ value are shown
on the following table.
Estimated σ = R̄/d2 = 0.0124/2.059 = 0.0060; actual σ = 0.0055, as shown in part a, above.
18. For the center lines: CLx̄ = x̿ = 69.147; CLR: R̄ = 21.920

Control limits for the x̄ chart are:

x̿ ± A2R̄ = 69.147 ± 0.577(21.920) = 56.50 to 81.80

For the R chart: UCLR = D4R̄ = 2.114(21.920) = 46.34
LCLR = D3R̄ = 0

These limits apply to sample groups of 5 items each.

Estimated σ = R̄/d2 = 21.920/2.326 = 9.423
The problem asks that students perform a process capability analysis. This is
only justified if the process is in control. The fact that the process is thought to
be normally distributed does not establish that it is in control. The x̄ chart shows that the process is, in fact, out of control, because 4 of the 5 samples numbered 6–10 fall on one side of the center line. The percent-outside calculation
can be performed as follows. Note the warning, however.
Percent outside specification limits (45 to 95):

% below LSL: z = (LSL − x̿)/σ = (45 − 69.147)/9.423 = −2.56

P(z < −2.56) = 0.5 − 0.4948 = 0.0052, the probability that items will fall below the lower limit

% above USL: z = (USL − x̿)/σ = (95 − 69.147)/9.423 = 2.74

P(z > 2.74) = 0.5 − 0.4969 = 0.0031, the probability that items will exceed the upper limit
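The percent-outside calculation for problem 18 can be verified with statistics.NormalDist in place of the table lookups (a sketch in Python; small differences in the fourth decimal place reflect the table's rounding of z to two places):

```python
from statistics import NormalDist

# Problem 18: estimated process distribution vs. spec limits 45 to 95.
dist = NormalDist(mu=69.147, sigma=9.423)

below = dist.cdf(45)       # P(x < LSL), ~0.0052
above = 1 - dist.cdf(95)   # P(x > USL), ~0.003
print(round(below, 4), round(above, 4))
```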
the data. It appears that each of the heads on the molding machine has a sep-
arate distribution of data. Thus, control charts should be prepared for each
head, rather than treating the data as if it came from the same population.
24. See control charts for Wilmer Machine Co. and template spreadsheet for details.
a. For the center lines: CLx̄ = x̿ = 3.526; CLs: s̄ = 0.359

Control limits for the x̄ and s charts are:

x̿ ± A3s̄ = 3.526 ± 1.954(0.359) = 2.825 to 4.227

For the s chart: UCLs = B4s̄ = 2.568(0.359) = 0.922
LCLs = B3s̄ = 0
sp = √[p̄(1 − p̄)/n] = √[(0.0333)(0.9667)/75] = 0.0207

Control limits:

UCLp = p̄ + 3sp = 0.0333 + 3(0.0207) = 0.0954
LCLp = p̄ − 3sp = 0.0333 − 3(0.0207) = −0.0288, use 0
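The p-chart limits here (and in the following problems) come from the same two-step calculation; a sketch in Python with p̄ = 0.0333 and n = 75 (the helper name p_chart_limits is mine):

```python
import math

# p-chart: standard error of the proportion, then 3-sigma limits,
# with a negative lower limit truncated to zero.
def p_chart_limits(pbar, n):
    sp = math.sqrt(pbar * (1 - pbar) / n)
    ucl = pbar + 3 * sp
    lcl = max(0.0, pbar - 3 * sp)
    return lcl, ucl, sp

lcl, ucl, sp = p_chart_limits(0.0333, 75)
print(sp, ucl, lcl)  # sp ~0.0207, UCL ~0.0954 (matching the text's rounding), LCL 0
```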
32. The data and control chart for Quality Printing Company’s plant from the tem-
plate spreadsheet show:
CLp = p̄ = 0.06

Control limits:

UCLp = p̄ + 3sp = 0.06 + 3(0.0336) = 0.1608
LCLp = p̄ − 3sp = 0.06 − 3(0.0336) = −0.0408, use 0
CLp = (p1 + p2 + p3 + … )/N = 0.63/30 = 0.0210

Control limits:

UCLp = p̄ + 3sp = 0.0171 + 3(0.0130) = 0.0561
LCLp = p̄ − 3sp = 0.0171 − 3(0.0130) = −0.0219, use 0
The conclusion is that the process is now in control.
36. The template spreadsheet for AtYourService.com shows:
The average sample size = 15755/30 = 525.17
Control limits:

UCLnp = np̄ + 3snp = 2.1 + 3(1.434) = 6.402
LCLnp = np̄ − 3snp = 2.1 − 3(1.434) = −2.202, use 0
As was shown in the previous control chart for problem 14-34, values for sam-
ples 9 and 23 are out of limits. Eliminating these points, we get revised control
limits shown for the final control chart (follows). Note that the two values of 6
or more were dropped.
Problem 38—Revised
So, CLnp = np̄ = 100(0.0171) = 1.71

Control limits:

UCLnp = np̄ + 3snp = 1.71 + 3(1.296) = 5.598
c̄ ± 3√c̄ = 30 ± 3√25 = 30 ± 15 = 15 to 45
42. Data for defects per pizza in a new store being opened by Rob's Pizza Palaces are used to construct a c-chart. The chart shows:

Number of defects = 84; number of samples = 25

Center line for the c-chart: c̄ = 84/25 = 3.36
46. The appropriate sample size for detecting shifts in means is simply an exercise in
reading values from the curves to fit required conditions.
a. For a 1 σ shift and a 0.80 probability, use n = 15 (if rounded to next higher value).
b. For a 2 σ shift and a 0.95 probability, use n = 8 (rounded to next higher value).
c. For a 2.5 σ shift and a 0.90 probability, use n = 3 (rounded to next higher value).
48. The stabilized p-chart diagram, based on the post office example, plots the
“transformed z statistic” instead of p, and it shows the process is in control. To
verify calculations from the spreadsheet, for example, the first data point is:
50. The control chart for the EWMA versus observed values shows that, with α = 0.8, the process is under control, and the EWMA estimate fairly closely anticipates the next observed value. The conclusion is that a better “forecast” of future values may be obtained for volatile values such as these if a larger α value is used to give greater weight to more recent values.
For problems 52 through 54, see the Statistical Foundations of Control Charts
Section in the Bonus Materials folder on the CD-ROM.
52. α/2 = 0.10/2 = 0.05; from the normal probability table, P(z) = 0.4500

Therefore, z0.05 = 1.64 or 1.65, because 0.4500 is equidistant (0.4495 and 0.4505, respectively) between the two closest table values.
54. Using the binomial formula:

Probability (acceptance) = Σ(x=0 to n) f(x), where f(x) = C(n, x) p^x (1 − p)^(n−x)

11 in a row = (0.5)^11 = 0.049%
10 of 11 = C(11, 10)(0.5)^10(0.5)^1 = 11(0.5)^11 = 0.537%
9 of 11 = C(11, 9)(0.5)^9(0.5)^2 = 55(0.5)^11 = 2.686%
8 of 11 = C(11, 8)(0.5)^8(0.5)^3 = 165(0.5)^11 = 8.057%
7 of 11 = C(11, 7)(0.5)^7(0.5)^4 = 330(0.5)^11 = 16.11%
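These binomial probabilities can be computed exactly with math.comb; a sketch in Python (exact values may differ slightly from hand calculations that round (0.5)^11 first):

```python
from math import comb

# P(exactly k of 11 points on one side of the center line), with p = 0.5.
n, p = 11, 0.5
for k in [11, 10, 9, 8, 7]:
    prob = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"{k} of 11: {100 * prob:.3f}%")
```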