
Measurement System Analysis
“Data is only as good as the system that
measures it. If you can’t measure it, you can’t
manage it.”
Measurement System

 The complete process of obtaining measurements.

 Includes gages, people, software, and procedures.

 Measurement System Analysis (MSA)

• Evaluation of measuring instruments to determine their capability to yield a precise response

• Often specified as: Gage R&R


Definitions
• Variable Data
– Continuous measurements such as length,
voltage, viscosity
• Repeatability
– Variation in measurements obtained with one
gage when used several times by one appraiser.
• Reproducibility
– Variation in the average of the measurements
made by different appraisers using the same
measurement system.
Types of Measurement Error that
Affect Location
• Accuracy
– Deviation from the true value
• Linearity
– Change in accuracy (bias) across the gage's operating range
• Stability
– Change in accuracy (bias) over time
Types of Measurement Error that
Affect Spread
• Precision
• Repeatability
• Reproducibility
Gage R&R

 Repeatability

• Variation in repeated measurements of one part by one operator

 Reproducibility

• Variation between operators' averages when measuring the same part


Decomposition of the Variability

 In any problem involving measurement, the observed variability in
product is due to two sources:

• Part variability - σ²part

• Gage variability - σ²gage

 Total observed variance in product:

σ²total = σ²part + σ²gage

 The total gage variance σ²gage can be further broken down as:

σ²gage = σ²repeatability + σ²reproducibility
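
As a quick numerical check of this decomposition, here is a minimal Python sketch. The component values plugged in are taken from the ANOVA example later in this deck, and the %Contribution / %Study Var calculations follow the usual Gage R&R conventions; this is an illustration, not part of the original study output.

# Minimal sketch of the variance decomposition.
# Component values are taken from the ANOVA example later in this deck.
sigma2_repeatability = 0.5111    # equipment variation
sigma2_reproducibility = 1.2926  # appraiser variation
sigma2_part = 48.2926            # part-to-part variation

sigma2_gage = sigma2_repeatability + sigma2_reproducibility
sigma2_total = sigma2_part + sigma2_gage

# %Contribution compares variances; %Study Var compares standard deviations.
pct_contribution = 100 * sigma2_gage / sigma2_total
pct_study_var = 100 * (sigma2_gage ** 0.5) / (sigma2_total ** 0.5)
print(f"%Contribution = {pct_contribution:.2f}%, %Study Var = {pct_study_var:.2f}%")
# -> %Contribution = 3.60%, %Study Var = 18.97%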


What is Gauge R&R?

Measurement Systems Analysis - How good is our measurement system?

 σ²T = σ²p + σ²m

σ²T = Total Variance
σ²p = Process Variance
σ²m = Measurement Variance
Variable Gauge R&R - What’s
Involved?
1 Gauge

3 Appraisers

10 Parts
How to set up a Variable GRR Study
• Preparation & Planning
• 1 Gauge
• 3 Operators (Appraisers)
• 10 Parts
• 3 Trials
• Randomize the readings (see the run-order sketch after this list)
• Code the parts (blind study) if possible
• 3 Operators x 10 parts x 3 trials = 90 Data Points
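
One common way to randomize the readings is to shuffle the part order separately for each operator and trial, so nobody measures the parts in a predictable sequence. The Python sketch below only illustrates that idea; the operator labels and part codes are placeholders.

# Sketch: randomized run order for a 3-operator, 10-part, 3-trial GRR study.
# Operator labels and part codes are placeholders for illustration.
import random

operators = ["a", "b", "c"]
parts = list(range(1, 11))   # blind-coded part numbers
trials = 3

run_order = []
for trial in range(1, trials + 1):
    for op in operators:
        order = parts.copy()
        random.shuffle(order)              # each operator gets a fresh random part order
        run_order.extend((op, part, trial) for part in order)

print(len(run_order), "readings")          # 3 x 10 x 3 = 90
for i, (op, part, trial) in enumerate(run_order[:5], start=1):
    print(f"Run {i}: operator {op}, part {part}, trial {trial}")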
Acceptability Criteria
• R&R Indices
– < 10%: Acceptable measurement system
– 10% - 30%: May be acceptable based upon application, cost of
measurement device, cost of repair, etc.
– > 30%: Not acceptable. Measurement system needs improvement.
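
A small helper like the one below can encode these thresholds; it is just a sketch of the decision rule stated above, applied to a %R&R (or %Study Var) figure.

# Sketch: apply the acceptability thresholds above to a %R&R value.
def classify_grr(pct_rr: float) -> str:
    if pct_rr < 10:
        return "Acceptable measurement system"
    if pct_rr <= 30:
        return "May be acceptable (depends on application, cost of device, cost of repair)"
    return "Not acceptable - measurement system needs improvement"

print(classify_grr(18.97))   # %Study Var from the ANOVA example below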
Example (Software Output)

 10 parts (Sr. No. 1 to 10), each measured 3 times (i, ii, and iii) by
three operators (a, b, and c)

Sr.     Operator a       Operator b       Operator c
No.     i   ii   iii     i   ii   iii     i   ii   iii
 1     37   38   37     41   41   40     41   42   41
 2     42   41   43     42   42   42     43   42   43
 3     30   31   31     31   31   31     29   30   28
 4     42   43   42     43   43   43     42   42   42
 5     28   30   29     29   30   29     31   29   29
 6     42   42   43     45   45   45     44   46   45
 7     25   26   27     28   28   30     29   27   27
 8     40   40   40     43   42   42     43   43   41
 9     25   25   25     27   29   28     26   26   26
10     35   34   34     35   35   34     35   34   35
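
For readers who want to reproduce the control-chart statistics shown in the Minitab graphs on the next slide, a minimal Python/NumPy sketch is given below. It re-enters the table above and computes the average range, the grand average, and the usual range-chart and Xbar-chart limits, using the standard subgroup-size-3 constants (D4 ≈ 2.574, A2 ≈ 1.023).

# Sketch: reproduce the centre lines and limits of the R and Xbar charts
# from the measurement table above (3 operators x 10 parts x 3 trials).
import numpy as np

data = {  # data[operator][part] = [trial i, trial ii, trial iii]
    "a": [[37,38,37],[42,41,43],[30,31,31],[42,43,42],[28,30,29],
          [42,42,43],[25,26,27],[40,40,40],[25,25,25],[35,34,34]],
    "b": [[41,41,40],[42,42,42],[31,31,31],[43,43,43],[29,30,29],
          [45,45,45],[28,28,30],[43,42,42],[27,29,28],[35,35,34]],
    "c": [[41,42,41],[43,42,43],[29,30,28],[42,42,42],[31,29,29],
          [44,46,45],[29,27,27],[43,43,41],[26,26,26],[35,34,35]],
}

cells = np.array([trials for op in data.values() for trials in op])  # 30 cells x 3 trials
ranges = cells.max(axis=1) - cells.min(axis=1)

r_bar = ranges.mean()          # ~1.067  (R chart centre line)
x_bar = cells.mean()           # ~35.8   (Xbar chart centre line)
D4, A2 = 2.574, 1.023          # control-chart constants for subgroup size 3

print(f"R-bar = {r_bar:.3f}, UCL_R = {D4 * r_bar:.3f}")                 # ~2.746
print(f"X-bar = {x_bar:.2f}, UCL = {x_bar + A2 * r_bar:.2f}, "
      f"LCL = {x_bar - A2 * r_bar:.2f}")                                # ~36.89 / ~34.71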
Example – Minitab Output – ANOVA Method

Gage R&R (ANOVA) for Measurement


[Six-panel Gage R&R graph: Components of Variation (% Contribution vs. % Study Var),
Measurement by Part Number, R Chart by Operator (R-bar = 1.067, UCL = 2.746, LCL = 0),
Measurement by Operator, Xbar Chart by Operator (X-double-bar = 35.8, UCL = 36.89,
LCL = 34.71), and Operator * Part Number Interaction.]
Example – Minitab Output – ANOVA Method
Gage R&R Study - ANOVA Method

Two-Way ANOVA Table With Interaction


Source                    DF        SS        MS         F       P
Part Number                9   3935.96   437.328   162.270   0.000
Operator                   2     39.27    19.633     7.285   0.005
Part Number * Operator    18     48.51     2.695     5.273   0.000
Repeatability             60     30.67     0.511
Total                     89   4054.40

Gage R&R

Source                      VarComp   %Contribution (of VarComp)
Total Gage R&R               1.8037            3.60
  Repeatability              0.5111            1.02
  Reproducibility            1.2926            2.58
    Operator                 0.5646            1.13
    Operator*Part Number     0.7280            1.45
Part-To-Part                48.2926           96.40
Total Variation             50.0963          100.00

Source                    StdDev (SD)   Study Var (6 * SD)   %Study Var (%SV)
Total Gage R&R                1.34302               8.0581              18.97
  Repeatability               0.71492               4.2895              10.10
  Reproducibility             1.13692               6.8215              16.06
    Operator                  0.75140               4.5084              10.62
    Operator*Part Number      0.85322               5.1193              12.05
Part-To-Part                  6.94929              41.6957              98.18
Total Variation               7.07787              42.4672             100.00

Number of Distinct Categories = 7
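
The variance components in the output above can be recovered directly from the mean squares in the two-way ANOVA table (balanced design: 3 operators, 10 parts, 3 trials). The sketch below shows the standard expected-mean-square arithmetic; it is only a cross-check of the printed output, not Minitab's own code.

# Sketch: recover the Gage R&R variance components from the ANOVA mean squares above.
o, p, n = 3, 10, 3                                   # operators, parts, trials
MS_part, MS_oper, MS_interact, MS_error = 437.328, 19.633, 2.695, 0.511

var_repeatability = MS_error
var_interaction   = max((MS_interact - MS_error) / n, 0)
var_operator      = max((MS_oper - MS_interact) / (p * n), 0)
var_part          = max((MS_part - MS_interact) / (o * n), 0)

var_reproducibility = var_operator + var_interaction
var_gage  = var_repeatability + var_reproducibility
var_total = var_gage + var_part

print(f"Total Gage R&R = {var_gage:.4f}  ({100 * var_gage / var_total:.2f}%)")
print(f"Part-To-Part   = {var_part:.4f} ({100 * var_part / var_total:.2f}%)")

# Number of distinct categories ~ 1.41 * (SD_part / SD_gage), truncated.
ndc = int(1.41 * (var_part ** 0.5) / (var_gage ** 0.5))
print("Number of Distinct Categories =", ndc)        # -> 7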



MSA for Attribute Data

 Uses a metric called Kappa.

 Suitable for non-quantitative systems such as:

• Good/Bad

• Go/No-go

• Acceptable/Unacceptable

• Differentiating between different noises (example in an automobile engine: No noise/Hiss/Clatter)
MSA for Attribute Data

 Notes on Kappa for attribute data

• Treats all unacceptable categories equally (example: a ‘hiss’ is neither better nor
worse than the ‘clatter’)

• Assessment categories are mutually exclusive

 General guidelines for the study

• If there are only two categories (‘good’ and ‘bad’), make sure that the distribution
of the parts in these two categories is approximately 50%-50%

• If there are more than two categories, make approximately 50% of the items
‘good’ and make sure that each defect category has at least 10% of the items

• If required, combine two or more defect categories into ‘others’


MSA for Attribute Data - Example

 15 parts (Sr. No. 1 to 15), each measured once by two operators (a and b).
Each part can be classified into one of four categories (A, B, C, and D).

Sr. No.  Optr a  Optr b    Sr. No.  Optr a  Optr b    Sr. No.  Optr a  Optr b
   1       A       A          6       C       D         11       C       C
   2       A       B          7       B       B         12       C       C
   3       B       B          8       A       A         13       C       D
   4       A       C          9       D       D         14       D       D
   5       D       D         10       B       A         15       B       B
MSA for Attribute Data - Example

 Table of Matches and Mismatches (Numbers)

                        OPERATOR b
                    A     B     C     D   Total
OPERATOR a    A     2     1     1     0     4
              B     1     3     0     0     4
              C     0     0     2     2     4
              D     0     0     0     3     3
     Total          3     4     3     5    15
MSA for Attribute Data - Example

 Table of Matches and Mismatches (Proportions)

                        OPERATOR b
                    A      B      C      D     Total
OPERATOR a    A    0.13   0.07   0.07   0       0.27
              B    0.07   0.20   0      0       0.27
              C    0      0      0.13   0.13    0.27
              D    0      0      0      0.20    0.20
     Total         0.20   0.27   0.20   0.33    1.00
MSA for Attribute Data - Example

 Kappa value calculations

P(observed) = 0.13 + 0.20 + 0.13 + 0.20 = 0.66

P(chance) = (0.27 x 0.20) + (0.27 x 0.27) + (0.27 x 0.20) + (0.20 x 0.33) = 0.25

Kappa = [P(observed) – P(chance)] / [1 – P(chance)]

      = [0.66 – 0.25] / [1 – 0.25] = 0.55
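
The same kappa value can be computed directly from the raw appraiser calls in the example table, as in the Python sketch below. Note that the hand calculation above uses proportions rounded to two decimals; exact fractions give kappa ≈ 0.56 rather than 0.55.

# Sketch: Cohen's kappa for the two appraisers, using the 15 calls from the example table.
from collections import Counter

op_a = list("AABADCBADBCCCDB")   # operator a, parts 1-15
op_b = list("ABBCDDBADACCDDB")   # operator b, parts 1-15
n = len(op_a)

p_observed = sum(x == y for x, y in zip(op_a, op_b)) / n      # agreement on the diagonal

count_a, count_b = Counter(op_a), Counter(op_b)               # marginal totals per category
p_chance = sum((count_a[c] / n) * (count_b[c] / n) for c in "ABCD")

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"P_observed = {p_observed:.3f}, P_chance = {p_chance:.3f}, Kappa = {kappa:.2f}")
# -> P_observed = 0.667, P_chance = 0.244, Kappa = 0.56 (0.55 with the rounded proportions above)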

Guidelines for gauge acceptance:

Kappa > 0.9: Excellent

0.7 < Kappa < 0.9: Acceptability depends on application

Kappa < 0.7: Unacceptable
