
MEASUREMENT SYSTEM

ANALYSIS

MSA 3RD EDITION (AIAG)

Presented By
Mahendra S. Negi
CONTENTS
1. Basic Terms (Discrimination, Errors, Ref. Value)
2. Objective
3. Statistical Properties
4. Description of Location & Width Errors
5. Effects of Measurement Error on M.S.
6. Variable Measurement Study
7. Attribute Measurement Study
BASIC TERMS
• Measurement.
• Measurement System.
• Gage
• Discrimination, Readability, Resolution
• Reference Value and True Value
• Uncertainty
Measurement
Assignment of numbers (values) to material things to
represent the relationship among them w.r.t. particular
properties.
C. Eisenhart (1963)

Measurement System
The complete process used to obtain measurements, i.e. the combination of:
• operations,
• procedures,
• gauges and other equipment, software,
• personnel,
• environment and
• assumptions, etc.
Gage
A gage is any device used to obtain measurements; the term is frequently
used to refer specifically to the devices used on the shop floor,
including Go/No-Go devices.

Discrimination
The ability of the system to detect and indicate even small changes of the
measured characteristic; also known as resolution.
A measurement system is unacceptable for analysis if it cannot detect
process variation.
True value
Actual value of an Artifact.
Unknown and Unknowable.
Reference value
Accepted Value of an Artifact.
Used as a Surrogate to the true Value.
Uncertainty
An estimated range of values about the
measured value in which the true value is
believed to be contained.
True Measurement = Obs. Measurement ± U
OBJECTIVES
1. To quantify the variation present in the measurement system

2. Ensure the stability of the measurement system

3. Initiate appropriate actions to minimise the contamination of total
process variation by measurement system variation
Statistical Properties of
Measurement System (MS)
1. ADEQUATE DISCRIMINATION & SENSITIVITY:
• Should be small relative to the process variation or specification limits.
• Rule of 10s, i.e. 1/10th:
the least count / resolution of the equipment should be 1/10th of the
process variation (giving 10 data categories).

2. MS should be in statistical control

Under repeatability conditions, the measurement system variation is due to
common causes only and no special causes.
Statistical Properties of
Measurement System (MS)

3. The measurement error must be small compared to
   a) the tolerance spread
   b) the process variability
INADEQUATE DISCRIMINATION
Inadequate discrimination of a measurement system is shown on the Range chart when:

- Only one, two or three distinct values for the ranges can be read

- More than 1/4 of the ranges are zero


[Figure: range charts with different discriminations]
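As a quick check of the two symptoms above, the number of distinct range values and the share of zero ranges can be computed directly from the subgroup ranges. This is a minimal Python sketch with illustrative values; it is not part of the MSA manual.

```python
# Check a range chart for the two symptoms of inadequate discrimination above.
# `ranges` holds the subgroup ranges read off the R chart (illustrative values).
ranges = [0.0, 0.0, 0.2, 0.0, 0.2, 0.4, 0.0, 0.2, 0.0, 0.2]

distinct = sorted(set(ranges))
zero_share = ranges.count(0.0) / len(ranges)

if len(distinct) <= 3:
    print(f"Only {len(distinct)} distinct range values {distinct} - discrimination may be inadequate")
if zero_share > 0.25:
    print(f"{zero_share:.0%} of ranges are zero - discrimination may be inadequate")
```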
MEASUREMENT SYSTEM
ERRORS

1. LOCATION ERROR

2. WIDTH ERROR
MEASUREMENT SYSTEM
ERRORS
1. LOCATION ERROR

[Figure: actual variation vs. observed variation due to MS error]
MEASUREMENT SYSTEM
ERRORS
2. Width (Spread) Error

[Figure: actual variation vs. observed variation due to MS error]
Effects of Measurement System Errors
on Measurement Decisions
1. EFFECT ON PRODUCT CONTROL:
1a. CALLING A GOOD PART AS BAD PART (Type-I error, Producer's Risk, False Alarm)

1b. CALLING A BAD PART AS GOOD PART (Type-II error, Consumer's Risk, Miss Rate)
Effects of Measurement System Errors
on Measurement Decisions

2. EFFECT ON PROCESS CONTROL:

2a. CALLING A COMMON CAUSE AS SPECIAL CAUSE (Type-I error)

2b. CALLING A SPECIAL CAUSE AS COMMON CAUSE (Type-II error)
COMMON CAUSE
- Consists of many individual causes
- Cannot be economically eliminated
- Process follows a predictable pattern
- GOK (God Only Knows)

ASSIGNABLE CAUSE
- Consists of just one or a few individual causes
- Easy to detect and generally economical to eliminate
- No specific pattern
- HAK (Human Also Knows)
EXAMPLE OF LOCATION
ERRORS
1. ACCURACY

2. BIAS

3. STABILITY

4. LINEARITY
EXAMPLE OF WIDTH ERRORS

1. PRECISION

2. REPEATABILITY

3. REPRODUCIBILITY

4. GRR
Accuracy
Closeness to true Value.

Bias
– The difference between the observed average of measurements and the
reference value, on the same characteristic on the same part.

– A measure of the systematic error of the measurement system.

– It is the contribution to the total error comprised of the combined
effects of all sources of variation, known or unknown.

[Figure: reference value vs. observed average value; the offset between them is the bias]
Possible Causes for Excessive Bias
1. Instrument needs calibration.
2. Worn instrument, equipment or fixture.
3. Wrong gage for the application.
4. Different measurement method-setup, loading,
clamping.
5. Distortion (Gage/Part)
6. Parallax
7. Environment – Temp, humidity, vibration,
cleanliness
Linearity
– The difference in the bias values through the
expected operating (measurement) range of the
equipment.
– This is the change of bias with respect to size.

[Figure: bias measured at points 1, 2 and 3 across the measurement range]
Linearity

[Figure: bias vs. reference value - no linearity error, constant linearity error, non-linear]
Stability (Drift)
The total variation in the measurements obtained with a
measurement system-
•on the same master or parts,
•when measuring a single characteristic,
•over an extended time period.
i.e. Stability is the change of bias over time
CAUSES OF GAGE STABILITY
ERROR
• Environment or system changes, such as humidity or air pressure
• Infrequent calibration
• Lack of air pressure regulator or filter
• Warm – up period for electronic or other gages
• Lack of maintenance
• Wear
• Oxidization (corrosion)
Width Errors
Precision
Closeness of repeated readings to each other
Precision is often denoted by σgauge, which is the standard
deviation of the measurement system. The smaller the
spread of the distribution, the better the precision.
Precision can be separated into two components, called
repeatability and reproducibility .
Accuracy v/s Precision

[Figure: precise but not accurate; accurate but not precise; not accurate or precise; accurate and precise]
Repeatability (Within system variation)

The variation in measurements obtained


•with one measurement instrument
•when used several times
•by one appraiser
•while measuring the identical
characteristic
•on the same part.

It is shown by the Range (R) chart.

Note: Repeatability is commonly referred to as equipment variation (EV).
Reproducibility (Between system variation)
The variation in the average of the measurements
•made by different appraisers
•using the same measuring instrument
•when measuring the identical characteristic
•on the same part.
It is shown by the Average (X-bar) chart.

[Figure: appraiser averages A, B and C; their spread is the reproducibility]

Note: Reproducibility is commonly referred to as appraiser variation (AV).


Gage Repeatability & Reproducibility(GRR)
An estimate of the combined variation of
repeatability and reproducibility.
GRR is the variance equal to the sum of within
system & between system variances.
σ²GRR = σ²repeatability + σ²reproducibility

[Figure: appraisers A, B and C]
MEASUREMENT SYSTEM APPLICABILITY

Measuring System Bias Linearity Stability R&R

Vernier Caliper * * *

Micrometer * * *

Steel Rule

Temp Controller * *

Torque Wrench * *

Ht Gauge * * *

Pneumatic Absolute Meas System * * * *

Pneumatic Comparator * * *

Pressure Gauge *

Volt Meter *
Methods to determine the
repeatability and reproducibility

1. RANGE METHOD

2. AVERAGE RANGE METHOD

3. ANOVA METHOD
DIFFERENCE AMONG METHODS

Note: All methods ignore within-part variation (such as roundness, flatness, diametric taper).
MSA DATA HAS BEEN CORRUPTED
OR

UR MIND HAS BEEN LOST


LET’S PLAY A GAME

MindReader 1.02.xls
Range Method

A quick approximation of measurement variability.
It does not decompose the variability into repeatability and reproducibility.

GRR = R / d2
%GRR = 100 x GRR / (total variation or tolerance)
PROCEDURE - RANGE METHOD
1. Select two or three appraisers who are users of the measurement system
2. Obtain a sample of 10 parts that represent the actual or expected range of
process variation
3. Number the parts 1 through 10 so that the numbers are not visible to the appraisers
4. Calibrate the gage if this is part of the normal gauging procedure
5. Have appraiser A measure the 10 parts in random order, with an observer
recording the results
6. Repeat step 5 with the other appraisers, concealing the other appraisers' readings
7. Calculate the range for each part and the average range
8. Calculate GR&R
9. Calculate %R&R against total variation or tolerance
EXERCISE ON RANGE METHOD
PART NAME : Molding Roof
PARAMETER: Width of Profile
GAGE USED : Vernier Caliper
DIMENSION : 28.0 ±0.6
TOLERANCE : 0.60 MM

# APPRAISER A APPRAISER B RANGE

1 27.48 27.48
2 28.52 28.50
3 27.80 27.80
4 28.45 28.44
5 28.20 28.21
6 27.92 27.93
7 28.30 28.31
8 27.66 27.66
9 27.85 27.84
10 27.98 28.00
AVERAGE RANGE
Range (Ri) = Max - Min for each part
Average range R = ΣRi / 10
GRR = R / d2

%GRR = (GRR / Tolerance) x 100

Factors Table
Sample
Size     d2      A2     D3     D4
2        1.128   1.88   0.00   3.27
3        1.693   1.02   0.00   2.58
4        2.059   0.73   0.00   2.28
5        2.326   0.58   0.00   2.11
6        2.534   0.48   0.00   2.00
7        2.704   0.42   0.08   1.92
8        2.847   0.37   0.14   1.86
9        2.970   0.34   0.18   1.82
10       3.078   0.31   0.22   1.78
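Putting the exercise and the factors table together, the Range Method arithmetic can be sketched in a few lines of Python. The readings below are the exercise data above; d2 = 1.128 is taken from the factors table for two readings per part (the MSA manual's range-method constant d2* also depends on the number of parts, so this is a simplification following the slides), and %GRR is reported against the 0.60 mm tolerance as in the exercise.

```python
# Range Method quick estimate of GRR, using the exercise data above.
appraiser_a = [27.48, 28.52, 27.80, 28.45, 28.20, 27.92, 28.30, 27.66, 27.85, 27.98]
appraiser_b = [27.48, 28.50, 27.80, 28.44, 28.21, 27.93, 28.31, 27.66, 27.84, 28.00]
tolerance = 0.60   # mm, from the exercise header

ranges = [abs(a - b) for a, b in zip(appraiser_a, appraiser_b)]
r_bar = sum(ranges) / len(ranges)   # average range
d2 = 1.128                          # factors table, sample size 2
grr = r_bar / d2
print(f"R-bar = {r_bar:.4f}, GRR = {grr:.4f}, %GRR = {100 * grr / tolerance:.2f}% of tolerance")
```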
AVERAGE AND RANGE
METHOD
The Average and Range Method allows the measurement system's variation to be
decomposed into repeatability and reproducibility, but not their interaction.

GRR = √(EV² + AV²)

ndc = 1.41 (PV / GRR)

ndc >= 5

Note: The sum of the percent consumed by each factor will not equal 100%.
EXAMPLE - CONDUCTING THE STUDY
Selection of sample: n > 5 parts (representing process variation).
– Identification: 1 through n (not visible to the appraisers).
– Location Marking (easily visible & identifiable by the appraisers).
– Selection of appraiser: 2-3 routine appraisers
– Selection of Measuring equipment: Calibrated routine equipment
– Deciding number of trials: 2-3
– Data collection:
– Using data collection sheet
– Under normal measurement condition
– in random order
DATA COLLECTION

1) Enter appraiser A's results in row 1 of the data collection sheet.

2) Enter appraiser B's and C's results in rows 6 and 11, respectively.
3) Repeat the cycle (2nd trial) & enter the data in rows 2, 7 and 12.
4) If three trials are needed, repeat the cycle and enter the data in
rows 3, 8 and 13.
DATA COLLECTION SHEET

              Sample  Sample  Sample  Sample  Sample  Sample  Sample  Sample  Sample  Sample
                1       2       3       4       5       6       7       8       9      10
Trial 1 Row 1
Opr A Trial 2 Row 2
Trial 3 Row 3
Row 4
Row 5
Trial 1 Row 6
Opr B Trial 2 Row 7
Trial 3 Row 8
Row 9
Row 10
Trial 1 Row 11
Opr C Trial 2 Row 12
Trial 3 Row 13
Row 14
Row 15
CALCULATION
• For appraiser A, calculate the average (X) & range (R) for each part
and enter them in rows 4 & 5 respectively.
• Do the same for appraisers B & C and enter the results in rows 9, 10
and 14, 15 respectively.
• For appraiser A, calculate the average (Xa) of all the averages (row 4)
and the average (Ra) of all the ranges (row 5) and enter them in the data sheet.
• Calculate Xb, Rb & Xc, Rc for appraisers B & C in the same way and enter the
results in the data sheet.
• Calculate the average of all the observations (rows 4, 9 & 14) for each
part and enter the result in row 16.
• Calculate the part range (Rp) = difference of the max. and min. of row 16
and enter it in the data sheet (rightmost cell of row 16).
• Calculate X = (Xa + Xb + Xc)/3 and enter it in the data sheet (rightmost
cell of row 16).
Sample 1 Sample 2 Sample 3 Sample 4 Sample 5 Sample 6 Sample 7 Sample 8 Sample 9 Sample 10
Trial 1 Row 1
Opr A Trial 2 Row 2
Trial 3 Row 3
Average (XA ) Row 4
Range (RA) Row 5
Trial 1 Row 6
Opr B Trial 2 Row 7
Trial 3 Row 8
Average (XB ) Row 9
Range (RB) Row 10
Trial 1 Row 11
Opr C Trial 2 Row 12
Trial 3 Row 13
Average (XC ) Row 14
Range (RC) Row 15

X Row 16
CALCULATION - FOR GRAPH PREPARATION

• Calculate R = (Ra + Rb + Rc)/3 and enter the result in row 17.

• Calculate Xdiff = difference of the max and min of (Xa, Xb & Xc) and enter it in row 18.
• Calculate UCLR = D4 x R and enter it in row 19 (D4 = 3.27 for 2 trials & 2.58 for 3).
• Calculate LCLR = D3 x R and enter it in row 20 (D3 = 0 for trials < 7).
• Calculate UCLX = X + A2 x R (A2 = 1.88 for 2 trials & 1.02 for 3 trials).
• Calculate LCLX = X - A2 x R.
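A minimal sketch of these graph-preparation calculations, assuming three trials per part. Only the first two parts of the worked GR&R example later in this deck are listed to keep the code short; the constants are the 3-trial values quoted above.

```python
# Control limits for the appraiser Range and X-bar charts (3 trials per part).
A2, D3, D4 = 1.02, 0.0, 2.58   # 3-trial constants from the text above

data = {  # data[appraiser] = list of parts, each part = list of trial readings
    "A": [[82.30, 82.30, 82.17], [80.75, 80.90, 81.09]],
    "B": [[82.14, 82.10, 82.22], [81.03, 81.00, 81.00]],
    "C": [[82.12, 82.18, 82.21], [80.88, 80.93, 80.95]],
}

r_bar = {}   # average range per appraiser (Ra, Rb, Rc)
x_bar = {}   # average reading per appraiser (Xa, Xb, Xc)
for name, parts in data.items():
    ranges = [max(p) - min(p) for p in parts]
    means = [sum(p) / len(p) for p in parts]
    r_bar[name] = sum(ranges) / len(ranges)
    x_bar[name] = sum(means) / len(means)

r_dbar = sum(r_bar.values()) / len(r_bar)           # R (row 17)
x_dbar = sum(x_bar.values()) / len(x_bar)           # X (grand average)
x_diff = max(x_bar.values()) - min(x_bar.values())  # Xdiff (row 18)

ucl_r, lcl_r = D4 * r_dbar, D3 * r_dbar             # rows 19 and 20
ucl_x, lcl_x = x_dbar + A2 * r_dbar, x_dbar - A2 * r_dbar
print(r_dbar, x_diff, ucl_r, lcl_r, ucl_x, lcl_x)
```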


R&R- GRAPHICAL ANALYSIS
RANGE CHARTS
– Used to determine whether the process is in statistical control.
– The special causes need to be identified and removed
– Plot all the ranges for each part & each appraiser on range chart
– If all ranges are under control, all appraisers are doing the same job.
– If one appraiser is out of control, the method used differs from the others.
– Repeat any readings that produced a range greater than the calculated
UCLR using the same appraiser and part as originally used.

– Or, discard those values and re-average and recompute R and the limiting
value UCLR based upon the revised sample size.

– Correct the special cause that produced the out of control condition.
R&R- AVERAGE CHART
– Plot the averages of the multiple readings by each appraiser on each part
(rows 4, 9 & 14) on X chart.
– The X chart provides an indication of “usability” of the measurement
system.
– The area within the control limits represents the measurement sensitivity
– Approximately one half or more of the averages should fall outside the
control limits.
– If the data show this pattern, then the measurement system should be
adequate to detect part-to-part variation and can be used for
analyzing and controlling the process.
– If less than half fall outside the control limits then either the
measurement system lacks adequate effective resolution or the sample
does not represent the expected process variation.
AVERAGE AND RANGE CHARTS
[Figure: Average (X-bar) charts for operators A and B and Range charts for operators A and B, plotted over parts 1-10]
R&R - ANALYSIS OF RESULTS - NUMERICAL

Calculate the following and record them in the report sheet:

– Repeatability or equipment variation: EV = R x K1, where K1 = 1/d2*
  (K1 = 0.8862 for 2 trials & 0.5908 for 3 trials)

– Reproducibility or appraiser variation: AV = √[(XDIFF x K2)² - EV²/(n·r)]
  (K2 = 0.7071 for 2 appraisers & 0.5231 for 3; n = number of parts, r = number of trials)

– GRR = √[(EV)² + (AV)²]

– Part-to-part variation: PV = RP x K3

– Total variation: TV = √[(GRR)² + (PV)²]
NUMERICAL ANALYSIS
– Calculate % variation and ndc as follows
%EV = 100 [EV/TV]
%AV = 100 [AV/TV]
%GRR = 100 [GRR/TV]
%PV = 100 [PV/TV]
No. of distinct categories (ndc)= 1.41(PV/GRR)

Note:
– In case measurement system is to be used for product control instead of process
control, TV should be replaced with specification tolerance.
– The sum of the percent consumed by each factor will not equal 100%.
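The numerical analysis above can be written out as a short Python sketch. The input values below (average range, appraiser difference and part range) are illustrative placeholders, not taken from the worked example in this deck; the K constants are the 3-trial, 3-appraiser, 10-part values, and a negative value under the square root for AV is conventionally taken as zero.

```python
import math

# Average and Range Method - numerical analysis (illustrative inputs).
r_dbar = 0.20    # average of the appraiser average ranges (assumed)
x_diff = 0.05    # max - min of the appraiser averages (assumed)
r_p = 4.0        # range of the part averages (assumed)
n_parts, n_trials = 10, 3

K1 = 0.5908      # 3 trials
K2 = 0.5231      # 3 appraisers
K3 = 0.3146      # 10 parts

EV = r_dbar * K1                                    # repeatability (equipment variation)
av_sq = (x_diff * K2) ** 2 - EV ** 2 / (n_parts * n_trials)
AV = math.sqrt(max(av_sq, 0.0))                     # negative radicand -> AV taken as 0
GRR = math.sqrt(EV ** 2 + AV ** 2)
PV = r_p * K3                                       # part variation
TV = math.sqrt(GRR ** 2 + PV ** 2)

for name, value in (("EV", EV), ("AV", AV), ("GRR", GRR), ("PV", PV)):
    print(f"%{name} = {100 * value / TV:.2f}%")
print(f"ndc = {1.41 * PV / GRR:.1f}")
```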
MEASUREMENT SYSTEM ERRORS

Repeatability > Reproducibility
- Instrument needs maintenance
- Improve clamping or location of gaging
- Excessive within-part variation

Reproducibility > Repeatability
- Appraisers need better gage-use training
- Need a better operational definition
- Need a fixture to provide consistency in gage use
Decision Making Criteria
1. For %R&R
Error < 10% - MS is acceptable

10% < Error < 30% - May be acceptable with justification

Error > 30% - MS needs improvement

2. ndc >= 5
%GRR
%GRR is the comparison of GRR w.r.t. the specified tolerance or the total
variation of the process.

%GRR = 100 x GRR / (Tolerance or TV)

Condition                                            Guideline
TV > tolerance                                       Compare against tolerance
TV < tolerance, but the measured parameter / MS      Compare against tolerance
is not required for SPC
TV < tolerance and the parameter is under SPC study  Compare against total variation
Number of Distinct Categories
It is calculated by dividing the standard deviation for parts by the standard
deviation for the gage, then multiplying by 1.41:
ndc = 1.41 (PV/GRR)
This number represents the number of non-overlapping confidence intervals that
will span the range of product variation. You can also think of it as the number
of groups within your process data that your measurement system can distinguish.
The Automotive Industry Action Group (AIAG) suggests that when the number of
categories is less than 2, the measurement system is of no value for controlling
the process, since one part cannot be distinguished from another.
When the number of categories is 2, the data can be divided into two groups, say
high and low.
When the number of categories is 3, the data can be divided into 3 groups, say
low, middle and high.
A value of 5 or more denotes an acceptable measurement system.
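As a quick illustration, using the PV and GRR reported in the GR&R study example that follows, the ndc works out to roughly 22-23, comfortably above the minimum of 5:

```python
PV, GRR = 1.5608, 0.0973   # part variation and GRR from the worked study that follows
ndc = 1.41 * PV / GRR
print(f"ndc = {ndc:.1f} -> {'acceptable' if ndc >= 5 else 'needs improvement'}")
```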
MEASUREMENT SYSTEM ANALYSIS - VARIABLE STUDY (GR&R), PLANT III

PART NAME: Molding Roof            DATE: 08.01.09                      NOMINAL DIMENSION: 80
PART NO.: 78150/60 M79F01          CHARACTERISTIC: Lip Length          UPPER / LOWER TOLERANCE: 3 / 3 (total 6)
MODEL: B                           EQUIPMENT USED: Vernier Caliper (VC-14)
APPRAISER A: Rajendra Kumar        APPRAISER B: Vivek Sharma           APPRAISER C: Gopal Chauhan
APPRAISEE: M. Negi                 %R&R: 6.22 %                        ndc: 23

Appraiser A (parts 1-10)
Trial 1:  82.30 80.75 77.05 80.28 80.06 80.25 79.61 78.61 80.37 82.10
Trial 2:  82.30 80.90 77.24 80.14 79.82 80.02 79.48 78.46 79.97 82.21
Trial 3:  82.17 81.09 77.28 80.09 80.06 80.06 79.55 78.32 80.15 82.33
Average:  82.26 80.91 77.19 80.17 79.98 80.11 79.55 78.46 80.16 82.21   XA = 80.10
Range:     0.13  0.34  0.23  0.19  0.24  0.23  0.13  0.29  0.40  0.23   RA = 0.24

Appraiser B (parts 1-10)
Trial 1:  82.14 81.03 77.49 80.14 79.98 80.13 79.69 78.51 80.00 82.25
Trial 2:  82.10 81.00 77.22 80.00 80.02 79.98 79.56 78.47 79.90 82.28
Trial 3:  82.22 81.00 77.28 80.19 79.89 80.00 79.56 78.37 80.01 82.06
Average:  82.15 81.01 77.33 80.11 79.96 80.04 79.60 78.45 79.97 82.20   XB = 80.08
Range:     0.12  0.03  0.27  0.19  0.13  0.15  0.13  0.14  0.11  0.22   RB = 0.15

Appraiser C (parts 1-10)
Trial 1:  82.12 80.88 77.21 80.19 80.02 80.00 79.60 78.74 80.20 82.34
Trial 2:  82.18 80.93 77.31 80.15 80.08 79.90 79.51 78.60 80.11 82.27
Trial 3:  82.21 80.95 77.35 80.11 79.99 79.95 79.56 78.56 80.16 82.24
Average:  82.17 80.92 77.29 80.15 80.03 79.95 79.56 78.63 80.16 82.28   XC = 80.11
Range:     0.09  0.07  0.14  0.08  0.09  0.10  0.09  0.18  0.09  0.10   RC = 0.10

Part averages (Xp): 82.19 80.95 77.27 80.14 79.99 80.03 79.57 78.52 80.10 82.23   Rp = 4.96

X (grand average) = 80.0990   R (average range) = 0.1643   XDIFF = 0.0317
UCLX = 80.2666   LCLX = 79.9314   UCLR = 0.4232   LCLR = 0.0000
Constants used (3 trials, 3 appraisers): A2 = 1.02, D3 = 0, D4 = 2.575, K1 = 0.5908, K2 = 0.523

MEASUREMENT UNIT ANALYSIS                          COMPARISON TO TOTAL VARIATION   COMPARISON TO TOLERANCE
1 Repeatability - Equipment Variation (EV)
  EV = R x K1 = 0.0971                             %EV = 100 (EV/TV) = 6.21 %      %EV = 100 (EV/Tol) = 1.62 %
2 Reproducibility - Appraiser Variation (AV)
  AV = √[(XDIFF x K2)² - EV²/(n·r)] = 0.0063       %AV = 100 (AV/TV) = 0.4 %       %AV = 100 (AV/Tol) = 0.11 %
  (n = number of parts, r = number of trials)
3 Repeatability & Reproducibility (GRR)
  GRR = √(EV² + AV²) = 0.0973                      %R&R = 100 (GRR/TV) = 6.22 %    %R&R = 100 (GRR/Tol) = 1.62 %
4 Part Variation (PV)
  PV = Rp x K3 = 1.5608                            %PV = 100 (PV/TV) = 99.81 %     %PV = 100 (PV/Tol) = 26.01 %
5 Total Variation (TV)
  TV = √(GRR² + PV²) = 1.5638                      ndc = 1.41 (PV/GRR) = 23        ndc = 1.41 (PV/GRR) = 22.62

ACCEPTABILITY CRITERIA
1 If %R&R < 10%, the system is acceptable.
2 If 10% < %R&R < 30%, the system may be acceptable based on importance of application, cost of gage, cost of repair, etc.
3 If %R&R > 30%, the system needs improvement.
4 ndc >= 5

[Figure: Average (X-bar) charts and Range charts for operators A, B and C, plotted over parts 1-10]
ATTRIBUTE MEASUREMENT
SYSTEMS STUDY
Attribute MSA is applicable when the measurement value is one of a finite
number of categories. The most common example is a go/no-go gauge.
Other examples are leak or no leak, crack or no crack, defect or no defect,
pass or fail, complete or incomplete.

METHOD
Hypothesis Test Analysis - Cross-Tab Method
ATTRIBUTE MEASUREMENT
SYSTEMS STUDY

METHODOLOGY:

– Select approx 20 parts as follows:

– Approximately one third conforming,

– One third non conforming & one third marginal (marginal conforming &
marginal non conforming)

– Note down the correct measurement attribute (true status).

– Decide the no. of appraisers & no. of trials to be conducted.

– Record the measurement result in data sheet, for Not OK decision record 0
and for OK decision record 1.
DATA SHEET FOR ATTRIBUTE MSA
STUDY
MEASUREMENT SYSTEM ANALYSIS
PLANT - III
ATTRIBUTE STUDY
PART NAME Samples of Molding & Extrusion parts EQUIPMENT USED VISUAL
PART NO. - APPRAISER A SHYAM SUNDER
MODEL ALL APPRAISER B DEELIP MALIK
DATE 12.11.07 APPRAISER C SOHAN PAL
CHAR. OK / NG Judgment APPRAISEE MNEGI
PART A1 A2 A3 B1 B2 B3 C1 C2 C3 REF. CODE

1 1 1 0 1 0 1 0 1 0 1

2 0 0 0 0 0 0 1 0 0 0

3 0 1 0 1 0 0 0 0 0 0

4 1 1 1 1 1 0 1 1 1 1

5 0 0 0 0 0 0 0 0 1 0

6 1 1 1 1 1 1 1 1 1 1

7 1 1 1 1 1 1 1 0 1 1

8 0 0 0 0 0 0 0 0 0 0

9 1 1 1 1 1 1 1 1 1 1

10 1 1 1 1 1 1 1 1 1 1

11 0 0 0 0 0 0 0 0 0 0

12 1 1 1 1 1 0 1 0 1 1

13 0 0 0 0 0 0 0 0 0 0

14 1 1 1 1 1 1 1 1 1 1

15 1 1 1 1 1 1 1 1 1 1

16 1 1 1 1 1 1 1 1 1 1

17 1 0 0 0 0 0 0 0 0 0

18 0 0 0 0 0 0 0 0 0 0

19 1 1 1 1 1 1 1 1 1 1

20 1 1 1 1 1 1 1 1 1 1
ATTRIBUTE MEASUREMENT
SYSTEMS STUDY

Possible outcome by the appraiser:


– Correct decisions
– Calling good part as good (good-correct)
– Calling bad part as bad (bad-correct)
– Wrong decisions
– Calling good part as bad (false alarm)
– Calling bad part as good (miss)
– For each appraiser count the data as follows
– Number good-correct (GN)
– Number bad -correct (NB)
– Number correct (CN): total (GN+NB)
– Number false alarm(NF): that is rejecting ok part
– Number miss(NM): that is accepting rejected part
– Number total (TN): total of (GN+NB+NF+NM)
ATTRIBUTE MEASUREMENT
SYSTEMS STUDY

–For each appraiser calculate the following

o Kappa
o Effectiveness
o False alarm
o Miss Rate
Kappa
Kappa indicates the degree of agreement of the
nominal or ordinal assessments made by multiple
appraisers when evaluating the same samples.
Kappa statistics are commonly used in cross
tabulation (table) applications in attribute
agreement analysis (attribute gage R&R).

0 < kappa < 1

A general rule of thumb is that a kappa value greater than 0.75 indicates good
to excellent agreement, while values less than 0.40 indicate poor agreement.
Effectiveness:

The ability to accurately detect conforming and nonconforming parts. It is
expressed as a number between 0 and 1, where 1 is perfect. It is computed as

Effectiveness (E) = No. of parts correctly identified / Total opportunities to be correct

The total opportunities to be correct are a function of the number of parts used
and how many times each part is inspected. If 10 parts are selected and each is
inspected three times, there are a total of 3 x 10 = 30 opportunities to be correct.
Probability of a Miss, Pmiss:
The probability of a miss is the chance of not rejecting a nonconforming part.
This is a serious type of error, since a nonconforming part is accepted.

The probability of a miss is computed as

Pmiss = No. of misses / No. of opportunities for a miss

No. of opportunities for a miss
= total samples that are bad (nonconforming) in the study x number of trials per operator
Probability of a False Alarm, Pfa:
The probability of a false alarm is the chance of rejecting a conforming part.
This type of error is not as serious as a miss, since a conforming part is rejected.

The probability of a false alarm is computed as

Pfa = No. of false alarms / No. of opportunities for a false alarm

No. of opportunities for a false alarm
= total samples that are good (conforming) in the study x number of trials per operator
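A minimal sketch of how the miss rate, false alarm rate and effectiveness can be counted from an attribute data sheet. The five parts below are illustrative (they are not the data sheet shown earlier), 1 = OK and 0 = Not OK as in this study, and effectiveness is computed here as correct decisions over total decision opportunities, per the formula on the Effectiveness slide.

```python
# Attribute study metrics for one appraiser (1 = OK, 0 = Not OK).
# Each tuple is (reference status, decisions over three trials).
parts = [
    (1, [1, 1, 1]),   # good part, accepted every time
    (1, [1, 0, 1]),   # good part, rejected once  -> one false alarm
    (0, [0, 0, 0]),   # bad part, rejected every time
    (0, [1, 0, 0]),   # bad part, accepted once   -> one miss
    (1, [1, 1, 1]),
]

false_alarms = sum(d == 0 for ref, trials in parts if ref == 1 for d in trials)
misses = sum(d == 1 for ref, trials in parts if ref == 0 for d in trials)
fa_opportunities = sum(len(trials) for ref, trials in parts if ref == 1)    # good parts x trials
miss_opportunities = sum(len(trials) for ref, trials in parts if ref == 0)  # bad parts x trials

correct = sum(d == ref for ref, trials in parts for d in trials)
total = sum(len(trials) for _, trials in parts)

print(f"Effectiveness = {correct / total:.2%}")
print(f"Pfa   = {false_alarms / fa_opportunities:.2%}")
print(f"Pmiss = {misses / miss_opportunities:.2%}")
```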
Decision Criteria

DECISION                                   EFFECTIVENESS   MISS RATE   FALSE ALARM RATE
ACCEPTABLE FOR THE APPRAISER                  >= 90%         <= 2%         <= 5%
MARGINALLY ACCEPTABLE FOR THE APPRAISER
- MAY NEED IMPROVEMENT                        >= 80%         <= 5%         <= 10%
UNACCEPTABLE FOR THE APPRAISER
- NEEDS IMPROVEMENT                           < 80%          > 5%          > 10%
If you have any query, contact: msnegi@ppapco.com
To be continued . . .
EXAMPLE CROSS TAB
METHOD
Part A-1 A-2 A-3 B-1 B-2 B-3 C-1 C-2 C-3 Ref. Ref. value Code
1 P P P P P P P P P P 0.476901 +
2 P P P P P P P P P P 0.509015 +
3 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.576459 -
4 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.566152 -
5 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.570360 -
6 P P NEG P P NEG P NEG NEG P 0.544951 x
7 P P P P P P P NEG P P 0.465454 x
8 P P P P P P P P P P 0.502295 +
9 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.437817 -
10 P P P P P P P P P P 0.515573 +
11 P P P P P P P P P P 0.488905 +
12 NEG NEG NEG NEG NEG NEG NEG P NEG NEG 0.559918 x
13 P P P P P P P P P P 0.542704 +
14 P P NEG P P P P NEG NEG P 0.454518 x
15 P P P P P P P P P P 0.517377 +
16 P P P P P P P P P P 0.531939 +
17 P P P P P P P P P P 0.519694 +
18 P P P P P P P P P P 0.484167 +
19 P P P P P P P P P P 0.520496 +
20 P P P P P P P P P P 0.477236 +
21 P P NEG P NEG P NEG P NEG P 0.452310 x
22 NEG NEG P NEG P NEG P P NEG NEG 0.545604 x
23 P P P P P P P P P P 0.529065 +
24 P P P P P P P P P P 0.514192 +
25 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.599581 -
26 NEG P NEG NEG NEG NEG NEG NEG P NEG 0.547204 x
27 P P P P P P P P P P 0.502436 +
28 P P P P P P P P P P 0.521642 +
29 P P P P P P P P P P 0.523754 +
30 NEG NEG NEG NEG NEG P NEG NEG NEG NEG 0.561457 x
31 P P P P P P P P P P 0.503091 +
32 P P P P P P P P P P 0.505850 +
33 P P P P P P P P P P 0.487613 +
34 NEG NEG P NEG NEG P NEG P P NEG 0.449696 x
35 P P P P P P P P P P 0.498698 +
36 P P NEG P P P P NEG P P 0.543077 x
37 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.409238 -
38 P P P P P P P P P P 0.488184 +
39 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.427687 -
40 P P P P P P P P P P 0.501132 +
41 P P P P P P P P P P 0.513779 +
42 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.566575 -
43 P NEG P P P P P P NEG P 0.462410 x
44 P P P P P P P P P P 0.470832 +
45 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.412453 -
46 P P P P P P P P P P 0.493441 +
47 P P P P P P P P P P 0.486379 +
48 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.587893 -
49 P P P P P P P P P P 0.483803 +
50 NEG NEG NEG NEG NEG NEG NEG NEG NEG NEG 0.446697 -
KAPPA between operators Total Total
A=N & B=P 0 2 4 6 A=N & B=N 16 14 14 44 31.33
A=P & B=N 0 2 1 3 A=P & B=P 34 32 31 97 68.67
B=N & C=P 1 3 1 5 B=N & C=N 15 13 14 42 31.33
B=P & C=N 1 4 4 9 B=P & C=P 33 30 31 94 68.67
A=N & C=P 1 4 2 7 A=N & C=N 15 12 16 43 33.33
A=P & C=N 1 5 2 8 A=P & C=P 33 29 30 92 66.67

KAPPA between operators & REF value

A=N / Ref=P: 5    B=N / Ref=P: 2    C=N / Ref=P: 9    = false alarm
A=N / Ref=N: 45   B=N / Ref=N: 45   C=N / Ref=N: 42   = correct decision
A=P / Ref=P: 97   B=P / Ref=P: 100  C=P / Ref=P: 93   = correct decision
A=P / Ref=N: 3    B=P / Ref=N: 3    C=P / Ref=N: 6    = missed bad parts

Operator agreement with the reference, counted per part over all three trials:
            Matched (good)   100% missed   Mixed   Matched (bad)   Matched (good & bad)
A                29               0           8          13                42
B                32               0           5          13                45
C                28               0          10          12                40
A to B cross-tabulation

                          B = 0    B = 1    Total        Kappa A-B
A = 0   Count               44        6       50         Po = 0.94
        Expected count     15.7     34.3     50.0        Pe = 0.56
A = 1   Count                3       97      100
        Expected count     31.3     68.7    100.0        kappa = (Po - Pe) / (1 - Pe)
Total   Count               47      103      150               = 0.38 / 0.44 = 0.86
        Expected count     47.0    103.0    150.0

Po = the sum of the observed proportions in the diagonal cells
Pe = the sum of the expected proportions in the diagonal cells
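The same Po/Pe arithmetic can be written out directly. The counts below are the A-to-B table above; the sketch reproduces Po = 0.94, Pe ≈ 0.56 and kappa ≈ 0.86.

```python
# Kappa from a 2x2 cross tabulation (0 = NEG, 1 = P).
# Counts are the A-to-B table above: rows = appraiser A, columns = appraiser B.
counts = [[44, 6],    # A = 0: (B = 0, B = 1)
          [3, 97]]    # A = 1: (B = 0, B = 1)

total = sum(sum(row) for row in counts)
row_tot = [sum(row) for row in counts]
col_tot = [sum(col) for col in zip(*counts)]

po = sum(counts[i][i] for i in range(2)) / total                  # observed agreement
pe = sum(row_tot[i] * col_tot[i] for i in range(2)) / total ** 2  # expected agreement
kappa = (po - pe) / (1 - pe)
print(f"Po = {po:.2f}, Pe = {pe:.2f}, kappa = {kappa:.2f}")
```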
B to C cross-tabulation

                          C = 0    C = 1    Total        Kappa B-C
B = 0   Count               42        5       47         Po = 0.91
        Expected count     16.0     31.0     47.0        Pe = 0.56
B = 1   Count                9       94      103
        Expected count     35.0     68.0    103.0        kappa = 0.35 / 0.44 = 0.79
Total   Count               51       99      150
        Expected count     51.0     99.0    150.0

A to C cross-tabulation

                          C = 0    C = 1    Total        Kappa A-C
A = 0   Count               43        7       50         Po = 0.90
        Expected count     17.0     33.0     50.0        Pe = 0.55
A = 1   Count                8       92      100
        Expected count     34.0     66.0    100.0        kappa = 0.35 / 0.45 = 0.78
Total   Count               51       99      150
        Expected count     51.0     99.0    150.0
A to REF cross-tabulation

                         REF = 0  REF = 1   Total        Kappa A-REF
A = 0   Count               45        5       50         Po = 0.95
        Expected count     16.0     34.0     50.0        Pe = 0.56
A = 1   Count                3       97      100
        Expected count     32.0     68.0    100.0        kappa = 0.39 / 0.44 = 0.88
Total   Count               48      102      150
        Expected count     48.0    102.0    150.0

B to REF cross-tabulation

                         REF = 0  REF = 1   Total        Kappa B-REF
B = 0   Count               45        2       47         Po = 0.97
        Expected count     15.0     32.0     47.0        Pe = 0.57
B = 1   Count                3      100      103
        Expected count     33.0     70.0    103.0        kappa = 0.40 / 0.43 = 0.92
Total   Count               48      102      150
        Expected count     48.0    102.0    150.0

C to REF cross-tabulation

                         REF = 0  REF = 1   Total        Kappa C-REF
C = 0   Count               42        9       51         Po = 0.90
        Expected count     16.3     34.7     51.0        Pe = 0.56
C = 1   Count                6       93       99
        Expected count     31.7     67.3     99.0        kappa = 0.34 / 0.44 = 0.77
Total   Count               48      102      150
        Expected count     48.0    102.0    150.0
Appraiser   Number good-   Number bad-   Number     Number false   Number    Number
            correct        correct       correct    alarms         missed    total
A               97             45          142           5            3        150
B              100             45          145           2            3        150
C               93             42          135           9            6        150

Appraiser   Effectiveness (%)   Pfalse alarm (%)   Pmissed (%)
A               84.00                4.90              6.25
B               90.00                1.96              6.25
C               80.00                8.82             12.50

Effectiveness (E) =No. of parts correctly identified / Total opportunity to be correct

Pmiss = No. of misses / no. of opportunity for a miss.

Pfa = No. of misses / no. of opportunity for a false alarm.
