MSA 3RD EDITION (AIAG) Presented By Mahendra S. Negi
CONTENTS
1. Basic Terms (Discrimination, Errors, Reference Value)
2. Objective
3. Statistical Properties
4. Description of Location & Width Errors
5. Effects of Measurement System Errors
6. Variable Measurement Study
7. Attribute Measurement Study
BASIC TERMS
• Measurement
• Measurement System
• Gage
• Discrimination, Readability, Resolution
• Reference Value and True Value
• Uncertainty
Measurement
The assignment of numbers (values) to material things to represent the relations among them with respect to particular properties. — C. Eisenhart (1963)
Measurement System
The complete process used to obtain measurements, i.e. the combination of:
• operations,
• procedures,
• gauges and other equipment, software,
• personnel,
• environment, and
• assumptions.
Gage
A gage is any device used to obtain measurements; the term is frequently used to refer specifically to the devices used on the shop floor, and includes go/no-go devices.
Discrimination
The ability of the system to detect and indicate even small changes in the measured characteristic; also known as resolution. A measurement system is unacceptable for analysis if it cannot detect process variation.
True value
The actual value of an artifact. Unknown and unknowable.
Reference value
The accepted value of an artifact, used as a surrogate for the true value.
Uncertainty
An estimated range of values about the measured value in which the true value is believed to be contained. True Measurement = Observed Measurement ± U.
OBJECTIVES
1. To quantify the variation present in the measurement system.
2. To ensure the stability of the measurement system.
3. To initiate appropriate actions to minimise the contribution of measurement system variation to total process variation.
Statistical Properties of Measurement System (MS)
ADEQUATE DISCRIMINATION & SENSITIVITY:
1. Discrimination should be small relative to the process variation or the specification limits.
   • Rule of tens: the least count (resolution) of the equipment should be at most 1/10th of the process variation (giving 10 data categories).
2. MS should be in statistical control
Under repeatability conditions, the measurement system variation is due to common causes only and no special causes
Statistical Properties of Measurement System (MS)
3. The measurement error must be small compared to a) The tolerance spread b) The process variability
INADEQUATE DISCRIMINATION
Inadequate discrimination of a measurement system shows on the range chart when:
 only one, two, or three distinct values for the ranges can be read, or
 more than 1/4 of the ranges are zero.
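As a rough sketch, the two rules above can be checked programmatically (the range values below are made-up illustration data, not from any study in this deck):

```python
# Hypothetical quick check of the two discrimination rules on a range chart.
def discrimination_ok(ranges):
    """Flag inadequate discrimination per the two rules above."""
    distinct = len(set(ranges))                     # rule 1: <= 3 distinct range values
    zero_fraction = ranges.count(0) / len(ranges)   # rule 2: > 1/4 of ranges are zero
    return distinct > 3 and zero_fraction <= 0.25

print(discrimination_ok([0.1, 0.2, 0.1, 0.2, 0.1, 0.2]))        # only 2 distinct values -> False
print(discrimination_ok([0.11, 0.23, 0.08, 0.17, 0.30, 0.05]))  # varied, no zeros -> True
```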
RANGE CHART WITH DIFFERENT DISCRIMINATIONS
MEASUREMENT SYSTEM ERRORS
1. LOCATION ERROR
2. WIDTH ERROR
MEASUREMENT SYSTEM ERRORS
1. LOCATION ERROR
[Figure: actual variation vs. observed variation due to MS error]
MEASUREMENT SYSTEM ERRORS
2. WIDTH (SPREAD) ERROR
[Figure: actual variation vs. observed variation due to MS error]
Effects of Measurement system Errors on Measurement Decision
1. EFFECT ON PRODUCT CONTROL:
1a. Calling a good part bad (Type I error, producer's risk, false alarm)
1b. Calling a bad part good (Type II error, consumer's risk, miss rate)
Effects of Measurement system Errors on Measurement Decision
2. EFFECT ON PROCESS CONTROL:
2a. Calling a common cause a special cause (Type I error)
2b. Calling a special cause a common cause (Type II error)
COMMON CAUSE
Consists of many individual causes. Cannot be economically eliminated. The process follows a predictable pattern. GOK (God Only Knows).
ASSIGNABLE CAUSE
Consists of just one or a few individual causes. Easy to detect and generally economical to eliminate. No specific pattern. HAK (Humans Also Know).
EXAMPLE OF LOCATION ERRORS
1. ACCURACY
2. BIAS
3. STABILITY
4. LINEARITY
EXAMPLE OF WIDTH ERRORS
1. PRECISION
2. REPEATABILITY
3. REPRODUCIBILITY
4. GRR
Accuracy
Closeness to the true value.
Bias
– The difference between the observed average of measurements and the reference value of the same characteristic on the same part.
– A measure of the systematic error of the measurement system.
– It is the contribution to the total error from the combined effects of all sources of variation, known or unknown.
[Figure: bias as the difference between the observed average value and the reference value]
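A minimal sketch of the bias calculation above, using hypothetical readings of a part whose reference value is assumed to be 6.00 (not data from the slides):

```python
# Bias = observed average of repeated readings minus the reference value.
# Readings and reference are hypothetical illustration values.
readings = [5.8, 5.7, 5.9, 5.9, 6.0, 6.1, 6.0, 6.1, 6.4, 6.3]
reference = 6.00

observed_average = sum(readings) / len(readings)
bias = observed_average - reference
print(f"observed average = {observed_average:.3f}, bias = {bias:+.3f}")
```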
Possible causes for Excessive bias
1. Instrument needs calibration.
2. Worn instrument, equipment, or fixture.
3. Wrong gage for the application.
4. Different measurement method: setup, loading, clamping.
5. Distortion (gage or part).
6. Parallax.
7. Environment: temperature, humidity, vibration, cleanliness.
Linearity
– The difference in the bias values over the expected operating (measurement) range of the equipment.
– It is the change of bias with respect to size.
[Figure: bias plotted against reference value at measurement points 1–3, illustrating no linearity error, constant linearity error, and non-linear behaviour]
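Since linearity is the change of bias across the operating range, it can be estimated by fitting a least-squares line to bias versus reference value; the bias values below are hypothetical, and a non-zero slope indicates a linearity error:

```python
# Linearity sketch: bias measured at several reference values across the
# gage's range (hypothetical numbers).
refs = [2.0, 4.0, 6.0, 8.0, 10.0]
biases = [0.04, 0.02, 0.00, -0.02, -0.04]  # bias shrinks as size grows

n = len(refs)
mean_x = sum(refs) / n
mean_y = sum(biases) / n
# Ordinary least-squares slope and intercept of bias vs. reference value
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(refs, biases)) / \
        sum((x - mean_x) ** 2 for x in refs)
intercept = mean_y - slope * mean_x
print(f"bias ~ {slope:.3f} * reference + {intercept:.3f}")
```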
Stability (Drift)
The total variation in the measurements obtained with a measurement system
• on the same master or parts,
• when measuring a single characteristic,
• over an extended time period.
i.e. stability is the change of bias over time.
CAUSES OF GAGE STABILITY ERROR
• Environment or system changes, such as humidity or air pressure
• Infrequent calibration
• Lack of air pressure regulator or filter
• Warm-up period for electronic or other gages
• Lack of maintenance
• Wear
• Oxidation (corrosion)
Width Errors
Precision
The closeness of repeated readings to each other. Precision is often denoted by σgauge, the standard deviation of the measurement system. The smaller the spread of the distribution, the better the precision. Precision can be separated into two components, repeatability and reproducibility.
Accuracy v/s Precision
Precise but not accurate
Accurate but not precise
Not accurate or precise
Accurate and precise
Repeatability (Within system variation)
The variation in measurements obtained
• with one measurement instrument
• when used several times
• by one appraiser
• while measuring the identical characteristic
• on the same part.
Repeatability is shown by the range (R) chart.
Repeatability
Note: Repeatability is commonly referred to as equipment variation (EV).
Reproducibility (Between system variation)
The variation in the average of the measurements
• made by different appraisers
• using the same measuring instrument
• when measuring the identical characteristic
• on the same part.
Reproducibility is shown by the average (X̄) chart.
[Figure: reproducibility — measurement distributions of appraisers A, B, and C]
Note: Reproducibility is commonly referred to as appraiser variation (AV).
Gage Repeatability & Reproducibility(GRR)
An estimate of the combined variation of repeatability and reproducibility.

GRR is the variance equal to the sum of the within-system and between-system variances:

σ²GRR = σ²repeatability + σ²reproducibility

[Figure: appraiser A, B, and C distributions combined into the overall GRR spread]
MEASUREMENT SYSTEM APPLICABILITY

[Table: which studies (bias, linearity, stability, R&R) apply to each measuring system — vernier caliper, micrometer, steel rule, temperature controller, torque wrench, height gauge, pneumatic absolute measuring system, pneumatic comparator, pressure gauge, volt meter.]
Methods to determine the repeatability and reproducibility
1. RANGE METHOD
2. AVERAGE AND RANGE METHOD
3. ANOVA METHOD
DIFFERENCE AMONG METHODS
Note: All methods ignore within-part variation (such as roundness, flatness, diametric taper).
Range Method
A quick approximation of measurement variability. It does not decompose the variability into repeatability and reproducibility.

GRR = R̄ / d2;  %GRR = 100 × (GRR / Tolerance)
PROCEDURE – RANGE METHOD
1. Select two or three appraisers who are users of the measurement system.
2. Obtain a sample of 10 parts that represent the actual or expected range of process variation.
3. Number the parts 1 through 10 so that the numbers are not visible to the appraisers.
4. Calibrate the gage if this is part of the normal gauging procedure.
5. Measure the 10 parts in random order by appraiser A, with an observer recording the results.
6. Repeat step 5 with the other appraisers, concealing the other appraisers' readings.
7. Calculate the range for each part and the average range.
8. Calculate GRR.
9. Calculate %GRR against total variation or tolerance.
EXERCISE ON RANGE METHOD
PART NAME: Molding Roof
PARAMETER: Width of Profile
GAGE USED: Vernier Caliper
DIMENSION: 28.0 ± 0.6
TOLERANCE: 0.60 MM
#     Appraiser A   Appraiser B   Range
1     27.48         27.48
2     28.52         28.50
3     27.80         27.80
4     28.45         28.44
5     28.20         28.21
6     27.92         27.93
7     28.30         28.31
8     27.66         27.66
9     27.85         27.84
10    27.98         28.00
                                  Average Range (R̄) =

Range (R) = MAX − MIN
R̄ = ΣRi / 10
GRR = R̄ / d2
%GRR = 100 × (GRR / Tolerance)
Factors Table
Sample Size   d2      A2     D3     D4
2             1.128   1.88   0.00   3.27
3             1.693   1.02   0.00   2.57
4             2.059   0.73   0.00   2.28
5             2.326   0.58   0.00   2.11
6             2.534   0.48   0.00   2.00
7             2.704   0.42   0.08   1.92
8             2.847   0.37   0.14   1.86
9             2.970   0.34   0.18   1.82
10            3.078   0.31   0.22   1.78
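As a sketch, the range-method steps above can be run on the exercise data, using d2 = 1.128 (two appraisers per part) and the slide's stated tolerance of 0.60:

```python
# Range-method GRR sketch using the two appraisers' readings from the
# exercise above; tolerance 0.60 as stated on the slide.
appraiser_a = [27.48, 28.52, 27.80, 28.45, 28.20, 27.92, 28.30, 27.66, 27.85, 27.98]
appraiser_b = [27.48, 28.50, 27.80, 28.44, 28.21, 27.93, 28.31, 27.66, 27.84, 28.00]
tolerance = 0.60
d2 = 1.128  # factors table, sample size 2

ranges = [abs(a - b) for a, b in zip(appraiser_a, appraiser_b)]
r_bar = sum(ranges) / len(ranges)     # average range
grr = r_bar / d2                      # GRR = R-bar / d2
pct_grr = 100 * grr / tolerance       # %GRR against tolerance
print(f"R-bar = {r_bar:.3f}, GRR = {grr:.4f}, %GRR = {pct_grr:.1f}%")
```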
AVERAGE AND RANGE METHOD
Average and Range Method:
The method allows the measurement system's variation to be decomposed into repeatability and reproducibility, but not their interaction.

GRR = √(EV² + AV²)

ndc = 1.41 × (PV / GRR); ndc ≥ 5 is required.

Note: The sum of the percentages consumed by each factor will not equal 100%.
EXAMPLE – CONDUCTING THE STUDY
– Selection of sample: n > 5 parts (representing process variation).
– Identification: numbered 1 through n (not visible to the appraisers).
– Location marking (easily visible and identifiable by the appraisers).
– Selection of appraisers: 2–3 routine appraisers.
– Selection of measuring equipment: calibrated routine equipment.
– Deciding the number of trials: 2–3.
– Data collection: using the data collection sheet, under normal measurement conditions, in random order.
DATA COLLECTION
1) Enter appraiser A's results in row 1 of the data collection sheet.
2) Enter appraisers B's and C's results in rows 6 and 11, respectively.
3) Repeat the cycle (2nd trial) and enter the data in rows 2, 7 and 12.
4) If three trials are needed, repeat the cycle and enter the data in rows 3, 8 and 13.
DATA COLLECTION SHEET
[Data collection sheet: columns for samples 1–10; rows 1–3 = Opr A trials 1–3, rows 4–5 = Opr A average and range; rows 6–8 = Opr B trials, rows 9–10 = Opr B average and range; rows 11–13 = Opr C trials, rows 14–15 = Opr C average and range.]
CALCULATION
• For appraiser A, calculate the average (X̄) and range (R) for each part and enter them in rows 4 and 5 respectively.
• Do the same for appraisers B and C and enter the results in rows 9, 10 and 14, 15 respectively.
• For appraiser A, calculate the average (X̄a) of all the averages (row 4) and the average (R̄a) of all the ranges (row 5) and enter them in the data sheet.
• Calculate X̄b, R̄b and X̄c, R̄c for appraisers B and C likewise and enter the results in the data sheet.
• Calculate the average of all the observations (rows 4, 9 and 14) for each part and enter the result in row 16.
• Calculate the part range Rp = difference of the max and min of row 16, and enter it in the data sheet (rightmost of row 16).
• Calculate X̿ = (X̄a + X̄b + X̄c)/3 and enter it in the data sheet (rightmost of row 16).
[Blank data sheet: samples 1–10 across; trials, averages (X̄A, X̄B, X̄C) and ranges (R̄A, R̄B, R̄C) for operators A, B and C in rows 1–15; part averages and X̿ in row 16.]
CALCULATION – FOR GRAPH PREPARATION
• Calculate R̿ = (R̄a + R̄b + R̄c)/3 and enter the result in row 17.
• Calculate X̄diff = difference of the max and min of (X̄a, X̄b, X̄c) and enter it in row 18.
• Calculate UCLR = D4 × R̿ and enter it in row 19 (D4 = 3.27 for 2 trials, 2.58 for 3).
• Calculate LCLR = D3 × R̿ and enter it in row 20 (D3 = 0 for fewer than 7 trials).
• Calculate UCLX̄ = X̿ + A2 × R̿ (A2 = 1.88 for 2 trials, 1.02 for 3 trials).
• Calculate LCLX̄ = X̿ − A2 × R̿.
R&R GRAPHICAL ANALYSIS – RANGE CHARTS
– Used to determine whether the process is in statistical control; special causes need to be identified and removed.
– Plot all the ranges for each part and each appraiser on the range chart.
– If all ranges are in control, all appraisers are doing the same job.
– If one appraiser is out of control, the method used differs from the others.
– Repeat any readings that produced a range greater than the calculated UCLR, using the same appraiser and part as originally used; or discard those values, re-average, and recompute R̿ and UCLR based on the revised sample size.
– Correct the special cause that produced the out-of-control condition.
R&R AVERAGE CHART
– Plot the averages of the multiple readings by each appraiser on each part (rows 4, 9 and 14) on the X̄ chart.
– The X̄ chart provides an indication of the "usability" of the measurement system; the area within the control limits represents the measurement sensitivity.
– Approximately one half or more of the averages should fall outside the control limits.
– If the data show this pattern, the measurement system should be adequate to detect part-to-part variation and can be used for analyzing and controlling the process.
– If fewer than half fall outside the control limits, either the measurement system lacks adequate effective resolution or the sample does not represent the expected process variation.
AVERAGE AND RANGE CHARTS
[Charts: average (X̄) charts and range charts for operators A and B.]
R&R ANALYSIS OF RESULTS – NUMERICAL
Calculate the following and record in report sheet
– Repeatability or equipment variation: EV = R̿ × K1, where K1 = 1/d2* (K1 = 0.8862 for 2 trials, 0.5908 for 3 trials).
– Reproducibility or appraiser variation: AV = √[(X̄diff × K2)² − EV²/(n × r)], where n = number of parts and r = number of trials (K2 = 0.7071 for 2 appraisers, 0.5231 for 3).
– GRR = √(EV² + AV²)
– Part-to-part variation: PV = Rp × K3
– Total variation: TV = √(GRR² + PV²)
NUMERICAL ANALYSIS
– Calculate the % variation and ndc as follows:
%EV = 100 × (EV/TV)
%AV = 100 × (AV/TV)
%GRR = 100 × (GRR/TV)
%PV = 100 × (PV/TV)

Number of distinct categories: ndc = 1.41 × (PV/GRR)
Note:
– If the measurement system is to be used for product control instead of process control, TV should be replaced with the specification tolerance.
– The sum of the percentages consumed by each factor will not equal 100%.
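A minimal sketch of the average-and-range calculations above, using the published K constants for 3 trials, 3 appraisers, and 10 parts; the summary inputs (R̿, X̄diff, Rp) are illustrative values, not from the study sheet:

```python
import math

# Average-and-range method sketch (3 trials, 3 appraisers, 10 parts).
K1 = 0.5908   # 3 trials
K2 = 0.5231   # 3 appraisers
K3 = 0.3146   # 10 parts
n, r = 10, 3  # parts, trials

r_bar_bar = 0.2   # R-double-bar: average of the appraisers' average ranges (illustrative)
x_diff = 0.1      # max - min of the appraiser averages (illustrative)
rp = 2.0          # range of the part averages (illustrative)

ev = r_bar_bar * K1                                                # repeatability
av = math.sqrt(max((x_diff * K2) ** 2 - ev ** 2 / (n * r), 0.0))   # reproducibility (floored at 0)
grr = math.sqrt(ev ** 2 + av ** 2)
pv = rp * K3                                                       # part-to-part variation
tv = math.sqrt(grr ** 2 + pv ** 2)                                 # total variation

pct_grr = 100 * grr / tv
ndc = int(1.41 * pv / grr)   # truncated to an integer
print(f"%GRR = {pct_grr:.1f}%, ndc = {ndc}")
```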
MEASUREMENT SYSTEM ERRORS
If repeatability > reproducibility:
 Instrument needs maintenance
 Improve clamping or location of gaging
 Excessive within-part variation
If reproducibility > repeatability:
 Appraisers need better training in gage use
 Need a better operational definition
 Need a fixture to provide consistency in gage use
Decision Making Criteria
1. For %R&R:
   Error < 10%: MS is acceptable
   10% < Error < 30%: may be acceptable with justification
   Error > 30%: MS needs improvement
2. ndc ≥ 5
%GRR
%GRR is the comparison of GRR with the specified tolerance or the total variation of the process:
%GRR = 100 × (GRR / Tolerance or TV)
Condition                                              Guideline
TV > tolerance                                         Compare against tolerance
TV < tolerance, but the MS is not required for SPC     Compare against tolerance
TV < tolerance, and the parameter is under SPC study   Compare against total variation
Number of Distinct Categories
It is calculated by dividing the standard deviation for parts by the standard deviation for the gage, then multiplying by 1.41: ndc = 1.41 × (PV/GRR). This number represents the number of non-overlapping confidence intervals that will span the range of product variation. You can also think of it as the number of groups within your process data that your measurement system can distinguish.
The Automotive Industry Action Group (AIAG) suggests that when the number of categories is less than 2, the measurement system is of no value for controlling the process, since one part cannot be distinguished from another. When the number of categories is 2, the data can be divided into two groups, say high and low. When the number of categories is 3, the data can be divided into three groups, say low, middle and high. A value of 5 or more denotes an acceptable measurement system.
MEASUREMENT SYSTEM ANALYSIS – VARIABLE STUDY (GR&R)

PART NAME: Molding Roof          DATE: 08.01.09
PART NO.: 78150/60               CHARACTERISTIC: Lip Length
MODEL: M79F01                    EQUIPMENT USED: Vernier Caliper (VC14)
PLANT: III                       NOMINAL DIMENSION: 80, TOLERANCE: +3 / −3
APPRAISER A: Rajendra Kumar      APPRAISER B: Vivek Sharma
APPRAISER C: Gopal Chauhan       APPRAISEE: M. Negi

Appraiser A
Part:     1      2      3      4      5      6      7      8      9      10
Trial 1:  82.30  80.75  77.05  80.28  80.06  80.25  79.61  78.61  80.37  82.10
Trial 2:  82.30  80.90  77.24  80.14  79.82  80.02  79.48  78.46  79.97  82.21
Trial 3:  82.17  81.09  77.28  80.09  80.06  80.06  79.55  78.32  80.15  82.33
Average:  82.26  80.91  77.19  80.17  79.98  80.11  79.55  78.46  80.16  82.21
Range:    0.13   0.34   0.23   0.19   0.24   0.23   0.13   0.29   0.40   0.23
X̄A = 80.10, R̄A = 0.24

Appraiser B
Part:     1      2      3      4      5      6      7      8      9      10
Trial 1:  82.14  81.03  77.49  80.14  79.98  80.13  79.69  78.51  80.00  82.25
Trial 2:  82.10  81.00  77.22  80.00  80.02  79.98  79.56  78.47  79.90  82.28
Trial 3:  82.22  81.00  77.28  80.19  79.89  80.00  79.56  78.37  80.01  82.06
Average:  82.15  81.01  77.33  80.11  79.96  80.04  79.60  78.45  79.97  82.20
Range:    0.12   0.03   0.27   0.19   0.13   0.15   0.13   0.14   0.11   0.22
X̄B = 80.08, R̄B = 0.15

Appraiser C
Part:     1      2      3      4      5      6      7      8      9      10
Trial 1:  82.12  80.88  77.21  80.19  80.02  80.00  79.60  78.74  80.20  82.34
Trial 2:  82.18  80.93  77.31  80.15  80.08  79.90  79.51  78.60  80.11  82.27
Trial 3:  82.21  80.95  77.35  80.11  79.99  79.95  79.56  78.56  80.16  82.24
Average:  82.17  80.92  77.29  80.15  80.03  79.95  79.56  78.63  80.16  82.28
Range:    0.09   0.07   0.14   0.08   0.09   0.10   0.09   0.18   0.09   0.10
X̄C = 80.11, R̄C = 0.10

Part avg: 82.19  80.95  77.27  80.14  79.99  80.03  79.57  78.52  80.10  82.23
Rp = 4.96,  R̿ = 0.1643,  X̄diff = 0.0317,  n = 10 parts, r = 3 trials

Constants used (3 trials, 3 appraisers, 10 parts): A2 = 1.020, D3 = 0, D4 = 2.575; K1 = 0.5908, K2 = 0.5231, K3 = 0.3146.

MEASUREMENT UNIT ANALYSIS
EV  = R̿ × K1                        = 0.0971
AV  = √[(X̄diff × K2)² − EV²/(n×r)]  = 0.0063
GRR = √(EV² + AV²)                  = 0.0973
PV  = Rp × K3                       = 1.5608
TV  = √(GRR² + PV²)                 = 1.5638

COMPARISON TO TOTAL VARIATION        COMPARISON TO TOLERANCE (tolerance = 6)
%EV  = 100 (EV/TV)  = 6.21%          %EV  = 100 (EV/Tol)  = 1.62%
%AV  = 100 (AV/TV)  = 0.40%          %AV  = 100 (AV/Tol)  = 0.11%
%R&R = 100 (GRR/TV) = 6.22%          %R&R = 100 (GRR/Tol) = 1.62%
%PV  = 100 (PV/TV)  = 99.81%         %PV  = 100 (PV/Tol)  = 26.01%
ndc = 1.41 (PV/GRR) = 22.62 → 23

ACCEPTABILITY CRITERIA: if %R&R < 10%, the system is acceptable; if 10% < %R&R < 30%, the system may be acceptable based on importance of application, cost of gage, cost of repair, etc.; if %R&R > 30%, the system needs improvement. NDC ≥ 5.

Result: %R&R = 6.22% and ndc = 23, so the measurement system is acceptable.

[Charts: average (X̄) charts and range charts for operators A, B and C.]
ATTRIBUTE MEASUREMENT SYSTEMS STUDY
Attribute MSA is applicable when the measurement value is one of a finite number of categories. The most common example is a go/no-go gauge. Other examples are leak/no leak, crack/no crack, defect/no defect, pass/fail, and complete/incomplete.
METHODS
Hypothesis Test Analysis – Cross-Tab Method
ATTRIBUTE MEASUREMENT SYSTEMS STUDY
METHODOLOGY:
– Select approximately 20 parts as follows: about one third conforming, one third non-conforming, and one third marginal (marginally conforming and marginally non-conforming).
– Note down the correct measurement attribute (true status) of each part.
– Decide the number of appraisers and the number of trials to be conducted.
– Record the measurement results in the data sheet; for a Not OK decision record 0, for an OK decision record 1.
DATA SHEET FOR ATTRIBUTE MSA STUDY

MEASUREMENT SYSTEM ANALYSIS – ATTRIBUTE STUDY
PART NAME: Samples of Molding & Extrusion parts    PLANT: III
PART NO.: ALL                                      DATE: 12.11.07
CHARACTERISTIC: OK / NG Judgment                   EQUIPMENT USED: Visual
APPRAISER A: Shyam Sunder    APPRAISER B: Deelip Malik
APPRAISER C: Sohan Pal       APPRAISEE: M. Negi

(1 = OK, 0 = Not OK)
Part   A1 A2 A3   B1 B2 B3   C1 C2 C3   Ref
1      1  1  0    1  0  1    0  1  0    1
2      0  0  0    0  0  0    1  0  0    0
3      0  1  0    1  0  0    0  0  0    0
4      1  1  1    1  1  0    1  1  1    1
5      0  0  0    0  0  0    0  0  1    0
6      1  1  1    1  1  1    1  1  1    1
7      1  1  1    1  1  1    1  0  1    1
8      0  0  0    0  0  0    0  0  0    0
9      1  1  1    1  1  1    1  1  1    1
10     1  1  1    1  1  1    1  1  1    1
11     0  0  0    0  0  0    0  0  0    0
12     1  1  1    1  1  0    1  0  1    1
13     0  0  0    0  0  0    0  0  0    0
14     1  1  1    1  1  1    1  1  1    1
15     1  1  1    1  1  1    1  1  1    1
16     1  1  1    1  1  1    1  1  1    1
17     1  0  0    0  0  0    0  0  0    0
18     0  0  0    0  0  0    0  0  0    0
19     1  1  1    1  1  1    1  1  1    1
20     1  1  1    1  1  1    1  1  1    1
ATTRIBUTE MEASUREMENT SYSTEMS STUDY
Possible outcome by the appraiser:
– Correct decisions:
  – calling a good part good (good-correct)
  – calling a bad part bad (bad-correct)
– Wrong decisions:
  – calling a good part bad (false alarm)
  – calling a bad part good (miss)
– For each appraiser, count the data as follows:
  – number good-correct (GN)
  – number bad-correct (NB)
  – number correct (CN) = GN + NB
  – number of false alarms (NF): rejecting an OK part
  – number of misses (NM): accepting a rejected part
  – number total (TN) = GN + NB + NF + NM
ATTRIBUTE MEASUREMENT SYSTEMS STUDY
– For each appraiser, calculate the following:
  o Kappa
  o Effectiveness
  o False alarm rate
  o Miss rate
Kappa
Kappa indicates the degree of agreement of the nominal or ordinal assessments made by multiple appraisers when evaluating the same samples. Kappa statistics are commonly used in cross-tabulation (table) applications in attribute agreement analysis (attribute gage R&R). 0 ≤ kappa ≤ 1. A general rule of thumb is that a kappa value greater than 0.75 indicates good to excellent agreement, while values less than 0.40 indicate poor agreement.
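As a sketch, kappa for a 2×2 cross-tab of two appraisers' accept/reject decisions can be computed as follows (the counts are illustrative, for 150 decisions):

```python
# Kappa sketch for a 2x2 cross-tab: rows = appraiser A (0/1), cols = appraiser B.
counts = [[44, 6],
          [3, 97]]

total = sum(sum(row) for row in counts)
row_totals = [sum(row) for row in counts]
col_totals = [counts[0][j] + counts[1][j] for j in range(2)]

# Po: observed proportion of agreement (diagonal cells)
po = (counts[0][0] + counts[1][1]) / total
# Pe: expected proportion of agreement if the appraisers were independent
pe = sum(row_totals[i] * col_totals[i] for i in range(2)) / total ** 2
kappa = (po - pe) / (1 - pe)
print(f"Po = {po:.2f}, Pe = {pe:.2f}, kappa = {kappa:.2f}")
```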
Effectiveness:
The ability to accurately detect conforming and non-conforming parts, expressed as a number between 0 and 1, where 1 is perfect:
Effectiveness (E) = no. of parts correctly identified / total opportunities to be correct
The total opportunities to be correct are a function of the number of parts used and how many times each part is inspected. If 10 parts are selected and each is inspected three times, there are 3 × 10 = 30 opportunities to be correct.
Probability of a Miss, Pmiss:
The probability of a miss is the chance of not rejecting a non-conforming part. This is a serious type of error, since a non-conforming part is accepted:
Pmiss = no. of misses / no. of opportunities for a miss
No. of opportunities for a miss = total samples in the study that are bad × number of trials per appraiser.
Probability of a False Alarm, Pfa:
The probability of a false alarm is the chance of rejecting a conforming part. This type of error is not as serious as a miss, since a conforming part is being rejected:
Pfa = no. of false alarms / no. of opportunities for a false alarm
No. of opportunities for a false alarm = total samples in the study that are good × number of trials per appraiser.
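A small sketch of the three attribute metrics, assuming a hypothetical study of 50 parts (34 conforming, 16 non-conforming) with 3 trials per appraiser; effectiveness here follows the per-part convention used in the cross-tab example later, where a part counts as correct only if every trial matches the reference:

```python
# Sketch of the three attribute-study metrics for one appraiser.
# All counts below are illustrative.
good_parts, bad_parts, trials = 34, 16, 3
parts_correct_all_trials = 42   # parts judged correctly on every trial
misses = 3                      # bad parts accepted (over all trials)
false_alarms = 5                # good parts rejected (over all trials)

effectiveness = parts_correct_all_trials / (good_parts + bad_parts)
p_miss = misses / (bad_parts * trials)        # opportunities = bad parts x trials
p_fa = false_alarms / (good_parts * trials)   # opportunities = good parts x trials
print(f"E = {effectiveness:.0%}, Pmiss = {p_miss:.2%}, Pfa = {p_fa:.2%}")
```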
Decision Criteria
DECISION                                                 EFFECTIVENESS   MISS RATE   FALSE ALARM RATE
Acceptable for the appraiser                             ≥ 90%           ≤ 2%        ≤ 5%
Marginally acceptable – may need improvement             ≥ 80%           ≤ 5%        ≤ 10%
Unacceptable for the appraiser – needs improvement       < 80%           > 5%        > 10%
If you have any query, contact: msnegi@ppapco.com
To be continued . . .
EXAMPLE – CROSS-TAB METHOD

[Data table: 50 parts, each judged P (pass) or NEG (fail) by appraisers A, B and C over three trials (A1–A3, B1–B3, C1–C3), with the reference decision and reference value for each part.]

Disagreement and agreement counts between operators (150 decisions):
A=N & B=P: 6    A=P & B=N: 3    A=N & B=N: 44    A=P & B=P: 97
B=N & C=P: 5    B=P & C=N: 9    B=N & C=N: 42    B=P & C=P: 94
A=N & C=P: 7    A=P & C=N: 8    A=N & C=N: 43    A=P & C=P: 92

Counts against the reference value:
A=N/Ref=P: 5 (false alarms)   A=N/Ref=N: 45   A=P/Ref=P: 97    A=P/Ref=N: 3 (misses)
B=N/Ref=P: 2 (false alarms)   B=N/Ref=N: 45   B=P/Ref=P: 100   B=P/Ref=N: 3 (misses)
C=N/Ref=P: 9 (false alarms)   C=N/Ref=N: 42   C=P/Ref=P: 93    C=P/Ref=N: 6 (misses)

Per-part agreement (50 parts):
Operator   100% match good   100% missed   Mixed   100% match bad   100% match good & bad
A          29                0             8       13               42
B          32                0             5       13               45
C          28                0             10      12               40

Cross-tabulations (counts, with expected counts in parentheses):
kappa = (Po − Pe)/(1 − Pe), where Po = sum of the observed proportions in the diagonal cells and Pe = sum of the expected proportions in the diagonal cells.

A × B:     B=0          B=1          Total
A=0        44 (15.7)    6 (34.3)     50
A=1        3 (31.3)     97 (68.7)    100
Total      47           103          150
Po = 0.94, Pe = 0.56, kappa(A,B) = 0.86

B × C:     C=0          C=1          Total
B=0        42 (16.0)    5 (31.0)     47
B=1        9 (35.0)     94 (68.0)    103
Total      51           99           150
Po = 0.91, Pe = 0.56, kappa(B,C) = 0.79

A × C:     C=0          C=1          Total
A=0        43 (17.0)    7 (33.0)     50
A=1        8 (34.0)     92 (66.0)    100
Total      51           99           150
Po = 0.90, Pe = 0.55, kappa(A,C) = 0.78

A × Ref:   Ref=0        Ref=1        Total
A=0        45 (16.0)    5 (34.0)     50
A=1        3 (32.0)     97 (68.0)    100
Total      48           102          150
Po = 0.95, Pe = 0.56, kappa(A,Ref) = 0.88

B × Ref:   Ref=0        Ref=1        Total
B=0        45 (15.0)    2 (32.0)     47
B=1        3 (33.0)     100 (70.0)   103
Total      48           102          150
Po = 0.97, Pe = 0.57, kappa(B,Ref) = 0.92

C × Ref:   Ref=0        Ref=1        Total
C=0        42 (16.3)    9 (34.7)     51
C=1        6 (31.7)     93 (67.3)    99
Total      48           102          150
Po = 0.90, Pe = 0.56, kappa(C,Ref) = 0.77

Appraiser summary:
Appraiser   Good-correct   Bad-correct   Correct   False alarms   Misses   Total
A           97             45            142       5              3        150
B           100            45            145       2              3        150
C           93             42            135       9              6        150

Appraiser   Effectiveness   Pfalse alarm   Pmiss
A           84.00%          4.90%          6.25%
B           90.00%          1.96%          6.25%
C           80.00%          8.82%          12.50%

Effectiveness (E) = no. of parts correctly identified / total opportunities to be correct
Pmiss = no. of misses / no. of opportunities for a miss
Pfa = no. of false alarms / no. of opportunities for a false alarm