COURSE: SIX SIGMA PRINCIPLES

WEEK 1
1. MEASUREMENT SYSTEM ANALYSIS
I. MEASUREMENT SYSTEM ANALYSIS PART 1
II. MEASUREMENT SYSTEM ANALYSIS PART 2
Practice Quiz: Measurement System Analysis Practice - 10 questions
Quiz: Measurement System Analysis Practice Graded Quiz - 8 questions

I. MEASUREMENT SYSTEM ANALYSIS PART 1

Measurement system analysis involves validating the measurement system. Lord Kelvin said
that the grandest discoveries of science have been but the rewards of accurate measurement
and patient, long-continued labor in the sifting of numerical results. At the highest level of the
measurement system there are a handful of key questions that we must address. These
questions seem rather intuitive, even obvious, but we find the answers can be harder to pin
down than we may suspect once we dig into the details of the process we are assessing.
Ultimately our aim is process effectiveness, or freedom from deficiencies. Are we capturing
the correct data? Does the data reflect what is happening? Can we detect changes in the
process? Can we identify the sources of measurement error? We are also concerned with
system effectiveness, which refers to the validity and relevance of the measure to customer
needs; these are measures external to the process that help us assess the outcome. Is the
system stable over time? Is it capable of generating and measuring data to support your
decisions? Can the measurement system be improved? Finally, consider the possible sources
of variation within the measurement system: the operator, or the person actually conducting
the measurements; the gage, or the equipment used to measure; and other environmental
sources.

The accuracy of an instrument is sustained through calibration. An instrument's performance
can often drift over time due to temperature changes or barometric pressure changes. It is
necessary to compare the measuring device to a known standard through an operational
definition. How often calibration is needed depends on how much drift can be tolerated over
time. Calibration seeks to understand the mean, or average, output of the instrument, not the
variation.

Calibration helps us uncover measurement system bias. A shift in the mean is detected and
adjustments are made to re-center the instrument. The underlying relationship is: observed
value = true value + measurement error. From this equation, you can see that by minimizing
the measurement error, the observed value becomes closer to the true value. Accuracy is
simply how close the agreement is between the mean output of the measuring device and a
reference or standard value. Sometimes you will see accuracy denoted as a range or a
tolerance, such as plus or minus one-thousandth of an inch.
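As a rough sketch of the idea in Python (the course itself supplies no code; the readings and the reference value below are hypothetical), bias is simply the observed mean minus the known standard:

```python
# Gage bias: observed mean of repeated readings minus a known standard.
# Hypothetical readings of a reference part certified at 1.000 inch.
readings = [1.002, 1.001, 1.003, 1.002, 1.001]
reference = 1.000  # certified standard value (e.g., NIST-traceable)

observed_mean = sum(readings) / len(readings)
bias = observed_mean - reference

print(f"Observed mean: {observed_mean:.4f} in")
print(f"Bias: {bias:+.4f} in")  # a positive bias means the gage reads high
```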

Accuracy seeks to understand the mean of the instrument, not the variation.
Precision is different from accuracy: precision is the closeness of repeated readings to each
other, not to the standard. Precision helps process owners understand random error, which is
part of the measurement system variation. Precision is used to calculate the P/T
(precision-to-tolerance) ratio.
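To make the distinction concrete, here is a minimal Python sketch (hypothetical data, not from the course): precision looks only at the spread of repeated readings and ignores the standard entirely:

```python
import statistics

# Precision: closeness of repeated readings to each other, not to a standard.
# The same hypothetical readings used above.
readings = [1.002, 1.001, 1.003, 1.002, 1.001]

precision_sd = statistics.stdev(readings)  # sample standard deviation
print(f"Repeatability spread (sd): {precision_sd:.5f} in")
```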

Measurement system variation is a prime contributor to total variation, which makes
detecting the actual process variation very difficult. You can see from this example that the
blue curve represents the observed variation in the collected data. The red curve represents
the measurement system variation from repeated readings with the measuring devices. The
green curve represents the actual process variation. The purpose of a precision study is to
approach the green curve as closely as possible, to identify the true variation around the
mean. Here is a quick illustration of the difference between precision and accuracy. Using the
target analogy, you can see that in the first target the arrows were precise but not accurate,
missing the bull's-eye.

The graphical view to the right indicates a skew to the right from the true value of the
bull's-eye. In the second target, you see more arrows around the bull's-eye, but they are not
very precise. The graphical view to the right shows the mean aligned with the true mean;
however, the variation is spread out, indicated by the flattened curve.
In the third target, the arrows are both precise and accurate, hitting the bull's-eye. The
graphical view shows the observed mean aligned with the true mean. It also denotes minimal
variability, thus a higher peak with less deviation or spread. Stability is the drift in the
absolute value of a gage over time.

When we examine the difference in the averages of at least two sets of measurements
obtained with the same gage on the same product at different times, we refer to this as gage
stability. The purpose of a calibration system is to eliminate stability errors by identifying,
adjusting, and reducing these differences. Statistical process control techniques are used to
control the stability of the process. Note that proper calibration can help eliminate stability
error. Gage bias is defined as the difference between the observed mean of the
measurements and a known standard.
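A minimal sketch of the stability check, assuming hypothetical readings taken with the same gage on the same part at two different times:

```python
# Gage stability: difference between the averages of two sets of
# measurements taken with the same gage on the same part at different times.
first_set = [1.002, 1.001, 1.003, 1.002]   # hypothetical, taken today
second_set = [1.005, 1.004, 1.006, 1.005]  # hypothetical, taken a month later

def mean(values):
    return sum(values) / len(values)

drift = mean(second_set) - mean(first_set)
print(f"Drift between sets: {drift:+.4f} in")  # nonzero drift suggests recalibrating
```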

NIST, the National Institute of Standards and Technology, maintains these standards. Bias is
often used interchangeably with accuracy. Bias has a direction; for example, the bias is plus
25 thousandths of an inch.
Periodic calibration against the standard can correct bias. Gage linearity measures how
accurate the instrument, device, or gage is across the span, or measuring range, of the
equipment. Periodic calibration can also eliminate linearity issues over a wide range of
measurements. In essence, you can see from the illustration that an instrument can produce
two different biases over its operating range, or linearity inconsistencies.
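As a hedged illustration of what a linearity study computes (the reference values and biases below are invented for the example), we can fit a least-squares line to the bias observed at several points across the gage's range; a slope near zero means the bias is consistent over the span:

```python
# Gage linearity: does bias stay constant across the measuring range?
# Hypothetical reference values and the average bias observed at each.
references = [1.0, 2.0, 3.0, 4.0, 5.0]
biases = [0.001, 0.002, 0.004, 0.005, 0.007]

# Least-squares slope of bias vs. reference value.
n = len(references)
mean_x = sum(references) / n
mean_y = sum(biases) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(references, biases))
         / sum((x - mean_x) ** 2 for x in references))

print(f"Linearity slope: {slope:.5f}")  # near zero => consistent bias over the span
```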

II. MEASUREMENT SYSTEM ANALYSIS PART 2


Beyond accuracy, we must focus our attention on the variation inherent to the measurement
system. Variation is inherent to the system, and we need to understand the portion of that
variation due to the measurement system itself. This opens the door to gage repeatability and
reproducibility. Gage repeatability is the variation in measurements obtained when one
operator uses the same gage to measure identical characteristics of the same products. The
variance affected by the trial-to-trial measurements is the repeatability. Gage reproducibility
is the variation in the averages of measurements made by different operators using the same
gage when measuring identical characteristics of the same outputs.
The variance affected by the operator-to-operator measurements is the reproducibility. The
total (100 percent) variability found in the system is equal to the actual part-to-part process
variation, which is what we wish to see, plus the measurement system variation, which
prevents us from seeing a clear picture of what is going on. The measurement variability
branch includes both the gage variability and the operator variability.
The repeatability variation occurs within the gage variability, and the reproducibility variation
occurs within the operator variability. Reproducibility can be further broken down into
operator-to-operator and operator-by-part variation. Variation due to the operator is called
reproducibility. Variation due to the gage system is called repeatability. This is where gage
R&R comes from.
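Here is a deliberately simplified Python sketch of that decomposition (not the full AIAG average-and-range or ANOVA method a tool like Minitab would use; all numbers hypothetical): repeatability is the pooled trial-to-trial variance within each operator, and reproducibility is the variance of the operator averages:

```python
import statistics

# Simplified gage R&R sketch: two operators each measure the same part
# three times with the same gage. (Hypothetical data.)
trials = {
    "operator_A": [1.002, 1.001, 1.003],
    "operator_B": [1.005, 1.004, 1.006],
}

# Repeatability: pooled trial-to-trial (within-operator) variance.
repeatability_var = statistics.mean(
    [statistics.variance(t) for t in trials.values()]
)

# Reproducibility: variance of the operator averages.
operator_means = [statistics.mean(t) for t in trials.values()]
reproducibility_var = statistics.variance(operator_means)

grr_var = repeatability_var + reproducibility_var
print(f"Repeatability variance:   {repeatability_var:.8f}")
print(f"Reproducibility variance: {reproducibility_var:.8f}")
print(f"Gage R&R variance:        {grr_var:.8f}")
```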
Measurement errors sum together. Suppose a set of measurements is taken: we take several
measurements of the same product with the same gage, and this forms a distribution. We
repeat this for several products to produce multiple distributions. If we plot the sum of these
distributions, we see the total observed variation distribution.
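One point worth making explicit, sketched below with hypothetical numbers: it is the variances, not the standard deviations, that add, so the observed spread is always wider than the true part-to-part spread:

```python
# Variances add: total observed variation is part-to-part variation
# plus measurement system variation. (Hypothetical standard deviations.)
sd_parts = 0.010  # actual part-to-part variation
sd_meas = 0.004   # measurement system variation

sd_observed = (sd_parts**2 + sd_meas**2) ** 0.5
print(f"Observed sd: {sd_observed:.4f}")  # wider than the true 0.010
```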
The element we want to uncover is the actual part-to-part variation. Measurement variation is
often expressed as a ratio of precision to tolerance, or P/T. The total variation can be replaced
by one-sixth of the tolerance, which gives the percentage gage R&R formula:
%GRR = 100 x GRR / (Tolerance / 6). Six is used to represent 99.73 percent of the total
variation, or six standard deviations. This is where the name six sigma comes from.
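Plugging hypothetical numbers into that formula (a minimal sketch, not course-supplied code):

```python
# Percentage gage R&R as a precision-to-tolerance (P/T) ratio.
sigma_grr = 0.002   # gage R&R standard deviation (hypothetical)
tolerance = 0.060   # upper spec limit minus lower spec limit (hypothetical)

pct_grr = 100 * sigma_grr / (tolerance / 6)
print(f"%GRR (P/T): {pct_grr:.1f}%")  # 20.0% here; lower is better
```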
The illustration shows the standard normal curve, indicating that the total area below the
curve equals one, and that plus or minus three sigma (standard deviations) from the mean,
six sigma in total, accounts for capturing 99.73 percent of the total variation.
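You can verify the 99.73 percent figure directly from the standard normal distribution using the error function:

```python
import math

# Fraction of a standard normal distribution within +/- k standard deviations.
def within_k_sigma(k: float) -> float:
    return math.erf(k / math.sqrt(2))

print(f"Within +/-3 sigma: {within_k_sigma(3.0) * 100:.2f}%")  # ~99.73%
```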
Let's recap. The total variation is the sum of the part-to-part variability, or the actual
variability, and the measurement variability. Gage R&R determines the amount of variation in
the observed measurements due to the operators and the equipment. Calibration ensures the
equipment's mean readings are matched to known standards. What remains is our actual
process variation. Identifying bias, stability, and linearity all helps to improve measurements.
Various statistical software packages, such as Minitab or Microsoft Excel, can perform the
needed distribution analysis. Statistical process control, sometimes referred to as statistical
quality control, uses the X-bar and R chart method to identify variability.
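As a closing sketch (hypothetical subgroups; A2, D3, and D4 are the standard control chart factors for subgroups of five), the X-bar and R chart limits are computed like this:

```python
import statistics

# X-bar and R chart control limits for subgroups of size 5.
subgroups = [
    [1.002, 1.001, 1.003, 1.002, 1.001],  # hypothetical measurements
    [1.003, 1.002, 1.004, 1.001, 1.002],
    [1.001, 1.000, 1.002, 1.003, 1.001],
]
A2, D3, D4 = 0.577, 0.0, 2.114  # standard factors for subgroup size n = 5

xbars = [statistics.mean(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
xbarbar = statistics.mean(xbars)  # grand mean (X-bar chart center line)
rbar = statistics.mean(ranges)    # average range (R chart center line)

print(f"X-bar chart: CL={xbarbar:.4f} UCL={xbarbar + A2 * rbar:.4f} "
      f"LCL={xbarbar - A2 * rbar:.4f}")
print(f"R chart:     CL={rbar:.4f} UCL={D4 * rbar:.4f} LCL={D3 * rbar:.4f}")
```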
