
UNCERTAINTY ANALYSIS:

A BASIC OVERVIEW
presented at

CAVS

by

GLENN STEELE

www.uncertainty-analysis.com

August 31, 2011

Copyright 2011 by Coleman & Steele. Absolutely no reproduction of any portion without explicit written permission.
EXPERIMENTAL UNCERTAINTY REFERENCES

The ISO GUM:


The de facto international standard

EXPERIMENTAL UNCERTAINTY REFERENCES
http://www.oiml.org/publications/?publi=3&publi_langue=en

VALIDATION REFERENCES

“Degree of Goodness”

• When we use experimental results (such as property values) in an analytical solution, we should consider “how good” the data are and what influence that degree of goodness has on the interpretation and usefulness of the solution.

• When we compare model predictions with experimental data, as in a validation process, we should consider the degree of goodness of the model results and the degree of goodness of the data.

Typical comparison of predictions and data,
considering no uncertainties:

[Figure: result C_D plotted against set point Re for predictions and data, with no uncertainty bands shown]

Comparison of predictions and data considering only
the likely uncertainty in the experimental result:

[Figure: result C_D plotted against set point Re, with an uncertainty band shown on the experimental data]

Uncertainties set the resolution at which meaningful comparisons can be made.
Validation comparison considering all uncertainties:

S = value from the simulation
D = data value from experiment
E = comparison error

E = S - D = δ_S - δ_D

where δ_S = δ_model + δ_input + δ_num

[Figure: result C_D plotted against set point Re, showing the data value D with its uncertainty U_D, the simulation value S with its uncertainty U_SIM, and the set-point uncertainty U_Re]
“Degree of Goodness” and Uncertainty Analysis

• When an experimental approach to solving a problem is to be used, the question of “how good must the results be?” should be answered at the very beginning of the effort. This required degree of goodness can then be used as guidance in the planning and design of the experiment.

• We use the concept of uncertainty to describe the “degree of goodness” of a measurement or an experimental result.

ERRORS
&
UNCERTAINTIES

An error δ is a quantity with a sign and magnitude. (We assume any error whose sign and magnitude is known has been corrected for, so the errors that remain are of unknown sign and magnitude.)

An uncertainty u is an estimate of an interval ±u that should contain δ.

Consider making a measurement of a steady variable X (whose true value is designated as X_true) that is influenced by errors from 5 elemental error sources.

Postulate that errors β_1 and β_2 do not vary as measurements are made, and ε_3, ε_4, and ε_5 do vary during the measurement period:

[Figure: measured values of X over time, showing the errors β_1 and β_2 (β does not vary) and the error ε (varies)]

The total error (δ) is the sum of

– β (= β_1 + β_2), the systematic, or fixed, error
– ε (= ε_3 + ε_4 + ε_5), the random, or repeatability, error

δ = β + ε

The kth measurement of X then appears as shown below.

[Figure: the kth measurement X_k shown relative to X_true, offset by the total error δ_k]

The total error (δ_k) is the sum of

– β_k, the systematic, or fixed, error
– ε_k, the random, or repeatability, error

• Central Limit Theorem
• ε → statistics
• β → ???

[Figure: histogram of temperatures read from a thermometer by 24 students]

Now consider again making the measurements of X

[Figure: repeated measurements of X, with β_1 and β_2 (β does not vary) and ε (varies) indicated]

X_i = X_true + (β_1 + β_2) + (ε)_i

We can calculate the standard deviation s_X of the distribution of the N measurements of X,

s_X = \left[\frac{1}{N-1}\sum_{i=1}^{N}\left(X_i - \bar{X}\right)^2\right]^{1/2}

and that will correspond to a standard uncertainty (u) estimate of the range of the ε_i's. We will call s_X the random standard uncertainty.
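To make this concrete, here is a small simulation sketch (not from the original slides; the Gaussian error model and all numbers are assumed purely for illustration). It generates N readings of a steady variable with a fixed systematic offset β and varying random errors ε_i, and shows that s_X reflects only the random scatter while the systematic offset is invisible to the statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

X_true = 100.0              # true value of the steady variable
beta = 0.8                  # fixed systematic error (beta1 + beta2), unknown in practice
N = 50                      # number of repeated measurements

eps = rng.normal(0.0, 0.5, N)    # varying random errors (epsilon)_i
X = X_true + beta + eps          # measured values X_i

X_bar = X.mean()
s_X = X.std(ddof=1)              # random standard uncertainty of a single measurement

print(f"sample mean   = {X_bar:.3f}")
print(f"s_X           = {s_X:.3f}   (estimates only the random scatter)")
print(f"mean - X_true = {X_bar - X_true:.3f}   (offset ~ beta; invisible to s_X)")
```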

X_i = X_true + (β_1 + β_2) + (ε)_i

We will estimate systematic standard uncertainties corresponding to the elemental systematic errors β_i and use the symbol b_i to denote such an uncertainty. Thus ±b_1 will be an uncertainty interval that should contain β_1, ±b_2 will be an uncertainty interval that should contain β_2, and so on....

The systematic standard uncertainty b_i is understood to be an estimate of the standard deviation of the parent population from which the systematic error β_i is a single realization.

X_i = X_true + (β_1 + β_2) + (ε)_i

The standard uncertainty in X -- denoted u_X -- is defined such that the interval ±u_X contains the (unknown) combination (β_1 + β_2) + (ε)

and, in accordance with the GUM, is given by

u_X = \sqrt{b_1^2 + b_2^2 + s_X^2}
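A minimal numeric sketch of this combination (the b and s_X values below are made-up illustrative numbers):

```python
import math

b1, b2 = 0.20, 0.15   # systematic standard uncertainties of the two elemental sources
s_X = 0.25            # random standard uncertainty from the repeated measurements

u_X = math.sqrt(b1**2 + b2**2 + s_X**2)   # combined standard uncertainty
print(f"u_X = {u_X:.3f}")                 # about 0.354
```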

Categorizing and Estimating Uncertainties in the Measurement
of a Variable

• GUM categorization by method of evaluation:
  – Type A: “method of evaluation of uncertainty by the statistical analysis of series of observations”
  – Type B: “method of evaluation of uncertainty by means other than the statistical analysis of series of observations”

• Traditional U.S. categorization by effect on measurement:
  – Random (component of) uncertainty: estimate of the effect of the random errors on the measured value
  – Systematic (component of) uncertainty: estimate of the effect of the systematic errors on the measured value

Both are useful, and they are not inconsistent. Use of both will be illustrated in the examples in this course.

An Additional Uncertainty Categorization

• In the fields of Risk Analysis, Reliability Engineering, Systems Safety Assessment, and others, uncertainties are often categorized as

• Aleatory
– Variability
– Due to a random process

• Epistemic
– Incertitude
– Due to lack of knowledge

Uncertainty Categorization

The key is to identify the significant errors and estimate the corresponding uncertainties. Whether one divides them into categories for convenience,

Random – Systematic
Type A – Type B
Aleatory – Epistemic
Lemons – Chipmunks,

should make no difference in the overall estimate u if one proceeds properly.

OVERALL UNCERTAINTY OF A
MEASUREMENT

At the standard deviation level:

Systematic Standard Uncertainty = b_X = \sqrt{b_1^2 + b_2^2}   (for 2 elemental systematic errors)

Random Standard Uncertainty = s_X (or s_{\bar{X}})

Combined Standard Uncertainty = u_X

u_X^2 = b_X^2 + s_X^2

Overall or Expanded Uncertainty at C % confidence:

U_C% = k_C% u_X

To obtain a value of the coverage factor k, an assumption about the form of the distribution of the total errors (the δ's) in X is necessary.

For large samples, assuming the total errors in the measurements have a roughly Gaussian distribution, and using a 95% confidence level, k_95 = 2 and

U_95 = 2\left(b_X^2 + s_X^2\right)^{1/2}

The true value of the variable will then be within the limits

X - U_95 ≤ X_true ≤ X + U_95

about 95 times out of 100.
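The 95% coverage statement can be checked numerically. The sketch below is illustrative only; it assumes Gaussian parent populations for both the systematic and random errors, with standard deviations equal to b_X and s_X, and counts how often the interval X ± U_95 captures X_true.

```python
import numpy as np

rng = np.random.default_rng(2)

X_true = 100.0
b_X, s_X = 0.3, 0.5                       # assumed systematic and random standard uncertainties
U95 = 2.0 * np.sqrt(b_X**2 + s_X**2)      # expanded uncertainty with k95 = 2

n_trials = 100_000
beta = rng.normal(0.0, b_X, n_trials)     # one systematic error realization per experiment
eps = rng.normal(0.0, s_X, n_trials)      # one random error realization per measurement
X = X_true + beta + eps

coverage = np.mean(np.abs(X - X_true) <= U95)
print(f"coverage = {coverage:.3f}")       # close to 0.95
```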

RESULT DETERMINED FROM

MULTIPLE MEASURED

VARIABLES

• We usually combine several variables using a Data Reduction Equation (DRE), for example

C_D = \frac{D}{\frac{1}{2}(p/RT)\,V^2 A}

(or a quantity such as u'v'), to determine an experimental result.

• These have the general DRE form

r = r(X_1, X_2, ..., X_J)

• There are two approaches used for propagating uncertainties through the DREs:
– the Taylor Series Method (TSM)
– the Monte Carlo Method (MCM)

TAYLOR SERIES METHOD OF UNCERTAINTY PROPAGATION

For the case where the result r is a function of two variables x and y

r = f(x,y)

the combined standard uncertainty of the result, ur, is given by

u_r^2 = \left(\frac{\partial r}{\partial x}\right)^2 b_x^2 + \left(\frac{\partial r}{\partial y}\right)^2 b_y^2 + \left(\text{systematic error correlation effects}\right) + s_r^2

where s_r is calculated from multiple result determinations and the b_x and b_y systematic standard uncertainties are determined from the combination of the elemental systematic uncertainties that affect x and y as

b_x^2 = \sum_{k=1}^{M_x} b_{x_k}^2 \qquad \text{and} \qquad b_y^2 = \sum_{k=1}^{M_y} b_{y_k}^2

Monte Carlo Method
of
Uncertainty
Propagation
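The MCM details are not reproduced in these notes, so the following is only a generic sketch of the idea under assumed Gaussian input error distributions: sample each measured input from its assumed distribution, evaluate the DRE for every sample, and take the standard deviation of the resulting population of r values as the propagated standard uncertainty. The DRE and numbers match the illustrative TSM sketch above.

```python
import numpy as np

def r(x, y):
    return x / y**2                      # same illustrative DRE as in the TSM sketch

rng = np.random.default_rng(3)
M = 200_000                              # number of Monte Carlo samples

x_meas, u_x = 10.0, 0.05                 # measured value and standard uncertainty of x (assumed)
y_meas, u_y = 2.0, 0.01                  # measured value and standard uncertainty of y (assumed)

x_samples = rng.normal(x_meas, u_x, M)   # assumed Gaussian error distributions
y_samples = rng.normal(y_meas, u_y, M)

r_samples = r(x_samples, y_samples)      # run every sample through the DRE
u_r = r_samples.std(ddof=1)              # propagated standard uncertainty of the result

print(f"MCM: r = {r_samples.mean():.4f}, u_r = {u_r:.4f}")
```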

Applying General Uncertainty Analysis –
Experimental Planning Phase

GENERAL UNCERTAINTY ANALYSIS
• For a result given by a data reduction equation (DRE)

r = r(X_1, X_2, ..., X_J)

• the uncertainty is given by

U_r^2 = \left(\frac{\partial r}{\partial X_1}\right)^2 U_{X_1}^2 + \left(\frac{\partial r}{\partial X_2}\right)^2 U_{X_2}^2 + \cdots + \left(\frac{\partial r}{\partial X_J}\right)^2 U_{X_J}^2

• Example DRE:

C_D = \frac{2 F_D}{\rho V^2 A}
• Note that (assuming the large sample approximation) the U in
the propagation equation can be interpreted as the 95%
confidence U95 = 2 u or as the standard uncertainty u as long as
each term in the equation is treated consistently.

Example
It is proposed that the shear modulus, M_S, be determined for an alloy by measuring the angular deformation θ produced when a torque T is applied to a cylindrical rod of the alloy with radius R and length L. The expression relating these variables is

\theta = \frac{2 L T}{\pi R^4 M_S}

We wish to examine the sensitivity of the experimental result to the uncertainties in the variables that must be measured before we proceed with a detailed experimental design. The physical situation shown below (where torque T is given by aF) is described by the data reduction equation for the shear modulus

M_S = \frac{2 L a F}{\pi R^4 \theta}

M_S = \frac{2 L a F}{\pi R^4 \theta}

\frac{U_{M_S}^2}{M_S^2} = \left(\frac{L}{M_S}\frac{\partial M_S}{\partial L}\right)^2 \frac{U_L^2}{L^2} + \left(\frac{a}{M_S}\frac{\partial M_S}{\partial a}\right)^2 \frac{U_a^2}{a^2} + \left(\frac{F}{M_S}\frac{\partial M_S}{\partial F}\right)^2 \frac{U_F^2}{F^2} + \left(\frac{R}{M_S}\frac{\partial M_S}{\partial R}\right)^2 \frac{U_R^2}{R^2} + \left(\frac{\theta}{M_S}\frac{\partial M_S}{\partial \theta}\right)^2 \frac{U_\theta^2}{\theta^2}

\frac{U_{M_S}^2}{M_S^2} = (1)^2 \frac{U_L^2}{L^2} + (1)^2 \frac{U_a^2}{a^2} + (1)^2 \frac{U_F^2}{F^2} + (-4)^2 \frac{U_R^2}{R^2} + (-1)^2 \frac{U_\theta^2}{\theta^2}

Taking each relative uncertainty to be 1%:

\frac{U_{M_S}^2}{M_S^2} = (0.01)^2 + (0.01)^2 + (0.01)^2 + 16(0.01)^2 + (0.01)^2 = 20(0.01)^2

\frac{U_{M_S}}{M_S} = 0.045 = 4.5\%
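The uncertainty magnification factors (1, 1, 1, -4, -1) and the 4.5% result can be checked numerically. The sketch below uses arbitrary assumed nominal values (the relative sensitivities do not depend on them) and 1% relative uncertainty in each measured variable.

```python
import math

def M_S(L, a, F, R, theta):
    return 2 * L * a * F / (math.pi * R**4 * theta)

nominal = {"L": 0.5, "a": 0.1, "F": 50.0, "R": 0.005, "theta": 0.2}   # assumed values
rel_U = 0.01                              # 1% relative uncertainty in each variable

total = 0.0
for name, val in nominal.items():
    h = 1e-6 * val
    plus, minus = dict(nominal), dict(nominal)
    plus[name], minus[name] = val + h, val - h
    dMdX = (M_S(**plus) - M_S(**minus)) / (2 * h)         # finite-difference derivative
    umf = val / M_S(**nominal) * dMdX                     # relative sensitivity: 1, 1, 1, -4, -1
    total += (umf * rel_U) ** 2

print(f"U_MS / MS = {math.sqrt(total):.3f}")              # about 0.045, i.e. 4.5%
```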
ESTIMATING RANDOM

UNCERTAINTIES

Data sets for determining estimates of standard deviations and
random uncertainties should be acquired over a time period that is
large relative to the time scales of the factors that have a significant
influence on the data and that contribute to the random errors.

Direct Calculation Approach for Random
Uncertainty
For a result that is determined M times,

r_1, r_2, ..., r_M

the mean value of the result is

\bar{r} = \frac{1}{M}\sum_{k=1}^{M} r_k

and

s_r = \left[\frac{1}{M-1}\sum_{k=1}^{M}\left(r_k - \bar{r}\right)^2\right]^{1/2}

s_{\bar{r}} = \frac{1}{\sqrt{M}}\left[\frac{1}{M-1}\sum_{k=1}^{M}\left(r_k - \bar{r}\right)^2\right]^{1/2}
ESTIMATING SYSTEMATIC

UNCERTAINTIES

Propagation of systematic errors into an experimental result:

The systematic standard uncertainties for the elemental error
sources are estimated in a variety of ways that were discussed
in some detail in the course. Among the ways used to obtain
estimates are:

use of previous experience,

manufacturer’s specifications,

calibration data,

results from specially designed “side” experiments,

results from analytical models,

and others.

Recall the definition of a systematic standard uncertainty, b. It is not the most likely value of β, nor the maximum value. It is the standard deviation of the assumed parent population of possible values of β.

SYSTEMATIC STANDARD UNCERTAINTY

bA/ 3

bA/ 6

Correlated Systematic Errors

• Typically occur when different measured variables share one or more elemental error sources
  – multiple variables measured with same transducer
    • probe traversed across flow field
    • multiple pressures ported sequentially to the same transducer (scanivalve)
  – multiple transducers calibrated against same standard
    • electronically scanned pressure (ESP) systems in use in aerospace ground test facilities

• Examples
  – q = m C_p (T_o - T_i)
  – P̄ = (1/N)(P_1 + P_2 + ... + P_N)
  – u'v'

Using the TSM, there is a

2\left(\frac{\partial r}{\partial x_1}\right)\left(\frac{\partial r}{\partial x_2}\right) b_{x_1 x_2}

term in the b_r^2 equation for each pair of variables in the DRE that might share an error source:

• For q = m C_p (T_o - T_i)

b_q^2 = \ldots + \left(\frac{\partial q}{\partial T_o}\right)^2 b_{T_o}^2 + \ldots + 2\left(\frac{\partial q}{\partial T_o}\right)\left(\frac{\partial q}{\partial T_i}\right) b_{T_o T_i}

• For P̄ = (1/N)(P_1 + P_2 + ... + P_N)

b_{\bar{P}}^2 = \ldots + 2\left(\frac{\partial \bar{P}}{\partial P_1}\right)\left(\frac{\partial \bar{P}}{\partial P_2}\right) b_{P_1 P_2} + 2\left(\frac{\partial \bar{P}}{\partial P_1}\right)\left(\frac{\partial \bar{P}}{\partial P_3}\right) b_{P_1 P_3} + \ldots

• For u'v' ....
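For the temperature-difference example, the sketch below (all numbers assumed for illustration, with m read as a mass flow rate and the covariance b_ToTi estimated as the product of the shared elemental b's) shows why the correlation term matters: a calibration error common to T_o and T_i largely cancels out of b_q.

```python
import math

m, Cp = 2.0, 4186.0            # assumed mass flow rate [kg/s] and specific heat [J/(kg K)]
To, Ti = 350.0, 300.0          # assumed outlet and inlet temperatures [K]

b_cal = 0.5                    # calibration systematic standard uncertainty shared by To and Ti [K]
b_other = 0.2                  # uncorrelated elemental systematic uncertainty of each temperature [K]

dq_dTo = m * Cp                # partial derivative of q = m*Cp*(To - Ti) with respect to To
dq_dTi = -m * Cp               # partial derivative with respect to Ti

b_To2 = b_cal**2 + b_other**2  # total systematic variance of To
b_Ti2 = b_cal**2 + b_other**2  # total systematic variance of Ti
b_ToTi = b_cal**2              # covariance estimate: product of the shared elemental b's

bq2_uncorrelated = dq_dTo**2 * b_To2 + dq_dTi**2 * b_Ti2          # temperature terms only
bq2_correlated = bq2_uncorrelated + 2 * dq_dTo * dq_dTi * b_ToTi  # add the correlation term

print(f"b_q ignoring the correlation term:  {math.sqrt(bq2_uncorrelated):.0f} W")
print(f"b_q including the correlation term: {math.sqrt(bq2_correlated):.0f} W")
```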

Some Final Practical Points on Estimating Systematic
Uncertainties

• When estimating b, we are not trying to estimate the most probable value nor the maximum possible value of β.

• Always remember to view and use estimates with common sense. For example, a “% of full scale” b should not apply near zero if the instrument is nulled.

• Resources should not be wasted on obtaining good uncertainty estimates for insignificant sources – a practice we have observed too many times….

“V&V” – Verification & Validation: The Process

• Preparation
– Specification of validation variables, validation set points, etc. (This
specification determines the resource commitment that is necessary.)
– It is critical for modelers and experimentalists to work together in this phase.
The experimental and simulation results to be compared must be
conceptually identical.

• Verification
– Are the equations solved correctly? (MMS for code verification; grid convergence studies, etc., for solution verification to estimate u_num.)

• Validation
– Are the correct equations being solved? (Compare with experimental data and attempt to assess δ_model.)

• Documentation

A Validation Comparison
V&V Overview – Sources of Error Shown in Ovals

[Figure: block diagram of the validation comparison. The reality of interest (truth) is the experiment “as run”. The modeling assumptions introduce δ_model, the simulation inputs (properties, etc.) introduce δ_input, and the numerical solution of the equations introduces δ_num, yielding the simulation result S. Experimental errors δ_D yield the experimental data D. The comparison error E = S - D is assessed against the validation uncertainty u_val.]

E = (δ_model) + (δ_input + δ_num - δ_D)

Strategy of the Approach
• Isolate the modeling error, having a value or uncertainty for everything else

E = S - D = δ_model + (δ_input + δ_num - δ_D)

δ_model = E - (δ_input + δ_num - δ_D)

• If ±u_val is an interval that includes (δ_input + δ_num - δ_D), then δ_model lies within the interval

E ± u_val

Uncertainty Estimates Necessary to Obtain the
Validation Uncertainty uval

u_val = \left(u_D^2 + u_{num}^2 + u_{input}^2\right)^{1/2}

• Uncertainty in simulation result due to numerical solution of the equations, u_num (code and solution verification)

• Uncertainty in experimental result, u_D

• Uncertainty in simulation result due to uncertainties in code inputs, u_input (propagation by (A) the Taylor Series Method or (B) the Monte Carlo Method)
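A minimal numeric sketch of assembling u_val and comparing it with E (all values assumed for illustration):

```python
import math

S, D = 1.32, 1.25      # simulation result and experimental data value (assumed)
u_num = 0.010          # numerical uncertainty from solution verification (assumed)
u_D = 0.030            # experimental uncertainty (assumed)
u_input = 0.020        # simulation uncertainty from propagated input uncertainties (assumed)

E = S - D                                            # comparison error
u_val = math.sqrt(u_D**2 + u_num**2 + u_input**2)    # validation standard uncertainty

print(f"E = {E:+.3f}, u_val = {u_val:.3f}")
print(f"delta_model is estimated to lie within {E:+.3f} +/- {u_val:.3f} (standard-uncertainty level)")
```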

Methodology
Simulation Uncertainty
Modeling error for an uncalibrated model used to make calculations between validation points:

δ_model lies within the interval E ± \left(u_E^2 + u_{val}^2 + u_{sp}^2\right)^{1/2}

where

u_sp = uncertainty contribution from the uncertainty of the input parameters at the simulation calculation point

u_{sp}^2 = \sum_{i=1}^{J}\left(\frac{\partial S}{\partial X_i}\right)^2 u_{X_i}^2

and

u_E = uncertainty in E at the calculation point from the interpolation process
Uncertainty of Calibrated
Models

Methodology
Instrument Calibration Analogy
• Uncalibrated instrumentation system

u_{I_1} = \left(u_t^2 + u_m^2\right)^{1/2}

where u_t = uncertainty of the transducer
and u_m = uncertainty of the meter

• Calibrated instrumentation system

u_{I_2} = u_c

where u_c is the calibration uncertainty


Methodology
Instrument Calibration Analogy
• If a curve-fit is used to develop a relationship between the meter reading and the calibrated output value, then

u_{I_3} = \left(u_c^2 + u_{cf}^2\right)^{1/2}

where u_cf = the curve-fit uncertainty

• If the meter used in testing (m2) is different from the meter used in calibration (m1), then

u_{I_4} = \left(u_c^2 + u_{cf}^2 + u_{m_1}^2 + u_{m_2}^2\right)^{1/2}

Methodology
Instrument Calibration Analogy
• The uncertainties, u, in the previous expressions
are standard uncertainties, at the standard
deviation level. To express the uncertainty at a
given confidence level, such as 95%, the
standard uncertainty is multiplied by an
expansion factor. For most engineering
applications, the expansion factor is 2 for 95%
confidence.
U_95 = 2u

Methodology
Calibrated Model
• To calibrate a model, the simulation results are compared with a set of data and corrections are applied to the model to make it match the data. The simulation uncertainty is then

u_{s_1} = u_d

• As in the curve-fit uncertainty in the calibration of a transducer, there will be additional uncertainty in the calibrated model based on the error between the corrected simulation results and the data:

u_{s_2} = \left(u_d^2 + u_E^2\right)^{1/2}
Methodology
Calibrated Model
• u_{s_2} would apply for simulation results over the range of the input parameter values used in the calibration of the model, with the assumption that the input parameters in the simulation have the same uncertainties that they had in the calibration process.

• If the input parameter sources or transducers change for a simulation result, then

u_{s_3} = \left(u_d^2 + u_E^2 + u_{sp_1}^2 + u_{sp_2}^2\right)^{1/2}

