# Unit 4 – Statistics, Detection Limits and Uncertainty

Experts Teaching from Practical Experience

## Unit 4 – Topics

Statistical Analysis

Detection Limits
- Decision thresholds & detection levels
- Instrument Detection Limits vs. Method Detection Limits

© Kinectrics Inc. 2012


## Unit 4 – Topics (continued)

Uncertainty
- Requirements
- Guide to Uncertainty in Measurement (GUM)
- Uncertainty budgets
- Sampling uncertainty


## Unit 4 – Learning Objectives

- Describe the responsibilities of the user of the standards with regard to Detection Limits
- Identify the guidance on analysis of samples and reporting of results
- Describe the changes in the guidance on estimating and reporting uncertainties


## Unit 4a – Statistics

## Interpretation of Survey Results

MARSSIM (Chapter 8) recommends the following sequence of steps:

1. Review the Data Quality Objectives (DQOs) and Sampling Design
2. Conduct a Preliminary Data Review
3. Select the Tests
4. Verify the Assumptions of the Tests
5. Draw Conclusions from the Data

More detail is given in EPA QA/G-9S.

## Review the DQOs and Sampling Design

- Review the objectives of the study
- Translate the objectives into statistical hypotheses
- Translate the objectives into limits on Type I & Type II errors
- Review the sampling design & note any special features or potential problems

## Conduct a Preliminary Data Review

- Review Quality Assurance reports
  - Look for problems or anomalies
- Calculate basic statistical quantities
  - Calculate percentiles, measures of central tendency, dispersion and, if the data involve two variables, the correlation coefficient
- Graph the data
  - Select graphical representations that illuminate the structure of the data

## Select the Statistical Methods

- Select the statistical method
  - Follow the "Decision Tree" from EPA QA/G-9S
- Identify assumptions underlying the test
  - List the key underlying assumptions, such as distributional form, dispersion, independence, etc.
  - Note any sensitive assumptions where relatively small deviations could jeopardize the validity of the test

## Verify the Assumptions

- Determine the approach for verifying assumptions
  - Review (or develop) a statistical model for the data
  - Select the methods for verifying the assumptions
- Perform tests of assumptions
  - Adjust for distributional assumptions (if warranted)
  - Perform the calculations required for the tests
- Determine corrective actions

## Verify the Assumptions (continued)

- Determine corrective actions (if required)
  - Determine whether a data transformation will correct the problem
  - If data are missing, explore collecting more data or using theoretical justification
  - Consider robust procedures or nonparametric hypothesis tests
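As an illustration of the nonparametric option, a one-sample sign test needs nothing beyond the standard library. The data and action level below are hypothetical, and the sign test is only one of several nonparametric procedures available:

```python
from math import comb

def sign_test_p(data, hypothesized_median):
    """One-sided sign test: P(at least k of n values above the hypothesized
    median) under H0, i.e. a binomial tail with p = 0.5. Values equal to the
    hypothesized median are dropped, per the usual convention."""
    diffs = [d - hypothesized_median for d in data if d != hypothesized_median]
    n = len(diffs)
    k = sum(1 for d in diffs if d > 0)
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Hypothetical residual-activity data tested against an action level of 5.0
data = [5.6, 6.1, 4.9, 5.8, 6.4, 5.2, 5.9, 6.0, 5.5, 4.7]
p = sign_test_p(data, 5.0)
print(p)
```

The sign test trades power for robustness: it makes no distributional assumption at all, which is exactly what is wanted when the assumptions of a parametric test cannot be verified.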

## Draw Conclusions from the Data

- Perform the statistical procedure
  - Perform & document the statistical tests
  - Identify outliers and recalculate if necessary
- Draw study conclusions
  - If the null hypothesis is rejected, draw conclusions & document
  - If the null hypothesis is not rejected, verify the limits on decision errors, then draw conclusions & document
  - Interpret the results

## Draw Conclusions from the Data (continued)

- Evaluate the performance of the sampling design
  - Evaluate the statistical power of the design over the full range of parameter values

## Unit 4b – Detection Limits

## Definitions

Both N288.4-10 & N288.5-11 caution that the terminology and definitions used for these concepts (non-detect level & detection limit) are not always consistent. It is the responsibility of the user to understand, document, and justify the detection limit reported by any laboratory engaged to perform analyses.

## Definitions

| Organization or Discipline | LC | LD |
| --- | --- | --- |
| CSA N288.4/5 | Non-detect Level (LC) | Detection Limit (LD) |
| Health Physics (ANSI/HPS N13.30-1996) | Critical Level (CL) | Lower Limit of Detection (LLD) |
| Occupational Hygiene (AIHA, IOHA) | Limit of Detection (LOD) | Limit of Quantification (LOQ) |
| Environmental Analytical Chemistry Cmte. of ACS | Limit of Detection (LOD) | Limit of Quantification (LOQ) |
| IUPAC | Detection Decisions (LC) | Minimum Detectable Value or Detection Limit (LD) |
| ISO 11929:2010 | Decision Threshold (y*) | Detection Limit (y#) |

## Definitions

- Non-detect level — the level below which quantitative results are not obtained from the measurement system or analysis method selected.
  - The non-detect level is the smallest value of the measurand for which the probability of a wrong conclusion that the measurand is present when it actually is not present (an 'error of the first kind' or 'false positive error') does not exceed a specified probability, α.

Excerpt from Clause 3.1 of CSA N288.4-10 and N288.5-11

## Definitions

- Detection limit — the level (relative to background) above which an effect can confidently be measured.
  - The detection limit is the smallest value of the measurand for which the probability of a wrong conclusion that the measurand is not present when it actually is present (an 'error of the second kind' or 'false negative error') does not exceed a specified probability, β.

Excerpt from Clause 3.1 of CSA N288.4-10 and N288.5-11

## Definitions

The Non-detect Level (LC) is set by the Type I error probability (α); the Detection Limit (LD) is set by the Type II error probability (β).

From IUPAC Recommendations 1995

## Definitions

CSA N288.4-10, Annex D, Table D.1: Formulae for the non-detect (critical) level and detection limit.
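For the common paired-blank Poisson counting case, Currie's widely used approximations (with α = β = 0.05) are LC ≈ 2.33√B and LD ≈ 2.71 + 4.65√B. The sketch below is for illustration only; the formulae actually required are those in Table D.1:

```python
from math import sqrt

def currie_limits(blank_counts, k=1.645):
    """Currie (1968) approximations for a paired blank with Poisson counting
    statistics; k = 1.645 corresponds to alpha = beta = 0.05."""
    l_c = k * sqrt(2 * blank_counts)  # decision (critical) level, counts
    l_d = k ** 2 + 2 * l_c            # detection limit, counts
    return l_c, l_d

lc, ld = currie_limits(100)  # e.g. a 100-count blank
print(round(lc, 1), round(ld, 1))
```

Note that LD is roughly twice LC plus a small constant, which is why a result just above the decision threshold is still well below the level that can be detected reliably.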

## CSA N288.5-11

- The level below which quantitative results are not obtained from the measurement system or analysis method selected is called the non-detect level.
  - This level shall be defined and its derivation should be documented.
  - Measurements below this level are often reported as being "less than" some value.

Paraphrased from Clause 8.1.10 of CSA N288.5-11

## CSA N288.4-10

- The treatment of results that are less than the non-detect level for the measurement shall be defined and documented.
- The values of the non-detect level should be documented and reported.
- The results of measurements that are below the non-detect level may be reported as "not detected" or as being below a "less than value".

Paraphrased from Clause 8.3 of CSA N288.4-10

## CSA N288.5-11 (continued)

- Quantitative numerical values should be reported rather than "less than" some value or the non-detect level.
  - The requirement for numerical values needs to be considered within the context of laboratory capabilities and the proximity of the result to any applicable benchmark value.

Paraphrased from Clause 8.1.10 of CSA N288.5-11

## Low Count Rates

The user is cautioned that these formulae might not be appropriate in all situations, particularly in low-level counting (see Strom & MacLellan, Health Physics, 81(1), 2001).

## Instrument vs. Method Detection Limits

- A note following Clause 8.1.9 of N288.5-11 warns that some laboratories might report an instrument detection level that is often much lower than the detection level of the method (which includes any required sample preparation). It is the responsibility of the user to understand, document, and justify the detection limit reported by any laboratory engaged to perform analyses.
- Both standards generally assume (but do not always explicitly state) that the detection level is the Method Detection Level.

## Unit 4b – Summary and Review

Detection Limits
- Non-detect level & detection limit
- Instrument Detection Limits vs. Method Detection Limits

Learning Objective
- Describe the responsibilities of the user of the standards with regard to Detection Limits

## Unit 4c – Uncertainty

## Uncertainty

- Uncertainty – a quantitative expression of error that results from incomplete knowledge or information about a parameter or value.
  - Statistical uncertainty – that component of uncertainty which arises from imprecision.
  - Systematic uncertainty – that component of uncertainty which arises from biases.
- Type A uncertainty – determined by repeated measurement
- Type B uncertainty – determined in any other manner
- Measurement Uncertainty (from JCGM 200:2012, the International Vocabulary of Metrology) – non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used.

Excerpt from Clause 3.1 of CSA N288.4-10

## Requirements

- N288.4-10: The uncertainty associated with each measured or calculated value should be estimated.
- N288.5-11: The uncertainty associated with the results of effluent monitoring measurements, and any dose estimates derived from them, should be discussed in the report.
- Both: The uncertainty should take into account both sampling and measurement errors. Sampling errors cannot always be quantified, but they shall be kept to a minimum by design of the monitoring program.

Excerpts from Clause 9 of CSA N288.4-10 and CSA N288.5-11

## Introduction to Uncertainty

Two introductory texts on uncertainty: (cover images not reproduced)

## Guide to Uncertainty in Measurement

- The Guide to Uncertainty in Measurement (GUM) was prepared by the Joint Committee for Guides in Metrology (JCGM)
  - Originally published in 1995 (JCGM 100:1995)
  - A minor revision was published in 2008 (JCGM 100:2008), available for download from the BIPM website at http://www.bipm.org/en/publications/guides/gum.html
- The 1995 version of the GUM was adopted by ISO & IEC as ISO/IEC Guide 98-3:2008
  - The 2008 revision has not yet been adopted by ISO/IEC

## Evaluation of Measurement Data

The JCGM Working Group on Uncertainty in Measurement is preparing additional guidance on the "Evaluation of Measurement Data":
- Guide to the expression of uncertainty in measurement (JCGM 100:2008)
- Propagation of distributions using a Monte Carlo method – Supplement 1 to the GUM (JCGM 101:2008)
- Extension to any number of output quantities – Supplement 2 to the GUM (JCGM 102:2011)
- Introduction to the "Guide to the expression of uncertainty in measurement" and related documents (JCGM 104:2009)
- Other documents & supplements are in preparation
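Supplement 1's Monte Carlo approach can be sketched in a few lines: sample each input from its assigned distribution, propagate the samples through the measurement model, and take the standard deviation of the output as its standard uncertainty. The model and the input distributions below are hypothetical:

```python
import random
import statistics

random.seed(1)

N = 100_000
results = []
for _ in range(N):
    # Hypothetical inputs: net counts (normal), efficiency (normal),
    # volume in litres (rectangular, i.e. uniform over a half-range)
    counts = random.gauss(392.0, 20.8)
    efficiency = random.gauss(0.34, 0.005)
    volume = random.uniform(0.001 - 0.00002, 0.001 + 0.00002)
    results.append(counts / (efficiency * volume * 600.0))  # Bq/l for a 600 s count

mean_a = statistics.fmean(results)
u_a = statistics.stdev(results)  # standard uncertainty from the output spread
print(round(mean_a), round(u_a))
```

Unlike the law-of-propagation approach, this method makes no linearity assumption about the model and yields the full output distribution, not just its first two moments.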

## Further Guidance

- UKAS M3003 (Ed. 2, Jan. 2007), "The Expression of Uncertainty and Confidence in Measurement"
  - Provides an introduction to the subject, with examples of the application of the guidance given in the GUM
  - http://www.ukas.com/library/TechnicalInformation/Pubs-Technical-Articles/PubsList/M3003.pdf
- The Eurachem/CITAC Guide (Ramsey & Ellison, 2007), "Measurement of uncertainty arising from sampling: A guide to methods and approaches"
  - Provides additional guidance on the estimation of uncertainty due to sampling and sample preparation
  - http://www.eurachem.org/guides/pdf/UfS_2007.pdf

## Uncertainty Budgets

- The American Society for Quality (ASQ) defines an 'uncertainty budget' as: a statement of measurement uncertainty, of the components of that measurement uncertainty, and of their calculation and combination.
- A National Research Council template for uncertainty budgets (an Excel spreadsheet) is available at: www.nrc-nrc.gc.ca/obj/inms-ienm/doc/clasclas/uncertainty_budget_template.xls

## Uncertainty Budgets

Example: calculation of activity by liquid scintillation counting

A = {(C − B) × exp(−λt) × R × S} / {V × T × ε × Pγ}

| Symbol | Meaning |
| --- | --- |
| A | Activity |
| C | Gross Counts |
| B | Background Counts |
| exp(−λt) | Decay Factor |
| R | Random Summing Correction |
| S | Self-absorption Correction |
| V | Volume |
| T | Time |
| ε | Efficiency |
| Pγ | Emission Probability |

## Uncertainty Budgets

| Component | Value | Uncertainty | Distribution | Divisor | Relative Uncertainty |
| --- | --- | --- | --- | --- | --- |
| Gross Counts | 414 | 20.3 | Normal (1σ) | 1 | 4.91% |
| Background Counts | 22 | 4.7 | Normal (1σ) | 1 | 21.32% |
| Decay | 1 | 0.75% | Normal (1σ) | 1 | 0.75% |
| R | 1 | 0.00% | Normal (1σ) | 1 | 0.00% |
| S | 1 | 1.49% | Normal (1σ) | 1 | 1.49% |
| Volume (l) | 0.001 | 0.00002 | Rectangular (half-range) | √3 | 1.15% |
| Live Time (s) | 600 | 3 | Rectangular (half-range) | √3 | 0.30% |
| Efficiency | 0.34 | 3.00% | Normal (2σ) | 2 | 1.50% |
| Emission Probability | 1 | 0.50% | Normal (2σ) | 2 | 0.25% |
| ACTIVITY (Bq/l) | 1922 | 221 | Normal (2σ) | | 11.50% |
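For a purely multiplicative model such as the activity equation, the combined relative standard uncertainty is the quadrature sum of the component relative standard uncertainties (each quoted uncertainty first divided by its divisor), and multiplying by a coverage factor k ≈ 2 gives an approximately 95% expanded uncertainty. A minimal sketch with illustrative component values:

```python
from math import sqrt

# Relative standard uncertainties, already divided by the appropriate
# divisor for each distribution; the values here are illustrative only
components = {
    "net counts": 0.0532,
    "volume": 0.0115,
    "efficiency": 0.0150,
    "emission probability": 0.0025,
}

u_combined = sqrt(sum(u ** 2 for u in components.values()))  # quadrature sum
U_expanded = 2 * u_combined  # coverage factor k = 2 (~95 % level)

activity = 1922.0  # Bq/l
print(f"{activity:.0f} Bq/l ± {U_expanded * activity:.0f} Bq/l (k = 2)")
```

Note that the gross and background counts enter through the net count (C − B), so their absolute uncertainties are combined first and then expressed relative to the net count, rather than being summed as listed.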

## Rectangular Distribution

P(x) = 1/(2a) for (b − a) < x < (b + a), and 0 otherwise

- E(x) = Mean = b
- E(x²) = b² + a²/3
- σ² = E(x²) − E(x)² = a²/3
- σ = a/√3
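The σ = a/√3 result is easy to confirm numerically by simulating draws from a rectangular distribution:

```python
import random
from math import sqrt

random.seed(0)

b, a = 10.0, 0.5  # mean b, half-range a
samples = [random.uniform(b - a, b + a) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

# The sample standard deviation should approach a/sqrt(3) ~= 0.2887
print(round(mean, 3), round(sqrt(var), 4), round(a / sqrt(3), 4))
```

This is exactly why the divisor √3 appears in the uncertainty budget: a tolerance quoted as a half-range a is converted to a standard uncertainty of a/√3.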

## Sampling Uncertainty

- The standards do not address sampling uncertainty in detail.
- Several reviews have concluded that, in general:
  - Uncertainty due to sampling is greater than uncertainty due to analysis (often by an order of magnitude or more);
  - Heterogeneity is the leading contributor to random sampling uncertainty; and
  - Non-representative sampling is the leading contributor to systematic sampling uncertainty.


## Sampling Uncertainty (continued)

- Sampling uncertainty should be included, but it may be difficult (or impossible) to quantify.
- Possible approaches include:
  - Empirical methods
    - Duplicate samples
    - Multiple protocols
    - Collaborative trial
    - Sampling proficiency test
    - Reference sampling method/target
  - Modeling
    - Cause & effect modeling
    - Gy's sampling theory
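As a sketch of the modeling route, Gy's theory gives the relative variance of the fundamental sampling error in terms of the top particle size and the sample and lot masses; the material constants below (shape factor f, granulometric factor g, mineralogical constant c, liberation factor l) are hypothetical values chosen for illustration:

```python
def gy_fundamental_error(d_cm, m_sample_g, m_lot_g, f=0.5, g=0.25, c=100.0, l=1.0):
    """Relative standard deviation of the fundamental sampling error (Gy):
    var_rel = f*g*c*l * d^3 * (1/Ms - 1/Mlot), with d the top particle
    size in cm, masses in grams, and c in g/cm^3."""
    var_rel = f * g * c * l * d_cm ** 3 * (1.0 / m_sample_g - 1.0 / m_lot_g)
    return var_rel ** 0.5

# Hypothetical case: 1 mm top size, 500 g sample drawn from a 100 kg lot
rsd = gy_fundamental_error(d_cm=0.1, m_sample_g=500.0, m_lot_g=100_000.0)
print(f"{rsd:.2%}")
```

The d³ term is the practical message: halving the top particle size (by grinding before subsampling) cuts the fundamental error variance by a factor of eight.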

## Sampling Uncertainty – Components Estimated by Each Method

| Method | Sampling precision | Sampling bias | Analytical precision | Analytical bias |
| --- | --- | --- | --- | --- |
| Duplicate Samples (single sampler using a single sampling protocol) | Yes | No | Yes | No* |
| Multiple Protocols (single sampler using multiple sampling protocols) | Yes | Between protocols | Yes | No* |
| Collaborative Trial (multiple samplers using the same sampling protocol) | Yes | Between samplers | Yes | Yes |
| Sampling Proficiency Test (multiple samplers, each using a different sampling protocol) | Yes | Between samplers & protocols | Yes | Yes |

\* May be estimated by including Certified Reference Materials.
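The duplicate-samples row can be turned into numbers with a simple variance-components calculation: analytical variance from repeat analyses of the same sample, and sampling variance by difference from the spread between duplicate samples. The data below are hypothetical, and the Eurachem/CITAC guide describes the full (robust ANOVA) treatment:

```python
import statistics

# Hypothetical duplicate design: at each target, two samples (s1, s2),
# each analysed twice -> ((a11, a12), (a21, a22))
targets = [
    ((10.2, 10.4), (11.8, 11.5)),
    (( 8.9,  9.1), ( 9.9, 10.2)),
    ((12.0, 11.7), (10.6, 10.8)),
    (( 9.5,  9.4), ( 8.7,  8.9)),
]

# Analytical variance from within-sample analytical duplicates:
# E[(a1 - a2)^2] = 2 * s_anal^2
anal_sq = [(a1 - a2) ** 2 / 2 for s1, s2 in targets for (a1, a2) in (s1, s2)]
s2_anal = statistics.fmean(anal_sq)

# Differences between duplicate-sample means within a target:
# E[(m1 - m2)^2] / 2 = s_samp^2 + s_anal^2 / 2, so subtract the analytical part
diff_sq = [((sum(s1) / 2 - sum(s2) / 2) ** 2) / 2 for s1, s2 in targets]
s2_samp = statistics.fmean(diff_sq) - s2_anal / 2

print(round(s2_anal, 4), round(max(s2_samp, 0.0), 4))
```

With these illustrative numbers the sampling variance dwarfs the analytical variance, which is the pattern the reviews cited above report for real monitoring programs.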

## Model Uncertainty

- "Model uncertainty" is an assessment of the degree of confidence that a mathematical model is a "correct" representation of the system.
- Model uncertainty includes:
  - An estimate of the uncertainty due to the structure of the model (structure uncertainty); and
  - An estimate of the uncertainty in each of the parameters used in the risk assessment equations (parameter uncertainty).

## Model Uncertainty (continued)

Estimates of the uncertainties in the N288 models are generally not available:
- Excerpt from CSA N288.1-08, Clause 4.2.9: "Conservatism is introduced into the current model by selecting conservative values for food, water, soil, and air intake rates for the representative person, typically at the 95th percentile level."

## Model Uncertainty (continued)

Estimates of the uncertainties in the N288 models are generally not available:
- IAEA TRS 472 (Handbook of Parameter Values for the Prediction of Radionuclide Transfer in Terrestrial and Freshwater Environments): "Estimations … of uncertainty about each such value were carried out, where possible, by applying statistical analysis. In some cases, because of the limited data available, the values were given without a statement of uncertainty or a range."

## Model Uncertainty (continued)

Estimates of the uncertainties in the N288 models are generally not available:
- NCRP Report 164 (Uncertainties in Internal Dose Assessment): The dose limits recommended by the International Commission on Radiological Protection (ICRP) for regulatory purposes are based on the use of values of dose per unit intake that are to be applied without any consideration of uncertainty.

## Reporting Uncertainty

- The number of significant figures quoted in a result shall not imply a degree of precision greater than that warranted by the sources of uncertainty.
  - Uncertainty should be rounded to one significant figure.
  - The least significant figure in the result should correspond to the significant figure in the uncertainty (e.g., 1900 ± 200 Bq/l).
  - More significant figures should be carried through the calculation steps than are reported in the final result.

Paraphrased from Clause 9.3 of CSA N288.4-10 and CSA N288.5-11
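The rounding rule can be applied mechanically: round the uncertainty to one significant figure, then round the result to the same decimal place. A small sketch:

```python
from math import floor, log10

def report(value, uncertainty):
    """Round the uncertainty to one significant figure and align the result."""
    place = floor(log10(abs(uncertainty)))  # e.g. 221 -> 2 (the hundreds place)
    u = round(uncertainty, -place)
    v = round(value, -place)
    if place >= 0:
        return f"{v:.0f} ± {u:.0f}"
    return f"{v:.{-place}f} ± {u:.{-place}f}"

print(report(1922.7, 221))     # a result like the slide's 1900 ± 200 Bq/l example
print(report(0.04682, 0.0023))
```

A common refinement (not shown) keeps two significant figures when the leading digit of the uncertainty is 1, to avoid over-coarse rounding.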

## Unit 4c – Summary and Review

Uncertainty
- ISO Guide to Uncertainty in Measurement (GUM)
- Uncertainty budgets
- Sampling uncertainty
- Reporting uncertainty

Learning Objectives
- Identify the guidance on analysis of samples and reporting of results
- Describe the changes in the guidance on estimating and reporting uncertainties
