
Chapter 5: Errors in Chemical Analysis

“Measurements invariably involve errors and uncertainties. It is impossible to perform a chemical analysis that is totally free of errors or uncertainties. We can only hope to minimize errors and estimate their size with acceptable accuracy.”

• Measurement uncertainties cause replicate results to vary.
  Replicates are samples of about the same size that are carried through an analysis in exactly the same way.
• Measurement uncertainties can never be completely eliminated → measurement data can give us only an estimate of the "true" value.

• There is no simple and widely applicable method for determining the reliability of data with absolute certainty.

Reliability can be assessed in several ways:

• Perform experiments designed to reveal the presence of errors.
• Standards of known composition can be analyzed and the results compared with the known composition.
• Consult the chemical literature.
• Calibrate the equipment.
• Apply statistical tests to the data.


"What maximum error can I tolerate in the result?"

“Individual results from a set of measurements are
seldom the same, so we usually consider the "best"
estimate to be the central value for the set.”

Measures of Central Tendency:

1. Mean, arithmetic mean, and average (x̄) are synonyms for the quantity obtained by dividing the sum of replicate measurements by the number of measurements in the set:

$$\bar{x} = \frac{\sum_{i=1}^{N} x_i}{N}$$

where xi represents the individual values of x making up a set of N replicate measurements.
2. Median - the middle result when replicate data are arranged in order of size
• An equal number of results are larger and smaller than the median.
• For an odd number of data points, the median can be evaluated directly.
• For an even number, the mean of the middle pair is used.
• The median is used advantageously when a set of data contains an outlier.

3. Mode - the value that occurs most frequently in a set of determinations
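As a quick illustration of these three measures of central tendency (not part of the original slides; the replicate values are invented), a minimal Python sketch might look like this, showing how an outlier pulls the mean much more than the median:

```python
import statistics

# Hypothetical replicate results (percent analyte); the last value in the
# second set is an outlier.
replicates = [19.4, 19.5, 19.6, 19.8, 20.1, 20.3]
with_outlier = replicates + [25.9]

for label, data in (("no outlier", replicates), ("with outlier", with_outlier)):
    mean = sum(data) / len(data)        # arithmetic mean, x-bar
    median = statistics.median(data)    # middle value (mean of middle pair if N is even)
    print(f"{label}: mean = {mean:.2f}, median = {median:.2f}")

# Mode: the value that occurs most frequently in a set of determinations.
print("mode =", statistics.mode([19.4, 19.5, 19.5, 19.6]))
```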
Precision - the closeness of results to others that have been obtained in exactly the same way
• describes the reproducibility of measurements
• the precision of a measurement is determined by simply repeating the measurement on replicate samples

Three terms are widely used to describe precision:
• standard deviation
• variance
• coefficient of variation
These terms are functions of the deviation from the mean, di, or just the deviation:

$$d_i = |x_i - \bar{x}|$$
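The following Python sketch (not part of the original slides; the replicate values are invented) shows how the standard deviation, variance, and coefficient of variation follow from the deviations from the mean:

```python
import statistics

replicates = [20.1, 20.3, 20.0, 20.2, 20.4]         # hypothetical replicate results

x_bar = sum(replicates) / len(replicates)            # mean, x-bar
deviations = [abs(x - x_bar) for x in replicates]    # d_i = |x_i - x_bar|

s = statistics.stdev(replicates)                     # sample standard deviation
variance = s ** 2                                    # variance
cv = 100 * s / x_bar                                 # coefficient of variation, %

print("deviations:", [f"{d:.2f}" for d in deviations])
print(f"s = {s:.3f}, s^2 = {variance:.4f}, CV = {cv:.2f}%")
```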
Accuracy - the closeness of a measurement to the true or accepted value; it is expressed by the error.

“We can never determine accuracy exactly because the true value of a measured quantity can never be known exactly. We need to use an accepted value.”

Accuracy is expressed in terms of:


• absolute error
• relative error

[Figure: Illustration of accuracy and precision]
o Absolute Error
- the difference between the measured value and the true value

$$E = x_i - x_t$$

where xt is the true or accepted value of the quantity.

The sign of the absolute error tells whether the value in question is high or low: if the result is low, the sign is negative; if the measurement is high, the sign is positive.

o Relative Error
- the absolute error divided by the true value

$$E_r = \frac{x_i - x_t}{x_t} \times 100\%$$

Relative error is also expressed in parts per thousand (ppt).
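As a worked illustration with made-up numbers (not from the slides), a short Python sketch computing the absolute and relative error might look like this:

```python
# Hypothetical example: a method reports 19.78% N for a standard whose
# accepted nitrogen content is 20.00% N.
x_i = 19.78   # measured value
x_t = 20.00   # true (accepted) value

E = x_i - x_t                          # absolute error; negative sign => result is low
E_r_pct = (x_i - x_t) / x_t * 100      # relative error, percent
E_r_ppt = (x_i - x_t) / x_t * 1000     # relative error, parts per thousand

print(f"E = {E:+.2f}, Er = {E_r_pct:+.2f}% = {E_r_ppt:+.1f} ppt")
```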
Results can be precise without being accurate and accurate without being precise.

[Figure: Absolute error in the micro-Kjeldahl determination of nitrogen]
Types of Errors in Experimental Data

• Random (or indeterminate) error
- causes data to be scattered more or less symmetrically around a mean value
- affects the measurement precision

• Systematic (or determinate) error
- causes the mean of a data set to differ from the accepted value
- affects the accuracy of results
- causes all the results to be too high or too low

• Gross error
- usually occurs only occasionally, is often large, and may cause a result to be either too high or too low
- leads to outliers

An outlier is an occasional result in replicate measurements that differs significantly from the rest of the results.
Systematic Errors
- lead to bias in measurement results

Types of systematic errors:

1. Instrumental errors
- caused by nonideal instrumental behavior, by faulty calibrations, or by use under inappropriate conditions

2. Method errors
- arise from nonideal chemical or physical behavior of the reagents and reactions on which an analysis is based

Sources of nonideality include:
• Slowness and/or incompleteness of some reactions
• Nonspecificity of most reagents
• Instability of some species
• Possible occurrence of side reactions

Errors inherent in a method are often difficult to detect and are thus the most serious of the three types of systematic error.
3. Personal errors
- result from carelessness, inattention, or personal limitations of the experimenter

A universal source of error is prejudice or bias. To preserve the integrity of the collected data, persons who make measurements must constantly guard against personal bias.
The Effect of Systematic Errors on Analytical Results

• Constant Errors
- the magnitude stays essentially the same as the size of the quantity measured is varied
- the absolute error is constant with sample size, but the relative error varies when the sample size is changed
- the effect of a constant error becomes more serious as the size of the quantity measured decreases
- one way of reducing the effect of a constant error is to increase the sample size until the error is tolerable

• Proportional Errors
- decrease or increase in proportion to the size of the sample
- caused by the presence of interfering contaminants in the sample

Detection of Systematic Instrument and Personal Errors
• Some systematic instrument errors can be found and corrected by calibration.
• Simple calibration does not compensate for errors caused by interferences.
• Most personal errors can be minimized by care and self-discipline.
Detection of Systematic Method Errors
Bias in an analytical method is particularly difficult to detect.

Analysis of standard samples
Standard reference materials (SRMs) are substances sold by the National Institute of Standards and Technology (NIST) and certified to contain specified concentrations of one or more analytes.

The concentration of one or more analytes in an SRM has been determined in one of three ways:
• by analysis with a previously validated reference method
• by analysis by two or more independent, reliable measurement methods
• by analysis by a network of cooperating laboratories
Independent Analysis
If standard samples are not available, a second independent and reliable analytical method can be used in parallel with the method being evaluated.

Blank Determinations
- useful for detecting certain types of constant errors
- reveal errors due to interfering contaminants from the reagents and vessels used in the analysis

A blank solution contains the solvent and all of the reagents in an analysis. Whenever feasible, blanks may also contain added constituents to simulate the sample matrix. The term matrix refers to the collection of all the constituents in the sample.

• All steps of the analysis are performed on the blank material.
• The results are then applied to correct the sample measurements.
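A minimal sketch of such a blank correction, with invented titration volumes rather than data from the slides, might look like this:

```python
# Hypothetical titrant volumes (mL) for three replicate samples and a blank
# carried through every step of the analysis.
sample_volumes_mL = [24.62, 24.58, 24.65]
blank_volume_mL = 0.12    # contribution from reagents and vessels, measured on the blank

# Subtract the blank from each sample result before further calculation.
corrected_mL = [v - blank_volume_mL for v in sample_volumes_mL]
print("corrected volumes (mL):", [f"{v:.2f}" for v in corrected_mL])
```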

Variation in Sample Size
Constant errors can often be detected by varying the sample size.
