
Chapter 5:

Making Measurements and Reporting

Student Learning Outcomes:
At the end of this chapter, students should be able to:
• appreciate the importance of careful work in making reliable
measurements.
• understand the role of calibration and the importance of
using chemical standards, reference materials and quality
control samples.
• understand the principles involved in generating reports and
good record keeping.

QUALITY SYSTEM
Organisational structure, procedures, processes and
resources needed to implement quality management.

Quality Management:
All activities of the overall management function that
determine the quality policy, objectives and
responsibilities, and implement them by means such as
quality planning, quality control, quality assurance, and
quality improvement within the quality system.

GOOD LABORATORY PRACTICE (GLP)

GLP is a quality system concerned with the organisational process and
conditions under which studies are planned, performed, monitored,
recorded, archived and reported.

In the clinical and research area, good laboratory practice generally
refers to a system of management controls for laboratories and
research organizations to ensure the consistency and reliability of
results, as outlined by the Organisation for Economic Co-operation
and Development (OECD).

Aim of Good Laboratory Practice (GLP)

To encourage scientists to organise their work in such a way as to
ensure the production of reliable results. GLP does not cover
scientific aspects; it focuses on resources, procedures and protocols,
characterisation of test systems, documentation and archiving, and
independent quality assurance. GLP introduces the standard operating
procedure (SOP).

GLP (continued)
GLP describes how chemists and other scientists should go about
their day-to-day work (safety, tidiness, cleanliness, care,
thoughtfulness, organisation and self-discipline).

The principles of GLP are concerned with the organisational process
and conditions under which laboratory studies related to regulatory
work are carried out:
- Before starting an analysis
- During the analysis
- After the analysis
(i) Before starting an analysis
- Locate the samples.
- Ensure that an up-to-date copy of the method is available.
- Check the availability of the instruments and make sure they are working
and calibrated.
- Plan the sequence of the work.
- Make sure the glassware is clean.
- Make sure the reagents, standards and reference standards are adequate.

(ii) During the analysis
- For each sample, note details such as the sample condition.
- Check that the sample is at the correct temperature before opening the container.
- Carry out appropriate sampling procedures.
- Clean the equipment between samples to prevent cross-contamination.
- Carry out any necessary calibration.
- Follow the method exactly as it was written.
- Record all observations.
(iii) After the analysis
- Using the data gathered, calculate the required answer and look for
obvious errors.
- Check data transcriptions and calculations using someone other than the
person conducting the work.
- Retain the samples until a satisfactory report has been produced.
- Clean the apparatus used and make sure the instrument is ready for the
next use.

Some Facts….

The quality of products and services depends on reliable measurements.

Reliable measurements depend critically on:
• competent staff,
• validated and tested methods,
• comprehensive quality systems and
• traceability to appropriate measurement references.

To achieve comparability of results over space and time, it is
essential to link all the individual measurement results to some
common, stable reference or measurement standard. Results
can be compared through their relationship to that reference.

The strategy of linking results to a reference is termed "traceability".

Measurement Traceability

ISO defines traceability as the property of the result of a
measurement whereby it can be related to appropriate national or
international standards through an unbroken chain of comparisons.

National standards (e.g. NIST); international standards (e.g. CSA and EU).
CSA - Canadian Standards Association; EU - European Union;
NIST - National Institute of Standards and Technology

NIST - National Institute of Standards and Technology

NIST's mission is to promote U.S. innovation and industrial
competitiveness by advancing measurement science, standards, and
technology in ways that enhance economic security and improve our
quality of life.

Measurements and standards

NIST supplies industry, academia, government, and other users with
over 1,300 Standard Reference Materials (SRMs). These artifacts are
certified as having specific characteristics or component content, and
are used as calibration standards for measuring equipment and
procedures, quality control benchmarks for industrial processes, and
experimental control samples.
Traceability in Analytical Measurement

Meaningful comparisons between measurements are only possible if the
results are expressed in the same units (measurement scale).

Quantity        | Analyte  | Measurand                          | Unit      | Stated reference
concentration c | DDT      | c(DDT) in soil                     | ng/kg     | SI
content w       | Pb       | w(Pb) in waste water               | ng/L      | SI
count           | E. coli  | number of E. coli per unit surface | m⁻²       | SI
activity        | amylase  | A(amylase)                         | katal     | SI
pH              | H⁺ ions  | c(H⁺) in waste water               | pH number | pH scale
However, one problem that frequently occurs and must be taken into
account is the problem caused by the sample matrix.

In many analytical situations, the matrix of the sample has a major
influence on the determination of the analyte, resulting in
measurement uncertainties considerably greater than those associated
with the base SI unit.

Establishing Traceability:

 Specifying the measurand
 Choosing a suitable measurement procedure and model equation
 Demonstrating (validation) that:
  - the model equation is adequate (all significant influence
    quantities have been taken into account)
  - the measurement conditions are adequate
 Establishing traceability for each influence quantity:
  - choosing appropriate reference standards
  - calibrating using the standards
 Evaluating uncertainty

Specifying the Measurand

Identity of the analyte

Chemical measurement most commonly quantifies a particular compound or
elemental species. Extra care is needed if different forms of a
material occur and the difference is important. For example, different
isotopes, isotope mixtures, enantiomers, or crystalline forms may need
to be distinguished.

Implied measurement conditions

Most analytical results are expected to be obtained under conditions
close to normal ambient temperature, pressure and humidity. In
considering traceability, it is important to understand exactly what
conditions apply, as these form part of the formal definition of the
measurand.
Choosing a suitable method

Once the measurand is known and understood, a method of measurement is
selected, or may be developed especially for the purpose.

The choice of method involves a range of factors, including regulatory
requirements for particular methods, customer requirements, cost,
experience of different methods, availability of equipment, and
criticality of decisions. Choice of method is accordingly a matter of
judgement informed by customer needs.
Demonstrating Validation

Method validation covers:
• Accuracy
• Precision
• Limit of Detection
• Limit of Quantitation
• Specificity/selectivity
• Linearity and Range
• Ruggedness/Robustness
• System Suitability

Uncertainty estimation

The requirement for uncertainty information follows from the need to
ensure:

(i) that the references used are sufficiently accurate for the
purpose, and
(ii) that similar information is provided for the result of the
measurement.


Standards and standardization play a major role in reliability
assurance.

What is a Standard Method?

A standard method is a method that assures maximum reliability in the
characterisation of a material.

The analytical methods used as standard methods should have the
following characteristics: rapidity, reproducibility, flexibility, and
reliability.


Requirements of a Standard Method:

- The method is known to be widely applicable to the analyte, present
in a range of matrices.

- The method has been validated through inter-laboratory collaborative
studies (reproducibility assessment) and is known to give accurate
results.

- Interference effects are well documented.

- The method requires only equipment that would usually be found in an
average analytical laboratory.

Types of Standards:

i) Reference standard (i.e., primary standard): may be obtained from
official sources. A primary standard is typically a reagent which can
be weighed easily, and which is so pure that its weight is truly
representative of the number of moles of substance contained. Features
of a primary standard include high purity, stability (low reactivity),
low hygroscopicity, high equivalent weight, non-toxicity, and ready,
cheap availability.

ii) Working standard: a standard that is qualified against, and used
instead of, the reference standard; a measurement standard used to
check or calibrate measuring instruments.

Chemical Standards

Chemical standards are used for calibration. They may be used
'externally' or 'internally'.

External Standard (in chromatography):
A compound present in a standard sample of known concentration and
volume which is analysed separately from the unknown sample under
identical conditions. It is used to facilitate the qualitative
identification and/or quantitative determination of the sample
components. The volume of the external standard (standard sample)
need not be known if it is identical to that of the unknown sample.

Internal Standard:
An internal standard in analytical chemistry is a chemical substance
that is added in a constant amount to samples, blanks and calibration
standards in a chemical analysis. It is used to correct for the loss
of analyte during sample preparation. It should match the analyte of
interest as closely as possible without being identical to it.
External Standard Calibration

Definition: Known amounts of analytes are run in a separate analysis,
and the resulting peak areas are used to obtain calibration factors.
The calibration factor is used to calculate sample concentrations in
later runs.

Advantages: Useful for analyzing different sample sizes and
determining relative amounts without recalibrating.

Disadvantages: Does not account for detector sensitivity changes or
extraction or injection variances. Not recommended for manual
injections.
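As a minimal sketch, external-standard quantitation reduces to computing a calibration factor from a standard run and applying it to later sample runs (all peak areas and concentrations below are hypothetical):

```python
# External-standard quantitation: a minimal sketch with hypothetical numbers.
# Calibration factor CF = peak area / concentration from a standard run;
# a later sample run is quantified as concentration = area / CF.

std_conc = 10.0            # ug/mL, known concentration of the standard
std_area = 152_000.0       # peak area measured for the standard
cf = std_area / std_conc   # calibration factor (area per ug/mL)

sample_area = 98_800.0     # peak area measured for the unknown sample
sample_conc = sample_area / cf
print(f"sample concentration = {sample_conc:.2f} ug/mL")  # -> 6.50 ug/mL
```

Because the sample is run separately from the standard, any drift in detector sensitivity between the two runs feeds directly into the result, which is the weakness noted above.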

Internal Standard Calibration

Definition: A compound that is added, in known and constant
concentration, to the sample just before extraction or analysis. The
response factor, the ratio of its peak area to that of the target
compound at various concentrations, is used in the calculation of
sample concentrations.

Advantages: Adjusts for extraction, injection, and detector variances.

Disadvantages: May interfere with target compounds; isotopically
labelled internal standards can be used to avoid this.
[Example chromatogram: phenanthrene-d10 used as internal standard
(ISTD). Peaks: 1 naphthalene, 2 acenaphthylene, 3 acenaphthene,
4 fluorene, 5 phenanthrene-d10 (ISTD), 6 pyrene,
7 benzo[a]anthracene, 8 benzo[a]pyrene.]
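The response-factor arithmetic can be sketched as follows (all areas and concentrations are hypothetical):

```python
# Internal-standard quantitation: a minimal sketch with hypothetical numbers.
# Response factor RF = (A_analyte / A_istd) / (C_analyte / C_istd), measured
# from a calibration standard; the same ISTD amount is added to every sample.

cal_analyte_area = 120_000.0
cal_istd_area = 100_000.0
cal_analyte_conc = 5.0   # ug/mL of analyte in the calibration standard
istd_conc = 4.0          # ug/mL of internal standard, constant everywhere

rf = (cal_analyte_area / cal_istd_area) / (cal_analyte_conc / istd_conc)

# Sample run: the area ratio is corrected by RF and scaled by the ISTD amount.
samp_analyte_area = 90_000.0
samp_istd_area = 95_000.0
samp_conc = (samp_analyte_area / samp_istd_area) / rf * istd_conc
print(f"sample concentration = {samp_conc:.3f} ug/mL")
```

Because both the analyte and the ISTD suffer the same extraction and injection losses, the area ratio, and hence the result, is insensitive to them.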

NIST Reference Standards

Reference Material (RM):
A material or substance one or more of whose property values are
sufficiently homogeneous and well established to be used for the
calibration of an apparatus, the assessment of a measurement method,
or for assigning values to materials.

Certified Reference Material (CRM):
A reference material, accompanied by a certificate, one or more of
whose values are certified by a procedure which establishes
traceability to an accurate realization of the unit in which the
property values are expressed, and for which each certified value is
accompanied by an uncertainty at a stated level of confidence.

Documentation for CRMs

CRMs should be accompanied by appropriate documentation, such as a
certificate, that provides the following essential information:
• Name and address of the producer of the material
• Description/name of the material
• Physical/chemical form of the material
• Sample number/batch number/certificate number
• Date of production
• Shelf life/expiry date
• Property values
• Uncertainty of property values
• Procedures used to characterise the material, i.e. to determine its
property values
NIST Reference Standards (continued)

NIST Standard Reference Material® (SRM®):
NIST SRMs are issued with Certificates of Analysis or Certificates
that report the results of their characterizations and provide
information regarding the appropriate use(s) of the material.

NIST Traceable Reference Material™ (NTRM™):
A commercially produced reference material with a well-defined
traceability linkage to existing NIST standards for chemical
measurements. This traceability linkage is established via criteria
and protocols defined by NIST to meet the needs of the metrological
community to be served.

Main uses of CRMs:
1. Calibration and verification of measurement process
under routine conditions.
2. Internal quality control and quality assurance schemes.
3. Verification of the correct application of standardised
methods.
4. Development and validation of new methods of
measurement.
5. Calibration of other materials.

Selecting a Reference Standard

In selecting a reference material for a particular application, it is
the analyst's responsibility to assess its suitability. Factors to be
considered:

• matrix match and potential interferences
• measurand type (analytes)
• measurement range (concentration)
• measurement uncertainty
• certification procedures used by the producer
• documentation supplied with the material (e.g. certificate, report, etc.)

Role of CRMs in Method Validation:

The main use of certified reference materials is to assess the
trueness (bias) of a method.

Trueness is the closeness of agreement between the average value
obtained from a large set of test results and an accepted reference
value. The measure of trueness is normally expressed in terms of
bias.
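The bias assessment can be sketched numerically; the replicate results and certified value below are hypothetical:

```python
# Assessing trueness (bias) against a CRM: a sketch with hypothetical data.
# Bias = mean of test results - certified (accepted reference) value.
import statistics

results = [49.2, 50.1, 49.6, 49.9, 49.4, 49.8]   # mg/kg, replicate analyses
certified = 50.0                                  # mg/kg, certified value

mean = statistics.mean(results)
bias = mean - certified
rel_bias = 100 * bias / certified
print(f"mean = {mean:.2f} mg/kg, bias = {bias:+.2f} mg/kg ({rel_bias:+.1f} %)")
```

Whether a bias of this size matters is judged against the method's uncertainty and the CRM's stated uncertainty, not in isolation.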
QUALITY CONTROL (QC)

QC in analytical measurement describes the measures used to ensure the
quality of individual results. Possible measures include:
• Blank samples
• Quality control samples (QC samples)
• Repeat samples
• Blind samples
• Chemical standards
• Recovery studies
Blank samples:

Blanks (which should be below the method detection limit, MDL) are
used as a means of establishing which part of a measurement is not due
to the characteristic being measured.

They are used to detect errors due to changes in reagents, new batches
of reagents, carryover errors and drift of apparatus parameters.
Blank values taken before and after analysis allow identification of
some systematic trends.
Quality Control Samples and Control Charts:

QC samples are used as a means of studying the variation within and
between batches of a particular analysis.

A typical QC sample should be stable, homogeneous, typical in
composition of the types of sample normally examined, and available in
large quantity (to provide continuity).
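A control chart built from such QC results flags individual runs against warning (mean ± 2s) and action (mean ± 3s) limits. A minimal sketch with hypothetical QC data:

```python
# Shewhart-style control-chart limits for a QC sample: a sketch with
# hypothetical data. Warning limits at mean ± 2s, action limits at mean ± 3s.
import statistics

qc_results = [10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.0, 9.9]  # historical values
mean = statistics.mean(qc_results)
s = statistics.stdev(qc_results)

warn_lo, warn_hi = mean - 2 * s, mean + 2 * s
act_lo, act_hi = mean - 3 * s, mean + 3 * s

def check(result):
    """Classify a new QC result against the chart limits."""
    if not (act_lo <= result <= act_hi):
        return "out of control"
    if not (warn_lo <= result <= warn_hi):
        return "warning"
    return "in control"

print(check(10.0))   # well inside the limits
print(check(10.6))   # beyond mean + 3s for this data set
```

In practice the limits are fixed from a historical baseline and each new batch's QC result is plotted against them.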

Requirements for control samples:
• Must be suitable for monitoring over a long time period.
• Should be representative of the matrix and analyte concentration.
• Concentration should be in the region of analytically important
values (limits).
• Amount must be sufficient to last for a long time.
• Must be stable, with no losses due to the container.
• No changes due to taking sub-samples.
Standard solutions/Calibration:

 Used to verify the calibration (the correlation coefficient should
be > 0.9998, or as specified by the method).
 Control samples must be completely independent of the calibration
solutions.
 No influence of sample matrix.
 Limited control of precision.
 Very limited control of trueness.
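As an illustration (with made-up calibration data), the correlation coefficient of a calibration line can be checked directly before the calibration is used:

```python
# Checking the correlation coefficient of a calibration line: a sketch
# with hypothetical standard concentrations and instrument responses.
import math

conc = [0.0, 2.0, 4.0, 6.0, 8.0]         # standard concentrations
signal = [0.01, 0.41, 0.80, 1.22, 1.60]  # measured responses

n = len(conc)
mx, my = sum(conc) / n, sum(signal) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
sxx = sum((x - mx) ** 2 for x in conc)
syy = sum((y - my) ** 2 for y in signal)
r = sxy / math.sqrt(sxx * syy)
print(f"r = {r:.5f}")

# Apply the acceptance criterion before using the calibration.
assert r > 0.9998, "calibration line fails the correlation criterion"
```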

Repeat Samples:

Used to measure the precision of the analytical system. Precision
should not fall outside the targeted precision of the method.

Within an analytical process, repeat samples (e.g. triplicates) are
needed in order to study the variation between sets of results and to
ensure that the results obtained are within reasonably acceptable
limits.
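Precision from repeat samples is typically summarised as a relative standard deviation (RSD); a sketch with hypothetical triplicate results:

```python
# Precision check on triplicate results: a sketch with hypothetical data.
# RSD (%) = 100 * standard deviation / mean, compared to the method's target.
import statistics

triplicate = [25.3, 25.7, 25.1]   # three repeat determinations
mean = statistics.mean(triplicate)
rsd = 100 * statistics.stdev(triplicate) / mean
print(f"mean = {mean:.2f}, RSD = {rsd:.2f} %")

target_rsd = 2.0   # hypothetical target precision for the method
assert rsd <= target_rsd, "precision outside the method's target"
```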
Blind Samples:

A subsample submitted for analysis with a composition and identity
known to the submitter but unknown to the analyst, used to test the
analyst's or laboratory's proficiency in the execution of the
measurement process.

Blind samples are a type of repeat sample inserted into the analytical
batch without the knowledge of the analyst. They may be sent by the
customer as a check on the laboratory or on a particular system, to
determine whether the variation levels are acceptable.
Standards and Spikes (Recovery study)

In this procedure, the concentration of the analyte in the sample has
already been determined or is known to be zero.

The 'spike' (a small quantity of standard) is added in order to check
analyte recovery (recoveries between 98-102% would normally be
expected). The accuracy of spikes should fall within the accuracy
limits established for the method.
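The recovery calculation itself can be sketched as follows (all concentrations hypothetical):

```python
# Spike-recovery calculation: a sketch with hypothetical concentrations.
# % recovery = 100 * (spiked result - unspiked result) / amount spiked.

unspiked = 4.0      # ug/L of analyte found in the original sample
spike_added = 10.0  # ug/L of standard added
spiked = 13.9       # ug/L found in the spiked sample

recovery = 100 * (spiked - unspiked) / spike_added
print(f"recovery = {recovery:.1f} %")   # recoveries of 98-102 % are typical
assert 98 <= recovery <= 102, "recovery outside the expected range"
```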
Records management

A record is a document or other electronic or physical entity in an
organization that serves as evidence of an activity or transaction
performed by the organization and that requires retention for some
time period.

ISO defines records as "information created, received, and maintained
as evidence and information by an organization or person in the
transaction of activities".

The key word in these definitions is evidence. Put simply, a record
can be defined as "evidence of an event".

Records management

Records management is the process by which an organization:

• Determines what kinds of information should be considered as records.
• Determines how active documents/records should be handled while they
are being used, and determines how they should be collected after they
are declared to be records.
• Determines how long each record type should be retained to meet
legal, business, or regulatory requirements.
• Researches and implements technological solutions and business
processes to ensure that the organization complies with its records
management obligations in a cost-effective way.
• Performs records-related tasks such as disposing of expired records
or locating and protecting records that are related to external events
such as lawsuits.

Practicing records management
A Records Manager is someone who is responsible for records management
in an organisation. The practice of records management may involve:

• Planning the information needs of an organization.

• Identifying information requiring capture.

• Creating, approving, and enforcing policies and practices regarding records,


including their organization and disposal.

• Developing a records storage plan, which includes the short and long-term
housing of physical records and digital information.

• Identifying, classifying, and storing records.

• Coordinating access to records internally and outside of the organization,


balancing the requirements of business confidentiality, data privacy, and
public access.
