
Laboratory Solutions

Measuring Techniques
Tips and Tricks
Service and Maintenance
Software Solutions

Basic Laboratory Skills


To Ensure Measurement Quality
Table of Contents

1. Introduction 4
1.1. Considerations toward Data Quality 4
1.2. Implementation 7
2. Gravimetric Procedures: Weighing and Moisture Analysis 9
2.1. An Overview of Balances 9
2.2. Finding the Right Balance 11
2.3. Balance Installation: Preventing Environmental Influences on Weighing 13
2.4. Effective Balance Use and Weighing Operations 13
2.5. Moisture Content Determination 18
2.6. Conclusions 22
2.7. Resources and Further Reading 22
3. Volumetric Procedures: Liquid Handling Operations 23
3.1. An Overview of Pipette Types 23
3.2. Air-Displacement and Positive-Displacement Technology 24
3.3. Pipetting Strategies to Improve Data Quality 25
3.4. Pipette Tip Selection 26
3.5. Pipette Cleaning 27
3.6. Pipette Testing and Maintenance 27
3.7. Resources and Further Reading 28
4. Analytical Chemistry: pH and Conductivity Value Determination 29
4.1. The Principle and Importance of Measuring pH 29
4.2. Applications of pH 30
4.3. The Principle and Importance of Measuring Conductivity 30
4.4. Instrumentation to Measure pH and Conductivity 31
4.5. pH and Conductivity Sensors 34
4.6. Sensor Calibration and Maintenance 37
4.7. Service 38
4.8. Resources and Further Reading 38
5. Analytical Chemistry: Titration Procedures 39
5.1. Titrators and Sensors 40
5.2. Automated Titrators: Measuring Principle 42
5.3. Tips and Hints for Titrations 43
5.4. Routine Testing 44
5.5. Service Options 45
5.6. Resources and Further Reading 45
6. Analytical Chemistry: Ultraviolet-Visible Spectroscopy 46
6.1. Measuring Principle 46
6.2. Instruments 48
6.3. Tips and Hints for UV/Vis Spectroscopy 51
6.4. Performance Verification – General Aspects 53
6.5. Measurement Uncertainty Considerations 54
6.6. Spectrophotometer Care and Service 55
6.7. Resources and Further Reading 55
7. Analytical Chemistry: Density 56
7.1. Measuring Principle 56
7.2. Density Applications 59
7.3. Tips and Hints for Density Measurement 59
7.4. Performance Verification 62
7.5. Data Management 63
7.6. Maintenance and Service 63
7.7. Resources and Further Reading 64
8. Analytical Chemistry: Refractive Index 65
8.1. Measuring Principle 65
8.2. Refractometry Applications 68
8.3. Tips and Hints for Refractive Index Measurement 69
8.4. Performance Verification 71
8.5. Data Management 71
8.6. Maintenance and Service 72
8.7. Resources and Further Reading 72
9. Thermal Values: Measurements of Melting, Softening, Dropping, Boiling, Cloud,
and Slip Melting Points 73
9.1. Melting Point 73
9.2. Dropping Point and Softening Point 74
9.3. Boiling Point 76
9.4. Cloud Point 77
9.5. Slip Melting Point 78
9.6. Data Management 79
9.7. Routine Testing and Service 79
9.8. Resources and Further Reading 80
10. Thermal Analysis 81
10.1. A Brief Overview of Important Thermal Analysis Techniques 81
10.2. Method Development – Selection and Development of Procedures 84
10.3. Samples 87
10.4. Instrumentation and Software 88
10.5. Testing and Service 89
10.6. Resources and Further Reading 89
11. Conclusion 90
12. Additional Resources 91
1. Introduction

Scientific inquiry aims to increase our understanding of what we observe, often for the purpose of exploiting, or
modifying, a process or substance. It is furthered according to the scientific method, a set of procedures that can
be applied to gain qualitative or quantitative insights into questions of interest. While formalized in recent centu-
ries, the basic precepts have been used for millennia to explain natural phenomena, yielding profound advances
even in pre-modern times.

In the past few decades, highly sensitive instrumentation, automation technologies, and innovations in comput-
ing have enabled detailed and precise investigation of ever more complex processes. The result has been rapid
progress in both life and physical sciences, with an especially spectacular recent example being the develop-
ment of mRNA-based vaccines against the novel coronavirus (SARS-CoV-2). Drawing on previous findings and
ongoing work, the candidate vaccines moved from the bench to clinical settings in an unprecedentedly short
timespan, driven by the availability of high quality, reproducible datasets.

Precise and accurate data is the driving factor in all scientific breakthroughs, yet the assessment of how to
achieve it in any instance often begins long before an analysis is actually started, and requires the synthesis of
multiple strands. First, the successful outcome of any measurement depends on access to appropriate instru-
ments that are calibrated and in good repair, and on which personnel are trained. Second, proven approaches
for sample collection and preparation must exist, or must be developed. Additionally, a system for faithful and
complete record-keeping must be in place. Together, these components will determine whether trends can be
clearly resolved and specific effects securely pinpointed as experimental data is collected and evaluated.

Drawing on METTLER TOLEDO’s history as a manufacturer of precision instrumentation, we offer this guide as
a supplement to laboratories’ efforts to secure the quality of their results. In it, we describe strategies to reinforce the
precision and accuracy of data generated in research as well as production environments (e.g. quality control).
We present the operating principles of a variety of frequently employed instruments and suggest best practices
for their use; we touch on routine testing solutions and highlight resources to help in identifying the most appro-
priate testing and service schedules; and we indicate software and automation options to reinforce labs’ produc-
tivity, consistency, and compliance, as applicable.

As no single document can be exhaustive, we also link to materials that provide a more comprehensive portrait
of the methodologies we summarize. Readers should gain an overview of each area, with a foundation for
deeper study.

In the sections below we introduce quality systems and their possible benefits to laboratory operations, and
briefly summarize the concepts and instrumentation elaborated in the later chapters of the guide.

1.1. Considerations toward Data Quality

When laboratories’ experimental or analytical procedures are robust and well executed, with carefully prepared
samples and on properly chosen, maintained, and calibrated equipment, the quality of the resulting data should
be high. In this scenario, results will be meaningful and interpretable, and assays will not need to be repeated
before decisions can be taken or further actions planned. The implementation of a quality system can set a
framework in place to smooth lab operations, helping to improve efficiency and reduce resource use.

1.1.1. Quality Systems

Quality systems – mechanistic schemata to carry out processes smoothly, repeatably, and cost-effectively, to
a consistently high standard – should consider instruments, protocols, and software (as well as the method
according to which datasets are stored, if different from software). While their specific details may differ, and will
reflect the environment in which they are executed, they will often align with the guidance laid out in regulatory
blueprints such as Good Laboratory Practice or Good Manufacturing Practice (GLP, GMP), incorporating elements
from more narrowly focused industry standards. Adhering to these stipulations will reinforce the quality and safety
of operations, the accuracy and reproducibility of data, and the uniformity and integrity of any products that result.
• GLP applies to research laboratories involved in the development of products that affect humans, animals, or
the environment. It encompasses study plans, equipment, standard operating procedures (SOPs), data, and
data storage, among other factors. The Organization for Economic Cooperation and Development (OECD)
created an international standard for GLP in the early 1990s, which has since been codified into, e.g., US and
EU legal directives.
• GMP applies to production rather than research, governing both quality and batch-to-batch consistency. Its
intent is to prevent harm to the end user or consumer arising from ingestion or use of, or interaction
with a product or substance, and it includes, e.g., the maintenance of clean and hygienic facilities, consistent
application and documentation of processes, training of personnel, storage of data, and traceability of records.
- Conformity with GMP is usually verified by inspections that are carried out by personnel from regulatory
agencies. The inspectors will assess whether the resulting goods are likely to be safe for use based on the
conditions under which they were produced, as well as the processes (quality system) in place for their
production. Good Distribution Practice (GDP) may govern the goods’ storage and transport.
- For goods that will be exported, inspectors from the country of origin or the destination country may conduct
evaluations, depending on the agreements in place.

1.1.2. Basic Safety, Training, Methods, and Software

Though not encompassed by the traditional interpretation of a quality system, the safety of personnel is as
important to daily work in the lab or at line as the safety of processes. In the following section, we group them
together to reflect their influence on the precision and accuracy of the data gathered in either context.

To support effective operations, efforts should be made to ensure that appropriate personal protective equipment
(PPE) is freely available and consistently used in the workplace, that equipment can be operated ergonomically
and breaks taken to prevent fatigue, and that containment areas (fumehoods, biosafety cabinets, “hot rooms”
for radioactive substances) may be utilized for workflows with hazardous substances. Waste should also be
disposed of compliantly, spills cleaned immediately and safely, and automated solutions adopted as needed to
reduce risks. Vigilance with respect to these points will go far in preventing unplanned downtime by protecting
employees as well as instrumentation and spaces.

Once general safety protocols are in place, laboratories should establish guidelines for training on, and use of,
their equipment. Most laboratory workflows involve several instruments or devices. Indeed, in a typical day,
personnel will often carry out a handful of procedures, some interdependent, that vary in their complexity, and
will interact with both basic and more sophisticated equipment. An understanding of the best ways to utilize
and maintain even simple instrumentation is consequently vital to accurate and precise results.

Training is, however, not just fundamental to the quality of workflows and the resulting data; it can also sensitize
users to potential sources of error that might negatively impact either equipment functioning or an analysis in
progress. Additionally, training informs method development, which is a critical element of laboratory work. Once
defined, methods should be applied consistently and should produce results of sufficient accuracy and reproduc-
ibility for the lab’s needs. A complete method will include sample collection and preparation steps to augment the
overall repeatability of operations.

It should be noted that laboratory methods may need to be altered to accommodate changing circumstances:
for example, the number of samples handled each day. In this case, new methods should be verified to ensure
the accuracy and reproducibility of the resulting data; once validated, they should be implemented systematically
while their usefulness endures.

Although manual processes are adequate in most situations, the use of an automated instrument such as
an autotitrator, halogen moisture analyzer (moisture balance), or electronic pipette – particularly one that
accepts SOPs – may reinforce workflow security by ensuring that methods are always followed to the letter.
Software-driven automated systems afford end-to-end control in regulated environments, or where sample
throughput demands a workaround to a previously manual procedure.

But software also plays a role in the quality and safety of lab operations more broadly. In the most basic case,
simple data-logging software, usually compatible with a spreadsheet program, allows export of data to a PC for
analysis or inclusion in an electronic lab notebook (eLN). Applications of this kind (such as METTLER TOLEDO’s
EasyDirect™) save time that would otherwise be allocated to manual transcription or calculations. They also
reduce the risk of jotting or keystroke errors, but are not sufficient to comply with regulations.

A further, more powerful tier of laboratory software includes METTLER TOLEDO’s LabX® and STARe. These platforms
link instruments into a network and enable the development of SOPs, user and asset management, workflow auto-
mation, and sound data management practices. For quality control purposes they can indicate whether results
are within tolerances. Especially important is their capacity to ensure that datasets are captured together with
metadata, which affords laboratories interoperability of data collected at different times or by separate individuals.

Laboratory software of the latter kind typically stores datasets in a secure database, and can integrate with LIMS
for consolidation with results from unconnected instruments. The software makes datasets attributable to support
data integrity, allowing errors to be traced back to the source upon detection, and may offer sophisticated analyti-
cal features.

1.1.3. Equipment Maintenance

Instrument maintenance, which involves not only calibration and adjustment by a service professional but also
routine activities performed by users, is a core aspect of data quality, and should be considered during the implementation of a quality
system.

Preventive maintenance by a professional technician is the key to averting unplanned downtime, and should be
undertaken at regular intervals. Including cleaning, calibration, and adjustment, along with documentation of
“as found” or “as left” status, maintenance should be scheduled to meet a lab’s needs and / or auditing require-
ments. In addition to supporting data consistency over time, it will prolong equipment’s “usable life”, and will
typically also reduce the risk of catastrophic failures that might be costly or disruptive to operations.

Between service visits, instrument performance should be verified periodically. Employing a standard, buffer,
test weight, or certified reference material (CRM), performance verification is intended to determine whether
the instrument under evaluation is functioning within tolerances. It will be carried out with varying frequency
depending on the process – daily, or before each use, for a pH sensor or density meter, but less often for other
equipment (balances, pipettes). However, verification should be a normal element of laboratory operations, as
this is the most effective way to detect drift or worrisome trends and to keep the quality of results uniformly high.

It should be noted that asset management features in laboratory software (e.g. LabX and STARe) can block
instruments from use when calibration is due.

Regular cleaning or decontamination complements performance verification in ensuring that instruments are kept
in good trim, serving the joint purpose of protecting the integrity of samples. Rudimentary cleaning with a brush,
distilled water or dilute solvents, or damp lab tissue will often be carried out prior to the start of a procedure, as
well as between replicates, to prevent contamination or cross-contamination from arising. Yet repeated use of lab
equipment, especially with challenging samples, will naturally necessitate more thorough cleaning over time, and
laboratory personnel should be familiar with standard procedures that can safely restore performance.

As care must be taken to use solvents that can dissolve dirt and residue while not damaging the equipment itself,
it is imperative to follow the manufacturer’s instructions (often found in the manual), or to design a cleaning
workflow that accounts for the materials from which the device is made. Consultation with a service professional
or technical specialist is advisable when first establishing cleaning procedures, or if circumstances change.

1.2. Implementation

Any operating framework or quality system set in place in your laboratory should reflect your needs and expec-
tations. The instrument park, the range of procedures carried out and the stages at which they are necessary,
the accuracy and repeatability required for each measurement, and other key parameters (user management,
data integrity, industry standards and guidelines) should be considered during its development.

The individual chapters of this guide are intended to provide an overview of a series of commonly used instru-
ments (listed below), and offer pointers as to equipment choice, use, cleaning, performance verification,
and service, as well as, in selected cases, automation options. Available software and accessories are also
discussed as applicable.

The instruments are grouped by measuring principle, and include those that support:

Gravimetric Techniques
• Weighing. A key activity in most laboratories, though often underestimated, weighing is important to execute
correctly due to its frequent involvement in analytical workflows. We outline actions to take so as to improve
data quality and reduce the risk of error.
• Moisture content determination. Critical to many products and industries, it guides everything from product
quality and performance to shelf life. We introduce the operating principle of halogen moisture analyzers,
along with strategies for precise and reliable results.

Volumetric Techniques
• Pipetting. A mainstay of life science laboratories, pipetting is, however, sensitive to technique and sample
type alike. We describe the different types of pipettes, strategies to accommodate challenging samples, and
the components of a pipetting system that will enhance the accuracy of results on a continuing basis.

Analytical Chemistry
• pH and conductivity. Important values for many applications, products, and processes, highly accurate pH
and conductivity measurements are achievable for substances with a broad range of properties via tailored
sensors and dedicated methods.
• Titration. Depending on the reproducible and stoichiometric completion of a well-characterized reaction, titration
is a workhorse in many laboratories. We provide strategies and applications to reinforce the repeatability and
accuracy of measurements for various titration types.
• UV/Vis. Used in analytical chemistry as well as life science laboratories, UV/Vis spectroscopy is a routine assay
for substances as disparate as transition metal ions, highly conjugated organic compounds, and biological
macromolecules. We highlight spectrophotometers’ operating principles and outline consumables and certified
reference materials to ensure optimal results.
• Density and refractive index. Providing critical quality control information in many industries, these analyses
are fast and reproducible given the right methods and equipment. We include an overview of measuring prin-
ciples and the most effective workflows to support reliable results.

Thermal Values and Thermal Analysis
• Melting, dropping, boiling, cloud, and slip melting point. These analyses are based on the same general meth-
odology: controlled heating until a defined event based on a phase transition is detected. Correct sample preparation
and instrument operation, both essential prerequisites for accurate results, are also discussed.
• Thermal analysis. Encompassing a range of techniques to characterize materials under controlled temperature
and time programs, thermal analysis offers detailed insights into their properties and behavior under stress.
Together with an outline of the different techniques, we describe a systematic approach to method development.

Each section follows a similar structure: the technique itself, often including elements of theory and the operat-
ing principles of the relevant equipment, measurement uncertainty and, as presented above, practical details
to ensure continued quality of results. We also provide links to proven applications, along with more detailed
reading material to support further learning activities. We encourage you to delve more deeply into the chapters
most relevant to your work, and to explore METTLER TOLEDO’s Expertise Library or contact a local representative
should additional questions arise.

2. Gravimetric Procedures: Weighing and Moisture Analysis

Gravimetric methods – those based on weight determination – have been in use for thousands of years.
Originally utilized in commerce to measure quantities, or in certain cases, substances’ purity (e.g. for precious
metals), gravimetry is now broadly employed in research, production, and quality control. In this chapter we
highlight applications involving both balances and moisture analyzers, placing our emphasis on best practice
in daily operations as well as effective instrument maintenance.

Weighing in the Lab


Weighing is a fundamental part of most laboratory workflows. Whether used to prepare solutions or standards,
or to evaluate samples, it has a strong bearing on the outcome of downstream operations: any errors introduced
at this stage will be propagated to subsequent steps. In practice, this means that the choice of instrument, its
installation location, and its readiness to weigh should all be considered in order to ensure that accurate and
precise data is reliably generated.

In principle, a balance or weighing cell determines samples’ weight: in other words, according to Newton’s
second law of motion, their mass multiplied by the acceleration acting on them (usually the acceleration due to gravity: F = ma). Weighing
results are, however, typically reported as mass, either in kilograms or in fractions thereof. While other units of
measure can be derived mathematically, until 2019 the reference kilogram was a physical object, the International
Prototype of the Kilogram (IPK), kept under controlled conditions in St-Cloud, France. The kilogram has since been defined
in terms of fundamental physical constants, including the speed of light, c, and Planck’s constant, h.

Measuring Principles
Weighing operations appear simple, involving the placement of the object under evaluation on the weighing pan
of a gravimetric device. Yet understanding how these devices function, and their inherent limitations, is essential
to countering the internal as well as external factors that can influence the accuracy and precision of the resulting
data, and hence to achieving optimal weighing results.

Both balances and moisture analyzers used for high-resolution applications – such as METTLER TOLEDO
Excellence and Advanced series instruments – employ magnetic force restoration (MFR) technology. The beam
of the weighing instrument is kept in equilibrium by compensating the weight of the sample via a magnetic force.

Strain gauge (SG) technology is applied for lower-resolution weighing tasks. An SG weighing cell converts the
load placed on a balance or moisture analyzer into a change in electrical resistance, and the technology also
increases the robustness of weighing cells.

In both cases, modern electronics and software convert the initial signal to the desired unit and readability.

2.1. An Overview of Balances

Laboratories will frequently have more than one kind of balance, with the choice of instrument depending on
the sensitivity of the applications to be performed. The balance’s minimum and maximum capacity, described
further below, are essential to keep in mind when planning a weighing procedure.

An analytical or semi-analytical balance will usually offer a high degree of precision, with capacity in the range
of 54–520 g and readability of 0.005–0.1 mg. Precision balances, also called top-loading balances, offer a
wider range of weighing capacities – from less than a gram to 64 kg. They allow readability in the range of 1 mg
(0.001 g) to 1 g, or 0–3 decimal places. High precision laboratory balances can extend this accuracy to four
decimal places; on the display of the precision balance, 0.0001 g (0.1 mg) increments are used.

Figure 1: Precision (left) and analytical (right) laboratory balances.

Precision balances produce steady readings in a greater range of environmental conditions than analytical
balances, and provide a more convenient weighing experience due to their lower sensitivity to temperature
fluctuations and air currents. While a draft shield is not always necessary for their accurate use, when working
in a ventilated area such as a fume hood, or with a high-resolution precision balance (e.g. of 1 mg readability)
on the open bench, one should be employed to maintain performance.

A modified weighing pan such as SmartPan™ will compensate for these possible effects, such that a balance
with 1 mg readability can be used without a draft shield in standard conditions, while one with 0.1 mg readability
can still provide stable and accurate readings with the draft shield doors open.

Applications for analytical and precision balances in a laboratory setting include simple weighing, formulation,
dynamic weighing, sample preparation, preparation of buffers, statistical quality control, and interval weighing.

A microbalance or ultra-microbalance delivers the highest precision of all lab balances. Microbalances offer a
capacity of up to 10.1 g with readability down to 1 µg (0.000001 g). Ultra-microbalances offer an incredible full
resolution of 61 million digits, with a capacity of 6.1 g and 0.1 µg readability (0.0000001 g).
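The quoted resolution follows from simple arithmetic: the number of display digits is the capacity divided by the readability. A small check in Python:

# Checking the "61 million digits" figure quoted above:
# resolution (display digits) = capacity / readability.
capacity_g = 6.1        # ultra-microbalance capacity
readability_g = 0.1e-6  # 0.1 µg expressed in grams
print(f"{capacity_g / readability_g:,.0f} digits")  # 61,000,000 digits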

Figure 2: Microbalances and ultra-microbalances (shown) have low capacity and ultra-low readability.

Microbalances and ultra-microbalances are frequently used in product testing and quality assurance labs, as well
as in chemistry labs and mining operations for sensitive applications. Medical device research may also employ
a microbalance to check uniformity in critical components. Microbalance applications include particulate matter
(filter) weighing, pipette calibration, analysis of pesticides, and stent weighing. Ultra-microbalance applications
include particulate matter (filter) weighing, ashing or incineration, drying, measurement of coatings, applications
for elemental analysis, and checking spillage quantities.

For selected applications with substances that are very rare, expensive, difficult to handle, or simply dangerous
to human health, automated dosing on analytical balances offers the necessary precision and safety. METTLER
TOLEDO’s XPR Automatic balance dispenses at 2 µg readability, a precision unmatched by manual weighing.
As powdered substances are sealed in an airtight container, operator exposure to them is limited.

Figure 3: Automatic balances dispense solids and liquids without manual intervention. Able to accommodate various types of weighing vessels, they afford extreme precision and accuracy at low weights.

In combination with a liquid dispensing module, the XPR Automatic Balance enables you to prepare highly
precise concentrations down to 0.1 mg/g. Achieving a dilution factor of 10,000 is possible in a single step.
The amounts of solvent, standard, and sample required for typical operations are reduced considerably, saving
costs and diminishing waste. Solutions are prepared directly in a target container in an automated process that
is fast, accurate, and cost-effective.
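As an illustration of the gravimetric preparation described above, the brief Python sketch below computes a concentration in mg/g from dosed masses. The figures are hypothetical examples and do not correspond to the output of any particular instrument.

# Hypothetical example of the gravimetric concentration calculation behind
# automated solution preparation: concentration = analyte mass / solution mass.

def concentration_mg_per_g(analyte_mg, solution_g):
    """Concentration in mg of analyte per gram of final solution."""
    return analyte_mg / solution_g

analyte_mg = 2.0                              # standard dosed by the balance, mg
solvent_g = 19.998                            # solvent added by the liquid module, g
solution_g = solvent_g + analyte_mg / 1000.0  # total solution mass in grams

print(f"{concentration_mg_per_g(analyte_mg, solution_g):.3f} mg/g")  # ~0.100 mg/g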

2.2. Finding the Right Balance

Ensuring that you select a balance that meets your requirements is the first step to obtaining correct and reli-
able results through the complete analytical workflow or process. Instrument accuracy is a good place to start.
Accuracy – the correctness, or closeness to the true value, of a single measurement – is often confused with
the readability or display resolution of a balance. It is instead determined by the performance and quality of the
instrument’s measuring sensor. To define accuracy, we must assess the measurement uncertainty of the instru-
ment in the environment where it is used. To assess whether the accuracy of your balance meets the require-
ments of your process, you will need to consider the following points.

1. Define the smallest net sample you want to weigh. This is the so-called “smallest weight” to be weighed on the selected balance. If you weigh several components into the same tare container, the smallest net weight refers to the smallest of the components that you weigh and not to the smallest total amount of all components. Make sure your smallest net sample is above the value given by the minimum weight with safety factor applied (see 3, below).

2. Define your tolerance (how accurately you need to weigh the smallest sample and how much variation is acceptable). Your tolerance will determine if a device measures “well enough” to meet process requirements. If the balance is used for more than one process, select the tightest tolerance expected according to your SOPs.

3. Define the minimum weight of the balance and apply a safety factor. The minimum weight is the value below which the weighing results may be subject to an excessive relative error. The safety factor is a multiplier applied to the minimum weight to allow for fluctuations caused by environmental influences and varying operators and samples. The rule of thumb is to begin with a safety factor of 2 for typical laboratory environments and 3 or more for typical manufacturing / production environments.

4. Check the maximum capacity of the balance and compare it against the maximum load you intend to weigh, inclusive of your tare container.

With these four points in mind, a balance can be selected. Points 2 and 3 are necessary to define the lower accu-
racy limit of the balance. Most balance manufacturers specify a minimum weight in their technical data that may
serve as a starting point. However, these figures should be double-checked against the process requirements for a
planned operation. Since minimum weight can vary across balance types, and even among balances of the same
type, e.g. when they are placed in different weighing locations, it is best to have the minimum weight determined
experimentally in situ (note that this may be performed during professional installation).
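The four points above can be combined into a simple suitability check, sketched below in Python with hypothetical figures; in practice, the minimum weight determined experimentally at the balance's installation site should be used.

# A minimal sketch of the four-point balance suitability check described above.
# All figures are hypothetical examples.

def safe_lower_limit(minimum_weight_g, safety_factor):
    """Smallest net sample that should be weighed on this balance."""
    return minimum_weight_g * safety_factor

def balance_is_suitable(smallest_net_sample_g, max_gross_load_g,
                        minimum_weight_g, capacity_g, safety_factor=2.0):
    """Check points 1-4: smallest sample vs. minimum weight with safety factor,
    and maximum load (sample plus tare) vs. capacity."""
    return (smallest_net_sample_g >= safe_lower_limit(minimum_weight_g, safety_factor)
            and max_gross_load_g <= capacity_g)

# Example: 120 mg smallest sample, 80 g gross load, 52 mg minimum weight,
# 220 g capacity, laboratory safety factor of 2.
print(balance_is_suitable(0.120, 80.0, 0.052, 220.0, 2.0))  # True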

Due to the wide variety of balances available, consultation with a professional may be a useful stepping stone to
finding the instrument best suited to your laboratory’s applications. METTLER TOLEDO’s GWP® Recommendation
service offers a customized, free of charge look at available options to fulfill your needs.

2.3. Balance Installation: Preventing Environmental Influences on Weighing

A range of factors, both internal and external, may impinge on balance performance. For example, balance drift,
usually due to the environment in which the instrument operates, may affect zero / tare functions, resulting in
slower settling times and reduced accuracy.

To limit the impact of external physical influences on a balance, consider the following:
• Is the balance placed on a solid surface (bench or table)?
• Is the balance level?
• Is the balance located away from windows?
• Is the balance located away from open doors?
• Is the balance located far from radiators and other sources of heat?
• Is the balance located away from mechanical equipment?

As shown by the questions above, placing a balance on a solid foundation is not enough to counteract drift.
Environmental conditions that can affect results include vibrations, air drafts, and temperature variations.
Consider the three most important balance placement aspects to limit the impact of environmental influences.

1. Vibrations
Use a solid, stable weighing table. Damper plates can be helpful in some cases. Avoid positioning the balance
near mechanical or motorized equipment such as pumps.

2. Temperature variations
Avoid locating the balance near direct heat sources or windows to help keep temperature constant.

3. Air drafts
Avoid locating the balance near doors, high-traffic areas, vents, or fans (including those of other lab devices).
Close draft shield doors when weighing in a ventilated safety enclosure or fume hood.

2.4. Effective Balance Use and Weighing Operations

Once installed and in routine use, the balance should be operated with care to ensure the safety of personnel,
as well as the accuracy and repeatability of results over the long term. Proper PPE protects samples from con-
tamination as well as keeping operators safe; routine testing and periodic service identify potential sources of
error early, reducing the likelihood of extensive rework due to weighing inaccuracies.

Note that in addition to appropriate PPE, care should be taken to prevent spills. Laboratory guidelines and / or
occupational safety and health directives should be followed in the event a spill or similar incident occurs to
reduce the risk of exposure to hazards.

2.4.1. How to Weigh

Weighing is a straightforward procedure that most laboratory professionals know well, but keeping in mind a few
tips can make a difference to the quality of results. In addition to selecting the most appropriate balance for your
weighing operation and ensuring that it is ready to weigh (clean, level, and not undergoing an internal adjustment
cycle – see below), best practice involves the following:
• Weigh directly into a (tared) vessel, not into a weighing boat or onto weighing paper. The latter two can cause
sample loss.
• Consider potential electrostatic charges. Use a de-ionizer if necessary.
• Use a clean spatula to prevent sample contamination. Wash it or wipe with lab tissue before weighing if
necessary.
• When weighing multiple samples, for instance when preparing buffers, growth media, or other multi-compo-
nent solutions, write out the relevant quantities of each substance (legibly) before beginning the operation,
or follow a printed SOP. This will help to prevent omissions or double additions, and will provide a record later
on if you are concerned that the quantities were incorrect.
• Note the exact quantities of each substance weighed out next to your protocol or SOP (see above). In the best
case, use a printer and / or record the values electronically in balance software to prevent transcription errors.

2.4.2. Maintaining a Clean Balance

Keeping a balance clean can help to lengthen its service life and ensure it is functioning effectively on an ongo-
ing basis. It also helps to minimize the risk of cross-contamination, which may introduce unforeseen sources
of error into laboratory workflows; and to enhance user safety, increase operating reliability, and reduce rates of
equipment failure.

To determine whether a balance is sufficiently clean – before and after your weighing operation – and to establish
an appropriate schedule for maintenance, consider the following:
• Is a cleaning SOP in place?
• Can the balance be easily dismantled for cleaning?
• Are only non-toxic substances weighed?
• Is only one substance weighed, to reduce cross-contamination risk?
• Is the balance protected from dust and dirt when in use?
• Is the balance well protected when not in use?

Guidance on balance cleaning can usually be found in the operating manual, and is also presented in the poster
“Even the Best Balances Need Some Care”, in a guide, and in an SOP.

2.4.3. Maintaining Balance Accuracy

Over time, the performance of weighing equipment can deteriorate due to environmental changes, wear and tear,
or non-obvious damage. While most laboratories have defined a schedule for professional service (calibration,
adjustment, and preventive maintenance), this is only part of a complete maintenance program that also encom-
passes routine testing.

2.4.3.1. Routine Testing by the User

Routine tests are measures undertaken by operators to monitor performance over time, with an eye to identifying
problems as they arise between calibrations. By uncovering malfunctions early, such tests help to eliminate sur-
prises at the next calibration service and ensure that corrective action can be taken in a timely fashion.

Figure 4: Certified, traceable test weights can be used to verify balance performance between service visits.

When routine testing is not performed correctly (or at all), the effects can be serious, and include:
• Unidentified weighing inaccuracies,
• Erroneous results,
• Poor product quality, with concomitant process and audit issues, and
• Repetition, product rework, or recalls.

Weighing’s early involvement in many workflows can amplify the significance of these consequences, most
notably with respect to the time lost and the financial implications of weighing errors over a sustained period
of time.

Routine testing relies on four pillars:
1. Test frequency
2. Test methods
3. Test weights
4. Test tolerances
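As a simple illustration of the fourth pillar, the Python sketch below compares the reading obtained with a certified test weight against a tolerance taken from an SOP; all values are hypothetical examples.

# Illustrative routine-test evaluation: is the deviation from the nominal
# test weight value within the tolerance defined in the SOP?

def within_tolerance(reading_g, nominal_g, tolerance_g):
    """Return True if the deviation from the nominal value is within tolerance."""
    return abs(reading_g - nominal_g) <= tolerance_g

nominal = 200.0        # certified 200 g test weight
reading = 199.9996     # balance indication
tolerance = 0.001      # 1 mg tolerance from the SOP

print(within_tolerance(reading, nominal, tolerance))  # True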

For more information on how often you should test, the three main test methods, and how to analyze the results
of your tests, we recommend the guide “Correct Weight Handling”. To explore test weight sets, please refer to the
METTLER TOLEDO website.

2.4.3.2. Internal Adjustment by the Balance

As briefly alluded to above, many balances include automatic adjustment mechanisms, which are an integral
part of their design and consist of one or more reference weights and a loading mechanism. Such a mechanism
conveniently tests and adjusts the sensitivity of the weighing device and is used to compensate for the impact of
temperature changes on measuring results. Balances should not be used during an internal adjustment cycle,
which usually will last a few minutes at most.

Although such internal mechanisms allow a reduction in the frequency of tests with external reference weights,
they do not substitute for other methods of performance verification, whether routine testing performed by users
or professional service at regular intervals.

2.4.3.3. Calibration and Adjustment by a Service Technician

The cornerstone of performance verification is calibration by a qualified technician, which should be undertaken
according to a schedule defined by a risk assessment. In most cases calibration is an annual or biannual occur-
rence, and will determine a balance’s performance under operating conditions on site, using certified test weights.
Calibration must quantify measurement uncertainty, which consists of assessing the error of an indication, the
influence of eccentricity, and the repeatability.
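Two of the quantities named above can be illustrated with a short calculation on replicate readings of a certified test weight: the error of indication is the deviation of the mean reading from the nominal value, and the repeatability is the standard deviation of the readings. The readings below are hypothetical, not measured data.

# Hedged sketch: error of indication and repeatability from replicate
# readings of a certified test weight (hypothetical values).

from statistics import mean, stdev

nominal_g = 100.0
readings_g = [100.0002, 100.0001, 99.9999, 100.0003, 100.0000,
              100.0002, 99.9998, 100.0001, 100.0002, 100.0000]

error_of_indication = mean(readings_g) - nominal_g
repeatability = stdev(readings_g)  # sample standard deviation

print(f"Error of indication: {error_of_indication * 1000:.3f} mg")
print(f"Repeatability (s):   {repeatability * 1000:.3f} mg")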

While calibration states how a weighing device behaves, adjustment changes its behavior. Adjusting an instru-
ment means modifying its indications in a way that allows them to correspond – as much as possible – to the
quantity values of the measurement standards applied. For example, adjustment may be employed to reverse a
systematic drift in readings.

For more information on calibration and adjustment services, and to define a service schedule that meets your
needs, please consult METTLER TOLEDO’s Good Weighing Practices™ (GWP) resources.

2.4.3.4. Weighing Errors from Samples or the Environment


While the quality of weighing results may be affected by many factors – among them the location of the balance,
the time elapsed since its last service, and the local microenvironment – electricity also plays a role. Electrostatic
charge is a common yet influential source of weighing errors that may prevent readings from stabilizing, and
stems from the accumulation of electrical charges on the surface of a non-conductive material. The phenomenon
of static electricity arises from the separation of positive and negative charges: when two different materials are
in contact with one another, electrons can move from one to the other, leaving a net positive charge on one
material and an equal negative charge on the other.

Electrostatic charges can cause sample handling difficulties, longer weighing times, and errors in weighing
results. The effect on the accuracy and reproducibility of the values measured can be very significant, particu-
larly for low sample weights.

Static is often visibly apparent, as powders may adhere to the sides of a target vessel rather than collecting on
the bottom. It can also be recognized through:
• Drifting measurements,
• Irreproducibility of results,
• Balance instability (or a stabilization time longer than usual).

Electrostatic charges also disrupt the weighing process if the sample or opening of the tare container is charged.
When this happens, powder can “jump” from the spatula onto the tare container, invalidating a measurement.
Dry powders are very susceptible to electrostatic forces and can be troublesome to weigh. Weighing a small
quantity of material into a large glass or polymer vessel represents the classic use case in which an electrostatic
charge may significantly increase the error of the weighing result.

It should be noted that certain balances can detect the presence of electrostatic charges and warn the operator
when they exceed a predefined tolerance.

Figure 5: Electrostatic forces can alter weighing accuracy and can be counteracted, e.g., by ionizing modules.
Figure 6: The StaticDetect™ feature monitors electrostatic charges.

For more information about the effects of electrostatics on weighing, as well as how to dissipate charges, please
consult our resource collection.

2.4.4. Ensuring Data Accuracy

While jotting down weighing values is standard practice, keeping track of weighing data manually is both
error-prone and time-consuming. In addition, making calculations based on transcribed results poses its own
challenges.

The options outlined below reduce errors and accelerate workflows.

Option 1: connect a printer to your balance


One basic way to reduce the risk of transcription error is
simply to add print-out capabilities to your balance, scale,
or other analytical instrument. This is an easy solution to
mistakes and difficult-to-read handwriting, and allows a
stable record of weighing operations, often with a timestamp
included. All METTLER TOLEDO balances are compatible
with RS or USB printers, and newer models are often LAN-compatible for network printing.

Figure 7: Adding a printer to a lab balance diminishes the risk of transcription errors and provides a lasting record of weighing operations.

Option 2: leverage balance software


Software such as EasyDirect™ Balance, which comes with Advanced and Standard series balances such as
the MS-TS, ML-T, and ME-T models, allows weighing results to be collected, stored, and processed electronically.
Weighing results can be reviewed on the PC screen, filtered by date, instrument, user, or sample. Results can be
visualized in charts, e.g. to assess target ranges. Meaningful reports can be created for documentation purposes:
data can be exported from the SQL database in various formats (XML, CSV, XLSX, or PDF) to a PC, or printed on
a network printer.
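As an illustration of what such an export enables, the hypothetical Python sketch below filters a CSV export by sample ID and flags results outside a target range. The file name and column headers are assumptions for illustration and should be adapted to the actual export format of your software.

# Hypothetical post-processing of a CSV export of weighing results.
# File name and column headers are illustrative assumptions.

import csv
from datetime import datetime

with open("weighing_results.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Filter results for one sample ID and flag values outside a target range.
target_low, target_high = 9.95, 10.05  # grams
for row in rows:
    if row["sample_id"] != "BUFFER-A":
        continue
    value = float(row["net_weight_g"])
    timestamp = datetime.fromisoformat(row["timestamp"])
    status = "OK" if target_low <= value <= target_high else "OUT OF RANGE"
    print(f"{timestamp:%Y-%m-%d %H:%M}  {value:.4f} g  {status}")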

Figure 8: Software such as EasyDirect Balance facilitates record-keeping, as well as further analyses.

Option 3: integrate your balance into a network


Instrument control software such as LabX offers more than data capture and data management. Users are fully
guided through a complete workflow according to an SOP, data including metadata is stored centrally, and all
steps are fully tracked. This ensures that records are correctly archived and provides traceability back to the
origin of each measurement. This is the most sophisticated approach; while it offers the greatest flexibility for data
management, it is usually also the most expensive and time-consuming to install.

LabX works together with all METTLER TOLEDO Excellence series instruments and permits the definition of work-
flows involving both balances and analytical instruments (titrators, pH meters, etc.). In this way the user can be
prompted through an SOP involving more than one device, helping to ensure that entire workflows are carried
out consistently and with no errors in data logging or calculations.

LabX also supports data traceability, ensuring all data records are stored and easily accessible in compliance
with ALCOA+ requirements, EU Annex 11, and FDA 21 CFR Part 11.

2.5. Moisture Content Determination

Gravimetric principles can also be used to determine moisture content – a common quality check in a number
of industries. Moisture affects the processability, shelf life, usability, and quality of many products, among
them plastics, food ingredients, and both active pharmaceutical ingredients (APIs) and excipients. Most sub-
stances have an optimum moisture content for processing into finished products of high quality; moreover, its
impact on price has led to statutory rules governing the maximum level of moisture permissible in certain goods
(as defined by pharmacopeias or national food regulations).

2.5.1. Measuring Principle

The traditional method to measure moisture content is loss on drying with a drying oven, a differential weighing
procedure. In a typical loss on drying workflow, a piece of sample might be placed in a crucible and weighed,
then set in a drying oven (still in the crucible) for a specified length of time before a second weighing step. Given
that the crucible’s weight is pre-determined, the loss of weight over the drying interval can be used to calculate
the original sample’s percent moisture.
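The underlying calculation is a simple one, as the minimal Python sketch below illustrates with hypothetical weights: the weight lost on drying is expressed as a percentage of the original (wet) sample weight.

# Minimal sketch of the loss on drying calculation described above.
# Weights are hypothetical example values.

def moisture_content_percent(wet_gross_g, dry_gross_g, tare_g):
    """Percent moisture relative to the original (wet) sample weight."""
    wet_net = wet_gross_g - tare_g
    weight_loss = wet_gross_g - dry_gross_g
    return 100.0 * weight_loss / wet_net

tare = 25.1234   # empty crucible, g
wet = 30.1234    # crucible + sample before drying, g
dry = 29.8734    # crucible + sample after drying, g

print(f"Moisture content: {moisture_content_percent(wet, dry, tare):.2f} %MC")  # 5.00 %MC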

As several hours’ drying may be required for complete moisture evaporation, depending on the sample type,
a faster workaround presents an attractive alternative. This is offered by halogen moisture analyzers, which
operate according to the same principle, but employ a halogen heating unit for quick, accurate, and consistent
results. In moisture analyzers the loss in weight is continuously recorded and drying ends once a defined crite-
rion is reached; methods consist of two parts: sample heating and the switch-off criterion.

Figure 9: Halogen moisture analyzers offer fast and reproducible moisture results. Shown: the HX204 instrument.

Sample Heating
The type and shape of the heat source (radiator) are crucial to the speed and reproducibility of moisture deter-
mination. A halogen lamp reaches its maximal heating output very fast compared with conventional infrared
lamps. In addition, a ring-shaped radiator ensures even heating of the entire sample pan area.

Figure 10: Halogen heating elements increase moisture analyzers’ speed. Shown: the HC103 instrument.

Switch-Off Criterion
The switch-off criterion determines the point at which measurement is automatically ended and the result displayed.
Halogen moisture analyzers offer two different kinds of switch-off criteria: time control and weight loss over time.
In the latter case, the drying process is terminated and the result displayed if the loss in weight (Δg) falls below
the prescribed figure over a certain time interval (Δt). In the former case, the heating process is terminated and
results are evaluated after the set time has elapsed.

Mean weight loss per unit of time = Δg / Δt

Figure 11: Halogen moisture analyzers can terminate the drying process after a defined time interval, or based on weight loss within a set window of time (plot of sample weight vs. time; switch-off criteria 1...5).
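The logic of the weight-loss criterion can be sketched as follows (Python, with a hypothetical threshold and readings): drying is considered complete once the weight lost over the most recent interval, divided by that interval, falls below the prescribed figure.

# Illustrative sketch of the "weight loss over time" switch-off criterion:
# drying ends once the mean weight loss per unit of time (Δg/Δt) falls
# below a prescribed limit. Threshold and readings are hypothetical.

def drying_complete(weights_g, interval_s, max_loss_g_per_s):
    """True once the weight loss over the last interval drops below the limit."""
    if len(weights_g) < 2:
        return False
    delta_g = weights_g[-2] - weights_g[-1]
    return (delta_g / interval_s) < max_loss_g_per_s

readings = [5.000, 4.850, 4.760, 4.742, 4.7405]  # recorded every 30 s
print(drying_complete(readings, interval_s=30.0, max_loss_g_per_s=1e-4))  # True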

More information regarding the substitution of a halogen moisture analyzer for a balance and drying oven may
be found in a white paper.

2.5.2. Choosing a Moisture Analyzer

METTLER TOLEDO offers three halogen moisture analyzer lines that accommodate a broad range of laboratory
applications. As the choice of instrument is strongly influenced by the properties of the samples to be evaluated,
our short videos are a useful place to start. Explore a playlist on YouTube for overviews of their use and features,
as well as suggested instruments for pharmaceutical, polymer, and food production, or browse the same content
in written form in the guide “Selecting the Right Moisture Analyzer”.

It should be noted that halogen moisture analyzers may be designed with either hanging or top-loading
weighing pans. The former, which include METTLER TOLEDO’s Excellence instruments, are particularly suited to
applications requiring optimal measuring performance. They come with high performance weighing cells that
allow readability down to 0.001% moisture (%MC; the HX204 model) and a special construction that eliminates
as much measurement error as possible. In addition, they are easy to clean and maintain.

2.5.3. Tips and Hints for Moisture Determination

To ensure precise measuring results during routine operation, the following should be observed:
• Keep the sample pan area clean (e.g. using a brush).
• Clean dirt off the temperature sensor and protective glass on the
heating module (for details, see operating instructions).
• Only use clean sample pans for moisture determination. Using dis-
posable aluminum sample pans guarantees reliable measurements
free from the influence of residue remaining from previous samples
or cleaning agents.
• Do not use deformed sample pans.
• Ensure even particle size (granulation).
• If necessary, increase the sample’s surface area by crumbling or grinding it. This will afford better and faster
release of moisture during drying (faster diffusion of moisture to the surface). The sample should not be heated
during preparation to prevent moisture loss.

Use the right sample volume. Sample size depends on:
• The moisture content of the sample. The recommended sample size is typically 3–5 g, but for samples of <0.5 %MC it is 20–30 g.
• Homogeneity. Nonhomogeneous materials may require more sample to represent the entire batch.
• Sample properties or consistency (liquid, paste-like, solid, powdery).

Additionally, always use the same volume of sample to achieve good repeatability, and ensure that you spread it evenly across the entire pan.

Figure 12: The sample pan should be filled evenly for best results.

More information about sample preparation can be found in a short video.

Temperature: Follow one of the recommendations below:
• Apply the SOP relevant to your sample.
• Apply the same settings as for a similar sample.
• Apply the temperature used for the traditional oven method.
• Apply 105 °C for temperature-sensitive samples, and 150 °C for other samples.

More information, including method collections for materials commonly assessed in different industries, can be
found in our Expertise Library.

2.5.4. Data Quality

While following the guidance above will help in obtaining high quality results, a few other factors offer reinforcement.

First, the instruments should be kept within specification by service visits from an accredited technician. The fre-
quency of service (which should include preventive maintenance as well as adjustment) will depend on how
often, and under what conditions, the instruments are used; the best way to establish a suitable schedule is usu-
ally to consult with a professional – which may also be done at the time of installation. Regular service, coupled
with routine testing, will help to prevent unplanned downtime.

An effective routine testing strategy is afforded by the SmartCal™ test substance, which can evaluate halogen
moisture analyzer performance in just ten minutes. SmartCal provides fast and traceable evidence that your
device continues to function correctly between calibrations.

Figure 13: The SmartCal test substance affords traceable performance verification.

A further way to secure the quality of your results is also the simplest – exercise caution to avoid lapses in data-
recording. Models that permit data-export via USB flash drive or over a network, or the connection of a printer,
eliminate the negative consequences of omitting to write values down, or of manual errors. METTLER TOLEDO’s
EasyDirect Moisture Data Management Software, compatible with Advanced and Excellence series instruments,
permits the transfer of results as PDFs and CSV or XML files, and can also send reminders to perform routine
testing. Buttressing your existing workflows, it supports the monitoring of moisture content data over time, along
with audit-readiness.

2.6. Conclusions
The solutions offered in this chapter are intended to bolster the accuracy and repeatability of gravimetric measure-
ments in any laboratory. From instrument choice, use, cleaning, and testing to data-storage and service options,
the principles outlined here should be applicable in most contexts. We encourage you to browse the reference
materials listed below for more information, and never to hesitate in speaking to your service representative or a
support professional from METTLER TOLEDO should technical assistance be required.

2.7. Resources and Further Reading

Balances
Guides:
Proper Weighing with Laboratory Balances
8 Steps to a Clean Balance and 5 Solutions to Keep It Clean
Correct Weight Handling: 12 Practical Tips (Routine testing guide)

Other resources:
Outstanding Weighing Performance Even under Harsh Conditions (White paper)
SOP for Cleaning a Balance: Benchtop Balances (SOP)
Even the Best Balances Need Some Care (Poster)
Good Weighing Practices™ (GWP®) framework (Best practice guidelines)
Electrostatic Charges and Their Effects on Weighing (Resource collection)

Moisture Analyzers
Guides:
Guide to Moisture Analysis: Fundamentals and Applications
Selecting the Right Moisture Analyzer: Guidance and Criteria

Other resources:
Drying Oven Method vs. Halogen Moisture Analyzer: A Practical Guide to Compare Methods (White paper)
How to Get Reliable Moisture Results (Resource collection)
Moisture Application Library (Application notes)
How to Select a Moisture Analyzer (YouTube playlist)
Correct Sample Handling: The First Step to Reliable Moisture Results (YouTube video)

3. Volumetric Procedures: Liquid Handling Operations

Liquid transfer ranks close to weighing in its importance among laboratory activities, but the equipment used will
differ depending on the volumes involved. Filling a vessel (e.g. a beaker, flask, or graduated cylinder) to a mark
will be sufficiently accurate for some operations, but for more exacting workflows that employ smaller volumes,
pipettes may be necessary to achieve the required precision.

The reproducibility and accuracy of a pipetting operation depend upon creating a pipetting system made up of
the proper pipette and tip, as well as user technique. The care and maintenance of the pipettes are essential to
the consistent production of accurate results; in the sections below, we outline the considerations to be made
when creating, calibrating, and maintaining a pipetting system.

3.1. An Overview of Pipette Types

Both manual and electronic varieties of pipettes are in use in many laboratories. Electronic pipettes use a motor
to drive the piston that controls the volume of liquid to transfer, whereas manual pipettes rely on the user’s thumb.

Figure 14: Manual and electronic pipettes. Left to right: manual single-channel, multichannel, and adjustable-spacer multichannel pipettes;
electronic single-channel, multichannel, and adjustable-spacer multichannel pipettes.

Electronic pipettes can often be programmed to carry out complex pipetting protocols with both dispensing and
mixing operations. As the program is identical from replicate to replicate, consistency is improved. In addition,
when repeated operations are necessary, electronic pipettes can enhance reproducibility while reducing user
fatigue and the strain associated with repetitive motion. The volume of liquid transferred, necessary precision,
and degree of repetition are factors to consider when deciding between manual and electronic options.

Manual as well as electronic pipettes come in single-channel and multichannel models. The latter typically have
either eight or 12 channels that can be loaded with individual tips. The multiple channels enable the uptake of a
designated volume several times in parallel, increasing the throughput of workflows involving well plates.

The adjustable-spacer multichannel pipette is designed with channels that can be positioned more closely
together or farther apart based on the application. These pipettes can also be used to transfer liquids among
receptacles of varying kinds – between well plates of different sizes, or between a well plate and microfuge
tubes, etc. Further efficiency gains can be achieved through high-throughput liquid-handling systems, which
have an aspirating / dispensing head that can accept at least 96 pipette tips at once, and can process 96- and
384-well plates in a single operation.

Figure 15: High-throughput liquid-handling systems enable quicker processing of well plates. Shown: METTLER TOLEDO's Rainin BenchSmart 96.

Pipettes used in life science labs are commonly calibrated for use in measuring predefined volume ranges with
nominal volumes of 2 µL, 20 µL, 200 µL, or 1 mL. Attempting to use a pipette over its entire volume range is
not advisable; accuracy will fluctuate due to the nature of the substance pipetted, the pipette’s state of repair, the
seal made between the pipette and the pipette tip, the user’s technique, and other factors. For this reason, it is
recommended to use a pipette with a nominal volume as close as possible to that being transferred.

3.2. Air-Displacement and Positive-Displacement Technology

Pipette design is based on one of two technologies: air displacement and positive displacement. Air-displacement
pipettes rely on an air cushion between the piston and the sample to transfer liquids, whereas positive-displacement pipettes lack this cushion. Positive-displacement technology outperforms air-displacement in the handling of challenging liquids.

Air-displacement pipettes achieve liquid uptake through the creation of a partial vacuum when the pipette’s
plunger, held within an airtight shaft, is depressed and then released following immersion of the tip in a sample.
Extremely accurate for most aqueous substances, the technology is less suitable in other cases. Aspiration and
dispensing operations involving viscous liquids, for example, stretch and compress the air column, leading to
measurement inaccuracies. In addition, volatile liquids constantly emit vapor; when they are aspirated into an air-displacement pipette tip, vapor pressure builds up in the air cushion, again diminishing accuracy.

Positive-displacement pipettes provide the necessary accuracy for challenging, frequently non-aqueous, sub-
stances. They employ syringe tips that contain an inner plastic piston extending the full length of the tip. At a volume setting of zero, the end of the inner piston is fully extended, closing off the tip. As a user aspirates liquid, the end of the piston rises, pulling liquid into the tip. With no air between the piston and the liquid in a positive-displacement system, the pressures that viscous and volatile liquids would otherwise exert on an air cushion do not come into play, and extreme accuracy is preserved.

Figure 16: Positive-displacement pipettes help to transfer challenging substances accurately. Shown: METTLER TOLEDO's Rainin PosD.

A white paper, “Pipette Choice and Maintenance: Best Practice Matters”, provides an overview of pipette
selection criteria.

3.3. Pipetting Strategies to Improve Data Quality


Good user technique complements the choice of the right pipette for a given workflow to bolster the precision and accuracy of results. An effective pipetting system depends on adherence to best pipetting practices, among them the following:
• Sample aspiration with the pipette tip as close to
90 degrees from the surface of the liquid as possible,
and no more than 20 degrees from the vertical. This
prevents excess liquid from being drawn into the tip.
• Consistent pipetting rhythm and smooth plunger
action. This ensures the complete uptake of a sample
into the tip before it is withdrawn from the liquid.
• Proper touch-off with each liquid dispensing action.
• Attention to the possible effects of hand-warming on
pipetting operations.

In addition, as a general guideline, the pipette tip should just break the surface of the liquid sample.

Figure 17: A pipetting angle as close to 90 degrees from the surface of the liquid as possible prevents excess liquid from being drawn into the tip.

The method of pipetting adopted must also be suited to sample properties. With challenging liquids, alterations
in technique may be the most effective way to improve accuracy. In particular, switching from forward pipetting to
reverse may be desirable when using an air-displacement pipette.

As the majority of applications in life science laboratories involve aqueous solutions, forward pipetting is often
referred to as the “normal” pipetting procedure. Forward pipetting involves depressing the pipette’s plunger
to the first stop, aspirating an aliquot of the sample while the pipette tip is immersed in it, removing the tip,
and then depressing the plunger fully, to the second stop, to expel the liquid held in the pipette tip into the target
receptacle.



Reverse pipetting is a modification to this technique that can be used with viscous substances, e.g. blood.
It involves first depressing the plunger to the second stop to uptake more liquid than desired. During the next
step, dispensing into the target vessel, the plunger is only depressed to the first stop. Although liquid will be
left in the tip, viscous samples will be dispensed more accurately and reproducibly with this technique than via
forward pipetting.

METTLER TOLEDO’s Good Pipetting Practice™ (GPP™) resources offer more information on pipetting techniques,
including an eLearning activity and a request form for a free training seminar. The posters “Get Better Results” and
“Pipette Challenging Liquids” also provide stepwise, visual instructions on best practice for your daily operations
and can be printed and hung in your lab.

3.3.1. Uncertainty in Pipetting Accuracy

An understanding of the roles played by systematic and random error in influencing the volumes dispensed is also
important when optimizing complex protocols. A white paper, “Pipetting Accuracy: Understanding Uncertainty”,
provides an overview of the subject, with details of how to identify and eliminate possible sources of error.
Methods to account for uncertainty are described as well.
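
By way of illustration, the short Python sketch below separates the two error types for a series of replicate dispensed volumes; the volumes and the 10 µL nominal value are invented example figures, and the calculation simply mirrors the common practice of reporting systematic error (bias) and random error (coefficient of variation) separately.

from statistics import mean, stdev

target_uL = 10.0                     # nominal volume selected on the pipette
replicates_uL = [9.91, 9.95, 9.88, 9.93, 9.90, 9.94, 9.92, 9.89, 9.93, 9.91]  # example data

avg = mean(replicates_uL)
systematic_error_pct = (avg - target_uL) / target_uL * 100   # bias (inaccuracy)
random_error_pct = stdev(replicates_uL) / avg * 100          # coefficient of variation (imprecision)

print(f"Mean volume:       {avg:.3f} uL")
print(f"Systematic error:  {systematic_error_pct:+.2f} %")
print(f"Random error (CV): {random_error_pct:.2f} %")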

3.4. Pipette Tip Selection

Suitable pipette tips are the third component of an effective pipetting system. There are many different types of
tips on the market, and it is common for labs to purchase several in order to have the appropriate tip on hand
for a particular application. For instance, the most sensitive procedures, including those involving live cultures,
diagnostics, or labile biomolecules such as RNA, may call for pre-sterilized and / or filter tips.

For more general use, labs may purchase either loose tips or prefilled racks and load them into existing pipette
tip racks or boxes, respectively, for autoclave-sterilization. This approach is not just economical; it cuts down on
the consumption of single-use plastics. Yet regardless of the application, the fit between pipette and tip should
be carefully considered. The tip should fit snugly against the pipette so that there are no leaks, and should not be
so tight that it becomes difficult to remove.

Figure 18: Purchasing prefilled pipette tip racks permits the reuse of labs’ existing tip boxes, cutting plastic waste. Racks are dispensed
from the bottom, preserving the sterility of the tips remaining in the packaging.

For additional information about the range of tips available, please consult METTLER TOLEDO’s “Guide to Good
Pipetting: Get Better Results” or explore our tip finder.



3.5. Pipette Cleaning

Pipette cleanliness and regular maintenance are essential to continued success in pipetting workflows. Even
with cautious use, pipettes will become dirty and can contaminate or cross-contaminate samples, so regular cleaning is required. However, pipettes should never be cleaned with aggressive solvents. Instead, they can be wiped down with lab wipes or other lint-free tissues dampened with distilled water. Mild detergent or isopropyl alcohol is likewise appropriate as needed. Bleach solutions should be avoided, as they may discolor the pipette.

METTLER TOLEDO’s “Cleaning a Pipette” poster includes guidelines for routine cleaning. It also has steps for
complete decontamination, with specific advice for countering common laboratory challenges.

3.6. Pipette Testing and Maintenance

Once the pipette(s), tips, and technique are decided upon for any workflow, the most important consideration
remaining is the ability to consistently produce accurate results. Confidence in results depends on confidence
in the proper functioning of the pipettes. As a result, calibration, preventive maintenance, and routine testing of
each pipette are essential to the quality of data, and to the quality of the pipetting system overall.

Calibration – the comparison of measurement values produced by a device with those of a known accuracy –
is necessary for all measuring equipment. Its frequency depends on customer tolerances, and in some cases
on regulatory requirements. Most pipette manufacturers recommend pipette calibration, typically by a service
professional, a minimum of once per year.

A balance is often used between service visits to check a pipette's calibration gravimetrically and so verify its accuracy (see, e.g., the "Pipette Routine Testing: A Practical Guide for Life Scientists" guide). As this procedure can be time-consuming, a newer, faster alternative for pipette calibration verification offered by METTLER TOLEDO's
Rainin SmartCheck™ system may be preferable. The user dispenses the nominal volume of the pipette into the
SmartCheck vessel, then follows the prompts to dispense the same volume three more times. A clearly indicated
result – Pass or Fail – may be obtained in less than one minute.
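
For the gravimetric check itself, balance readings are converted to volumes using a Z-factor that corrects for water density and air buoyancy. The sketch below illustrates the arithmetic only: the Z-factor of 1.0032 µL/mg is an approximate value for water at about 21.5 °C and 1013 hPa, the weighings are invented, and the acceptance limits shown are placeholders rather than official tolerances (use the limits defined in your SOP or in ISO 8655).

from statistics import mean, stdev

Z_uL_per_mg = 1.0032          # assumed Z-factor (water, ~21.5 degC, ~1013 hPa)
nominal_uL = 100.0            # test volume
weighings_mg = [99.45, 99.62, 99.51, 99.58, 99.40, 99.55]   # example balance readings

volumes_uL = [m * Z_uL_per_mg for m in weighings_mg]
bias_pct = (mean(volumes_uL) - nominal_uL) / nominal_uL * 100
cv_pct = stdev(volumes_uL) / mean(volumes_uL) * 100

# Placeholder acceptance limits - substitute the tolerances from your own SOP.
ok = abs(bias_pct) <= 1.0 and cv_pct <= 0.5
print(f"Bias: {bias_pct:+.2f} %, CV: {cv_pct:.2f} % -> {'PASS' if ok else 'FAIL'}")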

Figure 19: The SmartCheck device enables verification that a pipette is working within tolerances in just a minute or two.



Periodic performance verification reinforces data quality by detecting any divergences in the volumes transferred
that may arise over time. It allows users to set aside malfunctioning pipettes for service before they affect experi-
mental workflows, reducing rework and the waste of reagents.

Calibration and maintenance activities are additionally critical to continued good pipette performance. Service
should be ISO 17025–compliant; ISO 8655 describes the requirements that apply specifically to piston-operated volumetric devices.

For more information about the range of services offered by METTLER TOLEDO, please consult our website.

3.7. Resources and Further Reading

Guides:
Guide to Good Pipetting: Get Better Results
Pipette Routine Testing: A Practical Guide for Life Scientists

White papers:
Pipette Choice and Maintenance: Best Practice Matters
Pipetting Accuracy: Understanding Uncertainty

Posters:
Cleaning a Pipette
Get Better Results
Pipette Challenging Liquids

Other resources:
Good Pipetting Practice™ (GPP™) framework (Best practice guidelines)
Tip Finder



4. Analytical Chemistry: pH and Conductivity Value Determination

Substances' pH and conductivity are electrochemical properties that vary with their ion content. pH reflects acidity – more precisely, the hydronium ion concentration – while conductivity indicates an ability to conduct electricity. These parameters may be measured for a variety of reasons: to assess materials' quality or verify their
suitability for inclusion in manufacturing processes, to ensure the favorability of chemical or biological reactions,
and to determine the shelf-life of preserved foodstuffs, among others.

Both pH and conductivity are measured with a meter and sensor. Most pH sensors actually register the voltage
difference between a reference and a sample electrode following their immersion in a sample of interest; this value
can be mathematically converted into pH. Conductivity, by contrast, is assessed by applying a known voltage to a
measuring cell comprising electrodes of opposing charge, also immersed in a sample of interest, and is reported
in siemens / m (S/m). Below, we describe these parameters, as well as instruments and sensors for their measure-
ment, and list resources for further consultation.

4.1. The Principle and Importance of Measuring pH


The pH value, broadly speaking, indicates the acidity or alkalinity of a sample. Measured on a logarithmic scale that typically runs from 0 to 14, the pH concept is rooted in the self-ionization of pure water.

In solution, a chemical equilibrium exists for every substance. For water, a small proportion of neutral water
molecules (H2O) react with each other to produce the hydroxide (OH−) and hydronium (H3O+) ions:

H2O + H2O ⇌ OH− + H3O+

The equilibrium constant for this self-ionization reaction is referred to as Kw and, at ambient temperature (25 °C, by definition), its value is 1.0 × 10⁻¹⁴. This means that, at the specified temperature, the product of the activities of the OH− and H3O+ ions must have this numerical value:

[OH−] [H3O+] = Kw = 1.0 × 10⁻¹⁴

Notably, the chemical activity of a compound in solution, also referred to as its “effective concentration”, is the
product of the true concentration of the compound and a factor that accounts for deviations from the ideal behav-
ior that would be exhibited by the compound’s molecules in isolation. Mutual interactions and / or environmental
influences will cause the true and effective concentrations to differ.

Importantly, for dilute samples in aqueous solution, the activities of the OH− and H3O+ ions are approximately
equal to their concentrations.

The pH of an aqueous solution is a measure of its H3O+ ion activity, and is calculated as follows:

pH = −log [H3O+]

In pure water, the activity of the H3O+ ion is equal to that of the OH− ion; at ambient temperature, each ion has
an activity of 10⁻⁷ mol/L, and the pH is 7, or neutral. Upon the addition of chemical species that release the H3O+
ion (acids; also called proton [H+] donors), the value of the pH decreases, so that aqueous media characterized
by pH values <7.0 are referred to as acidic. Similarly, the addition of species that release OH− – bases, or proton
acceptors – results in a pH greater than 7.0.



It should be noted that the activities of the OH− and H3O+ ions cannot vary independently of each other in water.
An increase in H3O+ activity therefore yields a decrease in OH− activity, and vice versa, so that the product of
these ions' activities remains equal to Kw.
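
To illustrate these relationships numerically, the following minimal Python sketch uses the ambient-temperature value of Kw and an invented hydronium activity to compute the pH and the corresponding hydroxide activity:

import math

Kw = 1.0e-14                 # ion product of water at 25 degC
h3o = 2.5e-5                 # example hydronium ion activity, mol/L

pH = -math.log10(h3o)
oh = Kw / h3o                # hydroxide activity follows from Kw
pOH = -math.log10(oh)

print(f"pH  = {pH:.2f}")     # ~4.60 -> acidic
print(f"pOH = {pOH:.2f}")    # pH + pOH = 14.00 at 25 degC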

pH can be determined in aqueous and non-aqueous solutions alike; however, as pH measurements are based
on the ability of the electrode’s sensing glass to detect H3O+ ions (see below for a diagram), special care must
be taken to prevent errors from arising. A sensitive, high-impedance pH meter and electrodes with low-resistance
sensing glass assist users to obtain accurate measurements, as described in section 4.5.1.

For more information on pH theory, please consult our guides “Essential Knowledge of pH to Measure Correctly
from the Start” and “A Guide to pH Measurement: Theory and Practice of Laboratory pH Applications”.

4.2. Applications of pH

Acids and bases may be strong or weak – in other words, their pH values may cluster toward the edges of the
pH scale, or may lie closer to neutral. The strength of such compounds refers to their tendencies to dissociate,
releasing OH− and H3O+ ions in solution, and influences their possible uses in various applications.

Weak acids and bases can form buffers, which are relatively resistant to changes in pH. Many finished products,
including foods, cosmetics, and pharmaceuticals, will be buffered at a target pH selected for safety and efficacy,
or to preserve the properties of one or more substances of interest. pH is also an essential value in research and
development, with a bearing on biological as well as chemical reactions: fluctuations in pH may bias a reaction
in favor of a side-product, or may alter the stability of one or more components of the reaction, e.g. biomolecules
such as proteins.

In practice, this means that pH is often determined during both research and manufacturing or processing applications. Measured at goods-in, in-process, and final quality control, or routinely during the preparation stages of many laboratory experiments, it strongly influences the outcome of a broad range of workflows.
• In food and beverage production, for example, the pH value can dramatically affect end-product properties
such as appearance, texture, or taste. In addition, the growth of harmful microbes such as Clostridium botuli-
num is limited below pH 4.6, making pH determination important to both safety and shelf-life.
• In manufacturing processes, including those of chemicals and pharmaceutical compounds, the yield of spe-
cific products may depend on the pH of the relevant reactants’ solution. Should the pH differ, side-products
may result, or the favorability of the reaction change.
• The pH and conductivity values of pure water, used in applications from food and beverage to pharmaceutical
production, also indicate possible microbial contamination.

As such, a variety of regulations and standards, including pharmacopoeial, provide a framework for the circum-
stances under which pH measurements should be carried out, as well as their accuracy and repeatability.

More information on pH applications can be found in METTLER TOLEDO’s Expertise Library, as well as on our
“pH in Food and Beverage Production” page.

4.3. The Principle and Importance of Measuring Conductivity

Electrical conductivity is the ability of a substance to carry an electrical current. The current may be carried by elec-
trons or by ions, depending on the type and nature of the sample; in solution, conductivity reflects the overall quan-
tity of ionic species released from an electrolyte (salts, acids, bases, and some organic substances), and does not
distinguish individual ions. Conductivity measurements are frequently employed for the monitoring and surveillance
of a wide range of different types of water (pure water, drinking water, natural water, process water, etc.), as well as
other solvents, and are also sometimes used to determine the concentrations of conductive chemicals.



Dissolving an electrolyte in a polar solvent, such as water, yields an electrically conducting solution. The elec-
trolyte dissociates into positively and negatively charged ions (cations and anions, respectively). When a known
voltage is applied in the presence of electrodes, or poles, with opposing charges, the resistance can be deter-
mined and used to calculate conductance.

More specifically, according to Ohm’s law:

V = IR

where V = voltage (in volts, V), I = current (in amperes, A), and R = resistance (in ohms, Ω).

Conductance, G, is equal to 1/R, and is measured in siemens (S). Conductance depends on the geometry of the
measuring cell, made up of the oppositely-charged electrodes, and can be used to derive the solution’s conduc-
tivity, κ, a value independent of the measuring cell.
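
As a simple numerical illustration of these definitions, the short Python sketch below converts an example resistance reading into conductance and, using an assumed cell constant of 1.0 cm−1, into conductivity; both numbers are invented:

R_ohm = 8.0e3          # example measured resistance of the solution
K_cell = 1.0           # assumed cell constant, 1/cm (from the sensor certificate or calibration)

G = 1.0 / R_ohm        # conductance in siemens (S)
kappa = G * K_cell     # conductivity in S/cm

print(f"Conductance:  {G*1e6:.1f} uS")
print(f"Conductivity: {kappa*1e6:.1f} uS/cm")   # ~125 uS/cm for this example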

Figure 20: Schematic set-up of a conductivity measuring cell (voltage source, cathode, and anode).

Both strong and weak electrolytes exist; analogously to strong and weak bases, they may tend to dissociate fully,
or only partially, under ambient conditions.

Determining the conductivity of water is prerequisite to a variety of applications, as touched on above; among
them are formulation processes for foods and beverages, pharmaceuticals, or cosmetics. As conductivity may
denote the presence of ions that would interact with or inhibit key components of a reaction (e.g. for pharma-
ceutical compounds), or reveal that of microorganisms, its measurement is an important quality guarantor for
finished products. It may also be assessed in industrial settings such as power generation: the water used to
cool power systems should fall within specified conductivity limits to prevent corrosion from occurring. Similarly
to pH, therefore, industry standards and regulations obtain regarding conductivity measurement under specific
circumstances.

Please consult “A Guide to Conductivity Measurement: Theory and Practice of Conductivity Applications” for more
comprehensive information on conductivity determination.

4.4. Instrumentation to Measure pH and Conductivity

Historically, pH was measured via colorimetric assay – litmus paper (also called pH paper), which changes color
predictably with changing pH. Essentially qualitative, and poorly suited to measurements of colored samples,
litmus paper is now normally used only for basic applications such as testing the pH of an aquarium or a swim-
ming pool.

Modern pH meters, by contrast, are voltmeters that convert the difference in electrical potential between a reference
electrode and one that is sensitive to the activity of the H3O+ ion in solution into a quantifiable signal. The measur-
ing range usually lies between −2,000 mV and +2,000 mV with a resolution of 0.1 mV; the relationship between
the activity measured and the pH value is described mathematically by the Nernst equation.
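
To make the relationship concrete, the minimal Python sketch below computes the ideal Nernst slope (about 59.16 mV per pH unit at 25 °C) and converts example potentials into pH assuming an idealized electrode with its zero point at pH 7; real meters instead apply the slope and offset obtained from buffer calibration, as described in section 4.6.

R, F = 8.314, 96485.0               # gas constant (J/(mol*K)), Faraday constant (C/mol)

def nernst_slope_mV(temp_C: float) -> float:
    """Ideal electrode slope in mV per pH unit at the given temperature."""
    T = temp_C + 273.15
    return 2.303 * R * T / F * 1000.0

def potential_to_pH(E_mV: float, temp_C: float = 25.0) -> float:
    # Idealized: 0 mV at pH 7 and a 100 % slope; real meters use buffer calibration data.
    return 7.0 - E_mV / nernst_slope_mV(temp_C)

print(f"Slope at 25 degC: {nernst_slope_mV(25.0):.2f} mV/pH")   # ~59.16
print(f"+120 mV -> pH {potential_to_pH(120.0):.2f}")            # ~4.97 (acidic)
print(f"-90 mV  -> pH {potential_to_pH(-90.0):.2f}")            # ~8.52 (alkaline)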



Conductivity, like pH, is measured with a voltmeter. An alternating voltage is applied across the measuring cell
(the sensor, which comprises an anode and a cathode) to determine the conductivity of the electrolyte or, in more
general terms, the dissolved ion content of the sample of interest. A substance’s conductivity depends to a certain
extent on its chemical makeup, including its ionic charge, radius, and mobility, as well as on the properties of the
solvent, and is not entirely predictable mathematically. However, conductivity measurements are particularly use-
ful in detecting impurities, whether in pure water or other sample types.

In recent decades, pH and conductivity meters have been developed to meet different use cases. Whether hand-
held or benchtop, once paired with appropriate sensors they can support a variety of applications and through-
puts, facilitating research and development as well as quality control activities.

4.4.1. Portable Meters

METTLER TOLEDO’s portable FiveGo, SevenGo, and Seven2Go™ instruments enable accurate measurement of
routine samples in the field as well as at line. Simple and ergonomic in design, and requiring only a single hand
to manipulate, they are rugged and drop-resistant, with a long battery life. FiveGo and Seven2Go support mea-
surements of either pH or conductivity, while the SevenGo device is dual-channel.

Figure 21: Portable pH and conductivity meters can be used in the field or at-line. Shown: METTLER TOLEDO's FiveGo.

The StatusLight™ function indicates at a glance whether the meters are ready for measurements or must first
be adjusted; smart buttons let the user toggle through menus and complete readings quickly. Seven2Go is com-
patible with state-of-the-art InLab sensors incorporating Intelligent Sensor Management (ISM®) technology and
adapted to both liquid and solid applications. (ISM technology is described in section 4.6.1.; METTLER TOLEDO’s
sensor portfolio is enlarged on below and can be explored via a filterable tool.)

4.4.2. Benchtop Meters

A series of METTLER TOLEDO benchtop meters aligns with any budget and throughput, and their compact foot-
print allows them to slot into even restricted lab spaces. The FiveEasy™ provides high quality results for either pH
or conductivity measurements at a modest price and can be connected to a printer via USB cable for direct data-
export. The Advanced and Excellence Seven series models work with an expanded portfolio of ISM-enabled pH
and conductivity sensors applicable to additional sample types.

Figure 22: Benchtop pH and conductivity meters can be used in the lab or near-line. At left: the FiveEasy single-channel meter has
a minimal footprint and offers quick and simple measurements. At right: the SevenExcellence dual-channel meter can simultaneously
accommodate both pH and conductivity modules. Supporting user management as well as automation, it is suited to regulated
environments and / or those with higher throughput applications.

Advanced series meters are manufactured to withstand dust and spills, extending their usable lives near or at-line;
a bright color display and shortcut keys let analysts quickly zero in on the most important details. Data is logged
in the EasyDirect software and can be further examined or analyzed using spreadsheet applications.

An ideal addition to higher-throughput laboratories or those subject to audits, the SevenExcellence™ model
includes four levels of user management. Unintentional or unauthorized changes to methods and results can be
prevented, eliminating mixups and undergirding data traceability, and data is stored directly in the meter.

Compliant with GLP, SevenExcellence also offers a user-friendly touchscreen and the launch of pre-programmed
methods with One Click™. Hardcopies of results can be generated via connected or networked printers, and
data can be transferred automatically to the LabX software. The meter’s modular design permits dual-channel
pH and conductivity measurements in laboratories in which more than one parameter is routinely assessed, and
a third functionality – e.g. ion or oxidation-reduction potential analyses – can be incorporated by addition of a
module. The SevenExcellence meter additionally pairs with the InMotion™ Autosampler to enable the measure-
ment of more than 40 samples in sequence without operator intervention.

Figure 23: Connection to an InMotion Autosampler enables automated pH and conductivity readings. Workflows can be driven by the LabX
software, with data and metadata captured uniformly and stored in a secure database. Transfer to a LIMS or eLN is also possible.



4.4.3. Notes on Choosing a pH or Conductivity Meter

Any pH or conductivity meter you select should fulfill your requirements, and it may be worth investing in a higher-end instrument if you expect your throughput or applications to change over the medium term. As general rules of thumb, however, we propose the following:


• If pH and conductivity results must flow to process and enterprise control systems, automatic data transfer
is recommended to prevent transcription errors, increase data security, and improve efficiency. In this case,
meters with flexible (e.g. Ethernet or USB) connection interfaces and suitable software (such as LabX) offer
the best solution.
• When pH and conductivity values will be measured at various places outside the lab, a portable meter is
suggested. Portable meters are protected against water and dust (e.g. METTLER TOLEDO models have an
IP67 rating) and can store data for later read-out.
• Laboratories determining both pH and conductivity – e.g., during wastewater testing – may wish to invest in
a combination meter that supports the measurement of both parameters.

In other cases, a simple benchtop instrument will likely be sufficient for your needs; consultation with a technical
specialist may be a useful means of evaluating your options.

4.5. pH and Conductivity Sensors

An extensive range of sensors is available to support measurements of different sample types. Designed to
accommodate specific substance characteristics, and for use in various environments, selected sensors also
include features such as automatic temperature compensation to reinforce the accuracy of pH as well as con-
ductivity determinations.

pH sensors, as noted above, convert the difference in electrical potential that arises between reference and sam-
ple electrodes upon immersion in a sample of interest into a quantifiable signal. Combination sensors containing
both electrodes are now in common use, although half-cell sensors (i.e. with separate reference and measuring
electrodes) may also be purchased. Conductivity sensors typically include a complete measuring cell with poles
of opposing charge, but are engineered for samples with differing conductivity and / or chemical profiles.

4.5.1. pH Sensors

pH sensors are a form of ion-selective electrode, with the ion in question being protons (H+). The sensing elec-
trode is made of ion-specific glass; the standard design now in use for combination sensors employs a reference
electrode that concentrically surrounds the pH-sensitive one. The space between is filled with reference electro-
lyte, often with a silver / silver chloride–based reference system.

METTLER TOLEDO’s ARGENTHAL™ reference system, used in select pH sensors, is a modified version of this
Ag/AgCl schema. It is supplemented with additional AgCl; a silver ion barrier stops Ag+ from passing into the
electrolyte, such that the ARGENTHAL technology enables the use of standard 3 mol/L KCl as a reference electro-
lyte rather than 3 mol/L KCl saturated with AgCl. This reduces the likelihood that free Ag+ ions in the electrolyte
will react with the sample and precipitate from solution, clogging the junction.



Figure 24: Operating principle of the ARGENTHAL system. An Ag/AgCl cartridge supplies the reference system while remaining separate from the reference electrolyte. (Labeled components: silver wire coated with AgCl; Ag/AgCl cartridge (ARGENTHAL); glass wool; silver-ion–free reference electrolyte; junction (ceramic frit).)

Over the years, pH sensors have been developed to meet a variety of specifications. The type of junction and
material from which the shaft is made (e.g., glass, PEEK, or polysulfone) have been modified to improve
­measurements of formerly challenging samples, as has the shape of the glass membrane, as illustrated below:

• Spherical – for low-temperature samples: resistant to contraction
• Cylindrical – highly sensitive membrane: large surface area, lower resistance
• Spear – for semi-solids and solids: punctures the sample easily
• Flat – for surfaces and droplets: very small pH-membrane contact area
• Micro – for samples in reaction tubes: very narrow electrode shaft

Figure 25: Differently shaped pH membranes improve the quality of measurements for challenging sample types.

The junctions of pH electrodes bridge the two components of the instruments: the pH-sensitive and reference
electrodes. Via the junction, the electrolyte flows into the sample and ensures electrical connection.
• A ceramic frit or similar material – also called a diaphragm – is the main type of junction. Compatible with
most aqueous samples, frits may become clogged by components such as undissolved particles or proteins
that react with the electrolyte.
• Open-hole or movable sleeve junctions can help to avoid clogging. These junctions can easily be cleaned,
reinforcing the reliability and accuracy of results.

Certain types of measurements are likelier to require modified sensors than others, among them those of biologi-
cal samples (biomolecules such as protein, as indicated above, or samples suspended in Tris-based buffers),
as well as samples involving non-aqueous solvents. Biological samples may precipitate from solution upon
exposure to standard electrolyte, clogging the sensing glass and standard ceramic junctions; Tris reacts with Ag+
ions to form a stable complex, again obstructing standard junctions. In each case unstable or erroneous read-
ings are the likely consequence.



Non-aqueous solvents introduce a different type of problem: standard pH sensors are simply designed for read-
ings in aqueous substances. Depending on the solvent, and the concentration thereof, the standard electro-
lyte (3 mol/L KCl) may become insoluble, or water may be stripped from the sensing glass, which depends
on hydration to function. The end result is the same as for biological samples – inaccuracy or instability of
assessments.

While there is no single solution – one sensor that affords high-quality results in every context – understand-
ing how your sensor works and your samples’ properties will help to improve data accuracy. White papers in
METTLER TOLEDO’s Expertise Library describe the sensor features that boost measuring quality for challenging
substances; proper cleaning and maintenance (see section 4.6.) also play a role.

4.5.2. Conductivity Sensors

Choosing the right conductivity sensor is a decisive factor in obtaining accurate and reliable results. Indeed,
a fundamental requirement is that no chemical reactions will occur between the sample and the sensor. For
chemically reactive samples, therefore, glass and platinum are often the most appropriate choice, as they have
the best chemical resistance of all commonly used cell materials. In the field, however, as well as for many
­laboratory applications, the mechanical stability of the sensor is a more important factor; a conductivity sen-
sor with an epoxy body and graphite electrodes is often used, as this has been shown to be both durable and
chemical-resistant. For low reactivity aqueous solutions and organic solvents, the use of cells made of stainless
steel or titanium is often a good alternative.

The next points to consider in selecting the optimal sensor are the cell constant and construction type. A suitable
cell constant correlates with the conductivity of the sample. The lower the expected conductivity of the sample,
the smaller the sensor’s cell constant should be. Figure 26 shows a set of samples and the range of cell constants
recommended for measurements. In deciding between 2-pole and 4-pole cells, the following rule of thumb can
be used: for low conductivity measurements (up to 500 μS/cm), a 2-pole cell is best. For mid-to-high conductivity
measurements a 4-pole cell is preferred, especially for measurements across a wide range of conductivities.

Figure 26: Set of samples and recommended cell constants. Samples range from pure and deionized water (lowest conductivity, around 0.1 μS/cm) through rain and tap water to waste water, sea water, and concentrated acids and bases (highest conductivity, up to about 1,000 mS/cm); the recommended cell constant increases accordingly from 0.1 cm−1 through 0.5 cm−1 to 1.0 cm−1.
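
These rules of thumb can be expressed compactly, as in the sketch below; the numerical thresholds are approximations of the guidance above and in Figure 26, not prescriptive limits.

def suggest_conductivity_cell(expected_uS_per_cm: float) -> str:
    """Rough sensor suggestion based on the expected conductivity of the sample."""
    poles = "2-pole" if expected_uS_per_cm <= 500 else "4-pole"
    if expected_uS_per_cm < 10:          # pure / deionized water range (assumed cut-off)
        k = 0.1
    elif expected_uS_per_cm < 1000:      # e.g. rain or tap water (assumed cut-off)
        k = 0.5
    else:                                # waste water, sea water, concentrated acids / bases
        k = 1.0
    return f"{poles} cell, cell constant ~{k} 1/cm"

print(suggest_conductivity_cell(1.2))      # deionized water
print(suggest_conductivity_cell(350.0))    # tap water
print(suggest_conductivity_cell(50000.0))  # sea water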



4.5.3. Sensor Choice

The full range of METTLER TOLEDO pH and conductivity sensors can be explored in our interactive product guide,
which permits filtering by the search criteria most relevant to your needs and applications. It should be noted that
many standards and norms contain requirements concerning pH or conductivity sensors. For any measurements
performed according to such standards, the sensor employed must completely fulfill the requirements described.

4.6. Sensor Calibration and Maintenance

An important aspect of sensor use is proper care, which includes calibration and maintenance activities.
Calibration against buffers or standards of known pH or conductivity enables the relationship between the value
measured by a sensor and the “true” (specified, or pre-determined) value to be defined. As sensor performance
is affected by a range of factors, among them the buildup of residues from sample measurements, along with
drift, calibration is a useful check that should be undertaken frequently in accordance with the manufacturer’s
instructions and any relevant standards or norms. Consistently poor calibration results may indicate that service
is due or a sensor has failed and requires replacement.

Calibration is typically recommended at least daily for pH sensors. Particularly if the buffers used for a previous
calibration did not bracket the expected pH range of a new sample series, however, it may be advisable to cali-
brate more often. At minimum, a two-point calibration should be performed, but a three-point calibration is rec-
ommended for more sensitive applications. Buffers formulated to specific standards, including DIN/NIST or GB
(Chinese National), are readily available for purchase, and will yield the best results, particularly in environments
subject to regulation or auditing.
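
As an illustration of what the meter does internally during such a calibration, the sketch below derives slope and offset from two invented buffer readings and expresses the slope as a percentage of the theoretical value; any acceptance criterion (for example, a 95–105 % slope window) should be taken from your own SOPs rather than from this example.

pH1, E1_mV = 4.01, 171.0      # invented reading in pH 4.01 buffer
pH2, E2_mV = 7.00, -2.0       # invented reading in pH 7.00 buffer

slope = (E2_mV - E1_mV) / (pH2 - pH1)        # mV per pH unit (negative for glass electrodes)
offset_mV = E2_mV + slope * (7.0 - pH2)      # potential at pH 7 ("zero point")

theoretical = -59.16                         # ideal slope at 25 degC, mV/pH
slope_pct = slope / theoretical * 100

print(f"Slope:  {slope:.2f} mV/pH ({slope_pct:.1f} % of theoretical)")
print(f"Offset: {offset_mV:.1f} mV at pH 7")

def to_pH(E_mV: float) -> float:
    """Convert a sample potential to pH using this two-point calibration."""
    return 7.0 + (E_mV - offset_mV) / slope

print(f"Sample at +35 mV -> pH {to_pH(35.0):.2f}")

The same slope and offset are then applied to every subsequent sample measurement until the next calibration.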

Figure 27: Ready-made pH buffers formulated to specific standards are available for purchase.

Calibration of conductivity sensors is used to determine their cell constants, which is required to calculate con-
ductivity from conductance (see section 4.3.). As conductivity measurements are sensitive to environmental
influences, sensor calibration should be performed prior to the assessment of any sample series. It is crucial to
ensure that the temperatures of the standard and the samples subsequently measured are the same. In addition,
care must be taken to prevent conductivity standards from becoming contaminated or diluted, or from absorbing
carbon dioxide, as these factors will affect the quality of results. Standards formulated to SRM NIST guidelines
are available for purchase and will support the accuracy of measurements.

It should be noted that all calibration activities should be carried out with fresh buffers or standards which are
discarded after use.

Proper sensor storage and cleaning are also critical to the continued accuracy of measurements. The manufac-
turer’s instructions will outline the most appropriate procedures for a sensor of interest; suggestions are also indi-
cated in our pH and Conductivity Theory Guides. It may be useful to consult our series of cleaning posters for tips
on the maintenance of selected types of pH sensors; additional information on sensor care can be found on our
Good Electrochemistry Practice™ (GEP™) page.



4.6.1. Sensor Management Technology

Selected METTLER TOLEDO sensors – particularly those for use with Seven series handheld or benchtop instru-
ments – include Intelligent Sensor Management, or ISM®, technology. This innovative functionality maintains a
record of the sensor’s identification and calibration data, which are carried with the sensor and read upon con-
nection to any meter. This enables sensors’ safe shared use, as the operator is immediately informed whether
calibration is necessary (or may even be prevented from using the sensor before stipulated actions are taken);
exposure to high temperatures is also flagged. Conversely, if a method specifies a ­certain sensor type, no other
will be accepted, ensuring that determinations are properly carried out.

Further details of the ISM technology are available in the pH and Conductivity Theory Guides linked to above,
and in the references.

4.7. Service

pH and conductivity meters and sensors should be assessed by a qualified professional from time to time to ver-
ify their continued good performance. The frequency of service visits will depend to some extent on the environ-
ment; for example, laboratories subject to regulation should define their service intervals with respect to auditing
requirements. As the intensity and conditions of use of both equipment and sensors may affect their performance,
a service professional may be able to recommend the schedule best suited to needs and applications.

Please refer to METTLER TOLEDO’s service options for pH and conductivity instrumentation for more details.

4.8. Resources and Further Reading

Guides:
Essential Knowledge of pH to Measure Correctly from the Start
A Guide to pH Measurement: Theory and Practice of Laboratory pH Applications
A Guide to Conductivity Measurement: Theory and Practice of Conductivity Applications

Application notes:
How to Measure pH in Protein-Containing Samples
How to Measure pH in Tris-Containing Samples
pH of Non-Aqueous Samples: Measurements in Organic Solvents
pH in Food and Beverage Production: Tips and Tricks to Overcome Measurement Challenges (Collection of
­application notes)

Other resources:
Easy Ways to Keep Your pH Electrode Clean and Working Properly (Poster)
Good Electrochemistry Practice™ (GEP™) framework (Best practice guidelines)
pH and conductivity sensor finder

Please also see METTLER TOLEDO’s Expertise Library for a complete list of resources.



5. Analytical Chemistry: Titration Procedures

Titration is a frequently performed analytical chemistry technique, employing chemical reactions to quantitate
substances of interest. Any analyte can in theory be measured as long as it reacts with a titrant at known stoi-
chiometry, but in reality the reaction must be fast, complete, unambiguous, and observable to generate useful
results. Various methods of detection are possible, including changes in pH, color, or electrical potential, the
­formation of a precipitate, and the evolution of energy as heat.

Previously performed manually, with titrant added in a controlled manner via manipulation of the stopcock on
a glass burette, titrations can now be undertaken using semi-automated or automated instruments that include
mechanical (piston-driven) burettes. Together with sensors that convert the details of reaction progression into
a quantitative readout, the newer generation of instruments has enabled analysts to develop equivalence point
(EQP) methods rather than following each reaction to a predetermined endpoint (EP). The EQP reflects stoichio-
metric equivalency between analyte and titrant, while the EP may overshoot this.

It should be noted that the choice of EQP versus EP titrations will often depend on the type of sample and the
nature of the reaction being performed, as described below.

Figure 28: Endpoint (left) and equivalence point titrations (right), plotted as signal E [pH or mV] against titrant volume V [mL]. The former may overshoot the true equivalence point; the latter reflect the stoichiometry of the reaction between analyte and titrant.
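
To make the distinction concrete, the sketch below estimates the equivalence point of a potentiometric titration as the titrant volume at which the change in signal per unit volume is greatest; the (volume, potential) pairs are invented, and commercial titrators apply considerably more sophisticated evaluation procedures than this simple difference quotient.

# Invented (volume mL, potential mV) pairs from an acid-base titration.
curve = [(0.0, 95), (2.0, 110), (4.0, 128), (6.0, 150), (8.0, 185),
         (9.0, 230), (9.5, 310), (10.0, 420), (10.5, 470), (12.0, 500)]

best_dEdV, eqp_mL = 0.0, None
for (v1, e1), (v2, e2) in zip(curve, curve[1:]):
    dEdV = abs(e2 - e1) / (v2 - v1)               # difference quotient between neighboring points
    if dEdV > best_dEdV:
        best_dEdV, eqp_mL = dEdV, (v1 + v2) / 2   # midpoint of the steepest interval

print(f"Estimated equivalence point: {eqp_mL:.2f} mL (max |dE/dV| = {best_dEdV:.0f} mV/mL)")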

In a titration, part of the sample containing the substance to be analyzed (the analyte) is dissolved in a suitable
solvent – often deionized water. A second chemical compound, the titrant, is added as a solution of known con-
centration in a controlled manner until the analyte has reacted quantitatively. The stoichiometry of the reaction
between analyte and titrant is known, as are their molecular weights. From the consumption and concentration
of the titrant as well as the weight of sample used in the analysis, the quantity of analyte initially present in the
sample can be calculated.
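
For example, the back-calculation can be sketched as follows; all figures are invented, and a 1:1 analyte-to-titrant stoichiometry with acetic acid as the analyte is assumed purely for illustration.

V_eq_mL = 8.25          # titrant consumed up to the equivalence point
c_titrant = 0.1         # titrant concentration, mol/L
ratio = 1.0             # mol analyte per mol titrant (from the reaction stoichiometry)
M_analyte = 60.05       # molar mass of the analyte (here: acetic acid), g/mol
m_sample_mg = 1000.0    # sample weight

mmol_analyte = V_eq_mL * c_titrant * ratio              # mL * mol/L = mmol
content_pct = mmol_analyte * M_analyte / m_sample_mg * 100

print(f"Analyte content: {content_pct:.2f} % (w/w)")    # ~4.95 %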

Common Titration Applications


Titrations to detect particular chemical compounds or ionic species are performed routinely in a range of dis-
ciplines. Nitrogen1), sulfur dioxide, water, glucose, and vitamin C (ascorbic acid) may all be determined indi-
vidually; ions such as sodium, chloride, and acids or bases (strong as well as weak) may also be quantitated.
Applications abound not only during the production of chemicals, but also petrochemicals, cosmetics, pharma-
ceuticals, foods and beverages, construction materials, and common household items (e.g. soaps and deter-
gents). Titration methods may be employed in these cases to indicate levels of essential ingredients or contami-
nants, or simply to assist with grading or processing operations.

1) In organic compounds and NH3 or NH4+ (Kjeldahl method).

5.1. Titrators and Sensors

Modern autotitrators come in two basic groups: those that can carry out a single type of analysis, and those suited to more flexible operations. METTLER TOLEDO offers both, with instruments in the EasyPlus™ series handling single categories of general titration, and dedicated titrators for water determination via the Karl Fischer method available in the EasyPlus as well as Compact (Advanced) and Excellence lines (see below).

Multifunctional titrators enable more than one type of analysis by permitting the replacement of burettes and sensors. METTLER TOLEDO's simplest multifunctional titrator is the Easy Pro; an EasyPlus instrument, it supports a limited number of (pre-installed) possible methods, but is compatible with sensors for acid-base, redox, and precipitation titrations. A broader range of applications is possible with the Compact and Excellence series, which also connect to automation units for higher throughput.

A major benefit of all autotitrators is the possibility of method standardization, which improves the accuracy and reproducibility of results. While EasyPlus instruments follow the same basic order of operations, permitting only minor alteration of parameters, Compact and Excellence titrators offer customizable methods. These can be saved as shortcuts, or stored centrally in the LabX laboratory software. In the latter case, SOPs can be pushed to all, or designated, networked instruments; at the titrator, an analyst can simply tap a shortcut button to start the method with One Click once beakers are loaded on the sample-changer rack. The software then drives the automated workflow, managing operations and storing data securely in its database. Connected to an automation unit such as the InMotion Autosampler, more than 100 titrations can be performed in sequence following the same protocol, further reinforcing repeatability.

Figure 29: Autotitrators are a fast and accurate substitute for manual titration. Shown from top to bottom are METTLER TOLEDO's Easy Pro, Compact, and Excellence series. Easy Pro titrators support a limited range of applications, while Compact and Excellence offer a more flexible experience and can connect to sample changers to process multiple samples in sequence.

Figure 30: An Excellence titrator and InMotion Autosampler. Method shortcuts can be launched with One Click on Excellence instruments,
standardizing results for sample series.

Myriad sensors have been developed for different types of titrations. Engineered for specific conditions of analysis
as well as sample types, the sensors offer features such as temperature control, suitability for aqueous or non-
aqueous solvents, and easy cleaning. In addition, Plug and Play sensors – which carry their identifying informa-
tion with them to be read upon connection to a titrator – augment the safety of titration protocols, as they encode
the details of recent calibrations and their period of validity, preventing their use when out of spec.

pH and oxidation-reduction potential (ORP or redox) sensors function by detecting differences between a reference
and a sensing electrode (now normally contained within one device). These differences, electrical or electrochemi-
cal in nature, arise during the course of a reaction. Depending on the reaction in play, the composition of the elec-
trodes, and the reference electrolyte, they reflect the flow of current (electrons) or specific ions, and are determined
according to the Nernst equation.

Certain sensors – photometric and thermometric, for example – monitor changes relative to a baseline (estab-
lished immediately prior to the reaction), and do not contain a reference element. In the case of the former, a
calibration curve for the titrant should usually be determined against a primary standard in order to calculate
measurement results accurately; in the case of the latter, routine testing is generally performed weekly to ensure
continued accuracy.

Figure 31: Photometric (left) and thermometric (right) sensors monitor the progress of a titration reaction by detecting alterations in light
transmission and temperature, respectively.



Sensor design depends on the reaction parameters under assessment, as mentioned above. For example, pH
sensors are used for acid-base titrations, and oxidation-reduction sensors for redox titrations. The former rely on
H+-sensitive glass to detect acid strength, with the latter measuring electron transfer to or from a platinum element.
Photometric sensors often detect light transmittance or absorbance at a defined wavelength, making them suited
to monitoring changes in color or turbidity.

In addition to sensor choice, the success of a titration reaction is contingent on sample preparation. In the
­simplest case, a liquid sample may be measured as is, or following dilution in an appropriate solvent; however,
solid or viscous samples may require additional treatment. This may include resuspension, milling or grinding,
digestion, or a range of other procedures, which must be performed consistently in order to reduce the risk of
errors. Standardized protocols should be developed for each case to ensure that results are interpretable, and
that replicate experiments are comparable.

METTLER TOLEDO’s SmartSample™ assists with sample tracking throughout the titration process. SmartSample
is based on radiofrequency identification (RFID) technology. Tags attached either to sample beakers or remov-
able beaker sleeves can capture weights at an Excellence balance and transfer them to an Excellence titrator.
Additional information, such as the correct titration method, can be selected at the balance or appended via
the SmartCodes™ function of LabX, preventing mix-ups. Cumulatively these features improve the processing of
sample series and diminish the incidence of transcription errors, reinforcing the overall accuracy and reliability
of results.

Figure 32: SmartSample technology leverages RFID tags to track titration samples from balance to analysis. Sample weights are captured
at the balance and transferred to the titrator, facilitating calculations.

A summary of titration theory can be found in METTLER TOLEDO’s “Basics of Titration” guide, along with details
of sensor choice for various methods and sample types. For more information on titration automation, please
refer to the “Titration Automation Guide: Opportunities and Benefits”. Titration applications, which include details
of sample preparation, may be browsed in the AnaChem Application Library.

5.2. Automated Titrators: Measuring Principle

Automated titrators follow a defined sequence of operations (the titration cycle), which is performed until the
titration ends. This sequence is basically the same for all models and brands, and substitutes automated titrant
addition and signal detection for their manual alternatives (typically a glass burette and visual observation)
within any method.



Figure 33: Schematic representation of the fundamental steps in a titrator method (the titration cycle): 1. titrant addition (piston-driven burette, 1, 5, 10, or 20 mL), 2. titrant reaction in the titration beaker (stirring, sample dilution / dissolution, chemical reaction), 3. signal monitoring via sensors (detection principles: potentiometric, photometric, voltammetric, amperometric, turbidimetric, conductometric, or thermometric), and 4. evaluation by the autotitrator / processor (titration control, data evaluation, calculation of results, data storage).
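
Conceptually, the cycle amounts to a dose-measure-evaluate control loop, sketched schematically below in Python; the simulated sensor response, dosing increment, and stop criterion are placeholders only, and real instruments use adaptive dosing and far more robust evaluation.

import math

def read_potential_mV(volume_mL: float) -> float:
    """Stand-in for the sensor: an idealized S-shaped curve with its jump near 9.8 mL."""
    return 250.0 + 200.0 * math.tanh(2.0 * (volume_mL - 9.8))

increment_mL, max_volume_mL = 0.1, 20.0
volume = 0.0
E_start = last_E = read_potential_mV(volume)
readings = [(volume, last_E)]

while volume < max_volume_mL:
    volume += increment_mL                     # 1. titrant addition
    E = read_potential_mV(volume)              # 2. reaction + 3. signal monitoring
    readings.append((volume, E))
    past_jump = E - E_start > 150.0            # 4. evaluation (placeholder criteria)
    flattened = abs(E - last_E) < 1.0
    if past_jump and flattened:
        break
    last_E = E

print(f"Titration stopped at {volume:.1f} mL after {len(readings)} measuring points.")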

5.3. Tips and Hints for Titrations

The guidance below indicates how best to achieve accurate and reproducible results in common titration appli-
cations, following the four major components of the titration cycle: the burette, beaker, sensor, and autotitrator
of Figure 33.

Burette
• Select the appropriate burette size. Ideally, the titration will be completed with titrant consumption of 20–80%
of the burette’s nominal volume.
• When newly filling the burette, flush it sufficiently (with titrant) to ensure complete filling with the fresh titrant.
• Avoid air bubbles, degassing the titrant if necessary. If air bubbles remain in the glass cylinder after filling,
empty the burette completely and refill it.
• Clean the burette periodically, rinsing with deionized water.
• Treat the piston seal carefully, avoiding scratches and damage to prevent leaks.
• Store the burette appropriately. Burettes that will not be used for some time should be emptied, as should all
connected tubing, and carefully rinsed with deionized water.
• For the most accurate determinations, ensure your titrant is standardized using known methods and procedures.
Further guidance can be found in the Basics of Titration guide.
• For Karl Fischer water determinations, ensure you calibrate using traceable water standards according to your
method requirements.

Beaker
• Only use clean beakers.
• Ensure vigorous stirring while avoiding splashing and vortex formation. If the sample is sensitive to ambient
conditions or the titration reaction is very slow, apply gentle stirring only (lids may be placed over titration
beakers to prevent sample alteration before measurement).
• Set sufficient stirring time. The sample should dissolve completely before titration starts.
• Rinse the stirrer and electrode thoroughly after each sample to avoid carry-over. If necessary, apply a condi-
tioning step (30–60 seconds in a suitable solvent).



Sensor
• Ensure that the sensor is properly immersed in the sample. When using combined pH electrodes, the junction
should dip into the sample for safe and stable readings.
• Calibrate and adjust the sensor periodically in accordance with your SOPs or the manufacturer’s
recommendations.
• For acid-base endpoint titrations, the electrode should be calibrated carefully since it directly influences the
result. Calibration is less critical for equivalence point titrations, which rely mainly on changes in potential to
determine results, rather than the absolute potential value.
• Sensors should be appropriately conditioned prior to use (see the manufacturer’s recommendations). This
applies particularly to ion-selective electrodes (ISEs) or to pH electrodes used in non-aqueous solvents.
All sensors should be maintained regularly to ensure their continued good performance, as indicated, e.g.,
in METTLER TOLEDO’s Good Titration Practice™ (GTP®) framework and guide.

Autotitrator
• Activate user management. User management ensures that only qualified operators have access to specified
titrators and / or assigned methods.
• Build methods to include as many work steps as feasible to simplify tasks, ensure correct execution, and
increase repeatability.
• Run several test sample series before establishing method SOPs. For more details see Application Brochure 16,
“Validation of Titration Methods”.
• Decide which data parameters you need to capture (table of measuring values, results, equipment data, etc.).
Delete the others. Print reports if required. Store data electronically in a lab database where possible so as to
facilitate comparison to other results, as well as further analyses.

5.4. Routine Testing

The most effective way to guarantee accurate and reproducible titration data over the long term is to implement a
regular maintenance, service, and checking schedule for your instrumentation. This will include not only periodic
visits from a service professional (see below), but sensor cleaning and calibration with a standard or buffer, as
well as determination of the titer.

Routine maintenance may vary depending on the sensor type and the nature and frequency of the workflows
in which it is used. For example, pH and redox sensors should have their electrolyte refilled; sensors used with
­viscous samples, or those containing particulates, may need cleaning in a mild detergent or other solution
(details will be provided with the manufacturer’s instructions). In addition, metal-ring electrodes should be pol-
ished to prevent oxidation from affecting the results. As indicated above, sensor conditioning prior to use is also
essential to proper functioning, and long-term storage must align with the manufacturer’s recommendations to
prevent damage from occurring.

Titer determination – the procedure by which the exact concentration of the titrant is measured – is additionally
crucial to an understanding of reaction stoichiometry. Titrants stored at ambient conditions may alter with time,
absorbing moisture from the atmosphere or experiencing temperature-dependent changes. Their titer, or more
precisely, the moles of reactant available to react with the analyte, should therefore be determined against a pri-
mary standard from time to time (as often as daily for regulated environments, though the frequency will depend
on the applications and use cases, as well as the accuracy of results demanded). Detailed methods for a range
of titration protocols are provided in METTLER TOLEDO’s AnaChem Application Library (linked to above and in the
references).
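
As a minimal illustration of the arithmetic behind a titer determination, the short Python sketch below computes the titer (correction factor) for a nominally 0.1 mol/L NaOH titrant standardized against potassium hydrogen phthalate, a common primary standard. All numeric inputs are hypothetical example values, not values prescribed by this guide.

# Illustrative titer calculation for a nominally 0.1 mol/L NaOH titrant
# standardized against potassium hydrogen phthalate (KHP, 1:1 reaction).
# All numeric inputs are hypothetical example values.
M_KHP = 204.22          # molar mass of KHP in g/mol
m_standard = 0.4085     # weighed mass of KHP in g
V_titrant = 19.95e-3    # titrant volume consumed at the equivalence point, in L
c_nominal = 0.1         # nominal titrant concentration in mol/L

n_KHP = m_standard / M_KHP       # moles of primary standard reacted
c_actual = n_KHP / V_titrant     # actual titrant concentration in mol/L
titer = c_actual / c_nominal     # dimensionless correction factor

print(f"Actual concentration: {c_actual:.4f} mol/L, titer: {titer:.4f}")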

Please refer to “Best Practice for Titration Sensors: GTP for Reliable Titration Results” for additional information on
sensor care, storage, calibration, and more, as well as the Basics of Titration guide for further details. For details
of how to clean a titrator, please consult our poster.

5.5. Service Options

METTLER TOLEDO offers various service and installation options for titrators, including support in selecting the
instrument that best meets your needs. Details of the packages available can be found on our website, and can
be discussed with a technical specialist.

5.6. Resources and Further Reading

Guides:
ABC of Titration in Theory and Practice
Basics of Titration
Best Practice for Titration Sensors: GTP for Reliable Titration Results
Titration Automation Guide: Opportunities and Benefits

Other resources:
Validation of Titration Methods: A Guideline for Customers (Brochure)
AnaChem Application Library (Application notes)
Easy Ways to Avoid Contact with Chemicals and Keep Your Titrator Clean (Poster)
Good Titration Practice™ (GTP®) framework (Best practice guidelines)

6. Analytical Chemistry: Ultraviolet-Visible Spectroscopy

Almost all substances absorb and emit energy in characteristic and measurable ways. This property can be
employed to determine information such as their identity, concentration, and purity by the application of a suitable
technique: for example, UV/Vis spectroscopy. The method, which measures how much energy a chemical sub-
stance absorbs at a particular wavelength on the ultraviolet-visible spectrum, is a simple, fast, and inexpensive
assay used routinely in many industries, as well as in life- and physical-science research.

6.1. Measuring Principle

UV/Vis spectroscopy is performed using a spectrophotometer, an instrument comprising a light source, a sample
holder, a dispersive device to separate the different wavelengths of the light (e.g. a monochromator), and a suitable
detector. The basic principle of its operation is diagrammed below (see Figure 34); light of a known wavelength is
passed through a sample of interest (placed in the sample holder), and then captured by a detector. A reference
sample – usually a solvent blank – is measured as part of the same series. The intensity of the light that passes
through the blank into the detector (I0) is compared to that of the light passing through the sample (I), and these
values are used to calculate the transmittance, T:

T = I / I0

Transmittance is often reported as a percentage (%T), and can be used to calculate the absorbance, the negative
logarithm of T:

A = −log(T)
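
As a minimal numerical illustration of these two relationships (with made-up intensity readings), transmittance and absorbance can be computed in a short Python sketch:

import math

I_blank = 52_000    # detector counts through the blank (I0; hypothetical value)
I_sample = 13_000   # detector counts through the sample (I; hypothetical value)

T = I_sample / I_blank    # transmittance
percent_T = 100 * T       # transmittance as a percentage (%T)
A = -math.log10(T)        # absorbance (base-10 logarithm)

print(f"T = {T:.3f} ({percent_T:.1f} %T), A = {A:.3f}")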

METTLER TOLEDO’s UV/VIS Excellence spectrophotometers combine a long-lasting xenon flash lamp with a
state-of-the art array detector, enabling very fast complete spectrum scans of each sample for high accuracy
and repeatability.
Figure 34: Instrument design of an array spectrophotometer (xenon discharge lamp, entrance slit, cuvette, concave reflection grating, CCD sensor, CPU, and touchscreen / terminal).

According to the Lambert-Beer Law (also known as the Beer-Lambert Law), the absorbance of a sample is directly
proportional to its concentration (c), the pathlength (d), and the absorbance coefficient, epsilon (ε; also called the
extinction coefficient):

A = ε ∙ c ∙ d

This enables determination of the concentration when d and ε are known.
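
Rearranging the Lambert-Beer Law gives c = A / (ε ∙ d). The short Python sketch below applies this rearrangement; the absorbance coefficient and reading used are hypothetical example values only.

# Concentration from the Lambert-Beer Law: c = A / (epsilon * d).
# The numbers below are hypothetical example values.
A = 0.75           # measured, blank-corrected absorbance
epsilon = 6220.0   # absorbance coefficient in L/(mol*cm), example value
d = 1.0            # cuvette pathlength in cm

c = A / (epsilon * d)      # concentration in mol/L
print(f"c = {c:.2e} mol/L")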

Figure 35: The factors influencing sample absorbance according to the Lambert-Beer Law. Shown: light intensity (I), concentration (c), and pathlength (d).

As each molecule absorbs light at specific wavelengths, the UV/Vis spectrum can also be used for qualitative
analysis, to identify a substance as well as its purity. Examples include the four bases of DNA – adenine, guanine,
thymine, and cytosine – which exhibit slightly different absorbance spectra around 260 nm. Likewise, vitamin
B12 can be qualitatively identified by examining absorbance at 278, 361, and 550 nm, as well as the ratio of the
absorbances at the two latter wavelengths.

Figure 36: The UV/Vis spectrum of vitamin B12 shows characteristic peaks at 278, 361, and 550 nm.
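
Qualitative checks of this kind usually compare the measured absorbances, or their ratios, against an expected window. The Python sketch below illustrates the idea only; the acceptance limits shown are placeholders rather than pharmacopoeial values, and the relevant monograph should be consulted for the ratios that apply to a given method.

# Illustrative identity check based on an absorbance ratio.
# The readings and the acceptance window are placeholder values only.
A_361 = 0.62   # hypothetical absorbance at 361 nm
A_550 = 0.19   # hypothetical absorbance at 550 nm

ratio = A_361 / A_550
low, high = 3.0, 3.6   # placeholder acceptance window for A361 / A550

result = "pass" if low <= ratio <= high else "fail"
print(f"A361/A550 = {ratio:.2f} -> {result}")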

For more information on the theory of UV/Vis spectrophotometry, please consult our “Fundamentals and
Applications” guide.

6.2. Instruments

6.2.1. Cuvette Spectrophotometers


The classical spectrophotometer design accommodates samples in cuvettes (see section 6.3.). Usually made of
glass, plastic, or quartz, these receptacles accept aliquots ranging from tens of microliters to a milliliter or more.
Although capable of generating highly accurate results, cuvette-based spectrophotometers have a comparatively long pathlength, fixed by the cuvette itself. According to the Lambert-Beer Law, concentrated samples must therefore often be diluted to keep absorbance within the measurable range.
As errors may originate during either the dilution or the calculation operations, cuvette-based spectrophotometers
may not be preferred in all cases. For example, labs may opt for a micro-volume instrument, described below, for
work with biological macromolecules, which may be highly concentrated after their purification (e.g. following a
plasmid prep or protein prep).

METTLER TOLEDO offers three cuvette-based UV/VIS Excellence spectrophotometers. The UV5, UV5Bio, and UV7
models all provide highly accurate and precise results; the UV5Bio instrument comes preloaded with methods for common life-science assays, including nucleic-acid, protein, and OD600 measurements. Each spectro-
photometer is compatible with the CuvetteChanger, an eight-position cuvette-holder for small-scale automation;
CuveT, a thermostating unit, can be placed atop these models for temperature-sensitive or kinetic studies.

Figure 37: Cuvette spectrophotometers such as the UV7 Excellence (shown) provide highly accurate and precise results. Concentrated samples may require dilution due to the cuvette's longer pathlength, as indicated by the Lambert-Beer Law.

6.2.2. Micro-Volume Measurements

Micro-volume instruments, which usually have a measuring surface that includes an optical cell, are also now
widely available. Requiring only a microliter or two of sample, these devices are of particular interest in the life
sciences for analyses of biomolecules. The reduced pathlength traversed by the light permits the assessment of
higher concentrations in the instrument’s recommended absorbance range, obviating the need for dilution steps.
The sample is applied directly to the optical glass of the measuring platform, and results are calculated immedi-
ately upon completion of a reading.

METTLER TOLEDO’s UV5Nano micro-volume spectrophotometer incorporates the proprietary LockPath™ technol-
ogy, which effects the measurement of a sample solution at two precisely defined pathlengths for high accuracy.
The switch between pathlengths occurs automatically, expanding the concentration range over which absorbance
is detectable without user intervention. A cuvette holder adds to the instrument’s usefulness, permitting operators
to select their preferred option for assessment.
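
Conceptually, automatic pathlength switching means falling back to the shorter pathlength whenever the absorbance measured at the long pathlength exceeds the range in which the instrument reads reliably. The sketch below illustrates that general logic only; it is not a description of the LockPath™ implementation, and the threshold is a hypothetical value.

# Conceptual sketch of automatic pathlength selection (not the actual
# LockPath(TM) algorithm); the absorbance threshold is a hypothetical value.
def select_pathlength_mm(absorbance_at_1mm, upper_limit=1.5):
    """Return the pathlength (in mm) to use for the final reading."""
    if absorbance_at_1mm > upper_limit:
        return 0.1   # switch to the short pathlength for concentrated samples
    return 1.0       # otherwise keep the long pathlength

print(select_pathlength_mm(2.3))   # -> 0.1
print(select_pathlength_mm(0.8))   # -> 1.0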

Figure 38: Only a microliter or two of sample is necessary for micro-volume spectroscopic measurements. Due to the shorter pathlength,
concentrated samples can be evaluated without dilution. Shown: METTLER TOLEDO’s UV5Nano.

Figure 39: Schematic drawing of the micro-volume platform on METTLER TOLEDO's UV5Nano spectrophotometer, showing the LockPath™ pathlength adjustment (manual or automatic) between the two pathlengths of the sample drop (0.1 or 1 mm). The polychromatic light beam is reflected by a mirror and partially absorbed after passing twice through the sample droplet.

6.2.3. Workflow Security and Automation

As well as supporting single or lower-throughput readings, the UV7 Excellence spectrophotometer offers automa-
tion possibilities. It can be connected to an InMotion Autosampler to enable measurements of up to 300 samples
in series. Driven by the LabX software, these workflows may also include the archiving of complete data and
metadata in LabX’s database following the capture of each measurement. With integrated calculations, an opera-
tor can be informed immediately if samples are beyond tolerance limits or exhibit concerning behavior.

Figure 40: The UV7 Excellence spectrophotometer supports high-throughput measurements when connected to an InMotion Autosampler.

The Fiber Probe adapter further enhances automation capabilities by enabling samples to be assessed in situ, with no need for transfer to a measuring cell. Compatible with fiber-optic probes with standard adapters, it permits spectroscopic
assessments at pathlengths from 1 to 20 mm.

Complete, multi-step, secured workflows may also be enabled by the integration of various analytical instru-
ments (such as an analytical balance and UV/Vis spectrophotometer) into a software-supported workstation.
As an example, METTLER TOLEDO’s LabX software permits the linking of an XPR 205 analytical balance to a
UV7 Excellence spectrophotometer for gravimetric sample dilution, with subsequent measurement of the sam-
ples’ absorbance values. The entire procedure, encompassing all experimental stages from sample preparation
to measurement, is defined by SOP and translated into corresponding LabX methods. User guidance is provided
directly at the instrument terminal, where each step is displayed as a prompt in accordance with the pre-estab-
lished sequence of actions. The data generated is easily comparable between replicates.

LabX fully supports FDA regulation requirements for electronic data management as well as ALCOA+ principles.
Its database-anchored setup is superior to file-based, paper-based, or hybrid data management approaches,
as results are easily accessible and the risk of deletion or misplacement is reduced to a minimum.

It should be noted that UV/VIS Excellence instruments can also be operated without the LabX software; results
can be exported in spreadsheet format via a TCP/IP networked connection, or transferred directly to a USB drive.
Laboratories concerned with data integrity should, however, use LabX.

6.3. Tips and Hints for UV/Vis Spectroscopy

6.3.1. Cuvette Selection and Handling


• Pathlength: According to the Lambert-Beer Law, the absorbance is directly proportional to the cuvette’s path-
length and sample concentration. Selection of an ideal pathlength (from 2 mm to 5 cm) can eliminate the need
for dilution, which may introduce errors. Absorbance in the range of 0.2 to 1.5 absorbance units (A) generates
the most accurate results. For highly concentrated samples, a micro-volume instrument with a very short path-
length (e.g. 1 mm or 0.1 mm, as described above) is ideal.
• Transmittance range: High precision fused quartz cells (QS grade) are recommended for analyses in the UV
range of the spectrum. For the visible range (>400 nm), disposable PMMA or PS cuvettes are frequently used.
It should be noted that plastic cuvettes may be of varying quality, which may in turn affect the quality of mea-
surements; optical glass cuvettes are a more reliable substitute. Matched pairs of cuvettes will provide the
best results.
• Cuvette placement: Position the cuvette with the transparent side in the light beam and the cuvette label point-
ing in the same direction for both blank and sample.
• Cuvette handling: Cuvettes are best handled with latex or polymer (e.g. nitrile) gloves to reduce the incidence
of fingerprints. Glass or quartz cuvettes should be held only on the frosted sides; cuvettes should also be
cleaned thoroughly inside and out following use or when dirty. For thorough inner and outer cleaning, use a
60% isopropanol / water solution and wipe with an optical cleaning cloth or lint-free tissue.
• Filling: Avoid filling cuvettes with glass Pasteur pipettes, as they may scratch the optical surface. A laboratory
pipette or plastic transfer pipette is suitable.

Figure 41: High quality reusable cuvettes afford accurate and precise UV/Vis measurements, particularly when matched pairs are used.

6.3.2. Solvent Selection

• Transmittance range: The solvent should not absorb light in the region of the spectrum over which the samples dissolved in it are measured.
Solvent | Pure water | Hexane | EtOH | MeOH | DMSO | Acetone
λ cutoff [nm] | <197 | 199 | 207 | 210 | 270 | 331

Table 1: Typical solvents and their cutoff values, indicating the minimum wavelength at which each solvent can still be used.

• Concentration: Adjust sample concentrations as needed to enable measurements (see section on pathlength
selection above). Each sample should be in linear range for the instrument.
• Side reactions: Select solvents so as to avoid side reactions between analyte and solvent molecules.
For example, polar solvents (i.e. water, ketones, alcohols, etc.) used to dissolve or suspend polar samples
can influence the energy distribution of the electrons in the absorbing chromophore, thus lowering the
spectral resolution.

6.3.3. Blank Correction

• Blank preparation: The blank cuvette must contain fresh solvent of the same type used to dissolve or resuspend the sample. Importantly, the blank and sample should have similar optical properties to prevent measuring errors.
• Measurement series: Cuvette changers offer a secure and efficient way to measure a blank cuvette prior to
each sample series or set of readings, as frequent blanking helps to ensure the continued quality of spectros-
copy data. The firmware of UV/VIS Excellence instruments automatically subtracts blank values from sample
data as a step in an integrated workflow.
• Background correction: Background correction subtracts an absorbance value measured at a wavelength
where the analyte does not absorb from that at the wavelength of interest. For example, background correction
is commonly applied at 320 or 340 nm for direct DNA and protein measurements in the UV range, as these
molecules do not absorb light around these wavelengths. Measuring and correcting the background removes
any possible interference due to light scattering by particles, air bubbles, or precipitates in the sample (a short calculation sketch follows below).
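
The arithmetic is a straightforward subtraction, as sketched below in Python with hypothetical readings; the dsDNA conversion factor of about 50 ng/µL per absorbance unit is the commonly used approximation for a 1 cm pathlength.

# Background-corrected absorbance for a nucleic-acid reading.
# All readings are hypothetical example values.
A_260 = 0.842   # absorbance at the analytical wavelength (260 nm for DNA)
A_320 = 0.031   # absorbance at a wavelength where DNA does not absorb

A_corrected = A_260 - A_320
print(f"Background-corrected A260 = {A_corrected:.3f}")

# dsDNA concentration is often estimated as ~50 ng/uL per corrected A260
# (for a 1 cm pathlength).
print(f"Approx. dsDNA concentration: {50 * A_corrected:.0f} ng/uL")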

6.3.4. Effects of pH, Temperature, and Ambient Air

• pH: The effects of pH on absorbance spectra can be very large if the conjugate acid or base has a color different from that of the unprotonated compound (as with pH indicators such as phenolphthalein and methyl red / orange).
If the spectrum of the sample under study is found to be affected by pH, a buffer should be used as a control.
• Temperature: Certain compounds, including organic solvents, will experience temperature-dependent volume
expansion that may change the apparent absorbance. Samples or solvents of this kind should be measured
using a thermostatic sample holder.
• Cap: In order to limit solvent evaporation, or water absorption from ambient air in case of hygroscopic samples,
cover the cuvettes. Often a cap is provided for a tight seal.

6.3.5. Micro-Volume Measurements

• Cleaning: The optical portions of the micro-volume platform, including the window on the lower side and mirror
on the upper side, must always be perfectly clean to achieve accurate and repeatable results. To prevent previ-
ous samples from affecting readings, clean the window and mirror twice with distilled water both before and
after each set of measurements (as well as between each measurement of a series). Distilled water, analytical
grade isopropanol / ethanol or a specific cleaning agent for cuvettes (Hellmanex III) are all suitable and should
be applied via lab tissue or non-scratching optical wipes.
• Pipetting technique: Pipette the sample smoothly onto the micro-volume platform to prevent the formation of air
bubbles. Use a sufficient quantity of sample to fill the area between the upper and lower platform. The sample
should form a rounded droplet, and cleaning may be warranted if it does not.

Figure 42: When applying a sample to a micro-volume platform, it should form a round droplet. The platform may need to be cleaned
if it does not.

6.4. Performance Verification – General Aspects

Instrument performance is the main factor directly affecting measurements’ accuracy and repeatability. For criti-
cal UV/Vis measurements, especially in clinical, pharmaceutical, or industrial quality control, it is crucial that
the instrument should perform according to specifications. In laboratories following pharmacopoeial guidance,
instrument performance should be monitored regularly, with documentary evidence. Furthermore, this is manda-
tory for laboratories that offer accredited measurement services (e.g. in accordance with ISO 17025).

Validation is also a key requirement for laboratories working according to GLP. The following sections describe
the requirements of the principal regulations, as well as procedures for measuring key instrument performance
parameters.

6.4.1. Regulatory Requirements

Quality requirements have become more stringent in recent years; some industries even demand the regular moni-
toring and documentation of instrument performance. Various regulatory bodies describe performance verification
tests for UV/Vis spectrophotometers, particularly the major pharmacopoeias, which both stipulate qualification
verification for spectrophotometers and indicate the requisite tests. They also recommend the use of certified refer-
ence materials (CRMs) from accredited suppliers. Instruments qualified according to such guidance will meet the
demands for the analysis of pharmaceutical chemicals presented in pharmacopoeial monographs.

6.4.2. UV/Vis Performance Verification According to the US and European Pharmacopoeias
Performance verification tests assess the quality of the measurements made by a given instrument. For UV/Vis
measurements specifically, the instrument’s capacity to determine absorbance across the spectrum must be
verified. The tests must guarantee:
• That the wavelength positions (x axis) are correct (wavelength accuracy) and stable (wavelength repeatability),
• That the intensities, absorbances, or transmittances (y axis) are correctly measured (photometric accuracy,
­linearity, stray light, and resolution) and stable (photometric repeatability).

Instrument Self-Tests
Complete performance verification, required for a spectrophotometer’s compliance with the pharmacopoeias,
is normally carried out at specified intervals. These intervals are determined according to the stability of the
instrument. To ensure that any deviations in performance occurring between verifications will be detected, the
instrument should undertake self-tests. These are routine analyses that can be run daily, and should include
­electronic and optical operation checks of the spectrophotometer, as well as wavelength accuracy determinations
with the xenon line to verify the functioning of the spectrophotometer’s light source. Cumulatively, the drift (for
­stability), wavelength accuracy and repeatability, photometric noise, and baseline flatness should be assessed.

Certified Reference Materials for UV/Vis Spectrophotometry


CRMs or calibration standards are designed to comply with the requirements of the major pharmacopoeias
(Ph. Eur., USP). They allow instrument qualification procedures to be carried out in accordance with regula-
tory requirements and ensure that the instrument fulfills quality demands. The materials are certified as to their
absorbance values at a number of substance-specific wavelengths in the UV and visible spectral regions.

Performance Verification Tests with Automated Modules
METTLER TOLEDO’s CertiRef™ and LinSet™ automated performance verification modules are pharmacopoeia-
compliant accessories that allow automatic calibration and performance tests of the UV/VIS Excellence spectro-
photometers. Modules exist to meet both USP and Ph. Eur. verification guidelines. However, it should be noted
that, while both LinSet and CertiRef modules exist for the UV7 Excellence model, the UV5 and UV5Bio UV/VIS
Excellence spectrophotometers can only be tested via CertiRef.

Figure 43: The CertiRef performance verification module contains pharmacopoeia-compliant certified reference materials and stores its own expected values for comparison to those generated during a qualification procedure. Running this automated procedure indicates whether an instrument is functioning within tolerances. Shown: the inside of the module.

6.5. Measurement Uncertainty Considerations

A sample’s UV/Vis absorption spectrum is a series of points, each defined by wavelength and absorbance
(or transmittance) values. Both values are characterized by a certain accuracy and uncertainty, which arise
from the:
• Optical and electronic resolution of the detector,
• Accuracy of the wavelength adjustment,
• Conditions under which the sample is measured (cuvettes, ambient environment, lamp temperature, etc.).

All these measurement uncertainties are included in the specifications listed in the data sheet of METTLER
TOLEDO’s UV/VIS Excellence instruments. The specifications in the data sheets are based on the average and
standard deviation of repeated measurements performed with CRMs in accordance with guidance from the USP
and Ph. Eur. (see the sections above).
• Photometric accuracy: ±0.005 A (≤1 A) for UV5, UV5Bio, and UV7 Excellence instruments; ±0.006 A (≤1 A)
for UV5Nano.
• Wavelength accuracy: ±0.5 nm for UV7 and ±0.9 nm for the other instruments.
• Photometric repeatability: 0.002 A (≤1 A) in the UV range for the UV5, UV5Bio, and UV7 instruments; 0.003 A
(≤1 A) in the visible range for all UV/VIS Excellence instruments.
• Wavelength repeatability: <0.08 nm for UV7 and <0.15 nm for the UV5, UV5Bio, and UV5Nano instruments.

These details should be taken into account as necessary during measurements, or when tolerances are set.

6.6. Spectrophotometer Care and Service

To ensure the continued performance of spectrophotometers and accessories, they should be kept clean and / or
stored properly when not in use. Spills and samples pipetted onto a micro-volume pedestal should be wiped
away gently with a water-dampened, lint-free cloth; reusable cuvettes should be appropriately cleaned and dried,
and then placed in their case to prevent scratches. More details can be found in our Easy Cleaning poster series.

Spectrophotometer service, including preventive maintenance, is likewise essential to their functioning, and should
be undertaken regularly by a qualified professional. Laboratories should define the most appropriate schedule
and options for their needs, as these will differ based on the instrument’s frequency and conditions of use, as well
as on any regulatory guidance. METTLER TOLEDO offers service options ranging from professional installation to
repair; a consultation is recommended to uncover the best schema for every circumstance.

6.7. Resources and Further Reading

Guide:
UV/Vis Spectrophotometry: Fundamentals and Applications

Other resources:
How Should UV Vis Labs Do Spectrometer Calibration (White paper)
Stray Light and Performance Verification (White paper)
Pharmacopeia Changes – What to Be Checked on a UV Vis Spectrophotometer (On-Demand Webinar)
Easy Cleaning: Tips and Tricks for the UV5Nano (Poster)
Good UV/Vis Practice™ (GUVP™) framework (Best practice guidelines)

7. Analytical Chemistry: Density

Density (denoted d or ρ; may also be reported as specific gravity) is a characteristic property of any material
that can be used for its identification. Defined as an object or substance’s mass divided by its volume according
to the equation below, it is usually reported in g/cm3 or kg/m3.

d = m / V

As the volume of the substance is temperature-dependent, its density is as well. At higher temperatures, the
volume increases and therefore the density decreases. In consequence, density is always stated along with the
temperature at which it is measured, typically 20 °C.

d20 (water) = 0.99823 g/cm3; d40 (water) = 0.99224 g/cm3

It should be noted that the density of a gas is dependent not only on temperature, but also the pressure to which
it is subjected.

7.1. Measuring Principle

A simple way to measure density is to use a balance and a pycnometer – which is a glass receptacle of defined
volume. It is weighed without sample (M1), then filled with the sample and weighed again (M2). The difference
between M1 and M2 equals the mass of the sample; this value, divided by the volume of the pycnometer, repre-
sents the sample’s density.

d = (M2 − M1) / V
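
The same calculation can be scripted in a few lines of Python; the masses and volume below are hypothetical example values.

# Density via pycnometer (hypothetical example values).
M1 = 31.2055   # mass of the empty pycnometer in g
M2 = 56.1402   # mass of the filled pycnometer in g
V = 25.003     # calibrated pycnometer volume in cm^3

density = (M2 - M1) / V
print(f"d = {density:.5f} g/cm^3")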

Density can also be measured automatically, in dedicated instruments called density meters. Current density
meters apply the physical principle of an oscillating U-tube surrounded by a thermo-conductive block, which
brings the sample to its required measuring temperature.

Figure 44: Schematic of a density cell, comprising the oscillating U-tube, oscillator, temperature sensor, Peltier element, and thermo-conductive block for heating and cooling.

The frequency at which the U-tube oscillates depends on its mass, or more specifically its contents. The higher
the density of the sample, the lower the oscillation frequency of the U-tube. This oscillation frequency can be
converted to density with the corresponding formula:

d = A ∙ OSC² − B

Figure 45: The oscillation frequency detected by a digital density meter can be converted to sample density (shown: a density of 0.79672 g/cm3 at OSC² = 28880906).
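
The instrument constants A and B in the formula above are determined during adjustment, typically by measuring two references of known density such as air and water. The Python sketch below shows this two-point adjustment and the subsequent conversion of an oscillation reading to density; the OSC² values are hypothetical, while the reference densities are the 20 °C values quoted elsewhere in this chapter.

# Two-point adjustment of an oscillating U-tube density meter:
# d = A * OSC^2 - B, with A and B derived from air and water references.
d_air, d_water = 0.001205, 0.998203             # densities at 20 °C in g/cm^3
osc2_air, osc2_water = 26_210_000, 28_950_000   # hypothetical OSC^2 readings

A = (d_water - d_air) / (osc2_water - osc2_air)
B = A * osc2_air - d_air

def density_from_osc2(osc2):
    """Convert an OSC^2 reading into density (g/cm^3)."""
    return A * osc2 - B

print(f"Sample density: {density_from_osc2(28_600_000):.5f} g/cm^3")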

In order to initiate a measurement, the sample is injected into the U-tube with a syringe. After temperature equili-
bration, the U-tube’s oscillation is measured, converted into density, and displayed on the instrument terminal.
The measurement takes 2–3 minutes, needs no reagents, and allows the sample to be recovered for further use.

Different types of density meters are available for different applications:

Portable Density Meters


Handheld density meters that can be used outdoors or at line allow operators to monitor samples in process, but
lack temperature control. They use temperature coefficients to calculate the density of a sample at a temperature
other than the ambient temperature, e.g. 20 °C, and can convert the measured density to many other units.
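
Over a narrow temperature range, this correction can be approximated with a linear, substance-specific temperature coefficient. The sketch below illustrates the idea with hypothetical numbers; real instruments use stored coefficients for the substance in question.

# Approximate conversion of a density reading to the 20 °C reference
# temperature using a linear coefficient (all values hypothetical).
rho_measured = 0.98650   # density measured at ambient temperature, g/cm^3
T_measured = 26.4        # ambient temperature in °C
T_reference = 20.0       # reporting temperature in °C
alpha = 0.00085          # hypothetical temperature coefficient, g/cm^3 per °C

rho_20 = rho_measured + alpha * (T_measured - T_reference)
print(f"Density at 20 °C (approx.): {rho_20:.5f} g/cm^3")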

Figure 46: Portable density meters can be used in the field or at-line. Shown: METTLER TOLEDO's Density2Go.

Benchtop Density Meters
Benchtop density meters range from the simple to the complex, with the former – represented by METTLER
TOLEDO’s EasyPlus instruments – typically suited to use both in the lab and near the production line. Appropriate
for individual or small series of measurements, they offer an accuracy of up to four decimal places and fast tem-
perature control.

Figure 47: Benchtop density meters. Left: the EasyPlus models support measurements in the lab or near-line. Right: Excellence series
instruments offer higher accuracy. They also include user management features and enable automated measurements in higher-throughput
environments.

A broader range of applications is supported by Excellence series meters, which determine density with an accu-
racy of up to six decimal places. These instruments permit user management and the automation of workflows
involving pre-programmed SOPs. They can also be connected to multiparameter systems for the measurement of
several parameters in parallel, as shown below (see also section 7.5.).

The connection of a sampling pump to Excellence density meters will enhance the reproducibility of sample
loading into measuring cells. Automation units with heating capabilities also expand the range of substances for
which density can be determined with confidence.

1. With sampling pump, three-way valve, and drying pump
2. With sample automation unit
3. With autosampler
4. Multiparameter system: density, refractive index, color, and pH

More information on density measurements and instrumentation can be found on our density application page
and in the guide “3 Ways to Measure Density”.

7.2. Density Applications

Density meters are used during manufacturing processes to measure liquids, typically raw materials, intermedi-
ates, and final products. Common samples include solutions, beverages, components of fragrance mixtures, or
formulated cosmetics, among others; for raw materials, they allow quick checks that the correct liquid has been
delivered, as well as indicating its purity. During production, the density of intermediates is monitored to ensure
the integrity of the process; at final quality control, if the density of a product is within specified limits, the latter
can be released.

Density measurements also offer insight into the concentration of liquid mixtures, e.g. the alcohol concentration
of spirits. The density of water at 20 °C is 0.998 g/cm3, while that of ethanol at 20 °C is 0.789 g/cm3. The den-
sity of an ethanol-water mixture will therefore lie between 0.789 g/cm3 and 0.998 g/cm3, and can be calculated
using concentration tables. For example, the density of a 60% (weight / weight) solution of ethanol at 20 °C will
be 0.891 g/cm3.

Another typical concentration measurement is the determination of sugar content, as in soft drinks. This value is
usually given in °Brix, which reports percent sucrose in water. A reading of 15 °Brix means that 15 g of sucrose
are present in 100 g of solution. Note that a stringent correlation between density and concentration can only be
applied to binary mixtures of one compound in a single solvent.
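
In practice the instrument looks such concentrations up in built-in tables. The Python sketch below mimics this with a simple linear interpolation between the few points quoted above; a real determination would use the full official alcoholometric tables, so the result here is a rough estimate only.

# Estimate ethanol concentration (% w/w) from density at 20 °C by linear
# interpolation between coarse tabulated points (values quoted in the text;
# real instruments use much finer official tables).
table = [            # (density in g/cm^3, ethanol in % w/w)
    (0.998, 0.0),
    (0.891, 60.0),
    (0.789, 100.0),
]

def ethanol_percent(density):
    pts = sorted(table)                          # ascending density
    for (d1, c1), (d2, c2) in zip(pts, pts[1:]):
        if d1 <= density <= d2:
            return c1 + (c2 - c1) * (density - d1) / (d2 - d1)
    raise ValueError("density outside table range")

print(f"{ethanol_percent(0.912):.1f} % w/w (rough estimate)")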

A variety of units1) may be used for density measurement, depending on the application. The most common are
density, specific gravity, Twaddell, light and heavy Baumé, acid / base, and chemical or salt scales2). For alco-
hol, density may be reported as OIML, AOAC, Proof Degree, HM C&E, or Gay Lussac, while for sugar, Plato, Brix
(Emmerich, NBS 113), HFCS 42/55, Invert Sugar, KMW, Oechsle, and Babo are used. Petrochemical samples
may employ API degrees and gravity tables for crude oils, refined products, and lubricants.

Application notes describing procedures for the measurement of density in a broad range of sample types can
be accessed in METTLER TOLEDO’s Application Library.

7.3. Tips and Hints for Density Measurement

In order to obtain accurate results for density measurements, special care must be taken throughout the work-
flow. This entails the application of proper technique during not only sampling and measuring, but cleaning and
drying, to ensure that a series of samples can be assessed accurately.

Sampling
At the sampling stage, the measuring cell should be completely filled with the substance to be tested, there
should be no cross-contamination with the substance previously assessed, and no air bubbles or impurities
should be present.

Crucial to the success of this step is sample loading technique: the measuring cell must not be underfilled. If fill-
ing is insufficient, contamination is not flushed out and may remain in the cell, invalidating the reading. Make
sure that the sample is loaded at least 10 cm beyond the edge of the cell, as shown below.

Figure 48: Filling of the density cell. Left: inappropriate. Right: good.

1) Units are integrated into METTLER TOLEDO density meters, although values are applicable to defined concentration ranges only.
2) Chemical scales are used to determine the concentration of ammonia, ethylene glycol, glycerol, and hydrogen peroxide, among others; salt scales can determine the concentration of, e.g., calcium chloride or silver nitrate.

The measuring cell should be filled at a slow speed and with laminar flow (5–10 cm per second) to ensure
complete wetting of the cell walls (and to prevent air bubbles from forming). It will be necessary to ensure that
no air is trapped in the syringe used for loading and that no air bubbles or impurities are present in the sample;
the syringe’s plunger should then be pressed slowly and uninterruptedly until loading is complete.

Manual sample handling with a syringe is always operator-dependent and hence, error-prone. Automatic filling
systems ensure that the sample is loaded with the correct speed and in a reproducible manner, independent of
operators and substance types, and can be adjusted based on the characteristics of any substance.

Figure 49: Manual sampling with a syringe. Figure 50: Automatic sampling with a sample automation unit.

Modern density meters offer a video view with a camera, sometimes even with zooming and panning functions
to observe every portion of the measuring cell. This function can be utilized to ensure the absence of air bubbles
and impurities from the loaded sample.

Figure 51: Video view permits observation of the measuring cell to ensure even sample loading without bubbles or impurities.

METTLER TOLEDO Excellence Density Meters additionally include a built-in Bubble Check™ to prevent errors from
air bubbles, which can have an outsized effect on measurement accuracy and reproducibility (Table 2).
Diameter of the air bubble [mm] Max. measuring error caused [g/cm3]
2 0.000838
1 0.000052
0.5 0.000003

Table 2: Measuring errors caused by air bubbles.

Measuring
Measurements with a digital density meter should only proceed once the sample has reached the set tempera-
ture (e.g. 20 °C) and its oscillation is stable. Readiness to measure is determined automatically by the density
meter; the reading is then given in the desired unit (density, specific gravity, alcohol, Brix, …), usually within a
few seconds.

Figure 52: Density measurements begin once the instrument has reached the correct temperature, and can be converted to a range of units.

Many density meters also offer built-in product quality control, enabling upper and lower limits to be set to
acceptable values. After a measurement, the value is compared to these limits and pass or fail information is
displayed on the instrument:

Figure 53: Setting upper and lower tolerance limits enables pass / fail information to be displayed on the instrument screen immediately
following a measurement.

Cleaning
Between samples, the measuring cell should be thoroughly cleaned and dried. However, the previous sample
must be removed from the cell and tubing before cleaning can take place. Clean the density cell by rinsing with
solvent (see Table 3). Following the second rinse (solvent 2), air-dry the cell, if possible with an air pump with
desiccator.
Sample | Solvent 1 | Solvent 2
Water based | Water | Acetone or ethanol (puriss)
Acids | Excess water | Acetone or ethanol (puriss)
Fats and oils | Deconex® (0.3 to 0.5% in water) | Acetone or ethanol (puriss)
Petrochemicals | Toluene or petrol ether | Hexane or similar if temp. is >30 °C (at room temp. use a low-boiling petroleum ether mixture); acetone
Conc. sugar solutions / syrup | Water (use enough water before rinsing with acetone to prevent risk of polymerization) | Acetone (puriss)

Table 3: Recommended rinsing solvents for steps 1 and 2.

Figure 54: Air pump with desiccator.

7.4. Performance Verification

It is important to regularly verify the accuracy of your system by measuring a sample of known density (e.g. dis-
tilled water or a standard). This may be termed a test, calibration, or check; once completed, the density value
obtained is compared to the known nominal value of the sample to determine whether the instrument is within
tolerance.

It is good practice to check that the measuring cell is fully clean and dry before each analysis. This can be done
simply by measuring the empty cell. If the result is equal to the value for air (0.001205 g/cm3 at 20 °C), contami-
nation-free measurement of the next sample is guaranteed. This can be done with just One Click via the “Cell Test”
shortcut on METTLER TOLEDO Excellence Density Meters.

It is recommended that you test your density meter with water (0.998203 g/cm3 at 20 °C) daily to guarantee
that there is no drift of the instrument. Less frequently (weekly, monthly, or even yearly, depending on QA require-
ments), a test with a certified standard within your measurement range is suggested to check the accuracy of
your benchtop or portable instrument around the density values of your samples.

METTLER TOLEDO Excellence Density Meters offer the possibility to define fixed intervals for test sets with an
automatic reminder for the operator. Measurement methods can be set up in such a way that the operator is
warned, or the instrument blocked from use, if the defined test interval has expired.

Readily available and stable solvents such as deionized water or analytical grade toluene can be used. The most
common test substance is deionized water, as it is available in almost every laboratory in high and reproducible
purity. It might be necessary to degas it by boiling, especially when measured at temperatures above approxi-
mately 35–40 °C.

A different test can be defined separately, for longer intervals (e.g. on a monthly or yearly basis), using certified
standards for quality assurance and traceability. METTLER TOLEDO offers combined (density and refractive
index) certified standards in different ranges:

• Water (0.99… g/cm3; nD 1.33…)
• Dodecane (0.75… g/cm3; nD 1.42…)
• 2,4-dichlorotoluene (1.25… g/cm3; nD 1.55…)
• 1-bromonaphthalene (1.48… g/cm3; nD 1.66…)

The uncertainty of these standards is determined during the manufacturing process; it is normally around
±0.00003 g/cm3, and should be taken into account during the setting of tolerances.
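
One simple way to express the acceptance criterion for such a test is to compare the deviation from the certified value against a limit that combines the laboratory's own instrument tolerance with the certified uncertainty of the standard. The Python sketch below illustrates this with hypothetical numbers; your SOP may define the criterion differently.

# Check a density meter reading against a certified standard
# (all numeric values are hypothetical examples).
nominal = 0.74980          # certified density of the standard, g/cm^3
u_standard = 0.00003       # certificate uncertainty, g/cm^3
instrument_tol = 0.00005   # laboratory-defined instrument tolerance, g/cm^3
measured = 0.74984         # measured value, g/cm^3

limit = instrument_tol + u_standard
deviation = abs(measured - nominal)
verdict = "pass" if deviation <= limit else "fail"
print(f"{verdict} (deviation {deviation:.5f} g/cm^3, limit {limit:.5f} g/cm^3)")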

A full testing procedure is described in the guide “How to Achieve the Best Results: Day-to-Day Density
Measurement”, along with details of how to interpret the results and when professional intervention is warranted.

7.5. Data Management

METTLER TOLEDO’s portable and EasyPlus benchtop density meters work with the EasyDirect software, a data-
logging platform that seamlessly permits the transfer of results to spreadsheet software. EasyDirect is compatible
with up to three instruments of the same or different kinds, including refractometers, to accommodate laboratories’
individual needs. Graphs may be generated to track measuring data over time, facilitating quality assessments.

Excellence series instruments can harness the power of the LabX software. Connected to an autosampler, an
Excellence density meter can support the automation of dozens of measurements in sequence; LabX orchestrates
the execution of pre-programmed workflows involving density assessments alone, or multiparameter analyses
of density, refractive index, absorbance, pH, conductivity, and other values. A sampling pump ensures the even
loading of each sample into the connected instruments’ measuring cells for improved reproducibility, and a heat-
ing unit can be employed as necessary to reduce the sample’s viscosity sufficiently for measurement.

Complete data and metadata are stored in LabX’s secure, 21CFR11–compliant database; results from different
dates are easily comparable, and audit-readiness is dramatically simplified. LabX additionally offers user- and
asset-management features, controlling which users can access particular instruments or protocols, and dis-
abling equipment as needed (e.g. when calibration is due).

When simple export of results is necessary, USB-enabled printers may be connected to either EasyPlus or
Excellence series instruments.

7.6. Maintenance and Service

Routine testing, preventive maintenance, and instrument repair services will prolong any density meter’s life-
time and ensure the continued quality of results. As laboratories’ needs will differ depending on the conditions
and frequency of use of their equipment, we encourage consultation with a service professional to determine an
appropriate service package and schedule. Professional installation and equipment qualification may also be
undertaken, based on requirements.

METTLER TOLEDO’s Good Density and Refractometry Practice (GDRP™) guidelines outline the five steps to
achieving excellent results for your density applications.

7.7. Resources and Further Reading

Guides:
How to Achieve the Best Density Results: Day-to-Day Density Measurement
3 Ways to Measure Density: Know-How, Hints and More

Other resources:
What Is Density? All You Need to Know about Density Measurement (Density application page)
Multiparameter Measurements (Simultaneous measurement of density, refractive index and other parameters)
Keep Your Density Meter Clean (Poster series)
Good Density and Refractometry Practice (GDRP™) framework (Best practice guidelines)
AnaChem Application Library (Application notes)

8. Analytical Chemistry: Refractive Index

The refractive index of any substance, n, is a physical property describing the ratio of the speed of a light beam
in a vacuum to its speed in the substance (n = c/v). Dimensionless, and characteristic of pure liquid samples,
it can be used in the identification of solvents, oils, and other hydrocarbons, as well as semi-solids and solids.

Figure 55: Refraction, critical angle, and total reflection of a light beam passing from one material (e.g. water) to another (e.g. air).

n1 / n2 = (sin β) / (sin α)

At the critical angle, the refracted beam grazes the interface, i.e. β = 90° and sin β = 1, so that:

n2 = n1 ∙ sin α
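
In a digital refractometer, the refractive index of the measuring prism (n1 in the relation above) is known, so the sample's index follows directly from the measured critical angle. A minimal Python sketch, assuming a hypothetical prism index and measured angle:

import math

# Sample refractive index from the measured critical angle
# (prism index and angle are hypothetical example values).
n_prism = 1.72                # refractive index of the measuring prism
alpha_critical_deg = 50.8     # measured critical angle in degrees

n_sample = n_prism * math.sin(math.radians(alpha_critical_deg))
print(f"nD (sample) = {n_sample:.4f}")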

As refraction depends on the wavelength of the incident light, n is usually measured in reference to the D line of
sodium (wavelength 589.3 nm) and designated nD. Refractive index is also influenced by temperature. For this
reason the temperature must always be stated along with the result; the standard temperature is 20 °C, while for
a measurement at 25 °C results would be expressed as nD25.

A variety of units may be utilized for refractive index measurements, depending on the application as well as the
composition of the samples evaluated (see section 8.2., below).

8.1. Measuring Principle

Refractive index can be measured manually or automatically. Historically, the Abbe design was employed;
it involved the application of a liquid sample to a measuring cell bordered by an illuminating prism and a refract-
ing prism, with the latter made of glass with a high refractive index. Following the projection of light through the
illuminating prism, a detector behind the refracting prism enabled determination (by eye) of the refractive index.

Results obtained via the Abbe design were necessarily somewhat subjective. Further, the refractive index is tem-
perature-dependent and the Abbe design requires an external system to control for temperature effects, such that
automatic digital refractometers offer a simpler, more accurate alternative. In addition to providing temperature
control via integrated thermostating units, these instruments often reduce noise due to vibrations by limiting the
number of moving parts in the measuring cell.

Figure 56: Schematic of a modern refractometer cell (LED light source, interference filter, prism in contact with the sample, lenses, CCD sensor, temperature measurement, and Peltier element with thermostatic block for heating and cooling).

In digital refractometers, light is beamed under a broad angle to a prism in direct contact with the sample.
Depending on the refractive index of the sample, the incoming light is transmitted (= refracted) or totally reflected.
The angle of total reflection is determined by a CCD array sensor from which the refractive index of the sample
can be evaluated accurately and precisely.

The sample is added to the prism with a transfer pipette; only a few drops are necessary for the measurement.
Once it has reached the set temperature, its refractive index is calculated and the value displayed on the instru-
ment terminal, with the operation typically taking seconds from start to finish.

Figure 57: A few drops of sample on the prism of the refractometer are sufficient.

Digital refractometers vary in design to reflect the range of applications they support:

Portable Refractometers
Handheld refractometers can often be used both indoors and out, but do not include temperature control units.
They use temperature coefficients to calculate the refractive index or °Brix of a sample at a temperature that dif-
fers from ambient conditions, e.g. 20 °C. They can convert the measured refractive index into many other units
and concentrations, depending on the process implicated.

Benchtop Refractometers
Benchtop refractometers come in varying levels of sophistication, depending on the readability desired. METTLER
TOLEDO’s portfolio encompasses both simple, robust EasyPlus options for the lab or near the production line,
and the Excellence series instruments, which offer user management features, permit workflow automation, and
connect to multiparameter systems, as shown below (see also section 8.5.).

Figure 58: Benchtop refractometers. Left: the EasyPlus models can be used in the lab or near-line. Right: Excellence series instruments
afford up to five decimal places of accuracy, permitting workflow automation and enabling user management.

The former provide an accuracy of up to four decimal places, along with fast temperature control, and can con-
vert the measured refractive index into many other units and concentrations. Excellence instruments, by contrast,
support a broader range of applications and supply an accuracy of up to five decimal places. The latter can also
be connected to sampling pumps, for consistent results with a broader spectrum of substances.

1. Combined density / refractometry system
2. With autosampler
3. Multiparameter system: density, refractive index, color, and pH

8.2. Refractometry Applications

Since every material that interacts with light has a refractive index, measurements of this property are often used
to identify substances, as well as for purity checks. Liquid, semi-liquid, or solid samples may all be assessed,
with accuracy down to ±0.00002 possible for liquids or semi-liquids with a digital or Abbe refractometer.

Refractive index values can be correlated with a wide range of concentrations, enabling further characterization
for applications such as those outlined below:

Chemicals: Determination of the freezing point (°C or °F), acid / base concentration, or the presence of organic
solvent and inorganic salt in % w/w or v/v. Freezing point measurements can be employed for substances such
as ethylene or propylene glycol, while the concentrations of formic acid and sodium or magnesium chloride can
also be determined via this method.

Pharma: Measuring hydrogen peroxide or methanol percentages, or assessing concentrations of various sub-
stances in human urine. In the case of urine, aberrant measurements may indicate an individual’s physiological
state (e.g. dehydration), or the use of controlled substances; the refractive index can also point toward the quan-
tity of total dissolved solids present.

Food and beverage: Measuring the Brix (sugar content) of soft drinks or juices, or the Oechsle (refractive index of
grape must) during winemaking. Special Brix refractometers are also available for sugar control at line, and can
provide measurements when the beverage is poured directly from the bottle into the funnel, as shown below.

Figure 59: The Easy Brix instrument can provide sugar control
measurements at line.

When used to measure the °Brix of a substance that is not a pure sucrose / water solution, such as orange juice,
a refractometer may provide results that diverge from the exact sugar concentration.

°Brix is also sometimes used to measure liquids in other contexts, including cutting oils from manufacturing and
machining processes, the composition of which differs significantly from beverages. The °Brix value in cases
such as these is unrelated to sugar content, but has been adopted as simpler to read than a refractive index value
(e.g. 25.1 °Brix instead of nD 1.3725).

A variety of units may be used for refractive index measurement, depending on the application. The most common
ones are nD, Zeiss (14.45), Zeiss (15.00), acid / base scales, salt, chemicals, and freezing point scales. For sugar,
units such as Brix, HFCS 42/55, invert sugar, and Oechsle are employed.

Additionally, for some applications, combining measurements of refractive index and density creates a simple
yet powerful quality control technique. These determinations can be fully automated and enable the assessment
of mixtures that, via determination of one parameter alone, would be indistinguishable from other substances
(e.g. a blend of sugar and alcohol in water).

Application notes describing procedures for the measurement of refractive index in a broad range of sample
types can be accessed in METTLER TOLEDO’s Application Library.

8.3. Tips and Hints for Refractive Index Measurement


Successful measurements of the refractive index depend on a number of factors, including sampling technique,
the cleanliness of the instrumentation, and an appropriate procedure. Below, we outline guidance to support a
consistently high quality of data.

Sampling
For measurements with a digital refractometer, the sample should be applied directly to the prism. This can be
undertaken with a transfer pipette (shown) or similar liquid-transfer device. Note that – especially important for
viscous substances – stirring or mixing with the tip of the pipette is a necessary step to remove air cushions
between the prism and the sample. This also homogenizes the sample.

Figure 60: Sample can be applied directly to the prism for refractive index measurements. A plastic transfer pipette can be used; stirring
gently with the tip removes air cushions.

A few drops of sample on the prism of the refractometer are sufficient.

Measuring
Before a measurement is initiated, the sample should equilibrate to the set temperature (e.g. 20 °C). This is
­handled automatically by refractometers with inbuilt temperature control; the measuring result is then given in
the desired unit (nD, Brix, …).

Figure 61: Samples should equilibrate to the measuring temperature before a refractive index reading is made; following the measurement,
the results can be converted to a variety of units.

Many refractometers also offer a product quality control functionality, which sets upper and lower limits for mea-
surements based on the values expected. After each measurement, the value is compared to these limits and the
outcome (pass or fail) displayed on the instrument.

Figure 62: Setting upper and lower tolerance limits enables pass / fail information to be displayed on the instrument screen immediately
following a measurement.

Cleaning
It is particularly crucial to the quality of results to clean and dry the prism before and after measurements. This
prevents the buildup of sample residues and reduces the risk of cross-contamination, which will reduce the
accuracy and repeatability of readings. Use a suitable solvent to clean the cell a few times; the solvent must be
able to quickly dissolve the sample.

We recommend the use of syringes to remove the sample (and solvents) from the refractometer cell. A “waste
syringe” can be used over and over again (tip: mark this syringe with tape to keep track of it). Using a syringe
saves lab tissue and reduces waste.

A brief cleaning schema would involve:


1. Adding the solvent;
2. Stirring with the waste syringe;
3. Removing all liquid with the waste syringe.

The following solvents may be used to clean common samples from a refractometer:
Sample | Solvent
Water based | Water
Acids | Excess water
Fats and oils | Deconex® (0.3 to 0.5% in water)
Petrochemicals | Toluene or petrol ether
Conc. sugar solutions / syrup | Water (use enough water before rinsing with acetone to avoid risk of polymerization)

Table 4: Recommended rinsing solvents.

8.4. Performance Verification

Before determining the refractive index of a sample, it is important to verify the accuracy of your system, usually
by measuring a substance of known refractive index (e.g. distilled water or another standard). This is typically
called a test, calibration, or check; once completed, the refractive index measured is compared to the known
nominal value of the sample to establish whether performance is as expected.

It is recommended that you test your refractometer with water (nD = 1.33299 at 20 °C) daily to detect any drift of the instrument over time. Less frequently (e.g. weekly, monthly, or even yearly – depending on your QA
requirements), a test with a certified standard whose refractive index is within your measurement range should
be undertaken. This evaluates the measuring accuracy of the instrument around the expected refractive index
for your samples.

METTLER TOLEDO Excellence Refractometers offer the possibility to define fixed intervals for test sets with an
automatic reminder for the operator. Measurement methods can be set up in such a way that the operator is
warned, or the instrument blocked from use, if the defined test interval has expired. Readily available and stable
solvents such as deionized water or analytical grade toluene can be used. The most common substance for
daily tests is deionized water, as it is available in almost every laboratory in high and reproducible purity. Brix
standards are also often employed.

Tests using certified and traceable standards can be separately defined at longer intervals (e.g. monthly or yearly)
for quality assurance and traceability. METTLER TOLEDO offers combined (density and refractive index) certified
standards in different ranges:
• Water (0.99… g/cm3; nD 1.33…)
• Dodecane (0.75… g/cm3; nD 1.42…)
• 2,4-dichlorotoluene (1.25… g/cm3; nD 1.55…)
• 1-bromonaphthalene (1.48… g/cm3; nD 1.66…)

The uncertainty of these standards is determined during the manufacturing process; it is usually around
±0.00002, and should be taken into account during the setting of tolerances.

A full testing procedure is described in the guide “How to Achieve the Best Results: Day-to-Day Refractive Index
Measurement”, along with details of how to interpret the results and when professional intervention is warranted.

8.5. Data Management

METTLER TOLEDO’s portable and EasyPlus benchtop refractometers work with the EasyDirect software, a data-
logging platform that seamlessly permits the transfer of results to spreadsheet software. EasyDirect is compat-
ible with up to three instruments, of the same or different kinds, to accommodate laboratories’ individual needs.
Graphs may be generated to track measuring data over time, facilitating quality assessments, and reports may
be customized.



Excellence series instruments can leverage the power of the LabX software. Connected to an autosampler, an
Excellence refractometer can support the automation of dozens of measurements in sequence; LabX orchestrates
the execution of pre-programmed workflows involving refractive index alone, or multiparameter analyses of den-
sity, refractive index, absorbance, pH, conductivity, and other values. A sampling pump ensures the even loading
of each sample into the connected instruments’ measuring cells for improved reproducibility.

Complete data and metadata are stored in LabX’s secure, 21CFR11–compliant database; results from different
dates are easily comparable, and audit-readiness is dramatically simplified. LabX also offers user and asset
management, controlling which users can access particular instruments or protocols, and disabling equipment
as needed (e.g. when calibration is due).

When simple export of results is required, USB-enabled printers may be connected to EasyPlus or Excellence
series instruments.

8.6. Maintenance and Service

Routine testing, preventive maintenance, and instrument repair services will prolong any refractometer’s lifetime
and ensure the continued quality of results. As laboratories’ needs will differ depending on the conditions and
frequency of use of their equipment, we encourage consultation with a service professional to determine an
appropriate service package and schedule. Professional installation and equipment qualification may also be
undertaken, based on requirements.

METTLER TOLEDO’s Good Density and Refractometry Practice (GDRP™) guidelines outline the five steps to
achieving excellent results for your refractive index applications.

8.7. Resources and Further Reading

Guide:
Day-to-Day Refractive Index Measurement

Other resources:
Refractive Index: All You Need to Know (Refractive index application page)
Multiparameter Measurements (Simultaneous measurement of density, refractive index, and other parameters)
Keep Your Refractometer Clean (Poster series)
Good Density and Refractometry Practice (GDRP™) framework (Best practice guidelines)
AnaChem Application Library (Application notes)



9. Thermal Values: Measurements of Melting, Softening,
Dropping, Boiling, Cloud, and Slip Melting Points

Substances’ thermal values – characteristics such as their melting, dropping, and boiling points – have for
­centuries been used both for identification and for purity determination. In the sections below, we introduce each
technique, along with the types of samples for which it is recommended. We also outline strategies to obtain
accurate and reproducible data.

9.1. Melting Point

Melting point determination is one of the oldest known methods for sample identification and testing, and is
particularly useful for organic substances. This technique is easy to carry out and yields quantitative results that
can be conveniently tabulated and classified. Since the melting point is highly dependent on a sample's purity,
it can also be used to evaluate its quality.

Pure substances melt at a defined temperature, whereas impure ones (contaminated or blended) generally
exhibit a melting interval. In addition, the temperature at which all components of an impure mixture melt is
­usually lower than that of the pure substance. This phenomenon is known as melting point depression, and
affords qualitative information regarding purity.

In general, melting point determination is performed both in research and development and in quality control
in various industries to identify and monitor the purity of a range of substances. Modern instruments such as
METTLER TOLEDO’s Melting Point Excellence series enable automated detection of the melting point and visual
inspection of melting behavior via a video camera.

9.1.1. Melting Point Measuring Principle

Determination of the melting point is usually performed in slender glass capillaries loaded with the finely ground
sample of interest. As a sample capillary is heated in a furnace, its optical properties change: the sample's
transmittance – the percentage of incident light that passes through it – typically increases as the sample melts.
This property is exploited by melting point instruments such as METTLER TOLEDO's MP30, MP55, MP70, MP80,
and MP90 devices, which use an LED as the transmission light source. The LED shines through openings in the
furnace onto the lower region of the sample capillary, as shown below, and the transmitted light is recorded by
a video camera and evaluated automatically.

Figure 63: Manual melting point determination. The schematic shows the capillary with sample in the furnace,
the window, the reflection and transmission light sources, and the camera system.
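Purely as an illustration of this transmittance principle, the sketch below locates the temperature at which a transmittance curve crosses a threshold. The 50% threshold and the data points are assumptions for demonstration and do not represent the evaluation algorithm used by the instruments.

# Illustrative sketch of transmittance-based melting point detection:
# report the temperature at which transmittance first reaches a threshold.
# Threshold and data are assumptions for demonstration only.

def melting_point(temperatures, transmittances, threshold=50.0):
    """Return the temperature (linearly interpolated) at which the
    transmittance (in percent) first reaches the threshold."""
    for i in range(len(temperatures) - 1):
        t0, t1 = temperatures[i], temperatures[i + 1]
        tr0, tr1 = transmittances[i], transmittances[i + 1]
        if tr0 < threshold <= tr1:
            frac = (threshold - tr0) / (tr1 - tr0)
            return t0 + frac * (t1 - t0)
    return None  # threshold never reached

# Simulated heating run: transmittance rises sharply near 157 °C
temps = [150.0, 152.0, 154.0, 156.0, 158.0, 160.0]
trans = [2.0, 3.0, 5.0, 20.0, 85.0, 95.0]
print(melting_point(temps, trans))  # approximately 156.9 °C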



For more information on the operating principle of melting point instruments, as well as guidance on sample
preparation and loading, please consult our application page.

9.2. Dropping Point and Softening Point


Certain substances, whether synthetic or naturally-occurring, may lack a defined melting point. They include
materials such as ointments, synthetic and natural resins, edible fats, greases, waxes, fatty acid esters, poly-
mers, asphalt, and tar. Gradually softening as the temperature rises, these substances melt over a relatively
large temperature interval.

Dropping or softening point tests are some of the few easily achievable methods available for the thermal char-
acterization of such materials. The former measures the temperature at which the first drop of a sample precipi-
tates from a measuring vessel (see below), while the latter reflects the point at which a sample has flowed a
defined distance. In order to obtain comparable results, standardized test equipment and conditions, as well as
appropriate sample preparation procedures, are necessary.

Samples’ dropping or softening points are predominantly measured in the context of quality control, but these
measurements can also be valuable in research and development to determine the temperatures of use and / or
process parameters for many different materials.

METTLER TOLEDO’s Dropping Point Excellence instruments enable the automated detection of the dropping point
and softening point, as well as the visual inspection of the process via a video camera.

9.2.1. Dropping and Softening Point Measuring Principle

In principle, the dropping point is the temperature at which the first drop of the molten substance precipitates from
a standardized vessel (cup) with an orifice of defined diameter under controlled test conditions. The samples are
heated in a furnace; manual methods require the visual inspection of the dropping process, which is both tedious
and subjective, necessitating detailed observation over an extended period of time.

The dropping point marks a sudden event: the gravity-driven escape of a liquefied drop of sample from the cup.
In a manual measurement, the operator must note the temperature the moment this event occurs, which in practice
may give rise to errors and is subject to bias. METTLER TOLEDO's Dropping Point Excellence instruments
employ a white-balanced LED light that shines on the test assembly (i.e. the cup and holder inside the furnace).
The light’s reflection is captured by a video camera; the entire dropping point measurement is video-recorded,
and image analysis is used to detect the first drop escaping the sample cup as it passes through a virtual white
rectangle located beneath the cup orifice. Notably, the furnace temperature is measured and recorded at a reso-
lution of 0.1 °C throughout the process.
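To illustrate the detection idea, the following sketch flags the video frame in which the mean brightness inside the virtual detection rectangle changes abruptly, and reports the furnace temperature recorded with that frame. The data layout and the jump threshold are illustrative assumptions, not the instruments' actual image-processing algorithm.

# Conceptual sketch of video-based dropping point detection: each entry
# pairs the furnace temperature with the mean brightness inside the
# virtual detection rectangle below the cup orifice; a falling drop
# changes that brightness abruptly. Threshold and data are illustrative.

def dropping_point(frames, jump_threshold=30.0):
    """frames: sequence of (temperature_c, mean_brightness) per video frame.
    Return the temperature at which the brightness first jumps by more
    than the threshold relative to the previous frame."""
    for (t_prev, b_prev), (t_curr, b_curr) in zip(frames, frames[1:]):
        if abs(b_curr - b_prev) > jump_threshold:
            return t_curr  # furnace temperature, recorded at 0.1 °C resolution
    return None

# Simulated run: the drop passes the detection rectangle at 52.3 °C
frames = [(52.0, 10.0), (52.1, 11.0), (52.2, 10.5), (52.3, 95.0), (52.4, 12.0)]
print(dropping_point(frames))  # 52.3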



Figure 64: Automatic dropping point determination. A dropping point (DP) cup with a 2.8 mm orifice, containing
the sample, sits in the oven between the LED light source and the video camera. Online, video-based digital image
analysis (as used in the DP70) evaluates a rectangle in the top half of the picture; detection occurs when the drop
has passed through the rectangle.

Figure 65: Automatic softening point determination. A softening point (SP) cup with a 6.35 mm orifice, containing
the sample, sits in the oven between the LED light source and the video camera. Online, video-based digital image
analysis (as used in the DP70) evaluates a rectangular area including the stepping line; detection occurs when the
flowing sample reaches the defined distance of 19 mm.

The softening point represents the temperature at which a substance has flowed a certain distance under defined
test conditions, and is determined similarly to the dropping point. In softening point determinations, once the
sample reaches a distance of 19 mm from the cup orifice, the furnace temperature is automatically recorded as
the sample’s softening point. This assay can also be carried out using Dropping Point Excellence Systems.

For more information on dropping point and softening point determination, including the instruments’ operating
principle and a comparison with manual methods, please consult our application page. Additionally, we pro-
vide resources on best practice for sample preparation to reinforce the accuracy and repeatability of results, with
­specific guidance for particular sample types.



9.3. Boiling Point

The boiling point of a chemical compound is the temperature at which it undergoes a phase transition from liquid
to gas under ambient conditions. Substance-specific, it can provide useful information regarding both identity
and purity, and is often used to select an optimal process temperature or, for instance, a sample's ideal storage con-
ditions. Indeed, the boiling point of a substance must be reported on its Material Safety Data Sheet (MSDS).

METTLER TOLEDO’s Melting Point Excellence instrument model MP80 enables the automated detection of
a ­sample’s boiling point, as well as its visual inspection via video camera.

9.3.1. Boiling Point Measuring Principle

In order to determine a sample’s boiling point, a volume of approximately 100 μL is pipetted into a glass tube.
A smaller boiling point capillary is inserted into the filled tube to prevent superheating of the liquid, which would
induce boiling retardation and give rise to inaccurate readings. The sample is then loaded into the instrument
and the assay commenced.

As the temperature rises, gas bubbles form within the liquid and escape to its surface. These ascending bubbles
reflect the light of the built-in light source and are detected individually. The frequency of bubble formation is
measured and is used as the basis for boiling point determination.

Since the boiling temperature is pressure-dependent, a calculation is necessary to derive the boiling point at
sea-level pressure. Ambient pressure is measured with a built-in, calibrated barometer, and compensation to sea
level is automatically calculated and applied to the results.
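The instrument applies this barometric compensation automatically; purely as an illustration of the idea, the sketch below uses the classical Sidney Young approximation to correct an observed boiling point to 760 Torr. The constant K and the example values are assumptions for demonstration.

# Hedged sketch of barometric compensation using the Sidney Young
# approximation; the instrument's built-in correction may use a different
# model. K is roughly 0.00012 1/K for non-associated liquids and roughly
# 0.00010 1/K for associated liquids such as water and alcohols.

def boiling_point_at_sea_level(observed_bp_c, ambient_pressure_torr, k=0.00012):
    """Correct an observed boiling point (°C) to 760 Torr."""
    delta_t = k * (760.0 - ambient_pressure_torr) * (observed_bp_c + 273.15)
    return observed_bp_c + delta_t

# Example: ethanol observed at 77.9 °C under 745 Torr (associated liquid)
print(round(boiling_point_at_sea_level(77.9, 745.0, k=0.00010), 1))  # 78.4 °C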

Figure 66: Duplicate, simultaneous boiling point determination (displayed result: 78.4 °C).

9.3.2. Tips and Hints for Sample Boiling Point Measurement

• Measurement system assembly: The smaller boiling point inner capillary should be inserted into the outer
­boiling point capillary to prevent superheating. A new inner capillary must be inserted for every measurement.
• Heating rate: Usually 1 °C/min.
• Start temperature: The expected boiling point temperature minus 3–5 times the heating rate, so that the boiling
point is reached 3–5 minutes after the measurement is commenced.
• End temperature: A successful measuring curve requires an end temperature that is approximately 5 °C above
the expected event. Otherwise, use the method parameter “Stop at event” to terminate the temperature program
automatically as soon as both samples have reached their boiling points.
• Colored samples: For dark-colored samples, the manual brightness should be adjusted to an intensity that will
allow the best detection of the boiling point.
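As a small illustration of the start and end temperature rules above, the helper below derives both limits from the expected event temperature and the heating rate. The function and parameter names are illustrative, not instrument settings, and the same rule of thumb applies to the cloud point and slip melting point methods described below.

# Illustrative helper for the rule of thumb above: start the temperature
# program 3-5 heating-rate-minutes below the expected event and end about
# 5 °C above it. Names and defaults are assumptions for demonstration.

def temperature_program(expected_event_c, heating_rate_c_per_min=1.0,
                        lead_minutes=4.0, end_margin_c=5.0):
    start = expected_event_c - lead_minutes * heating_rate_c_per_min
    end = expected_event_c + end_margin_c
    return start, end

# Expected boiling point of 78 °C at 1 °C/min
print(temperature_program(78.0))  # (74.0, 83.0)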



9.4. Cloud Point

A substance’s cloud point corresponds to the temperature above which it becomes turbid. Its determination is
most commonly performed for quality control, so as to ensure a high level of production and technical perfor-
mance. Cloud point protocols often require a 1% weight dilution of the substance of interest – usually a sur-
factant, emulsion, or dispersion – in water. The water solubility of the substance tends to vary inversely with
temperature (i.e., the higher the temperature, the lower the solubility), and the cloud point is the temperature at
which the solution reaches saturation and becomes turbid.

METTLER TOLEDO’s Melting Point Excellence instrument model MP80 enables the automated detection of the
cloud point, as well as its visual inspection via video camera.

9.4.1. Cloud Point Measuring Principle

Cloud point determination is typically performed with a 1% weight dilution of the substance of interest in water.
Approximately 100 μL of the sample should be pipetted into a glass tube, which is inserted into the instrument;
the measurement is then commenced.

The solution of interest is transparent at the beginning of the experiment, becoming turbid once the cloud point is
reached. This turbidity is monitored via the detection of the light transmitted – the higher the temperature above
the cloud point, the more turbid the solution, and thus the lower the quantity of light that passes through it.
Automatic video camera detection of the decrease in transmitted light intensity is the key to obtaining repeatable
and reliable cloud point measurements.

Figure 67: Automatic cloud point determination. At 60.0 °C, the solution is clear. At 63.5 °C,
the solution has become turbid and the cloud point is detected.

9.4.2. Tips and Hints for Cloud Point Sample Measurement

• Heating rate: Usually 1 °C/min.


• Start temperature: The expected cloud point temperature minus 3–5 times the heating rate, so that the cloud
point is reached 3–5 minutes after the measurement is commenced.
• End temperature: A successful measuring curve requires an end temperature that is approximately 5 °C above
the expected event. Otherwise, use the method parameter “Stop at event” to terminate the temperature program
automatically as soon as both samples have reached their cloud point.



9.5. Slip Melting Point

The slip melting point is often used to characterize fats, oils, and waxes – or any other solids lacking a defined
or sharp melting point. The sample’s composition (and therefore the corresponding slip melting point) will deter-
mine the applications for which it can be used, which may vary from ice cream to cosmetics production.

This value aids in product characterization and supports international trade, making standardization and compli-
ance pivotal. METTLER TOLEDO’s Melting Point Excellence instrument models MP55 and MP80 enable automated
detection of the slip melting point and visual inspection of samples via a video camera.

9.5.1. Slip Melting Point Measuring Principle

In order to determine the slip melting point, an inner slip melting point capillary tube containing a column of
fat (approximately 10 mm in height), crystallized under controlled conditions, is immersed in water, which is
then heated at a specific rate. The temperature at which the column of fat is observed to begin rising in the
inner ­capillary tube, due to a combination of buoyancy and the melting of the outside surface of the column,
is recorded as the slip melting point.

The parameter is evaluated via digital image analysis. When the column of substance starts to move upward,
the instruments’ image processing algorithm determines the slip melting point fully automatically.

Figure 68: Automatic slip melting point determination. At 45 °C, the sample column still has
not moved. At 47.6 °C, the substance has reached its slip melting point.

9.5.2. Tips and Hints for Slip Melting Point Sample Measurement

• Heating rate: Usually 1 °C/min.


• Start temperature: The expected slip melting point temperature minus 3–5 times the heating rate, so that the
slip melting point is reached 3–5 minutes after the measurement is commenced.
• End temperature: A successful measurement requires an end temperature that is approximately 5 °C above
the expected event.
• In order to determine the slip melting point reliably, the diameter of the outer capillary must be at least twice
that of the inner capillary: both the outer and inner capillaries from METTLER TOLEDO fulfill this requirement.

Depending on the nature of the sample and relevant standards, three possible methods exist for sample prepara-
tion in slip melting point determinations:
1. Waxy samples should first be completely melted, and the capillary should be dipped into the melted sample
and filled to the desired height. The sample should then be cooled so that it solidifies.
2. Crystalline samples can be ground, and the resulting powder packed into the inner capillary.
3. Soft samples, such as lipsticks, may be packed into the capillary without further preparation.




Figure 69: Sample preparation for cloud (1), boiling (2), and slip melting point (3). Insert the pipette tip containing the solution of interest
into the outer boiling point capillary so that it touches the inside wall of the capillary and allow the solution to slowly slide to the bottom
of the capillary. If you are preparing a boiling point sample, insert the inner boiling point capillary after sample addition. If you are preparing
a slip melting point sample, insert the inner slip melting point capillary containing 10 mm of sample after addition of the solution.

9.6. Data Management

Melting Point and Dropping Point Excellence Systems support the design of SOPs that can be launched with
One Click directly from an instrument’s touchscreen. Videos are saved for replay or further analysis, and may be
transferred to an SD card or a PC. All results are securely archived and can be tracked at any time. Analysis data
can be printed out, saved in PDF format, or directly transferred to a PC for storage.

Instrument settings can be personalized with shortcuts and the operator-selected language. The user management
system also allows different levels of access to be defined, protecting critical data.

Measurements of melting point, boiling point, cloud point, and slip melting point are further supported by
METTLER TOLEDO’s LabX laboratory management software for analytical instruments and balances. LabX ­powers
the MP70, MP80, and MP90 Excellence Melting Point Systems with automatic data handling, high process
security, and full SOP user guidance. Please note that although all three models support melting point determi-
nation, the MP80 is suggested for boiling point, cloud point, and slip melting point determinations.

9.7. Routine Testing and Service

In order to ensure the correctness of results, the measuring accuracy of melting point and dropping point analyzers
should be verified periodically. As the sample temperature cannot be measured directly using a certified thermom-
eter, the temperature accuracy is verified using certified reference substances.

METTLER TOLEDO recommends performing the following procedures:


1. Periodic (at least monthly) instrument certification with a melting point standard. METTLER TOLEDO offers
standards as part of the MP VPac™ performance verification package; with a predetermined measurement
uncertainty, the MP VPac provides a reliable check of temperature accuracy.
2. Annual calibration performed by a service specialist.

More information about the MP VPac performance verification package is available on our website. Service
options are described on our Good Melting and Dropping Practice™ (GMDP™) page, which also presents
­useful reference materials.



9.8. Resources and Further Reading

Guides:
Automated Melting & Dropping Point Analysis: A Comprehensive Textbook

Understanding Thermal Values: A Short Guide

Knowledge pages:
Melting and Dropping Point Knowledge Materials
Melting Point Determination
Dropping Point Determination
Good Sample Preparation for Dropping Point and Softening Point

Other resources:
Melting and Dropping Point Applications (Application notes)
Good Melting and Dropping Point Practice™ (GMDP™) framework (Best practice guidelines)



10. Thermal Analysis

Thermal analysis is a branch of materials science used to investigate changes in the properties of a substance
as it is heated, cooled, or held at constant temperature. It comprises a series of techniques carried out with
dedicated instrumentation, and provides highly specific, quantitative understanding of heat capacity, melting
behavior, tensile strength, and other values that determine suitability for, and performance in, an extensive range
of applications across industries. Thermal analysis can also be used for substance identification and purity
determination.

In the sections below, we outline the main thermal analysis techniques and advanced software for data acqui-
sition, evaluation, and instrument control. We also provide an overview of sample preparation strategies that
should be followed for accurate and reproducible results.

10.1. A Brief Overview of Important Thermal Analysis Techniques

Differential Scanning Calorimetry – DSC


Differential scanning calorimetry is used to measure the heat flow to and from a sample and a reference material
as a function of temperature. The sample may be heated, cooled, or held isothermally. The measurement signal
is the rate at which energy is absorbed or released by the sample, expressed in milliwatts, during the experiment.

DSC offers insight into materials’ physical properties, and can be used to:
• Detect endothermic and exothermic effects,
• Determine peak areas (transition and reaction enthalpies),
• Determine temperatures that characterize a peak or other effects, and
• Measure specific heat capacity.

A common application is failure analysis in the automotive and aerospace industries, for example to examine
the behavior of polymer parts subjected to heat stress.
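To illustrate how a peak area translates into a transition enthalpy, the sketch below integrates a heat flow curve above a straight-line baseline between the chosen limits and normalizes by the sample mass. The baseline choice, curve data, and sample mass are assumptions for demonstration, not instrument output.

# Illustrative DSC peak integration with a straight-line baseline between
# the first and last points of the evaluated range (trapezoidal rule).

def peak_enthalpy(time_s, heatflow_mw, sample_mass_mg):
    """Integrate heat flow (mW) above a linear baseline over time (s) and
    return the specific enthalpy in J/g (mW * s = mJ; mJ/mg = J/g)."""
    baseline = [heatflow_mw[0]
                + (heatflow_mw[-1] - heatflow_mw[0])
                * (t - time_s[0]) / (time_s[-1] - time_s[0])
                for t in time_s]
    area_mj = 0.0
    for i in range(len(time_s) - 1):
        dt = time_s[i + 1] - time_s[i]
        excess0 = heatflow_mw[i] - baseline[i]
        excess1 = heatflow_mw[i + 1] - baseline[i + 1]
        area_mj += 0.5 * (excess0 + excess1) * dt
    return area_mj / sample_mass_mg

# Toy melting peak: 10 mg sample, readings taken every second
t = list(range(11))
hf = [0.0, 0.0, 1.0, 4.0, 9.0, 12.0, 9.0, 4.0, 1.0, 0.0, 0.0]
print(peak_enthalpy(t, hf, 10.0))  # 4.0 J/g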

Differential Thermal Analysis and Single Differential Thermal Analysis – DTA, SDTA
Differential thermal analysis is similar to DSC, but measures the temperature difference between the sample and
an inert reference substance as a function of temperature.

Single DTA, a variation of classical DTA, was patented by METTLER TOLEDO. SDTA is particularly useful when
used simultaneously with thermogravimetric analysis (described below). The measurement signal represents the
temperature difference between the sample and a previously measured and stored reference sample.

These techniques are a particularly powerful means to:


• Detect endothermic and exothermic effects, and
• Determine temperatures that characterize thermal effects.

DTA and SDTA are of use in many industries due to their ability to provide additional information, for example to
assist in identifying unknown materials.



Comparison of Three Techniques

Figure 70: The three techniques used to measure polyamide 6 show different thermal effects. DSC: melting peak of
the crystalline part (integral −412.23 mJ, evaluated between 170.50 °C and 236.66 °C at a heating rate of
10.00 °C/min, straight-line baseline); TGA: drying step (−1.85%, −0.1262 mg) and decomposition step (−96.22%,
−6.5803 mg; residue 1.75%, 0.1199 mg); TMA: softening under load (onset 211.71 °C).

Thermogravimetric Analysis – TGA


Thermogravimetric analysis measures the weight, and hence mass, of a sample as a function of temperature.
It was previously abbreviated TG; nowadays, TGA is preferred in order to avoid confusion with Tg, the glass
­transition temperature.

It is often employed to:


• Detect changes in sample mass (gain or loss),
• Determine stepwise changes in mass, usually as a percentage of the initial sample mass, and
• Determine temperatures that characterize a step in the mass loss or mass gain curve.

Relevant to the study of chemical and physical changes alike, TGA can be used to explore, e.g., combustion
reactions, as well as for analyses of purity and composition.
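As an illustration of a stepwise mass-change evaluation, the sketch below expresses the mass change between two temperature limits as a percentage of the initial sample mass; the limits and the curve data are assumptions for demonstration.

# Minimal sketch of a TGA step evaluation: mass change between two
# temperature limits, expressed as a percentage of the initial mass.

def step_percent(temps_c, masses_mg, t_left, t_right):
    """Mass change (% of initial mass) between the readings closest to the
    left and right temperature limits."""
    def closest_index(t_target):
        return min(range(len(temps_c)), key=lambda i: abs(temps_c[i] - t_target))
    m_left = masses_mg[closest_index(t_left)]
    m_right = masses_mg[closest_index(t_right)]
    return 100.0 * (m_right - m_left) / masses_mg[0]

# Toy decomposition step between 300 °C and 500 °C
temps = [100, 200, 300, 400, 500, 600]
mass = [6.84, 6.83, 6.80, 3.50, 0.30, 0.25]
print(round(step_percent(temps, mass, 300, 500), 2))  # about -95 %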

Evolved Gas Analysis – EGA


Evolved gas analysis is the name for a family of techniques through which the nature and / or quantity of gaseous
volatile products evolved from a sample can be measured as a function of temperature. Important analysis tech-
niques are mass spectrometry and infrared spectrometry. EGA is most often used in combination with TGA – i.e.,
hyphenated to TGA – because volatile compounds are released in every TGA effect (mass loss).

Pharmaceutical applications of EGA abound, including the identification of residual solvents from API syntheses.
EGA’s utility thus lies in its capacity to discern the purity of production batches.

Thermomechanical Analysis – TMA


Thermomechanical analysis measures the deformation and dimensional changes of a sample as a function
of temperature. In TMA, the sample is subjected to a constant force, an increasing force, or a modulated force,
whereas in dilatometry, dimensional changes are measured using the smallest possible load.

Depending on the measurement mode, TMA allows you to:


• Detect thermal effects (swelling or shrinkage, softening, change in the expansion coefficient),
• Determine temperatures that characterize a thermal effect,
• Determine deformation step heights, and
• Measure expansion coefficients.

Polymers’ elasticity, viscoelasticity, relaxation, and thermal aging can all be investigated using TMA, among
other parameters.
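As an illustration of the expansion coefficient result listed above, the sketch below computes a mean linear expansion coefficient from a TMA length trace; the sample values are assumptions for demonstration.

# Mean linear expansion coefficient over an interval:
# alpha = (L2 - L1) / (L0 * (T2 - T1)); values are illustrative.

def mean_expansion_coefficient(l0, l1, l2, t1_c, t2_c):
    """Return alpha in 1/K; l0 is the initial sample length, l1 and l2 the
    lengths at t1 and t2 (all lengths in the same unit)."""
    return (l2 - l1) / (l0 * (t2_c - t1_c))

# 5 mm (5000 µm) polymer sample expanding by 3.5 µm between 30 °C and 40 °C
alpha = mean_expansion_coefficient(5000.0, 5001.0, 5004.5, 30.0, 40.0)
print(f"{alpha:.2e} 1/K")  # 7.00e-05 1/K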



Dynamic Mechanical Analysis – DMA
In dynamic mechanical analysis, the viscoelastic properties of polymers are investigated. The sample is subjected
to a sinusoidal mechanical stress and the force amplitude, displacement (deformation) amplitude, and phase shift
are determined.

DMA enables the detection of thermal effects based on changes in the modulus or damping behavior. The most
important results are:
• Temperatures that characterize a thermal effect,
• The loss angle (the phase shift),
• The mechanical loss factor (the tangent of the phase shift),
• The elastic modulus or its components, the storage and loss moduli, and
• The shear modulus or its components, the storage and loss moduli.
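To show how these results relate to one another, the sketch below derives the storage modulus, loss modulus, and loss factor from the stress amplitude, strain amplitude, and phase shift of a sinusoidal measurement. The numerical values are assumptions for demonstration, and the geometry factor that converts force and displacement into stress and strain is omitted.

# Illustrative DMA evaluation: the magnitude of the complex modulus is the
# ratio of stress to strain amplitude; the phase shift splits it into
# storage and loss components, and tan(delta) is the loss factor.

import math

def dma_results(stress_amplitude_pa, strain_amplitude, phase_shift_deg):
    delta = math.radians(phase_shift_deg)
    modulus = stress_amplitude_pa / strain_amplitude   # |E*|
    storage = modulus * math.cos(delta)                # E' (storage modulus)
    loss = modulus * math.sin(delta)                   # E'' (loss modulus)
    return storage, loss, math.tan(delta)

e_storage, e_loss, tan_delta = dma_results(2.0e6, 0.001, 10.0)
print(f"E' = {e_storage:.3g} Pa, E'' = {e_loss:.3g} Pa, tan(delta) = {tan_delta:.3f}")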

Thermo-Optical Analysis – TOA


By thermo-optical analysis, we mean the visual observation or the measurement of the optical transmission of
a sample, for example using a thermo-microscope (or hot-stage microscope). Typical applications are the inves-
tigation of crystallization and melting processes as well as polymorphic transitions; coupling TOA to techniques
such as DSC enables the simultaneous study of heat flow, for a more complete portrait of substances’ physical
properties.

Table 5 summarizes the material properties or applications that can be analyzed by each thermal analysis
technique.
Property or application DSC DTA TGA TMA DMA TOA EGA
Specific heat capacity ••• •
Enthalpy changes, enthalpy of conversion ••• •
Enthalpy of melting, crystallinity ••• •
Melting point, melting behavior (liquid fraction) ••• • • •••
Purity of crystalline non-polymeric substances ••• ••• •
Crystallization behavior, supercooling ••• • •••
Vaporization, sublimation, desorption ••• • ••• ••• •••
Solid–solid transitions, polymorphism ••• ••• • •••
Glass transition, amorphous softening ••• • ••• ••• •
Thermal decomposition, pyrolysis, depolymerization, and degradation • • ••• • • •••
Temperature stability • • ••• • • •••
Chemical reactions, e.g. polymerization ••• • •
Investigation of reaction kinetics and applied kinetics (predictions) ••• • ••• •
Oxidative degradation, oxidation stability ••• ••• ••• •
Compositional analysis ••• ••• •••
Comparison of different lots and batches, competitive products ••• • ••• • • ••• •••
Linear expansion coefficient •••
Elastic modulus • •••
Shear modulus •••
Mechanical damping •••
Viscoelastic behavior • •••

Table 5: Application overview showing the thermoanalytical techniques that can be used to study particular properties or perform certain
applications.
••• means “very suitable”
• means “less suitable”



10.2. Method Development – Selection and Development of Procedures

Method development begins with precisely defining the information you hope to obtain from an analysis.

Typical questions could include:


• At what temperature does the glass transition occur?
• Does the sample exhibit polymorphism?
• How pure is my product?
• What is the moisture content of my sample?

The most appropriate measuring technique will depend on the analytical task and the information you hope to
derive, as outlined in the previous section and Table 5.

The development and validation of methods is of major importance in today’s quality assurance systems. The
starting point is usually a trial method that is then optimized and validated in several iterative steps. The final
result is a validated method that is used for SOPs.

The process of developing and validating a measuring procedure is time-consuming and costly. This means it is
important to start with a good trial method from the beginning. Figure 71 presents an overview of this and helps
to systematize the development of thermoanalytical methods.

Choosing the measuring technique – DSC; TMA (tension, compression, 3-point bending); DLTMA (tension, compression, 3-point bending); TGA; DMA (shear, tension, compression, bending); TOA (thermo-optical analysis)
→ Sample preparation
→ Choosing the crucible
→ Choosing the temperature program
→ Choosing the atmosphere
→ After the measurement
→ Evaluation
→ Validation
→ Validated method

Figure 71: Procedure for developing a thermoanalytical method.

More details about thermal analysis measuring techniques, applications, and method development are provided
in METTLER TOLEDO’s Thermal Analysis Application Handbooks and in the Expertise Library.



10.2.1. Precision, Accuracy, Trueness

The accuracy, precision, and trueness of thermal analysis measurements are crucial to their interpretability.
However, they are subject to a number of influences, which include, but are by no means limited to, method
design, user skill (both in preparing samples and in operating instruments), and the state of maintenance of
the instrumentation itself.

Figure 72 presents a graphical representation of these concepts; however, laboratories are strongly advised to
consult the detailed description of method validation, accuracy, repeatability, measurement errors, and other
issues related to the quality of data in our “Validation in Thermal Analysis: Collected Applications” handbook,
which gives an in-depth description of the terminology used and discusses many examples of DSC, TGA, TMA,
and DMA analyses that can be used to inform particular applications.

A shorter summary of the subject, entitled “Precision, accuracy, trueness”, can be found in UserCom 29. Support
may also be obtained via discussions with a technical specialist.

Accuracy: closeness of agreement between a test result and an accepted reference value.

Trueness: closeness of agreement between a mean value obtained from a large number of test results and an
accepted reference value, i.e. the "true" value.

Precision: the spread of data around a mean value (expressed as a standard deviation).

Figure 72: The relationship of accuracy, trueness, and precision.

10.2.2. Uncertainty of Measurement

The total measurement uncertainty is made up of all the possible errors involved in the individual steps of an
analysis. For example, the influence factors shown in the cause-and-effect (or fishbone) diagram play a role in
the measurement of the enthalpy of fusion by DSC.

Figure 73: Fishbone diagram of the influences on a DSC measurement of the enthalpy of fusion.
Sample preparation: weighing in (sample mass), cutting the sample, thermal contact with crucible.
Instrument: calibration. Method: gas, flow rate, heating rate. Evaluation: integration limits, baseline type.



Estimates for the individual contributions of uncertainty for this example are given in Table 6:
Source – Uncertainty of measurement
Mass of the test specimen – ±20 µg (e.g. reproducibility of the balance; if the mass is about 10 mg, this corresponds to ±0.2%)
Putting the substance into the crucible – negligible
Thermal contact with the crucible – ±0.5% (estimate)
Heating rate – negligible
Gas and gas flow – negligible if the instrument has been adjusted under the same conditions as the measurement
Adjustment – ±1.5% (uncertainty of the calibration material)
Integration limits and baseline type – ±3% (statistics of repeated evaluations)

Table 6: Estimated influence of selected uncertainty factors.

The contributions can then be added together according to the rules of propagation of uncertainty to obtain
a value for the combined uncertainty:

±√[(0.2%)² + (0.5%)² + (1.5%)² + (3%)²] = ±3.4% (coverage probability P = 68%)
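The same root-sum-square combination can be reproduced in a few lines of code, using the contributions listed in Table 6:

# Root-sum-square combination of the uncertainty contributions from
# Table 6 (values in percent), reproducing the calculation above.

import math

contributions = [0.2, 0.5, 1.5, 3.0]  # mass, thermal contact, adjustment, evaluation
combined = math.sqrt(sum(u ** 2 for u in contributions))
print(f"combined uncertainty: +/- {combined:.1f} %")  # +/- 3.4 %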

• A more detailed description of measurement uncertainty can be found in UserCom 30.


• For TGA measurements, it is important to be aware of the minimum sample weight according to USP when
measuring very small sample masses or residues. A complete description of the influence of the minimum
weight of a balance specifically for TGA is given in UserCom 41.

As uncertainty may invalidate results, it is imperative to consider it in the design and validation of any thermal
analysis method.

10.2.3. Method Validation

There are three ways to obtain validated analytical methods; the details are summarized in Table 7.

For a detailed description of method validation, please refer to our “Validation in Thermal Analysis” handbook
(linked to above and in section 10.6., below).
Analytical method validation:
• The method is designed from the very beginning by the user and tested for the relevant variables (e.g. operator, sample preparation, instrument, environment, accuracy, repeatability). Several iterations to optimize the method parameters might be needed.
• The method is exactly that required by the user.
• Validation of the method can take a long time.

Inter-laboratory studies:
• A specific analytical method is tested by all the participating labs using multiple specimens. The results are submitted to a central organizing body that evaluates the data and issues recommendations for the expected accuracy of the method.
• A large amount of data can be obtained relatively quickly.
• Reproducibility (repeatability between different labs) can also be tested.

International standards:
• A published method can be used as it is described. The accuracy and precision to be expected are given in the standard documentation.
• This is the easiest way to obtain a validated method.
• The method might not fully meet the requirements or might not include particular tests.
• Many international standards are available, for example from ISO and ASTM. See Chapter 20 in the "Thermal Analysis in Practice" handbook for a complete list of standards relevant to thermal analysis.

Table 7: The three ways to obtain validated analytical methods.



10.3. Samples

Sampling as well as sample preparation are critical to the accuracy, trueness, and precision of results. General
requirements for sample preparation apply to DSC, TGA, TMA, and DMA. However, as the various instruments
also have individual requirements, we encourage readers to refer to the resources listed below for more detailed
understanding.

Some of the requirements for sampling and sample preparation are common for DSC, TGA, TMA, and DMA:

Before the measurement:
• The sample should be representative of the bulk material [UC 29/1].
• The sample should not be altered thermally or mechanically.
• The sample composition should remain unchanged (moisture, impurities, oxidation).
• The sample should not react during preparation (curing during storage, e.g. induced by light).

During the measurement:
• Use the right atmosphere:
  – Inert gas: nitrogen, helium, argon, etc.
  – Reactive gas: air, oxygen, etc.
  – Good interaction between the gas and the sample.
  – No gas exchange with the sample.
  – Limited interaction: self-generated atmosphere.
• Be aware of possible temperature gradients in the sample.
• Make sure that there is good thermal contact between sample and sample holder (flat sample, flat sample holder).
• The sample must not move during measurement, because this will cause artifacts.

After the measurement:
• Weigh the sample before and after measurement to check for the loss of volatiles or decomposition ("offline TGA").
• Visual check of sample: molten sample (melting visible on the curve?); discolored sample (is decomposition visible?); gas bubbles (due to decomposition?); deformation of the bottom of the crucible (expansion, explosion?).
• Re-evaluate results and prepare a possible second measurement with different parameters.

Several important points should be considered during the sampling process. These include:
• Which production lots should be examined?
• At which point or points, or from which part or parts, of a production lot should samples be examined?
• Are the sampling point and the sample size representative of the production lot with regard to possible
inhomogeneity?
• Does the selected sample allow us to draw conclusions about the bulk sample?
• How large should the sample size be with regard to volume and number of items?
• How often should a material property be determined, and how many samples should be measured?
• Are the samples taken at different points and then mixed together again to form a composite (or aggregate)
sample or are they processed separately as test samples?
• Are the samples always representative of the bulk material from which they were taken?



A typical sampling procedure proceeds as follows:

Lot → Primary sample → Composite sample → Secondary sample → Laboratory sample → (grinding and / or reduction) → Test sample → Test portion

A more substantive description, including detail on sampling, sample size, and sample preparation for individual
thermal analysis techniques, can be found in “Thermal Analysis in Practice: Collected Applications”, as well as in
UserCom 3 and UserCom 29.

10.4. Instrumentation and Software

METTLER TOLEDO’s Thermal Analysis Excellence portfolio supports the techniques presented above. The state-of-
the-art instruments provide high-quality data for individual samples, with automated DSC and TGA/DSC measure-
ments also available.

Each instrument is driven by the STARe Excellence Thermal Analysis Software. New functionalities can be added
via additional options to enable further types of experimentation or evaluation, facilitate data transfer to an exist-
ing LIMS, reinforce data integrity, or comply with 21CFR11 guidance. The integrated relational database affords a
clear overview of all experiments and is easily filterable and searchable.



10.5. Testing and Service

Sample collection and preparation, experimental setup and execution, and data evaluation and interpretation are
all vital to the accurate determination of samples’ behavior and intrinsic properties. However, professional instru-
ment installation and maintenance, along with testing and repair services as needed, are also integral to the con-
tinuing quality of results. A service schedule that fits your needs and your instrumentation’s intensity of use can
be defined in consultation with a service professional; Good Thermal Analysis Practice™ (GTAP™) additionally
reduces risks, providing custom solutions to simplify your processes and ensure traceability and audit-readiness,
as necessary.

10.6. Resources and Further Reading

Handbooks and applications:


Thermal Analysis in Practice
Validation in Thermal Analysis
Thermal Analysis Application Handbooks
Thermal Analysis UserComs
Thermal Analysis Applications

Other resources:
Thermal Analysis Live Webinars
Thermal Analysis On-Demand Webinars
Thermal Analysis How-To Videos
Good Thermal Analysis Practice™ (GTAP™) framework (Best practice guidelines)
Thermal Analysis eNewsletter



11. Conclusion

The previous chapters of this guide outline instrumentation, software, and best practice suggestions for gravi-
metric, volumetric, analytical chemistry, thermal values, and thermal analysis workflows. We encourage you to
consult the references and resources provided in each chapter for deeper understanding of related topics, or to
contact your local METTLER TOLEDO representative for further discussion.



12. Additional Resources

Expertise Library

In our Library you will find guides and white papers addressing the most important and current topics in the field
of laboratory weighing, analytical instruments, and liquid handling. Among these are the following:

• Efficient Lab Data Management


• Handling Challenging Samples: Resources and Strategies for Safe, Efficient Work
• Tailored Strategies for Lab Automation

Visit our Library


www.mt.com/library

Webinars
We provide web-based seminars (webinars) on different topics. Participate in on-demand webinars at any
­convenient time and place, or ask questions and discuss points of interest with METTLER TOLEDO specialists
and guest speakers during live webinars.
www.mt.com/webinars

Applications and Magazines


Our searchable application databases contain hundreds of applications from different industry ­segments.
Applications are proven methods with expected results, detailed method parameters, and concluding
evaluations.

UserComs are periodicals distributed to individuals interested in practical application feedback, expert tips,
t­echnical information, and the latest product news. The information is intended to give readers new ideas on
how to solve analytical problems in their own laboratories. If you have an interesting application that you would
like to share with other users, we would be delighted to publish it in the UserCom.

Titration applications www.mt.com/titration_applications


Analytical chemistry UserCom www.mt.com/anachem-usercom
Thermal analysis applications www.mt.com/ta-applications
Thermal analysis UserCom www.mt.com/ta-usercoms
Moisture analyzer applications www.mt.com/moisture



Optimize Equipment Use
With Good Measuring Practices

METTLER TOLEDO's Good Measuring Practices framework supports you in laboratory and production environments
alike. Discover quality assurance measures for laboratory instruments and simple, straightforward recommendations
for their selection, installation, calibration, and operation.

Implement Good Measuring Practices in your laboratory:
www.mt.com/GP

Our Good Measuring Practices Programs

• Good Electrochemistry Practice™


• Good Weighing Practice™
• Good Titration Practice™
• Good Pipetting Practice™
• Good Melting and Dropping Point Practice™
• Good Density and Refractometry Practice™
• Good Thermal Analysis Practice™
• Good UV/ VIS Practice™

METTLER TOLEDO Group
Laboratory Weighing
Local contact: www.mt.com/contacts

For more information: www.mt.com

Subject to technical changes
© 09/2021 METTLER TOLEDO. All rights reserved
108590202A
Group MarCom RITM737488 JK
