
AOAC PEER-VERIFIED METHODS

PROGRAM

MANUAL ON POLICIES
AND PROCEDURES

AOAC INTERNATIONAL

AOAC PEER-VERIFIED METHODS
POLICIES AND PROCEDURES
CONTENTS
I. Introduction and General Principles
   Purpose
   Scope
   Submission of Procedures
   Distribution/Publication
   Sunsetting Provision

II. Outline for Process of Acceptance as a Peer-Verified Method

III. Suggested Practice for Characterizing Method Performance
   Minimum Validation Criteria
   Suggested Testing for Performance Characterization
   Additional or Different Parameters for Qualitative Methods
   Generic Descriptions
   Immunoassay and Instrumental Methods

IV. Independent Laboratory Requirement
   Identification and Selection of Independent Laboratory
   Method Format
   Sample Selection
   Precision and Accuracy
   Evaluation

V. Technical Review Process

VI. Process for Comments on Peer-Verified Methods

VII. Process for Submitting Peer-Verified Methods to AOAC Official Methods Program

CHECKLISTS AND FORMS


Checklist for Method Development, Characterization, and Ruggedness Testing

Safety Checklist for Methods

Peer-Verified Methods Format

Checklist for Independent Laboratory Protocol

Independent Laboratory Report Form

Technical Review Form for Peer-Verified Method Study

APPENDIXES

Definitions

Estimation and Statement of Uncertainty

Ruggedness Testing Procedure

Terms of Reference: Peer-Verified Methods Advisory Committee


AOAC PEER-VERIFIED METHODS
POLICIES AND PROCEDURES

I. INTRODUCTION AND GENERAL PRINCIPLES


1. Purpose

The AOAC Peer-Verified Methods program is intended to provide a class of tested methods
which have not been the subject of a full collaborative study. Through a less resource-intensive
process, the program provides a rapid entry point for methods into, and recognition by, AOAC,
and a level of validation for methods not otherwise evaluated.

Three stages in the validation of analytical methods are (1) establishment of acceptable
performance parameters within a laboratory; (2) demonstration of acceptable performance in a
second or third laboratory; and (3) demonstration of acceptable performance in an
interlaboratory collaborative study. Progression through the stages confers an increasing degree
of confidence in the performance of the method and the results produced in different
laboratories. Successful testing through stage 2 is the requirement for status as an AOAC Peer-
Verified Method.

By responding to the needs of analysts and laboratories for an intermediate stage of
validation for a wide range of methods, the program provides the tools required to meet some
analytical demands. Full collaborative study of a Peer-Verified Method is strongly encouraged,
especially when further validation will more fully meet user requirements.

2. Scope

A candidate AOAC Peer-Verified Method has been subjected to ruggedness testing in the
author's laboratory, and performance parameters have been tested in at least one other
laboratory. Information on the performance of the method includes evidence of analyte
identification, resolution from interferences, sensitivity to concentration, and lower limits of
detection or determination.

The scope of procedures suitable for testing and submission includes new analytical
procedures, revisions of validated methods to extend the applicability or improve the
performance, adaptations of methods or procedures to automation, and applications of
techniques to new analytical problems.


There are 4 main phases in the Peer-Verified Methods program: (1) The method author
develops a data package characterizing the performance of the method according to method
characterization criteria recommended by AOAC. (2) The method author prepares a protocol for
testing the performance of the method and recruits a second or additional independent
laboratories to conduct the performance test, again according to testing criteria recommended
by AOAC. (3) The method characterization data and independent laboratory testing data are
submitted to the appropriate Technical Referee, generally chosen from among AOAC General
Referees and Methods Committee members, who seeks a review by qualified experts. The
Referee, on the basis of a minimum of 2 positive reviews, grants AOAC Peer-Verified Method
status to the method. (4) Accepted Peer-Verified Methods are collected and published on a
quarterly basis according to a format that includes the data on method characterization,
directions on performing the method, and a request for feedback on use of the method by other
laboratories.

The in-house characterization parameters are compatible with the ruggedness testing that is
required as part of the protocol submission for proposed AOAC Official Methods. This facilitates
movement of a Peer-Verified Method into the AOAC Official Method study process at some later
date. Peer-Verified Methods that are subsequently subjected to full collaborative study will be
deleted from the Peer-Verified Method category.

Both quantitative and qualitative methods may be studied and submitted for acceptance as
Peer-Verified Methods. Rapid "kit" methods may be submitted, but only if there are no
proprietary components. Commercial kit methods based on proprietary technology should be
submitted to the AOAC Test Kit Performance Tested Program, conducted by the AOAC
Research Institute, 481 North Frederick Ave., Suite 500, Gaithersburg, MD 20877-2417, USA;
Phone: +1-301-924-7077, Fax: +1-301-924-7089.

3. Submission of Procedures

The Peer-Verified Methods process is fully open. Any interested party may submit a procedure
for AOAC acceptance, and may submit comments on published procedures. Submissions,
queries, and comments should be sent to AOAC INTERNATIONAL, Peer-Verified Methods
Program, 481 North Frederick Ave., Suite 500, Gaithersburg, MD 20877-2417, USA; Phone: +1-
301-924-7077, Fax: +1-301-924-7089.

All tested procedures that are submitted are subjected to technical review before acceptance
and publication.

Oversight for the program is provided by the Peer-Verified Methods Advisory Committee. The
Committee also recommends and implements the policies of the AOAC Board of Directors
relating to the program.

4. Distribution/Publication

Accepted Peer-Verified Methods are published in the Journal of AOAC INTERNATIONAL
(JAOAC). The JAOAC is available on a subscription basis from AOAC INTERNATIONAL, 481 North
Frederick Ave., Suite 500, Gaithersburg, MD 20877-2417, USA; Phone: +1-301-924-7077, Fax: +1-
301-924-7089. When an inventory has accumulated, a methods compendium will become available
by subscription. PVMs are individually available from AOAC INTERNATIONAL.

5. Sunsetting Provision

Accepted Peer-Verified Methods will be reviewed for continued appropriateness after they have
been available for 5 years. A notice seeking comments will be carried in The Referee section of
the AOAC magazine Inside Laboratory Management. Methods will continue to be available as
AOAC Peer-Verified Methods if no negative comments are received. Submitted comments will
be appended to the method, resolved by the method submitter or Technical Referee, or will
result in revision of the method, as necessary. Methods considered no longer appropriate will
be deleted from the collection.


II. OUTLINE FOR PROCESS OF ACCEPTANCE AS A PEER-VERIFIED
METHOD
1. The method author develops the initial method data according to AOAC criteria (see
"Checklist for Method Development, Characterization, and Ruggedness Testing").

2. The method author writes the method in the step-by-step format specified by AOAC that
follows the expected flow of laboratory operations (see "Peer-Verified Method Format").
The author is encouraged to incorporate appropriate safety information (see "Safety
Checklist for Methods").

3. The method author develops the design or protocol for second laboratory testing of
method performance according to AOAC criteria (see "Checklist for Development of
Independent Laboratory Protocol"). He or she is encouraged to discuss the method or
seek advice from the appropriate Technical Referee. The AOAC Office can identify an
appropriate expert if one is not known to the author.

4. The method author follows the policy for selecting an independent peer laboratory (see
"Guidelines for Selecting an Independent Laboratory") and recruits the laboratory to test
the method according to the written protocol. The author prepares any necessary
collaborating samples and sends the method, samples, protocol, and form (see
"Independent Laboratory Report Form") to the second laboratory.

5. The independent laboratory conducts the method performance test according to the
method and protocol. The two laboratories may communicate to facilitate clarifications
and improvements in the method and method directions. The laboratories shall keep a
record of these communications.

6. Within the agreed time, the second laboratory submits its results to the method author,
using the report form.

7. The method author analyzes the data and sends the method development,
characterization, and ruggedness testing data and the independent laboratory report to
the AOAC Office.

8. The AOAC Office distributes all material to the appropriate Technical Referee.

9. The Referee selects at least 2 experts to provide a written technical review, and sends
the submitted information (see "Peer-Verified Method Technical Review Form").
Reviewers are asked to return their reviews to the Referee within 3 weeks.


10. The Technical Referee acts as follows:

10.1. If 2 reviews are negative, the Referee notifies the AOAC Office that the method is
not acceptable.
10.2. If 2 reviews are positive, with very minor comments, the Referee notifies the
AOAC Office that the method is acceptable with minor revisions.
10.3. If the initial review is split, the Referee seeks 1 or more additional experts as
needed to obtain 2 corroborating reviews.
10.4. If 2 reviews are positive, but moderate revisions are requested, the Referee
requests the revisions from the method author. Once the revisions are
satisfactory, the Referee notifies the AOAC Office that the method is acceptable.

11. The AOAC Office prepares the material for publication in the Peer-Verified Methods
Notebook.


III. SUGGESTED PRACTICE FOR CHARACTERIZING METHOD
PERFORMANCE
The following performance parameters have been developed primarily for quantitative chemical
and physical methods; parameters specifically applicable to qualitative and microbiological
methods are presented at the end of this section. The suggested practice is presented in
general terms; expected practice should be addressed topic by topic for the specific scientific
method proposed. Any questions about what is expected practice in method performance
validation for a specific scientific topic should be addressed to the appropriate General Referee
and should be documented.

1. Minimum Validation Data

The following method validation parameters are normally expected (see Appendix A for
definitions):

Accuracy
Recovery
Calibration curve
Linearity
Limit of detection
Limit of quantitation
Precision: Repeatability and reproducibility
Sensitivity
Specificity

For any of these parameters that are excluded, a brief explanation should be given to justify why
they were not included.

2. Suggested Testing for Performance Characterization

Accuracy - Accuracy can be determined by one of these techniques, in preferred order:

- use of certified reference materials
- use of a reference method of known uncertainty
- use of recovery from spiked samples

Reference material, if available, should be carried through the entire procedure with each batch
of test samples. If fortified or spiked samples are used, the method of fortification should be
described.

Recovery - For quantitative methods, recovery of added analyte over an appropriate range of
concentrations (mean range variability) may be taken as an indication of trueness.


The concentrations should cover the range of concern, including one concentration at the target
level, one at the low end, and one at the high end.

Recovery is calculated as follows:

% Recovery = 100 × (measured concentration in fortified material − measured concentration in unfortified material) / (known increment in concentration)

The amount added should be a substantial fraction of, or more than, the amount present in the
unfortified material.

Guidelines are available for expected recovery ranges in specific areas of analysis. The design
of recovery experiments should be appropriate for the intended purpose of the method.
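For illustration, the recovery calculation above can be sketched as follows; the concentrations used here are hypothetical values, not data from the program:

```python
def percent_recovery(fortified_conc, unfortified_conc, added_conc):
    """Percent recovery of a known analyte addition (spike)."""
    return 100.0 * (fortified_conc - unfortified_conc) / added_conc

# Hypothetical example: the unfortified sample measures 2.0 mg/kg,
# 3.0 mg/kg of analyte is added, and the fortified sample measures 4.7 mg/kg.
recovery = percent_recovery(4.7, 2.0, 3.0)  # -> 90.0
```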

Calibration Curve - A sufficient number of standard solutions is needed to define the response
in relation to concentration. The number of standard solutions is a function of the range of
concentration. In most cases, a minimum of 5 concentrations of standard solutions (not zero) is
appropriate to prepare a calibration curve. The curve should be statistically tested and
expressed.

Response at various concentrations in pure or defined solvent, and response at various
concentrations in matrix(es), should be presented. For nonlinear curves, more standard
solutions would be necessary. The curve should be extrapolated above or below the
concentrations tested only as necessary to examine behavior at "zero" and to apply the
method of additions.
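One way to statistically test and express a calibration curve is an ordinary least-squares fit. The sketch below uses hypothetical detector responses for 5 standard concentrations and is illustrative only:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Hypothetical responses for 5 standard concentrations (none zero)
conc = [1.0, 2.0, 4.0, 6.0, 8.0]
resp = [10.2, 19.8, 40.5, 59.7, 80.1]
slope, intercept = linear_fit(conc, resp)
```

The fitted slope and intercept, together with residuals, give the statistical expression of the curve over the tested range.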

Linearity - Where applicable, linearity can be determined using external analyte standards at 4
nominal concentrations: 0x, ½x, 1x, and 2x, on 3 separate days (where x is the normal, usually
found level or target level for action), or as generally accepted in the commodity testing area.

Limit of Detection - Defined as the mean value of the matrix blank readings (n ≥ 20) plus 3
standard deviations of the mean, expressed in analyte concentration. For methods with less than
100% recovery, the limit of detection should be corrected for recovery.

Limit of Quantitation - The limit of quantitation is the lowest amount of analyte in a sample
which can be quantitatively determined with precision and accuracy appropriate to analyte and
matrix considered. It can be defined, e.g., as the mean value of the matrix blank reading plus 10
standard deviations of the mean, expressed in analyte concentration. It is not determined by
extrapolation. Correction for recovery in the limit of quantitation, if used, should be stated in the
method.

Limit of Determination/Limit of Decision - The limit is equal to the mean of the measured
content of blank samples (n ≥ 20) plus 6 times the standard deviation of the mean.
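The three blank-based limits above differ only in the multiplier applied to the standard deviation (3 for detection, 6 for decision/determination, 10 for quantitation). A minimal sketch, using hypothetical blank readings; note it uses the sample standard deviation of the readings, so adjust if the standard error of the mean is intended:

```python
from statistics import mean, stdev

def blank_based_limit(blank_readings, k):
    """Mean of the blank readings plus k standard deviations,
    expressed in analyte concentration (n >= 20 blanks expected)."""
    return mean(blank_readings) + k * stdev(blank_readings)

# Hypothetical matrix blank readings (n = 20), in concentration units
blanks = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08, 0.10, 0.11, 0.12,
          0.09, 0.10, 0.11, 0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.10]

lod = blank_based_limit(blanks, 3)    # limit of detection
ldec = blank_based_limit(blanks, 6)   # limit of decision/determination
loq = blank_based_limit(blanks, 10)   # limit of quantitation
```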

Precision - The relative standard deviation of individual results should be determined from
different days, different analysts, different calibration curves, different batches of reagents, and
for different matrixes.

Precision is usually defined in terms of reproducibility and repeatability:

Reproducibility: Ideally, precision should be measured for performance between
laboratories. Obtain values for different samples, different concentrations (½x, 1x, and
1½x or 2x), and separate days. Reproducibility data are generated from a minimal number
of laboratories (the author's own laboratory plus the independent laboratory or laboratories).

Repeatability: This component of reproducibility can be determined within-laboratory.
Specific examples in the Appendix suggest practices that are acceptable in various topic
areas.
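Precision at a given concentration is commonly summarized as a percent relative standard deviation (RSD) of replicate results; a minimal sketch with hypothetical within-laboratory replicates:

```python
from statistics import mean, stdev

def percent_rsd(results):
    """Percent relative standard deviation of replicate results."""
    return 100.0 * stdev(results) / mean(results)

# Hypothetical within-laboratory replicates at one concentration
replicates = [9.8, 10.1, 10.0, 9.7, 10.3, 9.9]
rsd_r = percent_rsd(replicates)
```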

Specificity - Reagent blanks and field blanks should be run to ensure no interfering compounds
are present. To verify the specificity of the method for the analyte(s) of interest, results should be
tested under different experimental conditions, e.g., 2 different analytical principles or 2 different
detection techniques. The method should be able to distinguish the analyte from known
interfering materials; and the behavior of the analyte during analysis should be indistinguishable
from the corresponding standard material in the appropriate matrix.

Sensitivity - Sensitivity is normally measured in terms of previously defined parameters.

Comparison to Existing Methods - Comparison to an existing method, when applicable, is
strongly recommended (preferably a validated or collaboratively studied method). Depending on
the topic, a suitable number of comparative analyses, preferably at levels tested in a collaborative
study of the "old" method, is generally sufficient to show equivalent performance.

3. Additional or Different Parameters for Qualitative Methods

Recovery - For qualitative methods, recovery may be estimated by comparison of results with
an existing method.

Linearity - For qualitative methods, the entire response range may be examined by testing a
series of test samples and standards consisting of negative test samples, test samples at the
specification level, test samples bracketing the specification level, and strong positive test
samples.

Cross Reactivity - Samples should be tested for cross-reactivity with other known entities.

Specificity - Usually determined as cross-reactivity. All samples should be tested for cross-
reactivity with other known entities.

Incidence of False Positives/False Negatives - For qualitative methods, precision cannot be
expressed as a standard deviation or coefficient of variation, but may be expressed as true and
false positive (or negative) rates. These rates should be determined at several concentrations,
including the specification level. Data from a confirmatory method comparison should be provided
if such a method is applicable to the same matrix(es) and concentration range(s). In the
absence of a method comparison, populations of negative and positive fortified samples should be
analyzed.

Contact the appropriate General Referee for the calculation of false positive/false negatives for
your scientific area. Formulas used in the calculation must be defined and included in the
method.

Response rates can also be expressed in terms of specificity, sensitivity, and predictive rates.
Exact definitions of specificity, sensitivity, and predictive rates as used should be described.
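Under the common definitions (the exact definitions used should still be stated in the method, as noted above), these rates can be computed from counts of true/false positive and negative results; the counts below are hypothetical:

```python
def qualitative_rates(tp, fp, tn, fn):
    """Response rates for a qualitative method from result counts,
    using the common (but not universal) definitions."""
    return {
        "false_positive_rate": fp / (fp + tn),
        "false_negative_rate": fn / (fn + tp),
        "sensitivity": tp / (tp + fn),            # true positive rate
        "specificity": tn / (tn + fp),            # true negative rate
        "positive_predictive_value": tp / (tp + fp),
        "negative_predictive_value": tn / (tn + fn),
    }

# Hypothetical counts from fortified (positive) and blank (negative) samples
rates = qualitative_rates(tp=48, fp=2, tn=45, fn=5)
```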

Presence/Absence or Qualitative Comparative Tests - For microbiological rapid tests, the
following studies can be done by both the rapid test and a culture method.

Inoculation Test with Viable Cells:

- 5 different matrix samples
- 10 different strains of the relevant microorganism at 3 different levels
  (low: 1-10 cells; high: 10-100 cells; blank: no addition)
- 10 different strains of competitive microorganisms at a high level
- Perform tests in duplicate.

Quantitative Microbiological Tests - The same studies recommended for qualitative tests can
be done, but the range of applicability should be defined, with comparisons done over that
range.

Microbiological "Rapid" (Non-culture) Method: A rapid method should be compared with the
official culture method, where one exists, over selected strains of the microorganism, over selected
strains of competitive microorganisms, and over the applicable range. When the method is intended
as a screen, the false positive rate may be higher than for culture methods, but the false negative
rate is expected to be 0.

4. Generic Descriptions

Performance-Based Descriptions of Reagents and Apparatus - Apparatus and reagents
specified in methods shall be specified by performance or generic characteristics rather than
brand name. To avoid referring to specific brand products, critical parameters shall be identified
and defined. System suitability standards and tests should be established and clearly presented
such that the product is defined generically and equivalency can be readily determined. Vendor
or brand names may be included and stated as "examples". A disclaimer should be included as a
footnote stating, "AOAC does not endorse the brand names used in this method; they are
included only as examples."

System Suitability Tests - Methods should contain system suitability tests where appropriate
for determining acceptability of reagents and apparatus. A system suitability test is a clear,
concise, specific technical statement of a definitive test or tests which indicate that all
controllable variables attributable to a particular product used in a method are within acceptable
limits. This test or tests will be used by the analyst to determine whether a product used in the
method will have any measurable effect on the method results.

5. Immunoassay and Instrumental Methods

Immunoassay: Quality control samples should be used at 0, low, mid, and high analyte range.
RSDr of control samples at the limit of decision should be <0.15 (= 15%). Peer-Verified Methods
validation does not imply the suitability of a particular instrument, reagent, and/or supply. It is up to
the user of the method to verify the suitability or acceptability of reagents and apparatus used in the
method.

Gas Chromatography/Liquid Chromatography: The analyte should elute at the retention time of
the reference material. The nearest peak maximum should be separated from the analyte peak by
one full width at half-height. In co-chromatography, only the peak presumed to be due to the analyte
should be intensified, and the width at one-half maximum height should be within 10% of the original
width. (Co-chromatography: Divide the test solution into 2 parts. Chromatograph one part as such.
Add an amount of standard material of the analyte to the second part equal to the estimated or
expected amount of analyte, and chromatograph.)

Thin Layer Chromatography: Rf values of analyte should agree with Rf values characteristic of
reference material. Center of spot nearest to that of analyte should be separated from it by at
least one-half sum of spot diameters. Visual appearance of analyte should be indistinguishable
from that of reference material. In co-chromatography, only the spot presumed to be due to the
analyte should be intensified, and visual appearance should not change.

Mass Spectrometry: The relative abundance of all diagnostic ions should match those of
reference material, and relative intensities, as percentage of intensity of base peak, should
match reference material within 10% (EI) or 20% (CI).
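One way to read the mass spectrometry criterion is as an absolute tolerance on each diagnostic ion's relative intensity, expressed as a percentage of the base peak; this interpretation is an assumption, and the sketch below is illustrative only:

```python
def ion_intensities_match(sample, reference, tol):
    """True if every diagnostic ion's relative intensity (% of base peak)
    is within tol percentage points of the reference value
    (e.g., tol=10 for EI, tol=20 for CI -- an assumed reading of the rule)."""
    return all(abs(s - r) <= tol for s, r in zip(sample, reference))

# Hypothetical relative intensities for three diagnostic ions
reference = [100.0, 62.0, 35.0]
sample = [100.0, 55.0, 41.0]
ok_ei = ion_intensities_match(sample, reference, tol=10)  # EI tolerance
```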

Infrared Spectrometry: Absorption should be present in all regions of the analyte spectrum which
correspond with an adequate peak in the reference spectrum. The percentage of adequate peaks
found for the analyte should be ≥ 50%. When there is no exact match for an adequate peak, the
relevant region of the analyte spectrum should be consistent with the presence of an adequate
peak.


IV. INDEPENDENT LABORATORY REQUIREMENT
1. Identification and Selection of Independent Laboratory

The method originator selects the peer laboratory. The purpose of the second laboratory is to
verify and evaluate the method. It is not expected to duplicate all the data generated by the
originator.

The independent laboratory can be public, private, or part of the method originator's organization
as long as the personnel used to conduct the independent laboratory validation do not report to
the study director involved in developing or submitting the method and do not use the same
equipment or supplies. The laboratory should be one where the type of assay in question is
done on a routine basis. As an example, if the assay is one done by liquid chromatography and
the matrix is food, then it should be done in a laboratory which routinely assays food samples
using liquid chromatography.

The name of the laboratory will be published as part of the validation information in the published
Peer-Verified Method.

2. Method Format

The independent laboratory should receive the method in the step-by-step format which would
be submitted to AOAC for publication. This is very important because, if the laboratory has any
trouble, the exact step can be identified and clarified.

3. Sample Selection

The type of samples should be selected by the method originator and should address the
requirement for the second laboratory to substantiate the range and within-laboratory precision
and accuracy. The second laboratory should in addition obtain local samples which might be of
interest and for which the content is "known."

The independent laboratory should when possible receive at least some of the same samples
assayed by the method originator, so that between-laboratory precision data can be obtained. If
this is not possible, then, depending on the number of samples involved, at least half of the
samples should be known (reference materials) or previously assayed by other methods.

Trial or practice samples should also be made available to the independent laboratory for
analyst familiarization.


4. Precision and Accuracy

The independent laboratory should verify the precision and accuracy of the method. Precision
should be determined in at least representative matrixes of the target samples. Accuracy should
be determined through the use of reference materials, other known samples, or standard
additions. If available, all three approaches should be used.

5. Evaluation

The independent laboratory should prepare a written evaluation of the method, including
suggestions for alternative steps or procedures. Also addressed should be accuracy and
precision as experienced by the second laboratory, ease of method use, special
glassware/instrumentation/etc. requirements, and general comments.

The independent laboratory should maintain a record of all communications with the method
originator pertaining to the method and interpretation of results. This record should be submitted
along with the evaluation.

The independent laboratory should retain detection records (chromatograms, spectra, etc.) for
all samples analyzed, including blanks and standards, and submit these along with the method
evaluation.

In the event of non-agreement by the second laboratory, the method should not be published
until all issues are resolved.


V. TECHNICAL REVIEW PROCESS
The General Referee and Methods Committee members are the basic resource within AOAC for
Technical Referees. The Technical Referee is responsible for selecting the technical reviewers
and evaluating their reviews of Peer-Verified Methods studies submitted for AOAC acceptance
within his or her broad analytical area (e.g., dairy products, drugs in feeds, pesticide
formulations, microbiological contamination). Once the technical reviews are obtained, and the
submission is revised, if necessary, the Technical Referee decides on acceptability of the
method for AOAC Peer-Verified Methods status.

The Technical Referee also monitors the comment activity on a published method. He or she
receives copies of submitted comments that are transmitted to the method author for review, and
may act on those comments if a method author is not available or does not act.

The following is a description of the process that should be followed in reviewing a contributed
method:

1. Receive a Peer-Verified Method submission package only from the AOAC Office.

2. Select a minimum of 2-3 qualified reviewers to comment on the method, i.e., at least 2
reviewers should submit comments of significance:

i. persons known to you to be working in the subject area;
ii. persons who have published on a similar topic;
iii. authors cited in the submission bibliography, if any;
iv. persons suggested by the submitter (use with discretion);
v. AOAC Associate Referees or Methods Committee members;
vi. collaborators acknowledged in studies of similar topics.

3. Notify the AOAC Office of the names, addresses, and phone and fax numbers if possible
of the selected reviewers.

4. AOAC Office will send a copy of the method report, including in-house method
characterization information, independent laboratory verification data, and method
description, with a letter requesting review and the Peer-Verified Method Technical
Review Form to 2 reviewers.

5. If a reviewer does not respond in a reasonable time (3 weeks), AOAC Office will write,
phone, or fax to determine if the request was received, and if the reviewer expects to be
able to submit an evaluation. If not, return of the package will be requested, and the third
reviewer will be contacted.


6. Similarly, if two conflicting reviews are received, the third reviewer will be selected.

7. When the necessary reviews are received, they will be transmitted to the Technical
Referee for evaluation:

a. If two reviewers recommend acceptance as a Peer-Verified Method, with few
and/or minor revisions, the Technical Referee may make these revisions and
return all material and a recommendation on acceptance to the AOAC Office.
b. If two reviewers recommend against acceptance as a Peer-Verified Method,
return all material and the recommendation against acceptance to the AOAC Office.
c. If two reviewers recommend acceptance but with moderate revision of the study
and/or method description, work with the author to reach an acceptable report.
Return all material, including the revised report, and a recommendation on acceptance
to the AOAC Office.

VI. PROCESS FOR COMMENTS ON PEER-VERIFIED METHODS
1. Comments by Peer Laboratories, Technical Referees, and Reviewers on Peer-Verified Methods under review must be itemized in writing on a separate piece of paper and submitted to the AOAC Office. Comments must also include page numbers and, if possible, line numbers. Examples of comments are extension of the use of the method to additional matrixes, analytes, or concentration ranges; information on interferences; identification of critical steps; etc.

2. Comments shall be accompanied by supporting data or other information (such as information from the literature).

3. The AOAC Office submits the comment(s) to the method author. In the absence of a
method author, or if the method author fails to respond to the AOAC Office, any
comments shall be submitted to the appropriate Technical Referee.

4. The method author or Technical Referee evaluates the acceptability of the comment(s) and recommends whether to include them in the next update of the publication of the affected method.

5. Comments accepted for appending to the method will be accompanied by the commenter's name and contact address.

VII. PROCESS FOR SUBMITTING PEER-VERIFIED METHODS TO AOAC
OFFICIAL METHODS PROGRAM
1. Any method submitted to or accepted as an AOAC Peer-Verified Method is a candidate
for collaborative study and consideration as an AOAC Official Method.

2. Information and data collected as part of the in-house method characterization and
independent laboratory verification may be submitted for fulfillment of similar
requirements in the AOAC Official Methods Program "Checklist for Protocol Design of
Interlaboratory Collaborative Studies."

3. Upon submission of method and complete protocol design to the AOAC Office, the
proposed method and study will be processed as required for review, conduct of study,
evaluation of results, and adoption as an AOAC Official Method. See AOAC Official
Methods Program.

CHECKLIST FOR METHOD DEVELOPMENT, CHARACTERIZATION, AND
RUGGEDNESS TESTING
1. Scope of Method

Characterize scope of method as follows:

() Need/purpose of method (e.g., enforcement, surveillance, monitoring, acceptance testing, quality control)

() Type of method (e.g., reference, screening, confirmatory)

() Applicability of method (e.g., concentration range, analyte(s), matrix(es))

2. Ruggedness of Selected Method

Follow procedures for ruggedness testing as outlined, for example, in one or both of the
following:

Youden & Steiner (1975) Statistical Manual of the AOAC, AOAC International, Arlington,
VA. SEE APPENDIX C.

Dols & Armbrecht (1976) J. Assoc. Off. Anal. Chem. 59, 1204-1207

3. Within-Laboratory Performance of Optimized Method

Provide data on the following:

() Trueness or Bias (systematic error)

() Recovery

() Calibration function (response vs concentration in solvent)

() Analytical function (response vs concentration in matrix, including blank)

() Limit of detection; limit of quantitation

() Repeatability

() Mathematical transformations needed, if any

4. Characterization/Specifications of Optimized Method

Provide information on the following:

() Interferences (specificity): impurities, contaminants, additives, any components expected to be present; nonspecific effect of matrixes; effects of transformation products or metabolic products

() Performance specifications and acceptability criteria for reagents, instruments

() Suitability tests for systems

() Critical steps or parameters

() Comparison with other methods, if available

5. Quality Control of Method Performance

Provide information/specifications/requirements for the following:

() Acceptance limits

() Calibration checks

() Reference materials

() Blank checks

() Standards

() Fortification samples

6. Safety Considerations

See separate Safety Checklist for Methods

7. Detection records (copies of chromatograms, spectra, absorbance readings, etc.) for samples, standards, and controls, clearly labeled.

SAFETY CHECKLIST FOR METHODS (If "Yes," include cautionary statements)
Yes No

() () 1. Are any materials used or compounds formed which are explosive or flammable?

() () 2. Are there any side reactions that could occur that might produce
flammable or explosive products or conditions?

() () 3. Are there any hazards created from electrical or mechanical equipment?

() () 4. Are pressure differentials created that could result in explosion or implosion?

() () 5. Are any substances used or formed which are:


() a) radioactive
() b) carcinogenic
() c) mutagenic
() d) teratogenic
() e) abortogenic
() f) otherwise a significant health hazard

() () 6. Would there be increased hazards if the reaction temperature were increased even modestly?

() () 7. Are special procedures required if a reaction mixture spill occurs?

() () 8. Is there a risk in producing a dangerous aerosol?

() () 9. Are special procedures required for the disposal of reagents or reaction products?

() () 10. Are there any organisms and/or their products used/present that
are:
() a) pathogenic
() b) allergenic
() c) carcinogenic
() d) mutagenic
() e) teratogenic
() f) otherwise a significant health hazard

() () 11. Are there any potential hazards in handling or storage of reagents, samples, or standards?
() () 12. Are there any other hazards for which it might be important to provide information with the method?

PEER-VERIFIED METHOD FORMAT
1. Cover Page

Method Name Peer-Verified Method Number

Method Abstract
Summary of principles of analysis, scope of method, performance data

Method Author(s)
Address(es)
Phone number(s) of contact

Names of Submitting and Peer Laboratory (minimum of 2)

Acknowledgment of reviewers

List of Standard Reference Materials (if available, to be used for method performance testing and/or quality assurance)

2. Reverse of Cover Page

Summary of Results of Verification Study (submitter and peer laboratory separately identified)
Matrices
Number of samples
Accuracy and/or Recoveries
Precision: Repeatability and Reproducibility
Results on SRM (if applicable)

3. Method Page (ISO Format)

Safety Precautions

Scope
Matrices
Applicable Range, Concentrations
Limitations

References

Definitions

Principle
Description of all chemical reactions (if applicable)
Structures of reactants/products (if applicable)
Quantitation principle

Standards
Purity
Dilutions
Stability

Reagents/Supplies

Apparatus
Instrument configurations
Operating conditions
Response
Performance specifications
Acceptance tests

Sampling
Sampling procedure
Sample storage

Sample Preparation
Sample workup
Sample storage

Controls Preparation
Blanks
Control samples
Fortification procedures

Procedure
Test portions
Extractions
Dilutions
Other treatments
Determination

Calculations
Equations
Factors
Standard curve

Test Results Report

Quality Assurance
Critical control points

Footnotes and User Comments (Annex)
For example, date submitted, available data

CHECKLIST FOR INDEPENDENT LABORATORY PROTOCOL
1. Method

() AOAC Peer-Verified Method format

2. Samples and Standards

() Clearly labeled with identification, quantity, potency (for standards), storage

3. Laboratory Evaluation

() Directions for preparation of standards and standard curve

() Directions for familiarization (at least 2 control samples and 2 fortified samples)

4. Substantiation of Within-Laboratory Performance

() Trueness and/or Recovery

() Repeatability

5. Between-Laboratory Performance

() Reproducibility

6. Report Form

() Specify number of significant figures as needed according to appropriate topic


INDEPENDENT LABORATORY REPORT FORM
Author(s):

Method:

Independent Laboratory:

Independent Laboratory Contact (name, address, phone):

Technical Evaluation Criteria: Yes No

1. Is the method technically sound? ___ ___

2. Do the method description and peer evaluation cover
all pertinent items on the Checklist for Development,
Characterization and Ruggedness Testing? ___ ___

3. Are the conclusions and method applicability
statements, as proposed by the author,
valid based on the data? ___ ___

4. Are sufficient data points evaluated? ___ ___

5. Is the method practical, including safety and
environmental considerations? ___ ___

6. Are critical steps identified? ___ ___

7. Are quality control measures identified/adequate? ___ ___

Editorial Evaluation Criteria:

8. Is the method description clear and complete? ___ ___

9. Are all tables, figures, equations, and terms
sufficiently explained? ___ ___

10. Have reagents and apparatus been described in
performance terms with system suitability tests
where necessary? ___ ___

Submit with Report Form: All data; chromatograms, absorbance curves, etc., for blanks,
standards, control samples, samples.


TECHNICAL REVIEW FORM FOR PEER-VERIFIED METHOD STUDY
Author(s):

Method:

Technical Recommendation:

_____ The data indicate the method is acceptable.

_____ The method cannot be properly evaluated until additional information is supplied as
indicated below or specified on a separate sheet.

_____ The data indicate the method is not acceptable.

Technical Evaluation Criteria: Yes No

1. Is the method technically sound? ___ ___

2. Do the method description and peer evaluation cover
all pertinent items on the Checklist for Development,
Characterization, and Ruggedness Testing? ___ ___

3. Are the peer laboratory's comments addressed? ___ ___

4. Are the conclusions and method applicability
statements valid based on data presented? ___ ___

5. Are sufficient data points evaluated? ___ ___

Editorial Recommendation:

_____ The report presentation is acceptable.

_____ The report presentation requires revisions as indicated below or on a separate sheet.

Editorial Evaluation Criteria:

6. Is the method presented in the correct format? ___ ___

7. Is the method description clear and unambiguous? ___ ___

8. Are all tables, figures, equations, and terms explained? ___ ___

9. Have reagents and apparatus been described in performance
terms with system suitability tests where necessary? ___ ___


APPENDIX A: DEFINITIONS
Analyte - Component of a test sample, the presence of which has to be demonstrated. "Analyte"
includes derivatives formed from the analyte during analysis.

Matrix - All the constituents of the test sample with the exception of the analyte.

Standard Material - Well-defined substance in its highest obtainable purity.

Reference Material - Matrix containing known amount of analyte, ideally close to the expected
amount in test samples.

Control Sample - Sample that does not contain the analyte of interest.

Fortified or Spiked Sample -- Control sample to which known analyte quantity has been added.

Incurred Sample - Sample that contains analyte naturally produced (biologically incurred or
commercially produced).

Accuracy - The closeness of the determined value to the true value. A measure of accuracy is
the difference between the mean value and its true value, i.e., systematic error or bias,
expressed as a percentage of the true value.

Recovery - The fraction of analyte added to a test sample (fortified or spiked sample) prior to analysis that is measured (recovered) by the method. When the same analytical method is used to analyze both the unfortified and fortified samples, %R is calculated as follows:

%R = [(CF - CU)/CA] x 100

where CF = concentration of analyte measured in the fortified sample; CU = concentration of analyte measured in the unfortified sample; CA = concentration of analyte added (measured value, not determined by the method) to the fortified sample.
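The %R calculation above can be sketched in a few lines of Python; the numeric values are illustrative only, not data from any study:

```python
def percent_recovery(c_fortified, c_unfortified, c_added):
    """Percent recovery: %R = [(CF - CU)/CA] x 100."""
    return (c_fortified - c_unfortified) / c_added * 100.0

# Illustrative values: 4.7 units measured in the fortified sample,
# 0.2 in the unfortified sample, 5.0 added by spiking.
print(round(percent_recovery(4.7, 0.2, 5.0), 1))  # 90.0
```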

Calibration Curve - Graphic representation of measuring signal as a function of quantity of analyte.

Linearity - Defines the ability of the method to obtain test results proportional to the
concentration.

Limit of Detection - Lowest content that can be measured with reasonable statistical certainty.

Limit of Quantitation - Equal to or greater than the lowest concentration point on the calibration
curve.

Limit of Determination/Limit of Decision - Lowest analyte content, if actually present, that will
be detected and can be identified.

Precision - Degree of agreement of measurements under specific conditions.

Specificity - Ability of a method to measure only what it is intended to measure.

Cross Reactivity - Response to analogues, metabolites, or other nontarget components that may be present in the matrix(es).


Quantitative Method - The most important characteristic is the measurement of the amount of analyte. The most relevant performance parameters are trueness and sensitivity, i.e., the ability of the method to discriminate between small differences in analyte content (the slope of the calibration curve at the level of interest).

Qualitative Method - The most important characteristic is the reliability of its identification of the
analyte. The most relevant performance parameter is the ability to distinguish between the
analyte and all other substances present in the sample.

Ruggedness Test - Intralaboratory study to examine the behavior of an analytical process when
small changes in the environmental and/or operating conditions are made, akin to those likely to
arise in different test environments. Ruggedness testing allows information to be obtained on
effects of minor changes in a quick and systematic manner. See scheme below.


APPENDIX B: ESTIMATION AND STATEMENT OF UNCERTAINTY
It is essential to distinguish between 'error' and 'uncertainty.'

There are two components to 'error' known as 'random error' and 'systematic error.'

Random errors arise from unpredictable variations in parameters that affect the result of a
measurement. These give rise to variations in the result of repeated measurements. Random
errors cannot be corrected for.

Systematic errors are associated with the effects that known causes could have on the result of a measurement. Identified possible systematic errors may be estimated and corrected for, if significant, but it is unlikely that they can be corrected for exactly. By definition, nothing can be said or done about the effect of unknown causes of systematic error on the result of a measurement.

The result of a measurement after correction for identified possible systematic errors is still only
an estimate of the true value. The uncertainty arising from the random errors and the uncertainty
associated with the corrections for systematic errors give rise to the uncertainty on the result of
measurement. The result of a measurement could unknowingly agree exactly with the true value
(i.e., have zero error) even though it may have a relatively large uncertainty.

Uncertainty is therefore an expression of the confidence that can be placed in a result and is given as a confidence limit. For an observed result M, the uncertainty, x, in the value of M can be defined as follows: it is estimated that the true value of M lies in the range M ± x with a probability, P.

Deriving the uncertainty arising from the random errors is a well understood procedure. For example, in the absence of any systematic effects, the 95% confidence limits are approximately ±2 standard deviations.
The uncertainty associated with systematic errors has to be estimated from a critical evaluation of each step of the method, carrying out subsidiary experiments if necessary. In practice, one or two factors will dominate, and only those steps need to be evaluated in detail. The current ISO draft document recommends that these estimates be made at the 1 sigma level.

An example of how the overall uncertainty may be expressed is:

- between-laboratory reproducibility (1 sigma): ± w%
- extraction uncertainty (estimated by spiking): ± x%
- equipment calibration uncertainty: ± y%
- interpretation of chromatogram: ± z%

The uncertainties due to temperature variations, solvent impurities, and weighing are negligible compared to the above components.

Overall uncertainty (1 sigma) = (w² + x² + y² + z²)^(1/2) %

Because the individual components are combined by squaring and adding, small components have very little effect on the final value and hence can be neglected.

If the overall uncertainty is expressed at other than the 1 sigma level, e.g., at 95% level, then

Overall uncertainty (at 95% confidence level) = k (w² + x² + y² + z²)^(1/2) %

with the value of k depending on P and on the confidence in the estimate of the individual components. (For P of 95%, it is certainly greater than 2 and most probably less than 3.)
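The root-sum-of-squares combination described above can be sketched in Python; the component values below are illustrative assumptions, not results from any study:

```python
import math

def combined_uncertainty(components, k=1.0):
    """Combine 1-sigma percentage uncertainties by root-sum-of-squares,
    optionally expanded by a coverage factor k (e.g., k near 2 for ~95%)."""
    return k * math.sqrt(sum(u * u for u in components))

# Illustrative 1-sigma components (w, x, y, z), in percent:
parts = [3.0, 2.0, 1.0, 0.5]
u1 = combined_uncertainty(parts)          # overall, 1 sigma
u95 = combined_uncertainty(parts, k=2.0)  # expanded, ~95% level
print(round(u1, 2), round(u95, 2))

# A small component barely moves the total, which is why it can be neglected:
print(round(combined_uncertainty([3.0, 0.1]), 3))
```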


APPENDIX C. RUGGEDNESS TESTING PROCEDURE (from Youden, Statistical
Manual of AOAC)

When the initiating laboratory develops and standardizes the procedures of a method, data may
be collected for a set of operations and equipment that is never varied. This process does not
reveal what will happen during a method trial in a number of laboratories, each of which has its
own set of reagents, equipment, and routines. Preparation of standards, time and temperature
variations, instrument calibration and performance, and analyst technique all contribute to minor
variations even when a procedure is followed "exactly." The only way to forecast the
performance of a method under different laboratory conditions is to deliberately introduce
reasonable variations and observe what happens. If the procedure is "rugged" it should be
transparent to minor variations and the results should not be affected.

A ruggedness test is a scheme of attack that will conserve labor but be sensitive enough to pick
up small effects of variation. The scheme does not study one alteration at a time, but introduces
several changes at once in such a manner that the effects of individual changes can be
ascertained.

EXAMPLE:

Let A, B, C, D, E, F, and G denote the nominal values for seven different factors that might
influence the results if their values were varied slightly. Let their alternative values be denoted by
a, b, c, d, e, f, and g.

The conditions for running a determination will be completely specified by the seven letters, each letter being either capital or lower case. There are 2⁷, or 128, different combinations, which is impractical to test. However, it is possible to choose a subset of 8 combinations that adequately balances upper and lower case conditions.

Eight combinations of seven factors used to test the ruggedness of an analytical method
_____________________________________________________________
Combination or Determination No.
_________________________________________________
Factor Value 1 2 3 4 5 6 7 8
______________________________________________________________
A or a A A A A a a a a
B or b B B b b B B b b
C or c C c C c C c C c
D or d D D d d d d D D
E or e E e E e e E e E
F or f F f f F F f f F
G or g G g g G g G G g
______________________________________________________________
Observed
result s t u v w x y z

The table specifies the values of the seven factors to be used while running eight determinations. The results are designated s through z. Now, how can one isolate the separate effects of the factor changes, even though 4 factors are always altered from the initial procedure conditions of all capital letters?

To find whether changing factor A to a had an effect, compare the average (s + t + u + v)/4 with
the average (w + x + y + z)/4. Determinations 1, 2, 3, and 4 were run with factor level A, and
determinations 5, 6, 7, and 8 with factor level a. This gives two groups of 4 determinations, and each group gives the other six factors twice at the capital level and twice at the lower case level.
The effects of these factors, if present, cancel out, leaving only the effect of changing A to a.

Whenever the eight combinations are split into two groups of four on the basis of one of the letters, all other factors cancel out. Thus every one of the seven factors is tested by only eight determinations. The effect of altering G to g, for example, is examined by comparing the average (s + v + x + y)/4 with (t + u + w + z)/4.

Collect the seven average differences for A - a, B - b, etc., and list them in order of size. If one or
two factors are having an effect, their differences will be substantially larger than the group of
differences associated with the other factors. This ranking is a direct guide to the method's
sensitivity to modest alterations. A useful, i.e., rugged, method should not be affected by
changes that will most certainly be encountered between laboratories.

If there is no outstanding difference, the most realistic measure of the analytical error is given by the seven differences obtained from the averages for capitals minus the averages for the corresponding lower case letters. Denote these seven differences by Da, Db, ... Dg. To estimate the standard deviation, square the differences, sum the squares, multiply by 2/7, and take the square root: s = [(2/7)(Da² + Db² + ... + Dg²)]^(1/2).

To check the calculation, compute the standard deviation obtained from the eight results, s
through z: Obtain the mean of the eight results. Square the eight differences from the mean,
sum the squares, divide by 8 - 1, and take the square root.
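The eight-run, seven-factor scheme above can be sketched in Python. The design matrix is transcribed from the table (1 = capital/nominal value, 0 = lower case/alternative value); the eight results are illustrative numbers only:

```python
import math

# Youden's 8-run design: rows are factors A..G, columns are determinations 1..8.
DESIGN = [
    [1, 1, 1, 1, 0, 0, 0, 0],  # A
    [1, 1, 0, 0, 1, 1, 0, 0],  # B
    [1, 0, 1, 0, 1, 0, 1, 0],  # C
    [1, 1, 0, 0, 0, 0, 1, 1],  # D
    [1, 0, 1, 0, 0, 1, 0, 1],  # E
    [1, 0, 0, 1, 1, 0, 0, 1],  # F
    [1, 0, 0, 1, 0, 1, 1, 0],  # G
]

def factor_effects(results):
    """Average of the 4 capital-level runs minus average of the
    4 lower-case-level runs, for each factor."""
    effects = []
    for row in DESIGN:
        hi = [r for flag, r in zip(row, results) if flag]
        lo = [r for flag, r in zip(row, results) if not flag]
        effects.append(sum(hi) / 4 - sum(lo) / 4)
    return effects

def sd_from_effects(effects):
    """Youden's estimate: s = sqrt((2/7) * sum of squared differences)."""
    return math.sqrt(2 / 7 * sum(d * d for d in effects))

def sd_from_results(results):
    """Check calculation: ordinary standard deviation of the 8 results (n - 1)."""
    m = sum(results) / len(results)
    return math.sqrt(sum((r - m) ** 2 for r in results) / (len(results) - 1))

# Illustrative results s..z from eight determinations:
results = [10.1, 10.0, 9.9, 10.2, 10.0, 9.8, 10.1, 9.9]
for name, d in zip("ABCDEFG", factor_effects(results)):
    print(name, round(d, 3))
print("sd from effects:", round(sd_from_effects(factor_effects(results)), 3))
print("sd from results:", round(sd_from_results(results), 3))
```

Because the design is balanced, splitting the runs on any one factor leaves the other six factors appearing twice at each level in both halves, so their contributions cancel, exactly as the text describes.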


AOAC INTERNATIONAL
481 North Frederick Avenue, Suite 500
Gaithersburg, Maryland 20877-2417 USA
Phone: +1-301-924-7077
Fax: +1-301-924-7089


AOAC INTERNATIONAL
TERMS OF REFERENCE

I. NAME:

Peer-Verified Methods Advisory Committee

II. MISSION:

To serve the Association in a scientific and advisory capacity on methods and the
methods process.

III. RESPONSIBILITIES:

To help coordinate policies of the Peer-Verified Methods Program with other AOAC
methods programs.

To recommend policies and procedures necessary and appropriate for the Peer-Verified
Methods Program.

To implement Board of Directors policies.

To promote uniform consideration and acceptance of Peer-Verified Methods.

IV. COMPOSITION AND ORGANIZATION:

The Advisory Committee shall have a maximum of 6 members, plus the chair, all
appointed by the President. Members shall be chosen from among past and serving
General Referees. Members shall be appointed for one 3-year term, staggered so that 2
members are appointed each year. The chair also serves one 3-year term. The current
chair may serve as Past-Chair for one year.

All members assume office immediately following the annual meeting when the
appointing President takes office.

The chair concurrently serves as a liaison to the Official Methods Board and will keep the
Official Methods Board apprised of PVM activities.

V. STAFF LIAISON:

The Executive Director shall assign a member of the staff to serve as staff liaison.

VI. REVIEW SCHEDULE:

These Terms shall be reviewed by the Board of Directors at least every three years.

VII. DATE ESTABLISHED:


November 16, 1993.

VIII. DATES REVISED:


November 18, 1997
October 22, 1998

