▪ For quantitation of a given analyte in a certain sample, an instrumental signal is measured for the analyte.
▪ The instrumental signal normally increases linearly with the concentration (amount) of the analyte in the sample.
▪ The classical way to determine the relation is to measure the instrumental signal for a series of standard
solutions containing exactly known concentrations of the analyte and to plot the signals of these versus
concentration.
▪ The resulting linear relationship is termed a standard curve or a calibration curve and the entire procedure
is termed calibration.
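The calibration procedure described above can be sketched in a few lines. The standard concentrations and signals below are hypothetical, and NumPy's `polyfit` stands in for any least-squares routine:

```python
import numpy as np

# Hypothetical standard solutions: known concentrations (mg/L)
# and the instrumental signals measured for them
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
signal = np.array([0.01, 0.42, 0.83, 1.21, 1.62])

# Least-squares fit of the calibration curve: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Convert an unknown sample's signal back to a concentration
sample_signal = 1.00
sample_conc = (sample_signal - intercept) / slope
```

Reading the unknown concentration off the fitted line, rather than off a single standard, averages out random error in the individual standard measurements.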
Validation
▪ Validation is the process to demonstrate that an analytical procedure is suitable for
its intended purpose.
▪ New analytical methods must be validated before they can be applied.
▪ Analytical procedures subject to validation include:
• Identification tests
• Determination of impurities
• Assay
▪ The specificity of an analytical method is defined as ‘the ability to assess unequivocally the
analyte in the presence of compounds that may be expected to be present (impurities,
degradation products, matrix, etc.)’.
▪ Analytical methods with high selectivity provide measured instrumental signals solely from the
target analyte.
▪ If an analytical method lacks specificity, one or several matrix components in the sample may
contribute to the instrumental signal and the measurements may not provide an exact result.
Specificity
▪ Specificity is tested as part of the validation procedure for identification tests, determination
of impurities, and assay.
▪ For identification methods (qualitative analysis), the ability to select between compounds of
closely related structures that are likely to be present in real samples should be
demonstrated.
CRS: a Chemical Reference Substance is a highly purified and well-defined quality of the
analyte.
Precision
• Repeatability
• Intermediate precision
• Reproducibility
▪ Repeatability is the precision obtained under the same operating conditions over a short interval of
time.
▪ Normally, the same operator carries out the measurements repeatedly with the same equipment,
within one day, in the same laboratory.
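Repeatability is commonly reported as the relative standard deviation (RSD) of replicate results. A minimal sketch, using hypothetical replicate assay values:

```python
import statistics

# Hypothetical replicate assay results (same operator, equipment,
# day, and laboratory), expressed in % of nominal content
replicates = [99.1, 100.4, 99.8, 100.1, 99.5, 100.2]

mean = statistics.mean(replicates)
sd = statistics.stdev(replicates)      # sample standard deviation (n - 1)
rsd_percent = 100 * sd / mean          # relative standard deviation in %
```

A small RSD over such within-day replicates indicates good repeatability.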
▪ The detection limit (DL) of an analytical method is defined as ‘the lowest amount of analyte in a
sample that can be detected, but not necessarily quantitated as an exact value’ under given
experimental conditions.
▪ The terms limit of detection (LOD) and lower limit of detection (LLOD) are used as alternatives to
detection limit.
▪ For instrumental methods, calculation of the detection limit can be based on the signal-to-noise
ratio (S/N).
Detection Limit
DL = 3.3 × 𝜎/S
▪ 𝜎 is the standard deviation of the response (representing the noise level)
▪ S is the slope of the calibration curve, which is used to convert the signal
to a concentration
Quantitation Limit
▪ The quantitation limit (QL), also termed the limit of quantitation (LOQ) or the lower limit of
quantitation (LLOQ) of an analytical method, is defined as ‘the lowest amount of analyte in
a sample, which can be quantitatively determined with suitable precision and accuracy’.
▪ This parameter should be estimated for assays for low levels of analytes and particularly
for the determination of impurities and/or degradation products. Quantitation should not be
performed below this limit. The quantitation limit is defined as the concentration of analyte
providing a signal-to-noise ratio of 10:
▪ QL = 10 × 𝜎/S
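Both limits follow directly from the two formulas above. The σ and slope values below are hypothetical (σ would typically come from the residual standard deviation of a low-level calibration curve, S from the slope of that same curve):

```python
# Hypothetical inputs for the DL and QL formulas
sigma = 0.012   # standard deviation of the response (noise level)
slope = 0.20    # slope of the calibration curve (signal per mg/L)

dl = 3.3 * sigma / slope   # detection limit:   DL = 3.3 * sigma / S
ql = 10 * sigma / slope    # quantitation limit: QL = 10 * sigma / S
```

Note that QL is always about three times DL, reflecting the stricter precision and accuracy required for quantitation compared with mere detection.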
Linearity and Range
▪ The linearity of an analytical method is defined as ‘its ability (within a given range)
to obtain test results, which are directly proportional to the concentration
(amount) of analyte in the sample’
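A common first check of linearity is to fit a calibration line over the intended range and inspect the coefficient of determination (R²) and the residuals. A minimal sketch with hypothetical calibration data:

```python
import numpy as np

# Hypothetical calibration data spanning the intended working range
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.21, 0.40, 0.81, 1.19, 1.62, 2.01])

# Least-squares line and predicted signals
slope, intercept = np.polyfit(conc, signal, 1)
predicted = slope * conc + intercept

# Coefficient of determination (R^2)
ss_res = np.sum((signal - predicted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

A high R² alone does not prove linearity; the residuals should also be examined for systematic curvature across the range.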
▪ Thus, the performance of the method is tested with small changes in the
parameters listed above, and the analytical results should be unaffected.