
Immunoassay

An immunoassay is a biochemical test that measures the presence or concentration
of a macromolecule or a small molecule in a solution through the use of an antibody
(usually) or an antigen (sometimes). The molecule detected by the immunoassay is
often referred to as an "analyte" and is in many cases a protein, although it may be
another kind of molecule, of a different size and type, as long as antibodies with
the adequate properties for the assay have been developed. Analytes in biological
liquids such as serum or urine are frequently measured using immunoassays for
medical and research purposes.

[Figure: Illustration of the basic components of an immunoassay, which include an
analyte (green), an antibody (black), and a detectable label (yellow).]

Immunoassays come in many different formats and variations.
Immunoassays may be run in multiple steps, with reagents being added and washed
away or separated at different points in the assay. Multistep assays are often called
separation immunoassays or heterogeneous immunoassays. Some immunoassays
can be carried out simply by mixing the reagents and sample and making a physical
measurement. Such assays are called homogeneous immunoassays or, less
frequently, non-separation immunoassays.

Calibrators are often used in immunoassays.

Calibrators are solutions known to contain the analyte in question at a known
concentration. Comparing an assay's response to a real sample against the assay's
response to the calibrators makes it possible to interpret the signal strength in
terms of the presence or concentration of analyte in the sample.
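As a minimal sketch of this comparison, the snippet below interpolates a sample concentration from calibrator responses. All numbers and the function name are hypothetical, and linear interpolation is a simplification; real immunoassays often fit a four-parameter logistic curve instead.

```python
def concentration_from_signal(signal, calibrators):
    """Interpolate an analyte concentration from (concentration, signal)
    calibrator pairs, assuming signal increases with concentration."""
    points = sorted(calibrators, key=lambda p: p[1])  # sort by signal
    for (c0, s0), (c1, s1) in zip(points, points[1:]):
        if s0 <= signal <= s1:
            # Linear interpolation between the two bracketing calibrators
            return c0 + (c1 - c0) * (signal - s0) / (s1 - s0)
    raise ValueError("signal outside the calibrated range")

# Hypothetical calibrators: (concentration in ng/mL, instrument signal)
calibrators = [(0, 2.0), (10, 50.0), (50, 210.0), (100, 400.0)]
print(concentration_from_signal(130.0, calibrators))  # → 30.0 (ng/mL)
```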

Principle

Immunoassays rely on the ability of an antibody to recognize and bind a specific
macromolecule in what might be a complex mixture of macromolecules. In
immunology, the particular macromolecule bound by an antibody is referred to as
an antigen, and the area on an antigen to which the antibody binds is called an
epitope.

In some cases, an immunoassay may use an antigen to detect the presence of
antibodies that recognize that antigen in a solution. In other words, in some
immunoassays the analyte may be an antibody rather than an antigen.

In addition to the binding of an antibody to its antigen, the other key feature of all
immunoassays is a means to produce a measurable signal in response to the binding.
Most, though not all, immunoassays involve chemically linking antibodies or antigens
with some kind of detectable label. A large number of labels exist in modern
immunoassays, and they allow for detection through different means. Many labels
are detectable because they either emit radiation, produce a color change in a
solution, fluoresce under light, or can be induced to emit light.

History

Rosalyn Sussman Yalow and Solomon Berson are credited with the development of
the first immunoassays in the 1950s. Yalow accepted the Nobel Prize for her work in
immunoassays in 1977, becoming the second American woman to have won the
award. Immunoassays became considerably simpler to perform and more popular
when techniques for chemically linking enzymes to antibodies were demonstrated in
the late 1960s. In 1983, Professor Anthony Campbell at Cardiff University replaced
the radioactive iodine used in immunoassays with an acridinium ester that makes its
own light: chemiluminescence. This type of immunoassay is now used in around 100
million clinical tests every year worldwide, enabling clinicians to measure a wide
range of proteins, pathogens and other molecules in blood samples. By 2012, the
commercial immunoassay industry earned US$17,000,000,000.

WESTERN BLOTTING

The western blot (sometimes called the protein immunoblot) is a widely used
analytical technique used in molecular biology, immunogenetics and other
molecular biology disciplines to detect specific proteins in a sample of tissue
homogenate or extract. In brief, the sample undergoes protein denaturation,
followed by gel electrophoresis. A synthetic or animal-derived antibody (known as
the primary antibody) is created that recognises and binds to a specific target
protein. The electrophoresis membrane is washed in a solution containing the
primary antibody, before excess antibody is washed off. A secondary antibody is
added which recognises and binds to the primary antibody. The secondary antibody
is visualised through various methods such as staining, immunofluorescence,
and radioactivity, allowing indirect detection of the specific target protein.
Other related techniques include dot blot analysis, quantitative dot blot,
immunohistochemistry, and immunocytochemistry where antibodies are used to
detect proteins in tissues and cells by immunostaining, and enzyme-linked
immunosorbent assay (ELISA).
The term "western blot" was given to the technique by W. Neal Burnette,[2]
although the method itself originated in the laboratory of Harry Towbin at the
Friedrich Miescher Institute.

Applications

 The western blot is extensively used in biochemistry for the qualitative
detection of single proteins and protein modifications (such as
post-translational modifications).
 Used as a general method for the identification of single proteins.
 Used for verification of protein production after cloning. It is also used in
medical diagnostics, e.g. in the HIV test or BSE test.
 A western blot can also be used as a confirmatory test for hepatitis B infection
and HSV-2 (herpes type 2) infection. In veterinary medicine, a western blot is
sometimes used to confirm FIV-positive status in cats.
 Detection of blood doping agents such as EPO.

Methods for the determination of limit of detection and limit of
quantitation of analytical methods:
Analytical method development and validation procedures are vital in the discovery
and development of drugs and pharmaceuticals. Analytical methods are used to aid
in the process of drug synthesis, screen potential drug candidates, support
formulation studies, monitor the stability of bulk pharmaceuticals and formulated
products, and test final products for release. The quality of analytical data is a key
factor in the success of a drug and formulation development program. During the
post-approval commercial production stage of bulk drugs and pharmaceutical
products, the official or in-house test methods that have resulted from the analytical
method development and validation process cycle become indispensable for reliable
monitoring of the integrity, purity, quality, strength and potency of the
manufactured products.

Limit of detection:
The lowest amount of analyte in a sample that can be detected, but not necessarily
quantitated, under the stated experimental conditions.

 METHODS:
 By visual evaluation
 Based on S/N ratio
 Applicable to procedures which exhibit baseline noise
 Low conc. of analyte is compared with blank
 Based on S.D. of response and slope: LOD = 3.3σ/s
 s – slope of calibration curve
 σ – S.D. of response; can be obtained as:
 Standard deviation of blank response
 Residual standard deviation of the regression line
 Standard deviation of the y-intercept of the regression line
 Sy/x, i.e. standard error of estimate
Some common methods for the estimation of the detection and quantitation limits
are:

i. Visual definition
ii. Calculation from the signal-to-noise ratio (DL and QL correspond to 3 (or 2) and
10 times the noise level, respectively)
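The signal-to-noise approach can be sketched numerically. In the snippet below, the standard deviation of hypothetical blank (baseline) readings stands in for the noise level; all numbers are for illustration only.

```python
import statistics

# Hypothetical baseline (blank) readings in instrument response units
baseline = [0.9, 1.1, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9]

noise = statistics.stdev(baseline)   # estimate of the baseline noise level
dl_signal = 3 * noise                # signal at the detection limit (S/N = 3)
ql_signal = 10 * noise               # signal at the quantitation limit (S/N = 10)

# Dividing these signal levels by the calibration slope converts them
# to concentration units.
print(dl_signal, ql_signal)
```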

LIMIT OF QUANTIFICATION: The lowest amount of analyte in a sample which can be
quantitatively determined with suitable precision and accuracy.

METHODS:

 By visual evaluation
 Based on S/N ratio
 Applicable to procedures which exhibit baseline noise
 Low conc. of analyte is compared with blank
 Based on S.D. of response and slope: LOQ = 10σ/s
 s – slope of calibration curve
 σ – S.D. of response; can be obtained as:
 Standard deviation of blank response
 Residual standard deviation of the regression line
 Standard deviation of the y-intercept of the regression line
 Sy/x, i.e. standard error of estimate
Calculation from the standard deviation of the blank, or calculation from the
calibration line at low concentrations:

DL/QL = F × SD / b

Where
F: factor of 3.3 and 10 for DL and QL, respectively
SD: standard deviation of the blank, standard deviation of the ordinate
intercept, or residual standard deviation of the linear regression
b: slope of the regression line
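A minimal sketch of DL/QL = F × SD / b in Python, taking SD as the residual standard deviation of a least-squares calibration line (one of the options named above). The calibration data and the function name are hypothetical.

```python
import math

def detection_limits(x, y):
    """Return (DL, QL) = (3.3*SD/b, 10*SD/b), where b is the slope of the
    least-squares calibration line through (x, y) and SD is its residual
    standard deviation (standard error of estimate)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx  # ordinate intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    sd = math.sqrt(ss_res / (n - 2))  # residual standard deviation, Sy/x
    return 3.3 * sd / b, 10 * sd / b

# Hypothetical calibration data: concentration vs. instrument response
conc = [1.0, 2.0, 3.0, 4.0, 5.0]
resp = [2.1, 3.9, 6.2, 8.0, 9.8]
dl, ql = detection_limits(conc, resp)
print(dl, ql)
```

Note that DL and QL come out in the same units as the x values, since SD (in response units) is divided by the slope (response per concentration).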

The estimated limits should be verified by analyzing a suitable number of samples
containing the analyte at the corresponding concentrations, and the DL or QL and
the procedure used for their determination should be reported. QL can be regarded
as the maximum true impurity content of the manufactured batch, i.e., as the basic
limit for validation:

QL = AL − (s × t_df,95%) / √n_assay

AL: Acceptance limit of the specification for the impurity.

s: Precision standard deviation at QL, preferably under intermediate or
reproducibility conditions. AL and s must have the same unit (e.g.,
percentage with respect to active, mg, mg/ml, etc.).

n_assay: Number of repeated, independent determinations in routine
analyses, as far as the mean is the reportable result, i.e., is compared with
the acceptance limits. If each individual determination is defined as the
reportable result, n = 1 has to be used.

t_df: Student t-factor for the degrees of freedom during determination of
the precision, usually at 95% level of statistical confidence.
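The relation between QL, the acceptance limit, and the precision at QL can be evaluated numerically. The sketch below assumes the reading QL = AL − t_df,95% × s / √n_assay and uses hypothetical numbers; the t-factor is hard-coded from standard tables to avoid a statistics-library dependency.

```python
import math

AL = 0.5          # acceptance limit of the impurity specification (%)
s = 0.04          # precision standard deviation at QL (%)
t_df_95 = 2.571   # Student t for df = 5 at the 95% confidence level (tables)
n_assay = 2       # independent determinations averaged per reportable result

QL = AL - t_df_95 * s / math.sqrt(n_assay)
print(round(QL, 3))  # → 0.427
```

With these numbers, an impurity content up to about 0.43% can still be reliably distinguished from the 0.5% acceptance limit.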
