
Patents & Potions

Introduction We live today in a world of drugs. Drugs for pain, drugs for disease, drugs for allergies, drugs
for pleasure, and drugs for mental health. Drugs that have been rationally designed; drugs that have been synthesized
in the factory or purified from nature. Drugs fermented and drugs engineered. Drugs that have been clinically tested.
Drugs that, for the most part, actually do what they are supposed to. Effectively. Safely.
By no means was it always so.
Before the end of the 19th century, medicines were concocted with a mixture of empiricism and prayer. Trial and
error, inherited lore, or mystical theories were the basis of the world’s pharmacopoeias. The technology of making
drugs was crude at best: Tinctures, poultices, soups, and teas were made with water- or alcohol-based extracts of
freshly ground or dried herbs or animal products such as bone, fat, or even pearls, and sometimes from minerals best
left in the ground—mercury among the favored. The line between a poison and a medicine was hazy at best: In the 16th century, Paracelsus declared that the only difference between a medicine and a poison was in the dose. All medicines were toxic. It was cure or kill.
“Rational treatments” and “rational drug design” of the era were based on either the doctrine of humors (a
pseudoastrological form of alchemical medicine oriented to the fluids of the body: blood, phlegm and black and
yellow bile) or the doctrine of signatures. (If a plant looks like a particular body part, it must be designed by nature
to influence that part. Lungwort, for example, was considered good for lung complaints by theorists of the time
because of its lobe-shaped leaves.) Neither theory, as might be expected, guaranteed much chance of a cure.
Doctors and medicines were popular, despite their failures. As pointed out by noted medical historian Charles
E. Rosenberg, a good bedside manner and a dose of something soothing (or even nasty) reassured the patient that
something was being done, that the disease was not being ignored.
Blood and mercury
By the first part of the 19th century, the roots of modern pharmacy had taken hold with a wave of heroic medicine.
Diseases were identified by symptom, and attacking the symptom as vigorously as possible was the high road to
health.
Bloodletting dominated the surgeon’s art, and dosing patients with powerful purgatives and cathartics became the
order of the day in an attempt to match the power of the disease with the power of the drug. Bleed them till they
faint. (It is difficult to sustain a raging fever or pounding pulse when there is too little blood in the body, so the
symptoms, if not what we would call the disease, seemed to vanish.) Dose them with calomel till they drool and
vomit. (Animals were thought to naturally expel toxins this way.) Cleanse both stomach and bowels violently to
remove the poisons there.
Certainly these methods were neither pleasant nor very effective at curing patients already weakened by disease.
George Washington died in misery from bloodletting; Abraham Lincoln suffered chronic mercury poisoning and
crippling constipation from his constant doses of “blue mass.” The “cure” was, all too often, worse than the disease.
In the second half of the 19th century, things changed remarkably as the industrial revolution brought technological
development to manufacturing and agriculture and inspired the development of medical technology.
Spurred in part by a reaction against doctors and their toxic nostrums, patent medicines and in particular
homeopathy (which used extreme dilutions of otherwise toxic compounds) became popular and provided an
“antidote” to the heroic treatments of the past. Not helpful, but at least harmless for the most part, these new drugs
became the foundation of a commodity-based medicine industry that galvanized pharmacist and consumer alike.
Technology entered in the form of pill and powder and potion making.
Almost by accident, a few authentic drugs based on the wisdom and herbal lore of the past were developed: quinine,
digitalis, and cocaine. Ultimately, these successes launched the truly modern era. The century ended with the
development of the first of two synthesized drugs that represent the triumph of chemistry over folklore and
technology over cookery. The development of antipyrine in 1883 and aspirin in 1897 set the stage for the next 10 decades of what we can now look back on as the Pharmaceutical Century. With new knowledge of microbial pathogens and the burgeoning wisdom of vaccine technology, the first tentative steps were taken to place medicine on a truly scientific foundation.
From these scattered seeds, drug technology experienced remarkable if chaotic growth in the first two decades of the
20th century, a period that can be likened to a weedy flowering of quackery and patent medicines twining about a
hardening strand of authentic science and institutions to protect and nourish it.

Staging the Pharmaceutical Century
In the latter half of the 19th century, numerous beneficent botanicals took center stage in the world’s
pharmacopoeias. Cocaine was first extracted from coca leaves in 1860; salicylic acid—the forerunner of aspirin—
was extracted from willow bark in 1874 for use as a painkiller. Quinine and other alkaloids had long been extracted
from China bark; but an antifebrile subcomponent, quinoline, was not synthesized in the lab until 1883 by Ludwig
Knorr. The first truly synthetic pain reliever, antipyrine, was produced from quinoline derivatives. Digitalis from
foxglove and strophanthin from an African dogbane were both botanicals purified for use against heart disease. The
opium poppy provided a wealth of pain relievers: opium, morphine, codeine, and heroin.
But it was not until the birth of medical microbiology that the true breakthroughs occurred, and science—rather than
empiricism—took center stage in the development of pharmaceuticals.
Murderous microbes
The hallmark of 19th-century medicine has to be the microbial theory of disease. The idea that infectious diseases
were caused by microscopic living agents provided an understanding of the causes and the potential cures for ills
from anthrax to whooping cough.
Technology made the new framework possible. The brilliance of European lens makers and microscopists, coupled
with the tinkering of laboratory scientists who developed the technologies of sterilization and the media and
methods for growing and staining microbes, provided the foundation of the new medical science that would explode
in the 20th century. These technologies offered proof and intelligence concerning the foe against which
pharmaceuticals, seen thereafter as weapons of war, could be tested and ultimately designed.
In 1861, the same year that the American Civil War began, Ignaz Semmelweis published his research on the transmissible nature of puerperal (childbed) fever. His theories of antisepsis were at first vilified by doctors who
could not believe their unwashed hands could transfer disease from corpses or dying patients to healthy women. But
eventually, with the work of Robert Koch, Joseph Lister, and Louis Pasteur adding proof of the existence and
disease-causing abilities of microorganisms, a worldwide search for the microbial villains of a host of historically
deadly diseases began.
In 1879, as part of the new “technology,” Bacterium coli was discovered (it was renamed Escherichia after its
discoverer, Theodor Escherich, in 1919). It quickly became the quintessential example of an easily grown, “safe”
bacterium for laboratory practice. New growth media, new sterile techniques, and new means of isolating and staining
bacteria rapidly developed. The ability to grow “pathogens” in culture proved remarkably useful. Working with pure
cultures of the diphtheria bacillus in Pasteur’s laboratory in 1888, Emile Roux and Alexandre Yersin first isolated
the deadly toxin that causes most of diphtheria’s lethal effects.
One by one over the next several decades, various diseases revealed their microbial culprits to the so-called
microbe-hunters.
Initially, most American physicians were loath to buy into germ theory, seeing it as a European phenomenon
incompatible with the “truth” of spontaneous generation and as a threat to the general practitioner from the growing
cadre of scientifically trained laboratory microbiologists and specialist physicians.
“Anti-contagionists” such as the flamboyant Colonel George E. Waring Jr., pamphleteer, consulting engineer, and
phenomenally effective warrior in the sanitation movement, at first held sway. Filth was considered the source of
disease. A host of sewage projects, street-cleaning regimens, and clean water systems swept urban areas across the
United States, with obvious benefits. Ultimately, the germ theory of infectious diseases had to be accepted,
especially as the theoretical foundation behind the success of the sanitation movement. And with the production of
vaccines and antitoxins, older medical frameworks fell by the wayside, though rural American physicians were still
recommending bleeding and purgatives as cures well into the first few decades of the 20th century.

Victorious vaccines
The most significant outgrowth of the new germ theory, and the one that created the greatest demand for new technologies for implementation, was the identification and production of the new "immunologicals"—drugs that are, in essence, partially purified components or fractions of animal blood. In 1885, Pasteur developed attenuated rabies vaccine—a safe source of "active" immunity (immunity developed against a form or component of the disease-causing microorganism by the body's own immune system). Vaccines would be developed against a variety of microorganisms in rapid succession over the next several decades.
But active immunity was perhaps not the most impressive result of the immunologicals. Antitoxins (antibodies isolated against disease organisms and their toxins from treated animals), when injected into infected individuals, provided salvation from otherwise fatal diseases. This technology began in 1890 when Emil von Behring and Shibasaburo Kitasato isolated the first antibodies against tetanus and, soon after, diphtheria. In 1892, Hoechst Pharma developed a tuberculin antitoxin. These vaccines and antitoxins would form the basis of a new pharmaceutical industry.
Perhaps as important as the development of these new immunologics was the impetus toward standardization and testing that a new generation of scientist-practitioners such as Koch and Pasteur inspired. These scientists' credibility and success rested upon stringent control—and ultimately, government regulation—of the new medicines. Several major institutions sprang up in Europe and the United States to manufacture and/or inspect in bulk the high volume of vaccines and antitoxins demanded by a desperate public suddenly promised new hope against lethal diseases. These early controls helped provide a bulwark against contamination and abuse. Such control would not be available to the new synthetics soon to dominate the scene with the dawn of "scientific" chemotherapy.

SOCIETY: Muckraking and medicine
Media "muckrakers" exposed the seedy underbelly of robber baron capitalism and helped transform American society and government with a wave of reform. They were particularly effective in the areas of medicine and food, especially with the 1905 Collier's series, "The Great American Fraud," and Upton Sinclair's 1906 novel of the Chicago stockyards, The Jungle. The muckrakers revealed patent medicines laced with addictive drugs, toxic additives, and—horror-of-horrors to teetotaling conservatives—alcohol; and sausages laced with offal, sawdust, and excrement. These works galvanized the American public (and a disgusted President Teddy Roosevelt) to demand government regulation rather than the prevailing laissez-faire mentality.
Medicinal chemistry
Parallel (and eventually linked) to developments in biology, the chemist’s art precipitously entered the medicinal
arena in 1856 when Englishman William Perkin, in an abortive attempt to synthesize quinine, stumbled upon mauve,
the first synthesized coal tar dye. This discovery led to the development of many synthetic dyes but also to the
realization that some of these dyes had therapeutic effects. Synthetic dyes, and especially their medicinal “side
effects,” helped put Germany and Switzerland in the forefront of both organic chemistry and synthesized drugs. The
dye–drug connection was a two-way street: The antifever drug Antifebrin, for example, was derived from aniline
dye in 1886.
The chemical technology of organic synthesis and analysis seemed to offer for the first time the potential to
scientifically ground the healer’s art in a way far different from the “cookery” of ancient practitioners. In 1887,
phenacetin, a pain reliever, was developed by Bayer specifically from synthetic drug discovery research. The drug
eventually fell into disfavor because of its side effect of kidney damage. Ten years later, also at Bayer, Felix
Hoffmann synthesized acetylsalicylic acid (aspirin). First marketed in 1899, aspirin has remained the most widely
used of all the synthetics.
Many other new technologies also enhanced the possibilities for drug development and delivery. The advent of the
clinical thermometer in 1870 spearheaded standardized testing and the development of the antifever drugs. In 1872,
Wyeth invented the rotary tablet press, which was critical to the mass marketing of drugs. By 1883, a factory was
producing the first commercial drug (antipyrine) in a ready-dosed, prepackaged form. With the discovery of X-
rays in 1895, the first step was taken toward X-ray crystallography, which would become the ultimate arbiter of
complex molecular structure, including proteins and DNA.
The Pharmaceutical Century
Not only did the early 1900s bring the triumph of aspirin as an inexpensive and universal pain reliever—the first of
its kind—but the science of medicine exploded with a new understanding of the human body and its systems.
Although not immediately translated into drugs, these discoveries would rapidly lead to a host of new
pharmaceuticals and a new appreciation of nutrition as a biochemical process and hence a potential source of drugs
and drug intervention.
Of equal if not more importance to the adoption and implementation of the new technologies was the rise of public
indignation—a demand for safety in food and medicines that began in Europe and rapidly spread to the United
States.
Tainted food and public furor
The developing understanding of germ theory and the increasing availability of immunologics and chemical
nostrums forced recognition that sanitation and standardization were necessary for public health and safety. First in
Europe, and then in the United States, the new technologies led to the growth of new public and semipublic
institutions dedicated to producing and/or assaying the effectiveness and safety of pharmaceuticals and foods in
addition to those dedicated to sanitation and public disease control. Unfortunately, the prevalence of disease among
the poor created a new line of prejudice against these presumed “unsanitary” subclasses.
In the United States, where the popular sanitation movement could now be grounded in germ theory, this fear of contagion, which took hold among the developing middle classes, was directed especially against immigrants—called
“human garbage” by pundits such as American social critic Herbert George in 1883. This led to the Immigration Act
of 1891, which mandated physical inspection of immigrants for diseases of mind and body—any number of which
could be considered cause for quarantine or exclusion. Also in 1891, the Hygienic Laboratory (founded in 1887 and
the forerunner of the National Institutes of Health) moved from Staten Island (New York City) to Washington,
DC—a sign of its growing importance.
That same year, the first International Sanitary Convention was established. Although restricted to efforts to control
and prevent cholera, it would provide a model of things to come in the public health arena. In 1902, an International
Sanitary Bureau (later renamed the Pan American Sanitary Bureau and then the Pan American Sanitary
Organization) was established in Washington, DC, and became the forerunner of today’s Pan American Health
Organization, which also serves as the World Health Organization’s Regional Office for the Americas.
Fears of contagion on the one hand and poisoning on the other, resulting from improperly prepared or stored
medicines, led to the 1902 Biologics Control Act, which regulated the interstate sale of viruses, serums,
antitoxins, and similar products.
One of the significant outgrowths of the new “progressive” approach to solving public health problems with
technological expertise and government intervention was the popularity and influence of a new class of journalists
known as the Muckrakers. Under their impetus, and as the result of numerous health scandals, the 1906 U.S. Pure Food and Drugs Act, after years of planning by U.S. Department of Agriculture (USDA) researchers such as Harvey W. Wiley, was passed easily. The act established the USDA's Bureau of Chemistry as the regulatory agency.
Unfortunately, the act gave the federal government only limited powers of inspection and control over the industry.
Many patent medicines survived this first round of regulation.
The American Medical Association (AMA) created a Council on Pharmacy and Chemistry to examine the issue and
then established a chemistry laboratory to lead the attack on the trade in patent medicines that the Pure Food and
Drugs Act had failed to curb. The AMA also published New and Nonofficial Remedies annually in an effort to control
drugs by highlighting serious issues of safety and inefficacy. This publication prompted rapid changes in industry
standards.

International health procedures continued to be formalized also—L'Office International d'Hygiène Publique (OIHP) was established in Paris in 1907, with a permanent secretariat and a permanent committee of senior public health officials. Military and geopolitical concerns would also dominate world health issues. In 1906, the Yellow Fever Commission was established in Panama to help with U.S. efforts to build the canal; in 1909, the U.S. Army began mass vaccination against typhoid.
Nongovernmental organizations also rallied to the cause of medical progress and reform. In 1904, for example, the U.S. National Tuberculosis Society was founded (based on earlier European models) to promote research and social change. It was one of many groups that throughout the 20th century were responsible for much of the demand for new medical technologies to treat individual diseases.
Grassroots movements such as these flourished. Public support was often behind the causes. In 1907, Red Cross volunteer Emily Bissell designed the first U.S. Christmas Seals (the idea started in Denmark). The successful campaign provided income to the Tuberculosis Society and a reminder to the general public of the importance of medical care. Increased public awareness of diseases and new technologies such as vaccination, antitoxins, and later, "magic bullets," enhanced a general public hunger for new cures.
The movement into medicine of government and semipublic organizations such as the AMA and the Tuberculosis Society throughout the latter half of the 19th and beginning of the 20th centuries set the stage for a new kind of medicine that was regulated, tested, and "public." Combined with developments in technology and analysis that made regulation possible, public scrutiny slowly forced medicine to come out from behind the veil of secret nostrums and alchemical mysteries.

TECHNOLOGY: Enter the gene
The Pharmaceutical Century ended on a wave of genetic breakthroughs—from the Human Genome Project to isolated genes for cancer. And so did it begin, relatively ineffectively at first, at least with regard to the development of medicines. In 1908, A. E. Garrod described "inborn errors of metabolism" based on his analysis of family medical histories—a major breakthrough in human genetics and the first recognized role of biochemistry in heredity. In 1909, Wilhelm Johannsen coined the terms "gene," "genotype," and "phenotype." In 1915, bacteriophages were discovered. First thought to be another "magic bullet," their failure as routine therapeutics became secondary to their use in the study of bacterial genetics. By 1917, Richard Goldschmidt suggested that genes are enzymes and, by doing so, fully embraced the biochemical vision of life.
The crowning of chemistry
It was not easy for organized science, especially chemistry, to take hold in the pharmaceutical realm. Breakthroughs
in organic synthesis and analysis had to be matched with developments in biochemistry, enzymology, and general
biology. Finally, new medicines could be tested for efficacy in a controlled fashion using new technologies—
laboratory animals, bacterial cultures, chemical analysis, clinical thermometers, and clinical trials, to name a few.
Old medicines could be debunked using the same methods—with public and nongovernmental organizations such as
the AMA providing impetus. At long last, the scientific community began to break through the fog of invalid
information and medical chicanery to attempt to create a new pharmacy of pharmaceuticals based on chemistry, not
caprice.
The flowering of biochemistry in the early part of the new century was key, especially as it related to human
nutrition, anatomy, and disease. Some critical breakthroughs in metabolic medicine had been made in the 1890s, but
they were exceptions rather than regular occurrences. In 1891, myxedema was treated with sheep thyroid injections.
This was the first proof that animal gland solutions could benefit humans. In 1896, Addison’s disease was treated
with chopped up adrenal glands from a pig. These test treatments provided the starting point for all hormone
research. Also in 1891, a pair of agricultural scientists developed the Atwater–Rosa calorimeter for large animals.
Ultimately, it provided critical baselines for human and animal nutrition studies.
But it wasn’t until the turn of the century that metabolic and nutritional studies truly took off. In 1900, Karl
Landsteiner discovered the first human blood groups: O, A, and B. That same year, Frederick Hopkins discovered
tryptophan and demonstrated in rat experiments that it was an “essential” amino acid—the first discovered. In 1901,
fats were artificially hydrogenated for storage for the first time (providing a future century of heart disease risk).
Eugene L. Opie discovered the relationship of islets of Langerhans to diabetes mellitus, thus providing the necessary
prelude to the discovery of insulin. Japanese chemist Jokichi Takamine isolated pure epinephrine (adrenaline). And
E. Wildiers discovered “a new substance indispensable for the development of yeast.” Growth substances such as
this eventually became known as vitamines and later, vitamins.

In 1902, proteins were first shown to be polypeptides, and the AB blood group was discovered. In 1904, the first
organic coenzyme—cozymase—was discovered. In 1905, allergies were first described as a reaction to foreign
proteins by Clemens von Pirquet, and the word “hormone” was coined. In 1906, Mikhail Tswett developed the all-
important technique of column chromatography. In 1907, Ross Harrison developed the first animal cell culture using
frog embryo tissues. In 1908, the first biological autoradiograph was made—of a frog. In 1909, Harvey Cushing
demonstrated the link of pituitary hormone to giantism.
Almost immediately after Svante August Arrhenius and Søren Sørensen demonstrated in 1909 that pH could be measured, Sørensen pointed out that pH can affect enzymes. This discovery was a critical step in the development of a biochemical model of metabolism and kinetics. So many breakthroughs of medical significance occurred in organic chemistry and biochemistry in the first decade of the Pharmaceutical Century that no list can do more than scratch the surface.
Making magic bullets
It was not the nascent field of genetics, but rather a maturing chemistry that would launch the most significant early triumph of the Pharmaceutical Century. Paul Ehrlich first came up with the magic bullet concept in 1906. (Significant to the first magic bullet's ultimate use, in this same year, August von Wassermann developed his syphilis test only a year after the bacterial cause was determined.) However, it wasn't until 1910 that Ehrlich's arsenic compound 606, marketed by Hoechst as Salvarsan, became the first effective treatment for syphilis. It was the birth of chemotherapy.
With the cure identified and the public increasingly aware of the subject, it was not surprising that the "progressive" U.S. government intervened in the public health issue of venereal disease. The Chamberlain–Kahn Act of 1918 provided the first federal funding specifically designated for controlling venereal disease. It should also not be a surprise that this attack on venereal disease came in the midst of a major war. Similar campaigns would be remounted in the 1940s.

BIOGRAPHY: Paul Ehrlich
Ehrlich's breakthrough research originally began from his study of coal tar dyes. Their properties of differentially staining biological material led him to question the relationship of chemical structures to patterns of distribution and affinities for living cells. He expanded this theoretical framework (his side-chain theory of cell function) to include immunology and chemotherapy. Ehrlich believed strongly in the necessity of in vivo testing. Using the arsenical compound atoxyl, which British researchers had discovered was effective against trypanosomes (but which also damaged the optic nerve of the patient), Ehrlich modified the chemical side chains in an attempt to preserve its therapeutic effect while eliminating its toxicity. This "rational" approach led to compound 606 in 1909. Tradenamed Salvarsan, it was the first "magic bullet."
The “fall” of chemotherapy
Salvarsan provided both the promise and the peril of chemotherapy. The arsenicals, unlike the immunologicals, were
not rigidly controlled and were far more subject to misprescription and misuse. (They had to be administered in an
era when injection meant opening a vein and percolating the solution into the bloodstream through glass or rubber
tubes.) The problems were almost insurmountable, especially for rural practitioners. The toxicity of these
therapeutics and the dangers associated with using them became their downfall. Most clinicians of the time thought
the future was in immunotherapy rather than chemotherapy, and it wasn’t until the antibiotic revolution of the 1940s
that the balance would shift.
Ultimately, despite the manifold breakthroughs in biochemistry and medicine, the end of the ’Teens was not a
particularly good time for medicine. The influenza pandemic of 1918–1920 clearly demonstrated the inability of
medical science to stand up against disease. More than 20 million people worldwide were killed by a flu that
attacked not the old and frail but the young and strong. This was a disease that no magic bullet could cure and no
government could stamp out. Both war and pestilence set the stage for the Roaring Twenties, when many people
were inclined to “eat, drink, and make merry” as if to celebrate the optimism of a world ostensibly at peace.

Still, a burgeoning science of medicine promised a world of wonders yet to come. Technological optimism and industrial expansion provided an antidote to the malaise caused by failed promises revealed in the first two decades of the new century.
But even these promises were suspect as the Progressive Era drew to a close. Monopoly capitalism and renewed conservatism battled against government intervention in health care as much as in the economy, a refrain that would become familiar. The continued explosive growth of cities obviated many of the earlier benefits in sanitation and hygiene with a host of new "imported" diseases. The constitutive bad health and nutrition of both the urban and rural poor around the world grew worse with the economic fallout of the war.
Many people were convinced that things would only get worse before they got better.
The Pharmaceutical Century had barely begun.

Suggested reading
• Two Centuries of American Medicine: 1776–1976. Bordley, J., III; Harvey, A. M. (W. B. Saunders Co.: Philadelphia, 1976)
• Healing Drugs: The History of Pharmacology. Facklam, H.; Facklam, M. (Facts on File, Inc.: New York, 1992)
• Other Healers: Unorthodox Medicine in America. Gevitz, N., Ed. (Johns Hopkins University Press: Baltimore, 1988)
• The Greatest Benefit to Mankind: A Medical History of Humanity. Porter, R. (W. W. Norton & Co.: New York, 1997)
• In Search of a Cure: A History of Pharmaceutical Discovery. Weatherall, M. (Oxford University Press: New York, 1990)

Salving with Science
Introduction Throughout the 1920s and 1930s, new technologies and new science intersected as physiology
led to the discovery of vitamins and to increasing knowledge of hormones and body chemistry. New drugs and new
vaccines flowed from developments started in the previous decades. Sulfa drugs became the first of the antibacterial
wonder drugs promising broad-spectrum cures. Penicillin was discovered, but its development had to await new
technology (and World War II, which hastened it). New instruments such as the ultracentrifuge and refined
techniques of X-ray crystallography paralleled the development of virology as a science. Isoelectric precipitation
and electrophoresis first became important for drug purification and analysis. Overall, the medical sciences were on
a firmer footing than ever before. This was the period in which the Food and Drug Administration (FDA) gained
independence as a regulatory agency. And researchers reveled in the expanding knowledge base and their new
instruments. They created a fusion of medicine and machines that would ultimately be known as “molecular
biology.”
The assault on disease
Just as pharmaceutical chemists sought “magic bullets” for myriad diseases in the first two decades of the century,
chemists in the 1920s and 1930s expanded the search for solutions to the bacterial and viral infections that continued
to plague humankind. Yet, the first great pharmaceutical discovery of the 1920s addressed not an infectious disease
but a physiological disorder.
Diabetes mellitus is caused by a malfunction of the pancreas, resulting in the failure of that gland to produce insulin,
the hormone that regulates blood sugar. For most of human history, this condition meant certain death. Since the late
19th century, when the connection between diabetes and the pancreas was first determined, scientists had attempted
to isolate the essential hormone and inject it into the body to control the disorder. Using dogs as experimental
subjects, numerous researchers had tried and failed, but in 1921, Canadian physician Frederick Banting made the
necessary breakthrough.
Banting surmised that if he tied off the duct to the pancreas of a living dog and waited until the gland atrophied
before removing it, there would be no digestive juices left to dissolve the hormone, which was first called iletin.
Beginning in the late spring of 1921, Banting worked on his project at the University of Toronto with his assistant,
medical student Charles Best. After many failures, one of the dogs whose pancreas had been tied off showed signs
of diabetes. Banting and Best removed the pancreas, ground it up, and dissolved it in a salt solution to create the
long-sought extract. They injected the extract into the diabetic dog, and within a few hours the canine’s blood sugar
returned to normal. The scientists had created the first effective treatment for diabetes.
At the insistence of John Macleod, a physiologist at the University of Toronto who provided the facilities for
Banting’s work, biochemists James Collip and E. C. Noble joined the research team to help purify and standardize
the hormone, which was renamed insulin. Collip purified the extract for use in human subjects, and enough
successful tests were performed on diabetic patients to determine that the disorder could be reversed.
The Connaught Laboratories in Canada and the Eli Lilly Co. in the United States were awarded the rights to
manufacture the drug. Within a few years, enough insulin was being produced to meet the needs of diabetics around
the world. Although Banting and Best had discovered the solution to a problem that had troubled humans for
millennia, it was Lilly’s technical developments (such as the use of an isoelectric precipitation step) that enabled
large-scale collection of raw material, extraction and purification of insulin, and supplying of the drug in a state
suitable for clinical use. Only after proper bulk production and/or synthesis techniques were established did insulin
and many other hormones discovered in the 1920s and 1930s (such as estrogen, the corticosteroids, and testosterone)
become useful and readily available to the public. This would continue to be the case with most pharmaceutical
breakthroughs throughout the century.
Although the isolation and development of insulin was a critically important pharmaceutical event, diabetes was by
no means the greatest killer of the 19th and early 20th centuries. That sordid honor belonged to infectious diseases,
especially pneumonia, and scientists in the 1920s and 1930s turned with increasing success to the treatment of some
of the most tenacious pestilences. Paul Ehrlich introduced the world to chemotherapy in the early years of the 20th
century, and his successful assault on syphilis inspired other chemists to seek “miracle drugs” and “magic bullets.”
But what was needed was a drug that could cure general bacterial infections such as pneumonia and septicemia.
Bacteriologists began in the 1920s to experiment with dyes that were used to stain bacteria to make them more
visible under the microscope, and a breakthrough was achieved in the mid-1930s.

Sulfa drugs and more
The anti-infective breakthrough occurred at Germany’s I. G. Farben, which had hired Gerhard Domagk in the late
1920s to direct its experimental pathology laboratory in a drive to become a world leader in the production of new
drugs. Domagk performed a series of experiments on mice infected with streptococcus bacteria. He discovered that
some previously successful compounds killed the bacteria in mice but were too toxic to give to humans. In 1935,
after years of experimentation, Domagk injected an orange-red azo dye called Prontosil into a group of infected
mice. The dye, which was primarily used to color animal fibers, killed the bacteria, and, most importantly, all the
mice survived. The first successful use of Prontosil on humans occurred weeks later, when Domagk gave the drug to
a desperate doctor treating an infant dying of bacterial infection. The baby lived, but this did not completely
convince the scientific community of the drug’s efficacy. Only when 26 women similarly afflicted with life-
threatening infections were cured during clinical trials in London in late 1935 did Prontosil become widely known
and celebrated for its curative powers.
The active part of Prontosil was a substance called sulfanilamide, so termed by Daniele Bovet of the Pasteur Institute, who determined that Prontosil broke down in the body and that only a fragment of the drug's molecule worked against an infection. After the discovery of the active ingredient, more than 5000 different "sulfa" drugs were made and tested, although only about 15 ultimately proved to be of value. Domagk received the 1939 Nobel Prize in Physiology or Medicine.
Sulfanilamide was brought to the United States by Perrin H. Long and Eleanor A. Bliss, who used it in clinical applications at Johns Hopkins University in 1936. It was later discovered that the sulfa drugs, or sulfonamides, did not actually kill bacteria outright, like older antiseptics, but halted the growth and multiplication of the bacteria, while the body's natural defenses did most of the work.
Certainly the most famous antibacterial discovered in the 1920s and 1930s was penicillin—which was found through almost sheer serendipity. In the years after World War I, Alexander Fleming was seeking better antiseptics, and in 1921 he found a substance in mucus that killed bacteria. After further experimentation, he learned that the substance was a protein, which he called lysozyme. Although Fleming never found a way to purify lysozymes or use them to treat infectious diseases, the discovery had implications for his later encounter with penicillin because it demonstrated the existence of substances that are lethal to certain microbes and harmless to human tissue.
Fleming's major discovery came almost seven years later. While cleaning his laboratory one afternoon, he noticed large yellow colonies of mold overgrowing a culture of staphylococcus bacteria on an agar plate. Fleming realized that something was killing the bacteria, and he proceeded to experiment with juice extracted from the mold by spreading it on agar plates covered with more bacteria. He found that even when the juice was highly diluted, it destroyed the bacteria. Calling the new antiseptic penicillin, after the Latin term for brush, Fleming had two assistants purify the mold juice, but he performed no tests on infected animal subjects. He published a paper in 1929 discussing the potential use of penicillin in surgical dressings but went no further. It wasn't until the 1940s that penicillin was taken up by the medical community.

BIOGRAPHY: An instrumental figure
Warren Weaver, a University of Wisconsin professor and mathematical physicist, was the principal architect of the Rockefeller Foundation's programs in the natural sciences beginning in 1932. Weaver saw his task as being "to encourage the application of the whole range of scientific tools and techniques, and especially those that had been so superbly developed in the physical sciences, to the problems of living matter." Indeed, it was Weaver who coined the term "molecular biology." Although many of the instruments Weaver would use had already been developed for use in the physical sciences, under Weaver's financial and promotional encouragement, they were applied in novel ways in pharmaceutical chemistry and molecular biology. According to Robert E. Kohler, "For about a decade, there was hardly any major new instrument that Weaver did not have a hand in developing. In addition to the ultracentrifuge and X-ray and electron diffraction, the list includes electrophoresis, spectroscopy, electron microscopy, . . . radioactive isotopes, and particle accelerators." Weaver's impact as an administrative promoter of science was as important in his era as that of his "spiritual" successor, Vannevar Bush, science advisor to the President in the post–World War II era.

Another important achievement in antibacterial research occurred in
the late 1930s, when Rene Dubos and colleagues at the Rockefeller
Institute for Medical Research inaugurated a search for soil microorganisms whose enzymes could destroy lethal
bacteria. The hope was that the enzymes could be adapted for use in humans. In 1939, Dubos discovered a substance
extracted from a soil bacillus that cured mice infected with pneumococci. He named it tyrothricin, and it is regarded
as the first antibiotic to be established as a therapeutic substance. The 1920s and 1930s were also interesting times in
malaria research. The popular antimalarial drug chloroquine was not formally recognized in the United States until
1946, but it had been synthesized 12 years before at Germany’s Bayer Laboratories under the name Resochin.
While much of pharmaceutical science concentrated on finding answers to the problems posed by bacterial
infections, there was some significant work done on viruses as well. Viruses were identified in the late 19th century
by Dutch botanist Martinus Beijerinck, but a virus was not crystallized until 1935, when biochemist Wendell
Stanley processed a ton of infected tobacco leaves down to one tablespoon of crystalline powder—tobacco mosaic
virus. Unlike bacteria, viruses proved to be highly resistant to assault by chemotherapy, and thus antiviral research
during the era did not yield the successes of antibiotic research.
Most clinical research was dedicated to the search for vaccines and prophylactics rather than treatments. Polio was
one of the most feared scourges threatening the world’s children in the 1920s and 1930s, but no real breakthroughs
came until the late 1940s. Scientifically, the greatest progress in antiviral research was probably made in
investigations of the yellow fever virus, which were underwritten by the Rockefeller Foundation in the 1930s. But as
in the case of polio, years passed before a vaccine was developed. In the meantime, efforts by the public health
services of many nations, including the United States, promoted vaccination as part of the battle against several
deadly diseases, from smallpox to typhoid fever. Special efforts were often made to attack the diseases in rural areas,
where few doctors were available.
Vitamins and deficiency diseases
Whereas the medical fight against pernicious bacteria and viruses brings to mind a military battle against invading
forces, some diseases are caused by internal treachery and metabolic deficiency rather than external assault. This
notion is now commonplace, yet in the early 20th century it was hotly contested.
For the first time, scientists of the era isolated and purified “food factors,” or “vitamines,” and understood that the
absence of vitamins has various detrimental effects on the body, depending on which ones are in short supply.
Vitamins occur in the body in very small concentrations; thus, in these early years, determining which foods
contained which vitamins, and analyzing their structure and effects on health, was complex and time-intensive.
Ultimately, scientists discerned that vitamins are essential for converting food into energy and are critical to human
growth. As a result of this research, by the late 1930s, several vitamins and vitamin mixtures were used for
therapeutic purposes.
The isolation of specific vitamins began in earnest in the second decade of the 20th century and continued into the
1920s and 1930s. Experiments in 1916 showed that fat-soluble vitamin A was necessary for normal growth in young
rats; in 1919, Harry Steenbock, then an agricultural chemist at the University of Wisconsin, observed that the vitamin
A content of vegetables varies with the degree of vegetable pigmentation. It was later determined that vitamin A is
derived from the plant pigment carotene. Also in 1919, Edward Mellanby proved that rickets is caused by a dietary
deficiency. His research indicated that the deficiency could be overcome—and rickets prevented or cured—by
adding certain fats to the diet, particularly cod-liver oil. At first, Mellanby thought that vitamin A was the critical
factor, but further experimentation did not support this hypothesis. Three years later, Elmer McCollum and
associates at Johns Hopkins University offered clear proof that vitamin A did not prevent rickets and that the
antirachitic factor in cod-liver oil was the fat-soluble vitamin D. The research team soon developed a method for
estimating the vitamin D content in foods.
Experiments on vitamin D continued into the mid-1920s. The most significant were the projects of Steenbock and
Alfred Hess, working in Wisconsin and New York, respectively, who reported that antirachitic potency could be
conveyed to some biological materials by exposing them to a mercury-vapor lamp. The substance in food that was
activated by ultraviolet radiation was not fat, but a compound associated with fat called ergosterol, which is also
present in human skin. The scientists surmised that the explanation of the antirachitic effect of sunlight is that
ultraviolet rays form vitamin D from ergosterol in the skin, which then passes into the blood. The term vitamin D-1
was applied to the first antirachitic substance to be isolated from irradiated ergosterol, and we now know that there
are several forms of vitamin D.
Other important vitamin studies took place in the 1920s and 1930s that had implications for the future of
pharmaceutical chemistry. In 1929, Henrik Dam, a Danish biochemist, discovered that chicks fed a diet that
contained no fat developed a tendency toward hemorrhage. Five years later, Dam and colleagues discovered that if
hemp seeds were added to the chicks’ diet, bleeding did not occur. The substance in the seeds that protected against
hemorrhage was named vitamin K, for koagulation vitamin. In 1935, Armand Quick and colleagues at Marquette
University reported that the bleeding often associated with jaundiced patients was caused by a decrease in the blood
coagulation factor prothrombin. This study was complemented by a report by H. R. Butt and E. D. Warner that
stated that a combination of bile salts and vitamin K effectively relieved the hemorrhagic tendency in jaundiced
patients. All of these scientists' work pointed to the conclusion that vitamin K was linked to the clotting of blood, was necessary for the prevention of hemorrhage, and was essential for the formation of prothrombin.
Dam and the Swiss chemist Paul Karrer reported in 1939 that they had prepared pure vitamin K from green leaves.
In the same year, Edward Doisy, a biochemist at Saint Louis University, isolated vitamin K from alfalfa, determined
its chemical composition, and synthesized it in the laboratory. Vitamin K was now available for treating patients
who suffered from blood clotting problems. Dam and Doisy received the 1943 Nobel Prize in Physiology or
Medicine for their work.
With the advent of successful research on vitamins came greater commercial exploitation of these substances. In
1933, Tadeus Reichstein synthesized ascorbic acid (vitamin C), making it readily available thereafter. The
consumption of vitamins increased in the 1930s, and popular belief held them to be almost magical. Manufacturers,
of course, did not hesitate to take advantage of this credulity. There was no informed public regulation of the sale
and use of vitamins, and, as some vitamins were dangerous in excess quantities, this had drastic results in isolated
cases. Water-soluble vitamins such as vitamin C easily flow out of the body through the kidneys, but the fat-soluble
vitamins, such as A, D, and K, could not so easily be disposed of and might therefore prove especially dangerous.
Many physicians of the era did nothing to discourage popular misconceptions about vitamins, or harbored relatively
uncritical beliefs themselves.

The FDA and federal regulation
The threat posed by unregulated vitamins was not nearly as dangerous as the potential consequences of unregulated drugs. Yet legislation was enacted only when drug tragedies incensed the public and forced Congress to act. After the 1906 Pure Food and Drug Act, there was no federal legislation dealing with drugs for decades, although the American Medical Association (AMA) did attempt to educate physicians and the public about pharmaceuticals. The AMA published books exposing quack medicines, gradually adopted standards for advertisements in medical journals, and in 1929 initiated a program of testing drugs and granting a Seal of Acceptance to those meeting its standards. Only drugs that received the seal were eligible to advertise in AMA journals.
Dangerous drugs were still sold legally, however, because safety testing was not required before marketing. Well into the 1930s, pharmaceutical companies still manufactured many 19th and early 20th century drugs that were sold in bulk to pharmacists, who then compounded them into physicians' prescriptions. But newer drugs, such as many biologicals and sulfa drugs (after 1935), were packaged for sale directly to consumers and seemed to represent the future of drug manufacturing.
In 1937, an American pharmaceutical company produced a liquid sulfa drug. Attempting to make sulfanilamide useful for injections, the company mixed it with diethylene glycol—the toxic chemical now used in automobile antifreeze. Ultimately sold as a syrup called Elixir of Sulfanilamide, the drug concoction was on the market for two months, in which time it killed more than 100 people, including many children, who drank it.
Under existing federal legislation, the manufacturer could be held liable only for mislabeling the product. In response to this tragedy and a series of other scandals, Congress passed the Food, Drug and Cosmetic Act of 1938, which banned drugs that were dangerous when used as directed, and required drug labels to include directions for use and appropriate warnings. The act also required new drugs to be tested for safety before being granted federal government approval and created a new category of drugs that could be dispensed to a patient only at the request of a physician. Before the act was passed, patients could purchase any drug, except narcotics, from pharmacists. The Food and Drug Administration (the regulatory division established in 1927 from the former Bureau of Chemistry) was given responsibility for implementing these laws.
The 1938 legislation is the basic law that still regulates the pharmaceutical industry. New manufacturing and mass-marketing

SOCIETY: Long before Viagra...
Male sexual dysfunction is certainly not a new condition to be subjected to pharmaceutical intervention. Today, Viagra is a hugely successful "lifestyle drug" used to treat this condition; in the early 20th century, injections of glandular materials and testicular transplants had a heyday.
In the 1920s and 1930s, experimentation culminated in the discovery of testosterone. In 1918, Leo L. Stanley, resident physician of San Quentin State Prison in California, transplanted testicles removed from recently executed prisoners into inmates, some of whom claimed that they recovered sexual potency. In 1920, a lack of human material led to the substitution of boar, deer, goat, and ram testes. In the 1920s, Russian–French surgeon Serge Voronoff made a fortune transplanting monkey glands into aging men. Throughout this period, researchers tested the androgenic effects of substances isolated from large quantities of animal testicles and from human urine. (Adolf Butenandt isolated milligram amounts of androsterone from 15,000 L of policemen's urine.) Finally, Karoly G. David, Ernst Laqueur, and colleagues isolated crystalline testosterone from testicles and published the results in 1935. Within a few months, groups led by Butenandt and G. Hanisch (funded by Schering Corp. in Berlin), and Leopold Ruzicka and A. Wettstein of Ciba, developed synthetic methods of preparing testosterone. Butenandt and Ruzicka shared the 1939 Nobel Prize in Chemistry for this achievement.
Instrumentation
Although the 1920s and 1930s were especially fruitful for “soft” technologies such as antibiotics and vitamin
production, these decades also produced several significant “hard” technologies—scientific instruments that
transformed pharmaceutical R&D. Initially, one might think of the electron microscope, which was developed in
1931 in Germany. This early transmission electron microscope, invented by Max Knoll and Ernst Ruska, was
essential for the future of pharmaceutical and biomedical research, but many other critical instruments came out of
this era. Instrument production often brought people from disparate disciplines together on research teams, as
physical chemists and physicists collaborated with biochemists and physiologists. New research fields were created
along the boundary between chemistry and physics, and in the process, many new instruments were invented or
adapted to molecular phenomena and biomolecular problems.
One of the scientists who worked under the aegis of Warren Weaver, research administrator for the natural sciences at the Rockefeller Foundation, was Swedish chemist Theodor Svedberg. Svedberg's early research was on colloids, and he began to develop high-speed centrifuges in the hope that they might provide an exact method for measuring the distribution of particle size in the solutions. In 1924, he developed the first ultracentrifuge, which generated a centrifugal force up to 5000 times the force of gravity. Later versions generated forces hundreds of thousands of times the force of gravity. Svedberg precisely determined the molecular weights of highly complex proteins, including hemoglobin. In later years, he performed studies in nuclear chemistry, contributed to the development of the cyclotron, and helped his student, Arne Tiselius, develop electrophoresis to separate and analyze proteins.
Another essential instrument developed in this era was the pH meter with a glass electrode. Kenneth Goode first used a vacuum triode to measure pH in 1921, but this potentiometer was not coupled to a glass electrode until 1928, when two groups (at New York University and the University of Illinois) measured pH by using this combination. Rapid and inexpensive pH measurement was not a reality until 1934, however, when Arnold Beckman of the California Institute of Technology and corporate chemist Glen Joseph substituted a vacuum tube voltmeter for a galvanometer and assembled a sturdy measuring device with two vacuum tubes and a milliammeter. The portable pH meter was marketed in 1935 for $195.
In the world of medical applications, the electrometer dosimeter was developed in the mid-1920s to assess exposure to ionizing radiation for medical treatment, radiation protection, and industrial exposure control. For clinical dosimetry and treatment planning, an ionization chamber connected to an electrometer was valued for its convenience, versatility, sensitivity, and reproducibility.

TECHNOLOGY: Electrifying separations
Spurred by the desire to separate proteins, Theodor Svedberg and his student Arne Tiselius were responsible for the development of electrophoresis. Tiselius transformed the electrophoresis apparatus into a powerful analytical instrument in the late 1920s. He used a U-shaped tube as the electrophoresis chamber, added a cooling system, and adapted a Schlieren optical system to visualize the refraction boundaries of colorless solutions. Despite its expense ($6000 to build and $5000 a year to maintain and operate), there were 14 Tiselius stations in the United States by the end of 1939. Electrophoresis became the workhorse instrument for several generations of biologists interested in physiology and genetics—and the enabling instrument for most of the coming work in protein and nucleic acid purification and analysis. It proved the absolute requisite for the rational development of pharmaceuticals based on interaction with human enzymes or human genes.
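The force figures quoted for the ultracentrifuge follow from simple rotational mechanics. As an illustrative calculation only (the rotor radius and speed below are assumed round numbers, not Svedberg's actual specifications), the relative centrifugal force is

\[ \mathrm{RCF} = \frac{\omega^{2} r}{g} = \frac{(2\pi N/60)^{2}\, r}{9.81\ \mathrm{m/s^{2}}} , \]

so a rotor of radius r = 5 cm spun at N = 10,000 revolutions per minute gives \(\omega \approx 1047\ \mathrm{rad/s}\) and \(\mathrm{RCF} \approx (1047)^{2} \times 0.05 / 9.81 \approx 5600\), on the order of the 5000-fold figure cited above. Because the force scales with the square of the rotation rate, the later machines that reached hundreds of thousands of times gravity required correspondingly higher speeds.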
It was not simply the invention of instruments, but the way research was organized around them, that made the
1920s and 1930s so fertile for biochemistry. Weaver was involved in encouraging and funding much of this activity,
whether it was Svedberg’s work on molecular evolution or Linus Pauling’s use of X-ray diffraction to measure bond
lengths and bond angles, and his development of the method of electron diffraction to measure the architecture of
organic compounds. Inventing and developing new instruments allowed scientists to combine physics and chemistry
and advance the field of pharmaceutical science.
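A brief aside on the physics behind the glass-electrode pH meter described above (this is the textbook Nernst relation, not a description of Beckman's particular circuit): the electrode potential varies linearly with pH,

\[ \mathrm{pH} = -\log_{10} a_{\mathrm{H^{+}}}, \qquad E \approx E^{\circ} - \frac{2.303\,RT}{F}\,\mathrm{pH} \approx E^{\circ} - (59\ \mathrm{mV})\times\mathrm{pH} \quad \text{at } 25\ ^{\circ}\mathrm{C} . \]

Because the glass membrane has an extremely high electrical resistance, these few tens of millivolts per pH unit cannot drive an ordinary galvanometer directly, which is generally cited as the reason a high-input-impedance vacuum-tube circuit of the kind Beckman and Joseph assembled made routine, portable measurement practical.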
Radioisotopes
Another powerful technological development that was refined during the 1920s and 1930s was the use of radioactive
forms of elements—radioisotopes—in research. Hungarian chemist Georg von Hevesy introduced radioisotopes into
experimental use in 1913, tracing the behavior of nonradioactive forms of selected elements; he later used a
radioisotope of lead to trace the movement of lead from soil into bean plants.
The radioactive tracer was an alternative to more arduous methods of measurement and study. By the late 1920s,
researchers applied the tracer technique to humans by injecting dissolved radon into the bloodstream to measure the
rate of blood circulation. Yet there were limits to the use of radioisotopes, owing to the fact that some important
elements in living organisms do not possess naturally occurring radioisotopes.
This difficulty was overcome in the early 1930s, when medical researchers realized that the cyclotron, or “atom
smasher,” invented by physicist Ernest Lawrence, could be used to create radioisotopes for treatment and research.
Radiosodium was first used in 1936 to treat several leukemia patients; the following year, Lawrence’s brother, John,
used radiophosphorus to treat the same disease. A similar method was used to treat another blood disease,
polycythemia vera, and soon it became a standard treatment for that malady. Joseph Hamilton and Robert Stone at
the University of California, Berkeley, pioneered the use of cyclotron-produced radioisotopes for treating cancer in
1938; and one year later, Ernest Lawrence constructed an even larger atom smasher, known as the “medical
cyclotron,” which would create additional radioisotopes in the hopes of treating cancer and other diseases.
Thus began the age of “nuclear medicine,” in which the skills of physicists were necessary to produce materials critical to biochemical research. The new use of radioisotopes was a far cry from the quack medicines of the period that used the mystique of radioactivity to peddle radium pills and elixirs for human consumption—although in their ignorance, many legitimate doctors also did far more harm than good. The cost of producing radioisotopes was high, as cyclotrons often operated continuously at full power, requiring the attention of physicists around the clock. The human and financial resources of physics departments were strained in these early years, which made the contributions of foundations essential to the continuation of these innovative projects. Ultimately, as the century wore on, radioisotopes came into routine use, the federal government’s role increased immensely, and radioisotopes were mass-produced in the reactors of the Atomic Energy Commission. After World War II, nuclear medicine occupied a permanent place in pharmaceutical science.
By the end of the 1930s, the work of scientists such as Svedberg, Tiselius, Banting, Dubos and Domagk, along with the vision of administrators such as Weaver, set the stage for the unparalleled developments of the antibiotic era to come. In addition, World War II, already beginning in Europe, spurred a wealth of research into perfecting known technologies and developing new ones—instruments and processes that would have a profound impact on the direction that biology and pharmacology would ultimately take.
Suggested reading
• FDA History on the Web: www.fda.gov/oc/history/default.htm
• Final Report of the Advisory Committee on Human Radiation Experiments (Stock # 061-000-00-848-9) (Superintendent of Documents, U.S. Government Printing Office: Washington, DC, 1995)
• In Search of a Cure: A History of Pharmaceutical Discovery, Weatherall, M. (Oxford University Press: New York, 1990)
• Partners in Science: Foundations and Natural Scientists, 1900–1945, Kohler, R. E. (University of Chicago Press: Chicago, 1991)
• The Greatest Benefit to Mankind: A Medical History of Humanity, Porter, R. (W. W. Norton: New York, 1997)
Antibiotics & isotopes
Introduction As the American public danced to the beat of the Big Band era, so did pharmacology swing into
action with the upbeat tone of the dawning antibiotic era.
“The antibiotic era” is the nickname most commonly used for the 1940s among scientists in the world of biotechnology and
pharmaceuticals. The nickname is more than justified, given the numerous impressive molecules developed during
the decade. Many “firsts” were accomplished in the drug discovery industry of the Forties, but no longer in a
serendipitous fashion as before. Researchers were actually looking for drugs and finding them.
To appreciate the events that paved the way for the advancements of the 1940s, one need only look back to 1939,
when Rene Dubos of the Rockefeller Institute for Medical Research discovered and isolated an antibacterial
compound—tyrothricin, from the soil microbe Bacillus brevis—capable of destroying Gram-positive
bacteria. Before this discovery, penicillin and the sulfa drugs had been discovered “accidentally.” Dubos planned his
experiment to search soil for microorganisms that could destroy organisms related to diseases. “It was a planned
experiment. It wasn’t a chance observation,” explains H. Boyd Woodruff, who worked on the penicillin project at
Merck in the early 1940s, “[and because the experiment] had been successful, it sort of opened the field in terms of
looking at soil for microorganisms that kill disease organisms.”
Fleming’s serendipity
The most famous example of serendipity in the 20th century has to be the discovery of penicillin by Alexander
Fleming as discussed in the previous chapter. Although Fleming observed the antibiotic properties of the mold
Penicillium notatum in 1928, it was another 12 years before the active ingredient, penicillin, was isolated and
refined.
Of course, as in most of history, there are some inconsistencies. Fleming was not the first scientist to observe the
antibacterial action of penicillin. In 1896 in Lyon, France, Ernest Augustin Duchesne studied the survival and
growth of bacteria and molds, separately and together. He observed that the mold Penicillium glaucum had
antibacterial properties against strains of both Escherichia coli and typhoid bacilli. The antibacterial properties of
penicillin serendipitously surfaced at least three times during the course of scientific history before scientists used its
power. And the third time was definitely a charm.
Finding a magic bullet
An Oxford University student of pathology, Howard Walter Florey had early research interests in mucus
secretion and lysozyme—an antibacterial enzyme originally discovered by Fleming. The more he learned about the
antibacterial properties of lysozyme and intestinal mucus, the more interested he became in understanding the actual
chemistry behind the enzymatic reactions. However, he did not have the opportunity to work with chemists until
1935, when Florey hired Ernst Boris Chain to set up a biochemistry section in the department of pathology at the Sir
William Dunn School of Pathology at Oxford. Because Chain was a chemist, Florey encouraged him to study the
molecular action of lysozyme. Florey wanted to find out whether lysozyme played a role in duodenal ulcers and was
less interested in its antibacterial properties.
During a scientific literature search on bacteriolysis, Chain came upon Fleming’s published report of penicillin,
which had, as Chain describes, “sunk into oblivion in the literature.” Chain thought that the active substance
inducing staphylococcus lysis might be similar to lysozyme, and that their modes of action might also be similar. He
set out to isolate penicillin to satisfy his own scientific curiosity and to answer a biological problem—what reaction
lysozyme catalyzes—not to find a drug.
The scientific collaborations and discussions between Chain and Florey eventually laid the foundation for their 1939
funding application to study the antimicrobial products of microorganisms. It never crossed their minds that one of
these antimicrobial products would be the next magic bullet. The timing of their funded research is also
significant—it occurred within months of Great Britain’s declaration of war with Germany and the beginning of
World War II.
Because of its activity against staphylococcus, Fleming’s penicillin was one of the first compounds chosen for the
study. The first step toward isolating penicillin came in March 1940 at the suggestion of colleague Norman G.
Heatley. The team extracted the acidified culture filtrate into organic solution and then re-extracted penicillin into a
neutral aqueous solution. In May, Florey examined the chemotherapeutic effects of penicillin by treating four of eight mice infected with Streptococcus pyogenes. The mice treated with penicillin survived, whereas the other four died within 15 hours. In September, Henry Dawson and colleagues confirmed the antibiotic properties of penicillin by
taking a bold step and injecting penicillin into a patient at Columbia Presbyterian Hospital (New York).
With the help of Chain, Heatley, Edward P. Abraham, and other Dunn School chemists, Florey was able to scrape
up enough penicillin to perform clinical trials at the Radcliffe Infirmary in Oxford in February 1941. The first
patient treated was dying of a mixed S. aureus and S. pyogenes infection. Treatment with penicillin resulted in an amazing recovery,
but because of insufficient quantities of the drug, the patient eventually died after a relapse.
Over the next three months, five other patients responded well when treated with penicillin. All of these patients
were seriously ill with staphylococcal or streptococcal infections that could not be treated with sulfonamide. These
trials proved the effectiveness of penicillin when compared to the sulfa drugs, which at the time were considered the
gold standard for treating infections.
Producing penicillin
Florey had difficulties isolating the quantities of penicillin required to prove its value. In the early years, the Oxford
team grew the mold by surface culture in anything they could lay their hands on. Because of the war, they couldn’t
get the glass flasks they wanted, so they used bedpans until Florey convinced a manufacturer to make porcelain pots,
which incidentally resembled bedpans. Britain was deep into the war, and the British pharmaceutical industry did
not have the personnel, material, or funds to help Florey produce penicillin.
Florey and Heatley came to the United States in June 1941 to seek assistance from the American pharmaceutical
industry. They traveled around the country but could not garner interest for the project. Because of the as yet ill-
defined growing conditions for P. notatum and the instability of the active compound, the yield of penicillin was low
and it was not economically feasible to produce. Florey and Heatley ended up working with the U.S. Department of
Agriculture’s Northern Regional Research Laboratory in Peoria, IL.
The agricultural research center had excellent fermentation facilities, but more importantly—unlike any other
facility in the country—it used corn steep liquor in the medium when faced with problematic cultures. This liquor
yielded remarkable results for the penicillin culture. The production of penicillin increased by more than 10-fold,
and the resulting penicillin was stable. It turns out that the penicillin (penicillin G) produced at the Peoria site was an
entirely different compound from the penicillin (penicillin F) produced in Britain. Fortunately for all parties
involved, penicillin G demonstrated the same antibacterial properties against infections as penicillin F. With these
new developments, Merck, Pfizer, and Squibb agreed to collaborate on the development of penicillin.
By this time, the United States had entered the war, and the U.S. government was encouraging pharmaceutical
companies to collaborate and successfully produce enough penicillin to treat war-related injuries. By 1943, several
U.S. pharmaceutical companies were mass-producing purified penicillin G (~21 billion dosage units per month), and
it became readily available to treat bacterial infections contracted by soldiers. In fact, by 1944, there was sufficient
penicillin to treat all of the severe battle wounds incurred on D-day at Normandy. Diseases like syphilis and gonorrhea could also suddenly be treated far more easily than before; earlier treatments had included urethra cleaning and doses of noxious chemicals such as mercury or Salvarsan. The Americans continued to produce penicillin at a
phenomenal rate, reaching nearly 7 trillion units per month in 1945. Fleming, Florey, and Chain were recognized
“for the discovery of penicillin and its curative effect in various infectious diseases” in 1945 when they received the
Nobel Prize in Physiology or Medicine.
But all magic bullets lose their luster, and penicillin was no different. Dubos had the foresight to understand the
unfortunate potential of antibiotic-resistant bacteria and encouraged prudent use of antibiotics. As a result of this
fear, Dubos stopped searching for naturally occurring compounds with antibacterial properties.
As early as 1940, Abraham and Chain identified a strain of S. aureus that could not be treated with penicillin. This seemingly small, almost insignificant event foreshadowed the wave of antibiotic-resistant microorganisms that became such a problem throughout the medical field toward the end of the century.
BIOGRAPHY: A.J.P. Martin, chromatographer extraordinaire
In 1941, Archer John Porter Martin and Richard Laurence Millington Synge, working for the Wool Industries Research Association in England, came up with liquid–liquid partition chromatography. It was arguably the most significant tool for linking analytical chemistry to the life sciences and helped create and define molecular biology research. Partition chromatography was developed to separate the various amino acids that made up proteins. In 1944, Martin, with his colleagues R. Consden and A. H. Gordon, developed paper chromatography—another of the most important methods used in the early days of biotechnology. This enabled the routine isolation and identification of nucleic acids and amino acids unattainable by column chromatography. Amazingly enough, in 1950, Martin, with yet another colleague, Anthony T. James, developed gas–liquid partition chromatography from an idea Martin and Synge had put forward in their 1941 paper. These technologies, along with a host of others begun in the 1940s, would make possible the rational drug discovery breakthroughs of the coming decades.
Malaria and quinine
Although penicillin was valuable against the battle-wound infections and venereal diseases that have always afflicted soldiers, it was not effective against the malaria that was killing off the troops in the mosquito-ridden South Pacific. The Americans entered Guadalcanal in June 1942, and by August there were 900 cases of malaria; in September, there were 1724, and in October, 2630. By December 1942, more than 8500 U.S. soldiers were hospitalized with malaria. Ninety percent of the men had contracted the disease, and in one hospital, as many as eight of every 10 soldiers had malaria rather than combat-related injuries.
The only available treatment, however, was the justifiably unpopular drug Atabrine. Besides tasting bitter, the yellow Atabrine pills caused headaches, nausea, vomiting, and in some cases, temporary psychosis. It also seemed to leave a sickly hue to the skin and was falsely rumored to cause impotence. Nevertheless, it was effective and saved lives. Firms such as Abbott, Lilly, Merck, and Frederick Stearns assured a steady supply of Atabrine, producing 3.5 billion tablets in 1944 alone.
But Atabrine lacked the efficacy of quinine, which is isolated from cinchona, an evergreen tree native to the mountains of South and Central America. Unfortunately, the United States did not have a sufficient supply of quinine in reserve when the war broke out. As a result, the U.S. government established the Cinchona Mission in 1942. Teams of botanists, foresters, and assistants went to South America to find and collect quinine-rich strains of the tree—a costly, strenuous, and time-consuming task.
Out of desperation, research to develop antimalarials intensified. As an unfortunate example of this desperation, prison doctors in the Chicago area experimentally infected nearly 400 inmates with
malaria during their search for a therapeutic. Although aware that they were helping the war effort, the prisoners
were not given sufficient information about the details and risks of the clinical experiments. After the war, Nazi
doctors on trial for war crimes in Nuremberg referred to this incident as part of their defense for their criminal
treatment of prisoners while aiding the German war effort.
In 1944, William E. Doering and Robert B. Woodward synthesized quinine—a complex molecular structure—from
coal tar. Woodward’s achievements in the art of organic synthesis earned him the 1965 Nobel Prize in Chemistry.
Chloroquine, another important antimalarial, was synthesized and studied under the name of Resochin by the
German company Bayer in 1934 and rediscovered in the mid-1940s. Even though chloroquine-resistant parasites
cause illness throughout the world, the drug is still the primary treatment for malaria.
Streptomycin and tuberculosis
When Dubos presented his results with tyrothricin at the Third International Congress for Microbiology in New York in 1939, Selman A. Waksman was there to see it. The successful development of penicillin and the discovery of tyrothricin made Waksman realize the enormous potential of soil as a source of druglike compounds. He immediately decided to focus on the medicinal uses of antibacterial soil microbes.
In 1940, Woodruff and Waksman isolated and purified actinomycin from Actinomyces griseus (later named Streptomyces griseus), which led to the discovery of many other antibiotics from that same group of microorganisms. Actinomycin attacks Gram-negative bacteria responsible for diseases like typhoid, dysentery, cholera, and undulant fever and was the first antibiotic purified from an actinomycete. Considered too toxic for the treatment of diseases in animals or humans, actinomycin is primarily used as an investigative tool in cell biology. In 1942, the two researchers isolated and purified streptothricin, which prevents the proliferation of Mycobacterium tuberculosis but is also too toxic for human use.
A couple of years later, in 1944, Waksman, with Albert Schatz and Elizabeth Bugie, isolated the first aminoglycoside, streptomycin, from S. griseus. Aminoglycosides block protein synthesis in bacterial cells; unlike penicillin, streptomycin is active against Gram-negative organisms as well as the tubercle bacillus. Waksman studied the value of streptomycin in treating bacterial infections, especially tuberculosis. In 1942, several hundred thousand deaths resulted from tuberculosis in Europe, and another 5 to 10 million people suffered from the disease. Although sulfa drugs and penicillin were readily available, they literally had no effect.
SOCIETY: Atoms for peace?
Although “Atoms for Peace” became a slogan in the 1950s, in the 1940s the case was just the opposite. By the start of the decade, Ernest Lawrence’s Radiation Laboratory and its medical cyclotron, which began operation in the 1930s to produce experimental therapeutics, were transformed to the wartime service of atom bomb research. Ultimately, the Manhattan Project yielded both the Hiroshima and Nagasaki bombs, but also what the Department of Energy refers to as “a new and secret world of human experimentation.” Tests on informed and uninformed volunteers, both civilian and military, of the effects of radiation and fallout flourished simultaneously in a world where, for the first time, sufficient quantities of a wide variety of radioactive materials became available for use in a dizzying array of scientific research and medical experimentation. Radioisotopes would become the tool for physiological research in the decades that followed.
Merck immediately started manufacturing streptomycin with the help of Woodruff. Waksman, a consultant for Merck, had sent Woodruff to the company to help with the penicillin project, and after finishing his thesis, Woodruff continued working there. Simultaneously, studies by W. H. Feldman and H. C. Hinshaw at the Mayo Clinic
confirmed streptomycin’s efficacy and relatively low toxicity against tuberculosis in guinea pigs. On November 20,
1944, doctors administered streptomycin for the first time to a seriously ill tuberculosis patient and observed a rapid,
impressive recovery. No longer unconquerable, tuberculosis could be tamed and beaten into retreat. In 1952,
Waksman was awarded the Nobel Prize in Physiology or Medicine for his discovery of streptomycin—1 of 18
antibiotics discovered under his guidance—and its therapeutic effects in patients suffering from tuberculosis.
Merck had just developed streptomycin and moved it into the marketplace when the company stumbled upon
another great discovery. At the time, doctors treated patients with pernicious anemia by injecting them with liver
extracts, which contained a factor required for curing and controlling the disease. When patients stopped receiving
injections, the disease redeveloped. The Merck chemists had been working on isolating what was called the
pernicious anemia factor from liver extracts, and they decided to look at the cultures grown by Woodruff and other
microbiologists at Merck, to see if one of the cultures might produce the pernicious anemia factor. They found a
strain of S. griseus similar to the streptomycin-producing strain that made the pernicious anemia factor.
With the help of Mary Shorb’s Lactobacillus lactis assay to guide the purification and crystallization of the factor,
Merck scientists were able to manufacture and market the factor as a cure for pernicious anemia. The factor turned
out to be a vitamin, and it was later named vitamin B12. As Woodruff describes the period, “So, we jumped from
penicillin to streptomycin to vitamin B12. We got them 1-2-3, bang-bang-bang.” Merck struck gold three times in a
row. The United Kingdom’s only woman Nobel laureate, Dorothy Crowfoot Hodgkin, solved the molecular
structure of vitamin B12 in 1956—just as she had for penicillin in the 1940s, a discovery that was withheld from
publication until World War II was over.
The continuing search
After developing penicillin, U.S. pharmaceutical companies continued to search for “antibiotics,” a term coined by P. Vuillemin in 1889 but later defined by Waksman in 1947 as those chemical substances “produced by microbes that inhibit the growth of and even destroy other microbes.”
In 1948, Benjamin M. Duggar, a professor at the University of Wisconsin and a consultant to Lederle, isolated chlortetracycline from Streptomyces aureofaciens. Chlortetracycline, also called aureomycin, was the first tetracycline antibiotic and the first broad-spectrum antibiotic. Active against an estimated 50 disease organisms, aureomycin works by inhibiting protein synthesis. The discovery of the tetracycline ring system also enabled further development of other important antibiotics.
Other antibiotics with inhibitory effects on cell wall synthesis were also discovered in the 1940s and include cephalosporin and bacitracin. Another β-lactam antibiotic, cephalosporin was first isolated from Cephalosporium acremonium in 1948 by Giuseppe Brotzu at the University of Cagliari in Italy. Bacitracin, first derived from a strain of Bacillus subtilis, is active against Gram-positive bacteria and is used topically to treat skin infections.
TECHNOLOGY: Transforming DNA
The 1940s was the era that changed DNA from just a chemical curiosity to the acknowledged home of the gene—a realization that would have a profound impact on the pharmaceutical industry by the end of the century. In the mid-1930s, Oswald T. Avery, after discovering a pneumonia-fighting enzyme with Dubos and being disheartened by its lack of utility compared with Prontosil, moved into the field of molecular biology. His greatest achievement came in 1944 with Colin MacLeod and Maclyn McCarty, when the team showed that DNA constitutes the genetic material in cells. Their research demonstrated that pancreatic deoxyribonuclease, but not pancreatic ribonuclease or proteolytic enzymes, destroyed the “transforming principle.” This factor that could move genetic information from one type of bacteria to another was naked DNA. In a sense, this was really the experiment that set the stage for Watson and Crick in the 1950s and the marvels of genetic engineering yet to come.
Nonantibiotic therapeutics
Even though Lederle Laboratories was a blood processing plant during World War II, it evolved into a manufacturer of vitamins and nutritional products, including folic acid. Sidney Farber, a cancer scientist at Boston’s Children’s Hospital, was testing the effects of folic acid on cancer. Some of his results, which now look dubious, suggested that folic acid worsened cancer conditions, inspiring the chemists at Lederle to make antimetabolites—structural mimics of
essential metabolites that interfere with any biosynthetic reaction
involving the intermediates—resembling folic acid to block its action. These events led to the 1948 development of
methotrexate, one of the earliest anticancer agents and the mainstay of leukemia chemotherapy.
But the pioneer of designing and synthesizing antimetabolites that could destroy cancer cells was George Hitchings,
head of the department of biochemistry at Burroughs Wellcome Co. In 1942, Hitchings initiated his DNA-based
antimetabolite program, and in 1948, he and Gertrude Elion synthesized and demonstrated the anticancer activity of
2,6-diaminopurine. By fine-tuning the structure of the toxic compound, Elion synthesized 6-mercaptopurine, a
successful therapeutic for treating acute leukemia. Hitchings, Elion, and Sir James W. Black won the Nobel Prize in
Physiology or Medicine in 1988 for their discoveries of “important principles for drug treatment,” which constituted
the groundwork for rational drug design.
The discovery of corticosteroids as a therapeutic can be linked to Thomas Addison, who made the connection
between the adrenal glands and the rare Addison’s disease in 1855. But it wasn’t until Edward Calvin Kendall at the
Mayo Clinic and Tadeus Reichstein at the University of Basel independently isolated several hormones from the
adrenal cortex that corticosteroids were used to treat a more widespread malady. In 1948, Kendall and Philip S.
Hench demonstrated the successful treatment of patients with rheumatoid arthritis using cortisone. Kendall,
Reichstein, and Hench received the 1950 Nobel Prize in Physiology or Medicine for determining the structure and
biological effects of adrenal cortex hormones.
One of the first therapeutic drugs to prevent cardiovascular disease also came from this period. While investigating
the mysterious deaths of farm cows, Karl Paul Link at the University of Wisconsin proved that the loss of clotting
ability in cattle was linked to the intake of sweet clover. He and his colleagues then isolated the anticoagulant dicoumarol, a derivative of coumarin (a substance found in sweet clover) and the forerunner of the blood thinner warfarin, in 1940.
Many other advances
The synthesis, isolation, and therapeutic applications of miracle drugs may be the most well-remembered
discoveries of the 1940s for medical chemists and biochemists, but advances in experimental genetics, biology, and
virology were also happening. These advances include isolating the influenza B virus in 1940, by Thomas Francis at
New York University and, independently, by Thomas Pleines Magill. Also in 1940, at the New York Hospital–
Cornell University Medical Center, Mary Loveless succeeded in blocking the generation of immunotherapy-induced
antibodies using pollen extracts. Routine use of the electron microscope in virology followed the first photos of
tobacco-mosaic virus by Helmut Ruska, an intern at the Charité Medical School of Berlin University, in 1939; and
the 1940s also saw numerous breakthroughs in immunology, including the first description of phagocytosis by a
neutrophil.
In 1926, Hermann J. Muller, a professor at the University of Texas at Austin, reported the identification of several
irradiation-induced genetic alterations, or mutations, in Drosophila that resulted in readily observed traits. This
work, which earned Muller the Nobel Prize in Physiology or Medicine in 1946, enabled scientists to recognize
mutations in genes as the cause of specific phenotypes, but it was still unclear how mutated genes led to the
observed phenotypes.
In 1935, George Wells Beadle began studying the development of eye pigment in Drosophila with Boris Ephrussi at
the Institut de Biologie Physico-Chimique in Paris. Beadle then collaborated with Edward Lawrie Tatum when they
both joined Stanford in 1937—Beadle as a professor of biology (genetics) and Tatum as a research associate in the
department of biological sciences. Tatum, who had a background in chemistry and biochemistry, handled the
chemical aspects of the Drosophila eye-color study. Beadle and Tatum eventually switched to the fungus
Neurospora crassa, a bread mold. After producing mutants of Neurospora by irradiation and searching for
interesting phenotypes, they found several auxotrophs—strains that grow normally on rich media but cannot grow
on minimal medium. Each mutant required its own specific nutritional supplement, and each requirement correlated
to the loss of a compound normally synthesized by the organism. By determining that each mutation caused a deficiency in a specific metabolic pathway, which was known to be controlled by enzymes, Beadle and Tatum concluded in a 1941 report that each gene directs the production of a single enzyme, a conclusion that became known as the “one gene–one enzyme” concept. The two scientists shared the Nobel Prize in Physiology or Medicine in 1958 for discovering that genes
regulate the function of enzymes and that each gene controls a specific enzyme.
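The logic of the Neurospora screen lends itself to a simple illustration. The sketch below uses a hypothetical three-step pathway and invented growth results (none of it is Beadle and Tatum's actual data); it simply shows how the pattern of which supplements rescue a mutant points to a single blocked enzymatic step.

    # Hypothetical linear pathway: precursor -> A -> B -> C (the required end product).
    PATHWAY = ["A", "B", "C"]

    # Invented growth results on minimal medium plus one supplement (True = grows).
    GROWTH = {
        "mutant-1": {"A": True,  "B": True,  "C": True},   # rescued by everything
        "mutant-2": {"A": False, "B": True,  "C": True},   # rescued only downstream of A
        "mutant-3": {"A": False, "B": False, "C": True},   # rescued only by the end product
    }

    def blocked_step(results):
        """Infer which enzymatic step a mutant lacks: the step just upstream of the
        earliest compound in the pathway that restores growth."""
        for i, compound in enumerate(PATHWAY):
            if results[compound]:
                upstream = PATHWAY[i - 1] if i > 0 else "precursor"
                return f"{upstream} -> {compound}"
        return "utilization of the end product"

    for mutant, results in GROWTH.items():
        print(f"{mutant}: missing the enzyme for the step {blocked_step(results)}")

One blocked step per mutant, each matched by a single rescuing class of supplement, is exactly the pattern that suggested a one-to-one pairing of genes and enzymes.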
Also recognized with the same prize in 1958 was Joshua Lederberg. As a graduate student in Tatum’s laboratory in
1946, Lederberg found that some plasmids enable bacteria to transfer genetic material to each other by forming
direct cell–cell contact in a process called conjugation. He also showed that F (fertility) factors allowed conjugation
to occur. In addition, Lederberg defined the concepts of generalized and specialized transduction, collaborated with
other scientists to develop the selection theory of antibody formation, and demonstrated that penicillin-susceptible
bacteria could be grown in the antibiotic’s presence if a hypertonic medium was used.
In the field of virology, John Franklin Enders, Thomas H. Weller, and Frederick Chapman Robbins at the Children’s
Hospital Medical Center in Boston figured out in 1949 how to grow poliovirus in test-tube cultures of human
tissues—a technique enabling the isolation and study of viruses. Polio, often referred to as infantile paralysis, was
one of the most feared diseases of the era. These researchers received the Nobel Prize in Physiology or Medicine in
1954.
Salvador Luria, at Indiana University, and Alfred Day Hershey, at Washington University’s School of Medicine,
demonstrated that the mutation of bacteriophages makes it difficult for a host to develop immunity against viruses.
In 1942, Thomas Anderson and Luria photographed and characterized E. coli T2 bacteriophages using an electron
microscope. Luria and Max Delbrück, at Vanderbilt University, used statistical methods to demonstrate that
inheritance in bacteria follows Darwinian principles. Luria, Hershey, and Delbrück were awarded the Nobel Prize in
Physiology or Medicine in 1969 for elucidating the replication mechanism and genetic structure of viruses.
Although these discoveries were made outside of the pharmaceutical industry, their applications contributed
enormously to understanding the mechanisms of diseases and therapeutic drugs.
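The statistical argument behind the Luria–Delbrück fluctuation analysis mentioned above can be sketched in a few lines. The simulation below is a simplified illustration with arbitrary parameters, not their original analysis: if resistance mutations arise at random during growth, occasional early "jackpot" mutants make the culture-to-culture variance far larger than the mean, whereas mutations induced only on exposure to phage would give Poisson-like counts with variance close to the mean.

    import numpy as np

    def fluctuation_test(n_cultures=50, generations=20, mu=1e-6, seed=0):
        """Simulate parallel cultures in which resistance mutations occur at random
        during growth (the 'Darwinian' hypothesis). Returns resistant counts per culture."""
        rng = np.random.default_rng(seed)
        population = np.ones(n_cultures, dtype=np.int64)
        resistant = np.zeros(n_cultures, dtype=np.int64)
        for _ in range(generations):
            sensitive = population - resistant
            new_mutants = rng.binomial(sensitive, mu)  # mutations during this round of division
            resistant = 2 * resistant + new_mutants    # resistant clones breed true
            population = 2 * population
        return resistant

    counts = fluctuation_test()
    # Null model: resistance induced only at plating would look Poisson (variance ~ mean).
    induced = np.random.default_rng(1).poisson(counts.mean(), size=counts.size)
    print(f"random mutation model: mean {counts.mean():.1f}, variance {counts.var():.1f}")
    print(f"induced (Poisson) model: mean {induced.mean():.1f}, variance {induced.var():.1f}")

The huge excess of variance over the mean in the first case is the statistical signature Luria and Delbrück used to argue that bacterial inheritance follows Darwinian principles.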
Biological and chemical warfare
Biological warfare—the use of disease to harm or kill an adversary’s military forces, population, food, and
livestock—can involve any living microorganism, nonliving virus, or bioactive substance deliverable by
conventional artillery. The history of biological warfare can be traced back to the Romans, who used dead animals
to infect their enemies’ water supply. Japan sprayed bubonic plague over parts of mainland China on five occasions in 1941, and although the spraying was ineffective, the attempts prompted the United States to start its own biological warfare program in 1942; detailed Japanese data about the destructive use of chemical and biological agents were later obtained from pardoned war criminals. The developing Cold War further stimulated this research in the United States—and in the Soviet Union.
Ironically, the first chemotherapeutic agent for cancer came from an early instance of chemical warfare. Initially
used as a weapon in World War I, mustard gas proved useful in treating mice and a person with lymphoma in 1942,
when Alfred Gilman and Fred Phillips experimentally administered the chemical weapon as a therapeutic. Because
the patient showed some improvement, chemical derivatives of mustard gas were developed and used to treat
various cancers.
The Nuremberg Code
Not only did World War II encourage the discovery and development of antibiotics and antidisease drugs, it also
instigated the need to define what constitutes permissible medical experiments on human subjects. The Nazis
performed cruel and criminal “medical” experiments on Jews and other prisoners during the war. In 1949, the
Nuremberg Code was established in an effort to prevent medical crimes against humanity. The Code requires that
individuals enrolled in clinical trials give voluntary consent. The experiment must hypothetically achieve useful
results for the good of society, be performed by scientifically qualified persons, and be derived from experiments on
animal models that suggest the anticipated outcome will justify human clinical experiments. The code also
emphasizes that all physical and mental suffering must be avoided and that precautions must be taken to protect the
human subject if injury or disability results from the experiment. In achieving its goals, the Nuremberg Code empowers the human subject and holds the researcher accountable for any unnecessary pain and suffering inflicted on that subject.
On the practical level, it was not until the 1960s that institutionalized protections for subjects in clinical trials and human experimentation were put into place.
“Swing” time
The 1940s ended with the antibiotic era in full swing and with a host of wartime advancements in fermentation and purification technologies changing the drug development process. Penicillin and DDT became the chemical markers of the age, promising to heal the world—curing the plagues and killing the plague carriers. The radioisotopes now easily produced through advances in technology promoted by the war were becoming routinely available for health research, as the era of computer-aided drug analysis began. The baby boom launched by postwar U.S. prosperity produced the first generation born with the expectation of health through drugs and medical intervention.
Because of these new possibilities, health became a political as well as a social issue. The leading role science played in the Allied victory gave way in the postwar 1940s to a new role as medical savior. The new technologies that became available in the 1940s—including partition chromatography, infrared and mass spectrometry, as well as nuclear magnetic resonance (NMR)—would eventually become critical to pharmaceutical progress.
Suggested reading
• “The Early Years of the Penicillin Discovery,” Chain, E. (TIPS, 1979, 6–11)
• Fighting for Life: American Military Medicine in World War II, Cowdrey, A. E. (The Free Press: New York, 1994)
• Penicillin: Meeting the Challenge, Hobby, G. (Yale University Press: New Haven, CT, 1985)
• Biological Weapons: Limiting the Threat, Lederberg, J. S., Ed. (MIT Press: Cambridge, MA, 1999)
• The Antibiotic Era, Waksman, S. A. (The Waksman Foundation of Japan: Tokyo, 1975)
• Scientific Contributions of Selman A. Waksman: Selected Articles Published in Honor of His 80th Birthday, July 22, 1968, Woodruff, H. B., Ed. (Rutgers University Press: New Brunswick, NJ, 1968)
Prescriptions & polio
Introduction The 1950s began amid a continuing wave of international paranoia. The Cold War intensified in
the late 1940s as the West responded to the ideological loss of China to communism and the very real loss of atom
bomb exclusivity to the Soviets. The first few years of the 1950s heated up again with the Korean conflict.
On the home front, World War II– related science that was now declassified, together with America’s factories now
turned to peace, shaped a postwar economic boom. Driven by an unprecedented baby boom, the era’s mass
consumerism focused on housing, appliances, automobiles, and luxury goods. Technologies applied to civilian life
included silicone products, microwave ovens, radar, plastics, nylon stockings, long-playing vinyl records, and
computing devices. New medicines abounded, driven by new research possibilities and the momentum of the
previous decade’s “Antibiotic Era.” A wave of government spending was spawned by two seminal influences: a
comprehensive new federal science and technology policy and the anti-“Red” sentiment that dollars spent for
science were dollars spent for democracy.
While improved mechanization streamlined production in drug factories, the DNA era dawned. James Watson and
Francis Crickdetermined the structure of the genetic material in 1953. Prescription and nonprescription drugs were
legally distinguished from one another for the first time in the United States as the pharmaceutical industry matured.
Human cell culture and radioimmunoassays developed as key research technologies; protein sequencing and
synthesis burgeoned, promising the development of protein drugs.
In part because of Cold War politics, in part because the world was becoming a smaller place, global health issues
took center stage. Fast foods and food additives became commonplace in the West. “The Pill” was developed and
first tested in Puerto Rico. Ultrasound was adapted to fetal monitoring. Gas chromatography (GC), mass
spectrometry, and polyacrylamide gel electrophoresis began transforming drug research, as did the growth of the
National Institutes of Health (NIH) and the National Science Foundation (NSF). The foundations of modern
immunology were laid as the pharmaceutical industry moved ever forward in mass-marketing through radio and the
still-novel format of television.
But above all, through the lens of this time in the Western world, was the heroic-scientist image of Jonas Salk,
savior of children through the conquest of polio via vaccines.
Antibiotics redux
The phenomenal success of antibiotics in the 1940s spurred the continuing pursuit of more and better antibacterial
compounds from a variety of sources, especially from natural products in soils and synthetic modifications of
compounds discovered earlier. In 1950, the antibiotic nystatin was isolated from Streptomyces noursei obtained from soil in Virginia. In 1952, erythromycin was first isolated from S. erythreus from soil in the Philippines. Other antibiotics included novobiocin (1955), isolated from S. spheroides from Vermont; vancomycin (1956) from S. orientalis from soils in Borneo and Indiana; and kanamycin (1957) from S. kanamyceticus from Japan. The search
for antibiotic compounds also led researchers in Great Britain to discover, in 1957, an animal glycoprotein
(interferon) with antiviral activity.
Not only did the development of antibiotics that began in the 1940s lead to the control of bacterial infections, it also
permitted remarkable breakthroughs in the growth of tissue culture in the 1950s. These breakthroughs enabled the
growth of polio and other viruses in animal cell cultures rather than in whole animals, and permitted a host of
sophisticated physiological studies that had never before been possible. Scientists were familiar with the concepts of
tissue culture since the first decades of the century, but routine application was still too difficult. After antibiotics
were discovered, such research no longer required an “artist in biological technique” to maintain the requisite sterile
conditions in the isolation, maintenance, and use of animal cells, according to virus researcher Kingsley F. Sanders.
In 1957, he noted that the use of antibiotics in tissue culture made the process so easy that “even an amateur in his
kitchen can do it.”
Funding medicine
In the 1950s, general science research, particularly biological research, expanded in the United States to a great
extent because of the influence of Vannevar Bush, presidential science advisor during and immediately after World
War II. His model, presented in the 1945 report to the President, Science: The Endless Frontier, set the stage for the
next 50 years of science funding. Bush articulated a linear model of science in which basic research leads to applied
uses. He insisted that science and government should continue the partnership forged in the 1940s with the
Manhattan Project (in which he was a key participant) and other war-related research.
As early as 1946, Bush argued for creating a national science funding body. The heating up of the Cold War, as much as anything else, precipitated the 1950 implementation of his idea in the form of the National Science Foundation—a major funnel for government funding of basic research, primarily for the university sector. It was a federal version of the phenomenally successful Rockefeller Foundation. The new foundation had a Division of Biological and Medical Sciences, but its mission was limited to supporting basic research so that it wouldn’t compete with the more clinically oriented research of the NIH.
The NIH rode high throughout the 1950s, with Congress regularly adding $8 million to $15 million to the NIH budget proposed by the first Eisenhower administration. By 1956, the NIH budget had risen to almost $100 million. By the end of the decade, the NIH was supporting some 10,000 research projects at 200 universities and medical schools at a cost of $250 million.
Other areas of government also expanded basic medical research under the Bush vision. In 1950, for example, the Atomic Energy Commission received a $5 million allocation from Congress specifically to relate atomic research to cancer treatment. In this same vein, in 1956, Oak Ridge National Laboratory established a medical instruments group to help promote the development of technology for disease diagnostics and treatment that would lead, in conjunction with advances in radioisotope technology, to a new era of physiologically driven medicine.
Part of this research funding was under the auspices of the Atoms for Peace program and led to the proliferation of human experiments using radioactive isotopes, often in a manner that would horrify a later generation of Americans with its cavalier disregard for participants’ rights.
Science funding, including medical research, received an additional boost with the 1957 launch of the first orbital satellite. The Soviet sputnik capped the era with a wave of science and technology fervor in industry, government, and even the public schools. The perceived “science gap” between the United States and the Soviet Union led to the 1958 National Defense Education Act. The act continued the momentum of government-led education that started with the GI Bill to provide a new, highly trained, and competent workforce that would transform industry. This focus on the importance of technology fostered increased reliance on mechanized mass-production techniques. During World War II, the pharmaceutical industry had learned its lesson—that bigger was better in its manufacturing methods—as it responded to the high demand for penicillin.
Private funds were also increasingly available throughout the 1950s—and not just from philanthropic institutions. The American Cancer Society and the National Foundation for Infantile Paralysis were two of the largest public disease advocacy groups that collected money from the general public and directed the significant funds to scientific research. This link between the public and scientific research created, in some small fashion, a sense of investment in curing disease, just as investing in savings bonds and stamps in the previous decade had created a sense of helping to win World War II.
BIOGRAPHY: A computer pioneer
Dorothy Crowfoot Hodgkin (1910–1994) is the only woman Nobel laureate from the United Kingdom. She earned the 1964 Nobel Prize in Chemistry for her solution to the molecular structure of penicillin and vitamin B12. She pointed to a book by Sir William Bragg, Old Trades and New Knowledge (1925), as the beginning of her interest in the use of X-rays to explore many of the questions left unanswered by chemistry—such as the structure of biological substances. After studying at Somerville College (Oxford, U.K.), Hodgkin was introduced to J. D. Bernal, a pioneer X-ray crystallographer, in 1932, and later became his assistant.
Hodgkin’s work was unique for its technical brilliance and medical importance, but also for its use, at every step, of computing machines of various degrees of sophistication. The use of computing machines became essential because crystallographic analysis required adding a series of numbers derived from the positions and intensities of reflected X-rays combined with some idea of their phases. With small molecules, the numbers were not too difficult to manage, but with large molecules, the numbers to be summed became unmanageable. The structure of vitamin B12 was incredibly complex, and by 1948, chemists had deciphered only half of it. Hodgkin and her group collected data on B12 for six years and took 2500 pictures of its crystals. Using an early computer, she and her colleagues at UCLA announced the complete structure in 1956.
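The "adding of a series of numbers" described in the Hodgkin sidebar is, at heart, a Fourier synthesis: the electron density at each point in the crystal is a sum, over all measured reflections, of amplitude-times-cosine terms. The toy one-dimensional sketch below uses invented reflection data (three reflections, phases set to zero) rather than anything from Hodgkin's work, but it suggests why thousands of reflections evaluated over a three-dimensional grid pushed the arithmetic beyond hand calculation.

    import math

    def electron_density_1d(reflections, n_points=12):
        """Toy 1-D Fourier synthesis: rho(x) ~ sum over h of |F_h| * cos(2*pi*h*x - phi_h).
        `reflections` is a list of (h, amplitude, phase) tuples; amplitudes come from
        measured spot intensities, while phases must be inferred separately."""
        density = []
        for i in range(n_points):
            x = i / n_points  # fractional coordinate along the unit cell
            rho = sum(F * math.cos(2 * math.pi * h * x - phi) for h, F, phi in reflections)
            density.append(round(rho, 2))
        return density

    # Invented data for a single "atom" sitting at x = 0 in the unit cell.
    toy_reflections = [(1, 10.0, 0.0), (2, 6.0, 0.0), (3, 2.5, 0.0)]
    print(electron_density_1d(toy_reflections))

For vitamin B12, every one of thousands of reflections contributes such a term at every grid point in three dimensions, which is the arithmetic the early computers were recruited to perform.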
The war on polio
Perhaps the most meaningful medical story to people of the time was that of Jonas Salk and his “conquest of polio.”
Salk biographer Richard Carter described the response to the April 12, 1955, vaccine announcement: “More than a
scientific achievement, the vaccine was a folk victory, an occasion for pride and jubilation…. People observed
moments of silence, rang bells, honked horns, … closed their schools or convoked fervid assemblies therein, drank
toasts, hugged children, attended church.” The public felt a strong tie to Salk’s research in part because he was
funded by the National Foundation for Infantile Paralysis. Since 1938, the organization’s March of Dimes had
collected small change from the general public to fund polio research. The group managed to raise more money than
was collected for heart disease or even cancer.
The Salk vaccine relied on the new technology of growing viruses in cell cultures, specifically in monkey kidney
cells (first available in 1949). Later, the human HeLa cell line was used as well. The techniques were developed by
John F. Enders (Harvard Medical School), Thomas H. Weller (Children’s Medical Center, Boston), and Frederick C.
Robbins (Case Western Reserve University, Cleveland), who received the 1954 Nobel Prize in Physiology or
Medicine for their achievement.
Salk began preliminary testing of his polio vaccine in 1952, with a massive field trial in the United States in 1954.
According to Richard Carter, a May 1954 Gallup Poll found that “more Americans were aware of the polio field
trial than knew the full name of the President of the United States.” Salk’s vaccine was a killed-virus vaccine that
was capable of causing the disease only when mismanaged in production. This unfortunately happened within two
weeks of the vaccine’s release. The CDC Poliomyelitis Surveillance Unit was immediately established; the
popularly known “disease detectives” tracked down the problem almost immediately, and the “guilty” vaccine lot was withdrawn. It turned out that Cutter Laboratories had released a batch of vaccine containing live virus that tragically resulted in at least 260 cases of vaccine-induced polio.
Ironically, these safety problems helped promote an alternative vaccine that used live rather than killed virus. In 1957, the attenuated oral polio vaccine was finally developed by Albert Sabin and became the basis for mass vaccinations in the 1960s. The live vaccine can cause active polio in a small percentage of those inoculated, primarily those with compromised immune systems, and is also a danger to immunocompromised individuals who come into early contact with the feces of vaccinated individuals. In its favor, it provides longer lasting immunity and protection against gastrointestinal reinfection, eliminating the reservoir of polio in the population.
The debate that raged in the 1950s over Salk versus Sabin (fueled at the time by a history of scientific disputes between the two men) continues today: Some countries primarily use the injected vaccine, others use the oral, and still others use one or the other, depending on the individual patient.
SOCIETY: Rx for prescriptions
The 1951 Humphrey–Durham amendments to the U.S. Federal Food, Drug, and Cosmetic Act of 1938 fully differentiated two categories of drugs: prescription and nonprescription, referred to as “legend” and “OTC” (over-the-counter), respectively. The great expansion of the prescription-only category shifted the doctor–patient relationship more toward the giving out and receipt of prescriptions. Not unexpectedly, this also dramatically changed the relationship between doctors and drug companies in the United States throughout the 1950s. Prescription drug advertising had to be directed at doctors rather than patients, which spurred pharmaceutical companies to aggressively lobby doctors to prescribe their drugs.
In 1955, the AMA discontinued its drug-testing program, claiming in part that it had become an onerous burden. Critics claimed it was because the organization wanted to increase advertising revenues by removing conflict-of-interest restrictions preventing advertising in AMA journals. In 1959, the AMA demonstrated its support of the drug industry in its fight against the Kefauver–Harris Bill, which was intended to regulate drug development and marketing. The bill languished until 1962, when the thalidomide tragedy and powerful public pressures led to its passage.
Instruments and assays
The 1950s saw a wave of new instrumentation, some of which, although not originally used for medical purposes, was eventually used in the medical field. In 1951, image-analyzing microscopy began. By 1952, thin sectioning and fixation methods were being perfected for electron microscopy of intracellular structures, especially mitochondria. In 1953, the first successful open-heart surgery was performed using the heart–lung machine developed by John H. Gibbon Jr. in Philadelphia.
Of particular value to medical microbiology and ultimately to the development of biotechnology was the production of an automated bacterial colony counter. This type of research was first commissioned by the U.S. Army Chemical Corps. Then the Office of Naval Research and the NIH gave a significant grant for the development of the Coulter counter, commercially introduced as the Model A.
A.J.P. Martin in Britain developed gas–liquid partition chromatography in 1952. The first commercial devices became available three years later, providing a powerful new technology for chemical analysis. In 1954, Texas Instruments introduced silicon transistors—a technology encompassing everything from transistorized analytical instruments to improved computers and, for the mass market, miniaturized radios.
The principle for electromagnetic microbalances was developed near the middle of the decade, and a prototype CT scanner was unveiled. In 1958, amniocentesis was developed, and
Scottish physician Ian Donald pioneered the use of ultrasound for diagnostics and therapeutics. Radiometer
micro pH electrodes were developed by Danish chemists for bedside blood analysis.
In a further improvement in computing technology, Jack Kilby at Texas Instruments developed the integrated circuit
in 1958.
In 1959, the critical technique of polyacrylamide gel electrophoresis (PAGE) was in place, making much of the
coming biotechnological analysis of nucleic acids and proteins feasible.
Strides in the use of atomic energy continued apace with heavy government funding. In 1951, Brookhaven National
Laboratory opened its first hospital devoted to nuclear medicine, followed seven years later by a Medical Research
Center dedicated to the quest for new technologies and instruments. By 1959, the Brookhaven Medical Research
Reactor was inaugurated, making medical isotopes significantly cheaper and more available for a variety of research
and therapeutic purposes.
In one of the most significant breakthroughs in using isotopes for research purposes, in 1952, Rosalyn Sussman
Yalow, working at the Veterans Hospital in the Bronx in association with Solomon A. Berson, developed the
radioimmunoassay (RIA) for detecting and following antibodies and other proteins and hormones in the body.
Physiology explodes
The development of new instruments, radioisotopes, and assay techniques found rapid application in the realm of
medicine as research into general physiology and therapeutics prospered. In 1950, for example, Konrad Emil Bloch
at Harvard University used carbon-13 and carbon-14 as tracers in cholesterol buildup in the body. Also in 1950,
Albert Claude of the Université Catholique de Louvain in Belgium discovered the endoplasmic reticulum using
electron microscopy. That same year, influenza type C was discovered.
New compounds and structures were identified in the human body throughout the decade. In 1950, GABA (gamma-aminobutyric acid) was identified in the brain. Soon after that, Italian biologist Rita Levi-Montalcini demonstrated the existence of nerve growth factor. In Germany, F.F.K. Lynen isolated the critical enzyme cofactor, acetyl-
CoA, in 1955. Human growth hormone was isolated for the first time in 1956. That same year, William C. Boyd of
the Boston University Medical School identified 13 “races” of humans based on blood groups.
Breakthroughs were made that ultimately found their way into the development of biotechnology. By 1952, Robert
Briggs and Thomas King, developmental biologists at the Institute for Cancer Research in Philadelphia, successfully
transplanted frog nuclei from one egg to another—the ultimate forerunner of modern cloning techniques. Of
tremendous significance to the concepts of gene therapy and specific drug targeting, sickle cell anemia was shown to
be caused by one amino acid difference between normal and sickle hemoglobin (1956–1958). Although from
today’s perspective it seems to have occurred surprisingly late, in 1956 the human chromosome number was finally
revised from the 1898 estimate of 24 pairs to the correct 23 pairs. By 1959, examination of chromosome
abnormalities in shape and number had become an important diagnostic technique. That year, it was determined that
Down’s syndrome patients had 47 chromosomes instead of 46.
As a forerunner of the rapid development of immunological sciences, in 1959 Australian virologist Frank
Macfarlane Burnet proposed his clonal selection theory of antibody production, which stated that antibodies were
selected and amplified from preexisting rather than instructionally designed templates.
A rash of new drugs
New knowledge (and, as always, occasional serendipity) led to new drugs. The decade began with the Mayo Clinic’s
discovery of cortisone in 1950—a tremendous boon to the treatment of arthritis. But more importantly, it saw the
first effective remedy for tuberculosis. In 1950, British physician Austin Bradford Hill demonstrated that a
combination of streptomycin and para-aminosalicylic acid (PAS) could cure the disease, although the toxicity of
streptomycin was still a problem. By 1951, an even more potent antituberculosis drug was developed simultaneously
and independently by the Squibb Co. and Hoffmann-LaRoche. Purportedly after the death of more than 50,000 mice
(part of a new rapid screening method developed by Squibb to replace the proverbial guinea pigs as test animals)
and the examination of more than 5000 compounds, isonicotinic acid hydrazide proved able to protect against a
lethal dose of tubercle bacteria. It was marketed ultimately as isoniazid and proved especially effective in mixed
dosage with streptomycin or PAS.
In 1951, monoamine oxidase (MAO) inhibitors were introduced to treat psychosis. In 1952, reserpine was isolated
from rauwolfia and eventually was used for treating essential hypertension. But in 1953, the rauwolfia alkaloid was
used as the first of the tranquilizer drugs. The source plant came from India, where it had long been used as a folk
medicine. The thiazide drugs were also developed in this period as diuretics for treating high blood pressure. In
1956, halothane was introduced as a general anesthetic. In 1954, the highly touted chlorpromazine (Thorazine) was
approved as an antipsychotic in the United States. It had started as an allergy drug developed by the French chemical
firm Rhône-Poulenc, and it was noticed to have “slowed down” bodily processes.
Also in 1954, the FDA approved BHA (butylated hydroxyanisole) as a food preservative; coincidentally,
McDonald’s was franchised that same year. It soon became the largest “fast food” chain. Although not really a new
“drug” (despite numerous fast food “addicts”), the arrival and popularity of national fast food chains (and ready-
made meals such as the new TV dinners in the supermarket) were the beginning of a massive change in public
nutrition and thus, public health.
Perhaps the most dramatic change in the popularization of drugs came with the 1955 marketing of meprobamate
(first developed by Czech scientist Frank A. Berger) as Miltown (by Wallace Laboratories) and Equanil (by Wyeth).
This was the first of the major tranquilizers or anti-anxiety compounds that set the stage for the 1960s “drug era.”
The drug was so popular that it became iconic in American life. (The most popular TV comedian of the time once
referred to himself as “Miltown” Berle.) Unfortunately, meprobamate also proved addictive.
In 1957, British researcher Alick Isaacs and Jean Lindenmann of the National Institute for Medical Research, Mill Hill,
London, discovered interferon—a naturally occurring antiviral protein, although not until the 1970s (with the advent
of gene-cloning technology) would it become routinely available for drug use. In 1958, a saccharin-based artificial
sweetener was introduced to the American public. That year also marked the beginning of the thalidomide tragedy
(in which the use of a new tranquilizer in pregnant women caused severe birth defects), although it would not
become apparent until the 1960s. In 1959, Haldol (haloperidol) was first synthesized for treating psychotic
disorders.
Blood products also became important therapeutics in this decade, in large part because of the 1950 development of
methods for fractionating blood plasma by Edwin J. Cohn and colleagues. This allowed the production of numerous
blood-based drugs, including factor X (1956), a protein common to both the intrinsic and extrinsic pathways of
blood clotting, and factor VIII (1957), a blood-clotting protein used for treating hemophilia.
The birth of birth control
Perhaps no contribution of chemistry in the second half of the 20th century had a greater impact on social customs
than the development of oral contraceptives. Several people were important in its development—among them
Margaret Sanger, Katherine McCormick, Russell Marker, Gregory Pincus, and Carl Djerassi.
Sanger was a trained nurse who was a supporter of radical, left-wing causes. McCormick was the daughter-in-law of
Cyrus McCormick, founder of International Harvester, whose fortune she inherited when her husband died. Both
were determined advocates of birth control as the means of solving the world’s overpopulation problem. Pincus (who
founded the Worcester Foundation for Experimental Biology) was a physiologist whose research interests focused
on the sexual physiology of rabbits. He managed to fertilize rabbit eggs in a test tube and got the resulting embryos
to grow for a short time. The feat earned him considerable notoriety, and he continued to gain a reputation for his
work in mammalian reproductive biology.
Sanger and McCormick approached Pincus and asked him to produce a physiological contraceptive. He agreed to
the challenge, and McCormick agreed to fund the project. Pincus was certain that the key was the use of a female
sex hormone such as progesterone. It was known that progesterone prevented ovulation and thus was a pregnancy-
preventing hormone. The problem was finding suitable, inexpensive sources of the scarce compound to do the
necessary research. Enter American chemist Russell Marker. Marker’s research centered on converting sapogenin
steroids found in plants into progesterone. His source for the sapogenins was a yam grown in Mexico. Marker and
colleagues formed a company (Syntex) to produce progesterone. In 1945, he left the company over financial
disputes and destroyed his notes and records. However, a young scientist hired by Syntex in 1949 ultimately
figured prominently in further development of “the Pill.”
The new hire, Djerassi, first worked on the synthesis of cortisone from diosgenin. He later turned his attention to synthesizing an “improved” progesterone, one that could be taken orally. In 1951, his group developed a progesterone-like compound called norethindrone. Pincus had been experimenting with the use of progesterone in rabbits to prevent fertility. He ran into an old acquaintance in 1952, John Rock, a gynecologist, who had been using progesterone to enhance fertility in patients who were unable to conceive. Rock theorized that if ovulation were turned off for a short time, the reproductive system would rebound. Rock had essentially proved in humans that progesterone did prevent ovulation. Once Pincus and Rock learned of norethindrone, the stage was set for wider clinical trials that eventually led to FDA approval of it in 1960 as an oral contraceptive. However, many groups opposed this approval on moral, ethical, legal, and religious grounds. Despite such opposition, the Pill was widely used and came to have a profound impact on society.
TECHNOLOGY: HeLa cells
Henrietta Lacks moved from Virginia to Baltimore in 1943. She died eight years later, but her cells continued to live, sparking a controversy that is still alive and well today. In 1951, Lacks was diagnosed with a malignant cervical tumor at Johns Hopkins School of Medicine. Before radium treatment was applied, a young resident took a sample of the tumor and passed it along to George Gey, head of tissue culture research at Hopkins. His laboratory had been searching for the tool needed to study cancer: a line of human cells that could live outside the human body. Growing these cells would allow possible cures to be tested before using them in humans. They found this tool with Lacks’ cells, which grew at an amazing pace. Gey called them HeLa cells. These cells were instrumental in growing the poliovirus and helped create the polio vaccine. Gey shared these cells with his colleagues worldwide, and they became a staple of in vitro human physiological research.
However, Lacks’ family was completely unaware of this use of her cells until 1974, when one of her daughters-in-law found out by chance through a family friend that the friend had been working with cells from a Henrietta Lacks. She contacted Johns Hopkins to see what was in Henrietta’s files. However, Hopkins never got back in touch with the family.
The two issues raised are the question of consent and whether the person’s family is due anything morally or legally if what comes from the cells is of commercial value. Although consent is required now, what happened at Johns Hopkins in the 1950s was not at all uncommon. The other issue is still unresolved. One could say that the field of biomedical ethics was born the day Henrietta Lacks walked through the doors of Johns Hopkins.
DNA et al.
Nucleic acid chemistry and biology were especially fruitful in the 1950s as not only the structure of DNA but also the steps in its replication, transcription, and translation were revealed. In 1952, Rosalind E. Franklin at King’s College in England began producing the X-ray diffraction images that were ultimately used by James Watson and Francis Crick in their elucidation of the structure of DNA published in Nature in 1953.
Two years later, Severo Ochoa at New York University School of Medicine discovered polynucleotide phosphorylase, an RNA-degrading enzyme. In 1956, electron microscopy was used to determine that the cellular structures called microsomes contained RNA (they were thus renamed ribosomes). That same year, Arthur Kornberg at Washington University Medical School (St. Louis, MO) discovered DNA polymerase. Soon after that, DNA replication as a semiconservative process was worked out separately by autoradiography in 1957 and then by using density centrifugation in 1958.
With the discovery of transfer RNA (tRNA) in 1957 by Mahlon Bush Hoagland at Harvard Medical School, all of the pieces were in place for Francis Crick to postulate in 1958 the “central dogma” of DNA—that genetic information is maintained and transferred in a one-way process, moving from nucleic acids to proteins. The path was set for the elucidation of the genetic code the following decade. On a related note, in 1958, bacterial transduction was discovered by Joshua Lederberg at the University of Wisconsin—a critical step toward future genetic engineering.
Probing proteins
Behind the hoopla surrounding the discovery of the structure of DNA, the blossoming of protein chemistry in the 1950s is often ignored. Fundamental breakthroughs occurred in the analysis of protein structure and the elucidation of protein functions.
In the field of nutrition, in 1950 the protein-building role of the essential amino acids was demonstrated. Linus Pauling, at the California Institute of Technology, proposed that protein structures are based on a primary alpha-helix (a structure that served as inspiration for helical models of DNA). Frederick Sanger at the Medical Research Council (MRC) Unit for
Molecular Biology at Cambridge and Pehr Victor Edman developed methods for identifying N-terminal peptide
residues, an important breakthrough in improved protein sequencing. In 1952, Sanger used paper chromatography to
sequence the amino acids in insulin. In 1953, Max Perutz and John Kendrew, cofounders of the MRC Unit for
Molecular Biology, determined the structure of hemoglobin using X-ray diffraction.
In 1954, Vincent du Vigneaud at Cornell University synthesized the hormone oxytocin—the first naturally occurring protein made with the exact makeup it has in the body. The same year, ribosomes were identified as the site of protein synthesis. In 1956, the three-dimensional structure of a protein was linked to the sequence of its amino acids, so that by 1957, John Kendrew was able to solve the first three-dimensional structure of a protein (myoglobin); this was followed in 1959 with Max Perutz’s determination of the three-dimensional structure of hemoglobin. Ultimately, linking protein sequences with subsequent structures permitted development of structure–activity models, which allowed scientists to determine the nature of ligand binding sites. These developments proved critical to functional analysis in basic physiological research and to drug discovery, through specific targeting.
On to the Sixties
By the end of the 1950s, all of pharmaceutical science had been transformed by a concatenation of new instruments and new technologies—from GCs to X-ray diffraction, from computers to tissue culture—coupled, perhaps most importantly, to a new understanding of the way things (meaning cells, meaning bodies) worked. The understanding of DNA’s structure and function—how proteins are designed and how they can cause disease—provided windows of opportunity for drug development that had never before been possible. It was a paradigm shift toward physiology-based medicine, born with the hormone and vitamin work in the 1920s and 1930s, catalyzed by the excitement of the antibiotic era of the 1940s, that continued throughout the rest of the century with the full-blown development of biotechnology-based medicine. The decade that began by randomly searching for antibiotics in dirt ended with all the tools in place to begin searching for drugs with a knowledge of where they should fit in the chemical world of cells, proteins, and DNA.
Suggested reading
• Breakthrough: The Saga of Jonas Salk, Carter, R. (Trident Press: New York, 1966)
• Dates in Medicine: A Chronological Record of Medical Progress Over Three Millennia, Sebastian, A., Ed. (Parthenon: New York, 2000)
• The Greatest Benefit to Mankind: A Medical History of Humanity, Porter, R. (W. W. Norton: New York, 1997)
• Readings in American Health Care: Current Issues in Socio-Historical Perspective, Rothstein, W. G., Ed. (University of Wisconsin Press: Madison, 1995)
• The Social Transformation of American Medicine, Starr, P. (Basic Books: New York, 1982)
• Two Centuries of American Medicine: 1776–1976, Bordley, J. III, and Harvey, A. M. (W. B. Saunders: Philadelphia, 1976)
Anodynes & estrogens
Introduction Mention the Sixties and there are varied “hot-button” responses—“JFK, LBJ, civil rights, and
Vietnam” or “sex, drugs, and rock ’n’ roll.” But it was all of a piece. Politics and culture mixed like the colors of a
badly tie-dyed t-shirt. But in this narrative, drugs are the hallmark of the era—making them, taking them, and
dealing with their good and bad consequences. Ever after this era, the world would be continually conscious of pills,
pills, pills—for life, for leisure, and for love. In many ways, the Sixties was the Pharmaceutical Decade of the
Pharmaceutical Century.
A plethora of new drugs was suddenly available: the Pill was first marketed; Valium and Librium debuted to soothe
the nerves of housewives and businessmen; blood-pressure drugs and other heart-helping medications were
developed. Another emblem of the 1960s was the development of worldwide drug abuse, including the
popularization of psychotropic drugs such as LSD by “gurus” like Timothy Leary. The social expansion of drugs for
use and abuse in the 1960s forever changed not only the nature of medicine but also the politics of nations.
The technology of drug discovery, analysis, and manufacture also proliferated. New forms of chromatography
became available, including HPLC, capillary GC, GC/MS, and the rapid expansion of thin-layer chromatography
techniques. Proton NMR was developed to analyze complex biomolecules. By the end of the decade, amino acid
analyzers were commonplace, and the ultracentrifuge was fully adapted to biomedical uses. Analytical chemistry
and biology joined as never before in the search for new drugs and analysis of old ones.
And of equal importance, a new progressivism took the stage, especially in the United States, where increasing
demand for access to health care and protection from unsafe and fraudulent medications once more led to an
increased federal presence in the process of drug development, manufacture, and sale.
Popping the Pill
If there was any single thing that foreshadowed the tenor of the decade, one medical advance was paramount: In
1960 the first oral contraceptive was mass-marketed. Sex and drugs were suddenly commingled in a single pill,
awaiting only the ascendance of rock ’n’ roll to stamp the image of the decade forever. Born of the Pill, the sexual
revolution seemed to be in good measure a pharmaceutical one.
The earlier achievements of women’s suffrage and the growing presence of women in the labor force were
somewhat blocked from further development by the demands of pregnancy and child rearing in a male-dominated
society. Feminist historians recount how hopes were suddenly energized by the ability of women to control their
own bodies chemically and thus, socially. Chemical equality, at least for those who could afford it and, in the
beginning, find the right doctors to prescribe it, was at last available.
However, it was not a uniformly smooth process. First and foremost, despite its popularity, the technology for
tinkering with women’s reproductive systems had not been fully worked out. In Britain in 1961, there were
problems with birth control pills containing excess estrogen. Similar problems also occurred in the United States. Dosage
changes were required for many women; side effects debilitated a few.
But still the sexual revolution marched on, as was documented in the 1966 Masters and Johnson report Human
Sexual Response, which showed a transformation of female sexuality, new freedoms, and new attitudes in both
sexes. Furthermore, technology was not done fiddling with reproduction by any means. In 1969, the first test-tube
fertilization was performed.
Valium of the dolls
In an era that would be far from sedate, the demand for sedatives was profound, and the drug marketplace responded
rapidly. Although Miltown (meprobamate), the first of the major “tranks,” was called the Wonder Drug of 1954,
sedatives weren’t widely used until 1961, when Librium (a benzodiazepine) was discovered and marketed as a
treatment for tension. Librium proved a phenomenal success. Then Valium (diazepam), discovered in 1960, was
marketed by Roche Laboratory in 1963 and rapidly became the most prescribed drug in history.
These drugs were touted to the general population and mass-marketed and prescribed by doctors with what many
claimed was blithe abandon. While the youth of America were smoking joints and tripping on acid, their parents’
generation of businessmen and housewives were downing an unprecedented number of sedatives. According to the
Canadian Government Commission of Inquiry into the Nonmedical Use of Drugs (1972), “In 1965 in the USA,
some 58 million new prescriptions and 108 million refills were written for psychotropes (sedatives, tranquilizers,
and stimulants), and these 166 million prescriptions accounted for 14% of the total prescriptions of all kinds written
in the United States.” Physical and psychological addiction followed for many. Drug taking became the subject of
books and movies. In the 1966 runaway best-seller Valley of the Dolls by Jacqueline Susann, the “dolls” were the
pills popped by glamorous upper-class women in California. Eventually, the pills trapped them in a world of drug
dependence that contributed to ruining their lives.
Drug wars
It was only a matter of time before the intensive testing of LSD by the military in the 1950s and early 1960s—as
part of the CIA’s “Project MKULTRA”—spread into the consciousness of the civilian population.
By 1966, the chairman of the New Jersey Narcotic Drug Study Commission declared that LSD was “the greatest
threat facing the country today… more dangerous than the Vietnam War.” In the United States, at least at the federal
level, the battle against hallucinogens and marijuana use was as intense as, if not more intense than, the fight against
narcotic drugs.
According to some liberal critics, this was because these “recreational” drugs were a problem in the middle and the upper classes, whereas narcotic addiction was the purview of the poor. From the start of the decade, in the United States and around the world, regions with large populations of urban poor were more concerned about the growing problem of narcotic drug addiction. In 1961, with the passage of the UN Single Convention on Narcotic Drugs, signatory nations agreed to processes for mandatory commitment of drug users to nursing homes. In 1967, New York State established a Narcotics Addiction Control Program that, following the UN Convention, empowered judges to commit addicts into compulsory treatment for up to five years. The program cost $400 million over just three years but was hailed by Governor Nelson Rockefeller as the “start of an unending war.” Such was the measure of the authorities’ concern with seemingly out-of-control drug abuse. The blatant narcotics addictions of many rock stars and other celebrities simultaneously horrified some Americans and glamorized the use of “hard” drugs among others, particularly young people.
BIOGRAPHY: FDA heroine
In 1961, after developing concerns in her first review case, Frances Kelsey, M.D., held up FDA approval for a new sedative, thalidomide. Then, birth defect tragedies linked to thalidomide came to light in Canada and Europe. On July 15, 1962, the Washington Post ran a story headlined “Heroine of the FDA Keeps Bad Drug Off Market.” Kelsey was soon famed as the woman who had spared her country a nightmare, and President Kennedy awarded her the Distinguished Federal Civilian Service medal. On October 10, 1962, Kennedy signed the Kefauver–Harris Drug Amendment, which required that drug companies send reports of bad reactions to the FDA and that drug advertising mention harmful as well as beneficial effects.
By 1968, the Food and Drug Administration (FDA) Bureau of Drug Abuse Control and the Treasury Department’s Bureau of Narcotics were fused and transferred to the Department of Justice to form the Bureau of Narcotics and Dangerous Drugs in a direct attempt to consolidate the policing of traffic in illegal drugs. Also in 1968, Britain passed the Dangerous Drug Act to regulate opiates. As these efforts to stop drug use proliferated, technology
for mass-producing many of these same drugs continued to improve in factories around the world. By 1968, Canada
was producing nearly 56 million doses of amphetamines; by 1969, the United States was producing more than
800,000 pounds of barbiturates.
Forging a great society
The 1960 election of a Democratic administration in the United States created a new demand for government
intervention in broader areas of society. John F. Kennedy and his successor, Lyndon B. Johnson, expanded federal
intervention in a host of previously unregulated areas, both civil and economic, including food and medicine.
Around the world, a new social agenda demanding rights for minorities (especially apparent in the civil rights struggles in the United States) and women (made possible by freedoms associated with the Pill) fostered a new focus on protecting individuals and ending, or at least ameliorating, some of the damage done to hitherto exploited social classes. Health and medicine, a prime example, led to what many disparagingly called “government paternalism.” This same liberal agenda, in its darker moments, moved to protect less-developed nations from the perceived global communist threat—hence the Vietnam war. Ultimately, there was neither the money nor the will to finance internal social and external political agendas. General inflation resulted, with specific increases in the cost of medical care.
SOCIETY: The “great” paternalism
Starting with the Kennedy administration in 1961, a new activist government in the United States posited the need for expanding federal intervention in a host of previously unregulated areas, both civil and economic. The necessity of added governmental regulation of food and the healthcare system seemed overwhelmingly apparent. The Kennedy administration’s healthcare plans and the tenets of Johnson’s “Great Society”—which was ultimately responsible for both Medicare and Medicaid—set the stage for increased federal funding for those unable to afford health care. These expanded dollars not only provided impetus for pharmaceutical research, they also created inflationary pressures on health care when coupled with the difficulties of simultaneously funding the Vietnam War.
Perhaps one of the most significant long-term changes in drug development procedures of the era, especially regarding the role of governments, came with a new desire to protect human “guinea pigs.” General advocacy for the poor, women, and minorities led to a reexamination of the role of the paternalistic, (generally) white male clinicians in the morally repugnant treatment of human subjects throughout the century, before and after the Nuremberg trials of Nazi doctors. It was a profound catalyst for the modern bioethics movement when health groups worldwide established new regulations regarding informed consent and human experimentation in response to specific outrages.
In 1962, the Kefauver–Harris amendments to the Federal Food,
Drug, and Cosmetic Act of 1938 were passed to expand the FDA’s
control over the pharma and food industries. The Kefauver amendments were originally the outgrowth of Senate
hearings begun in 1959 to examine the conduct of pharmaceutical companies. According to testimony during those
hearings, it was common practice for these companies “to provide experimental drugs whose safety and efficacy had
not been established to physicians who were then paid to collect data on their patients taking these drugs. Physicians
throughout the country prescribed these drugs to patients without their control or consent as part of this loosely
controlled research.” That the amendments were not passed until 1962 was in part because of the profound battle
against allowing additional government control of the industry. The 1958 Delaney proviso and the 1960 Color
Additive Amendment led to industry and conservative resentment and complaints that the FDA was gaining too
much power.
However, with the 1961 birth defects tragedy involving the popular European sedative thalidomide (prevented from
being marketed in the United States by FDA researcher Frances Kelsey), public demand for greater protections
against experimental agents was overwhelming. Thalidomide had been prescribed to treat morning sickness in
countless pregnant women in Europe and Canada since 1957, but its connection to missing and deformed limbs in
newborns whose mothers had used it was not realized until the early 1960s. In 1961, televised images of deformed
“thalidomide babies” galvanized support for the FDA, demonstrating the power of this new medium to transform
public opinion, as it had in the 1960 Nixon–Kennedy debates and would again in the Vietnam War years.
Building on the newfound public fears of synthetic substances, Rachel Carson’s 1962 book Silent Spring
precipitated the populist environmental movement. As with the 1958 Delaney clause and the Federal Hazardous
Substances Labeling Act of 1960, it was all part of increased public awareness of the potential dangers of the rapid
proliferation of industrial chemicals and drugs. According to the 1962 amendments to the 1938 Food, Drug, and
Cosmetic Act, new drugs had to be shown to be effective, prior to marketing, by means to be determined by the
FDA. Ultimately, this requirement translated to defined clinical trials. However, Congress still vested excessive faith
in the ethics of physicians by eliminating the need for consent when it was “not feasible” or was deemed not in the
best interest of the patient; these decisions could be made “according to the best judgment of the doctors involved.”
Despite the fact that President Kennedy proclaimed a Consumer Bill of Rights—including the rights to safety and to
be informed—more stringent federal guidelines to protect research subjects were not instituted until 1963. Then the
NIH responded to two egregious cases: At Tulane University, a chimpanzee kidney was unsuccessfully transplanted
into a human with the patient’s consent but without medical review. At the Brooklyn Jewish Chronic Disease
Hospital, in association with the Sloan-Kettering Cancer Research Institute, live cancer cells were injected into
indigent, cancer-free elderly patients.
Particularly important to the public debate was the disclosure of 22 examples of potentially serious ethical violations
found in research published in recent medical journals. The information was presented by Henry Beecher of Harvard
Medical School to science journalists at a 1965 conference sponsored by the Upjohn pharmaceutical company and
was later published in the New England Journal of Medicine.
In 1964, the World Medical Association issued its Declaration of Helsinki, which set standards for clinical research
and demanded that subjects be given informed consent before enrolling in an experiment. By 1966, in the United
States, the requirement for informed consent and peer review of proposed research was written into the guidelines
for Public Health Service–sponsored research on human subjects. These regulations and the debate surrounding
human experimentation continued to evolve and be applied more broadly.
And in an attempt to protect consumers from unintentional waste or fraud, in 1966, wielding its new power from the
Kefauver amendments, the FDA contracted with the National Academy of Sciences/National Research Council to
evaluate the effectiveness of 4000 drugs approved on the basis of safety alone between 1938 and 1962.
The U.S. government faced other health issues with less enthusiasm. In 1964, the Surgeon General’s Report on
Smoking was issued, inspired in part by the demand for a response to the Royal College of Surgeons’ report in
Britain from the year before. Although the Surgeon General’s report led to increased public awareness and mandated
warning labels on tobacco products, significant government tobacco subsidies, tax revenues, smoking, and smoking-
related deaths continued in the United States. The debate and associated litigation go on to this day.
Heart heroics and drugs
The lungs were not the only vital organs to attract research and publicity in the Sixties. From heart transplants and
blood-pressure medications to blood-based drugs, a new kind of “heroic” medicine focused technology on the
bloodstream and its potential for helping or hindering health.
Heart surgery captivated the public with its bravado. In 1960, the transistorized self-contained pacemaker was
introduced. In the 1960s, organ transplantation became routinely possible with the development of the first immune
suppressant drugs. In 1967, the coronary bypass operation was developed by Michael DeBakey; in that same year,
physician (and some would say showman) Christiaan Barnard performed the first heart transplant. In 1968,
angioplasty was introduced for arterial treatment and diagnosis.
But for all the glamour of surgery, chemicals for controlling heart disease provided far more widespread effects. Blood-pressure drugs based on new knowledge of human hormone systems became available for the first time. In 1960, guanethidine (a noradrenaline release inhibitor) was developed for high blood pressure; in rapid succession the first beta-adrenergic blocker appeared in Britain (1962), and alpha-methyldopa, discovered in 1954, was first used clinically in the early 1960s for treating high blood pressure by interfering not with noradrenaline release, but with its synthesis. In 1964, methods were perfected to nourish individuals through the bloodstream. This ability to feed intravenously and provide total caloric intake for debilitated patients forever changed the nature of coma and the ethics of dealing with the dying. The use of blood-thinning and clot-dissolving compounds for heart disease was also pioneered in the early 1960s: Streptokinase and aspirin reduced deaths by 40% when taken within a few hours of a heart attack.
In the late Sixties, as methods of fractionation using centrifugation and filtration improved dramatically, concentrated blood factors became readily available for the first time. Plasmapheresis—centrifuging the red blood cells from plasma and then returning the whole cells to the donor—allowed donations to occur twice weekly instead of every few months. Blood proteins such as “Big-D” Rh antibodies used to immunize women immediately after the birth of their first Rh-positive child, albumin, gamma globulins, blood-typing sera, and various clotting factors such as factor VIII created a booming industry in plasma products. But it also made blood all the more valuable as a commodity.
Abuses increased in collecting, distributing, and manufacturing blood products, as recounted by Douglas Starr in his 1999 book Blood. In 1968, in a move that foreshadowed the AIDS era yet to come, the United States revoked all licenses for consumer sales of whole plasma prepared from multiple donors because of fears that viral hepatitis would spread. Fears were exacerbated by scandalous revelations about the health status (or lack thereof) of many paid blood donors. This donor problem became even more newsworthy in the early 1970s as malnourished street hippies, drug abusers, unhealthy indigents, and prisoners were revealed as sources often contaminating the world blood supply.
TECHNOLOGY: Cracking the code
For all the hoopla over decoding the structure of DNA in the 1950s, it was still a molecule without a mission. How did genes make proteins? How were they regulated? Although the role of mRNA was elucidated in the late 1950s, it wasn’t until 1961 that Marshall Nirenberg and Heinrich Matthaei at the NIH used an artificially created poly-U mRNA in a cell-free system to synthesize the novel protein poly(phenylalanine). By 1965, mix-and-match experiments with other artificial mRNAs established that there was a unique degenerate triplet codon for each amino acid and identified all of these triplet sequences that make up the genetic code.
By 1964, Pamela Abel and T. A. Trautner demonstrated the universality of the genetic code, from bacteria to the “higher” forms of life. The genetic code’s elegance and practical potential inspired Linus Pauling to propose molecular fingerprinting using protein and DNA sequences as a means to identify organisms. By 1969, Pauling’s idea was bearing clinical fruit: Enterobacteria isolates proved classifiable using comparative DNA hybridization. This development was not only a major breakthrough in species analysis, but the prelude to a universe of molecular diagnostics.
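To make the sidebar’s point concrete, here is a minimal, illustrative Python sketch (added for this edition, not from the original article): a small, partial codon table is enough to show how a triplet, degenerate code turns a poly-U message into a chain of phenylalanines, much as Nirenberg and Matthaei observed.
    # Partial codon table for illustration only; the full genetic code has 64 codons.
    CODON_TABLE = {
        "UUU": "Phe", "UUC": "Phe",  # two codons, one amino acid: degeneracy
        "AAA": "Lys",
    }
    def translate(mrna):
        # Read the message three bases at a time, as a triplet code requires.
        return [CODON_TABLE.get(mrna[i:i + 3], "?") for i in range(0, len(mrna) - 2, 3)]
    print(translate("UUUUUUUUU"))  # poly-U mRNA -> ['Phe', 'Phe', 'Phe']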
High tech/new mech
In the 1960s, analytical chemists increasingly turned their attention to drug discovery and analysis and to
fundamental questions of biological significance in physiology and genetics. New technologies were developed, and
previously designed instruments were adapted to biomedical applications.
The development of high-pressure (later known as high-performance) liquid chromatography (HPLC) heralded a
new era of biotechnology and allowed advanced separations of fragile macromolecules in fractions of the time
previously required. The radioimmunoassay, first developed in 1959 by Rosalyn Yalow and Solomon Berson, was
perfected in 1960. Tissue culture advances proliferated, allowing more and better in vitro testing of drugs; and, when
coupled with radiotracer and radioimmunoassay experiments, led to unprecedented breakthroughs in all areas of
mammalian physiology. In 1964, for example, Keith Porter and Thomas F. Roth discovered the first cell membrane
receptor. Further developments in analytical chemistry came as gas chromatography (GC) was first linked with mass
spectrometry (MS), providing a quantum leap in the ability to perform structural analysis of molecules. And
laboratory automation, including primitive robotics, became a powerful trend.
But perhaps the most important breakthrough in tissue culture, and one that created a direct path to the Human
Genome Project, was the invention of somatic-cell hybridization by Mary Weiss and Howard Green in 1965. By
fusing mouse and human cells together via the molecular “glue” of Sendai virus, these researchers and others
quickly developed a series of cell lines containing mostly mouse chromosomes but with different single human ones,
all expressing unique proteins. For the first time, human proteins could be assigned to individual human
chromosomes (and later chromosome fragments) to the degree that gene mapping was finally possible in humans. Of
comparable importance, new improvements in fermentation technology allowed continuous cycling and easy
sterilization along with mass-produced instrumentation.
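For readers unfamiliar with the logic of hybrid-cell gene mapping, the following short Python sketch (an illustration added here, using a purely hypothetical panel of cell lines) shows the reasoning: a human protein is assigned to the chromosome found in the lines that express it and ruled out wherever the protein is absent.
    # Hypothetical panel: each mouse-human hybrid line retains one human chromosome.
    panel = [
        {"line": "A", "human_chromosome": 1, "protein_detected": False},
        {"line": "B", "human_chromosome": 7, "protein_detected": True},
        {"line": "C", "human_chromosome": 12, "protein_detected": False},
        {"line": "D", "human_chromosome": 7, "protein_detected": True},
    ]
    # Chromosomes present in expressing lines are candidates; chromosomes present
    # in non-expressing lines are ruled out.
    candidates = {p["human_chromosome"] for p in panel if p["protein_detected"]}
    excluded = {p["human_chromosome"] for p in panel if not p["protein_detected"]}
    print(candidates - excluded)  # {7}: the gene maps to human chromosome 7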
Fundamental breakthroughs in the etiology of disease transformed the understanding of infection. In 1961, the
varying polio virus receptors were correlated to pathogenicity of known isolates; in 1967, diphtheria toxin’s mode of
action was finally determined and provided the first molecular definition of a bacterial protein virulence factor.
Structural biology proceeded at an enthusiastic pace. In 1960, John Kendrew reported the first high-resolution X-ray
analysis of the three-dimensional structure of a protein—sperm whale myoglobin. In the 1960s, image analyzers
were linked to television screens for the first time, enhancing the use and interpretation of complex images. And in
1967, Max Perutz and Hilary Muirhead built the first high-resolution model of the atomic structure of a protein
(oxyhemoglobin), which promoted a wave of protein structural analysis. Computer systems became increasingly
powerful and quickly indispensable to all laboratory processes, but especially to molecular analysis.
Hardly falling under the category of miscellaneous discoveries, at least in terms of the development of
biotechnology and the pharmaceutical industry, was the development of agarose gel electrophoresis in 1961. This
technique was critical for separating and purifying high molecular weight compounds, especially DNA; in 1963,
to the benefit of a host of laboratory workers, the first film badge dosimeter was introduced in the United Kingdom;
and in 1966, the disc-diffusion standardized test was developed for evaluating antibiotics, which was a boon to the
exploding pharmaceutical development of such compounds.
Dancing with DNA
Finally, the understanding of the structure of DNA and acceptance that it was indeed genetic material yielded
intelligible and practical results.
At the beginning of the decade, researchers A. Tsugita and Heinz Fraenkel-Conrat demonstrated the link between
mutation and a change in the protein produced by the gene. Also in 1960, François Jacob and Jacques Monod
proposed their operon model. This was the birth of gene regulation models, which launched a continuing quest for
gene promoters and triggering agents, such that by 1967, Walter Gilbert and co-workers identified the first gene
control (repressor) substance.
Perhaps most significantly, the prelude to biotechnology was established when restriction enzymes were discovered,
the first cell-free DNA synthesis was accomplished by biochemist Arthur Kornberg in 1961, and the DNA–amino
acid code was deciphered.
Deciphering the genetic code was no minor feat. Finding the mechanism for going from gene to protein was a
combination of brilliant theorizing and technological expertise.
The technology necessary for the dawn of biotechnology proliferated as well. Throughout the 1960s, automated
systems for peptide and nucleic acid analysis became commonplace. In 1964, Bruce Merrifield invented a simplified
technique for protein and nucleic acid synthesis, which was the basis for the first such machines. (In the 1980s, this
technique would be mass-automated for gene synthesis.) And in 1967, the first specific gene transfer was
accomplished; the lac operon was functionally transferred from E. coli to another bacterial species. And, although its
importance was not realized at the time, in 1967, Thermus aquaticus was discovered in a hot spring in Yellowstone
National Park. This thermophilic microbe was the ultimate source of heat-stable Taq polymerase—
the enabling enzyme for modern PCR.
By the late 1960s, not only had the first complete gene been isolated from an organism, but biologists were already
debating the ethics of human and animal genetic engineering.
Generation of drug-takers
In the 1960s, the post–World War II generation of baby boomers entered their teenage years. These were the
children of vitamins, antibiotics, hormones, vaccines, and fortified foods such as Wonder Bread and Tang. The
technology that won the war had also transformed the peace and raised high expectations on all social fronts,
including, and perhaps especially, health. The unparalleled prosperity of the 1950s was largely driven by new
technologies and a profusion of consumer goods. This prosperity, coupled with a new political progressivism,
created a euphoric sense of possibility early in the decade, especially with regard to health care. The prevailing
belief was that medicine would save and society would provide. Although the dream ultimately proved elusive, its
promise permeated the following decades—Medicare, Medicaid, and a host of new regulations were the lingering
aftereffects in government, a generation of drug-takers the result in society at large.
The Sabin oral polio vaccine was approved in the United States in 1960 after trials involving 100 million people
overseas and promised salvation to a new generation from this former scourge. In 1961, the sweetener cyclamate
was introduced in the first low-calorie soft drink, Diet Rite, and it created demand for consumption without cost, or
at least without weight gain. In 1964, a suitable, routine vaccine for measles was developed that was much better
than its predecessor vaccine first produced in 1960. In 1967, a live-virus mumps vaccine was developed. Faith that
childhood diseases could be stamped out grew rapidly. The Surgeon General of the United States even went so far as
to state that we were coming near—for the first time in history—to finally “closing the books on infectious
diseases.”
From today’s perspective, these seem like naive hopes. But they were not without some foundation. Antibiotics had
yet to lose effectiveness, and more were being discovered or synthesized. In 1966, the first antiviral drug,
amantadine-HCl, was licensed in the United States for use against influenza. In 1967, the WHO began efforts to
eradicate smallpox. The rhetoric of nongovernmental antipolio and antituberculosis groups provided additional
reason for optimism.
No wonder a generation of baby boomers was poised to demand a pill or vaccine for any and every ill that affected humankind—pharmaceutical protection from unwanted pregnancies, from mental upset, from disease. Rising expectations were reflected in the science and business arenas, where research was promoted and industry driven to attack a greater variety of human ills with the weapons of science. Technological fixes in the form of pills and chemicals seemed inevitable. How else could the idea of a war on cancer in the next decade be initiated with the same earnestness and optimism as the quest for a man on the moon? The 1969 success of Apollo 11 was the paradigm for the capabilities of technology.
On a darker note, the decade ended with a glimpse of the more frightening aspects of what biomedical technology could do, or at least what militarists wanted it to do. In 1969, the U.S. Department of Defense requested $10 million from Congress to develop a synthetic biological agent to which no natural immunity existed. Funding soon followed under the supervision of the CIA at Fort Detrick, MD.
Ultimately, although a new battery of technologies had become available in the 1960s, including HPLC, GC/MS, and machines to synthesize DNA and proteins, and new knowledge bases were developed—from cracking the genetic code to the discovery of restriction enzymes—the majority of these breakthroughs would not bear fruit until the 1970s and 1980s. The 1960s would instead be remembered primarily for the changes wrought by new pharmaceuticals and new social paradigms. As the decade ended, it was an open question as to whether the coming decade would bring the dawn of massive biological warfare or the rather expected nuclear holocaust (heightened in the world psyche by the Cuban Missile Crisis of 1962). Others speculated that the future would bring massive social collapse arising from the intergenerational breakdown in the West that many blamed on the success of pharmaceutical technology in developing and producing new and dangerous drugs.
Suggested reading
• Pills-A-Go-Go: A Fiendish Investigation into Pill Marketing, Art, History & Consumption, Hogshire, J. (Feral House: Venice, CA, 1999)
• The Greatest Benefit to Mankind: A Medical History of Humanity, Porter, R. (W. W. Norton: New York, 1997)
• Drugs and Narcotics in History, Porter, R.; Teich, M., Eds. (Cambridge University Press: Cambridge, U.K., 1997)
• Readings in American Health Care: Current Issues in Socio-Historical Perspective, Rothstein, W. G., Ed. (University of Wisconsin Press: Madison, WI, 1995)
• Blood: An Epic History of Medicine and Commerce, Starr, D. (Alfred A. Knopf: New York, 1999)
• The Social Transformation of American Medicine, Starr, P. (Basic Books: New York, 1982)
• FDA History on the Web: www.fda.gov/oc/history/default.htm