EUGENICS
AND
GENOCIDE
IN THE
MODERN WORLD:
the
cause of the AIDS
epidemic?
Dr Romesh Senewiratne
(MBBS, Australia, 1983)

CONTENTS:
1. Introduction: Where did the HIV plague come from?
2. Chronology: Immuno-warfare development and testing
3. The Australian Government’s Response to AIDS
4. Foundations of the Eugenics Movement
5. Negative eugenics programs
6. Paranoia about population growth
7. The effect of AIDS on U.N. population estimates
8. Imperial designs in Africa
9. The development of biological warfare in the Cold War
10. Medical wars and the AIDS industry
11. The Macfarlane Burnet Centre on AIDS
12. The Burnet Institute’s International Health Unit
13. AIDS, psychiatry and Glaxo Smith Kline
14. Genetic engineering and experimental chimpanzees
15. Biological warfare in Central Africa?
16. Biological warfare research in Australia
17. Tying it all together
References

Chapter 1
QUESTIONS ABOUT THE ORIGIN OF AIDS
I began an independent, informal investigation into the cause of the epidemic of
HIV infection and AIDS in 1996. The investigation was independent because I
neither sought nor was given funding for my research. It was informal because my
work was not based at a research institution or university. I researched HIV, AIDS
and eugenics in my spare time, motivated by a combination of curiosity and
concern: curiosity about the remarkable coincidences I was noticing between the
history of eugenics and the history of AIDS, and concern about the unfolding
epidemic and about my growing knowledge of biological warfare conducted by the
‘Allies’ during, and after, the Second World War.
The main reason I became interested in AIDS in the mid-1990s was an unusual
epidemiological pattern I noticed in the global epidemic. In America, Europe and
Australia it was affecting, apparently selectively, male homosexuals and, to a lesser
degree, heroin addicts. Yet in the Third World it was a heterosexual disease
affecting men and women equally. In the so-called ‘developing nations’ those
contracting the disease were not, it was reported, drug users.
During my medical studies at the University of Queensland I learned nothing about
AIDS, other than rumours of a new disease that was affecting homosexuals in the
USA, which was being called, at that time, ‘GRID’ – or ‘Gay Related Immune
Deficiency’. I graduated in 1983, the year before Robert Gallo at the US National
Cancer Institute announced his discovery of the virus that caused the disease,
by which time its name had been changed to AIDS – ‘Acquired Immune
Deficiency Syndrome’.
Since 1997 I have been seeking evidence to confirm, or deny, the possibility that
HIV and AIDS are being used for the purpose of genocide. To do this I have had to
look at the entire history of the epidemic – drawing primarily on extensive
publications on HIV and AIDS by a wide range of sources. Not having any
particular expertise in virology, immunology, or genetics, I did not regard myself,
when I began my investigations, as doing “work” as such – I was just interested in
the range of causal theories published in the “alternative media”, and comparing
them with my own – that AIDS is the result of the implementation of a “negative
eugenics program”.
This theory was original – I developed it by observing epidemiological patterns,
with the certain knowledge that disease can be deliberately created, and that
deliberate disease creation by members of the medical profession has occurred,
including frank (if usually covert) biological warfare.
This book presents evidence that the AIDS epidemic was created deliberately, as
part of a covert global depopulation strategy. The motive for this heinous act was
concern about overpopulation in so-called ‘developing nations’ and the strategy
was orchestrated by doctors working under the auspices of the United Nations and
its health body, the World Health Organization (WHO). These doctors, many of
whom were respected professors at leading universities in the ‘developed world’,
were responsible for the genetic engineering of animal viruses to create the Human
Immunodeficiency Virus (HIV) and implementation of the depopulation program
through deliberate contamination of vaccines with the newly created virus. The
contaminated vaccines were seeded into carefully chosen populations through the
WHO’s Global Immunization Program, first in Africa and later in the Caribbean,
Asia, the Pacific Islands, and South America (notably Brazil). In the First World,
the virus was seeded into homosexual populations and injecting drug users
(specifically heroin addicts).
I experienced several disincentives in taking my investigations into HIV and AIDS
any further than the collection of information. The few of my medical colleagues I
discussed my theory with told me to “leave it alone”, and that though it was a
possibility, it would be difficult to prove – and likely to get me into trouble. This
advice was, at least in part, because of who I was investigating: medical research
institutions and universities, the medical profession and various “professional
organizations”, international medical charities, mining companies, drug companies
and “Western” governments, including our own – I was even looking at the United
Nations and World Health Organization as potential culprits.
The theory that the AIDS virus (HIV) was deliberately created and introduced, via
infected vaccines and blood products, to targeted populations will be explored in
detail in the following pages, looking specifically at the institutions in Australia
that may have been involved in such research. The Annual Reports of these
institutions are examined, to ascertain the types of research being done, and who is
funding it.
In addition, the history of biological and chemical warfare research in the 20th
century will be discussed, in an effort to determine what illnesses in the modern
world can be attributed to the undisclosed implementation of such warfare. In
doing so, the secret history of biological and chemical warfare research programs
in Australia will be disclosed in detail for the first time. These programs, none of
which have been admitted to by the implicated authorities, include experiments in
Australia on Australians, and experiments based in, and orchestrated from,
Australia, in neighbouring parts of the world.
It will be shown that if AIDS is the result of germ warfare, or intentional genocide
of targeted races, several Australian institutions and universities and successive
Australian governments are deeply implicated in the crime. The same is the case
with several respected medical research institutions and universities in the USA,
and others in the UK, France, Japan and Belgium, along with successive
governments in these nations, and international health organizations and aid
organizations including the International Red Cross and the World Health
Organization. Other research institutions and universities implicated in such
genocide, if the AIDS epidemic is the result of biological warfare with the HIV
virus, would be located in Japan, South Africa, Canada and other nations that have
been involved in an intricate network of HIV/AIDS research collaboration.


Overview of the Epidemic
The AIDS epidemic began in the late 1970s and early 1980s, roughly
simultaneously in certain cities in the USA and in Central Africa. I graduated in
medicine at the University of Queensland in 1983, the year before Robert Gallo at
the National Cancer Institute in the USA announced he had discovered the cause of
AIDS – a virus he termed ‘Human T-lymphotropic Virus III’ (HTLV III). Later the
name was changed to Human Immunodeficiency Virus, or HIV.
The global pandemic of AIDS is justifiably regarded as the world’s
biggest public health threat from an infectious disease, with the possible exception
of malaria, and is said to have killed in excess of 20 million people over the past 15
years. Over 30 million people are said to be infected with HIV in Africa alone.
Another 8 million or so people have been infected with HIV in Asia and over a
million people have HIV/AIDS in South America, according to official figures.
Europe, North America, Australia, Britain, New Zealand and other “First World”
nations (coincidentally those with a history of eugenic movements) have been
spared the worst of the AIDS epidemic – in these countries AIDS and HIV
infection have remained confined to homosexuals, intravenous drug users and
prostitutes.
In the “Third World”, where the epidemic continues to worsen (unlike in the First
World), the disease is seen equally in men and women, and is not associated with
homosexuality or drug use. Here the epidemic also involves infants and children.
Many of these children are known to suffer from severe malnutrition, which is known
to cause depressed immunity and susceptibility to infection. These children are
routinely, and increasingly, being injected, despite this malnutrition, with a range
of live virus vaccines, including measles and polio vaccines. In Africa, such
vaccination has been occurring, with little attention to sterilization of needles or the
general health of infants and children, for almost a century – long before the advent
of HIV and AIDS.
This has remained, for nearly three decades, the global pattern of AIDS and HIV
infection. It affects women, men and children in the Third World, but is limited to
homosexuals and injecting drug users in ‘the West’ (Australia, where I live, has the
same pattern – known amongst AIDS researchers and epidemiologists as ‘Pattern
1’, despite not actually being in the ‘Western hemisphere’).

The Coincidence with Nazi eugenic program targets
It is common knowledge that homosexuals were among the populations targeted
for mass-murder (euphemized as ‘euthanasia’) by the Nazis during the Second
World War, and that the scientific rationale for the meticulously orchestrated
genocide perpetrated by Adolf Hitler and his government was the implementation
of what they regarded as ‘eugenics’ – the science of improving the human race by
selective breeding. It is also common knowledge that the Nazis targeted mental
patients, Jews, Gypsies, Poles and ‘non-Germanic races’ in the parts of Europe
they conquered as ‘degenerates’ to be ‘exterminated’ in the interest of improving
the ‘human stock’ (or ‘blood’). It is less well known that the ‘science’ of eugenics
was imported into Germany from the United Kingdom – the British Empire, in
those days.
The fact that eugenics was embraced by all the universities in Australia and other
‘white colonies’ of the British Empire, and by the major universities in the USA,
prior to (and during) the genocidal extermination programs of the German
government was an embarrassment to these institutions after 1945, when details of
the Holocaust emerged.

The AIDS epidemic in Africa, and subsequently other parts of the “Third World”,
has shown very different demographics from the ‘West’ from the start – men and
women have been equally affected, and the vast majority of victims have not been
homosexual or intravenous drug users (intravenous drug use is, of course, far less
common in the “Third World” than in the “First World”). Noticing that the
preventive measures being promoted to control AIDS and HIV infection had
obvious similarities to those previously employed as a strategy to combat
population increases in Africa and elsewhere in the so-called “developing world”
during the years of publicly declared eugenics programs, and knowing that since
the 1960s there had been increasing paranoia in the West about “overpopulation” in
poor nations, especially those in Africa and Asia, I looked more closely at the
possibility that HIV was being used to control the much-vaunted problem of “Third
World overpopulation”.


Cold War Development of Biological Weapons
The development of biological weapons, being illegal, is routinely denied by every
country in the world. Australia is no exception; the Australian media rarely if ever
mention our nation’s involvement in developing and testing ‘biological agents’, as
the military calls them – a term one is unlikely to hear unless one happens to be
trained by the military (and I was not, at least officially).
During the Second World War, however, all the protagonist nations were involved
in “research” into germ warfare. In Australia, which served as a command post for
the British (Royal) Army, Navy and Airforce, medical doctors and scientists
experimented on Australian soldiers with mustard gas and anti-malarial drugs.
These experiments, conducted in secret in remote North Queensland military
camps, were orchestrated from the Heidelberg Military Hospital in Melbourne.
Adding insult to injury, successive governments in Australia and Britain denied the
occurrence of any such experiments until they were revealed to the world by the
mass-media fifty years later (meaning that those who claimed to have ongoing
disability and illness from the experiments were denied compensation, and worse,
diagnosed as “mentally ill” by virtue of their “delusion” that they were subjected to
these tests and their “paranoia” that the government would do such things).
The mustard gas experiments involved deliberately exposing young soldiers, who
had been told they would be helping the war effort by assisting in medical tests for
the Allies, to the corrosive chemical warfare agent “mustard gas”. Mustard gas,
which was being stockpiled by the Germans, causes severe burns to the skin and
respiratory system, something the doctors who conducted the tests were well aware
of. They were not sure, however, about how many times a man with severe
mustard gas burns could run around an obstacle course. They did not know how
long the burns would take to heal, and how fully, without treatment. They wanted
to find out how badly burned a half-naked man would be if a mustard gas bomb
was exploded near him – such bombs were being developed, according to their
“intelligence sources”, by the Japanese. They chose far North Queensland because it was far
from prying eyes. Those who conducted the experiments knew, in other words, that
the public would disapprove of such “medical research”, and demand that it be
stopped.


Australian Malaria Experiments during and after WWII
At the same time as the mustard gas (chemical warfare) experiments, the parasitic
infection malaria was also being studied by Royal Australian Army and British
Army doctors, in collaboration with the International Red Cross and two drug
companies with an interest in the drug the doctors were experimenting with –
paludrine, which had been recently discovered by German scientists and said to be
effective against this scourge of the tropics. Malaria kills millions of people every
year in the tropical parts of the world – including the tropical parts of Asia, Africa
and South America. The infection, which is caused by parasites that enter the blood
through mosquito bites, affects people of all ages, and can be fatal. Most often
recovery occurs, though, and immunity to malaria is common in people indigenous
to malarial areas. The infection was thus more dangerous to Australian, American
and British troops in the tropics than to the local population in these areas
(although overall the vast majority of those who die from malaria in the modern
world are still, as was the case in the 1940s, impoverished residents of the tropical
“Third World” regions).
There are several types of malaria, and these are carried by only certain species of
mosquito. Some types of malaria are more dangerous than others, and mosquitoes
can be specially bred to carry an abnormally big load of parasites. These scientific
facts were known to the Australian and British doctors who scoured Royal
Australian Army convalescent depots looking for crippled soldiers who could be
used as “human guinea pigs” for their malaria experiments.
Failing to find enough subjects for the experiments from the military hospitals, the
Army turned to the Department of Immigration – ‘interned’ Italians and selected
Jewish refugees who wanted to serve the British empire that had saved them from
Nazi persecution, but were not allowed to carry arms, were sent to Queensland to
become “human guinea pigs”, along with the wounded soldiers.
During the experiments, these men were transfused with infected blood (supplied
by the Red Cross) and exposed to “specially bred mosquitoes” carrying many
times the usual load of parasites. Once infected, the men were subjected to tests of
endurance, under various conditions, after the style of similar medical experiments
being done at the same time in Northern China by Japanese military doctors, and
Europe by German ones. The Australian and British doctors also tested the German
drug paludrine (proguanil) for effectiveness and toxicity on their “human guinea
pigs”. This is where the American drug company and the British Imperial
Chemical Industries (ICI) came in. They were interested in marketing the drug if it
proved to be safe and effective. ICI pharmaceuticals (now part of Astra-Zeneca)
markets paludrine in Australia to this day (see reports in the Age, 19.4.99 and
20.4.99 for details of these malaria tests).
The malaria experiments included testing how fast the parasites could be
transmitted from one person to another when two “human guinea pigs” were
connected to each other via intravenous lines, testing the results of repeated
transfusions of infected blood, and testing the effectiveness of the specially bred
mosquitoes in causing infection. The drug Paludrine (proguanil) was tested both
for effectiveness and toxicity. Toxicity tests involved giving the infected subjects
bigger and bigger doses of Paludrine and observing the effects. These inhumane
experiments were only revealed by the (Fairfax) press 50 years after they occurred.
Although front-page news for two days, the matter was dropped by the Australian
media after the government and the drug companies involved refused to discuss
compensation with the victims. The Murdoch press, typically, did not take up the
story.
The scientific and medical press did not mention the matter at all, and most people
in Australia have forgotten the incident if they were ever aware of it. Others might
suppose that because the events occurred “long ago” they do not have relevance to
the present. ICI clearly did not think so, though. Shortly before the malaria
experiments were made public, ICI (Imperial Chemical Industries, a British
chemical and explosives company) changed the name of its pharmaceutical
operations to ‘Zeneca’ and merged with the Swedish drug company Astra
pharmaceuticals, becoming Astra-Zeneca, the current owners of the paludrine
operations.
The head office of Astra-Zeneca remains in Britain, home of the parent Imperial
Chemical Industries. The Australian branch of ICI chemicals (ICI Australia)
separated from its British parent company a few months before the malaria
revelations, becoming “Orica”, a new “Australian chemical company” based in
Melbourne that is the world’s leading exporter of explosives, according to their
Annual Report 2000. The chairman of Orica, Professor B. H. Lochtenberg, is also
Chairman of Melbourne University Private and a member of the University of
Melbourne Council, as well as being Chairman of the Mental Health Research
Institute!


The consequences of HIV/AIDS being man-made
If the human immunodeficiency virus was developed by genetic engineering there
will obviously be massive changes in society when this is publicly known. This
change will undoubtedly be for the better. If the virus was developed in the course
of biological warfare research – specifically germ warfare
programs – several other infectious diseases are likely to have been developed and
spread in the same way. If this is the case, epidemics of these diseases, which
include tuberculosis, viral hepatitis, haemorrhagic fevers, measles and influenza,
can be stopped, saving millions of lives. If the AIDS epidemic is being deliberately
worsened and spread to areas of the world for political or economic reasons – as the
evidence suggests is the case, regardless of how the epidemic originated – then
massive political, social and economic changes, again for the better, can be
expected when this crime against humanity is proved and publicly known.
If HIV is being used to implement genocide the people who are involved in the
actual diagnosis and treatment of HIV and AIDS are being cruelly deceived and
betrayed, along with their patients. Most people believe that AIDS is a natural
plague, including the large majority of doctors and nurses working in the field.
Most politicians also believe, it appears, that AIDS is the result of a natural virus
that “probably” originated in central Africa – as they have been told by their
scientific advisers. If AIDS is a man-made plague the millions of people who have
donated their time and money to various “AIDS charities” have been cruelly
betrayed, although not as badly as the 40 million people who are said to be infected
with the HIV virus.
The good news is that if the AIDS epidemic is man-made, it has been caused by a
relatively small number of greedy, ruthless, ignorant men. The epidemic can thus
be stopped by the arrest of this small number of criminals and a re-evaluation of
public health policy. If the epidemic is man-made, several extremely rich and
powerful institutions are obviously guilty of a heinous crime against humanity,
reparations for which would leave them bankrupt and their bosses in jail. These
would be the institutions that created the virus or adapted it for human use,
experimented with the virus on animals or humans, and released the virus into
targetted populations. The governments of the nations in which such institutions
existed will have been guilty of gross negligence at best and complicity in
genocide at worst. If the genocide of targetted populations was actually based on
covert government policies (as it was in Nazi Germany) the politicians themselves,
including relevant health ministers, prime ministers and senior administrative staff
would also be guilty of crimes against humanity. Any corporations that
intentionally financed or supported the development and use of HIV as a biological
weapon (or agent for mass-murder) would also share legal responsibility for the
crime of genocide, as would those that knowingly or unknowingly profiteered
through such a man-made epidemic. This is good news because few would
disagree that the supposed “property” of such individuals, institutions and
corporations should be forfeited, at the very least, in compensation for their crimes,
which are without precedent in magnitude, even when compared with the worst of
Adolf Hitler’s.
As in any other murder investigation, investigating genocide requires the
identification of the scene of the crime, the motive, the weapon, the victims and the suspects.
In the case of genocide using the HIV virus, the scene of the crime is the whole
world, but especially southern and central Africa, where the epidemic has
already claimed millions of lives and over 30 million people are said to be infected
with HIV. These people, and the over 10 million others (outside Africa) who have
been infected constitute the obvious direct victims, but there are millions of others
– including the families and friends of those who have died of AIDS, and those
who have suffered from the terror of being told they have HIV antibodies in their
blood and that this constitutes a certain death sentence.
The weapon, in this case, needs no identification: it is the human
immunodeficiency virus (HIV). Regardless of where the virus originated, and
whether it is the product of genetic engineering or natural evolution, the HIV virus
could theoretically be used for genocide. This could be done by deliberately
contaminating injections (including immunizations), heroin or condoms with the
virus, by breeding mosquitoes to carry the virus (remembering the Queensland
malaria experiments) or by enhancing its spread through other means (such as
facilitating behaviour and providing conditions conducive to the dissemination of
the virus). In other words, one would not need to prove that HIV is “man-made” to
prove that the AIDS epidemic is the result of a genocidal eugenic program (or
several such programs).
If one could prove that the human immunodeficiency virus is the product of
biological warfare research and deliberate genetic engineering to create a killer
virus, this would obviously strengthen the case for genocide; it is not, however,
essential to it.


Suspicion about the HIV/AIDS Epidemic in New Guinea
I had known since the 1980s that in Africa, South America and Asia, AIDS and HIV
infection were not confined to homosexuals or drug addicts. Most of those
affected were heterosexual, with an equal number of men and women showing
evidence of HIV infection in various epidemiological studies conducted by “First
World” researchers in the “Third World”. Subsequently, in the 1990s the epidemic
in Africa spread to babies, too. In Africa over a million infants and young children
were said to have died of AIDS by 1993. Infection via breast milk was put
forward as “the cause” by experts, although breast-feeding usually confers
improved immunity on babies, and I could not help but wonder, despite my lack of
virology qualifications, why other possibilities were not considered. Specifically,
since there were sound scientific reasons to investigate the accusation by Dr Alan
Cantwell and others that HIV infection was initially introduced into targeted
populations in the USA and Africa via contaminated vaccines, I wondered why this
theory was not being seriously considered by the medical research establishment.
When I read, in September 1997, that the epidemic had suddenly spread to young
women of child-bearing age in New Guinea, coincidentally another gold and
mineral-rich part of the world, I became more concerned. Why was the AIDS
epidemic so much worse in areas of the world known to be particularly rich in
deposits of diamonds, gold, and other precious minerals? Why was the epidemic
following so closely the targets of previous eugenic programs: namely
homosexuals, opiate addicts, prostitutes and “blacks”? Why were the NHMRC,
WHO and other authorities not considering the possibility that the disease was
caused by contaminated vaccines? Even if it turned out not to be true, the
possibility could not be excluded unless investigated – this much was obvious.


The media report that provoked me to research the AIDS epidemic more urgently
and seriously was a short article credited only to “AAP” in the medical magazine
Australian Doctor titled “PNG faces AIDS epidemic”. The September 1997 report
began:
“Papua New Guinea is on the verge of an AIDS epidemic to rival
African countries such as Uganda, with young women as the main risk
group, health authorities have warned.
“PNG has the third-highest per capita incidence of HIV infection in the
Pacific Rim region, [PNG] Health Minister Ludger Mond said while
announcing plans to develop a national HIV/AIDS strategy.” (Australian
Doctor, 26.9.97, p.37)
Obviously, the PNG Health Minister was repeating what he had been
told; Papua New Guinea does not collect and collate data from other
“Pacific Rim” countries regarding HIV/AIDS or any other illnesses. Mr Mond
claimed, according to the report, that although there were only 745 reported cases
of HIV in PNG (629 of which were in the capital, Port Moresby), this was likely to
be only the “tip of the iceberg”. He feared that as many as 10,000 people could
have the virus. As the article continues, it becomes clear where the PNG
Health Minister’s facts and figures had come from – along with the idea of a “National
HIV/AIDS Strategy”:
“The WHO [World Health Organization] representative for PNG, Dr
Paul Chen, said immediate action through education programs was needed
to help arrest the epidemic. ‘The prevalence here is higher than Australia,
higher than Fiji, higher than New Caledonia and so on,’ he said. ‘From that
point of view and the rapidity of the increase…it shows that we are right in
the midst of an epidemic. And if we don’t do something fast we could end
up like Uganda where you would have seen the villages abandoned, people
are dying, a whole generation missing.’ ”
What puzzled me was why, given the HIV virus’s behaviour elsewhere (including
Australia, just to the south of New Guinea), the HIV infection rate was highest in
young New Guinean women, rather than in homosexual men or intravenous drug
users, as it was in Australia. The article provides an explanation – that young
women in New Guinea are being exploited by older New Guinean men. The
possibility of deliberate demonization of New Guinean men (as being sexually
aggressive and promiscuous) and women (as being weak and tending to
prostitution) occurred to me, but I needed more proof to satisfy myself that the
demonization was deliberate rather than unintended. The “explanation” reads:
“The greatest HIV infection rate in PNG was among women aged 17-35,
Health Department statistics found.
“Dr Timothy Pyakalyia, the department’s deputy secretary, said 65% of
HIV patients younger than 26 were women.
“Dr Pyakalyia blamed the high infection rate among young women on
the social disadvantages they face in PNG and their exploitation by older
men who were carrying the virus. ‘In developing countries, younger females
are basically being exploited by older males who are financially secure,’ he
said.”
Are we to believe, then, that exploitation of younger women by “financially
secure” older men is more likely to occur in “developing countries” than
“developed” ones? Considering what is generally referred to as “development” –
development of the mining, construction, monocrop agriculture, timber and cattle
industries, all of which have been controlled by financially over-secure male
tycoons for centuries (since their inception, in fact), it is a ridiculous insinuation
that this sort of “industrial development” brings emancipation to women.


Chauvinism and Ethics in the Corporate World
When one looks at the boards of directors of the biggest corporations in these
industries, one finds only middle-aged and elderly men – all millionaires and almost
all “white”. In recent years the most racist and sexist of these corporations have
recognised the importance of being seen as “equal opportunity providers” – so
they often employ a woman or a “black person” in a prominent “public relations”
position, such that the “proof of equal opportunity” can be photographed by the
media and seen by the public. This strategy is widely used by the corporations and
institutions engaged in the most heinous crimes against humanity, as one might
expect, given the racial and gender considerations involved in such crimes. The
strings that move such “public relations puppets” are controlled almost exclusively
by white, middle-aged men – although it may be true that some of these men are
under the “private” control of their wives. The fact remains that the industrial
world is glaringly chauvinistic and deeply racist at the “top of the pyramid”.
It is widely recognised that inciting racial hatred is abhorrent, and illegal. Thus
Australian Doctor could not claim that exploitation of young women by older men
is rife in “developing countries” or Papua New Guinea without quoting a “local”
who said what they wanted to print.
Australian Doctor is sent, free and unsolicited, to tens of thousands of doctors
around Australia by its publisher, Reed Business Information. Its main objective,
although it claims to be “medical education”, is pharmaceutical advertising. Every
copy has a drug company advertisement on almost every page, and the
magazine/journal is one of two weekly promotions from the drug industry to
general practitioners in Australia. It is billed as “The Independent Weekly
Newspaper for Australian GPs”, and is edited by a medical doctor, with an
advisory board of medical specialists. The publication claims that:
“Australian Doctor is an independent publication serving the needs of
Australia’s general practitioners. It has no affiliation with any medical
organisation or association.”
However independent the weekly publication may claim to be, it is difficult to
believe that the editors do not take into consideration the interests of their major
sponsor – the pharmaceutical industry. The drug companies that buy advertising
space in Australian Doctor certainly provide information, but how reliable are their
claims? Some of the slogans of the drugs promoted in a recent edition of
Australian Doctor give an indication of the pervasive influence of corporate
marketing and advertising on drug promotion these days. From the 16.2.2001
edition:
Page 2: “Every morning your hypertensive patients enter the danger
zone.” “Micardis – Protection when they need it most” (by Boehringer
Ingelheim)
Page 3: “It’s that Premia feeling!” “Premia – HRT made easy” (by Wyeth
Ayerst)
Page 4: “Lower Low oest” “Loette” (by Wyeth Ayerst)
Page 5: “Nail onychomycosis fast” “with Lamisil” (by Novartis)
Page 6-7: “People with neuropathic pain wear it every day.” “Neuropathic
pain can make life agony.” “Neurontin – strong relief for neuropathic pain.”
(by Pfizer)
Page 8: “Daivonex…First line alternative to topical steroids for psoriasis”
(by CSL pharmaceuticals)
Page 9: full-page enlargement of Micardis advertisement on page 2
Page 10-11: “This man will continue to take his friends for all they’re
worth thanks to Exelon” [showing a photo of an old man shuffling cards,
sitting at a table with 6 other old men] “Dual Action Exelon – with
Alzheimer’s Every Day Counts.” (by Novartis)
Page 12-13: “When the big picture looks complicated…Get back to
basics.” “Monopril/Monoplus – Simplifying management of at risk
hypertensives.” (by Bristol-Myers Squibb)
Page 14: “New from Merck Sharp & Dohme” “Renitec plus – PBS
February 1st 2001”
Page 15: “The lift you can rely on” “Caverject – reliable erections”
The recent edition of Australian Doctor, from which these advertising slogans are
taken, contains an article of particular relevance to understanding the relationship
between corporations, government and the medical profession in Australia, and
elsewhere in the “Western World”. Credited to Heather Ferguson, the article
features a close-up photograph of Australian Federal Health Minister Dr. Michael
Wooldridge’s face, below which is the caption: “Dr Wooldridge…under increasing
fire over appointment of Mr Clear”.
The article, titled “Drug industry push to join govt bodies”, concerns the recent
appointment of Mr Pat Clear – previously a senior executive at Glaxo-Wellcome
(the world’s largest drug company) and Chief Executive Officer (CEO) of the
Australian Pharmaceutical Manufacturers Association (APMA) – to the
Pharmaceutical Benefits Advisory Committee (PBAC), a supposedly independent
expert body that advises the Federal government on what drugs to subsidise under
the National Pharmaceutical Benefits Scheme (PBS). The APMA is a collective
organization of the biggest international drug companies active in Australia.
Advertising standards for drugs in Australia are largely regulated by the
APMA, which operates according to its own “code of practice” – the industry is
thus largely self-regulated; in other words, it is largely unregulated other than by
“market forces” and what it can get away with.
The article quotes the managing director of Roche Australia as someone who had
admitted that the APMA had long aspired to having a representative on the PBAC.
He should know, because he himself was previously on the APMA Board of
Directors:
“A senior drug industry figure has claimed the appointment of a former
drug lobbyist to the Pharmaceutical Benefits Advisory Committee follows
an industry campaign to gain representation on all government
pharmaceutical committees.
“Fred Nadjarian, managing director of Roche Australia, made the claim
amid escalating concerns over the appointment of former Australian
Pharmaceutical Manufacturers Association CEO Pat Clear to the PBAC.
“Mr Nadjarian said he became aware of the campaign several years ago,
when he served on the association’s board of directors.
“He did not know what tactics drug companies were employing because
Roche had decided not to take part in any push for representation. It had
believed this would not increase industry influence with government.
“Although colleagues from other drug companies had attempted to get
Roche onside, the company had not felt pressure was being placed on it to
change its viewpoint, he said.
“AMA president Dr Kerryn Phelps said Mr Nadjarian’s comments came
as no surprise, and simply confirmed the association’s suspicions that Mr
Clear’s appointment was the result of industry pressure.” (p.4)
At what point should suspicions be replaced by certainty? How doubtful is it that
Mr Clear’s appointment was the result of “industry pressure”? Probably only insofar
as the appointment was also the result of direct political pressure from the Health
Minister, Dr Michael Wooldridge. It was he who made the appointment. Adding to
the question of conflicts of interest, Dr Wooldridge himself has been accused of
“inappropriate links” with the drug company Pfizer; the health minister’s office,
according to the Australian Doctor report, “strenuously defended false claims of
inappropriate links with Pfizer”.
Glaxo-Wellcome and AZT
Mr Pat Clear was previously a publicist for Glaxo-Wellcome, the UK-USA
pharmaceutical giant that formed as an amalgamation of British Wellcome
pharmaceuticals with its rival, Glaxo. More recently Glaxo-Wellcome merged with
the massive British-American SmithKline Beecham to form Glaxo SmithKline
(GSK), said to be the biggest drug company in the world. SmithKline Beecham,
formed by the merger between the American drug giant Smith Kline & French and
Beecham Laboratories, is a major vaccine manufacturer (in addition to making several
psychoactive drugs), and Wellcome pharmaceuticals has, since the 1980s,
marketed two drugs of particular importance to any discussion of AIDS –
azidothymidine (AZT) and methadone (physeptone). No other drug company sells
these drugs in Australia and both have long been subjects of controversy. I am not
certain whether any other drug companies now manufacture AZT, but
Wellcome pharmaceuticals was manufacturing the drug several decades before
HIV and AIDS were heard of.
The drug AZT was first marketed in the 1950s as a treatment for cancer, although
its toxicity and lack of efficacy prevented widespread use throughout the 1960s and
1970s, when it was a specialist drug kept within the armoury of hospital
oncologists. In the 1980s AZT was the first drug to be widely heralded as the “first
line treatment” for AIDS – it has subsequently become the “benchmark drug”
against which other “antiretroviral drugs” are compared. AZT is now promoted
as an antiviral drug rather than an anti-cancer drug. Confusingly, Glaxo-Wellcome
have changed the name of the drug to match its new identity as “the standard
anti-AIDS and anti-HIV drug”. It is now marketed as “Retrovir”, a name evocative of
the “retrovirus” (HIV) against which it is claimed to be active.
The changing of a brand name is a common (but confusing) practice, but the
“generic name” customarily remains constant. This is the case with almost all other
drugs – thus paracetamol, regardless of whether it is marketed as Panadol,
Panamax, Dymadon or Tylenol (brand names), is still identifiable by a physician,
nurse or chemist as paracetamol (the generic name of the drug). This constancy of
the generic name is essential to avoid confusion and for researchers to follow the
drug for long-term adverse effects. Significantly, then, Glaxo-Wellcome have
recently dropped the well-known generic name “azidothymidine” (AZT) and
replaced it with a new generic name “Zidovudine”. This is significant because
those who are looking for the early history of AZT in the future will “lose the
trail” in the 1990s, when both the trade and generic names of the drug were
changed.
Glaxo-Wellcome was said to be the biggest drug company in the world, rivalled
only by the American Merck Sharp & Dohme, even before the recent merger with
SmithKline Beecham. In the 1980s many already huge drug companies merged –
lowering their distribution and advertising costs, and allowing them to “downsize”.
This did not mean that drug prices came down. On the contrary, they rocketed. The
AIDS epidemic has been the biggest money spinner in the “Third World”, while
drug treatments for “mental illness” and “heart disease prevention” are the biggest
drains on the public purse in “First World” nations. This difference between the
“Third” and “First” worlds in their drug purchases reflects the radical differences in
disease patterns and drug-marketing strategies by pharmaceutical companies. The
head offices of almost all these pharmaceutical companies are located in capital
cities of one or another “First World” nation. Of those that sell drugs in Australia,
the biggest drug companies are located in the USA, UK, Switzerland,
Germany, Belgium, Scandinavia (mainly Sweden and Denmark), Australia, France,
Italy and Spain – in roughly that order. The same drug companies dominate the
pharmaceutical industry around the world.
Giant drug companies are often referred to as “multinational” or “transnational”
corporations. This is misleading, because although these companies certainly
operate in many nations and across national borders, they are inevitably based in
only one country, or at the most, two. It is in this country that major corporate
decisions are made, and from the “head office” in this country that the orders and
protocols obeyed by the whole corporation emanate.
The biggest drug companies are among the largest corporations in the world
(although they do not rival the biggest mining and oil companies or banking
corporations in size), and make bigger sales profits than any other industry. Drug
companies also operate across more national boundaries than any other – due to the ubiquitous
need for medicines, and the monopolies held by particular companies over specific
“patented” drugs. The advertising budget of the large companies amounts to
billions of dollars every year, and they are able to employ the most expensive
advertising and legal experts to sell their wares. Major decisions about advertising
and promotional strategy emanate, as in the case of other large corporations, from
the head office of each drug company, as do major financial decisions (such as
takeovers and mergers with other drug companies). These are multibillion-dollar
decisions, and drug companies are eager to know what various independent
government committees know, and to have advance knowledge of impending scandals
or litigation risks. Thus Mr Clear, were he to retain political and social contacts
with staff of Glaxo-Wellcome or members of the Australian Pharmaceutical
Manufacturers Association, could be faced with a serious conflict of interest. He
could, in fact, act as a spy – providing the pharmaceutical industry with inside
knowledge of the PBAC’s activities and deliberations. We are to be reassured,
according to the Federal Health Minister, Michael Wooldridge, that Mr Clear no
longer works for the drug companies, and that he is only one member of a
12-member committee.
In recent years several drug companies have been criticised for “dumping”
unwanted drugs in the Third World. Dr Michel Clerc of Médecins Sans Frontières
(MSF) wrote of this, in the August 1999 MSF Australia newsletter:
“There is nothing wrong with an incentive for pharmaceutical companies
to contribute to emergency aid, but this needs to come with effective control
mechanisms over the appropriateness and correct usage of the drugs they
provide. As it is now, they get a financial reward for getting rid of old stock
cheaply (incinerating drugs can be expensive), meanwhile adding to the
public health problems of people who already live on the edge of human
tolerance.”
In addition to dumping unwanted drugs in poor countries, drug companies may
deliberately create “new markets” by introducing drugs into Third World nations.
Dr Clerc of MSF comments on nicotine patches being sent as “aid” for Kosovar
refugees, but there are more serious examples of “inappropriate drugs” than this
(although nicotine patches are bad enough, under the circumstances). There is a
fine line between a medicine and a poison – a drug that can serve as a medicine can
act as a poison if given for the wrong reasons or in the wrong dose. Such is the
case with almost any drug, although, of course, some drugs are more toxic than
others. Those drugs that predictably cause toxicity in high doses (or prolonged use)
may be readily adapted as “chemical warfare agents” – chemicals used to create
illness rather than health. Chemical warfare agents are, by definition, poisons
(toxins), and many drugs are potential poisons at the same time as being potential
medicines, depending on how and why they are used.
Before looking in detail at the scientific, geographical and political background of
the AIDS epidemic, and the parallel development of psychological, chemical and
biological warfare during the Cold War and related industries, a list of possible
motivations for the use of HIV and AIDS as a biological weapon is presented to
conclude this chapter, followed by a timeline of the development of the epidemic.
These serve to inform the discussion and details in subsequent chapters.
POSSIBLE MOTIVES FOR USING THE HUMAN IMMUNODEFICIENCY VIRUS (HIV) AS
A BIOLOGICAL WEAPON:
1. As a measure against ‘Third World’ overpopulation.
2. Implementation of eugenics programs.
3. Drug development and experimentation.
4. Promotion of contraceptives (condoms, spermicidal gels and lubricants)
5. Drug sales (AZT/Zidovudine, other antiviral drugs, antibiotics, antifungals,
psychoactive drugs, vitamins and mineral ‘supplements’)
6. Induction and promotion of terror resulting in increased insurance sales, treatment
services (medical, psychiatric, ‘alternative’/complementary)
7. Genocide of people with native rights of ownership of land and resources
8. Genocide of targeted races and ethnic groups
9. Genocide of targeted individuals deemed to be “depraved/degenerate” (homosexuals,
intravenous drug-injectors)
10. Genocide of targeted religious groups and religious dissidents
11. Genocide of women of child-bearing age (in Third World)
12. Genocide of children and babies (in Third World)
13. Stigmatization of targeted ethnic, religious, political groups and dissident individuals
14. Promotion of screening, diagnostic and “follow up” blood tests
15. Surveillance of targeted populations and induction into chronic disease state
16. Collection of human and animal tissue for “tissue banks”.
17. Increased funding for drug and vaccine research
18. Promotion of hepatitis B vaccination and other vaccines
19. Increased profits for animal experimentation industry (including breeding, construction
of institutions for research, specialist equipment etc)
20. Increased public donations to “charities” for medical research and “aid” to the Third
World
21. Statistics and data collection and manipulation
22. Propaganda against selected targets for demonization
23. Propaganda glorifying relevant institutions for their “fight against AIDS”
24. Propaganda in Third World promoting First World drugs
25. Demonization of men in nations ravaged by AIDS (accusations of sexual violence,
promiscuity and bestiality)
26. Demonization of women in nations ravaged by AIDS (accusations of weakness and
helplessness, tendency to prostitution, ignorance)
27. Demonization of youth in First, Second and Third World Nations (as prone to drug
addiction and abuse, violence and decadence)
CHRONOLOGY – BIOLOGICAL MANUFACTURE,
TESTING AND SPREAD OF HIV
1880s: Sir Francis Galton and Major Leonard Darwin (son of Charles Darwin)
form the ‘Eugenics Society’ with plans to create an ‘international religion’ based in
London and Cambridge. The organization was dedicated to ‘improving the human
race’ through selective breeding. Superior ‘stock’ were to be encouraged to have
numerous offspring, while ‘inferior stocks’ (including ‘blacks’, ‘criminal classes’,
‘undesirables’ and ‘lunatics’) were to be segregated and sterilized, to prevent
transmission of their ‘defects’ to subsequent generations. Galton also wrote, in his
1869 book, Hereditary Genius, that the ‘African negroes’ of Southern Africa (the
Bantu rather than the Khoi) appear to have a ‘slavish instinct’ and ‘fall readily into
the ways of a slave’. He also commented that ‘Australian blacks’ appear to be a full
grade below ‘African negroes’ in intelligence, while the African blacks are, on
average, two grades below the ‘Caucasian race’. Galton’s ideas are warmly
embraced in the USA, Australia, Canada, South Africa and New Zealand, and also
in Scandinavia and (later) in Greece, Germany and other parts of Europe.
1899-1903: Second Anglo-Boer War in South Africa
1901: Queen Victoria dies; Edward VII inherits British Crown (reigns from 1901
till his death in 1910). Edward VII (1841-1910) was elected Grandmaster of British
masonry when Prince of Wales in the 1880s.
1902: Cecil Rhodes (1853-1902) dies, leaving his fortune from gold and diamonds
in Southern Africa (South Africa and Rhodesia) in the trusteeship of Anglo-Jewish
banking magnate Nathan Rothschild, to continue his ‘Round Table’ plan of global
domination by the British Empire. Rhodes’ plan included, in the immediate future,
‘opening’ Africa to British exploitation from ‘Cape to Cairo’. Rothschild had
financed the purchase, by the British, of French shares in the Suez Canal, which
had been built in the 1860s with French expertise and African (Egyptian) labour
(including forced labour). Both Rhodes and Rothschild were Freemasons.
1906: Belgian Congo taken by Belgian Government from the hands of the
notorious King Leopold after international outcry against the brutality of the
Belgian colonists in King Leopold’s ‘Congo Free State’ (hand- and head-hacking;
feeding children to crocodiles; starvation, terrorisation and torture of native
Congolese and other atrocities). This has relevance to the Cold War epicentre of
AIDS in Belgian Congo (when the epidemic began) and the extraordinary
appointment of the Belgian microbiologist Peter Piot as head of the United Nations
peak AIDS body (UNAIDS).
1910:
Winston Churchill becomes Home Secretary of the British Empire.
Churchill was initiated as a Freemason at Studholme Lodge No.1591 in 1901.
According to his official biographer, Sir Martin Gilbert, Churchill was, at the time
“in favour of the confinement, segregation and sterilization of a class of persons
contemporarily described as the feeble minded”.
According to Gilbert, Churchill, a prominent and loud enthusiast of Galton’s
‘eugenics’ strategies, was one of the early drafters of the Mental Deficiency Act
1913. This cruel law legislated for permanent incarceration of British subjects
labelled as ‘mental defectives’ – graded as ‘idiots’, ‘imbeciles’ or ‘feeble minded’,
‘moral defectives’ and those diagnosed as ‘insane’. He had sought legislation for
forced sterilization of these people, as was the case in several states in the USA
(notably Indiana), but this was rejected by the British parliament. Churchill had been
impressed by Dr H.C. Sharp’s booklet ‘The Sterilization of Degenerates’ when he
read it in 1910, and was, according to Sir Martin, in favour of using huge doses of
X-rays (then called Roentgen rays) on the genitals, though he also argued for surgical
sterilization of ‘defective’ men and women.
1910:
King Edward VII dies and his son George V ascends the British throne.
1911:
Dr Francis Peyton Rous transplants cancer between chickens and then between
species, with funding by the Rockefeller Foundation (he wins the Nobel Prize in
Physiology or Medicine in 1966, many years after his discovery, which was kept
largely secret till then).
1912:
First International Eugenics Conference held 24-30 July 1912 in London. Churchill
is vice-president. Directors include Charles Eliot (former President of Harvard
University), Sir William Osler (Professor of Medicine at Oxford) and Alexander
Graham Bell (inventor of telephone). Opening banquet of the conference addressed
by Arthur Balfour (British Conservative Prime Minister 1902-1905 and a Fellow of
the Royal Society since 1888). Balfour succeeded Churchill as First Lord of the
Admiralty in 1915 and as Foreign Secretary in 1917 issued a letter to Lord Lionel
Walter Rothschild (eldest son of Lord Nathan Rothschild) promising a Jewish
‘National Home’ in Palestine, then part of the Ottoman Empire (the ‘Balfour
Declaration’). Interestingly, and with relevance to the CIA’s Cold War MK
Programs and ‘remote viewing’ research by the Allies during the 20th century,
Balfour was President of the Society for Psychical Research from 1892 to 1894.
Balfour graduated in Law at the University of Edinburgh in Scotland, and was a
Freemason (Scotland has the highest population density of Masonic men in the
world – approximately 1 in 12 in the 1980s; current figures are not known).
1914-1918: First World War, marketed at the time as ‘The War to End All Wars’
and ‘The Great War’, waged between various colonizing empires and would-be
empires. Australian youth enlist enthusiastically to defend the British Empire
against the “evil (Moslem) Turks”. Anglican Churches and the Red Cross Organization
(founded by Freemason Henri Dunant after the Battle of Solferino) encourage men
and women to devote their time, effort and blood to the war effort.
1917: Insurance magnate and Chicago Freemason Melvin Jones (1879-1961)
founds Lions Club International. Jones was the son of an Army Captain who
commanded a troop of Scouts. Lions Club was subsequently invited to consult on
the formation of the United Nations Organization from the League of Nations at
the 1945 San Francisco Conference at the end of the Second World War (under US
President Harry S. Truman, who was a 33rd Degree Freemason under the Scottish
Rite, Grandmaster of the Grand Lodge of Missouri and Honorary Grand Master of
the Order of DeMolay).
1919: Treaty of Versailles at the end of the First World War carves up the German, Austro-Hungarian and Turkish Ottoman Empires, and ensures poverty of the vanquished
empires by war reparations to the victors. German-controlled colonial territories in
Africa, and the Asia-Pacific region are distributed between Britain, New Zealand,
France, Belgium, Holland and the USA. Britain gets the “Lion’s Share”, followed
by its ‘White Colonies’ including Australia, which gets control (or retains control)
of the gold-rich eastern half of New Guinea, and surrounding copper-rich islands,
including Bougainville, as well as the phosphorus-rich Christmas Island
and the gas-rich Torres Strait. New Zealand gets the phosphorus-rich island of
Nauru, while France gets to keep several Pacific Islands, as well as its existing
colonies in South-East Asia (Laos, Cambodia and Vietnam).
The USA retains control of the Philippines, various strategic locations in the
Pacific (including some at which it could test its missiles) and also gains control of
the League of Nations (and consequently the United Nations Organization when it
forms at the end of the Second World War).

1931:
Dr Cornelius P. Rhoads transplants cancer to Puerto Ricans, under the sponsorship
of the Rockefeller Institute. When exposed by Puerto Rican Nationalist leader
Pedro Albizu Campos, Rhoads is transferred, and subsequently promoted to head
two chemical warfare research projects in the 1940s. He then establishes U.S.
biological warfare facilities in Maryland, Utah and Panama. After WWII, Rhoads
is given a seat on the U.S. Atomic Energy Commission, where he continues human
experimentation, studying the effects of nuclear (ionising) radiation in causing
cancers and other diseases.
1936:
King George V dies and is succeeded by his eldest son, Edward VIII. Edward, who
was a Freemason and a personal friend of Winston Churchill, abdicates later the
same year to marry a divorced American heiress, and is succeeded by his younger
brother, George VI. George VI, the father of Queen Elizabeth II, was a Freemason,
and led Britain, as King of Great Britain and Emperor of India, into the Second
World War, dying in 1952 at the age of 56.

1940s:
During the Second World War there is extensive development of chemical and
biological warfare techniques and of psychological warfare techniques (induction
of panic, terror, anxiety, confusion, rage and emotional collapse or ‘abreaction’),
along with hypnosis and brain-washing by propaganda bombardment, by the
Americans, British, Australians and New Zealanders (‘Allied’ powers) and, as more
publicised, the Germans, Italians, Spanish and Japanese (‘Axis’ powers).
Eugenics theory, developed by Sir Francis Galton (Fellow of the Royal Society) at
Cambridge and London in the 1860s-90s, influences both the Axis and the Allies
during the First and Second World Wars (being, by then, the dominant
social/psychological/anthropological theory in UK, USA, Australia, New Zealand,
Canada, South Africa, France, Belgium, Germany, Switzerland, Denmark, Sweden,
Norway and Finland and also in Japan).
The USSR and Eastern Bloc also embraced some eugenics theory and terminology,
but adapted it to the political enemies and dissidents (e.g. ‘sluggish schizophrenia’)
of Stalin’s communist regime. In South America, Brazil (a Portuguese colony) may
have adopted eugenics ideas more than the Spanish colonies did, but eugenics, being a
development of Darwin’s evolutionary theory, was much less embraced by Catholic
and Christian Fundamentalist (creationist) organizations and universities. Positive
and negative eugenics formed the ‘scientific rationale’ for White South Africa’s
‘apartheid’ and White Australia’s ‘assimilation’ policies towards Aboriginal and
Torres Strait Islander people and dictated Fiji’s and New Zealand’s treatment of
Pacific Islanders and Maoris.
Biological warfare research in Australian, American and British universities and
institutions centred on increasing virulence of endemic animal viruses, bacteria and
parasites – documented evidence of:
1) Increasing virulence of malaria (and testing resistance under
various forms of torture) (Australia/Britain)
2) Transfusing malaria between prisoners/volunteers and
testing efficacy of drugs (Australia/Britain/USA)
3) Deliberately infecting patients with malaria (so-called
‘malaria therapy’) and typhoid as a supposed ‘treatment’ for
‘schizophrenics’ and other ‘mental patients’ (Australia,
British Commonwealth, Canada – where Heinz Lehmann
continued such experiments after the war).
4) Increasing virulence of dengue fever (USA)
5) Creating/modifying cancer-causing viruses and developing
chemical and radiation means of producing cancers in
specially bred rodents (USA/Britain/Australia)
6) Creating cirrhosis-causing viruses and creating alcohol
dependence (USA/Australia)
7) Investigating effects of typhoid and other bacterial infections
on the brain (Canada/USA)
1947:
Australian immunologist Sir Frank Macfarlane Burnet recommends, in a secret
report, the development of biological and chemical weapons to target food crops
and spread infectious diseases in Indonesia and ‘overpopulated Asiatic countries’
following a request by Sir Frederick Geoffrey Shedden, Secretary of the Australian
Department of Defence, on 24 December 1946. Shedden had stated, in his request
for Burnet to brief top military officers at the department, that Australia could not
ignore the fact that many countries were conducting intensive research on
biological warfare.
Burnet responded in a secret meeting (minutes of which were released in 1998 to
the historian Phillip Dorling, following initial opposition from the Department
of Defence and the Department of Foreign Affairs and Trade) that Australia’s
temperate climate could give it a significant military advantage. The minutes of the
meeting (as reported by Dorling, according to political correspondent for The Age,
Brendan Nicholson in 2002) quote Burnet as saying:
“The main contribution of local research so far as Australia is concerned
might be to study intensively the possibilities of biological warfare in the
tropics against troops and civil populations at a relatively low level of
hygiene and with correspondingly high resistance to the common infectious
diseases.”
Burnet was consequently invited in September 1947 to join a chemical and
biological warfare subcommittee of the New Weapons and Equipment
Development Committee, providing a secret report titled ‘Note on War from a
Biological Angle’ suggesting that biological warfare could be a powerful weapon
to defend a thinly populated Australia, and “its use has the tremendous advantage
of not destroying the enemy’s industrial potential which can then be taken over
intact.” He urged the government to encourage Australian universities to research
those branches of biological science that had a special bearing on biological
warfare.
The minutes of a 1948 meeting at Victoria Army Barracks note that Sir Frank
Macfarlane Burnet “was of the opinion that if Australia undertakes work in this
field it should be on the tropical offensive side rather than the defensive”.
After Burnet visited Britain to examine their chemical and biological warfare effort
in 1950, he reported to the committee that though the initiation of epidemics in
enemy populations had “usually been discarded as a means of waging war”
because of the danger of it rebounding on the user:
“In a country of low sanitation the introduction of an exotic intestinal
pathogen, e.g. by water contamination, might initiate widespread
dissemination”,
and that:
“Introduction of yellow fever into a country with appropriate mosquito
vectors might build up into a disabling epidemic before control measures
were established.”
In 1951, the subcommittee recommended that:
“A panel reporting to the chemical and biological warfare subcommittee
should be authorised to report on the offensive potentiality of biological
agents likely to be effective against the local food supplies of South-East
Asia and Indonesia.”
From 1950 to 1960, the Walter and Eliza Hall Institute at the University of
Melbourne, headed by Frank Macfarlane Burnet, produces more published papers
on virology and immunology than any other institute in the world. In 1965, Burnet
retires and is replaced by 34-year-old microbiologist Gustav Nossal (later Professor
and Sir Gustav Nossal). Nossal, as consultant to the WHO, supervised the 1970s
Smallpox Eradication (vaccine) Program in Africa, implicated in the spread of HIV
into African populations, following its initial introduction to Belgian Congo.
Burnet continued to work at the University of Melbourne, drawing attention to the
‘problem of overpopulation’ as president of the Australian Academy of Science
until his retirement in 1978.

1950s:
CIA director Allen Dulles sets up the MK Programs sometime between 1950 and 1953,
an integrated series of biological, chemical, nuclear and psychological warfare
programs. MK-NAOMI, which ran (officially) from the 1950s to the early 1970s,
was focussed on biological warfare, while MK-ULTRA, headed by the Jewish
chemist Sidney Gottlieb (born Joseph Scheider), concentrated on psychological and
chemical warfare. Research was conducted at major universities and research
institutions in the USA (including Harvard, Yale, Stanford and UCLA), Canada
(McGill), UK (including Porton Downs, University College London and probably
Cambridge) and Australia (University of Melbourne, and possibly Australian
National University).
In 1953, Drs Francis Crick and James Watson announce the discovery of the
structure of DNA; Watson, Crick and Wilkins are awarded a joint Nobel Prize for
the discovery in 1962. Watson returns to the USA after the announcement,
subsequently heading the old ‘eugenics records office’ at Cold Spring Harbor
(sponsored by the Carnegie Corporation), which was renamed ‘Cold Spring Harbor
Laboratory’ after WWII as another ‘genetics’ (rather than ‘eugenics’) research
institute. From 1956 to 1976 he worked at Harvard University’s ‘Biological
Laboratories’. Later in life, James Watson embarrasses his employers with several
racist comments, including claims that African blacks are less intelligent than
Caucasians (a central claim of the founder of the eugenics movement, Francis
Galton, who claimed in his 1869 book Hereditary Genius that ‘African blacks’ are
‘on average two grades below whites’ in intelligence, and ‘Australian blacks’ a
further grade below the Africans, despite never meeting any Aboriginal people
himself).
In the 1950s, Winston Churchill leads the propaganda exercise to ‘write out’
British and American support for the discredited, racist and genocidal agenda of
the eugenics movement, of which he was a senior figure (including being
vice-president of the London Eugenics Conference of 1914).
Development of more lethal viruses in Australia
Sir Frank Macfarlane Burnet at the Walter and Eliza Hall Institute (the
immunology department of the University of Melbourne) had been collecting
‘exotic viruses’ in the 1920s under Brigadier Sir Neil Hamilton Fairley, and
developed techniques using bacteria to introduce fragments of one virus into
another, or into bacteria and other cells, to increase their virulence and change
their behaviour. He was recognised for research into bacteriophages in the 1920s,
following work (1925-1927) at the Lister Institute in London. These techniques
allow the engineering of infectious agents to make them more lethal.
In the 1950s, following his secret work urging the Australian government and
defence forces to develop biological and chemical weapons, Burnet changed from
‘researching viruses’ (actually, he was collecting ‘exotic viruses’ from wherever he
could, and introducing them into various species) to ‘studying immunology’. It was
then that Burnet was credited with working out details of the function of
lymphocytes (T-cells and B-cells) and the clonal production of antibodies, for
which he was awarded the Nobel Prize in Physiology or Medicine in 1960.
Understanding the stimulation or inhibition of antibody production by B-cells and
the immune activity of T-cells (thymus-derived lymphocytes) allowed drugs and
viruses to be developed that killed specific tissues and organs in the body, inducing
leukaemias, lymphomas and auto-immune diseases. Leukaemia was already known
to be caused by exposure to ionizing radiation (including gamma rays and X-rays),
especially after Marie Curie died of radiation-induced leukaemia.
Through the 1960s until his death in 1985, Burnet remained vocal about the ‘threat
of overpopulation’ (in the Third World), echoing the concerns of the discredited
eugenics movement and of Paul and Anne Ehrlich at Stanford University.
Demonization of Blacks by Gajdusek and Burnet:
Also in the 1950s, Dr Carleton Gajdusek from the US National Institutes of Health
‘investigated’, with Burnet, a new epidemic of encephalopathy in New Guinea.
What was called ‘kuru’ (a spongiform encephalopathy) was possibly the deliberate
introduction of the sheep infection ‘scrapie’ to the New Guinean highland Fore
people. The investigations were done at Melbourne University’s Walter and Eliza
Hall Institute, which Burnet headed until 1965. The infection, described as due to
the first ‘slow’ or ‘retro’ virus, was attributed to ‘mortuary cannibalism’ by
Gajdusek and Burnet. Gajdusek won the 1976 Nobel Prize for this ‘discovery’,
sharing it with the Jewish-American discoverer of Hepatitis B, Baruch Blumberg.
Gajdusek was jailed many years later (in his old age) for sexual abuse of boys he
‘adopted’ and brought to live with him in the USA from New Guinea and the
Pacific Islands (supposedly ‘for their education’).
Also in the 1950s, the British drug company Burroughs Wellcome develops
azidothymidine (AZT), which destroys white blood cells, causing immune
collapse. AZT was marketed for some years as a treatment for blood and other
cancers, but failed to gain support from oncologists. Wellcome was (and still is, as
part of GSK) also a major producer of opiate drugs, including codeine, morphine
and methadone (mostly derived, since the 1970s, from poppy plantations in
Tasmania).
In 1959, Sir Charles Galton Darwin (grandson of the famous evolutionary
biologist) warns of the danger of overpopulation as ‘more serious than the threat of
nuclear war’ at Caltech (the California Institute of Technology), urging the
development of a ‘more brutal solution’ than mere warfare. This was said in a
speech entitled ‘Forecasting the Future’, published by Unwin Books in Frontiers of
Science (1959). Sir Charles was a physicist involved with the Manhattan Project
during WWII.

1960s:
The World Health Organization adopts global immunization by needle as a prime
objective in ‘promoting global health’, along with the promotion of voluntary
sterilization of men and women and the use of condoms (to reduce population
growth), sponsoring ‘population studies’ around the world based on the
assumption that the world is dangerously overpopulated. Blame for this
overpopulation was placed on the people of the Third World (then usually referred
to as ‘underdeveloped nations’, later ‘developing nations’, where population
increases were more rapid than in ‘developed nations’). Sub-Saharan Africa, South
America (especially Brazil), India and South-East Asia were the focus of WHO
population control programs (centred on the promotion of contraceptives).
1969:
US biological warfare program to develop a ‘refractory agent’ that causes collapse
of the immune system is approved by the Senate after Donald MacArthur seeks
funding, claiming the Russians would otherwise beat them to it.
US biological warfare programs at Bethesda, Maryland are renamed ‘cancer
research programs’ and extended to a network of private universities and
associated medical research institutes, after President Richard Nixon signs a treaty
banning offensive biological warfare.

1970s:
Dr Baruch Blumberg, a Jewish-American graduate of Columbia University, is
credited with the discovery of Hepatitis B (serum hepatitis, known to be
transmitted by infected blood transfusions), though he says, in his 1976 Nobel
speech, that a collaboration of scientists from many nations was responsible for the
isolation of the Hepatitis B surface antigen (known as the ‘Australia antigen’,
since it was first isolated from a blood specimen from an Australian Aborigine,
according to Blumberg).
Blumberg’s previous work in Surinam (formerly Dutch Guiana) in South America
in the early 1950s included testing the resistance of different human races to
Filaria, a parasitic infection that causes ‘elephantiasis’, in Moengo, a remote
mining village populated by expatriates of many nationalities. Blumberg worked at
the NIH from 1957 to 1964, before employment at the Institute for Cancer
Research (later the Fox Chase Cancer Research Center), funded by Fox Chase
Bank (New York). In the mid-1970s he was involved in the development of the
Hepatitis B vaccine, which was licensed to the American drug giant Merck.
Hepatitis B is transmitted by blood and sexual contact, especially anal intercourse,
and, unlike Hepatitis A, can cause chronic scarring of the liver (cirrhosis) and
death many years after the infection; it is also associated with a considerably
increased risk of liver cancer (hepatocellular carcinoma). The likelihood of
cirrhosis is increased by alcohol consumption. In the 1970s, Hepatitis B was found
to be common in South-East Asians, Chinese and Pacific Islanders. Blumberg was
awarded the Nobel Prize for the discovery of Hepatitis B in 1976, sharing the prize
with Daniel Carleton Gajdusek.
The following is an extract from Baruch Blumberg’s addendum to his Nobel
speech:
“On 08.07.76, FCCC signed an agreement with Merck & Company, Inc.,
whose vaccine facilities were located near Philadelphia, to produce the
vaccine using the novel method we had designed. The vaccine was made
from small HBV surface antigen particles, made in the liver cells of the
human host guided by the surface antigen gene introduced by the virus. This
was a unique method for producing a vaccine that had never been attempted
before.
The vitally important next step was the field testing of the vaccine. For a
variety of reasons we had decided that we would not be directly involved in
the testing of the vaccine. That task was undertaken by Dr. Wolf Szmuness
and his colleagues at the New York Blood Bank. Ordinarily, vaccine field
trials involve thousands or even hundreds of thousands of individuals; for
example, 1.8 million people were involved in the testing of the Salk polio
vaccine. Dr. Szmuness's study required less than a thousand volunteers, but
the results were convincing. He showed that the vaccine was highly effective
– over 90% protection rate – and it appeared to have no deleterious side
effects.”
The above claim of ‘no deleterious side effects’
from the Hepatitis B trials conducted by Wolf
Szmuness is strongly contested by AIDS
conspiracy theorists, including Alan Cantwell,
who gave a different account of the New York
Blood Bank Hepatitis B vaccine trials, which
specifically recruited promiscuous homosexuals.
AIDS conspiracy researchers have reported that a
large number of vaccinated men developed
HIV/AIDS in the next few years, and have
suggested that HIV was deliberately introduced
into homosexual populations via contaminated
vaccines in this Hepatitis B vaccine trial.
Paul Ehrlich, at Stanford University, feeds fear of population growth in the Third
World (specifying Africa, South-East Asia and India) in The Population Bomb.
See entry above regarding Baruch Blumberg’s work with Hepatitis B, and vaccine
against it.
1976:
Peter Piot, subsequently Executive Director of UNAIDS, co-discovers the Ebola
virus, cause of a deadly haemorrhagic fever, in Zaire (previously the Belgian
Congo). Piot graduated from the University of Ghent in Belgium in 1974 and
earned a PhD in Microbiology from the University of Antwerp in 1980. He was
also Professor of Public Health at the Prince Leopold Institute of Tropical
Medicine in Antwerp, and a professor at Imperial College, London. From 1991 to
1994 he was president of the International AIDS Society, and he was ennobled as a
Baron by King Albert II of Belgium in 1995.
1978-79
First reports of a mysterious immune deficiency disease in homosexuals in New
York (Manhattan), Los Angeles and San Francisco. The disease is typified by
opportunistic infections and a high incidence of Kaposi’s Sarcoma, previously
largely confined to elderly Jewish men. This disease, later called AIDS, was first
named ‘GRID’, or Gay-Related Immune Deficiency.
Around the same time (late 1970s, early 1980s) come the first reports of a similar
disease affecting heterosexual men and women (and later children) in Zaire,
Burundi and Rwanda. These were all ex-Belgian colonies, the indigenous people of
Zaire having been treated with such extraordinary cruelty during the reign of King
Leopold II, in what he called the “Congo Free State” (1880s to 1905), that he
received international condemnation, and what he had claimed as a ‘personal
possession’ was taken out of his hands by the Belgian Government. The grotesque
cruelty of the Belgians in the Congo included hacking off the hands and
decapitation of Africans who did not produce the required quotas of rubber, and
throwing their children to crocodiles (see the Wikipedia entries on the Congo Free
State and the Belgian Congo for an outline).

King Leopold II
of Belgium
Epidemiological and historical evidence suggests HIV was first introduced to
heterosexual women (targeting prostitutes) in ex-Belgian colonies in Africa,
specifically Zaire (Belgian Congo), and adjoining Rwanda and Burundi, via
infected batches of vaccines supplied by Smith-Kline Biologicals, based in
Antwerp, Belgium.
Later, in the early 1980s, HIV appears to have been seeded to other countries in
Sub-Saharan Africa, including Uganda, Malawi and Tanzania by the same means.
Introduction to South Africa, Zimbabwe and Zambia, to the West African nations,
and to Ethiopia and Sudan in the Horn of Africa appears, according to the
epidemiological data on the UNAIDS website and other sources (including
Scientific American and Nexus publications of 1988), to have occurred in the early
to mid-1980s.
The immunization program with batches of infected vaccines would have had to be
paid for, orchestrated, organised and overseen by the World Health Organization,
under advice from Professor (Sir) Gustav Nossal. Gus Nossal took over as head of
the Walter and Eliza Hall (Virology and Immunology) Institute at the University of
Melbourne in 1965, after Frank Macfarlane Burnet left the institute.
1980s:

Gustav Nossal and Victorian
Premier John Brumby

Increasing numbers of homosexuals, injecting drug users and recipients of blood
transfusions in Western nations are diagnosed with AIDS; much larger numbers of
Africans from the Congo and other Sub-Saharan African states are also diagnosed
with AIDS. The Belgian microbiologist Peter Piot at the Prince Leopold Institute
of Tropical Medicine is involved in collaborative projects in Zaire, Burundi, Ivory
Coast, Tanzania and Kenya. The prevalence of HIV antibodies, after the
development of a test by Max Essex and co-workers at Harvard University, is
estimated at up to 20% of the population in some of these African nations. Peter
Piot subsequently heads the United Nations AIDS program after the death in a
plane crash of Jonathan Mann.
1984:
Robert Gallo at the National Cancer Institute in Bethesda, Maryland announces, in
a press conference arranged by the National Institutes of Health, the isolation of
the virus that causes Acquired Immune Deficiency Syndrome (AIDS), which he
names Human T-lymphotropic Virus III (HTLV-III).

In Australia, the government recommends that doctors seek voluntary consent
from Aboriginal and Torres Strait Islander and South-East Asian patients for
immunization against Hepatitis B; the vaccine is also advised for health workers,
due to the risk from accidental ‘needle-stick injuries’. Later (post-2000), Hepatitis
B immunization was added to the routine immunization program for all babies in
Australia. The fact that health workers and Indigenous people immunized against
Hepatitis B in Australia did not contract AIDS, while a disproportionate number of
subjects in the gay Hepatitis B vaccine trials conducted by Wolf Szmuness and the
New York Blood Bank in the late 1970s did, indicates that HIV/AIDS (GRID)
infection in these gay men was unlikely to be due to accidental contamination of
Hepatitis B vaccines.

Chapter 2
THE AUSTRALIAN GOVERNMENT’S RESPONSE TO
HIV/AIDS
In 1995, the Commonwealth Government of Australia published an “evaluation of
the National HIV/AIDS Strategy, 1993-94 to 1995-96”. Titled “Valuing the past…
investing in the future”, the evaluation and report were commissioned by the
(then) Commonwealth Minister for Health, Dr Carmen Lawrence. Dr Lawrence,
who has a psychology degree, appointed Professor Richard Feachem (CBE, BSc,
PhD, DSc, Hon FFPHM), Dean Emeritus of the London School of Hygiene and
Tropical Medicine, as the “independent evaluator”. Professor Feachem wrote in
the foreword that he fully supports the report’s content and conclusions, and that:
“Australia is to be commended for the prompt and creative way in which
it has responded to this new and special disease. Two features of this
response have been of particular importance and must be maintained: first,
non-partisan political support, which has allowed pragmatic and effective
programs to operate; second, the partnership, which has harnessed the
energies of those groups most affected by HIV, government at several levels,
and researchers and health professionals. Australia can be proud of its
achievements in controlling the spread of HIV and in developing services to
provide care and support for people living with HIV/AIDS. All this has
been accomplished at reasonable cost.”
He adds, though, that:
“There is, however, no room for complacency. With current trends, the
target of 2 new infections per 100,000 people per year by the year 2000 will
not be achieved. In particular, the unacceptably high rate of new infections
among gay men must be confronted, and the emerging epidemic among
Aboriginal and Torres Strait Islander people calls for greatly increased
effort and resources.” (emphasis added)
Is there really an “emerging epidemic” of AIDS among the Aboriginal and Torres
Strait Islander populations, and if so, what is causing it? In what ways are the
current resources being spent? Why are the specified “high risk groups” so similar
to the previous targets of negative eugenics programs?
“Valuing the past…investing in the future: evaluation of the National HIV/AIDS
Strategy 1993-94 to 1995-96” begins with an “Overview”, in which “the problem”
is summarised as:
“HIV/AIDS is a new disease and, despite enormous advances in
knowledge in the past decade, it is still inadequately understood. There is no
cure and there is no vaccine, and these will probably elude us for some time.
The best estimate of the total number of people who had acquired HIV
infection in Australia by the end of 1994 is 16 000. Among the 5737 people
who had progressed to AIDS by the end of 1994, there had been 4014
deaths. HIV/AIDS is an enormous public health problem by any measure.”
The overview continues with a false claim regarding 100% fatality from HIV
infection:
“HIV/AIDS has similarities with other infections in terms of its routes
of transmission and modes of treatment and care. It remains exceptional,
though, because it presents a unique combination of characteristics: it is 100
per cent fatal (as far as we know); there is a long incubation period, during
which an HIV-positive person may be unaware of his or her infection; there
is often a long period between infection and death, during which an HIV-positive person shows no sign of ill health and is able to transmit the
infection; there is no cure; and the major modes of transmission involve
intimate and sometimes illegal behaviour, making prevention efforts difficult
and community reaction to the disease complex. Added to this is the fact that
Australia is part of a region with rapidly growing HIV epidemics.” (p.2)
It was common knowledge in the mid-1990s that HIV infection does not cause 100
percent mortality. There have been, since tests for HIV antibodies became
available (in the mid-1980s), a group of people (mainly men) who are known to
have HIV antibodies in their blood but have not developed AIDS. These are
termed, by the AIDS research establishment, “long-term non-progressors”.
Describing HIV infection as “100 percent fatal” is a very serious error, since it
would predictably lead to hopelessness and extreme pessimism in any person
identified as “HIV-positive”. It would also predictably raise the level of paranoia
about HIV infection and drive more people to take potentially toxic drugs in an
attempt to delay what is seen as an inevitable progression of the infection. The
main drug that has been promoted for this reason is Glaxo-Wellcome’s
Zidovudine, also known as azidothymidine (AZT). AZT has been known since the
1960s, when it was used as a risky “palliative” treatment for certain serious
cancers, to cause a range of serious health problems, including damage to white
blood cells and immune suppression. Pessimism and a sense of hopelessness are
also known to have suppressive effects on the immune system, and on health
generally. These problems will be explored in later chapters.
I first read the false claim that “HIV infection is 100% fatal” in 1997, by which
time I was well aware that 100 percent fatality is extremely rare in medicine. Since
I was also aware of intense medical interest in the “long-term non-progressors”
(those who had been identified as having HIV infection in the mid-1980s), I read
the following claim by Professor John Mills of the Macfarlane Burnet Centre:
“There will be cases where patients have had the infection for 20 or 30
years before becoming ill. However virtually everybody infected with HIV
will eventually get AIDS, if they are not treated. The estimate at the moment
is about 95% but it could be 100%.”
The above quote is transcribed from a “public education” pamphlet titled
“HIV/AIDS – The Whole World’s Problem” produced by the Macfarlane Burnet
Centre in Melbourne and is “adapted from a talk by Professor John Mills, Director,
Macfarlane Burnet Centre for Medical Research”. Professor Mills, a graduate of
Harvard University in the USA, is also the Director of the National Centre in HIV
Virology Research, and is an honorary Professor of Microbiology at the
University of Melbourne, Monash University and the Royal Melbourne Institute of
Technology. He is also “adjunct Professor” at the University of California in the
USA and Chairman (and inaugural President) of the Australian Association of
Medical Research Institutes. Presenting an obvious conflict of interest, Professor
Mills is also, according to the Macfarlane Burnet Centre’s 1997, 1998 and 1999
Annual Reports, Director of AMRAD Corporation (a Melbourne-based drug and
biotechnology company) and Rothschilds Bioinvestment Fund, a profit-motivated
branch of the massive Rothschilds Banking Corporation.
Human Immunodeficiency Virus was publicly identified as the cause of AIDS in
1984, following publication of work by the American cancer researcher Robert
Gallo. Gallo, who was Chief of the Laboratory of Tumor Cell Biology at the
National Cancer Institute in Bethesda, Maryland, named the virus he postulated as
the cause of AIDS, which had been first described in 1978, HTLV-III (Human
T-Lymphotropic Virus, type III). Since HIV infection has only been detectable
since the mid-1980s, the claim that “there will be cases where patients have had
the infection for 20 or 30 years before becoming ill” is clearly speculative. The
claim that HIV infection is fatal “if not treated” is also extremely misleading. No
drug treatment has been discovered that prevents people infected with HIV from
dying.
The fact that drug treatment does not prevent death from HIV is admitted, as it
turns out, in the same pamphlet:
“There are a number of ways in which people are treated, including
general supportive care and good nutrition. The associated infections and
tumours which occur can be treated to some extent. Since 1987 drugs have
been available for treatment – AZT, Zidovudine, ddI (didanosine), ddC
(zalcitabine) and stavudine (d4T). They are reasonably effective but really
only make a two to three year difference in survival.” (emphasis added)
Obviously, the more pessimistic the alleged “untreated prognosis” (likelihood of
becoming ill and dying), the more effective any “palliative” drug treatment will
appear, clinically and statistically. For example, if it is predicted that a person will
die from HIV infection in 6 months and they live for two years while taking AZT,
it will appear that the drug has prolonged life by 18 months. If the prediction is,
instead, that people will die in 10 years’ time, the same drug would be seen as
having worsened their chances of survival. The same applies to predictions of
100% fatality. If statistics are compiled based on this assumption, and
extrapolations are made of the likely incidence of HIV infection in the general
population, or subgroups within it, these will be glaringly inaccurate – but
consistently tending towards exaggeration of the efficacy of drug treatment.
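The arithmetic in this example can be sketched in a few lines (a purely
hypothetical illustration using the figures from the paragraph above, not data from
any actual trial):

```python
# Hypothetical sketch: how the assumed "untreated prognosis" changes
# the apparent benefit of a palliative drug. Figures are illustrative only.

def apparent_benefit_months(predicted_untreated_months: int,
                            observed_treated_months: int) -> int:
    """Apparent survival gain: survival observed on the drug minus
    the survival that was predicted without treatment."""
    return observed_treated_months - predicted_untreated_months

# Prediction of death within 6 months; the patient survives 24 months on the drug:
print(apparent_benefit_months(6, 24))    # 18 -> the drug appears to add 18 months

# The same outcome, but with a predicted survival of 10 years (120 months):
print(apparent_benefit_months(120, 24))  # -96 -> the drug appears to shorten life
```

The same observed outcome thus flips from "life prolonged" to "life shortened"
purely on the strength of the assumed prognosis.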
The fact that overly pessimistic prognoses increase the apparent effectiveness of
interventional treatments has relevance to disease states generally – both physical
and mental. Of the medical specialties, psychiatry and oncology require special
mention in this regard. Pessimism is a consistent feature of the entire system of
psychiatric labels that are currently employed by health care workers around the
world. A person who has been “depressed” is predicted, on the basis of statistics, to
have recurrent episodes unless they “take their medicine”. People who have had
“psychosis” are likewise predicted to have further, worsening “episodes of illness”
unless, again, they take their tablets (or subject themselves to injections). Those
who have been diagnosed with “manic-depression” are told that only by taking
“mood stabilisers” (generally lithium) can they hope to remain sane. In each of
these cases the chronic disease states caused by pessimistic prognoses combined
with drug toxicity are routinely ignored, denied, and blamed on the “mental
illness”. When the roots of these pessimistic “psychiatry” labels are examined, the
historical trail inevitably leads to “negative eugenics”.
The 1995 report, Valuing the past…investing in the future is a publication of the
“AIDS/Communicable Diseases Branch” of the (Australian) Commonwealth
Department of Human Services and Health. Although the “independent evaluator”
appointed by the Minister for Health was the British professor Richard Feachem,
the evaluation was a group effort, organised by the Department of Health staff (the
“AIDS/Communicable Diseases Branch Evaluation Team”). Professor Feachem
evidently spoke to representatives of various “interest groups” as the main part of
his “independent evaluation” and was “supported by an evaluation advisory
committee” chaired by Peter Read, chairman of the Intergovernmental Committee
on AIDS. Members of this committee included representatives from the
Queensland Department of Health, Victorian Department of Health and Human
Services, Commonwealth Department of Human Services and Health, Australian
Federation of AIDS Organisations, Haemophilia Foundation Australia, People
Living With AIDS NSW, National Association of People With HIV/AIDS,
Australasian Society of HIV/AIDS Medicine, Australian National Council on
AIDS, National Centre in HIV/AIDS Social Research, National Centre in HIV
Epidemiology and Clinical Research and the South Australian Development
Commission. Thus the report, other than the foreword by Professor Feachem, has
no clear author. It does, however, give some indication of the Australian
Commonwealth Government’s view of itself regarding its success in containing the
global HIV/AIDS pandemic at the end of the “Second Strategy”. This was actually
the second phase of the Government’s 1989 strategy, as the report explains:
“The National HIV/AIDS Strategy 1993-94 to 1995-96 is Australia’s
second National HIV/AIDS Strategy. The first Strategy operated from 1989
to 1993 and was evaluated in 1992. The current evaluation report was
requested by Federal Cabinet when it agreed to the National HIV/AIDS
Strategy 1993-94 to 1995-96.”
Following the Department of Health’s apparent success (according to its own
report), strong support is urged for a “third Strategy” in the list of 79
recommendations that opens the report. This strategy should, according to the
report, be especially directed towards “homosexually active men”, “injecting drug
users” and “Aboriginal and Torres Strait Islander communities”, the first four
recommendations being:
1. It is recommended that homosexually active men remain the highest priority
for the Education and Prevention Program and that program monitoring and
accountability arrangements take account of the need to monitor program
implementation in relation to stated priorities.
2. It is recommended that the Commonwealth, the State and Territory
governments and the Australian Federation of AIDS Organisations and AIDS
councils reassess and refocus education and prevention measures for
homosexually active men, to further decrease the practice of unprotected anal
intercourse and thus decrease the rate of HIV.
3. It is recommended that injecting drug users remain a priority for the
Education and Prevention Program. Community development and peer
education for injecting drug users should be strengthened.
4. It is recommended that the Commonwealth, the Intergovernmental
Committee on AIDS and the Australian National Council on AIDS continue
to support intensive education and prevention activities among Aboriginal
and Torres Strait Islander communities. These activities should be placed
within the broader sexual health and injecting drug use contexts and use
models of best practice.
In the acknowledgements of the 1995 report on the independent evaluation of the
National HIV/AIDS Strategy, Valuing the past…investing in the future it is
admitted that Chapter 2 of the report, titled “the epidemiology of HIV and AIDS in
Australia” was “drafted by John Kaldor and the staff at the National Centre in HIV
Epidemiology and Clinical Research, with the advice of Sue Kippax, June
Crawford, Anthony Smith and Nick Crofts” rather than by Professor Feachem. Dr
Nick Crofts is Head of the Epidemiology and Social Research Unit at the
Macfarlane Burnet Centre. This will be seen to have relevance when the activities
of the Rio Tinto Aboriginal Foundation, via the Centre, are subsequently explored.
Chapter 2 of Valuing the past…investing in the future is divided into 9 sections,
each of which contains an analysis of statistics gained through “HIV surveillance”.
The sections are:
1. Introduction
2. Overview of the HIV epidemic
3. People living with HIV/AIDS
4. Homosexually active men
5. People who inject drugs
6. Aboriginal and Torres Strait Islander people
7. Medically acquired HIV infection
8. Sex workers
9. Heterosexual contact
The above list gives the impression that homosexually active men, people who
inject drugs, Aboriginal and Torres Strait Islander people and sex workers are more
likely than the “general population” to have HIV infection. It has been stated
elsewhere in the “report” that a major public health problem is that of people who
appear to be well but are actually infective. The potential to stigmatise Aboriginal
and Torres Strait Islander people as being “more likely to be infectious” is obvious.
By associating “Aboriginal and Torres Strait Islander people” with homosexuals,
drug addicts and prostitutes (“sex workers”) the possibility of stigmatisation
increases to a high probability.
When one examines the details of the “emerging epidemic among Aboriginal and
Torres Strait Islander people”, however, one finds that this epidemic, while
expected and predicted, does not actually exist. The authors argue, however, that
such an epidemic is imminent, and that the AIDS research industry, and
surveillance industry, need more resources (money) and effort (people) to avert
possible disaster. The section in Chapter 2, “The Epidemic of HIV and AIDS in
Australia”, on “Aboriginal and Torres Strait Islander People” begins with the
veiled admission that, since the time of the notorious “Grim Reaper” ads in the
1980s, the indigenous people of Australia have been regarded, by the medical
profession, as a “high risk group”:
“The possibility of a major outbreak of HIV infection among Aboriginal
and Torres Strait Islander people has been a cause for concern since the
AIDS epidemic began in Australia.”
No such epidemic has occurred, but the authors continue to predict one:
“So far, the information available from routine surveillance indicates that
the rates of HIV infection among Aboriginal and Torres Strait Islander
people are similar to those in the non-indigenous population…But it appears
that the rate of HIV diagnosis in Aboriginal and Torres Strait Islander people
is increasing, in contrast with rates in the overall Australian HIV epidemic,
which reached a peak in the early to mid-1980s. New HIV diagnoses in
Australia between 1992 and 1994 contained only 17 per cent of cumulative
HIV notifications, whereas among Aboriginal and Torres Strait Islander
people during the same period new HIV diagnoses constituted 50 per cent of
cumulative diagnoses (see Table 2.13).” (p. 49)
Table 2.13 gives the following details. Between 1985 and 1988 there were 20
reported cases of HIV infection in people categorised as “Aboriginal and Torres
Strait Islanders”, out of an Australian total of 11,054 cases. Between 1989 and
1991 there were 25 new “A & TSI” cases (as presented in this table) compared
with 4,428 cases for “Aust”. The two years from 1992 to 1994 saw 45 cases in
Aboriginal and Torres Strait Islanders out of an Australian total of 3,239 reported
new HIV infections. From these statistics it has been calculated that, since 50 per cent of the new cases in Aboriginals and Torres Strait Islanders were reported in the 1992-94 period, while in Australia as a whole only 17 per cent of the cumulative total of cases was reported during the same period, this indicated an “emerging epidemic”. When the rest of the text, which will be quoted
almost in full, is carefully analysed, keeping in mind the history and doctrines of
eugenics, questions are raised more by what is omitted than what is included.
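The report's 50 per cent and 17 per cent figures can be verified directly from the Table 2.13 numbers reproduced above. The following minimal sketch takes the year groupings and totals exactly as transcribed; nothing else is assumed:

```python
# Reported new HIV diagnoses, grouped as in Table 2.13 (figures as
# transcribed above; the grouping of years follows the report).
atsi = {"1985-88": 20, "1989-91": 25, "1992-94": 45}         # Aboriginal & TSI
aust = {"1985-88": 11054, "1989-91": 4428, "1992-94": 3239}  # all Australia

# Share of cumulative diagnoses falling in the 1992-94 period.
atsi_share = atsi["1992-94"] / sum(atsi.values())   # 45 / 90
aust_share = aust["1992-94"] / sum(aust.values())   # 3239 / 18721

print(f"A&TSI share reported in 1992-94:  {atsi_share:.0%}")   # 50%
print(f"Australia-wide share in 1992-94:  {aust_share:.0%}")   # 17%
```

The percentages quoted in the report are thus arithmetically consistent with the table; the questions raised concern what such small absolute numbers (45 cases) are taken to imply.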
It is for this reason that the following quote is rather long, since it is intended to
serve as a reference for analysis in subsequent chapters. Continuing from the
previous quotation:
“Possible biases in these data include misclassification or non-reporting
of Aboriginality and the non-uniform surveillance mechanisms for HIV
infection that are used by the States and Territories – some do not include
data on Aboriginality; others have included such data only in recent years.
The possibility of relative underdetection of early HIV infection among
Aboriginal and Torres Strait Islander people also exists.
“Indirect sources of data support routine surveillance in indicating a
currently low prevalence of HIV among Aboriginal and Torres Strait
Islander people. First, HIV prevalence among antenatal women, measured
through voluntary antenatal screening and anonymous delinked surveys, has
remained extremely low. Although the proportion of Aboriginal and Torres
Strait Islander women included in these prevalence surveys is unknown, it is
probable that a substantially increased prevalence among such women would
have been detected. Second, all States and Territories perform HIV testing
on a proportion of prison entrants to the end of 1994 – virtually all entrants
are tested in New South Wales, Queensland and Victoria. In the Northern
Territory, which has the highest proportion of Aboriginal and Torres Strait
Islander people in both the overall and the prison population, 75 per cent of
prison entrants have been tested since 1991. Of the over 5000 HIV tests
performed during that time there were only two seropositive results, neither
of which was in an indigenous person.
“Although there is some similarity in reported modes of transmission of
HIV among Aboriginal and Torres Strait Islander people and people from
non-indigenous backgrounds, among whom sex between men is the
predominant HIV transmission mode, there has been an increasing
proportion of HIV diagnosis in Aboriginal and Torres Strait Islander people
reporting heterosexual contact as the likely mode of transmission (see Table
2.13). Among Aboriginal and Torres Strait Islander people the cumulative
proportion of HIV cases attributed to heterosexual contact (24 per cent) is
much higher than in the non-indigenous HIV-infected population (7 per
cent), and half the cases have been reported since 1992, in contrast with the
non-indigenous cases, of which over 80 per cent were reported before 1992.
This may suggest an increasing rate of heterosexually acquired HIV
infection among Aboriginal and Torres Strait Islander people, or it could be
explained by more active surveillance, including contact tracing, in recent
years. As in the non-indigenous population, the possibility exists that
heterosexual exposure is incorrectly attributed, but the equal number of male
and female cases attributed to heterosexual contact argues against significant
misclassification bias.
“Predictions of a large HIV epidemic among Aboriginal and Torres Strait
Islander people have been stimulated by the documentation of very high
rates of other STDs [sexually transmitted diseases] in this population. As
well as representing an important health problem in their own right, other
STDs are believed to increase the likelihood of HIV transmission,
particularly if they cause genital ulceration. Although STD rates among
homosexual men and female sex workers have declined substantially since
the start of the AIDS epidemic, a similar fall in STD rates has not been
observed among Aboriginal and Torres Strait Islander people.
“A large HIV epidemic among Aboriginal and Torres Strait Islander
people would have immense social and health ramifications. The already
considerable imbalance in health status between Aboriginal and Torres Strait
Islander people and non-indigenous people would be amplified. Existing
health problems among Aboriginal and Torres Strait Islander people – such
as malnutrition, tuberculosis and other infectious diseases – would be
exacerbated and the current life expectancy gap of 20 years between
Aboriginal and Torres Strait Islander people and non-indigenous Australians
could widen.
“Although HIV prevalence among Aboriginal and Torres Strait Islander
people appears to be low, the recent increase in the rate of diagnosis,
including the trend to a higher proportion of heterosexually acquired cases of
HIV, and continuing high rates of other STDs emphasise the urgent and
continuing need for improved HIV/STD prevention and control programs.”
(pp. 49-51)
Australia’s success in controlling the AIDS epidemic is contrasted, in Chapter 7,
titled “Looking outwards – the International Assistance and Cooperation
Programs”, with the situation in “developing countries”:
“The HIV/AIDS pandemic is spreading rapidly through the developing
world. In 1990 an estimated 80 per cent of all new infections were occurring
in developing countries and this has been predicted to rise to 95 per cent by
the year 2000 (World Bank 1993). The majority of these infections have
resulted from heterosexual contact and the incidence of infections among
women of child-bearing age is rising steeply.
“The worst affected area is sub-Saharan Africa – with an estimated 11
million people infected with HIV by mid-1995 – but the most alarming
trends are now in southern and south-eastern Asia. Although extensive
spread of HIV in this region began only in the mid-1980s, an estimated 3.5
million adults were infected by mid-1995, up from 2.5 million in mid-1994
(WHO 1995).
“Reported infection rates in the Pacific nations and Papua New Guinea
are low. But testing levels in these countries are also low, which suggests
that available figures underestimate the extent of infection in this region.
“The inclusion of the International Assistance and Cooperation Program
in the National HIV/AIDS Strategy is an acknowledgement that HIV is a
pandemic and that efforts to control its spread domestically cannot be made
in isolation from international efforts to contain the disease. It is also
testimony to Australia’s humanitarian commitment to help poorer nations
cope with the impact of HIV and to prevent further spread.” (pp. 167-8)
Three government agencies are identified, in Professor Feachem’s report, as being “responsible for supporting Australia’s international response”. These are the
Australian Agency for International Development (AusAID), the Department of
Human Services and Health and the Department of Foreign Affairs and Trade.
Some of these programs will be explored in subsequent chapters, particularly those
involving the International Health Unit of the Macfarlane Burnet Centre in
conjunction with AusAID. Valuing the past…investing in the future explains the
central role of AusAID in the second Strategy of the National HIV/AIDS program:
“AusAID is identified in the second Strategy as the agency that had
provided, and would continue to provide, most of the financial support for
international HIV/AIDS programs. It spent an estimated $14.02 million in
1994-95, up from $6.14 million in 1992-93.
“AusAID delivers international HIV/AIDS assistance through individual
country and regional bilateral programs, multilateral agencies, and NGOs.
Among the agencies and organisations receiving funding in 1994-95 were the
WHO Global Program on AIDS (in the process of becoming UNAIDS), the
United Nations Development Fund, the South Pacific Commission,
UNICEF, AFAO [Australian Federation of AIDS organizations], the
Australian Council for Overseas Aid, the Australian Red Cross Society,
World Vision, Community Aid Abroad, the Adventist Development and
Relief Agency and Quaker Service Australia.”
The Australian government’s HIV/AIDS strategy, domestically and internationally,
has, since the “First Strategy”, been centred on “needle and syringe exchange” and
the promotion of condom use. This combined strategy is frequently referred to as
“harm reduction”, a term that is also, and even more inappropriately, used to
describe needle and syringe distribution without reliable collection of contaminated
(used) needles. Professor Feachem’s evaluation of the Australian Government’s
First Strategy is that “Australia has been successful in containing epidemics among
injecting drug users, sex workers and heterosexuals but there is still much to be
done.” Needle exchange programs are recommended strongly, which the report
claims “must be a foundation of Australia’s prevention efforts in a third Strategy
and beyond”:
“Needle and syringe exchange programs were established before the
first Strategy and have continued in the second Strategy. A study of the cost-
effectiveness of these programs suggests that they prevented approximately
3000 infections in 1991 [based on very pessimistic, indeed catastrophic,
forecasts] and saved over $250 million through the avoidance of treatment
costs associated with those infections. Needle and syringe exchange
programs must be a foundation of Australia’s prevention efforts in a third
Strategy and beyond. Peer education and community development are
important adjuncts to the provision of needles and syringes and should also
be integral to the long-term response.” (p.2)
“Peer education”, including education in schools, is centred on two related
strategies: “harm reduction” and “safer sex”. The latter, revised from the
previously used “safe sex”, means little more than promotion of condom use,
particularly among young people and the identified “high risk groups”. The
Valuing the past…investing in the future report advocates further condom
promotion and reform of laws relating to prostitution, citing the low prevalence of
HIV in sex workers in Australia as proof that such measures have “contained the
[predicted, but not actual] epidemic among sex workers”.
The report claims a stark difference in southern Asia and sub-Saharan Africa
regarding HIV in female sex workers:
“In some parts of the world, notably southern Asia and sub-Saharan
Africa, female sex workers have been documented as having a high
prevalence of HIV infection. In contrast, HIV prevalence among female sex
workers in the sex industry in Australia has been very low…Condom use is
reported to be very high in the organised sex industry: this is the result of
vigorous peer education by sex workers themselves, supported by
government programs. Laws relating to prostitution should be reformed to
increase the safety of sex workers in all parts of the industry. Sex workers
who are illegal immigrants, especially from countries with a high prevalence
of HIV, and men with a history of sex work require greater attention in a
third Strategy; otherwise, HIV prevalence rates could rise rapidly among
these communities and their clients.” (p.3)
The incarceration of “illegal immigrants” in detention centres has recently become
a public scandal in Australia, following exposés of the cruel treatment of those
unfortunate enough to be imprisoned for this reason. The disproportionate number
of Aboriginal people in Australian jails has long been an outrage. The
Commonwealth Department of Human Services and Health report, Valuing the
past…investing in the future, makes passing reference to the “over-representation”
of indigenous people in the prison population, but predictably ignores the racial
discrimination that has resulted in this problem. The risk that imprisonment introduces people to the self-injection of drugs, with the attendant risk of infection with HIV and hepatitis C, is evident from what is mentioned, however:
“Since nearly half of all people imprisoned in Australia report a history
of injecting drugs, and about half of these people continue to inject while in
prison, it is vital that more efforts are made to overcome barriers to the
provision of safe injecting equipment in Australian prisons [a better
alternative would surely be to release those incarcerated for political reasons,
minor crimes and ‘victimless crimes’; and abolish privately owned prisons,
which, since they profit from having more prisoners, can be predicted to
contribute to a dangerously expanding “imprisonment industry”]. This
would also help reduce the currently high rates of hepatitis C transmission
resulting from the sharing of contaminated injecting equipment in prisons.
HIV prevention efforts in prisons would also be helped if condoms were
provided. Prisons are of central importance in HIV prevention among
Australia’s indigenous people because of the over-representation of these
people in prison populations.” (p.3)
The Macfarlane Burnet Centre, based at the Fairfield Infectious Diseases hospital
in Melbourne, is the main NGO (non-government organization) involved in
promoting HIV “harm reduction” strategies in the Asia-Pacific region. Experts at
the centre advise the United Nations and the Australian government about how to
contain the AIDS epidemic. They train the trainers – setting up AIDS programs in
various “developing countries” and inviting health workers to “learn about public
health”. Strangely, however, the HIV surveillance and prevention strategies they
promote are almost identical regardless of whether the “at-risk population” is male
homosexuals in Australian cities or remote hill-tribes in Asia: condom use and immunization against viral infections, after extensive blood collection. Replacing “Third World debt” with “Third World compensation” is not on the MBC agenda, although this would achieve much more in terms of international health.
Homosexually active men account for the vast majority of cases of AIDS that have
been diagnosed in Australia. This has remained the case since the first diagnosed
cases of HIV infection and AIDS in Australia. The report confirms this:
“The vast majority of those diagnosed cases of HIV and AIDS in
Australia for which the route of infection was reported have been in men
who became infected through homosexual contact…The numbers of
diagnosed cases of HIV infection associated with heterosexual contact, the
injection of illicit drugs, or medical procedures have been relatively small
compared with the number of cases transmitted through sexual contact
between men. The number and proportion of cases for which no exposure
category was reported has steadily declined as surveillance procedures have
improved [increased] nationally. The annual number of HIV diagnoses in
women has been roughly constant in the past decade, but there has been a
clear move towards a higher number of cases attributed to heterosexual
transmission (parallelled by a decline in other modes of transmission).
Because of the relatively small number of women who have acquired HIV
infection in Australia, the transmission of HIV from mother to child has been
rare.” (p.32)
The reference, above, to HIV infection from “medical procedures” refers to
infection from blood transfusions and other blood products between 1981 and 1985, during which time about 200 people are known to have developed AIDS from
infected blood, many of them suffering from the hereditary coagulation disorder
haemophilia (which, coincidentally, runs in the British royal family). Since
“universal screening of blood donors” was introduced in 1985, the risk of such
infection is said, in the report, to have “substantially improved”. There is another
“medical procedure”, however, that is not discussed as a possible risk factor in this,
or any other official text or report about HIV infection and AIDS. This is
immunization, also known as “vaccination”.
Since the 1970s, male homosexuals have been specified, by the NHMRC and other
medical authorities, as being at particular risk, along with Aboriginal Australians,
Torres Strait Islanders and South-East Asians (particularly Vietnamese) of
developing hepatitis B, a viral infection that can cause cirrhosis (chronic
inflammation and scarring of the liver, ultimately resulting in liver failure and
death). This was taught to me at medical school in the early 1980s, and when
hepatitis B immunization became publicly available, in the form of SmithKline
Beecham’s Engerix-B, it was these populations that I, and other doctors, identified
as “high risk groups”. We also identified ourselves as being in a “high risk group”
and the more anxious of the young doctors had themselves immediately
immunized.
The American AIDS researcher, Dr Alan Cantwell, is one of a small number of
physicians who have publicly expressed an opinion that, firstly, the AIDS epidemic
is man-made, and secondly, that the initial inoculation of HIV into white
homosexual men in the late 1970s was probably done through vaccination in
experimental hepatitis B vaccine trials.
Because there has not been an epidemic of HIV and AIDS among the health
workers who were immunized with Engerix-B and Merck Sharp & Dohme’s
alternative, H-B-Vax II, in the 1980s, the theory that HIV was unintentionally
introduced into homosexuals via hepatitis vaccines in the 1980s is not tenable.
Could it have been done intentionally – by using contaminated batches of vaccine
to target male homosexuals? Certainly the medical infrastructure exists to conduct
such an act. There exist special clinics that service largely homosexual clients,
many of which are connected with STD (sexually transmitted disease) clinics via
the public hospital system.
When the officially declared “high risk groups” for developing HIV infection and
AIDS, and the epidemiology of the epidemic, are compared with the previous
targets of negative eugenics programs, some notable coincidences are immediately
evident. Specifically, the Nazis targeted homosexuals, drug addicts and “blacks” (‘Negroes’) as degenerate and in need of “mercy killing”. If one looks deeper at
the eugenics doctrines that the Nazi program was based on one finds also a morbid
preoccupation with the “threat of overpopulation”. This threat was seen, by
eugenists, as coming from the “uncivilised masses”, most of whom were dark-skinned and resident in what were designated as “underdeveloped countries”. Far
from diminishing after the carnage of the Second World War, fears of
overpopulation expressed by a predominantly white, male, professional elite grew
in the 1950s, during the early years of the Cold War and exploded in the 1960s,
with the publication, in 1968, of The Population Bomb, by Stanford University’s
Professor of Population Studies, Paul Ehrlich.

Chapter 4
THE FOUNDATIONS OF THE EUGENICS MOVEMENT
The word ‘eugenics’ was coined by the English scientist Francis Galton in 1883. Some years later Galton, a cousin of Charles Darwin, founded, with the evolutionary biologist’s son, Major Leonard Darwin, the first Society for Eugenics.
The stated objective of the eugenics society, and the politico-scientific movements
it spawned, was to improve the human “stock” by selective breeding. This was
attempted by various means, based on two complementary strategies: those of
“superior stock” were encouraged to have larger families, and those deemed to be
of “inferior stock” were prevented from having children at all. These parallel
strategies were termed, respectively, “positive” and “negative” eugenics.
The present work examines the role eugenics has played in the planning and execution
of genocide, specifically exploring the theory that the global AIDS epidemic is the
result of the implementation of negative eugenic programs with the intent of
‘culling undesirables’ and reducing the ‘threat of overpopulation’.
The 1948 United Nations laws against genocide were formulated to prevent a repeat of the horrific genocide that had occurred between 1938 and 1945 in Nazi-ruled Europe. This genocide, including the mass-murder of over 20 million Jews, Poles, Russians and Gypsies ('non-Germans'), had come about because of the merciless
application of “negative eugenics” programs by the German National Socialist
(Nazi) regime.
It is suppressed history, in Commonwealth nations, that the notorious doctrines of
eugenics did not originate in Germany and were certainly not the creation of Adolf
Hitler. Rather, they were exported to Germany from the University of Cambridge
in England in the late 19th century. Cambridge was then, and remains, one of the
most respected universities in the world, and was the ‘mother university’ of
Harvard University in the USA when the latter was founded in the 1600s. The
University of Cambridge also has close links with the oldest and most prestigious
Australian universities, including the University of Melbourne, where my co-researcher Sara obtained her education degree during the writing of this book, and
the University of Queensland, where I gained my medical degree in 1983.
When I studied medicine, from 1978 to 1983, the first reports of a new infectious
disease that affected ‘gays’ in the USA were filtering through to medical students in
Australia, but it was only when I did my residency at the Royal Brisbane Hospital
that I heard that it was a viral disease, thought to be caused by a ‘slow virus’ or
‘retrovirus’, transmissible, like viral Hepatitis B, by blood and sexual contact. I
did not treat any patients with the disease myself, nor examine any, though I did
hear of an unfortunate young man who had a surgical problem that went untreated
because the surgeons at the hospital refused to operate on him, for fear of
becoming infected. I gathered that ‘GRID’ or ‘Gay Related Immune Deficiency’ as
it was then called, was caused by a newly discovered virus by the name of HTLV-III.
The controversy regarding Robert Gallo’s announcement, in 1984, that he had
discovered the viral cause of Acquired Immune Deficiency Syndrome, did not
come to my attention till many years later, when HIV and AIDS had become
household words around the world. The fact that Luc Montagnier of the Pasteur
Institute accused Gallo of stealing his work not long after the cancer researcher’s
triumphant 1984 announcement has relevance, though, to an investigation of the
politics of the related industries of virology, microbiology, immunology, oncology,
genetic engineering, pharmacology, chemical warfare and biological warfare.
These industries, and their relevance to AIDS, will be explored in later chapters.
At the University of Queensland we learned little about the history of medicine,
and nothing of its political history. Instead, we studied human physiology, anatomy
and biochemistry before focusing on pharmacology, indications for various
investigations, and for surgery. We were being trained to become the doctors of
hospitals of the future – and the hospitals of the future were clearly envisaged to be
much like the hospitals of the past, the hospitals that had trained those who trained
us.
The fact that doctors, most of whom work outside the hospital system, are trained within the hospital system and the rarefied atmosphere of university academia has
unfortunate results for public health. The fact that drug companies have played an
increasingly dominant role in sponsoring ‘medical’ research, ‘peer-reviewed
literature’ and continued medical education in the hospitals and universities has
contributed to this problem.
When we began researching AIDS in 1997 I was working as a family physician in
a sole practice in Melbourne’s outer suburbs. My regular patients were largely
elderly, and heterosexual. The vast majority were ‘white’ in the ‘black and white’
classification of human skin colour that permeates the medical and political
literature. As such it is not surprising that none of my regular patients had HIV or
AIDS.
The reason I became interested in looking deeper at the cause of the epidemic was
primarily curiosity. I had known for years that the Nazis killed homosexuals as
‘degenerate’, along with Jews and Gypsies. It was common knowledge that AIDS
largely affected homosexual males in ‘the West’ – in the USA, Europe and the UK.
In the lingo of the AIDS research establishment these were called ‘Pattern 1
countries’ where HIV infection was largely limited to male homosexuals, and
intravenous drug users. Australia was also reported to be demonstrating ‘Pattern 1’
in the demographics of those infected. Most cases of AIDS, though, were in so-called ‘Pattern 2 countries’. Here the infection affected men and women equally,
was not associated particularly with homosexuality or injected drug use, and was
also common in children.
In 1997 I read, in an Australian medical newspaper, that a new front in the AIDS war had developed in New Guinea, where a high infection rate had been discovered in 17- to 25-year-old women.
This rang alarm bells in my mind. New Guinea has an unfortunate colonial history,
as a colony of Germany, of Britain and of Queensland. The exploitation of New
Guinea, I knew, was centrally connected to the exploitation of its forests and minerals, especially gold. The same can be said for sub-Saharan Africa, where
AIDS has claimed more lives than in the rest of the world combined.
The present work charts previously unexplored historical and medical territory –
the influence of negative eugenics since the end of the Second World War, in
various areas of medicine and politics, not in Germany, but in nations of the British
Commonwealth and the USA. The modern history of eugenics can only be
understood, though, when the earlier history of the Eugenics Movement and its
founders is explored.
According to the 1974 edition of Dorland’s Medical Dictionary, the term
‘eugenics’ is derived from the Greek eu, meaning ‘well’, ‘good’, or ‘easily’ and
gennan, meaning ‘to generate’, defining the word as follows:
“eugenics: the study and control of procreation as a means of improving
the hereditary characteristics of a race; called also orthogenics…negative e.,
that concerned with prevention of reproduction (procreation) by individuals
possessing inferior or undesirable traits. Positive e., that concerned with
promotion of optimal mating of individuals possessing superior or desirable
traits.”
The 1976 edition of the Concise Oxford Dictionary defines ‘eugenic’ as, “of the
production of fine (esp. human) offspring by improvement of inherited qualities”.
One who practices, or advocates eugenics is called, according to this dictionary, a
“eugenist”. Dorland’s Medical Dictionary uses the alternative term
“eugenicist”.
The study of eugenics is fundamentally connected with the study of scientific and
political racism, and no study of the current causes of racism can be complete
without an understanding of eugenics. This is because eugenics, which was
initially conceived primarily as a “biological” white-supremacist race theory, has
had deep and enduring influence on medical science, biology, sociology,
anthropology, psychology, education and politics. In particular, eugenic theories lie
deep in the medical disciplines of genetics, psychiatry, neurology and preventive
medicine.
In the early heyday of eugenics, between 1905 and 1930, many esteemed
professors and doctors in Australia, the USA, New Zealand, Canada, South Africa,
Western Europe and Britain were self-confessed eugenists, and many books were
written about the absolute necessity of “eugenic reform” to prevent the
“degradation of the human race” that was seen to be imminent.
The threat the eugenists declared was not to themselves – it was to their
descendants, whom they feared would not be able to maintain their numbers
against the hordes of those deemed “degenerate” – people who were said to be
breeding “like rabbits”. These included some in their own countries – including the
very poor, whom they called “paupers”, the “insane” (who were mostly poor, and
alternatively termed “lunatics”), “drunkards”, “criminals” and the “feeble-minded”. These, the eugenists demanded, must not be allowed to breed lest they cause further degeneration of the Anglo-Saxon race.
They all agreed on this point, although they disagreed, sometimes strongly, about
what should be done about “the problem”, and as to how much of this apparent
“degeneracy” was due to genetic inheritance and how much due to environmental
and social inequities.
The eugenists also agreed that particular races, especially so-called “black races”
were fundamentally degenerate, and markedly inferior to their own; these, they
agreed, needed to have their fertility controlled by far-sighted men such as
themselves if they lacked the intelligence, vision and freedom from sentimentality of the eugenists, and failed to take the drastic measures demanded as “eugenic reform”.
This included forced sterilization of “degenerate stocks” that refused to voluntarily
abstain from having children, and segregation of those with impure blood from
those with the “superior” blood of the pure, noble, white, Anglo-Saxon race.
The present work is mainly concerned with the problems that have resulted, may
have resulted, and may yet result from the theories and practices of “negative
eugenics”. This follows 12 years of research, based in Melbourne, Australia, on
man-made, and thus preventable, causes of illness. My own discovery of the 100-
year history of eugenics came about through research into the history of medical
treatment of psychological problems, and specifically my research into psychiatric
theory and practice. This, I found, is fundamentally rooted in eugenic theories. At
the same time, it is rare for those with modern ‘eugenic’ ideas to recognize the
source of their beliefs. These days ‘eugenic’ doctrines are presented, more often, as
‘modern genetic studies’, ‘international health’, ‘population health’ and ‘biological
psychiatry’.
Although several prominent psychiatrists of the early twentieth century were self-declared eugenists, most eugenists were not psychiatrists. This is
because, although it has had much influence on psychiatric theory and practice,
eugenics was developed as a general anthropological theory, by an aristocratic
English general scientist with a particular fondness for statistics, “good family
connections” and a total idolization of his older cousin, Charles Darwin.
This was Francis Galton, who, despite never finishing his medical studies at the
University of Cambridge, went on to shape the development of academic and
institutional medicine and surgery in addition to the fields of anthropology,
sociology, psychology and politics at Cambridge and, later, around the world.
How Francis Galton came to be so influential provides a clear refutation of his
central theory: that the ruling classes occupied their exalted position in society, not
because of familial and political connections (nepotism and “jobs for the boys”)
but because of genetic superiority.
Like his cousin, Charles Darwin, Galton, born into a wealthy English family, was
groomed by his father to become a doctor. With his father’s assistance Galton
entered medical school at the esteemed Cambridge University in England in 1841
at the age of 18. From the time of his birth Francis lived a life of affluence and
privilege in British society. He was given opportunities that few men of his time
were afforded, these being largely arranged by his ambitious and influential father.
Professor C.P. Blacker, General Secretary of the Eugenics Society, and physician at
the infamous Bethlem Mental Hospital, wrote, in Eugenics: Galton and After
(1952):
“Galton’s father, ever solicitous of his son’s education, arranged that the
youth should accompany Dr. Allen Miller, subsequently a great chemist and
for many years Treasurer of the Royal Society, to Giessen in order to attend
lectures on chemistry by Liebig, then the leading exponent of this subject in
Germany.” (p.21)
This followed an earlier arrangement by his father, when Galton was 16, to give his
son a direct channel to the “corridors of power”. Blacker writes:
“When Galton was sixteen, his father arranged that he should accompany
Sir William Bowman, later eminent as an authority on diseases of the eye, on
a continental tour, partly as a holiday, partly to see medical institutions.
Galton then witnessed his first operation. He also paid a visit to a mental
hospital in Vienna…On returning to England, still aged sixteen, Galton was
sent as an indoor pupil to the Birmingham General Hospital, where he
worked in the dispensary and later in the wards…Galton then moved to
London and spent a year at King’s College where he was given less
responsibility and did less practical work than in Birmingham but enjoyed
better teaching.” (p.21)
When his father died in 1845, Galton, then 23 years old, was left a large
inheritance, and, being independently wealthy, abandoned his medical studies,
becoming a “gentleman of leisure and independent means”. He decided, instead, to
seek his fame through travel, hoping to emulate his venerated cousin Charles, who
had made such notable discoveries when he travelled to distant lands aboard the
HMS Beagle.
When, in 1850, Galton organised a subsequently lauded (and richly rewarded)
journey to Southern Africa, he did so with the support and organizational help of
the Royal Geographical Society. The British psychiatrist C.P. Blacker – then
“General Secretary of the Eugenics Society”, “Adviser on Population and
Medico-social Questions” to the British Ministry of Health, Honorary Secretary of
the “Population Investigation Committee” and previously “Member of the Biological
and Medical Committee in the Royal Commission of Population (1944-49)” – wrote
about Galton’s life some 40 years after the death of the founder of eugenics.
In Eugenics: Galton and After (1952) Blacker writes of Galton’s motives for this
trip, for which Galton was subsequently rewarded with a Gold Medal from the
Royal Geographical Society and, at the young age of 34, election as a “Fellow of
the Royal Society”:
“Galton was attracted to South Africa by the romance of exploration and
the prospect of shooting big game. The recent discovery of Lake Ngami did
much to shape his plans, and the reader who wishes to understand Galton’s
travels would do well to have a mental picture of where this is. Lake Ngami
occupies a central position of the Southern tip of the continent. It lies about
950 miles from the East coast and about 550 miles from the West. Separating
Lake Ngami from the colonized lands of the Cape lies the Kalahari desert.
Livingstone and Oswell had shortly before traversed this desert, thus
rendering accessible the well-watered districts of the lake to wagons from
the Cape. Galton wanted to explore this new opening, and very sensibly – as
it turned out – made contacts with the Royal Geographical Society. His
vague plans were then carefully discussed, given precision and
authoritatively approved. Introductions were obtained to influential persons
both in England and in the Cape, and the services secured, as travelling
companion and second in command, of Charles J. Andersson, a Swede, to
whom Galton pays many warm tributes.” (p.30)
For many years prior to Galton’s travels, Britain had been battling with the Dutch
and German “Boers” for control of Southern Africa; this conflict intensified later
with the discovery of gold and diamonds in the early 1870s, culminating in the
“Boer Wars” (1880-81 and 1899-1902). This colonial history has relevance to the political,
academic and social environment in the areas of Africa from which HIV and AIDS
were first reported in the 1980s, specifically ‘Sub-Saharan Africa’, and the
relationship between the European nations that have led the world (along with the
USA and Russia) in biological warfare research and experimentation. These will be
discussed in chapters 11 and 12.
When Galton attempted to forge a new route for the British to the newly
discovered Lake Ngami district, in the 1850s, he was discouraged by Boer
(‘Afrikaner’) settlers from crossing Boer-controlled land. He made his way,
instead, by using a “chain of mission stations which had been established along the
Swakop river”, which flowed into Walfisch Bay, north of the Boer-controlled
areas. Galton travelled inland about 1000 miles, with two wagons, ten Europeans
and “about eighteen natives” (Blacker, 1952, p.32). Professor Blacker quotes his
hero Galton, who wrote about his own behaviour towards the “natives” who led
him towards Lake Ngami, fame and fortune:
“I had to hold a little court of justice on most days, usually followed by
corporal punishment, deftly administered. At a signal from me the culprit’s
legs were seized from behind, he was thrown face forward on the ground
and held while Hans applied the awarded number of whip strokes.” (Galton,
quoted in Blacker, 1952, p. 32)
Hans, who “applied the awarded [by Galton] number of whip strokes”, was
Galton’s Danish companion (among ten Europeans including another
Scandinavian, Galton’s “deputy”, Charles Andersson) during his “exploration” of
South-West Africa. During their attempt to find a Western approach to Lake Ngami
(which they failed to do), they travelled through land occupied by the
Bantu-speaking Damara people. These, recorded Galton, “were for the most part thieving
and murderous, dirty and of a low type”.
He formed a more favourable impression of the Hottentots and Bushmen, the
indigenous people of South Africa, whom he described as “yellow skinned”,
claiming that “we became great friends with the Bushmen, and sat late into the
night hearing their stories about themselves and the recent doings of a body of
strange Namaquas coming from the south, who in the preceding year had swept
past them and onwards to lake Ngami, leaving unmistakable signs of their
expedition, and marauding as usual as they went”. The Namaquas were another
tribe of “black Africans”, who, according to Galton, were engaged in raiding cattle
from the Damaras to the north.
Galton’s fury at the insolent behaviour of the Damara people was translated into
revenge when he wrote, in his 1853 book Tropical South Africa, of the Damaras:
“These savages court slavery. You engage one of them as a servant, and
you find that he considers himself as your property, and that you are, in fact,
become the owner of a slave. They have no independence about them,
generally speaking, but follow a master as a spaniel would. Their hero-worship
is directed to people who have wit and strength enough to ill-use
them. Revenge is a very transient passion in their character; it gives way to
admiration of the oppressor. The Damaras seem to me to love nothing; the
only strong feelings they possess, which are not utterly gross and sensual,
are those of admiration and fear. They seem to be made for slavery, and
naturally fall into its ways.” (Galton quoted by Blacker, 1952, p.73)
Francis Galton was clearly not an advocate of the emancipation of slaves,
claiming, as he did, that the “negro race” possessed what he called a “slavish
instinct” – making the state of slavery “natural”, and even desirable. Galton’s
opinion of the Damaras (expanded in his judgement to ‘savages’, ‘natives’ and
‘blacks’) may have been coloured by the fact that he had formed a modus vivendi
with the Namaquan chiefs when he first arrived in South-West Africa, explaining
why “the Damaras…were hostile to missionaries and [these particular] explorers”
(Blacker, 1952, p.31).
Attempting to conceal the understandable suspicion and resistance shown by the
Damara people towards his party of Europeans, Galton made a now familiar
allegation of eugenists: that “the Damaras were convulsed by tribal wars among
themselves”. As for his ‘allies’, the Namaquas, they did not escape Galton’s
subsequent judgement, in Hereditary Genius (1869) that “the average intellectual
standard of the negro race is some two grades below our own” (p.327).
Francis Galton was particularly interested in the breeding of “pedigree” dogs,
especially the English hunting breed called “Basset Hounds”. Generalisations
based on his observations of dogs and horses, barely disguising deep racism,
hypocrisy and elitism, became a consistent feature of Galton’s subsequent “eugenic
theories”. These are seen in his thoughts on “human variety”, expressed in the
1901 Huxley Lecture of the Royal Anthropological Institute titled “The Possible
Improvement of the Human Breed”:
“Human Variety – The natural character and faculties of human beings
differ at least as widely as those of the domesticated animals, such as dogs
and horses, with whom we are familiar. In disposition some are gentle and
good-tempered, others surly and vicious; some are courageous, others timid;
some are eager, others sluggish; some have large powers of endurance,
others are quickly fatigued; some are muscular and powerful, others are
weak; some are intelligent, others stupid; some have tenacious memories of
places and persons, others frequently stray and are slow at recognising. The
number and variety of aptitude, especially in dogs, is truly remarkable;
among the most notable being the tendency to herd sheep, to point and to
retrieve. So it is with the various natural qualities that go towards the making
of civic worth in man. Whether it be in character, disposition, energy,
intellect, or physical power, we each receive at our birth a definite
endowment, allegorised by the parable related in St. Matthew, some
receiving many talents, others few; but each person being responsible for the
profitable use of that which has been entrusted to him.”
The above quote is taken from the 1909 publication by the Eugenics Education
Society titled Essays in Eugenics, a collection of lectures by Francis Galton
between 1901 and 1908. It was published in the same year (1909) that the Galton
Eugenic Laboratory was established at University College, London, following a
payment of 1400 pounds sterling to the College by the affluent, but formally
unqualified, “Father of Experimental Psychology”.
The latter title was accorded to Francis Galton in the 1940 historical text by
Stanley Casson, The Discovery of Man. Casson, an English anthropologist based at
Oxford, refers to Francis Galton’s role in arranging finance and connections for the
British Egyptologist Flinders Petrie:
“About this time [1883, the year Galton coined the term ‘eugenics’]
Petrie made the acquaintance in England of Francis Galton, the father of
experimental Psychology. Galton was deeply interested in Ethnology and
racial types, and arranged for Petrie to make copies and casts of any of the
relief sculptures in Egypt which bore representations of races alien to Egypt.
From an anthropological point of view this was a most important mission.
Petrie was given a grant from the Royal Society for the purpose.
“It was to Galton’s credit that the enterprise was undertaken, and to
Petrie is due the discovery that the Egyptians, as has been explained in an
earlier chapter, were the first people that we know of to be conscious of
racial distinctions.” (p.271-72)
In an earlier chapter Professor Casson describes in glowing terms another
Londoner who reached the pinnacle of the British Scientific Establishment at an
exceptionally young age. Not only was he elected a Fellow of the Royal Society at
the age of 30 (even younger than Galton when he received the same honour), he
did so having, apparently, never sat for a single examination in his life. This was
Edward Burnett Tylor, born in 1832, and later Professor of Anthropology at Oxford
University. Casson compares Tylor favourably with the famous German
archaeologist Heinrich Schliemann, the discoverer (and plunderer) of the site of the
city of Troy (in modern-day Turkey):
“Tylor is to be compared with Schliemann not only in what he
accomplished for the nascent study of Anthropology but also his personality.
He was born in 1832, the year of the Reform Bill. His parents were Quakers,
so that from the outset he was destined to grow up in an atmosphere in
which the intellectual Liberalism of the early nineteenth century was
fighting its long battle with an older order. His natural tendency towards a
scientific outlook and method was a part of the background in which he
lived. The fact that Tylor ultimately became a Professor at Oxford, though as
he boasted, he had never sat for a single examination in his life, was proof
that it was at last possible for learning to develop outside the fixed curricula
of academic life.” (p.234)
Galton, whose parents, incidentally, were also Quakers, was guaranteed fame and
fortune when he was elected as a Fellow of the Royal Society in 1856, ostensibly
due to his rather minor “explorations” of Southern Africa. Professor Blacker
describes this admiringly, while failing to see the possibility of nepotism at work:
“The Geographical Society’s medal gave to Galton an established
position in the scientific world, and contributed, together with subsequent
work, to his election in 1856 to the Fellowship of the Royal Society. This
outstanding honour was conferred upon Galton at the early age of thirty-four.” (p.35)
As a member of the Royal Society, Galton became increasingly involved with, and
influential within, a growing range of Royal Institutes. One of these was the
Anthropological Institute, where he was invited to give the second Huxley Lecture,
in 1901 (in honour of Darwin’s friend and supporter, Thomas Huxley). The lecture,
titled “The Possible Improvement of the Human Breed”, reiterates Galton’s
mathematics-based argument, familiar to his audience from his 1869 book
Hereditary Genius, that the upper classes were in danger of being swamped by the
“degeneracy” of the “lower classes” – a fate that could only be averted by “eugenic
reform”.
Galton’s arguments were based on a misapplication of the statistical concept of
“normality” as indicated by “bell curves”. As in Hereditary Genius, he begins his
argument with the claim that because distribution of men’s height around the
median (average) height of a given population assumes the shape of a “bell curve”,
which is symmetrical on either side of the median line, other human
(‘anthropometric’) features and qualities must inevitably follow the same “law”.
This he termed the “law of normality”, which he held to be a universal law,
applicable with equal validity to heights as to muscular strength, physical
endurance, intelligence, or “civic worth”. The term “civic worth” was a favourite of
Francis Galton, and it was on the basis of his opinions of “civic worth” that the
main arguments of Hereditary Genius were based. These are adapted, in his 1901
lecture on “The Possible Improvement of the Human Breed”, to Charles Booth’s
division of the population of London into a gradation of classes from “lowest” to
“highest”. The following quote, from the lecture, gives an indication of Galton’s style
of argument, and of his class and caste prejudices:
“Let us now compare the normal classes with those into which Mr.
Charles Booth has divided the population of all London in a way that
corresponds not unfairly with the ordinary conceptions of civic worth. He
reckons them from the lowest upwards, and gives the numbers in each class
for East London. Afterwards he treats all London in a similar manner, except
that sometimes he combines two classes into one and gives the joint result.
For my present purpose, I had to couple them somewhat differently, first
disentangling them as best I could. There seemed no better way of doing this
than by assigning to the members of each couplet the same proportions that
they had in East London. Though this was certainly not accurate, it is
probably not far wrong. Mr. Booth has taken unheard of pains in this great
work of his to arrive at accurate results, but he emphatically says that his
classes cannot be separated sharply from one another. On the contrary, these
frontiers blend, and this justifies me in taking slight liberties with his figures
[manipulating them to suit his own theory]. His class A consists of criminals,
semi-criminals, loafers and some others, who are in number at the rate of 1
per cent. in all London – that is 100 per 10,000, or nearly three times as
many as the v class: they therefore include the whole of v and spread
upwards into the u [v and u were lowest categories of “inferior stock”
according to Galton’s own classification and analysis, described later as
“undesirables”, together with class t, the third-lowest “class”]. His class B
consists of very poor persons who subsist on casual earnings, many of whom
are inevitably poor from shiftlessness, idleness or drink. The numbers in this
and the A class combined closely correspond with those in t and all below t.
“Class C are supported by intermittent earnings; they are a hard-working
people, but have a very bad character for improvidence and shiftlessness. In
Class D the earnings are regular, but at the low rate of twenty-one shillings
or less a week, so none of them rise above poverty, though none are very
poor. D and C together correspond to the whole of s combined with the
lower fifth of r. The next class, E, is the largest of any, and comprises all
those with regular standard earnings of twenty-two to thirty shillings a week.
This class is the recognised field for all forms of co-operation and
combination; in short for trade unions. It corresponds to the upper four-fifths
of r combined with the lower four-fifths of R. It is therefore essentially the
mediocre class, standing as far below the highest in civic worth as it
stands above the lowest class with its criminals and semi-criminals. Next
above this large mass of mediocrity comes the honourable class F, which
consists of better paid artisans and foremen. These are able to provide
adequately for old age, and their sons become clerks and so forth. G is the
lower middle class of shopkeepers, small employers, clerks and subordinate
professional men, who as a rule are hard-working, energetic and sober. F and
G combined correspond to the upper fifth of R and the whole of S [in
Galton’s classification], and are, therefore, a counterpart [in terms of a
statistical ‘bell curve’] to D and C. All above G are put together into one
class H, which corresponds to our T, U, V and above, and is the counterpart
of his two lowermost classes, A and B. So far, then, as these figures go, civic
worth is distributed in fair approximation to the normal law of frequency.
We also see that the classes t, u, v, and below are undesirables.” (pp. 8-12)
Galton was trying to fit the figures of Charles Booth into his own theory, detailed
in Hereditary Genius, that when men are classified according to their “natural
gifts” (“whether in respect to their general powers, or to special aptitudes”) they
demonstrate a “normal distribution”, as depicted in a bell curve. While Charles
Booth divided the population into classes only on a “positive axis” (A, B, C, D, E
and so on), Galton used both a positive and a negative axis, divided by a median
line, which he defined, in line with statistical theory, as “the average”, with classes
R, S, T, U, V above the median, and r, s, t, u, v below it.
According to his theory, half of the population, regardless of what was being
measured, belonged to the two classes closest to the median, the superior R class
(immediately above the median line) and an inferior r class. 25% belonged to R,
and 25% to r, described together as “essentially the mediocre class”. A
symmetrical, proportionally smaller population belonged to the adjacent classes,
the more superior S class (16%) and a more inferior s (also representing 16% of the
population). The even more superior class T contained 7% of the population
(whose “sons become clerks and so forth”), while the corresponding inferior class,
t, also included 7% of the population. Class t, together with the “lowest classes”, u
and v, were, according to Galton, “undesirables”: 9% of the London population,
according to his calculations. He argued that an equal number of people existed
above the median line as below it, regardless of what “anthropometric” parameter
was being discussed. There were just as many “intelligent” people above the
average (median) as there were “stupid” ones below it, just as many “superior” as
“inferior” – and just as few “desirables” (such as himself) as “undesirables”. Such
was the over-simplicity of Galton’s mathematical and anthropological theories, and
of his philosophy. The latter can be summarised by the motto: “If you can’t
convince them of your mathematical superiority, use simplistic mathematics to
prove your genetic superiority – and theirs; that way they’ll agree with you”. This
was attempted in terms of his “class”, and also in terms of his “race”.
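The proportions just described are not arbitrary: they are what one obtains by slicing a normal (“bell”) curve into equal-width grades on either side of the median. The short Python sketch below reproduces the figures quoted above, on the assumption (a common reading of Galton’s scheme, not a figure stated in the text) that each grade is one “probable error” wide – the deviation that cuts off the top 25% above the median.

```python
# Sketch: how Galton's class proportions fall out of a bell curve.
# Assumption: each grade (R, S, T, ... above the median; r, s, t, ...
# below it) is one "probable error" wide.
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Probable error in standard-deviation units: normal_cdf(PE) = 0.75
PE = 0.674489750196

# Proportion of the population falling in each grade above the median:
# R spans 0..1 PE, S spans 1..2 PE, T spans 2..3 PE, and so on.
grades = ["R", "S", "T", "U and above"]
bounds = [0.0, 1.0, 2.0, 3.0, float("inf")]
for name, lo, hi in zip(grades, bounds, bounds[1:]):
    upper = 1.0 if hi == float("inf") else normal_cdf(hi * PE)
    share = upper - normal_cdf(lo * PE)
    print(f"{name:12s} {share:6.1%}")

# By symmetry the same shares hold below the median (r, s, t, ...), so
# t and everything beneath it -- Galton's "undesirables" -- come to
# roughly 7% + 2% = 9%, the figure quoted in the text.
undesirables = normal_cdf(-2.0 * PE)  # more than 2 PE below the median
print(f"{'t, u, v':12s} {undesirables:6.1%}")
```

Run as written, the bands come out at roughly 25%, 16%, 7% and 2%, matching the 25/16/7 per cent grades and the 9% of “undesirables” attributed to Galton above – which is the sense in which his “law of normality” simply restates the geometry of the bell curve.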
With the publication of Hereditary Genius in 1869, Galton won the adulation of
many prominent British academics, including Charles Darwin himself, who wrote,
to his younger cousin:
“I do not think I have ever in all my life read anything more
interesting…You have made a convert of an opponent. I congratulate you on
producing what I am convinced will prove a memorable work.” (Darwin,
1869, quoted in De Paoli, 1997, p.33)
An obvious reason for such accolades was that Galton was mathematically
“proving” the superiority of those who were providing the accolades, notably the
esteemed graduates and Fellows of his “own” university, Cambridge, and other
Fellows of the Royal Society (the fact that all the early British founders of the
Eugenics movement were Fellows of the Royal Society also suggests support by
the royalist Freemason organization for the development and promotion of the
movement, as will be seen). His argument that exceptional intellectual superiority,
as rare as “the brightest star in the night sky”, is reliably measured by the man who
gained top marks in Mathematics at Cambridge (the “senior wrangler”) could have
been predicted to gain support among his more elitist colleagues and “superiors”:
“There can hardly be a surer evidence of the enormous difference
between the intellectual capacity of men, than the prodigious differences in
the numbers of marks obtained by those who gain mathematical honours at
Cambridge. I therefore crave permission to speak at some length upon this
subject, although the details are dry and of little interest. There are between
400 and 450 students who take their degrees in each year, and of these, about
100 succeed in gaining honours in mathematics [coincidentally something
Galton himself attempted, but failed to achieve, while at Cambridge], and
are ranged by the examiners in strict order of merit. About the first forty of
those who take mathematical honours are distinguished by the title of
wranglers, and it is a decidedly creditable thing to be even a low wrangler; it
will secure a fellowship in a small college. It must be carefully borne in
mind that the distinction of being first in this list of honours, or what is
called the senior wrangler of the year, means a vast deal more than being the
foremost mathematician of 400 or 450 men taken at hap-hazard. No doubt
the large bulk of Cambridge men are taken almost at hap-hazard [actually it
mainly depended on whether their parents could afford the university fees
and “pull the necessary strings”]. A boy is intended by his parents for some
profession; if that profession be either the Church or the Bar, it used to be
almost requisite, and it is still important, that he should be sent to Cambridge
or Oxford.” (Hereditary Genius, 1869, p.14)
After he returned to London in 1852 Galton wrote about his travels to South and
South West Africa and was greatly honoured by the Royal Geographical Society
and allied Royal Society for his “exploration”. The plaque next to Galton’s
long-time residence in London announces, “Sir Francis Galton, 1822-1911, explorer,
statistician, founder of eugenics, lived here for fifty years”. His experiences with
African “natives”, several (perhaps all) of whom were slaves that Galton claimed
were “given to him” by their owners, shaped his increasingly racist views about
“Negroes” and the “Negro race”. These were “scientifically rationalized” at the
conclusion of his first major publication, Hereditary Genius: an Inquiry into its
Laws and Consequences. Published in 1869, this icon of the eugenics movement
argued that the mental ability of the “Negro (black) race” is “not less than two
grades” below that of the Anglo-Saxon race (which Galton, and other Anglo-Saxon
eugenists, equated with “the white race”):
“Let us, then, compare the Negro race with the Anglo-Saxon, with
respect to those qualities alone which are capable of producing judges,
statesmen, commanders, men of literature and science, poets, artists, and
divines [priests]. If the negro race in America had been affected by no social
disabilities [versus discrimination], a comparison of their achievements with
those of the whites in their several branches of intellectual effort, having
regard to the total number of their respective populations, would give the
necessary information. As matters stand, we must be content with much
rougher data.
“First, the negro race has occasionally, but very rarely, produced such
men as Toussaint l’Ouverture, who are of our class F [according to a ‘bell
curve’ rating scale, as before, but graded from an increasingly superior A to
G, and increasingly inferior a to g on either side of the median line, with X
and x designating the ‘one in a million’ group above G and below g,
respectively]; that is to say, its X, or its total classes above G, appear to
correspond with our F, showing a difference of not less than two grades
between the black and white races.
“Secondly, the negro race is by no means wholly deficient in men capable
of becoming good factors, thriving merchants, and otherwise comfortably
raised above the average of whites – that is to say, it cannot unfrequently
supply men corresponding to our class C, or even D. It will be recollected
that C implies a selection of 1 in 16, or somewhat more than the natural
abilities possessed by average foremen of common juries, and that D is as 1
in 64 – a degree of ability that is sure to make a man successful in life. In
short, classes E and F of the negro may roughly be considered as the
equivalent of our C and D – a result which again points to the conclusion,
that the average standard of the negro race is some two grades below our
own.
“Thirdly, we may compare, but with much caution, the relative position of
negroes in their native country with that of the travellers who visit them. The
latter, no doubt, bring with them the knowledge current in civilized lands,
but that is an advantage of less importance than we are apt to suppose. A
native chief has as good an education in the art of ruling men as can be
desired; he is continually exercised in personal government, and usually
maintains his place by the ascendancy of his character, shown every day
over his subjects and rivals. A traveller in wild countries also fills, to a
certain degree, the position of a commander, and has to confront native
chiefs at every inhabited place. The result is familiar enough – the white
traveller almost invariably holds his own in their presence. It is seldom that
we hear of a white traveller meeting with a black chief whom he feels to be
the better man. I have often discussed this subject with competent [no doubt,
‘white’] persons, and can only recall a few cases of the inferiority of the
white man – certainly not more than might be ascribed to an average actual
difference of three grades, of which one may be due to the relative demerits
of native education, and the remaining two to a difference in natural gifts.
“Fourthly, the number among the negroes of those whom we should call
half-witted men is very large. Every book alluding to negro servants in
America is full of instances. I was myself much impressed by this fact
during my travels in Africa. The mistakes the negroes made in their own
matters were so childish, stupid and simpleton-like, as frequently to make
me ashamed of my own species. I do not think it an exaggeration to say, that
their c is as low as our e, which would be a difference of two grades, as
before. I have no information as to actual idiocy among the negroes – I
mean, of course, of that class of idiocy which is not due to disease.” (pp.327-8)
Galton adds, although he never travelled to Australia:
“The Australian type is at least one grade below the African negro. I
possess a few serviceable data about the natural capacity of the Australian,
but not sufficient to induce me to invite the reader to consider them.” (p.328)
This rather long quote is worth examining closely, since the methodology
pioneered by Galton, and especially his application of statistical “bell curves” to
human psychology, sociology and anthropology led to entire schools of
“experimental” psychology, sociology and anthropology while also profoundly
affecting subsequent practices of “biological psychiatry”. Galton himself was not
interested in healing people of psychological or physical distress; his main concern
was proving his own superiority and that of his esteemed colleagues (particularly
other Fellows of the Royal Society). This is clearly evident in Hereditary Genius,
which would be almost amusing, when one also considers Galton’s own psychology,
his social and educational experiences, and his family connections, were
it not for the huge influence his theories have had on subsequent generations.
Francis Galton’s basic argument was that, just as certain physical features and
abilities are inherited, so are mental abilities – and deficiencies. This he
endeavoured to prove by various mathematical means. The weight of his argument,
in establishing that genius is largely hereditary, rested on the supposition that
“genius”, or exceptional ability, could be reliably ascertained by social status (what
he describes as “eminence”) and fame, as declared by English history books,
official records, and his own personal opinion. This superiority was, according to
Galton, largely determined by racial and familial inheritance (which he generally
referred to as “nature”), rather than the result of social, educational and
environmental advantages (referred to as “nurture” in the scientific dichotomy
prevalent at the time). In Hereditary Genius he reveals, before explaining his
“classification of men according to their natural gifts”, his disdain for advocates of
“natural equality”:
“I have no patience for the hypothesis occasionally pressed, and often
implied, especially in tales written to teach children to be good, that babies
are born pretty much alike, and that the sole agencies in creating differences
between boy and boy, and man and man, are steady application and moral
effort. It is in the most unqualified manner that I object to pretensions of
natural equality. The experiences of the nursery, the school, the University,
and of professional careers, are a chain of proofs to the contrary. I
acknowledge freely the great power of education and social influences in
developing the active powers of the mind, just as I acknowledge the effect of
use in developing the muscles of a blacksmith’s arm, and no further.” (p.12)
Although such objections did not daunt Galton’s many eminent followers, several scientists and intellectuals of the day saw the obvious flaws in Galton’s methodology (such as F. C. Constable, who published Poverty and Hereditary Genius: a criticism of Mr Francis Galton’s Theory of Hereditary Genius in 1905), and recognised also that Galton was ‘feathering his own nest’. He was, in effect,
arguing the case for his own superiority. He almost gives away his personal
motivation for this in Hereditary Genius, when he writes:
“The eager boy, when he first goes to school and confronts intellectual
difficulties, is astonished at his progress. He glories in his newly-developed
mental grip and growing capacity for application, and, it may be, fondly
believes it to be within his reach to become one of the heroes who have left
their mark upon the history of the world. The years go by; he competes in
the examinations of school and college, over and over again with his
fellows, and soon finds his place among them. He knows he can beat such
and such of his competitors; that there are some with whom he runs on equal
terms, and others whose intellectual feats he cannot even approach. Probably
his vanity still continues to tempt him, by whispering in a new strain. It tells
him that classics, mathematics, and other subjects taught in universities, are
mere scholastic specialties, and no test of the more valuable intellectual
powers. It reminds him of numerous instances of persons who had been
unsuccessful in the competitions of youth, but who had shown powers in
after-life that made them the foremost men of their age. Accordingly, with
newly furbished hopes, and with all the ambition of twenty-two years of age,
he leaves his University and enters a larger field of competition. The same
kind of experience awaits him here that he has already gone through.
Opportunities occur – they occur to every man – and he finds himself
incapable of grasping them. He tries, and is tried in many things.” (p.14)
The above description seems remarkably autobiographical for a general
“experience that every student has had of the working of his mental powers”, as it
is supposed to be. This reveals a certain lack of insight by Galton into his own thinking and behaviour, and that of others – an irony all the greater when one considers that he has been heralded as a “great psychologist” and even the “Father of Experimental Psychology” (Sargent, 1944; Casson, 1940).
Galton’s race theories – specifically his theories on inferior and superior races – are of most relevance to the present work, rather than his theories on class superiority, although the two are closely related: he graded what he termed “inferior races” according to the scale of “civic worth” he first developed in reference to social classes in England, specifically in London. These race theories are summarised by
a later, and more cautious, apologist for Galton and the Eugenics Movement,
Professor Mark Haller of the University of Chicago. In his 1963 book Eugenics:
Hereditarian Attitudes in American Thought, interestingly “manufactured with the
assistance of the Ford Foundation”, Haller summarises and euphemises:
“Galton was interested not only in individual differences but also in race
differences. As a result of his extensive travels, his wide reading in
anthropology, and his activities with the Royal Anthropological Institute, he
concluded that the ancient Greeks had been the finest of all races. He
believed that in his own day the Anglo-Saxons far outranked the Negroes of
Africa, who in turn outranked the Australian aborigines, who outranked
nobody. Because of the large innate differences between races, Galton felt
that a program to raise the inherent abilities of mankind involved the
replacement of inferior races by the superior, and he regretted that ‘there
exists a sentiment, for the most part quite unreasonable, against the gradual
extinction of an inferior race.’ In this, Galton set the tone for the racial bias
that played an important part in the eugenics movement.” (p.11)
Arising, as it did, shortly after the official abolition of slavery by the governments
of France, Britain and the United States of America, the social and economic face
of the eugenics movement, sometimes referred to as “Social Darwinism”, included,
from the outset, a rationale and system by which a small, affluent elite of “white”
Europeans could enslave the poor and exploit the land and resources of Europe and
the rest of the world. Drawing heavily on the theories of the British Anglican priest
and economist Thomas Malthus (1766-1834), and particularly his 1798 Essay on
the Principle of Population, the Social Darwinists were convinced that the world
was overpopulated, and that this constituted the “greatest threat to humanity”.
Malthus, who held that “dependent poverty ought to be held disgraceful”, argued that the “poor laws” (ostensibly intended to assist the poor) tended to increase population without increasing the food for its support; he advocated the abolition of such laws, incentives for agriculture (landlords) and “workhouses” for those in distress (Burne, 1991, p.796).
In the Essay, Malthus calculated, using a rather dubious mathematical model, that in times of plenty population growth is bound to outstrip food production. This, he argued, led to inevitable famines, because “unchecked, population increases in a geometrical ratio” whereas food production, he believed, could only increase in an arithmetical ratio. Citing the
example of the United States, where the population had apparently doubled in 25
years, Malthus argued that while the population might double, again, in another 25
years, food supply could only be expected, at most, to increase by a similar amount
every 25 years. The population of Europe, having increased from 66 million to 180
million less than a hundred years later, was a further indication, according to
Malthus, that “population growth is the greatest threat to humanity”. This fear of
“global overpopulation” became a preoccupation of European “social scientists”
and eugenists in the 19th and 20th centuries. Galton himself argues, in Hereditary
Genius, that Malthus’ suggestion of delaying pregnancy in women until later in life
would be eugenically catastrophic – the elite, he argued, should be encouraged to
have as many children as possible. The assumption that there was a massive threat
of overpopulation by undesirables is implicit in all Galton’s theories.
The other major inspiration for eugenists and ‘Social Darwinists’ of the late
nineteenth century (and since) was the Scottish philosopher Adam Smith, who
provided a moral justification for the amassing of wealth by capitalists, and
agriculturalists in particular, in his 1776 publication An Inquiry into the Nature and
Causes of the Wealth of Nations. Oxford-educated Smith, formerly professor of Moral Philosophy at the University of Glasgow, argued that economies built on self-interest should not be constrained by the state, since, he claimed, when wealthy men are given the freedom to “advance themselves”, the capital they raise gives rise to jobs. Jobs, he assumed, are of general benefit.
In 1794, a few years before Malthus’s Essay and nearly two decades after Adam Smith sowed the seeds of modern capitalism, Dr Erasmus Darwin, British physician and naturalist,
published Zoonomia, in which he laid the foundations of modern biological
evolutionary theory, which was later built into fundamental scientific dogma by his grandson, Charles Robert Darwin. Contrary to the accepted doctrines of the Catholic and Protestant Churches, Erasmus Darwin suggested that the earth was millions of years old, and raised the possibility that “all warm-blooded animals
have arisen from one living filament”. Erasmus Darwin was also a Fellow of the
Royal Society, an honour bestowed on several (male) members of the Darwin and
Galton families.
As we have seen, Francis Galton (1822-1911), the first self-professed eugenist
(and coiner of the term ‘eugenics’), was another grandson of Erasmus Darwin and
published Hereditary Genius in 1869. In this, his first and most influential book, he
argued that the self-designated “upper classes” (to which he, of course, belonged)
should be encouraged to have more children, whereas the “lower classes” should
be induced, or compelled, to have fewer. As a member of the esteemed Darwin
family, Galton was in no doubt that his own genius-loaded family was a good
example of the “superior stock” that should be favoured – a view readily accepted
by many of his family and colleagues. Drawing inspiration from his cousin Charles
Robert Darwin’s work on natural selection of variable features as a dominant force
shaping the evolution of different species, Galton adapted the theory of natural
selection to human psychology, anthropology and sociology, arguing that the
privileged position of the ruling (‘upper’) classes and races, and dominant
individuals within these ‘classes’ and ‘races’ was the result of natural superiority.
In doing so he further developed the idea of “survival of the fittest”, first attributed
to Herbert Spencer, and apparently accepted as an “accurate and convenient”
alternative term for “natural selection” by Darwin himself (Moore, 1964, p.43).
“Survival of the fittest” became the motto of so-called Social Darwinists, who
assumed that the richest were the “fittest” and thus “naturally deserving” of their
dominant position in human affairs.
Charles Darwin had published The Origin of Species in 1859 after several years of
careful deliberation and meticulous scientific observation and analysis. His search
for truth, despite the opposition and ridicule he anticipated and received, is clearly
evident in this undeniably brilliant book. Galton’s work did not display similar
scientific rigour. The 1944 publication by Columbia University Professor of
Psychology, S. Stanfield Sargent, in Great Psychologists, explains Galton’s flawed
methodology for studying “genius”, while nevertheless painting a glowing portrait
of the founder of the Eugenics Movement:
“A book called Hereditary Genius was published in 1869 by Francis
Galton, a brilliant free-lance British scientist who helped launch psychology
on its scientific career. Besides studying heredity and founding the eugenics
movement, he devised many mental and physical tests and statistical
methods for interpreting differences between individuals.
“Galton believed genius is inherited and undertook to prove it. He
selected names of nearly a thousand of the most eminent statesmen, military
and naval leaders, and professional persons, who had lived in the British
Isles during the preceding several generations. To see whether they had more
distinguished relatives than would be expected by chance, he studied their
genealogies carefully. For comparative purposes he calculated that among all
the relatives of any thousand garden-variety Britishers, one would expect to
find only four eminent persons [a more than dubious foundation upon which
his entire argument was based]. In the family trees of his selected thousand,
however, he found well over 500 distinguished relatives. Galton believed
these results proved that genius is inherited.” (p.62)
Professor Sargent goes on to point out the most obvious flaws in Galton’s
methodology and conclusions:
“This conclusion would dishearten the obscure thousands who aspire to
fame had not later psychologists pointed out that Galton overlooked, or
possibly in his enthusiasm ignored, several important items. In the first
place, to identify eminence with genius is fallacious. Eminence implies
achievement and social recognition. Some persons attain eminence at least
partially through political favour, good name, or just plain luck. Genius
connotes exceptional talent, which shows itself in creative achievement.
Eminence and genius are related, but scientifically can not be considered
synonymous. Secondly, Galton judged eminence by his own personal
standards. Probably he chose well, but any one person almost certainly
shows bias, if only betraying preference for certain professions.
“The most significant oversight, however, lay in Galton’s slighting the
role of environment. He attached little importance to the various non-hereditary factors influencing the lives of his eminent men.” (p.63)
Galton’s genealogical studies of eminent Englishmen, and his conclusions about the heredity of “genius”, illustrate the selective blindness of his analysis, and reveal much about the prejudiced thinking that characterised the eugenists of the nineteenth and twentieth centuries. This selective blindness also affected the related “Social Darwinist” and “Mental Hygiene” movements, which have
profoundly shaped modern academic psychological, psychiatric and economic
theory and practice. More reproachably, Galton’s selection of the most politically
powerful and affluent professions as examples of superior “stock” demonstrating
“hereditary genius” can be seen as an appeal for support through flattery of those
who regarded themselves as “eminent” and “well-bred”. Elitists, in other words. In
fact, as it turned out, the eugenics movement gained much support from already
established organizations of “eminent” racists and elitists, and of elitist racists. As
Professor Sargent correctly observed, the identification of social eminence with
genius is fallacious. This is especially so in nations and societies flawed by
nepotism, class prejudice and opportunity apportioned by class. It is obvious that
hereditary aristocracies, including those of “Royal Families”, will produce many
“eminent” persons from the same family. By Galton’s measure, the current Prince
of Wales, Charles, and his mother, Queen Elizabeth of the United Kingdom, are the
possessors of genius, and likewise the current president of the USA, George W.
Bush and his father, George Bush (Sr.). Lachlan Murdoch (Rupert’s son) and James
Packer (Kerry’s son) would also, according to the methodology and theories of
Galton, be showing “hereditary genius”, as would the many affluent (and thus
‘influential’) members of various “billionaire” families of the past 150 years such
as the Rothschild family, Rockefeller family, Carnegie family, Kellogg family and
Oppenheimer family – all of whose empires supported subsequent eugenics-related
programs and research. Needless to say, the Darwins themselves were an
extraordinarily superior family, liberally possessed of “good genes”, according to
the analysis of Francis Galton.
During his long career, Francis Galton laid the foundations for many subsequently misused “experimental psychology” techniques, several of which developed into distinct “schools”. He was the first to formally use “word
association tests” and also developed various “intelligence tests” (Intelligence
Quotient, or I.Q. tests), always seeking to prove the superiority of his own class
and kin and the inferiority and “criminality” of the “lower classes” and “inferior
races”. These have subsequently been recognised, in their many variants, to be
seriously biased, politically, socially and culturally. Galton also adapted the
ancient Chinese technique of identification by finger-prints, which together with
the notorious studies of the facial features of “criminals” by the Italian
anthropologist Cesare Lombroso, were developed as additional tools to identify
“undesirables” – and make sure they did not breed.
Galton’s preoccupying concern was the large mass of humanity that he, and his
fellow eugenists, regarded as “feeble-minded”. The early I.Q. tests he pioneered
were developed as a “scientific” way of categorising various degrees of “feeble-mindedness” using a numerical scale (based, of course, on a ‘bell curve’). This was
seen as an improvement over the labels of “idiot”, “imbecile” and “moron” that
were then in common use by the medical fraternity. The term “Intelligence
Quotient” was first coined by the German psychologist William Stern in 1912, the
same year that the American Medico-Psychological Association (later the
American Psychiatric Association) appointed a Committee on Applied Eugenics,
and the First International Congress of Eugenics was held at the University of
London. Major Leonard Darwin, son of Charles, presided; the English vice-president being none other than Winston Churchill, then British Secretary for
Home Affairs, and subsequently British Prime Minister (during the Second World
War). German vice-presidents included Professor Max von Gruber of the University of Munich and Dr Alfred Ploetz, President of the International Society of Race Hygiene (Meinsma, 1998). In 1895 Ploetz had published The Excellence of Our Race and the Protection of the Weak, following which he founded the German Society of Race Hygiene in 1905, through which he promoted the selective control
of reproduction along eugenic lines in Germany. London, however, remained the
home of the eugenics movement, the first Chair of Eugenics having been
established at University College, London, in 1904.
Although the early eugenicists claimed inspiration from it, Origin of Species was not primarily concerned with human evolution, and Charles Darwin
mentions the potential ramifications of his theory to anthropology and human
psychology only at the conclusion of the work:
“In the distant future I see open fields for far more important researches.
Psychology will be based on a new foundation, that of the necessary
acquirement of each mental power and capacity by gradation. Light will be
thrown on the origin of man and his history.” (p.458)
Darwin’s prophecy about the “distant future” underestimated the extraordinary
influence of Origin of Species and of his next book, The Descent of Man (1871), in
which he detailed theories regarding the significance of “natural selection” to
human evolution – and the major role his own family members would play in
extolling his work and promoting it, together with their own, to a wider audience
and sphere of influence. Specifically, Charles Darwin’s cousin, Francis Galton, and
his son, Major Leonard Darwin, used their famous relative’s name, and, with
certain distortions, his theory, to promote Galton’s own concept of “eugenics”. This
did not occur in the “distant future” as Darwin had envisaged – the first American
eugenic laws, resulting in the castration of all inmates of the Michigan Home for
the Feeble Minded and Epileptic in the US State of Michigan, and 24 boys in
Massachusetts for “persistent epilepsy and masturbation” and “masturbation with
weakness of mind”, among other labels of “misbehaviour”, were introduced in
1898 (Haller, 1963; Meinsma, 1998). These were America’s, and the world’s, first
“eugenic sterilization bills”, but not the first laws based on political acceptance of
the doctrines of eugenics. This dubious distinction is held by the American State of
Connecticut, which first adopted legislation regulating marriages on a eugenic
basis in 1896. Under these laws, subsequently followed by several other states,
marriage of “insane” people was forbidden. The first compulsory sterilization laws
in the USA were passed in 1907, in the state of Indiana, legislating for mandatory
sterilization of criminals, idiots, imbeciles and rapists. Eugenic laws were
subsequently introduced in 30 North American states, resulting in thousands of
sterilizations over the next 20 years (Haller, 1963; Meinsma, 1998).
Mark Haller’s Eugenics: Hereditarian Attitudes in American Thought, published
by Rutgers University Press in 1963 provides significant information about the
expansion of the Eugenics Movement in the United States of America and the early
implementation of “negative eugenic” sterilization programs:
“American laws preceded by twenty years the sterilization laws of other
countries and were pioneering ventures watched by eugenists of other lands.
But the pioneering nature of the laws can hardly explain their early crudities.
Two of the laws, by Washington and Nevada, were not eugenic at all but
purely punitive. They permitted a judge, at the time of sentencing, to impose
sterilization ‘whenever any person shall be adjudged guilty of carnal abuse
of a female person under the age of ten years, or of rape, or shall be
adjudged a habitual criminal.’ All laws passed by 1921 and many passed
thereafter applied to rapists or sexual perverts and were therefore at least
partly punitive or therapeutic in purpose. Even those that were primarily
eugenic in purpose applied sterilization to a bewildering variety of persons
with little attention to scientific classification. All applied to the
feebleminded and most to epileptics and the insane, but also included in
various laws were confirmed criminals, drunkards, drug addicts, syphilitics,
moral degenerates, and prostitutes. Doubtless the oddest bill was that
introduced by a Missouri legislator for sterilization of those ‘convicted of
murder (not in the heat of passion), rape, highway robbery, chicken stealing,
bombing or theft of automobiles.’” (p.135)
In 1930 the eugenics movement in the USA attempted to gain further support for
sterilization with the publication and promotion of Sterilization for Human
Betterment by Gosney and Popenoe. Gosney and Popenoe claimed that following
6,255 sterilizations in California between 1909 and 1929 there had been only four
deaths from the operation. They claimed the program was so successful that all
persons released from the Sonoma State Home for the Feebleminded and one out
of six of all new inmates admitted to hospitals for the insane were being sterilized
in California (Haller, 1963, p.138). In 1927, US legislation for sterilization of the “feeble-minded” was bolstered by the now notorious judgement of the eugenist judge Oliver Wendell Holmes, who held that sterilization fell within the police power of a State when he upheld the sterilization of “feebleminded” Carrie Buck, who had given birth to a baby “supposed to be mentally deficient” while in the Virginia State Colony for Epileptics and Feebleminded. Justice Holmes declared:
Holmes declared:
“It is better for all the world, if instead of waiting for their imbecility,
society can prevent those who are manifestly unfit from continuing their
kind. The principle that sustains compulsory vaccination is broad enough to
cover cutting the fallopian tubes…Three generations of imbeciles are enough.” (Holmes quoted by Haller, 1963, p.139)
The development of eugenics theory and practice was initially focused on the
“mental powers” and “intelligence”, and eugenics thus became a particular
preoccupation of psychologists and psychiatrists. Francis Galton was regarded as
an eminent “psychologist” in England at the time his famous cousin was
developing the extension of his theory of “natural selection” to primate evolution,
which was published as The Descent of Man in 1871. In it Darwin revealed his
Malthusian sympathies and concerns:
“With savages, the weak in body or mind are soon eliminated; and those
that survive commonly exhibit a vigorous state of health. We civilized men,
on the other hand, do our utmost to check the process of elimination; we
build asylums for the imbecile, the maimed and the sick; we institute poor
laws; and our medical men exert their utmost skill to save the life of every
one to the last moment. There is reason to believe that vaccination has saved
thousands, who, from a weak constitution would formerly have succumbed
to smallpox. Thus the weak members of civilized societies propagate their
kind. No one who has attended the breeding of domestic animals will doubt
that this must be highly injurious to the race of man. It is surprising how
soon a want of care, or care wrongly directed, leads to the degeneration of a
domestic race; but excepting in the case of man himself, hardly any one is so
ignorant as to allow his worst animals to breed.”
Charles Darwin’s son, Major Leonard Darwin, the inaugural President of the
Eugenics Education Society, took the argument for aggressive measures against
this perceived risk of “degeneration” several steps further in his 1926 book The
Need for Eugenic Reform. In it he argued for State-controlled “mate selection”
(matching “like with like”), government grants and “philanthropic support” for
favoured couples, segregation (in prisons or by forced “emigration”), strict
immigration laws based on eugenic considerations (obviously barring “coloured”
races), and political measures against what he called “miscegenation” (the
deterioration of “pure” races by intermarriage, based on the assumption that the
resulting “half-breeds” are inferior to either “pure race”). This could be a problem,
argued Leonard Darwin (who was a soldier by profession), in South Africa, where
the British were attempting to create a “White colony” – in order to exploit the
extensive diamond and gold deposits that had been discovered there.
In a chapter titled “Miscegenation”, Leonard Darwin extrapolates directly from his father’s theory of white and dark skin being respectively suited to cold and warm climates, to come up with a racist theory about “mulattos” or “mongrels” (as he later refers to people of mixed race):
“There is another purely theoretical consideration which may be worth
mentioning as a point to be held in view whilst our knowledge in regard to
miscegenation is accumulating. When a cross takes place between a person
belonging to a dark race accustomed to a hot climate and a fair person
belonging to a race originating in a cold climate, the different members of
the resulting family will receive different amounts of black and white
elements of their ancestral make-up; and the same will be true of all their
subsequent descendants. Now it is natural to suppose that those persons with
the greater share of white blood in their composition would be more likely to
survive in a cold country, whilst those with the larger share of black blood
would be relatively more likely to thrive in the tropics. And if there is any
truth in this hypothesis, we should expect that a mulatto race, for example,
would become whiter and whiter in cold climates, and darker and darker
under a tropical sun, these changes taking more and more slowly as time
went on. It is also conceivable that mixed races, like certain hybrid animals,
are less fertile than either of the pure races. Whether any of these facts can
be brought forward in favour of either of these hypotheses is doubtful; but I
know of none that can rule them out altogether. And these possibilities are
worth considering in a country like South Africa; for it may be that all the
white blood which has been absorbed in miscegenation is in truth all the
time slowly diminishing in quantity and may in time disappear entirely; a
possible contingency which seems to make it all the more desirable to
endeavour to maintain the purity of the white race by the avoidance of race
mixture.” (p. 497)
At the end of The Need for Eugenic Reform is a single propaganda page for the
“Eugenics Education Society”. Defining “eugenics” as “the study of heredity as it
may be applied to the betterment, mental and physical, of the human race”, it
provides the following list, in answer to the question “How can practical results be
obtained?”:
1. By educating the general public to think eugenically, and to recognize the
responsibility of parenthood.
2. By furthering such measures or tendencies as would prevent the diminution
of superior types or the multiplication of the inferior.
3. By insisting that the palpably unfit and degenerate shall not reproduce and
multiply their kind. This may be accompanied by segregation, and ultimately,
perhaps, by sterilization, voluntary or compulsory.
4. By watching and recording the results of eugenical research and legislation
in other countries.
5. By collecting statistics and facts that may be of use for legislative and
scientific purposes in this country [Britain].
6. By keeping in close touch with current legislation and executive action in
order to intervene, either with objection or support, where the racial
qualities of the [British] empire may be affected.

Shortly before Major Leonard Darwin published this list of “practical” solutions,
Adolf Hitler wrote, in Mein Kampf:
“Those who are physically and mentally unhealthy and unfit must not
perpetuate their sufferings in the bodies of their children. Through
educational means the state must teach individuals that illness is not a
disgrace but a misfortune for which people are to be pitied, yet at the same
time it is a crime and a disgrace to make this affliction the worse by passing
it on to innocent creatures out of a merely egotistical yearning.” (Hitler,
1925)
Prior to Hitler’s extension of negative eugenic educational, “social” and medical programs to the extermination of “degenerate races”, the Nazi regime had for several years been systematically killing individuals unfortunate enough to be labelled as having a “serious hereditary defect”, or whose political and philosophical opinions the regime found threatening. Those diagnosed as “chronically mentally deficient and ill”, including people labelled with “schizophrenia”, “feeble-mindedness”, “personality disorder” and “manic-depression”, were systematically killed, along with suspected “Communists” and “anarchists”. Homosexuals, “alcoholics” and “drug addicts”, deemed morally degenerate, were also “euthanased” by Nazi executioners – “euthanasia”, or “mercy-killing”, being the euphemism used to conceal the crime. The Nazi extermination campaign, also known as the “Holocaust”, thus included the mass murder of a range of racial, religious and political targets.

The first formal eugenics organization in Germany, the Society for Race Hygiene, was founded by Dr Alfred Ploetz in 1905. In 1907 the International Society for Racial Hygiene was founded, composed mainly of German eugenists, and the next year all future mixed marriages in German South-West Africa (now Namibia) were banned, while existing mixed marriages were “dissolved”. At its annual meeting in 1914 the German Society for Race Hygiene adopted a resolution on the subject of applied eugenics, arguing that the
future of the German people was at stake and that “the German empire can not in
the long run maintain its true nationality and the independence of its development
if it does not begin without delay and with the greatest energy to mold its internal
and external politics as well as the whole life of the people in accordance with
eugenic principles” (Popenoe, Johnson, 1920, p.163).
In 1914 Morton Aldrich, professor of industrial hygiene at the University of Louisiana, wrote of his and Winston Churchill’s opinion about what should be done to men and women who are “obviously unfit” (those who had been labelled “idiots, insane, feeble-minded, depraved or criminal”):
“To men and women of these classes parenthood must be effectively and
permanently denied. In the words of Mr Winston Churchill, they deserve all
that can be done for them by a Christian and scientific civilization now that
they are in the world, but their curse must die with them and not be
transmitted to future generations. In self-protection, society owes it to the
future to prevent these obviously unfit from bringing into the world others
like themselves.” (p.304)
If Churchill’s ideas regarding the best eugenic approach to the “feeble-minded” are combined with Galton’s ideas about the inferiority of “black races” (which, from his many writings, Churchill clearly shared) and the definitions of “feeble-mindedness” current in the 1920s, it is easy to recognise the common rationale behind Hitler’s genocidal ‘eugenics’ program and other genocidal programs against “coloured races”. In Applied Eugenics (1920), Paul Popenoe (editor of the
American Journal of Heredity) and Roswell Johnson (professor at the University of Pittsburgh) defined “feeble-mindedness” as:
“A condition in which mental development is retarded or incomplete. It is
a relative term, since an individual who would be feeble-minded in one
society might be normal or even bright in another.”
If one also considers the same book’s definition of “segregation”, the rationale behind the monstrous “colour bar” in British and American colonies, and in South Africa and Australia, becomes clearer:
“As used in eugenics [segregation] means the policy of isolating feeble-minded and other anti-social individuals from the normal population into institutions, colonies, etc., where the two sexes are kept apart [to prevent breeding]”
It is evident, then, that the Nazi extermination program did not appear from nowhere, nor was it the sole creation of Adolf Hitler. Many respected doctors, scientists and academics in Nazi Germany worked together to develop a complex “mental hygiene” and “racial purification” program, based on the arguments of
mainly British and American eugenicists of the previous 50 years. These will be
explored in the following pages, since they may shed light on some of the most
serious diseases threatening humanity today. Specifically they may explain the
worsening global epidemics of HIV/AIDS, drug addiction and mental illness of the
past 20 years that have been amply documented, but not adequately explained.

Chapter 4
NEGATIVE EUGENICS PROGRAMS
Today few people publicly admit to being eugenicists, owing to the enduring memory and historical record of Nazi atrocities during the Second World War. This was not the case in the late nineteenth and early twentieth centuries, when Eugenic Societies, composed of self-professed eugenists, were established in most European nations, in North and South America (especially the USA and Canada), and in Japan, South Africa, New Zealand and Australia. These
societies unanimously assumed that “white superiority”, at least in terms of intelligence, was a “scientifically proven fact” – although they continued to try to prove just how superior, by devising various “intelligence tests”, measuring skulls and brains, and taking tissue samples for laboratory analysis. Massive funding for a world-wide eugenics movement was provided by American industrialists and financiers, including the Rockefeller and Carnegie corporations.
John D. Rockefeller, oil magnate and founder of the Rockefeller Corporation, established the “Bureau of Social Hygiene” in 1904 to study prostitution, venereal disease and delinquency, all of which were indications of “degeneracy” according to eugenics propaganda. In the 1930s the Rockefeller Foundation contributed funds
for the study of “human heredity” to the Galton (Eugenics) Laboratory at the
University of London, and to other university-based eugenic-related research
programs in Great Britain, France, Holland, Belgium, Germany, Sweden, Norway
and Switzerland (Fosdick, 1952, p.150). The Rockefeller Corporation also
established, prior to the First World War, several research stations and universities
in Africa and selected parts of the “Third World” (then European colonies and
‘protectorates’). According to Raymond Fosdick’s The Story of the Rockefeller
Foundation (1952), the Foundation began a “co-operative relationship” with the
National Committee for Mental Hygiene in 1913, and progressively expanded its
activities in the area of psychiatry in the 1930s, during the time of Dr David Edsall,
who was a trustee of the Foundation and also the Dean of Harvard Medical School.
Fosdick writes:
“In 1933, when The Rockefeller Foundation began to concentrate in this
field, it can fairly be said that teaching in psychiatry was poor, research was
fragmentary, and application was feeble. Some American medical schools
had no departments at all in psychiatry, neurology and allied specialties;
some had primitive and inadequate departments; and a few had departments
which, though fairly well organized, were incomplete or isolated from the
other activities of the school.” (p.148)
Dr Edsall suggested, in his report to the Foundation, that the Rockefellers pursue
parallel objectives in the area of psychiatry – “the development of good teaching in
a few medical schools with the integration of psychiatry into the regular medical
curriculum”, and “the support of scientific research” (Fosdick, 1952, p.149).
Following Edsall’s report, the Foundation poured money into selected American
universities for the development of “departments of psychiatry”. These included
Harvard, the University of Chicago, Duke University, Johns Hopkins University, the Universities of Colorado, Michigan and Tennessee, and the Institute of the Pennsylvania Hospital. In Montreal, Canada, the Rockefeller Foundation
established an institute of neurology and neurosurgery at McGill University, under
the leadership of Dr Wilder Penfield. The Rockefellers can truly be said to have
changed the face of psychiatry in the modern world, and strongly promoted the
doctrines of “Social Darwinism” in medicine, as did the Carnegie Corporation and
the Rothschilds Banking Corporation. “Social Darwinism” will be further explored
in later chapters, as will the “Mental Hygiene Movement” (subsequently renamed
the “Mental Health Movement”).
In the areas of infectious disease and cancer research and surveillance, two fields of importance in understanding the cause and history of AIDS, the Rockefeller Corporation had several famous, and infamous, achievements. As long ago as 1910, the Rockefeller scientist Dr Peyton Rous was the first to transmit cancer, in this case a virus-induced sarcoma (a cancer of connective tissue), between chickens. He was later awarded the Nobel Prize in Physiology or Medicine for this achievement.
Less commendable was the work of Dr Cornelius Rhoads who, with Rockefeller funding, transplanted cancer, with fatal results, into several people in Puerto Rico in the 1930s. Rhoads was rewarded with a seat on the Atomic Energy Commission, a Legion of Merit award and a senior position in the United States Chemical Warfare Service during the Second World War.
Rockefeller scientists and doctors were pioneers in field observations, including
the collection of tissue samples (mainly blood, but also other tissues, both diseased
and healthy) in the so-called ‘Third World’, while academics at Rockefeller-funded
universities and colleges themselves oversaw the changing terminology from
“backward nations”, to “underdeveloped countries”, to “developing nations”, to
“the developing world” and finally to the “Third World”, as more “politically
correct” terms were sought by the “Western”, “industrialized” nations to
collectively describe the dozens of nations in Africa, South America and Asia that
were previously the “colonies” of European imperialists. Especially since the conclusion of the Second World War and the formation of the United Nations organization from the embers of the League of Nations, the Rockefeller Corporation has ostensibly encouraged the independence, decolonization and democratization of poorer nations, and poured many billions of dollars into health and education programs and institutions in the most impoverished parts of the world. The ever-widening gap in life expectancy and levels of infectious disease between the poorest people in these countries (or even their national averages) and those of the USA and other ‘Western’ countries has resulted either in spite of these programs or because of them.
The Carnegie Corporation, founded by the steel magnate Andrew Carnegie, established the first eugenics laboratory in the United States, also in 1904: a “biological experiment station” named the “Station for Experimental Evolution”, at Cold Spring Harbor, Long Island, under the direction of the prominent eugenist Charles Davenport, which later became the “Eugenics Record Office”. In 1939 the Carnegie Corporation changed the name of the Eugenics Record Office to the “Genetic Record Office”, following increasing awareness in the United States and Britain of the excesses of eugenists in Nazi-controlled Germany.
The Californian eugenics movement took an uncritical view of the German sterilization program, organizing an exhibition of the Nazi eugenics program at Pasadena, California, in 1934. According to Robert Miensma’s valuable synopsis A Brief History of Mental Therapy (1998), their newsletter announced:
“It portrays the general eugenics program of the Nazi government,
giving special attention to the need for sterilization. Those who have seen
this exhibit say it is the finest thing of the kind that has ever been produced.
Take the opportunity to see this while in Los Angeles. Tell your friends
about it.”
Between 1933 and 1945 German government “doctors” sterilized an estimated 2
million people (Miensma, 1998) and “euthanased” many millions more than they
sterilized. “Euthanasia”, literally meaning “good death”, was the term adopted by eugenists as a euphemism for the mass-murder of “valueless life” (also referred to as “life devoid of meaning”). This was initially done mainly
through poisoning – using toxic gas such as carbon monoxide, or ingested poisons.
Many were just starved, or worked as slaves until they died of a combination of
disease, fatigue and starvation. Others were killed through a range of gruesome
surgical, medical and chemical research experiments. The killings, initially of the
inmates of mental hospitals, and later of so-called “degenerate” individuals and
races, were hidden from the German public and from the outside world. The
“mercy killing” of psychiatric patients was concealed from members of the
victims’ families, as were the later “concentration camps” and “death camps” from
the German people and the outside world.
The Australian psychiatrist Professor Sidney Bloch gave details of how the killings
of “mental patients” were disguised in his 1996 Beattie lecture at the University of
Melbourne:
“Mercy killing was merciless killing. Naked patients were herded into
chambers, camouflaged as showers, and gassed with carbon monoxide by
hospital staff. Relatives were subsequently informed of the patient’s
‘unfortunate’ death from a medical condition and commiserated with.”
(Bloch, 1996, p.177)
The “criteria for death”, according to Professor Bloch, were “schizophrenia,
epilepsy, senile disorder, intellectual disorder and the like; hospitalization for 5 or
more years; an incapacity to work in a mental hospital setting; or not being of the
German race and nationality”. At Heidelberg University the killings were orchestrated by the eugenist professor of psychiatry Karl Schneider:
“Karl Schneider held an even more prestigious post as chairman of
psychiatry at Heidelberg. Alongside his celebrated academic activities,
Schneider contributed energetically to the euthanasia program. A party
stalwart from 1932, he became imbued with the Nazi vision, particularly
racial hygiene. Ironically, he was able to pursue two contradictory pathways.
On the one hand he elaborated progressive measures of rehabilitation for the
chronically ill and, on the other, participated actively in furthering both the
sterilizations and medical killings. Moreover, he developed a grand plan to
establish a research institute dedicated to biological anthropology, launching
his studies with the examination of brains derived from the victims of Aktion
T4 (other eminent academics also snatched the opportunity to examine the
hundreds of available brains).” (Bloch, 1996, p.177)
Aktion T4 was a dehumanizing code name. It was, as many such codes are, used as
a euphemism to disguise a terrible crime against humanity. Professor Bloch
explains:
“From a secret operational centre in Berlin the infamous
Tiergartenstrasse No. 4 (hence the code name Aktion T4), registration forms
were dispatched to German mental hospitals seeking data about their
patients. Completed forms were hastily examined by medical panels,
comprising mainly psychiatrists, who determined whether the patient would
live or die. Among the panellists were the most distinguished of academic
psychiatrists.”
These killings followed an edict by Hitler, in October 1939, to launch widespread
“medical killings” in the interests of “racial hygeine”. Although he mentions
Gypsies and homosexuals as targets for killing, Professor Bloch, who is Jewish,
focuses on the Nazi mass-murder of Jews, and fails to mention the killing of
Communists – thus obscuring the full agenda of the German eugenists. He does,
however, indicate the covert nature of the Nazi “eugenics” program, and observes
that Hitler “had divulged a wish to implement this policy [medical killing] as early
as 1935 but preferred to do so in the context of war when it would be more easily
concealed from the gaze of likely critics” (Bloch, 1996, p.177).
Another crime of the Nazis, the enslavement of other races and of political dissidents, was, like the German State’s “euthanasia” program and its “mental hygiene” programs, a direct consequence of the “practical” application of the theories of Francis Galton. The creator of eugenics, as we have seen, believed that
“blacks” (“negroes”) have a “slavish instinct” and are, as a result of this, “natural
slaves”. The German eugenists (who extended Galton’s bigotry to include Jews),
regarding the Germans and “Aryan race” as the most pure, noble, and superior of
an ever-increasing hierarchy of “races”, predictably viewed themselves as
belonging to the “Master Race”. Equally predictably, British Anglo-Saxon
eugenists saw themselves as the race that was destined to rule all the others. This
had, indeed, been a central assumption of the British Imperialists who profiteered from the trade of human “cargo” from Africa and Asia for two hundred years prior to the theories of Darwin and Galton.
Turning to sub-Saharan Africa, where the AIDS epidemic is currently decimating
the “black” population, the Union of South Africa, formed in 1910, was a white
union – power was shared between British and Afrikaner settlers; “black” and “coloured” races were banned from skilled jobs and had no say in government; dissidents were incarcerated in increasingly large, crowded prisons, while a “colour
bar” was instituted at all levels of society. Schools were segregated, marriage
between “blacks and whites” was forbidden by the State, and separate living areas
were designated for “blacks”, “whites” and “coloureds”. This was the notorious
system known in South Africa as apartheid (separate development), but was
actually a direct application of eugenic programs, as can be seen by comparing the
South African program with Major Leonard Darwin’s suggestions in The Need for
Eugenic Reform, in which he argued for segregation of those of “degenerate stock”
from Europeans “of good breeding”. Similar segregation and the so-called “colour
bar” were instituted in other British colonies in Africa, including Rhodesia
(Zimbabwe and Zambia), British East Africa (Kenya), Nigeria and the Gold Coast
(Ghana).
Systems of government, including legal, education and health systems, were
developed in the various British, German and Belgian colonies and “Protectorates”
in Africa that were explicitly white supremacist. This was less obviously the case
in French colonies, which were encouraged to become part of the “French-speaking world”. Elite Africans were selected for education in France, and allowed to reach positions of eminence in French society – both in France and in its
colonies. This is not to say that the French did not cruelly exploit Africans in Africa
and in France. The French, too, were guilty of profiteering from the African
slave trade – although they were the first of several European nations to officially
condemn slavery, and set their empire’s slaves free (in 1794).
The British in Africa had a different agenda from the French in terms of the
dissemination of their language in Africa. As long as “natives” knew enough English to understand orders and work in mines and plantations, the British were not keen to educate them in key aspects of the language. The command of
English necessary to understand the language of politics, law and science was
jealously guarded – and taught only in elite schools and universities. The oldest of
these, Oxford and Cambridge, claimed authority over all the “colonial universities
and colleges” distributed in various cities around the “Empire”. This has remained
the case until very recent times. It was scholars at Oxford and Cambridge who, with theologians from Westminster Abbey, translated the King James Version of the Bible in 1611 – the English-language Bible that has itself now been translated into hundreds of other languages, and on which oaths are sworn in Australian and other British Commonwealth courts to this day. When the first university in North America was founded in 1636, the site chosen was named Cambridge (in Massachusetts); three years later the university was named Harvard, after the Puritan cleric John Harvard. Nowadays Harvard is regarded as the premier university in the world, in medicine and science at least.
Harvard became a centre of American eugenics, which perhaps found even more enthusiasts there than in the British Isles (as was the case with universities in Australia and South Africa, where racist “white supremacist” ideas were already well established in political and social theory and practice). Major Leonard Darwin, the British president of the Eugenics Education Society, was invited in 1907 to give a series of lectures at Harvard.
Despite seeking support among white supremacists in the USA, English eugenists
remained xenophobic after the First World War, as can be seen in the worries of
Professor C.W. Saleeby, in The Eugenic Prospect (1921):
“What is really involved in ‘the survival of the fittest,’ the most famous
of all of Herbert Spencer’s phrases – indeed, the most famous phrase of the
nineteenth century. On our earth at present, as ever, there are nations or races
which rule and others which are ruled – some Imperial and others almost
slaves. We have inherited the Imperial status, and, having lately shared in the
conquest of Imperial Germany, we seem to have finally confirmed that
status. On the contrary, I believe that in the essentials of national conduct we
were more secure in 1914 than we are today. We deceive ourselves when we
suppose that we are an Imperial race by right divine, by Nature’s final
verdict; and some of us, I fear, suppose that it is even our Imperial privilege
to idle and speculate, to guzzle and booze, while common folk and subject
races slave. But is it true that an Imperial race is one which idles and
gambles and plays whilst servile races work?”
Professor Saleeby, who believed that the most important measures the British
government needed to take were to develop its chemical industry (especially
agricultural chemicals) and build more houses for those married couples who
wanted to have children (and were of the right breeding), was another Fellow of
the Royal Society, and a medical doctor. He was also a Fellow of the Obstetrical
Society of Edinburgh, Vice-President and Vice-Chairman of the National Council
of Public Morals, Member of the National Birth-Rate Commission from 1913 to
1916 and Chairman of the Executive of the World League Against Alcoholism in
Washington. As a Scottish Fellow of the Royal Society, Professor Saleeby managed to come up with a theory that allowed him to condemn other races for drinking alcohol, but not the Scots:
“Yet it would seem that the typical Scotsman, with his motionless face
and hands, and his even voice, almost needs his native whisky to make him
feel as jolly as many southerners do without it.” (p.91)
In The Eugenic Prospect: National and Racial, Dr Saleeby writes at length about
his two main concerns – racial poisons and dysgenics (the opposite of eugenics).
He regards alcohol and venereal diseases as racial poisons, resulting in the
“poisoning” of racial “stock”. Apparently with the blessing of his “master” Galton (as Saleeby refers to the founder of eugenics), Saleeby added a third objective, in addition to negative and positive eugenics, to the British National Eugenics agenda: “the protection of parenthood from the racial poisons”, for which he coined the term “preventive eugenics” (he had previously coined the term “negative eugenics”). He writes:
“In May, seventeen years ago, the late Sir Francis Galton founded
modern eugenics in an address to our newly-formed Sociological Society in
London. None who heard will ever forget it. In later years it was my
privilege, with his approval, to extend the concept of National Eugenics
under three heads:
Positive Eugenics, the encouragement of worthy parenthood.
Negative Eugenics, the discouragement of unworthy parenthood.
Preventive Eugenics, the protection of parenthood from the racial
poisons –
and to introduce the term dysgenics (in analogy with eupepsia and
dyspepsia) for the opposite of what the great pioneer desired.” (p.30)
“Dysgenics” is a term used by Saleeby and his followers to refer to the result of the
“breeding of inferior stocks and undesirables”. It led, according to their theory, to the degeneration of the human race generally. Negative eugenics (at first, sterilization
and later ‘euthanasia’) was thus a practical program to prevent dysgenics. The
additional measure of “preventive eugenics”, ostensibly to protect “worthy
parents” (‘white’, ‘educated’, ‘professional’, ‘upper class’) from the ‘pollution’ of
“racial poisons” endemic in the “lower classes” and “lower races”, was primarily
concerned with alcoholism and venereal disease (particularly syphilis). These,
claimed Saleeby, could lead to degradation of individuals and entire races.
In fact, Dr Saleeby was correct in claiming that excess alcohol consumption can
destroy the lives of individuals and families. It can bring social and political chaos,
and it can be used deliberately for such reasons. Alcohol is the most obvious of the
“natural” chemical warfare agents, and, along with opium, was an integral part of
the British Empire’s armoury when seeking to expand their territorial
“possessions” by subjugating new lands. The British Empire never claimed to be
motivated by a desire to subjugate, however, although they did often use the term
“exploit” (which did not then have the negative connotations it has today).
Economics and history texts up until the 1960s continued to describe the “exploitation” of a country’s resources as a good thing. Exploitation was synonymous with “development” and “civilization”, and included the exploitation of both natural resources (land, minerals, forests, plants, animals etc.) and “human resources” – people, in other words. When the British Empire expanded in the 17th and 18th centuries, it built a massive slave trade, in competition with those of the
Dutch, French, Spanish and Portuguese empires. African slaves were the main
“human resource” that was shipped to the Americas, when the “New World” was
claimed in the name of the various European monarchs.
“Imperialism”, or the building of empires, was a prime motive for the European
territorial battles over the newly “discovered” mineral-rich American, African and
Australian continents that eventually culminated in the First and Second World
wars. It was assumed that if “they” did not get control of any newly discovered, or
strategically important territory, their rivals would. Initially controlled by aristocratic monarchies, and by the Churches from which they gained “divine sanction” and clerical support, the entire European colonial expansion assumed the supremacy of the “white race”. This was the case long before the German “Father of Anthropology”, Johann Friedrich Blumenbach, divided humanity into five (competing) races
in 1775. Appropriating the term “race” from the French biologist Buffon,
Blumenbach developed a classification of humanity according to skin colour and
geographical origin into five races: Caucasian or ‘white’, Mongolian or ‘yellow’,
Ethiopian or ‘black’, American or ‘red’ and Malayan or ‘brown’ (Dunn,
Dobzhansky, 1952, pp.109-10). This division of humanity according to skin colour
has resulted in unimaginable misery over the past 200 years. Blumenbach
provided a simple classification of humanity that all but the blind could easily use
– one that could be used to compare people of “colour” with “whites” and to
compare “blacks” with “whites”. The results of the copious data that
“anthropologists” and “sociologists” collected could be analysed by
“psychologists” in “eugenic laboratories” and acted upon by educators and
propagandists, psychiatrists and physicians, politicians, priests, financiers and
industrialists. All had a vested interest in demonising the poor and idolising
themselves, their ancestors and families, and their teachers and “institutes of
learning”.
Today, as a direct continuation of this tradition of medical (and social, anthropological and psychological) colour-based and race-based division, US statistics on the distribution of AIDS distinguish victims as being ‘Black’, ‘White/Caucasian’, ‘Hispanic’, and ‘other’. The category of ‘other’ is sometimes divided into ‘Native Americans’ and ‘Asians’.
As the present work is mainly concerned with the medical impact of eugenics, the
chain of command and of responsibility for the actual forced segregations,
kidnappings, mutilations, sterilizations and killings that resulted from negative
eugenics in the first half of the twentieth century needs to be considered. This can
be expected to shed light on the existing chains of command resulting in the worst
crimes against humanity occurring in the world today.
The founders of the first “Eugenics Society”, based in London, were Francis Galton and Charles Darwin’s son, Major Leonard Darwin. Major Darwin was not a physician or a psychiatrist; nor was the first architect of eugenics and coiner of the term, Francis Galton. Charles Darwin, from whom these “Fathers of Eugenics” claimed inspiration, was the son of a physician, and had been encouraged to become one himself, but instead studied to become an Anglican priest before embarking on his historic career as a biologist. Galton, also
groomed by his father to become an eminent physician, failed to complete his
medical training when his father died and left him a considerable monetary
inheritance. Trained and qualified physicians, however, have long been responsible
for the most heinous crimes attributable to the implementation of negative eugenic
programs. It is my own profession that has actually decided who should be
sterilized, who should be experimented on and who should be killed. It has been members of the medical profession who have given the actual orders and, in the
case of surgical sterilization, done the actual mutilation (at their most militant, in
the 1930s, some American eugenists were calling for the surgical sterilization of
up to 15 million Americans – it takes little imagination to guess what skin colour most of these would have had). In the case of eugenic murder and sterilization by
chemicals, it was trained medical doctors who authorised and orchestrated the
killings. Medical doctors also gave orders for people to be killed, crippled and sterilized by injection, but less often gave the injections themselves – unless the doctors did not know that what they were ordering to be injected was likely to cause death, illness or sterility. The latter has, however, undoubtedly been the case in most instances, especially in more recent years.
These disturbing facts have been ignored in most of the few books that have
mentioned the subject of eugenics in a positive light since the Second World War,
including the influential 1952 Mentor Books publication Heredity, Race and
Society, by the American Zoology Professors L.C. Dunn and Theodosius
Dobzhansky, both of Columbia University and both members of the American
Philosophical Society and the American Academy of Science:
“More enthusiasm [than for ‘positive eugenics’] has been shown in many
places for negative eugenics, which urges elimination of undesirable genes
by discouraging or making it impossible for persons who show the effects of
such genes to have children. Since voluntary abstention from parenthood
may be difficult, ‘sterilization’ for individuals who are likely to have severe
hereditary defects is recommended. Sterilization is accomplished through a
surgical operation; the operated individuals are by no means ‘unsexed,’ show
no outward signs of having undergone the treatment, but are unable to beget
children. Sterilization laws are now on the statute books in many states and
in some foreign countries. Some of them provide for sterilization only with
the consent of the persons involved or of their guardians, others make it
compulsory but controlled by the courts.” (pp.85-6)
The selective re-writing of medical history had already begun. Professors Dunn
and Dobzhansky would undoubtedly have known about far more gruesome
applications of “negative eugenics” in the (then) very recent past. Although
reference is made in an earlier chapter to Hitler, they fail to mention that the
genocide of “non-Aryans” by the Nazi leader was the direct application of negative
eugenics theory. The criminal act of genocide, internationally recognised as the
most grave of crimes against humanity, was widely recognised, in the 1950s, to be
intimately connected, historically and ideologically, with the theory and practice of
eugenics. The immediate precedent of eugenics resulting in genocide in Nazi
Germany in the 1930s and 1940s, when millions of men, women and children were
murdered after enslavement, torture (often in the guise of ‘medical
experimentation’) and dispossession, could barely have escaped the knowledge of
Professors Dunn and Dobzhansky.
The Nazi eugenic programs, as many would know, targeted races and individuals deemed to be “degenerate”. The racial groups targeted for genocide included Jews, Gypsies and Negroes, who were seen as threats to the “purity” of the “superior” Aryan race – they were killed in the ostensible interests of “racial hygiene”.
Hitler’s Nazi regime conducted parallel programs of positive and negative
eugenics. The ‘positive eugenics’ programs included the selective education and
political and social favouritism of those Aryan children and adults with the “ideal
German” features of blond hair and blue eyes. These were, according to prevailing
anthropological and eugenics doctrine, Caucasian Aryans of the (‘superior’) Nordic
Type, who were seen as destined to rule over darker-featured Aryans (with the
notable exceptions of Hitler himself and other dark-haired, dark-eyed members of
the Nazi party).
Prior to the First World War, Britain had ambitions of creating a “white” colony in
control of Sub-Saharan Africa, having already gained control of the previous
“white masters” of the Cape, Transvaal and Natal colonies. These were the Boers
(Afrikaners), descendants of Dutch and German settlers of the 18th and 19th centuries, who had surrendered to the British in 1902, concluding the “Boer War”, during which over 20,000 Boer women and children perished in British-constructed concentration camps. The Boer surrender allowed the British South
Africa Company, headed by the diamond baron Cecil Rhodes, to control most of
Southern Africa. Rhodes’s main threat came from the imperial designs of Germany
and from Britain’s traditional enemy, France. The French had claimed much of
western and northern Africa prior to the Napoleonic wars, and also had significant
mining interests in South Africa until Rhodes bought out the French Company, in
the 1890s, with assistance from the Anglo-Jewish Rothschilds banking family. The
Rothschilds had previously financed Britain’s acquisition of French shares in the
Suez canal, which had been constructed by African labour under European orders
in the 1860s, and also financed other British Imperial ventures, including gold-mining ventures in Australia.
When the British abolished slavery in 1833, the Anglo-Jewish Rothschild banking
family arranged for the British Government to raise the necessary sum of 20
million pounds – a vast sum of money at the time. This money was not paid to the
victims of slavery but to those who perpetrated it. Slave owners were
“compensated” for losing “their” slaves. Later, when the Rothschilds arranged
finance for the British diamond-baron Cecil Rhodes to “purchase” most of
Southern Africa so he could expand his gold and diamond empire north from South
Africa, a blind eye was turned to the fact that Rhodes was still using African slaves
to do the hard and dangerous work in the mines (many years after slavery was
‘abolished’, as in the case of Leopold’s ‘Congo Free State’), while white
“managers” and “administrators” lived in splendour at their expense. The same
occurred in Kenya, Ceylon, India, Australia and elsewhere in the British
“Commonwealth” of Nations.
The history of recent slavery is inextricably tied to the history of eugenics, not least
because the Father of Eugenics, Francis Galton, scientifically “rationalised”
slavery as “natural” for “black races” (‘Negroes’). Galton, like his eminent
followers, believed that the “master-slave” relationship was the “natural order” of
things – in human society and in the animal world. “Natural superiority” leading
to “natural dominance” of the “white race” was a central assumption of the
eugenics movement – an assumption that eugenics laboratories financed by the
Rockefeller and Carnegie corporations attempted to prove using various
“technological” and “mathematical” methods. Foremost among the concerns of
eugenists was that of overpopulation in the countries with mainly “yellow”,
“black”, “red” and “brown” races – Asia, Africa and South America, in other
words. Those who shouted the loudest about “too many people in the world”
continued to have large families themselves, and promote an obscene accumulation
of wealth among a tiny elite of “white” families. This was the situation in 1905
when Leonard Darwin and Francis Galton formed the first “Eugenics Society” in
London, and this remains the situation today.

Chapter 5
PARANOIA ABOUT POPULATION GROWTH
It has been reported in the medical literature and by the World Health Organization
(WHO) that there have been more deaths from AIDS in sub-Saharan Africa than in
the rest of the world combined: an estimated 30-40 million people. Another 30-50
million men, women and children are reported to be infected with HIV in
Africa, mostly south of the Sahara desert. It is a historical fact that the epidemic is
particularly bad in countries that were said, in the 1960s and 70s, to have an
“unacceptably high” rate of population growth.
With the so-called “population explosion” of the 1960s, cries of “global
overpopulation” became even more strident than before the First World War – led
by the same American, British and Australian universities that had ardently
promoted eugenics and related white-supremacy policies and dogmas before the
Second World War. Despite the recognised fact that most of the world’s resources
were being consumed (and even more of the pollution produced) by the affluent
“First World”, blame for “overpopulation” was routinely allocated to the
“developing nations” and China. South-East Asia, South Asia, South and Central
America and Africa were specified as having a catastrophically high “population
growth” by “population experts” in the USA, such as Stanford University’s Paul
Ehrlich.
Ehrlich has written several books about the “population problem” over the past 30
years, starting with The Population Bomb in 1968. In the preface to his 1990 The
Population Explosion Ehrlich writes about his continued obsession:
“In 1968, The Population Bomb warned of impending disaster if the
population explosion was not brought under control. Then the fuse was
burning; now the population bomb has detonated. Since 1968, at least 200
million people – mostly children – have perished needlessly of hunger and
hunger-related diseases, despite “crash programs to ‘stretch’ the carrying
capacity of Earth by increasing food production.” The population problem is
no longer primarily a threat for the future as it was when the Bomb was
written and there were only 3.5 billion human beings.” (p.9)
Ehrlich continues, dramatically:
“The size of the human population is now 5.3 billion, and still climbing.
In the six seconds it takes you to read this sentence, eighteen more people
will be added. Each hour there are 11,000 more mouths to feed; each year,
more than 95 million. Yet the world has hundreds of billions fewer tons of
topsoil and hundreds of trillions fewer gallons of groundwater with which to
grow crops than it had in 1968.”
In 1968 the AIDS epidemic was still fifteen years away. When Paul Ehrlich
published The Population Explosion in 1990 the epidemic was already ravaging Africa,
but the Stanford professor still warns that “human populations are exploding at
record rates in Africa” despite AIDS. This claim is made in a chapter titled
“Population and Public Health”:
“AIDS is caused by a special kind of virus, known as a retrovirus. It
invades white blood cells, which play a crucial role in providing immunity
to disease.
“The original source of AIDS is thought to have been an African
monkey, a close relative of the vervets that gave us Marburg virus; but the
African origin of this disease has not been conclusively demonstrated, and
the suggestion has been highly controversial because of connotations of
responsibility. Nonetheless, human populations are exploding at record rates
in Africa, ecological situations have been changing dramatically,
malnutrition (and thus impairment of immune systems) is widespread, and
contact with our primate relatives is more extensive there than on any
other continent. In addition, close relatives of the AIDS viruses have been
isolated from various African monkeys but not from wild monkeys living on
other continents. So the opportunity for transfer from an animal host was
probably higher there than anywhere else, and the inference is not
unreasonable. The question of whether the virus transferred long ago and
only ‘broke out’ in response to recent deleterious changes in the human
ecology of Africa, or whether the virus only invaded human beings in the
last few decades, is much more in doubt.” (p.146)
Ehrlich’s claims regarding “close relatives of the AIDS virus” being found in
African monkeys will be discussed later, as will the claim that Marburg virus
originated in wild vervet monkeys (the name itself, Marburg virus, comes from the
German city of Marburg where the first outbreak of Marburg haemorrhagic fever
occurred). First we might look at the contentious claim that Africa is
“overpopulated”, and was so in 1990 when Population Explosion was published. It
is worth noting that the publishers of Paul and Anne Erlich’s Population Explosion
is the massive Simon and Schuster publication house, itself a division of
Paramount Communications. Simon and Schuster’s head office is located in the
Rockerfeller centre in New York.
Professor Paul Ehrlich was recently in Australia, and Chloe Saltau of The Age
reports that “Population biologist Paul Ehrlich has a warning for Australians:
populate and you’ll perish”. In a feature article titled “The bombs still ticking”, she
wrote, on 19th December, 2000:
“Australia’s fertility rate is at an all-time low of 1.75 babies per woman
and falling. A range of academics and politicians say that the trend away
from motherhood is alarming, that it speaks volumes about how successfully
(or otherwise) women can combine work with parenting in this supposedly
liberated era. But Ehrlich told the conference Australia’s fertility rate should
go lower, for we passed the point of sustainability long ago.”
Ehrlich, who has apparently been described as “Stanford’s Nuttiest Tenured Turkey”
by the student newspaper Stanford Review, was invited by the Australian
Population Association to speak at their biennial conference, held in Richmond,
Melbourne. To present the semblance of a “debate” and an airing of dissident
opinions, Chloe Saltau writes, “the conference organisers have
hunted the country for a speaker brave enough to take on the famed scientist, and
have settled on David Buckingham, the executive director of the Australian
Business Council”, although Professor Ehrlich has apparently declared that
“scientists don’t care what idiotic businessmen and politicians think, they care
what their colleagues think.” Saltau continues:
“And then he really takes off. Business ‘would just as soon make their
profits now and let their children and grandchildren die, and that’s what
they’re doing’, he says. ‘They’re stealing from their children and
grandchildren. There’s no question that some of us can live very happy lives
today, while the people in Africa die of AIDS and starvation and so on, but
there aren’t any barriers to keep it from happening in the rest of the world.’”
Professor Ehrlich, who was born in Philadelphia in 1932, studied evolution and
ecology, and first visited Australia on sabbatical leave in 1965, when he regarded
the CSIRO (Commonwealth Scientific and Industrial Research Organization) as
“among the finest government scientific organizations in the world”. It can be
safely assumed that his opinion was based on the fact that the CSIRO scientists
shared his paranoia about “overpopulation in developing nations”. Ehrlich has no
decent words to spare for those who disagree with his extreme views about
overpopulation.
In 1960, the American “ecologist and conservationist” William Vogt, who had
written Road to Survival in 1948, published People!, in which he demonstrates a
lack of historical, political and ecological awareness comparable to Paul Ehrlich’s.
Published by William Sloane and associates (who also published the earlier book
by Vogt), the book’s sleeve notes describe the author as a “long-time leader in the
Planned Parenthood movement” and summarise his contribution to population
fear-mongering and, it appears, to the development of modern eugenics:
“Villages and towns, fields and streams, forests and wild life – these parts
of our heritage heretofore taken for granted are threatened with destruction
by the mushroom growth of population [an unsubtle and untenable
comparison with the threat of nuclear bombs]. By 1975 we can expect a
country of vast interlocking cities, with towns and rural areas totally
engulfed in suburbia; water scarcities in wide areas with good, fresh food
becoming available to progressively fewer people; escape from the human
hive ever more difficult, with enjoyment of nature a vanishing pleasure. The
list is long, the prospect appalling, and the worst of it is that in few places in
the world will it be possible to fare better.
“At the present time, with only 7% of the world population, the United
States consumes more than one-half of the world’s raw materials. How long
will the other peoples of the world put up with such inequity when already
they suffer widespread hunger, illiteracy and the restless tides of discontent?
So critical, in fact, is the global situation becoming, that unless effective
steps to counteract it are taken immediately, there will be little hope of
saving civilization as we know it. The Congo, Southeast Asia, Latin
America, India: all are suffering pathological growing pains that are largely
of our making.”
The regions which are here claimed to be “suffering pathological growing pains”
are now suffering from growing epidemics of HIV infection and AIDS. The Congo
(Zaire) is also the specific area of Africa in which the epidemic first broke out on
that continent. In a chapter titled “A Pathology of People”, Vogt writes, of Asia and
Africa:
“No region, unless it be Africa, is in a more parlous plight than Asia.
Here, according to a recent United Nations estimate, various populations are
doubling in about twenty-five to twenty years [the original measure of
Thomas Malthus, when warning about the danger of overpopulation in the
USA, back in the late 1700s]; and within twenty years may be growing at a
rate to double populations again within seventeen or eighteen years! The
proportion of children who are, of course, nonproducing dependents and
therefore burdens on the parents and the economy [the over-riding concern
of Malthus], will have jumped to well over 40 per cent within the next
twenty years. The resources of CARE and UNICEF are hardly likely to
begin to meet the human demands of the next generation, a generation they
have helped to swell. If hundreds of thousands, and even millions, of
children starve it will be in part because of the good intentions of these
organizations. They have been conspicuously unwilling to do anything about
trying to reduce the birth rate.” (p.90)(his emphasis)
William Vogt, and other overpopulation doomsayers, were concerned that any
efforts to reduce infant mortality (which include clean water, nutritious food and
decent houses) would result in “too many mouths to feed”. This is Malthus’s
discredited and inhumane argument, which Vogt attempts to keep alive when
predicting disaster for India, taking a 1959 study by the Ford Foundation as his
inspiration:
“The experts who drew up the report have given India a virtual blueprint
of an agricultural revolution to be brought about in seven years’ time!
Should the revolution fail, a mere half-dozen years hence, we may see nearly
half a billion people in an agonizing demonstration that Malthus’s theories
were basically sound. The results – whether of Indian success or failure – are
sure to be felt far beyond the country’s borders and the consequences could
be world-shaking. And since we, in comfortable, overstuffed America, are
part of the world, what happens during the next half-dozen years in this
subcontinent may be of fateful import to us.” (p.129)
The population of India was reckoned, by Vogt, to have been 345 million in 1947,
increasing to 398 million in 1958. He claims that during this time, the first decade
following the nation’s independence from British rule, infant mortality dropped
from more than 130 per thousand live births to less than 100, while life expectancy
“jumped more than half, from twenty-seven to forty-two years”, adding, in
brackets, that “It is still not a great deal over half that in the United States and
Scandinavia”.
Why did life expectancy suddenly increase in India after the nation gained
independence? Why was life expectancy in India only 27 when the British left in
1947, given that the “normal span of a man’s life” according to the English bard
William Shakespeare was 70 years (“three score and ten years”) back in the 1600s?
The answers to these questions may cast light on similar changes in other nations
after they gained independence from European colonial rule – the so-called nations
of the “developing” or “Third” world. They may also cast light on the vexing
question of “overpopulation” – are there too many people in the world, and if so,
where are they?
In the mid to late 1960s The Age in Melbourne and other local newspapers featured
a series of articles threatening disaster for white Australians if the population
growth of “developing countries” in the region was not curbed. On 20th April 1967,
an article titled “Birth Control Stand Angers” quoted Professor F.J. Hird, professor
of biochemistry at the University of Melbourne. It was he who was angry:
“Prof. F.J.Hird of Melbourne University, told a large audience at Kew
city hall last night that he was ‘angered in the extreme’ by a religious
organisation standing in the way of population control.”
The religious organisation that angered many “population controllers” in the 1960s
was the Catholic Church, which forbade the use of contraception at the time,
including the use of condoms, the oral contraceptive pill and various forms of
sterilization. This opposition to control of the breeding (and thus the numbers) of
Catholics was ridiculed and denigrated in many ways by atheist and Protestant
eugenists, and had been since the earliest years of the eugenics movement. Mark
Haller wrote of this conflict between eugenists and the Catholic Church in
Eugenics (1963), a book which was “manufactured with the assistance of a grant
from the Ford Foundation”:
“Because eugenics was nearly irreconcilable with Catholic teaching,
many Catholic clergy and laymen entered the lists against it. A prominent
Catholic physician of Philadelphia, in fact, went so far as to claim that it was
‘doubtful whether physical defects, disease, and degeneracy can, in a
biological sense, be transmitted from parents to offspring’ and that the
children of a chaste and loving couple were ‘bound to be good.’
“Eugenists, aware that their creed met resistance in religious circles,
defended the compatibility of religion and eugenics. A psychologist at the
Hartford School of Religious Pedagogy, for instance, argued that salvation
of the world would eventually come through development of higher types of
individuals and noted that Christ himself demonstrated what eugenics might
accomplish, for ‘He came from a stock of priestly and prophetic men.’”
This line of argument must have convinced the priests (who were among the
‘eminent’ groups according to Galton’s hierarchy), because, Haller writes:
“Among the clergymen themselves, the more liberal and reform-minded
tended to be most sympathetic with eugenics…Such ministers were most
likely to be aware of recent trends in the study of crime, insanity, and
feeblemindedness and therefore most likely to support the programs of
eugenics.” (p.83)
Other articles warning about overpopulation in Australia quoted the New York
Times and various American authorities about the “population problem”. On 15th
August, 1967, in an article from the Australian Associated Press titled “Grim vision
of the year 2000” The Age informed Australians about the worrying predictions of
the American Commission on the Year 2000, published in Daedalus, the journal of
the American Academy of Arts and Sciences: Professor Harry Kalven junior,
professor of law at the University of Chicago thought that “Man’s technical
inventiveness may, in terms of privacy, have turned the whole community into the
equivalent of an army barracks”; Professor George Miller, Professor of Psychology
at Harvard was concerned that “by 2000, the limit of man’s mind to absorb
information may be reached…for many of the less gifted among us”; Professor
David Riesman, Professor of Sociology at Harvard warned that “growing pressures
for personal achievement could bring severe social tensions by 2000, as well as a
decline in manners and charm, and social disapproval of many hobbies”; while
Professor Kahn of the Hudson Institute predicted that “almost half of the American
population would live in three huge super cities: ‘Boswash’ the urban strip
including Boston, New York City and Washington; ‘Chipitts’, the area from
Chicago to Pittsburgh; and ‘Sansan’ which would stretch from San Francisco to
San Diego”. Many of these predictions are consistent with the general mentality of
eugenists prior to the Second World War, especially the “sugar-coated”
presentation of restricted parenthood:
“Parenthood would [according to the anthropologist Margaret Mead] be
limited to a smaller number of families whose principal function would be
child rearing. The rest of the population would be free to function – for the
first time in history – as individuals.”
In May 1967, the Melbourne Advertiser ran a piece designed to terrify, rhetorically
titled “World’s Baby Explosion: Is Man Populating Himself Out Of Existence?”.
The article is embellished with three small black and white photographs, to draw
the readers’ attention, and to get the point across to those who could not be
bothered reading the whole article. The first, captioned only “India…”, shows a
frowning woman holding a crying baby. The second, captioned “China…”, shows
a toddler, maybe 2 years old, who has the facial features of a Chinese child. Under
the third photo, apparently depicting “Indonesia…” (but strangely showing a
Chinese woman with a child) is the rest of the caption “Indonesia…by the year
2000 the world’s population will have doubled to 7.5 thousand million people
unless the birth-rate is controlled.”
The article “World’s Baby Explosion: Is Man Populating Himself Out Of
Existence?” warns:
“The cry of ‘populate and perish’ is going to be heard increasingly from
now on. ‘In five years the headlines will be full of the overpopulation
threat,’ an official in the population section of the US Agency for
International Development (AID) said this week.
“AID now has teams in 20 countries and is the spearhead of the US
attack on the problem of skyrocketing world population. This year its birth-control budget is $8.9m. Next financial year it will ask the Congress for
$20m. There are signs that Congress may raise this figure to $50m. For the
first time, AID is now prepared to provide the actual contraceptives or the
means of producing them to countries which ask for help.
“What is worrying demographers is the staggering rate of increase. World
population is now about 3.3 thousand million. By the year 2000 – a year
beginning to appear to be as apocalyptic as 1984 once did – it will more than
double itself to an estimated 7.5 thousand million. It is as though the globe is
breeding its own mirror image; food resources, already stretched thin, will
have to support this whole extra world within 33 years.”
Actually, the world’s food resources were not “stretched thin” in 1967 because
there was not enough food to go around – they were “stretched” artificially.
Throughout the 1960s, 70s and 80s, tons of wheat and other staple foods were
deliberately dumped in the sea to “keep commodity prices up”, rather than give the
grain to those who were starving. Increasingly, while food resources have been
“stretched” so thin in some nations and localities that millions of children and
adults have died of malnutrition, in others people have been dying prematurely
because they have been consuming too much. The gap between rich and poor has
widened in every nation in the world, and the surplus of food in some homes has
grown to an obscene degree, matched only by surpluses in the same homes of
luxury items, expensive jewellery, alcoholic drinks and perfumes, antique furniture
and so on. The poorest homes continue to have none of these things, and the
poorest people continue to have no home. As was the case in the 1960s, the world
provides enough for every person’s need but not every person’s greed, as Mahatma
Gandhi wisely observed half a century ago.
The 1967 Advertiser article on the “World’s Baby Explosion” assumes that the
increase in global population predicted by the American and Australian population
experts would not be accompanied by fundamental redistribution of wealth, or a
cessation of the practice of exploitation of the “have-nots” by the “haves”.
Emphasising the text in bold type, the grim warning (to “haves”) continues:
“The trouble is that the food-short nations are breeding faster than
ever before, while the affluent countries are slowing down. This makes
the gap between the haves and have-nots wider than ever. One rather
frightful prediction is that by 2000 6,000 m. people will be starving
have-nots, leaving the 1.5 thousand million ‘haves’ in a somewhat
precarious position.”
Under the subheading “Desperate” the article continues:
“US Agriculture Secretary Orville Freeman, an ardent anti-population-boom man, said bluntly not long ago that unless the population increase was
dramatically slowed down, by the now famous year 2000 the world would
be ‘…a world where the developed nations sacrifice compassion on the altar
of survival – feeding only themselves as they huddle behind arms-and-tariff-protected borders. A world where the trickling food supply of the hungriest
lands runs dry before it reaches everyone…and millions succumb to
starvation…a world where hopelessness breeds hostility, where the ever-growing gap between the haves and have-nots first provokes riots in the
streets, then insurrection and the toppling of Governments, then final,
desperate international aggression.”
This is a somewhat prophetic prediction – the world’s population has apparently
doubled, and the gap has certainly widened between rich and poor nations and
between rich and poor families and individuals. It is also true that rich nations have
“sacrificed compassion” – not on the “altar of survival”, but on the twin altars of
greed and militarism. In the 1960s, and even more so today, the poverty of the
“developing nations” was the direct result of exploitation by the “developed
nations”. The most obvious economic mechanism behind the worsening
impoverishment of the ex-colonial (“developing”) nations has been the crippling
effect of “Third World Debt”. The most obvious causes of deprivation of the poor
in these nations are exploitation of their resources (and themselves) by affluent
“professional” elites in these nations and the fact that these elites have consistently,
and increasingly, yielded to the suggestions of arms dealers to buy more weapons
and “security measures”.
In June 1967, the Australian ran an alarming article titled “Control population or
starve, say experts”. The article has no author other than “from our world cable
service: Washington”, and begins:
“A world crisis of staggering proportions is forecast by 1985 unless the
biggest effort ever made by man is begun immediately to solve the world
food problem.
“The warning is given in a report sent to President Johnson by the world
food panel of the President’s science advisory committee. The panel says
that the solution is biologically, technically and economically possible but
may not be politically possible in the U.S. or abroad.”
As in the 1980s and 1990s, the “world food problem” of 1967 was little different from
that of the 1940s when Mohandas Gandhi claimed that the earth provides enough
for every person’s need but not every person’s greed. The same is true today –
there is no absolute shortage of food on Earth – there is more than enough for
every man, woman and child, and a lot left over. Yet many millions go hungry,
while others die prematurely because they have consumed too much food and
drink during their lives. There is no absolute shortage of land for people to live in,
but many billions live in over-crowded environments, and become ill as a result of
overcrowding, and lack of adequate housing. Sometimes they are kept there by
walls and barricades, sometimes they are kept there by fences which exist only in
their minds. Most often they are kept there because of economic necessity – they
do not have enough money to live in decent houses, and cannot afford to escape to
a more pleasant and safe environment. The situation is all the more outrageous
because many houses remain unoccupied because poor people are unable to
“afford the rent” or “afford a mortgage”, or because they “don’t have a job”, or
“look like they may be unreliable”, or because of their skin colour – whilst wealthy
individuals, ‘landowning families’, affluent institutions and massive corporations
greedily accumulate more land, more houses and more slaves.
The Australian’s “Control population or starve” article refers to a “growing food
shortage” (at a time when grain was being dumped in the ocean to keep commodity
prices up), and says that this (non-existent) shortage “cannot be relieved without
solving the problem of population growth”. Claiming that to solve “the problem
that will exist at least by the turn of the century demands that programs of
population control be initiated now” (1967), the “expert panel” quoted by The
Australian recommended that:
“To avert widespread famine and violent political and social upheaval,
the rich nations must drastically increase foreign aid to poor nations for food
and population control programmes.”
Here we see a clear polarization between “rich” and “poor” nations. This is the
same division that has been renamed, at different times, as between First World and
Third World nations, ‘under-developed’ and ‘developed’ nations, ‘developing’ and
‘developed’ nations, ‘backward’ and ‘advanced’ nations. The search for the
“politically correct” terms to use to describe the deep economic divide that has
characterised international politics since the Second World War has been clearly
evident in literature emanating from the United Nations and its subsidiary
organizations, the World Health Organization (WHO) and World Bank (WB). Now,
the official terms are “First” and “Third” world – but if one speaks of ‘rich’ and
‘poor’ nations everyone knows which countries one is talking about. In the
apparently ‘rich countries’, however, there still remain people who live in abject
poverty and many who die prematurely due to disease and stress caused by
poverty.
In terms of the “Commonwealth”, the ‘rich’ countries, according to most books,
include Australia, New Zealand and Canada and the ‘poor’ ones include the
African nations (with the exception of white-controlled South Africa), South-East
Asian ex-colonies (except Singapore), India, Sri Lanka, the Pacific Islands, West
Indies and other nations where the majority population was dark-skinned. Skin
colour is, in fact, the most obvious (and least mentioned) thing about the difference
between “first world” and “third world” nations as defined by the United Nations.
All the “first world” nations have majority populations with “white” or “light”
skin. All the “third world” nations are mainly populated with people having
“brown”, “black” or “dark skin”. Thus, when accusations have been made by “first
world” academics that there are too many people in the “third world”,
coincidentally, it has meant that white men and women have been insisting that
there were too many brown and black people in the world – and not enough food to
feed them with. The racism inherent in the “third world overpopulation theory” is
particularly obvious in Australian newspaper reports from the 1960s onwards;
however, the difference in skin colour between those calling for “population
control” and those they demanded be controlled is never mentioned.
So what is population control? The article quoted above, and others to follow, refer
to contraception as an important aspect of population control – so obviously what
was regarded as “population control” included control of the breeding
(reproduction) of the population. This obviously involved influencing prospective
parents, with conscious intent to change the thinking and behaviour of the target
population (without which reproductive behaviour cannot be altered). This does
not necessarily involve literal control of young women and men, but it can – and
has, at times.
In the 1960s the main methods of contraception advocated by the medical
profession were the (female) oral contraceptive pill and condoms. Other methods
of contraception promoted in the “first world” as well as in the “third world” in the
1960s and 1970s were intrauterine devices (IUDs), tubal ligations (for women) and
vasectomies (for men). It is also possible, however, to sterilise women by performing a hysterectomy (surgical removal of the womb), and in the 1970s thousands of hysterectomies were performed for the purpose of sterilization rather than for disease of the uterus. Coercive hysterectomies and forced sterilizations were a particular problem in India, where the “population problem” was regarded as the most serious according to the much-publicised The Population Bomb by Stanford University’s Paul Ehrlich. During Indira Gandhi’s rule in the 1980s there were several reports in the Western press of Indian women finding, after surgery ostensibly for different problems, that their fallopian tubes had also
been cut and tied. Such atrocities were not a feature of “population control” in First World countries, nor was it common for women in the West to be given long-lasting hormone injections to render them temporarily infertile, while Depo-Provera and similar injections, which were not recommended as contraceptives in the First World (due to unacceptable risks), were widely used by doctors in the “Third World”. There has been, without doubt, a considerable difference between what has been regarded as “acceptable practice” in rich nations and what has been seen as acceptable in poor nations.
This has become a hotly debated issue in the area of AIDS treatment, centred on
the availability of drugs to treat AIDS in Africa, and drug trials on Africans by
Europeans. It is debated whether it is ethical to perform “double-blind trials” on African women and babies comparing active drug with placebo. The controversy is said to be centred on whether it is ethical to hold back drugs which “have been shown” to improve the prognosis of HIV infection. This argument is potentially misleading, because it assumes that the “benchmark drug” for the treatment of AIDS, azidothymidine (AZT, or Zidovudine), is as safe and effective as its manufacturer, Glaxo-Wellcome, claims it to be – or safer. This assumption will be challenged later in this book; however, there can be no doubt that AZT is a potentially highly toxic drug. It is for this reason that its initial use as a cancer treatment was abandoned (or seriously restricted) when the drug was first developed in the 1960s.
The eugenics hypothesis of HIV/AIDS is supported somewhat by the politics and
propaganda surrounding AZT and another controversial Glaxo-Wellcome product,
the opiate drug methadone, which is marketed in Australia as Physeptone. The “methadone program” is the cornerstone of heroin-addiction treatment by the public health system in Australia, although the drug is extremely addictive and has been soundly criticised for this reason by health workers for several decades; treatment is usually recommended for many years (thus facilitating addiction). The
Commonwealth public health system (centred in the public hospitals) in Australia
administers the methadone program, although certain general practitioners and
pharmacists in the community are allowed to prescribe the drug for “registered
addicts”. The Commonwealth Government also funds the “needle exchange
program” which is another key measure in the official “war against AIDS” and the
“war on drugs”. Needle exchange and condom distribution are a major focus of
current AIDS policy in Australia and in the USA – and are being actively exported by Australian university academics and public health experts to neighbouring parts of the “Third World”. Coincidentally, both ‘drug addicts’ and homosexuals were targeted for ‘euthanasia’ (‘mercy killing’) in Nazi eugenics programs.
The “needle exchange” and condom promotion are officially referred to as “harm
reduction” or “harm minimization” strategies. “Safe injecting rooms” are also included as “harm reduction measures”, and the same lobby groups that promote
needle-exchanges also promote injecting rooms. They maintain that “as drug use is
now a part of society we should try and contain the problem”, particularly that of
the spreading of HIV and Hepatitis (B and C) viruses. Logically, if needles are not
shared, blood borne viruses will not be transmitted between intravenous drug users,
and if new needles and syringes are used rather than re-using the same ‘works’,
blood-borne transmission of infections can be reduced. This logic is difficult to
fault, as is the logic that if contaminated needles are returned (by exchange) for
‘safe disposal’ (by incineration or burying) the risk of accidental injury to the
public is also reduced. This is not, however, the whole story, and the policy of
needle exchange can seriously worsen problems of intravenous drug addiction in
communities, while exposing more people to risk of blood-borne viruses and
introducing the habit of self-injection to more and more young people.
The needle exchange program includes both collection of contaminated needles
and syringes and the distribution of needles and syringes. There is no doubt that
collection of contaminated needles and syringes (along with other potentially infectious rubbish) is an important and hygienically necessary public service. The distribution of needles is not so obviously worthwhile. It may not be worthwhile at all. The distribution of more syringes and needles into the community (and the world) may well be an act of social and environmental vandalism – increasing the prevalence and extent of injecting habits, while also subjecting more of the non-injecting public to accidental infection from contaminated needles.
The vast majority of the people on Earth do not inject themselves with drugs and
have no desire to do so. Most have never injected themselves and never will. This
is because injecting oneself with heroin, amphetamines and other psycho-active
drugs is, in addition to being very dangerous, very unnatural. Unlike drinking
alcohol, or even smoking herbs, which have been part of human culture for
millennia, injecting oneself with metal needles and using plastic syringes to self-administer heroin or amphetamines is an extremely unnatural and abnormal act that has become a “human habit” only in the past 100 years. The
terms “unnatural” and “abnormal” are here used literally, however their use
requires some clarification.
It is obviously ‘unnatural’ to inject oneself with anything – but then it is also
‘unnatural’ to drive a car, or watch television. Injecting drugs is ‘abnormal’ (not
normal) in that the majority of the population (and most of society within two
standard deviations on a bell curve) does not inject drugs – but then, diabetics who
inject themselves with insulin are also ‘abnormal’ in the same way (simply because most people are not insulin-dependent diabetics). In other words, diabetics who
inject themselves with insulin are doing something that is ‘unnatural’ and
‘abnormal’, but obviously not something that is bad, or harmful to their health
(indeed it can be life-saving). Moreover, if they inject themselves with human
insulin, as most do nowadays, the substance they are injecting is not “unnatural”
(although the act of self-injection remains so). Also, while diabetics who inject
themselves with insulin are “abnormal” in the sense that they differ in this
behaviour from the majority (‘normal’) population, and constitute a minority
among diabetics (most are non-insulin-dependent), their behaviour is “normal” for
insulin-dependent diabetics.
The insulin needs of people with diabetes mellitus have remained, for several decades, a popular analogy used by the psychiatric profession for the need of schizophrenics for “antipsychotics”, but psychiatrists’ use of insulin dates back to the early 1920s, immediately after the pancreatic hormone was discovered in 1922. The first insulin shocks and insulin comas were performed
in the USA and Germany, as Edward Shorter reveals in A History of Psychiatry
(1997):
“This story [that of coma and chemical shock ‘therapy’] begins in Berlin
with a young medical graduate of the Vienna University named Manfred
Sakel. Sakel was born in 1900 in Nadverna, Galicia (then part of Austria), of
a pious Jewish family…By the time of his graduation in 1925, so scorching had the blast of Austrian anti-Semitism become that he found a job in
Berlin as assistant physician at Kurt Mendel’s expensive suburban private
clinic, the Lichterfelde Sanatorium. The clinic courted the actresses and
physicians who typically were at risk of morphine addiction. Yet a cold-turkey withdrawal often entailed symptoms such as vomiting and diarrhoea.
In the late 1920s, Sakel discovered that such symptoms could be
successfully managed by administering small doses of insulin, a hormone just
discovered in 1922.
“Insulin had already been tried several times in psychiatry during the
1920s, as early as 1923, when the staff of the Psychopathic Hospital in Ann
Arbor, Michigan, gained the impression that insulin relieved the depression
of diabetic patients as well as their diabetes. In fact it did not. Later in the
1920s, insulin was used on patients who had lost their appetites or refused to
eat. But it occurred to no one that a coma from insulin shock might be
curative.
“Sakel was probably not aware of much of this previous writing in any
event. Yet he did have to cope with the insulin comas that happened
inadvertently to several of his own patients. He noted that after the comas were over the patients’ desire for morphine had been abolished. In addition,
these previously ‘restless and agitated’ patients had become ‘tranquil and
accessible.’ Sakel reported this finding in 1933. At this point, he was clearly
thinking that putting patients into an insulin coma might in and of itself be a
cure for major psychiatric illness.
“Evidently as a result of the Nazi takeover, Sakel returned to Vienna in
1933, getting a post at the university psychiatric clinic under Otto Poetzl,
Wagner-Jauregg’s successor as the professor of psychiatry. Sakel also
became chief physician of a private clinic in a Viennese suburb. Sakel
persuaded a reluctant Poetzl to try out the dangerous-sounding therapy, and
in October 1933, Sakel began testing systematically at the university clinic
his theory that insulin “shock” represented a cure for schizophrenia.” (p.209)
Insulin “shocks” do not cure schizophrenia, although, like other forms of torture,
insulin shocks can stop people talking about things that disturb others. Insulin, the hormone secreted by the islets of Langerhans in the pancreas, lowers the blood glucose level; if people are suddenly injected with large amounts of insulin, they suffer a sudden drop in blood sugar leading to loss of consciousness and, often, convulsions. This is the “shock” that Dr Sakel imagined was a “cure” for schizophrenia.
The example of Dr Sakel’s experiments with insulin shocks and comas reveals much about the psychiatric profession in Germany in the 1920s and 30s, but also about the ethics and assumptions of the psychiatric profession today. Psychiatry at the dawn of the twenty-first century is far more closely related to psychiatry of the
early twentieth century than most people realise. A hidden thread that runs through
the past one hundred years of the profession is that of eugenics – predominantly
negative eugenics and population control. In the case of psychiatry (and the
medical profession more generally), ‘population control’ includes both control of
population numbers and control of the population. ‘Population Control’ thus
includes control of global population (the number of people in the world) and
national population figures, while ‘control of the population’, a bigger concern for
psychiatrists, includes control of the behaviour and minds of the population and the
individuals who comprise the populace – including their sexual behaviour and their
social/political behaviour. Both facets of ‘population control’ involve the
application of coercion – and it is in coercive population control that human rights
abuses are to be most obviously expected.
The strategies of “population number controllers” include much more than the
promotion of condoms or the instilling of fear of sex (for risk of disease or as an affront to God). In the face of perceived failure to control the “population
explosion” by voluntary contraception, various methods have been used around the
world to restrict reproduction, by coercion and, at times, physical force. This
includes forced hysterectomy, tubal ligation and euphemistically named “provider-dependent contraceptives”, including “long-lasting injections”.
The danger of “provider-dependent contraceptives” is admitted in the 1992 book
Poverty and Development in the 1990s published by the conservative Oxford
University Press:
“These trends are particularly worrying for women: first of all because
most contraceptive methods are directed at women who have no part in
determining research priorities and standards. Second, some of the trends
have serious implications for women’s health. For example, women in
developing countries have been used frequently for testing contraceptives.
The women of Puerto Rico, Haiti, Guatemala and Chile were among the first
to take part in the tests of contraceptive pills and sterilization injections in
the 1960s and 70s. More recently, women in India and Bangladesh have
been used in the trials of hormonal contraceptives. In all cases, such
experiments were carried out with little or no information given to the
women themselves or, at times, to the local personnel involved in the trials”
(p.94)
The sterilizing injections referred to include “Depo-Provera injections” (which are still given, at times, to psychiatric patients and Aboriginal women in Australia to prevent them from “irresponsibly” conceiving) and beta-hCG injections (“anti-fertility vaccines”), which have been given without prior consent or knowledge to young women in several “third world” countries, including Bangladesh, Indonesia
and the Philippines.
These “anti-fertility vaccines” are, however, described in glowing terms in the
specialist textbook Immunology, published in 1996 by Mosby (Times-Mirror), and
edited by Professor Ivan Roitt of the University College Hospital, London:
“In principle, conception and implantation can be interrupted by
inducing immunity against a wide range of pregnancy hormones. The target
of the most successful experimental trials has been human chorionic
gonadotrophin (hCG), the embryo specific hormone responsible for
maintaining the corpus luteum. Vaccines based on the beta chain of hCG,
coupled to tetanus or diphtheria toxoid, have been extremely successful in
preventing conception in baboons and, more recently, humans. In the human
trial [where, and on whom is not mentioned], infertility was only temporary [untrue], and no serious side effects were observed [also untrue]. Clearly this
represents a powerful new means of safely limiting family size, though there
are of course cultural and ethical aspects to consider too.” (p.19.10)
As for the safety of vaccines generally, the book has few reservations, but in
glossing over the risks of, and many concerns about, long-term damage to the immune system (and other systems) from immunization, it admits to some “general risks” which are worth noting:
“Some of the more serious complications may stem from the vaccine or
from the patient. Vaccines may be contaminated with unwanted proteins
or toxins, or even live viruses. Supposedly killed vaccines may not have
been properly killed; attenuated vaccines may revert to the wild type.
The patient may be hypersensitive to minute amounts of contaminated
protein or immunocompromised, in which case any living vaccine is usually
contra-indicated.” (p.19.10) (emphasis added)
The last sentence in this textbook quotation calls the entire “third world immunization strategy” into question. It is common knowledge that many
children in poor countries suffer from malnutrition. It is also accepted that
malnutrition compromises the immune system, causing immunosuppression. It is
thus clearly a dangerous thing to inoculate such children with live viruses, even if
they are supposed to be “attenuated”. It would be even more suspect if these
children developed an epidemic of acquired immune deficiency and collapse of the immune system, and no serious attempts were made by vaccinators and epidemiologists to look for a connection between the two.
Live virus vaccines made from infected animal tissues or “cell lines” have been
injected into malnourished African children and babies since the 1920s (smallpox
vaccines in Rhodesia), and mass immunization programs were greatly expanded in the late 1950s, first with the (killed) Salk polio vaccine and then with the live Sabin oral vaccine. At this time, and for another two decades, “retroviruses” (‘slow viruses’) were not known to cause disease in humans, but they were found to cause disease (mainly immune damage and cancers) in several animal species. Various animal viruses had been noted, in the 1960s, as ‘harmless contaminants’ in human vaccines derived from animal tissue cultures – such as SV-40 (simian virus 40), which was found in Salk polio vaccines manufactured from poliovirus grown on monkey kidney tissue.
In an ongoing industry of animal virus experimentation, chimpanzees, monkeys,
dogs, cats, rabbits, goats, sheep, rats, chickens, pigeons and mice in universities around the world have been deliberately infected with an increasing variety of
viruses, including retroviruses (since the discovery of the Rous sarcoma virus in 1910). New hybrid viruses have been created, and animals have been specially
bred with susceptibility to diseases (particularly cancers) from these viruses, as
well as from other causes (such as Sprague-Dawley rats, which readily develop
cancers). This is the scientific environment in which the entire virology and
immunology industries have developed, and the focus of their efforts has been to
develop new drugs, find new uses for old drugs, and develop new vaccines.
Alleviating third world debt or poverty has not been on the agenda.
Did the HIV virus develop ‘accidentally’ as a result of unrestrained and careless
animal experimentation, or was it deliberately designed for the purposes of
biowarfare? Was it unintentionally spread through infected blood transfusions in
Africa, or was it a further effort at ‘population control’, following the apparent
failure of condom-promotion to prevent population increases in Africa in the
1970s, involving the deliberate inoculation of Africans with a killer-virus?
David Suzuki wrote, rather glibly, in 1990:
“In 1972, then-president Richard Nixon signed the Biological Weapons
Convention, under which all research on biological weapons was stopped
and all cultures were destroyed.” (Inventing the Future p.89)
Did Nixon and his administration reveal as much about American biowarfare as he
did about events in Watergate, Vietnam, the Middle East and Central America? Is it
possible to “stop all research on biological warfare” when it is all secret (and
therefore routinely denied) anyway? Can we expect honest answers from
governments and research institutions, when the history of biological warfare is
systematically denied and attributed to “natural disasters” and the susceptibility of
“naïve native populations” to admittedly introduced, but accidentally introduced,
disease?
As seen in the previous chapter, ‘modern’ political, medical and scientific concern
about overpopulation and the belief that increased population will inevitably lead
to mass starvation dates back to the influential essay by the Anglican priest and
economist Thomas Robert Malthus titled An Essay on the Principle of Population.
This essay, published in England in 1798, claimed that the greatest threat to
humanity was population growth. Alarmed by a rise in the population of Europe
from 66 million in 1700 to 180 million in the 1790s, Malthus argued that
“population, when unchecked, increases in a geometrical ratio” while subsistence increases “only in an arithmetical ratio”. Based on this dubious supposition, the economist-priest claimed that rates of population growth inevitably exceed rates of increase in
food production. Citing the example of the United States of America (to which Britain had recently lost ‘possession’ of eastern North America), Malthus argued that since the population of the USA had doubled in 25 years, it could keep on doubling every 25 years, while food production could only be expected to increase by a fixed amount every 25 years. The Anglican priest, applying his theories to Europe with a callousness that makes a mockery of his claim to be a man of God, argued that “dependent poverty ought to be held disgraceful” because it “diminishes the power and will to save”, and advocated “workhouses” for the poor and distressed.
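Malthus’s arithmetic is easy to reproduce. The following sketch (all starting values are illustrative assumptions, not Malthus’s own figures) contrasts a population that doubles every 25-year generation with a food supply that grows by a fixed increment per generation:

```python
# Illustrative sketch of Malthus's geometric-vs-arithmetic argument.
# The starting population, food supply and increment are arbitrary
# values chosen for illustration; Malthus gave no such concrete figures.

def malthus_projection(generations=8, population=1.0, food=10.0, food_increment=10.0):
    """Project population (doubling per 25-year generation) against
    food supply (fixed increment per generation)."""
    rows = []
    for g in range(generations + 1):
        rows.append((g * 25, population, food))
        population *= 2          # geometric growth: doubles each generation
        food += food_increment   # arithmetic growth: fixed increment
    return rows

for years, pop, food in malthus_projection():
    print(f"year {years:3d}: population {pop:7.1f}, food units {food:6.1f}")
# Population (1, 2, 4, ... 256) eventually overtakes food
# (10, 20, ... 90) - here by the seventh generation - whatever
# the initial surplus, which is the core of Malthus's claim.
```

Whether real populations and real food production behave this way is, as the text argues, another matter entirely.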
Malthus’ mathematical and biological assumptions were badly flawed, and his
economic theory, in which Social Darwinism and capitalist theory are deeply rooted, enabled economic slavery to be rationalised by the ‘upper classes’ at the same time that public opinion was making cargo slavery politically unsustainable. Although cargo slavery was politically unpalatable and morally reprehensible, the economies of the European empires and their colonies had been built on slavery and were deeply dependent on it. For this reason the abolition of slavery was resisted for
economic reasons (especially by land and slave owners in the colonies) long after
the moral issues had been clearly defined – equality, emancipation and freedom
were recognised to be good, while any form of slavery was recognised to be evil.
The word ‘evil’ was, in fact used frequently by opponents of slavery – the trade in
human lives was regarded as evil and the maintenance of states of slavery also evil.
Despite this recognition, slavery continued long after Thomas Jefferson wrestled
with his conscience over his personal ownership of slaves and economic
dependence on them after publicly acknowledging that slavery was an affront to
decency; at the same time that, across the Atlantic, Thomas Malthus was arguing
that the real problem was poor people having too many children. Slavery in the
United States of America continued long after Jefferson, by then President of the
USA, banned the importation of slaves to the USA, back in 1807. It continued long
after Abraham Lincoln’s famous Emancipation Proclamation of 1st January 1863.
Slavery in the British colonies and “Commonwealth” did not end when it was
officially banned by the British Parliament in 1833, and continued in various
British ‘colonies’, ‘dominions’ and ‘protectorates’, notably in Africa, but also in
the Pacific region and Asia – and has not ended yet. The evils of slavery did not
disappear from the face of the Earth, or the minds of would-be ‘masters’ and
‘owners’ when Allied troops released millions of slaves from ‘concentration
camps’, ‘death camps’ and ‘labour camps’ in German-controlled Europe in 1945 –
slavery, in the form of debt that exists as soon as one is born (part of an inevitable ‘national debt’), has become ubiquitous. This is compounded by the fact that few who are ‘economically’ enslaved regard their plight as slavery. The history of
modern slavery has much pertinence to the history of genocide – because, as in
Nazi Europe, slavery and genocide have often gone hand in hand.
After the Second World War, despite the loss of millions of young lives during the
1940s, the eugenicists (who were much less likely to identify themselves as such
for fear of association with Nazism) intensified their warnings about
“overpopulation”. In fact, because many of the lives lost during the Second World
War were those of young, white Anglo-Saxons, the eugenicists lamented that they
had lost some of their best, and thus feared all the more the dark ‘hordes’ they
imagined existed in the “backward parts of the world”.
They also feared the poor, whom they recognised were becoming increasingly
critical of the political, social and legal privileges of the rich. These fears can be
seen in the ravings of the nuclear physicist Sir Charles Galton Darwin, grandson of Charles Robert Darwin, delivered at the California Institute of Technology (Caltech). Sir Charles, who had been Director of the (British) National Physical Laboratory from 1938 to 1949 (during the Second World War and the Manhattan Project), gave a lecture to Caltech scientists in 1956 titled “Forecasting the Future”, which was later published in the 1960 Allen and Unwin book Frontiers of Science: a survey.
In it he argues for a “tremendous” solution to the problem of overpopulation, even
more “efficient” than war, and more “efficient” than Nature’s apparently “simple,
brutal, control” of population. His fears are based on Thomas Malthus’ argument
that food production can be predicted to always lag behind population growth (a
theory that has been manifestly shown to be false):
“I have noticed that most people, when for the first time they face the
population problem, at once think about the possibilities of producing more
food. They first think, perhaps, of the fields we all notice here and there that
are not being properly cultivated. Then they may think of improved breeds
of plants that will produce two or three crops a year instead of only one.
Then there is the possibility of cultivating the ocean. And there is the
Chlorella, an alga which might be grown on a sort of moving belt in a
factory; it can produce proteins perhaps ten times more efficiently than the
garden vegetables do, but unfortunately at a hundred times the cost. Finally,
with the rapid progress of our knowledge of chemistry, it is not to be
excluded that one day the foodstuffs necessary for life will be synthesized in
factories from their original elements, carbon, nitrogen, phosphorus and so on
[so much for the culinary arts!].
“All these things are possible, and I do not doubt that some of them will
be done, but to accomplish them is no help, because of the central point made by Malthus, that there has to be a balance between food production
and population numbers. Until population numbers are controlled, it will
always continue to be true that, no matter what food is produced there will
be too many mouths asking for it. New discoveries in the way of food
production may make it possible for many more people to keep alive, but
what is the advantage of having twenty billion hungry people instead of only
three billion?
“In the light of these considerations it seems to me that the food
problem can be left to look after itself and that all attention must be given to
the other side of the balance. Can anything be done about it? Frankly, though
perhaps for a short term something might be done, in the long run I doubt it.
My reason is this. Nature’s control of animal populations is a simple, brutal
one. In order to survive, every animal produces too many for the next
generation, and the excess is killed off in one way or another. It is a method
of control of tremendous efficiency, and during most of his history it has
also applied to man. To replace a mechanism of this tremendous efficiency it
is no use thinking of anything small; the alternative we must offer, if we
want to beat nature, must also be tremendous.” (p.108-9) (his emphasis)
What sort of “tremendous” solution did Sir Charles have in mind? He mentions the
traditional negative eugenic strategies promoted by Leonard Darwin and Francis Galton, such as “some world-wide creed prohibiting large families”, but argues that
“there seems little hope for any universal creed which would permanently limit
population in this way”. He makes the sinister claim that “nature’s method of
limiting population is so brutally tremendous [the word ‘tremendous’ is repeated
several times] that it can never be replaced by any triviality such as the extension
of methods of birth control”(p.109).
Even more alarming is his argument as to why “war is not nearly murderous
enough to have any effect”:
“The first thing we may think of which might reduce the numbers is war, but
most war is not nearly murderous enough to have any effect. Thus we should
count as a really bad war one in which five million people were killed, but this
would only set back the population increase for less than three months, and that
hardly seems to matter. I doubt if even an atomic war would have any serious
influence on the estimate, unless it led to such appalling destruction of both the
contestants that the economy of the whole world was entirely ruined and that
barbarism and starvation would ensue. There is perhaps some hope that man
will be wise enough not to embark on such a war, but anyhow I shall refuse to
consider it in my forecast.” (p.111)
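Sir Charles’s grim arithmetic can be checked on the back of an envelope. With a mid-1950s world population of roughly 2.5 billion growing at roughly 1.7 per cent a year (both figures are rounded estimates assumed here for illustration), the annual increase exceeds 40 million, so five million deaths would indeed be offset by well under three months of growth:

```python
# Back-of-envelope check of C. G. Darwin's claim that a war killing
# five million would set population growth back "less than three months".
# The population and growth-rate figures are rounded 1950s estimates,
# assumed here purely for illustration.

population = 2.5e9        # approximate world population, mid-1950s
growth_rate = 0.017       # roughly 1.7% annual growth
deaths = 5e6              # his "really bad war"

annual_increase = population * growth_rate
setback_days = deaths / annual_increase * 365

print(f"annual increase: {annual_increase / 1e6:.1f} million")
print(f"setback from 5 million deaths: {setback_days:.0f} days")
# About 43 days - well under three months, consistent with his claim.
```

The arithmetic is sound on its own terms; it is the conclusion drawn from it that is chilling.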

These are not reassuring words to hear from the retired Director of the British National Physical Laboratory, given that Sir Charles Darwin held this position during the Manhattan Project, when the Allies were developing the atom bomb.
Shortly before Sir Charles expressed his opinion that he doubted if “even an atomic
war” would be “tremendous” enough to control “population numbers”, a series of
atom bombs were dropped in Australia, killing an unknown number of Aboriginal
people who lived in the region of Maralinga, in South Australia. The people who
died were not at war with the British and Australian authorities who decided that
Maralinga was a suitable site for the British nuclear “testing program”. The site, in
rural South Australia, was far away from the white population in the coastal cities
of Melbourne and Sydney in the South East of the continent, and far from
Canberra, the seat of Government, where the decision was made. It was even
further from England. The French decided, with similar motives, to use “their”
Pacific Island “territories”, in reality beautiful coral reef islands and atolls, to test
their bombs, as did the U.S. Government (along with other “testing sites”,
including some in America). In Australia, however, the bombing was more than an
act of environmental vandalism; it was an act of genocide, given the government’s
then overt aim of ‘assimilation’ and ‘breeding out the black’ in Indigenous
populations to create a “White Nation”.
The Masonic Australian Prime Minister at the time, Robert Menzies, who wore his
love for the British Queen on his sleeve, along with his adulation of the “Mother
Country”, pleaded with the British Government to test their atom bomb in
Australia partly because he wanted to please his heroes. He also wanted to attract
“business” to Australia, and encourage Australia’s mining and military industries
(including Australia’s developing nuclear industry). At this time the White
Australia policy was in full force and Aboriginal people were seen, by the
government, as a “problem”. Previous claims that Aborigines were a “dying race”
were being increasingly questioned, and the terra nullius (empty land) claim by the
initial British colonists had long been recognised as a convenient lie. Although
there has never been an officially declared genocidal policy against the indigenous
people of Australia, the dropping of the Maralinga atom bombs and other nuclear
“tests” in parts of the Australian mainland that were known to be populated by
unknown numbers of men, women and children are a testament to how callously
the Aboriginal people in Australia have been treated even after the Second World
War (during which several Aboriginal men fought and died for the Allies). This
will be detailed in subsequent chapters.
It is obvious, when one considers it, that estimates of global population have always been approximate figures. It is also a matter of historical fact that
governments have consistently manipulated population statistics for political
reasons. This sometimes results in exaggerated or reduced figures for particular
groups. When convenient, population numbers have been minimised or whole
sections of the population have been disregarded. In the case of Australia,
Aboriginal people were not included in the census until relatively recently (1960s).
During the policy of so-called “assimilation” only “full-blooded Aborigines” were
counted as “Aboriginal” and only those in “missions” and “reserves”. The rest of
the “outback” was regarded as “barely populated” – thus dropping of atom bombs
in these areas was regarded as likely to cause ‘few’ deaths. The reason that the
Maralinga atomic tests were genocidal is not because of the number of Aborigines
killed, but because the site was chosen to spare “whites” from the known risks of
radiation. Furthermore, this bombing is only one of countless examples of
indigenous people being subjected to known toxicity because the politicians whose
responsibility it was to protect them failed to do so. This serious matter will be
discussed further in subsequent chapters, not only in reference to indigenous
members of society, but to society at large.
Sir Charles Galton Darwin, in his attempt to forecast the future, assumed as
roughly correct the estimates of “expert demographers” that in the year 2000 the
world’s human population would be about 4 billion, increasing to 6 billion in 2050.
He regarded these estimates as “cautious ones”, an opinion in which he appears to
have been correct, if the figures of modern “expert demographers” can be relied
upon. They claim that the world’s human population is already 6 billion, and they
have made predictions of continued population increases in the future. When Sir
Charles made his predictions, he based his conclusion that human population
growth is catastrophic on the assumption that the world’s population in 1958 was
2.5 billion. This was based, as are most global population statistics today, on the
calculations and pronouncements of the United Nations Organization and its
agencies (such as the World Health Organization and World Bank). The United
Nations is widely regarded as a reliable source of global population figures. The
statistical analyses and predictions of the UN’s many agencies are quoted widely
and acted upon by governments and health policy planners around the world.
However, in many countries nobody has counted how many people are actually
living there – indeed a complete count is impossible, since there still remain people
in the world, in the Amazon, for example, who have not come into contact with
“civilization” at all. Other nations are ravaged by war, starvation and disease; in
these, taking a “reliable census” is understandably not a high priority.
It is just as true today as it was fifty years ago that “the world provides enough for
every man’s need but not every man’s greed”, as Mohandas (Mahatma) Gandhi
observed, ten years before Sir Charles Galton Darwin warned his academic
colleagues that in fifty years the world would be struggling to feed “a hungry four
billion” – unless they could come up with a “tremendous solution” – more
“tremendous” than the “simple brutality” of Nature and more murderous than war.
Even a nuclear war, he claimed, would not be adequate to “control population
numbers”. Could he have been suggesting the use of biological or chemical
weapons? Could he have been that insane? Regardless, the theories of three
generations of the Darwin family, Charles Robert Darwin, Major Leonard Darwin
and Sir Charles Galton Darwin provide sound refutation of Francis Galton’s
argument that “genius” is inherited. Despite obvious nepotism, or perhaps because
of it, the brilliance of Charles Darwin is not seen in the racist and elitist “eugenics”
program of his son Leonard, or the call for a “tremendous”, “brutal” solution to the
threat of overpopulation by his grandson, Sir Charles.

Chapter 6
THE EFFECT OF AIDS ON U.N. POPULATION
ESTIMATES
Although population growth rates in African nations have long been described as
“among the highest in the world”, actual population numbers in the huge continent
of Africa are very small compared with those of the Asian, European and
American continents, and the population density in even the most crowded African
nations is a fraction of that in the most crowded Asian and European countries. Back in
1984, when the AIDS epidemic began in Africa, Nigeria, generally regarded as the
most overpopulated of the African countries at the time, had a population density of
around 96 per square kilometre. Kenya had a population density of 29 per square
kilometre, Ethiopia 28, Zaire 12.7, and Sudan only 7.7. This contrasted with much
higher densities in European nations: Switzerland 153 people per square kilometre,
Denmark 120, Italy 192, Germany (East and West combined) 219, and Britain 227.
Asian countries, also accused, along with Africa, of being especially responsible
for global overpopulation, included Indonesia with a population density of 73
people per square kilometre, China with 105, Pakistan with 106, the Philippines with
169, India with 212, and Sri Lanka with 234. Due to its small size and heavy
urbanization, Japan had a high population density of 322 per square kilometre (a
population of 120 million in 372,000 square kilometres). At the time there was
also talk of a growing ‘population problem’ in the Middle East – mainly blamed on
countries with mainly Moslem populations. Iraq, in 1984, had a population density
of 32 per square kilometre, Iran had 24 and Saudi Arabia 4.4. Israel’s population
density was, in contrast, 197 people per square kilometre. The huge USSR had a
population density of only 12, while the population density in Australia was only
1.9 people per square kilometre.
These figures have been calculated by simply taking the total population estimate
as given in a 1983 atlas and dividing it by the surface area of the nation. By such
calculation, Taiwan had a population density of 475 in 1983, and Singapore an
incredible 4262 people per square kilometre. The population density of Hong Kong
was even higher: 4888 /square km. Yet few people were starving in Hong Kong or
Singapore in 1983 – and many were in Africa.
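The simple atlas calculation described above – a nation's total population estimate divided by its surface area – can be sketched as follows. The Japan figures are the ones quoted in the text; the function name is illustrative, not from any source:

```python
# Population density as described in the text: total population estimate
# divided by surface area, giving people per square kilometre.
# The figures below are the 1983/84 atlas values quoted above.

def population_density(population, area_km2):
    """Return people per square kilometre, rounded to one decimal place."""
    return round(population / area_km2, 1)

# Japan: 120 million people in 372,000 square kilometres
print(population_density(120_000_000, 372_000))  # → 322.6
```

The same one-line division reproduces the other densities given in the text (within rounding), since they were all derived by this method from a single 1983 atlas.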
If we look at total population numbers, China has long been the most populous
nation in the world, due to its large size and relatively high population density.
China is now said to have about 1.3 billion residents, and the Chinese Government
has taken harsh measures over the past 3 decades to curb population growth. The
Indian Government, with about a billion people to govern, has also engaged in
aggressive measures to limit the size of the population, including coercive
sterilization, and sterilization without knowledge or consent (a practice that
became common during the rule of Indira Gandhi in the 1970s and 80s). Sri Lanka
and the Philippines were also regarded, in the 1970s, as potential disaster areas
regarding population. I remember a massive condom-promotion campaign in 1973
when I was at high school in Kandy, a hill city in central Sri Lanka. Condoms were
suddenly available at every corner shop. The shops advertised them with little
purple and white signs that caught one’s notice when walking down any city street.
All the signs were the same – the condom company must have had a monopoly,
and a massive advertising budget.
The 1998 “Revision of the World Population Estimates and Projections” of the
United Nations’ “Population Division” reveals the extent to which the AIDS
epidemic has altered long-term estimates of population numbers in Africa:
“The 1998 Revision shows a devastating toll from AIDS with respect to
mortality and population loss. In the 29 hardest-hit African countries that are
studied, life expectancy at birth is currently estimated at 47 years, 7 years
less than what could have been expected in the absence of AIDS (Figure 1).
The demographic impact of AIDS is even more dramatic when one focuses
only on the hardest hit countries. For example, the average life expectancy at
birth in the 9 countries with an adult HIV prevalence of 19 per cent or more
is projected to reach 48 years in 1995-2000 whereas it would have reached
58 years in the absence of AIDS, a loss of 10 years. This group includes
Botswana, Kenya, Malawi, Mozambique, Namibia, Rwanda, South Africa,
Zambia and Zimbabwe [Zaire, previously known as Belgian Congo, where
the epidemic was once reported to be the worst, is not mentioned on this
list]. Yet, the demographic impact of HIV/AIDS is expected to intensify in
the future. By 2010-2015, the average life expectancy at birth in these
countries could be only 47 years (Figure 2). In the absence of AIDS, it
would have been expected to reach 63 years. This represents 16 years lost to
AIDS. However, in none of the 34 countries, is the population expected to
decline because of the AIDS epidemic.”
This information, obtained from the United Nations website, compares the
“projections” of life expectancy in the “hardest hit” African nations up to the year
2015 after adjusting these figures in view of the AIDS epidemic and incidence of
2015 after adjusting these figures in view of the AIDS epidemic and incidence of
HIV-antibodies in sampled populations (most data has been obtained from
nominated high risk groups such as prostitutes, from antenatal testing of pregnant
women or from entry testing into the armed forces). The UN population division
has compared these projections with “projections without AIDS”. The comparison
is based on extrapolations from measurements showing improving life expectancy
in these countries from 1950-1955. Figure 1, referred to in the UN text above,
compares the life expectancy at birth in 29 African countries “with and without
AIDS”. The projections are presented as a bar graph, with columns for 1985-90,
1990-95, 1995-2000, 2000-05, 2005-10, and 2010-15. According to the graph, the
average life expectancy in these countries in 1985-90 was 49.2 years, whereas it
would have been 50.2 without AIDS. The epidemic began in Central Africa in
1984. The life expectancy had dropped, during 1990-95, to 48.5 years while the
“projection of life expectancy without AIDS” is given at 51.7. The latter projection
assumes that the improved life expectancy over the previous 15 years would have
continued at roughly the same rate over the subsequent 5 years, thus improving
from 50.2 to 51.7. (In contrast, the life expectancy of “whites” in South Africa was,
at the time, over 70 years – similar to the average life expectancy in Australia,
Japan and Western European nations). According to the UN projections, the life
expectancy of Africans in the nominated 29 “hardest hit” countries, will have
dropped to 47.5 in 1995-2000, and further to 47.4 in 2000-2005. This is contrasted
with projections of 54.1 and 56.4 “without AIDS” by 2000 and 2005 respectively.
In reality, the life expectancy in several African nations has, since these figures
were released (in 1998) apparently dropped to below 40 years, and even below 30
in some countries (such as Malawi).
The 1998 UN life expectancy figures for these African countries are unrealistically
optimistic given the fact that there is no known cure for AIDS or vaccine to prevent
it. After dropping to a low of 47.4 years in 2000-2005, the projection is for an
improvement to 49.4 years by 2010 and 52.6 years by 2015 (in comparison to 58.4
and 60.4 respectively without AIDS). It is these long-term predictions and
projections that are particularly dubious – comparisons are made between two
mathematical constructs, and neither of them is reliable. The first is the “projection
of life expectancy reduction” if the AIDS epidemic continues (as it has been
predicted to do). The second is the “projection of life expectancy improvement”
were it not for AIDS.
How falsely optimistic these projections are can be seen in Figure 2 of the UN’s
1998 Revision of the World Population Estimates and Projections: a line graph
comparing “Life Expectancy at Birth in the Seven Hardest Hit Countries” from
1950 to the present, and showing projections to the year 2015. These countries are
Zimbabwe, Zambia, South Africa, Kenya, Namibia, Botswana and Malawi, all in
southern and eastern Africa. Each country shows a steady increase in life
expectancy (depicted as a rising gradient) from 1950 to 1985, followed by a
precipitous fall 1985 and 2000. In the case of Malawi, life expectancy apparently
increased from 36 years in 1950-55 to 45 years in 1985-90, dropping to 42 years in
1990-95 and was predicted to fall to 39 years in 1995-2000. This, according to the
UN prediction is as low as the life expectancy would fall in Malawi – the
projection is for improvements to 41 years in 2000-2005, 44 years in 2005-10 and
48 years in 2010-15. Recent reports of life expectancy in Malawi have been in the
region of 27 years, contrary to these 1998 projections.
The figures for Botswana (previously Bechuanaland, a mineral and gold-rich
British ‘Protectorate’ administered via the white regimes in adjacent Rhodesia and
South Africa), show an increase in life expectancy from 42 years in 1950, rising to
63 in 1985-90, followed by a fall to 61 years over the next 5 years (the beginning
of the HIV epidemic in Botswana) and a horrifying 47 years in the following 5
years. This devastating decrease in life expectancy of 14 years is predicted to
worsen further to 41 years in 2000-2005. The accompanying text, from the UN
“population division” website, claims that:
“In Botswana, the hardest hit country, one of every 4 adults is infected
by HIV. Life expectancy at birth is expected to drop from 61 years in
1990-1995 to 47 years in 1995-2000. In the absence of HIV/AIDS, it would have
been expected to reach 65 years in 1990-1995 and 67 years in 1995-2000.
Due to the impact of AIDS, life expectancy is projected to fall further to 41
years by 2000-2005.”
This has obviously had a dramatic effect in decreasing the rate of population
growth, since most of those who have died of AIDS are children, women of
childbearing age and young men. The 1998 UN report says:
“Mainly due to the mortality impact, population growth in Botswana has
been significantly reduced. The average annual population growth rate of 3.5
per cent per year in 1980-85 has fallen to 2.9 per cent in 1990-1995 and will
likely further fall to 1.9 per cent in 1995-2000 and 1.2 per cent in 2000-2005
(figure 3). In the absence of AIDS, Botswana’s population would have
experienced growth above 2.5 throughout the 1990-2005 period. Because of
the mortality impact of AIDS, Botswana’s population by 2015 is expected to
be 20 per cent smaller than it would have been in the absence of AIDS.
Nevertheless, because of high fertility, Botswana’s population is still
expected to nearly double between 1995 and 2050.”
The eugenists of the early 20th century and population experts of the 1960s were
aiming at Zero Population Growth (ZPG) for the world’s population as a whole.
Regarding the world as “overpopulated” even when there were only thought to be
2 billion people in the world, the more aggressive of the eugenists were arguing
that any increase above 2 billion would spell disaster – famine, warfare and death
from infectious plagues. The eugenic focus was on changing the racial mixture of
the human population that survived what they saw as an inevitable struggle for
food, living space and natural resources. The ‘white’ eugenists wanted their own
descendants and “race” to remain the dominant group in any future society. They
did not really care what happened to the other “races” – these had been
dehumanised and reduced to “non-whites”. Whether they were sterilized, enslaved
or genocided was of little concern to most of these eugenists, as long as population
growth among the “uncivilised natives” and “irresponsible” lower classes was
controlled. The more callous amongst such eugenists could have seen the United
Nations projections of “population growth” and “Life Expectancy at Birth in 29
African Countries With and Without AIDS” as “tremendous” but not “tremendous”
enough, as Sir Charles Darwin put it in his 1959 lecture at the California Institute
of Technology – a reduction of population growth rate from 3.5 per cent per year to
1.2 per cent is closer to, but not yet at, the aimed-for Zero Population Growth rate.
Could such a mentality exist in the modern world?
The 1998 Revision of the World Population Estimates and Projections of the UN
claims that despite the “devastating mortality toll from HIV/AIDS” the population
of Botswana is still expected to double between 1995 and 2050. This prediction is
based on the assumption that reduced life expectancy from AIDS will reach a low
point around 2000 and then steadily improve – to 41 years in 2005, 44 years in
2010 and 48 years at 2015. Worsening figures regarding HIV infection and the
failure of current strategies to curtail the epidemic in the “Third World” bring this
prediction into considerable doubt.
If one looks at the seven “hardest hit countries” in Africa on a map, it is evident
that the AIDS epidemic, once localised to central Africa, is now a massive public
health problem throughout southern and eastern Africa also. The seven countries
focused on by the UN population division include 5 in southern Africa (South
Africa, Zimbabwe, Zambia, Namibia and Botswana) and 2 in east Africa (Kenya
and Malawi). In the 1980s the epidemic was reported to be the most serious in
Zaire, Rwanda, Burundi, Tanzania and Uganda. Later Mozambique, on the east
coast of Africa, was identified as another focus of the epidemic. Mozambique
remains among the 34 “hardest hit” nations according to the UN:
“Among those countries, 29 are in Sub-Saharan Africa (Benin, Botswana,
Burundi, Burkina Faso, Cameroon, Central African Republic, Chad,
Congo, Cote d’Ivoire, Democratic Republic of Congo, Eritrea, Gabon,
Guinea Bissau, Kenya, Lesotho, Liberia, Malawi, Mozambique, Namibia,
Nigeria, Rwanda, Sierra Leone, South Africa, Togo, Uganda, United
Republic of Tanzania, Zambia, Zimbabwe), 3 in Asia (Cambodia, India and
Thailand), and 2 in Latin America and the Caribbean (Brazil and Haiti). Of
the 30 million persons in the world currently infected with HIV (UNAIDS,
1997), 26 million (85 per cent) reside in these 34 countries. In addition, 91
per cent of all AIDS deaths in the world have occurred in these 34
countries.”
When one looks at the colonial history of these nations some interesting patterns
become evident. All of these countries have been ruled by European governments
over the past one hundred years, with the exceptions of Thailand in south-east
Asia, Haiti (occupied by the USA in the early twentieth century) and Brazil (once a
Portuguese colony, and the previous destination of millions of African slaves).
Looking more closely at the nations in sub-Saharan Africa, all were colonised and
ruled by British, French, Portuguese, Dutch or Belgian “white” elites prior to being
granted “independence”, sometime between 1960 and 1980, and all were previously
the sites of slavery, slave-raiding and other atrocities at the hands of European “protectors”.
The areas of central and southern Africa that have been the worst sites of the AIDS
epidemic coincide with the “gold belt” and “copper belt”, and this part of the
continent is also known to contain rich diamond and uranium deposits.
It is a matter of historical fact that Kenya, Rhodesia and South Africa were
strongholds of British Imperialism and white supremacy in Africa. It is also a
matter of historical fact that slavery of “blacks” by “whites” was instituted in other
“white” colonies – notably in Kenya, the Belgian Congo (now Zaire) and the
various French and German colonies. Slaves were also obtained from Portuguese
colonies including Mozambique and Angola, British West African colonies
including Ghana (Gold Coast) and Nigeria and forced to work in various other
parts of Africa, in Europe, or elsewhere. In The Coburgs of Belgium (1968), Theo
Aronson wrote of King Leopold’s efforts to deny the Belgian atrocities in the
Congo, which will be described shortly:
“No one believed him. A less notorious personality might have got away
with it, but with the sort of reputation he enjoyed, the charges stuck all too
easily. The world already knew that he was depraved, mercenary and
hard-hearted; that he had betrayed his wife and ill-treated his daughters; that he
was moving heaven and earth to rob these daughters of their inheritance. If
he was capable of such behavior toward his own flesh and blood, could there
be any doubt that he was guilty of atrocities toward unknown savages?”
(p.148)
In the opinion of Vernon Mallinson (a British Professor of Comparative Education)
in Belgium (1970), Leopold was, though, “behaving only as other industrial and
capitalist bosses were behaving in Africa”. Allowing for Mallinson’s conscious
effort to minimise the Belgian monarch’s misdeeds, (in the light of this observation
being made by a British academic who previously worked for the British secret
service), this admission does not say much for the behaviour of British colonists
and “capitalist bosses” in Africa. Certainly, it would not have been out of character
for Leopold II to have unleashed a killer plague in the Congo if he could make
money out of it – what about his followers, and other colonial (and neo-colonial)
“masters”?
One important fact about the African AIDS epidemic that raises significant
questions about its origin and mode of introduction is the fact that, from the outset,
it has affected males and females roughly equally – affecting heterosexual
populations rather than male homosexual populations, as was the case in so-called
‘Pattern 1 countries’ (the USA, Europe and Australia). No sensible scientist has
been able to blame either homosexuality or intravenous drug use for the spread of
AIDS in Africa. Instead, various insinuations and claims have been made about the
sexual behaviour of black Africans, as had previously been made about Haitians,
when the epidemic was first judged, by the American AIDS establishment, to have
arrived there because of ‘voodoo’ practices in the Caribbean island.

Chapter 7
IMPERIAL DESIGNS IN AFRICA
At the time of the first European voyages of discovery one substance in particular
was valued above all others by the ship’s captains and the kings and queens that
sent them in search of new lands and treasures – the soft metal, gold. All other
substances were valued relative to gold, and the European monarchs who financed
the voyages were obsessed by it, as were their captains and commanders. Even the
Churches had a more than healthy fascination for gold, and the missionaries they
sent in the wake of ‘navigators’ and ‘explorers’ frequently acted as “middlemen”
and “ambassadors” for those whose primary interest was obtaining gold for as little
as possible from the “natives”.
Although the exotic spices of the East were, in later times “worth their weight in
gold”, when the first Spanish and Portuguese “explorers” set off in their galleons
from Iberian ports in the late 1400s, they were in search of gold. When Christopher
Columbus crossed the Atlantic and “discovered” the ancient civilizations of
Central America he found that they had enormous amounts of gold – gold
ornaments, jewellery, statues, and even gold toys for children. The amount of gold
they found literally sent the Spanish Conquistadores mad. They developed an
irrational state of mind that resulted in heinous crimes being committed in an effort
to obtain more gold – what was later termed “gold fever” (although this term was
selectively applied to the slaves rather than the masters who actually plundered
most of the gold, and ordered the most cruel atrocities). In the 1600s, when the
French, Dutch and British sought to claim empires of their own, they, too, were
obsessed by gold, and were prepared to steal, lie, betray, and murder in order to get
their hands on it. Wherever gold and diamonds were found, there followed
European territorial claims.
In the late 19th century, when Francis Galton and Leonard Darwin were espousing
the theory that “negroes” have a “slavish instinct”, both the British and Germans
had plans to ultimately subjugate and exploit the whole of the African continent (as
did the Portuguese three centuries earlier). In the late 19th century and early 20th
century, Lord Cecil Rhodes, a British aristocrat and mining magnate, who had been
given virtually free rein to do as he pleased in “his” diamond empire by the
British government, expanded the British mining operations in Southern Africa in
the name of the British South Africa Company – claiming as “Rhodesia” a vast
area of already inhabited land to the north of British-controlled ‘white’ South
Africa (present day Zimbabwe and Zambia). Rhodes made public his intention to
create a colony that stretched from the Cape (Cape Colony in South Africa) to
Cairo (the capital city of Egypt).
Cecil Rhodes (1853-1902), the son of an Anglican priest, has affected global
politics and economics to an extraordinary degree since the 1880s, when, after
obtaining a Law Degree from Oxford and making various social, financial and
political contacts in England, he returned to Durban in South Africa to make
money from the diamond mining industry. By 1887 Rhodes had control of the De
Beers Mining Company, which was rivalled in size only by the Kimberley Central
Mining Company, owned by Barney Barnato, the son of an impoverished Jewish
tailor from the East End of London. Barnato was also born in 1853, the same year
as Rhodes, and had arrived in South Africa at the age of 18. He too had followed
his older brother (Harry) to seek his fortune in diamonds, and had considerable
success in doing so by the time Rhodes bought out control of De Beers. Barnato’s
real surname was ‘Isaacs’, but he changed it to ‘Barnato’ in a time of growing
anti-Semitism in Britain. Rhodes, on the other hand, had a name he was proud of, and
after which he named his whole Empire – “Rhodesia”. He acquired an effective
monopoly in “legal” trade in Southern African diamonds when he bought out
Barney Barnato’s Kimberley Central Mining Company.
The diamond-promotion book Diamonds in Australia: The Fields and the
Prospectors (1980) by Leo Chapman gives details of the financial strategy of
Cecil Rhodes and Nathan Rothschild, of the Rothschild banking house, that
ultimately led to the premature death, apparently by suicide, of Barnato, and the
creation of an exclusively British, Jewish and American controlled mining industry
in Southern Africa:
“At the outset of the amalgamation battle of 1887-88 Barnato was the
more favourably placed, as Kimberley Central’s claims were more valuable
than De Beers’. Barnato was also richer than Rhodes. But Rhodes out-witted
him. After Barnato’s Kimberley Central, the biggest miner in the Kimberley
pipe was the French Company, for which Rhodes decided to bid. Big money
was needed. The controllers of the French Company were not just casual
shareholders; only a high price would tempt them to part with their valuable
holdings [which were being dug out of the ground by black slaves, long after
the official abolition of slavery].
“Gardner Williams, an American mining engineer and the general
manager of De Beers, was on friendly terms with E.G de Crano, the mining
adviser to N.M.Rothschild and Sons, the London merchant bank which had
financed the British Government’s acquisition of the Suez Canal shares
[from the French]. Rhodes, who had some imperial schemes of his own
which later culminated in the formation of Rhodesia, had Williams ask de
Crano to put him in touch with Rothschilds. Rothschilds, who had become
interested in the diamond mining industry on de Crano’s advice, agreed to
finance the bid by De Beers for the French Company. Rothschilds smoothed
the way with the French shareholders and headed a syndicate which
advanced 1.4 million pounds to buy the French-held shares. They also took
up a new issue of shares in De Beers…With Rothschild’s backing, Rhodes
launched De Beers’…takeover bid for the controlling shares of the French
Company. Barnato, understandably furious at this attempt to poach on what
he considered his territory, began canvassing support for a [bigger]
counterbid.”(p.44)
Rhodes then engaged in a combination of economic and psychological warfare
against Barnato. He first “proposed to Barnato that he should not go ahead with his
higher bid, but take the shares to be acquired by De Beers in the French Company
in return for giving De Beers a 20 per cent holding in Kimberley Central”
(Chapman, 1980, p.44). This was Barnato’s big mistake. Rhodes then dramatically
increased production from the De Beers mining operation, flooding the market
with diamonds [which he had earlier been preventing, in order to keep prices high],
and forcing Barnato’s shareholders into selling their shares – which Rhodes bought
up, eventually holding 10,600 out of 17,000 shares in Kimberley Central. With
Cecil Rhodes owning so much of “his” company, Barnato could see no choice
other than to amalgamate Kimberley Central Mining Company with De Beers, and
“De Beers Consolidated Mines was formed”, of which Barnato owned 33% of the
shares and was one of four Life Governors. Rhodes, as usual, arranged the terms
in his own favour. Chapman writes:
“In order not to lose all influence in the amalgamated company, Barnato
stipulated, as part of the terms, that Life Governors should be appointed – he
could not then be voted off the board of directors. But Rhodes saw to it that
it was to his own advantage. The Life Governors were to receive one quarter
of the net profits remaining after the payment of dividends up to 36 per cent
(at first 30 per cent). The Life Governors were Rhodes, Barnato, Beit and
Philipson Stow.” (p.47)
These events occurred in the early 1890s, around the time eugenics was gaining
support in the USA, Australia and Europe. All the Life Governors of “De Beers
Consolidated Mines” regarded themselves as “white”, including Barney Barnato
(who was an Anglicised Jew). All the people who actually did the mining –
crawling on their hands and knees collecting gem diamonds on the beaches,
sweating and suffocating in deeper and deeper mine shafts, aching with the toil of
hacking at hard rock with pickaxes, carrying buckets of valuable ore under the
control of whips – were “black”. Many were children, many were hungry, and all
were desperately poor. They were deliberately subjected to these oppressive
conditions. In the view of Francis Galton and his followers the “Negro race” was
“naturally designed” for slavery.
Cecil Rhodes became one of the richest men in the world, and by the end of the
19th century De Beers controlled 90 per cent of the world’s diamond production
(Chapman, 1980, p.50). Rhodes was appointed Prime Minister of Cape Colony (the
largest province in South Africa) in July 1890 and his friend Dr Leander Starr
Jameson was made administrator of all the British South Africa Company’s
territories in June 1891.
An indication of the imperial arrogance of the British Empire at the time was
evident in the fact that in July 1890, the same month that Rhodes was made prime
minister of Cape Colony, the borders of “German” Tanganyika were redrawn to
include the tallest mountain in Africa, the magnificent Mount Kilimanjaro, to
satisfy the request by the aging British “empress”, Queen Victoria, that her
grandson William II, who had become Kaiser of Germany in 1888
(succeeding his father, Frederick III), be given a mountain, “because he had
no mountains in Africa” (Burne, 1991, p.1005). At the same time the Germans
were “given” all the interior to the south of Zanzibar in East Africa (German East
Africa, later called Tanganyika), while Britain gained the slaving island of
Zanzibar on the East Coast (adjacent to Tanganyika) and the East African interior
north of it as far as the Congo (British East Africa, now Kenya).
Queen Victoria’s nephew, King Leopold II of Belgium, had already claimed the
whole of the central African inland south of the Congo River, a massive 920,000
square miles of land, as his own personal possession, having abandoned the
pretense of creating a haven free from the ravages of slavery.
In 1876 Leopold had invited a collection of explorers, geographers and
‘philanthropists’ to his Royal Palace in Brussels and explained how he wanted to
“open up the Dark Continent” to Christian civilization, and that rather than having
egotistical motives or a desire for territorial conquest, his dream was to abolish the
slave trade and set up research stations in the heart of Africa. This was the
notorious Leopold’s initial claim, when he announced the creation of the “Congo
Free State” after sending Henry Morton Stanley back up the Congo River to set up
“research stations” and get some treaties signed by the “savages”. To Stanley, all
the Africans he had encountered in his historic “exploration” of the interior of the
continent, when he travelled there in search of the British missionary-explorer
David Livingstone, were “savages”. “Research stations” were indistinguishable
from “slaving stations”, except that they also collected information and body parts.
The Welsh-born American ‘explorer’ Henry Morton Stanley, previously a reporter
for the New York Herald, was employed by the Belgian monarch after Leopold had
recognised that here, at last, was an opportunity for the creation of a Belgian
Empire to rival those of the larger European nations. This imperialist motive was
revealed after the pretense of the ‘Association Internationale du Congo’ (AIC) was
abandoned and Leopold announced to the world, on the 1st of August, 1885, that
“His Majesty, in accord with the International Association [his own company], has
taken the title of Sovereign of the Independent State of the Congo”. In doing so he
had acquired a country the size of Western Europe [on the basis of bogus
‘treaties’], and none of the other European powers tried to stop him – they had,
after all, acquired, through similar, and often considerably more brutal means, even
more territory for their own empires.
The “treaties” Henry Morton Stanley acquired in the Congo were obtained through
“shotgun diplomacy”. Theo Aronson, in The Coburgs of Belgium (1968), admits
that after over 300 Congo chiefs had “signed away their independence” in
exchange for bottles of gin and cloth (although “the majority of them had not the
slightest idea of what it was that they were signing”) Leopold showed his true
colours when he sent Stanley further east from his new base in the Congo to claim
territory as far as the Nile River, using a more violent strategy:
“Ever since he had first visited Egypt over thirty years before …Leopold
had dreamed of one day establishing himself on this great river. Both
strategically and commercially, it would be a splendid position: he would be
in control of the very crossroads of Central Africa and would have access to
almost unlimited supplies of ivory. And who could tell, if all went smoothly,
the Coburgs – spreading northward through the Sudan toward Egypt – might
yet become latter-day pharaohs.
“Never, indeed, did the time seem more opportune than at present. With
General Gordon having just been killed by the Mahdists at Khartoum, the
Sudan was in a state of chaos. Emin Pasha, one of Gordon’s subordinates,
was still holding out at Lado on the Nile, and as Lado was exactly where
King Leopold wanted to be, he hit upon a way of turning Emin’s
predicament to his own advantage. He would send an expedition under
Stanley to relieve Emin and then, as a way of converting the territory around
Lado into a province of the Congo, offer Emin the job of governor.
“Stanley, obliged to approach Lado [on the Nile, in Sudan] from the
Congo rather than by the easier route through East Africa, was faced with his
most difficult task to date. As usual, by a display of iron will and at the cost
of hundreds of lives, he accomplished it. He reached Lado, however, to find
that not only was Emin not interested in Leopold’s offer but that he was not
even particularly pleased to be rescued. Stanley, determined to fulfill at least
one part of his mission, rescued him willy-nilly [took him hostage] and
marched him back to Bagamoyo on the East African coast. Here…the near-sighted Emin fell off a balcony and fractured his skull.
“The Stanley expedition having failed to gain the Nile foothold, Leopold
tried other methods. First he set out to convince Bismarck [the German
chancellor] that his presence on the Nile would help stamp out the
slave trade, and when this ruse misfired, he about-faced and tried to get
Tippo Tib, the famous Arab slaver, to establish a station there. Finally, in
desperation, he was obliged to finance an expedition of his own. In 1890, a
small army set out from the Congo. By signing treaties with such chiefs as
were amenable and by shooting down such as were not, Leopold’s
expedition slowly gained control of the area. By October 1892, King
Leopold was on the Nile. It seemed as though he had made yet another of his
dreams come true.” (p.108-9) (emphasis added)
General Charles George Gordon, whose ‘subordinate’ Emin was ‘rescued’ by
Stanley, had been sent by the British Government to lead Anglo-Egyptian forces
out of the Sudan, which was being attacked by the forces of the Moslem leader
Mohammed Ahmad. Ahmad’s followers hailed him as the “Mahdi”, or “divinely
guided one”, and had attempted to drive the British out of Sudan. Prior to Gordon
being sent to the Sudan the British had assembled an army of 8,000 men against
the Mahdi’s men (‘Mahdists’); only 300 of this army had survived. Sudan and Egypt
were then officially under the indirect rule of the Turkish Ottoman Empire, via the
“Khedive” in Egypt, although, following the building of the Suez Canal, most of
the political power in the area was in the hands of the British and French (who had
financed the building of the canal in the 1860s). Egyptian soldiers served under
British officers after Britain became the principal shareholder in the Suez Canal (in
1875); prior to this arrangement the British and French had established a
“condominium”, or “joint rule”, in Egypt. Since Sudan was also partly under the control of the Turkish
Khedive in Egypt, this vast area came under the sphere of influence of the British.
Egypt had a long history of colonization and attempted colonization, and Egyptian
officers did rise up against the British in the early 1880s; the response from
London, however, was to bomb Alexandria and to send forces to occupy Cairo (1882). The
war between Moslem and Christian armies intensified, and in 1885 the Mahdi’s
army captured and beheaded General Gordon, taking control of Khartoum and the
Sudan.
Hoping to take advantage of the battle over the Nile, the Belgian king approached
the British Prime Minister, Lord Salisbury, suggesting that, in exchange for Britain
allowing him to “lease the Sudan from the Khedive of Egypt”, Leopold would
“lend Britain his Sudanese subjects to do with whatsoever they pleased – even to
using them in an army with which Britain could annex China” (Aronson, 1968,
p.109). The British had, however, already gained the “trading concessions” they
desired with China, at the conclusion of the Opium Wars of the 1840s, and did not
need the Belgian’s assistance in creating an army of “colonial subjects” – they
already had one. The British also had plans of their own for the control, not just of
the Sudan, but of the entire continent of Africa. Unfortunately for the British, and
more so for the Africans, so did others.
One might describe Leopold’s and Rhodes’ imperial strategies as “conquest by
conning, conspiracy and confusion”. Hiding behind a façade of “trading
companies” these would-be emperors acquired territory by fair means or foul
(usually with ‘paperwork’ of some sort) – and then did whatever they deemed
necessary to acquire the minerals and other resources in the land they had
“purchased”. The legacies of King Leopold of Belgium and Lord Cecil Rhodes
have endured in the form of ongoing exploitation of Africa and Africans by the
political, social and economic system they instituted south of the Sahara in British
and Belgian-controlled Africa. This system was a direct development of that which
had made African slavery the most lucrative of the imperial enterprises –
“black ivory” earned much more for the British than did white ivory (elephant
tusks). There were many more “blacks” than elephants, and they were easier to
control with guns, whips, and existing social hierarchies. Like the Portuguese,
Spanish and Dutch before them the British, Germans and Belgians sought out the
most dictatorial tyrants in Africa, provided they were also adequately corrupt, or
foolish enough, to form alliances with – these had existing “governments” through
which the white colonial masters could rule. Through such alliances the colonists
obtained slaves and plundered the wealth of Africa, as they did in Asia and the
Americas – before offering “autonomy” (under colonial control) and subsequently
“independence” (under neo-colonial control).
The subsequent behaviour of Leopold towards “his African subjects”, and the way
the Belgian Congo atrocities have been “whitewashed” in later European
history books, tell us much about colonial and neo-colonial attitudes towards Africa
and Africans. It is sometimes mentioned in history books that “the worst colonial
atrocities occurred in the Belgian Congo”, leading to “international condemnation”
following which Leopold was forced to hand over “his” colony to the Belgian
Government. This, having occurred in 1908, is apparently when the abuses (which
are rarely detailed) ended. When Belgium granted the Belgian Congo
independence in 1960, all the atrocities had apparently been forgotten and
forgiven. Vernon Mallinson, Professor of Comparative Education at the University
of Reading, wrote, in Belgium (1970) that:
“The granting of its charter to the Belgian Congo on 18th October 1908
marked the beginning of some fifty years of enlightened rule, backed by a
doctrine of paternalism towards the native Congolese, and brought a
systematic and steady exploitation of the country’s abundant mineral and
hydro-electric resources as well as the application of modern and scientific
techniques to the development of a large agricultural sector [after the forests
had been ‘cleared’]. Indeed, with the sole exception of the Union of South
Africa, no other territory in what may be termed the black sub-continent has
had such a high industrial and agricultural development as the Congo, and
its importance (particularly as regards its mineral resources) became
increasingly valuable from the end of the First World War.” (p.204)
Mallinson continues with a summary of Leopold’s legacy as the despotic Belgian
king would have liked it to be remembered:
“Leopold II had bequeathed to his country a colony eighty times larger
than Belgium and with a population of 11 million, then controlled by some
3,000 whites who with their successors sought assiduously over seventy-five
years to weld the country (with its bewildering multiplicity of tribal loyalties
and affiliations) into one politically unified whole. As for Leopold II, his
own personal achievement (never having once set foot on his territories)
must be unique in history. He had set out with the initial aim of conferring
on the Congolese the benefits of European civilization [!]. He had occupied,
explored and pacified his colony and had obtained the blessing of the greater
powers to do so. In the short span of twenty-three years he had established
order out of chaos on the basis of a smoothly-running administrative
machine. Slavery had completely disappeared. Missionary, capitalist, and
civil servant were beginning to work in unison to further the introduction of
Christianity, to promote the general welfare of the African, and to bring
about necessary technical progress and efficiency for full economic
development. He had built thousands of kilometres of railroads and highway
and fleets of steamers were navigating all the available waterways…It was
now up to the Belgian government to build on the sound foundations he had
laid.” (p.204-5)
Professor Mallinson is a huge admirer of King Leopold II. In fact, he writes that
this disgusting man is now increasingly regarded as “Leopold the Great”. He
claims that slavery was “stamped out” in the Congo by Leopold’s civilizing
influence, and that the Belgian king provided his government with “sound
foundations” to carry on his noble legacy. We might consider, then, what this
legacy consisted of.
Theo Aronson, in The Coburgs of Belgium provides this insight into the once
widely known Belgian atrocities in the Congo:
“What, exactly, were these horrors? They were the almost inevitable
result of King Leopold’s monopoly system. The King’s only interest in the
Congo was to make as much money out of it as quickly as he possibly could.
In 1901, when he had been due either to repay the Government loan of ten
years before or to hand the Congo to Belgium, he managed, in his usual
adroit manner, to do neither. The Government obligingly passed a bill
postponing repayment of the loan indefinitely, and King Leopold remained
in control of the Congo. Not even he, however, imagined that he would live
forever and his milking of the Congolese cow was now accelerated to
provide funds for his many unfinished projects [mainly the construction of
ostentatious buildings in Belgium].
“Rubber provided the main wealth of the Congo and it was in the
collecting of rubber that the African population was forcibly engaged.
Leopold, interested in quick profits only, made no effort to conserve the
rubber vines or to plant new ones; as the supplies of wild rubber dwindled,
so did the search become more frantic. If the Africans were slow in
delivering their quota, they must be made to work faster. The easiest way to
do this was to terrorize them. And the best way to terrorize them was to kill
those guilty of not working fast enough. To prove to King Leopold’s
impatient agents that a punitive raid against a dilatory village had indeed
been carried out, severed human hands were brought in as testimony. The
more hands that were brought in, the more efficiently had the punishment
been carried out and the more efficiently, presumably, would the rubber be
gathered by others in the future. Reliable witnesses saw baskets of human
hands being hauled along by Africans to a local European official for his
inspection. When the soldiers…were accused of wasting too many bullets,
they would simply hack the hands off living men to make up the pile. Nor
did the atrocities stop at shootings and hand-hackings. Villages were burned,
men flogged, women mutilated, children chained in sheds as hostages or
flung into crocodile-infested rivers, whole clans wiped out. Tribes fled in
terror across the borders. Those who survived lived a nightmare existence.
Senator Picard saw ‘a continual succession of blacks, carrying loads upon
their heads; worn-out beasts of burden, with projecting joints, wasted
features, and staring eyes, perpetually trying to keep afoot despite their
exhaustion…They totter along the road, with bent knees and protruding
bellies, crawling with vermin, a dreadful procession across hill and dale,
dying from exhaustion by the wayside…’ It has been estimated that in
fifteen years, through massacres, through flight and through disease, the
population of the Congo fell by some three million.” (pp.146-7)
Confirmation of Leopold’s atrocities in the Congo and more details of the Belgian
crimes in Africa can be found in the 1967 history textbook Europe and the World
since 1815:
“In 1891 Leopold forbade the sale of ivory and rubber to European
merchants, and made it clear that all such materials were state property.
Since the commission paid by Leopold’s government on these goods was
fixed, the less the native received for his rubber and ivory, the larger the
commission for the agent. This was an incentive to robbery and violence. If
the natives refused to gather the wild rubber, officials were permitted to get it
as best they could. White and native soldiers were stationed in the villages to
drive the natives out to work. If the rubber did not reach the amount
required, the natives were sometimes attacked, killed or taken as slaves. If
the soldiers failed to produce the required quantity of rubber or a suitable
number of human hands or heads to show that the natives had been
punished, sometimes they themselves faced death. In 1894 an English
traveller, E.J. Grave, reported: ‘Twenty-one heads were brought to Stanley
Falls, and have been used by Captain Rom as a decoration round a flower
bed in front of his house.’ This system broke up family life, ruined the native
economy, and destroyed tribal law. It seems likely that within thirty years ten
million people died as a result of these activities and the consequent
disintegration of tribal life.” (p.251)
Yet Professor Vernon Mallinson wrote, in Belgium (1969), that “in fairness to
Leopold it must be stated that he was behaving only as other industrial and
capitalist bosses were behaving in Africa”, adding that “the Congo was far from
being the only place in which the workers could be said to be exploited”. In
Mallinson’s eyes, “The charter Leopold secured from his government for the
Belgian Congo was, for the times, both forward-looking and progressive.” The
Professor of Comparative Education, who tellingly was granted the honour of
Officier de l’Ordre de Leopold II, for services rendered to Belgium, after working
as a British “intelligence officer” in German-occupied Belgium during the Second
World War, continues with his cover-up of Leopold’s atrocities, portraying the
king’s behaviour in the Congo as being exemplary:
“From the beginning he [Leopold] had made an appeal to Belgian
missionaries to take up work in the Congo and he had met with a ready
response. To the missionaries he had assigned the task of providing
instruction for the mass of the population, while he himself had become
responsible [although he never set foot in the Congo] for the training of
suitable African personnel for the civil service and for work in European-organised enterprises. Following the explorations and the campaign against
slave-traders [!] he had gathered together large numbers of abandoned
children in government posts [stolen children]. In 1890 he assumed official
guardianship for such children and created school colonies for them, at
Boma and New Antwerp, to house, feed and clothe them and to provide
basic instruction [enough to assure obedience as slaves]. In 1892 he
empowered the missionaries to do likewise, and they in their turn created
agricultural and vocational colonies [slave plantations], and extended their
influence through their colonies as far afield as possible. The children
themselves were wild and intractable [not surprisingly, since they were being
kept prisoner and cruelly mistreated], whilst parents were blinded by
prejudices of all kinds and could see no object of sending their children to
school. Children and parents alike needed a long and protracted period of
indoctrination. The missionaries, by the very nature of their work and the
kind of contact they had with the Africans, were eminently suited to their
task and performed it most efficiently. By 1905 there were already in
existence 59 fixed and 29 movable Catholic missionary posts served by 384
men and women. There were also 40 principal Protestant missions and 192
subsidiary ones served by 283 pastors.” (pp.86-7)
One wonders what all these men and women of God were doing while the
Congolese were being driven into the forests to collect wild rubber under threat of
having their hands or heads chopped off. One wonders where the missionaries
were when Belgian-appointed “sentinels” kept guard over hostage villages,
inflicting any punishments they deemed “necessary”.
Professor Mallinson’s Belgium, despite its intent, provides some information that may
have relevance to the current AIDS epidemic in Africa, if it is true that the HIV
virus spread through the African population from the Congo. Leopold established,
in 1906, the “National School of Tropical Medicine” in Leopoldville (now
Kinshasa). This was the year after the horrific “extermination order” in nearby
“German South-West Africa” (renamed Namibia, after Independence). In 1905, in
response to the killing of 123 European colonists in Namibia by the local Herero
people, the German commander, Lothar von Trotha, ordered that “Every Herero
with or without a rifle, with or without cattle, shall be shot” in the name of the
“most powerful [German] emperor”. By the time von Trotha was recalled to
Germany, an estimated three out of four Herero people had been killed (Burne,
1991, p.1935).
Ironically, in view of their own colonial history, the atrocities of King Leopold’s
Belgian regime in the Congo were brought to light by the British. In 1905, Roger
Casement, the British Consul in Belgian Congo, presented a report to British
Parliament, stating that up to 100,000 Congolese were being slaughtered annually,
and that entire villages were being razed to make way for new rubber plantations,
in which slavery (forced labour) was the norm. Leopold’s response to the mounting
outcry in Britain and Europe was that his critics were “impractical idealists” and
“meddling do-gooders”, or British imperialists jealous of his success and covetous
of ‘his’ Congo, and that while these atrocities were “sad”, “one cannot accomplish
great work without doing some evil” (Burne, 1991, Aronson, 1968). The Belgian
Government, with the support of the British, took over control of Leopold’s
“estate” in 1908, and further developed the industries that Leopold had begun –
rubber and coffee plantations, timber (logging) mills and mines, all ruled by
“whites” and worked by “blacks”.
After the First World War, Belgian colonial territory in Central Africa increased
with the acquisition of the small nations of Burundi and Rwanda by ‘mandate’
from the League of Nations. The League of Nations, the precursor of the United
Nations, had been set up at the end of the First World War (then called the “Great
War”) by the victorious “Allies” to prevent warfare between European nations in
the future – in this regard a spectacular failure, given that the Second World War
erupted less than two decades later. Rwanda and Burundi, other nuclei of the early
AIDS epidemic in Africa, were previously German-controlled states in the rift
valley, adjoining Belgian Congo. When the Belgian Government received the
“League of Nations mandate” over Rwanda and Burundi (Urundi) the Belgians
extended the discriminatory administrative and social policies employed in the
“Belgian Congo” into “their” new territories in the fertile Rift Valley. This was
centred on providing selected opportunities to a wealthy elite minority at the
expense of the poor majority – in the classic style of “divide and rule” politics that
was so elaborately implemented throughout the British empire, with which the
Belgian administration was allied.
In all the European colonial administration systems in Africa, selected ‘blacks’,
preferably of racial or linguistic minority groups, were employed to control the
majority population (which was mainly dark-skinned) while selected ‘whites’ were
employed to control these ‘elite blacks’. The divide and rule strategy dictated that
men from other colonies be placed in senior administrative positions – this, it was
correctly assumed, would result in an administrative system (and government) that
had greater loyalty to the rulers than the ruled. Thus the British brought elite,
‘educated’ Indians and Ceylonese to work as professionals (doctors, lawyers,
engineers, administrators and so on) in Africa – ruling over the Africans, but
subservient to white colonial rulers. This was also done in the West Indies, Pacific
Islands and other British colonies, ‘protectorates’ and ‘dominions’.
India had been under British rule for more than a century prior to the “Scramble for
Africa”. In India, as in other countries, established empires were taken over by the
British – in this case the Moslem Moghul Empire and the religious empire of the
Brahmins. The slaves of these empires, and their ‘brown’ masters, became indirect
and direct slaves of the British and the other “whites” the Empire regarded as
“equal” to the dominant Anglo-Saxons. The “brown masters” (“professionals”)
were sent to other British colonies and dominions to work in the middle-levels of
the colonial bureaucracy, where they lived in relative opulence, ruling over the
“blacks” and obeying the “whites”. Meanwhile ‘black’ slaves from the various
British dominions, including South India, were sent to work as “indentured labour”
in the tropics. Thus impoverished Indians were sent to work as slaves in British
plantations in the Caribbean, the Pacific Islands and the British-controlled islands
in the Indian Ocean, while affluent Indians and Ceylonese were given moderately
senior positions in the British Colonial Empire. There was even an experiment with
Indian “indentured labour” in Australia in the 1830s prior to the introduction of
“Kanaka labour” (‘Kanaka’, a Polynesian word for ‘man’, was adopted for all
“black” men and women from Melanesia, the Torres Strait Islands and Southern
Pacific Islands when such labour, which was essentially slave-labour, was
introduced to Queensland canefields in the 1860s).
The European colonial mentality, and its callous treatment of the numerous poor
‘natives’ under its ‘protection’, can be clearly seen in the following passage from
Belgium, when Professor Mallinson writes about the use of Congolese soldiers and
“porters” by the Belgians and British against the Germans during the First World
War:
“In point of fact the administrative arrangements made for the Congo in
terms of its charter and the development of educational services were such
that no real changes were considered until after the First World War. On 7
August 1914 King Albert [of Belgium, Leopold’s successor] had proposed
that the Congo basin be considered neutral territory, but a week later the
Germans cut telephonic communications along the western bank of Lake
Tanganyika and gave the 12,000-strong Belgian-Congolese army the
opportunity it was itching for. In September 1914 a detachment helped the
French wipe out a spearhead attack from the Cameroons and the following
year a further detachment joined the French in their Cameroons campaign
which put an end to all German resistance in this area. In November 1915
still another detachment answered an appeal from the British authorities and
helped free Northern Rhodesia from a German invasion. The fourth and last
operation to be mounted entailed the transformation of the Congolese army
into a modern professional fighting force, demanded lengthy preparations
and training, and resulted in the conquest of German East Africa
[Tanganyika], completed on 19 September 1916. Before the war ended,
further assistance was rendered the British in East Africa where brilliant
manoeuvring on the part of the Belgian Colonel Huyghe helped drive the
last remnant of German forces for refuge to the neutral Portuguese colony of
Mozambique on 9 October 1917. These African campaigns cost the lives of
257 Belgians, 2,500 Congolese soldiers, and 20,000 porters.” (p.205)
With these figures of the literal decimation of Africans, the concept of cannon-fodder
takes on new dimensions. By Mallinson’s (conservative) estimate, ten times as many
“Congolese” soldiers died as Belgians, and ten times that number again of
“porters”. While the massacre of Australian soldiers at
Gallipoli is commonly cited here in Australia as an example of the British
sacrificing Australian lives as “cannon-fodder” during the First World War, at least
these young men were not sent unarmed, carrying heavy loads on their heads, to
face an enemy armed with German machine guns and rifles. It is not surprising,
given the nature of “colonial warfare”, that 100 times as many civilian African
porters died in the Anglo-Belgian campaigns against the Germans – civilians are
regarded as more expendable than “trained men” by those who wage such
campaigns.
Cecil Rhodes died in 1902, leaving his ambition to “open up Africa from the Cape
to Cairo” to other British imperialists, and the “Council on Foreign Relations” that
grew out of his clandestine “Round Table” group of Anglo-American industrialists
and imperialists. Chapman writes, in Diamonds in Australia: The Fields and the
Prospectors, of Rhodes’s worsening megalomania, after he became Prime Minister
of the Cape Colony:
“Cecil Rhodes, who became prime minister of Cape Colony and a Privy
Councillor, was obsessed with a grandiose scheme for the opening up of
central Africa through his chartered British South Africa Company. His
involvement in the raid, led by Dr Starr Jameson, to gain control of the
Afrikaaner Republic of the Transvaal in 1895 led to his downfall. The raid
was a disastrous failure and Rhodes was forced to resign as prime minister
and as managing director of the Chartered Company. At the age of 42
Rhodes’ career was ended – he had only six more years to live.” (p.51)
The strategy Rhodes employed to gain control of Southern Africa, and more
specifically “mining rights” to Southern Africa was typical of the British imperial
strategy – “conquest by treaty”. Conquest by treaty was always immediately
followed by “betrayal of treaty”, and such was the case in Rhodesia – except that
there was not even a treaty! All that justified Rhodes’ De Beers empire’s
exploitation of the Bantu gold mines “for eternity” was a written “agreement”
signed, under duress, by Lobengula, the Bantu “chief” of the Matabele people of
present-day Zimbabwe. Lobengula was alternately titled a “chief” or a “king”,
depending on the ‘legal’ necessity. It was difficult, even in 1888, when the
“agreement” was signed, to justify a local “chief” as having the authority to agree
to what Lobengula did in the “Rudd Concession”. In exchange for 1,000 rifles,
ammunition, a gunboat to patrol the Zambezi river and a monthly rent of $200, the
“king of the Matabele people” agreed, according to the British lawyers who drew
up the “business contract”, that De Beers could have mining rights to all of
“Matabeleland” and “Mashonaland”. Furthermore, he agreed, in the small print,
that the “concessionaires” (Rhodes, his partner Charles Rudd, and their mining
company) had “full power to do all things that they may deem necessary to win
and procure the minerals” (Burne, 1991, p.1001).
It is obvious that Lobengula did not have the authority, if it be judged on the will of
his people, to sign such a ridiculous agreement, and still less to sell the wealth
of his neighbours. The Shona people, who had been mining gold in the region since
as early as 1000 AD and who built the magnificent stone cities of Zimbabwe, certainly
did not agree to their goldmines, and any gold and precious minerals in their land,
being “sold” as “Mashonaland” by the chief of the Matabele, along with his own
“Matabeleland”.
Following this “agreement” with Lobengula, the British South Africa Company
lost no time in building a new “seat of government” – founding the town of
Salisbury (Harare) as the “capital” of Rhodesia. Rhodes, who refused to admit, to
his dying day, that the stone cities of Zimbabwe were built by ‘blacks’, had
successfully divided the Bantu people of Zimbabwe into “Matabele” slave states
and “Mashona” slave states, and more were to follow. Eventually, the white
“Rhodesian” and “South African” rulers divided the Bantu people into dozens of
“lands” and even tried to cement the borders of some (and further divide the Bantu
people) with the term “nation”. Massive slave states in South Africa were named
“Swaziland” and “Basutoland”. A huge piece of Southern Africa immediately to
the north of South Africa was ‘annexed’ as the “British Protectorate of
Bechuanaland” (now Botswana). As far as Cecil Rhodes was concerned, it did not
matter what these “lands” were called, he “owned” them, and they were part of
“Rhodesia”. In the manner of King Leopold of Belgium (who egotistically named
‘his’ new capital city in the Congo “Leopoldville”), Rhodes owned, in his own
mind, the people as well as the land. Leopold had claimed ownership of the whole
of the Congo, which he ironically named the “Congo Free State” – ostensibly a
refuge from slavery. Rhodes was more obvious, naming the country after himself,
and had no pretensions of being an emancipist.
When Cecil Rhodes first arrived in South Africa in the 1870s, his ship docked in
Durban, the capital city of Natal. Natal, previously the Boer colony of ‘Natalia’, had
been declared a British colony in 1843, forcing the Boers to move to the Orange
River Province (west) and Transvaal (north), losing their access to the coast.
Immediately adjoining Natal province are the small “states” of Basutoland and
Swaziland, which were landlocked and surrounded on all sides by much larger
“white” states – Basutoland was surrounded by Orange Free State to the north,
Natal province (including Zululand in its north) to the east and Cape Province to
the South. Swaziland was surrounded by Transvaal to the north and west,
Portuguese Mozambique to the east and Natal province to the south. Basutoland
and Swaziland provided the black slaves for the surrounding white states, but these
were no longer called “slaves” or “savages”. They were called “labourers”,
“miners”, “workers”, “porters”, “cleaners”, and so on. They did not work for no
pay at all, as did Australian Aboriginal slaves at the time, but they did work under
duress – they were, in other words, subjected to forced labour. If they did not work
in the dangerous and menial “jobs” they were offered they starved. This was a
different form of slavery, but an equally cruel one to that of Leopold’s Congo, or
the Trans-Atlantic trade in “black ivory”.
The first Europeans to exploit West Africa and the gold-rich Congo were
Portuguese. In 1482 the Portuguese navigator Captain Diogo Cam kidnapped four
Africans to take back to the Portuguese King, John II, after four of his men were
held by King Nziza, the “Manikongo” (king of Congo). King John decided to forge
an alliance with the Manikongo, whose kingdom was said to be rich in gold and
slaves, and sent Diogo back to West Africa (present day Angola, where the Kongo
kingdom was centred) with the four kidnapped Africans “trained” as “Portuguese
Ambassadors”. These men helped the Portuguese establish trust among the
Congolese, trust that was misplaced. The main attraction for the Portuguese
imperialists was gold, at first. Later it became gold, ivory and slaves. By the time
the Dutch challenged the Portuguese claims over African territory in the 1700s,
millions of slaves had already been shipped to Brazil and Central America by the
instigators of the trans-Atlantic slave trade. It is estimated that half of the ‘cargo’
(about 20 million people) died in transit.
When the Dutch, French and British joined in the so-called “scramble for Africa”
all of Africa was populated, and all of it had already been “claimed” by the
Portuguese in their initial claim of the entire “Eastern Hemisphere” back in 1494.
They therefore needed to adopt a different strategy to get their hands on the gold,
ivory and slaves. This was the strategy used in other already “colonised” lands –
they offered to defend and protect the native rulers from the barbaric Catholics
(Portuguese and Spanish), to provide them with education and medicines, to give
them weapons and fight for their freedom.
The British and Dutch governments of the 18th and 19th centuries used a similar
strategy to gain control of intended colonies. “Trading companies” were formed,
such as the “British South Africa Company”, “British East India Company”,
“British East Africa Company” and “Dutch East India Company”, which were
authorised to carry arms, wage wars and “sign treaties”. The existing rulers of the
countries were first tempted with offers of special privileges and military
assistance. The “great British/Dutch king” (or queen) they represented was
portrayed as a benevolent, wise ruler who wished only to engage in mutually
beneficial trade and the enlightenment and emancipation of the masses. Hearing
such noble motives many “savages” signed “treaties”, especially when encouraged
to do so with gifts of alcohol, knives and trinkets. In those nations with knowledge
of the duplicity of European colonists, the signing of treaties required considerably
more stealth, and often considerably more expensive bribes. If such measures
failed to produce the required “treaty”, threats were made – first threats that they
may be attacked by their neighbours, or by other imperial nations; if these failed,
direct threats by the “Trading Companies” were resorted to. Failing direct threats,
the next measure employed for “forced colonization” was demonstration of might
– using guns and cannons, or torture and mutilation if necessary (for example, the
amputation of the hands of competing Indian tailors when the British wanted to
establish Manchester as the textile centre of the world).
Once the paperwork was out of the way the colonists set to work implementing the
notorious “divide and rule” policy. “Divide and rule” was a strategy employed by
imperialists all over the world, and included a complex and carefully constructed
social and military policy, including an army of bureaucrats, army and navy
officers and soldiers, police, judges, propagandists and missionaries. When the
doctrines of positive and negative eugenics were promoted in the late 19th and early
20th centuries the socio-political system through which they could be implemented
had long been in existence.
Southern Africa, because of its mineral wealth, became a prize to be fought over by
European capitalists following the discovery of diamonds in the 1870s and gold in
the 1880s, a battle eventually won by the English imperialist Cecil Rhodes and his
“British South Africa Company”. Miners and prospectors from the Netherlands
(Holland), the British Isles, Germany, Austria, Australia and the USA flocked in their
hundreds to seek their fortunes in Southern Africa, where the British had sought to
establish a “white homeland” as part of the British empire.
The British had strategic (military and political) reasons for wanting to establish
control of Southern Africa, long before it was known to contain rich mineral
deposits. Until the Suez Canal was built in the 1860s, the only sea route to Asia was
around the Cape of Good Hope. Since the Moslem Ottoman Empire (Moors)
controlled much of Northern Africa and the Middle East, the overland trade route
between Europe and Asia was not accessible to the Christian European states. This
made sea routes all the more important to European trade, an importance that
continues to the present day in the case of trade in minerals, chemicals, food and
other “raw materials” and “manufactured goods” (including weapons). After the
British had claimed the Indian subcontinent as part of the “British Empire” in the
1700s and established direct rule in India via the “British Raj”, they sought to
establish control of the entire Indian ocean, excluding other European, and also
Arab, Chinese and Indian traders from trade with “their colonies” in Africa, Asia
and Australia.
When gold was discovered in the 1800s in Southern Africa, Australia and the
Americas, the same mining magnates and the same financiers (bankers) exploited
all three continents. The discovery of gold in California, South Africa and Southern
Australia in the mid-nineteenth century caused “gold-rushes” in every new site
that was announced, and poor men flocked there as well as the idle rich, and
lawmakers from the “mother country”. Inevitably poor men found the first bits of
gold and had their finds taken from them by the rich, and were later forced to slave
in the increasingly deep and dangerous mines for little or no money. In South
Africa, “blacks” and “coloureds” often made the first finds, but found themselves
in chains or worse as a result, while “respectable white businessmen” set up
“proper mining operations” in the “newly discovered goldfield”. In Australia,
“convicts” were treated much as “blacks” and “coloureds” were in South Africa.
The rules in the gold-rich areas were made by the men who ran the bureaucracy
that sprang up wherever gold was found. This bureaucracy, or “administration” –
including police officers, magistrates and bankers – made the rules, and the rules
dictated that gold belonged to the owners of the mines that were rapidly
constructed, and the financiers that “loaned” money to the mining magnates.
There has been a big difference, however, between the mining operations in South
Africa versus those in Australia and the USA, since the times of the gold-rushes. In
Australia and the USA, most of the miners were paid (poorly, at first, but
increasingly well), and were “white-skinned”. In South Africa, when European
gold-mining began in the 19th century those who did the hard work were paid
minimally, if at all, and worked under the duress of starvation and the intimidation
of whips wielded by “whites”. The “miners” themselves were, needless to say, all
“black” Africans. The magnates who “owned the mines” and “owned the land”
were all white.
The Portuguese had led the European navigation race, and the race for exploitation
of African gold when they began trading with the kingdom of Kongo (Congo) in
the 1500s. The Portuguese were much impressed by the Congolese kingdom, and
saw it to be “rich in slaves and gold”. Hoping to form a Christian (Catholic)
alliance with the African kingdom against the Moslem Ottoman Empire, the
Catholic Church and Portuguese government sent missionaries to convert the
Africans, and traders to buy gold and slaves. Once Brazil in South America had
been claimed as part of the Portuguese empire in 1500, there was much demand
from South America for African slaves to work in the mines and farms from which
the Portuguese settlers hoped to become wealthy.
Between 1500 and 1888 (when the Brazilian government reluctantly agreed to ban
slavery, in line with other nations), over 20 million Africans were shipped,
manacled and huddled together in the holds of putrid ships, from European and
African slave markets to white buyers in Brazil, coincidentally the South American
nation worst hit by AIDS. Millions died during transportation of disease and
starvation. Many millions more died prematurely due to the treatment they received
as slaves. So cruel were the Brazilian slave-owners that African slaves survived, on
average, for only seven years after arrival. Brazil is, incidentally, the only ex-Portuguese colony in South America, the other nations on the continent having been
Spanish colonies.
The Portuguese began the transatlantic slave trade in the 1500s centred on slave
markets in Angola, Lisbon and Brazil, but their example was shortly followed by
the other sea-faring European states – notably Spain, the Netherlands, France and
Britain. Tens of millions of Africans were shipped from slave markets established
at ports in Liverpool, Amsterdam, Nantes (in France) and Genoa (in Italy) to
respective British, Dutch, French and Spanish colonies. Other slave markets were
established in islands on the coast of Africa, and in fortified cities on the mainland,
from which slavers could obtain their cargo, while others were established on
islands in the Caribbean sea, where American slavers could buy the “best” slaves
for the cotton, tobacco and other single-crop plantations they established after the
forests had been ‘cleared’ – and to slave in the mines. Meanwhile, the indigenous
population of the Americas was subjected to genocide, by a combination of cannons, guns,
poisons and infections.
Bartholomew Diaz, exploring for the Portuguese empire, rounded the Cape of
Good Hope in 1488, establishing a possible sea route to India, Ceylon and the
“Spice Islands” in the eastern Indian Ocean. Diaz’s discovery was confirmed by
the famous Portuguese explorer Vasco Da Gama shortly afterwards. Da Gama
sailed around the Cape and across the Western Indian Ocean, reaching India – the
centre of the spice and cotton trade at the time. Prior to the recent invention of
refrigeration, salt and spices were essential for the long-term preservation of food,
and were also highly valued for medicinal as well as more obvious culinary reasons.
Trade in spices had long been controlled by Arab (Moor) and Indian traders. The
Javanese empire that was later conquered and acquired by the Dutch, and
converted to the “Dutch East Indies” (Indonesia) was actually established by an
ancient Indian civilization as early as the 5th Century AD.
Much more has been written in European history books about “the Voyages of
Discovery” than the “Voyages of Exploitation” that inevitably followed. In fact, the
voyages by now famous “explorers”, “adventurers” and “navigators” were
sponsored by various European monarchies and governments with an explicit
mission to discover new lands that could be exploited. Exploiting the lands
included exploiting the people, and exploiting what was on and in the actual land.
Whenever a “new” part of the world was discovered by the “explorers” of the 16th,
17th and 18th centuries, it was immediately “claimed” in the name of the monarch
who sponsored the mission. The Portuguese and Spanish did this even if there
already existed established civilizations in the countries being claimed. So
grandiose were their claims that by 1500 they had claimed half the world each. The
Spanish Monarchy “possessed” the “Western Hemisphere” and the Portuguese the
“Eastern Hemisphere”, according to the judgement of the Roman Pope Alexander
VI in 1494.
Since the East African slave trade was controlled by the Moors, the Portuguese
obtained most of their slaves from the Congo region of West Africa until they
established control of the port of Mozambique and later the islands of Zanzibar and
Madagascar on the African coast. Zanzibar had long been a centre of the Arab and
Indian slave trades, from which Africans had been sold to Moslem merchants.
Earlier, the Portuguese had established a port on the south-western coast of Africa –
in Angola, which came to be more highly prized when gem diamonds were
discovered there in the late 19th century.
It has been observed that the second strain of HIV to be named, HIV-2, appears to
have originated, more or less simultaneously, in several countries that happen to
have been Portuguese colonies. This is mentioned, but not adequately explained,
by Laurie Garrett in The Coming Plague:
“Another piece of missing data concerned the remarkable coincidence
of HIV-2 and areas of former Portuguese colonization (Angola,
Mozambique, Guinea-Bissau, Sao Tome and Principe). The only East
African site of HIV-2 was in Mozambique, and West African ex-colonies
had among the highest incidences of HIV-2. It would have been helpful if
somebody had systematically tested Portuguese and African veterans of the
1965-75 colonial wars to determine whether these soldiers caught, and
spread, the virus.” (p.389)
Could Africans have been deliberately infected during these “colonial wars”
(actually, wars fought for independence from colonial rule) or infected since then,
perhaps as an act of revenge against ‘ungrateful natives’? Could such things have
been done elsewhere? Could HIV-1, the strain of the Human Immunodeficiency
Virus that is said to be responsible for the main epidemic of AIDS in Central,
Eastern and Southern Africa, have been introduced into targeted populations for
the same reason? Could it have been done for pecuniary or political reasons, rather
than revenge – or for the implementation of a eugenics program? Wars have been
fought with revenge as a prominent, if not primary, motive – including ‘revenge’ or
‘retribution’ against the already oppressed by oppressors. Terrible atrocities have
always occurred during wars, and terrible atrocities have been committed with no
motivation other than revenge against real or perceived insults. The rejection of
colonial rule was certainly seen as an insult by those who claimed to bring
“civilization”, “development” and “law” to ‘backward natives’ and ‘barbarous
heathens’.
The history of European “colonization” of Africa is of central importance to the
present work, and for those unfamiliar with the geography of Africa, an atlas will
be a valuable accompaniment to reading this book, since much of the evidence
presented of an ongoing negative eugenics program in the modern world can only
be understood with an awareness of the geography of the relevant areas. This
evidence includes historical and epidemiological information that requires basic
geographical awareness. In this work I can only present pieces of a puzzle that I
have been working on for five years – that of the cause of AIDS and the
extraordinary coincidences that have characterised the global pandemic when
compared with the theories, policies and programs of “negative” and “positive”
eugenics. These pieces need to be placed in a geographical context for the picture
to be understood, requiring some form of mental map of the world’s geography,
geology and ethnography. Such understanding also requires the facts provided, the
“pieces of proof”, to be placed in a historical context or time frame.

Chapter 9
THE DEVELOPMENT OF BIOLOGICAL WEAPONS DURING
THE COLD WAR
The AIDS epidemic appeared during the Cold War. The Cold War, which included
a propaganda war between the Capitalist West and the Communist East, and an
unprecedented arms race between the so-called Superpowers, the USA and the
USSR, was characterised by the development of new “cold” rather than “hot”
weapons, although many new explosive devices, including bombs, land-mines and
bullets were also invented by the lucrative industry of arms-manufacture. These
“cold weapons” included biological, chemical and psychological weapons, heinous
“weapons of mass destruction” that kill and maim more cheaply, and less
noticeably than bombs, missiles and guns. Cold warfare does not damage
buildings or valuable artifacts, the capture of which has long been a motivation for waging war. While
the land, air and water may be poisoned by chemical and biological warfare, the
public are less likely to blame it on warfare than on “civilization” itself, or the
much-spoken-of problem of “overpopulation”. These are only a few of the reasons
that biological, chemical and psychological warfare have proliferated over the past
50 years alongside so-called “conventional warfare”. Others are that they are
financially profitable to those developing and manufacturing the weapons, and that
those who research and experiment with cold weapons are able to easily disguise
themselves. Biological and chemical warfare research can easily be passed off as
“medical research”, and psychological warfare research (and espionage) can be
difficult to differentiate from “experimental psychology”. When illness and death
result from these non-explosive forms of warfare, the deaths, even if they number in the
millions, can be attributed to natural causes or, at most, to the unintentional
consequences of pollution, poverty, overcrowding and environmental degradation.
Most of the deaths from this ‘cold’ warfare have not been in the First World and
Second (Communist) World, despite these being the most visible protagonists in
the Cold War. They have been, instead, in the Third World – a large group of
nations that are apparently “poor” and “underdeveloped”. These countries have
also been called “undeveloped”, “developing” and “backward” in supposedly
“expert, academic” writing on modern global health and politics. The division of
the habitable land on the planet into “First”, “Second” and “Third World” was an
idea, and later an official action, of the white male patriarchs in the “First World”.
Even though it maintained the prestige and wealth of many women (and
engineered the poverty and misery of many more), the creation of the fundamental
model of modern global politics comprising the “industrialised” First World, the
“Communist” Second World and the “underdeveloped” Third World was a construct
of rich men suffering from an unhealthy degree of “Cold War paranoia” and
cultural grandiosity. They were the same rich men, belonging to the same “old boy
network” that had fought against the freedom of black slaves, the votes of women,
and the slavery of children. They were eugenists in view and sentiment, even if
they did not recognise themselves as such. Theirs was the much larger proportion
of eugenists, who cared little for Blumenbach’s classification or the more detailed
classifications of “physical anthropologists” – they saw the issue in terms of
“black” and “white”. As far as the majority of “undeclared” eugenists and “Social
Darwinists” were concerned, Darwin’s theory of evolution proved that the white
“race” is good, superior, and destined to rule (and deserving of the continued
“right” to rape, pillage, exploit and plunder, as long as it was done ‘discreetly’)
while the “blacks” (Africans, Melanesians and Aborigines) and “people of colour”
(Asians, Polynesians and Indigenous Americans) had been “proved” by the
“scientific experts” to be “naturally inferior” – especially in terms of mental power
and, more insidiously, moral and rational judgement.
“Blacks”, and other “natives” were seen as being more akin to monkeys and apes
(often no distinction was made between monkeys and apes) and thus to be
naturally more “wild” and “savage”. With a deep fear and loathing of nature, these
white racists equated “wildness” with brutality, barbarity and violence. With the
addition of Galton’s arguments the objects of their hatred could also be demonised
as “drug addicts”, “alcoholics”, “mental defectives”, “moral degenerates” and so
on. The misuse of stigmatising medical labels soon became part of colloquial
expression, including “idiot”, “moron” and “imbecile”, which were initially
developed as medical terms to grade degrees of “feeble-mindedness”. As new
labels entered the public’s vocabulary, the experts (the medical profession) kept
producing more, and changed the “official name” of older, apparently
“misunderstood” terms. The creation of such labels, which function also as codewords for “disease surveillance” and data collection, is a full-time business for a
veritable army of staff of the American Psychiatric Association, the political
emperors of the Global Psychiatric Empire. The American Psychiatric Association,
publishers of the widely-used Diagnostic and Statistical Manual of Mental
Disorders or DSM, will be discussed at length later in this book, but suffice to say,
the “Father of American Psychiatry”, according to the DSMIV, was Dr Benjamin
Rush, one of the “Founding Fathers” of America, and the only physician to sign
George Washington’s Declaration of Independance. Rush, in anticipation of the
eugenists, believed that slavery was necessary for the civilization of blacks, and,
owning many black slaves, taught that “drapetomania” (the madness of slaves
wanting freedom) should be “treated” by whipping. Galton, as his own records of
his travels in Africa reveal, assumed the right to whip his black slaves after a
“small court of justice” presided over by himself. Like Rush, Galton did not
actually do the whipping himself (or face the danger of retaliation) – he delegated
that “duty” to his Danish mate Hans, who “deftly administered the awarded
strokes”.
Given such assumptions of superiority, and the many other examples of racism,
xenophobia, imperialism and elitism demonstrated by medical politicians and
“eugenists” in the past, can we be sure that racism is not still part of global health
programs and medical academia? Is it possible that those who gave financial and
political support to the academic, University-based, eugenics movement in the
1920s, re-introduced race and class-based slavery after the Second World War,
armed with new knowledge of biological, chemical, psychological, technological
and economic warfare?
To look at these possibilities scientifically we must return to the history and
politics of the founders of the eugenics movement and their ardent, but
increasingly secretive and increasingly ruthless followers. These frequently
assumed the identity of “experts” in genetics, psychiatry, population control, and
“family planning”. Convinced by the struggle for survival between different “races” as
portrayed by the founders of the first Society for Eugenics, Francis Galton and
Major Leonard Darwin, the American, German and British political, military and
industrial leaders developed what they saw as practical strategies to apply Charles
Darwin’s theories of natural selection to win the race between the different “races”
of humanity. They were sure of the intellectual superiority of their own “white”
race, but were not as certain of their physical superiority – especially their ability
to resist infection, or to survive in the tropics. Regarding the latter, in the opening
chapter of Origin of Species, Darwin mentions the earlier work of W.C. Wells:
“In 1813, Dr W.C. Wells read before the Royal Society ‘An Account of a
White female, part of whose skin resembled that of a Negro’; but his paper
was not published until his famous ‘Two Essays upon Dew and Single
Vision’ appeared in 1818. In this paper he distinctly recognises the principle
of natural selection, and this is the first recognition which has been
indicated; but he applies it only to the races of man, and to certain characters
alone. After remarking that negroes and mulattos enjoy an immunity from
certain tropical diseases, he observes, firstly, that agriculturists improve their
domesticated animals by selection; and then, he adds, but what is done in
this latter case ‘by art, seems to be done with equal efficacy, though more
slowly, by nature, in the formation of varieties of man, which would occur
among the first few and scattered inhabitants of the middle regions of Africa,
some one would be better fitted than the others to bear the diseases of the
country. This race would consequently multiply, while the others would
decrease; not only from their inability to sustain the attacks of disease, but
from their incapacity of contending with their more vigorous neighbours.
The colour of this vigorous race I take for granted, from what has already
been said, would be dark. But the same disposition to form varieties still
existing, a darker and a darker race would in course of time occur; and as the
darkest would be best fitted for this climate, this would at length become the
most prevalent, if not the only race, in the particular country in which it had
originated.’ He then extends these same views to the white inhabitants of
colder climates.” (p.56)
When Galton formulated his theory he was, like his cousin Charles Darwin,
unfamiliar with the work of the Austrian monk and scientist Gregor Mendel.
Mendel, working alone in his monastery, established the fundamental
principles of dominant and recessive genes involved in the physical variations of
cultivated plants, focusing on detailed experiments on peas. The scholarly priest
presented his work to a dismissive audience of local academics, but, lacking the
necessary support, his work lay “undiscovered” until 1900, when it was publicised
by the Dutch biologist Hugo de Vries. The rediscovery of Mendel’s work
stimulated growing interest in eugenics in the United States of America, centred in
the major universities, Harvard and Stanford, and taking direction from the
Carnegie-controlled Eugenics Records Office at Cold Spring Harbor, Long Island.
Mark Haller reveals how fervid were the first enthusiasts of eugenics in America:
“An expanding interest in genetics at the turn of the century led directly
to the first organization of the eugenics movement. The American Breeders’
Association – formed in 1903 by agricultural breeders and university
biologists – provided the chief center for encouragement of Mendelian
research in the United States. At its second meeting in January, 1906, the
Association set up numerous committees on specific breeding problems.
Among them was a Committee on Eugenics ‘to investigate the report on
heredity in the human race’ and ‘to emphasize the value of superior blood
and the menace to society of inferior blood’. This was the first group in the
United States to advocate eugenics under the name of eugenics.”(p.62)
Emphasizing the menace to society of inferior blood means, in effect, demonising
the victims of negative eugenic prejudices. Since eugenics, as defined by Galton,
stressed that “white” blood is superior to “black”, “brown”, “red” and “yellow”
blood, it is obvious that the “inferior blood” referred to ran in the veins of all the
“coloured races” that the eugenists compared and contrasted with their own.
“People of colour” was a collective term used by the less discriminating eugenists
to describe everyone who was not “white”. According to the formal and informal
hierarchies assumed by the eugenists, “black” races were inferior even to “brown”,
“red” and “yellow” races, following the 18th century anthropological
classification of Blumenbach. It is easy to see how white supremacist extremists
could develop a plan to segregate and enslave the “coloureds” and exterminate the
“blacks”, if they were obsessed by “contamination” of their blood and that of their
children, and terrified of being overwhelmed, physically, by wild, angry, violent
“blacks” – aggressively seeking revenge against their enslavers and exploiters.
While greedy industrialists might develop a grandiose plan for global slavery and
exploitation, it might be supposed that the academic world would never go
along with such a nefarious scheme, and could not be bribed or coerced into
supporting a program of slavery, let alone one of genocide. After the
horrors of the Second World War, surely the eugenics movement was purged of
racism, paternalism, elitism and “ethnic chauvinism”? Surely the medical
profession has cured itself of the megalomania that affected those who killed
millions in the name of “medical treatment” and “racial purity”?
It is commonly claimed that the Nazi genocide was ideologically-driven, in that it
was implemented for the misguided, but nevertheless ‘ideological’, reason of
achieving Aryan “racial purity”. This is only partly true. Much killing was done for
direct mercenary profit, and often the two acted as a combined motive for murder,
as ideology and greed have through the ages. Those who were killed by the Nazis
were robbed first. Even their dead bodies provided “capital” for those callous
enough to create a business out of “human tissue specimens” and gold fillings. The
chemical companies providing poisons and explosives for the German war effort
profited, as did the “agriculturalists” and “industrialists” who took advantage of
slave labour. The mining industry and automobile industry in Europe made a
fortune as the Second World War intensified, prompting President Roosevelt to
argue, in his historic radio broadcast to the American people that supplying arms to
Britain would be “no more unneutral…than it is for Sweden, Russia, and other
nations near Germany to send steel and ore and oil and other war materials into
Germany every day” (Roosevelt, 1940 quoted in Sellers, 1975, p.693). At the time,
the richest man in America was the oil tycoon John D. Rockefeller. The Scottish-born steel magnate Andrew Carnegie’s Carnegie Corporation, financiers of the
“Eugenics Records Office”, also stood to make a killing if, as Roosevelt urged, the
United States of America became the “Great Arsenal of Democracy”.
While this book is mainly concerned with ‘cold’ weapons and cold (chemical,
biological and psychological) warfare, conventional (hot) weapons and warfare are
closely related, historically, economically and politically, to cold weapons and
warfare. The chemical industry is closely associated with the mining industry – the
raw materials procured by the mining industry are processed into various
chemicals, including pharmaceutical drugs, agricultural chemicals and explosives.
This is clearly illustrated by the year 2000 Annual Report of the main Australian
manufacturer of explosives and cyanide, Orica – previously the Australian
subsidiary of Imperial Chemical Industries (ICI Australia).
Imperial Chemical Industries (ICI) was formed in 1926, shortly before the British
Broadcasting Corporation (BBC). These companies were classic imperial
constructs – loyal to the British monarchy, and designed to further the empire’s
interests. They both expanded in size and wealth during the Second World War –
when the BBC grew by producing propaganda and ICI by producing explosives.
ICI also produced pharmaceutical drugs, and was involved in paludrine and
malaria experiments in Australia during the 1940s, during which ‘human guinea
pigs’ were infected with malaria-infected blood courtesy of the Red Cross.
In early 1998 (about a year before the malaria revelations in the Fairfax press), ICI
Australia was made a “public company” and changed its name to Orica, separating
from the British parent company, ICI. ICI’s pharmaceutical operations (now named
Zeneca) were amalgamated with the Swedish Astra Pharmaceuticals, becoming
AstraZeneca. The AstraZeneca office in Sydney recently informed me that the
international head office of AstraZeneca is in England (home also of the head
office of ICI). ICI Australia, now named Orica, is controlled from Melbourne
where seven of its ten Board Members reside, and its head office is located. Orica
employs “9000 staff across approximately 30 countries” and sells the following
products: explosives, cyanide (for the gold-mining industry, including Rio Tinto,
with which they have a large contract), electronic detonators (named “Blaster” and
“Logger”), paints (including the Dulux brand), glues and resins (including the
Selleys range), herbicides, insecticides, fungicides and fertilizers.
One of the biggest environmental catastrophes in gold-mining areas around the
world is cyanide poisoning of the surrounding land and waterways. Sodium
cyanide is used, at the site, to extract gold, and, being extremely toxic to all
vertebrate life, has rendered many streams and rivers in New Guinea, Indonesia,
Africa, South America, Australia and, recently, Eastern Europe, devoid of fish and
poisonous to residents. Orica boasts, in its Annual Report, that “Sales volume of
sodium cyanide to the gold mining industry increased significantly over the year,
including exports to Asia, Oceania, South America and Africa”. The same Annual
Report attempts to encourage investors with the disclosure that “Large contracts
with Inco and Falconbridge in Canada and Rio Tinto in Australia and the Americas
were won through a combination of new technology, process improvement, cost
efficiencies and global teamwork by our people”. This has included expansion of
the company’s “mining services”:
“Our Explosives business is the world’s leading supplier of commercial
explosives and fully integrated blasting services to the mining, quarrying
and construction industries [and military?]. It operates in more than 30
countries with manufacturing facilities in Australia, the USA, Canada,
Brazil, Mexico, Chile, Argentina, Venezuela, Guyana, the UK, Spain,
Turkey, Kazakhstan, Kyrgyzstan, the United Arab Emirates, New Zealand, the
Philippines, Malaysia, China, Indonesia and Thailand.”
Several of these countries are known to manufacture explosive weapons in addition
to having a developed mining industry, and to export these weapons to other
countries. These weapons include bullets, grenades, land-mines, missiles and
bombs. The mining industry, with which explosives companies obviously work
closely, also provides the metal and other raw materials for conventional (hot)
weapons, and for military hardware – guns, helicopters, military planes, ships,
submarines and satellites. Do Orica and Rio Tinto also have undisclosed military
contracts or contracts with arms manufacturers? Is there any control over what
explosive chemicals are used for in the nations that buy explosives, and
“accessories” such as electronic detonators from Orica? Aside from this, the
pollution directly attributable to “increased cyanide sales” and the environmental
devastation caused around the world by the mining industry are enough to implicate
Orica in crimes against nature and crimes against humanity.
The Annual Report 2000 of Orica raises some serious questions about medical
research and university education in Australia, because the Chairman of the Board
of Orica, Dr Ben Lochtenberg (an engineer, not a medical doctor), is
simultaneously the Chairman of the Mental Health Research Institute in Parkville,
Melbourne, Australia’s “premier” psychiatry research centre, and recipient of the
largest slice of “mental health research funding” from the National Health and
Medical Research Council (NHMRC, the chief allocator of Commonwealth
Government research grants in Australia). Dr Lochtenberg is also Director
of Capral Aluminium, the Australian Foundation for Science Limited and
Melbourne University Private, the spectacularly unsuccessful attempt by private
capital-holders in Melbourne to create a “private campus” alongside the 150-year-old public campus of Melbourne University. Could Dr Lochtenberg be faced with a
conflict of interest – being simultaneously in charge of an explosives corporation, a
mining company, a major educational institution and a major health research
institute?
The Fairfield Infectious Diseases Hospital, next to which the new Forensic
Psychiatry Hospital is currently being built, is the home of the Macfarlane Burnet
Institute, the largest AIDS research institution in Australia. The Macfarlane Burnet
Centre (MBC) is soon to be located next to the Alfred Hospital in a multi-million
dollar development. The executive director of the Macfarlane Burnet Institute is
the American Harvard University graduate Professor John Mills, who is also the
director of the AMRAD corporation. AMRAD is a new ‘Australian’ biotechnology
company, a branch of which is AMRAD Pharmaceuticals, which is involved in
joint projects (as “corporate partners”) with the Macfarlane Burnet Institute,
according to the Institute’s Annual Report. Other (non-executive) directors of the
Institute, which is soon to be relocated to new premises at the Alfred Hospital in
Prahran, include Sir Roderick Carnegie, who is described in the 1998 MBC Annual
Report as Chairman of Hudson Conway and Director of John Fairfax Holdings
Limited. Hudson Conway is part-owner of the Crown Casino in Melbourne and
Fairfax Holdings owns The Age newspaper and several popular magazines.
The 1996/97 Annual Report of the Macfarlane Burnet Centre for Medical Research
Limited lists their biggest corporate sponsors as HIH Winterthur (insurance), Rio
Tinto (mining) and Smith Kline Beecham Pharmaceuticals. HIH Winterthur
donated $112,700, Rio Tinto donated $90,000 and Smith Kline Beecham donated
$40,000. Page 17 of the Annual Financial Report (1998) of the Macfarlane Burnet
Centre states (in bold italics) under “remuneration of directors” that non-executive
directors do not receive any income. It also contains a small table showing that one director
(presumably the executive director, Professor Mills) was paid $273,515 (30 June
1997) and $453,745 (31 December 1998). Chairman of the Board of the
Macfarlane Burnet Centre is Mr Graeme Hannan, also Chairman of the Hannan
finance group, and the Deputy Chairman is Mr Raymond Williams, also chief
executive officer (CEO) of HIH Winterthur International Holdings Limited and
director of the following organizations: Insurance Council of Australia, Australian
Motor Insurers Limited, and Garvan Institute for Medical Research (in Sydney).
The insurance industry and mining industry both have a vested interest in the
public health programs promoted by the Macfarlane Burnet Centre for the
prevention of AIDS and hepatitis, programs which are exported to Africa, Asia and
the Pacific Region by the Centre under the auspices of the World Health
Organization. These programs have an almost exclusive focus on surveillance,
injections, drugs and condom distribution as part of what is euphemistically called
a “harm reduction” strategy. The promotional literature of the National Mental
Health Strategy and Drug Strategy suggests that “harm minimization” and “harm
reduction” programs accept that “drug use is now an unavoidable feature of
society” and rather than attempt to stop people from injecting themselves with
heroin, amphetamines and other chemicals, public health designers are focusing on
teaching young people “safe injecting habits” such as not sharing needles between
“users” and safe disposal of contaminated needles and syringes.
The other major focus of the Macfarlane Burnet Centre, under the guise of
“epidemiological research”, is investigation of the sexual habits of particular
populations of young people in Australia and elsewhere, particularly the Aboriginal
population, with the simultaneous promotion of what is, again euphemistically,
termed “safe sex”, meaning the use of condoms and lubricants, rather than sexual
fidelity. This is the same lobby group that has actively promoted “safe injecting
houses”, also called “shooting galleries”, where, it is planned, young people will be
provided with the means and environment to inject themselves with
pharmaceutically regulated heroin, using clean disposable needles in a “controlled
environment” where they can be resuscitated if they “overdose”. The strategy of
virus infection “control” is centred on, in their own terminology, “surveillance”.
The Macfarlane Burnet Centre Annual Report of 1997-98 describes their
involvement in an ongoing project titled “Victorian Aboriginal Health Service
Study of Young People’s Health and Well-Being”. It is described as follows:
“The objective of the Young People’s Health Study is to establish a
longitudinal study of a cohort of young Aboriginal people in order to
describe their health problems, explore the interrelated causes of these
problems, and describe factors associated with adolescent resilience and
vulnerability. This year the project team have finalised the questionnaire
which was programmed for computer use. A team of peer interviewers was
trained and a data collection manual prepared. The team of young peer
interviewers contacted young people on the random sample list and invited
them to take part in the study. 180 young Koori people living in metropolitan
Melbourne have now completed the lengthy questionnaire on portable
computers. Those over 16 years have also been counselled and had tests for
blood borne viruses and sexually transmitted diseases [hepatitis B and
AIDS]. Data collection is now finished and the data entered into the
computer. The next stage of the study will be analysis and writing up the
results. The results will be disseminated to the Aboriginal community and
the local Aboriginal community organisations. There will also be
presentations at seminars and conferences and the results will be published
in journals.” (p.82)
The Macfarlane Burnet Centre was also involved in a project titled “Community
Health Needs Assessment: Yarrambah Aboriginal Community”. This one-week
project, funded by Qld Health and Harvard University, involved “assisting” the
Yarrambah Community “to design an evaluation process for a community-based
needs assessment”. One wonders whether this Aboriginal Community knows who
sits on the Board of the Macfarlane Burnet Centre, or that Rio Tinto Mining is
contributing to their activities, along with the insurance industry. One wonders also
what conclusions “the computer” will reach with all the information gathered about
young Aboriginal people in urban and rural Australia, and what other purposes this
sensitive information could be used for.
The Macfarlane Burnet Centre is a keen proponent of AZT (Azidothymidine, also
called Zidovudine, and manufactured by Glaxo-Wellcome) for the treatment of
HIV infection and AIDS, and needle and condom distribution for the prevention of
“sexually transmitted diseases” including AIDS. It has been involved in
establishing a “needle and syringe exchange program” in the Indian State of
Manipur, which is “the first of its kind”, and is described in the previous year’s
annual report as follows:
“The SHALOM (Society for HIV/AIDS Lifeline Operation in Manipur)
Project is a collaboration between MBC and the Emmanuel Hospitals
Association (EHA). The project was established early in 1995 as an
indigenous response to the alarming incidence of HIV infection among
young drug users in the semi-rural community of Churachandpur in Manipur
state, in far Northeast India. This community-based project aims to reduce
the transmission of HIV and the impact of AIDS in the community. Home
based care and drug detoxification together with counselling and community
education continue as major components of the program.
“A needle and syringe exchange program has been established, the first
of its kind in India, thus providing leadership in the introduction of new but
acceptable strategies to reduce the transmission of HIV in south Asia. MBC
has provided technical support, assisting in the review of project activities
and in planning and design of the third phase. Further support has been
extended through training and support for investigations including a study of
impact of the epidemic on women by the community and seroprevalence of
HIV among injecting drug users.”
In the next annual report, the same strategy is described as “a harm reduction
approach” without giving the detail that this involves the distribution of needles
and syringes.
There is a fundamental difference between swallowing a drug and self-injecting it.
This is a point exploited by the methadone lobby, long after the methadone
program had demonstrably failed to prevent an increasing number of Australians,
Americans and Europeans from becoming addicted to opiates. Other parts of the
world are not equally troubled by opiate addiction, although it is said to be a
growing problem in large cities throughout the world. The reason that methadone
failed to decrease addiction levels in the world is obvious. It is itself an opiate and,
if suddenly stopped, can cause even worse and more prolonged withdrawal than
heroin. A fear of the pain and suffering of withdrawal, and a weakening of the
will to refuse the drug as the symptoms worsen, are recognised amongst the
many factors that contribute to this terrible problem.
Methadone (Physeptone) is a synthetic opiate available in tablet and syrup form,
and sold in Australia by the same company that produces AZT, the giant
pharmaceutical company Glaxo-Wellcome, the head offices of which are based in
the US and England. Wellcome Pharmaceuticals is related to the Wellcome Trust,
Britain’s largest medical research trust fund, although it is claimed that the two
organizations are politically independent, and that financial, political and scientific
decisions of the Wellcome Trust are not influenced by agendas for the profit of
Wellcome Pharmaceuticals, now merged with the huge British drug company
Glaxo to form “Glaxo-Wellcome”. Wellcome Pharmaceuticals is the only drug
company in this part of the world to manufacture and sell azidothymidine (AZT),
now being promoted by the Macfarlane Burnet Centre as a successful treatment for
AIDS, despite much evidence to the contrary. The Centre’s literature also claims
that HIV antibodies in the blood signify an infection that is inevitably fatal, with or
without drug treatment, a claim that is scientifically unjustified and potentially
disastrous.
Chapter 10
MEDICAL WARS AND THE AIDS INDUSTRY
On 2.2.2000, the Murdoch-owned Australian newspaper contained a warning by
reporter Evan Whitton in an article titled “Hard evidence a casualty in the HIV
war”:
“Also alarming is any initiative with “war” in the title. US president
Richard Nixon’s 1971 war on cancer garnered billions for research, but no
cure. Gerald Ford’s 1976 war on swine flu procured 50 million vaccinations
against a bug that may have caused one death.
“And Ronald Reagan’s 1984 war on AIDS extracted more billions from
the public purse…”
The article describes the immediately disputed claim by Robert Gallo that he had
discovered the viral cause of AIDS, with Reagan’s health secretary “at his
shoulder” at a press conference on 23.4.84:
“Gallo, a Californian virus hunter and veteran of the lost war on cancer,
said he had found the probable cause of AIDS and that it was a virus, later
called human immunodeficiency virus. This meant AIDS was caused by a
bug rather than by lifestyle-stress, and that women were also in danger. My
God, how the money rolled in.”
Whitton goes on to express doubt as to whether HIV causes AIDS, quoting the
Nobel prize-winning Dr Kary Mullis as writing, in 1996, that “no one has ever
proved that HIV causes AIDS…there is simply no scientific evidence
demonstrating that this is true.”
He gives some figures and facts about AIDS, HIV and AZT that indicate the size
of the AIDS industry and where significant conflicts of
interest may lie:
“…[Up] to 1997, US taxpayers alone had contributed $US45 billion to
the AIDS industry and it had generated 1500 HIV-related US patents,
100,000 scientific papers, blood-screening tests for evidence of HIV worth
millions and sales of AZT totalling $US2.5 billion.
“AZT (Azidothymidine) is a form of chemotherapy invented by Jerome
Horwitz in the 1950s to treat cancer. It was abandoned because of its toxicity
and resurrected to attack the alleged HIV virus.”
Evan Whitton gives an indication of the “financial cost” to those concerned if the
“man-made theory” turns out to be correct (another disincentive to looking for or
accepting evidence of such):
“Not surprisingly, lawyers are becoming key players in the AIDS
industry: it was reported in January that a legal firm had got ‘$200 million
for 500 Australians with medically acquired HIV’.
“On the other hand, if people such as Nobel prize winner Mullis and
Perth’s Papadopulos-Eleopulos turn out to be right about the dreaded HIV,
lawyers will see $200 million as no more than a drop in the ocean.”
Eleni Papadopulos-Eleopulos is described in the article as a Perth biophysicist
who has been studying AIDS since 1981. She is quoted as saying, “There is no
proof that HIV exists; there is no proof that HIV causes AIDS.”
The difficulty which arises when one talks about “proof” of HIV causing AIDS, the
existence of HIV or the origin of AIDS is that there is no universal measure of
what types and quantities of evidence constitute “proof”. One hundred percent
proof is rare in medical science, but it does exist. Levels of certainty, however,
range from complete certainty to disproof. It has now been disproved that the earth
is flat. It has also been proved that masturbation does not cause blindness, but with
less certainty. The reason it is less certain is that, to my knowledge, no scientific
studies have been done that prove beyond doubt that people who masturbate do not
develop deterioration (or improvement) in their vision over the years (though this is
most unlikely). There are also, more worryingly, few studies showing that watching
television (or computer) screens does not damage vision. What is “proved”, in
other words, depends on what is looked for.
Viruses cannot be seen the way bacteria and other small cellular organisms can be
seen: directly, with the aid of a light microscope. The existence of viruses is mainly
inferred through serological tests (blood tests) and the clinical course of various
illnesses which behave as infectious diseases, but in which bacterial or fungal
causative organisms have not been identified. Viral infections also cause
characteristic changes in the appearance and behaviour of cells in the body, which
can be detected under a microscope. They also cause macroscopic changes, which
can be observed by examining the organs and tissues of people (and animals) that
have died of the infection. The virus itself, however, is too small to be seen, even
with the most powerful light microscope, and the level of certainty allowable for
their existence is therefore necessarily less than that of bacteria (100% certainty).
The existence of viruses is, nevertheless, very close to certainty. Modern imaging
techniques, including electron microscopy, have been claimed to “show what
viruses look like”, but the colour-enhanced, computer-processed pictures of “virus
particles” cannot be relied on with the same faith as seeing the organism oneself
under a microscope. In the case of HIV, we are forced to rely on the very institutes
who may have created the organism to tell us about it, as well as the disease it is
said to cause: “acquired immunodeficiency syndrome” (AIDS). The problem is, they
have said some very contradictory things, and said them with the authority of
“experts”. They have also claimed, with certainty, things that have turned out to be
untrue.
The popular science magazine Scientific American is regarded as an authoritative
and reliable source of information about science, including medical science. In
its October 1988 edition, it featured a “single-topic issue” titled “What
Science Knows About AIDS”. In it, Robert Gallo and Luc Montagnier (of the
Pasteur Institute in Paris, who contested Gallo’s 1984 claim of discovery, saying he
had discovered HIV first) wrote, in their “first collaborative article”:
“As recently as a decade ago it was widely believed that infectious
disease was no longer a threat in the developed world. The remaining
challenges to public health there, it was thought, stemmed from
noninfectious conditions such as cancer, heart disease and degenerative
diseases. That confidence was shattered in the early 1980’s by the advent of
AIDS. Here was a devastating disease caused by a class of infectious agents
– retroviruses – that had first been found in human beings only a few years
before. In spite of the startling nature of the epidemic, science responded
quickly. In the two years from mid-1982 to mid-1984 the outlines of the
epidemic were clarified, a new virus – the human immunodeficiency virus
(HIV) – was isolated and shown to cause the disease, a blood test was
formulated and the virus’s targets in the body were established.”
The article describes the discovery of the first “retroviruses” in animals by Howard
Temin of the University of Wisconsin and David Baltimore of the Massachusetts
Institute of Technology in 1970 and the decade-long search for human retroviruses:
“In spite of such discoveries, by the mid-1970’s no infectious retroviruses
had been found in human beings, and many investigators firmly believed no
human retrovirus would ever be found. Their skepticism had several
grounds. Many excellent scientists had tried and failed to find such a virus.
Moreover, most animal retroviruses had been fairly easy to find, because
they replicated in large quantities, and the new virus particles were readily
observed in the electron microscope; no such phenomenon had been found
in human beings. In spite of this skepticism, by 1980 a prolonged team effort
led by one of us (Gallo) paid off in the isolation of the first human
retrovirus: human T-lymphotropic virus type I (HTLV-I).
“HTLV-I infects T-lymphocytes, white blood cells that have a central role
in the immune response. The virus causes a rare, highly malignant cancer
called adult T-cell leukemia (ATL) that is endemic in parts of Japan, Africa
and the Caribbean but is spreading to other regions as well. Two years after
the discovery of HTLV-I the same group isolated its close relative, HTLV-II.
HTLV-II probably causes some cases of a disease called hairy-cell leukemia
and lymphomas of a more chronic type than those linked to HTLV-I. The
two viruses, however, share some crucial features. They are spread by blood,
by sexual intercourse and from mother to child. Both cause disease after a
long latency, and both infect T lymphocytes. When AIDS was first
recognized, these properties took on great additional significance.”
More detail about Gallo’s discovery of HTLV-I is given in the 1994 Penguin
publication The Coming Plague by Laurie Garrett (who won the Pulitzer Prize for
reporting on the Ebola Virus):
“Dr. Robert Gallo and his NCI colleagues found evidence of a virus
inside the T cells …of a twenty-eight-year-old African-American man who
had come to Bethesda, Maryland, in 1979 from his Alabama home for
experimental cancer treatment. The NCI group quickly found two other
individuals who suffered T-cell lymphomas and seemed to be infected with a
virus: an immigrant woman from the Caribbean and a Caucasian man who
had traveled extensively in the Caribbean and Asia.
“Two years earlier Kiyoshi Takasuki, an epidemiologist with the Tokyo
Cancer Institute, had discovered groups of people living on outer Japanese
islands who apparently had cancer involving their immune systems’ T cells.
The Japanese researcher dubbed the disease adult T-cell leukemia or ATL.
Gallo’s laboratory isolated their virus and named it HTLV, or human T-cell
leukemia virus [Gallo himself describes it as T-lymphotropic virus in his
1988 article]. The Gallo group also identified the existence of an oncogene
[cancer-causing gene] in the HTLV virus that gave the microbe the ability to
produce leukemia. Attempts at collaboration between the Japanese and
American researchers went awry and Yorio Hinuma and Mitsuaki Yoshida of
Kyoto University announced discovery of a different virus in the Japanese
leukemia patients, named ATLV, or adult T-cell leukemia virus.
“Ultimately, Mitsuaki Yoshida led a Tokyo Cancer Institute study in 1980
that compared ATLV and HTLV and found them identical. They furthermore
showed that Japanese monkeys (Macaca fuscata), Indonesian rhesus
monkeys, and African green monkeys captured in Kenya and held in
captivity in Germany had antibodies to ATLV/HTLV, and that the virus – or
a monkey version of the human virus – could be transmitted from one co-caged animal to another.” (p.229)
Could HTLV-1, HTLV-2 and HTLV-3 (later named HIV by Gallo) have been serially
developed, natural (‘wild’) or genetically engineered, cancer-causing viruses? Are
monkeys and Africans being blamed merely as convenient scapegoats? The extent
of “primate research” which involves intentionally infecting monkeys with fatal
viruses becomes evident from several sources, including the above passage and
from a table in Gallo and Montagnier’s Scientific American article, which,
purporting to establish that “evidence that HIV causes AIDS is by now as firm as
that for the causation of any other human disease”, claims as evidence from
“animal systems”:
“Several types of retroviruses can cause severe immune deficiencies in
animals. For example, the feline leukemia virus (FeLV) can cause either
immune deficiency or cancer, depending on slight genetic variations in the
virus. A virus related to HIV, the simian immunodeficiency virus (SIV), can
cause AIDS in macaque monkeys. The second AIDS virus, HIV-2, may also
cause AIDS in macaques.”
The discoverers of the HIV virus do not blame animal experimentation for the
development of new viruses and viral strains, nor do they blame, or mention,
biological warfare or even the immunization programs of the preceding years in
Africa when they explain their theory on “where was HIV hiding?” In their answer
they present a scenario which is speculative and not supported by any specific
evidence:
“Both of us think the answer is that the virus has been present in small,
isolated groups in central Africa or elsewhere for many years. In such groups
the spread of HIV might have been quite limited and the groups themselves
may have had little contact with the outside world. As a result the virus
could have been contained for decades.
“That pattern may have been altered when the way of life in central
Africa began to change. People migrating from remote areas to urban centers
no doubt brought HIV with them. Sexual mores in the city were different
from what they had been in the village, and blood transfusions were
commoner. Consequently HIV may have spread freely. Once a pool of
infected people had been established, transport networks and the generalized
exchange of blood products would have carried it to every corner of the
world.” (p.31)
Gallo and Montagnier give no explanation for the almost simultaneous epidemics
affecting black heterosexual populations in central Africa and white homosexuals
in America, and although they refer to “known at risk populations” they fail to
specify who, exactly, these are. The discussion refers to homosexuals, who are
clearly “at risk”, and the end of the article implies that the “drug culture” is also to
blame:
“All of us must learn how HIV is spread, to reduce risky behavior, to
raise our voices against acceptance of the drug culture and to avoid
stigmatizing victims of the disease.”
The “discoverers of HIV” do not explain why women and children in Central
Africa and later in South East Asia and the “Third World” are also “high risk
populations”, although they make insinuations about “villagers” in Africa
“undoubtedly” bringing HIV with them to urban centres where “sexual mores”
were different. Their model for the dispersal of AIDS via contaminated African
blood transfusions following the initial infection of a local population is plausible
and undoubtedly explains some, but not all, of the spread of AIDS around Africa
(via transfusions and blood products). It is accepted that prior to screening, several
recipients of transfusions and blood products (particularly haemophiliacs)
developed HIV antibody reactions as well as AIDS. In fact, this is included in the
“proof of HIV causing AIDS” provided in the Scientific American article:
“A study of people who received blood transfusions in 1982-83 (when
the fraction of blood donors infected with HIV was about 1 in 2000) showed
that of 28 people who got AIDS, the virus could be found in all 28.
Furthermore, for each recipient who got AIDS an infected donor could be
found. Today most of those infected donors have also developed AIDS.”
It is not exactly true, however, that the “HIV virus could be found” in the cases
described. It would be more accurate to say “evidence of HIV exposure could be
found”. The HIV “infection” was inferred by the presence of HIV antibodies:
evidence of the immune system’s reaction against the human immunodeficiency
virus. In all the other described viral infections, and according to the fundamental
principles of immunology, antibodies are produced as part of the immune defences
– their production is an indicator of a healthy, not an unhealthy, immune response.
They can, however, fail to control an infection and people can become ill from
viral infections while still producing antibodies.
After an infection has been defeated, the immune system continues, for a variable
period of time (sometimes for life), to produce antibodies which protect against re-infection or subsequent infection by the specific virus. These antibodies are
transmitted to a breast-feeding infant in breast milk, protecting the baby from
infections while the immune system is developing. With no satisfying logic, this is
said not to be the case with HIV: antibodies are measured as an indicator of active
infection and the breast milk of “infected” (HIV antibody positive) mothers is said
to pose a risk to their babies. Or are the World Health Organization’s immunization
programs the real risk?
Confusing the issue is the argument put forward by Professor Peter Duesberg, an
eminent virology professor who disagrees that HIV causes AIDS, believing instead
that “recreational drug use” and antiviral drugs, such as AZT, are the real culprits.
In Inventing the AIDS Virus, published in 1996, he wrote:
“While hundreds of thousands of people die of heavy drug abuse or from
their AZT prescriptions, AIDS officials insist on pushing condoms, sterile
needles, and HIV testing on a terrified population. “AIDS propaganda is
ubiquitous,” observes Charles Ortleb, publisher of the homosexual-interest
New York Native. “Ten percent of every brain in America must be filled with
posters, news items, condom warnings, etc., etc. The iconography of ‘AIDS’
is everywhere. Part of the Big Lie is that ‘AIDS’ is somehow not on the front
burner of America. ‘AIDS’ propaganda has become part of the very air that
Americans breathe.” All of this is based on a war against a harmless virus
waged with deadly “treatments” and misleading public health advice.
“This is truly a medical disaster on an unprecedented scale.
“Ironically, HIV-positives actually have no reason to fear. As with
uninfected people, those who stay off recreational drugs and avoid AZT will
never die of “AIDS”. Antibody-positive people can live absolutely normal
lives. Worldwide, seventeen million of eighteen million HIV-positives
certainly do. Those at real risk of AIDS could help their fate if they were
only informed that recreational drugs cause AIDS. And those with AIDS
could recover if they were informed that AZT and its analogs inevitably
terminate DNA synthesis, and thus life.” (p.462)
Duesberg, who is Professor of molecular and cell biology at the University of
California at Berkeley, is a pioneer in retrovirus research, and is credited with
being the first scientist to isolate a cancer-causing gene (oncogene). Aware of a
professional bias towards blaming viruses for toxin-related illnesses, he has been
disputing, with detailed scientific arguments, the “official explanation” of AIDS
since 1989. For this he has been personally vilified by senior figures in the
American medical (and scientific) establishment and his work has suffered from a
degree of suppression, misrepresentation and ridicule.
Nevertheless, his articles challenging the HIV/AIDS hypothesis have been
published in such journals as The New England Journal of Medicine, Science, The
Lancet, British Medical Journal and Cancer Research. The biological warfare
theory and vaccination theory have not received such corporate support.
Surprisingly, Duesberg does not even mention the correlation between Hepatitis B
vaccines (or any other vaccines) and the development of AIDS. He blames
malnutrition and other environmental factors for the development of
immunosuppression in Africa, and drugs for the appearance of AIDS amongst
homosexuals and intravenous drug users in the USA. He presents a complex and
detailed argument to support his view, some of which is quite convincing.
Duesberg explains that “AIDS” is not a specific diagnosis. It is a clinical
syndrome, which is categorised as “AIDS” only if antibodies against HIV are
detected. If a person has tuberculosis and HIV antibodies it becomes “AIDS”; if
such antibodies are not detected, it is just tuberculosis. He points out that the disease
presents very differently in Africa and the United States of America (where it was
first reported), affecting different populations and causing different illnesses. He
writes:
“HIV would need to perform other miracles to cause AIDS. Virtually all
diagnoses of Kaposi’s sarcoma are made in homosexuals, not in the other
AIDS risk groups. Intravenous drug addicts disproportionately suffer from
tuberculosis, Haitians from toxoplasmosis, and hemophiliacs from
pneumonias. African AIDS diseases are basically different, manifesting as
tuberculosis, fever, diarrhoea, and a slim disease, unlike our wasting
syndrome. A homosexual with HIV who may develop Kaposi’s sarcoma can
donate blood for a hemophiliac. But no hemophiliac has ever developed
Kaposi’s sarcoma from a blood transfusion. Instead he is more likely to
develop pneumonia, if he contracts anything at all. Only HIV is common to
both victims.” (p.215)
Duesberg considers HIV to be a harmless “passenger virus”, similar to the simian
virus SV40 which was discovered in the late 1950s and unintentionally injected
into human populations in polio vaccination programs at the time. Although there
have been concerns about long-term disease (or risk of disease) from this virus,
Duesberg, along with most other experts, considers the viral contaminant “safe in
humans”. He writes of the discovery of SV40:
“The war on polio provided an unexpected opportunity for finding new
viruses. In 1959, the Salk polio vaccine was in wide distribution, and the
Sabin vaccine was undergoing large-scale trials in foreign countries. Almost
simultaneously, two scientists independently found a new virus in the
monkey kidneys in which the poliovirus was being mass-produced for the
vaccine – in other words, a contaminant. The virus was native to monkeys
and caused cell death in the kidney tissues. Inspired by the polyoma
discovery, both researchers injected this virus into newborn hamsters in an
attempt to cause cancer, even though neither yet knew of the other’s work.
To the investigators’ excitement, the hamsters did indeed get tumors from
the virus. As the fortieth virus isolated from monkey cells used to propagate
polio vaccines, it was named Simian Virus 40, or SV40.
“The new virus was first publicly announced in 1960. Millions of
children in the United States and abroad had already been immunized with
polio vaccine contaminated with this potentially cancer-causing monkey
virus. Another million soldiers had received vaccines for a different disease
that had been similarly contaminated. Huge studies tracking vaccinated
people soon confirmed no unusual cancer cases among them, but the virus
hunters had achieved their victory. In the wake of the near panic over SV40,
growing amounts of research dollars were earmarked for cancer-virus study.
In 1959, for example, NCI specifically reserved the extraordinary sum of $1
million for the field. The notion that viruses might cause cancer in humans
had been firmly embedded in the thinking of the scientific community.”
(p.94)
There are, in fact, several known viruses that can cause cancer in other mammals
(including mice, hamsters, cats and monkeys). It would seem more than likely that
viruses could also cause cancers in humans, but this is only, of course, a partial
explanation of the causation of cancer. Radiation can also cause cancers and so can
exposure to chemical toxins and hormones. Some cancers demonstrate “family
clustering”, often claimed as evidence that “genetic susceptibility” (oncogenes) is a
major factor in the development of cancer. Duesberg is an expert in both oncogenes
and virology. He doubts, somewhat inexplicably, that any viruses cause cancer in
humans. This would be an exception in the mammal world. He also doubts that
hepatitis B infection predisposes the sufferer to cancer, a dogma accepted by the
AIDS industry and medical profession, and promoted as a reason to increase
hepatitis-B vaccination programs.
Duesberg argues that firstly, no evidence has surfaced that liver cancer is
infectious, and secondly, most people who develop liver cancer are not infected
with viral hepatitis. This does not, however, prove that hepatitis B infection does
not predispose to cancer of the liver, although it can be safely assumed that viral
infection is not the only cause of liver cancer. If the transmission of the hepatitis-B
virus is limited to blood and sexual contact, as is generally claimed, vaccination
against hepatitis-B should be limited to those genuinely at risk of infection – and
the possibility that hepatitis-B is itself being introduced through infected vaccines
cannot reasonably be discounted or ignored without careful evaluation. These
possibilities are not discussed by Duesberg, who does not mention biological
warfare in his 700-page thesis. He suggests, however, that fear of disease is being
intentionally created in a multi-billion-dollar terrorisation campaign about
“superbugs” (including HIV) to attract funds for research into new viruses and
new viral explanations for old illnesses.
Professor Duesberg also refutes, with good reason, claims of uniform fatality from
HIV infection:
“The national AIDS figures fall well short of a virus with a nearly 100
percent fatality rate. But rather than abandon the hypothesis, the experts
have chosen to revise the parameters of HIV infection. The latency period
was originally calculated in 1984 on the basis of tracing sexual contacts,
finding homosexual men developing AIDS an average of ten months after
their last sexual contacts with other AIDS patients. This “incubation period”
has since been stretched out to ten to twelve years between HIV infection
and disease. For each year that passes without the predicted explosion in
AIDS cases, approximately one more year is added to this incubation time.
Even this is insufficient; with only 5 percent of infected Americans
developing AIDS each year, the average latent period would have to be
revised up to some twenty years for 100 percent to become sick.” (p.214)
This is indeed what Professor John Mills has done at Melbourne’s Macfarlane
Burnet Centre. He claims in the MBC pamphlet “HIV/AIDS – the whole world’s
problem” that although “there will be cases where patients have had the infection
for 20 or 30 years before becoming ill”, “virtually everybody infected with HIV
will eventually get AIDS, if they are not treated”. Since HIV was first discovered
only 16 years ago, and several people (the numbers are disputed) who were diagnosed as
“positive” at that time remain well today, it is strange that Professor Mills is so
certain about the fatality rate from HIV infection, of which he says, “the estimate at
the moment is about 95% but it could be 100%.”
It is obviously in the interests of drug promoters to describe the disease they are
treating as more lethal than it is. If HIV infection is said to cause death in 2 years,
and those who are given AZT live, on average for 5 years, the drug could be said to
have “improved life expectancy by 3 years”. If AIDS is instead viewed as less
dangerous, causing death, on average, in, say, 7 years’ time, the same result
(death on AZT in five years) would expose AZT as a harmful treatment. If HIV
infection is not dangerous at all, as Duesberg believes, the use of the drug in
asymptomatic HIV-positive people or even established cases of AIDS would be an
inexcusable crime, since the toxicity of the drug was recognised in the 1960s
(which is why the drug was rejected as a treatment for cancer).
It is worth noting the CDC’s criteria for the diagnosis of “AIDS” (which were
already overinclusive in the opinion of Duesberg and others) were further
broadened in 1993 to include “all HIV-infected persons” (those testing positive for
HIV) who have CD4+ T-lymphocyte counts below 200 per microlitre, even if they
show no signs of illness, and people with HIV antibodies who develop pulmonary
(lung) tuberculosis, recurrent pneumonia or invasive cervical cancer (Harrison’s
Principles of Internal Medicine, 1997, p.1567). Needless to say, this greatly
increases the “official number” of people with “AIDS” as opposed to
asymptomatic HIV, and greatly increases the number of candidates for drug
treatment.
In the next chapter the work and political connections of Australia’s premier AIDS
research institute, the Macfarlane Burnet Centre in Melbourne will be examined in
more detail. Specifically, the promotional material from the centre on HIV/AIDS
will be looked at, along with the public pronouncements of the centre’s medical
director, the American microbiologist John Mills.

Professor Mills, who is also director of the vaccine-developing biotechnology and
drug company AMRAD and the Rothschilds Bioinvestment Fund, was recently
quoted in The Age (14.3.2001), as promoting the use of a “live attenuated HIV
vaccine” (LAHV) for use in “countries where the prevalence of HIV was greater
than about 10 percent”, even though he admitted that “this would also cause the
disease to develop in some people”. His promotion of this admittedly dangerous
course of action was based on “a mathematical model that predicted the trade-off
between safety and effectiveness of live attenuated HIV vaccines”, according to the
article by science reporter Penny Fannin titled “HIV vaccine may cause AIDS:
study”. This mathematical model was used to “predict the epidemiological impact
of HIV vaccines” in Zimbabwe, where, apparently, “every fourth person is infected
with HIV” and Thailand, “where 2 per cent of the population has the virus”,
according to the article.
According to Mills (whose connection with AMRAD is not mentioned in the
article), “although an LAHV would eventually eradicate HIV in both countries
[according to his dubious mathematical model], it would take at least 50 years and
would temporarily increase the death rate of AIDS in Thailand, where transmission
is low”. The Age article failed to mention the fact that Sir Roderick Carnegie, the
director of Fairfax Holdings, publishers of The Age, was on the Board of Directors
of the Macfarlane Burnet Centre until last year. It also failed to mention the Anglo-Americo-Australian AIDS vaccine consortium that has recently received support
from the Australian Government.
Photos of Burnet Institute Board of Directors, 1997-2001
Chapter 11
THE MACFARLANE BURNET CENTRE ON AIDS
The Harvard-trained Professor John Mills, head of the Macfarlane Burnet Centre in
Melbourne, is a keen promoter of Zidovudine and the other newer anti-viral drugs,
making little or no admission of the terrible side-effects they can cause. In a 1998
public education pamphlet titled “HIV/AIDS – The Whole World’s Problem”,
based on a public lecture by Professor Mills, the Macfarlane Burnet Centre claims:
“HIV can also be transmitted from mother to infant, so a woman who has
HIV and becomes pregnant has between 15% and 30% chance of passing
that infection on to her baby. The risk of death for a child with HIV infection
is very high, probably 60% over two years rather than over a decade or two
as is the case with adults. Fortunately, we now know that zidovudine (AZT)
therapy of the mother will reduce transmission to her baby by over 75%”.
The same pamphlet claims that “virtually all transmission of HIV is through sexual
contact and injecting drug use”; however, since 1998 a new theory has been
claimed and accepted as incontrovertible fact by the medical profession: HIV is
said also to be transmitted from mother to baby by breastfeeding. This has permitted a
resumption of the practice of convincing millions of African women (and other
women in the so-called Third World) to feed their infants with milk powder, for
“medical reasons”. This follows a long-running campaign to sell milk powder to
poor nations by massive corporations such as Nestle (manufacturers of NAN milk
powder and now the owners of large pathology companies) and Wyeth
(manufacturers of S-26 milk powder and a broad range of pharmaceutical drugs),
despite the widely publicised fact that babies around the world were dying because
of contaminated bottle-milk (made with impure water), and lack of protective
maternal antibodies in the breast-milk. HIV antibodies are said, however, to signify
chronic infection rather than a successful immune response (as in other viral
infections). The MBC pamphlet offers a “consumer version” of this theory:
“In the first few weeks after exposure to HIV the virus multiplies rapidly.
Next, usually within three months, the person seroconverts – that is, they
start to produce antibodies to HIV. At this point laboratory tests will record
the person as ‘HIV-positive’. Many people experience an acute febrile
(feverish) illness at this time. The illness lasts about fourteen days and may
be mistaken for glandular fever.
“After that, the majority of people who have HIV do not have any
symptoms at all for years or even decades before the illness progresses to the
point where it becomes clinically evident (eg as AIDS).
“During this incubation period the victim is perfectly well and capable of
working and leading a normal life, but also capable of transmitting the
infection to other people. This is one of the very real difficulties with the
HIV virus; there is absolutely no evidence, without specific laboratory
testing, that a person has it. And if you’re in apparent good health, who
would think of taking such a test?”
The answer to the last question is “anyone who read the MBC literature and took it
seriously”. Given the basic psychoimmunological and medical principle, though,
that pessimistic prognoses resulting in despair can weaken the body’s resistance to
illness, it is hard to see how this sort of claim can be justified in the interests of
“health promotion”:
“The latent period – the interval between infection and the development
of the disease – averages 9 years. That means that half the people will get
AIDS before 9 years, but the other half will not get sick until later. There
will be cases where patients have had the infection for 20 or 30 years before
becoming ill. However, virtually everybody infected with HIV will
eventually get AIDS, if they are not treated. The estimate at the moment is
about 95% but it could be 100%.”
The qualifying “if they are not treated” is very misleading: there is no treatment
which has been shown to stop people who are “HIV-positive” from developing
AIDS. Some authorities, including Peter Duesberg, believe that “antiretroviral”
drugs such as AZT actually trigger the illness via their known immunosuppressive
activity. The MBC, however, claims that these drugs make a “2 to 3 year difference
in survival”, which is described as “reasonably effective”:
“Since 1987 drugs have been available for treatment – AZT, Zidovudine,
ddI (didanosine), ddC (zalcitibine) and stavudine (d4T). They are reasonably
effective but really only make a two to three year difference in survival. At
the NCHVR (National Centre for HIV/AIDS Research, located at the MBC)
and MBC one of our key tasks is to develop better therapy for HIV infection,
and there are some promising leads”.
They claim, however, that the “developing world” cannot afford AZT, which is
admitted to be a “relatively toxic drug”:
“In the developing world these therapeutic agents are for the most part
not available to most people. The annual costs for AZT is about $3,000 per
person, and in many such countries the total health care budget is only a
couple of dollars per year per person. In addition, it is a relatively toxic drug
which requires sophisticated medical facilities to monitor therapy.”
The pamphlet, which was apparently “adapted from a talk by Professor John Mills”
(director of the MBC) expands on imminent problems for “developing countries”:
“For the most part, the biggest growth in HIV and AIDS cases has been
in developing countries, with the major risk areas being in Africa,
particularly equatorial Africa, where it is estimated there will be in excess of
6,000,000 cases by the year 2000. Other high risk areas are South America
and South East Asia. In places like Thailand, India and the Philippines we
are facing an AIDS epidemic that is going to be incredibly serious. At the
moment, large numbers of patients are asymptomatic, but as their illness
progresses there will be an appalling medical problem to cope with.”
As to the origin of AIDS, Professor Mills repeats the official claim about monkeys
in Africa. He answers the question “where did the virus come from?”:
“Probably from Central Africa, and it probably represents a virus
originally prevalent in non-human primate population – monkeys, for
example – which got into the human population. It has been present in
Africa for many decades [there is no reliable scientific evidence of this],
perhaps even for centuries (but probably not for thousands of years, unlike
some other viruses). The reason for the epidemic in the second half of the
twentieth century is because of the profound social and political changes
which have permitted this infection to become a pandemic, meaning that it is
a disease of world-wide distribution.”
He does not expand on the precise nature of these “profound social and political
changes”, or the injustice of “third world debt” causing starvation, lack of health
care and a worsening gap between rich and poor. He does not refer to the ongoing
wastage of arms purchases by poor nations from rich ones, or the support of
corrupt and despotic regimes by the war machine. He does not mention biological
warfare, chemical warfare or drug warfare. He does not mention the mining and
forestry industries’ interests in the gold, diamond, uranium, minerals and forests of
Central Africa or Australia. He does, however, paint a grim picture of the toll of
AIDS in Africa:
“In Africa, where the disease is older than in other parts of the world, it is
a disaster with whole villages wiped out. There is real concern that AIDS is
now the major political threat to the economic and social future of Africa…
In the central African country of Rwanda, which is heavily affected by the
AIDS epidemic, up to 30% of young Rwandan women delivering in the
obstetric ward are infected with HIV. This is a terrible problem, because
apart from the economic difficulties and the need for medical care, these
women can also pass the infection along to their children. The risk is 30-50% per pregnancy and since she may be infected for many years before she
actually gets sick, there may be many children involved who will be infected
and die. Those that are not infected, or who live for years, are orphaned
when their mothers die of AIDS.”
What about their fathers? Or are African children “orphaned” when their mothers
die because their fathers do not look after them? There are many insinuations,
subtle at times, about the morality of people in Africa and the “Third World” or
“developing countries” as the poor (colonised) nations are interchangeably referred
to, in the MBC propaganda:
“In developing countries the AIDS epidemic is particularly a problem in
women and children, because women play a key role in the society of those
countries [they don’t here?]. In many of those countries women form a
crucial link as care givers, food providers and part of the social network and
fabric of society, and so the epidemic amongst women in these societies will
have an effect far beyond that which would have occurred had it been
amongst men. Statistics on the prevalence of HIV amongst female sex
workers [where?] show the role that women are playing, or are being forced
to play, in the evolution of the epidemic in these countries. In some countries
virtually female sex workers are infected, in some countries up to 70 or 80
per cent.”
The last sentence contains what is presumably a typographical error, and should
read “virtually all” or something to that effect. Overall, Professor Mills’ appraisal
of “developing countries” demonstrates a gross ignorance of the complex societies
and cultures of Africa and the world, and also suggests that a dominant focus of the
centre’s research is getting people tested for HIV antibodies and presenting
statistical one-liners that support the use of drugs for even asymptomatic “HIV
positive” people. That is, as long as they can afford the expensive drugs. For the
“underdeveloped countries”, Professor Mills can see little hope:
“In the less developed parts of the world the prognosis is poor. We are
losing the battle from the standpoint of control, losing economically because
money for AIDS control is either static or declining, and losing medically
because there are no inexpensive and non-toxic drugs which can be used in
these countries.”
Whilst describing Rwanda as a “central African country” which is “heavily
affected by the AIDS epidemic”, Professor Mills neglects to mention several
historical, political, social and medical facts about Central Africa, and specifically
the treatment of Central Africa and Central African people by Europeans over the
past 400 years that might enable a better perspective to be gained of the health
problems of the millions who have survived unspeakable colonial atrocities.
Rwanda is a small land-locked, forested, mountainous country located east of
Zaire, which is the second largest country in Africa. It was a colonial “possession”
of Germany, but was ‘mandated’ to Belgium by the League of Nations (the
forerunner of the United Nations) after the First World War. The Belgian
Government had already taken control of Zaire, which as “Belgian Congo” had
been a “personal possession” of the Belgian monarch, King Leopold II. Leopold
was guilty of some of the worst colonial abuses of Africa, continuing a system of
direct slavery in his “private possession” accompanied by brutality to the natives of
such cruelty that it became the subject of international concern in 1903, following
which his Government took over administration of the region (in 1908). This
continued until the 1960s, when Zaire and Rwanda became “independent”.
Rwanda has been in a state of civil war since then, with the colonial formalised
labelling of “Hutus” and “Tutsis” resulting in repeated mass-murder, constituting
genocide. Zaire and Rwanda, as well as neighbouring Uganda have also been
ravaged by AIDS, and were the areas of Central Africa where the African epidemic
was first noted.
As in the homosexual population in America, immunization programs have been
blamed, by various researchers over the past 15 years, for the introduction of AIDS
and HIV into Africa. The vaccination programs which have been most frequently
mentioned are the polio eradication program of the 1960s and the smallpox
eradication program of the 1970s. R. Ayana wrote, in 1988:
“Some researchers, including Dr Douglass, have researched the smallpox
vaccination programmes conducted in Africa at that time. Strecker,
Mendelsohn, Pearce Wright, Douglass, Rifkin and others claim that the
epidemiology of AIDS corresponds precisely with the WHO smallpox
vaccination programme.
“Douglass goes so far as to say that a particular vaccination programme
(referred to in a 1972 WHO report of a 1970 NIH conference) was laced
with HIV…assertions that HIV was created in Fort Detrick/NIH/NCI have
been made repeatedly over the past decade and this possibility must continue
to be considered until proven incorrect.”
Dr Robert Mendelsohn wrote, in 1987, about the difficulty of proving this owing to
the lack of follow-up of people who were immunized in these mass-immunization
campaigns, in which a line of children would often be injected with the same
needle in unhygienic conditions:
“The theory – that the AIDS epidemic in Africa may have been triggered
by the smallpox vaccination program – has sparked intense debate among
scientists…an urgent call for evidence to support the idea has been
demanded by the World Health Organization. This theory was discussed by
WHO officials last Autumn (1987). No follow-up data are available from the
smallpox eradication campaign because no systematic studies of the
complications produced by the mass immunization have been done.”
The practice of immunization against smallpox began in Africa as early as the
1920s, with Rhodesia (now Zimbabwe and Zambia) a centre for immunization in
Southern Africa. Rhodesia, named after the British diamond baron Cecil Rhodes,
had a white supremacist regime for most of the twentieth century (and during the
time of the beginning of the smallpox vaccination program) along with
neighbouring South Africa, which also contains extensive deposits of diamonds
which were (and are still) exploited by British mining companies. Zimbabwe,
Zambia and South Africa have also been ravaged by AIDS.
Zimbabwe and Malawi (in the rift valley, also a previously British-ruled nation)
have been selected for “International Health Programs” in Southern Africa by the
Macfarlane Burnet Centre. Their 1997 Annual report states:
“David Hipgrave conducted an assessment study and designed a primary
care project in several districts of Malawi which focussed on community-based malaria prevention activities using insecticide-impregnated bednets [!]
“Wendy Holmes, in collaboration with several NGOs, designed several
HIV/AIDS prevention and care projects in Zimbabwe which focused on care
and support for orphaned children; home-based care and support for people
and their families living with HIV/AIDS; HIV prevention among young
people; and support to the coordination of the responses to HIV/AIDS by
government, non-government, and community organisations in two
districts.”
More detail is provided in the 1998 Annual Report:
“The IHU [International Health Unit of the MBC] is collaborating with
the Batsirai group, a community based HIV prevention care and support
NGO in Chinhoyi, Zimbabwe, in carrying out a project which focuses on the
needs and concerns of women in relation to pregnancy and care of babies
under 12 months in the context of the HIV epidemic. It will also include an
operational research study on the feasibility and acceptability of
alternatives to breastfeeding. The district of Makonde, in which the project
is based, has probably the highest rate of HIV infection in the world.
“The project began in October 1998; Wendy visited for three weeks last
November. A Project Co-ordinator and Research Officer have been
recruited...They participated in an eight day workshop on participatory
research methods and strategic planning. The workshop was attended by a
diverse group including a traditional healer, the Provincial Nursing Officer, a
church leader, the provincial representative of ‘people living with AIDS’, a
social welfare officer and the nurse from the municipal clinic. Detailed plans
for the situation analysis were developed at the workshop…The next step
will be to carry out questionnaire surveys and to compile all the information
so that it can inform the strategic planning process to be conducted in April
1999.”
It is to be hoped that the “strategic plans” decided on will be better than those
developed for Zambia and Malawi, which the MBC admits are “controversial”:
“In Malawi and Zambia, two southern African countries, the HIV
epidemic is widespread, with some surveillance suggesting that up to 25% of
adults may already be infected with HIV. Yet nobody talks openly about the
epidemic, and this stifles the ability of the people of these nations to make
changes. This project, initiated by UNDP (United Nations Development
Programme) and funded by UNV, aims to facilitate community discussion
through placing people living with HIV as volunteers in key government and
other agencies. The project is controversial within those countries, and the
way it unfolds will be closely considered by other countries in Africa and
elsewhere.”
It is interesting that the MBC refer to “surveillance”, which is precisely what is
being done in the guise of “research” around the world. Surveillance of who has
“HIV antibodies” and surveillance of the health, vulnerability and sexual habits of
people around the world, especially young people. Surveillance accompanied by
the collection of blood specimens and data from “questionnaires”, which can
subsequently be interpreted in ways favourable to the profits of the banking
(insurance) industry, pharmaceutical industry and mining industry. The
surveillance in Australia includes such things as “a phone-in survey of
demographics, attitudes and practices of male clients of female sex workers”,
which is described in the 1998 Annual Report of the MBC:
“Through a telephone survey approach, Project Client Call examined the
knowledge, attitudes, and risk-taking behaviours and practices of 328 men
who had been clients of female commercial sex workers in Victoria. The
interviews revealed that at least one in ten of these men had had penetrative
vaginal or anal sex with a prostitute without a condom in the preceding
twelve months…”
Young Vietnamese people in Melbourne have attracted the MBC’s special
attention. In a “study into HIV and HCV in Vietnamese IDUs”, young Vietnamese
heroin users were trained to implement a survey of other young Vietnamese IDUs
(intravenous drug users):
“This project is jointly auspiced by The Centre for Harm Reduction and
the Western Region AIDS and Hepatitis Prevention, which includes a
primary needle exchange based in Footscray, Melbourne.
“The study builds on research completed in 1995 with 100 Vietnamese
IDUs. A questionnaire was devised by 11 young Vietnamese heroin users
who were then trained with support from staff of CHR to implement the
survey. Participants were recruited with informed consent by the peer
workers through their own social networks and at the NSEP. Over a period
of 12 weeks, 200 questionnaires were completed and finger prick samples
tested for HIV and HCV antibodies.”
Another project is titled “ethnographic research with heroin users of Vietnamese
ethnicity”:
“This NHMRC-funded project is investigating the social context of
heroin use by people from Vietnamese ethnicity. This involves 30 in-depth
interviews with Vietnamese users to establish a picture which gives a greater
understanding of the issues involved in heroin use in the Vietnamese
community. Themes being explored include: the processes and influences on
initiation into injecting; routes of administration including implications for
behaviour change; economic and social factors influencing the choice of
administration route; availability and accessibility of treatment services.”
No mention is made of the fact that the treatment provided for heroin addiction in
Victoria is largely focused on replacing the heroin addiction with methadone and
psychiatric drugs. Neither is mention made of the fact that heroin was developed
by the pharmaceutical industry at the turn of the 20th century, and used extensively
by the medical establishment (particularly the psychiatric branch of it) before it
became a “street drug”. No mention is made of the British use of opiate drug
addiction in their 1840s war against China (the Opium Wars) or the fact that
methadone (which is as addictive as heroin) is marketed by the same company that
produces AZT (Zidovudine), the British company Wellcome Pharmaceuticals. No
mention is made of the Vietnam War or the American sponsored militarisation of
Australia and South-East Asia. No mention is made of the real causes of diseases
of poverty and warfare. No mention is made of the fact that the Macfarlane Burnet
Centre operates from the very place (Yarra Bend) where from ancient times
Aboriginal people met and danced the “gaggip” at corroborees celebrating the
unity and friendship of the Kulin Nation. The MBC literature reveals little of the
history of the people whose health care it advises, or of the country from which it
bases its operations and “strategic plans”.
Chapter 12
THE BURNET INSTITUTE’S INTERNATIONAL HEALTH
UNIT
If HIV is being artificially spread in the Third World, the Macfarlane Burnet Centre
(now the Burnet Institute) is deeply implicated in the crime. If not, the recent
Annual Reports of the centre suggest that the International Health programs being
implemented by the centre are grossly inappropriate for the health needs of the
people at whom these programs are directed. These include certain targeted
populations in Australia (especially homosexuals, intravenous drug users and
Aboriginal people) and larger populations in rural and urban areas of Asia and
Africa.
The 1999 Annual Report of the Macfarlane Burnet Centre states that the
International Health Unit, headed by the epidemiologist Mike Toole, is active in
the following regions in addition to Australia (where the unit’s operations are
based): the eastern islands of Indonesia and Melanesia, India, Sri Lanka, Mongolia,
Myanmar (Burma), Vietnam, Laos, Tibet (in collaboration with the Chinese
government), Nepal, Papua New Guinea, seventeen Pacific island countries (Cook
Islands, Niue, Samoa, Tonga, Fiji, Solomon Islands, New Caledonia, Vanuatu,
Kiribati, Marshall Islands, Nauru, Tuvalu, American Samoa, Northern Mariana
Islands, Micronesia, Guam and Palau), Tanzania and Zimbabwe.
All four MBC projects in Tibet involve the epidemiologist Associate Professor
Mike Toole, head of the International Health Unit (who gets to travel a lot). The
first two, “Tibet HIV/AIDS Response Project” and “Hepatitis B Prevalence
Study”, are of enormous concern if it is true that HIV-infection is being introduced
into targeted populations via Hepatitis-B vaccines (the claim was first made in the
1980s and has been repeated many times since then, along with mounting evidence
that this was the case at least among homosexuals in the USA in the late 1970s).
The “Hepatitis B Prevalence Study”, funded by the MBC and Red Cross, is
summarised as:
“Given the lack of precise information on the importance of Hepatitis B
in Tibet, a prevalence survey was conducted on a sample of 324 persons in
eight villages in Shigatse Municipality. The prevalence of infection with the
Hepatitis B virus was 21.6%, considered to be very high and twice the
reported prevalences from other parts of China. Of these, almost 50% were
Hepatitis B e-antigen positive indicating high infectiousness. It is likely that
much of the transmission is from mother to infant during the period
surrounding delivery. These findings gave impetus to the local health
authorities to seek sources of free Hepatitis B vaccine, which could be
provided to new born infants through the health services being strengthened
in the PHC and Water Supply project.” (p.79)
So, without any history or physical examination, the rudimentary and fundamental
basis of good medical diagnosis, the Tibetan people are being judged as having a
high prevalence of Hepatitis-B, a form of viral hepatitis said, for many years, to be
only transmissible by blood contact or sexual contact. No mention is made of how
many people actually showed signs of illness, and one gains the impression that
such signs were not looked for – the entire program was evidently motivated by
“giving added impetus” to the case for selective vaccination of Tibetans against
Hepatitis-B, and the collection of data.
Who would give the Tibetans “free Hepatitis-B vaccines”, and would the Tibetans
be wise to trust in their safety? The 1997 Annual Report of the MBC lists
SmithKline Beecham, one of the world’s biggest Hepatitis-B vaccine producers, as
having provided the centre with a “corporate gift” of $40,000. Would they be the
company that MBC advisers had in mind as a source of free Hepatitis-B vaccines
for the Tibetan babies? Is it possible that the Chinese Government deliberately
understated the prevalence of Hepatitis-B in the “general Chinese population” in
order to stigmatise the Tibetans as “more infectious”? Is it possible that the
Chinese Government want to get rid of the Tibetans and Tibet altogether? If so, the
infrastructure being set up by the Macfarlane Burnet Centre provides a clear
opportunity to implement genocide by spreading HIV among Tibetans, as the
synopsis of the centre’s “Tibet HIV/AIDS Response Project” indicates:
“This project will facilitate an expanded response to HIV/AIDS in the
TAR [Tibet Autonomous Region], an area with increasing vulnerability to
HIV transmission and where prevention efforts have been limited. Technical
assistance will strengthen the skills of a Task Group on HIV and STDs
[sexually transmitted diseases] in the TPHB [Tibet Regional Public Health
Bureau]. A comprehensive situation analysis will be undertaken in Lhasa
City and San Nan and Shigatse prefectures, considered to be among the most
vulnerable areas of Tibet. A multisectoral approach will be promoted and
training will involve key sectors including education, social welfare,
transport, defence and police, as well as mass organisations representing
women, youth and the Red Cross. The situation analysis will inform the
development of a strategic plan to respond to HIV/AIDS in the Tibetan
geographic, cultural, and economic context. The first regional strategic
workshop was conducted in October 1999 for 26 participants from various
sectors and was assisted by Dr Wu Zunyou from the Ministry of Health
AIDS Prevention and Control Centre in Beijing. Generous support has been
provided by UNAIDS Beijing in the form of Chinese language training
materials.”
A medical infrastructure including immunization and medication dispensing
facilities, health education programs and assistance with the provision of clean
water supplies and advice on hygiene can be of immense value to Tibetans, as to
any other impoverished population – and any aid organization that helps provide
such services to the needy is to be commended. A medical infrastructure that
allows those in remote areas to access necessary medical treatments, and gain
knowledge of how to improve their health is also of undeniable value. Every
country in the world is in need of such an infrastructure, including Australia and
the United States of America. In no country has such a service been perfected, and
in every nation disparities exist regarding the quality of services provided
depending on affluence and other factors.
If a technologically complex health care system is provided to ruthless or
repressive governments, terrible things can be done to more people at less cost.
The same infrastructure that could provide “rational prescription of medicines” can
be used to poison people with toxic chemicals and killer viruses. A “health
education” infrastructure can be used to disseminate political propaganda, or for
racial vilification. A “disease surveillance” infrastructure can be used for political
surveillance, and data collection for less benign reasons than genuine medical
research. Is the Chinese Government’s health care system for Tibet an example of a
caring government suddenly recognising neglected responsibilities, or could we be
looking at the development of something more sinister?
It has been amply documented that the Chinese are employing a classical colonial
strategy in Tibet, similar to that employed by Java in its attempt to maintain
political control of Indonesia, and by Israel in the “Occupied Territories”. This is to
encourage migration into the area to be colonised. In the case of China, a
systematic effort has been made, since Tibet was claimed as part of “One China”,
to increase the number of Chinese living in Tibet. In the case of Indonesia, the
government has been attempting to populate the surrounding islands with people
from densely populated Java for decades. In the “Occupied Territories” Israel has,
for several decades, been establishing Jewish “settlements” in Palestinian areas –
these “settlers” act as a “buffer zone” and a para-military force against those whose
land has been “annexed”. In both China and Indonesia there are important racial,
religious and cultural dimensions to this “resettlement” strategy, in addition to
the obvious political ones, as is also the case (obviously) with Israel.
The 1996/7 Annual report of the Macfarlane Burnet Centre (MBC), states that its
major corporate sponsors were HIH Winterthur (insurance), Rio Tinto (mining) and
SmithKline Beecham (pharmaceuticals and vaccines); however, “corporate gifts”
together with other “donations” comprised only 7% of the Centre’s income. The
MBC also received grants from the Commonwealth Government (27% of their
declared income), National Health and Medical Research Council (38%) and other
grants, as well as income from interest and dividends. The Macfarlane Burnet
Centre Company retained profits at the end of the financial year, according to
their annual report, of $3,776,231, increasing their operating revenue to
$13,424,863. This has allowed the company to expand its activities in South
East Asia, South Asia, New Guinea, the Pacific Islands and Africa (Eritrea and
Southern Africa) with new projects centred on disease surveillance, needle
and condom promotion and distribution (dominant AIDS prevention
strategies) and vaccine trials.
The International Health Unit of the MBC, which is conducting joint projects with
AusAID and the Red Cross, mentions the following projects in the Annual Report
for 1996/97:
1. HIV/AIDS and STD prevention and care in Indonesia; a $20 million joint
project between the Australian and Indonesian Governments through AusAID,
which is worryingly focused “with specific reference” to “three Eastern
Indonesian Provinces” – Bali, NTT and South Sulawesi. The project involves
training from the MBC International Health Unit staff given to the Indonesian
Government Departments controlling “education, manpower and employment,
religious affairs, social affairs and family planning” and includes “a policy to
provide HIV/AIDS and sexual health education in all Indonesian schools”. The
MBC report states that “HIV/AIDS programming is now an established part of
the strategic plans of all six departments, with dedicated budgetary allocations
from the national planning board.”
2. “Pioneering” the use of “a novel, prefilled, non-reusable injection device
(Uniject) to deliver hepatitis B and tetanus toxoid vaccines” to pregnant
women, infants and their mothers in Lombok, another Eastern Indonesian
island with a large non-Javanese indigenous population who have been seeking
independence from Javanese rule in recent years.
3. Training “health personnel engaged in the care of women and newborn
children, at levels in the [eastern] Provinces, down to the village” (emphasis
added). These are the eastern provinces of NTB and NTT, and it is acknowledged
that these eastern areas are characterised by “cultural, social and religious
diversity” distinct from the western parts of Indonesia.
4. Establishment of the first needle exchange program in India, in collaboration
with the Emmanuel Hospitals Association, despite the fact that unlike in
Australia, most cases of AIDS in India, South-East Asia and Africa do not
involve intravenous drug users.
5. Short courses (programming) in Primary Health Care and HIV/AIDS in India.
6. Studying the incidence, morbidity and mortality of hepatitis E infection in
pregnant women in Northern India (a newly discovered, and possibly newly
created viral infection that apparently causes a 20% mortality in pregnant
women who become infected with it).
7. Development of the National HIV/AIDS/STD plan in Laos in collaboration
with the Joint United Nations Program on AIDS (UNAIDS), the United Nations
Development Programme (UNDP), and the Government of the Lao Democratic
Republic. This included advice from Bruce Parnell, who has a Masters degree
in Public Health from Monash University in Melbourne, and was “engaged as a
policy development specialist for six months as part of a larger project”, to
expand existing programs in the South-East Asian nation “in scope, and be
complemented with further programs in provincial areas and in sectors other
than the health sector”.
8. Comparing combined versus separate Diphtheria, Tetanus, Pertussis (DTP) and
Hepatitis B (HB) vaccines in Thailand. Although no indication is given that
hilltribe people are at risk of any of these infections, “funding was obtained for
a subproject to strengthen EPI delivery in hilltribe areas”, because the rate of
seroconversion was “somewhat lower in hilltribe people”. Seroconversion can
only be detected by repeated blood tests, which presumably these people
were subjected to, without proper evaluation of their environment, diet and
the real causes of health problems in this population.
9. Community Health and Development in Vietnam, in collaboration with World
Vision Australia and AusAID, which is introduced in the Annual Report with a
revealing perspective on the MBC’s views of the modern “global economy”: “The
introduction of a free market economy presents great challenges to models of
primary health care (PHC) developed under communist rule.” This project
involved Dr Peter Deutschmann, who has a medical degree from the University
of Melbourne advising on the health of women and children and the
implementation of “public health” measures (predictably centred on
immunization, many of which are available from the Macfarlane Burnet
Centre’s largest pharmaceutical sponsor, SmithKline Beecham).
10.HIV/AIDS Education and Awareness Project in Vietnam, also by Peter
Deutschmann in collaboration with World Vision and AusAID. The synopsis
reads: “Youth in Vietnam are vulnerable to the transmission of HIV through
sexual practices and experimentation with drug use. The development of the
first Vietnamese curriculum for sexual health education for youth in schools and
the introduction of sex and HIV education to out of school youth provided the
major focus of this project.” (No mention is made of the fact that heroin and
intravenous drug use were introduced into Vietnamese society by American
soldiers during the Vietnam War.)
11.Strengthening Immunization and Malaria control in Vietnam, in
collaboration with the University of Melbourne, Department of Medicine at the
Royal Melbourne Hospital and the Walter & Eliza Hall Institute (an
immunology and medical research institute in Melbourne, located at the Royal
Melbourne Hospital, previously headed by Sir Frank Macfarlane Burnet
himself, and subsequently by Sir Gustav Nossal, both Fellows of the Royal
Society. Macfarlane Burnet was a declared eugenist, as will be seen).
12.Development of a National AIDS strategy in Papua New Guinea, again by
Peter Deutschmann in collaboration with UNAIDS.
13.Development of National Drug Policy and standard treatment guidelines in
Eritrea in North East Africa.
14.Development of “community-based HIV/AIDS prevention and care and malaria
control projects” in Southern Africa (Malawi and Zimbabwe).
15.“Youth and women’s health project in six Pacific Island countries”.
16.“Development of a regional strategy for the prevention and control of
STD/AIDS in Pacific Island Countries and Territories”.
17.“The control of Hepatitis B infection in Pacific Island Countries”.
18.“Developing national drug policies in Fiji and fifteen Pacific Island countries”.
19.HIV/AIDS program evaluation in South-East Asia (by Bruce Parnell and Kim
Benton in collaboration with the Australian Red Cross).
20.Integrated management of childhood illness (in collaboration with the World
Health Organization).
21.“Project Male-call”: this involved training a “project team in recruiting
strategies to access men who have sex with men for a national telephone
sexual behaviour survey and implementation of the strategy in New
Zealand”.
22.“Victorian Aboriginal Health Service Youth Health Promotion Project: risk
reduction in the Melbourne aboriginal community”, the objective of which
is claimed as “to establish a longitudinal study of a cohort of young
Aboriginal people in order to describe their health problems, explore the
interrelated causes of these problems, and describe factors associated with
adolescent resilience and vulnerability”. The data collection will, according
to the research synopsis, “include administration of an appropriate
questionnaire which has been programmed for computer use”, a health
check, and blood and urine testing. (Not the sort of information that one
would want falling in the wrong hands).
23.HIV/AIDS policy development (Bruce Parnell, in collaboration with the
Australian Federation of AIDS Organizations).
24.“SexDrive II condom usage study” (involving Mike Toole, Roger Pole and
others in collaboration with Enersol Engineering Consultants, the
University of Melbourne Department of Public Health and Community
Medicine and the La Trobe University Centre for Study of STDs). The
synopsis of this project provides some details which give an indication of
the MBC’s strategic direction:
“The aim of this study was to replicate an earlier study comparing the
performance of two types of condoms in actual use; one that met the
Australian and ISO standards for condom quality and one that met the more
stringent Swiss Quality Seal requirements; and to compare condoms used for
anal and vaginal sex.
“Packs of 12 condoms were allocated at random to 101 participants from
Metropolitan Melbourne as each man entered the study and a pack of 12
alternative condoms was sent out when the first batch of diary sheets were
received. There were 1895 condoms used by 101 men over seven months.
There was an overall breakage rate of 2.1% and no significant difference
was found in the overall performance of the condoms.
“Compared with the earlier study, the overall breakage rate is smaller
(2.1% versus 2.9%) and the breakage rates for anal and vaginal sex were
smaller than in the previous study. The suggestion of a difference in
breakage rates between the two types of condom remains, but other, larger,
or differently designed trials are necessary to confirm its existence.”
Is this really what we want our health research budget spent on? And what are we
to make of the “Rio Tinto Aboriginal Foundation”, that is paying for the MBC to
conduct “surveillance” on urban Aboriginal youth in Melbourne (who happen to
pose a significant threat to Rio Tinto’s mining aspirations on Aboriginal land)?
The 1996/97 Annual Report of the MBC also lists nine projects, described as
“global” including five training programs (including a Master of Public Health and
Graduate Diploma in International Health) as well as three projects worth careful
scrutiny, given the current global AIDS epidemic, which has worsened
considerably since the Macfarlane Burnet Centre began its global operations.
These three projects are titled, “Development of a guide to Strategic Planning of
National AIDS Responses”, “Strengthening the role of WHO in complex
humanitarian emergencies”, and “Development of guidelines on tuberculosis
control in emergency settings”.
It is essential, given the nature of these programs, that strict ethical standards are
enforced so that disease and illness are reduced rather than increased and spread. It is
also essential to closely examine the motives that lie behind these programs. Why
these programs in particular? With SmithKline Beecham and others with financial
interest in “global immunization programs” as corporate collaborators and
sponsors, is the relentless push for more injections (into pregnant women and
children in particular) motivated by concern for their health or concern about the
economic “health” of corporations in England, Australia, Germany, France,
Belgium, Switzerland, the United States of America and other countries that
manufacture and market immunizations, condoms and drugs?
Is the sponsorship of the Macfarlane Burnet Centre by Rio Tinto mining and BHP
an indication that these mining companies are altruistic organizations which are
keen to promote health in the Third World? If so, why do they behave towards
indigenous populations in impoverished nations as they do – stealing their
resources, poisoning their land and water, dispossessing and disempowering them
and attacking them in other ways? It is important to remember that the Belgian Congo
(later Zaire) and Rhodesia (later Zambia and Zimbabwe), the ‘sub-Saharan’ regions of
Africa that have been worst hit by the AIDS epidemic, are also mineral-rich parts of
the continent that have suffered some of the worst colonial atrocities of the 1800s
and 1900s. Rhodesia was so named because it, along with South Africa, was
virtually given by the British Government to the British diamond-baron Cecil
Rhodes, whose companies set up the present-day mining operations throughout
Southern Africa (particularly mining for diamonds and gold), including those now
operated by the largest mining company in the world. This is the Anglo-Australian
giant named after a gold-mining area in Spain (where the transnational mining
company also has mines) – Rio Tinto. Rio Tinto made many billions of dollars of
profit from the mining of diamonds and metals (including gold) during the reign of
the notorious white supremacist ‘apartheid regime’ in South Africa. It seems an
extraordinary coincidence that Professor John Mills is, in addition to being the
Director of the Macfarlane Burnet Centre, also Director of the Rothschilds
Biosciences Investment Fund, given the interests and investments of the
Rothschilds Banking Corporation in African, South American and Australian gold,
gold mines, diamonds and diamond mines for almost two centuries. It is surely an
improbable coincidence that Sir Frank Macfarlane Burnet, after whom the
Macfarlane Burnet Centre is named (with his approval and support) was a eugenist
who previously served as an adviser to the United Nations and Commonwealth
Government on biological and chemical weapons – and whose major
preoccupation, other than collecting viruses, was the “threat of overpopulation” (in
“developing countries”).
The “Belgian Congo”, together with the people who lived there, was regarded as a
‘personal possession’ of King Leopold II of Belgium until the early 1900s, and he
treated his ‘possession’, as the history books admit, monstrously – so cruelly that
his ‘property’ was taken from him (at the insistence of the British and the
‘international community’) and placed in the hands of the Belgian Government in
1908. Euphemistically, the Belgian Congo, home of the worst colonial atrocities in
Africa (and there were many), was termed the “Congo Free State” by the Belgians – a
‘free state’ where the native Africans were enslaved, tortured and murdered at will
by Belgian-controlled ‘protectors’ and ‘sentinels’. This is one of the first areas
in Africa from which AIDS was reported, and it happens to contain rich deposits of
minerals and precious metals which were being exploited by Europeans even
before the time of Leopold. Coincidence or negative eugenics?
It is of relevance that one of the main WHO international health concerns over the
past 40 years has been “excessive breeding” (resulting in ‘overpopulation’) by
human beings in what have been interchangeably and serially described as
“undeveloped nations”, “underdeveloped nations”, “developing nations” and “third
world nations”. These nations, which also happen to have been colonised and
exploited for several centuries by the nations which now comprise the self-styled
“first world” are mainly in Africa, South America and Asia (including Indonesia,
Philippines, Bangladesh, India, Sri Lanka and the nations of South-East Asia)
with small islands in the Pacific Region also thus designated. Britain, Canada, New
Zealand and Australia are the only “Commonwealth Nations” that are allowed in
the “first world club”. Canada, Australia and New Zealand are also nations that
have escaped accusations of being ‘overpopulated’ and, in fact, claims were made,
in the 50s, 60s and 70s that these countries were ‘underpopulated’. Ironically, and
significantly, at the same time, “international health” academics in Australia,
England and the USA were making increasingly hysterical claims that the “world”
was overpopulated – that, in Paul Ehrlich’s words, there was a “population bomb”
exploding, which exceeded the danger of the “nuclear bomb” (then the ‘big terror
of the future’).
Chapter 13
AIDS, PSYCHIATRY and GLAXO-WELLCOME
The connection between AIDS and medical psychiatry, and a combined
drug-promotional strategy by Glaxo-Wellcome, is evidenced by a “continuing medical
education publication” sent to Australian doctors in 1998 titled “HIV/AIDS: a
developing issue for general practitioners”. This edition claimed to “focus on GI
and psychiatric manifestations of HIV/AIDS”. Actually, the focus of the
publication is on encouraging doctors to “identify patients at risk”, test them for
“antibodies” and treat them with drugs. The education program is “sponsored” by
Britain-based Wellcome pharmaceuticals, which advertise themselves as “leaders
in antiviral therapy”. Following the article on “HIV/AIDS and Psychiatry” in
which it is claimed that people with HIV/AIDS are at particular risk of
“depression”, “anxiety disorders” and suicide (at 21 to 66 times the rate of the
general population), a short article features under the banner “HIV/AIDS News”.
The “news” is that “low dose zidovudine” is “cost-effective in asymptomatic
HIV-positive patients” and that “zidovudine benefits children with advanced HIV
infection”. Zidovudine, previously called AZT (Azidothymidine) and marketed in
Australia as Retrovir is produced by Wellcome pharmaceuticals (Glaxo-Wellcome,
now part of Glaxo Smith Kline, the world’s largest drug company).
People with AIDS are predictably often diagnosed with anxiety and depression, for
which, in Australia, they are systematically prescribed psychiatric drugs. They are
also said to suffer from an increased incidence of psychotic illnesses (including
‘mania’) and ‘dementia’ characterised by ‘paranoia’. It is evident, however, that
“paranoia” and “delusions” could be diagnosed for holding beliefs such as that
antiviral or multiple drug treatment is harmful, that AIDS is a man-made disaster
or that HIV-antibodies do not necessarily indicate infectivity. Obviously, a belief
that AIDS is the result of biological warfare, or that HIV surveillance has a hidden
agenda would also be diagnosable as ‘paranoid’ according to current psychiatric
criteria (endorsed by the World Health Organization).
Surveillance of “HIV status” has, in recent years, been a central focus of the
US-centred “AIDS research strategy”, with an accompanying insistence (despite
evidence to the contrary) that HIV-antibodies indicate a death sentence, signifying
the inevitable development of AIDS (at some stage). Contradicting this pessimistic
prediction, however, there are several people in the world today who have tested
positive for HIV antibodies as far back as 1984 but not developed AIDS. Closely
observed by the medical research establishment, these “long-term
non-progressors”, as they are termed, have disproved the death sentence they
were given when they were first diagnosed as
“HIV infected”. The AIDS establishment have refused to be convinced, however,
that HIV is not necessarily fatal. Meanwhile the campaign to collect more blood
specimens from people of all races, cultures and nationalities continues.
The Australian medical establishment has played a crucial role in collecting blood
specimens and statistical data on HIV status in this part of the world (including the
Pacific region and South-East Asia), and advising governments of the region about
how to contain the epidemic of AIDS which has worsened in the surrounding
“Third World” nations, but not in Australia. This is attributed to successful public
health measures in Australia (for which the medical establishment, hospitals and
governments take credit), which are said to be lacking in “underdeveloped
nations”. There is, according to the medical establishment, no cure for AIDS or
HIV infection, but a combination of drugs is recommended to “improve survival”.
No immunization against AIDS is currently available, and this is one of the
hoped-for outcomes of medical and immunological research, we are told. In the
meantime, vast amounts of public money have been spent on what are euphemistically
called “harm reduction measures”. These include “safe sex” (use of condoms),
needle exchange and distribution, and testing “high risk groups” for HIV
antibodies. The results of HIV testing are compiled into statistics and interpreted
for the world by “experts”.
In Australia, the main “at risk groups”, according to the dogma of the AIDS
establishment are “men who have sex with men”, “intravenous drug users”,
“psychiatric patients” and particular racial groups (Aboriginal and Torres Strait
Islanders and dark-skinned immigrants from Africa and, less vehemently, certain
Asian countries and Pacific Islands). The possibility of stigmatising these
populations by branding them “potentially diseased and infectious” is obvious, and
the risk becomes clearer still when one reads the “public education” material on
AIDS from the Macfarlane Burnet Centre, Melbourne’s (and Australia’s) premier
HIV/AIDS Research Institute:
“It is a terrible disease process, terrible because many patients die but
also because many of the infections and tumours are disfiguring and socially
obvious. It is almost literally true that patients with AIDS have become
the lepers of the 20th century.” (emphasis added)
The same pamphlet provides Director of MBC John Mills’ perspective of
“injecting drug users”:
“Injecting drug users are frequently the bridge between the homosexual
community and the heterosexual community. They are predominantly
heterosexual, sell sex for money or directly for drugs and may not respond to
the educational messages that motivate others.”
These “educational messages”, among which the MBC particularly recommends the use
of condoms, are accompanied by extremely pessimistic predictions about the AIDS
epidemic and the chances of individuals surviving HIV infection. There is also a
poorly veiled condemnation of the morals of people in “developing nations”,
particularly in Africa:
“In many African countries there is no social prohibition against
husbands having non-monogamous relationships, with the consequence that
most of the women infected are done infected [sic] by their husbands having
such relationships. We need to recognise that these women need social
power before they can deal with the AIDS epidemic. The power to tell their
husbands that polygamous relations are not appropriate behaviour. The
power to tell him to use a condom until she has evidence that his behaviour
has changed.” (He does not indicate what sort of “evidence” is required)
Whilst implying that the problem with these cultures is that they “will not behave
in a reasonable rational, socially and morally correct manner” Mills does admit
something that suggests that Africans are indeed well educated about the risks of
AIDS; they are merely reluctant to use condoms and change centuries-old
social structures and traditions, and/or cannot afford them – or do not trust them:
“A study done in Uganda, a country deeply affected, showed that the
level of AIDS education there is high. Over 97% of people knew that AIDS
is an infectious disease transmitted by sexual intercourse or injecting drug
use, and that it could be prevented by condom usage in sexual intercourse.
But only 10% of these people were using condoms – not because they were
unavailable but because of social, economic or religious barriers to their
use.”
Needless to say, theories that AIDS was introduced to Africa in immunization
programs or for genocidal reasons are not included in the official “HIV/AIDS
education” in Australia or in Africa. The educational message is basically: get
tested for HIV, take drugs at the earliest medically recommended opportunity
(prescribed drugs, that is) and use a condom for anal or vaginal sex. The
Melbourne AIDS researchers are particularly interested in anal sex, as their
research and publications demonstrate. In Australia, the focus of the AIDS research
establishment is on “discovering” patterns of “risk-taking behaviour” among
young people (especially young Aboriginal and Vietnamese people). This research
includes the collection of blood specimens (to test for venereal diseases),
questionnaires and personal interrogation (described as ‘interviews’). In these often
involuntary interviews the most intrusive details about the young person’s sex life
are sought, with an enthusiasm that suggests more than an element of voyeurism.
These research projects involve a close collaboration of the hospital-based
psychiatric system and the AIDS research institutions (of which there is really only
one of note – the MBC). Both collect “data” and the integrated results are analysed
together. Somehow the result – the creation of a stigmatised and drug-addicted
sub-class of society – has been ignored, although the research results clearly point
to this as a feature of AIDS and HIV infection in Australia.
In 1997 the Australian and New Zealand Journal of Psychiatry published the
results of such a collaboration in an article titled “HIV risk behaviour and HIV
testing of psychiatric patients in Melbourne”, involving several authors, including
(lead author) Sandra Thompson, Gill Checkley, Jane Hocking and Nick Crofts of
the Epidemiology and Social Research Unit of the Macfarlane Burnet Centre. Nick
Crofts is described as the project’s “biostatistician”. The study involved the
Department of Psychiatry of the University of Melbourne and the Alfred Hospital
in Prahran, and aimed to, according to the stated “objectives”, “determine the
prevalence of risk behaviours associated with HIV transmission and factors
associated with HIV testing in psychiatric patients in Melbourne”.
While lumping all “psychiatric patients” into a “high risk group”, the study
involved 145 people of whom “55.2% had schizophrenia”. Most psychiatric
patients are not labelled with “schizophrenia”, although many who are subjected to
psychiatric treatment at the Alfred Hospital are given this “diagnosis”. The results
of this study reflect this and also that people who are subjected to psychiatric
treatment at the hospital develop both a fear of AIDS and a tendency to drug
addiction. This is not surprising, since the hospital forces them to take drugs,
offers addictive benzodiazepines liberally, and has a policy of injecting people
who refuse to take the drugs voluntarily. It is difficult to imagine a more certain
recipe for the creation of drug addiction, and introduction into a “chemical
paradigm” (let alone initiation into injected drugs). The hospital also dispenses the
addictive opiate methadone to heroin addicts and has a significant number of
homeless (and often drug-addicted) young people as chronic or recurrent patients.
They keep these patients as official patients of the hospital through a system of
Community Treatment Orders, which are routinely issued against any patient who
is suspected of “lack of insight” or “lack of compliance”. Such was the population
of young people sampled by the Macfarlane Burnet Centre for their study of “HIV
risk behaviour and HIV testing of psychiatric patients in Melbourne”.
The “results” are summarised as follows:
“Of 145 participants, 60% were male and 55.2% had schizophrenia.
Injecting drug use (IDU) was reported by 15.9%, a figure approximately 10
times that found in other population surveys. Most patients reported sex in
the last decade and over 20% had multiple sexual partners in the last year. Of
males, 12.6% reported sex with another male (9.2% anal sex); 19% of
females reported sex with a bisexual male. Nearly half of the males reported
sex with a prostitute, 2.5 times that in a population sample. Only 15.9%
reported ever having someone talk to them specifically about HIV and its
transmission, although one-third had been tested for HIV. In multivariate
analysis, male-male sex, paying for sex, and IDU were associated with HIV
testing, but those whose primary language was not English were less likely
to be tested. Those who had received HIV education were more likely to
have used a condom last time they had sex.”
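The relative-rate comparisons in the abstract quoted above (“approximately 10 times that found in other population surveys”, “2.5 times that in a population sample”) rest on simple prevalence-ratio arithmetic. The following is a minimal sketch of that arithmetic; the sample figures come from the quoted abstract, while the general-population baselines are assumptions back-derived from the stated ratios, not figures reported in the paper itself.

```python
# Prevalence-ratio arithmetic behind the quoted abstract's comparisons.
# Sample figures: from the abstract quoted above.
# Population baselines: ASSUMPTIONS back-derived from the stated ratios.

def prevalence_ratio(sample_rate: float, population_rate: float) -> float:
    """How many times more common a behaviour is in the study sample
    than in a general-population survey."""
    return sample_rate / population_rate

# Injecting drug use: 15.9% of the sample (reported) vs an assumed
# population baseline of 1.6%.
print(round(prevalence_ratio(0.159, 0.016), 1))  # about 10

# Sex with a prostitute: "nearly half" of males, taken here as 45%,
# vs an assumed population baseline of 18%.
print(round(prevalence_ratio(0.450, 0.180), 1))  # 2.5
```

A ratio of this sort says nothing by itself about causation; the surrounding text argues that the hospital’s own prescribing practices are a plausible source of the elevated drug-use figure.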
The conclusion is summarised at the beginning of the paper, presenting no
surprises:
“This study provides evidence that those with serious mental illness in
Victoria have higher rates of participation in risk behaviour for HIV
infection than those in the general community. Attention to HIV education
and prevention in this group has been inappropriately scant; strategies to
encourage safer behaviour are urgently needed.”
By remarkable coincidence, “HIV education and prevention”, as defined by the
MBC, will realise an old ambition of the eugenists – preventing the breeding of
“mental defectives” (‘psychiatric patients’). The Macfarlane Burnet Centre, located
in Yarra Bend at the site of the now defunct Fairfield Infectious Diseases Hospital
(where the centre originated) is Australia’s foremost virology research institute,
according to their own propaganda. Their 1997 Annual Report begins with a
“mission statement” which claims:
“The Macfarlane Burnet Centre for medical research is Australia’s premier
virology and communicable disease research institute. The Centre has
ongoing active research programs into many of the viruses of major public
health importance, including hepatitis A, B, C and E, HIV and AIDS,
rubella, and respiratory syncytial virus infection.
“The Centre’s philosophy is that a variety of approaches should be used
to achieve control of viral and other communicable diseases, from
fundamental research in virology, molecular biology and immunology;
through cell biology, laboratory support for clinical trials, clinical trials,
epidemiological and social research, and the design, implementation and
evaluation of public health programs. The centre’s work becomes
increasingly relevant not only locally but globally in the face of growing
numbers of viral diseases manifesting themselves in often horrifying ways
throughout the world.
“Established initially as a centre for basic laboratory research MBC has,
in line with its philosophy, started to reach out not only to the local
community through its Epidemiology and Social Research Unit, but to the
larger global community through its International Health Unit. It now has
offices in Indonesia, Nepal and Vietnam which co-ordinate various field
studies in these locations, and an office in Geneva which has the
responsibility of liaising with international organisations such as the World
Health Organisation. The Centre has a lesser presence in many other Asian,
Indian and African areas.”
Although MBC advises on the treatment of people of many races, most of whom
are impoverished and dark-skinned, the Members of the MBC Board are all
“white”, with the exception of Ms Sharon Firebrace, who is one of two female
board members and is described as “an active member of several committees
concerned with Aboriginal and general community affairs over 18 years”, and
“director of [a] private consultancy specializing in public relations and economic
and business development of Aboriginal and wider communities” (including the
mining industry).
Despite the best efforts of Ms Firebrace to represent the interests of Aborigines (and
women), her views could easily be over-ridden by the commercial interests of the
11 middle-aged men who sit on the 13-member Board. They included, in 1998, the
Executive Director, Professor John Mills (also director of AMRAD
pharmaceuticals and the Rothschild’s Biosciences Investment Trust), Sir Roderick
Carnegie (Chairman of Hudson Conway, Newcrest Mining, Valiant Consolidated
Limited and John Fairfax Holdings Limited) and Raymond Williams (Chairman
and CEO of HIH Winterthur insurance and director of The Insurance Council of
Australia and Australian Motor Insurers Limited). The Board of MBC also
included at the time, Peter Ivany (CEO of the Hoyts Group), Douglas Rathbone
(Director of Gibson Chemicals and other corporate industrial positions), Graeme
Hannan (Chairman of Hannan corporate finance), and Michael Roux (director of
the Local Authorities Superannuation Board and advisor to the Deutsche Bank
Group in Australia).
It is of relevance that the most generous corporate sponsors of the Macfarlane
Burnet Centre, according to their 1997 and 1998 Annual Reports were HIH
Winterthur Insurance, Rio Tinto Mining, SmithKline Beecham (pharmaceuticals
and vaccines) and BHP. It is easy to recognise the vested interests that all these
companies could have in disease-creation. The insurance industry benefits from
concern about illness generally, and there have been few illnesses that generated
the public and governmental panic that AIDS generated in the 1980s, with active
assistance from public terrorization campaigns such as the notorious “Grim
Reaper” ads on Australian television. The mining industry profits from the sale and
distribution of metal needles, plastic syringes and drugs but also could perceive a
political benefit in genociding indigenous populations in areas where they have
mining interests. SmithKline Beecham obviously profits from the promotion of
immunizations, being one of the world’s biggest vaccine manufacturers.
SmithKline Beecham is also one of the many transnational drug companies that
have profited financially from the treatment of people with AIDS (who, if they can
afford it, are routinely prescribed multiple drugs). In addition to these, the medical
diagnostics industry (which now includes branches of SmithKline Beecham and
Nestle), have obviously profited financially from paranoia about HIV/AIDS. Of
course, these industries (and the pharmaceutical and medical treatment industries)
are mutually supportive, and, while competing with each other for power and money,
also co-operate with each other to a degree.
SmithKline Beecham (which has recently merged with Glaxo-Wellcome) is a
massive British drug company that promotes the SSRI drug Aropax for “panic
disorder” and “depression” and markets several vaccines, including vaccines
against hepatitis A, hepatitis B, diphtheria, pertussis, tetanus, influenza, measles,
mumps, rubella, neisseria meningitidis, polio and typhoid. They, together with the
Commonwealth Serum Laboratories (CSL) and the American Merck Sharp &
Dohme (MSD) are Australia’s biggest vaccine marketers. Staff at SmithKline
Beecham International informed me when I was researching vaccine manufacture
that all the vaccines used in the Australian region are manufactured in and
distributed from Rixensart in Belgium, which has been the centre of “SKB
biologicals” for many years (a rather extraordinary coincidence, given the fact that
the African epidemic seems to have begun in the Belgian ex-colonies of Zaire,
Rwanda and Burundi; and rather alarming in view of previous Belgian treatment of
“blacks” in Central Africa). The corporate decisions of the company, however, are
generated from Britain, where the head office of SmithKline Beecham is located.
The company is on the British and American (but not the Australian or European)
stock markets. SmithKline Beecham has recently amalgamated with Glaxo-Wellcome,
the manufacturers of AZT (zidovudine), methadone and, according to
the new company’s promotional material, 25% of the world’s pharmaceutical
opiates (narcotics).
The public relations department of SKB Australia, which informed me that all SKB
vaccines are manufactured by “SKB Biologicals” in Belgium, was unable to
discover how long this has been the case, explaining that the company is divided
into “consumer”, “pharmaceutical” and “biological” sections. “Biologicals” is
centred in Belgium, where a factory employing 2000 people has produced
700,000,000 vaccine doses, used in 158 countries, since 1956, when they produced
their first injectable (Salk) polio vaccine.
According to the official SKB promotion, this was followed by the Sabin polio
(oral) vaccine in 1961, live attenuated rubella vaccine in
1969, measles vaccine (1976), meningococcal vaccine (1978), chicken pox (1984),
(the world’s first) genetically engineered hepatitis B vaccine (1986), influenza
vaccine (1991), hepatitis A vaccine (1992), DTPa (triple antigen with acellular
pertussis, in 1995), haemophilus influenzae (1996), measles, mumps rubella
vaccine (1998) and typhoid vaccine (1998). This is only a summary, and, for
example, earlier non-genetically engineered hepatitis B vaccines are not mentioned
in the official “immunization highlights”.
SmithKline Beecham was formed by the amalgamation of the American Smith
Kline & French pharmaceuticals with the British Beecham laboratories, which
occurred in the 1980s. Maybe it could be described as a “takeover” by Beecham of
Smith Kline & French, since the corporate decisions of the company now emanate
from Britain, although this may change again with the recent amalgamation of
SmithKline Beecham and Glaxo-Wellcome. It was Smith Kline & French that first
marketed the crippling chemical chlorpromazine as an “antipsychotic” back in the
1950s, after its initial discovery and testing by the French drug company
Rhone-Poulenc (which had tested the drug in French Africa, France, and Montreal,
Canada). The Canadian medical historian Edward Shorter wrote, in A History of
Psychiatry (1997), about momentous pharmaceutical events in 1953:
“The scene shifts to the United States, toughest market of all to crack
because of the predominance of psychoanalysis and its predilection for
getting to the “real” causes of the illness. Here an ambitious young drug
house named Smith Kline & French enters the picture. The company was a
maker of patent medicines, and its new president, Francis Boyer, wanted to
upgrade it to a manufacturer of “ethicals,” meaning drugs prescribed by the
medical profession. Aware that Rhone-Poulenc had a hot new potentiator
going – but unaware that it might have psychiatric uses – in the spring of
1952 Boyer went to France…When Boyer signed the licensing agreement,
he thought he was buying an antiemetic (anti-vomiting drug). Having
virtually no research budget, Smith Kline was not prepared to undertake
extensive trials. Said Boyer, “Let’s get this thing on the market as an
antiemetic and we’ll worry about the rest of that stuff later.” The company
brought it out as “Thorazine.” (p.254)
Thorazine was the American trade name for chlorpromazine, which was, and still
is, marketed in Australia by the original French discoverers of the drug,
Rhone-Poulenc, as “Largactil”. Thousands of people in Australia, and millions of people
worldwide have been slowly crippled and killed by injections of this drug, often
given to them against their will for treatment of “schizophrenia” and “mania” in
public hospitals, clinics and prisons. This has occurred continuously over the past
forty years, and still goes on today, although there is widespread recognition that
far safer alternatives exist to injections, or ingestion of this drug. Chlorpromazine
is a direct neurotoxin, damaging the brain and nervous system from the time of
initial ingestion and producing a state of emotional and intellectual dullness, muscular
stiffness and pain, lethargy and drowsiness. More seriously, the drug, and related
“dopamine-blockers”, frequently cause Parkinsonism, characterised by tremor,
difficulty initiating movement (including that required for speech), depression and
fatigue. The worst of the drug’s neurotoxic effects, however, is probably the
chronic disease only known to occur through the use of this class of drug
(dopamine-blockers): tardive dyskinesia.
Tardive dyskinesia, which can develop following the use of any and all
dopamine-blockers, including Stelazine, the dopamine-blocker currently sold in Australia by
SmithKline Beecham, is incurable and usually worsens with time, being a direct
result of damage to the brain by constant dopamine-blockade. It is characterised by
bizarre, uncontrollable movements, predominantly affecting the head and face,
such as repeated protrusion of the tongue, grimaces of the face, puckering of the
lips and puffing of the cheeks. The spasms may also affect the upper and lower
limbs, accompanied by strange writhing movements of the hands, and difficulty
walking. It is difficult to conceive of a more stigmatising collection of movement
disorders; they give an appearance of “strange behaviour” to people whose
mental state may be quite normal. Suffering from tardive dyskinesia is truly to
experience a living hell.
Despite these well-recognised problems with chlorpromazine, Edward Shorter is
unqualified in his praise of the drug:
“Chlorpromazine initiated a revolution in psychiatry, comparable to the
introduction of penicillin in general medicine. While it did not cure the
diseases causing psychosis, it did abolish their cardinal symptoms so that
patients with underlying schizophrenia could lead relatively normal lives
and not be confined to institutions.” (p.255)
It is difficult to see how a drug which frequently causes a crippling state of
ill-health can be compared with antibiotics, which effect a permanent cure of
bacterial infections.
bacterial infections. This sort of blind support for anti-psychotics and other
psychiatric drugs is not uncommon, however, among academic promoters of
medical and psychiatric history and the history of the pharmaceutical industry.
These writers generally ignore biological warfare, chemical warfare or drug
warfare and the connection between the biotechnology and medical industries
(including the psychiatric industry) and the global war machine. References to
psychological warfare are also noticeably absent from modern psychiatric texts,
although this aspect of warfare has been central to every major conflict since
ancient times.
Chapter 14

GENETIC ENGINEERING AND EXPERIMENTAL
CHIMPANZEES
In the following chapters it will be shown that the scientific and technological
knowledge necessary to genetically engineer a virus to cause collapse of the
immune system in humans was in existence many years before the first reliable
reports of AIDS. Cancer-causing viruses in animals were, in fact, first described as
early as 1911. This was by an American physician and virologist, Francis Peyton
Rous, working at the Rockefeller Institute in New York. Rous, who graduated in
medicine at the Johns Hopkins University in Baltimore, Maryland, took an extract
from a cancerous chicken tumour and injected it into healthy chickens, producing
cancer in them (Daniel, 1987, p.147). Doubt as to whether the same thing could be
done in humans was resolved by the 1931 Puerto Rico Cancer Experiment, again
conducted by the Rockefeller Institute, this time under the supervision of Chief
Pathologist Cornelius Rhoads. During these experiments, Dr Rhoads infected
Puerto Ricans (people from the Caribbean island of Puerto Rico, which was
“annexed” by the USA in 1898), whom he regarded as the “dirtiest, laziest, most
degenerate and thievish race of men ever inhabiting this sphere”, with
“transplanted” cancer, killing at least thirteen people (Vankin and Whalen, 1995,
p.297). Rhoads was later placed in charge of two large chemical warfare projects
and rewarded with a seat on the Atomic Energy Commission in the 1950s.
With the discovery of the structure of DNA by James Watson and Francis Crick in
1953, the ability to genetically engineer viruses on a molecular level became
theoretically possible. Working at the Cavendish Laboratory at Cambridge
University, Watson and Crick integrated existing knowledge about the
molecule and worked out its physical and chemical structure, winning a Nobel
Prize for this achievement in 1962. Since then, increasing focus on genetics has
resulted in billions of dollars and countless hours being spent on the “human
genome project” and various aspects of genetic engineering.
Genetic engineering is inextricably connected with the ancient practice of selective
breeding to develop desired characteristics. Even though the term had not then
been coined, animal breeders and agriculturists were engaged in “macroscopic”
genetic engineering 5000 years ago, when particular wild species of plant and
animal were altered, through selective breeding, to maximise their value to the
humans that owned them. This value included aesthetic value, nutritive value,
commercial value and utility in hunting and war. Thus the eugenics-obsessed
breeders of different varieties of rose, dog, rabbit, horse and pigeon in the British
Royal Society and associated universities on both sides of the Atlantic, and the
“breeder’s clubs” and “planters’ societies” distributed around the British
Commonwealth, had the background knowledge and experience, and the
technological infrastructure necessary to augment their strategies of selective
breeding of plants, animals and humans with detailed knowledge of what was
occurring on a cellular and molecular level, when this became widely available in
the 1960s.
With the development of knowledge in the fields of medicine and genetics came
opportunities for more scientists to specialise in different subspecialty areas –
molecular genetics, plant genetics, invertebrate genetics, medical genetics,
psychiatric genetics and genetic engineering. Medical doctors and general
scientists who wished to study genetics had a choice of one or other sub-speciality
area – there was minimal integration, and little communication between different
sub-specialities and with other areas of science. When the importance of DNA to
all cellular life was discovered, many institutions and scientists focussed on
discovering new genes and patenting the knowledge if they thought it was of
potential commercial benefit. Vast amounts of money were poured into existing
and newly established genetics research establishments, with spectacular results –
the genetic defects involved in several inherited diseases were discovered (such as
haemophilia), and the underlying chromosomal abnormalities in others (such as
Down’s syndrome). Of relevance to our present discussion, a huge revolution
occurred, as a direct result of Watson and Crick’s discovery, in the fields of
microbiology and virology.
The invention of the light microscope and the first description of bacteria by the
Dutch amateur microscope builder Anton van Leeuwenhoek in the 17th century
heralded the birth of modern microbiology – the scientific study of microbes. Van
Leeuwenhoek, who had made his own simple hand-held microscope, sent drawings
of the “wee animalcules” he had observed to the British Royal Society in 1684.
These were later named bacteria – tiny single-celled organisms that cover the surface
of the earth, and even our own skin. Bacteria are truly ubiquitous – they are found
deep in the ocean and in the sulphurous waters of volcanic springs, they cover every
terrestrial surface and thrive even in the digestive tracts of insects, cows and human
beings. Bacteria and algae (single-celled ‘plants’) are the simplest forms of
cellular life. While some bacteria are harmless and others are necessary for health
(those in the large bowel help provide the body with vitamins, for example),
several types of bacteria can cause serious human infectious disease. The actual
term “micro-biology” could be translated as the microscopic study of life. Since
the earliest origins of the discipline, however, the scientific specialty of
“microbiology” has been centred on the study of microscopic infective agents –
also known as “microbes” or “germs”.
All known germs can be classified as either bacteria, fungi or viruses, and the
fields of bacteriology, mycology and virology are the respective disciplines that
specialise in their study. Bacteria, which are broadly classified as bacilli (rods) or
cocci (spheres), and the infectious spores of fungi are invisible to the naked eye
but can be easily observed under a microscope of moderate magnification. Viruses
are too small to be seen with even the most powerful light microscope. They are, in
fact, small strands of DNA or RNA that can only reproduce within other living
cells (by hijacking the cells’ machinery for virus replication). Increasing knowledge
about DNA, and more efficient techniques for working out the molecular structure
of genetic material, contributed massively to increased knowledge about the
molecular structure of viruses, as much as to that of human chromosomes and of
agricultural and livestock breeding stock.
Animal experimentation with a range of viruses developed in parallel with the
discovery of the genetic structure of viruses. Viruses known to cause disease in one
species were injected into different species in countless permutations and
combinations during the 1950s, 60s and 70s and continue to this day. Prior to this,
blood and infected tissue from humans and other species was injected into
hundreds of species of animal in an effort to discover whether or not they might be
“carriers” of different infections. The Rockefeller Institute was conducting such
experiments in the USA, Africa and South America as early as 1915, when
researching mysterious epidemics of yellow fever (a dangerous viral infection
which is carried by mosquitoes) in South and Central America. The Rockefeller
Foundation’s own publications indicate that blood and serum of humans who were
suffering from, or had recovered from various viral infections were injected into
experimental animals of all types, including chimpanzees, decades before the
discovery of the structure of DNA or of the molecular structure of viruses (which
was worked out in the early 1960s).
Monkeys and chimpanzees, because of their close physiological similarities to
humans, were in especially high demand for testing human viruses and toxins
(including alcohol and cigarette smoke). Chimpanzees were also used for such
purposes as “test crash dummies” and “space exploration”, for which the U.S.
military, in particular, developed several “primate breeding centers”. When such
practices as strapping chimpanzees into test cars to measure damage from high
velocity impact came to be widely recognised as ethically, socially and politically
unacceptable in the USA, where most such testing has occurred, these primate
centres redirected their chimpanzee and other primate sales towards medical
research institutions, including private research institutes and government-run
institutes.
In February 1999, Time magazine featured an extraordinary article titled “The First
Chimpanzee”. Authored by someone named “Frederic Golden”, the short article is
promoted in the “contents” page as a “health” story, within the “society and
science” section of the magazine. The contentious claim being presented was:
“Scientists trace AIDS back to chimps in western Africa”. For over a decade these
same scientists had been claiming that AIDS came from African green monkeys –
so what had prompted the change? Was this new claim credible? Who was making
it, and why?
The chimpanzee the Time magazine article referred to as “the first chimpanzee”
was called “Marilyn”, and she had died in 1985 at the young age of 26, after
“giving birth at least a dozen times”. Marilyn was a “breeder” – her babies were
taken from her and used for whatever purposes the Holloman Air Force Base in
New Mexico, which had been breeding chimpanzees for many years, decided.
These included rocketing them into space (the only ‘use’ other than nonspecific
“medical research” mentioned in Golden’s article), using them as “crash
dummies”, poisoning them with known toxins (chimps were fitted with gas masks
through which cigarette smoke was forced on them, and subjected to force-feeding
with alcohol through a tube into their stomach, in the 1950s, 60s and 70s), and
infecting them with viral and bacterial infections (sometimes to observe the effect
of disease, sometimes to test drugs and vaccines).
Nearly fifteen years later, scientists at the University of Alabama had been invited to test
old tissue samples “from the only chimp in the Airforce colony to have tested
positive for HIV”, according to Frederic Golden’s article in Time magazine –
Marilyn. Her tissue had been stored in a freezer at the National Cancer Institute
(located, it has been reported elsewhere, at Fort Detrick, near Frederick, Maryland,
the previous location of the U.S. Army’s main biological warfare centre, this having
been converted to the National Cancer Institute’s Frederick Cancer Research
Facility in 1969). The researchers from the University of Alabama took tissue
samples of Marilyn’s spleen and using, in the words of Golden, “the sort of genetic
wizardry unavailable during the animal’s lifetime”, had “set about amplifying,
sequencing and analysing Marilyn’s virus” (Golden, 1999, p.44).
Frederic Golden’s Time magazine article promises much more than it delivers:
“In this week’s Nature, researchers at the University of Alabama in
Birmingham report that Marilyn’s frozen tissue, carefully preserved all these
years, may have solved a pair of lingering medical mysteries: where the
dominant form of the AIDS virus originated in the animal world, and how it
made the deadly leap to humans. More than brilliant scientific detective
work, the Alabama research, if it turns out to be correct, could lead to new
treatments and possibly even a cure for a fatal disease that afflicts more than
35 million people around the globe.” (p.44)
The “brilliant scientific detective work” involves some leaps in logic that are
simply not supported by reliable evidence. The argument that has apparently
allowed scientists to “zero in on the source of the AIDS virus” reads as follows:
“Except in the rarest cases, chimps like the sooty mangabeys never
show AIDS-like symptoms. Even so, when the researchers compared the
viral DNA with the three known types of SIV (simian immunodeficiency
virus), they found it had a substantially different genetic makeup. And when
they compared Marilyn’s genetic makeup with that of other chimps, they
determined that she belonged to a different subspecies than the chimps that
harbor the other SIV strains; those kindred chimps live further east in the
equatorial African rain forest. More important, Marilyn’s virus closely
matched the three major groupings of the HIV-1 strains, whereas the other
simian viruses appeared only remotely like the human virus.
“To the researchers, this suggested that the chimp virus had mutated and
crossed over to humans on at least three separate occasions, each time
finding man a more congenial host than ape. The momentous leaps, Hahn
[Dr Beatrice Hahn, the University of Alabama researcher who led the research team,
which included her husband George Shaw] speculates, could have occurred
when hunters came into contact with infected blood while butchering chimps
for food, a common practice in Africa. As it happens, the first documented
case of AIDS goes back to 1959, when a man living in Kinshasa, just across
the Congo River from Gabon, home of Marilyn’s kin, died of the disease.”
The above quote is confusing from the very first sentence. Chimpanzees are apes
(as are humans), while sooty mangabeys are monkeys (simians). The initial
sentence should read, with commas:
“Except in the rarest cases chimps, like sooty mangabeys, never show
AIDS-like symptoms.”
Allowing for the typographical error, is this true? How common are “the rarest
cases” when chimpanzees do show such symptoms? It has been publicly admitted
that chimpanzees have been used for medical experimentation with a range of
viruses over several decades. Today chimpanzees continue to be injected with HIV,
hepatitis B and other human viruses in medical (animal) research institutions in the
USA, Europe and elsewhere. Demonstrating the presence of HIV in the tissues of a
chimpanzee that had lived and died in captivity at the hands of the U.S. Airforce
and military research establishment is hardly proof that the virus originated in the
Congo. It does tell us, however, that the U.S. National Institutes of Health’s
National Cancer Institute is in possession of HIV-infected tissue samples and also
the expertise and technological means of “amplifying, sequencing and analysing”
killer viruses. In fact, this capability is held by several universities and virology
research institutions in the USA and a smaller number in other parts of the world.
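The comparison Golden describes (grouping viral isolates by how closely their gene sequences match) can be sketched in a few lines of Python. This is an illustrative toy under loud assumptions: the sequence fragments below are invented placeholders, not real SIV or HIV data, and actual phylogenetic studies use alignment and tree-building software rather than raw percent identity.

```python
# Toy illustration only: crude pairwise identity between short,
# pre-aligned sequence fragments. The fragments are invented
# placeholders standing in for viral isolates, not real data.

def identity(a: str, b: str) -> float:
    """Fraction of positions at which two pre-aligned sequences agree."""
    assert len(a) == len(b), "sequences must be the same length"
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

# Hypothetical fragments (invented for illustration).
isolates = {
    "isolate_A": "ATGGCGTACGTT",
    "isolate_B": "ATGGCGTACGTA",  # differs from isolate_A at one position
    "isolate_C": "TTACGGCATGCC",  # largely unrelated to the other two
}

# Compare every pair and report percent identity.
names = sorted(isolates)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        pct = identity(isolates[a], isolates[b]) * 100
        print(f"{a} vs {b}: {pct:.1f}% identity")
```

On this invented data, isolate_A and isolate_B group together at roughly 92% identity while isolate_C stands apart; the published analyses apply the same basic idea, with far more rigour, to whole viral genomes.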
The Principal Office of Time Incorporated, the publisher of Time magazine, is
located in the Rockefeller Center in New York. The Rockefeller Institute was
involved in the first discovery of cancer-causing viruses in animals, and the Puerto
Rico Cancer Experiments. Several Rockefeller-owned and controlled institutions
have had the scientific and technological capacity to engineer HIV from previously
known animal viruses since such technology was first developed. Is Frederic
Golden’s article influenced by these facts in any way? Is it true that “the first
documented case of AIDS goes back to 1959”? Golden claims that “a man living in
Kinshasa” died of AIDS in 1959, although the disease was only described for the
first time in 1981 (initially as “Gay Related Immune Deficiency” or GRID) and
the virus HIV only identified in 1983-84. The diagnosis of the “man from Kinshasa” was a
retrospective one based on very dubious evidence, as will be seen.
The Time magazine article is misleadingly titled “The First Chimpanzee” and
draws the reader to its content with an endearing photograph of the face of a baby
chimpanzee gazing intently at the camera. A small map captioned “Where the
Chimps Are” shows the apparent location, marked in red, of wild chimpanzees in
West Africa – an area including the whole of Gabon and Equatorial Guinea,
according to the map, extending into northern parts of the Republic of Congo and
the southern regions of the Central African Republic and Cameroon. Golden claims
that chimpanzees are “increasingly threatened in the wild”, blaming not the
relentless deforestation of the region but “butchering chimps for food” which is
claimed as “a common practice in Africa”. The possibility of demonising Africans
as “chimp murderers” responsible for the extinction of our closest primate relatives
is obvious. This “butchering” claim is unjustifiable, however, given that
chimpanzees are not commonly hunted or eaten even in the limited areas of West
Africa where they still survive. If concerted efforts are made to save their
remaining forest habitat there is no reason why chimpanzees will not survive, and
indeed thrive, in the wild. Golden, however, uncritically presents the convoluted
arguments of the University of Alabama researchers that by “improving their
status” through becoming valuable “medical subjects”, chimps might be saved
from extinction (but not imprisonment and torture, it seems):
“The findings are more than a historical footnote in the battle against
AIDS. Because chimps are much closer to humans than sooty mangabeys –
they share more than 98% of our genome – animals infected with an HIV-like
virus could prove to be ideal test subjects for establishing what it is
about them that, as Dr. Anthony Fauci, director of the National Institute of
Allergy and Infectious Diseases, puts it, ‘allows the chimp to become
infected but not get sick.’
“The chimps also stand to gain. Increasingly threatened in the wild, they
may find new status – and protection – as medical subjects. Says Hahn, ‘All
they’d need to do is give a pint of blood every so often.’ That certainly beats
being rocketed into space – or extinction.”
I am still haunted by a happy face I saw on television a few months ago. It was the
smiling face of a chimpanzee. He was being visited by a friend he had not seen for
16 years, and was obviously very happy – at first. The man who visited him had
taught the chimpanzee sign language, and had not expected the victim of cruel
medical experimentation to recognise him. The chimpanzee did, and was obviously
thrilled to see him. He grinned from ear to ear, and started signing frantically –
calling the man by an old nick-name, playing games and ‘fooling around’ from
behind the bars of his prison. He extended his hand out of the cage and held that of
the friend he trusted – the friend who had taught him to speak with human sign
language, even though no one had thought it possible. They thought the chimp
would not be able to learn sign language, not because chimpanzees cannot learn to
sign (they can), but because, as an infant, the two halves of this chimpanzee’s brain
had been surgically severed – because, said the program, it was thought that he
‘may’ have had epilepsy. The program also admitted that researchers were very
interested in split-brain experiments at the time.
This chimpanzee had been used for medical experiments all his life – for almost
two decades he had been subjected to injections, infections, operations and the
most cruel abuses. This showed, as plain as day, on his face. Yet the researchers
who had injected this unfortunate chimpanzee with the hepatitis B virus and
hundreds of others with Hepatitis B and HIV were unconvinced that animals have
emotions or that animals suffer. They remained unconvinced because they chose
not to consider the matter deeply, or look at evidence they did not want to believe.
These same researchers claimed that, although unpleasant, medical
experimentation using chimpanzees is necessary and valuable. This is not true. The
crime against our closest primate relatives is not a medical necessity or, in fact, of
scientific value. It is also morally reprehensible and a truly repulsive legacy of the
time when chimpanzees, gorillas and other apes were regarded as “just animals” –
unthinking automatons that could be tortured and killed at the will of scientists and
industrialists. Today the claim that apes do not suffer is not taken seriously and not
usually made. The argument that is consistently used instead is that the ‘sacrifice’
of these animals is a medical necessity – to test out experimental vaccines, in
particular. This is the central claim, and it is false. The claim that chimpanzees
must be used because humans cannot, is also false. In fact, human experimentation
with trial vaccines is even more commonplace than that involving chimpanzees.
The chimpanzee whose old friend came to visit him had been deliberately infected,
not with a trial vaccine, but with a live virus. The television program, titled
Chimpanzees on Death Row, revealed that over 200 chimpanzees have been
infected with the human immunodeficiency virus (HIV) – mainly by injecting them
with blood from people suffering from AIDS. Hundreds more have been infected
with various strains of the hepatitis virus. The progress of the diseases created by
researchers is monitored by repeated blood tests and liver biopsies. The
experimental animals, which are now presumed infectious, are held in solitary
confinement while they slowly die – of man-made diseases and emotional
deprivation. This may take many years. Some experimental chimpanzees survive
for 40 or 50 years.
Infecting chimpanzees and other creatures with human viruses carries a serious risk
of increasing the range of mutant strains of killer viruses. Once these have become
established in an experimental animal reservoir they can escape into the human
population or the wild population of the species concerned (or other species).
Injecting animals with viruses has been done in a grossly irresponsible way over
the past 50 years, and we have seen, along with this mindless obsession with
creating human diseases in experimental animals, the emergence of several new
viral plagues – notably that of AIDS (but also Marburg, Ebola, Lassa viruses and
many others). Where are these “new viruses” appearing from – the jungles of the
Third World, or the laboratories of the First World?
The disturbing television program Chimpanzees on Death Row did not raise this
concern, nor did it note the relationship between the demographics of hepatitis B
vaccination trials (in humans) and the first appearance of HIV infection and AIDS
in male homosexuals in the United States (in the early 1980s). It was, instead,
focused on revealing the suffering of the experimental subjects, and this was
achieved poignantly. The old footage of struggling chimpanzees being strapped
into “crash pods” by the U.S. Airforce (which had established a ‘breeding colony’
for this reason – the products of which were subsequently sold for medical
research) is deeply etched in my mind. The grimacing face and writhing head of a
chimpanzee in obvious pain and electrical wires attached to his brain are difficult
to forget. The hardest thing to forget, though, will be the face of the tortured
chimpanzee whose old friend had betrayed him. After visiting the chimpanzee, the
man who had taught him sign language turned a blind eye to his pleas for freedom
and told him he had to go. The chimpanzee did not look surprised – just very, very
sad.
The dehumanization that occurs when our fellow creatures are seen as “legitimate
experimental subjects” is familiar to me from personal experience. Without such
dehumanization I would not have been able to dissect and experiment with
different injectable drugs on a living (anaesthetised) cat and then kill it with a
“lethobarb injection”, as I, and all the other medical students in my year, were
obliged to do when we studied medicine at the University of Queensland in the
early 1980s. This dehumanization extended to my interactions with “patients”
when, as a medical student, we were sent onto the hospital wards with a list of
patients who had been selected by our tutors as having “interesting signs”. For
example we might be told to examine the liver of Mr Jones in bed 13, or the heart
of Mrs Smith in bed 18, the kidneys of Mr Brown in bed 25 and the feet of Mrs
White. As a rule these patients were generous and understanding towards the
white-coated medical students who prodded and poked them, and I did try to be
polite and friendly; however, I did not really try to understand the whole of the person
whose liver I was examining or heart I was listening to with my stethoscope.
When I worked as a junior resident doctor in the psychiatric unit of the Royal
Brisbane Hospital in 1984 the patients, especially those in the locked ward, seemed
less human to me than “normal” people. I assumed, as I was taught to, that
psychiatric patients tend to overreact, imagine things, exaggerate and lie. I believed
that those who claimed they were sane and did not need to be in hospital were
suffering from “lack of insight”. I thought the glazed expression on the faces of
patients on “major tranquillisers” (as modern-day “antipsychotics” were then
called) was caused by “schizophrenia” rather than the drugs the hospital was
forcing on them. I did not learn the error of my thinking and the extent of my own
medical dehumanization, intellectual blinkering, and emotional disconnection from
the suffering of others until I myself had been subjected to such treatment. I
shuddered when the presenters of the program said that the men who ran the
chimpanzee research program were considering “euthanasia as a population control
measure” for (infected) chimpanzees. I shuddered because, unlike most doctors, I
have made a careful study of negative eugenics, and realise how the term
“euthanasia” has been used as a euphemism in the past. The past doctrines of
eugenics continue to pose a grave danger to modern genetics and the application of
virology, immunology and biotechnology knowledge. This danger is a huge threat
to humanity and all large mammals. Infecting chimpanzees is not far from
infecting humans. Those who think it permissible to give chimpanzees AIDS,
hepatitis, and other infections and then “euthanase” them to “control population
numbers” may not see it as harmful to “euthanase” Africans and Asians because
they fear the world is overpopulated, or “euthanase” homosexuals and ‘drug
addicts’ because they fear “degeneracy” – or because they despise those that have
been demonized.
One way that medicine has become dehumanized is through the reduction of
human beings to labels and statistics. Often the statistics that are compiled are
themselves derived from labels – the “incidence” of different diagnoses is
measured and compared in different parts of the world, different parts of an
individual country and different social, racial and ‘ethnic’ groups within a country
or group of countries. This is an important aspect of global health monitoring and
epidemiology; however, it can easily lead to dehumanization and lack of empathy
by those institutions and doctors that become more familiar with people in terms of
statistics than as individuals. In the case of mental health statistics, from which
predictions have been made by the UN that “mental illness” will increase
substantially over the next 20 years, these are usually compiled according to the
number of people who are given psychiatric labels by ‘mental health’ professionals
or are given psychiatric treatment by medical doctors in various clinics and
hospitals. How fallacious such statistics can be will be explored in subsequent
chapters, and the labels themselves were analysed in detail in my previous book
The Politics of Schizophrenia. The modern global strategies regarding “mental
health”, “harm minimization” (from drugs and venereal diseases) and “safer sex”
are fundamentally connected with each other – and all became “global” during the
Cold War when more and more medical and social programs based on
dehumanised statistical analyses and misapplied bell-curves were developed. These
were constructed on the foundations of the eugenics movement of the early 20th
century.
In the next chapter the cause of AIDS will be explored further, first looking at the
epidemic in Africa, where it has claimed the most lives so far. The Human
Immunodeficiency Virus (HIV) is said to have originated in Africa – specifically in
Central Africa in the region of the Congo (Zaire) river, and the AIDS epidemic
exploded in this part of the continent in the mid-1980s. Since then the epidemic has
spread to all parts of Africa, but especially southern Africa, where millions of
people have died of AIDS. In several southern African countries up to 20% of the
population is said to be infected with HIV, the large majority being children, young
women and young men. It has also been claimed that over 1500 people are
becoming infected with HIV every day in South Africa. This is truly a medical
emergency.
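Taking the reported figures at face value, their scale can be checked with simple arithmetic; the population figure below is an assumed round number used only for illustration, not a census count.

```python
# Back-of-envelope check of the reported rates. The inputs are the
# text's claims plus one assumed round population figure, for scale only.

new_infections_per_day = 1500          # reported daily rate in South Africa
annual_new_infections = new_infections_per_day * 365
print(annual_new_infections)           # 547500 new infections per year

population = 45_000_000                # assumed round figure, illustration only
prevalence = 0.20                      # "up to 20%" as reported
print(int(population * prevalence))    # 9000000 people infected
```

Even as rough arithmetic, the quoted daily rate implies well over half a million new infections accruing each year.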
Chapter 15
BIOLOGICAL WARFARE IN CENTRAL AFRICA?
The claim, by University of Alabama and National Cancer Institute researchers and
Time magazine, that HIV originated in and spread to humans from wild
chimpanzees in the Congo constitutes a significant revision of the “official” theory
of the 1980s and 1990s – that Human Immunodeficiency Virus was probably
endemic in African monkeys, probably African Green Monkeys (Cercopithecus
aethiops), and probably entered the human population in Central Africa through a
monkey-bite or some other blood to blood contact between Africans and monkeys.
After entering the human reservoir, the official theory went, it somehow spread to
the United States, targeting (mainly white) male homosexuals in New York, Los
Angeles and elsewhere in California, and causing the epidemic that was initially termed “GRID”
(Gay Related Immune Deficiency) and later termed AIDS (Acquired Immune
Deficiency Syndrome). Simultaneously, the same virus, according to the official
theory, was causing a much more serious epidemic in the “Third World”, and
especially in Central, Eastern and Southern Africa which were being collectively
referred to as “sub-Saharan Africa”. This was the theory I accepted for many years
– until I looked at the epidemiology and history of the AIDS epidemic more
closely.
The theory that Human Immunodeficiency Virus was introduced into humans via
contaminated vaccines does not necessarily imply conspiracy or intentional
genocide. Some investigators have suggested that, because of the epidemiology of
the AIDS epidemic in Africa, the virus may have been unintentionally released in
polio or smallpox immunization campaigns during the 1960s and 1970s. A few,
notably the University of California virologist and “oncogene” (cancer-causing
gene) researcher Peter Duesberg, have claimed that AIDS is not caused by the
Human Immunodeficiency Virus at all. Professor Duesberg’s theories and those of
various “conspiracy theorists” were debunked in the science writer Laurie Garrett’s
1994 best-seller The Coming Plague. Having researched The Coming Plague as a
Fellow at the Harvard School of Public Health, Garrett presents the American
medical establishment’s claim that HIV is the result of accidental cross-infection
from simian (monkey) viruses that have been in Africa for a very long time, as the
most credible explanation for the origin of the virus. In support of this theory she
quotes Joseph McCormick, head of “special pathogens” at the United States
Centers for Disease Control, and self-proclaimed “virus hunter”, as saying, in
1987:
“Simian viruses have evolved in simians in parallel with human
viruses, and the [HIV] virus in humans has been around for a very long time.
For quite a long time, I believe, in Africa. And I believe a whole family of
these viruses …have co-evolved.” (p. 384)
Laurie Garrett specifically ridicules the theory that HIV was created by the US
military or CIA, mentioning the “anti-vivisectionist Dr Robert Strecker, who
gained a large following in 1987 by again claiming, on the basis of a supposed
BLV (bovine leukaemia virus) connection, that the CIA made the AIDS virus”
(Garrett, 1994, p.382). Dr Strecker’s accusation followed the first public accusation
that HIV was the result of genetic engineering by a senior “Western” academic.
This was in 1985 and made by the British physician Dr John Seale, who stated,
with certainty, that HIV was the deliberate outcome of biological weaponry
experiments conducted at Fort Detrick, in Frederick, Maryland, by the U.S. Army.
Garrett discredits Dr Seale’s accusation by claiming that it was based on an earlier
claim by the East German scientist Jacob Segal of Humboldt University that was
subsequently propagated in “developing countries” by the KGB (the latter being
the 1987 explanation of the U.S. State Department in an official denial of the
accusation that the CIA had developed and released HIV as a biological weapon):
“As evidence for his assertions Seale cited the work of Soviet scientist
S.Drozdov of the Soviet Academy of Medical Sciences in Moscow. Drozdov
and other Soviet scientists were, in turn, influenced by retired East Berlin
scientist Jacob Segal, of Humboldt University. Segal wrote a report, read
throughout Eastern Europe, that claimed the AIDS virus was made at Fort
Detrick in 1977 from a deliberate mixture of visna [virus] and HTLV-I [the
first recognised strain of human T-lymphocyte virus]. Though it had been the
subject of discreet discussion inside the Soviet bloc for over a year, the Segal
report was first publicly distributed at the 1986 Summit of the Nonaligned
Movement, which convened in September in Harare, Zimbabwe. Over
subsequent months the Segal and Seale reports got wide international play,
particularly in developing countries.” (p.382)
In Australia, the Segal and Seale reports had very little “play” – in neither the
publicly-owned nor the privately-owned mass media was the biowarfare accusation
publicised. The fact that the American military stood accused by several scientists,
in several nations (including the USA) of engineering, experimenting with and
disseminating killer-viruses was not taken seriously by politicians or scientists in
this country; rather, it was regarded as a “taboo” subject, and raising it was treated
as a sign of “paranoia”, of “conspiratorial thinking” and “delusions of control”. The medical
postgraduate and undergraduate textbooks in Australia, mostly imported from the
USA and UK, continued to present the “official story” about the origin of AIDS, or,
more often, ignore it completely while discussing, in much detail, the symptoms,
signs, complications, and drug treatment of those suffering from the disease.
Prevention strategies and programs continued, and have continued to this day, to be
based on the assumption that HIV and AIDS are not the creation of American or
European biological warfare programs. Is this a safe assumption?
The British physician John Seale would not have suspected Fort Detrick scientists
merely because the “Eastern bloc” made this accusation. It would have been
relatively common knowledge among older doctors in England (and those with
longer memories) that Fort Detrick used to house the main American
Biological Warfare laboratory, changing its name to the Frederick Cancer
Research Facility of the National Cancer Institute during the Vietnam war (when
there was growing concern in the USA about chemical and biological weapons).
This name change occurred in 1969 (Vankin, Whalen, 1997), three years before the
international treaty banning the development, stockpiling and use of biological and
toxin weapons was signed by most members of the United Nations, including the
USA, UK and Australia, in 1972. According to this new International Law, which
supposedly came into force in 1975, biological and toxin weapons were defined as
“weapons of mass-destruction” and were banned: existing stockpiles were to be
destroyed and all research and development programs abandoned.
It is obvious, though, that it is easier for an institution to change its name than to
change what it does. To achieve the latter the entire infrastructure needs to be
changed. Programs for staff training and expensive equipment become redundant,
or even illegal. Changing what an institution does requires new goals and
ambitions, new ideas, and a whole new culture. A simple name change is cheaper,
and remarkably effective at fooling the public.
In the 1980s concerns were raised by several researchers about the disproportionate
number of homosexual men in the United States of America who later developed
HIV infections and AIDS after being vaccinated with an experimental hepatitis B
vaccine in the late 1970s and early 1980s. Dr Alan Cantwell refers to this in a 1998
article titled, “AIDS: a doctor’s note on the man-made theory”:
“Conveniently lost in the history of AIDS is the gay Hepatitis-B
vaccine experiment that immediately preceded the decimation of gay
Americans. A “cohort” of over a thousand young gays was injected with the
vaccine at the New York Blood Center in Manhattan during the period
November 1978 to October 1979. Similar gay experiments were conducted
in San Francisco, Los Angeles, Denver, St.Louis, and Chicago, beginning in
1980. The AIDS epidemic broke out shortly thereafter.” (p.25)
More detail was given by R.Ayana in the 1988 publication “AIDS: The Real
Story”:
“ “The AIDS virus began to appear in homosexuals around 1979. This
was immediately following tests of the first hepatitis vaccine,” says Dr J.
Anthony Morriss, a leading virologist who has worked with the NIH, Walter
Reed Hospital, and for 35 years with the Food & Drug Administration in
connection with research on vaccines for influenza and other respiratory
diseases.
“The hepatitis-B vaccine was produced using infected blood taken
primarily from homosexual males. Information on HIV in the West prior to
the early 1980s comes from the detection of antibodies in blood samples
stored as part of a San Francisco study of hepatitis B among homosexual
men.
“The first tests of the vaccine were conducted under unusual
circumstances…Only homosexual males between 20 and 40 who were not
monogamous could participate in the tests of the vaccine, which took place
principally in San Francisco and New York. Soon afterward, AIDS was first
detected among gay males in these areas.
“The Centres for Disease Control (CDC) reported in 1981 that 4% of
those receiving the hepatitis vaccine were AIDS-infected. This was prior to
the advent of testing kits for HIV. In 1984 the CDC reported that 60% of
those involved in the hepatitis-B program had developed AIDS.”
Dr Cantwell, in a more recent article (June, 1999) titled “Chimps, Conspiracies and
Killer Viruses”, wrote:
“Beginning in the Fall of 1978 (around the time HIV was ‘introduced’
into the gay community), thousands of homosexuals were injected in New
York City as part of the experimental hepatitis B vaccine program. In 1979
the first few cases of AIDS (then known as “gay-related immune deficiency
syndrome”) showed up in Manhattan.
“During the years 1980-1981, similar vaccine experiments were
conducted in Los Angeles, San Francisco, Denver, Chicago and St.Louis. In
the Fall of 1980 the first West Coast case of AIDS appeared in a young man
from San Francisco.
“By 1980 twenty percent of the Manhattan men in the experiment had
turned HIV-positive…”
He continues:
“How could HIV [have] been seeded into gays during vaccine experiments in the
late 1970s when the virus was supposedly unknown and unavailable to
scientists? Until recently it was claimed that the earliest HIV-positive blood
specimens all traced back to gay men in the hepatitis B experiment at the
New York Blood Center. (This fact in itself should give some credence to the
relationship between the experiment and the outbreak of AIDS.)
“However, evidence provided by Joseph McCormick in Level 4: Virus
Hunters of the CDC (Turner Publishing, 1996) suggests HIV-contaminated
blood was already in the virology research community before the gay
experiments began. McCormick, a former chief of the CDC’s (Centre for
Disease Control) legendary “hot zone” laboratory, claims that hundreds of
vials of African serum were collected and airshipped to the CDC (and
presumably also to biowarfare and cancer virus laboratories) for research
purposes in 1976. These blood specimens were collected from natives in a
remote northern area of Zaire (now known as the Democratic Republic of
the Congo) who were exposed to the mysterious African Ebola virus
outbreak (yet another “emerging virus” of dubious origin).
“A decade later when these vials were retested for HIV, it was
discovered that 5 out of 600 samples were positive for HIV, and from one
specimen the virus was actually cultured. Thus, two years before the
hepatitis experiment began, HIV-infected African blood was in the hands of
biowarfare scientists and animal cancer virus researchers.”
Dr Cantwell assumes that virally infected serum from Africa being in the hands of
the CDC in the 1970s meant that it was also in the hands of “biowarfare and cancer
virus laboratories”. Is this a reasonable assumption? While cancer virus
laboratories certainly exist in the US and elsewhere in the “West”, biowarfare
laboratories do not officially exist in the USA, Britain or Europe. They were
outlawed in 1975, when the 1972 United Nations treaty became International Law,
having been ratified by over 140 nations, including the USA, UK and Australia. A
few years before this the “Army Biological Warfare Laboratory” at Fort Detrick (in
Frederick, Maryland) was renamed the “National Cancer Institute’s Frederick
Cancer Research Facility”. Did the thriving American biological warfare industry
make a dramatic change from disease creation in “the enemy” to disease
prevention and cure for everyone, regardless of politics, race, colour, class or
religion? If so, when did this happen?
Other researchers and commentators have described a shift in biowarfare from
official military departments in the USA to private universities, laboratories and
research institutions which received military contracts and provided the military
with advice and expertise on virology (and medicine generally), protection against
biowarfare and chemical warfare, and prevention of disease in combat. This is
certainly the case; however, the advice is said to be intended for protection against
biological and chemical warfare. Thus the soldiers who were sent to the Persian
Gulf in the 1990s were given experimental vaccines against anthrax and botulinum
toxin, which were said to be in Saddam Hussein’s biological warfare armoury (as
was admitted by General Colin Powell in his autobiography). Aside from the
matter of HIV and AIDS, it is easy to recognise problems in the advice the American
military has been given by its medical advisers (and itself given) from the recently
publicised “Gulf War Syndrome” and the many problems affecting soldiers who
returned from Vietnam (including psychological problems, drug addiction and the
effects of poisoning by toxins such as Agent Orange). If Iraq was limiting its
biological warfare research to anthrax and botulism, by the way, its science was far
behind the times. These agents were known about over a hundred years ago. The
range of potential biowarfare agents has grown considerably since then.
Dr Joseph McCormick, an insider at the CDC and one of the key collectors of
infected blood specimens for the US government and CDC in the 1970s, makes no
admission about American or European biological warfare in Level 4: Virus
Hunters of the CDC, although he does make a single reference to interest in
haemorrhagic fevers as agents for biological warfare. This reference implicates
Russian science but not that of the Americans:
“…China and the former Soviet Union have always had a keen interest
in hemorrhagic fevers. We know that the Soviet military had established a
major experimental program to investigate these diseases. But their interest
in the viruses was not necessarily altruistic. More ominously, they may have
been motivated by the prospect of developing CCHF and other diseases for
use in biological warfare.” (p.313)
CCHF is an acronym for Crimean Congo Haemorrhagic Fever, one of several
known viral haemorrhagic fevers, others including Lassa Fever, Ebola Virus and
Marburg virus. They cause fever and diarrhoea that can become so bloody as to
cause fatal haemorrhage. The infections can also cause encephalitis with loss
of consciousness, convulsions, coma and death. Most physicians in Australia,
including myself, have never seen a case of haemorrhagic fever but it sounds like a
horrific illness. These infections are frequently fatal, and can also cause permanent
disability, mainly through brain damage. Interestingly, haemorrhagic fevers
apparently came to the attention of the American military in the 1950s, when
thousands of American soldiers in Korea developed “Seoul Hantaan” during the
Korean War (Garrett, 1994, p.22). Seoul Hantaan was a viral haemorrhagic fever,
similar to Lassa, Marburg and Ebola fevers, which, like these infections, was
frequently fatal.
According to McCormick, he pioneered the use of the anti-viral drug ribavirin for
acute haemorrhagic fevers in Africa (where most of the deaths from haemorrhagic
fevers occur) after unsuccessful experiments with plasma from recovered patients.
He found intravenous ribavirin particularly effective in the treatment of early Lassa
Fever, although McCormick admits that the drug was not affordable for most
Africans.
In reading Joseph McCormick’s ‘adventures’ and those of his wife Susan Fisher-Hoch (who co-authored Level 4: Virus Hunters of the CDC), it becomes evident that
the American military has also had an active interest in haemorrhagic fevers, and
has collaborated regularly with the CDC. Both organizations have been collecting
blood specimens and viruses from people around the world, including Africa, since
the 1950s and before. Prior to that, a range of “private laboratories”, notably those
sponsored by the Rockefeller Foundation, had been collecting blood and other
human and animal tissue specimens since the early 1900s. Senior staff from the
CDC, and various private American laboratories and elite universities have
subsequently been employed in the USAMRIID (United States Army Medical
Research Institute of Infectious Diseases, based, again, at Fort Detrick, Maryland).
In fact, it was from experiments on infected monkeys at the US military research
centre that McCormick got the idea of trying out ribavirin on humans with Lassa
Fever:
“Karl [Dr Karl Johnson, then chief of Special Pathogens at CDC] told me
that he was already testing the drug against Lassa virus in tissue cultures,
adding that similar experiments were being conducted by Peter Jahrling at
USAMRIID. Peter was infecting monkeys with Lassa virus and treating
them with ribavirin. There was good safety data on the drug, including
human use data, because it had already been used successfully as a treatment
for acute viral pneumonia in infants [especially pneumonia caused by
respiratory syncytial virus, or RSV, a virus that is coincidentally a specialty
of the Harvard-trained MBC head Professor John Mills].” (p.99)
Karl Johnson was head of “Special Pathogens” at the CDC (and McCormick’s
boss) during the late 1970s, when the first cases of AIDS were reported (but before
Robert Gallo’s announcement that he had discovered the HIV virus), and went
from there to work at the USAMRIID in 1982, leaving three years later, shortly
after Gallo’s announcement (in 1984). It was Johnson who first sent McCormick to
Africa in 1976 in search of Ebola Virus.
Level 4: Virus Hunters of the CDC (1996) is a personal account of McCormick’s
adventures in search of killer viruses in Africa during the 1970s and 1980s and
gives an indication of some of the unofficial and official alliances that occur in the
dangerous world of viral epidemic research. McCormick tells his story in graphic
detail, concentrating more on presenting an entertaining narrative than providing
solutions to the problem of viral plagues. He portrays himself as the intrepid hero:
venturing into darkest Africa in search of the source of Lassa Fever, Ebola
virus, Marburg virus and other haemorrhagic fevers. These are potentially lethal
viruses, but McCormick, in telling a dramatic narrative, exaggerates the drama.
Writing about pricking his finger with a possibly infected needle, he writes:
“How could I have been so careless? I had bled over three hundred
victims of Lassa Fever and never come close to pricking myself. My first
instinct was to pull off my glove and cry out, but what good would that have
done? Though I rinsed off the glove with disinfectant, I knew the damage
had been done. So the only thing I could do was finish bleeding the woman
and continue with my work. I can’t say that I was calm, but I wasn’t in a
panic, either. Still, I had a nauseating feeling. I knew, more than most
people, that when you get stuck by a potentially contaminated needle in the
midst of a deadly epidemic – like the one I’d earlier investigated in Zaire –
the odds for survival aren’t very good.
“Actually, I’d have to say that the fatality rate was about 100 percent.”
(p.17)
This is not actually true, and Dr McCormick should know it. A
needle-prick with even a definitely infected needle does not guarantee infection:
the risk of this is small even with the most deadly viruses. Nevertheless, Ebola
virus, which potentially contaminated the needle with which he accidentally
pricked himself, is a horrific micro-organism. The 1980 edition of Harrison’s
Principles of Internal Medicine describes the terrible mortality that this viral
infection caused in early cases (all from Central Africa):
“Between July and November 1976 simultaneous outbreaks of an acute
febrile hemorrhagic disease occurred in Southern Sudan and Northern Zaire.
“Secondary and tertiary” spread of infection, particularly among hospital
staff, was noted. In the Sudan over 300 cases with 151 deaths and in Zaire
237 cases with 211 fatalities were reported. In one Sudanese hospital, 76
members of a staff of 230 were infected and 41 died. The virus isolated from
patients in the Sudan and Zaire was morphologically similar to the Marburg
agent but was antigenically distinct. The name Ebola virus, after the river in
Zaire located near the epidemic, has been proposed.” (p.822)
The textbook describes the typical course of an Ebola virus infection:
“Clinically, the disease is similar to Marburg virus disease. The
incubation period ranges from 4 to 6 days. Patients usually present on the
fifth day of illness with a history of abrupt onset of headache, malaise,
myalgias [muscle pains], high fever, diarrhea, abdominal pain, dehydration,
and lethargy. Pleuritic chest pain, a dry hacking cough and a pronounced
pharyngitis [throat inflammation] were also noted. A maculopapular eruption
[skin rash] develops between days 5 to 7 of illness. On black skins the rash
is often faint and not recognised until desquamation [loss of surface layers of
skin] occurs. Hematemesis [vomiting blood], melena [passage of blood in
faeces], and bleeding from the nose, gums and vagina are common. Abortion
and massive metrorrhagia [uterine haemorrhage] was a frequent
complication among pregnant women. Death usually occurs in the second
week of illness and is preceded by severe blood loss and shock.” (p.823)
The illness caused by Ebola Virus is similar to that of Marburg virus, named after
the German town where the first cases of the new haemorrhagic viral infection
occurred, in 1967. All the initial cases of human infection were in people working
in German and Yugoslavian medical laboratories where they were exposed to
African Green Monkeys, according to Harrison’s Principles of Internal Medicine
(1980). The textbook gives an overview of the epidemiological basis on which the
monkeys were implicated:
“The initial outbreaks affected 31 patients in Marburg and Frankfurt,
Germany, and Belgrade, Yugoslavia, and was epidemiologically linked to
monkeys imported from the same source in Uganda. Virus was isolated from
the blood and tissue of these monkeys. Of the 25 primary infections, there
were seven deaths. Six secondary cases, involving two physicians, one
nurse, a postmortem attendant, and the wife of a veterinarian occurred.
Person-to-person transmission was felt to take place via accidental needle
sticks or abrasions, although respiratory and conjunctival infection could not
be ruled out. The wife of one patient developed Marburg virus disease at the
onset of his illness; Marburg virus was demonstrated in the semen of the
original patient, despite the presence of circulating antibody, and this
secondary case is believed to have been acquired through sexual intercourse.
Subsequent investigations in the Lake Kyoga region of Uganda where the
monkeys originated revealed no unusual illnesses or death among primates
in the area [which refutes the claim that it arose from wild monkeys].
Complement fixation antibodies were demonstrated in 36 percent of C.
aethiops [green monkeys] trapped in the region, and antibody was detected
in three monkey trappers.” (p.822)
There are other possibilities worth considering for the origin of Marburg virus,
Ebola virus and HIV. One is that the viruses arose through cross-infection between
experimental animals and the humans who were infecting them with various viruses
to study the effects of infection in different species (including green monkeys) and
testing drugs on these animals before using them on humans. Another possibility is
that the infections were transmitted to health care workers and laboratory workers
via experimental vaccines or infected vaccines (laboratory workers and health care
workers are routinely immunised as ‘at risk groups’). These vaccines could have
been unintentionally contaminated with animal viruses, or they could have been
intentional trials on the researchers. While the latter might seem improbable,
admissions by the US military that hundreds of ‘open air germ tests’ were done on
civilian populations of cities in the USA in the 1950s and 60s make it clear that
morality plays little role in “experimental ethics” as far as the US military research
establishment is concerned, and the record of German academics during the
Second World War is even worse, in this regard. It is also clear that the US military
have not hesitated to deliberately infect monkeys with haemorrhagic fever viruses.
Susan Fisher-Hoch (CDC “special pathogens” boss Joe McCormick’s wife) wrote
in Level 4 Virus Hunters of the CDC:
“It is true that experiments have been conducted at USAMRIID to show
that aerosol spread of several hemorrhagic fever viruses is possible. But
doing so requires the use of a muzzle system attached to the face of guinea
pigs and monkeys, which delivers a large dose under pressure.” (p.284)
Lassa fever, for which Joe McCormick experimented with ribavirin in the 1970s, is
a less dangerous haemorrhagic fever than that caused by Ebola virus or Marburg
virus, and most people who are infected with Lassa fever survive. The 1980 edition
of the authoritative textbook Harrison’s Principles of Internal Medicine gave the
official mortality figures for what is described in the book as a “new virus disease”:
“The mortality rates in Jos [in Nigeria, where an outbreak occurred in
1970] and Zorzor [in Liberia, where an outbreak occurred in 1972] were 52
percent and 36 percent, respectively, while in Sierra Leone the rate was 8
percent.”
These mortality figures are based on small samples, since the number of people
who became ill and died was only 10 (out of 32 suspected cases) in Nigeria and 4
(out of 10 suspected cases) in Liberia. More cases of the haemorrhagic fever were
reported between 1970 and 1972 from Sierra Leone, but still only 63 cases were
reported. The mortality rate amongst the suspected cases in Sierra Leone was
significantly lower than in Nigeria and Liberia, with a death rate of 8 percent
reported in Harrison’s Principles of Internal Medicine. The textbook also gives
statistics which suggest that Lassa Fever is not usually a killer virus:
“In Sierra Leone 6 percent of the population surveyed had complement-fixing antibody against Lassa Virus, while only 0.2 percent had recognised
disease, suggesting mild disease or inapparent infection. In Liberia 10
percent of hospital personnel had antibodies.” (p.843)
In the dramatic tale of how he discovered the value of intravenous ribavirin (an
anti-viral drug) for the treatment of early infection with Lassa Fever in Sierra
Leone, however, McCormick gives different figures for untreated versus treated
Lassa Fever:
“We went on to treat more than 1,500 patients with laboratory-confirmed
Lassa Fever. From over 16 percent, mortality dropped dramatically to less
than 5 percent. As time went on, the new treatment became famous
throughout the district…” (p.107)
The figure of a reduction from 16% mortality to 5%, which was attributed to
treatment with ribavirin, can be explained in other ways. The studies in the healthy
population of Sierra Leone and Liberia indicate that a much larger proportion of
the population have Lassa Fever antibodies than actually develop symptoms of the
disease, let alone die from it. By doing widespread and earlier testing for
antibodies, a sizeable number of mildly infected people (who would recover
without treatment anyway) could have been treated with the drug, resulting in
apparently lowered mortality rates. McCormick himself accepts
that serious cases did not recover, even with use of the drug, and that his previous
trials of oral ribavirin were of doubtful value.
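The selection effect described above can be sketched in a few lines of arithmetic. The numbers below are hypothetical round figures chosen only to illustrate the mechanism, not the actual Sierra Leone case counts:

```python
# Hypothetical illustration of selection bias in case-fatality figures:
# counting milder infections as "cases" lowers apparent mortality even
# if the treatment itself changes nothing.

def apparent_mortality(severe_cases, severe_deaths, mild_cases, mild_deaths):
    """Crude case-fatality rate over everyone counted as a case."""
    return (severe_deaths + mild_deaths) / (severe_cases + mild_cases)

# Era 1: only severely ill patients reach hospital and are counted.
era1 = apparent_mortality(100, 16, 0, 0)

# Era 2: wider, earlier antibody testing also counts mild infections
# that would have recovered untreated; the severe cases fare no better.
era2 = apparent_mortality(100, 16, 220, 0)

print(f"Era 1 apparent mortality: {era1:.1%}")  # 16.0%
print(f"Era 2 apparent mortality: {era2:.1%}")  # 5.0%
```

On these assumed figures, mortality appears to fall from 16 percent to 5 percent purely because 220 mild infections enter the denominator, with no change at all in the fate of the severely ill.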
These trials, described in a chapter titled “magic bullets”, occurred in 1979, and
involved comparing oral ribavirin with injections of plasma taken from people who
had recovered from Lassa Fever. After returning to CDC headquarters in Atlanta,
McCormick analysed the results of the treatment with the help of computer
technology:
“The first analysis suggested that neither treatment was effective. Looked
at as cold, hard numbers, even the ribavirin seemed to have little effect.
“Yet I could not let go so easily. The more I thought about it, the more I
began to wonder whether there might not be another way of looking at the
results [to suggest efficacy of the drug]. I went back and reanalyzed the data.
This time, I decided, we would take a different tack. I started to break down
the patients into two basic categories – those who were in the early stages of
illness on the day we began treatment, and those who were in the late stages.
In our first analysis, we took no account of the timing of the admission:
When did the patient become ill, and when did he actually go to the hospital
for help? Now I took into account how much time had passed from the onset
of illness to the day we’d started ribavirin.
“No matter how we looked at the data for immune plasma, the result was
the same. In every case, the plasma failed to work. It didn’t matter how early
in the disease we treated, the patients continued to die at the same rate as
before. But with the ribavirin I detected a glimmer of success, the faint glow,
perhaps, of at least a fraction of the miracle we sought. If a patient was
admitted in the first six or seven days of his illness, ribavirin improved
prospects for survival. If the patient had been sick for more than a week, the
capsules had less effect. We were onto something.”
The faint glow of a fraction of a miracle, provided by a drug which was always too
expensive for Sierra Leone, Nigeria or Liberia to afford for their impoverished
citizens. The same applies for azidothymidine (AZT/Zidovudine): apart from
debatable benefits from the drug, it is too expensive for most of the people with
AIDS in the world today. Furthermore, it may shorten, rather than lengthen, life
expectancy, as will be seen.
Returning to Dr Alan Cantwell’s and Dr John Seale’s accusations, the possibility
that AIDS has been introduced into the human population through vaccinations,
whether accidentally or intentionally, is so serious that it cannot be easily
dismissed as “unprovable” or “unlikely”. If unintentional, it is a medical blunder of
unprecedented scale, which warrants an immediate re-evaluation of medical
immunization strategies and public health programs worldwide. If intentional,
which is a possibility raised by several researchers, it is an act of brutal genocide
and mass-murder even worse than the Nazi atrocities of the Second World War, and
the perpetrators should be held accountable. Lest those who would determine the
truth about this matter wait for a confession from those responsible, or for proof of
written orders to commit genocide, Cantwell makes the important point that “all
biological warfare research in the US is secret and hidden from public view”. The
same applies to biological warfare research in other countries, including Australia
and Britain.

Chapter 16
BIOLOGICAL WARFARE RESEARCH IN AUSTRALIA
The history, demographics and epidemiology of AIDS in the modern world are
simply inconsistent with the theory, promoted by the medical and pharmaceutical
industries, and the governments in Britain, Europe, America and Australia, that the
virus, now called the “Human Immunodeficiency Virus” (HIV), spread to humans
from wild monkeys (or chimpanzees) in Central Africa during the 1960s or 1970s.
It clearly does not explain a simultaneous explosion of HIV and AIDS in white
homosexuals in America, and black heterosexual men and women as well as
children in Africa, or the subsequent rapid spread of the virus to mineral-rich areas
of Asia, South America, Melanesia and Polynesia. The American physician and
AIDS researcher Dr Alan Cantwell wrote, in 1998:
“Where did HIV originate? Prominent cancer virologists and
government epidemiologists have theorised that HIV originated in African
green monkeys. Purportedly the monkey virus “jumped species” and entered
the black population. From there it migrated to Haiti and Manhattan. After
the virus entered the black heterosexual population in the late 1970s, it
rapidly spread to millions of blacks because of transfusions with HIV-infected blood, dirty needles, promiscuity and genital ulcers – or so the
experts said.
“Not all scientists believe the official monkey story, although it is rare to
find people who express this view publicly. One persistent underground
rumor is that AIDS is biological warfare. Proponents of the AIDS conspiracy
theory believe that AIDS has nothing to do with green monkeys,
homosexuality, drug addiction, genital ulcerations, anal sex or promiscuity,
but that it has to do with scientists experimenting on blacks and gays: in
short, AIDS is genocide. Most African-Americans have heard the story that
HIV is a manufactured virus genetically-engineered to kill off the black race.
Thirty percent of New York City blacks polled by The New York Times
(October 29, 1990) actually believe AIDS is an ethno-specific bioweapon
designed in a laboratory to infect black people.
“Despite the general acceptance that HIV came from monkeys and the
rain forest, there is no scientific evidence to prove that HIV and AIDS
originated in Africa. What is true is that the first AIDS cases were uncovered
in the U.S. in 1979, around the same time that AIDS cases were discovered
in Africa. In addition, no stored African tissue from the 1970s tests positive
for HIV [this is corrected in Cantwell’s 1999 article]. And scientists have a
hard time explaining how a black heterosexual epidemic centered in Africa
could have quickly transformed itself into a white homosexual epidemic in
Manhattan.” (p.25)
Dr Cantwell does not mention, in these articles, several additional scientific and
historical facts and trends which might shed more light on the subject of
motivation for a genocidal use of HIV as a biological weapon. These
include concern about “Global Overpopulation” (the ‘population explosion’)
blamed on “Third World Overpopulation” among “First World” scientists,
politicians and “international health experts” (including those advising the WHO
and the WHO itself), pre-existing white supremacist regimes, eugenic theories and
programs, drug promotion interests and the corporate promotion of needles and
condoms, as well as the financial interests of the insurance industry, mining
industry and chemical industry, the medical treatment industry and the
“international aid industry”. Specifically, it is difficult to ignore the fact that the
populations hardest hit by AIDS (homosexuals, intravenous drug injectors and
“blacks”) have been the targets of genocidal eugenic programs in the past, the
most notorious of which were the Nazi euthanasia programs.
A direct connection between these Nazi programs and the development of
Australia-based biowarfare and chemical warfare programs becomes more likely
with the recent revelations that many Nazi scientists were given asylum and
employment in Australia after the Second World War, and with the prevailing
enthusiasm in medical and scientific academia for eugenics and so-called “racial
hygiene” during the years of an openly declared “White Australia Policy”. Nazi
administrators, scientists and senior military espionage officers were also
employed by the American military after the war, and were involved in the
development of the CIA and “Cold War” against communism, according to recent
exposés (Vankin and Whalen, 1997).
The possibility that the importation of Nazi scientists adversely influenced the
development of science in Australia is ridiculed by Guy Nolch, in the September
1999 editorial of Australasian Science, in which he deplores the naming of several
Nazi scientists by the Melbourne Age newspaper:
“Last month the public image of science was turned back more than 50
years when The Age reported that Australia had smuggled 127 former
German scientists into Australia soon after World War II. The rationale for
this was to bolster Australia’s fledgling scientific effort. Our allies also
wished to keep military knowledge out of Soviet hands.
“The Age named 41 of the scientists who had been members of the Nazi
Party, and used this fact to stir up a frenzy. Whether they were ardent Nazis
or had joined the party under duress was unclear, and irrelevant as far as The
Age was concerned. These were war criminals!”
What was not revealed by The Age or Australasian Science is in which medical and
health science industries former Nazis and Nazi sympathisers were employed, and
what government, academic and “public service” positions they were given when
they came to Australia. This is important knowledge for the
Australian people, and would help clarify the State and Commonwealth
Governments’ attitude to Nazi philosophy, which may well have been one of
sympathy (not forgetting that the Allies’ hero Winston Churchill was a prominent
eugenist and past vice-president of the British Eugenics Society). It may also help
Australians gain a more complete picture of Australian and Australia-based
biowarfare programs of the 20th century, and elucidate the history of eugenic
abuses in this country. It is clearly necessary to identify abuses of science in the
past to stop further use of scientific knowledge, including chemical,
anthropological and biological knowledge, to create disease, famine and pestilence
in the future.
In the same edition of Australasian Science is an article on modern biowarfare
written by Jacinta Kerin of the Murdoch Institute in Melbourne. In the article, titled
“Biological weapons from genetic research”, she writes:
“The role of genetic engineering in biological warfare can be divided
into two main areas. The first is the genetic manipulation of either bacteria,
viruses or toxins in order to maximise their suitability for biological warfare.
Before molecular genetics, candidate biological warfare pathogens were
selected on the basis of a number of naturally occurring properties that
render them hazardous to human health. For example, resistance to
environmental degradation, high infectivity, short and predictable incubation
period and resistance to antibiotics and/or vaccines are some of the factors
that might be considered in choosing a pathogen as a bioweapon.
“DNA manipulation raises the possibility that the list of candidate
pathogens could be substantially expanded should some of these properties
be genetically engineered into them. Alternatively, such technology gives us
the means of fine-tuning any of the properties already identified in order to
maximise their utility for a given attack.”
Kerin refers to “us” having the means of “fine-tuning” biological weapons using
DNA manipulation, but it requires technology far more complex and expensive
than most individuals, or “third world” nations, for that matter, have at their
disposal. Not so several “private (corporate) research institutions” and universities,
both public and private, in Australia, the United States and Europe including the
Murdoch Institute, Walter and Eliza Hall Institute, CSIRO and Macfarlane Burnet
Centre (MBC) in Melbourne. All of these university, government and corporate
connected research institutions are conducting research and developing technology
that could be used for modern biological warfare of the most devastating kind.
Given that the MBC has been involved in active promotion of immunization with
trial (experimental) vaccines in several parts of the “third world”, including
vaccination with new Hepatitis B vaccine preparations, and also advises the World
Health Organization (WHO) and the United Nations (UN) on AIDS prevention and
treatment, it appears that Australian science merits a very close examination
indeed.
The claim by Kerin that ideal biological weapons have short incubation periods
(which excludes HIV/AIDS) is not, by the way, true. For the purposes of
population control and targeted genocide whilst making multi-billion-dollar
pharmaceutical and diagnostic pathology profits, HIV is a remarkably suitable
agent – precisely because it has a long incubation period. Because it is said to be
transmitted through sexual intercourse, and because one might not know whether
one has been infected through “unprotected” sex until repeated blood tests are done
over three months (the currently promoted claim), it has influenced sexual
behaviour amongst young people in the direction of “safe sex”, which includes the
use of condoms, which act as an additional means of population (number) control.
The sale of condoms is also a multimillion-dollar industry in its own right, profiting
the rubber industry and plastics industry as well. Coincidentally, the terrible atrocities of
Leopold II in the Congo Free State (later the Belgian Congo) were committed primarily in the interests of the
Belgian rubber industry. Belgium happens to be where all the live virus vaccines
for SmithKline Beecham’s massive vaccine industry are made – a remarkable
coincidence.
AIDS has also resulted in unprecedented sales of drugs for “combined treatment
regimes”, none of which can cure the disease or prevent it. The sales of these drugs,
including AZT (Azidothymidine/Zidovudine) from Glaxo Wellcome, abacavir
(Ziagen) also from Glaxo Wellcome, nelfinavir (Viracept), saquinavir (Invirase)
and zalcitabine (Hivid) from Roche (producers of Valium) and many others on
offer amount to billions of dollars every year, and all can cause serious illness and
death themselves, usually by damaging the immune system, the very problem they
are supposed to be treating. In addition to these “antiviral drugs”, people with AIDS in
Australia are routinely treated with a cocktail of other drugs, including antibiotics,
antifungals, sleeping tablets and psychiatric drugs (including antidepressants,
antipsychotics and tranquillisers).
The Murdoch Institute (genetics), Macfarlane Burnet Centre (virology and
epidemiology), Walter and Eliza Hall Institute (immunology and virology),
Howard Florey Institute (of experimental physiology, at the University of
Melbourne) and CSIRO (and related Commonwealth Serum Laboratories, CSL)
are among the biggest medical research institutes in Melbourne, and are among the
small number of Australian Research Institutes to receive ongoing “block funding”
from the National Health and Medical Research Council (NHMRC), Australia’s
main federal government medical research funding body. Others to receive such
funding include the Baker research institute (cardiovascular research) at the Alfred
Hospital and the Mental Health Research Institute in Parkville, Melbourne (which
is adjacent to and was closely associated with the Royal Melbourne Hospital and
the University of Melbourne).
All these institutes are involved in drug-oriented research and several are looking
for genetic causes of illness. All share a history deeply embedded in eugenics.
Among these ‘genetics researchers’, the Mental Health Research Institute is
currently conducting a “genetic study of schizophrenia”, funded by the NHMRC
and the “Network for Brain Research into Mental Disorders-Genetic Linkage
Consortium”. The Murdoch Institute has, according to its 1996
Annual Report, been engaged in studies seeking to find, of all things, “genetic
markers for schizophrenia in Southern African Bantu-speaking families.” It is
unclear what criteria were used to diagnose “schizophrenia” in Africans whose
language, beliefs, cultural norms and attitudes are predictably different from those
of the European psychiatrists who developed the diagnostic criteria for
“schizophrenia”. Eurocentric and Christian Church-based ideas of what constitutes
“normal beliefs, attitudes and behaviour” cannot reasonably be inflicted on
African, Asian, American or Australian people, who are of diverse racial, religious
and cultural backgrounds, without serious human rights abuses occurring.
The Annual Report of the Murdoch Institute does not elaborate on these
schizophrenia studies, as it does for some of its more worthwhile projects, but
the section at the end of the 1996 Annual Report, titled “List of Publications”
mentions four schizophrenia studies, all involving the Scandinavian director of the
institute, and Professor of Medical Genetics at the University of Melbourne, Dr
Robert (Bob) Williamson. These include one paper titled “No evidence for linkage
of chromosome 22 markers to schizophrenia in Southern African Bantu-speaking
families” which was published in the American Journal of Medical Genetics and
three papers published in Psychiatric Genetics: “Non-parametric analysis of
chromosome 6p24-22 marker data and schizophrenia in Southern African Bantu-speaking families”, “A linkage study of the N-methyl-D-aspartate receptor subunit
gene loci and schizophrenia in Southern African Bantu-speaking families” and “No
evidence for linkage of chromosome 6p markers to schizophrenia in Southern
African Bantu-speaking families.”
How can you diagnose insanity in a person with whom you cannot communicate
because you do not share a language? How can we (or Professor
Williamson) be sure that the ‘Bantu-speaking schizophrenics’ were not South
African political dissidents? After all, if we had received blood specimens from
Communist China or Russia, such concerns would be regarded as obvious. Why is
the Murdoch Institute, located in the Royal Children’s Hospital in Melbourne,
trying to prove the ‘genetic basis’ of schizophrenia in (black) Africans in the first
place? Could eugenics have anything to do with it? It has, after all, been admitted
at the South African “Truth and Reconciliation” hearings that white South African
scientists were developing “ethnospecific” biological and chemical weapons –
that is, germs and drugs specifically designed to harm people with dark skin.
On Friday, 12th January 2001, the Murdoch-owned Australian newspaper carried a
front-page headline story titled “Killer virus created in error”. Credited to “Science
Writer”, Stephen Brook, the article is subtitled “discovery sparks bio-warfare
fears”. It begins:
“A new virus that destroys the immune system has been accidentally
created by Australian scientists and kept secret for more than two years.
“The discovery was made public yesterday amid fears that biological
weapons could be developed from the work on the genetically modified
virus, which in experiments has killed mice vaccinated against it.
“The scientists’ decision to publish the results of their groundbreaking
research came after they raised their concerns with colleagues and officials
at the departments of Defence and Foreign Affairs.
“The mousepox is harmless to humans, but scientists say the technology
that created it could be used to develop a GM-strain of smallpox which
would render the smallpox vaccine much less effective.
“Ian Ramshaw, an immunologist who worked on the project, said: ‘The
virus itself is not dangerous, the ideas and the technology behind it are
dangerous’.”
The story claims that while “trying to develop a mouse contraceptive”, scientists at
the “Pest Animal Control Co-operative Research Centre” in Canberra (at the John
Curtin School of Medical Research of the Australian National University) had
genetically engineered the mousepox virus, making a new strain that was more
deadly to mice. In fact, it killed the mice in 6 days. The scientists had added a gene
that produces the cytokine (immune-active chemical messenger) interleukin-4 to
the mousepox virus, using a sophisticated genetic engineering technique,
apparently in an effort to “stimulate antibody production” in the mice. Instead, the
genetically modified (GM) virus killed the mice by “destroying their immune
systems” – specifically by killing T-cells (T-lymphocytes, coincidentally the same
white blood cells that are targeted by the HIV virus). This, the researchers
claimed, was done in 1998, according to the article, and was an “accident” – the
intent was apparently to “develop a virus that would stop mice reproducing”,
although it was admitted by the chief researchers, Professor Ian Ramshaw of the
Australian National University (ANU) and Dr Ron Jackson of the Commonwealth
Scientific and Industrial Research Organization (CSIRO), that “they knew the
research had the potential to be adapted for use as a biological weapon”. Very
easily, in fact, and the article gives details of how this might be done. It also claims
that “Professor Ramshaw has been using a similar technique in the development of
a ‘double whammy’ AIDS vaccine, which is expected to go to human trials
shortly”. The article quotes Professor Ramshaw as claiming that “the [mousepox]
research is the first time a virus has been genetically engineered to make it more
harmful”. Is this true?
It is evident from this article that Australian scientists discovered how to engineer
viruses to make them more harmful over two years before they publicised their
work to the world. Actually, their work was publicised by the American Society for
Microbiology in an on-line publication of the Journal of Virology, and the
Australian scientists’ interview with the Australian came after consultation with the
Australian (Commonwealth) Departments of Defence and Foreign Affairs.
Claiming that “the current work has been blown out of all proportion”, Dr Ron
Jackson, the CSIRO scientist involved with the project, was, according to the
article, “being shielded from the media in a darkened lab at the centre, while his
boss, centre director Bob Seamark, fielded the relentless media inquiries from
around the world”.
The fact that the same technique could theoretically be used to make the HIV virus
more lethal is obvious – and this information was made public along with
sufficient scientific detail for its manufacture. Furthermore, the article states that
the engineered virus killed mice that had been previously vaccinated against
mousepox – stressing, for anyone who missed the point, that this technique could
be used to genetically engineer strains of lethal viruses such as smallpox such that
they can overcome prior vaccination.
The article says that Professor Ramshaw, an immunologist, has used a “similar
technique” to develop a “double whammy AIDS vaccine”. What does this mean?
There is no currently available “single whammy” vaccine against AIDS – and yet
Professor Ramshaw claims that human trials of his “double whammy” vaccine are
imminent. If the “similar technique” involves genetically engineering the
interleukin-4 gene onto an “inactivated strain” of the Human Immunodeficiency
Virus, one could predict a similar catastrophic collapse of the immune system, if
the mousepox experiments are any guide.
The claim, in the article, that the scientists were looking for a “mouse
contraceptive” is unlikely to be true, given the other information provided in the
story. Even the more credible claim that they were trying to stop mice from
breeding (sterilizing them, as opposed to providing them with ‘contraception’)
does not stand up to close scrutiny. The article admits that the genetically
engineered mousepox virus was injected into healthy mice and mice that had been
immunized against the virus. All the healthy mice died along with half of the
immunized mice. A diagram on the page shows “how the experiment went
wrong”. Apparently, the interleukin-4 gene was supposed to stimulate antibody
production, but instead killed the mice’s T-cells. Stimulating antibody production
would be a most unusual objective for scientists seeking to stop the breeding of
mice – unless they hoped to control their breeding by releasing killer-viruses into
the “pest animal” population. Given the past examples of myxomatosis introduced
into rabbits and cane-toads introduced into the Queensland cane-fields, the latter is
not impossible, by any means; however, the most obvious application of the
scientific discovery by the CSIRO and ANU researchers is for the development of
more lethal biological warfare agents. Could this have been the initial intention?
Also, why was the work kept secret for over two years? Why was it publicised at the
particular time that it was? Given that the Defence Department was apparently
involved in the decision to “go public”, was the department also involved in the
research? Is the whole story being told? What other research of this nature is being
conducted in Australia? If they had nothing to hide, why were the researchers
involved being “shielded from the media in a darkened room”, with the centre’s
boss “fielding” questions from media around the world?
In August 1999, the Commonwealth Scientific and Industrial Research
Organization (CSIRO) published an article titled “Will we ever beat AIDS” in the
science organization’s Helix magazine (the magazine of the CSIRO’s Helix science
club). This article, the magazine’s cover story, claimed that:
“An experimental vaccine for AIDS that’s receiving international
attention has been made in Australia. Using a virus that infects chickens,
called fowl pox, CSIRO Animal Health researcher Dr David Boyle,
Australian National University researcher Dr Ian Ramshaw and Dr Stephen
Kent from the Macfarlane Burnet Centre for Medical Research have
constructed a kind of ‘taxi’ that can carry part of the HIV virus into people
without causing infection. The taxi is made of the modified fowl pox virus.
“Like all vaccinations, the experimental approach works on the fact that
the immune system, though rather slow to gear up on first meeting a germ,
has a lifelong memory and responds much faster a second time. If a person
can be given parts of the HIV virus without infection, they’ll respond very
quickly to the real virus in the future and, with luck, eliminate it from their
bodies as soon as it arrives and before it has time to enter T lymphocytes.”
Simplified for young members of the CSIRO science club, this explanation of how
“the experimental approach works” does not stand up to close scientific scrutiny.
One might say that this experimental approach relies on such immunological
reactions to work. There is much danger, though, in injecting bits of the HIV virus
into humans using fowl pox virus as a ‘taxi’. New, lethal strains might be created.
Hybrid viruses producing the worst features of fowl pox combined with the worst
HIV features could result. This “worst case scenario” is not mentioned in the
CSIRO magazine report, nor is the possibility of accidentally introducing other
animal (in this case avian) viruses into the human population. Noting that Professor
John Mills, director of the Macfarlane Burnet Centre, has recently advocated the
injection of live attenuated HIV as “vaccines” into nations with a high incidence of
AIDS (including southern Africa and Thailand, but excluding Australia, Europe
and his native USA), it is interesting that 18 months earlier the CSIRO regarded
such action as too dangerous. Roger Beckmann’s article continues:
“Unfortunately, HIV is a fast-changing virus and we need bits of it that
are constant to induce immunity. The best bits for doing this, and for
stimulating the body to fight it even when it’s in cells, are its internal
molecules. To give these to people would require giving the actual virus in
its entirety, and this – even if we used a killed form of it – is too dangerous.
“So the researchers have constructed an HIV gene-containing fowl pox
virus to use in the vaccine instead. Once in the body, the foreign proteins of
the fowl pox virus, which cannot cause disease in humans [has it been tested
for this?], alert the immune system. Then the system’s cells also spot the
HIV genes inside the fowl pox and develop a memory for them. The person
is thereby immunised against HIV.”
The last paragraph reveals the financial connection, and that trials on humans were
just about to begin:
“Early results are promising, and small amounts of the experimental
vaccine have now been manufactured by the Australian National University
and the Australian company Virax. But before the vaccine can be put on the
market, it needs thorough testing to ensure that it’s safe. Medical centres in
Sydney and Melbourne will conduct a trial in early 2000 on HIV-infected
human volunteers. We’ll keep you posted on the outcome.”
There are several institutions in Australia that have the capacity to develop
biological weapons, including several that are researching infective agents that can
be used in biological warfare. These include virology and immunology research
institutions associated with major Australian universities (see diagrams in the
appendix). The same is true for other universities around the world, including those
in the East and those in the West. The epicentre of global biotechnology is
undoubtedly the United States of America, however Australia also has a highly
developed biotechnology industry. Much research is done in Australia in the areas
of vaccine development and immunology, and in molecular genetics, genetic
engineering and drug development.
The enormous influence of the agricultural and mining industries in Australia has
had a direct bearing on the development, over the past 100 years, of the biotechnology
and chemical industries in this country. These in turn have been shaped by the
political and economic agendas of those that have wielded political (state) power in
and over Australia. It takes only a cursory exploration of history, politics and
economics texts to see that, since their inception, governments in Australia have
been dominated by capitalist politics. Both “Labor” and “Liberal” governments
have supported and implemented capitalism, courting the mining, agricultural,
pastoral, logging (timber), chemical and pharmaceutical (drug) industries. They
have also, although less publicly, courted the arms industry. The same has been the
case with successive governments in the USA and Britain, and elsewhere in the
“First World” – which, while assuming the title of “Free World” is more accurately
identifiable as the “Capitalist World”. The “First World” claims to be both “free”
and “democratic”, and the governments of First World nations routinely claim such
virtue; however, it takes only a casual reading of the newspapers to see that it is
ruled by an oligarchy of affluent, heavily armed capitalists. These are the masters
of the so-called “military-industrial complex” – a machine that feeds and is fed by
warfare and weapons, including biological (and chemical) warfare and biological
(and chemical) weapons.
Biological warfare has been mentioned many times on television, on the radio and
in major newspapers over the past two years, especially at the time of the Gulf
War, when Saddam Hussein’s Government in Iraq was accused of having
biological weapons and/or the potential to develop them. Iraq was also accused of
having chemical weapons and claims were made, mainly by European and
American media sources, that Hussein’s military-controlled government had used
these weapons against racial minority groups (Kurdish people, specifically) in Iraq
in an act, or several acts, of genocide. If these accusations are true (and they appear
likely) Saddam Hussein and commanding officers in his army are guilty of a
heinous crime against humanity, and are guilty of both mass-murder and genocide,
since literal genocide involves mass-murder.
The biological and chemical weapons that Iraq was said to possess were described
by the Australian United Nations ‘weapons inspector’ Richard Butler as “weapons
of mass destruction”, the development and use of which is prohibited, as they
should be, by International Law. It is the legal and ethical responsibility of the
United Nations to enforce these laws, as the organisation has the financial power,
political power and military power to do. Of course, it is the United Nations
organisation (UN) that made these laws in the first place, though it was not the first
to conceive of universal and global laws. Universal Laws and Global Laws are not
the same thing, although colloquially the words Universal and Global are often
used interchangeably, as is the case in the naming of the (1948)
Universal Declaration of Human Rights. These laws refer to the human rights of
every human being in the Universe. There can be no exceptions if such a law is to
be just. It cannot be applied with double standards, favouring one individual or
regime over another. It is also not possible to ignore infringements of Universal
Human Rights laws in some countries, punish other countries for less serious crimes
by attacking them with other weapons of mass destruction, and expect nobody
to notice the inconsistency in the response.
The following radio reports were heard in Melbourne on 17.4.99, containing
carefully edited coverage of the American and British military strikes against
Baghdad:
American military voice:
“their mission is to attack Iraq’s nuclear, chemical and biological weapons
programs and its military capacity to threaten its neighbours. Their purpose is to
protect the national interest of the United States and indeed the interests of people
throughout the Middle East and around the world.”
The broadcast continued with a voice with an Australian accent saying:
“Britain’s Prime Minister Tony Blair says his government has backed the U.S.
attack because there was no realistic alternative to military force.”
The voice of Tony Blair is allowed, by the radio programmers and editors, only the
brief statement, read in measured tones:
“We are taking this military action with real regret, but with real determination.
We have exhausted all other avenues.”
The Australian-accented voice continues:
“Canada has supported the US and British air-strikes against Iraq. Germany
says the attack is regrettable but Iraq had plenty of chances to avert the use of
force; while France, China, Iran and Russia have deplored the attack. At Russia’s
request the United Nations Security Council is due to convene in 90 minutes from
now to discuss the Iraqi crisis. UN Secretary General Kofi Annan, who tried to
broker a peace deal with Iraq, says it’s a sad day for the United Nations and a sad
day for the world.”
A sad-sounding but calmly measured voice, presumably that of Kofi Annan, is
heard saying what is unfortunately not nearly enough:
“All we know is that tomorrow, as yesterday, there will still be an acute need in
Iraq and in the surrounding area for humanitarian relief and healing diplomacy. In
both these tasks, the United Nations will be ready, as ever, to play its part.”
It is not, then, unreasonable to ask what part the United Nations and its allied
organizations, the World Health Organization and the World Bank, have played in
stopping or promoting biological and chemical warfare in the past, so that we can
know what sort of “humanitarian relief” and “healing diplomacy” to expect from
the world’s most respected authority on “global health” and “global economics”.
The Macfarlane Burnet Centre is deeply involved in the United Nations and World
Health Organization’s “Third World” health policies, especially in regards to AIDS
treatment and prevention, and several projects are being done in collaboration with
UN organisations and International Aid organisations, particularly World Vision
and AusAID. Among the centre’s many public education pamphlets is one
introducing the centre, which claims in the section about the International Health
Unit:
“The IHU at MBC is working to reduce the impact of many diseases, and
improve the overall health of hundreds of communities around the globe,
through technical assistance and training programs.
“Current projects focus on sexually transmitted diseases such as
HIV/AIDS, hepatitis B, malaria, vaccination programs for preventable
childhood diseases, and improvement of water supplies. The IHU also
conducts a number of teaching programs in Australia, such as the ‘Masters
of Public Health’ and ‘Health and Human Rights’ courses.
“The IHU works with a number of national and international
organisations such as the World Health Organisation, World Vision
Australia, World Bank, and AusAID. IHU aid programs are being
conducted in over 20 countries around the world, including Zimbabwe,
Vietnam, India, Tibet, Nepal, PNG, Thailand, and Australia.”
Given the disastrous problems affecting both global health and the global economy,
it would seem that the advice the “Third World” and the World Health
Organization receive could improve, to say the least. Perhaps because it is less
obviously connected with the crime of “Third World Debt”, the World Health
Organization (WHO) has not been tainted by the public cynicism commonly seen
regarding the World Bank (WB) and International Monetary Fund (IMF). It is
important to realise, when analysing the WHO and its strategies, that the esteemed
global health organization is part of the United Nations organization, alongside the
World Bank and IMF and other UN bodies.
The United Nations Organization grew out of the pre-WWII League of Nations,
and was established by the victors of the Second World War; the USA, UK, USSR,
France and China thus became permanent members of the United Nations Security
Council. When the UN organisation was established in 1945, the World Health
Organization, World Bank and International Monetary Fund were three of
numerous “specialised agencies”, others including the International Labour
Organisation (ILO), Food and Agriculture Organisation (FAO), UN Educational,
Scientific and Cultural Organisation (UNESCO), World Intellectual Property
Organisation (WIPO), World Tourism Organisation (WTO), International
Telecommunication Union (ITU), Universal Postal Union (UPU) and International
Atomic Energy Agency (IAEA).

Chapter 17

TYING IT ALL TOGETHER
The United Nations laws against genocide were formulated as a direct response to
revelations of Nazi atrocities during the Second World War, a war during which
psychological, biological and chemical warfare were intensively researched by all
the major protagonists. Immediately after the Second World War official biological
and chemical warfare laboratories were set up in several countries, including the
Soviet Union, Britain and the United States of America; the biological weapons
programs were officially abandoned with the 1972 Biological Weapons Convention.
This convention has been heralded as an example of successful
conversion of the military industry for peace. The Gaia Peace Atlas (1988), edited
by Frank Barnaby, former director of the Stockholm International Peace Research
Institute, claims:
“That military industries can be converted to civilian purposes is shown
by the outcome of the 1972 Biological Weapon Convention. This banned the
production and development of biological weapons. American biological
weapons establishments and personnel were then converted to civilian
medical establishments.” (p.218)
Biological warfare has recently become a matter of public concern, but has always
been a matter of public importance. An acknowledged form of non-conventional
(or unconventional) warfare, biological warfare is based on the use of infective and
biologically toxic agents, including bacteria, viruses, fungi, and chemical toxins to
cause acute and chronic illness. Historically, germ warfare was used to both kill and
maim targeted populations. These have sometimes been declared ‘enemies’, but
more often they have been the victims of covert warfare, especially during the
proliferation of germ warfare in the 20th Century. During the Second World War, as
has been admitted many decades later, both the Allies and the Axis powers
developed and tested various infective agents for use in biological warfare. On this
matter there is a noticeable difference between the claims of the opponents in the
Second World War and Cold War.
Australia, where this work was researched, where I studied medicine from 1978 to
1983, and where I have worked as a doctor for the past 25 years, was a member of
the Allies in the Second World War. The nation has aligned itself politically,
militarily and scientifically with the capitalist ‘West’ since the first political
foundation of this nation. This is a very recent event – the nation of Australia is
only 100 years old. In stark contrast, the land of Australia is very ancient, and the
first people who arrived here did so in the unimaginably distant past. These were
the people the White Nation that called itself Australia (Southern Land) now refers
to as ‘Aborigines’. This term is, of course, not a specific one. During the era of
colonization the dark-skinned natives of all the “discovered” continents were
called “Aborigines”. Roughly the same populations were also described, in
historical records and texts, as “natives”, “savages” and “blacks”. Often these terms
were used interchangeably and had been since the earliest days of cargo slavery by
the architects of the “Age of Discovery”, as the Western history textbooks refer to
the period of history from 1490 to 1600, when the monarchies of Western Europe
discovered millions of people to enslave and exploit.
The Age of Imperialism
Imperialism, the building of empires, was a fundamental objective of the voyages
of discovery by Magellan, Bartholomew Dias, Vasco Da Gama, Columbus and the
other “great navigators” of the late 15th and early 16th Centuries. Their voyages
were directly financed by the wealthy and rapidly expanding kingdoms of Portugal
and Spain, whose monarchies immediately claimed all “discovered” territories for
themselves. The Catholic
Church sanctioned these possessions and immediately sent missionaries to convert
the natives. This was done at the same time that soldiers, armed with guns and
cannons established “colonies” at various strategic locations around the globe.
Each site was chosen with care. Strategic importance was paramount, both in the
wars between the various colonising (European) nations and in the war against the
people resisting enslavement, for colonization always brought enslavement.
The role played first by the Catholic Church, and later by the Protestant Churches,
in aiding, abetting and sanctioning the expansion of various European empires,
despite the fact that this expansion was a vehicle for slavery and exploitation, must
be acknowledged if one is to understand the history of genocide in the modern world.
In 1494 Pope Alexander VI gave divine sanction for the division of all new lands
between the monarchies of Spain and Portugal. King Ferdinand and Queen Isabella
of Spain, who had financed Christopher Columbus were “given” the “hemisphere”
(half-globe) West of the Azores islands in the Atlantic Ocean (North, South and
Central America), and the King of Portugal, John II, was granted any “discoveries”
in the Eastern Hemisphere (Africa and Asia), since he had financed Bartholomew
Dias, who had “first” sailed around the southern tip of Africa, discovering a sea
route to the Indian Ocean and thus to the valuable “spice islands” of the East Indies.
When the monarchies of Holland, England and France developed sufficient naval
power to challenge the Spanish and Portuguese fleets, they took little notice of the
papal decree of Alexander, and claimed the support of their rival Protestant
Churches in their rival territorial claims. Inevitably the desired “divine sanction”
was usually given without demands on, or executions of, the clergy, though not
under Henry VIII, who arranged for his English Parliament to appoint him “head” of
the English Church in 1534. Ironically, Thomas Cromwell, Henry’s First Minister,
who had convinced Parliament of the merits of this dubious act, was one of the
many friends, enemies and wives whom the despot had executed when their
utility was no longer evident to him.
was the writer and philosopher Thomas More, who had written the satirical classic
Utopia in 1516 and was regarded as one of Britain’s leading intellectuals. Thomas
More had spoken publicly against Henry being made head of the English Church,
resulting in his execution in 1535 after 15 months imprisonment in the Tower of
London. John Fisher, the bishop of Rochester was executed on Henry’s orders, also
in 1535, for the same reason.
Henry VIII ascended the British throne at the age of 18 and ruled the British
Empire until his death in the year 1547 at the age of 56. During this time he
squandered much wealth in wars against his French and Spanish rivals. To
replenish the Royal coffers he seized, with the assistance of his First Minister,
Thomas Cromwell, the lands and property of the Catholic Church in Britain. This
occurred after his break with the papacy due to the refusal of the pontiff, Pope
Clement VII to “annul” his marriage to Catherine of Aragon, the Spanish princess
he had married in 1509. Catherine, who was previously Henry’s sister-in-law (she
was the widow of Henry’s older brother Arthur), was the daughter of King
Ferdinand of Spain, who had been granted the “Western Hemisphere” with his wife
Queen Isabella by the Spanish-born Pope Alexander’s papal decree of 1494.
Henry VIII’s main foe during the many years he waged war against the French was
King Francis, who died, aged 53, on the 31st of March in 1547, only two months
after Henry. Francis had waged war, for many years, against the Habsburg emperor
Charles V, for control of the European mainland and the newly discovered
territories in the Americas. Charles, the son of “Philip the Fair” and “Joanna the
Mad”, was the grandson of Ferdinand II of Aragon, the husband of Queen Isabella
of Castile. Ferdinand and Isabella had united their kingdoms in 1479, ten years
after their marriage, resulting in a shared empire centred in Spain. At the time, the
main threat to Spanish territorial ambitions came from the neighbouring monarchy
of Portugal, which, after a four-year war (1475-1479) was granted, by the Spanish
monarchy, a monopoly of trade and navigation along the entire West African coast.
When the explorer Bartholomew Dias, sponsored by John II of Portugal, rounded
the Cape of Good Hope (which he initially named the ‘Cape of Storms’) in 1488,
the territorial claims of the Portuguese expanded dramatically, to include the entire
“Eastern Hemisphere”.
One of King Henry VIII’s enduring legacies is the Royal College of Physicians,
which he established at the urging of the physician Thomas Linacre. The Royal
College of Physicians has remained, to this day, a powerful force in medical
politics, controlling the system of medical qualifications throughout the British
Empire (and later the British Commonwealth). Henry VIII also presided over the
formation of the “United Company of Barbers and Surgeons” in 1540, appointing
Thomas Vicary, the Sergeant Surgeon of Henry’s army, as “Master” of the new
union. The United Company subsequently became the Royal College of Surgeons
(in 1800). In Medicine: the art of healing (1992), the politics surrounding the
formation of the United Company of Barbers and Surgeons is described:
“In London, prior to 1540, there were two distinct groups of surgeons
who were in fierce competition over the right to supervise those who wished
to practice that craft. The more elite of the two was the unincorporated Guild
of Surgeons, with perhaps twoscore members who had learned their skills
while serving in military campaigns. The other was the much larger group of
the Barbers’ Guild, who had distinguished themselves from their fraters who
had only practiced barbering. With 185 members, this was the largest of the
livery companies in London.
“The amalgamation into the new United Company of Barbers and
Surgeons was advantageous to both organizations. The status of the barbers
was elevated by their association with the elite surgeons and by their
separation from the pure shavers and hairdressers. For the surgeons, the
advantage lay in the increase in total numbers and the much larger treasury
of the men with whom they had been linked.” (p.40)
The system of government and civic infrastructure in Australia was set up by British
colonial authorities in the early 20th Century. The official head of government in
Australia was the British monarch, referred to in government laws as “The
Crown”. When Australia was formed as a “Federal State” in 1901, the Governor
General, Australia’s official Head of Government, was appointed by the English
monarch. The Governor General maintained executive powers over the elected
government in Australia, according to the Australian Constitution (which was
actually British-designed, and thus maintains British control over the Australian
people, and the land they occupy).


At present the Queen of England, Elizabeth II, is the official monarch of Australia,
and thus the “owner” of Crown lands in Australia, according to the Australian
constitution. On 20th November 1926, the present Queen’s grandfather, George V,
declared that the British Empire would henceforth be known as the British
Commonwealth of Nations, of which Canada, Australia, New Zealand, South
Africa and Newfoundland “should have equal status with Britain as members”
(Burne, 1991, p.1088). King George assumed the title “George V, by the Grace of
God, of Great Britain, Ireland and the British Dominions beyond the Seas, King,
Defender of the Faith, Emperor of India”.
The “faith” that George and his armies “defended” (and attacked with) was the
Anglican religion, as defined and ordained by the Church of England (Anglican
Church). This religion had been founded by the notorious King Henry VIII, who
arranged for himself to be appointed head of the new English Church when he broke
from the Catholic Church because the Roman Pope refused to annul his marriage
to Catherine of Aragon, so he could marry again. Henry VIII had been granted the
title “Defender of the Faith” by Pope Leo X, in recognition of his written defence of
Catholic doctrine against Martin Luther. The title “Emperor of India” shows clearly that
George V regarded himself as the owner of this ancient land, and of his various
“dominions”. It was thus not really a “common-wealth” – it was a system of
Imperialism under a new name and a new organizational structure. The “white”
colonies and “dominions” (Australia, New Zealand, Canada, South Africa and
Newfoundland) could aspire to being “equal members in the British
Commonwealth”, but those in the colonies and “protectorates” mainly populated
by “people of colour” were to continue as “inferior members”.
During the Second World War (1939-1945) the political concept of a
“Commonwealth” was exploited to full effect by the British Imperial armed forces.
“Commonwealth partners” from around the British Empire were recruited to fight
for the king’s forces and Allies, against the “unholy alliance” between the
Germans, Italians and Japanese, as the American President Franklin D.
Roosevelt described the “Axis alliance” in his presidential address to the nation in
December 1940. It was during this radio broadcast that Roosevelt urged
Americans that the United States of America must urgently become the “great
arsenal of democracy”:
“As planes and ships and guns and shells are produced, your
Government, with its defense experts, can then determine how best to use
them to defend this hemisphere. The decision as to how much shall be sent
abroad and how much shall remain at home must be made on the basis of
our over-all military necessities.


“We must be the great arsenal of democracy. For us this is an emergency
as serious as war itself. We must apply ourselves to our task with the same
resolution, the same sense of urgency, the same spirit of patriotism and
sacrifice, as we would show were we at war.” (Roosevelt, 1940, quoted in
As It Happened: A History of the United States, Sellers, et al, 1975, p.695)
In his broadcast to the nation Roosevelt said that “we are planning our own defense
with the utmost urgency; and in its vast scale we must integrate the war needs of
Britain and the other free nations resisting aggression”. The “other free nations” in
President Roosevelt’s terms, included South Africa, Canada, Australia and New
Zealand, which were “members of the Commonwealth” of equal status with
Britain according to George V’s proclamation of 1926. Officers from these (white)
nations had been placed in positions of authority over the various “coloured
soldiers” in His Majesty’s Army, since the British Government, under the eugenist
Winston Churchill, had been “integrating” its own “war needs”. In His Majesty’s
armed forces it was possible for a dark-skinned man to become a low-ranking
officer, but only about as often as Galton’s theories would have predicted. The
command positions were all occupied by white men, all of whom had a “good
education”, meaning that they went to elite schools and universities. These men
were rarely killed in the kinds of war the British waged – while the hordes of
Indians, Africans and Australians who rushed to defend the “Commonwealth”
occupied the front line. They were the occasionally honoured, and frequently
killed, “privates”, who formed a “buffer zone” between the enemy’s bullets and the
officers who gave the orders. The officers had been trained to order “their” men to
keep fighting.
The Second World War was fought on several fronts. These have relevance to the
scientific and medical information to follow, so I will provide a brief overview of
the politics of WWII as I perceive them. I did not learn anything about the Second
World War at school or university, and have only a limited knowledge of its details;
however, most people have gathered that the Second World War included a war
between certain European governments for control of Europe and Africa, and a war
between the Japanese Imperial government and the government of the United States
of America. Predictably, given the victors of that war, Germany and
Japan are usually seen as the aggressors, while
Britain and the USA are seen as the defenders of freedom and democracy. While it
is true that the Germans and Japanese had Imperial designs, the British and
Americans did also. British efforts to dominate the world, and create a global
empire, long predated even the foundation of Germany. At the outbreak of the
Second World War the British government claimed supreme authority over a fifth


of the world’s land surface: including “dominions and possessions” on every
continent. The “jewel in the crown” of the Empire was India, the population of
which was very much greater than that of the British Isles. India, which had been
wrested from Moslem Mughal rulers by the British in the 1700s, had long been a
source of enormous wealth for the British Royal Family and their allies.
Many of the “crown jewels” were “given” to the British by the elite Indians, who
were allowed to maintain their privileged position in His (British) Majesty’s
Indian Empire, provided they paid their taxes and allowed their people to be exploited
and enslaved. The rule of the “British Raj” continued in India through the long reign of
Queen Victoria, during which time Indian “indentured labourers” (slaves from
Tamil-speaking Southern India) were sent to various British dominions, including
Queensland (Australia), Ceylon (Sri Lanka), and British territories and
“protectorates” in the Caribbean Sea, Indian Ocean and Pacific Ocean. In all these
areas the British established “plantations” which were administered by “whites”
and where most of the work was done by “blacks” (of either African or Indian
racial heritage).
I was born in London in 1960. My father had graduated in medicine at Cambridge
University in the 1950s, and, while I attended primary school in Kent he obtained
his MD (specialist degree) after writing a thesis on the effects of diuretic drugs on
potassium excretion by the kidneys. After a crash course in “tropical medicine” my
father obtained a research grant from the Nuffield Foundation to establish a
research laboratory at the Kandy Hospital, a public hospital in the hill town of
Kandy, in Sri Lanka, then called Ceylon. My parents were both born in Sri Lanka,
and they regarded the change as “going home”. For my older sister and me it was
leaving our home and adapting to a new one. At the age of 8 I did not find the
change traumatic, as far as I can recall, but my sister, who is a year older than me,
remained homesick for England for many years. In Kandy, which became my
home for the next 7 years, I attended a private boys’ school owned by the Anglican
Church – Trinity College, Kandy. My father worked at the Kandy Hospital and
immersed himself in medical research, doing studies on anaemia, urinary tract
infections, fluorosis (toxicity due to high levels of fluoride in water from rural
wells), and other subjects. My mother, who has a degree in Zoology from the
University of Ceylon, helped my father in his medical research and in writing up
the research. Thus I was exposed to medical research in the “Third World” at a
young age, and witnessed, first hand, how Indian tea-estate labourers were being
treated in Sri Lanka. They were treated atrociously.
When the British conquered the hill kingdom of Kandy in the early 1800s they
succeeded, where the Portuguese and Dutch had failed, in gaining political control of


the whole of Ceylon. They never developed cultural control, although for many
years they tried, by setting up systems of government and education
along the lines of other “colonies”. Ceylon was then regarded as the “pearl of the
Indian Ocean”, a rich, fertile island at the centre of the trade routes between
Europe, Africa and the Far East. For many centuries the kings of Ceylon had
traded spices and precious stones with Arab and Chinese traders, and later with
Portuguese and Dutch ones. The Portuguese were the first to try to take control of
the island. This was in the early 1500s, and the Portuguese, with their guns and cannons,
were able to take control of the coastal kingdoms in the south of Ceylon. The
Portuguese, French and British had already established armed fortresses along the
eastern and western coasts of India during the 1600s and 1700s. The
Dutch, however, had control of the “East Indies” – now called Indonesia, and then
also known as the “Spice Islands” or “Moluccas”. The Dutch took control of the
ancient cities of Java, creating a Dutch-speaking capital of the “Dutch East Indies”,
which they named Batavia (now Jakarta). The Spanish controlled most of the
South and Central American mainland, with the exception of Brazil, which was a
Portuguese colony. The Spanish also controlled, during the age of cargo slavery,
the South-East Asian islands still called the “Philippines”. In 1898, the United
States of America took control of the Philippines, along with Cuba, Puerto Rico
and Guam in a treaty with the Spanish, which was signed in France (the “Paris
Treaty” of 10.12.1898).
When the British and Dutch developed their own navies, in the 1600s and 1700s,
they predictably challenged the Portuguese and Spanish claims. Pointing to the
considerable atrocities being committed by the Iberian soldiers, the Protestant
English and Dutch explained to the natives they hoped to exploit that the
Spanish and Portuguese were cruel Catholics who had misunderstood the true word
of God. This, claimed the Protestant missionaries from England and the
Netherlands, was to be found in the “King James Version of the Bible”, which was
duly translated into hundreds of languages. The British and Dutch colonists did not
approach established civilizations with guns in the first instance; they used,
instead, flattery and bribery, and, failing that, threats. Although their ships were
armed with cannons and carried soldiers with guns and swords, the British and
Dutch governments and monarchies kept their hands clean by having the dirty
work of betrayal, bribery and slavery organized and implemented by “trading
companies”. The British East India Company and the Dutch East India Company,
two such companies that were given authority to kill, exploit and enslave in the
name of their respective monarchs, are of special relevance to events in Africa
during the Second World War that may point to the cause of the current epidemic
of AIDS in South Africa.


The “League of Nations”, the predecessor of the “United Nations”, was formed in
1919, at the conclusion of the First World War, with the stated aim of preventing
further wars between rival European states. This political organization of 27
nations, including Britain and several of its dominions, France, and other victorious
European nations, was instigated by the US President Woodrow Wilson, who had
presented his famous Fourteen Points peace plan in 1918.
In 1919, President Wilson’s plan was adopted by the Allies at the Versailles Peace
Conference in Paris, at which the formal surrender of the Germans and the
formation of the League of Nations were negotiated. According to the Versailles
Treaty, Germany was stripped of its colonial “possessions”, and much of its
territory, and was to pay 20 billion gold marks in reparations. Germany was to be
“demilitarised” and to surrender territory inhabited by 7 million people. The separate
states of Austria, Czechoslovakia, and Hungary were formed from the fragments of
the once huge Austro-Hungarian Hapsburg Empire, which had, in its heyday as the
“Holy Roman Empire”, controlled much of Europe and Iberia, and much of the
Americas. The Scandinavian states, Poland, Belgium, and France gained territory
from Germany in Europe in the Versailles Treaty, as did Romania, Italy, Greece
and the newly formed state of Yugoslavia.
The following year the allocation of Germany’s “colonies” was decided by the
victorious “Allies”. This is where the real wealth of the “German Empire” lay.
According to the League of Nations mandate of 10 August 1920, the German
territory in East Africa (Tanganyika) was “mandated” to the British,
while German South-West Africa was mandated to the white-supremacist Union of
South Africa. These areas were known to be extremely rich in diamond deposits,
especially the coast of Namibia in South West Africa. They also contained rich
deposits of gold, uranium and other precious minerals. The mandate thus gave
British mining companies access to extraordinary mineral riches in Southern and
Eastern Africa. The 1920 League of Nations Mandate also added territory to
British and French ‘possessions’ in West Africa. The small East African states of
Burundi and Rwanda, centres of the 1980s African AIDS epidemic, were added, by
the League of Nations, to the Belgian possessions in central Africa. Since the
1890s, King Leopold II of Belgium had claimed all of the Congo as his personal
property, instituting a system of cruel tyranny and slavery by white Belgian
authorities over a black population divided between privileged Tutsis and
subjugated Hutus. The atrocities being committed by the Belgians in the Congo
were publicised by the British, in particular, in the early 1900s, resulting in the
Belgian government taking over administration of the territory from King Leopold,


in 1908.
The Congo, now the independent African nation of Zaire, is where the AIDS
epidemic in Africa is said to have begun, and was the worst hit of the African
countries in the 1980s. Zaire, like Southern Africa, is rich in minerals, and also
contains the last large remnants of the tropical rainforest that once covered so
much of Africa. It is also the last remaining home of our closest primate relatives,
chimpanzees, which are, like many rainforest animals, threatened with extinction at
the hands of mankind.
Other than Australia, the central focus of this book is on Africa, a continent I have
visited on only a single occasion, in 1990, when I briefly visited Zimbabwe,
Kenya and Tanzania. Knowing little about the history of Africa, I was amazed
to learn, when we visited the “Great Zimbabwe Ruins”, that Cecil Rhodes had refused to believe
they could have been built by any people other than ‘whites’, despite overwhelming
evidence to the contrary. These are the remains of a Southern African civilization
dating back centuries before Bartholomew Diaz sailed around the Cape of Good
Hope, encouraging his sponsor, the King of Portugal, to claim, for himself and his
family, the whole of Africa. The Spanish, however, disputed the Portuguese claim,
and the warring monarchs sought the decision of the religious leader of their
people and the remnants of the Roman Empire – the Roman Pope, head of the
Catholic Church. The Pope decreed in the 1490s that the Portuguese King John II
could have the “Eastern Hemisphere” (east of the Azores Islands in the Atlantic
Ocean, and thus Africa and Asia) while Queen Isabella and King Ferdinand of
Spain could have the Western Hemisphere (the newly-discovered Americas, hence
the term “New World” in various European history books).
The kingdom of Kongo (Congo) was approached by the Portuguese, in the 1500s,
as a possible ally against the Moslem Empire of the Ottoman Turks, against whom
the Crusades had raged for many centuries. The Moslem Moors, allied with the
Ottomans, had ruled the southern Iberian peninsula (Spain and Portugal) until the
1300s, and the Catholic empires of Southern Europe were eager for revenge
against their traditional enemies – the Moslems. The Congo kingdom, which was
ruled by a slave-trading and slave-owning king, became the primary source of African
(‘negro’) slaves for the Portuguese.
In Southern Africa, where the Germans fought against the British and Belgians for
control of the diamond-rich coast of South-West Africa, and where Galton made
his name, the AIDS epidemic is out of control. Over one thousand people in South
Africa alone are said to be infected with HIV every day. These are all predicted to


die within the next 15 years by Australia’s premier AIDS advisory and research
centre, the Macfarlane Burnet Centre in Melbourne.
Frank Macfarlane Burnet, after whom the Macfarlane Burnet Centre, Australia’s
“premier virology institute”, is named, was a eugenist. In fact, he was still
promoting eugenics in the 1970s when it was not a popular subject for public
discussion. He admits this in his 1978 book Endurance of Life, when he also writes
about the ‘eugenic value’ of selective infanticide. The Burnet Institute presently
claims, in its promotional literature, that Sir Frank Macfarlane Burnet was very
concerned about overpopulation. They fail to mention that inherent in the much-voiced fear of “overpopulation” has always been the ugly combination of racism
and xenophobia.
Closely related to the history of genocide is the dreadful use of chemical and
biological weapons and warfare. The deliberate creation of disease in targeted
populations has a long history, dating back to at least the Middle Ages, when
bodies of people who had died from the bubonic plague were thrown over the
walls of besieged cities to infect the surrounding enemy (with the additional
objective of avoiding disease from the dead bodies).
The dispossession of indigenous people around the world was justified by
Europeans with imperialist designs in similar ways in the Americas, in Africa and
in Australia, and in all three alcohol was used as a means of attacking ‘native’
populations. Describing indigenous populations as ‘uncivilised savages’ in need of
‘protection’ from morally and intellectually superior (white) masters was a widely-used justification for enslavement of these ‘black’, ‘red’ and ‘brown’ people – it
was claimed as necessary for the ‘development’ of ‘backward races’, or at least
better than their previous state of ‘barbarity’. Alongside this ‘development’ and a
central means of its implementation was the stealing and brainwashing of children
in various Church-run ‘educational institutions’. It was seen as a divinely-sanctioned obligation to save the souls of pagan or heathen races, by force if
necessary. This resulted in what has been subsequently defined by International
Law as ‘cultural genocide’.
This book has been more concerned with physical genocide than cultural genocide,
although the two are clearly related. Physical genocide results in cultural genocide,
and destroying the culture of a targeted population results in the premature illness
and death of members of that culture. Generally, and in the case of
Aboriginal people in Australia, physical genocide and cultural genocide have been
employed as parallel strategies.


In this book I have explored the possibility that an active eugenics conspiracy has
existed behind the scenes for at least the past 130 years, and that genocide has been
occurring in Australia and Africa, in particular, for over 200 years. I have
assembled some pieces of a complex puzzle, one often obscured by euphemisms
and medical jargon, and there is much more work in this area to be done. Whether
or not AIDS is the result of a eugenics program, I have no doubt that disease
creation is a massive problem in the modern world and that medical graduates such
as myself have a responsibility to look critically at our own knowledge and
mistakes. I hope others will join me in the search for the true history of medical
science, so that we can use biological knowledge for health. For all people.

REFERENCES


1. Allen, T., Thomas, A. (1992) Poverty and Development in the 1990s. Oxford
University Press (in association with Open University): UK
2. American Psychiatric Association (1994) Diagnostic and Statistical Manual of Mental
Disorders. APA: New York
3. Arieff, I. (2001) The world in 2050: older, poorer and very crowded, The Age,
2.3.2001
4. Aronson, T. (1968) The Coburgs of Belgium. Cassell: London, UK
5. Ayana, R. (1988) AIDS: The True Story. Nexus: Australia
6. Baglin, D., Moore, D. (1970) The Dark Australians. Australia and New Zealand
Book Company: Sydney and Melbourne
7. Barnaby, F. (1976) preface to Medical Protection against Chemical Warfare
Agents. SIPRI: Sweden
8. Barnaby, F. (Editor) (1988) The Gaia Peace Atlas. Pan: London, Sydney
9. Barnaby, W. (1997) The Plague Makers: The Secret World of Biological
Warfare. Vision (Satin publications): UK
10.Beadle, G. (1958) The biological sciences, introduction to biology chapters in
Frontiers of Science. Allen & Unwin: London
11.Beckmann, R. (1999) Will we ever beat AIDS?, Helix, CSIRO: Australia
12.Blacker, C. (1952) Eugenics: Galton and After. Duckworth: London
13.Blainey, G. (1993) The Rush that Never Ended: A History of Australian Mining.
Melbourne University Press: Australia
14. Bloch, S., Singh, B. (1994) Foundations of Clinical Psychiatry. Melbourne
University Press: Melbourne
15.Bloch, S. (1998) Psychiatry, an impossible Profession? Australian and New
Zealand Journal of Psychiatry. RANZCP: Australia
16.Bone, P. (2001) We can make big business behave, The Age, 25.4.2001
17.Bostock, J., Jones, E. (1943) The Nervous Soldier. University of Queensland:
Brisbane
18.Breach, R. (editor) (1964) Documents and Descriptions in European History,
1815-1939. Oxford University Press: UK
19.Brook, S. (2001) Killer virus created in error, The Australian, 12.1.2001
20. Brook, S. (2001) Scientists’ horror as mice died, The Australian, 12.1.2001
21.Brook, S. (2001) Fluke achieves feat thought impossible, The Australian,
12.1.2001
22.Burne, J. (Editor) (1991). Chronicle of the World. Penguin: London
23.Burnet, F. Macfarlane (1978) Endurance of Life. Melbourne University Press:
Melbourne, Australia
24. Cantwell, A. (1997) Virus Wars: Does HIV Cause AIDS?, New Dawn, March-April 1997


25.Cantwell, A. (1998) AIDS: A doctor’s note on the man made theory. New
Dawn, Jan-Feb 1998
26. Cantwell, A. (1999) Chimps, Conspiracies and Killer Viruses. New Dawn, May-June 1999
27.Carson, R. (1962) Silent Spring. Penguin: UK
28.Casavis, J. (1955) The Greek origin of Freemasonry. New York City: USA
29.Casson, S. (1940) The Discovery of Man. Readers Union: UK
30.Chapman, L. (1980) Diamonds in Australia: The Fields and the Prospectors.
Bay Books: Australia
31. Clark, M. (1957) Sources of Australian History. Oxford University Press: UK
32.Clark, W. (1995) At War Within. Oxford University Press: UK, USA
33.Clerc, M. (1999) Editorial, MSF Newsletter, Issue 15, August 1999
34.Cooperstock, R., Hill, J. (1982) The effects of Tranquillization: Benzodiazepine
Use in Canada. Ministry of National Health and Welfare: Canada
35.Cornevin, M. (1980) Apartheid: power and historical falsification. UNESCO:
Paris, France
36.Court, J. (1997) The Puberty Game. Harper Collins: Australia
37.Crofts, N. (1997) ‘The consequences of our drug policies in Asia’. Drug Reform
News. Victorian Drug Reform Foundation, Spring 1997
38.Darwin, C.R. (1859) The Origin of Species. 1979 reprint by Avenel: USA
39.Darwin, C. (1958) Forecasting the future. Chapter in Frontiers of Science. Allen
& Unwin: London
40.Darwin, L. (1926) The Need for Eugenic Reform. John Murray: London
41.Davenport, C. (1914) The Eugenics Programme and Progress in its
Achievement, chapter in Eugenics: Twelve University Lectures. Dodd, Mead
and co: New York, USA
42.Dax, E.C. (1961) Asylum to Community. F.W.Cheshire: Melbourne
43.Dax, E.C. (1975) Australia and New Zealand, chapter in World History of
Psychiatry London: Macmillan
44.De Paoli, D. (1997) Was Darwin an Evolutionist or Just a Social Reformer? 21st
Century, Fall 1997
45. Dobzhansky, T. (1964) Preface to Evolution by R. Moore. Time Inc: USA,
Netherlands
46.Draper, E. (1965) Birth Control in the Modern World. Penguin: UK
47.Dubecki, L. (2001) Suburban mini-jails on the way, The Age, 25.4.01
48. Duesberg, P. (1996) Inventing the AIDS Virus. Regnery Publishing: Washington, USA
49.Duncan, A. (2001) Pox a timely reminder to step up controls, The Australian,
12.1.2001, News Limited: Australia
50.Eccles, J. (1965) The Brain and the Person (transcript of 1965 Boyer Lectures),


Australian Broadcasting Commission (ABC): Sydney, Aust.
51. Fannin, P. (2001) HIV vaccine may cause AIDS: study, The Age, 14.3.2001, Fairfax
press: Australia
52.Feacham, R. (1995) Valuing the past…investing in the future: Evaluation of the
National HIV/AIDS Strategy, 1993-94 to 1995-96. Commonwealth Department
of Human Services and Health: Australia
53.Fenner, F (1987) Frank Macfarlane Burnet 1899-1985, Historical Records of
Australian Science, Vol 7, No 1
54.Fosdick, R. (1952) The Story of the Rockefeller Foundation. Odhams Press:
London
55.Frank, M. (1937) Eugenics and Sex Relations for Men and Women. Books Inc.:
New York
56.Fraser, J. (2001) Mbeki swamped by an AIDS tidal wave, The Weekend
Australian, 27.1.2001
57.Gallo, R., Montagnier, L. (1988) AIDS in 1988. Scientific American: Oct 1988
58.Galton, F. (1869, 1892) Hereditary Genius: An Inquiry into its Laws and
Consequences. Macmillan: London, New York
59.Galton, F. (1909) Essays in Eugenics. Eugenics Education Society: London, UK
60.Garran, R. (2001) Bio-weaponry most feared tool of warfare, Australian,
12.1.2001
61.Garrett, L. (1994) The Coming Plague. Penguin: UK
62.Golden, F. (1999). The First Chimpanzee, Time, 8.2.99, Time Inc.: USA
63.Gould, S.J. (1983) The Panda’s Thumb. Penguin: UK
64.Green, C. (1998) Attention deficit hyperactivity disorder – clearing the
confusion. Modern Medicine of Australia, March, 1998. ADIS Press: Australia
65.Gross, A. (1956) Charles Joseph La Trobe. Melbourne University Press:
Melbourne
66.Haskell, A. (1942) Waltzing Matilda: A Background to Australia. Adam &
Charles Black: London
67.Haslem, B. (2000) ‘Mandatory laws draw federal fire’. The Australian,
15.2.2000, News Limited: Australia
68.Henderson, I. (2000) ‘Rate fears spark $15b share slide’. The Australian.
6.1.2000
69.Howells, J. (1975) Great Britain, chapter in World history of Psychiatry.
Macmillan: London (1975)
70.Hurst, L. (1975) South Africa, chapter in World History of Psychiatry.
Macmillan: London
71.Innis, B. (1995) Dengue and Dengue Hemorrhagic Fever, chapter in Kass
Handbook of Infectious Diseases: Exotic Viral Infections (Porterfield, J.,
editor). Chapman and Hall: London


72.Isselbacher, K. et al (eds) (1980) Harrison’s Principles of Internal Medicine ;
McGraw Hill: New York
73. Jonas, W. (2000) Mandatory training in crime and despair, The Australian, 16.2.2000, News Limited: Australia
74. Joske, P. (1967) Australian Federal Government. Butterworths: Australia
75. Kandel, E. et al (eds) (1995) Essentials of Neural Science and Behavior. Prentice Hall: USA
76. Kelly, B., Raphael, B. (1998) HIV/AIDS and Psychiatry. HIV/AIDS: A developing issue for general practitioners. Glaxo Wellcome: NSW
77. Kerin, J. (1999) Biological Weapons from Genetic Research. Australasian Science, September 1999
78. Knoop, D., Jones, G. (1944) The Scope and Method of Masonic History. Oldham (for Manchester Association for Masonic Research): UK
79. Knoop, D., Jones, G. (1941) Begemann’s History of Freemasonry, printed for private masonic circulation, bound by Melbourne University in Collection of pamphlets on Freemasonry (1956)
80. Knoop, D. (1941) The Genesis of Speculative Freemasonry, ditto
81. Knoop, D. (1945) University Masonic Lodges, ditto
82. Knoop, D. (1946) The Wilkinson Manuscript, ditto
83. Kotulak, R. (1996) Inside the Brain. Universal Press: USA
84. Lemonick, M. (1997) The mood molecule. Time, Nov. 1997
85. Lintzeris, N., Dunlop, A., Thornton, D. (1996) Getting through Amphetamine Withdrawal. Turning Point Alcohol & Drug Centre: Melbourne
86. Lyons, J. (1999) Defence: our policy revealed, The Bulletin (August 1999)
87. McCormick, J., Fisher-Hoch, S. (1996) Level 4: Virus Hunters of the CDC. Turner Publishing: Atlanta
88. Macfarlane Burnet Centre (1997) Annual Report 1996-97. MBC: Melbourne, Australia
89. Macfarlane Burnet Centre (1998) Annual Report 1997-98. MBC: Melbourne
90. Macfarlane Burnet Centre (1999) Annual Report 1998-99. MBC: Melbourne
91. Martin, H., Schumann, H. (1997) The Global Trap. Zed Books: London, New York
92. Mental Health Research Institute (1997) Annual Report 1997. MHRI: Melbourne
93. Mental Health Review Board of Victoria (1998) Annual Report 1997-98. MHRB: Melbourne
94. Moore, R. (1964) Evolution. Time-Life: Netherlands, USA
95. Moynihan, R. (1998) Too Much Medicine? ABC Books: Sydney
96. Murdoch Institute (1996) 1996 Annual Report. Murdoch Institute: Melbourne
97. Murray, P., Wells, J. (1980) From Sand, Swamp and Heath: A History of Caulfield. City of Caulfield: Melbourne
98. Nathan, P., Burrows, G., Norman, T. (1997) “Melatonin: is it a marker of depressive illness?” in Aust. J of Med. Science, Vol. 18, May 1997
99. Nolch, G. (1999) Editorial, Australasian Science, September 1999
100. Nossal, G. (1978) Nature’s Defences. Australian Broadcasting Commission (ABC): Sydney, Australia
101. Pilger, J. (1989) A Secret Country. Cape: London
102. Popenoe, P., Johnson, R. (1920) Applied Eugenics. Macmillan: USA
103. Reiter, R. (ed) (1984) The Pineal Gland. Raven Press: New York
104. Rintoul, S. (1993) The Wailing: A National Black Oral History. Heinemann (Reed): Australia
105. Roach, M. (1998) Why Men Kill, Discover, Vol 19, 12: 100-108
106. Rogers, H. (1968) The Early History of the Mornington Peninsula. Hallcraft: Melbourne
107. Roitt, I., Brostoff, J., Male, D. (Editors) (1996) Immunology. Mosby: UK, USA, Mexico, Spain, Italy, Korea, Singapore, Japan, Australia
108. Sanford, J. (1980) Arbovirus Infections, chapter in Harrison’s Principles of Internal Medicine (Isselbacher, et al, editors). McGraw Hill: USA
109. Sargent, S. (1944) Great Psychologists. Doubleday: USA
110. Sargant, W. (1957) Battle for the Mind. Pan: UK
111. Saleeby, C. (1921) The Eugenic Prospect: National and Racial. T. Fisher Unwin: London
112. Senewiratne, B., Thambipillai, S. (1974) Pattern of Poisoning in a Developing Agricultural Country, British Journal of Preventative and Social Medicine, Vol 28, No 1, February 1974
113. Senewiratne, B., Hettiarachchi, J., Senewiratne, K. (1974) Some Problems in the Management of Anaemia in Tea-estate Workers in Sri Lanka, Journal of Tropical Medicine and Hygiene, Vol 77, pp. 177-181, August 1974
114. Senewiratne, R. (1997) Psychiatric Tales and Words About Life. (self-published)
115. Senewiratne, R. (1999) Eugenics and Genocide in and from Australia. (self-published)
116. Senewiratne, R. (2000) The Politics of Schizophrenia: Dodging the Inquisitors and Curing Delusions. (self-published)
117. Senewiratne, R. (2000) Music and the Brain: the therapeutic use of Music
118. Shaw, M. (1999) Federal lawyers reject Howard line on heroin. The Age, 18.12.99
119. Shorter, E. (1997) A History of Psychiatry. Wiley and Sons: Toronto
120. Stone, D. (1974) Gold Diggers and Diggings. Lansdowne: Melbourne
121. Sturtevant, A. (1958) The Genetic Effects of High Energy Irradiation on Human Populations, chapter in Frontiers of Science (Hutchings, E., editor). Allen & Unwin: London, UK
122. Sumich, H., Andrews, G. (1995) The Management of Mental Disorders, Vol. 2. World Health Organization: Sydney, Australia
123. Suzuki, D. (1990) Inventing the Future. Allen & Unwin: Australia
124. Stark, D. (1997) UN horror at AIDS spread, Herald Sun, 28.11.97
125. Terry, D. (2000) Vaccination counselling: dispelling the myths. Current Therapeutics, Dec 1999-Jan 2000
126. Thomas, H. (1997) The Slave Trade. Macmillan: London
127. Toy, M. (1999) Officials siphon gifts to cancer hospital. The Age, 18.12.99
128. Tucker, A. (1994) Too Many Captain Cooks. Omnibus (Ashton Scholastic): Sydney, Australia
129. United Nations Population Division (1998) Revision of the World Population Estimates and Projections. UN Department of Economic and Social Affairs: New York
130. Vankin, J., Whalen, J. (1996) The 60 greatest conspiracies of all time. Citadel: USA
131. Wodak, A. (1997) After the scuttling of the heroin trial. Drug Reform News. Melbourne: Victorian Drug Reform Foundation, Spring 1997

APPENDIX: POLITICAL DIAGRAMS
1. The politics of schizophrenia
2. Political connections of the Mental Health Research Institute
3. 'Psychiatry' disease promotion in Australia
4. Analysis of propaganda from Australian Correctional Management
5. Rockefeller corporation – biowarfare connections
6. Biological & chemical warfare industry in Melbourne, Australia
7. The recycling of blood
8. The politics of AIDS
9. Summary of MBC International Health Unit programs
10. Summary of MBC International Health Unit programs (cont.)

