BEYOND VEGETARIANISM


Compiled for www.beyondveg.com
Reports from veterans of vegetarian and raw-food diets, veganism, fruitarianism, and instinctive eating, plus new science from Paleolithic diet research and clinical nutrition.

CONTENTS

Assessing Claims and Credibility in the Realm of Raw & Alternative Diets
The Raw Vegan Pledge
Comparative Anatomy & Physiology Up to Date
Looking at Ape Diets: Myths, Realities, and Rationalizations
The Fossil Record: Evidence about Human Diet
Intelligence, Evolution of the Human Brain, and Diet
Comparative Anatomy
Insights about Human Nutrition & Digestion
Health and Disease in Hunter-Gatherers
Paleolithic Diet vs. Vegetarianism
Dietary Macronutrient Ratios & Their Effect on Biochemical Indicators of Risk for Heart Disease
Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?
Metabolic Evidence of Human Adaptation to Increased Carnivory
Longevity & Health in Ancient Paleolithic vs. Neolithic Peoples
Is Cooked Food Poison?
Does Cooked Food Contain Less Nutrition?
What Every Vegan Should Know About Vitamin B12
The Calorie Paradox of Raw Veganism
Reality Check
Functional and Dysfunctional Lunch Attitudes
Satire
Selected Myths of Raw Foods
Troubleshooting: Avoiding & Overcoming Problems
References

Assessing Claims and Credibility in the Realm of Raw & Alternative Diets
Who Should You Believe?
by Tom Billings

The purpose of this paper is to address the following questions:

• What criteria can be used to evaluate individuals ("experts" or "diet gurus") making dietary and health claims--particularly when attempting to assess the credibility of information that either lacks scientific documentation, has not yet been well researched, or remains controversial even where it has been investigated to some degree?

• Why should you assign any more credibility to the information provided on this site than to information provided by those we characterize as dietary extremists?

Gaining a sense of the issues involved in such concerns about credibility--and about what to believe--is important, as anyone will know who has ever faced the bewildering array of choices about what dietary approach to follow in their own life. An immediate, but not fully satisfying, answer to the question of this site's approach is to describe it as: an attempt to be realistic; frank and honest about the shortcomings and difficulties of alternative diets; moderate, common-sense, and sane; and open to new information (even when it contradicts established beliefs of the past). These are qualities frequently lacking in the approach of dietary extremists.

The context: When science doesn't have the answer to every question

The role of scientific information
Before delving deeper into the question of credibility in the alternative dietary world in general, it is appropriate to examine the context in which the question arises. For some issues in the raw and vegan world, there is ample, relevant scientific research that addresses the issue(s) and provides an answer. Sample issues of this type are:
• Is the diet of the great apes completely fruitarian and/or vegan?
• Are humans natural vegans, and/or did humans evolve as vegans or fruitarians?
• Is modern cultivated fruit "natural"?
• Does fruit have a nutritional composition similar to mother's milk?

Some of the material on this site addresses the specific questions above, among others. (By the way, the answer to all the questions listed immediately above is "no," although the answer to the third question may vary depending on your definition of the word "natural.") The above issues, and other issues for which there is adequate scientific research, are important because they provide an effective gauge for determining if an "expert," in general, accepts reality or is in denial of reality.

The role of anecdotal information
However, many important issues in the raw and vegan world have not been the subject of peer-reviewed, published, quality scientific research. Sample issues in this category are:
• How well does the diet work in the long term?
• For which health conditions are (specific) raw diets a good short-term therapy, and for which conditions would it be better to avoid raw diets?
• What does it mean that so very few succeed long-term on 100% raw vegan diets? And which is the better alternative: a steady 50-75% raw diet, or an on/off pattern, i.e., switching from 100% raw to much less than 100%?
• As behavioral problems appear to be common on 100% raw diets, what are the mental-health effects of long-term, 100% raw vegan diets?

For such issues, most evidence is anecdotal: personal experience and the experience of others. Hence the question of primary interest here is--when adequate science is not available--whose anecdotal evidence to consider seriously: the evidence presented on this site, or the idealistic views promoted by dietary extremists? That is the central issue addressed in this article.


Structure of this article
The information and evaluative approaches we'll cover are presented in several major sections:
• The first section provides a contrast between the views found on this site and some of the extreme views one may encounter in the raw and/or vegan movements. The objective of the contrast is to help in assessing the credibility and usefulness of information offered by the various diet gurus or advocates one might encounter.
• The next major section discusses a tool that can be useful in analyzing whether a promoted diet is credible, or incredible.
• Then a section deals with the often irrational, hostile behavior of dietary extremists, and discusses the implications such behavior has regarding credibility.
• Finally, the closing sections discuss how discarding dietary dogma can be not simply of practical benefit on the road to better health, but, just as importantly, a personally liberating experience as well.

Most of the material is presented in summary form to minimize redundancy with other articles on this site. Site articles that provide related information include Idealism vs. Realism in Raw Foods and Selected Myths of Raw Foods.

TABLE OF CONTENTS

• Examining extremist attitudes and dogma in raw veganism
  o Convenient and contradictory attitudes toward science and evidence
  o Logical gaps, internal contradictions, and leaps of faith
  o Signs of emotional instability and denial of reality
  o Appeals based on the "cult of personality": authoritarian absolutism, worship of "success" figures/gurus, trashing of "failures" and dissidents
• Examining the personal diets of raw vegan gurus: a potential tool for assessing credibility
  o Does the diet really add up? (Calories don't lie)
• Raw vegan extremist behavior patterns: the darker side of rawism
  o Introduction: rose-colored glasses vs. the unpleasant realities
  o Some raw/fruitarian "experts" are unable to function in a civil manner
  o Threats, bullying, and harassment: another extremist tactic
  o Does your diet guru promote "dietary racism"?
  o Fruitarian plagiarism: a sign of moral bankruptcy among extremists and "tolerant" followers
  o In Bad Faith: dishonest information and debate tactics in raw foods--personal experiences
• Skepticism in raw foods vs. the "golden cage" of dietary dogma
  o Credibility is a continuum
  o Skepticism can spark personal liberation
  o A special message about, and for, extremists
  o Is your "ideal" dietary dogma really a sandcastle?


Examining extremist attitudes and dogma in raw veganism

Convenient and contradictory attitudes toward science and evidence
Examine the position of the extremists on issues for which there is significant scientific evidence. Do they acknowledge reality, or are they in denial? Extremists often display a general pattern of picking and choosing when to accept scientific evidence, and when to reject it. Typically, science (i.e., specific papers or findings) will be accepted by such individuals (and even promoted) when it assists in promoting their idealistic dietary dogma. But it will also be rejected as invalid, e.g., via such mindless slogans as "science is cooked," or via claims that the results are invalid because they are for "mixed diets," whenever science debunks their dogma. Additionally, extremists may display an unreasonably aggressive anti-establishment mindset, equating establishment science with evil, and thereby selectively rejecting specific research for political (rather than scientific) reasons. Examples of this are as follows:

• Ignoring the extensive evidence from published field-observation studies of the great apes, which shows that at least a modicum of animal food is a normal part of most ape diets. Instead, the evidence is rejected on the grounds that the extremist believes the apes are "in error," that the behavior is "maladaptive," or on other flimsy, logically invalid rationalizations. Who knows better what the natural diet of the apes is: the wild apes eating animal foods to survive, or an irrational dietary extremist whose reason is clouded by idealistic dogma?

• Rejecting evolution as a scientific theory and adopting creationism, solely because evolution (and the fossil record covering millions of years of evidence) shows that humans are natural omnivores/faunivores, and have been ever since the inception of the human genus, Homo, over two million years ago. Adopting creationism for sincere spiritual reasons (i.e., not related to dietary dogma) could perhaps be regarded as legitimate in some sense when limited to the sphere of religion; however, adopting it solely because evolution debunks your dietary dogma is a form of intellectual dishonesty.

• Rejecting the published, peer-reviewed scientific research of Victor Herbert, M.D., simply because one disagrees with Herbert's political efforts against alternative diets and medicine. Even if one disagrees with Herbert's politics and goals, his research should be evaluated independently of his political views. (Herbert has published several significant papers on vitamin B-12, a topic of great interest to vegans.)

• Demanding hard scientific proof (as a debating tactic) when only anecdotal evidence is available (e.g., when others criticize the diet), while accepting as scientific fact (and, in effect, as "holy gospel") some of the unscientific nonsense promoted by "experts" of the past--e.g., T.C. Fry claiming wheatgrass juice is toxic, Herbert Shelton claiming that cooking makes all minerals inorganic, Herbert Shelton claiming all spices are toxic, etc. (The dishonest debating style of a fruitarian "expert" who uses this tactic is discussed in a later section of this paper.)

• Selectively quoting scientific research papers out of context, or misrepresenting the results of research, to promote their diets. The raw and veg*n communities have the dubious (but mostly appropriate) reputation of tolerating and (in the case of certain extremists) promoting crank science and junk science. An interesting exercise one can perform is to examine the references cited in rawist writings. You will find that the references fall into a number of categories, as follows:

  o References to other rawist books (non-scientific, of course). The point here is that much rawist "research" consists of a small group of people quoting each other in a logically circular fashion.

  o References to newspapers, magazines, and popular books; i.e., citations that are neither peer-reviewed scientific or professional journals, nor even non-peer-reviewed but otherwise relatively well-informed publications by legitimate scientists.

  o References to peer-reviewed scientific journals. Such citations are scarce, but are worth checking when they appear, because you may find, if the writer is an extremist, that he/she has misrepresented or misinterpreted the material in the cited paper.

The primary point of the above list is that much of the available writing on raw diets consists of individuals referencing each other in circular fashion, or is based on non-technical sources. Further, when actual references to scientific journals are provided, you may be surprised to find that even in these instances the material is being misused, particularly if the writer is a hardened fanatic.

A consistent pattern of misuse is an important consideration. Finally, a note of caution here: there is a certain level of subjectivity inherent in language. Thus, a few instances of a person appearing to misinterpret research papers are not proof that the person is deliberately misusing the material. Instead, you need to observe a long-term pattern of misuse before concluding that the writer might be an extremist or otherwise untrustworthy. Of course, whenever such a pattern of misuse is present, it points to one or more of the following: intellectual dishonesty; chronic sloppiness and lack of due diligence (which often go hand-in-hand with intellectual dishonesty); or serious prejudice on the part of the "expert."

The pattern of promoting science when convenient, and ignoring or denouncing it when inconvenient, raises further questions regarding the credibility of true-believing dietary promoters. The above examples further illustrate that such individuals are in denial of reality, an apparent occupational hazard for those with extreme views.

Logical gaps, internal contradictions, and leaps of faith

Extremists often have inconsistent views on the "proper" use of the human intellect.
You might encounter extremists who want you to: (1) use your intellect to agree with them that the raw vegan diet is the optimal, "natural" diet you should follow; then, (2) discard all of your intellect, even the most basic human tool-making skills, and limit your food choices to fruits and leaves, which are (falsely) claimed to be the only food choices of mythical "naked apes" (humans) without tools. Needless to say, the above "logic" is really inconsistent nonsense. Even chimps use tools; should we humans voluntarily take ourselves lower than the chimps? The naked-ape hypothesis, presented by some as self-evident or "revealed truth," is so ridiculous that it might be hilarious, except that some people actually believe it. (By the way, naked apes without tools are not limited to eating only fruits and leaves: one can collect a wide variety of animal products--eggs, insects, snails, etc.--in that manner.) The extremist view of the human intellect is similar to their view of science: use it when it helps to promote dietary dogma; ignore it at other times. Again, this reflects an underlying lack of rigor--a failure to pause and think through the evidence and logic behind their views--and thus a parallel lack of credibility.

Examine extraordinary claims using basic common sense and logic. This provides an excellent credibility check.
In other words, if you stop long enough to think carefully and critically about what those touting some exclusive "ideal" diet say, you will immediately reject many erroneous extremist positions. (By the way, we encourage you to use logic and common sense when evaluating the material on this site as well. We make no claims to infallibility, and welcome civil, coherent comment and criticism.) Some examples of extremist views that contradict logic and/or common sense are as follows.

• Some promote the idea that sweet fruit is the "ideal" food for humans because it is supposedly very similar to human mother's milk.
This is of course nonsense; the nutritional compositions of milk and sweet fruit are quite different. (See the article Fruit Is Not Like Mother's Milk on this site for a comprehensive analysis and breakdown of the nutritional components of both.) The credibility issue is relevant here on simple logical grounds, when those promoting such crackpot mal-nutritional theories turn around and denounce all milk (except human mother's milk) as unnatural and a bad food. Why? Consider the logic: if a food is supposed to be good because its composition is like human mother's milk, then goat's milk, or even cow's milk, is a better food than fruit, as its composition is "closer" to human milk than fruit's is. Conversely, if these other animals' milks are unnatural and bad, then sweet fruit is even worse, because it is even less like mother's milk. Here, simple common sense and logic show how poorly thought-out this lopsided theory is--another reason to question the credibility of those promoting it.

• In the case of protein (and perhaps fat/starch as well), one can find crank science: crackpot nutritional theories alleging to prove that these items are "toxic" or harmful. Meanwhile, there are such things as essential fatty acids and essential amino acids, the lack of which is known to cause deficiency diseases and symptoms. Consider the credibility of any extremist who tells you that almost everyone on the planet (except them and their followers) has a diet of mostly "toxic" foods, and that food components scientifically proven to be nutritional requirements are supposedly "toxic."

One can find "passionate" rawists claiming that the world's problems are caused by cooked-food consumption, and that if we all become raw vegans, there will be world peace, health, and happiness.
Hmmm... world peace, health, and happiness--all will come from what we put on our lunch plate? That's an awful lot to expect from what is on your lunch plate! If you have any doubts regarding the above, ask yourself: Was World War II an argument over, say, the recipes for egg salad and rice pudding? Has there ever been a war fought specifically over cooked food (vs. raw)? Do people rob, rape, or murder other people specifically because they eat cooked food? The answer to each of the preceding questions is a very clear "no." Is it really even believable on a more indirect level that the cooking of foods might affect the mentality of human beings to such a degree that those eating cooked foods would somehow be psychologically more warlike than those eating nothing but raw foods? In sharp contrast to such unsupported suppositions, and the more peaceful cooked-food eaters actually encountered online, this writer has been the target of hostile, vicious personal attacks (and threats) by allegedly "compassionate" vegan/raw-fooders. If anything, the extensive personal experience of this writer (with extremists) suggests that following a 100% raw vegan diet may make one less peaceful, and more hateful and aggressive. So much for claims of a new, raw "Garden of Eden." To summarize: How credible are those who claim the world will become an Eden if only we all become (100%) raw vegans? The question is particularly relevant to individuals and groups who have a negative, hostile, aggressive style.


How credible are claims that nature will ABSOLUTELY follow the simplistic dogma and bogus "laws of nature" promoted by the dietary extremist?
Nature is not limited to the narrow, absolute, simplistic models promoted by so-called "experts," who are often nothing more than idealistic dietary promoters in denial of reality. Nature does not care what we think of it, or what our models or "laws" say: Nature simply IS, and it is not bound by our simplistic ideas and delusions. Nature also does not care how "nice" or "logical" our theories appear to be. Logic alone cannot prove a theory; theories about nature must also agree with evidence--that is, with reality. Don't you think that dietary advocates who claim to know everything about nature--who claim that nature is limited to, and perfectly explained by, their delusions of simplistic, absolutist "laws"--are merely expressing their own extraordinary arrogance and ignorance? Nature has many mysteries--that much is obvious to those who are not deluded by simplistic, idealistic dietary dogma. We obviously do not (not yet, at least, and not for the foreseeable future) perfectly understand nature or its "laws." If even science is still plumbing the unknown, how much less can oversimplistic, dogmatic pronouncements about nature adequately reflect its operation?

Some "experts" make unrealistic claims about their diets: That it will bring perfect health, a perfect body, and/or will cure any/all diseases, and so on. Are such claims credible, or simply incredible?
Ask the proponent to give you a clear, testable, non-trivial definition of perfect health/perfect body (which, by the way, should be interesting in and of itself). Then, as a way to test the credibility of those who make such claims, ask them to sign a legally binding contract along these lines: You agree to follow their diet for a period of, say, 3 years. At the end of that time, if you have perfect health/a perfect body, and/or your illness is cured, you pay a fee of $1,000. However, if at the end of that time you are not cured, or you do not have perfect health, then the "expert" is legally obligated to pay you $1,000.

Such a dietary promoter will probably refuse such a contract, making the excuse that they cannot verify your compliance with the diet. But, of course, the same concern applies to them: raw-food advocates, particularly the fruitarian wing, are notorious for not following the diet they promote, for binge-eating, and for the questionable honesty of certain extremists. If such individuals are as absolutely certain about the efficacy of the diet as they say they are, they should be happy to make an easy $1,000. But they won't--because even in their deep denial, they know that the diet they promote, when put into practice in the real world, often does not work "as advertised." Their refusal to risk paying a meaningful penalty if the diet fails reveals as much.

Does the proponent claim his or her diet is based on, or uses, "eternal health principles"?
If so, ask the advocate for a list of these eternal principles. Is there any scientific proof for at least some of them, or do they have, in effect, the status of revealed principles that are supposedly "self-evident"? If they are revealed principles, are they presented explicitly as religion (potentially legitimate, at least when confined to that level) or as science (not legitimate)? A "health science" is different from a religion that has diet or health teachings. If the extremist cannot tell the difference between the two, what does that suggest about his/her credibility?

Does the dietary "expert" retreat into convoluted pseudo-reason to support their claims?
This is readily apparent when an obvious explanation--the diet does not work as well as claimed--is a realistic answer under the circumstances. Meanwhile, the "expert" will come up with fanciful explanations for why it does not work, explanations which may have some surface logic but which break down when examined for the plausibility and probability that good alternative explanations must have. (Examples of this phenomenon are given below, in the discussion of how extremists react when told their ideal diet does not work.)

Signs of emotional instability and denial of reality
Those who promote (obsessive) fear as a motivation for their "ideal, perfect" diet are promoting eating disorders (mental illness) rather than a healthy diet.
One does not have to go far in the raw vegan movement to find people promoting rawism using pathological fear as a major motivating factor. In particular, the following fears are often actively promoted: fear of cooked food, fear of protein, fear of mucus. Fear, if it becomes strong enough (which can happen if one believes the "party line" strongly enough), can turn a raw diet into an eating disorder similar to anorexia nervosa. Fear is not the basis for a diet; it is, however, the basis for an eating disorder.

Fear is an unsuitable motivation for a diet because of the central role of diet in our lives. There is a difference between rejecting a food based on calm reason, and rejecting a food from an emotional stance based on fear or other negative emotions. If you habitually base food decisions on negative emotions, those negative emotions become a part of your psyche. Another way to say this is that if your diet is motivated by fear, you are effectively bringing fear to the table when you eat. In this way, you are placing fear at the very center of your life. This is a very bad idea, as habitual fear is a toxic emotion.

Some readers might claim that fear of unhealthy foods is in fact a good motivation because it will keep you away from bad foods. However, if you think about it, you will reject such reasoning as false. Consider the common example of junk food. You, personally, choose whether or not to eat junk food. There are no gangs roaming the streets forcing people to eat junk food at gunpoint. The government does not send police or soldiers into your home to force you to eat junk food. Not only is junk-food consumption your free choice, but unless you make the food yourself, you have to go out and pay money for it in order to eat it. Since junk-food consumption is a free choice, you can simply recognize that it is not good for you and, without any fear, choose not to eat it. Instead of feeling fear or other strong negative emotions when you see (or think of) junk food, there is the calm, rational knowledge that it is not good for you, so it is best to avoid it, and you simply choose to eat other (better) food instead. Your reaction here, and your food choices, should be based on calm reason, not on the implicit promotion of paranoid/pathological fear (or on extremist slogans that implicitly promote fear).

To summarize this point: Those who advocate a diet based on the implicit promotion of pathological fear (by "demonizing" certain foods or components of foods, e.g., protein) are promoting eating disorders, a type of mental illness. Are such "experts" credible? Also consider: Will promoting fear really make the world a better place?

When told that their "ideal, perfect, natural diet" is not working for you, how does the extremist react? What does their reaction say about them, and their credibility?
Extremists often retreat into rationalizations and excuses: they may say it's your fault, that you are not "pure" enough yet, that you need coaching, or that you need more faith/positive thinking to succeed, among other responses. Alternately, such individuals may speculate that the cause of the failure is non-optimal living conditions, or other non-diet factors. Let's evaluate these responses.

• Apply common sense. If you are attempting to follow a raw vegan diet which some claim (in varying degrees) is the "perfect, ideal" diet, the diet you are "designed or created for," why would you ever need much coaching, faith, or positive thinking to succeed on it? Many other people go through life on a diet of meat and potatoes, and they don't need coaching or faith to succeed (to the degree that they do) on such a diet. Does the cow need coaching, faith, or positive thinking to thrive on a diet of grass? Does the tiger need coaching, faith, or positive thinking to thrive on a diet of sambar (deer) meat? Of course not--they are eating their natural diets.

• Only very few thrive in the long term on raw vegan diets: What does that suggest about the claimed naturalness of such diets? According to the extremists, the raw vegan diet is supposedly your natural, "perfect" diet. If that were true, then one would expect that, after a transition (some weeks or months, but not years), the raw vegan diet should begin to work extremely well for you, and for others following the same dietary path. However, if you talk to very many of those trying the diet, you will quickly learn that such diets often don't work as well "as advertised." More precisely, raw vegan diets may be good short-term healing/cleansing diets, but they are frequently problematic in the long term. The numerous excuses offered to explain this away are simply a type of hand-waving: an effort to distract you from the fact that the diet is apparently not working for you, by suggesting that you are being impatient. What is actually happening here, however, is that the extremists simply cannot, and will not, admit the truth--that raw-food vegan diets may be good for some, but not for everyone, and that they are not our natural diet per se, but a narrow restriction of our natural, evolutionary diet.

• There is an additional danger presented by the claims one often hears that detox may take many years: You may develop actual nutritional deficiencies, or disease, and ignore it because you think it is detox.
This belief has led some raw-fooders to actual harm. The idea that detox takes a very long, indefinite time period is also used by zealous dietary advocates as an unfalsifiable tool to explain away bad results on the diet: No matter how long you detox, it is never enough! Further, the fact that you are not getting good results is, to them, proof that you need further detox (note the circular logic here). Additionally, the idea that it takes several years for the body to detox enough to get significant results places severe limits on the "self-healing is the only healing" myth that some promote.

Further, by blaming you if the diet does not work for you, such blindly emotional advocates are telling you something else as well: namely, that dietary dogma is more important to them than you--a person--are!

Some dietary oracles effectively worship dietary teachers of the past: T.C. Fry, Herbert Shelton, Arnold Ehret, and others.
Some of the teachings of such individuals are worthwhile, but others are outdated or simply incorrect. The idea that "Shelton said it, so therefore it is TRUTH" is narrow-minded, and promotes authoritarianism rather than independent thought. In the field of natural hygiene, the American Natural Hygiene Society is updating and revising, at least in part, some of Shelton's teachings to reflect new information. This is necessary--growth and change are a part of life; when they are absent, there is stagnation. Within the natural hygiene movement, however, there is now a major division as a result: those who want to keep natural hygiene current with new scientific findings vs. those who view any attempt at change as a heretical attack against the movement's grand synthesist (Shelton). The attitude of modern disciples or dietary missionaries toward the teachers of the past provides some insight into whether a person clings to old, outdated dogma or actively investigates reality.


The desire for absolute certainty or "eternal" revealed truths in the field of diet and health is an earmark of religion rather than science.
Science is not just an end result, but an ongoing process of change and a willingness to reevaluate old beliefs (theories) in the light of new information. The missionary who presents unchanging, absolute, simplistic "laws" regarding diet and health is really presenting his or her false religion as scientific fact. Perhaps such individuals are simply religious fanatics out of place. That is, their diet has become their religion, and they are fundamentalist zealots, out to save the world from the "evil demons" of cooked food and/or animal foods. (If that is the case, how and when did cooked foods and animal foods become demons?) Note: The above is not intended as a criticism of religion; people have the right to choose their religion. Instead, it is intended to point out that some extremists may be promoting their "dietary religion" as science, and doing so in a negative, fanatical manner.

Another trait of extremists is that they tend to promote themselves as the proof, or at least as prime examples, that their diet works and is "best."
Yet given the reality that very few thrive on such diets in the long term, the claim by extremists that "it works for me, hence it can and will work for anybody/everybody" is very dubious, and in any case based on invalid logic. The claim is that a few exceptions (such as themselves) somehow prove a general rule. However, exceptions are used in reality to illustrate how the exception differs from the general rule, i.e., to better clarify the general rule itself: what cases it covers, what it doesn't, and so on. In other words, exceptions do not prove a rule; if they did, they would be bona fide examples of the rule, rather than the exceptions they actually are. An example may clarify the preceding. To be frank, the experience of the writers on this site regarding the few supposedly successful fruitarians, the "exceptions," is that such claims are often illusory or dubious, and usually fail (eventually) to hold up for one or more of the following reasons:

o The person has been on the diet only a very short time--a few weeks or months. This is very common among wanna-be raw "personalities." Even a year or two or three is really not a long enough time to make definitive assessments about the long-term workability of a diet. Riding high on a wave of (perhaps understandable) enthusiasm, the individual may offer themselves up as an example, speaking out long before they have accumulated--or had a chance to observe in others--much noteworthy experience. Even when they do hear of others' less-than-exemplary results, blinded by their own initial flush of short-term success, the poor track record of others on the diet often simply fails to register, or is infinitely rationalized.

o The person claims to be a fruitarian, but uses a different definition of fruit and/or fruitarian than the one used here, i.e., their diet is less than 75+% fruit, or they use a broader definition of the word fruit. (Often so broad--nuts, seeds, leafy greens, etc.--that their "fruitarian" diet is in some cases little different from that of a raw-foodist in general.)

o The person actually eats a more varied diet than they claim, i.e., their actual diet is different from the claimed diet, and/or they binge-eat, cheat, or make "exceptions" to their nominally 100% raw and/or "fruitarian" regime.

o The person's physical health might not be as good as claimed. (T.C. Fry defending fruitarianism while he was on his deathbed comes to mind here--see "The Life and Times of T.C. Fry" in the Nov. 1996 issue of the Health & Beyond newsletter for details.)

o The mental health or psychological balance and well-being of the fruitarian may be questionable (e.g., hateful fanaticism, denial of reality, extreme emotional fragility--just challenge their diet and you may see this firsthand).

Obviously, success claims made when any one of the first four of the above five conditions exists automatically invalidate themselves, and claims by those exhibiting the behavior outlined in the final condition have to be considered highly suspect. (Often, under closer questioning, one finds that one of the other conditions turns out to apply anyway.) Here, the heavily self-oriented and self-promotional emphasis on the guru's own success (or that of a very few associates) is proffered as the ultimate justification for their diet. Again, this simply illustrates another facet of the often extreme subjectivity (and consequent lack of concern for a more balanced attempt at objectivity) that typifies the lack of credibility of such advocates.

Extremists may characterize some of the contributors to this site as "failures," because all of the principal writers here eat limited amounts of cooked food, and some of us have experienced problems with raw diets.
Horrors! We eat some cooked foods! We are impure! Call the dietary purity police right now to report this terrible crime! ;-) And while you're at it, also call a mental health professional to discuss how dietary purity has somehow become a critical, but warped and dysfunctional, part of your personal self-identity. :-) Reminder: The principal writers on this site have long experience with raw diets. In many cases, we have more experience than some of the prominent extremists.

Have you noticed just how rare strict long-term 100% raw vegans are?
How very few people have been on the diet more than a few years full-time, and without any binges, cheating, "exceptions," or "backsliding"? Have you searched for ex-fruitarians or ex-rawists, and found them to be more plentiful and much easier to find than those currently on the diet? The point of these questions is to illustrate that fruitarianism rarely, if ever, works in the long term. (Note that other raw diets tend to work better long-term than fruitarianism. Also, raw diets that are less than 100% raw often work better than 100% raw--surprising but true.) Those who have followed the diet (relatively strictly) continuously for only a short term--less than, say, 5 years (a description of some of the extremists advocating fruitarianism)--cannot realistically call themselves successes. Is someone on the diet for a short term a credible example of success?

There is a more serious side to the issue of success, a side that deserves discussion. The argument may be made that "fruitarianism did not work for you, but it works well for me" (where me = a fruitarian advocate).
The problem is, as discussed above, that if one closely examines some of the "role models" of fruitarianism, one often finds, in time, that many of them display eating-disorder behavior (binges and/or lying about eating), or other signs (in my opinion) of possible mental imbalance. So, when someone says that "fruitarianism works for me," it can be very hard to believe. Somewhere there may be credible fruitarians who follow the diet very strictly for very long periods, and who are physically healthy and mentally balanced, but I have yet to meet any. To be complete, I should say that I have met a very few non-strict fruitarians who appear to be relatively healthy. Perhaps the key to success on a fruitarian diet is to not be so strict, and to include a variety of foods in the diet?


Further, there is something else here that should be obvious. When an extremist tells you their raw diet is "ideal," and derogatorily brands someone else (e.g., any of the contributors to this site) a failure or backslider because the diet didn't work for them, you should immediately conclude that if the diet does not work for you, then YOU will be branded a failure as well.
Think about that, if you are considering trying fruitarianism, and you are looking for a teacher or mentor. This type of behavior, when apparent, is about the clearest evidence one could ask for to confirm that such a guru cares more about their dogma than the health of their followers.

Examining the personal diets of raw vegan gurus: a potential tool for assessing credibility
DOES THE DIET REALLY ADD UP? (CALORIES DON'T LIE)

Looking closely at an individual's self-reported diet may be a powerful technique, at least in some cases, for assessing the credibility of a raw vegan "expert" or "diet guru." However, in attempting to use such information for assessment purposes, one faces the central problem with the raw "experts": the reality that what they claim to eat is frequently not the same as what they actually eat. In other words, self-reports of diet (especially from extremists) may be unreliable, particularly as such reports may conveniently omit details of "exceptions" or binges.

Compare calorie requirements vs. intake. However, even rough estimates of the calorie levels of self-reported diets may give some insight into the person's credibility. The site article The Calorie Paradox of Raw Veganism provides comprehensive information that can help you evaluate the diet of certain "experts." The general evaluation procedure is as follows.

First, using the person's age, weight, gender, and claimed level of physical activity, make a rough estimate of the daily calorie requirements for the individual in question. Standard sources that one can use to derive estimates of calorie expenditure include NRC [1989, pp. 24-38] and the U.K. Department of Health [1991, pp. 15-38, especially tables 2.5-2.8]. In general, however, most individuals will require 2,000 calories/day as a minimum. (Anorexics and extremely emaciated individuals are perhaps an exception; even in such unusual cases, however, it would be rare to find anyone consuming less than 1,600-1,800 calories per day on a long-term basis.)

Second, ask the person about their typical daily diet--get as detailed a description as possible, one that includes amounts consumed. (Amounts are important; without such information you cannot make a reliable estimate.) Then use the information in the "Calorie Paradox" article to make a rough estimate of their daily calorie intake.

Next, compare the estimated calorie requirement with the estimated calorie intake. If there is a large discrepancy, i.e., calorie intake is far below calorie requirements, then there are three possible explanations:

• The "expert" is getting energy from some mystical source.
• They have anorexia nervosa or a similar eating disorder.
• They are not being honest with you about the quantity/types of foods consumed (e.g., the "expert" binges and/or cheats on the diet, and won't admit it).
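To make the arithmetic concrete, here is a minimal sketch in Python of the kind of check described above. It is illustrative only: the Mifflin-St Jeor equation stands in for the NRC [1989] and U.K. Department of Health [1991] tables cited above, the calorie densities are rough approximations, and the example person and food amounts are hypothetical.

def estimated_requirement(weight_kg, height_cm, age_yr, sex, activity=1.5):
    """Rough daily calorie requirement: Mifflin-St Jeor BMR x activity factor."""
    bmr = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr + (5 if sex == "m" else -161)
    return bmr * activity

# Approximate calorie densities, kcal per 100 g (illustrative values only).
KCAL_PER_100G = {"banana": 89, "apple": 52, "orange": 47, "lettuce": 15, "tomato": 18}

def estimated_intake(grams_per_day):
    """Sum daily calories from a self-reported diet given as {food: grams per day}."""
    return sum(KCAL_PER_100G[food] * grams / 100.0 for food, grams in grams_per_day.items())

# Hypothetical case: moderately active 35-year-old male, 70 kg, 175 cm,
# reporting a fruit-centered "typical day."
need = estimated_requirement(70, 175, 35, "m")
claimed = estimated_intake({"banana": 600, "apple": 400, "orange": 400, "lettuce": 200})
print(f"requirement ~{need:.0f} kcal/day vs. claimed intake ~{claimed:.0f} kcal/day")
# Prints roughly: requirement ~2436 kcal/day vs. claimed intake ~960 kcal/day.
# A shortfall that large invites the three explanations listed above.

The exact equation and food values matter far less than the size of the gap: a self-reported diet that adds up to less than half of any reasonable requirement estimate is, on its face, not a complete report.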


The simple procedure above will disqualify (or raise serious doubts about) a number of raw, especially fruitarian, diet gurus. No doubt such advocates will make rationalizations and excuses in their defense. However, the simple explanation that the diet gurus are not being honest about their own diet is far more plausible than unsupported rationalizations. Obviously, someone who claims to thrive on a (long-term) diet whose calorie content is at or below starvation level has no credibility.

If the "expert" reports adequate caloric intake. A few raw vegan proponents report food intake
patterns that imply adequate calories, and on the face of it this is a positive sign, at least initially. But there are also a few remaining hurdles when assessing the claims. In probing further, other information from the "Calorie Paradox" article applies, and one should evaluate additional related dietary factors: • •

High-bulk diet? Does the person claim to eat massive amounts of raw vegetables each day? If
so, is that credible, and/or do you want to eat similar amounts every day yourself? Fruit diets: excess urination? If your "expert" reports consuming large amounts of sweet fruit every day, an interesting exercise is to make an estimate of the daily liquid volume of urine implied by the diet. If you are considering a diet of 4-6 kg. (8-13 lbs.) of fruit per day (the amount that would have to be eaten to obtain adequate calories), note that a substantial portion of that daily input will be excreted as urine. In most people, this will produce "excess" urination. In this case, see if you can obtain information about frequency of urination experienced on the individual's "ideal" fruit diet. Note: Some who follow a fruitarian diet may actually get habituated to frequent urination, and may regard the need to urinate several times per hour as "normal" or "healthy." This needs to be considered in discussing the above with fruitarian proponents.
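As a rough worked example of the estimate suggested above (a minimal sketch in Python; the water fraction and the non-urine loss figure are assumptions for illustration, not values from this article):

# Illustrative estimate of the urine volume implied by a high-fruit diet.
# Assumptions (not from the article): fruit is ~87% water by weight, and
# roughly 0.8 L/day of water leaves via breath, skin, and stool instead.
fruit_kg = 5.0                       # midpoint of the 4-6 kg/day range above
water_from_fruit = fruit_kg * 0.87   # ~4.4 L/day of water from the fruit alone
non_urine_losses = 0.8               # breath/skin/stool losses (assumed)
urine_liters = water_from_fruit - non_urine_losses
print(f"implied urine output: ~{urine_liters:.1f} L/day")
# On the order of 3.5 L/day, versus a typical 1-2 L/day: hence the
# "excess urination" noted above, even before any drinking water is counted.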

Long-term maintenance diet vs. short-term diet. A further distinction one needs to consider in evaluating such diets is that a short-term healing diet might not provide adequate calories, whereas a long-term maintenance diet must. Thus you may not need to analyze a diet advertised only as a short-term therapy. Even so, you may find that an analysis of the long-term maintenance diet the "expert" claims is ideal nevertheless provides insight into their overall credibility.

The above are relevant issues to consider in evaluating the credibility of those peddling "ideal" raw/veg*n diets.

Raw vegan extremist behavior patterns: the darker side of rawism
Introduction: rose-colored glasses vs. the unpleasant realities.
Many of us are attracted to and/or get involved in raw vegan diets because of the positive idealism and optimistic (albeit simplistic) outlook that is part of the "party line." It can be disconcerting, therefore, especially for those who are new to the diet and filled with enthusiasm, to learn that raw vegan diets are not cure-alls, are not perfect, and so on. Even more disconcerting is to learn that the behavior of certain raw diet gurus is a betrayal of the positive moral qualities that (in theory) underlie vegan diets in general; and further, that the behavior of some diet gurus is a massive betrayal of the trust people place in them as well.

Accordingly, readers are warned that this section will not appeal to those who prefer the myth of raw diets to the reality. There is a huge gulf between the "party line" of raw vegan diets and the reality. The behavior of extremists discussed here--hostility, threats, attacks, plagiarism, "dietary racism," etc.--is not a positive or pleasant topic. (Names are omitted here for obvious reasons.)


Thus if a frank discussion of the irrational, negative behavior of certain raw gurus offends you, you might prefer to skip this section. The purpose behind it, however, is that the material which follows may help you avoid the needless grief, wasted time, or compromised health that come from learning the hard way. In that spirit, let's take a look at the darker side of involvement with some of the gurus of rawism.

Some raw/fruitarian "experts" are unable to function in a civil manner.
Some of the writers on this site have been attacked on multiple occasions in the past by hateful extremists (most of whom are vegans and/or fruitarians). A number of such raw/fruitarian diet proponents have earned deservedly bad reputations for their intense, chronic hostility.

Extremists may become socially disabled by their "ideal" dietary dogma. What does chronic hostility suggest about the credibility of such individuals? If they frequently react to challenges to their dogma with hateful personal attacks, extreme anger, threats, plagiarism, copyright infringement, character assassination by quoting others out of context, and/or dishonesty (all of which are discussed herein), then the obvious implication is that the "expert" may be emotionally impaired or disabled by his/her "lunch philosophy." Certainly it is reasonable to check for the presence or absence of emotional balance in such dietary advocates. Can a dietary proponent who is emotionally disabled by dogma have much credibility? Would it be a good idea for you to adopt the dietary dogma of such an "expert"?

The fact that a number of raw/fruitarian extremists have been thrown off (multiple) email lists on the Internet for hostile behavior provides evidence for the hypothesis that at least some of these dietary prophets are unable to function socially, i.e., are at least partially socially impaired by their "ideal" dietary dogma. Perhaps certain extremists are proud of being thrown off email lists; they may see it as proof that they are "radical." In reality, however, it simply reflects how hateful and foolish their behavior is. It's worth pointing out that at least two of the extant raw email lists exist because their owners, moderators, or founders were thrown off other email lists (for cause), and had no choice but to form their own separate lists. Of course, such individuals often cry "censorship" when they are booted from email lists for repeated disregard of the charters that govern list conduct. Such claims are bogus: censorship is a government function, and the email lists are privately owned--their members residing on the list at the grace and generosity of the list-owner and moderator.

Threats, bullying, and harassment: another extremist tactic.
The use of threats and bullying provides dietary extremists with yet another weapon to harass people and to try to silence those who dare to challenge their idealistic, "compassionate" dietary dogma. Just a few examples:

• This writer was the target of what could have been interpreted as an implicit death threat from a raw/fruitarian group widely considered (opinion) to behave at times in ways similar to a hate group (the time period here was Summer 1997). While the implied threat was couched in hypothetical terms, and could also have been easily interpreted as mere bluster, an individual from the same group did eventually threaten actual physical harm at a later date. (See "1998 Raw Expo" below.)

• One vegan extremist, widely regarded as a "kook" on the Internet, is notorious for threatening people with libel lawsuits. Although most people don't pay serious attention to the individual, he may be an example of another socially disabled dietary fanatic.

• Threats surrounding the 1998 Raw Expo (San Francisco). At the request of the organizer of the 1998 Raw Expo in San Francisco, this writer did not attend, because the raw/fruitarian group mentioned above was reportedly threatening to disrupt the event and/or possibly physically assault this writer if he appeared. This "Expo affair" provides an interesting view into the warped thinking and behavior that is condoned in some raw circles. Consider the situation: a raw/fruitarian group threatened possible violence if this writer, a critic of fruitarianism (and a former long-time fruitarian), attended the event. They used such threats to intimidate the Expo organizer into requesting I not attend. Then, instead of taking simple security measures, or simply advising the raw/fruitarian group that, given such threats, they were not welcome to attend the Expo themselves, the organizer chose to ask the target of the threats (this writer) not to attend. The rationalization offered was that the raw/fruitarian group had already reserved their booth and was to give a lecture at the Expo--and the importance of such economic interests was emphasized by the organizer. I eventually acceded to the Expo organizer's request as a gesture of courtesy.

The moral issues illuminated by this event are crystal clear: Some true-believing dietary promoters are so hostile and mentally deranged as to resort to threats of violence when their simplistic dietary philosophy is challenged. Further, some other rawists--those who continue to ally themselves with such extremists in spirit, or who cooperate with them simply out of financial interest--appear willing to tolerate those who use threats of violence and bullying as tools to promote the "compassionate"(?) raw vegan diet. In summary, what transpired in this situation illustrates clearly that some rawists believe the raw vegan diet is far more important than honesty and decency.

Does your diet guru promote "dietary racism"?
Let us begin by defining the term. Dietary racism: a form of bigotry that is the analogue, in dietary terms, of racism; i.e., hatred of other groups of people because their diet is different from your diet or from a hypothetical "ideal" diet, and/or feeling superior to other people because your diet is different.

Some raw and veg*n diet gurus actively discuss compassion and/or how the diet they promote will make the world a better place. However, the reality is that their message (e.g., in the case of certain extremists) may be seriously undermined by the presence of blatant or subtle dietary racism. A couple of the more blatant examples:

• You are "inferior" unless you follow the advocated diet. Some raw/fruitarian proponents like to insinuate that unless you follow a 100% raw vegan and/or fruitarian diet, you are "inferior" and/or your DNA "mutates," and you become something less than fully human (i.e., like an animal).

• The rhetoric of bigotry. Some raw/fruitarian advocates refer to those who eat cooked foods as "cooked-food morons" or "cooked-food addicts," or say they are "failures" or "degenerates."

• Extremist view: crank science good, "cooked science" bad. It is very common for raw or fruitarian promoters of crackpot crank-science theories to reject conventional, legitimate science that debunks their dietary dogma, on the grounds that the scientists involved eat cooked foods, hence their work is unreliable and cannot be trusted. This may be expressed in other terms, e.g., saying the scientists have "damaged brains," that they deal in "cooked science," and so on. If the bigotry in such assertions is not clear to you, consider Hitler denouncing "Jewish science," and the analogy should be readily apparent.

Somewhat more subtle examples of dietary racism are:

• Pride in compassion = self-righteousness = feelings of superiority. Veg*ns who actively claim they are more "compassionate" individuals may come to express outright disdain for, and implicitly behave as if they feel superior to, those people who eat meat.

• Vegans may claim that those who eat meat are "murderers," because "meat is murder." (This specific claim is not very common.)

Many, and probably most, of the individuals who follow raw/veg*n diets are adamantly opposed to racism, and would not tolerate hateful racist attacks on ethnic groups. Why, then, are so many silent when raw/veg*n extremists openly engage in dietary racism (which can be very vicious, at times)?


Assessing diet gurus who display dietary racist behavior. If you find significant and long-term
evidence of such "dietary racism," you should consider the impact that has on credibility, and whether you really want to have such a person as your "diet guru." Does the hatred and bigotry implicit in dietary racism reflect your personal views and values? If you would not follow a regular racist, why would you follow a dietary racist? Do you seriously think dietary racism, as a tactic, will help make the world a better place? Finally, the use of the term "dietary racism" here is meant to clearly illuminate the bigotry and hatred of others that, sadly, is close to the heart of the message promoted by certain raw/veg*n extremists. (The intent is not to diminish or slight the tragedy that regular racism represents.)

Raw/fruitarian plagiarism: a sign of moral bankruptcy among extremists and "tolerant" followers.
It is widely known in the raw community on Internet, and is becoming more widely known outside Internet (and in general veg*n circles) that one of the most heavily promoted recent books on raw-food diets/fruitarianism is in fact a massive plagiarism of the book Raw Eating, which was written by Arshavir Ter Hovannessian and published in English in Iran in the 1960s. The original book Raw Eating is long outof-print, although it may occasionally be found in used bookstores in some countries. Plagiarism is a form of theft, as it comprises the use (and, in this case, the sale) of the intellectual property of another, all done without proper attribution. Further, presenting oneself as the author of plagiarized material is misrepresentation (i.e., lying). If the plagiarism, which so far has been limited to one small extremist raw/fruitarian group, were widely condemned by most other raw/fruitarian "experts," then there would be little reason to discuss the matter. However, that is not what has happened as news of the plagiarism has become widely known. Instead, many raw/fruitarian "experts" have rationalized the plagiarism, and have either sided with the plagiarists or expressed tolerance for (i.e., condoned) the plagiarism. (Note: Isn't it surprising that some raw and fruitarian "experts" who are trying to sell books and tapes--ones they are authors of--would closely ally

Direct effect: can you really put much stock in anything a plagiarist, or one who defends a plagiarist, claims? The alleged raw/fruitarian "experts" who present themselves as "authors" of
themselves with plagiarists?) Plagiarized material, and other alleged raw “experts” who are closely allied with, or actively support the plagiarists, is implicitly establishing a very important principle with moral implications: that it is acceptable and legitimate to engage in plagiarism and misrepresentation, if the end result of such efforts is that you persuade others to adopt a fruitarian/raw vegan diet. More explicitly, the principle and its ramifications are: •

Extremist Principle: Misrepresentation is, or at least can be tolerated as, a promotional tool if it serves a "larger good"--that is, convincing people to adopt a fruitarian or raw vegan diet. Once this code of behavior is accepted, one has to conclude that
every claim made by such extremists (plagiarists as well as allies of plagiarists) is suspect, and nothing they say can be completely trusted. After all, if the "expert" thinks it's permissible to engage in misrepresentation, how can you assess whether they are telling the truth when they claim they follow a 100% raw/fruitarian diet, that their diet is so great, and so on? Indirect effect: Selling one's virtue for a raw lunch. The acceptance of misrepresentation-which constitutes complicity by those who either side with, or in some manner defend or actively publish the message of such plagiarists--establishes yet another moral principle (which logically follows from the above). In outline form, the elements of it are: A. Principle #1: Misrepresentation is okay if it promotes raw diets. B. Misrepresentation = dishonesty = giving up virtue. A and B together, comprise a restatement to the effect that:


C. Giving up virtue is OK if it promotes raw diets.

We also have:

D. Giving up virtue = exchanging virtue = selling virtue.

Note that the word selling here is quite appropriate because an exchange, in figurative terms, has been made: the extremist exchanges his or her honesty and integrity (i.e., virtue) for an increase in the ability to convince others to adopt the 100% raw/fruitarian diet. This brings us to:

Principle #2: Selling your virtue is legitimate, if the price you charge is that more people eat a raw vegan diet.

The point of the above exercise is simply to show that even extremists who do not plagiarize, but who merely condone it, are still, in effect, selling out their own virtue (i.e., their own honesty and credibility) for the power to recruit more people to their diets (by recommending the works of plagiarists to others, or by helping to spread word about such authors or "their" works, etc.). In selling out their virtue, then, such people are, in a sense, "prostituting" themselves for the sake of promoting their "ideal" diet, at the price of integrity and honest representation of facts.

Ends do not justify the means. The two principles discussed above could be seen as versions
of the tired old "the ends justify the means" excuse, which might theoretically be used in an attempt to justify any deviant behavior, from petty theft to murder. Be aware of this, should "the ends justify the means" argument actually be brought up as an explicit defense by overzealous diet promoters.

"Refusal to judge" where plagiarism is concerned is hypocritical and simply a cover-up for denial. One of the common excuses raised by rawists or fruitarians who have
elected to refrain from comment, or to take a stance, with regard to the plagiarism mentioned above has been to assert that they are simply being "neutral" when they "refuse to judge" the plagiarists. However, this is really an attempt (albeit a mostly subconscious one, probably) to divert attention from the important issues involved. Basically, the "refusal to judge" excuse fails for the following reasons: o

The decision to eat a raw diet is, obviously, in itself an instance of judgment.
Those who use the "refusal to judge" excuse are simply being silly. Judgments are necessary in life, and we make them all the time. Here, however, a double standard is promoted that one should engage in making judgments when selecting a diet to follow, but should not judge relevant behavior of one's erstwhile "diet guru." What this really indicates is a (perhaps subconscious) refusal to be honest with oneself about the fact that we make judgments all the time; and of course, such a double standard is hypocritical. Here, "remaining neutral" is simply a "cop-out." Additionally, the refusal to judge when there is clear evidence of plagiarism or other dishonest behavior is simply a form of denial of reality. In a way, this is not surprising, as (far too) many raw diet gurus appear to be in varying stages of denial of reality themselves.

o

Short-term gains at the expense of long-term growth/credibility. A very important
underlying issue here is whether plagiarized writings (or any that may be misrepresented in other ways, for that matter) form a solid basis for the long-term growth of the raw/fruitarian community, or whether plagiarism marginalizes the community and sharply limits growth. One would expect a credible raw advocate to oppose plagiarism on the grounds--ethical considerations completely aside--that it is harmful, in the long run, to the entire raw community in terms of credibility.


Because of this, those who make excuses and ally themselves with plagiarists are in effect relinquishing their own credibility as well.

If you follow the advice of a particular dietary guru, where do they stand on the issue? Because it has significant bearing on the credibility of your diet guru, you may find it
relevant to determine if they side with those who have plagiarized, or have engaged in such behavior themselves. This is primarily of relevance to those in the raw/fruitarian community; it is not relevant at present to the conventional veg*n community. Note here, of course, that it's possible you might not get an honest answer if you directly question your diet guru on this issue. Thus, you may need to ask around the raw community for information. With a bit of searching and detective work, you will find out what is going on.

"So nobody's perfect": What's the difference between forgivable offenses/human error vs. real extremism/lack of credibility?
It should be acknowledged here that everyone makes mistakes on occasion. After all, no one is perfect. It's only human that sometimes people may get caught up in the emotion of the moment, and say or do things they later regret. At other times, or in other ways, unfortunate errors in judgment may occasionally be made. In fact, for the record, we ought to 'fess up here that one of the Beyond Veg site editors was once briefly suspended from an email listgroup for an egregious practical joke that got out of hand, causing others on the list considerable anguish. As was the case in that particular situation, however, one needs to examine whether regret is expressed, a genuine apology made and accepted, something learned from the situation, and the behavior rectified. It is not the intention here to promote intolerance for human errors in judgment, or for emotions getting out of hand on occasion. Nor should people get too bent out of shape about the occasional flame (so long as such flames remain occasional), become oversensitive about a few sporadic heated exchanges, or be unable to take things in stride on email listgroups or elsewhere. If we couldn't tolerate that much, it would be a world of impossibly inhuman expectations that few could live up to.

How does one distinguish between actual bad faith or lack of credibility, and simple human mistakes? There is an important key by which one can assess if a particular behavior is coming out of
extremism and lack of credibility or not. Very simply, it is when the person regrets what they have done and has the character to offer a genuine apology when having "crossed the line," and gives some indication of having learned something, in that their behavior thereafter changes. With regard to extremist behavior, what distinguishes bad-faith tactics is that this almost never happens. At least to date, it's not something that's been in evidence with the fanatical types of behavior under discussion here. (Also, extremists rarely ever will admit they were wrong about anything when information and evidence are at issue either. Here, admissions of error in owning up to having missed the mark are very similar in their psychology to admissions of apology for errant behavior.)

Rationalizations rather than apologies or admission of error indicate extremism. Why the
unwillingness to apologize? Here, what is probably even more interesting is that you will find when extremists are asked about the kind of behaviors that have been discussed here, they generally simply do not see them as a problem--or at least not much of one--even in the face of feedback from numerous others that they are. An additional key here, then, is: not only do such inflexible absolutists not apologize, they don't see any problem with such behavior in the first place. That is, the behavior is simply rationalized: i.e., it's "radical," "passionate," "raw courage"; plagiarism may be "proactive marketing," etc.; but rarely ever is it a mistake that is admitted, regretted, or apologized for. And, sadly, apology itself may even be labeled a sign of weakness, rather than the sign of character strength and credibility it actually is.


In Bad Faith: dishonest information and debate tactics in raw foods--personal experiences.
Let's now consider a raw diet promoter who decides to attack some writings that he/she disagrees with--say, for instance, some of the writings on this site, which such individuals almost certainly will be likely to dispute. If an "expert" challenges statements made here via the debate tactic of repeatedly demanding "proof" (implicitly meaning published scientific evidence) for specific claims for which only anecdotal evidence is available, then one must question the credibility of said "expert." After all, a real expert would have knowledge of at least part of the scientific evidence relevant to raw diets, and should recognize that for many issues in raw, only anecdotal evidence is available. Like it or not, the reality is that anecdotal evidence is all that is available for some questions, and those who are credible acknowledge reality. A chronic lack of such acknowledgment, therefore, suggests that the alleged "expert" is using (whether deliberately or more unconsciously) deceptive tactics as a debate strategy, and hence may be intellectually dishonest. But this is only one tactic among several others. As it happens, these sorts of tactics have already become familiar to a couple of contributors to this website from direct experience.

Extremist debate tactic: Quote out of context, then misrepresent the material. The
tactic mentioned in the above two paragraphs can be illustrated with examples from this writer's personal experience. On a few occasions in the past, certain of my articles have been republished--without permission (copyright infringement)--and angrily attacked by a fruitarian "expert." In such republications (on an email forum), the articles were interspersed with numerous angry demands for "proof" from the alleged "expert." The articles were also misrepresented and misinterpreted--by the exclusion, of course, of specifically relevant parts of the articles that would have revealed the distortions this individual engaged in. Relevant comments on these attacks are as follows.

One such article (Vegan Attitudes Toward Instinctive Eating, available on this site in since-updated form) was (and still is) labeled explicitly as personal opinion--needless to say, no one is obligated to "prove" their opinions to an angry, irrational extremist.

Hostile attacks may result from underlying eating-disorder behavior. One angry attack included brief quotes from the article Health Food Junkie, by Stephen Bratman, available on this site. Inasmuch as the "Health Food Junkie" article describes how obsession with diet can become something akin to a mental illness (i.e., an eating disorder), in this instance it can be rather humorously observed (in my opinion) that the angry attack merely provided an excellent example of the type of unbalanced behavior typical of those with such eating disorders.

Extremist debate tactic: demand proof NOW even when an article has been explicitly presented as a summary only, with further evidence scheduled for future publication. Another article (Selected Myths of Raw Foods, also available on
this site) was specifically labeled as an introductory "capsule presentation" only. Made explicit in the very first paragraph of the article was a statement that a more detailed treatment with supporting evidence would be provided in other articles, to be available later. When the "expert" republished extensive parts of the article, which they proceeded to dispute, they conveniently left out the above notice regarding evidence to follow at a later date, and repeatedly demanded "proof" from the article. At yet another point in their sliced-and-diced republication of the article, the "expert" deleted still another explicit statement that evidence for the specific issue being discussed in that passage would be provided in a scheduled future article, while inserting their own demand for (immediate) "proof" of the statement. Consider: Are such actions fair and honest, or are they unfair, deceptive, and intellectually dishonest?


(Note: The examples mentioned in the three above bullet points occurred some time ago, in 1997-1998.)

Harassment via spamming. Besides the above unauthorized republications, the
fruitarian "expert" also involuntarily subscribed this writer for a short period to their email list (in my opinion, a crank science nutrition forum) demonstrating (in my opinion) their disabled emotional state by repeatedly attacking this writer in a vituperative, insulting manner. Needless to say, such behavior constitutes spamming and deliberate harassment. To put such behavior in perspective: Would you consider it reasonable if someone were to take three articles from a popular health magazine (at least one article of which is identified as editorial opinion), and republish them on Internet (in violation of copyright), interspersed with numerous angry, egocentric demands for "proof," and including insults and denunciations of the magazine article authors? Consider: Would that be the action of a rational, sane person--or would it be more accurate to describe such behavior as irrational, insane, and hateful? Is this the behavior one would expect from a serious, "scientific" critic (which the fruitarian "expert" here appears to pretend to be)? Or is such behavior what one would expect, rather, from a raving lunatic or crank? Finally, would that be the act of an honest, credible individual? What credibility does a raw/fruitarian "expert" have who (frequently) engages in such irrational and unfair behavior regarding the writings of others?


Extremist often cannot/does not meet the standard of proof they demand from others. To make matters worse, such behavior is also hypocritical, as the alleged
"expert" habitually does not provide--and has not provided (to date)--published scientific studies on fruitarians (i.e., "proof," by their own standards) for ANY of their claims about fruitarianism. (There are almost no published research studies done on fruitarians.) This illustrates a hypocritical double standard: the extremist demands "scientific" proof of common anecdotal observations that fruitarianism does not work, while they have not yet presented any published scientific evidence that it does work. It also neatly reverses the burden of proof--a logically invalid step by accepted standards of evidence. That is, the efficacy of fruitarianism has never been proven, hence the logical burden is on the advocates of fruitarianism to provide credible proof, not on others to prove it doesn't. To date, no proof--other than crank science theories, of course-has ever been presented. Meanwhile, extensive anecdotal evidence, combined with extensive indirect scientific evidence, suggests that 100% raw vegan fruitarian diets are nutritionally deficient and highly unlikely to be successful in the long run.

Example of raw/fruitarian debater relying on plagiarized source material. In another
incident, involving a different raw/fruitarian proponent, one such "diet guru" challenged another writer on this site to a debate on evolution. The challenge was made in a long article, posted to a raw listgroup, that criticized evolution for allegedly being unscientific. However, it turned out that over half of the material in the posting--material the raw/fruitarian writer claimed authorship of (and put a copyright notice on, no less)--was in fact plagiarized from the creationist book Darwin on Trial, by Phillip Johnson. Detailed side-by-side comparisons of passages from the advocate's post with passages from the book, made in a reply to the fruitarian's post, left no doubt about the extensive plagiarism. Is there any reason to debate a known plagiarist?

That such dietary extremists are willing to stoop to such tactics in defense of the dogma they promote merely underscores that they are not interested in reality or decency. Instead, they have the "one true religion and science of the 100% raw fruit/vegan lunch" to promote; and dishonest debate tactics, hostility,


and blatant hypocrisy are standard operating procedure for certain of these so-called raw/fruitarian "experts." What credibility can they be given on other issues, when they so blatantly disregard any standard of truthful disclosure in instances such as this? The obvious answer for any thinking person is that such "experts" forfeit any claim to credibility they might otherwise have had.

Rationalizations of extremist behavior.
No doubt some will try to rationalize their negative behavior via the tired, old excuse that they are merely "passionate" (about their lunch, no less). Ask yourself whether disagreements over lunch philosophy can justify the above aggressive, dishonest acts. The extremists call it "passion"; however, most other people would regard such behavior as reflecting intense hatred and mental toxicity. Note the emphasis here on emotional and mental balance/health. There is a reason for such emphasis--the inability to function in a civil manner when one's dietary dogma is challenged is a common condition among such individuals, and is a sign they may be mentally UNhealthy, and hence have little credibility.

Mainstream rawists are usually not extremists.
Implicit in the usage of the word "extreme" is that it denotes unusual, not-so-common circumstances. Here this means that although the raw and veg*n movements are infested with extremists, especially the fruitarian wing, it is not the case that every rawist is an extremist. Although it may be redundant, the preceding should be stated explicitly, as follows:

• Most rawists are not extremists.
• Most veg*ns are not extremists.
• Many fruitarians are not extremists.

Having asserted that most rawists are not extremists, it should be noted that in my opinion, at least some of the self-appointed "experts" or "diet gurus" are, and the raw and veg*n movements suffer from the burden of their negative influence. While extremists themselves may not be common in raw, unfortunately they do tend to comprise its most visible element--and the extremist attitudes, myths, and misconceptions they promote have filtered out into the movement and are very commonly reflected in various rawist beliefs, myths, etc. Do such factors help the raw community to grow, or do such fanatical attitudes and behavior patterns simply relegate raw diets to the "lunatic fringe" of the alternative health world?


The Raw Vegan Pledge of Extremism, Ignorance & Denial
by Tom Billings
Copyright © 1998 by Thomas E. Billings. All rights reserved.

If you've just surfed into the Beyond Veg site and this is the first article on the site that you read, we'll give fair warning here that without a background context for the piece, you might falsely assume the Pledge is anti-vegan. (Vegans are known for--how to say it--taking their lunch far too seriously at times.) In reality, the Pledge is a call for reform in the raw vegan movement, disguised as sarcastic satire. To avoid any misunderstanding, we suggest you read through the other material pertaining to raw veganism on this site before reading the Pledge. However, if you've already read a number of articles on the site or you're particularly adventurous, then you MAY be ready for the sarcastic humor of the Pledge.

How can you tell? It's not hard: If the material on this website does NOT appeal to you; if it offends you by challenging many of the things you believe in; if it offends you by criticizing your lunch, then perhaps you should seriously consider actually TAKING the following pledge. On the other hand, if you appreciate the material on this site, you might find the lampooning here of raw vegan extremists to be very funny and "on target." Enjoy! :-)

Note: The following is intended as a sharp, sarcastic satire of the current state (1998) of extremism in the raw vegan world. If you are easily offended, you may want to skip this article.

The International Raw Alliance of Vegan Extremists (IRAVE) MEMBERSHIP PLEDGE
WHEREAS:

1. The raw vegan diet is the one true religion and science of perfect health.
2. The vegan diet is based on high moral principles, particularly compassion.
3. Compassion should be promoted toward all living beings on this planet, with the obvious exception of those murderous meat-eaters and the "cooked," those who consume cooked foods. (Those who eat meat or cooked foods are degenerates, hence are "fair game" for insults, harassment, threats, and other similar forms of enlightened, compassionate interaction.)
4. The International Raw Alliance of Vegan Extremists (referred to hereafter by the initials IRAVE) is an organization devoted to promoting the glorious 100% raw vegan diet.
5. IRAVE has extremely strict criteria for membership.

AND: I wish to become a full-fledged member of IRAVE, including the rights and responsibilities thereof (i.e., the right to believe and follow, without challenge or question, the teachings of IRAVE, and the responsibility to regularly send money to the IRAVE leadership).

IN RECOGNITION OF THE ABOVE:


I do hereby, of my own free will, swear or affirm my complete allegiance to the following pledge:

PLEDGE
• I will follow a 100% raw vegan diet, with no backsliding or exceptions!
• If asked whether I ever cheat on the diet, I will DENY it, and I will IGNORE the fact that others backslide or cheat. Remember one of the many brilliant slogans taught by the IRAVE leadership: Ignorance is bliss!
• I will IGNORE the post-1970 research that shows animal foods are a normal part of the diet of the great apes, our closest primate relatives. If challenged on this, I will DENY the validity of current research and claim that the apes eating animal foods are "in error," "perverted," or give other imaginative rationalization(s).
• I will IGNORE the scientific evidence that our prehistoric ancestors were omnivores. I will cite outdated (and retracted) dental wear studies to support my views. If necessary, I will DENY science and evolution, and adopt creationism, for the sole reason that it can be molded to support my false dietary dogma--that humans are natural vegans/fruitarians.
• I will IGNORE the evidence of comparative anatomy, which suggests that animal foods can be a natural part of the human diet. Instead, I will use the misleading human-lion-cow comparison, and claim that it shows we are natural vegetarians because we are "closer" to cows than lions.
• I will IGNORE any health problems that occur on the diet, and will consider them a sign of detox, a sign from above that I am not fully complying with the glorious 100% raw vegan diet in the theologically correct manner. I will resolve to follow the diet ever more closely, and will resist all temptations by the evil demon of cooked foods!
• If 100% raw veganism does not work for someone, and the "detox" excuse fails, then I will use another unfalsifiable excuse: food quality. I will claim that their organic food was grown on a former toxic waste dump, or demand they produce a log of Brix readings and soil tests for all food in their diet. In so doing, I will IGNORE the reality that high-quality foods are but one factor among many in diet and do not guarantee success on any diet; and I will DENY the reality that such demands are arrogant, hostile, and stupid.
• I will observe, and keep holy, the sacred sacrament of food combining.
• I will tell others that the diet is ideal, perfect, and natural, and will DENY the real experience of those who have problems on the diet, or for whom the diet does not work. The diet is Perfect: any problems simply must be the fault of the individual!
• I will DENY that modern fruit contains excess sugar; I will IGNORE the reality that people on high-% fruit diets report the symptoms of excess sugar consumption (sugar highs/blues, fatigue, excess urination, thirst, etc.). I will claim that fruit is the "highest food," and will IGNORE the reality that modern cultivated fruit has been bred for generations (in some cases, literally thousands of years) for high sugar content.
• I will IGNORE the intellectual dishonesty of those who make unrealistic claims for the diet--that it will cure all diseases, bring personal happiness, perfect health, world peace, etc.
• I will promote, or tolerate the promotion of, raw vegan diets using fear as a major motivator: fear of cooked foods, protein, or mucus. In so doing, I will IGNORE/DENY that such motivations are mentally toxic, pathological, and promote mental illness.
• I will IGNORE or DENY the obvious reality that blaming the problems of the world on cooked foods is intellectually dishonest, and that promoting the fear/hatred of those who eat cooked foods (or meat) is analogous to racism.
• I will IGNORE or attack those who criticize the glorious 100% raw vegan diet, especially if they cite scientific research. Science is the product of the minds of cooked-food consumers, thus it is a product of the evil demon of cooked foods! Beware the monstrous demon of cooked foods!
• I will IGNORE those who encourage me to think for myself. Independent thought is similar to science, hence another trick by the demon of cooked foods! Instead, I will mindlessly follow all the teachings and guidance of the IRAVE leadership. What a wonderful blessing--I don't have to think anymore!
• I will IGNORE the reality that some raw vegan dogma is anti-common sense, and logically invalid/inconsistent. Once again, common sense and logic are like science--yet another trap by the clever demon of cooked foods.
• I will attack those who criticize the diet using anecdotal evidence, IGNORING the reality that only anecdotal evidence is available for many issues. Instead, I will demand published scientific studies on raw-fooders as proof, even though I cannot produce such studies to support my own claims. If my approach is challenged as inconsistent, I will IGNORE/DENY it, as, after all, intellectual honesty and consistency are also products of the evil demon of cooked foods!
• I will never accept, and will always DENY, the reality that raw veganism is not the natural diet of humans, but is instead a restriction of that natural diet.
• I will IGNORE or condone those who promote the raw vegan diet via negative, hostile methods (harassment, threats, bullying, insults, etc.). I will claim that such tactics are a joke, a marketing ploy, or that the end justifies the means, if it saves people from the demon of cooked foods!
• If I, or other extremist members, are criticized because our methods are anti-compassion, I will DENY that veganism has anything to do with compassion. I will claim that veganism is based only on animal liberation, and will IGNORE the obvious reality that, without compassion, animal liberation is meaningless--the equivalent of automobile liberation.
• If any of the leaders of IRAVE engage in plagiarism in furthering their agenda, as has been known to happen, I will IGNORE it. Although plagiarizing a book is stealing (of intellectual property), and presenting oneself as author of plagiarized material is lying, I will IGNORE (or even support) such activities if they promote the one true religion and science that is the 100% raw vegan diet.
• As a member of IRAVE, I will IGNORE and DENY the need for integrity. Integrity is yet another clever trick by the demon of cooked foods! By joining IRAVE, I adopt instead "raw integrity" and "raw courage," a radical lunch-based system of ethics in which most any means--including lying and falsification of scientific information, cheating, etc.--are all good if they promote the glorious end we uphold: the 100% raw vegan lunch.
• I pledge to support the efforts of other members of IRAVE, even if they promote the glorious 100% raw vegan diet through such imaginative claims as "protein is poison," "fruit is just like mother's milk," and so on. If such claims are challenged, I will DENY the reality that they are crank science, and will instead assert they are valid science. Anything repeated often and loudly enough MUST be true!
• If a member of IRAVE claims to thrive on a diet whose calorie content is patently or demonstrably below starvation level, I will not challenge their claim, but will defend it when questioned. If necessary, I will IGNORE and DENY that such claims are apparent evidence of secret binge-eating and dishonesty by those making them.
• Whenever reality contradicts raw vegan dogma, I pledge to IGNORE reality and live in DENIAL. After all, a life of IGNORANCE and DENIAL is a life of pure joy and bliss, and a superior way of life, per the teachings of the IRAVE leadership.
• I pledge that raw vegan dogma (obsession with food and dietary purity) is the most important thing in my life, and in the world. Further, raw vegan dogma is more important than the safety, health, nutritional status, or well-being of other people.
• I will attack those who challenge my obsessions, and will DENY the reality that my attitudes are mentally UNhealthy, and a sign of serious mental UNbalance.

In certification of my complete personal allegiance to the above solemn pledge, I hereby sign my name using one of my own precious bodily fluids:
____________________ (Signature)    ____________________ (Date)


Skepticism in raw foods vs. the "golden cage" of dietary dogma

Credibility is a continuum
The discussion here has covered many issues and behaviors. It is appropriate to remember, therefore, that credibility is a continuum, and not a strictly binary classification (credible vs. not credible). Evaluation of potential extremist "gurus" must be done on an individual basis, as only some of the material here might apply to a particular advocate. Also, someone who is generally credible on most issues might be less credible on a few select issues, due to (differences in) personal philosophy, or ordinary individualism. Evaluation of credibility requires assessing a wide range of issues and behaviors, and making an evaluation based on an overall pattern. Basing an evaluation on only one point may be too narrow and inflexible. Of course, narrowness and inflexibility are trademarks of extremism, and traits to be avoided. So, get as much information as possible when evaluating potential "diet gurus." You probably don't want to trust your health and well-being to an extremist!

Skepticism can spark personal liberation

Why so much skepticism here?
Most of the writers on this site generally follow mostly raw-food-type diets (paleo, instincto, raw lacto-vegetarian; none of us are 100% raw at present). At one stage or another, we were raw vegans, some of us 100% raw, for varying lengths of time. We even promoted raw diets, though not in the self-conscious "missionary" style that characterizes the current extremists. In other words, the writers on this site had much in common (in the past) with today's extremists. (Of course there were differences--we hopefully had much better manners than many of today's obnoxious and toxic extremists; at least we sure hope the perception by others was that we did! ;-) )

What happened, then, to make the writers here so skeptical about the efficacy of raw vegan diets? Very simply: We had too many problems on the diets, and we saw too many problems in others following the diets, over the long term. Reality intervened, and showed us that raw vegan diets, especially 100% raw vegan diets, and their associated dogma, often do not work as advertised (in the long term). Further, recent scientific research, plus ordinary logic and common sense applied to raw and vegan dogma, debunk major parts of the raw vegan "party line." Because of this, we came to the awareness that portions of the philosophies that underlie rawism and veganism are intellectually bankrupt (and may be ethically bankrupt as well).

Seeing the whole picture brings wider options and more freedom.
This brings us to the motivations of this site:

• To communicate to you that dietary extremists, blinded by idealism, are not telling you the whole story.
• To discuss some of the problems that can occur on such diets.
• To examine the beliefs of rawism/veganism in an open, dogma-free manner.
• To document that the idea that (raw) veganism (or even vegetarianism) is your "natural" diet is not in accord with scientific evidence. (One can still be a vegetarian for spiritual reasons, of course. And vegetarian diets can certainly be healthier than the SAD as well, though contrary to popular perception within the vegan movement at large, vegetarian diets don't necessarily work well for everyone.)

Finally, the major objective of this site is to motivate you to do a thorough, critical re-examination of your diet and any dietary dogma that might accompany it. Regardless of whether or not you change your diet because of what you read here, if you begin to think critically about dietary idealism and the claims of the "experts" or "diet gurus," then we have accomplished a great deal--for then, you have transcended the stance of "follower" and are thinking for yourself. That is a major step forward and the first step toward freedom and liberation from (false) dogma.


Having to think for yourself frees you from the constricting mental prison of narrow dietary dogma.
Thinking for oneself on dietary matters may be frightening at first. It can be scary to go from the absolute certainty involved in being a follower of cult-like raw dietary extremists, whose aggressive and hostile style actively discourages individual thought, to ditching them and thinking for yourself. However, in that move, you can discover the seeds of freedom, independence, and enhanced self-respect. The "party line" of 100% raw vegan dogma does not "set you free"--instead it binds you to a set of strict rules and a narrow mindset that promotes food obsessions. Thinking for yourself may be a challenge at first, but the rewards make it very worthwhile. Life is full of uncertainty--which is very different from the extremists who foolishly (and falsely) claim complete knowledge of nature and its laws (or of the only ones that they believe matter). Some decisions will be major challenges. However, that is the point--and, quite literally, the fun--of life. A life without challenge, lived according to narrow dietary dogmas, is no fun at all. Take that as advice from one who has personally "been there, done that, no thanks!" We invite you to embark on the ongoing, individual effort to develop an open, honest relationship with nature and reality. The first step is to begin a critical examination of the dietary dogma that you may have been taught in the past.

A special message about, and for, extremists

Extremism and black-and-white morality generate a destructive us/them mentality.
The vegan movement has been criticized for a tendency to divide the world into "us" (good vegans) vs. "them" (bad meat-eaters). This negative practice is, to a certain extent, also reflected in the raw community. The nature of this article is such that comparisons are necessary to convey the essential information. Although this writer has had many unpleasant experiences with hateful dietary extremists, it is not the goal here to condemn "them," i.e., extremists. Probably none of us are immune to the tendency to be overzealous at times, so the intent here is not to focus criticism on the person, but rather on the (pattern of) behavior. Further, it should be noted that those who advocate views based in extremism do so of their own free choice. They can also review the extremism of their positions, and personally choose to change their behavior. As those of us contributing to this site have personally learned through hard experience of our own (experience that we occasionally have to re-learn :-) ), giving up one's extremist mindset can be a very liberating act that releases you from the "golden cage" of narrow, restrictive dietary dogma.

The "golden cage" of dietary dogma.
A life lived in the golden cage of dietary dogma is one of intense, life-controlling obsession with diet/food, a life in which one foolishly self-identifies with one's diet. Ultimately, such a life degenerates into self-righteousness as well. The end result of this is often a serious mental imbalance, and the hostile actions of certain extremists simply confirm their unfortunate condition. Can you imagine two people on SAD diets fighting over whether corn chips or potato chips are better? Wouldn't most people regard such an argument as truly insane? But that is exactly what extremists are doing when they make hostile attacks and threats against those who challenge their not-so-perfect diets. Is that kind of behavior mentally healthy, or unhealthy?


However, whether the individual in question is oneself, or another, there is hope--extremism can always be released in favor of tolerance, by the simple act of loving and accepting oneself more fully, as one currently is, without insisting on instant change, or demanding that one comply with a (new/different) standard of perfectionism. Loving and accepting yourself as you are right now is the first step to loving and accepting others as they are (even if they eat meat or cooked foods). Also, note that lasting, positive changes come more easily and reliably from gentle, consistent efforts than from sudden, radical changes (e.g., demanding strict conformance to a new standard of behavior--like a new but difficult and narrow diet). Sudden changes often lead to burnout, whether physical or mental. In the long run, slow, gentle change is healthier for body and mind than is extremism. Extremism may show good short-term results, but at a high cost, and with negative side-effects (e.g., long-term physical and/or mental problems).

Note: The use of a golden cage as a metaphor above is deliberate. See the book The Golden Cage: The Enigma of Anorexia Nervosa, by Hilde Bruch (1978, Harvard University Press, Cambridge, MA) for the source of the metaphor.

Is your "ideal" dietary dogma really a sandcastle?
Finally, a general observation. In figurative terms, following the false dogma promoted by dietary extremists can be characterized as being like living in a small sandcastle built right by the waves, at low tide, on the shoreline of the ocean of truth. But now things are changing in the dietary world. The recent scientific information (post-1970) that confirms that apes eat animal foods as part of their natural diets is becoming more generally known. Additionally, the fact that our evolutionary diet indicates that humans are--and always have been--natural omnivores/faunivores only began receiving wide attention in the scientific journals in 1985, with first dissemination to the public at large (via a popular book) in 1988. The vegetarian movement was very small in the U.S. prior to 1970, and it is only now that the varying potential long-term results of vegetarianism (depending on the individual) are becoming clear. What this means, in figurative terms, is that high tide is now beginning in the ocean of truth. This high tide will ultimately sweep away the sandcastles of false dietary dogma, and redefine the shoreline of the ocean of truth. When such major changes occur, you have a choice--stay with the enchanting but dissolving sandcastle, or move to higher ground. Here, the higher ground refers to a place where there is no dogma, save truth and honest concern for your health. Ultimately, these two concerns--truth and your health--are the ones that matter the most, as you search for a diet that serves your body, mind, and spirit in the best way possible. --Tom Billings

Comparative Anatomy and Physiology Brought Up to Date
Are Humans Natural Frugivores/Vegetarians, or Omnivores/Faunivores?
by Tom Billings
Copyright © 1999 by Thomas E. Billings. All rights reserved.


An Invitation to Readers
If you are--or have ever been--involved in alternative diets or vegetarianism, you have probably heard or read claims that comparative anatomy and/or comparative physiology "proves" or (in more conservative language) "provides powerful evidence" that humans are "natural" fruitarians, vegetarians, or even omnivores. This paper will assess such claims and the evidence supporting them, but we first need to ask:

• Whether comparative anatomy and physiology provide actual hard proof of the precise composition of the "natural" human diet, or if they merely provide general indications of possible diets.
• Then, we want to go beyond the typical simplistic analyses presented in the vegetarian and alternative diet lore, and reexamine what information comparative anatomy and physiology actually provide regarding the natural diet of humans.
• Further, a number of related claims are often made as part of such comparative "proofs." Some of the claims addressed here are:
  o Does "instinct" (whatever that is) "prove" that we are natural vegetarians?
  o Does research that shows the typical Western meat-based diet is unhealthy prove that all omnivore diets are unhealthy?
  o And, since it is mentioned in the subtitle, just what is a faunivore, anyway?

If these questions interest you, I invite you along for the ride. But first, open your mind and fasten your seat belt--the ride may be bumpier than you expect!

If your time is limited, or you have specific interests...
As this paper necessarily addresses numerous, diverse topics (some in depth), it is lengthy, which may present obstacles for some readers. Also, considerable background information must be covered before the main claims of the major comparative "proofs" of diet can be addressed. To help navigate the numerous topics, to gain a quick bird's-eye view, or to read the paper in sections as time permits, the comprehensive Table of Contents below provides direct links to all the major sections and subsections of the paper.

TABLE OF CONTENTS
• PART 1: Brief Overview: What is the Relevance of Comparative Anatomical and Physiological "Proofs"?
• PART 2: Looking at Ape Diets: Myths, Realities, and Rationalizations
• PART 3: The Fossil-Record Evidence about Human Diet
• PART 4: Intelligence, Evolution of the Human Brain, and Diet
• PART 5: Limitations on Comparative Dietary Proofs
• PART 6: What Comparative Anatomy Does and Doesn't Tell Us about Human Diet
• PART 7: Insights about Human Nutrition and Digestion from Comparative Physiology
• PART 8: Further Issues in the Debate over Omnivorous vs. Vegetarian Diets
• PART 9: Conclusions: The End, or The Beginning of a New Approach to Your Diet?


PART 1: Brief Overview--What is the Relevance of Comparative Anatomical/Physiological "Proofs"?
Comparative anatomy and physiology (as well as more general comparative approaches that incorporate additional information) are often cited by advocates of particular diets as providing "proof" that the diet they promote is humanity's "natural" diet, and hence best or optimal in some sense. The object of this paper is to examine, in detail, the claims made in the assorted "proofs" based on comparative studies. To be fair, it should also be pointed out that some of those presenting comparative studies openly admit that such studies do not provide hard proof of the natural human diet, but instead merely contribute evidence in favor of the particular diet they advocate.

Along the way, this paper will address many of the claims made in:
• The Comparative Anatomy of Eating, by Milton R. Mills, date unknown (a popular comparative "proof" on the Internet among the raw/vegetarian communities).
• Fit Food for Humanity, no author cited, 1982, Natural Hygiene Press. This appears to be an update of Fit Food for Man, by Arthur Andrews, 1970.
• In addition, some of the claims made in What Might Be the "Natural" Diet for Human Beings, by Ted Altar, are briefly discussed here as well.

Other claims: Over and above claims made in the above "classic" sources in the alternative dietary literature of the past, an extensive cataloguing of other claims typically made as part of such comparative "proofs" is also reviewed and discussed. As well, a number of other assertions that go beyond comparative anatomy/physiology itself, but that often come up in discussions related to the issue of omnivorous vs. vegetarian diets, are covered here for completeness, since they will be of keen interest to those interested in these subjects. Citations for these claims are not given, as in some cases the sources are, in my opinion, extremists (often fruitarians); citing such works by author's name might needlessly inflame and detract from a focus on the issues themselves, or might merely incite personal attacks by hostile but allegedly "compassionate" extremists. The approach taken here will be to paraphrase the latter claims, trusting that readers will be able to recognize such claims and their proponents should they subsequently run across them elsewhere.

The Logic and Structure of Comparative "Proofs"
The basic argument underlying comparative anatomy/physiology "proofs" is that comparing humans with other species--via a list of anatomical and/or physiological features--somehow "proves" that humans have a particular, often quite specific/narrow, diet. This conclusion is allegedly reached if sufficient features match in the comparison list.

Subjective nature of traditional "proofs." However, examination of the various "proofs" reveals
considerable variation in length of lists used for the comparisons, in the level of detail in each list, and in the number of supposedly matching features provided as "proof." In addition to the question of whether the method is logically valid to "prove" the human diet is restricted to a narrow range, one immediately observes that there appears to be considerable subjectivity in determining the level of detail in the comparison list. And, for that matter, just how many "matches" are required for "proof," anyway? (And what validation is there for such a "magic" number of matches?)
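To see how sensitive such match-counting is to these subjective choices, consider a minimal sketch in Python of the method's underlying logic. The feature list, the comparison species data, and both thresholds below are entirely hypothetical; the point is only that the verdict flips with the advocate's choice of the "magic" number of required matches.

```python
# Sketch of the comparative-"proof" method: count matching features between
# humans and a comparison species, then declare "proof" when the match count
# reaches some threshold. Features and thresholds here are hypothetical;
# both are chosen by the advocate, which is exactly the problem.

def comparative_proof(human, other, required_matches):
    matches = sum(1 for feature in human if human[feature] == other.get(feature))
    return matches >= required_matches

human = {"flat_molars": True, "claws": False, "long_gut": True, "sweats": True}
ape   = {"flat_molars": True, "claws": False, "long_gut": True, "sweats": False}

print(comparative_proof(human, ape, required_matches=3))  # True  -> "proved!"
print(comparative_proof(human, ape, required_matches=4))  # False -> "disproved!"
# Same data, different "magic number" of required matches, opposite verdicts.
```

The same instability appears if one lengthens or shortens the feature list itself, which is why such "proofs" tell us more about the list-maker than about the natural human diet.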


What do comparative "proofs" actually show? As we will see later, comparative anatomy and
physiology are not so precise as to give definitive, narrow answers to the question of what is the natural diet of humanity. Instead, comparative anatomy and physiology provide evidence of associations and possibilities. We will also see that those who present simplistic comparisons, and claim they absolutely "prove" humans naturally must follow a narrow, restricted diet (e.g., fruitarianism), are not telling you the whole story. Unfortunately, this pattern--giving you only a small part of the information available (in figurative terms, half-truths)--is common in the raw and vegan movements at present (in my opinion).

Comparative Anatomy and Physiology are Legitimate Tools
It should be emphasized, and very clear to readers, that comparative anatomy and comparative physiology are legitimate, useful, and important tools for scientific inquiry and research. In particular, the following are relevant:

• Comparative anatomy and physiology are used, legitimately, in biology, paleoanthropology, and other fields.
• It is not the intent of this paper to criticize, minimize, or denigrate the valuable (and important) disciplines of comparative anatomy and physiology.
• The intent of this paper is to analyze whether the simplistic comparative studies presented by dietary advocates actually prove that the "natural" human diet is as specific as claimed by the dietary advocates. This paper will look at some of the limitations of such comparative analyses. Further, it will provide background information for some of the topics covered in comparative studies, so you can understand just how simplistic the "proofs" provided by dietary advocates usually are.

Comparative anatomy is a valid tool, but simplistic applications are often fallacious. The
basic question of this paper is not whether comparative anatomy and physiology are valid tools (they clearly are, though we will see that applying comparative anatomy and physiology to humans is problematic), but whether the simplistic analyses presented by dietary advocates are legitimate or "scientific" (we will see later that they are not). So, the question here is not whether the tool is valid, but the specific application of the tool--i.e., whether the tool is being used honestly and properly.

Emphasis on Primates
Modern human beings, species Homo sapiens, are classified as belonging to the primates. Accordingly, as this paper focuses on comparisons, the non-human primates will often be used for comparison. Humanity's prehistoric ancestors--extinct primates, including both human and non-human species--are also discussed as appropriate. When necessary, non-primate species are discussed as well, but most attention here is on primates.

Helpful Notes for the Reader, and Special Terminology
• This paper uses the shorthand notation "veg*n" as an inclusive abbreviation for "vegan and/or vegetarian" (where "vegan" means no foods of animal origin whatsoever, and "vegetarian" means dairy and/or eggs may be consumed). This notation is coming into use in some sectors of the online (Internet) vegan/vegetarian community, and is used here to avoid needless repetition and cumbersome phraseology.


• The abbreviations "SAD" (for "standard American diet") and "SWD" (for "standard Western diet") are also used extensively throughout this paper to describe the by-now well-known unhealthful standard diets consumed by the majority of the populace in most economically well-off countries. Such diets are typically high in overall fats, particularly saturated fats; high in hydrogenated oils ("trans" fats); high in refined sweeteners and denatured (highly processed) foods; low in essential polyunsaturated fats of the omega-3 variety; and low in fruits and vegetables, particularly fresh fruits and vegetables.

• Readers should be aware that this writer is a long-time vegetarian (since 1970). My personal motivations for being a vegetarian are spiritual, rather than "scientific" or based on "philosophical naturalism."

• Intended audience. This paper is written for both the conventional and raw veg*n communities. In particular, some of the information here contradicts and discredits the claims of certain extremists, many (but not all) of whom are raw fruitarians. If you are part of the large majority of conventional veg*ns who in fact are not extremists, then I hope that the references made herein to fruitarian extremists will be educational. They provide what I hope you will find to be interesting insights into the strange crank-science claims and phony naturalistic philosophy that, unfortunately, are often a part of the basis for vegan fruitarianism. Also, quite a number of these claims are often utilized as underpinnings for conventional veganism itself, at least by those who are more extreme in their thinking about it; thus this information should be of interest to the wider vegetarian community as well. Also, as fruitarian extremists will be mentioned here, please note that some fruitarians are moderates and are not extremists. The fanaticism promoted by certain extremist fruitarians does not reflect the more moderate beliefs and practices of at least some mainstream fruitarians.

Defining Fruitarianism. Inasmuch as the term fruitarian has significantly different meanings
when used by different people, it is appropriate to specify the definition used here: A diet that is 75+% raw fruit (where fruit has the common definition) by weight, with the remaining foods eaten being raw and vegan. A common rather than botanical definition of fruit is used--i.e., the reproductive part of a plant that includes the seed and a juicy pulp. This definition includes sweet juicy fruit, oily fruit like avocados and olives, and so-called "vegetable fruits" like cucumbers and tomatoes, but excludes nuts, grains, and legumes (all of which are "fruits" per the more general botanical definition, but not as the term is used here). Note that most fruitarians eat other raw foods to comprise the non-fruit part of their diet; such foods may include (by individual choice) raw vegetables, nuts, sprouted grains, legumes, seeds, and so on. The definition used here is somewhat strict, but it does match closely the idealistic and puritanical diets advocated by the more extreme fruitarians. Some individuals use a different definition; when discussing diet with a fruitarian, the first thing to determine is what the diet actually is, as it can vary substantially from one individual to another.
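To make the percent-by-weight test implied by this working definition concrete, here is a minimal sketch in Python. The 75% threshold and the requirement that the remainder be raw and vegan come straight from the definition above; the function name, food items, and gram weights are hypothetical, for illustration only.

```python
# Minimal sketch of the working definition used in this paper: 75+% raw fruit
# by weight (common definition of fruit), with the remaining foods raw and vegan.
# The function name, food items, and gram weights below are hypothetical.

def is_fruitarian(foods):
    """foods: list of (grams, is_fruit, is_raw, is_vegan) tuples for one day."""
    total = sum(grams for grams, _, _, _ in foods)
    fruit = sum(grams for grams, is_fruit, is_raw, _ in foods if is_fruit and is_raw)
    remainder_ok = all(is_raw and is_vegan for _, _, is_raw, is_vegan in foods)
    return total > 0 and fruit / total >= 0.75 and remainder_ok

day = [
    (1200, True,  True, True),  # sweet juicy fruit
    (200,  True,  True, True),  # avocado: oily fruit counts as fruit here
    (300,  False, True, True),  # raw vegetables: allowed in the non-fruit remainder
]
print(is_fruitarian(day))  # True: 1400/1700 (about 82%) fruit by weight, rest raw vegan
```

Of course, as noted above, real-world usage of the term varies from person to person; a one-line threshold test cannot capture that variation, which is precisely why one should always ask what a given fruitarian's diet actually is.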

I hope that you find this paper to be interesting, that the material here is "mentally digestible," and that you will use the information as a positive opportunity to examine the assumptions that underlie your personal dietary philosophy. Enjoy!

PART 2: Looking at Ape Diets--Myths, Realities, and Rationalizations

Dietary Classifications & Word Gamesmanship

Dietary categories are not strict in nature
A good place to begin our exploration of the topic of natural diet as it relates to morphology (anatomical form) is with dietary classifications. In nature, dietary classifications are rarely strict. Herbivores (folivores) routinely consume significant amounts of insects (animal matter) on the leaves they eat. Some folivores--e.g., gorillas, including the mountain gorilla--may deliberately eat insects. Carnivores may seek out and deliberately consume grasses and other plant matter on occasion, and they may consume the stomach contents (vegetation) of herbivores they prey on. Frugivores normally eat some leaves and/or animal matter in addition to their primary food of fruit. Extreme diets--diets that are 100% of a specific food or narrow food category--are not very common in nature. In sharp contrast to the spectrum of broad-based diets that one finds in nature, however, one can easily find dietary advocates promoting (human) diets that have a relatively narrow basis: 100% vegan, 100% raw vegan, 75+% fruit, and so on.

High variability of primate diets
Primates may frequently switch dietary categories
When discussing dietary categories, it is very easy to forget that extremes are rare in nature, and instead focus exclusively on the central food (or food category) consumed, while ignoring the other foods consumed. If one classifies chimps as frugivores, it is easy to forget that they also consume smaller but nutritionally significant quantities of leaves and animal foods (both invertebrates--insects--and vertebrates, including other mammals). The fact that humans are primates is also relevant, for primate diets tend to be highly variable on a month-to-month basis in the wild. Chapman and Chapman [1990] reviewed 46 long-term studies of wild primates, with attention to monthly variations in diet. They noted (pp. 121, 126):

...primates do not consistently combine the same kinds of foods in their diets, as many past characterizations would suggest, but rather, that they often switch between diet categories (e.g., fruit, insects, etc.)... Our review of primate diets on a monthly temporal scale suggests that primates do not always consistently include the same kinds of foods in their diets. Instead, primate populations frequently switch between diet categories.

Animal food consumption common
Harding [1981], noting the widespread reports of predatory behavior and meat consumption by non-human primates, makes the interesting comment (p. 191; my explanatory comments are in brackets [ ] below): It is now clear that several primate populations make regular and substantial use of precisely the type of food [animal flesh] which the early theories described as instrumental in the emergence of the hominids. If the diets of these particular nonhuman primates are more broadly based than we had thought, then how accurate is it to characterize contemporary primate diets in general as "vegetarian"?... As Teleki points out (1975: 127ff.), such terms are first used as shorthand references to a particular dietary specialization but then gradually become inclusive descriptions of an animal's entire diet. Essential elements of the diet are ignored, and the result is a generally misleading impression of what a group or population actually eats. As this article shows, diversity rather than specialization is typical of primate diets.


Drawing boundaries between diet categories is non-trivial
As mentioned above, extreme diets are rare in nature. Instead, there is a spectrum of diets, and drawing lines or boundaries in the multi-dimensional dietary spectrum to distinguish between, say, frugivores and faunivores [faunivore = meat-eater] is a non-trivial problem. The reality that primates may frequently switch dietary categories makes the situation even more complicated. Chapman and Chapman [1990, p. 123], citing Mackinnon [1974, 1977], report that in one month, orangutan diet in Sumatra was frugivorous (90% fruit by feeding time), but the same population at a different time was folivorous (75% leaves by feeding time). One month a frugivore, later a folivore: how to classify? Additional evidence on the variability of orangutan diets is given in Knott [1998, p. 40]. Further, there is even some academic disagreement over the definition and use of the term "omnivore." These topics are addressed later in this paper.
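The classification instability described above can be made concrete with a minimal sketch in Python of a naive "majority of feeding time" classifier. The two sample months echo the Sumatran orangutan figures cited above (90% fruit by feeding time in one month, 75% leaves at another time); the 50% cutoff, the function name, and the remaining fractions are hypothetical, for illustration only.

```python
# Naive "majority of feeding time" classifier over one month's diet fractions.
# The 50% cutoff is an arbitrary illustrative choice; the two sample months
# echo the Sumatran orangutan figures cited above.

def classify(fractions, cutoff=0.5):
    """fractions: dict mapping diet category -> share of feeding time."""
    category, share = max(fractions.items(), key=lambda item: item[1])
    return category if share >= cutoff else "mixed"

month_a = {"fruit": 0.90, "leaves": 0.07, "insects": 0.03}
month_b = {"fruit": 0.20, "leaves": 0.75, "insects": 0.05}

print(classify(month_a))  # "fruit"  -> label the population frugivorous
print(classify(month_b))  # "leaves" -> label the same population folivorous
# Same population, two months, two different labels: where is "the" category?
```

Any such threshold scheme produces the same paradox the orangutan data illustrate: the label depends on when you sample and where you draw the line, not on some fixed "natural" category.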

Common vs. scientific diet category terms
Use of the cultural terms "vegan" and "fruitarian" versus "folivore" and "frugivore"
Another relevant aspect of dietary classifications: The terms vegan, fruitarian, and vegetarian are all human cultural terms, and are used, at least by most dietary advocates, with narrow definitions. One may hear an advocate claiming that "apes are vegans," then later the same advocate might criticize others for using the same terminology--i.e., "apes are NOT vegans"; the specific criticism being that a biological term is more appropriate (e.g., folivore, frugivore, faunivore). In this article, a primary objective is to communicate concepts clearly and accurately but without undue complexity. Hence, since most readers would generally understand that the phrase "apes are NOT vegans" means that a human who eats the same diet as the apes would not qualify as a vegan, here we won't worry about the technical terms unless/until they are necessary to assist the reader in understanding the material. (Technical terms are used extensively in certain sections of this paper, but will be defined as we go along.)

To Summarize:
• In general, dietary classifications in nature are not as distinct/clear as the narrow, simplistic, precisely defined diets promoted by certain dietary advocates.
• Because dietary classifications are not strict in nature, a frugivore might deliberately eat some animal foods, hence not qualify as "vegetarian" as the term applies in human culture (similar remarks apply to folivores).
• Primate diets tend to be highly variable in the wild.

The Evidence of Ape/Primate Diets
Preface. A comparison of humans vs. allegedly "vegetarian" anthropoid apes is frequently a part of comparative anatomy/physiology "proofs" that assert humans are natural vegetarians. ("Anthropoid" means the most human-like apes in the "great apes" family, which includes the chimpanzee, gorilla, and orangutan.) However, as knowledge of the actual diet of wild primates/anthropoid apes (from field observation studies) increases, the reality that most primates/apes include some animal foods (even if only insects) in their normal, natural diet is becoming better known. As a result of this, the myth of the "vegetarian ape" is slowly slipping away.


Ape diets, with emphasis on chimpanzees, are summarized in the article "Setting the Scientific Record Straight on Humanity's Evolutionary Prehistoric Diet and Ape Diets" on this site. Additionally, some of the rationalizations by dietary advocates that have been advanced as part of attempts to preserve the myth of "vegetarian apes" are discussed in "Selected Myths of Raw Foods." The current section serves primarily to supplement the above articles, and to address other, newer rationalizations and misinformation promoted by dietary advocates who stubbornly cling to the "apes are vegetarians" myth.

Claims Made in Fit Food for Humanity
The booklet Fit Food for Humanity [Natural Hygiene Press 1982] makes what could be considered the classical comparative argument: humans vs. anthropoid apes. The booklet argues that:
• Anthropoid apes are natural vegetarians (pp. 6-7).
• Humans are very similar to anthropoid apes, as established via a comparative anatomy/physiology table (pp. 8-9).

The booklet then reaches the conclusion that humans are natural vegetarians, i.e., that the vegetarian diet is the (only) diet in accord with human anatomy. However, the line of argument given above is incorrect and logically insufficient to establish the claimed result. The structural problems in the above type of argument are addressed in later sections. This section will focus specifically on the issue of ape diets.

"Vegetarian" apes: a misconception of the past First, as mentioned above, the idea that apes (or primates in general) are strict vegetarians in the normal human sense of the word is a misconception of the past. This point is clarified in Sussman [1987, pp. 166, 168]: In fact, most species of primate are omnivorous (see Harding [1981]) and omnivory should be considered an evolutionarily conservative and generalized trait among primates. Primates evolved from insectivores.... Thus, omnivorous primates are mainly frugivorous and, depending upon body size, obtain most of their protein from insects and leaves. In all large, omnivorous, nonhuman primates, animal protein is a very small but presumably necessary component of the diet. In the above, the term omnivore has the usual definition; e.g., from Milton [1987, p. 93]: "By definition, an omnivore is any animal that takes food from more than one trophic level. Most mammals are in fact omnivorous...". ["Trophic" refers to the different levels of the food chain.] Note that some experts use a different, more precise definition for the term omnivore, and disagree that mammals are omnivores--instead they suggest using the term faunivore for animals that regularly include fauna (other animals) in their diet.

Insect food
Regarding consumption of animal foods by primates, Hamilton and Busse [1978, p. 761] note: Many primate species once considered herbivorous are now known to expand the animal-matter portion of their diet to high levels when it is possible to do so... Insect food is the predominant animal matter resource for primates. Insects are eaten by all extant apes, i.e., chimpanzees (e.g., Lawick-Goodall 1968), orang-utans (Galdikas-Brindamour [fn. 1]), gorillas (Fossey [fn. 2]), gibbons (Chivers 1972; R.L. Tilson [fn. 3]), and the siamang (Chivers 1972). The amount of insect matter in most primate diets is small, but may expand to more than 90% of the diet when insects are abundant and easily captured... Preference for animal matter seems confirmed. Note that the bracketed footnote numbers in the quote above refer only to Hamilton and Busse [1978].

Rationalizations about Dietary Deviations among Primates
Fit Food for Humanity does include notes on "Dietary Deviations Among the Primates" (pp. 11-12). It is interesting to note that most (but not all) of the references cited therein are encyclopedia entries--which usually do not reflect the latest research. The response in Fit Food for Humanity to the information that anthropoid apes are not strict vegetarians could be characterized as reliance on outdated information, rationalizations, and hand-waving. Let's review some of the claims. ("FFH" is used as an abbreviation for Fit Food for Humanity in the material below.)

FFH: Gorillas are total vegetarians. REPLY/COMMENTS: Both lowland and mountain gorillas consume insects, both deliberately and indirectly (that is, on the vegetation they consume). The above quote from Hamilton and Busse
[1978] cites Fossey (personal communication) regarding insect consumption by mountain gorillas. Tutin and Fernandez [1992] report consumption of insects by lowland gorillas in the Lope Reserve, Gabon: termites (whose remains were contained in 27.4% of gorilla feces) and weaver ants. Note that both insects mentioned are social insects; the consumption of social insects is efficient, as their concentration in nests allows easy harvesting of significant quantities. Of further interest here is the information that termites are known to contain significant quantities of vitamin B-12; see Wakayama et al. [1984] for details. Insectivory by mountain gorillas is discussed further later in this section.

FFH: Orangutans consume 2% insects; from p. 11: "the 2% digression may be seen as incidental and insignificant." REPLY/COMMENTS: The quote from FFH does not specify whether the 2% is by weight or feeding
time. Due to the difficulties in estimating weights of foods consumed, the 2% figure is probably by feeding time. Galdikas and Teleki [1981] report that orangutans at Tanjung Puting Reserve in Indonesia consumed 4% fauna (insects, eggs, meat) by feeding time. Kortlandt [1984] reports that (p. 133), "orang-utans eat honey, insects and, occasionally, bird's eggs, but no vertebrates." For photos of a wild orangutan eating insects, see Knott [1998], p. 42; and for a photo of a wild orangutan eating a vertebrate--a rare event--see Knott [1998], p. 54. The claim that insect consumption by orangutans is "insignificant" is clearly an unproven assumption. Insects and other animal foods are nutrient-dense foods: they supply far more calories and nutrients per gram of edible portion than the same weight of most of the plant foods commonly consumed (i.e., fruits other than oily fruits, and leaves).

FFH: The principal rationalizations given for termite and meat-eating by the chimps of Gombe Preserve are:
• The Gombe Preserve is small, and surrounded by human-populated areas.
• Chimps are "natural" imitators.


FFH then implies (assumes) that the behavior of the chimps of Gombe is in imitation of human behavior. Other writers (elsewhere, not in FFH) suggest that chimps eat meat in imitation of baboons.

REPLY/COMMENTS: The reality is that predation on vertebrates by chimpanzees is widespread throughout tropical Africa. Regarding chimpanzee predation, Teleki [1981, p. 305]
reports that: Moreover, predatory behavior involving vertebrate prey has now been recorded at all major study sites in equatorial Africa, from Uganda and Tanzania to Sierra Leone and Senegal. I expect that the known geographical distribution of predatory behavior will continue to expand as new chimpanzee projects are launched, though it is probable that some populations practice this behavior little or not at all, while others do so regularly and systematically (Teleki, 1975). Note in the above remark that predation by chimps has been found at all major study sites, although it is possible that some groups of chimps hunt rarely, or not at all. Over and above the reality that predation and meat-eating by chimps is widespread, the claim that chimps do so in imitation of humans or baboons is both unproven and dubious. Chimps have lived in proximity to both humans and baboons for approximately 2-2.5 million years. This alone suggests that sufficient time has elapsed, in evolutionary terms, for chimps to adapt to such allegedly "imitation" behavior. Once evolutionary adaptation occurs, the "imitation" behavior would no longer be an imitation (supposing it were that, in some hazily conceived past)--it is natural. This reasoning suggests that the "imitation" argument is dubious at best. Another problem with the "imitation" argument is that imitative learning in captive chimps is common, but in wild chimpanzees it is rare; see Boesch and Tomasello [1998] for discussion on this point.

Chimp/Baboon Interaction. The interaction between baboons and chimps is quite interesting and
serves to illuminate the shallow nature of the "imitation" argument. Teleki [1981, pp. 330-331] comments on: ...the anomalous nature of an interspecific relationship that includes play with baboons, consumption of baboons, and competition with baboons for at least one kind of prey... [T]he Gombe chimpanzees removed 8% of the local baboon population in 1968-1969 (Teleki, 1973a) and 8-13% of the local colobus population in 1973-1974 (Busse, 1977). How is it possible, then, that the primates serving as prey to chimpanzees in Gombe National Park, and possibly also at other sites, have not developed more successful defensive tactics? Any answer other than the proposition that chimpanzees have only recently acquired predatory inclinations, for which there is no supportive evidence at all (Teleki, 1973a), would be welcome. Note: The above quote is included specifically to inform readers that there is no evidence that predation by chimps is a "new" behavior, and that there is extensive, complex baboon/chimp interaction.

Insect Consumption by Chimps is Universal. Kortlandt [1984] also discusses insect consumption by
chimps (p. 133): Chimpanzees, on the other hand, spend a remarkable amount of time, mental effort and tool use on searching out insects and feeding on them in every place where they have been intensively studied. Hladik and Viroben (1974) have shown that this insect food is nutritionally important in order to compensate for a deficiency of certain amino acids in the plant foods, even in the rich environment of the Gabon rain-forest.


Other "Apes Are Vegetarians" Claims
Outdated quote from Stevens and Hume [1995]
On occasion, one sees the following quote from Stevens and Hume [1995, p. 76] cited in support of the "apes are vegetarians" myth: The gorilla, chimpanzee, and orangutan are generally considered to be strict herbivores, although there is evidence that chimpanzees may hunt and eat termites and smaller monkeys (van Lawick-Goodall 1968). The Stevens and Hume [1995] reference (a book, Comparative Physiology of the Vertebrate Digestive System) is, in general, a good reference. However, the specific quote above is clearly outdated. (In a later section, we will discuss another quote from Stevens and Hume [1995] that is also misused by certain dietary advocates.)

Less predominant foods often "assumed away" by fiat. We have already (briefly) discussed insect
consumption by gorillas and orangutans. Though insects are a small component of their diet, we note the following:
• Such a dietary component cannot be "assumed away," as it may be nutritionally significant, as discussed above.
• A human who consumes insects (live insects are obviously a "living food"!) would not be considered a vegan or vegetarian according to the strict dietary definitions promoted by dietary advocates.

Given that the Stevens and Hume [1995] quote mentions Jane van Lawick-Goodall [1968], the dated nature of the quote is easily illustrated by van Lawick-Goodall [1971] herself, where she reports that the chimps of Gombe consume a wide array of insects, bird's eggs, and chicks, and the following vertebrate prey (from pp. 282-283): Most common prey animals are the young of bushbucks (Tragelaphus scriptus), bushpigs (Potamochoerus porcus) and baboons (Papio anubis), and young or adult red colobus monkeys (Colobus badius). Occasionally the chimpanzees may catch a redtail monkey (Cercopithecus ascanius) or a blue monkey (Cercopithecus mitis). As the information has begun to spread within the raw vegan community that ape diets include some animal foods, even if only a limited amount of insects, certain raw vegan advocates have reacted with brand-new rationalizations and claims, which we examine next.

Chimp predation is supposedly "an error" or "maladaptive"
One rationalization made is that the chimps eating meat are acting "in error" or are "perverted." The logical fallacies of such crank/bogus claims are discussed briefly in Selected Myths of Raw Foods. (One wonders what irrational beliefs drive those fruitarian extremists who appear to think they know more about chimp diet than the chimps themselves?) Yet another rationalization promoted by fruitarian extremists is that meat-eating by chimps is "maladaptive," i.e., anti-evolution. Those who promote this claim have no credible evidence to support their views, and they conveniently ignore that the consumption of insects--an animal food--is apparently universal (or nearly universal) among adult chimpanzees. The comments of Kortlandt [1984, p. 134] address the above rationalizations: We should consider here that hunting and killing vertebrates involves health hazards resulting from injury, and that eating meat implies the risks of parasitic and contagious diseases. Such behavior, therefore, strongly indicates certain deficiencies in the composition of the available vegetarian diet.... Similarly, the preference for eating the brain of the victim suggests a lack of lipids and possibly other components in the diet.

Meet the mountain gorilla, the new "model ape"
Recently, some new claims have surfaced regarding the mountain gorilla:
• Mountain gorillas are the "closest" to humans of the anthropoid apes.


• Mountain gorillas are strict vegans; Schaller [1963] (a by now outdated source of information on this particular point) is usually cited as the reference for this claim.

The new claims cited above are interesting. While there have been claims in the alternative diet movement in the somewhat distant past emphasizing gorillas as models of fruitarian diet (see Dinshah [1976-1977], for example), advocates since then have tended to concentrate more on the chimpanzee. Now, however, after years of using chimps as the "model vegetarian ape," such advocates are changing course. The above claims appear to attempt once again to designate the mountain gorilla as the NEW "model vegetarian ape," and thereby divert attention from the (now more widely acknowledged) news of chimps who eat meat and termites.

Examining the Claims about Gorillas

DNA analysis of hominoids
Let's now examine the above claims. The claim that mountain gorillas are the "closest" to humans is simply false, and DNA evidence is available for analysis here. Sibley and Ahlquist [1984] analyzed the DNA sequences of hominoid primates: humans, chimpanzees (including bonobos), gorillas, orangutans, etc. (Note: "Hominoid" is a grouping that includes both the "anthropoid apes"--also known as the "great apes"--plus humans as one related family of primates based on their shared "human-like" traits.) Their research produced a matrix of Delta T(50)H values--a distance function that measures differences between DNA sequences. The distances they found (from Sibley and Ahlquist [1984, p. 9]) are:

SPECIES COMPARED        GENETIC DISTANCE (Delta T(50)H)
Human / Chimpanzee      1.9
Human / Bonobo          1.8
Human / Gorilla         2.4
Human / Orangutan       3.6

Note that the human/chimp and human/bonobo DNA distances are lower numbers than the human/gorilla DNA distance. Hence we observe that chimps and bonobos are "closer" to humans than gorillas are. Sibley and Ahlquist expanded their 1984 research with similar results; see Sibley and Ahlquist [1987] for details (the paper also includes a review of related research papers). Felsenstein [1987] provides even more insight into the Sibley and Ahlquist research via a statistical analysis of their data set. Similar results come from earlier research. Goodman [1975] compared hemoglobin amino-acid chains, and computed amino-acid distances for humans versus selected non-human primates. Goodman's methods were slightly different but the results were similar to the above: chimps are "closer" to humans than gorillas are. (See Goodman [1975, p. 224] for the amino-acid distance values.)
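(As an aside for technically inclined readers, the "closeness" logic here is trivially checkable. Below is a minimal Python sketch--purely illustrative, using only the Delta T(50)H values quoted in the table above--confirming that the smallest distance identifies our closest relatives among the species compared.)

    # Delta T(50)H distances from Sibley and Ahlquist [1984], as tabled above.
    # Lower values indicate more similar DNA, i.e., a "closer" relative.
    delta_t50h = {
        "Chimpanzee": 1.9,
        "Bonobo": 1.8,
        "Gorilla": 2.4,
        "Orangutan": 3.6,
    }

    closest = min(delta_t50h, key=delta_t50h.get)
    print(closest)  # -> Bonobo, with chimpanzee a close second; gorilla is farther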

Mountain gorillas eat insects
Beyond the erroneous genetic-distance assertion, the claim that mountain gorillas are strict vegans (in the normal human sense) is incorrect as well. Mountain gorillas are folivores, and their predominant food is leaves; however, at least some mountain gorillas deliberately consume insects. Watts [1989] observed mountain gorillas feeding on driver ants in the Virunga Volcanoes region of Rwanda and Zaire.


Deliberate consumption. Watts [1989 (pp. 121, 123)] notes:
Not all gorillas in the Virungas population eat ants... Gorillas become very excited during ant-eating sessions, and even silverbacks who are not eating ants sometimes give chest-beating displays... Some mountain gorillas eat ants with striking eagerness and excitement, and the rate of food intake per unit feeding time is higher than for other means of insectivory (Harcourt and Harcourt, 1984). But ant-eating is so rare that, like searches for egg cases inside dead plant stems (ibid), it probably is not nutritionally important. It may differ in this respect from termite-eating by lowland gorillas in Gabon (Tutin and Fernandez, 1983)... Fossey and Harcourt [1977] report that mountain gorillas in Rwanda consume grubs.

Inadvertent consumption more important? Harcourt and Harcourt [1984] analyzed inadvertent
insect consumption by gorillas in the Virunga Volcanoes area. They note (pp. 231-232): ...adding all sources of invertebrate material produces a daily consumption of about 2g of animal matter per adult gorilla... Clearly gorillas inadvertently eat many invertebrates during their daily vegetarian diet, and far more than they eat deliberately... [T]he daily 2g of invertebrates makes up just 0.01% of the intake... When invertebrates occur at high concentration in the environment they are deliberately sought by the gorillas, and not a little time sometimes expended in their procurement... Invertebrate consumption could be necessary to satisfy trace element requirements in a species with a near-monotypic vegetarian diet [Wise, 1982]. Harcourt and Harcourt [1984] also suggest that deliberate insect consumption by gorillas is (probably) nutritionally unnecessary since the prevailing level of inadvertent insectivory provides adequate nutrition.
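(A minimal arithmetic check for the curious: taken together, the quoted figures of 2 g and 0.01% imply a very large total daily food intake. The Python sketch below assumes the 0.01% figure is by wet weight, which the source does not state explicitly.)

    # 2 g of invertebrates is said to be just 0.01% of total daily intake:
    invertebrates_g = 2.0
    fraction_of_intake = 0.0001          # 0.01% expressed as a fraction
    total_intake_kg = invertebrates_g / fraction_of_intake / 1000.0
    print(total_intake_kg)               # -> 20.0 kg/day, mostly vegetation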

Insect consumption by lowland gorillas
The situation is somewhat different for lowland gorillas. Lowland gorillas are known to more heavily consume social insects. Tutin and Fernandez [1992] analyzed the insect consumption of lowland gorillas and chimpanzees in the Lope Reserve, Gabon. They found insect remains in 27.4% of gorilla feces, versus 20.2% in chimp feces. Tutin and Fernandez disagree with the suggestion by Harcourt and Harcourt [1984] that deliberate insectivory by gorillas may be nutritionally unnecessary. Tutin and Fernandez [1992] explain that lowland gorillas eat more fruit than mountain gorillas, and preprocess much of their herbaceous foods (using their hands). The result is that despite higher insect density in the lowland gorilla habitat, inadvertent insect consumption by lowland gorillas is less than that for mountain gorillas. They then conclude that deliberate insectivory is necessary for lowland gorillas. From Tutin and Fernandez [1992 (p. 36)], with my explanatory comments in brackets [*]:


However, given the relatively frugivorous diet and the selective processing of their herbaceous foods, inadvertent insectivory by gorillas at Lope is likely to be minimal and [deliberate consumption of] social insects probably serve the same critical role in nutrition as they do for chimpanzees. [Hladik & Viroben, 1974; Redford, 1987].

Actual mountain gorilla diets vs. fruitarianism
The choice of mountain gorilla as the "model ape" by advocates of fruitarian diets is both ironic and humorous. The mountain gorilla is a folivore and consumes little fruit. Schaller [1963] observed the mountain gorillas feeding, and he actually tasted some of the foods consumed by the gorillas. Schaller [1963] provides a table (table #39, p. 372) listing the plant species and parts consumed, and how they tasted. Of 24 plant species/parts listed, only 6 were fully palatable, 1 was described as "mealy," and the remaining 17 (~71%) were described as one or more of: bitter, sour, astringent, or resinous. Additional evidence that the gorilla diet is not very palatable to humans is in Koshimizu et al. [1994], which reports that ~30% of mountain gorilla diet items are bitter in taste to humans. Koshimizu et al. [1994] also cite an interesting hypothesis by Harborne [1977] that the mountain gorilla tolerates or even likes bitter-tasting foods. The irony here is, of course, that the fruitarian advocates who use the mountain gorilla as "model ape" do not recommend a diet whose predominant tastes are bitter, sour, astringent, or resinous. Instead they often promote a diet that by default is high in sugar from modern, hybridized sweet fruits.

Bonobos: Last Stand for Extremists?
Now that the information that (regular) chimp diet includes some meat and insects is becoming well-known, some fruitarian extremists have adopted the diversionary strategy of disregarding chimps and trying to center attention on bonobos, the pygmy chimpanzee. The bonobo has not been studied as thoroughly as chimps. An additional attraction to the fruitarian extremist is the common figure of an 80%-fruit diet for bonobos. However, Chivers [1998, p. 326] reports that the figure of 80% fruit may be an overestimate: The mean fruit-eating score for the pygmy chimpanzee (Pan paniscus) may be inflated by being presented as a proportion of fruiting trees, rather than relative to feeding time (which may be closer to 50% than to 80%; see Susman, 1984). Note that the primary scientific research regarding bonobos is centered on their unique social organization, rather than their diet. More importantly, however, while ape diets are of value as a reference point in attempting to determine the diets of early (pre-human) hominids, diet varies considerably enough among primates that attempting to make definitive extrapolations from one species to another is unreliable. This is particularly the case with humans who, as we will see later, have a digestive system that is nearly unique among primates.

Other Implications of Ape Diets
Given that Fit Food for Humanity argues for vegetarianism on the basis that humans are very similar to the great apes, the new information that the great ape diets include at least some animal foods presents the advocates of comparative "proofs" with a problem. If the comparative method really "proves" that our diet should be that of the great apes, then the information that ape diets include some animal foods also "proves" that the human diet should include some animal foods as well. More precisely, it "proves" that we are not natural vegetarians, at least in the usual human sense. This point will be discussed further in a later section, where we will have some fun with the logic.

To Summarize:


• The claim that "apes are vegetarians" is a misconception of the past. Virtually all of the great apes consume some animal food, even if only limited amounts of insects. Humans doing this would not be considered "vegetarian," yet attempts are often made to characterize apes as essentially vegetarian by falsely characterizing such insectivory as insignificant.
• The comparative analysis done in Fit Food for Humanity--humans versus the great apes--would actually indicate, if such comparisons are valid, that the natural human diet should include at least some animal foods (insects and/or vertebrate/mammalian flesh).
• The recent claims that humans are "closest" to allegedly "vegan" mountain gorillas are false.

We might also note here the level of extremism inherent in the idealistic rationalizations above regarding ape diets: Despite the fact that animal foods average anywhere from approximately 1% to perhaps 5% or 6% of ape diet over the course of a year (depending on the species), even this modest level of animal products is not acknowledged as a "legitimate" aspect of their diets that humans might want to emulate, supposing one were intent on modeling one's diet after comparative studies. What does this tell us about the mindset and open-mindedness of such advocates regarding the actual evidence?

PART 3: The Fossil-Record Evidence about Human Diet
Introduction
Meat a part of human diet for ~2.5 million years
The evidence of the fossil record is, by and large, clear: Since the inception of the earliest humans (i.e., the genus Homo, approximately 2.5 million years ago), the human diet has included meat. This is well-known in paleoanthropological circles, and is discussed in Setting the Scientific Record Straight on Humanity's Evolutionary Prehistoric Diet and Ape Diets. The current state of knowledge regarding the diet of our prehistoric ancestors is nicely summarized in Speth [1991, p. 265]:


[S]tone tools and fossil bones--the latter commonly displaying distinctive cut-marks produced when a carcass is dismembered and stripped of edible flesh with a sharp-edged stone flake--are found together on many Plio-Pleistocene archaeological sites, convincing proof that by at least 2.0 to 2.5 Ma [million years ago] before present (BP) these early hominids did in fact eat meat (Bunn 1986; Isaac and Crader 1981). In contrast, plant remains are absent or exceedingly rare on these ancient sites and their role in early hominid diet, therefore, can only be guessed on the basis of their known importance in contemporary forager diets, as well as their potential availability in Plio-Pleistocene environments (for example, see Peters et al. (1984); Sept (1984)). Thus few today doubt that early hominids ate meat, and most would agree that they probably consumed far more meat than did their primate forebears. Instead, most studies nowadays focus primarily on how that meat was procured; that is, whether early hominids actively hunted animals, particularly large-bodied prey, or scavenged carcasses... I fully concur with the view that meat was a regular and important component of early hominid diet. For this, the archaeological and taphonomic evidence is compelling.

Early hominid diet was mixed, not exclusive
The comments in Mann [1981, pp. 24-25] further illuminate the above: Nevertheless, given the available archaeological evidence and what is known of the dietary patterns of living gatherer/hunters and chimpanzees, it appears unlikely to me that all early hominids were almost exclusively carnivorous or herbivorous. It is more reasonable to suggest that the diet of most early hominids fell within the broad range of today's gatherer/hunter diets, but that within the wide spectrum of this adaptation, local environmental resources and seasonal scarcity may have forced some individual populations to become more dependent on vegetable or animal-tissue foods than others. The remarks by Mann remind us of the obvious: that early hominid diets, like hunter-gatherer diets, are a function of local flora and fauna; such diets are limited to the local food base (and to food acquired via trading).

"Natural" behavior a function of evolution The evidence that meat has been part of the human diet for ~2.5 million years, thus, directly implies that meat is a "natural" part of the human diet, where "natural" is defined as: those foods one is adapted to consume by evolution. (Side note to vegetarians: The fact that meat is a natural part of the evolutionary diet does not imply that one must, or even should, eat meat.) Some raw dietary advocates, in apparent denial of the evolutionary evidence, try to turn "opportunistic feeding" into a straw-man argument. The straw-man argument they construct is that the claim meat can be a natural part of the diet is based solely on the idea that humans can (and do) eat meat; they then claim it is circular logic, asserting that the "possibility" is not evidence it is "natural." However, this type of criticism or straw-man argument is based on a rather astonishing ignorance of--or at least certainly a denial of-evolutionary adaptation and how it occurs (discussed below). As such, the anti-"opportunistic feeding" straw-man argument is logically invalid.

Examining the rationalizations and denials of the fossil record evidence of human diet
As one might expect, the preceding information has been met with denial, crank science, and rationalizations from some in the raw vegan (and general veg*n) community. Some of these rationalizations and denials are as follows, with relevant comments.


CLAIM: What about the time before humans? What about Australopithecus? Aren't there tooth-wear studies that prove Australopithecus was a fruitarian? REPLY: If you go back far enough, you are dealing with a NON-HUMAN primate.
Australopithecus was a bipedal hominid in the line that led to humans, and was the last hominid prior to the evolution of Homo habilis, the first human in the genus Homo, but Australopithecus was itself still prehuman. If you go back somewhat further, you are dealing with apes that were primarily frugivorous, but as we will see below in discussing Australopithecus further, their diet would have included foods that were tougher than what we would generally call fruits today. Prior to these apes, if we go all the way back to the first primates 65 to 70 million years ago, from which all other primates evolved, then the diet was primarily insects. The important point regarding evolution is, as noted above, that the human diet has included meat since the inception of the human genus (Homo), and we have adapted to such a diet. If the diet of ancient frugivorous non-human primates is somehow supposed to be "very relevant" (more relevant, anyway, than diets of actual members of the genus Homo), then we are basically engaging in an absurd game of "moving the goalposts." If that's legitimate, then we might also say that the diet of the ancient insectivorous primates could be just as relevant.

Outdated tooth-wear studies often cited. As for Australopithecus, the highly publicized tooth-wear
studies cited in fruitarian/vegetarian lore are dated. (See related discussion in Part 1 of the Paleolithic Diet vs. Vegetarianism series.) Newer isotope studies of Australopithecus fossils indicate an omnivorous diet: Sillen [1992] analyzed the strontium/calcium (Sr/Ca) ratios in Australopithecus fossils, and concluded (p. 495): When specimens of the fossil Australopithecus robustus were examined, Sr/Ca values were inconsistent with that of a root, rhizome or seed-eating herbivore, suggesting that the diet of this species was more diverse than previously believed, and almost certainly included the consumption of animal foods. The results of Sillen [1992] were confirmed in a separate study, using stable carbon isotopic analysis; see Lee-Thorp et al. [1994] for details. Also see Sillen et al. [1995] for a follow-up to Sillen [1992].

Ancient fruits of different character than today's. A further note on Australopithecus: The fruits in
its diet were not like today's modern sweet hybrid fruits (such as one buys in a supermarket or produce stand). Instead, they included some very tough plant foods. Peters and Maguire [1981] analyzed the wild plant foods available in the Makapansgat (South Africa) area, and determined the structural strength of the foods available. They note (p. 565): Moreover, the most important potential plant food staples include very tough dry berries, beans and nuts, which require an average of 50-250 kg of compressive force to crack and crush them. Peters and Maguire concluded that even the strong (stronger than modern human) jaws of Australopithecus africanus were not strong enough to prepare all the tough plant foods, and simple stone tools would be necessary for survival in such an environment. (Needless to say, the fruits eaten by Australopithecus were markedly different from the diets advocated by modern promoters of fruitarianism--modern fruitarian diets emphasizing juicy, sweet fruits.)

CLAIM: Our prehistoric ancestors could not eat meat prior to the advent of tools. (Often stated very emotionally as: Well, what about the time BEFORE tools!?)

REPLY: Chimpanzees kill and eat small animals without tools. The method used is flailing; see
Butynski [1981, table 2, p. 425] for details; also van Lawick-Goodall [1971, p. 283] for illustration. If chimpanzees can kill small prey without tools via flailing, humans are certainly capable of the same.

Naked apes without tools? Some fruitarian advocates promote the crank science/science-fiction claim
that humans are merely "naked apes, without tools," and that the human diet should be limited to foods that


can be obtained when one is nude, without tools. This bizarre fantasy is interpreted to mean that humans should limit their diets to soft fruits, leaves, and occasional seeds or nuts. However, even as stated, the fantasy actually implies the inclusion of a wide range of potential animal foods in the "natural" human diet. See the article, Selected Myths of Raw Foods, for further discussion of this, which includes a list of animal foods that meet the stated collection requirements. Note: A recent research report that capuchin monkeys are capable of making and using stone tools (see Westergaard and Suomi [1994]) demonstrates how utterly ridiculous the "naked ape without tools" myth is. One might regard such a myth as bizarre crank science; unfortunately a few fruitarian extremists promote the myth--often in a hostile manner. However, the most relevant point here is that humans have used tools since the inception of the human genus, so the question, "What about the time before tools?" is actually the question, "What about the time before the human genus existed?" This indicates the question is of limited relevance.

CLAIM: Eating meat is a perversion and UNnatural, but it does not affect reproduction. Hence, eating meat does not affect evolutionary selection. Thus, we are NOT adapted to meat-eating despite the dreadful "cultural" habit of eating meat for literally millions of years! (This argument is often stated in a highly emotional manner.)

REPLY: The above attitude is common in the raw veg*n movement. Such an attitude also reflects
misunderstanding and ignorance of how evolution works. Even if eating meat is "cultural" rather than a required survival tactic (the evidence of hunter-gatherer tribes suggests it is necessary for survival when living in the wild), if meat-eating is universal, then meat-eating is, in effect (or by default), a part of the environment. Once meat-eating becomes a part of the long-term environment, then evolutionary selective pressure will favor genes that are best adapted to that environment. In the long term, genetic adaptation to such a diet, by evolution, is the inevitable result. To illustrate this point, let us consider yet another crank-science fantasy promoted by certain fruitarian advocates: that of a pure fruitarian human for whom protein foods are "toxic," meaning that protein, above quite low levels (far below the levels accepted as necessary even by more mainstream vegan advocates), is believed harmful because it causes the production of "toxic" metabolic by-products. (Those not familiar with fruitarian lore may find such claims to resemble science fiction; unfortunately, such nonsense is actively promoted in some quarters.)

Could a fruitarian human have survived and evolved in an ancient environment favoring meat-eating? Assume a genetic mutation occurs and a human is born into a meat-eating prehistoric
group, but this human is--by pure chance--adapted to be a pure frugivore (fruit only), and protein is "toxic" (as described above) to that person. Clearly, such a person will not survive, long-term, in the meat-eating environment (the individual would eventually die from the toxic effects of the by-products of protein digestion) or would live a short, unhealthy life and be less likely to reproduce (hence the genes fail to survive long-term given that they will be out-competed reproductively). In sharp contrast to the preceding, any genetic mutations that enhance survival in the environment of meat-eating will exist in an environment where they can and will thrive--and will, over evolutionary time, outcompete those adaptations that do not enhance survival. This illustrates how interaction between behavior (culture) and the environment creates selection pressure that favors the universally prevalent diet (meat-eating) and disfavors a rare, narrow diet (e.g., fruit only) that is outside the range of the environmental support system. (Note: the remarks of Mann [1981] above are relevant here--I am not suggesting that the evolutionary human diet was 100% meat. Instead, it was a mix of animal and plant foods, varying according to location and seasonal availability of foods. The point is that meat was a significant part of the diet, hence "natural" in the most rigorous sense.)


What does "survival of the fittest" actually mean? The claim that humans have not adapted to meat
in ~2.5 million years of consumption rests on a misunderstanding of the principle of survival of the fittest. The idea that eating meat does not (favorably) affect the reproduction of individuals comes from a misapplication and overly limited examination of the principle--that is, it considers the question on an individual, short-term basis only. However, survival of the fittest is really a broad-based, long-term (multi-generational) proposition. Those traits--mutations and adaptations--that ensure or promote survival over time and over multiple generations are the traits that will be propagated the most successfully and survive; this is the real meaning of survival of the fittest. Further, evolutionary selective pressure is driven by survival--an amoral principle--and not by such moral considerations as whether eating meat is moral or immoral, "good" or "bad," whether it violates so-called animal rights, and so on.

Diet, Evolution, and Culture
Feedback loop between evolution and culture
As discussed above, those aspects of culture that are universal, or nearly universal (tool use, meat-eating, and after the discovery and regular use of fire, eating cooked food), can and will impact evolutionary selection pressure. It does not matter whether such behavior is absolutely required for survival in any imaginable circumstances. What counts is that such behavior is universal (or nearly so) and therefore effectively required for survival under the actual conditions (which necessarily include behavioral conditions) in which a given individual or species finds itself. Thus there is an important feedback loop between evolution and behavior (culture). For another perspective on this, note that humans are unique because of their high intelligence (discussed in the next section). Given such intelligence, human survival becomes a function of behavior. And because of our high intelligence, human behavior (which constitutes part of the ongoing evolutionary environment) can be a malleable choice rather than a compelling "instinct." Further, behavior can be, and is, a part of culture. (Culture can be seen as a set of group-specific behaviors that are acquired via social means; see McGrew [1998] for discussion.) The following figure illustrates the nature of the behavior/culture evolutionary feedback loop. It is a modification of fig. 29.1, p. 439, from Wood [1995].


[Figure: the behavior/culture evolutionary feedback loop, adapted from Wood [1995], fig. 29.1, p. 439; reproduced courtesy of Yale University Press.]

Diet and early social organization
Spuhler [1959] comments on the likely interaction of diet and culture (pp. 6-7): The change to a partially carnivorous diet had extremely broad implications for the social organization of early hominoids. [Note: "hominoids" (includes great apes and humans as a class) is probably a typo in the original here; "hominids" (which means humans and their bipedal predecessors) is likely what is meant.] Carnivores get a large supply of calories at each kill. This concentrated food is more easily transported to a central, continually used shelter than is low-calorie plant food, especially before containers were available... Compact animal protein high in calories is a good basis for food sharing...It is unlikely that the long dependency of human children--so important in the acquisition of culture by individuals--could develop in a society without food sharing. And the amount of information which needs to be transduced in a communication system for plant eaters like the gibbons is small compared to that needed in group-hunting of large animals.

Did the human brain give rise to culture and tool use, or vice versa? Washburn [1959] considers
the relationship between tool use and biological evolution, and observes (pp. 21, 29): [M]uch of what we think of as human evolved long after the use of tools. It is probably more correct to think of much of our structure as the result of culture than it is to think of men anatomically like ourselves slowly discovering culture... The general pattern of the human brain is very similar to that of ape or monkey. Its uniqueness lies in its large size and in the particular areas which are enlarged. From the immediate point of view, this human brain makes culture possible. From the evolutionary point of view, it is culture which creates the human brain.


The remarks of Spuhler and Washburn illuminate the culture-evolution feedback loop. The argument that "meat-eating does not (directly) impact the reproduction of individuals, hence we are not adapted," is erroneous because it ignores this important feedback loop.

Evidence of dietary culture in non-human primates
Some fruitarian extremists are quick to condemn all (human) cultural dietary habits as being allegedly unnatural. However, non-human primates show evidence of rudimentary, dietary cultural habits. Nishida [1991, p. 196] notes: Humans display diversified food culture: E. Rozin (1973, cited by P. Rozin, 1977) called "cuisine" culturally transmitted food practices... Nonhuman primate species also display diversified food habits within their own geographic distribution... The most easily observable result of cultural transmission processes is the difference in modes of behavior between different local populations of a species, which is uncorrelated with gene or resource distribution (Galef, 1976). Nishida [1991] provides an impressive listing of the differences in feeding behavior of chimpanzees, baboons, and monkeys. Of possible interest to raw-fooders is his description of the macaques of Koshima Island, Japan dipping food into seawater for seasoning (some raw-food advocates stridently condemn the practice of seasoning foods). Of course, that non-human primates display cultural food preferences suggests human cultural food preferences are an extension of that feature. There is an extensive body of scientific research on the topic of culture in non-human primates. For two recent papers that provide an entry to the research, consult McGrew [1998], which discusses culture in several non-human primates, and Boesch and Tomasello [1998], which compares chimpanzee and human cultures.

Fruitarian Extremists Retreat into Creationism
As the general information that the evolutionary diet (actually, range of diets) of humans included animal foods has spread into the raw vegan community, certain fruitarian extremists who want to continue to propagate the myth that 100% raw vegan/75+% fruit diets are the "most natural" diets for humans have made a hasty retreat into what could be called secular creationism. (This is similar to the sudden appearance of claims that the mountain gorilla is "closest" to humans, and vegan.) Relevant observations here are:
• If you are a creationist because it is a part of your sincere spiritual or religious views, please understand that this writer has no argument with you. Sincere spiritual views can be respected, at least regarding their genuineness of belief.
• Those fruitarian extremists who retreat into creationism for the sole purpose that it can be molded to fit their dietary dogma, however, appear to be engaging in blatant intellectual dishonesty. One fruitarian group advocating creationism went so far as to plagiarize portions of the book Darwin on Trial, by lawyer and creationist Phillip Johnson, in order to attack the evolutionary evidence against their diet.

Adaptation in Action: Fruitarian Reaction to New Information
It is interesting to observe that certain fruitarian extremists--after their claims that "chimps are vegans" and "we evolved on a purely fruit/vegan diet" were discredited--adapted very quickly. In a matter of months (less than a year) new myths evolved and were promoted: "Humans are just like vegan mountain gorillas" plus the idea that species are created (nature is a "mystery") rather than evolved.


Thus we can wryly observe that certain fruitarian extremists "adapted," in less than a year, to the debunking of their old mythology with a brand-new mythology. And yet these same folks claim that humans cannot and did not adapt, via evolution, over ~2.5 million years, to a diet that includes some meat. Admittedly, these are different adaptations--one intellectual, the other physical. But we see that intellectual "adaptation" can be very rapid indeed, and the contrast between rapid intellectual adaptation versus claims that physical adaptation absolutely CANNOT have occurred in ~2.5 million years (despite fossil record evidence of physical change) presents us with an example of "raw irony"--a not-uncommon feature of the more extreme wing of the raw vegan movement.

Ironic, contradictory views of "nature's laws"
To add further irony to the situation, some fruitarian extremists simultaneously claim both that:
• We are created, not evolved; nature is a deep, profound mystery, and
• Nature is simplistic (on the other hand) and MUST follow the narrow, bizarre versions of nature's laws promoted by the (same) fruitarian extremists.

Of course, the ability of fruitarian extremists to contradict themselves so clearly, and get away with it, rests on the gullibility and idealism of their followers. This is yet another example of "raw irony," though in this case a much more unfortunate one (for the followers).

To Summarize:
• Since the inception of the human genus Homo, ~2.5 million years ago, the human diet has included meat.
• Early hominid diet was probably like recent hunter-gatherer diets in the sense that it would have been a mixture of plant and animal foods, depending on habitat and season. There are no strictly vegetarian hunter-gatherers, and purely "carnivorous" hunter-gatherers (or nearly so) are rare (the traditional Inuit's 90-95% flesh diet being the only example); mixed diets predominate.
• Because meat consumption has been a universal, long-term aspect of culture, biological adaptation to a diet that includes animal foods is an inevitable outcome of evolution. This occurs as a result of the feedback loop between evolution and long-term behavior patterns (including culture as an important element).
• As a significant part of the range of diets we are adapted to by evolution, meat--specifically the lean meat and organs of wild animals--can be considered natural (food). However, as humans are intelligent, we can use our intelligence to choose a different diet, if we wish.

PART 4: Intelligence, Evolution of the Human Brain, and Diet
Introduction: Claims of the Comparative "Proofs"
Human intelligence ignored or rationalized. One of the key systematic shortcomings of the alleged comparative anatomy/physiology "proofs" that promote particular diets is that such proofs generally do not consider the many important features that make humans unique in nature. In particular, human intelligence is usually ignored or dismissed (via rationalizations) in such "proofs." For example, Fit Food for Humanity asserts (p. 14): But merely having a superior brain does not alter our anatomy and physiology which, according to natural law, remain characteristic of a total-vegetarian mammal, meant to eat a variety of vegetables, nuts and seeds.


Brain size discounted. Le Gros Clark is sometimes quoted by those advocating comparative "proofs" of
vegetarian diets. He also appears to minimize the importance of large brains in humans (the term is "encephalization," discussed later herein). From Le Gros Clark [1964, pp. 4-5]: In Homo the large size of the brain relative to the body weight is certainly a feature which distinguishes this genus from all other Hominoidea, but it actually represents no more than an extension of the trend toward a progressive elaboration of the brain shown in the evolution of related primates. The attitudes stated in the quote from Fit Food for Humanity reflect an underlying denial of the importance of human intelligence, in particular its impact on behavior (and, ultimately on morphology via evolution). The attitude one finds in some raw/veg*n circles is that human intelligence is suspect because it allows us to "make errors," i.e., to eat foods different from those promoted by dietary advocates (who often behave as if they are absolutely 100% certain that they know better than you what foods you should eat).

Hidden, contradictory views on the value of intelligence. An irony here is that there is a
contradiction in the logic of the attitudes of certain dietary advocates regarding intelligence. Some fruitarian extremists promote the alleged naturalness of fruitarian diets via the "humans are naked apes, without tools" myth discussed in the last section. This falsehood is often presented as actual science (needless to say, it is crank science) by those who promote it. Inasmuch as the advanced use of tools is an evolutionary characteristic of human intelligence, we can observe that those promoting the myth are saying that you should reject tool use in seeking your "natural" diet (this nonsense may even be presented as being scientific or logical). However, the preceding is equivalent to telling you to reject your intelligence, and even reject your status as a human being, in order to select the (allegedly) optimal diet. The argument made by fruitarian extremists is thus contradictory; the argument can be stated as: Use your intelligence to agree with the extremist that humans are "naked apes, without tools," and thus reject, in the future, your use of intelligence in food choices. Another irony here is that some of the extremists promoting this false myth present themselves as "scientific." Crank science (or science fiction) is a more accurate description for such myths, however. The contradictory logic of the "naked ape" myth is a good example of the ambivalent, confused attitude toward intelligence displayed by some dietary advocates.

Recent evolutionary research now emphasizes the interaction of diet and brain development. Further, recent research has rendered the quotations above outdated. The remarks of Milton
[1993] on the interaction between brain evolution and diet provide a brief introduction to a more modern perspective (p. 92): Specialized carnivores and herbivores that abound in the African savannas were evolving at the same time as early humans, perhaps forcing them [humans] to become a new type of omnivore, one ultimately dependent on social and technological innovation, and thus, to a great extent, on brain power. This section will review some of the research on the human brain, specifically:
• Its evolution,
• How our brains compare to those of other animals (especially primates--using a comparative anatomy approach), and
• The dietary/metabolic factors required to support the large human brain.

We begin our review with the topic of encephalization, or brain size.

Encephalization
Introduction
The most significant features that make humans unique in all of nature are our high intelligence and "large" brains. Here "large" means the brain is large relative to body size. Encephalization, or the relative size of the brain, is analyzed using a measure known as the encephalization quotient.


"Expected" vs. actual brain size. In order to measure encephalization, statistical models have been
developed that compare body size with brain size across species, thereby enabling the estimation of the "expected" brain mass for a given species based on its body mass. The actual brain mass of a species compared to (divided by) its "expected" brain mass gives the encephalization quotient. Higher quotients indicate species with larger-than-expected brain sizes. Thus, a quotient greater than 1 indicates an actual brain mass greater than predicted, while quotients less than 1 indicate less-than-expected brain mass.
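As a rough illustration of the arithmetic involved (though not of the actual statistical models, which are regressions fitted across many species), the Python sketch below computes an EQ from an assumed allometric equation of the form expected brain mass = c * (body mass)^k. The constants c and k are illustrative placeholders only, not values endorsed by the sources cited in this paper; the 0.76 exponent anticipates the brain/body scaling factor cited later in this section.

    def expected_brain_mass_g(body_mass_g, c=0.0585, k=0.76):
        """Expected brain mass from an allometric model c * M^k.

        c and k are placeholder constants standing in for the kind of
        cross-species regression described in the text.
        """
        return c * body_mass_g ** k

    def encephalization_quotient(actual_brain_g, body_mass_g):
        """EQ = actual brain mass / expected brain mass for that body size."""
        return actual_brain_g / expected_brain_mass_g(body_mass_g)

    # A ~65 kg human with a ~1300 g brain scores well above 1:
    print(round(encephalization_quotient(1300, 65000), 1))  # -> about 4.9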

The encephalization quotient is important because it allows the quantitative study and comparison of
brain sizes between different species by automatically adjusting for body size. For example, elephants, which are folivores, and certain carnivorous marine mammals have larger brains (actual physical mass) than humans. However, after adjusting for body size, humans have much "larger" brains than elephants or marine mammals. Additionally, the complexity of the brain is significant as well (and, of course, encephalization does not directly measure complexity--it only measures size).

Kleiber's Law. Kleiber's Law expresses the relationship between body size--specifically body mass--and
body metabolic energy requirements, i.e., RMR (resting metabolic rate), also known as BMR (basal metabolic rate). The form of the equation is: RMR = 70 * W^0.75, where RMR is measured in kcal/day and W = body weight in kg. (The above is adapted from Leonard and Robertson [1994].) An understanding of Kleiber's Law is important to several of the discussions in this paper.
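A quick numeric check of the formula as stated (a minimal Python sketch; the example body weights are arbitrary):

    def kleiber_rmr(weight_kg):
        """RMR in kcal/day per Kleiber's Law as given in the text:
        RMR = 70 * W^0.75, with W in kg (after Leonard and Robertson [1994])."""
        return 70.0 * weight_kg ** 0.75

    # Arbitrary example weights: a small monkey, an adult human, an elephant.
    for w in (5, 65, 4000):
        print(f"{w} kg -> {kleiber_rmr(w):.0f} kcal/day")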

Brain and digestive system compete for limited share of metabolic energy budget. A key
observation to note about relative brain size when averaged across species is that the equation for how brain size varies in proportion to body size uses a scaling exponent almost identical to the one in the equation for how an organism's basal metabolic rate (BMR) varies with body size, i.e., Kleiber's Law. (The scaling exponent in the equation for how brain mass varies in relation to body mass is 0.76 [Foley and Lee 1991]; the analogous exponent for BMR is 0.75; Kleiber [1961] as cited in Foley and Lee [1991].) This is important because it directly implies that brain size is closely linked to the amount of metabolic energy available to sustain it [Milton 1988, Parker 1990]. This point will become central as we proceed. For now it is enough to observe that the amount of energy available to the brain depends on how the body's total energy budget is allocated between the brain and other energy-intensive organs and systems, particularly the digestive system. Further, how much energy the digestive system requires (and thus how much is left over for the brain and other "expensive" organs) is a function of the kind of diet that a species has developed to handle during its evolution. As we proceed, we will return to the ramifications of this for human diet as it relates to the evolution of the large human brain.
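The force of the "nearly identical exponents" point can be seen numerically: the ratio of expected brain mass (proportional to W^0.76) to BMR (70 * W^0.75) works out to W^0.01 / 70, which barely changes even as body mass spans three orders of magnitude. A minimal Python sketch (brain mass in arbitrary units, since no coefficient is needed to show the near-constancy of the ratio):

    # Brain mass scales ~ W^0.76 (Foley and Lee 1991); BMR ~ 70 * W^0.75
    # (Kleiber). Their ratio, W^0.01 / 70, drifts only ~7% from 1 kg to 1000 kg.
    for w_kg in (1, 10, 100, 1000):
        bmr = 70.0 * w_kg ** 0.75        # kcal/day
        brain = w_kg ** 0.76             # arbitrary units
        print(f"W = {w_kg:>4} kg: brain/BMR ratio = {brain / bmr:.5f}")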

A comparative anatomy analysis of primate brains
Stephan [1972] provides a comparative anatomy analysis of primate brains, including modern humans, non-human primates, and our prehistoric ancestors. Below is a summary of the important points made in Stephan [1972]. Note here that the Stephan paper was done before the Martin research cited above; thus Stephan uses a slightly different measure of encephalization.

• Humans at top of primate scale. Using measures of encephalization based on the encephalization of insectivorous primates, Stephan [1972] reports that humans are at the very top of the index (with an encephalization quotient, or EQ, of 28.8), while Lepilemur is at the bottom (EQ = 2.4).

• Large gap between humans and great apes. There is a large gap between the encephalization of modern humans and all extant (present-day) non-human primates, including our closest relatives, the great apes. This large gap is filled, however, by analysis of the encephalization of our prehistoric hominid ancestors.

• Brain enlargement disproportional. The enlargement of the human brain vs. non-human primates is not proportional (thereby possibly contradicting the earlier quote from Le Gros Clark). Stephan [1972, p. 174] notes: The enlargement of the brain is not proportional; that is, all parts do not develop at the same rate. The neocortex is by far the most progressive structure and therefore is used to evaluate evolutionary progress (= Ascending Primate Scale).

Also of interest here is the additional remark in Stephan [1972], regarding comparative anatomy in this context (p. 174): "It must be stressed, however, that because our scientific approach is indirect, it can provide only inferences, not proofs."

Factors in Encephalization: Energy (Metabolism) and Diet
The reality of encephalization--the relatively large human brain--with its correspondingly high intelligence, is readily apparent. The object of current research and debate, however, is the examination of what evolutionary factors have driven the development of increased human encephalization. Such research provides insight into our evolutionary diet, and also reveals why any comparative "proof" that ignores intelligence and the significant impact of brain size on metabolic requirements is logically dubious.

Life cycle and energy requirements
Parker [1990] analyzes intelligence and encephalization from the perspective of life history strategy (LHS) theory, a branch of behavioral ecology. LHS is based on the premise that evolutionary selection determines the timing of major life-cycle events--especially those related to reproduction--as the solution to energy optimization problems.

Extensive energy required for brain growth. Parker discusses the life history variables in nonhuman primates, and then examines how life history events relate to large brain size, gestation period, maturity at birth, growth rates and milk consumption, weaning and birth intervals, age of puberty, and other events. The motivation for studying such events is that the brain is the "pacemaker of the human life cycle" [Parker 1990, p. 144], and the slow pace of most human life history events reflects the extensive energy required for brain growth and maintenance.

Foley and Lee [1991] analyze the evolutionary pattern of encephalization with respect to foraging and dietary strategies. They clearly state the difficulty of separating cause and effect in this regard; from Foley and Lee [1991, p. 223]: In considering, for example, the development of human foraging strategies, increased returns for foraging effort and food processing may be an important prerequisite for encephalization, and in turn a large brain is necessary to organize human foraging behaviour.

Dietary quality is correlated with brain size. Foley and Lee first consider brain size vs. primate
feeding strategies, and note that folivorous diets (leaves) are correlated with smaller brains, while fruit and animal foods (insects, meat) are correlated with larger brains. The energetic costs, both daily and cumulative, of brains in humans and chimps, over the first 1-5 years of life are then compared. They note [Foley and Lee 1991, p. 226]: Overall the energetic costs of brain maintenance for modern humans are about three times those of a chimpanzee. Growth costs will also be commensurately larger.


Then they consider encephalization and delayed maturation in humans (compared to apes), and conclude, based on an analysis of brain growth, that the high energy costs of brain development are responsible for the delay in maturation.

Dietary shift beginning with Homo. Finally, they consider the dietary shifts that are found in the fossil
record with the advent of humans (genus Homo), remarking that [Foley and Lee 1991, p. 229]: The recent debate over the importance of meat-eating in human evolution has focused closely on the means of acquirement... but rather less on the quantities involved... In considering the evolution of human carnivory it may be that a level of 10-20% of nutritional intake may be sufficient to have major evolutionary consequences... Meat-eating, it may be argued, represents an expansion of resource breadth beyond that found in nonhuman primates... Homo, with its associated encephalization, may have been the product of the selection for individuals capable of exploiting these energy- and protein-rich resources as the habitats expanded (Foley 1987a). The last sentence in the preceding quote is provocative indeed--it suggests that we, and our large brains, may be the evolutionary result of selection that specifically favored meat-eating and a high-protein diet, i.e., a faunivorous diet.

How dietary quality relates to the brain's share of total metabolic budget
The research of Leonard and Robertson [1992, 1994] provides an in-depth analysis of brain and body metabolic energy requirements. Relevant points from their research:

• Dramatic changes in last 4 million years. Leonard and Robertson [1992, p. 180] note:
Evidence from the prehistoric record indicates that dramatic changes have occurred in (1) brain and body size, (2) rates of maturation, and (3) foraging behavior during the course of hominid evolution, between four million years ago and the present. Consequently, it is reasonable to assume that significant changes in metabolic requirements and dietary changes have also occurred during this period.

• Human brain's metabolic budget significantly different from apes. They point out that anthropoid primates use ~8% of resting metabolism for the brain, and other mammals (excluding humans) use 3-4%, but humans use an impressive 25% of resting metabolism for the brain. This indicates that the human "energy budget" is substantially different from that of all other animals, even our closest primate relatives--the anthropoid apes.

• In contrast, total human resting metabolism not significantly different. In order to understand the relationship between metabolism (energy budget) and body size, Leonard and Robertson collected relevant data on metabolism and body size in primates, humans, and other mammals. The data collected included body size, brain size, resting metabolic rate (RMR), brain metabolic rate (brain MR), total energy expenditure (TEE), etc. They also collected activity and energy expenditure data on a few hunter-gatherer societies and select non-human primates. Statistical analysis of the data showed that:


[L]arge-brained anthropoids [great apes], as a group, do not depart from the general mammalian metabolism/body size relationship. Several individual species, however, do deviate markedly from the RMRs predicted by the Kleiber relationship.

To summarize, they found that the human RMR (resting metabolic rate)--the overall total energy budget--did not deviate significantly from that predicted by the Kleiber relationship based on body size; that is, resting metabolic rate for humans is comparable to that of other animals of similar body mass.

• Human brain MR 3.5 times higher than apes. However, in marked contrast, they found
that humans have a radically different brain energy metabolism than the other animals. From Leonard and Robertson [1992, p. 186]: The human brain represents about 2.5% of body weight and accounts for about 22% of resting metabolic needs... At 315 kcal (1318 kJ), humans use over 3.5 times more of RMR to maintain their brains than other anthropoids (i.e., a positive deviation of 255%). Clearly, even relative to other primate species, humans are distinct in the proportion of metabolic needs for the brain. (A rough numeric check of these figures appears just after this list.)

• Important changes in diet of Homo erectus. Leonard and Robertson [1992] also applied
their statistical models to our prehistoric ancestors. Their analysis points to important changes in diet. From Leonard and Robertson [1992, p. 191]: The clear implication is that the diet of Homo erectus [note: Homo erectus evolved roughly 1.7 Mya] was not simply an australopithecine diet with more meat; rather there were important changes in both animal and vegetable components of the diet... What made meat an important resource to exploit was not its high protein content, rather, its high caloric return... In short, the early hunting-gathering life-way associated with H. erectus was a more efficient way of getting food which supported a 35-55% increase in caloric needs (relative to australopithecines)...
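As flagged above, here is a rough numeric check of the brain-cost figures (a back-of-the-envelope sketch only: the 315-kcal brain cost is taken from the quote above, while the RMR used here is the Kleiber prediction rather than the measured human value Leonard and Robertson work from):

    # What fraction of a 65-kg human's Kleiber-predicted RMR does a
    # 315-kcal/day brain consume? (Both figures from the text above.)
    rmr_kcal = 70 * 65 ** 0.75       # ~1602 kcal/day (Kleiber prediction)
    brain_kcal = 315                 # quoted human brain metabolic cost
    print(round(100 * brain_kcal / rmr_kcal))  # ~20%, near the ~22% quoted

The small gap between ~20% here and the quoted ~22% presumably reflects the difference between predicted and measured RMR.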

In a follow-up paper, Leonard and Robertson [1994] expanded the analysis of their 1992 paper by looking at the relationship of dietary quality to body size and metabolic rates. Important points from their 1994 paper:

• Body size and dietary quality (DQ). When the relationship between body size and dietary
quality (i.e., the energy and nutrient content of the diet) was analyzed, the general relationship found was that larger primates, e.g., gorillas, have low-quality diets (gorillas are folivores), while the smaller primates have higher-quality (insectivorous) diets. Leonard and Robertson [1994, p. 78] note: In general, there is a negative relationship between diet quality (the energy and nutrient density of food items) and body size (Clutton-Brock and Harvey, 1977; Sailer et al., 1985).

• Humans depart from normal DQ/body-weight relationship. Next, they calculated dietary quality (DQ) indices for 5 hunter-gatherer groups and 72 non-human primate species, relative to body weight. (A mechanical sketch of how such an index can be computed appears just after this list.) The observed DQ values for the hunter-gatherer groups were higher than predicted for a primate that weighs as much as a human. Leonard and Robertson [1994, p. 79] comment that:


Humans, however, appear to depart substantially from the primate DQ-body weight relationship (Fig. 1)... [T]he diets of the five hunter-gatherer groups are of much higher quality than expected for primates of their size. This generalization holds for most all subsistence-level human populations, as even agricultural groups consuming cereal-based diets have estimated DQs higher than other primates of comparable size... [E]ven in human populations where meat consumption is low, DQ is still much higher than in other large-bodied primates because grains are much more calorically dense than foliage.

Availability of grains vs. animal food. A note here regarding cereal (i.e., grain) consumption:
Prior to the development of agriculture, grains would have been only a minuscule fraction of the human diet, due to the lack of technology to farm, harvest, and store them. In some locations they may have been available seasonally, but the low level of harvesting, storing, and processing technology would necessarily have sharply limited their consumption. Hence prior to the development of agriculture 10,000 years ago (a tiny fraction of the 2.5-million-year existence of the Homo genus), grains were not a feasible option to increase DQ.

• DQ and RMR. Another statistical comparison was run to analyze the relationship between
dietary quality and RMR, or resting metabolic rate. (The previous comparisons just discussed were based on body weight.) While the model fit was not very good, the data plot suggests that human DQ may also be higher than expected when using RMR as the yardstick for comparison in lieu of body weight. (As the model fit is not good, the preceding is a hypothesis only.)
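For readers who want to see how such a diet-quality index works mechanically, here is a small sketch (as promised above). The weighting scheme shown--DQ = s + 2r + 3.5a, where s, r, and a are the dietary percentages of structural plant parts, reproductive plant parts, and animal foods--is the form of the Leonard and Robertson index commonly quoted in secondary sources; verify the exact coefficients against the original paper before relying on them:

    # Sketch of a diet-quality (DQ) index in the style of Leonard and
    # Robertson [1994]. The coefficients are as commonly quoted in
    # secondary sources (an assumption here). On this scale DQ runs
    # from 100 (pure foliage) up to 350 (pure animal food).
    def dq_index(pct_structural, pct_reproductive, pct_animal):
        assert abs(pct_structural + pct_reproductive + pct_animal - 100) < 1e-6
        return pct_structural + 2 * pct_reproductive + 3.5 * pct_animal

    print(dq_index(100, 0, 0))    # 100.0: pure folivore (lowest DQ)
    print(dq_index(0, 0, 100))    # 350.0: pure faunivore (highest DQ)
    print(dq_index(10, 40, 50))   # 265.0: a meat-heavy hunter-gatherer mix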

The paradox: Where does the energy for the large human brain come from?
In any event, as we have seen, what begs explanation is that humans "spend" far more energy on the brain than other primates: 20-25% of RMR vs. roughly 8% in the great apes. Yet the total human RMR remains in line with predictions based purely on body size. This presents a paradox: where do humans get the extra energy to "spend" on our large brains? As we will see later in the research of Aiello and Wheeler, the most feasible hypothesis is that the answer lies in considerations of dietary efficiency and quality. Leonard and Robertson [1994, p. 83] conclude: These results imply that changes in diet quality during hominid evolution were linked with the evolution of brain size. The shift to a more calorically dense diet was probably needed in order to substantially increase the amount of metabolic energy being used by the hominid brain. Thus, while nutritional factors alone are not sufficient to explain the evolution of our large brains, it seems clear that certain dietary changes were necessary for substantial brain evolution to take place. In other words, while the evolutionary causes of the enlarging human brain themselves are thought to have been due to factors that go beyond diet alone (increasing social organization being prime among the proposed factors usually cited), a diet of sufficient quality would nevertheless have been an important prerequisite. That is, diet would have been an important hurdle--or limiting factor--to surmount in providing the necessary physiological basis for brain enlargement to occur within the context of whatever those other primary selective pressures might have been.

To summarize: The significance of Leonard and Robertson's research [1992, 1994] lies in their analysis of energy metabolism, which reveals the paradox: How do humans meet the dramatically higher energy needs of our brains, without a corresponding increase in RMR (which is related to our body size)? They argue that the factor that allows us to overcome the paradox is our higher-quality diet compared to other primates. Of course, prior to the advent of agriculture and the availability of grains, the primary source of such increased dietary quality was the consumption of fauna--animal foods, including insects.


The Relationship of Dietary Quality and Gut Efficiency to Brain Size
The Expensive Tissue Hypothesis of Aiello and Wheeler
Aiello and Wheeler [1995] is an excellent research paper that neatly ties together many of the threads addressed in the research papers discussed up to this point in this section. The insights available from this paper provide yet another view into the paradox of how humans can afford, in metabolic energy terms, the high "cost" of our large brains without a corresponding increase in BMR/RMR (basal metabolic rate, also known as resting metabolic rate).

Aiello and Wheeler begin by comparing the actual size of the "average" human brain (for a body weight of 65 kg) with its predicted size: the actual weight is ~1.3 kg, while the predicted weight is 268 g, based on the scaling model of Martin [1983, 1990] (as cited in Aiello and Wheeler [1995]; the Martin model relates brain mass to body mass). They then note that despite the large brain size with its consequent disproportionate demands on metabolism, the total BMR for humans is nevertheless well within the range expected for primates and other mammals of comparable body size. This of course brings us back to the paradox described above: Where does the "extra" metabolic energy come from to power the enlarged human brain, if the human body's total BMR is no higher than that of other comparably sized mammals?
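As a quick numeric check of the two figures just quoted: a power law of roughly the form below reproduces the 268-g prediction (the coefficient and exponent here are assumptions chosen to be consistent with that quoted figure, not values verified against Martin's papers):

    # Checking the quoted brain-mass figures for a 65-kg human. The
    # power-law constants are assumptions consistent with the quoted
    # 268-g prediction (brain mass in grams, body mass in kg).
    predicted_g = 11.2 * 65 ** 0.76          # ~267 g, near the quoted 268 g
    actual_g = 1300.0                        # ~1.3 kg actual human brain
    print(round(actual_g / predicted_g, 1))  # actual is ~4.9x the prediction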

Large brain compensated for by decreased gut size. Aiello and Wheeler then analyze the other
"expensive" organs in the body (expensive in terms of metabolic energy): heart, kidneys, liver, and gastrointestinal tract, noting that together with the brain, these organs account for the major share of total body BMR. Next they analyze the "expected" sizes of these major organs for a 65-kg non-human primate, and compare these with the actual organ sizes for an average 65-kg human. Their figure 3 (below) illustrates the dramatic differences between the expected and actual sizes of the human brain and gut: The larger-than-expected size of the human brain is compensated for by a smaller-than-expected gut size.


Aiello and Wheeler [1995, pp. 203-205] note: Although the human heart and kidneys are both close to the size expected for a 65-kg primate, the mass of the splanchnic [abdominal/gut] organs is approximately 900 g less than expected. Almost all of this shortfall is due to a reduction in the gastrointestinal tract, the total mass of which is only about 60% of that expected for a similar-sized primate. Therefore, the increase in mass of the human brain appears to be balanced by an almost identical reduction in the size of the gastrointestinal tract.... Consequently, the energetic saving attributable to the reduction of the gastrointestinal tract is approximately the same as the additional cost of the larger brain (table 4).
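The arithmetic behind "balanced" can be sketched quickly. The mass-specific organ metabolic rates used below (~11.2 W/kg for brain tissue, ~12.2 W/kg for splanchnic tissue) are assumptions of roughly the order reported by Aiello and Wheeler; consult their table 4 for the exact values:

    # Back-of-the-envelope check of the expensive-tissue trade-off.
    # The W/kg figures are assumptions (roughly the order reported by
    # Aiello and Wheeler [1995]); see their table 4 for exact values.
    extra_brain_kg = 1.3 - 0.268    # observed minus expected brain mass
    gut_shortfall_kg = 0.9          # quoted splanchnic-organ shortfall
    print(round(extra_brain_kg * 11.2, 1))    # ~11.6 W extra brain cost
    print(round(gut_shortfall_kg * 12.2, 1))  # ~11.0 W gut energy saving
    # Approximately offsetting, as the authors conclude.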

Less energy-intensive gut associated with higher dietary quality. The authors further point out
that the reduction in gut size is necessary to keep the BMR at the expected level. Additionally, they argue that the liver, heart, and kidneys cannot be significantly reduced in size to offset the energy costs of encephalization because of their highly critical functions; only the gut size can be reduced. Since gut size is associated with dietary quality (DQ), and the gut must shrink to support encephalization, this suggests that a high-quality diet is required for encephalization. That is, a higher-quality diet (more easily digested, and liberating more energy/nutrients per unit of digestive energy expended) allows a smaller gut, which frees energy for encephalization. Figure 5 from Aiello and Wheeler [1995], below, illustrates the diet/gut size/encephalization linkages.


Aiello and Wheeler deduce that the obvious way to increase DQ among early hominids is to increase the amount of animal foods (meat, organs, insects) in the diet. They note [Aiello and Wheeler 1995, p. 208]: A considerable problem for the early hominids would have been to provide themselves, as a large-bodied species, with sufficient quantities of high-quality food to permit the necessary reduction of the gut. The obvious solution would have been to include increasingly large amounts of animal-derived food in the diet (Speth 1989; Milton 1987, 1988).

Advent of cooking may have promoted further encephalization by reducing digestive energy required. Finally, in what will be controversial to raw-fooders, Aiello and Wheeler, after arguing that the
first major increase in encephalization was due to increased consumption of animal foods, next propose that the second major increase in brain size (with the appearance of archaic Homo sapiens) was due to the appearance of cooking practices. (Archaic Homo sapiens coincides with a timeframe of roughly 100,000 to 400,000 years ago.) Cooking neutralizes toxins and increases digestibility (of starch, protein, beta-carotene), and might make the digestion of cooked food (vs. raw) less "expensive" in metabolic energy terms--thereby freeing up energy for increased encephalization. It may be the case, therefore, that in evolutionary terms cooked food could have been responsible for some of the later increases in brain size and--since increased brain size is associated with increased technology--intelligence as well. The authors note [Aiello and Wheeler 1995, p. 210]: Cooking is a technological way of externalizing part of the digestive process. It not only reduces toxins in food but also increases its digestibility (Stahl 1984, Sussman 1987). This would be expected to make digestion a metabolically less expensive activity for modern humans than for non-human primates or earlier hominids. Cooking could also explain why modern humans are a bit more encephalized for their relative gut sizes than the non-human primates (see fig. 4). Note that while this particular aspect of Aiello and Wheeler's research is speculative, it is at least based on the existence of supportive evidence. Fruitarian readers might wish to compare the above analysis with the pejorative slogans about cooked food popular in the raw/fruitarian community, which are often based on little more than emotional vehemence.

Potential subterfuges by extremists. Note to readers: The 1995 paper by Aiello and Wheeler is
excellent and, if you have an interest in this topic, makes very worthwhile reading; it is available in many university libraries. At the end of the paper there is a series of comments by other scholars, followed by a rejoinder from Aiello and Wheeler, which gives good insight into the process of scientific inquiry and


debate. If you take the time to locate this paper, I would encourage reading not only the primary report, but also the comments and rejoinder that follow. Reason: In my experience and opinion, raw/veg*n extremists show enough of a tendency to avoid the primary evidence in cases like this--while nitpicking minor points--that it would not be surprising if certain (fruitarian) extremists tried to use the commenting scholars' criticisms in an illicit manner: e.g., using the material without proper attribution, ignoring the rejoinder remarks by Aiello and Wheeler, and possibly (in the worst cases) plagiarizing the material or using it in violation of copyright. This may sound inflammatory to some readers--please be assured, however, that this comment is based on hard experience with extremists. (See Previous Chapter)

Recent brain-size decreases in humans as further evidence of the brain/diet connection
Why has brain size decreased 11% in the last 35,000 years and 8% in the last 10,000?
Another interesting observation about the brain/diet connection comes from a recently updated and more rigorous analysis of changes in brain size in humans over the last 1.8 million years. Ruff, Trinkaus, and Holliday [1997] found that the encephalization quotient (EQ) reached its peak with the first anatomically modern humans of approximately 90,000 years ago and has since remained fairly constant [see p. 174, Table 1]. Most surprisingly, however, absolute brain size has decreased by 11% since 35,000 years ago, with most of this decrease (8%) coming in just the last 10,000 years. (The decrease in absolute brain size has been paralleled by roughly similar decreases in body size during the same period, resulting in EQ values that have remained roughly the same as before.)

The significance of constant EQ vs. shrinking brain size in context. This data suggests two
points. The first point--relating to EQ--is subject to two possible interpretations, at least on the face of it. One interpretation (characterized by somewhat wishful thinking) might be that, if we disregard the absolute decrease in brain and body size and focus only on EQ, we can observe that EQ has remained constant over the last 10,000-35,000 years. One could then further conjecture that this implies humans have in some sense been successful in maintaining dietary quality during this time period, even considering the significant dietary changes that came with the advent of the agricultural revolution (roughly the last 10,000 years). However, the problem with such an interpretation is exactly that it depends on disregarding the information that overall body size diminished along with brain size--a most important point which needs to be taken into account.

The alternate--and more plausible and genetically consistent--interpretation begins by noting that EQ represents a genetically governed trait determined by our evolutionary heritage. Hence one would not expect EQ itself to have changed materially in just 10,000 years, as it is unlikely that such a brief period of evolutionary time could have been long enough for the actual genetics governing EQ (that is, relative brain size compared to body size) to have changed significantly, regardless of dietary or other conditions.

Dietary/physiological mechanism may be responsible. This brings up the second point, which is
that the specific question here concerns a slightly different issue: the absolute decrease in brain size rather than the issue of EQ. Since the great majority of this decrease took place in just the last 10,000 years, a genetic mutation is no more likely as an explanation for the decrease in absolute brain size than it is for relative brain size, or EQ. This leaves us once again with a physiological/biochemical mechanism as the responsible factor, which of course puts diet squarely into the picture. (Not to mention that it is difficult to imagine plausible evolutionary selective pressures on brain size--primarily cultural/social/behavioral--that could conceivably be responsible for the reversal in brain size, since human cultural evolution has accelerated considerably during this period.)

Far-reaching dietary changes over the last 10,000 years. This leaves us with the indication that
there has likely been some kind of recent historical shortfall in some aspect of overall human nutrition--one that presents a limiting factor preventing the body/brain from reaching their complete genetic potential in terms of absolute physical development. The most obvious and far-reaching dietary change during the last 10,000 years has, of course, been the precipitous drop in animal food consumption (from perhaps 50% of


diet to 10% in some cases) with the advent of agriculture, accompanied by a large rise in grain consumption--a pattern that persists today. This provides suggestive evidence that the considerable changes in human diet from the previous hunter-gatherer way of life have likely had--and continue to have--substantial consequences.

Brain growth dependent on preformed long-chain fatty acids such as DHA. The most plausible
current hypothesis for the biological mechanism(s) responsible for the absolute decrease in brain size is that the shortfall in consumption of animal foods since the late Paleolithic has brought with it a consequent shortfall in consumption of preformed long-chain fatty acids [Eaton and Eaton 1998]. Specifically, the brain is dependent on the fatty acids DHA (docosahexaenoic acid), DTA (docosatetraenoic acid), and AA (arachidonic acid) to support its optimal growth during the formative years, particularly infancy. These fatty acids are far more plentiful in animal foods than in plant foods. Eaton et al. [1998] analyze the likely levels of intake of EFAs involved in brain metabolism (DHA, DTA, AA) in prehistoric times, under a wide range of assumptions regarding possible diets and EFA contents. Their model suggests that the levels of EFAs provided by prehistoric diets were sufficient to support the brain expansion and evolution from prehistoric times to the present, and their analysis also suggests that the current low levels of EFA intake (provided by agricultural diets) may explain the recent smaller human brain size.

Rate of synthesis of DHA from plant-food precursors does not equal amounts available in animal foods. Although the human body will synthesize long-chain fatty acids from precursors in the diet
when not directly available, the rates of synthesis generally do not match the levels obtained when these fatty acids are consumed directly in the diet. This is particularly critical in infancy, as human milk contains preformed DHA and other long-chain essential fatty acids, while plant-food-based formulas do not (unless they have been supplemented). Animal studies indicate that synthesis of DHA from plant-source precursor fatty acids does not equal the levels of DHA observed when preformed DHA is included in the diet: Anderson et al. [1990] as cited in Farquharson et al. [1992], Anderson and Connor [1994], Woods et al. [1996]. Similar results are reported from studies using human infants as subjects: Carlson et al. [1986], Farquharson et al. [1992], Salem et al. [1996]. For a discussion of the above studies, plus additional studies showing low levels of EFAs in body tissues of vegans, see Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism: Essential Fatty Acids.

To summarize: The data showing that human brain size has fallen 11% in the last 35,000 years--with the bulk of that decrease (8%) coming in the last 10,000 years--furnishes suggestive corroborative support for the hypotheses explored earlier in this section, i.e., that increasing brain development earlier in human evolution is positively correlated with the level of animal food in the diet. It also indicates that animal food may be a key component of dietary quality (DQ) that cannot be fully substituted for, in its absence, by increasing other components of the diet (such as grains). This indication is important to consider, because the evidence available on the changes in food practices of more recent prehistoric humans (and, of course, humans today) can be assessed in more depth and with a higher degree of resolution than dietary inferences about earlier humans. In conjunction with the data on DHA synthesis in the body vs. obtaining it directly from the diet, this provides a potentially important point of comparison for assessing hypotheses about the brain/diet connection.

Fruitarian Evolution: Science Fact or Science Fiction?
Despite the material of this (and preceding) sections, some fruitarian extremists are likely to continue to promote the tired, discredited "party line," i.e., that humans evolved under a fruitarian diet, and that the natural diet of humans is exclusively fruit or a very-high-percentage fruit diet. (Such claims were assessed and found unsupportable earlier in this paper.) However, there are a few aspects of these claims that merit additional discussion.

Vague claims about an ancient frugivorous primate ancestor
First, those who make such claims may refer to some vague, ancient, frugivorous primate ancestor, implying that such an ancestor somehow proves humans are natural fruitarians. There are two major problems with this:

• The reference to an ancient frugivorous ancestor is so vague that it is meaningless. True, there were ancient frugivorous primates. However, the reference mixes up the diverse categories of primates, hominoids, and humans such that no meaningful statements can be made. Another problem here is that--as has been discussed earlier--the type of fruits eaten by earlier frugivorous apes included tougher and more fibrous fruits considerably different in character from the highly bred and far sweeter varieties developed for commercial production in modern times.

• Humans have been eating meat since the dawn of the Homo genus. Humans appeared with the advent of a brand-new genus (Homo) ~2.5 million years ago. Humans evolved on the savanna--a very different environment from the forest home of the great apes. From the very inception of our genus, humans have been eating animal foods. There is overwhelming scientific evidence to support this point. (Some of the evidence is discussed in this and the preceding section; also see Part 1 of the Paleolithic Diet vs. Vegetarianism interview series, available on this site, for additional information and citations.) The diet of some vague prehistoric frugivore that may or may not be an ancestor is irrelevant in light of the status of humans as a new genus with a different diet (i.e., eating more animal foods) and evolving in a different environmental niche. In contrast to the extensive fossil record evidence of meat in the evolutionary diet, there is virtually no credible scientific evidence of a strict fruitarian or veg*n diet by our prehistoric human (and australopithecine) ancestors.

No fruitarian, or even vegan, hunter-gatherer societies have ever been found. Further, there is
no evidence to indicate there ever existed, in the past, a fruitarian (or veg*n) hunter-gatherer society. Even in the tropical rainforest, hunter-gatherers eat meat. (The Ache of the Paraguayan rainforest, one of the best-studied of all hunter-gatherer tribes, are a prime example, with an average of over 50% meat consumption throughout the year, ranging from 47-77% depending on the season [Hill, Hawkes, Hurtado, and Kaplan 1984].) There is no evidence of any fruitarian societies, and--more to the point--the extensive anecdotal evidence (virtually the only evidence available) on modern attempts at (strict) fruitarianism indicates that it may work for a short while but almost always fails in the long run. (Even the fruitarian extremist "experts" often fail to follow the diet strictly, in the long term.)

Crank science and logical fallacies used in support of claims of fruitarian evolution
Returning to the false claims that humans evolved as fruitarians, the primary additional "evidence" presented in support of such claims consists of additional claims such as:

• Egocentric claims. Unscientific, blatantly egocentric claims that eating foods other than fruit is "maladaptive" or "in error." (These claims were discussed earlier in this paper, and are also reviewed in the article Selected Myths of Raw Foods.)

• Denial of evolutionary adaptation. Fallacious claims that humans did not or could not adapt, via evolution, to a diet that included animal foods (fauna), despite ~2.5 million years of evolution on such diets. (The topic of physiological adaptation is discussed later herein.)

• Straw-man arguments based on SAD/SWD diets. Citation of clinical research that shows conventional vegan diets (not fruitarian diets) are better than the standard Western diet. The logical fallacies of this approach (which uses the evidence against the typical Western diet to create a "straw man" to stand in for all omnivorous diets) are discussed in a later section.

• Uninformed comparative anatomy. Claiming that comparative anatomy indicates we evolved to become fruitarians. This claim is addressed in the sections that follow.

• "Protein is toxic" claims. The presentation of ancillary crank science claims, i.e., that "protein is toxic," in the sense that any excess protein--any above a very low standard, much lower than cited even in conventional vegan research--produces metabolic by-products that are poisonous and will harm you. This crank science theory is then used to "prove" that animal foods are unnatural because they allegedly contain excess protein. Such theories are based on amazing ignorance of the reality that the human body is very well equipped to dispose of the metabolic by-products of protein digestion. (Note: See the article "Is Protein Toxic".)

• Paranoia-based theories. Another factor in such crank theories is that they appear to be based on an intensely paranoid, pathological fear of any and all possible "toxins" from food; hence such theories may actually (indirectly) promote obsession with unachievable ideals of total dietary purity, sometimes leading to eating disorders.

• "Fruit is like mother's milk" analogies. Claiming that human milk (a food whose calories are predominantly from fat) is "just like" sweet fruit--a food whose calories are predominantly from sugar. (Most people seem easily enough able to tell the difference between fat and sugar, though apparently certain fruitarian extremists have major, inexplicable difficulties in perceiving this distinction.) This claim has been blown up into a major bulwark used by some to promote fruitarianism, and is another prime example of bogus crank science. (See the article Fruit is Not Like Mother's Milk for details.)

Fruitarian denial of physiological evolution/adaptation
What are the claims? Given the extensive fossil record evidence of meat consumption and physical evolution since the first appearance of the human genus ~2.5 million years ago, those who wish to promote the theory that humans evolved on, and are adapted to, a diet of only fruit (with perhaps a small amount of other plant foods) must make the following three unusual claims:

• Despite the fossil record evidence of physical/anatomical evolution in the last ~2.5 million years, physiological evolution--specifically of the digestive system--did not and in fact could not have occurred.

• Because of the preceding, humans are still adapted to a fruitarian diet because of the (mystical) alleged ancient frugivorous primate ancestor, discussed in a previous section.

• Similarly, those who make the above two claims must also claim that meat-eating is "maladaptive," and that somehow the human genus survived and evolved (except for the digestive system, of course) over ~2.5 million years of evolution on a "maladaptive" diet! (Does common sense tell you anything about such far-fetched claims?)

Note that those who claim humans evolved to be strict veg*ns, whether fruitarian or not, also must make claims similar to the above to be able to somehow "explain away" the evidence of long-term meat consumption provided in the fossil record.

Analysis of the Fruitarian Claims
Let's now examine, in depth, some of the "evidence" and additional claims raised by promoters of the fruitarian evolution theory.

CLAIM: Morphological evolution is "easy," physiological evolution is nearly impossible.
The basic claim is that mutations that impact morphology--skin color, bone size, etc.--have a minor impact


on the organism, while mutations that produce physiological changes occur under what seem to be more complicated interactions, hence are less likely to succeed or enhance survival.

REPLY: The claim is not only misleading and an implicit oversimplification, but also a good example of the way such claims are often hastily grasped at without being thought through carefully. Because morphology and physiology operate within the human body as a unified system, physical changes often cannot be easily divided into two distinct categories, morphological and physiological. In reality, morphology and physiology are closely interrelated: such morphological parameters as body size and shape are in fact regulated by hormones, which are part of your physiology.

Hormone regulation of morphology/physiology. For example, bone growth and size of the
bones/body are regulated by pituitary growth hormones and sex hormones. In turn, the growth hormones are themselves regulated by releasing hormones produced by the hypothalamus, which is in the brain. (See Tortora and Anagnostakos [1981], and Tanner [1992], for relevant discussion of hormones.) This implies that any significant evolutionary change in morphology requires an associated simultaneous change in physiology, i.e., the relevant hormones must change as well. Thus we observe that a binary or "black-and-white" classification of changes as either morphological or physiological does not match well with reality. Once again, fruitarian extremists are engaging in the binary thinking and oversimplification that is characteristic of such dietary dogma. As a postscript to this topic, the remark of Tanner [1992b] is relevant: Differential growth rates are very often the mechanism of morphological evolution in primates... The above suggests that morphological changes are driven by changes in growth rates, which are determined by hormones (physiology). This is quite interesting, for it indicates that morphological evolution is driven by physiological evolution.

CLAIM: The digestive system is extremely complex. For it to evolve in a mere 2.5 million years is simply not possible.

REPLY: The above is an unsupported rationalization. Consider that the human brain is far more complex than the digestive system, and that the physiology of the entire body is regulated via the autonomic nervous system, which is controlled by the brain. More precisely, some of the control of the autonomic nervous system rests in the cortex [Tortora and Anagnostakos 1981, p. 374], and the cortex is the part of the human brain that is larger and has evolved further vis-a-vis the great apes [Stephan 1972]. Thus we observe that the "master controller" of the human physiology is the brain, with the cortex playing a key role. Then we note that the human brain has in fact evolved (quite considerably) during the period, i.e., the "master controller" of the human physiology has evolved significantly in the last ~2.5 million years since the human genus first appeared. Thus, the bizarre claim that the physiology of the human digestive system could not have evolved during the period is easily seen as the ludicrous fantasy that it is.

CLAIM: Expert opinion is that physiological evolution is highly unlikely. From Jones [1992, p.
286]: Mutations that deviate from the norm often interfere with biochemical processes and are rigorously removed by selection. Much of the body is made up of the building blocks of cells, which must fit accurately with one another... Any mutation that changes the shape of such molecules will almost always be at a disadvantage as it reduces their ability to interact with each other or with their substrate. Stabilising selection of this kind can be a very conservative force. Stabilising selection is a powerful agent that leads to genetic homogeneity. To explain how genetic variation is maintained in the face of such widespread censorship by natural selection is one of the main problems of population genetics.


REPLY: The last sentence in the expert opinion quoted indicates, of course, that physiological evolution
happens anyway. How this happens is precisely one of the interesting challenges spurring current research. The above quote has been used to support the claim that physiological evolution cannot have occurred. Yet genetic variation--and hence physiological evolution--happens anyway. The quote here has simply been twisted and misrepresented.

CLAIM: There is no convincing teleonomy proof that humans have adapted to a diet that includes meat!

REPLY: There is also no convincing teleonomy proof that humans have adapted to a strict fruitarian diet! The appeal to teleonomy is thus a red herring (i.e., a diversion) and nothing more. Teleonomy, the scientific study of adaptations, is sometimes used as a defense of fruitarian evolution: the fruitarian extremist demands what is implied to be definitive "proof"--a teleonomy study or analysis--that "proves" humans have adapted to meat in the diet. Such demands are simply diversions, however, because those who make them do not have teleonomy "proof" that humans are adapted to a fruitarian diet either. In effect, the claim is used (if unconsciously) as a smokescreen to divert attention from the total lack of legitimate scientific evidence to support the fruitarian evolution theories.

A few comments on teleonomy are appropriate here. Thornhill [1996, p. 107] provides an introduction to teleonomy, and he defines teleonomy as: ...[T]he study of the purposeful or functional design of living systems and the directional pressures that have designed adaptations during long-term evolution.

Focus on evolutionary design and function. Teleonomy is primarily interested in phenotypic design and adaptation as they relate to the selective pressures that drive evolution. The focus is on the evolutionary function of adaptations, rather than the evolutionary origins of adaptations (see Thornhill [1996, p. 108]). The role of genetics in teleonomy studies is unclear from the discussion in Thornhill [1996], as Thornhill appears to regard such information as being of very limited value (pp. 116, 122-124). A relevant point here is that the fruitarian extremists who use the teleonomy argument may also simultaneously demand "proof" in the form of genetic models--which Thornhill, the teleonomist, suggests are of limited value in teleonomy. It seems odd to demand genetic "proof" within a framework whose own practitioners regard such evidence as being of limited value.

Criticisms and limitations of teleonomy. Teleonomy is also known as the adaptationist program, and
a well-known critique of the subject is provided by Gould and Lewontin [1994]. An interesting paper on the many statistical challenges faced in doing teleonomy studies is Pagel and Harvey [1988]. As will be discussed later in this paper, humans are unique in nature by virtue of certain important features: high intelligence, language, etc. This uniqueness, coupled with the serious statistical problems inherent in cross-species comparisons, presents major statistical and analytical challenges to those who wish to unequivocally "prove" adaptation in humans (vis-a-vis other species) via teleonomy studies. As such, perhaps the best way to view teleonomy is as one analytical tool (out of many tools) available to the researcher, rather than the "best tool" or "only (valid) tool," which is what fruitarian extremists appear to imply in their use of teleonomy as a last-ditch defense of their crank science theories.

CLAIM: There are no examples of animals evolving backwards from a vegetarian to an omnivorous diet!


REPLY: Evolution does not move backwards or forwards. Evolution occurs as the result of changes over time, driven by selective pressures in the environment (which can include behavior, especially for humans, per discussion in a previous section). The use of the word "backwards" reveals the emotional nature of the fruitarian evolution claims. The idea that evolving from a vegetarian diet to an omnivorous (or faunivorous) diet is "backwards" is purely a subjective bias. Nature simply IS; our subjective opinions of the process are irrelevant.

CLAIM: If humans are adapted to eating meat, what exactly are those adaptations?

REPLY: The answer to this question will become apparent in the later sections of this paper. Though a short answer could be given here, it would remove the element of surprise from some of the following sections. This question is addressed later with relevant details and context.

In summary, those who cling to crank science/science-fiction theories that humans evolved as fruitarians must also cling to incorrect logic and fantasies which state:

• That the human digestive system is an evolutionary throwback to some mystical frugivorous ancestor, and,

• That the human body, including the brain--but not the digestive system!--has evolved over the last ~2.5 million years.

In other words: Those who cling to false fruitarian evolution theories are claiming that evolution works, but not for the digestive system; the brain can evolve but the stomach cannot. Such bizarre assertions place the fruitarian evolution theory squarely in the realm of science fantasy rather than fact.

Further Evidence Against the Claims of Fruitarian Evolution
Fully upright bipedal posture, lack of arboreal adaptations
Unfortunately for the proponents of fruitarian evolution theories, there is evidence that we did not evolve as pure fruitarians, over and above the extensive fossil record evidence of human consumption of animal foods (fauna).

Quadrupedal vs. bipedal adaptations and tree-climbing. First, humans are fully upright and
bipedal, while apes are quadrupedal (though chimps, bonobos, and gorillas, of course, do have limited bipedal abilities). A fully upright bipedal posture is a major disadvantage in tree-climbing compared to quadrupedal motion--it is easier and safer to navigate trees when one can grasp and hold with four extremities rather than two, as humans must. That is, a fully upright bipedal posture is a disadvantage, overall, in collecting fruit--and hence would reduce survival of the alleged "pure fruitarian" prehistoric humans or proto-humans.

Bipedalism/quadrupedalism and fruit-picking. The hypothesis has been advanced that bipedalism
evolved because it is more efficient to pick small, low-growing fruits [Hunt 1994]. Standing up to pick low-


growing fruits allows use of both hands to pick fruit, and may allow more efficient harvesting of small fruit from small trees. However, this hypothesis [Hunt 1994]--that bipedalism evolved as a fruit-picking strategy--specifically refers to chimps and australopithecines. Even then, it is well known that the chimp diet is not exclusively fruit; it also includes small but nutritionally significant amounts of fauna (insects, meat), along with other foods such as leaves and pith. Also, chimps retain quadrupedal motion and arboreal capabilities, which they use for efficient fruit collection in larger trees. Hunt [1994] also discusses special muscular adaptations in chimps for vertical climbing (of trees) and hanging (from trees). As for Australopithecus, radio-isotope studies indicate that Australopithecus was an omnivore/faunivore, and not a fruitarian. (See Sillen [1992], Lee-Thorp et al. [1994]; also Sillen et al. [1995].) Thus even if bipedalism did evolve initially as a specialized fruit-picking mechanism, the evolution occurred within the framework of an omnivorous/faunivorous diet (or a frugivorous diet that specifically included nutritionally significant amounts of fauna), not a strict fruitarian diet.

Actual fruit availability/dependability fluctuates with location and time. Note that the
hypothesis of Hunt [1994] does not suggest or imply that chimps or Australopithecus had a fruit-only diet. Also note that even in a savanna environment, fruit is not limited to low-growing trees. Consider the major restrictions placed on the survival of our prehistoric ancestors (or even modern-day chimps) if their diet were limited to only fruit picked within 2-3 meters of the ground. Such a restriction would greatly reduce survival, not increase it. A pure fruitarian needs a year-round supply of large amounts of fruit, and the ability to efficiently harvest as much of the available fruit as possible. That is, one needs to be able to efficiently pick most of an entire, tall tree--hence needs effective tree-climbing ability, that is, quadrupedal motion and arboreal adaptations.

Different adaptations, different tradeoffs. In contrast, an omnivore/faunivore for whom fruit is only
a part of the diet, and who effectively hunts/harvests other high-calorie foods (e.g., animal foods), can easily afford the loss of overall efficiency in fruit harvesting that fully upright bipedalism (as in humans) involves. This is particularly true if bipedalism enhances the ability to harvest high-calorie foods (fauna) by increasing hunting efficiency. (Hunting efficiency is discussed in a later section.) The set of adaptations of chimps--limited bipedal ability that improves efficiency in picking fruit from small trees, combined with quadrupedal/arboreal adaptations to efficiently harvest fruit from tall trees--appears to be the optimal adaptation set for a primarily frugivorous animal. However, humans clearly do not have all of these important survival adaptations. In regard to humans, Hunt [1994] argues that (p. 198): Accordingly, scavenging, hunting, provisioning and carrying arguments for the origin of bipedalism (Shipman, 1986; Sinclair et al., 1986; Carrier, 1984; Wheeler, 1984; Lovejoy, 1981; Jungers, 1991) are more convincing explanations for the refinement of locomotor bipedalism in Homo erectus, as are heat stress models (Wheeler, 1993). Adaptations must be interpreted in context. That is, even if the hypothesis of Hunt [1994] is correct, the later improvements in bipedal motion associated with Homo erectus (which evolved approximately 1.7 million years ago) are likely associated with the role of early humans as hunters, and not as fruit-gatherers. Inasmuch as humans are a separate genus from chimps and Australopithecus, the later developments are, of course, more relevant. Aiello [1992, p. 43] points out that: In fact, at walking speeds human bipedalism is significantly more efficient than ape quadrupedalism. Needless to say, a fruitarian ape needs quadrupedal abilities for harvesting fruit from tall trees, and the lack thereof (in humans) suggests that harvesting fruit was much less important than the fruitarian evolution advocates claim. Additionally, note that humans did not evolve the special (orangutan) hand adaptations for tree-climbing (this is discussed further in a later section, with accompanying citations), and we also lack the special muscular adaptations of chimps [Hunt 1994] for arboreal motion. In contrast, fully upright


bipedalism made walking--and hunting--far more efficient, at the cost of reduced efficiency (overall) in fruit harvesting.

Body size and tree-climbing. Finally, the body size of humans is problematic for a pure
fruitarian. Humans are larger than orangutans, the most arboreal of the great apes, and we are larger than chimps. Hunt [1994] observed that larger chimps favored shorter trees, thereby minimizing the energy expended in climbing trees. The even larger body size of humans (compared to chimps or orangutans) also reduces our efficiency in harvesting fruit growing on the outer canopy of tall trees (where a substantial part of a tree's fruit is located), as our weight restricts us to relatively large branches. (Small branches could break, and falling out of trees certainly does not promote evolutionary survival.) Temerin et al. [1984, p. 225] note that: Arboreal travel is precarious at best, and increasing size makes compensatory demands for stability increasingly important. One response to this is to limit travel speeds. The remarks of Aiello [1992, p. 44] provide further insight: In larger animals, [tree] climbing can be two to three times more costly in energy terms than horizontal locomotion, a relationship that may explain why arboreality is largely confined to smaller primates. The obvious exception to this pattern is the arboreal orang-utan--the largest arboreal mammal with an approximate average body weight of 55 kilograms. It is, however, both highly specialized in its anatomy and fairly lethargic, climbing slowly and deliberately.

In summary, it seems quite odd that humans allegedly evolved to eat a nearly 100% fruit diet, yet our body size may be "too big" for an efficient frugivore, and we lack the major ape adaptations (quadrupedalism; hand and muscle adaptations) needed to efficiently collect fruit. That is, the fruitarian claim that we are adapted to a diet of fruit shows little awareness of the real-world need of animals who eat high percentages of fruit to be able to collect that fruit efficiently. The obvious conclusion here is that the claim humans evolved as strict fruitarians is crank science (or, if you prefer, science fiction).

Fruitarian nutrition vs. brain evolution
Finally, the question of whether a fruitarian diet could supply the nutrition needed to drive brain evolution must be considered. All available evidence indicates that the answer is no. Some of the relevant reasons are:

• Vitamins, minerals, and fats. Fruit is deficient in vitamin B-12; pure fruit diets may be
deficient in zinc, and even calcium. The EFA (essential fatty acid) balance in sweet fruit, the diet promoted by fruitarian extremists, is not optimal. Needless to say, vitamin B-12 (and mineral) deficiencies do not promote evolutionary survival, and a poor balance of EFAs does not promote brain development either. (Note: B-12, zinc, and EFAs are discussed in a later section herein.)

• Evolution/survival on pure fruit diets not possible in nature given fluctuations in fruit availability. Getting enough fruit to survive on, every day of the year, is unlikely even in a
pristine, primary tropical rainforest. Preceding sections have discussed that orangutans apparently cannot get enough fruit in their remote tropical rainforest homes to live on fruit all year. Instead, they are forced to eat large amounts of leaves when fruit is scarce (see Knott [1998], MacKinnon [1974, 1977] as cited in Chapman and Chapman [1990]). If orangutans--quadrupeds with superb tree-climbing skills--cannot get enough fruit to survive on a (pure) fruitarian diet, it seems highly unlikely that humans (who are larger than orangutans, require more food, and are less adapted for tree-climbing) could consistently find enough fruit in a tropical rainforest (or savanna, where fruit is less plentiful), every day of the year, to satisfy their minimum calorie requirements.


By the way, climbing tall trees to pick fruit (orangutans and other primates do not rely on fruit that falls to the ground--apparently inadequate as a sole food source) is hard, dangerous work and greatly increases daily calorie requirements, which further increases the amount of food/fruit required. One wonders if the fruitarian extremists who promote idealistic visions of life in a jungle paradise filled with fruit trees have ever tried to pick more than minimal quantities of wild fruit--especially in a rainforest where the lowest fruit-bearing branches on a tree may be 20-30 or more meters straight up. Finally, no doubt some fruitarian extremists will claim that adequate fruit to support pure fruit diets was in fact available in prehistoric times. This is of course nothing but speculation, without reference to any kind of credible paleobotanical evidence. Even more relevant, where is there evidence of an adequate year-round supply of fruit (adequate to support a fruit-only diet for our prehistoric ancestors) on the African savanna of 1-3 million years ago, in the days of Australopithecus, and subsequently in the early days of the newly evolved Homo genus? The obvious conclusion is that a pure fruitarian diet is not feasible for hunter-gatherers or our prehistoric ancestors, as fluctuations in the fruit supply make survival on a pure fruitarian diet unlikely in the short run, and a virtual impossibility in the long run. If you cannot survive on the diet, it certainly cannot support brain evolution, or any other kind of evolution.

Synopsis and Section Summary
We have seen that:
• Humans are the most encephalized and intelligent species in all of nature.
• The human basal metabolic rate (BMR) is normal for our body size, yet we invest far more energy in our brains than other mammals. This is a paradox: how can we "afford" it?
• The increase in encephalization in humans appears to be compensated for by a comparable decrease in gut size.
• A decrease in gut size with constant body mass and BMR is possible only via an increase in dietary quality (DQ).
• Prior to the development of agriculture, the only feasible/economic way to increase DQ was via consumption of animal foods.

The available evidence supports the idea that the consumption of animal foods, including meat, was an important factor in encephalization and in the evolution of the human brain. That is, your large brain--which helps you choose your diet (and argue with others about diet)--is apparently, according to the most plausible interpretation of the evidence, the result of the evolutionary legacy of ~2.5 million years of meat consumption by your ancestors. (A rough energy-budget sketch of this trade-off follows below.)

Much of the material in this section comes from paleoanthropology, and is based on an analysis of the available data. The net effect of so much evidence, all supporting the same conclusion (that animal foods were a driving force in brain evolution), is both powerful and convincing. However, some readers might dissent and remind us that although the evidence is substantial, parts of it may be subject to different interpretations. That, however, is a result of the nature of parts of the available evidence, e.g., the fossil record. In this case, those who dissent are faced with the daunting task of plausibly supporting a picture of how brain evolution could have occurred on a purely plant-food/fruit diet without agriculture.
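To illustrate the energy bookkeeping behind the summary above, here is a rough sketch. The percentages are approximate figures commonly cited in this literature, and the arithmetic is meant only to show the trade-off, not to reproduce Aiello and Wheeler's actual calculations.

```python
# Toy energy-budget illustration of the "expensive tissue" trade-off.
# All numbers are rough approximations used only to show the bookkeeping;
# see Aiello and Wheeler [1995] for the real data.

BMR_KCAL_PER_DAY = 1600             # assumed human basal metabolic rate
BRAIN_SHARE_HUMAN = 0.20            # brain often cited at ~20-25% of human BMR
BRAIN_SHARE_TYPICAL_PRIMATE = 0.09  # ~8-10% in other primates (approximate)

extra_brain_kcal = BMR_KCAL_PER_DAY * (BRAIN_SHARE_HUMAN - BRAIN_SHARE_TYPICAL_PRIMATE)

# If total BMR stays constant, the extra brain energy must be offset elsewhere;
# the hypothesis is that a smaller gut (made viable by a higher-quality diet)
# frees up roughly this amount.
print(f"Extra energy routed to the brain: {extra_brain_kcal:.0f} kcal/day")
print("Under constant BMR, the gut (another expensive tissue) must shrink")
print("enough to free approximately this amount.")
```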


Human brain size has decreased 11% in the last 35,000 years, and 8% in the last 10,000--concurrent with decreasing animal food consumption during that time, brought about by the advent of agricultural subsistence. This evidence provides some grounds for suggesting that the calorically/nutritionally dense animal tissues that presumably fueled human brain evolution may not be completely or adequately substituted for by the calorically dense plant-based products of agriculture, such as grains.

The claim that humans evolved as fruitarians is an example of crank science; there is no credible scientific evidence to support such theories. No doubt certain fruitarian extremists will denounce the material in this section as "pseudoscience." In light of the fact that such adherents have no credible evidence to support their views, such allegations by fruitarian extremists are simply examples of hypocrisy and, perhaps, intellectual dishonesty.

Comparative "proofs" that ignore intelligence and brain size are dubious. At this point it is
appropriate to once again connect this section and the previous one to the topic of comparative anatomy. Two points are relevant here: • • First, the previous section regarding the fossil record and the culture/evolution feedback loop-combined with the material of this section--yields a powerful argument that animal foods were, and are, a part of the human evolutionary diet, i.e., are natural. Second, as this section provides evidence that the evolution of the brain may be closely tied to diet, clearly, comparative "proofs" of diet that ignore intelligence and brain size are very dubious indeed. Unfortunately, the major comparative "proofs" do ignore intelligence. We shall see in coming sections how this oversight renders major portions of those "proofs" logically invalid.

PART 5: Limitations on Comparative Dietary Proofs
This section discusses the limitations--structural and logical--inherent in the various comparative "proofs" of particular diets. Such "proofs" typically focus on comparative anatomy and comparative physiology, though they often include other elements as well. The first part of this section provides a list, with explanations, of the various limitations. The second part discusses some real-world examples that illustrate the difficulties in using comparative studies as "proof."

Editorial note: In the following, the term "comparative proofs" refers to the comparative proofs for
particular diets. It is not a general reference to any/all comparative studies.

Logical and Structural Limitations

Comparative "proofs" of diet are subjective and indeterminate; i.e., they do not provide actual proof of diet.
The "proof" in Fit Food for Humanity that claims humans are natural vegetarians was discussed in an earlier section. This "proof" groups all the anthropoid apes into the vegetarian category. However, we now know that the diet of the great apes is not strictly vegetarian, so let's have some fun and consider the following logical exercise.


First, let's modify the "proof" in Fit Food for Humanity as follows. Based on the more recent knowledge now available, replace the collective category for vegetarian apes with two more-well-defined (and accurate) ones: first, with a category consisting solely of the gorilla (a folivore); then second, with a category consisting solely of the chimpanzee (a frugivore/omnivore). The comparative information in Groves [1986] and Aiello and Dean [1990] might be helpful in producing the new comparison tables. This yields two new comparative "proofs" similar to the one in Fit Food for Humanity: one "proof" that shows humans are folivores, and yet another "proof" that humans are frugivores/omnivores. Next, note that Mills (The Comparative Anatomy of Eating) provides a lengthy "proof" that purports to show humans are herbivores. Finally, note that Voegtlin [1975] as cited in Fallon and Enig [1997], provides a comparative anatomy analysis that indicates humans are actually natural carnivores. See Functional and Structural Comparison of Man's Digestive Tract with that of a Dog or Sheep (below) for a comparative anatomy analysis from Voegtlin [1975].

FUNCTIONAL AND STRUCTURAL COMPARISON OF MAN'S DIGESTIVE TRACT WITH THAT OF A DOG AND SHEEP

                            MAN               DOG               SHEEP
TEETH
  Incisors                  Both Jaws         Both Jaws         Lower Jaw Only
  Molars                    Ridged            Ridged            Flat
  Canines                   Small             Large             Absent
JAW
  Movements                 Vertical          Vertical          Rotary
  Function                  Tearing-Crushing  Tearing-Crushing  Grinding
  Mastication               Unimportant       Unimportant       Vital Function
  Rumination                Never             Never             Vital Function
STOMACH
  Capacity                  2 Quarts          2 Quarts          8 1/2 Gallons
  Emptying time             3 Hours           3 Hours           Never Empties
  Interdigestive rest       Yes               Yes               No
  Bacteria present          No                No                Yes-Vital
  Protozoa present          No                No                Yes-Vital
  Gastric acidity           Strong            Strong            Weak
  Cellulose digestion       None              None              70%-Vital
  Digestive activity        Weak              Weak              Vital Function
  Food absorbed from        No                No                Vital Function
COLON & CECUM
  Size of colon             Short-Small       Short-Small       Long-Capacious
  Size of cecum             Tiny              Tiny              Long-Capacious
  Function of cecum         None              None              Vital Function
  Appendix                  Vestigial         Absent            Cecum
  Rectum                    Small             Small             Capacious
  Digestive activity        None              None              Vital Function
  Cellulose digestion       None              None              30%-Vital
  Bacterial flora           Putrefactive      Putrefactive      Fermentative
  Food absorbed from        None              None              Vital Function
  Volume of feces           Small-Firm        Small-Firm        Voluminous
  Gross food in feces       Rare              Rare              Large Amount
GALL BLADDER
  Size                      Well-Developed    Well-Developed    Often Absent
  Function                  Strong            Strong            Weak or Absent
DIGESTIVE ACTIVITY
  From pancreas             Solely            Solely            Partial
  From bacteria             None              None              Partial
  From protozoa             None              None              Partial
  Digestive efficiency      100%              100%              50% or Less
FEEDING HABITS
  Frequency                 Intermittent      Intermittent      Continuous
SURVIVAL WITHOUT
  Stomach                   Possible          Possible          Impossible
  Colon and cecum           Possible          Possible          Impossible
  Microorganisms            Possible          Possible          Impossible
  Plant foods               Possible          Possible          Impossible
  Animal protein            Impossible        Impossible        Possible
RATIO OF BODY LENGTH TO
  Entire digestive tract    1:5               1:7               1:27
  Small intestine           1:4               1:6               1:25

From: The Stone Age Diet, pp. 44-45, Walter L. Voegtlin, M.D., F.A.C.P., 1975, Vantage Press, NY, NY.

Four "proofs"--but what do they prove? Thus, as a result of producing and collecting the "proofs" cited above, we will have 4 different types of comparative "proofs":
• Humans are natural herbivores: 2 examples--Mills and Fit Food for Humanity.
• Humans are folivores: modification (cf. gorillas) of Fit Food for Humanity.
• Humans are frugivores/omnivores: modification (cf. chimps) of Fit Food for Humanity.
• Humans are carnivores: Voegtlin.

Then, we simply note that the above cannot all be true as they are contradictory. This points to the conclusion that comparative "proofs" of diet are subjective and indeterminate--that is, they are not hard proof of anything. It should be mentioned here that some who present comparative "proofs" openly admit their subjective nature (though the point is often not given proper emphasis, for the obvious reasons), whereas others are not so candid.

There is no logical validation of the list comparison process. That is, there is no logical validation for the implicit assumption that comparing two or more species via a list of features constitutes proof that a particular diet is natural or optimal.
Comparative studies can be powerful tools for analysis, but there is so much variation in nature that one cannot logically derive a "should" (i.e., you should eat this diet) from an "is" derived from other species' diets (i.e., that other, allegedly similar, species eat the same type of diet). The presence of substantial morphological similarity between two different species does not prove, or even imply, that their diets must be similar. The idea that in nature a "should" can be derived from an "is" (of another species) in such a fashion is a major logical fallacy of the comparative "proofs" for particular diets.

Additionally, one other point bears repeating here. Let us pretend that comparison of a list of features somehow constitutes hard "proof" for a specific diet. It then follows that one must ask the relevant questions listed below:

• How are features chosen for a comparison list? Objective, unambiguous guidelines are necessary to select features. What are those guidelines? (What to include in the list? What to measure?)
• Once a feature is selected, how is it to be measured: length, weight, volume, surface area, or some other measure? (How to measure the list features?)
• What is the definition of a "match"? Given features from 2 different species in a comparison list, how "close" do they have to be to be considered a "match"?
• How many matches are required for "proof," given a comparison list of features which contains some number of "matches"? What validation is there for the "magic" number of matches that allegedly constitutes "proof"?


For the obvious reasons (they are too heavily subjective), the above questions go unanswered. The advocates of comparative "proofs" are assuming that a matching list somehow "proves" their claims. In reality, it proves nothing--a matching list is evidence of possible association or similarity; it helps to define hypotheses, but it does not provide definitive proof of a hypothesis. As Ridley [1983, p. 34] points out:

The comparative method is used to look for associations (between the characters of organisms) which natural selection, or some other causal principle, might be expected to give rise to. It can test hypotheses about the cause of adaptive patterns. The result of this test is an association or correlation. The mere association of two characters, A and B, does not tell us whether in nature A was the cause of B, or B of A.
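To see how heavily the verdict of a matching-list "proof" depends on these unanswered questions, consider the following toy demonstration. The feature values are deliberately simplified and partly hypothetical; only the logic matters here: the same "method" yields opposite verdicts depending on which features one chooses to count.

```python
# A minimal demonstration of why list-matching "proofs" are indeterminate.
# Feature values are simplified/hypothetical; the point is the logic:
# the verdict flips with the choice of feature list.

HUMAN = {"canines": "small", "colon": "long", "claws": "absent",
         "jaw_motion": "lateral", "stomach_acid": "strong"}
HERBIVORE = {"canines": "small", "colon": "long", "claws": "absent",
             "jaw_motion": "lateral", "stomach_acid": "weak"}
CARNIVORE = {"canines": "large", "colon": "short", "claws": "present",
             "jaw_motion": "shear", "stomach_acid": "strong"}

def match_score(a, b, features):
    """Count matching features -- the unstated 'method' behind such proofs."""
    return sum(a[f] == b[f] for f in features)

# Cherry-picked feature list A: humans look herbivorous.
list_a = ["canines", "colon", "claws", "jaw_motion"]
# Cherry-picked feature list B: humans look carnivorous.
list_b = ["stomach_acid"]

print("List A:", match_score(HUMAN, HERBIVORE, list_a), "vs",
      match_score(HUMAN, CARNIVORE, list_a))  # 4 vs 0 -> "herbivore"
print("List B:", match_score(HUMAN, HERBIVORE, list_b), "vs",
      match_score(HUMAN, CARNIVORE, list_b))  # 0 vs 1 -> "carnivore"
```

With no objective rule for choosing features, measures, or a "magic" number of matches, any desired conclusion can be manufactured.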

The comparative "proofs" focus only on similarities while ignoring differences.
In legitimate applications of comparative anatomy and/or physiology, the differences--those features that make each species unique--are of central interest. In the comparative "proofs" for diets, the differences are often ignored or rationalized away. (That is, any differences that inhibit promotion of the target diet will be rationalized or ignored.) By ignoring or rationalizing away the differences, the comparative "proofs" of diet render themselves logically invalid. In a discussion of the topic of comparative studies of adaptation, Ridley [1983, p. 8] observes:

The most rudimentary kinds of comparative study are those which, after looking through the literature more or less thoroughly, present a list of only the species (or some of them) which conform to the hypothesis... Such a list may usefully demonstrate that a trait is taxonomically widespread. Or it may suggest a hypothesis. But it cannot test one. If the method does not also list the species that do not conform to the hypothesis it is analogous to selecting only the supporting part of the results from a mass of experimental data... The very first requirement, then, for a test of a comparative hypothesis is that the criterion for including a species in the test is independent of whether it supports the hypothesis. This criterion may seem obvious; it may seem that it does not require statement. But it is enough to rule out the entire [comparative] literature (with one exception) from before about 1960. That exception is the French biologist Etienne Rabaud.

The basic problem of studying similarities while deliberately ignoring differences is that it constitutes picking and choosing the data to conform to the particular hypotheses of interest. (Note: the studies of Rabaud, mentioned above, are discussed further later herein.)

The comparative "proofs" ignore the fossil record and evolution. The comparative "proofs" are dated and do not reflect current knowledge of ape diets. The comparative "proofs" assume dietary categories are discrete (distinct), while in nature diets form a continuum.
The above points were covered in previous sections.

The comparative "proofs" ignore the important features that make humans unique in nature.
In particular, these features are generally ignored: • • • • • Tool usage. Fully opposable thumbs. Fully upright and bipedal posture (note: thumbs and posture are mentioned in a few proofs). Fully developed language and culture. And--most important of all--high intelligence.


As discussed in the previous section, the topic of evolution of the brain vs. diet is usually not covered. Our brain allows us to develop technology (e.g., tools and cooking), which can impact our morphology via the behavior/culture evolutionary feedback loop. We will see later that human intelligence, expressed via tools and behavior, allows humans to easily overcome many of the alleged limits on adaptation that the comparative "proofs" discuss. The "Expensive Tissue Hypothesis" [Aiello and Wheeler 1995] and related research discussed in the previous section suggest that the evolution of our brain and higher human intelligence is in part the result of a diet that included significant amounts of animal foods. The energy budget and life history events of humans are unique, relevant, and should be covered in a legitimate comparative study. The comparative "proofs" ignore these facts.

Comparative "proofs" assume the form/function connection is strict and necessary.
Here the word "necessary" is used in its logical sense; i.e., that function necessarily follows form, and that the form/function linkage is strict. This is a critical assumption underlying the comparative "proofs" that promote specific diets. However, the assumption is incorrect and logically invalid.

Similar functions can be served by dissimilar forms. John McArdle, Ph.D., a scientific advisor to the American Anti-Vivisection Society, notes [McArdle 1996, p. 173]:

It is also important to remember that the relation between form (anatomy/physiology) and function (behavior) is not always one to one. Individual anatomical structures can serve one or more functions and similar functions can be served by several forms.

The form/function relationship can be dissimilar even in closely related species. Sussman [1987, p. 153] comments that:

When comparing species-specific dietary patterns between different primate species, we often find that these patterns are not dependent upon taxonomic relationships. Closely related species often have very different diets, whereas many unrelated species have convergent dietary patterns. However, we would assume that at some level there is a relationship between dietary patterns, foraging patterns, and anatomical and physiological adaptations.

Overall systemic constraints are as important as individual functions/forms. Finally, Zuckerman [1933, pp. 12-13] reminds us of important systemic constraints:

Function cannot be stressed more than form. Their separation from each other, and their individual isolation from the intact working organism, are both essential, but to a large extent necessarily incomplete, operations. The same is true of the abstraction of biochemical facts. Form, chemistry, and function are indissolubly united, and it may truly be said that from the taxonomic standpoint all characters of the body, whatever their nature, have a fundamental equivalency... To discuss them on the basis of ex cathedra judgements [i.e., via subjective assessments only] of their relationship to heritage or habitus is to attempt the impossible, and to ignore in greater part the lesson taught by the experimental study of genetics and growth.

Research on the form/function linkage
Some examples of research studies that tried to document a form/function linkage are as follows:

• Early research. As mentioned above, the French biologist Etienne Rabaud's work is among the best early research on evolutionary adaptation, including the form/function linkage. From Ridley [1983, p. 8]:

Rabaud was concerned with Cuvierian laws of the relationship between anatomy and niche, such as that herbivores have flat, grinding teeth and carnivores sharp, piercing teeth. Rabaud first looked systematically at these laws in his book of 1911, and later in a longer work on Les phenomenes de convergence en biologie (1925). In 1925 he summarized, systematically, such standard examples of [evolutionary] convergence as adaptations for flying and swimming. Each systematic summary led to the same conclusion: "one can recognize no necessary relation between the form and the way of life of the organisms considered" (1925, p. 150). The laws were not universal, he concluded, but were based only on facts that had been selected because they conformed... [to the hypotheses].

• Gut morphology. Vermes and Weidholz [1930], as cited in Hill [1958], tried to correlate diet with gut (digestive system) morphology in primates. Hill [1958, p. 146] describes their results:

The authors attempt to substantiate the correlation between diet and gastrointestinal morphology among a series of Primates; they conclude that the distinctions are not so obvious as in other orders and in most cases do not permit reliable conclusions to be drawn, while in some cases the evidence is contradictory.

The above reflects the reality that analysis of the gut is not as simple as one might expect from the comparative "proofs." (In later sections, more recent studies--which yield more definitive results than Vermes and Weidholz--will be discussed.)

• Mastication. Wood [1994] analyzed the correspondence between diet and masticatory (chewing) morphology in 18 extant primate taxa. The analysis covered 30 coordinate points on the skull. Wood [1994, abstract] concludes:

The results of this study suggest that there is no definitive or straightforward relationship between dietary type and the functional morphology of the masticatory apparatus discernible among the taxa studied, although some encouraging patterns may be observed. Primate dietary behavior is flexible and may not be predicted with certainty on morphological grounds.

Once again, efforts were unsuccessful in establishing a strong linkage between form and function in the context of diet and morphology.

• More reliable insights by limiting complexity and narrowly defining the features to be studied. In contrast to the above studies, if a number of steps are taken to reduce the complexity of the problem--i.e., limit the analysis to a small group of closely related species, study only a small data set of narrowly defined physical features, and limit the statistical analysis to elementary procedures--then one might find evidence of a form/function linkage (in the set of species studied). This describes the approach of Anapol and Lee [1994], whose research analyzed the morphology of the jaws and teeth of 8 species of platyrrhine monkeys from the Surinam rainforest. Their research found statistically significant differences, associated with diet, in some of the measures of tooth, jaw, and masticatory musculature. Examination of the Anapol and Lee [1994] research versus the comparative "proofs" of diet illustrates some important points:

o Anapol and Lee are interested in morphological specializations--differences--because those differences reflect evolutionary adaptations to different diets. That is, Anapol and Lee studied both similarities AND differences.
o In contrast, the comparative "proofs" focus only on similarities, and ignore or rationalize away those differences deemed inconvenient to the goal of promoting dietary dogma.
o Also, the research of Anapol and Lee is descriptive; in contrast, the comparative proofs allege that their analysis is prescriptive, i.e., that you "should" eat the diet they promote.


The work of Anapol and Lee is a good example of a legitimate application of comparative anatomy; in contrast, the comparative proofs of diet are really misapplications of comparative anatomy. Thus we see that a narrowly focused study, within a closely related group of species, might find possible evidence of a form/function linkage. Of course, in sharp contrast, the comparative "proofs" of diet are overly broad, and their conclusions dubious. As the form/function linkage assumption is such a critical part of the comparative "proofs" of diets, we will next examine additional aspects of the form/function relationship, including specific examples of forms that can serve more than one function.

The same form can serve multiple/different functions.

Canine teeth. The quote by McArdle in the preceding section serves as a good introduction to this topic. Canine teeth are a good example of one form serving different functions. In true carnivores, canine teeth help in cutting the flesh of prey. However, some frugivores find large canines to be helpful in processing fruit. Regarding 3 different species of platyrrhine monkeys, Anapol and Lee [1994, pp. 250-251] note:

The primary food resource of Chiropotes satanas consists of young seeds from unripe fruit that are obtained by removing the tough pericarp with the anterior dentition (van Roosmalen et al., 1988; Kinzey and Norconk, 1990; Kinzey, 1992)... The most predominant feature of the dentition of Chiropotes satanas is the exceptionally robust upper and lower canines, also present in both Cebus species studied here... For primates harvesting sclerocarp, robust canines provide an advantage for opening exceptionally hard unripe fruits to gain access to seeds (Kinzey and Norconk, 1990).

Primate teeth. Also see Richard [1985, p. 190] for comments on why frugivorous primates need sturdy
teeth, specifically incisors. Richard [1985, p. 194] also notes that "features of tooth morphology indicate diet habits strongly but not perfectly." Mann [1981] reports that chimpanzee and gorilla teeth are very similar despite significantly different diets.

Teeth of hominids. Mann [1981] also contrasts the sharply different analyses of the diets suggested by the dental systems of early hominids. He cites Jolly [1970, 1972], who argues for a predominantly plant diet (but non-vegetarian; Jolly specifically included vertebrate meat in the diet), versus Szalay [1975], who interprets the same dental evidence as adaptations to hunting and a meat-predominant diet. (See Mann [1981] for the Jolly and Szalay citations.) Garn and Leonard [1989] discuss the form/function connection as it applies to the dentition (teeth) of both modern and prehistoric humans. Their remarks bring the issue into clear focus [Garn and Leonard 1989, pp. 338-339]:

[E]ven the [Homo] erectus fossils did not have long, projecting gorilla-like canines or the highly convoluted wrinkled molar surfaces that suggest dependence on roots, shoots, and large leaves. Our dentitions are now at most indicative of an omnivorous but soft diet and they are equally suitable for gnawing at bones, gorging on fruit, or consuming fried rice, boiled wheat, and unleavened cakes. However, and despite their more specialized dentitions, chimpanzees prove to be far more omnivorous in the wild than we had earlier reason to imagine. (3)... We therefore grant the [Homo] erectus fossils a considerable year-round supply of animal flesh, but we have no knowledge of how much vegetation was also dug, pulled, picked, or stripped...


Climatic data and faunal associations indicate that these earlier people of our genus were not browsers. They correspond to the notion of "early man" as hunters, at least in part... Thus we see that a particular form can serve multiple functions. As we will discuss in a later section, the human hand is an excellent example of one form that can serve a myriad of functions.

Subtle changes in form can produce or support significant changes in the function of organs or body systems.
Milton [1993, p. 89] reports:

This research further shows that even without major changes in the design of the digestive tract, subtle adjustments in the size of different segments of the gut can help compensate for nutritional problems posed by an animal's dietary choices.

The above quote is even more relevant in light of the fact that human (and other primate) guts are "elastic," and their dimensions can change in reaction to dietary changes. This will be explored in a later section. Richard [1995] notes that juvenile apes have well-differentiated cusps on their molars, while adult apes have flat molars that display heavy wear. However, the larger teeth and stronger masticatory muscles of the adults may serve the same purpose as the weaker muscles and differentiated cusps of the juveniles. This is an example of soft-tissue changes compensating for hard-tissue changes, i.e., tooth wear.

The function served by a particular form can vary dramatically according to specific feeding behavior.
We have already discussed examples of this: the platyrrhine monkeys using canine teeth to eat hard fruits, and the considerable versatility of human dentition. Additional examples are:

• The sportive lemur (Lepilemur mustelinus), a very small folivore, engages in coprophagy--it eats its own feces, for a "second-stage" digestion. This 2-stage process may increase the energy value of the lemur's diet [Richard 1995]; see also Hladik and Charles-Dominique [1974] as cited in Chivers and Hladik [1980].

• The surface area of the digestive system may be less important than the retention time of food in the gut [Richard 1995]. Milton and Demment [1988] analyzed the mean transit time (retention time) of different types of foods in the digestive systems of chimps and humans. They found that lower-quality foods (high-fiber, low-energy) passed through the system faster than higher-quality (usually low-fiber, high-energy) foods. Reducing the transit time for lower-energy/lower-quality foods is apparently a survival tactic, as it supports increased intake of lower-energy foods (to meet energy requirements) when necessary.

We will see later that failure to consider differences in feeding behavior makes major parts of the comparative proofs of diets invalid.
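To make the retention-time point above concrete, here is a small illustrative calculation. The gut capacity, transit times, and energy densities are assumed round numbers, not data from Milton and Demment; the sketch only shows how faster passage of low-quality food raises the amount of it that can be processed per day.

```python
# Illustrative arithmetic for the retention-time point (values assumed).
# Faster passage of low-quality food lets more of it be processed per day,
# partially compensating for its lower energy density.

GUT_CAPACITY_KG = 2.0   # assumed working capacity of the gut

def daily_energy(kcal_per_kg, transit_hours, capacity_kg=GUT_CAPACITY_KG):
    """Energy processed per day = gut turnovers per day * capacity * density."""
    turnovers_per_day = 24.0 / transit_hours
    return turnovers_per_day * capacity_kg * kcal_per_kg

high_quality = daily_energy(kcal_per_kg=1500, transit_hours=40)  # slow, rich food
low_quality  = daily_energy(kcal_per_kg=600,  transit_hours=24)  # fast, fibrous food

print(f"High-quality food: {high_quality:.0f} kcal/day")  # 1800 under these values
print(f"Low-quality food:  {low_quality:.0f} kcal/day")   # 1200 under these values
```

Under these assumptions the faster transit narrows, though does not close, the energy gap between the two food types.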

Analysis of dietary adaptations is non-trivial.
However, some of the comparative "proofs" themselves do border on the trivial. More seriously, though, as Richard [1995] points out, evolutionary adaptations are compromise solutions that reflect the effect of multiple selection pressures, applied simultaneously. For example, a large gut might assist digestion. However, if the gut size increases sharply and in a disproportionate manner, the animal might not be able to outrun predators. Such an animal would quickly go extinct. This illustrates that one cannot look at single organs or subsystems in isolation. Rather, one must look at the whole organism, or at least attempt to do so.


An alternate way of stating the above is that adaptations (morphology) are the solutions to multivariate (multiple-variable) optimization problems, where:
• many variables are constrained, and
• time is a variable as well, i.e., the interrelationships can change over time.
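As a toy illustration of such a constrained optimization, consider the gut-size-vs.-predator-escape example given above. The "benefit" and "speed" functions below are invented purely for illustration; the point is that the best feasible solution is a compromise pushed up against a constraint, not an unconstrained optimum.

```python
# Toy illustration of adaptation as constrained multivariate optimization.
# The fitness function and constraint are invented for illustration only:
# a larger gut digests more, but past a limit the animal cannot escape predators.

def digestion_benefit(gut_size):
    return gut_size ** 0.5          # diminishing returns on gut size

def escape_speed(gut_size):
    return 10.0 - 2.0 * gut_size    # bigger gut -> slower animal

MIN_ESCAPE_SPEED = 4.0              # constraint: below this, predation wins

# Simple grid search over candidate gut sizes.
candidates = [s / 10.0 for s in range(1, 50)]
feasible = [s for s in candidates if escape_speed(s) >= MIN_ESCAPE_SPEED]
best = max(feasible, key=digestion_benefit)

print(f"Best feasible gut size: {best}")  # pushed to the constraint boundary (3.0)
```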

Diets can temporarily change with habitat in the same species. To illustrate the latter point: a species may evolve under a specific diet, then be forced to change the diet (within the implicit range of adaptation) because of habitat changes driven by climate change. As Richard [1995, p. 190] notes:

But environments do change, sometimes quite rapidly, and an animal is not necessarily specialized to eat the foods we see it feeding on now; a species can change its diet (within limits, at least) faster than it can change its teeth.

Staple foods vs. critical but rarer foods. Richard [1995] also notes the difficult problem of determining which selection pressure is more relevant in analyzing adaptation: the consumption of a dietary staple, or of a food that is consumed rarely but that is the sole source of specific nutrients, e.g., vitamin B-12. The above points reflect why examination of all the evidence--morphology, the fossil record, and paleoclimate--is important in assessing adaptation.

Phylogenetic (structural) challenges in comparative studies. The form of digestive features can be affected by non-diet factors. The comparison of animal features using data from individual species that belong to different taxonomic groups presents subtle but important technical (statistical) difficulties in an analysis. As Pagel and Harvey [1988, p. 417] note:

[A]ny comparative relationship obtained from an analysis that includes more than one taxon [i.e., phylogenetic category] may be explained, in principle, by any other factors that vary across those taxa... For example, McNab (1986) pointed out that size-corrected metabolic rates of species of mammals are associated with diet. Elgar and Harvey (1987), however, show that McNab's results cannot easily be separated by phylogeny... If the character of interest varies only at a very high taxonomic level it may be extremely difficult to separate the favored explanation for the character from many other explanations associated with phylogenetic differences.

The basic problem here is that the level of aggregation [grouping] of the raw data used in the analysis interferes with the statistical independence required for analysis. (On top of which, the possible existence of common evolutionary ancestors for the species being compared may make the analysis even more complicated.) Readers interested in the statistical details should consult Pagel and Harvey [1988], and also Elgar and Harvey [1987] (as cited in Pagel and Harvey [1988]).

Difficulty of untangling cause and effect. The importance of the above as it relates to simplistic
comparative "proofs" of diet is that it raises the question of cause and effect. In the comparative "proofs," are all of the observed differences in features really due to diet, or are some of them simply artifacts of cross-taxa (between-group) comparisons? That is, if the features in question may be impinged on, or shaped, at least in part, by things that affect a range of features in that taxonomic class beyond diet itself, then one must be able to sort out the cause and effect relationships, or valid conclusions cannot be drawn.


Counterexamples to the Paradigm of Comparative Proofs of Diet
As briefly explained above, the comparative proofs of diet are subject to a number of logical and structural limitations. To further illustrate this, let's now examine two animals that demonstrate the importance of feeding behavior, and how it can invalidate simplistic comparative proofs that assume the form/function linkage is strict.

Polar Bears: an example of semi-aquatic feeding behavior
The polar bear, Ursus maritimus, physically resembles its relatives, the brown bear and grizzly. Genetically, polar bears evolved from brown bears; see Talbot and Shields [1996], and also Zhang and Ryder [1994], for information on the genetic similarities. A limited comparative study of polar bears vs. brown bears might find some evidence that the polar bear is more carnivorous than the brown bear. However, more importantly, without knowledge of the actual--and, for bears, unusual--feeding behavior of the polar bear, one might conclude from a comparative study that the polar bear "should" be a predominantly terrestrial animal rather than the semi-aquatic animal it actually is. (The primary evidence for aquatic adaptation by the polar bear is webbed feet; however, one could presume that is an adaptation for walking on slippery ice rather than for swimming.)

How aquatic is the polar bear? Stirling [1988] describes an aquatic stalking procedure commonly used by the bears in hunting seals (a major food for them). One type of aquatic stalk is done by swimming underwater, in a stealthy manner, then exploding onto the ice next to the seal (often the seal escapes). In another type of stalk, the polar bear lies down flat on the ice and slides along shallow water-filled channels on top of the ice. Stirling also describes how polar bears capture sea birds by diving underneath them and biting from below.

Garner et al. [1990] fitted 10 female polar bears with satellite telemetry collars in 1986 and tracked their movements on the Arctic pack ice floes, near Alaska and Siberia, for over a year. They calculated a minimum distance traveled by the bears on the shifting pack ice of the Arctic as ranging from 4,650-6,339 km (2,888-3,937 miles) over time periods ranging from 360-589 days. They note [Garner et al. 1990, p. 224], "[B]ears must move extensively to maintain contact with the seasonally fluctuating pack ice."

Thus we observe that polar bears are indeed semi-aquatic in behavior, able to travel long distances over ice and open water. Yet polar bears have almost none of the adaptations for aquatic life that (carnivorous) fully aquatic mammals have. (See Estes [1989] for a discussion of the adaptations of carnivorous aquatic mammals.) The polar bear illustrates how an animal can lead a semi-aquatic life, with very few aquatic adaptations, simply via adaptive behavior. That is, adaptive behavior allows the polar bear to overcome what appears to be a limit (of sorts) on morphology. We will see later that the comparative proofs of diet make the serious mistake of ignoring adaptive human behavior.

The Giant Panda: a "carnivore" on a bamboo diet
The giant panda, Ailuropoda melanoleuca, is a member of the carnivore order, but the diet of the panda is predominantly plant food. Schaller et al. [1989, pp. 218-219] note that, "Pandas may consume plants other than bamboo, and they also eat meat when available (Hu 1981; Schaller et al. 1985)... F. scabrida bamboo was the principal food of most pandas."


Gittleman [1994] provides a comparative study of the panda. Citing Radinsky [1981], Gittleman notes that an analysis of panda skulls and bone lengths finds they are similar to those of other carnivore species. The life history traits, however, are significantly different for pandas (vs. other carnivores). Recall that life history traits are ignored by the comparative proofs of diet.

Let's now briefly consider the characteristics of the panda that suggest a coarse, largely herbivorous diet. Raven [1936], as cited in Sheldon [1975], lists the adaptations as: the lining of the esophagus, thick-walled stomach, small liver, large gall bladder, and pancreas. Gittleman [1994] and Sheldon [1975] both mention the large molars and masseter muscles of the panda (for mastication). Gittleman [1994, p. 458] also mentions the "enlarged radial sesamoids (the so-called panda's thumb) used for pulling apart leaves."

So, a comparative study of pandas, if done in sufficient depth, might suggest that the panda diet probably includes substantial amounts of coarse vegetation. However, without knowledge of the habitat and actual feeding behavior, such a study would not constitute proof that the actual diet of the panda "should" or "must be" close to 100% bamboo. (The radial sesamoids are slim evidence on which to conclude that the diet is almost purely bamboo; one would be figuratively "hanging by the thumbs"--or "hanging by the sesamoids"--on such limited evidence.)

The example of the panda illustrates the problem of "resolution levels." A comparative study might suggest that the panda eats considerable vegetation, but such a study does not have the resolution to "prove" that the panda "must" eat a diet that is 100% bamboo vs. a diet that is 75% bamboo + 25% other foods. This is relevant because there are fruitarian extremists who suggest that the bogus comparative "proofs" they promote show the human gut is specialized for a nearly 100% fruit diet (vs., for example, a diet that is 60% fruit + 40% other foods). Such claims are unrealistic, not only because of the oversimplistic nature of the comparisons themselves, as discussed earlier, but also because of the low resolution of the comparative "proofs."

Polar Bears and Pandas: Epilogue
The examples of the polar bear and panda illustrate that:
• feeding behavior can overcome what appear to be limits in morphology;
• comparative studies have limits, i.e., a resolution level; and
• comparative "proofs" that ignore behavior do so at the risk of being invalid.

Section Summary
Some of the major logical and structural limitations on comparative proofs of diet are:
• Comparative proofs of diet are subjective and indeterminate; i.e., they do not provide actual proof of diet.
• The comparative "proofs" focus only on similarities, and ignore differences.
• The comparative "proofs" ignore the fossil record, evolution, and current knowledge of ape diets.
• The comparative "proofs" assume dietary categories are discrete (distinct), while in nature diets form a continuum.
• The comparative "proofs" ignore the important features that make humans unique in nature.
• Comparative "proofs" assume the form/function connection is strict and necessary. However:
  o The same form can serve multiple, different functions.
  o Subtle changes in form can produce or support significant changes in the function of organs or body systems.
  o The function served by a particular form can vary dramatically, by feeding behavior.
• Analysis of dietary adaptations is non-trivial.


• Some of the differences listed in the various comparative "proofs" may be artifacts of cross-taxa comparisons, rather than being due to differences in diets.
• Polar bears are an example of an animal whose feeding behavior is sharply different from that which would be expected from its morphology.
• Pandas illustrate the low resolution of comparative studies in "proving" diets.

PART 6: What Comparative Anatomy Does & Doesn't Tell Us about Human Diet
Primate Body Size

Link between body size and diet is not strict. To a certain degree, body size may be a predictor of diet in (land) mammals, including primates, as diet and body size are correlated. Note that the relationship is correlation, and not necessarily causation. Contrary to the claims made by some fruitarian extremists, the link between body size and diet is neither strict nor absolute. In primates, body size is, at best, a poor predictor of diet.

Correlation of body size and diet in primates. The concepts behind the claims regarding body size vs. diet are explained by Richard [1995, p. 184]:

Small mammals have high energy requirements per unit body weight, although their total requirements are small. This means that they must eat easily digested foods that can be processed fast; however, these foods do not have to be abundant, since they are not needed in large quantities. Large mammals have lower energy requirements per unit body weight than small ones and can afford to process food more slowly, but their total food requirements are great. This means their food items must be abundant but not necessarily easy to digest.

Richard [1995, figure 5.13, p. 185] also provides frequency distributions of body weights, by diet, for a large set of primates. The frequency distributions indicate that the order of generally increasing body size, by diet, is:
• insectivores,
• frugivores that eat insects,
• frugivores that also eat leaves, and
• folivores.
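Richard's point about mass-specific energy needs follows from standard allometric scaling (Kleiber's law: BMR roughly proportional to body mass raised to the 3/4 power). Here is a quick sketch; the scaling constant used is the usual rough value for mammals, and the masses are illustrative.

```python
# Mass-specific energy needs under Kleiber's law (BMR ~ k * M^0.75).
# k ~ 70 kcal/day is the usual rough constant for mammals (mass in kg);
# only the relative comparison matters here.

def bmr(mass_kg, k=70.0):
    return k * mass_kg ** 0.75    # total basal metabolic rate (kcal/day)

for mass in (0.1, 5.0, 65.0):     # small insectivore, mid-size frugivore, human-size
    print(f"{mass:6.1f} kg: total {bmr(mass):7.0f} kcal/day, "
          f"per kg {bmr(mass) / mass:6.0f} kcal/day")
```

The per-kg figure falls steeply with body size: the smallest animal needs roughly five times more energy per unit weight than the human-sized one, which is why small primates must eat rich, fast-digesting foods while large ones can live on abundant but coarser fare.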

Humans are an exception to the rule. Given the frequency distributions in Richard [1995], the
average weight of humans (65 kg) appears to be at or above the upper limit of the range for frugivores eating leaves, and within the folivore range. This would suggest, if the "rule" relating body size and diet were strict, that the most likely diet for a primate the size of a human would be a diet in which leaves are the primary food, and fruit a minor/secondary food. Of course, that is a sharply different diet from the nearly 100% fruit diet advocated by certain extremists who claim that the body size/diet "rule" indicates a strict fruitarian diet for humans. Richard [1995, p. 186] provides further insight into the alleged "rule": Within broad limits, then, body weight predicts the general nature of a primate's diet. Still, there are exceptions to the rule, and body weight is not always a good predictor even of general trends...


[W]hen finer comparisons are made among the species the rule loses its value altogether. Specifically, similarly sized species often have quite different diets.

Specific human features imply a dramatic breakthrough in diet. Milton [1987, p. 106] also comments on the "rule":

In contrast [to australopithecines], members of the genus Homo show thinner molar enamel, a dramatic reduction in cheek tooth size, and considerable cranial expansion (Grine 1981; McHenry 1982; S. Ambrose, pers. comm.). In combination, these dental and cranial features, as well as an increase in body size, apparently with no loss of mobility or sociality, strongly imply that early members of the genus Homo made a dramatic breakthrough with respect to diet--a breakthrough that enabled them to circumvent the nutritional constraints imposed on body size increases in the apes.

The above quotes from Richard [1995] and Milton [1987] reflect that the body size "rule" is not strict, and that humans are a specific example of a species that has overcome it. Additionally, recall that our preceding section on brain evolution discussed how human energy metabolism is dramatically different from that of all other primates. We humans expend considerably more energy on our brains--energy that is apparently made available by our reduced gut size, which itself is made possible via a higher-quality diet. By changing the internal energy budget to supply our large brains with energy, humans have also overcome the "rule" on primate body size.

Analysis of Head, Oral Features, and Hands

Introduction: the main claims

Oversimplistic either/or views of carnivorous adaptations. The various comparative "proofs" of vegetarianism often rely heavily on comparisons with true carnivores, e.g., lions, tigers, etc. The basic argument made is that meat cannot be a "natural" food for humans because humans don't have the same physical features characteristic of the small group of animals that are considered to be "true" carnivores. However, such arguments are based on an oversimplified and flawed view of adaptation; the underlying assumption is that there is only one--or at least one main set of--physical adaptation(s) consistent with eating meat. Such arguments also ignore the high intelligence and adaptive behavior of humans.

Let's examine some of the claims made in Mills' The Comparative Anatomy of Eating, plus a few additional claims made in a recent extremist fruitarian book. (Out of politeness, and to avoid what some might construe as inflammatory remarks, the latter book will not be identified other than to say that it is a massive plagiarism of the book Raw Eating by Arshavir Ter Hovannessian, a book published in English in Iran in the 1960s.)

Flawed comparisons that pit humans against "true" carnivores. The claims are made that the human body lacks the features of certain carnivores; hence we should not eat meat, as it is not "natural." The claimed differences are as follows; carnivores have, but humans do not have:
• Sharp, pointed teeth and claws for tearing into the flesh of the prey.
• A jaw that moves mostly up and down, with little lateral (side-to-side) movement; i.e., the jaw has a shearing motion.
• A head shape that allows the carnivore to dig into prey animals.
• Additionally, carnivores are different from humans (and herbivores) in that carnivores usually swallow their food whole (humans chew their food). Also, carnivores generally do not have starch-digesting enzymes in their saliva, whereas humans do.

Note: Claims regarding the morphology of the gut (digestive system) are discussed in the next section.


In The Comparative Anatomy of Eating, Mills summarizes the above analysis with the statement that:

An animal which captures, kills and eats prey must have the physical equipment which makes predation practical and efficient.

We shall see that this claim by Mills is incorrect if one interprets it only in morphological terms. Without further delay, let's now examine the above claims.

Logical problems of the comparative proofs
The above physical comparisons are accurate--clearly, humans do not have the jaws, teeth, or claws of a lion. However, to use this information to conclude that humans "cannot" eat meat, or "must have" the same physical traits as other predators to do so, or did not adapt to meat in the diet, is logically invalid and bogus. A short list of the errors in reaching the above conclusion follows:

• Focusing on purely carnivorous adaptations rather than omnivorous ones. One clarification should be made immediately: this paper does not suggest that humans are true carnivores adapted for a nearly pure-meat diet. Although humans might be able to do well on such a diet (e.g., the traditional diet of the Inuit), the focus of this paper is to investigate whether meat can be considered a natural part of the human diet--certainly the paleoanthropological evidence supports that view. Thus the focus here is not on the bogus issue "are humans pure carnivores?" but on "are humans faunivores/omnivores?"

• Invalid black-and-white views. The conclusion above (that meat "cannot" be a natural part of the human diet) is based on a simplistic (incomplete/invalid) view of adaptation. That is, the conclusion is based on the implicit assumption that the specific physical adaptations of the lion, tiger, etc., are the ONLY adaptations that can serve their intended function, i.e., meat-eating. Inasmuch as meat is a small but significant part of the diet of chimps, however--who also lack the carnivore adaptations (sharp teeth, claws, etc.)--we observe that the assumption is obviously false.

• Various oversimplistic assumptions. The analysis is simplistic and makes many of the mistakes listed in the preceding section--i.e., it assumes the form/function linkage is strict, fails to recognize that the same form can serve multiple functions, etc.

• Overlooked differences in adaptive behavior. The analysis ignores critical differences in feeding behavior, i.e., those relating to the hunting/feeding behavior of omnivorous primates (e.g., the chimp) in the wild, which is well known to be different from that of lions and tigers. Also, adaptive behavior (enhanced via human intelligence and technology--tools) allows humans to easily overcome many of the limitations of our physical form and morphology.

• Impact of tool use and language on morphology disregarded. The analysis ignores the impact that human intelligence has had on morphology, specifically the evolutionary effect of technology (stone tools and cooking), as well as possible morphological changes to support language--yet another unique human feature.

• Obvious explanations rationalized away as "illegitimate." A simple, summary answer to the question of how humans can hunt animals and eat meat without the physical adaptations of the lion and tiger is the obvious one implicitly ignored and rationalized away by the advocates of simplistic comparative "proofs": we don't need sharp teeth, powerful jaws, or claws to capture and butcher animals because we have used tools (technology--stone weapons) for that purpose since our inception as a genus ~2.5 million years ago. Over the eons, evolution itself has adapted our physiologies to the results of this behavior along unique lines, quite regardless of the hue and cry over the "illegitimacy" with which these behaviors/skills are regarded by those extremists promoting the bizarre idea that human dietary behavior should be strictly limited to what we could do "naked, without tools." Technology, driven by our intelligence, supports adaptive behavior that allows us to easily overcome the physical limitations that the comparative "proofs" regard (incorrectly) as limiting factors. Along similar lines, we don't need the strong bodies of a lion or tiger because we have something much more powerful: high intelligence, which allowed humans to become the most efficient hunters, and the dominant mammalian species, on the planet.

Examining comparative claims about the head, oral features, and hands


Now let's examine, in the subsections that follow, some of the claims of the comparative proofs (regarding the head, hands, oral features, etc.) in light of knowledge of prehistoric diets and evolutionary adaptation.

Cheek Pouches: An Example of Vague Claims
Before considering more serious subjects, note that one vague claim occasionally made in comparative "proofs" is that humans and herbivores both have cheek pouches, or cheeks whose structure is somehow similar to cheek pouches, whereas carnivores do not. Frankly, the claim that human cheek structure is similar to cheek pouches is vague and hard to evaluate. Note, however, that one point is clear in this matter: humans do not have true cheek pouches as certain monkeys do. Richard [1995, pp. 195-196] notes that:

A cheek pouch is an oblong sac that protrudes when full from the lower part of the cheek. Food is moved between it and the mouth through a slitlike opening.

Richard [1995] also reports that cheek pouches may increase foraging efficiency and may assist in the pre-digestion of starchy foods. However, until those making claims regarding cheek pouches clarify their claims, the importance (if any) of this point cannot be evaluated. Further, the function of the mouth and oral systems (including cheeks) is not restricted to eating--a point that is covered later, and one with particular relevance for humans given our special adaptations for language.

The cheek-pouch claim is included here as an example of how vague some claims made in comparative proofs can be. Extremely vague claims are hard to test and evaluate. Also, one cannot help but get the impression--based on the very vagueness of this claim--that its advocates are simply "reaching" to find any possible explanation, without regard for plausibility, efficiency, supporting evidence, etc.: a feature typical of extremism in general.

Hands vs. Claws
The claim that meat cannot be a natural food for humans because we lack claws ignores the following:

o Simple tool-based technology is part of human evolutionary adaptation. As described above, human intelligence supports the creation and use of technology--stone tools in the case of our early human ancestors. Technology--even stone tools--serves the same function as claws, and humans have used such technology since the very inception of the genus. A human equipped with stone tools is a very efficient hunter, and has no need of claws.

o Powerful synergism of versatile human behavior patterns. The human hunting pattern can be seen as an extension of the non-human primate predation pattern. Butynski [1982, p. 427] notes:

It seems that complex vocal communication, bipedalism and weapon-use are not essential for primates hunting small vertebrates, including mammals. Nevertheless, when the basic predation pattern of non-human primates is supplemented with these unique capabilities, the complete hominid hunting pattern emerges and with it the ability to efficiently hunt and utilize large mammals (Table 2). Evidence for meat-eating by non-human primates and contemporary human hunter-gatherers indicates that, during the evolution of these three components of the hominid hunting pattern, a 30 to 35-fold increase in the consumption of meat occurred--an increase perhaps already evident more than 2.5 million years ago (Fagan, 1974; King, 1976).

Clearly, thanks to technology (even as basic a technology as simple stone tools), humans do not need claws to be effective predators. Instead of claws, we have hands--similar to, yet different from, the hands of the great apes. Humans have fully opposable thumbs, but lack the ape/orangutan adaptations for tree-climbing; the primary such adaptations are curved phalanges, curved metacarpals, and increased length of the fourth digit. See Aiello and Dean [1990, pp. 379-385] for further information on this; also see Hunt [1994] for information regarding the special muscular adaptations of chimps for tree-climbing.

o Special tree-climbing adaptations lacking in humans. Note that the fact that humans lack the important orangutan hand adaptations for tree-climbing, and the chimp muscular adaptations for tree-climbing, directly contradicts the claims of certain fruitarian extremists who allege that humans have evolved for a nearly 100% fruit diet (without the use of tools). Consider that (according to the extremists) we are allegedly adapted to eat a nearly 100% fruit diet, yet we lack the critical adaptations to efficiently climb trees and pick that nearly 100% fruit diet. Something is not right here--specifically, such flawed claims.

o Versatility of the human hand. The human hand is a very versatile system, and its function is not limited to merely picking fruit. The very same hand can caress a lover, feed a person, brandish a weapon, or perform some other function. The application depends on the individual who owns the hand. The fruitarian claim that human hands were designed or evolved primarily for fruit-picking is just silly nonsense.

o Stone tools compensate for lesser human physical capabilities, via intelligence. On a similar note, the use of stone tools (made by the human hand, driven by intelligence) to butcher animals killed for food renders invalid the comparative-proof claim (above) that humans "should" have a head shape that allows them to "dig into" prey animals. Humans dig into prey using stone tools (which, by the way, were razor-sharp) and their versatile hands. Similarly, the (previous) claim by Mills that humans "must have" the same physical features as true carnivores is invalid for the same reasons. In other words, the use of even rudimentary stone tools, and a different feeding behavior, by humans avoids the selection pressures that gave lions, tigers, and other carnivores their characteristic wedge-shaped heads, as well as the other features considered typical of carnivores.

Jaw Joints and Jaw Motion

Oversimplifications in comparative "proofs." One of the claims made in the comparative proofs is that the motion of the jaw joint of a true carnivore is largely a shearing (up-and-down) action, with limited lateral (side-to-side) movement. This is something of an oversimplification. Sicher [1944] reports that certain bears and pandas can and do masticate, or chew, plant foods. Sicher reports that the jaw joint of the carnivore actually provides a screwing motion, which includes both shifting and shearing motions. In cats (and crocodiles) there is very little shifting motion, i.e., the joint motion is mostly shear. However, in pandas the shifting motion is aided by strong masseter muscles, which allow mastication--side-to-side chewing--of plant foods. Similar remarks apply to certain bears. (Although bears are omnivorous and pandas nearly herbivorous, their jaw joint is typical of the order Carnivora.)

Considering detailed features in isolation is misleading. Thus we note that the characterization of jaw motion as "shear-only" in "carnivores" (members of the order Carnivora) is an oversimplification. This is an example of how considering detailed features in isolation from other features can yield misleading results. It also reminds us that some members of the carnivore order are--surprise--predominantly herbivorous. Finally, the remarks regarding hands vs. claws apply here as well: humans never needed super-powerful jaw muscles to tear up prey animals because we used stone tools to cut them up instead.

Teeth
McArdle [1996, p. 174] reports that the best evidence humans are omnivores comes from the teeth:


Although evidence on the structure and function of human hands and jaws, behavior, and evolutionary history also either supports an omnivorous diet or fails to support strict vegetarianism, the best evidence comes from our teeth... In archeological sites, broken human molars are most often confused with broken premolars and molars of pigs, a classic omnivore. On the other hand, some herbivores have well-developed incisors that are often mistaken for those of human teeth when found in archeological excavations.

In previous sections, we have discussed how teeth are a good, but not infallible, indicator of diet. McArdle's remarks should be seen in light of the limits on the tooth/diet connection.

Salivary Glands and Saliva
McArdle [1996, p. 174] reports that our salivary glands "indicate that we could be omnivores." Tanner [1981] reports that humans and the great apes share a class of proteins known as Pb and PPb in the saliva. These proteins help protect tooth enamel from decay due to carbohydrates and/or coarse plant foods in the diet. The presence of these proteins may be an artifact of past evolution, as the diet of humans is significantly different from that of the great apes. It may also reflect the status of humans as, figuratively, "eat nearly anything" omnivores/faunivores.

Teeth and Jaw: Structure and Mastication Muscles
Note: These are discussed together, as the same evolutionary pressures apply to both.

Hominid dental system is small relative to apes and has decreased in size over evolutionary time. The masticatory system of the great apes is larger than that of humans
[Aiello and Dean 1990]. Garn and Leonard [1989 (see quote in the preceding section)], Milton [1987 (p. 106 quote earlier in this section)], and also Leonard and Robertson [1992] report that the dentition of our ancestors decreased from Australopithecus to Homo erectus, coincident with the development of stone tools and increasing consumption of meat (hence decreasing consumption of coarse vegetable foods). Technology--stone tools--permitted our prehistoric ancestors to hunt, kill, and carve up large animals for transport. All of this was done via tools, obviating the need for the typical carnivore adaptations--strong jaws, sharp teeth, claws, powerful muscles for mastication, etc. Another technology--cooking--may have had a significant impact on human dentition as well.

Potential effect of primitive food-processing technology. Brace et al. [1991] present a fascinating hypothesis. Analyzing the evidence from Neanderthal sites of the Mousterian period, circa 200,000 years ago, they note (p. 46):

[W]e repeat the observation that "The important thing to look to is not so much the food itself but what was done to it before it was eaten" (Brace, 1977:199). If that can be accepted, it should follow that the introduction of nondental food processing techniques should lead to changes in the forces of selection that had previously maintained the dentition.

As their analysis continues, they note the following:
o The cooking of food (meat) in earth ovens can be dated back over 200,000 years.
o The climate of the Mousterian region was cold, and prey animals would freeze solid in winter unless eaten immediately.
o Cooking meat in an earth oven thaws out the frozen meat, and in hot weather may offset (within limits) the effects of meat spoilage.

Brace et al. [1991, p. 47] note:


Meat cooked in such a fashion can become quite tender indeed, and in such condition it requires less chewing to render it swallowable than would be the case if it remained uncooked. In turn, this should represent the relaxation of selection for maintaining teeth at the size level that can be seen throughout the Middle Pleistocene. The appearance of the earth oven in the archaeological record, then, should mark the time at which the dental reduction manifest in the Late Pleistocene had its beginning.

o They then tie the further reduction in human dentition to the use of pottery, which allows preparation of soups (on which one can survive even if toothless), and allows fermentation as well.

Universal cultural/technological innovations can reduce/change selection pressures.
Hence, Brace et al. [1991] correlate the (evolutionary) reduction in the size of human dentition to universal cultural innovations--cooking and food processing--which promoted survival and relaxed selection pressures that favor large, robust dentition. (This is an excellent example of the evolutionary behavior/culture feedback loop discussed in a previous section.) The end result of such selection pressure is our modern dentition, which is smaller and weaker than that of our prehistoric ancestors. Milton's comment in Aiello and Wheeler [1995, p. 215] nicely summarizes the situation:

The reduced dentition of early humans indicates that technology had begun to intervene in human dietary behavior, in effect placing a buffer or barrier between human dental morphology and the human gut (and thus selection pressures) and foods consumed.

Appearance of modern human form corresponds with reduced dentition. Brace et al.
[1991, p. 48] note that:

The emergence of "modern" human form was uniformly associated with dental reduction. The differences in tooth size that can be seen between the various living human populations, then, were the consequences of different amounts of reduction from the Middle Pleistocene condition. These in turn can then be associated with the different lengths of time that the forces of selection maintaining tooth size have been modified by "modern" Homo sapiens.

The above material supports the view that technology, in the basic form of stone tools and cooking, may have had a significant impact on the form and structure of human dentition--teeth and jaw muscles. (Cooked food is softer, so it relaxes selection pressures that favor stronger jaw muscles.) Regarding this point, the comparative proofs of diet that fail to consider the impacts of technology and human feeding behavior are therefore incomplete and/or inaccurate.

Other selection pressures on head and oral features: brain size, posture, and language

Selection pressures are multiple and competing rather than solitary. The above material suggests that changes in diet and food-processing techniques relaxed selection pressures for "robust" dentition. However, such factors reflect but one set of selection pressures operating on the human mouth. The mouth is used for more than just eating, after all. Other factors that may impact the evolutionary morphology of the head, mouth, and oral systems include: brain size, consistently upright bipedal posture, and--extremely important--language.

Increasing brain size reduces space available for oral features. The increase in brain size, i.e., encephalization, increases the space required for the cranium (brain vault). This, coupled with the slight posture realignments required for bipedalism, may cause subtle but significant changes in the architecture of the head as a whole. The possibility of impacts from these is mentioned by Radinsky as cited in Hiiemae [1984].


Aiello and Dean [1990] are an indirect reference on this issue--see discussion regarding hominoid mandible and cranium--and consider that a shift in the center of gravity of the head (caused by increasing brain size and bipedal posture) could increase selection pressures for a smaller (lower-weight) mouth and oral systems.

The influence of language on the oral system. Another important selection pressure acting on the
human mouth and oral systems was the development of language. As Milton [1987, p. 106] notes: An innovation such as language could help to coordinate foraging activities and thereby greatly enhance foraging efficiency (see, e.g., Lancaster 1968, 1975). Inasmuch as language greatly increased foraging efficiency, it increased survival and was an important selection pressure. Hiiemae [1984] points out that fully developed language requires a very flexible oral system--jaws, tongue, esophagus, etc. Thus it is no surprise that the human oral system emphasizes flexibility and is quite different from that of carnivores like cats or crocodiles, who have no need of language, and who lack tools and must use sharp teeth and claws instead. The human oral system reflects the evolutionary selection pressures on humans (and not the pressures acting on cats or crocodiles). Further, thanks to technology (tools and cooking) the human dentition was buffered to a certain extent from the diet's underlying content. That is, simple technology enables us to transform foods, as necessary, to better fit our dentition--and per Garn and Leonard [1989] and McArdle [1996], humans have the dentition of an omnivore that eats soft foods.

Selection pressures on humans due to language are unique. Hiiemae [1984] analyzes the process
of chewing and swallowing in mammals, and finds that the process is identical--except in humans. There is a significant difference in the sequence of events in humans, and Hiiemae [1984] describes the difference as (p. 227) "a reflection of a fundamental change." Hiiemae [1984, p. 278] notes: Man is the only mammal in which "communication" has become a dominant oropharyngeal activity. [Oropharyngeal refers to the area between the soft palate and the epiglottis.] Is it not possible that the single most important change in the evolution of the jaw apparatus in hominids has been the development of speech? To go further: many mammals can call (coyotes can "sing") but "speech" involves the exactly patterned modulation of the basic note emitted from the larynx. That patterning is produced by a change in the shape of the air space in the oral cavity and by use of a series of "stops" which involve the tongue, teeth, and lips. [These considerations address] a much more fundamental and wide-ranging question--the relationship between the evolution of speech and the changes in both the anatomy and functional relationships of the structures developed to process food.

Design tradeoffs in evolution of the human oral system due to speech/language. The above
suggests that the development of that unique human feature--language--may have required changes in morphology. Shea [1992] reports that the evolution of speech was associated with changes in the skull base and pharynx, both of which indirectly impact the jaw and dental systems. The uniqueness of the human oral system is explained by Lieberman [1992, pp. 134-135]: The human supralaryngeal airway differs from that of any other adult mammal... Air, liquids and solid food all use a common pathway through the pharynx. Humans are more liable than any other land animals to choke when they eat, because food can fall into the larynx and obstruct the pathway into the lungs... Human adults are less efficient at chewing because the roof of the mouth and the lower jaw have been reduced compared with non-human primates and archaic hominids. This reduced palate and mandible crowd our teeth and may lead to infection because of impaction--a potentially fatal condition before modern medicine.


These deficiencies in the mouth of an adult human are offset by the increased phonetic range of the supralaryngeal airway.

Cziko [1995, p. 182] reiterates the same information as given above from Lieberman [1992], and adds the cogent comments:

But if the design of the human throat and mouth is far from optimal for eating and breathing, it is superbly suited for producing speech sounds... We thus see an interesting trade-off in the evolution of the throat and mouth, with safety and efficiency in eating and breathing sacrificed to a significant extent for the sake of speaking.

Linkage of language and brain development in evolution of the human oral system. Deacon
[1992] notes the seamless integration of language into human nature, and suggests this indicates that language originated long ago. Cziko [1995, pp. 183, 185] mentions the likely language/brain evolution linkage:

The study of our vocal tract also provides hints concerning the evolution of our brain. Obviously, the throat and mouth would not have evolved the way they did to facilitate language production and comprehension while compromising eating and respiration if the brain had not been capable of producing and comprehending language. The importance of language and the advantages it provides us in communicating, coordinating our activities, and thinking [we "think" in our native language] suggests that, in addition to being a product of our evolution, it also played a large part in shaping our evolution, particularly that of the brain.

The above quote suggests that the evolution of the mouth/oral systems was closely related to the evolution of the brain, via the feature of language. The various selection pressures acting on the mouth and oral systems are summarized below.


Human oral system forged as an evolutionary compromise. Thus we observe that dietary changes
and food processing are not the only selection pressures acting on the head, mouth, and oral systems. Our jaw, mouth, and throat are the result of a compromise between multiple selection pressures acting on those structures:
• The structural impact of food pre-processing (stone tools, cooking),
• The oral-system flexibility required to support language, and
• The subtle changes required to support increased encephalization and upright bipedal posture.

Thus we see that the "big picture" here is not so trivial or simplistic as the claims of the comparative proofs that "humans cannot eat meat because they lack claws, powerful jaws, and the sharp canine teeth of lions." It is a pity that people cling to such simplistic claims--real life is more complicated, but also far more interesting!

To summarize:


• Humans do not have claws, razor-sharp teeth, or the other adaptations found in carnivores because, from the very inception of the genus Homo, we have used technology--at first, and at the least, in the form of stone tools--to serve the same functions as claws, sharp teeth, etc. This buffered humans from the selection pressures associated with the development of the features associated with carnivores.
• Additional selection pressures on the human head came from encephalization (increasing brain size), bipedal posture, and the development of language (which required a flexible oral system).
• The typical claims made in comparative proofs of diet that humans are not adapted to meat because of lack of claws, fangs, etc., are fallacious and based on the assumption that only one adaptation is possible for the function. In reality, from the earliest beginnings, humans have employed adaptive behavior and technology to overcome the limits of morphology.

Overview of Gut (Digestive System) Morphology in Primates and Humans
Introduction
After the various claims of the comparative proofs that "humans can't eat meat because they lack claws, fangs, etc.," the next set of claims made concerns the digestive system. Typically, a number of features of the digestive system are compared: stomach (type, relative capacity, etc.), length of small intestine, and so on. (See The Comparative Anatomy of Eating, by Mills, for a longer list.)

Diet categories in comparative proofs are typically narrow. The comparisons made are relatively
simplistic, and the thinking process involved is also usually rather narrow: herbivores vs. carnivores vs. humans (and omnivores are sometimes added to the list). In referring to narrow thinking here, we mean that the comparative proofs neatly define their dietary categories in a narrow way, and ignore the fact that in nature, diets are usually not strict (i.e., "pure" diets are rare). The comparison of digestive system features comprises the second major component of many comparative proofs of diet.

Other shortcomings of typical comparative proofs. Needless to say, the comparative proofs suffer
from many of the limitations identified in previous sections--subjective list construction, assuming the form/function linkage is strict, ignoring the reality of modern knowledge of ape diets, the resolution problem (analysis may be done at too gross a level), and so on.

Preview of this section. Instead of listing the numerous claims of Mills and the other proofs, and
analyzing them directly for possible shortcomings, this paper takes a different approach. First, we'll review the work of Milton [1987], as it nicely illustrates that the human gut is (nearly) unique--even when compared with the great apes. Then we'll review an excellent series of papers: Chivers and Hladik [1980, 1984], Martin et al. [1985], and MacLarnon et al. [1986], which are the most authoritative analyses of gut morphology done to date. Also, Sussman [1987] and Hladik et al. [1999] are discussed, for they provide additional insight into the major papers on gut morphology. Finally, the different definitions of the term "omnivore" are explored, including how those differences have been exploited in an intellectually dishonest way (in my opinion) by a fruitarian extremist.

Humans are unique, and the human gut is (nearly) unique
Since we will be comparing the gut (digestive system) of humans with those of other animals, especially primates, it is appropriate to begin with a comparison drawn from Milton [1987].


Milton [1987, p. 102] notes:

When compared to those of most other mammals, the relative proportions of the human gut are unusual (my calculations, using data from Chivers and Hladik [1980], and Hladik [1967]).

Human gut small compared to apes. Observing that human gut proportions are different from those found in carnivores, herbivores, swine (an omnivore), and even most other primates, including the anthropoid apes, Milton [1987, p. 101] notes that "...the size of the human gut relative to body mass is small in comparison with other anthropoids (R.D. Martin, pers. comm.)." Milton [1987] includes a table (3.2, p. 99) that compares the relative volumes of the different parts of the gut for selected hominid species. In summary form:

    Gut component      Humans     Orangs/chimps
    Stomach            10-24%     17-20%
    Small intestine    56-67%     23-28%
    Colon              17-23%     52-54%

(Percentages are of total gut volume.) These percentages are unscaled, i.e., not scaled for inter-specific differences in body size. Despite this, the figures are useful for comparing patterns of gut proportions, and the general pattern is clear: humans have "large" small intestines, while chimps and orangs have "large" colons. (A small computational sketch of this pattern follows the list below.)

Additionally, Milton [1987] discusses two primates whose gut proportions appear to roughly match those of humans:
• Capuchin monkey (Cebus species), which has a high-quality diet of sweet fruits, oily seeds, and (40-50% by feeding time) animal foods--invertebrates (insects) and small vertebrates [Milton 1987, pp. 102-103].
• The savanna baboon (Papio papio), a selective feeder that searches for high-quality, nutritious foods. [Caution--this remark is based on only one measured specimen for gut proportions.]
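To make the pattern in Milton's table concrete, here is a minimal sketch (the function name is my own, and the mid-range values from the table above stand in for real volume measurements; this is illustration, not study data):

```python
def gut_profile(stomach, small_intestine, colon):
    """Express each gut compartment as a percentage of total gut volume."""
    total = stomach + small_intestine + colon
    return {name: round(100 * value / total, 1)
            for name, value in (("stomach", stomach),
                                ("small intestine", small_intestine),
                                ("colon", colon))}

# Mid-range values from Milton [1987], table 3.2, used as pseudo-volumes:
print(gut_profile(17, 61, 20))   # human pattern: small intestine dominates
print(gut_profile(18, 26, 53))   # orang/chimp pattern: colon dominates
```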

Gut characteristics reflect dietary quality. Like humans, Capuchin monkeys and savanna baboons
make extensive use of their hands for pre-processing of food items. Milton concludes that the similarity in gut proportions reflects adaptation to high-quality diets [Milton 1987, p. 103]: Rather, it appears to represent similar adaptive trends in gut morphology in response to diets made up of unusually high-quality dietary items that are capable of being digested and absorbed primarily in the small intestine. The plasticity (or elasticity) of the human gut--that is, how the proportions can change to accommodate temporary fluctuations in diet--is discussed in Milton [1987]. (The topic will be addressed later here.) She predicts that further studies will (continue to) show that the human gut is dominated by the small intestine, with high variability in colon size due to temporary changes in diet.

Quantitative Analysis of Gut Morphology in Primates and Humans
Introduction
A series of excellent papers--Chivers and Hladik [1980, 1984]; Martin, Chivers, MacLarnon, and Hladik [1985] (hereafter referred to as Martin et al. [1985]); and MacLarnon, Martin, Chivers, and Hladik [1986] (hereafter referred to as MacLarnon et al. [1986])--provides two separate and different quantitative analyses of gut morphology, with cross-reference to diet. Together, these two detailed analyses provide a sharp contrast to the simplistic analyses one typically finds in the comparative "proofs" of diet. The papers of Sussman [1987] and Hladik et al. [1999] are also discussed below, for the supplementary insights they provide.

Note: The above papers are lengthy and provide considerable detailed, technical information. Only a summary of the major points in each analysis is provided here. Those with a specific interest in gut morphology can learn much from the above papers. And frankly, if you do read the papers cited above, you will realize just how simplistic the analyses presented in the typical comparative "proofs" of diet really are.

The research of Chivers and Hladik [1980, 1984]
The two papers of Chivers and Hladik [1980, 1984] provide the first of the two different analyses of gut morphology that will be discussed here. To begin with a brief summary, the major points of the approach in these two papers are:

• GI tracts from numerous species were analyzed. Gastrointestinal (GI) tracts were analyzed from 180 individuals of 78 different species; 112 of the individuals in the analysis were primates.

• Morphology points to 3 basic dietary groupings. Differences in morphology reflect adaptation to different diets. Foods can be classed into 3 groups, by structure and composition:
o Animal matter, including invertebrates and vertebrates. Such foods are easily digested, and a short, simple gut is adequate for such foods [faunivory].
o Fruits--the reproductive parts of plants, including seeds and tubers. Fruits are high-sugar foods, and tubers are high-starch foods that are converted into sugar in digestion; such foods are readily digested and absorbed in the intestines [frugivory].
o Leaves--including the structural parts of plants, i.e., grasses, stems, bark, and gums. These foods require fermentation in a large stomach or the large intestine/colon [folivory].

• Dietary categories reflect a continuum, not sharp divisions. The categories of dietary adaptation described above--faunivory, frugivory, folivory--are not discrete (distinct or black-and-white) categories, but reflect a continuum that stretches from faunivory (animal foods: hard to catch but easy to digest) to folivory (leaves: easy to find but hard to digest), with frugivory an intermediate form (fruit is available, but often only in limited quantities).

• Typical gut of faunivores. Chivers and Hladik [1980] describe the typical gut pattern for a faunivore (p. 338):

The basic pattern of gut structure among faunivores consists of a simple globular stomach, tortuous [containing many twists/bends] small intestine, short conical caecum, and simple smooth-walled colon.

• Folivore gut characteristics. Regarding folivores, the long-chain carbohydrates found in leaves and structural plant parts require bacterial decomposition (fermentation) for digestion and assimilation. There are two primary adaptations for fermentation: chambers in the fore-gut (stomach) or mid-gut (caecum and colon).

• Frugivore guts are variable. Regarding frugivores, Chivers and Hladik [1980, p. 340] note that:

This group contains most primates, but none of them subsist entirely on fruit. All frugivores supplement their diets with varying amounts of insects and/or leaves, but have no distinctive structural specialization in the gut, although its morphology may show considerable variation between species.

Contrast the above real-world information with the position of certain fruitarian extremists who claim that a nearly 100% fruit diet is optimal/most "natural" for humans. In actuality, it appears there is no such thing in nature as a 100%-fruit frugivorous primate.

• Faunivory and folivory are the endpoints, with frugivory intermediate. Pure faunivory and pure folivory reflect the two extremes of adaptation. However, as diets are a continuum, Chivers and Hladik argue that putting all intermediate diets in the (too broad) category of omnivores is inappropriate. They note [Chivers and Hladik 1984, p. 216]:

No mammal mixes large quantities of both animal matter and leaves in its diet without including fruit. Since faunivory and folivory seem to represent contrasting and incompatible [morphological] adaptations, the quantity of fruit in such a mixed diet is always considerable.

The objection of Chivers and Hladik to the term omnivore appears to reflect a different view of the definition of the term than is used by many other authors. This will be discussed later in this section.

• Determining/measuring relevant gut characteristics is non-trivial. Chivers and Hladik [1980] include a discussion of the major challenges involved in measuring the relevant gut parameters--volume, surface area, etc. Additional challenges include: the gut itself is elastic (i.e., it may stretch when you try to measure it), and it consists of irregular shapes which may be complicated by folds or specialized structures, and so on. In the real world, choosing what parameters to measure, and how to measure them, in the area of gut morphology is decidedly non-trivial. The simplistic comparative proofs of diet don't bother to tell you such details.

• Basic raw data collected. For the subject animals, the sizes of stomach, large intestine, and small intestine were all measured in terms of surface area, weight, and volume. Data on overall size of the animal--length and weight--were recorded as well. Chivers and Hladik [1980] include a discussion of the importance and reliability of the various types of measures--surface area, weight, volume--and conclude that volume is the least reliable measure because of confounding factors--i.e., it is hard to distinguish and compare the volume of a fermenting chamber in a folivore versus the volume of absorbing regions in a faunivore.

• Gut differentiation coefficients calculated. Given that the primary gut-morphology specializations related to diet range from a gastrointestinal (GI) tract dominated by the small intestine in faunivores, to a GI tract dominated by fermentation chambers in the stomach and/or cecum or large intestine (colon) in folivores, Chivers and Hladik define and calculate coefficients of gut differentiation as:

    coefficient of gut differentiation =
        (measure of stomach + caecum + colon) / (measure of small intestine)

where measure = one of weight, volume, or surface area. In other words, the numerator in the equation above represents a measure of the structures that tend to dominate in more folivorous species. The coefficient of gut differentiation compares the numerator against (divides it by) the denominator, which is a measure of the feature (small intestine) that dominates in species with more faunivorous (animal-based) diets. The resulting coefficient provides a simple (though crude) measure of gut-morphology specialization. Coefficients of gut differentiation were calculated for each species in the study, and the results plotted by dietary specialization. In regard to the classification of animals by dietary specialization, Chivers and Hladik [1980, p. 377] note:

While recognizing the special significance of the gross dietary categories to which each species can usually be assigned, particularly the most specialized forms, we have tried to avoid any implication that a classification into faunivores, frugivores, and folivores reflects exclusive diets.
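To make the arithmetic concrete, here is a minimal sketch of the coefficient (the function name and sample numbers are mine, purely for illustration; they are not measurements from the study):

```python
def gut_differentiation_coefficient(stomach, caecum, colon, small_intestine):
    """Coefficient of gut differentiation, per Chivers and Hladik [1980].

    All four arguments must use the same measure (weight, volume, or
    surface area). High values mean fermentation structures dominate
    (folivorous tendency); low values mean the small intestine
    dominates (faunivorous tendency).
    """
    return (stomach + caecum + colon) / small_intestine

# Invented surface areas (cm^2), for illustration only:
print(gut_differentiation_coefficient(300, 50, 250, 1200))   # 0.5 -> faunivore-like
print(gut_differentiation_coefficient(900, 400, 1100, 600))  # 4.0 -> folivore-like
```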

• Size scaling to compare species, adjusted according to body length. To be able to compare gastrointestinal tracts across species, the raw data was adjusted for body size--in this case, body length. (Note: depending on the source of data, body length might not be as accurate an indicator as adjusting for body weight--which scales better with the body's metabolic demands. The later research papers of Martin et al. [1985] and MacLarnon et al. [1986], which we'll examine shortly and which are extensions of the research described in Chivers and Hladik [1980, 1984], do use body weight to adjust for size differences between species.)

• Index of gut specialization. Incorporating this scaling adjustment, the GI tracts studied were analyzed for the three different dietary classes (faunivore, frugivore, folivore) to produce a numeric index--an index of gut specialization. (For details on the methodology used to derive the index, consult Chivers and Hladik [1980, 1984].) The index of gut specialization is an attempt to produce an index that reflects the apparent morphological specialization of the animal in question. That is, it is intended to indicate what the animal's morphology suggests the animal's diet might be.

• Dietary index: what the animal actually eats. Having developed the index of gut specialization, which suggests what the animal might eat, Chivers and Hladik then turn their attention to the issue of what the animals actually do eat. As the three main dietary specializations are leaves, fruit, and fauna, they use a triangular plot whose 3 vertices represent 100% faunivory, 100% frugivory, and 100% folivory. Use of a triangular plot provides a nice 2-dimensional representation of the 3 dimensions required. Each animal is plotted in the triangle according to its actual diet. (An updated version of the original graph [Chivers and Hladik 1980] appears in Chivers [1992].)

Chivers and Hladik assert that overlaying an x-axis on the triangular plot provides the best univariate (i.e., one number per animal) summary description of actual diets. Note that because the plot is triangular and a projection of a 3-dimensional plot, the scales are more complicated than in a standard rectangular plot. In this case, the x-axis is defined as:

    x = (% leaves) - (% animals)

The value of the x-axis for each point in the triangular plot of dietary characteristics is used as the dietary index, an approximate indicator of the actual diet of the animal. Note also that the three percentages (fruit, leaves, animal matter) are not independent, as they must add up to 100%.
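As a quick concrete check (the diet percentages below are invented, not taken from the papers), the dietary index is simple to compute once a species' diet breakdown is known:

```python
def dietary_index(pct_leaves, pct_fruit, pct_animal):
    """Dietary index from Chivers and Hladik [1980]: x = %leaves - %animal.

    The three percentages are not independent; they must sum to 100.
    Positive values indicate folivorous tendency, negative values
    faunivorous tendency, and a pure frugivore scores 0.
    """
    assert abs(pct_leaves + pct_fruit + pct_animal - 100) < 1e-9
    return pct_leaves - pct_animal

# Hypothetical diets, for illustration only:
print(dietary_index(70, 25, 5))   # 65  -> strongly folivorous
print(dietary_index(10, 50, 40))  # -30 -> faunivorous tendency
```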


Chivers and Hladik then compare the dietary index values (which indicate actual diet) against gut specialization index values (which indicate what diet would be suggested by gut characteristics) for 8 selected species. The results of the comparison are "good" (i.e., some similarities between the indices of diet and gut specialization). This provides some supporting evidence for the analysis.

A footnote on Chivers and Hladik [1980, 1984]: Human gut morphology
Sussman [1987] describes the analysis of the gut of 6 human cadavers using the measures defined in Chivers and Hladik [1980, 1984]. Analysis of the human gut data using the coefficient of gut differentiation (a measure of gut specialization) placed humans in the frugivore range, along the margin with the faunivore category. However, analysis of the same data using the index of gut specialization (yet another measure of gut morphological specialization) placed humans squarely in the faunivore range. Note that the frugivore classification above came from using the coefficient of gut differentiation, which is an intermediate result in Chivers and Hladik [1980, 1984], hence presumably less desirable (from a certain analytical viewpoint) than the (faunivore) classification achieved using the end result of Chivers and Hladik [1980, 1984], i.e., the index of gut specialization. Also recall that the term frugivore does not mean or imply that a diet of nearly 100% sweet fruit (as advocated by some fruitarians) is appropriate. Recall that all frugivorous primates eat at least some quantities of animal foods, even if only insects. Thus the result that humans appeared to be frugivores by one measure and faunivores by another suggests a natural diet for humans that includes both animal foods and fruits.

The research of Martin et al. [1985]

Improved methodology. The research of Martin et al. [1985] is very important, for it provides a second, different analysis of the (same) data analyzed previously in Chivers and Hladik [1980, 1984]. The analysis of Martin et al. [1985] includes a number of important features/improvements:

• Data for 6 humans, modern Homo sapiens, are included in the analysis.
• All comparisons are done with respect to body weight. Weight is more commonly used in allometric analysis/adjustments than is body length. (Chivers and Hladik [1980, 1984] used the cube of body length for their analysis.) Another advantage of performing analysis based on body weight is that it is closely related to the metabolic energy requirements of each animal (Kleiber's Law again).
• The research uses gastrointestinal (GI) quotients (explained below), similar to encephalization quotients.
• In Chivers and Hladik [1980, 1984], the animals were grouped into dietary categories on an a priori basis prior to quantitative analysis. One can argue that such an a priori classification is inappropriate and introduces bias. In contrast, Martin et al. [1985] allow the data itself to determine the dietary groupings, based on an analysis of gastrointestinal quotients (alone). The groupings are then analyzed for dietary commonality. (In other words, the data themselves determine the dietary groupings--a critically important point.)
• Major-axis line-fitting is used in preference to linear regression, as it may have advantages when both dependent and independent variables are measured with errors.

The major points from Martin et al. [1985], in summary, are:

• Size scaling adjustments done by body weight rather than length. The raw data is allometrically adjusted (scaled) to adjust for body weight. Martin et al. [1985, p. 66] note:

For instance it is now well established that basal metabolic rate scales to body weight in mammals with an exponent value of 0.75 ["Kleiber's law" (Kleiber 1961; Hemmingsen, 1950, 1960; Schmidt-Nielsen, 1972)], and there is new evidence to indicate that active metabolic rate (i.e., total metabolic turnover in a standard time) also scales to body size with a comparable exponent value (Mace and Harvey, 1982). Hence, one might expect any organ in the body that is directly concerned with metabolic turnover to scale to body size in accordance with Kleiber's law.

• GI quotients for 4 different digestive-system components (stomach, small intestine, cecum, and colon) are calculated for each animal in the study by dividing the component's actual size by its "expected" size (the latter projected using Kleiber's Law); the size of the digestive components was measured as surface area. A GI quotient greater than 1 means the size is larger than expected; a value less than 1 indicates a size smaller than expected. (A schematic sketch of this arithmetic follows the next item.)

• Human GI quotient pattern typical of faunivores. Human GI quotients are considerably lower than predicted/expected for all 4 digestive-system components measured. Martin et al. [1985, p. 72] report that:

Calculation of gut quotient values has particular interest in the case of the four average surface areas of the gut compartments determined for six Homo sapiens. It can be seen from figs. 1-4 that man has values of less than one for all four gut compartments, most notably with respect to the cecum: GQ = 0.31; IQ = 0.76; CQ = 0.16; LQ = 0.58

[In the above, GQ is the quotient for the stomach, IQ for the small intestine, CQ for the cecum, and LQ for the colon.] This is a pattern shared with a number of animals relying heavily on animal food ["faunivores" (Chivers and Hladik, 1980)].
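A small sketch may help clarify the quotient arithmetic. The scaling coefficient below is a placeholder (the actual per-organ coefficients are fitted from the cross-species data in the paper), so treat this as a schematic of the method rather than a reproduction of it:

```python
def expected_size(body_weight_kg, coefficient, exponent=0.75):
    """Expected organ size from a Kleiber-style allometric law:
    expected = coefficient * (body weight)^exponent. The 0.75 exponent
    is the classic metabolic-rate scaling exponent cited in the text."""
    return coefficient * body_weight_kg ** exponent

def gi_quotient(actual_size, body_weight_kg, coefficient, exponent=0.75):
    """GI quotient = actual size / expected size for body weight.
    Values > 1: larger than predicted; values < 1: smaller."""
    return actual_size / expected_size(body_weight_kg, coefficient, exponent)

# Per Martin et al. [1985], the six-human averages give quotients below 1
# for every compartment (GQ=0.31, IQ=0.76, CQ=0.16, LQ=0.58), i.e., each
# compartment is smaller than body weight alone would predict.
```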

• Meaningful dietary groupings based on statistical analysis of GI quotients. A dendrogram, or "tree" diagram, based on statistical analysis of the GI quotients for the different animal species in the study was derived in order to determine meaningful dietary groupings according to similarity of GI tracts. The dendrogram for the study can be found in Figure 11, p. 81 of Martin et al. [1985], and includes Homo sapiens. Humans fall into group A2 in the dendrogram, about which Martin et al. [1985, p. 82] comment:

Group A can be characterized as containing numerous mammalian species (primates and nonprimates) that include at least some animal food in their diets. Again, there is a separation into two subcategories (A1, A2), the second of which contains most of the mammalian carnivores and only two primate species--Cebus capucinus and Homo sapiens.

Thus the result of the advanced statistical analysis in Martin et al. [1985] is that humans fall into the faunivore--meat-eater--class, yet again. Note also that the Capuchin monkey, Cebus capucinus, is in the same statistical grouping as humans, thereby confirming the remarks in Milton [1987], discussed earlier in this section, that human and Capuchin monkey gut dimensions are similar.

The research of MacLarnon et al. [1986]

Refinement needed in analytical techniques used in earlier study. The research of MacLarnon et al. [1986] provides an extension and analytical refinement of Martin et al. [1985]. The analytical techniques needed further refinement because:

• Clustering technique is labile. The dendrogram clustering technique used in Martin et al. [1985] was found to be labile, i.e., small changes in the data set analyzed could cause substantial changes in the cluster pattern produced.

• Non-symmetric transformation used in earlier study. The anti-logarithmic transformation applied in computing GI-quotient values in Martin et al. [1985] was non-symmetric and tended to overemphasize the importance of GI components that were larger than expected.

• Crowded plots. The multidimensional scaling (MDS: a dimension-reducing data analysis method) plots produced in Martin et al. [1985] were crowded and difficult to interpret.

MacLarnon et al. [1986] addressed the above issues by performing additional analyses of the Martin et al. [1985] data set:

• The data were reanalyzed in their original, untransformed logarithmic form. This created another problem, however, as it forced the exclusion (from the analysis) of those animals that lacked a caecum (i.e., some faunivores). [The exclusion was necessary because the logarithm of zero is undefined; see the numeric sketch after this list.]
• Primates were analyzed as a separate group, for comparison with the other analyses.
• The MDS was repeated using the original logarithmic data.
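The symmetry point, and the log-of-zero problem, are easy to see numerically; this throwaway sketch uses invented ratios, not study data:

```python
import math

# A compartment twice its expected size vs. one at half its expected size:
ratios = [2.0, 0.5]
print([r - 1 for r in ratios])        # [1.0, -0.5]: raw ratios are asymmetric,
                                      # over-weighting larger-than-expected organs
print([math.log(r) for r in ratios])  # [0.693..., -0.693...]: logs are symmetric

# The problem MacLarnon et al. [1986] had to work around: a species with
# no caecum has a caecum measure of 0, and math.log(0.0) raises ValueError,
# so such species must be excluded from the logarithmic analysis.
```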

The results of MacLarnon et al. [1986] can be summarized as follows.

• Primates-only analysis. In the dendrogram clustering using the transformed data [Martin et al. 1985], humans (Homo sapiens, #42 in the analyses) and Capuchin monkeys (Cebus capucinus, #16 in the analyses) appear together in a group that consists of frugivores and frugivore-insectivores. In the dendrogram clustering using the untransformed (logarithmic) data [MacLarnon et al. 1986], humans and the Capuchin monkey occur together, as an apparent outlier in the clustering. In the MDS analysis using logarithmic data, humans and Capuchin monkeys appear as outliers. In the anti-logarithmic data MDS, humans and Capuchin monkeys appear on the edge of the central cluster of primate species.

• MDS analysis of a more complete data set: humans grouped (again) with faunivores. When a more complete data set (primates plus other mammals, excluding those faunivores with no caecum) was analyzed, humans and Capuchin monkeys were grouped with non-primate faunivores in the MDS logarithmic data analysis. This is clearly illustrated in Figure 5 of MacLarnon et al. [1986, p. 302], where humans and the Capuchin monkey appear in the faunivore group. Note that Martin et al. [1985] report similar results using untransformed anti-logarithmic data, though the results are not as clear as in the logarithmic data set.

• Wild vs. captive specimens. Further analysis showed that including data on captive primates had only a small impact on the results. The primary effect was to rearrange the frugivores and frugivore-insectivores, which are the most labile species in the analysis.
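For readers unfamiliar with the two techniques, here is a minimal sketch (not the authors' code; the data matrix is random placeholder data standing in for per-species log quotients) of how a dendrogram and an MDS projection might be produced with standard Python tooling:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from sklearn.manifold import MDS

# Placeholder data: rows = species, columns = log GI quotients for
# stomach, small intestine, caecum, and colon. A real analysis would
# use the measured values from Martin et al. [1985].
rng = np.random.default_rng(0)
log_quotients = rng.normal(0.0, 0.5, size=(20, 4))

# Hierarchical-clustering dendrogram (the approach found to be labile):
tree = linkage(log_quotients, method="average")
dendrogram(tree, no_plot=True)  # set no_plot=False to actually draw it

# MDS projection into 2 dimensions (found more robust for this data set):
embedding = MDS(n_components=2, random_state=0).fit_transform(log_quotients)
print(embedding.shape)  # (20, 2): one 2-D point per species
```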

Conclusions. MacLarnon et al. [1986] conclude that:

• The use of logarithmic quotients is preferable to the use of anti-logarithmic quotients in MDS analyses.
• MDS analysis techniques are more robust (for the subject data set) than dendrogram-based clustering techniques.
• Human GI tract shows possible faunivore adaptations. From MacLarnon et al. [1986, p. 297]:


...[T]his being the case, the new evidence from the approach using logarithmic quotient values (Fig. 1, 3 and 5) is particularly interesting in that it suggests a marked departure of Cebus [Capuchin monkey] and Homo [humans] from the typical pattern of primates lacking any special adaptation for folivory...in the direction of faunivorous non-primate mammals.... 5. Use of logarithmic quotient values for clustering purposes suggests that Cebus and Homo possess gastrointestinal tracts that have become adapted in parallel to those of faunivorous mammals, with notable reduction in size of caecum relative to body size. Nevertheless, because of the artificiality of most modern human diets, it cannot be concluded with confidence that the small human sample examined to date reflects any "natural" adaptation for a particular kind of diet. The results obtained so far are suggestive but by no means conclusive.

Thus the research of MacLarnon et al. [1986] suggests, but is not (by itself) conclusive proof, that the human GI tract is adapted for the consumption of animal foods.
Assessing Martin et al. [1985], MacLarnon et al. [1986], and related studies

Further study still needed to confirm results. Although very impressive in their scope and sophistication, the results of Martin et al. [1985] and MacLarnon et al. [1986] that relate to humans specifically are based on limited data (6 individuals). Further study is appropriate before sufficient evidence is available to definitely class humans as faunivores or frugivores based solely on gut morphology. Some of the reasons for caution regarding the study results are as follows:

• Small sample size. Some species used in the study (although there were 180 individuals total from 78 different species) have only a single specimen--i.e., small sample size for those species.

• Gut dimensions can vary in response to current diet. The gut dimensions of animals can vary significantly between wild and captive animals (of the same species, of course). Gut dimensions can change quickly (in captivity or in the wild) in response to changes in dietary quality. For information on this topic, consult Hladik [1967] as cited in Chivers and Hladik [1980]; also the following sources cited in Milton [1987]: Gentle and Savory [1975]; Gross, Wang, and Wunder [in press per citation]; Koong et al. [1982]; Miller [1975]; Moss [1972]; and Murray, Tulloch, and Winter [1977].

• Knowledge of human gut variability is limited. More information is needed on how human gut dimensions vary with diet. Milton [1987, p. 101] notes:

The size of the present-day human small intestine could be an ancient or a relatively recent trait. Indeed, it is not known whether all modern human populations show such gut proportions.

• Gut structure and gut transit times. The analyses of Martin et al. [1985] and MacLarnon et al. [1986] are limited to measures of structure, and do not include gut transit times. Note, however, that data on gut transit times for wild animals will likely be difficult to obtain.

• Different analytical techniques, different results: the confusion factor. The papers discussed so far in this section provide a range of different analytical techniques, and the results for humans are not consistent across the different techniques. This can be confusing to readers, as one part of the analysis suggests humans are probably faunivores, while another suggests humans might be frugivores.

We have presented the results of these papers in some detail here, so the reader can get a feel for the overall thrust of the analyses. The basic result appears to be that the anatomy of the human GI tract shows what appear to be adaptations for faunivory (consumption of animal foods), regardless of whether humans fall into the faunivore or frugivore class. This leads us to the next paper on gut morphology to be discussed here.


The note of Hladik et al. [1999]
A recent note, Hladik, Chivers, and Pasquet [1999] (referred to hereafter as Hladik et al. [1999]), provides further information on gut morphology. The Hladik et al. [1999] paper must be understood in the context of the 4 major papers that preceded it, as well as the fact that it is a comment on the Expensive Tissue Hypothesis of Aiello and Wheeler [1995], which was discussed in a previous section. The context is important because a fruitarian extremist has already quoted parts of Hladik et al. [1999] out of context, and--in my opinion--misrepresented the meaning thereof.

Humans are frugivores by one measure, faunivores by another. Let's look at some
relevant quotes from the note. Hladik et al. [1999] includes a version of a figure from Chivers and Hladik [1980], with the comment [Hladik et al. 1999, p. 695]:

Some of these data are reported here (fig 1), combined with one of the original figures [Chivers and Hladik 1980] illustrating the reduced axis [line fit] for three major dietary tendencies of primates and other mammals (folivore/frugivore/faunivore). The human specimens fall, as expected, on the main axis for frugivores, a gross category corresponding to fruit and seed eaters (or omnivores, when occasional meat eating is practiced).

Examination of figure 1 from Hladik et al. [1999] suggests that it is a version of figure 26 from Chivers and Hladik [1980]. The figure shows the relationship between body size (horizontal axis) and area of absorptive mucosa (vertical axis). Hladik et al. [1999] report that the human specimens fall into the frugivore range. However, Sussman [1987] analyzed 6 human cadavers and found that their intestinal surface areas fell into the frugivore range using one measure, but the faunivore range using another measure (specifically: figure 26 from Chivers and Hladik [1980]; see also figure 9.8 in Sussman [1987, p. 176]). Thus, once again, the classification of humans by gut morphology is not consistent: humans appear to be frugivores per Hladik et al. [1999], but faunivores by a similar (perhaps the same kind of) measure per Sussman [1987]. If one considers frugivore as a gross category that includes omnivores, then humans might indeed fall into that category, as the sum total of current evidence suggests that humans (and Capuchin monkeys) are (figuratively) where the faunivore and frugivore classes "meet."

Gut surface areas might not support Expensive Tissue Hypothesis. From Hladik et al.
[1999, pp. 696-697]: A specialized carnivorous adaptation in humans that would correspond to a minimized gut size is obviously not supported by our data (fig. 1). The large variations in human diets (Hladik and Simmen 1996) are probably allowed by our gut morphology as unspecialized "frugivores," a flexibility allowing Pygmies, Inuit, and several other populations, present and past, to feed extensively on animal matter... The first sentence above, re: carnivorous adaptation, must be understood in context: as a comment on the Expensive Tissue Hypothesis. It claims that there is no major change in gut surface areas as the Expensive Tissue Hypothesis suggests. It does not mean there is absolutely no adaptation to faunivory: the major adaptation to faunivory in humans was previously identified as a reduction in size of the caecum and colon, per Martin et al. [1985] and MacLarnon et al. [1986]. The above quote does not contradict the 1985 and 1986 papers.

Humans fail on raw, ape-style frugivore diets, but thrive on faunivore diets. The
second part of the above quote is very interesting. Let's consider the raw-food, ape-style frugivore diets that various dietary advocates claim are "optimal" for humans. The fruitarian diet is a diet of predominantly raw fruit, with some leaves and seeds. It is a strict vegetarian frugivore diet.


However, extensive anecdotal evidence (the only evidence available) indicates that the fruitarian diet is a massive failure. There are very few long-term success stories on fruitarian diets, and many of the claims of success are not credible.

If we consider an arguably more natural frugivore diet, one of raw fruit and raw meat/animal foods, we have the instinctive eating diet, or anopsology. That diet does have a few credible long-term success stories, but only very few. As well, the few long-term success stories known to this writer are of individuals who consume significant quantities of (raw) animal foods (i.e., in caloric terms they are probably closer to faunivore than frugivore).

The point here is the obvious irony: humans might be frugivores, but in fact humans fare poorly, and do not succeed in the long term, on ape-style frugivore diets. In sharp contrast to the failure of ape-style frugivore diets, the example of the Inuit shows that humans can succeed and even thrive, long-term, on a diet similar to a faunivore diet. This, of course, brings us right back to the key question: should we class humans as frugivores or faunivores?

Insects (or comparable animal foods) a part of natural human diet. Ultimately, Hladik
et al. [1999] directly acknowledge that the natural human nutritional requirements call for some animal foods (from p. 697, emphasis below is mine): There is no doubt our species needs a rich diet to cover large energy expenses, but it requires relatively no richer a diet than many Cebidae and Cercopithecidae feeding on sweet fruits complimented by the protein and fat of a large proportion of insects.

Postscript: humans are where faunivore and frugivore "meet." The Hladik et al. [1999] note
reminds us that gut morphology alone probably won't settle the issue of whether humans should be classed as frugivores or faunivores. Indeed, it might not be possible to unequivocally "prove" humans are in one class but not the other. One point is clear, though: gut morphology suggests that humans are where the classes of faunivores and frugivores "meet," i.e., it suggests that animal foods (even if just insects) are a part of the natural human diet.

On the term "omnivore," and misuse of quotes
Definition of "omnivore" a critical point
Out-of-context quotes from the writings of D.J. Chivers are sometimes used by fruitarian extremists in support of their claims that humans are not omnivores, but are instead adapted to a strict vegetarian/fruit-based diet. From the previous material in this section, recall that the research of Chivers and associates suggests that human gut morphology is similar to that of faunivores, i.e., meat-eaters (which suggests that animal foods are part of the natural human diet), and does not support the claim that humans are adapted for a nearly 100% fruit diet, or even a 100% vegetarian diet. One major factor in understanding the quotes from Chivers regarding omnivores is that he uses the word in a more precise and different way than the most common usage. To understand this point, let's now examine the different definitions of the term "omnivore."


Omnivore: the common definition. The most common usage of the term omnivore is to indicate an
animal that includes foods from more than one trophic level in its diet, which is usually interpreted to mean an animal that eats both plant and animal foods. From Milton [1987, p. 93]: Humans are generally regarded as omnivores (Fischler 1981; Harding 1981). By definition, an omnivore is any animal that takes food from more than one trophic level. Most mammals are in fact omnivorous (Landry 1970; Morris and Rogers 1983a, 1983b)... Milton [1987] goes on to note that the term omnivore is vague as there is substantial variability in the foods omnivores eat, and the term is not linked to gut morphology. Thus saying a mammal is an omnivore tells one little about the actual diet of the animal.

Omnivore: Chivers uses the term differently. Let's review some of the relevant quotes. From
Chivers [1992, pp. 60-61]: [F]or anatomical and physiological reasons, no mammal can exploit large amounts of both animal matter and leaves, the widely used term "omnivore" is singularly inappropriate, even for primates. Humans might reasonably be called omnivores, however, as a result of food processing and cookery... Because of cooking and food processing, humans can be described as the only omnivores, but we shall see that their [human] gut dimensions are those of a faunivore.

"Omnivore" a vague term lacking in relevance for GI tract functions. A further relevant quote
is from Chivers and Langer [1994, p. 4]; emphasis below is mine: The concept of omnivory is weakened by the anatomical and physiological difficulties of digesting significant quantities of animal matter and fruit and leaves... animal matter is swamped in a large gut, and foliage cannot be digested in a small gut. A compromise is not really feasible... Humans are only omnivorous thanks to food processing and cookery; their guts have the dimensions of a (faunivore) carnivore but the taeniae, haustra and semi-lunar folds are characteristic of folivores. Among the so-called omnivores, most eat either mainly fruit and animal matter (if smaller) or fruit and foliage (if larger) but not all three. Thus we note that Chivers appears to define an omnivore as a general feeder with a gut morphology that supports a diet that includes significant amounts of all three types of foods: fruits, leaves, and animal matter. Such a gut morphology is not found in mammals, hence the term is indeed inappropriate for mammals.

Contradictory claims about omnivores: which is correct? Thus we have what appear to be
contradictory statements: most mammals are omnivores; no mammal is an omnivore. Which is correct? The answer is that both are correct, because they are using different definitions of the term "omnivore." Chivers' criticism of the common definition of the term "omnivore" is relevant: it would be better (more precise) to use terms that are linked to gut morphology: folivore, frugivore, faunivore. However, that does not mean that those who are using the common definition are making incorrect or invalid statements. Recall that a definition is simply a convention that people follow. While it is desirable that definitions possess analytical rigor, it is not a requirement that they do so. Hence the meaning of a statement like "chimps are omnivores" or "humans are omnivores" is clear, i.e., the natural diet of humans and chimps includes both animal and plant foods.

A fruitarian extremist has used the difference in definitions of the term "omnivore" to suggest that statements like "chimps are omnivores" are incorrect and irrelevant. Because the meaning of such statements is clear (even to those who support Chivers' remarks), it is my opinion that the fruitarian extremist is engaging here in a blatantly intellectually dishonest word game in an effort to distract attention from the well-known fact that animal foods are a significant (even if small) part of the natural diet of many primates.


More examples of out-of-context quoting by dietary extremists. Those who use (some or all of)
the above quotes from Chivers' writings in support of the fallacious claim that humans evolved on a strict vegetarian/fruit diet often neglect to quote the following, from one of the same source articles, e.g., Chivers [1992, pp. 60, 64]: Exclusive frugivory is practically impossible, because certain essential amino acids and other nutrients are found only in leaves or in animal matter... Humans are on the inner edge of the faunivore [meat-eater] cluster, showing the distinctive adaptations of their guts for meat-eating, or for other rapidly digested foods, in contrast to the frugivorous apes (and monkeys). The first part of the above quote indicates how unlikely the bogus claims that humans evolved as fruitarians (on very low-protein diets) really are. The second part of the above quote points out that human gut morphology is more similar to that of faunivores than the fruit-diet frugivores. Finally, readers should be aware that a more recent paper, Chivers [1998], includes quotes similar to the above. This is mentioned in the event the fruitarian extremist in question here might decide to "update" their argument by utilizing more recent quotes from Chivers than the ones above.

Misuse of other quotes
Speaking of intellectual dishonesty and misuse of quotes, another quote on gut morphology that is sometimes misused by fruitarian extremists is from Elliot and Barclay-Smith [1904], as cited in Stevens and Hume [1995, p. 112]: There can be little doubt that the human colon is rather of the herbivorous than carnivorous type. By itself, the above quote may seem straightforward enough. However, now compare the above to a more complete version of the same quotation in full context, from Stevens and Hume [1995, p. 112] (italicized emphasis mine below): They cited a study of 1,000 Egyptian mummies that indicated that their cecum was considerably larger than that of present-day humans. Therefore, in spite of reservations about deducing function from structure, these investigators concluded that, "There can be little doubt that the human colon is rather of the herbivorous than carnivorous type." As previously mentioned, human gut dimensions can vary with diet. The diet of ancient Egyptians would likely be much higher in fiber than a modern Western diet of processed foods. See Milton [1987] for more information on this topic. The above questionable use of a quote is yet another example (in my opinion) of intellectual dishonesty by a fruitarian extremist. The moral of the story here is that checking the references can be critical, particularly where only a few references have been (perhaps selectively) quoted.

Section summary and synopsis
Although by comparative anatomy analysis (alone) the issue is not yet settled, the results of two different statistical analyses of a "large" data set on gut morphology and diet (i.e., the best available scientific evidence) support the idea that animal foods are a natural part of the human diet. That is:

• Humans are faunivores or frugivores adapted to a diet that includes significant amounts of animal foods.

• The morphology of the human gut does not correspond to that expected for a nearly 100%-fruit frugivore, as claimed by various fruitarian extremists.


Finally, the simplistic analyses of gut morphology found in the various comparative proofs of diet are (badly) outdated.

PART 7: Insights about Human Nutrition & Digestion from Comparative Physiology
Preface. The objective of this section is to investigate the topic of comparative physiology for any insights
it might provide regarding the natural diet of humanity. Also of interest will be information that can be gleaned about the relative efficiencies and inefficiencies in the availability and absorption of certain key nutrients depending on their source in the diet (i.e., plant or animal foods). To a certain degree, this topic is already discussed in the article Metabolic Evidence of Human Adaptation to Increased Carnivory on this site. This section serves to summarize the main points of that article, and to provide further information and discussion (which in some cases is extensive), as appropriate. Due to the complexity of the issues, some of the subsections below are relatively detailed (and lengthy). This is necessary to establish a basis for discussing the implications physiology has regarding diet. If you find that a particular subsection is not of interest (or too technical, though I have tried to make the material as accessible as possible), you might want to advance to the next subsection.

Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism

Vitamin B-12: Rhetoric and Reality

Introduction: standard information Vitamin B-12 (cobalamin) is an essential nutrient required for the synthesis of blood cells and the
myelin sheath of the central nervous system. Only tiny amounts are needed to sustain health, and a healthy body is efficient at recycling B-12. Additionally, the body maintains a sizable store of the vitamin. The recommended daily allowance/input, or RDA/RDI, for vitamin B-12 is as follows, from NRC [1989, p. 162]: Adults. A dietary intake of 1 mcg daily can be expected to sustain average normal adults. To allow for biological variation and the maintenance of normal serum concentrations and substantial body stores, the vitamin B-12 RDA for adults is set at 2.0 mcg.
Vitamin B-12 is made only by bacteria; it is not synthesized by plants or animals. The very limited (usually only trace) amount of B-12 in plants comes from uptake of the vitamin from the soil, and from surface contamination with B-12 producing bacteria. (This is discussed in detail below.) Animals concentrate B-12 from the food they eat, and, in the case of folivores, biologically active B-12 may be produced by bacteria in the fermenting chambers in the digestive system. The end result of this is that plant foods provide little (if any) B-12, and animal foods are the only reliable food sources for B-12. Because plant foods are deficient in B-12, the use of B-12 supplements (or fortified foods, e.g., nutritional yeast) is recommended for vegans by most nutrition experts. Of course, some extremists disagree, and one must seriously question whether such "experts" (or "diet gurus") are putting dietary dogma ahead of the health and welfare of those who follow their advice.


The primate connection
B-12 is also an essential nutrient for non-human primates. From Hamilton and Busse [1978, p. 763]:
Many captive primate species enter into hypovitaminosis B-12 [deficiency] when maintained on vegetarian diets (Hauser and Beard 1969, Oxnard 1964, 1966, 1967, Siddons 1974, Siddons and Jacob 1975)... Vitamin B-12 is the least readily available vitamin to omnivorous primates... Deficiency diseases have not been identified for any wild primate population (Kerr 1972, Wolf 1972). The fact that wild primates avoid B-12 deficiency suggests that their natural diet provides adequate B-12. Inasmuch as all primates eat insects, and some insects contain B-12 (see Wakayama et al. [1984]), this suggests insects as a possible B-12 source for some primates, along with production by fermentation bacteria as a possible source for folivorous primates (e.g., gorillas, whose consumption of insects is very small when compared to their intake of plant foods).

B-12 recycling and interactions To a great extent, B-12 is recycled from liver bile in the digestive system. This is one reason why
vitamin B-12 deficiency is rare among vegans, even those who do not use supplements or supplemented foods. The recycling is summarized by Herbert [1994, p. 1217S]: The enterohepatic circulation of vitamin B-12 is very important in vitamin B-12 economy and homeostasis (27). Nonvegetarians normally eat ~2-6 mcg of vitamin B-12/d and excrete from their liver into the intestine via their bile 5-10 mcg of vitamin B-12/d. If they have no gastric, pancreatic, or small bowel dysfunction interfering with reabsorption, their bodies reabsorb ~3-5 mcg of bile vitamin B-12/d. Because of this, an efficient enterohepatic circulation keeps the adult vegan, who eats very little vitamin B-12, from developing B-12 deficiency disease for 20-30 y (27)... Unlike the vegetarian whose absorption machinery is normal, the person whose absorption machinery is damaged by a defect in gastric secretion, by a defect in pancreatic secretion, or by a defect in the gut that produces intestinal malabsorption will develop vitamin B-12 deficiency in 1-3 y because these absorption defects block not only absorption of food vitamin B-12, but reabsorption of vitamin B-12 excreted into the intestinal tract in the bile (2,6).
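To get a feel for the time-scales in the Herbert quote above, here is a minimal back-of-envelope sketch in Python. The body-store figure (~2,000-3,000 mcg) and the net daily loss for a vegan with intact reabsorption (~0.1-0.3 mcg/day) are illustrative assumptions of mine, not values from Herbert [1994]:

```python
# Rough arithmetic behind the quoted 20-30 year depletion time-scale.
# ASSUMED illustrative values (not from Herbert [1994]): adult body
# stores of ~2000-3000 mcg of B-12, and a net daily loss of only
# ~0.1-0.3 mcg when enterohepatic recycling reabsorbs most bile B-12.

def years_to_depletion(store_mcg: float, net_loss_mcg_per_day: float) -> float:
    """Years until a body store is exhausted at a constant net daily loss."""
    return store_mcg / net_loss_mcg_per_day / 365.0

for store in (2000.0, 3000.0):      # assumed initial body store, mcg
    for loss in (0.1, 0.3):         # assumed net daily loss, mcg/day
        print(f"store {store:.0f} mcg, net loss {loss} mcg/day -> "
              f"~{years_to_depletion(store, loss):.0f} years")

# The output spans roughly 18-82 years, bracketing the 20-30 y figure
# quoted above. With damaged absorption (no reabsorption of bile B-12),
# net loss rises to several mcg/day and the time-scale collapses to 1-3 y.
```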

Reduction in stomach acid promotes B-12 deficiency. A reduction in gastric (stomach) acids is associated with the development of bacterial colonies in the stomach that produce analogues of vitamin
B-12, which can accelerate or promote B-12 deficiency. From Herbert et al. [1984, p. 164]: As pernicious anemia develops, the first loss usually is of gastric acid. Figure 3 (from Drasar and Hill, 23) shows that the achlorhydric stomach [one unable to produce hydrochloric acid] is usually heavily colonized with enteric bacteria. The increased colonies of enteric bacteria in the achlorhydric stomach and small intestine of the pernicious anemia patient may produce analogue which may in three ways accelerate the development of B-12 deficiency. The loss of gastric acids may also occur in iron deficiency. The iron in plant foods is of much lower bioavailability than in animal foods. The common grain-based vegan diet contains antinutrient factors that may inhibit iron absorption (discussed later in this section). Vegetarians, especially vegans, are at higher risk of iron deficiency. From Herbert [1994, p. 1215S]: ...iron deficiency is twice as common in vegetarians as in omnivores (3)... Prolonged iron deficiency damages the gastric mucosa and promotes atrophic gastritis and gastric atrophy, including loss of gastric acid and I.F. [intrinsic factor] secretion, and therefore diminished vitamin B-12 absorption (3, 4, 19). This would cause vitamin B-12 deficiency in twice as many vegetarians as omnivores (3, 4, 19).

Does strict fruitarianism accelerate B-12 deficiency?
Note that loss of gastric acid, whether caused by an iron deficiency or other cause, may promote B-12 deficiency. In this regard, the anecdotal experience of some fruitarians may be relevant. (Please note that the following remarks regarding fruitarianism are based on anecdotal evidence, as that is all that is available; there are no published, peer-reviewed studies on fruitarians on this topic.)

Possible reduction in stomach acid in long-term fruitarians. Among the very few people who
manage to follow the fruitarian diet with at least a limited degree of success for longer than a year (i.e., manage to stay on the diet, without extensive binges or cheating--both of which are quite common in fruitarianism), some report that when they try to eat other food again (i.e., foods other than fruits), they are unable to digest it. Food may be eaten and pass through the digestive system, coming out as stools, yet appearing almost exactly as it did when consumed. Typically this effect is observed with protein foods or cooked food, but it may occur with other foods as well. (Surprisingly, the effect may also occur with fruit, if one stays on the diet long enough.) The above phenomenon (which this writer personally experienced after abandoning fruitarianism and returning to a more normal vegetarian diet) suggests a possible deficiency of gastric acid caused by the fruitarian diet. (In time, with a more normal diet, the digestive system appears to recover.) If so, it suggests that fruitarian diets, if they decrease gastric acid, may actively promote B-12 deficiency over and above any effect due to lack of B-12 in the food. (This is a hypothesis at present, of course, and would need research to validate it).

Other negative symptoms of fruitarianism. Many who attempt strict fruitarian diets report other
negative symptoms--intermittent fatigue, lassitude, loss of libido, sugar metabolism problems (diabetes-like symptoms: excess urination, thirst, mood swings, etc.), as well as emaciation. (Note: thirst on a fruitarian diet may seem counterintuitive, since the level of fluid intake is so high. However, the diuretic effect of some fruits [such as citrus], if used as staples, can negate that, depending on the case.) The possibility of fruitarianism increasing the potential for a B-12 deficiency raises the question whether the fatigue, loss of libido, and lassitude that some fruitarians experience are possibly related to B-12 deficiency, or alternatively (or both), to the sugar metabolism/emaciation-starvation effects of the diet.

Clinical B-12 deficiency rare. Doubtless some fruitarian advocates will challenge the above and ask
where are the fruitarians with clinical signs of B-12 deficiency? They are hard to find, for the following reasons:

• Fruitarians are quite rare, and ones who are strict are even rarer. Eating-disorder behavior, especially binge-eating, is quite common in fruitarianism. A 1997 survey (unpublished, conducted by Craig Woods Schiemann) of members of SF-LiFE, a raw-food support group in San Francisco, found that over 50% of members surveyed reported problems with binges and cravings. (Side note: thanks to Craig for the survey information.) Indeed, even some of the extremists who promote fruit as the ideal diet admit that they binge-eat and can't follow (strictly) their allegedly "ideal" diet.

• Binge-eating, of course, may help prevent B-12 deficiency if foods containing B-12 are consumed in binges or as "exceptions" (also commonly known as "cheating," if done in secret).

• Most fruitarians oppose conventional medicine and refuse to take blood tests, i.e., refuse to have their serum B-12 levels measured. The lack of B-12 tests makes it impossible to validate the B-12 status of fruitarians.

Finally, there are a few sensible, credible fruitarians (though more loosely defined) who specifically include small amounts of animal products in the diet to satisfy B-12 requirements.

Vitamin B-12: Rhetoric and Reality (CONT., 2 OF 5)


Vitamin B-12 deficiency in natural hygienists
Dong and Scott [1982] took blood samples at the 1979 annual convention of the American Natural Hygiene Society (ANHS), and tested the samples for serum B-12 levels and other parameters of interest. A total of 83 volunteers provided blood samples. Each individual in the study provided detailed dietary information via a survey form, which asked about the individual's consumption of animal foods (including eggs and dairy), and also asked for a typical daily diet.

Description of natural hygiene diet, and diets of those in survey. For readers unfamiliar with the
term "natural hygiene," in this context the classical definition refers to a predominantly raw diet of fruits, vegetables, nuts, and seeds. Note that the data were collected in 1979, at which point the ANHS had a long history of strongly emphasizing raw plant foods. Since then, the ANHS has revised their position. While still emphasizing plant foods (veganism), the stress previously placed on raw foods has been deemphasized; and adding cooked starches (grains, legumes, tubers, squashes) to the diet is now recommended. Based on the dietary data provided in the survey form, subjects were classified according to their diet, with vegans being defined as those who consumed no animal foods, lacto-vegetarians as those who consumed dairy as their only animal food, lacto-ovo-vegetarians as those who consumed dairy and eggs, and semivegetarians as those who consumed animal flesh foods two or less times per month. It should be noted here that because the subjects were recruited at an ANHS convention, and the ANHS emphasized raw foods at the time, at least some of the subjects in the vegan category presumably were raw or predominantly raw vegans.

Serum B-12 levels of vegan natural hygienists below lower limit of normal range. Dong and
Scott [1982, pp. 214-215] report: Among subjects who did not supplement their diets with B-12 or multiple vitamin tablets, 92% of the vegans, 64% of the lacto-vegetarians, 47% of the lacto-ovo-vegetarians and 29% of the semi-vegetarians had serum B-12 levels less than 200 pg/ml [the lower limit of the normal range]. Mean serum B-12 levels of the dietary groups increased with increasing dietary sources of B-12... Some cases of mild macrocytosis [pernicious anemia] were seen among the vegetarians, but fewer than expected... The data indicates that increasing diversity of animal products consumed increased the serum B-12 level. Dong and Scott [1982, p. 210] report the average serum B-12 levels found as shown below. In the following, note that the normal serum B-12 level is 200-900 pg/ml.

Mean Serum B-12 Levels (pg/ml) Observed at 1979 ANHS Annual Convention, from Dong and Scott [1982]

DIETARY CATEGORY         Males   Females
Vegan                      120       110
Lacto-vegetarian           200       180
Lacto-ovo-vegetarian       190       260
Semi-vegetarian            360       240
Non-vegetarian             360       660

Note that the B-12 serum levels observed for the vegans are in the range regarded as deficient [Dong and Scott, pp. 213-214].

Vitamin B-12 levels in "living foods" and other vegan/vegetarian diets

B-12 in living foods (raw vegan) diets. Rauma et al. [1995] provide data on the vitamin B-12 status of living-fooders, i.e., individuals who follow the raw vegan "living foods" diet as taught by Ann Wigmore. Such a diet emphasizes raw sprouts, fermented foods, blended/liquefied foods (raw soups), dehydrated foods, and wheatgrass juice. There were two components to their study: a cross-sectional component and a longitudinal study.

Technical note concerning measurement units in Rauma et al. [1995] study. The study under discussion here reported results in pmol/L, rather than pg/ml as used by the other studies cited in this section. However, there are reasons to believe that the units for the cross-sectional portion of the subject study are actually pg/ml, rather than pmol/L. Because of this, the numbers below (which come from the cross-sectional part of the study) are listed here as pg/ml. The rationale for this is as follows.

First, as the concern here is a potential inadvertent mix-up between the two different measurement units, pmol/L (picomoles per liter) and pg/ml (picograms per milliliter), note that the relationship between the two units of measure is:

1 pmol/L B-12 = ~1.35 pg/ml

where the ~ above indicates "approximately." This follows from the definition of moles, and simple algebra with the units, using the molecular weight of B-12.

o If the unit "mole" is unfamiliar to you, check an elementary chemistry text, e.g., Oxtoby and Nachtrieb [1986, pp. 21-23], for an explanation.

o Note that the term vitamin B-12 does not refer to just one specific molecule; instead the term is applied to a number of biologically active compounds (though not the biologically inactive B-12 "analogues"), whose approximate average molecular weight is 1,350 [Schneider and Stroinski 1987, pp. 56-59]. These slight variations in molecular weight between different active B-12 compounds account for the approximate nature of the conversion ratio between the two measures pmol/L and pg/ml.

Second, Rauma et al. [1995] report that one test kit had a lower reference limit of 150 pmol/L, which after conversion is approximately 200 pg/ml, the standard. However, the kit that was used for the cross-sectional study (and the numbers below) supposedly had a lower reference limit of 200 pmol/L, which after conversion is 270 pg/ml--a number that is non-standard and appears to make no sense. Further adding to the uncertainty here is that Rauma et al. cite the names of two different manufacturers for the second B-12 assay kit, but email correspondence with the technical support departments of both the named companies reveals that neither company ever made B-12 assay kits.

Although the units issue is confusing, the weight of the evidence seems to point to the units for the cross-sectional study portion as being pg/ml, and that is how they are reported here. As this website article is being published, an inquiry is pending with the authors of the cited study. When the issue is resolved, this section will be updated as appropriate. Until then, please recognize that there is some uncertainty regarding the measurement units for this study [Rauma et al. 1995].
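For readers who wish to check the conversion themselves, a small sketch follows; the ~1,350 g/mol molecular weight is the approximate average for active B-12 compounds cited above from Schneider and Stroinski [1987]:

```python
# Convert a serum B-12 concentration from pmol/L to pg/ml.
# MW ~1350 g/mol is the approximate average molecular weight of the
# biologically active B-12 compounds (Schneider and Stroinski [1987]).
MW_B12 = 1350.0  # g/mol (approximate average)

def pmol_per_liter_to_pg_per_ml(conc_pmol_per_liter: float) -> float:
    # 1 pmol/L = MW_B12 pg per liter = MW_B12 / 1000 pg per ml
    return conc_pmol_per_liter * MW_B12 / 1000.0

print(pmol_per_liter_to_pg_per_ml(150.0))  # ~202 pg/ml -- the standard 200 pg/ml limit
print(pmol_per_liter_to_pg_per_ml(200.0))  # ~270 pg/ml -- the puzzling non-standard value
```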


The cross-sectional analysis compared the serum B-12 levels of 21 living-fooders,
each paired with a control (standard Western diet, i.e., non-vegetarian) matched by sex, age, and other factors. The cross-sectional study found that the control group (standard Western diet) had significantly higher serum B-12 levels (311 pg/ml) than the living-food vegans (193 pg/ml). Note that the average B-12 level reported for living-fooders, 193 pg/ml, is close to the lower limit of the normal range: 200 pg/ml. Moreover, Rauma et al. [1995] report that 57% of the living-foods vegans had B-12 levels below 200 pg/ml.

The longitudinal study was conducted over a 2-year period, using 9 individuals who reportedly
followed the living-foods vegan diet long-term. They note [Rauma et al. 1995, pp. 2513, 2514]: The longitudinal study revealed a decrease in serum vitamin B-12 concentrations with time in six of nine subjects, indicating that the supply of vitamin B-12 from the "living food diet" is inadequate to maintain the serum vitamin B-12 concentration... [T]he occurrence of low serum vitamin B-12 concentrations in over 50% of the long-term adherents of this diet warrants further study of the possible health risks involved.

B-12 levels in long-term vegans. Bar-Sella et al. [1990] examined the serum B-12 levels of 36
strict vegans who had followed the diet long-term (5-35 years). The vegans had an average serum B-12 level of 164 pg/ml; the control group (standard Western diet) had an average serum B-12 level of 400 pg/ml. The difference between the two groups was significant at the 0.001 level. Twenty-six vegans (of the sample of 36, i.e., 72.2% of vegans in the sample) had B-12 levels below 200 pg/ml, the lower limit of the normal range [Bar-Sella et al. 1990, p. 310]. None of the vegans displayed hematological defects (i.e., signs of anemia); however, 4 vegans with B-12 levels below 100 pg/ml had neurological symptoms. Three of the four (with neurological symptoms) were followed, and they all showed clinical improvements in symptoms after intramuscular (injected) vitamin B-12 supplementation.

Crane et al. [1998] studied the vitamin B-12 (cobalamin) status of two vegan families. The individuals studied had been vegans for 2-21 years. Their study is noteworthy for two reasons. First, it was very thorough and included tests for levels of homocysteine, methylmalonate, and several other factors involved in B-12 metabolism. Second, it discusses the use of oral B-12 supplements, and recommends that B-12 supplements be chewed, rather than swallowed whole, for best absorption. Crane et al. [1998, pp. 87, 88] note: Results showed that if the data of the serum CBL [cobalamin, vitamin B-12] and urinary MMA [methylmalonic acid] are combined, all nine of the [study] subjects had chemical evidence of insufficient CBL [i.e., deficiency]... Evidence, which we have reported elsewhere (Crane et al., 1994) indicates that over 80 per cent of those people who have been vegans for two years or more are deficient in CBL... Only one person in the study showed possible symptoms of deficiency, which were alleviated by oral B-12 therapy. However, five of the test subjects showed mild signs of anemia.

B-12 levels in lacto-vegetarians. In contrast to the low B-12 levels reported in vegans, B-12
levels in lacto-vegetarians appear to be closer to normal. Dong and Scott [1982], discussed above, tested lacto-vegetarians; see table above for serum B-12 values observed. Solberg et al. [1998] analyzed the plasma B-12 levels of 63 long-time Norwegian lacto-vegetarians, and found no significant differences when compared to a control group (standard Western diet). An interesting side point in the Solberg et al. [1998] study is that lacto-vegetarians who took B-12 supplements had plasma B-12 levels equal to the lacto-vegetarians who did not take supplements. [Note: this should not be interpreted as suggesting that oral cobalamin supplementation is ineffective. Quite to the contrary, an excellent recent study, Kuzminski et al. [1998], found oral cobalamin therapy to be more effective than intramuscular (injected) cobalamin therapy.]

Negative review paper critical of studies pointing to deficient B-12 status in vegetarians found to be inaccurate and outdated
Immerman [1981], a study of vitamin B-12 status in vegetarians, is subtitled, "A critical review," and attempts to review the clinical studies of vitamin B-12 deficiencies in vegetarians published before 1981. Immerman's approach is to check each study against a criteria list. His checklist comprised 7 criteria, of which the first 5 were considered essential for a study to be credible. After a detailed review of the studies, Immerman concludes [1981, p. 47], "When judged by these criteria, most of the studies which have found inadequate B-12 status in lactovegetarians and vegans are unconvincing." However, a retrospective and analytical review of Immerman [1981] shows that his assessment criteria are logically flawed, and there may be bias present in his review. A summary of the flaws in Immerman's review is as follows.

• Review criteria admitted only long-term deficiencies (a logical fallacy). Criteria 2 and 4 of Immerman require low serum levels of B-12, plus the presence of (i.e., diagnosis that includes) anemia and/or subacute combined degeneration (SCD). The problem with such narrow criteria is that the symptoms specified are serious long-term symptoms only; see Herbert [1994] for a discussion of the staging of B-12 deficiencies. For example, requiring a diagnosis of subacute combined degeneration means that milder neurological impairments are ignored (mild B-12 deficiency might not produce anemia or SCD). Hence, it appears that Immerman's requirements can be met only by serious, advanced cases, and he may be rejecting perfectly good data--a major statistical and logical fallacy that renders his conclusions invalid.

• Review criteria required non-standard therapy for B-12 deficiency. Criterion 5 of Immerman requires that the B-12 symptoms be reversed by oral administration of 1 mcg of B-12 per day. To those who are not acquainted with standard treatment protocols for B-12 deficiency, this may seem reasonable. However, those acquainted with the issues involved in cobalamin therapy will immediately recognize criterion 5 as unrealistic and deceptive. Since the late 1950s, the standard method of treatment for B-12 deficiency in the U.S. has been via intramuscular injections (see Hathcock and Troendle [1991, p. 96] and Lederle [1991]), and not via oral cobalamin as Immerman appears to require. Indeed, Immerman appears to be so rigid in his evaluation of the studies that he objects to one study that administered 5 mcg B-12 (and one unit of blood) to a patient.

• Review criteria called for symptom relief on an inadequate dose of B-12. Such rigidity is actually deceptive because the absorption rate for oral cobalamin in pernicious anemia is approximately 1.2% [Berlin et al. 1968, as cited in Lederle 1991]. Hathcock and Troendle [1991] report that oral doses of 80-150 mcg per day are helpful but do not restore cobalamin serum levels to normal. Lederle [1991] notes that some early studies reported poor results from oral doses of 100-250 mcg of cobalamin per day. Oral doses of 300 to 1,000 mcg per day have proven effective for treatment of pernicious anemia; see Hathcock and Troendle [1991] for citation of 5 relevant studies. Now let us consider Immerman's criterion 5--that is, strictly requiring an oral dose of 1 mcg per day--in light of the following facts:

o B-12 (cobalamin) RDA of ~2 mcg/day.

o An approximate absorption rate of 1.2% for oral cobalamin in pernicious anemia cases.

o More recent studies indicate that daily oral doses of 5-20 mcg/day (i.e., higher than the dose specified by Immerman) without intrinsic factor may be ineffective; see Lederle [1991] for two relevant citations.

o Standard treatment (U.S.) for B-12 deficiency has been via injections rather than oral cobalamin.

• The review criterion of 1 mcg oral B-12 per day is not supported by the reference cited in the review. Immerman cites Chanarin [1969, pp. 40-63, 708-714] as the reference for criterion 5. However, Chanarin [1969] states that 1 mcg B-12 per day is insufficient (p. 57), recommends oral administration of 5-10 mcg B-12 per day in cases of deficiency (p. 712), and cites two studies: Schloesser and Schilling [1963], and Winawer et al. [1967], which reported no or suboptimal improvement on oral doses of 1 mcg B-12/day. Thus it appears that Chanarin [1969] does not support Immerman's criterion 5.

We conclude, then, that Immerman's criterion 5 requires symptomatic relief from what appears to be an insufficient dosage of cobalamin, administered in a non-standard way (orally vs. injections). This leads to the conclusion that Immerman's criterion 5 is unrealistic and deceptive, and thus is a logical fallacy that renders his conclusions invalid. (A quick numerical check of the dose arithmetic appears just after this list.)

• Rationalizations and bias in review? A review of Immerman [1981] suggests he grasps at every possible rationalization in an attempt to avoid facing the simple reality that if inadequate amounts of vitamin B-12 are ingested, a deficiency may result. Some may interpret such rationalizations as suggesting possible bias. While it is certainly possible for a vegetarian to be deficient in B-12 due to factors unrelated to diet, to try to rationalize away all B-12 deficiencies in vegetarians via non-diet reasons makes the issue of bias relevant here. Immerman actually did not address one of the most important issues regarding older B-12 studies: the assay methods used and their reliability (i.e., some of the older assay methods are unreliable). For more on this point, see the following subsection on "B-12 in spirulina and other plant foods" just below.
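As a quick numerical check of the dose arithmetic discussed above (the dose, absorption, and RDA figures are those already cited; the calculation itself is merely illustrative):

```python
# Back-of-envelope check of Immerman's criterion 5 (1 mcg oral B-12/day).
ORAL_DOSE_MCG = 1.0   # dose required by criterion 5
ABSORPTION = 0.012    # ~1.2% oral absorption in pernicious anemia
                      # [Berlin et al. 1968, as cited in Lederle 1991]
RDA_MCG = 2.0         # adult RDA, NRC [1989]

absorbed = ORAL_DOSE_MCG * ABSORPTION  # mcg actually absorbed per day
print(f"absorbed: ~{absorbed:.3f} mcg/day, "
      f"i.e., {100.0 * absorbed / RDA_MCG:.1f}% of the {RDA_MCG:.0f} mcg RDA")
# -> absorbed: ~0.012 mcg/day, i.e., 0.6% of the 2 mcg RDA
```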

A review of Immerman's "critical review" finds it outdated and invalid. Immerman's "critical
review" does show that some of the older B-12 studies are not as convincing as more recent studies, i.e., in light of newer knowledge, the shortcomings of the older studies are readily apparent. Similarly, a retrospective review of Immerman [1981] finds it to be seriously flawed, logically invalid, and potentially biased. If any reader thinks it unfair to judge Immerman [1981] using more recent knowledge, please recognize that this is also what Immerman himself was attempting to do when assessing the earlier B-12 studies. However, note that Immerman's review is invalidated not solely or simply on grounds of more recent knowledge. The criticisms made here (above) of criterion 5 were relevant at the time of Immerman's review article; i.e., Immerman's study was logically flawed at the time it was published in 1981. [Note: Despite the risk or reality of belaboring the point here, I have discussed Immerman [1981] at length because it is occasionally cited by raw/veg*n diet advocates who wish to discount the published studies showing B-12 deficiencies in veg*ns, and/or deemphasize the importance of B-12 in veg*n diets.]

Steer clear of dietary extremists who rationalize B-12 deficiency concerns. The above evidence
of apparent vitamin B-12 deficiency in vegans (both raw and conventional) should serve as a warning to fruitarian/veg*n extremists and their followers who claim adequate vitamin B-12 can be obtained by eating unwashed plant foods or from intestinal bacterial synthesis (the latter topic is discussed later herein).

Vitamin B-12: Rhetoric and Reality (CONT., 3 OF 5)
B-12 in spirulina and other plant foods


Microbial assays for B-12 are unreliable. A common misconception in vegan circles is that
fermented foods and spirulina contain B-12. This claim may, at times, be supported by lab tests for B-12 based on the USP (U.S. Pharmacopeia) assay methods. Unfortunately, as explained in Herbert et al. [1984] and Herbert [1988], the USP assay method for B-12 is unreliable. The assay measures total corrinoids--that is, true B-12 plus analogues (forms of B-12 that are not metabolically active in the body)--and the analogues have the potential to block the absorption of true B-12 by occupying B-12 receptor sites. A preferred, reliable test that can differentiate between true B-12 and corrinoids is provided by differential radioassay. The assay problem must be considered in evaluating "old" studies on B-12.

Spirulina and tempeh contain mostly analogues of B-12. Herbert [1988] reports that tests on
tempeh, a fermented soy product, and spirulina revealed that they contained almost no true B-12, i.e., the "B-12" they contained (per USP assay test) was predominantly analogues. Herbert [1988, p. 857] reports: We suspect that people taking spirulina as a source of vitamin B-12 may get vitamin B-12 deficiency quicker because the analogues in the product block human mammalian cell metabolism in culture [i.e., in the lab] and we suspect that they will also do this in the living human. The presence of analogues, rather than true B-12, in fermented foods makes them unreliable sources for B12.

Effects of cooking and processing on B-12 in foods
Available information not well-controlled enough to provide definitive answers. Recognizing that some vegans advocate raw-food diets, the question of the effect of cooking and processing on B-12 levels is relevant. The information on this topic (available to this writer) is less clear than desired. That is, for a reliable comparison to be made, tests must be made on both raw and cooked samples from the same base lot of raw foods, and a reliable assay method must be used. Comparison of B-12 levels, raw vs. cooked, via standard nutritional tables, is not an optimal comparison method, as it may not be clear whether the above conditions for comparison are met, and/or if reliable assay methods were used in the table analysis. The limited evidence available, though, suggests that cooking reduces B-12 levels, though the exact extent is, unfortunately, unclear.

• Herbert [1984] reports that dehydration at 200°C (392°F) for 6 days reduced B-12 levels by approximately one-third. Note that 200°C is well above normal boiling/steaming temperatures.

• Herbert [1984] also reports that boiling in alkali destroys 85% of the corrinoid content, per the L. leichmannii (USP, bacterial assay) test. This result is difficult to interpret, as typical cooking practices do not involve boiling in alkali, and the USP test, as discussed above, is not a reliable measure of true B-12.

• Banerjee and Chatterjea [1963] report wide variation in B-12 losses in cooking various types of fish, meat, and milk. B-12 losses, when they occurred, ranged from 23.7-96.4%. However, one fish species in their study showed no loss of B-12, and samples from three fish species (and also goat liver) showed increases in B-12 levels from cooking. (The authors suggest the increase may be caused by cooking increasing the B-12 level in the extraction solvent.) The result that B-12 levels for some species increased via cooking, of course, raises questions regarding the overall reliability of their experimental methods and results. Also, their study measured B-12 via microbial assay (Euglena gracilis var. bacillaris), which today is regarded as a less-reliable assay method (it does not/may not distinguish analogue forms from true B-12).

• Banerjee and Chatterjea [1963] also tested 20 types of plant foods, and found no B-12 activity in any of the raw plant foods.

• Heyssel et al. [1966] report that liver boiled in water for 5 minutes lost only 8% of its vitamin B-12 content (vs. raw liver), while muscle meat broiled at 171°C (340°F) for 45 minutes lost 27% of its vitamin B-12 (vs. raw meat). [Note: The paper mentions 340°F in one place and 340°C in another. Given that 340°C = 644°F (an unusually high cooking temperature), that figure is probably a typo, and the correct figure is most likely 340°F (171°C).] They note that some of the B-12 they record as lost might actually be contained in the drippings (liquids discarded) in cooking. Their paper also used a Euglena gracilis assay for B-12.

The possibility that plant foods might contain some B-12 will be discussed later herein. At that time we will note that there is little or no data on the effect of cooking on B-12 levels in plant foods.

Is biologically active B-12 produced by intestinal bacteria?
Claims of intestinal B-12 production may be based on insufficient evidence. Albert et al. [1980]
is sometimes cited as evidence that B-12 producing bacteria can exist in the small intestine. Sometimes explicit claims are made, e.g., that intestinal bacteria allegedly can produce adequate B-12. Baker [1981] and Nutrition Reviews [1980] are related citations that comment on Albert et al. [1980]. However, a careful reading of Albert et al. [1980] shows that it used bacteriological assays, which are of lower reliability, to measure B-12 levels. Specifically, the most accurate bacteriological assay they used is Ochromonas malhamensis. Note that Ochromonas is the most accurate bacterial assay method for B-12; however, even it may report values for some analogues as part of its "B-12" results [Schneider and Stroinski 1987, Tables 3-2, 5-3 to 5-5, pp. 56-57, 119-123]. Herbert and Das [1994, p. 405] apparently regard all the bacterial assay methods as being less reliable than differential radioassay; also see Herbert et al. [1984] and Herbert [1988] for related information.

Additionally, the data obtained in Albert et al. [1980] comes from isolated bacterial cultures. Therefore, it is unclear whether the bacteria would produce similar amounts of B-12 under the conditions present in the intestines. This point is discussed in Albert et al. [1980], but is sometimes ignored by dietary advocates with an ideological interest in minimizing the requirement for B-12 in the diet. The bottom line in the paper of Albert et al. [1980] is that it shows certain intestinal bacteria might produce B-12, but it is unclear whether/how much might be produced (and absorbed) under actual conditions in the small intestine.

Langley [1995, p. 74] summarizes the situation nicely: In some people, B-12 producing bacteria certainly exist in the small intestine where the vitamin manufactured can, in theory at least, be absorbed. Exactly what contribution this makes to the daily B-12 intake of vegans remains to be clarified.

Also recall the discussion above (from Herbert [1984]) regarding the achlorhydric stomach being colonized by bacteria that produce abundant analogues of B-12. Analogues (which block uptake of true B-12) are a major concern whenever one discusses the possibility of B-12 being produced in the small intestine.

Direct coprophagy: a reliable (vegan?) B-12 source
Note: this section may be considered to be in poor taste--both figuratively and literally--by some readers. It is included here for completeness, and in the event certain (extremist) fruitarian/veg*ns might be interested in experimenting with a vegan (?) source of vitamin B-12 that is truly radical in character.

B-12 produced in, but cannot be absorbed from, the human colon. The human colon contains
bacteria that produce vitamin B-12, and fecal matter is a rich source of B-12. This raises the question of whether B-12 can be absorbed from the colon. From Herbert [1988, p. 852]: In one of the less appetizing but more brilliant experiments in the field of vitamin B-12 metabolism in the 50s, Sheila Callendar (7) in England delineated that colon bacteria make large amounts of vitamin B-12. Although the bacterial vitamin B-12 is not absorbed through the colon, it is active for humans. Callendar studied vegan volunteers who had vitamin B-12 deficiency characterized by classic megaloblastic anemia. She collected 24-h stools, made water extracts of them, and fed the extract to the patients, thereby curing their vitamin B-12 deficiency. This experiment demonstrated clearly that 1) colon bacteria of vegans make enough vitamin B-12 to cure vitamin B-12 deficiency, 2) the vitamin B-12 is not absorbed through the colon wall, and 3) if given by mouth, it is absorbed primarily in the small bowel.


Herbert et al. [1984] collected the 24-hour fecal output from 6 men. They found that the (24-hour) total fecal output contained ~100 mcg of total corrinoids, of which only ~5 mcg was true B-12 (the remainder being analogues). (Note: see Mozafar [1994] for a table of B-12 levels in manure, feces, soil, sludge, etc.) Given this, the work of Callendar mentioned above could be taken to suggest that the true B-12 in the feces (if reingested and passed back through the small bowel) would be absorbed, despite the substantial amount of analogues present.

Any takers? Further, the daily output of ~5 mcg versus the RDA/RDI of 1-2 mcg suggests that a direct
coprophagy level (i.e., reingestion of feces) of 20-40% of output will meet requirements for B-12. Might this qualify as the only truly reliable, vegan (?) source of B-12? Will coprophagy be the next fad among certain fruitarian extremists? (Obligatory warning: coprophagy, and the handling of feces, is unsafe and increases the risk of transmission of parasites and diseases. Coprophagy is not recommended.)

Vitamin B-12: Rhetoric and Reality (CONT., 4 OF 5)
Indirect coprophagy: plant foods fertilized with feces/manure
The consumption of plant foods grown in soil fertilized with human manure (occasionally called "night soil") might also provide adequate B-12. Herbert [1988, pp. 852, 854] notes: The more frequent source of vitamin B-12 in association with plant food is external contamination with bacteria, often of fecal origin... The fact that stool vitamin B-12 can be important in human vitamin B-12 economy was delineated by James Halsted (11) working with Iranian vegans who did not get vitamin B-12 deficiency... Halsted went to Iran and found that they grew their vegetables in night soil (human manure). The vegetables were eaten without being carefully washed and the amount of retained vitamin B-12 from the manure-rich soil was adequate to prevent vitamin B-12 deficiency.

Note that the first part of the quote above--that B-12 on plant foods is primarily present as contamination--may be outdated, per the more recent research of Mozafar [1994]. The paper of Mozafar is sometimes cited by raw/vegan advocates as "proof" that a raw/vegan diet, if grown in the "right" soil (manured), can provide adequate B-12. Therefore, let's now take a closer look at this research.

Mozafar [1994] grew soybeans, barley, and spinach on three kinds of soils--unenriched soil (control), soil fertilized with raw, dried cow dung, and soil enriched with vitamin B-12. The discussion of Mozafar [1994] in this paper will omit the results from soil enriched with B-12, as it is not an economically feasible agricultural practice at present. Vitamin B-12 levels were measured using "radioisotope dilution" [Mozafar 1994, p. 307], also called protein-binding. This assay claims a high resolution in distinguishing between true B-12 and analogues.

Plants may absorb B-12 from the soil. Mozafar [1994, pp. 307, 309-310] claims:
[M]ost, if not all, of the B-12 in the plant may be in the free form... Considering the reports that plant roots and leaves can absorb the relatively large molecules of B-12 from nutrient solutions and transport them to other plant parts (Mozafar and Oertli 1992a) and the belief that plants cannot synthesize this vitamin (Friedrich, 1987; Lehninger, 1977; Smith, 1960), it seems that the observed increase in the concentration of B-12 in barley seeds and spinach leaves fertilized with cow dung is mostly (if not fully) due to the uptake of this vitamin by the roots from the soil and not due to superficial contamination or an increased synthesis within the plant.


The claim that the B-12 in the above experiment was mostly true B-12--and absorbed and taken up by the plants--is surprising in light of the known presence of B-12 analogues in many foods (see the papers by Herbert for discussions on this point). It also raises an important issue: if the assay techniques used by Mozafar did not also measure vitamin B-12/cobalamin analogues, then we really don't know if analogues are present, which could (potentially) interfere with absorption of vitamin B-12 from plant foods. A later paper by Mozafar--i.e., Mozafar [1997, p. 51]--reports that very little is known about the absorption of B-12 analogues by plants. Note also that the second part of the above quote contradicts the claim by Herbert that B-12 is usually found primarily externally in plants. (The claim by Herbert may be outdated/incorrect; see Mozafar and Oertli [1992] for discussion of their research that shows soybean roots are able to absorb B-12, and the vitamin can be transported within the plant.)

B-12 found in tested plant foods. Mozafar found that the levels of B-12 in barley and spinach grown in
untreated (control) soil versus soil treated with cow dung were significantly different. However, the B-12 levels of soybeans were not significantly different for the two soil types. The B-12 measurements in Mozafar [1994] are frankly hard to interpret as-is. That is, the measurement unit utilized, i.e., nanograms of B-12 per gram of plant food, dry weight, is not meaningful to most readers. To help readers understand the Mozafar results, a part of the results are used below to estimate the amount of each plant food (by itself) needed to get 2 mcg of B-12 per day. The last 2 columns in the table below are the most important--they show the estimated weight (in kilograms and in pounds) of raw plant foods required to satisfy daily B-12 requirements.

B-12 Data from Mozafar [1994] and Estimated Weights of Plant Foods Required to Supply 2 Mcg of B-12

                          Mcg of B-12     Amount of food necessary to achieve
                          per kilogram    2 mcg daily requirement of B-12
FOOD SOURCE               (dry wt.)       Dry wt. (kg)  Wet wt. (kg)  Sprouted* (kg)  Sprouted* (lb)
Soybean, control               1.6            1.25          1.37           4.78           10.52
Soybean, manured               2.9            0.69          0.75           2.64            5.81
Barley kernels, control        2.6            0.77          0.85           1.70            3.74
Barley kernels, manured        9.1            0.22          0.24           0.49            1.07
Spinach, control               6.9            0.29          3.44           3.44            7.57
Spinach, manured              17.8            0.11          1.33           1.33            2.94

*NOTE: Sprouting applies to the figures for soybeans and barley; spinach is eaten as-is.
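As a check on the table, the dry-weight column can be reproduced directly from Mozafar's concentrations. The sketch below (merely illustrative) computes only that column, since the wet-weight and sprouted figures additionally depend on moisture-content factors not given here:

```python
# Reproduce the dry-weight column of the table above from the B-12
# concentrations reported in Mozafar [1994]. The wet-weight and sprouted
# columns would also require moisture-content factors not shown here.
DAILY_B12_MCG = 2.0  # target intake, mcg/day

b12_mcg_per_kg_dry = {             # Mozafar [1994]
    "Soybean, control":        1.6,
    "Soybean, manured":        2.9,
    "Barley kernels, control": 2.6,
    "Barley kernels, manured": 9.1,
    "Spinach, control":        6.9,
    "Spinach, manured":        17.8,
}

for food, conc in b12_mcg_per_kg_dry.items():
    kg_dry = DAILY_B12_MCG / conc  # kg of dry matter needed per day
    print(f"{food:24s} {kg_dry:.2f} kg (dry) per day")
# e.g., "Soybean, control" -> 1.25 kg and "Spinach, manured" -> 0.11 kg,
# matching the table's dry-weight column.
```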
Analysis of Mozafar [1994] results
On the surface, the results of Mozafar [1994] appear to indicate that one may be able to get adequate B-12 from a diet of raw plant food. However, let's take a closer look at the numbers.

Soybeans. Soybeans in their crude form are inedible, and need sprouting or cooking to be rendered edible.
Many raw-fooders would challenge the idea that even raw, sprouted soybeans are actually that edible, however, as the flavor is--how to say it--"intensely awful." At any rate, one must eat 2.64-4.78 kg (5.81-10.52 pounds) of soybean sprouts per day to meet B-12 requirements. It is very difficult/nearly impossible to eat that much bulk on a regular basis. The crude beans (0.75-1.37 kg) can be cooked, and the bulk is more manageable, but it is not clear how much extra one would need to eat to adjust for B-12 losses in cooking.

Barley kernels. Barley kernels in their crude form are inedible. While not 100% certain, it appears
Mozafar tested unhulled barley, which has a tough hull (with sharp edges when chewed) that makes eating it raw very difficult, even when sprouted. The amounts of barley required are lower than for soybeans; in sprouted form, the estimate is 0.49-1.70 kg, or 1.07-3.74 pounds. However, the reality of the extremely tough hull on barley makes eating even 0.49 kg (1.07 pounds) per day a difficult and unpleasant task. The 0.24-0.85 kg (0.53-1.87 pounds) of crude barley kernels, if hulled and cooked, are easy to consume. Again, the problem arises in that one needs accurate data on B-12 losses in cooking to know how much extra to eat to compensate for loss of B-12 in cooking.

Spinach. Here 1.33-3.44 kg (2.94-7.57 pounds) are required. The amounts are within the range of
possibility for a person to eat, though it would be difficult (bulk--multiple meals required). The problem, of course, is that a reliable supply of manured spinach is not available to most people, and the weight required of control (unmanured) spinach--3.44 kg or 7.57 pounds--is far too high, i.e., yet another bulk problem. It would be extremely difficult to eat that weight of greens in a day.

Comments on the Mozafar [1994] research
Research needs to be repeated and verified. The Mozafar paper is the first paper to report B-12 levels above trace in common plant foods. As it is but one study, the research needs to be validated--i.e., it should be repeated and extended by others. (Repetition of the research would provide important confirmation information.) It would be interesting to include root crops, fruit, and wheat in any future research along these lines; indeed, the extension of Mozafar's research to woody plants like tree and bush crops could be very important. Additional comments:

• B-12 levels in soil highly variable. The table in Mozafar [1994, Table 3, p. 309] indicates that B-12 levels in soil can vary widely. Accordingly, it would be unwise to assume that a particular soil will supply levels of B-12 to plants that are adequate for human nutritional needs.

• B-12 losses in shipping and processing? Mozafar analyzed the plant foods at the location where they were grown, just after harvest. Most consumers buy their foods from markets, and the food has been processed and shipped. This raises the question of whether, and how much, postharvest processing and shipping can change the B-12 levels in foods. Similar questions apply to types of cooking and food processing.

• Legal risks from use of raw manure. The study used raw animal manure; owing to the liability risks (E. coli) of raw manure, the research should be repeated with composted manure. The use of raw, dry cow dung as a soil amendment is uncommon (nowadays) in farming as it can introduce weed seeds.

• The use of manure is not universal, even in organic gardening; it is not the case that "organic = manured" or even "organic = composted." Some organic farmers prefer to simply use heavy mulches, with no manure or compost. Some vegan purists reject the use of animal manure; the term used for such gardening is "veganic."

• Inasmuch as some vegans condemn the use of manure in farming, the information that the only possible "natural" way to get B-12 in the usual plant foods is via the use of manure--an animal product--as fertilizer provides yet another contradiction/irony in veganism. Of course, some might reply that human manure can be used instead. Indeed it can, but if cow dung is abhorrent to vegan purists because it is an "animal product," then is human manure an "animal product" as well? What about human manure from meat-eaters?

• Per Mozafar, sewage sludge is very high in B-12, but organic food advocates oppose the use of sewage sludge because of possible heavy metal contamination. (The underlying point here is that sewage sludge might be used--in very small quantities, as a compost additive--to inoculate soils with B-12.)

• Raw human feces, aka "night soil," is high in B-12. However, its use in farming raises the risk of parasites, E. coli, and potential liability lawsuits aimed at farmers. It is not a feasible soil additive for farms, at present.

• The Mozafar [1994] paper is considered controversial by some veg*n nutritionists. For comments, see Craig [1997] as cited in Mozafar [1997], and Mozafar [1997] for the reply.

• Possible exaggerations by raw/veg*n advocates. Given that Mozafar [1994] is only one (limited) study, and that it did not test wild foods, fruits or nuts, roots or tubers, to claim that it "proves" plant foods are adequate sources for B-12 would be an exaggeration. (Mozafar makes no such claims; the claim is occasionally made by raw vegan advocates.)

Geophagy: another source for B-12?
Warning: for some, this may be in poor taste too. (But remember our goal here: to be as thorough as possible on the topic.) Chimps and other apes have been observed engaging in geophagy, i.e., eating dirt, though the predominant hypothesis is that chimp geophagy is done to ingest clay that absorbs excess plant tannins in the GI tract (and not as a B-12 source). Mozafar [1994] reports 14 mcg B-12 per kg of soil. If we assume this is all true B-12 and no analogues (highly unlikely), it implies an intake of ~143 g (0.31 lb) of soil daily to satisfy B-12 requirements. Such an intake is obviously not feasible. Note that Mozafar lists B-12 values for soil from other studies as well (Table 3, p. 309, Mozafar [1994]). However, many of the studies cited are old, and thus the assay methods used should be checked. Mozafar reports that soil is one of the richest sources of B-12, but that there is no information available on the level of analogues vs. true B-12 in soil.
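The soil arithmetic above is equally easy to verify (again, generously assuming all measured corrinoids are true B-12):

```python
# Check of the geophagy arithmetic above, assuming (generously) that all
# of the 14 mcg/kg measured in soil by Mozafar [1994] is true B-12.
SOIL_B12_MCG_PER_KG = 14.0
DAILY_B12_MCG = 2.0

soil_kg = DAILY_B12_MCG / SOIL_B12_MCG_PER_KG
print(f"~{soil_kg * 1000:.0f} g of soil per day (~{soil_kg * 2.2046:.2f} lb)")
# -> ~143 g/day (~0.31 lb): clearly not a feasible B-12 strategy.
```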

Synopsis of impact of Mozafar research
The Mozafar research provides a limited confirmation of the earlier research by Halsted (as reported in Herbert [1988]). That is, if you eat enough plant foods grown in manured soil, you might get adequate B-12. However, inasmuch as most of us buy our foods from markets, it would be unsafe to assume that a typical raw/vegan diet provides adequate B-12. Accordingly, the use of B-12 supplements (and/or supplemented foods) by vegans is still appropriate.

Vitamin B-12: Rhetoric and Reality (CONT., 5 OF 5)
Feasible B-12 sources in evolution/pre-agricultural times
Both geophagy and, more rarely, coprophagy are practiced on a limited scale by chimps; see Goodall [1986] for relevant discussion. It is known (from toothwear studies) that prehistoric humans display significant toothwear from "grit" or soil. No one knows how much grit prehistoric humans consumed, or how much might have been inadvertent (versus intentional), though the figure given above of 143 g/day of soil that would be required to achieve one's daily B-12 requirement is probably far more grit than anyone--even a less-than-fastidious prehistoric human--could tolerate. This suggests that geophagy was not a significant B-12 source for prehistoric humans. Prior to agriculture, there were no domesticated crops or herds of domesticated animals. Consequently, there was no deliberate manuring of food crops--which is an agricultural practice, not a hunter-gatherer practice. Of course, some wild plants will--by chance or other reasons--receive limited amounts of wild animal manure. However, there is insufficient evidence to support a claim that such incidental, occasional manuring (of wild plants) would provide a reliable source of adequate B-12 (particularly in light of the extremely high bulk of unmanured plant foods required to meet B-12 requirements).

Potentially feasible plant B-12 sources--legumes and grains--not available to hunter-gatherers. Note that Mozafar found higher levels of B-12 in soybeans (a legume) and barley (a grain) than
in spinach. Prior to the development of agriculture, such items (legumes, grains) would have been available only in very limited quantities to hunter-gatherer tribes. Further, prior to the development of cooking technology (stone ovens and/or pottery), it would not be possible to cook (or sprout) significant quantities of such foods. (Think of the difficult task of roasting, or sprouting--without implements or containers--a pile of loose, small seeds.) Pottery is also required for cooking most vegetables other than tubers or hard squashes. That means early humans, if they were strictly vegan, would need to eat huge amounts of raw greens per day to satisfy B-12 requirements (and even larger amounts to meet minimal calorie requirements; see The Calorie Paradox of Raw Veganism for more details). The seasonal limits on the availability of greens (and other plant foods) in temperate climates, combined with the large amounts required, further disqualifies greens as a feasible B-12 source.

Fauna (animal foods) the consistent B-12 source in evolution. Accordingly, the only
reliable/feasible sources of B-12, pre-agriculture, were coprophagy and the consumption of animal products. There is ample evidence of consumption of animal products in pre-agricultural times, but no evidence that coprophagy was ever a common practice. That leaves animal products (including insects) as the only reliable B-12 source in pre-agricultural times, and by implication suggests (if further evidence were needed beyond that already existing about prehistoric peoples) that there never existed any strictly vegan hunter-gatherers, as B-12 is an essential nutrient. (This also suggests that strict fruitarian/vegan diets were never a factor in human evolution.)

Vitamin B-12 and mercury
A not-very-clear claim occasionally made in fruitarian circles is that vitamin B-12 deficiency is caused not by a lack of vitamin B-12, but by the cobalt in B-12 undergoing oxidation due to heavy-metal action, specifically inorganic mercury from dental fillings (amalgam). However, a review of the limited information on the "theory" raises serious questions regarding its validity. Some of these reservations are:

• Apparently not a peer-reviewed theory. The theory reportedly appears in an article in a periodical called Heavy Metal Bulletin (1995, not in the reference list for this paper). A search of several university libraries failed to find the periodical, suggesting an incorrect title, or, perhaps, that the periodical is not a scientific, peer-reviewed journal. Note that the comments here are based on a poorly written article (by a fruitarian extremist) that discusses the theory. (The article included what is claimed to be the reference list from the original article.)

• Defective reference list. A check of the articles allegedly in the reference list for the original theory article finds that at least 4 articles were either cited incorrectly or apparently do not exist. The net effect of such irregularities is to raise serious questions regarding the paper's validity.

• Evidence of non-human primates raises doubts about theory. The fact that wild primates (who have no silver-mercury amalgams in their mouths) who are fed vegetarian diets in captivity become B-12 deficient (references cited above) suggests that B-12 deficiency is real, while the "B-12/mercury hypothesis" is, at best, unproven.

• Pardridge [1976] showed that inorganic mercury interfered with brain uptake of certain amino acids, but his research did not test trans-cobalamin, methionine, or homocysteine--the proteins/amino acids involved in B-12 metabolism in the brain.

• Selenium/mercury association (in the brain) confirmed. Friberg and Mottett [1989], in their review article, suggest, and Bjorkman et al. [1995] confirm, a statistical relationship between inorganic mercury in the brain and selenium (not cobalt). This does not disprove the cobalt hypothesis, but it provides an alternate and proven hypothesis (whereas the cobalt hypothesis is unproven). Additionally, selenium is far more common than cobalt, which raises the question whether it might prevent potential B-12/mercury oxidation by "out-competing" the cobalt for any inorganic mercury present.

• Mercury chelated by tryptophan, a common amino acid. Pardridge [1976] reports that inorganic mercury is chelated by tryptophan, a common amino acid. This suggests that high-protein diets might provide limited protection from inorganic mercury uptake. (Ironically, the fruitarian extremist touting the vague B-12/mercury hypothesis advocates very low-protein diets.)

• Kanazawa and Herbert [1982] report that the B-12 in the human brain is nearly all true B-12; there are only trace amounts of B-12 analogues. This raises the interesting question of whether a mercury/B-12 compound could cross the blood/brain barrier, or if it would be rejected as an analogue. Note that this information is based on a very small sample, and must be interpreted with caution.

• Is there enough mercury to "starve" the brain of B-12, long-term? Given the information in the three preceding points, one wonders whether dental fillings could provide, over a long period of time, enough mercury to react with all the available tryptophan, selenium, and cobalt. Given that tryptophan and selenium are relatively "common" (at least in comparison to cobalt), and that the amounts of mercury released by dental amalgam are extremely small, such a hypothesis seems dubious and would require credible evidence that has been published in a peer-reviewed, scientific journal.

• Implication: Rely only on legitimate scientific information on B-12. For a legitimate scientific introduction to cobalamin metabolism, readers are encouraged to consult Tefferi and Pruthi [1994], which was consulted in the preparation of these remarks. An additional legitimate resource regarding B-12 deficiency and its biochemical basis is Cooper and Rosenblatt [1987].

In summary, the above points raise serious doubts about the (vague) mercury/B-12 hypothesis. Until such a theory is clarified, experimentally tested, and published in a credible scientific journal, it should be regarded as a speculative hypothesis.

Attempts to reverse burden of proof. Finally, this exercise also illustrates one of the bad habits of
the raw/veg*n movement: adopt an unproven theory or claim, then aggressively demand that others disprove it. Such an approach is an attempt to reverse the burden of proof, and the purveyors of such untested hypotheses should not be so easily allowed to get away with such assaults against reason.

A note on Victor Herbert
Victor Herbert, whose research on B-12 has been cited here, is a controversial figure in alternative health circles, given that he lobbies against alternative medicine and supports government restrictions on alternative health. He promotes the allopathic "party line," i.e., drugs and surgery. Because of this, Victor Herbert the politician is very unpopular in alternative health/diet circles. Unfortunately, many in such circles cannot distinguish between Victor Herbert the politician and Victor Herbert the research scientist. Herbert is one of the top researchers in the B-12 area and has a lengthy list of publications in scientific/professional journals. That record--extensive publication in refereed scientific journals--is a record that the anti-Victor Herbert lobby cannot match. Accordingly, I encourage you to evaluate his research on its technical merits, and not to be swayed by emotional or political sentiments.


Vitamin B-12: summary/conclusion

Vitamin B-12 is an essential vitamin for which the only reliable sources are fauna (animal foods) and coprophagy (should you freely choose to engage in it, which is most definitely not recommended). There is no evidence to support the idea that coprophagy is a natural human behavior, while there is extensive evidence that humans have consumed animal foods since the inception of the (human) species. Indirect coprophagy--eating plants fertilized with raw excrement--might provide adequate B-12, although the supporting evidence for this is very limited at present. As well, the use of raw excrement in farming poses risks of parasites and disease for the consumer, and of liability lawsuits aimed at the farmer. Indirect coprophagy is not a feasible option at present. The nutritional requirement for B-12, coupled with the inability to assimilate B-12 at the site where it is synthesized by bacteria in the human body (the colon), and the fact that plant foods are not reliable sources of B-12, provides evidence of long consumption of, and evolutionary adaptation to, animal foods in the human diet.

Protein Digestion: Plant vs. Animal Sources
Essential and conditionally essential amino acids
Young and Pellett [1994, p. 1204S] provide a good summary of protein requirements: The requirement for dietary protein consists of two components: 1) the requirement for nutritionally indispensable amino acids (histidine, isoleucine, leucine, lysine, methionine, phenylalanine, threonine, tryptophan, and valine) under all conditions and for conditionally indispensable amino acids (cysteine, tyrosine, taurine, glycine, arginine, glutamine, proline) under specific physiological and pathological conditions, and 2) the requirement for nonspecific nitrogen for the synthesis of the nutritionally dispensable amino acids (aspartic acid, asparagine, glutamic acid, alanine, serine) and other physiologically important nitrogen-containing compounds such as nucleic acids, creatine, and porphyrins.

Daily recommendations
The FAO/WHO/UNU (Food and Agriculture Organization, World Health Organization, United Nations University) recommend a daily average intake of protein for adults of 0.76 g per kg of body weight. Young and Pellett [1994, p. 1205S] add a caveat, however: However, there is increasing evidence that the current international FAO/WHO/UNU (11) and national (10) requirement estimates for most indispensable amino acids in adults are far too low (9, 23, 24). Several groups (25-27) are seeking to further substantiate this evidence.
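To make the per-kg figure concrete, here is a minimal arithmetic sketch in Python (the function name and the 70-kg example weight are illustrative assumptions, not values from the sources cited):

    def protein_rda_g(body_weight_kg, g_per_kg=0.76):
        """Daily protein intake (g) implied by the FAO/WHO/UNU per-kg figure."""
        return body_weight_kg * g_per_kg

    print(protein_rda_g(70))  # -> 53.2 g/day for a 70-kg adult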

Digestibility
A number of procedures have been established to measure the digestibility of proteins; for an example of protein-digestibility scoring methods and lists of the protein scores of a wide variety of foods, see FAO [1970]. Through the use of such scoring methods, one can compare the digestibility of the protein in various foods. Young and Pellett note that the level and quality of protein in some individual plant foods make them inadequate as sole protein sources. They make the interesting observation that [Young and Pellett 1994, p. 1210S]: In general, the digestibility of vegetable proteins in their natural form is lower than that of animal proteins... Boiling in water generally improves protein quality, whereas toasting or dry heating reduces protein quality.


Note that the information above--that we digest certain conservatively cooked protein foods more effectively than raw ones--challenges the claim of some raw-diet advocates that raw is always "better" than cooked. Young and Pellett [1994] provide a table (table 10, p. 1209S) showing the digestibility of various protein sources. Meat, dairy, and eggs are 94-97% digestible, and the much-maligned standard American diet is 96% digestible, but grains and beans are "only" 78-85% digestible. One is tempted to conclude that this is due to adaptation to animal foods in the diet over evolutionary time. However, that is not necessarily the case. Animal foods are in general more easily digested than plant foods for structural reasons: plant foods come encased in cellulose cell walls, which are hard to penetrate and digest. Ruminant animals such as cows, when fed animal products in their feed, have no difficulty digesting the animal protein, even though it is arguably "unnatural" for them to eat animal foods.
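As a rough worked example of what those digestibility figures mean in practice (a sketch only; the 50-g intake and the midpoint percentages are assumptions chosen for illustration, not data from the table):

    intake_g = 50.0                    # hypothetical daily protein intake
    animal_digested = intake_g * 0.95  # meat/dairy/eggs: ~94-97% digestible
    plant_digested = intake_g * 0.80   # grains/beans: ~78-85% digestible
    print(animal_digested, plant_digested)  # -> 47.5 40.0 (g actually digested)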

Protein sources in prehistoric times
Earlier sections of this paper discuss the role of animal foods in the diet during evolution. A question that arises is whether, in theory, plant foods could take the place of animal foods as a protein source in such diets. Clearly, plant foods, if available in sufficient quantity, can satisfy protein requirements. The problem, of course, is availability. Prior to the development of containers, plant foods were difficult to collect and store in quantity. Without suitable storage technology (even crude technology), plant foods will not last long (because of spoilage and insect attack). Even "tough" plant foods like nuts and acorns are subject to insect and rodent attack unless they are kept in protective storage. The preceding suggests that plant foods were available in season, but not necessarily on a year-round basis (in temperate climates), during evolution. This in turn suggests (but does not prove) that the consumption of animal foods was necessary to satisfy protein (and calorie) requirements in prehistoric times.

Protein in raw vegan diet circles
Protein is a topic of considerable interest in raw vegan circles. One can find fruitarian extremists promoting crank theories that protein (other than the small amount found in sweet fruits) is "toxic" in the sense that the by-products of protein metabolism are toxic and will harm you. Such theories are based on a pathological fear of "toxins" and an amazing ignorance of the reality that the human body is well equipped to process the by-products of protein metabolism. Further, such theories are sometimes promoted in hateful and dishonest ways (in my opinion and experience). See the article, "Is Protein Toxic?" (not yet available) for a discussion of such theories.

Taurine, a conditionally essential amino acid
Introduction. Taurine is one of the most common sulfur-based amino acids found in nature. Huxtable [1992, p. 101] provides a good introduction to the topic:
2-Aminoethane sulfonic acid, or taurine, is a phylogenetically ancient compound with a disjunct distribution in the biosphere. It is present in high concentration in algae (159, 649, 748) and in the animal kingdom, including insects and arthropods, but is generally absent or present in traces in the bacterial and plant kingdoms. In many animals, including mammals, it is one of the most abundant of the low-molecular-weight organic constituents. A 70-kg human contains up to 70 g of taurine. One is not stumbling into the abyss of teleology in thinking that a compound conserved so strongly and present in such high amounts is exhibiting functions that are advantageous to the life forms containing it.

Distribution of taurine in plant/animal kingdoms
Huxtable [1992] reports that taurine is found at high concentrations in the animal kingdom, but is missing or present in trace amounts in other kingdoms. He notes (p. 105): In the plant kingdom, taurine occurs in traces, averaging ~0.01 mcmol/g fresh wt of green tissue according to one report (424). This is <1% of the content of the most abundant free amino acids. Taurine occurs in red algae, but not brown or green algae. Taurine has been reported at levels up to 0.046 mcmol/g, wet weight, in a few plant foods (nuts). Taurine is also present, at high levels, in insects [Huxtable 1992].


Huxtable [1992] warns of the difficulty in assaying plant foods for taurine content. Due to the precision with which taurine must be measured, the taurine content of plant foods given by an analysis may be incorrect due to contamination of the sample with insects, insect parts, or animal droppings. As Huxtable says (p. 105): One wet thumb print contains 1 nmol of taurine (236). This amount may be compared, for example, with the analytical range of 0.1-0.3 nmol employed by the high-performance liquid chromatographic method used in one recent paper reporting [taurine] concentrations in plants (600). Note that reference #600 above is Pasantes-Morales [1989] as cited in Huxtable [1992]. Huxtable summarizes that taurine concentrations in plants, when taurine is present at all, are measured in nmol (billionths of a mole), whereas in animal foods taurine is present in mcmol (millionths of a mole)--a difference of three orders of magnitude, i.e., a factor of one thousand.
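To put the unit difference in perspective, a one-line arithmetic check (the 0.046 figure is the highest plant level quoted above):

    NMOL_PER_MCMOL = 1000            # 1 mcmol = 1,000 nmol
    plant_best_mcmol_per_g = 0.046   # best reported plant source (nuts), mcmol/g
    print(plant_best_mcmol_per_g * NMOL_PER_MCMOL)  # -> 46.0 nmol/g, vs. mcmol/g levels in animal tissue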

Taurine: selected functions and interactions

Taurine is a very important amino acid involved in a large number of metabolic processes. Huxtable provides a lengthy list of the biological functions provided by taurine [1992, Table 1, p. 102]. Taurine is important in the visual pathways, the brain and nervous system, and cardiac function, and it is a conjugator of bile acids. Another important function of taurine is as a detoxifier. Gaull [1986, p. 123] notes: Retinol [vitamin A] in excess amounts, i.e., unbound to retinol-binding protein, can act as a poison. When the long-term lymphoid cell lines are exposed to 10 mcM retinol, their viability decreases strikingly over a 90-minute period [18]. Addition of zinc improves the viability slightly. Further addition of taurine protects the cells even more. If a combination of zinc and taurine is added, there is a striking protective effect... Note that the above suggests that taurine and zinc, both found in animal foods, provide protection from excess vitamin A--a vitamin found in full form only in animal foods. This is an interesting synergism, to say the least. Yet another zinc/taurine interaction is mentioned by Huxtable [1992, p. 129]: Zinc is another metal ion with which taurine interacts. Zinc deficiency leads to increased excretion of taurine (277). Inasmuch as zinc is a mineral in relatively low supply (in terms of quantities and/or bioavailability) in raw/vegan diets, the above raises interesting questions about the possibility of yet another zinc/taurine synergism.

Is taurine essential?

Taurine is a conditionally essential amino acid in adult humans; it is an essential amino acid for infants (mother's milk contains taurine). Gaull [1982, p. 90] discusses the need for taurine: The finding of a dietary requirement for taurine in the human infant is consistent with the negligible activity of cysteinesulfinic acid decarboxylase present both in fetal and in mature human liver (4)... In adult man only 1% of an oral load of L-cysteine was recovered as increased urinary excretion of taurine, giving further evidence that mature human beings also have a relatively limited ability to synthesize taurine and may be largely dependent on dietary taurine (22). The rat, in striking contrast, has considerable ability to convert dietary cysteine to taurine (17). See also Gaull [1977] as cited in Gaull [1982, 1986] for more information on taurine synthesis in humans vs. other animals.

Taurine levels in vegans
Laidlaw et al. [1988] studied the taurine levels (in urine and plasma) of a group of vegans (staff members of a Seventh Day Adventist college) compared to a control group (non-vegans, standard American diet).

Taurine levels lower in vegans. Laidlaw et al. report [1988, pp. 661-663]:


The results of the present study indicate that the plasma taurine concentrations in the vegans were significantly reduced to 78% of control values. Urinary taurine was reduced in the vegans to only 29% of control values... These findings suggest there may be a nutritional need for taurine and that plasma levels and urinary excretion fall with chronically low taurine intakes. Possibly the diet of the vegans was low in metabolic substrates or in cofactors for taurine synthesis... Although taurine is synthesized in humans, the current study suggests that the rate of synthesis is inadequate to maintain normal plasma taurine concentrations in the presence of chronically low taurine intakes. It is possible that a higher cysteine intake could increase taurine synthesis in the vegans... Long-term adherence to a strict vegetarian diet may lead to clinical manifestations of taurine deficiency. Citing Sturman et al. [1984], Laidlaw et al. [1988] report that neonatal primates developed abnormal eye function after being fed a taurine-free diet.

Section summary and synopsis

Taurine is plentiful in nature, except in the plant kingdom, and is required in the metabolism of all mammals. Cats, obligate carnivores, have lost the ability to synthesize taurine, which implies a long evolutionary dependence on foods that contain taurine. In humans, the ability to synthesize taurine is apparently limited. Animals that are more herbivorous (e.g., the rat) have a greater ability to synthesize taurine than humans do (the rat synthesizes taurine with three times greater efficiency; see Gaull [1986]). The inefficiency of human taurine synthesis (compared to that of more herbivorous animals) suggests a long dependence on a diet that includes taurine, i.e., a diet that includes fauna.

Vitamin A and Beta-Carotene
The primary emphasis of this section is on the low conversion rates of beta-carotene to vitamin A (i.e., low bioavailability) and its possible implications regarding humanity's natural diet.

Introduction
Biesalski [1997, p. 571] provides an excellent introduction to the subject of vitamin A: The term vitamin A is employed generically for all derivatives of beta-ionone (other than the carotenoids) that possess the biological activity of all-trans retinol. Retinal [note slightly different spelling] is active in vision and is also an intermediate in the conversion of retinol to retinoic acid. Retinoic acid acts like a hormone, regulating cell differentiation, growth, and embryonic development. Animals raised on retinoic acid as their sole source of vitamin A can grow normally, but they become blind, because retinoic acid cannot be converted to retinol... Vitamin A from animal sources is ingested as dietary retinyl esters which are hydrolyzed to retinol by the lipases of the intestinal lumen... Human plasma contains about 2 mcmol/L retinol bound to retinol binding protein (RBP) and 5-10 nmol/L retinoic acid presumably bound to albumin (Blomhoff et al. 1990). Vitamin A is stored in the liver, and released into the blood attached to RBP, as mentioned in the above quote.

Units of measurement. Discussions of vitamin A and beta-carotene use two different measures: IU
(international units) and RE (retinol equivalents). The conversion equivalents are:


1 IU = 0.3 mcg retinol = 0.6 mcg beta-carotene = 1.2 mcg other carotenoids

and:

1 RE = 1 mcg retinol = 6 mcg beta-carotene = 12 mcg other carotenoids

We will see later that the above measures are controversial because they imply a conversion efficiency that is higher than commonly encountered.
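As a small sketch of the stated conversion factors (Python; the table and function names are illustrative, not standard nomenclature):

    # mcg of each vitamin A source equivalent to 1 RE, per the factors above
    MCG_PER_RE = {"retinol": 1.0, "beta-carotene": 6.0, "other carotenoids": 12.0}

    def retinol_equivalents(mcg, source):
        """Convert mcg of a vitamin A source to retinol equivalents (RE)."""
        return mcg / MCG_PER_RE[source]

    print(retinol_equivalents(6000, "beta-carotene"))  # -> 1000.0 RE
    print(retinol_equivalents(1000, "retinol"))        # -> 1000.0 RE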

Beta-carotene: an introduction
Wang [1994, pp. 314-315] provides an introduction to the topic of beta-carotene: Beta-C is a symmetrical molecule with several conjugated double bonds on its polyene chain... Vitamin A activity is an important function of beta-C. Beta-C also has a variety of functions including enhanced immune response [2], antioxidant function [3,4], enhancement of gap junction communication [5,6], induction of carcinogen-metabolizing enzymes [7], and photoprotection [8]. Beta-C is one of the most important food components for which there is strong evidence for an anticancer role from epidemiological studies in human populations [9-11] and studies in animal model systems [12]...

It should be pointed out that many experimental studies on the anticancer activity of beta-C have been confounded by the poor absorption and low tissue levels of carotenoids in the rodent models used for such studies.

Beta-carotene metabolism

The metabolism of beta-carotene is very complex and is not fully understood. Wang [1994]
reports that beta-carotene is converted to retinoic acid via two mechanisms: central cleavage and excentric cleavage. Wang et al. [1992] demonstrated the production of retinoic acid (RA) and retinol from beta-carotene in human intestinal homogenates (in vitro). They noted (p. 303): The production of RA and retinol in our human intestinal incubations is much lower than that reported by Napoli and Race in rat tissues (10), and only trace amounts of retinol were formed during both beta-carotene and beta-apocarotenal incubations. Note: Wang et al. [1992] used intestinal homogenates in their research; see Parker [1996] for concerns about the reliability of results from cell homogenates. Note also that rats (natural herbivores) produce more RA and retinol than humans, while cats (obligate carnivores) are unable to synthesize RA or retinol from beta-carotene at all. That is, humans fall "in between" a carnivore and an herbivore in this capability. This is in line with the idea that humans are faunivores/omnivores adapted to a diet that includes some fauna (hence a diet that includes some fully formed vitamin A).

The full significance of the discovery of the cleavage of beta-carotene to retinol in the intestines is not clear, as the post-absorptive metabolism (sites, degree) is unknown [Parker 1996,
abstract]. Yet another confounding factor is that some ingested beta-carotene and vitamin A may be destroyed in the gut, precluding their use (from Goodman et al. [1966], Blomstrand and Werner [1967], both as cited in de Pee and West [1996]).

Bioavailability of beta-carotene
Solomons and Bulux [1997, p. S45] discuss the relative bioavailability of vitamin A from beta-carotene: Herbivores have lesser hepatic [liver] reserves of vitamin A than do carnivores when both meet their energy requirements. Human vegetarians show a somewhat parallel relationship in terms of retinol status when compared to omnivores.... In most of the literature's intervention studies, moreover, there was no improvement in indicators of retinol or vitamin A-nutriture status when a plant source of carotene was fed continuously over periods from months to years.

Bioavailability is highly variable. Brown et al. [1989] compared the efficiency of absorption of beta-carotene from supplements, carrots, broccoli, and tomato juice. They found that the supplements and carrots increased plasma carotenoids, but not broccoli or tomato juice. They also found wide differences in absorption rates (carotenoid utilization) among individual subjects. A similar study by de Pee et al. [1995] in Indonesia found (p. 75): There is little evidence to support the general assumption that dietary carotenoids can improve vitamin A status. Solomons and Bulux [1997] apparently concur; they note that (p. 45): The findings...suggest that provitamin A compounds are converted less efficiently than the currently accepted factors of 1:6 and 1:12 would suggest. Parker [1996] discusses the many difficulties involved in measuring beta-carotene absorption. He notes that many studies do not take into account the intestinal conversion of beta-carotene to retinoids other than vitamin A (retinol). Parker [1996] cites the study by van Vliet et al. [1995] as an exception, which found an absorption efficiency of 11% for beta-carotene supplements as measured in plasma [blood]. As most studies show higher absorption rates for supplements than for beta-carotene in food, this would suggest an actual absorption rate well below 11% for food items. (Note that this result is from only one study, and one should be cautious in making inferences from a single study.)

Poor design of beta-carotene conversion studies. A review article by de Pee and West [1996] summarizes the studies on the conversion of beta-carotene to vitamin A (p. S38): Many experimental studies indicating a positive effect of fruits and vegetables [on vitamin A status] can be criticized for their poor experimental design while recent experimental studies have found no effect of vegetables on vitamin A status. Thus, it is too early to draw firm conclusions about the role of carotene-rich fruits and vegetables in overcoming vitamin A deficiency. Solomons and Bulux [1997] note that beta-carotene serves as a "sunscreen" for plants and as an antioxidant, and that it may serve similar functions in those who consume the plant [Jukes 1992 and Krinsky 1992, both as cited in Solomons and Bulux 1997]. This indicates competing uses for beta-carotene: its potential use as a carotenoid vs. cleaving it to create vitamin A.

The bioavailability of beta-carotene is influenced by a number of factors. These are discussed in de Pee and West [1996], de Pee et al. [1995], and Parker [1996]. We will consider only 3 factors of interest here:

• Heating/conservative cooking increases the bioavailability of beta-carotene in carrots [de Pee et al. 1995; Parker 1996]. This is significant because it conflicts with the idea, promoted by raw vegan advocates, that raw foods are always best.

• Eating vegetables with fat can increase carotene absorption, as beta-carotene is a lipid, i.e., a fat; see Parker [1996, pp. 543-544] for a detailed discussion on this point. This is of interest because there are fruitarian extremists who promote the crank science/science fiction claim that all (or nearly all) fat is "toxic." (Apparently they have never heard of the concept of essential fatty acids, or refuse to acknowledge it.)

• Genetics. From de Pee and West [1996, p. S49]: [T]here is evidence that some people have a genetic defect which renders them unable to convert beta-carotene to retinol [Blomstrand & Werner, 1967; McLaren & Zekian 1971]. Obviously, someone who cannot convert beta-carotene to retinol must rely on animal food sources for all vitamin A. Such a genetic defect raises the interesting question of whether such a genetic change increases, or decreases, evolutionary survival rates.

Vitamin A and beta-carotene: synopsis
The complexity of vitamin A/beta-carotene metabolism, coupled with the apparently inefficient conversion of beta-carotene to active vitamin A, makes it difficult to reach firm conclusions. However, the number of vegans who use no animal products or supplements, yet display no symptoms of vitamin A deficiency, suggests that the conversion of beta-carotene to vitamin A, though inefficient and sluggish, is adequate to prevent deficiency. At the same time, however, the low efficiency of that conversion (when compared to that of the rat, an herbivore) suggests preferential use of, and some degree of evolutionary adaptation to, diets containing preformed vitamin A, i.e., those including animal foods.

MINERALS (IRON AND ZINC)
Preface. This section will address the issue of the low bioavailability of certain minerals in the vegan diet. The narrower the diet, the greater the risk of deficiency--a point of real concern for some raw vegans. By way of introduction, Freeland-Graves [1988, p. 861] notes: The fact that vegetarian populations have no gross mineral deficiencies suggests that the mineral status of most vegetarians is probably adequate. Note that most vegetarian populations are not strictly vegan, and particularly not raw vegan; thus, as mentioned, the narrower the diet, the more likely the bioavailability issue is to come into play.

Iron (Fe)
Introduction
Iron is an essential nutrient. Iron occurs in two forms in foods: heme ("organic") and non-heme ("inorganic"). (Note that the terms organic and inorganic are in quotes because the usage is figurative rather than literal.) NRC [1989] suggests a daily RDA/RDI of 15 mg/day for adults. Hurrell [1997] provides a good overview for iron (p. S4): The body requires iron (Fe) for the synthesis of the oxygen transport proteins haemoglobin and myoglobin, and for the formation of haem enzymes, and other Fe-containing enzymes, which participate in electron transfer and oxidation-reduction reactions... The body regulates iron homeostasis by controlling absorption and not by modifying excretion as with most other metals; absorption is increased during deficiency and decreased when erythropoiesis [red blood cell production] is depressed. The body has a limited capacity to excrete Fe, and that in excess of needs is stored... All non-haem food Fe is assumed to enter a common Fe pool in the gut lumen (Cook et al, 1972), from where absorption is strongly influenced by the presence of other food components [inhibitors]... As little or no Fe is excreted, absorption is synonymous with bioavailability. In general, heme iron (found in significant quantities only in animal foods) is absorbed much more readily than the non-heme iron in plant foods. In animal foods, the iron is approximately 40% heme [NRC 1989, p. 198]. However, Hurrell [1997], citing MacPhail et al. [1985], reports that in lean meat, heme iron may vary from 30-70% of total iron. The remainder of the iron in animal foods, and effectively all of the iron in plant foods, is non-heme. The NRC reports [1989] that the absorption of non-heme iron may vary by a factor as high as ten, depending on the presence/absence of inhibitors or enhancers of iron absorption.
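To illustrate the ~40% heme figure with a quick sketch (the 3-mg serving value is a hypothetical number chosen for illustration, not a figure from the sources):

    total_fe_mg = 3.0                   # hypothetical iron content of a meat serving
    heme_mg = total_fe_mg * 0.40        # ~40% of the iron in animal foods is heme [NRC 1989]
    nonheme_mg = total_fe_mg - heme_mg  # the remainder is non-heme
    print(heme_mg, nonheme_mg)          # -> 1.2 1.8 (mg); non-heme absorption can then
                                        # vary ~10-fold with inhibitors/enhancers present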

Factors that inhibit iron absorption
The factors that inhibit iron absorption are as follows.

• Phytates/phytic acid (myo-inositol hexaphosphate). Common in grains and legumes; sharply reduces the bioavailability of iron. The phytate content of foods is reduced, but not eliminated, by milling and some common processing techniques. Of interest to raw vegans, Hurrell [1997, p. S5] notes: Some traditional processes such as fermentation, germination, and even soaking can activate phytases in cereal grains or flours, which then degrade phytic acid and improve Fe absorption (Sandberg and Svanberg, 1991).

• Polyphenols. Including phenolic acids, tannins, and flavonoids; common in tea, coffee, chocolate, and wine.

• Calcium (Ca). Inhibits iron absorption when given in inorganic (supplement) form, or in dairy. The effect of Ca depends on the composition of the meal and the level of Ca it contains. Mixed or complex meals (i.e., meals of several food items) generally do not inhibit Fe absorption. In contrast, Ca may inhibit Fe absorption in mono-meals (the practice of mono-eating is recommended by some raw vegans), depending on the food eaten.

• Fiber. Freeland-Graves [1988], citing Bindra et al. [1986], reports that dietary fiber might inhibit Fe absorption. Hurrell [1997], citing Rossander et al. [1992], reports that fiber has little effect on Fe absorption. It appears that the effect of fiber on Fe absorption has not yet been resolved.

• Protein. Proteins can inhibit or enhance iron absorption depending on the type of protein. Proteins are converted into peptides in digestion, and these can bind Fe. Legume proteins (including soy), milk casein, and egg albumin can bind iron.

• Nuts. Nuts can sharply inhibit the absorption of iron in a meal; the inhibition can be overcome, however, by the addition of vitamin C to the meal (Craig [1994]).

Factors that enhance iron absorption
• Muscle tissue. This brings us back to protein again. In contrast to plant protein, the presence of animal muscle (meat) sharply increases the absorption of Fe. This may be due to the level of cysteine-based proteins in meat.

• Ascorbic acid (vitamin C) and other organic acids. A good absorption enhancer in supplement form and also in the form of fruits and vegetables. Hurrell [1997], citing Hallberg et al. [1989], reports that at sufficiently high concentrations, the iron-enhancement effect of ascorbic acid can overcome the inhibiting effect of phytic acid in grains. Craig [1994] reports that other organic acids (citric, malic, tartaric, and lactic) can also enhance iron absorption.

Iron deficiency in veg*ns
Freeland-Graves [1988, p. 861] notes: Dwyer et al (28) reported mild Fe deficiency in 25% of preschool vegetarian children despite normal intakes of Fe. In adult lacto-ovo-vegetarians, Bindra and Gibson (30) also found a high prevalence of Fe deficiency with normal dietary Fe intakes. In new vegans (32) low Fe stores as measured by serum ferritin were found in 27% of the females, and 10% had values <10 mcg/L. Low serum ferritin has also been observed in lacto-ovo-vegetarian students (33). The above quote indicates that at least some veg*ns are low or deficient in iron. Craig [1994] cites four studies that found vegetarians had lower serum ferritin levels, and reduced iron stores, when compared to omnivores. In contrast to the preceding, Craig [1994] cites four other studies that did not find differences in iron levels, comparing vegetarians against non-vegetarians. Craig [1994] concludes that a balanced veg*n diet can supply adequate iron. However, Craig suggests a diet that includes grains, nuts, and seeds, as well as fresh fruits and vegetables. The idea is that the vitamin C in the fresh fruits and vegetables will counteract the iron inhibitors in the grains and seeds, hence make the iron (in the grains and seeds) available for absorption. From an evolutionary perspective, such a diet is of limited relevance: grains are available in quantity only via agriculture, and nuts and seeds were available only seasonally prior to the development of containers and food storage techniques.

From iron deficiency to iron overload: the distribution of hereditary hemochromatosis


(Acknowledgment: This topic was introduced to me via a posting made by Gary Ditta to the Paleolithic Diet email list, paleodiet@maelstrom.stjohns.edu, on 4/30/98.)

Introduction and incidence. Hereditary hemochromatosis (HH) is a genetic "iron overload" disease, "the most common inherited metabolic disease among the white [European descent] population worldwide" (from Bonkovsky et al. [1996], abstract). In HH, the body absorbs excess iron and stores it in the major organs (heart, pancreas, liver, etc.). This can eventually cause cirrhosis of the liver, diabetes mellitus, and cardiac complications [Niederau et al. 1994]. HH is chronic and progressive, and is treated by phlebotomy--blood-letting. The estimated incidence of HH varies from study to study. Bonkovsky et al. [1996] and Feder et al. [1996] report a gene frequency of 10%. Citing a variety of studies, Niederau et al. [1994] report a gene frequency of 2.3-10.7%, a frequency of heterozygotes (those with just one copy of the gene) of 4.3-19.1%, and a frequency of homozygotes (those having two copies of the gene) of 0.074-1.16%. The full disease occurs only in affected homozygotes, though a small number of heterozygotes may exhibit abnormal iron metabolism.

Hemochromatosis and diet: a hypothesis
The high incidence of HH has some potential implications regarding diet. The gene appears to reduce survival, yet it has become relatively common and widespread among those of European (Celtic) ancestry. Consider the following hypothetical analysis.

• For 99% of the time since the inception of the species, humans have lived as hunter-gatherers, eating a diet that includes animal foods rich in easily absorbed iron. Under such circumstances, the gene responsible for HH would not survive for long, as the hunter-gatherer diet is rich in animal foods and heme iron, and the HH gene (genetic mutation) would sharply reduce survival of those unlucky enough to have it.

• In the most recent 1% of our history, humans developed agriculture, stopped being hunter-gatherers, and switched from a diet rich in easily absorbed iron to a diet based on grains--low in bioavailable iron and high in phytates (iron inhibitors). Under the new circumstances, a genetic mutation that increases iron absorption (e.g., the gene associated with HH) would occur in an environment where it actually enhances survival. Under those circumstances, such a gene would both survive and spread.

• Modern times arrive, bringing large-scale agriculture and agricultural technology, industrialization, and greatly increased wealth. Over a very short period of time, the meat/animal-food content of the diet increases substantially from what it was 100 years ago (if less than what it was prior to the development of agriculture). However, those with the HH gene who eat a diet rich in animal foods are now at increased risk of disease, because their bodies absorb "too much" iron from the recent dietary change that allows them to eat far more meat than their grandparents could afford. That is, their bodies have partially adapted (via the HH gene) to a diet in which iron is of very low bioavailability.

The above hypothesis explains the incidence of HH, and it may also serve as evidence, in selected populations, of partial, limited adaptation to the "new diet"--i.e., the high-carbohydrate grain diets provided by agriculture. Contrast the above--possible evidence of limited genetic adaptation to dietary changes in the approximately 10,000 years since agriculture began--with the unsupported claims made by certain raw/veg*n advocates that humans could not, and did not, adapt to a diet that includes animal foods, after ~2.5 million years. Reminder: the above is a hypothesis, presented for discussion.

Hemochromatosis: supplementary notes

Iron overload diseases in Africa. Iron overload is also common in Ethiopia and among the Bantu peoples of South Africa. Hemochromatosis is not limited to hereditary transmission; it can occur any time iron is consumed in great excess. Hennigar et al. [1979] report that iron overload among the Bantu is due to consuming large amounts of a local alcoholic beverage brewed in iron pots. Wapnir [1990, p. 101] mentions that iron overload in Ethiopia and among the Bantu is due to soil type and the use of iron pots and utensils.

Heme iron receptors in the human intestine

Heme and non-heme iron receptors. Both heme iron and non-heme iron are absorbed via intestinal receptors, though the absorption pathways are different. Heme iron is absorbed via specific intestinal receptors (i.e., the receptor function is limited to absorption of heme iron). The heme iron receptors are in the brush borders of the intestinal mucosa, with the duodenum having the most receptors. For confirmation of the existence of the heme receptors, see Tenhunen et al. [1980], Grasbeck et al. [1979], and Conrad et al. [1967]; for a related study on pig intestine, see Grasbeck et al. [1982]. After the initial absorption process has taken place, the iron is removed from the heme and enters the non-heme iron pool [Wapnir 1991, p. 106]. The human intestine also has receptors for non-heme iron, but there appear to be multiple receptors, and the transfer proteins involved are general and may also absorb some metals other than iron. See Davidson and Lonnerdahl [1989] for discussion of one of the types of non-heme receptor and its involvement in the absorption of manganese. As well, Davidson and Lonnerdahl cite other studies proposing that the same receptor is also involved in the absorption of zinc. Wapnir [1991, pp. 106-107] provides an overview of the various transfer proteins involved in absorption of iron and other metals.

Heme iron receptors: an adaptation to animal foods in the diet. The information that the human
intestine has receptors specifically for absorption of heme iron is significant. Heme iron is found almost exclusively in animal foods. Plant foods may contain heme iron as well, as cytochrome c, a protein found in plants, reportedly contains a heme group. However, the level of heme iron in plants is extremely low, and not nutritionally significant. (Most of the standard references, such as NRC [1989, p. 198], report that all of the iron in plants is in non-heme form). Given that plant foods contain effectively no heme iron, and that heme iron is found only in animal foods, the presence of intestinal receptors in humans that are specific to heme iron is strong evidence of evolutionary, physiological adaptation to animal foods in the diet.

Heme iron: a quote out of context. The following quote has been used to suggest that plant foods contain significant amounts of heme iron; from Wardlaw [1999, p. 513]: The heme iron in the meat, poultry, fish, dry beans, eggs, and nuts group is especially well absorbed. However, examination of Wardlaw [1999] reveals the following.

• The quote is taken out of context; it is part of the caption for Figure 14-4 on page 513, which depicts the standard food pyramid--which, of course, groups food sources together instead of treating individual food items individually.

• Wardlaw [1999, p. 507] provides definitions of heme and non-heme iron. In those definitions, heme iron is identified as coming from animal tissues (implicitly, only from animal tissues).

Taken out of context, the above quote from Wardlaw [1999] could be used to suggest that beans and nuts are good sources of heme iron. A closer examination of the material, in context, reveals otherwise. In the quote, Wardlaw is discussing a food group for which 4 out of 6 members are animal foods. Being animal foods, those 4 foods contain heme iron; hence the group, considered as a whole, provides heme iron, so long as one eats a balance of foods from the group. More precisely, eating a balance of foods from a group means that one eats, over an appropriate period of time, some of every food type in the group. Such a strategy would ensure the consumption of heme iron from the animal foods in the group. Finally, note that eating a balance of foods from the food groups is "standard" advice in nutrition, and this context should be taken into account when interpreting statements based on such sources.


Iron: synopsis
The existence of receptors in the human intestine that are specific for the absorption of heme iron, which is found in nutritionally significant amounts only in animal foods, coupled with the evidence of the low bioavailability of iron in plant foods, provides evidence of evolutionary, physiological adaptation to a diet that includes animal foods.

Zinc (Zn)
Zinc is an essential nutrient involved in many enzyme pathways in the body. The human body
maintains a small pool of zinc (mostly in bone and muscle), with a fast turnover rate (deficiency symptoms appear quickly in test animals). The (U.S.) RDA/RDI for zinc is 15 mg/day for adult men, 12 mg/day for adult women [NRC 1989]. Sandstrom [1997] summarizes the functions of zinc, and how homeostasis is maintained (p. S67): Its biochemical function is an essential component of a large number of zinc dependent enzymes participating in the synthesis and degradation of carbohydrates, lipids, protein and nucleic acids... At low and excessive intakes changes in urinary and skin losses contribute to maintain homeostasis. The primary dietary sources for zinc are animal foods and, to a lesser extent, grains. The bioavailability of zinc in animal products is higher than in grains [Inglett 1983, as cited in NRC 1989].

Zinc bioavailability (inhibitors) and interactions

Zinc may be low in vegan diets. Freeland-Graves [1988] summarizes the issue of zinc concisely (p. 859): Vegetarian diets have the potential to be limited in Zn because foods that are considered to be the best sources of this mineral, such as meats, poultry, and seafood, are excluded (1). In addition, these diets contain copious quantities of fiber, phytate, and oxalate, compounds that have been found to bind to minerals and reduce bioavailability.

Comments on zinc inhibitors and interactions are as follows.

• Phytic acid is a strong inhibitor, and phytate is present at "high" levels in many grains. Sandstrom [1997], citing Rossander et al. [1992], reports that zinc absorption is usually below 15% from meals that are high in phytate. The inhibiting effect of phytic acid in grains is counteracted by including animal proteins in the meal [Sandstrom et al. 1980, as cited in Sandstrom 1997].

• Oxalate in high-fiber diets may reduce zinc bioavailability [Kelsay et al. 1983, as cited in Freeland-Graves 1988]. This could be a real concern for raw vegans.

• Soy protein may reduce zinc bioavailability [Freeland-Graves 1988].

• Large amounts of non-heme iron in a veg*n diet may reduce Zn bioavailability [Solomons and Jacob 1981, as cited in Freeland-Graves 1988].

• Calcium and phytate in combination. The combination of high calcium intake plus high phytate intake may decrease Zn bioavailability [O'Dell 1984, as cited in Freeland-Graves 1988].

Zinc deficiency in veg*ns

Low zinc levels in vegans. Freeland-Graves [1988, pp. 859-860] notes:


In a study of 79 vegetarians, our laboratory (9) observed that only female vegans had low dietary levels of Zn. These low levels were attributed to heavy reliance on fruits, salads, and vegetables, foods that are poor sources of Zn... Levels of Zn in salivary sediment were significantly lower in vegetarians than in nonvegetarians, and vegans exhibited the lowest levels.

The first remark above, about reliance on fruits and vegetables, is relevant to those raw vegans, mostly fruitarians, who avoid grains/sprouts. It suggests that such individuals are--not surprisingly--at higher risk of Zn deficiency. Many fruitarians report (anecdotal evidence) a "light" or "euphoric" mental feeling that may actually be a symptom of zinc deficiency (a feeling often mistaken for a "spiritual" feeling). Some advocates of fruitarianism claim the "euphoric" mental feeling is a benefit of the diet. Additional relevant anecdotal evidence is that some raw vegans and fruitarians report loss of libido, which may also be a symptom of zinc deficiency. The possibility that one of the most hyped effects of the fruitarian diet--the allegedly "euphoric" mental state--may be a Zn deficiency is, of course, a hypothesis. However, it is a plausible hypothesis (based on the available scientific and anecdotal data), and it deserves further research.

Side note. As a former fruitarian who has personally experienced the "light" or "euphoric" mental effect of a fruit diet, and who has also done some spiritual work, I can corroborate that, at least for me, the "light" or "euphoric" feeling of a fruit diet is not a "spiritual" feeling. Rather, the effect of the fruit diet is more like a mild drug or sugar high--an airy, apathetic feeling (lassitude).

Zinc: synopsis

That the bioavailability of Zn is higher in animal foods than in plant foods may suggest that humans are better adapted to the digestion and assimilation of animal foods (in contrast, many plant foods contain zinc inhibitors). Also, specific symptoms commonly reported by fruitarians (anecdotal evidence), i.e., the "light" mental feeling and loss of libido, suggest the possibility that zinc deficiency may be widespread among fruitarians. This is an area where research is needed.

Essential Fatty Acids (EFAs) (1 OF 2)
Introduction
Fatty acids serve a number of important functions. As specified in Nettleton [1995]:

• Some are essential nutrients, e.g., linoleic acid.
• Some, particularly short-chain fatty acids, are digested and provide energy (calories).
• Some--certain long-chain fatty acids--are components of cells and membranes.

The primary fatty acids of interest here are, first, linoleic acid (an n-6 acid) and alpha-linolenic acid (an n-3 acid). Second, given the presence of suitable enzymes, linoleic and/or alpha-linolenic acid can be converted to arachidonic acid (AA), eicosapentaenoic acid (EPA), docosahexaenoic acid (DHA), eicosanoids, and other important metabolic compounds. DHA is very important in the central nervous system; Jumpsen and Clandinin [1995, p. 24] report: Approximately one third of fatty acids with ethanolamine and serine phosphoglycerides in the cerebral cortex of humans, monkeys, and rats is docosahexaenoic acid (O'Brien and Sampson, 1965; Svennerholm, 1968; Neuringer and Connor, 1989).


Can synthesis of EPA and DHA from precursors maintain levels equal to obtaining them in the diet preformed? As we proceed, the central question that will be addressed here is how the
conversion of precursors such as alpha-linolenic acid (from plant foods) to EPA and DHA compares to obtaining these crucial fatty acids preformed directly from animal foods. During evolution, humans would have been dependent on obtaining EPA and DHA primarily from animal sources [Eaton et al. 1998]. Given that DHA is particularly important in growth and development of the brain, which tripled in size during human evolution, the question of whether obtaining EPA/DHA primarily through synthesis from precursors is efficient enough to maintain optimum levels is of prime interest.

Understanding the terminology for EFAs
A number of systems and abbreviations are used to denote the various fatty acids of interest here, and these can at times be confusing. This section serves to introduce the relevant systems. The two main groups or families of polyunsaturated fatty acids of interest here are the omega-3 and omega-6 families, which are written as n-3 or w-3, and n-6 or w-6. The different families of fatty acids have different chemical structures. They all consist of chains of carbon atoms with a methyl group (CH3) at one end of the chain, and an acid or carboxyl group at the other end (HO-C=O). A standard system has been established for describing the various fatty acids. An example will illustrate the principles involved in the nomenclature.

18:2n-6 Linoleic Acid (LA): The first number specifies the number of carbon atoms in the chain; here it is 18. The second number, 2, tells us how many of the carbon-carbon bonds in the chain are double bonds. The last number, 6 in this case, specifies the position of the first double-bonded carbon, counting from the methyl end of the chain. Putting the above together, letting "C" be a carbon atom, "-" a single bond, and "=" a double bond, we can crudely depict linoleic acid as:

(CH3)-C-C-C-C-C=C-C-C=C-C-C-C-C-C-C-C-(HO-C=O)
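The C:Dn-x shorthand is regular enough to parse mechanically; a minimal Python sketch (the function name is illustrative):

    import re

    def parse_fatty_acid(code):
        """Parse 'carbons:double_bonds n-omega' notation, e.g. '18:2n-6'."""
        m = re.fullmatch(r"(\d+):(\d+)n-(\d+)", code)
        if not m:
            raise ValueError(f"unrecognized fatty-acid code: {code}")
        carbons, double_bonds, omega = (int(g) for g in m.groups())
        return carbons, double_bonds, omega

    print(parse_fatty_acid("18:2n-6"))  # -> (18, 2, 6): linoleic acid
    print(parse_fatty_acid("22:6n-3"))  # -> (22, 6, 3): DHA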

Saturated vs. unsaturated fats. A fatty acid is saturated if it contains the maximum number of
hydrogen atoms possible, i.e., it has no double bonds between adjacent carbon atoms. A fatty acid is unsaturated if it is not saturated, that is, if it has one or more double bonds between adjacent carbon atoms. Both linoleic and alpha-linolenic acids contain double carbon atom bonds, hence are unsaturated. (Note that the depiction of linoleic acid above is a summary and does not show all the hydrogen atoms.)

Abbreviations: an "alphabet soup" of terms. A sometimes-confusing array of numbers,
abbreviations, and names are used for the various fatty acids and related terms. The following table is provided as an aid in understanding, and as a quick reference for fatty acids of interest here. The term "precursor" below refers to which fatty acid (FA) is a precursor in synthesizing another fatty acid. The precursors shown below are the ones of interest here, and are not necessarily the only precursors, and in some cases are actually intermediate precursors.

IMPORTANT OMEGA-3 (n-3) AND OMEGA-6 (n-6) POLYUNSATURATED FATTY ACIDS

OMEGA-3 FAMILY
Numeric Designation    Fatty Acid Name                          Abbreviation   Can Be Made From (Precursor)
18:3n-3                Linolenic acid (alpha-linolenic acid)    LNA or ALA     [start of n-3 pathway]
20:5n-3                Eicosapentaenoic acid                    EPA            LNA/ALA
22:6n-3                Docosahexaenoic acid                     DHA            EPA

OMEGA-6 FAMILY
Numeric Designation    Fatty Acid Name                          Abbreviation   Can Be Made From (Precursor)
18:2n-6                Linoleic acid                            LA             [start of n-6 pathway]
20:4n-6                Arachidonic acid                         AA             LA
22:5n-6                Docosapentaenoic acid                    DPA            AA

The table above is based on information from Jumpsen and Clandinin [1995], and Nettleton [1995]. Note that EPA and DHA are synthesized in a pathway that starts with LNA (all three of which are n-3 FAs), while AA and DPA are synthesized in a pathway that starts with LA (all n-6 FAs). The primary focus here will be on the n-3 family and its longer-chain derivatives, EPA and DHA.

Additional terms of interest.

• Short-chain: a FA with 8 or fewer carbon atoms.
• Medium-chain: intermediate between short and long, 9 to 12 carbon atoms.
• Long-chain: a FA with more than 12 carbon atoms.

Requirements for EFAs

Linoleic acid: a minimum of 1 to 3+% of calories. Only small amounts of linoleic acid (an n-6 acid) are required. An RDA/RDI has not been formally adopted; however, linoleic acid at 1-2% of calories will prevent deficiency. NRC [1989] recommends a minimum intake of linoleic acid, for adults, of 3-6 g/day. The comments of Jumpsen and Clandinin [1995, p. 29] are appropriate here: ...[B]ecause competition exists among the fatty acids for desaturating enzymes (Brenner, 1981), a level of at least 3% of energy should be met by n-6 fatty acids (FAO, 1977)... Uauy et al. (1989) recently suggested that the recommendation of 3.0% of total energy is adequate to prevent clinical signs of deficiency but may be insufficient to ensure functional and biochemical normalcy. FAO [1995], a joint publication with the United Nations World Health Organization (WHO), recommends linoleic acid consumption in the range of 4-10% of total energy. They specifically recommend consumption of linoleic acid at the 10% level when total intake of saturated fatty acids is high. Note that the FAO/WHO-recommended consumption level is higher than the suggested minimums.
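Translating "percent of energy" into grams per day helps reconcile these figures with the NRC 3-6 g/day minimum; a worked sketch (the 2,000-kcal diet is an assumption for illustration; 9 kcal/g is the standard energy density of fat):

    KCAL_PER_G_FAT = 9.0

    def linoleic_g_per_day(total_kcal, percent_of_energy):
        """Grams/day of linoleic acid at a given percent of total energy."""
        return total_kcal * (percent_of_energy / 100.0) / KCAL_PER_G_FAT

    print(round(linoleic_g_per_day(2000, 3), 1))   # -> 6.7 g/day at the 3% level
    print(round(linoleic_g_per_day(2000, 10), 1))  # -> 22.2 g/day at the 10% level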

Alpha-linolenic acid recommendation: 0.5% of calories. In reference to alpha-linolenic acid (an n-3 acid), Jumpsen and Clandinin [1995, p. 21] note: ...[A] dietary requirement for 18:3n-3 [alpha-linolenic] has not been clearly established, a growing body of evidence indicates that this series of fatty acids is also essential (Lamptey and Walker, 1976; Leprohon-Greenwood and Anderson, 1986; Bourre et al., 1990a). In Canada, nutrition recommendations have established the dietary requirement level for 18:3n-3 as 0.5% of energy (Nutrition Recommendations, Health and Welfare Canada, 1990).


Neuringer et al. [1984] have demonstrated n-3 fatty-acid deficiencies in animals, and Bourre et al. [1989b] showed that a diet deficient in n-3 oils affected enzyme activity and learning abilities in rats [Neuringer, Bourre, as cited in Jumpsen and Clandinin 1995]. The preceding findings lend credence to the hypothesis that alpha-linolenic acid is an essential nutrient. Note that certain fruitarian extremists make the science-fiction claim that all, or nearly all, fats (i.e., fats above the levels found in sweet fruits) are "toxic." To date the extremists have not presented any credible evidence to support the idea that, say, avocados are "toxic" because of their fat/oil content.

Essential Fatty Acids (EFAs) (CONT., 2 OF 2)
Bioavailability of EFAs: plant vs. animal sources

Diet is the only source of n-3 fatty acids. Nettleton [1995] points out that plants can convert linoleic acid (n-6) to alpha-linolenic acid (n-3), but humans cannot. Hence, diet is the only source of n-3 fatty acids for humans. In general, plant foods provide linoleic (n-6) and alpha-linolenic (n-3) acids, while animal foods provide DHA and other preformed long-chain n-3 fatty acids, such as eicosapentaenoic acid (20:5n-3, EPA). EPA and DHA are common in aquatic animals.

THE PROBLEM: Are human enzymes that synthesize DHA and EPA from precursors adequate as a sole source?

Negative evidence suggesting DHA/EPA-synthesizing enzymes may be inadequate for complete nutrition. Although the issue of whether alpha-linolenic acid is essential for
humans is still controversial, it is clear that DHA, which can be produced by converting alpha-linolenic acid, is essential in infants (mother's milk contains ready-made DHA). Although the human body will synthesize such long-chain fatty acids from precursors in the diet when they are not directly available, the rates of synthesis generally do not match the levels achieved when these fatty acids are obtained directly from the diet. This is particularly critical in infancy, as human milk contains preformed DHA and other long-chain essential fatty acids, while plant-food-based formulas do not (unless they have been supplemented). Let's now review the research linking levels of EFAs in body tissues with EFA consumption via diet and/or supplementation.

Animal studies on impact of EFAs in diet: chickens. Anderson et al. [1990] studied the
effects of n-3 fatty acid (FA) supplementation in newly hatched chickens with an n-3 FA deficiency. Levels of DHA were measured in the brains and retinas of deficient chicks fed 4 diets containing different levels of n-6 FA. The chicks fed a diet that included DHA and EPA recovered--that is, the levels of DHA/EPA returned to normal after 3 weeks. The researchers concluded that linolenic acid as the sole source of n-3 fatty acids is inadequate, and that supplementation with DHA is advisable. Anderson and Connor [1994] provide additional insight into this research.

EFA studies on rats. Woods et al. [1996] tested the hypothesis that the "right" ratio of LA/LNA (LA = linoleic acid, an n-6 fatty acid; LNA = alpha-linolenic acid, n-3) in infant formula could support the same levels of DHA production as result from breast-feeding (in rats). A series of milk formulations with LA/LNA ratios of 10:1, 1:1, and 1:12 were tested on infant rats, and the resulting levels of DHA in the rats' brains were measured. Woods et al. [1996, pp. 691, 693] conclude that: This suggests that there is no level of LNA that is capable of supporting the [levels of] neural DHA found in the dam-reared [breast-fed by mother rats] rat pups...


If it is accepted that human essential fatty acid metabolism is not more efficient than that of the rat, it follows that infant formulas like those in our study would not support proper human nervous system accretion of LC-PUFAs [long-chain polyunsaturated fatty acids]. Woods et al. [1996] do suggest that LNA might assist DHA synthesis in the brain, but not the retina, of the rats.

EFA studies on monkeys. Lin et al. [1994] studied the effect of various n-3 fatty-acid levels in
the diet on the retinas of rhesus monkeys. Monkeys on deficient diets, when fed a diet that included DHA, recovered to near-normal levels, except for certain lipids. In the n-3 FA-deficient monkeys, docosapentaenoic acid (DPA, 22:5n-6, a long-chain derivative in the n-6 FA family) partially replaced DHA in the monkeys' retinas. However, Lin et al. [1994, abstract] describe the replacement as "functionally incomplete," suggesting that the resultant DPA levels were too low, and/or did not provide equivalent functional support for the monkeys' visual systems (in comparison to the monkeys whose diet included DHA).

EFA levels in "living foods" adherents (raw vegans). Agren et al. [1995] studied the EFA levels of vegans following the living foods diet, a vegan diet that emphasizes sprouted and fermented raw or "living" foods. The study began with 12 people who claimed to be following the living foods diet, but two were excluded because they were found not to be actually following the diet (it is a difficult and idealistic diet that few can follow successfully in the long term), and a third subject was excluded for displaying fatty-acid levels 4-5 standard deviations from the observed means (again suggesting probable non-compliance with the claimed diet). Comparison of the EFA levels in erythrocytes, platelets, and serum lipids in living-foods raw vegans vs. controls (standard Western diet) showed that the EPA levels in the raw vegans were only 29-36%, and DHA levels 49-52%, of the analogous levels in the control group. Additionally, the n-6/n-3 ratio in the raw vegans (5.55) was about double the ratio observed in the controls (2.70). (Note: Agren et al. [1995, Table 2, p. 367] report n-3/n-6 ratios; the n-6/n-3 ratios in the preceding sentence were obtained by inverting the ratios reported in Agren et al. [1995] for erythrocyte totals.)
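The ratio arithmetic in the note above is simple enough to verify directly; a minimal sketch (the values are the erythrocyte-total figures quoted above):

    # Verifying the ratio inversion described in the note above.
    # Agren et al. [1995] publish n-3/n-6 ratios; the text quotes n-6/n-3.
    n6_to_n3_raw_vegans = 5.55
    n6_to_n3_controls = 2.70

    # Inverting recovers the as-published n-3/n-6 form:
    print(round(1 / n6_to_n3_raw_vegans, 3))  # ~0.180 (n-3/n-6, raw vegans)
    print(round(1 / n6_to_n3_controls, 3))    # ~0.370 (n-3/n-6, controls)

    # The raw-vegan n-6/n-3 ratio is about double that of the controls:
    print(round(n6_to_n3_raw_vegans / n6_to_n3_controls, 2))  # ~2.06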

EFA levels in conventional vegans. Krajcovicova-Kudlackova et al. [1997] analyzed EFA levels in blood plasma, comparing several groups: vegans, vegetarians (whether lacto- or lacto-ovo-vegetarian was not specified), semi-vegetarians (those who ate fish), and a control group following the standard Western diet. The vegan group had significantly lower levels of both EPA (62.5%) and DHA (67%) compared to the control group, while the levels of EPA and DHA were not significantly different between vegetarians and the control group (computed from Krajcovicova-Kudlackova et al. [1997, Table 2, p. 367]). N-6/n-3 ratios were found to be 19.48 for vegans, 14.71 for vegetarians, and 13.07 for the control group. Note that the n-6/n-3 ratios for vegans and vegetarians differ significantly (P = 0.001) from the control (standard Western diet) group. Although the difference between the vegetarian and control groups is statistically significant, in real-world terms the size of the difference might not be very meaningful, as 14.71 and 13.07 are "close." However, the corresponding difference between the control group and the vegans, whose ratio is 19.48, is large enough that the result is likely to be meaningful in real-world terms as well.

Impact of DHA supplements on EFA levels in vegetarians. Conquer and Holub [1996]
studied the effect of DHA supplementation on the EFA levels in phospholipids, in vegetarians. (Phospholipids are compounds of lipids with alcohols, which also have a phosphate residue [Ansell 1973]. Lecithin is an example of a phospholipid [Strickland 1973]. Phospholipids are present in blood platelets, serum, and elsewhere in the human body.)


Two volunteer groups of vegetarians consumed either capsules providing 1.62 g/day of DHA, or (control group) a similar quantity of corn oil capsules (placebo, no DHA content). The DHA in the capsules was from algae rather than animal sources. It was found that consumption of DHA capsules had a dramatic effect on DHA levels in phospholipids (an increase of 225% in blood-platelet phospholipids and 246% in serum phospholipids) and on EPA levels (176% in blood-platelet phospholipids and 117% in serum phospholipids). Additionally, the consumption of DHA capsules significantly lowered the n-6/n-3 ratio after 3 weeks (from 8.9 to 3.3); ratios were 3.1-3.3 for those taking DHA, 9.5-9.9 for those on the corn oil placebo. The consumption of DHA also improved a number of biomarkers associated with heart disease (cholesterol, triglycerides; see the study for details).

EFA levels in human milk. Sanders and Reddy [1992] compared the levels of EFAs in breast milk from women who followed vegan, vegetarian, and standard Western diets (the latter serving as the control group). Levels of DHA were found to be lowest in the milk from vegan mothers (37.8% of the control-group level), highest in the milk of mothers on the standard diet, with (lacto-)vegetarians in the middle (81% of the control-group level). The study authors note [Sanders and Reddy 1992, p. S71]: The proportion of DHA in erythrocyte total lipids of infants breast-fed by vegans was 1.9% compared with 3.7% in infants fed a [cow's] milk formula containing butterfat as the sole source of fat and 6.2% in infants breast-fed by omnivores [standard Western diet] at 14 weeks postpartum. That DHA levels were lower in infants breast-fed by vegan mothers than in those fed the cow's-milk formula is somewhat surprising in light of the data that the vegan human milk contained more DHA as a percentage of energy. This may be caused by (relatively) high levels of linoleic acid (n-6) in the vegan breast milk suppressing synthesis of DHA. Sanders and Reddy [1992, Table IV, p. S74] report n-6/n-3 ratios of: 16.4 for vegans, 13.0 for vegetarians, 13.5 for controls. (Note: Sanders and Reddy report that the ratio is higher for vegans, but the statistical significance thereof is not clear in their paper.) For a related paper, see Reddy et al. [1994].

EFA levels in tissues of infants. Farquharson et al. [1992] analyzed the brain tissues of infants who died of crib death, also referred to as "sudden infant death syndrome" (SIDS). They found significantly higher levels of DHA and DPA in the cerebral-cortex tissues of breast-fed infants vs. those fed formulas (that were not supplemented with DHA; breast milk contains DHA). They note that the higher linoleic acid content of the formulas might inhibit synthesis of DHA. Carlson et al. [1986] compared the EFA levels in blood-cell phospholipids of preterm infants receiving human milk vs. those fed with formula. Feeding with preterm human milk increased phospholipid DHA levels, while feeding with formula caused a decline. Salem et al. [1996] provide evidence that infants are able to synthesize, in vivo, essential fatty acids from precursors. Also noted, however, is that the level of DHA synthesis is inadequate to meet needs in early life. Connor et al. [1996] studied the effect on DHA levels of supplementing the diet of pregnant women with fish oil capsules (high in DHA and EPA) in the latter stages of pregnancy. They found that DHA levels were higher in the blood of infants whose mothers consumed the fish oils. In the supplemented infants, DHA levels were 35.2% higher in red blood cells, and 45.5% higher in plasma, vs. the non-supplemented controls.

EFAs and infant development. Gibson et al. [1996] discuss a number of studies that show improved visual function in breast-fed infants vs. those receiving formulas. The authors relate this to the presence of DHA in breast milk, and the lack (or lower levels) thereof in formulas. Also see Hoffman et al. [1993] and Uauy et al. [1996] in this regard.


Low conversion rates of LNA to DHA. Nettleton [1995] reports that alpha-linolenic acid
(LNA) can be converted to EPA and DHA, but the efficiency of the conversion is low [Dyerberg, Bang, and Aagard 1980; Sanders and Younger 1981; as cited in Nettleton 1995]. Emken et al. [1994], as cited in Conquer and Holub [1996], report a conversion rate of LNA to DHA of ~5% in adults. Salem et al. [1996] suggest (Table 1, p. 50; see also p. 51) a minimum conversion rate of ~1% from LNA to DHA in infants (0.9 mg of DHA produced from 100 mg precursor).
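To illustrate what conversion rates this low imply in practice, the sketch below estimates how much precursor LNA would be needed to yield a given amount of DHA; the 100 mg/day DHA target is a hypothetical figure chosen only for illustration, not a recommendation from the studies cited.

    # What low LNA -> DHA conversion rates imply for precursor requirements.
    # Rates from the text: ~5% in adults (Emken et al. [1994], as cited in
    # Conquer and Holub [1996]); ~1% minimum in infants (Salem et al. [1996]:
    # ~0.9 mg DHA produced per 100 mg precursor).
    def lna_needed_mg(dha_target_mg, conversion_rate):
        """Milligrams of LNA needed to synthesize the target amount of DHA."""
        return dha_target_mg / conversion_rate

    print(lna_needed_mg(100, 0.05))   # adults: 2,000 mg LNA per 100 mg DHA
    print(lna_needed_mg(100, 0.009))  # infants: ~11,100 mg LNA per 100 mg DHA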

Flaxseed oil a suboptimal source for n-3 acids. Flaxseed oil is used by some vegans as a source of n-3 oils. However, in the study of Kelley et al. [1993], volunteers were fed a diet in which 6.3% of calories came from flaxseed oil. (Given the cost of flaxseed oil, this is a large amount and would be expensive.) The researchers observed a statistically significant increase in EPA levels in peripheral blood mononuclear cells (PBMNC) in those receiving flaxseed oil. EPA levels in serum were not affected by flaxseed oil. Regarding DHA levels, no increases (in PBMNC or serum) were seen from flaxseed oil supplementation. Gerster [1998] is a review article that looks into the issue of whether adults can adequately convert ALA to EPA and DHA. The tables in Gerster [1998, pp. 165-166, 168] list a number of studies that showed a pattern similar to the one described above--that is, flaxseed oil supplementation may increase EPA, but not DHA; see the list below.

o Studies that found no changes in DHA levels from flaxseed oil supplementation: Dyerberg et al. [1980], Singer et al. [1986], Kelley et al. [1993], Cunnane et al. [1995], Mantzioris et al. [1994], Layne et al. [1996].
o Studies that reported DHA decreased with flaxseed oil supplementation: Sanders and Roshanai [1983], Allman et al. [1995].
o Studies that reported DHA increased with flaxseed oil supplementation: Kestin et al. [1990], and Sanders and Younger [1981]. Note that the latter study found an increase in DHA for vegan subjects only, in plasma glycerides, but not in platelets (they found no differences in DHA in the non-vegan control group).

The weight of the studies above suggests that flaxseed oil does not provide efficient support for DHA synthesis.

Preformed DHA may be necessary for optimum levels. Gerster also makes the point that fish oils are superior sources for EPA and DHA, and concludes [1998, pp. 159-160]: These findings indicate that future attention will have to focus on the adequate provision of DHA which can reliably be achieved only with the supply of the preformed long-chain metabolite. For another perspective, Nettleton reports [1995, pp. 33-34]: The fact that LNA is less effective than EPA and DHA in enriching tissues with n-3 FA [fatty acids] means that foods rich in LNA, vegetables and seed oils will be less satisfactory sources of n-3 FA for human health than seafoods or other animal foods enriched with EPA and DHA. That is not to say, though, that oils such as canola and soy are not useful sources of n-3 FA... Finally, consuming oils with LNA helps reduce the amount of linoleic acid we consume. In rats, the ratio of n-3 to n-6 fatty acids consumed is more effective than the absolute amount of n-3 FA in inhibiting eicosanoid synthesis from arachidonic acid (Boudreau et al. 1991). It may be the same for people also...


Tissues that possess the enzymes needed to convert one form of n-3 FA to another will get by with available precursors, but others may require particular n-3 FA preformed. Dietary sources providing both EPA and DHA would appear to have advantages over those supplying only LNA.

Effects of balance of fatty acids. Langley [1995] reports that typical vegan diets have a high intake of linoleic acid, which can suppress the production of DHA; i.e., the typical vegan diet has a poor balance of the EFAs. The effect of the balance of fatty acids is also discussed in a number of the studies cited above.

The remarks above concerning fatty-acid balance are cogent, as a number of companies strongly promote (and some vegan "diet gurus" recommend) flaxseed and/or hempseed oils as supplements to vegans to "make sure" they are getting the right balance of fatty acids in their diet. One must wonder how "natural" vegan diets are, if exotic, expensive oils are "required" to provide the right balance of linoleic and alpha-linolenic fatty acids in a vegan diet. If vegan diets are truly "natural," one would intuitively expect it to be relatively easy, as a vegan, to get the optimal balance of fatty acids.

Leaving aside the criterion of naturalness, however, if (typical) vegan diets are held instead simply to be the "optimum," one would not expect the balance of fatty acids in a typical vegan diet to potentially suppress the synthesis of critically important EFAs, like DHA. The studies discussed above, however, raise serious questions on this point. In addition, it appears that despite flaxseed oil supplementation, for example, it may still be difficult to achieve the proper level of DHA. The studies above also show a strong link between DHA in the diet and DHA levels in body tissues, with vegans having very low levels of DHA when compared to those whose diet includes preformed DHA. This raises the question (discussed below) of whether the levels of DHA and EPA synthesis are optimal or merely adequate to prevent deficiency symptoms.

Positive evidence suggesting DHA/EPA-synthesizing enzymes may be adequate for complete nutrition. Okuyama [1992] presents limited evidence that the desaturation-elongation enzyme activity is adequate to supply long-chain fatty acids such as EPA (20 carbons) and DHA (22 carbons). Okuyama [1992, p. 170] reports: Finally, it should be emphasized that strict vegetarians ingest only linoleate and alpha linoleate, and usually do not develop essential fatty acid deficiency (8). All these data suggest that humans are no exceptions among mammals in the metabolism of linoleate and alpha linoleate, although the desaturation-elongation activities may be relatively lower in most humans probably due to the negative feedback control of the enzyme systems by large amounts of highly unsaturated fatty acids in their diets (9).

DHA and EPA synthesis: minimally sufficient, or optimal? The obvious question that arises here
is whether the desaturation-elongation process provides adequate DHA for optimal health, or merely enough to prevent deficiency symptoms. As pointed out earlier in this paper in Part 4 ("Intelligence, Evolution of the Human Brain, and Diet"), recent research on the evolution of the human brain and encephalization has shown a drop in human brain size of 11% in the last 35,000 years and 8% in the last 10,000, a timeframe which coincides with steadily decreasing amounts of animal food in the human diet. Although correlation alone is of course not causation, this evidence strongly suggests further research is needed in the area of human requirements for the preformed long-chain fatty acids important to brain and central nervous system development.

EFAs: synopsis
The scientific evidence available to date is somewhat unclear. Synthesis of EPA and DHA (from plant-source oils) may be adequate in some individual adults; whether that is true for the general (adult) population is unclear. The apparent difficulty in getting a favorable fatty acid "balance" on a vegan diet; the sporadic efficiency of conversion of LNA to EPA/DHA; and the recent evidence of decreasing human brain size appear to support the idea that humans have adapted to diets that include preformed EPA/DHA (i.e., animal foods). Beyond this, however, the real question is to what degree humans may be dependent on obtaining these preformed n-3 fatty acids from the diet not simply to prevent potential deficiency but for optimal functioning. Conclusions are at this point equivocal and more research is needed in this area.

Bitter Taste Rejection: A Marker for Dietary Trophic Levels
This section is based on the excellent paper by Glendinning [1994]. The primary interest here is in how bitter taste rejection thresholds vary according to dietary trophic levels (based on possible evolutionary adaptation). However, the article by Glendinning also provides hard information that addresses some of the myths about bitter taste promoted in fruitarian/raw circles, and that material will be examined here as well.

Fruitarian/raw vegan myth: bitter = toxic
One of the enduring myths of fruitarianism/raw veganism is that any food that tastes bitter absolutely must be toxic. Glendinning [1994, pp. 1217-1218] notes (with citations) that all naturally occurring toxins taste bitter to humans at some concentration level, i.e.:

Proposition A: Natural toxin implies bitter taste.

However, the claim that is promoted in raw circles is the logical converse, i.e., that:

Proposition B: Bitter taste implies natural toxin.

This section will focus on proposition B.
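The logical point--that a conditional does not entail its converse--can be checked mechanically. A minimal sketch (purely illustrative): the one truth-table row it prints, "non-toxic but bitter," is exactly the case proposition B overlooks.

    # Truth-table check that "A implies B" does not entail "B implies A".
    # A = "contains a natural toxin"; B = "tastes bitter".
    from itertools import product

    def implies(p, q):
        return (not p) or q

    for a, b in product([False, True], repeat=2):
        if implies(a, b) and not implies(b, a):
            print(f"toxic={a}, bitter={b}")  # prints: toxic=False, bitter=True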

Lack of proof for the claim that bitter implies toxic. Glendinning [1994] defines the bitter rejection response as the aversive reaction that occurs when one eats bitter foods. He then asks whether the claim that bitter implies toxic is valid, and reports that (p. 1217): The bitter rejection response has not been evaluated rigorously in terms of its value in avoiding poisons. For the bitter rejection response to be an effective poison detector, there should be a predictable relationship between the threshold concentration for bitterness and that for toxicity. If the bitter threshold is substantially higher or lower than the toxicity threshold for many compounds, the effectiveness of the bitter rejection response as a poison detector would be uncertain. To the author's knowledge, no investigator has explicitly tested for such a relationship across a series of bitter compounds.

Evidence that the claim "bitter implies toxic" is false. The hypothesis is then presented that perhaps the bitter taste threshold co-varies with the level of toxicity of the bitter compounds, i.e., that proposition B above (the fruitarian/raw claim) may be true. Glendinning then goes on to summarize the evidence that indicates the claim is false:

• Genetic variation. The bitter taste threshold in mice varies according to the autosomal gene Soa. In particular, the bitter taste sensitivity of mice is not a function of the toxicity of the item being tasted.
• Inter-strain and inter-species studies with mice found wide variation in bitter taste sensitivity, and no relationship between bitter taste rejection and toxicity levels of the items tested.
• A comparison of the bitter taste threshold with the toxic level of the same compounds, in both mice and humans, found no relationship between the bitter taste and toxicity levels. (See figure 2, p. 1220 of Glendinning [1994] for the comparison plots--very interesting!)


• Several mammalian species have been observed freely feeding on toxic plants, apparently indifferent to the toxicity. The inference here is that the toxins in these plants failed to produce the bitter rejection response.

Interested readers are strongly encouraged to read the original article for the citations and details. Glendinning [1994, p. 1218] concludes: Taken together, the above studies indicate that bitter taste sensitivity does not accurately mirror the body's systemic reactivity to naturally occurring or synthetic [toxic] compounds in several species of mammal.

Fallacious "proofs" that "bitter implies toxic" as cited in raw circles. The reality that there is no legitimate scientific proof for the fruitarian/raw claim provides yet another example of advocates' claims failing to meet the logical requirement of burden of proof. Instead, the claim may be presented with bogus "proofs" such as the following.

• A simple "proof" that one encounters is the claim, "Your taste buds evolved to reject bitter because toxic items are bitter." The preceding seems attractive, but it is actually a claim that proposition A somehow "proves" its own converse, proposition B. This is logically invalid.
• A more extensive crank science "proof" that one encounters is when the fruitarian/raw "expert" references a phytochemical database, picks out a toxic chemical (in this case, a bitter chemical would likely be chosen) that is present in the food, and then announces that the food contains toxins, therefore there is "scientific proof" that the (bitter) food must be "toxic."

Space does not permit a full exploration here of the massive logical fallacies in the above crank science "proof." However, a short reply to the above is that practically every chemical is toxic in a large enough dose. If one is going to examine toxins in foods, one must consider dosages (harmful dosage of chemical vs. amount in food) and bioavailability as well. The crank science approach (common among fruitarian extremists) of equating the presence in a food of a chemical that is toxic (in isolation, and in large doses) as "proof" the food is "toxic" is intellectually dishonest, and borders on pathological fear-mongering.

Bitter taste as trophic marker / Tolerance to bitter as a dietary adaptation. Turning our attention from fruitarian myths to more serious matters, Glendinning [1994] presents the hypothesis that the bitter taste might serve as a trophic marker. The argument is based on evolutionary adaptation. Inasmuch as plants contain many (unavoidable) bitter constituents, it could be suggested that animals with a plant-based diet would be likely to develop some degree of tolerance for the bitter taste, i.e., they will develop some degree of adaptation to it, hence have higher bitter taste rejection thresholds. Similarly, those animals whose diet almost never includes the bitter taste, i.e., carnivores, would rarely encounter the bitter taste and have a low tolerance for it, hence have very low bitter rejection thresholds. Omnivores would fall in between the two extremes in bitter taste tolerance. Glendinning [1994] hypothesizes that the bitter rejection thresholds would follow the order, from lowest to highest threshold (p. 1219):

• Carnivores
• Omnivores
• Herbivores--grazers (i.e., grass feeders)
• Herbivores--browsers (i.e., forb, shrub, and tree feeders)

Comparison of bitter rejection thresholds in mammals. Glendinning then examined the bitter
rejection thresholds of 30 mammal species to the chemical quinine hydrochloride (QHCL), using data from published studies. QHCL was chosen because it is widely studied, and sensitivity to QHCL has been found in a number of studies (see Glendinning for citations) to be correlated to sensitivity to many other bitter compounds. Next, Glendinning computed average QHCL taste thresholds by trophic group. The numbers he obtained [Glendinning 1994, pp. 1222, 1225] are as follows:


TROPHIC GROUP             QHCL Taste Threshold
Carnivores                2.1 x 10^-5 M
Humans                    3.0 x 10^-5 M
Omnivores                 3.0 x 10^-4 M
Herbivores--grazers       6.7 x 10^-4 M
Herbivores--browsers      3.0 x 10^-3 M

Note: in the above table, M = moles/liter.

Note that the results obtained agree with Glendinning's hypothesis about the order of bitter rejection threshold values that would be expected based on the trophic level of foods consumed. Also of interest is that the number for human tolerance of QHCL lies between the numbers observed for carnivores and omnivores, and specifically is closest to the average carnivore value observed.

The significance of Glendinning [1994] is that the study provides an actual analysis of bitter taste tolerance data for a wide variety of species, and reaches a result which suggests, or points to the possibility based on the data obtained (which is of course not proof by itself), that humans may be carnivores/omnivores. This coincides with the extensive evidence already given previously that humans are faunivores.

No doubt certain fruitarian extremists will react to the above and claim that humans reject the bitter taste because our natural food is (nearly exclusively) sweet fruit. However, as discussed in previous sections, there is no legitimate scientific evidence for the claim that humans evolved as fruitarians (whether strictly so or not). Additionally, many wild fruits are quite bitter, and food shortages or competition for food would likely make the consumption of bitter fruit necessary at times--implying that "fruitarian humans" should have some tolerance for bitter. If humans were strict fruitarians, one would expect our QHCL tolerance level to be similar to the chimp's. However, the QHCL tolerance level for chimps is 1.0 x 10^-4 M [Glendinning 1994, p. 1226], more than three times the human value. Finally, many of the same fruitarian extremists claim (at times with intense emotion) that bitter foods must be toxic, the very claim that Glendinning neatly assesses and discredits.
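As a rough illustration of these comparisons, the following sketch (the thresholds are the values quoted above from Glendinning [1994]; the log-scale framing is our own) expresses each group's threshold relative to the human value:

    import math

    # QHCL taste thresholds (moles/liter) as quoted above from Glendinning [1994].
    thresholds = {
        "carnivores": 2.1e-5,
        "humans": 3.0e-5,
        "chimpanzees": 1.0e-4,
        "omnivores": 3.0e-4,
        "herbivores (grazers)": 6.7e-4,
        "herbivores (browsers)": 3.0e-3,
    }

    human = thresholds["humans"]
    for group, t in thresholds.items():
        # log10 distance from the human threshold (orders of magnitude)
        print(f"{group}: {math.log10(t / human):+.2f}")
    # Carnivores land closest to humans (-0.15); chimpanzees sit at
    # about +0.52, i.e., roughly 3.3x the human threshold.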

Insulin Resistance: The Carnivore Connection Hypothesis of Miller and Colagiuri [1994]
THE PROBLEM: Incidence of adult-onset diabetes varies depending on recency of a population's exposure to agriculture.
This section is based on Miller and Colagiuri [1994]. The carnivore connection hypothesis they elucidate provides an explanation, based on evolution, of the following phenomenon:

• Populations that have a relatively "long" history of agriculture, e.g., those of European descent, have a relatively low incidence of NIDDM, that is, non-insulin-dependent diabetes mellitus, also known as adult-onset diabetes.
• Populations that adopted agriculture more recently or, in other words, discontinued the evolutionary hunter-gatherer diet more recently, have relatively higher incidence rates of NIDDM. In some former hunter-gatherer populations, incidence levels of NIDDM range from epidemic levels (Nauruans; the Pima tribe of the U.S.) to rates that are "only" several times those of Europeans.

THE HYPOTHESIS: Overview
The carnivore connection is a hypothesis that explains the above in terms of insulin resistance and partial adaptation to the high-carbohydrate diets introduced via agriculture. The major points of the hypothesis are as follows (from Miller and Colagiuri [1994, p. 1280 (abstract)]): Our primate ancestors ate a high-carbohydrate diet and the brain and reproductive tissues evolved a specific requirement for glucose as a source of fuel... [T]he Ice ages which dominated the last two million years of human evolution brought a low-carbohydrate, high-protein diet. Certain metabolic adaptations were therefore necessary to accommodate the low glucose intake. Studies in both humans and experimental animals indicate the adaptive (phenotypic) response to low-carbohydrate intake is insulin resistance... We propose that the low-carbohydrate carnivorous diet would have disadvantaged reproduction in insulin-sensitive individuals and positively selected for individuals with insulin resistance. Natural selection would therefore result in a high proportion of people with genetically-determined insulin resistance.

Note here that insulin resistance is a disadvantage on the high-carbohydrate diets provided by agriculture, as the metabolism of sugar (and starch) requires far more insulin than the metabolism of fats or proteins. That is, an agricultural diet requires more insulin than a hunter-gatherer diet. Hence, because of a relatively longer history of agriculture, selection pressure for insulin resistance was released sooner in European populations--resulting in lower incidence of NIDDM.

[The hypothesis is illustrated in a figure reproduced, in the original web version, courtesy of Springer-Verlag's Springer Science Online; the figure is not included in this text version.]

Evidence for the hypothesis
Miller and Colagiuri [1994, p. 1281] comment:

Our hypothesis hinges on four lines of evidence.


1. that during the last two million years of evolution, humans were primarily carnivorous, i.e., flesh-eating hunters consuming a low-carbohydrate, high-protein diet
2. that a low-carbohydrate, high-protein diet requires profound insulin resistance to maintain glucose homeostasis, particularly during reproduction
3. that genetic differences in insulin resistance and predisposition to NIDDM can be explained by differences in exposure to carbohydrate during the past 10,000 years
4. that changes in the quality of carbohydrate can explain the recent epidemic of NIDDM in susceptible populations.

The beauty of the carnivore connection hypothesis of Miller and Colagiuri is that it provides an elegant, logical (and plausible) explanation for the increase in NIDDM that occurs when former hunter-gatherer societies adopt the high-carbohydrate diets of modern agriculture. At present, it remains a hypothesis, but is included here to stimulate thought on these issues, and to introduce readers to the powerful insight that evolutionary/Paleo research can provide into problems in modern nutrition and health.

Related studies on insulin resistance

The "thrifty genotype" hypothesis of Neel [1962, 1982]. The earliest paper to examine the genetic aspects of insulin resistance as it relates to diabetes mellitus is Neel [1962]. Neel hypothesized that the diabetic genotype was in a sense a "thrifty" genotype that encouraged the efficient utilization of large amounts of food. Such a feature might convey survival advantages in an environment in which both feast and famine were common (though, obviously, not at the same time). The apparent overproduction of insulin in modern times by certain individuals might be a remnant from evolutionary times, when it may have increased adipose fat storage to tide people over during lean times. Neel [1982] revisits the 1962 research, and notes that subsequent research invalidated several details of his 1962 paper. Neel's later paper provides three different approaches which might explain or support the original hypothesis of the "thrifty genotype" (as Neel's hypothesis is widely known). Two of the approaches revolve around having non-functional or insensitive insulin receptors, and the third approach suggests differences in insulin sensitivity between glucose and lipid pathways in each individual.

The "not-so-thrifty genotype" hypothesis of Reaven [1998a]. Reaven [1998a] hypothesizes that
the underlying function of insulin resistance is not to increase fat storage as Neel suggests, but to spare muscle tissue from proteolysis, i.e., to preserve muscle tissue in periods of famine. Reaven [1998a] claims (and provides one citation in support) that there is evidence that insulin-controlled glucose disposal is genetically determined. Reaven's hypothesis is called the "not-so-thrifty genotype." Ozanne and Hales [1998] challenge Reaven [1998a], and note that the evidence is weak in support of claims that NIDDM has a genetic basis. They also note that Reaven's approach suggests that individuals with NIDDM are sensitive to the anti-proteolytic action of insulin, while simultaneously being resistant to the glucose-disposal effects of insulin (analogous to the third approach of Neel [1982]).

Both hypotheses (Neel, Reaven) are based on a dubious assumption. Cordain et al. [1998]
criticize the approaches of both Neel and Reaven, as both are based on the assumption that famine was common in evolutionary times. However, Cordain et al. cite extensive evidence that starvation was/is less common in prehistoric peoples and among modern hunter-gatherers when compared to subsistence agriculturalists. Instead of assuming periodic famine, Cordain et al. [1998] argue for insulin resistance on the basis described in Miller and Colagiuri [1994] (as outlined above on this page). (Also see Reaven [1998b] for additional comments.)

Insulin Resistance: The Carnivore Connection Hypothesis (Part 2)


Syndrome X and an alternative vegetarian hypothesis on insulin resistance

Description of syndrome X. The term "syndrome X" has two meanings in medicine. One usage of the term applies to patients displaying angina and electrocardiographic evidence of heart disease without accompanying evidence of coronary artery disease. The term "microvascular angina" is slowly replacing "syndrome X" for this disorder [Reaven 1994]. This particular usage of the term has been described first so that we can focus on the meaning of the term "syndrome X" of interest here; from Reaven [1994, p. 13]: Indeed, there is substantial evidence [3, 5-8] that the combination of insulin resistance and compensatory hyperinsulinaemia predisposes individuals to develop a high plasma triglyceride (TG) and a low high-density lipoprotein (HDL) cholesterol concentration, high blood pressure, and coronary heart disease (CHD). In 1988, I suggested that this cluster of abnormalities constituted an important clinical syndrome, designated as Syndrome X [3].

Reaven [1994] proposes that the term syndrome X be expanded to include, as possible symptoms, hyperuricaemia (increased serum uric acid concentration), the presence of smaller low-density lipoprotein (LDL) particles, and other conditions. The reason for discussing syndrome X here is two-fold. First, the term may be relevant to some (but certainly not all) of the health problems that occur when former hunter-gatherers switch from their traditional diets to the high-carbohydrate diets of agriculture. Readers should be aware that other diagnoses (including the diagnosis of "good health") may be relevant, and it is not suggested that all former hunter-gatherers suffer from syndrome X.

Alternative vegetarian hypothesis for insulin resistance? The second reason to discuss syndrome X is that a recent hypothesis, Provonsha [1998], might possibly be promoted by raw/veg*n advocates as a vegetarian alternative to the insulin resistance hypotheses discussed above. Provonsha's hypothesis claims (in opposition to implications of the above hypotheses) that meat consumption causes insulin resistance and related syndrome X conditions. A brief summary of the major parts of the hypothesis of Provonsha is as follows.

• Simplistic binary logic is a basic tenet of the hypothesis. Biochemical processes can be classified in a binary manner: anabolic (associated with food intake) and catabolic (associated with fasting and recovery from injury). Implicitly, Provonsha appears to indicate that anabolic = good, and catabolic = bad, in discussing processes.
• The consumption of animal tissue (protein) results in the production of the hormones glucagon and cortisol, whose action opposes that of insulin. (Provonsha classifies insulin as anabolic, and glucagon and cortisol as catabolic.)
• Meat protein = "catabolic"; allegedly causes syndrome X. Meat protein is "catabolic" and will likely activate catabolic processes in the human body, causing insulin resistance and promoting syndrome X.

There are numerous problems with Provonsha's arguments. First, Provonsha tries to class all biochemical processes as either anabolic or catabolic, and he expresses amazement that both processes can be present in the body at once; from Provonsha [1998, p. 120]: "It has always amazed me that 'protein' activates both insulin and glucagon, hormones with completely opposite functions (Unger, 1974)."

Binary logic of hypothesis is an inadequate model of reality. It appears that Provonsha believes
that the human body should conform to a simplistic binary model: anabolic or catabolic, where the "or" is exclusive. However, the human body is a very complex system, with anabolic and catabolic processes underway simultaneously, and in balance. Black-and-white reasoning similar to what Provonsha appears to suggest can also be found in the simplistic approaches of certain raw vegan extremists. Suffice it to say that such reasoning does not reflect well on Provonsha's hypothesis.

Second, Provonsha argues against glucagon and cortisol, yet his criticisms of them are unconvincing. He argues that both raise blood sugar, and claims that cortisol increases insulin resistance and promotes obesity (he cites three references, the titles of two of which relate to obesity). His criticism of glucagon similarly discusses its role in promoting obesity, although he provides only one citation--a study of diabetic rats--that may indirectly support this claim. The basic problem with his arguments here is that since studies on the direct effects of cortisol and/or glucagon are not cited (with the possible exception of the rat study above), his argument is essentially a narrow "this might happen if we look at specific processes in isolation" [not a quote from Provonsha]. Such arguments are frankly not very compelling, as the human body is a complex, interrelated system, with catabolic and anabolic processes active simultaneously. Further, glucagon and cortisol, and catabolic processes in general, are necessary and are part of the balanced homeostasis required to support life.

Fallacy that meat = high-fat consumption repeated. Also, Provonsha engages, on p. 122, in the common fallacy of equating meat consumption with high-fat consumption--demonstrating ignorance of Paleolithic diet research that has analyzed the wild animal meats that were part of humanity's evolutionary diet and found them to be mostly very low-fat (muscle meats) or, at the least, low in saturated fat (organ meats); lean meats do exist. (The character of wild game is discussed in a later section.)

Unsupported claims in hypothesis. Third, Provonsha claims (p. 124), without evidence or citations, that if one eats meat, one will absorb "untold" chemicals that are catabolic and can disrupt human body processes. Because he provides no supporting citations, Provonsha's claim that eating meat might disrupt body processes appears to be speculative. Provonsha also criticizes the fact that glucagon (produced by eating animal tissue) inhibits pancreatic enzyme production. However, he fails to mention that many grains and legumes--common plant foods, and the major caloric basis for most (conventional) vegan diets--contain significant amounts of enzyme inhibitors as well. (For more on antinutritional components in grains/legumes, see The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance.)

Real-world dietary trials directly contradict the hypothesis. Provonsha's hypothesis that meat
protein causes insulin resistance ultimately fails the test of real-world dietary trials. While it is true that eating fat with carbohydrate may lead to lower glucose and higher insulin levels (see, e.g., Collier et al. [1988]), it does not follow that meat causes chronic hyperinsulinemia or syndrome X. To the contrary, two studies on this very issue provide evidence that dramatically contradicts the core of Provonsha's hypothesis.

Results of trial on nearly 100% animal food (seafood) diet. O'Dea and Spargo [1982] investigated the plasma insulin and glucose responses of 12 full-blooded Australian Aborigines both before and after 2 weeks on a diet of nearly 100% animal flesh (seafood). Prior to the trial, the diet of the Aborigines was, on an energy (caloric) basis, high-carbohydrate (40-50%), high in fat (40-50%), and low in protein (10% or less). The seafood diet observed for 2 weeks was, on an energy basis, 70-75% protein, 20-25% fat, and less than 5% carbohydrate. If Provonsha's hypothesis regarding animal protein causing insulin resistance were true, one would expect an increase in insulin resistance from such a high-protein diet. However, the opposite was observed; from O'Dea and Spargo [1982, pp. 496-497]: Together these findings suggest an improvement in glucose utilization and insulin sensitivity after the high protein-low carbohydrate diet... The mechanisms by which low carbohydrate diets which are high in protein preserve glucose tolerance while those high in fat do not, is probably related to the gluconeogenic potential of high protein diets. Elevated glucagon levels in response to the ingestion of protein would promote hepatic gluconeogenesis from amino acids entering the liver from the splanchnic circulation. In this way a low carbohydrate diet which was high in protein could maintain the necessary glucose supply to the body whereas one high in fat could not.
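For a rough sense of what these energy fractions mean in grams, the following sketch converts the seafood-diet percentages (midpoints of the ranges quoted above) to daily gram amounts; the 2,000 kcal/day figure and the Atwater energy factors are illustrative assumptions, not values reported by the study.

    # Converting the seafood-diet energy fractions to approximate grams/day.
    # Fractions are midpoints of the quoted ranges (70-75% protein, 20-25% fat,
    # <5% carbohydrate); 2,000 kcal/day and the Atwater factors (4/9/4 kcal
    # per gram) are illustrative assumptions.
    KCAL = 2000
    ATWATER = {"protein": 4, "fat": 9, "carbohydrate": 4}
    seafood_diet = {"protein": 0.725, "fat": 0.225, "carbohydrate": 0.05}

    for macro, fraction in seafood_diet.items():
        print(f"{macro}: ~{KCAL * fraction / ATWATER[macro]:.0f} g/day")
    # protein: ~362 g/day -- an extremely protein-heavy regimen by any standard.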


Reversion to high-protein, meat-based diet improves lipid and carbohydrate metabolism in Westernized, diabetic Australian Aborigines. O'Dea [1984] presents the results of a study on 10
diabetic and 4 non-diabetic full-blooded Australian Aborigines. They were tested both before and after living for 7 weeks in the traditional (hunter-gatherer) manner, in northwestern Australia. The 7 weeks consisted of 1.5 weeks of travel, 2 weeks at a coastal location living on mostly seafood, and 3.5 weeks inland living on land animals with some wild plant foods. During the first 1.5 weeks (travel time), the diet was 90% animal foods on an energy (caloric) basis. The coastal diet composition was nearly all animal foods, mostly seafood with some birds and kangaroo, and was approximately 80% protein, 20% fat, and less than 5% carbohydrate. The diet at the inland location was 64% animal foods (by energy) and 36% plant foods, the macronutrient content of which was 54% protein, 13% fat, and 33% carbohydrate.

Significant improvements noted in both glucose tolerance and insulin response. Once again, if
Provonsha's hypothesis that animal protein causes or aggravates insulin resistance were true, one would expect the above animal-protein-based diets to greatly aggravate the condition of the diabetics in the study. However, the opposite happened: the condition of the diabetics improved. From O'Dea [1984, pp. 601, 602]: The major finding in this study was the marked improvement in glucose tolerance in 10 diabetic Aborigines after a 7-wk reversion to traditional hunter-gatherer lifestyle. There were two components to this improvement: a striking fall in the basal (fasting) glucose concentration and a less marked, but nevertheless significant, improvement in glucose removal after oral glucose... The three most striking metabolic changes that occurred in this study, namely the reductions in fasting glucose, insulin, and triglyceride concentrations to normal or near-normal levels, were certainly interrelated. Note that the insulin response to glucose also improved [O'Dea 1984, pp. 596, 599]. The above studies that contradict Provonsha's hypothesis, plus the other problems as noted previously, suggest the hypothesis has little merit, and raw/veg*n advocates should not try to use it as a vegetarian alternative explanation to the more plausible carnivore-connection hypothesis for insulin resistance or syndrome X.

Insulin resistance and fruitarianism

Diabetes-like symptoms. Anecdotal evidence (the only evidence available) indicates that, on adopting a fruitarian diet, many individuals experience diabetes-like symptoms, specifically some of the following: excess urination, frequent thirst, fatigue, mood swings, intermittent blurred vision, pains in extremities (hands, feet), etc. The standard fruitarian "party line" explanation for such symptoms is that they are detox and will go away once you are "pure" enough. In reality, the excess urination and frequent thirst usually persist, so long as one is consuming large amounts of sweet juicy fruit. (Consider the physics involved: large amounts of water are ingested, in the form of sweet juicy fruits, and approximately the same amount of water must be excreted as well; a rough calculation appears below.) A typical response in the face of this ongoing experience is to (mentally) adjust to the symptoms; that is, one may begin to consider it normal or healthy to urinate several times per hour, have frequent mood swings, experience intermittent fatigue (which may be considered as evidence of "detox"), and so on.

(Note regarding thirst symptoms: Thirst on a fruitarian diet may seem counterintuitive, since the level
of fluid intake is so high via all the fruit. However, some fruits [citrus, for example] are diuretic and increase urination. When used as staples, this can lead to thirst and/or the need to drink more water, at least in my own former experience as a fruitarian. Or instead, one may become thirsty but suppress the urge to drink because one tires of urinating every few minutes. This syndrome may be partially responsible for why some fruitarians "don't drink water," aside from just the high water content of the diet.)

The carnivore connection hypothesis may explain these symptoms as well, i.e., the individual is consuming grossly excessive carbohydrate (sugar) given his or her level of insulin resistance. Of course other factors may also be involved (e.g., insulin inhibitors in the diet, plus other nutritional factors: zinc, B-12, taurine, insufficient calories, etc.), but the hypothesis that the "ideal" fruitarian diet is beyond the range of genetic adaptation (i.e., decidedly unnatural) for many individuals is not only tantalizing, but highly plausible as well. The wide incidence of diabetes-like symptoms among fruitarians may be (additional, circumstantial) evidence that although our prehistoric ancestors certainly consumed fruit (when available), strict fruitarian diets were never an important factor in human evolution.
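A rough back-of-the-envelope sketch of the fluid and sugar loads involved (every composition figure below is a round illustrative assumption, not data from any study cited here):

    # Rough fluid and sugar loads on a mostly-sweet-fruit diet. Assumptions
    # (illustrative only): typical sweet fruit supplies ~50 kcal and ~85 g
    # water per 100 g; sugar supplies 4 kcal/g; 2,500 kcal/day are needed;
    # ~75% of calories come from sugar (per the hypothesis in the text).
    KCAL_NEEDED = 2500
    KCAL_PER_100G_FRUIT = 50
    WATER_G_PER_100G_FRUIT = 85
    SUGAR_FRACTION_OF_KCAL = 0.75
    KCAL_PER_G_SUGAR = 4

    fruit_g = KCAL_NEEDED / KCAL_PER_100G_FRUIT * 100
    water_l = fruit_g * WATER_G_PER_100G_FRUIT / 100 / 1000
    sugar_g = KCAL_NEEDED * SUGAR_FRACTION_OF_KCAL / KCAL_PER_G_SUGAR

    print(f"fruit eaten: ~{fruit_g / 1000:.1f} kg/day")    # ~5.0 kg
    print(f"water from fruit alone: ~{water_l:.1f} L/day") # ~4.3 L
    print(f"sugar: ~{sugar_g:.0f} g/day")                  # ~470 g

Roughly four liters of water in (and, necessarily, out again) per day is consistent with the excess-urination reports, and nearly half a kilogram of sugar per day makes clear why an individual's level of insulin resistance would matter.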

Rationalizations by fruitarians. No doubt the defenders of fruitarianism will claim that a high
incidence of NIDDM in former (recent) hunter-gatherer populations is all the fault of the grains/legumes in their new diet, and that there would be no NIDDM if the former hunter-gatherers were to eat the "ideal" fruitarian diet. But where is the evidence (even including anecdotal evidence) for such a claim? More to the point, the extensive anecdotal evidence surrounding fruitarianism is that it is a massive failure, in the long run, and the diabetes-like symptoms (which may be due to excess sugar consumption) are common among those who try to subsist on a diet of mostly sweet fruit. Further, such evidence is found among people of European descent, i.e., those who have had a longer time to adapt to higher carbohydrate diets.

Comparative Physiology: Overall Synopsis
Some of the physiological evidence that humans are adapted to a diet that includes substantial animal products (fauna; i.e., we are faunivores) is:

• Heme iron receptor sites. Our intestines contain receptor sites specifically for the absorption of heme iron, which is found in nutritionally significant quantities only in animal foods. This is strong evidence of evolutionary physiological adaptation to animal foods in the human diet.
• B-12 considerations. Humans need vitamin B-12, but all current evidence suggests that plant foods were not a reliable, year-round source during human evolution. Geophagy and coprophagy are not plausible sources, leaving animal foods (including insects) as the sole reliable, plausible source.
• Taurine synthesis. Relative efficiency of synthesis: the synthesis of taurine is much less efficient in humans than in herbivorous animals.
• Beta-carotene to vitamin A conversion. Relative efficiency of conversion: the conversion of beta-carotene to vitamin A is much less efficient in humans than in herbivorous animals.
• Sufficiency and balance of EFAs. Common, staple plant foods generally do not contain the right "balance" of EFAs, and production of EPA, DHA from plant-source fats may be inefficient. It's hard to understand why--if humans really are natural vegans--the "optimal" balance of EFAs is apparently so difficult to achieve with plant foods.
• Bioavailability issues. Relative efficiency of digestion/bioavailability: Although animal foods are generally easier for any mammal to digest than plant foods for structural reasons (e.g., cell wall considerations), the fact that many staple plant foods contain high levels of factors that inhibit the human digestive process suggests a long evolutionary dependence on animal foods as major nutrient sources. Examples of relative bioavailability are as follows.
  o Iron in animal foods is more bioavailable than in plant foods.
  o Zinc is more bioavailable in animal foods than in plant foods.
  o Animal protein is digested more efficiently than plant protein.
• Analysis of bitter taste thresholds by Glendinning [1994] shows that the human bitter taste threshold is in the same range as faunivores'.


Taken individually, many of the above points are equivocal. When considered collectively, however, they strongly point to animal foods having an important role in the human diet during evolution. Also, two important hypotheses relating diet and evolution were discussed here:

• The incidence of hereditary hemochromatosis, a relatively common (in certain populations) "iron overload" disease, may be an example of a partial genetic adaptation that promotes survival in the high-carbohydrate, lower-animal-food diets of agriculture, by increasing iron absorption.
• The carnivore-connection hypothesis of Miller and Colagiuri explains the high incidence of NIDDM in former (and only recently Westernized) hunter-gatherer populations as being due to insulin resistance; i.e., their insulin resistance level has not yet begun to adapt to the high-carbohydrate diets of agriculture.

Specific concerns for fruitarians. Additionally, specific hypotheses regarding fruitarianism were presented:

• Heightened B-12 risk. Strict fruitarianism might accelerate vitamin B-12 deficiency by decreasing production of gastric acid. This may be a low-risk issue, as it is very rare for anyone to strictly follow a fruitarian diet long-term; i.e., "cheating" and binge-eating are common on the diet.
• Low zinc and feelings of "euphoria." Zinc deficiency is a plausible potential explanation for the "euphoric" mental feeling reported by some fruitarians (also an explanation for the loss of libido reported by some).
• Diabetes-like symptoms. The carnivore-connection hypothesis of Miller and Colagiuri might explain the high incidence of diabetes-like symptoms among fruitarians, and the extremely high failure rate among those who try the diet. It seems plausible, given the predominant picture presented by the anecdotal record, that most people are not genetically adapted to a diet in which (approximately) 75+% of calories come from sugar, a simple carbohydrate that requires insulin for metabolism.

PART 8: Further Issues in the Debate over Omnivorous vs. Vegetarian Diets
Preface and overview. This section examines some of the related or ancillary claims, i.e., claims other
than those based strictly on comparative anatomy/physiology, that are sometimes included in the comparative "proofs" for specific diets. Inasmuch as there is a wide variety of such claims, and such ancillary claims are often lengthy, this section is primarily an overview or summary response to such claims.


This section first briefly considers two bogus fruitarian claims. Then, as an important prerequisite to discussing the use or interpretation of clinical data in ancillary claims, we examine hunter-gatherer diets to see if a healthy omnivore/faunivore diet exists. Then we examine some common logical fallacies in the interpretation of clinical and epidemiological data. After that, we examine the topic of instinct versus intelligence, and finally finish by addressing a long list of bogus claims about instinct made by a fruitarian extremist.

Two Fruitarian Claims

Anti-protein theories: prime examples of crank science

Overview of anti-protein theories. These theories were mentioned in earlier sections, and only a brief summary is presented here. The basic thrust of such theories is that:

• Actual protein/amino acid requirements are allegedly much lower than the national and international standards suggest. (Note: See the earlier section on protein digestion for pointers to published scientific papers that claim the opposite is true, i.e., the standards are too low.)
• "Excess" protein, i.e., protein above the very low level in fruits, will create metabolic by-products as a result of digestion.
• These metabolic by-products are allegedly harmful in anything above very tiny amounts.

Ergo, one could summarize the above as "protein is toxic" in figurative terms, although in literal terms, one might summarize the theory as "protein is toxic in the sense that consuming anything above minimal requirements causes the production of metabolic by-products that are toxic and harmful."

Defects of anti-protein theories. The problem with such theories is that:

• The calculation of protein requirements in such theories may be dubious--it may fail to include a safety margin, or assume that the individual is sedentary, and so on.
• The greatest flaw of such theories is that they are based on massive ignorance of the reality that the human body is very well-equipped to dispose of the metabolic by-products of protein digestion. That is, the theories assume the metabolic products are necessarily harmful without regard for the bioavailability of the metabolic by-products (i.e., the levels actually produced and absorbed vs. known harmful levels) and the ability of our detox systems (liver, other organs) to handle the by-products.

Such theories are implicitly centered around an intensely paranoid, pathological fear of any/all possible "toxins" in food (with no regard to the body's normal tolerances for, and abilities to safely dispose of, them). Such intense fear of toxins can easily become obsessive. Needless to say, pathological, obsessive fear is not the basis for a healthy diet or dietary philosophy; it can be, however, the basis for an eating disorder (a mental illness).

See the article, "Is Protein Toxic?" on this site (not yet available) for an in-depth analysis of anti-protein theories; also see the article Fruit is Not Like Mother's Milk for background information relevant to the theories.

The theories mentioned above are prime examples of crank science. They might look impressive
to the layperson, as they may include detailed calculations and citations to scientific journals. However, a close look at the logic and other details will reveal them to be crank science: bogus and fallacious. It should be mentioned that even conventional veg*ns might find the above site articles of interest, for they illuminate the crank science and bad logic that pervade the (vegan) fruitarian movement.


Raw irony. Another example of "raw irony" should be noted here. Some of the staunchest advocates of
the anti-protein theories report that they apparently have not managed to succeed, long-term, on the "ideal," ultra-low protein fruitarian diets that they advocate. It seems their message is, "Do as I say, not as I do."

Fruitarian straw argument: a pristine preindustrial world
Inasmuch as paleoanthropology and evolution provide powerful scientific evidence that humans are not natural veg*ns/fruitarians but are, instead, natural omnivores/faunivores, some raw/veg*n advocates have resorted to straw arguments. The straw argument is to assert that those discussing the evolutionary diet of humans are claiming that humans who follow an evolutionary diet will be disease-free, and that prehistoric life was some kind of Eden.

Of course, such assertions are distortions and exaggerations. No credible researcher of evolutionary diet would suggest that following such a diet will guarantee perfect health, or that prehistoric life was Edenic. In sharp contrast to the straw claims of an Eden, the fossil record points to high mortality rates from what could be described as occupational hazards (hunting, accidents, predators, and other kinds of violence) in prehistoric times. Disease (primarily acute) was also a major mortality factor.

The use of such a straw argument provides yet another example of "raw irony," for it has become commonplace in certain fruitarian circles to claim that the fruitarian diet provides "paradise health." As mentioned in previous sections, extensive anecdotal evidence (all that is available) indicates that adherence to a strict (vegan) fruitarian diet in the long run often leads to serious physical and/or mental ill health. Let us now turn our attention to more important matters in the following sections.

Hunter-Gatherers: Examples of Healthy Omnivores
A major point often overlooked by dietary advocates eager to condemn all omnivore diets in order to "sell" their favorite (raw/veg*n) diet--advocates relying primarily on clinical research that by default utilizes the SAD/SWD as the "control" omnivore diet--is that healthy omnivore/faunivore diets do in fact exist. Besides the apparent anomaly that occasional individuals appear to thrive on the deservedly maligned SAD diet, hunter-gatherers (who have by now all but disappeared or assimilated Western lifestyles) provide examples of healthy omnivore/faunivore diets.

HEALTH AND DISEASE IN HUNTER-GATHERERS
Environmental factors

Malnutrition and starvation. Dunn [1968], a paper on health and disease in hunter-gatherers,
makes a number of relevant points. Dunn (p. 233) reports "patent (and perhaps even borderline) malnutrition [in hunter-gatherers] is rare." Dunn also reports that hunter-gatherers are often better nourished than nearby agriculturalists or urban residents. Dunn also notes (p. 223), "Starvation occurs infrequently." Agriculturalists who depend on a few crops for the bulk of their diet are more susceptible to starvation (e.g., caused by crop failure from climatic variation, insect attack, plant diseases) than hunter-gatherers who have a wider dietary base.

Occupational hazards of hunter-gatherer lifestyles. Dunn (pp. 224-225) also mentions that:

4. Accidental and traumatic death rates vary greatly among hunter-gatherer populations...

7. Ample evidence is available that "social mortality" has been and is significant in the population equation for any hunting and gathering society.

In the above, social mortality refers to warfare, cannibalism, sacrifice, infanticide, etc. These are rare or non-existent today, though they may have been significant factors among certain hunter-gatherer populations in the not-too-distant past.

Parasites and infectious diseases. On the topic of parasites, Dunn notes (p. 225):
8. Parasitic and infectious disease rates of prevalence and incidence are related to ecosystem diversity and complexity. Although many of these diseases contribute substantially to mortality, no simple, single generalization is possible for all hunter-gatherers.

Chronic and degenerative diseases

Chronic diseases rare in hunter-gatherer societies. Dunn (pp. 223-224) reports that:
3. Chronic diseases, especially those associated with old age, are relatively infrequent... Although life expectancies of hunter-gatherers are low by modern European or American standards, they compare favorably with expectancies for displaced hunter-gatherers, many subsistence agriculturalists, and impoverished urbanized people of the tropics today (Ackerknecht, 1948; Billington, 1960; Duguid, 1963; Dunn, MS-i; Maingard, 1937; Polunin, 1953). Few hunter-gatherers survive long enough to develop cardiovascular disease or cancer, major causes of mortality in America and Europe today. These diseases do occur infrequently, however, and preclinical manifestations of cardiovascular disorder may be detected in such populations (Mann et al., 1962). Occasional writers have claimed that certain primitive populations are "cancer-free" or "heart disease-free," but sound evidence to support such contentions is lacking, and evidence to the contrary has been steadily accumulating.

Note: the term "MS-i" is as it appears in Dunn [1968]. No explanation is given for it there; possibly it might stand for "manuscript 1."

How "hunter-gatherer" is defined is an important point when determining disease incidence. (note regarding Mann [1962] reference cited in Dunn [1968] above) The comments
from Dunn [1968] above have been cited at enough length to provide the context that no human population is completely disease-free. However, at the same time, there are some important qualifications that need to be clarified regarding groups sometimes cited as "hunter-gatherers" who in actuality do not meet the relevant definition. The paper of Mann et al. [1962] referred to just above, which studied African Pygmies in what was then known as the Congo, is a case in point. The dietary survey component of Mann's study could not be completed due to political unrest, so dietary information had to be obtained from other, perhaps less reliable sources. Mann's paper claims both that the Pygmy diet was mostly wild foods, and that the second and third most common foods in the diet were sweet potatoes and rice, both of which are cultivated foods. The Pygmies also consumed palm oil and salt obtained via trade. By current research standards, of course, all of these foods are well outside the definition of a hunter-gatherer diet. The paper also claims that the Pygmy aboriginal culture has been preserved, while, simultaneously, the Pygmies are oppressed and used as laborers by nearby Bantu tribes. Further, the claims of heart disease were based solely on ECG (electrocardiogram) readings, which Mann et al. [1962] report are an unreliable diagnostic tool for Africans (p. 367). Finally, the abnormal ECG readings that were a basis for the claim of possible heart disease should be interpreted in light of the serum cholesterol readings obtained for the Pygmies: 101 for adult men, 111 for women. These are extraordinarily low readings. Such inconsistencies in the Mann et al. [1962] paper raise considerable doubt concerning their finding of signs of possible heart disease, and also concerning whether the Pygmies tested were following their traditional lifestyle. The lesson here, once again, is to check sources, particularly older ones, to see how well the research standards match currently accepted definitions and criteria.

Humans are not exempt from the reality of disease. No doubt certain dietary extremists
will seize on quotations like the one above, regarding the lack of complete freedom from disease in hunter-gatherers, and try to use it against the hunter-gatherer diet. Notwithstanding the above remarks about the care needed in assessing (reputed) hunter-gatherers, a couple of obvious points need to be remembered regarding any population:

o Wild animals can and do suffer from degenerative diseases. McCullagh [1972] discusses the occurrence of atherosclerosis in wild African elephants; Goodall [1986] discusses the incidence of arthritis-like conditions (also colds and pneumonia, among other disorders) among the wild chimps of Gombe. Of course, wild animals can and do suffer from a wide range of non-degenerative diseases (hoof-and-mouth, Lyme disease, bubonic plague, polio, etc.).

o No diet can guarantee "paradise health," and claims that a particular diet does are simply "dietary snake oil." The idea that a particular diet can guarantee "paradise health" (a bogus claim made by some fruitarian extremists) or prevent any/all diseases is nothing but a fairy tale. More specifically, such false claims are also "dietary snake oil" when used by dishonest dietary advocates to "sell" their "ideal" diet. In the real world of nature, diseases exist, and despite the considerably enhanced probability of better health by following a better diet, there are no guarantees in life.

The point of the above is straightforward: wild animals can and do die from diseases, and there is no reason to assume humans are different in this matter. However--the absence of total immunity aside--the incidence rate of chronic and degenerative diseases is in fact very low in hunter-gatherer societies.

Incidence of specific diseases. Although hunter-gatherers are not immune to chronic diseases,
the available evidence indicates that such diseases are relatively rare in hunter-gatherer groups who follow their traditional lifestyles. However, the incidence of such diseases increases sharply once these societies begin to assimilate Western culture and lifestyle. Comments on specific diseases are as follows.

o Cancer. As Eaton et al. [1994, p. 361] note:

Medical anthropologists have found little cancer in their studies of technologically primitive people, and paleopathologists believe that the prevalence of malignancy was low in the past, even when differences in population age structure are taken into account (Rowling, 1961; Hildes and Schaefer, 1984; Micozzi, 1991).

Eaton et al. [1994] also analyzed the factors involved in women's reproductive cancers and developed a model that indicates that up to the age of 60, the risk of breast cancer in Western women is 100 times the risk level for preagricultural (e.g., hunter-gatherer) women.

o Cancer in Africa. The famous medical missionary, Dr. Albert Schweitzer, writing in
Berglas [1957], reports the following [Berglas 1957, preface]:

On my arrival in Gabon, in 1913, I was astonished to encounter no cases of cancer. I saw none among the natives two hundred miles from the coast. I can not, of course, say positively there was no cancer at all, but, like other frontier doctors, I can only say that if any cases existed they must have been quite rare.

Williams [1908] reports that cancer is extremely rare among Australian aborigines and in the aboriginal peoples of Africa and also North America. (Note the date of the citation: 1908, a time when there were far more hunter-gatherers than there are today.)

o Cancer among the Inuit. Stefansson [1960] describes how George B. Leavitt, a physician on a whaling ship, searched for cancer among the Inuit of Canada and Alaska; it took him 49 years, from 1884 to the first confirmed case in 1933, to find it. (Stefansson [1960] describes a possible but unconfirmed case of cancer in 1900, and Eaton et al. [1988] describe cancer in a 500-year-old Inuit mummy.) Schaefer [1981] reports that breast cancer was virtually unknown among the Inuit in earlier times, but was one of the most common forms of malignancy by 1976.

o Cardiovascular disease. Moodie [1981] reports on evidence of hypertension among
Australian aborigines, from 1926 to 1975. The data support an association between increasing Westernization and hypertension, but there are some inconsistencies. However, citing additional, more recent data, Moodie reports further evidence of increasing hypertensive disease among Aborigines. Moodie [1981] also reports that prior to the 1960s, arteriosclerosis and ischemic heart disease were rare among the Australian aborigines. Schaefer [1981] reports that hypertension and coronary heart disease are extremely rare among the less-acculturated Inuit, but are increasing markedly among the acculturated groups. A 1958 survey of Alaskan natives found no hypertension; however, a 1969 survey found that native Alaskan women showed so-called normal levels of hypertension (i.e., comparable to Western women).

o Diabetes. Moodie [1981] reports that diabetes was rare in Aboriginal communities prior
to the 1970s; nowadays the prevalence of diabetes is more than 10% in some Aboriginal communities. Schaefer [1981] reports there are no cases of diabetes among the Inuit who still live the traditional lifestyle. However, cases are now being reported among the acculturated Inuit of the Mackenzie delta area (Canada). Diabetes is also increasing in incidence in the Inuit of Alaska and Greenland.

The above reports indicate that chronic degenerative diseases are rare when traditional hunter-gatherer lifestyles are followed, but that the incidence of disease increases as Westernization occurs. This suggests that, at least from the viewpoint of chronic degenerative diseases, the hunter-gatherer lifestyle and diet were quite healthy.

Biomarkers
The biomarkers for hunter-gatherers following traditional lifestyles reflect a high level of health and physical fitness. Eaton et al. [1988] report that the aerobic fitness of male hunter-gatherers is in the superior to excellent range. The triceps skinfold measurements for young males range from 4.6-5.5 mm in hunter-gatherers, compared to 11.2 mm for Westerners. Diabetes prevalence is 1.2-1.9% in hunter-gatherers versus 3-10% for Westerners. Finally, serum cholesterol levels for hunter-gatherers are astonishingly low by modern standards--in the 101-146 mg/dl range. (See Eaton et al. [1988, pp. 742-745] for the tables the preceding data come from.)

Health status of the Aborigines of Australia: O'Dea [1991]
O'Dea [1991] provides additional insight into the health of Australian aborigines, pre-Westernization, when they followed their traditional hunter-gatherer lifestyle. (Table 1 from O'Dea [1991, p. 234], summarizing their health status, is not reproduced in this compilation.)

Recognizing that we are all the descendants of hunter-gatherers, the negative impact on modern hunter-gatherers when they adopt Western diet and lifestyles may provide insight into factors behind the health problems that afflict those of us whose hunter-gatherer ancestry is more distant. The standard Western diet is dramatically different from the hunter-gatherer evolutionary diet. (O'Dea [1991, p. 237] provides a summary comparison table, not reproduced in this compilation.)

The hunter-gatherer diet of the Aborigines was low in fat (lean meat from wild animals) and in simple carbohydrates like sugar. Further, considerable energy was required to obtain such foods (i.e., those high in fat or sugar) in the wild, especially in the arid outback of Australia. In contrast, the diet of Westernized Aborigines is high in fat from domesticated animals and in simple carbohydrates (refined sugars, e.g., sucrose, fructose from corn syrup). Also, little effort is required to obtain such foods in today's modern society--just go to the store and buy them (much easier than hunting for your meal).

Impact of Western diet on Aborigines. Once Aborigines switch to a Western diet and lifestyle, they
frequently develop the problems common on Western diets: adult-onset diabetes, coronary heart disease (CHD), hyperinsulinemia, high blood pressure, obesity, etc. The incidence of such diseases increases as the degree of Westernization increases in Aborigine communities. As an example, O'Dea [1991, p. 239] observes that:

In the 20-50 year age group, the prevalence of diabetes is ten times higher in Aborigines than in Australians of European ancestry. Although fewer data are available, the prevalence of CHD and hypertension is also considerably higher in Aborigines (Wise et al. 1976; Bastian 1979), and hypertriglyceridemia [high triglycerides] is a striking feature of the lipid profile in all Westernized Aboriginal communities in which it has been measured (O'Dea et al. 1980, 1982, 1988, 1990; O'Dea 1984).

In summary, despite implicit assumptions to the contrary by many raw/veg*n diet advocates, healthy omnivore/faunivore diets do in fact exist, and the diets of hunter-gatherers pre-Westernization are good examples thereof.

Which Omnivore Diet? The "Omnivorism = Western Diet" Fallacy
Fallacy prevalent among scientifically oriented vegans as well as extremists. Perhaps the most
grievous of all fallacies promoted by extremist and conventional vegans alike is the implicit equation of Western diets with omnivorism in general. Ironically, this nearly omnipresent fallacy (in the vegan community) seems to be widespread even among those for whom clinical research is the "gold standard," and among whom proper epidemiological protocols are fine points of discussion. Likely this is because--among such advocates--other avenues of research such as evolutionary/anthropological studies tend to be given short shrift or denigrated in comparison to clinical studies. Thus important knowledge with a crucial bearing on the interpretation of clinical studies--knowledge that would be obvious to someone versed in hunter-gatherer research, for instance--can be completely missed, as it is in the perpetuation of this fallacy.

Clinical studies implicitly use those eating Western diets as the "omnivorous control" group. Typically, study after study or "reams of evidence" are cited as showing that veg*n diets are
superior to "omnivorous" diets. But in fact, what the cited studies really indicate is simply that veg*n diets are superior to the deservedly maligned standard American/Western diet (SAD or SWD) with respect to certain degenerative diseases. That is, when not otherwise mentioned, by default those eating the SWD (or variants thereof) implicitly serve as the "control" group to represent "omnivores" in such studies.

Omnivorous hunter-gatherer diets differ from the SAD/SWD in two important respects. Yet
just as typically, no mention is made of the fact that Western diets are considerably different from omnivorous hunter-gatherer diets, for example, in terms of the overall proportions and composition of foodstuffs besides meat. Just as indiscriminately, there seems to be virtually zero awareness (since the subject is rarely if ever mentioned) of the considerable differences in the type of meat itself in such diets: that is, domesticated or feedlot meat in the typical Western diet vs. the wild game of hunter-gatherer diets. Divergences in overall composition between Western vs. hunter-gatherer diets are covered in the section just below. The disparities between domesticated meats vs. wild game are discussed in the subsequent section, "Logical Fallacies and the Misinterpretation of Research," along with other issues that relate to the "Omnivorism = Western Diet" fallacy.

Hunter-Gatherer Diets vs. Western Diets: Wild vs. Domesticated/Cultivated Foods
An important point to be noted concerning hunter-gatherer diets is that they are composed of wild rather than cultivated foods. Hunter-gatherer diets also generally exclude grains, legumes, and dairy, as they are the products of agriculture--something that hunter-gatherers do not practice.

The hunter-gatherer diet is generally composed of:
• Wild fruits, nuts, greens, tubers--raw and/or cooked.
• The lean meat and organs of wild animals.
• Insects (wild, though some tribes, e.g., the Ache of Paraguay, practice semi-cultivation of insects; see Clastres [1972]).
• Other foods--e.g., honey. (Note that wild honey usually includes the bee brood--bees in the form of larvae; reportedly it is the best-tasting part--at least that is what a friend who is a former instinctive eater relates.)

Most modern Western diets, in sharp contrast, consist of:
• Domesticated grains (wheat, corn, rice), tubers, and legumes (which present the problems of gluten, phytates, other antinutrients, and a high glycemic index for some tubers/refined grains).
• Meat from domesticated animals (high in saturated fat).
• Dairy (lactose intolerance, high in saturated fat).
• Cultivated fruits (bred for high sugar content; high glycemic index for some fruits).
• Hybrid vegetables.
• Manufactured and/or highly processed foods:
o Hydrogenated oils (synthetic fat, very difficult for the body to process).
o Heavily processed foods (additives, artificial ingredients).
o Large amounts of refined sugar (candy, cookies, other sweets).

Comparison of the above lists with the earlier table (herein) from O'Dea [1991] shows the dramatic differences between a potentially healthy omnivore/faunivore diet, i.e., the hunter-gatherer diet, and a diet generally regarded as unhealthy: the SAD/SWD. The differences between wild fruit and cultivated fruits are discussed in the article, Wild vs. Cultivated Fruit, available on this site. The substantial differences between the meat of wild and domesticated animals are discussed in "Disparities between Wild Game and Domesticated Meat in Fat Distribution and Composition," on this site (not yet available); additional information is provided in Speth and Spielmann [1983] and Naughton et al. [1986]. Eaton and Konner [1985] discuss the differences in protein and fat content for a Paleolithic diet of wild foods versus the SAD diet of cultivated and processed foods; see Eaton and Konner [1985, table 5, p. 288]. Their data estimate a Paleo diet as containing 34% protein and 21% fat by calories, while the SAD diet is 12% protein and 42% fat. The table also shows a ratio of polyunsaturated fat to saturated fat of 1.41 for a Paleolithic diet versus a ratio of 0.44 for the SAD. Hence, one might characterize a diet of feedlot meat (SAD) as high-fat, medium-protein, versus a diet of lean meat from wild animals (Paleo) as low-fat, high-protein.
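To make the percent-of-calories figures above concrete, here is a minimal sketch converting them into grams per day. It assumes a hypothetical 2,000 kcal/day intake and the standard conversion factors of 4 kcal/g for protein and 9 kcal/g for fat--assumptions of ours, not figures from Eaton and Konner [1985]:

```python
# Rough conversion of percent-of-calories macronutrient figures into grams
# per day, under an assumed 2,000 kcal/day intake (not from the paper).

KCAL_PER_DAY = 2000
KCAL_PER_G_PROTEIN = 4   # standard conversion factor
KCAL_PER_G_FAT = 9       # standard conversion factor

diets = {
    "Paleolithic (wild foods)":   {"protein_pct": 34, "fat_pct": 21},
    "SAD (cultivated/processed)": {"protein_pct": 12, "fat_pct": 42},
}

for name, d in diets.items():
    protein_g = KCAL_PER_DAY * d["protein_pct"] / 100 / KCAL_PER_G_PROTEIN
    fat_g = KCAL_PER_DAY * d["fat_pct"] / 100 / KCAL_PER_G_FAT
    print(f"{name}: ~{protein_g:.0f} g protein, ~{fat_g:.0f} g fat per day")
```

On these assumptions the Paleolithic pattern works out to roughly 170 g protein and 47 g fat per day, versus roughly 60 g protein and 93 g fat for the SAD--which is the "low-fat, high-protein" vs. "high-fat, medium-protein" contrast described above.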

Logical Fallacies and the Misinterpretation of Research
CLINICAL STUDIES, ANIMAL EXPERIMENT STUDIES, AND EPIDEMIOLOGICAL STUDIES

Introduction
Some of the more recent and more advanced comparative "proofs" of diet may cite as supportive evidence clinical studies, epidemiological studies such as the Cornell China Project, and/or animal experiment studies. A large number and variety of studies may sometimes be cited in comparative "proofs." However, sheer number of studies cited means little if the interpretations drawn from study results are based on fallacious logic and/or misinterpretation.

Common logical fallacies. Typically, such studies are presented as allegedly indicating that veg*n diets
are healthier than omnivore diets. However, nearly all such claims rest on logical fallacies. A short summary of commonly encountered fallacies follows.

Results from studies using domesticated or feedlot meat cannot be generalized to all omnivore diets. Results of experiments in which test subjects eat domesticated/feedlot meats are
often interpreted--usually unknowingly, but at the least implicitly--as reflecting the effects of all meats and/or all omnivore diets. However, the composition of domesticated/feedlot meat is very different from the composition of the lean wild animal meat (in terms of both overall fat level and type of fats contained) that was a part of humanity's evolutionary diet. (Not yet available: "Disparities between Wild Game and Domesticated Meat in Fat Distribution and Composition" explores these differences in detail.) Eaton et al. [1996, p. 1735] note that the level of fat in domesticated meat compared to wild game is on average five times higher. Just as tellingly, the proportion of saturated fat as a percentage of total fat is five to six times higher in domesticated meat compared to wild game. And there are other significant differences as well. Eaton and Konner [1985, p. 285] point out that EPA, an essential omega-3 fatty acid, constitutes approximately 4% of total fat in wild game, while in domesticated beef there is almost none. Eaton et al. [1996, p. 1736] further note that game meats are also high in other essential fatty acids such as arachidonic acid [AA] and docosahexaenoic acid [DHA]; as a result, the meats consumed in hunter-gatherer diets produce high plasma levels of the aforementioned essential lipids compared to Westerners, and help to maintain a dietary omega-6 to omega-3 fatty acid ratio ranging from 1:1 to 4:1--far below the 11:1 ratio typical of Westernized countries. The significance of the difference here, as discussed by Eaton, is that the high omega-6 to omega-3 ratio that prevails in Western countries may be linked with the promotion of cancer.
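As a small worked illustration of the ratio arithmetic, the sketch below computes an omega-6:omega-3 ratio from daily intakes. The gram figures are invented for illustration only; the 1:1-4:1 and ~11:1 benchmarks come from the Eaton et al. [1996] figures quoted above:

```python
# Minimal sketch: a dietary omega-6:omega-3 ratio is just the quotient of
# the two intakes. All gram figures below are hypothetical.

def n6_n3_ratio(omega6_g: float, omega3_g: float) -> float:
    """Return the omega-6:omega-3 ratio (the n in 'n:1')."""
    return omega6_g / omega3_g

# Hypothetical hunter-gatherer-style day: ample n-3 from wild game/plants.
print(f"{n6_n3_ratio(6.0, 3.0):.0f}:1")   # 2:1 -- within the 1:1 to 4:1 band
# Hypothetical Western day: heavy seed-oil n-6, little n-3.
print(f"{n6_n3_ratio(16.5, 1.5):.0f}:1")  # 11:1 -- the typical Western ratio
```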

Logically invalid extrapolation from SAD/SWD diets to all omnivore diets. Likewise,
clinical studies comparing the SAD (standard American diet) or SWD (standard Western diet) to veg*n diets often assume that the poor results on the SAD/SWD imply all omnivore/faunivore diets would give similar results. That is, bad results on the SAD/SWD are cited as "proof" that virtually all omnivore/faunivore diets are "bad." There is no logical basis for this; it is an example of "unsupportable extrapolation."

Results of epidemiological studies may be exaggerated. The importance and relevance of epidemiological data, especially the China Project, may be exaggerated by dietary advocates. This will be discussed in greater depth in a later section.

Let us now begin our summary exploration of these topics.

How reliable are animal studies that use domesticated/feedlot meat?
Due to the significant differences in composition of wild vs. domesticated meats, animal studies that use meat from feedlot animals cannot be used to reliably project the results of all possible omnivore diets. That is, the results of such studies are not representative of hunter-gatherer diets, or, for that matter, any other omnivore diet that excludes domesticated/feedlot meat. In other words, to project from animal studies using domesticated meats to all omnivore/faunivore diets is a logical fallacy--a fallacy one may find commonly practiced by raw/veg*n diet advocates. Some additional remarks on this issue are:

• Hunter-gatherers generally consume all (or nearly all) of the animals they harvest.
(O'Dea [1991] discusses this point regarding the Aborigines of Australia.) Preference is given to internal fatty organs over the lean muscle meats. In contrast, test animals in experiments may be fed nothing but marbled muscle meats. (The nutritional composition of organs is itself different from the composition of muscle meat, not to mention marbled, domesticated/feedlot muscle meats.)

• In some tests, animals are put on a diet that may be far beyond their range of natural diets, e.g., putting a natural vegetarian (like a rabbit) on a diet of, say, cooked bacon. As
the gut morphology of a rabbit is not adapted to a pure meat diet, the results of such tests must be interpreted with caution.

• In some tests, the animals used have been domesticated extensively, e.g., "nude" mice that are considered desirable because they readily show results to drugs. Obviously, the results of tests on animals that were bred to "yield results" must be interpreted carefully.

• The issues of reliability and projection in cross-species comparisons deserve some consideration. See Freedman [1988] for a discussion of the (technical) statistical issues involved in developing cancer models based on data from mice.

The above points suggest the following conclusions:

• Studies using domesticated or feedlot meat are not representative of all omnivore diets. Due to the many differences in the composition of wild vs. domesticated meats, the results
of animal studies based on the latter do not necessarily reflect results one might get on a hunter-gatherer type of diet, or any other omnivore diet that excludes feedlot meats. Obviously, however, animal studies that use feedlot meat have relevance for the SAD/SWD diet, as such diets generally include feedlot meat.

Thus we observe that projecting from animal studies using domesticated/feedlot meats to all omnivore/faunivore diets is an unsupportable extrapolation and a logical fallacy.

Clinical studies based on the SAD/SWD diet

Narrow claims: Conventional veg*n vs. SAD/SWD. One does not have to look far to find
raw/veg*n advocates citing clinical studies that show the negative health effects of the SAD/SWD diet. The more savvy/honest advocates are specific in their claims, and state something like, "Study X shows the (conventional) vegan diet promotes better health than does the SAD/SWD." Such precision in language is good; however, the fact that healthy omnivore/faunivore diets do in fact exist should also be mentioned, at least occasionally, for completeness and for honesty. (It is rare to find a raw or veg*n advocate who bothers to mention that healthy omnivore diets do exist.)

Fallacious claim: One type of veg*n diet vs. all omnivore/faunivore diets. The less savvy/honest
raw/veg*n diet advocates point to clinical studies of veg*ns vs. SAD/SWD consumers, and then commit the massive logical fallacy of claiming that such evidence proves or indicates that the veg*n diet tested is better than all omnivore diets. It is a logical fallacy to assume that test results for one type of vegan diet versus one type of omnivore diet (the SAD/SWD, usually) indicate that the tested vegan diet is (or, even worse, all vegan diets are) better than all omnivore diets. The SAD/SWD is only one of a large variety of possible omnivore diets. For example, all of the diets in the China Project (discussed in the next section) are omnivore diets, and are generally different from the SAD/SWD diet. Obviously, omnivore diets vary considerably--just as vegan diets do.

Unconscious double standard. One criticism veg*n advocates make concerning some of the clinical
studies that show negative health effects on veg*n diets is that such studies may use non-representative veg*n diets, like macrobiotics, as their sampling base. Along the same lines, note that the "conventional" vegan diet (one that makes use of grains, legumes, etc., usually cooked) is radically different from the 100% raw fruitarian diet. Further, 100% raw fruitarian diets have a dismal long-run record: the usual result (per anecdotal evidence) is ill health. Assume that a long-term clinical study of fruitarians existed, and it showed the negative health results so common in anecdotal reports. Would it be fair to extrapolate from the fruitarian diet and condemn all vegan diets? Not really--that would be a logical fallacy. In a similar manner, raw and/or conventional veg*n diet advocates who use clinical studies based on the SAD/SWD diet to condemn all omnivore/faunivore diets are engaging in a logical fallacy.

Examples of citing clinical studies based on the SAD/SWD diet

As examples, let's look at two of the studies cited by Ted Altar in his paper, What Might Be the "Natural" Diet for Human Beings. [Note: See reference list for link to paper on Internet]. Altar cites Friedman et al. [1964] as proof that red blood cells become "sticky and sludge" after a meal of animal flesh. Armstrong et al. [1981] is cited as evidence that anti-inflammatory hormones and sex hormones increase after eating animal flesh. Altar then claims these "immediate body reactions to meat" suggest that meat is not in accord with our evolutionary diet.

Study cited does not compare vegetarian vs. non-vegetarian diets. However, the above
suggestion does not follow from the studies cited by Altar. Both studies used the SWD diet (Armstrong et al. [1981] was done in Australia). Friedman et al. [1964] compared two groups of volunteers, all eating the SAD diet, one group consisting of "personality type A," the other of "personality type B." Both groups were fed a meal of ~1900 calories, 67% fat (bacon, eggs, butter, milk, cream). Despite the massive dose of fat, sludging was observed in 10 of the 12 members of group A, but only 3 of group B. Thus Friedman et al. [1964] is focused on comparing two personality groups, A and B, given identical non-vegetarian diets. If a conclusion is to be drawn, the results suggest personality may correlate with the body's ability to handle ingested fat. More important, the Friedman et al. [1964] study is not a vegetarian vs. SAD comparison study; it did not attempt to compare animal vs. plant fats, and so on. Also, the study used large amounts of feedlot meat and dairy from domesticated animals. Hence the results cannot be projected onto hunter-gatherer diets, other omnivore diets, or even veg*n diets (as none of the subjects were veg*ns, and the foods tested were non-veg*n).

SAD used incorrectly to impute results for all omnivore diets. As for Armstrong et al. [1981], the
study compared levels of reproductive hormones in vegetarian and non-vegetarian Seventh-Day Adventist women in Australia. It is appropriate to let Armstrong et al. speak for themselves [1981, pp. 761, 765]:

The dietary data were such as to permit only qualitative assessment of total dietary fat intake... In accordance with our original hypothesis, the vegetarians had a lower excretion of estrogens than did the non-vegetarians. The difference was small but significant and largely due to a lower excretion of E3 [estriol, a type of estrogen]... The E3 ratio was less in vegetarians than in non-vegetarians, which is the opposite of what might have been expected from the evidence of other studies relating E3 ratios to risk of breast cancer... The total urinary estrogen values were lower in the vegetarians than in the non-vegetarians, the difference being mainly due to the lower E3 excretion.

Armstrong et al. then suggest that the lower estrogen excretion in vegetarians may be due to metabolic differences in estrogen pathways, or to lower estrogen production in vegetarians. Note the remark above that implies the E3 ratio in vegetarians might be associated with an increased risk of breast cancer. Suddenly, the lower estrogen levels of the vegetarian diet are not as attractive as before. Inasmuch as the Armstrong et al. [1981] study uses the SWD--in this case, the standard Australian diet--as comparison, once again the results cannot be projected to all omnivore diets.

Claims are not supported by the studies. Thus we observe that Altar's claim that these "immediate
body reactions to meat" are suggestive of meat not being in accord with our evolution is fallacious. The evidence presented by Altar is based on feedlot meats in the context of the SWD diet--foods that are far removed from our evolutionary diet.

Please note here that I have used Altar's paper as an example because it cites only a few studies, and it is possible to examine those in detail. With a large number of citations, it is difficult to check them all. It is not my intention here to "pick on" Ted Altar.

Drawbacks to Relying Exclusively on Clinical Studies of Diet
Exclusive reliance on clinical studies is narrow-minded and discounts other important evidence
There is a tendency among most conventional veg*n dietary advocates to rely heavily and/or exclusively on clinical studies in discussing the health effects of diets. This is unfortunate and, quite frankly, narrow-minded, as it ignores the other important scientific information currently available, e.g., the evidence of evolution and anthropology. (Of course, some formal dietary studies are ecological [epidemiological] and are based on the aggregation of large amounts of data. However, such studies require followup clinical studies to confirm hypotheses suggested by the ecological data. So, one ends up right back [again] with clinical studies.) One example of the information to be gained from evolutionary and/or "paleo"-type studies is the data available from research on hunter-gatherers regarding the health effects of their omnivorous diets (which are much different from the versions of the SAD/SWD omnivorous diet utilized in most clinical research), which was discussed earlier here in Part 8. Another area of research, one closely related to evolutionary studies, is the newly emerging biomolecular study of genes and their effects. These offer the promise of not simply new information but also a new way of thinking about dietary problems.

Genetic studies offer a new paradigm for giving insights into optimal diets for INDIVIDUALS, rather than blanket recommendations based on clinical studies of GROUPS. With the mapping of the entirety of the human genome currently in progress, insights will be
(and are already beginning to be) gained into the actual purpose and design of the human organism's functioning at the level of genes/molecular biology (the most fundamental "physiological" level possible). In conjunction with paleoanthropological insights regarding evolutionary adaptation, this understanding of the body's actual genetic design--when applied to nutritional questions--stands to give us a newly detailed understanding of the consequences of various eating patterns at the lowest nuts-and-bolts level. Eventually this will result in the ability to ascertain the actual step-by-step mechanisms governing the consequences of how foods are handled in the most unmistakable, fundamental way, and how their operation may vary from one individual to another.

Because they are based on studies of groups, clinical trials are vulnerable to yet-to-be-discovered confounding factors based on individual (genetic) differences. By definition, of
course, these confounding factors can unfortunately be discovered only after the fact. Nevertheless, clinical studies are based on the assumption that with tightly-enough-controlled protocols, eventually all confounding variables can be eliminated. The desired objective in this is to be able to make dietary recommendations based on the outcomes of such trials that can be applied more or less equally to all individuals fitting the parameters controlled for. In itself, this is certainly logical. However, controlling for all--or progressively more and more--confounding variables may eventually show that beyond a certain level of commonality, individual uniqueness becomes more important. Indeed, in Part 7 of this paper, we saw how there may be significant, telling differences between different individuals at the genetic level in regard to the problems both of insulin resistance and hereditary hemochromatosis, depending on one's evolutionary heritage. It's important to note here, also, that these types of genetic differences are not "genetic disorders"--they are normal (polymorphic) variations in genes that can occur between human populations or individuals depending on their evolutionary background. (Also see The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance for a few other examples of genetic differences that may play a role in diet and disease.)

Genetics should greatly individualize the study of diet. One outcome of the unique insights to be
had from the new paradigm of evolutionary and genetic studies should be the potential to greatly individualize the study of diet. This is something that clinical studies by their nature--as studies of groups based on statistical averages, rather than of individuals based on the interaction of actual (and unique) genetic mechanisms--are not as readily geared to explore. Of course, clinical studies may still be required to test hypotheses about the interaction of genes with diet (by including controls for genetic differences between individuals) if such effects cannot be ascertained from genetic research at the physiological level alone. However, ultimately, the consequence of increasing genetic insights into diet will be the ability to evaluate which diets, or dietary patterns, work best for different individuals. To some extent, this places limits on the currently prevailing paradigm of clinical studies based on the statistically averaged results of groups. In so doing, it highlights the tendency of dietary idealists to use clinical studies as fodder for blanket recommendations meant to apply equally to everyone, or that assume one type of "ideal" diet can, should, or will work optimally for everyone.

Common sense and anecdotal evidence deserve consideration in certain circumstances
Relying exclusively on clinical studies also ignores anecdotal evidence. Although anecdotal evidence is indeed unreliable, at times it is the only evidence available. Despite its unreliability, it may be useful, under some circumstances, when no published studies are available on the topic of interest or when clinical data are skimpy or equivocal. Again, similar to what was noted above in regard to genetic issues, such circumstances may well depend on individual differences that escape the notice of clinical trials.

• Potential self-selection effects in long-term adherents of special diets. An often
unaddressed issue involved when ruling out anecdotal evidence in the case of vegetarian and/or vegan diets (whether raw or conventionally vegan) is the problem of self-selection bias that can exist in some of the more formal studies. Even by the acknowledgment of those in the veg*n community interested in scientific assessments, the ranks of long-term veg*ns are composed of the few left among a continuing stream of people trying such diets but later dropping out. (See archives of the Sci-Veg internet listgroup for past discussions of problems in arriving at reliable demographic or other more mundane statistics about veg*ns in the general population, for instance, due to sampling problems of this sort. One relevant thread runs from 6/17/98 to 6/29/98, on the topic "Survey Results"--search for the phrase "stock-vs-flow" to locate the most relevant passages on self-selection bias.) This can be a potentially large source of bias in long-term longitudinal studies that follow a chosen group over time, as the simulation sketched below illustrates.
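A minimal simulation of this "stock-vs-flow" problem, under invented parameters (30% of starters assumed to respond poorly, with poor responders dropping out at 60%/year vs. 10%/year for good responders), shows how surveying only current long-term adherents understates the rate of poor outcomes among everyone who ever tried the diet:

```python
# Toy simulation of self-selection (survivorship) bias in studies of
# long-term diet adherents. All rates are assumed, for illustration only.
import random

random.seed(1)
N_STARTERS, YEARS = 100_000, 5
DROPOUT = {"poor": 0.60, "good": 0.10}   # assumed annual dropout rates

# 30% of starters are assumed to respond poorly to the diet.
cohort = ["poor" if random.random() < 0.30 else "good" for _ in range(N_STARTERS)]

# A person remains an adherent only by avoiding dropout every year.
adherents = [r for r in cohort
             if all(random.random() > DROPOUT[r] for _ in range(YEARS))]

true_rate = cohort.count("poor") / len(cohort)
observed_rate = sum(1 for r in adherents if r == "poor") / len(adherents)
print(f"poor responders among all who ever tried the diet: {true_rate:.1%}")
print(f"poor responders among 5-year adherents (the 'stock'): {observed_rate:.1%}")
```

Under these assumptions, roughly 30% of starters respond poorly, yet well under 1% of 5-year adherents do--so a study sampling only long-term adherents would see almost none of the poor outcomes.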

Dropouts from veg*n diets that may be due to "failure to thrive" (FTT) go largely unaccounted for. (Note: See the later section in Part 9 on failure to thrive for a brief discussion
of this syndrome.) The problems of social pressures that keep people from adhering to diets that differ significantly from cultural norms, over the long term, are much discussed and often cited as the dominant factor responsible for dropouts; and such pressures can certainly be acknowledged as looming large among the potential reasons. (This is actually a problem with any diet significantly different from the norm, and not unique to veg*nism.) However, at the same time, a continuing blind spot in the veg*n community seems to be a refusal to seriously consider the possibility that some people simply may not do their best health-wise on a veg*n diet no matter what manipulations of it are tried. We include here among such people normal individuals without congenital enzymatic deficiencies or other such syndromes. (This is mentioned because citing such syndromes, or carping over nitpicking-type details, is often the standard reply given in the veg*n community--in response to counterexamples--as to why the failures are held not to be indicative of possible problems with the diet itself.) Here, basic common sense should suggest the possibility that those who do the best on veg*n diets may be the ones who would tend to predominate among the ranks of long-term adherents. (Again, this could also be expected to apply not just to veg*nism but to any special diet significantly different from a culture's norm.)

• Anecdotal reports are of potential value in assessing "failure to thrive," given that longitudinal studies are of those voluntarily veg*n. Thus, long-term studies of self-selected veg*n populations (rather than studies that randomly select individuals from the general population) may not be reliable in impartially assessing the full range of possible physical effects of veg*n diets that may occur in a more random sample of individuals. Consequently, when long-term studies of people who remain on vegan or vegetarian diets (or any diet, for that matter) for fairly significant periods of time are composed of those who consciously have chosen to be on the diet themselves, self-selection effects cannot logically be ruled out. (This of course is one reason why random or other sampling techniques designed to eliminate bias are so important in setting up target and control groups.)

Moral ostracism as a masking effect obscuring awareness of "failure to thrive."
Additionally, one of the thorniest problems that, in the opinion of this writer, continues to face the study of veg*n diets--given that they are often ideologically/morally based (in addition to whatever scientific merit they may also have)--is that those who "fail" or otherwise "abandon" the diet normally face significant ostracism from their former peers. Given this situation, there is justifiable merit in considering the value of anecdotal reports as potential indicators of what the full spectrum of physical outcomes on veg*n diets over the long term may actually be, which would include equal attention to those who abandon such diets. (Note: This caveat obviously does not apply to studies [usually much shorter-term, of course] composed of randomly selected individuals to make up a pool of test subjects.)

Most who abandon veg*n diets do not come forward publicly without encouragement. This problem of morally based social disapproval from other adherents of
veg*n diets is an additional factor, beyond the self-selection effect, that compounds matters, making it very difficult procedurally for those wanting to study the situation to get an accurate sampling of the vegan dropout population to check the incidence of possible negative long-term problems. Often such former adherents understandably do not want to come forward very publicly or make known any "failure" they may have experienced, for obvious reasons (personal attacks, shame, being stigmatized by former associates, etc.). Rather than seeking to understand to what degree FTT might exist, most often one can sense an almost palpable eagerness on the part of staunch adherents to explain away such problems or to dismiss them as somehow irrelevant to the actual adequacy of the diet itself. Given this prevailing atmosphere, it generally takes a proactively supportive environment before people will willingly and openly discuss any problems they may be having without withholding information.

Examples of FTT. Within the rawist vegan community (a population that has yet to be
scientifically studied) it is only very recently that anecdotal reports of the physically based problems that are one aspect of what, apparently, leads to the low long-term adherence rate have become more commonplace. This is primarily due to the emergence of communication groups and other "safe," "support" forums that have been created in recent years for people on such diets (some on the Internet). As mentioned above, without such proactive encouragement, it is unlikely those in the mainstream of vegetarianism will ever hear about more than a dribble of such cases. (See The Psychology of Idealistic Diets on this site for an in-depth discussion of one pool of such anecdotal reports from the Natural Hygiene movement--a well-known subculture of vegans, some of whom eat raw vegan diets, others a more conventional-style vegan diet. Also see our Dietary Problems in the Real World page for first-person accounts of problems some individuals have experienced on various forms of such diets, some of which relate to possible FTT.)

"Cheating" and dietary "exceptions" as a potential confounding variable to vegan research. It would be remiss at this juncture not to mention another important and potential
"confounding variable" that faces those who would study the effects of diets such as veg*nism (particularly strict veganism) that restrict or eliminate an entire food class (animal foods) the body has been genetically programmed to expect by evolution. Anyone who has ongoing access to personal conversations with a wide range of individuals practicing vegan diets, and who has had the chance to gain their confidence and cross-examine them in a friendly, sympathetic way, and is honest, will tell you that "cheating" on vegan diets (making occasional "exceptions" or eating foods not strictly "allowed") is not that unheard of, depending on the individual. Not that any given individual(s) may not be a perfect adherent. However, in some instances these dietary "exceptions" can be fairly regular and significant (anywhere from weekly to monthly "exceptions," perhaps) such that they add up over time. Despite the best of intentions, then, some individuals find themselves craving non-veg*n foods and cannot stick as faithfully to the diet as they might wish.

Role of dietary "exceptions" a particular concern with vegan research when deficiency questions are at issue. This is (or ought to be) a significant issue given that vegan
research is often concerned with potential deficiency concerns (as opposed to problems of excess). Being able to track or account for behaviorial "exceptions" thus represents a serious obstacle to robust experimental design when attempting to evaluate the adequacy of strict vegan diets (for a range of individuals) over the long-term. If the diet is supposed to be strict, but the behavior may not be, and you do not have a way of rigorously checking for it, then your research methodology has problems if such noncompliance is implicated in preventing deficiencies. Certainly the issue of the reliability of self-reports given by test subjects about food consumption in interviews or on questionnaires is a potential concern with any dietary research. However, with veganism it assumes special significance given the restrictive nature of the diet--in the sense that animal foods are eliminated, yet these are the very foods (usually dairy or eggs) partaken of in "exceptions" to the diet. Where deficiencies may be at issue, driblet amounts can sometimes be significant, in that it may not take a huge amount of a food to have an effect vis-a-vis deficiency. For vegan research, therefore, this possibility represents an important potential source of confounding error to address. Unfortunately, this is not a problem that appears to be openly acknowledged or discussed much among those relying entirely on clinical research about vegan diets. Here again, therefore, anecdotal reports are not without their value in getting some idea of the importance that unmeasured "exceptions" may have, or the role they may play, in individuals following diets that are nominally vegan but perhaps may not always be completely so, in reality.
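As a hedged back-of-envelope illustration of the "driblet amounts" point, consider vitamin B12, the classic deficiency concern in vegan research. The adult U.S. RDA is about 2.4 micrograms/day, and a large egg supplies very roughly 0.5 micrograms; the "exception" frequency below is invented for illustration:

```python
# Back-of-envelope sketch: how much of a B12 requirement could a few
# unrecorded weekly "exceptions" supply? Figures are approximate, and the
# exception frequency is hypothetical.

RDA_UG_PER_DAY = 2.4      # approximate U.S. adult RDA for vitamin B12
UG_PER_EGG = 0.5          # rough literature value for one large egg
EGGS_PER_WEEK = 3         # hypothetical unrecorded "exceptions"

weekly_intake = EGGS_PER_WEEK * UG_PER_EGG
pct_of_weekly_rda = weekly_intake / (RDA_UG_PER_DAY * 7) * 100
print(f"~{weekly_intake:.1f} ug B12/week from exceptions "
      f"(~{pct_of_weekly_rda:.0f}% of the weekly RDA)")
```

Even single-digit percentages of a requirement, if unmeasured, blur the line between the nominal (strictly vegan) diet being studied and the diet actually eaten.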

Hateful approach of certain fruitarian extremists. In contrast to the reliance on clinical studies by
conventional vegans, the approach of many raw veg*n advocates differs. Some quote clinical studies very selectively, ignoring any that challenge their dogma (the usual rationalization is that it is from a "mixed" diet, and hence non-representative of results one might achieve on a raw vegan diet), while aggressively promoting any studies that appear to support their dogma (even if the results are for "mixed" diets). A few fruitarian extremists reject (effectively) all clinical studies and science, and certain fruitarian crank science promoters selectively reject scientific research that challenges their dogma, on the hateful grounds that (the particular) science in question is the product of the minds of people who eat cooked foods (i.e., is "cooked science"), hence is wrong, worthless and/or cannot be trusted. The analogy to racism is obvious here; recall Hitler denouncing "Jewish science" if the analogy is not clear to you.

Other factors in evaluating clinical studies

Some of the limitations and problems inherent in clinical studies that may be relevant, depending on the situation, are:

• Statistical deficiencies. Some studies are based on small samples, short-term experiments (when long-term experiments are needed), and/or the statistical analysis is deficient (e.g., no control group, failure to measure covariates, uncontrolled confounding by external factors, errors in the choice of methods used for data analysis, etc.).

• Creative interpretation possible. One can usually find a (single) study, or interpret the results from a study in a way that supports a particular viewpoint--even if the view is known to be false.

• Potential for bias. Some studies are paid for by special-interest groups, and the objectivity of such studies may be dubious. One wonders if there may be some bias in the research.
Thus one should be cautious in interpreting the results of clinical studies. It is good to use multiple studies, review papers (those that look at or analyze the overall data from a wide range of studies), or standard reference books whenever possible, to avoid the problems of relying on only one study.
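One way to quantify the "small samples" concern in the list above is a standard power calculation (a generic example of ours, not from the text): to detect even a medium-sized effect reliably, a two-group trial needs on the order of dozens of subjects per group.

```python
# Standard power analysis via statsmodels: sample size needed per group to
# detect a medium effect (Cohen's d = 0.5) with 80% power at alpha = 0.05.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, power=0.8, alpha=0.05)
print(f"required subjects per group: {n_per_group:.0f}")   # ~64
```

Trials enrolling far fewer subjects than such a calculation suggests are underpowered, and their null (or positive) results deserve correspondingly less weight.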

In summary

Clinical studies are a tool, to be used in an appropriate manner. Like any other tool, they can be used incorrectly. Finally, no number of clinical studies based on SAD/SWD data can overcome the logical fallacy (common in raw/veg*n circles) of claiming that results from such studies (SAD/SWD data) apply to all possible omnivore/faunivore diets.

The Cornell China Project: Authoritative Proof, or Misinterpretation by Dietary Advocates?
EXAMINING THE VEGAN CLAIMS

Introduction to the China Project
The terms "China Study" and/or "China Project" will be used here to reflect the research published in Junshi et al. [1990] (and in related research papers, although due to the large volume of such material the discussion here is limited to a select few papers). This is a large ecological study of the diet and lifestyle of adults aged 35-64 years, in 65 counties in China. The ecological data were collected in 1983-1984, and included information on diet, smoking, consumption of alcohol, as well as analysis of urine and blood samples. The ecological data from 1983-1984 were aggregated on a county level, and supplemented with county data from a nationwide mortality survey in 1973-1975 as well as select demographic information from the Population Atlas of China. The size and scope of the China Study are impressive. As a result, some dietary advocates have aggressively promoted the China Study as "proof" that vegan diets are optimal or best. However, a closer look at the study reveals important limitations that impact the reliability, usefulness, and interpretation of the study results. Many dietary advocates are quick to cite the China Study without discussing the limitations inherent in such a study.

Limitations of the China Study
Let us now briefly examine some of the limitations of the China Study and its results. Quotes from the principal (China Study) authors are used liberally below, so you can learn about the limitations from the study authors themselves.

• Level of aggregation of the study data yields, at most, 65 observations (data points) for analysis. The data in the China Study are aggregated at the county level. The result is that for most health/disease/dietary factors to be analyzed, there are 65 (or fewer) observations (data points). This is important for two reasons:

o The study is often described as authoritative and reliable--characteristics that are usually associated with "large" data sets. When one learns there are only 65 observations (and hundreds of variables), it suddenly seems far less authoritative. Note that the term "large" is relative; for simple analysis of a very few variables, 65 data points may be adequate, but for sophisticated models involving several variables, hundreds (or even thousands) of data points may be appropriate.

o The limit of 65 observations places limits on the number of variables that can be analyzed simultaneously (via multivariate--that is, multiple-variable--techniques). A small numerical illustration of this limit appears after the side note below.

Side note (statistical; skip if you wish): Even a simple technique like some of the
regression methods based on splines may be seriously limited on a data set of only 65 points. Spline methods are becoming increasingly important because, unlike traditional regression techniques, they do not assume the functional form of a relationship between variables. (That is, they do not assume ahead of time what particular mathematical relationship may exist between one variable and another.)

Such limits are quite frustrating on a data set that includes hundreds of variables. Campbell [1989] appears to acknowledge this (p. 2):

Although uniquely comprehensive... it is not yet clear how satisfactory the analysis of multiple factor effects will be upon disease risk, given the limited number of survey counties (65). More complete evaluations of the virtually unlimited interactions between different disease causes may have to await the addition of still more dietary and lifestyle studies of disease mortality rates.
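Here is the small numerical illustration promised above (ours, using invented random data, not China Study data) of why 65 observations cannot support models with hundreds of variables: once the number of predictors reaches or exceeds the number of observations, ordinary least squares will fit pure noise perfectly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_vars = 65, 100                  # 65 counties, (say) 100 of the variables
X = rng.normal(size=(n_obs, n_vars))     # predictors: pure random noise
y = rng.normal(size=n_obs)               # outcome: also pure noise

# With more predictors than observations, least squares can reproduce y
# exactly, so R^2 = 1.0 even though there is no real relationship at all.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
r_squared = 1 - (residuals @ residuals) / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 with {n_vars} noise predictors and {n_obs} observations: {r_squared:.3f}")
```

The apparent "perfect fit" is pure overfitting--exactly the danger in throwing many of the study's hundreds of variables into one model with only 65 data points.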

• Limits on the use of geographical correlations, the primary data of the China Study monograph (Junshi et al. [1990]). The China Study monograph [Junshi et al. 1990] provides
geographic correlations which are of limited direct use. Although it is possible to develop statistical models in which the dependent variable is a correlation, models constructed using the underlying variables from the relevant correlation may be far more meaningful and useful. Peto, writing in Junshi et al. [1990, pp. 76-77], notes:

Although geographic variation in particular disease rates can provide clear evidence that, in areas where those rates are high, that disease is largely avoidable, they may provide frustratingly unclear evidence as to exactly how it can be avoided... An even more striking example of the limitations of geographic correlations is that oesophageal cancer in China has no clear geographic association with smoking, and has a significantly (P < 0.01) negative association with daily alcohol intake.

Peto and Doll [1981], as cited by Peto in Junshi et al. [1990], also remind us that attempts to separate causality from confounding factors in geographical data via the technique of multiple regression are often unsuccessful (or--my comment--misleading, which is even worse).

• The China Study report lists only 6 statistically significant correlations between meat-eating and disease mortality. Further, 4 of the correlations are negative, which indicates that the mortality rate for that disease decreased as meat consumption increased. The two diseases that had positive correlations with meat consumption are schistosomiasis (a parasitic disease) and pneumoconiosis and dust disease. Thus, the direct evidence of the study is hardly the condemnation of meat consumption that veg*n dietary advocates may claim it to be. It should be noted here that correlation is a measure only of linear relationships, and other analytical methods may yield different results. Despite the possibility of the existence of more complicated statistical relationships, it seems quite odd, given the interpretations of the study made by veg*n dietary advocates, that meat intake generally did not correlate with disease mortality. (See table 5033, pp. 634-635 of Junshi et al. [1990].)
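To illustrate the linearity caveat just mentioned, the following minimal sketch (simulated data, not China Study values) shows how a strong but U-shaped relationship can produce a Pearson correlation near zero--so a small correlation in the monograph's tables does not, by itself, rule out a real relationship.

    # Minimal sketch: Pearson correlation detects only linear association.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, size=65)         # hypothetical dietary variable
    y = x**2 + 0.05 * rng.normal(size=65)   # strong but nonlinear dependence

    r = np.corrcoef(x, y)[0, 1]
    print(f"Pearson r = {r:+.2f}")          # near zero, yet y is driven by x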

• Ecological studies (like the China Study) generate hypotheses; they do not prove them. Campbell and Junshi [1994] concisely state this limitation (p. 1155S):

First, this study is ecological and includes 6,500 individuals residing in 130 villages. Thus according to widely held assumptions, any inferences concerning cause-and-effect relationships should be considered to be hypothetical only, with validation to be provided only by intervention or prospective analytic studies on individuals.

Thus we note that the China Study requires backup clinical studies before making inferences or drawing conclusions. The main hypothesis of the China Study is whether diets that are predominantly plant foods reduce chronic diseases. However, some veg*n advocates go far beyond the main hypothesis of the study, and claim it proves that veg*n diets are "better" than all omnivore diets. Further, such claims may be made without supporting clinical studies, and without regard for the actual range of diets included in the study. (The latter point is discussed later herein.)

• The ecological fallacy, and its impact on ecological inference. Freedman [1999, p. 1] provides a brief overview of the ecological fallacy (boldface emphasis below is mine):

In 19th century Europe, suicide rates were higher in countries that were more heavily Protestant, the inference being that suicide was promoted by the social conditions of Protestantism (Durkheim 1897). A contemporary example is provided by Carroll (1975): death rates from breast cancer are higher in countries where fat is a larger component of the diet, the idea being that fat intake causes breast cancer. These are "ecological inferences," that is, inferences about individual behavior drawn from data about groups... The ecological fallacy consists in thinking that relationships observed for groups necessarily hold for individuals: if countries with more Protestants have higher suicide rates, then Protestants must be more likely to commit suicide; if countries with more fat in the diet have higher rates of breast cancer, then women who eat fatty foods must be more likely to get breast cancer. These inferences may be correct, but are only weakly supported by the aggregate data. ...However, it is all too easy to draw incorrect conclusions from aggregate data.... For example, recent studies of individual-level data cast serious doubt on the link between breast cancer and fat intake (Holmes et al. 1999).
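A small simulation can make the ecological fallacy tangible. In this hedged sketch (all numbers invented for illustration), a shared county-level factor pushes the county averages of two variables up together, even though within every county the individual-level relationship runs in the opposite direction:

    # Minimal sketch of the ecological fallacy: the correlation of group
    # means can have the opposite sign of the within-group relationship.
    import numpy as np

    rng = np.random.default_rng(2)
    groups, per_group = 65, 100            # 65 "counties", 100 people each
    xs, ys = [], []
    for _ in range(groups):
        base = rng.uniform(0, 10)          # county-level factor (confounder)
        x = base + rng.normal(size=per_group)
        y = -0.5 * x + 2.0 * base + rng.normal(size=per_group)
        xs.append(x)
        ys.append(y)

    gx = np.array([x.mean() for x in xs])  # county means
    gy = np.array([y.mean() for y in ys])
    print("correlation of county means:",
          round(np.corrcoef(gx, gy)[0, 1], 2))        # strongly positive
    within = np.mean([np.corrcoef(x, y)[0, 1] for x, y in zip(xs, ys)])
    print("mean within-county correlation:", round(within, 2))  # negative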

• Eliminating group-level confounding has no necessary relation to individual-level confounding. Greenland and Robins [1994a] provide an interesting and insightful statistical critique of ecologic studies, furnishing examples that demonstrate how ecologic and individual-level studies are subject to different forms of confounding and errors. In conclusion, they state that [Greenland and Robins 1994a, p. 749], "...conditions that guarantee no confounding in an ecologic study are logically independent of the conditions that guarantee no confounding in an individual-level study..." That, of course, sharply limits the relevance and reliability of efforts to use ecological data to make inferences about individuals.

They also discuss how the techniques commonly used for "adjusting" ecological data for covariates (other variables) might not reflect the true underlying conditions due to non-additivity, non-linear relationships, and other factors (i.e., the common practice of a linear adjustment may be inappropriate and/or inadequate for many covariates). Further, they point out that having a large number of regions in an analysis does not guarantee a randomization of the underlying relationship (if any) between covariates and regions.

• Averaging at the aggregate level prevents assessment of individual-level relationships. Commenting on Greenland and Robins [1994a], Piantadosi [1994, p. 763] makes these cogent remarks:

It is impossible to study ecologic analyses very carefully and come away with much confidence in them. Averaging is a process that necessitates some loss of information... When analyzing such aggregated data, we not only lose all ability to extend inferences reliably to less aggregated data but we even lose the ability to estimate the direction and magnitude of bias... I encourage epidemiologists to understand the deficiencies of group-level analyses and limit them to the role of hypothesis generation.

Readers with some knowledge of statistics will find the above papers, plus the related papers of Greenland and Robins [1994b] and Cohen [1994], to be of interest, and to provide a good introduction to the statistical limitations (and problems) that are relevant when one tries to make ecological inferences. This underscores the earlier point (above) that the China Study merely generates hypotheses; it does not prove them.

• Results of a study done in one culture or society do not necessarily apply to other cultures or societies. (They potentially can, but they also might not.) Campbell and Junshi [1994] make this point (p. 1155S):

A second general comment on comparing residents of rural China with residents of highly industrialized societies is that it may not be appropriate to extrapolate diet-disease relationships across cultures.

Campbell and Junshi [1994] then go on to briefly discuss the linkage between diet and health, and how the incidence of chronic disease changes when groups change diets. In that regard, the discussion in the previous section of chronic health problems among the Aborigines of Australia, after adopting high-carbohydrate diets, is relevant. Wenxun et al. [1990] provide actual examples of differences in Chinese and Western disease patterns (p. 1033):

An examination of the correlations among CVD (cardiovascular disease) mortality rates, erythrocyte [red blood cell] fatty acids, blood lipid indices, and selected environmental factors indicates that some relationships reported for Western populations were absent in China whereas other relationships were observed. For example, there was no correlation between county means [averages] of plasma cholesterol and county mortality rates for any of three CVD types. Thus, factors other than total dietary lipid or plasma cholesterol may be important in explaining the geographic distribution of CVD mortality rates within China.

One probably will not find a consensus on the reasons why extrapolation of study results from one culture to another is unreliable. However, a number of probable factors include: differences in income and lifestyle, differences in level of industrialization causing differences in exercise levels and exposure to chemicals, minor but significant differences in genetics, differences in food preparation practices, and so on.

• Lack of actual income data for the survey participants is a serious flaw. It makes adjustment of the data for the effect of income less reliable. Ideally, one would like to adjust data for income in statistical analyses, as it is presumably a good proxy variable for the degree of Westernization. (Many of the degenerative diseases common in Western societies are, in effect, diseases of affluence; hence adjusting the data for actual income may be important in statistical analysis.) However, the China Study has no direct measures of the income of the participants. Instead, the study includes demographic data that could be utilized to yield estimates of county average, per-capita income. However, such an estimate is a poor substitute for actual income data on the study participants. Also, per-capita income may be skewed by the presence of a few, very high-income individuals living in an otherwise (very) poor county.
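The skewness point is easy to see with a toy example (hypothetical incomes, purely for illustration): a county mean can sit far above what almost every resident actually earns, whereas the median better describes the typical participant.

    # Minimal sketch: a few high incomes inflate per-capita (mean) income.
    import numpy as np

    county = np.array([300] * 98 + [50_000, 80_000])  # 98 poor, 2 rich
    print("per-capita (mean) income:", county.mean())     # 1594.0
    print("median income:          ", np.median(county))  # 300.0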

• Attempts to use the China Study to prove that all omnivore diets are bad are yet another logical fallacy. Ultimately, attempts to claim that the China Study "proves" all omnivore/faunivore diets are bad fail as yet another logical fallacy. Basically, none of the county diets in the China Study were vegan diets, and none were evolutionary diets (and, by the way, none were the SAD/SWD diet). Most were high-carbohydrate, grain-centered diets (though one county reported high consumption of both meat and dairy--reminder: dairy was never a part of humanity's evolutionary diet). Campbell, writing in Junshi et al. [1990], reports (p. 63):

The national mean [average] percentage energy intake obtained from animal foods was observed to be 5.7%, with a range of 0.1-59.4%.

Thus we observe that extrapolations to strict vegan or evolutionary diets (or even the SAD diet) go beyond the range of the China Study data, and hence such projections are less reliable statistically. Also, as none of the China Study diets were evolutionary diets, and the meat consumed came from domesticated rather than wild animals, the results from such (Chinese) diets cannot be extrapolated to evolutionary diets (i.e., yet another logical fallacy).
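The range restriction can be illustrated with a hedged simulation (the risk curve below is invented; only the 0.1-59.4% range is taken from the study): a model fit only inside the observed range may look adequate there yet fail badly when projected outside it, and nothing in the data can flag the failure.

    # Minimal sketch: extrapolating beyond the observed data range.
    import numpy as np

    rng = np.random.default_rng(3)
    x_obs = rng.uniform(0.1, 59.4, size=65)   # observed range only
    true = lambda x: (x - 40.0) ** 2 / 100.0  # hypothetical nonlinear "risk"
    y_obs = true(x_obs) + rng.normal(scale=1.0, size=65)

    b, a = np.polyfit(x_obs, y_obs, 1)        # straight-line fit
    for x in (30.0, 90.0):                    # inside vs. outside the range
        print(f"x={x:5.1f}  linear fit: {a + b * x:6.1f}  truth: {true(x):6.1f}")
    # The error at x=30 (inside the range) is modest; at x=90 the linear
    # extrapolation is wrong in both magnitude and direction, and no
    # observations exist out there to warn us.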

Cancer, veg*n diets, and the China Study
Veg*n dietary advocates sometimes cite the China Study as suggesting that veg*n diets may provide increased protection from cancer when compared to "omnivore diets." Along these lines, the following points are of interest.

• Recent research in Australia found that a high-starch Chinese diet did not reduce risk of colon cancer. The paper of Muir et al. [1998] describes a randomized, crossover dietary intervention study in which 12 people (of European descent) in Australia each followed, for 3 weeks, a high-starch diet similar to the diet of low-income communities in China, and later the SAD (standard Australian diet, in this context). The Chinese-style diet lowered serum cholesterol and fecal pH. However, for all other fecal markers tested, the results from the Chinese diet were worse than for the SAD. Muir et al. conclude [1998, p. 372]:

These results suggest that consumption of high-starch diet alone is insufficient to reduce the risk of developing colon cancer.

As the above reflects results of only one study, the usual cautions on interpretation apply.

• Collaborative analysis of 5 large studies finds no difference in death rates from cancer, vegetarian vs. non-vegetarian. A recent collaborative analysis of 8,300 deaths among 76,000 people in 5 prospective studies [Key et al. 1998] compared death rates of vegetarians and non-vegetarians for a number of diseases. The studies involved included large proportions of vegetarians. Key et al. [1998] found that vegetarians were less likely to die of ischemic heart disease than non-vegetarians, but there were no differences in the death rates for a number of types of cancer: stomach, large bowel (colon), lung, breast, and prostate.


Note: In the five studies analyzed by Key et al. [1998], four of the studies defined vegetarians as people who did not eat any meat or fish, with non-vegetarians being all other people (i.e., this would include those who did eat dairy and eggs in their diets). The other study in the analysis classed as vegetarians those who claimed to be vegetarians; a followup on that study 5 years later found that only 66% of the self-identified vegetarians ate meat or fish less than one time per month. (This suggests some mis-classification error in that study or, as the followup was 5 years later, it may reflect people who abandoned the veg*n diet.) The above results suggest that those who claim veg*n (or Chinese-style high-carbohydrate) diets may provide protection from cancer may be premature in their assessments.
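A quick back-of-the-envelope calculation (with hypothetical death rates, purely for illustration) shows how such mis-classification dilutes an observed difference between groups:

    # Minimal sketch: mis-classification biases group comparisons toward null.
    # The rates below are invented; only the 66%/34% split comes from the text.
    true_veg_rate, meat_rate = 0.8, 1.0   # hypothetical relative death rates
    observed = 0.66 * true_veg_rate + 0.34 * meat_rate
    print(f"true vegetarian rate:       {true_veg_rate:.2f}")
    print(f"observed 'vegetarian' rate: {observed:.2f}")   # 0.87, nearer 1.0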

In Summary

The China Project is often cited in an inappropriate manner by veg*n dietary advocates. It does not "prove" vegan diets are the "best" diet. Strict vegan diets, hunter-gatherer (evolutionary) diets, and even SAD/SWD diets are not in the set of diets in the China Project, i.e., are outside the range of the data from the China Project. Claims by dietary advocates that the China Study "proves" all omnivore diets are bad and (some) vegan diets are better are a logical fallacy. It would be better if the (interesting) results of the China Project were not misinterpreted or misrepresented by the "popular" health media or by dietary advocates.

Instinct vs. Intelligence in Diet: Where is the Line?
This section will briefly address the interactions between instinct and intelligence, and then discuss a long list of claims (about instinct) made by raw-vegan/fruitarian extremists.

Introduction: The difficulty of distinguishing between instinct and intelligence
One claim often made by those advocating comparative proofs of vegetarianism is that eating plant foods is instinctive, while eating animal foods is non-instinctive or unnatural. Accordingly, the question of whether particular actions are due to instinct or intelligence is controversial and a source of major disagreement. The problem lies in "confounding"--that is, how to separate instinct from intelligence in the arguably "unnatural" modern world we live in. There appear to be no hard answers to the questions here, although many raw/veg*n advocates claim otherwise.

Individual intelligence can override instinctive restraints. A good introduction to the topic is provided by Itzkhoff [1985, p. 171]:

All our difficulties, as well as all our possibilities have come about because the sapient brain overrode the last restraints, indeed the directiveness, that instinct gives to animals. In the animal world, intelligence is guided by instinct to achieve clear-cut survival needs... Man, the generalized intelligent ape, has a brain that establishes the rules of the game, almost irrespective of the individual's bodily or even grossly survivalistic needs... What remains to man after these basic conservational restraints of instinct are extinguished is a super-intelligence that filters all major categories of human behavior through the cortex. He is a thinking animal with a complex brain, a supremely energized mammalian brain that must now control, direct, guide his behavior. The old passions, energies, and drives no longer have built-in censors.

A number of raw/veg*n advocates openly criticize intelligence because it allows you to override your instinct, which the advocates claim (with virtually no credible proof or logic to support them) is that of a veg*n animal. The argument is that intelligence allows you to eat the "wrong" foods (where "wrong" is usually equivalent to whatever the dietary advocate dislikes), and these "wrong" food choices often get institutionalized into culture. Some of the more extreme raw dietary advocates (fruitarians, mostly) bitterly and hatefully denounce cultural eating patterns, and culture in general, because of this.

Eating Animal Foods, Part 1: Instinct or intelligence?
The major question to address here is whether eating meat or animal foods is instinctive or not. A number of raw/veg*n advocates allege that everyone is repulsed by the act of killing another animal and eating its flesh. It should be noted that no real proof is ever offered to back up such claims; what proof--other than the advocate's personal feelings--is there on this subject? Let's consider the claims that people are "naturally repulsed" by the act of killing and eating other animals. Such claims are clearly contradicted by the following:

• No veg*n (hunter-)gatherer societies. Revulsion against the killing of animals is apparently (for the most part) absent in hunter-gatherer societies, none of whom are veg*n, and all of whom rely on animal foods for part of their diet. Note also that humans have been hunter-gatherers for 99% of the time since the human genus Homo appeared. We are all the descendants of (non-veg*n) hunter-gatherers.

• Sport (and survival) hunting. Claims of universal revulsion are contradicted by the considerable popularity of sport hunting in many countries and societies. The claims are also contradicted by the millions who hunt for food (including fishing) to ensure their survival.

• The meat-animal death connection is clear to many consumers. In many countries, animals are sold alive (and killed at home), or they are killed (in the full view of the consumer) at meat markets. The animal death/meat connection is crystal clear to millions (and probably billions) of meat-eaters.

Remark regarding the two preceding points: Although many of these people could presumably be veg*ns, they choose to eat animal foods. So much for the claim of universal revulsion at killing/eating animals.

• Chimps prize animal flesh. Claims of biologically inherent revulsion may be contradicted by our evolutionary cousins, the chimps, who eat--and greatly prize (when it is available)--animal flesh. For an interesting account of a previously inexperienced modern individual's experiments with killing a chicken and immediately thereafter eating the fresh meat--raw--see Zephyr [1997].

So, next time a veg*n dietary advocate claims that our instincts prevent humans from killing and eating animals, ask them for credible proof of their claim(s). The advocate may simply be projecting his/her personal moral and emotional preferences onto others. However irritating--or enjoyable--the preceding analysis may be to you, it still does not directly answer the question of whether eating meat is instinctive or not, of course. Obviously, if eating meat (or veggies) is absolutely necessary for survival (survival is certainly an instinct), then it is, by definition, instinctive. However, in today's modern society, with a huge variety of foods available, how often is it absolutely necessary to eat any one food or food type (as substitutes are usually available)?

Eating Animal Foods, Part 2: An evolutionary view
Another approach to answering the question is to consider our evolutionary history, and to note that, since the very inception of the human (Homo) genus, ~2.5 million years ago, the human diet has included meat, and our metabolic and morphological makeup appear to reflect varying degrees of adaptation to animal foods in the diet. Thus one can argue that eating animal foods is instinctive, because it is natural behavior--behavior that we have followed long enough so that evolutionary adaptation has taken place.

The association of increasing brain size with animal food consumption. For many readers, the preceding paragraph is a convincing argument. However, other readers will quickly ask: What about the confounding effects of intelligence? Here the expensive tissue hypothesis of Aiello and Wheeler [1995], and related research, is relevant. Recall that the major point of the expensive tissue hypothesis is that the human brain increased in size (and our intelligence increased) via brain evolution fueled by a switch to a diet that included very significantly increased amounts of meat, and which allowed our gut (digestive system) to shrink, thereby freeing metabolic energy (to support the increase in brain size). This hypothesis, and the related research discussed in section 4 herein, suggests that the consumption of meat and the evolution of intelligence are closely interrelated. To summarize, the evidence of evolution is as follows.

• Eating some animal products (e.g., the lean meat of wild animals) is natural because humans have adapted to the behavior by evolution.

• Eating animal foods (may have) fueled brain evolution and hence fueled the increase in human intelligence.

Given the above information, the obvious answer to the question, "Is meat-eating instinctive or driven by intelligence?" is that both apply. That is, one can argue that eating (wild) animal products is both instinctive and intelligent, for humans, from the evolutionary point of view.

Individual intelligence makes the final decision. On the other hand, at the individual level,
intelligence may motivate some people to be raw/veg*ns, and to avoid animal products. That is an example of the power of human intelligence, acting at the individual or personal level.

Eating Animal Foods, Part 3: Morality and naturalism
It is appropriate to remind readers of some important points here.

• Survival is amoral. Evolution is driven largely by survival--an amoral concept. The moral arguments for vegetarianism are irrelevant in an evolutionary context.

• Individual intelligence implies individual choice. The overriding power of human intelligence allows us to choose to be raw/veg*n, or any other diet that appeals to us. You are not required to eat meat or animal foods.

• Instinct not an argument for SAD/SWD diet. Even though one can argue that eating animal foods was instinctive and intelligent in the context of evolution, readers are reminded that modern domesticated/feedlot meat is dissimilar in composition to wild animal meats, and diets high in feedlot meats are known to be relatively unhealthy. That is, the instinct and intelligence arguments do not support a diet of domesticated meats, or the SAD/SWD diet.

• Moral dietary decisions an exercise in individual intelligence. If your decision to be a veg*n is based on moral grounds, then the entire instinct vs. intelligence argument is of little or no relevance to you. Your morality, as an exercise of your intelligence, can override other considerations, if you so choose.

• Veg*n "instinct" part of false naturalism claims. The primary relevance, in this context, of the claim that veg*n diets are instinctive is that it is part of the false naturalism claims made by many raw/veg*n dietary advocates. The research available on this site provides evidence that raw/veg*n diets are not the natural diet of humanity but are, at best (in terms of concerns having to do with naturalism), a restriction of such diets.


Examining Fruitarian Claims about Instinct in Food Selection
Introduction. This section catalogs and addresses a long set of claims about instinct made by a few
fruitarian extremists. The claims are summarized and paraphrased here, to respect copyrights. The material below includes supplementary information on primates (hunting habits, fauna consumption, and use of medicinal herbs); the hunting skills of prehistoric humans; child/infant food preferences; and other topics of potential interest to rawists and/or veg*ns. If you are short on time, you might skim the material below, reading only the items of interest, or if you prefer, skip to the next section.

CLAIM: Humans are limited to a narrow diet (nearly 100% fruit) by our genetic code.
Instinct is a function of genetic code.

REPLY: Extensive real-world evidence indicates humans thrive on a broad spectrum of diets. Morphology and physiology are functions of DNA, and the existence of structural limits is implied. However, the obvious (and overwhelming) evidence of individuals thriving on a wide variety of diets--ranging all the way from hunter-gatherer to conventional veg*n--indicates that humans can succeed on a wide array of diets. (We are not limited to ~100% fruit diets, and fruitarian diets have a dismal record of failure in the long-term.) One must wonder whether humans are really limited to "narrow" diets, or is it the case that the extremist making the claim simply limits themselves emotionally to their own narrow view? One further point here: Intelligence plays a role in morphology and physiology, both via brain evolution and via the culture/evolution feedback loop discussed in an earlier section.

CLAIM: The "instinct" to hunt and kill animals is not found in every human, hence it cannot be an instinct. Example: Most all domestic cats still have an instinct to hunt. REPLY: The above has already been addressed to a certain extent (instinct vs. intelligence discussion), so
we'll only make a few additional points here. • •

The claim is logically invalid. In some hunter-gather societies, the males hunt and the females
gather plant foods. Application of the "logic" of the above claim would lead to the result that eating plant foods is not instinctive either.

Many behaviors that are described as (allegedly) "instinctive" are actually learned behavior patterns. Cats (including domestic cats) and other predators are usually taught to hunt
by their mothers. (Obviously many human behaviors that are common across all cultures are learned behavior patterns as well.) Calorie deficiency and loss of libido: are raw vegan diets instinctive? Speaking of instinct, one wonders if it is instinctive to eat a diet that fails to provide adequate calories? (Many raw vegans are emaciated from insufficient caloric intake.) What about a diet that lacks sufficient vitamin B-12? Or a diet that (per numerous anecdotal reports) frequently causes loss of libido? (Raw vegan diets, yet again.) As reproduction is tied to libido, those who lose libido will, in the long-run, be selected out of the evolutionary gene pool.

CLAIM: Children instinctively choose sweet foods, like fruit. Parents bribe their children to get
them to eat (repulsive!) animal foods.

REPLY:

1. Sweetness is a sign of carbohydrates--calories/energy--in food, and there is reason to believe that, because it would have increased survival during evolution, the desire for sweet foods may be instinctive or evolutionary (but see points 4 and 5 below). But sweetness is not the only naturally attractive taste characteristic in foods that appeals to children (point 4 below).

2. One should not discuss sugar without mentioning that fruit is of limited and/or sporadic availability in most environments, even the rainforest. The evidence of orangutans (Knott [1998], Mackinnon [1974, 1977] as cited in Chapman and Chapman [1990]) suggests that fruit is not always available, even in remote, pristine rainforest habitats.

3. Sugar consumption may impact opioid receptors in the brain; see Blass ([1987], a heavily-referenced review article) for a discussion of this topic. Also, market acceptance is an important factor in the breeding programs for modern fruits, and sweetness (high sugar content) is an important factor in such breeding programs.

4. Research on infant food preferences contradicts some fruitarian claims. Story et al. [1987] review the research of Clara Davis done in the 1920s and 1930s. Davis gave infants a free choice of a wide array of foods on a tray. Story et al. [1987] report the favorite foods chosen by the infants (p. 104):

Bone marrow was the largest single source of calories (27%) for one infant, whereas milk provided the bulk of calories for the other two (19% and 39%). All three infants shared a low preference for all 10 vegetables, as well as for pineapple, peaches, liver, kidney, ocean fish, and sea salt. These foods constituted less than 10% of the total energy intake. Davis observed that the infants ate much more fruit, meat, eggs, and fat than pediatricians typically advised.

The results of Davis obviously confirm some preference for sweet foods. However, the results of Davis also indicate additional taste preferences, and discredit the claim that children are repulsed by animal foods; after all, bone marrow was the favorite food for one infant and prized by the other two infants as well. (Once again, it can be seen that the real limitations here exist in the highly emotional, narrow thinking processes of the extremist rather than in nature.) It should be clearly noted that the results of Davis are based on small samples. Despite this, they are intriguing, as they contradict most so-called "wisdom" about children's food preferences.

5. First reaction to sugar can be negative. If the response to sugar is truly instinctive, and instinct is universal per a previous claim, then one would expect those who live without sugar to react in a very positive way when they finally get to (first) taste it. The quote below from Stefansson [1960 (p. 86)] suggests that this is not necessarily the case (italicized emphasis mine, in the following).

The Copper Eskimos, so named because many of their weapons and tools were of native copper, had never dealt with any traders before 1910. They did not even know tea, used no salt, and lived exclusively on flesh foods, eating roots and such only in time of famine. In 1910, they for the first time tasted sugar, given them by the first trader to reach Coronation Gulf, Joseph Bernard. They disliked it.

CLAIM: Eating meat is a learned behavior, and not instinctive. Humans have been eating meat for "only" 2 million years.

REPLY: This has already been discussed extensively earlier in this paper.

CLAIM: Some humans are disgusted at the thought of eating meat. How could that happen to a true carnivore?

REPLY: (Sarcasm) Some humans are disgusted at the thought of eating durian (a smelly tropical fruit revered by some, hated by others). How could that happen to a true frugivore?


More seriously, the suggestion that we are true carnivores is a straw argument. Instead, the evidence presented here supports the claim that humans are natural omnivores/faunivores, not total carnivores. (Although humans may be able to survive--and survive well--as carnivores, e.g., the Inuit.) As for the disgust factor, given the role of animal foods in evolution and hunter-gatherer societies, is not such disgust merely the result of one's own: (a) moral/religious views, or (b) self-conditioning with veg*n dogma?

CLAIM: True carnivores often eat (only) their prey's internal organs and leave the muscle for
the vultures. Why don't human meat-eaters behave this way?

REPLY: Animal organs are widely consumed in many cultures, and by hunter-gatherer groups (see O'Dea [1991] for a discussion of this regarding the Aborigines of Australia). Humans are different from lions in a number of ways, as follows.

• We are a different species.
• We are intelligent and use tools (technology).
• Our feeding behavior is different; it is an extension of the primate pattern (see Butynski [1982] for further information on this point).

The next set of claims deals with form, function, and primates.

CLAIM: Primates have adaptations exclusively for fruit eating--vision, hands, etc. True
carnivores usually do not have adaptations for fruit-eating.

REPLY: Here, another form of straw argument comparing humans to true carnivores is erroneously
brought up. However, on the subject of fruit-eating adaptations, as discussed in an earlier section, chimps and orangs have special adaptations for tree-climbing, which are very handy for a frugivore. Humans lack these important (frugivore) adaptations for efficient fruit-collecting. The idea that our vision and hands are exclusively for fruit-eating is utterly ridiculous--such claims were examined (and found lacking) in earlier sections herein (e.g., a single form can support multiple functions; adaptation is not limited to one specific form; etc.).

CLAIM: True carnivores hunt by smell alone; they don't need technology.

REPLY: Repeating the straw argument, "Humans are not like true carnivores," does not make it relevant. Chimps, and (omnivorous, not carnivorous) primates in general, hunt by sight and not by smell. (This point is discussed further later in this section.) Humans are a special kind of primate; we are not lions or tigers.

CLAIM: The great apes, except for chimps, are strict vegetarians.

REPLY: This claim is simply in error and is typically based on outdated research. Modern research from roughly 1970 onward has revealed that the apes other than chimps also eat a modicum of insects and/or other animal foods in their diet. See the earlier section on ape diets for details.

CLAIM: Hunting by chimps is not instinctive, because:

1. Female chimps do not hunt.
2. The males kill prey only by knocking it to the ground.
3. Chimps have smaller bodies than humans, hence need concentrated foods to support higher metabolism.
4. Chimps get sick from eating meat and must resort to "toxic" herbs.

REPLY: The above is a good example of the sort of misinformation and half-truths frequently dispensed by fruitarian extremists. Let's look at the above claims one at a time.

1. In fact, female chimps do hunt, and males generally share their kills with females. The paper of Galdikas and Teleki [1981] provides the best answer to this claim (pp. 241, 245, 247):

Long-term research at Gombe National Park indicates that chimpanzees, more than any other nonhuman primate in existence today, practice an incipient form of labor division; one age/sex class, adult males, focuses on exploiting certain [food] resources, especially vertebrates, to the benefit of other members of the social unit, including individuals of other age/sex classes (Teleki 1973a, Wrangham 1975)... The specialization is not absolute: chimpanzee females occasionally hunt game and males certainly spend considerable time collecting insects... Further since female chimpanzees benefit from male predatory activities through the extensive sharing of meat within a community, while males benefit little if at all from female collecting activities because insects are rarely shared among adults, it is possible that the annual intake of fauna is greater among chimpanzee females than among males (McGrew 1979)... Modern field studies are demonstrating that monkeys, apes and humans are, with some exceptions, basically omnivorous mammals that share many adaptive responses to resource availability (see Harding and Teleki 1980).

2. Male chimps are not limited to knocking prey to the ground; they often use the method of flailing to kill their prey. See van Lawick Goodall [1973] and Butynski [1982] (table 2, p. 425) for a summary discussion of chimp hunting methods.

3. The claim about body size was examined and found wanting in an earlier section herein. By the way, to claim that any primate smaller than humans needs more concentrated food would suggest that orangs, bonobos, and gibbons probably all need to eat meat also.

4. Medicinal herb use by chimps. The claim that chimps get sick and use herbs has some truth in it. However, herb use does not constitute proof that meat-eating by chimps is somehow non-instinctive. The claim (#4) made above implicitly includes two myths promoted by raw fooders: First, the false myth that wild animals who eat their "natural" diet never get sick. This myth was assessed earlier, and a short discussion of the topic is given in the article, Selected Myths of Raw Foods. The second myth is that herbs (and spices and meat and anything else the fruitarian extremist dislikes) are "toxic." The field of self-medication (using herbs) by animals, known as zoopharmacognosy, is discussed in Huffman [1997], which provides an overview of the topic. Self-medication in primates is primarily used to control parasites and gastrointestinal disturbances. Huffman [1997] also discusses possible self-medication by bears. The obvious logical fallacy of the extremists is to assume (without any credible scientific proof) that such herb usage is unnatural. Inasmuch as some raw fruitarian extremists are quick to label almost everything (except fruit, and perhaps a few greens) as "toxic" and "bad," the following quote from Huffman [1997, pp. 171-172] confronts such extremist thinking with the reality that primates have the general ability to handle (detox) a wide range of plant compounds.


Among primatologists a major focus of concern about plant secondary compounds in the diet has been on how and why primates can cope with their presence (Glander, 1975, 1982; Hladik, 1977a,b; Janzen, 1978; McKey, 1978; Milton, 1979; Oates et al., 1977, 1980; Wrangham and Waterman, 1981). An extreme case in point is the golden bamboo lemur (Hapalemur aureus) of Madagascar, which is noted to consume over 12 times the lethal adult dose of cyanide in a day, without ill effect (Glander et al., 1989). The cyanide comes from the tips of a bamboo species, Cephalostachyum sp. (Gramineae), consumed as part of the lemur's daily diet at certain times of the year (Glander et al., 1989). In this case, the cyanide is thought to be detoxified mostly in the liver (Westly, 1980).

CLAIM: Meat-eating by humans cannot be instinctive because humans don't eat the specific monkeys that chimps hunt.

REPLY: The above claim reflects truly amazing ignorance of chimp (and human) hunting behavior. It suggests that the only meat that chimps eat comes from monkeys. This is both false and absurd. Hamilton and Busse [1978, p. 764] note:

Predation upon mammals by chimpanzees and baboons is usually opportunistic, i.e. prey animals are encountered at close range and are captured quickly with a high probability of success, a typical scavenge-hunting tactic.

Van Lawick Goodall [1973] provides a list of prey killed by the chimps of Gombe; it includes many vertebrates that are not monkeys. Teleki [1981, table 9.3, p. 314] reports that of the identified mammalian prey killed by Gombe chimps, 68% were primates, and 32% were non-primates. Having ascertained that the basic premise of the claim is false, let's look at reality. Many humans live in temperate climates where there are no monkeys. Why does the extremist suggest that humans should eat only monkey meat, and no other kind of meat? Especially when it is contradicted by the reality of wild chimp diets, and the fossil record of the prehistoric human diet?

CLAIM: Humans cannot eat meat because we lack fangs, claws, sharp teeth. Human teeth are
good only for eating fruit, and not for tearing flesh or chewing leaves.

REPLY: The above claims were investigated thoroughly in earlier sections and found specious.

CLAIM: Humans are fully upright and bipedal, and this makes us ineffective at hunting.

REPLY: Such a claim might be hilarious if it were intended as humor. Unfortunately, the extremist here is serious. As for hunting, we are all the descendants of hunter-gatherers. There are no veg*n gatherers, and no evidence that any ever existed. If humans were really ineffective and/or incompetent at hunting, then one of the following would be true:

• Humans would be extinct now because of our status as inefficient hunters, or
• There would be many veg*n gatherer societies (which would also be clearly reflected in the fossil record).

However, we are alive today to discuss the fine points of diet, which indicates that our hunter-gatherer predecessors were effective enough at hunting to pass the critical test of survival of the fittest.

Humans are the only primate to prey on vertebrates larger than self (body size). In reference to the above, Butynski [1981, p. 427] notes:

No primate other than man has been observed to prey upon animals larger than itself... In contrast, man frequently kills mammals many times his size.

Milton [1987] (citing Rodman and McHenry [1980]) reports that bipedalism is more energy-efficient over land than is quadrupedalism. Humans, of course, are the only fully upright bipedal primate.

Prehistoric humans hunted many species to extinction. The considerable skill of humans at hunting is summarized by Allman [1994, p. 207]:

Our modern human ancestors' hunting abilities are strikingly apparent in the Americas, where Paleoindians hunted to extinction nearly 70% of the species of large mammals on the continent, including mammoths, camels, and giant sloths... The most important factor in our ancestors' hunting prowess, however, was not their new tools, but their psyche. Large, relatively clumsy, and slow of foot, our ancestors could not stalk their prey and swiftly chase them down, like many predators, but instead had to rely on working together as a team to bring down their prey--something at which modern humans excelled.

Note the reference in the above quote to the social and cultural behavior of prehistoric humans hunting in cooperative groups. This is an example of adaptive behavior that exerted selective pressure on human morphology and physiology via evolution. Recall the simplistic comparative "proof" arguments about human dental structure and body structure; such arguments are clearly fallacious because they implicitly assume that humans hunted solo, without technology (e.g., like cats). For additional information on human hunting skills and species extinctions, see Martin [1990].
CLAIM: Due to body size "rules," large mammals (like humans) don't have to eat flesh.

REPLY: The body size rule has been discussed already. The above claim would suggest that lions, tigers, killer whales, and polar bears don't need to eat flesh. Remember too that some very large land carnivores are now extinct: sabertooth tigers, prehistoric lions (about twice the size of modern-day lions), etc. They didn't need to eat flesh either? Are such claims an example of denial of reality, or science fiction?

CLAIM: Cooking was needed because eating raw meat introduced parasites.

REPLY: One can contract certain parasites from any raw food, including raw plant foods. All that is needed is the presence of parasite eggs or bacteria, and these can be carried by water, insects, and birds, and are found in animal dung, and so on. If indeed cooking was universally required to neutralize parasites in meat, this implies that cooking would have been universally needed for plant foods as well. Of course, that contradicts the extremist view that humans evolved on a diet of raw fruit.

CLAIM: Humans are not adapted to be omnivores. The writings of D.J. Chivers are then quoted
by the extremist, in a misleading way.

REPLY: This is discussed in an earlier section. In my opinion, to use quotes in a deliberately and grossly
misleading way is intellectually dishonest. (If it is not deliberate, it shows how seriously the extremist's fanaticism filters their perception of what they read, and how it predisposes them to screen out or distort, perhaps subconsciously, what they do not want to hear.)

Epilogue

I hope the preceding set of claims and replies was interesting--at least in part--to you. The mixture of half-truths and twisted "logic" found in the above claims is typical of crank science. Unfortunately, a significant part of "fruitarian science" is composed of these types of distortions.


PART 9: Conclusions: The End, or The Beginning of a New Approach to Your Diet?
Contrary Facts vs. Vegan Dogma: Facing the Honesty Problem
SYNOPSIS OF THE PRIMARY EVIDENCE (CONCLUSIONS)

Humans can be regarded as natural omnivores, so long as one uses the common definition of the term: a natural diet that includes significant amounts of both plant and animal foods. (Humans might not qualify as omnivores if one uses the definition of omnivore as advocated by D.J. Chivers and associates, and discussed in earlier sections herein.) To use terms that are linked to gut morphology, humans are either faunivores [meat-eaters] or frugivores with specific (evolutionary) adaptations for the consumption of animal foods. This, of course, means that humans are not natural vegetarians. A short summary of some of the evidence supporting this follows (the material below was discussed in depth in earlier sections of this paper).

• The fossil record. Approximately 2.5 million years of human omnivory/faunivory are apparent in the record, with genetic adaptation to that diet the inevitable and inescapable outcome of evolution. The supporting evidence here includes isotope analysis of fossils, providing further evidence of consumption of animal foods.

• Comparative anatomy of the human gut. The best scientific evidence available to date on gut morphology--analyzed using two different statistical approaches--shows evidence of adaptations for which the best explanation is the practice of faunivory. (Faunivory as an explanation is also supported by optimal foraging theory in hunter-gatherer tribes.) Further, the human gut morphology is not what might be expected for a strict vegetarian/fruit diet.

• Comparative physiology (metabolism)

o Intestinal receptors for heme iron. The existence of intestinal receptors for the specific absorption of heme iron is strong evidence of adaptation to animal foods in the diet, as heme iron is found in nutritionally significant amounts only in animal foods (fauna).

o B-12 an essential nutrient. Similarly, the requirement for vitamin B-12 in human nutrition, and the lack of reliable (year-round) plant sources, suggests evolutionary adaptation to animal foods in the human diet.

o Plant foods are poor sources of EFAs. In general, the EFAs in plant foods are in the "wrong" ratio (with the exception of a very few exotic, expensive oils), and the low synthesis rates of EPA, DHA, and other long-chain fatty acids from plant precursors point to plant foods as an "inferior" source of EFAs. This strongly suggests adaptation to foods that include preformed long-chain fatty acids, i.e., fauna.

o Taurine synthesis rate. The low rate of taurine synthesis in humans, compared to that in herbivorous animals, suggests human adaptation to food sources of taurine (fauna) in the human diet.

o Slow conversion of beta-carotene. The sluggish conversion rate of beta-carotene to vitamin A, especially when compared to the conversion rate in herbivorous animals, suggests adaptation to dietary sources of preformed vitamin A (i.e., a diet that includes fauna).

o Plant foods available in evolution were poor zinc and iron sources. The plant foods available during evolution (fruits, vegetative plant parts, nuts, but no grains or legumes) generally provide low amounts of zinc and iron, two essential minerals. These minerals are provided by grains, but grains are products of agriculture (i.e., were not available during evolution), and contain many antinutrients that inhibit mineral absorption. This suggests that the nutritional requirements for iron and zinc were primarily met via animal foods during human evolution.

o Bitter taste threshold as a trophic marker. An analysis of the human bitter taste threshold, when compared to the threshold of other mammals, suggests that our sensitivity to the bitter taste is comparable to that of carnivores/omnivores.

• There is no such thing as a veg*n gatherer tribe. And there are no records to indicate that any such tribes ever existed; also no evidence of any vegan societies either.

• The actual diets of all the great apes include some fauna--animal foods. Even the great apes that are closest to being completely vegetarian, gorillas, deliberately consume insects when available. Chimps and bonobos, our closest relatives, hunt and kill vertebrates and eat occasional meat.

• Many of the ancillary claims made in comparative "proofs" of veg*n diets are logical fallacies:

o The misinterpretation of animal studies using domesticated or feedlot meats to condemn all omnivore diets.
o The misinterpretation of clinical studies showing negative results for the SAD/SWD as indicating negative results for all omnivore diets.
o The misinterpretation of the results of the China Project to claim it "proves" vegan diets are best and all omnivore diets are bad.

John McArdle, Ph.D., an anatomist and primatologist, a vegetarian, and scientific advisor to the American Anti-Vivisection Society, summarizes the situation clearly [McArdle 1996, p. 174]:

Humans are classic examples of omnivores in all relevant anatomical traits. There is no basis in anatomy or physiology for the assumption that humans are pre-adapted to the vegetarian diet. For that reason, the best arguments in support of a meat-free diet remain ecological, ethical, and health concerns.

Veg*n diets are not the natural diet of humans
The data available on humanity's evolutionary diet leads to the conclusion that veg*n diets are not the natural diet of humanity, although a veg*n diet that excluded dairy, grains, and legumes could be described as a restriction of the evolutionary diet. The evolutionary or hunter-gatherer diet (discussed in earlier sections) consists of a diet of wild plant foods (fruits, nuts, some leaves/stems, starchy tubers--possibly cooked), insects, and the lean meat and organs of wild animals.


Note that grains, legumes, and/or dairy are generally not available to hunter-gatherers; such foods are provided in significant quantities only via agriculture, and have been a significant part of the human diet for only about 10,000 years or less. The extent of human genetic adaptation to such foods is a controversial point, but the majority view is that the genetic adaptation that has taken place in the last 10,000 years is quite limited. (See the discussions earlier herein regarding hereditary hemochromatosis, and the carnivore connection hypothesis.) Similarly, modern processed foods have been with us for only a few generations, and genetic adaptation in such a short period is highly unlikely.

Failure to Thrive (FTT)
Your health is more important than raw/veg*n dietary dogma
There is a phenomenon known as failure to thrive (FTT). This occurs when one carefully and strictly follows a specific, recommended dietary regime, in this case raw/veg*n, but suffers poor or ill health anyway. Such poor health may range all the way from clinically diagnosed nutrient deficiencies to a mild but noticeable general malaise--e.g., loss of libido, lassitude and fatigue, emaciation and weakness, etc. Poor health is not limited to physical symptoms, but may include mental problems as well (e.g., depression). Signs of poor mental health that one may (commonly) observe or encounter in the raw/fruitarian and vegan communities (more often in the raw vegan community) include: severe obsessions with food and/or dietary "purity," eating-disorder behavior patterns (binge-eating), and, sometimes, incredibly hateful fanaticism (e.g., some of the fruitarian extremists).

Awareness of FTT obscured by self-censoring due to moral ostracism. Hard statistics are not available on the incidence of FTT. Many if not most of the people with FTT self-censor themselves; that is, they switch to a diet that works for them, and leave the raw/veg*n community. Anecdotal evidence suggests that FTT is very common--indeed the usual result--when a strict fruitarian diet is followed long-term, while it presumably is much less common with conventional (and relatively diverse) cooked vegan diets. (The narrower the diet, the more likely FTT, per anecdotal evidence from raw diets.) Note also the possibility, though, that FTT may be more common among veg*ns in general than one might think--consider the number of former veg*ns around, and the ostracism (discussed earlier) that usually meets anyone speaking out about why they may have abandoned the diet. The ostracism syndrome very commonly--and also very effectively--shuts out awareness of FTT in the raw vegan community. That it may also do so in the wider vegan community is well worth considering, especially given the similar prevalence of reflexive pat answers in response to any mention of examples that could raise the possibility of FTT.

Psychological patterns: Feelings of superiority and recitation of mantras result in blocking of contrary information. Observation of this type of behavior pattern--at its most visible, perhaps, in
rawism--suggests a more general rule: Habitual recitation of such standardized replies or catechisms has a predictable long-term psychological effect, similar to ostracism, on those who engage in it. It tends to screen out awareness of the topic even as a worthy question open for discussion at all, let alone serious consideration. Finally, contributing to both the patterns of ostracism and screening of awareness can also be a circular, self-fulfilling prophecy that fosters a superiority complex which feeds on itself, further perpetuating matters. The form it takes is the following: Since any failures to thrive must be failures to comply with the recommended dietary guidelines, then those who fail must not be complying, because they are not intelligently following the guidelines like those of us who are succeeding. Thus, those who fail tend to be assigned second-class status as lesser, unworthy information sources. Those who succeed, on the other hand, are obviously the ones doing it right and the ones to be believed. Yet both sources are just as "anecdotal."


Such a sense of arrogance, then, sets up another psychological block to really listening to those who might have contrary information. Thus, in some cases, not only is the information reflexively rationalized, it may not even be fully "heard" to begin with.

Excuses and rationalizations: the "party line" reaction to FTT. The most common response to FTT among raw/veg*n diet advocates is to blame the victim or outside factors, i.e., the "party line" response is excuses and rationalizations. A few of the common excuses raised are as follows.

o Detox--the favorite, and unfalsifiable, rawist excuse. Your problems are all detox--you have not been on the diet long enough to become "pure." The standard, almost religious, excuse is that unless you faithfully follow the "one true religion and science of the raw vegan diet" you will never reach the "promised land" or "afterlife" of "paradise health." As wild and as crazy as this characterization may sound, it is a fairly accurate (albeit sarcastic) depiction of typical extremist claims. Also note that some conventional veg*ns make similar, though less outrageous, claims.

o Deficiencies on the "ideal" diet? It may be a minor deficiency--you need to take supplements; or, an alternate form: it may be a minor deficiency--be sure to eat enough of certain (quite) specific plant foods.

o Overlooked minor details. You need to fine-tune your diet (more, or yet still more), or change some single one or another of a potentially lengthy list of minor details to succeed. (If everything has to be just exactly right with little margin for error, however, what this shows instead is the narrowness and relative unworkability/impracticability of the diet.)

o If all else fails, blame other factors. The FTT is of course due to outside factors: you didn't get enough rest, you didn't get enough exercise, you are under stress, you need expert advice which I will happily provide if you will pay my high fee, you lack sufficient faith in the dietary regime or are too impatient, and so on.

Many of the rationalizations and excuses given by raw vegans are discussed in a number of the articles on this site: Idealism vs. Realism in Raw Foods, Assessing Claims and Credibility in Raw and Alternative Diets, and others. Similar excuses, if more sophisticated, are commonly proffered by conventional veg*n "diet gurus" as well.

How often are diet gurus honest and humble? However, the one thing I have not observed
to date is for a (prominent) raw/veg*n advocate to actively admit the obvious: that the raw/veg*n diet is arguably unnatural, and while such diets may be helpful or beneficial for some people (particularly in the short run), other people may experience FTT in the long run, because the diet might not be suited for them. What I am suggesting raw/veg*n diet advocates do is to openly admit the following:

1. The advocate does not have all the answers in the field of diet/nutrition, and there is no "perfect" or "ideal" diet that applies to everyone.

2. The diet the advocate promotes might not work for everyone (or, more accurately, probably does not work for everyone). If the proposed diet does not work for you, then you should try another diet; after all, the diet must serve you, and not the other way around.

• Your personal health and well-being are far more important than raw/veg*n dietary dogma or any other idealistic dietary dogma. In particular, your health and well-being are more important than someone else's obsession with eating a (100%) raw vegan diet, or with someone else's idea that "meat is murder," and so on.

• For some diet gurus, dietary dogma is more important than the health of their followers. It is my opinion that many raw/veg*n advocates regard dietary dogma as being more important than the health and well-being of their followers. (Not that this is conscious, of course--actions speak louder than words in signaling such behavior.) This opinion is based on seeing many raw diet advocates rationalize the massive failure (in the long-term) of the dietary programs they advocate. At some point, one must open up to reality and admit that raw diets can be extremely good short-term diets (at least for some people), but that their record as long-term maintenance diets is very poor.

• Diet gurus may blame victims if their diet does not work. However, instead of admitting the obvious, i.e., that their dietary dogma is flawed, many raw diet advocates rationalize and blame the victim, often ignoring the serious health problems their followers face. To a lesser extent, the same attitude--dietary dogma is more important than people--appears to be present in the conventional vegan community as well. I hope that you share my opinion that such circumstances reflect badly on the raw/veg*n "diet gurus" who engage in the above.

No doubt the comments here will be aggressively challenged by raw/veg*n diet advocates, since studies on FTT are quite limited (it's an area where most data is anecdotal). No doubt the defense will be raised that "(more) studies are needed in this area." This is certainly true; however, any study that assumes FTT can always be fixed while staying within the confines of veg*n diets is a study built on the assumption that vegan diets can work for everyone. That would be logically invalid, as it would amount to assuming, in advance, the end result of a study on vegan FTT. Personally, I would love to see the results of long-term studies on vegan FTT; I suspect that raw/veg*n advocates might find the results of such studies to be rather surprising.

Difficulty of resolving the question with nutritional calculations done "on paper," opinions from nutritionists, or with clinical studies done to date.
It appears that the assumption vegan diets can/will work for everyone is often based on the fact that the diet (assuming it is well-planned) meets nutritional requirements "on paper." Or it may be based on the opinion of vegan nutritionists dealing with vegan patients; though here one must be skeptical, given the inherent vested interest of the nutritionist and the above-mentioned tendency to characterize not just some, but all, problematic outcomes as "failure to comply" (or "didn't implement the diet intelligently or meticulously enough") rather than "failure to thrive." Also, occasionally one will see a pro-vegan nutritionist make statements to the effect that after a year, the majority of people on vegan diets are doing fine. However, accuracy questions aside, as pointed out repeatedly elsewhere on this website, successful short-term results--even for a year or two (a year is a very short period of time in the context of long-term success)--do not necessarily predict long-term success on vegan (particularly raw vegan) diets, and to presume they do is naive at best. (If the patient has previously been eating the SAD/SWD diet, then in contrast, almost any sweeping dietary change that adds a large percentage of more natural foods to the diet stands to be an improvement in the short term.)
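As a concrete aside, the sketch below shows, in schematic form, what an "on paper" adequacy check amounts to. All of the food items, nutrient values, and requirement figures are illustrative placeholders rather than real dietary data; the point is how much such a check leaves out, not the numbers themselves.

# Minimal sketch of an "on paper" nutrient-adequacy check.
# All numbers below are illustrative placeholders, NOT real food data.

DAILY_REQUIREMENTS = {"protein_g": 50, "iron_mg": 8, "b12_ug": 2.4}

# Hypothetical day's intake: nutrient totals per food as eaten.
days_foods = [
    {"protein_g": 10, "iron_mg": 2.0, "b12_ug": 0.0},  # e.g., fruit
    {"protein_g": 25, "iron_mg": 4.5, "b12_ug": 0.0},  # e.g., legumes
    {"protein_g": 20, "iron_mg": 2.0, "b12_ug": 2.5},  # e.g., a fortified food
]

totals = {n: sum(f[n] for f in days_foods) for n in DAILY_REQUIREMENTS}

for nutrient, required in DAILY_REQUIREMENTS.items():
    status = "OK" if totals[nutrient] >= required else "SHORT"
    print(f"{nutrient}: {totals[nutrient]:.1f} / {required} -> {status}")

# Note what this check ignores: bioavailability, absorption differences,
# individual variation, and long-term outcomes -- exactly the gaps that
# make "on paper" adequacy a weak predictor of actually thriving.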

Self-selection effects present a barrier to accurately assessing incidence rates.
Alternatively, the assumption that vegan diets will work for everyone may be based on extrapolations from results of clinical studies done to date. The problem here, however, is that there seem to be no clinical/epidemiological studies performed on long-term vegans that are not composed of self-selected groups. (See comments on FTT in the earlier section, Drawbacks to Relying Exclusively on Clinical Studies of Diet, for a discussion of how the self-selection effect, by default, "censors" data that might be obtained from any failures on vegan diets.) This may continue to be a problem for the obvious reason that suitable vegan populations to study are small to begin with, and long-term veganism involves a major lifestyle change. To avoid self-selection effects and accurately assess incidence rates, randomly selected test subjects must maintain far-reaching dietary changes for the lengthy time periods required (at least a few years, as a starting point) to arrive at meaningful results. At present this represents a major hurdle to scientifically assessing incidence rates of FTT.

Note: As of approximately 1997/1998, vegan advocate Michael Klaper, M.D. had begun attempting to put together a study looking at both successful and non-successful vegans, which appears to be the first-ever of its kind. See http://www.vegsource.com/klaper/study.htm for additional information. Although not a longitudinal study, it does aim to compare groups who have been on various vegetarian diets--including vegans who have been successful and ex-vegans who have not--with the objective of uncovering the factors and potential physiological mechanisms behind why some individuals thrive on a vegan diet and others do not.

Potential reactions to information in this paper: Part 1

Rationalizations about the Evidence for Omnivorous Diets from the Fossil Record & Comparative Anatomy
Diet gurus excel at spinning rationalizations
One thing many raw/veg*n advocates are very good at is quickly spinning new rationalizations and excuses when faced with new, contrary factual information. It's very easy to spin new rationalizations (which are usually presented with little or no evidence), whereas investigating such rationalizations--often only to find that they must ultimately be discredited--is very hard, detailed work. (The papers on this site are evidence of this, and it's probably why there are few such investigations to be found that go into more than a cursory level of explanation or detail.) Some of the potential rationalizations that may be raised against the overall thrust of this paper can be fairly easily predicted. A likely list follows below, with brief replies. (Some of the rationalizations are revisions or defenses of prevalent current rationalizations.)

RATIONALIZATION:

1. The fossil record is irrelevant because we were created, rather than evolved!
Evolution is nonsense!

2. Our prehistoric ancestors were acting "against their nature" by eating meat for 2.5 million years! It's obviously maladaptive behavior!

REPLY: Both of the above have been covered in previous sections of this paper, so just a few additional brief remarks are given here.

1. As mentioned previously, to adopt creationism solely because it supports your "lunch philosophy" is intellectually dishonest.

2. Adaptation to a universal habit (meat-eating) is the ultimate and unavoidable result of evolution. The human genus (Homo) has gone through 3-4 separate species since its inception; this provides direct evidence of evolutionary adaptation to environmental (and universal cultural) selection pressures. Those who make claim #2 above simply have no credible evidence to support their views. Further, such claims suggest a truly massive ego on the part of the extremist; i.e., the raw/veg*n advocate supposedly knows better than all the people who, over 2.5 million years of evolution, ate animal foods for survival. Inasmuch as eating animal foods was clearly necessary due to ecological limits in most (and probably all) locations, the claim that "eating meat is maladaptive" could be restated as "survival is maladaptive." The latter statement clearly delineates the extremist ego.

RATIONALIZATION: The evidence of modern ape diets is irrelevant because we are a unique
species, and we evolved in a different environment (the African savanna) than the forest-dwelling great apes.


REPLY: There is much truth in the above; the differences in human vs. ape evolution have been discussed in earlier sections. However, the significance of ape diets, and of the above rationalization about them, stems from two points:

1. The 180-degree about-face from the previous stance on ape diets. The similarities between apes and humans have long been used to support the claim that humans are natural vegetarians. Those who have used the argument previously, and then not only drop it, but make a point of proactively targeting it for dismissal--upon hearing that the great apes are not vegetarians--are behaving inconsistently in a fashion that, again, highlights the intellectual dishonesty (perhaps not entirely conscious) implicit in such aspects of the raw/veg*n "party line."

2. Ape diets may give insight into early evolutionary (pre-human) diets. Despite the significant differences between apes and humans, the diet of apes is of some relevance, as it may provide insight into the diets of the early hominids, i.e., our "early" evolutionary diet. The statement that we evolved on the African savanna is of interest, because plant densities are lower on the savanna, and the density of animal life higher (especially when migrating herds visit), than in the rainforest. Foley [1982] reports the surprising but very interesting finding that of the three major habitats--arid regions, savanna/grasslands, and forests--the savanna contains the fewest plant foods edible to humans. (It's quite surprising that an allegedly "vegetarian human" would evolve in the habitat that provides the lowest levels of plant foods!) Thus, the differences in relative availability of foods (savanna vs. rainforest) suggest that the human evolutionary diet was lower in plant foods, and higher in animal foods, than the diet of the great apes (a conclusion that a number of other, separate lines of evidence also support).

RATIONALIZATION: The human gut is far too elastic for comparative anatomy to tell us anything about our natural diet.

REPLY: After years of citing the dubious analyses of gut morphology found in Fit Food for Humanity or Mills' The Comparative Anatomy of Eating, those who suddenly adopt this view when faced with contrary evidence are, again, behaving in an inconsistent and intellectually dishonest manner. That said, although the best scientific research to date indicates we are faunivores--meat-eaters--by anatomy, it is not "hard proof." However, the gut morphology data is only one part of a large set of data (from multiple, separate lines of evidence) that points to humans being natural faunivores.

RATIONALIZATION: The evidence of comparative physiology is irrelevant:

1. We can get adequate B-12 from unwashed produce.
2. The low bioavailability, in plant foods, of protein, iron, zinc, and EFAs is because some/all of these things are "toxic," except in tiny (nutritional) quantities.
3. Physiological measurements made on (degenerate!) meat-eaters are invalid! Only measurements done on fruitarians are valid, and such measurements will be different for "real" humans, i.e., fruitarians.

REPLY: Point #1 is common in the vegan movement; versions of points #2 and #3 above are occasionally made by fruitarian extremists. Let's briefly look at these.

1. Insufficient evidence to support claim #1. Despite the research of Mozafar [1994] discussed earlier herein, there is insufficient data at present to support a claim that wild plant foods (alone) could provide adequate B-12 to sustain a (hypothetical) vegan gatherer tribe.
2. This view is certainly not supported by real, honest science, though there are fruitarian extremists who present elaborate crank-science "proofs" alleging that protein in more than extremely modest amounts (extremely modest even by the conservative estimates approved by the more mainstream conventional-vegan community) is effectively "toxic" because its metabolic by-products are allegedly toxic.
3. Point #3 is not only speculative, it is an example of "dietary racism." Note the implicit hatred in point #3, something that, unfortunately, is far too common in the fruitarian movement. This view is thoroughly egocentric as well. While it is true that levels of zinc, iron, B-12, taurine synthesis, and beta-carotene conversion will likely differ between fruitarians and non-vegetarians, without actual studies we cannot say much about such assumed differences. This claim is basically a diversion by fruitarian extremists.

Rationalizations in Response to the Evolutionary and Hunter-Gatherer Evidence for Omnivorous Diets
Similar to rationalizations offered in reaction to the comparative anatomy evidence for omnivorous adaptation are the following--sometimes more sophisticated--diversionary ploys that may be offered in response to evolutionary and hunter-gatherer evidence.

RATIONALIZATION: What happened back in the Paleolithic age doesn't really matter. We are different people today, in most every way. The diet of the Paleolithic is irrelevant nowadays
given the new conditions we live under that must be coped with.

REPLY: Genetically, we are in fact quite closely similar to the hunter-gatherers of the late Paleolithic era. As Eaton et al. [1988, p. 740] note:

Accordingly, it appears that the [human] gene pool has changed little since anatomically modern humans, Homo sapiens sapiens, became widespread about 35,000 years ago and that, from a genetic standpoint, current humans are still late Paleolithic preagricultural hunter-gatherers.

Because of this close (nearly identical) genetic similarity, the diet of Paleolithic times is relevant. Of course we do live under different circumstances, and we adapt accordingly. However, for evolutionary adaptation to be reflected in the gene pool is generally believed to require a very long time--considerably longer, in any event, than the roughly 10,000 years (much less for some groups) since humans took up agriculture and ceased to be exclusive hunter-gatherers, a period long enough to produce only minimal genetic changes.

RATIONALIZATION: Evolution is concerned with reproductive success, not longevity. An evolutionary diet does not have to provide excellent health; it only has to be good enough to allow one to survive to reproduce. We can improve on evolution, and raw/veg*n diets are a good example thereof. After all, veg*ns live longer than non-veg*ns!

REPLY:

• Proposing the veg*n diet as an improvement on evolution reverses the logic implicit in veg*n naturalism claims. The above is probably one of the major defenses to be employed by raw/veg*n advocates. First, the claim that we can improve on evolution can be restated as, "We can improve on nature." This rationalization neatly reverses the logic that underlies the bogus but popular claim that raw/veg*n diets are best because they (allegedly) are "most natural" for humans. Anyone who uses the naturalism argument for raw/veg*n diets, but then uses the rationalization above, is being inconsistent.

• The claim that raw/veg*n diets are an improvement on nature has little if any real supporting evidence. There are some vegan diet advocates who are aware of the evolutionary and paleoanthropological evidence and honestly admit that vegan diets are not natural. That aside, the claim that veg*n diets are an improvement over evolution is simply speculative. There is certainly ample evidence to suggest that conventional veg*n diets are better than SAD/SWD diets, but there are no clinical trials or epidemiological studies comparing veg*ns with hunter-gatherers, or with groups today (such as Paleodiet advocates) attempting to follow modern versions of an evolutionary diet. As previously discussed in this paper, to implicitly equate the SAD/SWD with evolutionary diets just because both are omnivorous is fallacious. Hence one cannot say that veg*n diets are "healthier" than natural, evolutionary, hunter-gatherer-type diets. There is extensive anecdotal evidence for raw vegan diets indicating that long-term success stories are rare. On the other hand, anecdotal evidence for conventional vegan diets suggests they are much more successful, in the long run, than raw vegan diets. This doesn't tell us anything about how such diets would fare against evolutionary diets, however--long-term longitudinal studies of conventional veg*ns are lacking.

• The claim conveniently avoids the failure-to-thrive issue. Another issue here, as discussed earlier--obvious at least to those ex-vegetarians who have made concerted, intelligent attempts at vegan diets (even conventional vegan diets, i.e., including grains/legumes) and who have failed--is the problem of "failure to thrive." (See the earlier section on Drawbacks to Relying Exclusively on Clinical Studies of Diet, starting about one-third to one-half of the way down the page, for a discussion of how mechanisms of built-in structural bias operate in the vegan community to minimize or prevent awareness of failure to thrive and its relevance.) This is an issue that, by and large, has yet to be faced squarely and honestly by the vegetarian movement (including most of its scientifically oriented advocates). Thus, claims that the vegan diet is an improvement over evolution ring hollow when so much remains unknown about actual rates of failure to thrive, and when the question continues to be largely glossed over by all but a very few in the vegetarian movement.

• Evolutionary discordance. An important additional consideration here is the concept of "evolutionary discordance"--that is, negative repercussions as a result of behavior contrary to the organism's genetic design. Our genetic makeup reflects an evolutionary compromise between multiple, competing selective pressures. The organism's physiology is therefore a reflection of adaptations (to such multiple pressures) that must function simultaneously in concert, and that mutually and intimately affect one another. As a result, the body, as determined by its genetic "blueprint," is a densely interwoven mesh of interdependent physiological systems that are very tightly integrated.

Disregarding the genetic dietary range may cause negative repercussions due to the physiological interdependencies determined by evolution. The range of dietary adaptation supported by our genetic code is also the result of multiple selection pressures, with their resulting physiological interdependencies. The problem with straying too far from the behavior/environment (which includes diet) that the body has adapted to genetically is that an alteration in one facet is likely to cause unanticipated repercussions, via the body's tightly woven and interdependent physiological systems. Attempted manipulation of one factor or attribute in the mix may occur at the expense of others, because of the multiple evolutionary selection pressures that have resulted in the particular "balance" of physiological factors that act as "constraints" on each other in the organism's functioning. Therefore, even though evolution may not (in the case of any given species) select for longevity, it does select for highly integrated physiological designs whose component systems mutually constrain each other. This places limits on how much the environment/behavior/diet of the organism can be changed without negative consequences for the majority of individuals in the gene pool. (Excepting, of course, those individuals with the requisite genetic polymorphisms--variations--to benefit from such changes.)

It may be possible to alter one's diet significantly, i.e., to adopt an "evolutionarily discordant" diet, and to see short-term, positive results. A good (if extreme) example is provided by vegan forms of fruitarianism. In the short run, such a diet may actually enhance the health of the person who follows it. (Warning: fruitarian diets are high-risk; consult a qualified health professional before undertaking a fruitarian diet for therapeutic purposes.) However, in the long run, perhaps because the diet may be beyond the range of adaptation, undesired consequences can occur. In the case of fruitarianism, these consequences can include diabetes-like symptoms, severe emaciation, fatigue, mood swings, loss of libido, and serious mental consequences: life-controlling food obsessions, eating-disorder behaviors, incredibly hateful fanaticism, and so on. The loss of libido reported by many fruitarians (and some regular raw veg*ns) suggests that those who follow such a diet will fail to reproduce, and will quickly die out in evolutionary terms. That illustrates one way that evolutionary selection pressure can quickly eliminate a discordant diet. In contrast to this real-world anecdotal experience, if following a fruitarian diet granted long-term robust health and increased sexual virility, it would enhance reproduction, and the genes supporting it would thrive in the gene pool via survival of the fittest; such enhanced reproduction would make the diet predominant over evolutionary time.

Side note: The above provides yet another argument against the claim that humans evolved as
fruitarians. If the diet were even half as good as claimed, those who followed the diet (or the closest approximation thereof, per local conditions) would thrive and out-reproduce those on meat-based diets, and the fruitarian diet would have become the standard diet at some point in our evolution. So, although an individual might adopt a discordant diet and benefit in the short run, unless the diet conveys--consistently--significant survival/reproduction advantages in the extant environment, the genes that support the diet will not survive to become "natural" or standard. Further, the data available to date (anecdotal data) do not support the idea that raw/veg*n diets convey a major survival advantage in the long run.

• Where is the credible, long-term longitudinal data on longevity for vegans vs. non-vegans? As Corder [1998, p. 130] observes, "Most of the literature which addresses diet and longevity is promotional, testimonial, observational or editorial." One wonders how many of the claims regarding veg*n longevity are promotional. It appears that many of the claims that veg*n diets promote longevity are based on short-term (cross-sectional) studies, and/or are extrapolations from studies that show trends toward better biomarkers (or lower disease rates) for people consuming more plant foods. The extrapolation of such studies from low levels of meat consumption to zero meat consumption carries a considerably higher risk of statistical error, as such extrapolations go beyond the range of the data. Also, as mentioned previously, comparisons with the standard Western diet tell one nothing about likely comparisons with evolutionary or hunter-gatherer diets. (A further problem in comparing hunter-gatherer diets with veg*n diets is the low level of sanitation and high "occupational risks" of traditional hunter-gatherer lifestyles.) Even more relevant here is the apparent paucity of long-term, longitudinal data on the longevity of strict, long-term vegans. Claims based on biomarkers are certainly of interest, but actual longevity data, collected over a long period, would be more credible. And of course, one needs other, supplementary data to properly analyze longevity, e.g., data on smoking, alcohol use, accidental deaths, etc. Until such long-term longitudinal data sets are available, claims of veg*n longevity may be subject to challenge. [Note: Readers are invited to contact the author with citations for long-term longitudinal studies of veg*n longevity.]
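The statistical point about extrapolating beyond the range of the data can be made concrete with a minimal sketch: fit a simple linear trend to observations that cover only part of the range, then note how the uncertainty of a prediction grows as you move outside that range. The data below are synthetic and purely illustrative (the predictor standing in for, say, "% of calories from meat"); nothing here is real epidemiological data.

# Sketch: prediction uncertainty grows when extrapolating beyond the data.
# Synthetic data for illustration only -- not real epidemiological results.
import numpy as np

rng = np.random.default_rng(0)

# Observed predictor values cover only 20-60; zero is far outside the range.
x = rng.uniform(20, 60, size=50)
y = 2.0 + 0.05 * x + rng.normal(0, 0.5, size=50)  # true underlying trend

# Ordinary least-squares fit of y = a + b*x.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(x) - 2)          # residual variance
cov = s2 * np.linalg.inv(X.T @ X)          # covariance of the estimates

def pred_se(x0):
    """Standard error of the fitted mean at x0."""
    v = np.array([1.0, x0])
    return float(np.sqrt(v @ cov @ v))

for x0 in (40, 20, 0):   # inside the data, at its edge, far outside it
    print(f"x={x0:>2}: prediction {beta[0] + beta[1] * x0:.2f} "
          f"+/- {2 * pred_se(x0):.2f}")
# The +/- band is smallest near the middle of the observed data and
# widens sharply at x=0, a region the observations never covered.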

RATIONALIZATION: Hunter-gatherers may eat some meat but they are not that far from being vegetarians. Many raw/veg*n advocates say that: (a) hunter-gatherers usually have a diet in which
plant foods are predominant, and also that (b) hunter-gatherers rarely hunt, and when they do they are usually unsuccessful. Ergo, their diet is nearly vegetarian. (Note: This general idea, in varying forms, is held not just by raw extremists, but tends to be shared by those who identify themselves as conventional vegans as well.)

REPLY: This claim can probably be traced back to early (since-superseded) anthropological work on hunter-gatherers, and may also be based on a flawed and outdated interpretation of a hunter-gatherer survey. The claim is a false myth, and one that certain "diet gurus" (and, as well, more scientifically oriented conventional vegans who should probably know better) are quite happy to propagate.

• Popularized early studies of San Bushmen as one source of the myth. The myth of the minimally-meat-eating hunter-gatherer may have its roots in early publicity surrounding one of the first well-studied hunter-gatherer societies to gain widespread attention (in the 1960s/1970s) in the anthropological community--the San tribes of the Kalahari Desert in Africa. In particular, the !Kung San, widely reported on as a mostly peaceful, egalitarian tribe who ate much more plant food than meat, went on to become something of a cause célèbre as the prototype of hunter-gatherers--as well as a Rorschach blot for anyone with a theory about humanity's inborn sociocultural inclinations to use as a jumping-off point. (For an overview of early romanticism about the !Kung, and how knowledge and views of hunter-gatherers have grown and changed since then, see Lewin [1988].) But as it happened, further research and analysis has since shown that the San are unrepresentative of the majority of hunter-gatherer tribes where plant-to-animal-food ratios are concerned. (The !Kung San derive 37% of their food from animal foods, the Kade San 20% [Eaton 1985]; and in this connection, it's worth remarking that even 20% is a significant amount--in reality not really that close to "nearly vegetarian.")

• Flawed early survey of worldwide hunter-gatherers as another source. Richard Lee did some of the influential early (and still well-regarded) anthropological work studying the !Kung. (See Lee [1979] for a report of his anthropological observations.) However, it is not just the diet of the !Kung San or related Bushmen that may be the source of the widespread (erroneous) "mostly gatherer" view of pre-agricultural people, but perhaps also a hunter-gatherer survey by Lee [1968]. That survey did claim that hunter-gatherers gathered more than they hunted. However, as Ember [1978] explains, Lee's interpretation of the survey data (not the survey itself) was seriously flawed and unrepresentative of hunter-gatherers. The flaws in Lee's interpretation are as follows:

1. He reclassified shellfish-collecting as gathering, rather than hunting.
2. He (arbitrarily) excluded many North American hunter-gatherer tribes from the survey (which was based on the Ethnographic Atlas [Murdock 1967]).

The result, as described by Ember [1978], was to arbitrarily increase the apparent importance of gathering in the survey.

• More recent and detailed analysis of survey data finds hunting more important than gathering. Ember [1978, table 2, p. 441] describes a more thorough survey utilizing all of the tribes in the Ethnographic Atlas (including the North American cases excluded by Lee), which found that in 77% of the hunter-gatherer societies, gathering contributed less than half the food calories. Hunting and gathering contributed approximately equal calories in 13% of the societies, and gathering contributed more calories than hunting in only 10% of the societies. (Also see the subsection on hunter-gatherers in "Metabolic Evidence of Human Adaptation to Increased Carnivory," below.) Thus the claim that hunter-gatherers are incompetent hunters who rely primarily on gathering is false; hunter-gatherer societies in which gathered plant food is a more important food source than hunting are a small minority.


• Outdated, incorrect claims that Australian Aborigines were "nearly vegetarian." The San Bushmen are not the only hunter-gatherer group whose diet has been misunderstood or misrepresented. One can also find inaccurate claims that the Aborigines of Australia were predominantly vegetarian. Lee [1996, p. 7] summarizes the situation nicely:

For many years it was believed that the Aboriginal diet was predominately vegetarian, and a statement that the central Australian desert diet was composed of "70-80% plant foods" [93] had been widely accepted [87, 101-103]. This is no longer believed to be the case. Meggitt [93] compared foods by estimated weight, which probably overassessed the importance of the vegetable component of the diet. Support for the vegetarian basis of the traditional desert diet had also been interpreted from botanical lists, observations illustrating the ripening of different plant food species throughout the year [96] and evidence of storage of plant foods. However, none of these sources attempted to quantify actual dietary intake, and the effects of climate, seasonality and specific location must also be considered [37, 70, 96]... There is increasing evidence that both tropical savanna/coastal and desert [86, 90] diets were meat-oriented; vegetable foods provided an important supplement, rather than an alternative to animal foods, with proportions changing throughout the seasons [10, 25, 36, 82, 84, 100]. For example, the coastal people of Cape York concentrated so much on the procurement of marine mammals that vegetable foods were considered a luxury [7]. Desert Aborigines have described themselves specifically as meat eaters [106]. Early observers commented on the intake of meat of smaller mammals as a "mainstay" of the diet of central Australian tribes [80].

The above also reminds us to be cautious in evaluating earlier studies, and to use all the available evidence when evaluating the diets of hunter-gatherers. Otherwise, one may end up making fallacious claims, as here, e.g., that Australian Aborigines (or other hunter-gatherers) were "nearly vegetarian."

RATIONALIZATION: There are no vegan gatherer tribes because they have not been exposed to the "enlightened" philosophy of veg*nism. They are living in ignorance and have not evolved spiritually.

REPLY: That there are no vegan hunter-gatherer tribes suggests the diet is neither feasible for them nor natural. Stop and think carefully about the nature of claims that a particular diet somehow makes you "enlightened" or "superior." Stripped of their idealistic rhetoric, the nature of such claims is, quite simply: "My lunch is better than yours, and that makes me a better person than you!" Even worse, a few fruitarian extremists actively promote this nonsense in dishonest and incredibly hateful ways. I hope that you can see that rather than "enlightened," such attitudes are really self-righteous, egotistical, and the dietary equivalent of racism. Because of this, it is best not to view veg*n philosophy as "enlightened."

RATIONALIZATION: The hunter-gatherer diet is not feasible for people living in modern times; it's just a bunch of academic "ivory-tower" theorizing about diet and nutrition. Vegan diets are real and work!

REPLY: The earlier section regarding FTT (failure to thrive) indicates that veg*n diets don't seem to work for everyone. As for the claim that hunter-gatherer diets are all academic theorizing, a partially tongue-in-cheek reply is: Whether paleodiets are fully achievable in today's world might be debated--however, is it not obvious that attempts to approximate them will come much closer than attempts not to? More seriously, the section Should You Eat Meat? (later herein) briefly describes how it may be possible to approximate a hunter-gatherer diet today.


Evolution and Vegetarian Choice: Continued Dogma or a New Honesty?
Based on past experience (some of which was quite unpleasant), it is possible that some "spiritually advanced, peaceful, compassionate" raw/veg*n diet advocates will react to the information in this article with some of the following.

DIVERSIONARY TACTICS: PERSONAL ATTACKS AND NITPICKING

Attacking the messenger while ignoring the message
Hateful personal attacks by dietary extremists--e.g., pejorative name-calling, claiming that I am anti-vegetarian, anti-raw, etc.--can be fairly easily predicted. One fruitarian extremist group (that, in my opinion, behaves like a hate group--an opinion shared by many others) has attacked me on numerous occasions in the past with defamatory lies and threats of violence. I have also been attacked by other fruitarian extremists (e.g., promoters of crank science theories) and even a conventional veg*n extremist (a notorious Internet "kook").

Please note that I am both pro-vegetarian and pro-raw. Readers should be aware that I am a long-time vegetarian (since 1970), a former long-time (8+ years) fruitarian (also a former vegan), have followed natural-hygiene-style and living-foods-style raw vegan diets, and am both pro-vegetarian and pro-raw. However, I am definitely not a promoter of, or a "missionary" for, any specific diet. In reality, I am tired of seeing raw and veg*n diets promoted in negative ways by extremists whose hostile and dishonest behavior is a betrayal of the positive moral principles that are supposedly at the heart of veg*nism. (See the site article Assessing Claims and Credibility in the Realm of Raw and Alternative Diets for insight into the behavior of extremists.)

Nitpicking inconsequential issues while ignoring the relevant ones
Any small errors in the material here (and, given the size and scope of the material, there are likely to be a few despite a considerable amount of fact-checking) do not by themselves invalidate the major points or conclusions given here. No doubt extremists will greatly exaggerate the presence (and importance) of any small errors they can find. Past experience with certain fruitarian extremists who have dishonestly misrepresented my views so they could nitpick them suggests that the material on this site will be misrepresented and subjected to nitpicking as well. Again, see the site article Assessing Claims and Credibility for a discussion of related extremist tactics.

How will you react?
As a reader of this paper, you have the choice of how you react to the information here. Will you use it as an opportunity for self-examination regarding the information you use to back up your dietary philosophy? Will you try to ignore it, using nitpicking and rationalizations as an emotional shield to avoid considering the material herein? Or, will you behave like an extremist and threaten or attack me or the other writers on this site, thereby promoting hatred and negativity? If you are tempted to follow that route because the material here challenges your "lunch philosophy," then please stop and think: Will raw/veg*n hatred make the world a better place? Are hatred and threats simply a new type of raw/veg*n "compassion"? I suggest that readers ignore personal attacks and nitpicking, and focus on the actual issues rather than getting sidetracked by such diversions. Instead, focus on evaluating the myths, fallacious logic, crank science, and unsubstantiated claims that are prevalent in the raw/veg*n movement, and ask yourself: Do I really want to participate in (or condone) promoting the raw/veg*n movements using such dubious means?

Dishonest raw/veg*n diet gurus
Every time I have raised the issue of the dishonesty, crank science, and even hostility that are common in the promotion of raw/veg*n diets, many people have quickly rallied to the defense of the diet gurus. In so doing, they are making certain implicit assumptions (though perhaps not consciously): that negative, dishonest means are acceptable (or may be condoned by silence or lack of opposing comment when one is aware of them), and/or may even be necessary to promote raw/veg*n diets. (This is pretty much an "ends justify the means" argument.) Needless to say, such assumptions are outrageous. I hope that you share my view that raw/veg*n diets can and should be promoted in honest, positive ways, with legitimate science, and with complete respect for those who choose other diets. There is no need for myths, crank science, or dietary racism in promoting raw/veg*n diets.

Should You Eat Meat?
The objective of this paper has been to examine the claims made in various comparative "proofs" of diets, and not to promote one diet over another. Similarly, this website does not seek to promote a particular diet. Instead, the focus here is on providing you with information for your consideration and evaluation. With that in mind, the question of whether you should eat meat is a question that only you can answer. The role of spiritual or moral factors in such a decision is at your discretion, i.e., you choose whether to include such factors in the decision process. As a veg*n for moral/spiritual reasons, I strongly encourage you to consider such factors, if appropriate in your case. It is certainly not my intent here to promote meat-eating, only to clarify the scientific facts, and to hopefully dispel some of the myths/crank science associated with raw/veg*n diets. Ultimately, however, diet is your personal responsibility, and your personal decision, as well.

A few comments are relevant here, in context.

• The fact that domesticated/feedlot meat is potentially harmful has been mentioned a number of times in this paper.
• SAD diet not a good idea. As the SAD/SWD diets have (deservedly) bad reputations and use domesticated/feedlot meats, adopting or following such a diet is not a very good idea.
• Propensity to consume fauna might not serve best interests in modern society. Hamilton and Busse [1978, p. 765] note:

The human propensity to expand dietary meat consumption seems to be a legacy from our omnivorous primate heritage. Humans apparently share with most primates a tendency to increase the proportion of dietary animal matter whenever it is economical to do so. We inherited dietary preferences for animal matter, which historically have been limited by economics and a hierarchical society. Under current luxury diet circumstances, no such balance of diet to resource availability prevails and preference betrays best interest.

• Times have changed, but the calorie content of fat has not. Allman [1994] summarizes the dietary paradox we face rather nicely (p. 204):

The human body absorbs 95% of the fat we ingest, suggesting that this compact source of calories was extremely hard to come by in ancient times. This legacy of our ancestors' eating has created a psychology of food preference that is celebrated in the local burger palace. Having existed for eons in an environment where only lean meat was available, our ancestors would have found a fast-food hamburger a gustatorial paradise. It is precisely what our ancestors loved about fat--its incredibly rich content of calories--that makes it so bad in modern times, when fat is available in great quantities. This evolutionary legacy in our food psychology is the reason fatty foods cause so much trouble for those of us who live in the food-rich industrialized West today: Having evolved in an environment where fat was scarce, our modern-day minds have a hard time knowing when to stop.


• Possible approximation of hunter-gatherer diets in modern societies. Although the evolutionary hunter-gatherer diet is not feasible for most people at present, it may be possible to approximate it via a diet that includes:

o Grass-fed buffalo, yak, emu, duck, or other less-domesticated animals (perhaps even pastured beef, assuming you can find it).
o Wild fish are a possibility as well, since in many cases they fairly closely approximate key characteristics of land game (low saturated fat, abundant EPA/EFAs, appropriate omega-3 to omega-6 ratio, etc.), though fish are not generally regarded as completely strict evolutionary foods, as many of our ancestors lived inland and lacked fishing technology. (The assertion that fish are not completely within the definition of an evolutionary diet is somewhat controversial, however, and subject to some debate.)
o Non-hybrid vegetables and wild plant foods, as available (using commercial plant foods--but no grains or legumes--when wild is not available). As well, many commercial plant foods, depending on the item in question (low glycemic index, for example, as one criterion), may be close enough to the profile of wild plant foods to include in a paleo-style diet.
o Insects, as an animal-food source, are also within the evolutionary definition.

The comments above are provided to stimulate discussion and research. The above is not intended as an individual dietary prescription or recommendation.

A New Beginning?
Newer knowledge can help in self-assessment. I hope that the information in this paper has been of interest to you. I also hope that you don't react with denial (rationalizations, excuses) or hostility (attacks, threats). Instead, I hope that you use the process of "mentally digesting" the information here as an opportunity for in-depth self-examination of the attitudes you may hold toward diet, and as a real opportunity to expand your vision about the human species'--your own--actual dietary heritage.
This common heritage we all share has had a profound impact on human physiology and anatomy, and also exerts its effects internally (mentally) in how we experience our bodies' programmed biological reactions to different foods. A helpful result of knowing this is that it more thoroughly explains why we crave certain food types, giving a better perspective on the challenges facing those of us in modern times eating diets that may significantly differ from the species' evolutionary diet.

The pervasive nature of the false claim that "humans are natural veg*ns" has had its own, long-term, impact on the veg*n movement. Unfortunately, many veg*ns have developed very strong
attachments to their "lunch philosophy," and it has become a pseudo-religion for some (with the dietary extremists being, in effect, hateful and fanatical followers of the pseudo-religion). However, if you follow a veg*n diet because of underlying moral or spiritual factors, please stop and think:

• You really don't need the naturalness claim to be a veg*n! That is, moral/spiritual reasons alone are adequate to justify following a veg*n diet (assuming the diet works for you, of course).
• Further, if the motivation for your diet is moral and/or spiritual, then you will want the basis of your diet to be honest as well as compassionate. In that case, ditching the false myths of naturalness presents no problems; indeed, ditching false myths means shedding a burden.

The information presented in this paper should, hopefully, make clear that many raw/veg*n "diet gurus" are promoting misinformation, myths, and logical fallacies as part of the raw/veg*n "party line." Unfortunately, junk science and crank science are far too common in the raw/veg*n community, and some extremists have earned their very bad reputations for hostility and dishonesty, as well.


Myths and crank science: a good basis for the raw/veg*n movements? And now, consider
whether these factors--myths, logical fallacies, crank science, hostile behavior by some raw/veg*n extremists--are a good basis for the long-term growth of the raw/veg*n community. Are these the factors that will expand the community and build a stable basis for the future? Or are such factors simply the seeds of a long-term, self-destructive erosion of the raw/veg*n community's credibility--an erosion that only stands to spread as emerging evolutionary knowledge increasingly influences nutritional science, sets the record straight, and exposes the above myth-making to a wider public for what it is?

Personal choice: myths and crank science, or reality and honesty? Each of us has a choice: we
can cling to false myths and crank science, or we can open up to reality and strive for an honest basis for our own diets, and for the larger raw/veg*n communities as well. I hope that you will give serious thought to your choice.

EPILOGUE: A Personal Note to the Reader
Writing this paper has been a major effort for me. It consumed most of my spare time for months: evenings, holidays, Saturdays, Sundays, and so on. Believe it or not, this paper was a real labor of love.

My motive for writing it is to educate individuals about the myths of raw/veg*n diets, and thereby to help, in whatever way I can, to encourage reform on these issues in the raw/veg*n movements; to encourage the raw/veg*n movements toward a more consistently honest basis; and to encourage people to be skeptical of the dishonest and/or delusional crank science and myths that are promoted by far too many "diet gurus."

I hope that you have enjoyed this paper. If so, I encourage you to read more of the material on this website, and also to invite your friends to visit the site and read the material here. Thank you for reading, and I wish you good health, and good thinking!

P.S. I encourage all to also read Appendix 1. It is relevant to the subject and will likely be of interest to many readers. --Tom Billings

APPENDIX 1: The Carnivorous/Faunivorous Vegan
Acknowledgment: This appendix, including its title, is inspired by Moir [1994]. The objective of this brief appendix is to remind vegans, especially those who fall into the ego trap of self-righteousness, that, by and large, no one is really a "pure" or "true" vegan. Even the most obsessive, nitpicking, label-reading vegan is not a "pure" vegan. The reason is that virtually everyone not only uses animal products indirectly, but also consumes some fauna or animal foods, even if inadvertently.


Breast-fed "vegan" infants are actually carnivores. This potentially distressing state of affairs (to
the extremist) usually begins at birth, at least for those who are breast-fed. Moir [1994, p. 88] informs us (boldface emphasis mine): Lactivory is an adapted form of carnivory in that the food source, milk, is derived directly from animal cells (Moir, 1991). It follows that no young mammal is initially herbivorous (White, 1985); all are carnivores for some period. While this is a general rule for mammals, there is ample evidence that it applies to a large range of invertebrates as well as diverse vertebrates; the initial development of the young borne of eggs, whether oviparous or viviparous, is explicitly dependent on succour from the female in the form of yolk or nutrient transfer. Moir [1994] points out that most herbivorous mammalian neonates cannot use or obtain the normal food of adults. Further, most single plant foods are inadequate to support growth to maturity--they lack amino acids, typically lysine, and other nutritional factors (e.g., vitamin B-12). Finally, the significant energy and rapid growth requirements of a juvenile animal require food that is very high quality and easy to digest. Plant foods cannot meet the demanding requirements, but carnivory--including insectivory and lactivory (milk) can. (The amino acid profile for milk is said to be "close" to that for animal tissues.)

Carnivorous diet for infants is a necessary survival tactic. Moir concludes [1994, p. 99]:
The apparent carnivory of juvenile herbivores is a necessary tactic to satisfy the requirement for available amino acids, particularly the lysine and methionine, required in a highly concentrated and digestible form that is constantly and universally available. While plant proteins, particularly rubisco [ribulose bisphosphate carboxylase/oxygenase, a protein], have the required amino acid structure, digestibility, biological value and concentration required, the levels and availability fluctuate because of seasonal and structural changes (Minson and Wilson, 1980) and other interfering substances (Carpenter, 1960), except in very specific niches. Therefore, the only way for young herbivores to realize their growth potential and survive to reach reproductive maturity is to behave nutritionally as "carnivores."

Thus we see that every "vegan" child who is breast-fed really starts off his or her life as a carnivore. And so does every other mammal, whether folivore, frugivore, or faunivore.

Note: The above is not intended to discourage breast-feeding by vegan mothers, or to increase
the negative attitudes (i.e., hatred) frequently displayed by allegedly compassionate vegans towards dairy.

Every vegan is actually a faunivore. Finally, readers should know that every vegan is actually a
faunivore because of the inadvertent consumption of insects. From Taylor [1975, p. 33] (italic and boldface emphases both mine):

There's one last point I might call attention to in this consideration of our indirect eating of insects: strictly speaking, there is no such thing as a vegetarian, for vegetarians generally eat more insects than do non-vegetarians, simply because a larger percentage of their diet consists of food of plant origin. The more fruits, nuts, and vegetables in one's diet, the more insects one is going to eat. Since insects are animals, and their presence in our foods of plant origin is ubiquitous--although generally undetected--we cannot escape the fact that the so-called vegetarian is no vegetarian. Man is omnivorous [faunivorous], whether he wants to be or not.

Paleolithic Diet vs. Vegetarianism: What was humanity's original, natural diet?
A 3-Part Visit with Ward Nicholson
Copyright © 1998 by Ward Nicholson. All rights reserved.


INTRODUCTORY REMARKS
The text of the interview is republished here much as it originally appeared in Chet Day's Health & Beyond newsletter, with a few small modifications necessary for the present web version. See the introductory remarks on the front page of the interview series for specifics.

For those unfamiliar, the term "Natural Hygiene," which appears periodically in these interviews,
is a health philosophy emphasizing a diet of mostly raw-food vegetarianism, primarily fruits, vegetables, and nuts; for revisionists who eat some cooked food, it can also include significant supplementary amounts of grains, legumes, and tubers.

Ward transferred coordinatorship of the Natural Hygiene M2M to long-time member Bob Avery in 1997, and is no longer associated with the Natural Hygiene movement. To learn more about the N.H.
M2M (now called the Natural Health M2M), or for information about getting a sample copy, you can find out more here.

Introduction
Our guest this issue is Ward Nicholson, [former] Coordinator of The Natural Hygiene M2M--a unique "many-to-many" letter group forum created for correspondence between Natural Hygienists in a published format, which has been operating since 1992. [Note: Long-time M2M participant Bob Avery took over the Coordinatorship in December 1996 after Ward resigned and left the Natural Hygiene movement.] Participants in the M2M write letters about their Hygienic experiences, debate viewpoints, and offer support to each other. Each issue, Ward collates the letters together (exactly as-is with nothing edited out) into a 160- to 180-page bimonthly installment of the M2M, a copy of which is sent out to everyone in the group for ongoing response.

Two primary issues to be covered. What we'll be discussing with Mr. Nicholson in H&B are two things:

• Real-world accounts of how long-term vegans do on the Natural Hygiene diet. One of these consists of the ideas and conclusions Ward has reached about Hygienists' actual experiences in the real world (based on interacting with many Hygienists while coordinating the N.H. M2M)--which are often at variance with what the "official" Hygienic books tell us "should" happen.
• The human evolutionary/dietary past. And the other is the meticulous research he has done tracking down what our human ancestors ate in the evolutionary past as known by modern science, in the interest of discovering directly what the "food of our biological adaptation" actually was and is--again in the real world rather than in theory.

Given the recent death of T.C. Fry, I consider Ward's analysis of special importance to those who continue to adhere strictly to the fruits, vegetables, nuts and seeds diet. We'll tackle this month the question of humanity's primitive diet. In two subsequent issues, we'll wrap that topic up and delve into what Ward has learned from coordinating the Natural Hygiene M2M about Hygienists' experiences in real life. You'll find that will be a recurring theme throughout our discussions with Mr. Nicholson: what really goes on in real life when you are able to hear a full spectrum of stories from a range of Hygienists, as well as what science says about areas of Hygiene that, as you will find, have in some cases been researched poorly or not at all by previous Hygienic writers.

Not everyone will agree with or appreciate what Mr. Nicholson has to say. But, as I've written more than once, I publish material in H&B that you won't find anywhere else--material and sound thinking that interests me and calls into question my ideas and my assumptions about building health naturally. In this series of three interviews, I guarantee Ward will challenge many of our mind-sets. Mr. Nicholson has a lot of ground to cover, so without further ado, I happily present our controversial and articulate guest for this issue of H&B.

Personal experiences with fasting, Natural Hygiene, and veganism

Health & Beyond: Ward, why don't we start out with my traditional question: How was it that you became involved with Natural Hygiene?
Ward Nicholson: I got my introduction to Natural Hygiene through distance running, which eventually got me interested in the role of diet in athletic performance. During high school and college--throughout most of the 1970s--I was a competitive distance runner. Runners are very concerned with anything that will improve their energy, endurance, and rate of recovery, and are usually open to experimenting with different regimens in the interest of getting ever-better results. Since I've always been a bookworm, that's usually the first route I take for teaching myself about subjects I get interested in.

In 1974 or '75, I read the book Yoga and the Athlete, by Ian Jackson, when it was published by Runner's World magazine. In it, he talked about his forays into hatha yoga (the stretching postures) as a way of rehabilitating himself from running injuries he had sustained. He eventually got into yoga full-time and, from there, began investigating diet's effect on the body, writing about that too.

At first I was more interested in Are Waerland (a European Hygienist health advocate with a differing slant than Shelton), who was mentioned in the book, so I wrote Jackson for more information. But instead of giving me information about Waerland, he steered me in the direction of American Natural Hygiene, saying in his experience it was far superior. I was also fascinated with Jackson's experiences with fasting. He credited fasting with helping his distance running, and had a somewhat mind-blowing "peak experience" while running on his first long fast. He kept training at long distances during his fasts, so I decided that would be the first aspect of the Hygienic program I would try myself. Then in the meantime, I started frequenting health-food stores and ran across Herbert Shelton's Fasting Can Save Your Life on the bookracks, which, as we all know, has been a very persuasive book for beginning Natural Hygienists.

Initial experiences with fasting. So to ease into things gradually, I started out with a few 3-day "juice"
fasts (I know some Hygienists will object to this language, but bear with me), then later two 8-day juice-diet fasts while I kept on running and working at my warehouse job (during college). These were done--in fact, all the fasts I've experienced have been done--at home on my own. Needless to say, I found these "fasts" on juices difficult, since I was both working, and working out, at the same time. Had they been true "water" fasts, I doubt I would have been able to do it. I had been enticed by the promises of more robust health and greater eventual energy from fasting, and kept wondering why I didn't feel as great while fasting as the books said I would, with their stories of past supermen lifting heavy weights or walking or running long distances as they fasted. Little did I realize in my naivete that this was normal for most fasters. At the time I assumed, as Hygienists have probably been assuming since time immemorial when they don't get the hoped-for results, that it was just because I "wasn't cleaned out enough." So in order to get more cleaned out, I kept doing longer fasts, working up to a 13-day true water fast, and finally a 25-day water fast over Christmas break my senior year in college. (I had smartened up just a little bit by this time and didn't try running during these longer fasts on water alone.)


I also tried the Hygienic vegetarian diet around this time. But as the mostly raw-food diet negatively affected my energy levels and consequently my distance running performance, I lost enthusiasm for it, and my Hygienic interests receded to the back burner. I was also weary of fasting at this point, never having reached what I supposed was the Hygienic promised land of a total clean-out, so that held no further allure for me at the time.

Health crash. After college, I drifted away from running and got into doing hatha yoga for a couple of
years, taught a couple of local classes in it, then started my own business as a typesetter and graphic designer. Things took off, and during the mid-to-late 1980s I worked 60 to 80 hours a week, often on just 5 to 6 hours of sleep a night, under extreme round-the-clock deadline pressures setting type at the computer for demanding advertising agency clients. I dropped all pretense of Hygienic living, with the exception of maintaining a nominally "vegetarian" regime. This did not preclude me, however, from guzzling large amounts of caffeine and sugar in the form of a half-gallon or more of soft drinks per day to keep going. Eventually all this took its toll, and by 1990 my nervous system--and I assume (not having gone to a doctor, like most Hygienists!) probably my adrenals--were essentially just about shot from all the mainlining of sugar and caffeine, the lack of sleep, and the 24-hour-a-day deadlines and accompanying emotional pressures. I started having severe panic or adrenaline attacks that would sometimes last several hours, during which time I literally thought I might die from a heart attack or asphyxiation. The attacks were so debilitating it would take at least a full day afterwards to recover every time I had one.

Post-crash fasts and recommitment to Natural Hygiene. Finally, in late 1990/early 1991, after I had begun having one or two of these attacks a week, I decided it was "change my ways or else" and did a 42-day fast at home by myself (mostly on water, with occasional juices when I was feeling low), after which I went on a 95%-100% raw-food Hygienic diet. The panic attacks finally subsided after the 5th day of fasting, and have not returned since, although I did come close to having a few the first year or two after the fast. Soon after I made the recommitment to Hygienic living, when I had about completed my 42-day fast, I called a couple of Hygienic doctors and had a few phone consultations. But while the information I received was useful to a degree for my immediate symptoms, it did not really answer my Hygienic questions as I'd hoped, nor did it turn out to be of significant help in overcoming my health problems over the longer term. So in 1992 I decided to start the Natural Hygiene M2M to get directly in touch with Hygienists who had had real experience with their own problems--not just book knowledge, and not just the party line I could already get from mainstream Hygiene. With this new source of information and experience to draw on, among others, my health has continued to improve from the low it had reached, but it has been a gradual, trial-and-error process, and not without the occasional setback to learn from.

Improvements on fasts alternating with gradual downhill trend on vegan Natural Hygiene diet. One of the motivating factors here was that although fasting had been helpful (and continues to be), during the time in between fasts (I have done three subsequent water fasts of 11 days, 20 days, and 14 days in the past five years), I was unfortunately just not getting the results we are led to expect with the Hygienic diet itself. In fact, at best I was stagnating, and at worst I was developing new symptoms that, while mild, were heading in a disconcerting downhill direction. Over time, the disparity between the Hygienic philosophy and the results I was (not) getting started eating at me. Through reading the experiences of others in the M2M, I slowly began to consider that it was not something I was "doing wrong," or that I wasn't adhering to the details sufficiently, but that there were others who were also not doing so well following the Hygienic diet, try as they might. The "blame the victim for not following all the itty bitty details just right" mentality began to seem more and more suspect to me.

This leads us up to the next phase of your Hygienic journey, where you eventually decided to remodel your diet based on your exploration of the evolutionary picture of early human diets as now known by science. Coming from your Hygienic background, what was it that got you so interested in evolution?
Well, I have always taken very seriously, as one of my first principles, the axiom in Hygiene that we should be eating "food of our biological adaptation." To tell us what that is, Hygiene offers the "comparative anatomy" line of reasoning we are all familiar with: You look at the anatomical and digestive structures of various animals, classify them, and note the types of food that animals with certain digestive structures eat. By that criterion, of course, humans are said to be either frugivores or vegetarians, like the apes are said to be, depending on how the language is used.

Shortcomings of the "comparative anatomy" rationale for determining our "natural" diet. Now at first (like any good upstanding Hygienist!) I did not question this argument, because as far as it goes it is certainly logical. But nonetheless, it came to seem to me that this was an indirect route to the truth, because as similar as we may be to the apes, and especially the chimpanzee (our closest relative), we are still a different species. We aren't looking directly at ourselves via this route; we are looking at a different animal and basically just assuming that our diet will be pretty much like theirs based on certain digestive similarities. And in that difference between them and us could reside errors of fact. So I figured that one day, probably from outside Hygiene itself, someone would come along with a book on diet or natural foods that would pull together the evidence directly from paleontology and evolutionary science and nail it down once and for all. Of course, I felt confident it would basically vindicate the Hygienic argument from comparative anatomy, so it remained merely an academic concern to me at the time.

Exposure to the evolutionary picture and subsequent disillusionment with Natural Hygiene. And then one day several years ago, there I was at the bookstore when out popped the words The Paleolithic Prescription[1] (by Boyd Eaton, M.D., and anthropologists Marjorie Shostak and Melvin Konner) on the spine of a book just within the range of my peripheral vision. Let me tell you, I tackled that book in nothing flat! But when I opened it up and began reading, I was very dismayed to find there was much talk about the kind of lean game animals our ancestors in Paleolithic times (40,000 years ago) ate as an aspect of their otherwise high-plant-food diet,* but nowhere a word about pure vegetarianism in our past, except one measly paragraph saying it had never existed and simply wasn't supported by the evidence.[2] I have to tell you that while I bought the book, red lights were flashing as I argued vociferously in my head with the authors on almost every other page, exploiting every tiny little loophole I could find to save my belief in humanity's original vegetarian and perhaps even fruitarian ways. "Perhaps you haven't looked far enough back in time," I told them inside myself. "You are just biased because of the modern meat-eating culture that surrounds us," I silently screamed, "so you can't see the vegetarianism that was really there because you aren't even looking for it!" So in order to prove them wrong, I decided I'd have to unearth all the scientific sources at the local university library myself and look at the published evidence directly. But I didn't do this at first--I stalled for about a year, basically being an ostrich for that time, sort of forgetting about the subject to bury the cognitive dissonance I was feeling.

News of long-time vegetarians abandoning the diet due to failure to thrive. In the meantime, though, I happened to hear from a hatha yoga teacher I was acquainted with, who had taught internationally and was well-known in the yoga community both in the U.S. and abroad in the '70s and early '80s, and who, along with his significant other, had been vegetarian for about 17 years. To my amazement, he told me in response to my bragging about my raw-food diet that he and his partner had re-introduced some flesh foods into their diet a few years previously, after some years of going downhill on their vegetarian diets, and that it had resulted in a significant upswing in their health. He also noted that a number of their vegetarian friends in the yoga community had gone through the same pattern of deteriorating health after 10-15 years as vegetarians since the '70s era.


Once again, of course, I pooh-poohed all this to myself because they obviously weren't "Hygienist" vegetarians and none of their friends probably were either. You know the line of thinking: If it ain't Hygienic vegetarianism, by golly, we'll just discount the results as completely irrelevant! If there's even one iota of difference between their brand of vegetarianism and ours, well then, out the window with all the results! But it did get me thinking, because this was a man of considerable intellect as well as a person of integrity whom I respected more than perhaps anyone else I knew.

Gradual personal health decline on vegan diet. And then a few months after that, I began noticing I was having almost continual semi-diarrhea on my raw-food diet and could not seem to make well-formed stools. I was not sleeping well, and my stamina was sub-par during both daily tasks and exercise--of particular concern after having gotten back into distance running again--and so real doubts began creeping in. It was around this time I finally made that trip to the university library.

And so what did you find?
Enough evidence for the existence of animal flesh consumption from early in human prehistory (approx. 2-3 million years ago) that I knew I could no longer ignore the obvious. For a while I simply could not believe that Hygienists had never looked into this. But while it was disillusioning, that disillusionment gradually turned into something exciting, because I knew I was looking directly at what scientists knew based on the evidence. It gave me a feeling of more power and control, and an awareness of further dietary factors I had previously ruled out that I could experiment with to improve my health, because now I was dealing with something much closer to "the actual" (based on scientific findings and evidence) as opposed to dietary "idealism."

Paleontological evidence shows humans have always been omnivores

What kind of "evidence" are we talking about here?
At its most basic, an accumulation of archaeological excavations by paleontologists, ranging all the way from the recent past of 10,000-20,000 years ago back to approximately 2 million years ago, where ancient "hominid" (meaning human and/or proto-human) skeletal remains are found in conjunction with stone tools and animal bones that have cut marks on them. These cut marks indicate the flesh was scraped away from the bone with human-made tools, and could not have been made in any other way. You also find distinctively smashed bones occurring in conjunction with hammerstones that clearly show they were used to get at the marrow for its fatty material.[3] Prior to the evidence from these earliest stone tools, going back even further (2-3 million years), is chemical evidence from strontium/calcium ratios in fossilized bone showing that some of the diet of earlier hominids was also coming from animal flesh.[4] (Strontium/calcium ratios in bone indicate relative amounts of plant vs. animal foods in the diet.[5]) Scanning electron microscope studies of the microwear of fossil teeth from various periods well back into human prehistory show wear patterns indicating the use of flesh in the diet too.[6] The consistency of these findings across vast eons of time shows that these were not isolated incidents but characteristic behavior of hominids in many times and many places.
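
To make the logic of the strontium/calcium method concrete, here is a minimal illustrative sketch in Python. All the numbers are hypothetical, and the simple two-endmember mixing model is only a rough stand-in for the actual techniques used in the studies cited:

    # Illustrative sketch of strontium/calcium reasoning (hypothetical values).
    # Plants take up strontium from soil, while animals preferentially retain
    # calcium over strontium, so Sr/Ca falls at each step up the food chain.
    # A fossil's Sr/Ca can thus be placed between local herbivore and
    # carnivore reference values to gauge the plant/animal mix in its diet.

    def estimate_plant_fraction(sr_ca_fossil, sr_ca_herbivore, sr_ca_carnivore):
        """Approximate plant fraction of diet via linear mixing (a simplification)."""
        span = sr_ca_herbivore - sr_ca_carnivore
        fraction = (sr_ca_fossil - sr_ca_carnivore) / span
        return max(0.0, min(1.0, fraction))  # clamp to the 0-100% range

    # Hypothetical reference ratios for animals from the same fossil deposit:
    herbivore, carnivore, fossil_hominid = 0.55, 0.25, 0.43

    frac = estimate_plant_fraction(fossil_hominid, herbivore, carnivore)
    print(f"Estimated plant fraction of diet: {frac:.0%}")  # prints ~60%

On these made-up numbers, the fossil would read as roughly 60% plant/40% animal; the real studies are of course more involved, but the direction of the inference is the same.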


Evidence well-known in scientific community; controversial only for vegetarians. The evidence--if it is even known to them--is controversial only to Hygienists and other vegetarian groups, few to none of whom, so far as I can discern, seem to have acquainted themselves sufficiently with the evolutionary picture other than to make a few armchair remarks. To anyone who really looks at the published evidence in the scientific books and peer-reviewed journals, and has a basic understanding of the mechanisms by which evolution works, there is really not a whole lot to be controversial about in the very strong evidence indicating flesh has been a part of the human diet for vast eons of evolutionary time. The real controversy in paleontology right now is whether the earliest forms of hominids were truly "hunters," or more opportunistic "scavengers" making off with pieces of kills brought down by other predators--not whether flesh food was a portion of our diet at all.[7]

Timeline of dietary shifts in the human line of evolution
Can you give us a timeline of dietary developments in the human line of evolution to show readers the overall picture from a bird's-eye view, so we can set a context for further discussion here?
Sure. We need to start at the beginning of the primate line long before apes and humans ever evolved, though, to make sure we cover all the bases, including the objections often made by vegetarians (and fruitarians for that matter) that those looking into prehistory simply haven't looked far enough back to find our "original" diet. Keep in mind some of these dates are approximate and subject to refinement as further scientific progress is made.

65,000,000 to 50,000,000 B.C.: The first primates, resembling today's mouse lemurs, bush-babies, and tarsiers, weighing in at 2 lbs. or less, and eating a largely insectivorous diet.[8]

50,000,000 to 30,000,000 B.C.: A gradual shift in diet for these primates, to mostly frugivorous in the middle of this period and then mostly herbivorous toward the end of it, but with considerable variance between specific primate species as to lesser items in the diet, such as insects, meat, and other plant foods.[9]

30,000,000 to 10,000,000 B.C.: Fairly stable persistence of the above dietary pattern.[10]

Approx. 10,000,000 to 7,000,000 B.C.: Last common primate ancestor of both humans and the modern ape family.[11]

Approx. 7,000,000 to 5,000,000 B.C.: After the end of the previous period, a fork occurs, branching into separate primate lines, including the one leading to humans.[12] The most recent DNA evidence shows that humans are closely related to both gorillas and chimpanzees, but most closely to the chimp.[13] Most paleoanthropologists believe that after this split, flesh foods began to assume a greater role on the human side of the primate family.[14]

Approx. 4,500,000 B.C.: First known hominid (proto-human) in the fossil record, Ardipithecus ramidus--literally translating as "root ape" for its position at the very base of the known hominid line--which may not yet have been fully bipedal (walking upright on two legs). Anatomy and dentition (teeth) are very suggestive of a form similar to that of modern chimpanzees.[15]

Approx. 3,700,000 B.C.: First fully upright bipedal hominid, Australopithecus afarensis (the genus name meaning "southern ape," after the initial australopithecine discoveries in southern Africa), about 4 feet tall, and known popularly from the famous "Lucy" skeleton.[16]


3,000,000 to 2,000,000 B.C.: The Australopithecus line diverges into sub-lines,[17] one of which will eventually give rise to Homo sapiens (modern man). It appears that the environmental impetus for this "adaptive radiation" into different species was a changing global climate between 2.5 and 2 million years ago, driven by glaciation in the polar regions.[18] The climatic repercussions in Africa resulted in a breakup of the formerly extensively forested habitat into a "mosaic" of forest interspersed with savanna (grassland). This put stress on many species to adapt to differing conditions and availability of foodstuffs.[19] The different Australopithecus lineages thus ate somewhat differing diets, ranging from more herbivorous (high in vegetative plant matter) to more frugivorous (higher in soft and/or hard fruits than in other plant parts). There is still some debate as to which Australopithecus lineage modern humans ultimately descended from, but recent evidence based on strontium/calcium ratios in bone, plus teeth microwear studies, shows that whatever the lineage, some meat was eaten in addition to the plant foods and fruits which were the staples.[20]

2,300,000 to 1,500,000 B.C.: Appearance of the first "true humans" (signified by the genus Homo), known as Homo habilis ("handy man")--so named because of the appearance of stone tools and cultures at this time. These gatherer-hunters were between 4 and 5 feet in height, weighed between 40 and 100 pounds, and still retained tree-climbing adaptations (such as curved finger bones)[21] while subsisting on wild plant foods and scavenging and/or hunting meat. (The evidence for flesh consumption based on cut-marks on animal bones, as well as the use of hammerstones to smash them for the marrow inside, dates to this period.[22]) It is thought that they lived in small groups like modern hunter-gatherers, but that the social structure would have been more like that of chimpanzees.[23] The main controversy among paleoanthropologists about this time period is not whether Homo habilis consumed flesh (which is well established) but whether the flesh they consumed was primarily obtained by scavenging kills made by other predators or by hunting.[24] (The latter would indicate a more developed culture, the former a more primitive one.) While meat was becoming a more important part of the diet at this time, the diet of modern hunter-gatherers--with their considerably advanced tool set--has not been known to exceed 40% meat in tropical habitats* like those habilis evolved in, so we can safely assume that the meat in habilis' diet would have been substantially less than that.[25]

1,700,000 to 230,000 B.C.: Evolution of Homo habilis into the "erectines,"* a range of human species often collectively referred to as Homo erectus, after the most well-known variant. Similar in height to modern humans (5-6 feet) but stockier, with a smaller brain, the erectines increased hunting activity over habilis, so that meat in the diet assumed greater importance. Teeth microwear studies of erectus specimens have indicated harsh wear patterns typical of meat-eating animals like the hyena.[26] No text I have yet read ventures any sort of percentage figure for this time period, but it is commonly acknowledged that plants still made up the largest portion of the subsistence.* More typically human social structures made their appearance with the erectines as well.[27] The erectines were the first human ancestor to control and use fire. It is thought that perhaps because of this, but more importantly because of other converging factors--such as increased hunting and technological sophistication with tools--about 900,000 years ago, in response to another peak of glacial activity and global cooling (which broke up the tropical landscape further into an even patchier mosaic), the erectines were forced to adapt to an increasingly varied savanna/forest environment by being able to alternate opportunistically between vegetable and animal foods to survive, and/or to move around nomadically.[28] For whatever reasons, it was also around this time (dated to approx. 700,000 years ago) that a significant increase in large land animals occurred in Europe (elephants, hoofed animals, hippopotamuses, and predators of the big-cat family) as these animals spread from their African home. It is unlikely to have been an accident that the spread of the erectines to the European and Asian continents during and after this timeframe coincides with this increase in game, as they probably followed the herds.[29]


Because of the considerably harsher conditions and seasonal variation in food supply, hunting became more important for bridging the seasonal gaps, as did the ability to store nonperishable items such as nuts, bulbs, and tubers for the winter, when the edible plants withered in the autumn. All of these factors, along with clothing (and also perhaps fire), helped enable colonization of the less hospitable environment. There were also physical changes in response to the colder and darker areas that were inhabited, such as the development of lighter skin color that allowed the sun to penetrate the skin and produce vitamin D, as well as the adaptation of the fat layer and sweat glands to the new climate.*[30] Erectus finds from northern China 400,000 years ago have indicated an omnivorous diet of meats, wild fruit and berries (including hackberries), plus shoots and tubers, and various other animal foods such as birds and their eggs, insects, reptiles, rats, and large mammals.[31]

500,000 to 200,000 B.C.: Archaic Homo sapiens (our immediate predecessor) appears. These human species, of which there were a number of variants, did not last as long in evolutionary time as previous ones, apparently due simply to the increasingly rapid rate of evolution occurring in the human line at this time. Thus they represent a transitional period after the erectines, leading up to modern man, and the later forms are sometimes not treated separately from the earliest modern forms of true Homo sapiens.[32]

150,000 to 120,000 B.C.: Homo sapiens neanderthalensis--the Neanderthals--begin appearing in Europe, reaching their peak between 90,000 and 35,000 years ago before becoming extinct. It is now well accepted that the Neanderthals were an evolutionary offshoot that met an eventual dead-end (in other words, they were not our ancestors), and that more than likely, both modern Homo sapiens and Neanderthals were sister species descended from a prior common archaic sapiens ancestor.[33]

140,000 to 110,000 B.C.: First appearance of anatomically modern humans (Homo sapiens).[34] The last Ice Age also dates from this period, stretching from 115,000 to 10,000 years ago. Thus it was in this context, which included harsh and rapid climatic changes, that our most recent ancestors had to flexibly adapt their eating and subsistence.[35] (Climatic shifts necessitating adaptations were also experienced in tropical regions, though to a lesser degree.[36]) It may therefore be significant that fire, though discovered earlier, came into widespread use around this same time,[37] corresponding with the advent of modern human beings. Its use may in fact be a defining characteristic of modern humans[38] and their mode of subsistence. (I'll discuss the timescale of fire and cooking at more length later.)

130,000 to 120,000 B.C.: Some of the earliest evidence for seafoods (molluscs, primarily) in the diet of coastal dwellers appears at this time,[39] although in one isolated location discovered so far, there is evidence going back 300,000 years.[40] Common use of seafoods by coastal aborigines becomes evident about 35,000 years ago,[41] but widespread global use in the fossil record is not seen until around 20,000 years ago and since.[42] For the most part, seafoods should probably not be considered a major departure,* however, as the composition of fish, shellfish, and poultry more closely resembles that of the wild land-game animals many of these same ancestors ate than does any other source today, except for commercial game farms that attempt to mimic ancient meat.[43]

40,000 to 35,000 B.C.: The first "behaviorally modern" human beings--as seen in the sudden explosion of new forms of stone and bone tools, cave paintings and other artwork, plus elaborate burials and many other quintessentially modern human behaviors. The impetus or origin for this watershed event is still a mystery.[44]

40,000 B.C. to 10-8,000 B.C.: Last period prior to the advent of agriculture in which human beings universally subsisted by hunting and gathering (also known as the "Late Paleolithic"--or "Stone Age"--period). Paleolithic peoples did process some of their foods, but these were simple methods that would have been confined to pounding, grinding, scraping, roasting, and baking.[45]


35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50%* of the diet.[46]

25,000 to 15,000 B.C.: Coldest period of the last Ice Age, during which global temperatures averaged 14°F cooler than they do today[47] (with local variations as much as 59°F lower[48]), with an increasingly arid environment and much more difficult conditions of survival to which plants, animals, and humans all had to adapt.[49] The Eurasian steppes just before and during this time had a maximum annual summer temperature of only 59°F.[50] Humans in Europe and northern Asia, and later in North America, adapted by increasing their hunting of large mammals such as mammoths, horses, bison, and caribou, which flourished on the open grasslands, tundra, and steppes that spread during this period.[51] Storage of vegetable foods for consumption during the harsh winters was also practiced. Clothing methods were improved (including needles with eyes), and sturdier shelters developed--the most common being animal hides wrapped around wooden posts, some of which had sunken floors and hearths.[52] In the tropics, large areas became arid. (In South Africa, for instance, the vegetation consisted mostly of shrubs and grass, with few fruits.[53])

20,000 B.C. to 9,000 B.C.: Transitional period known as the "Mesolithic," during which the bow-and-arrow appeared,[54] and gazelle, antelope, and deer were being intensively hunted,[55] while at the same time precursor forms of wild plant and game management began to be more systematically practiced. At this time, wild grains--including wheat and barley by 17,000 B.C., before their domestication--were being gathered and ground into flour, as evidenced by the use of mortars and pestles in what is now modern-day Israel. By 13,000 B.C. the descendants of these peoples were harvesting wild grains intensively, and it was only a small step from there to the development of agriculture.[56] Game management through the burning-off of land to encourage grasslands and the increase of herds became widely practiced during this time as well. In North America, for instance, the western high plains are the only area of the current United States that did not see intensive changes to the land through extensive use of fire.[57] Also during this time, and probably also for some millennia prior to the Mesolithic (perhaps as early as 45,000 B.C.), ritual and magico-religious sanctions protecting certain wild plants developed, initiating a new symbiotic relationship between people and their food sources that became encoded culturally and constituted the first phase of domestication, well prior to actual cultivation. Protections were accorded to certain wild food species (yams being a well-known example) to prevent disruption of their life cycle at periods critical to their growth, so that they could be profitably harvested later.[58] Digging sticks for yams have also been found dating to at least 40,000 B.C.,[59] so these tubers considerably antedated the use of grains in the diet. Foods known to be gathered during the Mesolithic period in the Middle East were root vegetables, wild pulses (peas, beans, etc.), nuts such as almonds, pistachios, and hazelnuts, as well as fruits such as apples. Seafoods such as fish, crabs, molluscs, and snails also became common during this time.[60]

Approx. 10,000 B.C.: The beginning of the "Neolithic" period, or "Agricultural Revolution," i.e., farming and animal husbandry. The transition to agriculture was made necessary by gradually increasing population pressures due to the success of Homo sapiens' prior hunting and gathering way of life. (Hunting and gathering can support perhaps one person per 10 square miles; Neolithic agriculture 100 times or more that many.[61]) Also, at about the time population pressures were increasing, the last Ice Age ended, and many species of large game became extinct (probably due to a combination of both intensive hunting and disappearance of their habitats when the Ice Age ended).[62] Wild grasses and cereals began flourishing,* making them prime candidates for the staple foods to be domesticated, given our previous familiarity with them.[63] By 9,000 B.C. sheep and goats were being domesticated in the Near East, and cattle and pigs shortly after; wheat, barley, and legumes were being cultivated somewhat before 7,000 B.C., as were fruits and nuts, while meat consumption fell enormously.[64] By 5,000 B.C. agriculture had spread to all inhabited continents except Australia.[65] During the time since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased, from an average of probably 65%/35%* during Paleolithic times[66] to as high as 90%/10% since the advent of agriculture.[67]
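
As a quick check on the population-density arithmetic above, here is a minimal sketch in Python; the territory size is hypothetical, and the densities are simply the rough figures cited in the text:

    # Rough check of the carrying-capacity comparison (illustrative only).
    # Hunting-gathering: ~1 person per 10 square miles; early agriculture:
    # ~100x that density, per the figures cited above.

    territory_sq_miles = 10_000             # hypothetical region

    forager_density = 1 / 10                # persons per square mile
    farmer_density = forager_density * 100  # persons per square mile

    print(f"Foragers supported: {territory_sq_miles * forager_density:,.0f}")  # 1,000
    print(f"Farmers supported:  {territory_sq_miles * farmer_density:,.0f}")   # 100,000

The hundredfold difference is what underlies the population-pressure argument for the transition described in the text.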

Remains of fossil humans indicate decrease in health status after the Neolithic. In most respects, the changes in diet from hunter-gatherer times to agricultural times have been almost all detrimental, although there is some evidence we'll discuss later indicating that at least some genetic adaptation to the Neolithic has begun taking place in the approximately 10,000 years since it began. With the much heavier reliance on starchy foods that became the staples of the diet, tooth decay, malnutrition, and rates of infectious disease increased dramatically over Paleolithic times, further exacerbated by crowding leading to even higher rates of communicable infections. Skeletal remains show that height decreased by four inches* from the Late Paleolithic to the early Neolithic, brought about by poorer nutrition, and perhaps also by increased infectious disease causing growth stress, and possibly by some inbreeding in isolated communities. Signs of osteoporosis and anemia--the latter almost non-existent in pre-Neolithic times--have frequently been noted in skeletal pathologies observed in the Neolithic peoples of the Middle East. It is known that certain kinds of osteoporosis found in these skeletal remains are caused by anemia, and although the causes have not yet been determined exactly, the primary suspect is reduced levels of iron, thought to have been caused by the stress of infectious disease rather than dietary deficiency, though the latter remains a possibility.[68]

Subjectively based vegetarian naturalism vs. what evolution tells us

So have Hygienists really overlooked all the [evolutionary] evidence you've compiled in the above timeline about the omnivorous diet of humans throughout prehistory? Are you serious?
It was a puzzle to me when I first stumbled onto it myself. Why hadn't I been told about all this? I had thought in my readings in the Hygienic literature that when the writers referred to our "original diet" or our "natural diet," that must mean what I assumed they meant: that not only was it based on comparative anatomy, but also on what we actually ate during the time the species evolved. And further, that they were at least familiar with the scientific evidence even if they chose to keep things simple and not talk about it themselves. But when I did run across and chase down a scientific reference or two that prominent Hygienists had at long last bothered to mention, I found to my dismay they had distorted the actual evidence or left out crucial pieces.

Could you name a name or two here and give an example so people will know the kind of thing you are talking about?
Sure, as long as we do it with the understanding that I am not attempting to vilify anybody, and that we all make mistakes. The most recent one I'm familiar with is Victoria Bidwell's citation (in her Health Seeker's Yearbook[69]) of a 1979 science report from the New York Times,[70] where she summarizes anthropologist Alan Walker's microwear studies of fossil teeth in an attempt to show that humans were originally exclusively fruit-eaters. Bidwell paraphrases the report she cited as saying that "humans were once exclusively fruit eaters... eaters of nothing but fruit." And also that, "Dr. Walker and other researchers are absolutely certain that our ancestors, up to a point in relatively recent history, were fruitarians/vegetarians."[71] But a perusal of the actual article being cited reveals the following:

• Inappropriate absolutistic interpretation. The diet was said to be "chiefly" fruit, which was the "staple," and the teeth studied were those of "fruit-eater[s]," but the article is not absolutist in the way Bidwell painted it.


Meaning of "fruit" and "frugivore" misinterpreted. Fruit as defined by Walker in the
article included tougher, less sugary foods, such as acacia tree pods. (Which laypeople like ourselves would be likely to classify as a "vegetable"-type food in common parlance.) And although it was not clarified in the article, anyone familiar with or conscientious enough to look a little further into evolutionary studies of diet would have been aware that scientists generally use the terms "frugivore," "folivore," "carnivore," "herbivore," etc., as categories comparing broad dietary trends, only very rarely as exclusivist terms, and among primates exclusivity in food is definitely not the norm.

• Time period was at the very earliest cusp of the evolution of australopithecines into Homo habilis. The primates/hominids in the study were Australopithecus and Homo habilis--among the earliest in the hominid line--hardly "relatively recent history" in this context.

• Study was preliminary and equivocal. The studies were preliminary, and Walker was cautious, saying he didn't "want to make too much of this yet"--and his caution proved to be well warranted. I believe there was enough research material available by the late 1980s (Health Seeker's Yearbook was published in 1990) that, had checking been done, it would have been found that while Walker was largely right about australopithecine species being primarily frugivores (using a very broad definition of "fruit"), later research like what we outlined in our timeline above has shown Australopithecus also included small amounts of flesh, seeds, and vegetable foods, and that all subsequent species beginning with Homo habilis have included significant amounts of meat in their diet, even if the diet of habilis probably was still mostly fruits plus veggies.

There is more that I could nitpick, but that's probably enough. I imagine Victoria was simply very excited to see scientific mention of frugivorism in the past, and just got carried away in her enthusiasm. There's at least one or two similar distortions by others in the vegetarian community that one could cite (Viktoras Kulvinskas' 1975 book Survival into the 21st Century,[72] for instance, contains inaccuracies about ape diet and "fruitarianism") so I don't want to pick on her too much because I would imagine we've all done that at times. It may be understandable when you are unfamiliar with the research, but it points out the need to be careful.

Subjective naturalism vs. the functional definition provided by evolution/genetics
Overall, then, what I have been left with--in the absence of any serious research into the evolutionary past by Hygienists--is the unavoidable conclusion that Hygienists simply assume it ought to be intuitively obvious that the original diet of humans was totally vegetarian and totally raw. (Hygienists often seem impatient with scientists who can't "see" this, and may creatively embellish their research to make a point. Research that is discovered by Hygienists sometimes seems to be used in highly selective fashion, only as a convenient afterthought to justify conclusions that have already been assumed beforehand.) I too thought for years it was obvious, not realizing science had already found otherwise.

The subjective "animal model" for raw-food naturalism. The argument made is very similar to
the "comparative anatomy" argument: Look at the rest of the animals, and especially look at the ones we are most similar to, the apes.* They are vegetarians [this is now known to be false for chimps and gorillas and almost all the other great apes--which is something we'll get to shortly], and none of them cook their food. Animals who eat meat have large canines, rough rasping tongues, sharp claws, and short digestive tracts to eliminate the poisons in the meat before it putrefies, etc.

The trap of reactionary "reverse anthropomorphism." In other words, it is a view based on a philosophy of "naturalism," but without really defining too closely what that naturalism is. The Hygienic view of naturalism, then, simplistically looks to the rest of the animal kingdom as its model for that naturalism by way of analogy. This is good as a device to get us to look at ourselves more objectively from "outside" ourselves, but when you take it too far, it completely ignores that we are unique in some ways, and you cannot simply assume it or figure it all out by way of analogy only. It can become reverse anthropomorphism. (Anthropomorphism is the psychological tendency to unconsciously make human behavior the standard for comparison, or to project human characteristics and motivations onto the things we observe. Reverse anthropomorphism in this case would be saying humans should take specific behaviors of other animals as our own model where food is concerned.)

Subjective views of dietary naturalism are prone to considerable differences of opinion; don't offer meaningful scientific evidence. When you really get down to nuts and bolts about defining what you subjectively think is "natural," however, you find people don't so easily agree about all the particulars. The problem with the Hygienic definition of naturalism--what we could call "the animal model for humans"--is that it is mostly a subjective comparison. (And quite obviously so after you have had a chance to digest the evolutionary picture, like what I presented above. Those who maintain that the only "natural" food for us is that which we can catch or process with our bare hands are grossly in error by any realistic evolutionary definition of what is natural, since stone tools for obtaining animals and cutting the flesh have been with us almost 2 million years now.) Not that there isn't value in doing this, and not that there may not be large grains of truth to it; but since it is in large part subjectively behavioral, there is no real way to test it fairly (which is required for a theory to be scientific), which means you can never be sure elements of it may not be false. You either agree to it or you don't--you either agree to the "animal analogy" for raw-food eating and vegetarianism, or you have reservations about it--but you are not offering scientific evidence.

Evolutionary adaptation/genetics as the functional scientific test of what's natural. So my view became: why don't we just look into the evolutionary picture as the best way to go straight to the source and find out what humans "originally" ate? Why fool around philosophizing and theorizing about it when, thanks to paleoanthropologists, we can now just go back and look? If we really want to resolve the dispute of what is natural for human beings, what better way than to actually go back and look at what we actually did in prehistory, before we supposedly became corrupted by reason to go against our instincts? Why aren't we even looking? Are we afraid of what we might see? These questions have driven much of my research into all this. If we are going to be true dietary naturalists--eat "food of our biological adaptation," as the phrase goes--then it is paramount that we have a functional or testable way of defining what we are biologically adapted to. This is something that evolutionary science easily and straightforwardly defines: what is "natural" is simply what we are adapted to by evolution, and a central axiom of evolution is that what we are adapted to is the behavior our species engaged in over a long enough period of evolutionary time for it to have become selected for in the species' collective gene pool. This puts the question of natural behavior on a squarely concrete basis backed by science, and it eliminates the dilemma of trying to determine what natural behavior is by resorting solely to subjective comparisons with other animals, as Hygienists often do.

Correcting the vegetarian myths about ape diets
You mentioned the "comparative anatomy" argument that Natural Hygienists look to for justification instead of evolution. Let's look at that a little more. Are you saying it is fundamentally wrong?
No, not as a general line of reasoning in saying that we are similar to apes, so our diets should be similar. It's a good argument--as far as it goes. But for the logic to be valid in making inferences about the human diet based on ape diet, it must be based on accurate observations of the actual food intake of apes. Idealists such as we Hygienists don't often appreciate just how difficult it is to make these observations, and to do it thoroughly enough to be able to claim you have really seen everything the apes are doing, or are capable of doing. You have to go clear back to field observations in the 1960s and earlier to support the contention that apes are vegetarians. That doesn't wash nowadays with the far more detailed field observations and studies of the '70s, '80s, and '90s. Chimp and gorilla behavior is diverse, and it is difficult to observe and draw reliable conclusions without spending many months and/or years of observation. And as the studies of Jane Goodall and others since have repeatedly shown, the early studies were simply not extensive enough to be reliable.[73]

Citing of outdated science an earmark of idealism out of touch with reality. Science is a process of repeated observation and progressively better approximations of the "real world," whatever that is. It is critical, then, that we look at recent evidence, which has elaborated on, refined, and extended earlier work. When you see anybody--such as apologists for "comparative anatomy" vegetarian idealism (or indeed anyone arguing any topic this way)--harking back to outdated science that has since been eclipsed in order to bolster their views, you should immediately suspect something.

Accumulation of modern post-1960s research shows apes are not actually vegetarians. The main problem with the comparative anatomy argument, then--at least when used to support vegetarianism--is that scientists now know that apes are not vegetarians after all, as was once thought. The comparative anatomy argument actually argues for at least modest amounts of animal flesh in the diet, based on the now much-more-complete observations of chimpanzees, our closest animal relatives, with whom we share somewhere around 98 to 98.6% of our genes.[74] (We'll also look briefly at the diets of other apes, but the chimpanzee data will be focused on here, since it has the most relevance for humans.)

Diet of chimpanzees. Though the chimp research is rarely oriented to the specific types of numerical percentage figures we Hygienists would want to see, from what I have seen it would probably be fair to estimate that most populations of chimpanzees are getting somewhere in the neighborhood of 5%* of their diet on average (as a baseline) to perhaps 8-10%* as a high, depending on the season, as animal food--which in their case includes bird's eggs and insects in addition to flesh--particularly insects, which are much more heavily consumed than is flesh.[75]

• Meat consumption by chimps. There is considerable variation across different chimp populations in flesh consumption, which also fluctuates up and down considerably within populations on a seasonal basis. (And behavior sometimes differs as well: Chimps in the Tai population, in 26 of 28 mammal kills, were observed to break open the bones with their teeth and use tools to extract the marrow for consumption,[76] reminiscent of early Homo habilis.) One population has been observed to eat as much as 4 oz. of flesh per day during the peak hunting season, dwindling to virtually nothing much of the rest of the time, but researchers note that when it is available, it is highly anticipated and prized.[77] It's hard to say exactly, but a reasonable estimate might be that on average flesh may account for about 1-3% of the chimp diet.[78]

• The more significant role of social-insect/termite/ant consumption. Now of course, meat consumption among chimps is what gets the headlines these days,[79] but the bulk of chimpanzees' animal food consumption actually comes in the form of social insects[80] (termites, ants, and bees), which constitute a much higher payoff for the labor invested to obtain them[81] than catching the colobus monkeys that are often the featured flesh item for chimps. However, insect consumption has often been virtually ignored,[82] since it constitutes a severe blind spot for the Western world due to our cultural aversions and biases about it. And by no means is insect consumption an isolated occurrence among just some chimp populations. With very few exceptions, termites and/or ants are eaten about half the days out of a year on average, and during peak seasons are an almost daily item, constituting a significant staple food in the diet (in terms of regularity), the remains of which show up in a minimum of approximately 25% of all chimpanzee stool samples.[83]

• Breakdown of chimpanzee food intake by dietary category. Again, while chimp researchers normally don't classify food intake by the types of volume or caloric percentages that we Hygienists would prefer to see it broken down into for comparison purposes (the rigors of observing these creatures in the wild make it difficult), what they do record is illustrative. A chart for the chimps of Lope in Gabon, classified by numbers of different species of food eaten (caveat: this does not equate to volume), shows the fruit species eaten comprising approx. 68% of the total range of species eaten in their diets, leaves 11%, seeds 7%, flowers 2%, bark 1%, pith 2%, insects 6%, and mammals 2%.[84] A breakdown by feeding time for the chimps of Gombe showed their intake of foods to be (very roughly) 60% of feeding time for fruit and 20% for leaves, with the other items in the diet varying greatly on a seasonal basis depending on availability. Seasonal highs could range as high as (approx.) 17% of feeding time for blossoms, 22-30% for seeds, 10-17% for insects, and 2-6% for meat, with other miscellaneous items coming in at perhaps 4% through most months of the year.[85] Miscellaneous items eaten by chimps include a few eggs,[86] plus the rare honey that chimps are known to rob from beehives (as well as the embedded bees themselves), which is perhaps the most highly prized single item in their diet,[87] but which they are limited from eating much of by circumstances. Soil is also occasionally eaten--presumably for the mineral content, according to researchers.[88]

• Fluid intake in chimps not restricted to fruit, and includes water separately. For those who suppose that drinking is unnatural and that we should be able to get all the fluid we need from "high-water-content" foods, I have some more unfortunate news: chimps drink water too. Even the largely frugivorous chimp may stop 2-3 times per day during the dry season to stoop and drink water directly from a stream (but perhaps not at all on some days during the wet season), or from hollows in trees, using a leaf sponge if the water cannot be reached with their lips.[89] (Or maybe that should be good news: If you've been feeling guilty or substandard for having to drink water in the summer months, you can now rest easy knowing your chimp brothers and sisters are no different!)

• The predilection of chimpanzees toward omnivorous opportunism. An important observation that cannot be overlooked is the wide-ranging omnivorousness of, and the predilection for tremendous variety in, chimpanzees' diet, which can include up to 184 species* of foods, 40-60 of which may comprise the diet in any given month, with 13 different foods per day being one average calculated.[90] Thus, even given the largely frugivorous component of their diets, it would be erroneous to infer from that (as many Hygienists may prefer to believe) that the 5% to possibly 8% or so of their diet that is animal foods (not to mention other foods) is insignificant, or could be thrown out or disregarded without consequence--the extreme variety in their diet being one of its defining features. Over millions of years of evolution, the wheels grind exceedingly fine, and everything comes out in the wash. Remember that health is dependent on getting not just the right amounts of macronutrients such as carbohydrates, fats, and proteins, but also critical amounts of trace minerals and vitamins, for instance. We require, and are evolutionarily adapted to, the behavior that is natural to us. Where chimps are concerned, 5% or 8% animal food--whatever it actually is--is a modest but significant amount, and not something you can just say is incidental or could be thrown out without materially changing the facts.

Other ape diets. In order of how closely related the other apes are to humans, the gorilla is next after the chimpanzee, then the orangutan, then the gibbon, in decreasing order.[91] I'll just briefly summarize a few basic facts about the other apes here, concentrating primarily on the gorilla.

• Diet of gorillas compared with chimps. Interestingly, while the gorilla has often been cited as a model in the modern mythology of "fruitarianism,"[92] on average it is actually the least frugivorous of the apes. Highland gorillas (where less fruit is available in their higher-altitude mountainous habitat) have become primarily folivorous (leaf/vegetative-eaters), while the lowland gorilla is more of a hybrid folivore/frugivore.[93] I might mention in this regard that there is some suggestion chimps seem not to prefer extra-high roughage volumes, at least compared to the gorilla. Certainly they do not seem to be able to physiologically tolerate as much cellulose from vegetative matter in their diet.*[94] Gorillas can tolerate higher amounts of folivorous matter, due apparently to their more varied and extensive intestinal flora and fauna.[95] Chimps, however, are known to "wadge" some of their foods, a form of juicing that has the effect of reducing their fiber intake.[96] Wadging means they make a wad of leaves mixed in with the primary food item (such as a fruit) as a mass; this mass is then used as a "press" against their teeth and palate to literally "juice" the main food, which they may suck on for up to 10 minutes before discarding the wadge of fiber after all the juice has been extracted. Wadging may also serve as a way to avoid biting into the potentially toxic seeds of certain fruits, from which they can then still extract the juices safely, or as a way to handle very soft items such as pulpy or overripe fruits, as well as eggs and meat.[97] Such behavior ought to debunk the prevalent Hygienic/raw-foods myth that it is always more natural to eat "whole" rather than fragmented foods. This is not necessarily true, and again, such a view is based on subjective definitions out of touch with the real world. Another example here is that chimps (and gorillas as well) eat a fair amount of "pith" in their diet--meaning the insides of stems of various plants--which they first have to process by peeling off the tough outer covering before the pith inside is either eaten or wadged.[98]

• Other apes less closely related to humans. All the apes, with the exception of the gorilla, are primarily frugivorous, but they do eat some animal products as well, though generally less than the chimp--although lowland gorillas eat insects at a comparable rate to chimps. In decreasing order of animal food consumption in the diet, the orang comes first after the chimp, then the bonobo chimp, the gibbon, the lowland gorilla, and the highland gorilla--the latter eating animal foods (as insects) only incidentally, in or on the plants eaten. Again, remember, animal food consumption here does not equate solely with flesh consumption, as flesh is less prominent than insects in ape diets. The chimp and bonobo chimp are the only ones to eat flesh (other than a rare occurrence of an orang who was observed doing so once). All the apes other than the highland gorilla eat at least some social insects, with the chimp, bonobo chimp, and orang also partaking of bird's eggs.[99]

Update on fat consumption in preagricultural and hunter-gatherer diets
*** "...there was much talk [in The Paleolithic Prescription] about the kind of lean game animals our ancestors in Paleolithic times (40,000 years ago) ate as an aspect of their otherwise high-plant-food diet..."
As one of the few initial published summaries of the modern Paleodiet evidence, The Paleolithic Prescription may have been somewhat conservative in its description of the levels of fat believed to have prevailed during the Paleolithic. However, the picture at this point is still being fleshed out and undergoing debate. Of perhaps most relevance, though, is not so much the absolute level of fat, but rather what the fat-intake profile would have been in terms of saturated vs. polyunsaturated and monounsaturated fats.

Organ meats favored in preference to muscle meats in hunter-gatherer diets. Observations of modern hunter-gatherers have shown that muscle meats (the leanest part of the animal) are least preferred, sometimes even being thrown away in times of plenty, in favor of the fattier portions. Eaten first are organs such as brains, eyeballs, tongue, kidneys, and bone marrow (high in monounsaturated fat), and storage fat areas such as mesenteric (gut) fat. (Even this gut fat is much less saturated in composition, however, than the kind of marbled fat found in the muscle meat of modern feedlot animals.) There is no reason to believe earlier hunter-gatherers would have been any different in these preferences, since other species of animals who eat other animals for food also follow the same general order of consumption.

Type of fat may be more important than amount of fat. As a related point, while it is likely to be controversial for some time to come, an increasing amount of recent research on fats in the diet suggests there may actually be little if any correlation between the overall level of fat consumption and heart disease, atherosclerosis, cancer, etc., contrary to previous conclusions. Instead, the evidence seems to point more specifically to the trans fats (hydrogenated vegetable oils used to emulsify and preserve foods) which permeate much of the modern food supply in supermarkets, and to highly saturated fats (including dairy products and probably saturated commercial meats, but not lean game meats), along with, perhaps, a high consumption of starches and refined carbohydrates driving the hyperinsulinism syndrome. (See the postscript to Part 2 for a brief discussion of the recent findings on hyperinsulinism.)

Cardiovascular disease and cancer are rare in hunter-gatherers, despite their fat/protein intake. Given that atherosclerosis, heart disease, and cancer are almost non-existent even in longer-lived members of hunter-gatherer tribes eating large amounts of meat, and whose diets much better approximate what the evolutionary diet is thought to have been like than do the diets of modern cultures, there is good reason to believe the earlier mainstream research on fats was faulty. (For details on health and disease in hunter-gatherers, see Hunter-Gatherers: Examples of Healthy Omnivores, on this site. The next page of Part 1 here will also discuss corrected anthropological survey data showing the average level of meat consumption of hunter-gatherers to be in excess of 50% of the diet.) Review papers and publications by fat and cholesterol researcher Mary Enig, Ph.D., and by Russell Smith and others, have revealed a pattern of misinterpretation and misrepresentation of the results of many earlier studies on dietary fat and cholesterol. (Enig recommends Smith's comprehensive Diet, Blood Cholesterol and Coronary Heart Disease: A Critical Review of the Literature [Vector Enterprises, Santa Monica, CA (1988)].)

Co-evolution of increased human brain size with decreased size of digestive system

Data points to increasing dependence on denser foods, processed by a less energy-intensive gut to free up energy for the evolving brain.
Also left completely out of Part 1 of the interview above, due to my only passing familiarity with the evidence at the time, are recent findings pointing to a correlation between increasing levels of animal flesh in the diet over the eons and the near-tripling in size the human brain was undergoing during the same period--from 375-550cc at the time of Australopithecus, to 500-800cc in Homo habilis, 775-1225cc in Homo erectus, and 1350cc in modern humans (Homo sapiens).
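
For readers who want to see the arithmetic behind "near-tripling," here is a minimal sketch in Python using the midpoints of the cranial-capacity ranges just quoted (the midpoint choice is an assumption made for illustration):

    # Arithmetic behind the "near-tripling" claim, using the cc ranges above.
    cranial_cc = {
        "Australopithecus": (375, 550),
        "Homo habilis":     (500, 800),
        "Homo erectus":     (775, 1225),
        "Homo sapiens":     (1350, 1350),
    }

    base = sum(cranial_cc["Australopithecus"]) / 2   # midpoint, ~462 cc
    for species, (lo, hi) in cranial_cc.items():
        mid = (lo + hi) / 2
        print(f"{species:<18} ~{mid:>5.0f} cc   x{mid / base:.1f} vs. Australopithecus")
    # Homo sapiens comes out at roughly x2.9 -- hence "near-tripling."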

Sufficient amounts of long-chain fatty acids essential to support brain growth. While the specific evolutionary factor(s) that drove the increase in human brain size are still being speculated about, one recent paper suggests that--whatever the causes--the evolutionary increase in brain size could not have been supported physiologically without an increased intake of preformed long-chain fatty acids, which are an essential component in the formation of brain tissue. [Crawford 1992]

Animal prey likeliest source for required amounts of long-chain fatty acids during human brain evolution. Lack of sufficient intake of long-chain fatty acids in the diet would therefore be a
limiting factor on brain growth, and these fatty acids are much richer in animal foods than in plant foods. (Relative brain-size development in herbivorous mammals was apparently limited by the amount of these fatty acids available to them in plant foods.) Given the foods available in humanity's habitat during evolution, the


necessary level of long-chain fatty acids to support the increasing size of the human brain would therefore presumably only have been available through increased intake of flesh.

Human brain size since the late Paleolithic has decreased in tandem with decreasing contribution of animal food to diet. In addition, a recent analysis updating the picture of
encephalization (relative brain size) changes in humans during our evolutionary history has revealed that human cranial capacity has decreased by 11% in the last 35,000 years, the bulk of it (8%) in the last 10,000 [Ruff, Trinkaus, and Holliday 1997]. Eaton [1998] notes that this correlates well with decreasing amounts of animal food in the human diet during this timeframe. (Of particular relevance here is that most of this decrease in animal foods correlates with the dawn of agriculture 10,000 years ago.)
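To make these magnitudes concrete, here is a small arithmetic sketch using only the figures quoted above. The midpoints of the quoted cranial-capacity ranges are used purely for illustration, and the final line is an implication of the 11% figure, not a separately sourced number:

# Arithmetic sketch of the cranial-capacity trajectory quoted above.
# Midpoints of the quoted ranges are used purely for illustration.
capacities_cc = {
    "Australopithecus": (375 + 550) / 2,   # ~463 cc
    "Homo habilis": (500 + 800) / 2,       # ~650 cc
    "Homo erectus": (775 + 1225) / 2,      # ~1000 cc
    "Homo sapiens (modern)": 1350.0,
}

baseline = capacities_cc["Australopithecus"]
for species, cc in capacities_cc.items():
    print(f"{species:>22}: {cc:6.0f} cc ({cc / baseline:.1f}x Australopithecus)")

# The 11% decrease over the last 35,000 years [Ruff, Trinkaus, and
# Holliday 1997] implies, on these numbers, a late-Paleolithic peak
# of roughly 1350 / (1 - 0.11), i.e., about 1500 cc.
print(f"implied peak 35,000 years ago: ~{1350 / (1 - 0.11):.0f} cc")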

The central role of DHA in brain growth. Eaton [1998] also notes the obvious hypothesis here would be that shortfalls in the preformed long-chain fatty acids important to brain development are logical candidates as the potentially responsible factors--most particularly docosahexaenoic acid (DHA), the long-chain fatty acid most abundant in brain tissue, as well as docosatetraenoic acid (DTA) and arachidonic acid (AA). (The human body can synthesize these from their 18-carbon precursors, linoleic acid (LA) and alpha-linolenic acid (ALA)--obtainable from plant foods--but the rate of synthesis does not match the amounts that can be gotten directly from animal foods. Additionally, an excessive amount of LA compared to ALA, which is likely when plant foods predominate in the diet, inhibits the body's ability to synthesize DHA endogenously, compounding the problem.) This evidence of decreasing brain size in the last 35,000 years, and particularly the last 10,000, represents important potentially corroborative evidence for the continuing role of animal foods in human brain development, since dietary changes in this most recent period of human prehistory can be estimated with more precision than dietary composition earlier in human evolution. While it should be clearly noted that correlation alone is not causation, at the same time it should be acknowledged that there seem to be no other worthy hypotheses as yet to explain the dietary basis that could have supported the dramatic increase in brain size during human evolution.

Recent tuber-based hypothesis for evolutionary brain expansion fails to address key issues such as DHA and the recent fossil record. As a case in point, there has been one tentative alternative
hypothesis put forward recently by primatologist Richard Wrangham et al. [1999], suggesting that perhaps cooked tubers (primarily a starch-based food) provided the additional calories/energy that supported brain expansion during human evolution. However, this idea suffers from some serious, apparently fatal flaws, in that the paper failed to mention or address critical pieces of evidence regarding brain expansion that contradict the thesis. For instance, it overlooks the crucial issue of DHA and/or DHA-substrate adequacy just discussed above, which is central to brain development and perhaps the most gaping of the holes. It is further contradicted by the evidence of an 8% decrease in human brain size during the last 10,000 years, despite massive increases in starch consumption since the Neolithic revolution, which began at about that time. (Whether the starch is from grain or tubers does not essentially matter in this context.) Meat consumption--and therefore presumed DHA consumption--on the other hand, tracks relatively well with brain size in both directions over human evolution: not simply with the observed increases, but with the Neolithic-era decrease as well. [Eaton 1998] These holes, among others in the hypothesis, will undoubtedly be drawing comment from paleo researchers in future papers, and hopefully there will be a writeup on Beyond Veg as more is published in the peer-reviewed journals in response to the idea. At this point, however, it does not appear to be a serious contender in plausibly accounting for all the known evidence.

Co-evolution of increased brain size with concurrent reduction in size of the human gut.
Recent work is showing that the brain (20-25% of the human metabolic budget) and the intestinal system are both so metabolically energy-expensive that in mammals generally (and this holds particularly in


primates), an increase in the size of one comes at the expense of the size of the other in order not to exceed the organism's limited "energy budget" that is dictated by its basal metabolic rate. The suggestion here is not that the shrinkage in gut size caused the increase in brain size, but rather that it was a necessary accompaniment. In other words, gut size is a constraining factor on potential brain size, and vice versa. [Aiello and Wheeler 1995]
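A rough back-of-the-envelope rendering of this trade-off may help. The per-kilogram running costs below are assumptions chosen for illustration (on the order of figures reported for metabolically expensive tissues), not measurements from the paper; the point is simply that because brain and gut tissue cost roughly the same per gram to run, mass shed from the gut frees approximately the energy an equivalent mass of additional brain requires:

# Sketch of the brain/gut "energy budget" trade-off described above.
# The mass-specific metabolic rates are assumed illustrative values.
BRAIN_W_PER_KG = 11.2  # assumed running cost of brain tissue (watts/kg)
GUT_W_PER_KG = 12.2    # assumed running cost of gut tissue (watts/kg)

def extra_brain_supported(gut_reduction_kg):
    """Kg of additional brain tissue whose running cost is covered by
    the energy freed when the gut shrinks, holding total basal
    metabolic rate constant."""
    return gut_reduction_kg * GUT_W_PER_KG / BRAIN_W_PER_KG

# E.g., a hypothetical 0.9 kg reduction in gut mass frees enough
# energy to run roughly 1 kg of additional brain in the same budget.
print(f"{extra_brain_supported(0.9):.2f} kg of extra brain supported")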

Human gut has evolved to be more dependent on nutrient- and energy-dense foods than other primates. The relationship of all this to animal flesh intake is that compared to the other primates,
the design of the more compact human gut is less efficient at extracting sufficient energy and nutrition from fibrous foods and considerably more dependent on nutrient-dense, highly bioavailable foods, which require less energy for their digestion per unit of energy/nutrition released. Again, while it is not clear that the increasing levels of animal flesh in the human diet were a directly causative factor in the growth of the evolving human brain, their absence would have been a limiting factor regardless, one without which the change likely could not have occurred. Other supporting data suggest a pattern among animals generally whereby those with larger brain-to-body-size ratios are carnivores and omnivores, with smaller, less complex guts, dependent on diets of denser nutrients and higher bioavailability. Vegetarian philosophy has traditionally relied on the observation that the human ratio of intestinal length to body-trunk length parallels that of the other primates as an indication the human diet should also parallel their more frugivorous/vegetarian diet. However, this observation rests on the oversimplification that gut length is the relevant factor, when in fact cell types and intestinal surface area are the more important operative factors, the latter of which can vary greatly depending on the density of villi lining the intestinal walls. In these respects, the human gut shares characteristics common to both omnivores and carnivores. [McArdle 1996, p. 174] Also, intestinal length does not necessarily accurately predict total gut mass (i.e., weight), which is the operative criterion where brain-size/gut-size relationships are at issue. The human pattern of an overall smaller gut with a proportionately longer small intestine dedicated more to absorptive functions, combined with a simple stomach, fits the same pattern seen in carnivores. [Aiello and Wheeler 1995, p. 206]

Corrected anthropological survey data shows meat averages over 50% of hunter-gatherer diets
*** "No text I have yet read ventures any sort of percentage figure from this time period [the historical period of Homo erectus' existence], but it is commonly acknowledged that plants still made up the largest portion of the subsistence."
The most significant correction to Part 1 of the interview series here involves newer Paleodiet analysis of the amount of plant vs. animal food in modern hunter-gatherer diets, which--in conjunction with optimal foraging theory and knowledge of ancient habitats--can be used as a basis for extrapolating backward to estimate what the diets of our hominid ancestors may have been. The widely quoted figures of Richard Lee (in Man the Hunter, 1968), stating an average of 65% plant and 35% animal food for modern hunter-gatherers, have, upon review by other researchers, been discovered to have been somewhat flawed. (Lee's 65%/35% ratio was in turn repeated in the research sources I used for this interview, but I had not traced the figures to their ultimate source at the time.)

Previous meta-analysis of Ethnographic Atlas survey data on hunter-gatherer diets was performed incorrectly. As noted by Ember [1978], it turns out that in calculating his averages, Lee
somewhat arbitrarily threw out a portion of the North American hunter-gatherer cases (who often had higher rates of meat consumption); plus he classified shellfishing as a "gathering" activity (a category normally used for plant-food gathering). Taken together, these skewed the resulting average considerably. Reanalysis correcting for Lee's analytical errors shows a more likely average of somewhere between 50-65% meat consumption in modern hunter-gatherer diets. (For a brief enumeration of the reanalysis that researcher Loren Cordain's group has done, see the first section or two of Metabolic Evidence of Human Adaptation to Increased Carnivory.)
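To see mechanically how the two analytical choices just described pull an average downward, consider this toy recalculation. All the numbers in it are invented for illustration (they are not Ethnographic Atlas figures); only the direction of the bias matters:

# Toy illustration (hypothetical numbers, not Ethnographic Atlas data)
# of how dropping high-meat North American cases and reclassifying
# shellfish as plant "gathering" biases the average meat figure down.
societies = [
    # (meat % incl. shellfish, shellfish %, high-meat North American?)
    (80, 0, True),
    (70, 0, True),
    (55, 10, False),
    (50, 15, False),
    (45, 5, False),
]

def mean(xs):
    return sum(xs) / len(xs)

corrected = mean([meat for meat, _, _ in societies])
lee_style = mean([meat - shellfish          # shellfish moved to "gathering"
                  for meat, shellfish, north_american in societies
                  if not north_american])   # high-meat cases dropped
print(f"all cases, shellfish as hunting: {corrected:.0f}% meat")
print(f"Lee-style accounting:            {lee_style:.0f}% meat")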


Modern hunter-gatherer diets vis-a-vis optimal foraging theory and reconstructed prehistoric diets. An important issue worthy of mention here is the reliability of backward extrapolations
from behaviorally modern hunter-gatherers to more primitive hominids. The phrase "behaviorally modern" generally refers to the time horizon of approximately 40,000 B.C., during and after which humans began exhibiting behaviors we think of as modern: such as ritual burials, cave paintings, body ornamentation, other expressions of art, and most importantly for our purposes here, more sophisticated tool design, use, and techniques where hunting is concerned (eventually culminating in the bow-and-arrow, for example) which would presumably have increased hunting success. In light of this, critics have questioned how reliable the backward extrapolations may be. Proponents, in justification, note that well-established "optimal foraging theory" (which, reduced to its bare essentials, says that all creatures tend to expend the least foraging effort for the greatest caloric and/or nutritional return) can be used as a basis to make reasonable predictions. The kind of food intake that optimal foraging theory predicts for a particular species (modern or ancient) is in turn dependent on the surrounding environment and foods prevailing at the time (i.e., savanna habitat, generally, in the case of humans), along with what is known about the kind of foods their digestive physiology would likely be able to efficiently process. Based on this approach, proponents believe even with less sophisticated hunting technologies--given what is now known about ancient environments and availabilities of ancient plant and animal resources--the level of animal food in the diet beginning with erectus, at least, probably would have been in the 50% range with variations for local habitat. The increasing encephalization quotient (brain volume relative to body size) of Homo and particularly the jump in brain size with Homo erectus at 1.7 million years ago also tends to corroborate the suggestion of increasingly large amounts of meat in the diet at this time.
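For readers unfamiliar with optimal foraging theory, the core of its classic "diet breadth" (prey-choice) model can be sketched in a few lines: rank resources by energy returned per hour of handling, then widen the diet only so long as doing so raises the overall expected return per foraging hour. All the resource numbers below are invented purely to illustrate the logic:

# Sketch of the prey-choice ("diet breadth") model behind optimal
# foraging theory. All numbers are invented for illustration.
# Each entry: (name, kcal per item, handling hours per item,
#              encounters per hour of searching)
resources = [
    ("large game", 15000, 4.0, 0.02),
    ("small game", 1500, 0.5, 0.20),
    ("tubers", 300, 0.25, 1.00),
    ("leafy plants", 50, 0.10, 2.00),
]

# Rank resources by profitability: energy per hour of handling time.
resources.sort(key=lambda r: r[1] / r[2], reverse=True)

best_rate, best_diet = 0.0, []
for i in range(1, len(resources) + 1):
    chosen = resources[:i]
    kcal_per_search_hr = sum(lam * e for _, e, _, lam in chosen)
    hours_per_search_hr = 1 + sum(lam * h for _, _, h, lam in chosen)
    rate = kcal_per_search_hr / hours_per_search_hr
    if rate > best_rate:
        best_rate, best_diet = rate, [name for name, *_ in chosen]

# On these invented numbers the lowest-return resource is excluded:
# adding it would lower the overall return per foraging hour.
print(f"optimal diet: {best_diet} (~{best_rate:.0f} kcal/foraging hour)")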

Meat consumption levels in early Homo species

*** "based on the fact that the diet of modern hunter gatherers...has not been known to exceed 40% meat in tropical habitats like habilis evolved in, we can safely assume that the meat in habilis' diet would have been substantially less than that."
As just discussed above regarding earlier analyses of anthropological data that have since been corrected, the average percentage of meat eaten by modern hunter-gatherers studied to date--those described in the Ethnographic Atlas (an anthropological compendium and database of hunter-gatherer data)--has now been shown to be in the 50-65% range, looking at the entire range of habitats studied. There are tropical hunter-gatherers eating in this range of meat consumption as well. Therefore, the percentage of meat in habilis' diet becomes an even more interesting question than before. The diet of habilis' precursor Australopithecus would presumably have been higher in meat, at some undetermined level, than that of modern chimps (who are in the 2% range of meat consumption), while habilis' successor erectus is now thought to have been near the range of modern hunter-gatherers (perhaps 50%); where between those wide endpoints habilis' meat consumption fell is not something I am prepared to guess. However, when we note that habilis was the first known hominid tool user--and that those tools were specifically used for processing meat--it seems logical to suggest that the amount must have jumped well above that of Australopithecus.

*** "1,500,000 to 230,000 B.C.: Evolution of Homo habilis into the 'erectines'..."
New and controversial reanalysis of previous fossil discoveries in Java has suggested that erectus' span of existence may have extended to as late as 30,000-50,000 B.C. in isolated areas, although the new analysis is still undergoing intensive debate. [Wilford 1996]


*** "There were also physical changes in response to the colder and darker [more northerly] areas that were inhabited [by erectus after 700,000 years ago], such as the development of lighter skin color that allowed the sun to penetrate the skin and produce vitamin D, as well as the adaptation of the fat layer and sweat glands to the new climate."
To avoid confusion here, it should be mentioned that these adaptations are believed to have evolved a second time in modern Homo sapiens 100,000-200,000 years ago once they began migrating out of Africa--after having evolved from a group of erectines that had remained behind in Africa when the earlier migrational wave of other erectines spread northward 700,000 years ago.

Research updates relating to diet and health late in human evolution
*** "...but widespread global use [of seafoods] in the fossil record is not seen until around 20,000 years ago and since. For the most part, seafoods should probably not be considered a major departure, however, as the composition of fish, shellfish, and poultry more closely resembles the wild land-game animals many of these same ancestors ate than any other source today except for commercial game farms that attempt to mimic ancient meat."
The fact that fish is one of the foods some people are allergic to may be an indicator that evolutionary adaptation to this food is not complete, or is at least still problematic for some genetic subgroups--though compared to grains and dairy, there is of course a far better case for including fish in a Paleolithic diet. Strictly speaking, meat from land animals is the primary flesh-food adaptation for humans. (The small minority of researchers promoting the very controversial "aquatic ape" theory of human evolution--presently considered fringe science, and proposing a brief formative phase of human evolution taking place in coastal waters--might disagree, of course.) To this date, I have not myself seen or heard convincing research either way on how well-adapted the human species overall may be to fish consumption compared with land game animals.

*** "35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50% of the diet."
At least one researcher has voiced the opinion that, given that Ice-Age Europe in many areas was not too dissimilar from the arctic climate that modern-day Eskimos inhabit [i.e., very little plant food available], meat consumption in Europe during the last Ice Age may have approached similar levels--that is, considerably more than 50%, up to as high as 90% in some areas.

*** "Wild grasses and cereals began flourishing [around 10,000 B.C.] making them prime candidates for the staple foods to be domesticated, given our previous familiarity with them."
This statement could be construed as giving the impression that wild grasses and cereals may have been flourishing in many places around the globe. While it is true that environmental conditions at this time became considerably more favorable to wild grasses and cereals, I have since learned that the ones which thrived naturally and lent themselves best to the development of agriculture (wheat, barley, etc.) were indigenous only to a limited number of locales around the globe, the most notable of which, of course, was the Near East, where agriculture got its earliest start. A more accurate statement would be that peoples in the locations where these grasses and cereals began flourishing indigenously took advantage of the improved weather and environment to exploit them even further. The grasses and cereals would not have flourished to the extent they did, nor--most importantly--spread as widely as they came to do (virtually around the globe), were it not for proactive human intervention.


*** "During the time since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased from an average of probably 65%/35% during Paleolithic times to as high as 90%/10% since the advent of agriculture."
As stated above, the average animal food consumption during the Paleolithic was more likely in the 50-65% range than 35%. This makes the decrease down to 10% of the diet in many places even more noteworthy in terms of suggesting the possible nutritional repercussions of modern-day diets.

*** "Skeletal remains show that height decreased by four inches from the Late Paleolithic to the early Neolithic..."
Various sources I've seen since then say the decrease in height was as much as six inches. It is worth noting in this connection that vegetarians will often argue that the extra growth animal foods promote is "pathological" hypergrowth, leads to premature aging, etc. It should be noted, however, that recent studies show that the reduced growth sometimes seen in infants on vegan diets can lead to failure to meet accepted benchmarks for health status during the growth period. In refuting the common claim of vegetarians that the rebound in adult heights of modern populations eating more animal food is nothing but "pathological hypergrowth," it is also worth repeating that the historical decrease in height noted in skeletal remains after the Neolithic (when animal food consumption began plunging) was accompanied by skeletal signs of disease-stress, as mentioned in Part 1 of the interview. Also, the longevity of Neolithic peoples, who ate diets higher in plant food (particularly grains) and lower in animal food, was in general a bit shorter than that of Paleolithic peoples. [Angel 1984]

Clarifications regarding chimpanzee diet

*** "The argument made is very similar to the "comparative anatomy" argument: Look at the rest of the animals, and especially look at the ones we are most similar to, the apes."
A brief note: the original article noted that humans are most genetically similar to chimpanzees. If more space had been available, I would have made the further distinction that we are equally as genetically close to the bonobo chimpanzee. However, since the common chimpanzee has been more closely studied than the rare bonobo, the research from these studies has (until recently, perhaps) provided more wide-ranging insights. So far as I am aware, the main observed differences between the common chimp and the bonobo, diet-wise, are that bonobos eat a higher percentage of fruit in the diet (approx. 80% vs. up to 60%-65% for the common chimp), and that insect and flesh consumption is apparently lower. Interestingly reminiscent of human behavior, the bonobos' highly sexual behavior in many common social situations seems to serve a communication function, both as a social "glue" and as a conciliatory mechanism between individuals in potentially divisive situations. (Examples: bonobos will often engage in sex-play upon coming onto a major food source--sensory excitation often turns into sexual excitation, which is then discharged in fraternization with others in the group and seems to promote group cohesiveness. Also, when spats occur between bonobos, sexual engagement afterwards may serve as a way to smooth the incident over and reestablish normal relations.)

*** "...it would probably be fair to estimate that most populations of chimpanzees are getting somewhere in the neighborhood of 5% of their diet on average in most cases (as a baseline) to perhaps 8-10%..."
While the 5% figure was the best guess I could make at the time based on the difficulty in finding any research with precise figures, it does appear to be fairly close to the mark. I have since seen figures quoted


in the 4% to 7% range for chimp animal-food consumption. 8-10% is in all likelihood too high as an average, though consumption could plausibly reach that level at the height of the termite season or during the peak months for red colobus monkey kills.

*** "An important observation that cannot be overlooked is the wide-ranging omnivorousness and the predilection for tremendous variety in chimpanzees' diet, which can include up to 184 species of foods..."
Something I overlooked in this factoid, taken from Goodall's The Chimpanzees of Gombe, was that 26 of the 184 items were observed to be eaten only once. Nevertheless, the 158 remaining items still constitute a remarkable variety of food items. That the additional items would still be sought out, on top of an already high level of variety, can be looked at as an interesting indicator of the "opportunistic" nature of chimpanzees in taking advantage of whatever food they can find and utilize. (In other words, as a "model" for the kind of dietary behavior that vegetarians sometimes hold up for comparison with humans, chimps' opportunism suggests they are not particularly constrained by arbitrary food categories in deciding what is or is not appropriate to eat.)

*** "...there is some suggestion chimps seem not to prefer extra-high roughage volumes, at least compared to the gorilla. Certainly they do not seem to be able to physiologically tolerate as much cellulose from vegetative matter in their diet."
This could be a bit misleading if construed to mean chimps don't have a lot of roughage in their diet, because they certainly do. It remains true, however, that chimps do not possess the requisite kind of symbiotic intestinal flora (bacteria) to subsist on diets as high in leafy vegetative matter as gorillas do. The subsequent discussion on wadging was meant to show two things: that chimp behavior is inventive toward meeting their needs, and also that wadging can be a way of exploiting the part of fibrous foods they could otherwise not eat freely of due to the limitations of their gut. The relevance of these observations for human diet would be that the first hominids went much further in the direction of less-fibrous foods than chimpanzees were to go after the divergence from our common ape ancestor, and toward a much more concentrated diet higher in animal foods containing denser nutrients of higher bioavailability, as we discussed above regarding the increasing size of the human brain that occurred concomitantly with a reduction in the size of the gut.

This concludes Part 1. In Part 2, we'll look into such things as fire and cooking in human evolution, rates of genetic adaptation to change, modern hunter-gatherers, diseases in the wild, and then turn to "the psychology of idealistic diets."

Knowledge gap in vegetarian community about evolutionary data/implications

Health & Beyond: In Part 1 of our interview, you discussed the extensive evidence showing that primitive human beings as well as almost all of the primates today have included animal foods such as flesh or insects in their diets. Why haven't Natural Hygienists and other vegetarians looked into all this information?
Ward Nicholson: My guess is that: (1) Most aren't aware that paleoanthropologists have by now assembled a considerable amount of data about our evolutionary past related to diet. But more importantly, I think it has to do with psychological barriers, such as: (2) Many Hygienists assume they don't have to look, because the subjective "animal model" for raw-food naturalism makes it "obvious" what our natural diet is, and the paleontologists' evidence must therefore be in error, or biased by present cultural eating practices. Or: (3) They don't want to look, perhaps because they're afraid of what they might see.


Many Hygienists identify the system mostly with certain dietary details, even though the system itself flows from principles independent of those details. I think in spite of what most
Natural Hygienists will tell you, they are really more wedded to certain specific details of the Hygienic system that remain prevalent (e.g., raw-food vegetarianism, food combining, etc.) than they are truly concerned with whether those details follow logically from underlying Hygienic principles. The basic principle of Natural Hygiene is that the body is a self-maintaining, self-regulating, self-repairing organism that naturally maintains its own health when it is given food and other living conditions appropriate to its natural biological adaptation. In and of itself, this does not tell you what foods to eat. That has to be determined by a review of the best evidence we have available. So while the principles of Hygiene as a logical system do not change, our knowledge of the appropriate details that follow from those principles may and probably will change from time to time--since science is a process of systematically elucidating more "known" information from what used to be unknown. Thus the accuracy of our knowledge is to some extent time-based, dependent on the accumulation of evidence to provide a more inclusive view of "truth," which unfortunately is probably never absolute, but--as far as human beings are concerned--relative to the state of our knowledge. Science simply tries to bridge the knowledge gap. And a hallmark of closing the knowledge gap through scientific discovery is openness to change and refinements based on the accumulation of evidence. Open-mindedness is really openness to change. Just memorizing details doesn't mean much in and of itself. It's how that information is organized, or seen, or interpreted, or related to, that means something.

Hygienic and vegan diets are a significant restriction of the diet(s) on which humans evolved. What's interesting to me is that the evolutionary diet is not so starkly different from the Hygienic
diet. Much of it validates important elements of the Hygienic view. It is very similar in terms of getting plenty of fresh fruits and veggies, some nuts and seeds, and so forth--with the one exception being the addition of a smaller role for flesh and other animal foods (smaller, at least, compared to the much larger role of plant foods). We have actually done fairly well in approximating humanity's "natural" or "original" diet, except we have been in error about this particular item, and gotten exceedingly fundamentalist about it when there is nothing in the body of Hygienic principles themselves that would outlaw meat if it's part of our evolutionary adaptation.

Avowed Shelton loyalists are actually the ones who have most ignored his primary directive.
But for some reason, even though Natural Hygiene is not based on any "ethical" basis for vegetarianism (officially at least), this particular item seems to completely freak most Hygienists out. Somehow we have made a religion out of dietary details that have been the hand-me-downs of past Hygienists working with limited scientific information. They did the best they could given the knowledge they had available to them then, and we should be grateful for their hard work. But today the rank and file of Natural Hygiene has largely forgotten Herbert Shelton's rallying cry, "Let us have the truth, though the heavens fall." Natural Hygiene was alive and vital in Shelton's time because he was actively keeping abreast of scientific knowledge and aware of the need to modify his previous views if scientific advances showed them to be inadequate. But since Shelton retired from the scene, many people in the mainstream of Hygiene have begun to let their ideas stagnate and become fossilized. The rest of the dietary world is beginning to pass us by in terms of scientific knowledge.

Only two insights remain that are still somewhat unique to Natural Hygiene. As I see it, there
remain only two things Natural Hygiene grasps that the rest of the more progressive camps in the dietary world still don't:

• Strong recognition of the principle of the body as a homeostatic, self-healing mechanism. An understanding of the fundamental health principle that outside measures (drugs,
surgery, etc.) never truly "cure" degenerative health problems. In spite of the grandiose hopes and claims that they do, and the aura of research breakthroughs, their function is really to serve as crutches, which can of course be helpful and may truly be needed in some circumstances. But the


only true healing is from within, by a body that has a large capacity, within certain limits, to heal and regenerate itself when given all of its essential biological requirements--and nothing more or less, which would hamper its homeostatic functioning. The body's regenerative (homeostatic) abilities are still commonly unrecognized today (often classed as "unexplained recoveries" or--in people fortunate enough to recover from cancer--as "spontaneous remission") because the population at large is so far from eating anything even approaching a natural diet that would allow their bodies to return to some kind of normal health that such recoveries are just not seen very often outside limited pockets of people seriously interested in approximating our natural diet.

• Fasting as a tool to promote such self-healing. And the other thing is that Hygienists are also keenly aware of the power of fasting to help provide ideal conditions under which such self-healing can occur.

But the newer branch of science called "Darwinian medicine" is slowly beginning (albeit with certain missteps) to grasp the principle of self-healing--or probably more correctly, at least the understanding that degenerative diseases arise as a result of behavior departing from what our evolutionary past has adapted us to. Its practitioners see the negative side of how departing from our natural diet and environment can result in degenerative disease, but they do not understand that the reverse--regenerating health by returning to our pristine diet and lifestyle, without drugs or other "crutches"--is also possible, again within certain limits; but those limits are less restrictive than most people believe.

In some ways, though, Hygiene now resembles a religion as much as it does science, because
people seem to want "eternal" truths they can grab onto with absolute certainty. Unfortunately, however, knowledge does not work that way. Truth may not change, but our knowledge of it certainly does as our awareness of it shifts or expands. Once again: the principles of Hygiene may not change, but the details will always be subject to refinement.

The rift in the Natural Hygiene movement over raw vs. cooked foods
Speaking of such details subject to refinement, I know you've been sitting on some very suggestive evidence to add further fuel to the fire-and-cooking debate now raging between the raw-foodist and "conservative-cooking" camps within Hygiene. Please bring us up to date on what the evolutionary picture has to say about this.
I'd be happy to. But before we get into the evolutionary viewpoint, I want to back up a bit first and briefly discuss the strange situation in the Hygienic community occurring right now over the raw foods vs. cooking-of-some-starch-foods debate. The thing that fascinates me about this whole brouhaha is the way the two sides justify their positions, each of which has a strong point, but also a telling blind spot.

Character of the rift: doctors vs. the rank-and-file. Now since most Natural Hygienists don't have
any clear, science-based picture of the evolutionary past indicating what behavior is natural, the "naturalistic" model used by many Hygienists to argue for eating all foods raw does so on a subjective basis--i.e., what I have called "the animal model for raw-food naturalism." The idea being that we are too blinded culturally by modern food practices involving cooking, and that to be more objective we should look at the other animals--none of whom cook their food--and so neither should we. Now it's true the "subjective raw-food naturalists" are being philosophically consistent here, but their blind spot is that they don't have any good scientific evidence from humanity's primitive past to back up their claim that total raw-foodism is the most natural behavior for us--that is, using the functional definition based on evolutionary adaptation I have proposed, if we are going to be rigorous and scientific about this.


Now on the other hand, with the doctors it's just the opposite story. In recent years, the Natural Hygiene doctors and the ANHS (American Natural Hygiene Society) have been more and more vocal about what they say is the need for a modest amount of cooked items in the diet--usually starches such as potatoes, squashes, legumes, and/or grains. And their argument is based on the doctors' experience that few people they care for do as well on raw foods alone as they do with the supplemental addition of these cooked items. Also, they argue that there are other practical reasons for eating these foods, such as that they broaden the diet nutritionally, even if one grants that some of those nutrients may be degraded to a degree by cooking. (Though they also say the assimilation of some nutrients is improved by cooking.) They also point out these starchier foods allow for adequate calories to be eaten while avoiding the higher levels of fat that would be necessary to obtain those calories if extra nuts and avocados and so forth were eaten to get them.

One side ignores the need for philosophical consistency. The other denies practical realities and real-world results. So we have those with wider practical experience arguing for the inclusion of
certain cooked foods based on pragmatism. But their blind spot is in ignoring or attempting to finesse the inconsistency their stance creates with the naturalist philosophy that is the very root of Hygienic thinking. And again, the total-raw-foodists engage in just the opposite tactics: being philosophically consistent in arguing for all-raw foods, but being out of touch with the results most other people in the real world besides themselves get on a total raw-food diet, and attempting to finesse that particular inconsistency by nitpicking and fault-finding other implementations of the raw-food regime than their own. (I might interject here, though we'll cover this in more depth later, that although it's not true for everyone, experience of most people in the Natural Hygiene M2M supports the view that the majority do in fact do better when they add some cooked foods to their diet.)

Is there a way these two stances in the conflict over cooking can be reconciled and accounted for scientifically? Now my tack as both a realist and someone who is also interested in being
philosophically consistent has been: If it is true that most people* do better with the inclusion of some of these cooked items in their diet that we've mentioned--and I believe that it is, based on everything I have seen and heard--then there must be some sort of clue in our evolutionary past why this would be so, and which would show why it might be natural for us. The question is not simply whether fire and cooking are "natural" by some subjective definition. It's whether they have been used long enough and consistently enough by humans during evolutionary time for our bodies to have adapted genetically to the effects their use in preparing foods may have on us. Again, this is the definition for "natural" that you have to adopt if you want a functional justification that defines "natural" based on scientific validation rather than subjectivity.

When was fire first controlled by human beings?
So the next question is obvious: How long have fire and cooking been around, then, and how do we know whether that length of time has been long enough for us to have adapted sufficiently?
Let's take the question one part at a time. The short answer to the first part of the question is that fire was first controlled by humans anywhere from about 230,000 years ago to 1.4 or 1.5 million years ago, depending on which evidence you accept as definitive.

• Evidence for very early control of fire is sparse and ambiguous. The earliest evidence for control of fire by humans, in the form of fires at Swartkrans, South Africa and at Chesowanja, in Kenya, suggests that it may have been in use there as early as about 1.4 or 1.5 million years ago.[100] However, the interpretation of the physical evidence at these early


sites has been under question in the archaeological community for some years now, with critics saying these fires could have been wildfires instead of human-made fires. They suggest the evidence for human control of fire might be a misreading of other factors, such as magnesium staining of soils, which can mimic the results of fire if not specifically accounted for. For indisputable evidence of fire intentionally set and controlled by humans, the presence of a hearth or circle of scorched stones is often demanded as conclusive proof,[101] and at these early sites, the evidence tying the fires to human control is based on other factors.

• Earliest dates for control of fire accepted by skeptical critics. At the other end of the timescale, these same critics who are only willing to consider the most unequivocal evidence will still admit that at least by 230,000 years ago[102] there is enough good evidence at at least one site to establish fire was under control by humans at this time. At this site, called Terra Amata, an ancient beach location on the French Riviera, stone hearths are found at the center of what may have been huts; and more recent sources may put the site's age at possibly 300,000 years old rather than 230,000.[103] Somewhat further back--from around 300,000 to 500,000 years ago--more evidence has been accumulating recently at sites in Spain and France[104] that looks as if it may force the ultraconservative paleontologists to concede their 230,000-year-ago date is too stingy, but we'll see.

And then there is Zhoukoudian cave in China, one of the most famous sites connected with Homo erectus, where claims that fire may have been used as early as 500,000 to 1.5 million years ago have now largely been discredited due to the complex and overlapping nature of the evidence left by not just humans, but hyenas and owls who also inhabited the cave. (Owl droppings could conceivably have caught fire and caused many of the fires.) Even after discounting the most extreme claims, however, it does seem likely that at least by 230,000 to 460,000 years ago humans were using fire in the cave,[105] and given scorching patterns around the teeth and skulls of some animal remains, it does appear the hominids may have done this to cook the brains (not an uncommon practice among hunting-gathering peoples today).[106] The most recent excavation with evidence for early use of fire has been within just the last couple of years in France, at the Menez-Dregan site, where a hearth and evidence of fire have been preliminarily dated to approximately 380,000 to 465,000 years ago. If early interpretations of the evidence withstand criticism and further analysis, the fact that a hearth composed of stone blocks inside a small cave was found with burnt rhinoceros bones close by has provoked speculation that the rhino may have been cooked at the site.[107]

Crux of the question: first control of fire vs. earliest widespread use. Now of course, the crucial
question for us isn't just when the earliest control of fire was; it's by what date fire was being used consistently--and more specifically for cooking, so that more-constant genetic selection pressures would have been brought to bear. Given the evidence available at this time, most of it would indicate that 125,000 years ago is the earliest reasonable estimate for widespread control.*[108] Another good reason it may be safer to base adaptation to fire and cooking on the figure of 125,000 years ago is that more and more evidence indicates modern humans today are descended from a group of ancestors who were living in Africa 100,000-200,000 years ago, who then spread out across the globe to replace other human groups.[109] If true, this would probably mean the fire sites in Europe and China are those of separate human groups who did not leave descendants that survived to the present. Given that the African fire sites in Kenya and South Africa from about 1.5 million years ago are under dispute, then, widespread usage at 125,000 years ago seems the safest figure for our use here.

Sequence of stages in control: fire for warmth vs. fire for cooking. One thing we can say about
the widespread use of fire probably in place by 125,000 years ago, however, is that it would almost certainly have included the use of fire for cooking.* Why can this be assumed? It has to do with the sequence of progressive stages of control over fire that would have had to take place prior to fire usage becoming


commonplace. And the most interesting of these is that fire for cooking would almost inevitably have been one of the first uses it was put to by humans, rather than some later-stage use.* The first fires on earth occurred approximately 350 million years ago--the geological evidence for fire in remains of forest vegetation being as old as the forests themselves.[110] It is usual to focus only on fire's immediately destructive effects on plants and wildlife, but there are also benefits. In response to occasional periodic wildfires, for example, certain plants and trees known as "pyrophytes" have evolved, for whose existence periodic wildfires are essential. Fire revitalizes them by destroying their parasites and competitors, and such plants include grasses eaten by herbivores as well as trees that provide shelter and food for animals.[111]

Opportunistic exploitation of animal kills by predators after wildfires. Fires also provide other
unintended benefits to animals as well. Even while a wildfire is still burning, birds of prey (such as falcons and kites)--the first types of predators to appear at fires--are attracted to the flames to hunt fleeing animals and insects. Later, land-animal predators appear when the ashes are smoldering and dying out to pick out the burnt victims for consumption. Others, such as deer and bovine animals, appear after that to lick the ashes for their salt content. Notable as well is that most mammals appear to enjoy the heat radiated at night at sites of recently burned-out fires.[112] It would have been inconceivable, therefore, that human beings, being similarly observant and opportunistic creatures, would not also have partaken of the dietary windfall provided by wildfires they came across. And thus, even before humans had learned to control fire purposefully--and without here getting into the later stages of control over fire--their early passive exposures to it would have already introduced them, like the other animals, to the role fire could play in obtaining edible food and providing warmth.

Potential adaptation to cooking in light of genetic rates of change

So if fire has been used on a widespread basis for cooking since roughly 125,000 years ago, how do we know if that has been enough time for us to have fully adapted to it?
To answer that, we have to be able to determine the rate at which the genetic changes constituting evolutionary adaptation take place in organisms as a result of environmental or behavioral change--which in this case means changes in food intake.

Rates of genetic change as estimated from speciation in the fossil record. The two sources for
estimates of rates at which genetic change takes place are from students of the fossil record and from population geneticists. Where the fossil record is concerned, Niles Eldredge, along with Stephen Jay Gould, two of the most well-known modern evolutionary theorists, estimated the time span required for "speciation events" (the time required for a new species to arise in response to evolutionary selection pressures) to be somewhere within the range of "five to 50,000 years."[113] Since this rough figure is based on the fossil record, it makes it difficult to be much more precise than that range. Eldredge also comments that "some evolutionary geneticists have said that the estimate of five to 50,000 years is, if anything, overly generous."[114] Also remember that this time span is for changes large enough to result in a new species classification. Since we are talking here about changes (digestive changes) that may or may not be large enough to result in a new species (though changes in diet often are in fact behind the origin of new species), it's difficult to say from this particular estimate whether we may be talking about a somewhat shorter or longer time span than that for adaptation to changes in food.

Measurements of genetic change from population genetics. Fortunately, however, the estimates
from the population geneticists are more precise. There are even mathematical equations to quantify the


rates at which genetic change takes place in a population, given evolutionary "selection pressures" of a given magnitude that favor survival of those individuals with a certain genetic trait.[115] The difficulty lies in how accurately one can numerically quantify the intensity of real-world selection pressures. However, it turns out there have been two or three actual examples where it has been possible to do so at least approximately, and they are interesting enough that I'll mention a couple of them briefly here so people can get a feel for the situation. The most interesting of these examples relates directly to our discussion here, and has to do with the gene for lactose tolerance in adults. Babies are born with the capacity to digest lactose via production of the digestive enzyme lactase. Otherwise they wouldn't be able to make use of mother's milk, which contains the milk sugar lactose. But sometime after weaning, this capacity is normally lost, and there is a gene that is responsible. Most adults--roughly 70% of the world's population overall--do not retain the ability to digest lactose into adulthood,[116] and this outcome is known as "lactose intolerance." (Actually this is something of a misnomer, since adult lactose intolerance would have been the baseline normal condition for virtually everyone in the human race up until Neolithic (agricultural) times.[117]) If these people attempt to drink milk, the result may be bloating, gas, intestinal distress, diarrhea, etc.[118]

Influence of human culture on genetic selection pressures. However--and this is where it gets
interesting--those population groups that do retain the ability to produce lactase and digest milk into adulthood are those descended from the very people who first began domesticating animals for milking during the Neolithic period several thousand years ago.[119] (The earliest milking populations in Europe, Asia, and Africa began the practice probably around 4,000 B.C.[120]) And even more interestingly, in population groups where cultural changes have created "selection pressure" for adapting to certain behavior--such as drinking milk in this case--the rate of genetic adaptation to such changes significantly increases. In this case, the time span for widespread prevalence of the gene for lactose tolerance within milking population groups has been estimated at approximately 1,150 years[121]--a very short span of time in evolutionary terms.
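For the curious, the flavor of the calculation behind estimates like this can be shown with the simplest deterministic selection model from population genetics, in which an advantageous allele at frequency p changes each generation as p' = p(1 + s)/(1 + sp). The selection coefficient, starting frequency, target frequency, and generation length below are assumptions chosen only to show the order of magnitude, not the published parameters:

# Sketch of the kind of estimate described above, using the simple
# deterministic (genic) selection recursion p' = p(1 + s) / (1 + s*p).
# s, p0, target, and the generation length are all illustrative
# assumptions, not the published parameters.
def generations_to_spread(p0, target, s):
    """Generations for an advantageous allele to rise from frequency
    p0 to the target frequency under selection coefficient s."""
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (1 + s * p)
        gens += 1
    return gens

s = 0.15                # strong culturally driven selection (assumed)
gens = generations_to_spread(p0=0.02, target=0.90, s=s)
print(f"~{gens} generations, ~{gens * 25} years")  # assumed 25-yr generations

With these illustrative values the allele spreads in on the order of a thousand years, which is at least consistent in magnitude with the roughly 1,150-year estimate cited above.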

Relationship between earliest milking cultures and prevalence of lactose tolerance in populations. There is a very close correlation between the 30% of the world's population who are tolerant
to lactose and the earliest human groups who began milking animals. These individuals are represented most among modern-day Mediterranean, East African, and Northern European groups, and emigrants from these groups to other countries. Only about 20% of white Americans in general are lactose intolerant, but among sub-groups the rates are higher: 90-100% among Asian-Americans (as well as Asians worldwide), 75% of African-Americans (most of whom came from West Africa), and 80% of Native Americans. 50% of Hispanics worldwide are lactose intolerant.[122] Now whether it is still completely healthy for the 30% of the world's population who are lactose tolerant to be drinking animals' milk--which is a very recent food in our evolutionary history--I can't say. It may well be that there are factors other than the ability to produce lactase involved in successfully digesting and making use of milk without health side-effects--I haven't looked into that particular question yet. But for our purposes here, the example does powerfully illustrate that genetic adaptations for digestive changes can take place much more rapidly than was perhaps previously thought.*

Genetic changes in population groups who crossed the threshold from hunting-gathering to grain-farming earliest. Another interesting example of the spread of genetic adaptations since the
Neolithic has been two specific genes whose prevalence has been found to correlate with the amount of time populations in different geographical regions have been eating the grain-based high-carbohydrate diets common since the transition from hunting and gathering to Neolithic agriculture began 10,000 years ago. (These two genes are the gene for angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B, which, if the proper forms are not present, may increase one's chances of getting cardiovascular disease.) [123]


In the Middle East and Europe, rates of these two genes are highest in populations (such as Greece, Italy, and France) closer to the Middle Eastern "fertile crescent" where agriculture in this part of the globe started, and lowest in areas furthest away, where the migrations of early Neolithic farmers with their grain-based diets took longest to reach (i.e., Northern Ireland, Scotland, Finland, Siberia). Closely correlating with both the occurrence of these genes and the historical rate of grain consumption are corresponding rates of deaths due to coronary heart disease. Those in Mediterranean countries who have been eating high-carbohydrate grain-based diets the longest (for example, since approximately 6,000 B.C. in France and Italy) have the lowest rates of heart disease, while those in areas where dietary changes due to agriculture were last to take hold, such as Finland (perhaps only since 2,000 B.C.), have the highest rates of death due to heart attack. Breast cancer rates in Europe are also higher in countries that have been practicing agriculture for the least amount of time.[124] Whether grain-based diets eaten by people whose ancestors only began eating them recently (and who therefore lack the appropriate genes) are actually causing these health problems (and are not simply correlated by coincidence) is at this point a hypothesis under study. (One study with chickens, however--who in their natural environment eat little grain--has shown much less atherosclerosis on a high-fat, high-protein diet than on a low-fat, high-carbohydrate diet.[125]) But again, and importantly, the key point here is that genetic changes in response to diet can be more rapid than perhaps once thought. The difference in time since the advent of Neolithic agriculture between countries with the highest and lowest incidences of these two genes is something on the order of 3,000-5,000 years,[126] showing again that cultural selection pressures relating to diet can force more rapid genetic changes than might occur otherwise.

Recent evolutionary changes in immunoglobulin types, and genetic rates of change overall.
Now we should also look at the other end of the time scale for some perspective. The Cavalli-Sforza population genetics team, one of the pioneers in tracking the spread of genes around the world due to migrations and/or interbreeding of populations, has also looked into the genes that control immunoglobulin types (an important component of the immune system). Their estimate here is that the current variants of these genes were selected for within the last 50,000-100,000 years, and that this time span would be more representative for most groups of genes. They also feel that in general it is unlikely gene frequencies for most groups of genes would undergo significant changes in time spans of less than about 11,500 years.[127] However, the significant exception they mention--and this relates especially to our discussion here--is where there are cultural pressures for certain behaviors that affect survival rates.[128] And the two examples we cited above--the gene for lactose tolerance (milk-drinking) and the genes associated with high-carbohydrate grain consumption--both involve cultural selection pressures that came with the change from hunting and gathering to Neolithic agriculture. Again, cultural selection pressures for genetic changes operate more rapidly than any other kind. Nobody yet, at least so far as I can tell, really knows whether or not the observed genetic changes relating to the spread of milk-drinking and grain consumption are enough to confer a reasonable level of adaptation to these foods among populations who have the genetic changes, and the picture seems mixed.*

Rates of gluten intolerance (gluten is a protein in certain grains such as wheat, barley, and oats that
makes dough sticky and conducive to bread-baking) are lower than those for lactose intolerance, which one would expect given that milk-drinking has been around for less than half the time grain consumption has. Official estimates of gluten intolerance range from 0.3% to 1% worldwide depending on population group.[129] Some researchers, however, believe that gluten intolerance is but the tip of the iceberg of problems due to grain consumption (or more specifically, wheat). Newer research seems to suggest that anywhere from 5% to as much as 20-30% of the population with certain genetic characteristics (resulting in what is called a "permeable intestine") may absorb incompletely digested peptide fragments from wheat, with adverse effects that could lead to a range of possible diseases.[130]

What do common genetic rates of change suggest about potential adaptation to cooking?

We have gone a little far afield here getting some kind of grasp on rates of genetic change, but I think it's been necessary


for us to have a good sense of the time ranges involved. So to bring this back around to the question of adaptation to cooking, it should probably be clear by this point that, given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked.* I would include among these some of the vegetable foods--particularly coarser ones like starchy root vegetables such as yams, which have long been thought to have been cooked[131]--and perhaps others, as well as meat, from what we know about the fossil record.

Are cooking's effects black-and-white or an evolutionary cost/benefit tradeoff?
What about the contention by raw-food advocates that cooking foods results in pyrolytic byproducts that are carcinogenic or otherwise toxic to the body, and should be avoided for that reason?
It's true cooking introduces some toxic by-products, but it also neutralizes others.[132] In addition, the number of such toxins created is dwarfed by the large background level of natural toxins (thousands)[133] already present in plant foods from nature to begin with, including some that are similarly carcinogenic in high-enough doses. (Although only a few dozen have been tested so far,[134] half of the naturally occurring substances in plants known as "nature's pesticides" that have been tested have been shown to be carcinogenic in trials with rats and mice.[135]) Nature's pesticides appear to be present in all plants, and though only a few are found in any one plant, 5-10% of a plant's total dry weight is made up of them.[136] [The reason "nature's pesticides" occur throughout the plant kingdom is that plants have had to evolve low-level defense mechanisms against animals to deter overpredation. On one level, plants and animals are in a continual evolutionary "arms race" against each other. Fruiting plants, of course, have also evolved the separate ability to exploit the fact that certain animals are attracted to their fruit, enabling the plants' seeds to be dispersed through the animals' feces.]

We have a liver and kidneys for a reason: there have always been toxins in natural foods that the body has had to deal with, and that is one reason these organs evolved. There are also a number of other more general defenses the body has against toxins. These types of defenses make evolutionary sense given the wide range of toxic elements in foods the body has had to deal with over the eons. [Perhaps not clear enough in the original version of the interview is the point that a wide range of GENERAL defenses might therefore be reasonably expected to aid in neutralizing or ejecting toxins even of a type the body hadn't necessarily seen before, such as those that might be introduced by cooking practices.] Such mechanisms include the constant shedding of surface-layer cells of the digestive system, many defenses against oxygen free-radical damage, and DNA excision repair, among others.[137]

The belief that a natural diet is, or can be, totally toxin-free is basically an idealistic fantasy--an illusion of black-and-white thinking not supported by real-world investigations. The real question is not whether a diet is completely free of toxins, but whether we are adapted to process those substances in our foods--in reasonable or customary amounts such as were encountered during evolution--that are not usable by the body. Again, the black-and-white nature of much Hygienic thinking obscures what are here questions of degree rather than absolutes.

Cooking may favorably impact digestibility. Also--and I know raw-foodists generally don't like to hear this--there has long been evidence that cooking does in fact make foods of certain types more digestible. For example, trypsin inhibitors (themselves a type of protease inhibitor), which are widely distributed in the plant kingdom, particularly in rich sources of protein, inhibit the ability of digestive enzymes to break down protein. (Probably the best-known plants containing trypsin inhibitors are legumes and grains.) Research has shown the effect of most such protease inhibitors on digestion to be reduced by cooking.[138] And it is this advantage--expanding the range of utilizable foods in an uncertain environment--that helped bring cooking about and enhanced survival.* I want to make clear that I still believe the largest component of the diet should be raw (at least 50% if not considerably more), but there is provision in the evolutionary picture for reasonable amounts of cooked foods of certain types: at the very least yams, probably some other root vegetables, the legumes, some meat, and so forth. (With meat, the likelihood is that it was eaten raw when freshly killed, but what could not be eaten would likely have been dried or cooked to preserve it for later consumption, rather than wasting it.) Whether or not some foods like these can be eaten raw if one has no choice or is determined enough to do so is not the real question. The question is what was more expedient or practical for survival and which prevailed over evolutionary time.

Cooking practices of Aborigines in light of survival needs. A brief look at the Australian Aborigines might be illustrative here.* What data is available from the time the Aborigines were first encountered by Europeans shows that inland Aborigines in the desert areas were subject to severe food shortages and prolonged droughts.[139] This of course made the most efficient use of whatever foods could be foraged paramount. Estimates based on studies of Aborigines in northern Australia are that they processed roughly half of their plant foods, but that no food was processed unnecessarily, any such preparation being done only to make a food edible, more digestible, or more palatable.[140] In general food was eaten as it was collected, according to its availability during the seasons--except during times of feasts--with wastage being rare, such a pattern being characteristic of feast-and-famine habitats. Some food, however, was processed for storage and later retrieval (usually by drying), including nuts and seeds, which may also have been ground and baked into cakes instead, before burying in the ground or storing in dry caches.[141] Fresh foods such as fruits, bulbs, nectar, gums, flowers, etc., were eaten raw when collected. Examples of foods that were prepared before consumption include the cooking of starchy tubers or seeds, the grinding and roasting of seeds, and the cooking of meat.[142] That these practices were necessary to expand the food supply, and were not merely induced by frivolous cultural practices as raw-foodists often tend to theorize, can be seen in the fact that after colonization by Europeans, Aborigines were not above coming into missions during droughts to get food.[143]

The role of individual experimentation given evolutionary uncertainties about diet
Fly in the ointment: dietary changes since advent of agriculture. But the more interesting and more pressing question, to my mind, is not whether we are adapted to cooking of certain foods, which seems very likely,* but how much we have adapted to the dietary changes since the Neolithic agricultural transition, given the 10,000 years or less it's been underway. At present, the answer is unclear, although in general, we can probably say there just hasn't been enough time for full adaptation yet--or if there has been, only for people descended from certain ancestral groups with the longest involvement with agriculture. My guess (and it is just a guess) would be that we are still mostly adapted to a Paleolithic diet, but that for any particular individual with a given ancestral background, certain Neolithic foods--such as grains, or, for even fewer people, perhaps modest amounts of certain cultured milk products such as cheese or yogurt (ones more easily digested than straight milk)--might be not only tolerated, but helpful. Especially where people are avoiding flesh products, which is our primary animal food adaptation, these animal by-products may be helpful,* as Stanley Bass's work with mice and his mentor Dr. Gian-Cursio's work with Hygienic patients seems to show, and as Dr. Bass has discussed previously here in H&B (in the April and June 1994 issues).


How are we to determine an optimum diet for ourselves, then, given that some genetic changes may be more or less complete in different population groups?
I think what all of this points to is the need to be careful in making absolute black-and-white pronouncements about invariant food rules that apply equally to all. It is not as simple as saying that if we aren't sure we are fully adapted to something, we should just eliminate it from the diet to be safe. Adaptation to a food does not mean mere tolerance for that food; it also means that if we are in fact adapted to it, we would be expected to thrive better with some amount of that food in our diet. Genetic adaptation cuts both ways. This is why I believe it is important for people to experiment individually. Today, because of the Neolithic transition and the rates at which genetic changes are being discovered to take place, it is apparent humanity is a species in evolutionary transition. Due to the unequal flow and dissemination of genes through a population during times like these, it is unlikely we will find [more] uniform adaptation across the population, as we probably would have during earlier times. This means it is going to be more likely right now in this particular historical time period that individuals will be somewhat different in their responses to diet. And as we saw above (with the two genes ACE and apolipoprotein-B) these genetic differences may even confound attempts to replicate epidemiological dietary studies from one population to another unless these factors are taken into account.*

Conflicting data from various modern lines of evidence means people must experiment and decide for themselves. So while it is important to look for convergences among different lines of
evidence (evolutionary studies, biochemical nutritional studies, epidemiological studies and clinical trials, comparative anatomy from primate studies, and so forth), it is well to consider how often the epidemiological studies, perhaps even some of the biochemical studies, reverse themselves or come back with conflicting data. It usually takes many years--even decades--for their import to become clear based on the lengthy scientific process of peer review and replication of experiments for confirmation or refutation.

Openness means challenging any rigid assumptions we may have through experimentation.
So my advice is: don't be afraid to experiment. Unless you have specific allergies or strong food intolerances and whatnot, the body is flexible enough by evolution to handle short-term variations in diet from whatever an optimal diet might be anyway. If you start within the general parameters we've outlined here and allow yourself to experiment, you have a much better chance of finding the particular balance among these factors that will work best for you. If you already have something that works well for you, that's great. If, however, you are looking for improvements, given the uncertainties we've talked about above, it's important to look at any rigid assumptions you may have about the "ideal" diet, and be willing to challenge them through experimentation. In the long run, you stand only to benefit by doing so.

Conflicts between paleo/anthropological vs. biochemical/epidemiological evidence
Despite the evolutionary picture you've presented here, there are still objections that people have about meat from a biochemical or epidemiological standpoint. What about T. Colin Campbell's China Study for example?
Good point. Campbell's famous study, to my mind, brings up one of the most unremarked-upon recent conflicts in epidemiological data that has arisen. In his lecture at the 1991 ANHS annual conference, reported on in the national ANHS publication Health Science, Campbell claimed that the China Study data pointed to not just high fat intake, but to the protein in animal food, as increasing cholesterol levels. (High cholesterol levels in the blood are now widely thought by many to be the biggest single factor responsible for increased rates of atherosclerosis--clogged blood vessels--and coronary heart disease.) According to him, the lower the level of animal protein in the diet (not just the lower the level of fat), the lower the cholesterol level in the blood. He believes that animal food is itself the biggest culprit, above and beyond just fat levels in food.[144]

Campbell's conclusions about cholesterol and animal protein are contradicted by evidence from studies of modern hunter-gatherers. Yet as rigorous as the study is proclaimed to be, I have to tell you that Campbell's claim that animal protein by itself is the biggest culprit in raising blood cholesterol is contradicted by studies of modern-day hunter-gatherers who eat considerable amounts of wild game in their diet yet have very low cholesterol levels comparable to those of the China Study. One review of different tribes studied showed low cholesterol levels for the Hadza of 110 mg/dl (eating 20% animal food), San Bushmen 120 (20-37% animal), Aborigines 139 (10-75% animal), and Pygmies at 106, considerably lower than the now-recommended safe level of below 150.[145] Clearly there are unaccounted-for factors at work here yet to be studied sufficiently.

Large and significant differences between domesticated meat vs. wild game. One of them might be the difference in composition between the levels of fat in domesticated meat vs. wild game: on average five times as much for the former as the latter. On top of that, the proportion of saturated fat in domesticated meat compared to wild game is also five times higher.[146] Other differences between these two meat sources are that significant amounts of EPA (an omega-3 fatty acid thought to perhaps help prevent atherosclerosis) are found in wild game (approx. 4% of total fat), while domestic beef, for example, contains almost none.[147] This is important because the higher levels of EPA and other omega-3 fatty acids in wild game help promote a low overall dietary ratio of omega-6 to omega-3 fatty acids for hunter-gatherers--ranging from 1:1 to 4:1--compared to the high 11:1 ratio observed in Western nations. Since omega-6 fatty acids may have a cancer-promoting effect, some investigators are recommending lower ratios of omega-6 to omega-3 in the diet, which would, coincidentally, be much closer to the evolutionary norm.[148] Differences like these may go some way toward explaining the similar blood cholesterol levels and low rates of disease in both the rural Chinese eating a very-low-fat, low-animal-protein diet, and in hunter-gatherers eating a low-fat, high-animal-protein diet. Rural Chinese eat a diet of only 15% fat and 10% protein, with the result that saturated fats contribute only a low 4% of total calories. On the other hand, those hunter-gatherer groups approximating the Paleolithic norm eat diets containing 20-25% fat and 30% protein, yet the contribution of saturated fat to total caloric intake is nevertheless a similarly low 6% of total calories.[149]
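To make the compounding effect of those two fivefold differences concrete, here is a minimal arithmetic sketch in Python. Only the two fivefold ratios come from the figures cited above; the wild-game baseline numbers are placeholders assumed purely for illustration:

    # Sketch only: the 5x total-fat and 5x saturated-proportion figures come
    # from the text above; the wild-game baseline numbers are assumed.
    wild_fat_g = 4.0                     # grams of fat per 100 g wild game (assumed)
    wild_sat_fraction = 0.10             # fraction of that fat which is saturated (assumed)

    domestic_fat_g = wild_fat_g * 5      # domesticated meat: ~5x the total fat
    domestic_sat_fraction = wild_sat_fraction * 5   # and ~5x the saturated proportion

    wild_sat_g = wild_fat_g * wild_sat_fraction               # 0.4 g saturated fat/100 g
    domestic_sat_g = domestic_fat_g * domestic_sat_fraction   # 10.0 g saturated fat/100 g

    print(domestic_sat_g / wild_sat_g)   # -> 25.0: the two ratios compound to ~25x

Whatever the true baseline numbers, the point is that the two ratios multiply, so domesticated meat could deliver on the order of twenty-five times the saturated fat, gram for gram, that wild game does.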

What about the contention that high-protein diets promote calcium loss in bone and therefore contribute to osteoporosis?
The picture here is complex and modern studies have been contradictory. In experimental settings, purified, isolated protein extracts do significantly increase calcium excretion, but the effect of increased protein in natural foods such as meat is smaller or nonexistent.[150] Studies of Eskimos eating an almost all-meat diet[151] (less than 10% plant intake[152]) have shown high rates of osteoporosis, but theirs is a recent historical aberration not typical of the evolutionary Paleolithic diet thought to have averaged 65% plant foods and 35% flesh.* Analyses of numerous skeletons from our Paleolithic ancestors have shown development of high peak bone mass and low rates of bone loss in elderly specimens compared to their Neolithic agricultural successors, whose rates of bone loss increased considerably even though they ate much lower-protein diets.[153] Why, nobody knows for sure, though it is thought that the levels of phosphorus in meat reduce excretion of calcium, and people in Paleolithic times also ate large amounts of fruits and vegetables[154] with an extremely high calcium intake (perhaps 1,800 mg/day compared to an average of 500-800 for Americans today[155]) and led extremely rigorous physical lives, all of which would have encouraged increased bone mass.[156]

Caveats with respect to using modern hunter-gatherers as dietary models
Okay, let's move on to the hunter-gatherers you mentioned earlier. I've heard that while some tribes may have low rates of chronic degenerative disease, others don't, and may also suffer higher rates of infection than we do in the West.
This is true. Not all "hunter-gatherer" tribes of modern times eat diets in line with Paleolithic norms. Aspects of their diets and/or lifestyle can be harmful just as modern-day industrial diets can be. When using these people as comparative models, it's important to remember they are not carbon copies of Paleolithic-era hunter-gatherers.[157] They can be suggestive (the best living examples we have), but they are a mixed bag as "models" for behavior, and it is up to us to keep our thinking caps on. We've already mentioned the Eskimos above as less-than-exemplary models. Another example is the Masai tribe of Africa, who are really more pastoralists (animal herders) than hunter-gatherers. They have low cholesterol levels ranging from 115 to 145,[158] yet autopsies have shown considerable atherosclerosis.[159] Why? Maybe because they deviate from the Paleolithic norm of 20-25% fat intake due to their pastoralist lifestyle by eating a 73% fat diet that includes large amounts of milk from animals in addition to meat and blood.*[160] Our bodies do have certain limits.

But after accounting for tribes like these, why do we see higher rates of mortality from infectious disease among other hunter-gatherers who are eating a better diet and show little incidence of degenerative disease?
There are two major reasons I know of. First, most modern-day tribes have been pushed onto marginal habitats by encroaching civilization.[161] This means they may at times experience nutritional stress resulting from seasonal fluctuations in the food supply (like the Aborigines noted above) during which relatively large amounts of weight are lost while they remain active. The study of "paleopathology" (the study of illnesses in past populations from signs left in the fossil record) shows that similar nutritional stress experienced by some hunter-gatherers of the past was not unknown either, and at times was great enough to have stunted their growth, resulting in "growth arrest lines" in human bone that can be seen under conditions of nutritional deprivation. Such nutritional stress is most likely for hunter-gatherers in environments where either the number of food sources is low (exposing them to the risk of undependable supply), or where food is abundant only seasonally.[162]

Fasting vs. extended nutritional stress/deprivation. Going without food--or fasting under conditions of total rest, as hygienists do as a regenerative/recuperative measure--is one thing, but nutritional stress or deprivation while under continued physical stress is unhealthy and leaves one more susceptible to pathologies including infection.[163] The second potential cause of higher rates of infection is less-controlled sanitary conditions (sanitation being one of the areas where modern civilization is conducive rather than destructive to health), owing to hunter-gatherers having less control over their environment than modern civilizations do. Creatures in the wild are in frequent contact with feces and other breeding grounds for microorganisms, such as rotting fruit and/or carcasses, to which they are exposed by skin breaks and injuries, and so forth.[164]


Animals in the wild on natural diets are not disease-free. Contrary to popular Hygienic myth, animals in the wild eating natural diets in a natural environment are not disease-free, and large infectious viral and bacterial plagues among wild animal populations are known to have occurred in both the past and present. (To cite one example, rinderpest plagues in the African Serengeti occurred in the 1890s and again around 1930, 1960, and 1982 among buffalo, kudu, eland, and wildebeest.[165]) It becomes obvious when you look into studies of wild animals that natural diet combined with living in natural conditions is no guarantee of freedom from disease and/or infection. Chimpanzees, our closest living animal relatives, for instance, can and do suffer bouts in the wild of a spectrum of ailments very similar to those observed in human beings, including pneumonia and other respiratory infections (which occur more often during the cold and rainy season), polio, abscesses, rashes, parasites, diarrhea, even hemorrhoids on occasion.[166] Signs of infectious disease in the fossil record have also been detected in remains as far back as the dinosaur age, as have signs of immune system mechanisms to combat them.[167]

Uninformed naturalism and unrealistic expectations in diet
Pure "naturalism" often overlooks the large positive impact of modern environmental health advantages. One of the conclusions to be drawn from this is that artificial modern conditions are
not all bad where health is concerned. Such conditions as "sanitation" due to hygienic measures, shelter and protection from harsh climatic extremes and physical trauma, professional emergency care after potentially disabling or life-threatening accidents, elimination of the stresses of nomadism, plus protection from seasonal nutritional deprivation due to the modern food system that Westerners like ourselves enjoy today all play larger roles in health and longevity than we realize.[168]

Unrealistic perfectionism leads to heaping inhumanity and guilt on ourselves. Also, I would
hope that the chimp examples above might persuade hygienists not to feel so guilty or inevitably blame themselves when they occasionally fall prey to acute illness. We read of examples in the Natural Hygiene M2M which sometimes seem to elicit an almost palpable sense of relief among others when the conspiracy of silence is broken and they find they aren't the only ones. I think we should resist the tendency to always assume we flubbed the dietary details. In my opinion it is a mistake to believe that enervation need always be seen as simply the instigator of "toxemia" which is then held to always be the incipient cause of any illness. It seems to me you can easily have "enervation" (lowered energy and resistance) without toxemia, and that that in and of itself can be quite enough to upset the body's normal homeostasis ("health") and bring on illness. (Indeed I have personally become ill once or twice during the rebuilding period after lengthy fasts when overworked, a situation in which it would be difficult to blame toxemia as the cause.) The examples of modern-day hunter-gatherers as well as those of chimps should show us that you can eat a healthy natural diet and still suffer from health problems, including infectious disease, due to excessive stresses--what we would call "enervation" in Natural Hygiene.

Health improvements after becoming ex-vegetarian
Ward, we still have some space here to wrap up Part 2. Given the research you've done, how has it changed your own diet and health lifestyle? What are you doing these days, and why?
I would say my diet right now* [late 1996] is somewhere in the neighborhood of about 85% plant and 15% animal, and overall about 60% raw and 40% cooked by volume. A breakdown from a different angle would be that by volume it is, very roughly, about 1/4 fruit, 1/4 starches (grains/potatoes, etc.), 1/4 veggies, and the remaining quarter divided between nuts/seeds and animal products, with more of the latter than the former. Of the animal foods, I would say at least half is flesh (mostly fish, but with occasional fowl or relatively lean red meat thrown in, eaten about 3-5 meals per week), the rest composed of varying amounts of eggs, goat cheese, and yogurt. Although I have to admit I am unsure about the inclusion of dairy products on an evolutionary basis given their late introduction in our history, nevertheless, I do find that the more heavily I am exercising, the more I find myself tending to eat them. To play it safe, what dairy I do eat is in low- or no-lactose cultured forms like goat cheese and yogurt.* Where the grains are concerned, so far I do not experience the kind of sustained energy I like to have for distance running without them, even though I am running less mileage than I used to (20 miles/week now as opposed to 35-40 a few years ago). The other starches such as potatoes, squash, etc., alone just don't seem to provide the energy punch I need. Again, however, I try to be judicious by eating non-gluten-containing grains such as millet, quinoa, or rice, or else use sprouted forms of grains, or breads made from them, that eliminate the gluten otherwise present in wheat, barley, oats, and so forth.*

In general, while I do take the evolutionary picture heavily into account, I also believe it is important to listen to our own bodies and experiment, given the uncertainties that remain. Also, I have to say that I find exercise, rest, and stress management as important as diet in staying energetic, healthy, and avoiding acute episodes of ill-health. Frankly, my experience is that once you reach a certain reasonable level of health improvement based on your dietary disciplines, and things start to level out--but maybe you still aren't where you want to be--most further gains are going to come from paying attention to these other factors, especially today when so many of us are overworked, over-busy, and stressed-out. I think too many people focus too exclusively on diet and then wonder why they aren't getting any further improvements. Diet only gets you so far.

I usually sleep about 8-10 hours a night, and I very much enjoy vigorous exercise, which I find is necessary to help control my blood-sugar levels, which are still a weak spot for me. The optimum amount is important, though. A few years ago I was running every day, totaling 35-40 miles/week and concentrating on hard training for age-group competition, and was more prone to respiratory problems like colds, etc. (not an infrequent complaint of runners). In the last couple of years, I've cut back to every-other-day running totaling roughly 20 miles per week. I still exercise fairly hard, but a bit less intensely than before, I give myself a day of rest in between, and the frequency of colds and so forth is now much lower.

I am sure people will be curious here, Ward: What were some of the improvements you noticed after adding flesh foods to your diet?
Well, although I expected it might take several months to really notice much of anything, one of the first things was that within about 2 to 3 weeks I noticed better recovery after exercise--as a distance runner I was able to run my hard workouts more frequently with fewer rest days or easy workouts in between. I also began sleeping better fairly early on, was not hungry all the time anymore, and maintained weight more easily on lesser volumes of food. Over time, my stools became a bit more well-formed, my sex drive increased somewhat (it usually accompanies better energy levels for me), my nervous system was more stable and not so prone to hyperreactive panic-attack-like instability as before, and in general I found I didn't feel so puny or wilt under stress so easily. Unexpectedly, I also began to notice that my moods had improved and I was more "buoyant." Individually, none of these changes was dramatic, but as a cumulative whole they have made the difference for me. Most of these changes had leveled off after about 4-6 months, I would say.


Something else I ought to mention here, too, was the effect of this dietary change on a visual disturbance I had been having for some years prior to the time I embarked on a disciplined Hygienic program, and which continued unchanged during the two or three years I was on the traditional vegetarian diet of either all-raw or 80% raw/20% cooked. During that time I had been having regular episodes of "spots" in my visual field every week or so, where "snow" (like on a t.v. set) would gradually build up to the point it would almost completely obscure my vision in one eye or the other for a period of about 5 minutes, then gradually fade away after another 5 minutes. As soon as I began including flesh in my diet several times per week, these started decreasing in frequency and over the 3 years since have almost completely disappeared.

What problems are you still working on?
I still have an ongoing tussle with sugar-sensitivity due to the huge amounts of soft drinks I used to consume, and have to eat fruits conservatively. I also notice that I still do not hold up under stress and the occasional long hours of work as well as I think I ought to, even though it's better than before. Reducing stress and trying not to do so much in today's world is an area I really pay attention to and try to stay on top of. My own personal experience has been that no matter what kind of variation of the Hygienic or evolutionary "Paleolithic" diet I have tried so far, excessive prolonged stress of one sort or another (whether physical or mental) is a more powerful factor than lapses in diet in bringing on symptoms of illness, assuming of course that one is consistently eating well most of the time. And Chet, this brings up something I also want to emphasize: Just as you've freely mentioned about yourself here in H&B on numerous occasions, I'm not perfect in my food or health habits, and I don't intend to set myself up as some sort of example for anyone. Like anyone else, I'm a fallible human being. I still have the occasional extra-cheese pizza or frozen yogurt or creamy lasagna or whatever as treats, for example. Not that I consider having those on occasion to be huge sins or anything. I stick to my intended diet most of the time, but I don't beat myself up for the indiscretions. I would hope other people don't beat themselves up for it either, and that we can all be more forgiving of each other than that.

Uncertainties about earliest use of fire for cooking

*** "Given the evidence available at this time, most of it would probably indicate that 125,000 years ago is the earliest reasonable estimate for widespread control." *** "...given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked." *** "...it is this advantage in expanding the range of utilizable foods in an uncertain environment that was the evolutionary advantage that helped bring cooking about and enhanced survival."
In hindsight here, it is worth pointing out that the above estimate of 125,000 years ago for widespread cooking is not based on very much evidence--nor, for that matter, is anything about the earliest origins of fire and cooking based on very much evidence. Much of what is inferred about the earliest occurrences of fire use is based on considerable deduction applied to very limited evidence (unlike the considerably more well-established, consistent, and converging lines of evidence for early meat consumption). There are some who believe a more reasonable estimate might be more like 40,000-60,000 years ago for widespread use of fire, based on the wider distribution of hearths, but in my conversations with those perhaps more familiar with the paleontological evidence for cooking and fire than I am, it still seems apparent that probably nobody knows for sure given the current state of the paleontological findings. While some believe the rarity of evidence such as hearths early on after control of fire points toward the early use of fire mostly for heating and protection from predators rather than cooking, discussions on the internet's PALEODIET listgroup have brought up the point that some modern hunter-gatherers in fact have been known on occasion to use cooking techniques that require neither hearths nor cooking vessels that would leave behind fossil evidence. And though modern hunter-gatherers have evolved behaviorally since ancient ones, this suggests it may not be out of the realm of possibility that methods other than hearths could have been used to cook with fire.

*** "...the most interesting of these [stages of sequence for control] is that fire for cooking would almost inevitably have been one of the first uses it was put to by humans, rather than some later-stage use."
Again, I would now be more cautious in inferring this (my earlier inference about this was based on Johan Goudsblom's 1992 book, Fire and Civilization), after having been privy to further discussions with Paleodiet researchers. As the preceding paragraph mentions, it is possible the stage of use for warmth and protection from predators may not have so quickly progressed to the stage of using fire for cooking. But no one really seems to know one way or the other for sure from what I can tell.

*** "A brief look at the Australian Aborigines [who utilize cooking for survival purposes] might be illustrative here."
Elsewhere I believe studies of other hunter-gatherer tribes show that many of them cook about half their food. Again, whether this particular behavior of modern hunter-gatherers is representative of behavior in the more-distant evolutionary past should perhaps be regarded with caution.

Incompatibilities between dairy consumption and human physiology

*** "...for our purposes here, the example [of lactose tolerance having developed within 1,150 years in some segments of the population] does powerfully illustrate that genetic adaptations for digestive changes can take place with much more rapidity than was perhaps previously thought."
The estimate of 1,150 years is from the Cavalli-Sforza data. A somewhat more conservative estimate based on the prevalence of lactose tolerance in those of Northern European extraction is that the gene for adult lactose tolerance would have increased from 5% to 70% prevalence within about 5,000 years (approx. 250 generations). [Aoki 1991]
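For readers curious what kind of selective advantage the Aoki figure implies, here is a minimal back-of-the-envelope sketch in Python. This is my own illustration, not Aoki's model; it uses a standard haploid selection recurrence as a deliberate simplification, and only the 5%, 70%, and 250-generation figures come from the text above:

    # Sketch: what per-generation selective advantage s would carry an allele
    # from 5% to 70% prevalence in ~250 generations? (Haploid model, assumed.)
    def freq_after(s, p0=0.05, generations=250):
        """Allele frequency after repeated rounds of haploid selection."""
        p = p0
        for _ in range(generations):
            p = p * (1 + s) / (1 + s * p)  # standard haploid selection recurrence
        return p

    # Bisect on s until the final frequency reaches the 70% target.
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if freq_after(mid) < 0.70:
            lo = mid
        else:
            hi = mid

    print(f"required selection coefficient: s ~= {lo:.4f} per generation")
    # -> roughly 0.015, i.e. about a 1.5% fitness advantage per generation

Under these simplified assumptions, even a modest fitness advantage of around 1.5% per generation is enough to produce the rise in prevalence Aoki describes, which helps explain how cultural selection pressures can move gene frequencies so much faster than the background rates discussed earlier.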

Genetic changes due to "neoteny" (such as adult lactose tolerance) not indicative of overall rates of adaptation. Even while these data for relatively quick evolutionary changes resulting in adult lactase production remain essentially true, however, an important point that should be clarified is that the gene for lactase production is already present and expressed in all humans (for breastfeeding) up through the time of weaning. Therefore, for lactase production to continue into adulthood would require only relatively small changes in the gene, e.g., via the process known as "neotenization" (the retention of juvenile traits into adulthood). Thus, "brand-new" traits, so to speak--unlike polymorphisms such as the gene for lactase production, which already exist (even if not in a form previously expressed in adults)--would take much longer to evolve.

Additional indications of incongruence between dairy and human physiology. Further, beyond the question of lactose tolerance, I have since learned there would be many additional genetic changes required (beyond just that for lactose tolerance) to result in more complete adaptation to milk consumption. A number of recent studies demonstrate problems of milk consumption that go considerably beyond whether or not a person is capable of handling lactose:

• Lactose and heart disease. One is that lactose itself is a risk factor for heart disease, since in appreciable quantities it induces copper deficiency which, in turn, can lead through additional mechanisms to heart pathologies and mortality, as observed in lab animals.

• Poor Ca:Mg ratio which can skew overall dietary ratio. Another problem is the calcium-to-magnesium ratio of dairy products of approximately 12:1, which is directly at odds with the ratio of 1:1 from a Paleolithic diet composed of meats, fruits, and vegetables. Depending on the amount of milk in the diet, the resulting overall dietary ratio can go as high as 4 or 5:1 (see the rough arithmetic sketch after this list). This high ratio leads to reduced magnesium stores, which have the additional ramification of increasing the risk of coronary heart disease, since magnesium helps to lower levels of blood lipids (cholesterol), lower the potential for cardiac arrhythmias, lower the oxidation of LDL and VLDL cholesterol (oxidation of cholesterol has been linked to atherosclerosis), and prevent hyperinsulinism. (More about hyperinsulinism shortly below.)

• Saturated fat. Milk has also been linked to coronary heart disease because of its very high saturated fat content.

• Molecular mimicry/autoimmune response issues. Additionally, autoimmune responses are being increasingly recognized as a factor in the development of atherosclerosis. In relation to this, research has shown milk to cause exceptionally high production of certain antibodies which cross-react with some of the body's own tissues (an autoimmune response), specifically an immune response directed against the lining of the blood vessels. This process is thought to lead to atherosclerotic lesions, the first step that paves the way for consequent buildup of plaque.

[See Part 2 of Loren Cordain, Ph.D.'s posting of 10/9/97 to the PALEODIET list (relayed to the list and posted by Dean Esmay) for details and references relating to the above points about dairy consumption.]
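To illustrate the mixing arithmetic behind the Ca:Mg point above, here is a rough sketch in Python. Only the ~12:1 dairy ratio and the ~1:1 Paleolithic base ratio come from the discussion; the milligram intakes are assumed purely for illustration:

    # Assumed intakes; only the 12:1 (dairy) and 1:1 (base diet) ratios
    # come from the discussion above.
    base_ca, base_mg = 400.0, 400.0   # Ca and Mg (mg/day) from a 1:1 base diet (assumed)
    dairy_mg = 100.0                  # magnesium contributed by dairy (assumed)
    dairy_ca = dairy_mg * 12          # dairy's ~12:1 calcium-to-magnesium ratio

    overall_ratio = (base_ca + dairy_ca) / (base_mg + dairy_mg)
    print(f"overall Ca:Mg ~= {overall_ratio:.1f}:1")   # -> 3.2:1 for this mix

With heavier milk consumption the dairy terms dominate the totals, and the overall ratio climbs toward the 4 or 5:1 figure mentioned above.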

Signs of evolutionary mismatch between grains and human physiology

*** "Nobody yet, at least so far as I can tell, really knows whether or not the observed genetic changes relating to the spread of milk-drinking and grain-consumption are enough to confer a reasonable level of adaptation to these foods among populations who have the genetic changes, and the picture seems mixed."
The most succinct addendum to this assessment is: Not any more, as we have seen above with milk. Where grains are concerned, there are several similar problems which I have since learned have been uncovered. To list a few:

• Certain wheat peptides appear to significantly increase the risk of diabetes through molecular mimicry of the body's own tissues, leading to autoimmune responses destructive of the cells that produce insulin. [See Loren Cordain, Ph.D.'s post of 6/23/97 on the PALEODIET listgroup for reference citations, and also his article on this site about the evolutionary discordance of grains and legumes in the human diet, for details.]

• Increasing amounts of research suggest that celiac disease is probably also caused by autoimmune responses generated through molecular mimicry by certain peptides in wheat and other grains (known collectively as "glutens"). Additional studies on autoimmune problems have led some researchers to believe numerous additional chronic conditions are also traceable to autoimmune responses generated by the glutens in grains. [See Ron Hoggan's various postings in the archives of the PALEODIET listgroup for information and reference citations about this.]

• It is well documented that the phytates in grains bind the minerals iron, zinc, magnesium, and calcium, which can impair bone growth and metabolism, among other problems. Antinutrients in grains also negatively affect vitamin D absorption, which can lead to rickets with sufficient levels of intake. Grain consumption also generates biotin deficiencies in experimental animal models--the lack of which impairs fatty acid synthesis. [See Loren Cordain's post of 10/1/97 on PALEODIET for a brief summary and references pertaining to the preceding points, as well as the article on the evolutionary discordance of grains and legumes mentioned in the first bullet point just above.]

• Hyperinsulinism and excess carbohydrate consumption. Lastly, and most importantly, significant amounts of grain consumption considerably increase the carbohydrate intake of the diet, and excessive carbohydrate consumption is the primary factor driving what has come to be known as the hyperinsulinism syndrome. Hyperinsulinism and its associated constellation of resulting symptoms, collectively known as "Syndrome X," is not yet well-accepted in the medical and mainstream nutritional communities, but recent research has been increasingly pointing in its direction as a potential underlying factor in the development of many "diseases of civilization," which may be linked together via hyperinsulinism as a common cause.

The etiology of hyperinsulinism leading to the symptomology of Syndrome X is as follows:
• All carbohydrates, from whatever source (natural or artificial), whether simple or complex, are ultimately broken down into glucose, or blood sugar. (And remember here that grains contribute the largest percentage of carbohydrates in most modern diets.)

• For glucose to be taken up by the cells of the body and used as fuel, the hormone insulin must be secreted by the pancreas.

• When chronically excessive levels of carbohydrates are eaten, insulin is overproduced, and the body eventually becomes dulled--more unresponsive--to insulin. This can become a vicious circle: Since the body is dulled to insulin, more has to be produced, which causes further dulling of sensitivity, leading the body to produce even more. (A toy numerical sketch of this feedback loop follows below.)
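As a purely illustrative toy model--not a physiological simulation, and with all constants invented--the feedback loop in the last bullet point can be sketched in a few lines of Python:

    # Toy sketch of the vicious circle: chronic excess carbohydrate ->
    # more insulin required -> sensitivity dulled -> still more insulin required.
    # All constants are invented for illustration only.
    sensitivity = 1.0                  # relative insulin sensitivity (1.0 = baseline)
    for year in range(1, 6):
        glucose_load = 1.5             # chronic intake 50% above baseline (assumed)
        insulin_needed = glucose_load / sensitivity      # output rises as sensitivity falls
        sensitivity *= 1 - 0.05 * (insulin_needed - 1.0) # excess insulin dulls the response
        print(f"year {year}: insulin ~{insulin_needed:.2f}x baseline, "
              f"sensitivity {sensitivity:.2f}")

Each pass through the loop requires slightly more insulin than the last, which is the self-reinforcing pattern the bullet describes.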

Biomarkers indicating hyperinsulinism. High levels of insulin have been correlated with high blood pressure, high cholesterol, and high triglycerides. If the relationship is causative, then by extension hyperinsulinism would also presumably be causative of the health problems that these symptoms themselves lead to (i.e., heart disease, etc.). High levels of insulin also lead to obesity, since in addition to enabling glucose to be used as fuel, insulin also promotes fat storage, while inhibiting the burning of fat.

Hyperinsulinism and diabetes. The foregoing constellation of symptoms constitutes what has come to
be called "Syndrome X." The extreme end of Syndrome X is probably Type II diabetes, in which the body has become so insulin-resistant it can no longer control its blood sugar levels, or even Type I diabetes, in which the pancreas overloads and is no longer able to produce insulin at all.

Recent studies indicate diets higher in protein reduce symptoms of Syndrome X. Dovetailing with this research on Syndrome X is the finding that diets higher in protein--or diets which lower carbohydrate levels by substituting animal protein--improve blood lipid (cholesterol) profiles by lowering LDL, VLDL (the "bad cholesterols"), and total cholesterol, while increasing HDL ("good cholesterol") and improving other symptoms of Syndrome X. Conversely, repeated studies are showing that the low-fat, high-carbohydrate diets popular today do the opposite. [See Loren Cordain, Ph.D.'s posting of 3/26/97 to the PALEODIET list, relayed to the list by Dean Esmay, for reference citations on this.] Note particularly that these dietary changes which improve hyperinsulinism parallel the macronutrient composition that would have prevailed in the evolutionary diet of humans during Paleolithic times (i.e., fairly low carb intake combined with relatively higher levels of fat and protein due to the prevalence of animal flesh in the diet, and more limited availability of fruits with high sugar content).

Why do cooked or denser foods often improve raw/vegan health?
*** "If it is true that most people do better with the inclusion of some of these cooked items in their diet that we've mentioned--and I believe that it is, based on everything I have seen and heard..."
In retrospect, I want to make clear here that this was true of the vegetarians I was in contact with during the time I ran the Natural Hygiene M2M (and others since). My opinion at this point is that the improved health observed in most otherwise raw-foodist vegetarians who later include some amount of cooked food in their diet occurs because cooking simply allows additional foods to be eaten that broaden the diet. Just as importantly, it also increases the nutrient density of the diet by making available more concentrated nutrition in an otherwise restricted spectrum of nutrient intake brought about by eating only the bulkier, less-dense foods that compose most vegetarian fare that is edible raw.

For those who do not thrive on raw vegan diets, do the benefits experienced from grains/dairy outweigh any downsides? With the previously discussed evidence that grain and milk
consumption can carry with them certain health ramifications, an interesting question presents itself. Considering our earlier observation above that vegetarians on raw-food diets often experience improvement when adding dairy products, or cooked items to their diet that allow more concentrated and nutrient-dense foods such as grains or legumes, one now has to ask: For vegetarians who otherwise refuse animal products such as meat--which is the primary Paleolithic adaptation to foods of animal origin--and eat a diet whose macronutrient content (higher carb and lower protein) is out of line with the evolutionary past, how much do the possible benefits of getting a wider spectrum of more concentrated nutritional intake in consuming grains, legumes, and/or milk or milk by-products outweigh their disadvantages? Certainly in the short run it seems to help many vegetarians, based on anecdotal observations, especially those we observed in the Natural Hygiene M2M. (There will be more about this in Part 3 of the interview.)

Long-term concerns. In the long term, however, it appears from the recent evidence coming in, especially regarding hyperinsulinism, that there is a price to pay. While Americans have only been experimenting with vegetarianism in larger numbers since the 1960s and 1970s, epidemiological studies of populations in Southeast Asia where cultures have lived on grain- and legume-based diets for centuries (often supplemented by dairy)--diets not so different in character from many vegetarian diets--show high rates of heart disease and widespread nutritional deficiencies. That we saw numerous health problems in the Natural Hygiene M2M that were often called "detox," but would appear to clinicians more like deficiencies, adds at least some anecdotal support that similar results might prevail among vegetarians. (There is unfortunately a lack of any controlled studies of predominantly raw-foodists who also include some dairy and/or grain/legume products in their diets.)

Mitigating circumstances. Two observations worth noting that cloud the picture a bit, however, are the following:

• Noteworthy difference between lacto-vegetarian subpopulations and raw-foodists adding grains/dairy. There is a primary difference between present-day raw-foodists (or predominantly raw-foodists) who supplement the diet with dairy and grains vs. those traditional societies eating diets similar to (though not strictly the same as) lacto-vegetarianism. And that is the huge focus that raw eaters put on large amounts of fresh fruits and vegetables. Given the protective effects against disease that research has been showing consumption of fresh fruits/veggies confers, this point of distinction between the two groups would lead one to expect there may be interesting differences in results worth further consideration or study.

• For previous raw-foodists, the supplemental amounts are usually relatively modest. Secondly, those who may otherwise be predominantly raw eaters supplementing their diet with grains and dairy usually do so in modest amounts. How much their restricted intake of these problematic foods might mitigate what detrimental fallout there is from eating them would also be an interesting question to pursue.

*** "...the more interesting and more pressing question, to my mind, is not whether we are adapted to cooking of certain foods, which seems very likely, but how much we have adapted to the dietary changes since the Neolithic agricultural transition, given the 10,000 years or less it's been underway." Information about cooking's ultimate impact on health at the biochemical level of detail is still inconclusive. While as we've noted above, the detrimental repercussions of grains on health have
now been implicated in a number of ways, the picture about cooking is more unclear than I had thought at the time of the interview. And this is not only due to the uncertainties over when cooking began. As I have since found, there have been no "review" papers summing up the research on cooking in modern food science studies that have taken a unified look at the subject, and what other studies there are, are fragmented and scattered widely. (But see Looking at the Science on Raw vs. Cooked Foods on this site, as a first attempt to remedy this lack.) This lack of unified study on the subject shows, in one way, just how new the evolutionary perspectives embodied in Paleodiet research are to the field of mainstream nutrition. (Though of course, the question of raw vs. cooked foods has been around in modern times in the alternative health movement, particularly within Natural Hygiene, since probably at least the mid-1800s.)

Big picture is more clear: Impact of cooking is likely to be much less important than other overarching considerations. One interesting observation here, however, which may serve to put the cooking question in more realistic perspective, is that the studies that have been performed of modern hunter-gatherers (many of whom seem to cook about half their food) show them to be probably among the most free of chronic degenerative diseases of all peoples on the planet. This suggests that whatever negative (or positive) effects cooking may or may not have, it probably just does not play a very large role in adding to, or mitigating, the overall health factors that determine freedom from the degenerative diseases of civilization.

Magnitude of effect from macronutrient ratios likely plays the most influential role. Indeed, as we outlined above, more and more Paleodiet-relevant research seems to be showing that it is the overall macronutrient profile of a diet that matters most, in terms of:

• Types of fats, and their ratios and sources (considerations that seem to be more important than former analyses emphasizing simply the total amount of fat), and

• Factors that may precipitate Syndrome X. Just as crucially, the balance between proteins and carbohydrates in avoiding hyperinsulinism. (As we have seen, substituting more protein for carbohydrate increases the "good" HDL cholesterol while lowering the "bad" LDL cholesterol, and significantly reduces risk for other symptoms of Syndrome X.)

It seems likely that these factors, plus avoiding known non-evolutionary foods to the degree possible, play the largest role in health, overshadowing what effect cooking may have. (Within reason, of course: There is evidence that overcooking to the point of charring should definitely be avoided, since taking it that far produces carcinogenic by-products in fats/proteins such as meat. At the same time, however, it should be remembered that even raw, uncooked plant foods contain a certain natural level of mutagens and carcinogens. Again, these two points about cooking, taken together with the fact that cooking can also eliminate toxic antinutrients, suggest that, as was pointed out in Part 2 of the interview itself, the issue of potential toxic by-products from cooking is far from a clear or cut-and-dried one.)

Eating all natural foods or all-raw by itself does not automatically result in a prudent diet. The upshot is that the obsession with a total raw-food diet in some sectors of the vegetarian community--while certainly not bad in the strictly technical sense--is considerably overhyped, assuming, that is, that, say, half the diet is raw to begin with (particularly getting sufficient amounts of fresh vegetables and/or fruits). It is becoming more clear that the idea among vegetarians that if a person just eats all natural or all raw foods they'll be okay ignores important factors, such as:

• Which foods are, in fact, the most natural for humans (certainly not just vegetarian ones--evolution shows meat is important as well);

• Which natural foods can be important to minimize or avoid (i.e., legumes, grains, dairy--if they turn out to be problematic--for those who otherwise consider them natural); and

• What balance of macronutrients, whether raw or cooked, best approximates the ancestral human diet and/or results in the best mix of nutrients to support health over the long term. (Using this standard for comparison, a totally raw vegetarian diet may be too low in protein (it is often too low in calories to maintain weight for many individuals without great effort), and markedly overabundant in carbohydrates, which, in spite of eating an "all-natural" diet or an all-raw diet, can still lead to long-term health problems due to possible hyperinsulinemia, particularly if the diet is too high in fruits.)

*** "Especially where people are avoiding flesh products which is our primary animal food adaptation, these animal by-products [cheese, eggs] may be helpful [for some vegetarians]..."
Again, while this may be true enough in the short term, the long term is a different story, although eggs are something of an exception and should not be lumped in with dairy when analyzing their effects. While eggs are native to the human diet, however, they would have been consumed only seasonally and in limited quantities, not daily or year-round as people eat them today. And while the nutritional content of eggs is quite high and they contain complete protein of high bioavailability, they also contain the antinutrients avidin and conalbumin in the egg-white, the former of which inhibits biotin and other B vitamins, the latter of which binds iron. Egg-white is also allergenic for some individuals. Thus not everyone can benefit from eggs, and those who can should still keep in mind that while eggs are natural and a food early humans would have eaten, they would only have been available in limited amounts.

*** "This [differing adaptation between population groups to Neolithic practices begun 10,000 years ago, specifically grain consumption] means it is going to be more likely right now in this particular historical time period that individuals will be somewhat different in their responses to diet. And as we saw above (with the two genes ACE and apolipoproteinB) these genetic differences may even confound attempts to replicate epidemiological dietary studies from one population to another unless these factors are taken into account."
While this is of course true, the fact that most of the detrimental repercussions of grains that we have outlined above hold across all population groups implies that--whatever genetic advantages there may be within certain groups--they have apparently so far conferred only minimal (physiological) adaptive advantages, which, overall, fall well short of complete genetic adaptation. Note, however, on the other hand, that grains have conferred considerable cultural selective advantages. That is, they have built the agricultural base that has enabled the rise of settled, hierarchical civilizations, which support the social stratifications and specialized pursuits that have given rise to the huge technological advances since then. The problem here, of course, where physical health is concerned, is that genetic (physiological) adaptation always lags behind cultural selection in the behavior/culture evolutionary feedback loop (see the discussion of the feedback loop between evolution and culture elsewhere on the site for more information on this evolutionary process). Even assuming that civilizations worldwide were to remain based on grains indefinitely into the future, it would still probably be many more thousands, perhaps tens of thousands, of years before a fuller physiological adaptation could take place.

*** "In experimental settings, purified, isolated protein extracts do significantly increase calcium excretion, but the effect of increased protein in natural foods such as meat is smaller or nonexistent."
A recent posting about this concern on the PALEODIET list by Staffan Lindeberg, M.D., Ph.D., citing more studies than I had access to at the time of the interview [see next point for title and date of post], indicates that, taken as a whole, studies on calcium excretion due to increased protein intake do in fact indicate animal protein has a more pronounced effect than other sources. (This is due primarily to the sulfates contained in animal protein [Breslau et al. 1988, as cited in Barzel and Massey 1998].)

Paradox of high bone mass in pre-agricultural skeletons despite large animal protein intake.
As stated in the body of the H&B interview, however--and as Lindeberg implies in his remarks--the debate as to animal protein's potential role in osteoporosis continues since it seems that hunter-gatherers in equatorial and temperate zones (excluding those in arctic regions such as Eskimos eating the highest levels of animal protein) have good bone mass parameters. (Archaeological specimens prior to the Neolithic certainly substantiate the picture of robust skeletal development in primitive hunter-gatherers.)

This would point to compensating factors in the complete picture of a Paleolithic diet that may render the protein/calcium-loss issue something of a non-concern when the diet as a whole is assessed. The physiological mechanisms that lead to this net result, however, have not been fully worked out by researchers, although the role of a number of other factors besides protein that also influence calcium balance has become more clear in recent years. (For a more in-depth discussion of the paradox of high bone mass in Paleolithic skeletons despite high protein intake, see "Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?" elsewhere on this website.)

*** "Studies of Eskimos have shown high rates of osteoporosis eating an almost all-meat diet (less than 10% plant intake) but theirs is a recent historical aberration not typical of the evolutionary Paleolithic diet thought to have averaged 65% plant foods and 35% flesh."
I did not know at the time of making this statement that the modern studies showing Eskimos to have high rates of osteoporosis are misleading because, in this case, the more recent epidemiological studies were performed with Eskimos already partway along the road to Western habits and diets, which has not been generally noted. Thus these particular studies are not reliable indicators of the effect of the native Eskimo diet, for which one has to go back to observational reports from the early part of this century and before. Unfortunately, information on the Eskimos from this era is limited to observational reports, from which no rigorous physiological data is apparently available. On the other hand, studies of prehistoric Eskimo skeletons do show some degree of osteoporosis, although how much may be due to diet vs. other lifestyle factors such as reduced sunlight exposure, compression fractures due to traumatic vibrations from extensive sledding, etc., is not known. The rate of osteoporosis is increased in prehistoric Eskimo populations living furthest north, and among these furthest-north-living Eskimos the rate increases as one moves east from Alaska, across Canada, and into Greenland. [See the post of 7/29/97 on the PALEODIET list titled "Osteoporosis in Eskimos" by Staffan Lindeberg, M.D., Ph.D. for an extensive discussion plus scientific references.]

*** "W.N.: I would say my diet right now is somewhere in the neighborhood of about..."
It should go without saying that, given the above updates to the Paleodiet research discussed here, I have made attempts to bring my current diet more in line with what is now known, although I certainly don't claim to be perfect in my habits, and recommend people make up their own minds about what to do. There is also the problem that economics, as well as current agricultural practices in how animals are raised, slaughtered, processed, and made available for sale (the dearth of available organ meats is one example), introduces difficulties in approximating a true Paleolithic diet today.

*** "...low- or no-lactose cultured forms like goat cheese and yogurt."
While cultured forms of milk do indeed have little or no remaining lactose, the other drawbacks of milk products outlined in the postscript above would in all likelihood also apply to goat dairy, even if the nutritional profile of goat's milk may be somewhat closer to human milk than is cow's milk, as is often asserted. [See Loren Cordain's post of 10/15/97 on the PALEODIET list for a brief summary of what is currently known and unknown about the nutritional composition of goat's milk compared to cow's milk.]

*** "...sprouted forms of grains, or breads made from them, that eliminate the gluten otherwise present in wheat, barley, oats, and so forth."
Although I have not been able to confirm the following to my complete satisfaction with documentation, sprouting of grains is probably just as important for deactivating the antinutrients (presumably phytates) they contain. Whether sprouting is as effective in reducing the gluten content as sprouting enthusiasts believe is something I am no longer so sure about. For either process, however, much undoubtedly depends on how many hours or days the sprouts are allowed to germinate, and what stage the growth process is allowed to reach before consumption.

Problem: Finding unsanitized reports about the full spectrum of real-world results with vegetarian diets
Health & Beyond: Ward, I briefly described what the M2M is in the first part of our interview, but why don't you recap it for us here?
Ward Nicholson: Sure. The term "M2M" is an acronym that stands for "many-to-many," and is something I sometimes refer to as "group mail" because people can get pretty passionate about it. It's like a huge penpal group, except it's operated as a newsletter. Ours is published bi-monthly, containing letters from participants that they send to me as its Coordinator. Normally, you might call me the "Editor," except one of an M2M's operating principles is that we agree to print what everybody says exactly as they send it in, with no editing other than setting a page limit for each person's letters. So an M2M is a free-speech kind of thing. Each issue, we set an optional topic to be discussed, usually relating to health or Natural Hygiene, but it's only a suggestion, and people can ignore it if they want and talk about anything--which, believe me, they sometimes do! We even encourage this, because sometimes you can glean more interesting insights from what people say that is supposedly "off topic," or only indirectly related, than from what they say directly "on topic." The conversation that develops between everyone is this amorphous thing that continues to change in an organic sort of fashion, shifting with the sands of who joins and drops out, what happens in the members' lives, and what things are said that hit a nerve with folks.

How many people are in the M2M?
The membership has hovered around 30 to 50 "active participants" (those who agree to write in letters regularly) for the last few years, plus about 10 to 20 "read-only" subscribers, some of whom become active themselves after seeing what goes on. These numbers are about where we have to keep things for logistical reasons, to keep the M2M from getting unwieldy. But it's also a kind of revolving door, since usually a couple of people join and a couple of people tend to drop out every issue or so, with a certain stable core of longer-term members. This is just a rough guess, but I would say in the four years the M2M has been in operation, overall we have had maybe 75, perhaps as many as 100 people write in letters to the M2M telling something about their experiences with Hygiene. The real value, though, is you get to know the people who stay on-board for some length of time on an ongoing basis, and in considerable depth, which gives you much better insight into their Hygienic or other vegetarian or "radical diet" experiences and experiments, and the predispositions in their thinking, and thus a basis for judging what is really going on with people.

I would not claim that the M2M is a scientific sample. However, I do think it is a unique forum for getting at the truth of people's dietary and health experiences, because:

• Those who write in are enthusiastic about Hygiene or vegetarianism or other "natural food" diets based on some kind of first principles, so you know they are trying hard.

• Their words are unfiltered and unedited, which I believe gives a much better glimpse of the truth of the rank-and-file's results than what we get from the public statements made by the ANHS (the American Natural Hygiene Society, the national organizing body) and the IAHP (the International Association of Hygienic Physicians), who have an image to uphold.

• The ability to cross-examine people tends to elicit the truth from them over time, forcing a certain honesty on those who don't drop out.

• And I think you can pretty reliably assume that even where people are not being completely honest about their level of health and behavior--since people tend to present even their failings in the best light--if they have problems to discuss, they are really experiencing them.

I do think, therefore, this allows you to draw some pretty fair conclusions about what is truly going on with a range of Hygienic vegetarians in the real world, especially in the absence of any scientific studies.

So why did you start the M2M and what have you learned from it?
My original idea in doing the M2M was just to talk directly to "the horse's mouth" so to speak--to bypass "official" channels of information--so I could find out what was really happening with Hygienists in the real world. I learned plenty about this, of course. However, what surprised me--at first, anyway--is that I have learned far more instead about what I call "the psychology of idealistic diets." I discovered that a sizable proportion of Hygienists are experiencing problems and disappointing results--even pronounced problems--on the diet, often in spite of adhering faithfully to the Hygienic program over considerable periods of time (many months or a number of years).

Not everyone does well on raw and/or vegan diets ("failure to thrive"). So the first thing I would
say here to set the stage for discussing the psychology of an idealistic health and dietary system is that in Natural Hygiene's case, while it certainly works well for some people, it just doesn't for others--regardless, apparently, of the time span. Yet you don't hear about this in mainstream Hygiene--not from the ANHS or in the books and magazines that are sold, anyway. There is a real see-no-evil, hear-no-evil, speak-no-evil syndrome. (I have also been told by a Hygienist not connected with the M2M, who has had contact with higher-ups within the ANHS, that they are aware of some of these failings, but simply choose not to mention them.) If problems are acknowledged, they are attributed to lack of discipline in following the program, or to screwing up the details, or to not giving things enough time, or to focusing only on diet (even when, in fact, the person isn't) to the exclusion of other health factors. But never are they attributed to the idea that Natural Hygiene could be incomplete in some way. I know that I myself found this hard to believe initially. Having read many of the Hygienic books, including those by Herbert Shelton (whom I still have a tremendous amount of respect for as a great synthesizer and original thinker in many areas), which were so very convincing as to the wonderful results that were gotten, I thought here at last was a system whose logic and results both showed it to work. And the system was so logical and seemingly all-encompassing as laid out by Shelton--how could it not work for everyone?


The controversies that arise over dietary failures provide an entry point for observing the psychology behind idealistic diets. Eventually, however, I could no longer deny the obvious, and most
who have been with the M2M any length of time will tell you this is one of the more eye-opening things about participating in it and being exposed to lots of different people: There really are numerous problem cases on the Hygienic system of health and diet. Once faced with this, you have to come to terms with it one way or the other. And of course, it is in how people explain the problems where all the controversy lies. We will get into how people react to this state of affairs in depth, but for now I just want to put this on the table here for people to think about.

The attractions and pitfalls of purist black-and-white dietary philosophies
I take it, then, you have some other observations to introduce before we proceed further?
Yes. Let's back up at this point and set the context for why people get into Natural Hygiene or any other dietary system in the first place.

PROBLEM #1: Confusion and contradiction in the marketplace of dietary ideas and research. I think most of us have experienced before we got into Natural Hygiene how confusing the
dietary world is these days. Everyone has something different to say, and even the scientific studies on nutrition seem to contradict themselves every few years. About the only things anyone seems to agree on right now are that lots of fresh fruits and vegetables are good, and too much fat in the diet is bad--and some people don't even agree about the details of the latter. On most every other topic there is divisive controversy. Most people are confused. There is nothing that leads to a good argument like food these days. Of course, maybe it's always been that way, but in any event, confusion and contradiction about diet are rampant and have been for a long time.

• THE APPEAL: The psychological attractions of emotional certainty. So the first thing to look at here is that people strongly crave a lot more certainty in this area. This is important from the psychological standpoint: It means there is going to be appeal in systems that make everything very simple, very black-and-white, systems that offer some sort of key litmus test by which you can assess everything and reduce it all to fundamental, delineated concepts, hopefully with few ambiguous gray areas--even in spite of the fact the real world may turn out to be far more complex.

• THE PITFALL: Oversimplifications and illusory truths. Now there is nothing inherently wrong with desiring more certainty in and of itself. The problem comes in when the illusion of absolute certainty is so strongly desired, and people become so sure of themselves, that everything becomes oversimplified and things start getting left out of the equation. We'll talk more about this later too.

PROBLEM #2: Feeling powerless over microbes, genetics, or out-of-control authoritarian health-care. Another part of the psychological climate to look at here is that with the
dominant/submissive doctor/patient relationship that has characterized the Western medical approach for decades if not centuries, people don't feel in control of their health. They feel at the mercy of unknown forces--not only doctors, but microbes, inherited genetics, and so forth.

• THE APPEAL: Giving people tools to regain individual control over their own health. So here, there is an inherent appeal in dietary systems that give back control to people by putting tools and methods in their hands that give them more power over their own health. Again, this is good, of course.


• THE PITFALL: Perceptual reframing of health factors can blind people to real dangers. But on the downside, there can be the danger of a false sense of control that occurs when people are not as much in control as they think they are, or think they know more than they do. This can cause one to view health or disease symptoms through perceptual filters that distort one's vision of what may actually be going on, which can get people into real trouble. We see this in the case of Natural Hygiene, where all symptoms tend to be viewed as "detox," blinding people to other possibilities that may go unaddressed as their condition worsens while the person thinks they are getting better. This we will also discuss in depth.

PROBLEM #3: Thoughtful non-authoritarianism vs. mere reactionary rebellion.

• THE APPEAL: Casting off outmoded or oppressive authority, and thinking for oneself. Another psychological backdrop for the appeal of Natural Hygiene and other give-control-back-to-the-people approaches is that they are anti-authoritarian. There is an inherent and legitimate appeal in systems of thought that take power away from oppressive systems like what much of our current modern health-care system has become. I would hope most of us laud this trend.

• THE PITFALL: Myopia and self-delusion. However, there is also a danger here, in that if one becomes so reactionary that they begin not to listen to feedback from others, they risk becoming lost in an emotionally reactive, subjective world of their own. Feedback is the only corrective to self-delusion, and the feedback of information outside your own "self-certain" or myopic thinking processes can be vital if your world is not to become solely self-oriented to the point you lose your bearings.

Okay, this gives us some insights into what psychological factors are appealing to people about a dietary system like Natural Hygiene. However, people also usually get into health foods because of specific health problems. Why Hygiene and not some other system?
Well, this is something I have done a lot of thinking about, including thinking back to why I personally was attracted to Hygiene, and I've also observed what people new to the M2M talk about. And I don't think it would be exaggerating too much to say that it is the paradigm of "detoxification" that captures people's imaginations and satisfies their sense of logic. It powerfully addresses the three psychological desires we talked about above, by:

• Number one, answering the desire for a feeling of certainty with a simple, powerful theoretical and practical mechanism that appears to explain everything (or almost everything). It also gives a completely new perspective for most health-seekers who--like most of the populace--have been absorbed in the make-sure-you-get-all-the-right-nutrients approach, or who have been mesmerized by the idea that microbes are at the root of everything. This is what I think is responsible for the statement you often hear from converts to Hygiene that "it just had the ring of truth." What I think people are really describing here is the shift from one paradigm to another. Suddenly you have the new explanatory paradigm that in one stroke simplifies the entire health equation, relegating microbes to a secondary role and assuring you that if you just eat a variety of vegetarian natural foods, you will get all the nutrients you need, so that what you end up mainly needing to focus on is eating clean foods. It is this simplicity that is partly what makes people feel so certain, along with the fact that this simplicity is so logically black-and-white, a characteristic of Hygienic thought we will also look at more as we proceed.


• Second, just as convincingly, the need for control is answered by the fact one can actually see results by eating cleaner foods. This part of the Natural Hygiene system definitely works for almost everybody who has been eating what we call the "SAD" ("standard American diet")--at least at first, and up to a point--and so it is what one could call the "conversion experience" or the "rite of initiation" that furnishes the proof people need to be convinced of the validity of the entire system of Hygienic practice. Whether the entire fabric of Hygiene merits unquestioned credence on the basis of initial symptom disappearance due to detoxification is an issue most people don't think about. But the fact is, it does play the role of convincing people to follow the rest of the dietary admonitions of the system too, by extension; i.e., if this is so powerfully true, how can the rest not be also?

• And third, of course, people love the anti-authoritarian blast-the-medicos rhetoric that blames the medical profession for everything short of the fall of Greek and Roman civilization. It gives people something to stand for--the underdog against Goliath, or the lone light of truth in a dark, SAD-eating world--and enables us to pump up our egos by showing how ignorant most people are, especially the M.D.s, who know nothing about nutrition, and who with their drugs are creating more toxification of people and creating the very problem that needs solving in the first place. Even the scientists are seen as stupid too: in spite of all their knowledge, they can't see the forest for the trees the way we Hygienists can, just by using a little common sense and breaking out of our previous paradigms.

Other unconscious needs also met. Psychologically, all of this serves a lot of unconscious needs by
giving people a sense of identity, villains to fight against, Gentiles to convert to the cause, and so forth. One can even be a martyr and a prophet not recognized in their own country (family) during holiday gatherings where they are ostracized for refusing to partake in heathen rituals like turkey-eating. And of course, as I have personally experienced firsthand, there are the heretics such as myself who are upbraided for having abandoned the cause, or the backsliders who are pitied in their weakness. It's a complete psychological package that people often get a lot more mileage out of than they realize.

Success/failure rates of vegan diets in Natural Hygiene
All right, the preceding gives us a basic psychological framework for the appeal that Hygiene makes to people's minds in the beginning. What I want to look at now are some of the actual results on people's bodies and health that we see in the M2M to add the next layer of events, and upon which further stages of Hygienic psychology are built. Because it is in dealing with or explaining the successes and failures that much of the psychology manifests itself.

Before you do that, let me slip in another question here to get the broad picture first: What is the percentage of Hygienists in the M2M who do well on the diet compared to those who don't? What's the basic split?
It's hard to say with any precision since a number of Hygienists follow versions of the diet that don't strictly match the traditional definition. But I'll make some very rough guesses here, as long as you understand that's what they are. If you are defining the diet to include those who are very strict about their behavior--an important point--and who follow either the all-raw regimen or the 80% raw/20% cooked plan, I would say perhaps 3 to 4 out of 10, maybe half if one is generous, do well. [Note: "Cooked" foods in this context refers to grains, legumes, and tubers; the relative amounts given would be by volume.] This is long-term, an important distinction we'll look at as we proceed. Significantly more do well short-term.

Dropouts over time and "cheating" on the diet complicate the assessment. There is also a
certain amount of attrition of "strict-behavers" over time, so that the ranks of those who do well long-term consist of those left after others have liberalized their behavior. Again, the distinctions one makes change the answer, making it very difficult to say. A real problem here in coming up with a number is that there are many who "cheat" a little (we'll discuss this behavior later) and include haphazard but regular driblet amounts of cheese or butter, even eggs. If you were to include these individuals, even more would be classed in the group doing well. All in all, I would guess what you are left with after this is a quarter of Hygienists who are experiencing some kind of difficulty, possibly up to half if one were being very tough about it. It all depends very much on just how you define "difficulty," which we'll be doing shortly. I don't want to quibble over specific figures because they can be argued. But a significant percentage--whatever it is--experience the problems we'll be listing, from minor to more troublesome.

The problem of vested interests among "official" sources in getting straight answers. Now
admittedly the M2M is not a scientific sampling, but frankly, one probably doesn't exist right now, so we have to go on what we hear from people when they appear to be being straight with us. It would be better if we could get statistics from Hygienic practitioners' caseloads, but since they are so heavily invested both monetarily and psychologically in Hygiene, it is difficult to assess how unbiased they are. Even though I have a lot of respect for their knowledge about applying currently accepted Hygienic practices to the circumstances of the lives of their patients, you have to admit that most are not going to allow the possibility to enter their minds that failures in their patients might be failures of the Hygienic diet. They are going to be more likely to see them as "failure to comply" or as due to other factors outside their control.

"Party line" views and ostracism of dissidents. I believe you know from having previously
interviewed practitioners Stanley Bass and Christopher Gian-Cursio here that when Gian-Cursio looked into the problems a number of his patients were experiencing, came to the conclusion that the Hygienic diet was not sufficient for all patients--and that the problems were only turned around by carefully supplementing the diet with foods of animal origin such as cheese--and wrote it up, he was essentially ostracized, or at least ignored, by professional Hygienedom for saying so. Many of the Hygienic practitioners, as Dr. Bass has pointed out in one of your earlier interviews here in H&B, often are dealing primarily with fasting patients, and thus not necessarily seeing how well all their patients do out in the real world away from a fasting institution. Initially, you certainly cannot argue with the empirical fact that very few people with health problems will fail to improve considerably with a Hygienic fast followed by a cleaner diet than what they have been used to eating. But what happens during a fast and its aftermath is only a small window into what goes on the rest of the time on a Hygienic regimen. I myself have gotten decent results from fasting, but have not done my best on the Hygienic diet itself. It was a repeating pattern of improve-on-a-fast followed-by-stagnation-and-eventual-downslide on the traditional Hygienic diet that got me thinking along these lines.

Special measures that may make the diet work for more individuals argue against its naturalness. You also have to remember that there is a certain amount of "plasticity" in organisms as to what they can functionally adapt to, even if they are not optimally adapted genetically to it. With enough accumulated experience--as many Hygienic practitioners have--you can learn just what jots and tittles have to be followed in order for a Hygienic diet to work. Thus, there may be some truth to the claim that if people aren't getting results from a Hygienic diet (assuming they are following the other elements of the lifestyle such as proper rest, emotional poise, adequate exercise, clean air and water), then "they aren't doing something right." However, if you really have to follow so many dietary details down to the last iota, then that very strongly argues against such a diet being our natural one, given all the rules for behavior that have to be so painstakingly followed. It seems to me, most especially given the more rough-and-tumble evolutionary picture, that a "natural" diet will have more allowance for give-and-take variation in one's basic food intake than that.


Gap between uncensored reports and officialdom rarely surfaces publicly in a way that gets widespread attention. Bass and Gian-Cursio claim they were seeing some problems develop in
Hygienists in real life, especially over successive generations. However, their views were never allowed a forum by official Hygienedom, so very few have had a chance to hear their contentions. What we get is the party line from the IAHP and the ANHS. What one can observe in the M2M agrees more with what Bass and Gian-Cursio have said than what you hear officially.

What about those eating the all-raw-food version of the Hygienic diet? Don't they do better than this?
Unfortunately, no. As a group, they do worse. There are certainly some individuals who do well on the raw-food diet, but they are a small minority. We do have a few individuals in the M2M on all-raw who have done well for 10, 20, 30 years or more. But my estimate, from having seen people come and go in the M2M and the stories they have told about their attempts at sustaining an all-raw diet, is that somewhere in the neighborhood of about 10-15% of Hygienists are truly able to thrive on an all-raw diet. The rest do better when they have added a certain amount of cooked tubers, legumes, squashes, grains, etc., though they may of course be able to get by, in whatever condition, on an all-raw diet.

The distinction between short-term and long-term results is critical in evaluating "success."
It is one thing to go on a raw-food diet and do well for some months or a year or two or whatever, but most who take it beyond that point begin to decline. The true test is long-term, over a number of years. With the exception of those few who thrive, what we hear in the M2M from people attempting the all-raw diet for any significant length of time (this applies in spades to those attempting fruitarianism) is that over the long term they often start suffering from one or more of the following symptoms: They may lack energy, their stools may become chronically diarrhea-like, they feel hungry all the time, they may not be able to maintain their weight without struggle, or they develop B-12 deficiency, lose sex drive, fatigue easily, and/or develop insomnia or a hyperreactive nervous system. With some individuals, we also sometimes see binge-eating develop when they try all-raw.

Are the rare all-raw success stories the "ideal," or simply "exceptions"? The all-raw diet works
well for some few individuals, but for these individuals to extrapolate (as most do) that therefore everyone would also do well if only they were "doing it right" ("like I do it," following certain jot-and-tittle procedural details) is fallacious and self-serving. It assumes that everybody is alike, or enough so for the same ideal diet to work for all--more black-and-white reasoning. If there is one thing modern genetics is showing, it is that people can vary considerably in certain respects. Especially given the evolutionary transition that the species has been in the midst of since the advent of agriculture 10,000 years ago, you would expect there to be more variability at this time.

Symptoms of "failure to thrive" on raw and/or vegetarian diets
All of this so far paints a disconcerting picture of the Natural Hygiene diet as one that works for some but not others. Why? Do you really feel people are so different that we can't agree on certain basic fundamentals of diet?
No. But I don't think it's too mysterious why some succeed and some fail on Hygienic or other vegetarian diets. In Part 1 and Part 2 here, we went to great lengths to detail what the so-called "original" or evolutionary diet of humanity was. What Hygienists miss is that the "basic fundamentals of diet" that you speak of comprise a wider variety of acceptable foodstuffs--including a certain percentage of foods such as cooked starches and lean animal game products--than have been considered "kosher" within Hygiene to this point in time. It is within the set of foodstuffs common to our evolutionary heritage that you find individual differences and the need to tailor things somewhat.


From this standpoint, then, the Hygienic diet is not so much different from the evolutionary diet as it is simply a restriction of it (as is true of all vegetarian diets). The foods we are eating are fine foods, but they are a subset of what the human body evolved on, and some individuals with the right kind of constitution or genetic plasticity--say, closer out toward the ends of the statistical "bell curve" of the average genetic make-up--can handle that restriction while others can't. And as we said previously, there are jots and tittles that experienced Hygienists may know how to implement in order to compensate to make the diet work for additional people. But they don't work for everyone.

So for the individuals who do not do well on a Hygienic diet--whether all-raw or the more mainstream 80% raw/20% cooked version--what are the symptoms they experience?
Well, I mentioned a few of them briefly above for the all-raw-fooders, but let's look at them in more depth, since they often also apply to the mainstreamers who are not doing so well, just to a lesser degree. Also, in most cases, whether all-raw or not, these individuals are more often total vegans, or close to it--no dairy or eggs gotten on the side even occasionally. I want to mention here that I am well aware that the traditional Hygienic explanation for some of these symptoms would be that they are merely "detox" symptoms. Later I will go into just why I believe this idea is mistaken, given the circumstances in which these long-term symptoms usually manifest.

A look at the most serious potential problems. First, here are a couple of severe problems we have
seen. Keep in mind these are the most severe we have seen, and they are individual cases, but they are real possibilities.

• Case of congestive heart failure in a decades-long Sheltonian hygienist. Probably the most sobering was that a year or two ago one older member, who ended up dropping out of the M2M, wrote to us that they had developed congestive heart failure. They said Hygienic practitioners told them it was due to lack of adequate protein and B-12 over an extended period. [No comment here as to the potential accuracy of this attribution.] We later heard word from them through an intermediary in the M2M that it may have been due to having followed a diet too high in fruits percentage-wise for many years prior to that.* This even though they were apparently not extreme about it like a "fruitarian" would be, but had simply followed a traditional vegan Hygienic regimen over many, many years--only one higher in fruits. This particular situation I don't think anyone in the M2M really had much of a stock Hygienic explanation for. Most were simply stunned. I do think, however, it at least drove home the point with most people that diets too high in fruits are wise to stay away from.

• Case of rickets in a vegan toddler eating a Natural Hygiene-style diet. Another sobering occurrence was that in the most recent issue of the M2M, one of our participants reported that their two- or three-year-old son--who was still breastfeeding (or had been until recently) and who, along with the mother, was eating vegan Hygienic foods, including getting sunlight regularly--had developed bowed legs as a manifestation of rickets due to vitamin D deficiency.* After getting professional help that included an initial megadose of vitamin D, the father began feeding the son raw goat's milk and cheese, and added a daily multivitamin/mineral supplement, with the result that the son's legs are improving and areas in his teeth where enamel had been missing are now re-enameling.

• Cases of vitamin B-12 deficiency. As a perhaps less severe problem, but one that occurs somewhat more frequently, we have had several members on the Hygienic diet (even when not all-raw) report very low levels or outright deficiencies of vitamin B-12. These have been discovered on blood tests not just by mainstream doctors but also by Hygienic practitioners, the latter taking the problem seriously. To my knowledge, the low B-12 levels have so far responded quickly to supplementation, which is often recommended by Hygienic doctors these days.


Low-profile symptoms more commonly seen. The above are the most serious problems I can recall offhand without going back and digging through back issues of the M2M to perhaps find a few others. Less severe but more frequent than the above symptoms are the following:

• Continual diarrhea or poorly formed stools can develop in some individuals after some months on the diet. This would occur mostly with people eating all-raw, and seems to be due to the very high roughage content, which some individuals simply cannot handle well.

• Some women's periods become quite erratic or may disappear.

• Some people experience insomnia or unrestful sleep on an all-raw [and/or vegan] diet and get run down and chronically tired.

• Another contingent of people has very little energy eating all-raw and drags around all day. An alternate manifestation of this would be feeling languid much of the time, or sleepy during the day even with regular nightly sleep. A mental effect sometimes seen may be lack of normal motivation to get regular daily tasks done--no "joie de vivre" ("joy in living").

• Some individuals cannot maintain their weight well and become thinner than even they may think is healthy. We have one individual in the M2M who had this problem and who was a huge eater, adhering to the Hygienic diet since 1989, who has finally been gaining weight after having included regular raw animal flesh for the past couple of years. The volume of food required has dropped dramatically.

• Most, although not all, athletic individuals, particularly endurance athletes, find they perform better when they include cooked starches in their diet--and that relying primarily on the sugars in fruits on a raw-food diet, and the calories in nuts and so forth, does not seem to provide the proper mix of carbohydrate fuels for the sustained energy they want.

• Perhaps the most common complaint of all is that more than just a few people are "hungry all the time." Some then become obsessed with eating and have to eat or piece all day long.

• Another common complaint is that sex drive decreases markedly for some, even to the point of virtually disappearing. This can occur in both males and females, but the complaint is heard most often from men, many of whom are not accepting of, or happy with, the stock Hygienic explanation of a simple lack of "overstimulation," or the typical vegetarian view that it is a "spiritual" advance.

• Some people experience more frequent respiratory problems and colds on an all-raw diet, even when they have been on the diet long-term, which might indicate a lowered immune system. For example, one individual who has been vegan since 1989, and Hygienic since 1991, has noted an increase in colds/respiratory problems whenever the raw component of their diet has climbed above the 80-90% range or so. When they include cooked grains and starches in their diet, they feel better, are not so hungry all the time, sleep more soundly, and the colds cease.

• Some people's nervous systems become more reactive and subject to upset, which may be connected to the insomnia problem that some experience.

How people get trapped by "pure" diets that don't maintain long-term health
The "frog in slowly boiling water" syndrome: initial improvement followed by long-term decline. And the final thing I want to look at here is not really a specific symptom, but a long-term
syndrome that takes time to manifest. We could call this the "better-for-awhile-at-first-followed-by-slow-decline" syndrome, or the "frog-in-slowly-boiling-water" syndrome. In fact, this is the usual long-term pattern within which the above problems occur.


How can things end up bad when they started out so good? As we mentioned earlier, the
key conversion mechanism for convincing people of the Natural Hygiene system--or other "clean-diet" approaches--is that when they go on a cleaner diet, almost everyone improves initially because of the detoxification that takes place. People often feel so incredibly good initially on a clean diet that they cannot believe--they do not want to believe--it could somehow go sour. However, over the longer term, many people's bodies simply may not be extracting sufficient nutrition from the diet, and they begin to decline or develop various symptoms (even if they still feel okay), presumably due to deficiencies after their reserves are exhausted. Except in the case of B-12, these are not usually your typical "deficiency-disease" symptoms--like the case of rickets that was mentioned earlier. They are more subtle, but they have their effect, and they can be slowly debilitating over long-enough periods of time.

The lulling effect of imperceptibly slow declines in health. From a psychological
standpoint, the most interesting thing about this syndrome is that it occurs slowly enough that people often mentally adjust to the lowered state of health and do not perceive it as such, particularly since they so strongly do not want to believe it is happening. Instead, they continue to feel they are doing fine--sometimes even when they describe some of the above symptoms, which they may interpret as improvements that indicate "continued detox"--while people who know them may tell them otherwise. Yet because of the extreme anti-authoritarianism, they don't listen to what others are telling them, until finally the problems reach a point where they can't deny them anymore.

Emotional "certainty" shuts down one's ability to rationally assess symptoms. By the time this happens, though, people have been so thoroughly convinced of the entire Hygienic dietary system by their initial successes, and particularly are so won over mentally by the detoxification paradigm, that they are now psychologically invested in the "rightness" of everything about the Hygienic system, and cannot believe there could be any shortcomings in it, since it seems so internally self-consistent logically.

The willingness to make sober judgments of current symptoms is perpetually displaced into the future. Since they got results to start with, they feel their problems now
must be only a temporary setback, or evidence that further detoxification or patience for further healing is needed; or if not that, then if they can just correct a few details, all will be well again and they will be back on the road to superior health. "The better you get, the more energy the body has to expel the toxins, and the more powerful and healing the crises become!" This is the thinking, which in the earlier stages of detoxification might be true, but which becomes increasingly questionable the longer one follows the program--as is the case for many of the symptoms we outlined above.

It sounds like you are saying people can become more concerned with "being right" about Hygiene than in whether they are actually getting good results.
Basically, yes. That's a good way of putting it. The logic of Hygiene comes to captivate people and satisfies them as much as results do--perhaps more so if they aren't getting the results desired. The dynamic, of course, is a little different for those who are getting good results. They, too, are plenty interested in being right about Hygiene, but since it has worked for them, they have a more solid sense of "certainty" about it, which of course they project to others. And because they have this greater sense of certainty, these individuals tend to become the repositories of traditional wisdom, which is dispensed to the less fortunate. Those who are successful have a tendency to blame the other person's behavior ("you aren't following all the itty-bitty details right") or to doubt the failures are failures at all, seeing them rather as expressions of the truths of Hygiene at work when its principles are violated--proving once again its rightness.


Those who are successful are partners in reinforcing the tendency not to see failures as real failures. That's one of the biggest things holding all this in place: Those for whom Hygiene has worked
cannot afford to consider failures as failures of Hygiene either, because just like everyone, they uphold the ideal that it must work for all, and so if it does not, it is also a threat to their own beliefs in spite of their success. This shared ideal is what binds the successes and failures together in common cause. The less successful imbibe the certainty of the more successful to maintain the faith. This syndrome is of course not unique to Hygiene, but occurs with any idealistic system where there are both significant successes and failures. The important point to note here is that it shuts down the mind's openness to new interpretations. And if the mind cannot look at alternative explanations, what you will often see is that the favored paradigm gets pushed to extreme limits by those who are failing at it in order to try to reach some sort of resolution.

Such as?
Well, at this point, two things may happen, which are that:

• In order to prove that the system does in fact work, people redouble their efforts at detoxification, and often attempt to go on an even stricter diet. And/or:

• Considering perhaps they may not be getting enough of some nutrient after all, they begin to get more perfectionistic about the many vegetarian dietary details that are possible to manipulate. After all, one characteristic of exercising more control is the need to pay attention to the additional details necessary to implement such increased control. And if the details one used to pay attention to aren't working anymore, one must become more attentive to even smaller details.

Thus, what is essentially a simple, logical system of garbage-in/garbage-out ("toxemia is the basic cause of all disease") begins to increase in complexity. In the end, obsession and fanaticism may develop.

How obsessively striving for absolute dietary purity becomes a fruitless "grail quest"
Significant parallels with religious behavior. In the introductory passages of Part 2 of the interview
here, I was not merely making a metaphorical comparison in saying that Natural Hygiene resembled a religion in some ways, because the reaction of redoubled purification efforts in response to the better-for-awhile-but-then-worse syndrome has interesting parallels with certain religious motivations. It serves some of the same psychological needs.

Let's go into this more. How so?
Well, in religion, at least in the West, you have the idea that people have sinned, and that there are prayers or rituals you must go through to cleanse yourself of that sin. There is typically guilt or at least anguish over this state of assumed sin. Even if one does not believe in original sin, still, all it takes is one slip-up to establish an ad-hoc fallen state, after which you must redeem yourself. Guilt and redemption are strong motivators, because they go to the core of self-image.

The role of unseen and unverifiable "toxemia" as evidence of one's "sin." In Natural Hygiene,
if one makes a blanket assumption that "toxemia is the basic cause of all disease"--or at least the only one worth considering, since dietary sufficiency eating natural vegan foods is usually assumed to be a given--then the psychological motivations often cease to be merely practical and become co-opted by a "redemptionist" quasi-religious quest. For those who are prone to taking it to an extreme, what was practical can become imbued with an element of superstition, or at least an unverifiable belief in unseen toxemia that one cannot really substantiate completely. So one's philosophy can begin to obscure whatever realities to detoxification there may be and blow them out of proportion.

The "relativity" of absolute purification transforms it into an ever-receding goal and goad.
One's overriding goal becomes greater and greater purification toward the holy grail of complete detoxification, which ever recedes as one's perceived level of detoxification continually falls short of the postulated goal. One paints oneself into a corner where, because "no one is without sin" (we all goof up from time to time), one can never be sure after one has sinned whether that sin is really forgiven or not (cleansed from the body by normal processes). Thus to redeem oneself, one must always either fast or go on some sort of cleansing program. Following from this is the idea that to remain pure and undefiled, and also to serve as proof ("by their fruits ye shall know them," or the signs that one is more evolved), one should go even further and be able to live on the most restricted diet possible permanently--such as not just a raw-food diet, but sometimes fruits alone (fruitarianism). And if one is not able to do so, then it means one is not really purified, or evolved enough, and thus the only recourse is to do further cleansing until one really is pure and evolved enough to do so.

Self-restriction becomes its own virtue as absolute purity recedes. After awhile, however, this
easily becomes a habit divorced from any real reason. Even if one did reach perfect purity, the urge toward self-restriction would continue because of the psychological dynamics that have become ingrained. So for those not doing well on the Hygienic diet who fault themselves, it can end up becoming a self-reinforcing vicious circle, and one never reaches the "goal." Because as we've noted, what often happens by this time is that after being on the cleansing diets and restricted intakes for long enough, instead of toxemia being a problem, deficiency (depletion of one's reserves) sets in. So what ensues, if people are really hooked into this dynamic, is that they go through a cycle of cleansing and sinning, cleansing and sinning (because the body is crying out for more nutrition and drives one to eat something different--anything different--in an attempt to make up the nutritional shortfall). Or if they do not sin, they continually feel they still must not have figured the "system" out completely (whatever the system may be), and so they go on searching here, and searching there.

Endgame: fundamentalist obsessive-compulsiveness. Once this goes on long enough, because it is
self-reinforcing, many of these individuals become so locked into this way of thinking they are never again able to see food and eating any other way. (Just as fundamentalists rarely ever change their minds.) This particular syndrome is a potential breeding ground for obsessive-compulsives, and the individuals who fall into it often become the basket cases of Hygiene. I realize, of course, all this may sound rather ridiculous taken to this extreme, but it is a real dynamic that we see in the M2M in more individuals than you might think. Typically, of course, it's not nearly as extreme as painted above, but it's not an uncommon undercurrent in behavior. At some point a person has to decide between two perspectives: Is one going to take toxemia or detoxification to be all there is to health--by taking the reactionary opposite of their former all-American "more and more and more is better" outlook to its simplistic opposite of "less and less and less is better"? Or do we try to strike a balance point somehow?

Successful vegetarian diets require more than simple dietary purity

So how do those who are successful on the Hygienic program go about striking that balance?

More attention paid to robust nutritional intake and other health factors. Well, we see a few different ways, but they all boil down to two main areas, I think. Rather than getting so totally wrapped up in detox, successful Hygienists more often seem to pay a lot of proactive attention to making sure they get sufficient nutrition, rather than just assuming it will happen automatically if they eat all natural foods. But just as importantly, they also more often pay equal attention to the other factors of a healthy lifestyle, such as adequate exercise, sleep, rest, sunshine, and stress reduction (including lack of excessive mental stress), and even sometimes to living in more equable climates, which also reduces stress. I don't think the attention paid to these other factors is any accident. Long-time Hygienists are more likely to be aware on some level of the detrimental impact insufficient attention to these other factors can have, and intuitively seem to know they must be attended to. These factors are important to health in and of themselves for anybody, of course. However, I believe there is an additional reason why they assume added importance for a Hygienic lifestyle in particular.

Stress and vegetarian diets. My own personal experience and observations corroborate what I have
heard from some other long-time former vegetarians outside Hygiene who have had the chance to observe successes and failures--both in themselves and widely in the vegetarian community. And this is that stress makes a vegetarian diet more difficult to thrive on--more so than for other diets. It just seems to be that when you look at a range of long-term vegetarians, the ones doing better are more often living "easier" lives. Their lives are less demanding work-wise or stress-wise, or if not, they have plenty of time to recharge. Again, I am not saying this is true of all, but it is a noticeable thread. Handling stress of course hinges on several other factors in the Hygienic system, such as adequate but not excessive exercise to create a state of fitness and better resilience to stress; and of course adequate rest and sleep to repair from stresses. And a number of Hygienic practitioners are noted for continually reminding the rest of us not to shirk these factors.

Stress creates less margin for error for nutrition's contribution to physiological maintenance. Why should it be that one may have to pay more attention than usual to alleviating stress
factors on a vegetarian diet if it is to succeed? Again, my opinion is that it goes back to the observation that compared to the "original" evolutionary diet of humanity, the Hygienic diet is a restricted one. This leaves less margin for error with regard to nutritional intake. And that would presumably also have the potential to affect any cellular rebuilding and repair necessitated by stresses. With less room for error in this area, greater attention must be paid to other factors besides food that either reduce stresses that make more demands on nutrition or on the stress-response system (adrenal glands, etc.), or that involve rebuilding or recuperative activities. This expands the margin again to create a bigger cushion.

Successful Natural Hygiene diets are often less strict and more diverse than traditional/"official" recommendations. Of course that's not the only thing to expand one's safety
margin. The second thing is that there is more diversity in the way Hygienists in the real world actually approach the dietary aspect of the system than is publicly acknowledged in the pages of ANHS's Health Science magazine, for instance. Some of the ones who are successful occasionally include--if not regularly supplement the diet on a low level with--things like yogurt, or some cheese, perhaps eggs. It may not be much, but the more-or-less continuing, even-if-haphazard dribble of it is suggestive. Stanley Bass--one of the Hygienic practitioners, as you know from his interviews here, and a member of the M2M--strongly recommends modest but regular amounts of cheese or eggs as necessary to complete the basic Hygienic diet nutritionally, based not only on his mice experiments but on experiences with real-life patients as well.

Special nutritional practices added by some individuals. A few individuals in the M2M take
special measures such as making flaxseed preparations, which they feel provide crucial nutrients [i.e., essential fatty acids (EFAs) and/or their precursors]. Others may go in for BarleyGreen, such as yourself, Chet, or for the blended salads that Gian-Cursio developed and thought essential to increase assimilation, which you and Bass have also recommended. We have also heard privately from an individual who has consulted with one noted Hygienic practitioner, and who says this practitioner acknowledged to them in person that there is nothing wrong with fish in the diet once a week or so--though of course the practitioner cannot come out and say so publicly. Still other people go beyond the traditional Hygienic whole-foods idea that says juices should be limited to a now-and-then basis, and regularly drink vegetable-juice preparations (carrot-celery-romaine is one that seems to be favored) as a way of getting more concentrated doses of minerals and so forth. And finally, of course, there are those such as myself and another individual or two in the M2M who have added modest amounts of animal flesh to our diets, based on the evolutionary picture of humanity's original diet, and have seen significant improvements over the results we got on the normal Hygienic diet. We still believe the basic Hygienic way of eating is a fine way to eat as far as it goes, and we continue to follow many of the usual Hygienic eating patterns, including plenty of fruits, vegetables, and nuts; we just don't think it is quite sufficient, and have carefully added reasonable amounts of these other items to round out the diet, paying attention to issues of quality. Those of us in this last camp go along with those such as Bass in thinking that the paradigm of detoxification is valid to a point but only half the picture, and that deficiencies are actually an equal if not greater cause of problems for those on long-term vegetarian diets, accounting for many of the problems that arise and persist.

Potential role of involuntary "lapses" in filling nutritional gaps. Individuals like the above who
supplement the basic Hygienic diet, of course, do so consciously. In relation to this, there is another interesting, relatively unconscious, behavior here we should also look at. While I don't mean to explain away all the successes in Hygiene as due to people doing non-kosher things--because that isn't true--nevertheless it is interesting to observe that many Hygienists still "cheat" a little, or have periodic minor indulgences, even when they do not really intend to do so as part of their "ideal" regimen. Using butter or cheese on one's steamed vegetables on a weekly basis--or more often, depending on the individual--isn't uncommon to hear about in the M2M; perhaps less frequently, some eggs and so forth. Consciously, they really do not "mean" to do it, but they do it anyway. Which raises the legitimate question of how much of a role these periodic low-level "indulgences" may be playing in filling in potential nutritional gaps in people's diets. The bottom line is, I don't think anyone really knows the answer. It's a question I doubt anyone has studied systematically, and it would be very difficult to do so.

Lapses usually automatically interpreted as "addictions" instead by adherents. What is
interesting from the psychological standpoint, however, is that these so-called "lapses" are almost always viewed as discipline problems or addictions. If the Hygienic diet is supposed to be so "natural," why are so many enthusiastic and motivated followers not completely satisfied by it? One cannot help but get the strong feeling no one is interested in considering the possibility that the diet may not be satisfying people's physical needs--a primary underlying reason why they cannot, or are not, sticking to the diet--and that their bodies may be making them do this. This is another "secret" of sorts that's not talked about much publicly but that you see in the M2M: Many people struggle with sticking to the diet, and not just because of "social" pressures, but because of cravings. While it may explain some of the cases, to blame it all on past addictions or lack of discipline or poor-quality produce is, after a point, too convenient.

Rationalizing dietary failures with circular thinking and untestable excuses
This habitual pattern of theoretical explanation--the strong tendency to fault the results (or the individual responsible) first whenever they do not measure up to the philosophy--suggests motivations rooted more in idealism than in practically evaluating one's philosophy by its results.

Can you clarify that a little further?

Pat answers and mantras. If one's interpretation of a theory is overidealistic--even though the theory
itself may satisfactorily answer some number of things, as I agree that Natural Hygiene can--they will try to make their view of it impregnable to criticism with a pat answer for everything, so that you could imagine no outcome of any circumstance or test that they would accept as valid evidence against it.

Unfalsifiable excuses impervious to testability. In science there is a prohibition against explanations
that are not "falsifiable"--meaning those that cannot be subjected to a test where one outcome would negate ("falsify") the explanation, and the other outcome would support it. Otherwise the assertion is impervious to a fair test and will not be taken seriously by science because it is untestable. In other words, if in interpreting the results of an experiment you can always twist them so that they support your theory, and you cannot allow or conceive of any result that would count against the theory, then you are trying to have your cake and eat it too, and that is not allowed if you are going to be scientific.

How about some examples here?

"You're too addicted." Well, the above idea that people cannot stay on the Hygienic diet because they are too addicted to other things is a circular kind of argument as normally stated. There could perhaps be ways of testing it, if one wanted, but not the way it is usually stated, which is designed to deflect any possible criticism: Why can't one stay on the Hygienic diet? Because they are too addicted to past foods. How do you know you are addicted to such foods? Because if you weren't you could easily stay on the Hygienic diet! It goes round and round. You see, the way they are formulated, you can't subject these contentions to a real test because they are supposed to be true by definition. When you start hearing explanations like this given out as pat answers, it indicates one doesn't want the theory to be argued with or subjected to a fair test where there could be a risk of an answer one doesn't want.

"You just haven't given it enough time yet." There are other statements like this in Hygiene designed more to protect belief than to truly explain. For example, another common one we hear a lot in Natural Hygiene is that if someone isn't getting good results, they "just haven't given it enough time yet." "Hey, you expect to get well in just a few months or years from 20 or 30 years of prior bad living habits!?" That's the rhetoric. Now, acknowledging the need for patience shouldn't stop us from also requesting some sort of reasonable criterion to determine whether there has been enough time. Because without one, you just don't know--you are clawing at the thin air of a foggy explanation designed to obscure the issue. If one were truly interested in being practical here, they could at least make the observation that if what one is doing is working, then symptoms would tend to lessen in frequency or severity over time. (In other words, you would at least see a trend, even if full results did require a much lengthier period.) If this were in fact observed, it would uphold such reasoning, while if the symptoms are increasing or persisting over the long term, then the hypothesis is falsified. But if all you are doing is telling people they haven't given it enough time no matter how long they have given it, then you are obviously not very interested in putting things to a realistic test. What we find in the M2M is that usually people don't require all that long to notice some kind of results trend after a dietary change. A few to several months is usually amply sufficient to see at least some kind of trend; many people see early results or trends within a few weeks to a month or two. If you are going longer than several months to a year, and you have not gotten any better than when you started, or are experiencing persistent trends for the worse, something is probably wrong.

"Unnatural overstimulation." Here's another example, this one more of a double-standard explanation
to discount good results that other people get eating diets different from Hygiene. It's one I hear myself all the time: If you are someone who has chosen to start including meat in your diet again and you have been gradually feeling better, you inevitably hear the mantra that it's not really better health, it's only "unnatural overstimulation" due to the supposed excessive toxins or nasty animal protein in meat. Yet by the canon of Hygiene itself, the criteria for a stimulant as contained in Shelton's "Laws of Life" contradict this. Law X, "The Law of Stimulation or Dual Effect," and Law XI, "The Law of Compensation," together basically state that a stimulant saps the energy of the body for its effect and requires a compensating recovery period. In other words, stimulants produce a prostrating after-effect during which you feel worse and from which you have to recuperate. This is not what those of us who have added reasonable amounts of meat to diets otherwise Hygienic in character, and who are experiencing improved health, go through--quite the opposite: the improvement is gradual. Yet the mantra gets repeated over and over again on autopilot without seeming comprehension of its lack of correspondence to the actual circumstances. It is invoked as a pat answer simply to explain away the fact that someone is getting better results on a different diet one does not like.

When symptoms are always seen as "detox." The most important example, of course, of an
untestable answer repeated like a mantra that we need to address here would be when any symptoms encountered are always explained as symptoms of "detox"--while the alternative possibility that they could be indicative of a dietary deficiency, casting doubt on the sufficiency of the Hygienic diet, is ignored totally. Again, that a symptom is due to detoxification might be true depending on the circumstance, but without a way of reasonably assessing which might be the case, it's merely a convenient, unfalsifiable, and untestable explanation. How would you test such an explanation? Well, short of some sort of blood test that might pinpoint biomarkers indicative of toxemia, you could at the least point out, as we did above, that the symptoms--detoxification in this case--should begin to diminish in frequency over time. And on the other hand, if after these symptoms have diminished or abated you are left with other symptoms that have persisted at the same level or even increased, you would have to admit to reasonable grounds for supposing that these symptoms may indicate deficiency or deterioration or metabolic imbalance of some kind. But merely to say without any further attempt at clarification that any symptom indicates one is "still too toxemic" or still too gunked-up--that it requires still further detoxification--shows that one is not very sincere in one's attempts to find out. For these reasons I think that the stock explanation of "continuing detox" is erroneous in attempting to cavalierly dismiss the symptoms I described earlier that are experienced by those who do not do well long-term on the Hygienic diet. Because for these individuals, the symptoms are persistent and do not diminish over time. And as we have said, usually these individuals had experienced improvement at first before the decline, which is why they themselves are so loath to accept the idea that the Hygienic diet may be insufficient for them.

Other meaningless, unhelpful, or unfalsifiable excuses. And a little more on the humorous side
here, when all else fails, one can always bring out the even heavier artillery of such unanswerable assertions as: "too badly damaged by prior living," "you weren't breastfed long enough," "degeneration of the race from prior generations' poor diet," or my own favorite, and one I was once personally accused of: "psychosomatically sabotaging yourself by trying too hard." Could there be any possible validity to these excuses? Perhaps. Who knows? The point is, they could be given for any approach that fails, not just Hygiene--which shows just how little they really explain about why Hygiene shouldn't give better results than all the rest. Please understand, as I emphasized in the introduction to Part 2 of our interview, that in this critique I am not quibbling with the basic principles of Hygiene, which I agree are valid for the most part. What I am saying is that they are often interpreted in a conveniently inconsistent, fallacious, or selective fashion to explain away detrimental results that threaten emotional attachments to dietary details or implementations believed to stem from those principles. And that this can get people into serious trouble by obscuring their vision of what may really be happening to them.

5 tips for staying alert to the traps of excessive dietary idealism
What about ways of guarding against falling into these different traps of idealism? Are there things that can be done to prevent going unconscious about it?

1. Self-honesty instead of denial. Well, number one, I would say simply acknowledging all that we've
been looking at here--at the least considering it as a real "possibility"--is a big first step. The big problem I see is that even where people become aware of these problems, as happens in the M2M, they most often go into denial about it. Denial is the biggest hurdle. Just be honest with yourself about how you feel and if you are pleased with the results you are getting or not.

2. Focus first on results rather than theoretical certainty. Second, becoming less set on arriving at
a perfect theory and more interested in getting good results helps one remain more objective. Be wary of the desire for too much certainty about any one theory. Theories are very useful tools, but they usually change and become modified or at least refined over time in response to scientific advances. Being too certain theoretically can override your ability to perceive any detrimental results that might cast that certainty in doubt. If you think of a theory as an approximation subject to modification by results--which is what science is all about--you'll be much less likely to get into trouble.

3. Utilize reasonable timeframes to gauge trends. Third, being willing to place reasonable time
limits on one's dietary experiments--say, a few to several months per dietary adjustment to assess at least some sort of trend--will help keep you from falling into the frog-in-slowly-boiling-water syndrome if you don't get results.

4. Exercise at some activity that gently challenges your limits, to hone sensitivity to changes in your capacities and health. Fourth, a regular exercise program doing something you enjoy,
particularly one involving a high level of activity--such as some type of endurance sport, weight-lifting, or yoga; not those specifically, but anything where you can accurately measure or get feedback about your results at a high level of activity--can be very helpful not only in achieving health, but in assessing it more realistically. You don't have to put large amounts of time into it; rather, intensity or focus is the key, where you play the edges of your capacities for at least brief periods of time without undue strain. When you do this, you will notice shortfalls in results and the impact of diet or changes in health much more quickly than if you are idling along far below what you are capable of.

5. Don't ignore feedback about how you are doing from people who know you well. And fifth,
actively seek out the opinions of other people whose judgment you trust about how they think you are doing. Maintain a dialogue with people and don't isolate yourself from feedback and others' opinions, or be so ready with dogma that you drown out what they are saying. Ultimately, if you aren't feeling well, looking well, and doing well in your daily life, why are we bothering with any of this?

Is there anything we haven't covered that you'd like to add before we close out this interview?
Yes. As much as I've tried to carefully document the scientific aspects in Parts 1 and 2 of these interviews, I have no doubt newer science will inevitably supersede portions of it. Also, given the complexity of some of the research, it's possible there may be a few instances where I overlooked differing interpretations of fact. Given the controversial nature of what I am reporting and saying here, I fully expect there will be those who will point out any such inaccuracies, which I welcome. This is how knowledge advances, and I'm sure it will help me refine my own thinking or look at new interpretations as well.

Big picture more important than disputes over details. I trust, however, that people can judge for themselves that the overall thrust of the views I have been presenting does not depend on a few specific details, but rather on how all of them as a whole point to a larger overarching pattern. That's my real interest--the broad thrust of all this. I don't really expect to convince very many people, and I know many will be upset. I can well imagine as we speak that many of your readers will be busy writing their own rebuttals. What I am interested in is starting a dialogue on these issues in the vegetarian and Natural Hygiene communities, and generating awareness about them. I don't think most know yet that there is considerable scientific data on humanity's "original" diet now, or that, as we have discussed in Part 3 here, there are significant numbers of Hygienists not doing well.

Tolerance for our own mistakes, tolerance for others. In this same vein, I'd also like to suggest to
Natural Hygienists and other "veg-raw" folks not to be too fundamentalist about their dietary beliefs and practices. Especially with other people--but even yourself. Leave yourself open to the fact that you could be wrong sometimes--because we all are. Don't paint other people who don't believe the way you do about diet, fasting, and health as willful ignoramuses, or people such as M.D.s as willfully evil promoters of bad or destructive information. Most of us are doing the best we know how with what we've got. We need to stop making people into villains, and diet into a religion that we feel a need to identify with as being somehow morally or socially "right" about. Because no matter how much you think you know about it, you can always be humbled by new information. I've really been razzed at times by close family and friends about the changes that have occurred in my thinking about diet as a result of my experiences and research. I'm sure more changes will come. It's somewhat embarrassing how overzealous I've seen myself be at times, even while considering myself open-minded and tolerant at the time. If you want to create an environment where people will be forgiving of you when you change your mind about something, you can't be coming across as an inflexible proselytizing zealot to them or they won't be tolerant of you either.

Open-mindedness is an ongoing process, not something one achieves and then "has." It's something you have to continually pay attention to and engage yourself in. It doesn't happen automatically. In fact, the tendency if we do nothing is for our minds to crystallize and become more closed as we age. We all die eventually no matter what diet we eat. I would hope we can all stay forever young at heart even as our bodies inevitably age. That's the most important thing.

Further observations about "failure to thrive" on vegan diets
*** "...the most sobering was that a year or two ago one older member who ended up dropping out of the M2M wrote to us that they had developed congestive heart failure. They said Hygienic practitioners told them it was due to lack of adequate protein and B-12 over an extended period. We later heard word from them through an intermediary in the M2M that it may have been due to having followed a diet too high in fruits percentage-wise for many years prior to that."
As discussed in the postscript to Part 2, although this is admittedly speculative, symptoms of heart disease in this situation would also be consistent with hyperinsulinism as a possible cause, given the high level of carbohydrate from the excessive fruits, combined with low protein consumption.

*** "Another sobering occurrence was that in the most recent issue of the M2M, one of our participants reported that their two or three-year-old infant son--who was still breastfeeding (or had been until recently) and who, along with the mother, was eating vegan Hygienic foods, including getting sunlight regularly--had developed bowed legs as a

261

manifestation of rickets due to vitamin D deficiency." Case of rickets in vegan toddler. This example of serious nutritional deficiency was criticized by one
vegan raw-food advocate online, who made the accusation that crucial information must have been purposely withheld from my account, and suggested the rickets could have been due to an induced calcium deficiency from the binding action of phytates if the diet were rich in grains. For the record, here are the particulars of the father's and his son's diets:

Family environment and parents' dietary beliefs/practices. (From issue #23 of the N.H.
M2M): The father himself had been vegetarian for 13 years, and for the last four had maintained a diet of "80% living foods"--which we can take to mean roughly no more than about 20% of the diet (probably by volume, since calories were not mentioned) would have been other than raw (but possibly juiced or blended) foods. He also mentions eating potatoes and rice occasionally in this and another issue of the M2M. If we are being fair in making an estimate, we'd probably have to say that at most, rice would have constituted no more than, say, 10% of the father's own diet (half of the 20% non-living foods, given the mention of potatoes as a similarly occasional item).

Particulars of father's diet (reflected to some degree in toddler's diet). The father also gave the following rundown of the particulars of his own diet (which, as we'll see, has an influence on the character of his son's as well) and to some extent his family's diet. (Little information was available for the mother's diet, except that it was influenced to some degree by her husband's diet, and the implication is that hers too was likely vegan, or mostly so.) A typical breakfast would consist of bananas or grated apples with reconstituted raisins and a little tahini, or alternatively a fruit smoothie consisting of fresh-squeezed apple juice, tahini, dates, and bananas. Lunch was a salad of various vegetables at hand, the usual items we are all familiar with: tomatoes, bell peppers, cucumbers, celery, onion, broccoli, carrots, lettuce, sprouts, all topped with a nut or seed dressing--often tahini--and/or avocado dressing. Fruit was a favored item for snacks later. Dinner was another raw salad for the whole family, with the possible addition of steamed vegetables or potatoes.

The son's diet was not much different, though perhaps heavier in fruit. Like the father's, his was "a mostly living foods vegan diet." The father also states that his son was unvaccinated--a sign generally indicative in the vegan community of serious overall commitment to these types of dietary/lifestyle regimes. The son was still breastfeeding at the age of 2 years and 8 months. Dietary consumption consisted of "6-8 bananas a day, apple juice, oranges, kiwis, apples, celery, dates, steamed potatoes, all sorts of vegetables, organic spelt bread, ricecakes, and his favorite spread right now is made with avocado, raw tahini, lime juice, raw cider vinegar, kelp, and a couple of capsules of blue-green algae."

Note that the father's statements quoted above depict a diet with a regular though not excessive level of grain consumption, which would tend to refute our critic's idea of an induced calcium deficiency stemming from inappropriately high levels of phytate in the diet.

Calcium deficiency rather than vitamin D deficiency as potential cause? The critic's suggestion
that the attribution of the rickets to vitamin D deficiency overlooks the possibility of calcium deficiency does have merit. The father's account (in issue #28 of the N.H. M2M) of visits to the pediatrician, however, strongly implies the doctors diagnosed or at least themselves believed the symptoms were due to vitamin D deficiency. In rechecking my sources here, it is worth noting that the father does make a statement elsewhere in the M2M that at the time of an earlier childhood bout of illness, they had a hair analysis done "that showed some heavy metal contamination and calcium levels in the blood at the low end of normal." (At the same time, here, it should also be noted that the accuracy of hair analysis can be controversial due to alleged problems with repeatability.) How much effect a low-normal calcium level (assuming it persisted, or existed in the first place) may have had at the time of the rickets diagnosis is not discussed.

Elimination of problem using supplements/animal foods demonstrates insufficiency of diet, regardless of exact cause. What all of this does suggest, however, is that regardless of the exact cause
of the rickets, the particular diet followed--one fairly typical of the way more recent and more sensible Natural Hygiene vegan diets are generally practiced--was insufficient in some way. Our critic seems to ignore this general point. Another criticism by the same raw-food vegan critic who took a dim view of our example here was that "fats will impede mineral absorption as well. Babies fed on early formulas with too much linoleic acid [a fatty acid] become anemic." This speculation, as it turns out, is easily refuted by the father, who mentions in passing at one point (in issue #23 of the M2M) that the family was once threatened with having their son taken away by the authorities "because we wouldn't feed him formula and baby rice."

Tendency is to rationalize as to speculative possibilities while ignoring probability/plausibility. To wrap up this example, it's worth pointing out that these kinds of speculative
rationalizations are heard all the time in the vegan community, and are a good illustration of how dietary extremists are almost without fail more interested in finding ways "to explain things away" than in simply accepting that these things can and do happen on vegan diets. No matter how strong the example, you can count on an extremist first looking for a way to rationalize it in preference to dealing with it on its own terms. We see here another prime earmark of the idealist's (dubious) logical method of criticism and pep-talking: engage in speculation as to numerous possibilities that redirect attention away from the critic's favored regime as the cause (deflecting attention from a more direct interpretation of the evidence), without regard for probability, reasonableness, parsimony, and plausibility in arriving at the most likely explanation given the evidence. The lesson from our example of rickets above, again, is that even if we cede that we may not be able to pin down the proximate cause with absolute certainty in such an anecdotal example (even one verified by mainstream doctors), nevertheless the remedy of adding animal products in the form of goat dairy, plus a vitamin D and multivitamin supplement, solved the problem. The unavoidable implication is that the diet was deficient in some way, whatever those deficiencies or their causes may have been.

Special measures to make vegan diets work can be seen as compensations for lack of evolutionary congruence. As outlined in detail in Part 1 of these interviews, the evolutionary evidence
makes it clear that vegan diets are not the original, natural diet of humans, and never have been at any point in our history. Nevertheless, we can cede the point that with the right kind of manipulations, it may be possible in some number of cases to compensate for the unnaturalness of leaving animal products out of the human diet by instituting various special practices within a particular vegan diet.

That even informed advocates acknowledge vegan diets should be carefully planned suggests that less margin for error is a real issue. Unfortunately, however, as we see above, some
individuals simply don't do well on vegan diets even when conscientious attempts may be made. Often, more scientifically oriented vegan advocates will object in such cases that the diet wasn't "intelligently planned" enough. Yet if one must go to such lengths to be so conscientious and careful, what does this suggest? The most logical conclusion is that there isn't as much margin for error on vegan diets in terms of nutrient deficiencies, while with the inclusion of animal products--a part of the natural human diet--or with the addition of artificial supplements to compensate for their absence, there is. (I do not mean, of course, to suggest that milk is ideal as an animal food. In view of the evolutionary and clinical evidence regarding potential long-term problems due to high saturated fat levels in dairy (among other things), flesh food would be the logical choice as the animal food we are most fit to eat. However, the vegetarian way of life does not permit this, so dairy (or eggs) often becomes the only choice available for those vegetarians who are willing to consider any animal products at all.)

*** The yo-yo syndrome as potential indicator of failure to thrive on strict diets:
One behavioral type of symptom that may be experienced on raw-food diets, and that was not discussed at much length, is what could be called the yo-yo syndrome. Although in extreme cases this can be expressed as binge/purge or alternating fast/feast behavior, more typically it takes the form of on-the-wagon/off-the-wagon behavior. Those for whom vegan or raw-food diets don't work well may blame themselves as a consequence of their idealism, and go through a continual cycle of try/fail, try/fail, try/fail, attempting to get the diet to work no matter how many times it is attempted. Typically, guilt feelings ensue and become reinforced by such behavior, and blame is then laid on "addictions" or lack of self-discipline. Since this type of behavior could potentially manifest itself with any diet that one puts up as an ideal different from past habits, however, I do not know at this time of any foolproof way to tell for sure in any particular case whether it is indicative of nutritional insufficiency as opposed to "lack of will," so to speak. However, if it occurs in conjunction with one or more of the physical symptoms mentioned in the interview, one would have to consider the strong possibility that the behavior is biologically driven.

Viewing the yo-yo syndrome as valuable feedback to help pinpoint problems breaks the cycle of guilt over so-called "lapses." At the least, one can use the presence of the behavior as a
feedback signal that one is either not physically satisfied by a diet, or alternatively that something about one's implementation of the diet is not satisfying innate human needs for gustatory and emotional satisfaction from the foods eaten, over and above their nutritional sufficiency. (Anyone who has experienced a fast of any length of time can attest to the fact that food and eating satisfy not just physical needs, but also "feed" a certain human need for psychological or oral satisfaction in eating; not to mention the role that food and "eating time" play in structuring a "day in the life" of human behavior.) A "biological" view of this behavior, of course, would point out the likelihood that on-the-wagon/off-the-wagon behavior may have little to do with "discipline" problems or "addictions," but could be the simple result of one's body forcing them to eat other things in spite of their determination not to, in order to get needed nutrients not available (or not in high enough quantity) from the "allowed" items in the diet. It also needs to be pointed out that even where one is following a prescribed diet that might be supposedly "right" for them (for instance, perhaps an evolutionary-type diet such as we have discussed here), people still have individual needs that they will often have to arrive at individual solutions for.

Unhooking from guilt frees attention to seriously consider and evaluate practical solutions one may have been blind to before. A different approach, therefore--no matter what one's dietary
philosophy or knowledge--is that one can instead ask: What kind of drive toward satisfying some physical, emotional, or psychological need may have pushed them into the "lapse"? The trick here, of course, is to figure out what the "indulgence" may be signaling. But at least one is no longer blinded by the self-defeating cycle of guilt and recrimination, which frees up one's vision to look at serious alternatives. For example: Is there something in the food, or about the food itself, that the body may be craving even if one has been conditioned to believe the food is "bad"? Or, even if the food is bad, is there something in it that one is attracted to in attempting to satisfy some other legitimate need one could get from another, better food they have been avoiding? Perhaps there is even something about the experience of eating the food or the particular psychological satisfaction it brings, rather than strictly the dimension of nutritional considerations, that one finds satisfying about it: for instance, texture, taste, the bodily feelings generated after eating it, etc.

Experimental attitude requires new mental relationship with the question of "certainty."
Taking this perspective can be a way out of the self-blame and reactiveness of guilt, or self-destructive and futile "discipline," into a mode where one uses their behavior as a kind of feedback guide to better satisfy themselves on the way to reaching better health. Doing this, however, requires taking an experimental attitude and being willing to temporarily suspend the tendency toward instant judgment or wanting instant certainty, and to instead "live in the question" while realizing some answers in life require that one find them on their own without the assurance of an external authority.

*** Why is being "hungry all the time" on a veg-raw-food diet such a problem for some individuals, even in the short term before any deficiencies could have arisen?

Diet is lower in overall nutrient/energy-density than the one the human body evolved on. With the benefit of recent studies indicating that the human gut evolved to be less energy-intensive in performing its functions, and more dependent on higher-density, higher-energy foods with higher protein and fatty-acid content (much of them animal in origin) to support the evolution of the large, energy-intensive human brain (see postscript to Part 1), it appears we have a very likely explanation for why so many raw-foodists are so hungry all the time: they simply aren't eating enough concentrated food. (While it is true that some people's bodies seem to be able to functionally adapt to this to one degree or another, or they may be blessed with more efficient metabolisms to begin with, there are many more who can't make a go of it very well.)

Human digestive system not optimized for maximum extraction of nutrition/energy from a diet of all high-fiber foods. In avoiding animal foods and--if totally raw-foodist--also many of the more
concentrated vegetarian foods like cooked grains, legumes, tubers, etc., such individuals have relegated themselves to eating a diet significantly lower in overall nutrient and energy-density. (See The Calorie Paradox of Raw Veganism for a detailed discussion and analysis of the energy-density aspect of this equation.) When you combine this with the fact that the human gut is also not as efficient in extracting nutrients from these higher-fiber foods as it is from denser foods (which is not to say sufficient amounts of fiber do not have their rightful place as a valuable component of the diet), you have a good recipe for the urge so many raw-foodists have to eat large volumes of food, while still remaining hungry much of the time. The binge/purge or fast/feast behavior some raw-foodists get themselves into also becomes understandable as a confluence of this underlying metabolic cause combined with mental overidealism. Thus, while the resulting syndromes might be similar to bulimic-type behavior seen in more mainstream dieters, the underlying causes are somewhat different.

Eating like a gorilla leads to a life centered around food like a gorilla. It is also worth noting that
even gorillas, who have the kind of intestinal flora and fauna to handle a very high-roughage diet more efficiently, still spend considerably larger portions of their day eating than humans normally do. Humans, by contrast, have been evolving away from this to some degree, towards a more active social existence, which itself takes time. By attempting to "eat like a gorilla," one to some extent has no choice but to spend more of their day eating (or thinking about it, if they are going hungry). This in itself tends to lead to a food-centered existence that breeds obsession, which is reflected in the perception among others that raw-foodists often become fixated on food to the detriment of other aspects of human life.

*** Becoming highly dependent on "mainstay" foods in a veg-raw diet:

The frequency of dependence on avocados and/or nuts is explained by the human digestive system's design for denser foods. The smaller gut/dense, fattier foods/large-brain connection that arose during evolution also goes a long way toward explaining why many veg-raw-foodists may become heavily dependent on foods like avocados and nuts to make the diet actually work. Many raw-foodists have something of a guilt complex about the large numbers of avocados and nuts they often eat. Intuitively they seem to feel that their diets contain disproportionate amounts of these items, yet at the same time they cannot stop eating them. This dependence (sometimes not even recognized as such) may lead to guilt feelings, or may become expressed in strictures or rules meant to clamp down on what are felt to be "gluttonous" urges toward these foods. This is especially true in this day and age, when the high-carbohydrate, low-fat gospel reigns supreme, casting something of a pall over the joys otherwise to be had for vegetarians in eating these fattier, hunger-satiating foods.

Raw-foodist eating patterns and difficulties are predictable/understandable given evolutionary design of human gut. Yet from the perspective of the evolution of the smaller human gut geared toward the digestion of denser, fattier foods (in the past, primarily animal flesh, of course), it is easy to see why raw-foodists often become highly dependent on avocados and nuts in their diet and suffer without them. The only other alternatives--both ruled out for raw-foodists--are to eat cooked grains, legumes, and tubers for more concentrated nutrient intake, or dairy products or eggs. Those having been eschewed, the choice occurs almost by default--indeed, analyzed from the perspective of the evolution of the human diet and digestive system, it is almost a foregone conclusion given the restrictions veg-raw-foodists live under.

Once all other, more energy-dense foods are eliminated, "fruitarianism" is the logical/inevitable outcome. To make matters worse, many raw-foodists living under the above strictures
and who actually do try to either eliminate nuts and avocados, or at least seriously limit them, understandably tend to find the only remaining somewhat concentrated foods--fruits--looming large as attractive taste treats, and often overindulge in these otherwise fine foods. But when made too large a part of the diet--and given that modern fruits have been bred for higher sugar contents than their ancestors--this can lead to another set of problems resulting either in hyperinsulinism or sugar addiction. Of all the varieties of vegetarians and "living foodists" and raw-foodists, it is the so-called "fruitarians" who get into the most health trouble the most quickly. And again, it is not hard to see how this can happen with the progressively more limited set of foods allowed under the vegetarian raw-foods philosophy. It is not just a matter of the "ethical" consideration fruitarians put forth as a motivating factor--that fruits are the only "karmaless" food, obtainable without harming the plant that bears it. The digestive, metabolic (energy-producing), and satiation (hunger-satisfying) characteristics of foods, playing out under the dietary limitations one lives with, in fact inexorably lead to these behaviors in logical, understandable ways.

The fallacy of fruitarianism: word games vs. the real world of practice and results
The following material on fruitarianism had to be cut from the original print interview due to space restrictions, but is included here on the web in expanded and rewritten form, since fruitarianism has proven to be a seductive philosophical and behavioral trap that those who are particularly prone to idealism or extremism within the vegetarian movement sometimes fall prey to. It can and often does lead to serious and long-lasting health consequences. (One unfortunate member of the Natural Hygiene Many-to-Many, for example, lost all their teeth as a result of a high-fruit diet, though the potential problems of high-fruit diets extend considerably beyond just loss of teeth.)

H&B: Let's hear more about the fruitarianism question--with regard to humans and apes both. I know you have plenty of opinions on the thinking and behavior of those subscribing to this particular system of diet.
W.N.: Well, first there is the problem of defining just what you mean by the words "fruit" and "fruitarian." There is a lot of gamesmanship, sleight-of-hand, and word redefinition that goes on among fruitarian advocates to redefine "fruit" away from the common definition (soft, pulpy, sweet, juicy fruits from tree or vine) so that it includes the so-called "vegetable fruits" like peppers, cucumbers, tomatoes and the like, or "nut-fruits" and so on, so as to broaden what is considered "fruitarian." In a botanical sense, these foods can be considered fruits, and thus--if we stretch things a bit--perhaps "technically" permissible in what might be called a "fruitarian" diet. The problem, however, is that most fruitarians don't even stop there either. Most go further and allow or even specifically recommend "greens" and/or "green-leafed vegetables" as essential, and of course neither of these qualify as fruit even in the botanical sense. Once you get this far, any sense of integrity about what "fruit" really means has been sacrificed to the realm of fast-talking slipperiness.

Word games over what qualifies as "fruit" usually expand the definition so far that the distinction means little. But you have to look at the fact that most people don't normally think of these other items as fruits unless they are trying to wiggle out of the straitjacket the normal definition creates when you really start to think about being able to survive on nothing but fruits alone. So if one wants to play games and define a fruit as almost anything under the sun that is a "seed-bearing mesocarp" or whatever, well, fine, but in my view that's losing touch with the reality of what people mean in common parlance and mutual understanding of the language. Certainly, if you define fruit broadly enough, as a botanist might, you may be able to make it as what I call a "technical fruitarian" (meaning extremely technically defined). We should recognize that this is just a game, though, because it's a moving target that people trying to be fruitarians use so it won't bother their conscience to use the label. Frankly, the way fruit is sometimes defined by fruitarians (to include nuts, seeds, greens, and/or green vegetables) doesn't really distinguish the diet much, if at all, from an all-raw version of a Natural Hygiene diet of fruits, vegetables, nuts, and seeds.

Were our evolutionary primate predecessors really true "fruitarians"? If you are defining fruit
the way it is commonly defined as relatively high-sugar-content, juicy tree fruits plus things like melons, berries, and so forth, then we have something more concrete to talk about. As far as the normally defined fruits go, there is a partial grain of truth to apes as fruitarians, and for the ancestor common to us both, but only if you remember we are talking about what the greatest percentage of the diet was. While over half may have been fruit at one time (perhaps with the gracile Australopithecines 3-4 million years ago, although I don't know of any data to confirm this with any surety), there were also significant portions of other foods. To really make a case for anything approaching fruitarianism as ancestral to humans, you would have to go back to at least the common ape-like ancestor ancestral to both chimps and humans approximately 7 million years ago, and even this would not have been total fruitarianism. The closest approximation to fruitarianism (and not completely so, at that) in an ape-like species might have been in the time period of around 40 million years ago, but you have to realize these creatures were apes and not human. Humans themselves (the genus Homo, beginning with Homo habilis over 2 million years ago) have never been fruitarian. Even chimps, who are the nearest modern ape species to fruitarianism, do not eat above 2/3 fruit or so in the case of the common chimp, or perhaps as high as 80% in the case of the bonobo chimp--in either case still a significant amount short of 100%, especially when you consider they both also eat leaves, pith, insects, and a bit of meat too.

Simply put: Humans are not apes. As a number of Natural Hygiene practitioners have noted,
however, few humans can eat even 2/3 fruit over long periods of time without getting into serious difficulties. We had several M2M folks try near-fruitarian diets, and no one had any lasting success with it, although some did fine for several months at first, perhaps even a year or two. In fact, those we knew of in the M2M reported getting into trouble trying to do so, and later regretted their naivete in attempting it, due to the problems that eventually followed. The two most common repercussions of long-term attempts at fruitarianism are that the teeth are usually the first to go, then people's blood-sugar processing abilities, along with deficiencies.

People may do well at first, but this is because they are living off of past nutritional reserves, and when the stored reserves run out, the game's over. This is a theme we've probably beaten to
death here, but it warrants repetition, especially with regard to fruitarian diets: It is not enough for a diet to be "clean"--it must also be a sufficient diet. Fruitarianism and near-fruitarianism are the worst possible case, because in addition to progressive long-term deficiencies, the body's insulin-production capabilities are being simultaneously overwhelmed with the high carbohydrate load in the form of higher glycemic-index foods containing simpler sugars like glucose, sucrose, and fructose.

Advocates of "fruitarianism" frequently change their definition of it over time. Most people
who initially promote a total fruitarian diet are forced by experience to back off and begin allowing the use of some nuts, seeds, and green vegetables. This may extend the period over which a "fruitarian" can maintain their regimen, but it doesn't remove the underlying problem of the long-term consequences of excessive sugar consumption and/or hyperinsulinism, not to mention the low intake of B-vitamins, certain minerals, etc., that is likely to result if the diet is continued long enough. (Although the following is just speculation, and there may certainly have been other potentially causative factors, it is worth considering that the relatively early death of near-fruitarian advocate T.C. Fry at age 70 recently--from atherosclerotic plaques in his legs that led to a coronary embolism--might possibly have been due at least in part to hyperinsulinism, which can promote atherosclerosis and heart disease. See Chet Day's investigative article, "The Life & Times of T.C. Fry," at the Health & Beyond website for more on Fry's life and the events leading up to his death. Caveat: you will need Adobe Acrobat Reader to read the PDF file you'll be served.)

Noteworthy 1970s-era exposé of numerous alleged fruitarians found no successes, and widespread misrepresentation of diets actually eaten.

4-part "Fruit for Thought" article series, by the American Vegan Society's Jay Dinshah. An
interesting little piece of Hygienic history here that most are probably unaware of is that Jay Dinshah, a long-time vegan who was previously a staffer with ANHS (the American Natural Hygiene Society) many years ago, published an extensive article series on fruitarianism in the late 1970s, titled "Fruit for Thought," in his own newsletter Ahimsa--the most in-depth piece of journalism on the subject I have ever seen. [Dinshah 1976-77]

Fruitarian gurus weren't actually practicing what they preached, but followers who did ran aground. As part of the series, Dinshah told of his investigations, ranging over many years, of all the
fruitarian advocates he knew of at the time claiming to live on fruitarian diets. He himself had originally thought fruitarianism theoretically possible. But over the years, what he discovered was that none of them--and this included famed fruitarian advocates like Johnny Lovewisdom, Walter Siegmeister (pen name Raymond Bernard), and Viktoras Kulvinskas--were actually living on fruitarian diets, even as they defined the diet for themselves. Yet there were many others Dinshah met who had taken the advice of these people quite seriously and gotten themselves into very serious health trouble.

Excuses, excuses. Lovewisdom was eating plenty of vegetables but made the excuse his orchards and
gardens in the Ecuadorian tropics were not in the right place to enable the fruit to be of the quality to support him in good health. Breatharian advocate Siegmeister was living not even as a fruitarian but as a vegan. And Kulvinskas was basically eating as a garden-variety raw-foodist vegetarian with periodic lapses back to cooked food--while he had previously championed liquitarianism (juices only), fruitarianism, and even breatharianism. [I am not aware of what Kulvinskas' living-food-type recommendations and practices are these days, many years later.] Through another route, I've seen reporting on Wiley Brooks [Mapes 1993] a reputed, so-called "breatharian," who readily admits to eating Big Macs and Twinkies now and then, but claims those are not what actually sustain him--it is really the air sustaining him in spite of what he is eating! (Now there is some truly creative, totally untestable, idealistic logic for you!)

Failures the rule, no successes ever came to light. Of the people who had lived on truly restrictive
(soft/sweet, tree-fruit-type) fruitarian diets, Dinshah's article series brought to light that their practices had put them in a bad way health-wise. Also included in Dinshah's article series was the substance of a conversation he had once had with Dr. Gerald Benesh [who has been previously interviewed in Health & Beyond] who reported that "in all his years of practice, in fasting, rebuilding, and advising people in even that wonderful climate [near Escondido, CA in the 1960s] and with the fine fruits available in the area and from below the Mexican border, he had nonetheless never been able to bring even one of his people to the point where they could live in good health on just fruits." [Dinshah 1976-77, Part 1, p. 7]

Little change in fruitarian movement between now and then. As well, Dinshah took a long, hard
look in his series at the rationale and reasoning behind fruitarianism as published in books on the subject at the time--echoing claims and propositions very similar if not identical to what we hear today--and found the literature of the time rife with logical as well as outright factual errors and assumptions. (For a look here on Beyond Veg at these types of errors, see the material available in Selected Myths of Raw Foods, and in our Waking Up from the Fruitarian Dreamtime section.)

Notable Natural Hygiene practitioner at the time related specific problems seen in numerous fruitarian patients. Printed in tandem with the article series was also a separate reprint of an
article by Benesh himself [Benesh 1971] that had previously been published in Herbert Shelton's Hygienic Review, detailing a few of the serious problems he had seen develop in people he had cared for who attempted a fruitarian diet--even on high-quality fruits available in season--for more than a few to several months. Benesh listed the following symptoms of people on long-term fruitarian diets that he had seen in his own Natural Hygiene practice, which we should note are not so very different from those mentioned earlier in this interview for the majority of other total-raw-foodists who experience long-term troubles:

[R]idged nails, gingivitis, dental caries, dry skin and brittle hair, lowered red blood cell count and low hemoglobin percentage. Over a long period of time (at least one year or more) the blood serum level drops to the point of an impending pathological state if not corrected. Many of them display serious signs of neurological disorders, while some experience emotional upsets and extreme nervousness and often complain of insomnia. When their nutritional program is corrected these signs disappear and the patient finds himself in a much improved state of health. I recently spoke with a health-minded medical doctor, who embarked on this lopsided program and did very well, experiencing a high state of health for about a year, when almost suddenly a loss of weight was experienced and neurological signs were evident. This doctor took a series of blood and serum tests plus other pertinent tests, which verified what I have observed in fruitarians and excessive fruit-eaters, and corroborates my findings. Another cardinal lack that occurs quite often is a distinct lack of vitamin B-12. This lack of B-12 gives rise to the neurological signs that indicate a serious deprivation of this vital element needed to keep the nervous system operating at a so-called normal level.

Potential explanation for digestive difficulties in long-term fruitarians. Long-term attempts at
fruitarianism can also lead to other problems. According to Benesh, the excessive amounts of organic acids from a fruitarian diet cannot be adequately buffered by the digestive system, with the result that these acids end up in the colon, where they interfere with the proper balance and/or amount of intestinal flora. The ultimate result can be compromised bowel function, which in a few cases can result in serious health problems such as colitis if the person persists on the diet. Benesh theorizes that the excessive acids cause salts to be drawn from the cells of the colon, resulting in flaccidity and interference with peristalsis. (Note that if a similar process occurred in the small intestine as well, this might be a possible explanation for why extended attempts at fruitarianism sometimes result in the apparent inability to digest so-called "heavier" foods [nuts, cooked potatoes or rice, etc.], which may pass through the digestive tract relatively little digested, as reported by some fruitarians.) Benesh also notes that hemorrhoids are not uncommon, and that fecal analysis may reveal occult blood as a result of minute petechial hemorrhages of the small blood vessels in the mucosal lining of the intestine.

Beyond the myth of fruitarianism is the empowerment that freedom from fantasy brings.
Fruitarianism is a myth that dies hard, but the price of the illusion is having to learn the hard way. If the reality seems disillusioning, consider what is gained instead: freedom from a deceptive, if well-intentioned, fantasy that ends up hurting people. Freedom from fibs that people tell themselves and others that only compromise the capacity for self-honesty and the ability to more objectively assess how their health is actually being affected.

When evaluating claims, look beyond the word games. What you should do when you encounter
those advocating fruitarianism is to look at what they actually eat. To find out the truth, look at what they do, not what they say. So far, what has inevitably been found is that such fruitarians are either in poor health, if they have truly been eating solely sweet and succulent fruits for any length of time; or, if in seemingly good health, they invariably include other foods that the rest of us would consider vegetables or "greens," and most often nuts and avocados as well (which are hard to do without and still obtain sufficient fat and protein on this kind of diet). In some cases the game-playing can reach rather absurd levels, where even things like sea vegetables and dulse, or steamed potatoes or rice on occasion (eggs have even been called "hen-fruit" before), are somehow defined as fruits or technically "permissible" in a fruitarian diet. So when you find fruitarians splitting hairs over what technically qualifies as a fruit, you should be honest enough to recognize fruitarianism for the word game it actually is, instead of getting carried away by idealistic fantasies. There is no empirical support for successful fruitarianism (as it would be commonly thought of) either in hominid evolution or, just as importantly, in the real world of results for people who have tried it. If you run across people who do claim to be successful, you should be very skeptical, since fibbing, prevaricating, and word-gamesmanship have proven upon investigation to constitute a necessary pillar of support in the lore and history of fruitarianism.

A more "evolved" path, or only more extreme? So if it is true that the "successful" fruitarians are so
only because they define fruit very broadly as any "seed-bearing mesocarp," what real difference does it make? Why are they so concerned to define themselves as one? In the end, it usually comes down to an ideal that fruitarianism is somehow more physically "pure" or more spiritually "evolved" than garden-variety vegetarianism. People want to think of themselves as special (which in itself is natural and understandable, of course), and fruitarianism is painted as being the "next step" in dietary evolution. What it really is, however, is a next step toward extremism, philosophical dishonesty, and ultimately failure of health.

Judge the diet, not yourself--by bottom-line results, not high-sounding philosophy. Think very
seriously before giving credence to those who claim the mantle of fruitarianism or other extremist/idealist dietary programs, and especially before putting your health at risk based on their advice. Instead, put the power of reality-based thinking and knowledge to work in opening up workable avenues for dietary experimentation based on honesty. Rather than judging yourself and your behavior by a dietary philosophy, turn the equation around and evaluate the worth of a diet by the concrete results it gives.

The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance
by Loren Cordain, Ph.D.
Copyright © 1999 by Loren Cordain. All rights reserved.

Introduction: The principle of evolutionary discordance
To set the context for this discussion, let's first briefly recap the basic evolutionary processes to which all species are subject. These principles are fundamental to understanding the repercussions of grains or any other food on the genetics that govern human biology.

• Mutations in genes cause variability in the population-wide genetic makeup of a species.
• Selective pressures due to environment (which includes diet) cause individuals with genetic makeups most suited to that environment to thrive the best under those selective pressures.
• These individuals achieve better rates of survival and differential reproductive success, thus leaving behind more descendants compared to those without the more advantageous traits.
• This process is iterative (repeats itself) with each generation.
• Thus with repeated generations, the characteristics of organisms in a population become more finely tuned to their environment, which, again, includes the diet they habitually eat.
• After a point, assuming environmental factors are not themselves undergoing great change, the population(s) making up a species will tend to reach a point of relative genetic equilibrium. That is, the species' genetic makeup remains fairly stable, as well-tuned to its environment as it is likely to be, though including some variability due to geneti