Saturday, May 28, 2011

Who Said Paleo Diet Was High In Fat? Part 2.2

In her editorial comments and letter to the editor on the Cordain et al paper on plant-animal subsistence ratios of worldwide hunter-gatherers, Katherine Milton also made several other comments casting doubt on the idea that evolutionary human diets contained low proportions of plant foods and carbohydrate, and high proportions of animal protein and fat.  Rather than quote, I will paraphrase:
1.  Humans come from an ancestral lineage (i.e., primates) in which plant foods have traditionally served as the primary source of energy.

2. The human gut displays a protracted transit time, averaging 62 hours with low-fiber diets and 40 hours with high-fiber diets.  “In striking contrast to humans and all great apes, all extant Carnivora show a rapid turnover of ingesta. For example, a 370-kg polar bear takes ≈24 h to digest a seal carcass.”

3. “To date, few genetic adaptations to diet have been identified in humans, suggesting that, in their evolution, humans tended to resolve dietary problems primarily by using technology rather than biology.”

4.  “The technologic abilities of humans derive from their unusually large, complex brain, a brain that, under normal conditions, is fueled by a steady supply of glucose. Consumption of digestible carbohydrate is the most efficient way for humans to obtain glucose for brain function. Potential alternatives—gluconeogenesis or the use of ketones to fuel the brain—represent alternative, more costly metabolic solutions.”

Our ancestral primate lineage consumed, so far as we can tell, a plant-dominated omnivorous diet.  This does not mean that humans should eat like other extant, closely related primates (and Milton does not suggest that we should).  We don’t have exactly the same digestive physiology as chimps or gorillas; we don’t have the equipment required for a raw plant-dominated omnivorous diet. 

However, as Milton notes, we also lack the digestive physiology of the top carnivores.  Even on a low-fiber diet, owing to our proportionately long small intestine, food takes more than 2.5 times as long to pass through the human gut as through the guts of extant carnivores.  In this respect, the human gut functions more like that of other primates than that of top carnivores.  This strongly suggests that the human gut remains adapted to a plant-dominated diet.

As well, the fact that humans produce more salivary amylase, and carry more copies of the AMY1 gene that codes for amylase production, than other primates also suggests that evolutionary diets contained substantial amounts of starch.

Although we have clear evidence that ancestral humans succeeded in expanding the animal-source component of their diets and good reasons to believe that this may have played a role in expansion of the human brain by providing neural fatty acids, this evidence does not tell us that ancestral humans had abandoned the primate tradition of plant-dominated nutrition.  
The expansion of the human brain through evolution does not necessarily indicate a move to a diet composed largely of grassland animal meat and fat.  The idea that “meat-based diets support brain development better than plant-dominated diets” doesn’t gain much support from comparative anatomy.  Primates on largely vegetarian diets build larger brains (relative to body mass) than other species.  Chimps living on 95 percent plant-food diets have a body:brain ratio (by mass) of 100:1.  Elephants eating plant-dominated diets have a ratio of 851:1.  Tigers living on virtually 100 percent grassland animal food have a body:brain ratio of 1000:1.

Thus, the 95 percent vegetarian chimpanzee has, on a relative basis, a brain 10 times larger than the carnivorous tiger.  Moving the diet in the direction of the tiger’s (predominantly land-animal flesh and fat) would seem to favor a smaller, not larger, brain.  Notably, children recovering from malnourishment appear to build normal brains on diets containing as little as 8 grams of animal protein per day (one-third of the total protein requirement) if provided adequate essential fatty acids.[1, 2, 3]
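The comparison above reduces to simple ratio arithmetic. A minimal sketch, using only the body:brain ratios cited in the text:

```python
# Body:brain mass ratios cited in the text (body mass as a multiple of brain mass).
BODY_TO_BRAIN = {
    "chimpanzee (95% plant diet)": 100,
    "elephant (plant diet)": 851,
    "tiger (carnivore)": 1000,
}

def relative_brain_size(species_a: str, species_b: str) -> float:
    """How many times larger species_a's brain is than species_b's, relative
    to body mass (a higher body:brain ratio means a smaller relative brain)."""
    return BODY_TO_BRAIN[species_b] / BODY_TO_BRAIN[species_a]

factor = relative_brain_size("chimpanzee (95% plant diet)", "tiger (carnivore)")
print(f"Chimpanzee brain is {factor:.0f}x the tiger's, relative to body mass")
```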

Some have suggested that consumption of brains and marrow supported development of hominid brains in evolution.  Of interest, Taï chimpanzees regularly eat the brain, eyes, and marrow of long bones of colobus monkeys [4 pdf] and presumably have been doing this for millennia, but the chimp’s brain is only one-third the size of a human’s.  Perhaps some other nutritional factor supported hominid brain expansion?

Dolphins have a brain slightly larger than the human brain (1.8 kg vs 1.4 kg) and a body:brain ratio similar to humans (dolphins, 42:1; humans, 56:1).   For development and function, the human brain requires specific vitamins (B12, folate, B-complex) and minerals (iodide, iron, copper, zinc, and selenium) in addition to essential fatty acids.  On a weight basis, shellfish, eggs, finfish, pulses, and cereals all provide greater concentrations of these minerals than meat.[5, pdf]

From these data, using single foods, a human could satisfy all requirements for brain-selective minerals by eating 2 pounds of shellfish, 5 pounds of eggs, 8 pounds of fish, or 11 pounds of meat.  On this basis, one can imagine that adding only small amounts of seafood to a plant-dominated diet would provide a hominid with adequate nutrition for building a large brain.

The evolutionary expansion of the brain actually increased the demand for glucose provided by plant foods.  Nonhuman primate brains use only 8-9% of resting energy expenditure, but the modern human brain uses about 20-25% of resting energy expenditure and two-thirds of all glucose used by the body, while the mass of the human gut is only 60% of that expected for a similarly sized primate.  Although the brain can adapt to a large extent to using ketones instead of glucose, this is an inefficient way to fuel the brain and appears to stimulate the stress (fight-or-flight) response.  For example, the ketogenic diet’s antiseizure effect appears to depend on stimulation of the sympathetic nervous system, since disabling norepinephrine abolishes the antiseizure effect of the diet. []

The nervous system and red blood cells use an estimated 150 g of glucose daily.  Non-neural tissues will also use glucose for fuel when provided.  The conversion of amino acids to glucose unnecessarily burdens the liver with ammonia to detoxify, and the kidneys with sulfuric acid to eliminate.  Therefore, for smoothest operation, humans need food that supplies large amounts of glucose in a compact and readily usable form (i.e. not amino acids).  This is supported by the beneficial effect that high-carbohydrate diets appear to have on mood [7], and reflected in the fact that the body preferentially burns glucose as fuel and stores glucose as glycogen in the liver and muscles rather than as fat.[8, 9] Animal tissues are universally low in glucose, and therefore do not serve as the best substrate source for brain metabolism.
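A rough sketch of why gluconeogenesis is a costly route to that 150 g of glucose. The conversion factor below (roughly 0.57 g of glucose per gram of protein) is a commonly cited approximation, not a figure from this article:

```python
# Back-of-envelope glucose budget. The 150 g/day figure for the nervous system
# and red blood cells comes from the text.
DAILY_OBLIGATE_GLUCOSE_G = 150

# ASSUMPTION (not from the article): a commonly cited approximation is that
# ~0.57 g of glucose can be derived from 1 g of protein via gluconeogenesis.
GLUCOSE_PER_G_PROTEIN = 0.57

def protein_needed_for_glucose(glucose_g: float) -> float:
    """Grams of dietary protein gluconeogenesis would have to process
    to yield the given grams of glucose."""
    return glucose_g / GLUCOSE_PER_G_PROTEIN

# Roughly 260 g of protein per day -- far above typical intakes -- which
# illustrates why dietary carbohydrate is the more economical glucose source.
print(f"~{protein_needed_for_glucose(DAILY_OBLIGATE_GLUCOSE_G):.0f} g protein/day")
```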

The body also needs glucose as a substrate for the manufacture of a number of other important functional and structural compounds including:

  • Glucuronic acid, which binds to substances to facilitate their transport around the body. In this way glucuronic acid is largely responsible for the elimination of toxins such as drugs, excess hormones, and foreign chemicals.
  • Hyaluronic acid, a constituent of extracellular fluids, needed for synovial fluid, cartilage, and skin.
  • Chondroitin sulfates (a type of glycosaminoglycan), constituents of cartilage
  • Immunopolysaccharides
  • DNA and RNA
  • Heparin (another glycosaminoglycan), an endogenous anticoagulant

None of this means we should “go vegan.”  Animal products of the best quality, particularly seafoods, provide important nutrients not adequately provided by plants, not the least of which is vitamin B-12.  However, as a student of evolutionary nutrition, I find it noteworthy that we have enterohepatic recirculation of vitamin B-12.[10]   As noted by Herbert [8], this can prevent B-12 deficiency from occurring in a previously omnivorous adult vegan for 20-30 years:

“The enterohepatic circulation of vitamin B-12 is very important in vitamin B-12 economy and homeostasis (27). Nonvegetarians normally eat 2-6 mcg of vitamin B-12/d and excrete from their liver into the intestine via their bile 5-10 mcg of vitamin B-12/d. If they have no gastric, pancreatic, or small bowel dysfunction interfering with reabsorption, their bodies reabsorb ~3-5 mcg of bile vitamin B-12/d. Because of this, an efficient enterohepatic circulation keeps the adult vegan, who eats very little vitamin B-12, from developing vitamin B-12 deficiency disease for 20-30 y (27) because even as body stores fall and daily bile vitamin B-12 output falls with body stores to as low as 1 mcg, the percentage of bile vitamin B-12 reabsorbed rises to close to 100%, so that the whole microgram is reabsorbed.”

Looking at this from an evolutionary perspective, why would the body recycle vitamin B-12 so efficiently, but not the other B-complex vitamins?  Logically, nutrient recycling probably represents a response to scarcity and infrequent consumption, so I would guess that this system most likely developed in response to a diet that did not supply a continuous rich daily supply of vitamin B-12, with high B-12 intake occurring only sporadically.

Of further interest, which animal products supply the most vitamin B-12 per serving?  From the Linus Pauling Institute page on vitamin B-12 [11] :

Shellfish, not land animals, appear the richest sources.  An adult human requires about 2.4 mcg of vitamin B-12 daily.  Given the fact that we recycle vitamin B-12 with a net loss of only 2-5 mcg per day from body stores, three ounces of beef or salmon or 4 large eggs daily, 3 ounces of crab once every other day, 3 ounces of mussels once every 3-4 days, or 3 ounces of clams once every ~20 days, would satisfy the requirement. 
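The logic behind those intervals is simple averaging of intake against the daily requirement, made possible by near-total recycling. A sketch, with a hypothetical per-serving B-12 value (the actual values come from the Linus Pauling Institute table referenced above):

```python
# With efficient enterohepatic recycling, B-12 intake need only average the
# daily requirement over time. Requirement ~2.4 mcg/day (from the text).
DAILY_B12_REQUIREMENT_MCG = 2.4

def days_covered_by_serving(b12_mcg_per_serving: float) -> float:
    """Days one serving can cover if intake need only average the requirement."""
    return b12_mcg_per_serving / DAILY_B12_REQUIREMENT_MCG

# HYPOTHETICAL serving value for illustration only (not from the LPI table):
# a shellfish serving supplying 48 mcg would cover about 20 days.
print(days_covered_by_serving(48))
```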

So, reverse engineering from current human vitamin B-12 metabolism seems to suggest that for a very long period of time, long enough to establish our baseline B-12 metabolism, our ancestors infrequently consumed foods rich in vitamin B-12. 

On the other hand, applying the same thought process to modern human carbohydrate (glycogen) and fat storage suggests frequent consumption of carbohydrates and infrequent consumption of fats in ancestral diets.  But I’ll save that topic for a future entry.

Who Said Paleo Diet Was High In Fat? Part 2.1

As I noted in part 2.0 of this series, in 2000, a team including Loren Cordain, Janette Brand Miller, S Boyd Eaton, Neil Mann, Susanne HA Holt, and John D Speth published “Plant-animal subsistence ratios and macronutrient energy estimations in worldwide hunter-gatherer diets” (hereafter referred to as “the Plant-Animal Ratios paper”) in the American Journal of Clinical Nutrition. In their conclusions, they wrote:

“Whenever and wherever it was ecologically possible, hunter-gatherers would have consumed high amounts (45–65% of total energy) of animal food. Most (73%) hunter-gatherer societies worldwide derived >50% (≥56–65%) of their subsistence from animal foods, whereas only 13.5% of these societies derived more than half (≥56–65%) of their subsistence from gathered plant foods. In turn, this high reliance on animal-based foods coupled with the relatively low carbohydrate content of wild plant foods produces universally characteristic macronutrient consumption ratios in which protein intakes are greater at the expense of carbohydrate.”[Italics added.]

In Part 2.0, I outlined the three main propositions Cordain et al attempted to support:

1. Hunter-gatherers would have eaten more animal than plant food “wherever and whenever possible;”
2. Wild plant foods have a “relatively low carbohydrate content;” and
3. The paucity of plant foods available to hunter-gatherers made it necessary for them to focus their hunting on procuring large game supplying large amounts of fat.

In Part 2.0, I showed that anthropologist Katherine Milton provided a number of reasons to doubt that the database used by Cordain et al (Murdock’s Ethnographic Atlas) provides a solid foundation for support of the first of these three propositions. Now I would like to take a closer look at the other two.

So, I think that we have three questions to explore:

1. Did wild plant foods really have a “relatively low carbohydrate content” compared to cultivated (agricultural) foods?
2. Did any or many hunter-gatherer groups have access to significant quantities of wild plant foods?
3. Did hunter-gatherers universally spurn plant foods in favor of animal foods, or did any highly value plant foods even more than animal foods?

Carbohydrate Content of Wild Plant Foods

In the Plant-Animal Ratios paper, Cordain et al wrote:

“We used the average plant macronutrient values of 62% of energy from carbohydrate, 24% from fat, and 14% from protein based on the previously analyzed database of 829 wild plant foods (17). Because of the similarity (3.5% difference) in the mean energy density of wild plant (6.99 kJ/g) and animal foods (7.24 kJ/g) in our database, we assumed that the P-A subsistence ratio based on weight in the Ethnographic Atlas would be virtually identical to the P-A subsistence ratios based on energy.”

Now, as quoted above, Cordain et al described wild plant foods as “relatively low in carbohydrate.” Whenever someone says “relatively” s/he means relative to something else. I will assume that Cordain et al meant that wild foods are low in carbohydrate relative to cultivated foods, not to processed foods, because they are making an argument in favor of mimicking a wild rather than agricultural diet.

So, do wild plant foods supply less carbohydrate than cultivated plant foods? Let’s see.

Two factors influence the carbohydrate delivery of a food: its energy density, and the proportion of energy delivered as carbohydrate.

Cordain et al describe their (mythical) average wild plant food as providing 62 percent of energy from carbohydrate and 6.99 kJ/g (1.7 kcal/g). Does this profile make wild plant food significantly “lower in carbohydrate” than common staple plant foods of agricultural people?

I used the USDA database to find the energy density and carbohydrate content of four higher-carbohydrate cultivars and one lower-carbohydrate cultivar, at least three of which have served as the staple food for at least one agricultural tribe: sweet potatoes, white potatoes, boiled brown rice, boiled lentils, and almonds.

I found the following: Sweet potatoes supply 0.9 kcal/g and 80 percent of energy as carbohydrate, baked white potatoes supply 0.9 kcal/g and 82 percent of energy as carbohydrate, boiled brown rice supplies 1.1 kcal/g (4.7 kJ/g) and 79 percent as carbohydrate, boiled lentils supply 1.2 kcal/g (4.9 kJ/g) and 41 percent of energy from carbohydrate, and almonds supply 5.6 kcal/g (24.1 kJ/g) and 6 percent of energy as carbohydrate. In tabular form, including the average wild plant food figures from Cordain:

Food                          Energy density    % of energy as carbohydrate
Sweet potato (baked)          0.9 kcal/g        80
White potato (baked)          0.9 kcal/g        82
Brown rice (boiled)           1.1 kcal/g        79
Lentils (boiled)              1.2 kcal/g        41
Almonds                       5.6 kcal/g        6
“Average wild plant food”     1.7 kcal/g        62

Thus, the “average plant macronutrient value” used by Cordain et al indicates that the mythical average plant food used by hunter-gatherers had an energy-density almost double that of sweet potatoes or white potatoes and 50 percent greater than boiled lentils. Thus, the "average wild plant food" doesn't appear particularly low in energy (kcalories) compared to any of these cultivated foods except for the low carbohydrate almonds.

As for carbohydrate, the “average wild plant food” delivers 50 percent more carbohydrate as a proportion of energy than lentils, and ten times more carbohydrate (as a percent of energy) than almonds.  (I know, Cordain et al created an average...I'll get to that soon.)

Let’s compare Cordain et al’s “average wild plant food” with white potatoes. A 100 g serving of white potatoes would supply 90 kcal and 74 kcal as carbohydrate, or 18 g carbohydrate. A 100 g serving of the “average wild plant food” would supply 170 kcal and 105 kcal as carbohydrate, or 26 g carbohydrate.

So 100 g of the (mythical) “average wild plant food” supplies about 40% more carbohydrate than white potato. Rather than being “relatively low in carbohydrate” compared with potato, the “average wild plant food” turns out to be relatively high in carbohydrate per 100 g serving.
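The per-100 g figures follow directly from the energy densities and carbohydrate percentages already given, using the standard 4 kcal per gram of carbohydrate:

```python
KCAL_PER_G_CARB = 4  # standard Atwater factor for carbohydrate

def carb_grams_per_100g(kcal_per_g: float, pct_energy_carb: float) -> float:
    """Grams of carbohydrate in a 100 g serving, given a food's energy
    density (kcal/g) and the percent of energy it supplies as carbohydrate."""
    total_kcal = kcal_per_g * 100
    return total_kcal * (pct_energy_carb / 100) / KCAL_PER_G_CARB

potato = carb_grams_per_100g(0.9, 82)    # white potato
wild = carb_grams_per_100g(1.7, 62)      # Cordain's "average wild plant food"
print(f"white potato: {potato:.0f} g; average wild plant food: {wild:.0f} g per 100 g")
```

This reproduces the 18 g vs 26 g comparison, i.e. roughly 40 percent more carbohydrate per 100 g serving for the "average wild plant food."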

Of course, the “average wild plant food macronutrient value” is derived by averaging the values of 829 wild fruits, seeds, nuts, underground storage organs (tubers, roots, etc.), leaves, dried fruit, flowers, gums, and miscellaneous plant parts. Averaging low-carbohydrate, high-energy-density plant values such as from nuts with high-carbohydrate, lower-energy-density plant values such as from tubers, and very low energy density foods like flowers will tend to bring down the “average” carbohydrate content and raise the average energy density. The same applies to cultivars.

If I average together the values above for those five cultivated plant foods, I get a mythical “average cultivated food” that supplies 1.9 kcal/g and 58 percent of energy as carbohydrate, not significantly different from the “average wild plant macronutrient value” used by Cordain et al.   See table below.

Food                          Energy density    % of energy as carbohydrate
“Average cultivated food”     1.9 kcal/g        58
“Average wild plant food”     1.7 kcal/g        62
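The averaging is easy to verify from the five cultivar values given above:

```python
# Energy density (kcal/g) and percent of energy from carbohydrate for the
# five cultivated foods, as given in the text.
CULTIVARS = {
    "sweet potato": (0.9, 80),
    "white potato": (0.9, 82),
    "brown rice (boiled)": (1.1, 79),
    "lentils (boiled)": (1.2, 41),
    "almonds": (5.6, 6),
}

densities = [density for density, _ in CULTIVARS.values()]
carb_pcts = [pct for _, pct in CULTIVARS.values()]

avg_density = sum(densities) / len(densities)   # ~1.94 kcal/g
avg_carb_pct = sum(carb_pcts) / len(carb_pcts)  # 57.6%
print(f"average cultivated food: {avg_density:.1f} kcal/g, "
      f"{avg_carb_pct:.0f}% of energy as carbohydrate")
```

Rounding gives the 1.9 kcal/g and 58 percent figures above.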

In other words, the "average wild plant food" doesn’t appear exceptionally low in carbohydrate or energy compared to the average of these cultivated plant foods.  Cordain et al's "average wild plant food" has an energy and carbohydrate density not significantly different from the average of these five cultivars.

I very much doubt that we would get significantly different results by including 829 cultivars, because we would get a mix of fruits (high carbohydrate, low energy density), vegetables (generally moderate to high carbohydrate, low energy density), grains (moderately high in carbohydrate, energy density similar to brown rice), legumes (low to moderate carbohydrate, energy density similar to lentils), and seeds and nuts (low carbohydrate, high energy density).  Edible plants appear to have broadly consistent energy densities and carbohydrate ratios whether wild or cultivated.  The claim that wild plant foods differ markedly from cultivars in energy density or carbohydrate content itself seems mythical.

Plants Used By Hunters

Cordain et al imply that nature provides relatively small quantities of carbohydrate-rich plant foods, making pursuit of animal fat the route to optimal foraging. In her editorial comments on the Plant-Animal Ratios paper, titled Hunter-Gatherer Diets—A Different Perspective, Milton disagreed, pointing to a number of examples where hunter-gatherers clearly have relied on plant foods as the basis of their diets. From her editorial:

“In the average collecting area of an Aka Pygmy group in the African rain forest, the permanent wild tuber biomass is 4545 kg (5 tons) (19).

"Australian aborigines in some locales are known to have relied seasonally on seeds of native millet (2) or a few wild fruit and seed species (20) to satisfy daily energy demands. Some hunter-gatherer societies in Papua New Guinea relied heavily on starch from wild sago palms as an important source of energy (21), whereas most hunter-gatherer societies in California depended heavily on acorn foods from wild oaks (22).”

So, starchy plants aren't necessarily scarce in hunter-gatherer environments.  Assuming that the wild tubers available to the Pygmies have an energy density of ~1.0 kcal/g, the permanent wild tuber biomass in their forest gives them continuous access to 4,545,000 kcal, constantly regenerated by photosynthesis.  In addition to this, consider the findings of Melissa Darby, described in my book:

“Anthropologist Melissa Darby, M.A., of Lower Columbia Research and Archaeology (Oregon) says that a woman gathering carbohydrate-rich camas could net 5,279 calories per hour. …..In 1996 Darby demonstrated that hunter-gatherers in the Northern Hemisphere had access to Sagittaria latifolia, a prolific wetlands plant that produces a tuber very similar to the white potato, which the Chinook Indians called wapatos. This plant grows in Europe as well as North America, the tuber is easy to harvest, and abundant from late fall through spring, when other high-carbohydrate plant foods may be scarce. Darby has harvested approximately 5,418 calories per hour gathering wapatos from a knee-deep pond. The tubers do not need grinding or mashing to be palatable, and can be cooked fresh, stored fresh in a cool place, or dried. They are thoroughly cooked in a bed of hot ashes in 10 minutes, and do not need stones for long oven cooking. Pollen data indicates the wapato was prolific in the last Ice Age through North America, the North American Great Basin, Siberia, and Northern Europe.(17)”
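As a back-of-envelope check on those harvest rates: assuming roughly 2,000 kcal per person per day (my assumption, not a figure from Darby), one hour of camas or wapato gathering covers more than two person-days of food energy:

```python
# ASSUMED adult daily energy need, for illustration only (not from the text).
KCAL_PER_PERSON_PER_DAY = 2000

def person_days(kcal_gathered: float) -> float:
    """Person-days of food energy supplied by a given harvest."""
    return kcal_gathered / KCAL_PER_PERSON_PER_DAY

# Darby's measured rates: ~5,279 kcal/h (camas) and ~5,418 kcal/h (wapato).
print(person_days(5279))  # one hour of camas gathering: >2.5 person-days
print(person_days(5418))  # one hour of wapato gathering: similar
```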

Five thousand calories per hour seems pretty productive: enough to feed two people for a day, gathered in just 60 minutes.  Darby claims that it is possible to harvest up to 10,000 calories of wapato per hour in some seasons. If so, five hours of foraging would support a family of five for several days, assuming no other food source. Considering the low hazard and low energy investment involved in harvesting wapatos, this looks like optimal foraging to me. Anyway, back to Milton:

“These and similar data indicate that hunter-gatherer societies typically did not rely on many wild plant species specifically for energy. Rather, they had one or a few dependable wild staples (some also good sources of protein) that provided much of their energy needs. In nature, any dependable source of digestible energy is generally rare and when discovered is likely to assume great importance in the diet. Animal foods typically are hard to capture but food such as tree fruits and grass seeds are relatively reliable, predictable dietary elements. …Humans are quick to appreciate the value of reliable energy-providing staples and will work hard to ensure a steady supply of them.….. Contemporary ethnographers working in Amazonia noted that even when smoke racks are filled with game, if the carbohydrate staple becomes exhausted, the inhabitants say they have no food (23).” [Italics added.]

So some hunters of wild game apparently considered their starchy staple “food” and their game not food per se. This sounds similar to the situation in agricultural societies.  In English, we call a main eating event a “meal” which is the word for a starchy food, namely ground grain, e.g. cornmeal or oatmeal. In China, the word for rice or cereals (‘fan’) is also used as the word for food in general, while vegetables, fruits, and animal items are called ‘tsai,’ or "dishes." 

Why would people so consistently, across cultures, consider some starchy plant the main food, and other plants only side dishes?  Could it have something to do with how well whole-food starches satisfy human hunger and nutritional requirements?  A question to explore later.

And from her letter in reply to Cordain et al:

“Examination of the literature suggests that hunter-gatherers throughout the world took full advantage of any dependable sources of dietary energy in their environment (9–11), even devising complex technologies to secure energy from potentially toxic plant sources such as acorns and cycads (10, 11). Such dependable plant foods, in turn, tended to be relied on heavily for dietary energy. For this reason, Cordain et al's comments on the "low carbohydrate content of wild plant foods" seem largely beside the point—what is key is the steady availability of energy from 1 or 2 reliable wild-plant staples. To secure a dependable source of dietary carbohydrate, some hunter-gatherers, such as the Mbuti (Africa) and the Maku (South America), established symbiotic trade relationships with indigenous agriculturalists (12).”

So, it seems we have considerable evidence that at least some hunter-gatherers developed technology for accessing starchy foods, and some went out of their way to secure steady supplies of high-carbohydrate starchy staple foods, apparently unconcerned with remaining "hunter-gatherers" or "paleo."  So, a question to contemplate:  if a hunter-gatherer does not cultivate foods himself, but trades some hunted or gathered wild foods for some cultivated foods, and eats the latter, does he remain a hunter-gatherer, or has he become an agriculturalist?

Milton believes that modern human neural and digestive physiology clearly suggest that human evolutionary diets probably included plenty of plant foods including high carbohydrate starchy staples.  I also have some things to say about this.  I will leave that discussion for Part 2.2.

Study: Metabolic and Behavioral Effects of a Low Fat, High Sugar Weight Loss Diet

What would happen if you put 42 overweight women on a diet supplying only 11 percent of energy as fat and 71 percent as carbohydrate, with half of those women getting 43 percent of their total daily energy (kcalorie) intake from sucrose, plain white table sugar?

I don't recommend eating a 43% sucrose diet, but I do have interest in knowing how sugar affects human metabolism and behavior, as I think that this could help us understand the baseline adaptation of human metabolism.

Surwit et al did the experiment.  The women weighed 130 to 200 percent of their ideal body weights at baseline.  Surwit et al controlled dietary intake by providing the women with all meals and snacks consumed during the 6-week period, along with a Women's One-A-Day multivitamin.

Half of the women ate a starch-based diet and the other half ate the high sucrose diet.  Each of the groups included 12 "whites."  The high sucrose group had 8 "blacks," and the high starch group had 10.

This table shows the nutrient profile of the two diets; one supplied 121 g of sucrose daily, the other only 12 g sucrose daily:
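Those sucrose amounts can be sanity-checked against the stated 43 percent of energy, assuming a total intake of roughly 1,100 kcal/day (my assumption for a controlled weight-loss diet of this kind; the paper's exact energy prescription may differ):

```python
KCAL_PER_G_SUCROSE = 4      # standard Atwater factor for carbohydrate
ASSUMED_DAILY_KCAL = 1100   # ASSUMED total intake, for illustration only

def pct_energy_from_sucrose(sucrose_g: float, total_kcal: float) -> float:
    """Percent of total daily energy supplied by a given sucrose intake."""
    return 100 * sucrose_g * KCAL_PER_G_SUCROSE / total_kcal

print(f"{pct_energy_from_sucrose(121, ASSUMED_DAILY_KCAL):.0f}% of energy")  # high-sucrose diet
print(f"{pct_energy_from_sucrose(12, ASSUMED_DAILY_KCAL):.0f}% of energy")   # high-starch diet
```

On that assumption, 121 g/day works out to about 44 percent of energy, close to the 43 percent stated, while 12 g/day is only about 4 percent.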

You can see that without the multivitamin contribution, the high-sucrose diet has a low nutrient density compared to the high-starch diet.  Of concern to me, with the multivitamin added, both groups had quite high intakes of iron.  High levels of dietary and stored iron appear to promote multiple degenerative, inflammatory, and autoimmune disease processes, as well as aging, by enhancing oxidative activity throughout the body.  Since these diets supplied essentially adequate dietary iron without the supplement, no one without iron-deficiency anemia needed this amount of additional iron; in this study, I would have had the subjects take an iron-free supplement.  Outside this study, I would have taught them to make natural food selections to optimize iron intake.

And this table shows a sample day's menu for each of the groups.

Lots of refined foods...not what I would suggest.  Lots of room for improvement.

The following figure depicts the changes in body weight during the intervention:

Surprise!  Both groups lost weight.  According to this table, the loss included trunk fat and a decline in body fat percentage:

The loss of trunk fat suggests a decline in insulin resistance, since excess trunk fat is a sign of insulin resistance or metabolic syndrome.

Total cholesterol, LDL, and triglycerides also declined, with minimal reduction in HDL:

Both groups had reductions in fasting glucose and marked reductions in levels of norepinephrine, a "fight or flight" neurotransmitter (declining levels of which would indicate reduction of perceived stress and tension):

The authors noted:

"Interestingly, one 53-y-old subject who displayed the typical pattern of markedly elevated plasma triacylglycerol and hyperglycemia at the beginning of the study showed significant decreases in fasting glucose, total cholesterol, LDL-cholesterol, and triacylglycerol concentrations, and an increase in HDL cholesterol during the 6 wk of the high-sucrose diet. These findings are consistent with epidemiologic data (1, 9, 22, 39) and animal data from our laboratory (13) that clearly show that high sucrose or complex carbohydrate consumption does not cause obesity, hyperglycemia, or insulin resistance in the absence of dietary fat."

Wait...with all that carbohydrate and so little fat and animal protein, didn't they feel hungry all the time?  The following table shows the "behavioral" (actually psychical, or mental-emotional) effects of these diets:

Hunger, negative feelings, and depression all declined from baseline in both groups; vigilance and positive feelings both increased from baseline in both groups.  Anxiety declined in the starch-based group, and increased slightly, though not statistically significantly, in the sugar-based group.

The authors noted in the introduction:
"Although there is little theoretical rationale to support the notion that sucrose produces behavioral arousal, there are data to support a theory making the opposite prediction. Carbohydrate consumption, in conjunction with a minimal amount of protein, has been shown to cause an increase in the ratio of plasma tryptophan to large neutral amino acids (28), which in turn is associated with an increase in central tryptophan uptake and brain serotonin synthesis (29, 30). Furthermore, sucrose has a greater effect than starch. (28). This carbohydrate-induced change in central serotonin activity would presumably have a tranquilizing effect as opposed to the exaggerated arousal and hyperactivity typically attributed to sucrose (31). "
Combined with this information, this study's findings that high sugar intakes can markedly reduce norepinephrine and hunger levels are consistent with the Chinese medical view that sugar has yin effects, where yin stands for the overlapping sensory characteristics cool, calm/quiet, soft, and moist.  According to Chinese medicine, this makes sugar a medication for excessive yang conditions characterized by heat, agitation, tension, and dryness; but because it has relatively extreme characteristics, long-term regular use of large amounts will create an excessively yin condition, i.e. excessive coolness, lassitude, weakness/impotence, and moisture (e.g. watery phlegm accumulation, excessive salivation), and a generally deficient condition.  Chinese dietary principles classify whole-food starches as more desirable, more balanced foods, having a balance of yin and yang characteristics that makes them suitable for use as staple foods.  More on that in another article.

Meanwhile, this study seems to question claims that diets high in sugar or starch and low in fat will increase hunger, raise blood sugar, increase lipids, and promote metabolic syndrome in obese individuals.  On the contrary, in this study, increasing the carbohydrate content of the diet, whether by starch or sugar, had the opposite effect.

It seems very unlikely that this would occur in any animal whose metabolism was designed by nature for a low-carbohydrate, high-fat intake.  I would expect a high-carbohydrate, low-fat diet to increase, not decrease, stress and tension in an animal evolutionarily adapted to a low-carbohydrate, high-fat diet.

On the other hand, it suggests a dietary remedy to replace the most commonly prescribed drugs in the U.S.A. -- antianxiety and antidepressant medications.

Can you reverse depression, negative feelings, poor vigilance, and hunger all with one simple dietary shift to increased whole food starch and concomitant reduced fat?

A comprehensive theory of human nutrition and weight management has to have the capacity to explain this study without trying to explain it away.

Thursday, May 26, 2011

Who Said Paleo Diet Was High In Fat? Part 2.0

In 2000, a team including Loren Cordain, Janette Brand Miller, S Boyd Eaton, Neil Mann, Susanne HA Holt, and John D Speth published “Plant-animal subsistence ratios and macronutrient energy estimations in worldwide hunter-gatherer diets” (hereafter referred to as “the Plant-Animal Ratios paper”) in the American Journal of Clinical Nutrition [1].  Based on their analysis of the diets of 229 tribes that they identified as contemporary hunter-gatherers, they concluded:

“Whenever and wherever it was ecologically possible, hunter-gatherers would have consumed high amounts (45–65% of total energy) of animal food. Most (73%) hunter-gatherer societies worldwide derived >50% (≥56–65%) of their subsistence from animal foods, whereas only 13.5% of these societies derived more than half (≥56–65%) of their subsistence from gathered plant foods. In turn, this high reliance on animal-based foods coupled with the relatively low carbohydrate content of wild plant foods produces universally characteristic macronutrient consumption ratios in which protein intakes are greater at the expense of carbohydrate.”[Italics added.]

Specifically, Cordain et al decided that:

“The most plausible (values not exceeding the mean MRUS) percentages of total energy [in hunter-gatherer diets] would be 19–35% for dietary protein, 22–40% for carbohydrate, and 28–58% for fat.”

Although this suggests that hunter-gatherer diets ranged from low- to high-fat (28-58% of energy from fat), this paper has served as support for the idea that hunter-gatherer diets typically had high fat contents, because in it Cordain et al appear to provide evidence for three propositions:

1.  Hunter-gatherers would have eaten more animal than plant food “wherever and whenever possible;”
2.  Wild plant foods have a “relatively low carbohydrate content;” and
3.   The paucity of plant foods available to hunter-gatherers made it necessary for them to focus their hunting on procuring large game supplying large amounts of fat. 

So, does this paper supply good evidence for these propositions? 

Perhaps unbeknownst to many people familiar with this paper, Katherine Milton, professor of physical anthropology at U.C. Berkeley, and author of at least 100 peer-reviewed papers on the dietary ecology of primates, human ancestors, and humans, including “Diet and Primate Evolution [pdf]” in Scientific American  and “A Hypothesis to Explain the Role of Meat-eating in Human Evolution [pdf]” in Evolutionary Anthropology, wrote a critical editorial commentary on the Cordain et al paper (Hunter-Gatherer Diets--A Different Perspective) that appeared in the same issue of AJCN, and a follow-up letter to the editor.  What did she have to say?


Cordain et al reported the procedure used to produce their numbers:

First, they analyzed the data from Murdock’s Ethnographic Atlas (hereafter referred to as the EA) to determine the probable range of plant:animal ratios in hunter-gatherer diets.  From this analysis they concluded that most hunter-gatherers derived more than half (56-65%) of their subsistence from animal foods.

Second, using a database of 829 wild plant foods, they calculated that the (mythical) “average” wild plant food provided 62% of energy from carbohydrate, 24% from fat, and 14% from protein, with an energy density of 6.99 kJ/g (1.7 kcal/g).

Third, unlike Eaton et al, Cordain et al assumed that hunters would consume all edible portions of a wild game carcass, including marrow, adipose, and visceral fats. 

Fourth, they assumed that a constant ~35 percent of the diet of all hunter-gatherers consisted of land animal foods, with any additional animal foods coming from fresh- or salt-water fish.

Fifth, they state:

“Because of the similarity (3.5% difference) in the mean energy density of wild plant (6.99 kJ/g) and animal foods (7.24 kJ/g) in our database, we assumed that the P-A subsistence ratio based on weight in the Ethnographic Atlas would be virtually identical to the P-A subsistence ratios based on energy.”

Sixth, they assumed that an 80 kg hunter would require about 3000 kcal (12552 kJ) from food each day.

Seventh, they performed calculations using these numbers to generate ranges of macronutrients in a hunter-gatherer diet.

Eighth, if any calculation produced a protein intake greater than 35 percent of energy, which would probably deliver more nitrogen than the human liver can convert to urea in a day and therefore lead to harmful levels of blood ammonia and amino acids, they assumed that ‘typical’ hunter-gatherers pursued higher intakes of animal fats.

They based this assertion on the ethnographic reports in the EA and on optimal foraging theory.  In short, from the data in the EA it appears that hunters preferred to increase animal fat rather than plant food consumption, even to the extent of selectively eating animal fats (and discarding the rest of the carcass).  Cordain et al explain this as a manifestation of optimal foraging theory: hunting a large, fat animal supposedly offers a greater return on investment than gathering plants, i.e. the ratio of energy obtained to energy spent appears greater for hunting fat animals than for collecting wild plants.
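
The protein-ceiling logic behind step eight can be sketched with a quick calculation.  The 35% protein ceiling, the 3000 kcal daily requirement, and the plant-food profile (62% carbohydrate, 24% fat, 14% protein by energy) come from the paper; the two animal-carcass profiles below are hypothetical round numbers for illustration only:

```python
# Sketch of the step-eight protein ceiling.  The 35% ceiling and the plant
# profile are from Cordain et al; the two carcass profiles are assumptions.
PROTEIN_CEILING = 0.35  # max fraction of energy from protein (approx. MRUS limit)

PLANT = {"carb": 0.62, "fat": 0.24, "protein": 0.14}
LEAN_GAME = {"carb": 0.0, "fat": 0.25, "protein": 0.75}  # hypothetical lean carcass
FAT_GAME = {"carb": 0.0, "fat": 0.60, "protein": 0.40}   # hypothetical fatty carcass

def diet_macros(plant_fraction, animal):
    """Energy fractions for a diet mixing plant food with one animal profile."""
    return {m: plant_fraction * PLANT[m] + (1 - plant_fraction) * animal[m]
            for m in ("carb", "fat", "protein")}

for name, animal in (("lean game", LEAN_GAME), ("fat game", FAT_GAME)):
    macros = diet_macros(0.35, animal)  # 35% plant : 65% animal, the EA mode
    flag = "EXCEEDS ceiling" if macros["protein"] > PROTEIN_CEILING else "within ceiling"
    print(f"{name}: protein {macros['protein']:.0%} of energy ({flag})")
```

With the lean carcass, a 65%-animal diet blows past the protein ceiling (about 54% of energy from protein), while the fatter carcass keeps protein near 31%; this is the arithmetic that drives the paper's assumption that hunters pursued animal fat.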

Thus, the conclusions of Cordain et al rest entirely on the reliability of the basic data in the EA as a foundation for quantitative calculations of hunter-gatherer macronutrient ratios.

Reliability of the EA Data

Cordain et al describe the nature of the EA data in this passage:

“Although Murdock did not specify whether the subsistence-dependence categories were based on the energy content or weight of the food for each subsistence economy (gathered plant foods, hunted animal foods, and fished animal foods), examination of the >400 original references indicates that in many cases, estimates were made by weight. Ethnographic data are qualitative in nature and as such lack the precision of quantitative data; consequently, Murdock's subsistence-dependence categories, in almost all cases, represent subjective approximations by Murdock of the ethnographer's or anthropologist's original observation.” [Italics added]

In this last sentence, Cordain et al appear to admit that the Ethnographic Atlas reports Murdock’s “subjective approximations” of the subsistence ratios of various cultures, based on the original, first hand observations of ethnographers or anthropologists, which are “qualitative” [i.e. plant versus animal], not “quantitative,” and “lack precision.” 

So the question arises: how can anyone have confidence in quantitative descriptions of the macronutrient ratios of hunter-gatherer diets when the basic “data” supporting them consist of second-hand “subjective approximations” of the raw weights of plant, animal, and fish foods consumed by the various tribes?

In discussing the limitations of their calculations, Cordain et al return to the issue:

“Perhaps the most important variable influencing the estimation of the dietary macronutrient ratio in hunter-gatherer populations, when indirect procedures are used, is the validity of ethnographic data. Other ethnographers who compiled hunter-gatherer data from the Ethnographic Atlas noted that the scores Murdock assigned to the 5 basic subsistence economies are not precise, but rather are approximations (11, 36, 37) generally based on raw weights of the dietary items (36). Although estimations of energy by weight of wild plant and animal foods may sometimes yield results similar to actual values, there is considerable room for error. The present analysis indicates that if the mean plant-food energy density for 829 wild plant foods (6.99 kJ/g) is contrasted with the energy density (7.24 kJ/g) of an average white-tailed deer with 10% body fat, there would only be a 4% difference between actual energy values and those estimated by weight. However, if the mean energy density of wild fruit (3.97 kJ/g) or wild tubers (4.06 kJ/g) were contrasted with that of a white-tailed deer with 17.7% body fat (10.17 kJ/g), there would be a 60–61% difference between actual energy values and those estimated by weight. Obviously, not all ethnographic estimations of energy intake in hunter-gatherer populations based on food weight would necessarily be this extreme. This example does indicate the imprecise nature of qualitative ethnographic data; however, it does not rule out its important use as a data source to test hypothetic models. [Italics added.]”

Cordain et al admit that the EA data are “qualitative” and “imprecise,” and that there is “considerable room for error” in translating these data into energy and macronutrient intakes, yet all of their more precise quantitative conclusions about macronutrient ratios rest on this shaky foundation.
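
The size of the discrepancies Cordain et al describe is easy to verify; a minimal sketch in Python, using only the energy densities quoted in the passage above:

```python
# Energy densities (kJ/g) are taken directly from the Cordain et al passage.
def pct_diff(low, high):
    """Difference between two energy densities as a share of the higher one."""
    return (high - low) / high

print(f"mean plant (6.99) vs lean deer (7.24):  {pct_diff(6.99, 7.24):.1%}")   # ~3.5%
print(f"wild fruit (3.97) vs fat deer (10.17):  {pct_diff(3.97, 10.17):.1%}")  # ~61%
print(f"wild tubers (4.06) vs fat deer (10.17): {pct_diff(4.06, 10.17):.1%}")  # ~60%
```

In other words, whether “weight equals energy” is a harmless assumption depends entirely on which plant foods dominated the diet, which is exactly the imprecision at issue.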

Milton had a few things to say about this use of the EA.  In her critical editorial comments  on Cordain et al, Milton lists some of the reasons she doubts the accuracy of conclusions drawn from unselective use of the EA dietary data on "hunters:"

1.  The EA is “compiled largely from 20th century sources and written by ethnographers or others with disparate backgrounds, rarely interested in diet per se or trained in dietary collection techniques.”  These people would not compile accurate objective data about diet but would give their subjective impressions.
2.  “By the 20th century, most hunter-gatherers had vanished; many of those who remained had been displaced to marginal environments.”  Marginal environments may produce less edible plant matter than the richer environments favored by agriculturalists; given an environment richer in plant resources, these peoples might eat differently.  For example, the Inuit are forced to live on an animal-based diet because their environment does not supply calorically significant plant foods.  This tells us nothing about what they would do if they inhabited the lusher environments now under the control of agricultural people.  Similarly, the contemporary !Kung live in a desert; what they eat now doesn’t tell us what their ancestors ate when the Kalahari was a lush land with many large permanent lakes.
3.  “Some societies coded as hunter-gatherers in the Atlas probably were not exclusively hunter-gatherers or were displaced agricultural peoples.”  More about this below.
4.  “Because most of the ethnographers were male, they often did not associate with women, who typically collect and process plant resources.”
5.  “Finally, all the hunter-gatherers … included in the Atlas were modern-day humans with a rich variety of social and economic patterns and… not ‘survivors from the primitive condition of all mankind.’ (6). Their wide range of dietary behaviors does not fall into one standard macronutrient pattern that contemporary humans could emulate for better health.”

All of these factors introduce considerable room for error in the EA representations of hunter-gatherer diets, several producing a bias toward an impression that hunter-gatherers favored animal over plant foods. 

In her letter to the AJCN that also accompanied the Cordain et al paper, Milton made further remarks:

“In his 1968 analysis of hunter-gatherer diets, [Richard] Lee (8) reclassified some Atlas data and also excluded mounted hunters with guns and ‘casual’ agriculturalists from his database. In Lee's opinion, only 24 societies from all of Africa, Asia, Australia, and South America could be classified as hunter-gatherers, whereas North America alone contained >80% (135) of the 165 ‘hunting’ societies listed in the Atlas.”
“In contrast, in their analysis, Cordain et al (1) identified 229 hunter-gatherer societies in the Atlas; they also combined 2 of Lee's discrete categories (hunting and fishing) to estimate the total contribution of animal foods to energy subsistence. Given the uneven quality of most dietary data in the Atlas, the overrepresentation of hunter-gatherer societies from more temperate locales and the differences in classification and data analysis between these authors, different conclusions seem inevitable and all conclusions appear to merit closer study.”

Richard Lee provided his opinion in his essay “What hunters do for a living, or, how to make out on scarce resources” published in the book he edited, Man the Hunter (1968).  According to the University of Toronto Archives and Records:  “Prof Lee has published over 100 articles and chapters in books. He has authored several books including Man the Hunter (1968), Kalahari Hunter Gathers (1976), Politics and History in Band Societies (1982) and The Dobe Ju/’hoansi (2003). Most recognized is his 1979 The !Kung San: Men and Women and Work in a Foraging Society, listed in American Scientist list of the 100 most important works in science of the 20th century.”

Now, consider Milton’s points: 

1.   Lee came to the conclusion that only 24 societies in the EA could be legitimately classified as hunter-gatherers operating in truly primitive fashion, i.e. without firearms, horses, or “casual agriculture,” but Cordain et al classified as hunter-gatherer any tribe that, according to the “subjective approximations” of Murdock, did not practice animal husbandry or agriculture at the time of observation by ethnographers or anthropologists, even if that tribe used modern firearms, or went hunting on horses instead of on foot.  The use of firearms and horses certainly increases the productivity of hunting compared to hunting on foot with primitive bow and arrow, perhaps leading some tribes to hunt--and succeed--more than they would otherwise. 
2.  None of the tribes that Lee identified as hunter-gatherers inhabited North America, whereas more than 80% of the tribes classified as “hunters” in the Atlas inhabited 20th-century North America.  In the 19th and 20th centuries, Euroamericans forced Natives of North America off of productive, agriculturally valuable lands that could produce large amounts of plant biomass, onto marginal lands like highlands and deserts where they would have little opportunity to consume plants relative to animals.

If Milton is correct, this apparently means that Cordain et al considered North American mounted hunters using firearms more “representative” of hunter-gatherers than the pedestrian African !Kung and Hazda, whom Cordain et al, in their letter to the AJCN, described as “extreme” and “unrepresentative” of hunter-gatherers because their diets contain more plant food than those of the North American tribes.

In reference to this latter idea, Milton responded:

“The !Kung and Hazda, dismissed by Cordain et al as ‘unrepresentative,’ differ from many hunter-gatherers listed in the Atlas precisely because they have been relatively well studied dietarily—in both cases, plant foods contributed the bulk of daily energy intake.”

So, in essence, Milton is asking, which serves as the better basis for estimates of evolutionary macronutrient intakes: Quantitative data derived from first-hand accounts of African tribes not using horses or firearms, or qualitative data derived primarily from second-hand “subjective approximations” of dietary intakes of tribes all over the world, but predominantly from contemporary North America?   

In "Stone Agers in the Fast Lane"  and The Paleolithic Prescription, Eaton et al used the plant:animal subsistence ratio estimated by Lee, based on the 24 tribes he identified as primitive hunter-gatherers, namely 65:35 plant:animal, as the basis for their macronutrient estimates.  

Enough for now.  Remember that Cordain et al claimed that wild plant foods typically have a "relatively low carbohydrate content," apparently making them unsuitable as staple foods for hunter-gatherers, making it necessary for them to hunt high fat animals to avoid excessive consumption of protein.  Both Milton and I have some observations on this claim, which I will discuss in a future post, part 2.1.

Sunday, May 22, 2011

Nutrient density of dietary fats and high fat diets

 I decided to do a comparison of the nutrients in 500 kcalorie portions of olive oil, butter, lard, white rice, brown rice, potatoes, and sweet potatoes. 

I chose to include white and brown rice only because I wanted to find out if common animal fats really provide more nutrients, kcal for kcal, than white rice, commonly referred to as “filler” food.

I chose 500 kcalories because this would represent 25% of a 2000 kcal diet and a third of a 1500 kcal diet.  If you eat a high-fat, low-carbohydrate diet (more than 50% of kcals from fat), you will be getting most of your calories from fats.

Does this supply large amounts of fat-soluble vitamins or result in improved nutrient-density, compared, for example, to substituting white rice for those 500 kcal?  

I mean, does a diet that delivers 50% of its energy as fat really pack a nutritional punch, compared to one that delivers 50% of its energy as white rice, brown rice, potatoes, or sweet potatoes?

Using USDA data I compiled this table:

[Table: nutrients in 500 kcal portions of olive oil, butter, lard, white rice, brown rice, potatoes, and sweet potatoes]
Some observations:

1.  Sweet potatoes provide more vitamin E than any of the fats except olive oil, and potatoes supply almost 3 times as much vitamin E as lard and nearly as much as butter.

2.  None of the fats supply significant vitamin D; considering that our requirement is about 4000 IU daily, a 500 kcal dose of butter or lard provides only 1-1.4 percent of the daily requirement.

3. Butter delivers more micronutrients than either of the other fats, but the amounts are very small.  Except for vitamins A, D, and E, it doesn't even hold a candle to white rice.

4. A 500 kcal dose of butter provides only 50% of your vitamin A requirement and 11% of your vitamin E requirement, and no more than 2% of the requirement for any other nutrient.  By contrast, one teaspoon of cod liver oil provides 150% of the vitamin A requirement and 450 IU of vitamin D, ten times more than butter, all for only about 50 kcalories, one-tenth the calories of the butter.
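
Putting the butter and cod liver oil figures above on a per-kcalorie basis makes the gap explicit.  The ~45 IU of vitamin D for butter is back-calculated from the “ten times” comparison; everything else is quoted directly from the observations above:

```python
# Per-100-kcal nutrient comparison using only the figures cited in the text.
# Butter's vitamin D (~45 IU per 500 kcal) is inferred from the "ten times
# less than cod liver oil" comparison, so treat it as approximate.
foods = {
    "butter":        {"kcal": 500, "vit_a_pct_rda": 50,  "vit_d_iu": 45},
    "cod liver oil": {"kcal": 50,  "vit_a_pct_rda": 150, "vit_d_iu": 450},
}

for name, f in foods.items():
    scale = 100 / f["kcal"]  # rescale each portion to 100 kcal
    print(f"{name}: vitamin A {f['vit_a_pct_rda'] * scale:.0f}% of requirement "
          f"and {f['vit_d_iu'] * scale:.0f} IU vitamin D per 100 kcal")
```

Kcal for kcal, cod liver oil delivers roughly 30 times the vitamin A and 100 times the vitamin D of butter.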

Kcalorie-for-kcalorie, enriched white rice meets more micronutrient needs than butter.  A diet providing 500 kcal as white rice, supplemented with a mere teaspoon of cod liver oil for your vitamin A, far surpasses the nutrient-density of a diet replacing those 500 kcal of white rice with either butter, lard, or olive oil, even if you include the teaspoon of cod liver oil. Choose brown rice, potatoes, or sweet potatoes and your vitamin and mineral intake will start to reach superior levels.  

Now, if you are trying to lose 25 to 50 pounds, you will need to be in energy deficit for 6 to 12 months.  If you consume 1800 kcalories daily, and 60% of those come from fats, which all have a very low vitamin and mineral density, you will need to get virtually all of your vitamins and minerals from the non-fat portion of your diet, which consists of only 720 kcalories. 
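
The non-fat energy budget in that scenario is simple arithmetic:

```python
# Non-fat energy left to carry vitamins and minerals on a 60%-fat, 1800 kcal diet.
total_kcal = 1800
fat_fraction = 0.60

nonfat_kcal = total_kcal * (1 - fat_fraction)
print(nonfat_kcal)  # 720.0 kcal to supply virtually all micronutrients
```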

In Understanding Nutrition 6th Edition, Whitney and Rolfes point out on page 271 that "Nutritional adequacy is difficult to achieve on fewer than 1200 kcalories a day, and most healthy adults should not consume any less than that."  They make this statement in reference to a diet supplying only 20 percent of kcalories as fat.  A 60% fat, 1800 kcal diet is, from a nutrient-density perspective, equivalent to an 1100 kcal low-fat diet.  Most likely it will have multiple micronutrient deficiencies.

Now consider the following table, which lists the nutrients involved in metabolism of fats for energy as well as other functions. 

Nutrient -- Role in fat metabolism

Thiamin (B1) -- Part of the TPP coenzyme in the TCA cycle.
Riboflavin (B2) -- Base of coenzymes FMN and FAD required for the TCA cycle and electron transport chain.
Niacin (B3) -- Base of coenzymes NAD and NADP required for the TCA cycle and electron transport chain; required cofactor for ∂-5 desaturase in EFA metabolism.
Biotin -- Delivers carbon to the TCA cycle to replenish oxaloacetate; required for catabolism of some fatty acids and for fatty acid synthesis.
Pantothenic acid -- Base for Coenzyme A (CoA) required for making acetyl CoA, which cells need for synthesizing lipids and steroid hormones as well as for running the TCA cycle.
Pyridoxine (B6) -- Part of coenzymes PLP and PMP used in fatty acid metabolism and steroid hormone activity; cofactor for ∂-6 desaturase in EFA metabolism.
Cobalamin (B12) -- Helps to break down some fatty acids.
Choline -- Needed to synthesize the phospholipid lecithin.
Vitamin C -- Required for hydroxylation of carnitine, a compound that transports long-chain fatty acids into the mitochondria for energy metabolism; cofactor for ∂-5 desaturase.
Tocopherols (VT-E) -- Protect lipids from free-radical damage (esp. PUFAs).
Phosphorus -- Part of ATP required in the TCA cycle.
Magnesium -- Catalyst in ATP formation; required for synthesis of lipids and elongation of essential fatty acids to produce prostaglandins.
Sulfur -- Part of the vitamins biotin and thiamin.
Iron -- Part of hemoglobin required to deliver oxygen to cells for oxidation of macronutrients.
Zinc -- Cofactor in EFA metabolism.
Selenium -- Part of glutathione, an assistant to VT-E.
Copper -- Needed to make hemoglobin and in energy-releasing reactions.
Manganese -- Cofactor in lipid metabolism.

If you develop a deficiency of any of the nutrients required for oxidation of fats to release energy, this may impair your ability to convert body fats to energy, and stall your loss of fat.

Notice that it is the B-complex vitamins, vitamin C, and minerals you need most to metabolize fats.  So far as I know, vitamins A and D play no role in fat metabolism, so the star vitamins in butter probably aren't going to help you burn body fat.

Thus, from a micronutrient standpoint, a diet providing 30% of its energy as free (i.e. refined) fats differs little from a diet providing 30% of its energy as refined sugar. 

If you have a higher energy requirement, say 2500 kcal daily or more, this might not affect you.  You might get at least 1200 kcalories from meat and possibly vegetables, enough to at least meet minimum micronutrient requirements.

This might explain why some people eating high-fat, low-carb diets feel and perform adequately over a longer term, while others do not.  I would expect that those on high kcalorie, high fat, low carb diets, more typically large, active men, would have a better micronutrient status and general health and results than those on low kcalorie, high-fat, low-carb diets, more typically small or sedentary women.

Since the nutrients in meat lie in the muscle or organ, not in the fat, high-fat meats have a lower micronutrient density than lean meats. Since requirements for B-complex vitamins vary in proportion to energy expenditure, if you have a high energy expenditure, it would seem rational to get your energy from foods with the higher density of B-complex vitamins, such as unrefined starches and lean meats, rather than surplus fats.

Regardless of energy requirement, I would rather choose the more nutrient-dense path, increasing my chances of superior micronutrient status without use of isolated supplements having questionable value and potential for harm.

Just a thought.

Friday, May 20, 2011

The Case of the Missing Extinctions

Or, The Pleistocene Extinctions Pattern Implications For Understanding Human Evolutionary Nutrition

Prior to fifty thousand years ago, before the Pleistocene extinctions, the ancestors of modern humans lived in Africa.  Modern man, Homo sapiens sapiens, also emerged in Africa before these extinctions.

As explained by geneticist Spencer Wells in the documentary film Journey of Man, by tracking the Y chromosome, we now know that all modern humans are descendants of a few men among the ancestors of the !Kung people currently living in the Kalahari desert.

But in ancestral times, the Kalahari was not a desert. During the Pleistocene ice ages, “the great deserts of North Africa and Western North America today were mostly vast grasslands with large permanent lakes and abundant game animals.” [1]

Many believe that humans evolved primarily by hunting those land animals, and take the Pleistocene extinctions as evidence that humans evolved as top predators of grassland animals in Africa.

In "Of mice, mastodons and men: human-mediated extinctions on four continents" [2], Lyons et al provide several convincing lines of evidence and reasoning that indicate that humans must have caused the Pleistocene extinctions.  To simplify, no other known natural phenomenon (such as climate change) could account for the sudden selective extinctions of the megafauna, without simultaneous extinctions of smaller species. 

But there is something very interesting about the pattern of the Pleistocene extinctions.

The following graph from Lyons et al [2] shows the pattern.  The hatched bars show the species exterminated in the late Pleistocene.  As you move from left to right, the bars represent increasingly larger species.   Notice anything unique about Africa?

On every continent where humans were an invasive species, the largest animals went extinct.  

But not in Africa.

There, most of the large body species survived.   


It appears that as humans invaded Australia and the Americas they succeeded in exterminating the fat megafauna, but the humans who remained in Africa did not; many African megafauna, including giraffes, elephants, and rhinos, remain to this day.  Why?

Lyons et al have a suggestion:

“The lack of extinctions in Africa (Fig. 1) is especially notable given the long history of humans on this continent. The co-evolution of man with the African megafauna may have resulted in the evolution of effective anti-predator behaviours (Diamond, 1984; Martin, 1984).”[Italics added]

In other words, upper Paleolithic humans did not exterminate the megafauna of Africa because the African megafauna had plenty of experience with and knew how to avoid human predation.  Just the smell of humans probably sent them running.


In contrast, animals in Europe, Asia, Australia, and the Americas most likely evolved independent of human predation.  This meant that they were, for humans, relative to African fauna, ‘sitting ducks.’ 

Thus, considering the co-evolution of humans with African fauna, the fact that the extinctions did not occur in Africa supports the hypothesis that humans, as an invasive species, caused those extinctions on other continents.

It also suggests that the invasion of other continents allowed humans to pursue a subsistence strategy––hunting large, fat animals––that would not have been as successful in Africa.

Moreover, we could reasonably take the statement by Lyons et al and modify it slightly:

The co-evolution of man with African fauna (mega or not) most likely resulted in those fauna evolving behaviors effective for avoiding human predation.

This further suggests that the human exodus out of Africa into other continents may have resulted in a significant change in human diet composition, after H. sapiens sapiens had already emerged.  In other words, it suggests that the later Paleolithic diet of humans outside of Africa may have contained much larger amounts of land animal meat and fat than would have been possible in the early Paleolithic among the far more wary wildlife of the homeland.

Let me put it this way.  An animal adapted to a diet obtained in Africa invades a new ecosystem.  In this new ecosystem, this invasive species finds that hunting is much easier than it was in the homeland, because the animals are relatively oblivious to the danger presented by humans.  Consequently, in accord with optimal foraging theory, this animal goes for the easiest calories possible: large animals that don’t know that they should run away from anything that smells, sounds, or looks like a human.

If this new diet supports health well enough to allow most individuals to reproduce, but disrupts homeostasis just enough to gradually induce metabolic disorders that emerge later in life, past reproductive age, or to shorten the average lifespan by inducing chronic diseases, then the new diet will neither select against individuals poorly adapted to it nor produce a population better adapted to it.

I might even call it the upper Paleolithic dietary revolution. 

Now, let’s consider Africa again.  As noted above, the great deserts of Africa were grasslands with large “permanent” lakes (well, permanent during those ages).  If hunting land animals was not as easy in Africa as in Eurasia and the Americas, what about hunting animals living in those lakes?  Would they all catch the smell of humans or run away easily?

Lacustrine, riverine, and wetland environments are very rich in food resources, both plant and animal, including animals that don't move very fast (shellfish).  Foraging in such environments might just provide greater return on investment than possible on a grassland.

Some people studying human evolutionary nutrition believe that the archaeological, cultural, nutritional, biochemical, and medical evidence points in the direction of humans evolving in those econiches, where shellfish, fish, and amphibious animals could have been the primary sources of animal foods.  These animals have nutritional properties significantly different from  those of savannah animals—the protein, fats, and mineral contents of water animals all differ significantly from land-based animals, in very interesting ways relative to specifically human nutritional requirements, particularly for the nervous and cardiovascular systems.

In fact, the earliest human fossils are consistently found associated with lacustrine or marine fossils indicating humans inhabiting niches incorporating a land-water interface:  wet woodlands, flood plains, wetlands, rivers, lakes, and coastlines.  In these two segments of the Journey of Man, we learn about archaeological evidence of humans living along the African coastline and consuming large amounts of seafoods, leaving large heaps of seashells as evidence (segment starts about 5:00 of first video):

 Also, and again, as explained in the  Journey of Man, the first exodus of humans from Africa appears to have followed a path along the marine coastline to Australia; it does not track through a grassland.   And, in this segment, we learn that some of the first humans in Australia lived in a lacustrine environment (now a desert), described as "quite a rich environment," eating fish (leaving hearths and fish bones as evidence):

Indeed, in the two segments below, Wells points out that, at the end of this journey, to get from Indonesia to Australia, people apparently had to find a way across 150 miles of open ocean (segment starts at about 3:20 of the first video).

The animal capable of this seafaring--crossing 150 miles of open ocean with only the most primitive of boats--would have had to be very familiar with, and physically very well adapted to, a marine environment.  Not a likely accomplishment for a mammal specialized in exploiting the dry savannah environment.

Which leads me again back to Michael Crawford and David Marsh, co-authors of Nutrition and Evolution.  Crawford, a biochemist, has devoted his life’s work to studying the effects of fats on human physiology, and has a bit to say on the subject and its relation to human evolution.  Others have something to say about the differences between land and water animal foods with regard to human brain nutrition.

But for now I have run out of time.  To be continued.