Thursday, September 29, 2011

Iron, Oxidation, Inflammation, and Aging

In my Ancestral Health Presentation I discussed the idea that nutritional diseases fall into two categories: excess and deficiency.

When we compare recently observed hunter-gatherers with members of modern affluent nations, we see important differences between the two groups in general nutritional status throughout life.  Hunter-gatherers have very low body fat levels in comparison to modern people, which reflects the fact that from birth they have a low fat intake relative to fat expenditure.

As the brisk sales of nutritional supplements in the U.S. show, we tend to worry a lot about deficiencies.  From an evolutionary psychology standpoint, we may be predisposed to think that our problems are due to a deficiency of something because our ancestors were more likely to develop dietary deficiencies than dietary excesses.

In the ancestral environment, food was more scarce than in modern nations, and it took more energy to get that food, a situation favoring deficiency (low nutrient availability plus high nutrient expenditure).  In our modern environment, we have a high availability of nutrients and do not need to expend much to get those nutrients, a situation favoring development of excesses. 

Worldwide, iron deficiency is the most common nutrient deficiency, occurring mainly in developing nations (low food/iron availability) and among children and menstruating women (high iron demand).

But in modern nations, a growing body of evidence suggests that excessive iron intake and retention promotes chronic degenerative diseases.

Iron and Oxidation

In human metabolism, iron plays a critical role, as part of hemoglobin and myoglobin, in the mitochondrial oxidation reactions that sustain the process of life.  The oxidation process inevitably produces peroxide and superoxide radicals.  These radicals themselves are comparatively non-toxic, and cells have evolved means of dealing with them [1].  However, when these oxide radicals react with unbound or poorly bound iron, they generate much more damaging hydroxyl radicals.

According to Chinese medical theory, we classify anything that acts like fire as relatively yang.  Fire consumes by oxidation and produces heat and light.  Oxidation fuels the normal process of transformation (growth and development, or aging); acceleration of oxidation accelerates transformation (growth, development, aging).  For example, a log left to age will gradually oxidize and turn to ashes; subjecting the log to fire just accelerates the transformation from log to ash.

Indeed, Ou et al have proposed that, on a biological level, the concepts of yang and yin correspond, in part, respectively, to oxidation (yang) and antioxidation (yin).  They found that traditional Chinese herbal medicines classified as yin tonics (used to nourish yin or promote water-like, cool, moist aspects of the body) have, on average, six times more antioxidant activity and polyphenolic contents than herbals classified traditionally as yang tonics (used to fortify yang or promote fire-like, hot, dry aspects of the body).

Since iron supports oxidation, iron deficiency reduces rates of cellular oxidation and results in a more yin condition:  the iron-deficient person feels cold, fatigued, and weak, and suffers from pallor and cognitive impairment, a weakening of the light of the mind.  It's as if the inner fire has died down.  Restoring iron sufficiency restores the heat, energy, strength, color (reddish), and mental function, all signs of a fire burning more brightly.

Chinese medical theory predicts that an excess of a yang factor like iron would result in excessive 'fire' in the body, which Western medicine calls in-flamm-ation, the "flamm" simply meaning 'flame,' i.e. fire.   Exposing any tissue of the body to chronic in-flamm-ation, a low-level fire, results in scarring and hardening.

In "Iron behaving badly: inappropriate iron chelation as a major contributor to the aetiology of vascular and other progressive inflammatory and degenerative diseases" [2], Douglas Kell reviews an impressive body of evidence indicating:

(i) that it is this combination of poorly liganded iron species, coupled to the natural production of ROSs, that is especially damaging, (ii) that the role of iron has received far less attention than has the general concept of ROSs, albeit the large literature that we review, and (iii) that this basic combination underpins a great many (and often similar) physiological changes leading to a variety of disease manifestations, and in particular those where the development of the disease is manifestly progressive and degenerative.

Kell's paper reviews evidence linking excess, unbound, or poorly bound iron to the following disorders:

Diabetes type 2, insulin resistance, and metabolic syndrome
Cardiovascular disease (heart failure, stroke, and atherosclerosis)
Alzheimer's, Parkinson's, and other neurodegenerative diseases
Amyotrophic lateral sclerosis (ALS, Lou Gehrig's disease)
Rheumatoid arthritis
Inflammatory bowel disease
Age-related macular degeneration
Chronic obstructive pulmonary disease

I have also found studies linking elevated body iron to PCOS [3] and Multiple Sclerosis [4].

Kell notes that many natural plant food compounds act to bind iron or block iron uptake:

Even though elements of the 'Mediterranean' diet that are considered to be beneficial are usually assumed to be so on the basis of their antioxidant capabilities (but cf. [1820]), many of the polyphenolic compounds (e.g. flavones, isoflavones, stilbenes, flavanones, catechins (flavan-3-ols), chalcones, tannins and anthocyanidins) [1821-1828] so implicated may also act to chelate iron as well [1073,1829-1843]. This is reasonable given that many of these polyphenols and flavonoid compounds [1821,1844-1853] have groups such as the catechol moiety that are part of the known iron-binding elements of microbial siderophores. Examples include flavones such as quercetin [914,1813,1829,1854-1864], rutin [1829,1857,1858,1865,1866], baicalin [1860,1867], curcumin [1813,1868-1872], kolaviron [1873], flavonol [1874], floranol [1875], xanthones such as mangiferin [1876-1879], morin [1876], catechins [1073,1807,1838,1854,1880,1881] and theaflavins [1882], as well as procyanidins [1835,1883] and melatonin [1628,1884-1887].

In addition to the phenols and flavonoids, dietary tannins (tea, coffee, nuts, vegetables), phytates (seeds, nuts, grains, beans), calcium, phosphorus, and dairy products also reduce iron uptake.

On the other hand, dietary factors that increase iron availability include:

Dietary meat, poultry, and fish:  All enhance iron absorption via the MFP factor, which promotes absorption of iron from non-animal foods eaten with the animal product.  Meat, fish and poultry also provide the most bioavailable heme iron, of which the body consistently absorbs about 23 percent, up to ten times more than from non-animal sources.  Red meat generally provides the greatest level of iron as well.

Dietary acids:  Vitamin C and other dietary acids (e.g. those found in sodas) increase the availability of dietary iron.

Hunter-Gatherers and Iron

In my Ancestral Health Presentation, I argued that hunter-gatherers and modern people differ in some very important contextual aspects.  The following slide from my presentation summarizes my observations:

To summarize the table, in comparison to modern people, hunter-gatherers live in an environment with a low food availability and a high energy expense required to get that food.  This combination results in lifelong caloric restriction and low body fat levels in the hunter-gatherer.  

Hunter-gatherers also consume many unrefined plant foods and herbs that contain polyphenols, flavonoids, tannins, and fiber, so-called 'antinutrients' that reduce iron uptake and bind iron, limiting its availability for reaction with peroxide and superoxide.  Finally, they experience many other sources of blood and iron loss, including parasites, insects, and injuries.

This environment is similar to that of our primate ancestors.  Hence, I would surmise that human metabolism is adapted to an environment with a high intake of 'antinutrients,' a low dietary iron availability and a condition of borderline iron deficiency.

In such an environment, meat, particularly red meat, with its highly available iron, may serve as a medicine.

In contrast, people in modern affluent nations inhabit an environment with a low intake of 'antinutrients' (due to emphasis on refined plant foods), a high food iron availability and comparatively few drains on body iron stores.

Even menstruating women in modern nations often have less iron loss, due to the use of birth control methods.

Thus, modern people have a tendency to accumulate excess iron.

Comparing Nations and Diets

In The Iron Factor of Aging, Francesco S. Facchini discusses the relationship between iron and chronic diseases at length.  After a thorough review of the evidence linking iron to inflammation, disease, and aging, he notes that, among modern nations, people whose diets have a lower iron availability also have lower rates of chronic inflammatory, autoimmune, and degenerative diseases.  These include the Mediterranean and Asian nations, where tea, wine, cheese, legumes, vegetables, and fruits provide the 'antinutrients' reducing iron availability, and people either consume less red meat and more white meat (fish and poultry, lower in iron) or eat nearly vegetarian diets.

This perspective raises the possibility that fish consumption sometimes correlates with reduced risks of degenerative diseases not because it provides some essential nutrient (e.g. fish oils) but because people eating fish instead of land animal meat will have a lower intake of iron.

Vegetarian diets likewise have a lower iron availability, and they too are associated with lower risks of chronic degenerative diseases.

The modern diet provides more iron via red meat and iron-enriched foods consumed in combination with highly acidic foods or beverages or vitamin C supplements.


Iron nutrition provides a good example of how context can modify the effect of a food.   The French eat more red meat than the Japanese, and they have a higher risk of cardiovascular disease, but not as high as in the U.S.

The Japanese eat vegetables, rice, fish, and soy products, and drink tea.  They typically eat no red meat.  Their vegetables, soy, and tea all reduce iron availability, while nothing in their typical diet is a rich source of bioavailable iron.  They live long lives with a low incidence of chronic degenerative diseases. 

The French eat meat with vegetables, fruits, dairy, and wine, all of which reduce iron availability.  In contrast, Americans eat meat with bread made from iron-enriched flour, hardly any vegetables,  and typically drink either acidic soda or low-polyphenol beer.

The typical French meal would not provide as much iron as the typical American meal.  The typical French man or woman would have less stored iron and this may explain why he or she has a lower risk of cardiovascular disease.

Men vs. Women

Historically, U.S. men have had higher risks of cardiovascular diseases than U.S. women, until the women pass menopause.  In other words, women have lower risk while they have monthly losses of iron through menstruation, and their risk rises when they stop losing iron.

In general, in modernized nations, women have a greater life expectancy than men.  This means women age more slowly, and this may occur because premenopausal women lose iron every month, resulting in a lower iron status, and a lower level of hydroxyl radical formation, during much of their lives.

However, the iron hypothesis predicts that women who reduce menstrual blood losses by birth control methods without compensating by reducing dietary iron availability will have an increased risk of iron-related diseases. 

Men can reduce their iron stores by regularly consuming 'antinutrients' and giving blood.

Friday, September 23, 2011

Potatoes and Protein

The Food and Nutrition Board of the National Academies of Science publishes the Dietary Reference Intakes, which include Estimated Average Requirements for Indispensable Amino Acids for Adults Aged 19 Years or Older.

I decided to find out whether the average person could meet his or her IAA requirements eating a diet composed solely of white potatoes.  I used the USDA nutrient database to find the amino acid delivery of potatoes at 1880 kcal and 2350 kcal, the approximate caloric requirements of a 120-pound woman or a 160-pound man, respectively.  I created the following table for the purpose:
[Table: indispensable amino acid delivery of white potatoes at 1880 kcal and 2350 kcal]

According to these FNB and USDA data, the average person can meet all indispensable amino acid requirements eating potatoes as his or her sole protein source.   No 'limiting amino acids' or protein complementing is required, so long as caloric requirements are met.

Kon and Klein reported in 1927 on The Value of Whole Potato in Human Nutrition.  Two healthy adults obtained all of their protein and IAAs from potatoes for 167 days.  They maintained nitrogen balance.  They reported:

"The digestion was excellent throughout the experiment and both subjects felt very well. They did not tire of the uniform potato diet and there was no craving for change."

The potato provides only about 10% of calories as protein.  A mixed diet containing other plant foods providing higher proportions of protein, like green vegetables (20-40% of calories as protein), nuts (~25% of calories as protein), or legumes (~25% of calories as protein) will provide higher levels of the IAAs and total protein.
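The "percent of calories as protein" figures above can be derived from grams of protein and total calories.  Here is a minimal sketch; the potato values (about 2 g protein and 77 kcal per 100 g) are my approximations, not the post's USDA table:

```python
def protein_pct_of_calories(protein_g, kcal):
    # Protein supplies roughly 4 kcal per gram.
    return 100 * protein_g * 4 / kcal

# Approximate values for 100 g of plain white potato.
print(round(protein_pct_of_calories(2, 77)))  # ~10% of calories as protein
```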

It seems that humans can obtain all of the protein they require from a food like the potato.  What does this tell us about ancestral nutrition?   

I decided to compare the IAA delivery of potatoes to that of 95% lean ground beef.  I compared the IAA delivery of amounts of beef and potatoes that provide comparable total protein, and created the following table to illustrate:
[Table: IAA delivery of 95% lean ground beef and white potatoes at comparable total protein]
Both the beef and the potato provide adequate amounts of total protein and IAAs. The much smaller portion of beef (200 g vs. 2500 g for the potatoes) provides larger doses of the IAAs, presumably constituting greater excesses of IAAs for the average individual.

The body will deaminate and oxidize these excesses of IAAs, increasing the amount of ammonia the liver must detoxify and sulfuric acid and urea the kidneys must excrete.

Since the beef supplies only about 14% of total energy requirements, an individual who fills in the other 86% of calories with whole foods that also provide protein will automatically consume a greater amount of IAAs than one who eats a mix of plant foods with a far smaller amount of animal protein.

For those who have concerns about overconsuming methionine because some research suggests that reduced methionine consumption might increase longevity, I find it interesting to note that the ground beef provides about 15 times more methionine+cysteine per unit weight than the potatoes (about 9.5 mg per g cooked beef, versus 0.6 mg per g cooked potatoes).  

Let's say someone consumes the half-pound of beef (328 kcal) and gets the remainder of his required 2350 calories from potatoes (2022).  He would get 1906 + 0.86(1425) = 3132 mg of methionine+cysteine, compared to 1425 mg (less than half as much) if he ate only potatoes. 
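The arithmetic can be checked with a short script.  This is a sketch using the post's own quoted figures (1906 mg from the beef portion, 1425 mg from an all-potato 2350 kcal diet), not values recomputed from USDA data:

```python
# Methionine+cysteine totals, in mg, as quoted in the post.
beef_met_cys = 1906          # half-pound beef portion
potato_only_met_cys = 1425   # all-potato diet at 2350 kcal

# Beef supplies 328 of the 2350 kcal, so potatoes fill the remaining ~86%.
potato_fraction = (2350 - 328) / 2350

mixed_diet = beef_met_cys + potato_fraction * potato_only_met_cys
print(round(potato_fraction, 2))  # 0.86
print(round(mixed_diet))          # 3132 mg, vs. 1425 mg for potatoes alone
```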

I wonder whether the body has an internal regulatory mechanism for amino acid consumption that drives appetite to control total amino acid intake, such that a person who eats a diet rich in animal protein is driven to fill the bulk of caloric requirements with low-protein items like fat/oil, sugar, fruits, some very-low-protein tubers (e.g. cassava), or some refined starches.

Saturday, September 10, 2011

Zone Out

Neil Mann belongs to the team of researchers who work with Loren Cordain and promote high intakes of lean meat on an evolutionary basis.

For example, he authored Dietary lean red meat and human evolution in which he argues that various lines of study "indicate the reliance on meat intake as a major energy source by pre-agricultural humans."

Mann and another team from Royal Melbourne Institute of Technology published a new study of the efficacy of a high (30%) protein diet, in comparison to a high (55%) carbohydrate diet, for type 2 diabetes.

The effect of high-protein, low-carbohydrate diets in the treatment of type 2 diabetes: a 12 month randomised controlled trial.

In this study, 99 subjects received advice to follow low-fat (30% total energy) diets; 53 of those received instructions to eat a diet supplying 30% of total energy from protein and 40% from carbohydrate (high protein arm), while  46 received instructions to eat a diet supplying 55% of total energy from carbohydrate and 15% from protein.   

The high-protein diet had the same proportions of protein, fat, and carbohydrate (30:30:40) recommended by Barry Sears in his "Zone" diet books.  Supposedly this proportion produces better blood sugar and insulin control than a high carbohydrate, lower protein diet.

The aim was to find out if eating a diet high in protein would provide superior glycemic control to a diet high in carbohydrate, so the primary endpoint was change in HbA(1c).  "Secondary endpoints included changes in weight, lipids, blood pressure, renal function and calcium loss."

The results?

"HbA(1c) decreased in both groups over time, with no significant difference between groups (mean difference of the change at 12 months; 0.04 [95% CI -0.37, 0.46]; p = 0.44). Both groups also demonstrated decreases over time in weight, serum triacylglycerol and total cholesterol, and increases in HDL-cholesterol. No differences in blood pressure, renal function or calcium loss were seen."

Mann et al concluded:

"These results suggest that there is no superior long-term metabolic benefit of a high-protein diet over a high-carbohydrate in the management of type 2 diabetes."

I don't have access to the full text, but the team that did this study includes Neil Mann, one of the strongest proponents of the idea that humans are adapted to diets high in animal protein, someone who, if anything, might be biased in favor of high-protein diets.  Coming from that team, this study appears to undermine the high-protein approach to diabetes.

It doesn't appear to do the Zone Diet any favors either. 

On the other hand, it supports the already established body of literature showing efficacy of a high-carbohydrate approach to diabetes type 2.   The high-carbohydrate diet apparently produced meaningful decreases in weight, HbA(1c), triglycerides, and total cholesterol, and increases in HDL.

The decrease in trigs and elevation of HDL are particularly of interest, since very often I see claims that high carb diets raise trigs and lower HDL. 

This study provides evidence against the claim that humans are specially, evolutionarily adapted to high-protein diets and maladapted to high-carbohydrate diets, and undermines the claim that this one disease of civilization, type 2 diabetes, and its chief feature, hyperinsulinemia, arise from high-carbohydrate diets. 

Of interest, both diets had relatively low fat contents.  Since altering the ratio of protein and carbohydrate appeared to have no effect on results, this study may also suggest that reduction of dietary fat proportion plays a key role in the treatment of type 2 diabetes if the goals are reduction of body mass, HbA(1c), triglycerides, and total cholesterol, along with increases of HDL.

Friday, September 2, 2011

Interesting Links

In a small pilot study, 18 overweight people ate six to eight small purple potatoes twice daily for a month and found their systolic and diastolic blood pressures (the top and bottom numbers on a blood pressure reading) dropped by 3.5 and 4.3 percent, respectively.

The potatoes were microwaved, with no toppings added.  I prefer mine steamed.  I usually eat 3 to 6 potatoes daily. 

I believe that potatoes frequently correlate with increased disease risks in epidemiological studies because people usually eat them fried or topped with fat and liberally salted, and they travel with otherwise poor quality food choices.  Hence, they tend to serve as a marker for poor diet choices that promote fat gain, metabolic syndrome, and diabetes.  The potatoes themselves are not to blame, they just happen to be at the scene of the crime.

This clinical study shows that the potato itself promotes cardiovascular health without weight gain.
Potatoes have a very high potassium-to-sodium ratio (about 56:1 when microwaved, no salt added), and I suspect this potassium infusion provided the blood pressure correction.  However, it is possible that potatoes contain some other phytochemical that affects blood pressure as well.  Whole foods are greater than the sum of their parts. 

In Costa Rica, "People who ate at least two servings of beans for every serving of white rice tended to be at lower risk for metabolic syndrome. In those who substituted a serving of beans for a serving of white rice the risk of metabolic syndrome was reduced by 35 percent," the researchers report in the American Journal of Clinical Nutrition.

This study suggests that we are metabolically better adapted to cooked dried beans than to white rice.  Legumes have a lower glycemic load, a higher proportion of protein, and a fiber profile more like fresh vegetables and fruits (higher in soluble fiber) than cereal grains.  

I think we have reason to believe that human ancestral diets included some fresh legumes, and that this use provided the segue into the use of dried legumes as part of the agricultural revolution.  I think we are better adapted to fresh than to dried legumes.  Dried legumes have more resistant starch than fresh legumes, and this feeds gut flora, resulting in bloating when you exceed your personal legume limit. 

Soaking and sprouting start converting the dried legume back into a fresh vegetable.  Among the dried legumes, the smaller varieties (lentils, adukis, mungs) and those in the pea family (peas, chickpeas) have lower proportions of the resistant starch.  

People who trained vigorously for 45 minutes at a level of effort that increased body temperature, raised heart rate, and induced sweating burned an average of 190 extra calories in the 14 hours following the training session.   They also burned about 590 total calories in the exercise session itself. This contrasts with low intensity exercise, which does not increase caloric expenditure in the hours following exercise sessions.  

As this article points out, if you need mental refreshment, a coffee break will probably backfire, but a growing body of research shows that making contact with live plants and animals, even vicariously (e.g. photos of forests), is more effective than other diversions.   A little time enjoying the sights and sounds of nature has also proven successful at relieving depression and anxiety and boosting cognitive performance.

This makes sense from an ancestral perspective.  For millennia the human body-mind adapted to life surrounded by plants and animals in wild settings.  Our nervous systems are adapted to the stimuli provided by living environments.  Now, in civilization, we spend most of our time in less lively built environments, and our nervous systems apparently go awry in such spaces.  

I suspect this contributes not only to mental illness, but also to disorders we tend to consider 'physical' like cardiovascular disease.  The great health of isolated tribes might have a lot to do with the fact that the human neuroendocrine system is adapted to their more natural surroundings.  I would guess that they have very different levels of various neuroendocrine chemicals (such as adrenaline and cortisol) found out of balance in diseases like metabolic syndrome.

This also means we can help ourselves by naturalizing our built environments.  The Chinese developed an art of building and arranging environments to make them more natural and conducive to human vitality.  They routinely incorporated natural sights and sounds--like stones, flowing water, fish ponds and tanks, plants and trees-- into built environments with the express intent of maximizing the beneficial impact on human health and awareness.  

This part of Feng Shui (literally translated, Wind Water) centers on understanding how various environmental elements affect the nervous system. It's not hocus-pocus.   It focuses on how to make built environments compatible with human needs by incorporating the 'five elements' of water, wood, fire, soil, and mineral, and by balancing complementary qualities such as bright and dark, wet and dry, hard and soft, heavy and light in ways that calm the nervous system or produce another desired effect, depending on location.  

By bringing the elements of wind, water, wood, sunlight or fire, earth, and stone into our home and work spaces, we make them more like our ancestral environment and more beneficial to our health.  Then we simply arrange them in the way that makes the space feel most comfortable and fitting for its use. 

Thursday, September 1, 2011

Catching Fire

Catching Fire by Richard Wrangham presents a compelling argument that the primary nutritional change driving human evolution from small-brained Homo habilis to large-brained Homo sapiens was cooking, not meat-eating.

Wrangham starts off with some critical observations:  No known human tribe lives on a predominantly raw food diet, and those modern people who attempt to live on a largely raw food diet have demonstrated difficulties maintaining body mass, energy levels, and fertility.  This points to the hypothesis that modern humans are actually "adapted to eating cooked food in the same essential way as cows are adapted to eating grass, or fleas to sucking blood, or any other animal to its signature diet.  We are tied to our adapted diet of cooked food, and the results pervade our lives, from our bodies to our minds.  We humans are the cooking apes, the creatures of the flame."

Traditional Chinese medicine has for millennia maintained that humans need to eat cooked food to get adequate food energy (Pinyin: gu qi).  In The Tao of Healthy Eating, traditional Chinese physician Bob Flaws writes:

"Traditional Chinese medicine suggests that most people, most of the time, should eat mostly cooked food.  Cooking is predigestion on the outside of the body to make food more easily digestible on the inside.  By cooking foods in a pot on the outside of the body, one can initiate and facilitate the stomach's rottening and ripening in its pot on the inside of the body.  Cold and raw foods require that much more energy to transform them into warm soup within the pot of the stomach.  Since it takes energy or qi to create this warmth and transformation, the net profit from this transformation is less.  Whereas, if one eats cooked foods, less qi is spent in the process of digestion.  This means that the net profit of digestion, i.e. qi or energy, is greater."

This perspective contradicts the common belief that raw food is better than cooked because cooking can destroy nutrients.  But as Flaws points out, net nutrient delivery matters more than the gross amount of a nutrient in the raw food.  Let's assume that a carrot has 10 units of nutrient X, but only 10% of it is available to humans because it is locked in a largely indigestible cellulose envelope.  Let's say that cooking destroys 50% of that nutrient (a gross overestimate for proper cooking), but increases the availability to 50%.  The net delivery of X from the raw carrot is 1 unit, but the net from the cooked carrot is 2.5 units. 
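The carrot example can be made concrete with a toy calculation (the numbers are the hypothetical ones from the example above, not measurements):

```python
def net_delivery(gross_units, cooking_loss, availability):
    """Nutrient actually absorbed: what survives cooking,
    times the fraction the gut can extract."""
    return gross_units * (1 - cooking_loss) * availability

# Hypothetical carrot with 10 units of nutrient X.
raw = net_delivery(10, cooking_loss=0.0, availability=0.10)
cooked = net_delivery(10, cooking_loss=0.5, availability=0.50)
print(raw, cooked)  # 1.0 2.5 -- the cooked carrot delivers more
```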

Wrangham presents multiple lines of evidence that humans and non-humans have a greater net macronutrient absorption from cooked than from raw foods, resulting in cooked foods delivering more energy than raw foods.

Wrangham includes some of the research I discussed in my series on the raw vegan diet, which found that a high proportion of people eating diets high in raw foods (70% or more raw) are underweight and have low fertility.   Belgian researchers showed that humans can digest only about 65% of the protein in raw eggs, but 91-94% of the protein from cooked eggs. [1] Another team showed that enzymatic digestion of heated beef protein increased by four times over raw beef protein. [2] This occurs because cooking denatures protein more effectively than stomach acid, making it more vulnerable to enzymatic digestion.

Wrangham's hypothesis competes with the Man-The-Hunter hypothesis, which maintains that humans evolved big brains and small guts by route of increased meat-eating.   However, Wrangham points out that the hunting hypothesis can't account for some of the facts. Increased meat-eating might explain the transition from Australopithecines to Homo habilis (habilines), but not the transition from the habilines to Homo erectus:
"Meat-eating accounts smoothly for the first transition, jump-starting evolution toward humans by shifting chimpanzeelike australopithecines into knife-wielding, bigger-brained habilines, while still leaving them with apelike bodies capable of collecting and digesting [raw] vegetable foods as efficiently as did australopithecines.  But if meat eating explains the origin of the habilines, it leaves the second transition unexplained, from habilines to Homo erectus.  Did habilines and Homo erectus obtain their meat in such different ways that they evolved different kinds of anatomy?  Some people think the habilines might have been primarily scavengers while Homo erectus were more proficient hunters.  The idea is plausible, though archaeological data do not directly test it.  But it does not solve a key problem concerning the anatomy of Homo erectus, which had small jaws and small teeth that were poorly adapted for eating the tough raw meat of game animals.  These weaker mouths cannot be explained by Homo erectus's becoming better at hunting.  Something else must have been going on."
Increased meat eating can't explain why we have such small mouths and jaws. 
"Given that the mouth is the entry to the gut, humans have an astonishingly tiny opening for such a large species....To find a primate with as relatively small an aperture as that of humans, you have to go to a diminutive species, such as a squirrel monkey weighing less than 1.4 kilograms (3 pounds). In addition to having a small gape, our mouths have a relatively small volume––about the same size as chimpanzee mouths, even though we weigh some 50 percent more than they do.  Zoologists often try to capture the essence of our species with such phrases as the naked, bipedal, or big-brained ape.  They could equally well call us the small-mouthed ape."
Compare the jaws of any raw food eating animal to human jaws.  The largely vegetarian chimp has a gape much larger than that of a human:
Source:  Junglewalk

The carnivorous cat has a gape nearly half the size of its head, and the jaws are very powerful for cutting through raw meat. 

You can see some other big yawns here.  Compare to the modern human gape:

Source:  Flickr
Humans have a small mouth for such a large head.  The larger gape of other species is not for taking in large bites; it is necessary for the leverage to crush tough, chewy raw foods. 

By the way, although Wrangham does not mention it, the shrinkage and reorganization of the mouth laid the foundation for speech.  Thus, we may owe our linguistic abilities to the mastery of fire and cooking.  I seem to recall reading that another anthropologist had proposed this hypothesis more than 20 years ago, but I no longer have the book that had the reference.

If evolution from Homo habilis to Homo erectus had been driven by increased consumption of raw meat, with technology and cooking as an afterthought, we would expect it to have maintained the large, powerful ape mouth and jaws, retained the large, sturdy teeth, and increased the shearing action as an adaptation to meat eating.  Instead, from the habilines to the erectines, the mouth and teeth shrank.

Here's a habiline skull:

Source: skulls/s10_homo_habilis

And here's an erectine skull:

The erectine jaw and teeth are much smaller relative to body size.  Erectines had a smaller gape and must have had a softer diet than the habilines.  The skeletal remains provide the best available evidence that some tribe of Homo habilis discovered something that made for a much softer, more energy-rich diet, giving rise to Homo erectus.

One might think that the use of knives and hammers alone selected for smaller mouths.  Perhaps habilines simply cut the meat into small pieces or pounded it tender.  Although initially plausible, this explanation loses credibility on closer examination, because it accounts only for the meat: an animal adapted to a diet consisting predominantly of raw vegetation would still need large jaws and teeth to eat that vegetation, even while adapting to the raw meat portion of the diet.

Wrangham notes that "Peter Lucas has calculated that the size of a tooth needed to make a crack in a cooked potato is 56 percent to 82 percent smaller than needed for a raw potato."  Thus, so long as human ancestors continued to eat raw plants, they needed large teeth and jaws.  And they definitely needed to eat plants.
"The problem is that tropical hunter-gatherers have to eat at least half of their diet in the form of plants, and the kinds of plant foods our hunter-gatherer ancestors would have relied on are not easily digested raw."
Tropical wild game simply does not provide adequate amounts of fat or carbohydrate to prevent excessive intake of protein resulting in ammonia and urea accumulation, especially in the annual dry seasons, when the whole carcass fat levels of game will drop as low as 1 percent to 2 percent.
By the way, Wrangham notes:
"Starchy foods make up more than half of the diets of tropical hunter-gatherers today and may well have been eaten in similar quantity by our human and pre-human ancestors in the African savannas."
Moreover, if raw meat was a staple of our ancestors, we would expect modern humans to have some significant resistance to toxins produced by bacteria that infect raw meat.  But we are still vulnerable to those bacterial toxins.

In addition, there is a major economic problem with the meat-eating hypothesis.  Wrangham has studied chimps directly, watching them hunt and eat.  The typical chimp has to spend about six hours daily chewing its bulky, chewy raw foods.  Chimps hunt opportunistically, but will spend only 15 to 20 minutes on a hunt.  If not successful in that time frame, they give up and return to eating plants.  Why?

Wrangham explains that because digestion of raw food takes more time than digestion of cooked food and costs a lot of energy, a chimp has to devote eight or nine hours daily to feeding in order to get adequate energy.  Australopithecines and habilines probably had similar constraints.  This would have prevented them from investing much time in hunting:
"Males who did not cook would not have been able to rely on hunting to feed themselves.  Like chimps, they could hunt in opportunistic spurts.   But if they devoted many hours to hunting, the risk of failure to obtain prey could not be compensated rapidly enough.  Eating their daily required calories in the form of their staple plant foods would have taken too long."
As Wrangham explains, a division of labor into hunting and gathering would not solve this problem, so long as the food was consumed raw.
"Suppose that a hunter living on raw food has a mate who is willing to feed him, that his mate could collect enough raw foods for him (while satisfying her own needs) and would bring them back to a central place, to be met by her grateful mate.  Then suppose the male has had an unsuccessful day of hunting....The hungry hunter needs to consume, say, two thousand calories, but he cannot eat after dark.  To do so would be too dangerous, scrabbling in the predator-filled night to feel for the nuts, leaves, or roots his gatherer friend brought him.  If the hunter slept on the ground, he would be exposed to predators and large ungulates as he fumbled for his food.  If he were in a tree, he would find it hard to have his raw foods with him because they do not come in tidy packages.  
"So to eat his fill he would have to do most of his eating before dusk, which falls between about 6 and 7 P.M. in equatorial regions.  If he had eaten nothing while on the hunt, he would need to be back in camp before midday, and there he would find his mate's gathered foods (assuming she had been able to complete her food gathering so early in the day).  He would then have to spend the rest of the day eating, resting, eating, resting, and eating.  In short, the long hours of chewing necessitated by a raw diet would have sharply reduced hunting time.  It is questionable whether the sexual division of labor would have been possible at all.

"The use of fire solved the problem.  It freed hunters from previous time constraints by reducing the time spent chewing.  It also allowed eating after dark.  The first of our ancestral line to cook their food would have gained several hours of daytime. Instead of being an opportunistic activity, hunting could have become a more dedicated pursuit with a higher potential for success.  Nowadays men can hunt until nightfall and still eat a large meal in camp.  After cooking began, therefore, hunting could contribute to the full development of the family household, reliant as it is on a predictable economic exchange between women and men."
In short, cooking (and other culinary technologies that make food softer and easier to digest) made it possible for humans to pursue increased meat eating.  It freed men from the need to feed continuously on plant foods, giving them time to devote to hunting.

Simply put, cooked food delivers more energy and nutrition in a smaller, more easily digested package than raw food.  Wrangham argues that since Homo erectus had a larger brain and a much smaller face, mouth and teeth than Homo habilis, probably some tribe of Homo habilis first controlled fire and used it for cooking.  The resulting increase in energy and nutrient availability led to rapid selection for smaller guts and larger brains and bodies.  By providing protection from nocturnal predators, control of fire also enabled human ancestors to give up tree-dwelling.  It also supported the sexual division of labor (hunting and gathering/cooking) present in human cultures.

Wrangham's Catching Fire will provide plenty of food for thought for anyone interested in ancestral nutrition.