Meet Sir Edward Mellanby, the man who discovered vitamin D. Along with his wife, Dr. May Mellanby, he identified dietary factors that control the formation and repair of teeth and bones. He also identified the primary cause of rickets (vitamin D deficiency) and the effect of phytic acid on mineral absorption. Truly a great man! This research began in the 1910s and continued through the 1940s.
What he discovered about tooth and bone formation is profound, disarmingly simple, and largely forgotten. I remember going to the dentist as a child. He told me I had good teeth. I informed him that I tried to eat well and stay away from sweets. He explained to me that I had good teeth because of genetics, not my diet. I was skeptical at the time, and rightly so.
Tooth structure is primarily determined during growth. Well-formed teeth are highly resistant to decay, while poorly-formed teeth are cavity-prone. Drs. Mellanby demonstrated this by showing a strong correlation between tooth enamel defects and cavities in British children. The following graph is drawn from several studies he compiled in the book Nutrition and Disease (1934). "Hypoplastic" refers to enamel that's poorly formed on a microscopic level.
The graph is confusing, so don't worry if you're having a hard time interpreting it. If you look at the blue bar representing children with well-formed teeth, you can see that 77% of them have no cavities, and only 7.5% have severe cavities (a "3" on the X axis). Looking at the green bar, only 6% of children with the worst enamel structure are without cavities, while 74% have severe cavities. Enamel structure is VERY strongly related to cavity prevalence.
What determines enamel structure during growth? Drs. Mellanby identified three dominant factors:
- The mineral content of the diet
- The fat-soluble vitamin content of the diet, chiefly vitamin D
- The availability of minerals for absorption, determined largely by the diet's phytic acid content

Teeth and bones are a mineralized protein scaffold. Vitamin D influences the quality of the protein scaffold that's laid down, and the handling of the elements that mineralize it. For the scaffold to mineralize, the diet has to contain enough minerals, primarily calcium and phosphorus. Vitamin D allows the digestive system to absorb the minerals, but it can only absorb them if they aren't bound by phytic acid. Phytic acid is an anti-nutrient found primarily in unfermented seeds such as grains. So the process depends on getting minerals (sufficient minerals in the diet and low phytic acid) and putting them in the right place (fat-soluble vitamins).
Optimal tooth and bone formation occurs only on a diet that is sufficient in minerals and fat-soluble vitamins, and low in phytic acid. Drs. Mellanby used dogs in their experiments, which, it turns out, are a good model for tooth formation in humans for a reason I'll explain later. From Nutrition and Disease:
Thus, if growing puppies are given a limited amount of separated [skim] milk together with cereals, lean meat, orange juice, and yeast (i.e., a diet containing sufficient energy value and also sufficient proteins, carbohydrates, vitamins B and C, and salts), defectively formed teeth will result. If some rich source of vitamin D be added, such as cod-liver oil or egg-yolk, the structure of the teeth will be greatly improved, while the addition of oils such as olive... leaves the teeth as badly formed as when the basal diet only is given... If, when the vitamin D intake is deficient, the cereal part of the diet is increased, or if wheat germ [high in phytic acid] replaces white flour, or, again, if oatmeal [high in phytic acid] is substituted for white flour, then the teeth tend to be worse in structure, but if, under these conditions, the calcium intake is increased, then calcification [the deposition of calcium in the teeth] is improved.
Other researchers initially disputed the Mellanbys' results because they weren't able to replicate the findings in rats. It turns out, rats produce the phytic acid-degrading enzyme phytase in their small intestine, so they can extract minerals from unfermented grains better than dogs. Humans also produce phytase, but at levels so low they don't significantly degrade phytic acid. The small intestine of rats has about 30 times the phytase activity of the human small intestine, again demonstrating that humans are not well adapted to eating grains. Our ability to extract minerals from seeds is comparable to that of dogs, which shows that the Mellanbys' results are more applicable to humans than those in rats.
Drs. Mellanby found that the same three factors determine bone quality in dogs as well, which I may discuss in another post.
Is there anything someone with fully formed enamel can do to prevent tooth decay? Drs. Mellanby showed (in humans this time) that not only can tooth decay be prevented by a good diet, it can be almost completely reversed even if it's already present. Dr. Weston Price used a similar method to reverse tooth decay as well. I'll discuss that in my next post.
In the last post, I reviewed the controlled trials on the effect of the glycemic index (GI) of carbohydrate foods on health. I concluded that there is not much evidence that a low GI diet is better for health than a high GI diet.
It is true that for the "average" individual, the GI of carbohydrate foods can affect the glucose and insulin response somewhat, even in the context of an actual meal. If you compare two meals of very different GI, the low GI meal will cause less insulin secretion and less total blood glucose in the plasma over the course of the day (although the differences in blood glucose may not be large in all individuals).
But is that biologically significant? In other words, do those differences matter when it comes to health? I would argue probably not, and here's why: there's a difference between post-meal glucose and insulin surges within the normal range, and those that occur in pathological conditions such as diabetes and insulin resistance. Chronically elevated insulin is a marker of metabolic dysfunction, while post-meal insulin surges are not (although glucose surges in excess of 140 mg/dL indicate glucose intolerance). Despite what you may hear from some sectors of the low-carbohydrate community, insulin surges do not necessarily lead to insulin resistance. Just ask a Kitavan. They get 69% of their 2,200 calories per day from high-glycemic starchy tubers and fruit (380 g carbohydrate), with not much fat to slow down digestion. Yet they have low fasting insulin, very little body fat and an undetectable incidence of diabetes, heart attack and stroke. That's despite a significant elderly population on the island.
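As an aside, here's roughly where that 380 gram figure comes from. This is just a back-of-the-envelope check assuming the standard conversion of 4 calories per gram of carbohydrate:

```python
# Back-of-envelope check of the Kitavan carbohydrate figure, assuming the
# standard conversion of 4 kcal per gram of carbohydrate.
total_kcal = 2200        # approximate daily energy intake on Kitava
carb_fraction = 0.69     # fraction of energy from carbohydrate
kcal_per_gram = 4        # standard Atwater factor for carbohydrate

carb_grams = total_kcal * carb_fraction / kcal_per_gram
print(carb_grams)        # 379.5, i.e. roughly 380 g of carbohydrate per day
```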
Furthermore, in the 4-month GI intervention trial I mentioned last time, they measured something called glycated hemoglobin (HbA1c). HbA1c is a measure of the amount of blood glucose that has "stuck to" hemoglobin molecules in red blood cells. It's used to estimate a person's average blood glucose concentration over the preceding two to three months. The higher your HbA1c, the poorer your blood glucose control, the higher your likelihood of having diabetes, and the higher your cardiovascular risk. The low GI group had a statistically significant drop in their HbA1c value compared to the high GI group. But the difference was only 0.06%, a change that is biologically meaningless.
OK, let's take a step back. The goal of thinking about all this is to understand what's healthy, right? Let's take a look at how carbohydrate foods are consumed by cultures that rarely suffer from obesity or metabolic disease. Cultures that rely heavily on carbohydrate generally fall into three categories: they eat cooked starchy tubers, they grind and cook their grains, or they rely on grains that become very soft when cooked. In the first category, we have Africans, South Americans, Polynesians and Melanesians (including the Kitavans). In the second, we have various Africans, Europeans (including the villagers of the Loetschental valley), Middle Easterners and South Americans. In the third category, we have Asians, Europeans (the oat-eating residents of the outer Hebrides) and South Americans (quinoa-eating Peruvians).
The pattern here is one of maximizing GI, not minimizing it. That's not because high GI foods are inherently superior, but because traditional processing techniques that maximize the digestibility of carbohydrate foods also tend to increase their GI. I believe healthy cultures around the world didn't care about the glycemic index of foods, they cared about digestibility and nutritional value.
The reason we grind grains is simple. Ground grains are digested more easily and completely (hence the higher GI). Furthermore, ground grains are more effective than intact grains at breaking down their own phytic acid when soaked, particularly if they're allowed to ferment. This further increases their nutritional value.
The human digestive system is delicate. Cows can eat whole grass seeds and digest them using their giant four-compartment stomach that acts as a fermentation tank. Humans that eat intact grains end up donating them to the waste treatment plant. We just don't have the hardware to efficiently extract the nutrients from cooked whole rye berries, unless you're willing to chew each bite 47 times. Oats, quinoa, rice, beans and certain other starchy seeds are exceptions because they're softened sufficiently by cooking.
Grain consumption and grinding implements appear simultaneously in the archaeological record. Grinding has always been used to increase the digestibility of tough grains, even before the invention of agriculture when hunter-gatherers were gathering wild grains in the fertile crescent. Some archaeologists consider grinding implements one of the diagnostic features of a grain-based culture. Carbohydrate-based cultures have always prioritized digestibility and nutritional value over GI.
Finally, I'd like to emphasize that some people don't have a good relationship with carbohydrate. Diabetics and others with glucose intolerance should be very cautious with carbohydrate foods. The best way to know how you deal with carbohydrate is to get a blood glucose meter and use it after meals. For $70 or less, you can get a cheap meter and 50 test strips that will give you a very good idea of your glucose response to typical meals (as opposed to a glucose bomb at the doctor's office). Jenny Ruhl has a tutorial that explains the process. It's also useful to pay attention to how you feel and look with different amounts of carbohydrate in your diet.
The glycemic index (GI) is a measure of how much an individual food elevates blood sugar when it's eaten. To measure it, investigators feed a person a food that contains a fixed amount of carbohydrate, and measure their blood glucose response over time. Then they determine the area under the glucose curve and compare it to a standard food such as white bread or pure glucose.
Each food must contain the same total amount of carbohydrate, so you might have to eat a big plate of carrots to compare with a slice of bread. You end up with a number that reflects the food's ability to elevate glucose when eaten in isolation. It depends in large part on how quickly the carbohydrate is digested/absorbed, with higher numbers usually resulting from faster absorption.
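To make that concrete, here's a rough sketch of how a GI number is calculated: the incremental area under the blood glucose curve (the area above the fasting baseline) for the test food, divided by the same area for the reference food, times 100. The glucose readings below are made up for illustration; they aren't from any real test:

```python
def incremental_auc(times_min, glucose_mg_dl):
    """Trapezoidal area under the glucose curve above the fasting baseline.

    Excursions below baseline are ignored, as in the standard
    incremental-AUC method used for GI testing.
    """
    baseline = glucose_mg_dl[0]
    rises = [max(g - baseline, 0) for g in glucose_mg_dl]
    auc = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        auc += (rises[i - 1] + rises[i]) / 2 * dt
    return auc

# Hypothetical 2-hour glucose readings (mg/dL) after 50 g of carbohydrate.
times = [0, 15, 30, 45, 60, 90, 120]
glucose_reference = [85, 130, 150, 140, 120, 100, 88]   # e.g. pure glucose
glucose_test_food = [85, 110, 125, 120, 108, 95, 87]    # the food being tested

gi = 100 * incremental_auc(times, glucose_test_food) / incremental_auc(times, glucose_reference)
print(round(gi))  # the test food's GI relative to the reference (~62 here)
```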
The GI is a standby of modern nutritional advice. It's easy to believe in because processed foods tend to have a higher glycemic index than minimally processed foods, high blood sugar is bad, and chronically high insulin is bad. Yet many people have criticized the concept. Why?
Blood sugar responses to carbohydrate-containing foods vary greatly from person to person. For example, I can eat a medium potato and a big slice of white bread (roughly 60 g carbohydrate) with nothing else and only see a modest spike in my blood sugar. I barely break 100 mg/dL and I'm back at fasting glucose levels within an hour and a half. You can see a graph of this experiment here. That's what happens when you have a well-functioning pancreas and insulin-sensitive tissues. Your body shunts glucose into the tissues almost as rapidly as it enters the bloodstream. Someone with impaired glucose tolerance might have gone up to 170 mg/dL for two and a half hours on the same meal.
The other factor is that foods aren't eaten in isolation. Fat, protein, acidity and other factors slow carbohydrate absorption in the context of a normal meal, to the point where differences in the GI of individual foods become much less pronounced.
Researchers have conducted a number of controlled trials comparing low-GI diets to high-GI diets. I've done an informal literature review to see what the overall findings are. I'm only interested in long-term studies-- 10 weeks or longer-- and I've excluded studies using subjects with metabolic disorders such as diabetes.
The question I'm asking with this review is, what are the health effects of a low-glycemic index diet on a healthy normal-weight or overweight person? I found a total of seven studies on PubMed in which investigators varied GI while keeping total carbohydrate about the same, for 10 weeks or longer. I'll present them out of chronological order because they flow better that way.
One issue with this literature that I want to highlight before we proceed is that most of these studies weren't properly controlled to isolate the effects of GI independent of other factors. Low GI foods are often whole foods with more fiber, more nutrients, and a higher satiety value per calorie than high GI foods.
Study #1. Investigators put overweight women on a 12-week diet of either high-GI or low-GI foods with an equal amount of total carbohydrate. Both were unrestricted in calories. Body composition and total food intake were the same on both diets. Despite the diet advice aimed at changing GI, the investigators found that both groups' glucose and insulin curves were the same!
Study #2. Investigators divided 129 overweight young adults into four different diet groups for 12 weeks. Diet #1: high GI, high carbohydrate (60%). Diet #2: low GI, high carbohydrate. Diet #3: high GI, high-protein (28%). Diet #4: low GI, high protein. The high-protein diets were also a bit higher in fat. Although the differences were small and mostly not statistically significant, participants on diet #3 improved the most overall in my opinion. They lost the most weight, and had the greatest decrease in fasting insulin and calculated insulin resistance. Diet #2 came out modestly ahead of diet #1 on fat loss and fasting insulin.
Study #3. At 18 months, this is by far the longest trial. Investigators assigned 203 healthy Brazilian women to either a low-GI or high-GI energy-restricted diet. The difference in GI between the two diets was substantial; the GI of the high-GI diet was supposed to be double that of the low-GI diet. This was accomplished by a number of differences between diets, including different types of rice and higher bean consumption in the low-GI group. Weight loss was a meager 1/3 pound greater in the low-GI group, a difference that was not statistically significant at 18 months. Changes in estimated insulin sensitivity were not statistically significant.
Study #4. The FUNGENUT study. In this 12-week intervention, investigators divided 47 subjects with the metabolic syndrome into two diet groups. One was a high-glycemic, high-wheat group; the other was a low-glycemic, high-rye group. After 12 weeks, there was an improvement in the insulinogenic index (a marker of early insulin secretion in response to carbohydrate) in the rye group but not the wheat group. Glucose tolerance was essentially the same in both groups.
What makes this study unique is they went on to look at changes in gene expression in subcutaneous fat tissue before and after the diets. They found a decrease in the expression of stress and inflammation-related genes in the rye group, and an increase in stress and inflammation genes in the wheat group. They interpreted this as being the result of the different GIs of the two diets.
Further research will have to determine whether the result they observed is due to the glycemic differences of the two diets or something else.
Study #5. Investigators divided 18 subjects with elevated cardiovascular disease risk markers into two diets differing in their GI, for 12 weeks. The low-glycemic group lost 4 kg (statistically significant), while the high-glycemic group lost 1.5 kg (not statistically significant). In addition, the low-GI group ended up with lower 24-hour blood glucose measurements. This study was a bit strange because the high-GI group started off 14 kg heavier than the low-GI group, and the way the data are reported is difficult to understand. Perhaps these limitations, along with the study's incongruence with other controlled trials, are what inspired the authors to describe it as a pilot study.
Study #6. 45 overweight females were divided between high-GI and low-GI diets for 10 weeks. The low-GI group lost a small amount more fat than the high-GI group, but the difference wasn't significant. The low-GI group also had a 10% drop in LDL cholesterol.
Study #7. This was the second-longest trial, at 4 months. 34 subjects with impaired glucose tolerance were divided into three diet groups. Diet #1: high-carbohydrate (60%), high-GI. Diet #2: high-carbohydrate, low-GI. Diet #3: "low-carbohydrate" (49%), "high-fat" (monounsaturated from olive and canola oil). The diet #1 group lost the most weight, followed by diet #2, while diet #3 gained weight. The differences were small but statistically significant. The insulin and triglyceride response to a test meal improved in diet group #1 but not #2. The insulin response also improved in group #3. The high-GI group came out looking pretty good.
[Update 10/2011-- please see this post for a recent example of a 6 month controlled trial including 720 participants that tested the effect of glycemic index modification on body fatness and health markers-- it is consistent with the conclusion below]
Overall, these studies do not support the idea that lowering the glycemic index of carbohydrate foods is useful for weight loss, insulin or glucose control, or anything else besides complicating your life. I'll keep my finger on the pulse of this research as it expands, but for the time being I don't see the glycemic index per se as a significant way to combat fat gain or metabolic disease.
In April of 1982, archaeologists from around the globe converged on Plattsburgh, New York for a research symposium. Their goal:
...[to use] data from human skeletal analysis and paleopathology [the study of ancient diseases] to measure the impact on human health of the Neolithic Revolution and antecedent changes in prehistoric hunter-gatherer food economies. The symposium developed out of our perception that many widely debated theories about the origins of agriculture had testable but untested implications concerning human health and nutrition and our belief that recent advances in techniques of skeletal analysis, and the recent explosive increase in data available in this field, permitted valid tests of many of these propositions.
In other words, they got together to see what happened to human health as populations adopted agriculture. They were kind enough to publish the data presented at the symposium in the book Paleopathology at the Origins of Agriculture, edited by the erudite Drs. Mark Nathan Cohen and George J. Armelagos. It appears to be out of print, but luckily I have access to an excellent university library.
There are some major limitations to studying human health by looking at bones. The most obvious is that any soft tissue pathology will have been erased by time. Nevertheless, you can learn a lot from a skeleton. Here are the main health indicators discussed in the book:
- Mortality. Archaeologists are able to judge a person's approximate age at death, and if the number of skeletons is large enough, they can paint a rough picture of the life expectancy and infant mortality of a population.
- General growth. Total height, bone thickness, dental crowding, and pelvic and skull shape are all indicators of relative nutrition and health. This is particularly true in a genetically stable population. Pelvic depth is sensitive to nutrition and determines the size of the birth canal in women.
- Episodic stress. Bones and teeth carry markers of temporary "stress", most often due to starvation or malnutrition. Enamel hypoplasia, horizontal bands of thinned enamel on the teeth, is probably the most reliable marker. Harris lines, bands of increased density in long bones that may be caused by temporary growth arrest, are another type.
- Porotic hyperostosis and cribra orbitalia. These are both skull deformities that are caused by iron deficiency anemia, and are rather creepy to look at. They're typically caused by malnutrition, but can also result from parasites.
- Periosteal reactions. These are bone lesions resulting from infections.
- Physical trauma, such as fractures.
- Degenerative bone conditions, such as arthritis.
- Isotopes and trace elements. These can sometimes yield information about the nutritional status, diet composition and diet quality of populations.
- Dental pathology. My favorite! This category includes cavities, periodontal disease, missing teeth, abscesses, tooth wear, and excessive dental plaque.

The book presents data from 19 regions of the globe, representing Africa, Asia, the Middle East, Europe and South America, with a particular focus on North America. I'll kick things off with a fairly representative description of health in the upper Paleolithic in the Eastern Mediterranean. The term "Paleolithic" refers to the period from the invention of stone tools by hominids 2.5 million years ago, to the invention of agriculture roughly 10,000 years ago. The upper Paleolithic lasted from about 40,000 to 10,000 years ago. From page 59:
In Upper Paleolithic times nutritional health was excellent. The evidence consists of extremely tall stature from plentiful calories and protein (and some microevolutionary selection?); maximum skull base height from plentiful protein, vitamin D, and sunlight in early childhood; and very good teeth and large pelvic depth from adequate protein and vitamins in later childhood and adolescence...
The level of skeletal (including cranial and pelvic) development Paleolithic groups exhibited has remained unmatched throughout the history of agriculture. There may be exceptions but the trend is clear. Cranial capacity was 11% higher in the upper Paleolithic. You can see the pelvic data in this table taken from Paleopathology at the Origins of Agriculture.
There's so much information in this book that the best I can do is quote pieces of the editors' summary and add a few remarks of my own. One of the most interesting things I learned from the book is that the diet of many hunter-gatherer groups changed at the end of the upper Paleolithic, foreshadowing the shift to agriculture. From pages 566-568:
Adult longevity, at 35 years for males and 30 years for females, implies fair to good general health...
There is no clear evidence for any endemic disease.
During the upper Paleolithic stage, subsistence seems focused on relatively easily available foods of high nutritional value, such as large herd animals and migratory fish. Some plant foods seem to have been eaten, but they appear not to have been quantitatively important in the diet. Storage of foods appears early in many sequences, even during the Paleolithic, apparently to save seasonal surpluses for consumption during seasons of low productivity.
One of the interesting things I learned from the book is that Mesolithic populations, groups that were halfway between farming and hunting-gathering, were generally as healthy as hunter-gatherers:
As hunting and gathering economies evolve during the Mesolithic [period of transition between hunting/gathering and agriculture], subsistence is expanded by exploitation of increasing numbers of species and by increasingly heavy exploitation of the more abundant and productive plant species. The inclusion of significant amounts of plant food in prehistoric diets seems to correlate with increased use of food processing tools, apparently to improve their taste and digestibility. As [Dr. Mark Nathan] Cohen suggests, there is an increasing focus through time on a few starchy plants of high productivity and storability. This process of subsistence intensification occurs even in regions where native agriculture never developed. In California, for example, as hunting-gathering populations grew, subsistence changed from an early pattern of reliance on game and varied plant resources to one with increasing emphasis on collection of a few species of starchy seeds and nuts.
...As [Dr. Cohen] predicts, evolutionary change in prehistoric subsistence has moved in the direction of higher carrying capacity foods, not toward foods of higher-quality nutrition or greater reliability. Early nonagricultural diets appear to have been high in minerals, protein, vitamins, and trace nutrients, but relatively low in starch. In the development toward agriculture there is a growing emphasis on starchy, highly caloric food of high productivity and storability, changes that are not favorable to nutritional quality but that would have acted to increase carrying capacity, as Cohen's theory suggests.
...it seems clear that seasonal and periodic physiological stress regularly affected most prehistoric hunting-gathering populations, as evidenced by the presence of enamel hypoplasias and Harris lines. What also seems clear is that severe and chronic stress, with high frequency of hypoplasias, infectious disease lesions, pathologies related to iron-deficiency anemia, and high mortality rates, is not characteristic of these early populations. There is no evidence of frequent, severe malnutrition, so the diet must have been adequate in calories and other nutrients most of the time. During the Mesolithic, the proportion of starch in the diet rose, to judge from the increased occurrence of certain dental diseases [with exceptions to be noted later], but not enough to create an impoverished diet... There is a possible slight tendency for Paleolithic people to be healthier and taller than Mesolithic people, but there is no apparent trend toward increasing physiological stress during the mesolithic.
Cultures that adopted intensive agriculture typically showed a marked decline in health indicators. This is particularly true of dental health, which usually became quite poor.
Stress, however, does not seem to have become common and widespread until after the development of high degrees of sedentism, population density, and reliance on intensive agriculture. At this stage in all regions the incidence of physiological stress increases greatly, and average mortality rates increase appreciably. Most of these agricultural populations have high frequencies of porotic hyperostosis and cribra orbitalia, and there is a substantial increase in the number and severity of enamel hypoplasias and pathologies associated with infectious disease. Stature in many populations appears to have been considerably lower than would be expected if genetically-determined maxima had been reached, which suggests that the growth arrests documented by pathologies were causing stunting... Incidence of carbohydrate-related tooth disease increases, apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.
Infectious disease increased upon agricultural intensification:
Most [studies] conclude that infection was a more common and more serious problem for farmers than for their hunting and gathering forebears; and most suggest that this resulted from some combination of increasing sedentism, larger population aggregates, and the well-established synergism between infection and malnutrition.
There are some apparent exceptions to the trend of declining health with the adoption of intensive agriculture. In my observation, they fall into two general categories. In the first, health improves upon the transition to agriculture because the hunter-gatherer population was unhealthy to begin with. This is due to living in a marginal environment or eating a diet with a high proportion of wild plant seeds. In the second category, the culture adopted rice. Rice is associated with less of a decline in health, and in some cases an increase in overall health, than other grains such as wheat and corn. In chapter 21 of the book Ancient Health: Bioarchaeological Interpretations of the Human Past, Drs. Michelle T Douglas and Michael Pietrusewsky state that "rice appears to be less cariogenic [cavity-promoting] than other grains such as maize [corn]."
One pathology that seems to have decreased with the adoption of agriculture is arthritis. The authors speculate that it may have more to do with strenuous activity than other aspects of the lifestyle such as diet. Another interpretation is that the hunter-gatherers appeared to have a higher arthritis rate because of their longer lifespans:
The arthritis data are also complicated by the fact that the hunter-gatherers discussed commonly displayed higher average ages at death than did the farming populations from the same region. The hunter-gatherers would therefore be expected to display more arthritis as a function of age even if their workloads were comparable [to farmers].
In any case, it appears arthritis is normal for human beings and not a modern degenerative disease.
And the final word:
Taken as a whole, these indicators fairly clearly suggest an overall decline in the quality-- and probably in the length-- of human life associated with the adoption of agriculture.
Margarine is one of my favorite foods. To rip on. It's just so easy!
The body has a number of ways of keeping bad things out while taking good things in. Among the things it likes to keep out are plant sterols and stanols (phytosterols), cholesterol-like molecules found in plants. The human body even has two transport proteins dedicated to pumping phytosterols back into the gut as they try to diffuse across the intestinal lining: the sterolins. These transporters actively block phytosterols from passing into the body, but allow cholesterol to enter. Still, a little bit gets through, proportional to the amount in the diet.
As a matter of fact, the body tries to keep most things out except for the essential nutrients and a few other useful molecules. Phytosterols, plant "antioxidants" like polyphenols, and just about anything else that isn't body building material get actively excluded from circulation or rapidly broken down by the liver. And almost none of it gets past the blood-brain barrier, which protects one of our most delicate organs. It's not surprising once you understand that many of these substances are bioactive: they have drug-like effects that interfere with enzyme activity and signaling pathways. For example, the soy isoflavone genistein abnormally activates estrogen receptors. Your body does not like to hand over the steering wheel to plant chemicals, so it actively defends itself.
A number of trials have shown that large amounts of phytosterols in the diet lower total cholesterol and LDL. This has led to the (still untested) hypothesis that phytosterols lower heart attack risk. The main problem with this hypothesis is that although statin drugs do lower LDL and heart attack risk, not all interventions that lower LDL lower risk. LDL plays an important role in heart attack risk, but it's not the only factor. Statins have a number of biological effects besides lowering LDL, and some of these probably play a role in their ability to protect against heart attacks.
Lowering total cholesterol and LDL through diet and drugs other than statins does not reliably reduce mortality in controlled trials. Decades of controlled diet trials showed overall that replacing saturated fat with polyunsaturated vegetable oil lowers total cholesterol and LDL, but doesn't reliably reduce the risk of cardiovascular disease. Soy contains a lot of phytosterols, which is one of the reasons it's heavily promoted as a health food.
All right, let's put on our entrepreneur hats. We know phytosterols lower cholesterol. We know soy is being promoted as a healthier alternative to meat. We know butter is considered a source of artery-clogging saturated fat. I have an idea. Let's make a margarine that contains a massive dose of phytosterols and market it as heart-healthy. We'll call it Benecol, and we'll have doctors recommend it to cardiac patients.
Here are the ingredients:
Liquid Canola Oil, Water, Partially Hydrogenated Soybean Oil, Plant Stanol Esters, Salt, Emulsifiers (Vegetable Mono- and Diglycerides, Soy Lecithin), Hydrogenated Soybean Oil, Potassium Sorbate, Citric Acid and Calcium Disodium EDTA to Preserve Freshness, Artificial Flavor, DL-alpha-Tocopheryl Acetate, Vitamin A Palmitate, Colored with Beta Carotene.
And I haven't even gotten to the best part yet. There's a little disorder called phytosterolemia that may be relevant here. These patients have a mutation in one of their sterolin genes that allows phytosterols (including stanols) to pass into their circulation more easily. They end up with 10-25 times more phytosterols in their circulation than a normal individual. What kind of health benefits do these people see? Premature atherosclerosis, an early death from heart attacks, abnormal accumulation of sterols and stanols in the tendons, and liver damage.
Despite the snappy-looking tub, margarine is just another industrial food-like substance that I am highly suspicious of. In the U.S., manufacturers can put the statement "no trans fat" on a product's label, and "0 g trans fat" on the nutrition label, if it contains less than 0.5 grams of trans fat per serving. A serving of Benecol is 14 grams. That means it could be nearly 3.6 percent trans fat by weight and still be labeled "no trans fat". This stuff is being recommended to cardiac patients.
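The arithmetic behind that trans fat figure is straightforward:

```python
# How much trans fat can hide behind a "0 g trans fat" label?
# Under U.S. labeling rules, anything under 0.5 g per serving counts as zero.
serving_size_g = 14.0          # one serving of Benecol spread
labeling_threshold_g = 0.5     # trans fat can approach this and still be listed as 0 g

max_hidden_trans_fat_pct = 100 * labeling_threshold_g / serving_size_g
print(round(max_hidden_trans_fat_pct, 1))  # ~3.6 percent of the serving by weight
```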
When deciding whether or not a food is healthy, the precautionary principle is in order. Margarine is a food that has not withstood the test of time. Show me a single healthy culture on this planet that eats margarine regularly. Cow juice may not be as flashy as the latest designer food, but it has sustained healthy cultures for generations. The U.S. used to belong to those ranks, when coronary heart disease was rare.
A Dutch group led by Dr. Yvonne T. van der Schouw recently published a paper examining the link between vitamin K intake and heart attack (thanks Robert). They followed 16,057 women ages 49-70 years for an average of 8.1 years, collecting data on their diet and incidence of heart attack.
They found no relationship between K1 intake and heart attack incidence. K1 is the form found in leafy greens and other plant foods. They found that each 10 microgram increase in daily vitamin K2 consumption was associated with a 9% lower incidence of heart attack. Participants consumed an average of 29 micrograms of K2 per day, with a range of 0.9 to 128 micrograms. That means that participants with the highest intake had a substantially lower incidence of heart attack on average. Vitamin K2 comes from animal foods (especially organs and pastured dairy) and fermented foods such as cheese, sauerkraut, miso and natto. Vitamin K is fat-soluble, so low-fat animal foods contain less of it. Animal foods contain the MK-4 subtype while fermentation produces longer menaquinones, MK-5 through MK-14.
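To get a rough sense of what that per-10-microgram figure implies across the observed intake range, here's a simple extrapolation. It assumes the association compounds multiplicatively (a constant 0.91 rate ratio per 10 micrograms), which is my simplifying assumption, not necessarily the model the authors used:

```python
# Rough implication of "9% lower incidence per 10 micrograms of K2 per day,"
# assuming the association compounds multiplicatively across the intake range.
# This is an illustrative extrapolation, not the authors' published model.
rate_ratio_per_10ug = 0.91
lowest_intake_ug = 0.9
highest_intake_ug = 128.0

increments = (highest_intake_ug - lowest_intake_ug) / 10.0
relative_rate = rate_ratio_per_10ug ** increments

print(f"Implied relative rate at the top of the range: {relative_rate:.2f}")
# ~0.30, i.e. roughly a 70% lower incidence than at the bottom of the range
```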
There's quite a bit of evidence to support the idea that vitamin K2 inhibits and possibly reverses arterial calcification, which is possibly the best overall measure of heart attack risk. It began with the observations of Dr. Weston Price, who noticed an inverse relationship between the K2 MK-4 content of butter and deaths from coronary heart disease and pneumonia in several regions of the U.S. You can find those graphs in Nutrition and Physical Degeneration.
The 25% of participants eating the most vitamin K2 (and with the lowest heart attack risk) also had the highest saturated fat, cholesterol, protein and calcium intake. They were much less likely to have elevated cholesterol, but were more likely to be diabetic.
Here's where the paper gets strange. They analyzed the different K2 subtypes individually (MK-4 through MK-9). MK-7 and MK-6 had the strongest association with reduced heart attack risk per microgram consumed, while MK-4 had no significant relationship. MK-8 and MK-9 had a weak but significant protective relationship.
There are a few things that make me skeptical about this result. First of all, the studies showing prevention/reversal of arterial calcification in rats were done with MK-4; to my knowledge, the longer menaquinones haven't been tested for this. Furthermore, they attribute a protective effect to MK-7 in this study, but the average daily intake was only 0.4 micrograms! You could get that amount of K2 if a Japanese person who had eaten natto last week sneezed on your food. I can't imagine that amount of MK-7 is biologically significant. That, among other things, makes me skeptical of what they're really observing.
I'm not convinced of their ability to parse the effect into the different K2 subtypes. They mentioned in the methods section that their diet survey wasn't very accurate at estimating the individual K2 subtypes. Combine that with the fact that the K2 content of foods varies quite a bit by animal husbandry practice and type of cheese, and you have a lot of variability in your data. Add to that the well-recognized variability inherent in these food questionnaires, and you have even more variability.
I'm open to the idea that longer menaquinones (K2 MK-5 and longer, including MK-7) play a role in preventing cardiovascular disease, but I don't find the evidence sufficient yet. MK-4 is the form of K2 that's made by animals, for animals. Mammals produce it in their breast milk and other animals produce it in eggs all the way down to invertebrates. I think we can assume they make MK-4, and not the longer menaquinones, for a reason.
MK-4 is able to play all the roles of vitamin K in the body, including activating blood clotting factors, a role traditionally assigned to vitamin K1. This is obvious because K2 MK-4 is the only significant source of vitamin K in the diet of infants before weaning. No one knows whether the longer menaquinones are able to perform all the functions of MK-4; it hasn't been tested and I don't know how you could ever be sure. MK-7 is capable of performing at least some of these functions, such as activating osteocalcin and clotting factors.
I do think it's worth noting that the livers of certain animals contain longer menaquinones, including MK-7. So it is possible that we're adapted to eating some of the longer menaquinones. Many cultures also have a tradition of fermented food (probably a relatively recent addition to the human diet), which could further increase the intake of longer menaquinones. The true "optimum", if there is one, may be to eat a combination of forms of K2, including MK-4 and the longer forms. But babies and healthy traditional cultures such as the Masai seem to do quite well on a diet heavily weighted toward MK-4, so the longer forms probably aren't strictly necessary.
Well if you've made it this far, you're a hero (or a nerd)! Now for some humor. From the paper:
The concept of proposing beneficial effects to vitamin K2 seems to have different basis as for vitamin K1. Vitamin K1 has been associated with a heart-healthy dietary pattern in the earlier work in the USA and this attenuated their associations with CHD. Vitamin K2 has different sources and relate to different dietary patterns than vitamin K1. This suggests that the risk reduction with vitamin K2 is not driven by dietary patterns, but through biological effects.
They seem confused by the fact that people who ate foods high in saturated fat and cholesterol had less CHD, yet people consuming green vegetables didn't. Here's more:
Thus, although our findings may have important practical implications on CVD prevention, it is important to mention that in order to increase the intake of vitamin K2, increasing the portion vitamin K2 rich foods in daily life might not be a good idea. Vitamin K2 might be, for instance more relevant in the form of a supplement or in low-fat dairy. More research into this is necessary.
Translation: "People who ate the most cheese, milk and meat had the lowest heart attack rate, but be careful not to eat those things because they might give you a heart attack. Get your K2 from low-fat dairy (barely contains any) and supplements."
Vegetarians deserve our respect. They're usually thoughtful, conscientious people who make sacrifices for environmental and ethical reasons. I was vegetarian for a while myself, and I have no regrets about it.
Vegetarianism and especially veganism can get pretty ideological sometimes. People who have strong beliefs like to think that their belief system is best for all aspects of their lives and the world, not just some aspects of it. Many vegetarians believe their way of eating is healthier than omnivory. One of the classic arguments for vegetarianism goes something like this: our closest living relatives, chimpanzees and bonobos, are mostly vegetarian, therefore that's the diet to which we're adapted as well. Here's the problem with that argument:
Where are chimps (Pan troglodytes) on this chart? They aren't on it, for two related reasons: they aren't in the genus Homo, and they diverged from us 5-7 million years ago. Homo erectus diverged from our lineage about 1.5 million years ago. I don't know if you've ever seen a Homo erectus skull, but 1.5 million years is clearly enough time to do some evolving. Homo erectus ate animals as a significant portion of its diet.
If you look at the chart above, Homo rhodesiensis (often considered a variant of Homo heidelbergensis) is our closest ancestor, and our point of divergence with neanderthals (Homo neanderthalensis). Some archaeologists believe H. heidelbergensis was the same species as modern Homo sapiens. I haven't been able to find any direct evidence of the diet of H. heidelbergensis from bone isotope ratios, but the indirect evidence indicates that they were capable hunters who probably got a substantial proportion of their calories from meat. In Europe, they hunted now-extinct megafauna such as wooly rhinos. These things make modern cows look like chicken nuggets.
H. heidelbergensis was a skilled hunter and very athletic. They were top predators in their ecosystems, judging from the fact that they took their time with carcasses, butchering them thoroughly and extracting marrow from bones. No predator or scavenger was capable of driving them away from a kill.
Our closest recent relative was Homo neanderthalensis, the neanderthal. They died out around 30,000 years ago. There have been several good studies on the isotope ratios of neanderthal bones, all indicating that neanderthals obtained most of their protein from meat. They relied both on land and marine animals, depending on what was available. Needless to say, neanderthals are much more closely related to humans than chimpanzees, having diverged from us less than 500,000 years ago. That's less than one-tenth the time between humans and chimpanzees.
I don't think this means humans are built to be carnivores, particularly since there is accumulating evidence of diverse plant consumption by neanderthals, but it certainly blows away the argument that we're built to be vegetarians. Historical human hunter-gatherers had very diverse diets, but on average were meat-heavy omnivores.
Ricardo just sent me a link to the British Heart Foundation statistics website. It's a goldmine. They have data on just about every aspect of health and lifestyle in the U.K. I find it very empowering to have access to this kind of information on the internet.
I've just started sifting through it, but something caught my eye. The U.K. is experiencing an obesity epidemic similar to the U.S.:
Here's where it gets interesting. This should look familiar:
Hmm, those trends look remarkably similar. Just like in the U.S., the British are exercising more and getting fatter with each passing year. In fact, maybe exercise causes obesity. Let's see if there's any correlation between the two. I'm going to plot obesity on the X-axis and exercise on the Y-axis. The data points only overlap on three years: 1998, 2003 and 2006. Let's take a look:
By golly, we've proven that exercise causes obesity! Clearly, the more people exercise, the fatter they get. The R-value is a measure of how closely the points fall on the best-fit line. 0.82 isn't bad for this type of data. If only we could get all British citizens to become couch potatoes, obesity would be a thing of the past! OK, I'm kidding. The obesity is obviously caused by something else. I'm illustrating the point that correlations can sometimes be misleading. Even if an association conforms to our preconceived notions of how the world works, that does not necessarily justify saying one factor causes another. Controlled experiments can often help us strengthen a claim of causality.
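For anyone who wants to play with this kind of data themselves, here's a minimal sketch of how the R-value and best-fit line are calculated. The numbers are placeholders standing in for the three overlapping years, not the actual British Heart Foundation figures:

```python
import numpy as np

# Placeholder values for the three overlapping years (1998, 2003, 2006).
# These are illustrative only -- substitute the actual BHF obesity and
# exercise percentages to reproduce the plot in this post.
obesity_pct = np.array([19.0, 23.0, 24.0])     # adults classified as obese (%)
exercise_pct = np.array([32.0, 35.0, 37.0])    # adults meeting activity guidelines (%)

# Pearson correlation coefficient (the "R-value").
r = np.corrcoef(obesity_pct, exercise_pct)[0, 1]

# Ordinary least-squares best-fit line: exercise = slope * obesity + intercept.
slope, intercept = np.polyfit(obesity_pct, exercise_pct, 1)

print(f"R = {r:.2f}, best-fit line: y = {slope:.2f}x + {intercept:.2f}")
# A high R here only shows association, not that exercise causes obesity.
```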