Wednesday, June 17, 2009

A Little Tidbit

I'm gearing up for a new series of posts based on some fascinating reading I've been doing lately. I'm not going to spill the beans, but I will give you a little hint, from a paper written by Dr. Robert S. Corruccini, professor of anthropology at Southern Illinois University. I just came across this quote and it blew me away. It's so full of wisdom I can't even believe I just read it. The term "occlusion" refers to the way the upper and lower teeth come together, as in overbite or underbite.
Similar to heart disease and diabetes which are "diseases of civilization" or "Western diseases" (Trowell and Burkitt, 1981) that have attained high prevalence in urban society because of environmental factors rather than "genetic deterioration," an epidemiological transition (Omran, 1971) in occlusal health accompanies urbanization.

Western society has completely crossed this transition and now exists in a state of industrially buffered environmental homogeneity. The relatively constant environment both raises genetic variance estimates (since environmental variance is lessened) and renders epidemiological surveys largely meaningless because etiological factors are largely uniform. Nevertheless most occlusal epidemiology and heritability surveys are conducted in this population rather than in developing countries currently traversing the epidemiological transition.
In other words, the reason observational studies in affluent nations haven't been able to get to the bottom of dental/orthodontic problems and chronic disease is that everyone in their study population is doing the same thing! There isn't enough variability in the diets and lifestyles of modern populations to be able to determine what's causing the problem. So we study the genetics of problems that are not genetic in origin, and overestimate genetic contributions because we're studying populations whose diet and lifestyle are homogeneous. It's a wild goose chase.

That's why you have to study modernizing populations that are transitioning from good to poor health, which is exactly what Dr. Weston Price and many others have done. Only then can you see the true, non-genetic, nature of the problem.

Saturday, May 16, 2009

The Coronary Heart Disease Epidemic: Possible Culprits Part I

In the last post, I reviewed two studies suggesting that heart attacks were rare in the U.K. until the 1920s-1930s. In this post, I'll discuss some of the diet and lifestyle factors that preceded and were associated with the coronary heart disease epidemic in the U.K. and U.S. I've cherry-picked factors that I believe could have played a causal role. Many things changed during that time period, and I don't want to give the impression that I have "the answer". I'm simply presenting ideas for thought and discussion.

First on the list: sugar. Here's a graph of refined sugar consumption in the U.K. from 1815 to 1955, from the book The Saccharine Disease, by Dr. T. L. Cleave. Sugar consumption increased dramatically in the U.K. over this time period, reaching near-modern levels by the turn of the century, and continuing to increase after that except during the wars.

Here's a graph of total sweetener consumption in the U.S. from 1909 to 2005 (source: USDA food supply database). Between 1909 and 1922, sweetener consumption increased by 40%:

If we assume a 10- to 20-year lag period, sugar is well placed to play a role in the CHD epidemic. Sugar is easy to pick on. Diets high in refined sugar tend to promote obesity due to overeating. Excess sugar causes a number of detrimental changes in animal models and human subjects that are partially dependent on the development of obesity, including fatty liver, the metabolic syndrome, and small, oxidized low-density lipoprotein (LDL) particles. Small and oxidized LDL associate strongly with cardiovascular disease risk and may be involved in causing it. These effects seem to be partly attributable to the fructose portion of sugar, which is 50% of table sugar (sucrose), about 50% of most naturally sweet foods, and 55% of the most common form of high-fructose corn syrup. That may explain why starches, which break down into glucose (another type of sugar), don't have the same negative effects as table sugar and HFCS.
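
Just to make the fructose arithmetic concrete, here's a minimal sketch using the fractions above; the daily sweetener intake and the sucrose/HFCS split are hypothetical numbers, chosen only to show how the figures combine.

```python
# Rough fructose estimate from the fractions quoted above (illustrative only).
SUCROSE_FRUCTOSE = 0.50   # table sugar (sucrose) is about half fructose
HFCS55_FRUCTOSE = 0.55    # HFCS-55 is about 55% fructose

daily_sweetener_g = 100.0             # hypothetical total sweetener intake (g/day)
sucrose_share, hfcs_share = 0.5, 0.5  # hypothetical split between the two sweeteners

fructose_g = daily_sweetener_g * (sucrose_share * SUCROSE_FRUCTOSE
                                  + hfcs_share * HFCS55_FRUCTOSE)
print(f"Estimated fructose: {fructose_g:.1f} g/day")  # ~52.5 g on these assumptions
```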

Hydrogenated fat is the next suspect. I don't have any graphs to present, because to my knowledge no one has systematically tracked hydrogenated fat consumption in the U.S. or U.K. However, it was first marketed in the U.S. by Procter & Gamble under the brand name Crisco in 1911. Crisco stands for "crystallized cottonseed oil"; it's made by taking an industrial waste oil (from cotton seeds) and chemically treating it with high temperature, a nickel catalyst and hydrogen gas (see this post for more information). Hydrogenated fats for human consumption hit markets in the U.K. around 1920. Here's what Dr. Robert Finlayson had to say about margarine in his paper "Ischaemic Heart Disease, Aortic Aneurysms, and Atherosclerosis in the City of London, 1868-1982":
...between 1909-13 and 1924-28, margarine consumption showed the highest percentage increase, whilst that of eggs only increased slightly and that of butter remained unchanged. Between 1928 and 1934, margarine consumption fell by one-third, while butter consumption increased by 57 percent: an increase that coincided with a fall of 48 percent in its price. Subsequently, margarine sales have burgeoned, and if one is correct in stating that the coronary heart disease epidemic started in the second decade of this century, then the concept of hydrogenated margarines as an important aetiological factor, so strongly advocated by Martin, may merit more consideration than hitherto.
Partially hydrogenated oils contain trans fat, which is truly new to the human diet, with the exception of small amounts found in ruminant fats including butter. For the most part, natural trans fats are not the same as industrial trans fats, and in fact some of them, such as conjugated linoleic acid (CLA), may be beneficial. To my knowledge, no one has discovered health benefits of industrial trans fats. To the contrary: compared to butter, they shrink LDL particle size, and they inhibit enzymes that the body uses to make a diverse class of signaling compounds known as eicosanoids. Trans fat consumption associates very strongly with the risk of heart attack in observational studies. That's ironic, because hydrogenated fats were originally marketed as a healthier alternative to animal fats. In 2009, even the staunchest opponents of animal fats have to admit that they're healthier than hydrogenated fat.
The rise of cigarettes was a major change that probably contributed massively to the CHD epidemic. They were introduced just after the turn of the century in the U.S. and U.K., and rapidly became fashionable (source):
If you look at the second to last graph from the previous post, you can see that there's a striking correspondence between cigarette consumption and CHD deaths in the U.K. In fact, if you moved the line representing cigarette consumption to the right by about 20 years, it would overlap almost perfectly with CHD deaths. The risk of heart attack is so strongly associated with smoking in observational studies that even I believe it probably represents a causal relationship. There's no doubt in my mind that smoking cigarettes contributes to the risk of heart attack and various other health problems.

Smoking is a powerful factor, but it doesn't explain everything. How is it that the Kitavans of Papua New Guinea, more than 3/4 of whom smoke cigarettes, have an undetectable incidence of heart attack and stroke? Why do the French and the Japanese, who smoke like chimneys (at least until recently), have the two lowest heart attack death rates of all the affluent nations? There's clearly another factor involved that trumps cigarette smoke. 

Tuesday, May 12, 2009

The Coronary Heart Disease Epidemic

Few people alive today are old enough to remember the beginning of the coronary heart disease (CHD) epidemic in the 1920s and 1930s, when physicians in the U.S. and U.K. began sounding alarm bells that an uncommon disease was rapidly becoming the leading cause of death. By the 1950s, their predictions had come true. A decade later, a new generation of physicians replaced their predecessors and began to doubt that heart attacks had ever been uncommon. Gradually, the idea that the disease was once uncommon faded from the public consciousness, and heart attacks were seen as an eternal plague of humankind, avoided only by dying of something else first.

According to U.S. National Vital Statistics records beginning in 1900, CHD was rarely given as the cause of death by physicians until after 1930. The following graph is from The Great Cholesterol Con, by Anthony Colpo.


The relevant line for CHD deaths begins in the lower left-hand part of the graph. Other types of heart disease, such as heart failure due to cardiomyopathy, were fairly common and well recognized at the time. These data are highly susceptible to bias because they depend on the physician's perception of the cause of death, and are not adjusted for the mean age of the population. In other words, if a diagnosis of CHD wasn't "popular" in 1920, its prevalence could have been underestimated. The invention of new technologies such as the electrocardiogram facilitated diagnosis. Changes in diagnostic criteria also affected the data; you can see them as discontinuities in 1948, 1968 and 1979. For these reasons, the trend above isn't a serious challenge to the idea that CHD has always been a common cause of death in humans who reach a certain age.

This idea was weakened in 1951 with the publication of a paper in the Lancet medical journal titled "Recent History of Coronary Disease", by Dr. Jerry N. Morris. Dr. Morris sifted through the autopsy records of London Hospital and recorded the frequency of coronary thrombosis (artery blockage in the heart) and myocardial infarction (MI; loss of oxygen to the heart muscle) over the period 1907-1949. MI is the technical term for a heart attack, and it can be caused by coronary thrombosis. Europe has a long history of autopsy study, and London Hospital had a long-standing policy of routine autopsies during which they kept detailed records of the state of the heart and coronary arteries. Here's what he found:

The dashed line is the relevant one. This is a massive increase in the prevalence of CHD death that cannot be explained by changes in average lifespan. Although the average lifespan increased considerably over that time period, most of the increase was due to reduced infant mortality. The graph only includes autopsies performed on people 35-70 years old. Life expectancy at age 35 changed by less than 10 years over the same time period. The other possible source of bias is in the diagnosis. Physicians may have been less likely to search for signs of MI when the diagnosis was not "popular". Morris addresses this in the paper:
The first possibility, of course, is that the increase is not real but merely reflects better post-mortem diagnosis. This is an unlikely explanation. There is abundant evidence throughout the forty years that the department was fully aware of the relation of infarction to thrombosis, of myocardial fibrosis to gradual occlusion, and of the topical pathology of ostial stenosis and infarction from embolism, as indeed were many pathologists last century... But what makes figures like these important is that, unlike other series of this kind, they are based on the routine examination at necropsy of the myocardium and of the coronary arteries over the whole period. Moreover Prof. H. M. Turnbull, director of the department, was making a special case of atheroma and arterial disease in general during 1907-1914 (Turnbull 1915). The possibility that cases were overlooked is therefore small, and the earlier material is as likely to be reliable as the later.
Dr. Morris's study was followed by another, similar one published in 1985 in the journal Medical History, titled "Ischaemic Heart Disease, Aortic Aneurysms, and Atherosclerosis in the City of London, 1868-1982", conducted by Dr. Robert Finlayson. This study, in my opinion, is the coup de grâce. Finlayson systematically scrutinized autopsy reports from St. Bartholomew's Hospital, which had conducted routine and detailed cardiac autopsies since 1868, and applied modern diagnostic criteria to the records. He also compared the records from St. Bartholomew's to those from the city mortuary. Here's what he found:

The solid line is MI mortality. Striking, isn't it? The other lines are tobacco and cigarette consumption. These data are not age-adjusted, but if you look at the raw data tables provided in the paper, some of which are grouped by age, it's clear that average lifespan doesn't explain much of the change. Heart attacks are largely an occurrence of the last 80 years.

What caused the epidemic? Both Drs. Morris and Finlayson also collected data on the prevalence of atherosclerosis (plaques in the arteries) over the same time period. Dr. Morris concluded that the prevalence of severe atherosclerosis had decreased by about 50% (although mild atherosclerosis such as fatty streaks had increased), while Dr. Finlayson found that it had remained approximately the same:


He found the same trend in females. This casts doubt on the idea that coronary atherosclerosis is sufficient in and of itself to cause heart attacks, although modern studies have found a strong association between advanced atherosclerosis and the risk of heart attack on an individual level. Heart attacks are caused by several factors, one of which is atherosclerosis.  

What changes in diet and lifestyle were associated with the explosion of MI in the U.K. and U.S. after 1920? Dr. Finlayson has given us a hint in the graph above: cigarette consumption increased dramatically over the same time period, and closely paralleled MI mortality. Smoking cigarettes is very strongly associated with heart attacks in observational studies, and animal studies also support the link. While I believe cigarettes are an important factor, I do not believe they are the only cause of the MI epidemic. Dr. Finlayson touched on a few other factors in the text of the paper, and of course I have my own two cents to add. I'll discuss that next time.

Thursday, May 7, 2009

Dihydro-Vitamin K1

Step right up, ladies and gents; I have a new miracle vitamin for you. Totally unknown to our ignorant pre-industrial ancestors, it's called dihydro-vitamin K1. It's formed during the oil hydrogenation process, so the richest sources are hydrogenated fats like margarine, shortening and commercial deep-fry oil. Some of its benefits may include:
Dihydro-vitamin K1 accounts for roughly 30% of the vitamin K intake of American children, and a substantial portion of adult intake as well. Over 99 percent of Americans have it in their diet. Research on dihydro-vitamin K1 is in its infancy at this point, so no one has a very solid idea of its effects on the body beyond some preliminary and disturbing suggestions from animal experiments and brief human trials.

This could be another mechanism by which industrially processed vegetable oils degrade health. It's also another example of why it's not a good idea to chemically alter food. We don't understand food, or our bodies, well enough to know the long-term consequences of foods that have been recently introduced to the human diet. I believe these foods should be avoided on principle.

Monday, May 4, 2009

Pastured Eggs

Eggs are an exceptionally nutritious food. It's not surprising, considering they contain everything necessary to build a chick! But all eggs are not created equal. Anyone who has seen the tall, orange yolk, viscous white and tough shell of a true pastured egg knows they're profoundly different, as does anyone who's tasted one. This has been vigorously denied by the American Egg Board and the Egg Nutrition Council, which primarily represent conventional egg farmers and assert that eggs from giant smelly barns are nutritionally equal to their pastured counterparts.

In 2007, the magazine Mother Earth News decided to test that claim. They sent for pastured eggs from 14 farms around the U.S., tested them for a number of nutrients, and compared them to the figures listed in the USDA Nutrient Database for conventional eggs. Here are the results per 100 grams for conventional eggs, the average of all the pastured eggs, and eggs from Skagit River Ranch, which sells at my farmer's market:

Vitamin A:
  • Conventional: 487 IU
  • Pastured avg: 792 IU
  • Skagit Ranch: 1013 IU
Vitamin D:
  • Conventional: 34 IU
  • Pastured avg: 136-204 IU
  • Skagit Ranch: not determined
Vitamin E:
  • Conventional: 0.97 mg
  • Pastured avg: 3.73 mg
  • Skagit Ranch: 4.02 mg
Beta-carotene:
  • Conventional: 10 mcg
  • Pastured avg: 79 mcg
  • Skagit Ranch: 100 mcg
Omega-3 fatty acids:
  • Conventional: 0.22 g
  • Pastured avg: 0.66 g
  • Skagit Ranch: 0.74 g

Looks like the American Egg Board and the Egg Nutrition Council have some egg on their faces...
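
To put those numbers in perspective, here's a quick sketch computing the fold difference of the pastured average over the conventional value for each nutrient (vitamin D is left out because only a range was reported):

```python
# Fold differences, pastured average vs. conventional, from the figures listed above.
nutrients = {
    "Vitamin A (IU)":      (487, 792),
    "Vitamin E (mg)":      (0.97, 3.73),
    "Beta-carotene (mcg)": (10, 79),
    "Omega-3 (g)":         (0.22, 0.66),
}
for name, (conventional, pastured) in nutrients.items():
    print(f"{name}: {pastured / conventional:.1f}x")
# Roughly 1.6x the vitamin A, 3.8x the vitamin E, 7.9x the beta-carotene,
# and 3x the omega-3 of a conventional egg.
```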

Eggs also contain vitamin K2, with the amount varying substantially according to the hen's diet. Guess where the A, D, K2, beta-carotene and omega-3 fatty acids are? In the yolk of course. Throwing the yolk away turns this powerhouse into a bland, nutritionally unimpressive food.

It's important to note that "free range" supermarket eggs are nutritionally similar to conventional eggs. The reason pastured eggs are so nutritious is that the chickens get to supplement their diets with abundant fresh plants and insects. Having little doors on the side of a giant smelly barn just doesn't replicate that.

Saturday, May 2, 2009

Iodine

Iodine is an essential trace mineral. It's required for the formation of the thyroid hormones T3 and T4. The amount of thyroid hormones in circulation, and the body's sensitivity to them, strongly influences metabolic rate. Iodine deficiency can lead to weight gain and low energy. In more severe cases, it can produce goiter, an enlargement of the thyroid gland.

Iodine deficiency is also the most common cause of preventable mental retardation worldwide. Iodine is required for the development of the nervous system, and also concentrates in a number of other tissues including the eyes, the salivary glands and the mammary glands.

There's a trend in the alternative health community to use unrefined sea salt rather than refined iodized salt. Personally, I use unrefined sea salt on principle, although I'm not convinced refined iodized salt is a problem. But the switch removes the main source of iodine in most people's diets, creating the potential for deficiency in some areas. Most notably, the soil in the midwestern United States is poor in iodine, and deficiency was common there before the introduction of iodized salt.

The natural solution? Sea vegetables. They're rich in iodine, other trace minerals, and flavor. I like to add a 2-inch strip of kombu to my beans. Kombu is a type of kelp. It adds minerals, and is commonly thought to speed the cooking and improve the digestion of beans and grains.

Dulse is a type of sea vegetable that's traditionally North American. It has a salty, savory flavor and a delicate texture. It's great in soups or by itself as a snack.

And then there's wakame, which is delicious in miso soup. Iodine is volatile, so freshness matters; store sea vegetables in a sealed container. It may be possible to overdo iodine, so it's best to eat sea vegetables regularly but in moderation, like the Japanese.

Seafood, including fish and shellfish, is rich in iodine, especially if fish heads are used to make soup stock. Dairy is a decent source in areas where the soil contains sufficient iodine.

Cod liver oil is another good source of iodine, or at least it was before the advent of modern refining techniques. I don't know if refined cod liver oil contains iodine. I suspect that fermented cod liver oil is still a good source of iodine because it isn't refined.


Friday, April 24, 2009

Nutrition and Infectious Disease

Dr. Edward Mellanby's book Nutrition and Disease contains a chapter titled "Nutrition and Infection". It begins:
There is general agreement among medical men that the susceptibility of mankind to many types of infection is closely related to the state of nutrition. The difficulty arises, when closer examination is given to this general proposition, as to what constitutes good and bad nutrition, and the problem is not rendered easier by recent advances in nutritional science.
Dr. Mellanby was primarily concerned with the effect of fat-soluble vitamins on infectious disease, particularly vitamins A and D. One of his earliest observations was that butter protected against pneumonia in his laboratory dogs. He eventually identified vitamin A as the primary protective factor. He found that rats placed on a diet deficient in vitamin A developed numerous infectious lesions, most often in the urogenital tract, the eyes, the intestine, the middle ear and the lungs. This was prevented by adding vitamin A or cabbage (a source of beta-carotene, which the rats converted to vitamin A) to the diet. Mellanby and his colleagues subsequently dubbed vitamin A the "anti-infective vitamin".

Dr. Mellanby was unsure whether the animal results would apply to humans, due to "the difficulty in believing that diets even of poor people were as deficient in vitamin A and carotene as the experimental diets." However, his colleagues had previously noted marked differences in the infection rate of largely vegetarian African tribes versus their carnivorous counterparts. The following quote from Nutrition and Disease refers to two tribes which, by coincidence, Dr. Weston Price also described in Nutrition and Physical Degeneration:
The high incidence of bronchitis, pneumonia, tropical ulcers and phthisis among the Kikuyu tribe who live on a diet mainly of cereals as compared with the low incidence of these diseases among their neighbours the Masai who live on meat, milk and raw blood (Orr and Gilks), probably has a similar or related nutritional explanation. The differences in distribution of infective disease found by these workers in the two tribes are most impressive. Thus in the cereal-eating tribe, bronchitis and pneumonia accounted for 31 per cent of all cases of sickness, tropical ulcers for 33 per cent, and phthisis for 6 per cent. The corresponding figures for the meat, milk and raw blood tribe were 4 per cent, 3 per cent and 1 per cent.
So Mellanby and his colleagues set out to test the theory under controlled conditions. Their first target: puerperal sepsis, an infection of the uterus that occurs after childbirth. They divided 550 women into two groups: one received vitamins A and D during the last month of pregnancy, and the other received nothing. Neither group was given instructions to change its diet, and neither group was given vitamins during the hospital stay. The result, quoted from Nutrition and Disease:
The morbidity rate in the puerperium using the [British Medical Association] standard was 1.1 per cent in the vitamin group and 4.7 in the control group, a difference of 3.6 per cent which is twice the standard error (1.4), and therefore statistically significant.
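
As a sanity check on those figures, here's a minimal sketch of the standard-error calculation for a difference between two proportions; the roughly equal group sizes (~275 women each) are my assumption, since only the 550-woman total is given above.

```python
from math import sqrt

# Morbidity rates quoted above; group sizes assumed roughly equal (550 women total).
p_vitamin, p_control = 0.011, 0.047
n_vitamin = n_control = 275

se = sqrt(p_vitamin * (1 - p_vitamin) / n_vitamin
          + p_control * (1 - p_control) / n_control)
diff = p_control - p_vitamin
print(f"difference = {diff*100:.1f}%, standard error = {se*100:.1f}%")
# difference = 3.6%, standard error = 1.4% -- consistent with the quoted figures
```
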
This experiment didn't differentiate between the effects of vitamin A and D, but it did establish that fat-soluble vitamins are important for resistance to bacterial infection. The next experiment Dr. Mellanby undertook was a more difficult one. This time, he targeted puerperal septicemia. This is a more advanced stage of puerperal sepsis, in which the infection spreads into the bloodstream. In this experiment, he treated women who had already contracted the infection. This trial was not as tightly controlled as the previous one. Here's a description of the intervention, from Nutrition and Disease:
...all patients received when possible a diet rich not only in vitamin A but also of high biological quality. This diet included much milk, eggs, green vegetables, etc., as well as the vitamin A supplement. For controls we had to use the cases treated in previous years by the same obstetricians and gynecologists as the test cases.
In the two years prior to this investigation, the mortality rate for puerperal septicemia in 18 patients was 92%. In 1929, Dr. Mellanby fed 18 patients in the same hospital his special diet, and the mortality rate was 22%. This is a remarkable treatment for an infection that was almost invariably fatal at the time.

Dr. Mellanby was a man with a lot of perspective. He was not a reductionist; he knew that a good diet is more than the sum of its parts. Here's another quote from Nutrition and Disease:
It is probable that, as in the case of vitamin D and rickets, the question is not simple and that it will ultimately be found that vitamin A works in harmony with some dietetic factors, such as milk proteins and other proteins of high biological value, to promote resistance of mucous membranes and epithelial cells to invasion by micro-organisms, while other factors such as cereals, antagonise its influence. The effect of increasing the green vegetable and reducing the cereal intake on the resistance of herbivorous animals to infection is undoubted (Glenny and Allen, Boock and Trevan) and may well indicate a reaction in which the increased carotene of the vegetable plays only a part, but an important part.

P.S. I have to apologize: I forgot to copy down the primary literature references for this post before returning the book to the library. So for the skeptics out there, you'll either have to take my word for it, or find a copy of the book yourself.