The new USDA/HHS Guidelines are not entirely bad — for example, they recommend limiting added sugars, tossing the hydrogenated oils, and even limiting fruit juice, and they advocate sidewalks, parks, and safe neighborhoods as ways to provide people with opportunities to increase physical activity — but they provide an awfully strange definition of the phrase “nutrient dense” that leads them to advocate a diet that is anything but.

The term “nutrient dense” indicates that the nutrients and other beneficial substances in a food have not been “diluted” by the addition of calories from added solid fats, added sugars, or added refined starches, or by the solid fats naturally present in the food.

So what, then, are the “nutrient-dense” foods?

All vegetables, fruits, whole grains, fat-free or low-fat milk and milk products, seafood, lean meats and poultry, eggs, beans and peas (legumes), and nuts and seeds that are prepared without added solid fats, sugars, starches, and sodium are nutrient-dense.

Somehow “the solid fats naturally present in the food” reduce the food’s nutrient density by this definition, so suddenly meats are only nutrient-dense if they are lean and milk products are only nutrient-dense if they are fat-free or low-fat.

It would seem that the term “nutrient-dense” should refer to the density of nutrients in a food, adjusted for bioavailability.  This could be measured per gram, per calorie, or per unit volume, depending on a person’s particular needs.  Since nutrients are essentially worthless if they aren’t absorbed and utilized, the term should incorporate bioavailability. 
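If we wanted to operationalize that definition, it might look something like the sketch below. Every number in it is a made-up placeholder rather than real food data; the point is only the structure: nutrients, discounted by how well they are absorbed, divided by calories or grams.

```python
# Sketch of a bioavailability-adjusted nutrient-density score. Nutrient
# amounts (as fractions of a daily requirement) and absorption factors
# are illustrative placeholders, not real food data.

def nutrient_density(nutrients, absorption, calories, grams, per="calorie"):
    # Sum each nutrient, discounted by how well it is actually absorbed.
    score = sum(amount * absorption.get(name, 1.0)
                for name, amount in nutrients.items())
    return score / (calories if per == "calorie" else grams)

whole_milk = {"vitamin_A": 0.08, "vitamin_K2": 0.05, "calcium": 0.30}
skim_milk  = {"vitamin_A": 0.01, "vitamin_K2": 0.00, "calcium": 0.30}
absorption = {"vitamin_A": 0.9, "vitamin_K2": 0.9, "calcium": 0.3}

print(f"whole milk: {nutrient_density(whole_milk, absorption, 149, 244):.5f}")
print(f"skim milk:  {nutrient_density(skim_milk, absorption, 83, 245):.5f}")
```

With these particular placeholders, whole milk outscores skim per calorie despite carrying more calories, because the fat brings both fat-soluble nutrients and better absorption of them. Plug in different assumptions and the ranking can flip, which is exactly why the definition matters.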

The USDA considers other factors that have nothing to do with nutrient content or bioavailability to trump these issues.

A strong body of evidence indicates that higher intake of most dietary saturated fatty acids is associated with higher levels of blood total cholesterol and low-density lipoprotein (LDL) cholesterol. Higher total and LDL cholesterol levels are risk factors for cardiovascular disease.

Notice the careful wording.  Associated with.  Risk factors.

The use of this careful wording is somewhat ironic, because the document provides a clear warning against confusing association with causation.

When considering the evidence that supports a recommendation, it is important to recognize the difference between association and causation. Two factors may be associated; however, this association does not mean that one factor necessarily causes the other. Often, several different factors may contribute to an outcome. In some cases, scientific conclusions are based on relationships or associations because studies examining cause and effect are not available. When developing education materials, the relationship of associated factors should be carefully worded so that causation is not suggested.

So it is important to use careful wording so as not to confuse association and causation, but not so important to recommend dietary changes that are based on solid evidence of causation?  But I digress.

There are, of course, nutrients within the water-soluble fraction of milk and within lean meat that would be “diluted” by the inclusion of their natural fats, but there are also unique nutrients that are found in these fats, and the fats help increase the bioavailability of fat-soluble nutrients.

Although there is no one source from which Americans get most of their saturated fat, the leading source according to this document is cheese.

The leading source of saturated fat is cheese, also the main source of vitamin K2.

As discussed in the Rotterdam Study, the main source of vitamin K2 in modern diets is also cheese.  In my 2007 article, “On the Trail of the Elusive X Factor,” I reviewed a number of lines of evidence suggesting that vitamin K2 is a potent weapon against heart disease.

The clear association between vitamin K2 and “the solid fats naturally present in the food” can be seen by looking at the vitamin K2 content in various types of dairy products.

Vitamin K2 is found in fat.

Similar results would be found for other fat-soluble nutrients.

Fat also increases the absorption of fat-soluble nutrients.  For example, a 2006 study found that a bagel with low-fat cream cheese containing 2.4 grams of fat (6% of calories) increased the absorption of vitamin E from fortified apples, while a bagel with regular-fat cream cheese containing 11 grams of fat (21% of calories) increased it even more.

A 2004 study found similar results for carotene absorption from a salad using fat-free, low-fat, or regular-fat salad dressing.  The regular-fat dressing added 28 grams of fat to the salad, so the calories in the meal must have been mostly from fat.

Black triangles represent the fat-free salad, while white circles represent the low-fat salad and black circles represent the regular-fat salad.
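The claim that this meal was mostly fat is easy to check with back-of-envelope arithmetic. The 9 kcal per gram of fat is standard; the calorie figure for the vegetables themselves is my own ballpark guess.

```python
# Rough check: how much of the regular-fat salad meal was fat?
# 28 g of dressing fat comes from the study quoted above; the ~100 kcal
# for the salad vegetables themselves is my own ballpark assumption.
fat_kcal = 28 * 9                     # fat supplies about 9 kcal per gram
vegetable_kcal = 100                  # assumed
total_kcal = fat_kcal + vegetable_kcal
print(f"fat share of calories: {fat_kcal / total_kcal:.0%}")  # ~72%
```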

These studies provide no evidence of a “ceiling” to the fat effect.  The highest fat meal in both studies provided the best absorption of fat-soluble nutrients.

A 2000 animal study found that carotene absorption was nearly two times higher with olive oil than with corn oil, suggesting perhaps that the polyunsaturated fatty acids in corn oil promote oxidative stress in the intestine and thereby decrease carotene absorption.  So much for the case against “solid fats.”

To their credit, the USDA and HHS do list eggs among their nutrient-dense foods, and eggs are one of the best sources of choline.

But how much credit do these institutions deserve? Not much.

Although eggs are considered nutrient-dense, that doesn’t mean, they say, that we should eat many of them.  This document recommends we eat 0.4 ounces of eggs per day, which allows us about one egg every five days.
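For what it’s worth, the arithmetic roughly checks out, assuming a large egg weighs about 50 grams without the shell:

```python
# How often does 0.4 oz of egg per day allow a whole egg?
# Assumes a large egg weighs about 50 g (~1.75 oz) without the shell.
GRAMS_PER_OUNCE = 28.35
egg_oz = 50 / GRAMS_PER_OUNCE        # about 1.76 oz per large egg
days_per_egg = egg_oz / 0.4
print(f"one egg every {days_per_egg:.1f} days")  # about 4.4
```

That lands at roughly one egg every four to five days.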

And why?

 

Dietary cholesterol has been shown to raise blood LDL cholesterol levels in some individuals. However, this effect is reduced when saturated fatty acid intake is low, and the potential negative effects of dietary cholesterol are relatively small compared to those of saturated and trans fatty acids. Moderate evidence shows a relationship between higher intake of cholesterol and higher risk of cardiovascular disease. Independent of other dietary factors, evidence suggests that one egg (i.e., egg yolk) per day does not result in increased blood cholesterol levels, nor does it increase the risk of cardiovascular disease in healthy people. Consuming less than 300 mg per day of cholesterol can help maintain normal blood cholesterol levels. Consuming less than 200 mg per day can further help individuals at high risk of cardiovascular disease.

 

Actually, consuming three to four eggs per day has little or no effect on blood cholesterol levels in about 70 percent of people.  In nearly 30 percent of people, total cholesterol increases, but HDL-cholesterol increases as much as LDL-cholesterol, and LDL particle size improves.  A 2006 review of randomized controlled trials feeding eggs suggested that three to four eggs per day made blood lipid profiles less “atherogenic” in about 99 percent of healthy adults (ethical considerations, of course, would demand you only feed this “extreme” number of eggs to healthy people).

An egg every five days?  So much for the choline.

It would seem that a diet lacking in organ meats, egg yolks, and saturated fats would be missing some key nutrients, and could hardly be considered “nutrient-dense” unless one simply ignored the whole class of fat-soluble nutrients.

But hey, if we’re going to confuse association with causation and redefine nutrient density to incorporate factors that have nothing to do with nutrient density, we might as well start ignoring nutrients too. 

But if we’re going to do that, why reinvent the wheel?  It would be more efficient to just use the system for ranking nutrient density that Dr. Joel Fuhrman already developed.


The USDA’s 2010 Dietary Guidelines came out early this year, and the soy industry is thrilled that “soy made the cut.”   

Soy products are cited twice in the executive summary of the report with the recommendation that all Americans increase their intake of soy products and fortified soy beverages.  In the body of the report itself, soy milk appears right up there with low-fat and no-fat milks as good for us and to be drunk two or three times daily, while processed soy products are touted as worthy meat equivalents.  Vegetable oils — a code for soy oil in most cases — are recommended to “replace solid fats wherever possible.”   This triple threat to public health can only be the work of the USDA in conjunction with the soy industry and other manufacturers of processed, packaged and junk foods.

 

Vegans too ought to be happy.  There’s still dread animal flesh and “white blood” in the picture, but the USDA has kowtowed to vegan mythology, buying into the belief that vegan diets, if carefully planned, can be healthful.   USDA even gives vegans their very own appendix, with specific dietary recommendations, including “fortified foods for some nutrients,” especially calcium and B12.  What might those fortified foods be?   Soy milk, energy bars, fake steaks, burgers and other processed, packaged foods tricked out as health foods.

 

Overall, there’s something for everyone who eats packaged, processed and fast foods, even chocoholics.  The USDA actually considers fat-free chocolate milk to be a “nutrient dense food,”  their phrase, not mine, and I am sorry to say I am not making any of this up.  

 

So what might adopting soy milk, fake meats and vegetable oils mean to the health of the American public?  Let’s look here at two of the USDA’s choices: fortified soy beverages and soy proteins.   For information about the inadvisability of vegetable oils, read “The Skinny on Fats,” “The Oiling of America” and other articles on this website.

 

SOY BEVERAGE

Soy beverage — popularly known as soy milk — is a lactose-free dairy substitute that marketers would have us believe has been drunk by healthy Asians since time immemorial.  In fact, the earliest historical reference is from 1866, and the Chinese did not traditionally value soy milk until vegetarian Seventh Day Adventist missionaries from America popularized it starting in the 1920s.

 

The soy milks sold in supermarkets and health food stores and recommended by the USDA are not exactly traditional soy products.   In the good old days,  soy milk-making  began with a long  soak. The softened beans were then ground on a stone grinder, using massive amounts of water. The mush then went into a cloth bag, was placed under a heavy rock, and pressed and squeezed until most of the liquid ran out. The soy paste was then boiled in fresh water. Large amounts of filthy scum that rose to the surface were carefully removed.

 

The modern method is faster, cheaper — and retains the scum. It speeds up the presoaking phase with the use of an alkaline solution, skips the squeezing and skimming steps, uses common fluoridated and chlorinated tap water, and cooks the soy paste in a pressure cooker. The speed comes at a cost: the high pH of the soaking solution followed by pressure cooking destroys key nutrients, including vitamins and the sulfur-containing amino acids, and leaves toxic residues.

 

Taste, not nutrition, is what most concerns the soy industry, and the USDA as well if it plans to get Americans of all ages to swig two to three cups daily.   The taste problem is the enzyme lipoxygenase, which oxidizes the polyunsaturated fatty acids in soy, causing the “beaniness” and rancidity.  The industry’s attempted solutions have included high heat, pressure cooking and replacement of the traditional presoaking with a fast blanch in an alkaline solution of sodium bicarbonate (baking soda). Major manufacturers have even “offed” the off flavors using a deodorizing process similar to that used in oil refining, which involves passing cooked soy milk through a vacuum pan at extremely high temperatures.

 

To cover up any “beaniness” that remains, processors trot out sweeteners and flavorings.  Almost all commercially sold soy milks contain barley malt, brown rice syrup, raw cane crystals or some other form of sugar.  The higher the sugar, the higher the acceptability among consumers.    Accordingly, most 8-ounce glasses of soy milk contain anywhere from four to sixteen grams of sugar (slightly less than 1 teaspoon to slightly more than 1 tablespoon).   Flavors such as “plain” or “original” are almost always sweetened, although perceived by many consumers as unsweetened.   Perhaps the USDA folks who came up with the guidelines thought so as well.  Otherwise its recommendation of soy milk would not jibe with its recommendation for consumers to cut back on sugar.
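Those volume equivalents follow from the standard weight of granulated sugar, about 4.2 grams per teaspoon and three teaspoons per tablespoon; a quick check:

```python
# Convert the sugar range in an 8-ounce glass of soy milk to volume,
# assuming about 4.2 g of granulated sugar per teaspoon.
GRAMS_PER_TSP = 4.2
for grams in (4, 16):
    tsp = grams / GRAMS_PER_TSP
    print(f"{grams} g = {tsp:.1f} tsp = {tsp / 3:.2f} tbsp")
# 4 g  -> about 0.95 tsp (slightly less than a teaspoon)
# 16 g -> about 1.27 tbsp (slightly more than a tablespoon)
```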

 

Eliminating the aftertaste in soy milk poses yet another challenge for food manufacturers.  The undesirable sour, bitter and astringent characteristics come from oxidized phospholipids (rancid lecithin), oxidized fatty acids (rancid soy oil), the antinutrients called saponins and the plant estrogens known as isoflavones. The last are so bitter and astringent that they produce dry mouth.  This has put the soy industry into a bit of a quandary. The only way it can make its soy milk please consumers is to remove some of the very toxins that it has assiduously promoted as cancer preventing and cholesterol lowering.   

 

Note the USDA caveat that the soy milk be “fortified soy milk.”   The reason is that soy milk made from just soybeans and water has such a poor nutritional profile that it must be fortified with calcium, vitamin D and other vitamins and minerals to compete with cow’s milk.  Even in health food store brands, these added supplements are cheap, mass-produced products. The soy milk industry puts vegetarian vitamin D2 in soy milk, even though the dairy industry quietly stopped adding this form of the vitamin years ago. Although any form of vitamin D helps people meet their RDAs (Recommended Dietary Allowances), D2 has been linked to hyperactivity, coronary heart disease and allergic reactions.  The USDA has singled out vitamin D in these dietary guidelines as a special nutrient to keep in mind.  Too bad it’s not specific enough about type.

  

In keeping with USDA-approved low-fat diets, consumers may opt for the low-fat — or “lite” — soy milks made with soy protein isolate (SPI), not the full-fat soybean. To improve both color and texture of these “healthier” soy milks, manufacturers work with a whole palette of additives, including colorants, flavorizers and texturizers.

 

Soy-milk derived products such as soy puddings, ice creams, yogurts, cottage cheese, whipped “creams” and cheese substitutes also meet USDA guidelines, but are even poorer choices, given ingredients such as carrageenan, corn oil, high fructose corn syrup, partially hydrogenated fats and soy protein hydrolysates.

 

Should we really be eating and drinking processed foods with ingredient lists like this?   Soy milk has a reputation for being a simple, old fashioned food.  It is not.   Even Peter Golbitz of Soyatech has admitted this.  “Soymilk is one of those unique food products that doesn’t exist naturally in nature, such as a fruit, vegetable or cow’s milk — it is, and always has been, a processed food. Since there are many options available to processors today in regards to process type, variety of soybean, type of sugar and an array of flavoring and masking additives, product formulators need real guidelines to follow to create winning products.”   Too bad that the USDA is more interested in pushing “product formulations” than Mother Nature’s real foods.

 

 

MEAT ANALOGUES AND OTHER SOY PROTEIN PRODUCTS

The USDA supports all-American ingenuity.  That’s the only positive reason I can think of for its recommendation of  the ersatz meat products known in the food industry as “analogues.”    Soy analogue products marketed over the years have had colorful names such as Soysage, Not Dogs, Fakin’ Bakin, Sham Ham, Soyloin, Veat, Wham, Tuno, Bolono and  Foney Baloney.   Although named after — and often made to look like — the familiar meat products they are meant to replace, taste testers tend to evaluate them as poor imitations at best.  But thanks to food technology specialists and their lavish use of sugar and other sweeteners, salt, artificial flavorings, colorings, preservatives and MSG,  more and more consumers are willing to tolerate these products, some solely because of their belief in alleged health benefits.   

 

Manufactured using high heat and pressure, chemical solvents, acids and alkalis, extruders and other harsh tools, these USDA-approved meat substitutes are very likely to contain toxic or carcinogenic residues.   This is also true of highly processed products using fractions of milk, eggs, meat, grains, oils or vegetables.  The difference is that processed soy foods are billed as “health foods” whereas other processed foods are widely acknowledged to be what they are — junk foods that do not support health.   The soy industry typically puts a positive spin on its products by claiming all the health benefits found in soy while insisting that levels of toxins are too low to pose any hazard to the consumer.

 

But risk is always a product of dose and duration of exposure.   Vegans who favor soy protein, wheat gluten and other heavily processed plant protein products as their primary sources of protein are regularly exposed to relatively high levels of toxins.    The usual suspects are nitrosamines, lysinoalanines, heterocyclic amines, excitotoxins, chlorpropanols, furanones, hexane and other solvents.    

 

Let’s look now at how soy protein isolate and textured soy protein — two of the most common ingredients found in soy meat analogues –  are manufactured. 

 

SOY PROTEIN ISOLATE (SPI) is mixed into nearly every food product sold in today’s stores — energy bars, bodybuilder powders, breakfast shakes, burgers and hot dogs.   SPI is a highly refined product, heavily processed to remove “off flavors,” “beany” tastes, and flatulence producers and to improve digestibility.  Vitamin, mineral and protein quality, however, are sacrificed.  Indeed soy isolates increase the requirements for vitamins E, K, D and B12.  Among the minerals, phosphorus is poorly utilized, and calcium, magnesium, manganese, molybdenum, copper, iron and especially zinc deficiencies appear routinely in animals — including human animals — fed SPI as the primary source of protein in their diets.  Soy protein isolates are also more deficient in sulfur-containing amino acids than other soy protein products.  What increases during the production of SPI are the levels of toxins and carcinogens such as nitrosamines and lysinoalanines.

 

The manufacture of SPI has always been a complicated, high-tech procedure.  There’s nothing natural about it.  It takes place in chemical factories, not kitchens.  Although the manufacturing process varies, and some companies hold patents on key elements of the process, the basic procedure begins with defatted soybean meal, which is mixed with a caustic alkaline solution to remove the fiber, then washed in an acid solution to precipitate out the protein.  The protein curds are then dipped into yet another alkaline solution and spray dried at extremely high temperatures.     

 

SPI is often spun into protein fibers using technology borrowed from the textile industry.  The only difference is that taste-enhancing and fiber-binding elements are incorporated into the fibers during processing.    The process involves preparing a protein solution with a soy protein content of 10 to 50 percent at a very alkaline pH above 10.  The solution is aged at about 121 degrees F until it becomes as viscous as honey, at which point it is called “spinning dope.”  The dope is next forced through the holes of an extrusion device, coagulated with an acid bath, stretched long and thin, bound with edible binders such as starch, dextrins, gums, albumen and cellulose, and coated with fat, flavor, color and other substances.  The idea is to attain the fibrous “bite” of animal muscle meats.

 

For chunkier, less well-defined fibers, processors tend to prefer the Textured Soy Protein (TSP) process. Textured Soy Protein or Textured Vegetable Protein is sold as granules, particles and chunks and used by fast food companies and food processors as a meat substitute or extender for chili, spaghetti sauce, tacos, sloppy joes and other strongly spiced recipes.  It’s been big in the USDA school lunch programs since 1971.

 

Here’s how it’s made:  First, force defatted soy flour through a machine with a spiral tapered screw, called an extruder, under conditions of such extreme heat and pressure that the very structure of the soy protein is changed.   What comes out is a dried-out, fibrous, textured alien protein product that can survive just about anything a food processor might later do to it.   Then add red or brown colors and flavorings before texturization, drying and packaging.

 

Soy protein extrusion differs little from the extrusion technology used to produce starch-based packing materials, fiber-based industrial products or plastic toy parts, bowls and plates.   The difference is that extruded foods such as TSP are designed to be reconstituted with water, at which point they resemble ground beef or stew meat.   Processing always leaves toxic residues, and TSP furthermore requires natural and artificial flavors and MSG if it’s going to taste anything like ham, chicken or beef.

 

In conclusion, the USDA sure has an interesting idea of what constitutes healthy proteins.    Bringing soy front and center in the new food guidelines will feed the profits required by Big Food.   Big Pharma is surely happy as well, as this latest USDA food fix isn’t going to solve any of our great American health crises anytime soon.


* * * * *


Complete references for the information on soy products contained in this blog can be found in my book The Whole Soy Story: The Dark Side of America’s Favorite Health Food (New Trends, 2005), particularly chapters 6-9, 11 and 14.

 



Virtually everything we know about vitamin D and latitude might be wrong.

When I wrote “Seafood to Sunshine: A New Understanding of Vitamin D Safety” in 2006, I took it for granted that the conventional beliefs about the effect of latitude on vitamin D synthesis were true.  Here is what I wrote:

The amount of UVB radiation available depends on the angle at which the sun’s rays strike the earth, the presence of clouds and buildings, ozone and aerosol pollution, altitude and reflective surfaces such as snow (18). Because of the effect of the sun’s angle, Webb and colleagues showed in 1988 that, even in completely clear skies, synthesis of vitamin D in the skin is impossible for four months of the year in Boston, Massachusetts and six months of the year in Edmonton, the capital of Alberta, Canada. The Webb team found that such a “vitamin D winter” occurred during at least part of the year at any latitude greater than 34 degrees (19).  More recently, one group of researchers used a computer model to suggest that in the nearly unattainable condition of truly clear skies, the vitamin D winters are shorter than Webb’s team suggested, but that under some environmental conditions, vitamin D winters can occur even at the equator (18).

The 1988 data, to which Michael Holick contributed, has been the most important data set used for understanding the vitamin D winter.  The researchers measured vitamin D production in a handful of cities using isolated pieces of skin or 7-dehydrocholesterol mixed into a test tube concoction.

According to models developed from that evidence, no vitamin D synthesis whatsoever occurs outside of the summer at far latitudes such as 90 degrees (the north and south poles), and none whatsoever occurs during the winter at latitudes beyond 50 degrees (Antarctica, most of Greenland and Alaska, and the northern parts of Canada, Russia and Europe).  The models suggested that very little vitamin D production occurs outside of the summer in all of these northerly places and that even small migrations from equatorial regions cause huge decreases in vitamin D synthesis during all of the non-summer months.
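To see where numbers like these come from, here is a rough geometric sketch of the “vitamin D winter” calculation.  This is not the Webb model itself: the declination formula is a textbook approximation, and the 35-degree noon-sun cutoff is my own assumption, tuned so that the geometry roughly reproduces Webb’s figures.  Real models integrate the UVB spectrum rather than using a single cutoff angle.

```python
import math

def solar_declination(day_of_year):
    # Approximate solar declination in degrees (Cooper's formula).
    return 23.44 * math.sin(math.radians(360.0 / 365.0 * (day_of_year - 81)))

def noon_solar_elevation(latitude_deg, day_of_year):
    # Elevation of the sun at local solar noon: 90 - |latitude - declination|.
    return 90.0 - abs(latitude_deg - solar_declination(day_of_year))

# Assumed cutoff: the noon sun angle below which skin synthesis is taken
# to be negligible. 35 degrees is a guess chosen to match Webb's figures.
THRESHOLD_DEG = 35.0

for city, lat in [("Boston", 42.4), ("Edmonton", 53.5)]:
    days_below = sum(noon_solar_elevation(lat, d) < THRESHOLD_DEG
                     for d in range(1, 366))
    print(f"{city}: about {days_below / 30.4:.1f} months below the cutoff")
```

Run as written, the sketch gives about four months below the cutoff for Boston and six for Edmonton.  The geometry is not in dispute; what the newer studies described below call into question is where that empirical cutoff really sits.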

These assumptions have fueled two important hypotheses: first, that our emergence from Africa has necessitated the evolution of whiter skin in order to make it easier to obtain vitamin D and that those of us who wear clothes in these regions are vulnerable to massive deficiency; and second, that the reduced risk of many diseases that occurs as we approach the equator is a result of improved vitamin D status.  

Throughout my 2006 article, I accepted these hypotheses as likely to be true.  This had little impact on the content relating to the interactions between vitamins A, D, and K, which is the most important part of the article, but it had a major impact on my suggestions of what the ideal dose and blood level of vitamin D was likely to be.

Since I wrote that article, the vitamin D movement has grown much stronger and made bolder and bolder claims that have penetrated much deeper into mainstream consciousness, but the evidence for the need for these high levels has remained at the hypothesis stage.  I have thus grown more conservative, in part from studying the philosophy of science and statistics, and in part because the tables have now turned, with vitamin D hitting the mainstream.  I now wonder if the lion unleashed may need to be tamed.

I recently pointed out in my post “Are Some People Pushing Their Vitamin D Too High?” that there is very little scientific evidence that we need 25(OH)D levels higher than 30-35 ng/mL (75-88 nmol/L).  Even the evidence for 30-35 ng/mL is primarily observational, meaning that we have very strong reasons for promoting the hypothesis, but no solid confirmation. 
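For readers toggling between the two unit systems, 25(OH)D in ng/mL converts to nmol/L by multiplying by about 2.496, the factor implied by the molecule’s molar mass of roughly 400.6 g/mol:

```python
# Convert 25(OH)D between ng/mL and nmol/L. The factor comes from the
# molar mass of 25(OH)D, roughly 400.6 g/mol: 1 ng/mL ~ 2.496 nmol/L.
def ng_to_nmol(ng_per_ml):
    return ng_per_ml * 2.496

for level in (30, 35):
    print(f"{level} ng/mL = about {ng_to_nmol(level):.0f} nmol/L")  # 75, 87
```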

Research that has emerged since 2006 has threatened to turn the “latitude hypothesis” of vitamin D on its head.

In a 2007 paper, “Location and Vitamin D synthesis: Is the hypothesis validated by geophysical data?” (1), an Australian group of researchers created an index of ultraviolet (UV) radiation in the vitamin D range and analyzed how much vitamin D could be produced in seven locations across the United States using UV measurements collected by the US EPA.  They came to the “startling” conclusion that latitude was only related to vitamin D production during the coldest four months of the year.

They then developed a computer model that suggested vitamin D could be effectively synthesized at tropical rates across the entire globe for three quarters of the year and that the ability to synthesize it dropped off gradually between 40 and 70 degrees latitude during the winter months, and only regions between 70 and 90 degrees latitude had a complete vitamin D winter.

Another 2007 study conducted in Andenes, Norway (2) provided limited evidence suggesting that even at this far north latitude of 68 degrees vitamin D production begins in late February.  The study was not anywhere near as rigorously controlled as Webb and Holick’s test tube study, but it was conducted in live human beings.

These studies shed some major light on the form of Eskimo hysteria known as pibloktoq.  In “The Pursuit of Happiness,” and in my 2008 Wise Traditions conference lecture, “The Fat-Soluble Vitamins and Mental Health,” I described how this form of hysteria, possibly resulting from severe calcium and vitamin D deficiency, developed in Inuit who lived in regions without year-round access to dried fish and fish bones during the late winter and early spring.

If vitamin D synthesis is limited to summer in this region as previously thought, why would pibloktoq only occur in the late winter and early spring?  And how would all of the animals obtain sufficient levels of vitamin D for themselves let alone to feed the humans that would eventually prey on them? 

The recent research suggesting vitamin D synthesis proceeds optimally for most of the year in this region helps explain this scenario.  It would also suggest that the Inuit must obtain vitamin D from food because the cold weather leads them to wear a great deal of clothing, and not because the UV-B light is usually unavailable.

Both of these studies directly contradict the predictions about the magnitude and geographical extent of the vitamin D winter developed from Webb and Holick’s earlier data.  Why did Webb and Holick find no vitamin D production in Boston for four months a year and none in Edmonton for six months of the year if in fact plenty of vitamin D can be produced even further north for most of the year?  Perhaps pollution, city buildings, and differences between test tube isolates and real humans made the critical difference.

A major analysis published in 2009 (3) pooled together the results of 394 studies examining vitamin D levels in over 30,000 people all across the globe in order to investigate the effect of latitude on vitamin D status.  The authors only included people who were native to the area in which they were living, and who were free-living.  They concluded that there was only an effect of latitude in Caucasians.  There was no effect of latitude in people with non-Caucasian ancestry.

The reason this deals such a major blow to the latitude hypothesis is that it is precisely people with white skin who dwell outside the equatorial regions who are supposed to be among the most vulnerable, but Caucasians actually had 45% higher levels of vitamin D than non-Caucasians! 

If, in fact, the “original” humans best adapted to their environments are those who never came “out of Africa,” we must wonder why they have, on average, lower vitamin D status than Caucasians living in more northerly regions.

Granted, none of this refutes the notion that people who move outside of the environments to which they are native might suffer from vitamin D deficiency as a result.

About 20 percent of the studies found average 25(OH)D levels above 30 ng/mL, but only 4 percent of subgroups within those studies had levels above 40 ng/mL.  The authors did not report how many had average levels above 50 ng/mL, but certainly it must have been negligible. 

The authors further pointed out in their discussion that the Inuit have genetic adaptations that increase their production of calcitriol, the active hormone form of vitamin D, and that Asian Indians have genetic adaptations that increase their detoxification of calcitriol (either that, or non-Asian Indians have developed adaptations that decrease detoxification). 

Thus, people seem well adapted to the conditions for vitamin D synthesis that exist in the regions they are from, but the evidence seems very scant that the “natural” levels of 25(OH)D are 40-60 ng/mL, and even if they are in some tropical regions, people outside those regions may not be genetically adapted to having vitamin D levels so high.  In other words, a white American or an Inuit might make much more calcitriol from a 25(OH)D level of 40 ng/mL than an Indian would make.

There are, of course, problems with pooling together almost 400 studies.  For example, the studies used different assays to measure vitamin D, they were conducted during different seasons, there were fewer studies from equatorial regions, and many of the studies were likely not to have been perfectly random samples of their populations.  But when the authors took into account the type of assay, the year of publication, or the season in which the measurements were taken, these factors had no effect.  And while the sampling may not have been perfect, such a massive collection of data should help smooth things out.

Nevertheless, despite the imperfections, this is a much more comprehensive analysis than looking at a handful of tropical lifeguards and assuming that we evolved as “naked apes” bathing in the sun all day long, dreaming of the millennium in which we’d be able to blow whistles and surf out to save our drowning brethren.

The reality is that even when our healthy ancestors were living with reduced clothing because of the heat in tropical regions, they may still have been seeking shade or even using primitive forms of sunscreen.

Consider, for example, what Weston Price reported (NAPD, p. 104) from his visits to the Pacific Islands:

While the missionaries have encouraged the people to adopt habits of modern civilization, in the isolated districts the tribes were not able to depart much from their native foods because of the infrequency of the call of the trader ship.  Effort had been made in almost all of the islands to induce the natives to cover their bodies, especially when in the sight of strangers.  In several islands regulatory measures had been adopted requiring the covering of the body.  This regulation had greatly reduced the primitive practice of coating the surface of the body with coconut oil, which had the effect of absorbing the ultra-violet rays thus preventing injury from the tropical sun.  This coating of oil enabled them to shed the rain which was frequently torrential though of short duration.  The irradiation of the coconut oil was considered by the natives to provide, in addition, an important source of nutrition.  Their newly acquired wet garments became a serious menace to the comfort and health of the wearers.

We know that the Pacific Islanders Price studied were healthy.  We don’t know too much of the health of the hypothetical naked ape-like creature from whom we evolved.  And unfortunately, we have no idea what the 25(OH)D levels of these healthy Pacific Islanders were.  We can, however, deduce from this that the assumption that our healthy ancestors did not actively reduce their exposure to UV light in certain ways is nonsense.

Price’s comment that they believed they derived nutritional benefit from the irradiation of the coconut oil is curious.  Several likely candidate hypotheses jump to mind:

  • Irradiation of the polyphenols creates some type of nutritional substance.
  • The absorbance spectrum of the polyphenols interferes with UV-mediated destruction of vitamin D but not its production, thereby increasing vitamin D status.
  • The polyphenols serve to curb excess production of vitamin D and thereby spare vitamins A and K.

Price does not elaborate on this, so it is possible that they simply perceived the prevention of sunburn as a “nutritional” effect or that the use of a sunscreen simply allowed them to spend more time in the sun.  However, his wording seems to suggest that the natives believed that, all things being equal, their state of nutrition increased when they applied the coconut oil to their skin during their ordinary exposure to sunlight.  Studying the effect of topical coconut oil on vitamin D production might provide us with some preliminary guess-work about the vitamin D status of these healthy natives.

In conclusion, evidence has accumulated over the last few years that strongly challenges the “latitude hypothesis” of vitamin D and that should cause us to question with great skepticism the idea that we evolved as naked apes in the summer sun with blood levels similar to tropical lifeguards and that anyone living outside the equatorial regions must supplement with vitamin D for most of the year in order to achieve such evolutionary concentrations.

These data in no way whatsoever show that levels above 30 ng/mL or even levels above 50 ng/mL are not superior to measly ol’ 30 ng/mL.  But they do show the necessity of quickly producing dose-finding, randomized, controlled trials with clinical endpoints to settle the vitamin D debate definitively.

And, in the meantime, to self-experiment and carefully monitor the results.  And to share these results with the paleo/traditional foods/vitamin D enthusiast communities, whether they are good results or bad results.  After all, we all want perfect health now and not in five to ten years when these trials are completed.

Stay tuned for my review of the IOM vitamin D report!


This holiday season Whole Foods Market is offering gift boxes and certificates brightly printed with the wish “Soy to the World.”

Whole Foods Market, of course, perceives soy foods and soy milk — particularly modern packaged and processed soy products — as a major profit center.   Soy also fits nicely within CEO John Mackey’s vegan agenda and his promotion of soy as the ticket to personal and planetary health.  Sadly, soy to the world will not bring joy to the world this holiday season or any other.   

The word “soy,” however, fits Whole Foods Market very well.  As discussed in my last blog, “Talking Tofurky,” “soy” is urban slang for something false, of poor value or just not what it seems.   That pretty much sums up a whole lot of the phoney baloney, pseudo-organic products Whole Foods sells.   Indeed a whole lot of what this chain preaches is out of integrity with what it practices. 


Greenwashing   

Heard of whitewashing?  The variant found at Whole Foods is known as “greenwashing.”  The chain puts green leaves on its logo, prominently displays environmentally correct “core values,” and pays lip service to sustainability, yet engages in numerous practices that are environmentally unfriendly.

Bagging It, for example. Whole Foods encourages us to bring our own bags to save the environment and gives bag credits to local charities.   Eco consumers feel good about this, but what about all those highly processed and overly packaged foods toted home in them?   Soy good to know that not one of those pricey crackers or cookies will crack or crumble.   As for those sturdy packages, they’ll survive for years in the landfills.   

 

Soy Local or Soy Loco

Whole Foods talks the good talk about supporting local farmers.  It’s one of its conspicuously displayed “core values.”   But walk down the aisles and most everything comes from somewhere else.    Where were all those little soybeans milked to produce soymilk?   Where did they catch those tofurkies?   Where did those fruits and vegetables grow?   California, Mexico, Chile, India?   Not soy often in our own backyard.

How do local farmers feel about Whole Foods Market?   Many mutter “soy loco”  (“I am crazy”) under their breath whenever they give in and sell to Whole Foods.   Farmers who expect a fair wage for their hard work rarely sell there given the chain’s aim to buy dirt cheap and sell sky high.  

 

Soy Green

More acres of the Rain Forest are destroyed for soybean crops than for beef cattle yet soy is touted as green for the environment.   Most of the Midwest has been destroyed by the monocropping of three vegan staples — corn, wheat and soy. 


Soy Generous  

“Soy to the World” means planeloads of soy products given to survivors of famines and natural disasters.   Seems benevolent, but there’s more to this than good PR.  Disaster relief builds global business by making the world’s people dependent upon imported soy and other industrially grown, processed and packaged products.  Such “charitable” practices undermine local farmers and cottage industries and wipe out indigenous crops.


Soy Egalitarian

Equal opportunity poor health.   Yuppie vegans at one end of the spectrum pay premium prices for health-destroying soy foods.   Poor people eat donated soy from relief packages.  The results for both are malnutrition, digestive distress, thyroid disorders, reproductive problems, ADD/ADHD, allergies, even heart disease and cancer.   Soy to the world.

Meanwhile, John Mackey, the CEO of Whole Foods, likes to be seen as just a regular Joe.  He earns only fourteen times the salary of his average “team member,” after all. While other corporate executives doubtless take home far bigger paychecks, Mackey’s “talking tofurky” here.  If he were an executive who “talks turkey,” he would admit to also earning millions in stock options.    He might also be sensitive to the fact that his store is widely mocked as “Whole Paycheck Market” because its extreme markups make it soy overpriced for the average consumer.

  

Soy Organic

Whole Foods sells only organic soybeans, right?   That’s what they say, but it took months — and an embarrassing exposé by the Cornucopia Institute — before just some of the Silk products made with commercial soybeans were removed from the shelves.   Similarly, Whole Foods has sold a whole lot of veggie burgers, energy bars and other “organic” products made with soy protein isolate and other ingredients processed using hexane solvents.  Cornucopia also exposed that, but you read it first in The Whole Soy Story.

Elsewhere in the store, pseudo-organic reigns.   Consider factory-farmed “organic” Horizon brand milk and butter.  As for produce, the artful displays conflate organic and commercial.    And if the internet postings of disgruntled Whole Foods “team members” can be trusted, much — if not all — of it is cleaned with non-organic cleaners.   Seems the organic cleaners come out when the inspectors come in.

Shoppers who aren’t careful may go home with commercial produce just like that found at the supermarket down the block, but at a substantially higher price.    Whole Foods Market carefully crafts the illusion that it sells organic, but far more of what it sells is “natural” — whatever that means — or even commercial.

 

Soyled Health Claims

Is soy the “miracle bean” that can cure everything from cancer to ingrown toenails? Whole Foods would certainly like us to think so.    Similarly, consumers who buy baked and deli goods at Whole Foods are almost always con-oiled, though canola is increasingly replaced by soy oil, which if anything is even worse.

Hemp, chocolate, agave anyone?   Health claims for any of these are very “soy” — i.e., not what they seem.  Agave, for instance, is high fructose corn syrup tricked out as a health food. Chocolate-covered soy nuts are surely the “tofurky” of snacks.   Most sanctimonious of all is Whole Foods’ promotion of vegan goods with a green smiley face and the words “I’m vegan!”

 

Stepford Foods

All the onions are exactly the same size.  Big,  round and heavy! All the apples, too.

Never saw anything like that in my own garden or orchard.   Yet Whole Foods gives us row after perfectly presented row of produce.   Bland but pretty-faced, immaculately clean, blemish free, perfectly made up and not one strand of hair out of place, these are the Stepford Wives of the fruit and vegetable kingdom.   Guess Whole Foods thinks Stepford goods provide a stress-free shopping experience.  No need to choose.  Perfect for the shopper in Calvin Klone jeans. 

 

Soy Latte

The Urban Dictionary defines “soy latte” as something overpriced and pretentious, especially something that tastes good initially but leaves a bad taste in one’s mouth. Seems to me that sums up Whole Foods Market awfully well. 

 

 

 



In the last several weeks, two momentous occasions have occurred in the world of vitamin D.

First, the Institute of Medicine (IOM) released its new report.  It tripled the recommended intakes, doubled the upper limit, and commissioned researchers to go forth and test the effects of intakes higher than the upper limit, as this would be safe under proper supervision and provide valuable information. 

This sucker is 999 pages long.  When I finish reading it, you can look forward to one heck of a blog post on the matter.  Stay tuned, folks.

The second momentous occasion occurred two days ago (Tuesday, December 14, 2010) when bestselling business author and tango champion Tim Ferriss released The 4-Hour Body.  Tim’s last book, The 4-Hour Workweek, was number one on the New York Times, Business Week, and Wall Street Journal bestseller lists and has been translated thus far into 35 languages. 

In his new book, he discusses my research on the interactions between vitamins A, D, and K, and gives the following warning:

Supplemental vitamin D increases your need for vitamin A, so don’t forget the aforementioned cod liver, which includes both.

Hooray!  It’s wonderful to see someone with this type of reach get this information out there.

I first raised the issue of vitamin A-and-D interactions in the spring of 2006 in my article on vitamin A and osteoporosis.  I developed these thoughts further and introduced their interactions with vitamin K in the fall of 2006 in my article, From Seafood to Sunshine: A New Understanding of Vitamin D Safety, and in the spring of 2007 in my article, On the Trail of the Elusive X Factor: A 62-Year Mystery Finally Solved, Vitamin K2 Revealed.

I formally published my hypothesis that vitamin D toxicity results not so much from hypercalcemia as from the excessive production of vitamin K-dependent proteins, leading to defective forms of these proteins in the absence of adequate vitamin K, in my December 2007 Medical Hypotheses paper, Vitamin D Toxicity Redefined: Vitamin K and the Molecular Mechanism.  Tufts University researchers confirmed the first prediction of this hypothesis the following year, showing that when vitamin A protects against vitamin D toxicity, it curbs the excessive production of vitamin K-dependent proteins.

When I wrote these articles, all the most compelling research I had was from animal studies.  I wish that when I wrote them I had known that proof of principle for vitamin A-and-D interactions had already been experimentally demonstrated in humans.  This revelation, however, had to wait for my 2009 Wise Traditions lecture, Cod Liver Oil: Our Number One Superfood.  (You can get the DVD here).

In 1941, Irwin G. Spiesman published a human trial, Massive Doses of Vitamins A and D in the Prevention of the Common Cold, in the Archives of Otolaryngology, a journal published by the American Medical Association.

Spiesman treated 54 individuals who suffered from frequent colds (five to seven colds per winter) with massive doses of either vitamin A alone, vitamin D alone, or vitamins A and D together.  He treated them during the winter, for as many as three years, with a dosing schedule reaching a maximum of 40,000 IU for vitamin A and 300,000 IU for vitamin D.

Spiesman found that vitamins A and D only reduced colds when fed together:

Vitamins A and D Only Prevent Colds When Supplied Together

Likewise, he found that vitamins A and D were only safe when provided together:

Vitamins A and D are toxic when provided alone, but safe when provided together.

This study is not perfect.  As you can see from the numbers on top of the bars in the second graph, there were far fewer people in the groups receiving either vitamin alone than in the group receiving both vitamins together.  Spiesman reported that this was because it was difficult to get people receiving no benefit to continue the study for very long.  It makes the study more difficult to interpret.  On the other hand, given the toxicity figures in the second graph, we can be happy for safety’s sake that so few people were given massive doses of one or the other vitamin alone.

It would also have been better to have had a vitamin-free control group.  And it would have been better to see the effects of more realistic doses of vitamins.

Nevertheless, the study quite clearly provides proof of principle in humans that vitamins A and D are most beneficial and safest when provided together, just like in the animal experiments.

Several commenters on this blog and on my Facebook Fan Page have provided testimonials about their negative experiences with vitamin D supplementation that support the protective effects of vitamins A and K:

Cynthia Frederick, March 2010 I, and many others I’ve met on forums, have adverse reactions to Vit D3, even the lower amounts of 2,000 IU/day and even though we were tested and were deficient. And we do not have the conditions that would make Vit D contraindicated. . . . I’m waiting for the long-term effects to take place in people taking these higher amounts who are not simultaneously increasing their Vit A and K levels. In 10 years I wager we will be hearing about the negative effects that the higher doses of this ‘miracle nutrient’ have had on those taking it. . . [in a later comment] As I mentioned before, there seems to be a subset of us for whom even low doses (1,000 IU/day) of Vit D cause kidney stones, chest pain, fatigue, and aches and pains, even though our 25 OH D levels were ‘low’ (23). We get these symptoms as soon as a week after using the D. . . . I saved myself a trip to the ER with the above symptoms by remembering previous articles of yours about balancing the fat-soluble vitamins, and took about 100,000 IU of Vit A from fish oil and 5 mg of K2. Within 1/2 hour ALL the symptoms disappeared. This happened more than once so I know it is not coincidence. You are definitely on to something here.

Lynn Razaitis, June 2010 I certainly know what happens when you get these ratios off. I naively had a vitamin D shot of 200,000 units after a serious viral infection that used up my vit A. I wrecked my kidneys, thyroid and who knows what else. It took 6 months and Chris’s articles to figure out what the heck was going on with me. Within weeks of getting my vit A up with cod liver oil and a ton of liver (and I was thyroid blood testing monthly so I had test results to compare) my thyroid hormones all normalized. It was fairly stunning.

Andrea Schüler, June 2010 I stopped taking D3 because I developed tendonitis, bursitis, tendon calcification and aches and pains. I brought my level from 20 to 50 in the 25 OH test but maybe the 2000 – 4000 IU daily was not good for me or I have not enough A and K. I will test again to see where my levels are after several months without D3 pills. Maybe I should check Vit. A and K levels too.

I have received a number of other testimonials by email from people who have developed problems such as kidney stones and bladder stones after supplementing with “safe” amounts of vitamin D — within the IOM’s new upper limit — and these symptoms quite readily develop in animals fed vitamin D with no vitamin A under experimental conditions.  I have not shared these because they were sent to me in private.  If you have such a testimonial and are willing to share it publicly, please post it in the comments section.

If you are a blogger or a practitioner and have commenters or patients willing to share these stories, please help me compile them into a single source by posting them here or contacting me privately.

So how much do we need of each of these vitamins and in what ratio?  I do not know.  We do not even know what the ideal vitamin D level is, and all of the vitamin D studies are confounded by their failure to account for the status of vitamins A and K.

Dr. Robert Heaney apparently knows what the ideal vitamin D level is:

Finally, I believe that the presumption of adequacy should rest with vitamin D intakes needed to achieve the serum 25(OH)D values (i.e., 40–60 ng/mL) that prevailed during the evolution of human physiology. Correspondingly, the burden of proof should fall on those maintaining that there is no preventable disease or dysfunction at lower levels. The IOM has not met that standard.

If I had access to Dr. Heaney’s time machine, I would love to replicate this study. All my requests for NIH funding for a time machine have been denied without any reviewer comments on how to improve my proposal. If we cannot measure paleolithic man’s 25(OH)D, perhaps we can study the fossils that his clothing has left behind, or study the residue that the melanin in his skin has left on his bones, or inspect these bones for the fossilized remains of light-absorbing coconut polyphenols from the coconut oil he may have rubbed into his skin.

But alas, I know of no studies that have quantified the decay rate of melanin or coconut polyphenols over a timescale of thousands of years, or determined the effects that sun spots, lunar cycles, planetary arrangements, and innumerable possible climate changes might have on these decay rates.

Dr. Michael Holick has a more conservative opinion.  He believes that 25(OH)D should be at least 30 ng/mL.  I believe there is more scientific backing for this level, which I’ve expressed in my post, “Are Some People Pushing Their Vitamin D Levels Too High?”

But that doesn’t mean we have scientific evidence that higher levels aren’t better, or that they wouldn’t be if people were getting enough vitamins A and K.  I don’t see any reason to believe that this is the case, since vitamin A seems to increase the turnover and utilization of vitamin D, which should produce a “low” level despite “high” status, but this is currently in the stage of hypothesis and guesswork.

Stephan Guyenet recently commented that it would be nice to know what the vitamin D levels of Kitavans and members of other traditional, healthy groups are:

I don’t know what their 25(OH)D3 status is, but I wish I did. I’d love to know what their 1,25(OH)D3 levels look like too. I agree that it’s important to have a baseline for comparison so that we can decide what’s biologically normal. I’ve been looking for data to answer that question but I haven’t found it yet.

Knowing this would help, but there is still the question of the optimal A-to-D ratio, and how this might further be affected by vitamin K status. 

A paper that Dr. Holick recently co-authored suggested that ratios between four and eight may be ideal.  The lead author, Dr. Linda Linday, had used cod liver oil with a ratio within this range to successfully protect against upper respiratory tract infections.  These authors also cited research showing this range of ratios to be ideal in chickens.  They also cited Sally Fallon’s summary of my A-and-D interaction work as evidence that there was growing concern among the public about the proper ratio of A and D.

Their findings about cod liver oil are somewhat convincing, but they can’t account for the vitamin D the subjects were getting from the sun, and they didn’t test different ratios.  The chicken research is more rigorous, but it’s, well, it’s in chickens.

From an evolutionary perspective, the usefulness of data from chickens depends on whether you believe the evolutionary trees (or bushes, if you prefer) derived from morphology, which place birds as closer to crocodiles than to mammals by 65 million years, or the evolutionary tree/bushes derived from molecular biology, which place birds with mammals and not with crocodiles.  (See this review.)  Or perhaps we should consider binding proteins and enzymatic pathways directly related to vitamin D metabolism.  With respect to D2 versus D3, for example, primates are much more similar to birds than to rats.  (See this review).

Ah, the mired network of divergent and convergent evolution.  Perhaps we should follow Stephan’s ingenious idea of studying humans.  Living ones.

Paul Jaminet of Perfect Health Diet (here, down a few comments) suggests the ideal amounts are 10,000 IU A and 4,000 IU D.  This also sounds quite plausible to me, but again, we can’t say it’s more than a semi-educated guess.
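As a back-of-envelope exercise, the ratios under discussion are easy to line up; the cod liver oil label values below are hypothetical, since products vary widely:

```python
# Vitamin A-to-D ratios implied by the numbers discussed above.
def a_to_d_ratio(vitamin_a_iu, vitamin_d_iu):
    return vitamin_a_iu / vitamin_d_iu

print(a_to_d_ratio(10_000, 4_000))   # Jaminet's amounts: ratio 2.5

# A hypothetical cod liver oil label (illustrative values only),
# checked against the 4-to-8 range from the Linday/Holick paper:
ratio = a_to_d_ratio(4_850, 970)
print(ratio, 4 <= ratio <= 8)        # 5.0 True
```

None of this tells us which ratio is right, of course; it only makes plain how far apart the current guesses sit.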

This brings us back to Tim Ferriss.  Ferriss tripled his testosterone by bringing his 25(OH)D up to 50 ng/mL and by following a number of other parts of his testosterone-boosting protocol including getting vitamin A from cod liver oil and eating plenty of vitamin K-rich foods.  Ferriss didn’t conduct a dose-finding study on himself, and there’s just about zero reason to believe that the ideal level in Ferriss is going to be the ideal level in anyone else, except that we know that Ferriss is human and will therefore fall within the distribution of human requirements instead of outside of it.

But “follow me and do what I do” is not the point of his book.  The point of his book is to advocate self-experimentation and to provide a starting point for each individual from among his massive audience based on his own self-experimentation.

We should still do the science, but it’s going to take a long time, folks.  The best thing to do now is to eat a well-rounded whole foods diet and experiment with the levels of cod liver oil and/or vitamin D supplementation that make you feel the best, resolve your symptoms, increase your performance, and normalize your clinical tests if they’re out of whack, or don’t throw them out of whack if they’re normal.

And share your results with the rest of us! 



Lots of people find that eating a WAP-friendly traditional diet either has no effect on their blood lipids or improves them by conventional standards.  But I’ve had a number of people ask me, “Why is my cholesterol so high on this diet?”

Or, “why are my triglycerides so high on this diet?”

There are a number of factors that affect blood lipids, and in the future I’ll present a more comprehensive view of this issue.  For now, I’d like to explore the possibility that many people might experience a temporary increase in triglycerides or cholesterol when they switch to a traditional diet because they are actually curing themselves of fatty liver disease.

Recently over at The Daily Lipid, I’ve published a series of posts on fatty liver disease.  Fatty liver is a silent epidemic that probably affects 70-100 million Americans, and is most likely caused by the loss of choline-rich foods like organ meats and egg yolks from the American cuisine.  If you haven’t seen them already, they are all collected there.

I’ll provide references for the new material in this blog post, but the references for any background information can be found in these recent posts from The Daily Lipid.

The reason choline is so important to fatty liver is that it is required for the synthesis of phosphatidylcholine, a critical component of very low density lipoprotein (VLDL), which exports fat and cholesterol from the liver.  If choline is deficient, fat and cholesterol stay in the liver instead of going into the blood.

As we can see from the following diagram, providing enough choline to allow efficient export of lipids from the liver should increase the concentration of lipids in the blood:

Many different factors can contribute to plasma triglyceride and cholesterol.  Curing fatty liver disease may temporarily increase blood lipids.

We can see from this diagram how ridiculous it is to use blood levels of triglycerides or cholesterol in and of themselves to diagnose a problem. 
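To see the intuition behind the diagram, here is a deliberately crude toy model, not physiology: two pools (liver and blood), a fixed rate of fat synthesis, and a VLDL export term that scales with choline availability.  Every number in it is invented for illustration.

```python
# Toy two-compartment sketch (invented numbers, not physiology): VLDL
# export scales with choline status, so restoring choline drains the
# liver pool and transiently raises blood lipids until clearance catches up.

def simulate(choline_factor, days=30):
    liver_fat, blood_lipid = 100.0, 10.0   # arbitrary starting units
    synthesis = 5.0                        # fat made by the liver per day
    clearance = 0.3                        # fraction of blood lipid cleared per day
    for _ in range(days):
        export = 0.1 * choline_factor * liver_fat  # choline-dependent VLDL export
        liver_fat += synthesis - export
        blood_lipid += export - clearance * blood_lipid
    return liver_fat, blood_lipid

for label, factor in [("choline-deficient", 0.2), ("choline-replete", 1.0)]:
    liver, blood = simulate(factor)
    print(f"{label:17} liver fat {liver:6.1f}, blood lipid {blood:5.1f}")
```

Run longer, the two blood-lipid trajectories converge on the same steady state, which fits the expectation, discussed below, that the rise should be temporary.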

Increasing the amount of energy available to the cell, which could be achieved by consuming coconut oil, can increase blood cholesterol, but it also provides lots of energy for good things like detoxification and defending against oxidative stress. 

While coconut oil might be especially effective at increasing cholesterol synthesis, fructose is especially effective at increasing the synthesis of fat, which can actually contribute to fatty liver if there isn’t enough choline to export that fat.

Ideally, lipids are efficiently cleared from the blood so that cells can use them for energy and can use the fat-soluble vitamins that are transported with them for all kinds of critical processes.  Lipid clearance is especially dependent on insulin and thyroid hormone.  When lipids are not cleared efficiently, the delicate polyunsaturated fatty acids (PUFAs) among them will begin to oxidize, which makes them toxic to blood vessels.  The immune system then mops up this toxic mess by forming an atherosclerotic plaque, which is protective at first but eventually can contribute to a heart attack or stroke. 

Poor lipid clearance will usually increase LDL levels, and the LDL particles will then begin to oxidize.  On the other hand, the portion of LDL that oxidizes is cleared from the blood extremely rapidly, so a very high rate of oxidation could actually decrease LDL levels!  On top of this, there are genetic variations among the various receptors that clear oxidized LDL from plasma, leading to further contradicting effects on blood lipids.

Confused yet?  The main point is this: many different factors affect cholesterol and triglyceride levels, and it is simplistic to immediately assume that an increase indicates something bad is going on.  It is also foolish to ignore high cholesterol or triglyceride levels rather than using them as clues and stepping stones to understanding what’s going on in the body.

Now, let me make the case that resolving fatty liver can increase blood lipids.

First of all, a diet deficient in choline and its precursor, methionine, causes fatty liver disease in lab mice and rats while decreasing blood levels of cholesterol and triglycerides.  The effect of a methionine- and choline-deficient (MCD) diet is dependent on dietary sucrose, because the fructose component of sucrose is used by the liver to make lots of fat.  The graphs below (from ref 1) show the effects of methionine- and choline-deficient (MCD) and methionine- and choline-sufficient (MCS) diets using either starch or sucrose in mice.

This one shows that choline increases serum cholesterol, and that sugar exacerbates the effect:

Sugar and choline increase serum cholesterol by curing fatty liver.

By comparing the blue bar to the pink bar, we see that choline increases serum cholesterol even when animals are fed starch.  By comparing the green bar to the red bar, we see that this effect is even more dramatic when the animals are fed sugar.  These effects are all statistically significant.

The next graph shows the same thing for serum triglycerides:

Choline may increase serum triglycerides by curing fatty liver.

In this case the comparison between sugar and starch is not statistically significant, meaning that the difference is too small to distinguish from the effect of chance.  By comparing the green bar to the red bar, however, or by comparing the blue bar to the pink bar, we can see quite clearly that choline increases serum triglycerides.
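For readers unfamiliar with the jargon, “statistically significant” is a judgment about whether an observed gap is distinguishable from sampling noise.  Here is a minimal sketch with made-up numbers, not the paper’s data:

```python
# Two-sample t-test on made-up serum triglyceride values (mg/dL) to show
# what "not statistically significant" means; these are NOT the paper's data.
from scipy import stats

starch  = [95, 102, 98, 101, 97]
sucrose = [99, 104, 100, 103, 98]   # slightly higher on average

t_stat, p_value = stats.ttest_ind(starch, sucrose)
print(f"p = {p_value:.2f}")  # ~0.2 here: a gap this small could easily be chance
```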

The next graph shows the amount of fat in the liver, and makes it quite clear that choline increases blood lipids precisely because it is preventing fatty liver disease:

Choline increases serum cholesterol and triglyceride by preventing their accumulation in the liver.

Here we see a very small effect when the mice are fed starch (blue bar versus pink bar).  By comparing the green bar to the red bar, however, we see a huge effect.  This reflects the fact that the liver makes lots of fat from the fructose component of sucrose while choline deficiency prevents the liver from sending that fat out into the blood.

Those of you familiar with the research on fatty liver disease may be skeptical.  Whoa, you say.  Hold your horses, Mr. Masterjohn!  Why do humans with fatty liver have increased blood lipids when these mice have decreased blood lipids?

I believe this likely reflects the fact that humans with fatty liver tend to be insulin resistant and leptin resistant.  Insulin is a major hormone that clears triglycerides from plasma.  Leptin causes us to make and activate thyroid hormone, which activates the LDL receptor.  When people are choline deficient, much of the fat and cholesterol their livers make will stay in the liver.  When they are insulin resistant and leptin resistant, whatever fat and cholesterol makes it out into the blood will stay there until it eventually oxidizes and gets mopped up by the immune system into an atherosclerotic plaque.

Several studies suggest that resolving fatty liver with choline or its close cousin betaine may increase blood lipids.  Betaine is found abundantly in wheat, spinach, and beets.  Choline is a precursor to betaine, which is used in the liver and kidney for an important process called methylation.  Betaine substitutes for vitamin B12 and folate in this process, as shown in the following diagram:

Betaine can spare choline through its participation in the methylation pathway.

Dietary betaine can thus spare choline and allow more of it to be used for the export of lipids from the liver.

The effects of betaine, folic acid (a synthetic form of folate), and choline as demonstrated by several randomized, placebo-controlled trials have been neatly compiled into one paper (2).

These studies found that two to six weeks of betaine supplementation increased total cholesterol, LDL-cholesterol, the total-to-HDL cholesterol ratio, and triglycerides.  Two weeks of choline only increased triglycerides, and folic acid had no effect on blood lipids.

These studies were small and used amounts of these nutrients at the upper end of what is possible to consume from food, but they provide proof of principle that choline and betaine increase blood lipids.

Why did choline only increase triglycerides while betaine increased both triglycerides and cholesterol?  I believe this is most likely because the choline was provided as phosphatidylcholine, which is the main form found in food, and which can be used to help clear cholesterol from the blood. 

A portion of phosphatidylcholine is cleaved by our pancreatic enzymes during digestion to yield free choline, which will go straight to the liver where it will help our liver export fat and cholesterol.  However, the uncleaved portion will be absorbed with fat, bypass the liver, travel through the lymphatic system, and circulate throughout the body where it will help our tissues clear cholesterol from blood.  Thus, choline is likely to increase triglycerides in most people but its effect on cholesterol levels will depend on a person’s digestive system as well as the activity of the enzymes involved in clearing the cholesterol from blood.
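As a back-of-the-envelope illustration of that split, here is a sketch with a placeholder 50% cleavage fraction, since no actual figure is given here:

```python
# Illustrative split of a phosphatidylcholine (PC) dose between the two
# routes described above. The 50% cleavage fraction is a placeholder;
# the real fraction varies from person to person.

def split_pc_dose(dose_mg, cleaved_fraction=0.5):
    to_liver = dose_mg * cleaved_fraction            # free choline via the portal vein
    to_periphery = dose_mg * (1 - cleaved_fraction)  # intact PC via the lymph
    return to_liver, to_periphery

liver_mg, periphery_mg = split_pc_dose(1000)  # hypothetical 1,000 mg dose
print(f"to liver: {liver_mg:.0f} mg, to periphery: {periphery_mg:.0f} mg")
```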

A nourishing, traditional diet may thus increase blood lipids in many people while healing them from fatty liver disease.

Presumably, this effect should be temporary, and over time this fat should be stored properly or burnt off for energy.

However, I believe that patients and health care practitioners who are “in the know” should start monitoring levels of liver fat as people transition to a healthy diet, especially if their blood lipids increase.  This can provide critical anecdotal information while we wait, wait, and wait some more for the randomized, controlled trials to come in. 

These will eventually come in, because as the choline proponents rally for a higher choline RDA or for individualizing the choline recommendations according to genetics, paranoia about its effects on blood lipids will ensue.  It could be another decade or two, however, before we fully understand these interactions in humans, so I hope that people will begin compiling anecdotal information now.

Liver fat can be measured by MRI or ultrasound.

In any case, the information in this post demonstrates why it is necessary, but far from sufficient, to pay attention to blood lipids.  Blood lipids can reflect many contradicting processes, and an increase could be good or bad, depending on what we need to heal from and whether our diet is truly healing.  Let’s stop the war on cholesterol and start the search for truth.


Eager readers want to know how to incorporate the “health benefits” of soy into their Thanksgiving dinners.  As the Naughty Nutritionist, I suggest we not eat soy this Thursday but speak it.  In other words, let’s talk tofurky.  Given that laughter is the best medicine, I present a baker’s dozen examples of soyspeak found in — or inspired by — the Urban Dictionary (www.urbandictionary.com).

1.  Soy:  Short for soybeans, soy foods or soy products.  Something ersatz, poor quality or otherwise lacking good value.

“Man, that joke was soy!”

2.  Tofu:  Soybean curd.  Something that seems fine at first, but turns out to be ersatz or cheaply made.

“Dang.  Got my Tiffany diamond appraised and found out it was tofu.”

3.  Tofurky:  A mound of pseudo-turkey made of tofu and other interesting ingredients.  Something bland and boring that has been tricked out to seem hip, cool or funky.

“It’s a tofurky of a house — particleboard box iced with Corinthian columns, gables, griffins and a red tile roof.”

4.  Going Cold Tofurky:  The action of a vegan who gives up a habit or addiction at a single moment, rather than gradually.

“I’m addicted to Facebook.  Gonna have to go cold tofurky.”

5.  Talking Tofurky:  To use a $10 word or phrase when a $1 one will do.  For example, using “utilize” rather than “use” or “at the present time” instead of “now.”

“Am I just dumb, or is my boss ‘talking tofurky’ when he orders me to ‘validate support strategies for customer satisfaction parameters’?”

6.  Soymanella:  Food poisoning from contaminated Tofurky or another soyfood product.

“Can’t go shopping today, sweetie.  Got soymanella at that Thanksgiving potluck.”

7.  Soy Latte:  Overpriced and pretentious.

“She wears Gucci socks to work out!  How ‘soy latte’ of her!”

8.  Veg*n:  Alternative spelling of vegan.  Diet said to bring one to G*d.  A loving, inclusive term that unites vegans and vegetarians rather than emphasizing their five letters of separation.

“Oh my g*d, I’m a veg*n.”

9.  Veg@n:  Alternative spelling of vegan.

“Why do you spell it veg@n instead of vegan?  Cuz it’s where it’s at.  Veg@n looks so cool!”

10.  Tofu Friends:  Friends who act like the people they are with.  Bland, nondescript friends who can change their flavor at will.

“He’s a WAPFer at home but a Tofu Friend with the girls on campus.”

11.  Soy Vey:  Expression used by people who get so frustrated trying to talk with their vegan friends that they throw up their hands and walk away.

“Soy vey, Jack.  Enough already.”

12.  Turkey in Trouble:  Game played on vegans attending Thanksgiving dinner, in which the cook hides a little squab or Cornish hen along with stuffing in the turkey cavity.

“We played ‘Turkey in Trouble’ on poor Chandra.  She’d learned nothing about turkeys laying eggs at her ‘soy latte’ college and started screaming when she found we’d roasted a pregnant turkey!”

13.  Soybeans:  Grain legume processed into tofu, tofurky and other food-like products, and marketed as a unique agent able to effect miracle cures.

“So many claims are made for soybeans that you just want to bang your head into the wall while reading about them.”

Amen.


Every week I get agonized letters from parents who fed their sons soy infant formula and who report estrogenized boys who are flabby, lethargic, high-strung and/or embarrassed by breasts and underdeveloped genitals.  These parents want to know, “What can we do now?”

First, read my two articles “Soy Recovery Part I” and “Soy Recovery: The Toxic Metal Component,” which are posted on this website.  The first article discusses the importance of eliminating soy and other estrogenic foods from the diet and the necessity of gut healing.  The second article covers the importance of eliminating toxic metals, such as mercury, aluminum, cadmium and lead, as well as reducing toxic levels of needed minerals such as copper and manganese.

Why do toxic metals play a part in soy recovery?   By interfering with every metabolic function in the body.   Malnourished children — most children these days but certainly those who were put on soy formula —  have impaired detoxification pathways.   That means widespread heavy metal toxicity.   In addition, boys estrogenized by soy formula nearly always have toxic levels of copper.   Those given soy formula in the first six months of life are also prone to toxic levels of manganese, contributing to ADD/ADHD and assorted learning and behavior disorders.    All of these interfere with hormone production and cause havoc within the reproductive system.      

The good news, as I report in the Soy Recovery articles, is that heavy metals and excess copper and manganese can be eliminated.  I cannot emphasize the importance of doing this strongly enough.  The risk of long-term, late-developing health problems from soy formula is far too serious for a “wait and see” attitude.  It is vital to act NOW rather than wait until puberty or later, when hormonal problems are diagnosed and full blown.  Cleaning up the gut and clearing out the metals gives the soy-fed boy his best chance to recover his health and go through puberty as normally as possible.

Sadly, there’s no guarantee that diet and detox will correct the hormonal damage caused by soy formula.  Diet and detox will at least improve overall health, but it may also be advisable to consider hormone repletion and balancing.

Thyroid First  

The first hormone to consider is thyroid.  More than 70 years of studies show that soy causes thyroid damage, most often manifesting as hypothyroidism or autoimmune Hashimoto’s thyroiditis.  While coconut oil and other nourishing foods can support thyroid health and healing, soy-fed babies may need the additional help of replenishing thyroid hormones to optimum levels, preferably with natural thyroid hormones.  Proper levels of thyroid hormone will help improve energy, mental acuity, weight problems and other issues.  If thyroid hormone is needed, it will make a huge difference in your son’s overall health.

Testosterone Next 

As for reproductive hormones, the first year of life is a critical period for a boy’s sexual maturation.  The body during this time should surge with testosterone and other hormones designed to program the newborn’s reproductive system to mature from infancy through puberty into adulthood.  The risk for boys estrogenized by soy formula is that this programming may be interrupted and later reproductive development arrested.  Conventional wisdom holds that once this developmental window has passed, it is too late.  That said, Bioidentical Hormone Replacement Therapy (BHRT) could help some soy-fed boys normalize.

The first step is to test the levels and ratios of the boy’s testosterone and other hormones.  Those who come up deficient or imbalanced may opt for Bioidentical Hormone Replacement, a therapy that is highly experimental for children and adolescents who were fed soy as babies.  Whether replacement hormones can help these boys “catch up” remains to be seen.  However, even when BHRT fails to jump-start growth of the gonads, it could still prove worthwhile in terms of overall health and well-being.  Testosterone, after all, is not just a macho “sex hormone,” but is needed for growth, repair, red blood cell formation, and immune function.  Estrogenized boys might also need help with progesterone or other hormones, so a full panel should be tested.

Parents who would like to consider Bioidentical Hormone Replacement need to know that it is by prescription only, and must be carefully dosed and monitored.  This is true for everyone considering BHRT, but especially for children and adolescents who have not yet reached adulthood.

Hope from hCG

Yet another hormone that might help our soy-fed boys is human chorionic gonadotropin (hCG).  Given that hCG is naturally found in high levels only in pregnant women, this idea might seem bizarre.  Less naturally, hCG has been in the news because of its popularization for weight loss by bestselling author Kevin Trudeau and others.  In these programs, hCG injections plus extremely low-calorie, no-fat diets help patients lose significant amounts of weight quickly while retaining muscle mass and high levels of energy.

My interest in hCG for soy formula-fed boys does not stem from the fact that many of these estrogenized boys are pudgy.  Rather, I am intrigued by a couple of paragraphs in a 1954 report on the use of hCG for weight loss written by the late British physician Dr. A. T. W. Simeons, in which the doctor suggested hCG for boys with underdeveloped sex organs.  Back then, none of the boys would have been damaged by soy, but I can’t help but wonder if hCG could play an important role in soy recovery.  Here is the relevant section of Dr. Simeons’ report:

A Curious Observation

Mulling over this depressing situation, I remembered a rather curious observation made many years ago in India. At that time we knew very little about the function of the diencephalon, and my interest centered round the pituitary gland. Froehlich had described cases of extreme obesity and sexual underdevelopment in youths suffering from a new growth of the anterior pituitary lobe, producing what then became known as Froehlich’s disease. However, it was very soon discovered that the identical syndrome, though running a less fulminating course, was quite common in patients whose pituitary gland was perfectly normal. These are the so-called “fat boys” with long, slender hands, breasts any flat-chested maiden would be proud to possess, large hips, buttocks and thighs with striation, knock-knees and underdeveloped genitals, often with undescended testicles.

It also became known that in these cases the sex organs could be developed by giving the patients injections of a substance extracted from the urine of pregnant women, it having been shown that when this substance was injected into sexually immature rats it made them precociously mature. The amount of substance which produced this effect in one rat was called one International Unit, and the purified extract was accordingly called “Human Chorionic Gonadotrophin,” whereby chorionic signifies that it is produced in the placenta and gonadotrophin that its action is sex gland directed.

The usual way of treating “fat boys” with underdeveloped genitals is to inject several hundred International Units twice a week. Human Chorionic Gonadotrophin, which we shall henceforth simply call hCG, is expensive, and as “fat boys” are fairly common among Indians I tried to establish the smallest effective dose. In the course of this study three interesting things emerged. The first was that when fresh pregnancy-urine from the female ward was given in quantities of about 300 cc. by retention enema, as good results could be obtained as by injecting the pure substance. The second was that small daily doses appeared to be just as effective as much larger ones given twice a week. Thirdly, and that is the observation that concerns us here, when such patients were given small daily doses they seemed to lose their ravenous appetite though they neither gained nor lost weight. Strangely enough however, their shape did change. Though they were not restricted in diet, there was a distinct decrease in the circumference of their hips.


This is all Dr. Simeons says, and all I know about hCG for sex organ development.  I have no experience whatsoever working with this therapy.  I would very much like to hear from physicians, other health care practitioners and parents currently involved in using bioidentical hormones, hCG or other natural, herbal or pharmaceutical therapies to help estrogenized boys with breasts and underdeveloped gonads become healthy and normal men.  Ideas and thoughts about this are also welcome, either as comments below or by direct email to kaayla@drkaayladaniel.com.  Thank you.

 


It’s Halloween time, and once again our newspapers are spooking readers with stories about “creepy food.”

So what are these creepy foods?  Heart, liver, kidneys, sweetbreads, tripe and other organ meats.  Bones are especially gruesome, as can be seen in blood-curdling photos of chicken feet “clawing” their way out of bowls of festering broth.

What a grave mistake to consider these nourishing traditional foods too creepy to eat.  Traditionally, families honored the animal by eating all edible parts of it.  It was the frugal thing to do, and people instinctively knew that organ meats fostered good health.  As for bone broth, it’s long been the ticket to healing anything that ails us.  It’s even been called “Jewish penicillin,” and in South America it’s said to “resurrect the dead.”

What frightens me is the millions of people caught in the web of the bloodsuckers at Big Pfood and Big Pfarm.  There are far too many creepy products to name here.  My motto is “If it’s got a label, don’t eat it,” because most foods that require labels do have creepy ingredients.  But here are a few nominations:

1.  Trans fats.  Partially hydrogenated in Transylvania?  No, here in the USA.  But I expect Count Dracula’s blood is now polluted with them.  That may be what’s drained his life force so much that he has only enough energy to drink the “fast food” of people with high blood pressure.
2.  Undead Burgers.  Witness those internet pics of a McDonald’s “Happy Meal” that show no signs of decomposing after months of sitting out.  Clearly no need to ask, “Want flies with that?”

3.  Caca Crispies.    Proposed name for pet kibble, livestock feed and fish farm rations in which soy protein is mixed with animal doody.   Slogan should be “Snap, Crapple and Poop.”  

4.  Count Chocula — and other breakfast cereals from the dark side.  Call them “Cereal Killers.” 

5.  Ghoul Aid.  Preferably ready-made, packaged and instant, in Lemon Slime flavor.

6.  EdaMummies.  Chocolate-covered green soybeans.  Wrapped up in smug health claims.

7.  SPLBLBLBLBT!  That’s a rude raspberry to raspberry candies.  Dunno which is the creepiest ingredient: the HFCS and other sugars, the red dyes or the anal secretions from beavers?

8.  Ice Scream.  And now Viagra ice cream!  I am not making this up.  Sold only to those over 18.  Not just Ice Scream, but Vice Scream.

Next week I plan to make bone broth with my Halloween skeletons.  Preferably those of vegetarians, since they insist they taste better.  Bone Appetit!

 


Several readers forwarded me a response to my post “The Curious Case of Campbell’s Rats” that had been posted on the vegetarian site, 30 Bananas a Day!, and suggested I make a rebuttal.  The response can be found here, and I will quote the relevant portion in full:

Re: Masterjohn’s (MJ) article, since I do not have access to many of the articles he cites, at the moment I cannot assess how accurately he describes all of them. Based on the studies I was able to access and what I could gather the abstracts of others, it seems for the most part the rats having problems MJ attributes to ‘protein deficiency’ (stunted growth, increased susceptibility to aflatoxin poisoning etc.) were weanlings. MJ neglects to mention that although adult rats and humans have similar protein requirements calorically (~5%), rats have considerably higher protein requirements when nursing and as weanlings (~20%), the latter being a period of rapid bodily growth. Indeed rat mother’s milk contains 20% calories from protein vs. 6% for human milk. So no wonder the weanlings were having problems on a 5% protein diet, they really were protein deficient! This would also explain the case of weanling rats successfully blocking aflatoxin initiation fed 20% protein, they were being fed an adequate amount of protein for their age, and hence were better at fighting off disease than their protein-starved counterparts. In any case this oversight of MJ’s is quite misleading.

I certainly agree that protein-deficient diets will only cause stunted growth when they are fed to growing rats.  The younger the rats are, moreover, the more rapidly they are growing, so the effects on growth will be worst when the deficient diets are fed to weanling rats. 

The question here, however, is not whether rats eventually grow to an age where they can tolerate low-protein diets.  The question is whether the same low-protein diets that prevent the growth of pre-cancerous lesions once they are formed also increase the toxicity of aflatoxin and other harmful chemicals, decrease the capacity for tissue repair, increase the risk of dying from chemical overdose, and promote the formation of pre-cancerous lesions when these diets are fed to rats of the same strain, sex, age, and protein requirements.  As we will see below, this is exactly the case.

Estimating protein requirements is extremely difficult.  While it is simple to determine the amount of protein that maximizes the growth of young rats, maximal growth does not necessarily mean maximal health, as Campbell frequently points out. 

The most common method of estimating protein requirements in adults is the nitrogen balance method, in which researchers find the amount of protein necessary to ensure that the rats (or people, or other animals) are not excreting more nitrogen than they are consuming in their diets.  This is actually a poor way of estimating protein requirements because it completely ignores how protein intake affects metabolic functions, like the synthesis and degradation of essential proteins.  The turnover of proteins and the excretion of nitrogen rapidly decline on a low-protein diet, so the optimal intake of protein is almost certainly higher than that required to maintain nitrogen balance.  These facts are acknowledged in nutritional textbooks (this one, for example), and the widespread use of the nitrogen balance method simply reflects the difficulty of reliably estimating protein requirements.
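As a sketch of the bookkeeping involved: protein is conventionally assumed to be about 16% nitrogen, hence the 6.25 factor, and the rat numbers below are hypothetical.

```python
# Nitrogen balance bookkeeping: nitrogen in minus nitrogen out.
# Protein is conventionally taken as ~16% nitrogen (the 6.25 factor).

def nitrogen_balance_g(protein_intake_g, nitrogen_excreted_g):
    """Positive = retaining nitrogen; zero or above is 'in balance'."""
    return protein_intake_g / 6.25 - nitrogen_excreted_g

# Hypothetical adult rat: 2.0 g protein eaten, 0.30 g nitrogen excreted.
print(f"{nitrogen_balance_g(2.0, 0.30):+.2f} g N/day")  # +0.02: technically in balance
```

The catch, as noted above, is that nitrogen excretion itself falls on a low-protein diet, so an animal can sit at zero balance even while protein turnover is suppressed.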

It is true that adult rats require much less protein to maintain nitrogen balance than young, growing rats require to maximize growth.  For example, Imai (2003) found that diets containing 9% casein will maintain nitrogen balance in adult rats while diets containing 25% casein were necessary to maximize nitrogen balance and growth in 4-week-old rats.  Rats are considered “adults,” however, at six months.  Most of Campbell’s rat experiments were conducted using rats much younger than this, so the protein requirement of adult rats is irrelevant.

Appleton and Campbell (1983) showed that 20% casein diets promoted the growth of pre-cancerous lesions when they were fed to rats already dosed with aflatoxin, but protected against the formation of pre-cancerous lesions when they were fed to rats before, during, and soon after the dosing period.  The dose of aflatoxin was equivalent to just under two million peanut butter sandwiches containing 100 grams of peanut butter contaminated with the maximum limit of aflatoxin set by the US government. 
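The sandwich equivalence is straightforward unit arithmetic.  Assuming the 20 ppb action level the US government sets for aflatoxin in peanut products (the post does not spell out which limit it used), a 100-gram serving carries 2 micrograms:

```python
# Unit arithmetic behind the sandwich equivalence. The 20 ppb (µg/kg)
# figure is the US action level for aflatoxin in peanut products; whether
# this is exactly the limit used for the equivalence is an assumption.

LIMIT_UG_PER_KG = 20      # 20 ppb
SERVING_KG = 0.100        # 100 g of peanut butter per sandwich

def total_aflatoxin_ug(n_sandwiches):
    return n_sandwiches * LIMIT_UG_PER_KG * SERVING_KG

print(f"{total_aflatoxin_ug(2_000_000) / 1e6:.0f} g")  # ~4 g across two million sandwiches
```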

Were the rats “adults” during the promotion period?  Not quite.  All the rats were fed 20% casein until they weighed 80 grams.  According to this vendor, male Fischer 344 rats weigh 80 grams when they are just over five weeks old.  The initiation period lasted from five to nine weeks, and the promotion period lasted from nine to 21 weeks.  They would have become adults after another month, but the promotion period corresponds primarily to adolescence. 

Were these rats’ protein requirements lower during the promotion period?  Let’s take a look at their growth:

Rats had similar protein requirements during the initiation and promotion periods, even though low-protein diets protected against the promotion of pre-cancerous lesions already formed but promoted the initiation of those lesions.

As we can see above, the low-protein diets affected the final body weight of the rats similarly whether they were provided during the initiation period (red bar) or the promotion period (green bar). 

If we consider 5% casein as the reference (the blue bar), providing 20% casein during the initiation period (red bar) increased body weight by 72 grams, while providing 20% casein during the promotion period (green bar) increased body weight by 66 grams.  This difference is so small as to be meaningless and it is not statistically significant.  If we consider 20% casein as the reference (purple bar), providing 5% casein during the initiation period (red bar) decreased body weight by 49 grams, while providing 5% casein during the promotion period (green bar) decreased body weight by 43 grams.  Again, the difference is obviously meaningless and is not statistically significant.

After observing how the red and green bars in the above graph are nearly identical, suggesting similar protein requirements during the initiation and promotion periods, let’s refresh our memories by taking another look at the occurrence of pre-cancerous lesions in the same rats:

Although the rats had similar protein requirements during the initiation and promotion periods, the high-protein diet promoted the growth of pre-cancerous lesions during the promotion period but prevented the initiation of pre-cancerous lesions during the initiation period.

The red and green bars don’t seem so close together anymore.  When these two graphs are considered together, it becomes extremely unlikely that the difference between the effects of protein during the initiation and promotion periods is an illusion created by different protein requirements of the rats during the two different periods.  Rather, it appears that high-protein diets promote the growth of pre-cancerous lesions that have already been formed, but provide equally dramatic protection against the formation of these lesions in the first place.

To provide further evidence that the negative effects of the low-casein diets were not limited to very young animals with unusually high needs for protein, Campbell found that the apparent loss of the ability to repair damaged tissue occurred when the 5% casein diet was fed during the promotion period, the same period during which it protected against the growth of pre-cancerous lesions.  Schulsinger, Root, and Campbell (1989) wrote the following:

Under such conditions [wherein low-protein diets suppress the activity of detoxification enzymes], the parent compound AFB1 would accumulate to cause greater acute toxic effects, whereas less product (i.e., aflatoxin-DNA adducts) would accumulate to initiate carcinogenesis. Although this was our previous explanation for the differences between AFB1-induced toxic and carcinogenic responses to protein intake, it would not appear to account for the greater toxic effects observed in the animals fed 20% casein during initiation (when AFB1 metabolism occurs) but 5% casein after initiation. This observation suggests that the low protein intake was not sufficient to allow for tissue recovery from the acute toxic effects.

And thus ended the final paragraph of this paper.  Campbell likes to cite it as having shown that plant protein doesn’t promote cancer, but he usually leaves out not only the little part about lysine making plant protein just as active as casein, but also this part about the low-casein diets destroying the capacity for tissue repair.

Both of these studies involved an “initiation” period that preceded and coincided with an acute dose of aflatoxin and a “promotion” period that followed.  This reductionist model is useful for studying the effects of initiation and promotion separately, but it is extremely unrealistic.  None of us will eat two million highly contaminated peanut butter sandwiches over a short period of time, but we are all exposed to small amounts of carcinogens on a day-to-day basis.

In more realistic models where the rats were dosed with smaller (but still large) amounts of aflatoxin every day, the low-protein diets proved fatal, even in adulthood. 

Madhavan and Gopalan (1968) fed rats a daily dose of aflatoxin with either 5% (LP) or 20% (HP) casein.  They carried on the experiment for two years, roughly a full lifetime for a lucky lab rat, but they stopped the aflatoxin after six months (when the rats were seven months old, and thus mature adults) because the low-protein rats were keeling over dead left and right:

About 50% of the animals on the LP diet died before the cessation of toxin administration in all experiments and the livers of these showed typical lesions, namely, parenchymal necrosis, bile duct proliferation and fatty change. The majority of the remaining animals died between 52 and 104 weeks with a greater mortality in the LP groups. The death in all was almost invariably associated with a massive pneumonia which was characterized by the presence of microabscesses and mucus-containing cysts.

This is the obscure Indian study with which Campbell had been so enamored.  What of Campbell’s own studies?  Mainigi and Campbell (1980) tried feeding rats 20% casein and 5% casein (5C) diets spiked with 5 parts per million (ppm) aflatoxin as a daily dose, but they had to feed the low-casein group only 2.5 ppm instead:

The 5C animals were fed half the dietary AFB1 since 5 ppm was found to be lethal for this dietary group.
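Since ppm in feed means micrograms of aflatoxin per gram of diet, the daily dose scales with how much a rat eats.  A quick sketch, with the 15 g/day feed intake as a ballpark assumption rather than a figure from the paper:

```python
# ppm in feed = µg aflatoxin per g of diet, so daily dose scales with intake.
# The 15 g/day feed intake is a ballpark assumption, not from the paper.

def daily_dose_ug(diet_ppm, feed_g_per_day=15.0):
    return diet_ppm * feed_g_per_day

print(daily_dose_ug(5.0))   # 75.0 µg/day on the 5 ppm diet
print(daily_dose_ug(2.5))   # 37.5 µg/day on the halved diet the 5C group got
```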

Thus we see that, regardless of the exact protein requirement of adult rats or of humans, the very same low-casein diets that prevented the growth of pre-cancerous lesions once they were formed (lesions produced by dosing the rats with a couple million highly contaminated peanut butter sandwiches’ worth of aflatoxin) also increased the toxicity of aflatoxin and promoted the formation of those lesions in the first place.

Rather than condemning animal protein or even protein itself, we should be asking whether there is an intermediate level of protein that maximizes our protection against the small doses of toxins to which we are exposed every day, and whether other nutrients found in whole foods rich in protein allow us to reap the protective effects of protein safely.
