New Statin Study: Too Good to be True?
“A highly anticipated study has produced powerful evidence that a simple blood test can spot seemingly healthy people who are at increased risk for a heart attack or stroke and that giving them a widely used drug offers potent protection against the nation’s leading killers.” So begins a Washington Post article (November 10, 2008) on the recently published JUPITER study, which its lead author, Paul M. Ridker, claims provides evidence for taking the statin Crestor to lower a substance called C-reactive protein (CRP), considered a marker for inflammation. “Compared with those getting the placebo, those taking Crestor were 54 percent less likely to have a heart attack, 48 percent less likely to have a stroke, 46 percent less likely to need angioplasty or bypass surgery to open a clogged artery, 44 percent less likely to suffer any of those events and 20 percent less likely to die from any cause.” If this sounds too good to be true, you are right. Before you rush to your doctor to have your CRP levels tested and jump on the statin bandwagon, read Sandy Szwarc’s excellent analysis of the study and its accompanying hype at junkfoodscience.blogspot.com.
Key points: the actual differences in outcome were in fact very small, with the difference in mortality between the statin and control groups after nearly two years only 0.25 percent; researchers stopped the trial early, just as the projected overall mortality of the statin group was about to surpass that of the placebo group; the selection process for trial participants was so rigorous that it screened out eight of ten seniors recruited, for conditions ranging from inflammatory disease to “unstated reasons”; even though participants were screened to exclude those with potential “compliance” problems, nearly 15 percent had stopped taking their pills after one year; and there were 25 percent more newly diagnosed cases of diabetes in the statin group than in the placebo group (270 cases versus 216). But the world of cardiology is breathless: “For the cardiology world, discovering a major new risk factor as well as an effective treatment is like hitting a walk-off home run to win the World Series,” says Dr. Eugene Braunwald of Brigham and Women’s Hospital. And the corporate world is rubbing its hands with glee. The stock of AstraZeneca, maker of Crestor, climbed 45 percent after JUPITER was halted last March. The lab test for C-reactive protein costs $50-$80, and Crestor costs $1,400 per year. If publicity for JUPITER increases the number of people taking the statin, business analysts estimate that AstraZeneca’s annual Crestor sales, already $3.5 billion, will double over the next five years. And if the official guidelines are changed to include CRP as a risk factor, seven to ten million more American adults could join the ranks of those “at risk for heart disease” and needing statin treatment to meet federal guidelines, potentially worth $14 billion per year to the drug company. P.S. Dr. Ridker, chief author of JUPITER, co-invented the CRP test; Brigham and Women’s Hospital holds the patent, and the patent rights have been licensed in part to AstraZeneca.
The Cholesterol Risk Factor
One very interesting fact emerged from the media discussions of the JUPITER trial—with JUPITER, cardiologists have finally acknowledged that cholesterol levels do not accurately reflect a tendency to heart disease. James Stein, MD, of the University of Wisconsin Medical School in Madison, praised the study for exposing the fact that current therapeutic LDL-cholesterol levels are not only arbitrary but in fact a poor indicator of cardiovascular risk. “Most patients with heart attacks have normal LDL-cholesterol values,” he stated. With the cholesterol theory crumbling, the industry is under intense pressure to come up with a new risk factor, and one that can be treated with the same statin drugs it has invested so much money in. Enter Dr. Ridker and C-reactive protein. Ridker has been pushing CRP as an important risk factor to be treated with statins for a number of years. But is CRP really a risk factor for heart disease, or simply an associated factor? Studies indicate the latter. In fact, a National Panel on CRP Testing found no evidence to support the premise that treating CRP will improve survival rates (www.urmc.rochester.edu/pr/News/story.cfm?id=182). Elevated CRP levels are associated with everything from anger and stress to arthritis, cancer, lupus, inflammatory disease, pneumonia, TB, oral contraceptive use, pregnancy, heart attacks, surgery, trauma, burns and strenuous exercise, among many other conditions. They are a marker for disease, not a cause, but since statin drugs lower CRP levels slightly, you can bet that CRP will be the new cholesterol, to be feared, tested for and lowered using these dangerous and expensive drugs.
Statins for Pregnant Women? Too Horrible to be True
The FDA lists statins in category X for pregnancy, along with thalidomide and Accutane, meaning that they should never be taken by pregnant women (http://healthlink.mcw.edu/article/1031002752.html0). They are teratogens, with the potential to cause horrible birth defects. For this reason, the March of Dimes opposed over-the-counter statin sales. But researchers at New York’s Hospital for Special Surgery have pregnant women in their sights. They tested statins on mice with a condition called antiphospholipid syndrome (APS), which can cause miscarriages, and found improvements in biochemical markers indicative of better pregnancy outcomes. Now they are claiming that statins should be given to women with APS-induced pregnancy complications. Guillermina Girardi, PhD, lead author of the study, claims that statins are perfectly safe for pregnant women and that a trial involving pregnant women is needed. A trial like this could only get approval from the hospital’s human safety review board if all the board members are on Lipitor themselves.
New Food Ranking System
An example of how reductionist tunnel vision can lead to ludicrous conclusions is the new Overall Nutritional Quality Index (ONQI), developed by a “panel of twelve of the nation’s foremost experts on nutrition” at Yale University’s Griffin Prevention Research Center. Using a complex mathematical formula that looks at selected nutrients in a particular food, as well as the amount of fat, sugar, sodium, cholesterol, calories, glycemic load and other factors, the august experts have rated commonly consumed foods on a scale of one to one hundred, with one hundred being the “healthiest.” The top-rated food under the ONQI system? Broccoli! All but one of the foods rating 90 and above are fruits and vegetables—radish gets a 99—the exception being nonfat milk with a 91. Meats and seafood range from the twenties to the eighties in this system, with the lower numbers assigned to fatty cuts of meat such as baby back ribs and chicken wings. Obviously non-nutritious foods such as popsicles, cheese puffs and sodas get ratings below 20, along with nutrient-dense traditional foods like fried eggs, salami and bacon (www.onqi.org). The sacred foods—prized by traditional cultures as absolutely necessary for optimal health and normal reproduction—don’t even show up in the ONQI list: butter, organ meats, fish eggs, cod liver oil, and whole raw dairy products from grass-fed cows. It would be interesting to feed one group of laboratory rats a diet of high-rated foods such as broccoli, radishes and nonfat milk, and another group of rats a diet of low-rated foods such as fried eggs, salami and bacon, and compare the results. Meanwhile, expect to see the new ratings posted on various grocery store foods very soon. The scale system “will allow busy parents and others who care about nutrition a quick, at-a-glance way to see what food item is ultimately the healthiest without having to read every label” (CalorieLab Calorie Counter News, January 28, 2008). An ominous statement!
Can we expect to see the ONQI replace food labels sometime in the future?
More News About Vitamin D
Vitamin D is in the news these days as more and more studies show the benefits of the “sunshine vitamin.” Vitamin D protects against bone loss, diabetes, nervous disorders like MS, cancer and heart disease. A new study, published in the Archives of Internal Medicine, indicates that people with low vitamin D levels have a significantly higher risk of death than those with higher levels. The study involved over 13,000 people in their forties whose blood was tested for vitamin D levels and who were followed for about nine years. The participants were divided into four groups based on their vitamin D levels. Those in the bottom quarter, whose vitamin D was less than 17.8 nanograms per milliliter, had a 26 percent greater risk of dying from any cause than those in the top quarter (Vol. 168 No. 15, Aug 11/25, 2008). It is studies like these that have shamed the American Academy of Pediatrics into issuing “new” guidelines for vitamin D intake in children, raising the recommended dose from 200 to 400 IU. Actually, these guidelines are not new, but rather the old AAP recommended guidelines, in place from 1963 to 2003. In 2003, the AAP reduced the recommended dose to 200 IU, justifying this decision with a short paper containing absolutely no discussion of the scientific data and arguing that breast milk was an inferior source of vitamin D compared to formula (which is true only if breast-feeding moms are avoiding food sources of vitamin D). The fact that vitamin D is found in foods that are conspicuously absent from the USDA official food guidelines has led to some contradictory and even ridiculous statements by researchers. According to Neil Binkley, an associate professor of geriatrics and endocrinology at the University of Wisconsin School of Medicine and Public Health, “We were never supposed to eat our vitamin D.” How people living in places like Wisconsin got their vitamin D during the winter months before the advent of vitamin D pills is not explained.
Vitamin D pills are the chopped-logical choice in today’s climate of confinement agriculture, animal fat avoidance and sun phobia. The AAP spent a large part of its recent position paper warning mothers to keep their infants and young children totally out of the sunshine; heliophobia has led to the widespread practice of putting sunscreen on children who will spend the day fully clothed, “just in case” a noxious ray of sunlight should touch their skin.
Sweet Sourdough Bread. . .
A key component of our mission at the Weston A. Price Foundation is to report on the scientific validation of traditional foodways. Recent research on bread, soon to be published in the British Journal of Nutrition, provides wonderful validation of the benefits of traditional sourdough bread-making techniques. Using white, whole wheat, whole wheat with barley and sourdough white breads, researchers at the University of Guelph examined how subjects responded after eating bread for breakfast and again after lunch. The ten male subjects, who were overweight and ranged between fifty and sixty years of age, showed the most positive responses after eating sourdough white bread. With the sourdough, the subjects’ blood sugar levels were lower for a similar rise in blood insulin, and this positive effect remained during the second meal and lasted for hours afterward. Surprisingly, the worst results were seen after consumption of whole wheat and whole wheat with barley bread, which caused blood sugar levels to spike, with high levels lasting until well after lunch. According to Professor Terry Graham, head researcher on the project, the fermentation of the sourdough “changes the nature of starches in the bread, creating a more beneficial bread.” The research team is now looking into the effects of sourdough fermentation on whole wheat bread (The Canadian Press, July 7, 2008). What these preliminary results tell us is that consumption of improperly prepared whole grains puts the body under stress, as witnessed by the unhealthy increase in blood sugar levels.
. . . With Butter
Butter is bad for us, we’ve been told—over and over again—because butter contains saturated fat, and saturated fat raises “bad” cholesterol and makes us gain weight. Yet in a recent trial carried out in Israel, described as “arguably the best such trial ever done and the most rigorous,” researchers found that a low-carbohydrate diet high in saturated fat resulted in the greatest weight loss and the most desirable lipid profiles. The trial compared three diets: a restricted-calorie American Heart Association (AHA) diet with about 30 percent of calories from fat, with less than 10 percent of calories as saturated fat; a restricted-calorie Mediterranean diet, high in dietary fiber and monounsaturated fat; and a low-carbohydrate diet, described as “high in saturated fat,” with about 40 percent of calories from fat and 12.5 percent as saturated fat. Calories were not restricted in the low-carbohydrate diet, yet after two years, this group had lost the most weight—ten pounds versus six on the low-fat diet. LDL-cholesterol reduction was best with the Mediterranean diet, while those on the supposedly heart-healthy AHA-recommended diet saw no reduction in LDL-cholesterol. Those on the low-carbohydrate diet had a moderate reduction of LDL-cholesterol, but the best results for the ratio of total cholesterol to HDL-cholesterol occurred with the low-carb dieters, who had increased HDL-cholesterol, the so-called “good” cholesterol, whereas the other two groups did not. Furthermore, the low-carb dieters saw the biggest reduction in C-reactive protein, a marker for inflammation, and the non-diabetic low-carb dieters had the lowest fasting insulin levels. (Diabetics on the Mediterranean diet had the best markers for fasting glucose and insulin levels.)
It’s a pity the researchers did not look at a normal traditional diet containing 50-80 percent of calories as fat, with at least half of those calories as saturated fat, but the results of even slightly more calories as saturated fat compared to the AHA-recommended diet should give our dietary pundits pause. How long will it be before saturated fats like butter, tallow and coconut oil take their proper place in government dietary recommendations? Probably not any time soon. But meanwhile, those of us in the know can enjoy plenty of butter on our bread—sourdough bread!
One In Five
Almost one in five young American adults has a personality disorder that interferes with everyday life, and an even greater number abuse alcohol or drugs, according to a report in the Archives of General Psychiatry (2008;65(12):1429-1437). The findings are the result of interviews with more than 5,000 individuals ages 19 to 25. The disorders include problems such as obsessive or compulsive tendencies and anti-social and paranoid behaviors that are not mere quirks but interfere with ordinary functioning. Substance abuse, including drug addiction, alcoholism and drinking that interferes with school or work, affected nearly one-third of those interviewed. Pharmaceutical executives must be rubbing their hands together at the thought of this “untapped” source of more customers, referred by their college health centers and dorm counselors. Sadly, no media commentators are suggesting the obvious solution: to clean up the food supply and stop the demonization of nutrient-dense animal foods needed by the brain.
Even Alternative Folks Are Conventional
A group calling itself the Natural Therapies Research Board surveyed “hundreds of nutrition research articles” and interviewed over 300 “natural” nutritionists to develop a Consensus Report on Basic Health and Nutrition. These research articles were all published by “reputable scientists in peer-reviewed journals,” thus eliminating “marketing propaganda in favor of good science.” The nutritionists agreed that we should eliminate refined and artificial sweeteners, hydrogenated or trans fats, artificial vitamins, over-the-counter drugs and caffeine from the diet, but the other dietary suggestions are just offshoots of the official dietary guidelines, with a decided emphasis on plant foods. They recommend 5-7 servings of vegetables daily, 4-5 servings of fruit, 2-3 servings of berries (!) and 2-3 servings of nuts, along with 2-3 servings of “protein.” There is no mention of whether this protein should be meat or soybeans. Milk is included as long as it is organic (youhealyourself.com/consensus/index.html). The guidelines contain no recommendations whatsoever about healthy fats and nutrient-dense animal foods. Looks like the diet dictocrats have successfully infiltrated the “natural” health movement.
Good News About GMOs
Cultivation of conventional soybeans is on the increase, replacing genetically engineered Roundup Ready beans, according to a report from the University of Mississippi Delta Research Center. According to the report, farmers are choosing conventional seeds because of lower seed costs, lower weed control costs and comparable or higher yields. Roundup herbicide that cost $15 per gallon in 2007 was selling for $40-$50 per gallon in 2008 (www.nwrage.org). Other possible reasons for the trend: increased demand for non-GMO food and studies indicating that glyphosate, the main ingredient in Roundup, increases the risk of non-Hodgkin’s lymphoma, a form of cancer (Journal of the American Cancer Society, March 15, 1999).