One of the themes that has recently emerged with the resurgence of debate over the China Study is the supposed conflict between “reductionism” and “holism.”
For example, T. Colin Campbell has argued that many critics of the China Study follow a reductionist approach to nutrition and fail to appreciate the “symphony of mechanisms” that underlies nutritional biology and the processes of health and disease. Denise Minger recently wrote a 36-page, 129-reference response to Campbell that should be considered the most current and authoritative critique of the China Study to date. Within it, she made a powerful argument that Campbell has only teased apart the data (in other words, “reduced” it) enough to hear bits and pieces of this symphony and through his “holistic” perspective has failed to account for many important observations. Dr. Michael Eades recently chimed in with his own assessment of the debate, in which he hammered home the point that correlations do not show causation and all hypotheses must be tested in randomized, controlled trials. In Campbell’s recent “Primer on Statistics” he argued that this is primarily true only when “one adheres to the reductionist philosophy of nutritional biology.”
In this post, I will make the argument that, far from being opposed to one another, reductionism and holism go hand in hand. In doing so, I will present an analysis of a paper recently published in the journal Archives of Neurology purporting to show that high-fat dairy products, red meat, butter, and even organ meats contribute to Alzheimer’s disease as an example of the use of “dietary pattern analysis” gone completely awry through the misuse of “holism.”
Reduction and Holism — The Watchmaker, the Surgeon, and Weston Price
To solve nearly any problem, we must both reduce the problem to its component parts, and then provide a solution to the problem as a whole. In doing so, we use both reductionism and holism in partnership.
A watchmaker cannot fix a watch without first taking the watch apart. Were he to do this and this alone, he would be a “reductionist.” Were he to admire the symphony made as a working watch ticks, he would be a “holist.” The skilled watchmaker, however, is both a reductionist and a holist at once. He takes the broken watch and first reduces it to its component parts. He fixes what is broken, and he puts the watch back together again.
Likewise, a surgeon cannot heal an ailing organ without first making an incision. She spends years learning the intricacies of how the many components of the heart symphonize to produce the quiet percussion that keeps each of us alive. In a matter of life and death, she, like the watchmaker, must reduce the heart to its component parts, heal what is ailing, and allow the whole to once again beat the beat of life.
Weston Price applied reductionism and holism to nutrition. He observed many healthy groups with very different diets in disparate parts of the globe. Had he insisted upon using holism alone, he would have been forced to conclude from observing the Inuit that one must eat mostly animal foods to be healthy. He would have been forced to conclude from observing the Swiss that one must eat rye bread day by day to be healthy. And he would have been forced to conclude from observing the Masai that one must consume blood.
Price, of course, made no such conclusions. He used reductionism to form the hypothesis that it was not the specific foods that mattered, but the nutrients within them. He then came up with a holistic solution — he designed a nutrient-dense diet supplying similar levels of nutrients contained in each of the diverse diets he found in healthy groups across the world, and used this diet to not only prevent but even reverse tooth decay in his dental patients.
When Holism Goes Terribly Wrong — The Perils of Dietary Pattern Analysis
T. Colin Campbell is not the only researcher who believes in analyzing dietary patterns rather than specific nutrients. Many epidemiologists are increasingly using forms of dietary pattern analysis and those with ingenuity and outstanding statistical expertise are continually developing new statistical methods for this purpose.
The approach, conceptually, makes a lot of sense. After all, it is not individual nutrients that carry out biological effects, but complex diets containing many interacting nutrients. Dietary pattern analysis, however, has a dangerous flaw. Foods exert their influence on our health at the biological level, itself composed of the biochemical, biomechanical, and energetic forces that underlie it. Dietary patterns, on the other hand, are primarily determined at the cultural level and to some degree at the level of individual choice.
If I were to eat a diet of liver, kale, and spinach, this dietary pattern would affect my health very differently than if I were to eat a diet of liver, yogurt, and bread, or than if I were to eat a diet of bread, kale, and spinach. This represents the phenomenon of dietary patterns, through biology, affecting health.
Yet there is nothing about liver that, through biological mechanisms, causes me to eat kale or spinach, nor is there anything about kale that causes me to eat bread. Either because of the cultural milieu in which I have acquired my habits or because of my own individual choice, I will form a dietary pattern. This represents the phenomenon of culture and preference, through food choices, affecting dietary patterns.
So what happens when a cultural milieu and a set of dietary preferences cause people to create dietary patterns involving both health-promoting and health-destroying foods? For statistical analysts engaging in dietary pattern analysis, something quite unfortunate happens: they condemn the whole dietary pattern and throw the baby out with the bathwater.
Let’s look at a recent example.
A New Type of Observational Study — One That Can’t Be Used For… Anything
Researchers from Columbia University recently published a paper in the Archives of Neurology entitled “Food Combination and Alzheimer Disease Risk: A Protective Diet.” This paper concluded that a diet rich in salad dressing, nuts, fish, tomatoes, poultry, cruciferous vegetables, fruits, and dark and green leafy vegetables, yet low in high-fat dairy products, red meat, organ meat, and butter was protective against Alzheimer’s disease.
Someone forwarded me a clip from a newsletter that even made the following completely false claim about this study:
Those who developed Alzheimer’s all seemed to eat a diet that was rich in high-fat dairy, red meat, organ meat, and butter, whereas healthy participants had diets rich in salad dressings, nuts, fish, tomatoes, poultry, fruits and cruciferous and dark and green leafy vegetables.
This study found nothing of the sort.
The authors used a very complicated and sophisticated statistical method called “reduced rank regression.” This method was first applied to dietary pattern analysis in 2004 in a paper entitled “Application of a New Statistical Method to Derive Dietary Patterns in Nutritional Epidemiology.” This process can be summarized in the following steps:
1) “Nutrients and ratios of nutrients presumed to be important in the development” of a particular disease are chosen. This is a direct quote from the paper to which I linked above. The word presumed is important, because often the presumption is based on statistical correlations rather than cause-and-effect patterns demonstrated with experimental science. Generally only about five nutrients, give or take a couple, are chosen.
2) The researchers then take anywhere from ten to thirty “food groups” and determine several different dietary patterns that independently explain as much variation as possible in the nutrients presumed to be important. They do this in a way that derives several patterns that have no statistical correlation with one another. Note that the patterns are not based on the relationships between the foods and the disease, but are instead based on the relationships between the foods and the nutrients presumed to be important in the development of the disease.
3) Finally, they perform a statistical analysis to determine the relationship between each pattern and the disease. Each person is allotted a score on each dietary pattern. For each pattern, everyone is divided into groups according to whether they scored high, moderate, or low for that pattern. The investigators then test whether there is any change in risk associated with being in the groups with the highest or lowest pattern score. The researchers then form “holistic” conclusions about the dietary pattern rather than “reductionist” conclusions about individual foods or nutrients.
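The core of steps 1 and 2 can be sketched in a few lines of numpy. This is a minimal illustration of the reduced rank regression idea, not the authors’ actual code: it regresses the nutrient matrix on the food-group matrix and takes the principal directions of the fitted values, which yields mutually uncorrelated pattern scores that explain maximal variation in the chosen nutrients. The function name and the synthetic matrices are my own for illustration.

```python
import numpy as np

def reduced_rank_patterns(foods, nutrients, n_patterns):
    """Sketch of reduced rank regression for dietary pattern analysis.

    foods:     (subjects x food_groups) intake matrix, columns centered
    nutrients: (subjects x nutrients) intake matrix, columns centered
    Returns pattern weights (food_groups x n_patterns) and per-subject
    pattern scores (subjects x n_patterns). The patterns are chosen to
    explain maximal variation in the nutrient set -- note that nothing
    here involves the disease outcome itself.
    """
    # Ordinary least-squares fit of nutrients on food groups
    B, *_ = np.linalg.lstsq(foods, nutrients, rcond=None)
    fitted = foods @ B
    # Principal directions of the fitted nutrient values define the
    # patterns that capture the most nutrient variation
    _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
    weights = B @ Vt.T[:, :n_patterns]   # food-group loadings per pattern
    scores = foods @ weights             # each subject's pattern scores
    return weights, scores
```

Because the scores come from a singular value decomposition, the derived patterns are uncorrelated with one another, matching the property described in step 2 above.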
A German researcher wrote a letter to the journal in which this method was first proposed, the American Journal of Epidemiology, pointing out a major flaw in this approach:
[In order for the approach to be useful,] there needs to be a clear picture of the underlying biologic mechanism relating nutrients or dietary factors to the development of a specific disease. . . . RRR [reduced rank regression] does not overcome the limited knowledge about the relations among food intake, dietary factors, and disease risk. If the underlying biologic mechanisms remain to be elucidated, RRR can only work on the basis of current knowledge or hypotheses. This is quite often the case and is not an extreme case, as Hoffmann et al. stated in their Discussion section. Therefore, the results can only provide answers within the current theoretical framework.
Houston, we have a problem.
Epidemiological studies are observational. The scientific method states that we 1) make observations, 2) develop hypotheses, 3) make testable predictions based on those hypotheses, 4) test the predictions by experimentation, 5) ensure that the results can be replicated, and 6) make cause-and-effect conclusions based on experimental evidence. This “new statistical method to derive dietary patterns” has all the benefits of observational studies except that it cannot generate new hypotheses. Oh wait. The one benefit of observational studies is that they can generate new hypotheses.
So what is it we do with this method? The authors who originally developed it acknowledged that it is only useful for deriving dietary patterns that will “explain maximal variation” in specific nutrient intakes or other such variables. When we go beyond these strict limitations, however, we form conclusions like this one:
Protect Yourself From Alzheimer’s by Becoming Deficient in B12
The authors of the Alzheimer’s study chose seven nutrients that they presumed to be important to the development of the disease: saturated fat, monounsaturated fat, omega-3 fatty acids, omega-6 fatty acids, vitamin E, vitamin B12, and folate.
They then performed statistical associations between thirty different food groups and the set of seven nutrients. They came up with seven dietary patterns. Together, these seven patterns explained just over 75 percent of the variation in the seven nutrients.
Of the seven dietary patterns, only one of them had a statistically significant relationship with Alzheimer’s disease. The pattern that explained the most variation in the seven nutrients — 29 percent of it — had no relationship with Alzheimer’s disease. The one pattern that did have a relationship, called “dietary pattern 2,” explained only 19 percent of the variation in the seven nutrients presumed to be important.
Immediately, this should call into question whether they truly chose the seven nutrients with the most important relationships with the disease. If these seven nutrients were so important, why wouldn’t the dietary pattern that best explains them have the best relationship to the disease?
Although they presented no statistical correlations between Alzheimer’s risk and the individual food groups making up that dietary pattern, scoring high on dietary pattern 2 was associated with a 46 percent lower risk of Alzheimer’s disease than scoring low on it.
The authors thus called the pattern “a protective diet” in their title and identified it in their abstract as a diet “characterized by higher intakes of salad dressing, nuts, fish, tomatoes, poultry, cruciferous vegetables, fruits, and dark and green leafy vegetables and a lower intake of high-fat dairy products, red meat, organ meat, and butter.”
When they described the pattern this way, they simply chose the most powerful contributors to the pattern. In actuality, the pattern is composed of 30 food groups that all make fairly small contributions to the pattern. Although the pattern could explain 19 percent of the variation in the seven nutrients, it could only explain 5 percent of the variation in the foods people were eating. In other words, our ability to use the pattern to predict what people are eating is pretty poor.
How did organ meats get included? The authors do not report raw data or correlations between food groups and specific nutrients or disease end points. However, organ meats are rich in B12, and dietary pattern 2 explained about 6 percent of the variance in B12 intake, which the authors presumed to be among the seven important nutrient intakes. This is the only nutrient that plausibly explains their relationship to the dietary pattern. But a higher score on dietary pattern 2 meant a lower intake of B12. Thus, a higher score on dietary pattern 2 also meant a lower intake of organ meats.
But wait — did the authors really presume that a low intake of B12 would protect against Alzheimer’s, when a deficiency of this nutrient results in irreversible damage to the central nervous system? On the contrary, they included it among the seven presumably important nutrients because “higher intakes of vitamin B12, folate, and vitamin E may be related to better cognitive functioning or lower risk of [Alzheimer’s disease] in elderly individuals.”
Organ meats, in fact, only explained about 3 percent of the variation in this dietary pattern. They explained a whopping 69 percent of the variation in the third dietary pattern, but the third dietary pattern had no relationship to Alzheimer’s. Clearly, if the association between dietary pattern 2 and Alzheimer’s risk in any way reflected the 3 percent contribution of a low intake of organ meats, then the 69 percent contribution of a high intake of organ meats to dietary pattern 3 should have produced a statistically significant result for that pattern.
We thus have organ meats and vitamin B12 indicted on the basis of mathematics rather than biology. Please forgive my momentary lapse into statistical jargon while I explain how this can occur.
Beginning of Momentary Lapse Into Statistical Jargon
In correlation analysis, there are two important terms that indicate the strength of a correlation. The first is the correlation coefficient, denoted “r.” This value is positive when two variables increase or decrease together and negative when two variables move in opposite directions. However, it is the square of this value that indicates the amount of variation in one variable that can be explained by another. In stuffy textbookish language this is called the coefficient of determination. More commonly, we simply call it “r squared.” The r-squared value is always positive, so it tells us nothing about the direction of a relationship.
If a correlation coefficient between X and Y is +0.5, then the r-squared is +0.25. This tells us that Y increases as X increases and that X explains 25% of the variation in Y. However, if the correlation coefficient were -0.5, this would tell us that Y decreases as X increases, yet the r-squared value would still be +0.25, and X would still explain 25% of the variation in Y.
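This sign-blindness is easy to demonstrate in a few lines of Python. The numbers below are invented purely for illustration: a variable and its mirror image have correlation coefficients of opposite sign with x, yet identical r-squared values.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 5.0])  # loosely increases with x
z = -y                                   # the mirror image: decreases with x

r_xy = np.corrcoef(x, y)[0, 1]           # r = 0.8  (positive relationship)
r_xz = np.corrcoef(x, z)[0, 1]           # r = -0.8 (negative relationship)

# The squares are identical -- direction is lost entirely
print(r_xy**2, r_xz**2)                  # prints 0.64 0.64 (to rounding)
```

A method that only maximizes r-squared therefore cannot distinguish a pattern high in a nutrient from one low in it.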
Since the dietary pattern analysis we are looking at seeks to derive dietary patterns that explain maximal variation in presumably important nutrients, it seeks to maximize r-squared. Thus, it makes no difference whether the relationship is positive or negative.
So the authors proposed that low intakes of saturated fat and high intakes of monounsaturated fat, polyunsaturated fat, vitamin E, folate, and B12 are likely to protect against Alzheimer’s. This is the model they consider biologically plausible. Obviously, we can criticize the biological plausibility of this model, but for now let’s focus on how they developed a dietary pattern that contradicts their own theory of biological plausibility.
Biologically, it is very important whether a high intake or a low intake of vitamin B12 protects against Alzheimer’s. Thus, whether a dietary pattern is rich in B12 or poor in B12 is important to the biological plausibility of the hypothesis that the dietary pattern has a cause-and-effect relationship to the disease. Mathematically, however, the direction of a relationship has no effect on the r-squared value between the food groups and the total score based on the seven presumably important nutrients. The method simply seeks to maximize the r-squared value, and thus pays no attention to the biological plausibility of the result.
In this case, these authors wound up creating a dietary pattern that was high in all the other factors they considered protective but low in vitamin B12. Was vitamin B12 itself associated with Alzheimer’s disease? No. The authors stated in their discussion:
The effect of a single nutrient or food item may be too small to detect. Indeed, none of the nutrients was significantly associated with [Alzheimer’s disease] risk in a fully adjusted model (data not shown).
Thus, we have a case of guilt by mathematic association.
End of Momentary Lapse Into Statistical Jargon
It should seem obvious at this point that dietary pattern analysis can indict certain foods simply because, for reasons of culture or individual preference, they happen to be correlated with other dietary factors. And these confounding factors are not just dietary. In fact, “Subjects who were older, less educated, and current smokers tended to adhere less to [dietary pattern 2].” Adherers to the pattern were more likely to be black or white than Hispanic and were more likely to be women.
Although dietary pattern analysis may be “holistic,” this method seems to be taking two giant leaps backward for mankind. First, while good scientists are using observational studies to generate new hypotheses, this method creates observational studies whose use is limited to interpreting data within the framework of existing hypotheses. Second, while most statisticians are using methods to adjust for confounding variables, this method tries its hardest to introduce as many confounding variables as possible.
This method needs a heavy dose of reductionism.
Using Holism Properly — Within the Context of the Scientific Method
Like the watchmaker who reduces the watch to its component parts in order to put back together the whole in working order, we must approach science using reductionism and holism together.
Analyzing observational data should always start with the simplest approach — looking at simple, unadjusted correlations. Then we can slice and dice the data in many ways using many models in order to generate new ideas and new ways of looking at things. Ultimately, we test these ideas using experimentation.
It would be reductionist folly to test the effect of a single nutrient and make a conclusion about diet. This type of testing does have an appropriate place in determining the whys and hows of nutrition. Ultimately, however, the holistic complement to such reductionist research must be to perform experiments using dietary patterns that contain the “symphony” of nutrients working in concert.
Read more about the author, Chris Masterjohn, PhD, here.