In 1976, Thomas McKeown published The Modern Rise of Population, in which he argued that the increase in population that occurred in industrialized nations over the nineteenth and twentieth centuries was driven by declines in infectious disease mortality, which were in turn driven by improvements in “nutrition” rather than by public health measures or medical advances. “Improved nutrition” in this sense basically meant access to more food, not better food. McKeown’s thesis caused quite a stir, and he supported it with provocative graphs like this one:
McKeown’s data were for England and Wales. It is worth noting that the United States never introduced widespread tuberculosis vaccination, yet saw a similarly dramatic decline in the disease. McKeown never argued that medical treatments had no effect, however; he simply argued that they were not major drivers of the trend. He published similar graphs for mortality from respiratory diseases, measles, scarlet fever, diphtheria, and smallpox. By and large, the declines in overall mortality were driven by the declines in mortality due to tuberculosis.
In 2010, Aleck Ostry and John Frank published an up-to-date evaluation of McKeown’s argument in the journal Critical Public Health, entitled “Was Thomas McKeown Right for the Wrong Reasons?” (1). Other critics of McKeown had argued that he too easily dismissed many advances in public health that took decades to fully manifest and that he failed to give enough credit to advances in therapeutic medicine like identifying and isolating tuberculosis patients. Ostry and Frank considered these criticisms legitimate, but too narrow to discredit McKeown’s broader argument.
Instead, they expanded on a criticism that Frank had previously published: that McKeown failed to distinguish between two separate mortality trends. The first, occurring between 1780 and 1870, was a decline in mortality among adolescents and young adults driven largely by declining tuberculosis mortality. The second, beginning in the 1890s, was a decline in mortality among infants and young children driven largely by declines in mortality from measles, whooping cough, and various diarrheal infections. They argue that these two trends had fundamentally different causes.
Ostry and Frank argue that the emergence of the “yeoman farmer” in the sixteenth century, along with technological advancements in farming, contributed to a large increase in food availability. Not everyone was a yeoman farmer, however: this period also saw increased division of labor and urbanization, and reconstructions of wealth distribution leave us uncertain just how broadly Britons benefited from the increase in food availability. Indeed, between the 1830s and 1870s, the decline in mortality was confined to rural areas, as exploding populations and grossly unsanitary conditions prevailed in urban centers. Average height, however, which is sensitive to nutritional status, increased in close parallel with the declines in mortality. These data suggest that while urbanization may have taken its toll on certain sectors of the population, a greater abundance of food may nevertheless have improved overall survival.
While they accept that this greater abundance of available food may have driven the early decline in adolescent and young adult tuberculosis mortality, Ostry and Frank argue that the later decline in infant and childhood mortality had a different cause: declining family size. This may seem confusing on the surface, since the decline in mortality is supposed to have caused a massive increase in population. They are thus arguing, in effect, that the population increased because fewer people were being born. Is this even mathematically possible? Let’s consider the argument.
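It is indeed arithmetically possible: a population grows whenever births outpace deaths, so a falling birth rate accompanied by a faster-falling death rate produces faster growth. Here is a toy projection in Python that makes the point concrete. The rates and time spans are hypothetical illustrations chosen for simplicity, not Ostry and Frank’s data:

```python
def project(pop, birth_rate, death_rate, years):
    """Project a population forward using simple constant annual rates.

    Each year the population changes by (birth_rate - death_rate),
    i.e. net growth is what matters, not the birth rate alone.
    """
    for _ in range(years):
        pop += pop * (birth_rate - death_rate)
    return pop

# Era 1 (hypothetical): high births, high deaths -> slow net growth
early = project(1000, birth_rate=0.035, death_rate=0.030, years=30)

# Era 2 (hypothetical): fewer births, but far fewer deaths -> faster growth
later = project(1000, birth_rate=0.025, death_rate=0.015, years=30)

print(round(early))  # modest growth despite many births
print(round(later))  # larger growth despite fewer births
```

With these illustrative numbers, the second era ends with a larger population even though its birth rate is lower, simply because mortality fell by more than fertility did.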
Reproductive rates peaked in 1816, fell slowly until the 1870s, and then began to plummet. In the 1860s, marrying couples had an average of 6.2 children; this fell to 4.1 by 1890 and to 2.3 by the 1920s. Millions of pro-contraceptive tracts were distributed during this period and, although abortions increased, withdrawal (the “pull-out” method) remained the main method of contraception until World War II. Since withdrawal was always available, an increase in the availability of contraceptive technology could not have driven the decline in reproductive rates. Since there was a strong popular movement in favor of contraception, and since abortion increased during this period, it seems reasonable to suggest that people restricted family size primarily because they wanted smaller families. Ostry and Frank suggest that rising educational costs, as well as laws prohibiting child labor, may have motivated the desire for smaller families. Despite the increase in abortion, people appear to have achieved this, Ostry and Frank claim, primarily by practicing withdrawal.
To support the notion that having fewer children could decrease child and infant mortality, Ostry and Frank point out that bearing and nursing children taxes a mother’s micronutrient stores, especially her stores of vitamin A and iron. These nutrients, in turn, are well known to protect infants and children from infectious diseases. Ostry and Frank also point out that siblings may compete for nutrition when food is scarce. They cite Reeves (2), who originally proposed this hypothesis in the 1980s, and who estimated that declining family size could account for two-thirds of the twentieth-century decline in infant mortality due to respiratory diseases, and may have shifted the burden of measles from infants to older children less likely to die from the disease. McKeown himself provided data from late eighteenth-century Britain on page 40 of The Modern Rise of Population that seem to support the general idea that smaller families have lower mortality rates:
As we move down the rows, we see statistics for larger and larger families. If we follow the red numbers, we see that the proportion of children alive at the time the statistics were compiled decreases as the size of the family increases. This is consistent with Ostry and Frank’s argument that the more children a woman bears, the more her nutritional stores are taxed, compromising each child’s ability to survive infectious diseases.
If this argument is correct, it is a great testament to the “primitive wisdom” documented by Weston Price, which included family planning, as he explained in Nutrition and Physical Degeneration (p. 398):
Another important feature of the control of excellence of child life among the primitive races has been the systematic spacing of children by control of pregnancies. The interval between children ranged from two and a half to four years.
Of course there is a difference between declining overall family size and systematic child spacing, but they would tend to correlate with one another and would both help preserve maternal nutritional status.
It’s a shame that it would take child labor laws and expensive education to force this wisdom on modernized Western Europeans, particularly since other aspects of “primitive wisdom” included how to educate children inexpensively and give them plenty of work to do (isn’t work educational?) without abusing them in mine shafts.
But Ostry and Frank go further than claiming that family planning reduced child and infant mortality; they contend that it drove population growth:
After 1870, with shifts to a modern pattern of fertility control, population growth was increasingly maintained through decreasing mortality, particularly among infants.
Yet in the table above, from the late 1700s, women who had 11-24 children saw only about half the proportion of their children survive compared with women who had 1-2 children, yet they nevertheless averaged 3.4 surviving children compared to only 0.78. If we follow the rightmost column of the table, which shows the average number of surviving children, we confront the big surprise that the surest ticket to having a greater number of surviving children was to bear more children, not fewer. If “nutrition” improved over the course of the following century, we would expect survival to improve in larger families, which would make having lots of kids an even surer ticket to having lots of kids who remain alive.
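The arithmetic behind this point can be made explicit with the two survivor averages quoted above (the table’s per-family birth counts are not reproduced in the text, so only the averages of surviving children are used here):

```python
# Averages of surviving children quoted from McKeown's table
# (late 1700s Britain), as cited in the text:
small_family_survivors = 0.78  # families with 1-2 children born
large_family_survivors = 3.4   # families with 11-24 children born

# Although large families lost a greater *proportion* of children,
# bearing many more children more than compensated in absolute terms:
ratio = large_family_survivors / small_family_survivors
print(ratio)  # roughly 4.4 times as many surviving children
```

In other words, even with roughly half the survival proportion, the largest families ended up with over four times as many living children as the smallest ones.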
Thus, while it seems entirely reasonable that restricting family size contributed to the decline in infant and child mortality, I find it hard to believe that it contributed enough to this decline to drive massive population growth.
While Ostry and Frank understand the importance of vitamin A in preventing infant and child mortality, both they and McKeown seem to have been completely unaware that the most important medical intervention for tuberculosis was a nutritional one. This same nutritional intervention, of which Weston Price was very fond, became a universal prophylactic during the 1920s and 1930s, taken by children and adults throughout the modernized world to prevent infectious diseases until the post-war period ushered in the ascent of antibiotics.
That’s right, folks: cod liver oil. In a future post, I’ll consider the role cod liver oil played in banishing widespread infectious disease mortality to the annals of history. Was this great feat of modernization wrought by another bedrock of “primitive wisdom”? Stay tuned…
Read more about the author, Chris Masterjohn, PhD, here.
1. Ostry AS, Frank J. Was Thomas McKeown right for the wrong reasons? Critical Public Health. 2010;20(2):233-243.
2. Reeves R. Declining fertility in England and Wales as a major cause of the twentieth century decline in mortality. The role of changing family size and age structure in infectious disease mortality in infancy. Am J Epidemiol. 1985;122(1):112-26.