Global climate change, or more precisely the fear of global climate change, has emerged as one of the dominant narratives of our time. It has become so prevalent that not a single day goes by without the topic gracing the front page of at least one major international news outlet. Nearly every event involving weather, pollution, farming, construction of any kind or even human psychology is now attributed to climate change. More startling still, a 2021 Pew Research Center survey reported that 5 to 10 percent of childless adults are forgoing children due to climate change concerns. Imagine an external force, beyond any human being’s control, so terrifying that it overrides our biological programming to procreate.
The “official” narrative on climate change goes like this: the Earth’s climate has been stable for hundreds of thousands, if not millions, of years, that is, until about the 1850s, when human society began burning (combusting) hydrocarbons—defined as coal, oil, natural gas liquids and natural gas—in massive amounts, releasing ever-increasing volumes of combustion by-products into the atmosphere. We are told that the most important of these is carbon dioxide, which is a greenhouse gas, and as such has an almost linear relationship with the average global temperature of Earth; in other words, carbon dioxide is the environmental control knob for our planet; every increment we produce warms the planet by that same increment. According to this narrative, the only way to save the planet is to stop using hydrocarbons and stop the release of carbon dioxide from human activity. If this does not happen in the next twenty (or fifty or one hundred) years, humanity as we know it will be doomed, and the Earth will be rendered uninhabitable.
We are told in no uncertain terms that the scientific community is “unanimous” in this prognostication and that the science regarding this topic is “irrefutable.” A good example of the conventional wisdom came in a multipage spread of prestige journalism from the New York Times in April 2021, titled “The Science of Climate Change Explained: Facts, Evidence and Proof (Definitive Answers to the Big Question).” Unsurprisingly the article focused only on the human (or “anthropogenic”) role of fossil fuel combustion, while ignoring or heavily discounting any natural forces that contribute to climate change.
This article, the first in a series, and subsequent discussions are intended to help inform readers about the complexity of Earth’s climate system and to show that no single variable can explain, or is responsible for, variations in Earth’s macro climate patterns. Further, carbon dioxide—a trace gas in our atmosphere that over the last one hundred fifty years has increased from two hundred eighty to four hundred twenty parts per million (0.028 percent to 0.042 percent of the atmosphere), largely through human combustion of fossil fuels—is not the dominant variable affecting all of Earth’s weather. It should also become evident that there is much less consensus within the scientific community than we are led to believe, and that “the science” is by no means settled.
POLITICAL, NOT SCIENTIFIC, ORIGINS
In tackling this expansive subject, I decided to start at the beginning, with the political origins of the climate change discussion. In early 1972, a previously obscure think tank published an unusual book about the future of society. The Limits to Growth (LTG), developed and published by The Club of Rome (CoR), would go on to sell thirty million copies and influence a generation of academics, public policy wonks and, unfortunately, politicians.
Several details about this book were unusual for the time but would not be considered out of place today. The first oddity was the book’s popularity despite its dark, Malthusian and utterly bleak outlook for humanity as a whole. The second was the use of computer programs as the primary analytical tool for modeling humanity’s societal trajectory into the great unknown of the future. The final aberration was the source of the book: the CoR think tank. Although the group’s official branding is altruistic, it consists of a collection of unelected intellectuals—today they would be called “thought leaders”—who research, analyze and disseminate (through studies, books, white papers and the like) policy documents meant to steer policymakers toward what they believe to be the best course for countries, nations, society and other cultural constructs.
In retrospect, the advent of groups such as the CoR can be viewed as a canary in the coal mine presaging the emergence of a “consulting-industrial complex” into the mainstream. The influence of groups like the CoR has metastasized across our society over the past fifty years to the point of the mass infection we have today. This complex consists of for-profit consultancies (such as McKinsey, Bain, Boston Consulting Group, Oliver Wyman, Booz Allen Hamilton and IBM); accounting and advisory firms (for example, Deloitte, Ernst & Young, Accenture, KPMG and PricewaterhouseCoopers [PwC]); and nonprofit foundations and think tanks (which in addition to the CoR include entities like the World Economic Forum of Davos fame, the Brookings Institution, the RAND Corporation, the Bill & Melinda Gates Foundation, the Sierra Club and the Rockefeller Foundation). Together they form the backbone of the consulting-industrial complex and represent the outsourcing of thought in Western states from the people and their elected representatives to corporations and special interest groups. Across the Western democracies (meaning the member-countries of the Organisation for Economic Co-operation and Development, or OECD), there is almost no piece of public policy or political agenda today that is not heavily influenced or written outright by the various appendages of the consulting-industrial complex.
The Club of Rome was—and is—no different. On paper (that is, officially), the CoR was established by Aurelio Peccei (an industrial consultant for Fiat) and Alexander King (a British science advisor with the OECD). Reportedly, they formed a partnership over a shared interest in emerging global macroeconomic issues—namely, rising population and resource depletion. Behind the scenes (depending on how much tinfoil you own), the true financial backers of the CoR appear to have been the Volkswagen Foundation, the Agnelli family (founders of Fiat), David Rockefeller (grandson of John D. Rockefeller), Maurice Strong (Canadian oilman) and Dean Rusk (U.S. Secretary of State from 1961 to 1969). There are two items of note regarding these backers. First, they all seem to have had a heavy guiding hand not just in the formation of the CoR but also in two other prominent organizations: the Rockefeller Foundation and the United Nations (UN). Second, their role as founders of each seems to have been largely scrubbed from the Internet record. The consistency of the underlying philosophy shared by these organizations—and the involvement of the same players—seems too remarkable to be purely coincidental (if not a reason to stock up on more tinfoil).
What were the interesting and novel ideas that The Limits to Growth proposed? Pointing to the post-WWII population boom—an estimated increase from 2.4 billion to 3.8 billion people between 1945 and 1972—LTG argued that the growing global population would soon strip the world of resources, food and industrial capacity, leading to an exponential rise in “pollution” and a collapse of prosperity and society within the next hundred years, but likely much sooner (see Figure 1). Among other predictions, LTG forecast that, at 1970 rates of consumption and growth, the world would exhaust its gold resources by 2001, silver by 2014, copper by 2020, natural gas by 2021, oil/petroleum by 2022 and aluminum by 2027.
The solution to avert disaster, in the opinion of the LTG authors, was to return the human population and its influence over the earth to a state of “equilibrium.” They argued that the only way to avert catastrophe was to massively bend the curve on population (to, say, around one to two billion individuals) and de-emphasize the pursuit of growth among the world economies. Thomas Malthus reached a remarkably similar conclusion in his book, An Essay on the Principle of Population, written one hundred seventy-five years earlier.
The LTG premise, shared by Malthus, was that human population growth is geometric (exponential), while food and resource availability increase only linearly, setting the stage for population to regularly overshoot food and resource production. Unlike Malthus’ pencil-and-paper approach, the technique underpinning the LTG analysis was the peak of modern mathematics—a computer simulation—in this case a program called World2 (later extended as World3), developed by MIT systems engineer Jay Forrester. However, LTG was not without vocal detractors when published, many of whom pointed out a couple of inconvenient facts about the study. The main criticism was that the negative effects of population growth had been modeled as geometric, while the positive effects of a growing population—namely, innovation and technology—had been modeled as linear. A second major critique was that the core World2/World3 model made no adjustments for, and largely ignored, the impact of price, or the relationship between supply, demand and price.
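For the technically inclined, the arithmetic of that premise is easy to sketch. The following toy calculation in Python pits a geometrically compounding population against a linearly growing food supply and reports when the former overtakes the latter; the starting values and growth rates are illustrative assumptions of mine, not parameters from the actual World2/World3 model.

```python
# Toy illustration of the Malthusian overshoot premise behind LTG:
# population compounds geometrically while the food/resource base grows
# only linearly. All numbers are illustrative assumptions, not values
# from the actual World2/World3 model.

def year_of_overshoot(pop0=3.8e9, pop_growth=0.02,
                      supply0=5.0e9, supply_step=5.0e7):
    """Return the first year-offset at which population exceeds the
    supply base (expressed as 'persons supportable')."""
    pop, supply = pop0, supply0
    for year in range(1, 500):
        pop *= 1 + pop_growth   # geometric (exponential) growth
        supply += supply_step   # linear growth
        if pop > supply:
            return year
    return None

print(year_of_overshoot())  # with these assumptions, overshoot in ~26 years
```

Any model built on those two growth shapes will produce a crisis on schedule; the question the critics raised was whether the shapes themselves were justified.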
Ironically, given its current position on climate change, one of the harshest detractors of the LTG study was the New York Times, which published an excoriating rebuttal in April 1972. The newspaper wrote:
“The Limits to Growth, in our view, is an empty and misleading work. Its imposing apparatus of computer technology and systems jargon conceals a type of intellectual Rube Goldberg device—one which takes arbitrary assumptions, shakes them up and comes out with arbitrary conclusions that have the ring of science. Limits pretends to a degree of certainty so exaggerated as to obscure the few modest (and unoriginal) insights that it generally contains. Less than pseudoscience and little more than polemical fiction, The Limits to Growth is best summarized not as a rediscovery of the laws of nature but as a rediscovery of the oldest maxim of computer science: Garbage In, Garbage Out.”¹
One of the fortuitous developments that initially made LTG such a popular and influential book is what happened next. In a twist of fate, the world experienced a physical scarcity of crude oil—the lifeblood of the global economy—during the 1973–1974 Arab Oil Embargo, the Arab OPEC member-nations’ response to Western democracies’ political and military support of Israel. This front-and-center resource scarcity (driven by political motivations rather than outright depletion of this critical commodity) helped reinforce the underlying message of LTG to the masses, leading many to consider the study prescient in the moment.
However, what happened following the 1973–1974 oil price shock is exactly the type of complex system dynamic that the LTG/World3 model simplified or ignored. The reduction in oil supply drove a steep increase in price and spurred short-term conservation measures. Over the next decade, high oil prices led to the development of new resources (namely, fields in the North Sea, Alaska, West Africa and the Soviet Union). As those resources came on line in the early 1980s, the supply deficit was transformed into a supply surplus, and by the mid-1980s the price of oil was collapsing. The period of scarcity abated while consumption and prosperity continued, ushering in a twenty-year period of low energy prices and high economic growth globally.
REAL-WORLD CONSEQUENCES AND FAILED PREDICTIONS
Lest anyone think that white papers such as LTG are merely academic postulations without real-world consequences, consider the example of China’s one-child policy, in force from 1979 to 2015, which illustrates the real impact of translating an analytical exercise backed by computer simulations into social policy. The architect of that program, engineer and demographer Song Jian, was directly influenced by LTG and two other policy documents. The second document, The Population Bomb, was written in 1968 by Stanford biologist Paul R. Ehrlich, who argued that population growth in the 1960s would lead to widespread famine and societal collapse in the 1970s; the book was written at the suggestion of Sierra Club executive director David Brower. Jian’s third influence, A Blueprint for Survival (1972), was endorsed by thirty leading scientists of the day and made similar arguments, contending that humans should return to living in small, deindustrialized tribes.
Advocates of China’s policy will highlight the fact that the one-child program prevented an estimated four hundred million births and, by the 2010s, had helped China bend its population curve, which topped out at roughly 1.4 billion. However, this “success” must be weighed against the large societal distortions the program produced, including millions of stories of personal human suffering and pain and a massively top-heavy demographic structure in which the younger generations have an unprecedented gender imbalance—25 percent more men than women—never before experienced by such a large population in world history. (Usually, a demographic gender imbalance swings the opposite way as a function of war, with many more women than men.) The ultimate consequences of this type of policy are still playing out in China today, and I suspect much of the friction within Chinese society going forward will ultimately be traced back to this grand social engineering experiment.
Fifty years after LTG’s publication, it becomes more obvious with each passing year that very little of what was predicted has come to pass, despite the study’s grim hundred-year time horizon. Even with a global population of some eight billion, the world has not run out of resources. Moreover, much of the macro pollution of the 1970s (smog, acid rain, DDT poisoning and various air and water pollutants) has been abated in the West, and humans are enjoying one of the highest collective standards of living that any civilization (that we know of) has ever experienced.
Given the stark rebuttal of what was predicted, one would think that organizations such as the CoR would have disappeared, relegated to a footnote of history and largely forgotten, but they have not. In fact, starting in the late 1980s and early 1990s, the CoR and similar organizations began the great pivot away from population control and resource depletion to the next great phantom boogeyman: climate change. Nothing better exemplifies the great pivot than this 1993 quote from CoR authors Alexander King and Bertrand Schneider:
“Because of the sudden absence of traditional enemies, new enemies must be identified. In searching for new enemies to unite us, we came up with the idea that pollution, the threat of global warming, water shortages, famine and the like would fit the bill. . . . All these dangers are caused by human intervention, and it is only through changed attitudes and behavior that they can be overcome. The real enemy then is humanity itself.”²
PASSING THE BATON
The handoff—from the CoR to other organizations as the vanguard of the climate change advocacy movement—began in the 1980s, highlighted by the June 23, 1988 Senate testimony of NASA scientist James Hansen,³ who reported that “he was ninety-nine percent certain the earth was warmer than it had ever [been] measured to be and that there was a clear cause and effect relationship with the greenhouse effect and lastly due to global warming, the likelihood of freak weather was steadily increasing.” Hansen at the time was the director of the NASA Goddard Institute for Space Studies (GISS) and was instrumental in advancing the transition from the instrument era (1750–1980) to the satellite era (1980 to present) in monitoring and measuring the Earth’s climate and atmosphere. It is important to note that when he gave his testimony to the U.S. Senate, we had at most ten years of satellite-based measurements, so how these could be used to make deterministic, long-term statements about climate change remains a bit of a mystery to me personally. Hansen is also noted for his studies of the planet Venus and its 95 percent carbon dioxide atmosphere, for popularizing the concept of “runaway” climate change and for using the global mean temperature as a means of tracking climate change.
That same year, the Intergovernmental Panel on Climate Change (IPCC) was formed as an offshoot of the UN Environment Programme (UNEP) and the World Meteorological Organization (WMO). The purpose of the IPCC is to publish periodic reports (namely, to collate, combine, scrub and peer-review the existing scientific literature) to provide a comprehensive outlook on the state of Earth’s climate to help inform and steer global decisions regarding climate change. The IPCC then feeds its principal Assessment Reports (ARs), published every five to seven years beginning in 1990, to the UN Framework Convention on Climate Change (UNFCCC), which coordinates or negotiates intergovernmental agreements on limiting the release of emissions associated with climate change. (Think Kyoto Protocol, Paris Agreement, etc.)
To date, the IPCC has published approximately forty reports on climate change and finished its sixth Assessment Report (AR6) in 2022. For the most part, the mainstream media consider the ARs an invaluable and indisputable source in making the case for human-caused climate change or warming. What we are supposed to believe, in other words, is that these reports catalog “the science” or “the consensus” regarding greenhouse gases and their role in driving climate change.
ROCKEFELLER FOUNDATION GLUE
What connects the Club of Rome to the UN, the IPCC, UNEP and UNFCCC? For that, you need to look at a third organization, the Rockefeller Foundation—the granddaddy of them all—founded in 1913 by oil magnate John D. Rockefeller and his son John D. Rockefeller Jr. With its enormous funding generated by the Rockefeller oil fortune, the Rockefeller Foundation has been a major player in “nongovernmental” world politics since its founding. The World Health Organization (WHO), the National Science Foundation (NSF) and the National Institutes of Health (NIH) were all modeled on the Rockefeller Foundation and owe it much of their early funding. The Foundation’s early work focused on public health and medical research, then shifted in the 1930s into population control and the dark science of eugenics, including direct funding of German scientists in the 1930s and 1940s.
Although the Foundation’s focus diversified into global governance after WWII, it never lost its love for the controversial subject of population control. In the 1950s, Foundation control passed from the second to the third generation of Rockefellers, with John D. Rockefeller III becoming chairman alongside his colleague John Foster Dulles, who was Eisenhower’s Secretary of State and brother of Allen Dulles, the first civilian director of the CIA. In 1952, they founded the Population Council, tasked with—wait for it—controlling the population boom of the post-WWII era. The Rockefeller Foundation’s second focus in the post-WWII era was funding and supporting the fledgling United Nations, with the aim of changing the UN from a “parliament of nations to a modern think tank that used specialized expertise to provide in-depth impartial analysis of international issues.”
In 1965, Dean Rusk (U.S. Secretary of State from 1961 to 1969 and a board member of the Rockefeller Foundation) heard a speech on population control and non-renewable resource depletion by Italian consultant Aurelio Peccei. Rusk was so taken by the speech that he had it translated into English and circulated throughout the halls of power in Washington, DC and New York City. This led to a series of think-tank-type gatherings organized by the OECD and the Rockefeller Foundation that, in turn, connected Peccei and Alexander King to the systems modeling work of Jay Forrester at MIT, resulting in early planning meetings for the Club of Rome. Many of those meetings took place at Rockefeller-owned properties in Italy. The meetings culminated in the CoR’s founding and the publication of LTG.
In parallel to this, a different board member of the Rockefeller Foundation, Canadian oilman Maurice Strong, picked up on the very same themes as the CoR and carried them to the UN. Strong commissioned a report titled Only One Earth: The Care and Maintenance of a Small Planet, which compiled views from one hundred fifty-two leading experts from fifty-eight countries about the state of the Earth’s environment. The report was presented at the UN meeting on the environment held in Stockholm in 1972.
This document and the 1972 UN meeting are considered the world’s first “state of the environment” report. Given its success and the apparent urgency, the UN used the moment to create a special carve-out organization—UNEP—with Strong appointed as UNEP’s executive director for the first four years of its existence. In an interesting turn of events, Canadian prime minister Pierre Trudeau called Strong home in 1976 to found and run Canada’s national oil company, Petro-Canada. Pierre Trudeau is also a listed member of—wait for it—the CoR.
All of this could merely be attributed to birds of a feather flocking together, but it does seem uncanny how many of these organizations’ founders were board members of other similar organizations—and how many of them held ingrained views on the state of the Earth and the need to control human population growth.
THE (IN)FAMOUS HOCKEY STICK
Later in life, Strong maintained a guiding hand in the growth of UNEP and its message. The next generation of politicians would cite Strong as a mentor and role model in addressing Earth’s multinational environmental problems, with one of those politicians, Al Gore, elected U.S. Vice President in 1992.
The most famous (and controversial) IPCC Assessment Report came in 2001 (AR3, also called TAR), presenting a figure showing a historical temperature reconstruction of the planet going back one thousand years, marrying temperature proxy reconstructions with the modern instrumentation record. (Temperature proxies and their reconstructions refer to a common data collection technique of paleoclimatologists that uses the plant and geologic record to reconstruct climate conditions of the past. The most common methods include using air bubbles from glacial ice cores to analyze the composition of the atmosphere, measuring tree rings, analyzing sub-fossilized pollen in lake beds and ocean sediment, taking temperature measurements in boreholes and analyzing the mineral and stable isotope composition of corals and calcium deposits.) The well-known figure represents a major data point in “proving” the current mainstream climate narrative, suggesting that Earth’s average climate (as represented by temperature) had been very stable for one thousand years until mass combustion of fossil fuels started with the Industrial Revolution. Combusting hydrocarbons releases carbon dioxide (and other gases); the IPCC argued that this had caused the planet’s average temperature to take off like a rocket ship (see Figure 2), resembling the profile of a hockey stick: a long flat shaft with a sharply upturned blade (an oddly Canada-specific cultural reference).
The graph then featured prominently in Gore’s 2006 climate change documentary, An Inconvenient Truth; the film won two Academy Awards, and Gore shared the 2007 Nobel Peace Prize with the IPCC. The documentary brought the climate change discussion fully into the mainstream, where actors, influencers, celebrities, politicians and a very brave little girl from Sweden adopted it as the popular cause to stand behind.
The “hockey stick” graph is based on a research paper written by a trio of career academics from the University of Massachusetts and the University of Arizona (Michael E. Mann, Raymond Bradley and Malcolm Hughes, often referred to in shorthand as “MBH”). The paper, in turn, derived largely from Mann’s dissertation work. The originating piece of scientific work, published in Nature, was titled “Global-Scale Temperature Patterns and Climate Forcing Over the Past Six Centuries.”⁴
What is most interesting about the paper and subsequent derivative work is that it does not present any new science; it is not the product of new experiments and their rigorously tested results. Instead, “MBH” applied new statistical techniques to merge temperature inferences from climate proxies (such as tree rings and ice cores) with the primitive instrumentation record, the modern instrumentation record and the ultra-modern satellite record. In other words, it is more a statistics and interpretation exercise, and thus subject both to normal human biases and to outlier manipulation (such as throwing out data that do not fit the narrative you are trying to prove). Nevertheless, since the article’s publication and adoption as a cause célèbre, Mann & Co. have received every conceivable kind of award and plaudit; their efforts and those of other scientist/researcher types in the field of paleoclimatology reconstruction have all bathed in enhanced fame, prestige and wealth.
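To see what such a merging exercise involves, here is a deliberately simplified sketch. It calibrates a single synthetic proxy series against an overlapping “instrumental” period using ordinary least-squares regression and then extends the fit backward in time, which is the basic calibrate-and-extend logic. MBH’s actual method was far more elaborate (principal components across networks of many proxies), and every number below is synthetic.

```python
# Simplified proxy "calibration": fit a proxy series (say, tree-ring
# widths) to instrumental temperatures over their period of overlap,
# then project the fit backward onto the proxy-only era. This is NOT
# the MBH method (which used principal component analysis of many
# proxy networks); it only shows the calibrate-and-extend logic.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(1400, 2001)
true_temp = 0.0005 * (years - 1400) + 0.2 * np.sin(years / 30.0)
proxy = 2.0 * true_temp + rng.normal(0, 0.3, years.size)  # noisy stand-in

overlap = years >= 1850  # era with thermometer readings

# Least-squares calibration on the overlap...
slope, intercept = np.polyfit(proxy[overlap], true_temp[overlap], 1)

# ...then extend backward to "reconstruct" the pre-instrumental era.
reconstruction = slope * proxy + intercept
print("calibration slope:", round(slope, 3))
print("reconstructed 1400s mean:",
      round(reconstruction[years < 1500].mean(), 3))
```

Every step (which proxies to include, which overlap window to calibrate on, how to weight the series) is a statistical judgment call, which is precisely the concern raised above.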
The popularity and reach of the techniques developed by Mann & Co. initiated a cottage industry of similar reproductions and efforts, but with one glaring problem—none of the results ever seemed to match, further reinforcing the subjective nature of this type of “science.” Efforts to independently reproduce Mann’s results have proved impossible, in part because Mann and his colleagues refuse to release their source work; yet even when subsequent generations of academics use similar original data, their work produces results that are often wildly different. Some temperature reconstructions show high variability in the past, while others show little. Some show the various classic temperature periods (see Figure 3), while others omit them, and so on. The classic temperature periods refer to time spans—usually several hundred years—where qualitative observations from the time and the early instrument record reflect some type of shift in general climate conditions. These periods are the RWP (Roman Warm Period, 250 BCE–250 CE), the MWP (Medieval Warm Period, 950 CE–1250 CE) and the LIA (Little Ice Age, 1450 CE–1850 CE).
DESIGNED TO CONFUSE?
There are some other small(er) items to note about Figures 2 and 3. These items have become hallmarks of the climate change discussion but, in my opinion, they are designed to confuse.
The first thing to note is the extremely small range on the Y-axis, usually with an absolute amplitude of two to three degrees Celsius. Most humans cannot feel or detect a temperature variation of less than one degree Celsius.
Second is the general refusal to use actual temperatures. Instead, all temperatures are presented as anomalies relative to a reference period drawn from the instrumentation age (1961–1990 in Figure 2 and 1881–1980 in Figure 3). Why were those specific years chosen? The choice, though often arbitrary, matters, because one (if so inclined) can shift the result by choosing a reference period that trended cold or hot.
The third item to point out is the use of smoothed rolling averages on top of highly variable underlying data, which flattens away the year-to-year swings. (A short sketch following these points illustrates both the baseline effect and the smoothing effect.)
Fourth (and pertaining just to Figure 2), the gray shaded area around the main blue and black data lines is the confidence band, as in “with 95 percent confidence.” This means there would be a 95 percent probability that the actual temperature readings (had instrumentation existed at the time) would have landed within the gray band. That is a huge amount of error for such small increments of variation, particularly the further back in history you go.
Finally, notice that both figures clearly denote that these are “Northern Hemisphere” reconstructions only. This is often done on purpose, as the temperature record starting in the instrumentation age (roughly 1750 onward) was heavily dependent on just two countries (England and the U.S.) until fifty to seventy-five years ago.
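Two of these effects, the baseline choice and the smoothing, are easy to demonstrate. The sketch below builds a synthetic annual temperature series, re-expresses it as anomalies against the two baseline windows used in Figures 2 and 3, and then applies a thirty-year rolling average; the data are made up, and the point is only how the presentation choices move and flatten the curve.

```python
# (1) Anomalies are relative to a chosen baseline period, so changing
#     the baseline shifts every value up or down by a constant.
# (2) Rolling averages smooth away the year-to-year variability.
# The series is synthetic; the baseline windows mirror Figures 2 and 3.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1850, 2021)
temps = 14.0 + 0.005 * (years - 1850) + rng.normal(0, 0.25, years.size)

def anomalies(t, yrs, start, end):
    """Express temperatures relative to the mean of a baseline window."""
    baseline = t[(yrs >= start) & (yrs <= end)].mean()
    return t - baseline

a1 = anomalies(temps, years, 1961, 1990)  # Figure 2's baseline
a2 = anomalies(temps, years, 1881, 1980)  # Figure 3's baseline
print("constant offset between the two series:", round((a1 - a2)[0], 3))

# A thirty-year rolling mean flattens the year-to-year swings.
smoothed = np.convolve(a1, np.ones(30) / 30, mode="valid")
print("raw std dev:", round(a1.std(), 3),
      "| smoothed std dev:", round(smoothed.std(), 3))
```

Neither choice is inherently dishonest, but each is a degree of freedom for whoever draws the chart.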
More broadly, what exactly is a “hemispherical average temperature” (or, for that matter, a “global mean temperature”), and how does it account for half a hemisphere being in day and the other half being in night? How does it take into account the fact that our hemispherical seasonal patterns are mirror opposites of each other? One could reasonably assume that a society as complex and technologically inclined as ours would implement a global-scale project that looks something like this:
- Grid the world to Cartesian coordinates;
- Chop grid to some fixed increment of distance;
- Build identical temperature monitoring stations at each increment (over land, sea, mountains, sand, swamps, etc.);
- Staff and equip each station identically, down to the instrument manufacturer;
- Turn them all on at the same time;
- Collect data;
- Maintain each station to the same standard.
This is what you would have to do to ensure the integrity of the temperature data you are collecting and to consider the data, you know, scientific. However, it should come as no surprise to the reader that, globally, we have nothing close to this type of coordinated system. What we have is a patchwork of methods and accuracy checks, largely held together with duct tape and baling wire, leaving huge gaps in coverage, reliability, standardization and historic data (which are then largely cobbled together using computer simulations, models and other novel statistical techniques). Even the basic arithmetic of a “global mean” involves choices, as the sketch below illustrates.
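For what it is worth, here is the arithmetic that sits underneath any published “global mean”: a weighted average over grid cells, where each cell counts in proportion to its area (which shrinks toward the poles as the cosine of latitude). This is a minimal sketch with synthetic numbers; real products must additionally infill missing cells, adjust for station moves and urbanization, and splice ship, buoy and satellite records together.

```python
# A "global mean temperature" is operationally a weighted average over
# grid cells, with each cell weighted by its area (proportional to the
# cosine of its latitude). Synthetic 10-degree grid for illustration.
import numpy as np

rng = np.random.default_rng(2)

lats = np.arange(-85, 86, 10)    # grid-cell center latitudes
lons = np.arange(-175, 176, 10)  # grid-cell center longitudes
temps = (15.0 - 0.3 * np.abs(lats)[:, None]      # colder toward the poles
         + rng.normal(0, 1, (lats.size, lons.size)))

# Cell area on a sphere scales with cos(latitude).
weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(temps)

weighted = np.average(temps, weights=weights)
naive = temps.mean()  # unweighted: over-counts the small polar cells
print("area-weighted mean:", round(weighted, 2),
      "| naive mean:", round(naive, 2))
```

Every one of those additional adjustments is another place where methodology, not measurement, determines the answer.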
Some may find these points arcane or minutiae-laden, but temperature—or, more precisely, large-scale average temperatures—is one of the prime instruments being used to make the case for anthropogenic, humanity-induced climate change. Given that large societal gears are beginning to turn, and that our current political trajectory envisions remaking the planet’s entire energy system in the next twenty-five years based on micro-movements in global average temperatures, it behooves all of us to get educated on this topic, particularly since the guesstimated cost of this energy transition would be higher than total current global GDP.
CLOSING THOUGHTS
Looking at the last one hundred fifty years of the early and modern instrumentation record, what is becoming apparent is that Earth, on average, is getting slightly warmer. Averaged across the various hemispheres, this is usually rated at about 0.5 to 1.0 degree Celsius (or 0.9 to 1.8 degrees Fahrenheit) over the past fifty years. I myself believe that human activity is responsible for a portion of this increase—but that includes all human activity and not just the release of carbon dioxide.
What is also true, if you believe the instrumentation record, is that the early to mid-1800s (1800–1850) were among the coldest periods on record and the last pulse of the Little Ice Age, a finding reinforced by a thick packet of anecdotal evidence (for example, the “year without a summer,” 1816). To use this period as both the start of the hydrocarbon era and the start of the “trustworthy instrumentation” era thus artificially inflates current temperature increases by comparing them to a temperature nadir. It also means that some of the temperature rise of the past one hundred fifty years must have been naturally induced, or we would be trending into a new ice age, with or without hydrocarbons.
The other part of the distortion is the ability of skilled artists to contort data and paint whatever statistical story is wanted. Look at a raw data set and see if you can spot the emerging “climate emergency.” The Central England Temperature (CET) data set is the oldest continuous temperature record in existence, stretching back to 1659 without any known data manipulations (see Figure 4).
Looking at the CET, there is not much variability—not much of a trend—aside from a barely discernible warming starting around 1975 and an extended period of cooler-than-average temperatures from 1659 to 1700. The flatness of the record is all the more notable given England’s role as the epicenter of the Industrial Revolution and its thousand-fold increase in hydrocarbon usage over the same period.
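Readers can check this for themselves. The Met Office Hadley Centre publishes the CET series; the sketch below assumes you have saved the annual means as a two-column (year, temperature) CSV with no header; the file name and layout are my assumptions, not the official format. It then fits an ordinary least-squares trend to the whole record.

```python
# Back-of-envelope linear trend on the CET annual means. The local
# file name and two-column CSV layout are assumptions about how you
# saved the data, not the Met Office's official format.
import csv

years, temps = [], []
with open("cet_annual.csv") as f:  # hypothetical local copy
    for row in csv.reader(f):
        years.append(int(row[0]))
        temps.append(float(row[1]))

# Ordinary least-squares slope, computed by hand to avoid dependencies.
n = len(years)
my, mt = sum(years) / n, sum(temps) / n
slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
         / sum((y - my) ** 2 for y in years))
print(f"trend: {slope * 100:+.2f} degrees C per century since {min(years)}")
```

The reader can then weigh whatever slope comes out against the dramatic presentation of the same data elsewhere.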
If carbon dioxide is only a fractional contributor to the barely perceptible increase in global temperature, why is the primary solution presented to us the elimination of hydrocarbons from our society in order to “save the planet”? This is something I have pondered for a while, and I have come up with one possible hypothesis. Maybe it is not about saving the planet at all; maybe it is about cutting off the flow of available energy into our society. It is interesting to note that one of the other statistical correlations that we have observed since the start of the industrial age is the strong correlation between energy use (hydrocarbons), population, GDP and GDP per capita, as well as, for good measure, the literacy rate (see Figures 5 and 6).
What we also know is that the total human population never exceeded one billion inhabitants prior to the discovery and exploitation of hydrocarbon energy. The gusher of energy released with the Industrial Revolution created a tremendous amount of surplus energy in society as a whole, which has in part fueled the increase in global population from one to eight billion. Maybe the logic is that throttling back the availability of surplus energy in society (hydrocarbons) will, in turn, reverse or undo the population effects of the Industrial Revolution.
By sheer coincidence, this is the thesis first put forward by Malthus and later rebooted with shiny scientific bells and whistles by the Club of Rome, which in turn passed the baton of population control to the UN (UNEP). Each iteration has refracted the same message slightly differently (“Food will run out,” “We do not have enough resources,” “There is too much pollution”) in pursuit of the same end goal—reducing the human population to one to two billion inhabitants, or pre-Industrial Revolution levels. To some, this might make complete sense, if not for the pesky issue of having to get rid of seven out of every eight of us.
In a future article, I will cover the many factors other than carbon dioxide that affect climate and weather, to further show that this trace gas in our atmosphere is not the sole driver of all climate change. That discussion will expand into the energy system being envisioned and installed to replace hydrocarbons—and, if realized, the terrible consequences it will have for all of humanity.
SIDEBAR
TEN REASONS TO EAT BUTTER
- Butter is one of our best dietary sources of vitamin A.
- Butter is also a good source of vitamin K2, Dr. Price’s Activator X.
- Butter is a unique source of ready-made butyric acid, important for thyroid function and colon health.
- Butter provides arachidonic acid, out of which the body makes feel-good endocannabinoids.
- Butter contains glycosphingolipids, which support digestive health.
- Butter can be a good source of minerals, such as selenium, iodine and zinc.
- Butter is a good source of vitamin E, important for fertility.
- The fats in butter are stable and will not break down into free radicals or aldehydes.
- Butter tastes delicious!
- Butter will warm the heart but does not cause global warming!!
REFERENCES
1. Passell P, Roberts M, Ross L. The Limits to Growth. The New York Times, April 2, 1972.
2. King A, Schneider B. The First Global Revolution. The Club of Rome, 1993, p. 115.
3. Hearing before the U.S. Senate Committee on Energy and Natural Resources, June 23, 1988. https://www.pulitzercenter.org/sites/default/files/june_23_1988_senate_hearing_1.pdf
4. Mann ME, Bradley RS, Hughes MK. Global-scale temperature patterns and climate forcing over the past six centuries. Nature. 1998;392:779-787.
This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly journal of the Weston A. Price Foundation, Fall 2023