Doctoring Data: How to Sort Out Medical Advice from Medical Nonsense
Dr. Malcolm Kendrick
Columbus Publishing Ltd, 2015
Dr. Kendrick lives in Cheshire, England, so his writing has a distinctly British flavor—blimey this and bloody that. I love that stuff. With a great sense of humor, he covers all the usual statistical misinterpretation strategies perpetrated by data-doctorers. He comes up with an excellent illustration of how studies will often ask the wrong question and then inevitably get the wrong answer. In a typical cancer study you will have a treatment group and a control group. The question they ask is: does the treatment group have a higher or lower rate of cancer deaths compared to the control group, which received no treatment or a placebo? That is the wrong question.
Why is that the wrong question? Suppose the treatment involves shoving patients off a very high cliff. It should be readily apparent to the careful reader that the treatment group will have a drastically lower rate of death from cancer. While this is a somewhat extreme example, it illustrates very well how the trick works. The correct question to ask is: how do the overall survival rate and quality of life in the treatment group compare to those of the control group? It is always safe to assume that any study that fails to mention total mortality outcome is flawed.
Kendrick cautions that when you hear that the “science is settled” regarding any topic of medicine, what you are really hearing is that no dissent will be tolerated. Intolerance for dissent is not science, it is fundamentalist religion.
Another good clue that you have departed from the rails of science is when you come across a vicious rant accusing some hapless contemporary of worshiping Hitler, sexism, racism, trying to murder his patients, blowing up the world, mutilating puppies, and so on. There are several splendid examples in the book and when you stumble onto something like that it is an excellent opportunity to calibrate your claptrap-o-meter to maximum. There is no science to see here, just an emotional temper tantrum. Really, even a money-grubbing snake wouldn’t try to kill his patients. That would be bad for business.
There are enough good quotes in this book to fill a 20-page review so it is hard for me to narrow it down. Zombie science is defined as “science that is dead, but is artificially kept moving by a continual infusion of funding.” To the casual observer it looks real but zombie science does not pursue truth. Its goals are non-scientific, usually money-oriented, since profit is the first priority of the large corporations that provide most of its funding. Grandiose claims are made, as in the example of a press release for the Heart Protection Study promising that thousands of lives will be saved each year thanks to statin drugs. Dr. Kendrick makes the clarifying observation that no drug has ever saved a single life. None. Zero. Everybody dies. If we are going to be so scientific and so impressed with our own science then we need to be more careful in how we state our conclusions. To be more scientifically accurate, the best you can hope for is to extend life or improve the quality of the life you have left.
The Heart Protection Study only looked at people who already had heart disease, diabetes, or stroke. On average, then, how long did statin drugs extend life in this study (assuming it was honestly done; more on that below)? Drum roll, please. The answer is … three months. After great expense and inconvenience you can expect to gain about ninety days of side-effect-riddled existence. And so the Medical Mediocrity Medal goes to the statin producers.
The title of chapter four sums up much of what the book focuses on: “Things that are not true are often held to be true.” I like Mark Twain’s way of putting it, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” Kendrick points out the pitfalls of trusting the experts and even recommends not blindly trusting what he himself says.
What is going on in medical science today, and who would be the best source for straight answers? One group who would certainly know would be the editors of major medical journals, but can you trust them? Would they jeopardize their paychecks to spill the beans? Probably not, but perhaps after they retire? That is where you find some very interesting quotes that carry real weight. These are the quotes I find most interesting; they rip the lid off the dark underside of Western medicine:
Dr. Marcia Angell said, “It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.”
Drummond Rennie, deputy editor of the Journal of the American Medical Association: “There seems to be no study too fragmented, no hypothesis too trivial, no literature citation too biased or too egotistical, no design too warped, no methodology too bungled, … (many more too something or other) … and no syntax too offensive for a paper to end up in print.”
Richard Horton, editor of the Lancet, says, “The mistake, of course, is to have thought that peer review was any more than a crude means of discovering the acceptability—not the validity—of a new finding. … We know that the system of peer review is biased, unjust, unaccountable, incomplete, easily fixed, often insulting, usually ignorant, occasionally foolish and frequently wrong.”
Richard Smith edited the British Medical Journal (BMJ) for many years and he writes, “Twenty years ago this week the statistician Doug Altman published an editorial in the BMJ arguing that much medical research was of poor quality and misleading. … Altman’s conclusion was: We need less research, better research, and research done for the right reasons. Abandoning using the number of publications as a measure of ability would be a start.”
How many times have I said that the truth is not up to a vote? I’m all aquiver with feelings of affirmation and validation. Maybe getting your medical opinions from the CONsensus bureau is not such a great idea after all. Richard Smith goes on to state that this twenty-year-old editorial by Doug Altman could be published today nearly unchanged. Smith concedes that quality assurance methods have been put in place to correct these problems but, in a nutshell, they do not work and are useless. My thumb, on the other hand, does work. Actually both thumbs work, not just the one on the other hand, and they are both way UP for this book.
This article appeared in Wise Traditions in Food, Farming and the Healing Arts, the quarterly journal of the Weston A. Price Foundation, Winter 2015