Terence Corcoran wrote an opinion piece for the Financial Post recently that I take exception to. The subheading of the article is “Junk Science Week: Finding that people who consume certain drinks also develop cancer is not an indicator that the drinks cause the disease”. To summarize, Corcoran argues that much of the research in nutrition epidemiology is flawed, but that do-gooder socialists will still use it to justify regulations of one thing or another.
I think being critical of science is not only healthy, but essential to the scientific process. However, Corcoran is not a scientist, and his critique is incomplete and unhelpful. So, I sent him an email. Here is the text of the email (edited from the original for style and clarity):
I agree with the general need to be skeptical about science, and have myself been accused of being overly skeptical by some of my peers. I think many of California’s regulations on carcinogens are nuts. I think we worry too much about air pollution in Canada, the impacts of wind turbines on sleep, fluoride in the drinking water and the profit motives of big pharma.
However, some journalists have a way of taking this kind of skepticism too far. Your article on alcohol and cancer is a perfect example of this.
It is true that some study designs are weaker than others; experimental study designs have clear strengths, and observational case studies based on self-reported data have clear weaknesses. The challenge is that almost all health research on human exposure to nutritional or environmental harm is forced to employ inferior study designs. This is because we can almost never do proper experimental research on links between the environment/nutrition and human health; it is often unethical and almost always impractical.
Given this, we could choose to dismiss all non-experimental research on humans as junk science. This would include almost all the research on tobacco and cancer, by the way, as well as all the research linking asbestos to mesothelioma. Indeed, if we hold all science to the highest standard of evidence, then we would probably have to conclude that salt has no demonstrated impact on hypertension, exercise may or may not extend life (or even improve the quality of life) and that drinking a two-four every weekend may or may not be harmful to your liver. We’d also have to say that there is no convincing evidence that free markets are good for economic productivity, that killing terrorists reduces the risk of terrorism, and that free expression is good for the human soul.
The fact is that we (as societies and people) still have to make decisions about the possibility that some things in our world may be harmful (or good), in spite of the absence of rigorous and convincing evidence. Almost any research on the health impacts of alcohol will be unsatisfying if we hold it up to the gold standard of experimental research designs. It’s for this reason that epidemiologists come up with heuristics (Hill’s criteria, for example) to make decisions under conditions of empirical uncertainty, weighing things like effect size, replicability, study design, agency and other factors. It’s also why science works best as an iterative process of falsification, rather than truth finding.
While you seem to see research on alcohol and cancer as part of a pernicious socialist plan to regulate the world, I see it as part of the process of untangling a mess of mixed evidence that leads to incrementally better information over time.
It’s true that some regulators use inconclusive science to justify unnecessary regulation. The antidote to bad decisions is to raise the level of science literacy, not just to invoke general skepticism towards certain areas of research.
I understand that nuance is not good for selling papers (or clickbait), but your article does little to explain the scientific process properly, or to help inform people about the challenges of making decisions about food and environmental safety.
Terence did not respond.