Economics of over-medicalization: implications for Canada?

I recently watched a 2012 TED talk by journalist-physician Ivan Oransky.  It’s an entertaining talk that makes a few important points about modern medicine–specifically, that medical research is not very good at predicting health outcomes.  He makes his point by drawing a distinction between disease risk factors and diseases. Sub-clinical atherosclerosis (hardening of the arteries without symptoms), for example, is a risk factor for adverse health events–like strokes. Even though it may be associated with an increased risk of stroke and heart attack, sub-clinical atherosclerosis does not perfectly predict future illness or adverse health events, and is not an illness itself.  Nevertheless, many people are still diagnosed and treated as sufferers of pre-clinical diseases.

Ivan Oransky at TEDMED, 2012

When people are treated unnecessarily, it follows that they are often better off un-diagnosed–mostly because of the financial costs and medical risks associated with treatment.  So why does over-medicalization happen?  Oransky argues that practitioners (and some health advocacy groups) are incentivized to over-diagnose and over-treat because of the direct and indirect financial benefits of diagnosis and treatment.  A physician paid on a fee-for-service basis has an (at least potential) incentive to diagnose and treat, since the diagnosis will lead to more income.  Even if most doctors are ethical and do not wilfully over-diagnose, there are other structures that pressure them to err on the side of over-diagnosis–such as the legal ramifications of failing to detect an illness, and the marketing pressure of drug companies.  The result is a more expensive health care system with little to no net benefit to health.

What does this mean for Canadian health care?

Oransky is an American, but his comments have interesting implications for Canadian health care as well.  In Canada there is often more demand for health care than the current system can meet, resulting in excess demand.  This excess demand causes longer wait times for health services, and seems to be at the top of the list of things that make Canadians unhappy with their health care system.

However, this excess demand also affects the economic incentives of practitioners.  For example, long surgical wait times mean that surgeons are already as busy as they can be–there is little benefit to adding unnecessary surgeries to the existing waiting lists. Excess demand means that there is an ample supply of work (and income), and that physicians can be more neutral in their assessments of patient health, since they don’t profit directly from over-medicalization.  In a free market for medical services, this excess demand would not exist, because prices would rise to a new equilibrium.  However, in Canada, the fees paid to physicians are set by the provinces, so the excess demand results instead in wait times and other health care delivery problems.

However, if Canada were to see a decrease in excess demand (for example, through an increase in the number of physicians available, or changes in demography), we should expect to see an increase in the cost of health care not just overall (because there are more physicians billing the system), but even on a per-patient basis.  Once the demand for treatment and the supply of treatment converge on each other, we could expect the over-medicalization incentives to kick in the same way they have in the U.S.  This would make the system more expensive, and could offset some of the health benefits of reduced wait times.

This is one of the curious (but not accidental) characteristics of Canadian health care: excess demand probably saves money for two reasons.  First, it reduces the frequency of services, and second, it reduces the incentives for over-medicalization.  While the former probably has a net negative effect on health, the latter is probably good for health, since it means there is less unnecessary treatment.

Risk attributable to drunk drivers

Drunk driving is dangerous, but…

Drunk driving and distracted driving are dangerous, and particularly dangerous for young and inexperienced drivers.

However, the reality of road safety today is that most of the public risk of driving we incur comes from sober driving, not drunk driving.  Driving drunk is more dangerous than driving sober, but since most people are not driving drunk these days, the fatalities that do occur are more often caused by sober drivers.

Here is a table from an older study on the subject:

Source: Evans, L. (1990). The fraction of traffic fatalities attributable to alcohol. Accident Analysis & Prevention, 22(6), 587–602.

What this table shows is that drunk driving is responsible for the majority of single-vehicle fatalities–most often the death of the drunk driver.  However, the majority of multi-vehicle collisions involving a fatality are caused by sober drivers.

Driving drunk is dangerous (this can’t be said often enough), however, even as far back as 1990 (when drunk driving was more common) it did not account for the majority of fatalities on the road, particularly when multiple vehicles are involved.  The majority of fatalities were caused by sober drivers.

What this means

Advocacy against drunk driving is important and makes our roads safer, but should not distract from a fundamental reality: driving is dangerous.  Our decisions to drive, and to build cities focused on motorized transport, have an impact beyond the decisions of individual drivers, and we are all participants in this system.

We need to remember that the risks of driving are systemic, and part of our collective decision to live in a motor-vehicle centred world.  A drunk driver is legally culpable for the consequences of driving drunk, but we are all at least somewhat morally culpable for this system of transportation that causes death.  If we are uncomfortable with traffic fatalities, we need to rethink our transportation system as a whole.  Perhaps autonomous vehicles can help?  Or a rethink of private motor vehicles altogether?  But what is clear is that we can’t pretend that all the consequences are entirely the result of poor decisions made by a small subset of individuals.  We all have a share in the blame.

No simple answers in health science

One of the critiques of modern medicine is that medical research is not sufficiently focused on developing cures for disease, and instead puts too much focus on long-term symptom management.  The reasoning behind this critique, most often applied to the pharmaceutical industry, is that cures are wilfully hidden because it is more profitable to manage symptoms long term with expensive drugs than to cure disease with a single pill or treatment protocol.  There might be a small hint of truth in these accusations, but they distract from a less conspiratorial and more consequential reality: the golden age of modern health care, medicine, medical technology, and medical research is probably behind us.

During the last epidemiological transition, the world benefited from tremendous successes of medicine and medical research.  Understanding germ theory led to immunization and water treatment, which probably account for the majority of the increase in life expectancy and quality of life in Western Europe and North America over the last century.  Surgical techniques have advanced by leaps and bounds over the last 100 years—with improvements in hygiene alone probably saving millions of lives.  Treatments for cancer, diabetes and other diseases have increased the length and quality of many lives.  The scientific grounding of medical research also helped eliminate a host of ineffective and harmful ‘cures’ of the pre-science era.

The biggest achievements over this period were gained through the understanding, treatment and prevention of diseases caused by pathogenic micro-organisms–like viruses and bacteria. By the late 20th century, many of the most serious infectious conditions were under either partial or full control.  Some infectious diseases still have a significant impact on global public health (e.g., AIDS, malaria and schistosomiasis), but even many of these are preventable or treatable based on the knowledge that germ theory has granted us.  Indeed, their continued burden on global health is mostly a reflection of the vulnerability of populations living in poverty, failed political and social institutions, unemployment and a lack of education–not a lack of medical understanding.

Causal simplicity

What explains our past success over infectious diseases?  It probably boils down to their causal simplicity.  Infectious diseases have at least one necessary cause—a disease-causing micro-organism.  When we discovered ways to deal with pathogens—through immunization, the modification of our environments and the modification of our behaviour—we could target the one necessary cause of infectious diseases.  We reduced our exposure to pathogens in the environment by treating water and sewage.  We enhanced our natural defences against infection through immunization.  We changed our contact with pathogens by altering our behaviour.  Importantly, we targeted these strategies directly at the most immediate and proximate cause of disease–the pathogen–and it is because the pathogen is a known necessary cause that we were so successful.

On the other hand, the causes of heart disease, cancer, Alzheimer’s and many other major modern causes of disease and mortality are multifaceted—part of a complex epidemiological web.  For these conditions, causality has been harder to pin down to one or even a handful of necessary risk factors.  Many of the main causes of illness and death today are explained by a mixture of genetic, behavioural and environmental factors.  Without a single evident necessary cause, it is very difficult to develop simple and effective cures for most of these diseases.

Diminishing Returns

The evidence of this change can be seen in the life expectancy curves of wealthy countries.  In Canada, we are still making gains in life expectancy, but the returns are diminishing, and may very well be heading towards a natural limit.  Based on the figure below, we can see that people who live to be 90 have about 5 years of life remaining, on average, and it’s been that way for the last 100 years.  It seems that much of medical science involves helping more and more people live a meaningful life to this limit, rather than increasing our life spans, as was accomplished by past medical innovations.

The investment in increasing the quality of life, and in distributing long life to more people, is admirable and worthwhile.  However, we have a cultural memory of finding ‘cures’ for disease that originates from a time when medical researchers had the relatively easy task of finding and killing the bugs that made us sick.  Now that the causes of disease are more complex, there are probably few simple cures left to be found–rather, an assortment of partly effective treatments and interventions that hopefully improve lives.  This is not a flaw of medical research, but the reality of our fragile existence on this planet.  Nevertheless, the apparent disconnect between our cultural memory of medical breakthroughs and the long list of uncured diseases today might be the inspiration for both ‘alternative’ medicine and conspiracies about the pharmaceutical industry hiding cures.

Changing expectations

This problem of diminishing returns is more than just a pessimistic prediction about future health innovation; it has important implications for what we should expect from health care.  For one, we shouldn’t expect as much from health care practitioners or researchers as we do.  What we know now will, very likely, remain more or less the foundational knowledge of future health care and medicine.  No new research is likely to cure all cancer, heart disease or the other plagues of modern life in the way we cured the infectious diseases of the past.

Second, while there will still be new research achievements in the future, the benefits will continue to get smaller and smaller over time, particularly with respect to how long we live.  Increasingly, medical dollars should probably be spent on improving the health of the worst-off, rather than pushing up against the ceiling of life expectancy.  We continue to live in a world where large numbers of people suffer and die from treatable diseases and malnutrition.  The massive investment in medical research and care in the wealthiest third of countries has a very small impact on health when compared to the potential impact of investing that same money in the health of people living in the poorest third of the world.

My conclusion

Modern medicine is a victim of its own success; past achievements in medical research have addressed the easy-to-cure diseases, and the remaining diseases are harder to prevent and cure.  The slowing progress of medical research is not attributable to medical capitalism, but to the complexity of non-infectious diseases.  We need to accept that the gains of future medical research are likely to be small, and that our resources may be better spent elsewhere.

There are some areas of medical research that may be fruitful in the coming years.  I suspect there is still considerable room to improve the efficacy of cancer treatment, for example.  However, if the health of humanity is what matters, we need to invest where it will actually have an impact–allocating our health care dollars based on return on investment, rather than on the often unrealistic expectation of finding cures to all that ails us.

Using Ngram to measure trends in spelling mistakes

Ngram is a database of word frequencies in published books.  It is a convenient and fascinating resource for all sorts of things interesting to linguists, English professors and other word lovers.  You can search the database (or even download it) to see patterns of word frequency in books published as far back as the 16th century.  Here is a reference to a scholarly article on the subject.

To show what Ngram can do, here is a trivial example.  When I was a kid, my friends and I often debated the correct colloquialism for underwear: ginch, gotch or gitch.  Using Ngram, I can see the frequency of usage in books published in English, and resolve the debate once and for all.  Gotch wins!

[Ngram chart: frequency of gotch, ginch and gitch]

(Slightly) more seriously, I used Ngram to look at changes in the frequency of misspelled words in published books between 1800 and 2000.  In particular, I was interested to see whether spelling improved between 1980 and 2000 — a period that covers the introduction of personal computing and computerized spell-checkers in word processing software. You can view the data I compiled from Ngram here.

I focused on three common misspellings: occassionaly, recieve and beleive.

Using the Ngram data, I calculated the ratio of the fractional use of the misspelled word to the fractional use of the correctly spelled word.  This ratio is an attempt to control for the secular changes in word use.  For example, maybe the word “believe” was more commonly used in books in the past than it is today.
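The calculation itself is simple division, year by year.  Here is a minimal sketch in Python, using made-up frequency values (the real numbers come from the Ngram dataset, so the figures below are purely illustrative):

```python
# Ratio of misspelled to correctly spelled word use, as described above.
# Frequencies are fractions of all words published in a given year.

def misspelling_ratio(misspelled_freq, correct_freq):
    """Control for overall word popularity by dividing the misspelled
    word's fractional frequency by the correct spelling's."""
    return misspelled_freq / correct_freq

# Hypothetical fractional frequencies for "beleive" vs. "believe"
freq_1975 = {"beleive": 2.0e-8, "believe": 1.0e-4}
freq_2000 = {"beleive": 0.5e-8, "believe": 0.8e-4}

r1975 = misspelling_ratio(freq_1975["beleive"], freq_1975["believe"])
r2000 = misspelling_ratio(freq_2000["beleive"], freq_2000["believe"])

print(f"1975 ratio: {r1975:.2e}")
print(f"2000 ratio: {r2000:.2e}")
```

A falling ratio over time means the misspelling is becoming rarer relative to the word itself, even if the word as a whole is used more or less often than it used to be.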

I then graphed out the result:

[Chart: ratio of misspelled to correctly spelled word use, 1800–2000]

We can see that the publication of these misspelled words trends upwards until about 1980, after which there is a rapid decline.  Here is a close-up of the last few decades of data:


Over this period of time, misspellings of all three of these words declined in a way consistent with the (utterly unsurprising) hypothesis that computerized spellcheckers improve spelling.  However, it is worth noting (and somewhat surprising) that the misspelled variants have not entirely disappeared from published books, nor have they reached the relatively low spelling error rates seen in the early 19th century.