When I lived in Ann Arbor, my children attended a public school where upwards of 15 percent of kids were not vaccinated for mumps because their left-wing parents didn’t trust the vaccine industry. Meanwhile, on the right end of the political spectrum, Tea Party heartthrob Michele Bachmann famously accused vaccines of causing autism. How is it that such a technologically advanced country harbors so many vaccine luddites?
A quick glance at the U.S. smallpox epidemic of 1900 offers a clue.
At the turn of the 20th century, the U.S. had managed to avoid a major smallpox epidemic for the better part of a generation. Then a small wave of illness washed over communities of black farmers and laborers in a few southeastern states. The white community wasn’t alarmed, however, believing the disease, which some called “nigger itch,” would stay contained to that population, who, they were convinced, had brought it upon themselves through one or another vice. As one local newspaper put it at the time: “Up to the present, no white people have been attacked and there is positively no occasion for alarm.”
Then, of course, the disease began spreading to white people. The smallpox virus, it turns out, was colorblind. Yet although white people did become alarmed at this point, they didn’t turn out in droves to get vaccines. Instead, a vocal minority argued vehemently that the vaccine was of no benefit.
It should have been obvious to even casual observers that the smallpox vaccine was a lifesaver. During the Franco-Prussian War of 1870-1871, you see, a smallpox epidemic had swept through Europe, killing millions of citizens. The French army, which had half-heartedly vaccinated some of its troops, fared better than the population, but still saw more than 23,000 troops fall victim to this terrible scourge. Meanwhile, on the other side of the battle lines, the Prussian army, almost all of whom had been vaccinated, remained strong. Out of more than 800,000 troops, only 457 died from smallpox.
Good policies often depend on good evidence. In health care, our gold standard for good evidence is the randomized controlled trial, in which, for example, half the patients receive a new drug and half receive a placebo. When the drug and placebo patients are determined at random, we can be pretty confident that any subsequent differences between the groups—like a higher mortality rate in the placebo group—occur because one group got the drug and the other didn’t.
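To see why randomization earns that confidence, here is a toy simulation (a sketch in Python, with entirely made-up numbers): even when people differ in some hidden trait that affects their risk, a coin flip splits that trait roughly evenly between the two arms, so the arms start out comparable.

```python
import random

random.seed(0)

# Hypothetical population: each person carries an unmeasured trait
# (say, better hygiene) that would lower infection risk on its own.
population = [{"good_hygiene": random.random() < 0.5} for _ in range(10_000)]

# Random assignment: a fair coin flip decides drug vs. placebo,
# ignoring the hidden trait entirely.
for person in population:
    person["group"] = "drug" if random.random() < 0.5 else "placebo"

def hygiene_rate(group):
    """Share of people in a given arm who have the hidden trait."""
    members = [p for p in population if p["group"] == group]
    return sum(p["good_hygiene"] for p in members) / len(members)

# Because assignment ignored the trait, both arms end up with nearly
# the same share of good-hygiene people, so any later difference in
# outcomes can be credited to the drug rather than to hygiene.
print(round(hygiene_rate("drug"), 2), round(hygiene_rate("placebo"), 2))
```

The point of the coin flip is not that it makes the hidden trait go away; it makes the trait equally common in both groups, which is all a fair comparison needs.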
But sometimes, non-experimental evidence is so striking that conducting a randomized trial—withholding the new intervention from half of an experimental population—feels immoral. That’s one reason there has never been a randomized trial of the smallpox vaccine. Indeed, it is why many early medical advances became the standard of care without anyone seeing the need for a placebo-controlled experiment.
Yet a mere 30 years after the end of the Franco-Prussian War, when the smallpox epidemic swept through the U.S., a whole host of intelligent people refused to be vaccinated, convinced that the vaccine did more harm than good.
How could they hold this belief? For starters, the U.S. had much stronger libertarian leanings than countries like France and Germany. But another fascinating phenomenon also contributed to people’s anti-vaccine views: people didn’t believe the evidence. They remained unconvinced because of what those of us in the medical research world would call concern about “confounding.”
A confound occurs in research when two groups differ not only in the intervention of interest, but also in some other, possibly unmeasured, way. This makes it hard to tell whether the difference between the groups is caused by the intervention in question—the vaccine in this case—or by that other factor.
Vaccine skeptics of the time pointed out that those communities which had aggressively vaccinated people also made other public health changes that might explain the relative health of their populations. For example, before the smallpox vaccine came into widespread use, many public health experts pushed for people to be inoculated with the actual smallpox virus. The vaccine, you might remember, is derived from cowpox. Inoculations, on the other hand, were derived from the actual smallpox virus. With inoculation, doctors purposely infected people with very tiny amounts of the smallpox virus, hoping that recipients would experience a mild form of the disease and thereby be protected from a more severe illness. Inoculation was much riskier than vaccination. Some people got terribly ill after their inoculations. Others managed well enough but still unwittingly spread the disease to others, who didn’t fare so well.
When the turn-of-the-century smallpox epidemic hit the U.S., those communities which aggressively vaccinated their populations also stopped all their inoculations. The anti-vaccine crowd seized on this confound and claimed that it was the lack of inoculation that benefited these communities, not the presence of the vaccine. Anti-vaccinators also pointed out that public health departments in these communities were more aggressive about isolating patients from healthy people, and were even more thorough in enforcing hygiene laws, yet another confound that gave skeptics the wiggle room to reject the benefits of the vaccine.
Absent a randomized trial, in which the only difference between the two groups is the presence or absence of the vaccine, naysayers could attribute differences in the health of vaccinated and unvaccinated populations to other differences between those populations.
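The skeptics’ logic can be made concrete with another toy simulation, again with invented numbers: if vaccination and the end of inoculation always move together across communities, a simple comparison of infection rates cannot say how much credit each one deserves.

```python
import random

random.seed(1)

def infection_risk(vaccinated, inoculation_ongoing):
    # Illustrative, made-up risks: assume the vaccine cuts risk sharply,
    # and ongoing live-virus inoculation adds some extra spread.
    risk = 0.30
    if vaccinated:
        risk *= 0.1
    if inoculation_ongoing:
        risk += 0.10
    return risk

def simulate(n, vaccinated, inoculation_ongoing):
    """Fraction of n simulated residents who become infected."""
    r = infection_risk(vaccinated, inoculation_ongoing)
    return sum(random.random() < r for _ in range(n)) / n

# Community A vaccinated aggressively AND stopped inoculating.
# Community B did neither. The two factors are perfectly entangled.
a = simulate(10_000, vaccinated=True, inoculation_ongoing=False)
b = simulate(10_000, vaccinated=False, inoculation_ongoing=True)

# A looks far healthier than B, but because both policies changed at
# once, this comparison alone cannot separate the vaccine's effect
# from the effect of ending inoculation.
print(round(a, 3), round(b, 3))
```

In this sketch the vaccine really does most of the work, but nothing in the observed gap between the two communities proves that; an observer could just as easily credit the end of inoculation, which is exactly the argument the anti-vaccinators made.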
When people want to believe something, even the strongest evidence that their beliefs are misguided often fails to alter their worldviews. But when that evidence is not even the strongest kind of evidence, when there is no randomized trial and plenty of confounds, we cannot expect people to change their minds.
This blog post was previously published on Forbes.