The agonizingly slow upward creep of the U.S. COVID vaccination rate, coupled with the emergence of the Omicron variant, has observers speaking in tones of gloom. What is wrong with people who refuse to get the shots? Some point to diseases such as smallpox and polio as evidence of a less-broken time when people trusted authorities and believed more strongly in science. But as historians of medicine, we find the despair about vaccine hesitancy misplaced.
By historical standards, the U.S. COVID-19 vaccination campaign has already been an astonishing success. In the past, fearsome diseases have been brought to heel even in the face of vaccine resistance, and with lower vaccination rates than public health officials had hoped to achieve. Moreover, vaccines alone have rarely curtailed or eliminated infectious diseases. Other measures, such as faster and more-accessible testing, and support for infected individuals who must quarantine, are also essential.
Anti-inoculation activism in this country is older than both vaccination and the country itself. The first inoculation campaigns in America date to the early 18th century, when members of the political and social elite began to promote variolation—the term “vaccination” didn’t exist yet—against smallpox. Although smallpox was a widespread and frightening disease, many resisted variolation, which meant inserting material from a smallpox sufferer’s pustules into a healthy person’s skin. It was risky. The death rate from the procedure ranged from one to five in a hundred, far better than the dismal 25-30% mortality rate among those who contracted smallpox naturally, but still dangerous enough to spur opposition.
Dr. Zabdiel Boylston, an inoculation proponent, was threatened with hanging, and minister-physician Cotton Mather’s house was unsuccessfully firebombed by an irate critic. Many colonies passed laws prohibiting the procedure, fearing smallpox could be spread by those who had not quarantined sufficiently after inoculation. Benjamin Franklin later observed that “the practice of Inoculation always divided the people into parties, some contending warmly for it, and the others against it”—an observation eerily reminiscent of today’s opposition, which is often propelled by political and cultural divides.
By the 19th century, when the inoculation method pioneered by British physician Edward Jenner reached the U.S., one might have expected opposition to subside. After all, Jenner’s method—called “vaccination,” because the inoculating material was from cowpox (vacca being the Latin word for cow) instead of smallpox—was much safer than variolation, and offered even more effective protection. It was not, however, entirely without risk. The lymph used to confer immunity was often transported long distances unrefrigerated, and in the pre-germ theory era, the skin-piercing tools that delivered it were unsterilized. Contamination was common. Then, too, the thought of introducing a substance from a diseased cow into a healthy human body provoked unease. When states began to make vaccination compulsory, punishing noncompliant parents with fines and sometimes jail, anti-compulsory-vaccination movements sprang up in earnest. So effective was this resistance that some states repealed their compulsory vaccination laws in the early 20th century.
And yet, little by little, smallpox disappeared. It was gone from the U.S. by 1949, and from the entire planet by the late 1970s. The vaccine had everything to do with this triumph, but as it turned out, achieving it did not require everyone to be vaccinated. One historian has estimated that smallpox eradication in the U.S. was achieved with only a 40% vaccination rate. Since smallpox had distinctive and highly visible symptoms, it was possible to bring down rates dramatically by “ring-fencing,” which meant vaccinating intensively in the area surrounding an outbreak, even without reaching high rates of inoculation in the population as a whole. COVID-19, unfortunately, is not amenable to this strategy.
What about polio? In the 1950s, we’ve all heard, Americans embraced Jonas Salk’s polio vaccine. When it was approved, church bells rang, Salk became a national hero, and relieved parents lined up around the block to get shots for their kids. Everyone got the vaccine, and the terrible scourge of polio was at last defeated.
There’s truth in this story: many Americans did, indeed, greet the polio vaccine with enthusiasm, and polio cases in the U.S. plummeted after its introduction, halving in the first year it was publicly available, and halving again the next year.
But this simple story obscures significant complications. Prominent voices, including that of Salk’s rival Albert Sabin, publicly questioned the safety of the vaccine. The popular radio host Walter Winchell claimed that the government was preparing thousands of “little white coffins” for the children they anticipated would be killed by it. The tragic Cutter incident, in which tens of thousands of people contracted polio from faulty vaccines manufactured by Cutter Laboratories, only reinforced those fears. About 200 people were paralyzed, and 10 killed, by the polio vaccine in the first few weeks that it became available.
What’s more, other data belie the impression we get from photos of families lining up for polio shots. By 1956, one year after the vaccine was approved, many states were sending their allotted shipments of shots back to the federal government for lack of demand, even though over half of the population under the age of 40 had not yet been vaccinated. The Dallas Morning News happily reported, on the anniversary of the vaccine’s approval, that the polio campaign was proving successful—but the numbers it reported indicated that a mere 2% of the city’s residents under age 20 had received the three shots required to be considered fully vaccinated.
Nevertheless, polio dwindled and disappeared, just as smallpox had done.
These examples suggest that sometimes, dramatically reducing the incidence of a disease requires only adequate, not absolute, compliance with a public-health regimen. In addition, the more insistently a vaccination campaign is pursued, the more doubt it raises in the minds of the hesitant about its true aims. This is especially so where authorities seem otherwise unconcerned about the public’s well-being: where basic medical care is inaccessible and living or working conditions foster ill health.
Viewing the COVID-19 pandemic in this historical perspective, then, we find cause for both humility and optimism. First, the humility: COVID-19 clearly poses a special challenge to humanity. Vaccination does not seem to confer lifetime immunity, and new variants proliferate more quickly than in the case of either smallpox or polio. Maintaining protection against COVID-19 may require repeated inoculations, similar to flu shots.
But let us not lose sight of the optimism. The COVID vaccines have arrived faster, and been even safer, than vaccines in the past. They have also enjoyed less violent resistance and a more enthusiastic uptake. Over 60% of the U.S. population has already been vaccinated, and the threat of the Omicron variant, with its high transmission rate, will likely push this number higher. The only non-compulsory vaccination that has ever come close to this degree of penetration is the flu vaccine, which according to the CDC reached its high-water mark in the 2019-2020 season, topping out at only 51.8% of the population.
Vaccine hesitancy is undeniably an obstacle to our progress against COVID-19. But are we living in a uniquely ignorant or hostile time? Hardly. We have no warrant for complacency, but history does give us cause for hope.