Vaccination is one of the great triumphs of science. According to the World Health Organization, 2 to 3 million deaths a year are averted thanks to immunization programs for measles, whooping cough, tetanus, and diphtheria. Smallpox, which once afflicted tens of millions across the globe, is a thing of the past.
But vaccination is also a triumph of democracy. Both the widespread adoption of childhood immunization programs and the regulations that ensure the safety and efficacy of vaccines are products of democratic processes operating at their best: public dialogue, careful consideration of the facts, and decisive action to implement evidence-based solutions.
These victories have not come easily, however. The history of vaccination in the United States, from colonial Boston to today's headlines, reminds us of how easily misinformation and policy failures can lead to preventable tragedies.
Life, liberty, and the pursuit of immunity
Decades before Edward Jenner discovered the technique that paved the way for modern vaccination, Cotton Mather learned about the practice of inoculation from Onesimus, a slave in his household. Inoculation uses controlled exposure to live smallpox virus to produce a mild infection that confers lifelong immunity. The colonists of Boston were slow to embrace inoculation, which is riskier than vaccination, even after a smallpox outbreak in 1721 produced convincing evidence in its favor.
Benjamin Franklin, a young resident of Boston in 1721, was aware of the success of inoculation, but his older brother James, who owned a Boston newspaper, was opposed and published editorials against Mather. Fifteen years later, Franklin let worries about possible complications hold him back from inoculating his four-year-old son. Tragically, the child contracted smallpox and died, and Franklin's regret turned him into a dedicated advocate of inoculation. His support was mirrored by others among the Founding Fathers, including George Washington, John Adams, and Thomas Jefferson.
The growth of vaccine policy
As vaccination became more widespread, government began to address it. In 1813, almost a century after Cotton Mather first recommended inoculation to the citizens of Boston, James Madison signed An Act to Encourage Vaccination, which created a National Vaccine Agency and provided for free shipping of vaccine materials through the U.S. postal service. In 1855, Massachusetts passed the nation's first law requiring children to be vaccinated before attending public school.
As such policies grew more common, the need for regulation of vaccines became apparent. Early vaccines were not governed by the kinds of safeguards—such as clinical trials and oversight of manufacturing practices—that we take for granted. As a result, contamination of vaccines was common, leading to tragic incidents that gave credibility to opposition figures such as Lora C. Little, who became an anti-vaccine crusader when she mistakenly attributed her son’s death to his smallpox vaccination.
The U.S. government addressed this situation in the 20th century with a series of laws setting standards and creating regulatory bodies to ensure vaccine safety and efficacy, including the following:
- The Biologics Control Act of 1902, which provided for facilities inspection, product licensure, scientist supervision of production, and labeling.
- The Food, Drug, and Cosmetic Act of 1938, which mandated pre-market approval of drugs and tightened prohibitions against false therapeutic claims.
- The Public Health Service Act of 1944, which created bodies to regulate vaccines and oversee vaccination policy, and set rules for vaccine research and production.
- The National Childhood Vaccine Injury Act of 1986, which required risk/benefit disclosure to vaccine recipients and reporting of adverse reactions.
- The Food and Drug Administration Amendments Act of 2007, which enhanced the FDA's ability to track adverse reactions and required pediatric assessment for all new drugs, including vaccines.
New vaccines, old misconceptions
The combination of scientific advances and careful regulation has made vaccination a resounding success, turning diseases that once killed thousands of children each year into historical footnotes. Yet misinformation about vaccine science persists, fueled by celebrity "anti-vaxxers" such as TV personality Jenny McCarthy. As a result, an increasing number of parents are withholding vaccination from their children.
The latest example of this trend involves the vaccine for human papillomavirus (HPV). HPV is a common sexually transmitted infection that most adults contract and fight off without ever knowing it. But a few strains of the virus have been associated with increased risk of some cancers, particularly cervical cancer.
Vaccination against HPV is now available, and clinical trials and ongoing monitoring have shown it to be highly safe and nearly 100 percent effective. Sadly, HPV vaccination rates in the United States have so far been very low: in 2013, only 37 percent of girls and 14 percent of boys aged 13-17 had received the full set of doses required to prevent infection. For every year vaccination rates remain this low, another 4,400 women will develop cervical cancer at some point in their lives.
These tragedies are preventable, and a growing movement of engaged citizens is working to prevent them. For example, a Pittsburgh, PA group associated with the international activist movement Grandmother Power recently began organizing a grassroots campaign to increase HPV vaccination rates. The grandmothers are combining scientific knowledge with their own life experience and community stature to mediate between vaccine-hesitant parents and health care providers.
Efforts like Grandmother Power's can reduce the "information contamination" that is undermining our fight against infectious disease—much as contamination of vaccines once did—and help ensure that commonsense vaccination policies protect today's young people as well as those in generations to come.