The History of Vaccines Shows Us a Lot About Their Future


Throughout human history we have attempted to defend and heal ourselves from the scourge of infectious disease. Before we even understood what caused pestilences, we were devising ways to avoid them—like the plague.

One such contagion was smallpox, caused by the variola virus. It altered human history not only through its virulence as an epidemic disease, but also through its later eradication. Variola was one of several orthopoxviruses, a group that also includes cowpox and monkeypox; the relationship matters because, although smallpox was the only one to infect humans without an animal vector, immunity to one of these viruses could cross-react with the others. The disease’s imagery—the pustular, scabbed pox covering the skin of an infected person—was harrowing, but the virus was airborne and spread primarily through inhalation. There were two forms of smallpox, one of which caused a much milder course of disease, but it was not that minor form that ravaged civilizations.

While recognizing its symptom profile and mode of transmission was instrumental to human survival, it was understanding immunity to it that gave humans—even as early as the first millennium—the impetus to begin fighting it in earnest. If someone became infected with smallpox and survived, they developed lifelong immunity to the disease. This was recognized at least in part because smallpox survivors, once they began to recover, were often tasked with caring for the newly ill. Others observed that those who had been sick with it once did not become reinfected, even when directly exposed.

In Europe during the 18th century, people began noticing another peculiar thing about “pox” viruses: dairymaids seemed to have an inexplicable immunity to smallpox, even if they’d never had the disease. What they had been infected with was cowpox, another orthopoxvirus. Naturally, people began to wonder if they could acquire the milkmaids’ apparent immunity by willfully exposing themselves to cowpox. Deliberate inoculation against smallpox itself was, in fact, a practice that dated back centuries, if not longer, but it hadn’t yet gone through the wringer of formal scientific inquiry or standardization. As smallpox made its way through not just Europe but the New World, the potential benefits of inoculation through such willful exposure seemed to outweigh the risks, but not everyone was convinced. In the New World in the early 1720s, Rev. Cotton Mather and a physician named Zabdiel Boylston attempted to convince settlers of the importance of smallpox inoculation. Since the exposure was not without risk, and in fact some people did get full-blown smallpox as a result of it, the wariness was understandable. But that wariness also turned into something of a panic over inoculation—to the point where a bomb was thrown into Mather’s house.

In 1721, a smallpox outbreak in Boston sickened thousands. Some of the individuals had been inoculated. In what was almost certainly the first intentional comparative analysis of its kind, Boylston determined that of the people who contracted smallpox naturally, 14 percent died. Of those who contracted it through inoculation, just 2 percent died.

These statistics didn’t convince everyone, but they helped lay a more solid foundation over the next several decades. They did, at least, for George Washington: in 1776 smallpox took out half his troops, giving the British an advantage. Many British troops had been inoculated, as had many British civilians. Washington, reactively rather than proactively, followed suit and had his army inoculated.

Back in England, a young boy named Edward Jenner had been inoculated, and good thing: he would grow up to develop the world’s first smallpox vaccine. By the 1790s, Jenner had become a country doctor and was an ardent proponent of inoculation, though he wanted to better understand how it worked. Having grown up in farming communities, he too was aware that smallpox didn’t tend to befall dairymaids, likely as a result of their cowpox exposure—but again, the why and the how plagued Jenner.

Armed with the inclination to experiment, Jenner then took what was the first scientific approach to vaccine development, beginning with a clinical trial—albeit a very, very small one. He found a local dairymaid with cowpox, took some of the pus from an active pox, and introduced it into a cut on the arm of one James Phipps, the 8-year-old son of Jenner’s gardener. He allowed a bit of time to pass for the boy’s immune system to become acquainted with the cowpox virus, then—hoping that the boy’s youth and otherwise good health would be on his side—exposed him to smallpox. Much to everyone’s relief (and no doubt Jenner’s delight), Phipps did not become infected.

Although Jenner found the boy’s apparent immunity reassuring, he knew he needed more evidence that his vaccination, as he called it (vacca being the Latin word for cow), actually worked. Over the next several months he attempted to find volunteers in London, but to no avail. Though his recruitment effort was unsuccessful, he left the city with a small supply of his inoculant, just in case. Back home, he gave it to anyone who asked, and this strategy for acquiring case studies proved more fruitful. Soon, demand for the vaccination increased to the point that he would frequently run out.

His goal, of course, was for it to become widely recognized and ideally compulsory. He wrote up a paper that included several additional case studies—including the vaccination of his own son—and published it in 1798. At best, the medical community was uninterested in, if not slightly dubious of, his claims—predominantly because Jenner still couldn’t explain why the method worked. Microscopic technology of the era lacked the power to see viruses; only their effects could be visualized. At worst, he was subjected to the same distrust and ridicule that Mather and Boylston had faced in Boston decades before. He carried on with his work, and ever so slowly, vaccination gained traction. It would be 30 years after Jenner’s death before vaccination became a requirement in England.

Understanding the mechanism of disease is fundamental to the development of a vaccine. While Jenner didn’t fully understand smallpox’s virology, he still made an invaluable scientific contribution: what we refer to today as a live-attenuated vaccine.

Photo by Dmitry Rogulin/TASS via Getty Images

Most of the vaccines that we’re familiar with are one of two types: live-attenuated or inactivated. Live-attenuated vaccines are sometimes referred to as containing “live virus,” which has caused a great deal of misunderstanding in the general population. The terminology is not exactly a misnomer, but it’s not quite literal either: live-attenuated vaccines are often developed against viruses—which, by nature, can’t be killed because they are not, in fact, alive. Viruses can, however, be weakened to the point where they are essentially unable to cause disease. A live-attenuated vaccine is developed by taking a “live” virus that’s fully capable of replication and putting it through its paces in order to weaken it. This is achieved by passing the virus through a series of cell cultures, test tube after test tube, perhaps hundreds of times, while periodically checking to be sure the virus hasn’t mutated in a way that preserves its ability to cause disease. Being vaccinated with a virus in its weakened state allows the body to recognize the virus in its true form and promotes a strong immune response, much as would occur in an active infection.

That being said, vaccines can still be created from inactivated—loosely, dead—viruses, too. This option is generally regarded as a safer bet for people with compromised immune systems, who might struggle to fend off even a weakened version of a virus. Inactivated vaccines are made by exposing disease-causing microorganisms to agents like formaldehyde, or conditions like high temperatures, that render them incapable of causing infection—what you might call dead. While these vaccines can’t cause infection, they also don’t initiate the kind of immune response that live-attenuated vaccines do, so the immunity they provide doesn’t last as long, and the vaccine may need to be repeated (think “booster” shots). There are also subunit vaccines, which contain only certain antigenic components of a pathogen rather than the whole organism. Using only part of a pathogen is not a guaranteed way to create immunity, because it can be difficult to ascertain which of those antigens the body will actually mount a response to.

Then there are vaccines against toxins, which may be used for prevention or treatment, such as the vaccines for tetanus, pertussis and diphtheria, which are often given together but in different relative doses. In children under age 7, the combination is given as DTaP, where the capital letters indicate full-strength doses of those components. Later doses are given as Tdap, which contains reduced amounts of the diphtheria and pertussis components and is colloquially referred to as a “tetanus booster”; a child usually gets one around age 11, and adults might get one if they cut themselves on a rusty tool.

Developing vaccines—and getting them approved for use—is a process that takes years even without controversy. A contemporary example would be the baseless link between autism and the vaccine for measles, mumps and rubella (MMR) proposed by Andrew Wakefield. Although the link he claimed—along with his work and whatever reputation he may have had—has been discredited by the scientific community, the misinformation has become a stubborn part of the public consciousness. It has also contributed to a generation of children who are under-immunized, if not unimmunized altogether. The implications extend far beyond individual children, of course, as evidenced by measles outbreaks over the last decade or so. The CDC declared measles eliminated in the U.S. in 2000. In 2014, there were a record number of measles cases in the U.S.: 667, the majority of them in unvaccinated individuals, many of them children.

Measles is hardly the only infectious disease controlled through vaccines: smallpox was effectively controlled by subsequent, refined iterations of Jenner’s vaccine, and the global vaccination campaign was in fact so successful that the WHO declared the disease eradicated in 1980. Rabies, a disease which is nearly always fatal in humans once symptoms appear, had long been deemed too risky to inoculate against. Louis Pasteur and Émile Roux (the former of whom is best remembered for developing pasteurization) developed a rabies vaccine in 1885.

It was at first inconsistently effective and carried many risks, and because Pasteur kept others from reviewing his research, he was able to tout a narrative of success that the data did not support. It was only after his death that scholars realized Pasteur’s initial success with his rabies vaccine was perhaps more of a lucky break than tried-and-tested science. In fact, as Pasteur was not a licensed physician, had he not been so lucky he could have been prosecuted. But it was his vaccine that became the basis for the chemically inactivated version used today not just to prevent rabies, but to treat exposures.

Despite the history of controversy surrounding them, the fact remains that vaccines work. Unlike many of those who pioneered their development, we now have a much better understanding of how and why they do. Of course, as we have evolved in our fight, so too have the organisms. We would expect this, just as we would expect new infectious diseases to present themselves.

The future of vaccines will require us to assimilate new technology and our growing knowledge base to continue to innovate. And we already are: some researchers have started using viruses and bacteria as vectors to deliver immunogenic proteins from pathogens like HIV, which simply cannot be attenuated enough to be safe; these are known as live-recombinant vaccines. Beyond the cleverness of that Trojan horse strategy, DNA vaccines, in which genetic material encoding an antigen is delivered so that the body produces the antigen itself, are in development, too.

If resources for scientific inquiry become scarce in the years ahead, the prospect that researchers will need to turn their focus to a resurgence of diseases once well controlled through vaccination, a resurgence driven by socio-political machinations, is not just disappointing but terrifying, and for many, downright enraging. There are greater present threats, no doubt, many of which have yet to reveal themselves fully to us. Like Jenner, we may be working partially in the dark, witnessing the effects of something we don’t yet have the means to illuminate. But we have shed light upon many such murrains in the past. It should be our task to continue to light the path as we forge ahead. If we forsake history, we may find ourselves constantly turned around, striking a match to relight that which we should never have permitted to burn out.


Abby Norman is a writer based in New England. She’s currently working on a memoir for Nation Books and is the weekend science editor at Futurism. Her work has been featured in The Rumpus, Atlas Obscura, The Establishment, Cosmopolitan, Seventeen, Medium, The Independent, and others. She’s represented by Tisse Takagi in New York City.
