Preparing Minds for Bioterrorism

I often give interviews to the press on various infectious disease topics, and a few months ago, while talking to a journalist, I referenced the anthrax attacks of 2001. The journalist replied, “Oh yeah, the anthrax ‘scare’ back then.” I replied, “It wasn’t a ‘scare’; it was an attack in which 22 people were infected and 5 murdered via spores sent through the US Postal System.”

Fourteen years have now passed since the Amerithrax attacks, and those horrific times have faded from people’s memory. That’s not a good thing, because the threat remains.

With that context in mind, my colleagues and I wrote a clinical review paper with the aim of refreshing clinicians’ minds with new information on these important infectious diseases (anthrax, plague, botulism, tularemia, and smallpox). We were ecstatic when the most prestigious medical journal in the world, The New England Journal of Medicine, accepted it for publication.

The subtext of the entire update is that it is vitally important that clinicians, the front-line defense against these pathogens, be armed with the knowledge necessary to recognize and treat these diseases, as well as to know when to sound the alarm.

As my hero Louis Pasteur famously said, “Chance favors the prepared mind,” and our hope is that our paper will prepare the minds of those crucial to protecting this nation from another bioattack.

 

 

Dr. Sheri Fink Brought the Lessons of Katrina to Pittsburgh

On the night of March 2, 2015, I had the opportunity to hear Dr. Sheri Fink lecture in Pittsburgh, my hometown. To all in the field of catastrophic health event preparedness, hers is a household name: her unmatched effort to understand the crisis in the healthcare facilities of New Orleans after Hurricane Katrina can be thought of as no less than foundational for the entire field, quite an achievement.

What Dr. Fink’s work does is concretize what the term “crisis standards of care” is all about. Indeed, the events depicted in her book Five Days at Memorial (which I discussed in a prior post) illustrate exactly what happens when these standards are not in place: ad hoc decision-making becomes the norm, and a DNR order is translated to mean “do not rescue.”

That we have now developed these standards and that these discussions are not taboo are thanks to Dr. Fink’s diligent work.

What I loved most about her lecture, and found quite inspiring apart from the content, is the fact that this was a physician-turned-journalist exemplifying all the best aspects of medicine. Her inquisitiveness, her passion for her work, and her ability to translate abstract concepts into concretes (to wit, she wrote a piece for inclusion on Chipotle bags about these topics) are all attributes of the best physicians and something to emulate. Her lecture had the all too infrequent attribute of appealing equally to the physicians in the audience and to the general public.

To hear such a renowned voice, who often references Aristotle, discuss topics such as hospital preparedness, crisis standards of care, Ebola, and a battlefield hospital anywhere, let alone in the comfort of my hometown, was a rare treat.

 

Rational Use of Antibiotics: Nearly a Hundred Years' War

When I discuss antibiotic resistance, a point I am sure to make is that these compounds are to be treated like precious commodities. In the modern era, that strikes many people as odd because they have taken countless courses of different antibiotics and have always recovered. There always seems to be a new antibiotic their PCP may try them on if their symptoms are particularly recalcitrant to the first course. Medicine will always develop new antibiotics, they assume.

Last week, I had the pleasure of listening to Harvard University's Dr. Scott Podolsky discuss the topic of antibiotics, not from the usual perspective that is focused on a prophecy of a post-antibiotic era (a valid concern), but from a historical perspective. Such a perspective allows one to place antibiotics and their overuse into a much larger context that is informed by an understanding of how antibiotics rose to such prominence that today prescriptions for them are literally demanded of physicians.

In his talk, part of the C.F. Reynolds Medical History Society's annual lecture series at the University of Pittsburgh, Dr. Podolsky highlighted an important aspect of the pre-antibiotic era that is often overlooked--and may hold the key to combating infectious diseases more rationally in the future: serum therapy. Serum therapy was a means of using products manufactured by the immune system against specific microbes in another individual. It was the opposite of broad-spectrum therapy and was largely supplanted by the much simpler use of broad-spectrum antibiotics. Today, we have antiserum therapies for diseases like botulism and tetanus. But there is also a lot of effort to develop modernized serums, aka monoclonal antibodies, which we saw used experimentally against Ebola in the form of ZMapp and which already exist for anthrax and RSV. Moving back to specific therapies is a sure means not only to limit antibiotic resistance but to avoid the collateral damage to the microbiome, which is probably just as important.

Another aspect of Dr. Podolsky's talk I found fascinating was the prescience of those at the dawn of the antibiotic age, such as the first president of the Infectious Diseases Society of America (IDSA), Dr. Maxwell Finland, who realized the dangers of injudicious use of antibiotics and established early principles of antibiotic stewardship.

One of the themes from Dr. Podolsky's lecture that resonated with me was that the current debate over the rational use of antibiotics reaches back several decades, and understanding the terms of that debate at its inception is an important and overlooked task. Thanks to Dr. Podolsky for doing the intellectual work needed to make these facts easily accessible and able to be integrated into the modern context.

Would Gerbils Obey the Pied Piper?

Rats are considered an unavoidable bane of urban life and have been generally associated with filth, disease, and pestilence. Yet one of the most ominous events they have been linked to--the Black Death and the European plague outbreaks that followed--may not actually have involved them (they would blame it on the fleas anyway).

A new study suggests another rodent might have been to blame: the gerbil (actually the great gerbil from Asia). In this study, Schmid and colleagues looked at climatological data contained in tree rings in Europe and Asia (the traditional home of plague) to determine whether conditions were conducive to rat populations. The study attempts to unravel a few paradoxes regarding plague in Europe: what was the rodent reservoir that allowed plague to persist there (a reservoir that disappeared in the late 19th century, when plague outbreaks tapered off), and did plague seed Europe at the time of the Black Death and then persist in rodent populations there?

What the evidence presented in this paper suggests is that tree ring data in Europe from the time of known plague outbreaks do not support a climate suitable for rat populations to thrive. However, tree ring data from Asia do show a correlation with a climate conducive to gerbil population booms followed by busts, forcing resident fleas to look for alternate hosts (i.e., humans and other animals, including rats). This finding is at odds with the traditional view that attributes the persistence of plague in Europe to an infected rat reservoir.

The authors speculate that repeated introductions of plague from Asia produced each European plague outbreak--a finding that may absolve the rat of its role in perpetuating Europe's plague outbreaks.

This study shouldn't dissuade people from having gerbils as pets as it is not applicable to captive gerbils. Similarly, it shouldn't encourage people to have rats as pets (don't forget about rat bite fever). 

What the study does do, at least for me, is provide another great concretization of how infectious diseases affect and are affected by absolutely everything: animal populations, international trade, seasons, and, in the case of the Black Death, perhaps the structure of Western society.

 

Looking at the Roots of My Immunity

Normal people look up their history on Ancestry.com. I, being the infectious disease obsessed nerd that I am, dig out my immunization book to look at which vaccines I received.

My parents, both doctors, kept pretty good records of my immunizations, which made my task a little easier. Looking at the little blue book that holds the secrets of my early immunologic experiences, I was shocked by how few vaccines were available in 1975, when I was born, versus now. There was no rotavirus, HiB, Prevnar, varicella, meningococcal, hepatitis B, hepatitis A, or HPV vaccine. What does that leave? DTwP (now unfortunately replaced by DTaP), oral polio (now replaced with the inactivated Salk injection form), and measles/mumps/rubella. I also received the cholera vaccine (no longer available in the US) for travel-related reasons.

Looking at the relative paucity of the vaccines I received, I have the opposite reaction to that of many anti-vaccine advocates, who reminisce about an era with fewer vaccines. I, on the other hand, wish I had been more vaccinated, because I had to suffer through rotavirus (I’m pretty sure an ominous notation of my having diarrhea in March of 1977 was the result of this virus) and chickenpox (in 8th grade!), and merely dodged the bullets of invasive infection with type B H. influenzae and pneumococcus.

Another interesting thing I learned is that I received the measles, mumps, and rubella vaccines as individual injections. The MMR vaccine was introduced in 1971, and its uptake may not have been universal for some time. I received only one dose of these vaccines, as the recommendation for a 2nd dose wasn’t made until 1989.

Interestingly, as a medical student in 2000, I was required to have my titers checked against measles and rubella, was found not to have adequate protection against rubella, and received the MMR at that time. Then, 5 years later, the titers to all three components of the vaccine were checked; this time my mumps titers were found to be low, and I got another MMR.

I wonder why my titers against rubella and mumps were low. I speculate that maybe I received the old killed mumps vaccine, which was inferior to the live-attenuated version that replaced it. But that can’t be the whole story, because I didn’t develop adequate mumps titers even after receiving the MMR in 2000. As for rubella, I’m not sure. So in the end, despite the monovalent vaccines I received as a child, I got 2 MMRs, just like the children of today.

As an adult, I treat vaccines like the precious commodities they are. To that end, I have gotten the typhoid, hepatitis A, meningococcal, and hepatitis B vaccines. Even when it comes to the sub-par influenza vaccine, I go out of my way for it—standing in line for the pandemic H1N1 vaccine at a local school and traveling around to find the quadrivalent vaccine in its first year of limited availability.

In the almost 40 years since I went through my routine childhood immunizations, what were once breakthrough vaccines have become routine. Yet so many infectious diseases remain for which no vaccine is commercially available. To name but a few: HIV, hepatitis C, malaria, MERS, SARS, Ebola, and West Nile virus. As science and medicine continue their reason-inspired march, routinizing the cutting-edge will be the norm.