Trending Topics

Stopping the bias snowball, before it kills your patient

The EMS crew was about to administer epinephrine. Only they hadn't noticed what I had: the patient's heart rate was 268

I arrived on scene perhaps ten minutes behind the first-in crew and the shift supervisor. Both the crew paramedic and the supervisor were knowledgeable, experienced providers. A school nurse was there as well, providing background information on our patient: a 16-year-old student with a history of asthma, now in severe distress.

Our patient was in bad shape. She had severe difficulty breathing, tachycardia, was pale and diaphoretic, and you could hear the wheezes and rhonchi from 10 feet away. Her eyes had that unfocused, half-lidded stare that signals impending loss of consciousness and respiratory arrest.

No doubt about it, she was sick and getting worse. She hadn't responded to multiple doses of her Combivent inhaler, and the lead medic – one of our flight paramedics who was covering a shift on a ground unit that day – already had a DuoNeb in place on a non-rebreather mask he'd converted to a nebulizer mask. Despite that, the pulse oximeter only picked up a waveform intermittently, and when it did, her oxygen saturation was only in the 80s. The EMS crew was about to do what you'd expect anyone to do in this case – administer IM epinephrine.

Only they hadn't noticed what I had: the patient's heart rate was 268.

You’d expect a severely hypoxic patient to be tachycardic, sure, but compensatory tachycardias don’t get that high. Even supraventricular tachycardia doesn’t usually get that high in an adult. One of the few mechanisms that can cause a tachycardia higher than 250 in an adult is one involving an abnormal accessory pathway, like a Kent bundle.

I pointed out the heart rate to the supervisor and the lead medic, urged them to pause and assess for just a moment, and asked the school nurse for the girl’s medical file.

Sure enough, there it was: Wolff-Parkinson-White Syndrome.

One synchronized cardioversion, and about 5 minutes later, the girl looked like a different patient. She was perfusing well, had a heart rate of 92, and an oxygen saturation of 100%.

WPW was right there in the girl’s history, but as she took no meds for it and her initial complaint to the school nurse was difficulty breathing, the nurse got a bad case of tunnel vision and only considered status asthmaticus. Moreover, she passed that incorrect diagnosis on to the responding EMS crew, and they based their assessment and treatment on faulty information.

Now, I don't know what would have happened if we had given this young lady epinephrine, but cardiac arrest is high on the list of possibilities.

Bias cascade

What happened here was a bias cascade that nearly killed a patient. First, the nurse succumbed to confirmation bias, a form of tunnel vision. She knew asthma was in the patient's history, heard "I can't breathe," and stopped assessing at that point. From then on, the circumstances controlled her, rather than her controlling the circumstances.

Difficulty breathing? Well, let’s see here … yep, she has a history of asthma.

Wheezing? Check.

Palpitations and chest tightness? Check.

Tachycardia? Check.

Albuterol doesn’t help? Must be status asthmaticus. She needs IM epi and maybe magnesium sulfate, and I don’t have that. Better call EMS.

The responding EMS crew fell victim to the bandwagon effect and the bias snowball, where each piece of incorrect information influences the next decision, sometimes leading to disastrous results. What stopped the snowball was a dispassionate observer, not caught up in the scene, who could objectively assess the situation and realize, "One of these things is not like the other."

When we objectively examined the anomalous finding – an excessive tachycardia – the entire faulty logic chain began to disintegrate. The lesson is that no matter how good or experienced you are, you should always play "What if I'm wrong?" with your diagnosis and treatment decisions.

We teach this from the beginning of EMT and paramedic school: form a list of differential diagnoses, obtain more data to narrow that list to the most likely diagnosis, then provide treatment based upon that diagnosis. In a perverse way, experience as a provider works against us here; once you have become a master of pattern recognition, you are at increased risk of ignoring or rationalizing findings that don't fit the pattern.

Most protocols are set up so that the most generic treatments, the ones that may treat several pathologies and have the least potential for harm – for example, supplemental oxygen – are usually standing orders. Treatments with a higher risk/reward ratio, which demand more nuanced thinking, often require two brains to determine the appropriate treatment pathway, like consulting that expensively educated brain on the other end of the medical control line. Those usually fall below the line on your protocols marked in bold type: By physician order only.

The dangers of tunnel vision

I happened to be the objective observer in this case, but I am as susceptible to tunnel vision and confirmation bias as anyone. Once, early in my career, I temporarily killed a patient because I discounted the possibility of encountering a profoundly hypothermic patient in the 100-degree ambient temperatures of a Louisiana July. I gave him two liters of room-temperature IV fluid, because I was certain that cold, unconscious, hypotensive and bradycardic, plus a small puddle of coffee-ground emesis, equaled a patient in decompensating shock from a GI bleed.

I didn’t realize my mistake until a rough transfer from my stretcher to the ED bed put the patient into VF. Luckily, the ED physician managed to resuscitate him with no deficits. That didn’t stop him, however, from stepping out of the resuscitation room every 10 minutes, fixing me with a withering glare, and muttering, “80 friggin’ degrees … ”

It was not my finest moment as a paramedic, but I learned a valuable lesson that a cocky 25-year-old Kelly Grayson desperately needed: you are not infallible.

Avoid bias-related mistakes

There are some strategies you can employ to avoid making such mistakes with your patients. First and foremost, cultivate a relationship of trust and respect with your partner, and do not be afraid to bounce ideas off them, even if you’re a paramedic and they’re an EMT. It’s a cliché to say, “Paramedics save lives, and EMTs save paramedics,” but sayings become clichés because they are often correct.

The tool you both can use to check and recheck your biases is a decision-making model developed by Air Force Colonel John Boyd called the OODA loop. Originally conceived as a decision-making process for fighter pilots engaged in dogfights, the OODA loop focuses on rapid acquisition of information and acting upon it – what many of us in EMS call a "doorway assessment." The key is to constantly re-evaluate your decision-making paradigm as the circumstances change:

  • Observe. Don’t just do something, stand there. Take in the scene dispassionately before you immerse yourself in it.
  • Orient. Orient the observations you have made into a clinical picture based upon the environment, the patient presentation, your education and prior experience, etc.
  • Decide. Make a treatment decision based upon the clinical picture you have developed.
  • Act. Carry out the treatment decision.

Keep using that model as the call progresses and you perform your ongoing assessment, but the key is to approach each step dispassionately. Pretend it's a new patient and a new call, and see if that alters your clinical decision making.

With a good partner and application of the OODA loop, you'll be that much better equipped to make sound treatment decisions.

Columnist Kelly Grayson is a paramedic and ER tech in Louisiana. He has spent the past 14 years as a field paramedic, critical care transport paramedic, field supervisor and educator. Kelly is the author of the book Life, Death and Everything In Between, and the popular blog A Day in the Life of An Ambulance Driver.