Section 5.3: Alert Fatigue and Human Factors Engineering
The Science of Designing Interventions That Clinicians Will Actually Use.
5.3.1 The “Why”: The Most Dangerous Side Effect of Poorly Designed CDS
We have learned the philosophy of the Five Rights and the mechanics of IF-THEN logic. Now, we must confront the single greatest challenge in our field, the existential threat that can render even the most clinically sound rule completely useless: alert fatigue. This is not a minor inconvenience; it is a profound patient safety issue. Alert fatigue is a dangerous state of cognitive overload where a relentless barrage of low-value, irrelevant, or unactionable alerts conditions clinicians to reflexively dismiss them all—including the rare, critical warnings that are meant to prevent catastrophic harm. When a user sees an alert, their first thought should be, “I need to pay attention to this.” When alert fatigue sets in, their first thought becomes, “How fast can I click through this to get back to my work?”
It is essential to reframe our understanding of this problem. Alert fatigue is not a failing of the user. It is not that physicians or pharmacists are lazy or careless. It is a predictable, physiological response to a poorly designed technological environment. The human brain has a finite capacity for directed attention. When we, as informatics pharmacists, design systems that overwhelm that capacity with noise, we are the ones creating the unsafe condition. We are inadvertently training our colleagues to ignore the very safeguards we are building.
This section is a deep dive into the psychology and cognitive science behind this phenomenon. We will explore why the human brain tunes out repetitive stimuli and how well-intentioned alerts can backfire, creating new and unexpected pathways to error. More importantly, we will provide a masterclass on mitigation strategies. You will learn the art and science of Human Factors Engineering, a discipline dedicated to designing systems that work in harmony with human cognitive strengths and limitations, along with the practical techniques of alert tiering, rule suppression, and data-driven optimization that are the hallmarks of a sophisticated informatics program. Your goal, by the end of this section, is to transform from a simple “rule builder” into a true “attention architect,” capable of designing CDS that is not just seen, but respected, trusted, and acted upon.
Retail Pharmacist Analogy: The Car Alarm in a Busy Parking Lot
Imagine you are working in a busy pharmacy located in a large shopping plaza. Outside, a car alarm is going off. For the first thirty seconds, it’s alarming. You and everyone around you look up, scanning the parking lot, assessing for a potential threat. The signal is working as intended.
Now, imagine that same car alarm goes off every five minutes, every single day, usually for no reason. A strong gust of wind sets it off. A passing truck sets it off. Sometimes it just goes off randomly. What happens to your response? After a few days, you don’t even look up. The constant, repetitive, low-value signal has become meaningless noise. You have become desensitized. Your brain has learned that this specific signal has a near-zero probability of representing a real threat, so to conserve cognitive energy, you automatically filter it out.
Then, one day, you hear that same, familiar car alarm blaring. But this time, it’s because someone is actually breaking into a car. Because you have been conditioned to ignore the signal, you don’t look up. Nobody does. A real, actionable event is missed because the signaling system has been rendered useless by a flood of false positives. This is alert fatigue.
Every time you, as an informatics pharmacist, create a low-value, clinically insignificant, or overly broad alert, you are contributing to this “parking lot” effect. You are adding another false alarm to the environment, making it incrementally more likely that a truly critical alert—the one warning about a fatal allergy or a catastrophic drug interaction—will be ignored along with all the others. Our job is to ensure that when our system “sounds the alarm,” it is for a real, verifiable threat that requires immediate attention.
5.3.2 The Cognitive Science Behind Alert Fatigue: Why Good Clinicians Miss Bad Alerts
To effectively combat alert fatigue, we must first understand its underlying mechanisms. This is not a problem of motivation or professionalism; it’s a problem of fundamental cognitive limitations. By understanding the psychology of attention, memory, and automation, we can design systems that complement, rather than conflict with, how the human brain processes information, especially under pressure.
Cognitive Load: The Brain’s Limited RAM
Think of a clinician’s working memory as the RAM on a computer. It’s incredibly fast and powerful, but it has a finite capacity. At any given moment, a physician in the ED is tracking a patient’s vital signs, formulating a differential diagnosis, remembering to order specific labs, and planning the next steps in care. All of this consumes cognitive capacity. Every interruptive alert, no matter how brief, acts as a new “program” that demands a share of that limited RAM. It forces a context switch, where the clinician must stop their primary train of thought, attend to the alert, process its meaning, make a decision, and then attempt to reload their original mental context. This switching process is mentally expensive.
When a system presents too many low-value alerts, it overloads the user’s cognitive capacity. The brain’s natural defense mechanism against this overload is to start shedding tasks—and the first task to be shed is the careful consideration of alerts that have proven to be unhelpful in the past. Alert fatigue is, in essence, a cognitive self-preservation strategy.
Masterclass Table: The Psychology of User Interaction with Alerts
| Cognitive Principle | Description | How It Leads to Alert Fatigue & Medical Error | Human Factors Design Solution |
|---|---|---|---|
| Inattentional Blindness | A psychological phenomenon where a person fails to notice a fully visible, but unexpected, object or event because their attention was engaged on a different task. It’s not a vision problem; it’s an attention problem. | A clinician is focused on entering a complex set of chemotherapy orders. An alert for a routine drug interaction appears in the same place on the screen it always does. The clinician’s brain, focused on the primary task, literally does not “see” the alert text; it only registers the familiar box that needs to be clicked to continue. They override without reading. | Vary the Alert’s Appearance for High-Risk Scenarios. For critical, “never-miss” alerts (e.g., fatal allergy), change the format. Use a different color (e.g., a red, shaking alert box), require a typed reason for override instead of a simple click, or use a different screen location. This novelty breaks the pattern and recaptures the user’s attention. |
| Automation Bias | The tendency to over-trust or excessively rely on automated systems. If the computer suggests something, we are biased towards accepting it as correct, even if our own judgment might suggest otherwise. | An order set for community-acquired pneumonia defaults to a standard dose of ceftriaxone. The patient has severe renal failure. The prescriber, biased by the “correctness” of the system’s default suggestion, fails to perform their own mental check for a renal dose adjustment and orders the incorrect, excessive dose. | Build “Forcing Functions” and Smart Defaults. The system should not just provide a default; it should use patient data to provide a smart default. The order set should automatically calculate the CrCl and present the renally-adjusted dose as the default, forcing the user to consciously choose the higher dose if they disagree. |
| Response Stereotyping (Motor Memory) | After performing the same sequence of actions hundreds of times, it becomes an automatic motor program. Think of typing a password or driving home from work without consciously remembering the individual turns. | A pharmacist verifies hundreds of orders a day. The “override” button for alerts is always in the bottom right corner of the pop-up. Their hand and mouse develop a motor memory for moving to that corner and clicking. When a critical alert appears, the hand moves and clicks before the brain has even had a chance to process the text. | Introduce “Speed Bumps” for High-Risk Overrides. For critical alerts, do not allow a simple one-click override. Require a multi-step process. For example, the user must click a checkbox that says “I have reviewed the risks and confirm this is clinically appropriate,” and then click a separate “Proceed” button. This disrupts the motor stereotype and forces a moment of conscious thought. |
| Alarm Fatigue (The “Cry Wolf” Effect) | A form of sensory desensitization that occurs when a signal is presented so frequently that it loses its meaning and urgency. This is the core of the issue. | A system is configured to alert for every single potential QTc-prolonging agent, regardless of the patient’s baseline QTc or other risk factors. After seeing dozens of these low-risk alerts for drugs like ondansetron, the clinician ignores the one truly dangerous alert for a patient on methadone, dofetilide, and ciprofloxacin. | Embrace Rule Specificity and Tiering. This is the most important solution. The informatics pharmacist must aggressively tune the system. Instead of a simple “is the drug on the QT list?” rule, build a sophisticated rule: `IF` drug is on the list `AND` patient is on `>` 2 other QT drugs `AND` the latest QTc interval is `>` 470 ms `AND` the serum potassium is `<` 3.5 mEq/L, `THEN` fire a high-priority alert (a code sketch of this pattern follows the table). This focuses the interruption on the small subset of patients who are actually at high risk. |
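To make the specificity pattern in the table’s final row concrete, here is a minimal sketch in Python. The field names and the drug list are assumptions for illustration; a real implementation would live in your EHR vendor’s rule engine or a CDS Hooks service rather than in standalone code.

```python
# Hypothetical QT-prolongation rule: fire a high-priority alert only when
# several risk factors converge, rather than for every drug on the QT list.

QT_PROLONGING_DRUGS = {"methadone", "dofetilide", "ciprofloxacin",
                       "ondansetron", "haloperidol"}  # illustrative subset

def should_fire_qt_alert(new_drug: str, active_meds: list[str],
                         latest_qtc_ms: float, serum_potassium: float) -> bool:
    """Mirror the rule in the table above: new drug is QT-prolonging AND the
    patient is on >2 other QT drugs AND latest QTc > 470 ms AND K < 3.5 mEq/L."""
    if new_drug.lower() not in QT_PROLONGING_DRUGS:
        return False
    other_qt_drugs = [m for m in active_meds
                      if m.lower() in QT_PROLONGING_DRUGS
                      and m.lower() != new_drug.lower()]
    return (len(other_qt_drugs) > 2
            and latest_qtc_ms > 470
            and serum_potassium < 3.5)

# High-risk patient -> alert fires; routine patient -> the system stays silent.
print(should_fire_qt_alert("ciprofloxacin",
                           ["methadone", "dofetilide", "ondansetron", "haloperidol"],
                           latest_qtc_ms=492, serum_potassium=3.1))   # True
print(should_fire_qt_alert("ondansetron", ["lisinopril"],
                           latest_qtc_ms=430, serum_potassium=4.2))   # False
```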
5.3.3 The Anatomy of a Bad Alert: A Rogue’s Gallery for Informatics Pharmacists
To learn how to build good alerts, we must first become expert diagnosticians of bad ones. Bad alerts are the pathogens that cause the disease of alert fatigue. By dissecting their common characteristics, we can learn to recognize and eliminate them from our own systems. As you review rules and alerts in your institution, you should constantly be on the lookout for these classic offenders.
The Rogue’s Gallery
The Nuisance Alert (Clinically Insignificant)
An alert that is technically correct but clinically irrelevant or easily managed, requiring no action.
Classic Example:
An alert fires for the interaction between lisinopril and spironolactone at standard doses in a patient with normal renal function and a normal potassium level. While technically an interaction exists, this combination is intentionally used millions of times a day and managed with routine monitoring. It does not require an interruptive alert at the point of ordering.
The Harm:
This is the single biggest contributor to alert fatigue. When clinicians see that the system constantly interrupts them for routine, standard-of-care practice, they learn that the system’s definition of “danger” is not aligned with their own clinical judgment, leading them to distrust all alerts.
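Applying the same specificity principle shown earlier, this nuisance check does not need to be deleted outright; it can be conditioned on the labs that actually confer risk, so it stays silent for the routine case described above. A minimal sketch, with illustrative thresholds and hypothetical field names that a P&T committee would need to define:

```python
# Hypothetical refinement of the ACE inhibitor + potassium-sparing diuretic
# check: stay silent when potassium and renal function are normal, and reserve
# an interruptive alert for patients with documented risk factors.

def ace_k_sparing_alert_needed(serum_potassium: float, egfr: float) -> bool:
    """Fire only when hyperkalemia risk is elevated (thresholds illustrative)."""
    return serum_potassium >= 5.0 or egfr < 30

print(ace_k_sparing_alert_needed(serum_potassium=4.3, egfr=85))  # False: suppressed
print(ace_k_sparing_alert_needed(serum_potassium=5.4, egfr=42))  # True: alert
```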
The “Cry Wolf” Alert (Low Positive Predictive Value)
An alert that fires so broadly that the vast majority of times it appears, it’s a false alarm.
Classic Example:
An alert that fires whenever any SSRI is ordered for a patient on any antiplatelet or anticoagulant, warning of “increased bleeding risk.” While a pharmacologic basis exists, the actual clinical risk of a significant bleed from this interaction in most patients is extremely low. The alert fires constantly but rarely leads to a change in therapy.
The Harm:
This directly erodes the user’s trust. The alert becomes synonymous with “the alert I always ignore.” When a truly high-risk bleed alert fires (e.g., triple therapy with warfarin, aspirin, and clopidogrel), it may be ignored because it looks just like the hundred other low-risk bleed alerts the user has dismissed.
The Vague or Unactionable Alert
An alert that presents a problem without providing enough context or a clear path to a solution.
Classic Example:
An alert that simply says, “Dose may require adjustment.” It doesn’t state why (renal? hepatic? age?), what the current calculated value is (e.g., CrCl), or what the recommended dose should be. It forces the user to stop, exit the ordering workflow, hunt for the relevant data, and do the calculation themselves.
The Harm:
This causes immense frustration. The user feels the system has given them homework instead of help. They are more likely to override it out of annoyance and the perceived time cost of investigation, even if a real issue exists.
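For contrast, an actionable version of this alert does the homework for the user: it names the reason, shows the calculated value, and points to the needed adjustment. A minimal sketch, assuming hypothetical patient fields and using the Cockcroft–Gault estimate for creatinine clearance; the threshold and message wording are placeholders an institution would define in its own dosing knowledge base:

```python
from typing import Optional

def cockcroft_gault_crcl(age: int, weight_kg: float, scr_mg_dl: float,
                         is_female: bool) -> float:
    """Estimate creatinine clearance (mL/min) via Cockcroft-Gault."""
    crcl = ((140 - age) * weight_kg) / (72 * scr_mg_dl)
    return crcl * 0.85 if is_female else crcl

def renal_dosing_alert(drug: str, age: int, weight_kg: float,
                       scr_mg_dl: float, is_female: bool) -> Optional[str]:
    """Return an actionable alert message, or None if no adjustment is needed."""
    crcl = cockcroft_gault_crcl(age, weight_kg, scr_mg_dl, is_female)
    if crcl >= 50:  # placeholder threshold; real rules are drug-specific
        return None
    return (f"{drug}: estimated CrCl is {crcl:.0f} mL/min (Cockcroft-Gault). "
            f"Recommend renally adjusted dosing per institutional protocol.")

print(renal_dosing_alert("enoxaparin", age=78, weight_kg=62,
                         scr_mg_dl=1.8, is_female=True))
```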
The Late Alert (Wrong Point in Workflow)
An alert that provides correct and important information, but at a point in the workflow when it is too late to act upon it easily.
Classic Example:
An alert about a non-formulary medication that only appears to the pharmacist during final verification. The physician has already ordered it, set the patient’s expectations, and moved on to other tasks. The pharmacist is now faced with the time-consuming and professionally awkward task of contacting the physician to recommend a change for a non-clinical reason.
The Harm:
This creates rework, delays patient care, and causes inter-professional friction. The correct point in the workflow for this alert was during order entry, for the prescriber.
5.3.4 Masterclass in Mitigation: Practical Strategies to Fight Alert Fatigue
Recognizing bad alerts is the first step; actively designing a system that minimizes them is the goal. Fighting alert fatigue is not a single project, but an ongoing process of governance, design, and data-driven refinement. A successful informatics pharmacist must become a master of these mitigation strategies.
Strategy 1: Tiered Alerting – Matching the Interruption to the Risk
The most powerful strategy is to formally classify all your alerts based on their clinical severity. Not all risks are created equal, and therefore not all alerts should be presented with the same force. A tiered system allows you to reserve your most interruptive alerts for your most high-risk scenarios, preserving the user’s attention for when it matters most.
A Model Tiered Alerting System
| Tier | Clinical Severity | CDS Action (Channel & Format) | Example |
|---|---|---|---|
| Tier 1: Critical / “Hard Stop” | Likely to cause death or irreversible harm. Action is absolutely contraindicated. | Interruptive Hard Stop Alert. Blocks the user from proceeding. No override is possible without a specialized workflow (e.g., pharmacy call). | Ordering a penicillin for a patient with a documented anaphylactic reaction. |
| Tier 2: High / “Soft Stop” | Potential for significant harm, but exceptions may exist. Action is strongly discouraged. | Interruptive Soft Stop Alert. User is interrupted with a clear warning but can proceed after providing a mandatory, documented reason for overriding. | Ordering a known teratogen (e.g., isotretinoin) for a patient who is documented as pregnant. |
| Tier 3: Moderate / Clinical Warning | Clinically significant issue that requires consideration and may require action. The most common tier. | Interruptive Soft Stop Alert with Easy Override. A standard pop-up that clearly presents the issue and a recommended action, but can be overridden with a simple click. | A significant drug-drug interaction (e.g., simvastatin + amiodarone) or a renal dosing recommendation. |
| Tier 4: Informational / Passive | Helpful context or guidance that is not related to an immediate safety risk. | Non-Interruptive Guidance. Information presented within the workflow, such as a note in the order summary, a default value, or an infobutton. | Displaying the date of the last potassium level next to a new order for an ACE inhibitor. Suggesting a formulary alternative. |
| Tier 5: Suppressed | Clinically insignificant or “nuisance” level information. | No Alert Fires. The rule is turned off or suppressed in the clinical knowledge base. The system remains silent. | The interaction between an ACE inhibitor and an NSAID. It is suppressed from firing as an interruptive alert for prescribers. |
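One way to operationalize this classification is to represent each tier as data that drives the alert’s behavior, so a governance decision to re-tier an alert changes a configuration value rather than rule logic. A minimal sketch, with hypothetical tier definitions that mirror the model table above:

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    CRITICAL = 1       # hard stop
    HIGH = 2           # soft stop, mandatory override reason
    MODERATE = 3       # soft stop, one-click override
    INFORMATIONAL = 4  # passive, non-interruptive guidance
    SUPPRESSED = 5     # rule stays silent

@dataclass
class TierBehavior:
    fires: bool                     # does the rule surface at all?
    interruptive: bool              # pop-up vs. passive in-workflow display
    override_reason_required: bool  # typed justification needed to proceed
    can_override: bool              # False = hard stop

# Behaviors mirror the model tiered alerting table above.
TIER_BEHAVIOR = {
    Tier.CRITICAL:      TierBehavior(True,  True,  False, False),
    Tier.HIGH:          TierBehavior(True,  True,  True,  True),
    Tier.MODERATE:      TierBehavior(True,  True,  False, True),
    Tier.INFORMATIONAL: TierBehavior(True,  False, False, True),
    Tier.SUPPRESSED:    TierBehavior(False, False, False, True),
}

# Re-tiering an alert through governance changes one data value, not the rule.
print(TIER_BEHAVIOR[Tier.HIGH])
```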
The Role of Governance
Implementing a tiered system requires strong clinical governance. As an informatics pharmacist, you will not make these decisions in a vacuum. You will work with your institution’s P&T Committee, clinical champions, and IT governance groups to review and classify alerts. This multi-disciplinary approach ensures that the decisions are based on a consensus of clinical evidence and institutional policy, and it provides the necessary authority to suppress low-value alerts, even if a commercial vendor’s “out-of-the-box” setting is to have them turned on.
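Governance decisions to re-tier or suppress an alert are easiest to defend when they are backed by data. As a minimal sketch of the data-driven refinement described earlier in this section, here is one way to compute per-rule firing volume and override rates from a hypothetical export of alert log records; the field names are assumptions, and a real analysis would use your EHR’s reporting tools:

```python
from collections import defaultdict

# Hypothetical alert log records: (rule_name, action), where action is
# "overridden" or "accepted". A real export would come from EHR reporting tools.
alert_log = [
    ("ssri_antiplatelet_bleed", "overridden"),
    ("ssri_antiplatelet_bleed", "overridden"),
    ("ssri_antiplatelet_bleed", "overridden"),
    ("ssri_antiplatelet_bleed", "accepted"),
    ("anaphylactic_allergy", "accepted"),
]

counts = defaultdict(lambda: {"fired": 0, "overridden": 0})
for rule, action in alert_log:
    counts[rule]["fired"] += 1
    if action == "overridden":
        counts[rule]["overridden"] += 1

# Rules with high volume and high override rates are candidates for
# refinement, re-tiering, or suppression through governance review.
for rule, c in sorted(counts.items(), key=lambda kv: -kv[1]["fired"]):
    rate = c["overridden"] / c["fired"]
    print(f"{rule}: fired {c['fired']} times, override rate {rate:.0%}")
```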