By Jeannine Blake, Ph.D., RN
It’s 3 a.m. in a busy ICU. Nurses move between rooms, managing a constant stream of patient needs. In one of them, a nurse – working his third shift in a row – is running on four hours of sleep, holding out for 7 a.m. and a few nights off. His patient is critically ill. The room is filled with equipment, a tired family sits vigil, and providers come and go asking questions or performing assessments. Phones ring. Labs are due. Medications need administering.
An infusion pump alarm goes off. He sighs, walks over, and squints at the machine. Another occlusion alarm, again with no clear cause. He opens the pump, taps the tubing, gives it a slight tug, checks the connections, closes the door, and presses start … then mentally crosses his fingers that it will run for more than five minutes. The family watches closely. He offers a reassuring nod and a practiced smile, the kind that says, “Everything’s fine,” even though it doesn’t feel fine because this darn thing won’t stop beeping, but alas, this is just what happens night after night.
This nurse is educated and experienced. He knows the tips and tricks for managing these alarms. But they persist – baffling and disruptive. Is it the pump? The fluid? The patient? The training manuals never cover this in a real-world context. They list what to do, but not why. Not in a way that fits with the reality of clinical care or offers the practical wisdom needed to think critically through these kinds of challenges. They don't account for how fragmented, overloaded and error-prone this environment really is.
Medical device safety cannot rely on users operating at their best. Or even at their average. Real clinical care, especially in high-acuity settings like the ICU, happens every day, including on caregivers' worst days. We can't fix this by just adding another training session or telling someone to pay more attention.
WHY THIS MATTERS
The clinical environment is cognitively overloaded, emotionally intense and frequently understaffed.¹ Efforts to fill staffing gaps through shift rotations, floating between units, or travel staffing increase the odds that a caregiver will be out of their element, even if they are familiar with the tasks and equipment involved. Mistakes are inevitable; believing otherwise is the biggest error of all.² The system must be designed to prevent human mistakes from becoming patient harm. And in healthcare, part of that system includes the medical devices we use to carry out care.³
Devices must be designed to work with humans, not despite them. We cannot assume that users will recall complex training or be mentally prepared to operate devices “as taught.”⁴ We must start from a different premise: people are fallible, and systems must account for that.
WHAT HUMAN FACTORS HAS ALWAYS TOLD US
Human factors engineering has long emphasized that education and training should not be the primary method of mitigating use error.³,⁵,⁶ And yet, when something goes wrong, clinicians are too often blamed, and then sent to retraining sessions that do little to change outcomes. The message becomes: “Do better.”
Training may be the only option in the short term, but it should NEVER be considered a long-term fix. That fix must come from technology, whether through a software patch, hardware update, or full redesign. Yes, those changes are harder. But they're also more effective.⁷ And adopting that mindset creates space for innovation rather than defensiveness. It also empowers clinicians to speak up when something doesn't work, without fearing blame, reprimand or judgment.
DESIGNING FOR REALITY, NOT IDEAL USE
Devices must reflect the context in which they’re used. For nurses and other frontline clinicians, this context often includes multitasking, frequent interruptions, fatigue and insufficient support.⁸ And yet, in these environments where the consequences of error are high, the tolerance for uncertainty is low.
Despite these stakes, medical technologies are too often designed without enough meaningful input from their primary users about how they will fit into daily workflows. That disconnect has real consequences, and when systems break down, the burden falls on the human user, in the form of more training on how to navigate the broken system.⁶ Each system breakdown, and each resulting training burden, is just one more responsibility placed on the human instead of the device.
But human factors principles are clear: if two users make the same mistake, it’s not a user problem. It’s a design problem.
A CALL FOR RETHINKING ACCOUNTABILITY
Nurses are highly educated professionals. But they're still human, and human error should be expected, anticipated, and prevented through robust systems, not punished or patched with more training.⁹ We need to shift the way we think about accountability. Errors and lapses are not moral failings or evidence of incompetence.² They are signs that the system, including its devices, is failing to support the people within it and the patients whose care relies upon it.
We don’t need to teach nurses to “try harder.” We need devices built for the messy, high-stakes, beautifully human environments in which nurses operate. Training may patch the system for a while, but only design can truly fix it.
Until we stop blaming individuals and start fixing systems, the same errors will continue, and we'll keep blaming the same people (or the new ones who replace the ones who've burned out). It's time to build devices for real people, doing real work, under real pressure. Because education is not the answer; design is.
– Jeannine Blake is an assistant professor affiliated with the Elaine Marieb Center for Nursing and Engineering Innovation at the University of Massachusetts Amherst. Opinions and insights are her own.
References
1. Carayon P, Gurses AP. Nursing workload and patient safety — A human factors engineering perspective. In: Hughes R, ed. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Agency for Healthcare Research and Quality; 2008.
2. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Vol 6. National Academies Press; 2000.
3. Carayon P, Wetterneck TB, Rivera-Rodriguez AJ, et al. Human factors systems approach to healthcare quality and patient safety. Appl Ergon. Jan 2014;45(1):14-25. doi:10.1016/j.apergo.2013.04.023
4. Reason J. Human Error. Cambridge University Press; 1990.
5. Russ AL, Fairbanks RJ, Karsh BT, Militello LG, Saleem JJ, Wears RL. The science of human factors: separating fact from fiction. BMJ Qual Saf. Oct 2013;22(10):802-8. doi:10.1136/bmjqs-2012-001450
6. FDA. Applying human factors and usability engineering to medical devices: Guidance for industry and Food and Drug Administration Staff. February 3, 2016. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/applying-human-factors-and-usability-engineering-medical-devices
7. OSHA. Identifying Hazard Control Options: The Hierarchy of Controls. Washington, DC: U.S. Department of Labor.
8. Westbrook JI, Woods A, Rob MI, Dunsmuir W, Day RO. Association of interruptions with an increased risk and severity of medication administration errors. Arch Intern Med. 2010;170(8):683-690.
9. Reason J. Human error: models and management. BMJ. 2000;320:768-70.
