Human Factors and Errors at the Person–Machine Interface



Introduction





Until now, we have discussed several paradigm shifts required to improve patient safety. The dominant one, of course, is replacing an environment based on “blame and shame” with one in which safety is viewed as a top priority and systems thinking is employed effectively. Second is an awareness of the impact of culture and relationships on communication and the exchange of information. This chapter will introduce another lens through which to view safety problems: how human factors engineering (HFE) can improve the safety of person–machine interactions and the environment in which healthcare providers work.






To prime ourselves for a discussion of HFE, let us consider the following scenarios:







  • An obstetric nurse inadvertently connects an opiate pain medication intended for an epidural catheter to a mother’s intravenous line, leading to the patient’s death.1 A subsequent review demonstrated that the bags and lines used for epidural and intravenous infusions were similar in size and shape, leaving nothing other than human vigilance to prevent a bag intended for epidural use from being connected to an IV catheter or hub.
  • A hospitalized elderly man dies after a heart attack suffered while in a monitored bed in one of America’s top hospitals. A later investigation reveals that the main crisis monitor had been turned off, and multiple lower-level alarms—including ones showing that the patient’s heart rate had been slowing dangerously for nearly half an hour before his heart stopped—went unnoticed by the busy nursing staff, who had become so inured to frequent false alarms that they suffered from “alarm fatigue.”2
  • Modern multichannel infusion pumps are routinely used in the ICU to administer multiple medications and fluids through a single central line (Figure 7-1). This often results in a confusing tangle of tubes that cannot be easily differentiated. No surprise, then, that a busy ICU nurse might adjust the dose of the wrong medication.
  • Medications are often stored in vials in highly concentrated doses, which means they frequently must be carefully diluted before being administered. For example, a vial of phenylephrine contains 10 mg/mL, while the usual IV dose administered to patients is 0.1 mg—one-hundredth of the dose in the vial (a worked dilution calculation follows this list). Inadvertent administration of full-strength phenylephrine can cause a stroke.
  • An elderly patient dies when a Code Blue team—responding to the call for help after a cardiac arrest—is unable to connect the defibrillator pads to the defibrillator.3 A subsequent analysis showed that over the years the hospital had accumulated more than a dozen defibrillator models on its floors, so providers were often unfamiliar with the models at hand and incompatibilities were common.4
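
To make the scale of that dilution concrete, here is a worked calculation. It assumes a 1-mL vial of phenylephrine at 10 mg/mL diluted to a total volume of 100 mL; the specific volumes are illustrative only, not a dosing protocol:

\[
\text{concentration after dilution} = \frac{10\ \text{mg}}{100\ \text{mL}} = 0.1\ \text{mg/mL}
\]

A 1-mL bolus of the diluted solution then delivers the intended 0.1-mg dose, whereas 1 mL drawn directly from the undiluted vial would deliver 10 mg, a 100-fold overdose.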







Figure 7-1



A modern sea of intravenous drips and lines. Is it any wonder that errors sometimes occur, with the wrong medication or rate administered to a desperately ill patient? (Courtesy of Michael Gropper, MD, PhD, with permission.)







In each of these examples, significant hazards resulted from people interacting with products, tools, procedures, and processes in the clinical environment. One could argue that these errors could have been prevented by more careful clinicians or more robust training. However, as we have already learned, to minimize the chances that fallible humans (in other words, all of us) will cause patient harm, it is critical to apply systems thinking. In the case of person–machine interfaces, this systems focus leads us to consider issues around device design, the environment, and the care processes that accompany device use. The field of HFE provides the tools to accomplish this.






This chapter was coauthored by Bryan Haughom, MD.






Human Factors Engineering





Human factors engineering is an applied science of systems design that is concerned with the interplay between humans, machines, and their work environments.5,6 Its goal is to assure that devices, systems, and working environments are designed to minimize the likelihood of error and optimize safety. As one of its central tenets, the field recognizes that humans are fallible and that they often overestimate their abilities and underestimate their limitations. Human factors engineers strive to understand the strengths and weaknesses of our physical and mental abilities and use that information to design safer devices, systems, and environments.






HFE is a hybrid field, mixing various engineering disciplines, design, and cognitive psychology. Its techniques have long been used in the highly complex and risky fields of aviation, electrical power generation, and petroleum refining, but its role in patient safety has only recently been appreciated.7–9 In applying HFE to healthcare, there has been a particular emphasis on the design and use of devices such as intravenous pumps, catheters, computer software and hardware, and the like.






Many medical devices have poorly designed user interfaces that are confusing and clumsy to use.10,11 According to the U.S. Food and Drug Administration (FDA), approximately half of all medical device recalls between 1985 and 1989 stemmed from poor design. FDA officials, along with other human factors experts, now recognize the importance of integrating human factors principles into the design of medical equipment.11–13






In Chapter 2, I introduced the concept of forcing functions, design features that prevent the user from taking an action without deliberately considering information relevant to that action. The classic example of a forcing function was the redesign of automobiles so that a car cannot be shifted into reverse unless the driver’s foot is on the brake. In healthcare, forcing functions have been created to make it impossible to connect the wrong gas to an anesthetized patient and to prevent patients from overdosing themselves while receiving patient-controlled analgesia (PCA). Although forcing functions are the most straightforward application of HFE, it is important to appreciate other healthcare applications, ranging from improving device design to aiding in device procurement decisions to evaluating processes within the care environment.
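
As a purely illustrative sketch of the same idea in software (this is not any vendor’s actual pump firmware; the class name, limits, and messages are hypothetical), a PCA forcing function can simply refuse to act until its safety checks pass:

```python
from datetime import datetime, timedelta

# Hypothetical limits; real pumps use clinician-programmed, validated values.
LOCKOUT = timedelta(minutes=10)   # minimum interval between patient-requested boluses
MAX_HOURLY_DOSE_MG = 2.0          # cumulative one-hour dose ceiling


class PCAPump:
    def __init__(self, bolus_mg):
        self.bolus_mg = bolus_mg
        self.dose_log = []        # timestamps of delivered boluses

    def request_bolus(self, now=None):
        """Forcing function: deliver a bolus only if every safety check passes."""
        now = now or datetime.now()
        # Check 1: the lockout interval since the last delivered bolus.
        if self.dose_log and now - self.dose_log[-1] < LOCKOUT:
            return "Denied: lockout interval has not elapsed"
        # Check 2: the cumulative dose over the trailing hour.
        recent = [t for t in self.dose_log if now - t < timedelta(hours=1)]
        if (len(recent) + 1) * self.bolus_mg > MAX_HOURLY_DOSE_MG:
            return "Denied: hourly dose limit would be exceeded"
        self.dose_log.append(now)
        return "Bolus delivered"


pump = PCAPump(bolus_mg=0.5)
print(pump.request_bolus())   # first request: delivered
print(pump.request_bolus())   # immediate second request: denied by the lockout
```

The point is not the particular limits but the structure: the unsafe action is blocked by design rather than depending on the user to remember the rule.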






For example, many hospitals are now approaching the challenge of increasing the frequency with which providers clean their hands partly as a human factors problem (Chapter 10).14,15 While these institutions continue to work on education and observation, they also ensure that cleaning gel dispensers are easy to use and strategically located throughout the hospital wards. In fact, a whole field of patient safety–centered hospital and clinic design has emerged, and some buildings have been constructed using human factors principles.16,17






Despite these early success stories, HFE remains conspicuously underused as a patient safety tool, for reasons ranging from the lack of well-defined avenues to report and correct design or process flaws within hospitals to the natural tendency of highly trained caregivers to feel that they can outsmart or work around problems.9 With the dramatic growth in the volume and complexity of person–machine clinical interactions, the probability that human workers will make mistakes has escalated, as has the importance of considering HFE approaches.






Usability Testing and Heuristic Analysis





One of the key tools in HFE is usability testing, in which experts observe frontline workers engaging in their task under realistic conditions—either actual patient care or simulated environments that closely replicate reality. Users are observed, videotaped, and asked to “talk through” their thought processes, explaining their actions as well as their difficulties with a given application or device. Engineers then analyze the data in order to fine-tune their design for the users, the chosen tasks, and the work environment.18






Software engineers and design firms now see usability testing as an indispensable part of their work, preferring to make modifications at the design stage instead of waiting until errors have become apparent through real-world use. Similarly, many equipment manufacturers and a growing number of healthcare organizations now employ individuals with human factors expertise to advise them on equipment purchasing decisions, modify existing equipment to prevent errors, or identify error-prone equipment and environmental situations.8,10,11 These trained individuals instinctively approach errors with a human factors mindset, asking questions about usability and possible human factors solutions before considering fixes involving retraining and incentives, interventions that may seem easier than device or environmental redesign but are generally far less effective.






Usability testing can be a complex process, requiring not only human factors experts but also extensive investigatory time and cooperation from users. A less resource-intensive alternative is known as heuristic analysis.11 The term heuristics was first mentioned in Chapter 6 in reference to the cognitive shortcuts that clinicians often take during diagnostic reasoning, shortcuts that can lead to errors. In the context of HFE, though, heuristics have a different connotation: “rules of thumb” or governing principles for device or system design. In heuristic evaluations, the usability of a particular system or device is assessed by applying established design fundamentals such as visibility of system status, user control and freedom, consistency and standards, flexibility, and efficiency of use (Table 7-1).18,19
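
As an illustration only, a heuristic evaluation can be organized as a severity-scored checklist. The sketch below uses Python; the device, findings, and 0–4 severity scale are hypothetical, and the heuristic names simply echo the categories listed above:

```python
# Hypothetical heuristic-evaluation worksheet for an infusion pump interface.
# Severity: 0 = no problem ... 4 = usability catastrophe.
HEURISTICS = [
    "Visibility of system status",
    "User control and freedom",
    "Consistency and standards",
    "Flexibility and efficiency of use",
]

# Findings recorded by the evaluator: heuristic -> (severity, note).
findings = {
    "Visibility of system status": (3, "Occlusion alarm does not indicate which channel is blocked"),
    "Consistency and standards": (2, "Soft-key labels differ between the bolus and continuous-rate screens"),
}

# Flag items above a severity threshold so they can be prioritized for redesign.
for heuristic in HEURISTICS:
    severity, note = findings.get(heuristic, (0, "No issue identified"))
    flag = "FIX" if severity >= 3 else "ok "
    print(f"[{flag}] {heuristic} (severity {severity}): {note}")
```

Even this lightweight format captures the essence of the method: a small number of evaluators walk a device or interface against established design principles and rank the violations they find.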


