Appendices



Appendix I. Key Books, Reports, Series, and Web Sites on Patient Safety





Key Books and Reports on Medical Errors and Errors More Generally*








1. Agency for Healthcare Research and Quality. Advances in Patient Safety: From Research to Implementation. Rockville, MD: Agency for Healthcare Research and Quality; February 2005. AHRQ Publication Nos. 05-0021 (1–4).


2. Agency for Healthcare Research and Quality. Advances in Patient Safety: New Directions and Alternative Approaches. Rockville, MD: Agency for Healthcare Research and Quality; July 2008. AHRQ Publication Nos. 08-0034 (1–4).


3. Agency for Healthcare Research and Quality. Advancing Patient Safety: A Decade of Evidence, Design, and Implementation. Rockville, MD: Agency for Healthcare Research and Quality; November 2009. AHRQ Publication No. 09(10)-0084.


4. Antonsen S. Safety Culture: Theory, Method and Improvement. Burlington, VT: Ashgate; 2009.


5. Banja J. Medical Errors and Medical Narcissism. Sudbury, MA: Jones and Bartlett Publishers Inc; 2005.


6. Berwick DM. Escape Fire: Designs for the Future of Health Care. San Francisco, CA: Jossey-Bass; 2003.


7. Bogner MS, ed. Human Error in Medicine. Mahwah, NJ: Lawrence Erlbaum Associates; 1994.


8. Bosk CL. Forgive and Remember: Managing Medical Failure. 2nd ed. Chicago, IL: University of Chicago Press; 2003.


9. Bunting RF Jr, Schukman J, Wong WB. A Comprehensive Guide to Managing Never Events and Hospital-Acquired Conditions. Washington, DC: Atlantic Information Services Inc; 2009.


10. Casey SM. Set Phasers on Stun: And Other True Tales of Design, Technology, and Human Error. 2nd ed. Santa Barbara, CA: Aegean Publishing Company; 1998.


11. Cohen MR, ed. Medication Errors. 2nd ed. Washington, DC: American Pharmaceutical Association; 2006.


12. Columbia Accident Investigation Board. Report of the Columbia Accident Investigation Board; August 2003.


13. Conway J, Federico F, Stewart K, et al. Respectful Management of Serious Clinical Adverse Events. Cambridge, MA: Institute for Healthcare Improvement; 2010.


14. Cook RI, Woods DD, Miller C. A Tale of Two Stories: Contrasting Views of Patient Safety. National Patient Safety Foundation at the AMA: Annenberg Center for Health Sciences; 1998.


15. Dekker S. Just Culture: Balancing Safety and Accountability. Aldershot, England: Ashgate Publishing Limited; 2007.


16. Donaldson L. An Organisation with a Memory: Report of an Expert Group on Learning from Adverse Events in the NHS Chaired by the Chief Medical Officer. London: The Stationery Office; 2000.


17. Farley DO, Ridgely MS, Mendel P, et al. Assessing Patient Safety Practices and Outcomes in the U.S. Health Care System. Santa Monica, CA: RAND Corporation; 2009.


18. Frankel A, Leonard M, Simmonds T, et al., eds. The Essential Guide for Patient Safety Officers. Oakbrook Terrace, IL: Joint Commission on Accreditation of Healthcare Organizations and Institute for Healthcare Improvement; 2009.


19. Gawande A. Complications: A Surgeon’s Notes on an Imperfect Science. New York, NY: Metropolitan Books; 2002.


20. Gawande A. Better: A Surgeon’s Notes on Performance. New York, NY: Metropolitan Books; 2007.


21. Gawande A. The Checklist Manifesto: How to Get Things Right. New York, NY: Metropolitan Books; 2009.


22. Gibson R, Singh JP. Wall of Silence: The Untold Story of the Medical Mistakes that Kill and Injure Millions of Americans. Washington, DC: Lifeline; 2003.


23. Gosbee JW, Gosbee LL, eds. Using Human Factors Engineering to Improve Patient Safety. 2nd ed. Oakbrook Terrace, IL: Joint Commission Resources; 2010.


24. Griffin FA, Resar RK. IHI Global Trigger Tool for Measuring Adverse Events. 2nd ed. IHI Innovation Series White Paper. Cambridge, MA: Institute for Healthcare Improvement; 2009.


25. Groopman J. How Doctors Think. Boston, MA: Houghton Mifflin; 2007.


26. Helmreich RL, Merritt AC. Culture at Work in Aviation and Medicine: National, Organizational, and Professional Influences. Aldershot, Hampshire, UK: Ashgate; 1998.


27. Hughes RG, ed. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville, MD: Agency for Healthcare Research and Quality; 2008. AHRQ Publication No. 08-0043.


28. Hurwitz B, Sheikh A, eds. Health Care Errors and Patient Safety. Hoboken, NJ: Wiley-Blackwell; 2009.


29. Jewell K, McGiffert L. To Err is Human—To Delay is Deadly. Austin, TX: Consumers Union; 2009.


30. Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. Cambridge, England: Cambridge University Press; 1987.


31. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2011.


32. King S. Josie’s Story. New York, NY: Atlantic Monthly Press; 2009.


33. Krause TR, Hidley J. Taking the Lead in Patient Safety: How Healthcare Leaders Influence Behavior and Create Culture. Hoboken, NJ: Wiley; 2008.


34. Langley GJ, Moen R, Nolan KM, et al. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco, CA: Jossey-Bass; 2009.


35. Levinson DR. Adverse Events in Hospitals: National Incidence Among Medicare Beneficiaries. Washington, DC: US Department of Health and Human Services, Office of the Inspector General; November 2010. Report No. OEI-06-09-00090.


36. Lucian Leape Institute at the National Patient Safety Foundation. Unmet Needs: Teaching Physicians to Provide Safe Patient Care. Boston, MA: Lucian Leape Institute at the National Patient Safety Foundation; March 2010.


37. Marx D. Whack-a-Mole: The Price We Pay for Expecting Perfection. Plano, TX: By Your Side Studios; 2009.


38. Massachusetts Coalition for the Prevention of Medical Errors. When Things Go Wrong: Responding to Adverse Events. A Consensus Statement of the Harvard Hospitals. Burlington, MA: Massachusetts Coalition for the Prevention of Medical Errors; 2006.


39. Merry A, Smith AM. Errors, Medicine, and the Law. Cambridge, England: Cambridge University Press; 2001.


40. Millenson ML. Demanding Medical Excellence: Doctors and Accountability in the Information Age. Chicago, IL: University of Chicago Press; 1997.


41. Nance JJ. Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care. Bozeman, MT: Second River Healthcare Press; 2008.


42. National Quality Forum. Safe Practices for Better Healthcare—2009 Update. Washington, DC: National Quality Forum; 2009.


43. Nemeth CP, ed. Improving Healthcare Team Communication: Building on Lessons from Aviation and Aerospace. Burlington, VT: Ashgate Publishing; 2008.


44. Norman DA. The Design of Everyday Things. New York, NY: Basic Books; 2002.


45. Paget MA. The Unity of Mistakes: A Phenomenological Interpretation of Medical Work. Philadelphia, PA: Temple University Press; 1993.


46. Paget MA. A Complex Sorrow: Reflections on Cancer and an Abbreviated Life. DeVault ML, ed. Philadelphia, PA: Temple University Press; 1993.


47. Perrow C. Normal Accidents: Living with High-Risk Technologies. With a New Afterword and a Postscript on the Y2K Problem. Princeton, NJ: Princeton University Press; 1999.


48. Pronovost P, Vohr E. Safe Patients, Smart Hospitals: How One Doctor’s Checklist Can Help Us Change Health Care from the Inside Out. New York, NY: Hudson Street Press; 2010.


49. Reason JT. Human Error. New York, NY: Cambridge University Press; 1990.


50. Reason JT. Managing the Risks of Organizational Accidents. Aldershot, Hampshire, UK: Ashgate; 1997.


51. Reason J. The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries. Farnham, Surrey, UK: Ashgate; 2008.


52. Reynard J, Reynolds J, Stevenson P. Practical Patient Safety. Oxford, UK: Oxford University Press; 2009.


53. Robins NS. The Girl Who Died Twice: Every Patient’s Nightmare: The Libby Zion Case and the Hidden Hazards of Hospitals. New York, NY: Delacorte Press; 1995.


54. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003.


55. Rosenthal MM, Sutcliffe KM, eds. Medical Error: What Do We Know? What Do We Do? San Francisco, CA: John Wiley & Sons; 2002.


56. Rozovsky FA, Woods JR Jr, eds. The Handbook of Patient Safety Compliance: A Practical Guide for Health Care Organizations. San Francisco, CA: Jossey-Bass; 2005.


57. Sagan SD. The Limits of Safety: Organizations, Accidents and Nuclear Weapons. Princeton, NJ: Princeton University Press; 1993.


58. Sanders L. Every Patient Tells A Story: Medical Mysteries and the Art of Diagnosis. New York, NY: Broadway Books; 2009.


59. Schuster PM, Nykolyn L. Communication for Nurses: How to Prevent Harmful Events and Promote Patient Safety. Philadelphia, PA: F.A. Davis Company; 2010.


60. Scobie S, Thomson R. Building a Memory: Preventing Harm, Reducing Risks and Improving Patient Safety. London, England: National Patient Safety Agency; 2005.


61. Sharpe VA, Faden AI. Medical Harm: Historical, Conceptual, and Ethical Dimensions of Iatrogenic Illness. New York, NY: Cambridge University Press; 1998.


62. *Shekelle PG, Pronovost PJ, Wachter RM, et al.; PSP Technical Expert Panel. Assessing the Evidence for Context-Sensitive Effectiveness and Safety of Patient Safety Practices: Developing Criteria. Rockville, MD: Agency for Healthcare Research and Quality; December 2010. AHRQ Publication No. 11-0006-EF.


63. *Shojania KG, Duncan BW, McDonald KM, Wachter RM, eds. Making Health Care Safer: A Critical Analysis of Patient Safety Practices. Evidence Report/Technology Assessment No. 43. Rockville, MD: Agency for Healthcare Research and Quality; July 2001. AHRQ Publication No. 01-E058. Available at: http://www.ahrq.gov/clinic/ptsafety/.


64. Spath P, ed. Engaging Patients as Safety Partners. Chicago, IL: AHA Press; 2008.


65. Spath PL. Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. 2nd ed. San Francisco, CA: Jossey-Bass; 2011.


66. Stewart JB. Blind Eye: How the Medical Establishment Let a Doctor Get Away with Murder. New York, NY: Simon & Schuster; 1999.


67. Tenner E. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York, NY: A.A. Knopf; 1996.


68. Truog RD, Browning DM, Johnson JA, et al. Talking with Patients and Families about Medical Error: A Guide for Education and Practice. Baltimore, MD: Johns Hopkins University Press; 2011.


69. Ulmer C, Wolman DM, Johns MME, eds. Resident Duty Hours: Enhancing Sleep, Supervision, and Safety. Committee on Optimizing Graduate Medical Trainee (Resident) Hours and Work Schedule to Improve Patient Safety, Institute of Medicine. Washington, DC: National Academies Press; 2008.


70. Vance JE. A Guide to Patient Safety in the Medical Practice. Chicago, IL: American Medical Association; 2008.


71. Vaughan D. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago, IL: University of Chicago Press; 1997.


72. Vincent C. Patient Safety. 2nd ed. West Sussex, UK: Wiley-Blackwell; 2010.


73. *Wachter RM, Shojania KG. Internal Bleeding: The Truth Behind America’s Terrifying Epidemic of Medical Mistakes. New York, NY: Rugged Land; 2004.


74. Weick KE. Sensemaking in Organizations. Thousand Oaks, CA: Sage Publications; 1995.


75. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. 2nd ed. San Francisco, CA: John Wiley & Sons; 2007.


76. Wiener EL, Kanki BG, Helmreich RL. Cockpit Resource Management. San Diego, CA: Academic Press; 1993.


77. Woods DD, Dekker S, Cook R, et al. Behind Human Error. 2nd ed. Burlington, VT: Ashgate; 2010.


78. Wu AW, ed. The Value of Close Calls in Improving Patient Safety. Oakbrook Terrace, IL: Joint Commission Resources; 2011.


79. Wu HW, Nishimi RY, Page-Lopez CM, et al. Improving Patient Safety Through Informed Consent for Patients with Limited Health Literacy. Washington, DC: National Quality Forum; 2005.


80. Youngberg BJ, ed. Principles of Risk Management and Patient Safety. Sudbury, MA: Jones and Bartlett; 2011.






The Institute of Medicine (IOM) Reports on Medical Errors and Healthcare Quality (From its “Quality Chasm” Series)








1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academy Press; 2000.


2. Committee on Quality of Health Care in America, IOM. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.


3. Page A, ed. Keeping Patients Safe: Transforming the Work Environment of Nurses. Committee on the Work Environment for Nurses and Patient Safety, Board on Health Care Services. Washington, DC: National Academy Press; 2004.


4. Aspden P, Corrigan JM, Wolcott J, et al. Patient Safety: Achieving a New Standard for Care. Washington, DC: National Academy Press; 2004.


5. Nielsen-Bohlman L, Panzer AM, Kindig DA, eds. Health Literacy: A Prescription to End Confusion. Institute of Medicine Committee on Health Literacy. Washington, DC: National Academy Press; 2004.


6. Aspden P, Wolcott J, Bootman JL, et al., eds. Preventing Medication Errors. Committee on Identifying and Preventing Medication Errors. Washington, DC: National Academy Press; 2007.






Quality Grand Rounds Series, Annals of Internal Medicine*








1. *Wachter RM, Shojania KG, Saint S, et al. Learning from our mistakes: quality grand rounds, a new case-based series on medical errors and patient safety. Ann Intern Med 2002;136:850–852.


2. Chassin MR, Becher EC. The wrong patient. Ann Intern Med 2002;136:826–833.  [PubMed: 12044131]


3. Bates DW. Unexpected hypoglycemia in a critically ill patient. Ann Intern Med 2002;137:110–116.  [PubMed: 12118966]


4. Hofer TP, Hayward RA. Are bad outcomes from questionable clinical decisions preventable medical errors? A case of cascade iatrogenesis. Ann Intern Med 2002;137:327–333.  [PubMed: 12204016]


5. Gerberding JL. Hospital-onset infections: a patient safety issue. Ann Intern Med 2002;137:665–670.  [PubMed: 12379067]


6. Cleary PD. A hospitalization from hell: a patient’s perspective on quality. Ann Intern Med 2003;138:33–39.  [PubMed: 12513042]


7. Lynn J, Goldstein NE. Advance care planning for fatal chronic illness: avoiding commonplace errors and unwarranted suffering. Ann Intern Med 2003;138:812–818.  [PubMed: 12755553]


8. Brennan TA, Mello MM. Patient safety and medical malpractice: a case study. Ann Intern Med 2003;139:267–273.  [PubMed: 12965982]


9. Goldman L, Kirtane AJ. Triage of patients with acute chest pain and possible cardiac ischemia: the elusive search for diagnostic perfection. Ann Intern Med 2003;139:987–995.  [PubMed: 14678918]


10. Pronovost PJ, Wu AW, Sexton JB. Acute decompensation after removing a central line: practical approaches to increasing safety in the intensive care unit. Ann Intern Med 2004;140:1025–1033.  [PubMed: 15197020]


11. Redelmeier DA. Improving patient care. The cognitive psychology of missed diagnoses. Ann Intern Med 2005;142:115–120.  [PubMed: 15657159]


12. Gandhi TK. Fumbled hand-offs: one dropped ball after another. Ann Intern Med 2005;142:352–358.  [PubMed: 15738454]


13. McDonald CJ. Computerization can create safety hazards: a bar-coding near miss. Ann Intern Med 2006;144:510–516.  [PubMed: 16585665]


14. Shojania KG, Fletcher KE, Saint S. Graduate medical education and patient safety: a busy—and occasionally hazardous—intersection. Ann Intern Med 2006;145:592–598.  [PubMed: 17043341]


15. *Wachter RM, Shojania KG, Markowitz AJ, et al. Quality grand rounds: the case for patient safety. Ann Intern Med 2006;145:629–630.






Selected Theme Issues on Medical Errors








Focus on computerized provider order systems. J Am Med Inform Assoc 2007;14:25–75. Available at: http://www.jamia.org/content/vol14/issue1/.


Profiles in patient safety. Case-based series of articles. Acad Emerg Med.


Theme issue on medical error. BMJ 2000;320(7237).


Theme issue on medical error. Eff Clin Pract 2000. Available at: http://www.acponline.org/journals/ecp/pastiss/nd00.htm.


Theme issue: contributions from ergonomics and human factors. Qual Saf Health Care 2010;19(suppl 3):i1–i79.


Theme issue: diagnostic error in medicine. Berner ES, Graber ML, eds. Adv Health Sci Educ Theory Pract 2009;14(suppl 1):1–112.


Theme issue: human factors and ergonomics in patient safety. Carayon P, Buckle P, eds. Appl Ergon 2010;41:643–718.


Theme issue: knowledge for improvement. BMJ Qual Saf 2011;20(suppl 1):1–105.


Theme issue: medical malpractice and errors. Health Aff (Millwood) 2010;29:1564–1619.


Theme issue: new approaches to researching patient safety. Iedema R, ed. Soc Sci Med 2009;69:1701–1783.


Theme issue: nurses transforming care. Am J Nurs 2009;109(suppl 11):3–80, C3.


Theme issue: quality and safety in medicine. Nash DB, Goldfarb NI, Patow C, eds. Acad Med 2009;84:1641–1846.


Theme issue: safety by design. Qual Saf Health Care December 2006;15(suppl 1):i1–i90.


Theme issue: safety. Simmons D, ed. Crit Care Nurs Clin North Am 2010;22:161–290.


Theme issue: special issue on health information technology. J Gen Intern Med 2008;23:353–507.


Theme issue: still crossing the quality chasm. Health Aff (Millwood) 2011;30:554–800.






Selected Websites on Medical Errors








Agency for Healthcare Research and Quality (AHRQ). Patient Safety & Medical Errors. Available at: http://ahrq.gov/qual/errorsix.htm.


*AHRQ Patient Safety Network (PSNet). Available at: http://www.psnet.ahrq.gov.


*AHRQ WebM&M: Morbidity and Mortality Rounds on the Web. Available at: http://www.webmm.ahrq.gov/.


AHRQ Patient Safety Organizations. Available at: http://www.pso.ahrq.gov/index.html.


AHRQ Health Care Innovations Exchange. Available at: http://www.innovations.ahrq.gov/.


American College of Surgeons. National Surgical Quality Improvement Program (NSQIP). Available at: http://www.acsnsqip.org/.


American Hospital Association Patient Safety Center. Available at: http://www.ahaqualitycenter.org/ahaqualitycenter/dimQualityServlet?keywordId=3.


Association for Professionals in Infection Control and Epidemiology. PreventInfection.org. Available at: http://www.preventinfection.org//AM/Template.cfm?Section=Home4.


FDA Patient Safety News.


Institute for Healthcare Improvement (IHI). Available at: http://www.ihi.org.


Institute for Safe Medication Practices. Available at: http://www.ismp.org/.


Joint Commission. Available at: http://www.jointcommission.org.


Joint Commission Center for Transforming Healthcare. Available at: http://www.centerfortransforminghealthcare.org/.


Joint Commission National Patient Safety Goals: Available at: http://www.jointcommission.org/standards_information/npsgs.aspx.


Leapfrog Group for Patient Safety. Available at: http://www.leapfroggroup.org/.


National Patient Safety Agency (United Kingdom). Available at: http://www.npsa.nhs.uk/.


National Patient Safety Agency (United Kingdom). Safer Healthcare. Available at: http://www.saferhealthcare.org.uk/ihi.


National Patient Safety Foundation. Available at: http://www.npsf.org/.


National Quality Forum. Available at: http://www.qualityforum.org.


US Department of Health and Human Services. Partnership for Patients. Available at: http://www.healthcare.gov/center/programs/partnership/index.html.


World Health Organization (WHO) Patient Safety.


WHO Patient Safety Human Factors Web Site.






*Edited or written by Robert M. Wachter.






Appendix II. The AHRQ Patient Safety Network (AHRQ PSNet) Glossary of Selected Terms in Patient Safety





Active error (or active failure)—The terms active and latent as applied to errors were coined by James Reason. Active errors occur at the point of contact between a human and some aspect of a larger system (e.g., a human–machine interface). They are generally readily apparent (e.g., pushing an incorrect button, ignoring a warning light) and almost always involve someone at the frontline. Active failures are sometimes referred to as errors at the sharp end, figuratively referring to a scalpel. In other words, errors at the sharp end are noticed first because they are committed by the person closest to the patient. This person may literally be holding a scalpel (e.g., an orthopedist operating on the wrong leg) or figuratively be administering any kind of therapy (e.g., a nurse programming an intravenous pump) or performing any aspect of care. Latent errors (or latent conditions), in contrast, refer to less apparent failures of organization or design that contributed to the occurrence of errors or allowed them to cause harm to patients. To complete the metaphor, latent errors are those at the other end of the scalpel—the blunt end—referring to the many layers of the healthcare system that affect the person “holding” the scalpel.






Adverse drug event (ADE)—An adverse event (i.e., injury resulting from medical care) involving medication use.






Examples:







  • Anaphylaxis to penicillin
  • Major hemorrhage from heparin
  • Aminoglycoside-induced renal failure
  • Agranulocytosis from chloramphenicol






As with the more general term adverse event, the occurrence of an ADE does not necessarily indicate an error or poor quality of care. ADEs that involve an element of error (of either omission or commission) are often referred to as preventable ADEs. Medication errors that reached the patient but by good fortune did not cause any harm are often called potential ADEs. For instance, a serious allergic reaction to penicillin in a patient with no prior such history is an ADE, but so is the same reaction in a patient who has a known allergy history but receives penicillin due to a prescribing oversight. The former occurrence would count as an adverse drug reaction or nonpreventable ADE, while the latter would represent a preventable ADE. If a patient with a documented serious penicillin allergy received a penicillin-like antibiotic but happened not to react to it, this event would be characterized as a potential ADE.






An ameliorable ADE is one in which the patient experienced harm from a medication that, while not completely preventable, could have been mitigated. For instance, a patient taking a cholesterol-lowering agent (statin) may develop muscle pains and eventually progress to a more serious condition called rhabdomyolysis. Failure to periodically check a blood test that assesses muscle damage or failure to recognize this possible diagnosis in a patient taking statins who subsequently develops rhabdomyolysis would make this event an ameliorable ADE: harm from medical care that could have been lessened with earlier, appropriate management. Again, the initial development of some problem was not preventable, but the eventual harm that occurred need not have been so severe, hence the term ameliorable ADE.






Adverse event—Any injury caused by medical care.






Examples:







  • Pneumothorax from central venous catheter placement
  • Anaphylaxis to penicillin
  • Postoperative wound infection
  • Hospital-acquired delirium (or “sundowning”) in elderly patients






Identifying something as an adverse event does not imply “error,” “negligence,” or poor quality care. It simply indicates that an undesirable clinical outcome resulted from some aspect of diagnosis or therapy, not an underlying disease process. Thus, pneumothorax from central venous catheter placement counts as an adverse event regardless of insertion technique. Similarly, a postoperative wound infection counts as an adverse event even if the operation proceeded with optimal adherence to sterile procedures, the patient received appropriate antibiotic prophylaxis in the perioperative setting, and so on. (See also “iatrogenic”).






Anchoring error (or bias)—Refers to the common cognitive trap of allowing first impressions to exert undue influence on the diagnostic process. Clinicians often latch on to features of a patient’s presentation that suggest a specific diagnosis. Often, this initial diagnostic impression will prove correct, hence the use of the phrase anchoring heuristic in some contexts, as it can be a useful rule of thumb to “always trust your first impressions.” However, in some cases, subsequent developments in the patient’s course will prove inconsistent with the first impression. Anchoring bias refers to the tendency to hold on to the initial diagnosis, even in the face of disconfirming evidence.






Authority gradient—Refers to the balance of decision-making power or the steepness of command hierarchy in a given situation. Members of a crew or organization with a domineering, overbearing, or dictatorial team leader experience a steep authority gradient. Expressing concerns, questioning, or even simply clarifying instructions would require considerable determination on the part of team members who perceive their input as devalued or frankly unwelcome. Most teams require some degree of authority gradient; otherwise roles are blurred and decisions cannot be made in a timely fashion. However, effective team leaders consciously establish a command hierarchy appropriate to the training and experience of team members.






Authority gradients may occur even when the notion of a team is less well defined. For instance, a pharmacist calling a physician to clarify an order may encounter a steep authority gradient, based on the tone of the physician’s voice or a lack of openness to input from the pharmacist. A confident, experienced pharmacist may nonetheless continue to raise legitimate concerns about an order, but other pharmacists might not.






Availability bias (or heuristic)—Refers to the tendency to assume, when judging probabilities or predicting outcomes, that the first possibility that comes to mind (i.e., the most cognitively “available” possibility) is also the most likely possibility. For instance, suppose a patient presents with intermittent episodes of very high blood pressure. Because episodic hypertension resembles textbook descriptions of pheochromocytoma, a memorable but uncommon endocrinologic tumor, this diagnosis may immediately come to mind. A clinician who infers from this immediate association that pheochromocytoma is the most likely diagnosis would be exhibiting availability bias. In addition to resemblance to classic descriptions of disease, personal experience can also trigger availability bias, as when the diagnosis underlying a recent patient’s presentation immediately comes to mind when any subsequent patient presents with similar symptoms. Particularly memorable cases may similarly exert undue influence in shaping diagnostic impressions.






Bayesian approach—Probabilistic reasoning in which test results (not just laboratory investigations but also history, physical exam, or any other aspect of the diagnostic process) are combined with prior beliefs about the probability of a particular disease. One way of appreciating the need for a Bayesian approach is to recognize the difference between the performance of a test in a population and its performance in an individual. At the population level, we can say that a test has a sensitivity and specificity of, say, 90%—that is, 90% of patients with the condition of interest have a positive result and 90% of patients without the condition have a negative result. In practice, however, a clinician needs to predict whether an individual patient with a positive or negative result does or does not have the condition of interest. This prediction requires combining the observed test result not just with the known sensitivity and specificity but also with the chance the patient could have had the disease in the first place (based on demographic factors, findings on exam, or general clinical gestalt).
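A worked example may make the arithmetic concrete. Take the test above, with 90% sensitivity and 90% specificity, and assume, purely for illustration, a pretest probability of disease of 10%. Bayes' theorem then gives the posttest (posterior) probability after a positive result:

```latex
% Posterior probability of disease D after a positive test (+),
% with sensitivity 0.90, specificity 0.90, and an assumed prior of 0.10:
\[
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \bar{D})\,P(\bar{D})}
            = \frac{0.90 \times 0.10}{0.90 \times 0.10 + 0.10 \times 0.90}
            = 0.50
\]
```

Even though the test is "90% accurate," a positive result raises the probability of disease only to 50% at this prior, which is why the same result must be interpreted differently in patients with different pretest probabilities.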






Benchmark—A benchmark in healthcare refers to an attribute or achievement that serves as a standard for other providers or institutions to emulate. Benchmarks differ from other standard of care goals, in that they derive from empiric data—specifically, performance or outcomes data. For example, a statewide survey might produce risk-adjusted 30-day rates for death or other major adverse outcomes. After adjusting for relevant clinical factors, the top 10% of hospitals can be identified in terms of particular outcome measures. These institutions would then provide benchmark data on these outcomes. For instance, one might benchmark “door-to-balloon” time at 90 minutes, based on the observation that the top-performing hospitals all had door-to-balloon times in this range. In regard to infection control, benchmarks would typically be derived from national or regional data on the rates of relevant nosocomial infections. The lowest 10% of these rates might be regarded as benchmarks for other institutions to emulate.
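As a sketch of how such a benchmark might be derived, the fragment below takes hypothetical risk-adjusted infection rates from 20 hospitals and sets the benchmark at the best (lowest) decile. The data and the use of numpy are assumptions for illustration, not part of any standard.

```python
# Deriving a benchmark from performance data: the best decile of
# hypothetical risk-adjusted infection rates (lower is better).
import numpy as np

# Illustrative rates per 1000 device-days at 20 hospitals (made-up data)
rates = np.array([1.2, 3.4, 0.8, 2.1, 4.0, 1.5, 2.8, 0.9, 3.1, 2.2,
                  1.1, 2.6, 3.8, 1.7, 2.4, 0.7, 2.9, 3.3, 1.9, 2.0])

benchmark = np.percentile(rates, 10)  # 10th percentile = best decile here
print(f"Benchmark infection rate: {benchmark:.2f} per 1000 device-days")
```

Hospitals at or below this value would serve as the benchmark institutions for others to emulate.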






Blunt end—The blunt end refers to the many layers of the healthcare system not in direct contact with patients, but which influence the personnel and equipment at the sharp end who do contact patients. The blunt end thus consists of those who set policy, manage healthcare institutions, and design medical devices, and other people and forces, which, though removed in time and space from direct patient care, nonetheless affect how care is delivered. Thus, an error programming an intravenous pump would represent a problem at the sharp end, while the institution’s decision to use multiple different types of infusion pumps, making programming errors more likely, would represent a problem at the blunt end. The terminology of “sharp” and “blunt” ends corresponds roughly to active failures and latent conditions.






Checklist—Algorithmic listing of actions to be performed in a given clinical setting (e.g., advanced cardiac life support [ACLS] protocols for treating cardiac arrest) to ensure that, no matter how often the task is performed by a given practitioner, no step will be forgotten. An analogy is often made to flight preparation in aviation, as pilots and air traffic controllers follow pretakeoff checklists regardless of how many times they have carried out the tasks involved.






Clinical decision support system (CDSS)—Any system designed to improve clinical decision making related to diagnostic or therapeutic processes of care. Typically a decision support system responds to “triggers” or “flags”—specific diagnoses, laboratory results, medication choices, or complex combinations of such parameters—and provides information or recommendations directly relevant to a specific patient encounter.






CDSSs address activities ranging from the selection of drugs (e.g., the optimal antibiotic choice given specific microbiologic data) or diagnostic tests to detailed support for optimal drug dosing and support for resolving diagnostic dilemmas. Structured antibiotic order forms represent a common example of paper-based CDSSs. Although such systems are still commonly encountered, many people equate CDSSs with computerized systems in which software algorithms generate patient-specific recommendations by matching characteristics, such as age, renal function, or allergy history, with rules in a computerized knowledge base.






The distinction between decision support and simple reminders can be unclear, but usually reminder systems are included as decision support if they involve patient-specific information. For instance, a generic reminder (e.g., “Did you obtain an allergy history?”) would not be considered decision support, but a warning (e.g., “This patient is allergic to codeine.”) that appears at the time of entering an order for codeine would be.
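The distinction just drawn, a generic reminder versus a patient-specific warning, can be sketched in a few lines of code. The sketch below is hypothetical and not drawn from any actual CDSS product; the rule table and function names are invented for illustration.

```python
# Minimal sketch of a patient-specific CDSS rule: the "trigger" is the
# order-entry event, and the alert fires only when the order conflicts
# with this patient's documented allergies. All names are hypothetical.

ALLERGY_RULES = {
    # documented allergy -> drugs that should trigger a warning (illustrative)
    "codeine": {"codeine"},
    "penicillin": {"penicillin", "amoxicillin"},
}

def allergy_alerts(ordered_drug: str, patient_allergies: set[str]) -> list[str]:
    """Return patient-specific warnings for a new medication order."""
    alerts = []
    for allergy in patient_allergies:
        if ordered_drug in ALLERGY_RULES.get(allergy, {allergy}):
            alerts.append(f"This patient is allergic to {allergy}.")
    return alerts

# A generic reminder ("Did you obtain an allergy history?") uses no patient
# data and so would not count as decision support; this alert does, because
# it combines the order with the individual patient's chart.
print(allergy_alerts("codeine", {"codeine"}))
# ['This patient is allergic to codeine.']
```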






Close call—An event or situation that did not produce patient injury, but only because of chance. This good fortune might reflect robustness of the patient (e.g., a patient with penicillin allergy receives penicillin, but has no reaction) or a fortuitous, timely intervention (e.g., a nurse happens to realize that a physician wrote an order in the wrong chart). Such events have also been termed near miss incidents.






Competency—Having the necessary knowledge or technical skill to perform a given procedure within the bounds of success and failure rates deemed compatible with acceptable care. The medical education literature often refers to core competencies, which include not just technical skills with respect to procedures or medical knowledge but also competencies with respect to communicating with patients, collaborating with other members of the healthcare team, and acting as a manager or agent for change in the health system.






Complexity science (or complexity theory)—Provides an approach to understanding the behavior of systems that exhibit nonlinear dynamics, or the ways in which some adaptive systems produce novel behavior not expected from the properties of their individual components. Such behaviors emerge as a result of interactions between agents at a local level in the complex system and between the system and its environment.






Complexity theory differs importantly from systems thinking in its emphasis on the interaction between local systems and their environment (such as the larger system in which a given hospital or clinic operates). It is often tempting to ignore the larger environment as unchangeable and therefore outside the scope of quality improvement or patient safety activities. According to complexity theory, however, behavior within a hospital or clinic (e.g., noncompliance with a national practice guideline) can often be understood only by identifying interactions between local attributes and environmental factors.






Computerized provider order entry or computerized physician order entry (CPOE)—Refers to a computer-based system of ordering medications and often other tests. Physicians (or other providers) directly enter orders into a computer system that can have varying levels of sophistication. Basic CPOE ensures standardized, legible, complete orders, and thus primarily reduces errors caused by poor handwriting and ambiguous abbreviations.






Almost all CPOE systems offer some additional capabilities, which fall under the general rubric of CDSS. Typical CDSS features involve suggested default values for drug doses, routes of administration, or frequency. More sophisticated CDSSs can perform drug allergy checks (e.g., the user orders ceftriaxone and a warning flashes that the patient has a documented penicillin allergy), drug-laboratory value checks (e.g., initiating an order for gentamicin prompts the system to alert you to the patient’s last creatinine), drug–drug interaction checks, and so on. At the highest level of sophistication, CDSS prevents not only errors of commission (e.g., ordering a drug in excessive doses or in the setting of a serious allergy) but also errors of omission. For example, an alert may appear such as, “You have ordered heparin; would you like to order a partial thromboplastin time (PTT) in 6 hours?” Or, even more sophisticated: “The admitting diagnosis is hip fracture; would you like to order heparin for deep vein thrombosis (DVT) prophylaxis?” See also “Clinical decision support system.”
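The omission-side ("corollary order") prompts described above follow the same trigger-and-rule pattern. The sketch below, with an invented rule table, is only meant to show the shape of such logic, not any real CPOE system's behavior.

```python
# Sketch of "corollary order" prompts: entering one order triggers a
# suggestion for a companion action the clinician may have omitted.
# The rule table and function are hypothetical, for illustration only.

COROLLARY_PROMPTS = {
    "heparin": "You have ordered heparin; would you like to order a PTT in 6 hours?",
    "gentamicin": "Consider checking the patient's last creatinine before dosing.",
}

def omission_prompt(new_order: str) -> str | None:
    """Return a suggestion if the new order has a recommended companion step."""
    return COROLLARY_PROMPTS.get(new_order.lower())

print(omission_prompt("heparin"))
# You have ordered heparin; would you like to order a PTT in 6 hours?
```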






Confirmation bias—Refers to the tendency to focus on evidence that supports a working hypothesis, such as a diagnosis in clinical medicine, rather than to look for evidence that refutes it or provides greater support to an alternative diagnosis. Suppose that a 65-year-old man with a past history of angina presents to the emergency department with acute onset of shortness of breath. The physician immediately considers the possibility of cardiac ischemia, so asks the patient if he has experienced any chest pain. The patient replies affirmatively. Because the physician perceives this answer as confirming his working diagnosis, he does not ask if the chest pain was pleuritic in nature, which would decrease the likelihood of an acute coronary syndrome and increase the likelihood of pulmonary embolism (a reasonable alternative diagnosis for acute shortness of breath accompanied by chest pain). The physician then orders an ECG and cardiac troponin. The ECG shows nonspecific ST changes and the troponin returns slightly elevated.






Of course, ordering an ECG and testing cardiac enzymes is appropriate in the work-up of acute shortness of breath, especially when it is accompanied by chest pain and in a patient with known angina. The problem is that these tests may be misleading, since positive results are consistent not only with acute coronary syndrome but also with pulmonary embolism. To avoid confirmation bias in this case, the physician might have obtained an arterial blood gas or a D-dimer level. Abnormal results for either of these tests would be relatively unlikely to occur in a patient with an acute coronary syndrome (unless complicated by pulmonary edema), but likely to occur with pulmonary embolism. These results could be followed up by more direct testing for pulmonary embolism (e.g., with a helical CT scan of the chest), while normal results would allow the clinician to proceed with greater confidence down the road of investigating and managing cardiac ischemia.






This vignette was presented as if information were sought in sequence. In many cases, especially in acute care medicine, clinicians have the results of numerous tests in hand when they first meet a patient. The results of these tests often do not all suggest the same diagnosis. The appeal of accentuating confirmatory test results and ignoring nonconfirmatory ones is that it minimizes cognitive dissonance.






A related cognitive trap that may accompany confirmation bias and compound the possibility of error is “anchoring bias”—the tendency to stick with one’s first impressions, even in the face of significant disconfirming evidence.






Crew resource management (CRM)—Also called crisis resource management in some contexts (e.g., anesthesia), encompasses a range of approaches to training groups to function as teams, rather than as collections of individuals. Originally developed in aviation, CRM emphasizes the role of human factors—the effects of fatigue, expected or predictable perceptual errors (such as misreading monitors or mishearing instructions), as well as the impact of different management styles and organizational cultures in high-stress, high-risk environments. CRM training develops communication skills, fosters a more cohesive environment among team members, and creates an atmosphere in which junior personnel will feel free to speak up when they think that something is amiss. Some CRM programs emphasize education on the settings in which errors occur and the aspects of team decision making conducive to “trapping” errors before they cause harm. Other programs may provide more hands-on training involving simulated crisis scenarios followed by debriefing sessions in which participants assess their own and others’ behavior.






Critical incidents—A term made famous by a classic human factors study by Jeffrey Cooper of “anesthetic mishaps,” though the term had first been coined in the 1950s. Cooper and colleagues brought the technique of critical incident analysis to a wide audience in healthcare but followed the definition of the originator of the technique. They defined critical incidents as occurrences that are “significant or pivotal, in either a desirable or an undesirable way,” though Cooper and colleagues (and most others since) chose to focus on incidents that had potentially undesirable consequences. This concept is best understood in the context of the type of investigation that follows, which is very much in the style of root cause analysis. Thus, significant or pivotal means that there was significant potential for harm (or actual harm), but also that the event has the potential to reveal important hazards in the organization. In many ways, it embodies the expression in quality improvement circles that “every defect is a treasure.” In other words, these incidents, whether near misses or disasters in which significant harm occurred, provide valuable opportunities to learn about individual and organizational factors that can be remedied to prevent similar incidents in the future.






Decision support—Refers to any system for advising or providing guidance about a particular clinical decision at the point of care. For example, a copy of an algorithm for antibiotic selection in patients with community-acquired pneumonia would count as clinical decision support if made available at the point of care. Increasingly, decision support occurs via a computerized clinical information or order entry system. Computerized decision support includes any software employing a knowledge base designed to assist clinicians in decision making at the point of care.






Typically a decision support system responds to “triggers” or “flags”—specific diagnoses, laboratory results, medication choices, or complex combinations of such parameters—and provides information or recommendations directly relevant to a specific patient encounter. For instance, ordering an aminoglycoside for a patient with creatinine above a certain value might trigger a message suggesting a dose adjustment based on the patient’s decreased renal function.






Error—An act of commission (doing something wrong) or omission (failing to do the right thing) that leads to an undesirable outcome or significant potential for such an outcome. For instance, ordering a medication for a patient with a documented allergy to that medication would be an act of commission. Failing to prescribe a proven medication with major benefits for an eligible patient (e.g., low-dose unfractionated heparin as venous thromboembolism prophylaxis for a patient after hip replacement surgery) would represent an error of omission.



