The health care industry needs to be more honest about medical errors.
Twenty years ago this fall, the Institute of Medicine—a U.S.-based independent, nongovernmental organization widely regarded as an authority at the intersection of medicine and society—released a report titled “To Err Is Human.” It estimated that up to 98,000 Americans were dying each year from medical errors.
Official and popular reaction was swift. Congress mandated the monitoring of progress in efforts to prevent patient harm, and the health care industry set grand goals, such as reducing medical errors by 50% within five years. News outlets covered these developments closely. A remedy for a longstanding problem seemed in sight.
Yet, in 2019, medical errors are about as prevalent as they were in 1999. “To Err Is Human” was an unsettling read; so is a September 2019 report on patient safety from the World Health Organization. Among the WHO’s findings: Globally, hospital-acquired infections afflict about 10% of hospitalized patients. Medical errors harm some 40% of patients in primary and outpatient care. Diagnostic and medication errors harm millions of people and cost billions of dollars every year.
So, two decades on, why this chronic state of risk in health care?
The chain reaction to the 1999 report spent its energy quickly. Contrary to the report’s calls for expertise from outside the medical profession, patient safety was taken over by clinician managers and other health care administrators, whose interests would hardly have been served by the kind of thorough reckoning with the crisis that would have rattled the status quo. These institutional leaders also brushed off experts (psychologists, sociologists, and organizational behaviorists, among others) who have long offered innovative ideas for improving safety and reducing health care mishaps.
The medical managers had ideas, too, but those amounted to weak, localized prescriptions: safety checklists, hand-sanitizing stations, posters promoting “a culture of safety,” and programs inviting low-level staff members to speak their minds to their supervisors. Absent were innovations aimed at broader classes of hazards that lie beyond the reach of even large, multi-hospital systems, such as look-alike, sound-alike drugs and confusing, error-inducing technology interfaces.
Look-alike, sound-alike drugs are medications whose names are spelled or pronounced similarly, or whose physical appearance or packaging looks alike. Mix-ups between epinephrine and ephedrine, for example, have caused serious patient harm. The drug names look similar, and the vials are sometimes stored close to each other, yet each drug has a different purpose and can have serious, even deadly, adverse effects if administered incorrectly. Error-inducing technology interfaces arise when simple connecting devices fit multiple tubes, outlets, or machines, increasing the possibility of misconnections, as when a feeding tube is mistakenly coupled to a line that enters a vein, or an IV line is inadvertently connected to nasal oxygen tubing.
Patient safety can be tricky to define because it is essentially a non-event: when things are going well, no one wonders why. When a mistake occurs and threatens the unrealistic “getting to zero” goal of many health care managers, it becomes an event that demands a reaction. And the reaction generally is to assign blame to people further down the organizational ladder.
It’s far easier, after all, for the industry to fault individual workers on the front lines of medical care than to scrutinize inherent organizational and systemic flaws, or to point the finger at highly paid specialist physicians. The current approach to patient safety fixates on who did wrong and how they did wrong. That focus is misplaced: it should be on what’s going right and what lessons can be learned from those successes.
This is how health care organizations and the industry as a whole avoid dealing with the troubling task of identifying root causes of the patient-safety problem. Meanwhile, the public is assured there is little to fear (and little need for external intervention), because, after all, health care professionals are on the job.
But the clinician leaders and hospital administrators in charge of the industry need to realize that health care, including its patient safety component, is too big and too complex to be steered by medical professionals alone. We live in an era of multifaceted problems that call for multidisciplinary approaches. Advances in anesthesia safety, for example, would not have come without the input of engineers. Experts with perspectives from outside medicine should be welcomed to any serious discussion of how to improve patient safety, and their insights heeded.
Let the words of human-factors engineering pioneer John Senders help guide a truly reformed patient safety movement: “Human error in medicine, and the adverse events that may follow, are problems of psychology and engineering, not of medicine.”
An important social movement seemed to emerge in the wake of “To Err Is Human,” but it has lost its way. By being bolder and more comprehensive in its goal setting, and by embracing the acumen of experts from outside the medical profession, the health care industry could make patient safety the great social movement it deserves to be.