Malignant Medicine

The New Physician December 2004
In 1999, the Institute of Medicine released its To Err Is Human report, exposing the extent of deadly
mistakes in U.S. medicine. Five years later, are patients any safer?

A 45-year-old developmentally delayed patient complained of abdominal pain. His mother brought him into the emergency room and told the attending her son had experienced similar symptoms the year before; the diagnosis then had been renal colic. The emergency room physician made the same diagnosis, treating the patient with painkillers and releasing him with a prescription for acetaminophen with codeine. The physician instructed the patient to drink at least eight glasses of water a day and gave him a strainer for his urine in case he passed a stone. The next morning, the patient was found dead in his bed. An autopsy revealed a perforated gastric ulcer and widespread peritonitis.

A pediatrician recommended a child receive a routine vaccination for hepatitis B as part of his annual school physical. The child’s father consented, and the child received an injection. When the physician recorded the lot number of the vaccine dose, he discovered the child was actually given a vaccine for hepatitis A. The only serious harm done in this case was to the relationship between the physician and the child’s family.

A hospital employee neglected to test a disinfectant solution used to clean endoscopes, and 180 patients were potentially exposed to HIV and hepatitis.

The public and the health-care industry were astonished when, in the fall of 1999, the Institute of Medicine (IOM) report To Err Is Human found that between 44,000 and 98,000 Americans die each year as a result of medical errors—mistakes similar to the ones described above. Depending on which estimate you use and how you crunch the numbers, this puts medical errors somewhere between the fifth and eighth leading cause of death in the United States, easily trumping breast cancer, traffic accidents and AIDS.

Before the IOM report, medical errors were largely unacknowledged by the industry and unknown to the general public. Now, Congress, federal and state government agencies, hospitals, patient advocacy groups and insurance providers seek ways to prevent and correct errors in medicine. Proposals have been offered, committees convened and initiatives undertaken, but has progress been made? While reform efforts abound, unfortunately, it remains unclear whether patients are any safer.


The fight to end errors in medicine is a particularly intricate and thorny problem, a lot like the fight against cancer. Cancer is not one disease, but many, with the work to develop treatments and find cures spread among thousands of researchers at hundreds of institutions worldwide. Each time a piece
of the puzzle is unearthed, other researchers take note, incorporate the new information into their work, if applicable, and carry on. The result is slow and uneven progress that's frustratingly difficult to measure. And like cancer, the causes of medical errors are rooted in a variety of sources, and no one agency or organization is responsible for addressing the problem.

The cancer analogy breaks down, however, when it comes to research. Cancer researchers have at their fingertips decades of data on every imaginable aspect of the disease in all of its manifestations. Those who are working to reduce the problem of medical errors have extremely limited research to which to refer. Beyond a few small studies and a judicious use of common sense, experts don’t know what works and what doesn’t.

Part of the challenge is that medical errors come in many forms: A nurse gives a patient an overdose of medication because she couldn’t read the physician’s handwriting; a busy or distracted internist skims a lab report and makes an incorrect diagnosis; a surgeon operates on the wrong side of a patient; an IV drip malfunctions. Overworked staff, human–device interactions that go awry, communication problems, system failures—all can be the root of errors. But pinpointing what most often causes problems, what kinds of problems are most common, and which interventions are most effective is a huge challenge.

Because medical professionals fear repercussions ranging from censure by colleagues to malpractice suits, they are often reluctant to report errors. And even when they do report them, determining if the errors were avoidable is another hurdle, says Dr. Carolyn Clancy, the director of the Agency for Healthcare Research and Quality (AHRQ), which is part of the Department of Health and Human Services and the organization at the forefront of addressing this problem. Clancy gives the following example: If an elderly woman falls and breaks a hip because of water on the floor, we know what caused it, and we know what to do about it. But if she breaks her hip sliding out of a chair, it is not so obvious what, if anything, could have been done to prevent the accident.

It can be equally difficult to track progress, and Clancy offers another example: Nosocomial infections are reported to the Centers for Disease Control and Prevention on a voluntary basis. If the reporting increases, is that because hospitals are seeing more nosocomial infections or are they just getting better at reporting them?

Any researcher will say that having good data is essential, but at the moment, there simply isn’t enough. “We are very, very early in our understanding of this problem, yet there is enormous pressure on health-care providers to act,” says Dr. Eric Thomas, the principal investigator at the University of Texas (UT) Center of Excellence for Patient Safety Research and Practice, one of several centers funded by AHRQ grants to address medical errors.

But not all physicians feel such pressure, and this may be another source of the problem. According to research published in 2002 in the New England Journal of Medicine (NEJM), 35 percent of 831 surveyed physicians reported making medical errors or having mistakes happen to family members, yet the vast majority didn’t view errors as one of the most important problems in health care today.

These responses come as little surprise to Dr. Donald Berwick, the president and chief executive officer of the Institute for Healthcare Improvement. “People still don’t believe it’s a serious problem,” he says, adding that part of the reason is statistical. “At the local level, in the daily practice of medicine, the problem doesn’t feel very big. You may have one or two deaths per month from errors. Errors happen every day, but this is accepted as normal. There is no sense of urgency about the problem. There is an illusion of safety.”

He compares this to the perception of auto travel. Most people know someone who has been in an automobile accident or have been in one themselves, but it doesn’t seem to happen all that often, so they feel safe in their cars. But nationally, approximately 40,000 Americans die in auto accidents annually, making it one of the leading causes of death in the United States, he says.

With as many as 98,000 people dying each year from medical errors, medicine can’t wait for reams of data before taking action. Those working on the problem, like AHRQ and UT’s Center of Excellence, are working to assemble useful information using what little data they have, and trying what, as Thomas puts it, “sounds like the right thing to do.”


In the late 1970s, problems with aviation safety appeared on the national radar screen in much the same way medical errors did in 1999. Aviation’s troubles were addressed, and air safety has improved tremendously. Robert Helmreich, co-principal investigator at UT’s Center of Excellence and a researcher in aviation safety, says two-thirds of aviation accidents involve failures in teamwork—“misunderstandings, dictatorial leadership, people not speaking up because they feared some kind of censure….” These are characteristics of medical culture as well, he says.

Like medicine, the aviation industry is hierarchical. “In aviation, the administration is a very, very long way away [from the front-line workers],” Helmreich says. And like medicine, it is dependent on technology and highly trained professionals working in conditions requiring alertness and good judgment.

So what worked for aviation, and how can those lessons be applied to medicine? Data is key, Helmreich says. Yet, because air disasters are blessedly rare, the best data on airline safety comes from the reporting of near misses—when accidents almost happened.

So medicine needs to focus on its close calls. Helmreich recommends placing observers in operating rooms and other key health-care areas to report the beginnings of mistakes. The essential component to this system is, of course, trust. If people are going to report close calls and allow observers in their operating rooms, then they have to be confident they will not be censured for mistakes.

Many organizations, including the American Medical Association (AMA), have been lobbying for legislation, such as the Patient Safety and Quality Improvement Act (PSQIA), that would create a voluntary reporting system for medical errors. The PSQIA would not protect from lawsuits those who report accidents or near misses; instead, it would shield their identities. It would also provide a centralized source of data and analysis, allowing hospitals and other medical institutions to see what most likely causes errors and how to make the changes necessary for prevention. The system would not make error reporting mandatory, but the legislation’s supporters say the confidential venue would encourage health-care professionals and hospitals to share information about their errors and near misses. Dr. Donald J. Palmisano, past president of the AMA, has described the legislation as “the equivalent of the Aviation Safety Reporting System applied to medicine.”

This past July, the Senate passed a version of the PSQIA similar to one the House passed in March. Congress now has to reconcile the two versions and get a presidential signature before adjourning later this month. The measure has bipartisan support and some powerful backers, including physician and Senate Majority Leader Bill Frist (R-Tenn.) and Sen. Edward Kennedy (D-Mass.).


Once more data is available, it will be much easier to see where reform is most needed. In the meantime, patient-safety experts say solutions tend to break into three categories: transforming the culture of medicine, instituting low-tech system changes and making use of high technology.

The culture problem is evident in the lack of openness and the threat of punishment in medicine that keep so many mistakes and near misses from being reported. An increasing awareness of the need for candor and open discussion is helping, but “health-care professionals do not yet feel safe” to report concerns, Clancy says. And the required culture changes go beyond simply making people comfortable enough to anonymously report incidents. Changes need to go beyond front-line workers and involve management-level physicians and staff.

Dr. Darrell Campbell, the assistant dean for clinical affairs at the University of Michigan Medical School (UM) and chief of clinical affairs at the University of Michigan Health System hospitals, is dedicated to implementing necessary culture changes. Each week, Campbell is joined by a team of UM administration officials and patient advocates in making unannounced patient-safety rounds at the university’s three hospitals. Campbell calls this technique “management by walking around.” The team meets with nurses, physicians, residents and technicians and asks them such questions as “What happened yesterday?” and “What’s the worst thing that has ever happened here?”

Staff members can speak freely without fear of repercussions, thanks to two new policies at UM hospitals designed to help ensure what Campbell calls a “blameless environment.” All employees are urged to speak up about safety concerns. The first policy protects them from being ostracized or punished for reporting a problem, and the second policy provides for full disclosure. “These are good people, and they want to do the right thing,” Campbell says. “We aim to be fully honest with every patient who suffers an error. Our legal team is supportive of this policy.” Hospital policy now allows, and even encourages, medical staff to be completely forthcoming with patients and their families when an error has been made. Doctors need not fear that they will get in trouble for telling patients too much.

Sherry Wagner, a UM University Hospital pharmacy technician, recently put the first policy to the test when she received an order for insulin she thought was too much for a baby. She immediately brought the matter to the attention of her supervisor, Diane Gaul. The concern was then shared with the physician who wrote the order. At first, the physician insisted the order was correct, but when Wagner and Gaul held firm, the physician took a second look and discovered an error. The order was changed, and Wagner suffered no ill consequences for her action. In fact, Campbell publicly commended her for her attentiveness and commitment to the safety of the young patient. “I knew I had the support of my supervisors,” Wagner says. “They are very trusting and depend on us to act when we see that a patient is in danger.”

Campbell’s approach is an attempt to turn the culture of medicine from one of “individual blame and retribution to one of systems analysis,” and he may be right on target. According to results from the 2002 NEJM survey, physicians and patients are inclined to attribute mistakes to the failures of individuals. However, a growing body of evidence strongly suggests that errors are more often the result of system failures rather than lapses of attention or judgment by individuals.

Moving the focus of attention from people to systems may provide more options for making improvements. Problems in aviation safety were addressed in just this way. System changes can be quite complex, but they usually require only simple, inexpensive low-tech solutions. “We are beginning to see more carefully designed care processes with built-in redundancies,” says Janet Corrigan, the senior board director for health-care services at the IOM.

Things as simple as putting concentrated potassium chloride, which can cause heart failure if given in too high a dose, in a locked cabinet rather than leaving it out at the nurses’ station can prevent serious accidents, UT’s Thomas says. Other low-tech measures include increasing communication between team members on different shifts, routine systems of verifying patients’ names and leaving more detailed notes on charts. As health-care workers and administrators become more aware of the problem, small changes like these are being made every day at hospitals across the nation. The results are impossible to measure, but hospital workers are certain that such changes are making a positive impact.

But the solutions getting the most attention are high-tech—and they’re neither small nor inexpensive. The two most talked about are Computerized Physician Order Entry (CPOE) and electronic medical records systems.

In a CPOE system, medication orders are entered into a computer equipped with software designed to prevent prescribing errors. CPOE is one of the few areas that have been carefully studied, according to Claire Turner, spokeswoman for The Leapfrog Group, a consortium of private and public health-care purchasers working to coordinate patient-safety initiatives. In Leapfrog-member hospitals, such systems have reduced serious prescribing errors by more than 50 percent. This is significant as AHRQ research indicates medication errors account for between 34 percent and 56 percent of all medical errors.

Electronic medical records could have an even greater impact. Because many patients see several physicians at multiple health-care facilities, it can be difficult for any one physician or hospital to get access to all the necessary information about a patient’s care—especially on short notice. “Electronic records will make a huge difference,” the IOM’s Corrigan says. “Health-care providers need complete info [about their patients’ medical history] and immediate access to it.”

Clancy agrees that information technology will make a tremendous difference, but she warns, “Simply digitizing information won’t have a big effect. We have to make sure that the information can follow the patient and be consistent from department to department and from hospital to hospital.” She likens the situation to the early days of telephone technology. A telephone didn’t do you much good unless the person you wanted to talk to also had one. She does not, however, think that centralization will be required to create the seamless communication that will be necessary for information technology to be most effective. What is essential, though, is money.

While internal systems changes can be implemented at very little cost, information technology doesn’t come cheap. And the possibility for a return on the investment, in terms of dollars—not lives—is not always clear. “A hospital may or may not benefit financially from making these changes,” Thomas says.

To ensure the changes occur, Corrigan says, “We need national leadership to put this into place…. I do expect a return on the investment; the current system has many redundancies and inefficiencies.”

And an investment may be forthcoming. In October, AHRQ awarded $96 million in grants to help communities, hospitals and health-care providers develop and better use information technology systems. AHRQ also awarded an $18.5 million contract to the University of Chicago to create a National Health Information Technology Resource Center to provide the grantees technical assistance and serve as a repository for best practice ideas. The funding is part of a Bush administration initiative announced in April that would, among other things, establish electronic medical records for most Americans over the next 10 years as well as create a national coordinator for health information technology.


Plenty of energy, great ideas and even some real dollars are now being applied to this problem. Still, the question lingers: Is any of it working? “It is my impression that the situation is probably getting worse. Hazards are the other side of the coin of technological innovation,” Berwick says, explaining how increasing the use of technological innovations also creates more opportunities for mistakes.

Others are less pessimistic. “It is an enormous step forward just to be talking about it,” Thomas says.

Helmreich agrees. “Once we are aware of the problem, we can’t put it back in the box.”
Now that changes to the medical culture and innovations, both low- and high-tech, are on the table—and in some cases, on the wards—what’s next? According to Corrigan, U.S. medicine needs to aggressively consider a redesign of payment systems. “Our current fee-for-service payment system is often an impediment to good care. We need to try some alternative systems, such as pay-for-performance,” she says.

Thomas concurs. “As a physician practicing general internal medicine, if I give high-quality care to a diabetic patient with traditional insurance, I will make less money on subsequent visits and treatments,” he says.

So, like the ongoing research into finding a cure for cancer, it may be a long time before we can be assured we are making progress in combating medical errors. But for now, we do know medical errors will no longer go unnoticed or unaddressed. In another five years, Clancy says, we’ll have more data in place, and then we can start to get some answers.

Meanwhile, medical professionals are doing what they can and hoping that it is working.
New Physician contributing editor Avery Hurt is a freelance writer based in Birmingham, Alabama.