TOP STORY

Blame it on the System

Mistakes happen. In the case of medical errors, finger-pointing doesn't solve much. Only fixing the underlying systems will get to the root of the problem.

Peter Pronovost has his reasons for being a zealot about patient safety, and he isn't shy about sharing them. When Pronovost was in his fourth year of medical school at Johns Hopkins, his father died as the result of an error made by a hospital in New England. It colored everything: his choice of career (critical care medicine) as well as his research interests.

Since then, the young associate professor has built a reputation around Hopkins Hospital as the person to go to for anything pertaining to patient safety. The timing of his interests was providential.

It took the convergence of several factors to propel safety to the top of Hopkins' priority list: an eye-opening national report two years ago on medical errors in hospitals, a crackdown on hospitals by regulatory agencies and, unquestionably, the death of a research subject at Hopkins last June.

"It's the No. 1 issue," says Beryl Rosenstein, vice president for medical affairs at Hopkins Hospital. "We've gotten strong endorsements from the very highest echelons of the Institution."

That wasn't always the case. When Pronovost and Rosenstein got together last summer to inaugurate the Patient Safety Committee, which they co-chair, Pronovost, at least, felt hamstrung.

"There's a lot of rhetoric here, and around the whole country, about safety being important," says Pronovost, "but in reality, no one had really done anything to improve safety."

Pronovost approached his new responsibilities in his customary methodical manner. To start, he and Rosenstein surveyed 400 members of the medical staff to gauge, for the first time, the overall perception of patient safety. Some things, like the hospital's systems for reporting medical errors, fared well. Other findings were worrisome, particularly the perception that senior leadership wasn't serious about safety.

Those results were a reality check to the executives at the top of Hopkins' organizational chart. At the Johns Hopkins Medicine strategic planning retreat late last year, Dean and CEO Edward Miller named safety his top priority.

The Culture of Blame

It's not that Pronovost is preaching perfection. Fallibility is part of the human condition, he acknowledges, and is not something that can be changed. But we can change the systems under which people work, he argues, thereby reducing the risk of errors.

It's a concept that goes against the medical culture, admits Rosenstein. "Physicians and nurses are trained from their earliest days in school that health professionals don't make mistakes, and if you do, you don't talk about it."

Rosenstein believes that culture of blame is starting to dissipate at Hopkins. "There's a whole chain of events that's involved to allow an error to take place." If a physician orders the wrong drug, for instance, there are back-up systems to catch the error. The pharmacist has the tools to check the dose and look for allergies, but maybe his workspace is too crowded or he's being interrupted by telephone calls that should have been handled by technical support. The nurse also is familiar with the drugs and correct dosages, but perhaps she becomes distracted because of a staffing issue.

"So for a patient to get the wrong drug," Rosenstein explains, "there have to be lapses at multiple entry points. It's usually not the result of just one person making an error."

A Seven-Pronged Approach

When Peter Pronovost takes on a unit, he begins by measuring its "culture of safety"; that is to say, he asks 10 pointed questions of the people who work there. How comfortable are they with disclosing errors? Do they ever make mistakes?

Second comes education. The medical-error problem is huge, and it is global, Pronovost tells them. In the United States alone, 7 percent of patients in academic medical centers experience a medication error, and medical errors cause up to 98,000 deaths a year, numbers that are mirrored in Australia and the United Kingdom.

Pronovost prepared his talk on the science of safety after realizing his message wasn't getting through. The audience, he found, was reacting "like I was talking in French" about system factors and medical errors. "That's because doctors and nurses and pharmacists haven't been educated in this concept of thinking about systems."

After the in-service, there's a second, more probing series of questions meant to identify the unit's particular safety concerns. Employees are asked: How have you prevented a patient from being harmed? How will the next patient be harmed, and what can we do to prevent it?

Once the staff has drawn up a list of concerns, the unit is assigned a hospital vice president who conducts executive walk-rounds each month. The executives, who include Dean Miller, JHU President William Brody and JHH President Ron Peterson, among others, then get to see first-hand where the problems lie.

One memorable example involved the intensive care units, which the safety program has targeted first because of the higher probability of life-threatening errors occurring there. During rounds, the potential danger of having inadequately trained employees transport very sick ICU patients around the hospital for tests came through loud and clear. What if the intern accompanying Mr. X to his MRI didn't know how to use the infusion pump? Clearly, here was a safety issue, and the nurses knew it. They'd been asking for a transport team, a corps of ICU nurses specially trained to do just this sort of job during intra-hospital transport, for two years.

"The next morning," says Pronovost, "the transport team started. I don't know where the money came from, but the good will generated by that was tremendous."

The transport team is now available to all the ICUs.

Some of the ideas, on the other hand, have cost next to nothing or required an investment more of time than money. On the Weinberg ICU, Pronovost and the staff observed that several patients had epidural infusions, which could easily be hooked up intravenously by mistake. "That's a potentially lethal complication, because they get a toxic drug in their vein." The solution? A bright orange label is now placed on every epidural catheter.

The WICU staff had also unearthed a problem with transfer orders. They feared that when patients left the unit, the list of medications and medication allergies often contained errors. Pronovost audited the orders for two weeks and discovered their concerns were well-founded: more than 90 percent of the transfer orders contained mistakes. Even though most of the errors wouldn't have harmed patients, the finding was disturbing.

Now, as part of the routine discharge process, nurses perform something called medication reconciliation, matching medications and allergies on the transfer orders to what patients have been getting on the WICU. If there's a discrepancy, nurses go to the doctor or patient for an explanation.

Time-consuming, yes. But also worthwhile. "We've essentially eliminated that error rate down to zero," reports Pronovost. "It's a pretty striking example of working on team communication."

Eventually, Pronovost wants to roll out his safety program to all the hospital's units, including cath labs, ORs and outpatient clinics, with a core team of a physician, nurse, pharmacist and administrator leading each effort. Because Pronovost estimates those team members will need a day each week to do the work, he is attempting to get time allocated for the project.

"It is a lot, but the reality is, you're not going to get meaningful improvements piling this stuff onto a job description that's already pretty full."

Despite the inevitable extra work it causes, the safety program has been much more positively received than other programs of its ilk. "This isn't cost cutting, this isn't about an administrative thing, there are no hidden agendas," says Pronovost. "This is, Patients shouldn't be harmed. And that makes people feel good, because it's what we all went into health care for."

-Mary Ellen Miller
