Good morning, passengers—this is your captain speaking. There will be a brief delay before we leave the gate. For your comfort and safety, we've added a couple of extra steps to the preflight checklist. The flight attendants will be handing out consent forms to make sure that's OK with you. Please read your copy carefully, and if you're satisfied that none of your rights are being violated, sign it and turn it in so we can get you to your destination. Thanks for your cooperation.
Think you'll hear that announcement anytime soon?
It would almost be less surprising, at least to me, than a federal agency's recent disruption of a study meant to see if following a five-step checklist—simple things like cleaning the skin with a disinfectant soap before inserting a catheter—could reduce infections and deaths in intensive-care units. It could and did. In the first 18 months, the infection rate in 108 ICUs at 67 Michigan hospitals was slashed by two thirds. The results were reported in the New England Journal of Medicine at the end of 2006.
Last year a complaint (I haven't gotten the details) triggered an investigation of the study by the federal Office for Human Research Protections, a part of the Department of Health and Human Services. The OHRP determined that hospitals could keep using the checklist, but the research element—collecting and sending data to Peter Pronovost, the Johns Hopkins architect of the program—had to stop. The office ruled that the study amounted to a formal research project and, as such, should have gone through formal vetting by institutional review boards at all 67 hospitals to make sure patients' rights were protected and that consent forms were provided in case patients wanted to opt out.
Pronovost thought he'd covered all the bases by having the Johns Hopkins review board scrutinize and approve the study. "It wasn't exactly a stealth intervention," he told me yesterday evening. The view of the Hopkins IRB was that the checklist program was only a change in procedure that would lead to an improvement in quality, not research involving human subjects. That seems logical enough, since there was nothing new about any of the five checklist items. Pronovost was simply bundling them into a checklist. The federal office didn't agree, and Pronovost halted his analysis of the data and told the hospitals to stop sending him any more.
All this would have gone mostly unnoticed but for a December 30 opinion piece in the New York Times by surgeon Atul Gawande, who had extolled Pronovost's program in a long New Yorker article a few weeks earlier about the ability of simple checklists to reduce hospital errors and unsafe practices. Since then, a number of medical blogs have taken up the cause, most vociferously Wachter's World, run by hospitalist Bob Wachter at the University of California, San Francisco. Like Pronovost, Wachter is heavily involved with the patient quality movement, and he is irate.
"As someone who cares about the lives of patients, this one gets me PISSED," he wrote yesterday in his latest rant. He went on: "OHRP does important work, and local and regional internal review boards are critical to protecting patients from potential harm. But we're talking about checklists here. CHECKLISTS!"
The agency usually labors in obscurity—and wasn't ready for the attention stirred up by Gawande's opinion piece. On Tuesday, it issued a press release stating that Gawande's op-ed had "inaccurately characterized certain facts of the case." I spent the better part of an hour yesterday with federal officials who tried to explain the reasoning behind the decision, and I confess that I still don't completely understand it. My sense is that it was technically correct but based on a very narrow reading of the regulations—and unnecessarily rigid. Spokesperson Rebecca Ayers gave me a prepared statement that said in part: "We hope every hospital implements this five-step program, which has been proven to reduce infections—saving lives and millions of dollars. Current research regulations in no way prohibit the adoption of this five-step program by hospitals whose only goal is to improve the quality of care."
She's right. Hospitals can use the checklist. But they'd better not even think about doing what the Michigan hospitals naively agreed to do with Johns Hopkins: see how the checklist performed and hand over the data (which, by the way, were pooled to prevent individual patients from being identified) to be analyzed and published, so that physicians and quality experts at other hospitals could follow suit. Unless, of course, they run it through their review board first. That will take time, but what's the rush? So many patients die in ICUs—what's a few more?
"I'm not sure how to proceed," said a clearly frustrated Pronovost. "I have six or so states that have said, 'Hey, could you come put this here?' I scratch my head and say, 'OK, but we'll have to get IRB approval from every hospital.' " He points out that 40 percent of Michigan hospitals are rural and don't have institutional review boards because they do minimal research and can't afford the cost anyway. "These are hospitals that would be cut out of initiatives like this," said Pronovost. "I don't think the public would stand for that, and the hospitals are unhappy at not having access to them."
It's hard to disagree with another of his observations: "We have to clarify this issue in a wise way. Focusing on a very bureaucratic, regulatory interpretation rather than evaluating the risks and benefits to patients of a quality initiative seems very unproductive."