Nautilus

Bias in the ER

The dazed young woman who arrived at Sunnybrook Hospital, Canada’s first and largest regional trauma center, from a head-on car crash presented the surgeons treating her with a disturbing problem. In addition to her many broken bones, the rhythm of her heartbeat had become wildly irregular. It was either skipping beats or adding extra beats; in any case, she had more than one thing seriously wrong with her.

She remained alert enough to tell them that she had a history of an overactive thyroid. An overactive thyroid can cause an irregular heartbeat. And so, when the hospital's resident medical detective, Don Redelmeier, arrived, the staff believed that they no longer needed him to investigate the source of the irregular heartbeat but only to treat it. No one in the operating room would have batted an eye if Redelmeier had simply administered the drugs for hyperthyroidism. Instead, Redelmeier asked everyone to slow down. To wait. Just a moment. Just to check their thinking—and to make sure they were not trying to force the facts into an easy, coherent, but ultimately false story.

Something bothered him. As he said later, "Hyperthyroidism is a classic cause of an irregular heart rhythm, but hyperthyroidism is an infrequent cause of an irregular heart rhythm." Hearing that the young woman had a history of excess thyroid hormone production, the emergency room medical staff had leaped, with seeming reason, to the assumption that her overactive thyroid had caused the dangerous beating of her heart. They hadn't bothered to consider statistically far more likely causes of an irregular heartbeat. In Redelmeier's experience, doctors did not think statistically. "Eighty percent of doctors don't think probabilities apply to their patients," he said. "Just like 95 percent of married couples don't believe the 50-percent divorce rate applies to them, and 95 percent of drunk drivers don't think the statistics showing that you are more likely to be killed driving drunk than driving sober apply to them."

Redelmeier’s job in the trauma center was, in part, to check the understanding of the specialists for mental errors. “It isn’t explicit but it’s acknowledged that he will serve as a check on other people’s thinking,” said Rob Fowler, an epidemiologist at Sunnybrook. “About how people do their thinking. He keeps people honest. The first time people interact with him they’ll be taken aback: Who the hell is this guy, and why is he giving me feedback? But he’s lovable, at least the second time you meet him.” That Sunnybrook’s doctors had come to appreciate the need for a person to serve as a check on their thinking, Redelmeier thought, was a sign of how much the profession had changed since he entered it in the mid-1980s. When he’d started out, doctors set themselves up as infallible experts; now there was a place in Canada’s leading regional trauma center for a connoisseur of medical error. A hospital was now viewed not just as a place to treat the unwell but also as a machine for coping with uncertainty. “Wherever there is uncertainty there has got to be judgment,” said Redelmeier, “and wherever there is judgment there is an opportunity for human fallibility.”

Across the United States, more people died every year as a result of preventable accidents in hospitals than died in car crashes—which was saying something. Bad things happened to patients, Redelmeier often pointed out, when they were moved without extreme care from one place in a hospital to another. Bad things happened when patients were treated by doctors and nurses who had forgotten to wash their hands. Bad things even happened to people when they pressed hospital elevator buttons. Redelmeier had actually co-written an article about that: “Elevator Buttons as Unrecognized Sources of Bacterial Colonization in Hospitals.” For one of his studies, he had swabbed 120 elevator buttons and 96 bathroom surfaces at three big Toronto hospitals and produced evidence that the elevator buttons were far more likely to infect you with some disease.

Of all the bad things that happened to people in hospitals, the one that most preoccupied Redelmeier was clinical misjudgment. Doctors and nurses were human, too. They sometimes failed to see that the information patients offered them was unreliable—for instance, patients often said that they were feeling better, and might indeed believe themselves to be improving, when they had experienced no real change in their condition. Doctors tended to
