
March 22, 2007

Doctors and patients on the same page

We recently looked at the tragic case of a woman who suffered for twelve years from a misdiagnosis, and at the impact that preconceived beliefs about fat people among healthcare professionals can have.

But rather than stew over the injustice of it all or point fingers at each other, both doctors and patients can come away with something more beneficial: the realization that we can all play a role in helping to reduce medical errors and misdiagnoses.

Dr. Jerome Groopman, chief of experimental medicine at Beth Israel Deaconess Medical Center and author of How Doctors Think, has been in the news in recent weeks and months with observations to help both healthcare professionals and patients overcome stereotypes in medicine that can lead to medical errors and harm to patients. We all like to believe that we know how to think and always look at health issues objectively and rationally, but none of us is immune to lapses.

A staff writer for the New Yorker, Dr. Groopman wrote a powerful essay in January in which he talked about how surprised he was as a medical student in the late 1970s to realize how little attention was paid to understanding fallacies of logic and the “cognitive dimension” of clinical decision-making — “the process by which doctors interpret their patients’ symptoms and weigh test results in order to arrive at a diagnosis and a plan of treatment.” As healthcare professionals, we spend years memorizing scientific facts and practical applications, but almost no time learning the mental logic needed to make a correct diagnosis and avoid mistakes. The essay profiled Dr. Pat Croskerry, who, over his years as an emergency room doctor, began documenting the common mistakes in medical judgment he was seeing.

He believes that many misdiagnoses are the result of readily identifiable—and often preventable—errors in thinking. He wrote:

[R]esearch shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information. To make diagnoses, most doctors rely on shortcuts and rules of thumb—known in psychology as “heuristics.”

If you’re unfamiliar with this term, it refers to the mental shortcuts, a sort of CliffsNotes our brains use to simplify the overwhelming amount of information we’re exposed to and condense it into simpler, intuitively correct beliefs and explanations. The problem comes in when many of these assumptions are incorrect. It’s not unique to doctors, but happens with everyone. The concept is used in everything from computer science and sociology to medicine, politics, marketing and education.
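To make the idea concrete, here is a little sketch in Python (my illustration, not from Dr. Groopman) of the kind of heuristic computer scientists use: the greedy “nearest neighbor” rule for plotting a route. Like a mental shortcut, it is fast and usually good enough, but by locking onto the closest next step it can miss the best overall answer:

```python
import math

def nearest_neighbor_route(points):
    """Visit points by always hopping to the closest unvisited one."""
    route = [points[0]]                      # start at the first point
    unvisited = set(range(1, len(points)))
    while unvisited:
        last = route[-1]
        nearest = min(unvisited, key=lambda i: math.dist(last, points[i]))
        route.append(points[nearest])
        unvisited.remove(nearest)
    return route

# Quick and plausible, but not guaranteed optimal -- just like a hunch.
print(nearest_neighbor_route([(0, 0), (1, 0), (5, 0), (1, 1)]))
```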

One common type is the “representativeness” heuristic, where we tend to misjudge someone or a situation based on a feature we believe is representative of a group. [This goes both ways. Patients, for example, may see any doctor as knowing everything about medicine, even outside his field. Doctors may see a young woman with chest pain and believe she’s just another stressed-out female.]

Another type of heuristic is “availability,” where we believe things to be truer and more likely when we’re exposed to them more often. [The saturation of obesity hyperbole in the media and medical literature plays a role in the readily believed notions about fat people. Or, if every patient who comes into the office during flu season has respiratory symptoms, we might assume the next person’s symptoms are the flu, too, rather than a heart problem.] When we hold a preconceived belief, we close our minds and dismiss data that contradicts it.

Back to Dr. Groopman’s column. In one of the cases he described in the New Yorker, he said the clinical information should have made him doubt his hypothetical diagnosis, but it didn’t:

Psychologists call this kind of cognitive cherry-picking “confirmation bias”: confirming what you expect to find by selectively accepting or ignoring information.

Representativeness and availability errors are intellectual mistakes, but the errors that doctors make because of their feelings for a patient can be just as significant. We all want to believe that our physician likes us and is moved by our plight. Doctors, in turn, are encouraged to develop positive feelings for their patients; caring is generally held to be the cornerstone of humanistic medicine. Sometimes, however, a doctor’s impulse to protect a patient he likes or admires can adversely affect his judgment....

As Tversky and Kahneman and other cognitive psychologists have shown, when people are confronted with uncertainty—the situation of every doctor attempting to diagnose a patient—they are susceptible to unconscious emotions and personal biases, and are more likely to make cognitive errors. Croskerry believes that the first step toward incorporating an awareness of heuristics and their liabilities into medical practice is to recognize that how doctors think can affect their success as much as how much they know, or how much experience they have. “Currently, in medical training, we fail to recognize the importance of critical thinking and critical reasoning,” Croskerry told me. “The implicit assumption in medicine is that we know how to think. But we don’t.”

There are numerous fallacies in logic. One of the most common, which often goes along with stereotypes and our preconceived beliefs about people or conditions, is confusing association with causation, or confusing cause and effect. If we see two things together, we believe one causes the other. Of course, with today’s data dredges, computers can pull up countless random and meaningless associations, but it happens in real-life examples, too. We glimpse a fat child not engaging in sports and, because of what we think we “know” about obesity, assume the lack of sports activity caused the child’s fatness. Worse, we might take it a step further, put it in reverse, and suppose sports activity can therefore prevent childhood obesity.
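To see how easily a data dredge manufactures “findings,” here is a small Python simulation (purely illustrative, not from the article): forty made-up “health factors” that are pure random noise, searched pairwise for impressive-looking correlations. Some always turn up:

```python
import random

random.seed(1)

def correlation(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 40 unrelated "health factors" measured on 25 people: pure noise
factors = [[random.gauss(0, 1) for _ in range(25)] for _ in range(40)]

# Dredge all 780 pairs for correlations that look meaningful
spurious = [(i, j) for i in range(40) for j in range(i + 1, 40)
            if abs(correlation(factors[i], factors[j])) > 0.4]
print(len(spurious), "impressive-looking associations found in random noise")
```

None of those associations means anything; they are the statistical equivalent of faces seen in clouds.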

Or, with a post hoc error, if B happened after A, we believe A caused B. I like to call this the “killing turkeys causes winter” fallacy. We can also overlook a common cause driving both A and B, which otherwise have nothing to do with each other. Missing confounding factors is very common.
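Here is the turkey example as a tiny Python simulation (again, my illustration): a hidden common cause, the calendar, drives both A (turkey season) and B (winter), so the two are strongly associated without one causing the other:

```python
import random

random.seed(7)

# The hidden common cause: which month each observation falls in
months = [random.randint(1, 12) for _ in range(1200)]

a_turkey_season = [m in (11, 12) for m in months]   # A: turkeys being killed
b_winter = [m in (12, 1, 2) for m in months]        # B: winter weather

p_b = sum(b_winter) / len(b_winter)
p_b_given_a = (sum(b for a, b in zip(a_turkey_season, b_winter) if a)
               / sum(a_turkey_season))

# Winter is far more likely during turkey season, yet neither causes
# the other; the calendar causes both.
print(f"P(winter) = {p_b:.2f}   P(winter | turkey season) = {p_b_given_a:.2f}")
```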

By jumping to errors of logic, we are less likely to take the time for the careful, objective testing necessary to determine the actual cause(s).

Dr. Groopman wrote a valuable perspective on medical mistakes in the Boston Globe this week.

Why do we as physicians miss the correct diagnosis? It turns out that the mistakes are rarely due to technical factors, like the laboratory mixing up the blood specimen of one patient and reporting another's result. Nor is misdiagnosis usually due to a doctor's lack of knowledge about what later is found to be the underlying disease. Rather, most errors in diagnosis arise because of mistakes in thinking.

Doctors rely on pattern recognition: they draw bits of information together and make quick judgments. It’s understandable, because in today’s healthcare system, with intense pressure to see as many patients as possible, quick judgment is rewarded. Unfortunately, working in haste is a setup for errors in thinking, he said, and those initial assumptions can often be wrong. He deconstructed a case, showing how a preconceived belief, or stereotype, colored a doctor’s impression and led him to make an attribution error.

The doctors fixed on this diagnosis, so called "anchoring" where the mind attaches firmly to one possibility. Anchoring so tightly to one diagnosis and not broadly considering others is called "premature closure." Even when, later in Leslie's evaluation, a blood test result was obtained that was very abnormal, it was not sufficiently considered; no one involved in her case could lift their mental anchor and comprehensively explore other possibilities.

Discounting such discrepant or contradictory data is called "confirmation bias" -- the mind cherry-picks the available information to confirm the anchored assumption rather than revising the working diagnosis.

All of us as physicians are fallible, and while it is unrealistic to imagine a perfect clinical world, it is imperative to reduce the frequency of misdiagnosis. I believe all health professionals should learn in-depth about why and how and when we make errors in thinking, and I also believe that if our patients and their families and friends know about the common cognitive pitfalls, they can ask specific questions to help us think better.

On WFRV-TV in Green Bay, Wisconsin, Dr. Groopman explained that how doctors think can affect the care they give and that all of us — patients, families and friends — can play a role in reducing medical errors.

“If the doctor doesn’t like you, he or she closes their mind off. It’s a set-up for misdiagnosis,” he said. Doctors are human like the rest of us and not only bring their personal feelings to the exam room, they also carry around plenty of preconceived notions. “The most common stereotypes occur in women who are entering middle age and their symptoms are attributed - snap judgment - to stress, anxiety or menopause,” Groopman said. While it is imperative for doctors to change the ways they see patients and learn to think better, he said there are some things patients can do.

As WFRV reported, he recommended that patients try to turn the tables and ask questions of their doctors to help avoid becoming stereotyped:

"A patient can say, 'What else could it be?' especially if it's not getting better. Or, 'Could two things be going on at the same time?'," Groopman recommends.

He adds that you should never be afraid to tell your doctor what's worrying you the most. "Then the doctor becomes more sensitive to what the person is feeling about his or her body," Groopman says.

Another common stereotype in medicine surrounds obesity. On the CBS Evening News yesterday, he said there is considerable prejudice against fat people among healthcare professionals and that fat people are often stigmatized as being undisciplined and unhappy.

As with other stereotypes, such assumptions make doctors prone to “attribution errors,” meaning that we attribute the symptoms and problems to the stereotype rather than considering that this particular medical issue could be unrelated. Prejudice of any type is harmful, both for inhibiting broad clinical thinking and for conveying a negative attitude towards the patient.

For fat people who find themselves with a healthcare provider who appears to be stereotyping them and blaming everything on their weight, Dr. Groopman offered some advice: be proactive. He suggested a woman talk openly with her doctor, try to get him to see her as a real person, and come right out and ask him to consider her symptoms broadly rather than immediately attribute them to her weight. Patients can help their doctors think about their symptoms more objectively, he said. “How would you treat these symptoms if they were in a thin person, doc?” Good doctors will be responsive, he said. But as many fat women know, this can be easier said than done, and not all doctors are yet receptive, something it appears Dr. Groopman is trying to change.

The heuristics about obesity, however, are also every bit as ingrained in many fat people as they are in healthcare professionals. The same can be said about the assumptions we make about all sorts of different people. Learning to understand and recognize our own errors in logical thinking and the stereotypes we believe about others and ourselves is a first step for all of us.

It all comes down to thinking. :)


©2007 Sandy Szwarc
