
September 10, 2007

Mythbusters: Are the odds stacked against us?

Is it even possible to effectively counter popular false beliefs and help people understand accurate information? Or did Joseph Goebbels, back in the 1920s, correctly peg people as easily manipulated by media disinformation campaigns, and are we doomed to repeat history? The underlying processes of human reasoning and the fallacies of logic are known in the scientific literature, but if we don't understand them, they could remain the secrets of those using them against us.

Those trying to dispel junk science that can hurt people, and prevent them from making decisions that are in their best interests, know how hard it can be. Despite often meritorious patience in explaining the facts and correcting misinformation, the myths seem to become even more firmly believed. Recent research led by Dr. Norbert Schwarz of the Institute for Social Research at the University of Michigan, Ann Arbor, confirmed this phenomenon. His team found that presenting factual, accurate information that contradicts erroneous beliefs, and encouraging people to think about issues, not only often fails to help, it can actually reinforce the very myths being corrected!

In a recent issue of Advances in Experimental Social Psychology, they wrote:

One piece of this puzzle is that increased effort will only improve performance when people already possess strategies that are appropriate for the task at hand; in the absence of such strategies, they will just do the wrong thing with more gusto. But even when no particularly sophisticated strategy is required, trying harder does not necessarily result in any improvement — in fact, it may often backfire. This is the case for one of the most widely recommended de-biasing strategies: encouraging people to "consider the opposite," or to counterargue their initial response, by asking themselves, "What are some reasons that my initial judgment might be wrong?" Ironically, the more people try to consider the opposite, the more they often convince themselves that their initial judgment was right on target. The strategy of consider the opposite produces this unintended effect because it ignores the second piece of the puzzle: the metacognitive experiences that accompany the reasoning process.


“I remember hearing something about that....”

It turns out that our decisions and what we believe don’t follow the case we make in our heads, built from all of the reasons we can think of to support an idea. Instead, we follow, and feel increasingly confident about, whatever most easily comes to mind. And what most easily comes to mind and is most likely to be remembered usually isn’t the complicated science or the facts, but what we’ve heard most often. Our brains aren’t very good at remembering where we heard things, either, and each time we hear an idea it becomes more familiar and feels truer to us — meaning that, over time, urban legends, advertising claims, myths and anecdotes can end up having the same influence as credible sources. As Schwarz wrote:

When the false claims are encountered again on a later occasion, all that is left may be the vague feeling that "I heard something like this before." This sense of familiarity, in turn, will foster the acceptance of statements as true.

Mythbusters who fail to realize this natural trick our brains play on us can fall into its trap. When debunking a myth, it’s common to first repeat it, but this technique can work against the quack buster. Simply repeating the myth may contribute to its later familiarity and acceptance. This Catch-22 predicament has been known for more than 60 years, since the research on wartime rumors by Allport and Lepkin. Yet the idea that false information needs to be confronted is so compelling that it remains at the heart of most information campaigns, said Dr. Schwarz.

To demonstrate this effect, he and colleagues at the University of Michigan had volunteers read a CDC flier dispelling myths about the flu vaccine, and found that three days later the volunteers misremembered 40 percent of the myths as factual. They had also come to mistakenly think that the source of their false beliefs was the CDC. Messages from seemingly credible or credentialed sources are more influential and even more likely to be accepted.

Anders Sandberg, Ph.D., recently wrote about a similar predicament at Overcoming Bias. A friend had become worried about the health risks from wifi being touted in the news, but after being given a scientific takedown of the issue, he came back more frightened than ever. The mere act of giving more attention to a scary claim and researching it can also make it seem more real and significant to us. As Sandberg wrote:

The public is concerned about a possible health threat (electromagnetic emissions, aspartame, GMOs) and demand that the potential threat is evaluated. Funding appears and researchers evaluate the threat. Their findings are reported back through media to the public, who update their risk estimates.

In an ideal world the end result is that everybody get better estimates. But this process very easily introduces bias: the initial concern will determine where the money goes, so issues the public is concerned about will get more funding regardless of where the real risks are. The media reporting will also introduce bias since the media favour reporting newsworthy news, and risk tends to cause greater interest than reports of no risk (or the arrival of reviews of the state of the knowledge). Hence studies warning of a risk will be overreported compared to risks downplaying it, and this will lead to a biased impression of the total risk. Finally, the public will have an availability bias that makes them take note of reported risks more than reported non-risks. And this leads to further concerns and demands for investigation....

The problem here isn't media per se, but that biases are compounding and possibly leading back to a distortion of the fact-finding process. Media priorities make things worse, but it is just an extra layer of compounding.
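
To make the compounding concrete, here is a minimal toy simulation of the loop Sandberg describes. It is not his model, and every number in it (the true risk, the reporting bias, the strength of the availability effect) is invented purely for illustration. Alarming findings get reported more often, the public updates on what it sees reported, and the perceived risk drifts upward even though the underlying risk never changes:

```python
import random

# Toy sketch of the compounding-bias loop described above. All numbers are
# made up for illustration; this is not a model from the post or from Sandberg.

random.seed(0)

TRUE_RISK = 0.05          # the actual (small) risk being studied
REPORT_BIAS = 3.0         # "risk found" studies are 3x more likely to be reported
perceived_risk = 0.05     # the public's starting estimate

for year in range(10):
    # Public concern drives how many studies get funded.
    n_studies = int(5 + 100 * perceived_risk)

    reported = []
    for _ in range(n_studies):
        # Each study estimates the risk with noise.
        estimate = max(0.0, random.gauss(TRUE_RISK, 0.05))
        found_risk = estimate > TRUE_RISK
        # Alarming results are more likely to make the news.
        report_prob = min(1.0, 0.2 * (REPORT_BIAS if found_risk else 1.0))
        if random.random() < report_prob:
            reported.append(estimate)

    if reported:
        # The public updates toward the average *reported* estimate, which
        # overweights the alarming studies (availability bias).
        perceived_risk = 0.5 * perceived_risk + 0.5 * (sum(reported) / len(reported))

    print(f"year {year}: studies={n_studies:3d}, "
          f"perceived risk={perceived_risk:.3f} (true risk={TRUE_RISK})")
```

Run for a few iterations, the perceived risk in this sketch climbs well above the fixed true risk, which is exactly the distortion the quote above warns about.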

It’s gobs easier to plant a scare than to convince people of the science countering it. Those who understand this natural brain process take advantage of it to manipulate public opinion. They know that whoever makes the first scary claim gains the upper hand. And for the media, fear sells. Saturating the media with a scare and giving it more attention creates the added perception that it’s a real threat. Balanced information barely gets a word in edgewise, nor is it likely to be believed. As one commenter noted, if a publication shows something is risky, then it agrees with people’s concerns, “but if you publish showing no risk, you are funded by the Sinister Conspiracy!”


Cognitive biases make truth harder to see

In reviewing the research, Dr. Schwarz and colleagues found that efforts to thwart bias by encouraging people not to jump to conclusions and to carefully consider alternative information often backfire.

Ironically, ...successful de-biasing may become less likely the harder people try to avoid bias: the more they search for information that may argue against their initial judgment, the harder they will find their task, convincing them that their initial judgment was indeed right on target. Finally, subjective accessibility experiences are only used as a source of information when their informational value is not discredited.... In this case, the judgment is solely based on accessible declarative information.

He is referring to our natural tendencies to apply confirmation bias and disconfirmation bias in our reasoning. These cognitive biases have been recognized for half a century, and even the most intelligent and educated among us can fall for them.

Have you heard about the classic “2-4-6” experiment done by Peter Wason back in 1960, which illustrated confirmation bias? In this study, the subjects had to discover a rule, known only to the experimenter, that generated the sequence 2-4-6. Subjects could test triplets of their own until they felt certain they knew the experimenter’s rule. Only 21% guessed the rule right off, but Wason found that those who guessed wrong repeatedly worked to look for evidence that confirmed their hypotheses, rather than trying to falsify them. The subjects in this study had little emotional involvement in the numbers, but subsequent studies have found that the more emotionally charged a hypothesis, the stronger our confirmation biases and the more resistant to changing our minds we become.

As Wason wrote, all that the subjects had to do was find an instance that didn’t conform to their rule to decisively eliminate it, while on the contrary, “instances exemplifying such a rule can never be exhausted.” Inductive inferences can only be checked against the evidence, he said, and his experiment “demonstrated the dangers of induction by simple enumeration as a means of discovering truth.” Tallies aren’t evidence. He concluded:

The results show that very few intelligent young adults spontaneously test their beliefs....The kind of attitude which this task demands...consists in a willingness to attempt to falsify hypotheses, and thus to test those intuitive ideas which so often carry the feeling of certitude.

Obviously, scientific method [scientific process, otherwise known as critical thinking] can be taught and cultivated. But the readiness (as opposed to the capacity) to think and argue rationally in an unsystematized area of knowledge is presumably related to other factors besides intelligence, in so far as it implies a disposition to refute, rather than vindicate assertions, and to tolerate the disenchantment of negative instances.
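
For readers who like to see the mechanics, here is a small Python sketch of a 2-4-6-style task. The hidden rule, the subject’s hypothesis and the triplets tested are hypothetical stand-ins, not Wason’s actual materials. The point is only that confirming tests of a too-narrow guess can all come back positive while the guess is still wrong, whereas a single deliberately disconfirming test settles the question.

```python
# Toy illustration of a Wason-style 2-4-6 task. The rules and test triplets
# below are hypothetical stand-ins, not the original experimental materials.

def hidden_rule(triplet):
    """The experimenter's actual rule: any strictly increasing sequence."""
    a, b, c = triplet
    return a < b < c

def my_hypothesis(triplet):
    """A typical subject's too-narrow guess: numbers that go up by 2."""
    a, b, c = triplet
    return b - a == 2 and c - b == 2

# Confirmation-style testing: only try triplets the hypothesis already predicts.
for t in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    # Every test "passes", so confidence grows, yet the hypothesis is still wrong.
    print(t, "my rule:", my_hypothesis(t), "real rule:", hidden_rule(t))

# Falsification-style testing: try a triplet the hypothesis says should fail.
t = (1, 2, 10)
print(t, "my rule:", my_hypothesis(t), "real rule:", hidden_rule(t))
# my rule: False, real rule: True -> the narrow hypothesis is refuted in one test.
```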

Disconfirmation bias, on the other hand, is when people with a belief or bias not only selectively accept the favorable evidence, but also apply more scrutiny to the evidence that disagrees with their belief. [It’s one reason I eagerly read sources from all sides of an issue I’m researching, to capture objections and evidence I might not have otherwise considered.] As Thomas Gilovich, chairman of the Department of Psychology at Cornell University, explained in his talk, “Motivated skepticism and motivated credulity,” presented at the June 2000 convention of the American Psychological Society: conclusions we don’t want to believe are instinctively held to a higher standard, and we ask whether the evidence compels us to accept them, whereas with conclusions we want to believe, we ask only whether the evidence allows us to accept them.

“The less you know, the less likely you are to get good results, but the easier it is to allow yourself to believe in good results,” he said. Critical thinking, learning and being a skeptic are essential to avoid being led to believe unsound ideas.

But skeptics who are biased can be the most difficult group of people of all to get to see the evidence!

“People who are more skilled skeptics — who know a larger litany of logical flaws — but apply that skill selectively, may change their minds more slowly than unskilled reasoners,” said Eliezer Yudkowsky, research fellow and director of the Singularity Institute for Artificial Intelligence, a non-profit research institute in Palo Alto, CA.

Understanding the biases that are widespread in human reasoning can help us detect many of the logical flaws that might otherwise nab us. But this inspection must be applied to both our own ideas and those of others, “to ideas which discomfort us and to ideas which comfort us,” he said.


Influence of groupthink

While there are numerous other fallacies of logic, we’ll look at one more: the power of groupthink. Even professionals and skeptics are swayed by the power of group consensus. As the University of Michigan researchers noted:

[P]eople often resort to social consensus information to judge the truth value of a belief: if many people believe it, [they think] there’s probably something to it. Because one is more frequently exposed to widely shared beliefs than to highly idiosyncratic ones, the familiarity of a belief is often a valid indicator of social consensus. But, unfortunately, information can seem familiar for the wrong reason, leading to erroneous perceptions of social consensus... Findings of this type indicate that repeated exposure to a statement influences perceptions of social consensus, presumably because the statement seems more familiar.

The reinforcing effect of collective beliefs was described by Timur Kuran and Cass R. Sunstein in a classic paper published in the Stanford Law Review. The perception that everyone believes something — no matter how outrageous it may be — gives it increasing plausibility simply because of its increased presence in public discussions. Consumers, professionals and even public policy makers endorse information that seems to be believed by others, they wrote, partly out of interest in maintaining social acceptance. The availability and frequency of information campaigns are also used by “activists who manipulate the content of public discourse and strive to trigger availability cascades likely to advance their agendas,” they said.

Sounds a lot like the Principles of Propaganda attributed to Joseph Goebbels, who became Hitler’s propaganda minister in 1933, a position which gave him power over all German media: radio, press, cinema, and theater. His speeches were paraphrased into the oft-repeated:

If you tell a lie big enough and keep repeating it, people will eventually come to believe it. The lie can be maintained only for such time as the State can shield the people from the political, economic and/or military consequences of the lie. It thus becomes vitally important for the State to use all of its powers to repress dissent, for the truth is the mortal enemy of the lie, and thus by extension, the truth is the greatest enemy of the State.

Goebbels’ understanding of the psychology of misinformation, manipulating public opinion through the media and creating “optimum anxiety levels,” makes his propaganda principles invaluable reading. Many of them are evident in the marketing of today’s popular myths commandeered by the media, such as the deadliness of obesity; the crisis of an obesity epidemic; the dangers of meat and potatoes, fats or dairy in our diets; the imperative and benefits of "healthy" eating; and the frightening toxicity of our world. It’s become virtually impossible to hear news that doesn’t support today’s gloomy "truths," as the media has been quite effectively purged of alternative ideas.

The power of the collective group also helps to explain why more people aren’t doing something about junk science and speaking out. Another well-recognized human phenomenon, first described nearly forty years ago, is called the bystander effect. When people are part of a group that encounters something amiss — such as someone injured or a false statement — everyone looks to someone else to handle it, and it’s easiest to figure out what to believe by watching how others react and what they say, especially if those people have status. Being part of a group diffuses the sense of individual responsibility. “If you want to know why others aren’t responding to an emergency, before you respond yourself, you may have just answered your own question,” said Yudkowsky.

While the University of Michigan research might seem to imply that staying quiet and not saying anything might be better than debunking myths, that’s not true, either. Silence reinforces false information and is seen among a group as support. And, of course, if the truth were never given a voice, then everyone would believe the myths.

All of this leads us to the importance of learning to think critically, question everything and understand how our minds work.

One commenter at Overcoming Bias said that he’d heard there are two main kinds of people: “the well-intentioned, but ill-informed who make up the majority of the population; and the well-informed, but ill-intentioned who prey on the former.”

As this commenter admitted: “personal experience has left me somewhat scarred by the ‘well-informed, but ill-intentioned’ set.”


© 2007 Sandy Szwarc
