
December 25, 2007

Inquiring minds want to know: when are food and health claims real?

Is healthcare always based on science, or are there times when our care and medical advice are founded on myths, beliefs, tradition or anecdotes? In the current issue of the British Medical Journal, two pediatricians took a lighthearted look at seven medical myths they said they’d heard repeated among doctors or in the popular media. The myths, they said, “appear to be ingrained in the popular imagination, including that of physicians.” Regrettably, with their rather unscientific selection process and choice of fairly trivial topics, the serious point they were trying to make was lost on the media and on medical commentators. Here it is, as they wrote:

Physicians understand that practicing good medicine requires the constant acquisition of new knowledge, though they often assume their existing medical beliefs do not need re-examination. The medical myths we give here are a light hearted reminder that we can be wrong and need to question what other falsehoods we unwittingly propagate as we practice medicine....

Physicians would do well to understand the evidence supporting their medical decision making. They should at least recognise when their practice is based on tradition, anecdote, or art. While belief in the described myths is unlikely to cause harm, recommending medical treatment for which there is little evidence certainly can. Speaking from a position of authority, as physicians do, requires constant evaluation of the validity of our knowledge.

It is extremely difficult, however, to bring oneself even to consider or accept evidence contrary to everything one has come to believe. Once a belief has taken hold and become popular, it’s rarely questioned. The need to do so seems inconceivable.

No one, including healthcare professionals, intentionally sets out to believe things known to be false. Most people truly believe what they’ve come to know. But, as the Critical Thinking Lessons explained, avoiding falling victim to unsound claims requires understanding the process of critical thinking and how to circumvent the fallacies of logic that can beset anyone, and actively working to avoid the most fundamental error in thinking, the confirmation bias. And that’s harder than it sounds:

[W]e have an automatic tendency to pay attention to or seek out information that is in agreement with (confirms) our preconceptions, and to ignore, distort or avoid information that contradicts (disconfirms) our preconceptions, a tendency that is called the confirmation bias. The confirmation bias serves to maintain and strengthen the beliefs that we already hold by causing us to automatically (that is, without being aware that we are doing so) perceive and remember experiences that confirm these beliefs, and to ignore or reinterpret those that disconfirm them. Because we tend to seek out only confirming evidence, our beliefs over time become so well confirmed in our minds that we come to think of them as “obviously true." In order to avoid the confirmation bias, we must force ourselves to look for evidence that disconfirms our beliefs.

As the confirmation bias takes hold [discussed here], people become more and more certain that what they believe is right. Simultaneously, according to Dr. Ben Goldacre, a London physician, people also become increasingly resistant to evidence that counters their beliefs. “When you point out a problem with the evidence,” he wrote, “people don't engage with you about it, or read and reference your work.” They react quite negatively. As he illustrated with homeopathy, this vehement resistance isn’t actually about the science or a productive debate of the evidence. It is not scientists disagreeing about the science, as some might try to claim. Even when carefully conducted scientific studies have proven something not to work, he explained, its proponents cannot see or accept the evidence.

A study published in this month’s Journal of the American Medical Association highlighted how common it is for claims to persist, and even continue to be supported in certain scientific circles and in the medical literature, long after they have been disproven. Changing established medical and nutritional beliefs is not easy, nor is it just a matter of presenting the evidence. The researchers, led by Dr. Athina Tatsioni at Tufts University School of Medicine in Boston, traced nutritional claims (the antioxidant beta-carotene for the prevention of cancer, vitamin E to prevent cardiovascular disease in women, and estrogen for the prevention of dementia) that had been strongly contradicted by evidence from large, high-quality, randomized clinical trials. They examined the acceptance of this evidence in the medical literature compared with the persistence of popular beliefs formed from earlier studies based largely on associations:

The persistent favorable stance toward the contradicted interventions was particularly prominent in articles published in specialty journals of both clinical and basic science disciplines. Specialist articles apparently continued to use references to the highly cited observational studies to support their own lines of research. The presence of refuting data were not mentioned in many articles. Other articles did report data with contrary results, but they raised also a wide array of counterarguments to support the observational claim.

The researchers noted that, as is well recognized, positive results from randomized clinical trials were the ones most often published in specialty journals, and citations were biased towards positive findings. The belief bias of people, they said, regardless of the topic, also influences the interpretation of scientific results. When studies offered contradictory findings, the original belief based on observational studies “was defended at all cost.”

The defense of the observational associations was persistent, despite the availability of very strong contradicting randomized evidence on the same topic. Thus, one wonders whether any contradicted associations may ever be entirely abandoned, if such strong randomized evidence is not considered as much stronger evidence on the topic.

This is important because, as we know, it’s those null and negative findings which are vital to the progress of science. It is the ability of experiments to disprove a hypothesis in carefully-designed studies that sets science apart from pseudoscience. This is the point of Albert Einstein’s famous saying: “No amount of experimentation can ever prove me right; a single experiment can prove me wrong.”

The Tufts researchers showed that randomized trials disproving beliefs derived from observational studies eventually result in less frequent citations of those epidemiological studies. But this occurs with “considerable delay and a considerable segment of the literature continues to cite the contradicted articles long after the contradiction.” Even fifteen years later, the claims and disproven studies continue to predominate in the literature, and “the articles that cited these observational studies continued to be predominantly favorable.”

The delay can mean wasted healthcare resources, failure to pursue effective modalities and potential harm.

They concluded by suggesting that better communication of evidence-based clinical science might improve this situation and “lead to more rational and concerted translational efforts in basic, preclinical, and clinical research.” But simply communicating the evidence circles back to the issue of confirmation bias and missing critical thinking skills. Several have voiced concerns that even medical school curricula don’t provide these essential skills for young doctors. Dr. R. W. Donnell, a hospitalist in Northwest Arkansas, for example, has written extensively on how alternative beliefs and modalities have, as a result, inundated American medical schools and the medical profession. Similar concerns have been raised concerning nursing.

Researchers at Georgetown University School of Medicine in Washington, DC, for example, surveyed 265 medical students and found that 91% embraced alternative modalities as beneficial to Western medicine; most wanted them incorporated into their medical training; and most planned to endorse, refer patients for, or provide alternative modalities in their future practices.

Last summer, Prometheus wrote a valuable and germane article on how misconceptions about how science works can lead to false yet “generally accepted theories of reality.” These misconceptions aren’t just behind established medical myths, but also behind the myths popular throughout our culture and mainstream media. They also help to explain why the validity of those beliefs isn’t often questioned and re-evaluated.

To illustrate his point, he used the beliefs in an epidemic of autism [previously examined], writing:

One of the most commonly repeated misconceptions is that scientific ‘facts’ (what scientists refer to as ‘generally accepted theories of reality’) are determined by popular vote.... Unfortunately for them, reality has shown itself supremely indifferent to majority rule... So, even if seven thousand people think that Andy Wakefield’s thoroughly disproven hypothesis about measles vaccine causing autism is true, that will have no impact on the ability of the vaccine strain of measles to cause autism.

The sad fact is that the purpose of science is to discover the underlying realities of nature, not to confirm our most cherished hypotheses. When people...set out to prove themselves right, they often overlook the data that show they are wrong.

Considering another example, the perceived epidemic of obesity: even if innumerable people believe the thoroughly disproven hypothesis that overeating, bad foods and sedentary behavior cause obesity, that will have no impact on the ability of diet and exercise or a healthy lifestyle to “cure” or prevent obesity. Hence the consistent failure of diets and exercise interventions to work long-term, with rare exceptions, after a century of such efforts.

Prometheus went on to explain another popular misconception:

Another popular concept is that scientific reality can be legislated. This has been tried a number of times previously and has a dismal history... “Science by decree” appeals to those who are absolutely convinced that there is no possibility that they might be wrong....But what happens when it becomes apparent that the legislated “science” is in error? What will the legislators say to those who entreated them to make the law in the first place? How receptive will they be to another group of parents who come to them, saying “Well, it turns out that vaccines weren’t the cause of autism and we need a bunch of money to research the real cause.” Do you think that any law maker is going to want to bring that before their peers?

I think that everybody knows that if the various...groups had the data, they wouldn’t need to do an “end run” around science (and, curiously, the courts) to the legislature. What they are saying, in essence, is: “We can’t convince scientists, we can’t convince the courts and we can’t even convince a majority of parents with our data, so we’re asking you to force everybody to say that we’re right.”

Similarly, the "Healthy Lifestyles and Prevention America Act" and other attempts to legislate healthy behaviors aren’t working on obesity, either.

So why do so many beliefs persist, even after the soundest science has long ago disproven them? And why are they so compelling? Whether it be promises of long-term weight loss or cures for autism, the beliefs are accepted, in large part, because they sell hope. As Prometheus commented about autism:

[T]he "experts"...are accepted because they offer hope. The sad part is that they are — so far as science can determine — offering false hope. I know that some people feel that it is a mercy to offer hope, even when it is false. However, I am firm in my belief that, ultimately, offering false hope is more destructive than offering the truth — that there is nothing that is known to help.

I think that it is perfectly reasonable for physicians to tell parents that there are some “treatments" that other parents have tried and that they report some success, but I think that it is imperative that they be absolutely clear that none of these treatments has been shown to work. Much of the problem is that the “practitioners" who advocate these “therapies" are so divorced from critical thinking that they fail to evaluate their own results. I have heard them rationalize their failures into successes ... They are, as the old saw goes, “Often in error but never in doubt."


© 2007 Sandy Szwarc


Up next, the newest review of the evidence on diets and weight loss interventions that exemplifies what Prometheus described as rationalizing failures into successes.
