Junkfood Science: More Reindeer Games

March 28, 2009


Nearly 3,000 news stories this past week jumped on the bandwagon to report that a new study had found that red meat may be deadly. This is another flagrant illustration that we’d all be a lot healthier if we just stopped reading medical news stories. Not one health journalist reported the study accurately, truthfully or responsibly. As a result, countless people have been needlessly frightened about their food and health and are being led to make health decisions or support policies that have no grounds in sound science.

This was not a clinical study at all. Not a single person was ever examined. It turned out to be another computer data dredge of those AARP member mail-in questionnaires from 14 years ago. It was unable to find a single tenable correlation between meat consumption and premature death — in fact, it not only failed to find an association between meat and higher incidences of cancer or premature deaths, but if you want to split hairs, it found the opposite of many of the claims in the news this week.

Before revealing what didn't make the news, let’s take a quick look at how they did it.

Garbage In

As readers will remember, in 1995-6, the AARP had sent out questionnaires to its members (50-71 years of age) in select parts of the country, asking them for their height and weight, their lifestyle habits, and estimates of how often they had eaten each of 124 food items over the past year. Only 1.5% of AARP members returned the questionnaires, and they were not at all representative of adults their age. Their self-reported answers were never verified, either.

But these mail-in membership questionnaires became the NIH-AARP Diet and Health Study database. It has been used to dredge through and find all sorts of meaningless correlations to frighten people. [The most recent scare was here.]

A total of 617,119 persons returned the AARP member questionnaires. For the study published this week in Archives of Internal Medicine, the authors used questionnaires from 322,263 men and 223,390 women. Based on those food frequency questionnaires, the authors estimated how much red meat the seniors had eaten.

Red meat intake was calculated using the frequency of consumption and portion size information of all types of beef and pork and included bacon, beef, cold cuts, ham, hamburger, hotdogs, liver, pork, sausage, steak, and meats in foods such as pizza, chili, lasagna, and stew.

Taking this already dubious data, they “created three diet types: high-, medium- and low-risk meat diets” and assigned a score of 1 through 3 to people at the different meat consumption levels. Using computer modeling, they looked for correlations between their estimated meat consumption scores and deaths over the following ten years, as recorded in the Social Security Administration Death Master File. Finally, they estimated hazard ratios (a form of relative risk) for the links that came up. Causes of death were taken from diagnostic codes (International Classification of Diseases, ICD-9 and ICD-10).

Computer games

These types of studies [epidemiological data dredges were explained here] are the most rife with misinterpreted statistics, errors and biases, and the most easily manipulated to arrive at whatever conclusions researchers set out to find. They are also the most poorly understood. Not only is it common to mistakenly assume that any correlation the computer models dredge up indicates causation, but the public doesn’t realize that, especially for these types of studies, a correlation (“risk”) has to be mighty big to even suggest a true effect. The more data you mine, the more likely you are to randomly turn up hits that are statistically significant yet no better than chance and mean nothing. The bigger the study, the bigger the chance of spurious correlations. Computer modeling errors are often even larger than random chance. So relative risks for a link between meat and deaths, for example, have to be tenable: beyond what would have come up by random chance and statistical error, and not merely a marker for a confounding factor. Credible scientists don’t accept as tenable any relative risk under 200% to 300%.
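The multiple-comparisons problem described above can be demonstrated with a small simulation. The sketch below is purely illustrative (the ~7% death rate and 50/50 exposure split are assumptions, not the study’s figures): it invents 124 food exposures that are, by construction, completely unrelated to a random “death” outcome, then tests each one at the conventional 5% significance level. Chance alone still produces “significant” links.

```python
import math
import random

random.seed(42)

N = 10_000    # simulated questionnaire respondents
FOODS = 124   # independent food items, as on the AARP questionnaire
Z_CUT = 1.96  # two-sided z cutoff for "significance" at the 5% level

# A random outcome (~7% "died"), independent of every food by construction
died = [random.random() < 0.07 for _ in range(N)]

false_hits = 0
for _ in range(FOODS):
    # A random exposure (ate this food often), also independent of the outcome
    eats = [random.random() < 0.5 for _ in range(N)]
    n1 = sum(eats)
    n0 = N - n1
    d1 = sum(1 for d, e in zip(died, eats) if d and e)      # died, exposed
    d0 = sum(1 for d, e in zip(died, eats) if d and not e)  # died, unexposed
    p1, p0 = d1 / n1, d0 / n0
    p = (d1 + d0) / N  # pooled death rate for the two-proportion z-test
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n0))
    if abs(p1 - p0) / se > Z_CUT:  # a "statistically significant" link
        false_hits += 1

print(f"{false_hits} of {FOODS} foods 'significantly' linked to death "
      f"by chance alone")
```

At a 5% significance level, roughly 5% of the 124 tests (about six foods) will come up “significant” on average, even though every link here is pure noise.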

But it’s easy to frighten people who don’t understand statistics with inconsequential correlations that sound significant and scary. These are the studies that become the scare of the week.

Computer programs can and do spit out countless nonsensical and contradictory correlations. We only hear about the ones the authors choose to report, though. We never hear about all the others their computer model found that are just as spurious. If we heard the full story, it would be much more obvious how little credibility the entire statistical game has.

For example, we didn’t hear that this study also “found” that among women:

● Being married was associated with more than a 30% increased risk of death.

● Doubling their alcohol consumption was associated with one-third lower risk for premature death.

● College diplomas were associated with a lower risk for premature death.

Yet, no one would seriously suggest that women could lower their risks for premature death by hanging up a diploma, getting divorced and hitting the bars! These are clearly markers for factors that might play an actual role. In epidemiological observational studies, foods are most often markers for real factors in health outcomes, though the correlations were never tenable in the first place. That’s why, every time healthy eating has been put to the test in well-designed clinical trials, it has failed to show a meaningful benefit for the primary prevention of diabetes, heart disease or cancers, or to help people live longer. Meanwhile, all of those clinical research resources could have been spent on finding real cures and treatments, but those aren’t as profitable or as politically useful as “healthy eating and lifestyles.”

What the study found

The authors of this latest AARP data dredge were unable to find even one tenable correlation between any type or amount of meat consumed and the 47,976 deaths among the men or the 23,276 deaths among the women during the ten years.

This was a null study. It was unable to find any valid link between red meat or processed meats and premature death from any cause.

If you want to have fun splitting hairs among untenable hazard ratios, the highest red meat consumption was associated with a greater chance of the men dying from injuries than from cancer. It’s clearly bunkum to think red meat makes men accident prone! Reporting relative risks rather than actual incidences can also make the risks sound huge and real, like the 22% higher hazard ratio for cancer deaths associated with the highest red meat consumption compared with the lowest among the men. Of course, this hazard ratio never rose above random chance to begin with, but no reporter mentioned that. If the actual incidences had been reported, though, the nonsensical finding would have been instantly apparent. Among the 16,433 cancer deaths among the men over ten years, the difference in actual annual incidence between those with the lowest and highest meat consumption was 1.4%, certainly not a finding that would have generated nearly 3,000 news headlines trying to scare us that red meat is deadly.
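The gap between a relative risk and an absolute one is easy to see with arithmetic. The sketch below uses an assumed baseline rate purely for illustration (0.90% per year is not the study’s actual figure) to show what a “22% increased risk” headline can amount to in absolute terms:

```python
# Illustrative numbers only: an assumed baseline annual cancer death
# rate, not the study's actual figures
low_rate = 0.0090            # 0.90%/yr in the lowest-intake group (assumed)
high_rate = low_rate * 1.22  # what a "22% increased risk" would imply

relative_increase = high_rate / low_rate - 1  # the scary-sounding number
absolute_increase = high_rate - low_rate      # the number left unreported

print(f"relative increase: {relative_increase:.0%}")
print(f"absolute increase: {absolute_increase * 100:.2f} "
      f"percentage points per year")
```

Under these assumed numbers, the same finding reads as “22% higher risk” or as “about two extra deaths per 1,000 people per year,” depending on which figure the press release leads with.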

Among the women, the untenable findings were the opposite of what was being reported. For total deaths among the women, the actual annual death rate was 1.19% among those with the lowest red meat intake, compared with 0.8% among those with the highest red meat intakes. So the more meat the women purportedly ate, the lower their risks for premature death and of dying from cancer.
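Working through the arithmetic on the two rates just quoted is a quick back-of-the-envelope check:

```python
# Annual all-cause death rates quoted above for the women
lowest_intake_rate = 0.0119   # 1.19%/yr, lowest red meat intake
highest_intake_rate = 0.0080  # 0.80%/yr, highest red meat intake

ratio = highest_intake_rate / lowest_intake_rate
print(f"highest-intake rate is {ratio:.2f} times the lowest-intake rate")
print(f"i.e. roughly {1 - ratio:.0%} lower")
```

In other words, taken at face value the data show the heaviest red meat eaters among the women dying at roughly a third lower annual rate than the lightest eaters.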

Not a single reporter reported that the highest red meat consumption among women was "associated" with the lowest actual rates of premature death from all causes or cancer deaths.

Clearly, no food or health writer understood the study methodology or statistics, or bothered to look beyond the press release. Claims that this study supported the need for everyone to significantly reduce their meat intake were simply unfounded.

Nor are such admonitions supported by the body of the soundest evidence to date, as has been covered in depth previously. We don’t even need to think very hard. Rates of premature deaths from all causes, including cancer and heart disease, have been steadily dropping for more than half a century, while per capita red meat consumption has stayed nearly identical since 1990. That makes all of the unfounded speculations we’ve heard this week to “explain” why red meat is deadly the biggest give-away of people who don’t understand science.

If you’re still worried about red meat, though, there’s a simple solution: grill or roast your favorite type of red meat until it’s a rich brown color and enjoy guilt-free. It tastes a lot better, too. :-)

© 2009 Sandy Szwarc
