Which story are you more likely to read: “Vitamin X, The Fountain of Youth!” or “Scientists Discover Mechanism Behind Cell Atrophy”? It's not only the media that distorts the results of scientific studies to gain attention. Scientists themselves are sometimes guilty of exaggerating the significance of their findings in the journal articles that they publish.

A new study suggests that nearly one out of every ten published papers on nutrition or obesity does exactly that in its summary or abstract, making statements that overreach, going beyond what the study actually shows. And the practice has gotten worse in the last ten years.

In an attempt not to overreach here, it's only fair to mention that this study looked at a fairly small sample: 937 papers published in eight leading journals. But if that sample is representative, 8.9% of nutrition and obesity papers have overreaching conclusions.

Overstated Results Can Mislead Policy

These overstatements may be unintentional, but they mislead both the public and professionals. They can also cause policymakers to make inappropriate recommendations and implement needless policy changes. And if researchers are overstating their results, one can only imagine what the media is doing to them.

Among the more common types of overreaching were conclusions stating that one event caused another when the research showed only that the two events tended to occur together, or that a finding from a very specific group of people, such as those over 80, likely applied to a much broader population.

Studies published in 2011 were more likely to overreach than those published in 2001. And studies without funding were more likely to overreach than funded ones, no matter who provided the funding.

It took years and many studies to adequately prove that cigarette smoking caused cancer. Usually, the best researchers can do is to show a correlation: that two events appear to vary in relation to each other. For example, a recent study suggests that doing simple household chores may be heart healthy and even add years to your life. What the study authors' published conclusion says is, “A generally active daily life was, regardless of exercising regularly or not, associated with cardiovascular health and longevity in older adults.”

Will Cleaning the Garage Really Prolong Your Life?

The data showed that people who led active lives, including those who spent significant time on housework and DIY (do-it-yourself) activities, were healthier and less likely to die over the 12.5-year course of the study. The study does not prove that painting the living room, building a doghouse, or washing the windows caused them to live longer. There could have been other factors at work that would have to be ruled out (or, in scientific terms, controlled for); for example, as a group DIY-ers might be more physically active in general, have better eating habits, or have more education, all of which could also affect longevity and, therefore, the results.

The fact that the study doesn't prove a cause-and-effect relationship, but merely an association, shouldn't stop anyone from embarking on those types of activities. They seem to go along with good health, and your house probably needs the work done anyway. They may very well make you healthier. But it would be wrong for the study authors to say that mopping the floors, gardening, or even repainting your house will make you healthier. And to their credit, they don't.

For a more thorough exploration of the pitfalls of a study “proving” that exercise will add twenty years to your life (though there are plenty of good reasons to exercise), see What's Good Health Information?

Know the Buzzwords; Be Skeptical

So what can you do to get a better idea of whether an article summary or media description of a piece of nutrition or obesity research is accurate or overstates its results? In scientific research, as in so much of life, the devil is in the details. And those details are why studies presented on TheDoctor generally include links to the original study (see below).

Even then the details may be hard to access. Scientific papers (except for their abstracts) usually must be purchased before you can read them, and they are often expensive. And their language and the statistics used to crunch their data can be daunting, to say the least. But you can't be sure what a study's findings mean unless you know how it was conducted, or unless the journalist reporting on it has been careful to explain the details fully.

For most people, the best approach is to 1) be skeptical of applying the findings of animal studies to people; and 2) pay close attention to words like “correlation” or “association,” recognizing that those terms indicate the study did not establish a cause-and-effect relationship. If a study's results sound too good to be true, they probably are being overstated.

The study was done by a team at the University of Alabama at Birmingham and appears in the American Journal of Preventive Medicine. Check it out and see if you think the team's results have been presented accurately here.