Health advice changes regularly because scientists are constantly making new discoveries, and the news media reports on those discoveries nearly in real time.
While that system seems like a good idea, it has a major fault: A single scientific study does not provide enough evidence to offer reliable health advice. Broad recommendations require scientists to test and retest hypotheses using robust study designs, and even then, the data may not present a clear result.
Here’s an amusing example: Two Harvard researchers analyzed the available data on 50 common recipe ingredients to find out which foods are associated with causing or preventing cancer. Surprisingly, for the majority of foods in the study, the authors found published data indicating that they both caused and prevented cancer.
The problem is not that the studies are faulty, or that researchers are skewing the results. Wide-ranging results occur in scientific research because there are dozens of variables that can affect the data. To draw an accurate conclusion, researchers in other labs need to replicate the same or similar studies. When there is a large enough body of evidence on a given topic, researchers conduct a systematic review that combines all of the data to produce a sound, well-supported recommendation.
Trouble arises when the media reports on individual study results without giving any context, and readers interpret those results as fact.
That brings us to a systematic review published last month. Researchers set out to measure whether newspapers report on scientific studies that can legitimately offer broad health advice. To do this, they looked at a database of more than 5,000 studies. They found that 161 of those studies were reported in newspapers, in more than 1,500 newspaper articles in all.
So, which studies made the news? Nearly 40 percent of the studies covered in newspapers addressed lifestyle topics, such as diet or smoking. The remaining 60 percent focused on non-lifestyle factors, such as biomarkers for disease.
For lifestyle studies, newspapers covered initial findings and follow-up studies about equally. But for non-lifestyle factors, newspapers were significantly more likely to cover initial findings than follow-up studies. And for both types of studies, publication in a prestigious journal considerably increased the chance of newspaper coverage.
Newspapers rarely reported negative results, even though they are often just as valuable as positive ones. In the review, only 5 percent of the media reports covered negative results.
That brings us to the reliability of the studies in newspapers. For more than half of the studies receiving newspaper coverage, the authors could not find a larger body of evidence confirming that the reported research findings offered meaningful health advice. In short, more than half of the newspaper articles on health studies described findings that had not been substantiated.
In addition, there were 234 newspaper articles about initial studies that were later disconfirmed. But only four articles reported that further research had revealed the initial conclusions were incorrect. In every other case, readers were left believing that the original results were accurate.
What does all of this mean for the average consumer? Read health news with a wary eye. Look for evidence reported in the news that is confirmed by other studies. And – above all – recognize that scientific research moves slowly and involves a great deal of uncertainty.