It is hard to find relevant information among the contradictory and sometimes misleading reports in the medical news. As a consequence, valuable information may be lost, while useless information is taken seriously.
Evaluating the results and their background in medical and scientific papers and abstracts is complicated and time-consuming for healthcare providers, and more or less impossible without a medical education. Therefore, press releases, headlines and abstracts become a primary source for the news. But what, and whom, can you trust?
With close to 130 medical and scientific papers being added to MedLine every hour, 24 hours a day, seven days a week, 365 days a year, it is not possible for anyone to keep up with all the news. Consequently, valuable information may get lost while useless information lives on. Trying to keep up, we use databases, keywords, perhaps reliable sources, or perhaps web services presenting headlines and abstracts. Oh yes, and Twitter, Facebook, Google, Youtube, and much more. Journalists selecting the information to be displayed in the news stream must use the same methods, but they also receive press releases and are being pitched from different sources. Press releases are designed to catch the journalist's or other news provider's attention, with the hope of being presented in the news and gaining subsequent public awareness of the content of the press release – and of the sender.
Trustworthy sources are often mentioned in the coverage of the news, adding credibility to the information, or at least that is what is intended. However, a paper in the British Medical Journal tells us another story. The authors of the BMJ paper examined 462 press releases from UK universities and found that 40% contained exaggerated advice, and 33% contained exaggerated causal claims (like implying or stating that one thing caused another when the study only observed a correlation). 36% contained exaggerated inference to humans from animal research. This was of course often reflected in the news stories, in particular for the causal claims and for findings from animal research extrapolated to humans.
You may wonder why universities would make stronger claims than their research can account for. One reason could be to increase the uptake of the news, but then this study leaves them disappointed, as the BMJ paper found little evidence that exaggeration leads to increased uptake. Other reasons could be increasing competition and the resulting need for self-promotion, support for funding, or a wish to create awareness of the institution.
The authors of the paper emphasise that this should not be perceived as shifting the blame from the journalists to the press offices of the universities. Press releases may be created in the press office, but they are most often drafted in a dialogue between the scientists and the press officers. This makes the problem fairly easy to solve: no one knows the data better than the scientists, which is why they can easily improve the accuracy of the information in the press releases. After all, many studies are provisional, serving as links between previous and future research and findings, and are not suitable for being "today's news".
But press releases also come from many sources other than academic institutions, such as patient organisations, pharma, healthcare institutions and providers, and insurance companies, just to mention a few. Any such source may have another reason for the press release than simply informing, a reason not disclosed. This may add bias to the content, both in what is told and in what is left untold. There is not necessarily anything wrong with such a press release, but it requires great critical faculty from any reader.
Unfortunately, some stories in the news are simply presented from press releases without being critically examined by the journalist. If the journalists do not want to tell about the study, e.g. that it was made on rats, that it only included seven persons, or who funded the study, then at least provide a reference.
Finally, the British Medical Journal study is looking at correlations, not causality, so please don't over-interpret the data! Next, we must remember that the majority of the 462 press releases in the study did not exaggerate. Last but not least, we have to bear in mind that big claims and miracle cures are seldom right, even though "it was in the news".