News stories about medical research ignore basic facts: study

News stories about medical research, often based on initial findings presented at professional conferences, frequently omit basic facts about the study and fail to highlight important limitations, researchers have found. Such omissions can mislead the public and distort the actual significance of the research.

HAS IT BEEN PUBLISHED: In this photo released by the University of Maastricht's Medical Hospital, a team of two surgeons and two stomach specialists use a tube, miniature camera and newly developed robotic tools to perform surgery to correct heartburn without making any external incisions, Maastricht, Netherlands, Friday June 16, 2006. Only 2 of 175 stories about unpublished studies noted that the study was unpublished. (AP Photo/University of Maastricht's Medical Hospital)

Dr Lisa Schwartz and Dr Steven Woloshin, both Associate Professors of Medicine at Dartmouth Medical School (Hanover, New Hampshire) and at the VA Outcomes Group (White River Junction, Vermont), studied media coverage of research presented at scientific meetings. Their findings have been published in the latest issue of the Medical Journal of Australia.

"Scientific meetings are an important forum for researchers to exchange ideas and present work in progress. But much of the work presented is not ready for public consumption," said Schwartz. "The studies have undergone limited review and findings may change substantially by the time the final report is published in a medical journal." And, she noted, "Some meeting presentations are never published at all."

Scientific meeting research, nevertheless, receives extensive news media coverage. "Unless journalists are careful to provide basic study facts and highlight limitations the public may be misled about the meaning, importance and validity of the research," said Woloshin.

The pair analysed newspaper, television and radio stories from the United States and Canada that covered research presented at five major scientific meetings in 2002 and 2003, checking whether basic study facts (e.g., size, design) were reported; whether cautions about inherent study weaknesses were noted; and whether the news stories were clear about the preliminary stage of the research.

They identified 210 potentially eligible newspaper stories and 20 nationally syndicated television or radio transcripts from the US or Canada that reported on research presented at these scientific meetings (i.e., not general reports about the meetings or policy statements). After reviewing all potentially eligible stories, Schwartz and Woloshin coded the 174 newspaper stories and 13 television/radio transcripts meeting their criteria. To fully report what the public is exposed to, the researchers did not exclude wire reports (which are edited and given headlines at the newspaper’s discretion).

The researchers found that basic study facts were often missing. For example, a third of the reports failed to mention the study size; 40 per cent did not quantify the main result at all. Important study limitations were also often missing. For example, only 6 per cent of the news stories about animal studies noted that results might not apply to humans.

PUBLICITY FOR ALL: Chemotherapy treatments for lung cancer patients are mixed in the pharmacy at the Kimmel Comprehensive Cancer Center at Johns Hopkins in Baltimore, Maryland, USA, August 2005. (AFP/Getty/Win McNamee)

Only 21 per cent of news stories about studies involving fewer than 30 people alerted readers to the imprecision of small studies. Ten per cent of news stories about uncontrolled studies noted (or implied) that without a control group it is not possible to know whether the outcome really relates to the exposure, and 19 per cent of stories about controlled but not randomised studies raised the possibility of confounding. Cautions about possible downsides of interventions were also missing: 142 news stories covered intervention studies, but only 29 per cent noted any possible downsides (e.g., side effects or other harms) or stated that there were none.

Only 2 of 175 stories about unpublished studies noted that the study was unpublished. Schwartz and Woloshin, who frequently make presentations to the media on how to understand and accurately report research results, said that while reporters can and should do better, another reason for misinterpreted or "over-hyped" research is its early release at professional meetings that reporters are encouraged to attend.

"It is not hard to understand why research presented at scientific meetings garners extensive media attention," they wrote. "Researchers benefit from the attention because it is a mark of academic success, their academic affiliates benefit because good publicity attracts patients and donors, and research funders – public and private – benefit when they can show a good return on their investments. The meeting organisers benefit too; extensive media coverage attracts more advertisers, and higher profile scientists for the following year, guaranteeing more dramatic reports and ultimately more press."

Moreover, they noted, "the public has a strong appetite for medical news and scientific meetings provide the media with an easy source of provocative material."

Since media coverage of scientific meetings is unlikely to decrease, the authors have urged reporters and editors to make sure their stories include three things:

(1) basic study facts: what kind of study was done, how many subjects were included, what was the main result;

(2) cautions about study designs with intrinsic limitations; and

(3) clear statements about the preliminary stage of the work under discussion.

Date Posted: 18 June 2006 Last Modified: 18 June 2006