Garbage In–Garbage Out: Systematic Reviews and Meta-Analyses Can Tell Us a Flawed Story

Well-known Stanford University researcher John Ioannidis published a new paper this week criticizing the use and production of systematic reviews and meta-analyses, often considered the highest forms of research evidence. In the paper, “The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses,” Ioannidis describes meta-analyses as having been taken over by industry sponsors and concludes that only an estimated 3% of these reviews may be useful.

In 1978, Hans Eysenck commented on the “mega-silliness” of using poorly designed research studies to study outcomes in psychotherapy. He quoted the well-known computer science maxim “garbage in, garbage out” to describe the uncritical selection of disparate studies to produce reviews. “A mass of reports – good, bad and indifferent – are fed into the computer in the hope that people will cease caring about the quality of the material on which the conclusions are based,” wrote Eysenck.

John Ioannidis, via Stanford University

The pitfalls of this practice are the subject of a new investigation by John Ioannidis, a Stanford University researcher well known for his critique of research methodologies summarized in his paper “Why Most Published Research Findings Are False.” Focusing on biomedical research, he writes, “Most topics addressed by meta-analyses of randomized trials have overlapping, redundant meta-analyses; same topic meta-analyses may exceed 20 sometimes. Some fields produce massive numbers of meta-analyses; for example, 185 meta-analyses of antidepressants for depression were published between 2007 and 2014. These meta-analyses are often produced either by industry employees or by authors with industry ties and results are aligned with sponsor interests.”

Systematic reviews and meta-analyses have grown at an exponential rate – from 1991 to 2014, the number of these studies in the published literature increased by more than 2,500%. While they are a useful way to combine results from a large number of studies and have the potential to provide a breadth of information, many have been conducted uncritically and used to advance industry interests instead of good science. (A minimal sketch of the pooling arithmetic appears below.)
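To make concrete what “combining results” means, here is a minimal sketch of the standard inverse-variance (fixed-effect) pooling step at the heart of many meta-analyses. The effect sizes and standard errors are hypothetical illustrative values, not data from the paper; the point is that the arithmetic simply averages whatever studies it is handed, so biased or poorly conducted trials are pooled just as readily as sound ones.

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Inverse-variance (fixed-effect) pooling of study effect sizes.

    effects: per-study effect estimates (e.g., standardized mean differences)
    std_errors: their standard errors
    Returns (pooled_effect, pooled_standard_error).
    """
    # Each study is weighted by its precision (1 / variance).
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes and standard errors, purely for illustration.
effects = [0.30, 0.25, 0.45, 0.60]
ses = [0.10, 0.12, 0.15, 0.20]

pooled, pooled_se = fixed_effect_meta(effects, ses)
print(f"Pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")
```

Note that the pooled standard error shrinks as studies are added regardless of their quality, which is exactly why an uncritical synthesis can look more precise while inheriting the biases of its inputs.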

Ioannidis reports that most studies cannot be included in systematic reviews because replication studies are avoided by the research community in its quest for innovation. This leads to a glut of similar but different studies that cannot be combined analytically. Another issue is the sheer number of meta-analyses and reviews, especially when their conclusions differ even though they are summarizing the same evidence.

“…it is possible that nowadays there are more systematic reviews of randomized trials being published than new randomized trials,” he comments.

The author points to the meta-analyses and systematic reviews of antidepressant medication as an exemplar of the major issues with this literature. The studies covered by these reviews contain critical methodological flaws, including the selective reporting of industry-favorable outcomes. This interacts with the fact that most analyses and reviews of this topic tend to be conducted by investigators with major financial conflicts of interest, turning the review into a viable marketing tool. “Conflicted expert editorials” then often appear to further popularize the biased findings. At best, such reviews provide the public with misinformation; at worst, they can be dangerous.

Ioannidis speculates that only 3% of all meta-analyses provide us with good-quality, clinically useful information. This does not mean that meta-analyses are inherently poor science. Rather, what this study reveals is that the quality of most published information is poor, pointing to the need for better primary data that are synthesized and analyzed as studies are conducted, instead of after the fact.

Other researchers have begun to propose new ways to conduct systematic reviews, for example, by analyzing networks of trials comparing alternative treatments. “Eventually, prospective meta-analyses designed and conducted by non-conflicted investigators may need to become the key type of primary research. Production of primary data, teamwork, replication, and meta-analyses can be integrally connected,” concludes Ioannidis. For this to happen, more fields, including biomedicine, will need to incentivize this kind of collaborative research.

Retraction Watch has published a full interview with Ioannidis about this paper.


****

Ioannidis, J. P. A. (2016). The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses. The Milbank Quarterly, 94(3), 485–514. doi:10.1111/1468-0009.12210 (Abstract)
