An analysis of psychological meta-analyses reveals a reproducibility problem

Meta-analyses in psychology are not always reproducible because the meta-analytic process is often reported without sufficient transparency, according to a new study published May 27, 2020 in the open-access journal PLOS ONE by Esther Maassen of Tilburg University, the Netherlands, and colleagues.

Meta-analysis is a widely used method for combining and comparing quantitative data from multiple primary studies. The statistical approach used in meta-analyses can reveal whether study outcomes differ depending on particular study characteristics, and it can be used to compute an overall effect size (for instance, the magnitude of a treatment effect) for the topic of interest. However, many steps of a meta-analysis involve decisions and judgments that can be arbitrary or that differ between researchers.
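To illustrate the pooling step, the sketch below shows a simple fixed-effect, inverse-variance weighted average of effect sizes. It is not taken from the study; the effect sizes and variances are made-up numbers used only to show how an overall effect size is computed.

```python
import numpy as np

# Hypothetical effect sizes (standardized mean differences) and their
# sampling variances from five primary studies; these numbers are made up.
effects = np.array([0.30, 0.45, 0.12, 0.51, 0.22])
variances = np.array([0.020, 0.035, 0.015, 0.040, 0.025])

# Fixed-effect (inverse-variance) pooling: each study is weighted by the
# precision of its estimate, i.e. 1 / variance.
weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"Pooled effect size: {pooled:.3f} (SE {pooled_se:.3f})")
```

More precise studies (smaller variances) pull the pooled estimate toward their values, which is why errors in individual effect sizes can shift the overall result.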

In the new study, the researchers analyzed 33 meta-analysis articles in the field of psychology. The meta-analyses were all published in 2011 and 2012, all presented data tables of their primary studies, and all included at least ten primary studies. For each meta-analysis, the team retrieved the corresponding primary study articles, followed the methods described in the meta-analysis article, and recomputed a total of 500 of the effect sizes reported in the meta-analyses.
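Recomputing an effect size typically means going back to the summary statistics reported in the primary study. As a minimal sketch (not the authors' code; the formula is the standard pooled-standard-deviation version of Cohen's d and the numbers are illustrative), a standardized mean difference can be recomputed from group means, standard deviations, and sample sizes:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) between two groups,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical summary statistics as they might appear in a primary study.
d = cohens_d(mean1=5.4, sd1=1.2, n1=40, mean2=4.8, sd2=1.1, n2=38)
print(f"Recomputed effect size d = {d:.3f}")
```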

Of the 500 primary study effect sizes, the researchers were able to reproduce 276 (55%) without any problems. (Here, reproducibility was defined as arriving at the same result after reanalyzing the same data following the reported procedures.) In some cases, however, the meta-analyses did not contain enough information to recompute an effect size, and in others the recomputed value differed from the one reported: 114 effect sizes (23%) showed discrepancies with the values given in the meta-analytic article, and 30 of the 33 meta-analyses contained at least one effect size that could not be easily reproduced.
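In practice, that kind of check amounts to comparing each recomputed value against the value printed in the meta-analytic data table, allowing some slack for rounding. A minimal sketch with hypothetical study names and values:

```python
# Reported effect sizes from a (hypothetical) meta-analytic data table,
# alongside the values recomputed from the primary studies.
reported   = {"Study A": 0.42, "Study B": 0.31, "Study C": 0.55}
recomputed = {"Study A": 0.42, "Study B": 0.28, "Study C": 0.55}

tolerance = 0.01  # allow for rounding in the published table
for study, value in reported.items():
    diff = abs(value - recomputed[study])
    status = "reproduced" if diff <= tolerance else "discrepancy"
    print(f"{study}: reported {value:.2f}, recomputed {recomputed[study]:.2f} -> {status}")
```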

When the erroneous or unreproducible effect sizes were carried forward into each meta-analysis itself, the team found that 13 of the 33 meta-analyses (39%) showed discrepancies in their overall results, although many of these were negligible. The researchers recommend extending existing guidelines for publishing psychological meta-analyses to make them more reproducible.
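Conceptually, that final check re-runs the pooling step with the recomputed effect sizes and compares the result to the published estimate. The sketch below reuses the inverse-variance weighting shown earlier, again with made-up numbers rather than data from the study:

```python
import numpy as np

def pooled_effect(effects, variances):
    """Fixed-effect, inverse-variance weighted average of effect sizes."""
    weights = 1.0 / np.asarray(variances, dtype=float)
    return float(np.sum(weights * np.asarray(effects, dtype=float)) / np.sum(weights))

variances          = [0.020, 0.035, 0.015, 0.040, 0.025]
reported_effects   = [0.30, 0.45, 0.12, 0.51, 0.22]
recomputed_effects = [0.30, 0.41, 0.12, 0.51, 0.25]  # two values differ after recomputation

print(f"Pooled estimate from reported values:   {pooled_effect(reported_effects, variances):.3f}")
print(f"Pooled estimate from recomputed values: {pooled_effect(recomputed_effects, variances):.3f}")
```

Small discrepancies in individual effect sizes often shift the pooled estimate only slightly, which is consistent with the authors' finding that many of the resulting differences were negligible.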
