
“One more time”: why replicating some syntheses of evidence relevant to COVID-19 makes sense

  • Matthew J. Page (corresponding author)
    School of Public Health and Preventive Medicine, Monash University, 553 St Kilda Road, Melbourne, Victoria, 3004, Australia. Tel.: +61-9903-0248.
  • Vivian A. Welch
    Bruyere Research Institute, Ottawa, Canada; Campbell Collaboration, Oslo, Norway; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • Neal R. Haddaway
    Stockholm Environment Institute, Linnégatan 87D, Stockholm, Sweden; African Centre for Evidence, University of Johannesburg, Johannesburg, South Africa; The SEI Centre of the Collaboration for Environmental Evidence, Stockholm, Sweden; Mercator Research Institute on Global Commons and Climate Change, Berlin, Germany
  • Sathya Karunananthan
    Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
  • Lara J. Maxwell
    Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • Peter Tugwell
    Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; Faculty of Medicine, Department of Medicine, and School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada

      Highlights

      • Given the urgent need for credible answers to high-priority questions about the health and social impacts of COVID-19, many systematic reviewers seek to contribute their skills and expertise.
      • Rather than embarking on unnecessary, duplicate reviews, we encourage the evidence synthesis community to prioritize purposeful replication of systematic reviews of evidence relevant to COVID-19.
      • We explain why replication of systematic reviews is important, how to carry out a replication, and when to consider replication of reviews.
      The coronavirus disease 2019 (COVID-19) pandemic has mobilized researchers across the world on a scale not seen before [1]. As of 11 May 2020, 2,787 studies presenting primary data on COVID-19 have been indexed in MEDLINE and Embase [2], and 1,029 clinical trials of interventions for the disease are currently underway [3,4]. The preprint servers medRxiv and bioRxiv host more than three thousand preprints on COVID-19 [5]. There is also a wealth of data from previous pandemics (e.g., SARS and MERS) that may inform efforts to combat COVID-19. To make sense of all these data, timely, relevant systematic reviews and meta-analyses have started appearing (e.g., [6-10]) and more will be necessary in the coming weeks and months. Systematic reviews are required to address not only the etiology, diagnosis, prognosis, and treatment of symptoms of COVID-19, but also the social impacts of the disease (e.g., effects of strategies to support parents with home-schooling their children and educators with online learning pedagogical strategies; consequences of police being mobilized to enforce quarantines; and international development issues such as food security during the pandemic). We believe that while original reviews are essential, decision-making during the pandemic would also benefit from the purposeful replication of some systematic reviews of evidence relevant to COVID-19.
      In this article, we draw a distinction between duplication and replication of systematic reviews. By “duplication” of systematic reviews, we mean needless, frequently unwitting or unacknowledged repetition of reviews without a clearly defined purpose for the repetition. By “replication”, we mean using the same or very similar methods as a previous systematic review to determine whether comparable results are obtained, or intentionally broadening or narrowing the question addressed in a previous review to check how operationalization of concepts in the previous review influenced the results [11].
      Previous research suggests that a high proportion of systematic reviews and meta-analyses duplicate those that came before [12,13]. For example, 57 systematic reviews of the effects of direct oral anticoagulants for stroke prevention in atrial fibrillation were published between 2012 and 2017 [14]. Duplicate systematic reviews waste time and resources, creating extra work for health care providers and other users who need to determine what unique information, if any, each review provides. Duplication can also create confusion when reviews addressing the same question reach conflicting findings [15]. By 11 May 2020, 806 systematic reviews of human studies relevant to COVID-19 had been registered in PROSPERO, the international prospective register of systematic reviews [16]. However, many of these registered reviews appear to address the same or a similar question (e.g., 21 reviews include chloroquine or hydroxychloroquine in the title and 30 include traditional Chinese medicine). Given the urgent need for credible answers to high-priority questions about the health and social impacts of COVID-19, it is unsurprising that many systematic reviewers seek to contribute their skills and expertise. However, unless different teams working on the same review begin collaborating with one another, an epidemic of redundant reviews on COVID-19 is likely on the horizon.
      Along with minimizing production of unnecessary, duplicate reviews, we encourage the evidence synthesis community to prioritize purposeful replication of some systematic reviews of evidence relevant to COVID-19. Initially, this could involve replicating previously published, high-priority reviews conducted to address questions of relevance to a previous pandemic (e.g., what are the effects of wearing masks in public?) or questions originally posed in an unrelated context (e.g., what are the effects of programs to support physical activity at home for housebound older adults?). Thereafter, it may be necessary to replicate some reviews relevant to COVID-19 that are conducted during the pandemic (e.g., what are the effects on COVID-19 symptoms of drugs currently being evaluated in randomized trials?). Replicating reviews might satisfy the curiosity of methodologists wondering what impact specific methods have on review findings, but that is far from their only purpose. Rather, replicating reviews is a mechanism for verifying or addressing uncertainties about the results of an original review that decision-makers might be relying on to formulate recommendations for practice and policy.
      Replication of reviews is important in general but is especially valuable for syntheses of evidence relevant to COVID-19. The results of systematic reviews are determined by many choices relating to their design, conduct, and analysis [17,18]. For example, reviewers need to decide which studies to include, how to identify studies, which outcome data to collect, and how to synthesize the results. There are also many opportunities for errors in reviews, for example, in the selection of eligible studies or the collection of relevant data. These issues are compounded during a pandemic such as COVID-19, when stakeholders need answers to pressing questions as soon as possible. The time available to decide the review's scope and methods may be substantially less than usual and the potential for errors may be considerably higher. Replication of systematic reviews of evidence relevant to COVID-19 can therefore serve as a useful quality control process. The results of a replication could increase or decrease confidence in the claims made in the original review, indicate constraints on the reliability of the findings, and help refine or advance theory, thereby providing more accurate information for decision-makers during the pandemic [19].
      Many systematic reviews of evidence relevant to COVID-19 are using methodological shortcuts to provide evidence in a timely manner [20]. For example, in their review of quarantine alone or in combination with other public health measures to control COVID-19, Nussbaumer-Streit et al. [21] decided to have a single author screen 70% of titles and abstracts, and one author collect data with verification by another. Based on registration data in PROSPERO, many systematic reviewers are keen to contribute to the COVID-19 research effort; rather than proceeding with a redundant review, they could band together to work on purposeful replications that evaluate the impact of abbreviated methods on review findings. Doing so could help reveal what risks, if any, the use of methodological shortcuts entails, adding to the limited comparative evidence on different methods for systematic reviews [22].
      Replication of systematic reviews can be performed in various ways, some requiring fewer resources than others. Systematic reviewers could perform a full replication of a review by repeating the entire set of systematic review methods, or a partial replication by repeating a particular method for which there was reason for concern. Examples of the latter include: running the same or a broader search to see whether any relevant studies were missed; extracting the study data necessary to recreate one of the reported meta-analyses to see whether an alternative result is obtained; or conducting a more in-depth analysis of a subgroup of studies in the original review. Replication of a review could be performed to determine the impact of involving different stakeholders (e.g., patients and insurers) in the review process or of using an alternative statistical or qualitative synthesis approach [23]. Replication might also be performed to evaluate the impact on the review findings of using automation tools (e.g., for study selection or risk of bias assessment) [24], as compared with an original review relying on human reviewers only. The commonality among all these approaches is the adoption of similar or somewhat expanded methods as those used in a target systematic review. By contrast, adopting the methods of an entirely different type of evidence synthesis (e.g., a scoping review or an overview of systematic reviews) would not constitute a replication, given the different purpose the other type of synthesis serves.
      Even with an army of experienced systematic reviewers, replicating every review of evidence relevant to COVID-19 is neither feasible nor desirable. Systematic reviewers, commissioners, and other stakeholders must therefore prioritize which reviews to replicate. An international, multidisciplinary group of 36 individuals from seven countries met in Wakefield, Canada, in 2019 to develop guidance on when and when not to replicate systematic reviews. The resulting guidance advises reviewers to consider various criteria, such as (i) the priority of the review question for decision makers; (ii) the potential for replication to address uncertainties, controversies, or the need for additional evidence relating to the framing, conduct, potential for author influence, or discordance of findings in previous reviews; (iii) the extent to which implementation of the results of the replication could affect a sizable population; and (iv) whether the resources required to replicate are offset by the potential value of reaffirming or addressing uncertainties related to the original results [25]. An article describing this guidance is under review, and we encourage anyone seeking further information, or keen to collaborate on research on the replicability of systematic reviews, to contact us.
      The COVID-19 context provides some unique opportunities and challenges for the replication of reviews. Several syntheses of evidence relevant to COVID-19 are being continually updated (i.e., “living” reviews) [4,10,26,27]. If one of these reviews were replicated and errors were identified, they could be corrected in the original review at a much faster pace than usually occurs. In addition, by bringing together various organizations to help reduce duplication and better coordinate evidence syntheses relevant to COVID-19, the recently established COVID-19 Evidence Network to support Decision-making (COVID-END) [28] could help facilitate the process of prioritizing and coordinating replications of reviews. On the other hand, the politicization of discussions about COVID-19 means that the findings of replicated reviews would need to be communicated carefully, as failure to obtain the same result in a replication could be weaponized by some to discredit the entire systematic review process, something already observed in discourse on modeling studies for COVID-19 [29].
      To enhance replication of systematic reviews relevant to COVID-19 completed during the pandemic, we urge systematic reviewers to make their workflow publicly accessible. We recommend reviewers use reporting guidelines for systematic reviews, which typically recommend that authors report what question(s) the review addressed, the types of studies they considered eligible, how they identified such studies, which data they collected, and how the results were synthesized [30]. Following the principles of “Open Synthesis” by sharing the underlying data, analytic code, and other materials used in the review via one of the various public repositories available (such as the Open Science Framework, figshare, or Dryad) can supplement the information provided in the review report [31]. For example, the summary data required to rerun meta-analyses, and data for other outcomes for which meta-analysis was not possible, could be provided in a well-curated format ready for reuse (e.g., a Review Manager file, or a Microsoft Excel or CSV file) along with any analytic code necessary for reanalysis. In addition, data extraction forms that clearly indicate what data were sought, what data were obtained, and where data were obtained from may reduce uncertainties for replicators [31,32]. Replicators should also register their plans to replicate a review in PROSPERO and post working protocols in publicly accessible repositories. In addition to aiding replication efforts, making the review workflow available for scrutiny should help increase the public's trust in systematic review findings.
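      Sharing summary data in such a reusable format means a replicator can rerun a pooled analysis with a few lines of code. The sketch below is a minimal, hypothetical example: the study names and effect estimates are invented, and it assumes the deposited CSV holds one row per study with a log risk ratio (yi) and its standard error (sei); it pools them with the DerSimonian-Laird random-effects method using only the Python standard library.

```python
# Minimal sketch of the kind of reanalysis script a review team could deposit
# alongside its summary data. All study names and estimates are hypothetical;
# a real deposit would ship the review's actual extracted data as a CSV file.
import csv
import io
import math

# Stand-in for a deposited "summary_data.csv": one row per study,
# with a log risk ratio (yi) and its standard error (sei).
SUMMARY_CSV = """study,yi,sei
Study A,-0.51,0.21
Study B,-0.22,0.14
Study C,-0.35,0.30
"""

def dersimonian_laird(rows):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    yi = [float(r["yi"]) for r in rows]
    wi = [1.0 / float(r["sei"]) ** 2 for r in rows]          # fixed-effect weights
    fixed = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
    q = sum(w * (y - fixed) ** 2 for w, y in zip(wi, yi))    # Cochran's Q
    c = sum(wi) - sum(w * w for w in wi) / sum(wi)
    tau2 = max(0.0, (q - (len(yi) - 1)) / c)                 # between-study variance
    wi_re = [1.0 / (1.0 / w + tau2) for w in wi]             # random-effects weights
    mu = sum(w * y for w, y in zip(wi_re, yi)) / sum(wi_re)
    se = math.sqrt(1.0 / sum(wi_re))
    return mu, se, tau2

rows = list(csv.DictReader(io.StringIO(SUMMARY_CSV)))
mu, se, tau2 = dersimonian_laird(rows)
print(f"Pooled log RR {mu:.3f} "
      f"(95% CI {mu - 1.96 * se:.3f} to {mu + 1.96 * se:.3f}), tau^2 = {tau2:.3f}")
```

      A deposited script along these lines would let a replicator verify a pooled estimate directly from the shared summary data before investing in a fuller replication.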
      We believe Nosek and Errington's view of replication as an “exciting, generative, vital contributor to research progress” [19] easily applies to systematic reviews and other research syntheses. However, replicated systematic reviews are currently a rarity, likely because their potential value is under-recognized by researchers, funders, journals, and other stakeholders. We hope that this changes throughout the COVID-19 pandemic and beyond, with replicated systematic reviews coming to be seen as highly valued and necessary research products, and redundant reviews as a relic of the past.

      CRediT authorship contribution statement

      Matthew J. Page: Conceptualization, Writing - original draft, Writing - review & editing. Vivian A. Welch: Writing - review & editing. Neal R. Haddaway: Writing - review & editing. Sathya Karunananthan: Writing - review & editing. Lara J. Maxwell: Writing - review & editing. Peter Tugwell: Writing - review & editing.

      References

       1. Wang C, Horby PW, Hayden FG, Gao GF. A novel coronavirus outbreak of global health concern. Lancet. 2020;395:470-473.
       2. Lorenc T, Khouja C, Raine G, Sutcliffe K, Wright K, Sowden A, et al. COVID-19: living map of the evidence. London: EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London; 2020.
       3. Global Coronavirus COVID-19 Clinical Trial Tracker. Available at: https://www.covid19-trials.org/. Accessed May 11, 2020.
       4. Thorlund K, Dron L, Park J, Hsu G, Forrest JI, Mills EJ. A real-time dashboard of clinical trials for COVID-19. Lancet Digit Health. 2020;e286-e287.
       5. COVID-19 SARS-CoV-2 preprints from medRxiv and bioRxiv.
       6. Oxford COVID-19 evidence service. Available at: https://www.cebm.net/covid-19/. Accessed May 11, 2020.
       7. Cochrane COVID rapid reviews. Available at: https://covidrapidreviews.cochrane.org/. Accessed May 11, 2020.
       8. COVID-19 L·OVE WG.
       9. Keenan C, Fogarty D, Cheng S, Noone C. The role of evidence synthesis in COVID19.
      10. Living mapping and living network meta-analysis of Covid-19 studies. Available at: https://covid-nma.com/. Accessed May 11, 2020.
      11. Karunananthan S, Maxwell LJ, Welch V, Petkovic J, Pardo Pardo J, Rader T, et al. When and how to replicate systematic reviews. Cochrane Database Syst Rev. 2020;(2):MR000052.
      12. Siontis KC, Hernandez-Boussard T, Ioannidis JP. Overlapping meta-analyses on the same topic: survey of published studies. BMJ. 2013;347:f4501.
      13. Naudet F, Schuit E, Ioannidis JPA. Overlapping network meta-analyses on the same topic: survey of published studies. Int J Epidemiol. 2017;46:1999-2008.
      14. Doundoulakis I, Antza C, Apostolidou-Kiouti F, Akrivos E, Karvounis H, Kotsis V, et al. Overview of systematic reviews of non-vitamin K oral anticoagulants in atrial fibrillation. Circ Cardiovasc Qual Outcomes. 2018;11:e004769.
      15. Hacke C, Nunan D. Discrepancies in meta-analyses answering the same clinical question were hard to explain: a meta-epidemiological study. J Clin Epidemiol. 2020;119:47-56.
      16. PROSPERO. Available at: https://www.crd.york.ac.uk/prospero/. Accessed May 11, 2020.
      17. Wanous JP, Sullivan SE, Malinak J. The role of judgment calls in meta-analysis. J Appl Psychol. 1989;74:259-264.
      18. Palpacuer C, Hammas K, Duprez R, Laviolle B, Ioannidis JPA, Naudet F. Vibration of effects from diverse inclusion/exclusion criteria and analytical choices: 9216 different ways to perform an indirect comparison meta-analysis. BMC Med. 2019;17:174.
      19. Nosek BA, Errington TM. What is replication? PLoS Biol. 2020;18:e3000691.
      20. Borges do Nascimento IJ, O'Mathuna DP, von Groote TC, Abdulazeem HM, Weerasekara I, Marusic A, et al. Coronavirus disease (COVID-19) pandemic: an overview of systematic reviews. medRxiv. 2020.
      21. Nussbaumer-Streit B, Mayr V, Dobrescu AI, Chapman A, Persad E, Klerings I, et al. Quarantine alone or in combination with other public health measures to control COVID-19: a rapid review. Cochrane Database Syst Rev. 2020;4:CD013574.
      22. Robson RC, Pham B, Hwee J, Thomas SM, Rios P, Page MJ, et al. Few studies exist examining methods for selecting studies, abstracting data, and appraising quality in a systematic review. J Clin Epidemiol. 2019;106:121-135.
      23. Melendez-Torres GJ, Thomas J, Lorenc T, O'Mara-Eves A, Petticrew M. Just how plain are plain tobacco packs: re-analysis of a systematic review using multilevel meta-analysis suggests lessons about the comparative benefits of synthesis methods. Syst Rev. 2018;7:153.
      24. Clark J, Glasziou P, Del Mar C, Bannach-Brown A, Stehlik P, Scott AM. A full systematic review was completed in 2 weeks using automation tools: a case study. J Clin Epidemiol. 2020;121:81-90.
      25. Welch V, Grimshaw J, Rada G, Smith M, Pardo Pardo J, Soares-Weiser K, et al. When should systematic reviews be replicated, and when is it wasteful? In: Abstracts of the 26th Cochrane Colloquium, Santiago, Chile. Cochrane Database Syst Rev. 2020:18.
      26. Chou R, Dana T, Buckley DI, Selph S, Fu R, Totten AM. Epidemiology of and risk factors for coronavirus infection in health care workers: a living rapid review. Ann Intern Med. 2020.
      27. Juul S, Nielsen N, Bentzer P, Veroniki AA, Thabane L, Linder A, et al. Interventions for treatment of COVID-19: a protocol for a living systematic review with network meta-analysis including individual patient data (The LIVING Project). Syst Rev. 2020;9:108.
      28. COVID-19 Evidence Network to support Decision-making (COVID-END).
      29. Wan W, Blake A. Coronavirus modelers factor in new public health risk: accusations their work is a hoax. The Washington Post.
      30. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann T, Mulrow CD, et al. Mapping of reporting guidance for systematic reviews and meta-analyses generated a comprehensive item bank for future reporting guidelines. J Clin Epidemiol. 2020;118:60-68.
      31. Haddaway NR. Open synthesis: on the need for evidence synthesis to embrace open science. Environ Evid. 2018;7:26.
      32. Page MJ, Altman DG, Shamseer L, McKenzie JE, Ahmadzai N, Wolfe D, et al. Reproducible research practices are underused in systematic reviews of biomedical interventions. J Clin Epidemiol. 2018;94:8-18.