
Open synthesis and the coronavirus pandemic in 2020

  • Neal R. Haddaway (corresponding author; University of Johannesburg, Johannesburg, South Africa)
    Stockholm Environment Institute, Linnégatan 87D, Stockholm, Sweden; African Centre for Evidence, University of Johannesburg, Johannesburg, South Africa; The SEI Centre of the Collaboration for Environmental Evidence, Stockholm, Sweden; Mercator Research Institute on Global Commons and Climate Change, Berlin, Germany
  • Elie A. Akl
    Department of Internal Medicine, Faculty of Medicine, American University of Beirut, Beirut, Lebanon; Department of Health Research Methods, Evidence, and Impact (HEI), McMaster University, Hamilton, Ontario, Canada; The Global Evidence Synthesis Initiative (GESI) Secretariat, American University of Beirut, Beirut, Lebanon
  • Matthew J. Page
    School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
  • Vivian A. Welch
    Bruyère Research Institute, Ottawa, Canada; Campbell Collaboration, Oslo, Norway; School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
  • Ciara Keenan
    Campbell UK and Ireland, Belfast, UK; Centre for Evidence and Social Innovation, Queen's University Belfast, Belfast, UK
  • Tamara Lotfi
    Department of Health Research Methods, Evidence, and Impact (HEI), McMaster University, Hamilton, Ontario, Canada; The Global Evidence Synthesis Initiative (GESI) Secretariat, American University of Beirut, Beirut, Lebanon

      Highlights

      • Open Science principles are vital for ensuring reproducibility, trust, and legacy.
      • Evidence synthesis is a vital means of summarizing research for decision-making.
      • Open Synthesis is the application of Open Science principles to evidence synthesis.
      • Open approaches to planning, conducting, and reporting synthesis have many benefits.
      • We call on the evidence synthesis community to embrace Open Synthesis.


      The coronavirus disease 2019 (COVID-19) pandemic of 2020 has caused high levels of mortality and continues to threaten the lives of the global population [
      World Health Organization
      Coronavirus disease 2019 (COVID-19): situation report, 85.
      ]. The pandemic has been a once-in-a-lifetime event for humanity, affecting every sector of existence: health, education, the economy, the environment, and more. It continues to threaten job prospects for millions of people and has resulted in widespread economic turmoil [
      • McKibbin W.J.
      • Fernando R.
      The global macroeconomic impacts of COVID-19: seven scenarios. SSRN J.
      ]. It has also led to the cancellation of numerous conferences (e.g., [
      • Robbins R.
      STAT’s guide to health care conferences disrupted by the coronavirus crisis. STAT News.
      ]) and of research fieldwork, and has closed offices across the globe.
      As the scientific community grapples with this massive and rapidly evolving crisis, the volume of research literature published in relation to the outbreak has expanded rapidly (Figure 1). Simultaneously, efforts to synthesize this growing evidence base have begun, both through ongoing traditional approaches to independent systematic reviews (e.g., [
      • Sahin A.R.
      • Erdogan A.
      • Mutlu Agaoglu P.
      • Dineri Y.
      • Cakirci A.Y.
      • Senel M.E.
      • et al.
      2019 novel coronavirus (COVID-19) outbreak: a review of the current literature.
      ,
      • Salehi S.
      • Abedi A.
      • Balakrishnan S.
      • Gholamrezanezhad A.
      Coronavirus disease 2019 (COVID-19): a systematic review of imaging findings in 919 patients.
      ]), and through rapid and living systematic reviews (e.g., https://covidrapidreviews.cochrane.org/search/site). Rapid systematic reviews provide timely evidence to inform policy making under urgent circumstances, whereas living systematic reviews ensure that a synthesis remains up to date with the latest evidence (e.g., the work of the L.OVE team at Epistemonikos).
      Fig. 1. Proliferation of publications on COVID-19 found in PubMed on 5th June 2020 with creation dates in 2020 [corresponding to week 23] (all-fields search for ("COVID-19" OR "nCoV" OR "2019 novel coronavirus" OR "2019-nCoV" OR "SARS-CoV-2") AND research). A total of 19,260 hits were identified. Data and code are freely accessible from https://github.com/nealhaddaway/COVID19/. Week of 2020 calculated based on PubMed creation date; records lacking a creation date were excluded.
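The weekly tallying described in the caption is straightforward to reproduce. The sketch below is an illustrative reconstruction in Python, not the actual code from the linked repository: the query string is copied from the caption, while the `weekly_counts` function and the example records are assumptions.

```python
from collections import Counter
from datetime import date

# Query string reproduced from the figure caption (all-fields PubMed search).
QUERY = ('("COVID-19" OR "nCoV" OR "2019 novel coronavirus" '
         'OR "2019-nCoV" OR "SARS-CoV-2") AND research')

def weekly_counts(creation_dates):
    """Tally records per ISO week of 2020 from PubMed creation dates,
    given as (year, month, day) tuples. Records outside 2020 are skipped,
    mirroring the caption's exclusion of records lacking a creation date."""
    counts = Counter()
    for year, month, day in creation_dates:
        if year != 2020:
            continue
        iso_week = date(year, month, day).isocalendar()[1]
        counts[iso_week] += 1
    return dict(sorted(counts.items()))

# Hypothetical records: two created on 5 June 2020 (ISO week 23, as in
# the caption) and one created on 15 January 2020 (ISO week 3).
print(weekly_counts([(2020, 6, 5), (2020, 6, 5), (2020, 1, 15)]))  # {3: 1, 23: 2}
```

Plotting per-week tallies of this kind yields the proliferation curve shown in Figure 1; the real analysis would first fetch the creation dates of all 19,260 hits from PubMed.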
      As the volume of evidence increases and decision makers and scientists struggle to keep pace with the rapidly expanding evidence base, many research groups are volunteering their support, using online collaborative tools and virtual workspaces both to enable continued working during challenging times and to help identify, map, and synthesize research as it emerges.
      This work faces a suite of challenges because of the often closed nature of science, chief among them the duplication of effort (leading to research waste), inefficiency in conducting research, and missed opportunities to address important questions. Open Science principles present an opportunity to address these challenges in the context of the COVID-19 pandemic; they would also make research in the field more collaborative, transparent, and rigorous. This article argues for, and illustrates how to apply, the principles of Open Science to the field of evidence synthesis, a concept we refer to as Open Synthesis [
      • Haddaway N.R.
      Open Synthesis: on the need for evidence synthesis to embrace Open Science.
      ]. We use the COVID-19 pandemic as a case in point to highlight the potentially significant benefits of Openness to the research, policy, and practice communities.

      1. Evidence synthesis

      Evidence synthesis is the name for research methodologies that involve identifying, collating, appraising, and summarizing a body of research evidence using tried and tested systematic and robust literature review methods: i.e., systematic reviews and systematic maps [
      • Gough D.
      • Oliver S.
      • Thomas J.
      An introduction to systematic reviews.
      ]. Systematic reviews are now widely used in the field of health care as a “gold standard” for summarizing evidence to provide support for decision-making in policy and practice, through a variety of knowledge translation products and practice guidelines [
      • Alonso-Coello P.
      • Schünemann H.J.
      • Moberg J.
      • Brignardello-Petersen R.
      • Akl E.A.
      • Davoli M.
      • et al.
      GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction.
      ].
      However, systematic reviewers face challenges as a result of an often closed academic system; research can be difficult to find and download without access to expensive bibliographic databases [
      • Livoreil B.
      • Glanville J.
      • Haddaway N.R.
      • Bayliss H.
      • Bethel A.
      • Lachapelle F.F.
      • et al.
      Systematic searching for environmental evidence using multiple tools and sources.
      ]; primary research articles and the systematic reviews that synthesize them are hidden behind paywalls [
      • Chawla A.
      • Twycross-Lewis R.
      • Maffulli N.
      Microfracture produces inferior outcomes to other cartilage repair techniques in chondral injuries in the paediatric knee.
      ,
      • Piwowar H.
      • Priem J.
      • Larivière V.
      • Alperin J.P.
      • Matthias L.
      • Norlander B.
      • et al.
      The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.
      ]; reporting of the methods used in trials and syntheses is often deficient, hampering verification and learning about methodology [
      • Glasziou P.
      • Altman D.G.
      • Bossuyt P.
      • Boutron I.
      • Clarke M.
      • Julious S.
      • et al.
      Reducing waste from incomplete or unusable reports of biomedical research.
      ]; research data are often not made public, particularly when produced by organizations with commercial interests, such as pharmaceutical companies [
      • Moynihan R.
      • Bero L.
      • Hill S.
      • Johansson M.
      • Lexchin J.
      • Macdonald H.
      • et al.
      Pathways to independence: towards producing and using trustworthy evidence.
      ]; analytical code is rarely shared and statistical methods can be hard to verify [
      • Chiang I.C.A.
      • Jhangiani R.S.
      • Price P.C.
      From the “replicability crisis” to open science practices. Research Methods in Psychology. BCcampus.
      ], and educational materials to train the next generation of evidence synthesists are often not made public [
      • Farrow R.
      Open education and critical pedagogy.
      ].

      2. Open Science

      Open Science has central premises relating to accessibility and the collaborative nature of knowledge creation and the knowledge itself [
      • Fecher B.
      • Friesike S.
      Open science: one term, five schools of thought.
      ]. These principles (see Table 1) include concepts such as open access (unrestricted availability of research publications [11]) and open data (freely accessible research data used in analyses; [
      • Gewin V.
      Data sharing: an open mind on open data.
      ]) that together support efficient, transparent, and rigorous research.
      Table 1. Main concepts within Open Science [translated and adapted from OpenScienceASAP; http://openscienceasap.org/open-science]

      Open data: Freely available research data
      Open source: Use and production of freely accessible software and hardware
      Open methodology: Documentation of the methods of a research process, as far as possible
      Open peer review: Transparent and traceable quality assurance through open peer review
      Open access: Publication of research articles in an accessible manner, making them usable and accessible to all
      Open educational resources: Free and accessible materials for education and university teaching
      There are various definitions of Open Science, ranging from relatively simple classifications of “data, analysis, publications, and comments” [
      • Foster E.D.
      • Deardorff A.
      Open science framework (OSF).
      ] to somewhat more elaborate frameworks (see Table 1), all the way to complex hierarchical conceptual models [
      • Knoth P.
      • Pontika N.
      Open Science Taxonomy.
      ]. Although these classifications differ in their complexity, they each attempt to cover all aspects of research processes from initiation to communication.

      3. Open Synthesis

      Some of the problems with traditional approaches to evidence synthesis described above (access to data, methods, publications, etc.) can be and indeed are being mitigated by applying these Open Science principles to evidence synthesis; the result has been termed Open Synthesis [
      • Haddaway N.R.
      Open Synthesis: on the need for evidence synthesis to embrace Open Science.
      ]. Open Synthesis was first proposed as the application of Open Access, Open Data, Open Source, and Open Methodology to evidence synthesis, with the possible addition of Open Education. We propose a finer resolution based on more complex taxonomies (e.g., [
      • Knoth P.
      • Pontika N.
      Open Science Taxonomy.
      ]).
      We suggest that such Open Synthesis would support the transfer of knowledge from primary research to decision support tools and evidence portals (e.g., the Teaching and Learning Toolkit), particularly during humanitarian crises; for example, Evidence Aid hosts a freely accessible evidence repository that holds summaries of COVID-19 relevant evidence (https://www.evidenceaid.org/coronavirus-covid-19-evidence-collection/) [
      • Clarke M.
      Evidence Aid – from the Asian tsunami to the Wenchuan earthquake.
      ]. Many Open Synthesis resources have been developed and assembled in an effort to facilitate access to the novel evidence base emerging in relation to the COVID-19 pandemic. These examples are (understandably) almost exclusively related to the field of health, but the evidence base will become increasingly multidisciplinary and cross-sectoral as research focus spreads to include the societal and environmental impacts of the outbreak and subsequent social policies, such as widescale lockdowns. The key components of Open Synthesis are described in Figure 2, and examples are given below.
      Fig. 2Provisional core principles of Open Synthesis. This is the subject of discussion by an international, interdisciplinary Open Synthesis Working Group (https://opensynthesis.github.io) that aims to define and describe pathways toward more Open evidence synthesis.

      3.1 Open collaboration

      The COVID-19 evidence map of emerging literature produced by the Meta-Evidence blog was open to interested collaborators (before the project was discontinued because of considerable overlap with several other projects) and involved substantial efforts to translate and extract information from literature written in Chinese. The synthesizing group under the COVID Evidence Network to support Decision makers (COVID-END; https://www.mcmasterforum.org/networks/covid-end/working-groups/synthesizing) supports efforts to synthesize existing evidence in ways that are more coordinated and efficient and that balance quality and timeliness. Cochrane's COVID Rapid Reviews repository provides space for Open Collaboration by connecting authors interested in addressing the same rapid review questions, which were submitted by the public.

      3.2 Open discovery

      To enable free (i.e., not paywalled) searching for relevant evidence, various efforts are seeking to build "living" bibliographies and databases of research on COVID-19; examples include the CORD19 database (MIT), the COVID-19 living systematic map (EPPI center), Cochrane's COVID-19 Study Register, and the Norwegian Institute of Public Health's live map of COVID-19 evidence. Similarly, the McMaster GRADE Center is collaborating with the Norwegian Institute of Public Health and others to map recommendations relevant to COVID-19 and make them publicly available (including the strength and certainty of supporting evidence) [
      • Schünemann H.J.
      • Santesso N.
      • Vist G.E.
      • Cuello C.
      • Lotfi T.
      • Flottorp S.
      • et al.
      Using GRADE in situations of emergencies and urgencies: certainty in evidence and recommendations matters during the COVID-19 pandemic, now more than ever and no matter what.
      ].

      3.3 Open methods

      Efforts exist to ensure that evidence syntheses use transparent and well-reported methods to improve repeatability and usability. For example, the systematic review registry PROSPERO has provided a link to already registered reviews of human and animal studies relevant to COVID-19.

      3.4 Open data

      Freely accessible data (including those extracted and generated within the process of conducting a systematic review) are being made available for reuse and analysis. From evidence syntheses, the Epistemonikos COVID-19 collection archives data extracted from within reviews in a publicly accessible database (https://www.epistemonikos.cl/all-about-covid-19/).

      3.5 Open source

      Freely useable and adaptable tools for analysis and visualization have been made available online to support the conduct and communication of COVID-19 relevant research, for example, corona-cli (code for analyzing and visualizing data on the outbreak); the EviAtlas tool for mapping the geographical spread of evidence on COVID-19 [
      • Haddaway N.R.
      • Feierman A.
      • Grainger M.J.
      • Gray C.T.
      • Tanriver-Ayder E.
      • Dhaubanjar S.
      • et al.
      EviAtlas: a tool for visualising evidence synthesis databases.
      ].

      3.6 Open code

      Many researchers routinely publish the analytic code that accompanies their research (e.g., R scripts for statistical analyses), although to date this practice is not common in the syntheses we have examined, perhaps because code does not readily exist where reviewers have not made use of code-driven software (e.g., for reviews conducted using RevMan). However, examples of Open Code in primary research include code to web-scrape COVID-19 data from Worldometers and epidemiological modeling code for COVID-19.
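To make the value of shared analytic code concrete, the sketch below shows the kind of script a review team might publish alongside a meta-analysis: a fixed-effect, inverse-variance pooling of study effect estimates. The function name and the data are hypothetical illustrations, not code from any of the projects cited above.

```python
import math

def fixed_effect_pool(effects, standard_errors):
    """Fixed-effect (inverse-variance) meta-analysis: each study is weighted
    by the inverse of its variance. Returns the pooled estimate and its
    standard error. `effects` could be, e.g., per-study log odds ratios."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical example: three studies reporting log odds ratios.
estimate, se = fixed_effect_pool([0.20, 0.35, 0.10], [0.10, 0.15, 0.20])
print(f"pooled log OR = {estimate:.3f} (SE {se:.3f})")
```

Sharing even a short, annotated script like this lets readers verify the pooled estimate, inspect the weighting assumptions, and rerun the analysis as new studies appear, which is exactly the transparency that Open Code asks for.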

      3.7 Open access

      Several publishers and journals have made COVID-19 relevant research articles and evidence syntheses freely accessible, including the Cochrane COVID-19 evidence collection and several Elsevier journals including Journal of Clinical Epidemiology and The Lancet (https://www.elsevier.com/connect/coronavirus-information-center). Systematic reviewers can facilitate Open Access by ensuring their reviews are freely accessible (e.g., by publishing in open access journals or depositing preprints or postprints in publicly accessible repositories) but also by facilitating access to the primary research synthesized in their reviews (e.g., by providing DOIs for the full texts of their included studies).

      3.8 Open peer review

      Although most journals do not currently publish peer review reports and revisions of systematic reviews, some resources exist to support this, including the Outbreak Science Rapid PREReview for prepublication peer review.

      3.9 Open education

      Various freely accessible training resources (e.g., courses, webinars, and handbooks) exist for evidence synthesis methodology, including #ESTraining provided by the Collaboration for Environmental Evidence and Stockholm Environment Institute and webinars provided by the Global Evidence Synthesis Initiative.

      3.10 Open interests

      Systematic reviews have been shown to suffer from poor reporting of funding, role of funders, and conflicts of interest in general [
      • Bou-Karroum L.
      • Hakoum M.B.
      • Hammoud M.Z.
      • Khamis A.M.
      • Al-Gibbawi M.
      • Badour S.
      • et al.
      Reporting of financial and non-financial conflicts of interest in systematic reviews on health policy and systems research: a cross sectional survey.
      ]. Open Interests calls for individuals to transparently declare possible financial and nonfinancial interests. Ideally, such declarations would be made by all parties involved in the conduct and publication of systematic reviews (including educators, engaged stakeholders, review authors, advisory group members, peer reviewers, editors, and publishers) and updated regularly. In practical terms, this could take the form of a declaration at the point of publication (e.g., in review publications, educational materials, or peer review comments) or of a freely accessible central database of interests. At present, no Open Interests initiative exists.

      3.11 Challenges of implementing Open Synthesis and their relation to Open Science criticisms

      Although no criticisms have yet been leveled against Open Synthesis specifically, some researchers have raised concerns about Open Science; we describe some of these in Table 2. These concerns relate either to openness itself as a practice or to the application and enforcement of Open Science within current institutions and incentive structures.
      Table 2. Concerns relating to Open Science and their applicability to and mitigation within Open Synthesis. Each concern is listed with a description, its applicability to Open Synthesis, and potential mitigations.
      Concern: Exacerbation of power imbalance and inequality or exclusion of minorities [
      • Bahlai C.
      • Bartlett L.J.
      • Burgio K.R.
      • Fournier A.M.
      • Keiser C.N.
      • Poisot T.
      • et al.
      Open science isn’t always open to all scientists.
      ]
      Description: Open Science practices applied within the current incentive structures and institutions can exacerbate power imbalance and inequality, particularly adversely affecting minorities and the vulnerable or oppressed.
      Applicability: Highly applicable to evidence syntheses, just as with primary research.
      Mitigation: Open Synthesis principles can be endorsed rather than enforced to avoid penalizing vulnerable researchers who may struggle to be Open. Structures can be put in place to support minorities and vulnerable researchers (e.g., publication fee waivers for low- and middle-income researchers [
      • Lawson S.
      Fee Waivers for Open Access Journals.
      ], mentoring in Open practices).
      Concern: Risk of misuse [
      • Grand A.
      • Wilkinson C.
      • Bultitude K.
      • Winfield A.F.T.
      Open science: a new “trust technology”?.
      ]
      Description: Open Data and Code may be reused or reanalyzed incorrectly, potentially for nefarious reasons.
      Applicability: Although some data in syntheses are in the public domain, some data from unpublished studies, or unpublished outcomes obtained from authors, are not. Furthermore, the calculation of effect sizes may use assumptions that affect the estimates calculated.
      Mitigation: Ensure full methodological transparency to avoid misunderstandings, including annotation of analytic or statistical code and any assumptions. Adequate referencing and easy linkage to the original data source should be provided for clarity.
      Concern: Risk of public misunderstanding (e.g., [
      • Nielsen M.
      Reinventing discovery: the new era of networked science.
      ])
      Description: Detailed language and nuance of data may be misunderstood by lay people, nonspecialists, or those who did not collect the data.
      Applicability: Systematic reviews are typically not intended as a means of communication with the public (plain language summaries serve that purpose instead); the risk is no higher for Open Synthesis than for standard synthesis.
      Mitigation: Synthesis methods must be detailed enough, and follow standard language, to allow full understanding.
      Concern: Potential to be overwhelmed by information [
      • Grand A.
      • Wilkinson C.
      • Bultitude K.
      • Winfield A.F.T.
      Mapping the hinterland: data issues in open science.
      ]
      Description: Publication of large volumes of data or information may make it difficult to find important details within or across studies.
      Applicability: Information is typically more structured across evidence syntheses than across primary research because syntheses use a common methodological framework.
      Mitigation: Standardized reporting templates could be built to support or facilitate metadata formatting so that information is readily found and understood. Reviewers could provide versions with different levels of detail for different audiences (e.g., a plain language summary for the lay public).
      Concern: Fear of repercussions if mistakes are unearthed after publication [
      • Allen C.
      • Mehler D.M.A.
      Open science challenges, benefits and tips in early career and beyond.
      ]
      Description: Authors may fear that they could be subjected to persecution if mistakes are identified in their methods after publication, and so may prefer to keep data and analyses private.
      Applicability: There is potential for error in the identification, selection, appraisal, and analysis of studies included in systematic reviews.
      Mitigation: Reviewers should be incentivized to admit errors and supported when these occur. Institutional punitive measures for publishing corrections or retractions should first examine the reasons behind the action, avoid blanket punishments, and acknowledge authors who act ethically and responsibly, while promoting and rewarding Open behaviors. Open Synthesis should be reframed as an opportunity to validate findings rather than to detect mistakes.
      Concern: Publication of data leads to "research parasitism" [
      • Longo D.L.
      • Drazen J.M.
      Data sharing.
      ]
      Description: Some researchers feel that reuse of data or methods by others is an unfair practice and that authors alone should retain exclusive rights.
      Applicability: Cochrane, the Campbell Collaboration, and the Collaboration for Environmental Evidence allow review teams the right to lead updates to their reviews for a fixed period. Data collected and used in an evidence synthesis are typically already in the public domain anyway.
      Mitigation: Raise awareness of the benefits in legacy and impact of research that result from reuse of data. Ensure those reusing data provide appropriate and full acknowledgment of data sources. Reconsider rules for academic credit, reward, and promotion.
      Concern: Belief that low-quality science will proliferate [
      • Lancaster A.
      Open Science and its Discontents | Ronin Inst.
      ]
      Description: [Referring specifically to Open Peer Review and preprints] Some argue that the lack of traditional peer review for preprints removes the gatekeeping that ensures research validity, and that low-quality research will become common.
      Applicability: Preprints are, in part, a response to a lack of immediate Open Access and closed peer review; they are not an integral part of Open Science but rather an extension of it. Current institutions and incentive structures may not be sufficient to prevent low-quality evidence syntheses from being published, but this is also the case for syntheses that are traditionally peer reviewed.
      Mitigation: Make use of opportunities for Open Peer Review that complement and strengthen preprints (i.e., postpublication peer review [31]). Raise awareness and establish standard communication practices for understanding preprints within the communications community (i.e., journalists and institutional communications officers). Ensure preprints follow standards for conducting and reporting evidence synthesis (e.g., PRISMA and ROSES).
      Concern: Increased resources needed to attain Openness [
      • Grand A.
      • Wilkinson C.
      • Bultitude K.
      • Winfield A.F.T.
      Open science: a new “trust technology”?.
      ,
      • Beagrie N.
      • Lavoie B.
      • Woolard M.
      ]
      Description: Ensuring that data and information are made fully Open may require resources (time and funding) that are not readily available to all.
      Applicability: The large amounts of data potentially produced within a systematic review project could require considerable resources to clean and annotate if not planned from the outset, particularly for analytic code. Open Collaboration could require considerable time to manage if roles and tasks are not carefully predefined.
      Mitigation: Openness can be achieved for the most part by using cost-free alternatives (e.g., self-archiving to avoid publication fees and the use of free data repositories) and by incentivizing and institutionalizing Open and transparent practices from an early career stage (e.g., good code annotation practices). However, this point is not trivial and highlights the need for careful planning across all aspects of Open Synthesis; planning can significantly reduce resource requirements. Standardizing the methods, processes, and tools used to abstract and store data could assist in this process [
      • Akl E.A.
      • Haddaway N.R.
      • Rada G.
      • Lotfi T.
      Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together.
      ]
      Concern: Risk of "platform capitalism" (i.e., commercialization of public data) [
      • Pievatolo M.C.
      Open science: human emancipation or bureaucratic serfdom? SCIRES-it.
      ]
      Description: The free availability of data permits the development of subscription-based or pay-to-use services (e.g., Academia.edu) that aim to provide additional services using public data (e.g., analytics), and platforms that may exploit or disadvantage certain groups of people (e.g., by charging for a service that is otherwise already free elsewhere).
      Applicability: Grassroots and no-cost alternatives to these services are often available, but awareness of free-to-use services is vital to avoid entrapment by commercial enterprises (e.g., paying a publisher to access an article that is already Open Access).
      Mitigation: Noncommercial-use Creative Commons licenses may help restrict or prevent commercial use of Open Data (e.g., CC BY-NC 3.0), but they are not without criticism, for example, that Creative Commons licenses are based on copyright law that is overly restrictive for academic collaborations [
      • Corbett S.
      Creative commons licences, the copyright regime and the online community: is there a fatal disconnect?.
      ].
      Concern: Need to maintain confidentiality [
      • Cummings J.A.
      • Zagrodney J.M.
      • Day T.E.
      Impact of open data policies on consent to participate in human subjects research: discrepancies between participant action and reported concerns.
      ,
      • Walsh C.G.
      • Xia W.
      • Li M.
      • Denny J.C.
      • Harris P.A.
      • Malin B.A.
      Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients’ privacy: current practices and future challenges.
      ]
      Description: Research subjects are typically provided anonymity, which may mean that publication of raw data is not feasible or safe.
      Applicability: Evidence syntheses often make use of summary data not disaggregated at the level of individual participants, and for these reviews this may not be an issue. Individual participant data (IPD) meta-analyses, however, may not be able to publish data openly.
      Mitigation: For IPD meta-analyses, the requirements for Open Data may need to be relaxed or adapted in some contexts to ensure anonymity can be maintained; for example, data-on-request repositories for individual patient data exist [
      • van Middelkoop M.
      • Lohmander S.
      • Bierma-Zeinstra S.M.A.
      Sharing data–taming the beast: barriers to meta-analyses of individual patient data (IPD) and solutions.
      ]. Standardized ethical practices could be established where needed for IPD meta-analysis.
      Concern: Institutional barriers, including career incentives that reward closed practices [
      • Gagliardi D.
      • Cox D.
      • Li Y.
      Institutional inertia and barriers to the adoption of open science.
      ]
      Description: Career incentives in academia typically and historically center on publication in high-impact journals that are prohibitively expensive to publish in Open Access. Recruitment and promotion in academia typically do not reward or acknowledge Open practices, and institutions may not understand or accept the desire to be Open.
      Applicability: Systematic reviewers often work within institutions established around primary research practices, so the same incentives apply. Organizations primarily focused on evidence synthesis may already have Open practices.
      Mitigation: Incentive structures are likely to change over time as Open Science practices become more common, but authorities must take a stand to support researchers who are likely to be disadvantaged by being more Open (e.g., early career researchers).
      In addition, there are risks associated with some of the practices that Open Synthesis may facilitate. For example: 1) living systematic reviews may involve repeated, incremental rerunning of meta-analyses, leading to an increased chance of false positives that must be accounted for (e.g., [
      • Mavergames C.
      • Elliott J.H.
      Living Systematic Reviews: towards real-time evidence for health-care decision-making | BMJ Best Pract.
      ]); 2) updates may need to account for changes in best practice in risk-of-bias assessment as novel methods become available, potentially involving reassessment of studies identified in the original review.
      These are not problems with Open Synthesis but rather important issues that should be addressed when planning incentives and infrastructure in support of Open Syntheses. However, a pathway to Open systematic reviews and systematic maps will involve many steps and a diverse array of different actions; these changes should not be expected overnight, and there is a need for detailed discussion about implications and pitfalls. That said, it is generally accepted that the advantages of Open Science outweigh the disadvantages [
      • LeBel E.P.
      • Campbell L.
      • Loving T.J.
      Benefits of open and high-powered research outweigh costs.
      ].

      3.10.2 Open Synthesis and current systematic review traditions

      At present, some of these Open Synthesis practices are enforced or encouraged by review coordinating bodies. Cochrane reviews can be made immediately Open Access at the point of publication for a fee payable by authors (gold Open Access) or made freely available after a 12-month embargo, otherwise requiring a subscription to access (green Open Access). Cochrane does not yet require systematic review–extracted data to be made public [
      • Shokraneh F.
      • Adams C.E.
      • Clarke M.
      • Amato L.
      • Bastian H.
      • Beller E.
      • et al.
      Why Cochrane should prioritise sharing data.
      ]. While methods in Cochrane reviews are typically well-reported thanks to the Methodological Expectations for Cochrane Intervention Reviews reporting standards [
      • Higgins J.
      • Churchill R.
      • Lasserson T.
      • Chandler J.
      • Tovey D.
      Update from the methodological Expectations of Cochrane Intervention reviews (MECIR) project.
      ], the “raw” data extracted from primary studies within a review are not typically included. All Campbell Collaboration reviews are published in their Open Access journal. Transparent and Open Methods are required by the Methodological Expectations for Campbell Collaboration Intervention Reviews. Open Data and Code are in the vision for the future of the journal [
      • Welch V.A.
      Campbell systematic reviews takes next step to meeting FAIR principles.
      ]. For both organizations, review protocols are published online and time-stamped before work commences, as should be performed with all systematic reviews and maps (e.g., in PROSPERO, Cochrane Database of Systematic Reviews, or published in a suitable journal).
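To make concrete what Open Data and Code could look like for a review, here is a minimal, hypothetical sketch (the file layout, study names, and numbers are invented for illustration): extraction data shared as a plain CSV, with a few lines of code that let anyone reproduce the inverse-variance fixed-effect pooled estimate.

```python
import csv
import io
import math

# Hypothetical extracted-data file, as it might be shared alongside a
# review: one row per study, with the effect estimate and its standard
# error (values are invented for illustration).
EXTRACTED_DATA = """study,effect,se
Study A,0.12,0.10
Study B,0.30,0.15
Study C,-0.05,0.08
"""

def fixed_effect_pool(rows):
    """Inverse-variance fixed-effect pooled estimate and its standard error."""
    weights = [1.0 / float(r["se"]) ** 2 for r in rows]
    effects = [float(r["effect"]) for r in rows]
    total_w = sum(weights)
    pooled = sum(w * y for w, y in zip(weights, effects)) / total_w
    return pooled, math.sqrt(1.0 / total_w)

rows = list(csv.DictReader(io.StringIO(EXTRACTED_DATA)))
pooled, se = fixed_effect_pool(rows)
print(f"Pooled effect: {pooled:.3f} (SE {se:.3f})")
```

Sharing extraction tables in such machine-readable form, together with the analysis code, is what allows a review's syntheses to be verified, reused, and updated by others.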

      3.10.3 Ways forward

      Adopting truly Open evidence synthesis approaches has the potential to globalize research, break down barriers to data sharing and collaboration, and mitigate inequality in knowledge availability (e.g., a large body of Chinese coronavirus trials was recently translated and mapped by researchers from Lanzhou University). Open Synthesis also supports either living systematic reviews or intermittent updates; it is agnostic toward the framework chosen to update reviews. Importantly, it emphasizes the need to facilitate updates, however they may occur.
      Moreover, Open Synthesis of evidence will provide guideline developers with faster and better access to the synthesis methods, findings, conflict of interest information, and other elements necessary for guideline development, and subsequently, improve the quality and efficiency of guideline development.
      Achieving the optimal impact of Open Synthesis requires the consideration of other principles. Of utmost importance is responding to the knowledge needs of decision makers by adopting valid priority-setting approaches. Similarly, Open Synthesis has to feed into knowledge translation tools that are appropriate to the target decision makers. In addition, it should build on emerging concepts, such as Evidence Synthesis 2.0 [
      • Akl E.A.
      • Haddaway N.R.
      • Rada G.
      • Lotfi T.
      Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together.
      ], to ensure the efficiency of the process and appropriateness of the output.
      We encourage adoption of these principles across all disciplines to meet the social, legal, ethical, and economic challenges of the global COVID-19 pandemic, such as supporting home-based education for children out of school; mitigating the social impacts of isolation; responding to the increased risk and severity of domestic violence and to global food insecurity; and addressing the implications of social lockdowns for environmental recovery from long-term anthropogenic disturbance and climate change.
      We call for increasing application of Open Science and Open Synthesis principles across disciplines, both within and beyond the COVID-19 pandemic, to support evidence production, synthesis, and evidence-informed policy. By embracing Open Synthesis, evidence synthesis communities from all disciplines can maximize the efficiency, impact, and legacy of systematic reviews and better support decision-making, particularly in global crises such as the current COVID-19 pandemic, establishing a more resilient and collaborative foundation for responding to similar global challenges.

      CRediT authorship contribution statement

      Neal R. Haddaway: Conceptualization, Data curation. Elie A. Akl: Conceptualization, Writing - original draft, Writing - review & editing. Matthew J. Page: Writing - original draft, Writing - review & editing. Vivian A. Welch: Writing - original draft, Writing - review & editing. Ciara Keenan: Writing - original draft, Writing - review & editing. Tamara Lotfi: Conceptualization, Writing - original draft, Writing - review & editing.

      References

        • World Health Organization
        Coronavirus disease 2019 (COVID-19): situation report, 85.
        World Health Organization, Geneva, 2020
        • McKibbin W.J.
        • Fernando R.
        The global macroeconomic impacts of COVID-19: seven scenarios. SSRN J. 2020.
        (Available at)
        https://www.ssrn.com/abstract=3547729
        Date accessed: April 15, 2020
        • Robbins R.
        STAT’s guide to health care conferences disrupted by the coronavirus crisis. STAT News.
        ([cited 2020 Apr 7]. Available at)
        • Sahin A.R.
        • Erdogan A.
        • Mutlu Agaoglu P.
        • Dineri Y.
        • Cakirci A.Y.
        • Senel M.E.
        • et al.
        2019 novel coronavirus (COVID-19) outbreak: a review of the current literature.
        Eurasian J Med Oncol. 2020; 4: 1-7
        • Salehi S.
        • Abedi A.
        • Balakrishnan S.
        • Gholamrezanezhad A.
        Coronavirus disease 2019 (COVID-19): a systematic review of imaging findings in 919 patients.
        Am J Roentgenol. 2020; : 1-7
        • Haddaway N.R.
        Open Synthesis: on the need for evidence synthesis to embrace Open Science.
        Environ Evid. 2018; 7: 26
        • Gough D.
        • Oliver S.
        • Thomas J.
        An introduction to systematic reviews.
        2nd ed. SAGE Publications, London, 2017: 304
        • Alonso-Coello P.
        • Schünemann H.J.
        • Moberg J.
        • Brignardello-Petersen R.
        • Akl E.A.
        • Davoli M.
        • et al.
        GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction.
        BMJ. 2016; 353: i2016
        • Livoreil B.
        • Glanville J.
        • Haddaway N.R.
        • Bayliss H.
        • Bethel A.
        • Lachapelle F.F.
        • et al.
        Systematic searching for environmental evidence using multiple tools and sources.
        Environ Evid. 2017; 6: 23
        • Chawla A.
        • Twycross-Lewis R.
        • Maffulli N.
        Microfracture produces inferior outcomes to other cartilage repair techniques in chondral injuries in the paediatric knee.
        Br Med Bull. 2015; 116: 93-103
        • Piwowar H.
        • Priem J.
        • Larivière V.
        • Alperin J.P.
        • Matthias L.
        • Norlander B.
        • et al.
        The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.
        PeerJ. 2018; 6: e4375
        • Glasziou P.
        • Altman D.G.
        • Bossuyt P.
        • Boutron I.
        • Clarke M.
        • Julious S.
        • et al.
        Reducing waste from incomplete or unusable reports of biomedical research.
        Lancet. 2014; 383: 267-276
        • Moynihan R.
        • Bero L.
        • Hill S.
        • Johansson M.
        • Lexchin J.
        • Macdonald H.
        • et al.
        Pathways to independence: towards producing and using trustworthy evidence.
        BMJ. 2019; 367: l6576
        • Chiang I.C.A.
        • Jhangiani R.S.
        • Price P.C.
        From the “replicability crisis” to open science practices. Research Methods in Psychology. BCcampus.
        (Available at:)
        • Farrow R.
        Open education and critical pedagogy.
        Learn Media Technology. 2017; 42: 130-146
        • Fecher B.
        • Friesike S.
        Open science: one term, five schools of thought.
        in: Bartling S., Friesike S. (Eds.), Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing. Springer International Publishing, Cham, 2014: 17-47
        • Gewin V.
        Data sharing: an open mind on open data.
        Nature. 2016; 529: 117-119
        • Foster E.D.
        • Deardorff A.
        Open science framework (OSF).
        J Med Libr Assoc. 2017; 105: 203-206
        • Knoth P.
        • Pontika N.
        Open Science Taxonomy.
        figshare, 2015. https://doi.org/10.6084/m9.figshare.1508606.v3
        • Clarke M.
        Evidence Aid – from the Asian tsunami to the Wenchuan earthquake.
        J Evid Based Med. 2008; 1: 9-11
        • Schünemann H.J.
        • Santesso N.
        • Vist G.E.
        • Cuello C.
        • Lotfi T.
        • Flottorp S.
        • et al.
        Using GRADE in situations of emergencies and urgencies: certainty in evidence and recommendations matters during the COVID-19 pandemic, now more than ever and no matter what.
        J Clin Epidemiol. 2020
        • Haddaway N.R.
        • Feierman A.
        • Grainger M.J.
        • Gray C.T.
        • Tanriver-Ayder E.
        • Dhaubanjar S.
        • et al.
        EviAtlas: a tool for visualising evidence synthesis databases.
        Environ Evid. 2019; 8: 22
        • Bou-Karroum L.
        • Hakoum M.B.
        • Hammoud M.Z.
        • Khamis A.M.
        • Al-Gibbawi M.
        • Badour S.
        • et al.
        Reporting of financial and non-financial conflicts of interest in systematic reviews on health policy and systems research: a cross sectional survey.
        Int J Health Policy Manag. 2018; 7: 711-717
        • Bahlai C.
        • Bartlett L.J.
        • Burgio K.R.
        • Fournier A.M.
        • Keiser C.N.
        • Poisot T.
        • et al.
        Open science isn’t always open to all scientists.
        Am Sci. 2019; 107: 78-82
        • Lawson S.
        Fee Waivers for Open Access Journals.
        Publications. 2015; 3: 155-167
        • Grand A.
        • Wilkinson C.
        • Bultitude K.
        • Winfield A.F.T.
        Open science: a new “trust technology”?.
        Sci Commun. 2012; 34: 679-689
        • Nielsen M.
        Reinventing discovery: the new era of networked science.
        Princeton University Press, New Jersey, 2020: 204
        • Grand A.
        • Wilkinson C.
        • Bultitude K.
        • Winfield A.F.T.
        Mapping the hinterland: data issues in open science.
        Public Underst Sci. 2016; 25: 88-103
        • Allen C.
        • Mehler D.M.A.
        Open science challenges, benefits and tips in early career and beyond.
        PLoS Biol. 2019; 17: e3000246
        • Longo D.L.
        • Drazen J.M.
        Data sharing.
        N Engl J Med. 2016; 374: 276-277
        • Lancaster A.
        Open Science and its Discontents. Ronin Institute.
        (Available at)
        • Beagrie N.
        • Lavoie B.
        • Woolard M.
        Keeping research data safe (Phase 2). Jisc.
        (Available at)
        • Akl E.A.
        • Haddaway N.R.
        • Rada G.
        • Lotfi T.
        Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together.
        J Clin Epidemiol. 2020
        • Pievatolo M.C.
        Open science: human emancipation or bureaucratic serfdom? SCIRES-it.
        (Available at)
        https://archiviomarini.sp.unipi.it/858/
        Date accessed: June 1, 2020
        • Corbett S.
        Creative commons licences, the copyright regime and the online community: is there a fatal disconnect?.
        Mod Law Rev. 2011; 74: 503-531
        • Cummings J.A.
        • Zagrodney J.M.
        • Day T.E.
        Impact of open data policies on consent to participate in human subjects research: discrepancies between participant action and reported concerns.
        PLoS One. 2015; 10: e0125208
        • Walsh C.G.
        • Xia W.
        • Li M.
        • Denny J.C.
        • Harris P.A.
        • Malin B.A.
        Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients’ privacy: current practices and future challenges.
        Adv Methods Practices Psychol Sci. 2018; 1: 104-114
        • van Middelkoop M.
        • Lohmander S.
        • Bierma-Zeinstra S.M.A.
        Sharing data–taming the beast: barriers to meta-analyses of individual patient data (IPD) and solutions.
        (Available at)
        • Gagliardi D.
        • Cox D.
        • Li Y.
        Institutional inertia and barriers to the adoption of open science.
        in: The transformation of university institutional and organizational boundaries. Brill Sense, Leiden, The Netherlands, 2015: 107-133
        • Mavergames C.
        • Elliott J.H.
        Living Systematic Reviews: towards real-time evidence for health-care decision-making. BMJ Best Practice.
        (Available at)
        • LeBel E.P.
        • Campbell L.
        • Loving T.J.
        Benefits of open and high-powered research outweigh costs.
        J Pers Soc Psychol. 2017; 113: 230
        • Shokraneh F.
        • Adams C.E.
        • Clarke M.
        • Amato L.
        • Bastian H.
        • Beller E.
        • et al.
        Why Cochrane should prioritise sharing data.
        BMJ. 2018; 362: k3229
        • Higgins J.
        • Churchill R.
        • Lasserson T.
        • Chandler J.
        • Tovey D.
        Update from the methodological Expectations of Cochrane Intervention reviews (MECIR) project.
        in: Cochrane Methods. Cochrane, 2012. (Accessed April 22, 2020)
        • Welch V.A.
        Campbell systematic reviews takes next step to meeting FAIR principles.
        Campbell Syst Rev. 2019; 15: e1032