Highlights
- Open Science principles are vital for ensuring reproducibility, trust, and legacy.
- Evidence synthesis is a vital means of summarizing research for decision-making.
- Open Synthesis is the application of Open Science principles to evidence synthesis.
- Open approaches to planning, conducting, and reporting synthesis have many benefits.
- We call on the evidence synthesis community to embrace Open Synthesis.
Keywords

1. Evidence synthesis
2. Open Science
Concept | Definition
---|---
Open data | Freely available research data
Open source | Use and production of freely accessible software and hardware
Open methodology | Documentation of the methods of a research process as fully as possible
Open peer review | Transparent and traceable quality assurance through open peer review
Open access | Publication of research articles in a manner that makes them usable and accessible to all
Open educational resources | Free and accessible materials for education and university teaching
3. Open Synthesis

3.1 Open collaboration
3.2 Open discovery
3.3 Open methods
3.4 Open data
3.5 Open source
3.6 Open code
3.7 Open access
3.8 Open peer review
3.9 Open education
3.10 Open interests
3.10.1 Challenges of implementing Open Synthesis and their relation to Open Science criticisms
Concern relating to Open Science | Description of the concern | Applicability to Open Synthesis | Potential mitigations for Open Synthesis |
---|---|---|---|
Exacerbation of power imbalance and inequality or exclusion of minorities [24] | Open Science practices applied within the current incentive structures and institutions can exacerbate power imbalance and inequality, particularly adversely affecting minorities and the vulnerable or oppressed | Highly applicable to evidence syntheses, just as with primary research | Open Synthesis principles can be endorsed rather than enforced to avoid penalizing vulnerable researchers who may struggle to be Open. Structures can be put in place to support minorities and vulnerable researchers (e.g., publication fee waivers for researchers from low- and middle-income countries [25], mentoring in Open practices) |
Risk of misuse [26] | Open Data and Code may be reused or reanalyzed incorrectly, potentially for nefarious reasons | Although some data in syntheses are in the public domain, some data from unpublished studies or unpublished outcomes obtained from authors are not available in the public domain. Furthermore, the calculation of effect sizes may use assumptions that affect the estimates calculated | Ensure full methodological transparency to avoid misunderstandings, including annotation of analytic or statistical code and any assumptions. Adequate referencing and easy linkage to the original data source should be provided for clarity |
Risk of public misunderstanding (e.g., [27]) | Detailed language and nuance of data may be misunderstood by lay people, nonspecialists, or those who did not collect the data | Systematic reviews are typically not intended to be a means of communication with the public (plain language summaries serve that purpose instead). The risk is no higher for Open Synthesis than for standard synthesis | Synthesis methods must be detailed enough and follow standard language to allow full understanding |
Potential to be overwhelmed by information [28] | Publication of large volumes of data or information may make it difficult to find important details within or across studies | Information is typically more structured across evidence syntheses than primary research because they use a common methodological framework | Standardized reporting templates could be built to support or facilitate metadata formatting so that information is readily found and understood. Reviewers could provide different versions with different levels of detail for different audiences (e.g., a plain language summary for the lay public) |
Fear of repercussions if mistakes are unearthed after publication [29] | Authors may fear that they could be subjected to persecution if mistakes are identified in their methods after publication and so may prefer to keep data and analyses private | There is potential for error in the identification, selection, appraisal, and analysis of studies included in systematic reviews | Reviewers should be incentivized to admit errors and supported when these occur. Institutional punitive measures for publishing corrections or retractions should first examine the reasons behind the action, avoiding blanket punishments, and should acknowledge authors who act ethically and responsibly, while promoting and rewarding Open behaviors. Open Synthesis should be reframed as an opportunity to validate findings rather than to detect mistakes |
Publication of data leads to “research parasitism” [30] | Some researchers feel that reuse of data or methods by others is an unfair practice and that authors alone should retain exclusive rights | Cochrane, the Campbell Collaboration, and the Collaboration for Environmental Evidence allow review teams the right to lead updates to their reviews for a fixed period. Data collected and used in an evidence synthesis are typically already in the public domain | Raise awareness of the benefits in legacy and impact of research resulting from reuse of data. Ensure those reusing data provide appropriate and full acknowledgment of data sources. Reconsider rules for academic credit, reward, and promotion |
Belief that low quality science will proliferate [31] | [Specifically referring to Open Peer Review and preprints] Some argue that a lack of traditional peer review for preprints removes the gatekeeping that ensures research validity, and low-quality research will become common | Preprints are, in part, a response to a lack of immediate Open Access and closed peer review. They are not an integral part of Open Science but rather an extension of it. Current institutions and incentive structures may not be sufficient to prevent low quality evidence syntheses from being published, but this is also the case for those that are traditionally peer reviewed | Make use of opportunities for Open Peer Review that complement and strengthen preprints (i.e., postpublication peer review [31]). Raise awareness and establish standard communication practices for understanding preprints within the communications community (i.e., journalists and institutional communications officers). Ensure preprints follow standards for conducting and reporting evidence synthesis (e.g., PRISMA and ROSES) |
Increased resources needed to attain Openness [26,32] | Ensuring that data and information are made fully Open may require resources (time and funding) that are not readily available to all | The large amounts of data potentially produced within a systematic review project could require considerable resources to clean and annotate if not planned from the outset, particularly for analytic code. Open Collaboration could require considerable time to manage if roles and tasks are not carefully predefined | Openness can be achieved for the most part by using cost-free alternatives (e.g., self-archiving to avoid publication fees and the use of free data repositories) and by incentivizing and institutionalizing Open and transparent practices from an early career stage (e.g., good code annotation practices). However, this point is not trivial and highlights the need for careful planning across all aspects of Open Synthesis; planning can significantly reduce resource requirements. Standardizing the methods, processes, and tools used to abstract and store data could assist in this process [33] |
Risk of “platform capitalism” (i.e., commercialization of public data) [34] | The free availability of data permits the development of subscription-based/pay-to-use services (e.g., Academia.edu) that aim to provide additional services using public data (e.g., analytics) and platforms that may exploit or disadvantage certain groups of people (e.g., by charging for a service that is otherwise already free elsewhere) | Grass-roots and no-cost alternatives to these services are often available, but awareness of free-to-use services is vital to avoid entrapment by commercial enterprises (e.g., paying a publisher to access an article that is already Open Access) | Noncommercial-use Creative Commons licenses may help restrict or prevent commercial use of Open Data (e.g., CC BY-NC 3.0), but they are not without criticism, for example, that Creative Commons licenses are based on copyright law that is overly restrictive to academic collaborations [35] |
Need to maintain confidentiality [36,37] | Research subjects are typically provided anonymity that may mean publication of raw data is not feasible or safe | Evidence syntheses often make use of summary data not disaggregated at the level of individual participants, and for these reviews this may not be an issue. Individual participant data (IPD) meta-analyses, however, may not be able to publish data openly | For IPD meta-analyses, the requirements for Open Data may need to be relaxed or adapted in some contexts to ensure anonymity can be maintained. For example, data-on-request repositories for individual patient data exist [38]. Standardized ethical practices could be established where needed for IPD meta-analysis |
Institutional barriers, including career incentives that reward closed practices [39] | Career incentives in academia typically and historically center around publication in high-impact journals in which Open Access publication is prohibitively expensive. Recruitment and promotion in academia also typically do not reward or acknowledge Open practices. Institutions may not understand or accept the desire to be Open | Systematic reviewers often work within institutions established around primary research practices, so the same incentives apply. Organizations primarily focusing on evidence synthesis may already have Open practices | Incentive structures are likely to change over time as Open Science practices become more common, but authorities must take a stand to support researchers who are likely to be disadvantaged by being more Open (e.g., early career researchers) |
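Several mitigations in the table above call for annotated analytic code with assumptions stated explicitly (see also Sections 3.3 Open methods and 3.6 Open code). As an illustrative sketch only (not code from the article), the function below shows what a transparently documented effect-size calculation for a meta-analysis might look like; the pooled-variance assumption and the small-sample correction are spelled out in comments so that anyone reusing the code can judge them.

```python
import math


def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) between two groups.

    Documented assumption: the two groups are treated as having comparable
    variances, so a pooled standard deviation is used. Reusers should check
    this holds for their data before applying the function.
    """
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    )
    d = (m1 - m2) / pooled_sd  # Cohen's d (biased upward in small samples)
    # Hedges' small-sample correction factor
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return d * j
```

Publishing such a function alongside the extracted data lets others reproduce every effect size in a synthesis and see exactly which analytic choices drove the estimates, directly addressing the "risk of misuse" concern above.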
3.10.2 Open Synthesis and current systematic review traditions
3.10.3 Ways forward
CRediT authorship contribution statement
Supplementary data
- Data Profile
References
- Coronavirus disease 2019 (COVID-19): situation report, 85. World Health Organisation, Geneva; 2020
- The global macroeconomic impacts of COVID-19: seven scenarios. SSRN J. 2020 (cited April 15, 2020)
- STAT’s guide to health care conferences disrupted by the coronavirus crisis. STAT News. 2020. Available at: https://www.statnews.com/2020/03/07/stats-guide-health-care-conferences-disrupted-covid-19/. Accessed April 7, 2020
- 2019 novel coronavirus (COVID-19) outbreak: a review of the current literature.Eurasian J Med Oncol. 2020; 4: 1-7
- Coronavirus disease 2019 (COVID-19): a systematic review of imaging findings in 919 patients. Am J Roentgenol. 2020: 1-7
- Open Synthesis: on the need for evidence synthesis to embrace Open Science.Environ Evid. 2018; 7: 26
- An introduction to systematic reviews. 2nd ed. SAGE Publications, London; 2017: 304
- GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction.BMJ. 2016; 353: i2016
- Systematic searching for environmental evidence using multiple tools and sources.Environ Evid. 2017; 6: 23
- Microfracture produces inferior outcomes to other cartilage repair techniques in chondral injuries in the paediatric knee.Br Med Bull. 2015; 116: 93-103
- The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.PeerJ. 2018; 6: e4375
- Reducing waste from incomplete or unusable reports of biomedical research.Lancet. 2014; 383: 267-276
- Pathways to independence: towards producing and using trustworthy evidence.BMJ. 2019; 367: l6576
- From the “replicability crisis” to open science practices. Research Methods in Psychology. BCcampus. 2015. Available at: https://opentextbc.ca/researchmethods/chapter/from-the-replicability-crisis-to-open-science-practices/. Accessed April 22, 2020
- Open education and critical pedagogy.Learn Media Technology. 2017; 42: 130-146
- Open science: one term, five schools of thought. in: Bartling S., Friesike S. (Eds.), Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing. Springer International Publishing, Cham; 2014: 17-47
- Data sharing: an open mind on open data.Nature. 2016; 529: 117-119
- Open science framework (OSF).J Med Libr Assoc. 2017; 105: 203-206
- Open Science Taxonomy. figshare; 2015. https://doi.org/10.6084/m9.figshare.1508606.v3
- Evidence Aid – from the Asian tsunami to the Wenchuan earthquake.J Evid Based Med. 2008; 1: 9-11
- Using GRADE in situations of emergencies and urgencies: certainty in evidence and recommendations matters during the COVID-19 pandemic, now more than ever and no matter what. J Clin Epidemiol. 2020
- EviAtlas: a tool for visualising evidence synthesis databases.Environ Evid. 2019; 8: 22
- Reporting of financial and non-financial conflicts of interest in systematic reviews on health policy and systems research: a cross sectional survey.Int J Health Policy Manag. 2018; 7: 711-717
- Open science isn’t always open to all scientists.Am Sci. 2019; 107: 78-82
- Fee Waivers for Open Access Journals.Publications. 2015; 3: 155-167
- Open science: a new “trust technology”?.Sci Commun. 2012; 34: 679-689
- Reinventing discovery: the new era of networked science. Princeton University Press, New Jersey; 2020: 204
- Mapping the hinterland: data issues in open science.Public Underst Sci. 2016; 25: 88-103
- Open science challenges, benefits and tips in early career and beyond.PLoS Biol. 2019; 17: e3000246
- Data sharing.N Engl J Med. 2016; 374: 276-277
- Open Science and its Discontents. Ronin Inst. Available at: http://ronininstitute.org/open-science-and-its-discontents/1383/. Accessed May 28, 2020
- Keeping research data safe (Phase 2). Jisc. 2010. Available at: https://www.webarchive.org.uk/wayback/archive/20140613220103mp_/http://www.jisc.ac.uk/publications/reports/2010/keepingresearchdatasafe2.aspx. Accessed June 7, 2020
- Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together. J Clin Epidemiol. 2020
- Open science: human emancipation or bureaucratic serfdom? SCIRES-it. Available at: https://archiviomarini.sp.unipi.it/858/. Accessed June 1, 2020
- Creative commons licences, the copyright regime and the online community: is there a fatal disconnect?.Mod Law Rev. 2011; 74: 503-531
- Impact of open data policies on consent to participate in human subjects research: discrepancies between participant action and reported concerns.PLoS One. 2015; 10: e0125208
- Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients’ privacy: current practices and future challenges.Adv Methods Practices Psychol Sci. 2018; 1: 104-114
- Sharing data–taming the beast: barriers to meta-analyses of individual patient data (IPD) and solutions. 2020. Available at: https://bjsm.bmj.com/content/early/2020/01/29/bjsports-2019-101892. Accessed June 5, 2020
- Institutional inertia and barriers to the adoption of open science. in: The transformation of university institutional and organizational boundaries. Brill Sense, Leiden, The Netherlands; 2015: 107-133
- Living Systematic Reviews: towards real-time evidence for health-care decision-making. BMJ Best Pract. 2020. Available at: https://bestpractice.bmj.com/info/toolkit/discuss-ebm/living-systematic-reviews-towards-real-time-evidence-for-health-care-decision-making/. Accessed June 5, 2020
- Benefits of open and high-powered research outweigh costs.J Pers Soc Psychol. 2017; 113: 230
- Why Cochrane should prioritise sharing data.BMJ. 2018; 362: k3229
- Update from the Methodological Expectations of Cochrane Intervention Reviews (MECIR) project. in: Cochrane Methods. Cochrane; 2012. Accessed April 22, 2020
- Campbell systematic reviews takes next step to meeting FAIR principles.Campbell Syst Rev. 2019; 15: e1032
Article info
Publication history
Footnotes
Declarations of interest: NRH and TL are the coordinators of the Open Synthesis Working Group, a voluntary collaboration of stakeholders interested in the application of Open Science principles in evidence synthesis conduct and publication.
Funding: This work was produced in part as a result of funding from FORTE, the Swedish Research Council for Health, Working Life, and Welfare (2018-01619).
Authors’ contributions: NRH and TL developed the concept for the manuscript. NRH drafted the manuscript. All authors read and approved the manuscript prior to submission.