Abstract
Objectives
Study Design and Setting
Results
Conclusions
Keywords
Key findings
- In this study, we compared adherence to the Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) statement for studies developing or validating coronavirus disease 2019 (COVID-19) prediction models with a preprint version available, before and after peer review.
- The findings of this report demonstrate poor reporting quality among preprint versions of COVID-19 prediction modeling studies, which did not improve much following peer review.
- Most TRIPOD items saw no change in the frequency of their reporting; only the coverage of discussion items was substantially improved in the published versions.

What this adds to what was known?
- While adherence to reporting guidelines for prediction modeling studies has previously been reported to be poor, our findings suggest that the peer review process also had little impact on this adherence during the pandemic.

What is the implication and what should change now?
- The implication of these findings is that greater focus is needed on the importance of adherence to reporting guidelines by authors, as well as on checking adherence during the editorial process.
1. Introduction
2. Methods
2.1 Inclusion criteria
2.2 Data extraction and calculating adherence to TRIPOD
2.3 Outcome measures and statistical analysis
3. Results

Variable | Median (lower quartile–upper quartile) / [minimum to maximum]
---|---
Preprint version percentage adherence | 33 (30–50) / [10 to 68]
Published version percentage adherence | 42 (31–57) / [10 to 71]
Change in percentage adherence | 3 (0–7) / [0 to 14]
Journal impact factor | 4 (4–7) / [3 to 40]
Number of days between first journal submission and acceptance | 80 (40–187) / [22 to 259]
Number of preprint versions prior to journal submission | 1 (1–1) / [1 to 3]
Days between first manuscript submission and start of pandemic | 67 (35–141) / [12 to 202]

Variable | n (%)
---|---
Statement relating to TRIPOD on journal website: Yes | 6 (32)
Statement relating to TRIPOD on journal website: No | 13 (68)
Preprint server: arXiv | 2 (11)
Preprint server: medRxiv | 16 (84)
Preprint server: bioRxiv | 1 (5)
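The adherence percentages summarized above are calculated per article as the proportion of applicable TRIPOD items that are reported, following the scoring convention of excluding not-applicable items from the denominator. A minimal sketch of that calculation (the item names and reported/not-reported flags below are illustrative, not taken from the study's data):

```python
# Illustrative sketch: percentage adherence to TRIPOD for one article.
# Item names and reported/applicable flags below are hypothetical.
def percentage_adherence(items):
    """items: dict mapping a TRIPOD item to True (reported),
    False (not reported), or None (not applicable to this study)."""
    applicable = {k: v for k, v in items.items() if v is not None}
    if not applicable:
        return 0.0
    # Score = reported applicable items / all applicable items.
    return 100 * sum(applicable.values()) / len(applicable)

preprint = {"title": True, "abstract": False, "sample_size": False,
            "missing_data": None, "discussion_limitations": False}
published = {"title": True, "abstract": True, "sample_size": False,
             "missing_data": None, "discussion_limitations": True}

print(round(percentage_adherence(preprint)))   # 25
print(round(percentage_adherence(published)))  # 75
```

Because not-applicable items are dropped from the denominator, two articles can have different denominators even when scored against the same checklist.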
3.1 Changes in completeness of reporting following peer review


Model | Covariate | Coefficient (95% CI)
---|---|---
Model 1 | Preprint score | 1.01 (0.87 to 1.16)
 | Intercept | 0.03 (−0.03 to 0.10)
Model 2 | Journal impact factor | 0.00 (−0.003 to 0.003)
 | Preprint score | 1.01 (0.83 to 1.19)
 | Intercept | 0.04 (−0.03 to 0.11)
Model 3 | Time between journal submission and acceptance (per 28-day period) | −0.005 (−0.01 to 0.004)
 | Preprint score | 1.01 (0.86 to 1.15)
 | Intercept | 0.06 (−0.02 to 0.13)
Model 4 | Number of preprints uploaded prior to journal submission | −0.03 (−0.07 to 0.02)
 | Preprint score | 1.03 (0.88 to 1.18)
 | Intercept | 0.06 (−0.01 to 0.13)
Model 5 | Days between first manuscript submission and start of pandemic (per 28-day period) | −0.008 (−0.02 to 0.003)
 | Preprint score | 1.06 (0.90 to 1.21)
 | Intercept | 0.04 (−0.02 to 0.10)
Model 6 | TRIPOD statement in journal's instructions to authors: No | (reference group)
 | TRIPOD statement in journal's instructions to authors: Yes | −0.002 (−0.05 to 0.05)
 | Preprint score | 1.01 (0.86 to 1.17)
 | Intercept | 0.03 (−0.03 to 0.10)
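Each model block in the table corresponds to a linear regression of the published-version adherence score on the preprint-version score plus one journal-level covariate. A minimal sketch of fitting such a model by ordinary least squares (the data here are synthetic and for illustration only; they are not the study's data):

```python
import numpy as np

# Synthetic example of one of the tabulated models (Model 2-style):
# published adherence ~ intercept + journal impact factor + preprint score.
rng = np.random.default_rng(0)
n = 19                                  # number of preprint/published pairs
preprint = rng.uniform(0.10, 0.68, n)   # preprint adherence (proportion)
impact = rng.uniform(3, 40, n)          # journal impact factor
# Generate published scores that track preprint scores closely,
# mimicking a slope near 1 and an intercept near 0.
published = 0.03 + 1.0 * preprint + rng.normal(0, 0.03, n)

# Design matrix: intercept column, impact factor, preprint score.
X = np.column_stack([np.ones(n), impact, preprint])
coef, *_ = np.linalg.lstsq(X, published, rcond=None)
intercept, b_impact, b_preprint = coef
print(intercept, b_impact, b_preprint)
```

Under this setup, a preprint-score coefficient near 1 with a covariate coefficient near 0 reproduces the pattern in the table: published adherence is essentially the preprint adherence, with the journal-level covariate adding little.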
3.2 Reporting of individual TRIPOD items

3.3 Assessment of open peer review
4. Discussion
4.1 Comparison to other studies
4.2 Strengths and limitations of this study
4.3 Implications for practice and areas for future research
5. Conclusions
Acknowledgments
Appendix A. Supplementary Data
- Supplementary Material
References
Article info
Publication history
Footnotes
Ethics approval and consent to participate: Not Applicable.
Consent for publication: Not Applicable.
Availability of data and materials: The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.
Conflict of interests: The authors confirm that they have no competing interests.
Funding: MTH is supported by St. George's, University of London. LA, RDR, and GSC were supported by funding from the MRC Better Methods Better Research panel (grant reference: MR/V038168/1). GSC was also supported by Cancer Research UK (programme grant: C49297/A27294). The funders had no role in considering the study design or in the collection, analysis, interpretation of data, writing of the report, or decision to submit the article for publication.
Author Contributions: Conceptualization–MTH, MvS, GSC, EWS, JBR, RDR, BVC, LW. Data Curation–MTH, LA, CW, LW. Formal Analysis–MTH, LA, LW. Investigation–MTH, LA, LW. Methodology–MTH, LA, MvS, GSC, EWS, JBR, RDR, BVC, LW. Project Administration–MTH. Writing–Original Draft–MTH, LA, LW. Writing–Review & Editing–All Authors.
Identification
Copyright
User license
Creative Commons Attribution (CC BY 4.0)