Many authors have used reporting checklists as assessment tools to analyze the reporting quality of diverse types of evidence. We aimed to analyze the methodological approaches used by researchers when assessing the reporting quality of randomized controlled trials (RCTs), systematic reviews, and observational studies.
Study design and setting
We analyzed articles, published up to 18 July 2021, that assessed the reporting quality of evidence using the PRISMA, CONSORT, or STROBE checklists, and examined the methods they used for that assessment.
Results

Among 356 analyzed articles, 293 (88%) investigated a specific thematic field. The CONSORT checklist (N=225; 67%) was used most often, whether in its original, modified, or partial form, or as an extension. Numerical scores for adherence to checklist items were given in 252 articles (75%), of which 36 (11%) used various reporting quality thresholds. Predictors of adherence to a reporting checklist were analyzed in 158 articles (47%); the most frequently studied factor associated with adherence was the year of article publication (N=82; 52%).
Conclusion

The methodology used for assessing the reporting quality of evidence varied considerably. The research community needs a consensus on a consistent methodology for assessing reporting quality.
Received: November 2, 2022
Received in revised form: February 14, 2023
Accepted: March 7, 2023
Publication stage: In Press, Journal Pre-Proof
© 2023 Elsevier Inc. All rights reserved.