GRADE Series | Volume 129, P138-150, January 2021


GRADE Guidelines 30: the GRADE approach to assessing the certainty of modeled evidence—An overview in the context of health decision-making

Published: September 24, 2020 | DOI: https://doi.org/10.1016/j.jclinepi.2020.09.018

      Abstract

      Objectives

      The objective of the study is to present the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) conceptual approach to the assessment of certainty of evidence from modeling studies (i.e., certainty associated with model outputs).

      Study Design and Setting

      Expert consultations and an international multidisciplinary workshop informed development of a conceptual approach to assessing the certainty of evidence from models within the context of systematic reviews, health technology assessments, and health care decisions. The discussions also clarified selected concepts and terminology used in the GRADE approach and by the modeling community. Feedback from experts in a broad range of modeling and health care disciplines addressed the content validity of the approach.

      Results

      Workshop participants agreed that the domains determining the certainty of evidence previously identified in the GRADE approach (risk of bias, indirectness, inconsistency, imprecision, reporting bias, magnitude of an effect, dose–response relation, and the direction of residual confounding) also apply when assessing the certainty of evidence from models. The assessment depends on the nature of the model inputs and of the model itself, and on whether one is evaluating evidence from a single model or from multiple models. We propose a framework for selecting the best available evidence from models: 1) developing, de novo, a model specific to the situation of interest; 2) identifying an existing model, the outputs of which provide the highest certainty evidence for the situation of interest, either “off-the-shelf” or after adaptation; and 3) using outputs from multiple models. We also present a summary of preferred terminology to facilitate communication among modeling and health care disciplines.
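      As a rough illustration of the rating logic summarized above, the sketch below encodes the GRADE certainty levels and the domains named in this abstract in Python. The start-high, rate-down/rate-up tally, the function name rate_certainty, and the domain keys are illustrative assumptions only, not the GRADE Working Group's formal procedure; the domain-specific guidance for modeled evidence is described in the Conclusion as still under development.

```python
from enum import Enum

class Certainty(Enum):
    """GRADE certainty levels, ordered so arithmetic on .value is meaningful."""
    VERY_LOW = 1
    LOW = 2
    MODERATE = 3
    HIGH = 4

# Domains named in the abstract that can lower certainty of model outputs
RATE_DOWN_DOMAINS = {
    "risk_of_bias", "indirectness", "inconsistency",
    "imprecision", "reporting_bias",
}

# Domains named in the abstract that can raise certainty
RATE_UP_DOMAINS = {
    "magnitude_of_effect", "dose_response_relation",
    "direction_of_residual_confounding",
}

def rate_certainty(start: Certainty, judgments: dict) -> Certainty:
    """Illustrative tally only (an assumption, not the formal GRADE method):
    move down one level per level of concern in a rate-down domain, up one
    level per rate-up domain judged to apply, and clamp to the defined range."""
    score = start.value
    for domain, levels in judgments.items():
        if domain in RATE_DOWN_DOMAINS:
            score -= levels
        elif domain in RATE_UP_DOMAINS:
            score += levels
    return Certainty(max(Certainty.VERY_LOW.value,
                         min(Certainty.HIGH.value, score)))

# Example: model outputs starting at high certainty, rated down one level each
# for serious indirectness and serious imprecision -> Certainty.LOW
print(rate_certainty(Certainty.HIGH, {"indirectness": 1, "imprecision": 1}))
```

      In practice, each judgment would rest on the domain-specific criteria for model inputs and model structure that the paper develops; the dictionary of integer moves is only a stand-in for those structured assessments.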

      Conclusion

      This conceptual GRADE approach provides a framework for using evidence from models in health decision-making and the assessment of certainty of evidence from a model or models. The GRADE Working Group and the modeling community are currently developing the detailed methods and related guidance for assessing specific domains determining the certainty of evidence from models across health care–related disciplines (e.g., therapeutic decision-making, toxicology, environmental health, and health economics).

