PRESS Peer Review of Electronic Search Strategies: 2015 Guideline Statement

Open Access. Published: March 19, 2016. DOI: https://doi.org/10.1016/j.jclinepi.2016.01.021

      Abstract

      Objective

      To develop an evidence-based guideline for Peer Review of Electronic Search Strategies (PRESS) for systematic reviews (SRs), health technology assessments, and other evidence syntheses.

      Study Design and Setting

      An SR, Web-based survey of experts, and consensus development forum were undertaken to identify checklists that evaluated or validated electronic literature search strategies and to determine which of their elements related to search quality or errors.

      Results

Systematic review: No new search elements were identified for addition to the existing (2008–2010) PRESS Evidence-Based Checklist, and there was no evidence refuting any of its elements. Results suggested that structured PRESS could identify search errors and improve the selection of search terms. Web-based survey of experts: Most respondents felt that peer review should be undertaken after the MEDLINE search had been prepared but before it had been translated to other databases. Consensus development forum: Of the seven original PRESS elements, six were retained: translation of the research question; Boolean and proximity operators; subject headings; text word search; spelling, syntax and line numbers; and limits and filters. The seventh (skilled translation of the search strategy to additional databases) was removed, as there was consensus that this should be left to the discretion of searchers. An updated PRESS 2015 Guideline Statement was developed, which includes the following four documents: PRESS 2015 Evidence-Based Checklist, PRESS 2015 Recommendations for Librarian Practice, PRESS 2015 Implementation Strategies, and PRESS 2015 Guideline Assessment Form.

      Conclusion

      The PRESS 2015 Guideline Statement should help to guide and improve the peer review of electronic literature search strategies.


      What is new?

         Key findings

      • Structured peer reviews of electronic literature search strategies are able to find search errors and offer enhancements to the selection of subject headings and text words, leading to the retrieval of additional studies.
      • No new relevant searching elements have emerged beyond those in the original Peer Review of Electronic Search Strategies (PRESS) Evidence-Based Checklist for peer review.
      • Of the seven original PRESS elements, six were retained while the seventh (skilled translation of the search strategy to additional databases) was removed, as there was consensus that this should be left to the discretion of searchers.

         What this adds to what was known?

      • The evidence suggests that peer review of electronic literature search strategies using a structured tool enhances the quality and comprehensiveness of the search compared with searches that are not peer reviewed.
      • The PRESS 2015 Guideline Statement should be helpful to guide and improve the peer review of electronic literature search strategies.

         What is the implication and what should change now?

      • The “primary” search strategy for systematic reviews, health technology assessments, and other evidence syntheses should be peer reviewed using a structured tool such as the PRESS 2015 Evidence-Based Checklist, which is part of the PRESS 2015 Guideline Statement.
      • The name and credentials of the person who undertook the peer review should be reported.

      1. Introduction

      Systematic review (SR) and health technology assessment (HTA) reports are pillars of evidence-based medicine due to their methodological rigor in the conduct of unbiased knowledge syntheses. The literature search component of these reviews provides the important evidence base and therefore is a fundamental element that can affect overall quality. The aim is to achieve comprehensiveness of coverage while maintaining a moderate degree of precision of the records retrieved.
Before our original (2008–2010) publications on Peer Review of Electronic Search Strategies (PRESS) [1-3], several tools existed to validate some aspects of the SR search-reporting methods, but none evaluated the overall process [4]. Furthermore, a study we conducted of over 100 MEDLINE searches revealed that most search strategies contained errors [5]. The quality of the database search may be enhanced by PRESS.

      2. Objective

      The objective was to develop, using an evidence-based process, a practice guideline for the peer review of electronic literature search strategies for librarians and other information specialists who perform literature searches for SR and HTA reports.

      3. Intent of the PRESS 2015 updated Guideline Statement

The PRESS 2015 Guideline Statement updates and expands on the previous PRESS publications [1-3]. A companion document, the PRESS 2015 Guideline Explanation and Elaboration (PRESS 2015 E&E), was produced; it is more detailed than this article and is intended to enhance the use, understanding, and dissemination of the PRESS 2015 Guideline Statement [6]. The PRESS 2015 Guideline Statement (this article) includes four components: (1) an updated PRESS 2015 Evidence-Based Checklist (Table 1); (2) six PRESS 2015 Recommendations for librarian practice (Table 2); (3) four PRESS 2015 Implementation Strategies (Table 3); and (4) an updated PRESS 2015 Guideline Assessment Form (Appendix A at www.jclinepi.com). The PRESS 2015 Evidence-Based Checklist (Table 1) is to be used when completing the PRESS 2015 Guideline Assessment Form. The PRESS 2015 Guideline Statement and the PRESS 2015 E&E [6] should be helpful resources to improve, and provide guidance for, the peer review of electronic literature search strategies.
Table 1. PRESS 2015 Guideline Evidence-Based Checklist
      Translation of the research question
      • Does the search strategy match the research question/PICO?
      • Are the search concepts clear?
      • Are there too many or too few PICO elements included?
      • Are the search concepts too narrow or too broad?
      • Does the search retrieve too many or too few records? (Please show number of hits per line.)
      • Are unconventional or complex strategies explained?
      Boolean and proximity operators (these vary based on search service)
      • Are Boolean or proximity operators used correctly?
      • Is the use of nesting with brackets appropriate and effective for the search?
      • If NOT is used, is this likely to result in any unintended exclusions?
      • Could precision be improved by using proximity operators (eg, adjacent, near, within) or phrase searching instead of AND?
      • Is the width of proximity operators suitable (eg, might adj5 pick up more variants than adj2)?
      Subject headings (database specific)
      • Are the subject headings relevant?
      • Are any relevant subject headings missing; for example, previous index terms?
      • Are any subject headings too broad or too narrow?
      • Are subject headings exploded where necessary and vice versa?
      • Are major headings (“starring” or restrict to focus) used? If so, is there adequate justification?
      • Are subheadings missing?
      • Are subheadings attached to subject headings? (Floating subheadings may be preferred.)
      • Are floating subheadings relevant and used appropriately?
      • Are both subject headings and terms in free text (see the following) used for each concept?
      Text word searching (free text)
      • Does the search include all spelling variants in free text (eg, UK vs. US spelling)?
      • Does the search include all synonyms or antonyms (eg, opposites)?
      • Does the search capture relevant truncation (ie, is truncation at the correct place)?
      • Is the truncation too broad or too narrow?
      • Are acronyms or abbreviations used appropriately? Do they capture irrelevant material? Are the full terms also included?
      • Are the keywords specific enough or too broad? Are too many or too few keywords used? Are stop words used?
      • Have the appropriate fields been searched; for example, is the choice of the text word fields (.tw.) or all fields (.af.) appropriate? Are there any other fields to be included or excluded (database specific)?
      • Should any long strings be broken into several shorter search statements?
      Spelling, syntax, and line numbers
      • Are there any spelling errors?
      • Are there any errors in system syntax; for example, the use of a truncation symbol from a different search interface?
      • Are there incorrect line combinations or orphan lines (ie, lines that are not referred to in the final summation that could indicate an error in an AND or OR statement)?
      Limits and filters
      • Are all limits and filters used appropriately and are they relevant given the research question?
      • Are all limits and filters used appropriately and are they relevant for the database?
      • Are any potentially helpful limits or filters missing? Are the limits or filters too broad or too narrow? Can any limits or filters be added or taken away?
      • Are sources cited for the filters used?
      Abbreviation: PICO, population/problem, intervention/exposure, comparison, outcome.
From reference [6]. © 2015 CADTH. Reprinted with permission.
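To illustrate several of the checklist items above in a single fragment, consider the following minimal sketch in Ovid MEDLINE syntax; the topic and terms are hypothetical and chosen only for illustration:
1. exp Breast Neoplasms/ [subject heading, exploded to include narrower terms]
2. (breast adj3 (cancer* or carcinoma* or tumor* or tumour*)).tw. [text words with truncation, US and UK spelling variants, and a proximity operator]
3. 1 or 2 [subject heading and free text combined with OR for the same concept]
Line 2 also shows why the width of a proximity operator matters: adj3 would capture phrases such as “cancer of the breast,” whereas adj1 would not.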
Table 2. PRESS 2015 Guideline Recommendations for librarian practice
1. Translation of the research question: Assess whether the research question has been correctly translated into search concepts.
Guidance: Ideally, the primary search strategy is submitted for peer review to ensure conceptual accuracy. The research question (typically formatted according to some variation of PICO) and the fine points of how the search was informed by the reference interview should be submitted with the search strategy.
2. Boolean and proximity operators: Assess whether the elements addressing the search question have been correctly combined with Boolean and/or proximity operators. Note that proximity operators vary based on search service.
Guidance: Review the search for any instances where mistakes occurred in Boolean operators; for example, OR may have been unintentionally substituted for AND (or vice versa), or AND may have been used to link phrases or words (e.g., as a conjunction) rather than as a Boolean operator. Note that where NOT has been used, there is the possibility of unintentional exclusions, and another device (e.g., using a subject heading, check tag, or limit) could produce an equivalent outcome.

      Ensure that the use of nesting within brackets is logical and has been applied, as needed. Also note whether the use of a proximity operator (adjacent, near, within) instead of AND could increase precision.

      If proximity operators are used, consider whether or not the chosen width is too narrow to capture all anticipated instances of the search terms, which may vary depending on whether or not the database being searched recognizes stop words. Consider whether the width is too broad.

      If restrictions are included (eg, human or elderly populations), ensure that the appropriate construction has been used.
3. Subject headings (database specific): Assess whether there is enough scope in the selection of subject headings to optimize recall.
Guidance: Examine the following elements of subject heading usage: missing or incorrect headings, relevance/irrelevance of terms, and correct use of explosion to include relevant narrower terms.

Consider the use of floating subheadings, which are in most instances preferable to using subheadings attached to specific subject headings (e.g., in MEDLINE, “Neck Pain/ and su.fs.” rather than “Neck Pain/su”). Note that subject headings and subheadings are database specific.
4. Text word search (free text): Assess whether search terms without adequate subject heading coverage are well represented by free-text terms, and whether additional synonyms or antonyms (opposites) and related terms are needed.
Guidance: Free-text terms are typically used to cover missing database subject headings. Consider elements of free-text usage such as terms that are too narrow or too broad, the relevance of terms, and whether synonyms or antonyms have been included.
5. Spelling, syntax, and line numbers: Assess correct use of spelling, correct use of syntax, and correct search implementation.
Guidance: Review the search strategy for misspelled words and for errors in system syntax that are not easily found by spell checking.

      Check each line number and combinations of line numbers to ensure that the search logic was correctly implemented.
6. Limits and filters: Assess whether the limits used (including filters) are appropriate and have been applied correctly.
Guidance: Review the search strategy to see if limits that are not relevant to the eligible study designs or to the clinical question have been applied, as these could potentially introduce epidemiological bias.

      Check that methodological search filters have been properly applied; for example, that SRs of economic evaluations are not restricted to RCTs.
      Abbreviations: PRESS, Peer Review of Electronic Search Strategies; PICO, population/problem, intervention/exposure, comparison, outcome; RCT, randomized controlled trial; SR, systematic review.
From reference [6]. © 2015 CADTH. Reprinted with permission.
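As an illustration of the floating subheading guidance in Recommendation 3, a minimal sketch in Ovid MEDLINE syntax, building on the article's own Neck Pain example (“su” denotes the surgery subheading):
1. Neck Pain/su [the surgery subheading attached directly to the heading]
2. Neck Pain/ and su.fs. [floating subheading: the heading combined with any occurrence of the surgery subheading on the record]
Line 2 is the broader formulation and retrieves at least the records found by line 1, because it also captures records where the surgery subheading is attached to a different, related heading on the same record.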
Table 3. PRESS 2015 Guideline Implementation Strategies
1. The “primary search” should be peer reviewed.
Guidance: The primary search is determined by the searcher as the most important database to be searched and is normally where the first search strategy is developed. Depending on the findings of the peer review and the complexity of translation to other databases and/or interfaces, further peer review may be desirable. Any time an SR or HTA is being updated, the updated search should also be peer reviewed.
2. One peer review is acceptable using the PRESS Guideline.
Guidance: A second review may be recommended in some cases; for example, if the project scope or research question(s) change, if complex new interfaces are involved, or if the peer reviewer specified that there are required revisions. The peer-review process should be documented.
3. Peer reviewers should be recognized.
Guidance: At a minimum, the peer reviewer should be recognized through acknowledgment in the publication (anonymously if the reviewer so wishes). The database searched and the service provider should be specified; for example, MEDLINE on OvidSP.
4. The turnaround time for a peer review of a search should be a maximum of five working days.
Guidance: A shorter turnaround time could be negotiated.
      Abbreviations: SR, systematic review; HTA, health technology assessment; PRESS, Peer Review of Electronic Search Strategies.
From reference [6]. © 2015 CADTH. Reprinted with permission.
PRESS focuses on the quality of the database search, which is the core element in the SR or HTA search plan. The search plan should include searching a range of bibliographic databases as well as additional sources (for example, study registers, gray literature sources, citation databases, and related article searching) and contacting experts and/or manufacturers [7]. However, those aspects of the search plan are outside the scope of PRESS. Other important aspects of a search include search validation and search reporting. Peer review of the search strategy provides a subjective validation. Accurate search reporting is necessary to ensure critical appraisal, replication, and updating [8-11].

      4. PRESS process

A PRESS peer review involves the person requesting the peer review (requestor) and the person completing the peer review (reviewer). Both are assumed to be skilled in the art of searching, typically librarians or information specialists. First, the requestor fills out the pertinent information in the updated PRESS 2015 Guideline Assessment Form (Appendix A at www.jclinepi.com) for the “primary” search strategy, which will be MEDLINE for most health-related SRs. The completed form is either sent to a reviewer (typically a colleague) or submitted to the PRESSforum (pressforum.pbworks.com). The reviewer assesses the search strategy using the PRESS 2015 Evidence-Based Checklist (Table 1) and the additional guidance presented here. If major revisions are advised as a result of the peer review, a second PRESS peer review of the revised search strategy is conducted. The requestor and reviewer determine the time frame for completing the review.

      5. Case study

The following is a case study of a simple MEDLINE search containing errors in each PRESS element for the research question, “What are the effectiveness and safety of acetaminophen for migraine headaches?” Please note that the PRESS 2015 Guideline Assessment Form (Appendix A at www.jclinepi.com) requires more details about the review than are presented in this basic illustration (Table 4).
Table 4. Example of an incorrect search
Search terms (number of hits)
1. Acetaminophen/ (15,078)
2. Analgesics/ (38,415)
3. 1 or 2 (52,368)
4. Migraine Disorders/ (21,542)
5. 3 and 4 (1,216)
6. Animals/ and Humans/ (1,584,813)
7. 4 not 6 (20,602)
8. random*.mp. (922,322)
9. 7 and 8 (2,116)
Errors identified:
• a) Free-text terms (lines 1 and 4): Adding text words for both the acetaminophen and migraine concepts could retrieve additional relevant studies.
• b) Translation of the research question (line 2): Analgesics (in general) are not within the scope of the research question.
• c) Subject headings (line 4): The MeSH heading Migraine Disorders/ should be “exploded” to include narrower, more specific subject headings.
• d) Boolean and proximity operators (line 6): This should be Animals/ not Humans/.
• e) Spelling, syntax, and line numbers (line 7): This should combine line 5, not line 4, with line 6 (i.e., 5 not 6).
• f) Limits and filters (line 8): This search should not be limited to a specific study design (e.g., randomized controlled trials).
      Few searches will contain this many errors; however, this example shows that a convincing-looking search can contain shortcomings that peer review can identify.
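One possible corrected version of the search, reflecting points a) to f) above, is sketched here in Ovid MEDLINE syntax; the added text words in lines 2 and 5 are illustrative choices rather than a prescribed set, and hit counts are omitted because they would depend on the date of searching:
1. Acetaminophen/
2. (acetaminophen or paracetamol).tw. [text words added for the drug concept; point a]
3. 1 or 2
4. exp Migraine Disorders/ [heading exploded; point c]
5. migraine*.tw. [text words added for the migraine concept; point a]
6. 4 or 5
7. 3 and 6 [both concepts combined; the out-of-scope Analgesics/ line has been dropped; point b]
8. Animals/ not Humans/ [corrected animal exclusion; point d]
9. 7 not 8 [exclusion applied to the combined set, with no study design filter; points e and f]
In practice, brand names, further synonyms, and spelling variants would also be considered, in line with the text word items in Table 1.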

      6. Background to the update

      The original PRESS project began in 2005 and involved an SR, a Web-based survey of experts, and two peer-review forums. The aim was to determine which elements of the search process have significant impact on the overall comprehensiveness of the resulting evidence base.
      The SR identified evidence related to quality issues and errors in complex electronic search strategies. A Web-based survey of individuals experienced in SR searching gathered expert opinion regarding the impact of search elements on the search results and the importance of each element. Finally, after a pilot test of the PRESS checklist, the PRESSforum web site was developed to provide a reciprocal facility for health librarians/information specialists to obtain peer review of their SR and HTA searches.
      In the original PRESS development process, there was strong consensus about the seven elements of search strategies that are important to check in peer review: (1) accurate translation of the research question into search concepts; (2) correct choice of Boolean operators; (3) accurate line numbers and absence of spelling errors; (4) an appropriate text word search; (5) inclusion of relevant subject headings; (6) correct use of limits and filters; and (7) search strategy adaptations. In 2010, the annotated PRESS Evidence-Based Checklist was published, and the web site pressforum.pbworks.com was launched.

      7. Development of the PRESS 2015 updated Guideline Statement

      The research questions were as follows:
      • Are there any existing checklists that evaluate or validate the quality of literature searches in the health sciences?
      • What elements relate to quality or errors in search strategies?
      The steps in the update included (1) an SR to update the evidence base; (2) a Web-based survey of experts; and (3) a consensus development forum meeting. Results of the SR suggested that structured peer reviews are able to identify search errors and present improvements for better recall and precision. The international survey included 117 experts who completed at least one response beyond the demographic data, with 108 fully completing the survey.
The consensus development forum meeting involved leading experts in literature searching methodology. Participants were recruited from the Canadian Agency for Drugs and Technologies in Health (CADTH), the Cochrane Trial Search Coordinators (TSCs) Executive, the Information Retrieval Methods Group, and the Health Technology Assessment international Interest Sub-Group on Information Resources, together with a PRESS user selected from the membership of PRESSforum and one member at large.
The meeting aims were to review the SR and survey results, develop the PRESS 2015 Guideline recommendations, and discuss a knowledge translation strategy to disseminate PRESS. One of the coauthors (V.F.) facilitated the consensus process by leading structured discussions on each of the PRESS 2015 Evidence-Based Checklist items. Decisions were reached by open (unblinded) voting, with consensus being defined a priori as a majority vote. All items received unanimous votes. A detailed report of our methods and results is published elsewhere [6].
      The elements of an effective search remain unchanged from those identified in 2005. These results were used in the final two steps of the guideline development: recommendations and guidance for librarians and other information specialists, and an updated PRESS 2015 Evidence-Based Checklist.

      8. Changes in PRESS 2015 updated Guideline Statement

The original PRESS Evidence-Based Checklist included guidance on seven elements of the search strategy development process. The 2015 updating process confirmed the utility of the first six elements and led to suggested improvements to them, but eliminated the seventh, “skilled translation of the search strategy to additional databases.” This element was withdrawn because there was consensus that the scheduling and nature of search strategy translations should be left to the discretion of searchers.
      The new PRESS 2015 Guideline Statement incorporates four components: (1) six PRESS 2015 Recommendations for librarian and information specialist practice; (2) four PRESS 2015 Implementation Strategies; (3) an updated PRESS 2015 Evidence-Based Checklist; and (4) an updated PRESS 2015 Guideline Assessment Form.

      9. Implications and limitations

Sound database searching methods are well established, as evidenced by the stability of elements over the 10 years since the original PRESS Evidence-Based Checklist was developed. However, text mining is an active field of research that may eventually shift search performance toward maximizing recall without regard to precision [12]. Thus, the interface between database searching and these new technologies may be a rich area for study.
The SR for this update was focused on the health science databases. Although the original PRESS guideline [1,3] was informed by research and theory from all fields of library science and was designed for peer reviewing any important search, uptake of the PRESS guideline appears to be largely confined to SR and HTA searches. This narrowed focus was therefore warranted.
      Grading of strength of recommendations was not done. The SR was an update, and risk of bias and strength of evidence assessments would have required revisiting the studies included in the original PRESS SR. However, no guidance in the original PRESS was overturned by new evidence or by the expert opinions of survey respondents or consensus development forum participants, adding to confidence in these findings. Piloting of the revised PRESS 2015 Guideline Statement was undertaken by only one agency (CADTH), the sponsor of this research.

      10. Conclusion

      The literature search strategies for knowledge syntheses should be peer reviewed using a structured tool such as the PRESS 2015 Evidence-Based Checklist. Research indicates that this can improve the quality and comprehensiveness of the search and reduce errors. Consequently, it can increase the overall quality of the evidence base for an SR or HTA. The elements that are important for effective Boolean searches have been confirmed in this update, and new evidence has been incorporated into the guidance.

      Acknowledgments

The authors thank the peer reviewers of the MEDLINE search strategy using PRESS: Janet Joyce, MLS, Ottawa, Ontario; and Linda Slater, MLIS, public service manager, John W. Scott Health Sciences Library, University of Alberta, Edmonton, Alberta.
      The authors acknowledge the contributions of the PRESS Consensus forum expert panel: Jessie McGowan, MLIS, PhD, AHIP, library and health consultant, Ottawa, Ontario; Carol Lefebvre, MSc, HonFCLIP, independent information consultant, Lefebvre Associates Ltd, UK and coconvenor, Cochrane Information Retrieval Methods Group; Margaret Sampson, MLIS, PhD, AHIP, manager, Library and Media Services, Children's Hospital of Eastern Ontario, Ottawa, Ontario; Vicki Foerster, MD, MSc, health policy consultant and medical writer, Oxford Station, Ontario; Shaila Mensinkai, MLIS, director, Information Services, Canadian Agency for Drugs and Technologies in Health, CADTH, Ottawa, Ontario; David Kaunelis, MLIS, information methods specialist, CADTH, Ottawa, Ontario; Carolyn Spry, BSc, MLIS, information specialist, CADTH, Ottawa, Ontario; Deirdre Beecher, BA (Hons), MSc Econ, information specialist, Cochrane Injuries Group, London School of Hygiene & Tropical Medicine, London, UK; Linda Slater, MLIS, public service manager, John W. Scott Health Sciences Library, University of Alberta, Edmonton, Alberta; Kate Misso, BSc (Hons), PgCertPgDip, MSc, information specialist manager, Kleijnen Systematic Reviews Ltd.; York, UK; Su Golder, FRSA, research fellow, Department of Health Sciences, University of York, Heslington, York, UK; Lindsey Sikora, BSc (Hon), MISt, Health Sciences Research Liaison librarian, Health Sciences Library, University of Ottawa, Ottawa, Ontario.
      The authors also thank Sarah Calder for recording the PRESS consensus development forum.


      References

1. Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol 2009;62:944-952.
2. Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw JM. PRESS: Peer Review of Electronic Search Strategies. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2008.
3. McGowan J, Sampson M, Lefebvre C. Evidence Based Checklist for the Peer Review of Electronic Search Strategies (PRESS EBC). Evid Based Libr Inf Pract 2010;5:149-154.
4. Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol 2008;61:748-754.
5. Sampson M, McGowan J. Errors in search strategies were identified by type and frequency. J Clin Epidemiol 2006;59:1057-1063.
6. McGowan J, Sampson M, Salzwedel D, Cogo E, Foerster V, Lefebvre C. PRESS–Peer Review of Electronic Search Strategies: 2015 Guideline Explanation & Elaboration (PRESS E&E). Ottawa: CADTH; 2016.
7. Lefebvre C, Manheimer E, Glanville J. Chapter 6: Searching for studies. In: Higgins J, Green S, editors. Cochrane handbook for systematic reviews of interventions. Version 5.1.0. Chichester: John Wiley & Sons, Ltd; 2011. p. 1-46.
8. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med 2009;6:e1000100.
9. Harbour J, Fraser C, Lefebvre C, Glanville J, Beale S, Boachie C, et al. Reporting methodological search filter performance comparisons: a literature review. Health Info Libr J 2014;31:176-194.
10. Booth A. “Brimful of STARLITE”: toward standards for reporting literature searches. J Med Libr Assoc 2006;94:421-429, e205.
11. Fehrmann P, Thomas J. Comprehensive computer searches and reporting in systematic reviews. Res Synth Methods 2011;2:15-32.
12. O'Mara-Eves A, Thomas J, McNaught J, Miwa M, Ananiadou S. Using text mining for study identification in systematic reviews: a systematic review of current approaches. Syst Rev 2015;4:5.