PRISMA Reporting Guideline for Diagnostic Test Accuracy Studies | Guidelines | JN Learning | AMA Ed Hub

Preferred Reporting Items for a Systematic Review and Meta-analysis of Diagnostic Test Accuracy Studies: The PRISMA-DTA Statement

Educational Objective
To understand the development of a guideline for reporting systematic reviews and meta-analyses of diagnostic test studies.
1 Credit CME
Key Points

Question  What items should be reported to allow readers to evaluate the validity and applicability and to enhance the replicability of systematic reviews of diagnostic test accuracy studies?

Findings  This diagnostic test accuracy guideline is an extension of the original Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Two PRISMA items were omitted, 2 were added, and 17 were modified to reflect specific or optimal contemporary systematic review methods of diagnostic test accuracy studies.

Meaning  The guideline checklist can facilitate transparent reporting of reviews of diagnostic test accuracy studies, and may assist evaluations of validity and applicability, enhance replicability of reviews, and make the results more useful for clinicians, journal editors, reviewers, guideline authors, and funders.

Abstract

Importance  Systematic reviews of diagnostic test accuracy synthesize data from primary diagnostic studies that have evaluated the accuracy of 1 or more index tests against a reference standard, provide estimates of test performance, allow comparisons of the accuracy of different tests, and facilitate the identification of sources of variability in test accuracy.

Objective  To develop the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagnostic test accuracy guideline as a stand-alone extension of the PRISMA statement. Modifications to the PRISMA statement reflect the specific requirements for reporting of systematic reviews and meta-analyses of diagnostic test accuracy studies and the abstracts for these reviews.

Design  Established standards from the Enhancing the Quality and Transparency of Health Research (EQUATOR) Network were followed for the development of the guideline. The original PRISMA statement was used as a framework on which to modify and add items. A group of 24 multidisciplinary experts used a systematic review of articles on existing reporting guidelines and methods, a 3-round Delphi process, a consensus meeting, pilot testing, and iterative refinement to develop the PRISMA diagnostic test accuracy guideline. The final version of the PRISMA diagnostic test accuracy guideline checklist was approved by the group.

Findings  The systematic review (which produced 64 items) and the Delphi process (which provided feedback on 7 proposed items; 1 item was later split into 2) identified 71 potentially relevant items for consideration. The Delphi process reduced these to 60 items that were discussed at the consensus meeting. Following the meeting, pilot testing and iterative feedback were used to generate the 27-item PRISMA diagnostic test accuracy checklist. To reflect specific or optimal contemporary systematic review methods for diagnostic test accuracy, 8 of the 27 original PRISMA items were left unchanged, 17 were modified, 2 were added, and 2 were omitted.

Conclusions and Relevance  The 27-item PRISMA diagnostic test accuracy checklist provides specific guidance for reporting of systematic reviews. The PRISMA diagnostic test accuracy guideline can facilitate the transparent reporting of reviews, and may assist in the evaluation of validity and applicability, enhance replicability of reviews, and make the results from systematic reviews of diagnostic test accuracy studies more useful.


Article Information

Corresponding Author: Matthew D. F. McInnes, MD, Ottawa Hospital-Civic Campus, 1053 Carling Ave, Ottawa, ON K1E 4Y9, Canada (mmcinnes@toh.ca).

Correction: This article was corrected on November 26, 2019, to fix the term receiver operating characteristic plot in Table 2.

Accepted for Publication: December 6, 2017.

The PRISMA-DTA Group Authors: Tammy Clifford, PhD; Jérémie F. Cohen, MD, PhD; Jonathan J. Deeks, PhD; Constantine Gatsonis, PhD; Lotty Hooft, PhD; Harriet A. Hunt, MSc; Christopher J. Hyde, PhD; Daniël A. Korevaar, MD, PhD; Mariska M. G. Leeflang, PhD; Petra Macaskill, PhD; Johannes B. Reitsma, MD, PhD; Rachel Rodin, MD, MPH; Anne W. S. Rutjes, PhD; Jean-Paul Salameh, BSc; Adrienne Stevens, MSc; Yemisi Takwoingi, PhD; Marcello Tonelli, MD, SM; Laura Weeks, PhD; Penny Whiting, PhD; Brian H. Willis, MD, PhD.

Affiliations of The PRISMA-DTA Group Authors: Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada (Salameh); Department of Clinical Epidemiology, Biostatistics and Bioinformatics, University of Amsterdam, Academic Medical Center, Amsterdam, the Netherlands (Korevaar, Leeflang); Canadian Agency for Drugs and Technologies in Health, Ottawa, Ontario (Clifford, Weeks); Department of Pediatrics, Necker-Enfants Malades Hospital, Assistance Publique Hôpitaux de Paris, Paris Descartes University, Paris, France (Cohen); Inserm UMR 1153, Research Center for Epidemiology and Biostatistics Sorbonne Paris Cité, Paris Descartes University, Paris, France (Cohen); University of Birmingham, Birmingham, England (Deeks, Takwoingi, Willis); Brown University, Providence, Rhode Island (Gatsonis); Cochrane Netherlands, Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, the Netherlands (Hooft, Reitsma); University of Exeter, Exeter, England (Hunt, Hyde); University of Sydney, Sydney, Australia (Macaskill); Public Health Agency of Canada, Ottawa, Ontario, Canada (Rodin); Institute of Social and Preventive Medicine, Berner Institut für Hausarztmedizin, University of Bern, Bern, Switzerland (Rutjes); School of Epidemiology and Public Health, University of Ottawa, Ottawa, Ontario, Canada (Salameh); Ottawa Hospital Research Institute, Ottawa, Ontario, Canada (Stevens); Translational Research in Biomedicine Program, School of Medicine, University of Split, Split, Croatia (Stevens); University of Calgary, Calgary, Alberta, Canada (Tonelli); University of Bristol, National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care West, Bristol, England (Whiting).

Author Contributions: Dr McInnes had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: McInnes, Moher, McGrath, Bossuyt, Clifford, Cohen, Korevaar, Reitsma, Salameh, Takwoingi, Willis.

Acquisition, analysis, or interpretation of data: McInnes, Thombs, McGrath, Cohen, Gatsonis, Hunt, Hyde, Korevaar, Leeflang, Reitsma, Rutjes, Salameh, Stevens, Takwoingi, Tonelli, Weeks, Whiting, Willis.

Drafting of the manuscript: McInnes, Moher, McGrath, Leeflang, Salameh, Willis.

Critical revision of the manuscript for important intellectual content: McInnes, Thombs, McGrath, Bossuyt, Clifford, Cohen, Deeks, Gatsonis, Hooft, Hunt, Hyde, Korevaar, Leeflang, Macaskill, Reitsma, Rodin, Rutjes, Salameh, Stevens, Takwoingi, Tonelli, Weeks, Whiting, Willis.

Obtained funding: McInnes, Clifford.

Administrative, technical, or material support: McInnes, Moher, Clifford, Hunt, Salameh, Willis.

Supervision: McInnes, Bossuyt, Takwoingi.

Conflict of Interest Disclosures: The authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest and none were reported.

Funding/Support: The research was supported by grant 375751 from the Canadian Institutes of Health Research; funding from the Canadian Agency for Drugs and Technologies in Health; funding from the Standards for Reporting of Diagnostic Accuracy Studies Group; funding from the University of Ottawa Department of Radiology Research Stipend Program; and funding from the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care South West Peninsula.

Role of the Funder/Sponsor: None of the funding sources had any role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
