Evaluation of baseline adherence to the PRISMA-DTA reporting guideline




Poster session 1


Sunday 16 September 2018 - 12:30 to 14:00

All authors in correct order:

Salameh JP1, McInnes M1, Moher D1, Bossuyt P2, Thombs B3, Kraaijpoel N4, McGrath T5, Levis B6, Frank R7, Sharifabadi AD7
1 Clinical Epidemiology Program, Ottawa Hospital Research Institute, Canada
2 Department of Clinical Epidemiology, Biostatistics and Bioinformatics, University of Amsterdam, Academic Medical Center, The Netherlands
3 Department of Psychiatry, McGill University, Canada
4 Department of Vascular Medicine, University of Amsterdam, Academic Medical Center, The Netherlands
5 University of Ottawa Department of Radiology, Canada
6 Department of Epidemiology, Biostatistics and Occupational Health, McGill University, Canada
7 University of Ottawa, Canada
Presenting author and contact person

Presenting author:

Jean-Paul Salameh

Contact person:

Abstract text
Objectives:
To evaluate the adherence of recently published diagnostic test accuracy (DTA) systematic reviews (SRs) to PRISMA-DTA and PRISMA-DTA for abstracts, and to identify areas of deficiency in reporting.

Methods:
We searched MEDLINE for DTA SRs published between 31 October 2017 and 20 January 2018 to achieve a target sample size of 100 SRs. We evaluated overall adherence to PRISMA-DTA on a per-item basis. We evaluated the association of completeness of reporting with: journal, country, impact factor (IF), index test type, subspecialty area, use of supplementary material, PRISMA citation and PRISMA adoption by journal. We also assessed the correlation of adherence with word count.

Results:
Adherence (n = 100 studies) was 73% (19.0/26 items, standard deviation (SD) 2.0) for PRISMA-DTA and 45% (4.5/10 items, SD 1.2) for PRISMA-DTA for abstracts. Items pertaining to the results section (study selection, synthesis of the results) were frequently reported (> 66% of studies), while infrequently reported items (< 33% of studies) included those related to protocol reporting and registration, and characteristics of the included studies (clinical and study settings, funding sources) (Table 1). Infrequently reported items from PRISMA-DTA for abstracts included funding information, strengths and limitations of the SR, characteristics of the included studies and assessment of applicability (Table 2). Adherence was higher in journals with a higher IF (19.5 versus 18.5 items; P = 0.014), studies that cited PRISMA (19.3 versus 18.2 items; P = 0.012) and studies that used supplementary material (19.6 versus 18.5 items; P = 0.006). No variability in reporting was identified for country (P = 0.076), journal (P = 0.596), PRISMA adoption by journal (P = 0.343), study design (P = 0.668), subspecialty area (P = 0.313) or index test (P = 0.812) (Table 3). Adherence was associated with a higher word count for abstracts (R = 0.43; P < 0.001) but not for full texts (R = -0.03; P = 0.782).

Conclusions:
Recently published DTA SRs show moderate adherence to PRISMA-DTA and low adherence to PRISMA-DTA for abstracts.

Patient or healthcare consumer involvement:
This evaluation will guide knowledge translation strategies to improve the completeness of DTA SR reporting, allowing the many stakeholders who rely on DTA SRs to better assess critical aspects of review methods and to determine the applicability and validity of a review.


Relevance to patients and consumers: 

Incomplete reporting of diagnostic test accuracy (DTA) systematic reviews can hinder the ability of stakeholders such as clinicians, guideline authors, and policy makers to evaluate the quality and applicability of a review's conclusions. The resulting incomplete understanding of diagnostic tests in clinical practice can harm patients through under-recognition of incorrect test results (false positives and false negatives). Identifying specific areas of incomplete reporting can inform targeted knowledge translation strategies aimed at addressing these deficiencies.