Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: review of a single clinical site

Woznitza, N., Piper, K., Burke, S., Ellis, S. and Bothamley, G. (2018) Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: review of a single clinical site. Radiography. (In Press)


Abstract

Introduction: To compare the clinical chest radiograph (CXR) reports provided by consultant radiologists and reporting radiographers with those provided by expert thoracic radiologists.

Methods: Adult CXRs (n=193) from a single site were included; 83% were randomly selected from CXRs performed over one year and 17% were selected from the discrepancy meeting. Chest radiographs were independently interpreted by two expert thoracic radiologists (CTR1/2). Clinical history and previous and follow-up imaging were available, but not the original clinical report. Two arbiters independently compared expert and clinical reports. Kappa (κ), chi-square (χ²) and McNemar tests were performed to determine inter-observer agreement.

Results: CTR1 interpreted 187 (97%) and CTR2 186 (96%) CXRs, with 180 CXRs interpreted by both experts. Radiologists and radiographers provided 93 and 87 of the original clinical reports respectively. Agreement between each expert thoracic radiologist and the radiographer clinical reports was 70 (CTR1; κ=0.59) and 70 (CTR2; κ=0.62), comparable to agreement between the expert thoracic radiologists and the radiologist clinical reports (CTR1=76, κ=0.60; CTR2=75, κ=0.62). The expert thoracic radiologists agreed with each other in 131 cases (κ=0.48). There was no difference in agreement with either expert thoracic radiologist whether the clinical report was provided by radiographers or radiologists (CTR1 χ²=0.056, p=0.813; CTR2 χ²=0.014, p=0.906), or when stratified by inter-expert agreement (radiographer McNemar p=0.629; radiologist p=0.701).
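The agreement statistics reported above can be sketched in code. This is not the authors' analysis; it is a minimal illustration, assuming report classifications have been reduced to per-case category labels (e.g. normal/abnormal), of unweighted Cohen's kappa and of the McNemar chi-square statistic computed from the two discordant cells of a paired 2×2 table.

```python
# Illustrative sketch only (not the study's code): unweighted Cohen's kappa
# for two raters, and the McNemar chi-square statistic (no continuity
# correction) from the discordant cell counts of a paired 2x2 table.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters labelling the same cases."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    pe = sum(counts_a[lab] * counts_b[lab] for lab in labels) / n ** 2
    return (po - pe) / (1 - pe)

def mcnemar_statistic(b, c):
    """McNemar chi-square from the two discordant counts b and c."""
    return (b - c) ** 2 / (b + c)

# Toy data: expert vs clinical report, 1 = abnormal, 0 = normal.
expert = [1, 1, 0, 0]
clinical = [1, 0, 0, 1]
print(cohens_kappa(expert, clinical))   # agreement no better than chance -> 0.0
print(mcnemar_statistic(5, 3))          # discordant pairs b=5, c=3 -> 0.5
```

In practice, values such as κ=0.59–0.62 in the results indicate moderate-to-good agreement beyond chance between the clinical reports and the expert reference reads.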

Conclusion: Even when weighted with chest radiographs reviewed at discrepancy meetings, the content of CXR reports from trained radiographers is comparable to the content of reports issued by radiologists and expert thoracic radiologists.

Item Type: Article
Uncontrolled Keywords: Clinical Competence; radiography; thoracic; radiographer reporting; observer performance
Subjects: R Medicine
Divisions: Faculty of Health and Wellbeing > School of Allied Health Professions
Depositing User: Nick Woznitza
Date Deposited: 31 Jan 2018 13:23
Last Modified: 23 Feb 2018 10:11
URI: https://create.canterbury.ac.uk/id/eprint/16931
