Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom

Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. (2013) Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom. In: American Thoracic Society Congress, Philadelphia, US.

PDF: ATS_2013_Poster_A2229_Woznitza_A4_Handout.pdf (561kB)

Abstract

Rationale: Driven by developing technology and an ageing population, radiology has witnessed an unprecedented rise in workload. One response to this in the United Kingdom has been to train radiographers to undertake clinical reporting. Accurate interpretation of imaging is crucial to allow clinicians to manage and treat patients correctly.

Methods: A random sample of cases (n=100) was selected, using a simple computer-generated algorithm, from a consecutive series of 1,000 chest X-ray reports produced by a radiographer in clinical practice. Because observer variation in chest X-ray interpretation is well recognised, three consultant radiologists were included so that inter-radiologist variation could be established as the baseline. Each radiologist interpreted 50 images and examined the corresponding radiographer report for accuracy and agreement, with 50% duplication of cases between radiologists to determine inter-radiologist variation. The radiologists performed their evaluations independently and were blinded to the proportion of cases receiving multiple radiologist opinions. Inter-observer agreement was assessed with the kappa statistic to determine consistency among observers.
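The abstract does not describe the selection algorithm or how kappa was calculated; the Python sketch below is purely illustrative of the two steps named above (random case selection and Cohen's kappa) under assumed binary "normal/abnormal" reads, and is not a reconstruction of the authors' code. All case labels and variable names are hypothetical.

    import random
    from collections import Counter

    # Illustrative random selection of 100 of 1,000 consecutive reports;
    # the abstract's "simple computer generated algorithm" is not described.
    sample_indices = random.sample(range(1, 1001), 100)

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement between two raters on the same cases."""
        n = len(rater_a)
        # Observed agreement: proportion of cases given identical ratings.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected chance agreement, from each rater's marginal frequencies.
        count_a, count_b = Counter(rater_a), Counter(rater_b)
        categories = set(rater_a) | set(rater_b)
        p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
        if p_e == 1:  # degenerate case: both raters used a single shared category
            return 1.0
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical binary reads for a handful of cases
    radiologist  = ["abnormal", "normal", "abnormal", "normal", "normal"]
    radiographer = ["abnormal", "normal", "normal",   "normal", "normal"]
    print(f"kappa = {cohens_kappa(radiologist, radiographer):.2f}")

On the Landis and Koch scale commonly used to interpret kappa, values above 0.80 are described as almost perfect agreement, which is the benchmark applied in the Results below.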

Results: Disagreement between radiologist and radiographer was found in seven cases; in three of these, one radiologist agreed with the radiographer. Inter-observer agreement (kappa statistic) between each of the three radiologists and the reporting radiographer was almost perfect: K=0.91, 95% confidence interval (0.79, 1.0); K=0.91 (0.78, 1.0); and K=0.83 (0.68, 0.99) respectively. Inter-radiologist agreement was also almost perfect: K=0.82 (0.57, 1.0) and K=0.91 (0.75, 1.0).
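The abstract reports a 95% confidence interval alongside each kappa but does not state how those intervals were derived. As one hedged illustration only, a case-resampling (percentile) bootstrap can produce such intervals; the sketch below reuses the cohens_kappa helper and hypothetical reads from the Methods example and is not the authors' method.

    import random

    def bootstrap_kappa_ci(rater_a, rater_b, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for Cohen's kappa
        (uses the cohens_kappa helper defined in the sketch above)."""
        rng = random.Random(seed)
        n = len(rater_a)
        estimates = []
        for _ in range(n_boot):
            # Resample cases with replacement and recompute kappa each time.
            idx = [rng.randrange(n) for _ in range(n)]
            estimates.append(cohens_kappa([rater_a[i] for i in idx],
                                          [rater_b[i] for i in idx]))
        estimates.sort()
        lower = estimates[int((alpha / 2) * n_boot)]
        upper = estimates[int((1 - alpha / 2) * n_boot) - 1]
        return lower, upper

    # e.g. bootstrap_kappa_ci(radiologist, radiographer) -> (lower, upper)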

Conclusion: The level of inter-observer agreement between the radiologists and the reporting radiographer in chest X-ray interpretation compares favourably with inter-radiologist variation.

Item Type: Conference or Workshop Item (Poster)
Subjects: R Medicine > RC Internal medicine > RC0071 Examination. Diagnosis including radiography
Divisions: pre Nov-2014 > Faculty of Health and Social Care > Allied Health Professions
Depositing User: Nick Woznitza
Date Deposited: 16 Sep 2014 13:08
Last Modified: 11 Dec 2014 14:12
URI: https://create.canterbury.ac.uk/id/eprint/12652
