Hold Pusten
04.04.2022
G. CAIN a, L.J. PITTOCK b, K. PIPER b, M.R. VENUMBAKA a, M. BODOCEANU a
a Radiology Department, Colchester Hospital, Turner Road, Colchester CO4 5JL, UK
b Faculty of Medicine, Health and Social Care, Canterbury Christ Church University, North Holmes Road, Canterbury CT1 1QU, UK
Abstract
Introduction: This study assessed the inter-observer agreement of reporting radiographers and consultant radiologists compared with an index radiologist when reporting General Practitioner (GP) requested musculoskeletal radiographs. The potential effect of discordant reports on patient management and outcome was also examined.
Methods: Three reporting radiographers, three consultant radiologists and an index radiologist reported on a retrospective randomised sample of 219 GP requested musculoskeletal radiographs, in conditions simulating clinical practice. A speciality doctor in radiology compared the observers' reports with the index radiologist report for agreement and assessed whether any discordance between reports was clinically important.
Results: Overall agreement with the index radiologist was 47.0% (95% CI, 40.5–53.6) and 51.6% (95% CI, 45.0–58.1) for the consultant radiologists and reporting radiographers, respectively. The results for the appendicular and axial skeleton were 48.6% (95% CI, 41.3–55.9) and 40.9% (95% CI, 27.7–55.6) for the radiologists, and 52.6% (95% CI, 45.2–59.8) and 47.7% (95% CI, 33.8–62.1) for the radiographers, respectively. The difference in overall observer agreement between the two professional groups with the index radiologist was not statistically significant (p = 0.34). Discordance with the index radiologist's reports was judged to be clinically important in less than 10% of the observers' reports.
Conclusion: Reporting radiographers and consultant radiologists demonstrate similar levels of concordance with an index radiologist when reporting GP requested musculoskeletal radiographs.
Implications for practice: These findings contribute to the wider evidence base that selected radiographers with appropriate postgraduate education and training are proficient to report on musculoskeletal radiographs, irrespective of referral source.
© 2021 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

Introduction
A chronic shortage of consultant radiologists and sustained increases in demand for radiology services represent a persistent challenge for the National Health Service (NHS).1 Demand is anticipated to grow substantially over the next 5 years during the Covid-19 recovery phase.2 Alternative strategies used to help manage reporting shortfalls, such as outsourcing and auto-reporting, bring their own challenges. In the United Kingdom, £193 million was spent on outsourcing/insourcing in 2019, which the Royal College of Radiologists calculated to be the equivalent of the combined salaries of more than half of the existing radiologist workforce.3 Auto-reporting may be an appropriate tool for managing workloads in limited circumstances, such as follow-up images for fracture clinic patients, but wider use risks missed or delayed diagnoses.4 In 2003 the Department of Health introduced the skills mix initiative to help meet the rising demand for diagnostic services; this included the delegation of reporting activities to appropriately trained radiographers.5 While the widespread use of reporting radiographers has increased reporting capacity,3 there is still significant regional variation in their use,4 and limitations placed on their scope of practice.6-8 The value of skills mix initiatives has been recognised in a recent NHS England report,2 which estimated that an additional 500 reporting radiographers will be needed over the next 5 years to help meet demand. In 2014 the scope of practice of musculoskeletal reporting radiographers at a large District General Hospital (DGH) was extended to include General Practitioner (GP) referrals, with the intention of improving report turnaround times and reducing outsourcing costs.
It has been argued that reporting GP radiographs is a more complex task than accident and emergency (A&E) reporting, where a binary decision about the presence or absence of a fracture is often all that is required.9 Furthermore, GP radiographs are normally only seen by the reporting practitioner; hence it is vital that the reports produced are accurate and meet requisite standards. In the absence of explicitly defined standards, acceptable reporting performance is typically assessed by comparison with the performance of the average competent practitioner.10-12 There is definitive evidence that reporting radiographers can accurately report musculoskeletal radiographs in clinical practice at a level comparable to consultant radiologists.13-15 However, these studies were primarily focused on the interpretation of musculoskeletal radiographs in the A&E/trauma context. Although there is strong evidence from the academic setting to support the reporting of non-A&E/GP radiographs by radiographers in an examination environment,16 there is limited research in the clinical setting.9 A robust research study was, therefore, undertaken to substantiate radiographer performance in GP reporting and to ensure no loss of quality from the delegation of this task.
Aim and objectives
The aim of this study was to determine whether reporting radiographers report GP requested musculoskeletal radiographs with a level of proficiency similar to that of consultant radiologists, in clinical practice conditions. The primary objective was to assess the inter-observer agreement of radiographers and radiologists compared with an index radiologist; the secondary objective was to examine the potential effect of discordant reports on patient management and outcome.
Method
Study design, setting and ethical approvals
A prospective quasi-experimental observer agreement study was performed using a non-inferiority approach. The study took place in a single department of radiology within an NHS DGH in the East of England. Ethical approvals were obtained from the Faculty of Medicine, Health and Social Care, Canterbury Christ Church University, and the local Research and Development department. The study was classified as service evaluation; therefore, full NHS Research Ethics Committee approval was not required. Each participant was given a participant information sheet relevant to their role in the study. Informed consent was obtained from all participants prior to data collection.
Sample size
To calculate an appropriate sample, estimates were required for the expected percentage agreement of the two groups with the index radiologist, any postulated difference between them, and a non-inferiority margin.17 Robinson et al.18 estimated variability of 9–10% between consultant radiologists when interpreting A&E musculoskeletal radiographs. The complexity of GP reporting, with its wider range of descriptive and interpretative observations, is likely to result in greater inter-observer variability. Brealey et al.9 found consultant radiologists reported A&E radiographs more accurately than GP radiographs, 86% (95% Confidence Interval (CI): 82–89) and 77% (95% CI: 72–81), respectively. To the authors' knowledge, no previous study has compared the performance of reporting radiographers with clinical experience of reporting GP radiographs with that of consultant radiologists. There is no evidence of a difference in performance between reporting radiographers and radiologists when reporting A&E requested musculoskeletal radiographs15 or chest radiographs.19 On these grounds, the assumed percentage agreement was 77% for both professional groups, with a 10% non-inferiority margin. Based on these assumptions, a sample of 219 examinations was required to adequately power the study to detect a difference between the two groups, if one existed (Appendix: sample size calculation).
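The appendix with the full calculation is not reproduced here, but a standard sample-size formula for comparing two proportions under a non-inferiority design reproduces the figure of 219 from the stated inputs (77% expected agreement in both groups, 10% margin), if one additionally assumes the conventional one-sided α = 0.05 and 80% power; those last two parameters are assumptions, not values given in the text.

```python
import math
from statistics import NormalDist

# Stated in the text: 77% expected agreement in both groups, 10% margin.
p1 = p2 = 0.77
margin = 0.10

# Assumed (not stated): one-sided alpha = 0.05, power = 80%.
z_alpha = NormalDist().inv_cdf(0.95)
z_beta = NormalDist().inv_cdf(0.80)

# Normal-approximation sample size for non-inferiority of two proportions.
variance = p1 * (1 - p1) + p2 * (1 - p2)
n = (z_alpha + z_beta) ** 2 * variance / margin ** 2
print(math.ceil(n))  # 219
```

Under these assumptions the formula yields 218.99, which rounds up to the 219 examinations used in the study.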
Case selection
All GP requested X-ray examinations of the appendicular and axial skeleton were eligible for inclusion, reflecting the local practice of the reporting radiographers. No exclusions were made on the basis of patient age or the technical adequacy of the radiographs (Table 1). In order to infer how well the observers can report all GP radiographs, it was necessary for the sample to be reflective of the typical clinical case-mix.20-22 The radiology information system was interrogated for GP requested X-ray examinations across all hospital sites for the 2018 calendar year. The proportion of each anatomical region was quantified and used to calculate the number of cases required for each area, using a system adapted from Neep
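The proportional (stratified) allocation described above can be sketched as follows; the region names and case-mix percentages here are purely illustrative placeholders, not the study's actual 2018 figures.

```python
# Illustrative case-mix proportions by anatomical region (hypothetical
# values summing to 1.0; the study's real 2018 proportions are not given).
case_mix = {
    "knee": 0.22,
    "lumbar spine": 0.14,
    "shoulder": 0.12,
    "hip": 0.11,
    "hand": 0.10,
    "foot": 0.31,
}

total = 219  # overall sample size from the power calculation

# Round each stratum to the nearest whole case, then absorb any rounding
# remainder into the largest stratum so the counts sum exactly to 219.
alloc = {region: round(p * total) for region, p in case_mix.items()}
alloc[max(alloc, key=alloc.get)] += total - sum(alloc.values())

print(sum(alloc.values()))  # 219
```

This simple round-and-adjust scheme is one common way to keep a stratified sample's total fixed; the study's own allocation method (adapted from Neep) may differ in its details.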