
Original Contribution

Accuracy of radiographic readings in the emergency department☆

Bruno Petinaux MD a,*, Rahul Bhat MD b, Keith Boniface MD a, Jaime Aristizabal MD a

aDepartment of Emergency Medicine, George Washington University, Washington, DC 20037, USA

bDepartment of Emergency Medicine, Georgetown University, Washington, DC 20007, USA

Received 21 September 2008; revised 2 June 2009; accepted 10 July 2009

☆ This study was presented as a poster at the Society for Academic Emergency Medicine Meeting in the spring of 2007.

* Corresponding author. Tel.: +1 202 258 0615 (office); fax: +1 202 741 2921. E-mail address: [email protected] (B. Petinaux).

Abstract

Objectives: A review of radiology discrepancies of emergency department (ED) radiograph interpretations was undertaken to examine the types of errors made by emergency physicians (EPs).

Methods: An ED quality assurance database containing all radiology discrepancies between the EP and radiology from June 1996 to May 2005 was reviewed. The discrepancies were categorized as bone, chest (CXR), and abdomen (AXR) radiographs and examined to identify abnormalities missed by EPs.

Results: During the study period, the ED ordered approximately 151 693 radiographs. Of these, 4605 studies were identified by radiology as having a total of 5308 abnormalities discordant from the EP interpretation. Three hundred fifty-nine of these abnormalities were not confirmed by the radiologist (false positives). The remainder of the discordant studies represented abnormalities identified by the radiologist and missed by the EP (false negatives). Of these false-negative studies, 1954 bone radiographs (2.4% of bone x-rays ordered) had missed findings, with 2050 abnormalities; the most common missed findings were fractures and dislocations. Of the 220 AXRs (3.7% of AXRs ordered) with missed findings, 240 abnormalities were missed; the most common of these was bowel obstruction. Of the 2431 CXRs (3.8% of CXRs ordered), 2659 abnormalities were missed; the most common were air-space disease and pulmonary nodules. The rate of discrepancies potentially needing an emergent change in management based solely on a radiographic discrepancy was 85 of 151 693 x-rays (0.056%).

Conclusions: Approximately 3% of radiographs interpreted by EPs are subsequently given a discrepant interpretation by the radiology attending. The most commonly missed findings included fractures, dislocations, air-space disease, and pulmonary nodules. Continuing education should focus on these areas to attempt to further reduce this error rate.


Introduction

Emergency medicine (EM) attending physicians (EPs) often order imaging studies as part of a patient's workup in the emergency department (ED). In most EDs, the interpretation of the radiograph is initially done by the treating EP, with a radiologist's (RAD) interpretation done after the disposition of the patient [1-4]. Most EDs have a quality assurance system to ensure concordance between the EP and RAD interpretations. Should a discrepancy between the 2 interpretations be noted, a review of the ED chart typically determines whether or not the discrepancy is of any clinical importance and, if so, whether or not it needs to alter the clinical management of the patient.


A number of studies have not only examined such discrepancies but also investigated the effect of interventions to minimize such events [5,6]. In the past, discrepancy rates as large as 45% have been noted between internal medicine house staff and radiology attending radiograph interpretations [7-9]. More recent studies have indicated this rate to be as low as 1.1% in EDs [10]. The largest study to date reviewed 16 246 radiographs [11], although most studies have been smaller [12-16]. We undertook a review of radiographs at our institution to quantify and characterize the discrepancies between EM and radiology interpretations of plain radiographs and to determine the accuracy of diagnostic radiographic interpretation by EPs.

Table 1 Case definitions

Radiograph: Any diagnostic radiograph, excluding cross-sectional imaging and ultrasonography.
Abnormality: Pathology identified on the radiograph (ASD, humeral head fracture, etc).
Discrepancy: An event in which the ED attending interpretation of a radiograph does not match the radiology attending interpretation of the same radiograph.
RAD: Radiologist (attending).
EP: Emergency physician (attending).
False negative: An abnormality missed by the EP.
False positive: An abnormality noted by the EP that RAD did not feel was present.
Questionable false negative: A radiograph in which the EP noted no abnormalities and RAD indicated the possibility of an abnormality.
Questionable false positive: A radiograph in which the EP noted an abnormality and RAD indicated the possibility that the abnormality did not exist.

Methods

This study was a retrospective review of all plain radiographs ordered from June 1996 to May 2005 in the ED of an urban, university-affiliated level I trauma center with a residency training program in EM. Most of the patient population was older than 18 years. The study was approved by the institutional review board. Attending EPs routinely reviewed all plain radiographs ordered in the ED and documented a preliminary interpretation on a "wet read" form. The same radiograph was subsequently reviewed by an attending RAD. Any discrepancies were reported to the ED staff member responsible for follow-up, typically a third-year EM resident. Upon receipt of the discrepancy, it was entered into a database (Microsoft Access, Redmond, Wash). The EM resident reviewed the patient's medical record to determine whether or not the discrepancy was known to the EP attending (ie, flagged by the RAD as a discrepancy due to incomplete documentation only). All other studies were entered as false negative, false positive, questionable false negative, or questionable false positive (see Table 1 for definitions). The EM resident's assessment of the clinical importance of the discrepancy, based on the abnormality described, medical record review, and consultation with the on-duty EP attending, then led to at least one of the following follow-up actions: no action, call to the admitting team (if the patient was an inpatient), call to the patient's regular physician, call to the patient to change follow-up or instructions, or call to the patient to return to the ED. Typically, the EM resident documented the result of the follow-up action in the database, along with any patient information pertinent to that discrepancy, including whether the patient was admitted or discharged and what change in follow-up or treatment was conveyed to the patient. This quality assurance process remained unchanged throughout the study period.

The inclusion criteria for the study were all plain radiograph discrepancies entered into the quality assurance database. All cross-sectional imaging studies (such as computed tomography, magnetic resonance imaging, and ultrasound) were excluded, as were blank entries in the database containing no data. At least 2 investigators independently reviewed each discrepancy in the database and removed discrepancies in which documentation error, and not radiograph misinterpretation, led to the discrepancy. Disagreements between investigators were settled by consensus of all 4 investigators. If the investigators were not able to determine with certainty after review of the database whether the abnormality was known to the treating EP, it was counted as a false negative. Investigators then reviewed the remaining discrepancies and further categorized them into bone (including soft tissue), abdomen (AXR), and chest (CXR). Multiple abnormalities existed for some discrepancies, and each abnormality was individually counted. Of note, some discrepancies were classified into both false-negative and false-positive abnormalities if the EP overread one finding but missed another on the same study. When the RAD was not certain about a false-negative abnormality, the abnormality was called a questionable false negative. For the purposes of this study, these abnormalities were analyzed as true false negatives, as they potentially represented pathology missed by EPs. The total number of radiographs ordered during the study period was extrapolated using available billing data from 2001 through 2005. See Table 1 for case definitions.

The data were extracted from Microsoft Access and processed in Microsoft Excel (Microsoft). Using the number of abnormalities compared with the total number of radiographs ordered, the overall accuracy of EP radiographic interpretation was ascertained.
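Although the original analysis was performed in Access and Excel, the computation itself is simple to express. The following is a minimal illustrative sketch only, assuming the counts reported in the Results section; the variable names are hypothetical stand-ins, not the actual database fields.

# Minimal sketch (not the authors' actual workflow): recompute the headline
# discrepancy rate from the counts reported in the Results section.
TOTAL_RADIOGRAPHS = 151_693  # extrapolated from 2001-2005 billing data

# False-negative abnormality counts by category (see Tables 2, 4, and 5)
false_negatives = {"bone": 2050, "axr": 240, "cxr": 2659}  # sums to 4949
false_positives = 359  # abnormalities overcalled by the EP

total_abnormalities = sum(false_negatives.values()) + false_positives  # 5308

# Overall discrepancy rate relative to all radiographs ordered
overall_rate = total_abnormalities / TOTAL_RADIOGRAPHS
print(f"{total_abnormalities} abnormalities / {TOTAL_RADIOGRAPHS} "
      f"radiographs = {overall_rate:.1%}")  # prints 3.5%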

Results

The investigators reviewed all 5660 discrepancies in the database. Of these, 449 discrepancies were cross-sectional imaging studies ineligible for the study, 2 were test entries, and 23 were blank, containing no data fields, leaving 5186 discrepancies. Upon review, 18 discrepancies were found to each represent 2 separate radiograph discordances, yielding 5204 total discrepancies. A total of 599 discrepancies were removed after review because the record clearly indicated that the treating EP was aware of the finding reported as discrepant. Investigators then reviewed the remaining 4605 discrepancies and further categorized them into bone, AXR, and CXR. Because some discrepancies contained multiple abnormalities, a total of 5308 abnormalities were reported.

The total number of radiographs ordered during the study period, extrapolated using available billing data from 2001 through 2005, was 151 693. Of these, 82 557 were bone radiographs, 5987 were AXRs, and 63 149 were CXRs.
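As an arithmetic cross-check, these denominators reproduce the per-category false-negative rates quoted in the abstract:

$$\frac{1954}{82\,557}\approx 2.4\%\;(\text{bone}),\qquad \frac{220}{5\,987}\approx 3.7\%\;(\text{AXR}),\qquad \frac{2431}{63\,149}\approx 3.8\%\;(\text{CXR})$$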

Within each category, major groupings served to organize the data. The bone category was grouped into dislocations, fractures, old fractures, retained foreign bodies, inadequate films, soft tissue or bony lesions other than fractures, and other pathology. The bone-other grouping included above-the-diaphragm pathology such as pulmonary or vascular disease, as well as sinus disease and the presence of hardware. Abdomen discrepancies were placed into bowel obstruction, free air, calcification, mass, chest, bony lesion, foreign body, and other pathology groupings. The AXR-other grouping included abnormal bowel gas pattern, pancreatitis, organomegaly, dilated bladder, ascites, and hernia, as well as questionable pneumobilia, hernia, and abscess. Chest discrepancies were placed into pulmonary (atelectasis/air-space disease [ASD]), structural (hilum/mass/nodule/pleural problems), vascular, bone, foreign body, abdominal, soft tissue, and other groupings. The CXR-other grouping included small heart size, poor inspiratory effort, incomplete studies, elevated diaphragm, and others.

Fig. 1 illustrates the breakdown of the abnormalities by major radiographic category. False positives represented 6.8% of all abnormalities (359/5308), of which 4 abnormalities were qualified by RAD as questionable false positives. The remaining abnormalities (93.2%) were false negatives, of which 17.4% (859/4949) were qualified by RAD as questionable false negatives.

Fig. 1 Study method.

Table 2 illustrates the results of the bone radiograph abnormalities. On bone radiographs, a total of 878 fractures and 263 questionable fractures were missed by EPs (see Table 3 for the types of fractures missed across all x-ray categories, including CXR, AXR, and bone). Twenty-four complete joint dislocations were missed on bone x-rays, most commonly of the hand, shoulder, finger, and wrist. There were an additional 78 questionable dislocations, including subluxations and/or questionable subluxations. Forty-six of these 78 represented cervical spine discordances; of these 46, half (23) were related to C1/C2 asymmetry. Fifty radiographs were judged to be inadequate due to imaging technique; of these, 36 involved the cervical spine. Foreign bodies on bone x-rays were missed in 27 studies. Soft tissue lesions included 180 cases of soft tissue swelling, 249 cases of bony lesions (avulsions, DJD, osteomyelitis, etc), 8 calcifications, and 116 cases of missed joint effusions.

Table 2 Bone radiograph discrepancy summary: false negatives (n = 2050)

Grouping              False negatives   Questionable false negatives
Acute fractures       878               263
Soft tissue lesions   553               64
Old fractures         49                0
Dislocations          24                78
Foreign bodies        27                14
Other                 36                14
Incomplete studies    50                0
Total                 1617              433

Table 3 Missed fractures (definite) across all radiograph categories

Fracture type                      n
Ankle                              7
Calcaneus                          42
Cervical spine                     11
Clavicle                           26
Facial bones                       12
Femur                              26
Fibula                             68
Finger                             176
Foot                               52
Glenoid                            6
Hand                               35
Humerus                            40
Lumbar spine                       43
Nasal                              9
Patella                            14
Pelvis                             17
Radius                             83
Rib                                108
Sacrum                             4
Scaphoid                           1
Scapula                            6
Shoulder                           1
Skeleton not otherwise specified   3
Spine not otherwise specified      2
Sternum                            3
Thoracic spine                     66
Tibia                              57
Toe                                105
Ulna                               30
Total                              1053

Table 4 illustrates the results of the AXR abnormalities. On AXR, 13 small bowel obstructions (SBOs) were missed, with an additional 43 qualified as possible SBOs. There were an additional 14 air/fluid levels, 11 partial SBOs, and 2 large bowel obstructions. There were 45 missed calcifications and 26 missed abdominal masses, including 3 missed questionable abdominal aortic aneurysms. Three abdominal series showed extraluminal contrast. Finally, there were 2 cases of missed free air.

Table 4 AXR discrepancy summary: false negatives (n = 240)

Grouping                False negatives   Questionable false negatives
Calcification           45                0
Bowel obstruction       40                43
Intra-abdominal-other   30                5
Mass                    26                0
Chest                   25                1
Bone                    16                2
Foreign body            5                 0
Free air                2                 0
Total                   189               51

Table 5 illustrates the results of the CXR abnormalities. On CXR, ASD was missed in 765 cases, with an additional 90 questionable false negatives. The most commonly missed pulmonary pathology involved both lungs (433 cases, 35%), followed by the left lower lobe (236 cases). A total of 23 pneumothoraces (PTXs) were missed, along with 9 questionable PTXs. Two pneumomediastinums were also missed, as were 8 cases of free air under the diaphragm and 9 cases of questionable free air under the diaphragm. Eighty-three aortic lesions and 16 questionable aortic lesions were missed. One hundred twenty-eight cases of pulmonary edema and 120 cases of pleural effusion were missed by EPs. Three hundred eight pulmonary nodules or masses were missed, and 148 were questionable misses. Table 6 illustrates the abnormalities missed by EPs (false negatives) for all radiographs across the 3 categories.

Table 5 CXR discrepancy summary: false negatives (n = 2659)

Grouping       False negatives   Questionable false negatives
Pulmonary      1098              130
Structural     588               175
Vascular       187               23
Bone           272               23
Abdominal      51                21
Foreign body   43                1
Soft tissue    15                2
Other          30                0
Total          2284              375

Table 6 Discrepancy summary across all categories (bone, CXR, and AXR): false negatives (n = 4949) a

Grouping                False negatives   Questionable false negatives
Bowel obstruction       47                47
Calcification           67                2
Dislocations            26                80
Foreign body            75                15
Fractures               1053              280
Free air                10                10
Incomplete studies      50                0
Intra-abdominal-other   41                8
Abdominal mass          44                0
Old fractures           82                0
Other                   25                0
Pulmonary               1134              138
Soft tissue lesions     637               76
Structural              599               175
Vascular                200               28
Total                   4090              859

a Values differ from the prior 3 tables because some abnormalities were picked up on a nondedicated film, that is, a bowel obstruction detected on chest x-ray.

The false positives totaled 359 abnormalities, of which 187 were related to bone, 6 to AXR, and 166 to CXR. Of the 187 false-positive bony abnormalities, there were 126 fractures, 27 cases of soft tissue swelling, and 1 dislocation. In addition, 29 questionable fractures and 4 questionable foreign body abnormalities were reported as false positives. The bony false-positive abnormalities most frequently involved the feet (34), wrists (29), hands (16), ankles (13), and fingers (12). On AXR, all 6 false-positive abnormalities related to SBO. On CXR, the 166 false-positive abnormalities included 96 ASDs, 11 congestive heart failures, 9 masses/nodules, 6 fractures, 4 cases of cardiomegaly, 4 hilar lesions, 3 effusions, 3 PTXs, and 1 each of aortic lesion, pneumomediastinum, bony lesion, hardware, free air, abdominal finding-other, and SBO. In addition, 12 questionable ASDs, 6 questionable masses/nodules, and 1 questionable finding each of congestive heart failure, effusion, fracture, and free air were also reported. Twenty-six discrepancies with a false-positive abnormality also included a separate false-negative abnormality.

Of the 4605 discrepancies included in this study, 889 had no documented interpretation by the EP, thus triggering an automatic discrepancy. These discrepancies were included if the database review did not clearly indicate from the documentation that the treating EP was aware of the finding. All remaining discrepancies had an EP interpretation. Of the 4605 discrepancies, 1268 patients were admitted, 2444 were discharged, 892 had no disposition noted, and 2 patients expired in the ED.

Follow-up actions taken by the ED were noted for 3515 of the 4605 discrepancies. Of these, 1349 cases were judged, upon review of the chart by the quality assurance physician at the time of discovery of the discrepancy, to require no additional follow-up or treatment. For the discrepancies judged to be significant, 775 patients, 542 admitting physicians, and 292 primary care physicians were contacted and informed of the discrepancy. One hundred nine additional patients were called back to return to the ED for additional treatment, 5 of whom were told to return immediately by emergency medical services (EMS). Ten patients returned on their own before being contacted. Certified letters were sent to 260 patients to inform them of the discrepancy, and 178 patients (5% of those for whom there is a record of the ED attempting contact) had no contact information on record. Table 7 illustrates the radiographs and discrepancies in these more urgent cases.

Table 7 Discrepancies in 109 discharged patients called back to the ED or told to call EMS for follow-up (representing 113 discrepancies, with multiple discrepancies in 2 patients; counts pooled across CXR, AXR, and bone films)

Finding                                  n
Questionable abdominal aortic aneurysm   2
Questionable ASD                         4
Questionable bowel obstruction           2
Questionable dislocation                 6
Questionable foreign body                2
Questionable fracture                    14
Questionable mass/nodule                 1
Questionable pericardial effusion        1
Questionable pleural effusion            1
Questionable pneumomediastinum           2
Questionable PTX                         1
Questionable wide mediastinum            1
Aorta                                    4
ASD                                      8
Bowel obstruction                        3
Bone lesion, nonfracture                 10
Calcification                            1
Joint effusion                           1
Foreign body                             1
Fracture                                 29
Mass/nodule                              4
Negative a                               1
Pleural effusion                         1
Pneumobilia                              1
Pneumomediastinum                        1
PTX                                      4
Soft tissue swelling                     5
Wide mediastinum                         2

a This patient was called back to the ED after a chart review raised the question of a pulmonary embolism, the initial ED diagnosis of pneumonia having been questioned by radiology.

Discussion

This retrospective study of an ED radiology quality assurance database represents the largest series of radiographic discrepancies reported from an ED to date. It represents all levels of illness and injury and differs from some prior studies that evaluated only subacute patient populations with a lower pretest probability of serious disease [17,18], which may therefore have underestimated the true significance of ED false negatives. Multiple studies have evaluated the effect of differing levels of physician training on the outcome of the interpretation [19,20]. This study differs from those prior studies by specifically assessing only the radiographic interpretation of the ED attending. The overall rate of 5308 discrepancies for an estimated 151 693 radiographs (3.5%) is in line with prior studies [21]. When only the discrepancies characterized by RAD as definite are included, the rate drops to 2.9%. The rate of false positives in our study (359 of 5308 discrepancies) represented a relatively small proportion of overall discrepancies. The most common false positives involved the "overcalling" of fractures or of ASD that was ultimately felt not to be present on RAD review.
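As an arithmetic note (assuming, consistent with the counts above, that "definite" excludes both the 859 questionable false negatives and the 4 questionable false positives):

$$\frac{5308-859-4}{151\,693}=\frac{4445}{151\,693}\approx 2.9\%$$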

A larger proportion of films were considered false negatives, although a significant number of these were qualified by the RAD as "questionable." These 859 questionable false negatives represented radiographic uncertainties after RAD review requiring additional imaging studies or clinical correlation. The most common uncertainty was the possibility of a missed fracture, in 280 of 859 cases. How this translates to clinical care is difficult to ascertain from this data set because the EP uses the radiograph as a diagnostic adjunct in addition to the history and physical examination. Most EPs' practice takes into account the patient's clinical examination as well as the x-ray results, for example, immobilizing musculoskeletal injuries that are clinically severe and arranging orthopedic follow-up regardless of radiographic findings. Also, some of the findings called a false negative by the RAD were findings that were undoubtedly noticed by the treating physician but disregarded because they were so obviously not part of the clinical scenario. Emergency physicians have clinical information not available to the RAD, for instance, the 20-year-old bullet seen on an abdominal x-ray performed to rule out obstruction. These points are supported by the relatively high rate of patients not requiring any intervention in their management and follow-up (1359/3515) despite a radiographic discrepancy. Also, because questionable discrepancies represent a sizable part of the study, a question arises regarding the criterion standard of radiographic interpretation. For the purposes of this study, the radiology attending was assumed to represent the criterion standard, though multiple studies [22-26] have demonstrated large variations in interpretation between RADs. Also, in one study of the outcomes of 175 discrepancies, the ED interpretation proved to be the correct one in 39 cases [27]. In addition, in our study some unknown portion of the 889 discrepancies lacking a documented EP interpretation may have been generated by incomplete documentation rather than true misinterpretation, illustrating the need for good documentation both to record the medical care provided and to decrease medical liability [28-30].

Missed fractures have represented the largest proportion of errors in EP radiograph interpretation in prior literature [31], and a previous analysis showed a relatively large number of missed calcaneal fractures [32]. Digits have also been shown in other studies [11] to remain a frequent site of missed fracture diagnosis. In our study, fractures of the fingers, ribs, and toes were those most frequently missed by EPs.

From an EM perspective, any discrepancy across all categories that changed the management and/or disposition of a patient is a significant one. It is worth noting that in 8 studies, potentially emergent findings were noted on nondedicated studies (eg, a large aortic aneurysm seen on a pelvis x-ray). Overall, in this study, missed PTXs (24), aneurysms (14: 10 thoracic and 4 abdominal), wide mediastinum (35), pneumomediastinum (2), and free air (10) represent potentially emergent disease presentations (n = 85) that were missed by the interpreting EP attending. There were an additional 1979 urgent radiographic discrepancies, including dislocations (26), ASD (778), fractures (1053), bowel obstruction (47), and/or foreign bodies (75, including 5 bullets), which may represent clinically significant discrepancies. These emergent and urgent radiographic disease presentations represent a total of 2064 (39%) of 5308 discrepancies, or a 1.4% false-negative rate relative to the total number of radiographs ordered in the study period [33]. It is unclear how many of these discrepancies would have led to changes in the clinical management of the affected patients, though prior studies [34] have examined this issue and found a rather small number of clinical changes instituted [11,35,36]. The 85 discrepancies potentially requiring an emergent change in medical management represent only a small fraction of an estimated 151 693 x-rays during our 9-year study period: 1 radiograph in 1785, or 0.056%, potentially needed an emergent change in management based solely on a radiographic discrepancy.
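The closing figure follows directly:

$$\frac{85}{151\,693}\approx 0.00056 = 0.056\% \approx \frac{1}{1785}$$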

A limitation of this study is that the data set was at times incomplete. Also, no chart reviews were undertaken at the time of this study; thus, the only information regarding patient outcomes was the clinical detail entered in the database from the chart review performed at the time of the discrepancy. The database available for study captured almost all radiographic discrepancies returned to the ED, with the following exception: because the RAD read these radiographs with a time delay relative to patient care, on rare occasions a discrepancy might have been called to the treating physician in the ED before patient disposition. Such a discrepancy would not have been entered in the database, so the true number of discrepancies would be even higher. Such real-time interpretation of ED radiographs at our facility is unusual, although anecdotally it is growing more frequent with increasing lengths of stay for patients awaiting inpatient beds. Also, a significant number of discrepancies in the database were included because incomplete documentation would not allow the investigators to conclusively determine whether or not the treating EP knew of the abnormality at the time of the ED visit. Unlike other studies [36-38], this study did not assess the EP's confidence level in the ED interpretation of each radiograph. Interestingly, the study period also spans a technology transition: in 2003, the hospital introduced a digital radiology system for processing and displaying the captured radiographs. The impact of this transition on this study was not evaluated, although it might have had an effect [39]. Lastly, our ED sees few pediatric patients, and our results cannot be extrapolated to this population.

Conclusions

Plain radiograph interpretation has long been an integral part of ED patient management. Because real-time radiology attending overreads are not universally available during daylight hours, and are rarely available during off hours, EPs are often called upon to determine clinical care based on their own interpretation of x-rays. Previous studies have shown wide variability in the rate of missed findings, and few specifically address EPs' interpretation of x-rays. This study represents the largest database review of radiographic discrepancies to date in the literature. Using the radiology overread as a criterion standard, EPs had definitely discrepant interpretations from radiology in 2.9% of cases. Emergency physicians were found to have infrequently missed clinically significant findings and to have rarely missed emergent findings. The most commonly missed findings included fractures, dislocations, ASD, and pulmonary nodules. Continuing education should focus on these areas to attempt to further reduce this error rate.

References

1. O'Leary MR, Smith M, Olmsted WW. Physician assessments of practice patterns in emergency department radiograph interpretation. Ann Emerg Med 1988;17(10):1019-23.
2. Torreggiani WC, Nicolaou S, Lyburn ID, Harris AC, et al. Emergency radiology in Canada: a national survey. Can Assoc Radiol J 2002;53(3):160-7.
3. James MR, Bracegirdle A, Yates DW. X-ray reporting in accident and emergency departments: an area for improvements in efficiency. Arch Emerg Med 1991;8(4):266-70.
4. Hunter TB, Krupinski EA, Hunt KR, Erly WK. Emergency department coverage by academic departments of radiology. Acad Radiol 2000;7(3):165-70.
5. Preston CA, Marr JJ, Amaraneni KK, Suthar BS. Reduction of 'callbacks' to the ED due to discrepancies in plain radiograph interpretation. Am J Emerg Med 1998;16(2):160-2.
6. Espinosa JA, Nolan TW. Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ 2000;320(7237):737-40.
7. De Lacey G, Barker A, Harper J, et al. An assessment of the clinical effects of reporting accident and emergency radiographs. Br J Radiol 1980;53(628):304-9.
8. Fleisher G, Ludwig S, McSorley M. Interpretation of pediatric x-ray films by emergency department pediatricians. Ann Emerg Med 1983;12(3):153-8.
9. McLain PL, Kirkwood CR. The quality of emergency room radiograph interpretations. J Fam Pract 1985;20(5):443-8.
10. Warren JS, Lara K, Connor PD, et al. Correlation of emergency department radiographs: results of a quality assurance review in an urban community hospital setting. J Am Board Fam Pract 1993;6(3):255-9.
11. Thomas HG, Mason AC, Smith RM, Fergusson CM. Value of radiograph audit in an accident service department. Injury 1992;23(1):47-50.
12. Nitowski LA, O'Connor RE, Reese CL. The rate of clinically significant plain radiograph misinterpretation by faculty in an emergency medicine residency program. Acad Emerg Med 1996;3(8):782-9.
13. Berman L, de Lacey G, Twomey E, et al. Reducing errors in the accident department: a simple method using radiographers. Br Med J 1985;290(6466):421-2.
14. Quick G, Podgorny G. An emergency department radiology audit procedure. JACEP 1977;6(6):247-50.
15. Klein EJ, Koenig M, Diekema DS, Winters W. Discordant radiograph interpretation between emergency physicians and radiologists in a pediatric emergency department. Pediatr Emerg Care 1999;15(4):245-8.
16. Masel JP, Grant PJ. Accuracy of radiological diagnosis in the casualty department of a children's hospital. Aust Paediatr J 1984;20(3):221-3.
17. Tachakra S, Mukherjee P, Smith C, Dutton D. Are accident and emergency consultants as accurate as consultant radiologists in interpreting plain skeletal radiographs taken at a minor injury unit? Eur J Emerg Med 2002;9(2):131-4.
18. Snow DA. Clinical significance of discrepancies in roentgenographic film interpretation in an acute walk-in area. J Gen Intern Med 1986;1(5):295-9.
19. Halvorsen JG, Kunian A, Gjerdingen D, Connolly J, et al. The interpretation of office radiographs by family physicians. J Fam Pract 1989;28(4):426-32.
20. Nolan TM, Oberklaid F, Boldt D. Radiological services in a hospital emergency department: an evaluation of service delivery and radiograph interpretation. Aust Paediatr J 1984;20(2):109-12.
21. O'Leary MR, Smith MS, O'Leary DS, Olmsted WW, et al. Application of clinical indicators in the emergency department. JAMA 1989;262(24):3444-7.
22. Robinson PJ, Wilson D, Coral A, et al. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol 1999;72(856):323-30.
23. Siegle RL, Baram EM, Reuter SR, et al. Rates of disagreement in imaging interpretation in a group of community hospitals. Acad Radiol 1998;5(3):148-54.
24. Berlin L. Does the 'missed' radiographic diagnosis constitute malpractice? Radiology 1977;123(2):523-7.
25. Herman PG, Hessel SJ. Accuracy and its relationship to experience in the interpretation of chest radiographs. Invest Radiol 1975;10(1):62-7.
26. Robinson PJ, Culpan G, Wiggins M. Interpretation of selected accident and emergency radiographic examinations by radiographers: a review of 11 000 cases. Br J Radiol 1999;72(858):546-51.
27. Benger JR, Lyburn ID. What is the effect of reporting all emergency department radiographs? Emerg Med J 2003;20(1):40-3.
28. Guly HR. Missed diagnoses in an accident and emergency department. Injury 1984;15(6):403-6.
29. Gwynne A, Barber P, Tavener F. A review of 105 negligence claims against accident and emergency departments. J Accid Emerg Med 1997;14(4):243-5.
30. George JE, Espinosa JA, Quattrone MS. Legal issues in emergency radiology. Practical strategies to reduce risk. Emerg Med Clin North Am 1992;10(1):179-203.
31. Guly HR. Diagnostic errors in an accident and emergency department. Emerg Med J 2001;18(4):263-9.
32. Freed HA, Shields NN. Most frequently overlooked radiographically apparent fractures in a teaching hospital emergency department. Ann Emerg Med 1984;13(10):900-4.
33. Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med 1995;13(3):262-4.
34. Gatt ME, Spectre G, Paltiel O, et al. Chest radiographs in the emergency department: is the radiologist really necessary? Postgrad Med J 2003;79(930):214-7.
35. Williams SM, Connelly DJ, Wadsworth S, Wilson DJ. Radiological review of accident and emergency radiographs: a 1-year audit. Clin Radiol 2000;55(11):861-5.
36. Mayhue FE, Rust DD, Aldag JC, et al. Accuracy of interpretations of emergency department radiographs: effect of confidence levels. Ann Emerg Med 1989;18(8):826-30.
37. Smith PD, Temte J, Beasley JW, Mundt M. Radiographs in the office: is a second reading always needed? J Am Board Fam Pract 2004;17(4):256-63.
38. Lufkin KC, Smith SW, Matticks CA, Brunette DD. Radiologists' review of radiographs interpreted confidently by emergency physicians infrequently leads to changes in patient management. Ann Emerg Med 1998;31(2):202-7.
39. Scott WW, Bluemke DA, Mysko WK, Weller GE, et al. Interpretation of emergency department radiographs by radiologists and emergency medicine physicians: teleradiology workstation versus radiograph readings. Radiology 1995;195(1):223-9.
