RADPEER quality assurance program: a multifacility study of interpretive disagreement rates

J Am Coll Radiol. 2004 Jan;1(1):59-65. doi: 10.1016/S1546-1440(03)00002-4.

Abstract

Purpose: To develop and test a radiology peer review system that adds minimally to workload, is confidential and uniform across practices, and provides information useful for meeting the forthcoming American Board of Medical Specialties mandate for "evaluation of performance in practice," one of the four elements of maintenance of certification.

Method: Under RADPEER, radiologists who review previous images as part of a new interpretation record their ratings of the previous interpretations on a 4-point scale. Ratings of 3 and 4 (disagreements in nondifficult cases) are then examined by a peer review committee in each practice, which judges whether they represent misinterpretations by the original radiologists. Final ratings are sent for central data entry and analysis. A pilot test of RADPEER was conducted in 2002.
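The rating scale and the committee-review trigger are simple enough to express in code. The following is a minimal sketch in Python of the workflow as described above; the record structure and field names are hypothetical and are not part of any RADPEER specification.

```python
from dataclasses import dataclass

# RADPEER 4-point scale as described in the Method:
#   1 = agreement with the previous interpretation
#   2 = disagreement in a difficult case
#   3, 4 = disagreement in a nondifficult case (sent to committee)
COMMITTEE_REVIEW_RATINGS = {3, 4}

@dataclass
class PeerReview:
    case_id: str           # hypothetical identifier
    modality: str          # e.g., "CT", "MR"
    original_reader: str
    reviewer: str
    rating: int            # 1-4

def needs_committee_review(review: PeerReview) -> bool:
    """Ratings of 3 and 4 are forwarded to the practice's peer
    review committee, which judges whether the original reading
    was a misinterpretation before central data entry."""
    return review.rating in COMMITTEE_REVIEW_RATINGS
```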

Results: Fourteen facilities participated in the pilot test, submitting a total of 20,286 cases. Disagreements in difficult cases (ratings of 2) averaged 2.9% of all cases; committee-validated misinterpretations in nondifficult cases averaged 0.8%. Rates differed considerably by modality and substantially across facilities, and few of the facility-level differences were explained by modality mix, facility size or type, or participation early versus late in the pilot test. Of 31 radiologists who interpreted more than 200 cases, 2 had misinterpretation rates significantly (P < .05) above what would be expected given their individual mix of modalities and the average misinterpretation rate for each modality in their practice.
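The outlier test in the last sentence can be illustrated with a short calculation. The abstract does not specify the exact statistical method, so the sketch below makes a simplifying assumption: each radiologist's expected misinterpretation rate is the case-mix-weighted average of the practice's per-modality rates, and the observed count is compared against a binomial distribution with that pooled rate (one-sided). All numbers in the usage example are hypothetical.

```python
from scipy.stats import binom

def outlier_p_value(cases_by_modality: dict[str, int],
                    errors_observed: int,
                    practice_rate_by_modality: dict[str, float]) -> float:
    """One-sided P value: is this radiologist's misinterpretation
    count significantly above what the modality mix predicts?"""
    n = sum(cases_by_modality.values())
    # Expected rate = case-mix-weighted average of practice rates.
    expected_rate = sum(
        count * practice_rate_by_modality[m]
        for m, count in cases_by_modality.items()
    ) / n
    # P(X >= errors_observed) under Binomial(n, expected_rate).
    return binom.sf(errors_observed - 1, n, expected_rate)

# Hypothetical radiologist: 250 cases, 6 validated misinterpretations.
p = outlier_p_value(
    cases_by_modality={"CT": 150, "MR": 100},
    errors_observed=6,
    practice_rate_by_modality={"CT": 0.008, "MR": 0.010},
)
print(f"P = {p:.3f}")  # flagged as an outlier if P < .05
```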

Conclusions: A substantial number of facilities participated in the pilot test, and all maintained their participation throughout the year. The data generated are useful for peer review of individual radiologists and for showing differences by modality. RADPEER is now operational and meets the need for a peer review system with the characteristics described above.

Publication types

  • Multicenter Study

MeSH terms

  • Certification
  • Clinical Competence
  • Diagnostic Errors / statistics & numerical data*
  • Humans
  • Peer Review, Health Care*
  • Pilot Projects
  • Program Development
  • Program Evaluation
  • Quality Assurance, Health Care / organization & administration*
  • Radiology / education
  • Radiology / standards*
  • Radiology Department, Hospital / standards
  • Societies, Medical
  • Specialty Boards