Abstract
BACKGROUND AND PURPOSE: Atherosclerotic disease of the carotid arteries is a major cause of ischemic stroke. Traditionally, the degree of stenosis has been regarded as the primary parameter for predicting stroke risk, but emerging evidence highlights the importance of carotid plaque composition and morphology. The Carotid Plaque Reporting and Data System (Carotid Plaque-RADS) was recently introduced to standardize carotid plaque assessment beyond the degree of stenosis, but its reliability in routine radiologic practice has yet to be established. This study assesses the interreader agreement, intrareader agreement, and learning curve associated with Carotid Plaque-RADS.
MATERIALS AND METHODS: In this retrospective study, 500 subjects who underwent CTA for suspected carotid atherosclerosis were assessed. Three readers with varying levels of experience in vascular imaging independently evaluated all CTAs in 5 blocks by using Carotid Plaque-RADS. To assess the impact of reader experience and potential improvement over time, interreader agreement between the 3 pairs of readers was calculated for each block by using Cohen κ, allowing agreement to be compared sequentially across blocks. Intrareader agreement was calculated on a random block of 100 patients (192 carotid arteries).
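As an illustration of the per-block pairwise agreement analysis described above, the following minimal Python sketch computes Cohen κ for each reader pair within each block; the data layout, reader names, and toy ratings are assumptions for demonstration only, not the study's actual code or data.

```python
# Illustrative sketch (not the study's code): pairwise Cohen kappa per block,
# assuming each reader's Plaque-RADS categories are recorded per carotid artery.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

# Hypothetical layout: ratings[reader][block] -> list of Plaque-RADS categories
# for the carotid arteries in that block (toy values for demonstration).
ratings = {
    "expert":       {1: ["1", "2", "4"], 2: ["3", "2", "1"]},
    "intermediate": {1: ["1", "2", "3"], 2: ["3", "2", "1"]},
    "beginner":     {1: ["1", "3", "3"], 2: ["2", "2", "1"]},
}

def pairwise_kappa_per_block(ratings):
    """Return {(reader_a, reader_b): {block: kappa}} for every reader pair."""
    results = {}
    for reader_a, reader_b in combinations(ratings, 2):
        per_block = {}
        for block in ratings[reader_a]:
            per_block[block] = cohen_kappa_score(
                ratings[reader_a][block], ratings[reader_b][block]
            )
        results[(reader_a, reader_b)] = per_block
    return results

if __name__ == "__main__":
    for pair, kappas in pairwise_kappa_per_block(ratings).items():
        print(pair, {block: round(k, 2) for block, k in kappas.items()})
```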
RESULTS: After exclusion of low-quality examinations, 490 patients (980 carotid arteries) were selected for analysis; 46 carotid arteries were excluded because of previous revascularization procedures, leaving 934 carotid arteries for assessment. Agreement was substantial between the expert and intermediate readers (κ = 0.78 to κ = 0.88) and moderate between the intermediate and beginner readers (κ = 0.50 to κ = 0.74), while agreement between the expert and beginner readers improved across blocks from substantial (κ = 0.68) to almost perfect (κ = 0.86), indicating a learning curve. Interreader percent agreement was highest for Plaque-RADS categories 1, 2, and 3 and lowest for category 4. Intrareader agreement was substantial for the beginner reader (κ = 0.77) and almost perfect for both the intermediate and expert readers (κ = 0.88).
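The qualitative descriptors attached to κ values here (moderate, substantial, almost perfect) commonly follow the Landis and Koch scale; the helper below sketches that conventional mapping for illustration and is an assumption, not a procedure described in the study.

```python
# Illustrative helper (an assumption, not from the study): map a kappa value to
# the descriptive label of the Landis and Koch agreement scale.
def kappa_label(kappa: float) -> str:
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(kappa_label(0.68), kappa_label(0.86))  # substantial, almost perfect
```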
CONCLUSIONS: When Carotid Plaque-RADS is applied to CTA, interreader agreement is substantial to almost perfect between experienced and intermediate readers, with a notable learning curve for beginners. Intrareader agreement is almost perfect for experienced and intermediate readers, indicating consistent grading and reproducible data with Carotid Plaque-RADS.
ABBREVIATIONS:
- HU = Hounsfield unit
- IPH = intraplaque hemorrhage
- MWT = maximum wall thickness
- RADS = Reporting and Data System

© 2025 by American Journal of Neuroradiology