PACS Integration of Semiautomated Imaging Software Improves Day-to-Day MS Disease Activity Detection
====================================================================================================

* A. Dahan
* R. Pereira
* C.B. Malpas
* T. Kalincik
* F. Gaillard

## Abstract

**BACKGROUND AND PURPOSE:** The standard for evaluating interval radiologic activity in MS, side-by-side MR imaging comparison, is restricted by its time-consuming nature and limited sensitivity. VisTarsier, a semiautomated software for comparing volumetric FLAIR sequences, has shown better disease-activity detection than conventional comparison in retrospective studies. Our objective was to determine whether implementing this software in day-to-day practice would show similar efficacy.

**MATERIALS AND METHODS:** VisTarsier created an additional coregistered image series for reporting, a color-coded disease-activity change map, for every new MS MR imaging brain study that contained volumetric FLAIR sequences. All other MS studies, including those generated during software-maintenance periods, were interpreted with side-by-side comparison only. The number of new lesions reported with software assistance was compared with the number observed with traditional assessment in a generalized linear mixed model. Questionnaires were sent to participating radiologists to evaluate the perceived day-to-day impact of the software.

**RESULTS:** Nine hundred six study pairs from 538 patients during 2 years were included. The semiautomated software was used in 841 study pairs, while the remaining 65 used conventional comparison only. Twenty percent of software-aided studies reported new lesions versus 9% with standard comparison only. The use of this software was associated with an odds ratio of 4.15 for detection of new or enlarging lesions (*P* = .040), and 86.9% of survey respondents found that the software saved at least 2–5 minutes per scan report.

**CONCLUSIONS:** VisTarsier can be implemented in real-world clinical settings with good acceptance and preservation of the accuracy demonstrated in a retrospective environment.

## ABBREVIATIONS:

* AIC: Akaike information criterion
* CSSC: conventional side-by-side comparison
* EDSS: Expanded Disability Status Scale
* VT: VisTarsier

Multiple sclerosis is a common immune-mediated inflammatory disease of the central nervous system and the most frequent neurologic cause of disability in young adults.1,2 With the ongoing development and approval of disease-modifying drugs, the armamentarium of therapies to reduce relapse frequency, radiologic disease activity, and progression continues to grow. With these therapies, no evidence of disease activity has become a new treatment target, making disease monitoring more important than ever.3,4 MR imaging is the most commonly used surrogate marker of MS activity.5,6 Radiologists typically evaluate MR imaging studies for the development of new MS lesions by comparing the current study with a prior study in adjacent viewports on a monitor, usually in multiple planes, which we will refer to as conventional side-by-side comparison (CSSC).
The sensitivity of such a comparison is degraded by multiple human and technologic factors, including the quality of MR imaging protocols and the expertise of radiologists evaluating the examinations.7–9 Although visual inspection of MRIs is routinely accepted in phase II and III trials, its demanding nature and relative inaccuracy compared with novel methods, including computer-assisted lesion detection, pose an important limitation to its utility in clinical practice.10,11 Indeed, computer-assisted lesion-detection software has shown promise by increasing the specificity and sensitivity of MS disease-activity monitoring.8,12,13 One such software, VisTarsier (VT; open source, available at github.com/mh-cad/vistarsier), has been validated in a series of retrospective studies, allowing radiologists, regardless of training level, to detect up to 3 times as many new MS lesions on monitoring scans compared with CSSC.8,9,14 These validation studies, however, were performed on a dedicated research workstation with axial, coronal, sagittal, and semitransparent 3D "overview" images, rather than on a conventional PACS workstation during normal clinical practice. In this prospective, observational cohort study, we sought to share our experience implementing this assistive software in the Royal Melbourne Hospital PACS and to demonstrate that, once implemented, it would augment radiologists' capacity to detect MS disease activity compared with CSSC.

## Materials and Methods

### Software Integration into PACS

Every new MR imaging brain demyelination protocol study generated using 3T magnets (Tim Trio, 12-channel head coil; Siemens, Erlangen, Germany) for a patient with a previous study obtained with the same MR imaging protocol was automatically processed by the software. The automated process (Fig 1) is triggered as soon as a study is verified in our radiology information system (Karisma; Kestral, Perth, Australia) by the radiographer, with the radiology information system automatically sending a completion HL7 message (NextGen Connect; NextGen Healthcare, Irvine, California) to the software virtual machine (Xeon Processor E5645, 8 vCPU cores at 2.40 GHz, 8 GB DDR3 RAM, 500 GB SATA3 7200-rpm hard disk drive, no 3D/GPU acceleration [Intel, Santa Clara, California]; Windows 7 Professional 64-bit operating system [Microsoft, Redmond, Washington]). The software then queries the PACS and searches the study for a series that is deemed compatible on the basis of a list of possible series descriptors (eg, FLAIR sagittal 3D). If a compatible series exists in the new study, the software then queries the PACS for previous MR imaging studies of the same patient. Once a compatible series is found in the most recent previous MR imaging study, the 2 series are retrieved and processed. Software processing includes brain-surface extraction and masking of the volumetric FLAIR sequences, followed by intensity normalization, 6-*df* registration, automated change detection, and reslicing to generate 3 new coregistered series: 1) a resliced prior-study sagittal FLAIR (∼160 images, preserving original resolution, one 16-bit grayscale channel); 2) an increased signal intensity color map (∼160 images, 256 × 256, three 8-bit RGB channels); and 3) a decreased signal intensity color map (∼160 images, 256 × 256, three 8-bit RGB channels). Once processing is complete, the virtual machine sends the 3 series (typical total size ∼150 megabytes) back to the new study as additional series.
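As a rough illustration only, the following Python sketch outlines the kind of rigid coregistration and change-map generation described above, here using SimpleITK. The function names, parameter values, and thresholds are our own illustrative assumptions; the sketch omits brain-surface extraction and does not reproduce VisTarsier's actual open-source implementation.

```python
# Illustrative sketch only (not VisTarsier's implementation): rigid 6-df
# coregistration of a prior volumetric FLAIR to the current study and a
# simple signal-change map, using SimpleITK. Paths, parameter values, and
# thresholds are hypothetical; brain-surface extraction is omitted for brevity.
import SimpleITK as sitk


def register_prior_to_current(current: sitk.Image, prior: sitk.Image) -> sitk.Transform:
    """Estimate a rigid (6 degrees of freedom) transform mapping prior -> current."""
    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsRegularStepGradientDescent(
        learningRate=1.0, minStep=1e-4, numberOfIterations=200)
    initial = sitk.CenteredTransformInitializer(
        current, prior, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    reg.SetInitialTransform(initial, inPlace=False)
    return reg.Execute(current, prior)


def change_maps(current_path: str, prior_path: str, threshold: float = 0.2):
    """Return the resliced prior plus crude increased/decreased-signal masks."""
    current = sitk.Cast(sitk.ReadImage(current_path), sitk.sitkFloat32)
    prior = sitk.Cast(sitk.ReadImage(prior_path), sitk.sitkFloat32)

    # Intensity normalization: match the prior study's histogram to the current one.
    prior = sitk.HistogramMatching(prior, current)

    # Rigid 6-df registration, then reslice the prior into the current study's space.
    tx = register_prior_to_current(current, prior)
    prior_resliced = sitk.Resample(prior, current, tx, sitk.sitkLinear, 0.0,
                                   sitk.sitkFloat32)

    # Crude change detection: voxels whose normalized FLAIR signal increased
    # beyond a threshold are flagged as candidate new/enlarging lesions.
    diff = sitk.Subtract(current, prior_resliced)
    increased = sitk.BinaryThreshold(diff, lowerThreshold=threshold,
                                     upperThreshold=1e9, insideValue=1, outsideValue=0)
    decreased = sitk.BinaryThreshold(diff, lowerThreshold=-1e9,
                                     upperThreshold=-threshold,
                                     insideValue=1, outsideValue=0)

    # The resliced prior and the two candidate-change masks would then be
    # color-mapped and sent back to the PACS as additional series.
    return prior_resliced, increased, decreased
```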
These series are then available as part of the normal clinical study for staff radiologists to report in real time in the usual PACS environment (see the On-line Figure for an example of the output series generated by VisTarsier).

Fig 1. Software integration into PACS workflow. This flow diagram outlines how the new MR imaging studies for patients with MS are processed by the VisTarsier software in a virtual machine once they are signed off in the radiology information system (RIS) by the radiographer. Successful processing requires all systems to be operational and compatible sequences to be available. Most important, these change maps do not replace routine sequences and reformats but are in addition to routine imaging. They merely draw the attention of reporting radiologists to areas that may represent new or enlarging lesions (orange). These areas are then assessed normally on routine imaging, and a determination is made as to whether they represent disease activity.

### Participants and Data Collection

In July 2015, the software underwent a soft launch within our tertiary hospital's PACS (ethics approval number QA2015161). Eligibility criteria included the following: consecutive studies in patients with a confirmed diagnosis of multiple sclerosis (as per the revised McDonald criteria) and an MR imaging including a volumetric FLAIR sequence (FOV = 250 mm, 160 sections, section thickness = 0.98 mm, matrix = 258 × 258, in-plane resolution = 0.97 mm, TR = 5000 ms, TE = 350 ms, TI = 1800 ms, 72° selective inversion recovery magnetization preparation).15 For all studies not meeting the automated criteria for software assistance, only CSSC was used by staff radiologists to report MS disease progression.

At our hospital, the software runs as a virtual machine on a server that hosts several other research and nonessential clinical services. Thus, upgrades, power outages, and hospital network reconfigurations lead to a small amount of downtime. When studies were performed during these times, or when other software-based failures illustrated in Fig 1 occurred, VT-assisted series were not automatically generated, and only CSSC was used by reporting radiologists. Unfortunately, a detailed breakdown of the various causes of nonprocessing could not be collated prospectively and cannot be established retrospectively.

We collected imaging reports for all studies performed with the above protocol prospectively from July 1, 2015, to June 30, 2017. All imaging reports for studies meeting the inclusion and exclusion criteria were assessed for written evidence of interval radiologic disease activity. Disease activity was defined as the presence of new or enlarging lesions as stated in the report body and/or conclusion available to the referring clinician. Demographic and clinical details for each patient were included in the study. After study completion, a brief survey was sent to assess the real-world impact of the software on the day-to-day lives of reporting radiologists and trainees. The results of this survey will be summarized without statistical analysis.

### Statistical Analysis

Assessed demographic and clerical variables included the following: the presence of VT-generated series, age at scanning, sex, and reporting radiologist's training level.
Assessed clinical variables included disease-modifying drug use, Expanded Disability Status Scale (EDSS), time from diagnosis to the date of the scan, and annualized rate of MR imaging scans (ie, the number of MR imaging scans per year). Because available MS subtype data were incomplete, EDSS, time since diagnosis, and annualized scan rates were used as surrogate markers of disease activity and trajectory. The distributions of the variables were compared between the groups using *t* tests and χ2 tests. Generalized linear mixed models were computed to assess the difference in rates of disease progression with the software compared with CSSC. For the primary analysis, interval radiologic activity was entered as the dependent variable. All other assessed variables were entered as independent variables. Continuous variables were centered and scaled. A random intercept term for each participant was specified to allow multiple observations per person. Parameter estimation was performed using maximum likelihood. Because the dependent variable was binary, a binomial response family was used with a logit-link function. We also performed an additional sensitivity analysis with stepwise forward variable selection for the multivariable generalized linear mixed model. An estimated odds ratio was computed for each variable. A 2-sided critical *P* value of .05 was used to assess statistical significance. Confidence intervals at the 95% level are presented when relevant. Data were analyzed with R statistical and computing software ([http://www.r-project.org](http://www.r-project.org)).16

## Results

During the 2-year study period, 906 study pairs for 538 patients met the inclusion criteria. VT was automatically activated in 841 study pairs. This activation occurred only when both studies included a volumetric 3D-FLAIR sequence, the software was active at the time of image migration to PACS, and both studies had the same series labeling. Thus, all studies protocoled for MS follow-up should have been automatically processed by VT, and the instances in which this was not the case were random, resulting from technical reasons unrelated to patient factors (eg, server being restarted, Fig 1). These random cases occurred in the remaining 65 study pairs, which allowed CSSC only. Processing times for the software-generated series varied depending on a few factors, including ease of brain-surface extraction and workload of the server due to additional services (average processing time = 5 minutes 11 seconds ± 22 seconds).

Clinical and demographic data are summarized in Table 1, with both groups showing a similar distribution of key variables. Age at scan, sex, and EDSS were comparable across the CSSC and software-assisted groups. As shown in Table 2, pharmacologic treatment was also comparable across groups.

Table 1: Demographic and clinical data across each group

Table 2: Treatment used at scanning for each study group

In the first year following the introduction of the software, 20.49% (95% CI, 16.36%–24.63%) of studies using the software reported new lesions versus 9.76% (95% CI, 0.67%–18.84%) with CSSC. Similarly, in the second year, 20.21% (95% CI, 16.6%–23.82%) of studies using the software reported new lesions versus 8.33% (95% CI, −2.72%–19.39%) with CSSC. These findings are illustrated in Fig 2.
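For reference, the primary model specified in the Statistical Analysis subsection can be written schematically as follows (notation ours), where $y_{ij}$ indicates new or enlarging lesions on the $j$th scan of patient $i$, $\mathrm{VT}_{ij}$ indicates software-assisted reporting, $\mathbf{x}_{ij}$ collects the remaining adjustment covariates (eg, age at scanning, sex, reporter training level, EDSS, time since diagnosis, annualized scan rate), and $u_i$ is the per-patient random intercept:

$$
\operatorname{logit}\,\Pr(y_{ij}=1) \;=\; \beta_0 + \beta_1\,\mathrm{VT}_{ij} + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_i,
\qquad u_i \sim \mathcal{N}(0,\sigma_u^2).
$$

Under this parameterization, $\exp(\beta_1)$ corresponds to the odds ratio reported for detecting new or enlarging lesions with software assistance.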
Fig 2. The proportion of scans showing MS progression within each year. This scatterplot highlights the number of scans and the proportion in which new and enlarging lesions were detected for each study group during each year. The position on the vertical axis corresponds to the proportion of scans showing progression. The position along the horizontal axis corresponds to the study year. The lighter shade corresponds to scans generated with the software.

The fully adjusted multivariable generalized linear mixed model found a greater probability of identifying new/enlarging lesions with software assistance than with CSSC, with an estimated odds ratio of 4.15 (95% CI, 1.07–16.14; *P* = .04). The model was adjusted for age at scanning, sex, whether a scan was reported by a staff radiologist or a radiology resident, EDSS, time since diagnosis, and annualized rate of MR imaging scans. The On-line Table outlines the results of each partially adjusted model computed as part of our sensitivity analysis. These highlight the sustained effect of the software when adjusting for each additional variable independently. The Akaike information criterion (AIC) for the fully adjusted model was 586.8.

Of the 39 individuals reporting MR imaging to whom the impact assessment survey was sent, 23 responded: 8 (34.8%) were radiology residents, 13 (56.5%) were staff radiologists (including 8 [34.8%] fellowship-trained neuroradiologists), and 2 (8.7%) were radiology fellows. Twenty-one (91.3%) reported always using the software when available, and 22 (95.7%) felt comfortable using it as an additional series for reporting. Twenty-one (91.3%) believed it saved them at least 2–5 minutes of reporting time per scan. None of the respondents believed the software added to their reporting time, and 21 (91.3%) stated that they would like to see it implemented in other areas soon.

## Discussion

Semiautomated imaging software has shown great promise in the field of MS disease monitoring.17–19 Earlier studies of VT concluded that it allowed higher lesion detection with improved interreader reliability and decreased reporting times when used by readers of all radiology training levels (ie, ranging from medical student to fellowship-trained neuroradiologist) compared with their performance using CSSC.8,9,14 The main caveats of prior research in this area, however, included the retrospective design, artificial research conditions, and/or relatively small sample sizes.

In this translational study, we used an open-source software for MS follow-up that had previously been validated retrospectively. We used prospectively acquired data, accounting for several potential demographic and clinical confounders. We sought to demonstrate the efficacy of semiautomated imaging when implemented in a real-world clinical setting and to share our experience integrating one such software in our daily practice. We used a permissive research design to mitigate any distortion created by a research setting. Department staff were given a brief, informal in-service overview of how the software worked and of its prior validation; then radiologists were left to work as they would outside a trial environment. There was no pressure to use the software, to pay attention to or record their usage pattern, or to focus on time.
We thought that any such intervention could give a misleading picture of what another department should expect if it were to implement this sort of assistive software. More than 800 of 906 new hospital scans had VT-assisted series automatically generated and available to the reporting radiologist in real time, with only a few minutes elapsing before the color-mapped image series became available on the PACS for reporting. This feature was associated with a >4-fold increase in the odds of new lesion detection compared with scans reported using CSSC. While <10% of studies using CSSC showed disease progression, it was reported in >20% of those using software assistance. In a poststudy survey, almost all radiologists and radiology trainees reported using VT and thought that it reduced their reporting times for MS comparison studies.

The results observed in this prospective study of >800 scans demonstrate an effect equivalent to that seen in our earlier retrospective studies. Demographic data were similar across both study groups and were specifically included in our analysis model to limit confounding. The software was the only variable associated with a difference in lesion detection; age, sex, disease state and time course, reporting radiologist, and annualized rate of scanning were not.

MR imaging remains the most widely used and reliable surrogate marker to monitor disease activity in patients in the real-world clinical setting.5,6,8 Physical and psychological disabilities seen in MS are associated with the number of demyelinating lesions, some of which can be visualized on neuroimaging with FLAIR and T2-weighted sequences.20–22 Recently, the importance of accurate detection of interval MR imaging activity has become even greater because postcontrast imaging is no longer recommended for routine follow-up, largely due to concerns about the presence of residual contrast in the brain after repeat exposure to gadolinium-based agents.23,24

Semiautomated imaging represents a growing field of MS and radiology research, with methods ranging from assisted lesion assessment to brain volumetric analysis.6,19,25 Similar growth is seen with an extension of computer-assisted detection called "radiomics," which converts images to minable data for deep learning.26 Image coregistration is a crucial component of traditional MR imaging comparison. Although image coregistration is routinely performed on a PACS, minor changes in alignment are inevitable without reslicing.27–30 Thus, even apart from the color change maps, the automated reslicing and coregistration performed by the software provide a rapid and effective means of optimizing image comparison and assessment.

After we incorporated VT-assisted imaging into our hospital's daily MR imaging reporting, our findings were in line with those of other, smaller prospective studies that have shown an absolute increase of 13% (22% relative increase) in new MS lesion detection using similar semiautomated software.19 Perhaps more important, implementation of this software in our department was largely seamless and did not appreciably increase transfer times to PACS or data storage burden. Similarly, a post hoc survey of staff in our department showed an overwhelmingly positive response to the integration of the software in our daily practice.

### Limitations

The main limitation in this study is the relatively small number of scans in the CSSC group.
Because our PACS is programmed to automatically process new images with the software whenever possible, the number of unaided scans was limited to the days when VT was unavailable, such as when servers were undergoing maintenance. These factors contributing to the group size discrepancy were random and were not associated with the probability of MR imaging activity. This discrepancy was further addressed by the statistical design of our analysis. For those wishing to implement a similar system in their practice, the downtime described above could be addressed by dedicating a server to the software. Series description and naming in the PACS were another potential source of exclusion from automated VisTarsier processing. Our protocols included 3D-FLAIR series that were all named "FLAIR 3D Sag"; however, at times this name was changed manually, resulting in a matching series not being found. This could be addressed by raising awareness of the importance of standardized series naming. Unfortunately, the reason that a given scan from the CSSC cohort did not meet the automated criteria was not recorded prospectively, and it could not be reconstructed retrospectively.

Although a survey sent to all reporting doctors within the radiology department yielded highly positive results in terms of ease of use and time-saving capabilities of the software, we did not track reporting times as in previous retrospective studies. Unfortunately, these data were not retrospectively mineable on our department's PACS. The qualitative nature of these data thus makes them an adjunct, rather than a statistically rigorous end point. Last, the inherent limitations of a pragmatic real-world prospective observational cohort study mean that we cannot explicitly control how the studies are read by radiologists, and we do not have the ability to generate inter- or intrareader descriptive statistics. These metrics have, however, been previously reported in retrospective validation studies.8 This is, in our opinion, offset by being able to describe the effect of implementing VisTarsier in a routine clinical environment, which is more likely to be of relevance to other institutions.

## Conclusions

Semiautomated lesion-detection software improves the standard of reporting of new or enlarging T2/FLAIR hyperintense lesions in patients with multiple sclerosis. VisTarsier improved reporting standards in cerebral MR imaging of patients with MS when standardized volumetric sequences and uniform scanning protocols were used. Most important, implementing this software in our practice's PACS was relatively seamless and very well received by staff. Future research should validate its capacity to improve reporting in a more heterogeneous sample of images. It should also seek to measure reporting times unobtrusively as a surrogate for workflow efficiency and to demonstrate a change in disease management as a marker of clinical relevance. Computer-aided detection systems promise to improve radiologists' ability to detect disease activity in patients with MS.

## Acknowledgments

We again acknowledge the National Information and Communications Technology of Australia Research Center of Excellence Victorian Research Laboratory for their contribution to the development of the semiautomated imaging software.
## Footnotes

* Disclosures: Tomas Kalincik—*UNRELATED*: *Board Membership*: Roche, Sanofi Genzyme, Novartis, Merck, Biogen; *Consultancy*: Novartis, Biogen, Merck, brain atrophy initiative by Sanofi Genzyme; *Grants/Grants Pending*: National Health and Medical Research Council, Medical Research Future Fund, Australian Research Council, MS Research Australia, Eugène Devic EDMUS Foundation/Association pour la Recherche contre la Sclérose en Plaques, Multiple Sclerosis International Federation, Biogen; *Payment for Lectures Including Service on Speakers Bureaus*: WebMD Global, Novartis, Biogen, Sanofi Genzyme, Teva Pharmaceutical Industries, BioCSL, Merck; *Payment for Development of Educational Presentations*: WebMD Global, Novartis, Biogen, Sanofi Genzyme, Teva Pharmaceutical Industries, BioCSL, Merck; *Travel/Accommodations/Meeting Expenses Unrelated to Activities Listed*: WebMD Global, Novartis, Biogen, Sanofi Genzyme, Teva Pharmaceutical Industries, BioCSL, Merck. Frank Gaillard—*UNRELATED*: *Grant*: Royal Melbourne Hospital Foundation, *Comments*: The grant was internal (Royal Melbourne Hospital raises funds to support local projects) and was awarded to continue development of the VisTarsier software, make it available under an open-source license, and perform this prospective validation. No external funding was received for this project*; *OTHER RELATIONSHIPS*: Founder and CEO of Radiopaedia.org. *Money paid to the institution.

## References

1. Ramagopalan SV, Sadovnick AD. Epidemiology of multiple sclerosis. Neurol Clin 2011;29:207–17 doi:10.1016/j.ncl.2010.12.010 pmid:21439437
2. Degenhardt A, Ramagopalan SV, Scalfari A, et al. Clinical prognostic factors in multiple sclerosis: a natural history review. Nat Rev Neurol 2009;5:672–82 doi:10.1038/nrneurol.2009.178 pmid:19953117
3. Weiner HL. The challenge of multiple sclerosis: how do we cure a chronic heterogeneous disease? Ann Neurol 2009;65:239–48 doi:10.1002/ana.21640 pmid:19334069
4. Havrdova E, Galetta S, Stefoski D, et al. Freedom from disease activity in multiple sclerosis. Neurology 2010;74(Suppl 3):S3–7 doi:10.1212/WNL.0b013e3181dbb51c pmid:20421571
5. Fisniku LK, Brex PA, Altmann DR, et al. Disability and T2 MRI lesions: a 20-year follow-up of patients with relapse onset of multiple sclerosis. Brain 2008;131:808–17 doi:10.1093/brain/awm329 pmid:18234696
6. Bar-Zohar D, Agosta F, Goldstaub D, et al. Magnetic resonance imaging metrics and their correlation with clinical outcomes in multiple sclerosis: a review of the literature and future perspectives. Mult Scler 2008;14:719–27 doi:10.1177/1352458507088102 pmid:18424478
7. Abraham AG, Duncan DD, Gange SJ, et al. Computer-aided assessment of diagnostic images for epidemiological research. BMC Med Res Methodol 2009;9:74 doi:10.1186/1471-2288-9-74 pmid:19906311
8. van Heerden J, Rawlinson D, Zhang A, et al. Improving multiple sclerosis plaque detection using a semiautomated assistive approach. AJNR Am J Neuroradiol 2015;36:1465–71 doi:10.3174/ajnr.A4375 pmid:26089318
9. Wang W, van Heerden J, Tacey MA, et al. Neuroradiologists compared with non-neuroradiologists in the detection of new multiple sclerosis plaques. AJNR Am J Neuroradiol 2017;38:1323–27 doi:10.3174/ajnr.A5185 pmid:28473341
10. Filippi M, Grossman RI. MRI techniques to monitor MS evolution: the present and the future. Neurology 2002;58:1147–53 doi:10.1212/wnl.58.8.1147 pmid:11971079
11. Goodin DS. Magnetic resonance imaging as a surrogate outcome measure of disability in multiple sclerosis: have we been overly harsh in our assessment? Ann Neurol 2006;59:597–605 doi:10.1002/ana.20832 pmid:16566022
12. Bilello M, Arkuszewski M, Nasrallah I, et al. Multiple sclerosis lesions in the brain: computer-assisted assessment of lesion load dynamics on 3D FLAIR MR images. Neuroradiol J 2012;25:17–21 doi:10.1177/197140091202500102 pmid:24028871
13. Moraal B, Meier DS, Poppe PA, et al. Subtraction MR images in a multiple sclerosis multicenter clinical trial setting. Radiology 2009;250:506–14 doi:10.1148/radiol.2501080480 pmid:19037018
14. Dahan A, Wang W, Gaillard F. Computer-aided detection can bridge the skill gap in multiple sclerosis monitoring. J Am Coll Radiol 2018;15(1 Pt A):93–96 doi:10.1016/j.jacr.2017.06.030 pmid:28764954
15. Polman CH, Reingold SC, Banwell B, et al. Diagnostic criteria for multiple sclerosis: 2010 revisions to the McDonald criteria. Ann Neurol 2011;69:292–302 doi:10.1002/ana.22366 pmid:21387374
16. The R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2014
17. Jain S, Sima DM, Ribbens A, et al. Automatic segmentation and volumetry of multiple sclerosis brain lesions from MR images. Neuroimage Clin 2015;8:367–75 doi:10.1016/j.nicl.2015.05.003 pmid:26106562
18. Schmidt P, Gaser C, Arsic M, et al. An automated tool for detection of FLAIR-hyperintense white-matter lesions in multiple sclerosis. Neuroimage 2012;59:3774–83 doi:10.1016/j.neuroimage.2011.11.032 pmid:22119648
19. Galletto Pregliasco AG, Collin A, Guéguen A, et al. Improved detection of new MS lesions during follow-up using an automated MR coregistration-fusion method. AJNR Am J Neuroradiol 2018;39:1226–32 doi:10.3174/ajnr.A5690 pmid:29880479
20. Mortazavi D, Kouzani AZ, Soltanian-Zadeh H. Segmentation of multiple sclerosis lesions in MR images: a review. Neuroradiology 2012;54:299–320 doi:10.1007/s00234-011-0886-7 pmid:21584674
21. Caramanos Z, Francis S, Narayanan S, et al. Large, nonplateauing relationship between clinical disability and cerebral white matter lesion load in patients with multiple sclerosis. Arch Neurol 2012;69:89–95 doi:10.1001/archneurol.2011.765 pmid:22232348
22. Sormani MP, Rovaris M, Comi G, et al. A reassessment of the plateauing relationship between T2 lesion load and disability in MS. Neurology 2009;73:1538–42 doi:10.1212/WNL.0b013e3181c06679 pmid:19794123
23. Traboulsee A, Simon J, Stone L, et al. Revised recommendations of the Consortium of MS Centers Task Force for a standardized MRI protocol and clinical guidelines for the diagnosis and follow-up of multiple sclerosis. AJNR Am J Neuroradiol 2016;37:394–401 doi:10.3174/ajnr.A4539 pmid:26564433
24. Kanda T, Nakai Y, Hagiwara A, et al. Distribution and chemical forms of gadolinium in the brain: a review. Br J Radiol 2017;90:20170115 doi:10.1259/bjr.20170115 pmid:28749164
25. Wang C, Beadnall HN, Hatton SN, et al. Automated brain volumetrics in multiple sclerosis: a step closer to clinical application. J Neurol Neurosurg Psychiatry 2016;87:754–57 doi:10.1136/jnnp-2015-312304 pmid:27071647
26. Giger ML. Machine learning in medical imaging. J Am Coll Radiol 2018;15:512–20 doi:10.1016/j.jacr.2017.12.028 pmid:29398494
27. Goodkin DE, Vanderburg-Medendorp S, Ross J. The effect of repositioning error on serial magnetic resonance imaging scans. Arch Neurol 1993;50:569–70 doi:10.1001/archneur.1993.00540060011007 pmid:8347222
28. Gawne-Cain ML, Webb S, Tofts P, et al. Lesion volume measurement in multiple sclerosis: how important is accurate repositioning? J Magn Reson Imaging 1996;6:705–13 pmid:8890007
29. Molyneux PD, Miller DH, Filippi M, et al. Visual analysis of serial T2-weighted MRI in multiple sclerosis: intra- and interobserver reproducibility. Neuroradiology 1999;41:882–88 pmid:10639661
30. Duan Y, Hildenbrand PG, Sampat MP, et al. Segmentation of subtraction images for the measurement of lesion change in multiple sclerosis. AJNR Am J Neuroradiol 2008;29:340–46 doi:10.3174/ajnr.A0795
31. Kurtzke JF. Rating neurologic impairment in multiple sclerosis: an Expanded Disability Status Scale (EDSS). Neurology 1983;33:1444–52 doi:10.1212/wnl.33.11.1444 pmid:6685237

* Received May 30, 2019.
* Accepted after revision July 19, 2019.
* © 2019 by American Journal of Neuroradiology