Enhancing the Radiologist-Patient Relationship through Improved Communication: A Quantitative Readability Analysis in Spine Radiology
=====================================================================================================================================

* D.R. Hansberry * A.L. Donovan * A.V. Prabhu * N. Agarwal * M. Cox * A.E. Flanders

## Abstract

**BACKGROUND AND PURPOSE:** More than 75 million Americans have less than adequate health literacy skills according to the National Center for Education Statistics. Readability scores gauge how easily written material, such as patient education resources, can be read and understood. The purpose of this study was to assess the readability of Web sites dedicated to patient education for radiologic spine imaging and interventions.

**MATERIALS AND METHODS:** Eleven search terms relevant to radiologic spine imaging were searched on the public Internet, and the top 10 links to patient-centered education Web sites for each term were collected and analyzed to determine readability scores by using 10 well-validated quantitative readability assessments. The search terms included the following: x-ray spine, CT spine, MR imaging spine, lumbar puncture, kyphoplasty, vertebroplasty, discogram, myelogram, cervical spine, thoracic spine, and lumbar spine.

**RESULTS:** Collectively, the 110 articles were written at an 11.3 grade level (grade range, 7.1–16.9). None of the articles were written at the American Medical Association and National Institutes of Health recommended 3rd-to-7th grade reading levels. The vertebroplasty articles were written at a statistically significant (*P* < .05) more advanced level than the articles for x-ray spine, CT spine, and MR imaging spine.

**CONCLUSIONS:** Increasing use of the Internet to obtain health information has made it imperative that on-line patient education be written for easy comprehension by the average American. However, given the discordance between readability scores of the articles and the American Medical Association and National Institutes of Health recommended guidelines, it is likely that many patients do not fully benefit from these resources.

## ABBREVIATIONS:

AMA : American Medical Association

FRE : Flesch Reading Ease

GFI : Gunning Fog Index

NIH : National Institutes of Health

As barriers to on-line access have decreased, the Internet has emerged as a primary resource for Americans seeking greater understanding of their health. According to a June 2015 report by the Pew Research Center,1 up to 84% of adults access the Internet, and within the past year, 72% of those users have searched for health information.2 Specifically, 55% wanted to learn more about a disease or medical problem; and 43%, about a medical treatment or procedure.2 Studies have confirmed that this on-line research impacts decision-making for many patients: the questions they ask, the types of treatment they pursue, and whether they visit a physician.2–5

Although more adults are accessing health care information on-line than ever before,2,4 it is uncertain how much of this information is fully comprehended due to poor health literacy. Health literacy, as defined by the US Department of Health and Human Services, is “the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions.”6 In a 2003 assessment commissioned by the US Department of Education, only 12% of adults were found to have proficient health literacy.
Proficiency was defined as having the skills necessary to locate, understand, and use information contained within documents commonly encountered in the medical system, such as medication dosing instructions, preventative care documentation, and insurance information; this definition implies an ability to read, analyze, and synthesize complex content. More than 75 million Americans demonstrated either basic or below basic health literacy and would experience difficulty reading and comprehending health care–related text.7

The importance of health literacy cannot be overstated because it has a direct influence on both health outcomes and health care expenditures. Studies have linked low health literacy to increased hospitalizations,8,9 higher mortality rates,8,10 and an annual cost to the US economy of up to $238 billion.11 In fact, the American Medical Association (AMA) has identified low health literacy as a strong independent predictor of health status.12

Readability, defined as the degree of ease with which a given text can be read and comprehended, is 1 correlative measure of health literacy.13 The reading level of the average American is between the 7th and 8th grade, while the average Medicaid enrollee reads at just a 5th grade level.12 Therefore, to maximize the number of individuals benefiting from patient education, the AMA and the National Institutes of Health (NIH) recommend that content be written at the 3rd-to-7th grade levels.12,14 However, patient education materials across numerous specialties in medicine do not meet this recommendation. A 2013 readability study published in *JAMA Internal Medicine* analyzed material from 16 different medical specialties and determined that it was too complex for the average patient.15 Similar conclusions have been drawn regarding the surgical subspecialties.16 Readability analyses specific to spine-related patient education have also revealed a failure to meet reading level guidelines.17–20 However, research to date has examined only surgical procedures and material sourced from professional society Web sites. Three of the 4 studies were also limited by an analysis that incorporated just 1 readability assessment.

The purpose of this study was to quantitatively determine the readability of patient education Web sites pertaining to radiologic diagnostic tests and interventions of the spine. We used 10 readability assessments that are well-vetted in the literature to avoid bias from any single test. This analysis does not include patient education materials related to imaging of the brain.

## Materials and Methods

This study examined publicly available data; thus, institutional review board oversight was not required. In December 2015, Web sites dedicated to patient education relevant to spine imaging were sought on the public Internet by using the Google search engine. Eleven keywords were separately entered as search terms: x-ray spine, CT spine, MR imaging spine, lumbar puncture, kyphoplasty, vertebroplasty, discogram, myelogram, cervical spine, thoracic spine, and lumbar spine. The first 10 articles intended for patients for each term were included in the analysis. Web sites not specifically directed toward patients were excluded. The text of the 110 articles was copied, pasted, and saved as individual Microsoft Word (Microsoft, Redmond, Washington) documents. Images, figures, tables, references, and other noneducational text were removed.

Each document was then analyzed with Readability Studio Professional Edition (Oleander Software, Vandalia, Ohio). An individual readability score was calculated for each of the following 10 well-validated assessments (Table): the Coleman-Liau Index,21 Flesch Reading Ease (FRE),22 Flesch-Kincaid Grade Level,23 FORCAST,24 Fry Graph,25 Gunning Fog Index (GFI),26 New Dale-Chall,27 New Fog Count,23 Raygor Readability Estimate,28 and SMOG.29 The FRE reports scores on a 0–100 scale, with lower numbers corresponding to more difficult-to-read text. The remaining 9 scales report the readability of the text as a grade level. For instance, a GFI score of 9.0 corresponds to a 9th grade reading level.

[Table: Formulas for the readability assessments](http://www.ajnr.org/content/38/6/1252/T1)
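Although this study used commercial software, the grade-level formulas themselves are simple functions of word, sentence, and syllable counts. The minimal Python sketch below illustrates 2 of the assessments in the Table, the FRE and the Flesch-Kincaid Grade Level. The syllable counter is a crude vowel-group heuristic, so its output will differ from that of Readability Studio, whose parsing rules are more sophisticated; the sample sentence is illustrative only.

```python
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as vowel groups; real analyzers use dictionaries."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1  # treat a trailing 'e' as silent
    return max(count, 1)

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for a text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # mean words per sentence
    spw = syllables / len(words)        # mean syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw   # 0-100; lower = harder
    fkgl = 0.39 * wps + 11.8 * spw - 15.59     # US school grade level
    return fre, fkgl

fre, grade = flesch_scores(
    "Vertebroplasty is a procedure that stabilizes compression fractures "
    "of the spine. A special cement is injected through a needle."
)
print(f"FRE = {fre:.1f}, grade level = {grade:.1f}")
```

Both formulas move together: longer sentences and more syllables per word push the FRE down and the grade level up, which is why jargon-dense medical text scores poorly on both.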
Statistical analysis was conducted by using OriginPro (OriginLab, Northampton, Massachusetts) to compare readability scores among the 11 keywords. A 1-way ANOVA and a Tukey Honestly Significant Difference post hoc analysis were performed, with significance set at *P* < .05.

## Results

Collectively, the 110 articles had a mean FRE score of 51.9, classifying them as fairly difficult on the FRE scale, and a mean grade level of 11.3 averaged across the 9 grade-based assessments (Fig 1). FRE scores ranged from 74 (fairly easy) to 14 (very difficult), and grade levels ranged from 7.1 to 16.9. None of the articles (0/110) met the AMA and NIH recommendation that materials be written within a 3rd-to-7th grade level. Approximately 35% (39/110) were written at a level that required a high school education or higher (score of ≥12). An additional 50 articles scored between the 9th and 12th grade levels (On-line Table).

![Fig 1.](https://www.ajnr.org/content/ajnr/38/6/1252/F1.medium.gif)

[Fig 1.](http://www.ajnr.org/content/38/6/1252/F1) The grade level, taken as the mean of all readability scales examined in this study, for the top 10 search results for each key term. The *red box* represents the AMA and NIH recommended 3rd-to-7th grade guidelines.

The articles contained many words characterized as complex, long, or unfamiliar. Words with at least 3 syllables were considered complex and composed 16.1% of the text of the articles, while words with at least 6 characters were considered long and composed 33.7%. More than 28% of words were classified as unfamiliar, as determined by their absence from the Dale-Chall list of simple words, which contains 3000 words known by most 4th grade children.27 In addition, unfamiliar words made up at least one-third of the text for 19 of the 110 articles (17.3%). Sentences ranged from 23 to 127 words.

The 1-way ANOVA found a statistically significant difference among the 11 keywords (*F*(10,99) = 3.19, *P* = .001). Average grade levels for each searched term were as follows: x-ray spine, 9.4; CT spine, 9.6; MR imaging spine, 10.2; discogram, 10.7; myelogram, 11.0; cervical spine, 11.2; thoracic spine, 11.8; lumbar spine, 11.8; lumbar puncture, 12.0; kyphoplasty, 12.4; and vertebroplasty, 13.4. Tukey Honestly Significant Difference post hoc analysis indicated that the vertebroplasty articles were significantly more advanced than the articles for x-ray spine, CT spine, and MR imaging spine (*P* < .05).
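For readers who wish to replicate this type of comparison, the sketch below shows the 1-way ANOVA and Tukey Honestly Significant Difference procedure in Python with SciPy and statsmodels rather than OriginPro. The grade-level values are hypothetical placeholders, not the study data, and only 3 of the 11 search terms are shown.

```python
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical per-article mean grade levels grouped by search term;
# the study itself analyzed 10 articles per term across 11 terms.
scores = {
    "x-ray spine":    [9.1, 9.8, 9.4, 9.2, 9.6],
    "CT spine":       [9.9, 9.3, 9.5, 9.8, 9.4],
    "vertebroplasty": [13.0, 13.8, 13.2, 13.6, 13.4],
}

# 1-way ANOVA across the keyword groups
f_stat, p_value = stats.f_oneway(*scores.values())
print(f"F = {f_stat:.2f}, P = {p_value:.4f}")

# Tukey HSD post hoc pairwise comparisons at alpha = .05
values = [v for group in scores.values() for v in group]
labels = [term for term, group in scores.items() for _ in group]
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```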
## Discussion

Because of the inherently complex nature of spine diagnoses and treatments, patients are apt to seek more information on the Internet. Up to 77% of individuals begin this process with a search engine such as Google.2 More than 90% do not look beyond the first page of results.30 Consequently, patients wishing to learn more about radiologic spine imaging and interventions would likely encounter 1 of the 110 articles in this study when searching for these 11 terms. With a mean grade level of 11.3, these articles would be too complex for the average American, who reads at a 7th-to-8th grade level. In addition, the abundance of uncommon words and long sentences would make understanding difficult for those classified as having less than proficient health literacy, which indicates an inability to read and synthesize complex health care–related text. Therefore, the more than 75 million adults identified by the US Department of Education as having either basic or below basic health literacy would not fully benefit from this information and may be led to uninformed decisions that negatively affect health outcomes.7

If on-line patient education resources were written at a 7th grade reading level or lower, more Americans would be able to read the material and understand it more thoroughly. Consequently, patients would likely experience increased involvement in their care and improved communication with their physicians. When empowered with knowledge, patients have been shown to ask more questions, communicate concerns with greater confidence, and actively engage in the medical decision-making process.31–33 Patients have also reported greater satisfaction, particularly with informed consent.34 In radiology, health literacy has been linked to differing rates of imaging use35 and to patient knowledge of procedure details and radiation use.36 Complex examinations and interventions, including those of the spine, stand to benefit from the active patient engagement and enhanced patient-provider communication resulting from well-written education materials.

The results of this study are consistent with prior research investigating the readability of on-line patient education. Web sites for both medical and surgical subspecialties are routinely written at a level exceeding the 7th grade.37–40 Those dedicated to radiology, including radiologyinfo.org, sponsored by the American College of Radiology and the Radiological Society of North America, are written at a level too advanced for most patients.41 In addition, patient education materials from professional society Web sites, Wikipedia, WebMD, and hospital Web sites have all been written above the average comprehension level.42–45 This study, strengthened by the incorporation of text sourced from multiple Web site types and the use of 10 readability assessments, lends further support to the conclusions drawn by prior spine imaging readability research.

Collectively, these results highlight the need for further action to satisfy AMA and NIH readability recommendations. Authors and editors should use simpler words, construct shorter sentences, reduce abbreviations and acronyms, and eliminate medical jargon.14 Resources from the NIH,14 the Centers for Disease Control and Prevention,46 and the Centers for Medicare and Medicaid Services47 are available to offer further guidance.
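As a practical aid, the short sketch below shows how an author might automatically flag draft sentences that work against these recommendations. The 6-character threshold for a long word follows the definition used in this study; the 15-word sentence ceiling and the 40% long-word cutoff are illustrative choices for plain-language drafting, not published guidelines.

```python
import re

LONG_WORD_CHARS = 6      # this study's threshold for a "long" word
MAX_SENTENCE_WORDS = 15  # illustrative ceiling for plain-language drafts
LONG_WORD_RATIO = 0.4    # illustrative cutoff, not a published guideline

def screen_draft(text: str) -> None:
    """Print sentences that are too long or lean heavily on long words."""
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        words = re.findall(r"[A-Za-z']+", sentence)
        if not words:
            continue
        long_words = [w for w in words if len(w) >= LONG_WORD_CHARS]
        too_long = len(words) > MAX_SENTENCE_WORDS
        too_dense = len(long_words) / len(words) > LONG_WORD_RATIO
        if too_long or too_dense:
            print(f"Revise ({len(words)} words, {len(long_words)} long): {sentence}")

screen_draft(
    "Percutaneous vertebroplasty stabilizes vertebral compression fractures "
    "through fluoroscopically guided injection of polymethylmethacrylate cement. "
    "The doctor uses a thin needle and x-ray images to guide it."
)
```

Run on the two versions of the same statement above, only the jargon-dense first sentence is flagged, illustrating how a simple automated check can steer a draft toward the AMA and NIH targets.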
This study is limited by the constraints of the readability assessments. Most important, the algorithms rely on quantitative parameters, such as the number of letters, syllables, words, and sentences in the text, and may yield inaccurate scores for medical terminology. For instance, words with few syllables that are not necessarily familiar to the average person may lead to inappropriately low scores, while common multisyllabic words would be scored at a higher grade level. The FORCAST formula, which is based solely on the number of single-syllable words, is particularly susceptible to this bias. For example, “pia” would receive a lower rating than “operation,” despite being the less common term. The other assessments that use syllable counts, including the FRE, Flesch-Kincaid Grade Level, Fry Graph, GFI, and SMOG, may be affected to a somewhat lesser extent due to their use of additional variables. In this study, the incorporation of 10 readability assessments reduces the bias of any single algorithm.

An additional limitation is that none of the assessments evaluated the nontextual elements of readability, such as style, format, and organization,13 or the use of supplemental material, such as images or diagrams. Further work is needed to determine the effect of these elements on the comprehension of patient education materials, specifically in radiology. Conducting readability and comprehension tests with target prospective patient populations may also be revealing.

## Conclusions

With increasing use of the Internet for patient self-education, there is a growing need for the readability of material to fall within the limits of the average American's comprehension. However, patient education materials in many disciplines of medicine far exceed the average reading level, and spine imaging and radiologic interventions have been no exception. It is imperative to broaden awareness of this discrepancy to mitigate the negative outcomes of poor health literacy. By adhering to the AMA and NIH guidelines, physicians, professional societies, and other authors can increase patient comprehension of on-line health care materials.

## References

1. Perrin A, Duggan M. Americans' Internet Access: 2000–15. Pew Research Center; 2015. [http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/](http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/). Accessed August 15, 2016.
2. Fox S, Duggan M. Health Online 2013. Pew Research Center: Internet, Science & Tech; January 15, 2013. [http://www.pewinternet.org/2013/01/15/health-online-2013/](http://www.pewinternet.org/2013/01/15/health-online-2013/). Accessed August 15, 2016.
3. Pourmand A, Sikka N. Online health information impacts patients' decisions to seek emergency department care. West J Emerg Med 2011;12:174–77 pmid:21691522
4. Rice RE. Influences, usage, and outcomes of Internet health information searching: multivariate results from the Pew surveys. Int J Med Inform 2006;75:8–28 doi:10.1016/j.ijmedinf.2005.07.032 pmid:16125453
5. Rainie L, Fox S. The Online Health Care Revolution. Pew Research Center: Internet, Science & Tech; 2000. [http://www.pewinternet.org/2000/11/26/the-online-health-care-revolution/](http://www.pewinternet.org/2000/11/26/the-online-health-care-revolution/). Accessed August 15, 2016.
6. Healthy People 2010. US Department of Health and Human Services; November 26, 2000. [http://www.healthypeople.gov/2010/](http://www.healthypeople.gov/2010/). Accessed August 15, 2016.
7. National Assessment of Adult Literacy: A Nationally Representative and Continuing Assessment of English Language Literacy Skills of American Adults. National Center for Education Statistics; 2003. [https://nces.ed.gov/naal/fr_skills.asp](https://nces.ed.gov/naal/fr_skills.asp). Accessed August 15, 2016.
8. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med 2011;155:97–107 doi:10.7326/0003-4819-155-2-201107190-00005 pmid:21768583
9. Baker DW, Gazmararian JA, Williams MV, et al. Functional health literacy and the risk of hospital admission among Medicare managed care enrollees. Am J Public Health 2002;92:1278–83 doi:10.2105/AJPH.92.8.1278 pmid:12144984
10. Baker DW, Wolf MS, Feinglass J, et al. Health literacy and mortality among elderly persons. Arch Intern Med 2007;167:1503–09 doi:10.1001/archinte.167.14.1503 pmid:17646604
11. Vernon JA, Trujillo A, Rosenbaum SJ, et al. Low Health Literacy: Implications for National Health Policy. Washington, DC: Department of Health Policy, School of Public Health and Health Services, The George Washington University; 2007
12. Weiss BD, Schwartzberg JG; American Medical Association. Health Literacy and Patient Safety: Help Patients Understand: Manual for Clinicians. Chicago: AMA Foundation; 2007
13. DuBay WH. The Principles of Readability. Costa Mesa: Impact Information; 2004
14. How to Write Easy-to-Read Health Materials. MedlinePlus. [https://medlineplus.gov/etr.html](https://medlineplus.gov/etr.html). Accessed August 15, 2016.
15. Agarwal N, Hansberry DR, Sabourin V, et al. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern Med 2013;173:1257–59 doi:10.1001/jamainternmed.2013.6060 pmid:23689468
16. Hansberry DR, Agarwal N, Shah R, et al. Analysis of the readability of patient education materials from surgical subspecialties. Laryngoscope 2014;124:405–12 doi:10.1002/lary.24261 pmid:23775508
17. Eltorai AE, Cheatham M, Naqvi SS, et al. Is the readability of spine-related patient education material improving? An assessment of subspecialty websites. Spine (Phila Pa 1976) 2016;41:1041–48 doi:10.1097/BRS.0000000000001446 pmid:27294810
18. Ryu JH, Yi PH. Readability of spine-related patient education materials from leading orthopedic academic centers. Spine (Phila Pa 1976) 2016;41:E561–65 doi:10.1097/BRS.0000000000001321 pmid:26641845
19. Agarwal N, Feghhi DP, Gupta R, et al. A comparative analysis of minimally invasive and open spine surgery patient education resources. J Neurosurg Spine 2014;21:468–74 doi:10.3171/2014.5.SPINE13600 pmid:24926930
20. Vives M, Young L, Sabharwal S. Readability of spine-related patient education materials from subspecialty organization and spine practitioner websites. Spine (Phila Pa 1976) 2009;34:2826–31 doi:10.1097/BRS.0b013e3181b4bb0c pmid:19910867
21. Coleman ML, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol 1975;60:283–84 doi:10.1037/h0076540
22. Flesch R. A new readability yardstick. J Appl Psychol 1948;32:221–33 doi:10.1037/h0057532 pmid:18867058
23. Kincaid JP, Fishburne RP Jr, Rogers RL. Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Millington: National Technical Information Service; 1975
24. Caylor JS, Sticht TG, Fox LC, et al. Methodologies for Determining Reading Requirements of Military Occupational Specialties: Technical Report. Alexandria: Human Resources Research Organization; 1973
25. Fry E. A readability formula that saves time. Journal of Reading 1968;11:513–78
26. Gunning R. The Technique of Clear Writing. New York: McGraw-Hill; 1952
27. Chall JS, Dale E. Readability Revisited: The New Dale-Chall Readability Formula. Cambridge: Brookline Books; 1995
28. Raygor AL. The Raygor readability estimate: a quick and easy way to determine difficulty. In: Pearson PD, Hansen J, eds. Reading: Theory, Research, and Practice: Twenty-Sixth Yearbook of the National Reading Conference. Clemson: National Reading Conference; 1977
29. McLaughlin GH. SMOG grading: a new readability formula. Journal of Reading 1969;12:639–46
30. Chitika. The Value of Google Result Positioning. June 7, 2013. [https://chitika.com/google-positioning-value](https://chitika.com/google-positioning-value). Accessed August 15, 2016.
31. Lee CJ, Gray SW, Lewis N. Internet use leads cancer patients to be active health care consumers. Patient Educ Couns 2010;81(suppl):S63–69 doi:10.1016/j.pec.2010.09.004 pmid:20889279
32. Iverson SA, Howard KB, Penney BK. Impact of internet use on health-related behaviors and the patient-physician relationship: a survey-based study and review. J Am Osteopath Assoc 2008;108:699–711 pmid:19075034
33. Hironaka LK, Paasche-Orlow MK. The implications of health literacy on patient-provider communication. Arch Dis Child 2008;93:428–32 doi:10.1136/adc.2007.131516 pmid:17916588
34. Fraval A, Chandrananth J, Chong YM, et al. Internet based patient education improves informed consent for elective orthopaedic surgery: a randomized controlled trial. BMC Musculoskelet Disord 2015;16:14 doi:10.1186/s12891-015-0466-9 pmid:25885962
35. Morrison AK, Brousseau DC, Brazauskas R, et al. Health literacy affects likelihood of radiology testing in the pediatric emergency department. J Pediatr 2015;166:1037–41.e1 doi:10.1016/j.jpeds.2014.12.009 pmid:25596100
36. Gebhard RD, Goske MJ, Salisbury SR, et al. Improving health literacy: use of an informational brochure improves parents' understanding of their child's fluoroscopic examination. AJR Am J Roentgenol 2015;204:W95–W103 doi:10.2214/AJR.14.12573 pmid:25539281
37. Prabhu AV, Hansberry DR, Agarwal N, et al. Radiation oncology and online patient education materials: deviating from NIH and AMA recommendations. Int J Radiat Oncol Biol Phys 2016;96:521–28 doi:10.1016/j.ijrobp.2016.06.2449 pmid:27681748
38. Prabhu AV, Gupta R, Kim C, et al. Patient education materials in dermatology: addressing the health literacy needs of patients. JAMA Dermatol 2016;152:946–47 doi:10.1001/jamadermatol.2016.1135 pmid:27191054
39. Agarwal N, Chaudhari A, Hansberry DR, et al. A comparative analysis of neurosurgical online education materials to assess patient comprehension. J Clin Neurosci 2013;20:1357–61 doi:10.1016/j.jocn.2012.10.047 pmid:23809099
40. Kasabwala K, Misra P, Hansberry DR, et al. Readability assessment of the American Rhinologic Society patient education materials. Int Forum Allergy Rhinol 2013;3:325–33 doi:10.1002/alr.21097 pmid:23044857
41. Hansberry DR, John A, John E, et al. A critical review of the readability of online patient education resources from RadiologyInfo.Org. AJR Am J Roentgenol 2014;202:566–75 doi:10.2214/AJR.13.11223 pmid:24555593
42. Hansberry DR, Agarwal N, Baker SR. Health literacy and online educational resources: an opportunity to educate patients. AJR Am J Roentgenol 2015;204:111–16 doi:10.2214/AJR.14.13086 pmid:25539245
43. Hansberry DR, Ramchand T, Patel S, et al. Are we failing to communicate? Internet-based patient education materials and radiation safety. Eur J Radiol 2014;83:1698–702 doi:10.1016/j.ejrad.2014.04.013 pmid:24968965
44. Hansberry DR, Kraus C, Agarwal N, et al. Health literacy in vascular and interventional radiology: a comparative analysis of online patient education resources. Cardiovasc Intervent Radiol 2014;37:1034–40 doi:10.1007/s00270-013-0752-6 pmid:24482028
45. Hansberry DR, Agarwal N, Gonzales SF, et al. Are we effectively informing patients? A quantitative analysis of on-line patient education resources from the American Society of Neuroradiology. AJNR Am J Neuroradiol 2014;35:1270–75 doi:10.3174/ajnr.A3854 pmid:24763420
46. Centers for Disease Control and Prevention. Health Literacy Guidance and Standards. 2015. [https://health.gov/communication/literacy/quickguide/quickguide.pdf](https://health.gov/communication/literacy/quickguide/quickguide.pdf). Accessed August 15, 2016.
47. Toolkit for Making Written Material Clear and Effective. Centers for Medicare and Medicaid Services. [https://www.cms.gov/Outreach-and-Education/Outreach/WrittenMaterialsToolkit/index.html](https://www.cms.gov/Outreach-and-Education/Outreach/WrittenMaterialsToolkit/index.html). Accessed August 15, 2016.

* Received October 7, 2016.
* Accepted after revision January 25, 2017.
* © 2017 by American Journal of Neuroradiology