Original Investigation | Clinical Sciences

Readability Assessment of Online Ophthalmic Patient Information

Matthew R. Edmunds, MRCP, MRCOphth1; Robert J. Barry, BMedSci(Hons), MBChB1; Alastair K. Denniston, MA, MRCP, FRCOphth, PhD1,2
Author Affiliations
1Academic Unit of Ophthalmology, College of Medical and Dental Sciences, University of Birmingham, Birmingham and Midland Eye Centre, Birmingham, England
2University Hospitals Birmingham NHS Foundation Trust, Birmingham, England
JAMA Ophthalmol. 2013;131(12):1610-1616. doi:10.1001/jamaophthalmol.2013.5521.

Importance  Patients increasingly use the Internet to access information related to their disease, but poor health literacy is known to have a negative impact on medical outcomes. Multiple agencies have recommended that patient-oriented literature be written at a fourth- to sixth-grade (9-12 years of age) reading level to assist understanding. The readability of online patient-oriented materials related to ophthalmic diagnoses is not yet known.

Objective  To assess the readability of online literature specifically for a range of ophthalmic conditions.

Design and Setting  Body text of the top 10 patient-oriented websites for 16 different ophthalmic diagnoses, covering the full range of ophthalmic subspecialties, was analyzed for readability, source (United Kingdom vs non–United Kingdom, not for profit vs commercial), and appropriateness for sight-impaired readers.

Main Outcomes and Measures  Four validated readability formulas were used: the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG), and Gunning Fog Index (GFOG). Data were compared with the Mann-Whitney test (for 2 groups) and the Kruskal-Wallis test (for more than 2 groups), and correlation was assessed with the Spearman r.
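
To make the four measures concrete, the sketch below implements the published formulas in Python. The formulas themselves are the standard published ones; the sentence, word, and syllable counting heuristics (and the readability_scores helper) are illustrative assumptions, not the tooling the authors used.

    import re

    def count_syllables(word: str) -> int:
        """Rough vowel-group heuristic for English syllable counting (an assumption,
        not the study's method)."""
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1  # discount a common silent final "e"
        return max(count, 1)

    def readability_scores(text: str) -> dict:
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = re.findall(r"[A-Za-z']+", text)
        n_words = max(len(words), 1)
        syllables = sum(count_syllables(w) for w in words)
        # Polysyllabic/"complex" words (3+ syllables), per the SMOG and GFOG conventions.
        polysyllables = sum(1 for w in words if count_syllables(w) >= 3)

        wps = n_words / sentences   # mean words per sentence
        spw = syllables / n_words   # mean syllables per word

        return {
            # Flesch Reading Ease Score: higher = easier (60-70 is roughly plain English).
            "FRES": 206.835 - 1.015 * wps - 84.6 * spw,
            # Flesch-Kincaid Grade Level: US school grade required.
            "FKGL": 0.39 * wps + 11.8 * spw - 15.59,
            # Simple Measure of Gobbledygook: grade level from polysyllable density.
            "SMOG": 1.0430 * (polysyllables * 30 / sentences) ** 0.5 + 3.1291,
            # Gunning Fog Index: grade level from sentence length and complex-word share.
            "GFOG": 0.4 * (wps + 100 * polysyllables / n_words),
        }

    print(readability_scores("Cataract surgery replaces the cloudy lens in your eye."))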

Results  None of the 160 webpages had readability scores within published guidelines, with 83% assessed as being of “difficult” readability. Not-for-profit webpages were significantly longer than commercial webpages (P = .02), and UK-based webpages had slightly better readability scores than non-UK webpages (P = .004 to P < .001, depending on the readability formula used). Of all webpages evaluated, only 34% included a facility to adjust text size to assist visually impaired readers.
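
For readers who want to reproduce this style of analysis, the following is a minimal sketch of the group comparisons described above, assuming scipy is available; the score lists are hypothetical placeholders, not data from the study.

    from scipy import stats

    uk_fkgl     = [9.8, 11.2, 10.5, 12.0]   # hypothetical FKGL scores, UK webpages
    non_uk_fkgl = [12.4, 13.1, 11.9, 14.0]  # hypothetical FKGL scores, non-UK webpages

    # Mann-Whitney test for 2 independent groups (e.g., UK vs non-UK source).
    u_stat, p_mw = stats.mannwhitneyu(uk_fkgl, non_uk_fkgl)

    # Kruskal-Wallis test for more than 2 groups (e.g., scores grouped by diagnosis).
    h_stat, p_kw = stats.kruskal([10.1, 11.4], [12.2, 13.5], [9.9, 10.8])

    # Spearman rank correlation between two readability measures
    # (e.g., grade level rising as the ease score falls).
    rho, p_rho = stats.spearmanr([10, 11, 12, 13], [58, 52, 47, 41])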

Conclusions and Relevance  To our knowledge, this is the first study to assess readability of patient-focused webpages specifically for a range of ophthalmic diagnoses. In keeping with previous studies in other medical conditions, we determined that readability scores were inferior to those recommended, irrespective of the measure used. Although readability is only one aspect of how well a patient-oriented webpage may be comprehended, we recommend the use of readability scoring when producing such resources in the future. Minimum readability policies and inclusion of facilities within webpages to maximize viewing potential for visually impaired readers are important to ensure that online ophthalmic patient information is accessible to the broadest audience possible.



Figures

Figure. Distribution of Readability Scores in Terms of Number and Percentage for All Ophthalmic Webpages Evaluated and for Each of the Different Readability Scores. A, Flesch Reading Ease Score (FRES). B, Flesch-Kincaid Grade Level (FKGL). C, Simple Measure of Gobbledygook (SMOG). D, Gunning Fog Index (GFOG).
