Original Investigation

A Pilot Study on Providing Ophthalmic Training to Medical Students While Initiating a Sustainable Eye Care Effort for the Underserved

Julia M. Byrd, MD1; Michelle R. Longmire, MD2; Noah P. Syme, MD3; Cristina Murray-Krezan, MS4; Linda Rose, MD, PhD5
Author Affiliations
1Moran Eye Center, University of Utah Health Care, Salt Lake City
2Department of Dermatology, Stanford University School of Medicine, Stanford, California
3Otolaryngology Division, Department of Surgery, The University of New Mexico, Albuquerque
4Biostatistics Section, UNM Health Sciences Center, The University of New Mexico, Albuquerque
5Cornea and External Diseases Service, Division of Ophthalmology, Department of Surgery, The University of New Mexico, Albuquerque
JAMA Ophthalmol. 2014;132(3):304-309. doi:10.1001/jamaophthalmol.2013.6671.

Importance  We present a method to reintroduce ophthalmic training into the medical school curriculum.

Objectives  To evaluate knowledge and skills acquired when participating in a service project, the Community Vision Project, and to develop a quantitative method for testing skills with the direct ophthalmoscope in patients.

Design  Second-year medical students participated in the study. After 1 month, their knowledge was compared with that of peers and graduates (internal medicine residents). Also at 1 month, their direct ophthalmoscope skills were compared with those of upperclassmen who had completed all core clerkships. One year later, after the participants had completed their core clerkships, long-term ophthalmoscope skills retention was tested, and their performance was compared with that of their classmates.

Setting and Participants  Training occurred in mobile eye clinics. Knowledge and skills assessments were performed in the hospital eye clinic among students and residents at The University of New Mexico School of Medicine. Patients were recruited from the hospital eye clinic. Participants attended a 3-hour training session held by an attending physician in the hospital eye clinic and took part in at least 1 mobile eye clinic.

Main Outcomes and Measures  A knowledge assessment quiz was administered to participants (n = 12), their classmates (n = 18), and internal medicine residents (n = 33). Skills assessment with the direct ophthalmoscope was performed at 1 month and at 1 year in 5 participants and 5 nonparticipants. Tonometer skills were assessed by comparing participants’ readings with those obtained by an ophthalmologist in patients at the mobile eye clinics.

Results  Participants’ median knowledge assessment scores were 48% higher than those of their classmates and 37% higher than those of internal medicine residents (P < .001 for both). Short-term (1 month) direct ophthalmoscopy median scores were 60% (quartile 1 to quartile 3 range, 40%-80%) for participants and 40% (quartile 1 to quartile 3 range, 20%-60%) for nonparticipating upperclassmen (P = .24). Long-term direct ophthalmoscopy median scores were 100% (quartile 1 to quartile 3 range, 75%-100%) for participants and 0% (quartile 1 to quartile 3 range, 0%-25%) for nonparticipating classmates (P = .11). Participants’ tonometer readings were similar to the ophthalmologist’s; their median reading was 2 mm Hg (quartile 1 to quartile 3 range, 0-4 mm Hg) higher than the ophthalmologist’s (P = .05, sign test).

Conclusions and Relevance  Service-based learning offered an efficient model for incorporating ophthalmic training into the medical school curriculum. A viable tool for quantitatively testing ophthalmoscope skills is presented.


As medical schools attempt to fit an ever-expanding body of knowledge into a 4-year curriculum, ophthalmology has become marginalized. In 2004, the Association of University Professors of Ophthalmology found that only 30% of US medical schools had any required ophthalmology rotation, down from 68% in 2000.1 About that same time, another survey reported that only 17% of US medical schools required ophthalmology rotations.2

Repeated inquiries have concluded that ophthalmic skills of primary care providers are inadequate. Graduating medical students report that examination skills in ophthalmology were not practiced, performed, or mastered.3 Internal medicine residency program directors admit that their residents have inadequate training in ophthalmology.4 Among practicing internists, only 44% surveyed were comfortable performing ophthalmoscopy.5 Even the most common problems, such as cataracts, which are expected to affect more than 30 million Americans by 2020, are not understood by most of today’s graduating physicians.6

Teaching ophthalmology, particularly the use of the ophthalmoscope, has been a persistent challenge, and many different approaches have addressed it with limited success. These have included plastic canisters used to simulate the eye during direct ophthalmoscopy, polystyrene models, 3-dimensional computer simulations, and online lessons in MedEdPORTAL.7,8 Such techniques do not reproduce the real-life challenges of examining a live human, and skills acquired on a simulated model do not necessarily translate into relevant clinical skills with patients.

Methods to better assess competency in the eye examination are being explored. There are reports of using fundus photographs in multiple-choice examinations to test the competency of students in performing a direct ophthalmoscopic examination. Afshar and colleagues9 demonstrated the efficacy of this type of assessment with 33 students before and after an ophthalmologic clerkship. A study by Asman and Lindén10 describes the evaluation of funduscopic examination skills by identifying fundus photographs of peers uploaded to an automated Internet-based assessment.

The present study was developed after observing student participation in The University of New Mexico Community Vision Project (CVP), a community-based service project aimed at providing eye care to the underserved in Albuquerque and in rural areas of the state where access to eye care is limited. The project recruits medical student volunteers from The University of New Mexico, who participate in a 2-phase process. In the first phase, they attend a 3-hour training session held by an attending ophthalmologist (L.R.) after hours in the hospital eye clinic. In this session, groups of 10 or fewer students are first given a 10-minute introductory talk using a handheld eye model, explaining the location of cataracts and basic eye structures. Photographs of cataracts, pterygia, corneal scars, and normal vs glaucomatous optic discs are shown. The students are then taught, through active participation, techniques for visual acuity screening, slitlamp examination, tonometer use, and direct ophthalmoscopy on the dilated eyes of fellow volunteers. Experienced volunteers assist with the training. In the second phase, on the weekend after the training session, the students participate in a mobile eye clinic, where they apply their skills to patients in the field under the supervision of more experienced volunteers and an attending ophthalmologist. The mobile eye clinics provide care to as many as 50 patients in 1 day, and 6 clinics are held per year.

Students participating in this volunteer service project rapidly developed competence in ophthalmology. For this reason, we decided to undertake a formal study of their knowledge and skills and to compare them with those acquired by students in the traditional curriculum. As at most medical schools, the current ophthalmology curriculum at The University of New Mexico School of Medicine is limited, consisting of 6 hours of didactics and 2 hours of skills instruction taught by primary care specialists, all in the preclinical years.

Before beginning the study, approval by The University of New Mexico institutional review board was obtained. For all participants (students and patients), written informed consent was obtained. To evaluate the knowledge level and examination skills of CVP participants, short- and long-term outcomes were measured. Short-term assessments included a knowledge assessment and a skills assessment. Long-term retention of skills using the direct ophthalmoscope was assessed at 1 year after the experience.

Knowledge Assessment Quiz

A multiple-choice quiz was used to evaluate knowledge after participation in the CVP. The quiz assessed basic knowledge about common ophthalmic problems (Figure 1). It was administered to CVP participants within 1 month of the experience. Classmates who had not participated in the CVP were recruited to complete the same quiz. The recruitment process took place at the end of a regularly scheduled nonophthalmology-related academic lecture. Students were asked to voluntarily stay for a brief anonymous quiz to assess their baseline understanding of ophthalmology. To sample the level of ophthalmic knowledge of recent graduates, a group of internal medicine residents was recruited to voluntarily and anonymously participate in the same quiz after one of their regularly scheduled academic meetings. All quizzes were graded by the supervising ophthalmologist (L.R.).

Figure 1. Knowledge Assessment Quiz
Short-term Skills Assessment With the Direct Ophthalmoscope

To assess ophthalmoscope examination skills acquired in the CVP, second-year medical students who had participated were recruited 1 month after the experience, and 5 were randomly selected to have their skills evaluated. Their performance was compared with that of 5 volunteer upperclassmen, also randomly chosen, who had completed all third-year clinical clerkships but had never participated in the CVP.

Ophthalmoscopy skills were quantitatively tested in patients. To achieve this, we recruited patients with distinct fundus findings (eg, morning glory disc, macular scar, or myelinated nerve fiber layer). The volunteer patients had their pupils dilated, and each was placed in an examination room. The 5 students in each group were given answer sheets specifying, for each patient, which eye and which structure to examine (eg, the right optic nerve or the left macula). For each patient, there were 4 photographs labeled A through D: 1 of the actual patient and 3 from fundus photograph archives (Figure 2). Students reported their findings on the answer sheet, which also contained an option E to be selected if they did not obtain a view; students were encouraged not to guess if they did not see the fundus. Once in the room, they had 2 minutes to perform the examination with the ophthalmoscope and to choose the image corresponding to the patient examined.

Figure 2. Skills Assessment Sample Item

Students were instructed to examine the patient’s right optic disc and select the correct photograph of the patient. In this example, the photograph corresponding to the patient examined was the myelinated nerve fiber layer seen in D. The other photographs were selected from archives, all demonstrating some feature around the optic disc of the right eye.
Long-term Skills Assessment With the Direct Ophthalmoscope

To test for long-term retention of skills, the same evaluation was performed 1 year later. At that point, CVP participants had completed their core clerkships and were compared with classmates who had never participated in the CVP but had also completed their core clerkships. Both groups of students were again randomly selected from volunteers. A new group of patients with distinctive fundus findings was also recruited. The assessment was performed in the same way as the short-term skills assessment.

Tonometer Skills Assessment

Students’ accuracy using the tonometer was assessed by comparing their measurements with those obtained by the attending ophthalmologist. During the mobile eye clinics, students routinely obtained these measurements; the attending ophthalmologist repeated the readings in randomly selected patients.

Statistical Analysis

Descriptive statistics, such as medians and quartiles, were calculated to summarize the test scores of the students and residents. The data were not normally distributed, and nonparametric statistics were used for all comparisons because of the small sample sizes. The Wilcoxon rank sum test was used to compare test scores between independent groups (such as between CVP students and non-CVP students), and the sign test was used for the paired comparisons between CVP students’ and the attending ophthalmologist’s intraocular pressure readings. All analyses were performed using statistical software (SAS 9.3; SAS Institute Inc).
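Although the analyses were run in SAS 9.3, the same two nonparametric tests can be sketched in Python. The block below is a minimal illustration, not the authors’ code: the score and pressure values are hypothetical placeholders, and scipy’s tie handling and exact-test defaults may yield P values that differ slightly from SAS.

```python
from scipy.stats import mannwhitneyu, binomtest

# Wilcoxon rank sum test (equivalent to the Mann-Whitney U test) comparing
# two independent groups, eg, CVP vs non-CVP quiz scores (hypothetical data).
cvp_scores = [81, 75, 88, 69, 94]
non_cvp_scores = [33, 28, 41, 22, 36]
rank_sum = mannwhitneyu(cvp_scores, non_cvp_scores, alternative="two-sided")
print(f"Wilcoxon rank sum P = {rank_sum.pvalue:.3f}")

# Sign test for paired intraocular pressure readings: count how many student
# readings exceed the attending's, drop ties, and test against a fair coin.
student_iop = [16, 18, 14, 20, 15, 17]    # hypothetical readings, mm Hg
attending_iop = [14, 16, 14, 17, 14, 16]  # hypothetical readings, mm Hg
diffs = [s - a for s, a in zip(student_iop, attending_iop) if s != a]
n_positive = sum(d > 0 for d in diffs)
sign_test = binomtest(n_positive, n=len(diffs), p=0.5)
print(f"Sign test P = {sign_test.pvalue:.3f}")
```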

Knowledge Assessment Quiz

For the knowledge assessment, 12 CVP participants, 18 non-CVP participants, and 33 internal medicine residents took the quiz. The results were analyzed using the Wilcoxon rank sum test (Table 1). Participants’ median knowledge assessment scores were 48% higher than those of their classmates and 37% higher than those of internal medicine residents (P < .001 for both).

Table 1. Knowledge Assessment Quiz Results

We compared performance in 2 groups. In group 1, CVP participants scored significantly higher than their second-year classmates (median, 81% vs 33%; P < .001). In group 2, CVP participants scored significantly higher than the internal medicine residents (median, 81% vs 44%; P < .001).

We were concerned that CVP participants might score better because of a preexisting interest in ophthalmology. For this reason, an additional question was placed at the beginning of the quiz, asking each nonparticipant whether he or she had an interest in participating in future mobile eye clinics. The quizzes of nonparticipants were then divided into those who responded that they were interested in attending a future mobile eye clinic and those who responded that they had no interest. No statistically significant difference in median scores was found between these 2 groups (33% for the interested non-CVP students and 28% for the noninterested non-CVP students, P = .93) (data not shown). Because reported interest had no effect on scores, all non-CVP student scores were presented as a single group.

The validity of the knowledge assessment quiz was evaluated. Questions testing similar concepts were paired, and the performance of each student on these similar questions was examined for correlation. For example, the ability to identify a pterygium and name the cause of the lesion (items 1 and 2 in the Knowledge Assessment Quiz) correlated 80% of the time. Similarly, the ability to identify a photograph of a cataract correlated with the ability to name its anatomic location as the lens (items 4 and 5) 87% of the time. Finally, the ability to name the normal intraocular pressure and identify an intraocular pressure that would be consistent with angle-closure glaucoma (items 7 and 8) correlated 73% of the time.
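The article does not spell out how these consistency percentages were computed. One plausible reading, sketched below under that assumption, is the percentage of students who answered both items of a pair consistently (both correct or both incorrect); the response data in the sketch are hypothetical.

```python
# Each tuple is one student's (item A correct?, item B correct?) for a pair
# of related quiz items; the responses below are hypothetical placeholders.
pairs = {
    "pterygium: identify vs cause (items 1 and 2)": [(1, 1), (1, 1), (0, 0), (1, 0), (1, 1)],
    "cataract: identify vs locate (items 4 and 5)": [(1, 1), (0, 0), (1, 1), (1, 1), (0, 1)],
}

for label, responses in pairs.items():
    # Percentage of students whose two answers agree (both right or both wrong).
    agreement = sum(a == b for a, b in responses) / len(responses)
    print(f"{label}: consistent {agreement:.0%} of the time")
```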

Short-term Skills Assessment With the Direct Ophthalmoscope

To assess the acquisition of ophthalmoscopy skills in the CVP, the scores of 5 second-year medical students who had participated in the CVP the month prior were compared with those of 5 upperclassmen who had completed their core clinical clerkships but had never participated in the CVP. The scores of the CVP students were 40%, 40%, 60%, 80%, and 100%, while the scores of the non-CVP students were 20%, 20%, 40%, 60%, and 40%. Short-term (1 month) direct ophthalmoscopy median scores were 60% (quartile 1 to quartile 3 range, 40%-80%) for participants and 40% (quartile 1 to quartile 3 range, 20%-60%) for nonparticipating upperclassmen (P = .24). The second-year CVP participants, who had no formal clinical experience, thus scored a median of 20 percentage points higher than the upperclassmen. Although the difference did not reach statistical significance, second-year students who had no clinical experience and would not be expected to have attained this skill were not only on par with upperclassmen but performed better.
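As a quick worked check, the raw scores listed above reproduce the reported medians directly; a short Python sketch follows. Note that quartile definitions vary across statistical packages, so numpy’s default linear interpolation may not exactly match the quartiles reported here for samples of 5.

```python
import numpy as np

# Raw short-term ophthalmoscopy scores as reported above.
cvp = np.array([40, 40, 60, 80, 100])
non_cvp = np.array([20, 20, 40, 60, 40])

for name, scores in (("CVP participants", cvp), ("Non-CVP upperclassmen", non_cvp)):
    # numpy's default percentile method is linear interpolation; SAS and other
    # packages use different conventions and can report different quartiles.
    q1, median, q3 = np.percentile(scores, [25, 50, 75])
    print(f"{name}: median {median:.0f}% (Q1-Q3, {q1:.0f}%-{q3:.0f}%)")
```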

Long-term Skills Assessment With the Direct Ophthalmoscope

To test the long-term retention of ophthalmoscopy skills, 5 medical students who had participated in the CVP 1 year earlier were recruited. These students were from the same cohort as in the short-term skills assessment and were again randomly chosen, but they were not the same group of 5 students; by this point, the cohort had completed the third year. We also recruited students from their class who had never participated in the CVP. Therefore, all medical students at this assessment had completed their third-year clerkships. The scores of the CVP students were 100%, 100%, 100%, 75%, and 0%, while the scores of the non-CVP students were 0%, 0%, 0%, 25%, and 75%. Long-term direct ophthalmoscopy median scores were 100% (quartile 1 to quartile 3 range, 75%-100%) for participants and 0% (quartile 1 to quartile 3 range, 0%-25%) for nonparticipating classmates (P = .11).

Tonometer Skills Assessment

To assess CVP participants’ accuracy with the tonometer, intraocular pressures obtained by students in random mobile eye clinic patients were compared with pressures obtained by the attending ophthalmologist. Readings by the students (n = 23) did not differ significantly from those obtained by the attending ophthalmologist; their median reading was 2 mm Hg (quartile 1 to quartile 3 range, 0-4 mm Hg) higher than the ophthalmologist’s (P = .05, sign test) (Table 2).

Table 2. Tonometer Readings of Medical Students vs the Attending Ophthalmologist

This pilot study demonstrated that the service model is an effective way to create an immersion experience that imparts useful knowledge and examination skills. For a CVP student, attending just one training session and one service project represents a total time commitment of only 8 to 10 hours yet produces high-yield gains.

A service project may not be the answer at all schools, and any high-volume experience that incorporates active participation can likely achieve similar goals. We emphasize that, from the initial training session onward, students are active learners. They are given a glimpse of eye structures that few generalists ever see. At the slitlamp examination, the oculars are split between student and attending ophthalmologist while an up-close view of the cornea, iris, and lens is explained to the student. Similarly, although mastery of indirect ophthalmoscopy is not a goal for these students, the instrument is used as a means to develop their intuitions about and interest in eye anatomy as the attending ophthalmologist holds the lens for them and assists in acquiring a view. The students frequently report that this initial glimpse, facilitated by an experienced ophthalmologist, builds a foundation for the interest and motivation that they apply to learning to perform the examination later in the mobile eye clinics.

The mobile eye clinic experience that follows the training session creates a venue in which the students’ newly developing skills are critically needed and readily used, with the safety net of active expert supervision. It is this type of intense exposure, where skills are repeated in a short period, that produces unusual gains. We propose that condensed immersion experiences may be a method to restore meaningful ophthalmic learning, as was measured in our participants.

Evaluation of the data showed that this training produced a set of practical knowledge and skills that would be useful in primary care settings. The quizzes demonstrated the ability of participants to visually spot cataracts and pterygia (chosen because they are common in the Southwest), to identify the location and pathologic features of cataracts and glaucoma, and to problem solve common complaints, such as presbyopia.

Regarding skills acquisition, we show that the students learned to accurately use the tonometer. This is a useful skill in emergency departments and primary care settings and can help reduce the burden of unnecessary urgent consultations if physicians can produce accurate readings. Such accuracy requires repeated hands-on practice supervised by an expert. This is available in the setting of a mobile eye clinic, where up to 100 eyes of 50 patients are measured by 5 to 10 students, allowing each student to reinforce the skill by repeatedly using the tonometer up to 20 times in 1 day.

This study demonstrates that superior ophthalmoscopy skills are acquired after participation in the program and are retained at 1 year. The sample size for this portion of the study was small, making statistical significance difficult to obtain; however, the raw scores show a notable difference between the groups. Preclinical CVP students in the short-term assessment outperformed upperclassmen who had completed all third-year clerkships. The long-term assessment scores showed an even greater discrepancy in favor of participation, suggesting that students who begin their clinical clerkships with these skills are more likely to practice and are better equipped to hone their use of the direct ophthalmoscope during the clinical years. Without this initial skill, there is perhaps a lost opportunity for learning during the clerkships.

This study also offers a new method to quantitatively evaluate ophthalmoscopic skills. The ability to match fundus photographs after a direct ophthalmoscope examination captures the challenges of a real examination and has been used before.10 We added the dimension of incorporating patients with distinctive fundus findings and using matched photographs from archives. In this pilot study, dilated pupils were used; however, future studies incorporating undilated eyes would be even more relevant for the future generalist.

Logistically, a program like this meets relevant learning objectives without adding to the curriculum, but it is voluntary, limiting its availability to students who are motivated or inclined to participate in service projects. Because service projects are required at many medical schools, students could use a project such as this to meet academic requirements. At The University of New Mexico, the program’s reputation as a valuable learning tool has spread during the past 4 years: participation among first-year class members has grown from 10% in 2009 to 45% in 2013. In addition to meeting their service requirements, preclinical students are motivated by the hands-on opportunity and the appeal of a notable addition to their résumés.

In conclusion, the findings of this small pilot study are preliminary, but the trends found herein support the efficacy of an immersion experience. Key aspects of this model that seem to determine its success are the intense involvement of an attending ophthalmologist, the near-immediate reinforcement of the training session in the mobile eye clinic just days later, and the high volume and high participation in the mobile eye clinic, where skills are repeated on a substantial number of patients in a short period. Future studies with larger cohorts at other medical schools could explore the efficacy of immersion training in other forms, whether incorporating a service project or another innovative method that creates condensed, expert-supervised learning. Even if such programs are voluntary, they may help restore adequate ophthalmic training, at least for motivated students, in the next generation. As we have shown, these skilled students then mentor other students and at times their generalist attendings, thus reintegrating eye care into the practice of general medicine.

Submitted for Publication: June 9, 2013; final revision received July 31, 2013; accepted August 2, 2013.

Corresponding Author: Linda Rose, MD, PhD, Cornea and External Diseases Service, Division of Ophthalmology, Department of Surgery, The University of New Mexico, Mail Stop Code 10-5610, One University of New Mexico, Albuquerque, NM 87131.

Published Online: January 2, 2014. doi:10.1001/jamaophthalmol.2013.6671.

Author Contributions: Drs Byrd and Rose had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Study concept and design: Byrd, Longmire.

Acquisition of data: Byrd, Syme, Rose.

Analysis and interpretation of data: Byrd, Longmire, Murray-Krezan.

Drafting of the manuscript: Byrd, Longmire, Murray-Krezan.

Critical revision of the manuscript for important intellectual content: Byrd, Syme, Rose.

Statistical analysis: Byrd, Longmire, Murray-Krezan.

Administrative, technical, or material support: Syme, Rose.

Study supervision: Byrd, Longmire.

Conflict of Interest Disclosures: Dr Rose is the recipient of grant support from Genentech for an unrelated project concerning pterygium management. No other disclosures were reported.

Funding/Support: This research was supported by a grant from the Scholarship in Education Allocation Committee at The University of New Mexico. The equipment used in the service project was purchased using funding from the Association of American Medical Colleges as part of a “Caring for the Community” grant.

Role of the Sponsor: Funding from the Scholarship in Education Allocation Committee was used in the design and conduct of the study and to collect the data. The Scholarship in Education Allocation Committee had no role in the management, analysis, or interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication. The Association of American Medical Colleges had no role in the design or conduct of the study and was not involved in any aspect of the preparation of the manuscript.

Additional Contributions: We gratefully acknowledge the support of the Association of American Medical Colleges.


References

1. Quillen DA, Harper RA, Haik BG. Medical student education in ophthalmology: crisis and opportunity. Ophthalmology. 2005;112(11):1867-1868.
2. Khachikian S, Morason RT. Trends in clinical ophthalmology training in United States medical schools. Invest Ophthalmol Vis Sci. 2004;45:1404. http://abstracts.iovs.org/cgi/content/abstract/45/5/1404. Accessed September 23, 2013.
3. Remmen R, Derese A, Scherpbier A, et al. Can medical schools rely on clerkships to train students in basic clinical skills? Med Educ. 1999;33(8):600-605.
4. Stern GA; Association of University Professors of Ophthalmology Education Committee. Teaching ophthalmology to primary care physicians. Arch Ophthalmol. 1995;113(6):722-724.
5. Roberts E, Morgan R, King D, Clerkin L. Funduscopy: a forgotten art? Postgrad Med J. 1999;75(883):282-284.
6. Congdon N, Vingerling JR, Klein BE, et al; Eye Diseases Prevalence Research Group. Prevalence of cataract and pseudophakia/aphakia among adults in the United States. Arch Ophthalmol. 2004;122(4):487-494.
7. Hoeg TB, Sheth BP, Bragg DS, Kivlin JD. Evaluation of a tool to teach medical students direct ophthalmoscopy. WMJ. 2009;108(1):24-26.
8. Morris W. Clinical examination of the eye and its adnexa for non-ophthalmologists. MedEdPORTAL; 2010. www.mededportal.org/publication/7758. Accessed November 8, 2013.
9. Afshar AR, Oh FS, Birnbaum AD, Namavari A, Riddle JM, Djalilian AR. Assessing ophthalmoscopy skills. Ophthalmology. 2010;117(9):1863, 1863.e1-2. http://www.aaojournal.org/article/S0161-6420(10)00526-9/abstract. Accessed November 8, 2013.
10. Asman P, Lindén C. Internet-based assessment of medical students’ ophthalmoscopy skills. Acta Ophthalmol. 2010;88(8):854-857.
