Culican Lab - Research Opportunities
Evidence Basis of Educational Evaluations
The ACGME has implemented Milestones assessments for all residents every 6 months during training, yet the evidence that these Milestones accurately indicate progress in training is lacking. We aim to systematically analyze the data collected since these assessments were implemented to determine whether they predict resident progress during training. Resident-driven research on topics of their choice will be entertained. Ongoing projects include:
1) The use of the Ophthalmic Clinical Exam (OCEX) to gauge resident progress in clinical acumen. Drs. Grace Paley and Tommy Shute analyzed three years of OCEX data to assess how accurately the instrument documents increasing resident competency over 3 years of training. They found that:
a) there was no systematic improvement in scores during residency;
b) OCEX scores varied significantly with the faculty member performing the assessment but not with the resident being examined;
c) in a survey of faculty completing the tool, 70% reported that they did not use the grading anchors on the form, relying instead on an internal metric of resident proficiency.
Additional data from the University of Missouri Columbia further strengthened the conclusion that the OCEX, as typically used in two residency programs, was insufficient to document residents' clinical progression during training.
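Finding b) above corresponds to a rater-effect analysis: scores grouped by examining faculty member differ more than chance would predict. A minimal sketch of that kind of test is a one-way ANOVA F statistic computed over scores grouped by faculty rater; the scores below are illustrative placeholders, not actual study data.

```python
def anova_f(groups):
    """One-way ANOVA F statistic for a list of score groups.

    A large F suggests between-group (here, between-faculty) variation
    exceeds within-group variation.
    """
    k = len(groups)                          # number of groups (faculty raters)
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical OCEX totals grouped by the examining faculty member
scores_by_faculty = [
    [28, 30, 29, 31],   # Faculty A
    [22, 21, 24, 23],   # Faculty B
    [26, 27, 25, 26],   # Faculty C
]
print(round(anova_f(scores_by_faculty), 2))
```

In practice the F statistic would be compared against an F distribution (e.g., with a statistics package) to obtain a p-value; the same grouping run by resident rather than by faculty rater would test the complementary effect.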
2) The accuracy of a simplified surgical assessment tool in demonstrating advancing surgical proficiency during residency training. We adapted the O-SCORE surgical assessment tool to document resident surgical skill in an ophthalmology program. Preliminary data after 6 months demonstrate a significant difference in scores by year of training (PGY2, PGY3, PGY4). 12-month data will be forthcoming.
3) Are resident self-assessments able to capture progression in the 6 competency domains? Residents complete self-assessments every 6 months during residency training. Part of this assessment includes resident-reported proficiency in the 6 core competency domains, graded on a 9-point Likert scale. The general impression of the program director and the Clinical Competency Committee was that resident self-assessments captured educational progression over 3 years of residency training more accurately than faculty evaluations did. We will test the hypothesis that self-assessments show a statistically significant improvement in resident proficiency that is not captured in faculty evaluations.
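One way to frame the hypothesis above is a paired comparison of each resident's first and final self-assessment scores in a given competency domain. The sketch below computes a paired t statistic in plain Python; the 9-point Likert ratings are hypothetical examples, not study data, and a full analysis would likely use a repeated-measures or mixed-effects model across all 6-month time points.

```python
import math

def paired_t(before, after):
    """Paired t statistic for matched before/after scores (one per resident)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the paired differences
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical 9-point Likert self-ratings in one competency domain
first_assessment = [3, 4, 3, 2, 4, 3]   # start of residency
final_assessment = [7, 8, 6, 7, 8, 7]   # end of residency
print(round(paired_t(first_assessment, final_assessment), 2))
```

Running the same test on faculty evaluation scores, and finding a significant effect only for self-assessments, would support the committee's impression.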
4) Existing methods of grading surgical skill require expert evaluation and are cumbersome and time-consuming. We are conducting a proof-of-concept study of a low-burden, low-cost, objective, and reliable method to assess resident surgical proficiency. We will test the feasibility of “crowd-sourcing” evaluations of surgical videos to lay raters to measure resident operative surgical skill. We will determine whether lay raters’ and surgical experts’ assessments of surgical skill agree using a modified OSATS (Aim 1A). We will also determine whether agreement between lay raters and surgical experts is consistent across the range of resident surgical experience, both in a cross-sectional sample of residents in different years of training (Aim 1B) and in a longitudinal study of individual residents’ progress over a year of training (Aim 1C). This study will be the first of many aimed at defining objective benchmarks of surgical proficiency to ensure that resident physicians are competent to practice independently at the conclusion of their residency.
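Since OSATS items are ordinal ratings, one common agreement statistic for Aim 1A is a quadratic-weighted Cohen's kappa, which penalizes large rating disagreements more than near-misses. A minimal self-contained sketch follows; the lay and expert ratings are hypothetical, and the actual study may use a different agreement measure (e.g., an intraclass correlation coefficient) depending on how the modified OSATS is scored.

```python
from collections import Counter

def weighted_kappa(r1, r2, n_levels):
    """Quadratic-weighted Cohen's kappa for two raters on a 1..n_levels scale.

    Returns 1.0 for perfect agreement, 0.0 for chance-level agreement.
    """
    n = len(r1)

    def w(i, j):
        # Quadratic disagreement weight, 0 (same rating) to 1 (opposite ends)
        return ((i - j) ** 2) / ((n_levels - 1) ** 2)

    # Observed weighted disagreement across paired ratings
    obs = sum(w(a, b) for a, b in zip(r1, r2)) / n
    # Chance-expected weighted disagreement from the marginal frequencies
    c1, c2 = Counter(r1), Counter(r2)
    exp = sum(c1[i] * c2[j] * w(i, j) for i in c1 for j in c2) / (n * n)
    return 1 - obs / exp

# Hypothetical 5-point OSATS-style ratings of the same videos
lay    = [3, 4, 2, 5, 3, 4]
expert = [3, 4, 3, 5, 2, 4]
print(round(weighted_kappa(lay, expert, 5), 2))
```

For Aims 1B and 1C, the same statistic can be computed within strata (year of training, or time point within a year) to check whether lay–expert agreement holds across the range of resident experience.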
Dr. Susan Culican: