Surgeons as Educators A Guide for Academic Development and Teaching Excellence


Future Applications of Crowd-Based Evaluation


Crowdsourced technology can identify trainees who require additional surgical skills improvement and may also prospectively identify those who are surgically precocious. Indeed, there is evidence to suggest that in addition to deliberate practice, individual factors and ability play a significant role in the acquisition of expertise, particularly when tasks are unfamiliar or especially complex [63]. In the surgical arena, where even commonly performed procedures remain unique, challenging, and complex due to individual patient factors, anatomy, and clinical context, trainees with innate ability might therefore develop expertise more efficiently.
This is particularly germane in the context of residency trainee selection, as prior research has demonstrated that completing a surgical residency program alone does not ensure competence. One longitudinal study suggested that about 5–10% of trainees in a 5-year surgical training program did not reach technical proficiency by the completion of their residency [64]. Another survey of North American fellowship directors revealed that 21% of fellows were deemed unprepared for the operating room, and 66% were unable to operate independently for more than 30 consecutive minutes [65]. In this setting, identifying future trainees with the strongest potential for technical aptitude is critical, given the large investments of time and money in resident surgical training. Despite a large body of work investigating the predictive value of personal questionnaires and tests of innate aptitude, manual dexterity, visual-spatial ability, and basic performance resources, no single test or combination of tests has yet been shown to reliably and accurately predict technical aptitude [66].
Among surgical residency program directors, there has been growing interest in including technical skills as a factor when evaluating applications from future surgery trainees [66]. The long-term correlation between pretraining skills assessments and the final performance of these applicants within their respective training programs remains to be elucidated, though evidence from the otolaryngology field suggests that such a relationship may indeed exist [67]. Crowd-based feedback on directly observed technical tasks might provide a means to accomplish this assessment efficiently during the residency recruitment process. Such an application of crowdsourcing technology could have significant implications for the identification and selection of future cohorts of surgeons particularly suited to high-risk or complex surgery.
In 2016, Vernez and colleagues at the University of California, Irvine, explored this idea by applying crowdsourcing technology to a group of 25 medical students applying to a urology residency program. Applicants performed a series of surgical simulation tasks and were then ranked in order of desired match by both expert surgeons and crowds, based on task performance scores alone. Interestingly, the final submitted residency match rank list had poor concordance with both the match list generated solely from crowd scores and the one generated from expert scores (Cronbach's α = 0.46 and 0.48, respectively). However, among those ranked in the


J.C. Dai and M.D. Sorensen