ning of the simulator development process, examining the alignment of the content (skills, knowledge, and scenarios) with the construct or domain the simulator is intended to measure, and extends to assessing the impact of training or assessment with the target simulator. According to the current Standards for Educational and Psychological Testing, validity is "unitary": validity evidence is collected to support or refute the interpretation of simulator scores for a particular use, not to validate the simulator itself. The current Standards describe five sources of validity evidence: content evidence, internal structure evidence, response processes evidence, relations with other variables evidence, and consequences evidence [6, 13]. The level of validity evidence required for a simulation-based curriculum intended solely to train residents differs from that required for a curriculum intended to credential or certify an individual (a high-stakes examination) [14].
References
1. Noureldin YA, Stoica A, Kassouf W, Tanguay S, Bladou F, Andonian S. Incorporation of the da Vinci surgical skills simulator at urology objective structured clinical examinations (OSCEs): a pilot study. Can J Urol. 2016;23(1):8160–6.
2. Noureldin YA, Fahmy N, Anidjar M, Andonian S. Is there a place for virtual reality simulators in assessment of competency in percutaneous renal access? World J Urol. 2016;34(5):733–9.
3. Noureldin YA, Elkoushy MA, Fahmy N, Carrier S, Elhilali MM, Andonian S. Assessment of photoselective vaporization of prostate skills during urology objective structured clinical examinations (OSCE). Can Urol Assoc J. 2015;9(1–2):e61–6.
4. Wiggins G, McTighe J. Understanding by design. Alexandria: Association for Supervision and Curriculum Development (ASCD); 2005.
5. McTighe J, Wiggins G. Understanding by design® framework. Alexandria: Association for Supervision and Curriculum Development (ASCD); 2012. Accessed online on June 1st 2017 from http://www.ascd.org/ASCD/pdf/siteASCD/publications/UbD_WhitePaper0312.pdf.
6. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. Standards for educational and psychological testing. Washington, DC: American Educational Research Association; 2014.
7. Millo Y, George I, Seymour N, Smith R, Petinaux O. Guidelines for simulation development: a set of recommendations for preferred characteristics of surgical simulation; developed by the Technology and Simulation Committee of the Accredited Education Institutes Consortium; 2014. Accessed online on June 1st 2017 from https://www.facs.org/~/media/files/education/aei/guidelines%20for%20simulation%20interactive.ashx.
8. Hananel D, Sweet RM, Stubbs J. Simulator development – from idea to prototype to product. In: Aggarwal R, Korndorfer J, Cannon-Bowers J, editors. ACS principles and practice for simulation and surgical education research. 1st ed. Chicago: American College of Surgeons; 2015. p. 138–52.
9. Hananel D, Sweet RM. Deconstructing fidelity for simulation in healthcare. Submitted to Simulation in Healthcare.
10. Rooney D, Cooke J, Hananel D. The creation of a simulator value index tool by connected consensus. Simul Healthc. 2014;9(6):427. doi:10.1097/01.SIH.0000459322.91351.95.
11. BioGears Human Physiology Engine. Accessed online on May 30th from https://www.ara.com/projects/biogears-human-physiology-engine.
12. Sweet RM. The CREST simulation development process: training the next generation. J Endourol. 2017;31(S1):S69–75. doi:10.1089/end.2016.0613. Epub 2016 Dec 22.
13. Sweet RM, Hananel D, Lawrenz F. A unified approach to validation, reliability, and education study design for surgical technical skills training. Arch Surg. 2010;145(2):197–201.
14. Korndorffer JR, Kasten SJ, Downing SM. Validity and reliability. In: Tsuda ST, Scott DJ, Jones DB, editors. Textbook of simulation: skills and team training. 1st ed. Woodbury: Ciné-Med Publishing; 2012. p. 81–4.
Y.A. Noureldin and R.M. Sweet