Simulated Success: Impact of a Preclinical Elective on Medical Student Performance

Danielle M Sawka, Mark C Kendall, Matthew S Diorio, Nardin O Derias, Arezoo Rajaee, Chao Ji, Shyamal R Asher

Correspondence: Shyamal R Asher, Anesthesiology, Rhode Island Hospital, 593 Eddy Street, Davol 129, Providence, RI, 02903, USA, Tel +1 401-527-4775, Email [email protected]

Introduction: Preclinical medical students often have minimal exposure to topics within the field of anesthesiology. While efforts to expand clinical exposure in the preclinical curriculum are increasing, it remains unclear which education topics are still undervalued. This study is the first application of an anesthesiology simulation with objective performance evaluation in the medical student population, and it aims to identify key areas of clinical learning growth and gaps within anesthesiology following a semester-long elective.
Methods: Our study population consisted of 14 first-year and 2 second-year medical students interested in a career in anesthesiology and enrolled in an anesthesia preclinical elective at a single institution. Students completed the same clinical scenario, graded with a standardized rubric, before and after the elective. Differences in individual simulation scores were analyzed using the paired t-test or the Wilcoxon signed-rank test for paired samples.
Results: The students had a statistically significant increase in individual total scores (p < 0.001), with the strongest improvements in basic induction skills (p < 0.001) and response to hemodynamic changes (p < 0.001) after completion of the elective. There was no significant improvement in PACU management skills (p = 0.184) after completion of the elective, although the small sample size limits statistical power.
Conclusion: This first application of anesthesia-based simulation training with performance scoring in medical students highlights the possible need for targeted early intervention in post-operative management for medical students interested in a career in anesthesiology.

Introduction

With only 20% of medical schools requiring an anesthesiology rotation,1 medical student exposure to topics within anesthesiology is unfortunately limited. Previous studies at our institution have demonstrated that a preclinical anesthesia elective enhances written knowledge acquisition2 and that the creation of a new anesthesiology residency program has been associated with an increase in anesthesiology match rates in affiliated medical schools.3 Despite the benefits of these initiatives to enhance preclinical medical education, there remains a strong need to understand which specific domains of clinical skills preclinical medical students are adequately learning.

Simulation-based training is an indispensable tool for assessing such clinical skills, especially in procedural specialties such as anesthesiology. Most simulation studies of anesthesia skills, however, involve residents and attendings,4–7 not medical students, even though the published literature shows extensive integration of medical student simulation training in other specialties such as internal medicine and obstetrics and gynecology.8–12 These simulations frequently employ objective measurements of performance improvement rather than subjective participant attitudes, a potentially more valuable measure when assessing a simulation's clinical impact. In the one study we identified of medical students performing anesthesia-related skills, the focus was limited to hands-on technical performance of maneuvers such as intubation, with no simulation modeled after a clinical scenario and no testing of the crucial higher-order cognitive skills and knowledge learned in anesthesia training.13 The present study provides a model for filling this gap.

Investigating student learning through clinical simulation training has the additional benefit of clearly isolating and testing understanding of specific content areas. The Accreditation Council for Graduate Medical Education (ACGME) program requirements for anesthesiology residents define broad domains of expected competency such as “airway management techniques” and management of “patients immediately after anesthesia, including direct care of patients in the post-anesthesia care unit, and responsibilities for management of pain, hemodynamic changes, and emergencies related to the post-anesthesia care unit”.14 This single-institution prospective study aims to explore which anesthesia content areas, derived from ACGME core domains, are learned by preclinical medical students interested in anesthesia after a semester of exposure to the field of anesthesiology. Medical student clinical anesthesia performance in these domains was assessed through a low-fidelity simulation scenario before and after the elective. Importantly, this study uses objective assessments of participant performance and real-time applied knowledge, an approach that is desirable but currently lacking in the sparse research on medical student simulations in anesthesia.

Methods

Study Participants

The students enrolled in the anesthesiology preclinical elective volunteered to participate in an identical clinical simulation scenario before the start (September 5, 2023) and upon completion (November 14, 2023) of the elective. Participation in the simulation was not required to obtain credit for the elective and did not affect participation in the elective. Participants received an introduction letter describing the study; their completion of the simulation indicated willingness to participate. The demographics of study participants who completed the post-elective simulation are reported in Table 1. In brief, the majority of participants (88%) were first-year medical students, the gender ratio was roughly equal (7 males, 9 females), and most (81%) students considered anesthesiology a top-three specialty interest, although three-quarters had no prior exposure to the field.

Table 1 Study Participant Demographics

Study Context

The anesthesiology preclinical elective, BIOL 6704: “Anesthesia: Much More than Putting you to Sleep”, is a one-credit course held annually throughout the fall semester at The Warren Alpert Medical School of Brown University. This course has previously been shown to significantly increase medical students’ understanding of anesthesiology fundamentals in airway management, anesthetic pharmacology, ultrasound basics, and residency training.2 This pass/fail elective is open to all first- and second-year medical students during the fall semester. The sessions and learning objectives are presented in Table 2. Students participate in both didactics and shadowing experiences in the operating theaters.

Table 2 Introduction to Anesthesia Preclinical Elective Syllabus Overview

Ethical Clearance

Per the Rhode Island Hospital Institutional Review Board, this study was exempt from Human Subjects Research under 45 Code of Federal Regulations 46.104(d) requirements and did not require consent documentation (IRB2083944-3).

Simulation Design

The clinical anesthesiology simulation scenario was adapted, with permission, from a Vital Anesthesia Simulation Training (VAST) case.15 VAST is a low fidelity simulation course designed for trainees and practitioners in low resource settings that has been shown to improve trainee Anesthetists’ Non-Technical Skills (ANTS) scores16 and team coordination in cardiopulmonary resuscitation.17 In the latter study, the performance based measures were evaluated using a self-designed checklist similar in construction to ours that was tested to have good reliability and validity.18 The case involved a 19-year-old male, represented by an airway mannequin, presenting for a laparoscopic appendectomy after one day of abdominal pain, nausea, and vomiting. An iPad with the Sim-Mon app (Castle+Anderson ApS; Copenhagen, Denmark) was used to display a monitor screen with vital signs that could be changed in real time requiring live interpretation by the student as the simulation progressed. Following the case presentation, the student was cued to enter the simulation and act as the primary anesthesia provider.

Data Collection

The simulation facilitators included a senior anesthesia resident and an attending anesthesiologist who assessed performance in five key domains, called “stages”, and assigned numerical points based on a standardized rubric. The standardized rubric was designed to have binary responses and explicit, unambiguous scoring criteria in order to eliminate the need for rater training and concerns about interrater variability. Stage (a) assessed basic induction knowledge, Stage (b) airway management, Stage (c) reaction to hemodynamic changes, Stage (d) emergence and postoperative pain control, and Stage (e) PACU management. In all five stages, the student acted as a supervisor for the anesthesia resident. The anesthesia resident was a scripted role played by the resident simulation facilitator, while the attending (S.R.A.) assessed the student’s competency; thus, a single scorer was used to minimize interrater variability. The detailed simulation scenario and scoring rubric are included in Appendix 1. Figure 1 is an illustrated schematic of the rubric and includes key tasks students had to initiate to receive credit. All data collected were de-identified.

Figure 1 Rubric illustrating an optimal series of steps that can be taken during the simulation scenario. A skilled anesthesiologist observed each student and awarded points in real time based on actions completed. Each box represents 1 point that can be earned; for drug items, 0.5 points were awarded if the student correctly named the class of drug and 1 point if the student answered with a correct drug name. The total points for Stages (a-e) were 6, 3, 3, 4, 4 respectively, for a maximum score of 20 points. As the scenario progressed from Stages (a-e), facilitators manipulated the situation according to the arrows. Drawings by D. M. Sawka.
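For readers adapting the rubric elsewhere, its scoring logic can be captured in a small data structure. The sketch below is illustrative only: the stage point totals (6, 3, 3, 4, 4; maximum 20) and the 0.5/1-point drug rule come from Figure 1, while the stage keys and the example responses are hypothetical placeholders rather than the actual rubric items (see Appendix 1 for the full instrument).

```python
# Illustrative sketch of the rubric's scoring logic (not the actual instrument; see Appendix 1).
# Stage maxima follow Figure 1: Stages (a-e) = 6, 3, 3, 4, 4 points, 20 points total.
STAGE_MAX = {"a_induction": 6, "b_airway": 3, "c_hemodynamics": 3,
             "d_emergence_pain": 4, "e_pacu": 4}

def score_drug_item(named_drug: bool, named_class_only: bool) -> float:
    """Drug items: 1 point for a correct drug name, 0.5 points for the correct class only."""
    if named_drug:
        return 1.0
    if named_class_only:
        return 0.5
    return 0.0

def total_score(stage_points: dict) -> float:
    """Sum stage scores, clipping each stage at its maximum (20 points overall)."""
    return sum(min(points, STAGE_MAX[stage]) for stage, points in stage_points.items())

# Hypothetical example: a student names one induction drug and only the class of another.
example = {"a_induction": score_drug_item(True, False) + score_drug_item(False, True),
           "b_airway": 1.0, "c_hemodynamics": 0.0, "d_emergence_pain": 0.5, "e_pacu": 1.0}
print(total_score(example))  # 4.0
```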

Statistical Analysis

Only results from participants who completed both simulations were analyzed. The few students who completed the pre-elective simulation but not the post-elective simulation were assumed to be missing at random and were excluded from the analysis. Means and standard deviations were computed for the five stages. Individual differences between pre- and post-elective scores were calculated for each section. The paired differences were assessed for normality using the Shapiro–Wilk test and then analyzed using the paired t-test or the Wilcoxon signed-rank test for paired samples. The alpha threshold for significance was 0.05. Post hoc power analysis was performed with G*Power 3 (Heinrich Heine Universität Düsseldorf), a free and publicly available software package19 described in the literature.20 The inputs were the type of statistical test (Means: Difference between two dependent means, matched pairs), number of tails (two), effect size calculated from group means and standard deviations, alpha, and total sample size; the output was the study power.
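As a point of reference, the same pre/post comparison can be reproduced with standard open-source tools. The sketch below assumes paired pre- and post-elective total scores stored in two hypothetical arrays; it mirrors the workflow described above (Shapiro–Wilk on the paired differences, then paired t-test or Wilcoxon signed-rank test), and the effect-size and power step is an approximation of the matched-pairs calculation reported by G*Power, not the authors' exact computation.

```python
# Minimal sketch of the analysis pipeline; the score arrays are hypothetical placeholders.
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestPower

pre = np.array([2.0, 4.5, 1.0, 6.0, 3.5, 0.5, 5.0, 2.5])    # placeholder pre-elective totals
post = np.array([7.0, 9.5, 4.0, 11.0, 8.5, 6.0, 9.0, 7.5])  # placeholder post-elective totals
diff = post - pre

# Normality of the paired differences decides the test, as in the Methods.
_, p_shapiro = stats.shapiro(diff)
if p_shapiro > 0.05:
    stat, p_value = stats.ttest_rel(post, pre)   # paired t-test
else:
    stat, p_value = stats.wilcoxon(post, pre)    # Wilcoxon signed-rank test

# Post hoc power for matched pairs: effect size dz = mean(diff) / sd(diff),
# two-tailed alpha = 0.05 (approximates the G*Power matched-pairs output).
dz = diff.mean() / diff.std(ddof=1)
power = TTestPower().power(effect_size=dz, nobs=len(diff), alpha=0.05,
                           alternative="two-sided")

print(f"Shapiro-Wilk p={p_shapiro:.3f}, test p={p_value:.4f}, dz={dz:.2f}, power={power:.2f}")
```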

Results

A total of 18 students enrolled in the elective and completed the pre-course simulation. These were all preclinical students with minimal exposure to simulation and no previous exposure to this particular simulation case. Out of this cohort, 2 students withdrew from the elective over the course of the semester; the remaining 16 students participated in the post-course simulation (88.9%) and were included in the final analysis (Table 1 for demographics). These comprised 14 first-year and 2 second-year medical students.

The total rubric scores for each individual are summarized in Figure 2. Thirteen of the 16 students had an improved performance on the second iteration of the simulation after taking the anesthesia preclinical elective, with a maximum score improvement of 10 points on the 20-point rubric.

Figure 2 Graded total simulation scores across Stages (a-e) for preclinical medical students enrolled in anesthesia elective, performed at the start and end of the elective. The maximum score possible was 20 points.

A detailed breakdown of aggregate scores for each stage is presented in Table 3. Participants significantly improved in 4 of the 5 stages, the exception being Stage (e): PACU Management. The average total score for the first simulation attempt was 4.3 ± 4.2 points. The average total score for the second simulation attempt following clinical exposure was 8.8 ± 2.8 points (p < 0.001). Figure 3 highlights this expected global improvement in clinical performance.

Table 3 Summary Scores of 16 Preclinical Medical Students in an Anesthesia Simulation Scenario Performed Before and After a Preclinical Elective

Figure 3 Distributions of total scores (sum of all 5 stages) for preclinical medical students in the standardized anesthesia simulation scenario pre- and post- elective. (i) are the initial total scores of the 16 students, and (ii) are the follow-up scores. Bins are structured as [0,1), [1,2), etc.

Stage (d): Emergence and Post-operative Pain had the lowest mean pre-score at 0.3, while Stages (b): Airway Management and (e): PACU Management had the highest mean pre-scores at 1.3 points. Following several weeks of the elective, Stage (d) remained the lowest with a mean post-score of 0.9, while Stage (a): Basic Induction had the highest mean post-score at 2.4 points. Interestingly, Stages (a) and (c): Hemodynamic Changes had the strongest significant improvements (p < 0.001), Stages (b) and (d) had weaker but still significant improvements (p = 0.021 and p = 0.048 respectively), and Stage (e) had no significant improvement (p = 0.184). Performance varied widely among participants, as reflected by large standard deviations.

Discussion

There is a notable gap in the literature for objective, performance-based simulation studies of medical students in anesthesia: most published work describes medical student simulations in other specialties8–12 or anesthesia simulations limited to residents and attendings.4–7 This single-center prospective study aimed to help fill this gap and to identify specific anesthesia content areas, if any, needing more emphasis in the education of preclinical medical students interested in anesthesia after a full-semester anesthesia elective. Although conclusions are narrowed by the study limitations detailed below, our findings suggest the need for early intervention in the education of postoperative care management, given the absence of improved clinical performance in this domain after the semester (p = 0.184). Interestingly, prior studies of resident and attending anesthesia simulations have notably excluded evaluations of emergence and PACU preparation skills altogether,5–7 which suggests that this may be a far-reaching education gap across the specialty.

PACU management is a critically important part of perioperative care with potential for significant risk. An analysis of claims brought against US anesthesiologists for harm occurring in the PACU frequently identified respiratory, nerve, and airway injuries, and over half of the cases resulting in patient death cited missed or delayed diagnoses in the PACU as a causative factor.21 Techniques such as airway management and regional anesthesia are considered synonymous with anesthesiology education, leaving little time dedicated to postoperative care management. In a multi-study analysis in which over 70% of included studies involved residents, education surrounding an integral part of PACU management, anesthesiologist handoffs, was found to be suboptimal.22 Theoretically, our finding that PACU management skills did not improve after the elective may reflect a lack of student enthusiasm for the topic or inadequate exposure. Practically, it implies that while intraoperative patient care might improve with learner experience, postoperative patient outcomes might disproportionately stagnate unless our preclinical elective (and anesthesia didactics broadly) are redesigned to include more postoperative exposure and instruction. Understanding this bias can allow medical educators to specifically target postoperative care education through workshops or dedicated time for learning this important phase of care.

The simulation was modeled after what is expected of residents in their training. In this study, multiple skills were assessed, such as identifying appropriate medications, recognizing and correcting improper airway management, responding to emergent vital sign changes, and transitioning the patient to the PACU. This differs from the assessment tools used in a previously reported anesthesia simulation study conducted among medical students. In that study, third- and fourth-year students interacted with a mannequin in three realistic simulations that included malignant hyperthermia and pulmonary embolism. Despite the quality of the clinical scenarios, evaluation measures were limited to self-assessments, and details and results of the assessments were not provided.23 Another study involving first- and second-year medical students in 14 simulation cases similarly lacked real-time performance evaluation and instead relied on participants’ self-completed multiple-choice pre- and post-test grades on knowledge-based questions.24 Based on our literature review, the sole study of student performance resulting from anesthesia simulation training was conducted with veterinary, not medical, students.25 Our study uniquely presents a quality simulation scenario with objective performance metrics, free from participant self-assessment bias through the use of an external assessor, that are fully transparent to colleagues for future reproducibility. Since the main purpose of medical simulation training is to enhance learner competency in performing given tasks, it is essential to use such objective metrics to evaluate the real-time application of participants’ knowledge.

This study has limitations, including a small sample size limited by the number of students enrolled in the preclinical elective. Our post hoc power analysis shows limited power for analyses without strong (p < 0.001) statistical significance, especially for PACU management, in which true differences between pre- and post-elective performance may exist. In addition, the lack of a control group limits causal inference regarding the impact of the preclinical elective itself on performance improvement, although attribution to the elective is reasonable given our institution’s lack of an anesthesia-related core curriculum. The simulation facilitators were also not blinded, adding an element of bias to the post-course scoring. There is a degree of selection bias in our sample, as the students volunteered to enroll in the elective and all were from a single institution; it is unclear in what direction and magnitude, if any, this selection bias would affect the results. Despite the construction of the rubric for easy incorporation elsewhere, this study’s results cannot be readily generalized beyond our single institution. Finally, the simulation was a low-fidelity simulation with a prepared scenario, an airway mannequin, and a simulated monitor, but it was conducted in a classroom rather than a mock operating room, which reduces the level of realism for the student.

This novel study demonstrates the successful implementation of an objective performance-based evaluation of medical students using an anesthesiology simulation after a preclinical anesthesiology elective. Specific anesthesia content domains that may need additional instructional support from educators are identified.

Conclusion

Anesthesiology simulations are a popular and high-impact educational tool for anesthesiology training and a valuable means of assessing clinical performance. This study demonstrates that a clinical anesthesiology simulation can help identify specific relative strengths and weaknesses in preclinical medical students’ clinical performance after a preclinical anesthesiology elective. Future initiatives for early medical student exposure to the field of anesthesia should be developed with greater emphasis on learning postoperative management.

Data Sharing Statement

All study data will be shared publicly upon request.

Ethical Considerations

This study underwent review by the Rhode Island Hospital Institutional Review Board (IRB2083944-3).

Consent to Participate

Per the Rhode Island Hospital Institutional Review Board, this study was exempt from Human Subjects Research under 45 Code of Federal Regulations 46.104(d) requirements and did not require consent documentation (IRB2083944-3).

Acknowledgments

The authors thank the Warren Alpert Medical School of Brown University and its preclinical elective student participants.

Funding

This study was not funded and the authors report no financial disclosures pertaining to this manuscript.

Disclosure

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

1. Ly EI, Catalani BS, Boggs SD, McGreevey KE, Updegraff AB, Steadman JL. The anesthesiology clerkship: a requisite experience in medical education. Ochsner J. 2020;20(3):250–254. doi:10.31486/toj.20.0094

2. Walsh KL, Yadav A, Cradeur M, et al. Impact of a preclinical medical student anesthesiology elective on the attitudes and perceptions of medical students regarding anesthesiology. Adv Med Educ Pract. 2023;14:1347–1355. doi:10.2147/AMEP.S427974

3. Sawka D, Yadav A, Kendall M, Diorio M, Asher SR. The impact of a new anesthesiology residency program on the number of medical students matching into anesthesiology at a single institution: a retrospective longitudinal study. Cureus. 2023;15(12):e50677. doi:10.7759/cureus.50677

4. Klein MR, Schmitz ZP, Adler MD, Salzman DH. Simulation-based mastery learning improves emergency medicine residents’ ability to perform temporary transvenous cardiac pacing. West J Emerg Med. 2022;24(1):43–49. doi:10.5811/westjem.2022.10.57773

5. Weinger MB, Banerjee A, Burden A, et al. Simulation-based assessment of the management of critical events by board-certified anesthesiologists. Anesthesiology. 2017;127(3):475–489. doi:10.1097/ALN.0000000000001739

6. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology. 2007;107(5):705–713. doi:10.1097/01.anes.0000286926.01083.9d

7. Abrahamson S, Denson J, Wolf R. Effectiveness of a simulator in training anesthesiology residents. Qual Saf Health Care. 2004;13(5):395–397. doi:10.1136/qhc.13.5.395

8. McInerney N, Nally D, Khan MF, Heneghan H, Cahill RA. Performance effects of simulation training for medical students – a systematic review. GMS J Med Educ. 2022;39(5):Doc51. doi:10.3205/zma001572

9. Alluri RK, Tsing P, Lee E, Napolitano J. A randomized controlled trial of high-fidelity simulation versus lecture-based education in preclinical medical students. Med Teach. 2016;38(4):404–409. doi:10.3109/0142159X.2015.1031734

10. DeWaay DJ, McEvoy MD, Kern DH, Alexander LA, Nietert PJ. Simulation curriculum can improve medical student assessment and management of acute coronary syndrome during a clinical practice exam. Am J Med Sci. 2014;347(6):452–456. doi:10.1097/MAJ.0b013e3182a562d7

11. Kern DH, Mainous AG, Carey M, Beddingfield A. Simulation-based teaching to improve cardiovascular exam skills performance among third-year medical students. Teach Learn Med. 2011;23(1):15–20. doi:10.1080/10401334.2011.536753

12. Dayal AK, Fisher N, Magrane D, Goffman D, Bernstein PS, Katz NT. Simulation training improves medical students’ learning experiences when performing real vaginal deliveries. Simul Healthc. 2009;4(3):155–159. doi:10.1097/SIH.0b013e3181b3e4ab

13. Long S, Sorrels C, Cook R, et al. Early exposure to anesthesiology: a summer preceptorship program for first-year medical students. Proc. 2024;37(2):274–276.

14. ACGME Program Requirements for Graduate Medical Education in Anesthesiology. ACGME 2022. Available from: https://www.acgme.org/globalassets/pfassets/programrequirements/040_anesthesiology_2022.pdf. Accessed June 17, 2024.

15. Mossenson A. Airway management. VAST Scenario Bank. Available from: https://www.dropbox.com/sh/uks5o0s15mhee50/AADrkB5DvCzoMVYFKk_RFH_ga/Airway%20management?dl=0&subfolder_nav_tracking=1. Accessed November 27, 2023.

16. Mossenson AI, Tuyishime E, Rawson D, et al. Promoting anesthesia providers’ non-technical skills through the vital anaesthesia simulation training (VAST) course in a low-resource setting. Br J Anaesth. 2020;124(2):206–213. doi:10.1016/j.bja.2019.10.022

17. Tuyishime E, Mossenson A, Livingston P, et al. Resuscitation team training in Rwanda: a mixed method study exploring the combination of the VAST course with advanced cardiac life support training. Resusc Plus. 2023;15:100415. doi:10.1016/j.resplu.2023.100415

18. Andersen PO, Jensen MK, Lippert A, Østergaard D, Klausen TW. Development of a formative assessment tool for measurement of performance in multi-professional resuscitation teams. Resuscitation. 2010;81(6):703–711. doi:10.1016/j.resuscitation.2010.01.034

19. G*Power: statistical power analyses for Mac and Windows. Heinrich Heine Universität Düsseldorf. Available from: https://www.psychologie.hhu.de/arbeitsgruppen/allgemeine-psychologie-und-arbeitspsychologie/gpower. Accessed July 16, 2025.

20. Faul F, Erdfelder E, Lang A-G, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods. 2007;39(2):175–191. doi:10.3758/bf03193146

21. Kellner DB, Urman RD, Greenberg P, Brovman EY. Analysis of adverse outcomes in the post-anesthesia care unit based on anesthesia liability data. J Clin Anesth. 2018;50:48–56. doi:10.1016/j.jclinane.2018.06.038

22. Riesenberg LE, Davis R, Heng A, Vong Do Rosario C, O’Hagan EC, Lane-Fall M. Anesthesiology patient handoff education interventions: a systematic review. Jt Comm J Qual Patient Saf. 2023;49(8):394–404. doi:10.1016/j.jcjq.2022.12.002

23. Jenkins KD, Stroud JM, Bhandary SP, et al. High-fidelity anesthesia simulation in medical student education: three fundamental and effective teaching scenarios. Int J Acad Med. 2017;3(1):66–71. doi:10.4103/IJAM.IJAM_45_17

24. Jabaay MJ, Marotta DA, Aita SL, et al. Medical simulation-based learning outcomes in pre-clinical medical education. Cureus. 2020;12(12):e11875. doi:10.7759/cureus.11875

25. Jones JL, Rinehart JJ, Englar RE. The effect of simulation training in anesthesia on student operational performance and patient safety. J Vet Med Educ. 2019;46(2):205–213. doi:10.3138/jvme.0717-097r
