Mastery of Status Epilepticus Management via Simulation-Based Learning for Pediatrics Residents
Abstract
Background
Management of status epilepticus (SE) in the pediatric population is highly time-sensitive. Failure to follow a standard management algorithm may stem from ineffective provider education and can lead to unfavorable outcomes.
Objective
To design a learning module using high-fidelity simulation technology to teach a hospital algorithm for managing SE to a mastery standard.
Methods
Thirty pediatrics interns were enrolled. Using the Angoff method, an expert panel set the minimal passing score, which defined mastery. Simulated performance was scored by 2 observers, and sessions were digitally recorded. After the pretest, participants were debriefed on the algorithm and required to repeat the simulation. If mastery (the minimal passing score) was not achieved, the debriefing and simulation were repeated until it was. Once mastery was achieved, participants rated their comfort level in managing SE.
Results
No participants achieved mastery at pretest. After debriefing and deliberate simulator training, all (n = 30) achieved mastery of the algorithm: 30% after 1 posttest, 63% after a second, and 6.7% after a third. The Krippendorff α was 0.94, indicating strong interrater agreement. Participants reported greater self-efficacy in managing SE and a preference for simulation-based education for learning practice-based algorithms for critical conditions, and they rated the educational intervention highly.
Conclusions
A simulation-based mastery learning program using deliberate practice markedly improved pediatrics residents' execution of an SE management protocol. Participants enjoyed and benefited from simulation education. Future applications include improving adherence to other hospital protocols.
Introduction
Encountering seizures is common for pediatrics residents working on inpatient wards,1 and a rotation in pediatric neurology is a requirement of the Accreditation Council for Graduate Medical Education.2 In our experience, teaching the time-sensitive management of potentially life-threatening conditions, including seizures, remains a challenge. Ineffective management of seizures in adults can lead to status epilepticus (SE) and potentially worse patient outcomes.1,3 At our institution, first-year residents (interns) are the primary providers and are expected to execute timely management to avoid progression to SE.
Simulation has consistently proved to be a valuable educational tool for residents, providing opportunities for safe, deliberate practice and clinical skills acquisition. It has demonstrated transfer of skills to actual clinical scenarios, which has led to improved patient care and outcomes.4–6 Simulation technology has been shown to help residents reach mastery learning standards.4,7–9 Mastery learning is a rigorous form of competency-based education that provides a method to objectively assess competency in a particular skill or task.10,11 Previous studies of simulation-based mastery learning have demonstrated improvement in residents' skills and adherence to protocols.4,8,9,12,13
To provide a standardized management guideline for seizures, we developed a protocol at our institution (figure 1), which was made available electronically and placed in every "code book" in the hospital. The algorithm is introduced didactically once during residents' first year and is provided in handbooks for second-year residents. Interns were exposed to this protocol only when encountering inpatient seizures, and we knew anecdotally that they gained less experience in managing this critical condition than their senior counterparts. An informal survey-based needs assessment, distributed by the authors to all residents, showed that interns were uncomfortable managing SE and had difficulty recalling the protocol. Given these findings and the absence of a formal curriculum, we hypothesized that interns would benefit from a learning intervention. The purpose of this study was (1) to develop a mastery learning simulation intervention to address this skill and knowledge deficit, and (2) to assess its impact on performance and self-efficacy.
Methods
Setting and Participants
The study was conducted at the Ann & Robert H. Lurie Children's Hospital of Chicago, a tertiary care facility with a categorical 3-year pediatric residency of 93 residents. The 30 participants were interns in the Northwestern University Pediatric Residency Program. Participants were informed about the study by e-mail and invited to participate voluntarily. Informed consent was obtained before the pretest. None of the 30 participants dropped out of the study.
Intervention
This study was a quasi-experimental, single-group, pretest-posttest, simulation-based mastery learning educational intervention on the management of SE.
Simulation scenario and script development relied on the SE management algorithm developed locally (figure 1), which is based on the standard of care.14,15 In the scenario, a 2-year-old child develops tonic-clonic seizures, requiring recall and practical application of the SE algorithm. The scenarios were performed in our kidSTAR Simulation Lab using the SimNewB simulator (Laerdal Medical), an immersive simulator capable of vital sign changes and tonic-clonic movements. Each scenario was executed in a standardized examination room to ensure a high-fidelity environment. Using a familiar space resembling a typical inpatient room provided opportunity for situational awareness, a key component of successfully managing a critically ill patient.16–19 Items in the room included a non-rebreather oxygen mask, a continuous cardiopulmonary monitor, and a crash cart. Creating a highly realistic scenario was considered important for mastery achievement.
The "nurse" was played by the same individual in every scenario to maintain standardization and minimize bias. The simulator was likewise operated by a single dedicated individual for the entire intervention to minimize variability, following a standardized, timed, step-by-step script.
The SE algorithm was used to develop a 22-item observational scoring checklist, which mirrored the script (figure 2). Items were scored dichotomously (0 = not done or done incorrectly; 1 = done correctly). An expert panel of pediatric neurologists familiar with resident training expectations reviewed the scenario and scoring checklist. The mastery learning scenarios were performed over a 3-month period to minimize time bias.
Each resident was scheduled individually for the simulation, allowing sufficient time for a pretest to assess baseline knowledge and for individualized education in a separate debriefing room. Without knowing the case content, the participant first performed the simulation (pretest) and, after scoring, returned for debriefing. During each debriefing, participants were taught each step of the algorithm and checklist in detail and received individualized feedback on their performance, including how to perform each step correctly.
After debriefing, participants repeated the identical simulation scenario (posttest), which did not vary in content from the pretest. If the minimum passing score (MPS) was not met at this point, the debriefing and scenario were repeated, with the same level of detail and individualized feedback, until mastery was achieved.
After the final debriefing, participants reported self-efficacy levels in managing SE at the pretest and posttest(s), and answered questions about learning preference and overall evaluation of the intervention.
Two observers scored the scenarios to establish interrater reliability. Both were trained, through repeated practice, in the standardized protocol and scoring, and each was blinded to the other's results. The scorers did not know the participants professionally or personally, minimizing scoring bias. Sessions were video recorded to allow measurement of interrater reliability.
Outcomes
The primary measures of baseline knowledge were recall and performance of the required management algorithm checklist steps, obtained in a pretest simulation scenario. Posttest simulation scenarios provided comparative performance measurements. The primary outcome measures were performance at posttest(s) and achievement of the MPS. Secondary outcomes were the results of a postintervention survey, which assessed comfort level in managing SE on a Likert scale (1, very uncomfortable, to 10, very comfortable), learning preferences, and overall evaluation of the intervention.
The MPS was based on the observational scoring checklist. The expert panel was joined by another physician with a background in graduate medical education, and the panel determined the MPS via the Angoff standard-setting method.20 Each expert rated all 22 checklist items, estimating the proportion of minimally competent residents who would perform each step adequately. Ratings were averaged to compute a raw cutoff score, which determined the MPS (ie, the mastery standard) of 77% (17 of 22 checklist items).
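For readers who wish to see the standard-setting arithmetic concretely, the sketch below illustrates the Angoff calculation with hypothetical panel ratings (the actual ratings are not reported here); only the 22-item checklist length and the idea of averaging ratings to a whole-item cutoff are taken from the study, and the rounding convention shown is an assumption.

```python
# Minimal sketch of the Angoff standard-setting arithmetic (hypothetical data).
# Each value is one expert's estimate of the proportion of minimally competent
# residents expected to perform that checklist item correctly.
import math

N_ITEMS = 22

# rows = experts, columns = the 22 checklist items (illustrative values only)
ratings = [
    [0.80, 0.70, 0.90, 0.75, 0.80, 0.70, 0.85, 0.80, 0.70, 0.75, 0.80,
     0.70, 0.80, 0.75, 0.85, 0.70, 0.80, 0.75, 0.70, 0.80, 0.75, 0.80],
    [0.75, 0.80, 0.85, 0.70, 0.80, 0.75, 0.80, 0.70, 0.75, 0.80, 0.70,
     0.80, 0.75, 0.80, 0.70, 0.75, 0.80, 0.70, 0.80, 0.75, 0.80, 0.70],
]

# Average all ratings to obtain the raw cutoff proportion.
all_ratings = [r for expert in ratings for r in expert]
raw_cutoff = sum(all_ratings) / len(all_ratings)

# Convert the proportion to a whole-item minimum passing score (MPS);
# rounding up to the next whole item is an assumption for illustration.
mps_items = math.ceil(raw_cutoff * N_ITEMS)  # eg, ~0.77 * 22 -> 17 items

print(f"Raw cutoff: {raw_cutoff:.2f} -> MPS: {mps_items} of {N_ITEMS} items")
```

With a mean rating near 0.77, this procedure reproduces the 17-of-22 mastery standard described above.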
The study was approved by the Institutional Review Board at the Ann & Robert H. Lurie Children's Hospital of Chicago Children's Research Center.
Analysis
Score differences from pretest to posttest were analyzed using paired t tests (Stata version 11.2, StataCorp LP). All posttest scores were analyzed in aggregate. Interrater reliability of scoring was calculated with the Krippendorff α.21 The surveys used a Likert scale to estimate self-efficacy and overall rating of the intervention, and yes/no answers for learning preferences.
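Although the analysis was performed in Stata, the same statistics can be approximated with open-source tools. The following is a minimal, non-authoritative sketch using made-up scores; scipy.stats.ttest_rel for the paired t test and the krippendorff Python package for interrater agreement are substitutions for the study's Stata workflow, not the authors' code.

```python
# Illustrative reanalysis sketch (hypothetical data; the study used Stata 11.2).
# Requires: pip install numpy scipy krippendorff
import numpy as np
from scipy import stats
import krippendorff

# Hypothetical checklist totals (out of 22 items) for a few participants.
pretest = np.array([8, 10, 7, 9, 11, 6])
posttest = np.array([17, 18, 17, 17, 19, 17])

# Paired t test comparing pretest and final posttest scores.
t_stat, p_value = stats.ttest_rel(pretest, posttest)
print(f"paired t = {t_stat:.2f}, P = {p_value:.4f}")

# Interrater agreement: two raters' dichotomous item scores (0/1) for one
# recorded session, arranged as raters x items.
rater_scores = np.array([
    [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1],
    [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1],
])
alpha = krippendorff.alpha(reliability_data=rater_scores,
                           level_of_measurement="nominal")
print(f"Krippendorff alpha = {alpha:.2f}")
```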
Results
At pretest, no participant demonstrated satisfactory performance or met the MPS. All participants achieved mastery of the algorithm after debriefing and deliberate practice; the majority required 2 simulation and debriefing sessions (figures 3 and 4). There was a 59% improvement from pretest to posttest, which was statistically significant (P < .001). No participant performed more than the 17 checklist items required to meet the MPS. Interrater reliability between the 2 scorers was high (Krippendorff α = 0.94).
Participants reported an improvement in self-efficacy for managing SE from pretest to posttest (median rating, 3 of 10 vs 7 of 10, respectively). All participants rated the educational intervention highly (median rating, 8 of 10). All reported a preference for simulation-based learning with debriefing over other didactic models and felt that it had better prepared them to manage SE.
Discussion
Our study demonstrates that knowledge of a hospital-wide SE management protocol was deficient in our cohort of interns, and that the entire group met the mastery learning standard after our educational intervention. While similar studies have involved internal medicine residents, this is the first mastery simulation study we are aware of involving pediatrics residents and pediatric SE.8,9,12,22
No participant achieved the MPS at pretest, confirming the general impression from our needs assessment. This low achievement highlighted a significant gap between interns' knowledge and the level expected by the expert panel. Possible reasons include ineffective provider education in the didactic setting and clinical inexperience with SE. In the practice setting, the first responder typically initiates management, but this may be interrupted once a higher-level provider arrives, potentially preventing trainees from performing the algorithm in full and attaining experiential learning. Such inexperience with managing this condition may lead to negative experiences that curtail skill acquisition.23–25
Educating new providers on patient management protocols early in postgraduate training is important. The majority of residents required 2 simulation and debriefing sessions to achieve mastery, demonstrating the complexity of managing this condition. This highlights the often-underestimated detail of institutional protocols and the expectation of immediate recall during actual scenarios.26 Without repeated practice, critical protocols are at risk of being underperformed, with implications for patient safety and hospital best practices. The intervention was feasible, welcomed, and preferred over didactic sessions, consistent with previous studies.27
Overall, the estimated time requirement for our simulation intervention, including staffing, development and implementation, simulation laboratory scenario operation, and individualized feedback, was approximately 120 hours. An intervention similar to ours could be helpful for institutions wanting robust provider education when rolling out complex, time-sensitive protocols.
Translation of performance from the simulation laboratory to actual clinical scenarios remains an important consideration, with research showing a positive relationship between simulation and patient outcomes.4,12,19,28–31 Future research exploring patient outcomes could provide additional meaningful information on this study's translatability.
A major component of our intervention was the debriefing session(s), which provided a vehicle for standardized deliberate practice. We found it important to (re-)educate participants on how to achieve mastery at each debriefing. Repeated joint review of the algorithm between posttests reinforced the information, clarified questions, provided feedback, and built a foundation for deliberate practice.5 Interestingly, a few residents admitted that their first encounter with the algorithm was during the debriefing, revealing an unexpected educational deficit. Simulation-based interventions with dedicated debriefings can thus help close the knowledge gap that develops secondary to "missed practice opportunities."22
Another important component was standardization across all scenarios. Because the number of posttests required to meet mastery was unpredictable and varied among participants, this standardization had to be maintained every time.5 Keeping the same individuals in the "nurse" and simulator operator roles, avoiding prompting, adhering to the timing and script, and scoring with an objective checklist all promoted standardization. This helped ensure that mastery learning was achieved largely through deliberate practice and education, without undue outside influence.
Our study had several limitations, including the small sample size, the single-institution setting, and a sample limited to interns. There was no control group with which to compare less intensive, and possibly less costly, educational approaches. We also did not assess knowledge retention.
The objective of this specific study was to design and evaluate an education intervention; a future investigation could study the effect of the intervention on patient outcomes.
Conclusion
Pediatrics residents can achieve mastery of a critical SE management algorithm after high-fidelity simulation and deliberate practice. The study group found simulation enjoyable, reported feeling better prepared to manage SE, and preferred simulation learning over traditional didactic methods for learning critical protocols.

Figure 1. Inpatient Guidelines for Management and Evaluation of Status Epilepticus

Figure 2. Scoring Checklist for Simulation Scenario

Figure 3. Pretest and Posttest Performance of the Status Epilepticus Management Algorithm

Figure 4. Percentage of Residents Reaching the Minimal Passing Score, by Scenario
Author Notes
Marcelo R. Malakooti, MD, is Instructor, Division of Critical Care Medicine, Department of Pediatrics, Northwestern University Feinberg School of Medicine; Mary E. McBride, MD, is Assistant Professor, Division of Cardiology and Critical Care Medicine, Department of Pediatrics, Northwestern University Feinberg School of Medicine; Bonnie Mobley, BSN, Clinical and Organizational Development, Department of Nursing, Ann & Robert H. Lurie Children's Hospital of Chicago; Joshua L. Goldstein, MD, is Associate Professor, Division of Pediatrics-Neurology, Department of Neurology, Northwestern University Feinberg School of Medicine; Mark D. Adler, MD, is Associate Professor, Division of Emergency Medicine, Department of Pediatrics, Northwestern University Feinberg School of Medicine; and William C. McGaghie, PhD, is Professor of Medical Education, Department of Medical Education, Loyola University Chicago Stritch School of Medicine.
Funding: The authors report no external funding source for this study.
Conflict of interest: The authors declare they have no competing interests.
The authors would like to thank the McGaw/Lurie Children's pediatrics residents who took the time to participate in this study; Dr Mark Wainwright for his recommendations regarding adherence to the seizure protocol in the scenario design, and for allowing his algorithm to be the foundation of this study; and the expert panel for their time and contribution to education.



