Brief Educational Intervention Improves Content of Intern Handovers
Abstract
Background
The Accreditation Council for Graduate Medical Education requires residency programs to ensure safe patient handovers and to document resident competency in handover communication, yet there are few evidence-based curricula teaching resident handover skills.
Objective
We assessed the immediate and sustained impact of a brief educational intervention on pediatrics intern handover skills.
Methods
Interns at a freestanding children's hospital participated in an intervention that included a 1-hour educational workshop on components of high-quality handovers, as well as implementation of a standardized handover format. The format, SAFETIPS, includes patient information, current diagnosis and assessment, patient acuity, a focused plan, a baseline exam, a to-do list, anticipatory guidance, and potential pointers and pitfalls. Important communication behaviors, such as paraphrasing key information, were addressed. Quality of intern handovers was evaluated using a simulated encounter 2 weeks before, 2 weeks after, and 7 months after the workshop. Two trained, blinded, independent observers scored the videotaped encounters.
Results
All 27 interns rotating at the Children's Hospital consented to participate in the study, and 20 attended the workshop. We included all participant data in the analysis, regardless of workshop attendance. Following the intervention, intern reporting of patient acuity improved from 13% to 92% (P < .001), and gains were maintained 7 months later. Rates of key communication behaviors, such as paraphrasing critical information, did not improve.
Conclusions
A brief educational workshop promoting standardized handovers improved the inclusion of essential information during intern handovers, and these improvements were sustained over time. The intervention did not improve key communication behaviors.
Editor's Note: The online version of this article contains the SAFETIPS card used in this study.
Introduction
Less-than-adequate handovers by residents have been associated with poor patient outcomes.1,2 The Accreditation Council for Graduate Medical Education requires residency programs to monitor safe patient handovers and to document resident competency in handover communication.3 Although programs must comply with these requirements, few evidence-based curricula exist on this subject,4 and our project aims to address this gap.
We developed a multifaceted intervention for pediatrics interns at the Medical College of Wisconsin, and we evaluated its effectiveness in a prospective study with baseline, postintervention, and year-end handover skills assessments. Our hypothesis was that interns would demonstrate significant improvements in inclusion of key content during handover communication following the curriculum implementation, and that these gains would be sustained at year's end.
Methods
Development of Educational Intervention
Our educational intervention had 3 elements. The first was a novel standardized handover format that was developed using principles from Arora and Johnson's seminal paper on handovers.5 A pool of content for the pediatrics intern handover was collected from multiple sources, including resident interviews, a faculty focus group, and review of the handover literature. Essential items were formulated into a mnemonic: SAFETIPS (the SAFETIPS card is provided as online supplemental material).
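For illustration, the sketch below represents the SAFETIPS elements as a structured handover record that could back a completeness check. This is a hypothetical sketch, not part of the published curriculum or card; the field names are our own, inferred from the components listed in the Abstract.

```python
# Hypothetical sketch: the SAFETIPS content elements as a structured record.
# Field names are inferred from the components listed in the Abstract; they
# are not the official SAFETIPS definitions.
from dataclasses import dataclass, field, fields
from typing import List


@dataclass
class SafetipsHandover:
    patient_information: str = ""      # identifiers, age, weight, allergies
    assessment: str = ""               # current diagnosis and assessment
    acuity: str = ""                   # indicator of how sick the patient is
    focused_plan: str = ""             # overnight plan of care
    baseline_exam: str = ""            # pertinent baseline exam findings
    to_do: List[str] = field(default_factory=list)  # tasks for the receiving intern
    anticipatory_guidance: str = ""    # "if X happens overnight, do Y"
    pointers_and_pitfalls: str = ""    # patient-specific pearls and warnings

    def missing_elements(self) -> List[str]:
        """Names of elements left blank, e.g., for a quick completeness check."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]
```

A record such as this could, for example, drive a checklist similar to the one our observers used to score whether each element was present.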
The second element of the educational intervention was an hour-long educational workshop. The workshop used brief didactics, interactive group discussions, case-based examples, and opportunities for practice with faculty supervision. Handover topics included the importance of standardization, essential content, the preferred environment, and active recipient behaviors, representing best practices from the handover literature.4, 6-12 The SAFETIPS format also was introduced (the curriculum used in this study has been published on the Association of American Medical Colleges' MedEdPortal13).
The third element of the educational intervention was obtaining widespread resident and program leadership support for the new standardized format. Quality improvement science emphasizes the importance of obtaining buy-in from key stakeholders.5,14 The department chair and program director approved the project, and critical resident support was obtained by involving residents in all phases. Interns and senior residents provided input during the development of SAFETIPS, and chief residents developed expectations for compliance.
Evaluation Methods
Eligible participants for this study included all pediatrics and internal medicine–pediatrics interns rotating at the Children's Hospital of Wisconsin during the study period. Participants completed handover skills assessments in the simulation laboratory at 3 time points: approximately 2 weeks before the educational intervention, 2 weeks after the intervention, and 7 months after the intervention.
Institutional Review Board approval was obtained, and all participants provided informed consent.
All 27 interns enrolled in the study; because of scheduling constraints, only 20 attended the workshop, and attendance at the skills assessments varied (preintervention, n = 23; postintervention, n = 25; year end, n = 27). An intention-to-treat analysis was performed, so all participant data were included regardless of workshop attendance. Residents who did not attend the workshop were still exposed to the 2 other components of the intervention: implementation of a standardized format and the expectation of compliance.
Skills Assessment Session
During the handover skills assessments, participants received a single patient case in the form of a history and a physical. Cases were derived from actual general pediatric inpatients, reflecting varying levels of complexity and acuity. Residents were instructed to hand over the simulated patient as they would a real patient for overnight coverage. The handover was videotaped for later evaluation. All participants completed a brief questionnaire. Patient scenarios and instructions were evaluated in a pilot study and modified for clarity.
Measures
The primary outcome measure was the inclusion of key content in the handover communication. Five content areas were selected as the most relevant and specific to patient handovers: patient assessment, pertinent baseline exam findings, to-do tasks, anticipatory guidance, and indicators of patient acuity.
A secondary outcome measure was the performance of 2 communication behaviors for recipients of handover information: repeating back critical information and asking clarifying questions. These behaviors were briefly addressed in the workshop, but they were not the primary target of the intervention.
Data Collection
Two independent, blinded, trained observers scored the handover simulations. The observers used a checklist containing the aforementioned items and checkboxes for “yes” (item present in handover encounter) or “no” (item not present). Observers were trained using videotaped handover simulations from a pilot study, and a faculty member familiar with the resident handover process gave feedback. Observers separately reviewed the deidentified, randomly sorted study encounter videos, and they were blinded to the timing of the encounter, the names of participants, and participants' prior training. Each observer rereviewed a random subset (10 of 75) of encounters to allow determination of intrarater reliability.
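As a minimal sketch of this review workflow, the code below shuffles deidentified encounter identifiers into a blinded review order and draws the random subset flagged for rereview; the identifiers, seed, and output are illustrative assumptions, not study artifacts.

```python
# Hypothetical bookkeeping for the blinded video review: deidentified
# encounters are shuffled so observers cannot infer timing or identity,
# and a random 10-of-75 subset is flagged for rereview to estimate
# intrarater reliability. IDs and the seed are illustrative only.
import random

rng = random.Random(20121)  # fixed seed keeps the review order reproducible

encounter_ids = [f"encounter_{i:03d}" for i in range(1, 76)]  # 75 deidentified videos

review_order = list(encounter_ids)
rng.shuffle(review_order)                        # randomly sorted, blinded review order

rereview_subset = sorted(rng.sample(encounter_ids, 10))  # 10 of 75 rereviewed

print("First 5 encounters in review order:", review_order[:5])
print("Rereview subset:", rereview_subset)
```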
Data Analysis
Two-tailed P values from the Fisher exact test were calculated to detect differences between baseline and postintervention handover assessments, as well as sustained differences between baseline and year-end assessments. Intraclass correlation coefficients were calculated to assess the observers' intrarater and interrater reliability.
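The following is a minimal sketch of this analysis using hypothetical numbers: a Fisher exact test on a 2 × 2 table of item inclusion before and after the intervention, and an intraclass correlation computed from example binary rater scores. The counts and scores are invented for illustration, and scipy and pingouin are our tooling assumptions rather than packages named in the study.

```python
# Illustrative analysis sketch; all counts and scores are hypothetical,
# not the study data. scipy and pingouin are assumed tools.
import pandas as pd
import pingouin as pg
from scipy.stats import fisher_exact

# Fisher exact test on inclusion of one content item, pre vs post:
# rows = time point, columns = (item present, item absent).
table = [[5, 15],   # preintervention: item present in 5 of 20 handovers
         [18, 2]]   # postintervention: item present in 18 of 20 handovers
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Fisher exact, 2-tailed P = {p_value:.3g}")

# Interrater reliability: long-format binary checklist scores
# (1 = item present, 0 = absent) from 2 observers across 5 encounters.
scores = pd.DataFrame({
    "encounter": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "rater":     ["A", "B"] * 5,
    "score":     [1, 1, 0, 0, 1, 1, 1, 0, 0, 0],
})
icc = pg.intraclass_corr(data=scores, targets="encounter",
                         raters="rater", ratings="score")
print(icc[["Type", "ICC"]])
```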
Results
Intrarater reliability, measured with the intraclass correlation coefficient, was .83 for observer A and .91 for observer B; combined interrater reliability was .92. All values exceeded the minimum acceptable reliability of .70.
Primary Outcome: Inclusion of Key Content
In the baseline assessments, one content item was rarely present: patient acuity rating (13%). Inclusion of this content improved dramatically following the educational intervention (92%; P < .001), with the improvement sustained at year end (96%; P < .001; figure 1).



Figure 1. Handover Content. *P < .05; **P < .001.
Baseline exam inclusion improved significantly, from 61% preintervention to 92% postintervention (P = .02). At year end, 81% of handover encounters included exam findings, which did not represent a significant sustained increase over preintervention rates. For to-do tasks, 70% of participants included this item preintervention, increasing to 92% postintervention (P = .07); although this increase was not statistically significant, the increase to 96% at year end was (P = .02).
The remaining 2 content items were present at high rates in baseline handover assessments—patient assessment (100%) and anticipatory guidance (87%)—and remained high following the intervention.
Secondary Outcome: Communication Behavior Performance
Clarifying questions were asked 78% of the time at baseline, with no significant change following the intervention or at year's end. Repeat back was rarely performed at baseline (35%), and this fraction did not improve following the intervention (figure 2).



Figure 2. Receiver Behaviors. Differences were not statistically significant.
Discussion
Our study sought to improve the quality of intern handovers via a multifaceted intervention. The intervention achieved both an immediate and a sustained (7 months later) improvement in the inclusion of key content during resident handovers. This sustained impact deserves comment; we believe it is directly related to 2 factors: broad resident buy-in for the intervention and the ease of use of SAFETIPS.
In particular, the widespread adoption of the patient acuity rating is an important achievement. Communicating acuity is critical for patient safety; it enables oncoming providers to recognize and prioritize the sickest patients. Although we did not assess the accuracy of the rating system in this study, a recent study by Edelson et al15 documented the predictive value of a similar rating system for patient acuity by internal medicine residents.
Our analysis of key communication behaviors showed a baseline level of inclusion that is consistent with other studies.16 Our intervention was insufficient to improve performance, a shortcoming we believe was due to inadequate emphasis on the topic. We have therefore revised subsequent workshops to include practice of these behaviors with feedback, and we modified the SAFETIPS card to include a reminder of them (provided as online supplemental material). From a practical perspective, our senior residents monitored intern handovers to ensure quality.
There are several limitations to our study, including its single-site design and small sample size, which limit generalizability. The study also occurred in a simulation laboratory and may not reflect the real clinical environment. It lacked a control group, making it impossible to assess how much intern handover skills might have improved without an intervention; to mitigate this weakness, the baseline skills assessment was separated from the postintervention assessment by only 1 month, making natural improvement less likely. We included all residents in the postintervention analysis regardless of workshop participation; the effect of this intention-to-treat approach would be to underestimate the effect of the intervention. In addition, residents were not blinded to the study, and our results may be subject to attention bias. Finally, we measured the inclusion of key content items rather than the quality of that content, which is a more subjective measure.
Conclusions
A multifaceted intervention, consisting of development of a standardized handover format (SAFETIPS), implementation of a brief educational workshop, and securing buy-in from the residency program, achieved sustained improvements in the inclusion of key content, most notably a rating of patient acuity, in pediatric intern handovers. The intervention did not improve the performance of key communication behaviors.

Author Notes
Erin E. Shaughnessy, MD, is Assistant Professor of Pediatrics, Cincinnati Children's Hospital Medical Center, Cincinnati, OH; Kimberly Ginsbach, MD, is Pediatrics Resident at Rainbow Babies and Children, Cleveland, OH; Nicole Groeschl, MD, is Family Medicine Resident at Columbia St. Mary's, Milwaukee, WI; Dawn Bragg, PhD, is Associate Professor of Pediatrics-Medical Education at the Medical College of Wisconsin, Department of Pediatrics; and Michael Weisgerber, MD, is Associate Professor of Pediatrics in the Department of Pediatrics at the Medical College of Wisconsin.
Funding: The authors report no external funding source for this study.
We gratefully acknowledge Thomas Sitzman, MD, for critical feedback throughout the study and technical assistance with figure design.



