A Novel Comprehensive In-Training Examination Course Can Improve Residency-Wide Scores

Rahul Sharma, MD, Jeremy D. Sperling, MD, Peter W. Greenwald, MD, MS, and Wallace A. Carter, MD
Online Publication Date: 01 Sept 2012
Page Range: 378 – 380
DOI: 10.4300/JGME-D-11-00154.1

Abstract

Introduction

The annual American Board of Emergency Medicine (ABEM) in-training examination is a tool to assess resident progress and knowledge. We implemented a course at the New York-Presbyterian Emergency Medicine Residency Program to improve ABEM scores and evaluate its effect. Previously, the examination was not emphasized and resident performance was lower than expected.

Methods

As an adjunct to required weekly residency conferences, an intensive 14-week in-training examination preparation program was developed that included lectures, pre-tests, high-yield study sheets, and a remediation program. We compared each resident's in-training examination score to the postgraduate year-matched national mean. Scores before and after course implementation were evaluated by repeated measures regression modeling. Residency performance was evaluated by comparing the residency average to the national average each year and by tracking ABEM national written examination pass rates.

Results

Following the course's introduction, the odds of a resident scoring higher than the national average increased by a factor of 3.9 (95% CI 1.9-7.3), and the percentage of residents exceeding the national average increased by 37% (95% CI 23%-52%). Since the course was started, the overall residency mean score has outperformed the national average, and the first-time ABEM written examination board pass rate has been 100%.

Conclusion

A multifaceted residency-wide examination curriculum focused around an intensive 14-week course was associated with marked improvement on the in-training examination.

Editor's Note: The online version of this article contains the curriculum and added details of the program.

Introduction

The in-training examination by the American Board of Emergency Medicine (ABEM) is completed annually by emergency medicine (EM) residents. The examination provides a standardized method to track residents' progress and allows remediation of residents who may subsequently have difficulty passing their ABEM written board examination.1 Poor performance on the in-training examination may result in remediation, academic probation, failure to progress in the program, or ultimately termination. Many academic institutions monitor board examination scores for their programs and use them to gauge their program director's performance and program success.

Emergency medicine residents at New York Presbyterian Medical Center (NYP) are chosen from a highly competitive pool of applicants, and strong test-taking skills are the norm. As we analyzed the performance of our new residency program, we found that residents were exceeding our expectations clinically, but our program's mean on the in-training examination was consistently below the national average before 2007; in 2007 our first residency class had only a 78% pass rate on the ABEM written board examination. We wanted to improve overall in-training examination scores without changing our curriculum or overall teaching philosophy.

Methods

Setting and Participants

In the NYP EM residency program, residents divide their time between Weill Cornell and Columbia University Medical Centers. The emergency departments are tertiary care centers and have a combined volume of greater than 200 000 visits per year. The NYP program has a 4-year format with 10 residents per year (12 as of 2010) and graduated its first class in 2007.

The educational philosophy of NYP EM is based on adult learning theory. We encourage our residents to be independent, self-directed learners. Our weekly conference series is structured to teach evidence-based EM and advanced problem-solving skills, and by design it omits direct teaching to the in-training examination.

Intervention

In 2007 we designed and implemented an intensive 4-month in-training examination program to be taught independently of the required weekly residency educational conference series. The program was taught by board-certified EM faculty. Tuesday evenings were chosen because this allowed maximum resident attendance. Residents who scored below the 30th percentile nationally on the prior year's in-training examination were required to attend at least 50% of sessions and to meet with residency leaders regarding the creation of an individualized study program. Attendance for other residents was voluntary. The program core curriculum was divided into 24 topics that fit into thirteen 120- to 150-minute sessions. Individual sessions emphasized in-training examination topics and covered all major EM subjects. Sessions were designed to be interactive. An audience response system (ARS) was used, and residents were encouraged to ask questions. In addition to core lectures, 3 review sessions were held: an introductory session emphasizing test-taking strategies, a “marathon” 7-hour session, and a final “slam” session. Multiple formats, including visual diagnosis, word associations, and in-training examination–style questions, were used in these review sessions. Additionally, sessions taught test-taking strategies for deducing correct answers to standardized questions. All residents were required to take the pretest and were provided high-yield fact handouts. The curriculum and added details of the program are provided as supplemental information.

Data Analysis

We compared in-training examination scores before and after the 2007 implementation of the study program. Individual in-training examination scores were compared to the postgraduate year–matched national average. A regression model was built to measure the association between a resident taking the program and that resident outperforming the national average. Resident performance was graphed over time. Generalized estimating equation regression, a method for handling repeated measures analysis, was used. This method was preferable to other statistical methods because it accounts for the nonindependent nature of the data (residents who perform well in one year also are likely to perform well in subsequent years).

The significance of in-training examination improvement after the program was also measured by using 1-way repeated measures analysis of variance (ANOVA), and by t test comparing the percentage of residents outperforming the national average before and after the program. Graduating residents' ABEM written board examination pass rates were also noted.
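The odds-ratio concept underlying this analysis can be sketched with a naive 2×2 calculation and a Wald 95% confidence interval. The counts below are made up for illustration and are not the study's raw data; note that such a simple 2×2 table ignores the repeated-measures structure (the same resident contributes scores in multiple years), which is precisely why the authors used generalized estimating equations instead.

```python
import math

# Hypothetical counts of residents above vs at-or-below the national
# average, before and after the course (illustrative only).
pre_above, pre_below = 12, 28
post_above, post_below = 30, 10

odds_pre = pre_above / pre_below    # odds of exceeding the average, pre-course
odds_post = post_above / post_below  # odds of exceeding the average, post-course
odds_ratio = odds_post / odds_pre    # 7.0 for these made-up counts

# Wald 95% CI on the log odds ratio, then exponentiate back
se = math.sqrt(1/pre_above + 1/pre_below + 1/post_above + 1/post_below)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```

A GEE model with a logit link and an exchangeable working correlation would produce an analogous odds ratio while correctly accounting for within-resident correlation across years.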

Resident attendance was as follows: 50% for core lectures, 95% for “marathon” sessions, and 80% for “slam” sessions. Approximately 35% of residents were on remediation and were provided counseling on creating an individualized study program.

Results

After institution of the program, the percentage of residents exceeding the national average rose by 37% (95% confidence interval [CI], 23%–52%). The overall residency mean was consistently below the national average before 2007, but after program implementation the residency mean score outperformed the national average every year (figure 1). After the intervention, the odds of a resident exceeding the national average were 3.9 times higher (95% CI, 1.9–7.3; figure 2). By 1-way repeated measures ANOVA, the change in score associated with the program was statistically significant even after conservative adjustment: P < .004.

FIGURE 1. Average Examination Scores by Year and Postgraduate Year Level

Citation: Journal of Graduate Medical Education 4, 3; 10.4300/JGME-D-11-00154.1

FIGURE 2. Percentage of Residents Who Scored Above the National Average for Their Postgraduate Year


For the 4 years following program implementation, there has been a 100% first-time pass rate on the ABEM written board examination (36 residents). Nationally, the first-time pass rate by EM graduates was 91% during this time period (2008–2011).2 Before program implementation, 7 of 9 graduating NYP residents had passed the written examination on the first attempt.

Discussion

We demonstrate a significant improvement in resident performance on the in-training examination after the institution of an intensive 4-month in-training examination program. Previously reported initiatives with similar goals have not shown a consistent effect on EM resident in-service scores. In one study,3 a 40-hour didactic EM examination program failed to show any residency-wide improvement. In another study,4 a mandatory monthly reading program that incorporated monthly quizzes and discussion had mixed results.

Before the program's implementation, our residency did not have a formal in-training examination preparation program. We believe our program was successful for a number of reasons. First, by being supplemental to the residency's educational program, the program augments rather than supplants traditional conference time. Second, the interactive style facilitated by the ARS engages residents and forces them to answer questions in real time. The time pressure introduced by the ARS simulates test taking. Aggregate group responses from the ARS allow lecturers to gauge overall audience knowledge in real time; they can then focus discussion on knowledge deficits. Finally, the format fosters adult learning and creates an active learning environment, which facilitates material retention; immediate identification of knowledge gaps allows learners to reflect on areas that need further study.

We faced several implementation obstacles. Recruiting faculty to lecture on Tuesday evenings was difficult; faculty perceived the time as beyond normal faculty expectations. Over time, faculty came to appreciate the value the program represented to the residents, and they became more willing to participate. Currently, the program is taught by more than 15 faculty members with no additional compensation. Similarly, residents were initially skeptical about attending sessions on a “free night”; this resistance faded as the educational benefit became clear. Scheduling around clinical obligations was difficult. Schedules were modified to maximize attendance. Special consideration was given to residents on “remediation.” To maximize attendance for the marathon and final slam sessions, the emergency department was staffed with additional faculty, off-service residents, and midlevel providers. The normal residency conference space was used at no expense and the ARS had already been purchased before the implementation of the program. Food was an additional cost since dinner was provided during each session.

Our study has several limitations. It is possible that other factors could have contributed to the improvement in examination scores. Also, with the exception of residents on remediation, review program attendance was not mandatory, and it is possible that the review program's effect could have been even greater if all residents had fully participated. Finally, although the didactic sessions were the cornerstone of the intervention, the improvement in scores may have also been related to a culture change in which residency leadership emphasized the importance of in-training examination preparation.

Conclusions

A multifaceted, residency-wide in-training examination curriculum centered on an intensive 4-month targeted review program was associated with a significant improvement in residency program performance on the in-training examination. We successfully used additional time outside the normal weekly curriculum but within current duty hour limits. The educational program required additional faculty time for the sessions as well as access to an ARS. Although our effort was labor-intensive, we believe it is feasible for other residency programs to enact similar initiatives.

Copyright: 2012
Author Notes

All authors are at the New York Presbyterian Hospital Emergency Medicine Residency Program. Rahul Sharma, MD, is Assistant Professor, Weill Cornell Medical College; Jeremy D. Sperling, MD, is Assistant Professor, Weill Cornell Medical College and Assistant Director, Emergency Medicine Residency; Peter W. Greenwald, MD, MS, is Assistant Professor, Weill Cornell Medical College; and Wallace A. Carter, MD, is Program Director, Emergency Medicine Residency, Associate Professor, Weill Cornell Medical College, and Associate Professor, Columbia University College of Physicians and Surgeons.

Corresponding author: Rahul Sharma, MD, New York Presbyterian Hospital, 525 E 68th St, New York, NY 10065, 212.746.0780, rahul_sharma@hotmail.com

Funding: The authors report no external funding source for this study.

Received: 01 Jul 2011
Accepted: 23 Jan 2012