Implementing the 5 Core Components of Competency-Based Medical Education in US Emergency Medicine Residency Programs

Online Publication Date: 15 May 2025
Page Range: 57 – 63
DOI: 10.4300/JGME-D-24-00639.1

ABSTRACT

Background As graduate medical education programs implement competency-based medical education (CBME) approaches, many specialties struggle to adopt this paradigm in a way that successfully incorporates the 5 core components of CBME.

Objective To develop and implement the 5 core components of CBME within 8 US emergency medicine (EM) residency programs and assess acceptability and feasibility.

Methods We designed an intervention to implement the 5 core components of CBME: (1) an outcomes framework; (2) developmental progression; (3) tailored learning experiences; (4) competency-focused instruction or coaching; and (5) programmatic assessment. A consensus process to develop the framework and developmental trajectory was followed and included the development and deployment of programmatic assessment, coaching programs, and individualized learning plans using a shared model for implementation. We implemented the intervention beginning in August 2021. We surveyed site implementation leads about its feasibility and acceptability.

Results The survey response rate was 100% (8 of 8). Estimated time required for the project intervention was 2 to 15 hours per month for the program coordinator and 4 to 21.4 hours per month for program leadership, with no additional salary provided. Residents and faculty received brief training about the CBME program (0.25 to 1 hour for residents and 0.5 to 1 hour for faculty), with periodic reminders afterward. Site leads perceived mixed acceptability from residents and faculty. Perceived challenges to implementation included resistance to change, time limitations, faculty discomfort with providing written assessment data, and difficulties navigating institutional barriers to technology-enhanced data collection.

Conclusions CBME was estimated to require manageable time for program staff and leadership, with mixed acceptability from residents and faculty.

Introduction

In the shift to a competency-based approach to graduate medical education (GME),1-3 specialties often struggle to adopt this paradigm in a comprehensive way.4-9 There is a gap in the literature regarding successful competency-based medical education (CBME) implementation, and few interventions have addressed the 5 core components of CBME: (1) an outcomes framework; (2) developmental progression; (3) tailored learning experiences; (4) competency-focused instruction or coaching; and (5) programmatic assessment.10 For CBME in GME programs to be effectively and sustainably implemented, innovative examples may help to inform programs.

The American Medical Association (AMA) Reimagining Residency (RR) initiative aims to facilitate innovative, systemic changes that improve GME. With this initiative, we developed and implemented the 5 core components of CBME within 8 US emergency medicine (EM) residency programs and assessed acceptability and feasibility (online supplementary data Figure 1).

Methods

Setting and Participants

We implemented CBME in 8 EM residency programs: 6 volunteered to participate at the onset of the RR project in 2019, and 2 others joined at their own request, one in each of the 2 years following initial implementation at the original sites. These programs are described in Table 1.

Table 1 Description of Residency Program Settings and Participants

Intervention

To develop an outcomes framework that would fit diverse EM residency training programs, we convened an advisory board representative of the specialty to create, through consensus, a set of 22 entrustable professional activities (EPAs).11 To assess resident EPA performance, the advisory board adopted the previously published Ottawa Surgical Competency Operating Room Evaluation (O-Score)12 (online supplementary data Figure 2), modified to reflect the direct observation and attending availability typical of EM clinical learning environments. The project team partnered with the Society for Improving Medical Professional Learning (SIMPL)13 to create an app-based assessment platform for these EPAs. After site-specific resident and faculty training, residency programs implemented the EPAs and assessment platforms, with the goal of completing at least one EPA assessment per EM resident per shift. The data collected with this tool, including narrative comments, were used by programs in their Clinical Competency Committee (CCC) decision-making processes for progression decisions and to provide feedback to residents via individualized learning plans (ILPs).
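
As a simplified illustration only (not the SIMPL platform itself), the sketch below shows how per-shift O-Score entrustment ratings and narrative comments might be aggregated by resident and EPA for CCC review; the record fields, identifiers, and example data are hypothetical.

```python
# Hypothetical sketch: aggregate per-shift EPA assessments for CCC review.
from collections import defaultdict
from statistics import mean

# Each record represents one end-of-shift EPA assessment; field names are illustrative.
assessments = [
    {"resident": "R1", "epa": "EPA-02", "o_score": 3, "comment": "Needed prompting on disposition."},
    {"resident": "R1", "epa": "EPA-02", "o_score": 4, "comment": "Managed the resuscitation independently."},
    {"resident": "R1", "epa": "EPA-07", "o_score": 2, "comment": "Review airway backup planning."},
]

# Group scores and comments by (resident, EPA).
summary = defaultdict(lambda: {"scores": [], "comments": []})
for a in assessments:
    key = (a["resident"], a["epa"])
    summary[key]["scores"].append(a["o_score"])
    summary[key]["comments"].append(a["comment"])

# Print a simple per-resident, per-EPA summary a CCC could review.
for (resident, epa), data in sorted(summary.items()):
    print(f"{resident} {epa}: n={len(data['scores'])}, mean O-Score={mean(data['scores']):.1f}")
```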

To integrate the outcomes framework of EPAs directly with developmental progression, a team of 8 members of the core grant team met iteratively to map the EPAs to the EM Milestones 2.0.14 This ensured that each relevant Milestone subcompetency was tied directly to its corresponding EPAs.11 The team additionally crafted a crosswalk of entrustment ratings with the Milestones under each subcompetency. This crosswalk served 2 purposes: (1) to connect EPAs to Milestones for ease of translation and reporting to the Accreditation Council for Graduate Medical Education (ACGME); and (2) to track, using EPAs, developmental progression along the Milestones, so that if a resident had difficulty achieving an EPA, the subcompetencies mapped to that EPA could be used as a diagnostic assessment by program leadership (program director, assistant/associate program directors, or a designated faculty site lead) or the CCC.
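
A minimal sketch of the crosswalk concept follows; the EPA and subcompetency identifiers and the rating threshold are illustrative placeholders, not the project's actual mapping.

```python
# Hypothetical EPA-to-Milestone crosswalk: a lookup from each EPA to its mapped
# Milestone subcompetencies, so low entrustment on an EPA can point the CCC or
# program leadership toward the subcompetencies to examine.
EPA_TO_SUBCOMPETENCIES = {
    "EPA-02": ["PC-2", "PC-3", "ICS-1"],
    "EPA-07": ["PC-1", "MK-1"],
}

def diagnostic_subcompetencies(epa_id: str, entrustment: int, threshold: int = 3) -> list[str]:
    """Return the mapped subcompetencies to review when an EPA rating falls below the threshold."""
    if entrustment < threshold:
        return EPA_TO_SUBCOMPETENCIES.get(epa_id, [])
    return []

print(diagnostic_subcompetencies("EPA-07", entrustment=2))  # ['PC-1', 'MK-1']
```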

Outcomes

We measured perceived feasibility and user acceptability of the CBME implementation process by surveying site leaders at the midpoint of year 5 of the grant funding period. Site leads were chosen from the residency program leadership teams and served in this role for most or all of the duration of the grant. Site leads were asked to describe their perceptions of all outcomes and to use estimates of time and costs when necessary. The survey was drafted and reviewed by all site leads to optimize content and response process validity, and revised based on team feedback for clarity and content prior to collecting responses.15 The survey was not otherwise tested. The final survey consisted of 13 open-ended questions and was administered online via Google Forms. For feasibility, site leads were asked to estimate direct costs, program coordinator and program leadership time, and resources required, as well as to describe resident and faculty training duration and methods. For acceptability, site leads were asked their perceptions of resident and faculty acceptability, as well as estimates of resident and program leadership participation rates. Site leads provided free-text responses regarding the current status of CBME implementation, problems encountered, and lessons learned. They also identified next steps at each site (survey provided as online supplementary data).

Analysis

We reported descriptive statistics, including ranges of the numerical responses to the open-ended questions about time and money. Means were often not calculable because sites responded in free text, frequently reporting a range for their own program rather than a single number. Two authors (H.A.C.W., L.M.Y.) extracted representative comments that reflected the most common responses from the narrative short-answer responses.
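
As a rough illustration of this analysis step, the sketch below extracts numbers from hypothetical free-text time estimates and reports the overall range across sites; the example responses are invented.

```python
# Hypothetical sketch: pull numeric values out of free-text time estimates and
# report the overall range, since means were often not calculable.
import re

responses = ["2-15 hours per month", "about 4 hours", "10 to 21.4 hours/month"]

# Extract every number (integer or decimal) from each response.
values = [float(v) for r in responses for v in re.findall(r"\d+(?:\.\d+)?", r)]

print(f"Range: {min(values)}-{max(values)} hours per month")  # Range: 2.0-21.4 hours per month
```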

This project was deemed exempt by the Stanford University Institutional Review Board (#51828).

Results

The CBME intervention was implemented at all 8 participating sites (Table 1) with 6 sites implementing all 5 of the core components and 2 sites implementing 4 of the 5 to date (Table 2). The assessment tool, CCC processes, and ILPs were at least partially implemented at 7 of 8 programs, with one program awaiting approval for the assessment tool. Six programs implemented formal coaching programs around these ILPs, with assistant/associate program directors, core faculty, or a mix of both meeting with residents 2 to 5 times per year. ILPs were implemented broadly at half of the programs while others used them specifically for residents in difficulty. Given the variability of residency leadership structures, coaching programs, programmatic assessment, and resources, each program sought to incorporate the 5 core components of CBME as appropriate for their specific learning environment. All 8 site leads responded to the survey to assess perceptions regarding feasibility and acceptability. Site lead perceptions relating to implementation are summarized in Table 3.

Table 2 Competency-Based Medical Education in Graduate Medical Education
Table 3 Representative Site Lead Perceptions Regarding Implementation

Overall, site leaders reported that the intervention was feasible to implement. No programs reported major additional direct financial costs, although estimated additional time costs for administrative and faculty time varied widely across programs. All aspects of development, including the consensus meetings, site lead meetings, focus groups with stakeholders that included residents and patients, ILP and coaching program development, technology development to support programmatic assessment, and central data management were funded by the AMA Reimagining Residency grant.

Site leads reported varying estimations of program coordinator time (range 2-15 hours per month) and residency leadership time (range 4-30 hours per month). Several programs noted surges in time commitments during certain times, such as prior to a CCC meeting. Additional faculty time varied as well, with some programs training specific core faculty to support the initiative and others doing minimal training for all faculty (up to 2.4 dedicated faculty full-time equivalent, or range 1-30 hours for all faculty). Reported formal introductory CBME project training was brief for residents (0.25-1 hour) and faculty (0.5-1 hour). Introductory training was accompanied by periodic reminders or one-on-one discussions. Representative site lead comments regarding feasibility are reported in online supplementary data Table 1.

Site leads perceived variable user engagement as measured by percentage of residents (range 50%-100%) and faculty (range 22%-100%) participating in the CBME intervention. Site leads commented on this variability:

  • “Generally this program has been well received by the faculty; it is an easier way for them to provide post-shift feedback for residents.”

  • “This has become a positive culture change for our program; residents surveyed and 71% responding prefer the EPA end-of-shift assessment over our prior system.”

  • “Most [residents] love the frequent and targeted feedback although some find they don’t get it real-time and face-to-face which has been a target of our faculty development efforts. The quality of feedback is still a work in progress.”

  • “Biggest obstacles have been support for additional faculty time, resident buy-in, and navigating legal/political systems within our institution.”

Sites worked to ensure residents received one EPA assessment per shift, with resulting numbers varying significantly by site depending on the technology used (SIMPL vs other, mobile app vs desktop) and whether the resident or faculty member initiated the assessment. Many programs started with a free or institution-based assessment tool and subsequently switched to SIMPL as it became available; however, the transition presented additional challenges. Further representative site lead comments regarding acceptability across the CBME intervention are included in online supplementary data Table 2.

Discussion

This intervention, which implemented the 5 core components of CBME in 8 EM programs over 2 to 5 years, found that CBME was generally feasible, with variable engagement and acceptability among residents and faculty. Uptake of the components was high overall but varied among programs.

These findings demonstrate that CBME implementation is not a one-size-fits-all consideration. While a unifying framework of outcomes and developmental progression is fundamental, adoption of the other core components of tailored instruction, coaching, and programmatic assessment varied in different programs (ie, contextually).10 Program resources, structure, and culture may impact how individualized learning plans, coaching programs, and CCC processes are designed and implemented. Our site leads advised that programs wishing to implement CBME should develop a change management plan, provide protected training for residents and faculty, and iteratively evaluate and revise programs after initial implementation. With these observations, we are developing best practice guidelines, implementation templates, faculty and resident development tools, and data visualization and discussion guides to help programs optimize this process and expand CBME across sites.

From the perspective of the intervention site leads, there were several key successes. Residents and faculty engaged in the program, and site leads perceived an increased quantity of assessments and resident satisfaction with feedback related to EPA assessment. Also, all sites are still participating in the CBME implementation project and plan to continue after the grant ends. After the results of a broad realist evaluation of the initiative and in conjunction with the Council of Residency Directors in EM, the entire EM specialty is poised to implement this initiative in summer 2026.

Other specialties seeking to implement CBME may consider a similar consensus process to determine an outcomes framework for competencies required for unsupervised practice and a scaffolding of developmental progression within the framework. A shared mental model may facilitate the identification of necessary tools and resources to support programmatic assessment, individualized learning experiences, and coaching. Valuing programmatic flexibility and contextual variability, within the unified framework for a given specialty, may facilitate adoption of more CBME elements as well as promote sharing of resources and best practices across programs.

Our findings regarding the implementation of this CBME intervention are limited by the small number of programs, their similar size, geography, and program format. In addition, most of the programs were well-resourced and academically affiliated. Site lead perceptions may be biased because they led a grant-funded project. Because EM is a shift-based, procedurally oriented specialty with a culture of innovation and frequent opportunities for direct observation, the findings may not generalize to specialties with different characteristics. Project feasibility was estimated rather than measured, and the perceptions of residents and non-lead faculty were estimated by the site leads rather than measured directly.

Based on our findings, future directions for study include directly studying the engagement and perspectives of residents, faculty, and other stakeholders, as well as the factors promoting successful adoption of all CBME elements. Studies on the benefits of aggregated, frequent assessments to track resident trajectories over time and drive individualized learning and coaching, in line with precision education principles, would be helpful.25

Conclusions

This implementation of the 5 core components of CBME at 8 US EM residency programs suggested that CBME was feasible to implement. Acceptability to residents and faculty was variable and adoption of CBME components differed among sites. This suggests that further expansion will require consideration of contextual factors.

Copyright: 2025

Author Notes

Corresponding author: Holly A. Caretta-Weyer, MD, MHPE, Stanford University School of Medicine, Palo Alto, California, USA, hcweyer@stanford.edu, X @holly_cw
Received: 05 Aug 2024
Accepted: 22 Feb 2025