ABSTRACT
Background Program directors need concrete indicators to monitor uptake of competency-based medical education (CBME). Entrustable professional activity (EPA) observation completion rates offer practical measures of CBME adoption.
Objective In this study, we used residents’ EPA observation data, specifically the submission and expiration of EPA observation forms and assessment scores, to explore the uptake of CBME practices across clinical departments. We asked: What patterns and contributing factors (department group, resident year, calendar year, program size) are associated with EPA observation submission rates, expiration rates, and assessment scores?
Methods We conducted exploratory analysis of de-identified EPA observation data (n=233 176) from residents’ electronic portfolios (n=2110) across 45 programs in 12 departments at one Canadian institution from 2018 to 2023. Descriptive statistics summarized submission, expiration, and score distributions. Spearman correlations and logistic regression examined 4 predictors: department group, resident year, calendar year, and program size.
Results EPA submission rates (81.0%), expiration rates (7.7%), and assessment O-scores (M=4.4 out of 5) did not differ significantly by training department. Each additional calendar year increased the odds of an independent or full score by 26.3% (OR, 1.263; 95% CI, 1.259-1.267), whereas higher resident year (OR, 0.818; 95% CI, 0.813-0.825) and larger program size (OR, 0.995; 95% CI, 0.994-0.996) decreased those odds.
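As a note on the reported effect sizes, the percentage change in odds per unit increase of a predictor follows directly from its odds ratio:

\[ \Delta\% = (\mathrm{OR} - 1) \times 100\% \]

so OR = 1.263 corresponds to a 26.3% increase in odds per calendar year, and OR = 0.818 to an 18.2% decrease per resident year.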
Conclusions EPA submission, expiration, and scoring patterns were consistent across departments but varied with calendar year, resident training stage, and program size.