Entrustable Professional Activities: Correlation of Entrustment Assessments of Pediatric Residents With Concurrent Subcompetency Milestones Ratings

Online Publication Date: 01 Feb 2020
Page Range: 66 – 73
DOI: 10.4300/JGME-D-19-00408.1

ABSTRACT

Background

In competency-based medical education, subcompetency milestones represent a theoretical stepwise description for a resident to move from the level of novice to expert. Despite their ubiquitous use in the assessment of residents, they were not designed for that purpose. Because entrustable professional activities (EPAs) require observable behaviors, they could serve as a potential link between clinical observation of residents and competency-based assessment.

Objective

We hypothesized that global faculty-of-resident entrustment ratings would correlate with concurrent subcompetency milestones-based assessments.

Methods

This prospective study evaluated the correlation between concurrent entrustment assessments and subcompetency milestones ratings. Pediatric residents were assessed in 4 core rotations (pediatric intensive care unit, neonatal intensive care unit, general inpatient, and continuity clinic) at 3 different residency training programs during the 2014–2015 academic year. Subcompetencies were mapped to rotation-specific EPAs, and shared assessments were utilized across the 3 programs.

Results

We compared 29 143 pairs of entrustment levels and corresponding subcompetency levels from 630 completed assessments. Pearson correlation coefficients demonstrated statistical significance for all pairs (P < .001). Multivariate linear regression models produced R-squared values that demonstrated strong correlation between mapped EPA levels and corresponding subcompetency milestones ratings (median R2 = 0.81; interquartile range 0.73–0.83; P < .001).

Conclusions

This study demonstrates a strong association between assessment of EPAs and subcompetency milestones assessment, providing a link between entrustment decisions and assessment of competence. Our data support creating resident assessment tools where multiple subcompetencies can be mapped and assessed by a smaller set of rotation-specific EPAs.

Introduction

Over the past decade, we have witnessed a transformation of medical education and training to a competency-based model. This new paradigm emphasizes a contextual and developmental approach to medical education and training, with the promise of more proficient physicians and, ultimately, improved safety and quality of patient care.1–5 In the movement toward competency-based medical education, each specialty developed subcompetencies in each of the 6 Accreditation Council for Graduate Medical Education (ACGME) competency domains, with milestones levels that represented a blueprint for the development of the knowledge, skills, and attitudes germane to its field.6 While 51 subcompetencies were developed in pediatrics, 21 were identified for tracking during residency training.7,8 The intent was never that these theoretical milestones would serve as an assessment tool; rather, they represented a shared mental model to track trainee progress over time and a roadmap for individual improvement.6,9 Many programs have nevertheless employed the subcompetency milestones as a direct assessment tool, as milestones ratings are the accepted ACGME standard for the required semiannual reporting of resident competence. Challenges with this approach include the milestones' lengthy descriptions, wide range of applicability, and room for interpretation.6,10–13 Most importantly, the subcompetency milestones are a theoretical construct, and while they provide a roadmap for moving from novice to expert, they do not necessarily reflect specific observable behaviors.13

Entrustable professional activities (EPAs) require observable behaviors and skills that support decisions to trust a trainee to perform independently within a particular field.13–16 In the context of competency-based medical education, EPAs represent an attractive assessment tool for several reasons: they are easily observed and reliably assessed4,9,10,17; entrustment incorporates supervision and safety considerations18–20; and competence is implicit in the eventual entrustment of trainees to perform EPAs.13,21,22 Because EPAs are directly observable, they may serve as a link between the theoretical framework of subcompetencies and point-of-service clinical practice.13,23–25 From a practical perspective, if multiple subcompetency milestones ratings correlate with single entrustment decisions, an approach that utilizes EPAs as assessment tools may greatly simplify competency-based assessment.5

Recognizing this promise, many have explored the use of EPAs as assessment tools. These studies have demonstrated successful mapping of EPA assessments to the subcompetency milestones.9,26–30 However, no study to date has investigated validity evidence for this mapping strategy.

The primary objective of our study was to determine the interitem correlation between parallel, concurrent EPA entrustment ratings and primary subcompetency milestones ratings on global faculty-of-resident rotation assessments. A secondary objective was to gather relations-to-other-variables validity evidence for the EPA entrustment scale used in this project.

Methods

This prospective study was conducted across 3 diverse pediatric training programs: University of Vermont (21 residents), University of South Alabama (39 residents), and Children's National Medical Center in Washington, DC (117 residents). The sites comprised a children's hospital within a larger medical center, a free-standing children's and women's hospital, and a large free-standing children's hospital, respectively. Pediatric residents across the 3 years of training were assessed in 4 separate core rotations during the 2014–2015 academic year: the pediatric intensive care unit (PICU), neonatal intensive care unit (NICU), general inpatient pediatrics, and continuity clinic, representing a mix of inpatient and outpatient settings and of general and subspecialty pediatrics.

Common rotation-specific EPAs were developed for each of these core rotations by adapting existing rotation-specific goals and objectives from the 3 training programs. Consensus was achieved using a modified Delphi method: iterative cycles of verbal and written feedback from the program directors, rotation directors, and rotation-specific supervising faculty at each institution were conducted until agreement was reached.31 Multiple pediatric subcompetencies were then mapped to each rotation-specific EPA using a similar modified Delphi process with the same personnel. Each mapping decision was based on whether observation of the knowledge, skills, and attitudes embedded in the EPA could inform assessment of that individual's performance in a particular subcompetency domain. To quantitatively compare entrustment ratings to subcompetency milestones levels, we employed a 5-level entrustment scale (Box), adapted from the literature and mirroring the Dreyfus 5-level novice-to-expert scale embedded within the subcompetency milestones.16,32,33
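
To make the mapping concrete, the sketch below (Python, purely illustrative) shows one way a rotation-specific EPA-to-subcompetency map and a 5-level entrustment scale could be represented. The EPA titles, subcompetency codes, and scale anchors are hypothetical stand-ins; the study's actual EPAs and the Box are provided only as supplemental material.

```python
# Illustrative sketch only: the EPA titles, subcompetency codes, and scale
# anchors below are hypothetical and do not reproduce the study's actual
# mapping or its entrustment scale (Box).

# A 5-level novice-to-expert entrustment scale; the assessments also allowed
# half-point gradations (e.g., 3.5).
ENTRUSTMENT_ANCHORS = {
    1: "may observe only",
    2: "may perform with direct supervision",
    3: "may perform with indirect supervision",
    4: "may perform independently",
    5: "may supervise and teach others",
}

# Each rotation-specific EPA maps to the subcompetencies whose performance its
# observation is judged to inform (PC = patient care, MK = medical knowledge,
# ICS = interpersonal and communication skills, SBP = systems-based practice).
PICU_EPA_MAP = {
    "Manage a critically ill child requiring respiratory support": ["PC1", "PC5", "MK1"],
    "Lead family-centered rounds in the ICU": ["PC2", "ICS1", "SBP2"],
}
```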

New faculty-of-resident global rotation assessments for the 4 rotations were assembled with 2 primary sections: an entrustment section and a subcompetency section (provided as online supplemental material). The entrustment section included the rotation-specific EPAs with the corresponding 1 to 5 entrustment scale, including half-point gradations. The subcompetency section included all correspondingly mapped subcompetencies and their respective 1 to 5 milestones levels, including half-point gradations. Half-point intervals were used to afford scoring flexibility when aggregating multiple observations of discrete clinical situations and, in the case of composite assessments, when reconciling discrepancies in faculty entrustment decisions based on their particular observations. Identical assessments were built in the respective electronic management systems of the 3 programs (New Innovations, Uniontown, OH, and MedHub, Minneapolis, MN). Faculty development was carried out at each institution by the program directors and rotation directors to optimize understanding of and compliance with the assessment system and milestones ratings. While rotations across the institutions used a similar end-of-rotation assessment strategy, some rotation assessments were completed by an individual attending physician, while other rotations employed a composite approach in which multiple attending physicians jointly completed a single assessment.

For each completed assessment, the entrustment level assigned for each rotation-specific EPA was compared to the milestone level assigned for each of the subcompetencies mapped to that EPA to determine the level of correlation (Figure 1). Each comparison between an entrustment level and a milestone level represented a data point. A REDCap database (Vanderbilt University, Nashville, TN) was used to collect data across the 3 programs. Pearson correlation coefficients were calculated for each rotation-specific EPA and subcompetency milestones pair. Multivariate linear regression produced an R-squared value for each EPA, measuring the strength of the relationship between the entrustment rating for that EPA and the milestones ratings for the collective subcompetencies mapped to it.
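
A minimal analysis sketch of these two steps is shown below, assuming a long-format export in which each row is one paired data point and using hypothetical column and file names (paired_assessments.csv, assessment_id, epa, subcompetency, entrustment, milestone); it illustrates the general approach rather than reproducing the study's actual analysis code.

```python
# Sketch of the correlation and regression analyses; the file name and column
# names are assumptions for illustration, not the study's actual variables.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

# Long format: one row per paired data point (EPA entrustment level vs the
# milestone level of one mapped subcompetency).
pairs = pd.read_csv("paired_assessments.csv")

# Pearson correlation for each rotation-specific EPA / subcompetency pair (Table 1).
for (epa, sub), grp in pairs.groupby(["epa", "subcompetency"]):
    r, p = pearsonr(grp["entrustment"], grp["milestone"])
    print(f"{epa} vs {sub}: r = {r:.2f}, P = {p:.3g}")

# Wide format: one row per completed assessment and EPA, with the entrustment
# rating plus one column of milestone ratings per mapped subcompetency.
wide = pairs.pivot_table(index=["assessment_id", "epa", "entrustment"],
                         columns="subcompetency",
                         values="milestone").reset_index()

# Multivariate regression per EPA: proportion of variance in the entrustment
# rating explained by the mapped milestone ratings (R-squared, Table 2).
for epa, grp in wide.groupby("epa"):
    X = grp.drop(columns=["assessment_id", "epa", "entrustment"])
    X = sm.add_constant(X.dropna(axis=1, how="all"))  # drop unmapped subcompetencies
    fit = sm.OLS(grp["entrustment"], X, missing="drop").fit()
    print(f"{epa}: R2 = {fit.rsquared:.2f}, P = {fit.f_pvalue:.3g}")
```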

Figure 1. Diagram Representing Comparison Between Parallel and Concurrent EPA and Subcompetency Milestones Assessments. Abbreviations: EPA, entrustable professional activity; PC, patient care.


The study received Institutional Review Board exemption from each of the participating institutions.

Results

Using the modified Delphi method, rotation-specific EPAs were generated for each of the 4 rotations: 6 for PICU, 8 for NICU, 6 for the inpatient pediatrics team, and 7 for continuity clinic (provided as online supplemental material). With judicious mapping, 16 subcompetencies spanning the 6 ACGME competency domains were mapped to the 6 PICU EPAs, 16 to the 8 NICU EPAs, 16 to the 6 inpatient pediatrics team EPAs, and 17 to the 7 continuity clinic EPAs (provided as online supplemental material; Table 1). Of the 21 reportable pediatric subcompetencies, 18 were mapped to the 4 sets of rotation-specific EPAs. A total of 630 assessments were completed on these 4 rotations at the 3 participating institutions during the study period, comprising 29 143 paired data points distributed across the rotation-specific EPAs and subcompetency milestones included in the study.

Table 1 Pearson Correlation Coefficients for Each Mapping of Subcompetency Milestone to Rotation-Specific Entrustable Professional Activity (EPA)

Pearson correlation coefficients were calculated for all pairs of rotation-specific EPAs and their mapped subcompetency milestones. Statistical significance (P < .001) was demonstrated for all pairs (Table 1). Nearly all Pearson coefficients ranged from 0.70 to 0.90, indicating high correlation. Mean correlations among the mapped milestone-entrustment pairs were 0.86 for PICU, 0.79 for NICU, 0.87 for the inpatient pediatrics team, and 0.86 for continuity clinic.

Multivariate linear regression models assessing the percentage of variance in each EPA entrustment rating predicted by the group of mapped subcompetency milestones ratings yielded statistically significant R2 values (median R2 = 0.81; interquartile range 0.73–0.83; P < .001) for all EPAs (Table 2). Hence, the milestones ratings assigned to a resident on each rotation explained a consistently high proportion of the variance in the entrustment ratings.
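
As a small companion to the Methods sketch, the reported summary statistics (median R2 and interquartile range across EPAs) can be obtained as shown below; the values used are illustrative placeholders, not the study data.

```python
import numpy as np

# Hypothetical per-EPA R-squared values collected from the regressions above;
# these numbers are placeholders, not the study's actual results.
r_squared_by_epa = [0.83, 0.81, 0.73, 0.79, 0.82]

q1, median, q3 = np.percentile(r_squared_by_epa, [25, 50, 75])
print(f"median R2 = {median:.2f}; IQR {q1:.2f}-{q3:.2f}")
```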

Table 2 Assessing Percentage of Variance for Each Entrustable Professional Activity (EPA) Entrustment Rating Predicted by Mapped Subcompetency Milestones Ratings

Both EPA level (entrustment rating) and subcompetency milestones rating increased linearly with level of training (P < .001). The overlapping lines with similar slopes provide relations-to-other-variables validity evidence for the entrustment and milestones scales used (Figure 2).
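
The trend in Figure 2 could be summarized with a simple sketch like the one below, reusing the hypothetical pairs data frame from the Methods sketch and an assumed pgy column; a model accounting for repeated measures per resident would be more rigorous than this illustration.

```python
from scipy.stats import linregress

# Average EPA entrustment level and milestone rating by postgraduate year
# (the pgy column is an assumed field in the hypothetical pairs data frame).
print(pairs.groupby("pgy")[["entrustment", "milestone"]].mean())

# Simple linear trend of each rating against year of training; overlapping,
# similarly sloped lines would mirror the pattern shown in Figure 2.
for col in ["entrustment", "milestone"]:
    trend = linregress(pairs["pgy"], pairs[col])
    print(f"{col}: slope = {trend.slope:.2f}, P = {trend.pvalue:.3g}")
```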

Figure 2. Comparison of Average EPA Level (Entrustment Rating) and Subcompetency Milestones Rating by Year of Training. Abbreviations: EPA, entrustable professional activity; PGY, postgraduate year.


Discussion

The results of this study demonstrate that rotation-specific EPAs correlate with multiple mapped subcompetency milestones. The statistically significant correlations appear to hold across institutions and rotations. This implies that graduate medical education programs may be able to simplify their faculty-of-resident assessment strategies by replacing numerous individual subcompetencies and associated milestones with rotation-specific EPAs derived from stated rotation-specific goals and objectives.

Comparing average correlations across rotations suggests that some clinical experiences lend themselves better than others to assessment with rotation-specific EPAs. The setting or the specific wording of an EPA may therefore affect the validity evidence for using EPAs as predictors of subcompetency milestones levels. Additionally, because EPAs represent observable skills, a rotation in which the attending and the resident work closely together would presumably afford a more reliable EPA assessment than one in which the resident works more indirectly with the supervisor.

Because rotation-specific EPAs include a number of subcompetencies, they are holistic in nature, and their assessment represents a more global screening assessment of trainees.5,10,13,34 If a learner is assessed to be at a high level on an EPA, then it follows that they are generally proficient in the subcompetency milestones mapped to it. However, when a weakness is identified using an EPA assessment, alternative ways to assess learners (eg, objective structured clinical examination or another form of direct observation) are needed in order to deconstruct and tease out the problem area(s). This represents a natural way of assessing learners—to screen and then verify with additional assessment tools. Therefore, while using rotation-specific EPAs can be a valuable screening assessment, this does not preclude the need for other assessment methods.

The use of 4 rotations at 3 institutions within 1 specialty (pediatrics) limits generalizability. Faculty who developed the EPA-subcompetency mapping and who were also responsible for completing end-of-rotation resident assessments might have unduly positively skewed correlations between the EPAs and subcompetencies; however, we estimate there was less than 5% overlap between these faculty groups. Participating faculty also expressed concern about the length of the assessment forms; thus, assessment fatigue, in which an assessor chooses the same level throughout the form to reduce cognitive load, could have positively skewed the results. We acknowledge that high-quality EPAs should focus on professional tasks, not individual qualities of a learner.35 On review, the final rotation-specific EPAs (PICU 6, NICU 8, inpatient team 6, and continuity clinic 7) do include an aspect of personal development; however, these EPAs also include activities that are directly relevant to patient care. There may have been variability in the manner in which individual versus composite assessments were completed, which may have affected entrustment decisions; however, as different rotations at the 3 institutions used individual versus composite assessments, this factor is less likely to have produced systematic bias. We also recognize that greater time devoted to faculty development and to iteratively revising the wording of the rotation-specific EPAs might have improved correlations between entrustment and competency.

Further standardization and characterization of EPA and entrustment scale content and construct should be broadly explored and studied.34–36 Additionally, strengthening faculty development strategies will be important to mitigate unwanted variability in the assessment of trainee performance, especially around the use of entrustment decision-making tools.34,37,38

Conclusions

In this study of 3 pediatric residency programs over an entire academic year, we found a strong association between entrustment-level assessments using rotation-specific EPAs and concurrent subcompetency milestones assessments in global end-of-rotation assessments by faculty. Our results support the use of EPAs as an observable component of global faculty-of-resident assessments.

References

1. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051–1056. doi:10.1056/NEJMsr1200117.
2. Holmboe ES, Sherbino J, Englander R, Snell L, Frank JR; ICBME Collaborators. A call to action: the controversy of and rationale for competency-based medical education. Med Teach. 2017;39(6):574–581. doi:10.1080/0142159X.2017.1315067.
3. Holmboe ES. Competency-based medical education and the ghost of Kuhn: reflections on the messy and meaningful work of transformation. Acad Med. 2018;93(3):350–353. doi:10.1097/ACM.0000000000001866.
4. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi:10.3109/0142159X.2010.501190.
5. Touchie C, ten Cate O. The promise, perils, problems and progress of competency-based medical education. Med Educ. 2016;50(1):93–100. doi:10.1111/medu.12839.
6. Holmboe ES. Realizing the promise of competency-based medical education. Acad Med. 2015;90(4):411–413. doi:10.1097/ACM.0000000000000515.
7. Hicks PJ, Schumacher DJ, Benson BJ, Burke AE, Englander R, Guralnick S, et al. The pediatrics milestones: conceptual framework, guiding principles, and approach to development. J Grad Med Educ. 2010;2(3):410–418. doi:10.4300/JGME-D-10-00126.1.
8. Schumacher DJ, Sectish TC, Vinci RJ. Optimizing clinical competency committee work through taking advantage of overlap across milestones. Acad Pediatr. 2014;14(5):436–438. doi:10.1016/j.acap.2014.06.003.
9. Mink RB, Schwartz A, Herman BE, Turner DA, Curran ML, Myers A, et al. Validity of level of supervision scales for assessing pediatric fellows on the common pediatric subspecialty entrustable professional activities. Acad Med. 2018;93(2):283–291. doi:10.1097/ACM.0000000000001820.
10. Hawkins RE, Welcher CM, Holmboe ES, Kirk LM, Norcini JJ, Simons KB, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086–1102. doi:10.1111/medu.12831.
11. Hicks PJ, Margolis M, Poynter SE, Chaffinch C, Tenney-Soeiro R, Turner TL, et al. The pediatrics milestones assessment pilot: development of workplace-based assessment content, instruments, and processes. Acad Med. 2016;91(5):701–709. doi:10.1097/ACM.0000000000001057.
12. El-Haddad C, Damodaran A, McNeil HP, Hu W. The ABCs of entrustable professional activities: an overview of 'entrustable professional activities' in medical education. Intern Med J. 2016;46(9):1006–1010. doi:10.1111/imj.12914.
13. Ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542–547. doi:10.1097/ACM.0b013e31805559c7.
14. Ten Cate O. AM last page: what entrustable professional activities add to a competency-based curriculum. Acad Med. 2014;89(4):691. doi:10.1097/ACM.0000000000000161.
15. Carraccio C, Englander R, Gilhooly J, Mink R, Hofkosh D, Barone MA, et al. Building a framework of entrustable professional activities, supported by competencies and milestones, to bridge the educational continuum. Acad Med. 2017;92(3):324–330. doi:10.1097/ACM.0000000000001141.
16. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157–158. doi:10.4300/JGME-D-12-00380.1.
17. Damodaran A, Shulruf B, Jones P. Trust and risk: a model for medical education. Med Educ. 2017;51(9):892–902. doi:10.1111/medu.13339.
18. Rekman J, Gofton W, Dudek N, Gofton T, Hamstra SJ. Entrustability scales: outlining their usefulness for competency-based clinical assessment. Acad Med. 2016;91(2):186–190. doi:10.1097/ACM.0000000000001045.
19. Ten Cate O, Hart D, Ankel F, Busari J, Englander R, Glasgow N, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2):191–198. doi:10.1097/ACM.0000000000001044.
20. Carraccio CL, Englander R. From Flexner to competencies: reflections on a decade and the journey ahead. Acad Med. 2013;88(8):1067–1073. doi:10.1097/ACM.0b013e318299396f.
21. Ten Cate O, Tobin S, Stokes ML. Bringing competencies closer to day-to-day clinical work through entrustable professional activities. Med J Aust. 2017;206(1):14–16. doi:10.5694/mja16.00481.
22. Ten Cate O, Snell L, Carraccio C. Medical competence: the interplay between individual ability and the health care environment. Med Teach. 2010;32(8):669–675. doi:10.3109/0142159X.2010.500897.
23. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2(3):419–422. doi:10.4300/JGME-D-10-00127.1.
24. Chen HC, van den Broek WS, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431–436. doi:10.1097/ACM.0000000000000586.
25. Ten Cate O. Competency-based education, entrustable professional activities, and the power of language. J Grad Med Educ. 2013;5(1):6–7. doi:10.4300/JGME-D-12-00381.1.
26. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program. Acad Med. 2015;90(7):888–897. doi:10.1097/ACM.0000000000000671.
27. Choe JH, Knight CL, Stiling R, Corning K, Lock K, Steinberg KP. Shortening the miles to the milestones: connecting EPA-based evaluations to ACGME milestone reports for internal medicine residency programs. Acad Med. 2016;91(7):943–950. doi:10.1097/ACM.0000000000001161.
28. Warm EJ, Held JD, Hellmann M, Kelleher M, Kinnear B, Lee C, et al. Entrusting observable practice activities and milestones over the 36 months of an internal medicine residency. Acad Med. 2016;91(10):1398–1405. doi:10.1097/ACM.0000000000001292.
29. Hart D, Franzen D, Beeson M, Bhat R, Kulkarni M, Thibodeau L, et al. Integration of entrustable professional activities with the milestones for emergency medicine residents. West J Emerg Med. 2019;20(1):35–42. doi:10.5811/westjem.2018.11.38912.
30. Weiss A, Ozdoba A, Carroll V, DeJesus F. Entrustable professional activities: enhancing meaningful use of evaluations and milestones in a psychiatry residency program. Acad Psychiatry. 2016;40(5):850–854. doi:10.1007/s40596-016-0530-2.
31. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–e16. doi:10.1016/j.amjmed.2005.10.036.
32. Cate OT. Entrustment as assessment: recognizing the ability, the right, and the duty to act. J Grad Med Educ. 2016;8(2):261–262. doi:10.4300/JGME-D-16-00097.1.
33. Dreyfus HL, Dreyfus SE, Anthanasiou T. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York, NY: The Free Press; 1986.
34. Peters H, Holzhausen Y, Boscardin C, ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach. 2017;39(8):802–807. doi:10.1080/0142159X.2017.1331031.
35. Taylor DR, Park YS, Egan R, Chan MK, Karpinski J, Touchie C, et al. EQual, a novel rubric to evaluate entrustable professional activities for quality and structure. Acad Med. 2017;92(11S):110–117. doi:10.1097/ACM.0000000000001908.
36. O'Dowd E, Lydon S, O'Connor P, Madden C, Byrne D. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011–2018. Med Educ. 2019;53(3):234–249. doi:10.1111/medu.13792.
37. Dewey CM, Jonker G, ten Cate O, Turner TL. Entrustable professional activities (EPAs) for teachers in medical education: has the time come? Med Teach. 2017;39(8):894–896. doi:10.1080/0142159X.2016.1270447.
38. Carraccio C, Englander R, Holmboe ES, Kogan JR. Driving care quality: aligning trainee assessment and supervision through practical application of entrustable professional activities, competencies, and milestones. Acad Med. 2016;91(2):199–203. doi:10.1097/ACM.0000000000000985.
Copyright: 2020


Author Notes

Editor's Note: The online version of this article contains faculty-of-resident global rotation assessments and specific entrustable professional activities for each of the 4 rotations.

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

This work was previously presented at the Accreditation Council for Graduate Medical Education Annual Educational Conference, San Diego, California, February 26–March 1, 2015; Pediatric Academic Societies Annual Meeting, San Diego, California, April 25–28, 2015; and Association of Pediatric Program Directors Annual Spring Meeting, New Orleans, Louisiana, March 30–April 2, 2016.

Corresponding author: Jerry G. Larrabee, MD, MEd, University of New Mexico, Department of Pediatrics, MSC10 5590, 1 University of New Mexico, Albuquerque, NM 87131-0001, 505.272.5551, fax 505.272.6845, jlarrabee@salud.unm.edu
Received: 07 Jun 2019
Accepted: 23 Oct 2019