Use of Milestones and Development of Entrustable Professional Activities in 2 Hematology/Oncology Training Programs

Nathan M. Shumway, DO; Jennifer J. Dacus, MD; Kate I. Lathrop, MD; Elizabeth P. Hernandez; Maria Miller, MEd; and Anand B. Karnad, MD
Online Publication Date: 01 Mar 2015
Page Range: 101 – 104
DOI: 10.4300/JGME-D-14-00283.1

Abstract

Background

The Next Accreditation System (NAS) increases the focus on educational outcomes and meaningful evaluation of learners. This requires that key clinical faculty develop new assessment formats such as entrustable professional activities (EPAs).

Objectives

To develop milestone-based assessment tools supporting 5 EPAs for a hematology/oncology fellow continuity clinic, and to educate key clinical faculty about the Clinical Competency Committee (CCC) and the NAS.

Methods

Program directors from 2 hematology/oncology fellowship programs developed 5 EPAs for continuity clinic evaluation, supported by milestone-based assessment, and met to create a unified CCC charter. Key clinical faculty helped develop the milestone-based evaluation of the fellow continuity clinic through the creation of 5 hematology/oncology-specific EPAs. Formal entrustment regarding the EPAs was deliberated by the CCC.

Results

A total of 18 fellows were evaluated. Clinical Competency Committee deliberation at each institution took approximately 10 minutes per fellow for discussion and decision regarding entrustment for all 5 EPAs supporting continuity clinic. One-third of postgraduate year (PGY)-4 fellows, 50% of PGY-5 fellows, and 100% of PGY-6 fellows were deemed competent in all 5 EPAs by the CCC.

Conclusions

All hematology/oncology trainees in San Antonio were evaluated using milestone-based assessment for continuity clinic, and entrustment decisions regarding 5 EPAs were made by the CCC. This project may provide other programs with a sound basis for adoption and further development of the next generation of evaluation tools at their institutions. Entrustable professional activities that are rotation specific should be used as a starting point for linking to the competencies, subcompetencies, and the reporting milestones.

Editor's Note: The online version of this article contains a Clinical Competency Committee charter, an Entrustable Professional Activity worksheet, and milestone-based assessment tools.

Introduction

In July 2014, implementation of the Next Accreditation System (NAS) began for internal medicine subspecialty programs. Programs were tasked with semiannual reporting of milestone data, with the first report scheduled for December 2014. While there have been some examples of implementation of the NAS and milestone-based curricula in core internal medicine (IM) programs,1-3 there are few data to guide IM subspecialty programs in this process. The IM Subspecialty Reporting Milestones were released in February 2014,4 leaving only a few months for programs to incorporate them and build a milestone-based assessment platform. The pace of implementation and the need to educate our key clinical faculty (KCF) regarding this process were the impetus for a pilot project between 2 hematology/oncology (HO) fellowship programs in San Antonio, Texas.

This collaborative effort resulted in the creation of 5 entrustable professional activities (EPAs) and the development of a milestone-based assessment tool. We describe the development of the EPAs and milestone-based assessments and our experience with the implementation of these tools, and provide recommendations for other programs.

Methods

The HO program directors (PDs) at 2 academic medical centers in San Antonio (San Antonio Uniformed Services Health Education Consortium [SAUSHEC] and the University of Texas Health Science Center San Antonio [UTHSCSA]) started weekly 1-hour phone conferences in the spring of 2013 to develop and implement a milestone-based assessment tool and to educate KCF about evaluation in the NAS. Weekly calls were held for 8 months and included the program coordinators and a chief fellow.

We focused our pilot project on 1 clinical rotation, the HO continuity clinic. This allowed us to quickly implement and “practice” using the milestones. Our first step was clarifying the roles, responsibilities, and composition of the Clinical Competency Committee (CCC). After a review of the Accreditation Council for Graduate Medical Education (ACGME) requirements, we developed a CCC charter used by both institutions (provided as online supplemental material). We then engaged 2 to 3 KCF members at each institution to help develop 5 EPAs for the HO continuity clinic, using a template suggested by ten Cate5 (figure 1). Each template took 30 to 45 minutes on average to complete. Because it has been suggested that a manageable number of EPAs for an entire curriculum is between 20 and 30,6 we limited our continuity clinic evaluation to 5 EPAs. The EPA templates were reviewed during our weekly phone calls. The 5 EPAs chosen were writing chemotherapy orders, performing toxicity checks, monitoring response to therapy, performing bone marrow biopsies, and providing end-of-life care (provided as online supplemental material).

FIGURE 1. Entrustable Professional Activity Worksheet. Abbreviations: MK, Medical Knowledge; PC, Patient Care; ICS, Interpersonal and Communication Skills; P, Professionalism; PBLI, Practice-Based Learning and Improvement; SBP, Systems-Based Practice; RECIST, Response Evaluation Criteria in Solid Tumors; ASCO, American Society of Clinical Oncology; NCCN, National Comprehensive Cancer Network. Adapted from ten Cate,5 2013.


Once our EPAs were complete, we started to build milestone-based evaluations supporting them. At the time, we did not yet have the IM Subspecialty Reporting Milestones, and the 22 core IM Reporting Milestones had just been published.7 Fortunately, our core IM program had created a continuity clinic evaluation template using the original 142 developmental milestones published by the American Board of Internal Medicine and the ACGME.8 We modified this template so that each tool included 2 to 3 milestones supporting each of the 6 core competencies, and we edited these milestones to fit our subspecialty. Our CCC members chose which milestones to use; the process of editing and reviewing our milestone-based assessment tools, and then ensuring that they supported our EPAs, took about 3 to 4 weeks (provided as online supplemental material).
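
For readers who want a concrete picture of that structure, the sketch below (in Python, with purely hypothetical milestone labels and EPA wording rather than the actual assessment items) illustrates the shape of one such tool: a single EPA supported by 2 to 3 milestones under each of the 6 core competencies.

```python
# Hypothetical sketch of one continuity clinic assessment tool: a mapping from
# each of the 6 core competencies to 2 to 3 supporting milestones, tied to a
# single EPA. The milestone identifiers are placeholders, not ACGME wording.

epa = "Writing chemotherapy orders"

assessment_tool = {
    "PC":   ["PC milestone 1", "PC milestone 2"],
    "MK":   ["MK milestone 1", "MK milestone 2", "MK milestone 3"],
    "ICS":  ["ICS milestone 1", "ICS milestone 2"],
    "P":    ["P milestone 1", "P milestone 2"],
    "PBLI": ["PBLI milestone 1", "PBLI milestone 2"],
    "SBP":  ["SBP milestone 1", "SBP milestone 2"],
}

# Sanity check: every core competency is represented by 2 to 3 milestones.
for competency, milestones in assessment_tool.items():
    assert 2 <= len(milestones) <= 3, f"{competency} needs 2 to 3 milestones"

total = sum(len(m) for m in assessment_tool.values())
print(f"EPA '{epa}' is supported by {total} milestones "
      f"across {len(assessment_tool)} competencies.")
```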

From July through December 2013, we piloted our new evaluation system at both institutions. Milestone evaluations were loaded into New Innovations software (New Innovations Inc) for dissemination to the KCF at 3 and 6 months. After 6 months, each institution's CCC met to deliberate formal entrustment decisions regarding the EPAs, per the CCC charter.

The study was reviewed by the Institutional Review Board of the University of Texas Health Science Center, which considered it nonregulated research and therefore exempt.

Results

During the 6-month study period, 100% of the fellows (n = 18; 7 SAUSHEC, 11 UTHSCSA) were evaluated. On average, each institution's CCC took approximately 10 minutes per fellow to discuss and decide on entrustment for all 5 EPAs supporting continuity clinic. Thirty-three percent (2 of 6) of postgraduate year (PGY)-4 fellows, 50% (3 of 6) of PGY-5 fellows, and 100% (6 of 6) of PGY-6 fellows were deemed competent in all 5 EPAs.

Discussion

This is the first published experience describing implementation of a milestone-based assessment and use of EPAs for an IM subspecialty. We offer the following suggestions for those working to implement the NAS.

Our first recommendation is to establish goals (EPAs) for each clinical rotation. This is the foundation on which to build an evaluation framework supporting the NAS (figure 2). If published EPAs are not available, we recommend creating your own rotation-specific EPAs. For smaller training programs, 20 to 30 broad-based EPAs would be reasonable. Our KCF members interact with the trainees on a daily basis, making our CCC deliberations less reliant on detailed evaluation tools. For larger programs, a greater number of more specific EPAs may be needed to ensure that a trainee is competent, as CCC members may have had little if any interaction with the trainee. Warm et al9 have implemented and described in detail an example of how to modify EPAs for larger programs.

FIGURE 2. Suggested Framework for Next Accreditation System. Abbreviations: EPA, Entrustable Professional Activity; PC, Patient Care; MK, Medical Knowledge; PBLI, Practice-Based Learning and Improvement; ICS, Interpersonal and Communication Skills; P, Professionalism; SBP, Systems-Based Practice.


A second recommendation is to link the core competencies to the EPAs your program has developed. We accomplished this by using a very simple EPA template, as previously described.5 Once this has been done, program leadership should review the milestones for their specialty in the context of supporting all clinical rotations. This process takes time, as ACGME terminology can be confusing. For example, for IM subspecialty programs there are 23 reporting milestones; a closer look shows that these are really 23 subcompetencies, each with 5 columns of entrustment, supported by a total of 354 narrative milestones. The focus should not be on using all 354 narrative milestones, but on ensuring that each of the 23 subcompetencies is addressed by several different evaluation tools anchored by some of these narrative descriptors. At the completion of a training program, the collective data from all of the evaluations should give the program a global sense of trainee competence.
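
To make this bookkeeping concrete, the sketch below (in Python, with hypothetical tool names and placeholder subcompetency labels rather than ACGME wording) shows one way a program could track which subcompetencies each evaluation tool anchors and flag any that no tool addresses.

```python
# Hypothetical sketch of the coverage idea described above: rather than trying
# to use all 354 narrative milestones, confirm that each of the 23 reporting
# subcompetencies is addressed by at least one evaluation tool. Labels and
# tool names are illustrative placeholders only.

subcompetencies = {f"SC-{i:02d}" for i in range(1, 24)}  # 23 subcompetencies

# Each evaluation tool lists the subcompetencies its narrative anchors address.
evaluation_tools = {
    "continuity clinic (5 EPAs)": {"SC-01", "SC-02", "SC-05", "SC-09", "SC-14"},
    "inpatient consult service":  {"SC-03", "SC-04", "SC-06", "SC-10"},
    # ...additional rotation-specific tools would extend coverage...
}

covered = set().union(*evaluation_tools.values())
missing = sorted(subcompetencies - covered)
print(f"Covered {len(covered)} of {len(subcompetencies)} subcompetencies.")
print(f"Not yet addressed by any tool: {missing}")
```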

Finally, we recommend including fellows and program coordinators in this process. The success of our collaboration was dependent on the organizational efforts of our program coordinators and the enthusiasm from the chief fellow.

Conclusion

Our pilot project provides a sound approach to assessment that can be adopted by other programs, and can foster the development of the next generation of evaluation tools. We recommend the use of EPAs that are rotation specific as a starting point for linking to the competencies, subcompetencies, and the reporting milestones.

Copyright: 2015

Author Notes

Nathan M. Shumway, DO, is Program Director, Hematology/Oncology Fellowship, San Antonio Uniformed Services Health Education Consortium; Jennifer J. Dacus, MD, is Key Clinical Faculty, Division of Hematology/Oncology, University of Texas Health Science Center; Kate I. Lathrop, MD, is Chief Fellow, Division of Hematology/Oncology, University of Texas Health Science Center; Elizabeth P. Hernandez is Program Coordinator, Hematology/Oncology Fellowship, San Antonio Uniformed Services Health Education Consortium; Maria Miller, MEd, is Program Coordinator, Division of Hematology/Oncology, University of Texas Health Science Center; and Anand B. Karnad, MD, is Program Director, Division of Hematology/Oncology, University of Texas Health Science Center.

Corresponding author: Nathan M. Shumway, DO, San Antonio Military Medical Center, 3551 Roger Brooke Drive, Fort Sam Houston, TX 78234, 210.916.4808, fax 210.916.1184, nathan.shumway@us.army.mil

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

This research was presented as a poster during the 2014 ACGME Annual Education Conference, February 27–March 2, National Harbor, Maryland.

The views expressed herein are those of the authors and do not reflect the official policy or position of Brooke Army Medical Center, the US Army Medical Department, the US Army Office of the Surgeon General, the Department of the Army, the Department of Defense, or the US government.

Received: 01 May 2014
Accepted: 28 Oct 2014