Programmatic Assessment: The Secret Sauce of Effective CBME Implementation

William F. Iobst, MD, FACP and
Eric S. Holmboe, MD, MACP, FRCP
Online Publication Date: 01 Aug 2020
Page Range: 518 – 521
DOI: 10.4300/JGME-D-20-00702.1

In 2001, the Accreditation Council for Graduate Medical Education (ACGME) launched the Outcome Project and introduced the graduate medical education (GME) community to the 6 general competencies as a framework to guide the transformation to competency-based medical education (CBME).1 We are now 20 years into this transformation, and while the GME community has advanced CBME through activities such as the creation and implementation of specialty-specific milestones, significant work remains if the goal of the Outcome Project is to be achieved.

Essential to this transformation is the creation of a “shared mental model” of the desired outcome of GME. At a basic level, that outcome is a physician workforce with the abilities to meet 21st century societal health care needs. The general competency framework recognized the need for new competencies as well as the need to advance the teaching and assessment of traditional competencies such as medical knowledge and professionalism. As described by the International Competency-based Medical Education (ICBME) Collaborators, CBME is an approach to preparing physicians for practice that is fundamentally oriented to graduate outcome abilities and organized around competencies derived from an analysis of societal and patient needs.2 These needs are captured by the Institute for Healthcare Improvement Quadruple Aim: the health of the population, the per capita cost of care, the experience of care, and the well-being of the health care workforce.3

While the desired outcomes have been codified, operationalizing the path to achieving them has proved challenging. The reasons are many. At a systems level, transforming the time-based processes and structures that are hardwired into GME requires major rethinking by the entire GME enterprise. At the same time, transformation requires new approaches to developing, delivering, and evaluating educational programs. Robust assessment has always been essential to effective learning and professional development. CBME and the 6 ACGME Core Competencies require a systematic approach to assessment, which means training programs must employ an integrated combination of multiple assessments. This combination should be managed as a program of assessment, sometimes referred to as programmatic assessment. The days of simply relying on high-stakes tests and faculty rotational evaluations should be over. Programmatic assessment requires ongoing, longitudinal training and development of the faculty and administrators charged with driving this transformation.

This discussion will focus on competency-based assessment from a systems perspective and will describe ACGME initiatives targeting the development of GME educators and administrators in programmatic assessment.

CBME as a System

Conceptualizing a CBME program through a systems lens highlights the complexity of the GME enterprise. Simply defined, a system consists of 2 or more interdependent parts that work together to accomplish a shared aim. GME programs consist of multiple interdependent parts that must interact effectively and efficiently to produce high-quality care within the teaching institution and, ultimately, physicians well prepared for 21st century practice. To design a CBME program using systems thinking, we need a better understanding of the core components of a CBME-designed program. Van Melle and colleagues have identified 5 core components of CBME: an outcomes-based competency framework, progressive sequencing of competencies within that framework, learning experiences tailored to those competencies, teaching tailored to those competencies, and programmatic assessment.4 Effective programmatic assessment is essential to ensuring that the desired outcome of CBME is achieved.

CBME Programmatic Assessment as Part of the Larger GME System

Within the larger CBME system, effective programmatic assessment can be conceptualized as a subsystem. The programmatic assessment subsystem consists of the individuals and groups who work together on a regular basis to perform appropriate assessments that enable valid and reliable determinations of learner progression toward highly competent, unsupervised practice. This program of assessment shares agreed-upon goals and outcomes, linked individual learner assessment and program evaluation processes, information about learners' performance (ie, both feedback and feedforward mechanisms), and the desire to produce a trainee fully prepared to enter fellowship or the health care system to provide high-quality care. Done accurately and effectively, programmatic assessment optimizes learning, facilitates decision making regarding learner progression toward desired outcomes, and informs the program's quality improvement activities.5

While programmatic assessment requires an integrated combination of assessment methods and tools to achieve this vision, assessment by faculty remains an essential component. To maximize the value of faculty assessment within a system of assessment, faculty training is crucial. Assessment requires a sophisticated set of knowledge, skills, and attitudes; it involves key competencies such as understanding and applying core pedagogical concepts and practices, cultivating robust observational and questioning skills, providing effective feedback and coaching, and using assessment data to help individuals and programs continually improve. Because assessment methods and tools are only as good as the faculty using them, it is imperative that faculty develop the competencies needed to assess accurately and effectively.

ACGME Faculty Development Initiatives on Programmatic CBME Assessment

Like other key stakeholders in GME, the ACGME has committed significant time and energy to developing a portfolio of faculty development resources.6 In addition to its Annual Educational Conference, the ACGME has delivered a 6-day course addressing competency-based assessment since 2014. This course, held in Chicago, has been offered 3 to 4 times a year since its inception and to date has educated more than 600 program directors, associate and assistant program directors, faculty, and program administrators. In addition, a regional hub faculty development program was launched in 2014 to expand access to the content of the Chicago course. There are currently 17 regional hub sites (Box 1) that provide a modified 3-day version of the 6-day course; 11 of these hubs are located within the United States and 5 are international. Since its inception, the regional hub program has enrolled nearly 600 learners. Regional hub core content is presented through interactive workshops addressing the core concepts and language of CBME, assessment as a program, the characteristics of good assessment, effective feedback strategies, and best practices in direct observation as a work-based assessment. All regional hub sessions also include a hands-on workshop that uses live simulation to practice feedback and direct observation skills. Both the 6-day course and the regional hub sessions require that attendees complete a “commitment to change” describing how they will apply course content to innovation in their home GME program.

Information about all these faculty development offerings can be found on the ACGME website under Meetings and Educational Activities. Times and locations of courses are listed under Courses and Workshops–Developing Faculty Competencies in Assessment. Attendees of the Developing Faculty Competencies in Assessment course or any of the regional hub programs also have access to the modules listed in Box 2.

The ACGME also has developed a distance learning platform—Learn at ACGME—which provides some online faculty development content in assessment for all faculty. CME credit is offered for some of this content. This online portal, sponsored and curated by the ACGME Office of Distance Learning, can be accessed at www.acgme.org/distancelearning. In addition to providing on-demand content, Learn at ACGME hosts discussion forums for conversations with colleagues and ACGME staff. The website lists content by areas of interest for designated institutional officials, program directors, program coordinators, faculty, and residents and fellows. A list of currently available open access content on Learn at ACGME is provided in Box 3.

Future Learn at ACGME content is in development and will include presentations from additional ACGME courses, the Annual Educational Conference sessions, and materials essential for advancing CBME and accreditation. Later this year, open access web-based apps to facilitate and record assessment through direct observation and to assess interprofessional teamwork will also be made available.

Faculty training in assessment will be crucial to maximizing the value of assessment. As members of the GME community, we invite you to use these resources as needed. If you have specific questions that are not answered by the website or portal content, please contact the ACGME Department of Research, Milestones Development and Evaluation (milestones@acgme.org) or the Office of Distance Learning (de@acgme.org).

References

1. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21(5):103-111. doi:10.1377/hlthaff.21.5.103.
2. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638-645. doi:10.3109/0142159X.2010.501190.
3. Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Fam Med. 2014;12(6):573-576. doi:10.1370/afm.1713.
4. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J, et al. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002-1009. doi:10.1097/ACM.0000000000002743.
5. van der Vleuten CP, Schuwirth LW, Driessen EW, Dijkstra J, Tigelaar D, Baartman LK, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34(3):205-214. doi:10.3109/0142159X.2012.652239.
6. Holmboe ES, Ward DS, Reznick RK, Katsufrakis PJ, Leslie KM, Patel VL, et al. Faculty development in assessment: the missing link in competency-based medical education. Acad Med. 2011;86(4):460-467. doi:10.1097/ACM.0b013e31820cb2a7.
Copyright: Accreditation Council for Graduate Medical Education 2020

Author Notes

Editor's Note: The ACGME News and Views section of JGME includes data reports, updates, and perspectives from the ACGME and its review committees. The decision to publish the article is made by the ACGME.

Corresponding author: William F. Iobst, MD, FACP, Accreditation Council for Graduate Medical Education, 401 N Michigan Avenue, Suite 2000, Chicago, IL 60611, 312.755.5076, wiobst@acgme.org