Perceptions of Internal Medicine Residency Program Candidates on the Use of Simulation in the Selection Process

Keith Cannon, MD,
Zachary Hartsell, MPAS, PA-C,
Ilko Ivanov, MD,
Joseph Charles, MD,
Harshad Joshi, MD,
Janis Blair, MD, and
Holly Geyer, MD
Online Publication Date: 01 Jun 2014
Page Range: 338 – 340
DOI: 10.4300/JGME-D-13-00276.1

Abstract

Background

The recruitment of skilled candidates into internal medicine residency programs has relied on traditional interviewing techniques with varying degrees of success. The development of medical simulation technology has provided a new arena in which to assess candidates' clinical skills, knowledge base, situational awareness, and problem-solving abilities within a standardized environment for educational and assessment purposes.

Objective

The purpose of this study was to investigate the interest of program candidates in incorporating simulation medicine into the internal medicine residency interview process.

Methods

In this prospective, survey-based analysis, candidates who completed an interview between October 2012 and January 2013 with an accredited internal medicine residency program were sent a postmatch survey that included 3 additional questions about their prior experience with medical simulation and their views on incorporating the technology into the interview format.

Results

Of the 88 candidates who completed an interview, 92% (n  =  81) were scheduled to graduate medical school in 2013 and were graduates of a US medical school. All survey responders described previous experience with medical simulation. Fifty-eight percent (n  =  51) of responders described being “less likely” to interview with or join a residency program if they were required to participate in a 10-minute medical simulation during the interview process.

Conclusions

The results of this study suggest that despite the increasing role of technology in medical education, its role in high-stakes evaluations (such as residency interviews) requires further maturation before general acceptance by residency candidates can be expected.

Introduction

Graduate medical education programs routinely use traditional interview techniques to assess applicants' clinical skills and cognitive knowledge. This form of assessment has historically demonstrated poor reliability and variable predictive validity. Other fields, such as aviation, firefighting, and the military, have recognized these inherent limitations and subsequently incorporated the performance of candidates within simulated environments as a metric by which to enhance candidate selection.1

Within the past decade, simulation technology has gained greater acceptance within the medical community. By providing standardized, safe clinical scenarios that accommodate skill rehearsal without compromising patient care, the technology has experienced widespread application within most academic medical institutions. Despite its general acceptance in training and skill assessment, few institutions have used simulation in high-stakes evaluations such as residency applicant interviews.

In light of the increasing acceptance and availability of medical simulation, we sought to evaluate whether candidates applying to our internal medicine residency program would be interested in interviewing and matching with our residency if such technology were routinely incorporated into the residency interview.

Methods

In this prospective, survey-based analysis, candidates who completed an interview between October 2012 and January 2013 with the Mayo Clinic in Phoenix, Arizona, an accredited internal medicine residency program, were sent a postmatch survey. The survey was developed by program staff without formal validation or piloting and has been used for 10 consecutive years to improve recruitment and the quality of the interview experience. All surveys were sent electronically (SurveyMonkey, Palo Alto, CA) within 1 month of the match. Survey participation was voluntary, and all results were collected anonymously. In addition to standard survey questions regarding the interview experience, 3 questions were added about candidates' prior experience with medical simulation and their views on incorporating the technology into the interview format (table 1). Descriptive statistics were used to summarize the data.

TABLE 1 Simulation-Based Postmatch Survey Questions With Candidate Responses

The study was deemed to be a quality improvement project by our Institutional Review Board after review of the project's study design.

Results

A total of 103 applicants were offered interviews. Of those applicants, 88 candidates completed an interview (58% were men). Among interviewed candidates, 81 (92%) were scheduled to graduate medical school in 2013 (graduation date range, 1995–2013), and 81 (92%) were graduates of a US medical school. A total of 50 (57%) applicants returned the survey (table 1). All survey responders had previous training experience with medical simulation, and 47% (n = 41) had more than 10 hours of experience. Most responders (58%, n = 51) described being "less likely" to interview with or join the residency program if they were required to participate in a 10-minute medical simulation scenario during the interview process. Only 4% (n = 4) and 2% (n = 2) of responders, respectively, described themselves as "more likely" to interview with and to join the residency if required to participate in the simulation. In general, candidates with less experience with simulation appeared less interested in interviewing with or joining the residency program (table 2).
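For readers who wish to check the simple proportions reported above, the following minimal Python sketch (not part of the original study; counts are taken from this Results paragraph and all variable names are ours) reproduces the descriptive percentages from the raw counts.

```python
# Illustrative only: reproduces the descriptive percentages reported in the
# Results section from the raw counts given there. Not part of the study.

def percent(count: int, total: int) -> int:
    """Return count/total as a whole-number percentage."""
    return round(100 * count / total)

interviewed = 88        # candidates who completed an interview
returned_surveys = 50   # candidates who returned the postmatch survey
graduating_2013 = 81    # scheduled to graduate medical school in 2013
us_graduates = 81       # graduates of a US medical school

print(f"Survey response rate: {percent(returned_surveys, interviewed)}%")  # -> 57%
print(f"Graduating in 2013:   {percent(graduating_2013, interviewed)}%")   # -> 92%
print(f"US medical graduates: {percent(us_graduates, interviewed)}%")      # -> 92%
```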

TABLE 2 Comparison of Candidate Experience With Simulation and Responses to Questions Regarding Interviewing and Joining Residency Program

Discussion

The emergence of medical simulation as an accepted tool for medical education, skill training, and technique assessment offers a new platform from which to construct standardized candidate appraisals that complement the traditional interview format. However, more than half of our applicants indicated reluctance to interview with or join the residency program if required to participate in a simulation exercise. These results suggest that, despite adequate candidate exposure to the technology, simulation-based high-stakes assessments may be met with apprehension that could compromise recruitment efforts.

Delineating the reasoning behind these unanticipated responses was beyond the scope of this study and warrants future investigation. One may hypothesize that the anxiety caused by being observed in a simulation scenario outweighed the benefits of interviewing with the institution. Although anxiety was not examined directly in this study, an evaluation of medical students undergoing medical school interviews with a simulation component found that candidates "disagree[d] somewhat" with the assertion that the simulation-inclusive format was stressful.2 Alternatively, our candidates may have assumed that an institution willing to use such "aggressive" assessment methods runs a program of similarly rigorous character.

Given the anonymous nature of the survey, it remains unclear whether the candidates who were hesitant about simulation were ranked higher or lower by the interviewing institution than candidates who were interested in or indifferent to participating. It is possible that those who signaled a refusal to participate did so out of concern about their own baseline clinical skills. Alternatively, these candidates may possess superior skills and thus feel no need to engage in rigorous assessment methods when their competitive profile would likely secure entrance into other acceptable programs. It should be emphasized that conclusions drawn from this study are limited by the unvalidated nature of the survey and that candidates may have interpreted the survey questions differently, which could have affected their responses. Additionally, the applicant pool responding to this survey represented one specialty at a single program during a single time period; thus, results may not be generalizable to other institutions or specialties.

Independent of applicant perceptions, the role of simulation in the interview process has been increasingly recognized within the health care field. In 2000, the release of the Institute of Medicine's To Err Is Human: Building a Safer Health System set the tone for an era in which the prime directive remains reducing medical errors and enhancing patient safety.3 With the tools currently available, selecting training candidates whose clinical skills would best prepare them to succeed in a quality-focused environment has remained a challenge for most graduate medical education programs. Historical reliance on standardized tests (eg, the US Medical Licensing Examination [USMLE]), medical school grades, clinical rotation evaluations, and behavioral interviewing techniques frequently leads to the selection of residents whose medical knowledge may not reflect their actual clinical acumen. The American Board of Medical Specialties and the Accreditation Council for Graduate Medical Education require documentation of competency in 6 cognitive and functional domains for all graduates, and both organizations have encouraged the use of simulation in assessing these skills as the technology's availability has grown. Indeed, the incorporation of standardized patients into the USMLE Step 2 Clinical Skills examination reflects a growing recognition that patient care skills derive from more than the baseline clinical knowledge currently assessed by standardized testing and that these skills may be aptly assessed by replicating real health care encounters. Internationally, the Multiple Mini-Interview at McGill University is an example of incorporating medical simulation into the medical school admissions process.2 Within the past decade, simulation has been increasingly applied to applicant interviews internationally and to the hiring process of practicing health care staff.4,5 This exploratory study suggests that candidates may be hesitant about its adoption. Further research will be required to ascertain the ideal use of simulation in high-stakes interviews before the technology is applied as a reliable candidate selection tool.

Copyright: 2014

Author Notes

All authors are at the Department of Internal Medicine, Mayo Clinic in Arizona. Keith Cannon, MD, is Department Chair, Hospital Internal Medicine, and former Program Director, Internal Medicine Residency; Zachary Hartsell, MPAS, PA-C, is Physician Assistant, Department of Hospital Medicine, and Assistant Professor of Medicine; Ilko Ivanov, MD, is Attending Physician, Department of Hospital Medicine; Joseph Charles, MD, is Attending Physician, Department of Hospital Medicine; Harshad Joshi, MD, is Attending Physician, Department of Hospital Medicine; Janis Blair, MD, is Program Director, Internal Medicine Residency; and Holly Geyer, MD, is Attending Physician, Department of Hospital Medicine.

Corresponding author: Holly Geyer, MD, Mayo Clinic, Department of Internal Medicine, 13400 E Shea Boulevard, Scottsdale, AZ 85259, geyer.holly@mayo.edu

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

Received: 28 Jul 2013
Accepted: 12 Jan 2014