High-Fidelity Simulation as an Experiential Model for Teaching Root Cause Analysis

Sadeq A. Quraishi, MD, MHA, Stephen J. Kimatian, MD, W. Bosseau Murray, MD, PhD, and Elizabeth H. Sinz, MD
Online Publication Date: 01 Dec 2011
Page Range: 529–534
DOI: 10.4300/JGME-D-11-00229.1

Abstract

Purpose

The purpose of this study was to assess the effectiveness of high-fidelity simulation for teaching root cause analysis (RCA) in graduate medical education.

Methods

Thirty clinical anesthesiology-1 through clinical anesthesiology-3 residents were randomly assigned to 2 groups: group A participants received a 10-minute lecture on RCA and participated in a simulation exercise in which a medical error occurred, and group B participants received the 10-minute lecture on RCA only. Participants completed baseline, postintervention, and 6-month follow-up assessments evaluating their attitudes toward, and their understanding of, RCA and “systems-based” care.

Results

All 30 residents completed the surveys. Baseline attitudes and knowledge scores were similar between groups. Postintervention knowledge scores were also similar between groups; however, group B was significantly more skeptical (P < .001) about the use of RCA and “systems improvement” strategies. Six months later, group A demonstrated retained knowledge scores and unchanged attitude, whereas group B demonstrated significantly worse knowledge scores (P  =  .001) as well as continued skepticism toward a systems-based approach (P < .001) to medical error reduction.

Conclusion

High-fidelity simulation in conjunction with focused didactics is an effective strategy for teaching RCA and systems theory in graduate medical education. Our findings also suggest that there is greater retention of knowledge and increased positive attitude toward systems improvement when focused didactics are coupled with a high-fidelity simulation exercise.

Editor's Note: The online version of this article contains the assessment tool used to evaluate residents for internal and external validity prior to implementation of the current study.

Introduction

Root cause analysis (RCA) investigates and categorizes the events that affect safety, quality, reliability, and production of health care services.1 It helps to identify not only what and how an event occurred, but why it happened.2 Generally, “human errors” can be traced to some well-defined causes.3 Understanding the reasons why these events occur is the key to developing effective recommendations and to specifying workable corrective measures to prevent future adverse events.

RCA implementation consists of 7 critical steps (figure).4 An important premise is that a single intervention cannot always prevent recurrence, because a particular event may have several root causes. RCA is often considered an iterative process, and it is viewed as a tool for continuous quality improvement.5 Although its potential in health care is only now being realized, RCA has been implemented successfully for decades in manufacturing and aviation.6–9

FIGURE. Root Cause Analysis: The Critical Steps. Adapted from the Office of Nuclear Safety Policy and Standards. Root Cause Analysis Guidance Document. Washington, DC: US Department of Energy; 1992.

Citation: Journal of Graduate Medical Education 3, 4; 10.4300/JGME-D-11-00229.1

Market forces along with internal pressure from organizations dedicated to patient safety have ushered in a new era in medicine focused on enhancing the quality of care.3,10–13 Accordingly, innovations in medical education must be nurtured to develop physicians equipped with the knowledge and tools to cultivate a culture of quality. Yet, traditional quality improvement strategies in graduate medical education have been largely reactive in nature.14–16 Often, such “after-the-fact” analyses of medical errors focus on individual accountability and reprimand, which are ineffective methods of enhancing patient safety and seldom help to improve the overall quality of care.17–19

In an effort to transform graduate medical education, the Accreditation Council for Graduate Medical Education (ACGME) identified 6 general competencies as part of its Outcome Project.20 Two of these competencies—systems-based practice, and practice-based learning and improvement—have direct implications for health care quality and call for a shift from narrow, discipline-specific views of patient care to an integrated model that enhances organizational excellence.21,22 As these concepts have matured, so has the expectation that training programs will meet the challenge by teaching residents to systematically analyze practice with quality improvement methods, implement change strategies with the goal of practice improvement, work in interprofessional teams to enhance patient safety, and participate in the identification of “systems” errors with the goal of implementing “systems” solutions.23 Although “error reduction” has received significant attention within the medical education community, there is limited research on novel methods to empower residents to meaningfully improve quality.

In recent years, simulation has emerged within residency programs as a valuable tool for attaining basic technical adeptness and improving resident performance in medical procedures.24 Limited evidence also suggests that simulation is a useful means of preparing physicians to function more effectively within multidisciplinary care teams and to build leadership skills for crisis resource management.25–29 We sought to assess the effectiveness of high-fidelity simulation as a proactive tool for teaching RCA in graduate medical education, helping physicians-in-training identify and manage systems-based obstacles that may impede optimal patient care.

Methods

Assessment Tool

The assessment instrument for the study (provided as online supplemental material) was designed by a group of perioperative physicians with expertise in systems theory. The instrument was successfully pilot tested with 5 residents (not involved with the current study) prior to its use in the study.

Setting and Participants

We recruited clinical anesthesiology-1 through clinical anesthesiology-3 residents from the Pennsylvania State University College of Medicine to participate in this exercise as an alternative to their weekly department-sponsored didactic session. Written consent was documented for all study participants. An electronic random number generator (SISA Binomial, Southampton, United Kingdom) was used to assign residents to either group A (n  =  15) or group B (n  =  15). The study received approval from the appropriate Institutional Review Boards. Data were collected anonymously.
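
As an illustration of the allocation step, the minimal sketch below shows balanced random assignment of 30 participants to 2 groups of 15. It assumes Python's standard pseudorandom generator rather than the SISA Binomial tool actually used in the study, and the resident identifiers are hypothetical.

```python
# Sketch of balanced random allocation (assumption: Python's standard
# library in place of the SISA Binomial tool cited in the study).
import random

residents = [f"resident_{i:02d}" for i in range(1, 31)]  # 30 hypothetical IDs
rng = random.Random(2011)  # fixed seed so this example is reproducible
rng.shuffle(residents)

# Split the shuffled roster into two equal groups (n = 15 each).
group_a, group_b = residents[:15], residents[15:]
print(len(group_a), len(group_b))  # 15 15
```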

Knowledge and Attitude Assessments

Group A completed an initial evaluation designed to assess baseline knowledge of organizational behavior, attitude toward systems thinking, and understanding of RCA. The group then received a 10-minute oral presentation on RCA and systems improvement theory and participated in a simulation exercise, which was immediately followed by a debriefing session. Group B members completed the same initial evaluation and 10-minute presentation as group A, but they did not participate in a simulation exercise. Both groups completed a postintervention evaluation to conclude their primary assessment. All participants were retested 6 months later using the original assessment tool.

Simulation Intervention

Group A members were split into smaller subgroups, and a consistent simulation exercise was used for the assessments. Two volunteers from each subgroup played the roles of resident and attending anesthesiologist in a simulated operating room equipped with an Emergency Care Simulator manikin (METI, Sarasota, FL). The remaining group members observed the scenario. Experienced actors played the roles of the care team members (surgeon, scrub nurse, and circulating nurse), and the manikin was controlled via a remote computer. The case involved a healthy, young patient undergoing nasal surgery who developed hypertension and tachycardia after injection of local anesthetic by the surgeon. The simulated patient eventually developed electrocardiographic changes consistent with ischemia. The goal of this element was to evaluate the clinical decision-making skills of the participating residents. If the residents did not stop the care team from continuing with the procedure, the surgeon was trained to abort the surgery and initiate an appropriate course of action to stabilize the patient. Following the simulation exercise, all participants convened for a debriefing session, where an experienced facilitator led a discussion to ensure that the participants had recognized key facts related to the simulated event.
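
The scenario's branch logic can be summarized in a short, purely illustrative sketch; the real manikin was driven by METI's proprietary software, and the function and event strings below are hypothetical.

```python
# Hypothetical sketch of the scenario's branch points, assuming a simple
# scripted event sequence; not the actual simulator control software.
def run_scenario(residents_call_stop: bool) -> list[str]:
    events = [
        "surgeon injects local anesthetic",
        "patient develops hypertension and tachycardia",
        "ECG shows changes consistent with ischemia",
    ]
    if residents_call_stop:
        events.append("residents halt the procedure; stabilization begins")
    else:
        # Built-in safety net: the surgeon aborts if the residents do not.
        events.append("surgeon aborts the surgery and stabilizes the patient")
    return events

print(run_scenario(residents_call_stop=False))
```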

Group A members were then asked to formulate a list of reasons to explain why such an event occurred and to propose “appropriate” measures to prevent recurrence. The goal of this session was to ensure that group A members had understood the underlying theoretical foundation of RCA and systems thinking, and that they could apply it to a real-world scenario.

Statistical Analysis

We determined a minimum sample size of 14 (n  =  7 for each group) under the assumption that a 15-point difference in mean knowledge scores would be observed between groups, with a common standard deviation of 10, α  =  0.05, and power set at 0.8 (SISA Binomial). We also assumed a 50% drop-off in participation at the 6-month follow-up assessment, and as such we planned to recruit a minimum of 28 participants. We used a Welch t test to compare group means for attitude and knowledge assessments (α  =  0.05), and continuity-corrected Wald confidence intervals to calculate the difference in the group means (SISA Binomial).
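
For readers who wish to check these design numbers, the sketch below reproduces the power analysis and the Welch t test with common Python libraries (statsmodels and scipy) rather than SISA Binomial. The score data are illustrative, not study data, and differing small-sample approximations may yield a required n of roughly 7 to 8 per group rather than exactly 7.

```python
# Sketch of the reported design calculations, assuming statsmodels/scipy
# in place of the SISA Binomial package cited in the paper.
from scipy import stats
from statsmodels.stats.power import TTestIndPower

# Sample size: detect a 15-point mean difference with common SD 10
# (Cohen's d = 1.5), two-sided alpha = 0.05, power = 0.8.
d = 15 / 10
n_per_group = TTestIndPower().solve_power(
    effect_size=d, alpha=0.05, power=0.8, alternative="two-sided"
)
print(f"required n per group: {n_per_group:.1f}")  # roughly 7-8

# Welch t test (unequal variances) on hypothetical knowledge scores.
group_a = [82, 75, 90, 78, 85, 88, 72]  # illustrative values only
group_b = [70, 65, 80, 60, 74, 68, 71]
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"Welch t = {t_stat:.2f}, P = {p_value:.3f}")
```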

Results

Knowledge and Attitude Assessments

All participants completed the entire study (n  =  30). Baseline demographics were similar between groups in terms of average age, sex, and level of training.

Self-Reported Assessments

Ninety-three percent of the participants (n  =  28) reported that they had not heard of RCA prior to this exercise, and 13% (n  =  4) had not heard of systems improvement theory. At baseline, participants reported similar low scores regarding their understanding of systems improvement (P  =  .28) and RCA (P  =  .09). After intervention, all participants reported similar markedly higher scores regarding their understanding of systems improvement (P  =  .82) and RCA (P  =  .50). At 6-month follow-up, all participants continued to report a high level of understanding of systems improvement (P  =  .81) and RCA (P  =  .25; table 1).

TABLE 1 Self-Reported Knowledge of Root Cause Analysis (RCA) and Systems Theory
TABLE 1

Objective Knowledge Assessments

Initial objective knowledge scores related to systems improvement and RCA were consistent with the baseline self-reported scores (P  =  .28). Postintervention assessment demonstrated similar improvement in knowledge scores in both groups (P  =  .84). At 6-month follow-up, knowledge scores were significantly lower among members of group B (P  =  .001; table 2).

TABLE 2 Objective Knowledge Assessment a
TABLE 2

Attitude Evaluation

Initial evaluations of attitudes toward systems improvement and RCA were similar between groups (P  =  .67); however, immediate postintervention scores revealed that group B members were significantly less likely to accept systems improvement strategies and the use of RCA to improve health care quality (P < .001). At 6-month follow-up, members of group A continued to express positive attitudes toward the use of systems improvement strategies and RCA, whereas members of group B remained significantly less likely to do so (P < .02; table 3).

TABLE 3 Attitude Toward “Systems Improvement” and Root Cause Analysis (RCA) a
TABLE 3

Simulation Intervention

During the simulation exercise, the volunteers in each group A subgroup initially attempted to pharmacologically control the hypertension and tachycardia. With limited success and persistent ST changes on the electrocardiogram, they worked with the care team to abort the procedure. In addition, all subgroups initiated some form of a myocardial infarction protocol and recommended that the patient be transferred to the intensive care unit to commence an appropriate cardiac work-up and to determine further management. The root causes and potential corrective measures identified after the debriefing session (table 4) support the notion that group A understood the concept of RCA and systems thinking, and was successful in applying the concepts to a real-world scenario.

TABLE 4 Identified Root Causes and Proposed Corrective Strategies a
TABLE 4

Discussion

Simulation has been increasingly used in medical education as a means to safely and objectively develop resident knowledge and skills.24 From simple, individualized modules to elaborate, team-based exercises, a myriad of simulation experiences are potentially available.26,30 An appreciation and understanding of the most common simulation methods as well as their strengths and weaknesses can assist educators in developing a more diverse training portfolio.

The results of our study suggest that high-fidelity simulation, paired with succinct didactics, is a novel and effective method for teaching RCA and systems thinking to residents. Study limitations include a small sample size, a single-site intervention, and the absence of validated and standardized measures; consequently, these results are not necessarily generalizable to other institutions. In addition, we used only one clinical scenario, directly involving a pair of residents from the same specialty, which did not take advantage of the teamwork development, cross-specialty training, and multiple-scenario capabilities of high-fidelity simulation. Our group is in the process of developing more elaborate scenarios that incorporate these features.

The use of simulation in continuous quality improvement offers several advantages over traditional (reactive) or didactic methods. First, it does not rely on accidental harm to patients to create an educational opportunity, thereby enabling residents to experience even clinically rare events (eg, malignant hyperthermia); second, it provides a “safe” environment in which residents can practice responding to critical situations autonomously without potential harm to patients; third, feedback is instantaneous; and fourth, a large number of clinical scenarios can be constructed to meet the educational goals of each session. Beyond these direct advantages, and despite the direct costs of acquiring and operating equipment, there is the potential for overall cost savings through error reduction. However, the most important advantage of high-fidelity simulation is rooted in the experiential learning that it fosters. Experiential learning, popularized by Kolb, is best summarized by a famous dictum attributed to Confucius: “Tell me, and I will forget. Show me, and I may remember. Involve me, and I will understand.” Experiential learning occurs when individuals engage in an activity, reflect upon the activity critically, derive some useful insight from the analysis, and incorporate the result through a change in understanding and/or behavior.31

Unlike manufacturing or aviation, where RCA is used routinely to achieve the highest degree of mechanical precision, medical care depends on human interaction and subjective decision making; highly trained and well-intentioned individuals will make mistakes, but the expertise, compassion, and emotional intelligence they possess are not easily replaced. It is therefore important to ingrain mechanisms that attenuate the impact of errors and to implement continuous quality improvement efforts that persistently refine existing systems. Given the limited opportunities to introduce residents to concepts such as systems improvement and RCA, educational sessions on these topics should maximize their impact through experiential learning (such as high-fidelity simulation) in addition to focused didactics. Moreover, the use of high-fidelity simulation as a complement to lectures on health care management theory represents a unique opportunity to reinforce the core concepts emphasized in the 2011 ACGME Common Program Requirements.23 Although we were able to demonstrate good short-term retention of knowledge and sustained positive attitudes toward RCA and systems thinking, further work must determine whether these effects persist over the long term. Future research should also attempt to prospectively estimate the impact of simulation-based RCA education on overall resident error reduction and on the severity and nature of the medical errors that do occur.

References

1. Iedema R, Flabouris A, Grant S, Jorm C. Narrativizing errors of care: critical incident reporting in clinical practice. Soc Sci Med. 2006;62(1):134–144.
2. Iedema R, Jorm C, Long D, Braithwaite J, Travaglia J, Westbrook M. Turning the medical gaze in upon itself: root cause analysis and the investigation of clinical error. Soc Sci Med. 2006;62(7):1605–1615.
3. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
4. United States Government Accountability Office. VA Patient Safety Program: A Cultural Perspective at Four Medical Facilities. Washington, DC: US Government Accountability Office; 2004.
5. Braithwaite J, Westbrook MT, Mallock NA, Travaglia JF, Iedema RA. Experiences of health professionals who conducted root cause analyses after undergoing a safety improvement programme. Qual Saf Health Care. 2006;15(6):393–399.
6. Heget JR, Bagian JP, Lee CZ, Gosbee JW. John M. Eisenberg Patient Safety Awards. System innovation: Veterans Health Administration National Center for Patient Safety. Jt Comm J Qual Improv. 2002;28(12):660–665.
7. Iedema RA, Jorm C, Braithwaite J, Travaglia J, Lum M. A root cause analysis of clinical error: confronting the disjunction between formal rules and situated clinical activity. Soc Sci Med. 2006;63(5):1201–1212.
8. Tamuz M, Harrison MI. Improving patient safety in hospitals: contributions of high reliability theory and normal accident theory. Health Serv Res. 2006;41(4):1654–1676.
9. Battles JB, Dixon NM, Borotkanics RJ, Rabin-Fastmen B, Kaplan HS. Sensemaking of patient safety risks and hazards. Health Serv Res. 2006;41(4):1555–1575.
10. Wachter RM, Pronovost PJ. The 100,000 Lives Campaign: a scientific and policy review. Jt Comm J Qual Patient Saf. 2006;32(11):621–627.
11. Galvin RS, Delbanco S, Milstein A, Belden G. Has the Leapfrog Group had an impact on the health care market? Health Aff (Millwood). 2005;24(1):228–233.
12. Miller MR, Elixhauser A, Zhan C, Meyer GS. Patient Safety Indicators: using administrative data to identify potential patient safety concerns. Health Serv Res. 2001;36(6):110–132.
13. Committee on Quality Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
14. Engel KG, Rosenthal M, Sutcliffe KM. Residents' responses to medical error: coping, learning, and change. Acad Med. 2006;81(1):86–93.
15. Fischer MA, Mazor KM, Baril J, Alper E, DeMarco D, Pugnaire M. Learning from mistakes: factors that influence how students and residents learn from medical errors. J Gen Intern Med. 2006;21(5):419–423.
16. Wu AW. Do house officers learn from their mistakes? JAMA. 1991;265:2089–2098.
17. Studdert DM, Brennan TA. No-fault compensation for medical injuries: the prospect for error prevention. JAMA. 2001;286(2):217–223.
18. Cox PM Jr, D'Amato S, Tillotson DJ. Reducing medication errors. Am J Med Qual. 2001;16(3):81–86.
19. McNeill PM, Walton M. Medical harm and the consequences of error for doctors. Med J Aust. 2002;176(5):222–225.
20. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21(5):103–111.
21. Bingham JW, Quinn DC, Richardson MG, Miles PV, Gabbe SG. Using a healthcare matrix to assess patient care in terms of aims for improvement and core competencies. Jt Comm J Qual Patient Saf. 2005;31(2):98–105.
22. Aspden P, Corrigan JM, Wolcott J, Erickson SM, eds. Patient Safety: Achieving a New Standard for Care. Washington, DC: National Academy Press; 2003.
23. Accreditation Council for Graduate Medical Education. 2011 Accreditation Council for Graduate Medical Education Common Program Requirements. http://www.acgme.org/acwebsite/home/Common_Program_Requirements_07012011.pdf. Accessed July 5, 2011.
24. Jha AK, Duncan BW, Bates DW. Simulator-based training and patient safety. In: Shojania KG, ed. Making Health Care Safer: A Critical Analysis of Patient Safety Practices. Rockville, MD: Agency for Healthcare Research and Quality; 2001.
25. Shapiro MJ, Morey JC, Small SD, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care. 2004;13(6):417–421.
26. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in health care: how low can you go? Qual Saf Health Care. 2004;13(suppl 1):i51–i56.
27. Kim J, Neilipovitz D, Cardinal P, Chiu M, Clinch J. A pilot study using high-fidelity simulation to formally evaluate performance in the resuscitation of critically ill patients: the University of Ottawa Critical Care Medicine, High-Fidelity Simulation, and Crisis Resource Management I Study. Crit Care Med. 2006;34(8):2167–2174.
28. Blum RH, Raemer DB, Carroll JS, Sunder N, Felstein DM, Cooper JB. Crisis resource management training for an anaesthesia faculty: a new approach to continuing education. Med Educ. 2004;38(1):45–55.
29. Reznek M, Smith-Coggins R, Howard S, et al. Emergency medicine crisis resource management (EMCRM): pilot study of a simulation-based crisis management course for emergency medicine. Acad Emerg Med. 2003;10(4):386–389.
30. Kneebone R, Nestel D, Wetzel C, et al. The human face of simulation: patient-focused simulation training. Acad Med. 2006;81(10):919–924.
31. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall; 1984.
Copyright: Accreditation Council for Graduate Medical Education 2011

Author Notes

Sadeq A. Quraishi, MD, MHA, is Instructor in Anaesthesia, Harvard Medical School, Harvard University, and Assistant in Anesthesia, Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital; Stephen J. Kimatian, MD, is Chair, Department of Pediatric Anesthesiology, and Vice Chair for Education, Anesthesiology Institute, Cleveland Clinic; W. Bosseau Murray, MD, PhD, is Professor, Department of Anesthesiology, and Director of Research, Clinical Simulation Center, Pennsylvania State University College of Medicine; and Elizabeth H. Sinz, MD, is Professor, Departments of Anesthesiology and Neurosurgery, Associate Dean of Clinical Simulation, Clinical Simulation Center, Pennsylvania State University College of Medicine.

Funding: This study was conducted with Pennsylvania State University College of Medicine Department of Anesthesiology Research Funds.

Corresponding author: Sadeq A. Quraishi, MD, MHA, Department of Anesthesia, Critical Care, and Pain Medicine, Massachusetts General Hospital, 55 Fruit Street, GRJ 402, Boston, MA 02114, 617.643.5430, squraishi@partners.org
Received: 19 Nov 2010
Accepted: 19 Jul 2011