A Comparison of Assessment Tools: Is Direct Observation an Improvement Over Objective Structured Clinical Examinations for Communications Skills Evaluation?

Online Publication Date: 01 Apr 2018
Page Range: 219 – 222
DOI: 10.4300/JGME-D-17-00587.1

ABSTRACT

Background 

Evaluation of resident physicians' communications skills is a challenging task and is increasingly accomplished with standardized examinations. There is a need to identify effective, efficient methods for assessing communications skills.

Objective 

We compared objective structured clinical examination (OSCE) and direct observation as approaches for assessing resident communications skills.

Methods 

We conducted a retrospective cohort analysis of orthopaedic surgery resident physicians at a single tertiary care academic institution, using the Institute for Healthcare Communication “4 Es” model for effective communication. Data were collected between 2011 and 2015. Residents were included if they had 1 OSCE assessment and 2 or more complete direct observation assessments; 28 residents met these criteria and were included in the analysis.

Results 

Of a possible 59 residents, 28 (47%) were included. A total of 89% (25 of 28) of residents passed the communications skills OSCE; only 54% (15 of 28) passed the direct observation communications assessment. There was a positive, moderate correlation between OSCE and direct observation scores overall (r = 0.415, P = .028). After adjusting for chance agreement, there was no agreement between OSCE and direct observation in categorizing residents into passing and failing scores (κ = 0.205, P = .16).

Conclusions 

Our results suggest that OSCE and direct observation tools provide different insights into resident communications skills (simulation of rare and challenging situations versus real-life daily encounters), and may provide useful perspectives on resident communications skills in different contexts.

Introduction

Resident education in orthopaedic surgery has shifted from a traditional apprenticeship model to a model that includes competency-based teaching and assessment. The education of orthopaedic surgeons includes teaching and evaluating the proficiency of technical skills, as well as refining and assessing communications skills.1

Assessing competence in nontechnical skills is challenging. Objective structured clinical examinations (OSCEs)2,3 have been utilized to assess residents' orthopaedic surgery skills,4 but to date have not specifically targeted communications skills. A well-designed OSCE can assess communications skills and diagnostic capabilities across postgraduate years.3,5,6 While the OSCE is a useful tool, it is a scheduled event that residents can prepare for by reviewing relevant topics prior to the examination. Thus, the assessment may not reflect resident performance in actual clinical situations. Additionally, simulated cases often focus on rare and challenging communication encounters, and do not represent the range and complexity of patient interactions. Workplace assessments are more likely to indicate functional competence. In the Miller pyramid, direct observation assesses the higher competence level of “does,” whereas the OSCE demonstrates the “shows” level.7

The purpose of this study was to compare the OSCE and direct observation methods in the assessment of orthopaedic residents' communications skills.

Methods

We analyzed data from 2011 through 2015 for resident assessments of communications skills using OSCE and direct observation from 28 residents at a single tertiary care academic institution. The assessments were done as residents progressed to postgraduate year 4 (PGY-4) and PGY-5. The OSCE consisted of 5 stations: (1) disclosing a surgical error; (2) providing sign-out to a physician at risk for depression and alcohol abuse; (3) obtaining informed consent; (4) delivering bad news; and (5) interacting on the telephone with a nurse concerning a preoperative delay in an orthopaedic patient. OSCE cases (role descriptions) and materials (checklists) were developed, and standardized patients and nurses were trained in case portrayal and checklist completion.8 Each OSCE station lasted 10 minutes, with a 5-minute break between stations to allow scoring of the resident using a behaviorally anchored checklist.9 Checklist items assessing communications skills were scored as not done, partially done, or well done (the checklist is provided as online supplemental material). Residents received feedback during a debriefing that occurred after completion of the OSCE.

The results were reported as the percentage well done in the predetermined domains of information gathering (Engagement), relationship development (Empathy), educating and counseling (Education), and closing (Enlistment).8 These domains represent the Institute for Healthcare Communication “4 Es” model for effective communication (box).10 Communication scores for each resident indicate the mean percentage of well done items in each of the 4 domains.
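To make the scoring concrete, the short sketch below computes domain scores from a single checklist in this way, as the percentage of items rated well done per domain. The items, ratings, and counts are hypothetical and do not come from the study instrument.

```python
# A minimal sketch, assuming hypothetical checklist data, of scoring one
# assessment: each domain score is the percentage of items rated "well done."
checklist = {
    "Engagement": ["well done", "well done", "partially done", "not done"],
    "Empathy": ["well done", "partially done", "well done"],
    "Education": ["well done", "well done", "well done", "partially done"],
    "Enlistment": ["not done", "well done"],
}

domain_scores = {
    domain: 100.0 * sum(rating == "well done" for rating in ratings) / len(ratings)
    for domain, ratings in checklist.items()
}
print(domain_scores)  # e.g. {'Engagement': 50.0, 'Empathy': 66.7, ...}
```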

Our institution developed a direct observation program that entailed direct observation of an actual patient encounter with a resident in the outpatient clinic setting.11,12 This offers immediate, constructive feedback because residents are debriefed in real time.13 Faculty observers were trained on observing clinical skills and providing feedback. Residents were observed by faculty during clinical sessions for a complete patient encounter. A checklist, similar to the one used for the OSCEs, was completed by faculty, immediate verbal feedback was given, and a copy of the checklist was provided to the resident.

The OSCE was conducted prior to the direct observation assessment in all cases. Institutional Review Board approval was obtained prior to initiation of this study.

We calculated a mean score for each resident from the multiple direct observations and compared it to the scores from the OSCE. We calculated descriptive statistics and compared residents' direct observation and OSCE scores for the 4 communication domains. We used Cohen's κ to assess agreement between the 2 methods, and we performed correlation tests to explore the relationship between the 2 assessments.14 Statistical analysis was performed using PASW Statistics version 20.0 (IBM Corp, Armonk, NY), with statistical significance set at P < .05.
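The analysis itself was run in PASW/SPSS; as a rough illustration of the same 2 computations, the Python sketch below derives a Pearson correlation from continuous scores and Cohen's κ from dichotomized pass/fail categories. The score arrays and the passing threshold are hypothetical, not the study data.

```python
# A minimal sketch of the correlation and agreement analyses; the scores and
# passing threshold below are hypothetical, not drawn from the study.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-resident overall scores (% of items rated "well done").
osce = np.array([82.0, 75.0, 68.0, 90.0, 61.0, 64.0, 88.0, 79.0])
direct_obs = np.array([70.0, 66.0, 60.0, 85.0, 58.0, 55.0, 80.0, 72.0])

PASS_THRESHOLD = 65.0  # assumed cutoff, for illustration only

# Pearson correlation between the 2 continuous score sets.
r, p_value = pearsonr(osce, direct_obs)
print(f"r = {r:.3f}, P = {p_value:.3f}")

# Cohen's kappa on pass/fail categories, which corrects the observed
# agreement for the agreement expected by chance alone.
kappa = cohen_kappa_score(osce >= PASS_THRESHOLD, direct_obs >= PASS_THRESHOLD)
print(f"kappa = {kappa:.3f}")
```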

Results

A total of 28 of a possible 59 residents (47%) had 1 OSCE and 2 or more completed direct observation scores, and were included in the analysis (Table 1). A total of 89% (25 of 28) of the residents passed the OSCE; only 54% (15 of 28) passed the direct observation assessment. The figure shows the proportion of residents passing by communication subdomains, with scores for the direct observation assessment lower than those for the OSCE for all domains except Enlistment. Despite lower scores for direct observation, the majority of residents achieved acceptable scores for Engagement, Empathy, and Education on both assessment formats. For the Enlistment domain, the percentage with a passing score was slightly more than one-half using direct observation, compared with a quarter of the residents on the OSCE.

Table 1. Resident Cohort Demographics

Figure. Comparison of Resident Communication Assessments by Domain


There was a positive, moderate correlation between OSCE and direct observation scores overall (r = 0.415, P = .028), yet there was no agreement between overall OSCE and direct observation scores on categorizing residents into passing and failing scores (κ = 0.205, P = .16). Table 2 shows a moderate level of agreement between the 2 assessments on the Enlistment subdomain.

Table 2. Agreement Between Objective Structured Clinical Examination and Direct Observation Assessments

Discussion

Our comparison of OSCE and direct observation in assessing communications skills in orthopaedic residents found that the 2 approaches may yield different insights into these skills. Using direct observation, the same residents' scores were consistently lower for each domain assessed, except for Enlistment. The higher score in this domain may represent a strength of the direct observation assessment in real patient encounters, where residents can more fully engage patients in shared decision-making, an integral aspect of the physician-patient relationship.

Because miscommunication or inadequate communication of information to patients can result in adverse events in surgical specialties,15 it is important to assess and address communication deficiencies early to ensure they are corrected.

While direct observation showed lower performance for engagement and empathy skills than the OSCE assessment, the majority of residents had passing scores with both methods. Physician empathy has been shown to increase trust in the clinician and improve adherence, and it is associated with improved biologic indicators of disease.16

A study comparing assessments in family medicine residents concluded there is little correlation between evaluation methods.17 Our study shows similar findings in a larger cohort, providing justification for the use of more than 1 complementary evaluation tool to assess resident communications skills. Many residents have experience taking OSCEs during medical school and are conscious of the objectives of the OSCE and the performance expected of them, but they may not translate this awareness and these skills to actual patient encounters. Direct observation provides an opportunity for immediate, structured feedback that elucidates the communications skills needing improvement.

For the senior (PGY-4 and PGY-5) residents in our study, the percentage with “passing” direct observation scores was lower than that for the OSCE. This suggests that direct observation assessments provide a unique insight into resident communication in an actual clinical setting, compared with the highly coordinated and scripted process of the OSCE.

Limitations of this study include the potential for differences in evaluation between standardized patients and faculty, as well as from assessment to assessment. Our OSCE scenarios focused predominantly on rare and challenging communication events, while direct observation captured widely varying interactions with patients. Finally, residents may have been more conscious of the performance expectations of the OSCE scenarios than of their observed actual clinical encounters. The study was conducted at a single site with 28 residents, limiting generalizability.

Future research into this subject is underway at our institution: direct observations of junior residents are being tracked over time for changes with experience and feedback. A larger, multicenter study would yield more generalizable conclusions about the insights afforded by these assessments.

Conclusion

Our study shows that OSCE and direct observation tools may provide different insights into residents' communications skills (simulation of rare and challenging situations versus real-life daily encounters). Both tools offer feedback that allows areas for improvement in resident communications skills to be identified and addressed.

References

1. Ranawat AS, Dirschl DR, Wallach CJ, et al. Symposium. Potential strategies for improving orthopaedic education. Strategic dialogue from the AOA Resident Leadership Forum Class of 2005. J Bone Joint Surg Am. 2007;89(7):1633-1640.
2. Harden RM, Stevenson M, Downie WW, et al. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447-451.
3. Phillips D, Zuckerman JD, Strauss EJ, et al. Objective structured clinical examinations: a guide to development and implementation in orthopaedic residency. J Am Acad Orthop Surg. 2013;21(10):592-600.
4. Griesser MJ, Beran MC, Flanigan DC, et al. Implementation of an objective structured clinical exam (OSCE) into orthopedic surgery residency training. J Surg Educ. 2012;69(2):180-189.
5. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219-222.
6. Dwyer T, Glover Takahashi S, Kennedy Hynes M, et al. How to assess communication, professionalism, collaboration and the other intrinsic CanMEDS roles in orthopedic residents: use of an objective structured clinical examination (OSCE). Can J Surg. 2014;57(4):230-236.
7. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63-S67.
8. Zabar S, Kachur EK, Kalet A, Hanley K, eds. Objective Structured Clinical Examinations. New York, NY: Springer Science+Business Media; 2013.
9. Chander B, Kule R, Baiocco P, et al. Teaching the competencies: using objective structured clinical encounters for gastroenterology fellows. Clin Gastroenterol Hepatol. 2009;7(5):509-514.
10. Boyle D, Dwinnell B, Platt F. Invite, listen, and summarize: a patient-centered communication technique. Acad Med. 2005;80(1):29-32.
11. Hamburger EK, Cuzzi S, Coddington DA, et al. Observation of resident clinical skills: outcomes of a program of direct observation in the continuity clinic setting. Acad Pediatr. 2011;11(5):394-402.
12. Phillips DP, Zuckerman JD, Kalet A, et al. Direct observation: assessing orthopaedic trainee competence in the ambulatory setting. J Am Acad Orthop Surg. 2016;24(9):591-599.
13. Duffy FD, Gordon GH, Whelan G, et al. Assessing competence in communication and interpersonal skills: the Kalamazoo II report. Acad Med. 2004;79(6):495-507.
14. Hallgren KA. Computing inter-rater reliability for observational data: an overview and tutorial. Tutor Quant Methods Psychol. 2012;8(1):23-34.
15. Nagpal K, Vats A, Ahmed K, et al. A systematic quantitative assessment of risks associated with poor communication in surgical care. Arch Surg. 2010;145(6):582-588.
16. Hojat M. Empathy in Health Professions Education and Patient Care. New York, NY: Springer International Publishing; 2016.
17. Nuovo J, Bertakis KD, Azari R. Assessing residents' knowledge and communication skills using four different evaluation tools. Med Educ. 2006;40(7):630-636.
Copyright: Accreditation Council for Graduate Medical Education 2018


Author Notes

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

Editor's Note: The online version of this article contains the direct observation and objective structured clinical examination checklist.

Corresponding author: Donna Phillips, MD, New York University Hospital for Joint Diseases, 462 First Avenue, New York, NY 10016, 212.263.2611, donna.phillips@nyumc.org
Received: 15 Aug 2017
Accepted: 05 Dec 2017