Resident and Faculty Perceptions of Program Strengths and Opportunities for Improvement: Comparison of Site Visit Reports and ACGME Resident Survey Data in 5 Surgical Specialties
ABSTRACT
Background
Resident and faculty views of program strengths and opportunities for improvement (OFIs) offer insight into how stakeholders assess key elements of the learning environment.
Objective
This study sought (1) to assess the degree to which residents and faculty in 359 programs in 5 surgical specialties (obstetrics and gynecology, orthopaedic surgery, otolaryngology, plastic surgery, and surgery) were aligned or divergent in their respective views of program strengths and OFIs; and (2) to evaluate whether responses to selected questions on the Accreditation Council for Graduate Medical Education (ACGME) Resident Survey correlated with strengths or OFIs identified by the residents during the site visit.
Methods
Faculty and resident lists of program strengths and OFIs in site visit reports for 2012 and 2013 were aggregated, analyzed, and compared to responses on the Resident Survey.
Results
While there was considerable alignment in resident and faculty perceptions of program strengths and OFIs, some attributes were more important to one or the other group. Collegiality was valued highly by both stakeholder groups. Responses to 2 questions on the ACGME Resident Survey were associated with resident-identified OFIs in site visit reports pertaining to aspects of the didactic program and responsiveness to resident suggestions for improvement.
Conclusions
The findings offer program leadership additional insight into how 2 key stakeholder groups view elements of the learning environment as program strengths or OFIs and may serve as useful focal areas for ongoing improvement activities.
Introduction
Continuous program improvement is an essential component of the self-study process that was introduced by the Accreditation Council for Graduate Medical Education (ACGME) as part of the Next Accreditation System.1 In an ideal context, residents and faculty in a given program should have closely aligned perceptions about program strengths and opportunities for improvement (OFIs), and consensus around high-priority OFIs should inform decisions about the program's short-term and long-term improvement goals. Furthermore, OFIs at the program level should be echoed in residents' perceptions about their learning environment, as reported in the Resident Survey.
This survey is intended by the ACGME as a tool for program improvement at the local level. The survey is also 1 of the screening tools the ACGME uses annually in the new accreditation system to assess trainee satisfaction with various aspects of their program's learning environment.2 To date, no studies have addressed the degree of agreement between residents and faculty about residency programs' strengths and OFIs. Because areas suggesting lower resident satisfaction are discussed with residents, faculty, and program leadership during the site visit, correlating selected responses on the Resident Survey with information gained during site visits provides an opportunity to gain insight into these issues and to validate the responses to the Resident Survey.
The aims of this study were (1) to assess the degree to which residents and faculty were aligned or divergent in their respective views of program strengths and OFIs; (2) to explore whether agreement between residents and faculty is greater for program strengths than for OFIs; and (3) to evaluate whether responses to selected questions on the Resident Survey correlated with strengths or OFIs identified by the residents during the site visit. For the third aim, we analyzed how responses to the Resident Survey question about faculty interest in resident education related to how residents described the quality of didactic sessions, their perception of faculty as teachers, and faculty attendance at conferences, as described by residents during the site visit.
We also analyzed how responses to the Resident Survey question about resident satisfaction with how the program used their evaluations for improvement related to residents asking for more input in program matters as an OFI during the site visit.
Methods
Site visit reports by ACGME field representatives include listings of program strengths and OFIs, as perceived independently by residents and faculty. Resident and faculty lists are developed during the site visit; a resident list is also requested prior to the site visit and is reviewed and augmented during the resident interview. This study used data for 359 programs in 5 surgical specialties (obstetrics and gynecology, orthopaedic surgery, otolaryngology, plastic surgery, and surgery) with site visits during 2012 and 2013.
Site visit reports were grouped by specialty. A unique identification number was assigned to each report and its corresponding Resident Survey. The site visit date, the number of residents in the program, and the resident and faculty lists of program strengths and OFIs were extracted from each report, deidentified, and placed into a file grouped by specialty. The themes in each resident and faculty list were coded, and the total number of strength and OFI themes was recorded. Another article presents an analysis of the program attributes residents identified as important strengths and OFIs.3 For this study, the number of similar themes present in both lists was recorded. Four individuals (the first author, a physician in academic medicine, a chemist, and a lawyer) validated the coding of themes describing program strengths and OFIs in the resident and faculty lists. The study used a criterion of 75% agreement for a theme to be recorded as the same in the residents' and faculty's respective lists.
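To make the agreement criterion concrete, the sketch below shows one way the 75% threshold could be operationalized with 4 coders (ie, at least 3 of 4 must assign the same label). This is a minimal Python illustration with hypothetical theme labels, not the coding procedure or software actually used in the study.

```python
from collections import Counter

# Minimal sketch (hypothetical labels) of the 75% agreement criterion: a theme
# is recorded as "the same" only if at least 75% of coders (3 of 4) assign the
# same label to it.
def consensus_theme(coder_labels, threshold=0.75):
    """Return the majority theme label, or None if agreement is below threshold."""
    label, count = Counter(coder_labels).most_common(1)[0]
    return label if count / len(coder_labels) >= threshold else None

print(consensus_theme(["collegiality", "collegiality", "collegiality", "mentorship"]))  # collegiality
print(consensus_theme(["collegiality", "mentorship", "autonomy", "collegiality"]))      # None (only 50%)
```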
The Resident Survey was matched to the year of the program's site visit and to the program's unique identification number. The responses of residents who participated in the survey were recorded as follows: for the question “Thinking about the faculty and staff in your program, how interested are they in your residency education?” compliance was denoted by the total percentage of respondents who answered “extremely interested” or “very interested”; and for the question “How satisfied are you with the way your program uses the evaluations that residents provide to improve the program?” the total percentage of residents who answered “extremely satisfied” or “very satisfied” was used to indicate compliance.
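A minimal sketch of how such a top-2-category compliance percentage could be computed from per-respondent records; the column name, response labels, and values below are illustrative, not the ACGME's actual survey fields.

```python
import pandas as pd

# Hypothetical per-respondent data for one program (invented for illustration).
responses = pd.DataFrame({
    "faculty_interest": ["extremely interested", "very interested",
                         "somewhat interested", "very interested"],
})

# Compliance = percentage of respondents choosing one of the top 2 categories.
top2 = {"extremely interested", "very interested"}
compliance = responses["faculty_interest"].isin(top2).mean() * 100
print(f"Compliance: {compliance:.0f}%")  # 75%
```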
Institutional Review Board approval for the study was granted by the American Institutes for Research.
Data from the residents' and faculty's lists in site visit reports were aggregated at the program level, and all analyses were conducted using the program as the unit of analysis. Quantitative data were analyzed using SAS Enterprise Guide version 4.3 (SAS Institute Inc, Cary, NC). To assess the degree of agreement between resident and faculty reports of program strengths and OFIs, we used the percentage of programs in which residents and faculty had reported a given program attribute as a strength or as being in need of improvement. We used a logistic regression model to determine whether satisfaction with faculty's interest in education on the Resident Survey was a significant positive predictor of how residents rated didactics. Programs were grouped by specialty and by whether residents made no comment on didactics, rated them as a strength, or rated them as an OFI. Resident comments in site visit reports about the quality of didactics, faculty giving didactics, and faculty attendance at didactics were coded into 3 groups: a strength, improve or increase, or no comment. For analysis, the groups were dichotomized as “improve” (the focal category) versus “other” (no comment or a strength). Mean compliance scores were calculated for faculty interest in resident education on the Resident Survey and then aggregated by specialty.
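The study's analyses were run in SAS Enterprise Guide; purely as an illustration of equivalent logic, the Python sketch below shows the dichotomization and a logistic regression of the kind described above. All variable names and values are invented for the example and do not reproduce the study data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical program-level data: "didactics_code" carries the 3-group coding
# of resident comments on didactics; "faculty_interest_pct" is the program's
# mean compliance score on the faculty-interest survey question.
programs = pd.DataFrame({
    "didactics_code": ["improve", "strength", "no comment", "improve",
                       "strength", "improve", "no comment", "strength"],
    "faculty_interest_pct": [62.0, 88.0, 79.0, 55.0, 91.0, 84.0, 58.0, 73.0],
})

# Dichotomize: "improve" is the focal category; "other" = no comment or a strength.
programs["didactics_ofi"] = (programs["didactics_code"] == "improve").astype(int)

# Logistic regression: is lower satisfaction with faculty interest associated
# with a higher likelihood that residents flag didactics as an OFI?
model = smf.logit("didactics_ofi ~ faculty_interest_pct", data=programs).fit(disp=0)
print(model.params)  # a negative coefficient would indicate the expected direction
```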
Logistic regression was used to determine whether satisfaction on the Resident Survey with the way the program used resident evaluations for improvement was associated with residents reporting their level of input into program improvement as an OFI during the site visit. Programs were grouped by specialty and coded as “1” if residents commented in the site visit report that they wanted more input; otherwise, they were coded as “0.” Mean compliance scores for the survey question were calculated and aggregated by specialty.
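The specialty-level aggregation for the second model follows the same pattern; a brief hypothetical sketch (invented column names and values) of computing mean compliance and the share of programs whose residents asked for more input:

```python
import pandas as pd

# Hypothetical program-level records: "more_input_ofi" is 1 if residents said
# during the site visit that they wanted more input into program matters, else 0;
# "eval_use_pct" is the program's mean compliance on the evaluations question.
programs_df = pd.DataFrame({
    "specialty": ["surgery", "surgery", "plastic surgery", "plastic surgery"],
    "eval_use_pct": [68.0, 74.5, 81.0, 59.5],
    "more_input_ofi": [1, 0, 0, 1],
})

# Mean compliance and the share of programs flagging input as an OFI, by specialty.
summary = programs_df.groupby("specialty").agg(
    mean_compliance=("eval_use_pct", "mean"),
    share_wanting_more_input=("more_input_ofi", "mean"),
)
print(summary)
```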
Results
The study included nearly half of the ACGME-accredited programs in obstetrics and gynecology, otolaryngology, and surgery, and one-fourth of the programs in orthopaedic surgery and plastic surgery (table 1).
Tables 2a–c show the program attributes that were commonly listed in resident and faculty lists of strengths and OFIs across the 5 specialties in the study, as well as the prevalence with which they were mentioned in site visit reports. Prevalence was defined as strong when an attribute was mentioned in more than 50% of programs, moderate when it was mentioned in more than 30% of programs, and modest when it was mentioned in more than 20% of programs. The tables show both similarities and differences in resident and faculty reports of strengths and OFIs. Overall, there was agreement among residents and faculty on several key strengths, including collegiality, operative volume and variety, progressive autonomy, adequate nurse practitioner/physician assistant staffing, and senior residents teaching junior residents in the operating room. Tables 2a–c also show that residents identified more OFIs than faculty, and that across the different programs, the same attributes were identified both as a strength and as an OFI.
The assessment of program attributes against responses to the ACGME Resident Survey showed that for 3 specialties (obstetrics and gynecology, otolaryngology, and surgery), lower resident satisfaction with faculty interest in their education on the Resident Survey was associated with a statistically significantly higher likelihood that residents identified the quality of didactics, the need for more faculty-delivered didactics, or improved faculty attendance at didactics as OFIs in site visit reports (tables 3a–c).
For 3 specialties (obstetrics and gynecology, plastic surgery, and surgery), lower satisfaction on the Resident Survey with the use of resident evaluations for program improvement was associated with a higher likelihood of residents reporting their input into program improvement as an OFI during the site visit (table 4). Programs in orthopaedic surgery and otolaryngology had high satisfaction ratings in this area on the Resident Survey and frequently mentioned responsiveness to resident input as a strength during the site visit.
Discussion
There are both similarities and differences in the attributes of residency programs valued by residents and their faculty members. Program and institutional leaders should be mindful of these different priorities to ensure meeting the needs of both key stakeholder groups and contributing to an optimal learning experience for residents and working conditions for faculty that foster participation in the educational process.
In the new accreditation system, the annual program evaluation and the self-study are intended to facilitate a comprehensive assessment of program strengths and OFIs.4 Successful outcomes of this process will be predicated on a set of improvement priorities agreed on by all stakeholders. Programs in which resident and faculty improvement priorities diverge may benefit from added dialogue among all stakeholders, such as focus groups or a retreat, to reconcile the views of residents and faculty about areas in need of improvement during the annual program evaluation or self-study.
Responses to 2 questions on the Resident Survey were associated with areas in need of improvement highlighted by residents during the site visit. The characteristics of didactics that residents indicated as OFIs included the overall quality of the didactic program, insufficient faculty teaching during formal didactics (as opposed to residents serving as teachers), and low faculty attendance at conferences. These results suggest that residents judge faculty interest in their education by the presence of faculty at required educational sessions. For the annual program evaluation or self-study, a key improvement could entail developing optimal ways to engage faculty in program didactics.
In 3 specialties (obstetrics and gynecology, plastic surgery, and surgery), lower satisfaction on the Resident Survey with how resident evaluations were used for program improvement predicted residents identifying their input into program improvement as an OFI during the site visit. This identifies resident feedback, and the degree to which programs implement resident suggestions (or provide legitimate reasons why this is not feasible or advantageous), as another important area for the annual program evaluation and the self-study.
This study has several limitations, including its retrospective nature and the use of different methods for obtaining program strengths and OFIs from residents (a consensus list prepared before the site visit and augmented during the site visit interview) and from faculty (a list developed during the site visit interview). Also, the analysis focused on just 2 questions from the Resident Survey, and it is not possible to generalize the results to the other questions. Finally, having the lists of program strengths and OFIs prepared for and/or during the accreditation site visit may have introduced response and social desirability bias into the data.
Conclusion
There are both similarities and differences in the attributes of residency programs valued by residents and their faculty members. Lower satisfaction reported in response to questions on the ACGME Resident Survey was associated with resident-identified OFIs in the area of didactic education and the use of resident input to improve the program. Our findings provide program directors with helpful information on stakeholder perceptions worthy of exploration as they embark on the annual program evaluation and the program self-study.
Author Notes
All authors are with the Accreditation Council for Graduate Medical Education. Donna A. Caniano, MD, is Accreditation Field Representative; Kenji Yamazaki, PhD, is Outcome Assessment Project Associate; Nicholas Yaghmour, MPP, is Research Associate for Milestones Evaluation; Ingrid Philibert, PhD, MBA, is Senior Vice-President for Field Activities; and Stanley J. Hamstra, PhD, is Vice President for Milestone Research and Evaluation.
Funding: This study received funding from the Accreditation Council for Graduate Medical Education's Nathan K. Blank Fellowship Program. Dr. Caniano was awarded the Fellowship Grant for 2014.
The authors are grateful to Rebecca S. Miller, MS, Margarita Perez, Gerald Kosicki, PhD, and Li Tang, EdD, MPH, for their assistance in completion of this project. We acknowledge the following individuals who provided coding validation in resident and faculty lists for strengths and opportunities for improvement in site visit reports: Richard A. Flores, Esq, Sarah Kirtland, PhD, and Roberta E. Sonnino, MD.



