Financing Residency Training Redesign
Abstract
Background
Redesign in the health care delivery system creates a need to reorganize resident education. How residency programs fund these redesign efforts is not known.
Methods
Family medicine residency program directors participating in the Preparing Personal Physicians for Practice (P4) project were surveyed between 2006 and 2011 on revenues and expenses associated with training redesign.
Results
A total of 6 university-based programs in the study collectively received $5,240,516 over the entire study period, compared with $4,718,943 received by 8 community-based programs. Most of the funding for both settings came from grants, which accounted for 57.8% and 86.9% of funding in the two settings, respectively. Department revenue represented 3.4% of university-based support and 13.1% of community-based support. The total average revenue (all years combined) per program was just under $875,000 for university-based programs and nearly $590,000 for community-based programs. The vast majority of funds were dedicated to salary support (64.8% in university settings versus 79.3% in community-based settings). Based on the ratio of new funding to the estimated annual cost of training, using national data for a 3-year program with 7 residents per year, training redesign added approximately 3% to the budgets of university-based programs and approximately 2% to the budgets of community-based programs.
Conclusions
Residencies undergoing training redesign used a variety of approaches to fund these changes. The costs of innovations marginally increased the estimated costs of training. Federal and local funding sources were most common, and costs were primarily salary related. More research is needed on the costs of transforming residency training.
Editor's Note: The online version of this article contains a summary of innovations implemented in Preparing the Personal Physician for Practice (P4) programs.
Introduction
Changes in health care delivery models are resulting in a need for innovation in graduate medical education. Finding the funds to implement and sustain innovative curricula represents a major challenge for university- and community-based residency programs. Many innovations in care delivery for primary care residencies relate to implementing the patient-centered medical home (PCMH) model.1–3 To learn how to practice in a PCMH, residents need to experience this model during training.4,5 Little information exists on how residency programs fund their educational innovations, such as training in a PCMH. This information could help program directors and leaders in graduate medical education in redesign efforts.
As part of the family medicine project, Preparing Personal Physicians for Practice (P4), we analyzed the experiences of 14 family medicine residency training programs in financing educational redesign. We present cost estimates for university- and community-based family medicine residencies during the 5-year period of the project, including the amount of revenue obtained to support innovations, revenue sources, and how funds were allocated. We hypothesized that the funding sources and revenues would differ for university- versus community-based programs, as funding for program operations differs between the 2 settings.
Methods
Study Setting
The P4 project is a comparative case study of innovative changes in the length, structure, location, and content of training in 14 family medicine residency programs between academic years 2006–2007 and 2010–2011; the project has been described in detail elsewhere.3,6 Programs were selected by a review committee of educational experts based on applications submitted by 40 programs in good standing with the Family Medicine Review Committee. The 14 programs selected for the study received no funding to implement their curricular redesigns, although travel funding was provided for collaborative meetings held every 18 months.
Data Collection
Data for this study were collected using an annual survey, which had undergone extensive pilot testing. The survey asked respondents to estimate the revenues and expenses related to their residency redesigns during the study period. Careful data checking via telephone interviews was conducted to separate innovation expenses from other residency activities. A consensus process was used to map final revenue and expense variables into categories for analysis. These variables included funding received from grants, legislative initiatives, department revenue, and charitable contributions. Specific expense variables included faculty, staff, and evaluator full-time equivalents; funds attributed to redesign effort meetings and retreats; programming support; other contract services; computer hardware and software; and tuition and books.
The Oregon Health & Science University Institutional Review Board granted a waiver (IRB No. 3788) for this project.
Residency program categories were based on self-assigned residency codes used by the American Academy of Family Physicians: (1) community-based, unaffiliated with a university; (2) community-based, university-administered; (3) community-based, university-affiliated; and (4) university-based. None of the P4 sites fit into the community-based unaffiliated with a university category. To attain enough power to test our hypotheses regarding level of affiliation with universities, we collapsed categories 2 and 4 above into a group representing university-based and administered sites. We then considered community-based, university-affiliated to be community-based programs because their association is more distant than those programs either operated or administered by universities. Hereafter, we refer to the 2 groups as university-based and community-based programs.
To estimate the costs of training residents, which would allow us to place the finances of changing training in context, we obtained data from the National Institute for Program Director Development (NIPDD). Briefly, NIPDD is a fellowship for residency program directors established in 1994 and currently operated by the Association of Family Medicine Residency Directors.7 The NIPDD fellows build a financial pro forma of their programs' global revenues and expenses (fully allocated costs, ie, the sum of direct and allocated costs, with no joint or common costs left unassigned) in accordance with published fiscal modeling of residency costs.8 We used the mean per resident fully allocated costs from the 2009 through 2012 NIPDD financial pro formas to estimate the costs of training residents in programs with 3-year structures of 6 (6-6-6), 7 (7-7-7), and 8 (8-8-8) residents per year. These estimates represented approximately 135 family medicine residencies.
Data Analysis
Descriptive statistics were used to characterize revenues and expenses. All financial amounts were rounded to the nearest dollar. Responses of “not applicable” were excluded from some results, as described in the captions of select tables. Comparisons of university-based versus community-based programs were based on total award amounts for the entire 5-year study period using independent sample t tests. All tests were 2-tailed, with alpha levels to determine statistical significance set at P ≤ .05.
Results
Six of the enrolled programs were university based and 8 were community based (table 1). The number of continuity clinics associated with university-based programs was 11 at the end of the study period, and 17 for community-based programs. Most residencies were 3-year programs, with the number of residents per year ranging from 1 small program (4-4-4) to 1 very large program (23-23-23), with a mode of 7-7-7.
Innovations implemented in P4 programs are summarized in the online supplemental material. Many involved implementing PCMH features, such as team-based and patient-centered care. Other innovations included changing the length of training, changing training sites, resequencing curriculum, or increasing clinic hours. Between 67% and 100% of innovations were either fully or partially implemented.
The mean per resident cost, derived from NIPDD data from 2009 through 2012, was $280,680 (range, $262,884–$288,073; data not shown). The estimated annual cost of training was $5,052,240 for programs with a 6-6-6 structure, $5,894,280 for a 7-7-7 structure, and $6,736,320 for an 8-8-8 structure (data not shown).
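These annual estimates follow directly from multiplying the mean per resident cost by total program size (residents per year × 3 years). The short Python sketch below is purely illustrative and reproduces that arithmetic from the figures reported above; it is not part of the study's methods.

```python
# Illustrative check of the NIPDD-based annual training cost estimates.
# The mean per-resident fully allocated cost is taken from the text.
MEAN_COST_PER_RESIDENT = 280_680  # dollars per resident per year (2009-2012 NIPDD mean)

def annual_program_cost(residents_per_year: int, program_years: int = 3) -> int:
    """Estimated annual cost for a 3-year program with the given class size."""
    total_residents = residents_per_year * program_years
    return MEAN_COST_PER_RESIDENT * total_residents

for size in (6, 7, 8):
    print(f"{size}-{size}-{size} structure: ${annual_program_cost(size):,}")
```

Running this yields the three estimates reported in the text ($5,052,240, $5,894,280, and $6,736,320).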
Two of the 14 P4 programs (14.3%), both community based, received no funding from any source to support their redesign efforts. University-based programs received just over $5.2 million during the entire study period, compared with $4.7 million received by the 6 remaining community-based programs (table 2). Most of the funding came from grants in both university-based and community-based programs. Although all university-based programs were awarded grant funding, only 5 of 8 community-based programs (62.5%) received grant awards. University-based programs received just over $2 million in nongrant support, which included capture of grants' indirect costs and state education commission funding (table 2). Community-based programs received no support in this category.
Department-obtained revenue represented 3.4% of university-based support ($180,625) and 13.1% of community-based support ($615,856). Five programs received departmental support: 2 in university settings and 3 in community settings. Departmental support increased in both settings during the course of the project (table 2).
University-based programs received 17 grants totaling $3,032,457 to fund their innovations (table 3), an average of 2.8 grants per program during the project period. Community-based programs received 28 grants totaling $4,101,877, with each funded program receiving an average of 5.6 grants, or about $820,375 per program. The Health Resources and Services Administration (HRSA) awarded most of the grant funding received by university-based programs (99%), whereas local foundations provided 65.1% of the grant funding received by community-based sites.
Funding allocations for innovations, reported as percentage of total revenues, are shown in table 4. Most of the funds were used for salary support (64.8% in university-based settings versus 79.3% in community-based settings). Salary support in community-based settings was used to pay for faculty (between 0.2 and 2.2 full-time equivalents per funded project year), whereas in university-based programs it was devoted to evaluation, and staff and faculty support (between 0.25 and 0.55 full-time equivalents per funded project year). Most of the nonsalary funds supported resident stipends or other resident costs, computer programming, hardware and software, clinic-related fees, meetings, retreats, tuition, books, and other contract services (table 4).
Program-based average revenues received for innovations were $873,419 for university-based programs and $589,868 for community-based programs (table 5). The mean per program per year revenue was $174,684 for university-based residencies and $117,974 for community-based residencies. Relative to the NIPDD-estimated annual cost of training residents in an average 7-7-7 program, the new funding acquired for innovation represented a net additional cost of 3% of university-based program budgets and 2% of community-based program budgets (table 5).
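The 3% and 2% figures can be checked by dividing the mean per program per year innovation revenue by the estimated annual cost of a 7-7-7 program ($5,894,280, from the NIPDD data above). The snippet below is only a worked verification of that arithmetic, using the values reported in the text.

```python
# Worked check of the innovation-funding-to-training-cost ratios (table 5).
ANNUAL_COST_7_7_7 = 5_894_280  # estimated annual cost, 7-7-7 program (NIPDD mean)

mean_annual_revenue = {
    "university-based": 174_684,  # mean per program per year innovation revenue
    "community-based": 117_974,
}

for setting, revenue in mean_annual_revenue.items():
    share = revenue / ANNUAL_COST_7_7_7
    print(f"{setting}: {share:.1%} of estimated annual training cost")
```

The shares come to roughly 3.0% and 2.0%, matching the rounded figures reported above.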
Discussion
This study is the first, to our knowledge, to characterize how residencies funded meaningful changes in the content, structure, length, and location of training. We collected longitudinal data on a cohort of family medicine residencies that were active participants in a study of educational redesign. All programs successfully implemented innovations, and most were able to obtain funding to support these efforts. The grant funding source for university-based programs was primarily HRSA, and local foundations typically supported efforts at community-based programs. These differences may be related to the inability of community-based programs to meet HRSA's submission/tracking requirements and funding preferences and priorities. Alternatively, community-based programs may have better relationships with local foundations than university-based programs, because universities often restrict individual departments and residencies from approaching foundations so as not to compete with institutional priorities for fundraising.
Departmental support was relatively small, especially in university-based programs, although it did increase during the course of the project. It may be that programs needed increased resources as the project progressed to ensure that implementation and evaluation activities could be completed to meet funding reporting requirements. It is unclear whether department support flowed from clinical revenues or other institutional subsidies. Other nongrant support (eg, institutional return of indirect costs and state-based education dollars) appeared to be a significant source of funding for university-based program support. These resources vary by institution and state, but it appears that programs actively sought new funds or allocated funds already received toward their redesign efforts.
Most newly obtained funds for residency redesign were allocated to personnel. This makes sense, as the extent of planning, development, and execution was likely significant. Personnel costs represented approximately 60% of the total costs of training (personal communication, August 30, 2013), which further underscores the need for human resources to make significant educational redesign change. Community-based programs allocated nearly 4 times the full-time equivalent expense for evaluation compared with university-based programs, which is likely due to a gap in local or institutional resources for evaluation. As noted in a recent editorial by Gill and Bagley,9 clinical practice transformation toward patient-centeredness costs money, with 1 estimate being $117,000 per physician per year. The ultimate savings attributable to the PCMH model accrue for patients, payers, and the health system,10–12 and investment in practice and residency transformation is needed to better prepare the future workforce.
Two programs, both community-based, received no additional funding to support their innovations, yet both successfully implemented redesign innovations. We learned that one program was able to generate additional clinical revenue to support educational innovations by successfully aligning PCMH innovations with activities that enhanced revenue, including maximizing payer incentives, improving billing and coding, and increasing patient visits by more efficient senior residents. The other program that received no funding underwent a change in leadership that prevented us from determining how it was able to implement its innovation without supplemental funding.
Limitations of this study include the high probability of some measurement error and misclassification bias in both the revenue and expense categories. Our study was based on a survey implemented annually with intensive data checking and cleaning. We did not collect receipts, invoices, or other billing or revenue records.
Accurate accounting of the revenues that flow to residencies is difficult because of variable program structures, graduate medical education funding flowing to hospitals rather than ambulatory training sites, and the comingling of clinical and educational revenues. How funds are allocated in residency programs is also difficult to analyze across programs because of individualized relationships with sponsoring institutions, and varying methods of accounting for operational and support expenses, such as human resources, information technology, and billing costs.13
Determining how residency education is financed deserves more in-depth study if we are to plan for future redesign efforts. This will require improved precision in clinical and educational revenues and expenses as well as standard measures to ensure accurate comparisons can be made of financial performance of residencies and teaching clinics.14 In addition, more research is needed on how training costs relate to educational outcomes, such as workforce adequacy and sustainability, scope of practice, and competency development and maintenance.
Conclusion
Residencies undergoing training redesign used a variety of approaches to fund their innovative changes. Federal and local funding sources were most common. Primary expenses were salary related in both training settings. Further research is needed to fully understand the costs of redesigning training to adequately prepare our future workforce.
We found programs were persistent in their pursuit of grant funding and achieved success; other programs might consider these sources to support their own redesign efforts.
Author Notes
Patricia A. Carney, PhD, is Professor of Family Medicine and of Public Health and Preventive Medicine, Oregon Health & Science University; Elaine Waller, BA, is Research Associate, Department of Family Medicine, Oregon Health & Science University; Larry A. Green, MD, is Professor of Family Medicine, University of Colorado; Steven Crane, MD, is Director of Practice Innovation, Mountain Area Health Education Center, Hendersonville Family Health Center; Roger D. Garvin, MD, is Assistant Professor of Family Medicine, Oregon Health & Science University; Perry A. Pugno, MD, MPH, is Vice President for Education, American Academy of Family Physicians; Stanley M. Kozakowski, MD, is Director, Medical Education Division, American Academy of Family Physicians; Alan B. Douglass, MD, is Director, Middlesex Hospital Family Medicine Residency Program, and Professor of Family Medicine, University of Connecticut; Samuel Jones, MD, is Program Director, Virginia Commonwealth University-Fairfax Residency Program; and M. Patrice Eiff, MD, is Professor and Vice Chair, Department of Family Medicine, Oregon Health & Science University.
Funding: This work was supported by the P4 Project, which is jointly sponsored by the American Board of Family Medicine Foundation, the Association of Family Medicine Residency Directors, and the Family Medicine Research Program at Oregon Health & Science University.
Conflict of interest: The authors declare they have no competing interests.



