Measuring Progressive Independence With the Resident Supervision Index: Empirical Approach

T. Michael Kashner, PhD, JD, John M. Byrne, DO, Barbara K. Chang, MD, MA, Steven S. Henley, MS, Richard M. Golden, PhD, David C. Aron, MD, MS, Grant W. Cannon, MD, Stuart C. Gilman, MD, MPH, Gloria J. Holland, PhD, Catherine P. Kaminetzky, MD, MPH, Sheri A. Keitz, MD, PhD, Elaine A. Muchmore, MD, Tetyana K. Kashner, MD, and Annie B. Wicker, BS
Online Publication Date: 01 Mar 2010
Page Range: 17–30
DOI: 10.4300/1949-8357-2.1.17

Abstract

Background

A Resident Supervision Index (RSI) developed by our research team quantifies the intensity of resident supervision in graduate medical education, with the goal of testing for progressive independence. The 4-part RSI method includes a survey instrument for staff and residents (RSI Inventory), a strategy to score survey responses, a theoretical framework (patient-centered optimal supervision), and a statistical model that accounts for the presence or absence of supervision and the intensity of patient care.

Methods

The RSI Inventory data came from 140 outpatient encounters involving 57 residents and 37 attending physicians during a 3-month period at a Department of Veterans Affairs outpatient clinic. Responses are scored to quantitatively measure the intensity of resident supervision across 10 levels spanning patient services (staff is absent, is present, participated, or provided care with or without a resident), case discussion (resident-staff interaction), and oversight (staff reviewed case, reviewed medical chart, consulted with staff, or assessed patient). Scores are analyzed by level and for patient care using a 2-part model (supervision initiated [yes or no] versus intensity once supervision was initiated).

Results

All resident encounters had patient care supervision, resident oversight, or both. Consistent with the progressive independence hypothesis, residents were 1.72 (P  =  .019) times more likely to be fully responsible for patient care with each additional postgraduate year. Decreasing case complexity, increasing clinic workload, and advanced nonmedical degrees among attending staff were negatively associated with supervision intensity, although associations varied by supervision level.

Conclusions

These data are consistent with the progressive independence hypothesis in graduate medical education and offer empirical support for the 4-part RSI method to quantify the intensity of resident supervision for research, program evaluation, and resident assessment purposes. Before informing policy, however, more scientific research in actual teaching settings is needed to better understand the relationships among patient outcomes, clinic workload, case complexity, and graduate medical education experience in resident supervision and professional development.

Background

The concept of graded responsibility for care and progressive independence from supervision has long been a model for graduate medical education (GME)1–3 and has been incorporated into accreditation standards, policy statements, and supervision requirements.4–7 However, no study to date has attempted to quantitatively estimate progressive independence in actual clinical settings as residents are promoted from one postgraduate year to the next.

The 4-part Resident Supervision Index (RSI) was developed to quantitatively measure and assess the intensity of resident supervision, which can be used to test for progressive independence. A prior article has described the feasibility and psychometric reliability of the RSI Inventory as a survey instrument to collect supervision information from attending staff and residents (appendix 1),8 and in the companion article published in this issue of Journal of Graduate Medical Education, we describe the theory of patient-centered optimal supervision and derive the 2-part analytic models designed to test theory-driven hypotheses.9 In this article, we introduce the final part of the 4-part RSI method by describing how RSI Inventory responses are scored to compute supervision intensity.

To measure the intensity of resident supervision in patient care and to test for progressive independence, scores are computed for outpatient encounters with internal medicine residents who rotated through a Department of Veterans Affairs (VA) clinic in 2008. The RSI method is evaluated by testing 3 RSI theory-driven hypotheses, derived elsewhere,9 in which the intensity of resident supervision is expected to decrease (1) for residents with longer lengths of GME training (progressive independence), (2) in clinics with more workload (workload effect), and (3) with patients who present with less complex medical problems (complexity effect).

Resident Supervision Index

Encounter

To score RSI data, clinical activities in teaching clinics are divided into supervision encounters containing interactions among a resident, an attending physician, and the services they provide to a given patient. Services can be defined as narrowly as a clinical procedure or as broadly as an acute episode of care. For this study, encounters are defined by outpatient visit.

Levels and Phases

As listed in table 1, each encounter can be segmented into 3 clinical phases ordered sequentially over time, beginning with resident oversight, when attending physicians gather information to assess patient progress, monitor resident performance, and evaluate clinical care. Resident oversight informs the second phase, care discussions, when the attending physician interacts with the resident to discuss the patient's case, which in turn informs the third phase, patient services, when the attending physician and resident perform medical procedures. Care discussions and patient services combine to form patient care, distinct from oversight. Phases may run intermittently during an encounter as residents and attending physicians go back and forth between oversight, discussions, and providing services.

Table 1 Supervision Levels by Phase, With Participants and Resident Supervision Index (RSI) Data Sources
Table 1

Phases are further segmented into levels representing degrees of supervision intensity. During the resident oversight phase, attending physicians collect information by assessing the patient in separate examinations, by consulting with clinical staff, by reviewing the medical chart, and by asking residents to give case presentations. During the care discussions phase, attending physicians interact with residents to change care, order tests, or direct services. During the patient services phase, attending physicians may separately provide care when residents are physically absent or provide care when residents are present to observe. When residents are providing care, the attending physician may either participate in care, observe care, or be absent from the room but otherwise available in the clinic or on call.

In addition to oversight, attending physicians may become informed when engaged in the patient's care or during care discussions with residents. An activity is classified as oversight, however, whenever its sole purpose is to gather information about the case. Encounter minutes that can be classified simultaneously into 2 phases are to be classified by the later phase. For example, time the attending physician spends simultaneously collecting information (resident oversight phase) and directing care (care discussions phase) would be classified as care discussions.

Scores

Based on the theory of patient-centered optimal supervision,9 intensity scores are computed for each of the 10 levels listed in table 1, for the 3 phases, for patient care, and for the encounter. Case examples are given in appendix 2.

Residents supervised at the least intensive level (staff absent from the room [level 3.5]) are said to have “no direct supervision” during that encounter moment. The score is measured in minutes and is represented symbolically by [RSI3.5]. Scores for the remaining 9 “directly supervised” levels ([RSI1.1], [RSI1.2],…[RSI3.4]) are measured as time proportions, with the numerator equal to the time at the given level and with the denominator equal to the sum of time over all levels of equal or lesser intensity plus the time at all later phases. Scores for directly supervised levels range between 0 and 1, with higher scores indicating more intensive supervision. Scores are calculated so that the time when the resident was supervised at a given level and phase is compared only with time during the encounter when the resident was at an equal or lesser level of supervision intensity, or at a later phase. Thus, higher scores are associated with more minutes in the given level, fewer minutes in levels of lesser intensity during the same phase, and fewer minutes in later phases. Scores are computed to weigh each moment at a given level against the rest of the time during the encounter when the resident experienced less, not more, intensive supervision.
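To make the level-score arithmetic concrete, the following is a minimal sketch (in Python; not part of the RSI method itself) of the computation just described, checked against case 1 of appendix 2. It assumes, consistent with the appendix 2 examples, that within a phase a higher level number denotes less intensive supervision and that level 3.5 (staff absent) is the least intensive level.

```python
# Minimal sketch of the RSI level-score computation described above.
# Assumption: minutes are recorded per supervision level; within a phase a
# higher level number is treated as less intensive (as in the appendix 2
# examples), and 3.5 (staff absent) is kept in raw minutes.

def level_scores(minutes):
    """minutes: dict mapping level (e.g., 3.2) to minutes spent at that level."""
    scores = {}
    for level, t in minutes.items():
        if level == 3.5:
            scores[level] = t        # "no direct supervision": raw minutes, not a proportion
            continue
        phase = int(level)
        denom = sum(
            m for lvl, m in minutes.items()
            if int(lvl) > phase                      # time at all later phases
            or (int(lvl) == phase and lvl >= level)  # same phase, equal or lesser intensity
        )
        scores[level] = t / denom if denom else 0.0
    return scores

# Case 1 from appendix 2: 15 min absent (3.5), 30 min staff-provided care (3.2),
# 9 min care discussion (2.1), 6 min case presentation (1.4), 3 min chart review (1.3)
print(level_scores({3.5: 15, 3.2: 30, 2.1: 9, 1.4: 6, 1.3: 3}))
# rounded: 3.5 -> 15 min, 3.2 -> 0.67, 2.1 -> 0.17, 1.4 -> 0.10, 1.3 -> 0.05
```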

Summary scores can be computed by phase, for patient care, and for the encounter. Formulas and definitions are given in table 2. The summary score for the encounter [RSIenc] equals the proportion of total encounter time when the resident was directly supervised. We define responsibility for care as 1 minus the intensity of resident supervision. As intensity of supervision decreases from 1 to 0, the intensity of assigned responsibility for patient care increases from 0 to 1. If supervision intensity equals the proportion of encounter time when the resident was directly supervised, then responsibility represents the proportion of encounter time when the resident provided care with staff absent from the room. Our theoretical framework is based on the assumption that attending physicians are fully informed about the case (oversight) when supervising residents for care discussions and patient services.9 We thus compute patient care responsibility [RSIresp] after staff oversight as follows:

  1. RSIresp  =  1 − (RSIenc | RSIover  =  0)

  2.  =  1 − {1 − [(1 − RSIserv) × (1 − RSIdisc) × (1 − {RSIover  =  0})]}

  3.  =  [(1 − RSIserv) × (1 − RSIdisc)]

  4.  =  1 − RSIcare

Table 2 Definitions and Computation Formulas for Phase and Encounter Resident Supervision Index (RSI) Summary Scores
Table 2

During an encounter, residents are said to be:

  1. directly supervised at level l if the resident is supervised at level l during any encounter moment (RSIl > 0), or autonomous from supervision at level l otherwise (RSIl  =  0);

  2. directly supervised for the encounter whenever the resident is supervised during any encounter moment at any of the 9 directly supervised levels (RSIenc > 0);

  3. autonomously providing care (RSIcare  =  0), or equivalently fully responsible for care (RSIresp  =  1), whenever attending staff did not hold care discussions and were absent from the room throughout resident-provided patient services; and

  4. unattended if the resident was autonomously providing care and was without resident oversight (RSIenc  =  0).

Fully responsible residents (RSIcare  =  0) are not unattended (RSIenc > 0) if they receive oversight (RSIover > 0).
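As a worked illustration, the short sketch below computes the phase, patient care, encounter, and responsibility summary scores from the case 1 level scores and then derives the classification flags just defined. It assumes the multiplicative combination used in equations 1 through 4 and in the appendix 2 case examples (a phase or summary score equals 1 minus the product of 1 minus its component scores); the exact table 2 formulas are not reproduced here.

```python
# Sketch of phase, patient-care, and encounter summary scores, assuming the
# multiplicative combination implied by equations 1-4 and appendix 2, case 1.

def combine(scores):
    """Combine level (or phase) scores multiplicatively: 1 - prod(1 - s)."""
    prod = 1.0
    for s in scores:
        prod *= (1.0 - s)
    return 1.0 - prod

# Case 1 level scores for the directly supervised levels (level 3.5, staff
# absent, is excluded because it is not direct supervision).
oversight   = [0.10, 0.05]   # levels 1.4 (case presentation), 1.3 (chart review)
discussions = [0.17]         # level 2.1
services    = [0.67]         # level 3.2

rsi_over = combine(oversight)                        # oversight phase summary
rsi_disc = combine(discussions)                      # care discussions phase summary
rsi_serv = combine(services)                         # patient services phase summary
rsi_care = combine([rsi_disc, rsi_serv])             # patient care = discussions + services
rsi_enc  = combine([rsi_over, rsi_disc, rsi_serv])   # whole encounter
rsi_resp = 1.0 - rsi_care                            # responsibility after staff oversight

print(round(rsi_care, 2), round(rsi_enc, 2), round(rsi_resp, 2))   # 0.73 0.77 0.27

# Classification flags from the definitions above
directly_supervised = rsi_enc > 0
fully_responsible   = rsi_care == 0    # equivalently rsi_resp == 1
unattended          = rsi_enc == 0     # autonomous care with no oversight
```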

Methods

Data

The analyses use data from the VA RSI feasibility trial, which showed that RSI Inventory version 3.11 is both feasible and reliable in collecting information about time spent during supervision encounters, as described by Byrne et al.8 Briefly, after receiving 1 day of training on the manualized RSI Inventory instrument from study investigators, and under the supervision of the associate chief of staff for education, 2 registered nurses and 3 clinical care coordinators interviewed consenting attending staff and their consenting residents rotating through primary care general internal medicine clinics at the Jerry L. Pettis Memorial VA Medical Center, Loma Linda, California, from May through September 2008. Under a VA Institutional Review Board–approved protocol, the RSI Inventory was administered at the end of the resident's shift for patient encounters selected at random from among scheduled clinic appointments. Patients were limited to those who had a diagnosis of diabetes or major depression; these diagnoses are highly prevalent among VA patients, and such patients often present with moderate case complexity. The RSI scores were computed from resident responses only.

Patient demographics, International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes, and inpatient and outpatient care information were obtained from the VA's electronic databases.10,11 Demographic, education, and GME information were obtained from self-reports during baseline interviews with residents and their attending physicians.

The length of the resident's current GME program was measured in months, but effect sizes were reported in years. Case complexity was computed as the number of ICD-9 clinical conditions reported in the patient's medical chart for the indexed visit, aggregated into 1 of 17 mutually exclusive and exhaustive disorder classes (table 3). Case complexity was further refined by including data about patients' private health insurance coverage derived from the VA's electronic medical chart; results of prior studies11,12 suggested that VA patients with private health insurance coverage are in overall better health and require less complex care than VA patients without private insurance. Workload by shift was computed as the number of procedures performed in the clinic during each shift per available attending staff member.
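A hypothetical sketch of this covariate construction is shown below. The disorder-class mapping, the field names, and the reading of complexity as a count of distinct recorded classes are illustrative assumptions only, not the study's actual specification.

```python
# Hypothetical covariate construction, for illustration only.
# ICD9_TO_CLASS stands in for the study's 17 mutually exclusive and exhaustive
# disorder classes (table 3); the actual mapping is not reproduced here.
ICD9_TO_CLASS = {"250.00": "endocrine", "296.20": "mood", "401.9": "cardiovascular"}

def case_complexity(visit_icd9_codes, privately_insured):
    """Count disorder classes recorded for the indexed visit; private health
    insurance is carried separately as a marker of lower expected complexity."""
    classes = {ICD9_TO_CLASS[c] for c in visit_icd9_codes if c in ICD9_TO_CLASS}
    return {"n_conditions": len(classes), "privately_insured": privately_insured}

def shift_workload(procedures_in_shift, attending_on_duty):
    """Procedures performed in the clinic during the shift per available attending."""
    return procedures_in_shift / attending_on_duty

print(case_complexity(["250.00", "401.9"], privately_insured=False))
print(shift_workload(procedures_in_shift=120, attending_on_duty=6))
```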

Table 3 Demographics, Use of Inpatient and Outpatient Care, and Diagnoses of Study Patients
Table 3

Analyses

The association between length of GME training and RSI supervision intensity is estimated using a 2-part model.9 An exhaustive search was used within each covariate category to identify potential confounders.13 Given the complex 4-way interactions among resident, attending physician, patient, and clinic shift, we assumed that supervision intensity was independently distributed over the 140 encounters. The length of time when residents provided care with an absent attending (RSI3.5) was regressed using a log link function, with exponentiated coefficients measuring effect sizes as simple time ratios.
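The exact model specification is derived in the companion article;9 the sketch below shows one plausible way to fit a generic 2-part model of this kind with statsmodels, using synthetic placeholder data and illustrative variable names (gme_years, complexity, workload, rsi_care, rsi_35_min are assumptions, not the study's variables): a logistic model for whether supervision was initiated, a regression for intensity among supervised encounters, and a log-link regression of absent-attending minutes whose exponentiated coefficients read as time ratios.

```python
# One plausible 2-part model fit, sketched with statsmodels on synthetic data.
# The data, variable names, and the intensity/time specifications are
# illustrative assumptions, not the study's actual models.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 140
df = pd.DataFrame({
    "gme_years":  rng.integers(1, 4, n).astype(float),   # postgraduate year (1-3)
    "complexity": rng.integers(1, 6, n).astype(float),    # conditions at indexed visit
    "workload":   rng.normal(17.0, 2.0, n),               # procedures per attending per shift
})
df["rsi_care"] = np.clip(rng.normal(0.35 - 0.05 * df["gme_years"], 0.2, n), 0.0, 1.0)
df["rsi_35_min"] = rng.poisson(15, n)                      # minutes with attending absent

# Part 1: was patient care supervision initiated? (yes/no)
df["supervised"] = (df["rsi_care"] > 0).astype(int)
part1 = smf.logit("supervised ~ gme_years + complexity + workload", data=df).fit(disp=False)
print(np.exp(part1.params))        # odds ratios, e.g., per additional postgraduate year

# Part 2: intensity of supervision once initiated (here, log intensity among supervised)
part2 = smf.ols("np.log(rsi_care) ~ gme_years + complexity + workload",
                data=df[df["supervised"] == 1]).fit()
print(np.exp(part2.params))        # multiplicative intensity ratios

# Absent-attending time (level 3.5) with a log link: exponentiated coefficients as time ratios
part3 = smf.glm("rsi_35_min ~ gme_years + complexity + workload",
                data=df, family=sm.families.Poisson()).fit()
print(np.exp(part3.params))
```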

Results

Fifty-seven residents and 37 attending physicians (table 4) cared for 136 patients (table 3) during 140 encounters from May through September 2008. Over 137 shifts, the mean (SD) daily clinic workload was 578 (36) patients and 2309 (189) procedures. The total mean (SD) time per encounter was 32.7 (14.8) minutes (range, 5–81).

Table 4 Demographic, Specialty, and Medical Education Characteristics of Responding Resident Physicians and Attending Physicians
Table 4

Table 5 summarizes RSI scores by level of intensity. In 90 of 140 encounters (64%), residents presented the case to the attending staff, representing a mean of 23% of the total encounter time. The attending physician engaged in care discussions in 45 of 140 encounters (32%), consuming 23% of the total time devoted to patient care. By contrast, there were no encounters in which the attending physician provided care while the resident was absent. Among patient services, 35 of 140 encounters (25%) involved an attending physician participating in care, accounting for 24% of the total time when the attending physician was participating, observing, or absent from care. In 72 of 140 encounters (51%), attending physicians directly supervised residents during patient care, accounting for 28% of the total time residents were providing care. Attending physicians directly supervised their residents in all 140 encounters, so that no resident was left unattended (RSIenc  =  0).

Table 5 Number of Encounters when the Resident was Supervised, and Mean Resident Supervision Index (RSI) Score among Supervised Residents, by Supervision Levela
Table 5

Table 6 gives estimates of the associations between length of GME training and RSI supervision intensity, adjusted for case complexity, clinic workload, and patient, resident, and attending physician characteristics. Consistent with the progressive independence hypothesis, residents who were advanced by 1 year in their GME training were only 58% as likely to have been directly supervised during a patient care encounter, or alternatively were 1.72 (1 divided by 0.58; 95% confidence interval, 1.09–2.70) times more likely to be fully responsible for patient care. Greater responsibility for patient care (progressive independence) was the result of attending staff being less likely to participate in services or hold case discussions with residents. However, once supervision was initiated, length of GME training had little statistically significant effect on the intensity of supervision. On the other hand, residents advanced by 1 year spent only 62% as much time providing care with attending staff absent as their lower-level counterparts, consistent with the theory that upper-level residents are more efficient producers of health care.

Table 6 Adjusted Effect of Predictors on the Likelihood and Intensity of Supervision for Patient Care and Selected Supervision Levelsa
Table 6

Patients who presented with more medical conditions were associated with qualitatively different levels of supervision rather than a quantitatively higher intensity of supervision. Attending physicians who supervised residents caring for patients with more conditions tended to be more likely to interact with the resident in care discussions but less likely to participate in providing care.

Residents treating patients with private health insurance were only 42% as likely to be supervised as their counterparts treating uninsured patients, and were supervised at only 37% of the intensity once supervision began. Supervision was less intense because their attending staff held fewer care discussions and participated less in patient care.

The quantity of workload facing clinic staff did not affect whether residents were supervised. However, once supervision began, residents were supervised at only 33% of the intensity of their counterparts who rotated through clinics where 100 fewer procedures were performed per day per attending physician. Supervision was less intense because attending staff participated less in patient care.

We also found that attending physicians with advanced degrees (other than medicine) were no more likely than their non-degreed counterparts to initiate supervision. Once supervision began, however, degreed attending physicians supervised at only 24% of the intensity of their non-degreed counterparts, a consequence of degreed staff participating less in patient care. A resident's foreign medical graduate status had little effect on the likelihood or intensity of supervision.

Discussion

In this study, residents in internal medicine were granted more autonomy from supervision in a VA outpatient clinic as they progressed through GME training. Our data also showed more intensive supervision when residents faced more complex patients and less intensive supervision in clinics with greater workloads. Such findings offer empirical support for the RSI method. Quantifying supervision and measuring progressive responsibility have policy implications for defining supervision standards and measuring GME educational outcomes.14

Building on previous work,13,15 the RSI method consists of a survey instrument (RSI Inventory),8 scoring strategy (presented herein), theoretical framework (patient-centered optimal supervision),9 and analytic framework (2-part model).9 This article shows how RSI Inventory responses were scored to quantify different levels of supervision intensity that, taken together, profile supervision during encounters among residents, attending physicians, and patients in outpatient care settings. Our data provide support for the RSI method by showing intensity scores covarying with resident experience, complexity of patient cases, clinic workload, and attending physician characteristics, consistent with the patient-centered theory of optimal supervision.

Supervision has often been described as an oversight function designed to ensure the quality of care15,16 and measured by whether the attending physician made a medical chart notation,17 was physically present,18 was involved,19,20 identified discrepancies,21 or participated on the health team.22 In contrast, the RSI defines supervision broadly to include resident oversight, interactive discussions, and attending involvement. To test for progressive independence, we calculated scores that quantified progressive responsibility separately from resident oversight so that residents could be assigned full responsibility for patient care, while remaining under faculty oversight to inform appropriate supervision decisions. In fact, almost half of the VA encounters studied were full-care responsibility assignments with attending oversight.

The RSI method may serve as a tool to help GME directors evaluate a resident's progress toward independence. In recent years, the medical education community has adopted the Accreditation Council for Graduate Medical Education general competencies and the learning objectives associated with them to assess residents' competence. At the same time, some have argued that, while competencies have advanced assessment in GME, competency evaluations do not measure clinical performance based on what physicians actually do in practice.23–25 Some have noted that the general competency approach risks diverting GME assessment away from actual clinical performance by deconstructing clinical competence into demonstrations of knowledge, skill, and learning objectives.24 By contrast, clinical performance requires the integration of all 6 competencies and their application to complex, context-specific clinical scenarios.26,27 Therefore, clinical competence may be best assessed through residents' performance of clinical activities and judged by expert clinicians who are familiar with the resident's clinical performance. In fact, a resident's supervisor may be the best judge of a resident's progress toward practice independence.28,29 While these judgments are subjective, potentially biased, and limited by a lack of direct observation,26 the collective judgment of faculty over the course of a resident's training may provide a measure to assess a resident's clinical competency.24 Furthermore, few validated tools are available to directly observe trainees' skills and to track the progress of clinical skill development.30 Progressive independence as measured by the RSI relies on the supervisor's judgment about the resident's clinical competencies in situations in which his or her first duty is to represent the interest of the patient. Therefore, the RSI potentially provides an opportunity to quantify those judgments as a measure of progression to practice independence, the ultimate educational goal of GME.

The present study has several limitations. Study data were derived from a single site. Patient use of non-VA sources of care and resident rotations to non-VA facilities were not considered. The study did not measure the appropriateness, quality, or efficiency of resident supervision, nor did it include measures of quality of care or patient health outcomes. Further theoretical and empirical research is recommended.

Conclusions

Data on resident supervision at a VA outpatient clinic offer empirical support for the progressive independence hypothesis and for the 4-part RSI method. The RSI was designed to measure the intensity of resident supervision for research, program evaluation, and resident assessment purposes. An important advantage of RSI scores is that they do not need to be adjusted for patient outcomes, provided that supervisors aim first and foremost to maximize patient outcomes and that residents contribute to patient care. Before informing policy, however, more scientific research in actual teaching settings is needed to better understand the relationships among patient outcomes, clinic workload, complexity of assigned cases, and GME experience in resident supervision and professional development.

References

Appendix 1. Resident Supervision Index (RSI) Inventory (survey instrument)

Appendix 2

Case Examples Computing Resident Supervision Index Scores

Case 1

A first-year resident, alone, examines a patient with dyspnea (15 minutes) and suggests pleural effusion to the attending physician, who after discussing the case with the resident recommends a chest x-ray (9 minutes). The attending physician confirms the resident's interpretation of the x-ray and recommendation for thoracentesis (6 minutes). The attending physician performs the procedure while the inexperienced resident observes (30 minutes). The attending physician reviews and signs the medical chart (3 minutes). The attending was therefore absent from care for 15 minutes (level 3.5), spent 30 minutes providing care (level 3.2), 9 minutes interacting with the resident to direct care (level 2.1), 6 minutes overseeing and confirming the resident's thoracentesis recommendation (level 1.4), and 3 minutes signing the patient's chart (level 1.3). Thus, absent-attending care RSI3.5  =  15 minutes, providing care RSI3.2  =  30 / (15 + 30)  =  0.67, interaction RSI2.1  =  9 / (15 + 30 + 9)  =  0.17, oversight case presentation RSI1.4  =  6 / (15 + 30 + 9 + 6)  =  0.10, and oversight medical chart review RSI1.3  =  3 / (15 + 30 + 9 + 6 + 3)  =  0.05. The patient care summary is RSIcare  =  1 − [(1 − 0.67) × (1 − 0.17)]  =  0.73, indicating the resident was under direct supervision during 73% of patient care and was responsible for RSIresp  =  27%, or (1 − 0.73), of patient care. The encounter summary is RSIenc  =  1 − [(1 − 0.67) × (1 − 0.17) × (1 − 0.10) × (1 − 0.05)]  =  0.77; that is, the resident was under direct supervision during 77% of the encounter.

Case 2

Same as case 1, but a second-year resident orders the x-ray and recommends thoracentesis to the attending physician, who confirms both the diagnosis and the treatment plan (12 minutes). The resident also performs the thoracentesis with the attending physician watching (30 minutes). The attending physician continues to be absent for 15 minutes during patient services but now spends 30 minutes (level 3.4) observing the resident performing the thoracentesis. Attending-observed care RSI3.4  =  0.0 in case 1 increased to RSI3.4  =  30 / (15 + 30)  =  0.67 in case 2, while attending-provided care RSI3.2  =  0.67 decreased to RSI3.2  =  0. That is, direct supervision shifted from the attending providing care to the less intensive level of the attending observing the resident providing care. There is no attending interaction, with RSI2.1 decreasing from 0.17 in case 1 to 0.0 in case 2. Oversight case presentation increased from 0.10 to RSI1.4  =  12 / (15 + 30 + 12)  =  0.21. Oversight medical chart review remains essentially unchanged at RSI1.3  =  3 / (15 + 30 + 12 + 3)  =  0.05. Patient care supervision decreased from 0.73 in case 1 to RSIcare  =  0.67 and RSIresp  =  1 − 0.67  =  0.33 in case 2. Encounter supervision decreased slightly from 0.77 to RSIenc  =  1 − [(1 − 0.67) × (1 − 0.21) × (1 − 0.05)]  =  0.75 in case 2.

Case 3

Same as case 2, but a third-year resident performs the thoracentesis without the attending physician present. Time for absent-attending care increased from 15 minutes to RSI3.5  =  45 minutes. Time for attending observing care decreased to RSI3.4  =  0.00, with attending interaction and oversight intensities unchanged. Thus, the intensity of supervision for patient care decreased from 0.67 to RSIcare  =  0.00, with RSIresp  =  1 − 0.00  =  1.00. That is, the resident was autonomously providing patient care, with overall supervision for the encounter decreasing from 0.75 to RSIenc  =  1 − [(1 − 0.21) × (1 − 0.05)]  =  0.25.

Case 4

Same as case 3, but the attending physician tells the resident not to report back unless a problem occurs. Oversight case presentation decreased from 0.21 to RSI1.4  =  0.00, leaving supervision for the encounter to decrease from 0.25 in case 3 to RSIenc  =  0.05 in case 4.

Summary

Taken together, these 4 cases provide an example of how the increasing clinical competencies of a resident can lead to reduced intensity of supervision for patient care from 0.73 to 0.00, and for the encounter from 0.77 to 0.05, with assigned responsibility increasing from 0.27 to 1.00. From case 1 to case 4, residents were progressively assigned to full responsibility for patient care, while remaining supervised for the encounter.
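For readers who want to reproduce the arithmetic, the self-contained sketch below recomputes the patient care and encounter summary scores for cases 1 through 3 directly from the minutes spent at each supervision level (level numbering as in table 1, with higher numbers within a phase treated as less intensive and 3.5 as staff absent). It is an illustrative helper, not the study's scoring software.

```python
# Self-contained sketch: recompute RSIcare and RSIenc for cases 1-3 above
# from minutes per supervision level (illustrative helper only).

def rsi_summary(minutes):
    care_prod, enc_prod = 1.0, 1.0
    for level, t in minutes.items():
        if level == 3.5:
            continue                                   # staff absent: not direct supervision
        phase = int(level)
        denom = sum(m for lvl, m in minutes.items()
                    if int(lvl) > phase or (int(lvl) == phase and lvl >= level))
        score = t / denom if denom else 0.0
        enc_prod *= (1.0 - score)
        if phase >= 2:                                 # discussions + services = patient care
            care_prod *= (1.0 - score)
    return 1.0 - care_prod, 1.0 - enc_prod             # (RSIcare, RSIenc)

cases = {
    "case 1": {3.5: 15, 3.2: 30, 2.1: 9, 1.4: 6, 1.3: 3},
    "case 2": {3.5: 15, 3.4: 30, 1.4: 12, 1.3: 3},
    "case 3": {3.5: 45, 1.4: 12, 1.3: 3},
}
for name, minutes in cases.items():
    care, enc = rsi_summary(minutes)
    print(name, round(care, 2), round(enc, 2))
# expected: case 1 -> 0.73, 0.77; case 2 -> 0.67, 0.75; case 3 -> 0.0, 0.25
```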

Copyright: Accreditation Council for Graduate Medical Education 2010

Author Notes

T. Michael Kashner, PhD, JD, is Professor and Associate Chair for Translational Research, Department of Medicine, Loma Linda University Medical School and Director of the Center for Advanced Statistics in Education at the Jerry L. Pettis Memorial VA Medical Center, Loma Linda, CA, Professor of Psychiatry at the University of Texas Southwestern Medical Center at Dallas, TX, and Health Specialist with the Office of Academic Affiliations, Department of Veterans Affairs, Washington, DC; John M. Byrne, DO, is Associate Chief of Staff for Education and Co-Director of the Center for Advanced Statistics in Education at the Jerry L. Pettis Memorial VA Medical Center, Loma Linda, CA, and Assistant Professor of Medicine, Loma Linda University Medical School, Loma Linda, CA; Barbara K. Chang, MD, MA, is Director of Medical and Dental Education, Office of Academic Affiliations, Department of Veterans Affairs, Washington, DC, and Professor of Medicine (emeritus), University of New Mexico School of Medicine, Albuquerque, NM; Steven S. Henley, MS, is President, Martingale Research Corporation, Plano, TX; Richard M. Golden, PhD, is Professor of Cognitive Science and Engineering, School of Behavioral and Brain Sciences, University of Texas at Dallas, Richardson, TX; David C. Aron, MD, MS, is Associate Chief of Staff for Education, VA Senior Scholar, Louis Stokes Cleveland DVA Medical Center, Cleveland, OH, and Professor of Medicine & Epidemiology & Biostatistics, School of Medicine, and Professor of Organizational Behavior at the Weatherhead School of Management, Case Western Reserve University, Cleveland, OH; Grant W. Cannon, MD, is the Associate Chief of Staff for Academic Affiliations, George E. Wahlen VA Medical Center, Salt Lake City, UT and Professor and Thomas E. and Rebecca D. Jeremy Presidential and Endowed Chair for Arthritis Research, School of Medicine, University of Utah, Salt Lake City, UT; Stuart C. Gilman, MD, MPH, is Director, Advanced Fellowships and Professional Development, Office of Academic Affiliations, Department of Veterans Affairs, Washington DC and Clinical Professor of Health Sciences, University of California Irvine School of Medicine, Irvine, CA; Gloria J. Holland, PhD, is Special Assistant for Policy and Planning, Office of Academic Affiliations, Veterans Health Administration, Department of Veterans Affairs, Washington, DC; Catherine P. Kaminetzky, MD, MPH, is Associate Chief of Staff for Education, Department of Veterans Affairs Medical Center, Durham, NC, and Assistant Professor, Department of Medicine, Duke University School of Medicine, Durham, NC; Sheri A. Keitz, MD, PhD, is Chief, Medical Service, Miami VA Healthcare System, and Professor of Medicine and Associate Dean, Miller School of Medicine, University of Miami Medical School, Miami, FL; Elaine A. Muchmore, MD, is Associate Chief of Staff for Education at the VA Medical Center in San Diego, CA, and Professor of Clinical Medicine and Vice Chair for Education, Department of Medicine, School of Medicine, University of California at San Diego, CA; Tetyana K. Kashner, MD, is a resident, Department of Obstetrics and Gynecology, Pennsylvania State University Milton S. Hershey Medical Center, Hershey, PA; Annie B. Wicker, BS, is a Health Science Specialist, Office of Academic Affiliations and Data Coordinator for Center for Advanced Statistics in Education, Jerry L. Pettis Memorial VA Medical Center, Loma Linda, CA.

This study was funded in part by grant SHP 08-164 from the Department of Veterans Affairs' Health Services Research and Development Service (Dr T. M. Kashner). Development of the statistical methods was supported in part by grant R44CA139607 from the Small Business Innovation Research program of the National Cancer Institute and by grant R43AA013670 from the National Institute on Alcohol Abuse and Alcoholism (Mr Henley). All statements and descriptions expressed herein do not necessarily reflect the opinions or positions of the Department of Veterans Affairs or the National Institutes of Health of the Department of Health and Human Services.

Corresponding author: T. Michael Kashner, PhD, JD, Jerry L. Pettis Memorial VA Medical Center, Loma Linda VA Healthcare System, 11201 Benton Street, Loma Linda, CA 92357, 214.648.4608, michael.kashner@va.gov
Received: 10 Nov 2009
Accepted: 21 Jan 2010