Medical Education
Jun 2004

Adding Continuous Quality Improvement to a Medical School Curriculum: Problems and Possibilities

Bruce E. Gould, MD
Virtual Mentor. 2004;6(6):267-269. doi: 10.1001/virtualmentor.2004.6.6.medu1-0406.


In the early 1990s, the 4-year curriculum of the University of Connecticut medical school was redesigned, and a new curriculum was implemented in 1995 for the class that would graduate in 1999. As part of the curriculum reform, lectures and discussions about key aspects of performance improvement were added to the Human Development and Health course at the beginning of the second year. Students learned about the organization of the US health care system and about systems change.

A specific curriculum in quality improvement and patient safety was launched in 1998 as part of the university's participation in the Undergraduate Medical Education for the 21st Century (UME-21) project, sponsored by the US Health Resources and Services Administration; UConn was awarded an associate partnership grant.

The curriculum was nested in the Principles of Clinical Medicine course. Students heard a review of the principles of quality improvement, including its history in industry and health care, its use in inpatient and ambulatory settings, its tools (fishbone diagrams, process mapping, and the like), data analysis, and Plan-Do-Study-Act (PDSA) cycles. They were then assigned to carry out a full cycle of quality improvement activities, beginning with a chart audit of a random sample of patients from their student continuity practice. (Each UConn student sees patients at a community-based primary care physician's office one half-day per week for the first three years of medical school.)

For the first year (1998), the chart audits were limited to patients with diabetes mellitus. Data collection instruments were developed by the state peer review organization, along with a manual that defined the project. Students randomly selected a sample of diabetes patient charts to examine. Data were collected and returned to the Area Health Education Center (AHEC) Program office at UConn for analysis. Students and their preceptors were then given baseline reports on the quality indicators for diabetes treatment at each of the practice sites where students were assigned. The reports included population demographics and compliance rates as well as comparison data from the other practices participating in the Student Continuity Practice program at UConn. Benchmarks developed by local HMOs were also included.
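
As a rough sketch of the arithmetic behind such a report, a compliance rate is simply the fraction of audited charts that meet a given indicator. The short Python example below is illustrative only; the chart records, indicator names, sample size, and benchmark value are hypothetical and are not drawn from the actual project instruments.

    import random

    # One record per audited chart; the indicator fields are illustrative only.
    charts = [
        {"id": 101, "a1c_checked": True,  "eye_exam": True},
        {"id": 102, "a1c_checked": False, "eye_exam": True},
        {"id": 103, "a1c_checked": True,  "eye_exam": False},
        {"id": 104, "a1c_checked": True,  "eye_exam": True},
    ]

    sample = random.sample(charts, k=3)  # random sample of charts to audit

    def compliance_rate(records, indicator):
        """Fraction of audited charts meeting a given quality indicator."""
        return sum(r[indicator] for r in records) / len(records)

    hmo_benchmark = 0.80  # illustrative local HMO benchmark
    for indicator in ("a1c_checked", "eye_exam"):
        rate = compliance_rate(sample, indicator)
        print(f"{indicator}: {rate:.0%} (benchmark {hmo_benchmark:.0%})")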

Students reviewed the results of their chart audits with their preceptors and identified opportunities for improvement. They then designed and implemented interventions and performed post-intervention chart audits to assess the effect. Because there was no control group (the whole class participated in the exercise), we cannot definitively claim that the improvement in indicators was due to the Continuous Quality Improvement (CQI) activities. With that caveat, a statistically significant improvement in outcome indicators was documented in practices participating in the project, as described in an article published in Academic Medicine in October 2002.1
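
For readers curious how such a pre/post comparison can be tested for significance, the sketch below applies a standard two-proportion z-test; the counts are invented for illustration and are not the data reported in the Academic Medicine article.

    import math

    def two_proportion_z(pre_hits, pre_n, post_hits, post_n):
        """Two-proportion z-test for a change in a compliance rate."""
        p1, p2 = pre_hits / pre_n, post_hits / post_n
        pooled = (pre_hits + post_hits) / (pre_n + post_n)
        se = math.sqrt(pooled * (1 - pooled) * (1 / pre_n + 1 / post_n))
        z = (p2 - p1) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Invented counts for illustration -- not the published results.
    z, p = two_proportion_z(pre_hits=60, pre_n=100, post_hits=78, post_n=100)
    print(f"z = {z:.2f}, p = {p:.4f}")  # z = 2.75, p = 0.0059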

Student feedback on the experience, however, was much less favorable. Although students acknowledged that quality improvement was an important activity, they felt the project was overly time-consuming and inappropriate at a point in their careers when they were just beginning to learn the art of medicine.

Subsequent classes were given a choice of studying diabetes, hypertension, or pediatric asthma. The classes of 2002 and 2003, however, were even less enamored of the project. In an attempt to decrease the workload associated with the project and garner more positive reviews, we collaborated with the Health Plan Association in Connecticut, a loose confederation of major health plans (CIGNA, Aetna, Anthem, ConnectiCare, and Community Health Network) that meets monthly to discuss common issues and develop pilot collaborative projects. The health plans agreed to give practice-site-specific HEDIS (Health Plan Employer Data and Information Set) data to the Connecticut AHEC Program and the university for use in the project.

For the classes of 2004 and 2005, students and their preceptors were given reports on their practice sites that included quality indicators for diabetes care, well-child care, mammography and Pap smear rates, Early and Periodic Screening, Diagnostic and Treatment (EPSDT) compliance, and use of inhaled steroids in asthma, among others.

As we moved forward, we found that the HEDIS data for many of the practices did not provide a sufficient sample size for analysis. (Many had fewer than 10 cases per indicator even after merging data from multiple plans.) Medicaid-predominant practices also had very little data, since Medicaid plans are currently required to report only EPSDT data. Practices that did have adequate sample sizes often argued that the data were neither accurate nor a representative sample of their patients and practice.

Reviews of the CQI curriculum and project again were not favorable. Many students complained that using health plan data put them in the adversarial position of attempting to be a "change agent" in practices that were not ready for change. The source and paucity of the data incited hostility in preceptors, who were concerned that they were being evaluated and judged unfairly. The methodology for assembling HEDIS measures is itself problematic. Health plans can choose to sample as few as 411 enrollees from their statewide population, raising the possibility that some practices are underrepresented in the sample (a rough calculation below illustrates the problem). Additionally, because the focus of HEDIS is to document quality of outcomes for plan enrollees for accrediting agencies such as the National Committee for Quality Assurance, the patients in the sample must be enrolled in the plan during the study year; it is much less certain whether those patients actually receive their care at the practice to which they are administratively tied or obtain portions of it elsewhere. This makes it difficult to convince preceptors that the data accurately reflect their practice outcomes. As a result of both student and preceptor feedback, changes will be made in the curriculum for the class of 2006.
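
To make the sampling concern concrete: if n enrollees are sampled and an indicator's true compliance rate is p, the 95 percent margin of error for the estimated rate is approximately

\[
ME = 1.96\,\sqrt{\frac{p(1-p)}{n}} \approx 1.96\,\sqrt{\frac{0.5 \times 0.5}{411}} \approx 0.048,
\]

or about plus or minus 5 percentage points statewide in the worst case (p = 0.5). A single practice contributing, say, 15 of those 411 enrollees (a hypothetical but plausible share, given the small per-practice counts noted above) faces a margin of error near plus or minus 25 percentage points, which makes practice-level judgments statistically fragile.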

Future Plans

We have begun the process of moving the CQI experience to the third year and integrating it within the six-week internal medicine ambulatory block of the Multidisciplinary Ambulatory Experience (MAX). This portion of the MAX is sited at clinics, either hospital-associated or federally qualified, all of which are accredited by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Each of these sites has an ongoing performance improvement program guided by committed faculty who understand CQI and can serve as mentors to students. Each group of students rotating through a facility will participate in the site's own CQI program. Depending upon the site, they will attend meetings of performance improvement committees, participate in chart audit activities and interventions, and perform a clinic systems assessment to identify opportunities for improvement.

Site mentors will introduce the students to the site and identify its patient safety programs and performance improvement efforts. All students rotating through the site during the academic year will comprise a site-specific performance improvement work group. They will follow one yearlong project as a group, meeting with their mentors as the project proceeds. They will review the baseline data and guidelines used to identify opportunities for improvement, develop interventions, and review post-intervention data and outcomes. Each group will develop a storyboard to be presented to the whole class at a CQI symposium. Mentors will assist groups in developing hypothetical follow-up plans based on the results of their projects.

By redesigning the experience as a group-driven process, moving it to clinical venues that are supportive of CQI, and delaying it to the third year, when students are more clinically mature and perhaps readier to accept the concepts of outcomes measurement and process improvement, we hope students will gain an understanding of CQI and carry a positive attitude toward it into their future practices.

Other CQI programs

Various residency training programs at the University of Connecticut Health Center have integrated CQI experiences and training into their curricula.

Primary Care Internal Medicine Residency Program: All interns spend a month of their first year on an "ambulatory block," during which they are taught the nuances of providing primary care to diverse populations in an outpatient setting. During that month the group meets weekly as a CQI committee. Interns learn the history and theory of CQI and techniques for guideline research, development, and use, and they perform a chart audit at their ambulatory site, collecting baseline data on indicators of care.

Traditional Internal Medicine Residency Program: Residents are trained in the principles and theory of CQI at their continuity sites and during ambulatory blocks, and they assist in PDSA cycles at their sites.

Curricula pertaining to patient safety and CQI are in development in residency training programs of other disciplines.

The Office of Primary Care and the Connecticut AHEC Program are collaborating with the 5 major health plans, the state's malpractice insurer, and the county medical association to develop models of intervention and to introduce principles of systems analysis and performance improvement into ambulatory practices in Connecticut. As noted above, UConn has an 8-year history of using community primary care offices as teaching sites for student continuity practices, and for 6 years we have required our students to perform CQI projects at those sites, with variable success. Funding for this effort is being sought from both the health plans and foundations.

For the past 6 years, the Connecticut AHEC Program has delivered continuing education programs to practicing physicians and has assisted federally qualified community health centers as they prepare to apply for JCAHO accreditation.

References

  1. Gould BE, Grey MR, Huntington CG, et al. Improving patient care outcomes by teaching quality improvement to medical students in community-based practices. Acad Med. 2002;77(10):1011-1018.


The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.