Revamping Residency Program Accreditation
Medical Education Feature — April 2014
By Amy Lynn Sorrel
Tex Med. 2014;110(4):27-31.
Starting in July, big changes are coming for faculty members who train aspiring U.S. physicians.
The new Accreditation Council for Graduate Medical Education (ACGME) rules for residency training program accreditation come largely in response to calls for increased quality and patient safety. Mirroring the maintenance-of-certification process for practicing physicians, the shift intends to prepare future physicians to practice in a rapidly changing care delivery system. The rules also aim to reduce administrative burdens on programs so they can focus less on compliance and more on educational innovation.
Many aspects of accreditation stay the same: Faculty still evaluate residents on prescribed core competencies, and programs report that information along with board pass rates, clinical data from case logs, and faculty and resident surveys, for example. Requirements for duty hours and supervision remain in place, and site visits still occur.
Some new requirements under the Next Accreditation System (NAS), however, represent significant change: Resident evaluations now incorporate specialty-specific criteria called "milestones" to judge residents' progress. ACGME will be less hands-on: Programs in good standing only have to undergo a wholesale, on-site review every 10 years, instead of every two to five, and can skip voluminous reporting requirements and regular site visits. Instead, ACGME will accredit programs on an ongoing basis through annual electronic data collection and as-needed site visits.
ACGME leaders say graduate medical education (GME) has come under increasing pressure to demonstrate that the billions of dollars dedicated to training the nation's physicians are actually doing what they are supposed to: produce doctors well trained in a broad range of skills and core competencies.
To that end, the NAS better enables ACGME to measure residents' progress and collect data to compare educational outcomes, says Houston urologist Michael Coburn, MD. He chairs the ACGME Residency Review Committee (RRC) in urology.
The changes will help "standardize the nature of training such that the public and other stakeholders can feel assured what it takes to be deemed competent in urology in one part of the country is the same as elsewhere and trust that everyone is getting adequately trained and reaching a high level of proficiency in that training," he said.
ACGME is aware of the learning curve, and RRCs will take that into account during the transition, adds Dr. Coburn, who leads Baylor College of Medicine's Department of Urology. Rather than penalizing programs, the NAS intends to identify training issues early in the process and get them fixed sooner rather than later so resident training can continue to improve and ultimately innovate.
As with any change, Texas GME leaders expect the transition to require substantial time and effort early on. The Texas Medical Association Council on Medical Education is exploring collaborative ways for the state's teaching institutions to address program challenges, facilitate training opportunities, and share best practices along the way.
But for the most part, program leaders are optimistic that the NAS moves GME in a positive direction.
"It is a big shift in how we evaluate residents, and it does take a lot of work to convert. It may even take several years to figure out what's feasible and what meets the requirements," said council member Troy Fiesinger, MD, a Houston family physician and faculty member of the Memorial Family Medicine Residency Program.
But the new system appears to offer more flexibility and "lets us better assess what [residents'] actual needs are and be more personal in the way we teach them. And we welcome the bigger picture approach, instead of the very prescriptive" nature of the current accreditation system, he says.
ACGME began phasing in the new requirements in July 2013 with seven specialties: emergency medicine, internal medicine, neurological surgery, orthopedic surgery, pediatrics, diagnostic radiology, and urology. All remaining ACGME-accredited core specialties must implement the changes by this July.
A cornerstone of the NAS is the use of milestones in each specialty to gauge residents' progress in six core competencies. The competencies mirror those required of physicians board certified by the American Board of Medical Specialties and used in the maintenance-of-certification process for physicians in practice:
- Patient care,
- Medical knowledge,
- Practice-based learning and improvement,
- Systems-based practice,
- Professionalism, and
- Interpersonal and communication skills.
Instead of using general evaluation forms to rate residents on a 1-10 scale or relying only on documenting a certain number of procedures done, as under the previous accreditation system, the milestones spell out by specialty what makes for a well-prepared practitioner based on five levels of progression. In patient care, for example, "Level 1" urology residents must show they can perform an accurate physical exam and ask about genitourinary complaints, whereas more advanced "Level 4" residents can routinely identify subtle or unusual findings pertinent to genitourinary conditions.
Dr. Coburn, who helped develop the urology milestones, acknowledges that programs will have to invest the time and effort to learn a new assessment system and come up with tools to implement it. Programs also will have to organize a formal "clinical competency committee" to assess residents based on the milestones.
"ACGME knows the process will evolve over an extended period of time, and we were carefully counseled to identify key aspects of the six competencies that apply to [each specialty]. But we were not to be dissuaded in identifying areas we thought were important simply because we knew there were not yet in existence valid assessment tools to evaluate those specific capabilities," he said.
Ultimately, Dr. Coburn believes it is a better way of evaluating residents and identifying training problems early on. Though Baylor's residency program is only six months into the process, already "the most visible impact for us are the discussions our clinical competence committee has about each trainee, how they excelled or had certain challenges, and the notes we made to support our semiannual performance evaluation for the milestones. Those discussions are by far the most detailed and thoughtful and analytical discussions we've had about the progress residents are making in all areas."
The NAS also transforms the accreditation cycle and the role RRCs play in reviewing programs. Instead of the current two- to five-year timeline, ACGME will monitor programs continuously and will electronically collect milestones data and other information on resident performance, such as faculty development and program quality, annually.
Fewer Site Visits
The annual reporting also means the on-site visits as programs once knew them will significantly change. Instead of regularly scheduled visits under the former two- to five-year accreditation cycle — and the significant preparation and paperwork that went along with each visit — ACGME will visit once every 10 years to verify overall compliance and program improvement opportunities, barring any major concerns.
In the meantime, reviewers still have the flexibility to call for a site visit at any time, but programs won't have to scramble to prepare the voluminous program information reports, or PIFs, they did in the past. So-called "focused" visits will address specific issues such as complaints or potential red flags raised in the annual reporting. A "full" site visit, on the other hand, might examine a broader range of program needs, like application for a new program or serious concerns identified by RRCs.
Overall, Dr. Coburn says site visits will be less frequent and less burdensome, and the continuous accreditation process gives programs the chance to fix problems sooner, rather than waiting up to five years to erase a citation.
ACGME also will implement a third type of site visit — known as a clinical learning environment review (CLER) — aimed specifically at addressing the quality and safety aspects of the residents' learning environment among the multiple teaching hospitals a single GME institution uses to support resident training. Reviewers will conduct CLER visits every 18 months with an eye toward standardizing patient safety and supervision protocols and will make recommendations for improvement in six areas:
- Patient safety,
- Health care quality,
- Care transitions,
- Supervision,
- Duty hours/fatigue management and mitigation, and
- Professionalism.
ACGME Senior Vice President for Patient Safety and Institutional Review Kevin B. Weiss, MD, clarified that the CLER visits, at least initially, are not a required component of accreditation.
"But we've seen a pretty large hole in patient safety in the learning environment, and we need to get that fixed pretty quickly, not just for residents as learners but for patient care. We recognize that good institutions are giving good care, but residents by and large are not involved in patient safety," Dr. Weiss said, adding that the visits may look at areas ripe for faculty development and education.
A Learning Curve
Overall, Dr. Fiesinger expects the changes "will help — once we get all the tools in place." Assuming the annual reporting mechanisms are simple enough, the new system would replace regular site visits that he described as "huge disruptions that put everyone on hold for a few days, plus the six months it took to prepare, on top of everything else we have to do to maintain operations." He also "like[s] the idea of knowing every year where we are, that we're on the right track, and that we can fix things more quickly."
The milestones also allow programs to tailor residents' experiences and evaluations, he says. Past requirements could threaten a family medicine training program, for example, if the hospital didn't get enough deliveries and trainees could not hit their targets.
"We can't control the macroeconomics of Houston. It upsets [residents'] future careers," Dr. Fiesinger said.
And because faculty don't have to rate every trainee on every milestone in every rotation, "I know a resident in an ENT [otolaryngology] rotation can focus on how to treat ear infections and hearing loss, and working with specialists. They can get training in systems-based care somewhere else that is more appropriate. That excites me," he said.
While the milestones may prove to be more useful than what Dr. Fiesinger described as "an abstract scale," programs must design measures for the various levels of progression.
That is easier for straightforward competencies such as medical knowledge; less so for other, more abstract competencies.
"When I get to patient safety or cost-effective care, do I write a test?" he asked. "We can try to come up with measurement tools like the percent of generic drugs a resident prescribes, but only if IT says we can do it and come up with different levels. We will come up with something, but it is more challenging for these gray areas. And then the question is: Does it meet the [accreditation] standards?"
Many GME programs also rely on a significant number of volunteer community physicians to train residents in various specialties and subspecialties. Reaching and training those community physicians in the new milestone evaluations could prove challenging for some GME programs, says David P. Wright, MD, chair of TMA's Council on Medical Education. He supervises residents and directs The University of Texas Southwestern Medical Center's family medicine clerkship program at Austin's University Medical Center Brackenridge Hospital.
To reach community faculty, TMA is exploring faculty development opportunities and partnerships with Texas programs. GME leaders also must find a way to design tools that balance programs' responsibility to collect and report meaningful information without overloading those physicians who donate their time, Dr. Wright says.
"This will involve getting people to think differently, and some aspects will be more intensive. We want to continue to get valid, objective feedback from preceptors but at the same time not burden them," he said.
Accreditation Not at Risk
Dr. Coburn clarifies that it's not necessary for every volunteer faculty member to be an expert in the milestones. Rather, they should familiarize themselves with the criteria and overall objectives.
It is the job of GME programs' clinical competency committees to compile the feedback and score residents. To prepare, Baylor, for example, gave a grand rounds presentation to all of its faculty on the NAS, and its clinical competency committees likely will have teaching physicians focus on a certain subset of the milestones.
During the transition, RRCs generally want to see that programs are putting the key elements of the NAS into place, particularly the milestones, Dr. Coburn says, adding it will likely take a few years to start benchmarking performance levels among the state's teaching institutions.
"The focus of the RRC's work will be more of a mentoring role with a focus on continuous improvement, rather than just identifying deficiencies and expecting them to be corrected the next time around. And the data will be used to look at ways of doing what we do better. Right now we want to see that [programs] are doing the work, have clinical competency committees that are functioning and meeting, and inputting the data," Dr. Coburn said.
ACGME leaders expect that the vast majority of programs will retain accreditation. Barring egregious violations, those that fall short will receive a warning with the chance to resolve the citations more quickly.
Dr. Weiss of the ACGME says the system, so far, "is working as expected" with most early programs passing their first reviews and only a small number requiring intermediate feedback or site visits. "Those are good signs suggesting change is taking place."
Dr. Coburn advises programs to take initiative and take the changes seriously. Anything less would undermine the new system, he says.
"The message needs to be loud and clear that the use of this [NAS] program and the milestones needs to be genuine, and there should not be concern that [programs'] accreditation status is going to be impaired. When I see programs using the whole scale and scoring residents genuinely, to me that's a sign of a good program because it is taking its educator role and the task of critically assessing competency seriously. I would be concerned with a program where all residents seemed to be perfect," he said.
Amy Lynn Sorrel can be reached by telephone at (800) 880-1300, ext. 1392, or (512) 370-1392; by fax at (512) 370-1629; or by email.