A Continuum of Competency Assessment: The Potential for Reciprocal Use of the Accreditation Council for Graduate Medical Education Toolbox and the Components of the American Board of Pediatrics Maintenance-of-Certification Program
Reduction of unexplained variation in medical practice and health outcomes is of paramount importance. Achieving it requires a continuum of medical learning that begins in medical school and continues until the end of a professional career, which in turn requires continuing assessment of professional competence. The American Board of Pediatrics, the American Academy of Pediatrics, and the Accreditation Council for Graduate Medical Education are working together to develop a common approach to documenting acquisition of competence during residency and maintenance of competence thereafter. A common approach will eliminate redundancy and make it possible to follow the evolution of professional competence over time.
Ideally, the assessment of physician competency should be a seamless process, with the roots of the continuum established during the medical school experience and continuing during residency training and throughout one's professional career. Currently, the transition from medical school into residency training and then into practice consists of a series of handoffs that cannot be viewed as seamless. If high-quality care is the goal, there must be integration across this continuum.
The 1990s were the start of a conscious and concerted effort by those both inside and outside the medical profession to begin to systematically evaluate the quality of medical care, to codify the reasons for inadequate care and medical errors, and to make recommendations for system-wide improvement. The impetus for this reevaluation came from the work of health services researchers such as Wennberg,1,2 Brook et al,3 and others who have demonstrated significant unexplained variations in care even among well-trained, motivated physicians and practices; from consumer and employer groups such as the Leapfrog Group; from agencies and foundations that support quality research, such as the Commonwealth Fund, the Agency for Healthcare Research and Quality, and the Robert Wood Johnson Foundation; and from entities that examine quality, such as the National Quality Forum. The Institute of Medicine (IOM) also debated these issues more than 10 years ago, and the IOM publications To Err Is Human: Building a Safer Health Care System4 (1999) and Crossing the Quality Chasm: A New Health System for the 21st Century5 (2001) received widespread and immediate public attention. These reports also made specific recommendations for change, including the now-familiar goals that care should be safe, effective, efficient, equitable, patient centered, and timely. The reports immediately accelerated the calls for change in both medical education and delivered care.
During this period, the Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties (ABMS) and its member boards were jointly attempting to define, through available research, the areas in which a physician must be competent to practice excellent medicine (ie, practice that adheres to goals defined by the IOM).6,7 Six core general competencies (patient care, medical knowledge, professionalism, systems-based practice, practice-based learning and improvement, and interpersonal and communication skills) were adopted by the ACGME in 1999 and by the ABMS in 2000. The implementation, measurement, and mastery of these competencies then became the motivation for change in both the education of residents and the recertification of practicing physicians.8,9
The ACGME, through its Outcome Project,10 began to change from accrediting residency programs on the basis of their ability to meet predetermined programmatic requirements toward implementing a system that would actually measure outcomes of residency training. In addition, the ACGME subsequently indicated that each program would be required to demonstrate “continuous improvement in its educational processes.”10 At the same time, the ABMS changed from a system of periodic assessment of medical knowledge to one that involves a more continuous demonstration of competence to practice. The result of that change is the current maintenance-of-certification (MOC) program, now a feature of the recertification programs of all certifying boards of the ABMS. MOC measures physician performance in 4 areas: professional standing (MOC part 1); lifelong learning and periodic self-assessment (MOC part 2); cognitive expertise (MOC part 3); and practice performance assessment (MOC part 4). All 24 specialty boards of the ABMS, including the American Board of Pediatrics (ABP), adopted the principles of MOC in 2000.
The similarities in the measurement and outcome requirements of the ACGME and the ABMS are obvious and intentional. The ACGME and the ABMS member boards, however, have been on parallel tracks in developing tools and methods used for measuring outcomes. The ACGME “Toolbox of Assessment Methods” includes, in part, such measurement methods as written examinations, oral examinations, patient surveys, and 360-degree evaluations. Although development of the toolbox was a joint initiative of the ACGME and the ABMS, the member boards of the ABMS had the freedom to develop their own tools for measuring the components of their MOC programs. Each organization's toolbox contains instruments, some shared and some distinct, that clearly overlap in their ability to measure specific competencies. It requires little imagination to see that the Residency Review and Redesign in Pediatrics (R3P) Project could provide a unique opportunity to begin to converge and integrate the Outcome Project and its toolbox with the ABP and its menu of activities available to fulfill the requirements of MOC. Both the ACGME and the ABP have an interest in developing systems that are efficient, flexible, and nonredundant. A seamless transition from the mastery of the 6 competencies of residency training into the 4-part MOC program would benefit the ACGME, the ABP, physicians, patients, and the public at large.
A current example of a cooperative effort using a shared measurement tool is the residency in-training examination administered to both general pediatrics and subspecialty trainees. The examinations, produced and validated by the ABP, are used by most training programs as a measure of the medical knowledge of residents and fellows during their training. This is useful to training program directors to demonstrate to the ACGME that they are fulfilling their requirements to perform such assessments, as well as to the ABP because test results are predictive of success on the formal certification examination.
Residents in training may also benefit from access to assessment and educational instruments that are included in the MOC program to evaluate pediatricians in practice. The ABP has established an approval process to allow credible external projects to qualify for both MOC part 2 (knowledge self-assessment) and MOC part 4 (performance in practice) credit. The criteria for approval are available on the ABP Web site (www.abp.org). For example, the American Academy of Pediatrics (AAP) Pediatrics Review and Education Program has already been approved as an MOC part 2 activity. Included in the available MOC components are general pediatrics and subspecialty knowledge self-assessments as well as practice-improvement tools. These are robust and well-developed programs produced for practicing pediatricians. The AAP, through its PediaLink Learning Center, is promoting the development of individualized learning plans for its members. The AAP is considering incorporating the concepts of MOC and awareness of competencies mentioned previously into its individualized learning plans for residents and fellows that would meet ACGME requirements.
The ABP also has a menu of MOC part 2 activities, including a decision-skills module based on clinical scenarios and literature-based subspecialty-specific self-assessment modules; modules on patient safety are available to generalists and subspecialists. Many AAP educational tools are already available to trainees, and the tools developed by the ABP could easily be made available to training programs as another option for measuring competencies such as medical knowledge and practice-based learning and improvement.
In addition to ABP practice-improvement modules, which are short plan-do-study-act projects that currently measure outcomes in asthma and attention-deficit disorder, the ABP has approved several ongoing external quality improvement Web-based instruments, including AAP Education in Quality Improvement for Pediatric Practice (eQIPP) modules and the ABMS patient-safety module. These could be used jointly by residents and faculty. The ABP has also approved several ongoing and well-established quality improvement projects that can meet the requirements for MOC part 4, including the California Perinatal Quality Care Collaborative; the Vermont Oxford Network neonatology collaborative; the National Association of Children's Hospitals and Related Institutions bloodstream-infection collaborative; the Iowa BlueCross BlueShield asthma and immunization collaborative; the Cystic Fibrosis Foundation collaborative; the statewide Utah Pediatric Partnership obesity collaborative; and the Cincinnati Children's Hospital Medical Center access-to-care collaborative. Several additional quality improvement initiatives are under review. The active participation of physicians in any of the above-mentioned programs will automatically be reported to the ABP and will result in credit toward MOC. Such projects offer the opportunity for faculty participating in the education activities of training programs to work with residents to demonstrate that they can assess and improve care in a systematic way. Teaching faculty are, of course, responsible for delivering the same quality of care that is delivered in nonteaching environments that do not have teams that involve physicians in training. Faculty, in fulfilling their own MOC requirements, can include residents in these activities, thus demonstrating the potential for a true continuum of competency assessment.
Assuming certain requirements are met, the ABP, through its approval process, could approve components of the ACGME Outcome Project toolbox to allow credit for either MOC part 2 or MOC part 4. If, for example, as part of the Outcome Project, there were an ongoing quality improvement project involving trainees and their patients, faculty involved in those projects would automatically qualify for MOC credit if the program received approval from the ABP. In addition, community practitioners could be invited to join the quality improvement project, making it a truly collaborative effort between an academic center and the private-practice community. The latter would also qualify for credit toward certification.
As part of measuring professionalism, the ACGME and the ABP are or will be using peer and patient surveys. It is feasible that residents and faculty involved in the 360-degree evaluations from the Outcome Project toolbox would be exempt from any additional patient survey required by the ABMS. In addition, the ABMS peer and patient survey could be used in training programs, which would be entirely within the spirit of efficiency and reducing redundancy.
In addition to addressing their own professional development through MOC, teaching faculty are required to understand and teach the core competencies, which requires a level of understanding that is not required of most nonteaching physicians. There is an opportunity to create a faculty-development curriculum that would include tools and techniques to help faculty mentor the residents and fellows with whom they work, demonstrating how to sustain lifelong professional development and continuously improve practice performance. The development of such a curriculum could be a combined effort of the ACGME, the ABP, program directors' organizations, and the AAP.
Last, the use of simulation laboratories to assist in the acquisition of procedural and communication skills in medical training is developing rapidly. These same tools are being used for maintaining skills for practicing physicians and for addressing medical errors. The opportunity exists to use simulation techniques as part of MOC as well as during ACGME-accredited training.
The R3P Project will encourage meaningful innovation in training programs. It is logical to assume that one such innovation will be the creation of program designs that permit a seamless transition from residency training in a competency-based system to a competency-based certification process such as MOC, using similar, if not identical, measurement tools drawn from a variety of sources. In this regard, a collaborative effort between the ACGME, ABP, AAP, and others can only hasten the closure of the quality chasm.
- Accepted September 22, 2008.
- Address correspondence to H. James Brown, MD, American Board of Pediatrics, 111 Silver Cedar Ct, Chapel Hill, NC 27514. E-mail:
The authors have indicated they have no financial relationships relevant to this article to disclose.
- Wennberg JE. Variation in Use of Medicare Services Among Regions and Selected Academic Medical Centers: Is More Better? New York, NY: New York Academy of Medicine; 2005
- Institute of Medicine. To Err Is Human: Building a Safer Health Care System. Washington, DC: National Academies Press; 1999
- Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001
- Accreditation Council for Graduate Medical Education. Program requirements for residency education in pediatrics. Available at: www.acgme.org/acWebsite/downloads/RRC_progReq/320pediatrics07012007.pdf. Accessed May 13, 2008
- Accreditation Council for Graduate Medical Education. Outcome project: glossary. Available at: www.acgme.org/outcome/project/glossary2.asp. Accessed May 13, 2008
- Copyright © 2009 by the American Academy of Pediatrics