Objectives. To evaluate the effect of an encounter-based immunization prompting system on pediatric residents' performance in administering vaccines and their knowledge of immunization guidelines.
Design/Methods. Prospective, randomized, controlled trial. Subjects were first- and second-year pediatric residents in a hospital-based continuity clinic. The intervention group received manual prompts of immunizations due. Postclinic chart review compared immunizations due with those administered. Acceptable and unacceptable reasons for not administering vaccines were assigned. Resident knowledge was measured by a 70-item examination.
Results. The intervention group had significantly fewer missed opportunities/vaccine administration errors (11.4% vs 21.6%). The most common reason for unacceptable errors in the intervention group was that the vaccine was given too early; in the control group, that the vaccine was postponed to the next visit. Pre- and postintervention knowledge scores were similar (intervention group, 75.5% vs 80.7%; control group, 76.5% vs 81.3%).
Conclusion. An immunization prompting system in a hospital-based pediatric resident continuity clinic reduced missed opportunities/vaccine administration errors without significantly impacting resident knowledge of immunization guidelines.

Key words: immunization schedule, vaccination, immunization, prompting systems, resident education.
Immunization rates for medically underserved children remain below the year 2000 goal of 90% up-to-date by 2 years of age.1 Missed opportunities to immunize contribute significantly to underimmunization. Reluctance to administer immunizations when a child is ill, failure to immunize at all well-child care visits, and inadequate knowledge of the current immunization schedules are the major reasons for missed opportunities.2–5
Efforts to decrease missed opportunities that have focused on changing physician knowledge have had variable success. Siegel et al3 demonstrated that despite reporting good knowledge of contraindications to immunization, physicians were still reluctant to administer vaccines at acute care visits when vaccination was not contraindicated. In contrast, reminders have been effective at changing physician performance. Szilagyi et al6 showed that screening by nurses at the time of the visit and attaching reminder cards to the chart increased the rate of vaccine administration by providers. Computer-generated reminder systems can improve the performance of practicing physicians.7–13
The most common and effective type of reminder system is a patient-specific report of actions due made available to the physician at the time of an encounter. Automatic computerized reminders have increased compliance with standards, significantly reduced physician error, and improved health care outcomes.7–9,12,14
To date, no study has examined the effect of patient-specific reminders on the delivery of immunizations by pediatric residents. The immunization practices of American pediatric residents fall short15 of the US Department of Health and Human Services' Standards for Pediatric Immunization Practices (Ad Hoc Working Group for the Development of Standards for Pediatric Immunization Practices. Standards for pediatric immunization practices. JAMA. 1993;269:1817–1822). Residents fail to recognize true contraindications, are reluctant to administer simultaneous vaccinations, and do not take advantage of all opportunities to immunize children.15,16 The low level of performance among pediatric residents suggests that computerized prompting may be an appropriate strategy for performance improvement. Preliminary data from well-child care visits in our clinic revealed that residents had 1 or more missed opportunities/vaccine administration errors at 24.9% of visits for children <5 years old. However, according to Campbell et al,16 “… little has been reported regarding immunization knowledge among [these] residents.”
According to adult learning theory, active participation is important in the acquisition of knowledge.17 Although an automatic prompting system may enable residents to administer appropriate immunizations, it may also represent more passive learning and inhibit knowledge acquisition of immunization guidelines. The distinction between knowledge and performance is particularly important in the arena of resident training, where the focus must include both. To date, no study has considered the effects of prompting systems on the acquisition of residents' immunization knowledge.
The purpose of this study was to evaluate the impact of an encounter-based immunization prompting system on both pediatric residents' performance in administering vaccines and their knowledge of immunization guidelines.
Study Population and Setting
Pediatric Health Associates is a large primary care practice located at Children's Hospital, Boston, where residents, nurse practitioners, and attending pediatricians provide primary care to 10 000 children. The majority of children using Pediatric Health Associates reside within the city of Boston, with 80% receiving Medicaid. There is an average of 36 000 visits/year.
Pediatric residents are assigned to the same weekly afternoon continuity clinic session during their 3-year residency program. Preceptors supervise all visits conducted by a resident. Preceptors include both hospital-based and community-based attending physicians as well as general pediatric fellows. All first- and second-year pediatric residents were included in this study; senior residents were excluded from the intervention to allow for longer-term follow-up. The hospital's institutional review board approved the study.
The intervention took place between November 1996 and March 1997. First- and second-year residents were randomized by day of the week and assigned to either the control (no prompting) or the intervention (prompting) group. A random-number generator program was used to choose 2 days for the intervention group and 3 days for the control group. There was no difference at baseline in percentage of errors committed between the 2 groups. The intervention was limited to well-child care visits because the sizeable error rate gave sufficient power to detect a difference and staffing did not support the administration of vaccines during urgent care visits. The investigation was limited to visits of children <5 years old because almost 80% of routine childhood immunizations are recommended to be administered during this age.18 Visits where a child was new to the practice were excluded because of a lack of an immunization history before the visit. Reasons for missed opportunities/vaccine administration errors, categorized during the collection of baseline data, are listed in Table 1.
Preceptors supervising the resident were recorded for each visit. Nurses assisting the resident were not recorded because they are not assigned to a specific resident.
First-, second-, and third-year residents completed a 70-item examination during their weekly primary care conference time. The examination was administered before the intervention and a routine lecture on childhood immunizations and then 2 weeks after the intervention. At least 3 personal contacts were made to residents requesting completion of any missing examinations. Although the study was conducted with first- and second-year residents, third-year residents also completed the examination as a measure of comparison. Examination questions were taken from instruments previously used by the Centers for Disease Control and Prevention and the University of Vermont.19 Examination subject areas included immunization basic science, scheduling and administration, and contraindications. Questions about influenza and pneumococcal vaccines were excluded, as they are not currently a part of routine childhood immunization. After pilot-testing the examination on a small group of attending physicians (n = 5), minor revisions to improve clarity and to remove ambiguity were made before distribution to the residents.
Standard Resident Education
An annual 1-hour lecture on childhood immunizations, given by a senior resident, occurred at the beginning of the academic year before the intervention. The lecture included, but was not limited to, vaccine scheduling, interval spacing of immunizations, contraindications, and basic science. Current immunization guidelines from the American Academy of Pediatrics/Advisory Committee on Immunization Practices/American Academy of Family Physicians (AAP/ACIP/AAFP)20 were posted in the main conference room and in all examination rooms. These guidelines were updated in January 1997 when the new recommendations were published.
An electronic immunization registry, without a prompting or feedback component, had functioned within our institution for 11 months before this study. Immunizations administered in the primary care program are entered into the computer registry either by the primary care provider during the visit or by a clinical assistant at the end of each clinical session.
Immunization prompting was based on an algorithm consistent with the 1996 AAP/ACIP/AAFP guidelines. The algorithm was based on 3 criteria: the age of the child, the number of doses of each vaccine previously administered, and the interval since the last dose of each vaccine. Figure 1 gives an example of the algorithm used for calculating the diphtheria-pertussis-tetanus vaccine (DPT) due for children <4 years old. We were limited in our ability to examine the appropriateness of previously administered doses because immunizations due were determined manually by a research assistant. However, there were 2 situations where a provider was prompted to readminister an immunization based on a previous dose given incorrectly. If the interval between DPT doses 3 and 4 was <6 months or the first dose of measles-mumps-rubella vaccine was administered at <12 months of age, the provider was prompted to readminister the dose, provided the appropriate interval had passed. Otherwise, all previously administered doses were considered valid.
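The three criteria above can be sketched as a small decision function. The following is an illustrative reconstruction, not the study's actual Figure 1 algorithm: the minimum ages and intervals used (first dose at ≥6 weeks of age, 4-week intervals between doses 1–2 and 2–3, and the 6-month interval between doses 3–4 described above) reflect the commonly published DPT schedule of that era.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative sketch only: a DPT "due today?" check for children < 4 years,
# built from the three criteria described in the text (child's age, number of
# prior doses, interval since the last dose). Thresholds are assumptions
# based on the widely published schedule, not the study's exact Figure 1.
MIN_AGE_DOSE1 = timedelta(weeks=6)        # earliest age for dose 1
MIN_INTERVAL = {
    1: timedelta(weeks=4),                # dose 1 -> dose 2
    2: timedelta(weeks=4),                # dose 2 -> dose 3
    3: timedelta(days=183),               # dose 3 -> dose 4 (~6 months)
}

def dpt_due(birth: date, prior_doses: int, last_dose: Optional[date],
            visit: date) -> bool:
    """Return True if a DPT dose is due at this visit."""
    if prior_doses >= 4 or (visit - birth) >= timedelta(days=4 * 365):
        return False                      # series complete, or child >= 4 y
    if prior_doses == 0:
        return (visit - birth) >= MIN_AGE_DOSE1
    return (visit - last_dose) >= MIN_INTERVAL[prior_doses]
```

A check like this also explains the "given too early" error category: a dose administered when the function would return False violates either the minimum age or the minimum interval.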
Automated Printouts and Prompting
To ensure that the immunization information in the hospital's immunization registry was accurate and up-to-date, 1 day before each visit the registry report was compared with the vaccine record in the chart (considered the gold standard for this study), and any discrepancies were resolved. At the time of the visit, both groups of residents received an updated computerized printout of the child's immunization history, which was attached to the child's chart. For the intervention group, a trained research assistant manually determined the immunizations due, using the algorithm. A list of all possible immunizations along with the words due today was stamped on to the computerized printout, and the appropriate immunizations due that day were checked off. The control group received no prompt.
For those patients with incomplete immunization records, parents were granted 2 opportunities to obtain the immunization history from a previous provider. If at the third visit the chart still revealed a missing immunization history, the child was considered incompletely immunized, and the resident was prompted to administer all appropriate immunizations.
At the end of each day, postencounter chart review of all charts in both groups compared immunizations administered with those due. No variance was defined as complete administration of the vaccines due. Those charts with any variance from complete administration of immunizations due were further reviewed by a clinician, blinded to the resident groups. Before the study, unacceptable and acceptable reasons for not administering the immunization were categorized (Table 1). These reasons were based on errors documented during a 1-month period of pilot data collection and on the list of AAP/ACIP/AAFP contraindications to vaccination. The clinician reviewer assigned the reasons for not administering the immunization as prompted.
In assessing the performance of the 2 groups, the unit of comparison was the visit. Thus, 1 or more unacceptable failures to immunize at a visit counted as a single error. The comparison of errors between the 2 groups was assessed using the χ2 test. To account for the nonindependence of observations, the data were then analyzed using the physician as the unit of analysis. The proportion of visits in which 1 or more unacceptable missed opportunities/vaccine administration errors were made was calculated for each physician (number of visits with 1 or more missed opportunities/vaccine administration errors/total number of visits). The median percent of visits with unacceptable missed opportunities/vaccine administration errors was compared between the 2 groups using the Mann-Whitney U test.
The 2 groups were also compared applying the same statistical approach, with only the first visit of each child in the analysis, to avoid repeated observations on the same child.
The Kolmogorov-Smirnov goodness-of-fit test21 indicated that the percentages showed some departure from a normal distribution attributable to skewness. Therefore, the nonparametric Mann-Whitney U test was used to compare median errors between the 2 groups. Fisher's exact test was used to compare the proportion of types of missed opportunities/vaccine administration errors between the groups. Two-tailed P < .05 was considered significant.
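The Mann-Whitney U statistic used for the per-physician comparison can be computed directly from the two samples of error proportions. Below is a minimal pure-Python sketch with hypothetical per-physician values (the study's individual physician data are not reproduced here):

```python
# Minimal Mann-Whitney U statistic: count, over all cross-group pairs,
# how often a value in x exceeds a value in y (ties count one half).
def mann_whitney_u(x, y):
    """U statistic for sample x relative to sample y."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical per-physician proportions of visits with >= 1 error.
intervention = [0.00, 0.10, 0.10, 0.15]
control = [0.15, 0.20, 0.25, 0.30]
u = mann_whitney_u(intervention, control)  # small U: intervention ranks lower
```

The two directional statistics sum to n1 × n2 (here 4 × 4 = 16); significance is then read from a U table or a normal approximation.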
The statistical significance of group differences on the examination was assessed using a 2-tailed Student's t test for comparison of means. All statistical analysis was performed using the Statistical Product and Service Solutions (SPSS) software package (version 9.0, SPSS Inc, Chicago, IL).
Four hundred ninety-five children <5 years old had a total of 686 well-child care visits during the study period. Sixty visits were excluded; 59 because no chart was available before the visit, and 1 visit where the same missed opportunity was scored at a subsequent visit. Hence, 626 visits (91%) were eligible for analysis, including 298 visits to residents who received prompts (intervention group) and 328 visits to residents who did not (control group). During the study period, residents in the intervention group had an average of 13.6 ± 3.3 visits, while residents in the control group had an average of 10.6 ± 4.6 visits (P = .012). There was no difference between groups in the mean age of children seen (intervention group, 1.96 ± 1.5 years; control group, 1.92 ± 1.4 years).
Thirty-four of the 298 visits (11.4%) to residents in the intervention group and 71 of the 328 visits (21.6%) to residents in the control group had 1 or more unacceptable missed opportunities/vaccine administration errors. This difference in unacceptable missed opportunities/vaccine administration errors between groups was highly significant, P < .001. The difference in missed opportunities/vaccine administration errors remained highly significant, P = .006 (11.1% vs 20.1%, intervention vs control), when using the physician as the unit of analysis (n = 52) and when using only the first visit that occurred during the study period (n = 495), P = .004 (11.2% vs 20.8%, intervention vs control).
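The visit-level comparison can be checked from the reported counts with a standard 2 × 2 Pearson chi-square. This is a sketch in pure Python; the original analysis was run in SPSS, and the exact statistic may differ slightly if a continuity correction was applied there.

```python
# Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction.
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Visits with >= 1 unacceptable error vs. error-free visits:
# intervention 34 of 298, control 71 of 328 (counts reported above).
chi2 = chi_square_2x2(34, 298 - 34, 71, 328 - 71)
print(round(chi2, 2))  # ~11.7, exceeding 10.83, the 1-df critical value at P = .001
```

The statistic clears the P = .001 threshold, consistent with the reported P < .001.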
Table 2 summarizes the unacceptable reasons for missed opportunities/vaccine administration errors. The intervention group, compared with the control group, was less likely to postpone a vaccine to a subsequent visit (5.9% vs 25.4%; P < .02). The intervention group was also more likely to give a vaccine too early (23.5% vs 8.5%; P = .062), although this difference did not reach statistical significance.
There were no differences in missed opportunities/vaccine administration errors based on the individual preceptor who supervised the resident or the type of the preceptor (staff, visiting attending, or fellow). Although there were slightly more errors made by residents in the control group who were supervised by visiting attendings, this difference was not significant.
Eighty-seven percent (45/52) of the residents attended the lecture on immunization before the intervention. All residents completed the examination preintervention, but only 74% completed the examination postintervention. There was no difference in completion rate between the intervention and control group. The control group (n = 30 residents) scored 76.5% preintervention and 81.3% (n = 23) postintervention. The intervention group (n = 22 residents) scored 75.5% preintervention and 80.7% (n = 16) postintervention (P = NS). There was no significant difference based on year of residency pre- or postintervention. Before the intervention, residents scored 91% on questions related to the basic science of immunizations, 84% on questions about immunization contraindications, and 76% on questions about scheduling and administration, with no significant changes in examination subscores in these areas postintervention.
This study demonstrated that prompting residents for immunizations due reduced missed opportunities/vaccine administration errors. The largest difference was that prompted residents were less likely to postpone administering vaccines to a future visit. Thus, prompting served as a reminder to the resident of an immunization due, which was then administered. Other studies have demonstrated that prompting systems improve performance.22,23
A new finding in this study was that the majority of immunization administration errors among residents who received the prompts resulted from an immunization being given too early. This occurs when an immunization is administered either at too early an age or too close to a previous dose. Vaccination before the recommended interval may be an unintended consequence of our longstanding effort to encourage vaccination at the earliest possible time. The majority of children in our clinic are considered high-risk, with high rates of failing to keep appointments (30%). Therefore, providers are prompted to administer the immunization at the earliest possible time, provided the requirements for minimum age and minimum interval have been met. Although the ACIP/AAP/AAFP schedule offers a considerable range in the timing of vaccine administration, children at high risk are more likely not to be up-to-date by the age of 2 years.23,24 Because the clinic was not staffed to offer immunizations during urgent care visits, a schedule of vaccination at the earliest possible time was used.
Providers often cite the problem of multiple injections as a reason for a missed opportunity.25–27 In this study, there was only 1 case where a provider documented postponing an immunization because of multiple injections.
Manual calculation of immunizations due is a very time-consuming process requiring trained individuals to review an immunization history and for each immunization, make a decision if it is due. Although for this study a trained research assistant performed this process, in most practices the task falls to a provider or to a nurse. As with any manual process, errors can occur either in the calculation or in the transfer of this information to the provider. As new vaccines or new recommendations for old vaccines are added to the immunization regime, the calculation of immunizations due at a particular visit becomes increasingly complex. This is especially true when minimal age requirements, minimal intervals, and interactions between live vaccines are considered. With the increase in computerized immunization registries, computer-generated prompts offer a more accurate method for calculating immunizations due. Future studies should examine the use of a computerized prompting system on the reduction of immunization administration errors and missed opportunities, especially in a resident training environment.
Before this study, we were concerned that in the setting of a residency training program, prompting the resident of immunizations due for a well-child care encounter would have a detrimental effect on learning. Despite prompting for immunizations due at each well-child care encounter, the residents' overall knowledge of immunizations was not altered. Residents entered this study with good knowledge of immunization delivery and basic science; however, they were less knowledgeable of immunization scheduling and administration. The high level of resident immunization knowledge at the outset of the study reduced the power to detect a change in resident knowledge based on the intervention. Longer-term use of immunization prompting in a training environment will help to determine if this effect is sustained over time.
A limitation in this study was the use of prompting at only well-child care visits. Staffing did not allow for the administration of immunizations during acute care encounters. It is likely that prompting during acute care visits would positively affect the rate of missed opportunities. Had this occurred in this study, it could have increased the demonstrated effect.
Although careful attention was paid to when a resident attended a clinical session other than their assigned day of the week, we recognize that some of the residents in the control group may have known that their colleagues were receiving prompts. Had this contamination occurred to a large extent, it would have decreased the difference between the 2 groups.
This study suggests that in a pediatric resident continuity clinic, immunization-prompting systems can reduce missed opportunities to vaccinate children and vaccine administration errors without affecting resident learning. Indeed, prompting stimulated many questions from the residents, providing the impetus to examine how to refine prompting as an educational tool in the future. In a training environment, immunization educational interventions must maintain a low level of missed opportunities while at the same time augmenting learning. In the future, a combination of computer-generated prompting with feedback on performance offers the potential for improving performance and increasing vaccine administration knowledge in training settings.
This study was supported in part by the Massachusetts Immunization Program, Massachusetts Department of Public Health.
We gratefully acknowledge the residents, clinicians, and administrative staff in Pediatric Health Associates for their participation and support of this project.
- Received September 16, 1999.
- Accepted December 29, 1999.
Reprint requests to (J.S.S.) Vermont Child Health Improvement Project, Arnold 5459, UHC Campus, One South Prospect Street, Burlington, VT 05401. E-mail:
This research was presented in part at the 38th Annual Meeting of the Ambulatory Pediatric Association; May 2, 1998; New Orleans, LA.
- AAP = American Academy of Pediatrics
- ACIP = Advisory Committee on Immunization Practices
- AAFP = American Academy of Family Physicians
- DPT = diphtheria-pertussis-tetanus vaccine
- SPSS = Statistical Product and Service Solutions
- US Dept of Health and Human Services. Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Public Health Service; 1990
- Grimshaw JM, Russell IT. Effect of clinical guidelines on medical practice: a systematic review of rigorous evaluations. Lancet. 1993;342:1317–1322
- Zar JH. Biostatistical Analysis. 3rd ed. Englewood Cliffs, NJ: Prentice-Hall International Inc; 1996:474–480
- Copyright © 2000 American Academy of Pediatrics