Objective. Many children enter the emergency medical system through primary care offices, yet these offices may not be adequately prepared to stabilize severely ill children. We conducted this study to evaluate the effectiveness of an office-based educational program designed to improve the preparation of primary care practices for pediatric emergencies.
Methods. A prospective, randomized, controlled trial was conducted of primary care practices (pediatric, family practice, and health departments) that were recruited from an existing database of North Carolina practices. Practices that agreed to participate were randomly assigned to either the intervention or the control group. Unannounced mock codes were conducted in the intervention practices by 2 emergency medicine clinicians (medical doctor and/or registered nurse). Practices were expected to respond to the mock code using their own staff, equipment, and local emergency medical system. After the exercise, the emergency medicine clinicians and the local emergency medical system team led a structured debriefing session providing constructive feedback to the staff on their performance, a review of the office’s equipment, and a resource manual designed for the project. The primary outcome measures were obtained by survey 3 to 6 months postintervention and included 1) purchase of new pediatric emergency equipment and medications, 2) receipt or updating of basic life support/pediatric advanced life support/advanced life support training by staff members, and 3) development of written emergency pediatric protocols. The control practices received no interventions during the trial and completed a similar outcome survey.
Results. Thirty-nine practices (20 intervention, 19 control) completed the trial. There were no significant differences in practice characteristics between the 2 groups. Intervention practices were more likely to develop written office protocols (60% vs 21%); more staff in the intervention practices received additional basic life support/pediatric advanced life support/advanced life support training 3 to 6 months after the intervention (118 vs 54). There were no significant differences in the purchase of new equipment or medications. Ninety percent of the intervention practices rated the intervention as useful for their practice, and 95% believed that the program should be continued.
Conclusions. The findings suggest that the intervention was well received and motivated practices to take concrete actions to prepare for pediatric emergencies.
- office preparedness
- pediatric emergencies
- emergency medical services for children
- continuing medical education
- pediatric advanced life support
- randomized controlled trial
The 1993 report of the Institute of Medicine, Emergency Medical Services for Children,1 described numerous deficiencies in the country’s pediatric emergency medical system (EMS). It also identified the evaluation of effective approaches to EMS education and training as a priority area for research. Since 1993, substantial advances have been made in the delivery of sophisticated, high-quality pediatric emergency care, particularly in emergency departments and prehospital transport systems. However, many children enter the EMS through primary care offices. Retrospective studies have found that the rates of emergencies in primary care practices that provide care to children vary from 0.9 to 38 per office per year.2–7
A periodic survey conducted by the American Academy of Pediatrics found that 73% of respondents encountered 1 or more patients per week requiring emergency treatment or hospitalization.3 Flores et al5 surveyed 52 pediatric offices in Fairfield County, Connecticut, and found that these practices saw a median of 24 emergencies per year; 82% averaged at least 1 emergency per month. Fuchs et al6 surveyed a sample of pediatricians and family practitioners in the Chicago area and found that 62% reported assessing >1 patient per week in their offices who required hospitalization or urgent care. Most recently, Heath et al,2 in a study of 38 pediatric practices throughout Vermont, found a rate of 0.9 emergencies per office per year. Much of the variation in rates among these studies is attributable to the lack of a specific operational definition of a medical emergency in primary care settings.8 Another important finding of these studies was that many practices were not prepared to manage many pediatric emergencies. Deficiencies were documented in equipment, organization, and training.
The North Carolina Office of Emergency Medical Services responded to this need for enhanced office preparedness for pediatric emergencies by developing the educational intervention “Office Preparedness for Pediatric Emergencies” (OPPE). The goal of this novel office-based educational program is to ensure that children who have life-threatening illnesses or injuries and present to primary care practices receive high-quality care before being transferred to an appropriate emergency department. The objectives of the OPPE are to 1) increase the number of primary care physicians with up-to-date pediatric resuscitation knowledge and skills, 2) train office staff in the identification of acutely ill and injured children, 3) provide practices with emergency treatment protocols, and 4) educate practices about the role and level of training of local EMS providers.
The purpose of our study was to evaluate the effectiveness of the OPPE intervention. The intervention was piloted in 20 practices before this study, and results of an externally conducted, qualitative evaluation were very positive. However, the intervention is labor intensive and there are hundreds of practice sites that provide primary care to children in North Carolina. In addition, we were interested in determining whether the intervention would result in practices’ taking concrete actions to prepare for pediatric emergencies rather than simply improving their knowledge or attitudes about these events. Thus, we concluded that a prospective trial was warranted before deciding to promulgate the program statewide.
Study Participants, Recruitment, and Randomization
Practices were randomly selected from an existing database of approximately 1100 childhood immunization providers in North Carolina maintained by the Immunization Branch of the state health department. University- and hospital-based practices were eliminated because of their close proximity to hospital emergency departments. The list included pediatric and family medicine practices, health departments, and community health centers. For the purposes of analysis, community health centers were considered family medicine practices because physicians were on site, they saw patients of all ages, and they typically were staffed by family physicians.
The entire list of providers was randomly divided into 2 lists using a computerized random-number generator to create 2 recruitment lists, one for intervention practices and the other for control practices. All practices were mailed information regarding the study. After the mailing, a research assistant telephoned the offices, in the order in which they appeared on the 2 lists, to invite them to participate. Intervention practices were scheduled to receive the office-based training as soon as possible based on the availability of regional trainers and the practices’ preferences. These interventions took place over approximately 9 months. Control practices received the intervention >1 year later, after the intervention practices all had been visited and the outcomes surveys completed.
The core of the intervention was an unannounced “mock code” conducted in the practices by 2-person instructor teams (physicians and/or nurses with expertise in pediatric emergency medicine) and a local EMS unit. One team member would pose as a parent, arrive at the practice’s front desk, and report being unable to wake his or her child that morning. The scenario typically involved an infant who was febrile, lethargic, and severely dehydrated. Practices were told to work through the scenario in real time using their own staff and equipment. The scenario was completed after care had been successfully transferred to the EMS team.
After the mock code, the instructors led the office staff through a structured debriefing of the scenario. Special attention was paid to 1) recognition of an emergency by reception area staff, 2) movement of the patient to an appropriate resuscitation area for definitive care, 3) delineation of staff responsibilities in the treatment area, 4) availability and organization of equipment, 5) activation of the EMS, and 6) transfer of care to the EMS unit. The debriefing session focused on staff and office preparation and organization rather than specific resuscitation protocols. EMS providers actively participated in debriefing sessions and focused on issues related to local EMS capabilities and transfer of care.
Offices were left a copy of the provider’s manual produced for the intervention for future reference and practice.9 The manual provides suggestions on how to prepare offices for pediatric emergencies (eg, staff organization and training, equipment and medication lists), tips on running mock codes in offices, treatment protocols for 14 specific emergency conditions, and examples of several suggested resuscitation documentation forms. Two hours of category 1 continuing medical education credit were provided to physicians who took part in the exercise.
Three surveys were used to measure baseline and outcome variables. At the time of study enrollment, a brief descriptive survey was completed to measure baseline practice characteristics. These included practice type (solo, group single specialty, group multispecialty), specialty (pediatric vs family practice), patient volume, and the number of pediatric emergencies experienced in the past 6 months.
Outcomes were measured in the intervention practices using a mailed questionnaire 3 to 6 months after the mock code session. Control practices completed a modified version of the same questionnaire (without specific intervention questions) immediately after enrolling in the study. We were particularly interested in capturing concrete actions that the practices took to prepare their offices better as a result of the intervention. Thus, the questionnaire asked whether the practices had 1) developed office protocols for responding to pediatric emergencies (eg, where in the office the child is to be taken, how providers are notified, how EMS is activated, what happens when office is closed for lunch), 2) acquired or updated staff training (eg, basic life support [BLS], pediatric advanced life support [PALS], APLS [The Pediatric Emergency Medicine Course], advanced cardiovascular life support [ACLS]), 3) purchased new pediatric-specific emergency equipment or medications (eg, pediatric bag-valve mask device, intraosseous needles), 4) conducted subsequent mock codes or other practice exercises, or 5) experienced a change in perceived office preparedness for pediatric emergencies.
Intervention practices were instructed to answer the outcome questions as they applied to the 6-month interval since they had received the intervention. Control practices were asked to answer the questions for the 6 months before enrolling in the study. We used a nonconcurrent time frame for the outcomes in the control practices for 2 reasons. First, it decreased the number of times we had to contact the control practices, sparing them interruptions and conserving the project’s scarce resources. Second, we believed that this would minimize the possibility of “contamination” of the control group. Intervention and control practices were located in the same communities in several areas of the state. We worried that the purpose and content of the office-based sessions could easily be shared between practices, potentially diluting the apparent impact of the intervention. Immediately after the workshop, the intervention practices completed a third survey that evaluated whether the instructors effectively met the learning objectives of the session and whether the content was relevant to their practice.
In most cases, surveys were completed by the practices’ head nurse or lead physician. These individuals were encouraged to seek answers from other practice members as needed. Project staff made multiple calls to the practice to ensure high response rates and accurate answers to all questions. Several practices chose to complete surveys by telephone.
Three of the primary outcome variables were categorical (development of protocols, purchase of new equipment, and conduct of additional mock codes). The number of staff who obtained new training or updated existing training was recorded as a continuous outcome. The χ2 test was used to compare nominal and categorical variables, and the t test was used to compare continuous variables. The small number of practices enrolled precluded stratified analysis or multivariate modeling. The study was approved by the Committee on the Protection of the Rights of Human Subjects at the University of North Carolina School of Medicine.
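As a worked illustration of the χ2 comparison described above, the protocol-development outcome reported in the Results (12 of 20 intervention practices vs 4 of 19 control practices developed written protocols) can be reproduced from a 2 × 2 table. This is a sketch using SciPy, not the authors' original analysis code, and the exact P value depends on whether a continuity correction is applied.

```python
# Sketch of the chi-square comparison applied to the protocol-development
# outcome reported in the Results: 12/20 intervention vs 4/19 control
# practices developed written emergency protocols. Not the authors' code.
from scipy.stats import chi2_contingency

# Rows: intervention, control; columns: developed protocols, did not
table = [[12, 8],
         [4, 15]]

# SciPy applies the Yates continuity correction to 2x2 tables by default
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")  # p falls below .05
```

Run either with or without the Yates correction, the result is significant at the .05 level, in line with the reported P = .02.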
A total of 448 practices were contacted by telephone and asked to participate in the study. Forty-seven practices (23 intervention, 24 control) consented to participate. By the study closure date, 3 intervention practices had not received the OPPE, and 5 control practices did not complete the outcome instrument. Thus, a total of 39 practices (20 intervention, 19 control) completed the trial. Of the 401 that did not participate, 20 practices had already received the OPPE workshop, 77 did not meet the inclusion criteria, 43 stated that they were too busy for the training, and 94 declined to participate. An additional 167 practices were excluded because we were unable to contact the senior partner after 4 attempts.
The groups were comparable with respect to specialty, organization, and distance from nearest hospital (Table 1 ). However, there was a trend toward larger pediatric patient volume in the intervention practices when compared with the controls, although this finding was not statistically significant (P = .13). There were differences in the geographical distribution of intervention and control practices across the state. The intervention practices did call EMS to their offices for pediatric emergencies more often in the previous 6 months than the control practices (2.6 calls vs 0.3 calls; P = .05).
Twelve (60%) of the 20 intervention practices developed written response plans for pediatric emergencies, versus only 4 (21%) of 19 control practices (P = .02). Practices that received the intervention were also more likely to get additional BLS/PALS/ALS training for staff after the intervention (118 vs 54; P = .02). Table 2 shows the types of new training that the practices received in the 3 to 6 months after OPPE (intervention) or after enrollment (control).
There were no significant differences between intervention and control practices in the purchase of new equipment or medications (20% of intervention practices vs 22% of control practices; P = .86). There were also no differences between the 2 groups in the proportion that had conducted subsequent mock codes or other practice exercises.
Intervention practices tended to feel more prepared for emergencies than control practices. On a 1 to 10 scale (1 = not prepared, 10 = very prepared), the mean score in the group of intervention practices was 7, compared with 6 in the control practices, although the result did not reach statistical significance (P = .07).
Satisfaction with the OPPE was evaluated twice, first immediately postintervention and then again as part of the 3- to 6-month outcome evaluation survey. Table 3 shows the responses of the intervention practices regarding their staff’s reactions to the OPPE exercise immediately after the training session. At the 3- to 6-month post-OPPE evaluation, 90% of the intervention practices reported that the OPPE was useful for their practice and 95% believed that the program should be continued.
The findings suggest that the OPPE intervention motivated practices to take concrete actions to prepare for pediatric emergencies. The practices that received the training expressed a high degree of satisfaction with the program.
This intervention is unique in several ways. First, it was performed in primary care offices by multidisciplinary teams of physicians, nurses, and EMS providers. This format allowed primary care practices to assess their preparation in their own practice setting, using their own equipment and staff. Most participants commented that the element of surprise and realism enhanced the utility of the exercise. In addition, the interaction between EMS providers and office staff was invariably fruitful. EMS staff appreciated the opportunity to practice pediatric resuscitation skills. Offices learned more about the pediatric capabilities and organization of their local EMS. Second, we had a randomly selected control group, which allowed us to better assess the impact of the intervention. Previous studies in this area either relied on retrospective surveys alone or lacked control groups.
Heath et al2 used a somewhat similar approach in 38 pediatric practices in Vermont. In their study, a team consisting of a physician, a nurse, and an emergency medical technician scheduled visits with practices, discussed office organization for emergencies, reviewed PALS algorithms, provided a standardized resuscitation kit, and demonstrated an office mock code using the donated resuscitation kit. As in our study, the majority of participating practices believed that the intervention and donated equipment were useful. In the 12-month period after the intervention, 30% had involved EMS in office planning for emergencies, 97% reported checking their equipment on a regular basis, and 30% performed additional mock codes in their offices. These investigators also prospectively measured office emergency prevalence over 12 months and found that 14 of 38 practices experienced office emergencies requiring the use of resuscitation equipment and 30% of the practices had called EMS to their office for an emergency. There was no comparison group in this study, so valid assessments of their interventions are limited.
The intervention that we report resulted in numerous benefits not captured by the primary outcomes. With respect to equipment, many offices had Broselow tapes; however, they typically used them only to estimate weights and then calculated medication doses by hand rather than reading the doses directly off the tape. The importance of an emergency response plan was most keenly apparent in one health department clinic where a nurse was the only clinician present for the initial mock code. During the scenario, she recognized that the infant was becoming cyanotic and needed oxygen. After several minutes, she was asked whether she would give oxygen. She replied, “Yes, but I do not have an order.” This clinic subsequently developed standing orders to cover the periods when no physicians were available. Although we found no significant difference in the purchase of new equipment, 3 practices sent us pictures of their equipment reorganized using the Broselow color coding system and inexpensive materials obtained at office supply stores. EMS involvement in the offices led to additional benefits. For example, the EMS providers who took part in one mock code conducted a BLS training session for the entire office staff the next week, thus allowing the entire nonclinical staff to increase their training quickly and conveniently.
There are several limitations to this study. The true outcome of interest in office emergencies is the outcome of actual resuscitations. This study did not determine whether the intervention improved the provision of emergency care to actual patients. Because of the infrequent nature of such events and the difficulties in capturing such data, we were forced to choose what we believed were valid proxies for these events. We did choose outcomes that required practices to take concrete steps to prepare themselves for emergencies rather than simple knowledge, attitude, and awareness measures.
All outcomes were measured by self-report, which allows for social desirability bias. However, because outcomes were measured similarly in the intervention and control groups, it is unlikely that this factor had a differential effect. It was also not possible to blind the research assistant who performed the follow-up surveys to the practices’ “treatment” assignment, allowing for the possibility of bias. We attempted to minimize this by using standardized instruments with mutually exclusive responses.
There were 2 additional challenges to conducting this study. Practice recruitment via the telephone presented a challenge. Letters were sent to all practices describing the project and our role as researchers from the University of North Carolina before the enrollment telephone call. Nevertheless, it became apparent that many of the offices contacted were concerned that we represented a state regulatory agency and the OPPE was a disguised site visit to evaluate their emergency preparedness. This explained the large number of practices that declined to participate. The problem was minimized through a procedure instituted midstudy. After a brief initial telephone call to the practice, the research assistant immediately faxed a detailed information sheet describing the study and our participation in the project, after which a subsequent telephone call was placed to answer questions, address concerns, and invite the practice to participate in the study. This practice alleviated most suspicions.
The second challenge was delivering the intervention to practices throughout the state of North Carolina. At the outset of the study, it was expected that a core of 10 volunteer OPPE faculty would be committed to providing 5 intervention exercises during a 2- to 3-month period. Although this core of volunteer nurses and physicians was systematically trained to provide the OPPE, competing demands on faculty time, practice scheduling constraints, and practice staff turnover resulted in fewer OPPE exercises being completed than projected in that time frame.
Sustaining this program will be challenging, despite its positive impact. Like PALS and other ALS courses, OPPE relies on volunteers to obtain training and conduct exercises. Currently, no regulatory agencies require practices to prepare systematically for emergencies. The American Academy of Pediatrics has published policy statements on pediatric emergency preparedness in schools,10 urgent care centers,11 and hospitals12,13 but not in primary care practices. However, the topic has received increased attention recently as evidenced by the publication of several review articles,14,15 a new textbook,16 and an American Academy of Pediatrics resource manual17 and the availability of workshops on the subject at professional society meetings.
Despite these challenges, we are encouraged by the program’s results and the many benefits not systematically assessed in this study. Traditional continuing medical education is didactic and focuses on knowledge acquisition. OPPE represents a growing trend to make continuing medical education more experiential and focused on competency and improvement.
We thank the primary care offices listed below that participated in this project. Without their participation, this study would not have been possible. We also thank the instructors and EMS personnel who conducted this statewide intervention. They generously contributed their time both to be trained and to conduct the office sessions.
The primary care offices that participated in this project are as follows: Avery County Health Department, Caldwell Family Physicians, Capitol Pediatrics, Charles Drew Community Health Center, Coastal Immediate and Primary Care, Durham Children’s Clinic, Gaffney Health Services, Gaston County Health Department, Halifax County Health Department, Hart Family Practice, Haywood Pediatrics, Health East Family Care-Hattaras, Henderson Pediatric Clinic, Hot Springs Health Program, Johnston’s Family Care Center, Knox Clinic Pediatrics (Wilmington Office), LeBauer Healthcare at Brassfield, Matthews Children’s Clinic, Maxton Family Practice, Medical Associates of Wilkes County, Medstop Medical Center, Moncure Community Health Center, Mt. Olive Pediatrics, New Hanover Community Health Center, New River Family Medicine, North Carolina Pediatrics, Park Family Practice, Pediatric Partners, Presbyterian Mint Hill Family Practice, Providence Pediatrics, Randolph Medical Associates, Rex Pediatrics of Cary, Roanoke Chowan Medical Practice, Rockingham Children’s Clinic, Shelby Children’s Clinic, Sylva Pediatrics, Troy Medical Services, Unifour Family Practices, and Vance Warren Comprehensive Health.
- Received May 14, 2002.
- Accepted January 14, 2003.
- Address correspondence to W. Clayton Bordley, MD, MPH, Division of Emergency Medicine, DUMC Box 3096, Durham, NC 27710. E-mail:
- ↵Institute of Medicine. Emergency Medical Services for Children. Washington, DC: National Academy Press; 1993
- ↵Heath BW, Coffey JS, Malone P, Courtney J. Pediatric emergencies and emergency preparedness in a small rural state. Pediatrics.2000;106 :1391– 1396
- ↵American Academy of Pediatrics. Periodic Survey 27. Elk Grove Village, IL: American Academy of Pediatrics; 1995
- Schweich P, DeAngelis C, Duggan A. Preparedness of practicing pediatricians to manage emergencies in the office. Pediatrics.1991;88 :223– 229
- ↵Fuchs S, Jaffe DM, Christoffel KK. Pediatric emergencies in the office practices: prevalence and office preparedness. Pediatrics.1989;83 :931– 939
- ↵Altieri M, Bellet J, Scott H. Preparedness for pediatric emergencies encountered in the practitioner’s office. Pediatrics.1990;85 :710– 714
- ↵Flores G, Heath BW. Methodologic flaws, wrong answers, and right questions: pediatric office emergencies. Pediatrics.2001;108 :1052– 1053
- ↵Frush K, Hohenhaus S, Bailey B, Cinoman M. Office Preparedness for Pediatric Emergencies. North Carolina Office of Emergency Medical Services, 1997. Available at: www.ncems.org/pediatri.htm. Accessed January 13, 2003
- ↵American Academy of Pediatrics, Committee on School Health. Guidelines for emergency medical care in school. Pediatrics.2001;107 :435– 436
- ↵American Academy of Pediatrics, Committee on Pediatric Emergency Medicine. Pediatric care recommendations for freestanding urgent care facilities. Pediatrics.1999;103 :1048– 1049
- ↵American Academy of Pediatrics, Committee on Pediatric Emergency Medicine. Guidelines for pediatric emergency care facilities. Pediatrics.1995;96 :526– 537
- ↵American Academy of Pediatrics, Committee on Pediatric Emergency Medicine. Care of children in the emergency department: guidelines for preparedness. Pediatrics.2001;107 :777– 781
- ↵Schuman AJ. Be prepared: equipping your office for medical emergencies. Contemp Pediatr.1996;13 :27– 43
- ↵Barton CW. Management of Office Emergencies. New York, NY: McGraw Hill; 1999
- ↵American Academy of Pediatrics. Childhood Emergencies in the Office, Hospital, and Community. Elk Grove Village, IL: American Academy of Pediatrics; 2000
- Copyright © 2003 by the American Academy of Pediatrics