BACKGROUND: Pediatric housestaff are required to learn basic procedural skills and demonstrate competence during training. To our knowledge, an evidence-based procedural skills curriculum does not exist.
OBJECTIVE: To create, implement, and evaluate a modular procedural skills curriculum for pediatric residents.
METHODS: A randomized, controlled trial was performed. Thirty-eight interns in the Boston Combined Residency Program who began their training in 2005 were enrolled and randomly assigned. Modules were created to teach residents bag-mask ventilation, venipuncture, peripheral intravenous catheter (PIV) insertion, and lumbar puncture skills. The curriculum was administered to participants in the intervention group during intern orientation. Interns in the control group learned procedural skills by usual methods. Subjects were evaluated by using a structured objective assessment on simulators immediately after the intervention and 7 months later. Success in performing live-patient procedures was self-reported by subjects. The primary outcome was successful performance of the procedure on the initial assessment. Secondary outcomes included checklist and knowledge examination scores, live-patient success, and qualitative assessment of the curriculum.
RESULTS: Participants in the intervention group performed PIV placement more successfully than controls (79% vs 35%) and scored significantly higher on the checklist for PIV placement (81% vs 61%) and lumbar puncture (77% vs 68%) at the initial assessment. There were no differences between groups at month 7, and both groups demonstrated declining skills. There were no statistically significant differences in success on live-patient procedures. Those in the intervention group scored significantly higher on knowledge examinations.
CONCLUSIONS: Participants in the intervention group were more successful performing certain simulated procedures than controls when tested immediately after receiving the curriculum but demonstrated declining skills thereafter. Future efforts must emphasize retraining, and residents must have sufficient opportunities to practice skills learned in a formal curriculum.
The Accreditation Council for Graduate Medical Education mandates that pediatric residencies develop innovative, competency-based approaches for teaching clinical skills.1–3 Teaching procedural skills using the “see one, do one, teach one” approach is not adequate.4 Physician trainees are less likely than nurses to receive formal teaching and supervision when first performing a procedure, and are more likely to feel inadequately trained to initially perform a procedure.5 Pediatric residents do not competently perform skills included on the Residency Review Committee's list of basic procedures, despite feeling confident to do so.6–9 We surveyed pediatric program directors10 and found that trainees may fail to achieve competence in basic procedures by the end of training.
To our knowledge, no comprehensive, validated curriculum exists to teach essential procedures to pediatric residents. Experience in other fields suggests that outcomes-based training leads to earlier acquisition and independent application of procedural skills.1,11 Residents have learned to perform circumcision,12 injections,13 and resuscitation skills14,15 with documentation of competence at the end of training. Educational theory supports the teaching of psychomotor competence through simulated experiences,16 a preferred method given the current emphasis on error reduction and patient safety.17 Anatomic models and high-fidelity patient simulators are potentially useful training aids for teaching psychomotor skills,16,17 and they eliminate the need for trainees to practice on patients during the period of skill acquisition.
Pediatric educators also lack objective evaluation tools to document procedural competence. Procedure logs and subjective evaluation have traditionally been used to evaluate procedural skills16 despite the absence of data establishing a relationship between procedural exposure and performance. Increasing evidence suggests that valid and reliable assessments of procedural skills may be achieved in simulated experiences.15,18 Data also support the use of structured checklists for assessing procedural skills, and they reveal technical errors even when a procedural end point is reached successfully.8,9,18
We designed a modular curriculum to teach procedures to pediatric interns using simulation during training and assessment, and conducted a randomized, interventional study to assess its effectiveness. Using a structured evaluation tool, we hypothesized that interns receiving the curriculum would demonstrate better procedural skills than the control group when assessed on simulators and live patients.
Twenty senior fellows and faculty at Children's Hospital Boston (CHB) participated in focus groups that identified bag-mask ventilation (BMV), venipuncture, peripheral intravenous catheter (PIV) placement, and lumbar puncture (LP) as the most critical procedural skills for pediatric trainees. Individual modules and assessment tools (structured binary checklists) were then developed by expert clinical faculty from the divisions of critical care and emergency medicine (BMV, LP), and by the director of the CHB vascular access team (venipuncture, PIV). Each module was designed to encourage repetition leading to automatic behavior (behaviorist learning) and the development of problem-solving skills through experience (constructionist learning).19
Study Population and Setting
All 38 pediatric interns starting in the Boston Combined Residency Program in June 2005 were eligible for participation. The study was conducted at an urban, academic referral hospital with both vascular access and phlebotomy teams available 24 hours a day.
We performed a randomized, controlled trial. The study was approved by the CHB institutional review board, and written informed consent was obtained from participants. Before randomization we recorded interns' previous experience and confidence with these procedures. Random assignments were generated by computer by using a permuted blocks design with random block sizes. Participants were assigned to intervention or control groups by 2 study investigators working together (Drs Gaies and Sandora). Intervention group interns received the curriculum, whereas control group interns learned procedures via traditional educational strategies (observation of more experienced clinicians). Pediatric advanced life support (PALS) instruction occurred during intern orientation; no other sessions were dedicated to procedural teaching during the intern year. There were no requirements for interns to perform a certain number of these procedures. The curriculum was offered to the control group after the 6-month study period.
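The permuted-blocks scheme described above can be sketched in a few lines of Python. This is a minimal illustration only: the block sizes, arm labels, and seed are assumptions for the example, not the study's actual randomization parameters.

```python
import random

def permuted_block_assignments(n_subjects, arms=("intervention", "control"),
                               block_sizes=(2, 4), seed=2005):
    """Generate a randomization list using permuted blocks of random size.

    Each block holds an equal number of slots per arm and is shuffled
    internally, keeping group sizes roughly balanced as subjects enroll.
    Block sizes must be multiples of the number of arms.
    """
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_subjects:
        size = rng.choice(block_sizes)           # pick a random block size
        block = list(arms) * (size // len(arms))  # equal slots per arm
        rng.shuffle(block)                        # permute within the block
        assignments.extend(block)
    return assignments[:n_subjects]

# 38 interns, as in the study cohort
schedule = permuted_block_assignments(38)
```

Because each completed block is balanced, the final imbalance between arms can never exceed half of the largest block size, regardless of where enrollment stops.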
The BMV, venipuncture, and PIV modules were taught during orientation, after subjects had received PALS training (Fig 1). The intervention group participated in a didactic session with faculty, observed the instructor performing the procedure, and then practiced the skills multiple times on a simulator under direct observation. Two days later both groups underwent a structured observational assessment on simulators. Seven senior residents blinded to group assignment were trained to use a binary checklist of weighted procedure subcomponents to score performance; 1 step was chosen a priori as the measure of overall success for the procedure (see Appendices 1–4). Each subject was scored by only 1 evaluator. Scorers were observed by the 2 primary investigators (Drs Gaies and Morris) during the evaluation to ensure accurate assessment. Sessions were videotaped and available for additional review, but this was not necessary. The final checklist score was calculated by expressing the sum of the weighted points earned for correctly performed tasks as a percentage of the total possible points. Feedback immediately after the exercise was neither mandatory nor prohibited. Subjects were not given access to the checklist. Each intern also completed a multiple-choice knowledge examination. Examinations and checklists were adapted from the PALS curriculum, previous studies of BMV skills,8,15 and the CHB Department of Nursing vascular access curriculum. The high-fidelity BMV simulator was capable of real-time physiologic responsiveness to intervention. Anatomic models capable of giving a red fluid sample from a vessel were used for venipuncture and PIV.
The LP module was delivered during each intervention subject's first 2- or 4-week emergency department rotation. This module was administered similarly to the methods described above. The simulator used was an anatomic model capable of giving a clear spinal fluid sample. A nonblinded observer assessed subjects in both groups during a structured observational assessment on the simulator before completion of the rotation, and a knowledge examination created by the module faculty was administered. The first 4 subjects were videotaped, and a different blinded observer scored the subjects by watching the video, with 94% agreement between observers on all scored items. All subsequent subjects were assessed by the original nonblinded observer.
Venipuncture, PIV, and LP skills were evaluated in live clinical situations. All subjects prospectively reported on handwritten data cards the number of attempts and outcome for each procedure performed during the first 6 months of internship. A successful venipuncture was defined as the ability to fill a specimen tube with a quantity sufficient for the desired diagnostic test within 2 attempts. A successful PIV insertion was defined as the ability to place an intravenous catheter suitable for at least 1 therapeutic infusion within 2 attempts. LP success was defined as obtaining a suitable cerebrospinal fluid sample (not further specified) for culture. Procedure cards were not reviewed by senior physicians, and no incentive was offered to interns for procedure reporting.
At the conclusion of the study period, the direct observational assessment on simulators and written examinations were repeated for all 4 procedures. The blinded observers who scored the original assessment were retrained as necessary for the final assessment. Two new observers were trained to provide coverage when an original observer was not available. It was assumed that most procedures would be performed during NICU, emergency department, and ward rotations; all subjects had at least 1 rotation in each before the final assessment.
The primary outcome was successful performance of the procedure on the simulator assessment at the end of the module (see Appendices 1–4 for overall success measures). Secondary outcomes included scores on structured checklists at the initial assessment, overall success on simulators and checklist score at the final assessment, and knowledge examination scores at the initial and final assessments. Success in performing venipuncture, PIV, and LP on live patients was also analyzed.
We summarized participant demographics with means and SDs for continuous variables and percentages for categorical variables and compared the control and intervention groups with Wilcoxon rank sum tests and χ2 tests, as appropriate. We used generalized estimating equations to model the outcomes as a function of group, time, and their interaction. We used the logistic link for the success outcome and the identity link for the checklist scores and knowledge examinations. Statistical analyses were performed by using SAS 9.1 of the SAS System for Windows (SAS Institute, Inc, Cary, NC). Two-sided P values of <.05 indicated statistical significance.
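For the categorical success outcomes, the χ2 comparison between two groups can be illustrated with a hand-rolled Pearson test on a 2 × 2 table. The counts below are hypothetical, chosen only to approximate the reported PIV success proportions (roughly 78% of 18 intervention subjects vs 35% of 20 controls); the study's actual analysis used SAS.

```python
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test for a 2x2 table (df = 1), without
    continuity correction. Rows are groups; columns are success/failure.
    Returns the test statistic and the two-sided P value."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For 1 df, the chi-square survival function reduces to erfc(sqrt(x/2))
    p = erfc(sqrt(stat / 2.0))
    return stat, p

# Hypothetical counts: intervention 14 successes / 4 failures,
# control 7 successes / 13 failures
stat, p = chi_square_2x2(14, 4, 7, 13)
```

The generalized estimating equations used to model group-by-time effects would require a dedicated statistical package and are not reproduced here.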
Eighteen subjects were randomly assigned to the intervention group and 20 to the control group. There were no differences between the groups for age, gender, or life support certifications obtained before internship (Table 1). The control group tended to have more procedural experience and greater confidence. All intervention group subjects received the curriculum, and no subjects withdrew.
Simulated Skills Testing
At the initial BMV assessment, there was no difference between the groups in overall success (72% [intervention] vs 80% [control]; P = .57) or on checklist scores (66% [intervention] vs 62% [control]; P = .46) (Fig 2). There was a significant decline in performance over time for both groups on the BMV assessment (72% success initially vs 27% at follow-up for intervention [P = .029] and 80% vs 45% for control [P = .027]). Participants in the intervention group scored significantly higher on the knowledge examination both initially (76% vs 62%; P < .0001) and at follow-up (73% vs 65%; P = .04).
Initially, there was no difference between the groups in overall success for venipuncture (58% [intervention] vs 55% [control]; P = .85), but participants in the intervention group trended toward a higher checklist score (79% vs 68%; P = .16) (Fig 3). At follow-up, there was no difference between the 2 groups by either measure. Those in the intervention group demonstrated no significant decline in venipuncture skills, but the control group trended toward declining performance. On the venipuncture/PIV component of the knowledge examination, the intervention group scored significantly higher at both the initial (86% vs 76%; P = .001) and final (83% vs 77%; P = .049) assessments.
Initially, participants in the intervention group significantly outperformed controls in overall success for PIV placement (78% vs 35%; P = .01) and on the checklist (81% vs 61%; P = .003) (Fig 4). At follow-up, the intervention group trended toward better overall success and higher checklist scores, but the differences were not significant. The control group demonstrated no significant improvement over time, whereas the intervention group had a declining trend in its checklist scores (P = .06).
Initially, there was no difference between the groups in overall success for LP (77% [intervention] vs 62% [control]; P = .38); however, participants in the intervention group scored significantly higher on the checklist (77% vs 68%; P = .04) (Fig 5). At follow-up, there was no difference between the 2 groups by either measure. The control group's overall success did not improve, but they had significantly higher scores on the checklist at follow-up compared with the initial assessment (P = .006). Overall LP success for those in the intervention group was lower at follow-up than initially, but this was not a significant decline. The intervention group had lower scores than the control group on the knowledge examination (59% vs 82%; P < .0001) initially, but there was no difference at the final assessment (69% vs 69%; P = .80).
The results of 126 venipunctures in live patients were reported, with participants in the intervention group trending toward more successful performance (83% vs 69%; P = .07). Fifty-three PIV insertions were reported, with no significant difference in performance observed (66% [intervention] vs 50% [control]; P = .25). Finally, 64 LPs were reported with no difference in performance between groups (70% [intervention] vs 62% [control]; P = .60). There were no differences in the number of procedures reported by each group, confidence to perform the procedure, or patient demographics.
Interns receiving our curriculum performed PIV and LP skills on simulators with greater success than controls immediately after the intervention, and scored higher on knowledge examinations for venipuncture/PIV and BMV. Participants in the intervention group demonstrated declining skills at follow-up, and the control group did not improve.
There was a trend toward better performance of venipuncture on the procedure subcomponent checklist, and our success measure may have underestimated the intervention group members' skills. The overall success measure for venipuncture was withdrawal of 3 mL of fluid from the simulator, although participants were not instructed to withdraw a specific volume. The success rate for the intervention group on this step was 59%; however, 77% of intervention subjects successfully performed a later step (transferring a sample from their syringe to the specimen tube), demonstrating that at least that proportion obtained a fluid sample of some volume. We concluded those in the intervention group obtained a specimen more successfully than controls but did not achieve a significantly higher overall success rate because of withdrawal of an inadequate volume.
There were no differences between the groups in BMV performance on a high-fidelity patient simulator when evaluated shortly after PALS training. We had hypothesized that adding the BMV module to PALS training would result in better retention of BMV skills, but this was not observed. The intervention group scored higher on the BMV component of the knowledge test initially, suggesting that the curriculum provided a better understanding of the procedure than that achieved by PALS training alone.
At follow-up there were no significant differences between the groups for any of the skills. The only significant improvement by the control group over time was on the LP checklist, suggesting that traditional educational methods did not lead to acquisition of most procedural skills during the study period. Declining performance of participants in the intervention group on PIV and LP also contributed to the lack of differences at follow-up, and both groups performed significantly worse on BMV. These findings highlight the important issue of skill retention. A systematic review of medical simulator training programs found that only 2 of 44 studies included an assessment of skill retention after intervention.20 In a study of microsurgical skills training,21 subjects trained with simulators demonstrated attrition of skills at a 4-month follow-up, but continued to outperform subjects trained by didactic methods alone. Curran et al22 studied the retention of neonatal resuscitation skills after standard and simulated training. There was a significant decline in performance over 4 months, and retraining with simulation did not result in further retention of skills. Other controlled and uncontrolled studies of cardiopulmonary resuscitation,23 colonoscopy,24 and surgical training25 have produced mixed results with regard to skill maintenance over time.
Our subjects' declining skills may be explained by insufficient opportunities to practice procedures during clinical work. The number of self-reported live procedures was likely an underestimate of the actual number performed, but the small numbers corroborate the authors' personal experience at CHB that interns perform very few procedures relative to trainees of years past. Many factors have contributed to this trend, including work-hour restrictions, the availability of ancillary vascular access teams, and increasing documentation requirements placed on residents. The impact of any intervention to teach procedural skills will be significantly lessened without opportunities to practice skills in clinical situations. Multiple studies highlight the importance of a short time lapse between training and practice and a greater number of practice opportunities for retention of surgical skills.22,26,27 Given the risk of skill attrition, it is imperative that future curricula incorporate scheduled retraining. Pediatric residencies must provide an environment that requires trainees to practice procedural skills, and competence should be assessed repeatedly throughout training.
A critical aspect of curriculum evaluation is determining whether performance in simulated situations translates to performance in clinical encounters. Previous studies have measured transference of skills from the simulator to the clinical arena. Lynagh et al20 reported 11 studies showing significant improvements in clinical performance of surgical skills by subjects receiving simulator training. Other studies have addressed procedures more closely resembling those taught in our study. Velmahos et al28 demonstrated that surgical interns learning central venous catheter insertion through simulation performed the skill equally in simulated and live encounters, and were more successful than controls when performing the skill on patients. Wayne et al29 reported that internal medicine residents trained to perform advanced cardiac life support on high-fidelity simulators showed greater adherence to quality indicators when responding to cardiac arrests compared with residents trained by traditional methods. Our ability to determine if intervention group subjects outperformed controls in clinical scenarios was limited by the low number of reported procedures. There was a trend toward greater success by participants in the intervention group for each procedure, but none of the differences were significant.
Our study had several limitations. The results of a single-program study may not be generalizable to other training programs. Previous data were not available to guide curriculum development, so our modules, checklists, and knowledge examinations were necessarily created specifically for the study, and as such, they have not been validated internally or externally. If our results are to be confirmed, validation of these assessment tools is a crucial next step. The small sample size of a single intern class limited our power to detect significant differences between groups, and some subjects did not complete follow-up testing for at least 1 of the procedures. The demands of our residency program prohibited complete follow-up testing within a narrow time frame. We elected not to extend follow-up out of concern that skill levels might improve or decline significantly. Finally, our analysis of procedures performed on live patients was limited by the self-reported nature of the data. We specifically chose self-reporting instead of a more robust system using senior resident observers to preserve interns' educational relationship with senior residents.
Pediatric interns who received our simulation-based, modular procedural curriculum to teach BMV, venipuncture, PIV, and LP performed better on standardized assessments of procedural skills for PIV and LP, and likely were more skilled at venipuncture compared with control subjects who learned these skills via traditional educational methods. However, these differences did not persist 7 months after the curriculum was delivered, in part because of declining performance by intervention group members. Reinforcing training and creating opportunities to practice procedural skills in live clinical situations may be necessary to achieve long-lasting improvements. Additional research is needed to determine the optimal balance between simulated and live experiences. Assessment tools must be validated so that a standardized method exists to document the attainment and maintenance of competency. The pediatric Residency Review Committee must recognize the limitations of traditional educational methods, and should require formal training and assessment to ensure that graduating trainees are able to competently perform the skills necessary to care for children.
We all were involved in study design, data collection and analysis, and manuscript preparation. We acknowledge the support of the Association of Pediatric Program Directors, who funded this study through their Special Projects Research Grant, and the CHB Frederick Lovejoy Resident Research Fund. Financial supporters of the study were not involved in the design or conduct of the study; collection, management, analysis, or interpretation of the data; or preparation, review, or approval of the manuscript.
We thank Liana Kappus, Judith Harrington, RN, and Marisa Brett-Fleegler, MD (CHB), and Ilan Schwartz, MD (Boston Medical Center), for work in creating and administering the curriculum and the assessment tools. These contributors received no compensation for their work on the project. We also acknowledge Miren Creixell Plazas for essential work as a research assistant on the project. Finally, we thank the Boston Combined Residency Program intern class of 2005 for enthusiastic participation in the study and dedication to the advancement of residency training.
- Accepted January 21, 2009.
- Address correspondence to Michael G. Gaies, MD, MPH, C. S. Mott Children's Hospital, L1242 Women's, SPC 5204, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5204. E-mail:
This work was presented in part at the 2007 annual meeting of the Pediatric Academic Societies; May 5–8, 2007; Toronto, Ontario, Canada, where this project received the Ambulatory Pediatric Association's Ray E. Helfer Award for Innovation in Pediatric Education.
Financial Disclosure: The authors have indicated they have no financial relationships relevant to this article to disclose.
What's Known on This Subject:
To our knowledge, no validated curriculum exists for teaching basic procedures to pediatric interns.
What This Study Adds:
This study is an evaluation of a multimodular curriculum used to teach basic procedural skills to pediatric interns by using objective assessment tools and simulation.
- Sectish TC, Zalneraitis EL, Carraccio C, Behrman RE. The state of pediatric residency training: a period of transformation of graduate medical education. Pediatrics. 2004;114(3):832–841
- Accreditation Council for Graduate Medical Education. Outcome project, 2001. Available at: www.acgme.org/Outcome. Accessed October 24, 2004
- Association of American Medical Colleges. AAMC/1998 report I: learning objectives for medical student education guidelines for medical schools. Available at: www.aamc.org/publications/showfile.cfm?file=version87.pdf&prd_id=198&prv_id=239&pdf_id=87. Accessed October 24, 2004
- Wigton RS. See one, do one, teach one. Acad Med. 1992;67(11):742
- Accreditation Council for Graduate Medical Education. Residency Review Committee, section 2001. Available at: www.acgme.org/acWebsite/downloads/RRC_progReq/320_pediatrics_07012007.pdf. Accessed June 11, 2009
- Falck AJ, Escobedo MB, Baillargeon JG, Villard LG, Gunkel JH. Proficiency of pediatric residents in performing neonatal endotracheal intubation. Pediatrics. 2003;112(6 pt 1):1242–1247
- Gaies MG, Landrigan CP, Hafler JP, Sandora TJ. Assessing procedural skills training in pediatric residency programs. Pediatrics. 2007;120(4):715–722
- Quan L, Shugerman RP, Kunkel NC, Brownlee CJ. Evaluation of resuscitation skills in new residents before and after pediatric advanced life support course. Pediatrics. 2001;108(6). Available at: www.pediatrics.org/cgi/content/full/108/6/e110
- Schubert WH. Curriculum: Perspective, Paradigm, and Possibility. New York, NY: Macmillan Publishing Co; 1986
- Merriam SB, Caffarella RS. Key theories of learning. In: Learning in Adulthood. 2nd ed. New York, NY: Jossey-Bass; 1999:248–266
- Copyright © 2009 by the American Academy of Pediatrics