Improving Delivery of Bright Futures Preventive Services at the 9- and 24-Month Well Child Visit
OBJECTIVES: To determine if clinicians and staff from 21 diverse primary care practice settings could implement the recommendations of the 2008 Bright Futures Guidelines for Health Supervision of Infants, Children, and Adolescents, 3rd edition, at the 9- and 24-month preventive services visits.
METHODS: Twenty-two practice settings from 15 states were selected from 51 applicants to participate in the Preventive Services Improvement Project (PreSIP). Practices participated in a 9-month modified Breakthrough Series Collaborative from January to November 2011. Outcome measures reflected whether the 17 components of the Bright Futures recommendations were performed in at least 85% of 9- and 24-month visits. Additional measures identified which office systems were in place before and after the collaborative.
RESULTS: There was a statistically significant increase for all 17 measures. Overall, participating practices achieved an 85% completion rate for all preventive services measures except discussion of parental strengths, which was reported in 70% of the charts. The preventive services score, a summary score for all the chart audit measures, increased significantly for both the 9-month (7 measures) and 24-month visits (8 measures).
CONCLUSIONS: Clinicians and staff from various practice settings were able to implement the majority of the Bright Futures recommended preventive services at the 9- and 24-month visits at a high level after participation in a 9-month modified Breakthrough Series collaborative.
The Bright Futures Guidelines for Health Supervision of Infants, Children, and Adolescents, 3rd edition1 (Bright Futures), published in 2008, provides a uniform set of guidance to assist clinicians in pediatric preventive care, containing recommended preventive services for US infants, children, adolescents, and young adults from birth to age 21 years. These guidelines are recognized as “the evidence-informed preventive care and screenings provided for in the comprehensive guidelines supported by the Health Resources and Services Administration for purposes of coverage without co-pay” under Section 2713 of the Public Health Service (PHS) Act, as amended by the Affordable Care Act.2
Studies document that development and publication of guidelines alone do not translate into improved care.3 Practical tools and strategies have been identified as important components to assist clinicians in making change in clinical settings.4,5 Evidence shows that change requires focus on the system being affected, not just on the individual practitioner; this means that the entire team of providers and staff needs to be engaged.6,7 One major tool for system change has been the quality collaborative: a group of practices or hospitals meeting over a defined timeframe to effect a specific, targeted change in care.8–13
A natural extension of the collaborative model is an organized network of providers or institutions engaging its members in topic-specific collaboratives. In 2005, the American Academy of Pediatrics (AAP) developed the Quality Improvement Innovation Networks (QuIIN) to organize a platform of practices for testing “change packages” of new measures, guidelines of care, and innovative care delivery approaches before widespread dissemination. The AAP’s Bright Futures Initiative partnered with QuIIN to conduct the Preventive Services Improvement Project (PreSIP). A third project partner, the Academic Pediatric Association’s Continuity Research Network, contributed expertise and recruitment of 4 residency continuity clinics. The PreSIP’s aim was to assist practices in making office systems–based changes to implement the 15 screening and anticipatory guidance recommendations from Bright Futures Guidelines for Health Supervision of Infants, Children, and Adolescents, 3rd edition. The hypothesis was that practices would perform each service at least 85% of the time during 9- and 24-month preventive services visits as a result of PreSIP participation.
Team Selection and Characteristics
Pediatric primary care practices were recruited through the AAP QuIIN and the Academic Pediatric Association’s Continuity Research Network. Applicants were told that Maintenance of Certification (MOC) Part 4 credit had been applied for but was not assured. Applicant practices provided information on their location, size, practice type, practice setting, patient population, and experience with quality improvement (QI), and identified a 3-member physician-led core improvement team. Twenty-two pediatric primary care practices from 15 states were selected from 51 applicants to participate in a QI project focused on implementing Bright Futures in practice. Practices were selected to represent diversity in practice types, practice settings, and patient populations. In each selected practice, the lead core team physician and in some cases the whole practice had previous QI experience. Before the intervention, 1 practice declined participation owing to local institutional review board challenges; 2 months before the project’s end date, another practice withdrew owing to core team members’ change in employment locations. However, this team provided final system-level data at the time of withdrawal and is included in the system-level analysis. Table 1 summarizes practice characteristics for the 21 project teams.
An intervention was designed to support the 21 pediatric teams’ efforts to implement Bright Futures recommendations at the 9- and 24-month preventive services visits. Project faculty, staff, and practice participants discussed how to change current preventive services delivery by using a modified Breakthrough Series (BTS) collaborative model.14 The BTS Collaborative intervention model includes 3-day conferences (learning sessions), with “action periods” between sessions during which participants improve their practice settings by using Plan-Do-Study-Act cycles and self-measurement. Participants have monthly coaching phone calls with project staff to review data, discuss progress, and brainstorm solutions to challenges. This approach uses a 3-person practice team, which could include a physician, nurse practitioner, office nurse, practice support person, or business manager, that attends all learning sessions. Teams bring information back to their practices and lead practice participation in monthly data collection, run chart review, and all-practice phone calls. Teams work with the rest of the practice to plan, initiate, and study the practice-wide systems change efforts. PreSIP used a modified version of the BTS lasting 9 months, with 2 instead of 3 learning sessions. Learning session 1 was attended by teams from all 21 practices, and learning session 2 at month 10 was attended by teams from 20 practices. Teams exchanged ideas at the learning sessions, on monthly phone calls, and by E-mail and listserv with faculty and each other about improvement strategies. At the second session, teams discussed in detail the successful strategies used as well as challenges and opportunities to maintain gains and continue progress on goals not yet accomplished. This project received AAP Institutional Review Board approval and American Board of Pediatrics approval for 25 points toward Maintenance of Certification, Part 4.
Measurement was designed to assess progress toward the 85% or higher goal for recommended preventive services. A critical step was “translation” of the Bright Futures Guidelines and Bright Futures Tool and Resource Kit into components measurable through chart audit or office systems inventory. The project measured a combination of nationally endorsed measures and measures tested in and/or adapted from previous QI preventive services projects.8,11,12,15–19 Measurements included in the PreSIP are detailed in Table 2. One particularly challenging but critical measurement area was the “partnership with parents,” a unique and integral component of Bright Futures not reflected in adult practice guidelines. Two chart audit measures (asking about and addressing parental concerns and identification of parent strengths) and 1 office-based systems measure (shared decision-making) were used. To determine the pre- and post-preventive services score (PSS) for a practice, the number of recommended preventive services (7 for 9-month-old children and 8 for 24-month-old children) that each patient received was summed. These sums were then averaged to provide a practice PSS at baseline (time 1) and completion (time 2). The PSS for all practices was determined by calculating an average score from all PreSIP patients at time 1 and time 2. Over 9 months, the project measured practice teams’ care processes and tested improvement changes to health supervision care processes in 3 areas:
Newer Screening Recommendations: oral health risk assessment and developmental and autism screening
Additional Health Supervision Care: anticipatory guidance, age-appropriate risk assessment, assessment of parental strengths and eliciting parent concerns, weight for length, and BMI percentile based on age and gender
Office-Based Changes: recall/reminder system, referral tracking, identification of children who have special health care needs, linkages to community resources (organized list and someone to update the list), shared decision-making/motivational interviewing, use of a preventive prompting system, system to screen for maternal depression, and collection and use of family feedback
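The PSS computation described above (sum the services each patient received from the chart audit, then average across audited charts) can be sketched as follows. This is an illustrative sketch only; the service names and chart records are hypothetical placeholders, not the project’s actual data format.

```python
# Sketch of the preventive services score (PSS): sum the recommended
# services documented in each audited chart, then average across charts.
# Service names and chart records below are hypothetical.

def practice_pss(charts):
    """Mean number of recommended services received per audited chart."""
    if not charts:
        return 0.0
    totals = [sum(chart.values()) for chart in charts]  # services per chart
    return sum(totals) / len(charts)

# A hypothetical 3-chart audit for the 9-month visit (7 recommended services).
baseline_charts = [
    {"dev_screen": True, "oral_health": False, "anticipatory_guidance": True,
     "weight_for_length": True, "parent_concerns": True,
     "parent_strengths": False, "medical_risk": True},
    {"dev_screen": True, "oral_health": True, "anticipatory_guidance": True,
     "weight_for_length": True, "parent_concerns": True,
     "parent_strengths": False, "medical_risk": False},
    {"dev_screen": False, "oral_health": False, "anticipatory_guidance": True,
     "weight_for_length": True, "parent_concerns": True,
     "parent_strengths": False, "medical_risk": True},
]
print(practice_pss(baseline_charts))  # mean services per chart at baseline
```

The all-practice PSS at time 1 and time 2 would apply the same averaging over the pooled patients from every practice.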
Each practice was required to review 20 charts at baseline and completion for both the 9- and 24-month visits, and 10 charts monthly during the 9-month action period for both the 9- and 24-month visits. A χ2 test was used to evaluate any changes between patients and within the participating practices. A P < .05 was used to test for significance. Participants were queried about the existence of key office systems at baseline and post intervention. Maternal depression screening was measured as a system issue at month 2 and in the post-chart audit. One balancing measure, length of visit, was included.20 Data were also collected on the monthly reports and during the second learning session about both barriers and successful strategies.
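The pre/post chart-audit comparison described above reduces to a chi-square test on a 2 × 2 table (time period × whether the service was documented). A minimal sketch, using hypothetical counts rather than PreSIP data:

```python
# Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]], df = 1.
# The audit counts below are hypothetical, not PreSIP data.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic via the closed-form 2x2 formula."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# Hypothetical audit: 9 of 20 baseline charts documented a completed
# screen, vs 18 of 20 post-intervention charts.
chi2 = chi_square_2x2(9, 11, 18, 2)
print(round(chi2, 2))   # 9.23
print(chi2 > 3.841)     # True: exceeds the df = 1 critical value for P < .05
```

In practice a statistical package would also return the P value (and could apply a continuity correction); the closed-form statistic here is just to make the comparison concrete.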
With all patients included, there were statistically significant increases in all measures between pre- and post-intervention measurement (Table 3, Figs 1 and 2). The PSS increased significantly for both the 9- and 24-month visits (Table 3). Table 4 illustrates which systems were implemented at baseline and post intervention. Some of these systems support the improvement in chart audit data (eg, the preventive services prompting system to remind clinicians and office staff to do each screen, and systems to ensure that all positive screens were followed appropriately). Recall and reminder systems affect both immunizations and appropriate periodicity of visits. (These Healthcare Effectiveness Data and Information Set [HEDIS] measures were not included in the chart audit, as practices already have immunization audit information from health plan and public health audits, and improvements in visit periodicity are not well reflected in a monthly audit measure.) In addition, a system for maternal depression screening was reported by 8 practices at month 2, and 64% of 9-month charts post-intervention included a completed maternal depression screen. In Fig 3 A, B, C, and D, summary run charts demonstrate 3 different patterns. On some measures, practices:
started high and remained high (eg, anticipatory guidance, weight for length, eliciting parent concerns);
improved early in the collaborative and stayed high (eg, developmental screening); and
made incremental progress across the 10-month period (eg, autism screening, assessments of parental strengths, medical risk, and oral health risk). This may reflect a longer or more challenging startup or a preventive service only addressed later in the collaborative.
The PSS summary measure for each practice for 24-month-old children is presented in Fig 4, indicating a numeric increase in the number of provided preventive services for most practices. The barriers mentioned by practices on the monthly reports were categorized, and the number of times each barrier was mentioned by a practice in the monthly report is in parentheses in Table 5.
The major question is simple: can Bright Futures actually be done in a real-life busy practice? With coaching from faculty, staff, and each other, these practices were able to implement the majority of Bright Futures at the selected ages.
Lessons From PreSIP
During monthly calls, individual coaching calls, and the second learning session, participants were asked to share strategies and practice characteristics that facilitated their success or posed barriers. Several themes emerged that can inform future implementation efforts. The implementation of all recommended preventive services in 1 project was an advantage, giving practices an opportunity to make the practice-wide changes necessary to address all screening items rather than focusing on a single screening at a time, each requiring its own 9-month project. One of the most effective strategies for incorporating new screenings and risk assessment questions into the preventive services visit was a pre-visit questionnaire. Previous studies on the use of pre-visit questionnaires have demonstrated positive results, for example, helping to set the visit agenda,21 implementing screening in the waiting room,22 and improving parents’ acquisition of needed anticipatory guidance.23 Bright Futures pre-visit questionnaires24 captured medical/oral health risk screening information and parental concerns/questions. Formal parental screening tools for development, autism, and maternal depression were efficient. The practice managers, nurses, and support staff were critical in designing a pre-visit workflow that was specific for their particular office setting. The use of paper or electronic versions with home or office completion was determined by the practices based on their setting and patient population, proactively addressing literacy/language concerns. Arrival time was scheduled 15 minutes before the visit to allow for questionnaire completion. Questionnaire preparation, distribution, and collection were standardized, and questionnaire completion and scoring mechanisms were identified. Templates in electronic or paper format functioned as a prompt for practitioners and staff and a site to document completion of the appropriate screenings, risk assessment, and anticipatory guidance.
When Bright Futures was incorporated into an electronic health record, reported reliability of implementation increased dramatically. Practice size and information technology (IT) support affected implementation. Smaller or independent practices had fewer resources but could easily change or add templates, forms, and pre-visit questionnaires, especially with in-practice electronic health record (EHR) expertise. Large health care systems enjoyed IT and data expertise but could face layers of decision-making and permissions to effect change. Teaching clinics with many practitioners found standardized EHR templates important for consistent change implementation. Referral and follow-up systems for developmental/dental/community services and mental health were critical. Three examples of strategies 1 or more practices found useful were: (1) selection of a practice/clinic care coordinator to manage relations with community and organizational resources and update accessible resource lists for parents, (2) hosting a practice “mixer” with community resources for relationship-building, and (3) co-location of a behavioral and/or developmental specialist, if needed. Full-practice team buy-in was sometimes challenging. Practice-wide sessions reinforcing the impact of each person’s effort and contribution were important. Many practices used chart audit data to demonstrate gaps, progress, and improvements. Some compared their practice to other sites and assessed improved patient care, as well as possible or real fiscal impact. Most had a few people try a screening tool and testify that it was effective and doable. Practices with a larger number of providers, such as teaching continuity clinics, benefited from multiple reinforcement sessions and by involving resident providers in the improvement process.
These suggestions from PreSIP teams, the barriers list, and summary run charts can provide practices interested in implementing Bright Futures with ideas about areas they might want to address first, what problems they might anticipate, and practice-tested strategies to consider.
Special Considerations: Parental Strengths, Maternal Depression Screening, and Visit Time
The greatest implementation challenge was the strengths assessment and feedback to parents. Although this is the only chart audit measure not achieving the 85% benchmark, it showed the largest percentage increase of any measure. First learning session discussions revealed that practices had already been identifying and acknowledging parental strengths in an informal but inconsistent manner. This project gave intentional focus to this important aspect of family functioning. Entire practice team participation in identifying what parents did well helped formalize this component, by asking parents whom they have for support and what they like to do with their child, taking an interest in their lives, and/or recognizing the strength of extended family. Some practitioners felt this personal “non-mechanical” interaction enhanced relational access, dovetailing with shared decision-making if change was needed. An important consideration in QI interventions is inclusion of a balancing measure. The Institute for Healthcare Improvement (IHI) defines this as a measure that “looks at a system from different directions or dimensions to see what happened to the system as we improved the outcome.”20 This study measured visit time and found that comprehensive Bright Futures implementation did not result in a longer 2-year visit, but did show a slight increase (<3 minutes) for the 9-month visit. The large SD (12.33) indicated substantial visit time variation among practices. There was no easily identifiable association of visit time with practice type or results achieved. Further study would be needed to identify any correlates of increased visit time.
A small number of charts were audited with non-probability sampling, consistent with the rapid-cycle change aspect of the “model for improvement.” When applied in practice improvement, even this small number of charts allowed participants to quickly determine, using a plan-do-study-act approach, if their efforts that month had been effective. Participants were encouraged to test innovations for effectiveness and to make immediate modifications if needed. Our methods examined only preventive services visit screening and counseling, not those done at acute or other “non-well” visits, potentially underestimating total screening and counseling. With no comparison group, improvements over the 9-month study period may be attributable to factors other than this intervention; however, feedback received during the collaborative and at the second learning session indicated that changes were likely attributable to the practice efforts promoted by this intervention. Emphasis was placed on the importance of continued monitoring of progress and sustainability. As in most QI measurement studies, outcomes were measured based on practice self-audit, and practices were encouraged to use paper or electronic templates, which may have resulted in improved documentation of care. Independent chart audit, patient report of visit components, or recorded visits would have added more rigor to measurements,25 but these were outside the logistical and financial scope of this project. These limitations precluded direct parental assessment of the visit experience and utility of care. Parental experience is an important component of care and, along with impact on parent behaviors and children’s health, should be assessed in future studies. Finally, can this intervention be generalized to other practices nationally? Diverse practice settings were selected to generate the broadest set of strategies and tools that supported improvement in preventive services delivery. 
Although several approaches worked well for most settings, the limited number of practices did not permit specifying which strategies were most effective for each type of practice. Future research should address this issue so that such specific strategies can be coordinated with practice type for practices beginning implementation. Each PreSIP practice had at least 1 person with QI experience, and was presumably already motivated to improve preventive services. These results are likely to be generally reproducible, as practitioners are becoming increasingly familiar with QI approaches through participation in MOC Part 4. In the near future, MOC requirements could function as an incentive for measurable Bright Futures implementation. This QI project tested the feasibility of using a modified BTS methodology to increase provision of preventive services for children birth to age 3 years. The strategies and tools identified could be helpful to a wide variety of practice settings. Future studies should include parental input as well as a sufficient number of practices to allow correlating successful approaches with practice setting type.
We thank the pediatric practices that participated in this project:
All About Children Pediatric Partners PC (West Reading, PA), All Pediatrics (Lorton, VA), Atlantic Coast Pediatrics (Merritt Island, FL), Children’s Health Center at St Elizabeth’s Hospital (Appleton, WI), Children’s Hospital Primary Care Center (Boston, MA), CMC Myers Park Pediatrics (Charlotte, NC), Community Medicine Associates (San Antonio, TX), Comprehensive Pediatric Care (Williston, ND), Cook Children’s Physician Network (Hurst, TX), Danis Pediatrics (St Louis, MO), Fair Oaks Children’s Clinic (Redwood City, CA), Haverstraw Pediatrics (Haverstraw, NY), Hays Med Pediatrics Center (Hays, KS), Kressly Pediatrics PC (Warrington, PA), Lutheran Family Health Center – Sunset Park Pediatrics (Brooklyn, NY), Roxborough Pediatrics/ECHA (Philadelphia, PA), San Xavier Clinic (Tucson, AZ), Sandhills Pediatrics, Inc (Southern Pines, NC), Sixteenth Street Community Health Center (Milwaukee, WI), University of Iowa Department of Pediatrics (Iowa City, IA), and Wind River Service Unit (Ft Washakie, WY).
We thank Alison Baker, Jane Bassewitz, Edward Curry, Nui Dhepyasuwan, R.J. Gillespie, Chuck Norlin, Linda Radecki, Janet Serwint, Darcy Steinberg-Hastings, and Keri Thiessen for their contributions to the success of the project.
- Accepted June 19, 2014.
- Address correspondence to Paula Duncan, MD, 315 Lost Nation Rd, Essex Junction, VT 05452. E-mail:
Dr Duncan helped in all aspects of the design, practice recruitment and selection, learning session and follow-up call content, and recruitment of other learning session faculty. She reviewed data for improvement, wrote the first draft of the manuscript with Ms Pirretti, and did the 2 revisions with input from the other authors; Ms Pirretti had a leadership role in the project conceptualization, design, and implementation. She worked with Ms Healy to support the implementation, practice recruitment and selection, design and implementation of the learning sessions, and follow-up calls. Ms Pirretti performed the data analysis and worked with Dr Duncan to write and revise the manuscript; Dr Earls participated in the concept and design, helped design the details of the intervention and practice selection criteria, helped with the selection, participated as faculty for the learning sessions and follow-up monthly phone calls, reviewed data, suggested additional analyses, and reviewed and revised the manuscript; Dr Stratbucker helped design the details of the intervention and practice selection criteria, helped with the recruitment of continuity clinics and the selection of all participants, participated as faculty for the learning sessions and follow-up monthly phone calls, reviewed practice data, and critically reviewed the manuscript; Ms Healy was involved in project conceptualization and design, taking a leadership role with Ms Pirretti in the implementation of the practice recruitment and selection, and the design and implementation of the learning sessions and follow-up calls. She interacted with the practices individually at many points for data collection and improvement consultation.
She reviewed and revised the manuscript, making several significant additions for tables; Dr Shaw contributed to the conceptualization and design of the study, participated as faculty at the first learning session, and reviewed the manuscript with special contributions to the tables; Dr Kairys assisted with study planning, design, and implementation, as he provided direction of the Improvement Network for Practice Improvement. He also critically reviewed the manuscript with important contributions to several major revisions and helped edit the manuscript; and all authors approved the final manuscript as submitted.
FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.
FUNDING: Funded in part by the Health Resources and Services Administration, Maternal and Child Health Bureau, under a cooperative agreement to the American Academy of Pediatrics (#U04MC07853), and by the Friends of Children, a Charitable Fund of the American Academy of Pediatrics.
POTENTIAL CONFLICT OF INTEREST: Drs Duncan and Shaw are coeditors of the Bright Futures Guidelines, 3rd edition. The other authors have indicated they have no potential conflicts of interest to disclose.
- 1.↵Hagan JF, Shaw JS, Duncan PM, eds. Bright Futures: Guidelines for Health Supervision of Infants, Children, and Adolescents. 3rd ed. Elk Grove Village, IL: American Academy of Pediatrics; 2008
- 2.↵Patient Protection and Affordable Care Act, Pub. L. No. 111-148, §2702, 124 Stat. 119, 318-319 (2010)
- Institute of Medicine
- Bodenheimer T, Grumbach K
- Nutting PA, Crabtree BF, Miller WL, Stange KC, Stewart E, Jaén C
- Young PC, Glade GB, Stoddard GJ, Norlin C
- Lannon CM, Flower K, Duncan P, Moore KS, Stuart J, Bassewitz J
- 12.↵King TM, Tandon SD, Macias MM, et al. Implementing developmental screening and referrals: lessons learned from a national project. Pediatrics. 2010;125(2):350–360
- 13.↵Duncan P, Frankowski B, Carey P, et al. Improvement in adolescent screening and counseling rates for risk behaviors and developmental tasks. Pediatrics. 2012;130(5). Available at: www.pediatrics.org/cgi/content/full/130/5/e1345
- 14.↵Institute for Healthcare Improvement. The Breakthrough Series: IHI’s Collaborative Model for Achieving Breakthrough Improvement. IHI Innovation Series white paper. Boston, MA: Institute for Healthcare Improvement; 2003
- 15.↵Bethell C, Peck C, Abrams M, Halfon N, Sareen H, Scott-Collins K. Partnering with parents to promote the healthy development of young children enrolled in Medicaid. Commonwealth Fund Report 2002. Available at: www.commonwealthfund.org/publications/fund-reports/2002/sep/partnering-with-parents-to-promote-the-healthy-development-of-young-children-enrolled-in-medicaid. Accessed January 3, 2014
- Bethell C, Reuland CH, Halfon N, Schor EL
- 17.National Committee for Quality Assurance. Weight Assessment and Counseling for Nutrition and Physical Activity for Children/Adolescents (WCC). NCQA website. Available at: www.ncqa.org/portals/0/Weight%20Assessment%20and%20Counseling.pdf. Accessed October 1, 2014
- 18.Abatemarco DJ, Gubernick R. Practicing safety: a child abuse and neglect prevention improvement project. Final Data Analysis Report. August 2010. Unpublished data. Available at: www.aap.org/en-us/professional-resources/practice-support/quality-improvement/Quality-Improvement-Innovation-Networks/Pages/Practicing-Safety-A-Child-Abuse-and-Neglect-Prevention-Improvement-Project.aspx. Accessed January 3, 2014
- 19.↵O’Connor K, Boulter S, Keels M, Krol DM, Lewis C, Mouradian WE. Oral health screening among pediatricians: a national survey. Presented at the June 2009 Academy Health Child Health Annual Meeting. Available at: http://www.aap.org/en-us/professional-resources/Research/Documents/ps70pas09_OralHealth.pdf. Accessed October 1, 2014
- 20.↵Institute for Healthcare Improvement. How to improve. Available at: www.ihi.org/resources/Pages/HowtoImprove/ScienceofImprovementEstablishingMeasures.aspx. Accessed October 1, 2014
- 23.↵Bethell C. Patient-centered quality improvement of well child care. Available at: www.mchlibrary.info/MCHBfinalreports/docs/R40MCO8959.pdf. Accessed October 1, 2014
- Duncan P, Shaw J, Gottesman MM, Swanson J, Hagan J, Pirretti J
- Copyright © 2015 by the American Academy of Pediatrics