Increasing the Screening and Counseling of Adolescents for Risky Health Behaviors: A Primary Care Intervention
Objective. To determine whether a systems intervention for primary care providers resulted in increased preventive screening and counseling of adolescent patients, compared with the usual standard of care.
Methods. The intervention was conducted in 2 outpatient pediatric clinics; 2 other pediatric clinics in the same health maintenance organization served as comparison sites. The intervention was implemented in 2 phases: first, pediatric primary care providers attended a training workshop (N = 37) to increase screening and counseling of adolescents in the areas of tobacco, alcohol, drugs, sexual behavior, and safety (seatbelt and helmet use). Second, screening and charting tools were integrated into the intervention clinics. Providers in the comparison sites (N = 39) continued to provide the usual standard of care to their adolescent patients. Adolescent reports were used to assess changes in provider behavior. After a well visit, 13- to 17-year-olds (N = 2628) completed surveys reporting on whether their provider screened and counseled them for risky behavior.
Results. Screening and counseling rates increased significantly in each of the 6 areas in the intervention sites, compared with rates of delivery using the usual standard of care. Across the 6 areas combined, the average screening rate increased from 58% to 83%; counseling rates increased from 52% to 78%. There were no significant increases in the comparison sites during the same period. The training component seems to account for most of this increase, with the tools sustaining the effects of the training.
Conclusions. The study offers strong support for an intervention to increase clinicians' delivery of preventive services to a wide age range of adolescent patients.
The majority of adolescent morbidity and mortality is preventable and associated with behaviors such as substance use and abuse, unsafe sexual practices, and risky vehicle use.1 Accidents and unintentional injuries account for the greatest number of adolescent deaths and often involve use of alcohol.2–4 Sexually transmitted diseases are the most common infectious diseases among adolescents; among older adolescents, pregnancy and childbirth are the leading causes of hospitalization.5
Current trends in adolescent morbidity and mortality have drawn greater attention to the preventive role of the health care system. The majority of adolescents visit a health care provider once a year,6 providing an ideal opportunity to integrate prevention into clinical encounters. To facilitate screening and counseling for risky health behaviors, guidelines specifically targeting the delivery of adolescent clinical preventive services have been developed.7–13 However, despite the existence of guidelines, rates of screening and counseling consistently are lower than recommended.12,14,15 Although primary care providers often screen adolescents for some risky behavior,15,16 there is inconsistency in screening across various risk areas.15
Barriers to guideline implementation include physician knowledge, physician attitudes, and external factors.17,18 Physician factors, such as knowledge and attitudes, may be linked to training. For example, almost half (45%) of primary care clinicians who see adolescent patients cited insufficient training as the most significant barrier to the delivery of health care to adolescents.19 Yet even with adequate knowledge and positive attitudes, external barriers, such as a lack of tools or reminder systems, can limit a provider's ability to follow recommendations. Recent reviews cite the lack of appropriate screening tools as a major barrier to the delivery of preventive services.18 When tools do exist, they often are too lengthy to be feasible for use in the context of a primary care visit, or, if shorter, they tend to focus on only a single risk behavior.20 Thus, both clinician and external/system barriers need to be overcome to implement guidelines effectively.
The manner in which guidelines are incorporated into a system also influences the delivery of services. Effective components include educational outreach, feedback and reminders, interactive educational/clinical workshops, involving local opinion leaders, and reaching local consensus.21 In addition, combining 2 or more modalities (eg, combination of training, feedback, and resources at the systems level such as tools) leads to greater success in improving professional practice.22–25
Evidence of the feasibility of systemwide implementation of adolescent preventive guidelines is limited, and experimental research evaluating the process of implementation is scarce. In 1 study of the implementation of the Guidelines for Adolescent Preventive Services (GAPS) in community health centers, Klein et al26 found that when health center staff were trained to implement GAPS and were provided patient questionnaires, resource materials, and clinician manuals, adolescents received more screening and counseling. However, even after implementation of GAPS, screening rates across the various areas ranged from a low of 21% to a high of 48%. In the only study to implement and evaluate the delivery of adolescent preventive services in managed care, we found that the provision of provider training, customized screening and charting tools, and preventive services support staff (health educator) resulted in very high rates of screening and counseling of adolescents by primary care providers.27 Although provider training resulted in significant increases in screening and counseling rates,28 it was the subsequent addition of screening and charting forms, as well as the resources of a health educator, that resulted in markedly higher rates of preventive services delivery.27
This research indicates that interventions that provide the major components of (1) training and (2) screening and charting tools may be particularly effective in increasing delivery of services to adolescents. However, a limitation of our previous research27 is that it is not possible to separate the effects of tools from the effect of adding a health educator to help facilitate the process because both components were implemented at the same time. It therefore is not clear how great an increase in provider delivery of services can be expected by introducing training and tools into the health care system. Furthermore, no study of the implementation of adolescent preventive services has included comparison sites in the design. This is an essential next step in determining the most effective way to provide adolescent preventive services. The primary objective of this study was to determine the percentage increase in provider delivery of adolescent preventive services that will result from a systems intervention involving provider training and the utilization of screening and charting tools.
We used the Precede/Proceed Model as a framework to assist in the development of this training and tools systems intervention.29,30 The model posits that predisposing factors, enabling factors, and reinforcing factors influence clinician behavior in preventive care. Predisposing factors relate to the necessary attitudes and motivation to perform a behavior; enabling factors include the competence, skills, and resources necessary to perform the behavior; and reinforcing factors are those that support or reward the behavior.29,31
Training clinicians in the delivery of preventive services addresses predisposing variables such as knowledge, attitudes, and self-efficacy to deliver preventive services; enabling factors such as skill in communicating with adolescents; and reinforcing factors such as feedback and support from colleagues. Implementing screening and charting tools should also address predisposing, enabling, and reinforcing factors that impede or enhance delivery of preventive services. For example, tools address predisposing factors such as providers' competence to screen and counsel by facilitating their ability to deliver services efficiently. Tools also enable providers to screen and counsel adolescents by providing prompts, cues, and charting forms. The need to document screening and counseling provides a form of accountability and monitoring that serves as a reinforcing factor.
The goal of this intervention was to increase clinicians' screening and brief counseling of adolescents in the targeted health risk areas of tobacco, alcohol and drug use, sexual behavior, seatbelt use, and helmet use. Our primary hypothesis was:
A systems-level intervention, consisting of training and tools to facilitate the delivery of adolescent clinical preventive services, will result in significantly higher rates of provider delivery of services in 6 targeted risk areas, compared with rates of delivery using the usual standard of care.
A secondary set of hypotheses to test the additive effects of the 2 components of the systems-level intervention was:
Training alone (component 1 of systems-level intervention) will result in significantly higher rates of provider delivery of services in 6 targeted risk areas, compared with rates of delivery using the usual standard of care.
The provision of tools (component 2 of systems-level intervention) will result in significantly higher rates of provider delivery of services in 6 targeted risk areas, compared both with the results of training alone and with rates of delivery using the usual standard of care.
The study was conducted in 4 outpatient pediatric clinics within a large health maintenance organization (HMO) in Northern California; 2 clinics served as intervention sites, and 2 served as comparison sites. The clinic-wide systems intervention focused on increasing the delivery of preventive services during routine well visits. In the intervention sites, 2 components were implemented in 2 separate phases: the first phase consisted of training for primary care providers in the delivery of preventive services; and the second phase consisted of the integration of screening and charting forms into the clinics. The comparison sites continued to deliver the usual standard of care. There were 3 separate assessment time periods in the clinics: (1) a pretraining baseline period (T0), (2) a posttraining period (T1), and (3) a posttools full implementation period (T2). Data were collected at both the intervention and the comparison sites throughout the 3 periods.
Providers' screening and counseling behaviors during adolescent well visits, as reported by adolescents who attended the visits, served as the basis for the evaluation of the intervention. These independent adolescent reports of provider behavior were obtained immediately after well visits in both the intervention and the comparison clinics during the 3 assessment periods. The health behaviors targeted in the intervention included tobacco use, alcohol use, drug use, sexual behavior, and safety (seatbelt and helmet use). All procedures were approved by the institutional review boards at the University of California, San Francisco, and at the participating HMO.
Clinics were selected on the basis of their provision of care to large numbers of adolescents and their agreement to participate in a study of clinical preventive services to adolescents. Sites were assigned to the intervention or comparison group on the basis of factors such as clinic size and location, balance in the ethnicity of the adolescents served, and previous use of adolescent screening tools. Providers were eligible to participate in the study when they saw adolescents for well visits in the clinic, and all eligible providers agreed to participate in the study.
The original provider sample consisted of 86 participants, 42 from the 2 intervention clinics and 44 from the 2 comparison clinics. Ten providers (5 from the intervention and 5 from the comparison clinics) were excluded from the present analyses because we lacked sufficient evaluation data (at least 2 adolescent well visits per phase). Thus, the present sample consisted of 37 providers in the intervention group and 39 in the comparison group, for a total of 76 providers. Providers in the intervention group were similar to those in the comparison group in terms of gender, age, and proportion of nurse practitioners to physicians. In the provider intervention sample, 62.2% were female, the mean age was 41.2 years, and 4 were nurse practitioners. In the comparison sample, 64.1% were female, the mean age was 44.1 years, and 4 were nurse practitioners. The majority of providers were either white or Asian. There were significantly more black providers in the intervention group than in the comparison group. There were no other significant differences in provider characteristics between the intervention and comparison groups (Table 1).
Preventive Services Intervention
The preventive services intervention was composed of clinician trainings in adolescent preventive services and the later implementation of screening and charting forms customized for this study (see below for description). The intervention focused on the targeted risk areas of tobacco, alcohol, drugs, sexual behavior, and safety (helmet and seatbelt use). All clinicians participated in the training, and the tools were implemented on a clinic-wide basis. Thus, we expected that all adolescents who were 13 to 17 years of age and attended well visits would receive screening and counseling in the targeted risk areas. Adolescents met with their primary care provider for a well visit that lasted 20 to 30 minutes. The amount of time allocated for a well visit was consistent with the usual standard of care in the health care system.
At each intervention clinic, either the chief or assistant chief of pediatrics served as a physician “champion” who promoted the study and served as the primary contact with the study investigators. Clinicians and administrative staff also participated in committees that collaborated on the methods of data collection and the plans for implementation of the clinical tools into the clinics.
Component 1: Training
The training workshops for providers focused on increasing clinicians' knowledge, attitudes, self-efficacy, and skills to conduct preventive services and were based on social cognitive theory.32,33 This training model had been developed and evaluated previously and had been shown to be effective.28 Minor modifications were implemented to address specific needs in the intervention clinics. The provider training was conducted by an expert panel of adolescent medicine specialists from the University of California, San Francisco, and the participating HMO, with consultation from our clinician committees in the intervention clinics. In addition, we used actors from an educational theater program within the HMO to portray adolescent patients in the demonstration and interactive practice role plays.
The 8-hour workshop focused on adolescent health, confidentiality, screening, and conducting a brief office-based intervention that included anticipatory guidance/brief counseling for the 6 risk behaviors. As suggested by the review of effective interventions for health professionals,24 the workshops contained 4 components: (1) didactic presentations, (2) discussion, (3) demonstration role plays, and (4) interactive role plays. The didactic component included presentations of the following: adolescent health and risk behavior statistics, adolescent development, the role of primary care providers in health and risk prevention, interviewing adolescents, confidentiality, screening and brief counseling, and prioritizing in clinical visits. The second component included discussion and question-and-answer sessions based on the presentations. The third component included demonstration role plays conducted by the expert panel with the theater actors playing adolescents; and in the fourth component, providers had the opportunity to practice screening and counseling, using the theater actors. All nonprovider staff attended a 1-hour lunchtime training that focused on general topics of adolescent development and adolescent priorities in the health care setting.
Component 2: Adolescent Health Screening Questionnaire and Charting Forms Implementation
Before this study, the Regional Health Education Department of the HMO had developed 2 forms for use in adolescent well visits: (1) an adolescent health screening questionnaire and (2) a provider charting form. On the basis of our earlier work,27 we modified these region forms for use in this study. These modifications, intended to facilitate the delivery of preventive services in the targeted risk areas, were made in collaboration with Regional Health Education, with feedback from the clinicians at the intervention sites.
Before this study, only 1 of the study intervention sites was using the region's screening form. Furthermore, the site was not using the screening form consistently with all teenagers before well visits. The process of implementing tools was enhanced in several ways at the intervention clinics. First, we collaborated with clinicians and staff in both intervention sites to develop a system for distributing the modified adolescent screening questionnaires. Second, we assisted administrators and clinicians in establishing a process for providing an area where adolescents could complete the form confidentially and for retrieving the completed forms. Third, we conducted 1-hour lunch meetings with clinicians and staff to introduce the forms and explain the implementation procedures.
Adolescent Screening Questionnaire
The region's original adolescent health screening form included questions about risk engagement in a broad variety of areas. We modified the form to include the following: (1) addition of follow-up questions in the target areas of sexual behavior, tobacco use, and alcohol and drug use; (2) addition of a screening question regarding helmet use; and (3) addition of prompts and cues in the target areas reminding providers to screen and to deliver brief counseling messages. Providers were cued to give adolescents positive reinforcement if they reported that they were engaging in healthy behavior. Examples of healthy behavior include not using tobacco or always using a bicycle helmet when riding. Providers were cued to express concern to adolescents if they reported engaging in risky behavior, such as using tobacco or alcohol.
Provider Charting Form
Both intervention clinics were already using the charting form previously developed by the HMO region, and modifications made for the study to the region's charting form were minor. They included placing an asterisk next to the behavior areas that were targeted in the intervention. The asterisks referred to a boxed area at the bottom of the page, reminding clinicians to screen and counsel in those areas.
Evaluation of the Preventive Services Intervention
Adolescents completed the Adolescent Report of the Visit (AROV), an independent survey of provider behavior, immediately after well visits (see below). The survey was distributed by trained research or clinic staff, who approached adolescents as they left examination rooms and asked them to complete the assessment. To have a representative array of adolescents per provider, we obtained reports from as many adolescents as possible across all providers. We focused on collecting data from a wide age range of adolescent patients (13–17 years of age) and from both male and female patients. (In cases in which we did not achieve broad representation for a provider, it tended to be female providers who saw very few adolescent male patients, or vice versa.) Data collection took place in 3 of the clinics on a daily basis and in the fourth clinic 2 to 3 days per week. This was necessary because of staffing limitations and physical location of the clinic. Parent consent and adolescent assent were not required because adolescents completed the AROV anonymously. On the basis of clinic schedules of completed visits in the intervention sites, we estimate that 75% of the adolescents who were asked to complete questionnaires agreed to do so. The adolescents were similar in age and gender in the intervention and comparison samples. See Table 2 for detailed description of the adolescent-reporter demographic characteristics.
Adolescents completed the AROV during each of the 3 evaluation periods in both the intervention and comparison clinics. Each of the periods lasted ∼4 months: pretraining (T0) data collection included 226 intervention and 246 comparison adolescents; posttraining (T1) data collection began immediately after the training and included 551 intervention adolescents and 260 comparison adolescents. The T2 data collection began immediately after the implementation of tools into the clinics and included 940 intervention and 405 comparison adolescents. (The adolescent sample from the comparison sites was smaller than the sample from the intervention sites for 2 reasons: (1) In the intervention sites, the reception and medical assistant staff were involved with implementing the intervention. They were interested in the research and made an effort to help our research staff track prospective adolescent reporters as they left exam rooms. The adolescents then were approached and asked to complete the AROV. (2) In 1 of the comparison sites, our research assistant was not present on a full-time basis, thus limiting data collection opportunities. We adjusted for these differences in sample size as needed in the analyses.)
There were larger numbers of adolescents during T2 implementation because of the seasonal increase in well visits during the summer months. (We examined correlations among the numbers of adolescents seen per provider at each phase and found that providers who saw a higher number of patients at T0 tended to see a higher number during T1 and T2 implementation. Thus, the potential bias as a result of uneven numbers of adolescents seen across phases in terms of provider characteristics was minimized.) Across all phases, the full intervention sample size was 1717; the full comparison sample size was 911, for a total of 2628 adolescents reporting on their providers' behaviors.
Assessment of Clinician Practices
The AROV is a 45-item patient-report measure that includes questions about whether clinicians screen and offer brief counseling messages for each of the 6 target risk areas. Adolescent-based assessments of provider behavior yield an appraisal of clinician practices that is free of the confounding influences of provider self-report and social desirability biases and have been shown to be a valid indicator of delivery of services.26 The AROV has been used previously and possesses adequate construct validity.27,28 An example of a screening question is, “Did your doctor ask if you smoke or chew tobacco?” Items that assessed counseling differed, by skip patterns, depending on whether an adolescent was engaging in a particular risk behavior and whether she or he had informed the clinician about engagement in the risk behavior. An example of a counseling question for adolescents who were not engaging is, “Did your doctor encourage you to remain a nonsmoker or nontobacco user?” An example of a counseling question for adolescents who were engaging in a risky behavior is, “Did your doctor express concern that you use tobacco?” Similar screening and counseling questions were asked for each of the 6 risk areas. The response categories were dichotomous: yes or no. On average, adolescents completed the measure in 3 to 5 minutes.
Each adolescent questionnaire identified the provider who conducted the visit. A provider's score for each screening and counseling area was obtained by taking the average of the individual items for that area, summed across all of the adolescent questionnaires available for that provider. The resulting score for each item (eg, screening for tobacco use) represented the percentage of the time a provider performed screening or counseling in that area. For example, if a provider saw 4 adolescents during the T0 period in the study and 2 of them reported that they were asked whether they used tobacco and 2 reported that they were not asked, then that provider's score for screening in tobacco use for T0 would be 0.50. Thus, each provider had a mean score representing his or her screening rate across adolescent reports for each behavior area and each time period.
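The scoring arithmetic described above can be sketched as follows. This is a minimal illustration, not the authors' code; the function name is ours, and the data values reproduce the tobacco-screening example in the text.

```python
# Each adolescent report is a dichotomous (1 = yes, 0 = no) answer per
# screening or counseling item; a provider's score for an area in a
# period is the mean across his or her adolescent reports.
from statistics import mean

def provider_rate(reports):
    """Mean of dichotomous adolescent reports (1 = yes, 0 = no)."""
    return mean(reports)

# Example from the text: 4 adolescents seen during T0, 2 of whom
# reported being asked whether they used tobacco.
tobacco_screen_t0 = [1, 1, 0, 0]
print(provider_rate(tobacco_screen_t0))  # prints 0.5
```

The same averaging is applied per provider, per behavior area, and per time period, yielding the rates analyzed in the sections that follow.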
The primary focus of the evaluation was to assess providers' rates of screening and counseling in the intervention group, in relation to providers' rates of screening and counseling in the comparison group. The 6 targeted areas of screening and counseling were tobacco, alcohol, drugs, sexual behavior, and seatbelt and helmet use. The unit of analysis was the provider. Rates of screening and counseling for each provider were obtained on the basis of reports from their adolescent patients. Each behavioral risk area was evaluated individually for screening and counseling rates (eg, screening for tobacco use, counseling for tobacco use).
First, we present descriptive statistics for the adolescent reports that make up the provider screening and counseling means. Second, we present rates of clinic-wide tools implementation. Third, we present provider screening and counseling means for each of the 3 phases separately for the intervention and comparison groups. Fourth, we present analysis of covariance (ANCOVA) to compare screening and counseling rates between the intervention and comparison groups.
Descriptive Statistics: Number of Adolescent Reporters Seen for Well Visits
The mean number of adolescent reporters per provider during T0 was 6.6 (SD: 3.8) for the intervention group and 6.6 (SD: 4.2) for the comparison group. During T1, the mean number of adolescent reporters per provider was 14.9 (SD: 6.0) for the intervention group and 6.7 (SD: 3.9) for the comparison group. During T2, the mean number of adolescents seen per provider was 25.4 (SD: 10.6) for the intervention and 10.4 (SD: 5.6) for the comparison groups. The number of adolescents was uneven across the time periods and between the intervention and comparison groups. To evaluate potential bias, we examined correlations between number of adolescent reporters per provider during each period and screening and counseling rates. This allowed us to determine whether providers who conducted a greater number of well visits tended to screen or counsel at higher or lower rates during those visits. In >80% of the screening and counseling behaviors, the number of adolescent reporters was not significantly correlated with screening or counseling rates. In the cases in which the number of adolescent reporters was significantly correlated with screening or counseling rates, the number of adolescent reporters per provider was included as a covariate in the ANCOVA for that behavior. (The number of adolescent reporters per provider was positively correlated with counseling for drug use at T2 and with helmet screening and counseling at both T1 and T2.)
Assessment of Tools Implementation
During T2, we monitored the integration of the study screening form into the intervention clinics through questions on the AROV. Ninety-seven percent of the adolescents reported that they received the Health Screening Questionnaire, 80% reported that they had time to complete the questionnaire, and 89% reported that they were able to fill it out privately.
Descriptive Statistics: Provider Screening and Counseling Rates in the Intervention Group
Table 3 presents the screening and counseling rates for the intervention group at each of the 3 phases: T0, T1, and T2. Analyses of change in the rates are presented in the ANCOVA sections that follow the descriptive statistics. At T0 in the intervention group, providers' average screening rates ranged from 42% for helmet use to 71% for tobacco use (Table 3). During this phase, the screening rates for seatbelt and helmet use were considerably lower than screening rates for substance use or sexual behavior. At T1, screening rates ranged from 70% for helmet use to 85% for tobacco use. At T2, screening rates remained relatively stable at posttraining (T1) levels.
Counseling rates across the 3 phases in the intervention group followed a similar pattern. At T0, average counseling rates ranged from 39% for helmet counseling to 65% for tobacco counseling. At T1, counseling rates ranged from 71% for helmet use to 83% for tobacco use. At T2, counseling rates remained relatively stable at posttraining (T1) levels.
Descriptive Statistics: Provider Screening and Counseling Rates in the Comparison Group
In the comparison group, providers delivered the usual standard of care during each of the 3 phases. T0 screening rates in the comparison sites ranged from 30% for helmet use to 65% for tobacco use (Table 3). Screening rates in the comparison group tended to remain stable at T1 and T2, with the lowest rate of screening in the area of helmet use and the highest rate in the area of tobacco use. Counseling rates followed a similar pattern to those for screening during each phase.
ANCOVA to Test Hypotheses
We conducted ANCOVA to test the differences in screening and counseling rates between the intervention and comparison groups. We tested a separate model for each screening and for each counseling behavior (eg, screening for tobacco; counseling for tobacco). We took into account several sources of potential bias in each analysis. To control for differences in baseline levels of screening and counseling, we entered the baseline level of screening and counseling as a covariate in each model. To control for the potential influence of unequal numbers of adolescent reporters per provider, we entered the number of adolescent reporters into the model when it was significantly correlated with the level of screening or counseling in a specific time period. To control for individual provider and adolescent characteristics, we included as covariates the age, gender, and ethnicity of provider or adolescent when they were correlated with screening or counseling levels.
We also examined the possibility that provider screening and counseling rates within an individual site could account for effects (eg, that one intervention site might be intrinsically different from the other intervention site). Controlling for the baseline values, we conducted post hoc analyses using Tukey b to test for the influence of site on the screening and counseling outcomes. We used this method, rather than hierarchical linear modeling, to examine the effects of clinic site, as the present study included only 2 sites in the intervention group and 2 sites in the comparison group. Therefore, multilevel analyses were not feasible.
Hypothesis 1: ANCOVA Results Testing the Full Intervention of Training and Tools: Intervention Versus Comparison Groups
Our primary hypothesis was that, after implementation of the full intervention, consisting of both provider training and tools, provider screening and counseling rates would be significantly higher in our intervention group than in our comparison group. We conducted ANCOVA to test the differences in screening and counseling rates between the intervention and comparison groups.
ANCOVA results indicated that screening rates after the implementation of the full intervention (training plus tools) were significantly higher for each of the 6 target areas in the intervention group than in the comparison group, controlling for baseline levels of screening and other covariates. Fs ranged from 17.11 for drug screening to 52.21 for helmet screening (all P < .001; Table 3). The effect sizes (η2) ranged from 0.19 for drug screening to 0.43 for helmet screening, all considered large effects.
ANCOVA results indicated that counseling rates after the implementation of the full intervention (training plus tools) were significantly higher for each of the 6 target areas, compared with the comparison group, controlling for baseline counseling levels and other covariates. Fs ranged from 23.08 (df 1,75) for tobacco counseling to 58.68 (df 1,75) for helmet counseling (all P < .001; Table 3). The effect sizes (η2) were large, ranging from 0.24 for tobacco counseling to 0.46 for helmet counseling.
Hypothesis 2a: ANCOVA Results Testing the Effect of Training Alone: Intervention Versus Comparison Groups
We hypothesized that training alone (component 1 of the systems-level intervention) would result in significantly higher rates of provider delivery of services in the 6 targeted risk areas, compared with rates of delivery using the usual standard of care.
ANCOVA results indicated that rates of screening during T1 were significantly higher in the intervention group than in the comparison group in each of the 6 areas, after taking into account baseline and other covariates. Fs ranged from 8.26 (df 1,75; P < .01) for alcohol screening to 25.56 (df 1,75; P < .001) for helmet screening (Table 3). The η2 ranged from 0.11 (medium effect size) for alcohol screening to 0.27 (large effect size) for helmet screening.
ANCOVA results indicated that, after the training, counseling rates were significantly higher in each of the 6 target areas in the intervention group than in the comparison group, controlling for baseline counseling levels and other covariates. Fs ranged from 7.01 (df 1,74; P < .01) for tobacco counseling to 31.62 (df 1,75; P < .001) for helmet counseling (Table 3). The η2 ranged from 0.09 (medium effect size) for tobacco counseling to 0.31 (large effect size) for helmet counseling.
Hypothesis 2b: ANCOVA Results Testing the Effect of the Addition of Tools: Intervention Versus Comparison Groups
We hypothesized that the addition of the tools component of the intervention would result in additional increases in screening and counseling rates in the intervention group, relative to the comparison group. We conducted repeated-measures ANCOVA to test the effect of adding the tools component in the intervention group. To determine whether screening and counseling rates increased significantly in the intervention group after the addition of the tools component, we examined the interaction between group (intervention vs comparison) and the repeated measure (screening and counseling rates at T1 and T2). There were no significant interactions in the ANCOVAs, indicating that screening and counseling rates did not increase significantly from T1 to T2 in the intervention group, relative to the comparison group.
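Because the design has only 2 repeated occasions (T1 and T2), the group × time interaction tested here is equivalent to comparing T1-to-T2 change scores between groups. The following is a minimal sketch of that equivalence using simulated, illustrative data (not the study's data); the null pattern built into the simulation mirrors the null interaction reported above:

```python
import numpy as np

def interaction_via_change_scores(t1, t2, group):
    """For a 2-occasion repeated-measures design, the group x time
    interaction reduces to a two-sample comparison of change scores.
    Returns the pooled-variance t statistic and its df."""
    change = t2 - t1
    g1, g0 = change[group == 1], change[group == 0]
    n1, n0 = len(g1), len(g0)
    sp2 = ((n1 - 1) * g1.var(ddof=1) + (n0 - 1) * g0.var(ddof=1)) / (n1 + n0 - 2)
    t = (g1.mean() - g0.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n0))
    return t, n1 + n0 - 2

# Simulated screening rates: neither group gains from T1 to T2,
# so the group x time interaction should be null
rng = np.random.default_rng(1)
group = np.repeat([1, 0], 38)
t1 = np.clip(rng.normal([0.82] * 38 + [0.58] * 38, 0.08), 0, 1)
t2 = np.clip(t1 + rng.normal(0.0, 0.05, 76), 0, 1)

t, df = interaction_via_change_scores(t1, t2, group)
print(f"t({df}) = {t:.2f}")  # a real interaction would show as a large |t|
```

A baseline covariate can be added exactly as in the ANCOVA, by regressing the change scores on baseline plus the group indicator; the logic of the interaction test is unchanged.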
Summary of ANCOVA Results
The results of the ANCOVAs demonstrated that (1) screening and counseling rates were significantly higher in the intervention group than in the comparison group after the full implementation of the intervention (T2); (2) screening and counseling rates were significantly higher in the intervention group than in the comparison group after the training component alone (T1); and (3) screening and counseling rates did not increase significantly in the intervention group, in relation to the comparison group, after the addition of the tools component. Our analyses of effect of site indicated that provider screening and counseling rates within the 2 intervention sites and within the comparison sites were not significantly different from one another in the majority of the analyses. Thus, provider screening and counseling rates within an individual site did not account for the significant effects of the intervention.
This study evaluated an intervention to increase clinicians' delivery of preventive services to a wide age range of adolescent patients in a group-model HMO. The findings support the general hypothesis that a systems intervention, consisting of training and tools, resulted in higher rates of provider delivery of services in 6 targeted risk areas, compared with rates of delivery using the usual standard of care. A secondary set of hypotheses testing the additive effects of the 2 components of the system intervention was only partially supported: provider training resulted in significantly higher rates of clinician screening and counseling of adolescent patients across all of the targeted risk areas; however, the subsequent addition of a modified screening tool did not further significantly increase screening and counseling rates. Thus, although the full intervention of training and tools significantly increased clinician screening and counseling rates, most of this increase seems to be accounted for by the training component.
Component 1: Training
The full-day provider training workshops that were conducted in the intervention sites were associated with large increases in rates of clinician screening and counseling (an average of 24%) across all targeted risk areas. The effect of this training is especially noteworthy, as continuing medical education workshops alone have not been found to be an effective method of changing clinician behavior.34 There are several reasons why this particular workshop might have been effective at changing clinician behavior. First, the training workshops, based on social cognitive theory,33 addressed multiple barriers that impede implementation of preventive services for adolescents. The training focused on increasing knowledge, skills, and self-efficacy to effectively deliver preventive services to adolescents. Second, local opinion leaders were integrally involved in the intervention, a strategy that has been associated with improvements in professional practice.24 The chief or assistant chief of pediatrics served as our MD champion in each of the intervention sites, and adolescent medicine specialists from within the health care plan were involved in planning and conducting the training sessions. Third, the majority of clinicians, as well as the chief of pediatrics, attended the trainings together. This cohesive approach facilitated incorporating the screening and counseling methods not only into the practices of individual clinicians but also throughout the clinic system.
These findings support previous work on the effectiveness of skill-based training for clinicians.28 However, these findings extend the previous work in several important ways. First, the inclusion of comparison sites further validates the efficacy of this skill-based training program. Second, this training focused on delivering preventive services to all adolescents. Consequently, adolescents who were screened and counseled ranged in age from 13 to 17 years; earlier research had included only a cohort of 14-year-old patients. Third, the magnitude of the change in screening and counseling rates in the current study is considerably larger than that reported in previous research. The larger increase in rates of screening and counseling may be because clinicians were encouraged to deliver preventive services to all adolescents who came to their clinic for a well visit. Although this may initially seem more difficult than delivering services to a subset of adolescents (eg, 14-year-olds), these findings suggest that delivering services systematically to all patients may make it easier to integrate new guidelines into practice.
Full Intervention: Training and Modified Screening and Charting Tools
Although studies have found that interventions that combine 2 or more modalities are more likely to improve clinical practice,21,26,27 the subsequent addition of tools did not result in additional significant increases in screening and counseling rates in the intervention sites in relation to the comparison sites. Although rates of screening and counseling for helmet use increased between phases 2 and 3 in the intervention sites, once the full-model ANCOVA took into account any changes in the comparison sites and controlled for all covariates, this increase was not significant, relative to the comparison sites. Thus, although introducing the modified questionnaire had some impact on specific areas of screening and counseling, overall, it did not add significantly more than the initial training.
One likely explanation for why the addition of tools had less impact than expected is a ceiling effect. After the training, adolescents reported being screened by their provider >80% of the time across all targeted areas, except for helmet use (at 70%). These are very high rates of screening across a wide age range of adolescent patients. It may be unrealistic to expect rates to increase much beyond ∼80% in a busy clinical practice when clinicians are attempting to screen all adolescents. After introduction of the screening questionnaire, the screening rate for helmet use increased to approximately the same level (81%) as the other risk areas. The consistent implementation of the adolescent screening questionnaire may have served to maintain the posttraining screening and counseling rates across all risk areas. The already high rates of screening posttraining may have led to the screening questionnaire's serving primarily as a maintenance tool.
It should also be noted that at baseline, before our intervention, all clinics were using a charting form that had previously been distributed by the HMO's Regional Health Education for use in adolescent well visits. We made only minor changes to this form for the purposes of this study. Because clinicians were already using the charting form during the baseline phase (when screening and counseling rates were lower), it is clear that distributing a charting form alone does not result in high rates of screening and counseling. However, because all clinicians had some tool to assist them before the intervention, the full-intervention tools phase was more of a modification/add-on phase than a pure tools phase. This also may have contributed to the addition of tools having less of an impact.
During the full intervention phase (posttools), clinicians screened adolescents across all targeted risk areas, on average, 81% of the time. Screening across all risk areas reached approximately the same level regardless of the differential in screening rates at baseline. Consistent with our previous research,27 during the pretraining phase, clinicians were most likely to screen adolescents for substance use and least likely to screen for seatbelt and helmet use. Thus, the greatest absolute increase in screening rates, as a result of the intervention, was in the areas of seatbelt and helmet use. This suggests that, before our intervention, screening adolescents for seatbelt and helmet use was not a focus and/or was not viewed as being as important as other risk areas such as substance use. This is especially striking given that the majority of adolescent deaths are attributable to unintentional injuries,1 screening for seatbelt and helmet use is relatively straightforward, and recent data suggest that clinician screening and counseling do have an effect on adolescent safety behavior.35,36
Although the findings of this intervention study are promising, a limitation of the study is that it was conducted within a group-model HMO that has distinct characteristics. For example, the ability to conduct a training that includes all primary care providers and is tailored to a particular clinic setting is not easily transferable to all health care settings. However, although the effect size may differ, the basic components of combining a provider skill-based training with tools that assist in enhancing or maintaining behavior change are generalizable to providers across a wide range of clinical settings.
Improving adolescent health is the ultimate goal of adolescent clinical preventive guidelines. However, an essential first step is to develop successful implementation models to facilitate prevention efforts.37 This is the first study to propose a model for implementing adolescent clinical preventive services in busy clinical practices within managed care, using a research design that includes intervention and comparison sites. It is now time for research to begin to assess the behavioral and health outcomes among adolescents who receive clinical preventive services.
This study offers strong support for an intervention to increase clinicians' delivery of preventive services to a wide age range of adolescent patients. The enhancement of screening and counseling across all 6 targeted risk behaviors, compared with delivery using the usual standard of care, indicates that it is possible to improve the delivery of preventive services in the context of outpatient pediatric visits. This is a key step toward fully using the context of the health care setting for promoting adolescent health.
This research was supported primarily by grant U18 HS11095 from the Agency for Healthcare Research and Quality. Additional support was provided through a cooperative agreement from the Centers for Disease Control and Prevention through the Association of American Medical Colleges (MM-0162-02/02); the Hellman Family Award for Early Career Faculty; and by the Maternal and Child Health Bureau (MCJ-000978), Health Resources and Services Administration, Department of Health and Human Services (T71MC00003, U45MC00002, and U45MC000023).
We thank Jan Babb, Martha Barbosa, Sherille Pedron, and Natalie Redmond for the exceptional job in data collection and Ilse Larson and Alison Goldberg for skillful work in organizing the trainings and helping with multiple aspects of this study. Sheila Husting has played an integral role in data management, and we thank Jeanne Tschann for consultation on data analysis. We also appreciate the assistance of Michael Berlin in the preparation of the manuscript. Finally, we are grateful to Charito Sico and Gail Udkow for support for this study as well as the clinicians and staff in the Kaiser Permanente Northern California clinics who participated in this study and demonstrated a commitment to delivering preventive health care to adolescents.
- Accepted July 28, 2004.
- Reprint requests to (E.O.) Division of Adolescent Medicine, Department of Pediatrics, Box 0503, LH 245, University of California, San Francisco, CA 94143. E-mail:
This work was presented in part at the annual research meeting of Academy Health; June 29, 2003; Nashville, TN.
No conflict of interest declared.
- Ozer EM, Park MJ, Paul T, Brindis CD, Irwin CE Jr. America's Adolescents: Are They Healthy? San Francisco, CA: University of California, National Adolescent Health Information Center; 2003
- Anderson RN. Deaths: leading causes for 2000 [serial online]. Natl Vital Stat Rep. 2002;50:1–85. Available at: www.cdc.gov/nchs/products/pubs/pubd/nvsr/50/50-16.htm. Accessed September 3, 2003
- National Center for Injury Prevention and Control. Injury mortality reports 1999–2000 [database online]. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2002. Available at: webapp.cdc.gov/sasweb/ncipc/mortrate10.html. Accessed September 3, 2003
- National Highway Traffic Safety Administration. Traffic Safety Facts 2000: A Compilation of Motor Vehicle Crash Data From the Fatality Analysis Reporting System and the General Estimates System. Washington, DC: National Highway Traffic Safety Administration, National Center for Statistics and Analysis, US Department of Transportation; 2001. Available at: www-fars.nhtsa.dot.gov/pubs/1.pdf. Accessed September 3, 2003
- Mackay AP, Fingerhut LA, Duran CR. Adolescent Health Chartbook. Health, United States, 2000. Hyattsville, MD: National Center for Health Statistics. Available at: www.cdc.gov/nchs/products/pubs/pubd/hus/2010/2010.htm#hus00. Accessed September 3, 2003
- Newacheck PW, Brindis CD, Cart CU, Marchi K, Irwin CE Jr. Adolescent health insurance coverage: recent changes and access to care. Pediatrics. 1999;104(suppl):195–202
- Elster AB, Kuznets N. Guidelines for Adolescent Preventive Services (GAPS): Recommendations and Rationale. Chicago, IL: American Medical Association; 1994
- Green M, ed. Bright Futures: Guidelines for Health Supervision of Infants, Children and Adolescents. Arlington, VA: National Center for Education in Maternal and Child Health; 1994
- Stein M, ed. Health Supervision Guidelines. 3rd ed. Elk Grove Village, IL: American Academy of Pediatrics; 1997
- Department of Health and Human Services, Public Health Service, Office of Disease Prevention and Health Promotion. The Clinician's Handbook of Preventive Services: Put Prevention Into Practice. Alexandria, VA: International Medical Publishers; 1994
- US Preventive Services Task Force. Guide to Clinical Preventive Services. 2nd ed. Baltimore, MD: Williams & Wilkins; 1996
- Park MJ, Macdonald TM, Ozer EM, et al. Investing in Clinical Preventive Health Services for Adolescents. San Francisco, CA: University of California, Policy Information and Analysis Center for Middle Childhood and Adolescence, and National Adolescent Health Information Center; 2001
- Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317:465–468
- Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations: from "official" to "individual" clinical policy. Am J Prev Med. 1988;4(suppl 4):77–97
- Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423–1431
- Klein JD, Allan MJ, Elster AB, et al. Improving adolescent preventive care in community health centers. Pediatrics. 2001;107:318–327
- Ozer EM, Adams SH, Lustig JL, et al. Can it be done? Implementing adolescent clinical preventive services. Health Serv Res. 2001;36(suppl):150–165
- Lustig JL, Ozer EM, Adams SH, et al. Improving the delivery of adolescent clinical preventive services through skills-based training. Pediatrics. 2001;107:1100–1107
- Green LW, Eriksen MP, Schor EL. Preventive practices by physicians: behavioral determinants and potential interventions. Am J Prev Med. 1988;4(suppl 4):101–110
- Lawrence RS. Diffusion of the U.S. preventive services task force recommendations into practice. J Gen Intern Med. 1990;5(suppl 5):S99–S103
- Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice Hall; 1986
- Bandura A. Self-efficacy: The Exercise of Control. New York, NY: WH Freeman; 1997
- Ozer EM, Adams SH, Lustig JL, et al. The effect of preventive services on adolescent behavior [abstract]. Pediatr Res. 2003;53(suppl 4):265A
- Johnston BD, Rivara FP, Droesch RM, Dunn C, Copass MK. Behavior change counseling in the emergency department to reduce injury risk: a randomized, controlled trial. Pediatrics. 2002;110:267–274
- Agency for Healthcare Research and Quality. Translating Evidence Into Practice: Conference Summary. Rockville, MD: Agency for Health Care Policy and Research; 1998. Available at: www.ahcpr.gov/clinic/trip1998. Accessed September 3, 2003
- Copyright © 2005 by the American Academy of Pediatrics