Abstract
Context. Prescribing practices for otitis media are not consistent with current evidence-based recommendations.
Objective. To determine whether point-of-care evidence delivery regarding the use and duration of antibiotics for otitis media decreases the duration of therapy from 10 days and decreases the frequency of prescriptions written.
Design. Randomized, controlled trial.
Setting. Primary care pediatric clinic affiliated with university training program.
Intervention. A point-of-care evidence-based message system presenting real time evidence to providers based on their prescribing practice for otitis media.
Main Outcome Measures. Proportion of prescriptions for otitis media that were for <10 days and frequency with which antibiotics were prescribed.
Results. Intervention providers had a 34% greater increase than control providers in the proportion of antibiotic courses prescribed for <10 days. Intervention providers were also less likely to prescribe antibiotics than were control providers.
Conclusions. A point-of-care information system integrated into outpatient pediatric care can significantly influence provider behavior for a common condition.
Keeping current and applying the best evidence in the care of patients remains a problem for physicians. Recent studies suggest that providers do not prescribe the appropriate or optimal medications for their patients.1,2 Among other consequences, this has led to significant unnecessary use of antibiotics and a growing problem of microbial resistance.3,4 For pediatricians, the most common indication for prescribing antibiotics is otitis media.5 Data suggest that this practice could be improved in several ways, including delaying the initiation of therapy for 2 to 3 days from the start of symptoms and shortening the course from the frequently used 10 days.6–8 Reducing both the duration of therapy and the frequency of antibiotic prescriptions for otitis media is important because doing so saves costs and reduces the spread of penicillin-resistant pneumococcus.9,10
Many past attempts to change physician behavior have been unsatisfactory or yielded mixed results.11 Clinical practice guidelines have not been well received by pediatricians,12 and their dissemination alone has not proven an effective means of improving the quality of care.13,14 Passive information retrieval systems that rely on providers to actively seek answers to their questions have also proven largely ineffective.15 Embedding explicit recommendations into the flow of care has been demonstrated to improve the care of surgery patients and health care workers.16,17 We hypothesized that a system that did not make explicit recommendations but rather attempted to make providers aware of the most recent advances in medicine could also be effective, even for conditions for which providers were unlikely to seek assistance. Therefore, we conducted a randomized, controlled trial to test whether presenting pertinent, timely, and relevant evidence to providers at the point of care could change their prescribing practices for otitis media.
METHODS
Design
We performed a randomized, controlled trial of provider behavior change. We measured provider behavior change in the intervention and control groups by assessing prescribing behavior before and after the initiation of the evidence-based decision support system. This approach enabled us to measure the independent effects of the intervention, controlling for differences in previous antibiotic use as well as for changes in prescribing that were unrelated to the intervention.
Setting
This study was conducted at the Pediatric Care Center (PCC) at the University of Washington, the primary outpatient teaching clinic for the university's pediatrics residents. In addition to the 29 resident physicians who care for patients, there are 2 nurse practitioners and 7 attending physicians who follow their own patient panels. The clinic uses a computerized patient flow manager developed by one of us (J.A.W.). This system includes a daily electronic roster of scheduled patients as well as their vital signs and height and weight, which are entered by the medical assistants at each visit. Each provider logs into the system, which then displays the patients scheduled for the day. A computer workstation was placed in each examination room as well as in the physician work area and nursing station. These computers are connected to a server via a local area network.
In preparation for the study, an on-line prescription writer was developed to interface with the existing computerized patient flow manager. To prescribe a medication for any given patient, a provider selects a patient's name within the system and clicks on a treatment icon. Providers can then select a medication and an indication as well as a dosage and duration. The system prints out a copy of the prescription, which providers sign. Before beginning the study, we instituted a 6-month run-in period during which all providers used the prescription writer. This provided data on baseline prescribing practices of all participants. At the time that the intervention began, no paper prescriptions were being written in the clinic.
Participants
The participants in the study were the 38 providers caring for patients at the PCC, excluding the principal investigators. The protocol was approved by the University of Washington Institutional Review Board. Providers were not informed of the purpose of the study. These participants provided care during 14 414 visits over the course of the baseline and intervention periods of the trial, including 1339 visits for acute otitis media.
Randomization
We used stratified randomization with an electronic random number generator. Providers in 3 strata—residents, nurse practitioners, and attending physicians—were randomly assigned to receive evidence-based prompts or not.
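The stratified assignment described above can be sketched as follows. This is a minimal illustration, not the investigators' actual procedure; the function name, roles, and even-split rule within each stratum are assumptions for the sketch.

```python
import random

def stratified_randomize(providers, seed=0):
    """Assign providers to intervention or control within each stratum
    (resident, nurse practitioner, attending) so that the arms are
    balanced on provider type. `providers` is a list of (name, role)."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    strata = {}
    for name, role in providers:
        strata.setdefault(role, []).append(name)
    assignment = {}
    for role, names in strata.items():
        rng.shuffle(names)
        half = len(names) // 2
        for name in names[:half]:
            assignment[name] = "intervention"
        for name in names[half:]:
            assignment[name] = "control"
    return assignment
```

Randomizing within strata rather than across the whole roster keeps the two arms comparable on provider type despite the small sample.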
Intervention
Providers in the treatment arm were immediately presented with pop-up screens based on their selection of antibiotic, indication, and duration. The screens were designed to convey the evidence in stages. The first screen was a 5-line summary of the evidence. At the bottom of the first screen were options to: 1) see more information; 2) see the abstracts of the articles from which the summaries were derived; 3) see the articles themselves, which had been scanned and could be viewed in portable document format using Acrobat Reader; or 4) have the references e-mailed to the provider automatically (Fig 1). The initial message screens are presented in Table 1. The intervention period lasted for 8 months.
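The trigger logic behind these staged screens can be sketched as follows. This is a hypothetical illustration of the mechanism, not the system's actual implementation; the rule contents, field names, and threshold are placeholders, not the messages in Table 1.

```python
# Hypothetical sketch: when a provider files a prescription, the entry is
# matched against evidence rules; on a match, the 5-line first-level
# summary is queued for display. Rule text below is illustrative only.
EVIDENCE_RULES = [
    {
        "indication": "acute otitis media",
        "condition": lambda rx: rx["duration_days"] >= 10,
        "summary": "Illustrative summary: trials suggest a shorter course "
                   "may suffice for uncomplicated acute otitis media.",
    },
]

def screens_to_show(rx):
    """Return first-level evidence summaries triggered by a prescription
    dict; an empty list means no pop-up is shown."""
    return [rule["summary"] for rule in EVIDENCE_RULES
            if rule["indication"] == rx["indication"]
            and rule["condition"](rx)]
```

The point of the design is that the check runs at the moment of prescribing, so the evidence reaches the provider without any active information seeking.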
Fig 1. Sample evidence screen.
Table 1. Summary of Actions Triggering First-Level Evidence Screen
Outcomes
Our primary outcome was reduced duration of therapy below the 10-day course typically used. We chose this as our primary outcome because we deemed it unlikely that we would be able to significantly reduce the proportion of patients with otitis media who were treated with antibiotics, although this was our secondary outcome. For all outcomes, return visits within 30 days for the same diagnosis were not counted.
Statistical Analysis
Our small sample size made it impossible to ensure complete comparability of the 2 groups at the start of the trial. In addition, we suspected that there would be a considerable amount of diffusion of the effects of the evidence, which would diminish our power to detect significant differences. In light of these problems, we collected pretrial provider behavior data and analyzed provider behavior change during the trial phase. This served 2 purposes. First, it explicitly controlled for each provider's baseline prescribing practice. Second, it had the statistical effect of reducing the random variance in the outcome measure, thereby affording the analysis greater power.
Provider behavior change was assessed as the difference between each provider's outcomes in the periods before and after the start of the intervention. Because the outcomes are expressed as a change in individual provider behavior, it was unnecessary to control for potential confounders (such as years of provider training) of the behavioral outcome. The possibility of effect modification by level of training was tested in a logistic framework. Student's t tests were used to assess the statistical significance of the differences in the outcomes between the treatment and control groups. As a subanalysis, t tests were used to assess the significance of the differences in provider behavior in the before versus after periods.
The providers' panels were quite unbalanced because of varying work schedules, with some providers having over 100 visits for acute otitis media and others having fewer than 10. Accordingly, the outcomes, which were expressed as means of provider behavior across visits, are estimated with different levels of precision for different providers. Therefore, all of the analyses were conducted using weights, in which each provider's actions contributed information to the analysis according to the precision with which their mean was estimated.18 The mean outcomes for providers with more visits could be more precisely estimated than the mean outcomes for providers with fewer visits, and accordingly, these providers contribute more weight in the analysis of the outcomes. This method was used because it achieves greater precision in the estimates of the effects of the intervention than does unweighted analysis.
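The precision weighting described above can be sketched as follows. This is a minimal illustration, not the authors' actual Stata analysis; it assumes frequency weights proportional to each provider's number of otitis media visits, which is one common way to operationalize "precision of the estimated mean."

```python
from math import sqrt

def weighted_mean_change(changes, n_visits):
    """Frequency-weighted mean of per-provider behavior changes.
    `changes` holds each provider's before-to-after change in the outcome;
    `n_visits` holds the corresponding visit counts, so providers whose
    means are estimated more precisely contribute more weight."""
    total = sum(n_visits)
    return sum(c * n for c, n in zip(changes, n_visits)) / total

def weighted_se(changes, n_visits):
    """Weighted standard error of the mean change (frequency weights)."""
    m = weighted_mean_change(changes, n_visits)
    total = sum(n_visits)
    var = sum(n * (c - m) ** 2 for c, n in zip(changes, n_visits)) / (total - 1)
    return sqrt(var / total)
```

A between-arm comparison would then contrast the weighted mean change in the intervention arm with that in the control arm, scaled by their combined standard errors, in the spirit of the t tests reported above.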
Because not all providers treated patients for otitis media during both the before and after periods, the total number of providers is <38. Of the 19 providers initially randomized to the control group, 18 had visits for acute otitis media in both the baseline and intervention periods—and could, therefore, be included in the analysis of whether to prescribe antibiotics—and 16 prescribed antibiotics for acute otitis media in both the baseline and intervention periods—and could, therefore, be included in the analysis of the length of antibiotic treatment. Of the 19 providers initially randomized to the intervention group, 17 had visits for acute otitis media in both the baseline and intervention periods, and 12 prescribed antibiotics for acute otitis media in both the baseline and intervention periods. There were 1339 visits for acute otitis media, of which 1325 were to the 35 providers included in the first analysis, and 960 were included in the second analysis.
All analyses were conducted with Stata, Version 6.0, statistical software (StataCorp, College Station, TX).
RESULTS
The mean age of providers in the study was 35 years (range: 27–67). Approximately one half (47%) were female. Among the attending physicians and nurse practitioners, the average number of years since graduating from professional school (medical or nursing) was 5.9 (range: 1–39). There were 488 visits for acute otitis media in the baseline period (March through September) and 851 visits in the intervention period (October through May).
During the baseline period, 50.7% of all prescriptions for otitis media were written for a duration of <10 days. After the intervention, this proportion increased to 69.7%. For the primary outcome, providers in the intervention arm had a 44% increase in the frequency with which they prescribed antibiotics for <10 days, whereas providers in the control arm had a 10% increase. As presented in Table 2, this change in behavior was significantly related to the intervention, although both groups improved (P < .01). The possibility of effect modification by provider training was tested in the context of a logistic regression. The results were similar to those reported here, and no effect modification could be detected.
Table 2. Comparison of Study Effects for Both Outcomes
For the secondary outcome of treating acute otitis media without antibiotics, behavior in both groups deteriorated, ie, both groups became more likely to prescribe antibiotics. However, the differences in behavior before and after were significant only in the control group. The difference in change in behavior was marginally significant (P = .095), indicating that the intervention may have had some protective effect against a trend of generally increasing antibiotic use.
DISCUSSION
We have demonstrated that presenting pertinent evidence to providers during critical junctures and at the point of care can significantly affect their prescribing behavior. This intervention shortened the treatment course with antibiotics for acute otitis media and had some effect on whether antibiotics were prescribed at all. In addition, the change in the behavior of control providers suggests that the evidence about shorter treatment course may have been rapidly and effectively disseminated among providers.
Our interpretation of the control group behavior change was that these providers were getting the evidence indirectly through casual discussions with the intervention group providers. It is possible that the control group's behavior change was caused by unrelated secular trends. Nevertheless, the intervention group experienced a significantly greater behavior change over and above that experienced by the control group.
We were surprised by the increased frequency with which antibiotics were prescribed for acute otitis media during the course of the study. However, this may be confounded by seasonal factors. The baseline period before the intervention consisted of summer months, whereas the trial period consisted of fall and winter months. Because each period covered a different set of months, it was not possible to control for seasonal effects. It may be that provider prescribing tendencies increase in winter months (ie, during the trial period). However, the fact that intervention providers had less of an increase is noteworthy and suggests that here too point-of-care evidence may be beneficial.
Two key features of our approach distinguish it from previous interventions. First, the evidence-based decision support system is neither prescriptive nor proscriptive. Relevant information was summarized and presented to providers in an objective and concise manner. Providers were free to ignore it, act on it, or learn more about it. This distinguishes our system from clinical practice guidelines as well as academic detailing, both of which are more directive in their suggestions.19,20 A second key feature of our system is that it was seamlessly integrated into practice, presenting relevant information to providers at the point of care without disrupting their workflow. Enabling ready access to information has been shown to be effective for changing provider behavior,21 but typically providers have to seek such information, which can be time-consuming and cumbersome and, by design, can only affect those conditions for which providers feel in need of assistance. Otitis media is unlikely to be such a condition because it is so common.5
Some limitations to our study relate to the setting in which it was conducted and the providers it enrolled. The PCC is a resident teaching clinic. How strong the effects of such a system might be in other settings is unknown. Moreover, we had only 2 nurse practitioners and 7 attending physicians in our sample, which limited our ability to study the effects on them. To what extent this method of translating research into practice would be efficacious with more experienced pediatric clinicians warrants additional study.
Despite these limitations, some meaningful conclusions can be drawn from our findings. First, presenting providers with nondirective information at critical junctures in their workflow is an effective means of encouraging providers to practice evidence-based medicine. In fact, the effect size we found was considerably greater than those reported using face-to-face methods, which are considerably more labor intensive.20 Second, it may not be necessary to provide this evidence to every provider in a clinic, because the diffusion effect seems to be strong. Finally, as computer-based prescribing becomes increasingly prevalent, a system such as this may play a critical role in keeping providers current with the increasingly brisk pace of medical advances.
ACKNOWLEDGMENTS
We are grateful to the Packard Foundation for its generous support of this project.
Dr Christakis is a Robert Wood Johnson Generalist Faculty Physician Scholar.
We thank Linda Levenson for her help with data transfer and Alta Lyn Bassett for her editorial assistance.
Footnotes
- Received July 18, 2000.
- Accepted September 19, 2000.
Reprint requests to (D.A.C.) Child Health Institute, 146 N Canal St, Suite 300, Seattle, WA 98103. E-mail: dachris@u.washington.edu
- PCC = Pediatric Care Center
- Copyright © 2001 American Academy of Pediatrics