Objective. To develop and implement a methodology to teach clinical skills to medical students in busy clinical settings.
Methods. The Structured Clinical Observation (SCO) program with guidelines and observation sheets for history-taking, physical examination, and information-giving skills was created. Faculty development preceded SCO implementation for pediatric clerkship students at Jefferson Children's Health Center. SCO observation sheets were tabulated and faculty and student questionnaires were administered.
Results. The mean number of observations per student was 6, with 368 observations done for 63 students. SCOs were highly rated as an educational tool by faculty and students. The impact of the SCO program on faculty ability to perform clinical duties was initially minimal, but increased over the year. Observations were used primarily for feedback, but did influence outpatient clinical faculty's evaluation of two thirds of the students. Only 50% of students reported being observed in other rotations.
Conclusions. SCOs are a feasible, inexpensive, qualitatively effective method of teaching clinical skills. The quantitative effect of SCOs on performance needs to be evaluated.
Most medical schools in the United States have courses on medical interviewing, interpersonal skills, and physical diagnosis,1 which usually take place in the first 2 years of medical school. However, the majority of schools do not continue the interviewing and interpersonal skills courses into the clinical years, citing as barriers lack of time, lack of money, and lack of resources to motivate and support faculty.2
A 1995 literature review3 concluded that many medical students have deficient skills in interviewing, physical examination, and management of psychosocial issues. Clinical skill deficiencies in medical students were discussed at the 1998 Group on Educational Affairs plenary of the Association of American Medical Colleges (AAMC)4 because data from recent clinical performance examinations suggested that competency is not being obtained through the use of current educational models. These shortcomings in the teaching and evaluation of clinical skills are summarized and discussed in a recent Association of American Medical Colleges article.5
It was because of these data and our own experience teaching medical students during the pediatric clerkship that we identified a need for a clinical skills teaching model that could be used in busy clinical settings and that was both time-efficient and low in cost. In this article we describe the development, implementation, and qualitative outcomes of a teaching model we call Structured Clinical Observations (SCOs).
Development of the SCO Program
Our goal was to teach clinical skills to third-year medical students. Because the role of observation is implicit in the Liaison Committee on Medical Education6 directive to ensure clinical competence, our starting point for program development was a method based on direct observation. We chose to use repeated brief observations instead of observation of an entire encounter because we wanted a method that was economical of faculty time and one that would remove faculty members from their clinical duties only for very short periods of time. For the observations to be of learning value, they would be followed by immediate feedback. The short, frequent observation model would limit the number of feedback points, so that the feedback would not become overwhelming for the student and would not take more than a few minutes for the faculty member to complete. We also hypothesized that frequent repeated observations would give the students a chance to incorporate into their future encounters what they had been taught, and to feel that they were gaining mastery and improving their clinical skills. This follows the basic guidelines of adult learning theory in that it shows respect for the learner, builds on previous experiences, has immediacy of application, and gives the opportunity to practice.7
The clinical skills were divided into history-taking, physical examination, and information-giving, thereby structuring the learning experience by dividing educational tasks into component parts.8 The clinical skills areas we defined are similar to those used for clinical skills assessment examination checklists but with the interpersonal skill items incorporated into them.9 Observation sheets were created for each clinical skill area with skill guidelines printed on them and a space for the observer to write.
Implementation of the SCO Program
Implementation took place in the Children's Health Center (CHC) of Thomas Jefferson University Hospital, Philadelphia, Pennsylvania.
All third-year medical students who did their pediatric clerkship at Thomas Jefferson University Hospital during the academic year 1997–1998 (n = 63) participated in the SCO program during the 2 weeks they spent in the CHC. All of the general pediatricians who worked in the CHC agreed to participate. Initially, 10 faculty members were trained and participated in the SCOs; over the course of the study year, however, the number of faculty available to precept in the CHC decreased to 6. By midyear, 7 of the original 10 were doing frequent SCOs (at least once a week) and 3 were doing SCOs less than once a week. By the end of the academic year, only 4 of the original 7 frequent observers were still doing frequent SCOs, but 2 newly trained faculty members had joined the frequent SCO group.
Participating faculty members attended a 2-hour faculty development workshop. The workshop consisted of a discussion of the rationale for the SCO program and an overview of the SCO methodology, followed by an overview of the basic tenets of effective feedback. Faculty members practiced observing and using the SCO forms by watching 2- or 3-minute video vignettes of students interacting with patients, after which they practiced giving feedback using role-play. Points that were emphasized during the workshop were: observations should be no longer than 3 minutes; if 1 or 2 major feedback points are identified in the first few seconds of an observation, then the SCO is ended; write down exactly what the students say and exactly what they do, to allow focused, effective feedback; different faculty members, after the same observation, might select different items about which to give feedback; the patient-student interaction must not be disrupted; and feedback is to be given outside the room after the encounter is completed and should last no longer than 2 minutes.
One faculty member was assigned to do SCOs each clinical session, and he/she attempted to do 1 SCO per student per session.
Students were oriented to the SCO program at the start of each 6-week pediatric clerkship and again just before starting the SCO program, when each group of 2 or 3 students began their 2-week block in the CHC. The skill guidelines were reviewed and discussed with the students, and it was emphasized that this was a teaching experience and not an evaluation.
Students were instructed to tell the scheduled SCO faculty member when they were going to see a patient, and in which room they would be. Students were also instructed to inform the patient and family that a faculty member might enter the room to observe the student. Students were told to expect specific feedback that would help them improve their clinical skills, and were instructed to talk with the faculty observer before leaving the clinical session.
SCO Program Evaluation
Questionnaires were given to faculty members when the SCO program had been running for 4 clerkship blocks and again at the end of the academic year, after 8 blocks were completed. Questionnaires about the SCO program were given to each student at the end of their clerkship block; students were also asked about observations they had experienced in other preclinical and clinical courses. The student questionnaires were completed anonymously.
Data were entered into a Microsoft Excel (Microsoft Corporation, Redmond, WA) database for analysis.
Sixty-three students rotated through the Jefferson Pediatric Clerkship during the 1997–1998 academic year, and 60 students participated in the SCO program. A total of 368 SCO observations were done. The mean number of observations per clerkship block was 46, and the mean number of observations per student was 6. The number of observations in each skill area is shown in Fig 1.
The overall student rating of the SCO program on a 5 (high) to 1 (low) Likert scale is shown in Fig 2. Seven (12%) of the students found inconsistencies in the feedback they received. Student comments about the SCOs are listed in Table 4. Thirty students (50%) said that they had been observed in previous rotations; of these, 5 (8%) had been observed during the first and second years of medical school and 25 (42%) had been observed in clinical clerkships.
Faculty members rated the effectiveness of the workshop in preparing them to participate in the SCO program either 4 or 5 on a Likert scale (5 [high] to 1 [low]). At the midpoint of the year (end of block 4), the faculty rated the impact of the program on their ability to fulfill clinical duties low, at 1 or 2; this rating trended upward to 2, 3, or 4 by the end of the academic year, indicating a greater impact. The faculty members rated the value of the SCO to student education at 4 or 5 (Fig 2). All of the faculty who did frequent SCOs said that they had evidence of students incorporating feedback suggestions into future encounters, either by actually observing them do so or by the students telling them that they had done so. All of the faculty identified physical examination as the students' weakest skill. Although the SCOs were presented to the faculty as a teaching tool, we asked the faculty whether doing the SCOs ever changed their opinion of a student's abilities and affected the final clinical grade assigned. Faculty reported that, because of the SCOs, they changed their opinion of two thirds of the students, and two thirds of the time this change was in a positive direction.
A compelling case can be made for medical schools to focus on ensuring that graduates have high-level clinical skills, because an accurate history and physical examination are the basis for correct diagnosis about 90% of the time,10 and because physician behaviors have a direct effect on patient satisfaction, patient adherence, and clinical outcomes.11,12 We know that it is possible to effectively teach medical interviewing skills, and we know that these skills decline during the clinical years as students become more involved with medical problem solving.13–15 However, during clinical rotations little direct observation of trainees, with feedback about performance, occurs.16–19 Our data indicate that this is still the case at our medical school, and that half of our students complete their third-year clerkships without ever being observed.
Our results show that the SCO program was qualitatively successful in that it was highly regarded as a clinical skills teaching tool by both students and faculty. The SCO program allowed us to identify specific student behaviors in history-taking, physical examination, and interpersonal interaction that, if uncorrected, would most likely lead to future negative clinical outcomes. Deficiencies in information-giving skills were less easily identified using our methodology, as the random nature of the SCOs rarely resulted in a faculty member observing a student giving information. In the future, changes could be made in implementation to target and teach information-giving skills. More important, we need to undertake a quantitative evaluation of the effect of the SCO methodology on overall student performance, to determine whether there are any measurable differences in skills between students who participate in the program and those who do not.
The SCO program was feasible in the sense that we were able to maintain the number of observations per clerkship block over the year of this study, despite the cutback in the number of faculty preceptors and the perceived impact this had, by the end of the year, on the ability of the faculty to fulfill clinical duties. We believe that a major factor contributing to the sustainability of the program was that only 5 minutes of faculty time was required per student per session. Faculty development was a key part of the success of the SCO program and enabled faculty members to feel confident in observing and giving focused, effective feedback. Ende,20 and many others since,3,9 have stressed the necessity for feedback in medical education, without which our SCO observations would have been educationally useless. One outcome of the SCOs that was not entirely unexpected was that they had a moderate influence on evaluation and sometimes did affect the grade assigned.
The generalizability of this method was inadvertently demonstrated during the study year when several of our faculty, on their own initiative, used the SCOs in the inpatient and nursery settings. The SCO methodology is generalizable to other clerkships as well as other clinical settings. The costs we incurred were those related to time: 2 hours for a faculty development session and then a small ongoing time commitment for performing the SCOs. Faculty who regularly did SCOs maintained a high level of comfort and rated their ability to identify items for feedback and to give effective feedback at either the 4 or 5 level. Repeat workshops might be necessary for faculty who do infrequent SCOs or who return to the program after time away.
Redirecting some of the resources that medical schools spend on summative assessment of clinical skills to support an SCO-type program of daily observation and feedback would ensure that students are given the opportunity to learn the skills that are tested in a summative clinical skills assessment. Also, by embracing a program that uses the principles of adult learning and enables students to build on and improve their skills, students may attain a higher level of competency than they otherwise might.
We have learned from this first year of the SCO program that observation allows us to diagnose the learner in the clinical performance arena in the same way that the 5-step microskills model21 enables faculty to diagnose the learner in the knowledge and clinical reasoning arena.
Our experience with developing and implementing the SCO program demonstrates that valuable clinical teaching programs can be created without funding, and that a program valued by students and faculty can be sustained even in the current medical education climate, in which faculty members have reduced time for teaching. A quantitative evaluation of the educational impact of an SCO program needs to be undertaken.
We thank Lisa Sudell for her assistance with the data and for help with preparation of the manuscript. We also gratefully acknowledge the contribution made by the faculty members who participated in the SCO program.
- SCO = Structured Clinical Observations
- CHC = Children's Health Center
- Varner K, ed. Association of American Medical Colleges Directory 1998–1999. Washington, DC: Association of American Medical Colleges; 1998
- Association of American Medical Colleges. The GEA Correspondent. Washington, DC: Association of American Medical Colleges; 1999
- Functions and Structure of a Medical School. Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree. Washington, DC: Liaison Committee on Medical Education; 1997
- Roberts KB
- Scheiner AP
- Eichna LW. Medical school education, 1975–1979. A student's perspective. N Engl J Med. 1980;303:727–734
- Copyright © 2000 American Academy of Pediatrics