The Development of a Pediatric Inpatient Experience of Care Measure: Child HCAHPS®
The Centers for Medicare and Medicaid Services (CMS) uses Adult Hospital Consumer Assessment of Healthcare Providers and Systems (Adult HCAHPS®) scores for public reporting and pay-for-performance for most US hospitals, but no publicly available standardized survey of inpatient experience of care exists for pediatrics. To fill this gap, CMS and the Agency for Healthcare Research and Quality commissioned the development of a pediatric version (Child HCAHPS), a survey of parents/guardians of pediatric patients (<18 years old) who were recently hospitalized. This article describes the development of Child HCAHPS, which included an extensive review of the literature and quality measures, expert interviews, focus groups, cognitive testing, pilot testing of the draft survey, a national field test with 69 hospitals in 34 states, psychometric analysis, and end-user testing of the final survey. We conducted extensive validity and reliability testing to determine which items would be included in the final survey instrument and develop composite measures. We analyzed national field test data of 17 727 surveys collected from November 2012 through January 2014 from parents of recently hospitalized children. The final Child HCAHPS instrument has 62 items, including 39 patient experience items, 10 screeners, 12 demographic/descriptive items, and 1 open-ended item. The 39 experience items are categorized based on testing into 18 composite and single-item measures. Our composite and single-item measures demonstrated good to excellent hospital-level reliability at 300 responses per hospital. Child HCAHPS was developed to be a publicly available standardized survey of pediatric inpatient experience of care. It can be used to benchmark pediatric inpatient experience across hospitals and assist in efforts to improve the quality of inpatient care.
- CEPQM — Center of Excellence for Pediatric Quality Measurement
- CMS — Centers for Medicare and Medicaid Services
- HCAHPS — Hospital Consumer Assessment of Healthcare Providers and Systems
Patient-centeredness, a key component of healthcare quality, refers to the principle that care should be designed around patients’ needs, preferences, circumstances, and well-being.1 In pediatrics, the goal is family-centeredness, meaning care that addresses the needs of the family as well as the child.2 The National Quality Forum lists assessment of patient experience, often conducted using patient experience surveys, as a top priority.3,4 In hospitals caring for adults, patient experience is generally assessed using the Adult Hospital Consumer Assessment of Healthcare Providers and Systems (Adult HCAHPS®) Survey. The Centers for Medicare and Medicaid Services (CMS) uses results from Adult HCAHPS to inform consumer choice through public reporting on the Hospital Compare website and calculate incentive payments for the CMS Hospital Value-Based Purchasing Program.5,6
Whereas Adult HCAHPS has become the national standard for adult inpatients, an analogous pediatric inpatient survey has not been previously developed. In response to this gap, the Agency for Healthcare Research and Quality and CMS through the Pediatric Quality Measures Program funded the Center of Excellence for Pediatric Quality Measurement (CEPQM) to design the Child HCAHPS. In this article, we describe the development and national field test of the Child HCAHPS survey. Throughout the development of Child HCAHPS, we followed standard CAHPS development methods, which involve extensive validation and testing, and adhered to all CAHPS design principles.5,7,8
Survey Development Process
Literature Review, Measures Review, and Expert Input
To inform the Child HCAHPS development process, we conducted a systematic literature search of the PubMed database on patient experience of care, reviewing >1500 abstracts and articles. We also examined existing adult and child experience-of-care surveys. We drew on Adult HCAHPS whenever appropriate so that Child HCAHPS could be harmonized with the adult survey.5,7,8 Our goals were to understand the value of patient experience as a measure of healthcare quality and to identify essential domains for potential incorporation into the Child HCAHPS survey. As part of this process, we reviewed 67 surveys and articles describing patient experience surveys. The literature highlights the importance of measuring patient experience as a key element of quality and a target area for quality improvement efforts.9–18 The preponderance of the evidence shows positive correlations between patient-centeredness and a variety of quality measures, including performance on clinical processes of care and patient adherence to recommended treatment plans, as well as health outcomes.9–12,14,16,19–23 In pediatric studies, positive parent report of communication with providers is associated with positive parent perception of discharge preparedness and correlates with a lower readmission likelihood and higher overall quality ratings by parents.13,15,17 In addition, parents and patients have identified several domains of quality as important to patient- and family-centered care. These domains (eg, communication with providers, being kept informed, patient safety) encompass aspects of care with which patients are able to reliably report their experiences.18,24
To inform the development and use of Child HCAHPS, we interviewed experts in the fields of quality measurement, pediatric care, and patient experience. These experts, representing providers, payers, and professional organizations, provided technical and clinically relevant advice on quality measurement, quality improvement, health disparities, and information technology. Experts highlighted the importance of ensuring that items were age-appropriate and culturally sensitive. Throughout the development process, CEPQM’s National Stakeholder Panel advised on Child HCAHPS content and item wording and its usefulness as a measure for future use in public reporting and quality improvement. National Stakeholder Panel participants supported the domains included in the survey and the use of Child HCAHPS for measurement at the local level and for comparisons across hospitals. In addition, we received feedback from parent advocacy groups such as Family Voices and professional groups such as the Child Life Council.
On behalf of CEPQM, the Agency for Healthcare Research and Quality published a notice in the Federal Register in January 2012 requesting public input on measures and instruments to review and key domains to consider in developing the Child HCAHPS survey. Submissions included items on age-appropriateness of care, discharge efficiency, and patient comfort, as well as suggestions for domains such as emergency department care, comfort, and privacy.
During November and December 2011, we conducted 6 focus groups with parents of recently hospitalized children and 2 with recently hospitalized adolescents in Boston, Los Angeles, and St. Louis. Two of these groups were conducted with parents of children with special health care needs. Focus groups were conducted in both English and Spanish. The focus groups covered several aspects of care important to patients and families. Table 1 provides examples of findings. The focus groups also commented on the domains identified by the literature review and experts and raised additional domains for potential inclusion in our instrument, including family involvement and child-appropriate care.
Throughout the survey development process, we conducted 94 full and 25 partial cognitive interviews with parents of recently hospitalized children during which candidate items were tested. Cognitive interviews took place in Boston, Los Angeles, Miami, and St. Louis in both English and Spanish. The English and Spanish versions of the survey elicited similar responses. Overall, parents reported effectively on their own experiences of their child’s inpatient stay and were capable of distinguishing between their own experiences and those of their child; Child HCAHPS contains both types of items. Parents of children of all ages with a broad range of reasons for hospitalization were able to answer most survey questions appropriately and accurately. However, items for some domains were not included in Child HCAHPS because parents either lacked information to report on the experience or did not have a uniform understanding of the concept. For instance, parents were not able to report consistently on care coordination, which in the inpatient setting often occurs out of view of the parent. In addition, collecting data about experiences with shared decision-making was unsuccessful. For example, parents of children with planned hospital stays (eg, tonsillectomy) often felt that because the major decisions (eg, the decision to do the tonsillectomy) were made before the hospitalization, decisions made during the hospitalization seemed minor by comparison and were not sufficiently salient for parents to recall the decision-making process. Other times, when there was an emergent medical problem that led to hospitalization, parents felt that there were no “real” decisions to be made because the severity of the condition dictated the treatment course (eg, surgery for appendicitis).
Child HCAHPS Survey Administration
We followed the Adult HCAHPS survey administration processes whenever possible. The survey was conducted by CAHPS-approved vendors that routinely field proprietary patient experience surveys for the participating hospitals. Parents or guardians (henceforth referred to as parents) of eligible patients were randomly selected using standard sampling procedures and were contacted 48 hours to 6 weeks after the child was discharged. The survey was administered by mail or by telephone in English or Spanish. Spanish surveys were administered to parents whom the hospital’s records identified as preferring Spanish. Exclusion criteria included patients ≥18 years of age, “no-publicity” patients (ie, parents who do not want to be contacted), court/law enforcement patients, wards of the state, observation patients, healthy newborns, obstetric patients, those with a foreign home address, those excluded because of state regulations, those admitted for a psychiatric diagnosis, those discharged to another health care facility, and deceased patients. Healthy newborns were excluded because their care is usually closely associated with a mother’s obstetric care and thus may not reflect a pediatric service’s quality of care. Patients admitted for obstetric care were excluded because care related to pregnancy does not generally fall within the purview of pediatric providers. Patients discharged with psychiatric diagnoses were excluded because these patients typically require specialized psychiatric care that is different from other care received in general acute-care hospitals and would be better assessed by a mental health–specific instrument. Emancipated minor patients and those with parents <18 years old were also excluded in field testing.
After revising some survey items based on input from cognitive interviews, we conducted a pilot field test of the draft survey with parents of recently hospitalized children in 8 hospitals across the United States. The pilot instrument was administered by mail and contained 78 items: 55 items asking respondents whether or how often they had a particular experience, along with an overall rating of the hospital stay; 14 screeners (ie, gateway items that ensure that respondents answer only items relevant to their child’s experience); 8 demographic or descriptive items; and 1 open-ended item. We received 2092 responses, for an average hospital response rate of 22.5%, which we analyzed for item nonresponse, inter-item correlation, and response variation. In addition, we administered 60 telephone surveys and coded the respondent–interviewer interaction to identify any additional difficulties for participants completing the survey by telephone (eg, confusion about items, need for an item to be repeated, hang-ups, substantial delays in answering). Based on the analyses of the pilot test and behavioral coding, the survey was revised before the national field test.
National Field Test
From December 2012 to January 2014, we tested the revised draft survey in 69 hospitals across 34 states. Hospitals that participated included free-standing children’s hospitals (36%), children’s hospitals within an adult hospital (41%), and pediatric wards (23%). The sampling frame for the national field test included medical and surgical patients who were discharged between November 2012 and February 2014 after a stay of ≥1 night at a participating hospital. The total number of patients whose parents were contacted by mail or phone was 103 565. Of the 69 hospitals, 59 administered Child HCAHPS by mail and 10 by phone. We received 17 727 completed surveys, for an overall response rate of 17.1%, which is comparable to that attained by proprietary pediatric patient-experience surveys. There was an average of 257 responses per hospital, with broad representation of child and respondent characteristics (Table 2).
To determine which items would be included in the final Child HCAHPS survey instrument and develop composite measures, we conducted extensive validity and reliability testing. Quantitative methods included exploratory factor analysis, correlations of items and composites with overall ratings, estimation of hospital-level and internal consistency reliability, and case-mix models to adjust comparisons of hospitals for effects of measured patient and respondent characteristics. Each of the measures within the Child HCAHPS survey was conceived of as a hospital-level measure of hospital performance, and therefore, validity and reliability testing of each measure focused on hospital-level analyses.
Exploratory Factor Analysis
We investigated the structure underlying the covariance matrices of case mix–adjusted hospital-level item scores to identify groups of items that were empirically related at the hospital level. We estimated a Bayesian hierarchical model for hospital-level correlation structure after removing sampling variation due to individual variability in responses25; this estimate was used for the factor analysis and all of the following correlational analyses. We explored analyses with different numbers of factors and with both varimax and promax rotations, with generally consistent results regarding item groupings. This analysis generally confirmed that items that we would group together on conceptual grounds were also empirically related, such as the discharge items (Supplemental Figure 1). We found that doctor and nurse communication items were substantially related to other communication items such as “providers talked and acted age-appropriately” and “kept parent informed.” On conceptual grounds and for consistency with the Adult HCAHPS composites, we organized items into several composites for reporting purposes (see Table 3 for list of composite and single-item measures).
Hospital-Level Unit Reliability
Hospital-level unit reliability reflects measure variation between hospitals relative to random variation in the mean response within hospitals. CMS recommends collecting ≥300 responses per hospital to provide hospital-level unit reliabilities for the Adult HCAHPS item composites.26 We used the Spearman–Brown formula to calculate the reliability at 300 completed surveys per hospital, aiming for a hospital-level reliability of ≥0.7 for most composite and single-item measures. Our composite and single-item measures demonstrated good to excellent hospital-level reliability (Table 3). Only 1 measure, “involving teens in their care,” had a hospital-level reliability <0.7 at 300 responses per hospital. The lower reliability reflects the fact that the constituent items were asked only of parents of teenage patients, who accounted for only 22.5% of completed surveys. Six measures had reliabilities of 0.7 to <0.8, 6 had reliabilities of 0.8 to <0.9, and 5 had reliabilities of ≥0.9, indicating excellent ability of the survey to distinguish high and low performers on the corresponding dimensions of patient experience.
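The Spearman–Brown projection used above can be sketched in a few lines. This is an illustrative implementation only, not the study’s analysis code; the function name and the example reliability values are hypothetical.

```python
def spearman_brown(rel_observed: float, n_observed: float, n_target: float = 300) -> float:
    """Project hospital-level unit reliability from an observed number of
    completed surveys per hospital to a target number (default 300)."""
    k = n_target / n_observed  # factor by which the sample is lengthened
    return k * rel_observed / (1 + (k - 1) * rel_observed)

# For example, a measure with reliability 0.60 at 100 responses per hospital
# projects to about 0.82 at the recommended 300 responses per hospital.
print(round(spearman_brown(0.60, 100, 300), 2))
```

Under this formula, any measure whose projected reliability at 300 responses clears the 0.7 threshold is considered adequate for distinguishing hospital performance.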
Composite and Single-Item Correlations With Overall Rating
Criterion validity is the extent to which a measure relates to other measures as predicted by theory. We evaluated criterion validity by examining hospital-level correlations of composites and single-item measures with overall hospital performance, as reflected in the “overall rating” item. Of the 17 composite or single-item measures considered, 14 had significant positive correlations with the overall hospital rating, with a correlation of 0.90 for “would recommend this hospital” and correlations ranging from 0.49 to 0.71 for other measures (Table 3).
Additional Statistical Testing
Internal consistency reliability, commonly assessed with the Cronbach coefficient (α), quantifies how well a scale calculated from a set of items measures a single underlying construct. Internal consistency reliabilities for our composite measures were good to excellent. Although 3 composites had an internal consistency reliability <0.7, the others ranged from 0.75 to 0.94 (Supplemental Table 5). Item-to-composite correlations indicate how each item within a composite correlates with the overall composite. The item-to-composite correlations ranged from −0.23 to 0.91 (Supplemental Table 5). The “mistakes and concerns,” “communication about your child’s medicines,” and “helping your child feel comfortable” composites had low item-to-composite correlations, probably because each consists of items that are conceptually related but deal with fairly distinct processes of care. Composite-to-composite correlations are used to determine whether composites are measuring distinct aspects of patient experience. These ranged from 0.33 to 0.88; the higher correlations reflected the strong associations at the hospital level among measures of communication (Supplemental Table 6).
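As a worked illustration of the internal consistency statistic referenced above, Cronbach’s α can be computed directly from item-level responses. This sketch uses hypothetical data, not the field-test dataset:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a response matrix of shape (n_respondents, k_items)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the composite score
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical responses from 5 parents to a 3-item composite (4-point scale).
responses = np.array([
    [4, 4, 3],
    [3, 3, 3],
    [2, 3, 2],
    [4, 4, 4],
    [1, 2, 1],
])
# Strongly correlated items yield high internal consistency (alpha near 1).
print(round(cronbach_alpha(responses), 2))
```

Low α for a composite, as noted above for “mistakes and concerns,” signals that its items track distinct processes of care rather than a single underlying construct.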
When comparing hospitals, it is desirable to adjust for case-mix differences to estimate how different hospitals would score if they all provided care to comparable groups of patients. Case-mix adjustment estimates and removes the predictable effects of patient and respondent characteristics, such as age and health status, that are not under the control of the hospital and may affect scores on performance measures. We tested the effects of variables available from the survey and hospitals’ administrative data and identified those that were predictive of responses and also had unequal distributions at different hospitals.27 Our final model adjusted for the following variables, entered categorically: child age, child global health status, respondent age, respondent education, respondent relationship to child, and language preference. Of these, the 2 variables that had the largest effects were child global health status and parent education.
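The logic of case-mix adjustment can be sketched as a regression that removes the predictable effect of respondent characteristics before hospital means are compared. This is a simplified linear illustration with hypothetical data; the actual Child HCAHPS model enters the variables listed above categorically.

```python
import numpy as np

def casemix_adjusted_means(scores, covariates, hospital_ids):
    """Regress out respondent covariates, then report each hospital's score
    as (grand mean + mean residual for that hospital)."""
    X = np.column_stack([np.ones(len(scores)), covariates])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    residuals = scores - X @ beta
    return {
        h: scores.mean() + residuals[hospital_ids == h].mean()
        for h in np.unique(hospital_ids)
    }

# Hypothetical example: scores driven entirely by one respondent covariate
# (say, parent education coded 0/1) that is unevenly distributed across
# two hospitals. After adjustment, the hospitals score identically: the
# apparent gap was entirely attributable to case mix.
scores = np.array([0.0, 0.0, 2.0, 2.0])
education = np.array([[0.0], [0.0], [1.0], [1.0]])
hospitals = np.array(["A", "A", "B", "B"])
print(casemix_adjusted_means(scores, education, hospitals))
```

In practice the adjustment shifts scores only to the extent that a covariate both predicts responses and is unevenly distributed across hospitals, which is why child global health status and parent education mattered most here.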
After analyzing the national field test and proposing draft composite measures, we conducted an additional 23 cognitive interviews with parents to evaluate the understandability and validity of measure concepts and of measure labels used to describe each measure (eg, “privacy with providers”). End-user testing occurred in 2 rounds in Atlanta and Washington, DC. Based on this testing, measure labels were modified to improve their understandability. As a result of feedback from end-user testing, 3 measures—communication, patient safety, and pain—were prioritized as most important to a majority of parents.
We considered options for combining areas of communication (eg, combining nurse and doctor communication with children or combining child and parent communication with nurses) because of psychometric testing, stakeholder input, and the desire to simplify the survey results. However, end-user testing revealed that parents instead preferred to see each of the aspects of communication reported as separate measures (eg, nurse–parent communication, doctor–child communication). They perceived communication with doctors as different from communication with nurses and distinguished communication with the parent from communication with the child. End-user testing also indicated that organizing measures into categories for reporting helped ease the cognitive burden of examining a long list of measures. Participants overwhelmingly expressed a preference for the use of categories, which enabled them to focus on the grouping that was most important to each of them.
Based on the quantitative analyses and end-user testing, the final Child HCAHPS instrument has 62 items, including 39 patient experience items, 10 screeners, 12 demographic/descriptive items, and 1 open-ended item. The 18 composite and single-item measures are categorized into 5 overarching groups: communication with parent, communication with child, attention to safety and comfort, hospital environment, and hospital rating. Table 4 shows the finalized wording for the Child HCAHPS items, including the response scales and any screener questions that precede the item.
Child HCAHPS is a publicly available, validated instrument of pediatric inpatient family experience of care. Patient experience is linked to other quality measures and health outcomes, including mortality, readmission rates, and clinical processes of care. Child HCAHPS was developed through standardized methodology and national testing in accordance with CAHPS development principles, and the Child HCAHPS measures have recently been endorsed by the National Quality Forum.28 Adult HCAHPS has become the national standard for how hospitals measure and publicly report adult patient experience. Child HCAHPS provides a new tool to assess and benchmark hospital performance, both within and across hospitals, and has the potential to become a national standard.
We thank the CAHPS Consortium for their support and guidance throughout the development of Child HCAHPS. In particular, we thank Julie A. Brown, BA, Paul D. Cleary, PhD, Ron D. Hays, PhD, Lise Ribowski, MBA, and Carla L. Zema, PhD. We thank Susan M. Shaw, MSN, MS, RN, and Patricia A. Branowicki, PhD, RN, NEA-BC, for their review of our items and Stephanie Wagner, BA, for her support in preparing the manuscript.
We also thank the staff of the CEPQM at Boston Children’s Hospital, members of CEPQM’s Scientific Advisory Board and National Stakeholder Panel, and members of the Massachusetts Child Health Quality Coalition. Last, we thank the participants in our focus groups, cognitive interviews, and field tests and all the others who contributed to the development and testing of the Child HCAHPS survey.
- Accepted May 20, 2015.
- Address correspondence to Sara L. Toomey, MD, MPhil, MPH, MSc, Division of General Pediatrics, Boston Children’s Hospital, 300 Longwood Ave, Boston, MA 02115. E-mail:
All authors participated in the development of Child HCAHPS and reviewed analyses; Dr Toomey drafted the initial manuscript; Drs Zaslavsky, Elliott, Gallagher, Fowler, Shulman, and Schuster and Mr Klein reviewed and revised the manuscript; and all authors approved the final manuscript as submitted.
The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality.
FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.
FUNDING: Support for this work was provided by the U.S. Department of Health and Human Services Agency for Healthcare Research and Quality and Centers for Medicare and Medicaid Services, Children's Health Insurance Program Reauthorization Act of 2009 (CHIPRA) Pediatric Quality Measures Program Centers of Excellence, under grant number U18 HS 020513 (PI, Dr Schuster). This work was also conducted with the support of a KL2/Catalyst Medical Research Investigator Training award (an appointed KL2 award) from Harvard Catalyst/The Harvard Clinical and Translational Science Center (National Center for Research Resources and the National Center for Advancing Translational Sciences, National Institutes of Health Award KL2 TR001100).
POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
- National Quality Forum. Prioritizing Measure Gaps: Person-Centered Care and Outcomes. Available at: www.qualityforum.org/Prioritizing_Measure_Gaps_-_Person-Centered_Care_and_Outcomes.aspx. Accessed June 6, 2015
- National Quality Forum. MAP Families of Measures: Safety, Care Coordination, Cardiovascular Conditions, Diabetes. Washington, DC: NQF; 2012. Available at: http://www.qualityforum.org/Publications/2012/10/MAP_Families_of_Measures.aspx. Accessed May 29, 2015
- CMS. Hospital HCAHPS. Baltimore, MD: CMS. Available at: www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/Downloads/Hospital-HCAHPSFactSheet201007.pdf. Accessed June 6, 2015
- Medicare. Hospital Compare. Available at: www.medicare.gov/hospitalcompare/search.html. Accessed March 10, 2015
- Giordano LA, Elliott MN, Goldstein E, Lehrman WG, Spencer PA
- Goldstein E, Farquhar M, Crofton C, Darby C, Garfinkel S. Measuring hospital care from the patients’ perspective: an overview of the CAHPS hospital survey development process. Health Serv Res. 2005;40(6 pt 2):1977–1995
- Glickman SW, Boulding W, Manary M, et al
- Jaipaul CK, Rosenthal GE
- Berry JG, Ziniel SI, Freeman L, et al
- Doyle C, Lennox L, Bell D
- Mack JW, Hilden JM, Watterson J, et al
- Husson O, Mols F, van de Poll-Franse LV
- Anhang Price R, Elliott MN, Zaslavsky AM, et al
- Xu X, Buta E, Anhang Price R, Elliott MN, Hays RD, Cleary PD. Methodological considerations when studying the association between patient-reported care experiences and mortality [published online ahead of print December 7, 2014]. Health Serv Res. doi:10.1111/1475-6773.12264
- Elliott MN, Lehrman WG, Goldstein E, Hambarsoomian K, Beckett MK, Giordano LA
- Zaslavsky AM
- National Quality Forum. NQF-Endorsed Measures for Person- and Family-Centered Care. Washington, DC: NQF; 2015. Available at: http://www.qualityforum.org/Publications/2015/03/Person-_and_Family-Centered_Care_Final_Report_-_Phase_1.aspx. Accessed May 29, 2015
- Copyright © 2015 by the American Academy of Pediatrics