
Sustainable deimplementation of continuous pulse oximetry monitoring in children hospitalized with bronchiolitis: study protocol for the Eliminating Monitor Overuse (EMO) type III effectiveness-deimplementation cluster-randomized trial

Abstract

Background

Methods of sustaining the deimplementation of overused medical practices (i.e., practices not supported by evidence) are understudied. In pediatric hospital medicine, continuous pulse oximetry monitoring of children with the common viral respiratory illness bronchiolitis is recommended only under specific circumstances. Three national guidelines discourage its use for children who are not receiving supplemental oxygen, but guideline-discordant practice (i.e., overuse) remains prevalent. A 6-hospital pilot of educational outreach with audit and feedback resulted in immediate reductions in overuse; however, the best strategies to optimize sustainment of deimplementation success are unknown.

Methods

The Eliminating Monitor Overuse (EMO) trial will compare two deimplementation strategies in a hybrid type III effectiveness-deimplementation trial. This longitudinal cluster-randomized design will be conducted in Pediatric Research in Inpatient Settings (PRIS) Network hospitals and will include baseline measurement, active deimplementation, and sustainment phases. After a baseline measurement period, 16–19 hospitals will be randomized to a deimplementation strategy that targets unlearning (educational outreach with audit and feedback), and the other 16–19 will be randomized to a strategy that targets unlearning and substitution (adding an EHR-integrated clinical pathway decision support tool). The primary outcome is the sustainment of deimplementation in bronchiolitis patients who are not receiving any supplemental oxygen, analyzed as a longitudinal difference-in-differences comparison of overuse rates across study arms. Secondary outcomes include equity of deimplementation and the fidelity to, and cost of, each deimplementation strategy. To understand how the deimplementation strategies work, we will test hypothesized mechanisms of routinization (clinicians developing new routines supporting practice change) and institutionalization (embedding of practice change into existing organizational systems).

Discussion

The EMO trial will advance the science of deimplementation by providing new insights into the processes, mechanisms, costs, and likelihood of sustained practice change using rigorously designed deimplementation strategies. The trial will also advance care for a high-incidence, costly pediatric lung disease.

Trial registration

ClinicalTrials.gov, NCT05132322. Registered on November 10, 2021.

Background

Reducing the use of health interventions that are not supported by evidence is essential to maximize quality and value while minimizing harm, waste, and inefficiencies in health care [1, 2]. Medical overuse, as defined by the Institute of Medicine, is provision of care in the absence of a clear medical indication, or when the benefit does not outweigh the risk [3]. Overuse can be identified and measured when evidence-based guidelines specify conditions in which a practice is appropriate and also consider the balance between benefits and harms [4]. Deimplementation is the systematic and intentional reduction in overused practices that do not improve outcomes [5, 6]. In recent years, experts have called for deimplementation research to identify the best strategies for minimizing low-value care delivery, including in pediatrics [7, 8].

To investigate useful strategies to deimplement medical overuse, we focus on inpatient pediatric treatment of viral bronchiolitis (“bronchiolitis”), a common acute lung disease caused by a respiratory viral infection in children under 2 years old [9,10,11]. In the USA, bronchiolitis leads to over 100,000 hospitalizations annually [12]. Historically, this has occurred in a seasonal pattern, with most cases occurring between December and March [13]. Treatment typically includes feeding support, nasal suctioning, and in some situations supplemental oxygen [11]. Bronchiolitis patients are often continuously monitored with pulse oximetry (SpO2) despite evidence that it does not improve outcomes if used during periods of hospitalization when the patient is “in room air,” or not receiving supplemental oxygen [14]. Rather, in those patients, continuous SpO2 monitoring may identify brief, self-limited desaturations that do not require treatment and do not affect patient outcomes [15]. Overuse of continuous SpO2 monitoring is associated with increased oxygen administration, prolonged length of stay, unnecessary monitor alarms that can generate alarm fatigue, and increased costs [16,17,18]. Two clinical trials have demonstrated that intermittent SpO2 measurement is an equally safe alternative to continuous SpO2 monitoring for children in room air [19, 20], and three sets of national guidelines now discourage the use of continuous SpO2 monitoring in hospitalized children with bronchiolitis who are in room air [11, 21, 22]. Despite the evidence and guidelines, continuous SpO2 monitoring continues to be overused in hospitalized children with bronchiolitis, making it a prime target for deimplementation.

To prepare for the clinical trial outlined in this protocol, members of this authorship group conducted several preliminary studies. First, we conducted a 56-hospital, 3612-patient observational study of SpO2 monitoring and found that across all hospitals at baseline, 46% of bronchiolitis patients in room air were continuously SpO2-monitored, discordant with guidelines [14]. Furthermore, we found strikingly wide variation between hospitals, ranging from 2 to 92%, which was not attributable to differences in patient populations. This variation suggests that achieving guideline-concordant care is feasible, but the degree of success may be related to contextual factors. Second, we conducted qualitative interviews with clinicians and administrators from 12 hospitals to understand barriers to deimplementation, guided by the Consolidated Framework for Implementation Research [23, 24]. Key barriers included educational gaps, lack of clear instructions about when to monitor, and culture that normalizes monitoring. Third, we convened 39 stakeholders from 15 hospitals to develop deimplementation strategies using implementation mapping, a systematic approach to identifying strategies to address identified needs [25]. Applying Helfrich’s Dual Process Theory-Based Model for Deimplementation (described in detail under the “Frameworks and mechanisms” section below) [26], we categorized strategies into the following categories: (a) unlearning (educational outreach with audit and feedback [A&F]) and (b) substitution (replacing continuous monitoring with intermittent measurement, supported by a clinical pathway integrated into the electronic health record [EHR]). Fourth, we performed a 6-hospital, single-arm pilot trial of educational outreach with A&F, using historical controls. Each hospital improved compared to baseline, with mean rates of guideline-discordant care decreasing from 53 to 23% [27]. More than 90% of participating nurses and physicians also rated education and A&F to be feasible and acceptable deimplementation strategies [27]. Although the pilot trial showed immediate short-term success with deimplementation of unnecessary SpO2 monitoring, sustainment can be challenging; a systematic review of implementation studies found that less than half of the studies measuring sustainability outcomes reported successful sustainment more than 1 year after the initiatives to change practice [28]. Thus, further study of strategies that lead to successful sustainment is needed [29], especially given that no studies have examined the sustainment of deimplementation practice changes in pediatric hospital settings.

Building on this body of evidence, we will conduct the Eliminating Monitor Overuse (EMO) SpO2 trial, with study arms rooted in what we have learned from our observational studies and pilot trial. The trial will test the effects of an unlearning only strategy (educational outreach with A&F) compared to an unlearning + substitution strategy (educational outreach, A&F, and an EHR-integrated clinical pathway to encourage alternate recommended monitoring approaches) on the sustainment of deimplementation of SpO2 monitoring in children with bronchiolitis who are in room air. The trial will allow us to determine if the EHR-integrated clinical pathway enhances sustainment by continuing to support practice change, with a focus on its effects 1 year after withdrawal of educational outreach with A&F, and will offer insight into implementation strategy mechanisms. By focusing on an outcome (sustainment of deimplementation) that is highly relevant to both clinical practice and implementation science, we expect that trial results will have both clinical and scientific significance.

Methods/design

This manuscript adheres to the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) [30] and the CONSORT extension for cluster randomized trials [31] (Additional files 3 and 4).

Trial management and protection of human subjects

The trial will be led by two principal investigators, CPB and RSB. Central management and regulatory coordination of the trial will be led by CRB and KA. Oversight of study operations and science will be provided by the Steering Committee of Co-Investigators, comprised of the remainder of the authors, plus two representatives from the National Heart, Lung, and Blood Institute: Aruna Natarajan, Program Director, Pediatric Lung Disease and Critical Care, and Karen Bienstock, Clinical Trials Specialist. The Data Coordinating Center will be housed at the Clinical Research Computing Unit at the University of Pennsylvania. Data entry will be done using Research Electronic Data Capture (REDCap) [32], with data validation checks performed in the electronic forms in real time and data quality check queries conducted by the Data Coordinating Center weekly. The Analytic Core will be housed at the Data Science and Biostatistics Unit at Children’s Hospital of Philadelphia. Data Use Agreements have been established with each participating site to regulate data flow and confidentiality procedures. A Data and Safety Monitoring Board (DSMB) has also been convened (Charter in Additional file 1). Site PIs at each participating site will provide direct oversight of local research activities.

This study was approved by the Institutional Review Board (IRB) at Children’s Hospital of Philadelphia. Prior to study commencement at each participating site, each US site established an IRB reliance agreement with Children’s Hospital of Philadelphia’s IRB using an electronic reliance platform. The Canadian site obtained local Research Ethics Board (REB) approval independently.

Aims and hypotheses

This hybrid type III effectiveness-deimplementation cluster-randomized clinical trial includes three specific aims.

In aim 1, we will compare the effects of an unlearning only strategy (educational outreach with A&F) versus an unlearning + substitution strategy (educational outreach with A&F + an EHR-integrated clinical pathway) on the primary outcome of deimplementation sustainment and secondary outcomes at the hospital level, including equity, fidelity, and cost. Compared to the unlearning only strategy, we hypothesize that the unlearning + substitution strategy will result in better sustainment.

In aim 2, we will identify deimplementation strategy mechanisms linked to deimplementation outcomes using mixed methods, including questionnaires and qualitative interviews. Our mechanistic hypothesis is that the unlearning + substitution strategy will result in better deimplementation sustainment compared to the unlearning only strategy because the EHR-integrated clinical pathway will generate better routinization (clinicians developing new routines supporting practice change) and institutionalization (the organization embedding practice change into existing systems) of guideline-concordant care [33].

In aim 3, we will examine the effects of deimplementation on clinical outcomes and unintended consequences. We hypothesize that increased deimplementation penetration (i.e., a reduction in overuse of continuous monitoring) will be associated with decreased length of hospital stay for bronchiolitis. We will also perform active surveillance for underuse of continuous SpO2 monitoring in severely ill bronchiolitis patients as a potential unintended consequence of deimplementation.

Trial overview

As shown in Fig. 1, this hybrid type III effectiveness-deimplementation trial [34] with a longitudinal cluster-randomized design includes three main phases (baseline, active deimplementation, and sustainment). The unit of clustering is at the hospital level. Given the typical seasonal pattern of bronchiolitis, we originally designed each phase of the trial to take place during one of three winter periods (December–March). However, the COVID-19 pandemic and associated practices to reduce spread altered the typical seasonal pattern of viruses that cause bronchiolitis. Therefore, post-award, we revised the trial design, unlinking study phases from specific seasons with specific inter-phase durations described below.

Fig. 1 Trial diagram

Phase 1 (baseline, or P1)

We will measure baseline rates of overuse (guideline-discordant monitoring) in approximately 45 hospitals (see Setting). Based on this, we will exclude hospitals with data collection challenges (i.e., fewer than 15 patients observed for the presence or absence of continuous SpO2 monitoring overuse), those with low rates of overuse (i.e., less than 20%), and those with other feasibility challenges that preclude further participation. Hospitals may also elect not to continue in the trial after the baseline phase. We anticipate that this will result in 32–38 randomizable hospitals. If more than 38 hospitals remain after those exclusions, we will then randomize the 38 hospitals with the highest baseline rates of overuse. P1 was originally designed to occur over a 4-month period; however, due to low numbers of bronchiolitis patients in the first winter attributable to a seasonal shift in the incidence of respiratory viral disease (most notably the respiratory syncytial virus, RSV) [35], P1 was extended to a 7-month duration.

In the interim between P1 and P2, hospitals will have 6 months to prepare for the deimplementation strategy rollout. This interim period may be extended beyond 6 months at the discretion of the Steering Committee under any of the following conditions: (a) ≥20% of sites in either arm are unprepared to start active deimplementation, (b) national RSV percent positivity is <2%, or (c) for other reasons, with DSMB and NHLBI approval.

Phase 2 (active deimplementation, or P2)

During this 4-month phase, deimplementation strategies will be deployed in the hospitals and overuse of continuous SpO2 monitoring will be simultaneously re-measured. At the end of P2, unlearning (educational outreach with A&F) will be withdrawn from both arms.

The interim between P2 and P3 is a washout period that will last a minimum of 6 months and may be extended beyond 6 months at the discretion of the Steering Committee under any of the following conditions: (a) national RSV percent positivity is <2%, (b) the 4-month proposed phase 3 would include the month of July (coinciding with the arrival of new pediatric residents, a key stakeholder group), or (c) for other reasons, with DSMB and NHLBI approval.

Phase 3 (sustainment, or P3)

During this 4-month phase, the EHR-integrated pathway will be maintained exclusively in the unlearning + substitution arm. There will be no educational outreach or A&F in either arm. Overuse of continuous SpO2 monitoring will be re-measured and the primary outcome (deimplementation sustainment) will be contrasted between arms.

Figure 2 provides a CONSORT diagram.

Fig. 2 CONSORT diagram

Trial setting and hospital eligibility criteria

The trial will be conducted within Pediatric Research in Inpatient Settings (PRIS) Network hospitals [36]. PRIS is a 117-hospital research network whose Executive Council has experience leading high-impact studies of hospital care in children [37,38,39,40] and effectively sets the agenda for pediatric hospital medicine research nationally [10]. Sites participating in the EMO trial are listed on clinicaltrials.gov.

PRIS hospitals in the USA and Canada are eligible to participate in the trial. We will exclude sites that participated in our prior EMO observational study [14] but failed to collect sufficient data in that study to be included in the final analysis and/or had low baseline overuse.

Study populations

Children with bronchiolitis

Our patient population includes children aged 2–23 months with bronchiolitis who are hospitalized on non-intensive care, non-emergency department, non-step-down inpatient units at participating hospitals. Bronchiolitis must be their primary diagnosis, and they must be cared for by a generalist inpatient service. Children with major comorbidities, those with COVID-19, and those born prior to 28 weeks' gestation will be excluded.

Parents or guardians of bronchiolitis patients

A subset of parents or guardians of bronchiolitis patients who were treated on a study hospital unit will participate in qualitative interviews. Interviews will focus on those who received treatment during the most recent study phase. Recruitment details are provided in the “Qualitative mechanistic measures” section below.

Hospital staff

A subset of hospital staff will complete study questionnaires and participate in qualitative interviews. Recruitment details are provided in the “Quantitative mechanistic measures” and “Qualitative mechanistic measures” sections below.

Frameworks and mechanisms

As noted, Helfrich’s Dual Process Theory-Based Model for Deimplementation forms the theoretical basis for our experimental design [26]. Dual Process Theory specifies two types of reasoning underlying decisions: type 1 reasoning is fast and intuitive [41, 42], and type 2 reasoning is analytical, slow, and resource-intensive [41, 42]. Helfrich’s framework separates deimplementation strategies into approaches that target each type of reasoning. Unlearning the ineffective practice using knowledge-based methods engages type 2 reasoning (e.g., presenting clinicians with evidence and guidelines, conducting A&F), whereas substituting the ineffective practice with an alternative practice supports type 1 reasoning (e.g., using EHR-integrated clinical pathways). Helfrich’s model also highlights the importance of psychological reactance (a combination of negative emotion and cognition) that can occur in response to deimplementation efforts when freedom, in this case clinical autonomy, is perceived to be threatened [43].

We also draw on Slaghuis’s Framework for Sustainability of Work Practices, which complements Helfrich’s model [33] and posits that sustainment requires (a) routinization, whereby clinicians develop new routines such that the practice change becomes part of their everyday work, and (b) institutionalization, whereby the organization embeds the practice into its existing systems and structures via clinical protocols, policies, or pathways. In combination, these models explain the mechanisms behind our hypothesis that adding the substitution approach to the unlearning approach will result in higher sustainment of deimplementation gains: the EHR-integrated clinical pathway will harness type 1 reasoning and support routinization and institutionalization of the practice change, and these processes are hypothesized to mediate the relationship between the EHR-integrated clinical pathway and deimplementation sustainment outcomes [33].

Deimplementation strategies

All deimplementation strategies are assigned and delivered at the cluster (hospital) level.

Educational outreach (both trial arms)

Educational outreach to clinicians will focus on communicating core messages to staff, including the national guidelines, the evidence and rationale underlying the guidelines, and talking points to use if parents ask about monitoring, using language adapted from a parent-focused intervention [44]. Educational outreach will include several formats: (1) in-person sessions on each participating unit of each hospital prior to the start of active deimplementation, delivered in site-specific forums, with refresher sessions monthly for the remainder of the active deimplementation phase; (2) locally adapted handouts and posters; and (3) short educational videos and messaging distributed by email.

Audit and feedback (both trial arms)

A&F will follow our successful pilot study methods [27] and will include two levels: (1) weekly unit-level feedback and (2) real-time, clinician-level, inquiry-based feedback. Each week, we will compute the prior week’s percentage of bronchiolitis patients in room air who were inappropriately monitored continuously at the hospital and unit level and distribute these data to sites in the form of a visual dashboard that includes comparisons over time and between hospitals. Site PIs will then share the dashboards locally with clinicians in person (e.g., during staff meetings) and via email. Real-time 1:1 feedback will occur during clinical care; when collecting data on individual patients (as described below), data collectors encountering monitor overuse—continuous monitoring in a patient not receiving supplemental oxygen—are empowered to ask any available clinician responsible for that patient’s care, in a nonjudgmental way, about the indications for monitoring that patient.
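To illustrate the weekly unit-level feedback computation described above, the sketch below aggregates hypothetical observation-level records into overuse percentages by hospital and unit. The column names and data structure are illustrative assumptions, not the trial's actual data dictionary.

```python
import pandas as pd

# Hypothetical observation-level records: one row per bronchiolitis patient in
# room air observed during a data collection round (illustrative fields only).
obs = pd.DataFrame({
    "week": ["2024-01-08"] * 6,
    "hospital": ["A", "A", "A", "B", "B", "B"],
    "unit": ["5E", "5E", "5W", "3N", "3N", "3N"],
    "continuously_monitored": [1, 0, 1, 0, 0, 1],  # 1 = guideline-discordant monitoring
})

# Prior week's percentage of room-air bronchiolitis patients who were
# inappropriately monitored, summarized at the unit level within each hospital.
weekly = (
    obs.groupby(["week", "hospital", "unit"])["continuously_monitored"]
       .agg(n_observed="size", n_overuse="sum")
       .assign(pct_overuse=lambda d: 100 * d["n_overuse"] / d["n_observed"])
       .reset_index()
)
print(weekly)
```

Hospital-level percentages could be produced the same way by dropping "unit" from the grouping, and the resulting table would feed the visual dashboard shared with sites.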

Clinical pathway integrated into the EHR (substitution trial arm only)

The substitution strategy includes a clinical pathway integrated into the EHR, to guide clinicians step-by-step through guideline-concordant monitoring practices [45]. During year 1 of the trial, clinical stakeholders will participate in a guideline-to-pathway translation exercise. Based on the existing guidelines, the new pathway will clearly specify (a) situations when it is appropriate to initiate intermittent SpO2 measurement (the alternative practice) instead of continuous SpO2 monitoring and (b) when it is appropriate to discontinue continuous SpO2 monitoring and transition to intermittent SpO2 measurement.

Since integrating pathways into an EHR is a form of clinical decision support, we will incorporate the “Five Rights” of clinical decision support, which aim to ensure delivery of (1) the right information, (2) to the right people, (3) in the right intervention format, (4) through the right channels, (5) at the right point in the workflow (Table 1) [46, 47]. This will facilitate a standard approach to EHR integration while also allowing flexibility in format to encourage maximum feasibility and fit with local workflow [48, 49]. Following randomization, each site assigned to the unlearning + substitution arm will be matched with an EHR integration “coach” drawn from the Pediatric Clinical Decision Support Collaborative [50]. Each coach will facilitate integration of the clinical pathway into the local EHR by liaising directly with the Site PI, local clinicians, and informatics staff to ensure decision support that is aligned with the guiding principles in Table 1 is in place, on time, and within local capabilities.

Table 1 Guiding principles for EHR integration

Randomization

Hospitals eligible for randomization based on baseline measurement results will be cluster-randomized to either the unlearning only (anticipated n=19) or unlearning + substitution (anticipated n=19) arm. We will use covariate-constrained randomization methods [51] to achieve optimal balance between arms for three important hospital characteristics: (1) hospital type (freestanding children’s hospitals vs. general or community hospitals), (2) presence of pre-existing EHR clinical decision support for bronchiolitis that promotes the use of intermittent “spot checks” instead of continuous pulse oximetry in patients not requiring supplemental oxygen, and (3) baseline overuse rate. Randomization and assignment to study arms will be conducted by the Analytic Core overseen by the lead biostatistician (RX).
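One common way to operationalize covariate-constrained randomization is to enumerate candidate allocations, retain the best-balanced subset on the constraint variables, and then select the final allocation at random from that subset. The sketch below illustrates this general approach with hypothetical hospital data and a simple imbalance score; it is not the Analytic Core's actual procedure.

```python
import itertools
import random
import statistics

# Hypothetical hospital-level covariates: (id, freestanding children's hospital,
# pre-existing bronchiolitis CDS in the EHR, baseline overuse rate).
hospitals = [
    ("H01", 1, 0, 0.62), ("H02", 0, 1, 0.48), ("H03", 1, 1, 0.71), ("H04", 0, 0, 0.35),
    ("H05", 1, 0, 0.55), ("H06", 0, 1, 0.44), ("H07", 1, 0, 0.68), ("H08", 0, 0, 0.29),
]

def imbalance(arm1, arm2):
    """Sum of absolute between-arm differences in covariate means
    (in practice, covariates would typically be standardized or weighted)."""
    score = 0.0
    for j in (1, 2, 3):  # hospital type, pre-existing CDS, baseline overuse rate
        score += abs(statistics.mean(h[j] for h in arm1) -
                     statistics.mean(h[j] for h in arm2))
    return score

# Enumerate all equal-size allocations, keep the best-balanced subset,
# then choose the final allocation at random from that constrained set.
ids = list(range(len(hospitals)))
allocations = []
for arm1_idx in itertools.combinations(ids, len(ids) // 2):
    arm1 = [hospitals[i] for i in arm1_idx]
    arm2 = [hospitals[i] for i in ids if i not in arm1_idx]
    allocations.append((imbalance(arm1, arm2), arm1_idx))

allocations.sort(key=lambda x: x[0])
constrained_set = allocations[: max(1, len(allocations) // 10)]  # best-balanced 10%
chosen_score, chosen_arm1 = random.choice(constrained_set)
print("Unlearning + substitution arm:", [hospitals[i][0] for i in chosen_arm1])
```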

Equitable deimplementation

We recognize that efforts to change clinical practice have the potential to inadvertently increase inequities [52]. Throughout the trial, our Data Coordinating Center will perform ongoing surveillance for signals in the data that may suggest hospital- or study-level inequities in deimplementation, with a focus on patient sex, race, and ethnicity (primarily contrasting non-Hispanic white with non-Hispanic Black and Hispanic patients), and preferred language of the patient’s family (primarily contrasting families who report a preference to communicate about their child’s health in a language other than English versus those who prefer English). If clinically significant signals are identified at any point, we will meet with the site PIs at affected hospitals promptly to discuss possible underlying reasons for the disparities and to develop mitigation plans with input from the study’s DSMB and Steering Committee [52].

Study measures, procedures, and analysis

Deimplementation measures

Deimplementation sustainment/penetration

The primary outcome of deimplementation sustainment will be assessed as a longitudinal difference-in-differences in deimplementation penetration, or the extent to which the overused continuous SpO2 monitoring practice has been discontinued [53]. This will be captured by analyzing the change in the percentage of bronchiolitis patients who are in room air but are continuously SpO2-monitored across the 3 study phases. Because initiation and discontinuation decisions may differ from one another, we will assess penetration in two distinct categories of patients in room air: (1) those who never required supplemental oxygen and (2) those who previously did but subsequently stabilized. We will observe continuous SpO2 monitoring in order to measure this outcome, as we have done successfully in prior studies [14, 27], given that our prior research has shown that analyzing orders for monitoring does not accurately capture actual monitoring status [54]. Research staff at each hospital will perform cross-sectional observational data collection rounds during each phase. During these data collection rounds, trained research staff walk to the units of all eligible children with bronchiolitis and determine the continuous monitoring status of each patient based on visual examination of waveforms displayed on the monitor in each patient’s room or at a central monitoring station. In hospitals with direct integration of the monitors within the EHR or remote monitor viewing systems, visual examination of waveforms or parametric data may be performed using those platforms.

The primary analysis will be based on the intention-to-treat principle, with a secondary per-protocol analysis. We will analyze deimplementation sustainment as a longitudinal difference-in-differences comparison between study arms of the change in deimplementation penetration between the baseline phase (P1) and the sustainment phase (P3, after withdrawal of educational outreach with A&F), expressed as (P3-P1 | Arm 2) - (P3-P1 | Arm 1). We will use generalized hierarchical mixed-effects models with logit link for longitudinal binary outcome data. To account for differences in patient-level factors, we will adjust for patient age, gestational age, time since weaning from supplemental oxygen, presence of an enteral feeding tube, and whether data were captured during an overnight shift.
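As a rough illustration of the modeling approach, the sketch below fits a logistic mixed model with hospital-level random intercepts to simulated patient-level data, using statsmodels' Bayesian binomial mixed GLM as a convenient stand-in for the frequentist generalized hierarchical model described above. The variable names, simulated effects, and covariates are assumptions for illustration only; the coefficient on the arm-by-phase interaction at P3 corresponds to the difference-in-differences contrast (P3-P1 | Arm 2) - (P3-P1 | Arm 1).

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n = 3000

# Simulated patient-level observations (one row per observed room-air patient).
hospital = rng.integers(0, 24, n)
df = pd.DataFrame({
    "hospital": hospital.astype(str),
    "arm": (hospital >= 12).astype(int),   # 0 = unlearning only, 1 = unlearning + substitution
    "phase": rng.choice(["P1", "P2", "P3"], n),
    "overnight": rng.integers(0, 2, n),    # observed during an overnight shift
})

# Hypothetical true model: lower overuse in the substitution arm during P3.
lin_pred = -0.2 + 0.3 * df["overnight"] - 1.0 * ((df["phase"] == "P3") & (df["arm"] == 1))
df["overuse"] = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

# Logistic mixed model with a random intercept for hospital; the arm:phase
# interaction captures the between-arm difference in change from P1 to P3.
model = BinomialBayesMixedGLM.from_formula(
    "overuse ~ arm * C(phase) + overnight",
    vc_formulas={"hospital": "0 + C(hospital)"},
    data=df,
)
fit = model.fit_vb()  # variational Bayes approximation
print(fit.summary())
```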

Sample size calculation based on deimplementation sustainment

The trial’s overall power analysis is based upon the primary outcome (deimplementation sustainment). The sample size is primarily driven by the number of hospitals, the within-hospital correlation over time, and the variation across hospitals. The degree of correlation can be expressed as either the intra-cluster correlation coefficient or the between-cluster coefficient of variation [55]. While the two approaches are equally valid, we have used the between-cluster coefficient of variation method in our calculations because it is more flexible and is more readily understood [55, 56]. Based on our preliminary studies, we estimate 50% deimplementation penetration (i.e., 50% overuse) at baseline (in P1). With 2-sided alpha = .05, moderate within-hospital correlation across phases of 0.6, and moderate to high between-hospital standard deviation of 15 percentage points, we will have 80% power to detect a difference of 16 percentage points between study arms if 24 total hospitals (12 per arm) complete the active deimplementation and sustainment phases (P2 and P3). Challenges related to the COVID-19 pandemic and disruption of the normal bronchiolitis seasonality prompted us to take a conservative approach to choosing the number of sites to randomize, accounting for the potential for unexpectedly high dropout between randomization and the end of P3. If 38 hospitals are randomized, our calculations allow for 37% dropout over the course of the trial.
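For reference, Hayes and Bennett's formula for an unmatched cluster-randomized comparison of two proportions uses the between-cluster coefficient of variation k. The sketch below is a simplified, cross-sectional illustration of that cited approach with loosely protocol-inspired inputs (the per-hospital patient count is a hypothetical assumption); it does not incorporate the longitudinal within-hospital correlation used in the trial's actual power calculation and is not expected to reproduce it.

```python
from scipy.stats import norm

def clusters_per_arm(p0, p1, n_per_cluster, k, alpha=0.05, power=0.80):
    """Hayes & Bennett (1999) sample size for an unmatched cluster-randomized
    comparison of two proportions, where k is the between-cluster coefficient
    of variation of the true cluster-level proportions."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    within = p0 * (1 - p0) / n_per_cluster + p1 * (1 - p1) / n_per_cluster
    between = k**2 * (p0**2 + p1**2)
    return 1 + z**2 * (within + between) / (p0 - p1) ** 2

# Loosely protocol-inspired inputs: ~50% overuse vs. a 16-percentage-point lower
# rate, a between-hospital SD of 15 points (k = 0.15 / 0.50 = 0.30), and a
# hypothetical 30 observed patients per hospital per phase.
print(round(clusters_per_arm(p0=0.50, p1=0.34, n_per_cluster=30, k=0.30), 1))
```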

Other deimplementation measures

Acceptability among nurses and physicians will be captured using the Acceptability of Intervention Measure (AIM) during the active deimplementation phase [57]. Site PIs will each facilitate the distribution of questionnaires to nurses and physicians who provide care for bronchiolitis patients on the units participating in the study, as well as to hospital administrators. Deimplementation fidelity will be captured as the extent to which educational outreach, A&F, and the EHR-integrated pathways are performed per protocol during active deimplementation. Fidelity data for educational outreach with A&F will be extracted from intervention logs maintained by Site PIs (e.g., to capture whether meetings happened as planned). Fidelity data for the EHR-based clinical decision support tool will be captured using local EHR screenshots taken during the active deimplementation phase, which will be evaluated for alignment of the actual EHR interface with each guiding principle in the “Five Rights” framework. This fidelity assessment will focus on function rather than form, given that form is expected to vary. The cost of delivering each strategy will be assessed during the active deimplementation and sustainment phases using the time-driven activity-based costing method [58].
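Time-driven activity-based costing estimates strategy cost as the time each activity consumes multiplied by the cost rate of the personnel who perform it. Below is a minimal sketch with entirely hypothetical activities, durations, and rates.

```python
# Hypothetical activity log for one hospital-month of the active deimplementation
# phase: (activity, minutes spent, hourly cost rate of the personnel involved).
activities = [
    ("Educational outreach session",     60, 95.0),  # physician educator
    ("Monthly refresher session",        30, 95.0),
    ("Assemble weekly A&F dashboard",    45, 55.0),  # research coordinator
    ("Share dashboard with unit staff",  20, 80.0),  # nurse leader
]

# Cost = time consumed by each activity x capacity cost rate of its personnel.
total_cost = sum(minutes / 60 * hourly_rate for _, minutes, hourly_rate in activities)
print(f"Estimated strategy delivery cost: ${total_cost:,.2f}")
```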

Quantitative mechanistic measures

Using the same distribution methods and participants as described for the acceptability questionnaire above, we will administer questionnaires to assess hypothesized mediators and moderators. We will distribute the Slaghuis Measurement Instrument for Sustainability of Work Practices [33] to eligible clinical staff to assess potential mediators at two time points: following randomization and again in the final month of the sustainment phase, when we would expect the hypothesized mechanisms associated with sustainment to have occurred. The instrument assesses two closely related but conceptually distinct processes: routinization, in which clinicians develop new routines such that the practice change becomes part of their everyday work, and institutionalization, in which the organization embeds the practice change into its existing systems and structures. We will capture potential moderators during the active deimplementation phase, using the Implementation Climate Scale to understand whether hospital clinicians and staff perceive that they are expected, supported, and rewarded for deimplementation of continuous SpO2 monitoring [59] and the Implementation Leadership Scale to understand leader behaviors with regard to SpO2 monitoring [60]. We will also measure psychological reactance in clinicians using the same multiple choice instruments used in seminal reactance work to assess perceptions of threats to freedom in response to deimplementation messaging, emotional responses, and cognitive responses [61,62,63].

Mediation analysis will allow us to separate the direct effects of the exposure from effects that occur via an intermediate variable (indirect effects) [64]. For each outcome, we will perform separate mediation analyses for the routinization and institutionalization dimensions of the Slaghuis questionnaire [33]. Mediation will be tested using the product of coefficients approach [65,66,67], which we have used in previous studies [68,69,70]. In this approach, the total effect of the deimplementation strategy is parsed into direct and indirect effects. As shown in Fig. 3, path a represents the effect of the deimplementation strategy on the hospital-level mediators. Path b represents the effect of the hospital-level mediators on the outcomes. An unbiased estimate of the indirect mediated effect is derived via the product of the a and b paths [66, 67, 69]. Moderators (implementation climate, implementation leadership, and psychological reactance) will be tested separately by adding terms for each moderator and its interaction with the deimplementation strategy to the aim 1 models.

Fig. 3 Mediation model
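Below is a minimal sketch of the product-of-coefficients logic using ordinary least squares on simulated hospital-level data (the trial's actual analyses will account for the multilevel structure): path a regresses the mediator on the strategy assignment, path b regresses the outcome on the mediator while adjusting for the strategy, and the indirect effect is the product a × b. Variable names and effect sizes are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_hospitals = 38
arm = rng.integers(0, 2, n_hospitals)                       # 0/1 deimplementation strategy
routinization = 0.5 * arm + rng.normal(0, 1, n_hospitals)   # hypothetical mediator scores
sustainment = 0.4 * routinization + 0.1 * arm + rng.normal(0, 1, n_hospitals)

# Path a: effect of the strategy on the mediator.
a_model = sm.OLS(routinization, sm.add_constant(arm)).fit()
a = a_model.params[1]

# Path b: effect of the mediator on the outcome, adjusting for the strategy.
X = sm.add_constant(np.column_stack([routinization, arm]))
b_model = sm.OLS(sustainment, X).fit()
b = b_model.params[1]

print("Indirect (mediated) effect a*b =", a * b)
print("Direct effect of strategy      =", b_model.params[2])
```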

Qualitative mechanistic measures

Our qualitative inquiry aims to better understand mechanisms of practice change and potential effects on parents and guardians.

Hospital staff

Using a deviant case sampling approach [71, 72], we will conduct 48 semi-structured interviews with nurses and physicians who provide care to bronchiolitis patients in hospitals with the lowest and highest sustainment. Eligible clinicians will be identified at random from staff rosters and invited to participate in interviews to discuss their experiences related to the process of deimplementation and to explore mechanistic relationships between (a) the strategies, (b) quantitative study findings, and (c) sustainment.

Parents

We will conduct 15 semi-structured interviews with parents or guardians of children hospitalized with bronchiolitis who were found to be continuously SpO2-monitored while in room air during aim 1 data collection. Eligible parents or guardians will be identified at random from trial records during the sustainment phase and invited to complete a telephone interview within 4 weeks of discharge to explore their perceptions of, and reactions to, deimplementation of continuous SpO2 monitoring.

Qualitative analysis will follow an integrated approach using the Consolidated Framework for Implementation Research as a starting framework while also allowing new concepts to emerge and become part of the coding scheme [73]. Our approach to integrating qualitative data with the quantitative data from Aim 2 will follow a “QUAN → qual” structure, where the function is to expand upon the quantitative findings to understand strategy mechanisms and the stakeholder perspectives on deimplementation efforts, and where the process is connecting [74]. We will use the quantitative data to identify patterns in the qualitative data by entering quantitative findings into NVivo as attributes of each participant. These attributes will be used to compare important themes among subgroups.

Clinical measures

To examine the effects of deimplementation on clinical outcomes and unintended consequences, we will measure the primary clinical outcome of length of hospital stay in hours and the secondary clinical outcome of oxygen supplementation duration in hours among enrolled bronchiolitis patients. We will also collect additional data to capture any underuse of monitoring that could plausibly occur in response to deimplementation in patients with more severe disease [75]. We define underuse as failing to continuously monitor bronchiolitis patients receiving ≥2 L/min supplemental oxygen or flow (a marker of more severe disease) [20] and will measure it using the same observational data collection methods used for the primary outcome. We will also perform surveillance for additional unintended safety consequences: (1) code blue and rapid response team activations in bronchiolitis patients who were unmonitored at the time of the event and were subsequently found to be hypoxemic and (2) readmission of bronchiolitis patients to the hospital within 7 days of discharge with a finding of hypoxemia upon re-presentation to the emergency department. These outcomes will be extracted from charts and local patient safety databases.

In analyzing the clinical outcomes, hospital-level deimplementation penetration for each study phase will be the primary exposure variable. Hospital-level deimplementation penetration will be merged with patient-level length of stay and duration of oxygen supplementation. We will use generalized mixed-effects regression models to model length of stay and duration of oxygen supplementation, with hospital-specific random intercepts to account for within-hospital clustering [76]. We will examine the underuse of continuous SpO2 monitoring during each study phase as the percentage of patients with bronchiolitis observed receiving ≥2 L/min oxygen who are inappropriately unmonitored. We will analyze underuse longitudinally and by study arm using patient-level mixed-effects logistic regression models similar to those in the primary analysis.
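As a rough sketch of the clinical outcome analysis, the code below fits a linear mixed-effects model for length of stay with hospital-specific random intercepts, using simulated data and hypothetical variable names; the trial's actual models and covariates may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000

# Simulated patient-level data with hospital-level penetration merged in.
hospital = rng.integers(0, 24, n)
penetration_by_hospital = rng.uniform(0.3, 0.9, 24)  # hospital-phase deimplementation penetration
df = pd.DataFrame({
    "hospital": hospital.astype(str),
    "penetration": penetration_by_hospital[hospital],
    "age_months": rng.integers(2, 24, n),
})
df["los_hours"] = 60 - 20 * df["penetration"] + 0.5 * df["age_months"] + rng.normal(0, 12, n)

# Linear mixed-effects model: hospital-specific random intercepts account for
# within-hospital clustering of length of stay.
model = smf.mixedlm("los_hours ~ penetration + age_months", df, groups=df["hospital"])
print(model.fit().summary())
```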

Data sharing

After all participant enrollment has been completed, the Data Coordinating Center will prepare a final study database that has been stripped of identifiers for sharing. We will make the data available to users only under a data sharing agreement that provides for (1) a commitment to using the data only for research purposes and not to identify any individual participant, (2) IRB approval, (3) a commitment to securing the data using appropriate computer technology, and (4) a commitment to and an agreed-upon plan for destroying the data after analyses are completed. A plan to disseminate the findings is available in Additional file 2.

Discussion

To our knowledge, the Eliminating Monitor Overuse (EMO) SpO2 trial will be the first in the field of pediatric hospital medicine to use a hybrid type III design to evaluate the comparative utility of two active strategies targeting sustained deimplementation of an overused practice. This trial builds upon our prior work, which demonstrated that about half of hospitalized children with bronchiolitis are monitored unnecessarily but also established that clinical practice can be quickly aligned with guidelines using educational outreach with A&F. This trial will allow us to determine whether the short-term gains we observed in our pilot trial can be sustained over time and to compare alternative approaches to achieving sustainment that can inform the field of implementation science beyond the particular clinical focus of this trial.

Our study design has several strengths. First, we compare our combined strategy of educational outreach, A&F, and EHR-based clinical decision support to a common approach to clinical practice change and quality improvement in pediatric hospital medicine (educational outreach with A&F alone) [77,78,79], in keeping with both a National Heart Lung and Blood Institute Implementation Science Work Group conclusion that educational outreach and A&F are generally effective in improving outcomes [80], and with calls from experts to test A&F alone vs. A&F + co-interventions [81, 82]. Second, our chosen deimplementation strategies are based in theory (i.e., Helfrich’s Dual Process Theory-based deimplementation framework [26], Slaghuis’s Framework for Sustainability of Work Practices [33]) and consistent with evidence that multicomponent approaches have the greatest potential for success when aiming to reduce low-value care, that education is necessary but rarely sufficient, and that A&F and EHR-based clinical decision support approaches are the most promising strategies to address medical overuse [83,84,85]. In addition, our use of an EHR integration coach assisting each participating hospital goes beyond typical clinical decision support efforts and is meant to ensure that workflow prompts are optimized for the local context. Finally, we include family perspectives in our qualitative inquiry, in recognition of the fact that understanding their experiences is essential for achieving patient-centered care.

We also note several limitations. First, the pandemic and associated measures to reduce the spread of COVID-19 disrupted the well-established seasonal patterns of bronchiolitis disease [35], which led us to reconfigure study phases. The uncertainty of whether, and when, bronchiolitis will revert to being a disease confined to the winter months may threaten the ability to complete the trial as planned and may demand additional trial modifications. Second, it is possible that some hospitals in the unlearning only arm could develop EHR-based clinical decision support related to SpO2 monitoring during the trial period, leading to contamination between conditions, although this is discouraged.

In summary, the EMO SpO2 trial will advance the science of deimplementation, an understudied area of implementation science, by providing new insights from a pediatric research network into the processes, mechanisms, costs, and sustainment of rigorously designed deimplementation strategies. The trial will also advance pediatric hospital care for a high-incidence, costly pediatric lung disease that hospitalizes over 100,000 children annually.

Availability of data and materials

Not applicable.

Abbreviations

A&F:

Audit and feedback

AIM:

Acceptability of Intervention Measure

CONSORT:

Consolidated Standards of Reporting Trials

COVID-19:

Coronavirus disease 2019

DSMB:

Data and Safety Monitoring Board

EHR:

Electronic health record

EMO:

Eliminating Monitor Overuse

HIPAA:

Health Insurance Portability and Accountability Act of 1996

IRB:

Institutional Review Board

NHLBI:

National Heart, Lung, and Blood Institute

NIH:

National Institutes of Health

P1, 2, 3:

Study phase 1, 2, or 3

PI:

Principal investigator

PRIS:

Pediatric Research in Inpatient Settings

REB:

Research Ethics Board

REDCap:

Research Electronic Data Capture

RSV:

Respiratory syncytial virus

SPIRIT:

Standard Protocol Items: Recommendations for Interventional Trials

SpO2:

Pulse oximetry

US:

United States

References

  1. Norton WE, Chambers DA. Unpacking the complexities of de-implementing inappropriate health interventions. Implement Sci. 2020;15(1):2.

  2. Nilsen P, Ingvarsson S, Hasson H, von Thiele SU, Augustsson H. Theories, models, and frameworks for de-implementation of low-value care: a scoping review of the literature. Implement Res Pract. 2020;1:2633489520953762.

  3. Morgan DJ, Brownlee S, Leppin AL, Kressin N, Dhruva SS, Levin L, et al. Setting a research agenda for medical overuse. BMJ. 2015;351:h4534.

  4. Brownlee S, Chalkidou K, Doust J, Elshaug AG, Glasziou P, Heath I, et al. Evidence for overuse of medical services around the world. Lancet. 2017;390(10090):156–68.

  5. Prasad V, Ioannidis JP. Evidence-based de-implementation for contradicted, unproven, and aspiring healthcare practices. Implement Sci. 2014;9:1.

  6. Parsons Leigh J, Niven DJ, Boyd JM, Stelfox HT. Developing a framework to guide the de-adoption of low-value clinical practices in acute care medicine: a study protocol. BMC Health Serv Res. 2017;17(1):54.

  7. Wolf ER, Krist AH, Schroeder AR. Deimplementation in pediatrics: past, present, and future. JAMA Pediatr. 2021;175(3):230–2.

  8. Grimshaw JM, Patey AM, Kirkham KR, Hall A, Dowling SK, Rodondi N, et al. De-implementing wisely: developing the evidence base to reduce low-value care. BMJ Qual Saf. 2020;29:409–17.

  9. Hasegawa K, Tsugawa Y, Brown DFM, Mansbach JM, Camargo CA. Trends in bronchiolitis hospitalizations in the United States, 2000-2009. Pediatrics. 2013;132(1):28–36.

  10. Keren R, Luan X, Localio R, Hall M, McLeod L, Dai D, et al. Prioritization of comparative effectiveness research topics in hospital pediatrics. Arch Pediatr Adolesc Med. 2012;166(12):1155–64.

  11. Ralston SL, Lieberthal AS, Meissner HC, Alverson BK, Baley JE, Gadomski AM, et al. Clinical practice guideline: the diagnosis, management, and prevention of bronchiolitis. Pediatrics. 2014;134(5):e1474–502.

  12. Fujiogi M, Goto T, Yasunaga H, Fujishiro J, Mansbach JM, Camargo CA, et al. Trends in bronchiolitis hospitalizations in the United States: 2000–2016. Pediatrics. 2019;144(6):e20192614.

  13. Haynes AK, Prill MM, Iwane MK, Gerber SI. Respiratory syncytial virus — United States, July 2012–June 2014. MMWR Morb Mortal Wkly Rep. 2014;63(48):1133–6.

  14. Bonafide CP, Xiao R, Brady PW, Landrigan CP, Brent C, Wolk CB, et al. Prevalence of continuous pulse oximetry monitoring in hospitalized children with bronchiolitis not requiring supplemental oxygen. JAMA. 2020;323(15):1467–77.

  15. Quinonez RA, Coon ER, Schroeder AR, Moyer VA. When technology creates uncertainty: pulse oximetry and overdiagnosis of hypoxaemia in bronchiolitis. BMJ. 2017;358:j3850.

  16. Schroeder AR, Marmor AK, Pantell RH, Newman TB. Impact of pulse oximetry and oxygen therapy on length of stay in bronchiolitis hospitalizations. Arch Pediatr Adolesc Med. 2004;158(6):527.

  17. Unger S, Cunningham S. Effect of oxygen supplementation on length of stay for infants hospitalized with acute viral bronchiolitis. Pediatrics. 2008;121(3):470–5.

  18. Rasooly IR, Makeneni S, Khan AN, Luo B, Muthu N, Bonafide CP. The alarm burden of excess continuous pulse oximetry monitoring among patients with bronchiolitis. J Hosp Med. 2021;16(12):727–9.

  19. McCulloh R, Koster M, Ralston S, Johnson M, Hill V, Koehn K, et al. Use of intermittent vs. continuous pulse oximetry for nonhypoxemic infants and young children hospitalized for bronchiolitis: a randomized clinical trial. JAMA Pediatr. 2015;169(10):898–904.

  20. Mahant S, Wahi G, Bayliss A, Giglia L, Kanani R, Pound CM, et al. Intermittent vs. continuous pulse oximetry in hospitalized infants with bronchiolitis: a randomized clinical trial. JAMA Pediatr. 2021;175(5):466–74.

  21. Quinonez RA, Garber MD, Schroeder AR, Alverson BK, Nickel W, Goldstein J, et al. Choosing wisely in pediatric hospital medicine: five opportunities for improved healthcare value. J Hosp Med. 2013;8(9):479–85.

  22. Schondelmeyer AC, Dewan ML, Brady PW, Timmons KM, Cable R, Britto MT, et al. Cardiorespiratory and pulse oximetry monitoring in hospitalized children: a Delphi process. Pediatrics. 2020;146(2):e20193336.

  23. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  24. Wolk CB, Schondelmeyer AC, Barg FK, Beidas R, Bettencourt A, Brady PW, et al. Barriers and facilitators to guideline-adherent pulse oximetry use in bronchiolitis. J Hosp Med. 2021;16(1):23–30.

  25. Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation Mapping: using Intervention Mapping to develop implementation strategies. Front Public Health. 2019;7:158.

  26. Helfrich CD, Rose AJ, Hartmann CW, van Bodegom-Vos L, Graham ID, Wood SJ, et al. How the Dual Process Model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: a preliminary model of unlearning and substitution. J Eval Clin Pract. 2018;24(1):198–205.

  27. Schondelmeyer AC, Bettencourt AP, Xiao R, Beidas RS, Wolk CB, Landrigan CP, et al. Evaluation of an educational outreach and audit and feedback program to reduce continuous pulse oximetry use in hospitalized infants with stable bronchiolitis: a nonrandomized clinical trial. JAMA Netw Open. 2021;4(9):e2122826.

  28. Ament SMC, de Groot JJA, Maessen JMC, Dirksen CD, van der Weijden T, Kleijnen J. Sustainability of professionals’ adherence to clinical practice guidelines in medical care: a systematic review. BMJ Open. 2015;5(12):e008073.

  29. Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7(1):17.

  30. Chan AW, Tetzlaff JM, Altman DG, Laupacis A, Gøtzsche PC, Krleža-Jerić K, et al. SPIRIT 2013 statement: defining standard protocol items for clinical trials. Ann Intern Med. 2013;158(3):200–7.

  31. Campbell MK, Piaggio G, Elbourne DR, Altman DG. Consort 2010 statement: extension to cluster randomised trials. BMJ. 2012;345:e5661 Available from: https://www.bmj.com/content/345/bmj.e5661 [cited 4 Jul 2020].

  32. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377–81.

  33. Slaghuis SS, Strating MM, Bal RA, Nieboer AP. A framework and a measurement instrument for sustainability of work practices in long-term care. BMC Health Serv Res. 2011;11(1):314.

  34. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

  35. Agha R, Avner JR. Delayed seasonal RSV surge observed during the COVID-19 pandemic. Pediatrics. 2021;148(3):e2021052089.

  36. Simon TD, Starmer AJ, Conway PH, Landrigan CP, Shah SS, Shen MW, et al. Quality improvement research in pediatric hospital medicine and the role of the Pediatric Research in Inpatient Settings (PRIS) network. Acad Pediatr. 2013;13(6 Suppl):S54–60.

  37. Starmer AJ, Spector ND, Srivastava R, West DC, Rosenbluth G, Allen AD, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803–12.

  38. Khan A, Coffey M, Litterer KP, Baird JD, Furtak SL, Garcia BM, et al. Families as partners in hospital error and adverse event surveillance. JAMA Pediatr. 2017;171(4):372–81.

  39. Landrigan CP, Stockwell D, Toomey SL, Loren S, Tracy M, Jang J, et al. Performance of the Global Assessment of Pediatric Patient Safety (GAPPS) tool. Pediatrics. 2016;137(6):e20154076.

  40. Keren R, Shah SS, Srivastava R, Rangel S, Bendel-Stenzel M, Harik N, et al. Comparative effectiveness of intravenous vs oral antibiotics for postdischarge treatment of acute osteomyelitis in children. JAMA Pediatr. 2015;169(2):120–8.

  41. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84(8):1022–8.

  42. Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14(Suppl 1):27–35.

  43. Brehm SS, Brehm JW. Psychological reactance: a theory of freedom and control. New York: Academic Press; 2013.

  44. Chi KW, Coon ER, Destino L, Schroeder AR. Parental perspectives on continuous pulse oximetry use in bronchiolitis hospitalizations. Pediatrics. 2020;146(2):e20200130.

  45. Kinsman L, Rotter T, James E, Snow P, Willis J. What is a clinical pathway? Development of a definition to inform the debate. BMC Med. 2010;8:31.

  46. Osherhoff JA. Improving medication use and outcomes with clinical decision support: a step-by-step guide. Chicago: Healthcare Information and Management Systems Society; 2009.

  47. Campbell R. The five “rights” of clinical decision support. J AHIMA. 2013;84(10):42–7 quiz 48.

  48. Hawe P, Shiell A, Riley T. Complex interventions: how “out of control” can a randomised controlled trial be? BMJ. 2004;328(7455):1561–3.

  49. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  50. Pediatric CDS Collaborative. Pediatric CDS Collaborative. Available from: http://pediatriccds.org/. [cited 19 Jul 2022].

  51. Moulton LH. Covariate-based constrained randomization of group-randomized trials. Clin Trials. 2004;1(3):297–305.

  52. Helfrich CD, Hartmann CW, Parikh TJ, Au DH. Promoting health equity through de-implementation research. Ethn Dis. 2019;29(Suppl 1):93–6.

  53. Prusaczyk B, Swindle T, Curran G. Defining and conceptualizing outcomes for de-implementation: key distinctions from implementation outcomes. Implement Sci Commun. 2020;1:43.

  54. Brady PW, Schondelmeyer AC, Landrigan CP, Xiao R, Brent C, Bonafide CP, et al. Validity of continuous pulse oximetry orders for identification of actual monitoring status in bronchiolitis. J Hosp Med. 2020;15(11):665–8.

  55. Hayes RJ, Bennett S. Simple sample size calculation for cluster-randomized trials. Int J Epidemiol. 1999;28(2):319–26.

  56. Hayes RJ, Moulton LH. Sample size for unmatched trials. In: Cluster randomised trials. 2nd ed. Boca Raton: CRC Press; 2017. p. 128–39.

  57. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.

  58. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1):28.

  59. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9(1):157.

  60. Aarons GA, Ehrhart MG, Farahnak LR. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9(1):45.

  61. Dillard JP, Shen L. On the nature of reactance and its role in persuasive health communication. Commun Monogr. 2005;72(2):144–68.

  62. Silvia PJ. Reactance and the dynamics of disagreement: multiple paths from threatened freedom to resistance to persuasion. Eur J Soc Psychol. 2006;36(5):673–85.

    Article  Google Scholar 

  63. Reynolds-Tylus T, Bigsby E, Quick BL. A comparison of three approaches for measuring negative cognitions for psychological reactance. Commun Methods Meas. 2021;15(1):43–59.

  64. Robins JM, Greenland S. Identifiability and exchangeability for direct and indirect effects. Epidemiology. 1992;3(2):143–55.

    Article  CAS  PubMed  Google Scholar 

  65. Krull JL, MacKinnon DP. Multilevel modeling of individual and group level mediated effects. Multivar Behav Res. 2001;36(2):249–77.

    Article  CAS  Google Scholar 

  66. Pituch KA, Murphy DL, Tate RL. Three-level models for indirect effects in school- and class-randomized experiments in education. J Exp Educ. 2009;78(1):60–95.

    Article  Google Scholar 

  67. Zhang Z, Zyphur MJ, Preacher KJ. Testing multilevel mediation using hierarchical linear models: problems and solutions. Organ Res Methods. 2009;12(4):695–719.

    Article  Google Scholar 

  68. Williams NJ, Glisson C, Hemmelgarn A, Green P. Mechanisms of change in the ARC organizational strategy: increasing mental health clinicians’ EBP adoption through improved organizational culture and capacity. Admin Pol Ment Health. 2017;44(2):269–83.

    Article  Google Scholar 

  69. Glisson C, Williams NJ, Hemmelgarn A, Proctor E, Green P. Aligning organizational priorities with ARC to improve youth mental health service outcomes. J Consult Clin Psychol. 2016;84(8):713–25.

    Article  PubMed  PubMed Central  Google Scholar 

  70. Williams NJ, Becker-Haimes EM, Schriger SH, Beidas RS. Linking organizational climate for evidence-based practice implementation to observed clinician behavior in patient encounters: a lagged analysis. Implement Sci Commun. 2022;3(1):64.

    Article  PubMed  PubMed Central  Google Scholar 

  71. Marsh DR, Schroeder DG, Dearden KA, Sternin J, Sternin M. The power of positive deviance. BMJ. 2004;329(7475):1177–9.

    Article  PubMed  PubMed Central  Google Scholar 

  72. Lawton R, Taylor N, Clay-Williams R, Braithwaite J. Positive deviance: a different approach to achieving patient safety. BMJ Qual Saf. 2014;23(11):880–3.

    Article  PubMed  PubMed Central  Google Scholar 

  73. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72.

    Article  PubMed  PubMed Central  Google Scholar 

  74. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Admin Pol Ment Health. 2011;38(1):44–53.

    Article  Google Scholar 

  75. Aron DC, Tseng CL, Soroka O, Pogach LM. Balancing measures: identifying unintended consequences of diabetes quality performance measures in patients at high risk for hypoglycemia. Int J Qual Health Care. 2019;31(4):246–51.

    Article  PubMed  Google Scholar 

  76. Hayes RJ, Moulton LH. Regression analysis based on individual-level data. In: Cluster randomised trials. 2nd ed. Boca Raton: CRC Press; 2017. p. 245–79.

    Google Scholar 

  77. Biondi EA, McCulloh R, Staggs VS, Garber M, Hall M, Arana J, et al. Reducing Variability in the Infant Sepsis Evaluation (REVISE): a national quality initiative. Pediatrics. 2019;144(3):e20182201.

    Article  PubMed  Google Scholar 

  78. Kaiser SV, Jennings B, Rodean J, Cabana MD, Garber MD, Ralston SL, et al. Pathways for Improving Inpatient Pediatric Asthma Care (PIPA): a multicenter, national study. Pediatrics. 2020;145:e20193026.

    Article  PubMed  Google Scholar 

  79. Ralston SL, Garber MD, Rice-Conboy E, Mussman GM, Shadman KA, Walley SC, et al. A multicenter collaborative to reduce unnecessary care in inpatient bronchiolitis. Pediatrics. 2016;137(1):e20150851.

    Article  Google Scholar 

  80. Chan WV, Pearson TA, Bennett GC, Castillo G, Cushman WC, Gaziano TA, et al. ACC/AHA special report: Clinical practice guideline implementation strategies: a summary of systematic reviews by the NHLBI Implementation Science Work Group: a report of the American College of Cardiology/American Heart Association Task Force on clinical practice guidelines. Circulation. 2017;135(9):e122–37.

    Article  PubMed  Google Scholar 

  81. Ivers NM, Grimshaw JM. Reducing research waste with implementation laboratories. Lancet. 2016;388(10044):547–8.

    Article  PubMed  Google Scholar 

  82. Grimshaw JM, Ivers N, Linklater S, Foy R, Francis JJ, Gude WT, et al. Reinvigorating stagnant science: implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. BMJ Qual Saf. 2019;28(5):416–23.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  83. Soong C, Shojania KG. Education as a low-value improvement intervention: often necessary but rarely sufficient. BMJ Qual Saf. 2020;29(5):353–7.

    Article  PubMed  Google Scholar 

  84. Colla CH, Mainor AJ, Hargreaves C, Sequist T, Morden N. Interventions aimed at reducing use of low-value health services: a systematic review. Med Care Res Rev. 2017;74(5):507–50.

    Article  PubMed  Google Scholar 

  85. Kwan JL, Lo L, Ferguson J, Goldberg H, Diaz-Martinez JP, Tomlinson G, et al. Computerised clinical decision support systems and absolute improvements in care: meta-analysis of controlled clinical trials. BMJ. 2020;370:m3216.

    Article  PubMed  PubMed Central  Google Scholar 


Acknowledgements

The following Pediatric Research in Inpatient Settings (PRIS) Network collaboration group members, who led the trial at their institutions, contributed to discussions about the trial design and to modifications of the protocol in response to the COVID-19 pandemic:

Akron Children’s Hospital: Prabi Rajbhandari, MD
Albany Medical Center: Emily Knuth, MD, PhD
Alberta Children’s Hospital: Michelle Bailey, BS, MD, MS
Ann & Robert H. Lurie Children’s Hospital of Chicago: Kate Lucey, BA, MD, MS
Boston Children’s Hospital: Patty Stoeck, BS, MD
Children’s Hospital at Dartmouth-Hitchcock: Samantha House, DO, MPH
Children’s Hospital at Montefiore: Alyssa Silver, MD
Children’s Hospital at Oklahoma University Medical Center: Monique Naifeh, MD, MPH
Children’s Hospital Colorado: Michael Tchou, MD, MS
Children’s Hospital Colorado: Amy Tyler, MD, MS
Children’s Hospital Los Angeles (CHLA): Vivian Lee, MD
Children’s Hospital of Pittsburgh of UPMC: Erin Cummings, MD
Children’s Hospital of Richmond at VCU: Clifton Lee, MD
Children’s Hospital of The King’s Daughters: Kyrie Shomaker, MD
Children’s Hospital Orange County (CHOC): Alexandra Mihalek, MD
Children’s Medical Center Dallas: Courtney Solomon, MD
Children’s Memorial Hermann: Raymond Parlar-Chun, MD
Children’s Mercy Kansas City: Kathleen Berg, MD
Children’s Minnesota: Nick Ryan, DO, MS
Children’s National Medical Center: Tina Halley, MD
Children’s of Alabama: Mary Orr, MD, MS, MPH
Children’s Wisconsin: Tracey Liljestrom, MD
Children’s Wisconsin: Erin Preloger, MD
Children’s Hospital of Philadelphia, Philadelphia Campus: Padmavathy Parthasarathy, MD
Children’s Hospital of Philadelphia Care Network @ Virtua: Rashida Shakir, MD
Children’s Hospital of Philadelphia at Grand View Hospital: Andrew Chu, MD
Children’s Hospital of Philadelphia Middleman Family Pavilion, King of Prussia Campus: Morgan Greenfield, MD
Children’s Hospital of Philadelphia Pediatric Care at Penn Medicine/Princeton Health: Julianne Prasto, MD
Cohen Children’s Medical Center: Ann Le, DO
CS Mott Children’s Hospital: Kimberly Monroe, MD, MS
Hoops Family Children’s Hospital at Marshall University: Andrea Lauffer, MD
Inova Children’s Hospital: Meredith Carter, MD
Inova Children’s Hospital: Kamilah Halmon, MD
Intermountain Riverton Hospital: Glen Huff, MD
Komansky Children’s Hospital/New York Presbyterian Medical Center/Weill Cornell Medicine: Kiran Gadani Patel, MD, MPH
Komansky Children’s Hospital/New York Presbyterian Medical Center/Weill Cornell Medicine: Jennie Ono, MD, MS
Lucile Packard Children’s Hospital Stanford: Alan Schroeder, MD
Monroe Carell Jr. Children’s Hospital at Vanderbilt: Gregory (Greg) Plemmons, MD
Nationwide Children’s Hospital: Michael Perry, MD
New York-Presbyterian-Morgan Stanley Children’s Hospital (Columbia): Sumeet Banker, MD
New York-Presbyterian-Morgan Stanley Children’s Hospital (Columbia): Jennifer Lee, MD
Primary Children’s Hospital (Utah): Robert Willer, BA, DO
Rady Children’s Hospital/University of California San Diego: Begem Lee, MD
Rady Children’s Hospital/University of California San Diego: Kyung Rhee, MD, MS, MSc, MA
Riley Hospital for Children at Indiana University Health: Richelle Baker, MD
Seattle Children’s Hospital: Polina Frolova Gregory, DO, MS
Texas Children’s Hospital: Vipul Parikh, MD
Texas Children’s Hospital: Mini Wallace, DO, MS
Texas Children’s Hospital The Woodlands: Stephen Edwards, MD
Texas Children’s Hospital West Campus: Lisa Beckner, MD
University of California Davis: Michelle Hamline, MD, PhD
University of Rochester Golisano Children’s Hospital: Lauren Solan, MD, MEd
University of Vermont Children’s Hospital: Leigh-Anne Cioffredi, MD
University of Vermont Children’s Hospital: Scarlett Johnson, MD, MPH
Upstate Golisano Children’s Hospital: John Andrake, MD
Valley Children’s Hospital: Nicole Webb, MD
Yale-New Haven Children’s Hospital: Adam Berkwitt, MD

Funding

Research reported in this publication was supported by a cooperative agreement with the National Heart, Lung, and Blood Institute of the National Institutes of Health under award number U01HL159880. The funder had no role in any of the following: study design; collection, management, analysis, or interpretation of the data; writing of the report; the decision to submit the report for publication; or ultimate authority over any of these activities. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information


Contributions

CPB and RSB are the co-principal investigators of the study and in those roles led the conception and design of the trial. CPB drafted the manuscript. RSB substantively revised the manuscript. RX is the lead biostatistician and in that role led the statistical design and sample size calculations. She made major contributions to the design of the trial and substantively revised the manuscript. ARP is a research consultant who contributed to editing and revising the grant proposal that secured funding, and she substantively revised the first draft of the manuscript and subsequent versions. She was paid for these contributions. ACS, PWB, CPL, CBW, ZC, NM, and NJW made major contributions to the design of the trial and substantively revised the manuscript. HR, ES, CRB, and KA substantively revised the manuscript as their primary contributions. All authors approved the submitted version of the manuscript.

Authors’ information

Not applicable.

Corresponding author

Correspondence to Christopher P. Bonafide.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Institutional Review Board (IRB) at Children’s Hospital of Philadelphia (FWA00000459) on July 23, 2021. Prior to study commencement at each participating site, each US site established an IRB reliance agreement with Children’s Hospital of Philadelphia’s IRB using an electronic reliance platform. The Canadian site obtained local Research Ethics Board (REB) approval independently. Because the pandemic posed challenges for participating sites, including but not limited to staffing difficulties and low bronchiolitis patient volumes, we amended the protocol to improve feasibility. To provide the most up-to-date protocol for publication, this manuscript reflects the fourth scientific amendment (protocol version dated May 20, 2022, approved by the IRB on June 20, 2022), which incorporates recommendations from our Data and Safety Monitoring Board. All amendment letters describing each change are automatically forwarded to relying sites via the electronic IRB system. For patients who are the subjects of data collection, the IRB granted a waiver of consent/parental permission per 45 CFR 46.116(f)(3), a waiver of assent per 45 CFR 46.408(a), and a waiver of HIPAA authorization per 45 CFR 164.512(i)(2)(ii). For staff who are the subjects of questionnaires and qualitative interviews, the IRB granted a waiver of documentation of consent under 45 CFR 46.117(c)(1)(ii). For parents/guardians who are the subjects of qualitative interviews, the IRB granted a waiver of documentation of consent under 45 CFR 46.117(c)(1)(ii) and an alteration of HIPAA authorization (to obtain verbal authorization) under 45 CFR 164.512(i)(2)(ii).

Consent for publication

Not applicable.

Competing interests

RSB is an Associate Editor of Implementation Science; all decisions on this paper were made by other editors. RSB is the principal at Implementation Science & Practice, LLC. She receives royalties from Oxford University Press, consulting fees from United Behavioral Health and OptumLabs, and serves on the advisory boards for Optum Behavioral Health, AIM Youth Mental Health Foundation, and the Klingenstein Third Generation Foundation, outside of the submitted work. The other authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

DSMB Charter, Version Dated September 14, 2021. Description: Charter outlining the roles, responsibilities, practices, and procedures of the EMO Trial Data and Safety Monitoring Board.

Additional file 2.

Dissemination plan for results of the EMO Trial. Description: Description of the plan for disseminating results and products of the EMO trial. This document was also submitted with the EMO Trial grant proposal.

Additional file 3.

CONSORT 2010 Checklist.

Additional file 4.

SPIRIT 2013 Checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Bonafide, C.P., Xiao, R., Schondelmeyer, A.C. et al. Sustainable deimplementation of continuous pulse oximetry monitoring in children hospitalized with bronchiolitis: study protocol for the Eliminating Monitor Overuse (EMO) type III effectiveness-deimplementation cluster-randomized trial. Implementation Sci 17, 72 (2022). https://doi.org/10.1186/s13012-022-01246-z



  • DOI: https://doi.org/10.1186/s13012-022-01246-z

Keywords