Best strategies to implement clinical pathways in an emergency department setting: study protocol for a cluster randomized controlled trial

Mona Jabbour (corresponding author), Janet Curran, Shannon D Scott, Astrid Guttman, Thomas Rotter, Francine M Ducharme, M Diane Lougheed, M Louise McNaughton-Filion, Amanda Newton, Mark Shafir, Alison Paprica, Terry Klassen, Monica Taljaard, Jeremy Grimshaw and David W Johnson

Implementation Science 2013, 8:55

DOI: 10.1186/1748-5908-8-55

Received: 3 April 2013

Accepted: 15 May 2013

Published: 22 May 2013

                                Abstract

                                Background

                                The clinical pathway is a tool that operationalizes best evidence recommendations and clinical practice guidelines in an accessible format for ‘point of care’ management by multidisciplinary health teams in hospital settings. While high-quality, expert-developed clinical pathways have many potential benefits, their impact has been limited by variable implementation strategies and suboptimal research designs. Best strategies for implementing pathways into hospital settings remain unknown. This study will seek to develop and comprehensively evaluate best strategies for effective local implementation of externally developed expert clinical pathways.

                                Design/methods

                                We will develop a theory-based and knowledge user-informed intervention strategy to implement two pediatric clinical pathways: asthma and gastroenteritis. Using a balanced incomplete block design, we will randomize 16 community emergency departments to receive the intervention for one clinical pathway and serve as control for the alternate clinical pathway, thus conducting two cluster randomized controlled trials to evaluate this implementation intervention. A minimization procedure will be used to randomize sites. Intervention sites will receive a tailored strategy to support full clinical pathway implementation. We will evaluate implementation strategy effectiveness through measurement of relevant process and clinical outcomes. The primary process outcome will be the presence of an appropriately completed clinical pathway on the chart for relevant patients. Primary clinical outcomes for each clinical pathway include the following: Asthma—the proportion of asthmatic patients treated appropriately with corticosteroids in the emergency department and at discharge; and Gastroenteritis—the proportion of relevant patients appropriately treated with oral rehydration therapy. Data sources include chart audits, administrative databases, environmental scans, and qualitative interviews. We will also conduct an overall process evaluation to assess the implementation strategy and an economic analysis to evaluate implementation costs and benefits.

                                Discussion

This study will contribute to the body of evidence supporting effective strategies for clinical pathway implementation and, ultimately, to reducing the research-to-practice gap by operationalizing best evidence care recommendations through effective use of clinical pathways.

                                Trial registration

                                ClinicalTrials.gov: NCT01815710

                                Keywords

Clinical pathways; key interventions; intervention strategy; pediatric emergency care; theory-based strategy; process outcomes; clinical outcomes

                                Background

The evidence-to-practice gap in medicine remains a healthcare challenge [1–8]. While knowledge syntheses and clinical practice guidelines (CPGs) have emerged as rigorous means to translate research and make it more accessible to practitioners, these may not be sufficient to change practice behavior in complex settings, such as the chaotic environment of an emergency department (ED) [9, 10], where there is also mounting pressure to achieve favorable wait times and patient throughput [11]. This pressure threatens the quality and safety of care that are important to health providers, who must contend with a diverse population of varying ages, medical conditions, and treatments. The clinical pathway (CP) has emerged as a potentially important knowledge translation strategy for promoting effective healthcare. As a clinical decision-making tool, CPs operationalize best evidence recommendations and CPGs into an accessible bedside format for health provider teams and, in this sense, can promote standardized evidence-based practice, patient safety, and efficiency in the health system [11–21]. Well-designed CPs also free clinicians’ cognitive resources to focus on more complex, thought-requiring activities [22] and can support clinicians in delivering key management priorities in a timely manner. As a result, CPs are increasingly used in health settings and recommended by broader health systems internationally as a form of quality improvement [23, 24].

While CPs have the potential to link evidence to practice by integrating guidelines into local systems, and to improve patient outcomes while decreasing hospitalizations and other health costs, their true impact has been limited by variable implementation strategies and suboptimal research designs [25, 26]. Because a CP involves the full health team and becomes part of the patient record, hospital contextual issues and team dynamics are important factors that must be considered in its implementation. Current evidence-based strategies used to implement CPGs may not be sufficient to promote CP adoption in hospital settings, because the complexities of behavior change among health providers are compounded by organizational and system barriers. However, best strategies for implementing CPs remain largely unknown [26, 27], and this knowledge gap must be addressed before their full impact can be realized. Further study is needed to understand why and under which circumstances CPs lead to improved care [13, 28, 29].

Most CPs are developed internally within a hospital; while contextual knowledge may facilitate local uptake, CP quality may be limited by a local lack of rigour and expertise in interpreting best evidence for incorporation into the pathway. At a broader level (e.g., provincial/state-wide or national), expert-developed CPs are created by multidisciplinary teams of clinical and research experts, including end users, and offer high-quality content and professional design. Expert CPs can also help ensure a consistent standard of care across healthcare settings in different jurisdictions. Additional benefits include efficiencies in development and momentum for pathway updates as new evidence emerges. However, expert CPs cannot simply be imposed, and implementation at the local level can be challenging [30, 31]. An effective intervention strategy requires a thoughtful understanding of current and anticipated obstacles [32]. In this process, change management issues such as leadership, resources, and organizational culture must be explored and addressed.

This project seeks to provide new knowledge in an area of active ED practice and current interest. We will develop and evaluate the effectiveness of a theory-based intervention strategy using two expert-derived pediatric emergency CPs: Asthma and Vomiting and Diarrhea (V&D). These conditions have a strong evidence base for care, documented gaps in quality of care, and existing provincial leadership in pathway development [6, 7, 33–39]. Rigorous evaluation of key clinical and process outcomes related to each CP, together with a process evaluation of the implementation experience, will identify important factors for implementation success and will be highly valuable in guiding future CP implementation strategies.

                                Research objectives

                                We will conduct a 42-month mixed methods health services project with the following study objectives:
1. To design a theory-based and knowledge user-informed intervention strategy to implement two provincial pediatric emergency CPs into practice in community EDs.

2. To evaluate the effectiveness of this implementation strategy, using a cluster randomized controlled trial (cRCT) design, through measurement of relevant process and clinical outcomes in community EDs in Ontario, Canada.

3. To conduct a process evaluation of the implementation strategy.

4. To conduct an economic analysis of implementation costs and benefits.

                                Methods

To study the effect of a theory-based strategy on CP implementation, we will conduct two cRCTs to evaluate implementation of two different pediatric CPs, namely for asthma and V&D, in a sample of 16 Ontario community EDs. Developed by a multidisciplinary team of clinical and research experts, as well as end users, each CP was designed for broad dissemination for use in the emergency care of pediatric patients in any ED setting. To minimize any potential Hawthorne effect resulting from engagement alone, we will use a balanced incomplete block design [40]. We chose this design to ensure all sites are exposed to an intervention, thus balancing the Hawthorne effect across arms. The design is ‘incomplete’ because each group receives the implementation intervention for only one of the two pathways. One-half of the ED sites will be allocated to the arm receiving the implementation intervention for the pediatric asthma CP (group one), while serving as control for the pediatric V&D CP intervention. The other half of the hospital sites (group two) will be allocated to the arm receiving the implementation intervention for the V&D CP, while serving as control for the asthma CP intervention. There will be no dissemination of documents or related materials to control sites. Using this design, we will conduct two cRCTs while ensuring that all sites receive exposure to an intervention as we evaluate the implementation strategy for two different pathways.
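The allocation logic of this balanced incomplete block design can be sketched as follows. This is a toy illustration with hypothetical site labels; the trial's actual allocation uses the minimization procedure described under Randomization, not a simple shuffle.

```python
import random

# Toy sketch of the balanced incomplete block design: 16 sites split into
# two groups of 8; every site receives exactly one intervention and serves
# as control for the other pathway. Site labels are hypothetical.
random.seed(42)

sites = [f"ED_{i:02d}" for i in range(1, 17)]
random.shuffle(sites)

group_one = sites[:8]  # asthma CP intervention; control for V&D CP
group_two = sites[8:]  # V&D CP intervention; control for asthma CP

allocation = {}
for s in group_one:
    allocation[s] = {"intervention": "asthma CP", "control_for": "V&D CP"}
for s in group_two:
    allocation[s] = {"intervention": "V&D CP", "control_for": "asthma CP"}

# Every site is exposed to an intervention, balancing any Hawthorne effect.
```

The key property is visible in the allocation map: no site is a pure control, yet each CP still has a full randomized control arm.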

                                Sampling

                                Setting and site selection

A community ED will be defined as one that does not usually act as a referral center and is not a primary teaching hospital. Selecting from 149 community EDs in Ontario to ensure representativeness, we will enroll 16 sites stratified by total patient volume: very high (four sites), high (six sites), medium (four sites), and low (two sites).

                                Inclusion/exclusion criteria

                                We will recruit EDs that do not currently use CPs for either pediatric condition, asthma or V&D. To minimize contamination between sites, we will request data from Health Force Ontario, an organization that assists with physician coverage at different EDs, to ensure ED physicians at any site are not also working at a study site in the alternate arm of the study. An additional inclusion criterion is commitment to the implementation intervention by an administrative lead on behalf of the hospital. For each CP, specific inclusion/exclusion criteria have been defined for use with patients.

                                Participants

                                The individual hospital sites and their ED teams will be the focus of this study. Participants at each site will include ED staff and physicians, as well as hospital administrators with responsibility for the ED. Clinical outcomes for pediatric patients (defined as <18 years of age) with either asthma or gastroenteritis will also be studied.

                                Intervention strategy

Given the hospital context and involvement of the full ED team, CP implementation is a complex process requiring an informed and well-designed intervention strategy. To address the competing pressures on busy EDs, an optimal intervention must be effective, practical, and feasible, while retaining a strong evidence base for its design.

                                Core components

Our first objective is to fully develop the intervention strategy for optimal CP uptake at each site. Because our intervention strategy depends on ED site input, it is not possible to fully design the intervention until the study begins. However, we have provisionally selected the core intervention components, drawing on project team members’ experience with previous CP implementation and related initiatives [41–45]. These components address health professional, ED team, and organizational (hospital) issues, and are summarized in Table 1. To ensure feasibility and sustainability, the implementation strategy will be designed for success within existing hospital resources. This is to minimize the impact of study support and infrastructure, and to assess implementation in a realistic setting without external support.
Table 1

Core components of implementation strategy

- Local site champion teams: Nurse educator or Emergency Nurse; Emergency Physician

- Pre-implementation site visits: Assessment of local ED culture and organization, and feedback on CP usability (Human Design Factor Analysis)

- Ongoing site support: Bimonthly teleconferences with site teams

- Educational workshops: Train-the-trainer model

- Website support: E-learning modules for each CP; resource materials

- Posters/reminders: CP-specific visual tools; reminders to use CP on relevant charts

- Hospital commitment: Facilitation through hospital approval processes; allocation of hospital resources; prioritization within other hospital initiatives

                                Clinical pathway key interventions

Given their complex and interprofessional nature, we will identify each CP’s key interventions to ensure focus and emphasis on the aspects critical to pathway effectiveness. The development teams have identified evidence-based key interventions for each CP, and these will be the focus of training workshops, reminders, and evaluations.

                                Theory-based design with knowledge user input

                                Our intervention strategy will be based on the Theoretical Domains Framework (TDF), which describes a comprehensive structure of 14 theoretical domains from 33 behavior change theories and 128 constructs [46]. This framework provides a useful approach to understand behavioral determinants and inform intervention design. A TDF-interview guide [47] is available to systematically elicit barriers and concerns from relevant stakeholders. To inform our strategy development, CP-specific behavioral domains will first be explored through TDF-guided [48] site visits (described below) and key informant (KI) interviews at each site. One administrator per site will be selected for KI interviews to identify readiness for change, as well as barriers and facilitators at the organizational level. Interviews will be audiotaped and transcribed. Data collection and analysis will follow an iterative and concurrent process [49].

                                Site visits

At the start of the implementation phase, site visits will be conducted to assess ED organizational issues, such as the flow of pediatric patients, specific and shared roles of health providers, and current experience with CPs in the ED. Infrastructure requirements and readiness to implement the intervention CP will also be assessed. In previous implementation initiatives, such visits have proven very useful in understanding site issues and fostering identification with the project. Additionally, site visits will be used to explore the acceptability of, and potential issues with, the intervention pathway. During each visit, we will seek input from ED staff (i.e., nurses and allied health professionals) and physicians on duty, using case scenarios to probe for clinically or procedurally important issues as we ‘walk through’ their intervention CP. A structured form will be used to guide these visits and capture field notes. To pilot this form and obtain preliminary information, we will conduct pre-site visits with two local EDs, exploring issues for one of the CPs in each. Site visit interviews will be audiotaped and transcribed for subsequent data analysis.

                                Intervention mapping

Information derived from the KI interviews and site visits will then be used to further develop site-specific intervention strategies. A mapping exercise will be conducted to link the behavioral elements identified to appropriate behavior change strategies [50]. We will apply known taxonomies of behavior change techniques [51] to identify relevant methods and will select the optimal mode of delivery for each to create a multifaceted intervention strategy. To ensure implementation success, strong consideration will be given to feasibility and practicality. The full project team will be involved in the final intervention design via web-conference meeting, and further input will be sought from site partners at our launch meeting.

                                Ongoing support and communication

We have found from previous implementation experience that without frequent communication and support, implementation stalls and is displaced by other priorities. Consequently, our strategy will include bimonthly teleconferences with site champions to discuss and support progress during the implementation phase, and teleconferences every four months in the post-implementation phase to discuss sustainability issues. Teleconferences will also be useful for sharing best practices and local solutions to common barriers.

                                Control sites

Control sites will continue with standard care, without additional materials. We will inform sites at recruitment that relevant pediatric patient data will be collected for ED presentations of asthma and V&D in the pre- and post-implementation periods. We will monitor relevant activities, such as self-implementation of any protocols that may relate to the control CP, within the process evaluation. While it is not possible to blind site partners to their allocated intervention, we will endeavor to conceal their allocation in the control arm. Control sites will not receive any CP documents. We will minimize potential co-interventions at recruitment by selecting only sites that have neither of the related pediatric CPs in place. Based on recent studies of ED-based protocols [52, 53], most community EDs in Ontario do not have an asthma or V&D CP in place.

                                Randomization

                                Because the number of cluster sites is relatively small, simple unrestricted randomization would not be sufficient to ensure balance between the study arms. Pairing of hospitals would improve balance but may reduce study power and precision due to loss of degrees of freedom associated with a matched analysis. We will therefore implement a minimization procedure [54] to ensure overall balance based on three important covariates: annual pediatric visits, location (urban/rural), and recent experience with any ED process improvement initiatives (yes/no). A statistician not associated with the study will use a computerized algorithm to identify all possible allocations that meet the balancing constraints and one of the allocations will be randomly selected [55].
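The constrained-randomization idea described above can be sketched as follows. This is a hedged illustration, not the study's actual algorithm: the covariate values are invented and the balance tolerances are arbitrary. It enumerates all possible 8/8 splits of the 16 sites, keeps those that balance the three covariates, and randomly selects one allocation from the feasible set.

```python
import itertools
import random

random.seed(1)

# Invented covariates for 16 hypothetical sites (illustration only).
sites = {f"ED_{i:02d}": {"visits": random.randint(2_000, 12_000),
                         "urban": i % 2,                # 1 urban, 0 rural
                         "recent_qi": int(i % 3 == 0)}  # recent QI initiative
         for i in range(1, 17)}
names = sorted(sites)

def is_balanced(arm_a, arm_b, visit_tol=0.15):
    """Accept a split only if the three covariates are similar across arms."""
    va = sum(sites[s]["visits"] for s in arm_a)
    vb = sum(sites[s]["visits"] for s in arm_b)
    if abs(va - vb) / max(va, vb) > visit_tol:
        return False
    for cov in ("urban", "recent_qi"):
        if abs(sum(sites[s][cov] for s in arm_a) -
               sum(sites[s][cov] for s in arm_b)) > 1:
            return False
    return True

# Enumerate all C(16, 8) = 12,870 possible 8/8 splits; keep balanced ones.
feasible = []
for combo in itertools.combinations(names, 8):
    arm_a = list(combo)
    arm_b = [s for s in names if s not in combo]
    if is_balanced(arm_a, arm_b):
        feasible.append((arm_a, arm_b))

# Randomly select one allocation from the feasible set.
group_one, group_two = random.choice(feasible)
```

In the trial itself, this step is performed by a statistician not associated with the study, using the study's own covariate definitions and balancing constraints [54, 55]; the code above only demonstrates the enumerate-then-randomly-select structure.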

                                Project phases

As shown in Table 2, the project is divided into five discrete phases. Following a nine-month preparation phase, the implementation phase will take place over the subsequent nine-month period. We have found that an open-ended implementation period is likely to lead to delays at various stages and might also complicate our data collection processes. For these reasons, we have designated a specific nine-month implementation period, with negotiated interim target dates, for all sites. Implementation will be considered complete when the following have been achieved: site customization and committee approval of the intervention CP; the CP ready for use in the ED; delivery of at least two educational workshops; and promotion of the e-learning module through a central website. Our project coordinator will communicate regularly with sites to ensure negotiated target deadlines are met. Should implementation be incomplete at any site, the reasons will be documented and explored further in the post-implementation interviews.
Table 2

Description of project phases

- Preparation Phase (9 months): Site recruitment & REB approvals; site randomization; intervention development; project launch meeting; site champion training

- Implementation Phase (9 months): Site readiness visits; site customization & approval of forms; CP training

- Post-Implementation Phase (12 months): Post-implementation site visits; qualitative interviews and analysis

- Data Collection and Analysis (7 months): Chart abstractor training; chart audits; chart-to-administrative-database linkage; quantitative data analysis; economic analysis

- Follow-up (5 months): Full partner meeting to review findings; wrap-up and dissemination

                                Evaluation

                                As shown in Table 3, we have planned a comprehensive evaluation approach including process and clinical outcomes. These are described in detail below.
Table 3

Evaluation components

Quantitative evaluation:

- Patient chart audits

- Administrative data

- Economic analysis

Qualitative evaluation (process evaluation):

- Process log

- Pre/post site visits

- Pre-implementation key informant interviews

- Post-implementation qualitative interviews

                                Process evaluation

                                To address the third study objective, a concurrent process evaluation will be conducted throughout the trial to document and assess the degree of variability and fidelity in implementation of the intervention across the 16 sites. This evaluation will include a process log, and post-implementation site visits and qualitative interviews.

                                Process log

A process log will be used longitudinally to track key outcomes and capture issues related to site customization of documents, barriers and delays, workshop attendance and interest, ease of use, and degree of uptake. Components of this log will include workshop participant feedback and facilitator observation forms, brief bimonthly feedback from site champion teams, and progress against negotiated target dates for the various steps toward implementation. Additional support will be provided as needed to ensure targets are met. Website utilization, e-learning module feedback, and discussion boards will also be tracked to assess the usefulness of the website as a complementary resource. We will use Google Analytics (http://www.google.com/intl/en/analytics) to monitor website traffic and performance.

                                Site visits

                                Using a standardized tracking form, site visits will be conducted two months after completed implementation to assess awareness, accessibility and ease of use of the CP. Findings at each site will be compared with pre-implementation site visits.

                                Qualitative interviews

Post-implementation interviews will be conducted to gain a richer understanding of perceptions, team dynamics, and other relevant issues at the individual and organizational levels. We will conduct post-implementation interviews at one-half of the sites for each intervention, for a total of eight KI interviews and eight focus groups. To identify readiness for change, as well as barriers and facilitators at the organizational level, one administrator per site will be selected for KI interviews; these interviews will be audiotaped and transcribed. At each focus group site, up to ten ED health professionals with direct patient involvement will be invited to participate in a one-hour focus group session. Additional interviews will be conducted if data saturation is not achieved. The focus group moderator will record field note observations. A court reporter will provide real-time transcription at all focus groups, a method that yields more complete, higher-fidelity, and more rapidly available records [56]. Using the TDF as a coding framework, two coders will independently analyze all transcripts and field notes. Data collection and analysis will follow an iterative and concurrent process [49].

                                Outcome measures

                                Given the nature of our study, both clinical and process outcomes are important to evaluate. Process outcomes relate to use of the CP for relevant patients. Clinical outcomes relate to adherence with key interventions as recommended by each pathway as well as patient specific outcomes. Table 4 presents a description of the specific outcome measures for this study.
Table 4

Process and clinical outcomes

Primary process outcome: Completed CP on relevant patient charts, graded as: 1) Initial: CP started; little or no documentation; 2) Partial: some but incomplete documentation; or 3) Full: meets requirements for CP success.

Secondary process outcome: CP use based on ED busyness, i.e., CP use for relevant patients adjusted for shift-level ED data [66].

Primary clinical outcomes: Proportion of pediatric patients with asthma and V&D who received appropriate treatment, based on CP Key Interventions.

- Asthma CP: Treatment with corticosteroids [35, 36, 67, 68] in the ED and at discharge, defined as: (i) patients with moderate to severe exacerbation are treated with systemic corticosteroids in the ED, and systemic plus inhaled corticosteroids at discharge; and (ii) patients with mild exacerbation are treated with either inhaled or oral corticosteroids at discharge.

- V&D CP: Appropriate treatment with oral rehydration therapy (ORT) [37, 39], defined as: (i) patients with moderate to severe dehydration are treated with ORT; and (ii) patients with no to mild dehydration are not treated with ORT.

Secondary clinical outcomes: Proportion of pediatric patients with asthma and V&D who received appropriate assessment or treatment, based on CP Key Interventions, and patient-specific outcomes.

- Asthma CP: Documentation of a Pre-school Respiratory Assessment Measure (PRAM) score [69–71].

- V&D CP: 1) Documentation of a Gorelick score [37] for dehydration; and 2) proportion of children treated with intravenous therapy for rehydration [39].

- Both CPs: ED length of stay (EDLOS), admission to hospital, and re-visits to the ED within 72 hours.

                                Clinical outcomes will be measured through data abstracted from patient records and administrative databases in the nine-month pre- and post-intervention periods. Prospective audit of the pre-implementation data is not possible, as this period pre-dates the start of the trial; the post-implementation period, however, can be clearly identified, and its audit data will therefore be collected prospectively. Following completed implementation, we have scheduled a three-month settling-in period before post-intervention audits commence.

                                Chart audits

                                Using an approach previously successful in community ED studies [57], we will ask local medical records departments to pull relevant charts using International Classification of Diseases-10 (ICD-10) codes for all diagnoses related to our index conditions (asthma and V&D) during the defined study periods. Chart auditors will review all records to ensure eligibility criteria are met. All retrieved and eligible patient charts will be audited. Based on data from previous work and on pediatric asthma visits in Ontario EDs [58], we estimate an average of 85 asthma and 115 V&D patient charts per site during each nine-month pre- and post-intervention period. Therefore, at each site 200 charts will be audited in each period, for a study total of 6,400 chart audits. In the event of multiple visits by the same patient, data will be collected only for the index visit. Four health record auditors will be trained to abstract data from patient records and enter these directly into a secure online database. To assess inter-rater agreement, pairs of auditors will initially each abstract the same 100 charts (50 asthma, 50 V&D). Agreement will be measured with a kappa coefficient, and further training will be done until a kappa >0.8 is achieved. A data dictionary will be created to guide the chart auditors and ensure standardized data collection procedures. Auditors will be blinded to the study aims, study design, and group allocation.
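                                The kappa training criterion above can be illustrated with a short sketch. This is our illustration, not part of the protocol: the toy ratings and the unweighted form of Cohen's kappa are assumptions for demonstration only.

```python
# Illustrative sketch (not from the protocol): unweighted Cohen's kappa
# for two auditors rating the same charts. The three-level ratings are
# hypothetical examples.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two equal-length lists of ratings."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from each rater's marginal rating frequencies
    expected = sum(counts_a[k] * counts_b[k]
                   for k in set(counts_a) | set(counts_b)) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["full", "full", "partial", "initial", "full", "partial"]
b = ["full", "full", "partial", "partial", "full", "partial"]
print(round(cohens_kappa(a, b), 2))  # 0.71, below the 0.8 training threshold
```

                                In practice a pair of auditors would repeat the exercise on further charts until the coefficient exceeds 0.8.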

                                Administrative data

                                Health administrative databases available at the Institute for Clinical Evaluative Sciences (ICES) will be used to study the impact of being on a CP on overall ED length of stay (LOS), hospital admissions, return visits within 72 hours following ED discharge for non-admitted patients, and death. Appropriate procedures will be used to link health administrative and patient specific data. The relevant databases are National Ambulatory Care Reporting System (NACRS), containing data from all Ontario EDs, and vital statistics data. Only data elements known to be reliable and valid will be used [59]. Use of NACRS data allows efficient capture of return visits, irrespective of whether the patient returns to the index hospital. These data also allow evaluation of shift-level characteristics, such as mean triage adjusted total LOS of all other patients in the ED on the same shift, which is an important contextual variable that may be associated with whether a CP was used and can be used to adjust for the LOS of the study patients. We will model whether the intervention had an impact on EDLOS, hospital admissions, return visits within 72 hours, and death; and whether ED busyness is associated with patients not being put on a pathway [66]. We will control for age, gender, Canadian Triage Acuity Scale (CTAS) rating, and EDLOS for other similar rated CTAS patients.

                                Sample size calculation

                                We conducted sample size calculations for our primary clinical outcome measures using standard formulas for comparing two proportions in cluster-randomized trials [60]. We conducted separate calculations for each CP and selected the larger of the two requirements as our target sample size. We assumed an intracluster correlation coefficient (ICC) of 0.01. This was the largest ICC based on preliminary data from a similar, not yet published study, which showed ICCs for dichotomous outcomes ranging from 0.001 to 0.01. Based on preliminary data, we assumed proportions of 10% in the control arm. We anticipated average numbers of asthma and V&D patients per hospital over the nine-month study duration of 85 and 110 respectively. After applying 10% inflation to account for cluster size imbalances, and allowing for attrition of one hospital per arm, eight hospitals are required in each study arm to yield 90% power to detect an absolute difference of 10% between study arms (10% in the control arm versus 20% in the intervention arm) using a two-sided test at the 5% level of significance. At recruitment, hospitals will be asked for written commitment to the study. Regardless of whether full implementation at that site has been completed, all relevant patient charts will be audited as per the protocol described. Therefore, barring exceptional circumstances, it is unlikely that hospitals will be lost to follow-up.
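                                As a rough check on this calculation, the sketch below applies the standard two-proportion formula inflated by the design effect 1 + (m − 1) × ICC. The protocol does not give the exact formula, so where the 10% imbalance inflation and the attrition allowance enter is our assumption; this is a plausible reconstruction, not the authors' calculation.

```python
# Plausible reconstruction (our sketch) of the cluster-randomized sample
# size reasoning: two-proportion formula, design effect 1 + (m - 1) * ICC,
# 10% inflation for cluster size imbalance, one extra cluster for attrition.
import math
from statistics import NormalDist

def clusters_per_arm(p_control, p_intervention, m, icc,
                     alpha=0.05, power=0.90,
                     imbalance_inflation=1.10, attrition_clusters=1):
    """Hospitals needed per arm to compare two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # 1.28 for 90% power
    p_bar = (p_control + p_intervention) / 2
    n_individual = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                    + z_b * math.sqrt(p_control * (1 - p_control)
                                      + p_intervention * (1 - p_intervention))
                    ) ** 2 / (p_control - p_intervention) ** 2
    deff = 1 + (m - 1) * icc                    # design effect
    n_clustered = n_individual * deff * imbalance_inflation
    return math.ceil(n_clustered / m) + attrition_clusters

print(clusters_per_arm(0.10, 0.20, m=85, icc=0.01))   # asthma CP: 8
print(clusters_per_arm(0.10, 0.20, m=110, icc=0.01))  # V&D CP: 7
```

                                Under these assumptions the asthma CP (85 patients per site) drives the requirement at eight hospitals per arm, consistent with selecting the larger of the two CP-specific calculations.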

                                Analysis

                                Quantitative analysis

                                Corresponding to the incomplete block design, separate analyses, as described below, will be carried out for each CP. We will use descriptive statistics to compare hospital and patient characteristics between the study arms: means and standard deviations will be calculated for continuous variables (or medians and interquartile ranges in the case of skewed distributions) and frequencies and proportions for categorical variables. Because we have a relatively small number of clusters per arm, the assumptions for mixed-effects logistic regression analyses and Generalized Estimating Equations (GEE) are not satisfied; primary analyses will therefore be carried out using cluster-level data [61].

                                We will first calculate hospital-level summary measures (proportions in the case of dichotomous outcomes, or means in the case of continuous outcomes) in the pre- and post-intervention periods. A simple unweighted mean of the change from pre- to post-intervention will then be calculated for each hospital. The main effect measure for each outcome will be calculated as the difference between the mean changes in intervention and control arms, together with its 95% confidence interval. If the distribution of the observed hospital summaries is markedly skewed, a logarithmic transformation will be applied, and the effect measures will be expressed as fold-changes. If descriptive analyses reveal important baseline differences in patient case mix between the study arms, a two-stage procedure will be used, consisting of a standard regression analysis to obtain a residual adjusted for the covariates of interest in each hospital; hospital residuals will then be analyzed using the approach described above. Statistical significance will be assessed at the 5% level using a two-sample unpaired t-test.
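                                The cluster-level procedure can be sketched as follows, on made-up hospital summaries. The eight hospitals per arm, the toy proportions, and the hardcoded critical value (2.145, the 0.975 quantile of Student's t with 14 degrees of freedom) are illustrative assumptions, not study data.

```python
# Sketch of the cluster-level analysis: per-hospital change from pre to
# post, then an unpaired comparison of mean changes between arms.
# Hospital summaries are invented; t_crit = 2.145 is the 0.975 quantile
# of Student's t with 14 df (8 + 8 - 2 clusters).
from math import sqrt
from statistics import mean, stdev

def effect_with_ci(ctrl_changes, intv_changes, t_crit=2.145):
    """Difference in mean pre-to-post change (intervention minus control)
    with an approximate 95% confidence interval."""
    diff = mean(intv_changes) - mean(ctrl_changes)
    n1, n2 = len(ctrl_changes), len(intv_changes)
    pooled_var = ((n1 - 1) * stdev(ctrl_changes) ** 2
                  + (n2 - 1) * stdev(intv_changes) ** 2) / (n1 + n2 - 2)
    se = sqrt(pooled_var * (1 / n1 + 1 / n2))
    return diff, (diff - t_crit * se, diff + t_crit * se)

# Hypothetical change in the proportion meeting the outcome, per hospital
ctrl = [0.02, -0.01, 0.00, 0.03, 0.01, -0.02, 0.04, 0.00]
intv = [0.12, 0.08, 0.15, 0.05, 0.10, 0.13, 0.07, 0.11]
estimate, (lo, hi) = effect_with_ci(ctrl, intv)
print(f"effect {estimate:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

                                The unit of analysis here is the hospital, not the patient, which is what makes the simple unpaired t interval defensible with few clusters.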

                                We will conduct exploratory individual-level analyses using GEE to test whether ED busyness is associated with patients not being put on a pathway. Shift busyness will be measured as EDLOS for other similar rated Canadian Triage Acuity Scale (CTAS) patients. This analysis will also include the following individual-level covariates: patient age, condition (asthma, V&D) and CTAS rating.

                                Qualitative analysis

                                The focus groups and interviews will yield a substantial amount of complex data. To monitor progress and pursue emerging ideas, data collection and analysis will proceed concurrently [49]. The qualitative analyst will feed relevant emerging themes back to the focus group moderator and augment the interview guide as needed to capture any new ideas. Inductive analysis will be managed using NVivo software and will occur in three phases: coding, categorizing, and developing themes. As described earlier, all data will be coded using the TDF framework [46]. Codes will be broadly categorized corresponding to the major unit of analysis. As categories emerge, their theoretical properties will be defined. Comparisons between multiple categories will be carried out in order to locate similarities and differences between them. Finally, categories will be synthesized into themes. This process will be replicated for the qualitative data for each ‘case’.

                                Economic analysis

                                To address the fourth study objective, we will conduct an economic analysis [62–64] in parallel to assess the costs associated with implementation versus the benefits related to improvements in wait times, hospital admissions, and ED revisits. To determine whether the CP generates financial benefits (i.e., savings), we will compare its use to standard care. We will also perform a decision analysis of the implementation and uptake of both CP interventions, from the hospital’s perspective with respect to costs. CP-associated costs will be divided into one-time start-up costs for implementation and recurrent costs for CP use and maintenance.

                                Implementation costs

                                Throughout the project, we will track costs for all activities required for successful CP implementation at each site. This will include costs for: document customization and committee approval processes at each site, production of forms, posters and educational aids, preparation and delivery of workshops, and staff participation at these presentations. Additional costs to be tracked include refinement of web-based resources and access, travel and orientation for site champions and opinion leaders, telecommunication support, and other incidental activities. The hourly wage will be assumed to be the provincial mean for health professionals and administrators as per Ontario collective bargaining agreements, and $150 per hour for physician time.

                                Healthcare costs

                                Payer costs associated with emergency care for pediatric patients with asthma or V&D will be accounted for, based on ED visits, return visits, hospital admissions, and physician fees, as determined from the ICES administrative database. Costs for ED visits will be based on standard methodologies, using resource intensity weightings (RIWs) available at ICES from the Canadian Institute for Health Information for hospital admissions and a derived RIW for ED visits. Physician fees associated with ED visits and admissions, including those for laboratory and diagnostic imaging services, will be captured through associated Ontario Health Insurance Plan billings and the fee schedule. Cost data will be presented in Canadian dollars for the standard price year of 2012. To determine the effect of a full CP implementation, these health costs will also be compared for all patients presenting with asthma and V&D at each site in the pre- versus post-implementation phases.

                                Synthesizing costs and effects

                                In the final economic evaluations, additional costs and outcomes will be synthesized in an incremental cost-effectiveness ratio (ICER) comparing intervention versus control. This will yield two cost-effectiveness analyses (CEAs), one for each CP. The ICER for each CEA will be expressed as the incremental cost per proportion of change in the primary clinical outcome for that CP.

                                Uncertainty analysis

                                In the final analysis, because an economic evaluation is always surrounded by uncertainty [64], the robustness of the ICER will be checked for sampling uncertainty using non-parametric bootstrapping. The bootstrapped cost-effectiveness ratios will then be plotted in a cost-effectiveness plane, in which the vertical axis reflects the difference in costs and the horizontal axis reflects the difference in effectiveness. The choice of treatment depends on the maximum cost, known as the ceiling ratio, that society is prepared to pay for a gain in effectiveness. Therefore, the bootstrapped ICERs will also be depicted in a cost-effectiveness acceptability curve [65], showing the probability that the intervention is cost-effective over a range of ceiling ratios.
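                                The bootstrap and acceptability-curve idea can be sketched minimally as follows. This is our construction, not the study's analysis code: the hospital-level incremental costs and effects, the 2,000 replicates, and the candidate ceiling ratios are all hypothetical.

```python
# Our construction (not the authors' code) of non-parametric bootstrapping
# for a cost-effectiveness acceptability curve: resample hospital-level
# incremental costs (CAD) and effects (change in proportion appropriately
# treated), then compute P(cost-effective) at each ceiling ratio.
# All numbers are hypothetical.
import random

random.seed(1)  # reproducible sketch

delta_cost = [1200.0, 900.0, 1500.0, 1100.0, 800.0, 1300.0, 1000.0, 1400.0]
delta_effect = [0.12, 0.08, 0.15, 0.05, 0.10, 0.13, 0.07, 0.11]
pairs = list(zip(delta_cost, delta_effect))

def bootstrap_means(pairs, reps=2000):
    """Resample hospitals with replacement; return mean (dCost, dEffect) pairs."""
    out = []
    for _ in range(reps):
        sample = [random.choice(pairs) for _ in pairs]
        out.append((sum(c for c, _ in sample) / len(sample),
                    sum(e for _, e in sample) / len(sample)))
    return out

def acceptability(boot, ceiling_ratio):
    """Probability the intervention is cost-effective at a given ceiling
    ratio: share of replicates with positive net monetary benefit."""
    return sum(ceiling_ratio * e - c > 0 for c, e in boot) / len(boot)

boot = bootstrap_means(pairs)
curve = {wtp: acceptability(boot, wtp) for wtp in (5000, 10000, 20000, 50000)}
print(curve)
```

                                Plotting `curve` over a fine grid of ceiling ratios gives the acceptability curve; plotting the `boot` pairs directly gives the cost-effectiveness plane.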

                                Ethics and registration

                                Ethical approval has been granted for this study at the coordinating hospital (Children’s Hospital of Eastern Ontario Research Ethics Board), and we are seeking site ethics approval at each hospital as its study participation is confirmed. Informed consent will be sought for interviews and focus group meetings. All research data will be stored on a secure server and transmitted through a secure virtual private network (VPN); after transmission, data will be securely deleted from laptop hard drives using the software Eraser (http://www.eraser.heidi.ie). This trial is registered with ClinicalTrials.gov (http://www.clinicaltrials.gov/NCT01815710).

                                Trial status

                                We are currently in the preparation phase, enrolling sites and seeking local ethics approvals.

                                Discussion

                                CPs represent an important opportunity to narrow the evidence-to-practice gap in specific clinical settings. Given their interprofessional and typically hospital-based nature, integrating CPs into these settings is a complex intervention. Further evidence is needed, however, to guide effective methods for successful CP implementation. We have described a comprehensive study that will generate new knowledge based on a theory-based strategy for implementing CPs in community hospital settings. This study will address behavior change among health professionals, interprofessional team issues, patient outcomes, and health economic impact. While CP outcomes are of ultimate importance, they cannot be viewed as proxies for implementation success. Through evaluation of a broad set of process and clinical outcomes, patient outcomes and clinician management, based on the key interventions for each CP, can be assessed relative to the degree of success in achieving implementation process measures. This knowledge can then be leveraged to inform future CP implementation activities.

                                We expect our study findings to be generalizable to other settings. By including EDs with varying sizes, urban/rural locations and target population patient visits, we will gain a richer understanding of relevant implementation issues across different situations. Through site visits, interviews, and continued communication throughout implementation, we also hope to identify important considerations that can be generalized to other CPs. To ensure a pragmatic approach, we have planned intervention components that can be achieved without significant additional workload or resources. Less reliance on study support and infrastructure will provide insight into success of future implementations without any such external support.

                                An obvious challenge for this project relates to the complex nature of the interventions being studied and our potentially limited control over some aspects of implementation. However, this affirms the importance of conducting this work in a research capacity, with a rigorous concurrent process evaluation. Another challenge relates to ED and hospital issues in particular. EDs are chaotic settings with multiple, often changing, health professional team members. By focusing on pediatric conditions, which are typically an identified need in community EDs, we may uncover general issues with pediatric emergency care in these settings; many of these cannot be addressed with our intervention. Other hospital issues, such as competing priorities, budget restrictions, and staff turnover, may also challenge implementation success at some of our sites. However, these are real issues confronting hospitals and must be considered in any implementation strategy.

                                Our research findings will be important to health professionals and organizations striving to deliver evidence-based care. We will contribute to the body of evidence supporting effective strategies for CP implementation, and ultimately to reducing the research-to-practice gap by operationalizing best-evidence care recommendations through the use of CPs.

                                Authors’ information

                                The team includes expertise in knowledge translation (JG, DJ, JC, SDS, FD), clinical pathways (MJ, DJ, MDL, FD, SDS, TR), trial methodology (DJ, TPK, MDL), biostatistics (MT), health service research (AG), health economics (TR), ED administration (MJ, LMF, MS) and pediatric emergency care (MJ, DJ, TPK, FD). AP provides expertise in health policy and health system strategy, and will ensure that the findings from the study are disseminated through the Ontario Ministry of Health and Long-Term Care.

                                Abbreviations

                                CEAs: Cost-Effectiveness Analyses

                                CPs: Clinical Pathways

                                CPGs: Clinical Practice Guidelines

                                cRCT: Cluster Randomized Controlled Trial

                                CTAS: Canadian Triage Acuity Scale

                                ED: Emergency Department

                                GEE: Generalized Estimating Equation

                                ICC: Intracluster Correlation Coefficient

                                ICD-10: International Statistical Classification of Diseases, 10th Revision

                                ICER: Incremental Cost-Effectiveness Ratio

                                ICES: Institute for Clinical Evaluative Sciences

                                KI: Key Informant

                                LOS: Length of Stay

                                NACRS: National Ambulatory Care Reporting System

                                RIWs: Resource Intensity Weightings

                                TDF: Theoretical Domains Framework

                                V&D: Vomiting and Diarrhea

                                VPN: Virtual Private Network

                                Declarations

                                Acknowledgements

                                I. Funding sources in support of this Study Protocol include:

                                1. CIHR Operating Grant, Health Services and Policy Research Institute; June 2012

                                2. CIHR Meetings, Planning and Dissemination Grant: Partnerships for Health System Improvement; Oct 2010

                                II. Funding sources for authors include the following:

                                Janet Curran holds an Investigator Award from the IWK Health Centre, Halifax, Nova Scotia.

                                Shannon D. Scott receives research personnel funding from CIHR as a New Investigator and AHFMR as a Population Health Investigator.

                                Astrid Guttmann is supported by a CIHR Applied Chair in Child Health Services and Policy Research.

                                Amanda Newton is supported by a New Investigator award from the Canadian Institutes of Health Research (CIHR).

                                Jeremy Grimshaw holds a Canada Research Chair in Health Knowledge Transfer and Uptake.

                                Authors’ Affiliations

                                (1)
                                Division of Emergency Medicine, Children’s Hospital of Eastern Ontario
                                (2)
                                Departments of Pediatrics and Emergency Medicine, University of Ottawa
                                (3)
                                Children’s Hospital of Eastern Ontario Research Institute
                                (4)
                                IWK Health Centre
                                (5)
                                University of Alberta
                                (6)
                                Institute for Clinical Evaluative Sciences
                                (7)
                                Division of Paediatric Medicine, Hospital for Sick Children
                                (8)
                                Department of Paediatrics and Health Policy, Management and Evaluation, University of Toronto
                                (9)
                                College of Pharmacy and Nutrition, University of Saskatchewan
                                (10)
                                Departments of Pediatrics and of Social and Preventive Medicine, University of Montreal
                                (11)
                                Research Centre, CHU Sainte-Justine
                                (12)
                                Departments of Medicine (Respirology), Biomedical and Molecular Sciences (Physiology) and Community Health and Epidemiology, Queen’s University
                                (13)
                                ICES-Queen’s University
                                (14)
                                University of Ottawa
                                (15)
                                Montfort Hospital
                                (16)
                                Champlain Local Health Integrated Network
                                (17)
                                Department of Pediatrics, Faculty of Medicine & Dentistry, University of Alberta
                                (18)
                                Department of Emergency Medicine, Cambridge Memorial Hospital
                                (19)
                                Michael G. DeGroote School of Medicine, McMaster University
                                (20)
                                Ontario Ministry of Health and Long-Term Care
                                (21)
                                Faculty of Medicine, University of Manitoba
                                (22)
                                Manitoba Institute of Child Health
                                (23)
                                Department of Epidemiology and Community Medicine, University of Ottawa
                                (24)
                                Clinical Epidemiology Program, Ottawa Hospital Research Institute
                                (25)
                                Ottawa Hospital Research Institute
                                (26)
                                Department of Medicine, University of Ottawa
                                (27)
                                Division of Emergency Medicine, Alberta Children’s Hospital
                                (28)
                                Alberta Children’s Hospital Research Institute
                                (29)
                                Department of Pediatrics, Physiology and Pharmacology, University of Calgary
                                (30)
                                School of Nursing, Dalhousie University

                                References

                                1. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, Robinson N: Lost in knowledge translation: time for a map? J Contin Educ Health Prof 2006, 26:13–24.PubMedView Article
                                2. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks A, DeCristofaro A, Kerr EA: The quality of health care delivered to adults in the United States. N Engl J Med 2003, 348:2635–2645.PubMedView Article
                                3. National Institute of Clinical Studies: Evidence-practice gaps report volume 1: a review of developments: 2004–2007. Canberra: National Health and Medical Research Council; 2008.
                                4. Huckson S, Davies J: Closing evidence to practice gaps in emergency care: the Australian experience. Acad Emerg Med 2007, 14:1058–1063.PubMed
                                5. Seddon ME, Marshall MN, Campbell SM, Roland MO: Systematic review of studies of quality of clinical care in general practice in the UK, Australia and New Zealand. Qual Health Care 2001,10(3):152–158.PubMedView Article
                                6. Lougheed MD, Garvey N, Chapman KR, Cicutto L, Dales R, Day AG, Hopman WM, Lam M, Sears MR, Szpiro K, To T, Paterson NA: Variations and gaps in management of acute asthma in Ontario emergency departments. Chest 2009, 135:724–736. Published ahead of print on-line November 18, 2008. ChestPubMedView Article
                                7. Freedman SB, Gouin S, Bhatt M, Black KJ, Johnson D, Guimont C, Joubert G, Porter R, Doan Q, van Wylick R, Schuh S, Atenafu E, Eltorky M, Cho D, Plint A: Pediatric emergency research Canada: prospective assessment of practice pattern variations in the treatment of pediatric gastroenteritis. Pediatrics 2011.
                                8. Goodman DC: Unwarranted variation in pediatric medical care. Pediatr Clin North Am 2009,56(4):745–55.PubMedView Article
                                9. Rowe BH, Diner B, Camargo CA Jr, Worster A, Colacone A, Wyer PC, Knowledge Translation-Consensus Conference Theme lb Members: Effective synthesized/Pre-appraised evidence formats in emergency medicine and the Use of supplemental knowledge translation techniques. Acad Emerg Med 2007, 14:1023–1029.PubMed
                                10. Grol R: Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care 2001, 39:46–54.View Article
                                11. : Ontario’s Emergency room wait times strategy. http://​www.​health.​gov.​on.​ca/​en/​pro/​programs/​waittimes/​edrs/​strategy.​aspx
                                12. Kurtin P, Stucky E: Standardize to excellence: improving the quality and safety of care with clinical pathways. Pediatr Clin North Am 2009,56(4):893–904.PubMedView Article
                                13. Vanhaecht K, De Witte K, Panella M, Sermeus W: Do pathways lead to better organized care processes? J Eval Clin Pract 2009, 15:782–788.PubMedView Article
                                14. De Bleser L, Depreitere R, De Waele K, Vanhaecht K, Vlayen J, Sermeus W: Defining pathways. J Nurs Manag 2006,14(7):553–63.PubMedView Article
                                15. Browne GJ, Giles H, McCaskill ME, Fasher BJ, Lam LT: The benefits of using clinical pathways for managing acute paediatric illness in an emergency department. J Qual Clin Pract 2001, 21:50–55.PubMedView Article
                                16. Thomson P, Angus NJ, Scott J: Building a framework for getting evidence into critical care education and practice. Intensive Crit Care Nurs 2000,16(3):164–174.PubMedView Article
                                17. Joint Policy Statement: Guidelines for care of children in the emergency department. Pediatrics 2009, 124:1233–1243.View Article
                                18. Kozer E, Scolnik D, MacPherson A, Rauchwerger D, Koren G: Using a preprinted order sheet to reduce prescription errors in a pediatric emergency department: a randomized, controlled trial. Pediatrics 2005, 116:1299–1302.PubMedView Article
                                19. McCue JD, Beck A, Smothers K: Quality toolbox: clinical pathways can improve core measure scores. J Healthc Qual 2009,31(1):43–50.PubMedView Article
                                20. Kent P, Chalmers Y: A decade on: has the use of integrated care pathways made a difference in Lanarkshire? J Nurs Manag 2006,14(7):508–20.PubMedView Article
                                21. Panella M, Marchisio S, Di Stanislao F: Reducing clinical variations with clinical pathways: do pathways work? Int J Qual Health Care 2003,15(6):509–521.PubMedView Article
                                22. Gaddis GM, Greenwald P, Huckson S: Toward improved implementation of evidence-based clinical algorithms: clinical practice guidelines, clinical decision rules, and clinical pathways. Acad Emerg Med 2007, 14:1015–1022.PubMed
                                23. Vanhaecht K, Bollmann M, Bower K, Gallagher C, Gardini A, Guezo J, Jansen U, Massoud R, Moody K, Sermeus W, Van Zelm R, Whittle C, Yazbeck AM, Zander K, Panella M: Prevalence and use of clinical pathways in 23 countries – an international survey by the European pathway association. Intl J Care Pathw April 2006, 10:28–34.View Article
                                24. Darzi A: High quality care for all: NHS Next Stage Review final report. 2008. http://​www.​dh.​gov.​uk/​prod_​consum_​dh/​groups/​dh_​digitalassets/​@dh/​@en/​documents/​digitalasset/​dh_​085828.​pdf
                                25. Rotter T, Kinsman L, James E, Machotta A, Gothe H, Willis J, Snow P, Kugler J: Clinical pathways: effects on professional practice, patient outcomes, length of stay and hospital costs. Cochrane Database Syst Rev 2010, 3:CD006632.PubMed
                                26. Kinsman LD, Buykx P, Humphreys JS, Snow PC, Willis J: A cluster randomized trial to assess the impact of clinical pathways on AMI management in rural Australian emergency departments. BMC Health Serv Res 2009, 9:83.PubMedView Article
                                27. Evans-Lacko S, Jarrett M, McCrone P, Thornicroft G: Facilitators and barriers to implementing clinical care pathways. BMC Health Serv Res 2010, 10:182.PubMedView Article
                                28. Simmons J, Kotagal UR: Reliable implementation of clinical pathways: what will it take-that is the question. J Pediatr 2008,152(3):303–4.PubMedView Article
                                29. Van Herck P, Vanhaecht K, Sermeus W: Effects of clinical pathways: do they work? J Int Care Path 2004,8(3):95–107.
                                30. De Allegri M, Schwarzbach M, Loerbroks A, Ronellenfitsch U: Which factors are important for the successful development and implementation of clinical pathways? A qualitative study. BMJ Qual Saf 2011,20(3):203–208. Date of Electronic Publication:?2011 Jan 05PubMedView Article
                                31. Sharek PJ, Mullican C, Lavanderos A, Palmer C, Snow V, Kmetic K, Antman M, Knutson D, Dembry LM: Best practice implementation: lessons learned from 20 partnerships. Jt Comm J Qual Patient Saf 2007,33(12):16–26.PubMed
                                32. Rumelt RP: Good Strategy/Bad Strategy. New York: Crown Business/Random House; 2011.
                                33. Lougheed MD, Garvey N, Chapman KR, Cicutto L, Dales R, Day AG, Hopman WM, Lam M, Sears MR, Szpiro K, To T, Paterson NA, Ontario Respiratory Outcomes Research Network: The Ontario asthma regional variation study: emergency department visit rates and the relation to hospitalization rates. Chest 2006,129(4):909–17.PubMedView Article
                                34. Guttmann A, Zagorski B, Austin PC, Schull M, Razzaq A, To T, Anderson G: Effectiveness of emergency department asthma management strategies on return visits in children: a population-based study. Pediatrics 2007,120(6):e1402–10.PubMedView Article
                                35. Bhogal S, McGillivray D, Bourbeau J, Benedetti A, Bartlett S, Ducharme FM: Early administration of systemic corticosteroids reduces hospital admission rates in children with moderate and severe asthma exacerbation. Ann Emerg Med 2012,60(1):84–91.e3.PubMedView Article
                                36. Zemek R, Plint A, Osmond MH, Kovesi T, Correll R, Perri N, Barrowman N: Triage nurse-initiation of corticosteroids in pediatric asthma is associated with improved ED efficiency. Pediatrics 2012,129(4):671–80.PubMedView Article
                                37. Gorelick MH, Shaw KN, Murphy KO: Validity and reliability of clinical signs in the diagnosis of dehydration in children. Pediatrics 1997,99(5):e6.PubMedView Article
                                38. Fedorowicz Z, Jagannath VA, Carter B: Antiemetics for reducing vomiting related to acute gastroenteritis in children and adolescents. Cochrane Database Syst Rev 2011, 9:CD005506.
                                39. Hartling L, Bellemare S, Wiebe N, Russell KF, Klassen TP, Craig WR: Oral versus intravenous rehydration for treating dehydration due to gastroenteritis in children. Cochrane Database Syst Rev 2006, 3:CD004390.
                                40. Verstappen WHJM, van der Weijden T, ter Riet G, Grimshaw J, Winkens R, Grol RPTM: Block design allowed for control of the Hawthorne effect in a randomized controlled trial of test ordering. J Clin Epidemiol 2004, 57:1119–1123.PubMedView Article
                                41. Scott SD, Grimshaw J, Klassen TP, Nettel-Aguirre A, Johnson DW: Understanding implementation processes of clinical pathways and clinical practice guidelines in pediatric contexts: a study protocol. Implement Sci 2011, 6:133.PubMedView Article
                                42. Lougheed MD, Olajos-Clow J, Szpiro K, Moyse P, Julien B, Wang M, Day AG: Emergency medicine advances: multicentre evaluation of an emergency department asthma care pathway for adults. CJEM 2009,11(3):215–29.PubMed
                                43. Szpiro KA, Harrison MB, VandenKerkhoff EG, Lougheed MD: Asthma education delivered in an emergency department and an asthma education centre: A feasibility study. Adv Emerg Nurs J 2009,31(1):65–77.
                                44. Bhogal S, McGillivray D, Bourbeau J, Plotnick LH, Bartlett SJ, Benedetti A, Ducharme FM: Focusing the focus group: impact of the awareness of major factors contributing to non-adherence to acute paediatric asthma guidelines. J Eval Clin Pract 2011,17(1):160–167.
                                45. Bhogal S, Bourbeau J, McGillivray D, Benedetti A, Bartlett S, Ducharme F: Adherence to pediatric asthma guidelines in the emergency department: a survey of knowledge, attitudes and behaviour among health care professionals. Can Respir J 2010, 17:175–82.PubMed
                                46. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A: Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care 2005,14(1):26–33.PubMedView Article
                                47. French SD, Green S, Buchbinder R, Barnes H: Interventions for improving the appropriate use of imaging in people with musculoskeletal conditions. Cochrane Database Syst Rev 2010, 1:CD006094.PubMed
                                48. Caine J, Michie S: Validating a theoretical framework for implementation and other behaviour change research. Psychol Health 2011,26(Suppl 1).
                                49. Morse J, Field P: Qualitative research methods for health professionals. 2nd edition. Thousand Oaks: Sage; 1995.
                                50. Michie S, Johnston M, Francis J, Hardeman W, Eccles M: From theory to intervention: mapping theoretically derived behavioural determinants to behaviour change techniques. Applied Psychology 2008,57(4):660–680.View Article
                                51. Michie S, van Stralen MM, West R: The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implement Sci 2011,6(1):42.PubMedView Article
                                52. Li PTS: A population-based study on the association of standardized protocols in the emergency department for childhood asthma with outcomes in Ontario, Canada. MSc thesis. Toronto: University of Toronto, Graduate Department of Health Policy, Management and Evaluation (Clinical Epidemiology and Health Care Research); 2011.
                                53. Kinlin LM, Bahm A, Guttmann A, Freedman SB: A survey of emergency department resources and strategies employed in the treatment of pediatric gastroenteritis. Acad Emerg Med, in press.
                                54. Raab GM, Butcher I: Balance in cluster randomized trials. Stat Med 2001, 20:351–365.View Article
                                55. Chaudhary MA, Moulton LH: A SAS macro for constrained randomization of group-randomized designs. Comput Methods Programs Biomed 2006,83(3):205–210.PubMedView Article
                                56. Scott SD, Sharpe H, O'Leary K, DeHaeck U, Hindmarsh K, Moore JG, Osmond MH: Court reporters: a viable solution for the challenges of focus group data collection? Qual Health Res 2009,19(1):140–6.PubMedView Article
                                57. Plint AC, McGahern C, Taljaard M, Scott S, Grimshaw J, Klassen TP, Johnson DW: Abstract 146: Practice variation in bronchiolitis management in Ontario community emergency departments [abstract]. CJEM 2010,12(3):229–278.
                                58. Guttmann A, Weinstein M, Bhamani A, Austin P, Anderson G: Room for improvement across all emergency department settings: discretionary use of X-rays for acute respiratory conditions in children. CJEM 2012, in press.
                                59. Canadian Institute for Health Information (CIHI): CIHI data quality study of emergency department visits for 2004–2005: volume II of IV - main study findings. Ottawa: CIHI; 2008. https://secure.cihi.ca/free_products/vol1_nacrs_executive_summary_nov2_2007.pdf
                                60. Donner A, Klar N: Design and analysis of cluster randomization trials in health research. New York: Arnold Publishers; 2000.
                                61. Hayes RJ, Moulton LH: Cluster randomized trials. Boca Raton: Chapman & Hall/CRC Press; 2009.View Article
                                62. Glick HA, Doshi JA, Sonnad SS, Polsky D: Economic evaluation in clinical trials. Oxford: Oxford University Press; 2007.
                                63. Ramsey S, Willke R, Briggs A, Brown R, Buxton M, Chawla A, Cook J, Glick H, Liljas B, Petitti D, Reed S: Good research practices for cost-effectiveness analysis alongside clinical trials: the ISPOR RCT-CEA Task Force report. Value Health 2005, 8:521–533.PubMedView Article
                                64. Briggs AH, O'Brien BJ, Blackhouse G: Thinking outside the box: recent advances in the analysis and presentation of uncertainty in cost-effectiveness studies. Annu Rev Public Health 2002, 23:377–401.PubMedView Article
                                65. Fenwick E, O'Brien BJ, Briggs A: Cost-effectiveness acceptability curves–facts, fallacies and frequently asked questions. Health Econ 2004, 13:405–415.PubMedView Article
                                66. Guttmann A, Schull MJ, Fund C, Vermeulen MJ, Stukel TA: Association between waiting times and short term mortality. BMJ 2011, 342:d2983.PubMedView Article
                                67. Rowe B, Spooner H, Ducharme F, Bretzlaff J, Bota G: Early emergency department treatment of acute asthma with systemic corticosteroids. Cochrane Database Syst Rev 2001, 1:CD002178.
                                68. Becker A, Bérubé D, Chad Z, Dolovich M, Ducharme F, D’Urzo TD, Ernst P, Ferguson A, Gillespie C, Kapur S, Kovesi T, Lyttle B, Mazer B, Montgomery M, Pedersen S, Pianosi P, Reisman JJ, Sears M, Simons E, Spier S, Thivierge R, Watson W, Zimmerman B, Canadian Network For Asthma Care; Canadian Thoracic Society: Canadian pediatric asthma consensus guidelines. CMAJ 2005,173(6):S12–4.PubMed
                                69. Ducharme FM, Chalut D, Plotnick L, Savdie C, Kudirka D, Zhang X, Meng L, McGillivray D: The pediatric respiratory assessment measure: a valid clinical score for assessing acute asthma severity from toddlers to teenagers. J Pediatr 2008,152(4):476–480e1.PubMedView Article
                                70. Chalut D, Ducharme F, Davis G: The preschool respiratory assessment measure (PRAM): a responsive index of acute asthma severity. J Pediatr 2000,137(6):762–768.PubMedView Article
                                71. Gorelick MH, Stevens MW, Schultz T, Scribano PV: Difficulty in obtaining peak expiratory flow measurements in children with acute asthma. Pediatr Emerg Care 2004, 20:22–6.PubMedView Article

                                Copyright

                                © Jabbour et al.; licensee BioMed Central Ltd. 2013

                                This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://www.creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.