Study protocol · Open Access
Care team and practice-level implementation strategies to optimize pediatric collaborative care: study protocol for a cluster-randomized hybrid type III trial
Implementation Science volume 17, Article number: 20 (2022)
Implementation facilitation is an effective strategy to support the implementation of evidence-based practices (EBPs), but our understanding of multilevel strategies and the mechanisms of change within the “black box” of implementation facilitation is limited. This implementation trial seeks to disentangle and evaluate the effects of facilitation strategies that separately target the care team and leadership levels on implementation of a collaborative care model in pediatric primary care. Strategies targeting the provider care team (TEAM) should engage team-level mechanisms, and strategies targeting leaders (LEAD) should engage organizational mechanisms.
We will conduct a hybrid type 3 effectiveness–implementation trial with a 2 × 2 factorial design to evaluate the main and interactive effects of TEAM and LEAD and to test for mediation and moderation of those effects. Twenty-four pediatric primary care practices will receive standard Replicating Effective Programs (REP) training to implement Doctor–Office Collaborative Care (DOCC) and then be randomized to (1) Standard REP only, (2) TEAM, (3) LEAD, or (4) TEAM + LEAD. Implementation outcomes are DOCC service delivery and change in practice-level care management competencies. Clinical outcomes are child symptom severity and quality of life.
This statewide trial is one of the first to test the unique and synergistic effects of implementation strategies targeting care teams and practice leadership. It will advance our knowledge of effective care team and practice-level implementation strategies and mechanisms of change. Findings will support efforts to improve common child behavioral health conditions by optimizing scale-up and sustainment of CCMs in a pediatric patient-centered medical home.
ClinicalTrials.gov, NCT04946253. Registered June 30, 2021.
Benefits and challenges of integrated care models to improve pediatric behavioral health
Fewer than half of all children with disruptive behavior disorders (DBD; 46%) or attention deficit hyperactivity disorder (ADHD; 48%) receive treatment, so many may exhibit long-term impairments. Proactive intervention by pediatric primary care providers (PCPs) in patient-centered medical homes may prevent or attenuate these impairments [3,4,5,6,7,8]. Integrated care approaches, such as collaborative care models (CCMs), target behavioral health (BH) problems in health care settings [9,10,11,12]. Meta-analyses show that these approaches improve clinical outcomes in adults [12,13,14,15,16], especially women and people of color [17, 18], and, to a lesser extent, in children/youth [3, 8].
Based on Wagner’s Chronic Care Model, CCMs include core components (e.g., delivery system redesign, self-management support) [11, 20] to support key features that include team-based care, progress monitoring, and brief evidence-based interventions [3, 11, 21]. CCM teams typically include PCPs, care managers/coordinators, and a mental health specialist (e.g., psychiatrist) who provides consultation and decision support for complex cases, with most functions coordinated and delivered by the care manager. Because CCMs are complex multi-component interventions, their implementation presents practical challenges at multiple levels [20, 22,23,24,25,26,27,28,29,30,31].
Multilevel determinants of CCM implementation
This study draws upon the EPIS framework to organize our understanding of barriers and facilitators (i.e., determinants) [32, 33]. Common inner context determinants of CCMs include those related to individual provider characteristics (e.g., attitudes, self-efficacy), leadership, and organizational characteristics [20, 25,26,27,28,29,30, 34]. Team functioning is also a key determinant in team-based service settings like primary care. Implementing evidence-based practices (EBPs) requires teams to adapt in response to new demands. Team functioning includes affective, behavioral, and cognitive processes and states (e.g., trust, coordination, shared knowledge) and is associated with implementation and patient outcomes [35,36,37,38,39]. At the leadership and organizational levels, successful CCM implementation requires supportive leadership, positive organizational climate and culture, and a strong implementation climate. Organizations that reinforce EBP use and provide ongoing support to providers set the stage for successful adoption [40, 41]. Effective leaders encourage positive views about the innovation, leverage time and resources to support it, and may directly champion implementation [41,42,43,44,45,46,47,48].
Achieving public health impact requires scale-up and sustainment of CCMs, especially in low-resource areas [3, 22,23,24, 28, 31, 49]. However, the current science about how to target these determinants provides few answers. None of the trials in the aforementioned pediatric meta-analysis tested the effects of specific implementation strategies on provider or patient outcomes or their mechanisms of action [3, 8]. We lack effective implementation strategies to guide the scale-up and maintenance of CCMs in pediatric medical homes.
Implementation facilitation can promote uptake of CCMs
Multi-level implementation strategies targeting CCM determinants can improve implementation outcomes [51, 52]. One promising approach is implementation facilitation, a type of interactive assistance designed to overcome barriers and leverage strengths to foster EBP implementation [52, 53]. Facilitation, based on the PARiHS framework, is a discrete and multifaceted implementation strategy intended to be flexible and responsive to local circumstances [55, 56]. It is hypothesized to promote organizational learning, although our understanding of the specific mechanisms through which facilitation operates is limited [56,57,58,59].
Facilitation has been broadly operationalized in two forms, sometimes described as external and internal facilitation. External facilitation involves the use of a facilitator outside of the organization who provides ongoing consultation, coaching, and support to enhance the clinical competencies of providers [52, 60,61,62,63,64]. Facilitation strategies that support front-line providers’ capacity to adopt and deliver a CCM have improved uptake, fidelity, and clinical outcomes in mental health and primary care settings [53, 61, 62, 65, 66].
Internal facilitation involves supporting and training leaders to serve as facilitators within their settings who can bolster EBP delivery by reducing organizational barriers [52, 53, 60, 67, 68]. These strategies (e.g., mentoring managers to adapt workflows and support/reinforce EBP delivery) are designed to reduce organizational barriers and leverage resources to support EBP integration. Internal facilitation has augmented the impact of external facilitation on uptake in community settings, but not always [51, 53, 65, 69]. Generally, research has shown benefits of internal facilitation on EBP competencies/fidelity with providers in adult primary care [66, 70] and mental health agencies [58, 71, 72].
Some studies, however, have not found incremental benefits for external or internal facilitation, and others have found more limited benefits of internal facilitation in typical, low-resource community practices [53, 65, 69]. Internal facilitation has primarily been examined in combination with external facilitation, so its separate effects are relatively unknown. Importantly, most studies have used facilitation to target multiple levels (e.g., individual, team, leadership, organization) simultaneously, including using “blended” or “two-tiered” facilitation models [73, 74], limiting our understanding of mechanisms of change within the “black box” of implementation facilitation [57,58,59].
Care team providers and practice leaders have different levers of influence that can aid in sustainment of EBPs [22, 24]. At this point, no implementation trial to our knowledge has evaluated the separate and combined effects of facilitation strategies targeting the care team and facilitation strategies targeting practice leadership. It is plausible that these two facilitation strategies have synergistic effects on implementation outcomes by potentiating greater engagement of their respective target mechanisms [75, 76]. Strategies targeting the provider care team should engage team-level mechanisms (e.g., team functioning), whereas strategies targeting leaders should engage organizational mechanisms (e.g., implementation climate, implementation leadership) [77, 78]. Testing mechanisms of action of implementation strategies at specific levels will advance implementation science [79,80,81,82,83,84].
We propose to disentangle and further refine facilitation strategies targeting the care team and leadership levels to support implementation of a CCM in pediatric primary care. We will conduct a cluster-randomized, hybrid type 3 effectiveness–implementation trial using a 2 × 2 factorial design to test the main and interactive effects of implementation strategies that target the care team level (TEAM) or leadership level (LEAD) on implementation and clinical outcomes. All practices will first receive standard implementation strategies based on the Replicating Effective Programs (REP) model. Practices will then be randomized to one of four conditions: (1) Standard REP only, (2) TEAM, (3) LEAD, or (4) TEAM + LEAD.
Standard REP is a low-cost and low-burden strategy consisting of a tailored intervention manual, didactic training, and brief technical support [87,88,89]. REP is based on the Centers for Disease Control and Prevention’s Research-to-Practice Framework [86, 90, 91] and derived from Social Learning Theory and Rogers’ diffusion model. It is easily scalable in most community-based practices. Although standard REP alone may help some sites to adequately adopt an EBP, evidence suggests it is unlikely to be sufficient in many lower-resourced settings, and augmentations to REP may be necessary. In this study, we will evaluate the effects of augmenting REP with two different types of facilitation (TEAM and LEAD) targeting different levels, audiences, and mechanisms.
TEAM facilitation is informed by existing approaches to facilitation, including external facilitation [52, 53, 67], practice facilitation [62, 64, 94, 95], and coaching [63, 64], in which an outside expert helps providers improve EBP uptake. TEAM also incorporates strategies from team development interventions (i.e., team building [96, 97], team training [98, 99], debriefing [100, 101]) to improve care team functioning and effectiveness. TEAM aims to improve implementation outcomes by targeting provider clinical competencies, team functioning, and team integration/quality.
LEAD is based on the internal facilitation role described by Kirchner and by Kilbourne et al. It focuses on reducing organizational barriers to implementation by promoting practice champions who sustain the EBP. LEAD aims to improve implementation outcomes by targeting implementation climate [77, 103, 104] and implementation leadership [105, 106].
The EBP: DOCC
Doctor–Office Collaborative Care (DOCC) is a cross-diagnosis intervention for treating DBDs and comorbid ADHD in community pediatric practices [107,108,109]. DOCC is based on the CCM’s core components adapted for the medical home [11, 12, 50, 110]. In randomized clinical trials, DOCC improved service access (99% vs. 46%) and completion (77% vs. 12%), personalized behavioral and ADHD targets, externalizing and ADHD symptoms, remission rates, family satisfaction, and provider self-efficacy and effectiveness, with most symptom resolution in fewer than 11 contacts [107,108,109, 111]. We also documented lower BH care costs at 12-month follow-up for DOCC.
In this trial, the DOCC package includes an implementation guide and a provider treatment manual. The implementation guide includes resources and guidelines for key care management processes that support the six CCM principles: organizational support, delivery system redesign (e.g., team roles, workflows), clinical decision support (e.g., use of standardized rating scales), clinical information systems (e.g., patient registry), self-management support (e.g., workbook), and community resources. The treatment manual includes DBD modules with skills for caregivers and children (e.g., anger management, parenting) and ADHD modules that address psychoeducation, shared decision-making, and medication recommendations. The care team is responsible for assessing treatment progress, individualizing session frequency and treatment dose, and coordinating with other services.
This trial seeks to accelerate understanding of the implementation strategies needed to deliver and scale up CCMs in pediatric primary care. The specific aims are the following:
Aim 1: test the effects of TEAM and LEAD on implementation outcomes and child clinical outcomes
Our implementation outcomes are DOCC service delivery, specifically the number of encounters for each case (primary), and change in practice-level care management competencies (exploratory) at 4 timepoints (6, 12, 18, and 24 months). We will also examine secondary clinical outcomes (change in severity of child symptoms). We hypothesize that augmenting REP with TEAM alone or LEAD alone is superior to Standard REP, and that TEAM + LEAD is superior to all other conditions because it targets both levels.
Aim 2: test for target engagement in each implementation condition and if hypothesized mechanisms mediate the effects of TEAM and LEAD on implementation outcomes
Our hypothesized mechanisms for TEAM are team functioning and effectiveness; hypothesized mechanisms for LEAD are implementation leadership and implementation climate. We hypothesize that each condition will have main effects on one or both of its targets. We also expect that the effects of TEAM and LEAD on outcomes will be mediated by their hypothesized targets.
Aim 3: examine provider-, practice-, and family-level moderators of the effects of TEAM or LEAD
Proposed provider-level moderators of TEAM effects are attitudes about delivering BH care and care manager discipline. Proposed practice-level moderators of LEAD are baseline implementation leadership and climate. Lastly, proposed family-level moderators of TEAM or LEAD are caregiver gender, caregiver race/ethnicity, and child baseline symptom severity.
We propose a hybrid type 3, cluster-randomized effectiveness–implementation trial in a 2 × 2 factorial design in 24 pediatric primary care practices across Pennsylvania. After all sites receive Standard REP, they will be randomized to one of four conditions: (1) Standard REP only (continued technical assistance), (2) TEAM, (3) LEAD, or (4) TEAM + LEAD. Figure 1 outlines the trial design. Care teams will deliver DOCC to 25 children with elevated behavioral problems and their caregivers. We will collect longitudinal data from practice staff and caregivers. SPIRIT, CONSORT, and TIDieR checklists [113,114,115] are in Supplemental File 1, and CONSORT flow diagrams are in Supplemental File 2. All procedures were approved by the University of Pittsburgh Institutional Review Board.
Setting and participants
Study sites will be 24 primary care practices affiliated with the Pennsylvania Chapter of the American Academy of Pediatrics Medical Home Program (PA-MHP) [4, 7, 116]. These practices are heterogeneous with respect to their size, location, health system, insurance mix, and population diversity. PA-MHP leadership identified eligible and interested practices and organized orientation meetings with key practice leaders.
In each practice, we will enroll the lead PCP or medical director (N = 24) and practice manager (N = 24). Individuals in these positions are responsible for decision-making and management tasks at their sites. Lead PCPs and practice managers in practices randomized to LEAD will participate in Leadership Facilitation.
Primary care providers (PCPs)
We seek to enroll all eligible PCPs at participating practices (M = 4.4 PCPs per practice, range: 1–10) to maximize the likelihood that families have an enrolled PCP and enhance generalizability. All PCPs will be provided access to training and encouraged to deliver DOCC. PCPs in practices randomized to TEAM will participate in Team Facilitation.
Care managers (CMs)
In each practice, we will enroll the individual acting as the practice’s behavioral health resource to serve as the CM (N = 24). Individuals in this role may vary in professional discipline (e.g., nursing, mental health) and experience delivering psychosocial interventions. The CM will deliver DOCC in collaboration with PCPs and the care team. CMs in practices randomized to TEAM will participate in Team Facilitation.
We will enroll up to 25 caregivers of 5–12-year-old children in each practice, for a possible total of 600 caregivers. Eligibility criteria are (1) child age (5–12 years), (2) parent/guardian with parental rights, and (3) child meets the clinical cutoff on the 7-item externalizing problems scale of the Pediatric Symptom Checklist-17 (PSC-17). Based on prior trials, we expect to recruit more female than male caregivers and for participants to vary in race/ethnicity by practice. Our primary informant is the caregiver due to the children’s young age and ethical concerns about assessing children without face-to-face contact.
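For concreteness, the three eligibility criteria can be sketched as a simple screening function. This is an illustrative sketch only: the item scoring convention (0 = never, 1 = sometimes, 2 = often) and the externalizing cutoff of 7 are assumptions drawn from common PSC-17 usage, not values stated in this protocol.

```python
EXTERNALIZING_CUTOFF = 7  # assumed clinical cutoff; not specified in the protocol text

def is_eligible(child_age: int, has_parental_rights: bool,
                externalizing_items: list[int]) -> bool:
    """Apply the three eligibility criteria to one screened family.

    PSC-17 items are assumed to be rated 0 = never, 1 = sometimes,
    2 = often; the 7-item externalizing subscale is summed and
    compared with a clinical cutoff.
    """
    if len(externalizing_items) != 7 or any(i not in (0, 1, 2) for i in externalizing_items):
        raise ValueError("expected seven PSC-17 externalizing items scored 0-2")
    return (5 <= child_age <= 12
            and has_parental_rights
            and sum(externalizing_items) >= EXTERNALIZING_CUTOFF)
```

A family failing any one criterion (age out of range, no parental rights, or subscale below cutoff) is screened out.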
Implementation conditions and strategies
The four implementation conditions are shown in Fig. 1. Table 1 lists the mechanisms of change at the individual, team, and practice levels targeted by each condition. Supplemental File 3 shows the ERIC implementation strategies included within each condition and their hypothesized mechanisms of action (Supplemental Table 1) and the specific actions within each condition (Supplemental Table 2).
Standard REP (no facilitation)
All practices will receive DOCC manuals and training and participate in the same study initiation meetings (e.g., staff introductions, orientation to study procedures). Each provider will receive access to DOCC virtual training, which includes content and care processes organized into nine clinical topics, each with brief videos and post-training knowledge quizzes. Providers can use the platform to contact study staff for technical assistance and clarification or discussion of training content. The training platform will record data on progress, completion, and performance (e.g., modules accessed, quiz scores), and providers will receive continuing education credits. All sites will receive ongoing technical assistance during the implementation phase.
Care team facilitation (TEAM)
TEAM is a phased approach designed to improve providers’ skill in using DOCC, teamwork quality, and team effectiveness. The TEAM facilitator is a licensed clinician who delivered DOCC in a prior trial and has lived experience as a consumer of integrated care for a child with ADHD. TEAM facilitation will occur via a graded schedule of videoconference calls over 18 months (weekly to bimonthly).
In the first phase of TEAM, the facilitator will engage the care team in identifying barriers and facilitators, enhancing motivation to use DOCC, and setting goals for implementation in their practice. The second phase focuses on reviewing and revising roles, responsibilities, and workflows within the team to improve collaboration, coordination, and use of DOCC. As part of this phase, the facilitator will support the team in developing effective communication and problem-solving skills. The third phase focuses on increasing the team’s competency and fidelity to DOCC through ongoing training in treatment content (e.g., didactics, modeling, role plays), structured reviews of patient progress, audit and feedback on use of the patient registry, consultation on challenging cases, and support in overcoming barriers and balancing model adaptations and fidelity. The last phase of TEAM focuses on the team’s capacity to sustain and continually improve DOCC in their practice. The facilitator will conduct structured team debriefings and encourage the team to identify and address potential problems in team processes. The facilitator will also guide the team in reflecting on implementation, reviewing and revising implementation goals, and planning for sustainability and continuous quality improvement.
Practice leadership facilitation (LEAD)
LEAD is a phased approach designed to strengthen practice leadership’s capacity to lead change and overcome practice-level barriers. The LEAD facilitator and consulting psychiatrist is a pediatrician and faculty member with expertise in pediatric integrated care, consulting with PCPs and CMs, ADHD medication management, and behavior problems. LEAD will follow the same graded schedule of videoconference calls as TEAM.
In the first phase of LEAD, the facilitator will engage leadership in identifying barriers and facilitators to DOCC, learning about the model, and setting goals for practice implementation. The second phase focuses on aligning implementation with practice priorities, reinforcing leaders’ attention to DOCC, and creating an action plan to reach implementation goals. During the third phase, the facilitator will engage leadership in reducing barriers by leveraging relationships, aligning fiscal resources with DOCC core activities, engaging community partners, and sharing lessons learned from other practice leaders. The last phase focuses on leadership’s capacity to sustain and continually improve DOCC in their practice by encouraging them to think strategically about system-level barriers and facilitators and plan for sustainability and ongoing monitoring of progress. The facilitator will guide leaders to reflect on implementation goals and work to transfer responsibility for DOCC use in their practice.
Team and leadership facilitation (TEAM + LEAD)
TEAM + LEAD combines both approaches described above to improve distinct but potentially complementary targets at the team and practice levels. Practices randomized to TEAM + LEAD will participate in all the above activities, and the TEAM and LEAD facilitators will work together to align the goals and actions of the care team and leadership.
For feasibility, the trial will occur in three cohorts. Randomization will occur at the beginning of each planned cohort. If possible, we will stratify the practices in each cohort by Medicaid rate before randomizing. Randomization will be conducted by the project’s data manager using random number generation in SAS. Practices will be informed of randomization after baseline data collection is completed. Research staff who have contact with participants will be unaware of implementation condition.
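The stratified assignment described above can be illustrated with a short sketch. The protocol specifies SAS; this Python version only mirrors the logic: split each cohort at its median Medicaid rate, then randomly assign conditions in balance within each stratum. The cohort size of eight practices (24 practices over three cohorts) and the practice-ID inputs are assumptions for illustration.

```python
import random

CONDITIONS = ["REP only", "TEAM", "LEAD", "TEAM + LEAD"]

def randomize_cohort(practices: dict[str, float], seed: int = 0) -> dict[str, str]:
    """Assign each practice in one cohort to a condition, balanced within
    strata defined by Medicaid rate.

    `practices` maps practice ID -> Medicaid rate (hypothetical inputs);
    the split at the median rate is one plausible stratification rule.
    """
    rng = random.Random(seed)
    ordered = sorted(practices, key=practices.get)  # low to high Medicaid rate
    low, high = ordered[: len(ordered) // 2], ordered[len(ordered) // 2 :]
    assignments: dict[str, str] = {}
    for stratum in (low, high):
        # Shuffling the practices (not the conditions) randomizes which
        # practice receives which condition within the stratum.
        rng.shuffle(stratum)
        conditions = CONDITIONS * (len(stratum) // len(CONDITIONS) + 1)
        for practice, condition in zip(stratum, conditions):
            assignments[practice] = condition
    return assignments
```

With eight practices per cohort, each stratum of four receives each condition exactly once, so every condition accrues two practices per cohort.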
Caregiver recruitment and screening
Caregivers of children visiting the practice will be informed about the study using multiple methods (e.g., posters, brochures, PCP referral). Recruitment materials, available in English and Spanish, direct caregivers to their PCP and a website with more information and a short orientation video. Caregivers complete an online screening process. Eligible caregivers will then be given login credentials to access and complete an online consent and the baseline assessment via a smartphone, tablet, or computer. Caregivers who cannot read or write and caregivers without a smartphone or computer are directed to call the study coordinator to complete the screening and consent process.
Professional and caregiver assessments
Practice staff will complete assessments at 0, 6, 12, 18, and 24 months. Caregivers will complete assessments at 0, 3, 6, and 12 months and provide consent for their child’s teacher to complete rating scales at each timepoint. Measures are listed in Table 2. All assessments can be completed online, on paper, or by phone with research staff. Participants will be paid for each completed assessment. We estimate high retention rates for practice staff (> 90%) and caregivers (> 85%) based on our previous trials (91–92%) [107,108,109]. We will use strategies from prior trials (e.g., trained staff, on-time bonus payments) to support retention.
At baseline, each practice manager will complete a Practice Information and Needs Survey (PINS) form, and professionals will complete a Staff Information Form (SIF). Caregivers will complete a Family Information Form (FIF) to provide background information, including any other treatment services.
Services Provided Log (SPL)
After each service encounter, providers will record the type of contact (e.g., treatment session, collaborative care meeting), individuals present, intervention content, and plans for next contact [107, 108]. Our primary outcome is the number of DOCC service encounters delivered to each patient by all providers.
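Computing the primary outcome from the SPL reduces to a per-patient tally across all providers' logged encounters. A minimal sketch (the `patient_id` field name is hypothetical, since the protocol does not specify the SPL schema):

```python
from collections import Counter

def encounters_per_case(spl_rows: list[dict]) -> Counter:
    """Tally DOCC service encounters per patient across all providers.

    Each row is one logged encounter from the Services Provided Log;
    'patient_id' is an assumed field name for illustration.
    """
    return Counter(row["patient_id"] for row in spl_rows)
```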
Mental Health Practice Readiness Inventory (MHPRI)
The MHPRI will document practice-level care management competencies shown to predict care uptake [120, 121].
Practice staff will rate each of the 32 items at the practice level (0 = no function exists; 1 = some function; 2 = function is complete). We will aggregate all informants’ scores to create a total score for practice achievement of care management competencies.
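One plausible way to aggregate informants' ratings into a single practice-level total is to average each item across informants and sum the item means. The protocol does not specify the exact aggregation formula, so this sketch is illustrative:

```python
def mhpri_total(informant_ratings: list[list[int]]) -> float:
    """Aggregate MHPRI ratings across informants into one practice-level total.

    Each informant rates 32 items on a 0-2 scale. Each item is averaged
    across informants and the item means are summed, yielding a total
    from 0 (no functions exist) to 64 (all functions complete). This is
    one plausible aggregation, not the protocol's stated formula.
    """
    n_items = 32
    if any(len(ratings) != n_items for ratings in informant_ratings):
        raise ValueError("each informant must rate all 32 items")
    item_means = [
        sum(ratings[i] for ratings in informant_ratings) / len(informant_ratings)
        for i in range(n_items)
    ]
    return sum(item_means)
```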
Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM)
Secondary implementation outcomes are DOCC acceptability, feasibility, and appropriateness. They will be assessed with three 4-item scales that have excellent internal consistency and good content validity.
Vanderbilt ADHD Diagnostic Rating Scale
Rating scales will be completed by caregivers (VADPRS) and teachers (VADTRS) at each timepoint [123, 124]. Both versions include symptom severity scales as well as performance/impairment items and have excellent psychometrics.
Pediatric Quality of Life (PEDS-QL)
Caregivers will complete the PEDS-QL to measure health-related quality of life [125, 126]. It has excellent reliability and treatment validity and is sensitive to DOCC.
TEAM and LEAD targets and mediators of implementation outcomes
Primary Care Team Dynamics Survey (PCTDS)
Staff will complete the 29-item PCTDS to assess affective, behavioral, and cognitive dimensions of team functioning and overall team effectiveness. It has high reliability and discriminant validity.
Pediatric Integrated Care Survey (PICS)
Caregiver perceptions of team effectiveness will be assessed with the 6-item PICS. The PICS has good reliability, construct validity, and discriminant validity.
Team Development Measure (TDM)
As a secondary measure of team functioning, staff will complete the TDM, which assesses specific dimensions of team functioning and provides an overall team development score. The TDM has excellent internal consistency and a clear factor structure.
Implementation Leadership Scale (ILS)
Staff will complete the ILS to capture the extent to which practice leadership is proactive, knowledgeable, supportive, and perseverant toward DOCC implementation. The ILS has strong psychometric properties and contributes to EBP adoption [105, 106, 130,131,132,133]. Minor word changes were made to focus the scale on implementation of EBPs for behavioral health in primary care practices.
Implementation Climate Scale (ICS)
Staff will complete the ICS to assess the extent to which the practice prioritizes and values implementation of evidence-based practices for behavioral health. The ICS has high reliability and construct validity with organizational measures and is associated with EBP use [77, 104]. As with the ILS, minor word changes were made.
Inner Setting Measures (ISM)
As secondary measures, staff will complete three scales assessing overall culture, implementation climate for DOCC specifically, and leadership engagement. These scales have good factor structure, internal consistency, and discriminant validity.
Potential moderators of the effects of TEAM or LEAD on implementation outcomes
We will test whether negative staff attitudes about BH services (Physician Belief Scale (PBS)) or the CM’s discipline (nursing vs. mental health; SIF) at baseline moderates the effects of TEAM.
We will examine baseline implementation leadership (ILS) and implementation climate (ICS) as moderators of the LEAD condition.
We will test three family characteristics as moderators of TEAM or LEAD (i.e., caregiver gender, caregiver race/ethnicity, child baseline ADHD severity).
Fidelity to implementation condition
We will follow Proctor et al.’s recommendations for specifying and reporting implementation strategies and develop implementation manuals for both TEAM and LEAD describing key steps and activities (see Supplemental File 3). Facilitators will track attendance, participation, and activities completed during facilitation calls. They will also record specific barriers, solutions, and next steps.
Fidelity to DOCC
Fidelity to DOCC will be documented in two ways. First, we will evaluate dosage for all cases. We define adequate dosage as at least 6 DOCC encounters and at least 1 care management meeting for each case [137, 138]. Second, we will assess fidelity by reviewing audio recordings of DOCC treatment sessions. For cases that consent to recording, CMs will upload two session audio recordings via a secure, HIPAA-compliant audio portal. A trained research assistant (unaware of implementation condition) will rate recordings using the Treatment Integrity Rating Form.
Power and sample size
Our power calculations for Aim 1 are based on comparisons of provider and patient outcomes among the six practices in each of the four conditions in the factorial design. For our primary implementation outcome (number of DOCC sessions), we estimate 80% power to detect an effect size (ES) of 0.42. Given our modest practice sample size, we will explore group differences in practice-level CCM core competencies. In our prior trial, large ESs were found for provider changes in behavior management practices (ES = 0.78) and perceived competencies (ES = 0.77). For patient outcomes, we assume an ICC of 0.01 in cases treated by the same provider and 20% attrition based on prior work [107, 109]. In a simple RCT with 262 cases/condition, we can detect an effect size as small as 0.25. The effect size for clinical improvement in individualized targets in our prior trial was 0.60, indicating that our sample size provides adequate power to detect group differences in patient outcomes.
Power calculations for mediator analyses (Aim 2) are based on simulation studies. Because the sample sizes per cell are modest (6 sites per condition), we are powered to detect large effects. This seems justifiable insofar as smaller effect sizes (e.g., 0.35–0.50) are of less interest given the higher cost of using more intensive combined implementation strategies. Thus, our mediational tests are powered only for a large effect size. For our moderator analyses (Aim 3), we have 0.80 power to detect small effects (f² = 0.03).
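The flavor of these calculations can be reproduced with a normal-approximation sketch. This is illustrative only: the protocol's own calculations may rest on different software and model assumptions. The design-effect adjustment 1 + (m − 1) × ICC deflates the effective sample size when cases cluster within providers.

```python
from statistics import NormalDist

def two_sample_power(d: float, n_per_group: float, alpha: float = 0.05,
                     icc: float = 0.0, cluster_size: int = 1) -> float:
    """Approximate power of a two-sided, two-sample comparison at effect size d.

    Uses the normal approximation; a design effect of 1 + (m - 1) * ICC
    reduces the effective per-group sample size for clustered cases.
    Illustrative sketch, not the trial's actual power calculation.
    """
    z = NormalDist()
    design_effect = 1.0 + (cluster_size - 1) * icc
    n_eff = n_per_group / design_effect
    z_alpha = z.inv_cdf(1.0 - alpha / 2.0)
    noncentrality = d * (n_eff / 2.0) ** 0.5
    return z.cdf(noncentrality - z_alpha)
```

With d = 0.25 and 262 cases per condition and no clustering adjustment, this returns roughly 0.8, consistent with the simple-RCT figure quoted above; adding the assumed ICC of 0.01 lowers power somewhat.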
Data management and monitoring
Our IRB-approved protocol specifies plans for data entry, coding, security, and storage of data on a secure server. Our web-based assessment system includes many mechanisms to protect data integrity and promote data quality (e.g., only allowing valid values, warnings of missing responses), and the data manager will maintain detailed data management procedures (e.g., range checks, data quality reports). The Principal Investigator will meet weekly with study personnel to discuss study goals, participant recruitment/retention, progress in data collection and analysis, and any adverse events or participant complaints.
The study team has established procedures for monitoring and managing risks to participants. A Data and Safety Monitoring Board of five external professionals with varied clinical and research expertise will review study reports and summaries of human subjects’ issues annually and submit recommendations regarding study continuation or proposed modifications. Study modifications will be approved by the IRB, and any significant changes in methods will be reported to the project’s program officer and described in an update to the registered protocol on https://ClinicalTrials.gov. The Principal Investigator and approved study team members will have access to the final trial datasets. Study co-investigators and consultants can access the datasets by request after obtaining IRB approval. Per NIMH policy, a deidentified dataset will be prepared for the National Data Archive.
Our primary analytical tool for Aim 1 will be a two-way analysis of variance with interaction, using mixed-effects linear models for continuous outcomes and generalized linear mixed models for categorical variables. We will use an intent-to-treat approach. We will examine patterns of missingness and, if necessary, apply imputation methods based on available covariates [141, 142]. For exploratory analyses, we will use variable selection methods such as Lasso and elastic net to obtain parsimonious, well-fitting models. Analyses will be conducted at the end of the trial; no interim analyses are planned.
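The 2 × 2 factorial structure underlying this analysis can be made concrete with a small sketch. The condition names follow the protocol; the +1/−1 effect coding is an illustrative assumption (not the authors' specified coding), chosen because it makes the interaction column the product of the two main-effect columns:

```python
# Hypothetical sketch of effect coding for the trial's 2x2 factorial design:
# each practice is exposed to TEAM (yes/no) and LEAD (yes/no), and the
# interaction term is the product of the two effect-coded main effects.

CONDITIONS = {
    "REP only":    (-1, -1),
    "TEAM":        (+1, -1),
    "LEAD":        (-1, +1),
    "TEAM + LEAD": (+1, +1),
}

def design_row(condition: str):
    """Return (intercept, TEAM, LEAD, TEAM x LEAD) for one practice."""
    team, lead = CONDITIONS[condition]
    return (1, team, lead, team * lead)

for name in CONDITIONS:
    print(name, design_row(name))
```

Under this coding, the TEAM and LEAD columns estimate main effects averaged over the other factor, and the product column isolates their synergy.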
For implementation outcomes, we hypothesize that relative to REP only, TEAM, LEAD, and their interaction will significantly improve (1) the number of DOCC service encounters per case and (2) collaborative care competencies within the practice. We will test effects on outcomes at months 6, 12, 18, and 24. Fixed effects include time, condition, and practice, and random effects will account for nesting. We will conduct planned contrasts to test differences between conditions.
For patient outcomes, we hypothesize that relative to REP only, TEAM, LEAD, and their interaction will significantly improve (1) child symptom severity and (2) quality of life. Fixed effects include time and condition, and random effects will account for nesting. As exploratory analyses, we will run larger models adjusting for demographic and baseline clinical variables and assess goodness of fit.
We hypothesize that TEAM practices will show significant gains in our proposed TEAM targets (team functioning, integration/quality) and that LEAD practices will show significant gains in our proposed LEAD targets (implementation leadership, implementation climate). We will use larger models to adjust for demographic and baseline clinical variables, test goodness of fit, and compare model fit across different time periods. We will test whether the association between each condition (TEAM, LEAD) and each implementation outcome is mediated by its respective targets and explore serial mediation.
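One common single-mediator test of this kind is the Sobel test, which divides the product of the condition-to-target path (a) and target-to-outcome path (b) by its standard error. The sketch below uses invented placeholder coefficients, not study data, purely to illustrate the computation:

```python
import math

# Illustrative Sobel test for a single-mediator model.
# a:  effect of condition (e.g., TEAM) on its target (e.g., team functioning)
# b:  effect of the target on the implementation outcome
# The numeric values below are hypothetical placeholders.

def sobel_z(a: float, se_a: float, b: float, se_b: float) -> float:
    """z-statistic for the indirect effect a*b."""
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

z = sobel_z(a=0.40, se_a=0.10, b=0.50, se_b=0.15)
print(round(z, 2))  # ~2.56, nominally significant at alpha = 0.05
```

In the trial itself, mediation would be tested within the multilevel modeling framework described above rather than with this simple closed-form statistic.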
We will test moderation of TEAM effects by provider characteristics, moderation of LEAD effects by practice characteristics, and moderation of TEAM and/or LEAD effects on child outcomes by family characteristics. We will adapt standard tests for interaction for a detailed study of the candidate moderators. We will use Wallace et al.’s method to derive a single optimal linear combination of moderators. The composite moderator typically has a larger effect size than any one variable; the relative weights within the composite index can be interpreted to determine the relative importance of each moderator for practical applications.
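The idea of a composite moderator can be sketched simply: standardize each candidate moderator and form a weighted linear combination. The weights and data below are invented for illustration; in the trial, weights would be estimated from the data per the cited method:

```python
from statistics import mean, stdev

# Hypothetical sketch of a composite moderator as a weighted linear
# combination of standardized moderator variables, in the spirit of the
# Wallace et al. approach cited above. All values here are invented.

def standardize(xs):
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def composite(moderators, weights):
    """moderators: equal-length variable lists; weights: one per variable."""
    zs = [standardize(v) for v in moderators]
    n = len(zs[0])
    return [sum(w * z[i] for w, z in zip(weights, zs)) for i in range(n)]

practice_size = [3, 8, 5, 12, 7]   # e.g., number of providers (hypothetical)
baseline_bh   = [0, 1, 0, 1, 1]    # e.g., existing BH services, 0/1 (hypothetical)
scores = composite([practice_size, baseline_bh], weights=[0.7, 0.3])
print([round(s, 2) for s in scores])
```

The estimated weights, not the raw moderators, then carry the interpretive weight: larger absolute weights indicate moderators that matter more for tailoring the strategies.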
Study results will be shared with participating practices, disseminated through scientific conferences and journals, and reported on https://ClinicalTrials.gov. Results will be shared regardless of the magnitude or direction of effects. Authorship decisions will be based on the International Committee of Medical Journal Editors criteria.
Effective behavioral health interventions based on the CCM are among the most complex healthcare services provided in primary care. They require reciprocal interactions among primary care staff and specialists, monitoring of symptoms and interventions, shared language and goals, and multilevel infrastructure supports. Implementation facilitation may be necessary to support such interventions even in the best primary care practices. This statewide trial is one of the first studies to test multilevel implementation strategies to improve implementation of a CCM in pediatric primary care. Our factorial design will allow us to test the separate and interactive effects of facilitation strategies targeting the care team and strategies targeting practice leadership on hypothesized team-level and organizational mechanisms of action, furthering our understanding of change mechanisms in implementation [79,80,81,82,83,84].
This trial is designed to yield the best possible outcomes for community-based primary care practices. It will take place in motivated sites interested in behavioral health interventions that will be engaged by experienced practice network leadership and research teams to deliver this complex intervention. Study parameters were designed in collaboration with practice network leadership and an experienced investigator team to minimize burden and enhance generalizability. We recognize the many challenges to conducting a large community-based trial and the uncertainty underlying the decisions made to address them. We will recruit practices across the state that vary in size, geography, population served, and availability of BH services. Still, it is not clear whether practices with fewer resources will participate or whether practices will be able to deliver care to the children most in need of services.
We considered alternative study designs that might inform large-scale implementation of DOCC (e.g., SMART design, group additive, hybrid type 2) before choosing a factorial design that allows testing two distinct strategies with a feasible sample size. We incorporate implementation science guidelines [55, 136, 145] and prior trial methods (e.g., [53, 63, 65, 66, 70]) to enhance rigor, operationalize our TEAM and LEAD facilitation strategies to enhance reproducibility, and endeavor to advance implementation science by including a sustainability period and planning multilevel tests of mediation [84, 146]. Our study design and parameters may push the limits of our knowledge, but hopefully strike a balance between rigor and feasibility in our efforts to improve implementation of complex BH interventions. Effective strategies for implementing CCMs can enable their scale-up in pediatric primary care and improve children’s behavioral health outcomes.
Availability of data and materials
Not applicable. No data have been collected yet.
Merikangas KR, He J-P, Brody D, Fisher PW, Bourdon K, Koretz DS. Prevalence and treatment of mental disorders among US children in the 2001–2004 NHANES. Pediatrics. 2010;125(1):75–81.
Tempel AB, Herschell AD, Kolko D. Conduct disorder. In: The encyclopedia of clinical psychology: American Cancer Society; 2015. p. 1–6. Available from: http://www.onlinelibrary.wiley.com/doi/abs/10.1002/9781118625392.wbecp174. Cited 2021 Oct 28.
Asarnow JR, Rozenman M, Wiblin J, Zeltzer L. Integrated medical-behavioral care compared with usual primary care for child and adolescent behavioral health: a meta-analysis. JAMA Pediatr. 2015;169(10):929–37.
Elango S, Whitmire R, Kim J, Berhane Z, Davis R, Turchi RM. Family experience of caregiver burden and health care usage in a statewide medical home program. Acad Pediatr. 2021; Available from: http://www.sciencedirect.com/science/article/pii/S1876285921003636. Cited 2021 Nov 8.
McAllister JW, Cooley WC, Cleave JV, Boudreau AA, Kuhlthau K. Medical home transformation in pediatric primary care—what drives change? Ann Fam Med. 2013;11(Suppl 1):S90–8.
Lichstein JC, Ghandour RM, Mann MY. Access to the medical home among children with and without special health care needs. Pediatrics. 2018;142(6):e20181795.
Mohanty S, Wells N, Antonelli R, Turchi RM. Incorporating patient- and family-centered care into practice: the PA medical home initiative. Pediatrics. 2018;142(3). https://doi.org/10.1542/peds.2017-2453 Cited 2021 Nov 8.
Richardson LP, McCarty CA, Radovic A, Suleiman AB. Research in the integration of behavioral health for adolescents and young adults in primary care settings: a systematic review. J Adolesc Health. 2017;60(3):261–9.
Campo JV, Geist R, Kolko DJ. Integration of pediatric behavioral health services in primary care: improving access and outcomes with collaborative care. Can J Psychiatry. 2018;63(7):432–8.
Council on Children with Disabilities and Medical Home Implementation Project Advisory Committee, Turchi RM, Antonelli RC, Norwood KW Jr, Adams RC, Brei TJ, et al. Patient- and family-centered care coordination: a framework for integrating care for children and youth across multiple systems. Pediatrics. 2014;133(5):e1451–60.
Goodrich DE, Kilbourne AM, Nord KM, Bauer MS. Mental health collaborative care and its role in primary care settings. Curr Psychiatry Rep. 2013;15(8):383.
Woltmann E, Grogan-Kaylor A, Perron B, Georges H, Kilbourne AM, Bauer MS. Comparative effectiveness of collaborative chronic care models for mental health conditions across primary, specialty, and behavioral health care settings: systematic review and meta-analysis. Am J Psychiatry. 2012;169(8):790–804.
Katon W, Unützer J, Wells K, Jones L. Collaborative depression care: history, evolution and ways to enhance dissemination and sustainability. Gen Hosp Psychiatry. 2010;32(5):456–64.
Huffman JC, Mastromauro CA, Beach SR, Celano CM, DuBois CM, Healy BC, et al. Collaborative care for depression and anxiety disorders in patients with recent cardiac events: the management of sadness and anxiety in cardiology (MOSAIC) Randomized Clinical Trial. JAMA Intern Med. 2014;174(6):927–35.
Unützer J, Chan Y-F, Hafer E, Knaster J, Shields A, Powers D, et al. Quality improvement with pay-for-performance incentives in integrated behavioral health care. Am J Public Health. 2012;102(6):e41–5.
Miller CJ, Grogan-Kaylor A, Perron BE, Kilbourne AM, Woltmann E, Bauer MS. Collaborative chronic care models for mental health conditions: cumulative meta-analysis and metaregression to guide future research and implementation. Med Care. 2013;51(10):922–30.
Davis TD, Deen T, Bryant-Bedell K, Tate V, Fortney J. Does minority racial-ethnic status moderate outcomes of collaborative care for depression? Psychiatr Serv. 2011;62(11):1282–8.
Grubbs KM, Cheney AM, Fortney JC, Edlund C, Han X, Dubbert P, et al. The role of gender in moderating treatment outcome in collaborative care for anxiety. Psychiatr Serv. 2015;66(3):265–71.
Wagner EH, Austin BT, Korff MV. Organizing care for patients with chronic illness. Milbank Q. 1996;74(4):511.
Davy C, Bleasel J, Liu H, Tchan M, Ponniah S, Brown A. Effectiveness of chronic care models: opportunities for improving healthcare practice and health outcomes: a systematic review. BMC Health Serv Res. 2015;15(1):194.
Lyon AR, Whitaker K, Richardson LP, French WP, McCauley E. Collaborative care to improve access and quality in school-based behavioral health. J Sch Health. 2019;89(12):1013–23.
Asarnow JR, Hoagwood KE, Stancin T, Lochman JE, Hughes JL, Miranda JM, et al. Psychological science and innovative strategies for informing health care redesign: a policy brief. J Clin Child Adolesc Psychol. 2015;44(6):923–32.
Cohen DJ, Balasubramanian BA, Isaacson NF, Clark EC, Etz RS, Crabtree BF. Coordination of health behavior counseling in primary care. Ann Fam Med. 2011;9(5):406–15.
Crabtree BF, Nutting PA, Miller WL, McDaniel RR, Stange KC, Jaen CR, et al. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011;49:S28.
Cromp D, Hsu C, Coleman K, Fishman PA, Liss DT, Ehrlich K, et al. Barriers and facilitators to team-based care in the context of primary care transformation. J Ambul Care Manage. 2015;38(2):125–33.
Kadu MK, Stolee P. Facilitators and barriers of implementing the chronic care model in primary care: a systematic review. BMC Fam Pract. 2015;16(1):12.
Lipschitz JM, Benzer JK, Miller C, Easley SR, Leyson J, Post EP, et al. Understanding collaborative care implementation in the Department of Veterans Affairs: core functions and implementation challenges. BMC Health Serv Res. 2017;17(1):691.
Nutting PA, Crabtree BF, Miller WL, Stange KC, Stewart E, Jaén C. Transforming physician practices to patient-centered medical homes: lessons from the National Demonstration Project. Health Aff (Millwood). 2011;30(3):439–45.
Overbeck G, Davidsen AS, Kousgaard MB. Enablers and barriers to implementing collaborative care for anxiety and depression: a systematic qualitative review. Implement Sci. 2016;11(1):165.
Wood E, Ohlsen S, Ricketts T. What are the barriers and facilitators to implementing collaborative care for depression? A systematic review. J Affect Disord. 2017;214:26–43.
Kolko DJ, Perrin EC. The integration of behavioral health interventions in children’s health care: services, science, and suggestions. J Clin Child Adolesc Psychol. 2014;43(2):216–28.
Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1):1.
Kolko DJ, Torres E, Rumbarger K, James E, Turchi R, Bumgardner C, et al. Integrated pediatric health care in Pennsylvania: a survey of primary care and behavioral health providers. Clin Pediatr (Phila). 2019;58(2):213–25.
Dinh JV, Traylor AM, Kilcullen MP, Perez JA, Schweissing EJ, Venkatesh A, et al. Cross-disciplinary care: a systematic review on teamwork processes in health care. Small Group Res. 2020;51(1):125–66.
Schmutz J, Manser T. Do team processes really have an effect on clinical performance? A systematic literature review. BJA Br J Anaesth. 2013;110(4):529–44.
Cramm JM, Strating MMH, Nieboer AP. The role of team climate in improving the quality of chronic care delivery: a longitudinal study among professionals working with chronically ill adolescents in transitional care programmes. BMJ Open. 2014;4(5):e005369.
Stout S, Zallman L, Arsenault L, Sayah A, Hacker K. Developing high-functioning teams: factors associated with operating as a “real team” and implications for patient-centered medical home development. Inq J Med Care Organ Provis Financ. 2017;54:0046958017707296.
Lukas C, Mohr D, Meterko M. Team effectiveness and organizational context in the implementation of a clinical innovation. Qual Manag Health Care. 2009;18(1):25–39.
Denti L, Hemlin S. Leadership and innovation in organizations: a systematic review of factors that mediate or moderate the relationship. Int J Innov Manag. 2012;16(03):1240007.
Aarons GA, Sommerfeld DH. Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. J Am Acad Child Adolesc Psychiatry. 2012;51(4):423–31.
Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;0(0):1–14.
Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreat. 2012;17(1):67–79.
Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12:111.
Egeland KM, Skar A-MS, Endsjø M, Laukvik EH, Bækkelund H, Babaii A, et al. Testing the leadership and organizational change for implementation (LOCI) intervention in Norwegian mental health clinics: a stepped-wedge cluster randomized design study protocol. Implement Sci. 2019;14(1):28.
Michaelis B, Stegmaier R, Sonntag K. Shedding light on followers’ innovation implementation behavior: the role of transformational leadership, commitment to change, and climate for initiative. J Manag Psychol. 2010;25(4):408–29.
Proctor E, Ramsey AT, Brown MT, Malone S, Hooley C, McKay V. Training in Implementation Practice Leadership (TRIPLE): evaluation of a novel practice change strategy in behavioral health organizations. Implement Sci. 2019;14(1):66.
Miech EJ, Rattray NA, Flanagan ME, Damschroder L, Schmid AA, Damush TM. Inside help: an integrative review of champions in healthcare-related implementation. SAGE Open Med. 2018;6 Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5960847/. Cited 2019 Jan 25.
Kelleher KJ, Stevens J. Evolution of child mental health services in primary care. Acad Pediatr. 2009;9(1):7–14.
Asarnow JR, Kolko DJ, Miranda J, Kazak AE. The pediatric patient-centered medical home: innovative models for improving behavioral health. Am Psychol. 2017;72(1):13–27.
Smith SN, Liebrecht CM, Bauer MS, Kilbourne AM. Comparative effectiveness of external vs blended facilitation on collaborative care model implementation in slow-implementer community practices. Health Serv Res. 2020;55(6):954–65.
Kilbourne AM, Abraham KM, Goodrich DE, Bowersox NW, Almirall D, Lai Z, et al. Cluster randomized adaptive implementation trial comparing a standard versus enhanced implementation intervention to improve uptake of an effective re-engagement program for patients with serious mental illness. Implement Sci. 2013;8(1):136.
Kilbourne AM, Goodrich DE, Nord KM, Van Poppelen C, Kyle J, Bauer MS, et al. Long-term clinical outcomes from a randomized controlled trial of two implementation strategies to promote collaborative care attendance in community practices. Adm Policy Ment Health Ment Health Serv Res. 2015;42(5):642–53.
Rycroft-Malone J. The PARIHS Framework—a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304.
Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.
Berta W, Cranley L, Dearing JW, Dogherty EJ, Squires JE, Estabrooks CA. Why (we think) facilitation works: insights from organizational learning theory. Implement Sci. 2015;10(1):1–13.
Hoagwood KE, Atkins M, Ialongo N. Unpacking the black box of implementation: the next generation for policy, research and practice. Adm Policy Ment Health Ment Health Serv Res. 2013;40(6):451–5.
Nadeem E, Gleacher A, Beidas RS. Consultation as an implementation strategy for evidence-based practices across multiple contexts: unpacking the black box. Adm Policy Ment Health Ment Health Serv Res. 2013;40(6):439–50.
Parker L, Ritchie M, Bonner L, Kirchner J. Examining inside the black box of implementation facilitation: process and effects on program quality. Presented at the 7th Annual NIH/AcademyHealth Conference on the Science of Dissemination and Implementation; December 9, 2014; Bethesda, MD. Panel abstracts available from: http://www.implementationscience.biomedcentral.com/articles/10.1186/1748-5908-10-S1-A46.
Ritchie MJ, Dollar KM, Miller CJ, Smith JL, Oliver KA, Kim B, Connolly, SL, Woodward E, Ochoa-Olmos T, Day S, Lindsay JA, Kirchner JE. Using Implementation Facilitation to Improve Healthcare (Version 3). Veterans Health Administration, Behavioral Health Quality Enhancement Research Initiative (QUERI). 2020. Available at: http://www.queri.research.va.gov/tools/Facilitation-Manual.pdf.
Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1(1):1–15.
Parchman ML, Noel PH, Culler SD, Lanham HJ, Leykum LK, Romero RL, et al. A randomized trial of practice facilitation to improve the delivery of chronic illness care in primary care: initial and sustained effects. Implement Sci. 2013;8(1):93.
Bhat A, Bennett IM, Bauer AM, Beidas RS, Eriksen W, Barg FK, et al. Longitudinal remote coaching for implementation of perinatal collaborative care: a mixed-methods analysis. Psychiatr Serv. 2020;71(5):518–21.
Walunas TL, Ye J, Bannon J, Wang A, Kho AN, Smith JD, et al. Does coaching matter? Examining the impact of specific practice facilitation strategies on implementation of quality improvement interventions in the Healthy Hearts in the Heartland study. Implement Sci. 2021;16(1):33.
Smith SN, Almirall D, Prenovost K, Liebrecht C, Kyle J, Eisenberg D, et al. Change in patient outcomes after augmenting a low-level implementation strategy in community practices that are slow to adopt a collaborative chronic care model: a cluster randomized implementation trial. Med Care. 2019;57(7):503–11.
Fortney JC, Pyne JM, Ward-Jones S, Bennett IM, Diehl J, Farris K, et al. Implementation of evidence-based practices for complex mood disorders in primary care safety net clinics. Fam Syst Health. 2018;36(3):267–80.
Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care–mental health. J Gen Intern Med. 2014;29(S4):904–12.
Baloh J, Zhu X, Ward MM. Types of internal facilitation activities in hospitals implementing evidence-based interventions. Health Care Manage Rev. 2018;43(3):229–37.
Smith S, Almirall D, Bauer M, Liebrecht C, Kilbourne A. (When) is more better? Comparative effectiveness of external vs external+internal facilitation on site-level uptake of a collaborative care model in community-based practices that are slow to adopt. Health Serv Res. 2020;55(S1):61–2.
Saldana L, Bennett I, Powers D, Vredevoogd M, Grover T, Schaper H, et al. Scaling implementation of collaborative care for depression: adaptation of the Stages of Implementation Completion (SIC). Adm Policy Ment Health. 2020;47(2):188–96.
Nadeem E, Olin SS, Hill LC, Hoagwood KE, Horwitz SM. Understanding the components of quality improvement collaboratives: a systematic literature review. Milbank Q. 2013;91(2):354–94.
Nadeem E, Gleacher A, Pimentel S, Hill LC, McHugh M, Hoagwood KE. The role of consultation calls for clinic supervisors in supporting large-scale dissemination of evidence based treatments for children. Adm Policy Ment Health Ment Health Serv Res. 2013;40(6):530–40.
Bauer MS, Miller C, Kim B, Lew R, Weaver K, Coldwell C, et al. Partnering with health system operations leadership to develop a controlled implementation trial. Implement Sci. 2016;11:22.
Rattray NA, Khaw A, McGrath M, Damush TM, Miech EJ, Lenet A, et al. Evaluating the feasibility of implementing a Telesleep pilot program using two-tiered external facilitation. BMC Health Serv Res. 2020;20(1):357.
Weiner BJ, Lewis MA, Clauser SB, Stitzenberg KB. In search of synergy: strategies for combining interventions at multiple levels. J Natl Cancer Inst Monogr. 2012;2012(44):34–41.
Schilling S, Bigal L, Powell BJ. Developing and Applying Synergistic Multilevel Implementation Strategies to Increase Adoption of Screening and Referral to an Evidence-Based Parenting Intervention in Primary Care. Manuscript submitted for publication. Implementation Research and Practice. 2022.
Williams NJ, Ehrhart MG, Aarons GA, Marcus SC, Beidas RS. Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: cross-sectional and lagged analyses from a 2-year observational study. Implement Sci. 2018;13(1):85.
Williams NJ, Glisson C, Hemmelgarn A, Green P. Mechanisms of change in the ARC organizational strategy: increasing mental health clinicians’ EBP adoption through improved organizational culture and capacity. Adm Policy Ment Health. 2017;44(2):269–83.
Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6 Available from: https://www.frontiersin.org/articles/10.3389/fpubh.2018.00136/full. Cited 2018 Oct 12.
Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):1–25.
Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, et al. Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: protocol for generating a research agenda. BMJ Open. 2021;11(10):e053474.
Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7 Available from: https://www.frontiersin.org/articles/10.3389/fpubh.2019.00003/full. Cited 2019 Jan 29.
Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health Ment Health Serv Res. 2016;43(5):783–98.
Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry. 2019;60:430–50.
Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs. Med Care. 2012;50(3):217–26.
Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007;2(1):42.
Green LW, Kreuter MW. Health promotion planning: an educational and environmental approach. 2nd ed. Mountain View: Mayfield Publishing Company; 1991. p. 506.
Kirchner JE, Kearney LK, Ritchie MJ, Dollar KM, Swensen AB, Schohn M. Research and services partnerships: lessons learned through a national partnership between clinical leaders and researchers. Psychiatr Serv. 2014;65(5):577–9.
Kilbourne AM, Goodrich DE, Lai Z, Almirall D, Nord KM, Bowersox NW, et al. Reengaging veterans with serious mental illness into care: preliminary results from a national randomized trial. Psychiatr Serv. 2015;66(1):90–3.
Kilbourne AM, Neumann MS, Waxmonsky J, Bauer MS, Kim HM, Pincus HA, et al. Public-academic partnerships: evidence-based implementation: the role of sustained community-based practice and research partnerships. Psychiatr Serv. 2012;63(3):205–7.
Neumann MS, Sogolow ED. Replicating effective programs: HIV/AIDS prevention technology transfer. AIDS Educ Prev Off Publ Int Soc AIDS Educ. 2000;12(5 Suppl):35–48.
Bandura A. Social learning theory. Englewood Cliffs, NJ: Prentice-Hall; 1977.
Rogers EM. Diffusion of innovations. 4th ed. New York, NY: Simon and Schuster; 2010. p. 550.
Knox L, Taylor EF, Geonnotti K, Machta R, Kim J, Nysenbaum J, et al. Developing and running a primary care practice facilitation program: a how-to guide. Mathematica Policy Research Reports. Mathematica Policy Research; 2011. (Mathematica Policy Research Reports). Report No.: cf15b9115d714275b538a9c835145652. Available from: https://ideas.repec.org/p/mpr/mprres/cf15b9115d714275b538a9c835145652.html. Cited 2021 Oct 29.
Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63–74.
Klein C, DiazGranados D, Salas E, Le H, Burke CS, Lyons R, et al. Does team building work? Small Group Res. 2009;40(2):181–222.
Miller CJ, Kim B, Silverman A, Bauer MS. A systematic review of team-building interventions in non-acute healthcare settings. BMC Health Serv Res. 2018;18(1):146.
McEwan D, Ruissen GR, Eys MA, Zumbo BD, Beauchamp MR. The effectiveness of teamwork training on teamwork behaviors and team performance: a systematic review and meta-analysis of controlled interventions. PLoS One. 2017;12(1):e0169604.
Salas E, DiazGranados D, Klein C, Burke CS, Stagl KC, Goodwin GF, et al. Does team training improve team performance? A meta-analysis. Hum Factors. 2008;50(6):903–33.
Reyes DL, Tannenbaum SI, Salas E. Team development: the power of debriefing. People Strategy. 2018;41(2):46.
Tannenbaum SI, Cerasoli CP. Do team and individual debriefs enhance performance? A meta-analysis. Hum Factors. 2013;55(1):231–45.
Kilbourne AM, Almirall D, Eisenberg D, Waxmonsky J, Goodrich DE, Fortney JC, et al. Protocol: Adaptive Implementation of Effective Programs Trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. 2014;9(1):132.
Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. The role of leadership in creating a strategic climate for evidence-based practice implementation and sustainment in systems and organizations. Front Public Health Serv Syst Res. 2014;3(4):3.
Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9:157.
Aarons GA, Farahnak LR, Ehrhart MG, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74.
Aarons GA, Ehrhart MG, Farahnak LR. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9(1):45.
Kolko DJ, Campo J, Kilbourne AM, Hart J, Sakolsky D, Wisniewski S. Collaborative care outcomes for pediatric behavioral health problems: a cluster randomized trial. Pediatrics. 2014;133(4):e981–92.
Kolko DJ, Campo JV, Kilbourne AM, Kelleher K. Doctor-Office Collaborative Care for pediatric behavioral problems: a preliminary clinical trial. Arch Pediatr Adolesc Med. 2012;166(3):224–31.
Kolko DJ, Campo JV, Kelleher K, Cheng Y. Improving access to care and clinical outcome for pediatric behavioral problems: a randomized trial of a nurse-administered intervention in primary care. J Dev Behav Pediatr. 2010;31(5):393–404.
Stange KC, Nutting PA, Miller WL, Jaén CR, Crabtree BF, Flocke SA, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med. 2010;25(6):601–12.
Lindhiem O, Kolko DJ. Trajectories of symptom reduction during treatment for behavior problems in pediatric primary-care settings. Adm Policy Ment Health Ment Health Serv Res. 2011;38(6):486–94.
Yu H, Kolko DJ, Torres E. Collaborative mental health care for pediatric behavior disorders in primary care: does it reduce mental health care costs? Fam Syst Health J Collab Fam Healthc. 2017;35(1):46–57.
Chan A-W, Tetzlaff JM, Gøtzsche PC, Altman DG, Mann H, Berlin JA, et al. SPIRIT 2013 explanation and elaboration: guidance for protocols of clinical trials. BMJ. 2013;346:e7586.
Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
Moher D, Hopewell S, Schulz KF, Montori V, Gotzsche PC, Devereaux PJ, et al. CONSORT 2010 explanation and elaboration: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340(mar23 1):c869.
Hamburger R, Berhane Z, Gatto M, Yunghans S, Davis RK, Turchi RM. Evaluation of a statewide medical home program on children and young adults with asthma. J Asthma. 2015;52(9):940–8.
Jellinek MS, Murphy JM, Robinson J, Feins A, Lamb S, Fenton T. Pediatric symptom checklist: screening school-age children for psychosocial dysfunction. J Pediatr. 1988;112(2):201–9.
Stiffman AR, Horwitz SM, Hoagwood K, Compton W, Cottler L, Bean DL, et al. The Service Assessment for Children and Adolescents (SACA): adult and child reports. J Am Acad Child Adolesc Psychiatry. 2000;39(8):1032–9.
American Academy of Pediatrics. Appendix S3: mental health practice readiness inventory. Pediatrics. 2010;125(Supplement 3):S129–32.
King MA, Wissow LS, Baum RA. The role of organizational context in the implementation of a statewide initiative to integrate mental health services into pediatric primary care. Health Care Manage Rev. 2018;43(3):206–17.
Baum RA, King MA, Wissow LS. Outcomes of a statewide learning collaborative to implement mental health services in pediatric primary care. Psychiatr Serv. 2019;70(2):123–9.
Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.
Wolraich ML, Lambert W, Doffing MA, Bickman L, Simmons T, Worley K. Psychometric properties of the Vanderbilt ADHD Diagnostic Parent Rating Scale in a referred population. J Pediatr Psychol. 2003;28(8):559–68.
Wolraich ML, Bard DE, Neas B, Doffing M, Beck L. The psychometric properties of the Vanderbilt Attention-Deficit Hyperactivity Disorder Diagnostic Teacher Rating Scale in a community population. J Dev Behav Pediatr. 2013;34(2):83–93.
Varni JW, Seid M, Kurtin PS. PedsQLTM 4.0: reliability and validity of the Pediatric Quality of Life InventoryTM Version 4.0 Generic Core Scales in healthy and patient populations. Med Care. 2001;39(8):800–12.
Varni JW, Katz ER, Seid M, Quiggins DJL, Friedman-Bender A, Castro CM. The Pediatric Cancer Quality of Life Inventory (PCQL). I. Instrument development, descriptive statistics, and cross-informant variance. J Behav Med. 1998;21(2):179–204.
Song H, Chien AT, Fisher J, Martin J, Peters AS, Hacker K, et al. Development and validation of the Primary Care Team Dynamics Survey. Health Serv Res. 2015;50(3):897–921.
Ziniel SI, Rosenberg HN, Bach AM, Singer SJ, Antonelli RC. Validation of a parent-reported experience measure of integrated care. Pediatrics. 2016;138(6) Available from: https://pediatrics.aappublications.org/content/138/6/e20160676. Cited 2021 Nov 3.
Stock R, Mahoney E, Carney PA. Measuring team development in clinical care settings. Fam Med. 2013;10:691–700.
Finn NK, Torres EM, Ehrhart MG, Roesch SC, Aarons GA. Cross-validation of the Implementation Leadership Scale (ILS) in child welfare service organizations. Child Maltreat. 2016;21(3):250–5.
Torres EM, Ehrhart MG, Beidas RS, Farahnak LR, Finn NK, Aarons GA. Validation of the Implementation Leadership Scale (ILS) with supervisors’ self-ratings. Community Ment Health J. 2018;54(1):49–53.
Aarons GA, Ehrhart MG, Torres EM, Finn NK, Beidas RS. The humble leader: association of discrepancies in leader and follower ratings of implementation leadership with organizational climate in mental health. Psychiatr Serv. 2016;68(2):115–22.
Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, et al. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm Policy Ment Health Ment Health Serv Res. 2016;43(6):991–1008.
Fernandez ME, Walker TJ, Weiner BJ, Calo WA, Liang S, Risendal B, et al. Developing measures to assess constructs from the Inner Setting domain of the Consolidated Framework for Implementation Research. Implement Sci. 2018;13(1):52.
McLennan JD, Jansen-McWilliams L, Comer DMA, Gardner WP, Kelleher KJ. The Physician Belief Scale and psychosocial problems in children: a report from the pediatric research in office settings and the Ambulatory Sentinel Practice Network. J Dev. 1999;20(1):24–30.
Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.
Waxmonsky J, Kilbourne AM, Goodrich DE, Nord KM, Lai Z, Laird C, et al. Enhanced fidelity to treatment for bipolar disorder: results from a randomized controlled implementation trial. Psychiatr Serv. 2014;65(1):81–90.
Kolko DJ, McGuier EA, Campo JV, Kilbourne AM, Kelleher KJ. The role of treatment team collaboration in a clinical trial for behavior problems and ADHD in pediatric primary care. Poster presentation at the 14th Annual Conference on the Science of Dissemination and Implementation in Health, Virtual. 2021.
Kolko DJ. Individual cognitive behavioral treatment and family therapy for physically abused children and their offending parents: a comparison of clinical outcomes. Child Maltreat. 1996;1(4):322–42.
MacKinnon DP, Lockwood CM, Hoffman JM, West SG, Sheets V. A comparison of methods to test mediation and other intervening variable effects. Psychol Methods. 2002;7(1):83.
Mazumdar S, Tang G, Houck PR, Dew MA, Begley AE, Scott J, et al. Statistical analysis of longitudinal psychiatric data with dropouts. J Psychiatr Res. 2007;41(12):1032–41.
Schafer JL, Graham JW. Missing data: our view of the state of the art. Psychol Methods. 2002;7(2):147–77.
Wallace ML, Frank E, Kraemer HC. A novel approach for developing and interpreting treatment moderator profiles in randomized clinical trials. JAMA Psychiatry. 2013;70(11):1241–7.
International Committee of Medical Journal Editors. Recommendations for the conduct, reporting, editing, and publication of scholarly work in medical journals. 2021. Available from: http://www.icmje.org/icmje-recommendations.pdf. Cited 2021 Dec 21.
Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.
Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39:55–76.
Acknowledgements
We appreciate the support of the research and clinical staff of SKIP, including Eunice Torres, Amy Laughlin, Justin Schreiber, Shelby Parsons, Alanna Manigault, Kevin Rumbarger, Jeffrey Rounds, and Jonathan Hart; the Office of Academic Computing in the Department of Psychiatry, University of Pittsburgh; and the practices affiliated with the Medical Home Program of the Pennsylvania Chapter of the American Academy of Pediatrics.
Study team roles and responsibilities
The Principal Investigator designs and conducts the trial, coordinates and manages the study team, oversees participant safety, and ensures the integrity of data analyses and reporting.
The PA-AAP Medical Home Program recruits primary care practices and coordinates with participating sites.
The co-investigators and consultants contribute to study design and methods, provide feedback and advice to optimize trial feasibility and quality, and contribute to scientific presentations and publications.
The statistician oversees randomization, data management, and statistical analyses.
The research staff conduct participant recruitment and retention, data collection, and data management and analysis, and they develop and maintain study materials and procedures.
The Data and Safety Monitoring Board monitors and evaluates participant safety, study conduct, and scientific validity and integrity of the trial.
December 29, 2021
Funding
The study is funded by a grant from the National Institute of Mental Health (MH124914), with additional support from MH115838. EAM is supported by MH123729. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Funding sources had no role in study design and will not have any role in execution, analyses, interpretation, or presentation of results. The authors report no financial conflicts of interest.
Ethics approval and consent to participate
All study procedures were approved by the University of Pittsburgh Institutional Review Board (20080207-004).
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Additional file 1:
SPIRIT 2013 Checklist, CONSORT 2010 Checklist, and TIDieR Checklist.
Additional file 2: Supplemental Figure 1.
CONSORT Flow Diagram for Primary Care Provider Participants. Supplemental Figure 2. CONSORT Flow Diagram for Caregiver Participants.
Additional file 3: Supplemental Table 1.
Implementation Strategies and Targets by Condition. Supplemental Table 2. Actions for Implementation Strategies Organized by Phase and Condition.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article
Kolko, D.J., McGuier, E.A., Turchi, R. et al. Care team and practice-level implementation strategies to optimize pediatric collaborative care: study protocol for a cluster-randomized hybrid type III trial. Implementation Sci 17, 20 (2022). https://doi.org/10.1186/s13012-022-01195-7
Keywords
- Collaborative care model
- Implementation strategies