Protocol for a hybrid type 2 cluster randomized trial of trauma-focused cognitive behavioral therapy and a pragmatic individual-level implementation strategy

Abstract

Background

More than two-thirds of youth experience trauma during childhood, and up to 1 in 5 of these youth develops posttraumatic stress symptoms that significantly impair their functioning. Although trauma-focused cognitive behavioral therapy (TF-CBT) has a strong evidence base, it is rarely adopted, delivered with adequate fidelity, or evaluated in the most common setting where youth access mental health services—schools. Given that individual behavior change is ultimately required for successful implementation, even when organizational factors are firmly in place, focusing on individual-level processes represents a potentially parsimonious approach. Beliefs and Attitudes for Successful Implementation in Schools (BASIS) is a pragmatic, motivationally focused multifaceted strategy that augments training and consultation and is designed to target precise mechanisms of behavior change to produce enhanced implementation and youth clinical outcomes. This study protocol describes a hybrid type 2 effectiveness-implementation trial designed to concurrently evaluate the main effects, mediators, and moderators of both the BASIS implementation strategy on implementation outcomes and TF-CBT on youth mental health outcomes.

Methods

Using a cluster randomized controlled design, this trial will assign school-based mental health (SMH) clinicians and schools to one of three study arms: (a) enhanced treatment-as-usual (TAU), (b) attention control plus TF-CBT, or (c) BASIS+TF-CBT. With a proposed sample of 120 SMH clinicians who will each recruit 4–6 youth with a history of trauma (480 children), this project will gather data across 12 different time points to address two project aims. Aim 1 will evaluate, relative to an enhanced TAU condition, the effects of TF-CBT on identified mechanisms of change, youth mental health outcomes, and intervention costs and cost-effectiveness. Aim 2 will compare the effects of BASIS against an attention control plus TF-CBT condition on theoretical mechanisms of clinician behavior change and implementation outcomes, as well as examine costs and cost-effectiveness.

Discussion

This study will generate critical knowledge about the effectiveness and cost-effectiveness of BASIS—a pragmatic, theory-driven, and generalizable implementation strategy designed to enhance motivation—to increase the yield of evidence-based practice training and consultation, as well as the effectiveness of TF-CBT in a novel service setting.

Trial registration

ClinicalTrials.gov registration number NCT04451161. Registered on June 30, 2020.

Background

Childhood trauma exposure and treatment

More than two-thirds of youth experience trauma during childhood, with one-third experiencing multiple traumatic events [1]. Up to 1 in 5 trauma-exposed youth develops posttraumatic stress symptoms or posttraumatic stress disorder (PTSD) that results in significant impairment in daily functioning [1]. Most youth with trauma-related mental health disorders do not receive treatment, and when they do access care, the services they receive are not evidence-based or delivered with adequate fidelity [2, 3].

Trauma-focused cognitive behavioral therapy (TF-CBT) has the strongest evidence base of any child trauma treatment, with over 16 randomized trials demonstrating a range of positive outcomes across sex, age range, and ethnic and cultural groups for reduced symptoms of PTSD, anxiety, depression, and trauma-related behavioral problems [4, 5]. Evidence exists for sustained impact 1 year later [6]. Despite its efficacy, there are no effectiveness studies evaluating TF-CBT in schools. This is concerning, as there are attempts to scale up TF-CBT in schools and other community settings through a variety of national initiatives [7, 8], even in the absence of evidence for effects in the education sector.

School mental health and the implementation gap

Efficacious mental health services such as TF-CBT are unlikely to yield public health impact unless they are consistently implemented in accessible settings. School-based mental health (SMH) services account for 50–80% of all youth mental health services in the USA [9,10,11]. Despite this, treatments found to be efficacious in other settings (e.g., community mental health) have not been “scaled-out” to the educational sector [12]. Like other service sectors, SMH service delivery is characterized by uneven adoption and insufficient fidelity to evidence-based services [13,14,15,16]. As a consequence, the potential of evidence-based treatments (EBTs), like TF-CBT, to reduce symptoms, improve functioning, and ultimately impact public health is significantly curtailed.

Multilevel implementation determinants

As outlined in the Exploration, Preparation, Implementation, Sustainment (EPIS) framework [17], implementation success depends on both individual- and system-level determinants (i.e., barriers and facilitators). Although the importance of organizational influences on implementation is well established [17, 18], organizational change is time consuming and expensive [19], requiring years of consistent and sustained effort to show effects. In contrast, focusing on individual-level processes represents a potentially parsimonious and pragmatic approach to improve the quality of trauma-focused services. Individual behavior change is ultimately required for successful EBT implementation, even when organizational factors are firmly in place [20,21,22]. Indeed, some studies have found that individual factors (especially attitudes) may be significantly more predictive of the use of EBT than organizational factors (e.g., organizational culture, implementation climate) [23, 24]. Although individual attitudes and behaviors are embedded within larger contexts [25], individual barriers may be more malleable and proximal to EBT implementation.

Individual-level implementation strategies

Research on individual-level implementation strategies has largely focused on training and consultation, which are necessary but insufficient to ensure successful EBT implementation [18, 26]. Many providers fail to successfully deliver an EBT even after receiving high-quality training and consultation [27]. Clinicians often remain resistant or ambivalent to change [28, 29] or lack the sustained self-efficacy or motivation to overcome perceived barriers to implementation [28].

To influence provider behaviors in response to EBT training and consultation, the current project applies two well-established theories of behavior change: the Theory of Planned Behavior (TPB) and Health Action Process Approach (HAPA) [30,31,32], which include motivational and volitional phases of behavior change and have been applied to implementation efforts [33, 34]. We selected these behavior change theories because of their clear explication of individual-level factors impacting behavior and their strong empirical support [35,36,37,38]. The central tenet of TPB is that one of the best predictors of behavior is a person’s behavioral intentions [30, 32] or an individual’s motivation or conscious plan to exhibit a particular behavior. Behavioral intentions, in turn, are a function of an individual’s attitudes (cognitive appraisals of the behavior), subjective norms (social pressure to perform the behavior), and self-efficacy (an individual’s confidence about performing a behavior).

The ability to predict behavior using TPB significantly increases with the addition of volitional strategies that support individuals in enacting the behaviors they are motivated to exhibit [39]. HAPA articulates volitional strategies including action planning, or specifying the “when,” “where,” and “how” of a behavior, and problem-solving planning, or articulating how to overcome barriers that interfere with one’s action plan [40]. When applied in combination, these two volitional strategies increase maintenance self-efficacy (i.e., a person’s optimistic beliefs about their capability to overcome barriers that arise while attempting to enact and maintain behavior) and facilitate the link between intentions and behavior, thus increasing the likelihood that specific implementation outcomes (e.g., adoption, fidelity) will occur [41].

Even when TPB has been used to guide process evaluations, efforts have explored traditional implementation strategies such as clinician education and performance-based feedback [42,43,44,45,46,47]. Aspects of HAPA, such as planning interventions, have demonstrated success in single-case or correlational implementation studies but have not been evaluated in randomized trials. Considering the promise of TPB and HAPA for shifting behavior, these theories should inform the development and testing of novel individual-level implementation strategies.

Beliefs and Attitudes for Successful Implementation in Schools

Grounded in TPB and HAPA (see Fig. 1) [30, 31, 35], the Beliefs and Attitudes for Successful Implementation in Schools (BASIS) strategy serves to augment EBT training and consultation. BASIS aims to increase implementation intentions and outcomes by shifting clinician attitudes, subjective norms, and self-efficacy during the motivational phase and maintaining self-efficacy during the volitional phase of behavior change. BASIS is delivered immediately prior to and immediately after EBT training. In addition, clinicians receive an individualized BASIS booster roughly 15 days post-training that is tailored to whether implementation has been initiated. Within the EPIS framework, BASIS sits at the intersection of the preparation/adoption and active implementation phases [17].

Fig. 1 BASIS and TF-CBT intervention components, hypothesized mechanisms of change, and target outcomes. Mechanisms appear in boxes with rounded corners. Blue = BASIS. Green = TF-CBT. Gray = outcomes

Figure 1 displays core BASIS and TF-CBT components, as well as their respective mechanisms of change (components are described in the Approach). Evaluation of implementation mechanisms is critical to ensuring the most effective and streamlined implementation strategies. There are very few theoretically informed implementation strategies that target precise mechanisms of behavior change [48, 49]. BASIS mechanisms are organized according to whether they are motivational or volitional. Each of the motivational mechanisms drawn from TPB (i.e., attitudes, subjective norms, self-efficacy) represents a malleable individual determinant linked to increased intentions to participate in training and consultation. The volitional mechanism derived from HAPA (i.e., maintenance self-efficacy) captures a critical determinant of the likelihood of clinicians initiating and maintaining implementation following training [50]. Each of the motivational and volitional mechanisms informs a specific component of the BASIS implementation strategy, described below.

In previous pilot work, BASIS has demonstrated large effects on target mechanisms and overall feasibility [23]. For instance, when a preliminary version of BASIS was delivered to 1181 educators in 62 schools, pre-post surveys showed that BASIS led to more favorable post-intervention EBT attitudes (d = 1.03) [51]. Attitudes, in turn, were associated with two measures of EBT fidelity (d = .51; d = .67). In a small-scale randomized trial with SMH clinicians [23], results indicated that BASIS was highly feasible, acceptable, and contextually appropriate. Further, moderate to large post-training effects on BASIS mechanisms of change encouraged the current trial.

Objectives and aims

The objective of this hybrid type 2 randomized effectiveness-implementation trial is to simultaneously examine the theoretical mechanisms through which a clinical intervention (TF-CBT) and an implementation strategy (BASIS) impact clinical and implementation outcomes, respectively [52].

Aim 1: experimentally evaluate the effectiveness and cost-effectiveness of TF-CBT in schools versus an enhanced treatment-as-usual condition

Aim 1 will evaluate, relative to control, the effects of TF-CBT conditions on TF-CBT’s identified mechanisms of change (trauma-related cognitions, emotion regulation, behavioral avoidance), differences in child mental health outcomes, and intervention costs and cost-effectiveness.

Aim 2: experimentally evaluate the impact and cost-effectiveness of BASIS versus attention control

Aim 2 will evaluate the main effects of BASIS, relative to control, on its theoretical implementation mechanisms, implementation outcomes, as well as costs and cost-effectiveness. We will also evaluate “hypothesis-defying residuals” (i.e., clinicians whose implementation behaviors are unaccounted for by the theoretical model) to further refine our BASIS theory of change.

Method

In this hybrid type 2 effectiveness-implementation stratified cluster randomized trial, a single clinician from each participating school will be randomized to BASIS plus TF-CBT (BASIS+TF-CBT), attention control plus TF-CBT (AC + TF-CBT), or enhanced treatment as usual (TAU) (see CONSORT diagram in Fig. 2 and Additional Files 1 and 2 for completed CONSORT and SPIRIT checklists). Participants will be recruited over three waves, one per year. Youth participants will be assigned to condition based on their school clinician’s condition. Surveys, interviews, and direct observations will be used to evaluate the impacts and costs of each intervention. In addition, sequential mixed-methods data collection [53] will explore how mechanisms are linked to implementation outcomes for “hypothesis-defying residuals” (e.g., clinicians whose intentions to implement are inconsistent with their documented implementation behaviors).

Fig. 2 CONSORT diagram

Additional File 3 contains human subjects’ approval. All data collected will be de-identified and stored on secure servers accessible only to members of the research team. Any adverse events or protocol modifications will be tracked and, when indicated, reported in a timely manner to the institutional review board and sponsor.

Changes due to COVID-19

As a result of the COVID-19 pandemic—and with approval from the study’s Program Officer—delivery of BASIS, AC, and training for clinicians in TF-CBT or enhanced TAU was switched from in-person to remote for the duration of the project. In addition, the study was adjusted to allow clinicians to provide virtual services to youth as indicated by their school district/employer policies.

Participants and recruitment

Clinicians

Clinician participants will include 120 SMH providers (40 per wave). SMH providers will consist of school psychologists, social workers, mental health counselors, and related professionals recruited from elementary, middle, and high schools in economically and ethnically diverse districts. One clinician per school will be recruited with the assistance of district administrators. The research team will contact eligible participants to describe the purpose of the study, research procedures, and incentives. Clinicians will be included if they (a) serve in a professional role to provide school-based mental health services, (b) hold a graduate degree or equivalent certification or experience, (c) have not previously received formal training in TF-CBT, and (d) are not actively receiving support to implement another related intervention.

Students

Youth participants will include 480 students (160 per wave), recruited by TF-CBT and TAU therapists. Youth will meet TF-CBT eligibility criteria: (a) enrollment in grades 3–12, (b) exposure to a traumatic event (e.g., exposure to violence), and (c) significant posttraumatic stress symptoms. SMH clinicians will recruit students by screening through their standard referral pathways (e.g., teacher- or self-referral; screening) for PTSD symptoms. Once students are identified, clinicians will ask caregivers whether they are open to being contacted by the research team. For caregivers who agree, contact information will be relayed to the team, who will follow up by phone to describe the research project, estimated time to participate, and participant compensation, and to obtain consent (see Additional File 4 for all consent forms). For caregivers who consent, research staff will explain the study to students, answer questions, and obtain assent for participation. Four to six students will be recruited by each clinician. Based on previous research [54, 55], it is expected that students will be ~ 50% female and representative of the schools in which they are enrolled.

Randomization

Clinician random assignment will occur at the school level, with one clinician per school recruited. We will use this strategy to minimize analytic difficulties associated with reliably partitioning school and clinician variance with small numbers of clinicians per school and to eliminate the risk of condition contamination. Clinicians and TF-CBT trainers/consultants will be blind to their BASIS/AC condition assignment. To ensure comparability of conditions at baseline, minimize potential confounding, and maximize accuracy of effect estimates, we will use a stratified randomization design, carried out by the study’s lead methodologist (MP). We will collect baseline clinician data on implementation intentions and mechanisms (e.g., attitudes, norms, self-efficacy), clinician years of experience, and school characteristics (school size, attendance rates, free/reduced lunch rates, percentage White, disciplinary rates). We will use the nearest neighbor algorithm to generate paired distance estimates between clinicians and then select matched groups of three based on the smallest distance estimates. Within each group of three, clinicians will be randomly assigned to the three conditions.
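To make the match-then-randomize logic concrete, the following is a minimal sketch. The greedy pairing rule, covariate standardization, and condition labels are simplifying assumptions for illustration, not the study's exact algorithm:

```python
import numpy as np

def matched_triplet_randomization(X, conditions=("BASIS+TF-CBT", "AC+TF-CBT", "TAU"), seed=0):
    """Group clinicians into matched triplets by nearest-neighbor distance on
    standardized baseline covariates, then randomly assign the three study
    conditions within each triplet."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n = len(X)
    assert n % 3 == 0, "sample size must be divisible by the number of arms"
    Z = (X - X.mean(axis=0)) / X.std(axis=0)                     # standardize covariates
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)   # pairwise distances
    unassigned, assignment = set(range(n)), {}
    while unassigned:
        pool = sorted(unassigned)
        sub = D[np.ix_(pool, pool)].copy()
        np.fill_diagonal(sub, np.inf)
        i, j = np.unravel_index(np.argmin(sub), sub.shape)       # closest remaining pair
        a, b = pool[i], pool[j]
        # third member: the remaining unit closest to the chosen pair
        c = min((k for k in pool if k not in (a, b)), key=lambda k: D[a, k] + D[b, k])
        for unit, cond in zip((a, b, c), rng.permutation(conditions)):
            assignment[unit] = str(cond)
        unassigned -= {a, b, c}
    return assignment
```

Because conditions are permuted within each matched triplet, every arm receives exactly one member of each triplet, preserving baseline comparability.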

Clinical interventions

Both the BASIS and AC sessions will bookend a virtual TF-CBT training and will be followed by TF-CBT consultation and an online booster session.

TF-CBT

TF-CBT is a 12–16-session intervention for children aged 3 to 18 years with trauma exposure and related mental health sequelae. TF-CBT includes individual sessions for the youth, individual sessions for parents, and conjoint sessions that include both the youth and parent. TF-CBT has established training and consultation protocols, as well as a psychometrically strong objective fidelity instrument (see Table 1) [56]. All clinicians assigned to one of the TF-CBT conditions will participate in “gold standard” virtual TF-CBT training, including completion of an online, self-paced 8–10 h didactic training, a virtual live 3-day training (3–4 h per day) with a certified TF-CBT trainer, and 6 months of post-training consultation. Consultation groups (6–8 providers each) will be formed within condition to avoid contamination.

Table 1 BASIS strategy components

TAU

The enhanced TAU condition is intended to ensure safety, support SMH providers who have variable levels of experience treating trauma directly, and reduce unwanted variance in TAU. The research team developed a brief protocol to provide guidelines for psychoeducation and post-assessment connection and support as scaffolding for usual care in schools. To maintain internal validity, this companion protocol does not include in-depth attention to the elements of TF-CBT (e.g., exposure) hypothesized to be responsible for its effects. TAU scaffolding will be provided via an online presentation with a live question-and-answer section at the conclusion.

BASIS strategy

The BASIS multifaceted implementation strategy is group-based and interactive, with a pre-training session (~ 3 h) delivered prior to TF-CBT training, a post-training session (~ 90 min) delivered immediately after training, and an online booster ~ 15 days post-training. Below, we describe BASIS motivational and volitional components, as well as the BASIS structure and fidelity assessment. Table 1 displays all BASIS components.

BASIS motivational components

The first BASIS component involves strategic education about the benefits of EBTs and intervention fidelity, designed to improve attitudes. Examples of popular but ineffective practices (e.g., learning styles) are used to help clinicians identify cognitive shortcuts that enhance vulnerability to adopting non-EBTs. In addition, participants are prompted to reflect on the importance of fidelity across a range of professions (e.g., engineering, farming, aviation).

Second, BASIS includes social influence techniques to alter perceptions of subjective norms. Evidence-based social influence strategies consist of two broad categories: (1) social proofing messages that use data or testimonials to describe the behavior or attitudes of others and (2) strategies to induce cognitive dissonance. Social proofs are most influential when they come from individuals with whom recipients closely identify [57], especially when testimonials speak to the usefulness of the specific behavior [58, 59]. Strategies to induce cognitive dissonance operate on the premise that individuals strive for consistency between their attitudes and actions [60]. Thus, desired behaviors can be increased by evoking commitments that are active (vs. passive), public (vs. private), and voluntary (vs. coerced) [61, 62]. These strategies are integrated throughout BASIS. For instance, normative data and testimonials are used to normalize clinician experience of barriers to EBT implementation (e.g., lack of time, low administrative support), express commitment to problem-solve barriers, and debunk common myths about EBTs.

Third, motivational interviewing (MI) is used to enhance self-efficacy. MI is a nondirective, patient-centered approach with strong evidence for building engagement and commitment for behavior change [63,64,65,66,67,68]. The BASIS facilitator utilizes group MI techniques by adopting an empathic, nondirective, and person-centered style to elicit self-motivational statements and encourage “change talk” (i.e., statements about making behavioral changes). Participants engage in a values affirmation activity [69, 70] that has been shown to decrease defensiveness toward change and enhance motivation [71]. To further enhance self-efficacy, participants also anticipate barriers that may arise in implementation and collaborate to generate solutions to those barriers and engage in decisional balance activities to reflect on the pros and cons of changing or not changing.

BASIS volitional components

To address the intention-behavior gap, BASIS includes volitional planning interventions derived from HAPA to increase the likelihood that clinicians will maintain self-efficacy and act upon their intentions by enacting implementation behaviors. Specifically, action planning and problem-solving planning have been shown to facilitate health behaviors such as breast cancer self-examinations, medication adherence, exercise, and healthy eating [72,73,74,75,76,77]. Action planning supports translation of intentions into actions through detailed planning of how to perform behaviors in specific contexts. Problem-solving generates solutions in response to both situational and internal (e.g., cognitive) barriers to facilitate follow through with the action plan. In combination, action planning and problem-solving planning increase the likelihood that implementation intentions translate into behavior change [39]. Action planning and problem-solving planning occur immediately post-EBT training.

BASIS structure

The BASIS pre-training session targets attitudes, subjective norms, and self-efficacy via the motivational components listed above. The pre-training opens with the facilitator engaging SMH providers in an activity to clarify their professional values (MI component). Strategic education components are not delivered didactically, but rather in a format designed to facilitate interaction among participants. Open-ended questions are used to elicit change talk. Testimonials are interspersed throughout (social influence). At the end of the pre-training session, providers collaborate with each other to develop an individualized menu of potential solutions to common implementation barriers that they can select from when encountering challenges to adopting and delivering new practices with fidelity. Last, they set value-congruent goals related to participation and engagement in the upcoming EBT training.

The BASIS post-training session includes volitional strategies shown in prior research to maintain implementation intentions and facilitate actual enactment of behavior change. Specifically, clinicians are supported to develop action plans and problem-solving plans. Clinicians are provided with an action planning template to detail precisely which TF-CBT components they will deliver, how, with whom, where and when, and the environmental cues and resources needed to initiate delivery of TF-CBT with fidelity. The problem-solving plan involves clinicians anticipating situational and internal barriers and generating solutions to overcome them, yielding personalized if-then plans that can be used when confronted with specific barriers.

The BASIS online booster is delivered at approximately 15 days post-training, a time point when clinicians’ implementation intentions and behaviors may first weaken [78]. The aim of the BASIS booster is to provide adaptive content to clinicians to either increase intentions to implement or maintain self-efficacy to implement the EBT, depending on whether or not clinicians have initiated TF-CBT implementation. For instance, clinicians who have not yet initiated implementation will (1) be provided with a distilled version of the pre-training BASIS content (improving attitudes, subjective norms, and self-efficacy) and (2) be supported to revise their action and problem-solving plans with specific attention to components of these plans that were not aligned with their implementation intentions or the constraints of their service settings.

BASIS fidelity

A BASIS fidelity tool [78] will be used by trained research assistants to rate recordings of BASIS delivery. Each recording will be coded independently by two raters, with disagreements resolved through consensus dialog, to capture facilitator adherence and participant responsiveness via engagement [79, 80].

Attention control

Providers randomly assigned to the AC condition will receive a 3-h pre-training, a 90-min post-training session, and an online booster roughly 15 days post-training to mirror the duration of BASIS and control for dose, information provided, and interventionist effects. The AC condition will be delivered by the same facilitator as BASIS to control for facilitator effects. Content will be didactic, as is typical in trainings for SMH clinicians [81, 82]. The AC pre-training session will provide content on the definition of EBT, how EBTs are established, why clinicians should use EBTs, clinical outcomes associated with different EBTs, and defining different dimensions of fidelity. The post-training session will involve having control clinicians reflect on TF-CBT and its core components and discuss the outcomes associated with TF-CBT. The AC booster will prompt clinicians to reflect on and describe TF-CBT and identify and define each of its core components.

Clinician data collection

Clinician data collection will span the active implementation and sustainment phases (18 months in total). Data will include clinician quantitative surveys and qualitative interviews, fidelity assessments of recorded TF-CBT sessions (via objective coding), and ratings of TF-CBT case presentations completed by TF-CBT consultants. Data collection will be incentivized in both the implementation and sustainment phases.

Quantitative surveys

Clinician surveys will be administered via a secure web-based system at 12 time points for BASIS/AC and 9 time points for TAU, beginning in the fall of each year. Clinicians will self-report their demographic characteristics, BASIS mechanisms (attitudes, subjective norms, self-efficacy, maintenance self-efficacy), implementation intentions, organizational moderators (implementation climate, leadership), and TF-CBT sessions delivered. For BASIS/AC, surveys will be administered at baseline through post-training (T1 prior to the self-paced course, T2 after the self-paced course, T3 after the BASIS/AC pre-training session, T4 after the TF-CBT virtual training, and T5 at the end of the BASIS/AC post-training session), 2 months from baseline and after the BASIS/AC booster (T6), winter (T7), spring (T8), and end of school year (T9), as well as three sustainment time points during the subsequent year (T10 in fall of year 2, T11 in winter, and T12 in spring). For TAU, we will administer the T1, T5, and T6–T12 surveys.

TF-CBT fidelity assessments

To assess TF-CBT fidelity, clinicians will record all sessions for participating TF-CBT and TAU students. Three sessions (one from each phase of TF-CBT) will be randomly selected and coded for fidelity using an established, systematic TF-CBT coding protocol (Table 2).

Table 2 Study measures by construct

TF-CBT toolkit

Adoption and sustainment data will be collected via the TF-CBT “Toolkit,” an online TF-CBT tracking system used by clinicians to determine client eligibility, log client sessions, and facilitate consultation.

Qualitative interviews

To fully address aim 2 and identify factors unaccounted for by our BASIS theory of change (Fig. 1), unexplained residuals from aim 2 mediation analyses will be explored qualitatively via semi-structured phone interviews at the end of the active implementation phase. Residuals are defined as clinicians whose implementation behavior is insufficiently accounted for by our mediation model (e.g., clinicians with favorable implementation outcomes who nonetheless demonstrate low levels of BASIS mechanisms). Clinicians from aim 2 will be identified at the end of their first year of participation based on the results of aim 2 quantitative modeling. Participants will include those whose predicted probability score deviates from their observed behavior by more than 1.0 standard deviation, balanced between implementers and non-implementers and between the BASIS+TF-CBT and AC+TF-CBT conditions (approximately 15–19 clinicians total). The mixed methods design will be sequential in structure; the functions are sampling and expansion; and the process is connecting [98,99,100]. We will develop a systematic, comprehensive semi-structured interview guide that draws from the EPIS framework [17] to examine multilevel (i.e., intervention, individual, inner setting, outer setting) [101] determinants that explain what processes facilitated or hindered EBT implementation and sustainment.

Cost assessments

Activity-based costing will estimate the respective, incremental costs of enhanced TAU, TF-CBT, and BASIS. For TF-CBT/TAU costs, we will estimate direct costs of TF-CBT and enhanced TAU, such as training, consultation and delivery of services, and indirect costs, such as lost opportunities for alternative activities. For BASIS costs, we will directly measure the resource use and incremental costs associated with BASIS and TF-CBT training as compared to status quo TF-CBT training (i.e., TF-CBT implementation-as-usual) [102]. We will identify activities related to training and associated labor and non-labor inputs. Inputs can include time, supplies, travel, overhead, and costs associated with TF-CBT and BASIS training meetings, including pre-work, scheduling, etc.
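The arithmetic of activity-based costing reduces to summing labor (hours × wage) plus non-labor inputs across activities. A minimal sketch follows; the activities, hours, and rates are hypothetical placeholders, not study data:

```python
# Activity-based costing sketch: cost = labor (hours x wage) + non-labor inputs.
# All activities, hours, and rates below are hypothetical placeholders.
activities = [
    # (activity, labor hours, hourly wage, non-labor cost)
    ("BASIS pre-training session", 3.0, 55.0, 40.0),
    ("BASIS post-training session", 1.5, 55.0, 20.0),
    ("Online booster", 0.5, 55.0, 0.0),
]
total_cost = sum(hours * wage + non_labor for _, hours, wage, non_labor in activities)
print(f"Incremental cost per training cohort: ${total_cost:.2f}")
```

In practice, each input category (time, supplies, travel, overhead) would be a separate column, and costs would be summed per condition to yield the incremental comparison.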

Cost-effectiveness

We will combine the incremental costs collected for BASIS, TF-CBT, and TAU with estimates of effectiveness on implementation and clinical outcomes (see “Cost and cost-effectiveness analyses” section).
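Combining incremental costs with incremental effects follows the standard incremental cost-effectiveness ratio (ICER). A minimal sketch, with hypothetical per-clinician figures for illustration only:

```python
def icer(cost_new, cost_comparator, effect_new, effect_comparator):
    """Incremental cost-effectiveness ratio: additional cost per additional
    unit of effect (e.g., per extra clinician adopting TF-CBT)."""
    return (cost_new - cost_comparator) / (effect_new - effect_comparator)

# Hypothetical figures: augmented vs. status quo training, with effect
# measured as the probability of TF-CBT adoption.
print(icer(1200.0, 900.0, 0.75, 0.50))  # $300 more per 0.25 gain in adoption
```

The same ratio applies whether the effect is an implementation outcome (aim 2) or a clinical outcome (aim 1); only the denominator changes.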

Youth data collection

Youth data collection will focus on mental health and functional outcomes targeted by TF-CBT, with evidence of effectiveness in previous studies. Youth will complete surveys via telephone prior to their first TF-CBT session (ST1) and at 3- (ST2) and 6-month (ST3) follow-ups. These surveys will assess PTSD, depression, and psychosocial functioning. On the same time frame, caregivers will complete a measure of their students’ psychosocial functioning and provide information regarding family income and parent occupation. At the end of each school year, academic records will be requested for all participants who received TF-CBT or TAU during that year. Attendance, discipline, and achievement (standardized test scores, grades) will be extracted from these academic records.

Measures

Measures evaluate basic demographics and aspects of each school’s context, mediators (mechanisms) and moderators, implementation outcomes, and youth clinical outcomes. Table 2 displays all study measures, and Additional File 5 provides further detail on each.

Data analytic plan

Aim 1: TF-CBT effects

We will test the direct main effects of TF-CBT on the hypothesized intervention mechanisms (trauma-related cognitions, emotion regulation, behavioral avoidance) and child mental health outcomes (e.g., symptoms of posttraumatic stress) via longitudinal mixed effects models (time within client within clinician) for each mechanism. Models will use effect coding for the three conditions (BASIS, AC, enhanced TAU) in an intention-to-treat approach, with comparisons focused on the main effect of TAU vs. the TF-CBT conditions. To increase statistical power to detect an effect of TF-CBT, we will include as covariates any client-, provider-, or school-level variables associated with the mechanism variable in preliminary bivariate analyses. Path analysis will then be used to test the mediated effect of mechanisms. Mixed effects models will estimate standardized coefficients for each path separately as well as together; a significant reduction in the condition-to-outcome coefficient when the other paths are included will be suggestive of partial or full mediation, with significance tested via the bias-corrected bootstrap method [103].

Aim 2: BASIS effects

For aim 2, clinicians assigned to the enhanced TAU condition will be excluded from analyses. We will test the impact of BASIS on proximal implementation mechanisms of change via a series of piecewise longitudinal mixed effects models (time within clinician), examining between-condition differences in rates of change across all twelve time points, pieced into three epochs (T1–T5, T6–T9, T10–T12), for each of the five primary BASIS mechanisms of change. Compared to AC, we hypothesize steeper gains for BASIS from T1 to T5, smaller rates of decline from T6 to T9, and higher levels of sustainment from T10 to T12. The impact of BASIS on fidelity will be tested using mixed effects models, with fidelity measurement occasions nested within clinician. Adoption and sustainment will be tested via logistic regressions with condition as a predictor (adoption: whether TF-CBT was used in the first year; sustainment: whether TF-CBT was used in the second year). We will stratify sustainment analyses (T10–T12) in two ways: (1) the entire population of participants, regardless of year 1 implementation, to capture any sustained impact of BASIS, and (2) the subgroup of clinicians who implemented in year 1, to examine sustainment of TF-CBT implementation. Mediation will be tested as in aim 1.
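
One common way to set up such a piecewise growth model is to recode the time variable into one slope term per epoch, each counting time elapsed within its phase, so that each fixed-effect slope is directly interpretable as the rate of change during that phase. A sketch of that coding for the three epochs described above (a standard piecewise coding, not necessarily the exact parameterization the trial will use):

```python
def piecewise_time_codes(t):
    """Map a 1-indexed time point (1..12) to three piecewise slope terms,
    one per epoch: T1-T5 (active phase), T6-T9 (post-training),
    and T10-T12 (sustainment). Each term accrues only within its epoch
    and is held constant afterward."""
    s1 = min(t, 5) - 1           # 0..4 across T1-T5, then held at 4
    s2 = max(0, min(t, 9) - 5)   # 0 until T5, then 1..4 across T6-T9
    s3 = max(0, t - 9)           # 0 until T9, then 1..3 across T10-T12
    return s1, s2, s3
```

Interacting each slope term with condition then yields the between-condition differences in epoch-specific rates of change that the hypotheses describe.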

Based on the aim 2 quantitative models, we will identify clinicians whose predicted and actual implementation behavior differ by ≥ 1 SD. Transcribed interview data will be coded using an integrated directed content analysis approach [104]: some codes will be specified during interview guide development (deductive), and others will be developed through a close reading of an initial subset of transcripts (inductive) [105]. The resulting themes will provide a way of identifying and understanding the most salient factors that impact implementation and extend beyond the existing BASIS mechanisms and theory of change [106, 107]. Directed coding will be driven by the EPIS framework [17]. After a stable set of codes is developed, all reviewers will independently code and compare their coding, arriving at consensus judgments through open dialog [79, 80, 108].

In all main effects analyses, the Benjamini-Hochberg procedure will be applied to control the false discovery rate within outcome families (e.g., “attitudes”). Missing data will be addressed within the models via restricted maximum likelihood estimation.
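
For reference, the Benjamini-Hochberg step-up procedure (which controls the false discovery rate rather than the familywise error rate) can be implemented in a few lines; this is a generic sketch, not study code:

```python
def benjamini_hochberg(p_values, q=0.05):
    """Step-up Benjamini-Hochberg procedure: return a boolean list marking
    which hypotheses are rejected while controlling the false discovery
    rate at level q."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # ascending p
    # Find the largest rank k with p_(k) <= (k / m) * q ...
    max_k = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank / m * q:
            max_k = rank
    # ... and reject every hypothesis whose rank is <= k.
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= max_k:
            reject[idx] = True
    return reject
```

Applied within each outcome family (e.g., all "attitudes" tests), this keeps the expected proportion of false positives among rejected tests at or below q.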

Cost and cost-effectiveness analyses

For both aims, cost data collection and analyses of TF-CBT and BASIS will estimate incremental total and unit costs for TAU, BASIS+TF-CBT, and TF-CBT during both the implementation and sustainment periods [109]. We will estimate average weighted cost metrics across the sample of study sites. For each condition, we will estimate the average incremental costs for (1) total economic costs and (2) cost per clinician trained (stratified by TAU training, BASIS training, and TF-CBT training). The cost-effectiveness analysis will compare incremental net costs with benefits, defined as changes in clinical outcomes (e.g., PTS, depression, and anxiety), across the control condition (TAU) and the two intervention arms (TF-CBT and BASIS+TF-CBT).
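
The pairwise comparison at the heart of such an analysis reduces to an incremental cost-effectiveness ratio (ICER): the difference in cost between two arms divided by the difference in effect. A sketch with hypothetical per-youth figures (not study estimates):

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per additional unit
    of clinical benefit (e.g., per point of symptom reduction) of one arm
    relative to a comparator."""
    delta_cost = cost_new - cost_ref
    delta_effect = effect_new - effect_ref
    if delta_effect == 0:
        raise ValueError("no incremental effect; ICER is undefined")
    return delta_cost / delta_effect

# Hypothetical example: intervention arm vs. TAU.
# icer(1200.0, 800.0, 10.0, 6.0) -> 100.0 (cost units per symptom point)
```

Each intervention arm would be compared against TAU (and against each other) on this ratio, with uncertainty typically characterized by bootstrapping costs and effects jointly.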

Power

Aim 1

Our conservative power analyses indicate a minimum detectable effect size (MDES) of Cohen’s d = .35 for direct effects on student mechanisms and outcomes, assuming an intraclass correlation of .05, paired two-group tests (e.g., BASIS vs. enhanced TAU) with 35 schools/clinicians per group (to account for attrition/missing data), and 4 clients served by each provider. Assuming the number of clients per clinician varies from 1 to 7, a Monte Carlo simulation estimated that the maximum detrimental impact on power would be less than 10% [110]. Using bias-corrected bootstrapping, we will have sufficient power to detect mediation when each of the two mediation paths has d > .2 (small effects) [111].
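
The clustering adjustment underlying these figures is the standard design effect, DEFF = 1 + (m − 1) × ICC, which inflates the variance of group means when m clients share a clinician. A small helper using the assumptions stated above (this illustrates the adjustment only; it is not the full Monte Carlo power analysis):

```python
def design_effect(cluster_size, icc):
    """Variance inflation from clustering: DEFF = 1 + (m - 1) * ICC."""
    return 1 + (cluster_size - 1) * icc

def effective_n(n_clusters, cluster_size, icc):
    """Per-arm sample size after deflating for within-cluster correlation."""
    return n_clusters * cluster_size / design_effect(cluster_size, icc)

# With 35 clinicians per arm, 4 clients each, and ICC = .05:
# DEFF = 1.15, so 140 clients contribute roughly the information
# of ~122 independent observations per arm.
```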

Aim 2

Conservatively assuming 35 clinicians per group and 12 time points, we will have sufficient power to detect a Cohen’s d of .68 for clinician implementation outcomes, which is smaller than most significant effects in our pilot study [78]. We will have power to detect mediation effects when the relationship of BASIS to implementation mechanisms is d > .59 (as expected) and the relationships of mechanisms to implementation outcomes are at least small-to-moderate (d > .31) [111].

Discussion

Innovation

This hybrid trial will address significant gaps in implementation science surrounding the impact of pragmatic, individually focused implementation strategies and their mechanisms. It is also the first project to conduct an effectiveness trial of TF-CBT in schools. Existing compilations of implementation strategies [112, 113] contain very few individually focused strategies [114], and none are explicitly designed to impact the mechanisms identified by TPB and HAPA. In particular, although TPB is the most commonly used social-cognitive theory for designing and evaluating the impact of implementation strategies [34, 115], no studies have tested TPB constructs as mechanisms of behavior change via mediation [49, 81]. BASIS isolates individual-level mechanisms of implementation, the understanding of which will inform the design and tailoring of efficient strategies. A recent systematic review [115] identified only three prior studies that have conducted experimental tests of the impact of implementation strategies on mechanisms (mediators) and implementation outcomes. We will conduct mediation analyses testing the extent to which BASIS mechanisms (TPB/HAPA constructs) account for changes in implementation outcomes, as well as whether TF-CBT mechanisms account for changes in youth outcomes.

Furthermore, whereas many existing implementation strategies unfold over months or years [116,117,118], the majority of BASIS activities occur at a single time point. The resources required to deliver BASIS are minimal compared to those typically invested in EBT training and consultation [119], all of which may be wasted if individual-level barriers to implementation go unaddressed. TF-CBT has demonstrated efficacy in improving youth symptoms, but no cost-effectiveness evaluations of TF-CBT [120, 121] have been conducted in schools. Cost-effectiveness is a critical but understudied factor in implementation science [122, 123] and is often a primary driver of decision-making by policymakers and leadership in service settings [124]. In the proposed trial, we will explicitly examine the cost-effectiveness of both BASIS and TF-CBT for improving youth outcomes.

Finally, much of what has been written about the sustainment phase of implementation comes from conceptual models [125, 126], and the literature has been described as “fragmented and underdeveloped” [127]. Few implementation trials have explicitly assessed the maintenance of effects produced by strategies extending into a sustainment period [128, 129]. Moreover, virtually no studies have done so in the education sector [130,131,132]. Given that BASIS represents a pragmatic and time-limited implementation strategy, evaluation of its long-term effects is particularly critical. Consistent with recommendations for education sector research [13], the current project will track all clinician participants into a sustainment period that spans at least one summer break and the subsequent academic year.

Limitations

The current study recruits only one clinician per school site, which does not allow for robust evaluation of organizational covariates. We considered randomizing multiple clinicians from each site to different conditions to increase the sample size; however, because BASIS is designed to target clinician perceptions of social norms, this would have presented a significant risk of contamination.

Conclusion and impact

This research will generate critical knowledge about the effectiveness and cost-effectiveness of BASIS—a pragmatic, theory-driven, and generalizable implementation strategy—for supporting provider behavior change, as well as the effectiveness of TF-CBT in SMH. Trial results will be disseminated via publications and presentations, through traditional (e.g., press releases) and social (e.g., Twitter) media, and through networks of practitioners. If effective, BASIS could be adapted for use in initiatives across service sectors and evaluated in subsequent trials with additional interventions. Finally, evidence supporting the effectiveness of TF-CBT in schools would support increased scale-up of this EBT in the most common setting where youth with a history of trauma are likely to access needed mental health services.

Availability of data and materials

Please contact the lead author for more information.

Abbreviations

AC:

Attention control

AC + TF-CBT:

AC plus TF-CBT

BASIS:

Beliefs and Attitudes for Successful Implementation in Schools

BASIS+TF-CBT:

BASIS plus TF-CBT

EBT:

Evidence-based treatment

EPIS:

Exploration, Preparation, Implementation, Sustainment

HAPA:

Health Action Process Approach

MDES:

Minimum detectable effect sizes

MI:

Motivational interviewing

PTSD:

Posttraumatic stress disorder

SMH:

School-based mental health

TAU:

Treatment-as-usual

TF-CBT:

Trauma-focused cognitive behavior therapy

TPB:

Theory of Planned Behavior

References

1. Copeland WE, Keeler G, Angold A, Costello J. Traumatic events and posttraumatic stress in childhood. Arch Gen Psychiatry. 2007;64:577–84.

2. Bridges AJ, de Arellano MA, Rheingold AA, Danielson CK, Silcott L. Trauma exposure, mental health, and service utilization rates among immigrant and United States-born Hispanic youth: results from the Hispanic family study. Psychol Trauma Theory Res Pract Policy. 2010;2(1):40–8. https://doi.org/10.1037/a0019021.

3. Overstreet S, Chafouleas SM. Trauma-informed schools: introduction to the special issue. Sch Ment Heal. 2016;8(1):1–6. https://doi.org/10.1007/s12310-016-9184-1.

4. Dorsey S, McLaughlin KA, Kerns SEU, et al. Evidence base update for psychosocial treatments for children and adolescents exposed to traumatic events. J Clin Child Adolesc Psychol. 2017;46(3):303–30. https://doi.org/10.1080/15374416.2016.1220309.

5. de Arellano MAR, Lyman DR, Jobe-Shields L, et al. Trauma-focused cognitive-behavioral therapy for children and adolescents: assessing the evidence. Psychiatr Serv. 2014;65(5):591–602. https://doi.org/10.1176/appi.ps.201300255.

6. Mannarino AP, Cohen JA, Deblinger E, Runyon MK, Steer RA. Trauma-focused cognitive-behavioral therapy for children: sustained impact of treatment 6 and 12 months later. Child Maltreat. 2012;17(3):231–41. https://doi.org/10.1177/1077559512451787.

7. Sigel BA, Benton AH, Lynch CE, Kramer TL. Characteristics of 17 statewide initiatives to disseminate trauma-focused cognitive-behavioral therapy (TF-CBT). Psychol Trauma Theory Res Pract Policy. 2013;5(4):323–33. https://doi.org/10.1037/a0029095.

8. Hanson RF, Gros KS, Davidson TM, et al. National trainers’ perspectives on challenges to implementation of an empirically-supported mental health treatment. Admin Pol Ment Health. 2014;41(4):522–34. https://doi.org/10.1007/s10488-013-0492-6.

9. Farmer TW, Estell DB, Bishop JL, O’Neal KK, Cairns BD. Rejected bullies or popular leaders? The social relations of aggressive subtypes of rural African American early adolescents. Dev Psychol. 2003;39:992–1004. https://doi.org/10.1037/0012-1649.39.6.992.

10. Merikangas KR, He J, Burstein M, et al. Service utilization for lifetime mental disorders in U.S. adolescents: results of the National Comorbidity Survey-Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2011;50(1):32–45. https://doi.org/10.1016/j.jaac.2010.10.006.

11. Duong M, Bruns E, Lee K, et al. Rates of mental health service utilization by children and adolescents in schools and other common service settings: a systematic review and meta-analysis. Admin Pol Ment Health. 2020. https://doi.org/10.1007/s10488-020-01080-9.

12. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12:111. https://doi.org/10.1186/s13012-017-0640-6.

13. Owens JS, Lyon AR, Brandt NE, et al. Implementation science in school mental health: key constructs in a developing research agenda. Sch Ment Heal. 2014;6(2):99–111. https://doi.org/10.1007/s12310-013-9115-3.

14. Ennett ST, Ringwalt CL, Thorne J, et al. A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prev Sci. 2003;4(1):1–14. https://doi.org/10.1023/A:1021777109369.

15. Evans SW, Weist MD. Implementing empirically supported treatments in the schools: what are we asking? Clin Child Fam Psychol Rev. 2004;7(4):263–7. https://doi.org/10.1007/s10567-004-6090-0.

16. Odom SL, McLean ME, Johnson LJ, LaMontagne MJ. Recommended practices in early childhood special education: validation and current use. J Early Interv. 1995;19(1):1–17. https://doi.org/10.1177/105381519501900101.

17. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38(1):4–23. https://doi.org/10.1007/s10488-010-0327-7.

18. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010;17(1):1–30. https://doi.org/10.1111/j.1468-2850.2009.01187.x.

19. Glisson C. The organizational context of children’s mental health services. Clin Child Fam Psychol Rev. 2002;5(4):233–53. https://doi.org/10.1023/A:1020972906177.

20. Michie S, Van Stralen M, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6(1):1.

21. Kincaid D, Childs K, Blase KA, Wallace F. Identifying barriers and facilitators in implementing schoolwide positive behavior support. J Posit Behav Interv. 2007;9(3):174–84. https://doi.org/10.1177/10983007070090030501.

22. Sanford DeRousie RM, Bierman KL. Examining the sustainability of an evidence-based preschool curriculum: the REDI program. Early Child Res Q. 2012;27:55–65.

23. Locke J, Lawson GM, Beidas RS, et al. Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: a cross-sectional observational study. Implement Sci. 2019;14(1):29. https://doi.org/10.1186/s13012-019-0877-3.

24. Locke J, Kang-Yi C, Frederick L, Mandell DS. Individual and organizational characteristics predicting intervention use for children with autism in schools. Autism. 2020;24(5):1152–63. https://doi.org/10.1177/1362361319895923.

25. Lyon AR, Whitaker K, Locke J, et al. The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health. Implement Sci. 2018;13(1):24. https://doi.org/10.1186/s13012-018-0721-1.

26. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments. A review of current efforts. Am Psychol. 2010;65(2):73–84.

27. Nadeem E, Saldana L, Chapman J, Schaper H. A mixed methods study of the stages of implementation for an evidence-based trauma intervention in schools. Behav Ther. 2018;49(4):509–24. https://doi.org/10.1016/j.beth.2017.12.004.

28. Dart EH, Cook CR, Collins TA, Gresham FM, Chenier JS. Test driving interventions to increase treatment integrity and student outcomes. Sch Psychol Rev. 2012;41(4):467–81. https://doi.org/10.1080/02796015.2012.12087500.

29. Gonzalez JE, Nelson JR, Gutkin TB, Shwery CS. Teacher resistance to school-based consultation with school psychologists: a survey of teacher perceptions. J Emot Behav Disord. 2004;12(1):30–7. https://doi.org/10.1177/10634266040120010401.

30. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211. https://doi.org/10.1016/0749-5978(91)90020-t.

31. Ajzen I. From intentions to actions: a theory of planned behavior. In: Action control: from cognition to behavior. Berlin: Springer; 1985. p. 11–39.

32. Hill RJ, Fishbein M, Ajzen I. Belief, attitude, intention and behavior: an introduction to theory and research. Contemporary Sociology. 1977;6:244. https://doi.org/10.2307/2065853.

33. Eccles MP, Grimshaw JM, Johnston M, et al. Applying psychological theories to evidence-based clinical practice: identifying factors predictive of managing upper respiratory tract infections without antibiotics. Implement Sci. 2007;2:26. https://doi.org/10.1186/1748-5908-2-26.

34. Godin G, Bélanger-Gravel A, Eccles M, Grimshaw J. Healthcare professionals’ intentions and behaviours: a systematic review of studies based on social cognitive theories. Implement Sci. 2008;3:36. https://doi.org/10.1186/1748-5908-3-36.

35. Ajzen I, Manstead A. Changing health-related behaviours: an approach based on the theory of planned behaviour. In: The scope of social psychology: theory and applications. New York: Psychology Press; 2007. p. 43–63.

36. Steinmetz H, Knappstein M, Ajzen I, Schmidt P, Kabst R. How effective are behavior change interventions based on the theory of planned behavior? Z Psychol. 2016;224(3):216–33. https://doi.org/10.1027/2151-2604/a000255.

37. Potthoff S, Presseau J, Sniehotta FF, Johnston M, Elovainio M, Avery L. Planning to be routine: habit as a mediator of the planning-behaviour relationship in healthcare professionals. Implement Sci. 2017;12(1):24. https://doi.org/10.1186/s13012-017-0551-6.

38. Hagermoser Sanetti LM, Collier-Meek MA, Long ACJ, Byron J, Kratochwill TR. Increasing teacher treatment integrity of behavior support plans through consultation and implementation planning. J Sch Psychol. 2015;53(3):209–29. https://doi.org/10.1016/j.jsp.2015.03.002.

39. Hagger MS, Luszczynska A. Implementation intention and action planning interventions in health contexts: state of the research and proposals for the way forward. Appl Psychol Health Well-Being. 2014;6(1):1–47. https://doi.org/10.1111/aphw.12017.

40. Schwarzer R, Lippke S, Luszczynska A. Mechanisms of health behavior change in persons with chronic illness or disability: the Health Action Process Approach (HAPA). Rehabil Psychol. 2011;56(3):161–70. https://doi.org/10.1037/a0024509.

41. Sanetti LMH, Collier-Meek MA, Long ACJ, Kim J, Kratochwill TR. Using implementation planning to increase teachers’ adherence and quality to behavior support plans. Psychol Sch. 2014;51(8):879–95. https://doi.org/10.1002/pits.21787.

42. Ramsay CR, Thomas RE, Croal BL, Grimshaw JM, Eccles MP. Using the theory of planned behaviour as a process evaluation tool in randomised trials of knowledge translation strategies: a case study from UK primary care. Implement Sci. 2010;5(1):71. https://doi.org/10.1186/1748-5908-5-71.

43. Presseau J, Grimshaw JM, Tetroe JM, et al. A theory-based process evaluation alongside a randomised controlled trial of printed educational messages to increase primary care physicians’ prescription of thiazide diuretics for hypertension [ISRCTN72772651]. Implement Sci. 2016;11(1):121. https://doi.org/10.1186/s13012-016-0485-4.

44. Desroches S, Gagnon M-P, Tapp S, Légaré F. Implementing shared decision-making in nutrition clinical practice: a theory-based approach and feasibility study. Implement Sci. 2008;3(1):48. https://doi.org/10.1186/1748-5908-3-48.

45. Lavis JN, Wilson MG, Grimshaw JM, et al. Effects of an evidence service on health-system policy makers’ use of research evidence: a protocol for a randomised controlled trial. Implement Sci. 2011;6(1):51. https://doi.org/10.1186/1748-5908-6-51.

46. Fraser KD, Sales AE, Baylon MAB, Schalm C, Miklavcic JJ. Data for improvement and clinical excellence: a report of an interrupted time series trial of feedback in home care. Implement Sci. 2017;12(1):66. https://doi.org/10.1186/s13012-017-0600-1.

47. Zwarenstein M, Grimshaw JM, Presseau J, et al. Printed educational messages fail to increase use of thiazides as first-line medication for hypertension in primary care: a cluster randomized controlled trial [ISRCTN72772651]. Implement Sci. 2016;11(1):124. https://doi.org/10.1186/s13012-016-0486-3.

48. Lewis CC, Klasnja P, Powell BJ, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6. https://doi.org/10.3389/fpubh.2018.00136.

49. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Admin Pol Ment Health. 2016;43(5):783–98. https://doi.org/10.1007/s10488-015-0693-2.

50. Schwarzer R. Health action process approach (HAPA) as a theoretical framework to understand behavior change. Actual En Psicol. 2016;30(121):119–30. https://doi.org/10.15517/ap.v30i121.23458.

51. Cook CR, Lyon AR, Kubergovic D, Browning Wright D, Zhang Y. A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multi-tiered system of supports. Sch Ment Heal. 2015;7(1):49–60. https://doi.org/10.1007/s12310-014-9139-3.

52. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

53. Palinkas LA, Horwitz SM, Chamberlain P, Hurlburt MS, Landsverk J. Mixed-methods designs in mental health services research: a review. Psychiatr Serv. 2011;62(3):255–63. https://doi.org/10.1176/ps.62.3.pss6203_0255.

54. Stein BD, Jaycox LH, Kataoka SH, et al. A mental health intervention for schoolchildren exposed to violence: a randomized controlled trial. JAMA. 2003;290(5):603–11. https://doi.org/10.1001/jama.290.5.603.

55. Kataoka S, Jaycox LH, Wong M, et al. Effects on school outcomes in low-income minority youth: preliminary findings from a community-partnered study of a school trauma intervention. Ethn Dis. 2011;21(3 Suppl 1):S1-71-77.

56. Vona P, Wilmoth P, Jaycox LH, et al. A web-based platform to support an evidence-based mental health intervention: lessons from the CBITS web site. Psychiatr Serv. 2014;65(11):1381–4. https://doi.org/10.1176/appi.ps.201300512.

57. Cialdini RB, Reno RR, Kallgren CA. A focus theory of normative conduct: recycling the concept of norms to reduce littering in public places. J Pers Soc Psychol. 1990;58(6):1015–26. https://doi.org/10.1037/0022-3514.58.6.1015.

58. Abrams D, Wetherell M, Cochrane S, Hogg MA, Turner JC. Knowing what to think by knowing who you are: self-categorization and the nature of norm formation, conformity and group polarization. Br J Soc Psychol. 1990;29(2):97–119. https://doi.org/10.1111/j.2044-8309.1990.tb00892.x.

59. Berkowitz AD. Applications of social norms theory to other health and social justice issues. In: The social norms approach to preventing school and college age substance abuse: a handbook for educators, counselors, clinicians. San Francisco: Jossey-Bass; 2002.

60. Petrova PK, Cialdini RB, Sills SJ. Consistency-based compliance across cultures. J Exp Soc Psychol. 2007;43(1):104–11. https://doi.org/10.1016/j.jesp.2005.04.002.

61. Cioffi D, Garner R. On doing the decision: effects of active versus passive choice on commitment and self-perception. Personal Soc Psychol Bull. 1996;22(2):133–47. https://doi.org/10.1177/0146167296222003.

62. Pratkanis AR. An invitation to social influence research. In: The science of social influence: advances and future progress. New York: Psychology Press; 2007. p. 1–16.

63. Lundahl BW, Kunz C, Brownell C, Tollefson D, Burke BL. A meta-analysis of motivational interviewing: twenty-five years of empirical studies. Res Soc Work Pract. 2010;20(2):137–60. https://doi.org/10.1177/1049731509347850.

64. Boyd-Ball AJ, Dishion TJ. Family-centered treatment for American Indian adolescent substance abuse: toward a culturally and historically informed strategy. In: Treating adolescent substance abuse: state of the science. New York: Cambridge University Press; 2006. p. 423–48.

65. Connell AM, Dishion TJ, Yasui M, Kavanagh K. An adaptive approach to family intervention: linking engagement in family-centered intervention to reductions in adolescent problem behavior. Consult Clin Psychol. 2007;75(4):568–79. https://doi.org/10.1037/0022-006X.75.4.568.

66. Dishion TJ, Kavanagh K. Intervening in adolescent problem behavior: a family-centered approach. New York: Guilford Press; 2003.

67. Mouzakitis A, Codding RS, Tryon G. The effects of self-monitoring and performance feedback on the treatment integrity of behavior intervention plan implementation and generalization. J Posit Behav Interv. 2015;17(4):223–34. https://doi.org/10.1177/1098300715573629.

68. Frey AJ, Lee J, Small JW, Seeley JR, Walker HM, Feil EG. The motivational interviewing navigation guide: a process for enhancing teachers’ motivation to adopt and implement school-based interventions. Adv School Ment Health Promot. 2013;6(3):158–73. https://doi.org/10.1080/1754730X.2013.804334.

69. Cohen GL, Sherman DK. The psychology of change: self-affirmation and social psychological intervention. Annu Rev Psychol. 2014;65(1):333–71. https://doi.org/10.1146/annurev-psych-010213-115137.

70. McQueen A, Klein WMP. Experimental manipulations of self-affirmation: a systematic review. Self Identity. 2006;5(4):289–354. https://doi.org/10.1080/15298860600805325.

71. Walton GM, Cohen GL. A brief social-belonging intervention improves academic and health outcomes of minority students. Science. 2011;331(6023):1447–51. https://doi.org/10.1126/science.1198364.

72. Sanetti LMH, Kratochwill TR, Long ACJ. Applying adult behavior change theory to support mediator-based intervention implementation. Sch Psychol Q. 2013;28(1):47–62. https://doi.org/10.1037/spq0000007.

73. Adriaanse MA, Vinkers CDW, De Ridder DTD, Hox JJ, De Wit JBF. Do implementation intentions help to eat a healthy diet? A systematic review and meta-analysis of the empirical evidence. Appetite. 2011;56(1):183–93. https://doi.org/10.1016/j.appet.2010.10.012.

74. Bélanger-Gravel A, Godin G, Amireault S. A meta-analytic review of the effect of implementation intentions on physical activity. Health Psychol Rev. 2013;7(1):23–54. https://doi.org/10.1080/17437199.2011.560095.

75. Orbell S, Sheeran P. Motivational and volitional processes in action initiation: a field study of the role of implementation intentions. J Appl Soc Psychol. 2000;30(4):780–97. https://doi.org/10.1111/j.1559-1816.2000.tb02823.x.

76. Sheeran P, Orbell S. Implementation intentions and repeated behaviour: augmenting the predictive validity of the theory of planned behaviour. Eur J Soc Psychol. 1999;29(2-3):349–69. https://doi.org/10.1002/(SICI)1099-0992(199903/05)29:2/3<349::AID-EJSP931>3.0.CO;2-Y.

77. Orbell S, Hodgkins S, Sheeran P. Implementation intentions and the theory of planned behavior. Personal Soc Psychol Bull. 1997;23(9):945–54. https://doi.org/10.1177/0146167297239004.

78. Lyon AR, Cook CR, Duong MT, et al. The influence of a blended, theoretically-informed pre-implementation strategy on school-based clinician implementation of an evidence-based trauma intervention. Implement Sci. 2019;14(1):54. https://doi.org/10.1186/s13012-019-0905-3.

79. DeSantis L, Ugarriza DN. The concept of theme as used in qualitative nursing research. West J Nurs Res. 2000;22(3):351–72. https://doi.org/10.1177/019394590002200308.

80. Hill CE, Knox S, Thompson BJ, Williams EN, Hess SA, Ladany N. Consensual qualitative research: an update. J Couns Psychol. 2005;52(2):196–205.

81. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153(10):1423–31.

82. Tashakkori A, Teddlie C. Handbook of mixed methods in social & behavioral research. Thousand Oaks: SAGE Publications; 2010.

83. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74. https://doi.org/10.1023/B:MHSR.0000024351.12294.65.

84. Francis J, Eccles MP, Johnston M, et al. Constructing questionnaires based on the theory of planned behaviour: a manual for health services researchers. Published online 2004. http://openaccess.city.ac.uk/1735/.

85. Schwarzer R, Hallum S. Perceived teacher self-efficacy as a predictor of job stress and burnout: mediation analyses. Appl Psychol. 2008;57(s1):152–71. https://doi.org/10.1111/j.1464-0597.2008.00359.x.

86. Kortteisto T, Kaila M, Komulainen J, Mäntyranta T, Rissanen P. Healthcare professionals’ intentions to use clinical guidelines: a survey using the theory of planned behaviour. Implement Sci. 2010;5(1):51. https://doi.org/10.1186/1748-5908-5-51.

    Article  PubMed  PubMed Central  Google Scholar 

  87. 87.

    Meiser-Stedman R, Smith P, Bryant R, et al. Development and validation of the Child Post-Traumatic Cognitions Inventory (CPTCI). J Child Psychol Psychiatry. 2009;50(4):432–40. https://doi.org/10.1111/j.1469-7610.2008.01995.x.

    Article  PubMed  Google Scholar 

  88. 88.

    Vrolijk-Bosschaart TF, Verlinden E, Langendam MW, et al. The diagnostic utility of the child sexual behavior inventory for sexual abuse: a systematic review. J Child Sex Abuse. 2018;27(7):729–51. https://doi.org/10.1080/10538712.2018.1477215.

    Article  Google Scholar 

  89. 89.

    van Minnen A, Hagenaars MA. Avoidance behaviour of patients with posttraumatic stress disorder. Initial development of a questionnaire, psychometric properties and treatment sensitivity. J Behav Ther Exper Pschiatr. 2010;41(3):191–8. https://doi.org/10.1016/j.jbtep.2010.01.002.

    Article  Google Scholar 

  90. 90.

    Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9(1):157. https://doi.org/10.1186/s13012-014-0157-1.

    Article  PubMed  PubMed Central  Google Scholar 

  91. 91.

    Lyon AR, Cook CR, Brown EC, et al. Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implement Sci. 2018;13(1):5. https://doi.org/10.1186/s13012-017-0705-6.

    Article  PubMed  PubMed Central  Google Scholar 

  92. 92.

    Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9(1):45. https://doi.org/10.1186/1748-5908-9-45.

    Article  PubMed  PubMed Central  Google Scholar 

  93. 93.

    Dorsey S, Pullmann MD, Deblinger E, et al. Improving practice in community-based settings: a randomized trial of supervision – study protocol. Implement Sci. 2013;8(1):89. https://doi.org/10.1186/1748-5908-8-89.

    Article  PubMed  PubMed Central  Google Scholar 

  94. 94.

    Foa EB, Asnaani A, Zang Y, Capaldi S, Yeh R. Psychometrics of the child PTSD symptom scale for DSM-5 for trauma-exposed children and adolescents. J Clin Child Adolesc Psychol. 2018;47(1):38–46. https://doi.org/10.1080/15374416.2017.1350962.

    Article  PubMed  Google Scholar 

  95. 95.

    Holt T, Cohen JA, Mannarino A. Factor structure of the parent emotional reaction questionnaire: analysis and validation. Eur J Psychotraumatol. 2015;6(1):28733. https://doi.org/10.3402/ejpt.v6.28733.

    Article  PubMed  Google Scholar 

  96. 96.

    Angold A, Costello EJ, Messer SC, Pickles A, Winder F, Silver D. The development of a short questionnaire for use in epidemiological studies of depression in children and adolescents. Int J Methods Psychiatr Res. 1995;5:237–49.

    Google Scholar 

  97. 97.

    Goodman R. Psychometric properties of the strengths and difficulties questionnaire. J Am Acad Child Adolesc Psychiatry. 2001;40(11):1337–45. https://doi.org/10.1097/00004583-200111000-00015.

    CAS  Article  Google Scholar 

  98. 98.

    Creswell JW, Clark VLP. Designing and conducting mixed methods research. Los Angeles: SAGE Publications; 2017.

  99. 99.

    Creswell JW, Klassen AC, Plano Clark VL, Smith KC. Best practices for mixed methods research in the health sciences. Bethesda (Maryland): National Institutes of Health. 2011;2013:541–5.

  100. 100.

    Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Admin Pol Ment Health. 2011;38(1):44–53. https://doi.org/10.1007/s10488-010-0314-z.

  101. 101.

    Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. https://doi.org/10.1186/1748-5908-4-50.

    Article  PubMed  PubMed Central  Google Scholar 

  102. 102.

    Rogers E. Diffusion of innovations. New York: Free Press; 2003.

  103. 103.

    MacKinnon DP, Fairchild AJ, Fritz MS. Mediation analysis. Annu Rev Psychol. 2007;58(1):593–614. https://doi.org/10.1146/annurev.psych.58.110405.085542.

    Article  PubMed  PubMed Central  Google Scholar 

  104. 104.

    Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. https://doi.org/10.1177/1049732305276687.

    Article  PubMed  Google Scholar 

  105. 105.

    Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42(4):1758–72. https://doi.org/10.1111/j.1475-6773.2006.00684.x.

    Article  PubMed  PubMed Central  Google Scholar 

  106. 106.

    Glaser B, Strauss A. The discovery of grounded theory: strategies for qualitative research; 1967.

    Google Scholar 

  107. 107.

    Strauss A, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Thousand Oaks: Sage Publications; 1990.

  108. 108.

    Hill CE, Thompson BJ, Williams EN. A guide to conducting consensual qualitative research. Couns Psychol. 1997;25(4):517–72. https://doi.org/10.1177/0011000097254001.

    Article  Google Scholar 

  109. 109.

    National Therapist Certification Program. Randomized clinical trials. Trauma-Focused Cognitive Behavioral Therapy. Published September 13, 2016. Accessed February 25, 2019. https://tfcbt.org/randomized-clinical-trials/.

  110. 110.

    Baldwin SA, Murray DM, Shadish WR, et al. Intraclass correlation associated with therapists: estimates and applications in planning psychotherapy research. Cogn Behav Ther. 2011;40(1):15–33. https://doi.org/10.1080/16506073.2010.520731.

    Article  PubMed  Google Scholar 

  111. 111.

    Fritz MS, MacKinnon DP. Required sample size to detect the mediated effect. Psychol Sci. 2007;18(3):233–9. https://doi.org/10.1111/j.1467-9280.2007.01882.x.

    Article  PubMed  PubMed Central  Google Scholar 

  112. 112.

    Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57. https://doi.org/10.1177/1077558711430690.

    Article  PubMed  Google Scholar 

  113. 113.

    Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21. https://doi.org/10.1186/s13012-015-0209-1.

    Article  PubMed  PubMed Central  Google Scholar 

  114. 114.

    Dopp AR, Parisi KE, Munson SA, Lyon AR. Integrating implementation and user-centred design strategies to enhance the impact of health services: protocol from a concept mapping study. Health Res Policy Sys. 2019;17(1):1. https://doi.org/10.1186/s12961-018-0403-0.

    Article  Google Scholar 

  115. 115.

    Lewis CC, Boyd MR, Walsh-Bailey C, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):1–25. https://doi.org/10.1186/s13012-020-00983-3.

    Article  Google Scholar 

  116. 116.

    Aarons GA, Ehrhart MG, Farahnak LR, Hurlburt MS. Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci. 2015;10(1):11. https://doi.org/10.1186/s13012-014-0192-y.

    Article  PubMed  PubMed Central  Google Scholar 

  117. 117.

    Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Ment Health Serv Res. 2005;7(4):243–59. https://doi.org/10.1007/s11020-005-7456-1.

    Article  PubMed  Google Scholar 

  118. 118.

    Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care–mental health. J Gen Intern Med. 2014;29(4):904–12. https://doi.org/10.1007/s11606-014-3027-2.

    Article  PubMed  PubMed Central  Google Scholar 

  119. 119.

    Olmstead T, Carroll KM, Canning-Ball M, Martino S. Cost and cost-effectiveness of three strategies for training clinicians in motivational interviewing. Drug Alcohol Depend. 2011;116(1):195–202. https://doi.org/10.1016/j.drugalcdep.2010.12.015.

    Article  PubMed  PubMed Central  Google Scholar 

  120. 120.

    Aas E, Iversen T, Holt T, Ormhaug SM, Jensen TK. Cost-effectiveness analysis of trauma-focused cognitive behavioral therapy: a randomized control trial among Norwegian youth. J Clin Child Adolesc Psychol. 2018:1–14. https://doi.org/10.1080/15374416.2018.1463535.

  121. 121.

    Dopp AR, Hanson RF, Saunders BE, Dismuke CE, Moreland AD. Community-based implementation of trauma-focused interventions for youth: economic impact of the learning collaborative model. Psychol Serv. 2017;14(1):57–65. https://doi.org/10.1037/ser0000131.

    Article  PubMed  PubMed Central  Google Scholar 

  122. 122.

    Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Ment Health. 2011;38(2):65–76. https://doi.org/10.1007/s10488-010-0319-7.

    Article  Google Scholar 

  123. 123.

    Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Dissemination and implementation research in health. New York: Oxford University Press; 2017.

  124. 124.

    Walker SC, Lyon AR, Aos S, Trupin EW. The consistencies and vagaries of the Washington state inventory of evidence-based practice: the definition of “evidence-based” in a policy context. Admin Pol Ment Health. 2017;44(1):42–54. https://doi.org/10.1007/s10488-015-0652-y.

    Article  Google Scholar 

  125. 125.

    Han SS, Weiss B. Sustainability of teacher implementation of school-based mental health programs. J Abnorm Child Psychol. 2005;33(6):665–79. https://doi.org/10.1007/s10802-005-7646-2.

    Article  PubMed  Google Scholar 

  126. 126.

    Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–67. https://doi.org/10.2105/AJPH.2011.300193.

    Article  PubMed  PubMed Central  Google Scholar 

  127. 127.

    Stirman SW. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:1–19.

    Article  Google Scholar 

  128. 128.

    Aarons GA, Green AE, Willging CE, et al. Mixed-method study of a conceptual model of evidence-based intervention sustainment across multiple public-sector service settings. Implement Sci. 2014;9(1):183. https://doi.org/10.1186/s13012-014-0183-z.

    Article  PubMed  PubMed Central  Google Scholar 

  129. 129.

    Aarons GA, Green AE, Trott E, et al. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Admin Pol Ment Health. 2016;43(6):991–1008. https://doi.org/10.1007/s10488-016-0751-4.

    Article  Google Scholar 

  130. 130.

    Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. https://doi.org/10.1111/j.0887-378X.2004.00325.x.

    Article  PubMed  PubMed Central  Google Scholar 

  131. 131.

    McIntosh K, Horner RH, Sugai G. Sustainability of systems-level evidence-based practices in schools: Current knowledge and future directions. In: Handbook of Positive Behavior Support. Boston: Springer; 2009. pp. 327–52.

  132. 132.

    Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):110. https://doi.org/10.1186/s13012-017-0637-1.

    Article  PubMed  PubMed Central  Google Scholar 

Acknowledgements

None.

Funding

This publication was supported by grant R01MH119148 (Lyon and Cook), awarded by the National Institute of Mental Health. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information

Affiliations

Authors

Contributions

ARL, MDP, SD, CL, LG, SB, and CRC developed the overarching scientific aims and design of the project. ML, CMC, CD, IM, RR, and MJ assisted in the development and operationalization of the study methods, worked with ARL to obtain institutional review board approval, and are supporting study recruitment, data collection, and preliminary analyses. NJ, RB, DH, and MG developed and delivered study implementation strategies in collaboration with ARL, CRC, and SD. All authors contributed to the development, drafting, or review of the manuscript. All authors approved the final manuscript.

Corresponding author

Correspondence to Aaron R. Lyon.

Ethics declarations

Ethics approval and consent to participate

This project was approved by the University of Washington Institutional Review Board (IRB).

Consent for publication

Not applicable.

Competing interests

All authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

CONSORT 2010 Checklist of Information to Include When Reporting Cluster Randomised Trial.

Additional file 2.

SPIRIT 2013 Checklist.

Additional file 3.

IRB Approval.

Additional file 4.

Study Consents.

Additional file 5.

Detailed Measures Table.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lyon, A.R., Pullmann, M.D., Dorsey, S. et al. Protocol for a hybrid type 2 cluster randomized trial of trauma-focused cognitive behavioral therapy and a pragmatic individual-level implementation strategy. Implementation Sci 16, 3 (2021). https://doi.org/10.1186/s13012-020-01064-1

Keywords

  • Individual determinants
  • Implementation strategy
  • Theory of planned behavior
  • Health action process approach
  • Education sector
  • Mental health