
Primary aim results of a clustered SMART for developing a school-level, adaptive implementation strategy to support CBT delivery at high schools in Michigan

A Correction to this article was published on 11 August 2022




Schools increasingly provide mental health services to students, but often lack access to implementation strategies to support school-based (and school professional [SP]) delivery of evidence-based practices. Given substantial heterogeneity in implementation barriers across schools, development of adaptive implementation strategies that guide which implementation strategies to provide to which schools and when may be necessary to support scale-up.


A clustered, sequential, multiple-assignment randomized trial (SMART) of high schools across Michigan was used to inform the development of a school-level adaptive implementation strategy for supporting SP-delivered cognitive behavioral therapy (CBT). All schools were first provided with implementation support informed by Replicating Effective Programs (REP) and then were randomized to add in-person Coaching or not (phase 1). After 8 weeks, schools were assessed for response based on SP-reported frequency of CBT delivered to students and/or barriers reported. Responder schools continued with phase 1 implementation strategies. Slower-responder schools (those not providing ≥3 CBT components to ≥10 students, or with >2 organizational barriers identified) were re-randomized to add Facilitation to current support or not (phase 2). The primary aim hypothesis was that SPs at schools receiving the REP + Coaching + Facilitation adaptive implementation strategy would deliver more CBT sessions than SPs at schools receiving REP alone. Secondary aims compared four implementation strategies (Coaching vs no Coaching × Facilitation vs no Facilitation) on CBT sessions delivered, including by type (group, brief individual, and full individual). Analyses used a marginal, weighted least squares approach developed for clustered SMARTs.


SPs (n = 169) at 94 high schools entered the study. N = 83 schools (88%) were slower-responders after phase 1. Contrary to the primary aim hypothesis, there was no evidence of a significant difference in CBT sessions delivered between REP + Coaching + Facilitation and REP alone (111.4 vs. 121.1 average total CBT sessions; p = 0.63). In secondary analyses, the adaptive strategy that offered REP + Facilitation resulted in the highest average CBT delivery (154.1 sessions) and the non-adaptive strategy offering REP + Coaching the lowest (94.5 sessions).


The most effective strategy in terms of average SP-reported CBT delivery is the adaptive implementation strategy that (i) begins with REP, (ii) augments with Facilitation for slower-responder schools (schools where SPs identified organizational barriers or struggled to deliver CBT), and (iii) stays the course with REP for responder schools.

Trial registration: ClinicalTrials.gov NCT03541317. Registered May 30, 2018.



Depression and anxiety disorders impact approximately 15% and 30% of school-aged youth, respectively [1], and are increasing. Several evidence-based practices (EBPs), including cognitive behavioral therapy (CBT), can improve clinical outcomes for adolescents [2,3,4,5,6], but barriers to care limit access, with less than one in five teens receiving any kind of evidence-based care [7]. Barriers include cost, stigma, and a limited number of behavioral health providers, particularly in rural communities [8,9,10,11,12,13]. More than half of mental disorders begin during school-age years [14]. Untreated, these illnesses can impair development and academic performance and lead to poor physical and mental health outcomes, including suicide, self-injury, and substance abuse [2, 15,16,17,18] at substantial social and economic cost [2, 15, 19].

Given barriers to community-based care, schools have increasingly served as de facto providers of mental health care services. Schools provide a low stigma, convenient, and sustainable setting to overcome treatment barriers. Youth spend a great deal of time in school, and most have daily access to school professionals (SPs; social workers, counselors, psychologists) who can provide mental health and substance use support at no additional cost to families, in a familiar environment [19,20,21]. Students are also more willing to access mental health services in school than community settings [11, 12]. Between 2012 and 2015, nearly 60% of students receiving mental health services reported receiving some in school, and nearly 40% reported receiving services in schools exclusively [22]. Students receiving care exclusively in schools were disproportionately lower income, underrepresented minorities, and/or on public insurance [22].

Schools, however, face their own barriers to providing effective mental health care. SPs rarely have access to training in mental health EBPs, such as CBT, or the support they need to provide EBPs sustainably [23] and have reported low confidence in their ability to deliver treatments like CBT [24,25,26]. Organizational barriers, including competing demands on SP time, lack of (or barriers to accessing) space or other school resources, and lack of support by school or district administrators [27], also impede SP ability to provide CBT or other EBPs in schools.

Implementation strategies—or theory-based techniques “used to enhance the adoption, implementation, or sustainability of an [EBP]” [28, 29]—hold potential for improving SP delivery of EBPs like CBT in schools. Replicating Effective Programs (REP), Coaching, and Facilitation are three promising school-level implementation strategies that have the potential to mitigate barriers to SP-delivered CBT. REP, detailed below, is a relatively low-burden, low-cost, readily scalable strategy that packages EBP training with on-demand technical assistance (TA) to customize the EBP to local users’ (e.g., schools’) needs [30, 31]. REP addresses fundamental barriers to school-based EBP delivery and has been shown to improve the uptake of psychosocial EBPs in community-based settings across different community organizations and health systems [31,32,33,34,35]. However, REP’s low intensity may prove inadequate for schools where SPs require substantial skills training or where organizational barriers are significant [35, 36]. As such, some schools may require REP augmentations that provide more intensive support. For skills-related barriers, a Coaching model that provides SPs with more intensive post-training support through skills modeling, practice, and feedback has shown promise for promoting EBP delivery [24, 26, 37, 38]. For organizational barriers, Facilitation—based on the integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) [39] framework—provides schools with ongoing consultation from an expert in strategic thinking and EBP implementation to garner administrative support, solve logistical challenges, and build community buy-in. In several community-based cluster-randomized trials, Facilitation has been shown to improve mental health EBP uptake [30, 34, 40,41,42,43,44,45] and to be highly cost-effective [46, 47].

Given that there is substantial heterogeneity in terms of implementation barriers at schools [48] and in how different schools might respond to different combinations of strategies, there is a need to develop and evaluate effective adaptive implementation strategies [49,50,51]. An adaptive implementation strategy is a sequence of decision rules used to guide implementers in selecting which combination of implementation strategies (e.g., REP, Coaching, Facilitation) to offer and when, including considerations of a school’s changing needs. An example of a higher-intensity adaptive implementation strategy is shown in Fig. 1.

Fig. 1

Example of a higher-intensity adaptive implementation strategy

However, there is currently no research that evaluates the effectiveness of this type of adaptive implementation strategy for improving CBT delivery in schools. Perhaps more fundamentally, there is no research to inform (i) the effectiveness of starting with REP vs. REP + Coaching, (ii) the effectiveness of augmenting with Facilitation vs. not among schools that are slower to respond to REP or REP + Coaching, or (iii) whether additional school factors ought to be taken into consideration when making decisions, e.g., to start with REP vs. REP + Coaching, or to augment with Facilitation vs. not among slower-responder schools.

The current study (R01MH114203)—The Adaptive School-based Implementation of CBT (ASIC) Study [51]—used a clustered, sequential, multiple-assignment randomized trial (SMART) [52] to inform the development of a school-level adaptive implementation strategy for adopting and scaling up SP delivery of CBT. ASIC was done in partnership with TRAILS (Transforming Research into Action to Improve the Lives of Students) [26], a program that aims to implement CBT in high schools across the state of Michigan.

This manuscript reports the results of ASIC’s primary research aim, which was to evaluate the effectiveness of the adaptive implementation strategy shown in Fig. 1 versus providing REP alone (not adaptive, no Coaching, no Facilitation). The primary outcome is the total number of SP-reported CBT sessions delivered to students by SPs over the 18-month study period. Secondary outcomes are the number of CBT sessions delivered by type of session: group vs. individual brief vs. individual full. We hypothesized that, compared to REP alone, the adaptive implementation strategy in Fig. 1 would lead to a higher total number of CBT sessions delivered, on average, over 18 months.

We also present outcomes for two other implementation strategies embedded within ASIC: REP + Coaching from the start for all schools (not adaptive, no Facilitation), and REP that is augmented with Facilitation for slower-responder schools and continued REP for responder schools (REP + Facilitation; adaptive, no Coaching).


Participants and eligibility

SPs at eligible Michigan high schools were recruited for study participation.

Schools were eligible to participate in ASIC if they:

  1) Served high school students (grades 9–12) from a Michigan school district and had not previously participated in a school-based CBT implementation initiative

  2) Were within a 2-h driving distance of a mental health professional who was trained by TRAILS and able to serve as one of the Coaches for the implementation study

  3) Had at least one eligible SP who agreed to participate in study assessments for the duration of the study

  4) Had minimally sufficient resources, including space to deliver CBT, to allow for delivery of individual and/or group mental health support services on school grounds but outside of the general classroom environment

Eligible SPs:

  1. Were employed at an eligible Michigan high school

  2. Had a background in clinical school social work, counseling, psychology, or a similar field, and were able to meet with students regularly to deliver mental health support services outside of the general classroom environment

  3. Were able to read and understand English, comprehend study assessments, and give informed consent

  4. Had completed a 1-day didactic training in CBT

Recruitment of schools was done by first contacting SPs at schools and then contacting school administrators. Specifically, once SPs confirmed interest and eligibility, a principal or other senior school administrator was asked to provide data on building-wide socio-demographics and leadership priorities regarding EBPs. While SPs may sometimes work in multiple schools, in this study, all SPs were associated with only one ASIC-enrolled school.

Evidence-based program to be implemented: cognitive behavioral therapy (CBT)

This study focused on encouraging SP delivery of CBT for students with depression and/or anxiety. Modular CBT, wherein individual CBT components can be delivered flexibly and responsively depending on student needs, was selected given its strong evidence base [53, 54]. Modular CBT has been found to reduce symptoms of depression and anxiety relative to usual care [54,55,56,57,58], including with school-age youth [57, 59] and across different racial or ethnic groups [9, 56]. Furthermore, CBT has demonstrated effectiveness when delivered within school settings [60,61,62]. CBT components included psychoeducation, relaxation, mindfulness, cognitive restructuring, behavioral activation, and exposure and were defined based on established, evidence-based intervention protocols [63, 64] and an established “distillation” model [65].

Implementer: the TRAILS program

TRAILS (not research staff) coordinated and delivered all implementation strategies. Specifically, TRAILS delivered the in-person, didactic CBT trainings and REP TA; recruited and trained all Coaches [26, 66]; recruited and trained the Facilitator; delivered phase 1 Coaching; monitored schools for improvement (e.g., determined responder status at the end of phase 1); and delivered phase 2 strategies (including Facilitation).

Clustered, sequentially randomized trial design

ASIC used a clustered SMART [52] with four phases (9–13 weeks each) spanning two school years (Fig. 2). Full ASIC study details—including rationale, stratified randomizations, pre-specified primary outcome, and sample size calculations—are available elsewhere [51].

Fig. 2

Full ASIC trial design

Run-in phase (pre-randomization)

The pre-randomization, run-in phase began in October 2018. All schools were offered REP for up to 3 months. SPs were provided with information on registering for the study data collection dashboard (see below). Two weeks prior to the first randomization (mid-January 2019), as part of REP, TRAILS offered SPs a 1-day didactic training in CBT.

Phase 1

Approximately 2 weeks following training (late January 2019), schools were randomized with equal probability to either augment REP with Coaching or not, marking the beginning of phase 1.

Phases 2a and 2b

In early April 2019, 8 weeks after the first randomization, schools identified by TRAILS as “slower responders” (defined below) were re-randomized with equal probability to augment with Facilitation or not. Phase 2 spanned two semesters, separated by the summer break. We label these phases 2a (remainder of Spring 2019 semester) and 2b (Fall 2019 semester).

Phase 3

At the end of the Fall 2019 semester (December 2019), Coaching and Facilitation were discontinued and all schools completed the study with access only to REP supports. Data collection continued through mid-April 2020.

Implementation strategies

Replicating Effective Programs

All schools were offered REP [32]. REP, based on Rogers’ diffusion model [67] and social learning theory [68], is a low-intensity strategy designed to enhance EBP uptake through development of customized intervention materials appropriate for the specific implementation setting, didactic training, and provision of low-level TA. Prior to ASIC, TRAILS developed customized CBT materials designed to address common symptoms of depression and anxiety, tailored for school-based delivery by SPs. All materials were made available to SPs through the TRAILS website. Materials included standardized screening tools SPs could use to identify students appropriate for CBT, an overview of CBT components, session agendas for providing CBT to individual students or groups, talking points for teaching students about CBT, and CBT handouts, worksheets, and resources for use with students. TA to support SP delivery was provided by a PhD-level clinical psychologist board-certified in CBT and included a monthly newsletter with information on TRAILS resources, a monthly opt-in TA call, and contact information for as-needed support via phone or email.

REP also included a daylong, in-person didactic training. TRAILS staff clinicians (PhD- and LMSW-level practitioners) offered the training at several locations across Michigan. The training covered screening and identification of students, CBT core components, and theoretical underpinnings. Training strategies included didactic instruction, videos and live demonstrations, role-plays with feedback, and facilitated small group discussion.

Responder vs. slower-responder schools

REP included a monitoring strategy whereby, at the end of phase 1, TRAILS identified whether schools were responders or slower-responders via a short assessment administered to all SPs (Supplementary Appendix A). School-level responder/slower-responder status was determined for the explicit purpose of making the decision to offer Facilitation or not. (Recall that Facilitation is a school-level strategy intended to impact school-level processes and barriers.) Schools were categorized as “slower-responders” if any SPs reported not providing ≥3 CBT components to ≥10 students OR if SPs reported, on average, >2 barriers to CBT delivery. Slower-responder schools thus included (i) schools where any SPs struggled to deliver CBT and (ii) schools where SPs were delivering CBT but endorsed barriers potentially precluding long-term delivery or sustainability. Finally, (iii) schools where any SPs failed to complete the monitoring assessment were also considered slower-responders.
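The school-level decision rule can be sketched as a small function. The data structures below are a hypothetical encoding of the SP monitoring reports; the actual assessment instrument is described in Supplementary Appendix A.

```python
def is_slower_responder(sp_reports):
    """Classify a school from its SPs' phase 1 monitoring reports.

    Each report is a dict with (hypothetical field names):
      completed: bool                  # SP completed the monitoring assessment
      components_to_10_students: bool  # delivered >=3 CBT components to >=10 students
      n_barriers: int                  # barriers to CBT delivery endorsed

    A school is a slower-responder if ANY SP did not complete the assessment,
    ANY SP failed the delivery threshold, or SPs reported on average >2 barriers.
    """
    if any(not r["completed"] for r in sp_reports):
        return True
    if any(not r["components_to_10_students"] for r in sp_reports):
        return True
    mean_barriers = sum(r["n_barriers"] for r in sp_reports) / len(sp_reports)
    return mean_barriers > 2


# Example: both SPs meet the delivery threshold, but barriers average 3 > 2
school = [
    {"completed": True, "components_to_10_students": True, "n_barriers": 4},
    {"completed": True, "components_to_10_students": True, "n_barriers": 2},
]
print(is_slower_responder(school))  # mean barriers = 3 > 2 -> True
```

Note that the rule is deliberately asymmetric: a single struggling or non-reporting SP is enough to flag the whole school, while barriers are averaged across SPs.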

The responder/slower-responder definition was based on pilot data with non-ASIC SPs that found that SPs tended to report no barriers or 3+ barriers, and those reporting barriers had poor prognosis for implementation without access to a strategy (like Facilitation) designed to address these barriers. From this pilot data, schools identified as “responders” were not thought to be in need of Facilitation; schools identified as “slower-responders” were thought to potentially benefit from Facilitation.


Coaching

The Coaching implementation strategy, used by TRAILS in more than 20 Michigan high schools prior to ASIC [26], was derived from the school-based Positive Behavior Interventions and Supports (PBIS) model of coaching for individual development [69]. A comprehensive, operationalized coaching protocol guided TRAILS Coaches in supporting SP learning and CBT delivery. In addition, Coaches were expected to attend ~12 SP-delivered CBT student group sessions, during which they would observe SP CBT delivery, provide feedback [70, 71], and, as appropriate, model the use of CBT components to improve SP competence [69, 72,73,74,75,76]. Each school was assigned a Coach with whom they were to arrange weekly Coaching visits for a minimum of one semester. TRAILS then administered a short, objective CBT competency assessment to all SPs; Coaches completed standardized ratings of their assigned SPs. Schools were provided with either a second semester of in-person Coaching (when SPs showed gaps in competency) or a stepped-down version (when SPs demonstrated sufficient competency). Coaches were required to complete weekly logs documenting interaction with their assigned SPs and utilization of specific coaching techniques.

Coaches were recruited and trained by TRAILS. Coaches were typically licensed community mental health (CMH) providers (e.g., Licensed Clinical Social Workers) serving in child- or family-treatment roles and were recruited through professional networking or contacts made to CMH clinical directors. To be eligible to serve as a Coach for ASIC, Coaches had to complete an initial didactic training in CBT and mindfulness, 15 weeks of one-to-one consultation with a TRAILS staff clinician, and a second didactic training focused on the TRAILS Coaching Protocol [66].


Facilitation

Facilitation is based on the i-PARIHS framework [77] and was designed to improve CBT delivery by improving SP self-efficacy [78] through mitigation of organizational (i.e., school-level) barriers. All SPs at schools assigned to Facilitation had the opportunity to engage in regular phone calls with the Facilitator for up to 24 weeks (the duration of phase 2). In line with prior studies [34, 40, 41, 79], the Facilitator addressed local barriers to CBT delivery by supporting SPs in the development of strategic thinking and leadership skills through a five-step process: helping SPs set measurable goals, aligning SP strengths and CBT delivery with existing school and administrator values and priorities, providing guidance on overcoming local barriers to CBT delivery, engaging administrators and other key stakeholders, and communicating and marketing the added value of CBT delivery (Table 1) [34, 41, 77, 80, 81]. To encourage positive synergy at schools that had previously been offered Coaching, the Facilitator could encourage SPs to discuss CBT skill-development issues and/or strategies for improving communication with their Coach.

Table 1 Five-step Facilitation process

Facilitation was provided by a PhD-level clinical psychologist with expertise in CBT delivery, strategic thinking, and school-based mental health delivery. The Facilitator received training in Facilitation through the Quality Enhancement Research Initiative (QUERI) for Team-based Behavioral Health [82].

Primary and secondary aims and hypotheses

The study includes four embedded implementation strategies (Table 2), two of which were adaptive based on response to phase 1 strategies [49,50,51]. The primary aim was to test whether the adaptive implementation strategy in Fig. 1 (REP + Coaching + Facilitation) and the least intensive strategy (REP alone) differ in average total CBT delivery (shaded rows in Table 2). We hypothesized that, on average, REP + Coaching + Facilitation would lead to greater CBT delivery than REP alone.

Table 2 Four embedded implementation strategies

As a secondary aim, we present results for all other pairwise comparisons for primary outcome and secondary outcomes. For all outcomes, we hypothesized that the four strategies would be ordered as follows, from greater to lesser amount of average CBT delivery:

  • REP + Coaching + Facilitation > REP + Coaching = REP + Facilitation > REP Alone.

Research measures

Quarterly surveys

SPs completed baseline and quarterly research surveys that included demographics, professional qualifications and duties, prior training in and exposure to CBT, CBT knowledge and comfort with delivery, and barriers to delivery. SPs received $10 for each survey completed. SP demographics are reported in Supplementary Appendix B.

Outcomes: CBT delivery

The primary outcome was the total number of self-reported CBT sessions delivered by SPs (hereafter: CBT delivery). SPs were asked to self-report their CBT delivery weekly through a secure dashboard used explicitly for research purposes (Fig. 3). Each weekly report included the number of group sessions and full (≥15 min) and brief (<15 min) individual sessions delivered. To minimize burden, SPs were also provided with physical weekly tracking notepads and could enter dashboard data for up to 4 weeks retrospectively. SPs received $3 for each weekly report provided and were encouraged to report even if/when they delivered no CBT.

Fig. 3

Individual and group CBT reporting on the ASIC dashboard

The primary outcome was the total number of CBT sessions delivered (group + brief individual + full individual), and the three secondary measures were total CBT delivery by type: group, full, and brief. Weekly CBT delivery data collection took place during phases 1 through 3, except during summer break or known school holidays (e.g., winter holidays).
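A sketch of how weekly dashboard reports roll up into the total and by-type outcomes. The records below are hypothetical, not study data:

```python
# Hypothetical weekly dashboard records: (sp_id, week, group, full, brief)
reports = [
    ("sp1", 1, 2, 1, 3),
    ("sp1", 2, 1, 0, 2),
    ("sp2", 1, 0, 4, 1),
]

by_type = {"group": 0, "full": 0, "brief": 0}
total = 0
for _, _, group, full, brief in reports:
    by_type["group"] += group
    by_type["full"] += full
    by_type["brief"] += brief
    total += group + full + brief  # primary outcome sums all three session types

print(total)    # 14
print(by_type)  # {'group': 3, 'full': 5, 'brief': 6}
```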

Impact of COVID-19 on research data collection

Data collection was planned through mid-April 2020 (60 weeks total). However, due to the COVID-19 pandemic, Michigan closed schools statewide starting March 16, 2020 (week 56 of study data collection) [83, 84]. Thus, analyses included data through week 55.


To analyze the data, we used a marginal, weighted least squares approach developed to ensure unbiased estimation of comparisons between the implementation strategies embedded in a clustered SMART. The method generalizes the approach described in [52] to accommodate a repeated measures outcome (total CBT delivery by phase).

Study sample

In accordance with intent-to-treat principles, all n = 169 SPs at all N = 94 schools randomized in phase 1 were included in all analyses.

Modeling and estimation

The same modeling and estimation strategy was used for the primary outcome (average total CBT delivery) and for each of the three secondary outcomes (average total CBT delivery by type).

At each phase (1, 2a, 2b, and 3), a separate regression model was fit for the CBT delivery outcome, as follows: the phase 1 regression included an intercept and a contrast-coded (+1/−1) indicator for phase 1 strategy. The phase 2a, 2b, and 3 regression models included an intercept, a contrast-coded indicator for phase 1 strategy, a contrast-coded indicator for phase 2 strategy, and the interaction between phase 1 and phase 2 strategies. All models included the following pre-specified, school-level baseline covariates: school size (>500 or ≤500 students), location of school (rural or urban), percentage of students on free/reduced lunch program (≥50% or <50%), and pre-randomization CBT delivery (any vs. none), as well as school-aggregated SP education and job tenure.
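As a sketch of the contrast coding, assuming a minimal design that omits the baseline covariates listed above, the four embedded strategies map onto design rows as follows:

```python
import numpy as np

def design_row(coaching, facilitation):
    """Contrast-coded (+1/-1) design row for the phase 2a/2b/3 models:
    intercept, phase 1 strategy, phase 2 strategy, and their interaction.
    Baseline covariates from the paper are omitted for brevity."""
    a1 = 1 if coaching else -1       # phase 1: Coaching vs. no Coaching
    a2 = 1 if facilitation else -1   # phase 2: Facilitation vs. no Facilitation
    return np.array([1, a1, a2, a1 * a2])

# The four embedded strategies correspond to the four (a1, a2) combinations
X = np.array([design_row(c, f) for c in (False, True) for f in (False, True)])
print(X)
```

With this coding, the intercept is the grand mean across the four strategies, and each coefficient is half the corresponding main-effect or interaction contrast.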

Phase 1 outcomes cannot be impacted by strategies offered in the future (i.e., in phase 2), whereas phase 2a, 2b, and 3 outcomes can be impacted by the sequence of strategies offered in phases 1 and 2; the fitted regression models reflect this feature of the SMART [85].

Standard least squares was used to estimate the phase 1 regression model. Weighted and replicated generalized estimating equations were used to estimate the regression models in phases 2a, 2b, and 3 [52]. As slower-responder schools were randomized twice with probability 1/2, they had a 1/4 chance of following their assigned sequence of strategies, whereas responder schools, randomized once with probability 1/2, had a 1/2 chance of following their assigned strategy sequence. Weighting is used to account for this known under-representation of slower-responder schools. Specifically, data for SPs in slower-responder schools were assigned a weight of 4, whereas data for SPs in responder schools were assigned a weight of 2. In addition, since each group of responder schools is consistent with two strategies (i.e., schools in cell A are consistent with REP alone and REP + Facilitation, and schools in cell D with REP + Coaching and REP + Coaching + Facilitation; see Table 2), the data for these schools were used twice (i.e., replicated) to facilitate a more efficient comparison of the four strategies. For details, see [52].
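The weighting-and-replication scheme can be sketched as follows. The data structures and the `weight_and_replicate` helper are illustrative, not the authors' code:

```python
def weight_and_replicate(schools):
    """Assign SMART weights and replicate responder schools.

    Slower-responders were randomized twice (prob. 1/2 each), so they had a
    1/4 chance of following their observed strategy sequence -> weight 4.
    Responders were randomized once -> weight 2; each responder school is
    consistent with two embedded strategies, so its data appears twice.
    """
    rows = []
    for s in schools:
        if s["responder"]:
            # replicate: consistent with both phase 2 options under its phase 1 arm
            for f in (False, True):
                rows.append({**s, "facilitation": f, "weight": 2})
        else:
            rows.append({**s, "weight": 4})
    return rows

rows = weight_and_replicate([
    {"id": "A", "responder": True, "coaching": False, "facilitation": None},
    {"id": "B", "responder": False, "coaching": True, "facilitation": True},
])
print(len(rows))  # 3: responder A is replicated twice, slower-responder B appears once
```

The weights make each school's contribution proportional to the inverse probability of following its observed sequence, so each embedded strategy is estimated as if all consistent schools had been assigned to it.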

All models used bootstrapped standard errors (based on 1000 samples) to account for (i) clustering of SPs within schools, (ii) multiple observations per SP, (iii) sampling variation in the unknown distribution of the weights, and (iv) replication.
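A minimal sketch of a cluster bootstrap of this kind, resampling whole schools so that within-school correlation among SPs is preserved. This is illustrative only; the authors' procedure additionally accounts for the weighting and replication described above:

```python
import random

def cluster_bootstrap_se(clusters, statistic, n_boot=1000, seed=0):
    """Bootstrap standard error that resamples whole clusters (schools)
    with replacement. `clusters` is a list of lists (one list of SP
    outcomes per school); `statistic` maps a list of clusters to a number."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        resample = [rng.choice(clusters) for _ in clusters]
        stats.append(statistic(resample))
    mean = sum(stats) / n_boot
    var = sum((s - mean) ** 2 for s in stats) / (n_boot - 1)
    return var ** 0.5

def grand_mean(cls):
    vals = [v for c in cls for v in c]
    return sum(vals) / len(vals)

# Hypothetical CBT session totals for SPs at four schools
schools = [[100, 120], [80], [150, 140, 130], [90]]
print(cluster_bootstrap_se(schools, grand_mean))
```

Resampling at the school level, rather than the SP level, is what makes the standard error robust to clustering.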

The fitted regression models were used to calculate estimates of the average CBT delivery at each phase, under each of the four strategies. To facilitate the comparison of the strategies using a single-number summary, for each of the four strategies, the phase-specific averages were summed to calculate “average total CBT delivery.” As phases varied slightly in length, in secondary analyses, we also computed average weekly delivery during each phase by dividing average delivery during each phase by the number of phase-weeks (results are provided in Supplementary Appendix C).
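For illustration, with hypothetical phase means and phase lengths (the study's phases ran 9–13 weeks), the two summaries are computed as:

```python
# Hypothetical phase-specific average CBT delivery and phase lengths in weeks
phase_means = {"1": 20.0, "2a": 35.0, "2b": 40.0, "3": 25.0}
phase_weeks = {"1": 8, "2a": 10, "2b": 13, "3": 12}

# Single-number summary: sum of phase-specific averages
total = sum(phase_means.values())

# Secondary summary: average weekly delivery within each phase
weekly = {p: phase_means[p] / phase_weeks[p] for p in phase_means}

print(total)  # 120.0
print(round(weekly["2b"], 2))  # 3.08
```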

Primary aim comparison

ASIC’s primary aim was to test the null hypothesis that there is no difference in average total CBT delivery (primary outcome) between the least intensive strategy—REP alone—and the adaptive implementation strategy in Fig. 1—REP + Coaching + Facilitation. A Wald test, calculated as the pairwise comparison divided by its estimated standard error, was used to test this null hypothesis.
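A sketch of the Wald test, using the reported primary-aim estimate and a standard error back-calculated from the reported 95% CI (an approximation, since the published SE is not restated here):

```python
from math import erf, sqrt

def wald_test(estimate, se):
    """Two-sided Wald test: z = estimate / SE, p from the standard normal CDF."""
    z = estimate / se
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Primary aim: estimate 9.69, 95% CI (-30.03, 49.40)
se_approx = (49.40 - (-30.03)) / 2 / 1.96  # CI half-width / 1.96
z, p = wald_test(9.69, se_approx)
print(round(p, 2))  # 0.63, consistent with the reported p-value
```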

Secondary aim comparisons

For each outcome, all pair-wise comparisons (and associated 95% confidence intervals) of the average total CBT sessions delivered were estimated to better understand how the four strategies compared to each other.

Effect sizes

To enhance clinical interpretation, effect sizes [86] were calculated for each pairwise difference. Effect sizes were calculated as the pairwise comparison divided by an estimate of the standard deviation of the average total CBT delivery. Effect sizes of 0.2, 0.5, and 0.8 were regarded as small, moderate, and large, respectively [86].
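As a sketch, the reported primary-aim difference and effect size imply a standard deviation estimate of roughly 9.69 / 0.08 ≈ 121 sessions; this SD value is a back-calculation for illustration, not a reported quantity:

```python
def effect_size(pairwise_diff, sd_total):
    """Standardized effect size: pairwise difference in average total CBT
    delivery divided by an estimate of its standard deviation.
    |d| near 0.2 / 0.5 / 0.8 is read as small / moderate / large."""
    return pairwise_diff / sd_total

print(round(effect_size(9.69, 121.0), 2))  # 0.08
```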

Missing data

Multiple imputation was used to replace missing values in the outcomes and other measures [87]. Forty data sets were generated. All estimates, standard errors, and hypothesis tests reported below were calculated using standard rules [88, 89] for combining the results of identical analyses performed on each of the 40 imputed data sets. All regression models were fit with and without multiply-imputed data and results did not change substantively (details in Supplementary Appendix D).
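The standard combining rules (Rubin's rules) can be sketched as follows, with illustrative numbers rather than study estimates:

```python
def pool_rubin(estimates, variances):
    """Combine estimates across M imputed data sets using Rubin's rules:
    pooled estimate = mean of the M estimates;
    total variance = within-imputation + (1 + 1/M) * between-imputation."""
    m = len(estimates)
    qbar = sum(estimates) / m
    ubar = sum(variances) / m                               # within-imputation
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-imputation
    t = ubar + (1 + 1 / m) * b
    return qbar, t ** 0.5  # pooled estimate and pooled standard error

est, se = pool_rubin([10.0, 12.0, 11.0], [4.0, 4.0, 4.0])
print(est)  # 11.0
```

The between-imputation term inflates the standard error to reflect uncertainty due to the missing data itself.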


Participants and baseline data

Michigan schools (N = 312) were approached for participation, with N = 115 (n = 227 SPs) agreeing to participate. One hundred sixty-nine SPs at N = 94 schools completed training and were randomized. The most common SP roles (n = 169) were school counselor (59%) and social worker (23%); other roles (18%) included school psychologist, behavioral intervention specialist, and special education teacher. SPs had been in their roles for an average of 8 years (standard deviation (SD) = 7.7) and 153 (90.5%) reported some graduate education. Twenty-one percent (n = 35) served exclusively or primarily students in special education; the remainder served students in general education or both. Ninety-two percent (n = 156) reported seeing students for individual counseling and 58 (34%) reported convening student groups. Fifty-seven SPs (34%) reported prior formal training in CBT (e.g., lectures in a graduate course) and 77 (46%) informal training (e.g., brief presentation, self-directed readings) (Supplementary Appendix B). Sixty-one percent (n = 104) were at schools where at least one SP reported delivering CBT during the pre-training phase.

Figure 4 shows the N = 94 ASIC schools within Michigan. Fifty-six percent were rural; average school size was 869 students (SD = 600) with 44% (SD = 18) qualifying for free/reduced lunch. Most schools had either 1 (N = 38; 40%) or 2 (N = 37; 39%) participating SPs.

Fig. 4

Map of Michigan High Schools enrolled in ASIC. Note: N = 94 schools participated. School location on the map was determined by the school address listed on the school’s website

Strategies assigned and received

In phase 1, schools were randomized to Coaching (n = 88 SPs in N = 47 schools) vs. no Coaching (n = 81 SPs in N = 47 schools; Table 3). N = 33 schools (70%) assigned to Coaching were documented as ever engaging in Coaching. At the end of phase 1, 83 schools (88%, 154 SPs) were deemed slower-responders and re-randomized. N = 41 slower-responder schools (n = 74 SPs) were re-randomized to augment with Facilitation (Table 4; Fig. 5), and all N = 41 (100%) were documented as ever engaging in Facilitation. TA was provided to all SPs under REP; however, engagement was minimal, with few SPs attending the monthly opt-in calls and only 270 total minutes of on-demand TA support documented across the entire study period.

Table 3 School-level characteristics by phase 1 randomization (Coaching vs. no Coaching) (N = 94 schools)
Table 4 School-level characteristics for phase 2 randomization (Facilitation vs. no Facilitation) for slower-responder schools (N = 83 schools)

At least one SP from all 94 schools remained in the study through completion; however, one SP dropped out of the study during phase 2b (Fig. 5). SPs completed 4720 of 7267 possible weekly CBT reports (65%) and a median of 32 weeks (interquartile range: 18–40).

Fig. 5

CONSORT diagram for the ASIC study — schools and school professionals

Primary outcome: average total CBT delivery

CBT delivery increased across all groups: 154 of 169 SPs (91%) reported delivering CBT at least once during the 43-week post-randomization period, and 20,517 CBT sessions were reported by SPs during this period. Estimated average total CBT delivery by strategy ranged from 94.53 (REP + Coaching) to 154.06 (REP + Facilitation) sessions (Table 5).

Table 5 Average CBT delivery (primary outcome), by phase and total

Pairwise comparisons for average total CBT delivery are shown in Table 6. For ASIC’s primary aim, there is insufficient evidence to reject the null hypothesis that there is no difference in average total CBT delivery between REP alone and the REP + Coaching + Facilitation adaptive implementation strategy (estimate = 9.69; 95% CI: −30.03, 49.40; p-value = 0.63). Consistent with this finding, the estimated effect is very small (effect size = 0.08).

Table 6 Pairwise comparisons for total CBT delivery (primary outcome)
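The effect sizes reported here standardize a mean difference by the outcome's standard deviation (Cohen's d style). As a hedged illustration using the primary aim numbers: a 9.69-session difference paired with an effect size of 0.08 implies an outcome SD of roughly 121 sessions; the SD is back-calculated for illustration only and is not a quantity reported by the study.

```python
def effect_size(mean_diff: float, sd: float) -> float:
    """Standardized (Cohen's d-style) effect size: difference in average
    total CBT sessions divided by the outcome standard deviation."""
    return mean_diff / sd

# Primary aim illustration (estimate = 9.69 sessions, effect size = 0.08).
# The implied SD below is back-calculated, not reported in the article.
implied_sd = 9.69 / 0.08          # ~121 sessions
print(round(effect_size(9.69, implied_sd), 2))  # 0.08
```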

Estimated effects were larger for ASIC’s secondary aim comparisons. SPs at schools assigned to REP + Facilitation delivered an estimated average of 59.53 more CBT sessions than SPs at schools assigned to REP + Coaching (95% CI: 11.53, 107.53; effect size = 0.49) and 42.66 more sessions than REP + Coaching + Facilitation (95% CI: −10.13, 95.45; effect size = 0.35). SPs at schools assigned to REP delivered an estimated 32.97 fewer sessions than SPs at schools assigned to REP + Facilitation (95% CI: −88.16, 22.22; effect size = −0.27) but 26.56 more sessions than SPs at schools assigned to REP + Coaching (95% CI: −10.75, 63.87; effect size = 0.22). Using an absolute effect size of 0.1 or less to indicate no clinically significant difference between strategies, Table 7 shows the hypothesized versus estimated ordering of the implementation strategies.

Table 7 Hypothesized vs. estimated order for implementation strategies
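The marginal, weighted least squares approach for clustered SMARTs [52] that underlies these comparisons uses inverse-probability weights: schools randomized once (responders, randomization probability 1/2) are weighted by 2, and schools randomized twice (slower-responders, probability 1/4) by 4, with responder schools contributing to both phase 2 options of their phase 1 arm. The following minimal sketch uses synthetic school-level records (none of the names or numbers are study data) to compute a weighted mean outcome for one embedded adaptive strategy.

```python
# Synthetic school-level records (illustrative only):
# (phase1_coaching, responder, phase2_facilitation, mean_cbt_sessions)
schools = [
    (False, True,  None,  120.0),  # responder: never re-randomized
    (False, False, True,  160.0),  # slower-responder, assigned Facilitation
    (False, False, False,  90.0),  # slower-responder, no Facilitation
]

def strategy_mean(phase1_coaching: bool, phase2_facilitation: bool) -> float:
    """Inverse-probability-weighted mean outcome for one embedded adaptive
    strategy. Responders (weight 2) are consistent with either phase 2
    option of their phase 1 arm; re-randomized schools (weight 4) count
    only under their assigned phase 2 option."""
    num = den = 0.0
    for p1, responded, p2, y in schools:
        if p1 != phase1_coaching:
            continue
        if responded:
            w = 2.0                      # randomized once (prob 1/2)
        elif p2 == phase2_facilitation:
            w = 4.0                      # randomized twice (prob 1/4)
        else:
            continue
        num += w * y
        den += w
    return num / den

# REP (no Coaching) with Facilitation added for slower-responders:
print(round(strategy_mean(False, True), 2))   # 146.67
```

The full estimator in [52] additionally accounts for within-school clustering of SPs; this sketch shows only the weighting-and-replication logic.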

Secondary outcomes: average total CBT delivery by type

Estimated average total CBT delivery by type ranged from 16.81 group sessions (REP alone) to 81.60 individual brief sessions (REP + Facilitation) (Table 8). Across all strategies, individual brief sessions accounted for the greatest share of average total CBT delivery (≥ 43%) and group sessions the smallest (≤ 19%). Pairwise comparisons for average total CBT delivery by type (Table 9) and the estimated ordering of strategies by type (Table 10) are also shown.

Table 8 Average CBT delivery, by type (secondary outcomes)
Table 9 Pairwise comparisons for average CBT delivery, by type (secondary outcomes)
Table 10 Estimated ordering for implementation strategies for CBT delivery, by type


Discussion

This study compared four different implementation strategies on the total number of SP-reported CBT sessions delivered. The REP and REP + Coaching strategies were non-adaptive, offering the same types of support to all schools across all phases; the REP + Facilitation and REP + Coaching + Facilitation strategies were adaptive, augmenting with Facilitation in phase 2 for slower-responder schools. With respect to our primary aim hypothesis, we found no evidence of differences in average total CBT delivery between REP and REP + Coaching + Facilitation. In secondary aim comparisons across the four strategies, the adaptive REP + Facilitation strategy resulted in the highest average CBT delivery (154.1 sessions per SP) and the non-adaptive REP + Coaching strategy the lowest (94.5 sessions per SP). Examining CBT delivery by type, however, most differences across strategies reflected higher vs. lower reports of brief (≤ 15 min) individual CBT sessions.

Harnessing schools and school professionals for improving adolescent mental health

This manuscript adds to the growing literature supporting the potential for SPs to help fill gaps in adolescent access to mental health care by offering mental health EBPs, like CBT, in schools. Indeed, all strategies, including the low-intensity REP alone strategy, showed high levels of CBT uptake: more than 90% of participating SPs (i.e., those who completed training) reported delivering CBT at least once, relative to CBT adoption rates of 35–40% elsewhere [90]. Combined, SPs delivered more than 20,000 CBT sessions over 43 weeks. Furthermore, as our secondary trend analyses show (Supplementary Appendix C), CBT delivery under all strategies remained consistent or increased across all study phases, including the maintenance phase, which followed discontinuation of Coaching and/or Facilitation.

This study also provides support for the feasibility of organizations like TRAILS offering adaptive implementation strategies in schools. As noted, TRAILS was exclusively responsible for monitoring response status after phase 1 and, as applicable, adapting implementation support in phase 2.

Explaining differences across implementation strategies

Although prior research has drawn attention to potential shortcomings of offering intensive implementation support, especially in lower-resourced settings [40, 91], our team was nonetheless surprised not to find support for our hypothesis that the most intensive strategy (REP + Coaching + Facilitation) would outperform the least intensive strategy (REP) on our primary outcome of average total CBT delivery. Also surprising was that the REP + Facilitation strategy, which adapted phase 2 support based on response to phase 1, differed markedly from the non-adaptive REP + Coaching strategy for three of the four outcomes. Note, however, that these analyses focus on SP-reported CBT delivery; future analyses will examine other outcomes, including CBT fidelity and change in student mental health. Nonetheless, these initial findings suggest several potential explanations.

While future analyses will examine mechanisms more systematically, we postulate that much of the benefit of REP + Facilitation stems from Facilitation being provided only to schools identified as slower responders, i.e., in the context of a recognized need. The salience of the targeted barriers and/or the perceived appropriateness of the Coaching and Facilitation strategies may also have differed. SPs may have had concerns about Coach attendance at student CBT sessions risking student privacy or confidence, or may have felt they did not need further CBT skill development. Facilitation, which has proven effective in other implementation trials [36, 40, 92, 93], provides support that is highly tailored to specific school needs [80, 94, 95] by “bundling” other discrete implementation strategies [80, 94, 96, 97]; it generally addressed barriers that were more external and jointly identified with the SP, and thus did not raise SP concerns about student privacy or perceptions of help-seeking. Prior TRAILS Coaching studies have not reported such concerns [26]; however, these studies largely recruited SPs motivated to receive Coaching. In contrast, ASIC recruited a more heterogeneous sample of SPs interested in receiving support for implementing CBT, but not necessarily via Coaching.

Lower SP engagement also suggests that Coaching may have imposed greater real or perceived burden on SPs relative to Facilitation. Coaching is typically offered in the context of CBT groups, which requires SPs to identify and coordinate student CBT groups and align group delivery with Coaching visits. As both SPs and Coaches were balancing many time commitments, this coordination may have lessened SP engagement with Coaching. Facilitation also involves some scheduling burden, but it was conducted primarily over the phone and did not depend on coordinating student CBT groups. Furthermore, Facilitation’s scheduling burden may have been more acceptable to SPs given their awareness that Facilitation was offered based on a recognized need for further support.

Facilitation was also provided by a single Facilitator, ensuring strategy consistency and fidelity, while Coaching was provided by 42 existing providers across Michigan who were employed by other local agencies near the schools they served. While Coaches were required to complete Coach training through TRAILS [66], there was likely variability in the quality of Coaching provided, as well as in Coach commitment to their schools given other responsibilities.


Limitations

First, our results rely on SP self-reported CBT delivery. Self-report of implementation outcomes like adoption, reach, and even fidelity is common in implementation studies [98], especially in lower-resourced settings [40, 41], including schools [99, 100]. As clinicians (e.g., social workers, counselors, psychologists), SPs were also accustomed to documenting mental health services, and the process for documenting CBT delivery was identical across arms. It is possible, however, that SPs assigned to Coaching reported fewer CBT sessions if Coach feedback changed their perceptions of what counted as CBT, given Coaching’s explicit focus on improving CBT knowledge and expertise. We took proactive steps to protect against this, including (i) clearly explaining to all SPs prior to randomization how we were defining (and how they should be reporting) CBT delivery for research purposes and (ii) having SPs “practice” reporting CBT during the pre-randomization REP-only phase to ensure their comfort and consistency prior to offering any additional support (e.g., Coaching, Facilitation). Having SPs report the specific CBT components delivered each week (Fig. 3) also likely protected against SPs reporting any mental health-focused interaction as CBT. Finally, as reported above, weekly CBT report completion rates did not vary between the Coaching and no Coaching study arms.

Second, generalizability is limited to schools in Michigan that had a TRAILS Coach in their vicinity; nonetheless, the sample included a diverse group of urban, suburban, and rural communities.

Future work

Future manuscripts will examine differences across strategies for two key secondary outcomes: CBT fidelity/component delivery and student mental health. We will also examine moderators of effectiveness for Coaching and (among slower-responder schools) Facilitation, including different definitions of “slower-responders” for purposes of deciding how to best tailor Facilitation (vs. no Facilitation). These analyses will help to inform a more fully tailored adaptive implementation strategy for efficiently scaling up SP-delivered CBT in schools by matching implementation strategies to identified barriers [101] or short-term implementation outcomes [41]. Qualitative interviews and strategy (e.g., Facilitation) tracking data will also be used to investigate mechanisms of effectiveness (e.g., strategy burden, need, adaptability), whether strategies addressed intended barriers (e.g., CBT knowledge, organizational barriers), and sustainability of SP-delivered CBT during and after COVID-19.


Conclusions

As the COVID-19 pandemic continues to shed light on the role schools and SPs play in student mental health, questions abound as to which implementation strategies are most effective at addressing barriers to offering EBPs like CBT at scale. Our findings suggest that, among the four strategies examined, the most effective strategy for increasing average SP CBT delivery is a two-phase adaptive implementation strategy that (i) offers REP (a low-intensity, low-cost strategy) to all schools in phase 1 and then, in phase 2, (ii) augments REP with Facilitation for slower-responder schools and (iii) continues REP alone for schools that responded to REP.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Change history



Abbreviations

ASIC: Adaptive School-based Implementation of CBT

CBT: Cognitive behavioral therapy

CMH: Community mental health

EBP: Evidence-based practice

i-PARIHS: Integrated-Promoting Action on Research Implementation in Health Services

PBIS: Positive Behavior Interventions and Supports

QUERI: Quality Enhancement Research Initiative

REP: Replicating Effective Programs

SMART: Sequential, Multiple-Assignment Randomized Trial

SP: School professional

TRAILS: Transforming Research into Action to Improve the Lives of Students


References

  1. Merikangas KR, et al. Lifetime prevalence of mental disorders in U.S. adolescents: results from the National Comorbidity Survey Replication–Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2010;49(10):980–9.


  2. Charvat J. Research on the Relationship Between Mental Health and Academic Achievement, National Association of School Psychologists, 2012. [Online]. Available: Accessed 16 July 2021.

  3. Greenberg PE, et al. The economic burden of depression in the United States: how did it change between 1990 and 2000? J Clin Psychiatry. 2003;64(12):1465–75.


  4. Smyth JM, Arigo D. Recent evidence supports emotion-regulation interventions for improving health in at-risk and clinical populations. Curr Opin Psychiatry. 2009;22(2):205–10.


  5. Weisz JR, et al. Cognitive–behavioral therapy versus usual clinical care for youth depression: an initial test of transportability to community clinics and clinicians. J Consult Clin Psychol. 2009;77(3):383–96.


  6. Zins JE, Bloodworth MR, Weissberg RP, Walberg HJ. The scientific base linking social and emotional learning to school success. J Educ Psychol Consult. 2007;17(2–3):191–210.


  7. Martini R, Hilt R, Marx L, Chenven M, Naylor M, Sarvet B, Ptakowski KK. Best principles for integration of child psychiatry into the pediatric health home. Washington, DC: American Academy of Child & Adolescent Psychiatry; 2012.

  8. Ginsburg GS, Becker KD, Drazdowski TK, Tein J-Y. Treating anxiety disorders in inner city schools: results from a pilot randomized controlled trial comparing CBT and usual care. Child Youth Care Forum. 2012;41(1):1–19.


  9. Huey SJ, Polo AJ. Evidence-based psychosocial treatments for ethnic minority youth. J Clin Child Adolesc Psychol. 2008;37(1):262–301.


  10. Kataoka SH, Zhang L, Wells KB. Unmet need for mental health care among U.S. children: variation by ethnicity and insurance status. AJP. 2002;159(9):1548–55.


  11. Farmer EMZ, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatr Serv. 2003;54(1):60–6.


  12. Burns BJ, et al. Children’s mental health service use across service sectors. Health Aff. 1995;14(3):147–59.


  13. Weist MD, Rubin M, Moore E, Adelsheim S, Wrobel G. Mental health screening in schools. J Sch Health. 2007;77(2):53–8.


  14. CDC. Data and statistics on children’s mental health. Centers for Disease Control and Prevention; 2020. Accessed 26 Jul 2021.


  15. Asarnow JR, et al. Depression and role impairment among adolescents in primary care clinics. J Adolesc Health. 2005;37(6):477–83.


  16. Jaycox LH, et al. Support for students exposed to trauma: a pilot study. School Ment Health. 2009;1(2):49–60.


  17. Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence-based practices: training and consultation as implementation strategies. Clin Psychol (New York). 2013;20(2):152–65.


  18. Mychailyszyn MP, et al. Assessing and treating child anxiety in schools: assessing and treating anxiety in schools. Psychol Schs. 2011;48(3):223–32.


  19. Bruns EJ, et al. Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools. Am J Orthopsychiatry. 2016;86(2):156–70.


  20. Stephan SH, Weist M, Kataoka S, Adelsheim S, Mills C. Transformation of children’s mental health services: the role of school mental health. Psychiatr Serv. 2007;58(10):9.


  21. Schwebel DC, Plumert JM, Pick HL. Integrating basic and applied developmental research: a new model for the twenty-first century. Child Dev. 2000;71(1):222–30.


  22. Ali MM, West K, Teich JL, Lynch S, Mutter R, Dubenitz J. Utilization of mental health services in educational setting by adolescents in the United States. J Sch Health. 2019;89(5):393–401.


  23. Beidas RS, Barmish AJ, Kendall PC. Training as usual: can therapist behavior change after reading a manual and attending a brief workshop on cognitive behavioral therapy for youth anxiety? 2009. p. 7.


  24. Beidas RS, Mychailyszyn MP, Edmunds JM, Khanna MS, Downey MM, Kendall PC. Training school mental health providers to deliver cognitive-behavioral therapy. School Mental Health. 2012;4(4):197–206.


  25. Nadeem E, Gleacher A, Beidas RS. Consultation as an implementation strategy for evidence-based practices across multiple contexts: unpacking the black box. Adm Policy Ment Health. 2013;40(6):439–50.


  26. Koschmann E, Abelson JL, Kilbourne AM, Smith SN, Fitzgerald K, Pasternak A. Implementing evidence-based mental health practices in schools: feasibility of a coaching strategy. JMHTEP. 2019;14(4):212–31.


  27. Langley AK, Nadeem E, Kataoka SH, Stein BD, Jaycox LH. Evidence-based mental health programs in schools: barriers and facilitators of successful implementation. School Ment Health. 2010;2(3):105–13.


  28. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.


  29. Powell BJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.


  30. Kilbourne AM, et al. Re-engaging veterans with serious mental illness into care: preliminary results from a national randomized trial of an enhanced versus standard implementation strategy. Psychiatr Serv. 2015;66(1):90–3.


  31. Kelly JA, Heckman TG, Stevenson LY, Williams PN, et al. Transfer of research-based HIV prevention interventions to community service providers: fidelity and adaptation. AIDS Educ Prev. 2000;12:87–98.


  32. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R. Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007;2(1):42.


  33. Kilbourne AM, et al. Public-academic partnerships: evidence-based implementation: the role of sustained community-based practice and research partnerships. PS. 2012;63(3):205–7.


  34. Kilbourne AM, et al. Protocol: Adaptive Implementation of Effective Programs Trial (ADEPT): cluster randomized SMART trial comparing a standard versus enhanced implementation strategy to improve outcomes of a mood disorders program. Implement Sci. 2014;9:132.


  35. Waxmonsky J, et al. Enhanced fidelity to a psychosocial treatment for bipolar disorder: results from a randomized controlled implementation trial. Psychiatr Serv. 2014;65(1):81–90.


  36. Kilbourne AM, et al. Enhancing outreach for persons with serious mental illness: 12-month results from a cluster randomized trial of an adaptive implementation strategy. Implement Sci. 2014;9:163.


  37. Owens JS, et al. Implementation science in school mental health: key constructs in a developing research agenda. School Ment Health. 2014;6(2):99–111.


  38. Eiraldi R, Wolk CB, Locke J, Beidas R. Clearing hurdles: the challenges of implementation of mental health evidence-based practices in under-resourced schools. Adv Sch Ment Health Promot. 2015;8(3):124–45.


  39. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2015;11(1):33.


  40. Smith SN, et al. Change in patient outcomes after augmenting a low-level implementation strategy in community practices that are slow to adopt a collaborative chronic care model: a cluster randomized implementation trial. Med Care. 2019;57(7):503.

  41. Smith SN, Liebrecht CM, Bauer MS, Kilbourne AM. Comparative effectiveness of external vs blended facilitation on collaborative care model implementation in slow-implementer community practices. Health Serv Res. 2020;55(6):954–65.

  42. Connolly SL, Sullivan JL, Ritchie MJ, Kim B, Miller CJ, Bauer MS. External facilitators’ perceptions of internal facilitation skills during implementation of collaborative care for mental health teams: a qualitative analysis informed by the i-PARIHS framework. BMC Health Serv Res. 2020;20:165.


  43. Bauer MS, et al. The collaborative chronic care model for mental health conditions. Med Care. 2019;57(10 Suppl 3):S221–7.


  44. Kim B, et al. Comparing variations in implementation processes and influences across multiple sites: What works, for whom, and how? Psychiatry Res. 2020;283:112520.


  45. Sullivan JL, et al. Collaborative chronic care model implementation within outpatient behavioral health care teams: qualitative results from a multisite trial using implementation facilitation. Implement Sci Commun. 2021;2(1):33.


  46. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:S0165-1781(19)30752–8.


  47. Miller CJ, Griffith KN, Stolzmann K, Kim B, Connolly SL, Bauer MS. An economic analysis of the implementation of team-based collaborative care in outpatient general mental health clinics. Med Care. 2020;58(10):874–80.


  48. Rickwood D, Deane FP, Wilson CJ, Ciarrochi J. Young people’s help-seeking for mental health problems. Austral e-J Advance Mental Health. 2005;4(3):218–51.


  49. Quanbeck A, et al. The Balanced Opioid Initiative: protocol for a clustered, sequential, multiple-assignment randomized trial to construct an adaptive implementation strategy to improve guideline-concordant opioid prescribing in primary care. Implement Sci. 2020;15(1):26.


  50. Nahum-Shani I, Almirall D. An introduction to adaptive interventions and SMART designs in education. National Center for Special Education Research; 2019.

  51. Kilbourne AM, et al. Adaptive School-based Implementation of CBT (ASIC): clustered-SMART for building an optimized adaptive implementation intervention to improve uptake of mental health interventions in schools. Implement Sci. 2018;13(1):119.


  52. NeCamp T, Kilbourne A, Almirall D. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations. Stat Methods Med Res. 2017;26(4):1572–89.


  53. Becker EM, Becker KD, Ginsburg GS. Modular cognitive behavioral therapy for youth with anxiety disorders: a closer look at the use of specific modules and their relation to treatment process and response. School Mental Health. 2012;4(4):243–53.


  54. Chiu AW, et al. Effectiveness of modular CBT for child anxiety in elementary schools. Sch Psychol Q. 2013;28(2):141–53.


  55. Lyon AR, Charlesworth-Attie S, Stoep AV, McCauley E. Modular psychotherapy for youth with internalizing problems: implementation with therapists in school-based health centers. School Psychol Rev. 2011;40(4):569–81.


  56. Ginsburg GS, Becker KD, Kingery JN, Nichols T. Transporting CBT for childhood anxiety disorders into inner-city school-based mental health clinics. Cogn Behav Pract. 2008;15(2):148–58.


  57. Weisz JR, et al. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: a randomized effectiveness trial. Arch Gen Psychiatry. 2012;69(3):274.


  58. Chorpita BF, et al. Long-term outcomes for the Child STEPs randomized effectiveness trial: a comparison of modular and standard treatment designs with usual care. J Consult Clin Psychol. 2013;81(6):999–1009.


  59. Chorpita BF, et al. Child STEPs in California: a cluster randomized effectiveness trial comparing modular treatment with community implemented treatment for youth with anxiety, depression, conduct problems, or traumatic stress. J Consult Clin Psychol. 2017;85(1):13–25.


  60. Creed TA, Waltman SH, Frankel SA, Williston MA. School-based cognitive behavioral therapy: current status and alternative approaches. Curr Psychiatry Rev. 2016;12(1):53–64.


  61. Masia Warner C, et al. Can school counselors deliver cognitive-behavioral treatment for social anxiety effectively? A randomized controlled trial. J Child Psychol Psychiatry. 2016;57(11):1229–38.


  62. Haugland BSM, et al. Effectiveness of brief and standard school-based cognitive-behavioral interventions for adolescents with anxiety: a randomized noninferiority study. J Am Acad Child Adolesc Psychiatry. 2020;59(4):552–564.e2.


  63. Chorpita BF, Daleiden EL, Weisz JR. Modularity in the design and application of therapeutic interventions. Appl Prev Psychol. 2005;11(3):141–56.


  64. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: application of the distillation and matching model to 615 treatments from 322 randomized trials. J Consult Clin Psychol. 2009;77(3):566–79.


  65. Chorpita BF, Daleiden EL, Weisz JR. Identifying and selecting the common elements of evidence based interventions: a distillation and matching model. Ment Health Serv Res. 2005;7(1):5–20.


  66. Meyer A, et al. Developing a statewide network of coaches to support youth access to EBPs. Implement Res Pract. In Press.

  67. Rogers EM. Diffusion of innovations. 4th ed. New York: Simon and Schuster; 2010.

  68. Green LW, Kreuter MW. Health promotion planning: an educational and environmental approach. In: Health promotion planning: an educational and environmental approach. California City: Mayfield Pub. Co.; 1991.


  69. Hershfeldt PA, Pell K, Sechrest R, Pas ET, Bradshaw CP. Lessons learned coaching teachers in behavior management: the PBIS plus coaching model. J Educ Psychol Consult. 2012;22(4):280–99.


  70. Fixsen DL, Blase KA, Duda MA, Naoom SF, Van Dyke M. Implementation of evidence-based treatments for children and adolescents: research findings and their implications for the future, in Evidence-based psychotherapies for children and adolescents. 2nd ed. New York: The Guilford Press; 2010. p. 435–50.


  71. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010;30(4):448–66.


  72. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. Am Psychol. 2010;65(2):73.


  73. Lochman JE, Boxmeyer C, Powell N, Qu L, Wells K, Windle M. Dissemination of the Coping Power program: importance of intensity of counselor training. J Consult Clin Psychol. 2009;77(3):397–409.


  74. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. J Consult Clin Psychol. 2004;72(6):1050–62.


  75. Joyce BR. Student achievement through staff development. 3rd ed. Alexandria: Association for Supervision & Curriculum Development; 2002.


  76. Funderburk B, Chaffin M, Bard E, Shanley J, Bard D, Berliner L. Comparing client outcomes for two evidence-based treatment consultation strategies. J Clin Child Adolesc Psychol. 2015;44(5):730–41.


  77. Kilbourne AM, et al. Cluster randomized adaptive implementation trial comparing a standard versus enhanced implementation intervention to improve uptake of an effective re-engagement program for patients with serious mental illness. Implement Sci. 2013;8(1):136.


  78. Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977;84(2):191–215.


  79. Flumenbaum R, Smith SN, Eisman A, Kilbourne AM. ASIC facilitation manual. 2020.

  80. Ritchie MJ, Parker LE, Kirchner JE. From novice to expert: a qualitative study of implementation facilitation skills. Implement Sci Commun. 2020;1(1):25.


  81. Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care–mental health. J Gen Intern Med. 2014;29(4):904–12.


  82. Kirchner J. Behavioral Health QUERI Implementation Facilitation (IF) training hub. [Online]. Available: Accessed 17 Nov 2021.

  83. Whitmer. Executive Order 2020-05: temporary prohibition on large assemblages and events, temporary school closures - RESCINDED. 2020. Accessed 24 Nov 2020.

  84. Whitmer. Executive Order 2020-35: provision of K-12 education during the remainder of the 2019-2020 school year - RESCINDED. 2020. Accessed 24 Nov 2020.

  85. Lu X, et al. Comparing dynamic treatment regimes using repeated-measures outcomes: modeling considerations in SMART studies. Stat Med. 2016;35(10):1595–615.


  86. Cohen J. Statistical power analysis for the behavioral sciences. Hillsdale: Academic Press; 2013.

  87. Shortreed SM, Laber E, Stroup TS, Pineau J, Murphy SA. A multiple imputation strategy for sequential multiple assignment randomized trials. Stat Med. 2014;33(24):4202–14.

  88. Rubin DB. Multiple imputation for nonresponse in surveys. New York: Wiley; 1987.

  89. Schafer JL. Multiple imputation: a primer. Stat Methods Med Res. 1999;8(1):3–15. Accessed 12 Oct 2021.

  90. Kauth MR, et al. Employing external facilitation to implement cognitive behavioral therapy in VA clinics: a pilot study. Implement Sci. 2010;5(1):75.

  91. Parchman ML, et al. A randomized trial of external practice support to improve cardiovascular risk factors in primary care. Ann Fam Med. 2019;17(Suppl 1):S40–9.

  92. Parchman ML, et al. A randomized trial of practice facilitation to improve the delivery of chronic illness care in primary care: initial and sustained effects. Implement Sci. 2013;8(1):93.

  93. Chinman M, et al. Implementation of peer specialist services in VA primary care: a cluster randomized trial on the impact of external facilitation. Implement Sci. 2021;16(1):60.

  94. Harvey G, et al. The NIHR collaboration for leadership in applied health research and care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy. Implement Sci. 2011;6(1):96.

  95. Dogherty EJ, Harrison MB, Graham ID. Facilitation as a role and process in achieving evidence-based practice in nursing: a focused review of concept and meaning. Worldviews Evid Based Nurs. 2010;7(2):76–89.

  96. Stetler CB, et al. Role of ‘external facilitation’ in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1(1):23.

  97. Bidassie B, Williams LS, Woodward-Hagg H, Matthias MS, Damush TM. Key components of external facilitation in an acute stroke quality improvement collaborative in the Veterans Health Administration. Implement Sci. 2015;10(1):69.

  98. Martinez RG, Lewis CC, Weiner BJ. Instrumentation issues in implementation science. Implement Sci. 2014;9(1):118.

  99. Woodard GS, et al. Implementing mental health services for children and adolescents: caregiver involvement in school-based care. Psychiatr Serv. 2020;71(1):79–82.

  100. Larson M, Cook CR, Fiat A, Lyon AR. Stressed teachers don’t make good implementers: examining the interplay between stress reduction and intervention fidelity. School Mental Health. 2018;10(1):61–76.

  101. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.



Acknowledgements

Not applicable.


Funding

This work is supported by the National Institute of Mental Health (R01MH114203; UG3 HL154280), the National Institute on Drug Abuse (P50DA054039, R01DA039901), the Institute of Education Sciences (R324B180003), and the State of Michigan. TRAILS is supported by funds from the Centers for Medicare and Medicaid Services through the Michigan Department of Health and Human Services, the Michigan Health Endowment Fund, the Ethel and James Flinn Foundation, and other Michigan-based foundations.

Author information

Contributions

SNS contributed to the study aims and conception of the methods and analyses and drafted the manuscript. DA co-conceived the study aims (together with AMK), led the study design and analyses, led the interpretation of results, and wrote, edited, and contributed critically important intellectual content to all sections of the manuscript. SC contributed to the methods and led analyses, drafted all tables and figures, and contributed to the interpretation of results. EK contributed to the study aims and conception of the methods, contributed to the interpretation of results, and contributed critical content to the “Background” and “Discussion” sections. AR developed the “Background” section and contributed critical content to the methods and discussion. EB contributed to the interpretation of results and contributed critical content to the “Background” and “Discussion” sections. AL conducted the literature review, wrote content for the “Background” and “Methods” sections, and provided critical content for the discussion. JLA contributed to the conceptual development, background, and design refinement and editing of the manuscript. DE contributed to the conceptual development, study aims and design, and editing of the manuscript. JAH contributed to the conceptual development, background, study aims and design, and editing of the manuscript. KDF contributed to the study aims and design and contributed content to the “Background” and “Discussion” sections. CL contributed to the design of the work and contributed critical content to the methods and discussion. AMK conceived of the study aims and contributed to the design, acquisition, and interpretation of data, wrote and edited sections of the manuscript, and contributed critically important intellectual content.

The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Shawna N. Smith.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the University of Michigan Institutional Review Board (HUM # 00132239).

Consent for publication

Not applicable.

Competing interests

The views expressed are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs or any other public entity. TRAILS is in the process of applying for non-profit corporation status (501(c)(3)).

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original version of this article was revised: the incorrect version of Additional file 1 was published.

Supplementary Information

Additional file 1:

Appendix A. School Professional Assessment Survey. Appendix B. School Professional Characteristics and Background. Appendix C. Re-Analysis Focusing on CBT Delivery Trends. Appendix D. Missing Data and Imputation.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Smith, S.N., Almirall, D., Choi, S.Y. et al. Primary aim results of a clustered SMART for developing a school-level, adaptive implementation strategy to support CBT delivery at high schools in Michigan. Implementation Sci 17, 42 (2022).
