
Protocol for a hybrid type 3 effectiveness-implementation trial of a pragmatic individual-level implementation strategy for supporting school-based prevention programming

Abstract

Background

For approximately one in five children who have social, emotional, and behavioral (SEB) challenges, accessible evidence-based prevention practices (EBPPs) are critical. In the USA, schools are the primary setting for children’s SEB service delivery. Still, EBPPs are rarely adopted and implemented by front-line educators (e.g., teachers) with sufficient fidelity to see effects. Given that individual behavior change is ultimately required for successful implementation, focusing on individual-level processes holds promise as a parsimonious approach to enhance impact. Beliefs and Attitudes for Successful Implementation in Schools for Teachers (BASIS-T) is a pragmatic, multifaceted pre-implementation strategy targeting volitional and motivational mechanisms of educators’ behavior change to enhance implementation and student SEB outcomes. This study protocol describes a hybrid type 3 effectiveness-implementation trial designed to evaluate the main effects, mediators, and moderators of the BASIS-T implementation strategy as applied to Positive Greetings at the Door, a universal school-based EBPP previously demonstrated to reduce student disruptive behavior and increase academic engagement.

Methods

This project uses a blocked randomized cohort design with an active comparison control (ACC) condition. We will recruit and include approximately 276 teachers from 46 schools randomly assigned to BASIS-T or ACC conditions. Aim 1 will evaluate the main effects of BASIS-T on proximal implementation mechanisms (attitudes, subjective norms, self-efficacy, intentions to implement, and maintenance self-efficacy), implementation outcomes (adoption, reach, fidelity, and sustainment), and child outcomes (SEB, attendance, discipline, achievement). Aim 2 will examine how, for whom, under what conditions, and how efficiently BASIS-T works, specifically by testing whether the effects of BASIS-T on child outcomes are (a) mediated via its putative mechanisms of behavior change, (b) moderated by teacher factors or school contextual factors, and (c) cost-effective.

Discussion

This study will provide a rigorous test of BASIS-T—a pragmatic, theory-driven, and generalizable implementation strategy designed to target theoretically-derived motivational mechanisms—to increase the yield of standard EBPP training and support strategies.

Trial registration

ClinicalTrials.gov ID: NCT05989568. Registered on May 30, 2023.

Background

Addressing children’s social, emotional and behavioral health

At least one in five children experience social, emotional, and behavioral/mental health (SEB) challenges [1], making accessible SEB prevention and intervention programming a high priority. When children experience SEB challenges, they are at increased risk of academic and social difficulties in school and of long-term involvement with the judicial system, substance use problems, and unemployment [1]. In contrast, children who receive preventive SEB support experience social and academic gains into adulthood [2,3,4,5,6]. The widespread need to address children’s SEB, exacerbated by the COVID-19 pandemic [7, 8], makes accessible prevention programming all the more urgent.

School-based prevention

Schools are the most common setting in which children and adolescents in the USA receive both preventive and indicated care for SEB concerns [9,10,11]. As a result, school-delivered SEB practices have increasingly been prioritized in policy and legislation [12,13,14]. In particular, there are a variety of universal evidence-based prevention practices (EBPPs) that exist to address children’s SEB challenges [15, 16]. Among these are high-quality, effective, universal EBPPs delivered at the classroom or school level to support student SEB health [17,18,19,20].

Need for improved implementation supports

As with other service sectors, EBPPs for school settings are adopted inconsistently and frequently delivered with poor fidelity [21,22,23]. In the education sector, this implementation gap has been resistant to change despite intervention at federal and state policy levels [24]. Even studies of quality implementation strategies, such as coaching and consultation, demonstrate that many EBPPs fail to be adopted by school-based implementers [25,26,27]. As a result, the potential public health impact of SEB-focused EBPPs is greatly diminished [28].

Implementation determinants in schools

Like other health service sectors, schools are multi-level implementation contexts with myriad priorities, decision-makers, implementers, and recipients of the intervention. Across sectors, organizational influences on implementation have been the subject of considerable research, but creating organizational change is time-consuming and expensive, often lasting years [29]. Furthermore, even with appropriate implementation support to address organizational-level barriers or enact organizational facilitators, educators’ EBPP implementation can still stall.

Most implementation frameworks include critical individual implementer factors [30, 31]. Indeed, front-line professionals—such as teachers—are ultimately responsible for the adoption and delivery of EBPPs and present their own set of implementation determinants such as attitudes, beliefs, and intentions to implement an EBPP [30, 32, 33]. Some school-based research has documented that individual-level determinants can be more predictive of EBPP implementation than organizational factors, such as climate or culture [32, 34], even within supportive implementation contexts [35, 36].

Individual-level implementation strategies

Despite significant research on cornerstone implementation strategies such as training and consultation [37, 38], additional approaches are often needed to change behavior. Individual theories of behavior change can be leveraged to further facilitate EBPP use. The current study applies the theory of planned behavior (TPB) [39] and the health action process approach (HAPA) [40]. The TPB states that an individual’s subjective norms (their perceptions of the social importance of performing the behavior), attitudes (appraisals of the behavior), and task self-efficacy (perceived behavioral control and confidence in their ability to implement the behavior) cumulatively predict intentions to perform a behavior [41]. Intention is a strong predictor of behavior change [42,43,44]. The HAPA augments the TPB by (1) positing that the intention to engage in a behavior is influenced by one’s outcome expectancies and perceived risks (beliefs about the possible consequences of a behavior and the risks of not engaging in it, factors that we cluster with “attitudes” in the TPB) and (2) emphasizing individual volitional processes (initial action planning and planning for coping with barriers) that increase an implementer’s maintenance self-efficacy (the belief that one is capable of overcoming barriers while implementing the behavior) and strengthen the link between intentions and behavior. The most common implementation strategies, such as workshops, coaching, and consultation, primarily target knowledge and skills while often neglecting to explicitly attend to norms, attitudes, intentions, and volition.

Beliefs and attitudes for successful implementation in schools for teachers (BASIS-T)

BASIS-T is designed to address individual-level mechanisms of behavior change (e.g., self-efficacy) often missing from standard EBPP training that relate to motivation prior to receiving training and volition after training. It is an EBPP-agnostic implementation strategy designed to be delivered within the preparation/adoption phase, immediately prior to active implementation [45]. BASIS-T targets behavioral intentions via improvement in attitudes, subjective norms, and self-efficacy. Our theory of change (Fig. 1) shows the core BASIS-T components, mechanisms of change (volitional and motivational), implementation outcomes, and resulting child SEB outcomes within the current study. The BASIS-T strategy is grounded in the TPB and in the HAPA strategies of action planning (specifying the “when,” “where,” and “how” of implementing the EBPP) and problem-solving planning (generating solutions to specific barriers that one anticipates encountering when adopting a new practice) to overcome barriers to implementation.

Fig. 1

BASIS-T theory of change: components, hypothesized mechanisms of change and target outcomes

Preliminary BASIS-T studies

The BASIS strategy was developed via an iterative user-centered design approach [46]. Initial pilots of BASIS-T and a version designed for school-based mental health clinicians (BASIS-C) have demonstrated promise in enhancing participants’ attitudes, subjective norms, self-efficacy, and adoption [30, 33, 47]. The BASIS-T pilot study on which the current project is based was an attention-control randomized trial conducted with 82 elementary school teachers implementing a universal, classroom-based EBPP [48,49,50]. That study found a statistically significant positive impact on implementation task self-efficacy and outcome expectancy immediately after training and significantly less decline in task self-efficacy than attention control throughout the academic year [33]. Positive attitudes towards evidence-based practices decreased for both groups over time, but with a marginal time trend in favor of less decrease for the BASIS-T condition (p = 0.08). Except for ownership/role, all other mechanisms (outcome expectancies, subjective norms, self-efficacy, and intentions to implement) deteriorated for both groups after the post-BASIS timepoint (i.e., during active implementation), yet all effect sizes favored less deterioration in the BASIS-T condition. A significantly higher proportion of BASIS-T teachers immediately adopted the EBPP (74% vs. 40% in attention control). With marginal significance, fidelity to the EBPP remained steady for the BASIS-T group over time but deteriorated for attention control (p = 0.052), and the BASIS-T condition engaged in the EBPP more frequently (p = 0.097). A preliminary analysis estimated the cost of BASIS-T at $256 per teacher based on time and material costs. Notwithstanding the promise of these findings, this was an underpowered pilot trial designed to lay a foundation for the current study.

Objectives and aims

This project will conduct a hybrid type 3 effectiveness-implementation randomized trial to evaluate the effects of BASIS-T on implementation mechanisms and outcomes when applied to positive greetings at the door (PGD), a low-burden, universal EBPP that has been found to reduce disruptive behavior and increase academic engagement [51,52,53], both important indicators of positive SEB functioning [54,55,56,57]. We will also examine for whom, under what conditions, and how efficiently BASIS-T works to improve outcomes.

PGD is a preventive classroom management strategy based on three major themes: (a) classroom climate, (b) pre-correction, and (c) positive reinforcement [53]. PGD has been found to be well aligned with school settings and effective at addressing SEB needs. Multiple studies have found increases in on-task behavior among middle school students and reductions in latency-to-task engagement among high school students [51, 52]. These findings were replicated in a longitudinal randomized controlled efficacy trial conducted with 203 students across 10 classrooms, which found improvements in academic engagement and decreases in disruptive behavior [53]. However, consistent with more general research on universal EBPPs, results also suggested that some of the teachers delivering PGD struggled with initial adoption: two of the five teachers in the PGD condition (40%) required extra consultative support due to initially low levels of implementation.

Aim 1: Experimentally evaluate the effects of BASIS-T versus active comparison control (ACC)

Aim 1 will evaluate the main effects of BASIS-T on proximal mechanisms (attitudes, subjective norms, self-efficacy, intentions to implement, and maintenance self-efficacy), implementation outcomes (adoption, reach, fidelity, and sustainment), and student outcomes (classroom aggregated grades, test scores, attendance, and teacher ratings of classroom on-task behavior, disruptive behavior, and prosocial behavior).

Research question (RQ) 1a. Is BASIS-T more effective than the ACC condition at producing changes in proximal mechanisms of behavior change?

RQ 1b. Is BASIS-T more effective than the ACC condition in promoting implementation outcomes?

RQ 1c. Is BASIS-T more effective than the ACC condition in promoting meaningful changes in student SEB and academic outcomes?

Aim 2: Evaluate how, for whom, under what conditions, and how efficiently BASIS-T works to improve outcomes

Aim 2 will evaluate the effects of BASIS-T on student outcomes via the mechanisms of implementation behavior change and if those effects are moderated by teacher factors and school contextual factors. We will also explore how mechanisms are linked to implementation outcomes for “hypothesis-defying residuals” (i.e., teachers whose attitudes, subjective norms, and self-efficacy surrounding EBPP implementation are inconsistent with their documented implementation behaviors).

RQ 2a. Are the effects of BASIS-T mediated via mechanisms of behavior change?

RQ 2b. Are the effects of BASIS-T on implementation and student outcomes moderated by teacher-level factors (e.g., demographics, stress, baseline intentions to implement) and school-level factors (e.g., geographic location, school demographics, supportive leadership, implementation climate)?

RQ 2c. What explains “residual” teachers whose implementation behaviors are not accounted for by the mediation model?

RQ 2d. What are the costs and cost-effectiveness of BASIS-T?

Method

This hybrid type 3 effectiveness-implementation trial employs a blocked randomized cohort design with an active comparison control (ACC) condition to provide a rigorous initial test of the efficacy of BASIS-T in authentic elementary school settings (see Additional file 1 for SPIRIT checklist). Schools will be randomized to BASIS-T or ACC conditions (see Fig. 2) and data will be gathered at the teacher/classroom level. There will be two cohorts of participants—one for each of two academic years—and multiple time points of data collection over 18 months across implementation and sustainment phases. Institutional review board approval has been obtained (Additional file 2), which includes plans for de-identification and secure data storage as well as tracking and reporting of adverse events or protocol modifications if needed.

Fig. 2

CONSORT diagram

Participants and recruitment

Teachers from schools in the USA will be recruited to participate. Inclusion criteria include being a teacher at an elementary (typically K–5th grade or K–8th) school and not having been trained or supervised in delivering PGD in the past 5 years. We will recruit approximately 276 teachers from 46 elementary schools in the USA. The final balance of teachers and schools is dependent on recruitment, with the goal of meeting minimum statistical power. Schools will be approached to participate via multiple routes, including leveraging existing relationships and networks, educational listservs, and posting on social media. Interested school representatives may respond using the interest survey linked in flyers or by contacting the research team via email. The Generalizer (thegeneralizer.org), a free, web-based tool for selecting schools for randomized controlled designs that are statistically representative of a chosen inference population, will also be used to help assure the representativeness of our sample [58]. Strata will be created on urbanicity, school race/ethnicity, school percent female, school percent free and reduced lunch, school size, school percentage English language learners, and the number of schools in the district.

Once schools have been selected, teacher recruitment will proceed for both cohorts of schools with the assistance of site administrators. Principals will provide us with the email addresses of teachers in their schools. Research staff or school leadership will contact eligible participants by email or phone to describe the purpose of the study, research procedures, and incentives. Informed consent will be collected online prior to training (Additional file 3) and teachers will be free to decline participation. Monetary incentives will be provided to participating teachers and schools.

Randomization

This study will employ a randomized cluster-blocked cohort design with random assignment at the school level to eliminate the possibility of condition contamination among teachers. Schools will be blocked within district by the number of teachers participating in the study, school enrollment, % of non-White students, % of students who qualify for free/reduced lunch, and mean teacher baseline BASIS-T mechanisms of change (e.g., self-efficacy). These variables were chosen because they are associated with EBPP use and student academic outcomes [59]. We will create matched pairs using the nearest neighbor approach [60] and randomly assign schools within pairs to condition. Randomization will occur to the greatest extent possible, although there may be some situations where trainers in only one condition are available when school staff are available to be trained; these situations will be considered essentially random. School personnel and participants will be masked to condition.
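The pairing-and-assignment logic can be sketched in code. The sketch below is illustrative only: the greedy nearest-neighbor rule on standardized covariates and the covariate values are placeholders, not the study's exact matching algorithm or data.

```python
import math
import random

def standardize(values):
    """Z-score a list of covariate values (population SD; constant columns pass through)."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values)) or 1.0
    return [(v - mean) / sd for v in values]

def match_pairs(schools):
    """Greedy nearest-neighbor matching on standardized covariates.

    `schools` is a list of (id, covariate_tuple); assumes an even count.
    Returns a list of (id_a, id_b) matched pairs.
    """
    cols = list(zip(*[cov for _, cov in schools]))
    z = list(zip(*[standardize(list(c)) for c in cols]))
    unmatched = list(range(len(schools)))
    pairs = []
    while unmatched:
        i = unmatched.pop(0)
        # Closest remaining school in squared Euclidean distance.
        j = min(unmatched, key=lambda k: sum((a - b) ** 2 for a, b in zip(z[i], z[k])))
        unmatched.remove(j)
        pairs.append((schools[i][0], schools[j][0]))
    return pairs

def assign_within_pairs(pairs, rng=random):
    """Randomly assign one school per matched pair to BASIS-T, the other to ACC."""
    assignment = {}
    for a, b in pairs:
        basis, acc = (a, b) if rng.random() < 0.5 else (b, a)
        assignment[basis] = "BASIS-T"
        assignment[acc] = "ACC"
    return assignment
```

Assigning within pairs guarantees that each block contributes equally to both conditions, which is what makes the blocking variables balanced by design.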

Intervention

All participants will receive standard tele-delivered training on PGD from educational consultants with expertise in its implementation and in school-based EBPP training more generally. PGD is a proactive classroom management strategy [61] that takes a prevention-based approach to responding to behavioral needs in the classroom [53]. Research shows that PGD can increase student-level outcomes such as on-task behavior and decrease disruptive behavior [51,52,53]. PGD was designed to facilitate smoother transitions in the classroom by (a) connecting with each student by greeting them by name, (b) using pre-corrective statements with the entire class to communicate expected behaviors for transitions into the classroom, (c) providing specific pre-corrective statements privately to individual students who have difficulty self-regulating their behavior, and (d) providing specific praise and encouragement to students to reinforce desired behaviors [53].

Implementation conditions

BASIS-T strategy

BASIS-T motivational components

The BASIS-T implementation strategy integrates three core motivational components (Table 1). First, the BASIS-T facilitator provides strategic education about implementing EBPPs and overcoming barriers via maintaining an internal locus of control to improve attitudes. The second component is social influence techniques to alter perceptions of subjective norms, which consist of two broad approaches: (1) social proofing messages using data or testimonials to describe the behavior or attitudes of others and (2) techniques to induce cognitive dissonance. Social proofs have been used to reduce problem behaviors, including alcohol/drug use and disordered eating behaviors [62,63,64]. Techniques to induce cognitive dissonance operate on the premise that individuals strive for consistency between attitudes and actions [65]. Desired behaviors can be increased by evoking commitments that are active, public (vs. private), and voluntary (vs. coerced) [66]. Third, motivational interviewing (MI) is used to enhance self-efficacy. MI is a collaborative, person-centered approach to elicit and strengthen motivation to change [67]. MI has been adapted as a brief intervention with strong evidence, feasibility, and acceptability among school-based mental health clinicians [68]; shown to improve self-efficacy and implementation among teachers and primary care providers [68, 69]; and used in group contexts to promote change [47]. The BASIS-T facilitator uses group MI techniques by adopting an empathic, non-directive, and person-centered style to elicit self-motivational statements, encourage discussion of potential changes (“change talk”), and enhance self-efficacy.

Table 1 BASIS-T strategy components

BASIS-T volitional components

To address the intention-behavior gap, BASIS-T includes volitional planning interventions to increase the likelihood that teachers will maintain their self-efficacy by enacting specific implementation behaviors associated with an EBPP. These strategies have been shown to help people enact health behaviors they are already motivated to perform [70,71,72,73,74,75]. Moreover, these planning interventions have demonstrated success in improving teacher adoption of interventions for student behavior [76]. These interventions support the translation of intentions into actions through detailed planning of how to perform desired behaviors in specific contexts. Solutions are generated to situational and internal (e.g., cognitive) barriers to facilitate follow-through with the action plan. In combination, action planning and problem-solving planning increase the likelihood of behavior change [77].

BASIS-T structure

BASIS-T is delivered in a group-based format shortly before and after receiving EBPP (e.g., PGD) training. BASIS-T will be delivered via tele-facilitation (via Zoom or another similar video conferencing platform), pre-recorded video content, and electronic sharing of documents to promote scalability. BASIS-T facilitators are experienced school-based professionals. A pre-training session (75 min) targets attitudes, social norms, and perceived behavioral control. The pre-training opens with the facilitator engaging teachers in an activity to clarify their professional values (an MI component). The facilitator uses open-ended questions to elicit change talk and reflects, summarizes, and draws themes across participant responses. The pre-training session is designed to help participants (a) explore their professional values and goals and make connections between those and EBPP training opportunities, (b) link EBPP delivery to improved outcomes for students, and (c) recognize common cognitive shortcuts that leave individuals vulnerable to adopting non-EBPPs. Teachers collaboratively develop an individualized menu of potential solutions to implementation barriers from which they can select when encountering challenges and set value-congruent goals related to the upcoming EBPP training.

The BASIS-T post-training session (75 min) provides protected time and a structured experience to develop action plans and problem-solving plans. Teachers will be provided with an action planning template to detail what PGD components they will use, how, with whom, where/when, and the environmental cues and resources needed to serve as prompts to deliver PGD with fidelity. The problem-solving plan involves teachers anticipating situational barriers and generating solutions to overcome those barriers to develop personalized “if–then” plans for dealing with specific barriers. Teachers share their plans with colleagues to receive input and feedback and to publicly set values-based goals for implementation.

BASIS-T fidelity

The BASIS-T pre- and post-training intervention sessions will be recorded and independently coded by two trained research assistants with disagreements resolved through consensus dialogue [78, 79]. The research assistants will use the BASIS-T fidelity tool [47] to assess the fidelity of its delivery.

Active comparison control

Teachers assigned to the ACC will receive pre- and post-training experiences designed to mirror those received in the BASIS-T condition. These training experiences will be virtual (again via Zoom or similar) and approximately the same length as BASIS-T but will not contain any of the BASIS-T content or mechanisms of change. The ACC pre-training experience will define, describe, and advocate for EBPP implementation and fidelity of EBPP use in schools. Content will be delivered in modes that mirror those of BASIS-T, with video content, workbooks, and didactic training. The ACC thus controls for dose, information provided, and delivery mode effects. Some trainers will provide both ACC and BASIS-T to reduce the potential for trainer effects.

Teacher and school data collection

Teacher data collection will span both active implementation and sustainment phases (18 months per cohort). Data will include teacher quantitative web-based surveys and qualitative interviews, each of which will be incentivized. To promote data integrity, key items will use forced-choice response formats to prevent unplanned missingness and out-of-range responses. All data will be de-identified and stored securely. Detailed information about all study measures (including citations) can be found in Additional file 4.

Quantitative surveys

Participants will complete secure web-based surveys at 13 timepoints spanning three phases: preparation (times 1–4), active implementation (times 5–11), and sustainment (times 12–13, follow-up during the next academic year) [45]. Time 1 will occur during informed consent, time 2 in the days immediately before BASIS-T/ACC pre-training, time 3 immediately after pre-training, time 4 immediately after post-training, time 5 two weeks after training, time 6 four weeks after training, times 7–11 monthly until the end of the academic year, time 12 at the beginning of the subsequent academic year, and time 13 during the spring of that academic year. Teachers will self-report demographics, perceptions about BASIS-T (when relevant) and PGD training, BASIS-T mechanisms, implementation outcomes, organizational moderators (implementation climate, leadership), and time and resources used for PGD implementation. From times 7–11, to reduce respondent burden, we will use a random item planned missingness design for measures of attitudes (1 item selected per subscale), action self-efficacy (2 items selected), and subjective norms (1 item selected per scale).
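The random item planned missingness design can be illustrated with a short sketch: at each monthly wave, a fresh random subset of items is drawn from each subscale's pool. The subscale names and item counts below are hypothetical placeholders, not the study's actual instruments.

```python
import random

# Hypothetical item pools; names and counts are placeholders only.
ITEM_POOLS = {
    "attitudes_openness": ["att_open_1", "att_open_2", "att_open_3"],
    "attitudes_appeal": ["att_app_1", "att_app_2", "att_app_3"],
    "action_self_efficacy": ["ase_1", "ase_2", "ase_3", "ase_4"],
    "subjective_norms": ["sn_1", "sn_2", "sn_3"],
}

# Items administered per subscale at each monthly wave (times 7-11).
ITEMS_PER_WAVE = {
    "attitudes_openness": 1,
    "attitudes_appeal": 1,
    "action_self_efficacy": 2,
    "subjective_norms": 1,
}

def draw_wave_items(rng=random):
    """Randomly select the reduced item set administered at one survey wave."""
    return {scale: sorted(rng.sample(pool, ITEMS_PER_WAVE[scale]))
            for scale, pool in ITEM_POOLS.items()}
```

Because the omitted items are missing completely at random by design, the resulting gaps can be handled by the same likelihood-based estimation described in the analytic plan.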

Data collection regarding students will focus on classroom behavior, behavioral discipline, attendance, and academic performance and will be collected in aggregate at the classroom level, without individual identifiers. Teachers will complete the first secure online survey on student classroom behavior at time 4, as earlier data collection would prohibit teachers from having an adequate sample of student behavior. At the end of each school year, academic and behavioral data (attendance, disciplinary actions, grades, and standardized test scores) will be requested for all students who were in classrooms of participating teachers; these data will be obtained in aggregate at the classroom level and no individual identifiers will be provided to the research team.

PGD fidelity

Teachers will complete self-reported PGD fidelity assessments, aligned with recommendations to gather reliable and valid data, monthly in the time 4–13 surveys. Additionally, observations to assess the PGD fidelity of the implementing teachers will be conducted by trained school-based personnel using a standard PGD fidelity tool.

Qualitative interviews

Teachers whose implementation behavior is insufficiently accounted for by our mediation model (e.g., teachers with favorable implementation outcomes, but who demonstrate low levels of TPB constructs and/or teachers with low implementation outcomes, but high TPB constructs) will be invited to a qualitative interview at the end of the active implementation phase to explore additional implementation determinants. These teachers will be identified at the end of their first year of participation based on the results of statistical modeling, balanced between adopters and non-adopters, and BASIS-T and ACC conditions (15 to 19 interviewees total). Semi-structured phone interviews (approximately 60 min) will be conducted at a convenient time for identified teachers and audio-recorded for transcription and coding purposes.

Cost assessments

Cost data collection will occur with all participating teachers to capture major costs of PGD delivery with and without BASIS-T, using activity-based costing to focus on key expenses (e.g., teacher and staff time, materials) [80, 81]. We will measure costs from the payor (i.e., school system) perspective, since the primary costs and associated decision-making would sit within the implementing school system in real-world implementation. These data will be collected along with other study measures, at a frequency matched to each item (e.g., one-time training preparation at time 4 vs. PGD delivery collected monthly alongside the fidelity measures). Following expert guidance to use mixed methods in implementation cost studies [82], we will include open-ended items in each survey asking teachers to identify unexpected resources they have needed for BASIS-T or PGD. Costs of delivering the ACC itself will be excluded from the estimate for the comparison condition; PGD implementation-as-usual is the ideal counterfactual for the cost-effectiveness of BASIS-T because it represents “business as usual” for PGD and, unlike BASIS-T, the ACC would never be delivered outside of a research project. To develop complete cost estimates for BASIS-T versus PGD implementation-as-usual, we will combine the teacher-reported data with information from other sources, such as BASIS-T training records and meetings with school partners.
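Activity-based costing sums personnel time valued at hourly rates plus material expenses, activity by activity. The sketch below illustrates the arithmetic with placeholder rates and times, not the study's actual figures.

```python
def activity_cost(minutes, hourly_rate):
    """Personnel cost of one activity: time spent times the staff rate."""
    return minutes / 60.0 * hourly_rate

def per_teacher_cost(activities, material_cost):
    """Total per-teacher cost: summed personnel time across activities plus materials.

    `activities` is a list of (minutes, hourly_rate) tuples; the values
    passed in are illustrative placeholders.
    """
    return sum(activity_cost(m, r) for m, r in activities) + material_cost

# Example: two 75-min BASIS-T sessions at a hypothetical $40/h teacher
# rate plus $20 of materials.
example_total = per_teacher_cost([(75, 40.0), (75, 40.0)], 20.0)  # 120.0
```

Dividing such totals by an effectiveness difference between conditions yields the incremental cost-effectiveness ratio used in RQ 2d.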

Data analytic plan

Basic data screening and descriptives will be conducted for all quantitative variables. We will explore and statistically adjust for baseline equivalence between conditions on all individual outcomes and all school, teacher, and student variables following established guidelines [83]. For all longitudinal modeling, the statistical adjustment will use baseline intercepts as random terms; for dissimilar outcome domains, non-equivalent baseline variables will be included as covariates. Data missing at random will be modeled using full information maximum likelihood estimation or multiple imputation as appropriate.

Unless otherwise indicated, quantitative analyses will employ mixed-effects models with time points nested within classrooms within schools. Standard model-building procedures will be used [84, 85]. Classroom and time trends will be allowed to vary randomly. Piecewise time models will estimate slopes from T1 to T4 (training), T5 to T6 (first month of post-training implementation), T7 to T11 (first academic year), and T12 to T13 (sustainment during the subsequent year). Variables for condition and condition × time will be added, and iterative models with possible control variables will be tested. Covariates not contributing at p < 0.10 based on likelihood ratio tests will be removed. Level 2 and level 3 predictors will be tested and excluded under the same criterion. We will estimate whether there are statistically significant differences among the groups in the rate of change over time (i.e., slope) and in the average score on each outcome variable at the final timepoint of each piecewise segment. Models will be generalized, with appropriate link functions (e.g., log-link, Poisson) applied based on distributional form (e.g., dichotomous, zero-inflated). Estimation will use full maximum likelihood. Models will be assessed for possible violations of assumptions. Goodness-of-fit will be evaluated using likelihood ratios, deviance statistics, and fit criteria. Inference will be evaluated relative to p < 0.05. For RQs with multiple DVs, we will adjust for the false discovery rate using Benjamini and Hochberg’s procedure [86].
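The Benjamini-Hochberg step-up procedure referenced above is simple enough to sketch directly; this is a generic implementation of the standard procedure, not code from the study.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure for false discovery rate control.

    Sorts the m p-values, finds the largest rank k such that
    p_(k) <= k * alpha / m, and rejects the k smallest hypotheses.
    Returns a list of reject/retain booleans in the original order.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

Note the step-up logic: every p-value at or below the largest qualifying rank is rejected, even if it individually exceeded its own threshold, which is what distinguishes this from a simple per-test comparison.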

We will use The Generalizer (thegeneralizer.org) to compare our sample to national demographics and examine the generalizability of our findings [58]. If the sample is not generalizable, we will use inverse probability weighting to increase the representativeness of estimates [87].
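The core idea of inverse probability weighting can be sketched as follows: each unit is up- or down-weighted by the ratio of its stratum's population share to its sample share (the strata and shares here are hypothetical, for illustration):

```python
def ipw_estimate(values, strata, population_share):
    """Reweight a sample mean toward a target population.
    values[i] is a unit's outcome, strata[i] its stratum label, and
    population_share maps each stratum to its known population proportion."""
    n = len(values)
    sample_share = {s: strata.count(s) / n for s in set(strata)}
    # Weight = population proportion / sample proportion for the unit's stratum.
    weights = [population_share[s] / sample_share[s] for s in strata]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)
```

For example, if urban schools are overrepresented in the sample relative to the population, their observations receive weights below 1 and the weighted estimate shifts toward the underrepresented stratum.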

RQ 1a: A series of three-level piecewise mixed-effects models will test our primary hypotheses: (1) the BASIS-T condition will show steeper gains than the ACC condition from T1 to T2 (e.g., more favorable attitudes toward EBPPs, increased social norms, enhanced task and maintenance self-efficacy, and stronger intentions to implement), and (2) both groups will decline after training (T3–T5), as found in past TPB research [88] and in the BASIS-T pilot trial [33], but between-condition effects favoring BASIS-T will be sustained longer after training (T3 to T13). Dependent variables will include subscale scores on attitudes, social norms, self-efficacy, intentions, and maintenance self-efficacy.
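One common way to code the piecewise segments described above is to give each segment its own slope variable that accumulates within the segment and then plateaus. A minimal sketch, assuming the protocol's T1–T4 / T5–T6 / T7–T11 / T12–T13 segmentation:

```python
def piecewise_time(t):
    """Code timepoint t (1..13) into four per-segment slope variables.
    Each variable counts elapsed timepoints within its segment and is
    capped once the segment ends, so each fixed effect estimates the
    rate of change within that segment only."""
    s1 = min(t, 4) - 1            # training (T1-T4)
    s2 = max(0, min(t, 6) - 4)    # first month post-training (T5-T6)
    s3 = max(0, min(t, 11) - 6)   # first academic year (T7-T11)
    s4 = max(0, t - 11)           # sustainment year (T12-T13)
    return s1, s2, s3, s4
```

These four variables (and their interactions with condition) would then enter the mixed-effects model in place of a single linear time trend.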

RQ 1b: BASIS-T impact on behavioral implementation outcomes will be tested in two ways. First, a mixed effects model will examine whether the proportion of teachers in the BASIS-T condition who adopt PGD (i.e., initiate PGD) is higher than the proportion in the ACC condition. Second, Kaplan–Meier time-to-event analyses will be used to compare conditions on the number of days between training and PGD initiation. Reach will be analyzed using mixed effects models comparing BASIS-T vs. ACC on the proportion of students in each classroom (out of those eligible based on whether their classroom teacher participated in PGD training) who received PGD practices. Impact on PGD fidelity will be analyzed using mixed effects models with sessions within teachers, testing for the main effects of condition on adherence and participant responsiveness ratings derived from both PGD observational and self-report fidelity data. We will test the effects of BASIS-T on PGD sustainment and delayed implementation using the multilevel longitudinal analytic approach described for RQ1a.
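The Kaplan–Meier (product-limit) estimate underlying the time-to-adoption comparison can be sketched compactly; this illustrative version handles right-censoring for teachers who never initiate PGD during follow-up:

```python
def kaplan_meier(durations, observed):
    """Product-limit survival curve. durations[i] = days from training to
    PGD initiation (or last follow-up if censored); observed[i] = True if
    initiation was observed, False if censored. Returns [(time, S(t)), ...]."""
    surv, curve = 1.0, []
    event_times = sorted(set(d for d, o in zip(durations, observed) if o))
    for t in event_times:
        d = sum(1 for dur, o in zip(durations, observed) if o and dur == t)
        n = sum(1 for dur in durations if dur >= t)  # still at risk at time t
        surv *= 1 - d / n  # multiply survival by the conditional survival at t
        curve.append((t, surv))
    return curve
```

In the trial analysis, the two conditions' curves would be compared (e.g., with a log-rank test), with lower survival at a given day indicating faster PGD initiation.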

RQ1c: BASIS-T impact on student SEB and academic outcomes will be tested via mixed effects multilevel models as described in RQ1a, with academic data aggregated to the classroom level. Dependent variables will include post-intervention scores on teacher ratings of student behavior.

RQ2a: Whether mechanisms of behavior change mediate the impact of BASIS-T on implementation outcomes will be analyzed using path analysis [89], extending traditional mediation modeling to a multilevel framework for nested data [90, 91].
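As a simplified single-level illustration of testing an indirect effect (the protocol's actual models extend this to nested data), the Sobel test combines the treatment→mediator path (a) and mediator→outcome path (b) into a z statistic for the product a·b:

```python
import math

def sobel_test(a, se_a, b, se_b):
    """Sobel z and two-sided p for an indirect effect a*b, where a is the
    treatment->mediator path coefficient and b the mediator->outcome path
    coefficient (illustrative single-level version)."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    z = (a * b) / se_ab
    # Two-sided p-value from the standard normal distribution.
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p
```

Because the product of two normal coefficients is not itself normal, the Sobel test is conservative in small samples, which is why joint tests and Monte Carlo approaches (as used by PowerUp!) are often preferred.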

RQ2b: To test whether the effect of BASIS-T on implementation and student outcomes is moderated by teacher factors (e.g., demographics, baseline intentions to implement) and school-level factors, we will add moderators and interaction terms to the analytic approach described in RQs 1a-c.

RQ2c: To explore what explains “residual” teachers, whose implementation behaviors are not accounted for by the mediation model, we will analyze qualitative interviews with teachers whose predicted implementation behavior (based on BASIS-T putative mechanisms) differs from their actual behavior by ≥ 1 SD. Data will be coded using an approach that integrates directed and conventional content analysis [92]. Certain codes will be conceptualized during interview guide development and driven by the exploration, preparation, implementation, sustainment (EPIS) framework [93] (i.e., a deductive approach), allowing examination of influences on implementation across multiple levels and phases. Other codes will be developed through a close reading of an initial subset of transcripts (i.e., an inductive approach). The resulting themes will provide a way of identifying and understanding the most salient factors that impact implementation and extend beyond the existing BASIS-T mechanisms and theory of change. After a stable set of codes is developed, a consensus process will be used in which all reviewers independently code and compare their coding, arriving at consensus judgments through open dialogue [78, 79, 94].

RQ2d: We will process cost data by assigning monetary values to each cost. We will use CostOut [95], a web-based program for conducting cost-effectiveness analysis in education, to identify nationally representative unit prices for ingredients. For qualitative data, we will rapidly analyze responses on an ongoing basis [96, 97] and incorporate newly identified costs into future surveys for quantitative measurement.

Once cost data collection is complete, we will calculate the costs of BASIS-T versus PGD implementation-as-usual based on the unit price and amount of each cost category (e.g., hours spent, items purchased). We will use CostOut to standardize dollar values, including an index year for inflation; cost-of-living adjustments; and discounting costs from different years to account for preferencing of delayed over immediate costs [80, 95]. We will generate descriptive statistics describing typical costs (i.e., means, standard deviations) for each condition and incremental costs of BASIS-T over implementation as usual.
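The discounting step referred to above follows the standard present-value formula; a minimal sketch (the 3% rate is a common default in educational cost analysis, not a figure specified by the protocol):

```python
def present_value(cost, years_from_base, rate=0.03):
    """Discount a cost incurred `years_from_base` years after the index
    year back to present value, reflecting the preference for delayed
    over immediate costs."""
    return cost / (1 + rate) ** years_from_base
```

For example, a $103 cost incurred one year after the index year is valued at $100 in index-year terms at a 3% discount rate.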

Once the cost analysis is complete, we will use CostOut to calculate the cost-effectiveness [81] of BASIS-T versus PGD implementation as usual. This will involve calculating a series of incremental cost-effectiveness ratios for each implementation and student outcome measure, representing the incremental costs of BASIS-T divided by its incremental benefit (i.e., effect size). Themes about BASIS-T mechanisms and outcomes from the qualitative teacher interviews will allow for mixed-method cost-effectiveness evaluation [82] in which participants’ views help determine whether the results were worth the cost. For both cost and cost-effectiveness analyses, we will conduct sensitivity analyses that vary key sources of uncertainty in the models to examine the robustness of our estimates [98, 99].
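The incremental cost-effectiveness ratio described above is simple arithmetic; a sketch, with illustrative (not protocol-derived) numbers in the usage note:

```python
def icer(cost_strategy, cost_usual, effect_strategy, effect_usual):
    """Incremental cost-effectiveness ratio: the extra cost of the strategy
    per unit of extra benefit (e.g., per effect-size unit on an
    implementation or student outcome)."""
    return (cost_strategy - cost_usual) / (effect_strategy - effect_usual)
```

For instance, if BASIS-T cost $1,500 per teacher against $1,000 for implementation-as-usual and produced an effect size of 0.6 versus 0.1, the ICER would be $500 / 0.5 = $1,000 per effect-size unit; sensitivity analyses would then vary the cost and effect inputs to gauge the robustness of this ratio.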

Power

Our planned sample, accounting for attrition, will provide sufficient power to test linear effects on teacher- and classroom/school-level variables with a minimum detectable effect size (MDES) in the small-to-medium range, d = 0.37. We used PowerUp! [100], assuming clustering, a final sample of 46 schools evenly randomized to condition, 6 teachers per school (after 15% attrition), a school ICC of 0.10 (consistent with our pilot data), and 13 timepoints. Under the same assumptions, plus the assumption that 50% of schools will fall in a given moderator subgroup, our MDES for detecting any single school-level moderator variable is 0.48, and for any single teacher-level moderator variable, 0.45. For mediator analyses, PowerUp! estimates power using the Sobel test, the joint test, and Monte Carlo simulations. Across all tests, we will have power greater than 0.80 to detect reasonable and likely effects, based on our pilot trial and standard interpretations of effect sizes. We will be able to detect an MDES equivalent to Cohen’s d = 0.60 for the treatment–mediator pathways (e.g., BASIS-T to implementation intention), which is lower than the effect sizes obtained in the pilot trial (d = 0.61 to 1.16 for our primary mediators); an MDES of Pearson’s r = 0.3 for the mediator–outcome path (e.g., implementation intention to student behavior, a small effect); and an MDES of d = 0.10 for the direct path from treatment to student outcome.
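The main-effects MDES can be approximated with the standard two-level cluster-randomized formula that underlies PowerUp!. The sketch below omits covariate adjustment and degrees-of-freedom corrections, so it yields a slightly larger value than the protocol's covariate-adjusted d = 0.37; the 2.8 multiplier is the conventional value for two-tailed α = .05 and power = .80:

```python
import math

def mdes_cluster(j_schools, n_per_school, icc, p_treat=0.5, multiplier=2.8):
    """Approximate MDES for a two-level cluster-randomized design,
    ignoring covariate adjustment. j_schools = total clusters,
    n_per_school = units per cluster, icc = intraclass correlation,
    p_treat = proportion of clusters assigned to treatment."""
    denom = p_treat * (1 - p_treat) * j_schools
    variance = icc / denom + (1 - icc) / (denom * n_per_school)
    return multiplier * math.sqrt(variance)
```

With the protocol's design parameters (46 schools, 6 teachers per school, ICC = 0.10), this unadjusted approximation gives an MDES of about 0.41, illustrating how baseline covariates in the full PowerUp! calculation buy the reported improvement to 0.37.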

Discussion

Innovation

This hybrid type 3 trial will contribute to the literature on pragmatic implementation strategies, as well as to the nascent but expanding literature on implementation mechanisms [101,102,103]. Aside from BASIS-T’s counterpart strategy, BASIS-C [104], no implementation strategies have been explicitly designed to target TPB and HAPA mechanisms while testing those mechanisms via mediation models. Recent systematic reviews [103] indicate that much more work on implementation strategy mechanisms is needed to enable the development of streamlined, pragmatic approaches to improving implementation outcomes that generalize across EBPPs. The current study also contributes significantly to theory-building in implementation science by exploring factors beyond TPB and HAPA that help explain “residual” teachers, whose relationships between mechanisms and behavior defy the hypothesized model.

In addition, this project will examine the costs and cost-effectiveness of the BASIS-T strategy. Cost-effectiveness is an important driver of adoption decisions, especially at system and policy levels [105]. Examination of cost-effectiveness is particularly critical for implementation strategies designed with pragmatism in mind. Efficient delivery and impact are key components of pragmatism, as are clear links between prioritized implementation determinants (e.g., self-efficacy in BASIS-T) and strategy components [106].

Limitations

This study uses a block randomization approach in which all participating teachers in each school are randomized to either the BASIS-T or ACC condition. Although our team considered randomizing at the individual teacher level to align with the individual focus of the BASIS-T strategy, we opted against this because it presented a significant risk of contamination, given the extent to which some of BASIS-T’s putative mechanisms are likely to be socially influenced (especially subjective social norms).

Conclusion and impact

The current study will provide evidence of the efficacy and cost-effectiveness of applying BASIS-T in an educational setting alongside EBPP training to improve implementation outcomes. Trial results will be disseminated via publications, presentations, traditional media (e.g., press releases), social media, and networks of practitioners. Positive findings from this trial would support the generalizability of BASIS-T to additional universal, school-based EBPPs for social, emotional, and behavioral health. More generally, if effective, BASIS-T will add to the growing evidence for pragmatic implementation strategies and the mechanisms through which they operate.

Availability of data and materials

Please contact the lead author for more information.

Abbreviations

ACC:

Active control condition

BASIS-T:

Beliefs and Attitudes for Successful Implementation in Schools for Teachers

EBPP:

Evidence-based prevention practice

HAPA:

Health action process approach

MI:

Motivational interviewing

MDES:

Minimum detectable effect size

PGD:

Positive greetings at the door

SEB:

Social, emotional, and behavioral health

TAU:

Treatment-as-usual

TPB:

Theory of planned behavior

References

  1. Chang X, Jiang X, Mkandarwire T, Shen M. Associations between adverse childhood experiences and health outcomes in adults aged 18–59 years. PLoS One. 2019;14(2):e0211850.

  2. Colizzi M, Lasalvia A, Ruggeri M. Prevention and early intervention in youth mental health: is it time for a multidisciplinary and trans-diagnostic model for care? Int J Ment Health Syst. 2020;14(1):23.

  3. Durlak JA, Wells AM. Primary prevention mental health programs for children and adolescents: a meta-analytic review. Am J Community Psychol. 1997;25(2):115–52.

  4. Mendelson T, Eaton WW. Recent advances in the prevention of mental disorders. Soc Psychiatry Psychiatr Epidemiol. 2018;53(4):325–39.

  5. Tennant R, Goens C, Barlow J, Day C, Stewart-Brown S. A systematic review of reviews of interventions to promote mental health and prevent mental health problems in children and young people. J Public Ment Health. 2007;6(1):25–32.

  6. Webster-Stratton C, Rinaldi J, Jamila MR. Long-term outcomes of incredible years parenting program: predictors of adolescent adjustment. Child Adolesc Ment Health. 2011;16(1):38–46.

  7. Meherali S, Punjani N, Louie-Poon S, Abdul Rahim K, Das JK, Salam RA, et al. Mental health of children and adolescents amidst COVID-19 and past pandemics: a rapid systematic review. Int J Environ Res Public Health. 2021;18(7):3432.

  8. Leeb RT, Bitsko RH, Radhakrishnan L, Martinez P, Njai R, Holland KM. Mental health-related emergency department visits among children aged <18 years during the COVID-19 pandemic - United States, January 1-October 17, 2020. MMWR Morb Mortal Wkly Rep. 2020;69(45):1675–80.

  9. Burns BJ, Costello EJ, Angold A, Tweed D, Stangl D, Farmer EM, et al. Children’s mental health service use across service sectors. Health Aff (Millwood). 1995;14(3):147–59.

  10. Duong MT, Bruns EJ, Lee K, Cox S, Coifman J, Mayworm A, et al. Rates of mental health service utilization by children and adolescents in schools and other common service settings: a systematic review and meta-analysis. Adm Policy Ment Health. 2021;48(3):420–39.

  11. Merikangas KR, He JP, Burstein M, Swendsen J, Avenevoli S, Case B, et al. Service utilization for lifetime mental disorders in U.S. adolescents: results of the National Comorbidity Survey-Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2011;50(1):32–45.

  12. Final Report of the Federal Commission on School Safety | National Center on Safe Supportive Learning Environments (NCSSLE). Federal Commission on School Safety; 2018. Available from: https://safesupportivelearning.ed.gov/resources/final-report-federal-commission-school-safety. Cited 2023 Apr 11.

  13. An act to reauthorize the Elementary and Secondary Education Act of 1965 to ensure that every child achieves. U.S. Government Publishing Office; 2015. Available from: https://www.govinfo.gov/app/details/PLAW-114publ95. Cited 2023 Apr 11.

  14. Burns MK, Jimerson SR, Van Der Heyden AM, Deno SL. Toward a unified response-to-intervention model: multi-tiered systems of support. In: Handbook of response to intervention: The science and practice of multi-tiered systems of support. 2nd ed. New York, NY, US: Springer Science + Business Media; 2016. p. 719–32.

  15. Greenberg MT, Abenavoli R. Universal interventions: fully exploring their impacts and potential to produce population-level impacts. J Res Educ Effect. 2017;10(1):40–67.

  16. Fazel M, Hoagwood K, Stephan S, Ford T. Mental health interventions in schools 1: Mental health interventions in schools in high-income countries. Lancet Psychiatry. 2014;1(5):377–87.

  17. Horner RH, Sugai G, Smolkowski K, Eber L, Nakasato J, Todd AW, et al. A randomized, wait-list controlled effectiveness trial assessing school-wide positive behavior support in elementary schools. J Posit Behav Interv. 2009;11:133–44.

  18. Durlak JA, Weissberg RP, Dymnicki AB, Taylor RD, Schellinger KB. The impact of enhancing students’ social and emotional learning: a meta-analysis of school-based universal interventions. Child Dev. 2011;82(1):405–32.

  19. García-Moya I. The importance of connectedness in student-teacher relationships: insights from the Teacher Connectedness Project. 2020. p. 1–25.

  20. Borgmeier C, Loman SL, Hara M. Teacher self-assessment of evidence-based classroom practices: preliminary findings across primary, intermediate and secondary level teachers. Teach Dev. 2016;20(1):40–56.

  21. Evans SW, Weist MD. Implementing empirically supported treatments in the schools: what are we asking? Clin Child Fam Psychol Rev. 2004;7(4):263–7.

  22. Lyon AR, Bruns EJ. From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Ment Health. 2019;11(1):106–14.

  23. Wilson DB, Gottfredson DC, Najaka SS. School-based prevention of problem behaviors: a meta-analysis. J Quant Criminol. 2001;17(3):247–72.

  24. Vernez G, Karam RT, Mariano LT, DeMartini C. Evaluating comprehensive school reform models at scale: focus on implementation. RAND Corporation; 2006. Available from: https://www.rand.org/pubs/monographs/MG546.html. Cited 2023 Apr 11.

  25. Locke J, Olsen A, Wideman R, Downey MM, Kretzmann M, Kasari C, et al. A tangled web: the challenges of implementing an evidence-based social engagement intervention for children with autism in urban public school settings. Behav Ther. 2015;46(1):54–67.

  26. Stahmer AC, Reed S, Lee E, Reisinger EM, Connell JE, Mandell DS. Training teachers to use evidence-based practices for autism: examining procedural implementation fidelity. Psychol Sch. 2015;52(2):181–95.

  27. Suhrheinrich J, Stahmer AC, Reed S, Schreibman L, Reisinger E, Mandell D. Implementation challenges in translating pivotal response training into community settings. J Autism Dev Disord. 2013;43(12):2970. https://doi.org/10.1007/s10803-013-1826-7.

  28. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3–4):327–50.

  29. Glisson C. The organizational context of children’s mental health services. Clin Child Fam Psychol Rev. 2002;5(4):233–53.

  30. Cook CR, Lyon AR, Kubergovic D, Wright DB, Zhang Y. A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multi-tiered system of supports. School Mental Health: A Multidiscipl Res Pract J. 2015;7:49–60.

  31. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.

  32. Locke J, Kang-Yi C, Frederick L, Mandell DS. Individual and organizational characteristics predicting intervention use for children with autism in schools. Autism. 2020;24(5):1152–63.

  33. Merle JL, Cook CR, Pullmann MD, Larson MF, Hamlin CM, Hugh ML, et al. Longitudinal effects of a motivationally focused strategy to increase the yield of training and consultation on teachers’ adoption and fidelity of a universal program. Sch Ment Heal. 2023;15(1):105–22.

  34. Locke J, Lawson GM, Beidas RS, Aarons GA, Xie M, Lyon AR, et al. Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: a cross-sectional observational study. Implement Sci. 2019;14(1):29.

  35. Kincaid D, Childs K, Blase KA, Wallace F. Identifying barriers and facilitators in implementing schoolwide positive behavior support. J Posit Behav Interv. 2007;9:174–84.

  36. Sanford Derousie RM, Bierman KL. Examining the sustainability of an evidence-based preschool curriculum: the REDI program. Early Child Res Q. 2012;27(1):55–6.

  37. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol (New York). 2010;17(1):1–30.

  38. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments. A review of current efforts. Am Psychol. 2010;65(2):73–84.

  39. Ajzen I. From intentions to actions: a theory of planned behavior. In: Kuhl J, Beckmann J, editors. Action Control: From Cognition to Behavior. Berlin, Heidelberg: Springer; 1985. p. 11–39. (SSSP Springer Series in Social Psychology). Available from: https://doi.org/10.1007/978-3-642-69746-3_2. Cited 2023 Apr 11.

  40. Schwarzer R. Self-efficacy in the adoption and maintenance of health behaviors: theoretical approaches and a new model. In: Self-efficacy: Thought control of action. Washington, DC, US: Hemisphere Publishing Corp; 1992. p. 217–43.

  41. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.

  42. Ajzen I, Manstead ASR. Changing health-related behaviours: an approach based on the theory of planned behaviour. In: The scope of social psychology: Theory and applications. New York, NY, US: Psychology Press; 2007. p. 43–63.

  43. Eccles MP, Hrisos S, Francis J, Kaner EF, Dickinson HO, Beyer F, et al. Do self- reported intentions predict clinicians’ behaviour: a systematic review. Implement Sci. 2006;1:28.

  44. Steinmetz H, Knappstein M, Ajzen I, Schmidt P, Kabst R. How effective are behavior change interventions based on the theory of planned behavior? A three-level meta-analysis. Zeitschrift für Psychologie. 2016;224:216–33.

  45. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implement Sci. 2019;14(1):1.

  46. Duong MT, Cook CR, Lee K, Davis CJ, Vázquez-Colón CA, Lyon AR. User testing to drive the iterative development of a strategy to improve implementation of evidence-based practices in school mental health. Evid-Based Pract Child Adolesc Mental Health. 2020;5(4):414–25.

  47. Larson M, Cook CR, Brewer SK, Pullmann MD, Hamlin C, Merle JL, et al. Examining the effects of a brief, group-based motivational implementation strategy on mechanisms of teacher behavior change. Prev Sci. 2021;22(6):722–36.

  48. Flower A, McKenna JW, Bunuan RL, Muething CS, Vega R Jr. Effects of the good behavior game on challenging behaviors in school settings. Rev Educ Res. 2014;84:546–71.

  49. Joslyn PR, Donaldson JM, Austin JL, Vollmer TR. The good behavior game: a brief review. J Appl Behav Anal. 2019;52(3):811–5.

  50. Barrish HH, Saunders M, Wolf MM. Good behavior game: effects of individual contingencies for group consequences on disruptive behavior in a classroom. J Appl Behav Anal. 1969;2(2):119–24.

  51. Allday RA, Pakurar K. Effects of teacher greetings on student on-task behavior. J Appl Behav Anal. 2007;40(2):317–20.

  52. Allday RA, Hinkson-Lee K, Hudson T, Neilsen-Gatti S, Kleinke A, Russel C. Training general educators to increase behavior-specific praise: effects on students with EBD. Behavioral Disorders. 2012; Available from: https://ir.stthomas.edu/celc_ed_speced_pub/13.

  53. Cook CR, Davis C, Brown EC, Locke J, Ehrhart MG, Aarons GA, et al. Confirmatory factor analysis of the evidence-based practice attitudes scale with school-based behavioral health consultants. Implement Sci. 2018;13(1):116.

  54. Ialongo N, Poduska J, Werthamer L, Kellam S. The distal impact of two first-grade preventive interventions on conduct problems and disorder in early adolescence. J Emot Behav Disord. 2001;9:146–60.

  55. Reid JB, Patterson GR, Snyder J, editors. Antisocial behavior in children and adolescents: a developmental analysis and model for intervention. Washington, DC, US: American Psychological Association; 2002. xi, 337 p.

  56. Datu JAD, King RB. Subjective well-being is reciprocally associated with academic engagement: a two-wave longitudinal study. J Sch Psychol. 2018;69:100–10.

  57. Smith ND, Suldo S, Hearon B, Ferron J. An application of the dual-factor model of mental health in elementary school children: examining academic engagement and social outcomes. J Positive School Psychol. 2020;4(1):49–68.

  58. Tipton E, Miller K. The generalizer: a webtool for improving the generalizability of results from experiments. Available from: https://www.thegeneralizer.org/. Cited 2023 Apr 11.

  59. Lyon AR, Whitaker K, Locke J, Cook CR, King KM, Duong M, et al. The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health. Implement Sci. 2018;13(1):24.

  60. Dong N, Lipsey M. The statistical power of the cluster randomized block design with matched pairs--a simulation study. Society for Research on Educational Effectiveness. Society for Research on Educational Effectiveness; 2010. Available from: https://eric.ed.gov/?id=ED512728. Cited 2023 Apr 18.

  61. Rathvon N. Effective school interventions: evidence-based strategies for improving student outcomes. 2nd ed. New York, NY, US: Guilford Press; 2008. xvii, 460 p.

  62. Chassin L, Presson CC, Sherman SJ, Corty E, Olshavsky RW. Predicting the onset of cigarette smoking in adolescents: a longitudinal study. J Appl Soc Psychol. 1984;14(3):224–43.

  63. Perkins HW. The contextual effect of secular norms on religiosity as moderator of student alcohol and other drug use. Res Soc Scient Study Religion. 1994;6:187–208.

  64. Perkins HW, Meilman PW, Leichliter JS, Cashin JR, Presley CA. Misperceptions of the norms for the frequency of alcohol and other drug use on college campuses. J Am Coll Health. 1999;47(6):253–8.

  65. Petrova PK, Cialdini RB, Sills SJ. Consistency-based compliance across cultures. J Exp Soc Psychol. 2007;43:104–11.

  66. Pratkanis AR. An invitation to social influence research. In: The science of social influence: advances and future progress. New York, NY, US: Psychology Press; 2007. p. 1–15 (Frontiers of social psychology).

  67. Miller WR, Rollnick S. Motivational interviewing: helping people change. Guilford Press; 2012. 497 p.

  68. Reinke WM, Herman KC, Sprick R. Motivational interviewing for effective classroom management: the classroom check-up. Guilford Press: Practical Intervention in the Schools Series. Guilford Publications; 2011.

  69. Frey AJ, Lee J, Small JW, Seeley JR, Walker HM, Feil EG. The motivational interviewing navigation guide: a process for enhancing teachers’ motivation to adopt and implement school-based interventions. Adv School Ment Health Promot. 2013;6:158–73.

  70. Sanetti LMH, Kratochwill TR, Long ACJ. Applying adult behavior change theory to support mediator-based intervention implementation. Sch Psychol Q. 2013;28(1):47–62.

  71. Adriaanse MA, Vinkers CDW, De Ridder DTD, Hox JJ, De Wit JBF. Do implementation intentions help to eat a healthy diet? A systematic review and meta-analysis of the empirical evidence. Appetite. 2011;56(1):183–93.

  72. Bélanger-Gravel A, Godin G, Amireault S. A meta-analytic review of the effect of implementation intentions on physical activity. Health Psychol Rev. 2013;7(1):23–54.

  73. Sheeran P, Orbell S. Implementation intentions and repeated behaviour: augmenting the predictive validity of the theory of planned behaviour. Eur J Soc Psychol. 1999;29(2–3):349–69.

  74. Orbell S, Sheeran P. Motivational and volitional processes in action initiation: a field study of the role of implementation intentions. J Appl Soc Psychol. 2000;30:780–97.

  75. Orbell S, Hodgkins S, Sheeran P. Implementation intentions and the theory of planned behavior. Pers Soc Psychol Bull. 1997;23(9):945–54.

  76. Sanetti LMH, Collier-Meek MA, Long ACJ, Kim J, Kratochwill TR. Using implementation planning to increase teachers’ adherence and quality to behavior support plans. Psychol Sch. 2014;51:879–95.

  77. Hagger MS, Luszczynska A. Implementation intention and action planning interventions in health contexts: state of the research and proposals for the way forward. Appl Psychol Health Well Being. 2014;6(1):1–47.

  78. DeSantis L, Ugarriza DN. The concept of theme as used in qualitative nursing research. West J Nurs Res. 2000;22(3):351–72.

  79. Hill CE, Thompson BJ, Williams EN. A guide to conducting consensual qualitative research. Couns Psychol. 1997;25(4):517–72.

  80. Hollands FM, Pratt-Williams J, Shand R. Cost analysis standards & guidelines 1.1. Cost Analysis in Practice (CAP) Project. Revised. Grantee Submission. 2021. Available from: https://eric.ed.gov/?id=ED624010. Cited 2023 Apr 18.

  81. Levin HM, McEwan PJ, Belfield C, Bowden AB, Shand R. Economic evaluation in education: cost-effectiveness and benefit-cost analysis. SAGE Publications; 2017. 212 p.

  82. Dopp AR, Mundey P, Beasley LO, Silovsky JF, Eisenberg D. Mixed-method approaches to strengthen economic evaluations in implementation research. Implement Sci. 2019;14(1):2.

  83. What Works Clearinghouse. What works clearinghouse standards handbook, Version 4.1. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance; 2020. Available from: https://ies.ed.gov/ncee/wwc/handbooks. Cited 2023 Apr 11.

  84. Raudenbush SW, Bryk AS. Hierarchical linear models. 2nd ed. SAGE Publications; 2023. (Advanced Quantitative Techniques in the Social Sciences; vol. 1). Available from: https://us.sagepub.com/en-us/nam/hierarchical-linear-models/book9230. Cited 2023 Apr 11.

  85. Singer JD, Willett JB. Survival analysis. In: Handbook of psychology: research methods in psychology, Vol 2. Hoboken, NJ, US: John Wiley & Sons, Inc.; 2003. p. 555–80.

  86. Benjamini Y, Hochberg Y. Controlling the false discovery rate: a practical and powerful approach to multiple testing. J Roy Stat Soc: Ser B (Methodol). 1995;57(1):289–300.

  87. Seaman SR, White IR. Review of inverse probability weighting for dealing with missing data. Stat Methods Med Res. 2013;22(3):278–95.

  88. Helfrich CD, Kohn MJ, Stapleton A, Allen CL, Hammerback KE, Chan KCG, et al. Readiness to change over time: change commitment and change efficacy in a workplace health-promotion trial. Front Public Health. 2018;6:110.

  89. Loehlin JC, Beaujean AA. Latent variable models: an introduction to factor, path, and structural equation analysis, fifth edition. Taylor & Francis; 2016. 391 p.

  90. Kenny DA, Korchmaros JD, Bolger N. Lower level mediation in multilevel models. Psychol Methods. 2003;8(2):115–28.

  91. Bauer DJ, Preacher KJ, Gil KM. Conceptualizing and testing random indirect effects and moderated mediation in multilevel models: new procedures and recommendations. Psychol Methods. 2006;11(2):142–63.

  92. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88.

  93. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.

  94. Hill CE, Knox S, Thompson BJ, Williams EN, Hess SA, Ladany N. Consensual qualitative research: an update. J Couns Psychol. 2005;52:196–205.

  95. Hollands FM, Hanisch-Cerda B, Levin HM, Belfield CR, Menon A, Shand R, et al. CostOut - the CBCSE cost tool kit. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University; 2015. Available from: www.cbcsecosttoolkit.org.

  96. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.

  97. Taylor B, Henshall C, Kenyon S, Litchfield I, Greenfield S. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. 2018;8(10):e019993.

  98. Briggs AH, Gray AM. Handling uncertainty in economic evaluations of healthcare interventions. BMJ. 1999;319(7210):635–8.

  99. Levin HM, Belfield CR. Chapter 15: cost-effectiveness and educational efficiency. In: Handbook of Contemporary Education Economics. 2nd ed. Edward Elgar Publishing; 2017. p. 338–56. Available from: https://www-elgaronline-com.ezp1.lib.umn.edu/display/edcoll/9781785369063/9781785369063.00020.xml. Cited 2023 Apr 18.

  100. Dong N, Maynard R. PowerUp!: a tool for calculating minimum detectable effect sizes and minimum required sample sizes for experimental and quasi-experimental design studies. J Res Educ Effect. 2013;6(1):24–67.

  101. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.

  102. Lewis CC, Klasnja P, Lyon AR, Powell BJ, Lengnick-Hall R, Buchanan G, et al. The mechanics of implementation strategies and measures: advancing the study of implementation mechanisms. Implement Sci Commun. 2022;3(1):114.

  103. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21.

  104. Lyon AR, Pullmann MD, Dorsey S, Levin C, Gaias LM, Brewer SK, et al. Protocol for a hybrid type 2 cluster randomized trial of trauma-focused cognitive behavioral therapy and a pragmatic individual-level implementation strategy. Implement Sci. 2021;16(1):3.

  105. Walker SC, Lyon AR, Aos S, Trupin EW. The consistencies and vagaries of the Washington State inventory of evidence-based practice: the definition of “evidence-based” in a policy context. Adm Policy Ment Health Ment Health Serv Res. 2017;44:42–54.

  106. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7:3.

  107. Merle JL, Cook CR, Locke JJ, Ehrhart MG, Brown EC, Davis CJ, Lyon AR. Teacher attitudes toward evidence-based practices: exploratory and confirmatory analyses of the school-adapted evidence-based practice attitude scale. Implement Res Pract. 2023;4:26334895221151026. https://doi.org/10.1177/26334895221151026.

  108. Francis J, Eccles MP, Johnston M, et al. Constructing questionnaires based on the theory of planned behaviour: a manual for health services researchers. 2004. Available from: http://openaccess.city.ac.uk/1735/.

  109. Ramsay CR, Thomas RE, Croal BL, Grimshaw JM, Eccles MP. Using the theory of planned behaviour as a process evaluation tool in randomised trials of knowledge translation strategies: a case study from UK primary care. Implement Sci. 2010;5(1):71. https://doi.org/10.1186/1748-5908-5-71.

  110. Schmitz GS, Schwarzer R. Selbstwirksamkeitserwartung von Lehrern: Längsschnittbefunde mit einem neuen Instrument [Teachers’ self-efficacy beliefs: longitudinal findings with a new instrument]. Zeitschrift für Pädagogische Psychologie/German J Educ Psychol. 2000. https://doi.org/10.1024/1010-0652.14.1.12.

  111. Kortteisto T, Kaila M, Komulainen J, Mäntyranta T, Rissanen P. Healthcare professionals’ intentions to use clinical guidelines: a survey using the theory of planned behaviour. Implement Sci. 2010;5:1.

  112. Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, Aarons GA. Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implement Sci. 2018;13:1–4.

  113. Lyon AR, Corbin CM, Brown EC, Ehrhart MG, Locke J, Davis C, Picozzi E, Aarons GA, Cook CR. Leading the charge in the education sector: development and validation of the School Implementation Leadership Scale (SILS). Implement Sci. 2022;17(1):48.

  114. Thayer AJ, Cook CR, Davis C, Brown EC, Locke J, Ehrhart MG, Aarons GA, Picozzi E, Lyon AR. Construct validity of the school-implementation climate scale. Implement Res Pract. 2022;3:26334895221116065. https://doi.org/10.1177/26334895221116065.

  115. Kilgus SP, Chafouleas SM, Riley-Tillman TC, Welsh ME. Direct behavior rating scales as screeners: a preliminary investigation of diagnostic accuracy in elementary school. Sch Psychol Q. 2012;27(1):41.

Acknowledgements

None.

Funding

This publication was supported by grant R305A220481, awarded by the Institute of Education Sciences (IES; https://ies.ed.gov/). The content is solely the responsibility of the authors and does not necessarily represent the official views of the IES. IES had no role in study design; data collection, management, analysis, or interpretation; writing of the manuscript; or the decision to submit the manuscript for publication.

Author information

Contributions

ARL, CRC, ML, AD, and MDP developed the overarching scientific aims and design of the project. CH, PR, MB, and RG assisted in the development and operationalization of the study methods and worked with the investigators to obtain institutional review board approval. AG, NM, and AWH are supporting the study recruitment and data collection. CRC, ML, ARL, MLH, and AL have worked to refine and update the BASIS-T implementation strategy in preparation for deployment. All authors contributed to the development, drafting, or review of the manuscript. All authors approved the final manuscript.

Corresponding author

Correspondence to Aaron R. Lyon.

Ethics declarations

Ethics approval and consent to participate

This project was approved by the University of Washington Institutional Review Board (IRB).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

SPIRIT 2013 Checklist.

Additional file 2.

IRB Determination.

Additional file 3.

Consent Form for Teachers.

Additional file 4.

Detailed Study Measures [107,108,109,110,111,112,113,114,115].

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lyon, A.R., Cook, C.R., Larson, M. et al. Protocol for a hybrid type 3 effectiveness-implementation trial of a pragmatic individual-level implementation strategy for supporting school-based prevention programming. Implementation Sci 19, 2 (2024). https://doi.org/10.1186/s13012-023-01330-y

Keywords