  • Study protocol
  • Open access

A randomized controlled trial in schools aimed at exploring mechanisms of change of a multifaceted implementation strategy for promoting mental health at the workplace

Abstract

Background

This study will explore implementation mechanisms through which a single implementation strategy and a multifaceted implementation strategy operate to affect the implementation outcome, which is fidelity to the Guideline For The Prevention of Mental Ill Health within schools. The guideline gives recommendations on how workplaces can prevent mental ill health among their personnel by managing social and organizational risk factors in the work environment. Schools are chosen as the setting for the study due to the high prevalence of mental ill health among teachers and other personnel working in schools. The study builds on our previous research, in which we compared the effectiveness of the two strategies on fidelity to the guideline. Small improvements in guideline adherence were observed for the majority of the indicators in the multifaceted strategy group. This study will focus on exploring the underlying mechanisms of change through which the implementation strategies may operate to affect the implementation outcome.

Methods

We will conduct a cluster-randomized controlled trial among public schools (n=55 schools) in Sweden. Schools are randomized (1:1 ratio) to receive a multifaceted strategy (implementation teams, educational meeting, ongoing training, Plan-Do-Study-Act cycles) or a single strategy (implementation teams, educational meeting). The implementation outcome is fidelity to the guideline. Hypothesized mediators originate from the COM-B model. A mixed-methods design will be employed, entailing a qualitative study of the implementation process embedded within the cluster-randomized controlled trial examining implementation mechanisms. The methods will be used in a complementary manner to gain a full understanding of the implementation mechanisms.

Discussion

This implementation study will provide valuable knowledge on how implementation strategies work (or fail) to affect implementation outcomes. The knowledge gained will aid the selection of effective implementation strategies that fit specific determinants, which is a priority for the field. Despite recent initiatives to advance the understanding of implementation mechanisms, studies testing these mechanisms are still uncommon.

Trial registration

ClinicalTrials.org dr.nr 2020-01214.


Introduction

The importance of creating a sustainable working environment in schools

Teachers are a professional group with a high prevalence of mental ill health and related presenteeism and sick leave [1, 2]. Teachers’ work environment is characterized by high workload, role overload, increased class size per teacher, and lack of support from management, resulting in a high risk for mental ill health [1,2,3]. One way of preventing work-related mental ill health is to apply a systematic approach to the management of organizational and social risks within the work environment, as recommended by the Swedish Agency for Work Environment and Health. Many schools in Sweden, however, lack such a systematic approach [4]. To support workplaces, including schools, with the management of their social and organizational work environment and the prevention of mental ill health, we launched the Guideline For The Prevention of Mental Ill-Health At The Workplace [5]. The guideline is based on the best available evidence (e.g., [6]) and was compiled through a practice-based research network including employers, occupational health services staff, and researchers.

Supporting the implementation of the guideline within schools

Even though guidelines are an essential part of achieving sustainable working environments, it is well known that solely disseminating guidelines rarely results in full implementation in practice [7, 8]. Between 2017 and 2019, we conducted a cluster-randomized controlled trial aimed at supporting schools with the implementation of the guideline to prevent mental ill health. We developed a multifaceted implementation strategy containing an educational meeting, ongoing training through workshops, implementation teams, and Plan-Do-Study-Act cycles [9]. The effectiveness of the multifaceted strategy was compared with a discrete implementation strategy (educational meeting) among 19 schools in Sweden. Small improvements in guideline adherence were observed for the majority of the indicators in the multifaceted strategy group; however, the improvements did not differ statistically significantly from those in the discrete strategy group [10]. One reason behind the lack of effectiveness could be the large organizational changes that occurred in some of the participating schools. This was confirmed by a sensitivity analysis in organizationally stable schools, which demonstrated larger and more consistent improvements. To further understand the (lack of) effectiveness of the multifaceted implementation strategy, we will conduct a new cluster-randomized controlled trial to explore the underlying mechanisms of change through which the implementation strategy may operate to affect the implementation outcome. The need to understand how and why implementation strategies work, and to what extent, has been highlighted in several studies [11,12,13]. Despite recent initiatives to advance the understanding of implementation mechanisms [11], studies testing these mechanisms are still uncommon [12, 13]. An important prerequisite for exploring mechanisms of change is selecting implementation strategies based on a systematic approach. This includes the identification of barriers and facilitators, and the selection of implementation strategies that address the identified barriers and facilitators. Several existing methods and frameworks can be leveraged to support researchers and planners in executing a more systematic approach [14,15,16,17].

Steps for testing specific mechanisms of change

The current effort to specify mechanisms of change builds directly on our previous trial [9]. The first step was the specification of target behaviors related to the recommendations of the guideline and the identification of barriers and facilitators [9]. First, the barriers were identified from a European survey conducted by the Organization for Economic Co-operation and Development (OECD) on barriers that hinder organizations from managing organizational and social risks [18]. The main barriers identified included the lack of knowledge and lack of guidance on how to manage organizational and social risk factors in the work environment [18]. To supplement these survey findings, planning workshops were conducted with school principals to identify barriers to implementing the Guideline For The Prevention of Mental Ill-Health At The Workplace within their school [9]. Barriers identified by the principals included the lack of knowledge on how to manage organizational and social risks at the workplace, unclear professional roles regarding who has the responsibility for the prevention of mental ill health within the school, lack of support from staff and school district, and difficulty prioritizing the prevention of mental ill health due to lack of time. An important facilitator identified by the principals was the need for a systematic approach (working with the work environment routinely) to implementing the guideline recommendations in their workplace [9].

In the second step, we selected the COM-B as a model to inform the pathways of change. The COM-B model postulates that for a behaviour to occur, a person must have the capability, opportunity, and motivation to perform the behaviour in question [19]. Capability refers to whether an individual has the necessary knowledge, skills, and ability to perform the behavior [19]. Opportunity relates to factors that are external to the individual that make the performance of the behavior possible and can be divided into physical opportunity, including time and resources, and social opportunity, such as social support and social norms [19]. Motivation refers to internal processes that influence decision making and behavior, including making plans [19]. Following the pathways of change proposed by the COM-B model, the identified barriers and facilitators were organized and structured according to the COM-B constructs and applied to the principals’ role. It was hypothesized that principals needed to have the capability to engage in the behaviour (i.e., have knowledge and skills related to the prevention of mental ill health in accordance with the guideline); the opportunity to engage in the behaviour (i.e., have the time to engage in the behavior, prioritize the behavior, receive support from staff and school district, and have clearly defined roles); and the motivation to perform the behavior (i.e., the decision to implement the guideline through planning and structure).

In the third step, implementation strategies were selected to overcome the barriers and enable the facilitators. The selection was informed by existing compilations of implementation strategies (e.g., [20,21,22]). In consensus with experts and principals, implementation strategies were selected by matching strategies from the compilations with the determinants related to the three constructs of the COM-B model [9]. For example, the strategies of conducting educational meetings and ongoing training were chosen to address determinants related to capability (i.e., knowledge and skills). There is evidence that educational meetings and workshops can impact professional behavior [23] by providing access to knowledge and information [20]. Conducting ongoing training can also be used to provide individuals with skills to perform the behavior [24].

The formation of local implementation teams was a strategy chosen to address determinants related to opportunity and to provide support to the principal. Even though the evidence for the effectiveness of implementation teams is limited, implementation teams have been identified as a critical component for facilitating implementation by the Quality Implementation Framework [25]. Implementation teams create an internal support structure for implementation by specifying who will perform the tasks related to delivering the intervention and monitoring the implementation process [25]. A core function of implementation teams is to conduct improvement cycles, such as Plan-Do-Study-Act cycles (PDSA cycles) [26]. Implementation teams employ PDSA cycles to identify, problem-solve, and address barriers and improve implementation [26]. PDSA cycles can be used as a strategy to address opportunity (i.e., by creating an environmental context and resources for implementation), capability (i.e., increasing self-efficacy by conducting small changes), and motivation (i.e., by facilitating planning and decision-making) [20].

A strategy addressing opportunity was added to the multifaceted strategy based on findings of our process evaluation (unpublished observations). The process evaluation conducted in parallel with our previous study [9] identified the lack of support from the school district as an important barrier to implementation. To formalize the role of the school district in the implementation process, a decision was made to add internal facilitators as an implementation strategy. Internal facilitators have been shown to support the implementation process, among other things by overcoming obstacles to implementation [27, 28]. Core activities of implementation facilitators identified in the literature include, for example, problem identification, action/implementation planning, clarifying roles, goal/priority setting, and assessing and monitoring implementation [29]. There is growing evidence for the effectiveness of implementation facilitation in improving implementation [30, 31], and several studies have shown that facilitators can successfully support implementation efforts [28, 32].

The current study

A new cluster-randomized controlled trial will be conducted focusing on the processes through which the multifaceted implementation strategy is hypothesized to operate to affect the implementation outcome. Previous work suggested that the multifaceted strategy was no more successful than a single-component strategy. The current study builds directly on this finding by focusing on mechanisms, the distinct processes that explain how and why an implementation strategy leads to implementation success. More specifically, the current study will rigorously test theoretically driven hypothesized mechanisms of change in the multifaceted strategy. This will significantly refine and expand previous findings because it allows for the examination of exactly how the multifaceted strategy can lead to improvements in implementation outcomes (or fail to do so). This quantitative assessment will be supplemented with qualitative work, which will provide additional practice-based insight for understanding how the hypothesized mechanisms explain effectiveness in this context, and what additional considerations may be important for understanding why one strategy outperforms another. Additional enhancements in the current cluster-randomized controlled trial include refinement of the psychometric properties of the implementation outcome measure, the addition of a component (internal facilitators) to the multifaceted strategy, and the use of a larger sample.

Study aim

The aim of the study is to explore the implementation mechanisms through which a single implementation strategy and a multifaceted implementation strategy operate to affect the primary implementation outcome, which is fidelity to the Guideline For The Prevention of Mental Ill-Health At The Workplace. The implementation mechanisms will be examined by exploring different causal pathways in line with the COM-B model and applying a mixed-methods design. The mixed-methods design will entail a qualitative study of the implementation process embedded within the cluster-randomized controlled trial examining implementation mechanisms. The methods will be used in a complementary manner to gain a full understanding of the implementation mechanisms. Through its exploratory nature, the study will provide valuable knowledge on how implementation strategies work (or fail) to affect implementation outcomes.

Research questions

Q1. How do the implementation strategies affect capability, opportunity, and motivation over time?

Q2. Is the effect of the implementation strategies on fidelity to the guideline mediated by capability, opportunity, and motivation?

Q3. Does baseline readiness to change moderate the implementation strategies’ implementation mechanisms?

Trial design

The study has a cluster-randomized controlled trial design with before and after measurements. Schools are randomized (1:1 ratio) to ARM 1 or ARM 2. ARM 1 receives all strategies during year 1, while ARM 2 forms implementation teams and receives the educational strategy in year 1 and the other strategies during year 2 (Fig. 1). The study is funded by the Swedish Research Council for Health, Working Life and Welfare, approved by the Swedish Ethical Review Agency (2021-01828), and registered at ClinicalTrials.org (dr.nr 2020-01214).

Fig. 1

CONSORT flow-chart

The description of the design applied in the protocol follows the CONSORT and TIDieR reporting guidelines (see checklists in Additional files 1 and 2).

Methods

Study setting and population

The study is conducted among public primary and upper-secondary schools (n=55 schools) in four municipalities in Sweden. Two municipalities are located in an urban area and two in a rural area. Together, the municipalities represent a range of geographical and socioeconomic conditions across both urban and rural settings.

Eligibility criteria

All personnel employed by the schools are eligible to participate, including teachers, administrators, and support personnel (e.g., reading specialist, teacher’s aide, and paraprofessional). Individuals not employed by the school (e.g., external cleaning and maintenance personnel) will be excluded, as they do not fall under the management of the school.

Interventions

Guideline to be implemented

The object that will be implemented is the Guideline For The Prevention Of Mental Ill-Health At The Workplace [5]. The guideline includes recommendations on how employers, in cooperation with their personnel, can prevent mental ill health within their organization. The guideline includes the following recommendations: (1) workplaces have well-established routines/policies related to organizational and social risk management, (2) employers have knowledge of the relationship between organizational and social risks and mental ill health, and (3) workplaces regularly assess their organizational and social work environment and intervene on identified risk factors. Personnel involvement is strongly emphasised in the guideline. For example, it is recommended to conduct group discussions with personnel to prioritize work environment risks that need changing to prevent mental ill health and to involve personnel in the development of action plans describing changes that need to be made. The guideline was systematically developed in a collaboration between researchers, employer representatives, and occupational health service staff and includes recommendations that are based on the best available evidence in the field (e.g., [33,34,35,36]). The guideline complies fully with the Swedish Work Environment Authority’s provisions on the organizational and social work environment (AFS 2015:4). Since 2018, the guideline has been disseminated through the Swedish Agency for Work Environment and Expertise (https://sawee.se/). A full description of the recommendations has been published previously [9].

Implementation strategies

The strategies to be evaluated are based on those developed in our previous study [9]. An additional strategy was added, namely an internal facilitator. Refinements were made to the content of the educational meeting and workshops, with more focus on knowledge provision regarding the guideline recommendations and the Plan-Do-Study-Act methodology. Moreover, educational material to support the formation of Plan-Do-Study-Act cycles was added in the form of templates to be used by the implementation team. To provide a deeper understanding of the mechanisms of change, the original COM-B pathways were reassessed and refined based on the Theoretical Domains Framework (TDF) [37]. The domains of the TDF have previously been successfully mapped onto the COM-B, with excellent agreement [37]. Implementation strategies, specified in accordance with Proctor's recommendations for specification [38], are described in Table 1. The implementation logic model is depicted in Fig. 2.

Table 1 Specification of the implementation strategies
Fig. 2

Implementation logic model

Implementation teams

At each school, the principal will form an implementation team consisting of 4–5 people, depending on the size of the school. The team will include the principal, a teacher (union) representative, an occupational health and safety officer, and a support staff representative. The members of the team should represent the actual mix of staff who will be involved in the implementation of the guideline. To support the principal with the formation of the team, instructions will be sent by email, including a template to specify team members’ names, roles, and motivation for inclusion. The team will be encouraged to meet regularly. The schools will choose the length and frequency of the meetings.

Educational meeting

At each municipality, implementation teams and school-district representatives (e.g., director of education, HR specialist) will participate in a 1-day educational meeting conducted by an implementation researcher (LK) and a researcher in occupational health (CB). The educational meeting will be a mixture of lectures, discussions, activities, and group exercises. During the first lecture, the researchers will provide information on mental ill health, the guideline, and how working in accordance with the guideline can prevent mental ill health. Each team will conduct exercises aimed at reflecting on their current adherence to the guideline recommendations, identifying the benefits of adhering to the guideline, setting an implementation goal, and planning for implementation. Instructions related to the exercises will be given through several short lectures given by the implementation researcher and complementary materials, including templates for goal formulation. At the end of the meeting, a lecture will be given on what is needed to succeed with the developed plan. Potential barriers and facilitators will also be introduced.

Ongoing training in the form of workshops

To support the implementation teams with their implementation process, five 2.5-h workshops will be held at each municipality by the researchers (LK and CB) over a 12-month period. Each workshop contains a lecture aimed at providing detailed information on guideline recommendations (one recommendation per workshop). The workshops will mix lectures with discussions, activities, and exercises. Each workshop is divided into three modules. During the first module, the teams will present the progress they have made with their PDSA cycle and adjustments that need to be made to continue (see description below). The second module is aimed at providing detailed information on the guideline recommendations. This module includes lectures combined with discussions and exercises. During the exercises, teams compare how they currently work with the prevention of mental ill health with what is recommended by the guideline. At the end of the module, teams will have identified what needs to be changed. The third module is aimed at providing teams with information and skills regarding how to implement the guideline within their workplace. Lectures will be given on Specific, Measurable, Achievable, Realistic, and Timely (SMART) goals and how to conduct PDSA cycles. During this module, teams will conduct exercises aimed at formulating goals and making plans for implementation in accordance with the PDSA cycles. During each workshop, implementation teams receive complementary materials, including handouts of the lectures and materials and templates for the exercises.

Plan-Do-Study-Act cycles

The implementation teams will start with their first PDSA cycle during the first workshop. In line with the PDSA structure, the teams will compare their current situation with the guideline recommendations, formulate a goal based on the recommendations, identify barriers to achieving the goal, identify what changes are needed to achieve the goal, and develop an implementation plan that describes how the changes are to be implemented and who is responsible for them. Between workshops, the teams are encouraged to implement the changes and measure the effects, compare the results with the formulated goal, and, if necessary, adjust the changes. To support the teams with their PDSA cycles, lectures explaining how to conduct PDSA cycles will be given during the workshops; moreover, teams will receive PDSA planning templates during each workshop. The template is based on the planning tool developed by the National Implementation Research Network and was translated into Swedish for the purpose of the study [26]. At the end of each workshop, the teams will hand in a copy of their template to the research team. Between workshops, teams will be reminded to complete their template, encouraging them to reflect on their implementation progress.

Internal facilitator

At each municipality, a representative of the school district will be selected in collaboration with the research team to act as internal facilitator. The internal facilitator should be familiar with the school-level organizational structures and procedures. In this study, the functions of the facilitator can include supporting implementation teams with identifying changes to be made, helping to prioritize, supporting the identification and understanding of barriers, helping with problem-solving if needed, and providing positive reinforcement. Overall, the facilitator will provide the support needed for the implementation teams to work according to the recommendations of the guideline, for example, by providing resources and technical support. The internal facilitator will participate in the educational meeting and each workshop.

Outcomes

Table 2 describes the measurement variables, method of data collection, data source, and time-point for each measure.

Table 2 Measurement variables, method of data collection, data source, and time points

Participant timeline

The time schedule of enrolment, implementation strategies, and assessments is described in Fig. 3.

Fig. 3

The time schedule of enrolment, implementation strategies, and assessments

Sample size and power

For the mediation analysis, the power calculation by Lee et al. [44] is used due to the lack of information on the mediators needed to conduct our own calculation. In the study by Lee and colleagues, the same TDF constructs and a similar modeling strategy were used to assess mediators as in the present study. Their calculation indicated that a sample of 121 provides 80% power to detect moderate treatment-mediator and mediator-outcome effects. With 55 participating schools and 4 to 5 implementation team members per school, we expect a sample size of around 220 participants, clustered in their respective schools, for the mediation analysis. However, for the implementation outcome, we will have data for all school personnel, with an estimated sample size of around 1500 participants (equivalent to 28 participants per school).
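The expected sample sizes follow from simple arithmetic over the figures reported in this section; the sketch below makes the calculation explicit, fixing the team size at its lower bound of 4 members for illustration.

```python
# Back-of-the-envelope check of the expected sample sizes.
# Inputs are taken from the protocol text (55 schools, 4-5 team
# members per school, ~28 staff per school).
n_schools = 55

team_members_per_school = 4                # lower bound of 4-5 members
mediation_n = n_schools * team_members_per_school   # 220 participants

staff_per_school = 28
outcome_n = n_schools * staff_per_school   # 1540, i.e. "around 1500"

print(mediation_n, outcome_n)
```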

Recruitment

Recruitment of municipalities started in September 2020 by contacting municipalities that had previously shown interest in participating in research projects, advertising the project via stakeholder-related channels, and contacting municipalities by email. Between September 2020 and April 2021, the research team met with four municipalities that showed interest in participation. All municipalities agreed to participate. Next, informational meetings were held with principals and union representatives to inform them about the background, rationale, and logistics of the project. Finally, school personnel were recruited. An informational film recorded by the research team describing the project and what participation entails was disseminated to all personnel by the principals. An informational letter was sent by email describing the study objectives, research approach, the voluntary nature of participation, the data collection process, and the participants' right to stop their participation at any time on request. School personnel were given the opportunity to contact the research team if anything was unclear. Individuals who decided to participate then completed an informed consent form.

Assignment of interventions

Due to practical necessity (spread of recruitment), schools were randomized in two rounds. The first randomization was stratified by municipality and clustered, so that all schools in a school district with the same central leadership, or alternatively all schools with the same principal, were randomized as a cluster. Upper secondary schools were randomized as a separate stratum in the first randomization round to guarantee their equal distribution among the two groups. The second round of randomization included only one municipality. Two school pairs (four schools in total) had the same principals and were randomized as clusters. These schools were randomized as a separate stratum from the other schools to maintain balance in the number of pupils and staff. No further stratifications were made. The randomization was done by randomly ordering the clusters within the strata and assigning group allocation based on order. A seed was set for replication. Randomization was conducted by an independent statistician, blind to the identity of the schools and not involved within the project. The principal investigator was not involved in the group assignment. Due to logistical reasons (i.e., planning workshops within the schools’ schedules), randomization occurred prior to baseline measurements. The principal investigator informed the municipalities and schools of their group allocation. Blinding of principals or school personnel is not possible within the chosen study design.
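The allocation procedure described above (random ordering of clusters within strata, with a fixed seed for replicability) can be sketched as follows. This is an illustrative reconstruction, not the statistician's actual code; the function name, stratum labels, and school identifiers are hypothetical.

```python
import random

def randomize_clusters(strata, seed=2021):
    """Allocate clusters of schools to two arms by random ordering
    within strata.

    `strata` maps a stratum label to a list of clusters; each cluster
    is a list of school identifiers that must be allocated together
    (e.g., schools sharing a principal or central leadership).
    """
    rng = random.Random(seed)  # fixed seed so the allocation is replicable
    allocation = {}
    for label, clusters in strata.items():
        order = list(clusters)
        rng.shuffle(order)     # random ordering within the stratum
        # Assign arms down the shuffled order: 1, 2, 1, 2, ...
        for i, cluster in enumerate(order):
            arm = 1 if i % 2 == 0 else 2
            for school in cluster:
                allocation[school] = arm
    return allocation

# Hypothetical usage: one stratum with two single-school clusters and
# one two-school cluster that shares a principal.
alloc = randomize_clusters({"municipality A": [["S1"], ["S2"], ["S3", "S4"]]})
```

Because each stratum is shuffled independently and arms alternate down the shuffled order, the design keeps the arms near-balanced within every stratum while guaranteeing that clustered schools stay together.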

Data collection methods

Fidelity to the guideline

Fidelity to the guideline will be assessed by a checklist and web survey at baseline and 12- and 24-month follow-ups. The checklist contains three sections, one for each guideline recommendation, with 12 statements in total. Per recommendation, the school management indicates their agreement with the related statements, e.g., “At our school, we have updated work environment policies”. Respondents indicate whether they agree with the statement and, if so, provide a detailed description of how and when the activity was performed and attach the related documents (e.g., work environment policies). The checklist is electronic, was developed for the purpose of the study, and was pilot-tested among a small sample of principals not participating in the study.

The web survey provides a measure of fidelity to the guideline from the perception of the school personnel. It is hypothesized that if schools adhere to the guideline recommendations, then the school personnel will be exposed to the related activities. The web survey was developed for our previous study [9]. For the present study, cognitive testing of the questions and response categories was conducted among teachers (n=5) to ensure that the questions successfully capture the scientific intent and that they make sense to the respondents. The survey contains 12 statements related to the recommendations of the guideline. Respondents indicate on a 5-point Likert scale ((1) “strongly disagree” to (5) “strongly agree”) the extent of their agreement with the statements. For example, “During the last work environment survey results of the survey were presented to personnel (e.g., by email or during a joint meeting)”. A link to the web survey will be sent by e-mail by the project team. The survey can be obtained upon request from the corresponding author.

Hypothesized mediators

Mediators will be assessed with the Determinants of Implementation Behaviour Questionnaire, based on the Theoretical Domains Framework (DIBQ) [39]. For this study, items were translated from English to Swedish and back-translated by an independent researcher. The items were pilot-tested among participants of our previous school study. The questionnaire will be completed on four occasions by all participants exposed to the implementation strategies. The following domains will be assessed, each with three items, on a 7-point scale ranging from (1) strongly disagree to (7) strongly agree: knowledge, skills, beliefs about consequences, beliefs about capability, social influences, intention, goals, behavioural regulation, and environmental context and resources. Table 2 provides an overview of the time points when each specific domain will be assessed.
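As an illustration of how the three-item domain ratings might be summarized, the sketch below scores each TDF domain as the mean of its items. Note that this mean-score convention is an assumption for illustration only; the protocol does not prescribe a scoring rule, and the example responses are hypothetical.

```python
from statistics import mean

def domain_scores(items):
    """Summarize each TDF domain as the mean of its 7-point items
    (an illustrative convention, not the protocol's specified rule)."""
    return {domain: mean(values) for domain, values in items.items()}

# Hypothetical responses (1-7) for three of the nine assessed domains
responses = {
    "knowledge": [6, 5, 6],
    "skills": [4, 5, 4],
    "environmental context and resources": [3, 2, 4],
}

scores = domain_scores(responses)
```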

Descriptive variables

Descriptive variables, including age, gender, number of years working at the school, and number of years working within the profession, will be assessed by a web survey completed by principals and school personnel at baseline and 12- and 24-month follow-ups. Self-reported stress and self-rated health will each be assessed with a single item with 5-point response anchors ranging from (1) “not at all” to (5) “very much” [40, 41]. Self-perceived health is assessed with a single question from the SF-12 Health Survey [42] with 5-point response anchors ranging from (1) excellent to (5) bad. Self-reported psychosocial safety climate (PSC) will be assessed with the 4-item Psychosocial Safety Climate Scale with 5-point response anchors ranging from (1) strongly disagree to (5) strongly agree [43].

Readiness to implement

Readiness to implement will be assessed with the Leader/Staff Readiness to Implement Tool during the educational meeting and during workshop 5. The tool was developed for use in the school context and has a leader and staff version (Cook et al., submitted). Both versions contain 14 items with 5-point response anchors ranging from (1) strongly disagree to (5) strongly agree.

Process outcomes

The implementation process will be evaluated by collecting information on implementation outcomes as defined by Proctor and colleagues [45]. Implementation outcomes include penetration and fidelity. Moreover, we will explore participants’ perspectives regarding how or why the implementation strategies worked or failed (and how they might be optimized in subsequent efforts) via observations and semi-structured interviews.

Penetration

Penetration will be assessed through attendance lists completed by the research team during the educational meetings and during each workshop. Penetration will be operationalized as the absolute number and proportion of individuals participating in the educational meeting and workshops relative to those expected to participate. Information on penetration is collected because low penetration could influence the functioning of the implementation mechanisms.
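The operationalization above amounts to a simple per-activity count and proportion; the sketch below illustrates it with hypothetical activity names and attendance figures (the real data come from the research team’s attendance lists):

```python
# Hypothetical attendance figures for one school (names and counts are
# illustrative only).
expected = {"educational_meeting": 12, "workshop_1": 12, "workshop_2": 12}
attended = {"educational_meeting": 11, "workshop_1": 9, "workshop_2": 10}

# Penetration per activity: absolute number attending and the proportion
# of those expected to participate.
penetration = {
    activity: (attended[activity], attended[activity] / expected[activity])
    for activity in expected
}
print(penetration["workshop_1"])  # -> (9, 0.75)
```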

Fidelity

Fidelity to the educational meeting and workshops will be assessed by the research team during the meeting and workshops, using a checklist to note whether they were delivered in accordance with the study protocol. A research log is kept describing deviations and their possible reasons. Fidelity to the implementation teams will be assessed during the educational meeting by examining whether each school has formed an implementation team composed of the recommended representatives. Throughout the study, information will be collected on whether the composition of the teams has changed, along with possible reasons for those changes. Fidelity to the PDSA cycles will be assessed by collecting the implementation teams’ PDSA templates after each workshop and comparing the templates against the key principles of the strategy [46]. Fidelity to the internal facilitator will be assessed by checking whether each municipality has appointed an internal facilitator. Information on fidelity is collected because low fidelity to the different strategies could influence the functioning of the implementation mechanisms.

Implementation process

The functioning of the implementation strategies will be assessed by observation and by semi-structured interviews. Observations of the implementation teams will be made during workshops 2 and 5 using an observation protocol (based on [47]) focusing on the following themes: planning and organisation, interest and engagement, productivity, process, and group dynamics and climate. The collected information will be used to assess the functioning of the implementation teams. Semi-structured interviews will be conducted at the 12-month follow-up with all principals and with a purposive selection of implementation team members based on their role in the implementation process. An interview guide will be developed for the purpose of the study, covering the following themes: activities related to the implementation process, how the implementation strategies have supported the implementation process, challenges experienced during the implementation process, how the implementation strategies might be optimized in future efforts, additional strategies that might be needed, and other areas for improvement. Semi-structured interviews will also be conducted with school district representatives to assess the extent to which they have acted as internal facilitators supporting the implementation process among their schools in accordance with their functions. The interviews will be conducted by the research team, audio-recorded, and transcribed verbatim.

Context

Six months after the educational meeting, the research team will conduct a telephone follow-up (20–30 min) with all principals to assess contextual factors potentially influencing the functioning of the implementation mechanisms. For this study, an interview guide was developed based on an existing guide [24], adapted to fit the school context. The guide includes two questions with follow-up prompts. First, the principals will be asked to summarize the activities they have undertaken to implement the guideline since the educational meeting. Second, they will be asked to identify any circumstances at their school and/or school district that may have influenced the implementation process, such as personnel turnover and organizational changes. The follow-ups will be audio-recorded and transcribed verbatim.

Data management

Data will be collected using the secure web platform Research Electronic Data Capture (REDCap [48]) and stored electronically in a password-protected folder on a secure server at our institute. All data will be deidentified by removing all person-identifiable information from the database and replacing it with a code. The code key is retained, which enables individuals to request an extract of the collected data or to demand, without giving any reason, that their information be destroyed. Only the research team will have access to the identification code, which is stored separately from the data.
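The deidentification step described above can be sketched as follows. The record fields and the use of random hex codes are illustrative assumptions, not the project’s actual procedure; the essential point is that the code key permitting re-identification is held separately from the analysis data:

```python
import secrets

# Hypothetical participant records; field names are illustrative only.
records = [
    {"name": "Participant A", "email": "a@example.org", "survey_score": 4},
    {"name": "Participant B", "email": "b@example.org", "survey_score": 2},
]

code_key = {}       # re-identification key, stored separately from the data
deidentified = []   # analysis dataset with person-identifiable fields removed

for record in records:
    code = secrets.token_hex(4)  # random pseudonym replacing the identity
    code_key[code] = {"name": record["name"], "email": record["email"]}
    deidentified.append({"id": code, "survey_score": record["survey_score"]})

# Only someone holding code_key can re-link a code to a person, e.g. to
# honour a participant's request that their data be destroyed.
```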

Data analysis

The study will use mixed methods to fulfil the complementarity function for evaluation purposes [49]. Causal linkages between the strategies, mediators, and implementation outcomes will be tested with quantitative methods, while data from the semi-structured interviews will be analyzed to provide further insight into the functioning of the mechanisms and the experiences of those involved. The results of the qualitative study of the implementation process will be nested within the quantitative study of implementation mechanisms.

Statistical analysis

Categorical variables will be presented with counts and percentages, and continuous variables with medians and interquartile ranges, unless the mean and standard deviation are deemed more informative. The primary outcomes of interest are the effects of the implementation strategy on the outcome fidelity that are mediated by the hypothesized mediators. The mediators are the domains of the Determinants of Implementation Behaviour Questionnaire (DIBQ) [39]; each domain consists of three items. The first step will be a confirmatory factor analysis to confirm that the hypothesized domains are reasonable and can be used. We will then use a mediation analysis to estimate the average mediation effect, average direct effect, and average total effect, as well as the proportion mediated. The mediation analysis will be done within the structural equation modeling (SEM) framework to best take advantage of the data structure. A detailed statistical analysis plan will be developed in collaboration with a statistician prior to starting the data analysis.
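As an illustration of the quantities the mediation analysis will estimate, the sketch below simulates a single mediator and recovers direct, indirect (mediated), and total effects via the product-of-coefficients approach. All variable names and effect sizes are hypothetical; the protocol’s actual analysis will use SEM with latent DIBQ domains rather than this simplified two-regression version.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data (effect sizes are hypothetical, for illustration only):
# implementation strategy -> mediator (e.g., a DIBQ domain) -> fidelity.
strategy = rng.integers(0, 2, n).astype(float)  # 0 = single, 1 = multifaceted
mediator = 0.5 * strategy + rng.normal(size=n)
fidelity = 0.4 * mediator + 0.2 * strategy + rng.normal(size=n)

def ols(y, xs):
    """Least-squares coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(mediator, [strategy])[1]                      # strategy -> mediator
b, c_prime = ols(fidelity, [mediator, strategy])[1:]  # mediator effect, direct effect
c = ols(fidelity, [strategy])[1]                      # total effect

indirect = a * b  # average mediated (indirect) effect
print(f"indirect={indirect:.3f} direct={c_prime:.3f} "
      f"total={c:.3f} proportion mediated={indirect / c:.2f}")
```

For linear models the effects decompose exactly (total = direct + indirect); estimating the same quantities within SEM additionally accounts for measurement error in the three-item domains.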

Ethics

The study has been approved by the Swedish Ethical Review Authority (No. 2021-01828). The study complies fully with current ethical requirements regarding the handling and storage of personal data and regarding the informed consent process, in accordance with the General Data Protection Regulation. An information letter is sent to potential participants describing the study aim, research approach, data collection process, and the voluntary nature of participation, including the possibility to withdraw at any time. The letter also states that data are collected only for the purpose of the study, will be presented at the group level, and that no personal data will be shared with the school or school district. The letter allows participants to make informed decisions about whether to participate. Informed consent will accordingly be collected from all participants.

Plans for dissemination

Results will be disseminated within the scientific community through peer-reviewed open-access papers and scientific conferences. Results will also be disseminated outside of the research community, for example, through social media, popular scientific reports, and national seminars for key stakeholders, including municipalities and schools.

Discussion

The present study will further our understanding of how implementation strategies work (or fail) to affect implementation outcomes. It will provide knowledge both on the impact of implementation strategies on theorized mediators and on whether the impact on these mediators facilitates implementation. Ultimately, the knowledge gained will aid the selection of effective implementation strategies that fit specific determinants, which is a priority for the field [50]. Despite recent initiatives to advance the understanding of implementation mechanisms [11], studies testing these mechanisms are still uncommon [11,12,13, 51].

The study has several strengths. First, to assess implementation mechanisms, we will conduct a cluster-randomized controlled trial exploring the causal pathways through which a multifaceted implementation strategy, compared with a discrete strategy, affects fidelity to the guideline via hypothesized mediators originating from the COM-B model. The need for randomized controlled trials with high-quality designs that test implementation strategies and articulate and evaluate theory-derived mechanisms has been underscored in several reviews [12, 13]. Second, implementation mechanisms will be explored with mixed methods; few studies have used a mixed-methods approach to understand implementation mechanisms [13]. Finally, this study will provide valuable knowledge on how implementation strategies work in a school context. Most implementation effectiveness studies are conducted within health care settings, emphasizing the need for school context-specific studies [52]. As the Swedish school setting shares many features with school contexts in other countries, we believe that the knowledge gained in the present study will be generalizable to school systems outside of Sweden.

Availability of data and materials

Not applicable

References

  1. Kidger J, Brockman R, Tilling K, Campbell R, Ford T, Araya R, et al. Teachers’ wellbeing and depressive symptoms, and associated risk factors: a large cross sectional study in English secondary schools. J Affect Disord. 2016;192:76–82.

  2. Arvidsson I, Hakansson C, Karlson B, Bjork J, Persson R. Burnout among Swedish school teachers - a cross-sectional analysis. BMC Public Health. 2016;16(1):823.

  3. Hultell D, Gustavsson JP. Factors affecting burnout and work engagement in teachers when entering employment. Work. 2011;40(1):85–98.

  4. Swedish Work Environment Authority. Report on the project for the national school inspection 2013-2016. [Swedish: Arbetsmiljöverket. Projektrapport för Arbetsmiljöverkets nationella tillsyn av skolan 2013-2016. Stockholm; 2017]. Stockholm: Swedish Work Environment Authority; 2017.

  5. Jensen I. Mental ill-health at the workplace. Guideline for the assessment and treatment of mental ill-health at the workplace [Swedish: Riktlinjen för Psykisk Ohälsa på arbetsplatsen]; 2015.

  6. Montano D, Hoven H, Siegrist J. Effects of organisational-level interventions at work on employees’ health: a systematic review. BMC Public Health. 2014;14:135.

  7. Gagliardi AR, Malinowski J, Munn Z, Peters S, Senerth E. Trends in guideline implementation: an updated scoping review protocol. JBI Evid Synth. 2022;20(4):1106–12.

  8. Gagliardi AR, Alhabib S; members of the Guidelines International Network Implementation Working Group. Trends in guideline implementation: a scoping systematic review. Implement Sci. 2015;10:54.

  9. Kwak L, Lornudd C, Bjorklund C, Bergstrom G, Nybergh L, Elinder LS, et al. Implementation of the Swedish Guideline for Prevention of Mental ill-health at the Workplace: study protocol of a cluster randomized controlled trial, using multifaceted implementation strategies in schools. BMC Public Health. 2019;19(1):1668.

  10. Toropova A, Bjorklund C, Bergstrom G, Elinder LS, Stigmar K, Wahlin C, et al. Effectiveness of a multifaceted implementation strategy for improving adherence to the guideline for prevention of mental ill-health among school personnel in Sweden: a cluster randomized trial. Implement Sci. 2022;17(1):23.

  11. Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, et al. Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: protocol for generating a research agenda. BMJ Open. 2021;11(10):e053474.

  12. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health. 2016;43(5):783–98.

  13. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21.

  14. Fernandez ME, Ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158.

  15. French SD, Green SE, O'Connor DA, McKenzie JE, Francis JJ, Michie S, et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci. 2012;7:38.

  16. Colquhoun HL, Squires JE, Kolehmainen N, Fraser C, Grimshaw JM. Methods for designing interventions to change healthcare professionals’ behaviour: a systematic review. Implement Sci. 2017;12(1):30.

  17. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  18. Organisation for Economic Co-operation and Development. Health at a glance: Europe 2018 state of health in the EU cycle. Paris: OECD Publishing; 2018.

  19. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.

  20. Waltz TJ, Powell BJ, Fernandez ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.

  21. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109.

  22. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

  23. Forsetlund L, O'Brien MA, Forsen L, Reinar LM, Okwen MP, Horsley T, et al. Continuing education meetings and workshops: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2021;9:CD003030.

  24. Lengnick-Hall R, Willging CE, Hurlburt MS, Aarons GA. Incorporators, Early Investors, and Learners: a longitudinal study of organizational adaptation during EBP implementation and sustainment. Implement Sci. 2020;15(1):74.

  25. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50(3-4):462–80.

  26. National Implementation Research Network. Module 1: improvement cycles. Available from: https://nirn.fpg.unc.edu/module-1/improvement-cycles. Accessed 26 Jan 2022.

  27. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci. 2008;3:1.

  28. Moussa L, Garcia-Cardenas V, Benrimoj SI. Change facilitation strategies used in the implementation of innovations in healthcare practice: a systematic review. J Change Manag. 2019;19(4):283–301.

  29. Ritchie MJ, Dollar KM, Miller CJ, Smith JL, Oliver KA, Kim B, et al. Using implementation facilitation to improve healthcare (Version 3). 2020.

  30. Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care-mental health. J Gen Intern Med. 2014;29(Suppl 4):904–12.

  31. Ritchie MJ, Parker LE, Edlund CN, Kirchner JE. Using implementation facilitation to foster clinical practice quality and adherence to evidence in challenged settings: a qualitative study. BMC Health Serv Res. 2017;17(1):294.

  32. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63–74.

  33. Bambra C, Gibson M, Sowden AJ, Wright K, Whitehead M, Petticrew M. Working for health? Evidence from systematic reviews on the effects on health and health inequalities of organisational changes to the psychosocial work environment. Prev Med. 2009;48(5):454–61.

  34. Bhui KS, Dinos S, Stansfeld SA, White PD. A synthesis of the evidence for managing stress at work: a review of the reviews reporting on anxiety, depression, and absenteeism. J Environ Public Health. 2012;2012:515874.

  35. Wynne R, De Broeck V, Leka S, Houtman I, McDaid D. Promoting mental health in the workplace. Guidance to implementing a comprehensive approach. Hague: European Commission; 2014.

  36. Joosen MC, Brouwers EP, van Beurden KM, Terluin B, Ruotsalainen JH, Woo JM, et al. An international comparison of occupational health guidelines for the management of mental disorders and stress-related psychological symptoms. Occup Environ Med. 2015;72(5):313–22.

  37. Cane J, O'Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:37.

  38. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  39. Huijg JM, Gebhardt WA, Crone MR, Dusseldorp E, Presseau J. Discriminant content validity of a theoretical domains framework questionnaire for use in implementation research. Implement Sci. 2014;9:11.

  40. Arapovic-Johansson B, Wahlin C, Kwak L, Bjorklund C, Jensen I. Work-related stress assessed by a text message single-item stress question. Occup Med (Lond). 2017;67(8):601–8.

  41. Elo AL, Leppanen A, Jahkola A. Validity of a single-item measure of stress symptoms. Scand J Work Environ Health. 2003;29(6):444–51.

  42. Ware JE, Kosinski M, Turner-Bowker DM. How to score version 2 of the SF-12 health survey. Lincoln: QualityMetric Inc.; 2002.

  43. Berthelsen H, Muhonen T, Bergstrom G, Westerlund H, Dollard MF. Benchmarks for evidence-based risk assessment with the Swedish version of the 4-item psychosocial safety climate scale. Int J Environ Res Public Health. 2020;17(22):8675.

  44. Lee H, Hall A, Nathan N, Reilly KL, Seward K, Williams CM, et al. Mechanisms of implementing public health interventions: a pooled causal mediation analysis of randomised trials. Implement Sci. 2018;13(1):42.

  45. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36(1):24–34.

  46. McNicholas C, Lennox L, Woodcock T, Bell D, Reed JE. Evolving quality improvement support strategies to improve Plan-Do-Study-Act cycle fidelity: a retrospective mixed-methods study. BMJ Qual Saf. 2019;28(5):356–65.

  47. New York State Teacher Certification Examinations. Framework for the observation of effective teaching. Pearson Education, Inc.; 2011.

  48. Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O'Neal L, et al. The REDCap consortium: building an international community of software platform partners. J Biomed Inform. 2019;95:103208.

  49. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health. 2011;38(1):44–53.

  50. Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med. 2019;17(1):88.

  51. Edmondson D, Falzon L, Sundquist KJ, Julian J, Meli L, Sumner JA, et al. A systematic review of the inclusion of mechanisms of action in NIH-funded intervention trials to improve medication adherence. Behav Res Ther. 2018;101:12–9.

  52. Lyon AR, Cook CR, Locke J, Davis C, Powell BJ, Waltz TJ. Importance and feasibility of an adapted set of implementation strategies in schools. J Sch Psychol. 2019;76:66–77.

Acknowledgements

We would like to thank Anna Warnqvist, the statistician of this study.

Funding

Open access funding provided by Karolinska Institute. The study described in this protocol is funded by the Swedish Research Council for Health, Working Life and Welfare (FORTE), Sweden, grant number 2020-01214. The funding body has no role in the design of the study; in the collection, analysis, and interpretation of the data; or in the writing of the manuscript.

Author information

Authors and Affiliations

Authors

Contributions

LK is the principal investigator of the study described in this protocol. LK and CB are involved in the execution of the project. LK, CB, and AT are involved in the evaluation of the project. IJ has been responsible for the development of the guideline that is being implemented. LK, AT, IJ, BP, and RLH have been involved in the choice and refinement of measurement instruments and in defining mechanisms. LK, AT, CB, CW, IJ, KS, LSE, and GB have been involved in the design of the study. Project administration was done by AT and LK; funding acquisition was done by LK. LK wrote the first draft of the manuscript. All authors have been involved in revising the manuscript and have given final approval of the version submitted.

Corresponding author

Correspondence to Lydia Kwak.

Ethics declarations

Ethics approval and consent to participate

The Regional Ethical Board in Stockholm (dr. nr. 2021-01828) has approved the study. The study complies fully with current ethical requirements regarding the handling and storage of personal data and regarding the informed consent process in accordance with Sweden’s Personal Data Act and Secrecy Act.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

The TIDieR (Template for Intervention Description and Replication) Checklist.

Additional file 2: Table 1. CONSORT 2010 checklist of information to include when reporting a cluster randomised trial. Table 2. Extension of CONSORT for abstracts to reports of cluster randomised trials.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Kwak, L., Toropova, A., Powell, B.J. et al. A randomized controlled trial in schools aimed at exploring mechanisms of change of a multifaceted implementation strategy for promoting mental health at the workplace. Implementation Sci 17, 59 (2022). https://doi.org/10.1186/s13012-022-01230-7
