Sustained effects of the INFORM cluster randomized trial: an observational post-intervention study

Abstract

Background

Numerous studies have examined the efficacy and effectiveness of health services interventions. However, much less research is available on the sustainability of study outcomes. The purpose of this study was to assess the lasting benefits of INFORM (Improving Nursing Home Care Through Feedback On perfoRMance data) and associated factors 2.5 years after removal of study supports. INFORM was a complex, theory-based, three-arm, parallel cluster-randomized trial. In 2015–2016, we successfully implemented two theory-based feedback strategies (compared to a simple feedback approach) to increase nursing home (NH) care aides’ involvement in formal communications about resident care.

Methods

Sustainability analyses included 51 Western Canadian NHs that had been randomly allocated to a simple and two assisted feedback interventions in INFORM. We measured care aide involvement in formal interactions (e.g., resident rounds, family conferences) and other study outcomes at baseline (T1, 09/2014–05/2015), post-intervention (T2, 01/2017–12/2017), and long-term follow-up (T3, 06/2019–03/2020). Using repeated measures hierarchical mixed models, adjusted for care aide, care unit, and facility variables, we assessed sustainability and associated factors: organizational context (leadership, culture, evaluation) and fidelity of the original INFORM intervention.

Results

We analyzed data from 18 NHs (46 units, 529 care aides) in simple feedback, 19 NHs (60 units, 731 care aides) in basic assisted feedback, and 14 NHs (41 units, 537 care aides) in enhanced assisted feedback. T2 (post-intervention) scores remained stable at T3 in the two assisted feedback arms, indicating sustainability. In the simple feedback group, where scores had remained lower than in the assisted feedback groups during the intervention, T3 scores rose to the level of the two assisted feedback groups. Better culture (β = 0.099, 95% confidence interval [CI] 0.005; 0.192), evaluation (β = 0.273, 95% CI 0.196; 0.351), and fidelity enactment (β = 0.290, 95% CI 0.196; 0.384) were associated with higher care aide involvement in formal interactions at T3.

Conclusions

Theory-informed feedback provides long-lasting improvement in care aides’ involvement in formal communications about resident care. Greater intervention intensity implies neither greater effectiveness nor greater sustainability. Modifiable context elements and fidelity enactment during the intervention period may facilitate sustained improvement, warranting further study, as does the possible post-intervention spread of our intervention to simple feedback homes.

Background

The design and evaluation of health services interventions is key to improving the quality of healthcare and the patient experience. While numerous studies have examined the efficacy and effectiveness of health services interventions, much less attention has been paid to the sustainability of outcomes [1,2,3]. Failure to sustain the outcomes of effective interventions significantly limits the potential benefits of intervention investment. When researchers withdraw intervention supports, intervention activities and the improved outcomes of successful interventions regularly decline [4, 5], highlighting the need for post-intervention studies on intervention sustainability. Sustainability becomes increasingly challenging with increasing intervention complexity [6,7,8], that is, with an increasing number of intervention components interacting in complex ways, requiring multiple staff, and often affecting multiple outcomes [9, 10]. According to a 2020 systematic review [11], little research has been published to date on the sustainability of complex interventions. This study responds to calls for research that examines the sustainability stage of successful interventions [5, 12] and contributes important knowledge on modifiable factors associated with sustainability of evidence-based interventions in health care settings [2].

Study objectives

Outcomes of interest vary depending on whether a study addresses intervention use, the effects of an intervention for the people it is designed to help, or both, as in hybrid study designs. In the present study, our objectives are:

  1. To examine, in each of the three study arms, the sustainability of the primary study outcome of a successful health services trial, INFORM (Improving Nursing Home Care Through Feedback On perfoRMance data) [13,14,15]. The primary outcome in INFORM was care aide involvement in formal communications about resident care.

  2. To examine the extent to which higher intervention intensity (study arm), better fidelity of the initial intervention implementation, and key contextual variables (better leadership, work culture, and feedback activities [evaluation]) predict higher sustainability of the primary INFORM study outcome.

Sustainability: definition and state of research

Reviews of sustainability research continue to identify the need for conceptual clarity and for clear and consistent definitions of the construct [4, 6, 16]. A 2012 systematic review of the sustainability of new programs and interventions found that 65% of 125 included studies did not define sustainability, with esoteric investigator-generated definitions provided in most of the remaining studies [5]. The concept of sustainability can refer to the lasting benefits of an intervention and has been defined as occurring when “an evidence-based intervention can deliver its intended benefits over an extended period of time after external support […] is terminated” [17] (p. 118). However, this definition focuses only on outcomes, while other conceptualizations of sustainability are broader and include the continued use of core intervention tools, processes, and behaviors [6, 18, 19]. For added clarity, recent work [1, 19] suggests distinguishing between sustainability (lasting benefits, i.e., sustaining or further improving study outcomes) and sustainment (continued enactment of intervention activities). In these broader conceptualizations, which encompass both sustainability and sustainment, adaptation as well as “institutionalization” of intervention activities have been identified as part of a dynamic sustainability process [1, 2, 19] taking place within complex systems [20]. Central to recent conceptualizations of sustainability is the recognition that many interventions interact with inner organizational contexts as well as outer contexts and are ideally adapted to fit those contexts [1, 12].

Sustainment studies are more common in the literature than sustainability studies. The 2012 systematic review cited above found that fewer than 25% of studies reported on the sustained impact of the program [5]. Sustainment of intervention activities may have received greater attention because it is easier to measure and can occur within the timeframe of funded studies, whereas sustainability studies examine lasting benefits several years beyond completion of an intervention. Studies that have examined sustainability suggest that it remains elusive [21,22,23,24], even for highly implementable prescribing practices such as use of aspirin, beta-blockers, and ACE inhibitors after acute myocardial infarction [25]. More common in the literature is evidence of challenges with sustainability (e.g., [8, 26,27,28]).

Determinants of intervention sustainability

Many models identify and categorize determinants of intervention sustainability in terms of multi-level contexts internal and external to host organizations [1, 29], a theme that has long been prevalent in the technology transfer and knowledge utilization literatures as “mutual adaptation” [30, 31]. Prominent models of diffusion, particularly regarding the compatibility of an innovation with the user context [2], and of sustainability [32, 33] identify aspects of the intervention (e.g., adaptability), the micro and macro context, aspects of the implementation process, and readiness of/fit with the organization as important determinants of sustainability. Empirical work that comprehensively tests sustainability frameworks, however, remains limited [32]. There has been some empirical work examining certain factors that influence the sustainability of complex interventions [7] and large-scale quality improvement programs [34]. A recent systematic review of 32 mostly qualitative studies identified contextual variables, such as role accountability, leadership, and organizational support, as the main facilitators influencing sustainability of hospital-based interventions [26]. Other recent reviews identified aspects of organizational context and capacity as important determinants of sustainability but highlight the need for more rigorous empirical work in this area [5, 32].

Additional research is necessary, particularly given the evolution of sustainability models, which now reflect ecological challenges to intervention-context fit [1]. For instance, there is a need to understand whether key contextual and other determinants of sustainability hold across a range of settings and intervention types [32] and how determinants may interact to influence sustainability [5]. In addition, we found almost no studies regarding the determinants of sustainability in complex versus simple interventions. Finally, while the need for additional research on fidelity (in relation to adaptation) of sustained intervention actions is well described in the literature, we found little empirical research assessing whether fidelity of an original health services intervention predicts sustainability. Fidelity is “the degree to which an intervention is implemented as it is prescribed in the original protocol” [17] (p. 120).

The INFORM trial

INFORM was a complex, pragmatic, three-arm, cluster-randomized trial designed to increase involvement of unregulated care aides in formal team communications about resident care in nursing homes. Care aides (personal support workers, nursing assistants) provide up to 90% of direct care to nursing home residents [35,36,37,38]. However, their intimate knowledge of resident care needs [39] often remains tacit rather than shared, as they are rarely included in formal care decision-making processes, leading to communication breakdowns and missed care [40]. While care aide involvement in formal team communications about resident care was our primary study outcome, INFORM was designed to be tailorable to address any of a wide variety of possible outcomes (e.g., care staff quality of work-life, leadership practices, or resident outcomes). We published the methods of INFORM in a trial protocol [13] and subsequently published results on INFORM’s effectiveness [14] and the impact of intervention fidelity on intervention effectiveness [15]. This study addresses an important additional aim outlined in our trial protocol [13]: to assess the longer-term effects of INFORM.

INFORM was an audit and feedback intervention based on goal-setting theory, designed to improve performance. We purposefully designed INFORM based on factors that positively influence implementation and sustainability [41], including intervention attributes [42] (e.g., relative advantage, trialability, or observability) and contextual elements [3, 5, 32] (e.g., engaging supportive leaders in the design and implementation of INFORM). We compared (1) a simple feedback approach to (2) a basic and (3) an enhanced assisted feedback approach. Two hundred one care unit teams in 67 Western Canadian nursing homes participated in INFORM. Teams in all three study arms received oral and written reports on the level of care aide involvement in formal team communications about resident care (and other contextual variables). Teams in the basic and enhanced arms also participated in three workshops in which they defined learning and performance goals for increasing care aide involvement in decisions, created action plans, defined measures of success, reported progress and challenges in implementing their action plans (workshops 2 and 3), and interacted with teams from other nursing homes. All three workshops in the enhanced assisted feedback arm were 3-h face-to-face events. In the basic assisted feedback arm, workshops 2 and 3 were 1.5-h virtual workshops.

Results showed [14] that care aide involvement in formal communications about resident care at follow-up was 0.17 points higher in both the basic (95% confidence interval [CI] 0.03; 0.32, p = 0.021) and enhanced study arms (95% CI 0.01; 0.33, p = 0.035), compared to simple feedback, with no differences between the study arms with the highest (enhanced) and mid-level (basic) intensity. Intervention fidelity was moderate to high, and higher fidelity enactment was associated with larger improvements in formal team communications [15]. Fidelity enactment refers to the extent to which participants adhere to and carry out core intervention components as intended by the study team during the intervention period [43].

Methods

Study design

This observational sustainability study is part of TREC (Translating Research in Elder Care), a longitudinal program of applied health services research that since 2007 has collected comprehensive data on nursing home residents, care staff, care units, and facilities [44]. To assess sustainability of INFORM outcomes, we used three waves of TREC data: (1) pre-INFORM baseline (T1, 09/2014–05/2015), (2) post-INFORM (T2, 01/2017–12/2017), and (3) long-term follow-up (T3, 06/2019–03/2020), captured approximately 2.5 years after the end of the INFORM trial. These time periods constitute the waves of TREC data collection. Consistent intervals between data collection waves are not always possible, as they are contingent on the TREC research team’s resources (funding, researcher capacity) and on facilities’ capacity (sufficient staffing and no competing projects).

Study setting and sample

Our analytic cohort in this sustainability study included 18 facilities (46 care units) in the simple feedback arm, 19 facilities (60 care units) in the basic assisted feedback arm, and 14 facilities (41 care units) in the enhanced assisted feedback arm. Data for 16 units that were part of the initial INFORM study were not available for this sustainability study (either because a site discontinued participation in TREC or because unit configuration changed, details in Fig. 1).

Fig. 1: INFORM sustainability study sample

Primary study outcome

Table 1 summarizes our study outcomes. Formal interactions, the primary study outcome, is taken from the well-validated Alberta Context Tool (ACT) [45, 46], which assesses 10 modifiable features of nursing home care unit work environments. Formal interactions is a self-reported measure of care aides’ participation in 4 types of formal meetings about resident care: team meetings about residents, family conferences, change-of-shift reports, and continuing education outside of the care aide’s facility. Each item is rated on a 5-point scale (never–almost always).

Table 1 Primary and secondary study outcomes and measures

Independent variables and covariates

Consistent with the Promoting Action on Research Implementation in Health Services (PARiHS) framework [47,48,49], we included the following contextual variables in our analysis (also part of the ACT): leadership (6 items rating care aides’ perception of the transformational leadership of the person they report to most of the time; 5-point agreement scale, strongly disagree–strongly agree), culture (6 items rating care aides’ perception of a supportive work culture; same 5-point agreement scale), and evaluation (6 items rating care aides’ participation in data-based feedback and performance improvement activities; same 5-point agreement scale) (Table 1). We also included fidelity enactment, measured at the close of the intervention using the 5-point scale from the initial INFORM study [15]. We adjusted our sustainability models for the same variables used to adjust models in our effectiveness study [14]: care aide age, sex, and first language [44]; care unit staffing [50]; and facility region, size, and ownership (Table 1).

Statistical analyses

Using SAS® 9.4, and following the analytic methods described in our intervention effectiveness paper [14], we first descriptively compared our primary study outcome (care aide-reported involvement in formal team communications about resident care, i.e., formal interactions) and covariates by study arm and time of data collection. To compare trajectories of the primary study outcome by study arm and time of data collection (objective 1), we estimated adjusted least squares means and mean differences of this outcome, using repeated measures mixed effects regression models with random intercepts at the care unit and facility levels and a random effect for care aides responding to our survey repeatedly. We added study arm and time of data collection as categorical variables, included an interaction term between study arm and time of data collection, and adjusted for facility variables (region, owner-operator model, size), care unit staffing (total care hours per resident day and percentage of total hours per resident day provided by care aides), and care aide characteristics (sex, age, English as first language [yes/no]).
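
To make the model specification above concrete, the following is a minimal SAS sketch of a three-level repeated measures mixed model of this kind. It is an illustration only, not the study's actual code: the dataset and variable names (trec_long, formal_int, arm, wave, and so on) and estimation details such as REML and the Kenward-Roger degrees-of-freedom adjustment are assumptions.

  /* Hypothetical sketch: repeated measures mixed model with study arm, time,
     and their interaction, plus facility, unit, and care aide covariates.
     Random intercepts for facility, care unit (nested in facility), and care aide. */
  proc mixed data=trec_long method=reml;
    class arm wave region owner_model size sex english_first
          facility_id unit_id aide_id;
    model formal_int = arm wave arm*wave
                       region owner_model size
                       total_hprd pct_aide_hprd
                       sex age english_first / solution ddfm=kr;
    random intercept / subject=facility_id;
    random intercept / subject=unit_id(facility_id);
    random intercept / subject=aide_id;    /* repeated survey responses by the same care aide */
    lsmeans arm*wave / diff cl;            /* adjusted least squares means and pairwise differences */
  run;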

Finally, to assess the impact of organizational context and fidelity enactment on the sustainability of our primary study outcome (objective 2), we specified two mixed effects regression models with formal interactions at T3 as the dependent variable. To the first model, we added leadership, culture, and evaluation at T3 and adjusted for variables that, based on our initial models, were predictive of formal interactions: study arm, region, care aide’s sex, and care aide’s first language. We also adjusted for the unit-aggregated T2 formal interactions score (to account for differences in formal interactions before the sustainability measurement). In addition, we ran the same model with fidelity enactment added (model 2). Since fidelity enactment was only available in the assisted feedback arms, we excluded the simple feedback sample from this model. To assess whether the strength of the effects of our four main independent variables (leadership, culture, evaluation, and, for model 2, fidelity enactment) differed by study arm, we added interaction terms between each of these four variables and study arm. None of the interaction terms improved model fit (based on the Akaike information criterion [AIC] and the Bayesian information criterion [BIC] [51]) or had p values < 0.05. Despite the common practice of accepting p values of up to 0.25 as statistically significant for interaction terms (i.e., raising the type 1 error rate), we decided, in line with the recommendation of a recent simulation study [52], not to follow this practice, since it increases the risk of including spurious interaction terms. Therefore, we did not include any interaction terms in the final models.
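
As a companion illustration, a sketch of model 2 (assisted feedback arms only) might look as follows; model 1 would simply drop the fidelity term and use the full sample including the simple feedback arm. Again, all dataset and variable names are hypothetical placeholders rather than the study's actual code.

  /* Hypothetical sketch: formal interactions at T3 regressed on T3 context variables,
     fidelity enactment during the intervention, and the unit-aggregated T2 score,
     with random intercepts for facility and care unit. */
  proc mixed data=t3_assisted method=reml;
    class arm region sex english_first facility_id unit_id;
    model formal_int_t3 = leadership_t3 culture_t3 evaluation_t3
                          fidelity_enactment_t2
                          unit_mean_formal_int_t2
                          arm region sex english_first / solution cl;
    random intercept / subject=facility_id;
    random intercept / subject=unit_id(facility_id);
  run;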

The amount of missing data in our data set was minimal. Leadership was missing in 35 of 11,988 records (0.29%), culture in 13 records (0.11%), and evaluation in 29 records (0.24%). Data were missing completely at random as per Little’s MCAR test. No responses were missing for any of our other variables. Therefore, we deleted records with missing data listwise.

Sensitivity analyses

We conducted sensitivity analyses to assess whether the trends in formal interactions found in our analyses might reflect regression to the mean. We compared trends in formal interactions in our three study arms to those in 12 TREC nursing homes (data from 544 care aides on 44 care units) located in the province of Manitoba, where we had not carried out any INFORM activities. The Manitoba sample can therefore be seen as a natural control group.
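
For illustration, the descriptive comparison underlying this sensitivity check could be produced with a simple means procedure. The dataset and variable names below, including a group variable that codes the three study arms plus the Manitoba comparison homes, are assumptions, not the study's actual code.

  /* Hypothetical sketch: unadjusted formal interactions means and 95% CIs
     by group (SF, BAF, EAF, Manitoba comparison) and data collection wave. */
  proc means data=trec_with_manitoba mean lclm uclm maxdec=2;
    class group wave;
    var formal_int;
  run;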

Results

Table 2 includes facility, care unit, and care aide characteristics by study arm and time of data collection. Based on our descriptive statistics, total staffing hours per resident day, the proportion of care aide hours among total care staffing hours, and the proportion of care aides whose first language was not English seemed to differ between study arms (at all three points in time) and changed over time within each study arm. Care aide age and sex did not differ substantially between study arms and were stable over time. The Manitoba sample did not differ substantially from our Alberta and British Columbia samples.

Table 2 Facility, care unit, and care aide characteristics by time of data collection and study arm

Our adjusted analyses (Fig. 2) demonstrate that at T3 (2.5 years after intervention delivery had ended), care units in both assisted feedback groups had sustained the T2 (post-intervention) gains in care aide involvement in formal communications about resident care, indicating sustainability. While there appears to be a downward T2–T3 trend in the enhanced assisted feedback arm and an upward T2–T3 trend in the basic assisted feedback arm, neither of these group-specific trends is statistically significant. Notably, involvement of care aides in formal communications about resident care did not increase during the intervention period in the simple feedback group. However, it increased significantly at T3, rising to the levels seen in the two assisted feedback arms.

Fig. 2: Adjusted formal interactions scores by study arm and time of assessment. Numbers presented in the figure are adjusted least squares means based on repeated measures mixed effects regression models; numbers in the table are p values of adjusted mean differences based on the same models. BAF, basic assisted feedback; EAF, enhanced assisted feedback; SF, simple feedback

Our sensitivity analyses (Fig. 3) illustrate that, in the Manitoba sample, formal interactions scores followed a consistent downward trend that differed substantially from the trends seen in the three INFORM study arms.

Fig. 3: Unadjusted formal interactions scores by study arm and time of assessment, including the Winnipeg sample of nursing homes (bands are 95% confidence intervals). BAF, basic assisted feedback; EAF, enhanced assisted feedback; SF, simple feedback

As per Table 3, evaluation (feedback of performance data) at T3 was statistically significantly associated with higher involvement of care aides in formal communications about resident care at T3 in both models. Higher leadership scores (i.e., care aide ratings of the leadership of the persons they report to, primarily nurses) were not associated with care aide involvement in formal communications about resident care at T3. A more supportive work culture was associated with higher involvement of care aides in formal communications about resident care at T3 in model 1 (which included the simple feedback sample) but not in model 2. However, in model 2, better fidelity enactment at T2 predicted higher T3 involvement of care aides in formal communications about resident care. As noted, none of our interaction terms (fidelity by study arm, and each of leadership, culture, and evaluation by study arm) were statistically significant (p values consistently > 0.2), nor did they improve model fit; we therefore did not include any interaction terms in our final models. None of the other covariates were associated with care aide involvement in formal team communications about resident care at T3.

Table 3 Association of organizational context (leadership, culture, evaluation) and fidelity enactment with formal interactions, based on adjusted mixed effects regressions

Discussion

This study examined sustained effects of the INFORM intervention as well as the association of fidelity enactment and key variables of organizational context with sustained involvement of care aides in formal communications about resident care (formal interactions). Our findings suggest sustained benefits of the INFORM intervention 2.5 years after the research team had finalized delivery of the intervention, signified by the lack of a statistically significant difference in formal interactions scores between T2 and T3 in both assisted feedback groups. As at T2, formal interactions scores in the two assisted feedback groups did not differ at T3. Notably, formal interactions scores did not increase in the simple feedback arm during the intervention (signified by the lack of a statistically significant difference in formal interactions scores between T1 and T2), but rose to the levels of the assisted feedback arms at T3, after the end of the INFORM study. Higher fidelity enactment during the intervention period, a more supportive work culture, and more feedback activities (evaluation) within a care unit were associated with higher formal interactions scores at T3, regardless of intervention intensity (study arm allocation).

Our study is one of the few that has examined the degree to which the effects of a complex intervention can be sustained in a complex care setting such as nursing homes. TREC’s comprehensive database of longitudinal data is one of the main reasons we could conduct this work and provide these insights, highlighting the critical need for more longitudinal studies of intervention effectiveness and sustainability that cover time periods of 5 years or more.

Our finding that higher fidelity enactment during the intervention period was associated with sustained intervention benefits is noteworthy. To the best of our knowledge, this is the first study to examine this association. As noted, fidelity enactment refers to the extent to which participants adhere to and carry out core intervention components as intended during the intervention period [43]. Core components of the INFORM intervention include (1) managers and care teams setting learning and performance goals, (2) managers engaging with care teams to work towards goal achievement, and (3) measuring success towards these goals. While fidelity enactment is critical for intervention success, its role in intervention sustainability is poorly understood in contemporary empirical research. The dynamic sustainability framework (DSF) suggests that intervention approaches emphasizing adaptation (adjusting and refining the intervention to fit the local context) sustain an intervention more successfully than approaches emphasizing fidelity to an original protocol [1]. Future research is needed to better understand how and why care teams that adhere more closely to intervention protocols are more successful in sustaining intervention benefits and what role adaptation plays in these processes. Related research is also needed on the extent to which successful care teams are able to adapt interventions to their needs, under what circumstances such adaptations violate intervention fidelity or attain outcomes superior to those originally reported in clinical trials, and how different ways of adapting an intervention affect sustainability [21, 60].

Our finding that evaluation (feedback of performance data to care teams) was associated with sustained intervention success is in line with previous studies [3, 5, 32]. Evaluation is part of what the DSF labels information systems [1]. Evaluation activities, i.e., consistent feedback on a team’s performance and discussion of possible solutions, facilitate rapid learning and real-time problem-solving among care teams and integrate team members in the generation, rather than just the application, of knowledge. The DSF assumes that an organization that develops this kind of culture will be more successful in sustaining an intervention by improving the intervention’s fit over time.

Our finding that leadership was not associated with sustainability is not consistent with available reviews [3, 5, 32], nor with the DSF (where this construct is labeled supervision) [1]. For example, Hailemariam et al. [3] found 12 studies suggesting an association of leadership with intervention sustainability. However, from these studies, it is largely unclear what type of leadership style was associated with sustainability. In their systematic review, for instance, Hailemariam et al. [3] found that “organizational leadership” was consistently associated with sustainment of evidence-based practices, but the authors provide no further details on the leadership styles measured. The ACT leadership scale [45, 46] that we used in our survey measures transformational leadership, which is only one of many leadership styles that may have been measured by other studies. More research is needed to determine whether this leadership style is associated with sustainability of study outcomes. Furthermore, the ACT leadership scale asks care aides to rate the leadership characteristics of the person (or group of persons) to whom they report most of the time. In nursing home settings, these individuals are generally registered nurses or licensed practical nurses [53]. Therefore, the leadership scale we used measures care aides’ ratings of nurses’ leadership, not their ratings of facility-level or unit-level formal leaders/supervisors (organizational leadership). Care aides rate, for example, whether nurses regularly ask them for feedback even if it is difficult to hear, actively mentor or coach them, or focus on success rather than failure. This rating of nurses’ leadership may not reflect managers’ leadership; and it is managers, not nurses, who enable, encourage, and/or require care aides to participate in formal team meetings about resident care. Furthermore, the leadership skills listed above are not necessarily specific to enabling and encouraging a care aide to participate in formal meetings about resident care. A specific leadership skill would be, for example, a manager’s ability to recognize and address a care aide’s specific needs (such as lack of confidence to participate in such meetings, difficulty envisioning what to say and how best to contribute, or concern about neglecting residents while participating).

Our finding that culture was associated with sustainability is in line with the DSF [1], but is not consistent with available reviews [3, 5, 32]. Culture has been identified as a factor associated with sustainability in only a small number of studies [3, 5, 32]. In their review, Stirman et al. [5] highlight that the small number of studies empirically supporting culture to be a key factor for intervention sustainability stands in contrast with the extent to which culture is discussed as an important factor in the implementation and sustainability literature. This most likely has to do with how culture is operationalized and measured. For instance, qualitative studies suggest that local unit-level processes and interactions that may reflect a particular culture or leadership style are more likely to support sustainability than the overall culture of an organization (which is commonly measured in quantitative sustainability studies) [5]. The ACT culture scale [45, 46] we used in this study asks care aides about particular interactions and processes (e.g., whether they are members of a supportive work group or whether their immediate team works to provide what residents need).

We found that the factors influencing sustainability were independent of whether intervention participants were exposed to the more versus the less intense study arm. We even saw slightly lower success and sustainability in our most intense study arm compared to the mid-intensity (basic assisted feedback) arm, although the difference was not statistically significant. It is possible that our enhanced feedback was too onerous for participants (meeting face-to-face for half-day workshops, versus 1.5-h video conferences in the basic assisted arm), limiting short- and long-term benefits. An alternative explanation is that the minimum “dose” of feedback required to facilitate sustainability was achieved with the less intense study arm. This finding contrasts with clinical studies reporting that higher intervention intensity supports better intervention success and sustainability [54,55,56]. Shelton et al. [32] highlight the important research gap related to whether sustainability depends on the type and intensity of the intervention. Little work is available on how to determine the optimal intensity of an intervention, and a particular gap exists regarding the influence of intervention intensity on the sustainability of highly complex health services interventions. Our study contributes in important ways to addressing this knowledge gap. However, more research is needed on whether the factors we identified when applying a complex intervention in nursing homes hold in other healthcare settings or for different types of interventions.

We found that homes in the simple feedback arm saw a substantial increase in their formal interactions scores in the post-intervention period. In contrast to nursing homes in the assisted feedback arms, simple feedback homes did not receive major intervention components, such as goal-setting and support workshops, support by the study team during intervention workshops to set goals and complete an action plan, or peer-to-peer support by teams from other nursing homes. However, we systematically fed back INFORM findings to facilities and decision makers as part of our routine feedback activities after the T2 data collections (i.e., after the INFORM intervention period). It is also possible that TREC’s relationships with key stakeholders over 10–15 years (regional and provincial decision makers, nursing home owner-operators, and care teams), their systematic involvement with INFORM, and our longitudinal work with nursing homes in our cohort played an important role. In fact, many of the key stakeholders in our study know each other and actively collaborate on a regular basis. For example, health region decision makers hold regular meetings with nursing home administrators in their region, giving decision makers the opportunity to interact with facilities and allowing administrators to interact and exchange information across nursing homes. Often, nursing home administrators or directors of care oversee more than one facility. Administrators of facilities operated by the same owner regularly meet and collaborate on improvement activities. While this is not a definitive explanation of the increased formal interactions scores we found in simple feedback facilities after the end of our intervention, it is a plausible one. The exact reasons for this surprising finding remain unclear. Pending resources, we hope to conduct focus groups with decision makers and key stakeholders in our facilities to further explore this question.

Study strengths and limitations

Our study has important strengths. It is one of the few studies available that assesses longer-term effects of a rigorous, complex health services intervention. We used comprehensive longitudinal data, collected using validated surveys from a large, representative sample of nursing homes and care aides. The involvement of key system- and practice-level stakeholders, aimed at improving intervention success, may have contributed to sustainability, and possibly to unintentional and informal spread. There are, however, some limitations to note. While TREC facilities are sampled to be representative of the nursing home population in Western Canada (using a stratified, random sampling approach), facilities that have decided to engage and stay engaged with TREC for years may be more motivated, more ready for change, and have different resource configurations than other facilities. Therefore, the generalizability of our findings to non-TREC homes may be limited. Future work needs to empirically investigate whether and how TREC facilities differ from non-TREC facilities.

As discussed in our main trial results paper [14], the effect sizes of increased formal interactions scores are small (we found Cohen’s d values of less than 0.2). However, our absolute improvement in formal interactions was 6.4%, which is comparable to effect sizes found in other audit and feedback studies [57, 58]. As at the time the intervention ended, we still do not know whether such small increases in formal interactions are sufficient to improve resident or care staff outcomes. As pointed out by Wensing and Grol [59], effects of interventions such as INFORM are incremental in nature. Studies are rarely funded over a long enough time for these interventions to improve resident and care staff outcomes. Our plan was to assess whether formal interactions were further sustained and whether care staff and resident outcomes had started to improve in INFORM homes at TREC’s next wave of survey data collection (at the end of 2021). However, the COVID-19 pandemic has likely overridden any such effect, making it impossible to answer this question.

From our study, we do not know to what extent the sustained benefits in study outcomes are due to sustainment of intervention activities in care sites (i.e., if teams that successfully enacted core components during the intervention kept doing so after the research team had stopped delivering the intervention). To advance knowledge in this area, future studies should endeavor to look at both the sustainability of outcomes and the sustainment of intervention actions after the end of an intervention study.

Conclusions

Our study findings suggest that the benefits in INFORM’s primary outcome, care aide involvement in formal interactions about resident care, were sustained in the two more intense study arms 2.5 years after intervention delivery had ended. Care teams in our least intense study arm started to increase care aide involvement in formal interactions about resident care after our trial had ended, raising the possibility of informal spread. Higher fidelity enactment during the intervention period, a more supportive work culture, and more feedback activities (evaluation) within a care unit were associated with sustained care aide involvement in formal communications about resident care, regardless of intervention intensity (study arm allocation). This study supports some of the assumptions posed by the DSF, but also raises questions, especially related to the role of intervention fidelity versus adaptation and their complex interplay.

Availability of data and materials

The data used for this article are housed in the secure and confidential Health Research Data Repository (HRDR) in the Faculty of Nursing at the University of Alberta (https://www.ualberta.ca/nursing/research/supports-and-services/hrdr), in accordance with the health privacy legislation of participating TREC jurisdictions. These health privacy legislations and the ethics approvals covering TREC data do not allow public sharing or removal of completely disaggregated data (resident-level records) from the HRDR, even if de-identified. The data were provided under specific data sharing agreements only for approved use by TREC within the HRDR. Where necessary, access to the HRDR to review the original source data may be granted to those who meet pre-specified criteria for confidential access, available at request from the TREC data unit manager (https://trecresearch.ca/about/people), with the consent of the original data providers and the required privacy and ethical review bodies. Statistical and anonymous aggregate data, the full dataset creation plan, and underlying analytic code associated with this paper are available from the authors upon request, understanding that the programs may rely on coding templates or macros that are unique to TREC.

Abbreviations

INFORM: Improving Nursing Home Care Through Feedback On perfoRMance data

TREC: Translating Research in Elder Care

References

  1. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117. https://doi.org/10.1186/1748-5908-8-117.

  2. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. https://doi.org/10.1111/j.0887-378X.2004.00325.x.

  3. Hailemariam M, Bustos T, Montgomery B, Barajas R, Evans LB, Drahota A. Evidence-based intervention sustainability strategies: a systematic review. Implement Sci. 2019;14(1):57. https://doi.org/10.1186/s13012-019-0910-6.

  4. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005;26(3):320–47. https://doi.org/10.1177/1098214005278752.

  5. Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7(1):17. https://doi.org/10.1186/1748-5908-7-17.

  6. Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):110. https://doi.org/10.1186/s13012-017-0637-1.

  7. Colon-Emeric C, Toles M, Cary MP Jr, Batchelor-Murphy M, Yap T, Song Y, et al. Sustaining complex interventions in long-term care: a qualitative study of direct care staff and managers. Implement Sci. 2016;11:94.

  8. Willis K, Small R, Brown S. Using documents to investigate links between implementation and sustainability in a complex community intervention: the PRISM study. Soc Sci Med. 2012;75(7):1222–9. https://doi.org/10.1016/j.socscimed.2012.05.025.

  9. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  10. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: new guidance. London: Medical Research Council (MRC); 2008.

  11. Herlitz L, MacIntyre H, Osborn T, Bonell C. The sustainability of public health interventions in schools: a systematic review. Implement Sci. 2020;15(1):4. https://doi.org/10.1186/s13012-019-0961-8.

  12. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–67. https://doi.org/10.2105/AJPH.2011.300193.

  13. Hoben M, Norton PG, Ginsburg LR, Anderson RA, Cummings GG, Lanham HJ, et al. Improving Nursing Home Care through Feedback On PerfoRMance Data (INFORM): protocol for a cluster-randomized trial. Trials. 2017;18(1):9. https://doi.org/10.1186/s13063-016-1748-8.

  14. Hoben M, Ginsburg LR, Easterbrook A, Norton PG, Anderson RA, Andersen EA, et al. Comparing effects of two higher intensity feedback interventions with simple feedback on improving staff communication in nursing homes—the INFORM cluster-randomized controlled trial. Implement Sci. 2020;15(1):75. https://doi.org/10.1186/s13012-020-01038-3.

  15. Ginsburg LR, Hoben M, Easterbrook A, Andersen E, Anderson RA, Cranley L, et al. Examining fidelity in the INFORM trial: a complex team-based behavioral intervention. Implement Sci. 2020;15(1):78. https://doi.org/10.1186/s13012-020-01039-2.

  16. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10(1):88. https://doi.org/10.1186/s13012-015-0274-5.

  17. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14(2):117–23. https://doi.org/10.1097/01.PHH.0000311888.06252.bb.

  18. Ovretveit J, Gustafson D. Using research to inform quality programmes. BMJ. 2003;326(7392):759–61. https://doi.org/10.1136/bmj.326.7392.759.

  19. Fleiszer AR, Semenic SE, Ritchie JA, Richer M-C, Denis J-L. An organizational perspective on the long-term sustainability of a nursing best practice guidelines program: a case study. BMC Health Serv Res. 2015;15(1):535. https://doi.org/10.1186/s12913-015-1192-6.

  20. Gruen RL, Elliott JH, Nolan ML, Lawton PD, Parkhill A, McLaren CJ, et al. Sustainability science: an integrated approach for health-programme planning. Lancet. 2008;372(9649):1579–89. https://doi.org/10.1016/S0140-6736(08)61659-1.

  21. Kislov R, Humphreys J, Harvey G. How do managerial techniques evolve over time? The distortion of “facilitation” in healthcare service improvement. Public Management Review. 2017;19(8):1165–83. https://doi.org/10.1080/14719037.2016.1266022.

  22. Robert G, Sarre S, Maben J, Griffiths P, Chable R. Exploring the sustainability of quality improvement interventions in healthcare organisations: a multiple methods study of the 10-year impact of the ‘Productive Ward: Releasing Time to Care’ programme in English acute hospitals. BMJ Qual Saf. 2020;29(1):31.

  23. Belostotsky V, Laing C, White DE. The sustainability of a quality improvement initiative. Healthc Manage Forum. 2020;33(5):195–9. https://doi.org/10.1177/0840470420913055.

  24. Clements DH, Sarama J, Wolfe CB, Spitler ME. Sustainability of a scale-up intervention in early mathematics: a longitudinal evaluation of implementation fidelity. Early Education and Development. 2015;26(3):427–49. https://doi.org/10.1080/10409289.2015.968242.

  25. Olomu AB, Stommel M, Holmes-Rovner MM, Prieto AR, Corser WD, Gourineni V, et al. Is quality improvement sustainable? Findings of the American College of Cardiology's Guidelines Applied in Practice. Int J Qual Health Care. 2014;26(3):215–22. https://doi.org/10.1093/intqhc/mzu030.

  26. Cowie J, Nicoll A, Dimova ED, Campbell P, Duncan EA. The barriers and facilitators influencing the sustainability of hospital-based interventions: a systematic review. BMC Health Serv Res. 2020;20(1):588. https://doi.org/10.1186/s12913-020-05434-9.

  27. Bridges J, May C, Fuller A, Griffiths P, Wigley W, Gould L, Barker H, Libberton P. Optimising impact and sustainability: a qualitative process evaluation of a complex intervention targeted at compassionate care. BMJ Qual Saf. 2017;26(12):970.

  28. Moucheraud C, Sarma H, Ha TTT, Ahmed T, Epstein A, Glenn J, et al. Can complex programs be sustained? A mixed methods sustainability evaluation of a national infant and young child feeding program in Bangladesh and Vietnam. BMC Public Health. 2020;20(1):1361. https://doi.org/10.1186/s12889-020-09438-2.

  29. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1):1. https://doi.org/10.1186/s13012-018-0842-6.

  30. Leonard-Barton D. Implementation as mutual adaptation of technology and organization. Res Pol. 1988;17(5):251–67. https://doi.org/10.1016/0048-7333(88)90006-6.

  31. Hutchinson JR, Huberman M. Knowledge dissemination and use in science and mathematics education: a literature review. Journal of Science Education and Technology. 1994;3(1):27–47. https://doi.org/10.1007/BF01575814.

  32. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39(1):55–76. https://doi.org/10.1146/annurev-publhealth-040617-014731.

  33. Maher L, Gustafson D, Evans A. Sustainability model and guide: NHS Institute for Innovation and Improvement; 2010.

  34. Flynn R, Scott SD. Understanding determinants of sustainability through a realist investigation of a large-scale quality improvement initiative (Lean): a refined program theory. J Nurs Scholarsh. 2020;52(1):65–74. https://doi.org/10.1111/jnu.12527.

  35. PHI. U.S. nursing assistants employed in nursing homes: key facts. Bronx, NY: PHI; 2018.

  36. Chamberlain SA, Hoben M, Squires JE, Cummings GG, Norton P, Estabrooks CA. Who is (still) looking after mom and dad? Few improvements in care aides' quality-of-work life. Can J Aging. 2018;37(1):1–16. https://doi.org/10.1017/S0714980817000563.

  37. Open access NMDS-SC dashboards [https://www.nmds-sc-online.org.uk/reportengine/dashboard.aspx]

  38. Hewko SJ, Cooper SL, Huynh H, Spiwek TL, Carleton HL, Reid S, et al. Invisible no more: a scoping review of the health care aide workforce literature. BMC Nurs. 2015;14(1):38. https://doi.org/10.1186/s12912-015-0090-x.

  39. Morley JE. Certified nursing assistants: a key to resident quality of life. J Am Med Dir Assoc. 2014;15(9):610–2. https://doi.org/10.1016/j.jamda.2014.06.016.

  40. Kolanowski A, Van Haitsma K, Penrod J, Hill N, Yevchak A. “Wish we would have known that!” communication breakdown impedes person-centered care. Gerontologist. 2015;55(Suppl 1):S50–60. https://doi.org/10.1093/geront/gnv014.

  41. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. https://doi.org/10.1186/1748-5908-4-50.

  42. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.

  43. Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol. 2004;23(5):443–51. https://doi.org/10.1037/0278-6133.23.5.443.

  44. Estabrooks CA, Squires JE, Cummings GG, Teare GF, Norton PG. Study protocol for the translating research in elder care (TREC): building context – an organizational monitoring program in long-term care project (project one). Implement Sci. 2009;4(1):52. https://doi.org/10.1186/1748-5908-4-52.

  45. Estabrooks CA, Squires JE, Cummings GG, Birdsell JM, Norton PG. Development and assessment of the Alberta Context Tool. BMC Health Serv Res. 2009;9(1):234. https://doi.org/10.1186/1472-6963-9-234.

  46. Estabrooks CA, Squires JE, Hayduk LA, Cummings GG, Norton PG. Advancing the argument for validity of the Alberta Context Tool with healthcare aides in residential long-term care. BMC Med Res Methodol. 2011;11(1):107. https://doi.org/10.1186/1471-2288-11-107.

  47. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7(3):149–58. https://doi.org/10.1136/qshc.7.3.149.

  48. Rycroft-Malone J, Kitson A, Harvey G, McCormack B, Seers K, Titchen A, et al. Ingredients for change: revisiting a conceptual framework. Qual Saf Health Care. 2002;11(2):174–80. https://doi.org/10.1136/qhc.11.2.174.

  49. Rycroft-Malone J, Harvey G, Seers K, Kitson A, McCormack B, Titchen A. An exploration of the factors that influence the implementation of evidence into practice. J Clin Nurs. 2004;13(8):913–24. https://doi.org/10.1111/j.1365-2702.2004.01007.x.

  50. Cummings GG, Doupe M, Ginsburg L, McGregor MJ, Norton PG, Estabrooks CA. Development and validation of a Scheduled Shifts Staffing (ASSiST) measure of unit-level staffing in nursing homes. Gerontologist. 2016;3(1):509–16.

  51. Aho K, Derryberry D, Peterson T. Model selection for ecologists: the worldviews of AIC and BIC. Ecology. 2014;95(3):631–6. https://doi.org/10.1890/13-1452.1.

  52. Durand CP. Does raising type 1 error rate improve power to detect interactions in linear regression models? A simulation study. PLoS One. 2013;8(8):e71079. https://doi.org/10.1371/journal.pone.0071079.

  53. Aloisio LD, Demery Varin M, Hoben M, Baumbusch J, Estabrooks CA, Cummings G, Squires JE. To whom health care aides report: effect on nursing home resident outcomes. Int J Older People Nurs. In press.

  54. Nohlert E, Öhrvik J, Tegelberg Å, Tillgren P, Helgason ÁR. Long-term follow-up of a high- and a low-intensity smoking cessation intervention in a dental setting--a randomized trial. BMC Public Health. 2013;13(1):592. https://doi.org/10.1186/1471-2458-13-592.

  55. Hentschke C, Hofmann J, Pfeifer K. A bio-psycho-social exercise program (RÜCKGEWINN) for chronic low back pain in rehabilitation aftercare--study protocol for a randomised controlled trial. BMC Musculoskelet Disord. 2010;11(1):266. https://doi.org/10.1186/1471-2474-11-266.

  56. Heerman WJ, Sommer EC, Qi A, Burgess LE, Mitchell SJ, Samuels LR, et al. Evaluating dose delivered of a behavioral intervention for childhood obesity prevention: a secondary analysis. BMC Public Health. 2020;20(1):885. https://doi.org/10.1186/s12889-020-09020-w.

  57. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, O'Brien MA, Johansen M, Grimshaw J, Oxman AD. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;2012(6):CD000259.

  58. Sykes MJ, McAnuff J, Kolehmainen N. When is audit and feedback effective in dementia care? A systematic review. International Journal of Nursing Studies. 2018;79:27–35. https://doi.org/10.1016/j.ijnurstu.2017.10.013.

  59. Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med. 2019;17(1):88. https://doi.org/10.1186/s12916-019-1322-9.

Acknowledgements

We would like to thank the facilities, administrators, and their care teams who participated in the INFORM study.

Funding

INFORM was funded by a Canadian Institute of Health Research (CIHR) Transitional Operating Grant (#341532). During the time of INFORM, Matthias Hoben held a Postdoctoral Fellowship from Alberta Innovates (formerly Alberta Innovates–Health Solutions; 2014–2017; File No: 201300543, CA #: 3723). He also received top-up funding from TREC and Estabrooks’ Tier 1 Canada Research Chair, and his last year of postdoctoral training (2017–2018) was funded by a TREC Postdoctoral Fellowship. Matthias Hoben currently holds a University of Alberta Faculty of Nursing Professorship in Continuing Care Policy Research and a University of Alberta Faculty of Nursing Establishment Grant, both of which were partially used to fund work related to this study. Carole Estabrooks holds a Tier 1 Canada Research Chair in Knowledge Translation.

Author information

Contributions

MH carried out the analyses, created figures and tables, and drafted the manuscript. LG, PGN, MBD, WBB, JWD, JMK, and CAE collaborated with MH on developing the analytic plan, interpreting outputs, and developing the outline of the manuscript. CAE is the principal investigator of the INFORM trial and MH, LRG, and PGN are co-leads. All authors revised the paper critically for intellectual content and approved the final version.

Corresponding author

Correspondence to Matthias Hoben.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Research Ethics Boards of the University of Alberta (Pro00059741), Covenant Health (1758), University of British Columbia (H15-03344), Fraser Health Authority (2016-026), and Interior Health Authority (2015-16-082-H). Operational approval was obtained from all included facilities as required. All participating facilities agreed to participate and provided written informed consent for the TREC observational study and for INFORM.

Consent for publication

Not applicable

Competing interests

All authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Hoben, M., Ginsburg, L.R., Norton, P.G. et al. Sustained effects of the INFORM cluster randomized trial: an observational post-intervention study. Implementation Sci 16, 83 (2021). https://doi.org/10.1186/s13012-021-01151-x
