Predictors of the sustainability of an evidence-based eating disorder prevention program delivered by college peer educators

Abstract

Background

Despite ongoing efforts to introduce evidence-based interventions (EBIs) into mental health care settings, little research has focused on the sustainability of EBIs in these settings. College campuses are a natural place to intervene with young adults who are at high risk for mental health disorders, including eating disorders. The current study tested the effect of three levels of implementation support on the sustainability of an evidence-based group eating disorder prevention program, the Body Project, delivered by peer educators. We also tested whether intervention, contextual, or implementation process factors predicted sustainability.

Methods

We recruited 63 colleges with peer educator programs and randomly assigned them to (a) a 2-day Train-the-Trainer (TTT) training in which peer educators were trained to implement the Body Project and supervisors were taught how to train future peer educators (TTT), (b) TTT training plus a technical assistance (TA) workshop (TTT + TA), or (c) TTT plus the TA workshop and quality assurance (QA) consultations over 1 year (TTT + TA + QA). Using logistic regression models, we tested whether implementation support strategies, perceived characteristics of the intervention and attitudes towards evidence-based interventions at baseline, and the proportion of completed implementation activities during the implementation year predicted three school-level dichotomous sustainability outcomes (offering Body Project groups, training peer educators, training supervisors) over the subsequent two-year sustainability period.

Results

Implementation support strategies did not significantly predict any sustainability outcomes, although a trend suggested that colleges randomized to the TTT + TA + QA strategy were more likely to train new supervisors (OR = 5.46, 95% CI [0.89–33.38]). Colleges that completed a greater proportion of implementation activities were more likely to offer Body Project groups (OR = 1.53, 95% CI [1.19–1.98]) and train new peer educators during the sustainability phase (OR = 1.39, 95% CI [1.10–1.74]). Perceived positive characteristics of the Body Project predicted training new peer educators (OR = 18.42, 95% CI [1.48–299.66]), which may be critical for sustainability in routine settings with high provider turnover.

Conclusions

Helping schools complete more implementation activities and increasing the perceived positive characteristics of a prevention program may result in greater sustainment of prevention program implementation.

Trial Registration

This study was preregistered on 12/07/17 with ClinicalTrials.gov, ID NCT03409809, https://clinicaltrials.gov/ct2/show/NCT03409809.

Background

Over the last several decades, efforts to develop and test mental health prevention and treatment programs have resulted in an impressive list of evidence-based interventions (EBIs). However, individuals in need of mental health services remain unlikely to receive an EBI. This public health dilemma has prompted increased consideration of factors that promote EBI implementation in the complex settings where most people receive mental health services, including universities and colleges (hereafter, colleges). Implementation involves several processes and outcomes, among them initial adoption, fidelity, feasibility, acceptability, effectiveness, and sustainability [1]. Of these, sustainability—the “extent to which a newly implemented treatment is maintained or institutionalized within a service setting’s ongoing, stable operations” [1] (p70)—has received relatively little attention [2]. Given the high costs associated with initial training and support of EBIs [3, 4], sustainability is an important way to measure return on investment. Additionally, for EBIs to meaningfully reduce the burden of mental illness they must be implemented in a sustained fashion. Few studies have compared different implementation strategies using experimental designs and evaluated their impact on EBI sustainability [2, 5], and even fewer have studied EBI sustainability in college settings [6].

Most young adults attend college [7], making college campuses a natural setting in which to address mental health disorders—especially given the high prevalence of these disorders among college students [8, 9], which has increased following COVID-19 [10]. Prolonged isolation during the pandemic has been linked to various mental health concerns, including body image distress and eating disorders among adolescents and young adults [11, 12]. Eating disorder prevention efforts are particularly well-suited for college settings, since access to treatment services at college mental health clinics is limited [13, 14], and eating disorders, once developed, are often chronic and associated with high levels of distress, medical complications, impairment, and mortality [15, 16]. Due to the low ratio of mental health providers to college students in need, prevention programs that can be provided by college peer educators could dramatically increase access to services.

Among eating disorder prevention programs, the Body Project has a robust evidence base [17]. The Body Project is an EBI for the prevention of eating disorders wherein adolescent girls/young women with body image concerns collectively critique pursuit of the thin beauty ideal in verbal, written, and behavioral exercises based on cognitive dissonance principles. Prior studies of the Body Project with college peer educators [18,19,20,21,22] have demonstrated significant reductions in eating disorder symptoms and future eating disorder onset. Moreover, a recent randomized trial found that the Body Project delivered by college peer educators remained effective across varying levels of implementation support [23] and had similar results on other key implementation outcomes (reach, adherence, competence). Although the Body Project demonstrated substantial benefits, the factors that predict sustainability are unknown. A mixed-methods naturalistic study of college clinician reactions to the Body Project two years after an effectiveness trial found that while 50% of colleges continued to implement the Body Project at one-year follow-up, by year 2 only one college (12%) sustained the practice [24].

Most research related to sustainability of EBIs has been in the form of qualitative interviews and retrospective surveys after active implementation efforts [5, 25], alongside conceptual papers. Results of studies set in mental health clinics [26, 27], Veterans Affairs hospitals [28], primary care [29], K-12 schools [30, 31], and colleges [24] suggest that numerous factors influence whether an intervention is sustained, including characteristics of the intervention and the practitioners who provide it. This is consistent with theoretical frameworks of sustainability, such as the dynamic sustainability framework (DSF [32]), which proposes that features of the interventions, practice settings, and the broader ecological system affect sustainability both directly and interactively. The DSF further contends that the better the fit between the intervention and the setting, the more likely that the intervention will be maintained [32] (p6). A unique aspect of the DSF is the expectation that as the practice setting and broader context change dynamically, so too must the intervention to maintain fit. Thus, even interventions that are considered already “optimized” based on efficacy and effectiveness testing may be difficult to sustain amidst ever-changing service settings. Perhaps that is why providers often report modifying interventions when they are sustained [26, 33, 34].

Implementation strategies like provider training, ongoing support, and quality assurance are embedded in conceptual frameworks like the DSF and could impact intervention sustainability. A study of parent–child interaction therapy that randomized 50 mental health organizations to a cascading “train-the-trainer” (TTT) model of training, a learning collaborative model, or a distance learning model found that clinicians in the TTT model implemented the treatment approach with more clients at the 24-month follow-up assessment [35]. Continued reach of the EBI over time is a critical marker of sustainability.

Adequate staffing of individuals who are trained to provide the EBI is another key indicator of a setting’s ability to sustain a practice [36]. Lack of training opportunities following initial implementation has been identified as a barrier to sustainability [37], despite evidence that later generations of providers trained internally can implement EBIs with a comparable level of fidelity as those originally trained by intervention developers [38]. Supervisory support within a setting has also been found to promote adherent practice and client benefit [39, 40]; in contrast, when internal supervisors are not adequately trained in the EBI being offered by their supervisees, the working alliance is weakened, which may affect service quality [41]. Thus, an organization’s ability to continue to offer an EBI is also affected by its ability to maintain a trained staff of providers and supervisors beyond those initially trained, despite the high rates of expected turnover among mental health providers [42], which can undermine sustainability [31]. On college campuses, where mental health promotion and prevention efforts are often provided by peer educators [43], sustainability efforts are made more complicated by the transient nature of college students. We were unable to find any studies that measured the extent to which activities like trainings continued after initial EBI implementation, or that examined variables associated with those activities, despite their relevance to an organization’s internal capacity to sustain services.

The current study

Considerable resources have been invested in the development, testing, and initial implementation of EBIs, but without attention to sustainability their potential for real impact is limited. As noted, college students are a population with a high prevalence of mental health problems, including eating disorders, and often lack access to EBIs. The Body Project is a highly effective preventive intervention that can be efficaciously provided by college peer educators; however, a rigorous evaluation of its sustainability within college peer educator programs has not been conducted. As part of a larger implementation study [23], we recruited 63 colleges with peer educator programs and randomly assigned them to one of three implementation strategies: Train-the-Trainer (TTT) training, TTT training plus a technical assistance workshop (TTT + TA), and TTT plus the TA workshop and one year of quality assurance (TTT + TA + QA). After one year of active implementation, results indicated that implementation support strategies did not significantly predict primary outcomes, including attendance, adherence, competence, and reach. However, adding technical assistance and quality assurance strategies was related to larger reductions in eating disorder risk factors and symptoms. Following active implementation, all participating colleges entered a two-year sustainability phase. The current study compared the impact of different levels of implementation support for the Body Project on secondary outcomes related to sustainability.

Our main research questions concerned whether level of implementation support would predict whether Body Project groups were conducted during the two-year sustainability phase, and secondarily whether level of implementation support predicted two key ongoing implementation activities: training new peer educators and training new supervisors. Based on main effects reported by Stice et al. [23], we hypothesized that the TTT + TA + QA implementation support strategy would be superior to both the TTT-alone and TTT + TA strategies in terms of conducting Body Project groups and conducting internal trainings for new peer educators and staff. Although the technical assistance (TA) workshop was intended to enhance the fit of the intervention to each individual context, which has been theorized to result in better sustainability [32], differences between the TTT and TTT + TA strategies during implementation were nonsignificant or small in magnitude. In contrast, the addition of QA consultations, which explicitly focused on managing barriers to implementation and planning for sustainability, resulted in stronger effects during implementation. In alignment with the DSF, the QA consultations permitted progress review, problem-solving of implementation barriers, and sustainability planning, which theoretically would improve the “optimization” of the Body Project for long-term sustainment.

Because intervention factors (perceived characteristics of the EBI), provider characteristics (attitudes and beliefs), and implementation process factors (the completion of implementation activities) have emerged as possible predictors of sustainability in existing qualitative and naturalistic studies of EBI sustainability, we also explored whether providers’ perceived characteristics of the intervention, supervisor attitudes towards the delivery of evidence-based practices, and degree of completion of activities during implementation predicted sustainability outcomes. Because they predicted initial implementation outcomes in a previously published report of the same study [23], we hypothesized that supervisor attitudes towards evidence-based practices, perceived intervention characteristics, and a greater number of completed implementation activities would predict all three sustainability outcomes. All models included covariates related to the organizational context of the colleges, since prior research on the Body Project found that available resources are related to some implementation outcomes [45], and because theoretical models such as the DSF and others describe these outer and inner setting implementation indicators as influential to sustainability [32, 46].

Method

Participants and procedures

US colleges with peer educator programs not already implementing the Body Project were contacted to participate. To be included, schools had to have (a) two or more peer education supervisors (college employees who oversee peer educator programming), and (b) eight or more peer educators interested in this training. Sixty-three colleges agreed to participate and were trained. Once the peer educator supervisors and peer educators were trained by project staff, schools recruited undergraduate participants to complete a body acceptance program (which is how the Body Project is typically described). Interested female-identifying students were directed to a screening web page to confirm that they had body image concerns (the sole inclusion criterion) and were administered the Eating Disorder Diagnostic Scale [44]. No exclusion criteria were recommended. This study’s design and hypotheses were preregistered; see https://clinicaltrials.gov/ct2/show/NCT03409809. The Standards for Reporting Implementation Studies checklist has been included as an additional file. Other details of the study, including student group member characteristics, recruitment procedures, inclusion criteria, and sample size considerations, are provided by Stice et al. [23] and Rohde et al. [45].

Although student group members were research participants for other aspects of this study, they did not provide any data during the sustainability period. Thus, the present report relies solely on outcome data provided by the peer educators and supervisors. Supervisors completed an interview and surveys before and after the initial TTT training assessing knowledge and attitudes towards eating disorder prevention, as well as the needs and resources (including the peer education program) on their campus. Supervisors then completed Qualtrics surveys assessing implementation progress at one-year (i.e., end of implementation), two-year, and three-year (i.e., end of sustainability) follow-ups; the two- and three-year sustainability surveys were completed by one peer education supervisor at each college who had knowledge of the program’s delivery on campus but may not have completed previous assessments (due to staff turnover). Peer educators completed a survey after training assessing their perceptions of the characteristics of the Body Project. Neither supervisors nor peer educators received monetary compensation.

Sustainability outcome measures

Implementation and Sustainability Progress. Because we were unable to locate a measure that assesses adoption and implementation of prevention programs, we created the Prevention Implementation Progress Scale (PIPS). We drew general principles for our scale from the Progress Monitoring Instrument (PMI [47]) and the Stages of Implementation Completion measure (SIC [48]). Following the strategies used by these research teams, we delineated the major milestones of the implementation procedure for this prevention program. The measure assesses the following discrete and observable steps in the implementation process: (1) engagement, (2) consideration of feasibility, (3) readiness planning, (4) staff recruitment and training, (5) service initiation, (6) ongoing service provision, (7) adherence monitoring, and (8) competency. Most steps were assessed with multiple items administered via a Qualtrics survey (followed up with clarifying emails, if necessary). We used a preliminary version of the PIPS in a previous implementation project with the Body Project, where it showed predictive validity: Body Project participants in organizations that completed 90% or more of the activities in the implementation process (vs. fewer) showed significantly greater pre-to-post declines in avoidance of life activities due to body image concerns and in eating disorder symptoms (M d = 0.27). Herein, we report three outcome measures from the PIPS covering the two-year sustainability period: (1) whether the school conducted one or more Body Project groups (primary outcome), (2) whether the school trained new peer educators to conduct Body Project groups, and (3) whether the school trained one or more new peer education supervisors on this EBI.

Predictor variables

Implementation support strategies

Colleges were randomly assigned to receive TTT only (n = 21), TTT + TA (n = 21), or TTT + TA + QA (n = 21), which resulted in 264 Body Project groups delivered by 613 peer educators to 1,387 students in the one-year implementation period following TTT. Peer educators and supervisors in all three support strategy arms were given the tools needed to conduct Body Project groups, including the intervention script and access to a facilitator support website (https://sticebodyprojectsupport.weebly.com). Pairwise contrasts between implementation support strategies, coded as dummy variables, were included in the models described below: TTT (= 0) vs. TTT + TA (= 1), TTT (= 0) vs. TTT + TA + QA (= 1), and TTT + TA (= 0) vs. TTT + TA + QA (= 1).
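To make the contrast coding concrete, the following minimal sketch (Python/pandas; the column names and arm labels are illustrative placeholders, not taken from the study's analysis files) constructs the three pairwise dummy variables, each defined only for colleges in the two arms being compared:

```python
import pandas as pd

# Illustrative college-level data: 21 colleges per arm, as in the trial.
df = pd.DataFrame({
    "college_id": range(1, 64),
    "condition": ["TTT"] * 21 + ["TTT+TA"] * 21 + ["TTT+TA+QA"] * 21,
})

# Each pairwise contrast is a dummy defined only on the two arms compared;
# the unmapped arm becomes NaN and drops out of that model.
df["ta_vs_ttt"] = df["condition"].map({"TTT": 0, "TTT+TA": 1})
df["qa_vs_ttt"] = df["condition"].map({"TTT": 0, "TTT+TA+QA": 1})
df["qa_vs_ta"] = df["condition"].map({"TTT+TA": 0, "TTT+TA+QA": 1})
```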

Train-The-Trainer strategy

All schools received an on-campus 2-day TTT training delivered by an expert trainer; this training model has been previously validated [21, 22]. Training included: (1) reviewing the intervention theory, components, and evidence base; (2) roleplay delivery of sessions by teams of peer educators; (3) feedback from the trainer and supervisors to train peer educators; (4) completion of homework assignments; and (5) reviewing facilitation skills, common problems, assessment, and crisis response plans.

Technical assistance strategy

Schools randomized to TTT + TA received an additional half day of training to articulate goals, needs, leadership structure, communication, adoption options, and recruitment strategies. The expert trainer met with trainees to assess the practical needs and resources for successful implementation and sustainability, and answered any remaining questions.

Quality assurance strategy

Supervisors at schools randomized to TTT + TA + QA were offered monthly consultation calls with their trainer over a 1-year period following the TTT and TA training. These phone or video consultations covered progress in conducting groups, problem-solving of barriers, planning of future implementation events, knowledge sharing, and sustainability planning.

Intervention, contextual, and implementation process factors

Perceived Characteristics of the Intervention. Following TTT training, peer educators and their supervisors completed a survey assessing perceptions of the intervention’s relative advantage, compatibility, complexity, observability, and trialability [49] within an organization. We used the 28-item adoption scale (the Provider Intervention Adoption Scale [50]), modified to refer to the Body Project. Response options were on a 5-point scale (1 = strongly disagree, 5 = strongly agree). Exploratory factor analyses identified two factors in the present data. The first factor included 19 items that measured positive perceptions of Body Project characteristics (α = 0.96 and 0.93 for peer educators and supervisors, respectively). The second factor included 9 items that measured negative perceptions of Body Project characteristics (α = 0.87 and 0.77 for peer educators and supervisors, respectively). All scales were averaged across items unless otherwise indicated.

The Modified Practice Attitudes Scale (MPAS [51]) is an 8-item measure designed to assess therapists’ attitudes towards evidence-based practices. The eight items use a five-point Likert scale to indicate the extent to which the participant agrees or disagrees with each statement (0 = “not at all” and 4 = “to a very great extent”); a mean score was computed, with higher scores indicating more favorable attitudes towards evidence-based practices. The MPAS has demonstrated acceptable internal consistency (α = 0.77-0.80) in studies of community mental health providers [52]; internal consistency in the current sample was α = 0.77.
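Both instruments are scored as item means, with internal consistency summarized by Cronbach’s alpha. The sketch below illustrates this scoring with the standard alpha formula; the data are simulated and the variable names hypothetical, so nothing here reproduces study data:

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Simulated responses: 100 respondents x 8 MPAS items scored 0-4.
rng = np.random.default_rng(seed=1)
mpas_items = pd.DataFrame(rng.integers(0, 5, size=(100, 8)))

alpha = cronbach_alpha(mpas_items)    # internal consistency estimate
mpas_score = mpas_items.mean(axis=1)  # scale score = mean across the 8 items
```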

Implementation process factors. We used 11 indicators of implementation progress collected from the PIPS during the 1-year implementation period as a potential predictor of sustainability. Items addressed whether the school began recruitment for a Body Project group, began each of up to six groups during implementation, trained new peer educators and supervisors in the intervention, and provided structured (observation-based) supervision of peer educator delivery of the Body Project. Schools received one point for each completed activity, with scores ranging from 0 to 11, as in the sketch below.
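A minimal sketch of this scoring, assuming one hypothetical binary completion indicator per PIPS activity (1 = completed; the indicator names and example rows are invented for illustration):

```python
import pandas as pd

# Hypothetical completion indicators for the 11 PIPS implementation
# activities; three example colleges shown.
pips_items = [f"pips_activity_{i}" for i in range(1, 12)]
pips = pd.DataFrame(
    [[1] * 11, [1] * 6 + [0] * 5, [0] * 11],
    columns=pips_items,
)

# One point per completed activity; totals range from 0 to 11.
pips["pips_total"] = pips[pips_items].sum(axis=1)
```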

Covariate adjustments

Outer and inner setting implementation indicators

Covariates hypothesized to contribute to the variance of sustainability outcomes were derived from the outer and inner setting domains of the Consolidated Framework for Implementation Research (CFIR [46]); for a longer description, see Rohde et al. [53]. At baseline, we evaluated the presence or absence of formal policies related to EBPs, fiscal and other organizational resources for peer educator programs and the student mental health centers, and the mental health needs and resources of students. A standardized qualitative interview, conducted with peer educator supervisors prior to the TTT training, assessed the following CFIR domains: (1) Available Resources (How well-resourced [staffing, time, and budget] is the university?); (2) Access to Knowledge and Information (Does the supervisor feel that they can easily find new information, and do they seek it out?); and (3) Leadership Engagement (How supported do supervisors feel by their supervisor at the university?). Interviews were audio recorded and transcribed, and responses were coded into 3-point quantitative scores per domain, with double coding to ensure acceptable inter-rater reliability: Available Resources (0 = not supported in staffing, time, or budget; 1 = support in one area; 2 = support in two or more areas); Access to Knowledge and Information (0 = no access; 1 = one outlet they can use to access information; 2 = materials are available from multiple outlets); and Leadership Engagement (0 = minimal/no involvement; 1 = feels supported but leader could be more involved; 2 = feels supported).

The Team Climate Inventory [54] was completed by peer educators and supervisors at baseline to measure perceived organizational climate, an inner setting factor. This 14-item survey assesses team Vision (e.g., I am in agreement with the objectives of our peer education program), Participative Safety (e.g., People in the peer education program feel understood and accepted by each other), Task Orientation (e.g., Peer education team members are prepared to question the basis of what the team is doing), and Support for Innovation (e.g., In this peer education team we take the time needed to develop new ideas). Response options were on a 5-point Likert scale of agreement (1 = strongly disagree; 5 = strongly agree). It has shown internal consistency (α = 0.90–0.93) and predictive validity for team innovation and effectiveness [54,55,56]. For the current study, a total composite mean score was computed, with a higher score indicating greater levels of team climate. Internal consistency was high for peer educators and supervisors (α = 0.96 and 0.95, respectively), and scores were grand mean centered.

Data analysis

Preliminary analyses

We examined the distribution of variables and the extent of missing data, summarized the sample of supervisors and peer educators, and examined the distributions and intercorrelations of the predictors.

Prediction models

We relied on colleges to provide data matching supervisors and peer educators to groups; however, fewer than 50% of colleges provided those data. Thus, multilevel modeling of supervisors and peer educators nested within groups was not possible, and we instead conducted non-nested models with mean supervisor and peer educator predictors for each school. We estimated logistic regression models for the dichotomous outcomes with a logit link using SPSS (Version 26). All continuous predictor variables were mean-centered. For each outcome, we estimated one model for each of the nine predictors, and each model simultaneously adjusted for the four covariates. The likelihood ratio test was used to evaluate model fit. The likelihood ratio test does not always perform well [57], and others have suggested the Hosmer–Lemeshow test [58] as an alternative; however, these tests require larger sample sizes, with some recommending up to 400 [59]. The Nagelkerke pseudo R2 was used to approximate the percentage of variance accounted for [60]; it was selected over the Cox and Snell pseudo R2 because the latter cannot reach 1.0 [61]. After modeling, we checked for multicollinearity by inspecting the variance inflation factor (VIF) for model predictors, with values greater than 10.0 indicating multicollinearity [62]. For effect sizes, we report odds ratios with 95% confidence intervals. Although our primary inferences rely on uncorrected p-values, given the limited sensitivity of the analyses, we also computed and report the Benjamini–Hochberg false discovery rate adjusted p-values [63] alongside the unadjusted p-values in Table 3.
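The analyses were run in SPSS; as a hedged illustration of the same pipeline, the sketch below (Python with statsmodels; all variable names and the simulated data are placeholders, not study data) fits one covariate-adjusted logistic model, derives the Nagelkerke pseudo R2 from the model log-likelihoods, computes VIFs, and applies the Benjamini–Hochberg adjustment to a family of p-values:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.multitest import multipletests

def fit_one_model(df, outcome, predictor, covariates):
    """One logistic model: a dichotomous outcome regressed on one predictor
    plus the fixed covariate set (mirroring one cell of Table 3)."""
    X = df[[predictor] + covariates].astype(float)
    X = X - X.mean()  # mean-center (the study centered continuous predictors)
    X = sm.add_constant(X)
    fit = sm.Logit(df[outcome], X).fit(disp=False)

    # Nagelkerke pseudo R^2: Cox & Snell rescaled so its maximum is 1.0.
    n = fit.nobs
    cox_snell = 1.0 - np.exp((2.0 / n) * (fit.llnull - fit.llf))
    nagelkerke = cox_snell / (1.0 - np.exp((2.0 / n) * fit.llnull))

    # VIFs for the non-constant columns; values > 10 flag multicollinearity.
    vifs = [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])]

    odds_ratios = np.exp(fit.params)  # effect sizes as odds ratios
    ci_95 = np.exp(fit.conf_int())    # 95% confidence intervals on the OR scale
    return fit, nagelkerke, vifs, odds_ratios, ci_95

# Illustrative call on simulated college-level data (63 colleges).
rng = np.random.default_rng(seed=2)
demo = pd.DataFrame({
    "offered_groups": rng.integers(0, 2, 63),
    "pips_total": rng.integers(0, 12, 63),
    "resources": rng.integers(0, 3, 63),
    "team_climate": rng.normal(4.0, 0.5, 63),
})
fit, r2_n, vifs, ors, cis = fit_one_model(
    demo, "offered_groups", "pips_total", ["resources", "team_climate"])

# Benjamini-Hochberg FDR adjustment across predictor p-values collected from
# the family of models (values here are illustrative only).
p_values = [0.004, 0.021, 0.40]
reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
```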

Results

Preliminary analyses

Table 1 provides a descriptive summary for the three outcomes. Due to the low endorsement of non-zero categories, we created three dichotomous (Yes/No) outcome measures for the variables. Complete data were available for all 63 colleges.

Table 1 Descriptive Summary of Outcome Measures

We collected data from 232 supervisors and 510 peer educators. On average, there were 4.1 (SD = 2.0, minimum = 1, maximum = 12) supervisors and 10.0 (SD = 3.86, minimum = 3, maximum = 26) peer educators per college. The supervisors were mostly female (93%) and most often reported being 26–30 years old (23%), followed by 31–35 years (20%) and 51 years or older (14%). The majority identified as White (74%), followed by Black (8%), Hispanic (7%), Asian (6%), and multi-racial (4%). Peer educators were also mostly female (96%), with a mean age of 21.0 years (SD = 3.5), and identified primarily as White (51%), followed by Hispanic (19%), Black (15%), Asian (10%), Native American (3%), and Pacific Islander (2%). A minority of the supervisors (13%) and peer educators (21%) reported a history of eating disorders, and most peer educators (75%) indicated that body image concerns had kept them from doing something they enjoyed. We examined whether supervisor or peer educator self-reported demographic characteristics and history of eating disorder or body image concerns were significantly associated with randomized implementation support strategies. No significant associations were found (at p < 0.05).

The intercorrelations and descriptive statistics for the nine predictors are summarized in Table 2. No multicollinearity among the predictors was detected; the largest bivariate correlation was between peer educator reports of positive and negative characteristics of the Body Project (r = -0.56).

Table 2 Pearson Correlation Matrix and Descriptive Summary for Predictors

Prediction models

Annotated parameter estimates from the logistic regression models for the three sustainability outcomes are shown in Table 3, which includes results for the model predictors only. See Additional File 1 for results of the full models, including parameter estimates for the covariates and, for each model, the fit, estimated variance accounted for, and VIFs for the predictors and covariates. One variable significantly predicted the primary outcome of a college peer education program conducting one or more Body Project groups in the 2-year sustainability period: completion of a greater number of PIPS activities during the first year of implementation (OR = 1.53).

Table 3 Annotated Results of Logistic Regression Models for College-level Predictors

Two variables significantly predicted whether new peer educators were trained during sustainability: completion of a greater number of PIPS implementation activities (OR = 1.39) and higher peer educator ratings of the positive characteristics of the Body Project (OR = 18.42).

No variables were identified that significantly predicted whether a new peer education supervisor was trained during the 2-year sustainability period. The only predictor variable that approached significance (OR = 5.46; p = 0.066) was whether the college had received the complete package of implementation support strategies (i.e., ongoing QA in addition to TTT and TA).

Discussion

The Body Project is a group preventive intervention for eating disorders that has proven effectiveness when offered as a peer educator-led service [18,19,20,21,22] and with differing levels of implementation support [23]. The current study is the first to test whether implementation support predicts the sustainability of the Body Project on college campuses. Sixty-three college campuses were randomly assigned to one of three implementation support strategies: TTT only, TTT + TA, and TTT + TA + QA. During a two-year sustainability period, supervisors reported on the number of additional Body Project groups conducted, as well as whether they trained new peer educators and supervisors. We hypothesized that schools randomized to the TTT + TA + QA strategy would be more likely to offer Body Project groups and to train additional peer educators and supervisors during the sustainability phase. Because theories of sustainability [32] and naturalistic studies of EBI implementation [27] suggest that intervention, provider, and implementation process factors may also influence whether interventions are continued, we also tested whether provider-perceived characteristics of the Body Project, supervisor attitudes towards evidence-based practices, and the extent to which colleges completed implementation activities in the earlier stages predicted sustainability outcomes. Although the highest level of implementation support approached significance as a predictor of training new supervisors, completion of implementation activities emerged as the strongest predictor of sustainability.

Of the 63 colleges that participated in the study, 38 (60%) did not offer any Body Project groups following the active year of implementation and reported that no students participated in groups, perhaps in part due to disruptions caused by COVID-19 during follow-up. Among the colleges that did continue to offer groups, the number of groups reported ranged from 1 to 13, with an average of 4.3 groups, representing on average 0.76% (minimum = 0.09%, maximum = 4.34%) of the eligible female student population. Likewise, most colleges did not train new peer educators (54%) or new peer educator supervisors (76%). The finding that only 40% of colleges implemented the Body Project during the sustainability phase was not unexpected, as a prior review of EBI implementation efforts noted that programs are rarely sustained in full over time [25]. Indeed, the rates of Body Project sustainment compare favorably to the rates of sustainment described in a study of over 1,000 sites using the Universal Stages of Implementation Completion scale, which averaged between 25 and 37%, depending on the definition of sustainment used [64]. Nonetheless, this rate of discontinuation underscores the need for increased attention to the factors that bolster program continuation, particularly those factors that may be malleable.

We hypothesized that level of implementation support might be one malleable factor; however, there were no significant differences by implementation support strategy in whether Body Project groups were offered or peer educators were trained. Colleges that were randomly assigned to receive QA in addition to TTT and TA during the implementation stage appeared to be more likely to train new supervisors during the two-year sustainability period, relative to those that received TTT alone or TTT + TA, but this was only a trend. The odds ratio (OR = 5.46) corresponds to a moderate-to-large association [65] with a wide confidence interval that may reflect the small sample size. Nonetheless, this trend makes intuitive sense given that QA calls specifically included a focus on sustainability planning, which would address continued training of supervisors. Supervisors in the study were trained not only to provide clinical oversight of peer educator-led groups but also to train new peer educators, an activity critical to the longevity of the Body Project. In a randomized trial of parent–child interaction therapy training, the arm in which internal supervisors were trained to provide ongoing training and support resulted in more clients seen at 24 months following active implementation, even though there were no differences between this arm and the others (Learning Collaborative and Distance Education) in the total number of clients seen since the initial training [35]. It may be that investing in trainers and supervisors results in more durable services over time, even if it does not initially predict greater reach.

The number of completed activities during the active implementation stage was the most robust predictor of sustainability outcomes. In a prior study of multidimensional treatment foster care in 53 child-welfare-serving sites, completing a greater proportion of activities in the early stages of the Stages of Implementation Completion predicted program start-up [66]. Our measure of implementation progress likewise captured the proportion of completed activities across the different stages of implementation, and we found that greater completion of critical activities during the implementation stage predicted whether groups were offered during the sustainability stage (OR = 1.53), as well as whether new peer educators were trained (OR = 1.39); both effects were small but durable [65] and remained significant even after correction for multiple tests. This relationship could have emerged for a variety of reasons. Colleges that are more committed to an EBI may be more likely to complete both implementation activities and those during sustainability. Alternatively, colleges that completed fewer implementation activities may have encountered more challenges, including financial constraints, turnover of leaders or EBI champions, or other emergent priorities [66]. Regardless, completion of implementation activities may be a valuable litmus test for later sustainability and could inform decisions about when to invest resources over time.

Consistent with the DSF [32], peer educators’ perceptions of the characteristics of the Body Project following their initial training predicted, in the expected direction, whether new peer educators were trained (OR = 18.42). This effect was very large in magnitude [65], though it became non-significant after correcting for multiple tests. Peer educators’ perceptions that the Body Project offered advantages relative to other programs, was compatible with the setting, was appropriately complex, and offered observable benefits predicted the training of new peer educators. This dovetails with the results of a study by Massey Combs and colleagues [31] in which higher teacher ratings of an intervention’s benefit, compatibility, and complexity predicted intervention sustainability. Similarly, a study by Lau et al. [27] found that community therapists’ negative perceptions of particular EBIs predicted their discontinuation. In contrast, supervisors’ perceptions of the Body Project characteristics and of evidence-based practices in general did not significantly predict any sustainability outcomes. In other studies, both provider and supervisor endorsement of EBIs has been suggested to foster sustainment [67,68,69]. The DSF specifically proposes that the better the fit between the intervention and the setting, the more likely that the intervention will be maintained over time. Peer educators are both the intervention providers and members of the student body for whom the Body Project is intended, perhaps lending them particular insight into the extent to which the Body Project was an optimal fit with their settings’ needs. For colleges where the Body Project was not sustained, it is possible that program modifications could increase the likelihood of sustainment, as the DSF suggests that even programs considered already “optimized” may require adjustment to be maintained [32].

Limitations

The current study is among very few that have used randomization to test the effects of different implementation support strategies on the sustainability of an evidence-based intervention; even fewer have examined the sustainability of prevention interventions on college campuses. Despite this, several limitations should be considered. First, all sustainability outcomes were examined at the college level (N = 63), which resulted in less-than-optimal power to detect anything beyond large effects. Although 63 organizations is a respectable sample for dissemination and implementation research, it is still a small sample in terms of statistical power. We had power of only 0.68 to detect medium-sized effects (e.g., OR = 3.47), so we were likely unable to detect clinically meaningful effects that were small to medium in magnitude. Second, the active implementation stage for all colleges took place in early 2020, at the start of the COVID-19 pandemic. This disrupted typical procedures on most college campuses and may have contributed to the lack of initial implementation at a sizable number of colleges, 17% of which did not implement the Body Project during the active phase. Perhaps relatedly, most colleges did not engage in any of the sustainability activities examined as outcomes in this study. This resulted in low endorsement of non-zero categories and led us to use dichotomous (Yes/No) rather than continuous outcomes for most sustainability measures, which may have further reduced statistical power that was already limited by the relatively modest number of participating colleges. In addition, we no longer collected participant data during the sustainability phase, so we cannot assess whether Body Project groups remained effective over time or whether effectiveness differed by implementation support strategy. Intervention effectiveness is a critical sustainability outcome, and future studies should consider it alongside the outcomes measured in the current study. Third, the small sample size, especially in the context of dichotomous outcomes and logistic regression analyses, hampered the study in several statistical ways: we were unable to use more appropriate indicators of model fit (e.g., the Hosmer–Lemeshow test), to explore potentially better modeling options such as robust Poisson regression with estimates of risk ratios [70], or to include more covariate adjustments in our prediction models. Finally, this study relied on peer educator supervisor self-reports of all outcomes, which may have biased the results.

Conclusions

The sustainability of evidence-based interventions has been described as “one of the most significant translational research problems of our time” [71] (p2). Any benefits yielded by the development, testing, adoption, and deployment of interventions are short-lived if practices are ultimately abandoned following the initial stages of implementation. In the current study, 60% of participating colleges did not sustain the Body Project after the initial, active implementation stage. Although sustainability rates in this study compared favorably to many EBI implementation efforts [64], even intentional efforts to plan for sustainability with the most intensive implementation support strategy did not significantly increase the likelihood of activities such as continuing to offer groups or training new providers.

It is likely that some of the factors that influence sustainability, such as funding and competing organizational priorities, were beyond the reach of the implementation support interventions in the current study [25]. In a qualitative study by Massatti et al. [72], key warning signs that an organization might de-adopt a practice included lack of financial resources and evidence that external agents did not support continued implementation efforts. The current study included such outer and inner setting implementation indicators as covariates [46], though very few were significant predictors of outcomes in any models (see Additional File 1). Other risks included goodness-of-fit of the intervention with employees’ skills and beliefs and perceptions about ease of implementation [72]. Assessing these risks during implementation, using measures like the Evidence Based Practice Sustainability Index [36], could allow for more personalized approaches to increase sustainability and prevent the de-adoption of EBIs.

The proportion of activities that a college completed during the active implementation stage emerged as the most robust predictor of whether Body Project groups were offered during the sustainability stage, as well as whether new peer educators were trained to lead future groups. This aligns with conceptual models that describe implementation as a process of moving through distinct stages in a potentially non-linear and recursive fashion, with each stage interacting with and influencing subsequent stages [73]. Studies of other EBIs in settings different from the current study have also found that the proportion of completed activities in one stage predicted overall success [66]; perhaps this type of progress assessment can be used to guide decision-making about whether an organization has received an adequate dose of support during one stage or whether additional efforts to assist with initial implementation may be required to yield ongoing results.

The results of the current study suggest that, when implementing EBIs in complex settings, providers’ positive perceptions of the intervention’s characteristics should be intentionally bolstered during training. Likewise, provider concerns about the relative advantage, compatibility, complexity, and benefit of interventions should be assessed and addressed—perhaps through modifications that increase goodness of fit. Future research should consider whether pre-implementation interventions that promote positive provider perceptions of the intervention might improve sustainability for organizations where baseline attitudes are less positive. Ultimately, sustainment of EBIs in complex service settings is critical to realizing the potential of intervention development, testing, and implementation and to supporting meaningful improvements in mental health care.

Availability of data and materials

Data and study materials (e.g., copies of assessment measures) are available from Eric Stice (estice@stanford.edu); analysis code is available from Jeff M. Gau (jeffg@ori.org).

Abbreviations

EBI:

Evidence-based intervention

TTT:

Train-the-trainer

TA:

Technical assistance

QA:

Quality assurance

DSF:

Dynamic Sustainability Framework

PIPS:

Prevention Implementation Progress Scale

CFIR:

Consolidated Framework for Implementation Research

References

  1. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:65–76. https://doi.org/10.1007/s10488-010-0319-7.

  2. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39:55–76.

  3. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433. https://doi.org/10.1016/j.psychres.2019.06.008.

  4. Pegg SL, Walsh LM, Becker-Haimes EM, Ramirez V, Jensen-Doss A. Money makes the world go ‘round: A qualitative examination of the role funding plays in large-scale implementation and sustainment of youth evidence-based practice. Psychol Serv. 2021;18(2):265. https://doi.org/10.1037/ser0000399.

  5. Shoesmith A, Hall A, Wolfenden L, Shelton RC, Powell BJ, Brown H, McCrabb S, Sutherland R, Yoong S, Lane C, Booth D. Barriers and facilitators influencing the sustainment of health behaviour interventions in schools and childcare services: a systematic review. Implement Sci. 2021;16(1):62. https://doi.org/10.1186/s13012-021-01134-y.

  6. Soicher RN, Becker-Blease KA, Bostwick KC. Adapting implementation science for higher education research: the systematic study of implementing evidence-based practices in college classrooms. Cognitive Research: Principles and Implications. 2020;5:1–5. https://doi.org/10.1186/s41235-020-00255-0.

  7. Van Strien T, Frijters JE, Van Staveren WA, Defares PB, Deurenberg P. The predictive validity of the Dutch restrained eating scale. Int J Eat Disord. 1986;5(4):747–55. https://doi.org/10.1002/1098-108X(198605)5:4%3c747::AID-EAT2260050413%3e3.0.CO;2-6.

  8. Center for Collegiate Mental Health. Center for Collegiate Mental Health 2020 Annual Report. University Park: Center for Collegiate Mental Health; 2021. Publication number STA 21-045.

  9. Twenge JM, Cooper AB, Joiner TE, Duffy ME, Binau SG. Age, period, and cohort trends in mood disorder indicators and suicide-related outcomes in a nationally representative dataset, 2005–2017. J Abnorm Psychol. 2019;128(3):185. https://doi.org/10.1037/abn0000410.

  10. Wood CI, Yu Z, Sealy DA, Moss I, Zigbuo-Wenzler E, McFadden C, Landi D, Brace AM. Mental health impacts of the COVID-19 pandemic on college students. J Am Coll Health. 2022;10:1–6. https://doi.org/10.1080/07448481.2022.2040515.

  11. Rodgers RF, Lombardo C, Cerolini S, Franko DL, Omori M, Fuller-Tyszkiewicz M, Linardon J, Courtet P, Guillaume S. The impact of the COVID-19 pandemic on eating disorder risk and symptoms. Int J Eat Disord. 2020;53(7):1166–70. https://doi.org/10.1002/eat.23318.

  12. Vitagliano JA, Jhe G, Milliren CE, Lin JA, Spigel R, Freizinger M, Woods ER, Forman SF, Richmond TK. COVID-19 and eating disorder and mental health concerns in patients with eating disorders. J Eat Disord. 2021;9(1):1–8. https://doi.org/10.1186/s40337-021-00437-1.

  13. Auerbach RP, Alonso J, Axinn WG, Cuijpers P, Ebert DD, Green JG, Hwang I, Kessler RC, Liu H, Mortier P, Nock MK. Mental disorders among college students in the World Health Organization world mental health surveys. Psychol Med. 2016;46(14):2955–70. https://doi.org/10.1017/S0033291716001665.

  14. Bailey RJ, Erekson DM, Cattani K, Jensen D, Simpson DM, Klundt J, Vogeler HA, Schmuck D, Worthen VE, Caldwell Y, Beecher ME. Adapting stepped care: Changes to service delivery format in the context of high demand. Psychol Serv. 2022;19(3):494. https://doi.org/10.1037/ser0000564.

  15. Arcelus J, Mitchell AJ, Wales J, Nielsen S. Mortality rates in patients with anorexia nervosa and other eating disorders: a meta-analysis of 36 studies. Arch Gen Psychiatry. 2011;68(7):724–31. https://doi.org/10.1001/archgenpsychiatry.2011.74.

  16. Crow SJ, Peterson CB, Swanson SA, Raymond NC, Specker S, Eckert ED, Mitchell JE. Increased mortality in bulimia nervosa and other eating disorders. Am J Psychiatry. 2009;166(12):1342–6. https://doi.org/10.1176/appi.ajp.2009.09020247.

  17. Stice E, Onipede ZA, Marti CN. A meta-analytic review of trials that tested whether eating disorder prevention programs prevent eating disorder onset. Clin Psychol Rev. 2021;87:102046. https://doi.org/10.1016/j.cpr.2021.102046.

  18. Becker CB, Wilson C, Williams A, Kelly M, McDaniel L, Elmquist J. Peer-facilitated cognitive dissonance versus healthy weight eating disorders prevention: A randomized comparison. Body Image. 2010;7(4):280–8. https://doi.org/10.1016/j.bodyim.2010.06.004.

  19. Halliwell E, Jarman H, McNamara A, Risdon H, Jankowski G. Dissemination of evidence-based body image interventions: A pilot study into the effectiveness of using undergraduate students as interventionists in secondary schools. Body Image. 2015;14:1–4. https://doi.org/10.1016/j.bodyim.2015.02.002.

  20. Stice E, Marti CN, Rohde P. Prevalence, incidence, impairment, and course of the proposed DSM-5 eating disorder diagnoses in an 8-year prospective community study of young women. J Abnorm Psychol. 2013;122(2):445. https://doi.org/10.1037/a0030679.

  21. Stice E, Rohde P, Shaw H, Gau JM. Clinician-led, peer-led, and internet-delivered dissonance-based eating disorder prevention programs: Acute effectiveness of these delivery modalities. J Consult Clin Psychol. 2017;85(9):883. https://doi.org/10.1037/ccp0000211.

  22. Stice E, Rohde P, Shaw H, Gau JM. Clinician-led, peer-led, and internet-delivered dissonance-based eating disorder prevention programs: Effectiveness of these delivery modalities through 4-year follow-up. J Consult Clin Psychol. 2020;88(5):481. https://doi.org/10.1037/ccp0000796.

  23. Stice E, Rohde P, Gau JM, Bearman SK, Shaw H. An experimental test of increasing implementation support for college peer educators delivering an evidence-based prevention program. J Consult Clin Psychol. 2023. https://doi.org/10.1037/ccp0000806.

  24. Rohde P, Shaw H, Butryn ML, Stice E. Assessing program sustainability in an eating disorder prevention effectiveness trial delivered by college clinicians. Behav Res Ther. 2015;72:1–8. https://doi.org/10.1016/j.brat.2015.06.009.

  25. Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:17. https://doi.org/10.1186/1748-5908-7-17.

  26. Bearman SK, Bailin A, Terry R, Weisz JR. After the study ends: A qualitative study of factors influencing intervention sustainability. Prof Psychol Res Pract. 2020;51(2):134. https://doi.org/10.1037/pro0000258.

  27. Lau AS, Lind T, Crawley M, Rodriguez A, Smith A, Brookman-Frazee L. When do therapists stop using evidence-based practices? Findings from a mixed method study on system-driven implementation of multiple EBPs for children. Administration and Policy in Mental Health and Mental Health Services Research. 2020;47:323–37. https://doi.org/10.1007/s10488-019-00987-2.

  28. Mohr DC, Rosen CS, Schnurr PP, Orazem RJ, Noorbaloochi S, Clothier BA, Eftekhari A, Bernardy NC, Chard KM, Crowley JJ, Cook JM. The influence of team functioning and workload on sustainability of trauma-focused evidence-based psychotherapies. Psychiatr Serv. 2018;69(8):879–86. https://doi.org/10.1176/appi.ps.201700432.

  29. Berkel C, Rudo-Stern J, Abraczinskas M, Wilson C, Lokey F, Flanigan E, Villamar JA, Dishion TJ, Smith JD. Translating evidence-based parenting programs for primary care: Stakeholder recommendations for sustainable implementation. J Community Psychol. 2020;48(4):1178–93. https://doi.org/10.1002/jcop.22317.

  30. LoCurto J, Pella J, Chan G, Ginsburg G. School-based clinicians sustained use of a cognitive behavioral treatment for anxiety disorders. Sch Ment Heal. 2020;12(4):677–88. https://doi.org/10.1007/s12310-020-09381-y.

  31. Combs KM, Buckley PR, Lain MA, Drewelow KM, Urano G, Kerns SE. Influence of classroom-level factors on implementation fidelity during scale-up of evidence-based interventions. Prev Sci. 2022;23(6):969–81. https://doi.org/10.1007/s11121-022-01375-3.

  32. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117. https://doi.org/10.1186/1748-5908-8-117.

  33. Chu BC, Talbott Crocco S, Arnold CC, Brown R, Southam-Gerow MA, Weisz JR. Sustained implementation of cognitive-behavioral therapy for youth anxiety and depression: Long-term effects of structured training and consultation on therapist practice in the field. Prof Psychol Res Pract. 2015;46(1):70. https://doi.org/10.1037/a0038000.

  34. Swain K, Whitley R, McHugo GJ, Drake RE. The sustainability of evidence-based practices in routine mental health agencies. Community Ment Health J. 2010;46:119–29. https://doi.org/10.1007/s10597-009-9202-y.

  35. Jackson CB, Herschell AD, Scudder AT, Hart J, Schaffner KF, Kolko DJ, Mrozowski S. Making implementation last: The impact of training design on the sustainability of an evidence-based treatment in a randomized controlled trial. Administration and Policy in Mental Health and Mental Health Services Research. 2021;48:757–67. https://doi.org/10.1007/s10488-021-01126-6.

  36. Manthey TJ, Goscha R. Conceptual Underpinnings of the Evidence-Based Practice Sustainability Index. Best Pract Ment Health. 2013;9(1):83–98.

  37. Loman SL, Rodriguez BJ, Horner RH. Sustainability of a targeted intervention package: first step to success in Oregon. J Emot Behav Disord. 2010;18(3):178–91. https://doi.org/10.1177/1063426610362899.

  38. Buchanan R, Chamberlain P, Price JM, Sprengelmeyer P. Examining the equivalence of fidelity over two generations of KEEP implementation: A preliminary analysis. Child Youth Serv Rev. 2013;35(1):188–93. https://doi.org/10.1016/j.childyouth.2012.10.002.

  39. Schoenwald SK, Sheidow AJ, Chapman JE. Clinical supervision in treatment transport: effects on adherence and outcomes. J Consult Clin Psychol. 2009;77(3):410. https://doi.org/10.1037/a0013788.

  40. Weisz JR, Ugueto AM, Herren J, Marchette LK, Bearman SK, Lee EH, Thomassin K, Alleyne A, Cheron DM, Tweed JL, Hersh J. When the torch is passed, does the flame still burn? Testing a “train the supervisor” model for the Child STEPs treatment program. J Consult Clin Psychol. 2018;86(9):726. https://doi.org/10.1037/ccp0000331.

  41. Boyd MR, Park AL, Becker KD, Chorpita BF. The relation between training asymmetry and supervisory working alliance: Implications for the role of supervisors in implementation. Clin Superv. 2021;40(1):49–67. https://doi.org/10.1080/07325223.2020.1871460.

  42. Adams DR, Williams NJ, Becker-Haimes EM, Skriner L, Shaffer L, DeWitt K, Neimark G, Jones DT, Beidas RS. Therapist financial strain and turnover: interactions with system-level implementation of evidence-based practices. Adm Policy Ment Health. 2019;46:713–23. https://doi.org/10.1007/s10488-019-00949-8.

  43. Newton FB, Ender SC. Students helping students: A guide for peer educators on college campuses. San Francisco: John Wiley & Sons; 2010.

  44. Stice E, Fisher M, Martinez E. Eating disorder diagnostic scale: additional evidence of reliability and validity. Psychol Assess. 2004;16(1):60. https://doi.org/10.1037/1040-3590.16.1.60.

  45. Rohde P, Bearman SK, Pauling S, Gau JM, Shaw H, Stice E. Setting and Provider Predictors of Implementation Success for an Eating Disorder Prevention Program Delivered by College Peer Educators. Adm Policy Ment Health. 2023;50(6):912–25. https://doi.org/10.1007/s10488-023-01288-5.

  46. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50.

  47. Bergh AM, Arsalo I, Malan AF, Patrick M, Pattinson RC, Phillips N. Measuring implementation progress in kangaroo mother care. Acta Paediatr. 2005;94(8):1102–8. https://doi.org/10.1111/j.1651-2227.2005.tb02052.x.

  48. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: the stages of implementation completion (SIC). Implement Sci. 2011;6:116. https://doi.org/10.1186/1748-5908-6-116.

  49. Rogers E. Diffusion of Innovations. 5th ed. New York: Free Press; 2003.

  50. Windsor R, Cleary S, Ramiah K, Clark J, Abroms L, Davis A. The Smoking Cessation and Reduction in Pregnancy Treatment (SCRIPT) Adoption Scale: evaluating the diffusion of a tobacco treatment innovation to a statewide prenatal care program and providers. J Health Commun. 2013;18(10):1201–20. https://doi.org/10.1080/10810730.2013.778358.

  51. Borntrager CF, Chorpita BF, Higa-McMillan C, Weisz JR. Provider attitudes toward evidence-based practices: are the concerns with the evidence or with the manuals? Psychiatr Serv. 2009;60(5):677–81. https://doi.org/10.1176/ps.2009.60.5.677.

  52. Burgess AM, Okamura KH, Izmirian SC, Higa-McMillan CK, Shimabukuro S, Nakamura BJ. Therapist attitudes towards evidence-based practice: A joint factor analysis. J Behav Health Serv Res. 2017;44:414–27. https://doi.org/10.1007/s11414-016-9517-8.

  53. Rohde P, Bearman SK, Pauling S, Gau JM, Shaw H, Stice E. Setting and Provider Predictors of Implementation Success for an Eating Disorder Prevention Program Delivered by College Peer Educators. Adm Policy Ment Health. 2023;50(6):912–25. https://doi.org/10.1007/s10488-023-01288-5.

  54. Kivimaki M, Elovainio M. A short version of the Team Climate Inventory: Development and psychometric properties. J Occup Organ Psychol. 1999;72(2):241–6.

  55. Loo R, Loewen P. A confirmatory factor-analytic and psychometric examination of the Team Climate Inventory: full and short versions. Small Group Res. 2002;33(2):254–65.

  56. Anderson N, West MA. The Team Climate Inventory: Development of the TCI and its applications in teambuilding for innovativeness. Eur J Work Organ Psychol. 1996;5(1):53–66.

  57. McCullagh P. On the asymptotic distribution of Pearson’s statistic in linear exponential-family models. Int Stat Rev. 1985;53(1):61–7.

  58. Hosmer DW, Lemesbow S. Goodness of fit tests for the multiple logistic regression model. Commun Stat Theory Methods. 1980;9(10):1043–69.

  59. Allison PD. Measures of fit for logistic regression. In: Proceedings of the SAS Global Forum 2014 Conference. Cary, NC, USA: SAS Institute Inc.; 2014. p. 1–13.

  60. Nagelkerke NJ. A note on a general definition of the coefficient of determination. Biometrika. 1991;78(3):691–2.

  61. Cox DR. Analysis of binary data. New York: Routledge; 2018.

  62. Pallant J. SPSS survival manual: A step by step guide to data analysis using IBM SPSS. London: Routledge; 2020.

  63. Benjamini Y, Hochberg Y. Controlling the false discovery rate: a practical and powerful approach to multiple testing. J Roy Stat Soc: Ser B (Methodol). 1995;57(1):289–300. https://doi.org/10.1111/j.2517-6161.1995.tb02031.x.

  64. Wong DR, Schaper H, Saldana L. Rates of sustainment in the Universal Stages of Implementation Completion. Implement Sci Commun. 2022;3(1):1–2. https://doi.org/10.1186/s43058-021-00250-6.

  65. Chen H, Cohen P, Chen S. How big is a big odds ratio? Interpreting the magnitudes of odds ratios in epidemiological studies. Commun Stat Simul Comput. 2010;39(4):860–4. https://doi.org/10.1080/03610911003650383.

  66. Saldana L, Chamberlain P, Wang W, Hendricks BC. Predicting program start-up using the stages of implementation measure. Adm Policy Ment Health. 2012;39:419–25. https://doi.org/10.1007/s10488-011-0363-y.

  67. Horwitz SM, Lewis K, Gleacher A, Wang N, Bradbury DM, Ray-LaBatt M, Hoagwood KE. Sustainability of an evidence-based practice in community mental health agencies serving children. Psychiatr Serv. 2019;70(5):413–6. https://doi.org/10.1176/appi.ps.201800430.

  68. Ryba MM, Lo SB, Andersen BL. Sustainability of a biobehavioral intervention implemented by therapists and sustainment in community settings. Transl Behav Med. 2021;11(1):96–103. https://doi.org/10.1093/tbm/ibz175.

  69. Solberg LI, Brekke ML, Fazio CJ, Fowles J, Jacobsen DN, Kottke TE, Mosser G, O’Connor PJ, Ohnsorg KA, Rolnick SJ. Lessons from experienced guideline implementers: attend to many factors and use multiple strategies. Jt Comm J Qual Improv. 2000;26(4):171–88. https://doi.org/10.1016/s1070-3241(00)26013-6.

  70. Chen W, Qian L, Shi J, Franklin M. Comparing performance between log-binomial and robust Poisson regression models for estimating risk ratios under model misspecification. BMC Med Res Methodol. 2018;18:63.

  71. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10:88. https://doi.org/10.1186/s13012-015-0274-5.

  72. Massatti RR, Sweeney HA, Panzano PC, Roth D. The de-adoption of innovative mental health practices (IMHP): Why organizations choose not to sustain an IMHP. Adm Policy Ment Health. 2008;35(1–2):50–65. https://doi.org/10.1007/s10488-007-0141-z.

  73. Fixsen DL, Blase KA, Naoom SF, Van Dyke M, Wallace F. Implementation: The missing link between research and practice. NIRN implementation brief. 2009;1(1):218–27.

Acknowledgements

The authors wish to acknowledge the supervisors, peer educators, and participants who contributed to this study. They would also like to thank Lindsey Greenwaldt for her efforts in coordinating the collection of study implementation data.

Funding

This research was supported by funding received by ES from the National Institutes of Health (MH112743).

Author information

Contributions

SKB developed the outline and initial draft of this manuscript with PR, SP, and JMG and played an equal role in conceptualization, funding acquisition, data collection, methodology, supervision, and review and editing. PR played an equal role in conceptualization, funding acquisition, data collection, methodology, supervision, data curation, formal analysis, and review and editing. SP played an equal role in conceptualization and review and editing. JMG played an equal role in data curation, formal analysis, methodology, visualization, and review and editing. HS played an equal role in conceptualization, funding acquisition, investigation, supervision, and review and editing. ES played an equal role in conceptualization, formal analysis, funding acquisition, investigation, methodology, project administration, supervision, and review and editing. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Sarah Kate Bearman or Paul Rohde.

Ethics declarations

Ethics approval and consent to participate

The Oregon Research Institute Institutional Review Board (IRB) approved this study, and all participants gave informed consent.

Consent for publication

All individuals gave consent for de-identified data to be published.

Competing interests

The authors report no conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Bearman, S.K., Rohde, P., Pauling, S. et al. Predictors of the sustainability for an evidence-based eating disorder prevention program delivered by college peer educators. Implementation Sci 19, 47 (2024). https://doi.org/10.1186/s13012-024-01373-9

Keywords